Google released T5X checkpoints for MADLAD-400 a couple of months ago, but nobody could figure out how to run them. Turns out the vocabulary was wrong, but they uploaded the correct one last week.
I’ve converted the models to the safetensors format, and I created this space if you want to try the smaller model.
I also published quantized GGUF weights you can use with candle. It decodes at ~15 tokens/s on an M2 Mac.
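If you'd rather stay in Python, here's a minimal sketch of running the model with Hugging Face transformers — the model id and generation settings are my assumptions, not something confirmed above; MADLAD-400 is a T5-family model that expects a target-language token like `<2de>` prepended to the source text:

```python
def make_prompt(target_lang: str, text: str) -> str:
    # MADLAD-400 expects a target-language token like <2de>
    # before the source sentence.
    return f"<2{target_lang}> {text}"

def translate(text: str, target_lang: str,
              model_id: str = "jbochi/madlad400-3b-mt") -> str:
    # model_id is an assumption here; swap in whichever converted
    # checkpoint you want to use.
    from transformers import T5ForConditionalGeneration, T5Tokenizer

    tokenizer = T5Tokenizer.from_pretrained(model_id)
    model = T5ForConditionalGeneration.from_pretrained(model_id)
    inputs = tokenizer(make_prompt(target_lang, text), return_tensors="pt")
    out = model.generate(**inputs, max_new_tokens=128)
    return tokenizer.decode(out[0], skip_special_tokens=True)
```

For example, `translate("I love pizza!", "de")` should produce a German translation. Note the 3B checkpoint needs several GB of RAM even before quantization.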
It seems that NLLB is the most popular machine translation model right now, but its license only allows non-commercial usage. MADLAD-400 is CC BY 4.0.
Not sure if this has been asked yet, but how do the translations from this model compare to GPT-3.5 and Claude?
Thanks.
Good question. ALMA compares itself against NLLB and GPT-3.5, and its 13B model barely surpasses GPT-3.5. MADLAD-400 probably beats GPT-3.5 only on lower-resource languages.