Google released T5X checkpoints for MADLAD-400 a couple of months ago, but nobody could figure out how to run them. Turns out the vocabulary was wrong, but they uploaded the correct one last week.
I’ve converted the models to the safetensors format, and I created this space if you want to try the smaller model.
I also published quantized GGUF weights you can use with candle. It decodes at ~15 tokens/s on an M2 Mac.
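If anyone wants to try the safetensors conversion with transformers, something like this should work. It's a minimal sketch: the hub id jbochi/madlad400-3b-mt is an assumption (substitute whichever converted checkpoint you're using), and the target language is picked by prepending a <2xx> token to the source text:

```python
from transformers import T5ForConditionalGeneration, T5Tokenizer

# Hub id is an assumption; point this at the converted checkpoint you want.
model_id = "jbochi/madlad400-3b-mt"
tokenizer = T5Tokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# MADLAD-400 selects the target language with a <2xx> prefix token,
# e.g. <2pt> for Portuguese.
inputs = tokenizer("<2pt> I love pizza!", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```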
It seems that NLLB is the most popular machine translation model right now, but its license only allows non-commercial use. MADLAD-400 is CC BY 4.0.
I’ve been relying on Claude AI to translate Korean texts to English. I’m excited to use a local version if the context window is large enough.
I haven’t tested it, but I’m surprised to see LLMs good enough to translate between multiple languages running locally. I expected to see one-to-one language translation models before this, like one model dedicated to Chinese-English translation, another dedicated to Korean-French, etc.
Sorry to be pedantic, but the translation models they released are not LLMs. They are T5 seq2seq models, encoder-decoder with cross-attention as in the original Transformer paper. They did also release an LM that’s a decoder-only T5. They tried few-shot learning with it, but it performs much worse than the MT models.
I think that the first multilingual Neural Machine Translation model is from 2016: https://arxiv.org/abs/1611.04558. However, specialized models for pairs of languages are still popular. For example: https://huggingface.co/Helsinki-NLP/opus-mt-de-en
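Those opus-mt pair models are also easy to run locally through transformers' Marian classes. A quick sketch for the de-en model linked above (the input sentence is just a made-up example):

```python
from transformers import MarianMTModel, MarianTokenizer

model_name = "Helsinki-NLP/opus-mt-de-en"
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)

# German -> English translation of an example sentence.
batch = tokenizer(["Maschinelle Übersetzung wird immer besser."], return_tensors="pt")
generated = model.generate(**batch)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```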
I did the same with Korean novel chapters, but since yesterday it has started to either refuse to translate, stop about 1/6 of the way into the text, or write some sort of summaries instead of translations.