Google released T5X checkpoints for MADLAD-400 a couple of months ago, but nobody could figure out how to run them. Turns out the vocabulary was wrong, but they uploaded the correct one last week.

I’ve converted the models to the safetensors format, and I created this space if you want to try the smaller model.
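
If you'd rather run it locally than through the space, here's a minimal sketch of loading the converted weights with transformers. This assumes the standard T5 classes work for the safetensors conversion and uses the `jbochi/madlad400-3b-mt` repo id mentioned further down the thread; the `<2xx>` prefix selects the target language per the MADLAD-400 convention (here `<2pt>` for Portuguese):

```python
# Minimal sketch: load the 3B MADLAD-400 checkpoint via the standard T5 classes.
# Assumes enough RAM/VRAM for a ~3B-parameter model.
from transformers import T5ForConditionalGeneration, T5Tokenizer

model_id = "jbochi/madlad400-3b-mt"
tokenizer = T5Tokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# MADLAD-400 is a T5-style model: prefix the source text with a <2xx> token
# naming the target language (<2pt> = translate to Portuguese).
inputs = tokenizer("<2pt> I love pizza!", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```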

I also published quantized GGUF weights you can use with candle. They decode at ~15 tokens/s on an M2 Mac.

It seems that NLLB is the most popular machine translation model right now, but its license only allows non-commercial use. MADLAD-400 is CC BY 4.0.

    • Igoory@alien.top · 11 months ago
      Yes, it does work. I managed to run the 10B model on CPU; it uses 40 GB of RAM, but somehow I felt like your 3B space gave me a better translation.

      • cygn@alien.top · 10 months ago
        How do you load the model? I pasted jbochi/madlad400-3b-mt in the download model field and used the “transformers” model loader, but it can’t handle it: OSError: It looks like the config file at ‘models/model.safetensors’ is not a valid JSON file.

        • Igoory@alien.top · 10 months ago
          I think I did exactly what you describe, so I have no idea why you got that error.