Igoory@alien.top to LocalLLaMA@poweruser.forum · Translate to and from 400+ languages locally with MADLAD-400
1 year ago
I think I did exactly what you said, so I have no idea why you got an error.
The normal Yi model can go up to 32K context, by the way.
Llama2 base model can do it reasonably well.
Yes, it does work. I managed to run the 10B model on CPU; it uses 40GB of RAM, but somehow I felt like your 3B space gave me a better translation.
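For anyone who wants to reproduce the CPU run above, here is a minimal sketch using Hugging Face transformers. It assumes the `google/madlad400-3b-mt` checkpoint id and the `<2xx>` target-language prefix from the MADLAD-400 model cards; swap in the 10B checkpoint if you have the RAM for it. The `translate` helper and its defaults are my own illustration, not anything from the thread.

```python
def madlad_prompt(target_lang: str, text: str) -> str:
    """MADLAD-400 models take the target language as a '<2xx>' prefix token."""
    return f"<2{target_lang}> {text}"


def translate(text: str, target_lang: str = "de",
              model_id: str = "google/madlad400-3b-mt") -> str:
    # Imported lazily so the prompt helper works even without transformers.
    # Loading happens on CPU by default; the 10B checkpoint needs ~40GB RAM.
    from transformers import T5ForConditionalGeneration, T5Tokenizer

    tokenizer = T5Tokenizer.from_pretrained(model_id)
    model = T5ForConditionalGeneration.from_pretrained(model_id)

    inputs = tokenizer(madlad_prompt(target_lang, text), return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=256)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    # Translates English to German; the first run downloads the checkpoint.
    print(translate("Hello, how are you?", target_lang="de"))
```

Reloading the model on every call is wasteful; for repeated translations, load the tokenizer and model once and reuse them.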
I second this. I paid 5 bucks one month ago, and to this day I've only used $0.63!