ahm_rimer (OP) · LocalLLaMA
1.3B with 68.29% HumanEval lol, don't behead me. Part of my project PIC (partner-in-crime)
Try the chat inference code mentioned in the model card if you're running it on GPU. The model is small enough to test on free Colab as well.
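For anyone who wants to try it quickly, a minimal inference sketch along these lines should work with the standard `transformers` `generate()` API. Note the checkpoint id here is a placeholder (not the real repo name) and the prompt format is an assumption — the actual chat template is in the model card:

```python
def build_prompt(instruction: str) -> str:
    # Plain instruction-style prompt; the real chat template is in the
    # model card, so treat this exact format as an assumption.
    return f"### Instruction:\n{instruction}\n\n### Response:\n"

def generate(model_id: str, instruction: str, max_new_tokens: int = 256) -> str:
    # Import here so the helper above stays usable without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
    inputs = tokenizer(build_prompt(instruction), return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=max_new_tokens, do_sample=False)
    # Decode only the newly generated tokens, not the prompt.
    return tokenizer.decode(out[0][inputs["input_ids"].shape[1]:],
                            skip_special_tokens=True)

if __name__ == "__main__":
    # "your-name/pic-1.3b" is a hypothetical checkpoint id for illustration.
    print(generate("your-name/pic-1.3b",
                   "Write a Python function that reverses a string."))
```

On a free Colab T4, a 1.3B model in fp16 fits comfortably, so `device_map="auto"` should place it on the GPU without extra configuration.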
There isn't enough instruct data for Rust, and I'm not familiar with Rust myself, so I can't test it out.
I usually test things out with C, C++, and Python only, at my level.
Though if you know of a good source, I'll use it for Rust fine-tuning.