You_Wen_AzzHu@alien.top to LocalLLaMA@poweruser.forum • "100B, 220B, and 600B models on huggingface!" (English)
11 months ago
We need some 4090s with 500 GB VRAM, modified in China if possible.
Then you realize it would be great to have an A100 80GB.
Who gives a shit about the age?
There is nothing open in OpenAI.
Where else can you make $1 million a year?
0.3 tokens per second is not “handling”.
Game Snake will be included in all models.
I know Phind CodeLlama 34B is the best local LLM. But it is not close to GPT-4.
Limit the CPU to run at 30%. Remove the battery. Then you are fine.
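A minimal sketch of what "limit the CPU to 30%" could look like on Linux. The `cpulimit` tool and the cpufreq sysfs path are standard, but the 30% figure and the example frequency are just illustrations from this post, not values from any particular setup.

```shell
# Option 1: throttle a single process to ~30% of one core.
# cpulimit -l 30 -p <pid-of-your-llm-process>   # requires the cpulimit package

# Option 2: cap the CPU clock itself via the cpufreq sysfs interface.
# On a real machine, read the maximum from:
#   /sys/devices/system/cpu/cpu0/cpufreq/cpuinfo_max_freq
# Here we use an assumed 3.5 GHz max (3500000 kHz) for illustration.
max_khz=3500000
cap_khz=$((max_khz * 30 / 100))   # 30% of the maximum frequency
echo "would cap cpu0 at ${cap_khz} kHz"
# echo "$cap_khz" | sudo tee /sys/devices/system/cpu/cpu0/cpufreq/scaling_max_freq
```

Both options only limit heat and throughput; nothing here touches the battery, which the comment suggests removing separately.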
Your home PC is the cheapest.
I don’t trust Oracle on anything.
I will pay $1 to use it. $20 is GPT-4 level.