I have a good PC, but the problem is my RAM. The model is an LLM, which makes the memory requirement understandable, but I can't afford to get that much RAM right now, so I'm looking for a workaround. Any help?
Do You Have the Slightest Idea How Little That Narrows It Down?
That's exactly the problem. I'm literally looking at my options, and from the comments it definitely isn't looking good 😭
Let's say I want to work with Noon (https://huggingface.co/Naseej/noon-7b). How much would I actually need?
Paste your model name into this HF Space: https://huggingface.co/spaces/Vokturz/can-it-run-llm
https://imgur.com/a/Ednemii (the result)
It seems you need less than 32 GiB of VRAM.
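For a rough sanity check you can also estimate it yourself: memory ≈ parameter count × bytes per parameter, plus some overhead for activations and the KV cache. A minimal sketch (the 1.2× overhead factor is just an assumption for illustration, not a measured value):

```python
# Back-of-envelope memory estimate for loading a causal LM's weights.
BYTES_PER_PARAM = {"fp32": 4, "fp16": 2, "int8": 1, "int4": 0.5}

def estimate_gib(num_params: float, dtype: str, overhead: float = 1.2) -> float:
    """Rough GiB needed to hold the weights, with a fudge factor for runtime overhead."""
    return num_params * BYTES_PER_PARAM[dtype] * overhead / (1024 ** 3)

for dtype in BYTES_PER_PARAM:
    print(f"7B model in {dtype}: ~{estimate_gib(7e9, dtype):.1f} GiB")
```

In fp16 a 7B model comes out to roughly 15 GiB for the weights alone, which lines up with the Space's "under 32 GiB" estimate, and 4-bit quantization brings it down to around 4 GiB.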
And thanks to you, it works now. I knew exactly what to do and what I needed. Thanks, man!
Not trying to be rude, but it's a bit like saying you want to enter a car race without a fast car. The hardware is unfortunately a prerequisite.
You could look at tools that offload your model to disk, but they're going to be slow as hell.
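If you still want to try it, disk offload through Accelerate's `device_map="auto"` is the usual route. A minimal sketch, assuming you have transformers and accelerate installed and enough free disk space (the `offload` folder path and the prompt are just placeholders):

```python
# Minimal sketch: let Accelerate spill layers that don't fit in GPU/CPU memory
# onto disk. Expect very slow generation for any layer that lives on disk.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Naseej/noon-7b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,   # halves the weight footprint vs fp32
    device_map="auto",           # fill the GPU first, then CPU RAM, then disk
    offload_folder="offload",    # placeholder path for the spilled weights
)

inputs = tokenizer("Hello", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Quantized formats (GGUF with llama.cpp, or 4-bit loading) are usually a better trade-off than raw disk offload, since they shrink the model enough to fit in RAM instead of constantly reading weights from disk.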