Was wondering if there's any way to use a bunch of old equipment like this to build an at-home crunch center for running your own LLM, and whether it would be worth it.
Those series of Nvidia GPUs didn't have tensor cores yet; I believe those started with the 20xx series. I'm not sure how much that impacts inference vs. training/fine-tuning, so it's worth doing more research. From what I gathered, the answer is "no" unless you use a 10xx card for something like monitor output, TTS, or another smaller co-LLM task that you don't want taking VRAM away from your main LLM GPUs.
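If you want to check what a given card actually supports, here's a minimal sketch (assuming PyTorch with CUDA is installed and the cards are visible to the driver). Tensor cores roughly map to compute capability 7.0 and up (Volta/Turing and later), while the Pascal 10xx cards report 6.x:

```python
import torch

# List every visible GPU and flag whether it likely has tensor cores.
# Rough rule: compute capability >= 7.0 (Volta/Turing, i.e. 20xx and newer)
# means tensor cores; Pascal 10xx cards report 6.x and have none.
for i in range(torch.cuda.device_count()):
    major, minor = torch.cuda.get_device_capability(i)
    name = torch.cuda.get_device_name(i)
    has_tensor_cores = major >= 7
    print(f"GPU {i}: {name} (sm_{major}{minor}), "
          f"tensor cores: {'yes' if has_tensor_cores else 'no'}")
```

If you do go the route of keeping a small helper model on the old card, you can pin it there explicitly (e.g. `.to("cuda:1")` in PyTorch, or by setting `CUDA_VISIBLE_DEVICES` for that process) so it never competes for VRAM with the main LLM GPU.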