No_Baseball_7130@alien.top to LocalLLaMA@poweruser.forum · re: "Cheapest site for hosting custom LLM models?"
> $0.000575 per second
That works out to nearly $2.10 per hour. On https://runpod.io you could get an A40 for $0.79/hr, and for a 34B model (quantized) 24 GB of VRAM is more than enough, so an A5000 at around $0.44/hr would do.
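Quick sanity check on those numbers; this is just back-of-the-envelope arithmetic using the per-second rate quoted above and the RunPod prices from this comment, not live rates:

```python
# Convert a per-second GPU price to hourly/monthly cost and compare options.
# Prices are the ones quoted in this thread, not live rates.

def hourly(price_per_sec: float) -> float:
    return price_per_sec * 3600  # seconds per hour

options = {
    "quoted per-second rate": hourly(0.000575),  # ~$2.07/hr
    "RunPod A40": 0.79,
    "RunPod A5000": 0.44,
}

for name, usd_per_hr in options.items():
    # ~730 hours in a month of always-on hosting
    print(f"{name:24s} ${usd_per_hr:5.2f}/hr  ~${usd_per_hr * 730:7.2f}/mo")
```

Run always-on, the A5000 comes out around $320/month versus roughly $1,500/month at the quoted per-second rate.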
If you'd rather buy your own hardware, a GPU like a 16 GB P100 covers simple AI work, and a V100/A4 handles slightly heavier workloads.
P100s only cost around $170, so it's cheap to upgrade the GPU.
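To decide which card is enough for a given model, a rough rule of thumb is VRAM needed ≈ parameter count × bytes per parameter, plus some headroom for the KV cache and activations. A minimal sketch, where the 20% overhead factor is an assumption rather than a measured figure:

```python
# Rough VRAM-fit check: will a model fit on a given card?
# Rule of thumb: weights = params * bytes_per_param, plus ~20% overhead
# for KV cache and activations (an assumed factor, not exact).

BYTES_PER_PARAM = {"fp16": 2.0, "int8": 1.0, "int4": 0.5}

def fits(params_b: float, quant: str, vram_gb: float, overhead: float = 1.2) -> bool:
    needed_gb = params_b * BYTES_PER_PARAM[quant] * overhead
    return needed_gb <= vram_gb

# A 34B model in 4-bit needs ~20 GB: fits a 24 GB A5000, not a 16 GB P100.
print(fits(34, "int4", 24))  # True
print(fits(34, "int4", 16))  # False
print(fits(7, "int8", 16))   # True: a quantized 7B model is fine on a P100
```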