PacmanIncarnate@alien.top to LocalLLaMA@poweruser.forum • "Optimizing Your Language Model Experience: A Student's Journey with a Cutting-Edge PC featuring Core i7 14th Gen, RTX 4070 Ti, and 32GB DDR5 RAM" • 10 months ago

A 24 GB GPU is still limited to fitting a 13B fully in VRAM. His PC is a great one; not the highest end, but perfectly fine to run anything up to a 70B in llama.cpp.
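The sizing claim above comes down to simple arithmetic: weight memory is roughly parameter count times bytes per weight. A minimal sketch (the function name and the 4.5 bits-per-weight figure for a typical llama.cpp quantization are assumptions, not from the comment):

```python
# Rough VRAM estimate for model weights only (illustrative sketch).
# Real usage adds KV cache, activations, and framework overhead.

def weight_vram_gb(params_billions: float, bits_per_weight: float) -> float:
    """Approximate memory needed just for the weights, in GiB."""
    total_bytes = params_billions * 1e9 * bits_per_weight / 8
    return total_bytes / 1024**3

print(round(weight_vram_gb(13, 16), 1))   # 13B at fp16: ~24.2 GiB, barely fits a 24 GB card
print(round(weight_vram_gb(13, 4.5), 1))  # 13B at ~4.5-bit quant: ~6.8 GiB
print(round(weight_vram_gb(70, 4.5), 1))  # 70B quantized: ~36.7 GiB, hence CPU offload in llama.cpp
```

This is why a 70B is still runnable on such a PC: llama.cpp can keep part of the quantized weights in system RAM and offload the rest to the GPU.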
PacmanIncarnate@alien.top to LocalLLaMA@poweruser.forum • "Any alternatives to coqui for TTS?" • 10 months ago

And fast. Not sure they'll find something better.