radianart@alien.top to LocalLLaMA@poweruser.forum • Fitting 70B models in a 4gb GPU, The whole model, no quants or distil or anything! • 11 months ago
Is there a better way to use models bigger than what fits in RAM/VRAM? I'd like to try a 70B or maybe even a 120B, but I only have 32 GB RAM and 8 GB VRAM.
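(For reference, the usual llama.cpp approach for this is to memory-map the GGUF file and offload only as many layers as fit in VRAM, letting the OS page the rest in from disk as needed. A minimal sketch with llama-cpp-python; the model path and layer count here are hypothetical placeholders, not a tested recipe:)

```python
# Minimal sketch: run a model larger than VRAM with llama-cpp-python.
# Assumes `pip install llama-cpp-python` with GPU support; the model
# path below is a hypothetical example.
from llama_cpp import Llama

llm = Llama(
    model_path="models/llama-2-70b.Q4_K_M.gguf",  # hypothetical path
    n_gpu_layers=16,   # offload only as many layers as fit in 8 GB VRAM
    use_mmap=True,     # map weights from disk; pages load on demand
    n_ctx=2048,        # modest context to keep the KV cache small
)

out = llm("Q: Why is the sky blue? A:", max_tokens=64)
print(out["choices"][0]["text"])
```

(The trade-off is speed: layers left on CPU, or paged from disk when the model exceeds RAM, run far slower than fully GPU-resident ones.)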