Is this accurate?

  • JoseConseco_@alien.top · 1 year ago

    So how much VRAM would be required for a 34B or a 14B model? I assume no CPU offloading, right? With my 12 GB of VRAM, I guess I could only fit 14-billion-parameter models, and maybe not even that.
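The arithmetic behind the question can be sketched roughly: weight memory is parameter count times bytes per parameter, plus some overhead for the KV cache and runtime buffers. This is a back-of-the-envelope estimate only; the 20% overhead factor below is an assumption for illustration, and real usage varies with context length and inference backend.

```python
def vram_gb(params_billion: float, bits_per_param: int, overhead: float = 1.2) -> float:
    """Rough VRAM estimate for model weights alone.

    overhead=1.2 is an assumed ~20% allowance for KV cache and
    runtime buffers, not an exact figure.
    """
    weight_bytes = params_billion * 1e9 * bits_per_param / 8
    return weight_bytes * overhead / 1e9

for params in (14, 34):
    for bits in (16, 8, 4):
        print(f"{params}B @ {bits}-bit ≈ {vram_gb(params, bits):.1f} GB")
```

By this estimate, a 14B model at 4-bit quantization lands around 8–9 GB (inside a 12 GB card), while a 34B model needs roughly 20 GB even at 4-bit, so it would not fit without CPU offloading.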