Hello, I’m a student delving into the study of large language models. I recently acquired a new PC equipped with a 14th-gen Core i7 processor, an RTX 4070 Ti graphics card, and 32 GB of DDR5 RAM. Could you kindly suggest a language model that would perform well on my machine?

  • opi098514@alien.topB
    1 year ago

    You’re soon going to realize that, unfortunately, your PC is not as cutting edge as you think. Your main constraint is VRAM: the 4070 Ti only has 12 GB, so you’ll be limited to 7B and 13B models fully in VRAM. You can load into system RAM instead, but your speeds plummet. Mistral 7B is a good option to start with.
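
    The VRAM limit above can be sanity-checked with a back-of-envelope calculation. This is my own rough sketch, not from the thread: it assumes quantized weights dominate memory and adds a flat 20% overhead for the KV cache and activations (both numbers are assumptions).

```python
# Rough memory estimate for quantized LLM weights.
# Assumption: weights dominate, plus ~20% overhead for KV cache etc.

def est_vram_gb(n_params_billions: float, bits_per_weight: float,
                overhead: float = 1.2) -> float:
    """Estimate GB needed to hold a model at a given quantization."""
    weight_bytes = n_params_billions * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

for size in (7, 13, 70):
    need = est_vram_gb(size, 4)  # 4-bit quantization (e.g. Q4)
    fits = "fits" if need <= 12 else "does not fit"
    print(f"{size}B @ 4-bit: ~{need:.1f} GB -> {fits} in 12 GB VRAM")
```

    By this estimate a 7B or 13B model at 4-bit fits comfortably in 12 GB, while a 70B does not, which matches the comment.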

    • PacmanIncarnate@alien.topB
      1 year ago

      Even a 24 GB GPU is still limited to fitting a 13B fully in VRAM. His PC is a great one; not the highest end, but perfectly fine to run anything up to a 70B in llama.cpp, with the layers that don’t fit offloaded to system RAM.

      • opi098514@alien.topB
        1 year ago

        I didn’t say it wasn’t. But getting into LLMs really just shows you how much better your PC could be, and you will never be as cutting edge as you think or want.