Hey guys, I'm thinking of upgrading my PC. I'm a dev and I want to run my own LLMs — mainly a local Copilot-style assistant instead of relying on outside services. This is what I've got now:

Ryzen 7 3700X, 32 GB RAM, RX 5500 XT

I'm debating whether to get a 3950X or a 5800X3D, since the latter would let me game a bit better too. As for the GPU, I might just go for a 4090, but if that's overkill please let me know. What do you guys think?

  • FullOf_Bad_Ideas@alien.topB · 1 year ago

    I upgraded from a GTX 1080 to an RTX 3090 Ti two weeks ago. I think going with an RTX 3090 / 3090 Ti / 4090 would be a good option for you. I don't know how big a difference a stronger CPU would make; ExLlamaV2 seems to have some CPU bottlenecking going on, though I have no idea what is computed on the CPU or why. There were moments during generation where it seemed to be using only one thread and maxing it out, bottlenecking the GPU. I don't think RAM matters much unless you're training and merging LoRAs and models.

    • SupplyChainNext@alien.topB · 1 year ago

      Meh, I'm running 13Bs on a 13900K with 64 GB DDR5 and a 6900 XT in LM Studio, and it's faster than my office workstation's 12900KS + 3090 Ti. Sometimes RAM and a decent processor with enough VRAM is enough.

      • ntn8888@alien.topB · 1 year ago

        Your comparison proves his point! A 13B will fit snugly in your 6900 XT, so that's a head-to-head comparison of the cards!

  • ntn8888@alien.topB · 1 year ago

    Welcome to the rabbit hole 😁. On a serious note, going for the newer generations pays dividends, in my opinion.

  • OneConfusion3313@alien.topB · 1 year ago

    Curious, has anyone considered an RTX A5000 / 6000? It's twice the price of a 4090/3090, but the performance improvement should be more than double.

  • ababana97653@alien.topB · 1 year ago

    You want a large amount of VRAM on your video card. Whatever is the largest-VRAM video card you can afford is the answer.
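
    The VRAM advice above can be sanity-checked with a rough back-of-envelope estimate — weights ≈ parameter count × bits per weight, plus some headroom for activations and the KV cache. This is a minimal sketch, not an exact formula; the `overhead_gb` figure is an assumed ballpark, and real usage varies with context length and runtime:

    ```python
    def estimate_vram_gb(params_billion: float, bits_per_weight: float,
                         overhead_gb: float = 1.5) -> float:
        """Rough VRAM estimate for inference: quantized weights plus a
        fixed (assumed) overhead for activations and KV cache."""
        weight_gb = params_billion * 1e9 * bits_per_weight / 8 / 1e9
        return weight_gb + overhead_gb

    # A 13B model at 4-bit quantization: ~6.5 GB of weights + overhead,
    # which is why it fits comfortably on a 16 GB card like the 6900 XT.
    print(round(estimate_vram_gb(13, 4), 1))   # -> 8.0
    # The same model unquantized at fp16 would not fit:
    print(round(estimate_vram_gb(13, 16), 1))  # -> 27.5
    ```

    By the same arithmetic, a 24 GB card (3090/3090 Ti/4090) comfortably holds 4-bit 30B-class models, which is the practical argument for buying as much VRAM as you can.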