tl;dr: can I just drop a used 3090 into my old PC, or would a new 4060 Ti be the safer option?

Hi all!
I really want to get my feet wet running local LLMs, specifically:

  • inference on 7B models
  • some QLoRA fun

I'd also like to have fun running bigger, quantized models and, if possible, fine-tune a smallish model like GPT-2 XL (~1.5B params); if that turns out not to be feasible, I'll just rent some cloud compute. A little gaming (Escape from Tarkov) in my free time wouldn't hurt either.
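
For reference, the kind of QLoRA setup I have in mind is roughly the sketch below (peft + transformers + bitsandbytes; the model choice and hyperparameters are placeholders, not tuned values):

    # Minimal QLoRA-style setup: 4-bit base model + small trainable LoRA adapters.
    import torch
    from transformers import AutoModelForCausalLM, BitsAndBytesConfig
    from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

    bnb_config = BitsAndBytesConfig(
        load_in_4bit=True,                  # the "Q" in QLoRA: 4-bit NF4 quantization
        bnb_4bit_quant_type="nf4",
        bnb_4bit_compute_dtype=torch.bfloat16,
    )
    model = AutoModelForCausalLM.from_pretrained(
        "gpt2-xl",                          # ~1.5B params, fits easily in 16 GB
        quantization_config=bnb_config,
        device_map="auto",
    )
    model = prepare_model_for_kbit_training(model)
    lora = LoraConfig(r=16, lora_alpha=32, target_modules=["c_attn"],
                      lora_dropout=0.05, task_type="CAUSAL_LM")
    model = get_peft_model(model, lora)
    model.print_trainable_parameters()      # only the adapters train, not the base weights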

I've figured out that my best GPU options are:

  • 4060 Ti 16 GB, new, for around €450 (hoping for some Black Friday deals)
  • 3090 24 GB, used, for around €700

My current (very old) PC specs are the following:

  • i5-2500 @ 3.3 GHz
  • 16 GB DDR3
  • Asus P8P67, LGA1155 (4x PCI-E, 32-bit bus width)
  • AMD R9 270 Sapphire
  • a 600 W PSU

So my questions are:

  1. Can I afford to invest all my budget in the 3090? I have a second PSU at home that would be used solely to power the GPU outside the case.
  2. Or is it better to buy the 4060 Ti and use the remaining budget to upgrade the older parts (and if so, which ones)?

Thanks for the help guys!

  • Arkonias@alien.top

    3090 on old hardware sounds like bottleneck hell.

    Use the Black Friday sales to upgrade your motherboard, RAM, CPU, and PSU.

  • DedyLLlka_GROM@alien.top

    The problem with the 4060 Ti on old hardware is that it only runs with 8 PCIe lanes, and that shows even on PCIe gen3. Your i5-2500 uses the even older PCIe gen2. For LLMs, if you're only going to use 7B models, which can fit entirely in 16 GB of VRAM, it should be kinda OK. But for everything else you're going to have a very bad time.
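
    If you want to see what link the card actually negotiated, nvidia-smi can report it. A quick sketch (assuming the NVIDIA driver's nvidia-smi is on your PATH):

        # Query the current vs. maximum PCIe link the GPU has trained to.
        import subprocess

        out = subprocess.run(
            ["nvidia-smi",
             "--query-gpu=pcie.link.gen.current,pcie.link.width.current,"
             "pcie.link.gen.max,pcie.link.width.max",
             "--format=csv"],
            capture_output=True, text=True, check=True,
        ).stdout
        print(out)  # e.g. "2, 8, 2, 16" = a gen2 x8 link on a card capable of more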

  • FullOf_Bad_Ideas@alien.top

    I would go with the RTX 3090. You can always upgrade the rest later, once you have the budget or the need. I don't think you'll be bottlenecked by PCI Express: I had my RTX 3090 Ti running at PCIe 4.0 x4 before I figured out something was wrong, and I was still getting good performance with ExLlama on a 2.4 bpw Llama 70B and other models that fit in VRAM. Training with QLoRA worked just fine too. Bandwidth-wise, that's basically what you'll get with PCIe 2.0 x16 (see the numbers below). I don't know why my slot won't negotiate x16, but reseating the card got it to x8, and I don't see much difference for AI stuff either way.
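
    The napkin math backs that up. Per direction, using the standard line rates and encoding overheads:

        # Host<->GPU bandwidth per direction, from line rate and encoding:
        # PCIe 2.0: 5 GT/s per lane, 8b/10b encoding   -> 0.5   GB/s per lane
        # PCIe 4.0: 16 GT/s per lane, 128b/130b coding -> ~1.97 GB/s per lane
        gen2_x16 = 5e9 * (8 / 10) / 8 * 16     # bits/s -> bytes/s, times 16 lanes
        gen4_x4 = 16e9 * (128 / 130) / 8 * 4   # bits/s -> bytes/s, times 4 lanes
        print(f"PCIe 2.0 x16: {gen2_x16 / 1e9:.2f} GB/s")  # 8.00 GB/s
        print(f"PCIe 4.0 x4:  {gen4_x4 / 1e9:.2f} GB/s")   # 7.88 GB/s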

  • ThisGonBHard@alien.top

    Both are bad choices:

    IDK about the 3090 and that PSU; the card can spike HARD, into the 1.6 kW range, and if your PSU is a lower-end one, it will kill it. I've heard a lot of people in 120 V countries complain that it actually causes their lights to flicker.

    The 4060 Ti would be limited to PCI-E x8 at 2.0 speeds on your board, but it's an xx50-class chip masquerading as a 60, so it sips power.

    The 3090 would be better, but you don't have the PC for it. Get a cheap used AMD B550 board for PCI-E 4.0, 64 GB of RAM, and whatever CPU fits your remaining budget, anywhere from an R5 3600 up to an R7 5800X3D; AM4 is really well segmented on price, even new. You'll get better gaming performance too, and with 64 GB of RAM you can run a lot of stuff on the CPU.

    • crantob@alien.top

      Euros means he's probably on 220-240 V. Those spikes are in the millisecond range, but they do a number on lesser PSUs. To run a 3090 on my lower-end 525 W PSU, I found I needed to cap power to 280 W and the core clock to 1650 MHz. The Asus ROG Strix was punching up to 1900+ MHz by default and black-screening me.

      Now things chug along just fine, though. The 9-year-old PSU is still inaudible with its fanless hydro.
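
      In case it helps anyone, the caps I described can be applied with nvidia-smi (needs admin rights; 280 W / 1650 MHz are just the values that worked for my 525 W unit, not universal numbers):

          # Apply a board power cap and a max core clock via nvidia-smi.
          import subprocess

          subprocess.run(["nvidia-smi", "-pl", "280"], check=True)      # power limit, watts
          subprocess.run(["nvidia-smi", "-lgc", "0,1650"], check=True)  # lock core clock <= 1650 MHz
          # undo the clock lock later with: nvidia-smi -rgc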

  • NekoHikari@alien.top

    Try selling everything (the PC and the spare PSU) and consider buying a decent used platform with AVX2 support and a 1000 W+ PSU, then see what GPU you can still afford. Plus, if you're just toying with LLMs rather than training them, P40s are an option too, if you can get one for around 100 euros.

  • LocoLanguageModel@alien.top

    I went with the 4060 Ti 16 GB in a small Black Friday sale because I didn't want to upgrade my PSU, and it's the workstation for both my jobs, so I don't want to mess with a used or refurbished 3090, which would be a lot more money once I factor in the PSU.

    This way it cost me less than half as much for new gear. My goal is just to run 20B, 13B, and smaller coding models, and in time I expect something with more VRAM will come out at a reasonable price without requiring a huge PSU.

    I also have 64 GB of RAM if I need to call on a larger model and partially offload it.
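
    Roughly what that offload looks like with llama-cpp-python, for anyone curious (a sketch; the model path and layer split are placeholders you'd tune to your own VRAM):

        # Split a quantized GGUF model between 16 GB of VRAM and system RAM.
        from llama_cpp import Llama

        llm = Llama(
            model_path="models/some-20b.Q4_K_M.gguf",  # hypothetical file name
            n_gpu_layers=40,   # offload as many layers as fit in VRAM; the rest run on CPU
            n_ctx=4096,
        )
        print(llm("Write a one-line docstring for a sort function.", max_tokens=64))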

    • C080@alien.top (OP)

      Thanks for the feedback! My situation matches yours a lot!