tldr: can I just drop a used 3090 into my old PC, or would a new 4060 Ti be the safer option?

Hi all!
I really want to get my feet wet running local LLMs, especially:

  • inference with 7B models
  • some QLoRA fun

I’d also like to have fun running bigger quantized models and, if possible, finetune a smallish model like GPT-2 XL (~1.5B parameters); if that isn’t feasible locally, I’ll just rent some cloud compute. A little bit of gaming (Escape from Tarkov) in my free time wouldn’t hurt either.
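
To give a better idea of the kind of workload I mean, here’s a rough, untested sketch of 4-bit 7B inference with transformers + bitsandbytes (the model name is just an example, not something I’ve settled on):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "mistralai/Mistral-7B-v0.1"  # placeholder 7B model, just for illustration

# 4-bit NF4 quantization so a 7B model fits easily in 16 GB (or 24 GB) of VRAM
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",  # puts the whole model on the GPU if it fits
)

prompt = "My old i5-2500 build could really use"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```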

I’ve figured out that my best GPU options are:

  • 4060 Ti 16 GB, new, for around 450€ (hoping for some Black Friday deals)
  • 3090 24 GB, used, for around 700€

My current (very old) PC specs are the following:

  • i5-2500 @ 3.3 GHz
  • 16 GB DDR3
  • Asus P8P67 LGA1155 (PCI-E 2.0 x16)
  • AMD R9 270 Sapphire
  • a 600 W PSU

So my questions are:

  1. Does it make sense to put my whole budget into the 3090? I have a second PSU at home that would be used only to power the GPU, sitting outside the case.
  2. Or is it better to buy the 4060 Ti and use the remaining budget to upgrade the older parts (in that case, which ones)?

Thanks for the help guys!

  • FullOf_Bad_Ideas@alien.top · 1 year ago

    I would go with the RTX 3090. You can always upgrade the rest later, once you have the budget or the need. I don’t think you’ll be bottlenecked by PCI Express either: my RTX 3090 Ti was running at PCIe 4.0 x4 before I noticed something was off with the slot, and I was still getting good performance with exllama on a 2.4bpw Llama 70B and other models that fit in VRAM; training with QLoRA worked just fine too. PCIe 4.0 x4 is basically the same bandwidth you’ll get from PCIe 2.0 x16. I never figured out why the card wouldn’t kick into x16, but reseating it got me to x8, and I don’t see much difference for AI stuff either way.
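
    In case it helps, this is roughly the kind of QLoRA setup I mean (a minimal sketch with transformers + peft + bitsandbytes; the model id and LoRA hyperparameters are placeholders, not my actual config):

    ```python
    import torch
    from transformers import AutoModelForCausalLM, BitsAndBytesConfig
    from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

    model_id = "meta-llama/Llama-2-7b-hf"  # placeholder, swap in whatever base model you like

    # base weights stay frozen in 4-bit NF4; only the small LoRA adapters get gradients
    bnb_config = BitsAndBytesConfig(
        load_in_4bit=True,
        bnb_4bit_quant_type="nf4",
        bnb_4bit_use_double_quant=True,
        bnb_4bit_compute_dtype=torch.bfloat16,
    )

    model = AutoModelForCausalLM.from_pretrained(
        model_id, quantization_config=bnb_config, device_map="auto"
    )
    model = prepare_model_for_kbit_training(model)

    lora_config = LoraConfig(
        r=16,
        lora_alpha=32,
        target_modules=["q_proj", "v_proj"],
        lora_dropout=0.05,
        task_type="CAUSAL_LM",
    )
    model = get_peft_model(model, lora_config)
    model.print_trainable_parameters()

    # from here it's a normal Trainer / SFTTrainer loop; once the model is on the
    # card, very little crosses the PCIe bus, which is why x4/x8 barely matters
    ```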