Well I am shocked, SHOCKED I say! Well, not that shocked.

  • candyman337@lemmy.world · 3 points · 31 minutes ago

    It’s just because I’m not impressed; the raster performance bump at 1440p was just not worth the price jump at all. On top of that they have manufacturing issues and issues with their stupid 12-pin connector? And all the shit on the business side, like not providing drivers to reviewers. Fuuucccckk all that, man. I’m waiting until AMD gets a little better with ray tracing and then switching to team red.

  • moktor@lemmy.world · 2 points · 21 minutes ago

    I’m still surviving on my RX 580 4GB. Limping along these days, but no way I can justify the price of a new GPU.

  • Phoenicianpirate@lemm.ee · 2 points · 27 minutes ago

    I bought my most expensive dream machine last year (when the RTX 4090 was still the best) and I am proud of it. I hope it’ll be my rig for at least 10 years.

    But it was expensive.

  • chunes@lemmy.world · 7 points · 2 hours ago

    I stopped maintaining an AAA-capable rig in 2016. I’ve been playing indies since and haven’t felt left out whatsoever.

    • tea@lemmy.today · 1 point · 4 minutes ago

      Indies are great. I can play AAA titles but don’t really ever… It seems like that’s where the folks with the most creativity are focusing their energy anyway.

  • localhost443@discuss.tchncs.de · 2 points · 3 hours ago

    Bought a 5700 XT on release for £400 and ran that ’til last year, when the 7900 GRE released in the UK. Can’t remember what I paid, but it was a lot less than the flagship 7900 and I foresee it lasting many years, as I have no desire to go above 2K.

    AMD GPUs have been pretty great value compared to Nvidia recently, as long as you’re not tying your self-worth to your average FPS figures.

  • simple@piefed.social · 25 points · 8 hours ago

    Unfortunately, gamers aren’t the real target audience for new GPUs; it’s AI bros. Even if nobody buys a 4090/5090 for gaming, they’re always out of stock, as LLM enthusiasts and small companies use them for AI.

  • ArxCyberwolf@lemmy.ca · 6 points · 7 hours ago

    I bought a 3070 for far more than I should’ve back when it was new, and I don’t plan to make that mistake twice. This GPU is likely going to be staying in this PC ’til it croaks. I’ve never felt the need for anything more powerful anyway; it runs everything I need it to on high settings.

    • keyez@lemmy.world · 1 point · 40 minutes ago

      Same. I got a 3080 12GB a few months after release for $1k from EVGA, and it’s the most I’ve ever spent on a computer part. Next upgrade is def gonna be in the $600-700 range; not making that mistake again.

      • Bakkoda@sh.itjust.works · 4 points · 2 hours ago

        I had lost all interest in games for a while; the desktop just ended up being for tinkering in the homelab. The Steam Deck has been so great for falling in love with gaming again.

  • bluesheep@lemm.ee · 39 points · 13 hours ago

    “Paying Bills Takes Priority Over Chasing NVIDIA’s RTX 5090”

    Yeah, no shit. What a weird fucking take.

  • sp3ctr4l@lemmy.dbzer0.com · 39 points · 14 hours ago

    In the US, a new RTX 5090 currently costs $2,899 at Newegg (the lowest price I can find) and has a max power draw of 575 watts.

    … That is a GPU with roughly the cost and power usage of an entire, quite high-end gaming PC from 5 years ago… or even a reasonably high-end PC from right now.

    The entire move to the realtime raytracing paradigm has enabled AAA game devs to get very sloppy with development, not really bothering to optimize lighting or textures, which in turn has necessitated the invention of intelligent temporal upscaling and frame generation… and the whole, originally advertised point of all this was to make high-fidelity 4K gaming an affordable reality.

    This reality is a farce.

    Meanwhile, if you jump down to 1440p, well, I’ve got a future build plan sitting in a Newegg wishlist right now.

    RX 9070 (220 W) + Minisforum BD795i SE (mobo + non-removable, high-end AMD laptop CPU with performance comparable to a 9900X at about half the wattage draw)… so far my pre-tax total for the whole build is under $1500, and, while I need to double- and triple-check this, I think the math on the power draw works out to a 650 watt power supply being all you’d need… potentially with enough wattage headroom left over to also add some extra internal HDD storage drives.
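    For anyone who wants to sanity-check that 650 W figure, here’s the rough sketch I mean. Only the GPU’s 220 W comes from the parts list above; every other wattage is my ballpark guess:

    ```python
    # Back-of-the-envelope PSU check for the build above.
    # Only the 220 W GPU figure is from the parts list; the rest
    # are rough guesses, not spec-sheet or measured numbers.
    parts = {
        "RX 9070 (board power)":  220,
        "laptop CPU (BD795i SE)":  75,  # guess: ~half a 9900X's draw
        "motherboard + RAM":       50,  # guess
        "NVMe SSD":                10,  # guess
        "fans / misc":             20,  # guess
    }

    peak = sum(parts.values())
    psu = 650
    print(f"estimated peak draw: {peak} W")              # ~375 W
    print(f"headroom on a {psu} W PSU: {psu - peak} W")  # room for HDDs etc.
    ```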

    If you want to go a bit over the $1500 mark, you could fit this all in a console-sized ITX case.

    That is almost half the cost of the RTX 5090 alone, and it will get you over 90 fps in almost all modern games at ultra settings in 1440p, though you will have to futz around with intelligent upscaling and frame gen if you want realtime raytracing at similar framerates, and realistically you’ll probably have to wait another quarter or two for AMD driver support and FSR 4 to become a bit more mature and properly implemented in said games.

    Or you could swap in maybe a 5070 (non-Ti; the Ti is $1000 more) Nvidia card, but seeing as I’m making a Linux gaming PC, you know, for the performance boost from not running Windows, AMD’s Mesa drivers are where you wanna be.

    • CheeseNoodle@lemmy.world · 17 points · 14 hours ago

      Saved up for a couple of years and built the best (consumer-grade) non-Nvidia PC I could: 9070 XT, 9950X3D, 64 GB of RAM. Pretty much top-end everything that isn’t Nvidia or just spamming redundant RAM for no reason. The whole thing still cost less than a single RTX 5090, and on average it draws less power too.

        • CheeseNoodle@lemmy.world · 1 point · 8 hours ago

          I tried Mint and Ubuntu, but Linux dies a horrific death trying to run newly released hardware, so I ended up on Ghost Spectre.
          (I also assume you’re being sarcastic, but I’m still salty about wasting a week trying various pieces of advice to make Linux goddamn work.)

          • bitwolf@sh.itjust.works · 5 points · 7 hours ago

            Level1Techs had relevant guidance:

            Kernel 6.14 or greater
            Mesa 25.1 or greater

            I don’t think Ubuntu and Mint have those yet, hence your difficult time.
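            If you want to check what a machine is actually running before distro-hopping, a quick sketch like this works; it assumes a Linux box with glxinfo (from the mesa-utils package) installed:

            ```python
            # Print kernel and Mesa versions to compare against the
            # guidance above (kernel >= 6.14, Mesa >= 25.1).
            # Assumes glxinfo (mesa-utils) is installed.
            import platform
            import re
            import subprocess

            print("kernel:", platform.release())  # e.g. "6.14.2-..."

            out = subprocess.run(["glxinfo", "-B"],
                                 capture_output=True, text=True).stdout
            m = re.search(r"Mesa (\d+\.\d+)", out)
            print("mesa:", m.group(1) if m else "not found")
            ```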

      • sp3ctr4l@lemmy.dbzer0.com · 8 points · 12 hours ago

        Yep, that’s gonna be significantly more powerful than my planned build… and likely somewhere between $500 and $1000 more expensive… but yep, that is how absurd this is: all of that is still less expensive than an RTX 5090.

        I’m guessing you could get all of that to work with a 750 W PSU, or 850 W if you also want to have a bunch of storage drives or a lot of cooling, but yeah, you’d only need that full wattage for running raytracing at 4K.

        Does that sound about right?

        Either way… yeah… imagine an alternate timeline where marketing and industry direction aren’t bullshit, where people actually admit things like:

        Consoles cannot really do what they claim to do at 4K… at actual 4K.

        They use checkerboard upscaling, so they’re basically running at 2K and scaling up, and it’s actually less than 2K in demanding raytraced games, because they’re using FSR or DLSS on top of that. Oh, and the base graphics settings are a mix of what PC gamers would call medium and high, but consoles don’t show real graphics settings menus, so console gamers don’t know that.
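        The raw pixel counts make the gap obvious. A quick sketch (“2K” meaning 1440p, as above; this ignores whatever FSR/DLSS do on top):

        ```python
        # Pixel counts behind the "4K" marketing claim; "2K" = 1440p here.
        native_4k = 3840 * 2160  # 8,294,400 px

        for name, (w, h) in {"4K": (3840, 2160), "2K": (2560, 1440)}.items():
            px = w * h
            print(f"{name}: {px:,} px ({px / native_4k:.0%} of native 4K)")

        # Checkerboard rendering shades roughly half the 4K pixels per
        # frame, so the real per-frame work is ~50% of native 4K -- about
        # 2K territory (2K is ~44%) -- before FSR/DLSS reduce it further.
        print(f"checkerboarded 4K: {native_4k // 2:,} px per frame")
        ```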

        Maybe, maybe we could have tried to focus on just perfecting frames-per-watt and frames-per-dollar efficiency at 2K, instead of being baffled with marketing BS claiming we can just leapfrog to 4K, and, more recently, being told that 8K displays make any goddamned sense at all, when in 95% of home setups, of any kind, they have no physically possible perceptible gains.
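        The viewing-geometry math backs that 8K point up. A sketch, assuming a 65-inch 16:9 panel at a 3 m couch distance and the commonly cited ~60 pixels-per-degree limit of 20/20 vision (all three numbers are my assumptions; plug in your own setup):

        ```python
        import math

        # Pixels per degree at the viewer's eye for a 16:9 panel.
        # Assumed: 65" screen, 3 m distance, ~60 ppd acuity limit.
        def pixels_per_degree(h_pixels, diag_inches, distance_m):
            width_m = diag_inches * 0.0254 * 16 / math.hypot(16, 9)
            fov_deg = math.degrees(2 * math.atan(width_m / 2 / distance_m))
            return h_pixels / fov_deg

        for name, h_px in [("2K", 2560), ("4K", 3840), ("8K", 7680)]:
            print(f"{name}: {pixels_per_degree(h_px, 65, 3.0):.0f} ppd")

        # 4K already lands near ~140 ppd here, far past the ~60 ppd limit,
        # so 8K's extra pixels are below the threshold of perception.
        ```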

        • CheeseNoodle@lemmy.world · 6 points · 12 hours ago

          A 1000 W PSU, for the theoretical maximum draw of all components at once with a good safety margin. But even when running a render I’ve never seen it break 500 W.

  • JordanZ@lemmy.world · 79 points · 17 hours ago

    When did it just become expected that everybody would upgrade GPUs every year, and that that’s supposed to be normal? I don’t understand people upgrading phones every year either. Both of those things are high cost for minimal gains between years. You really need 3+ years for any meaningful gains, especially over the last few years.

    • qweertz (they/she)@programming.dev · 2 points · 7 hours ago

      Still rocking a GTX 1070, and I plan on using my GrapheneOS Pixel 8 Pro till 2030 (only bought it (used, ofc) bc my Huawei Mate 20 Pro died on me in October last year 😔)

    • Sixty@sh.itjust.works · 2 points · 8 hours ago

      Sticking with 1440p on desktop has gone very well for me. 2160p isn’t worth the cost in money or perf.

    • vividspecter@lemm.ee · 41 points · 16 hours ago

      It doesn’t help that the gains have been smaller, and the prices higher.

      I’ve got an RX 6800 I bought in 2020, and nothing but the 5090 is a significant upgrade, and I’m sure as fuck not paying that kind of money for a video card.

      • arudesalad@sh.itjust.works · 3 points · 8 hours ago

        I have a 6700 XT and a 5700X, and my PC can do VR and play Star Citizen; those are the most demanding things I do on it. Why should I spend almost £1000 to get a 5070 or 9070 plus an AM5 board and processor?

      • AndyMFK@lemmy.dbzer0.com · 6 points · 11 hours ago

        I just picked up a used RX 6800 XT after doing some research and comparing prices.

        The fact that a GPU this old can outperform or match most newer cards at a fraction of the price is insane, but I’m very happy with my purchase. Solid upgrade from my 1070 Ti.

      • ByteJunk@lemmy.world · 9 points · 14 hours ago

        I’m in the same boat.

        In general, there’s just no way I could ever justify buying an Nvidia card in terms of performance per dollar; it’s absolutely ridiculous.

        I’ll fork over four digits for a graphics card when salaries go up by a digit as well.

      • GrindingGears@lemmy.ca · 5 points · 13 hours ago

        Not to mention the cards have gotten huge and you just about need a nuclear reactor to power them. Melting cables and all.

    • Jesus_666@lemmy.world · 7 points · 14 hours ago

      “When did it just become expected that everybody would upgrade GPUs every year, and that that’s supposed to be normal?”

      Somewhere around 1996 when the 3dfx Voodoo came out. Once a year was a relatively conservative upgrade schedule in the late 90s.

        • Jesus_666@lemmy.world · 8 points · 13 hours ago

          That’s still not cheap when you account for inflation. Of course there’s a world of difference between “not cheap” and what they charge these days.

    • missingno@fedia.io · 13 points · 16 hours ago

      I don’t think they’re actually expecting anyone to upgrade annually. But there’s always someone due for an upgrade, however long it’s been for them. You can compare what percentage of users upgraded this year to previous years.

      • dditty@lemmy.dbzer0.com · 3 points · 11 hours ago

        I just finally upgraded from a 1080 Ti to a 5070 Ti. At high-refresh-rate 1440p, the 1080 Ti was definitely showing its age, and certain games would crash (even with no GPU overclock). Fortunately I was able to get a PNY 5070 Ti for only ~$60 over MSRP at the local Micro Center.

        The 5000 series is a pretty shitty value across the board, but I got a new job (and a pay increase), so it was the right time for me to upgrade after 8 years.

    • 474D@lemmy.world · 5 points · 15 hours ago

      “When did it just become expected that everybody would upgrade GPUs every year, and that that’s supposed to be normal?” That’s a really good question, because I don’t think normal PC gamers ever have been, or are, like that. It’s basically part of the culture to stretch your GPU as long as you can, so idk who you’re complaining about. Yeah, GPU prices are bullshit rn, but let’s not make stuff up.

      • zurohki@aussie.zone · 8 points · 11 hours ago

        Nah, there was a time when you’d get a new card every two years and it’d be twice as fast for the same price.

        Nowadays the new cards are 10% faster for 15% more money.
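        Put numbers on it and the value is literally going backwards (illustrative figures, not real card prices):

        ```python
        # Perf-per-dollar change across a GPU generation (illustrative).
        def perf_per_dollar_change(speedup, price_increase):
            return (1 + speedup) / (1 + price_increase) - 1

        then = perf_per_dollar_change(1.00, 0.00)  # 2x as fast, same price
        now = perf_per_dollar_change(0.10, 0.15)   # +10% speed, +15% price

        print(f"then: {then:+.0%} perf per dollar")  # +100%
        print(f"now:  {now:+.0%} perf per dollar")   # ~-4%: value regressed
        ```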

        I bought a new card last year after running a Vega 64 for ages and I honestly think it might last me ten years because things are only getting worse.

    • overload@sopuli.xyz · 18 points · 17 hours ago

      Absolutely. Truly creative games are made by smaller dev teams that aren’t forcing ray tracing and lifelike graphics. The new Indiana Jones game isn’t a GPU seller, and it’s the only game where I’ve personally had poor performance with my 3070 Ti at 1440p.