• Norgur@kbin.social · +107/-4 · 1 year ago

    Thing is: there is always the “next better thing” around the corner. That’s what progress is about. The only thing you can do is choose the best available option for you when you need new hardware and be done with it until you need another upgrade.

      • wrath_of_grunge@kbin.social · +20/-3 · 1 year ago

        Really, my rule of thumb has always been to upgrade when it's a significant jump.

        For a long time I didn't upgrade until it was a 4x increase over what I had, with the occasional exception. Nowadays I'm a bit more opportunistic, but I still seek out 'meaningful' upgrades: a decent jump over the old, typically a 50% improvement in performance, or upgrades I can get for really cheap.

    • Hydroel@lemmy.world · +11 · 1 year ago

      Yeah, it's always like that: "I want to buy the new shiny thing! But it's expensive, so I'll wait a while for the price to come down." You wait a while, the price comes down, you buy the new shiny thing, and then the newest shiny thing comes out.

      • Norgur@kbin.social · +4 · 1 year ago

        Yep. There will always be "just wait N months and there will be the bestest thing that beats the old bestest thing". You are guaranteed to get buyer's remorse when shopping for hardware. Just buy what best suits your needs and budget at the time you decide is right for you (or when your old component bites the dust), then stop looking at any developments for those components for at least a year. Ignore any deals, new releases, whatever, and be happy with the component you bought.

    • Nik282000@lemmy.ca · +6/-1 · 1 year ago

      I bought a 1080 for my last PC build, downloaded the driver installer and ran the setup. There were ads in the setup for the 2k series that had launched the day before. FML

      • Norgur@kbin.social · +9 · 1 year ago

        Yep. I bought a 4080 just a few weeks ago, and now there are ads for the refresh all over… Thing is: your card didn't get any worse. You thought the card was a good value proposition when you bought it, and it hasn't lost any of that.

    • alessandro@lemmy.ca (OP) · +3 · 1 year ago

      "choose the best available option"

      That's exactly the point: which one is the best available option?

      The simplest answer would be "price per FPS".
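
      As a toy illustration of that metric, here is a minimal sketch; the card names, prices, and FPS figures are made up for the example, not benchmarks:

      ```python
      # Toy "price per FPS" comparison; cards, prices, and FPS figures are purely illustrative.
      cards = {
          "Card A": {"price_usd": 600, "avg_fps": 95},
          "Card B": {"price_usd": 1000, "avg_fps": 120},
      }

      for name, c in cards.items():
          print(f"{name}: ${c['price_usd'] / c['avg_fps']:.2f} per FPS")
      ```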

      • Norgur@kbin.social · +6 · 1 year ago

        Not always. I do a lot of rendering and such, so FPS isn't my primary concern.

  • Schmuppes@lemmy.world · +17 · 1 year ago

    Major refresh means what nowadays? 7 instead of 4 percent gains compared to the previous generation?

    • NOT_RICK@lemmy.world · +8 · 1 year ago

      The article speculates a 5% gain for the 4080 Super but a 22% gain for the 4070 Super, which makes sense because the base 4070 was really disappointing compared to the 3070.

    • massive_bereavement@kbin.social · +6 · edited · 1 year ago

      For anything ML related, having the additional memory is worth the investment, as it allows for larger models.

      That said, at these prices it raises the question of whether it's more sensible to just throw money at GCP or AWS for their GPU node time.
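
      One rough way to frame that question is a break-even calculation; every number in the sketch below is an assumption for illustration, not an actual GCP/AWS or street price:

      ```python
      # Rough break-even sketch: a GPU bought outright vs. renting a comparable
      # cloud GPU by the hour. Every figure is an assumption for illustration only.
      gpu_price_usd = 800.0        # assumed price of a used 24 GB card
      power_cost_per_hour = 0.05   # ~350 W at ~$0.15/kWh (assumed)
      cloud_rate_per_hour = 1.10   # assumed on-demand rate for a comparable cloud GPU

      break_even_hours = gpu_price_usd / (cloud_rate_per_hour - power_cost_per_hour)
      print(f"Break-even after ~{break_even_hours:.0f} GPU-hours "
            f"(~{break_even_hours / 8:.0f} eight-hour days)")
      ```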

    • zoe@jlai.lu · +4/-1 · edited · 1 year ago

      Just 10-15 years at least, for smartphones and electronics overall too. Process nodes are now harder than ever to shrink. Holding on to my 12 nm CCP phone like there's no tomorrow…

    • BCsven@lemmy.ca · +4 · 1 year ago

      AMD is a better decision, but my nVidia card works great with Linux. I'm on openSUSE, and nVidia hosts their own openSUSE drivers, so it works from the get-go once you add the nVidia repo.

      • gnuplusmatt@reddthat.com · +3 · 1 year ago

        I had an nvidia 660 GT back in 2013. It was a pain in the arse on a leading-edge distro: it used to break Xorg for a couple of months every time there was an Xorg release (which admittedly is really rare these days, since Xorg is in sunset mode). Buying an AMD card was the best hardware decision: no hassles, and I've been on Wayland since Fedora 35.

          • gnuplusmatt@reddthat.com · +1/-2 · edited · 1 year ago

            Yeah, no. I don't want to be fucking with my machine just because I want to run a modern display server. I want my driver as part of my system. Until NV can get out of their own way and match the AMD experience (or even Intel's), I'm not interested.

        • lowmane@lemmy.world · +1 · edited · 1 year ago

          It's not at all. You have a dated notion of what the experience with an nvidia GPU has been like over the past few years.

          • gnuplusmatt@reddthat.com · +1 · 1 year ago

            dated notion of the experience

            Do I still have to load a module that taints my kernel and could break due to ABI incompatibility? Does Wayland work in an equivalent manner to the in-kernel drivers that properly support GBM?
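
            For reference, the "taint" being asked about is a bitmask the kernel exposes at /proc/sys/kernel/tainted. A minimal, Linux-only sketch that decodes the module-related bits (bit meanings follow the kernel's documented taint flags):

            ```python
            # Decode the module-related bits of the Linux kernel taint mask.
            MODULE_TAINT_BITS = {
                0: "P: proprietary (non-GPL) module loaded",
                12: "O: out-of-tree module loaded",
                13: "E: unsigned module loaded",
            }

            with open("/proc/sys/kernel/tainted") as f:
                mask = int(f.read().strip())

            if mask == 0:
                print("kernel is not tainted")
            else:
                for bit, meaning in MODULE_TAINT_BITS.items():
                    if mask & (1 << bit):
                        print(meaning)
            ```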

  • state_electrician@discuss.tchncs.de · +3/-1 · 1 year ago

    Only slightly related question: is there such a thing as an external nVidia GPU for AI models? I know I can rent cloud GPUs but I am wondering if long-term something like an external GPU might be worth it.

    • baconisaveg@lemmy.ca · +6 · 1 year ago

      A 3090 (used) is the best bang for your buck for any LLM / StableDiffusion work right now. I’ve seen external GPU enclosures, though they probably cost as much as slapping a used 3090 into a barebones rig and running it headless in a closet.
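
      To put the 3090's 24 GB in perspective, a rough rule of thumb for whether an LLM fits in VRAM is parameter count times bytes per parameter, plus some overhead for activations and the KV cache. The sketch below uses assumed, illustrative numbers, not measured requirements:

      ```python
      # Back-of-the-envelope check of which model sizes fit in a 24 GB card.
      # Overhead factor and quantization sizes are rough assumptions.
      def vram_needed_gb(params_billion, bytes_per_param, overhead=1.2):
          return params_billion * bytes_per_param * overhead

      CARD_VRAM_GB = 24

      for params in (7, 13, 33, 70):
          for label, bpp in (("fp16", 2.0), ("8-bit", 1.0), ("4-bit", 0.5)):
              need = vram_needed_gb(params, bpp)
              verdict = "fits" if need <= CARD_VRAM_GB else "too big"
              print(f"{params}B @ {label}: ~{need:.1f} GB -> {verdict}")
      ```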

    • AnotherDirtyAnglo@lemmy.ca · +3 · 1 year ago

      Generally speaking, buying outright works out cheaper than renting, because you can keep running the device for years, or sell it later to reclaim some capital.

  • dellish@lemmy.world · +2 · 1 year ago

    Perhaps this is a good place to ask now that the topic has been raised. I have an ASUS TUF A15 laptop with an nVidia GTX 1650 Ti graphics card, and I am SO sick of 500 MB driver "updates" that are basically beta tests that break one thing or another. What are the chances of upgrading to a Radeon/AMD graphics card? Or am I stuck with this shit?

    • vivadanang@lemm.ee · +3 · 1 year ago

      have an ASUS TUF A15 laptop with an nVidia GTX 1650 Ti graphics card and I am SO sick of 500 MB driver "updates" that are basically beta tests that break one thing or another. What are the chances of upgrading to a Radeon/AMD graphics card? Or am I stuck with this shit?

      In a laptop? Practically none. There are some very rare 'laptops' out there (really chonk-tops) that have full-size desktop GPUs inside them. The vast majority, on the other hand, have 'mobile' versions of these GPUs that are basically permanently attached to the laptop's motherboard (if not part of the mobo itself).

      One example of a laptop with a full-size GPU (legacy, these aren't sold anymore): https://www.titancomputers.com/Titan-M151-GPU-Computing-Laptop-workstation-p/m151.htm. Note the THICK chassis; that's what you need to hold a desktop GPU.

    • chemsed@lemmy.ca · +1 · 1 year ago

      In my experience, AMD is not more reliable with updates. I had to do a clean install three times to get my RX 6600 functioning properly, and months later I have a freezing issue that may be caused by my GPU.

    • gazab@lemmy.world · +1 · 1 year ago

      You could use a separate external GPU if you have Thunderbolt ports. It's not cheap and you sacrifice some performance, but it's worth it for the flexibility in my opinion. Check out https://egpu.io/

  • LemmyIsFantastic@lemmy.world · +4/-6 · 1 year ago

    If the Super is priced even remotely reasonably, I'll be jumping on the 4080. I'll finally get close to consistent 4K60.

    • MudMan@kbin.social · +5 · 1 year ago

      I miss that small time window where maxing out games and not having to tweak and tune was a thing.

      Is 4K60 the goal? Because I have a bunch of 120 Hz displays, so… 4K60? Or what about 1440p120? Or maybe you split the difference, aim for 90-ish at upscaled 4K, and let VRR eat the difference. And of course I have handhelds, so those are a separate performance target altogether.

      These days you are tuning everything no matter what unless you’re running… well, a game from that era when 1080p60 was the only option.
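
      Part of why those targets pull in different directions is the per-frame time budget, which is just 1000 ms divided by the refresh rate; a quick sketch of the targets mentioned above:

      ```python
      # Frame-time budget for each performance target mentioned above.
      targets = [("4K60", 60), ("upscaled 4K ~90 (VRR)", 90), ("1440p120", 120)]
      for label, hz in targets:
          print(f"{label}: {1000 / hz:.1f} ms per frame")
      ```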

      • BaroqueInMind@kbin.social · +1 · 1 year ago

        You'll have your dream come true when consumers are told to upgrade their televisions after the next generation of game consoles mandates it as the next new shiny feature.

        • LemmyIsFantastic@lemmy.world · +1/-1 · 1 year ago

          I have a C1. I actually target 4K120, but that's not going to happen consistently even on a 4090. Your seething is a lol.

        • skizzles@lemmy.ml · +5 · 1 year ago

          Swapped over to a 7800 XT about 3 months ago. Better Linux performance, and it worked fine when I tested a bit on Windows too. I'm more than satisfied with my decision to hop over from my 3060.

          • Bratwurstboy@iusearchlinux.fyi · +2 · 1 year ago

            Yeah, same here. I switched from a 3080 to a 7900 XTX and couldn't be happier. FPS doubled in some games, it only needs two 8-pin connectors, and I really like the Adrenalin software. Haven't tested Linux yet, but I'm going to soon.

            • skizzles@lemmy.ml · +1 · 1 year ago

              The Linux driver isn't the most straightforward to install, but it's not difficult: you install AMD's installer package first, then use that installer to install the driver.

              The only caveat is that you don't get the Adrenalin software, but that's kind of a moot point as it wouldn't work anyway due to how Linux works.

              I'm running Ubuntu 22.04. I was using GNOME but switched over to KDE, and I stopped getting crashes in certain games and got a small performance increase. I suppose that would be due to dropping Wayland for X when I moved to KDE.

    • UnspecificGravity@discuss.tchncs.de · +8/-1 · 1 year ago

      For the vast majority of customers, who aren't looking to spend close to a grand on a card that is infinitesimally better than one at half the price, AMD has plenty to offer.

    • Fridgeratr@lemmy.world · +3/-1 · 1 year ago

      AMD is absolutely cutting it!! They may not have DLSS or ray-trace quite as well, but their cards still kick ass.