• Sludgehammer@lemmy.world · 2 months ago

    Gee, we’ve had over half a century of computer graphics at this point. However, the moment a technology arises that requires obscene amounts of GPU power to generate results, a GPU manufacturer is here to tell us that all computer graphics without that new technology is dead, for… reasons. I cannot see any connection between these points.

    • abruptly8951@lemmy.world · 2 months ago

      Devil’s advocate: splatting, DLSS, and neural codecs, to name a few things that will change the way we make games.

    • lolcatnip@reddthat.com · 2 months ago

      I think what he means is that AI is needed to keep making substantial improvements in graphic quality, and he phrased it badly. Your interpretation kind of presumes he’s not only lying, but that he thinks we’re all idiots. Given that he’s not running for office as a Republican, I think that’s a very flawed assumption.

    • MudMan@fedia.io · 2 months ago

      What do you mean “suddenly”? I was running path tracers back in 1994. It’s just that they took minutes to hours to generate a 480p image.

      The argument is that we’ve gotten to the point where new rendering features rely on a lot more path tracing and light simulation than used to be feasible in real time. Pair that with the fact that displays have gone from 1080p60 with vsync to 4K at arbitrarily high framerates and… yeah, I don’t think you realize how much additional processing power we’re requesting.
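
      As a rough back-of-the-envelope (taking 4K at 144 Hz as the high-refresh target, purely as an illustrative assumption), the raw pixel throughput gap alone is close to an order of magnitude:

          # Rough pixel-throughput comparison; the 144 Hz target is an assumption for illustration.
          baseline = 1920 * 1080 * 60    # 1080p60: ~124 million pixels per second
          modern = 3840 * 2160 * 144     # 4K144: ~1.19 billion pixels per second
          print(round(modern / baseline, 1))  # -> 9.6, i.e. roughly 9.6x more pixels to shade every second

      And that’s before multiplying by the extra rays per pixel that real-time path tracing asks for.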

      But the good news is that if you were happy with 1080p60, you can absolutely render modern games like that on a modern GPU without needing any upscaling.

      • Kushan@lemmy.world · 2 months ago

        I think you just need to look at the PS5 Pro as proof that more GPU power doesn’t translate linearly to better picture quality.

        The PS5 Pro has a 67% beefier GPU than the standard PS5 - with a price to match - yet can anyone say the end result is 67% better? Is it even 10% better?

        We’ve been hitting diminishing returns on raw rasterisation for years now; a different approach is definitely needed.