Screens keep getting faster. Can you even tell?

CES saw the launch of several 360Hz and even 480Hz OLED monitors. Are manufacturers stuck in a questionable spec war, or are we one day going to wonder how we ever put up with ‘only’ 240Hz displays?

  • a1studmuffin@aussie.zone

    I’d much rather they invest effort into supporting customisable phones. Instead of just releasing a few flavours of the same hardware each year, give us a dozen features we can opt into or not. Pick a base size, then pick your specs. Want a headphone jack, SD card, FM radio, upgraded graphics performance? No problems, that’ll cost a bit extra. Phones are boring now - at least find a way to meet the needs of all consumers.

    • ggwithgg@feddit.nl

      Not exactly what you are talking about, but slightly related: the company Fairphone makes phones with parts that can easily be replaced. The philosophy is that you will not have to buy a new phone every 3 years. They do have some customisation options as well (e.g. RAM, storage, model), but it’s limited.

      But going for full-on customisation with phones, laptops, and tablets, the way you can with a desktop, is just incredibly hard due to the lack of space inside the device for the components. As such, it makes more sense to offer a wide variety of models with some customisable options and then have the user pick something.

      • 1rre@discuss.tchncs.de

        On Fairphone: they flat-out refuse to even discuss adding a headphone jack (check the posts in their forums - it’s a “hands over ears” no), so I’m sticking with Sony/ASUS (the latter at the moment, as they’ve been slightly less anticompetitive recently, but I’d much rather go to a decent company) until they do… It’s not like you notice a phone being 1mm thicker when you have a 3mm case on it anyway.

      • M500@lemmy.ml

        My problem with Fairphone is that they use old hardware.

        I never replaced any parts on my old phone and only replaced the phone with a new one because it was getting really slow. I replaced the XR with an iPhone 15.

        So my concern with the Fairphone is that I’d replace it with faster hardware more frequently than I would have replaced a non-repairable phone that was faster to begin with.

    • stevecrox@kbin.run

      I wish a company would build 4.5"-5.5" and 5.5"-6.5" flagship phones and put as many features as make sense in each.

      Then when you release a new flagship, the last flagship devices become your ‘mid range’ and you drop the price accordingly, with your mid range dropping to budget the year after.

      When Nokia had 15 different phones out at a time it made sense because they would be wildly different (size, shape, button layout, etc…).

      These days everyone wants as large a screen as possible on a device that is comfortable to hold, we really don’t need 15 different models with slightly different screen ratios.

      • wikibot@lemmy.worldB

        Here’s the summary for the wikipedia article you mentioned in your comment:

        Project Ara was a modular smartphone project under development by Google. The project was originally headed by the Advanced Technology and Projects team within Motorola Mobility while it was a Google subsidiary. Google retained the ATAP group when selling Motorola Mobility to Lenovo, and it was placed under the stewardship of the Android development staff; Ara was later split off as an independent operation. Google stated that Project Ara was being designed to be utilized by "6 billion people": 1 billion current smartphone users, and 5 billion feature phone users.

        Under its original design, Project Ara was intended to consist of hardware modules providing common smartphone parts, such as processors, displays, batteries, and cameras, as well as modules providing more specialized components, and "frames" that these modules were to be attached to. This design would allow a device to be upgraded over time with new capabilities and upgraded without requiring the purchase of an entire new device, providing a longer lifecycle for the device and potentially reducing electronic waste. However, by 2016, the concept had been revised, resulting in a base phone with non-upgradable core components, and modules providing supplemental features.

        Google planned to launch a new developer version of Ara in the fourth quarter of 2016, with a target bill of materials cost of $50 for a basic phone, leading into a planned consumer launch in 2017. However, on September 2, 2016, Reuters reported that two non-disclosed sources leaked that Alphabet's manufacture of frames had been canceled, with possible future licensing to third parties. Later that day, Google confirmed that Project Ara had been shelved.

        to opt out, pm me ‘optout’. article | about

  • aaaantoine@lemmy.world

    On one hand, 360Hz seems imperceptibly faster than 240Hz to human eyes.

    On the other hand, if you get enough frames in, you don’t have to worry about simulating motion blur.
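
    (A rough back-of-the-envelope sketch of the usual sample-and-hold blur estimate, assuming your eye tracks an object moving at a given speed while each frame is held for the full refresh interval; the 1920 px/s figure is just an illustrative panning speed.)

    ```python
    # Sample-and-hold motion blur estimate: the smear is roughly the distance
    # the tracked object moves during the time one frame stays on screen.
    def perceived_blur_px(speed_px_s: float, refresh_hz: float) -> float:
        return speed_px_s / refresh_hz

    for hz in (60, 120, 240, 480, 1000):
        print(f"{hz:>4} Hz -> ~{perceived_blur_px(1920, hz):.1f} px of smear at 1920 px/s")
    ```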

    • DosDude👾@retrolemmy.com

      I never worry about motion blur, because I turn it off. The stupidest effect ever. If I walk around I don’t see motion blur. Cameras see motion blur because of shutter speed, not the human eye.

      • Petter1@lemm.ee

        Umm, well, there is something like motion blur experienced by humans. In fact, your brain creates the time-bending effect based on picture 1 and picture 2:

        https://www.abc.net.au/science/articles/2012/12/05/3647276.htm

        There is a trick where you watch a clock that counts seconds, turn your head quickly away and then back (or something like that), and you will see that the rate of the seconds seems to be inconsistent.

        See “1. CHRONOSTASIS” https://bigthink.com/neuropsych/time-illusions/

        • DosDude👾@retrolemmy.com

          Alright. I didn’t know, thanks. Though the human motion blur is vastly different to camera blur in my experience. And games that have motion blur look really unnatural.

          • VindictiveJudge@lemmy.world

            More realistic blur smudges things based on how the object is moving rather than how the camera is moving. For example, Doom Eternal applies some blur to the spinning barrels and the ejected shells on the chaingun while it’s firing, but doesn’t blur the world while you’re sprinting.

        • Fermion@mander.xyz

          On the other hand, humans don’t see in defined frames. The signals aren’t synchronized. So a big part of perceived blurring is that the succession of signals isn’t forming a single focused image. There isn’t really a picture 1 and picture 2 for your brain to process discretely. And different regions in your vision are more sensitive to small changes than others.

          A faster refresh rate is always “better” for the human eye, but you’ll need higher and higher panel brightness to have a measurable reaction time difference.

          But hitting really high refresh rates requires too many other compromises on image quality, so I won’t personally be paying a large premium for anything more than a 120Hz display for the time being.

      • Ms. ArmoredThirteen@lemmy.ml

        Motion blur in games gives me bad motion sickness and garbles what I’m seeing. I already have a hard enough time processing information quickly in any kind of fast-paced game; I don’t need things to be visually ambiguous on top of that.

    • Ms. ArmoredThirteen@lemmy.ml

      That also depends on the person. Save for really fast-moving things, I can barely tell the difference between 30 and 60fps, and I cap out at 75 before I can’t notice a difference in any situation. For one of my friends, anything less than 75 gives them headaches from the choppiness.

      • Clam_Cathedral@lemmy.ml

        Yeah, personally, playing games at 30fps feels disruptively laggy, at least for the first few minutes. 60 is good, but the jump to 120 is night and day. I was shocked that going from 120 to 240 was just as noticeable an improvement to me as the previous jump, especially when so many people say they don’t notice it much. Hard to find newer games that give me that much fps though.

  • morrowind@lemmy.ml

    Well no, because most people aren’t getting them. It’s nice, but it’s difficult to justify spending hundreds on a slightly better screen.

    • 1984@lemmy.today

      This tech trickles down to the mainstream in a few years. That’s always how it is.

  • vext01@lemmy.sdf.org

    Reminiscent of the hi-res audio marketing. Why listen at a measly 24-bit/48kHz when you can have 32/192?!

    • vividspecter@lemm.ee

      These refresh rates have an actual perceivable difference, even if it’s subtle. The difference with hi-res audio, however, is inaudible to humans.
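
      (Hedged aside: the inaudibility argument is basically Nyquist arithmetic; a minimal sketch, assuming a ~20 kHz upper limit for human hearing.)

      ```python
      # Nyquist: the highest frequency a sample rate can represent is rate / 2.
      HUMAN_HEARING_LIMIT_HZ = 20_000  # rough upper bound for young, healthy ears

      for rate_hz in (44_100, 48_000, 96_000, 192_000):
          nyquist_hz = rate_hz / 2
          headroom_khz = (nyquist_hz - HUMAN_HEARING_LIMIT_HZ) / 1000
          print(f"{rate_hz / 1000:>5.1f} kHz sampling -> {nyquist_hz / 1000:.1f} kHz max, "
                f"{headroom_khz:+.1f} kHz beyond the audible range")
      ```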

      • vext01@lemmy.sdf.org

        I tend to agree, but the audiophiles always have an answer to rebut it with.

        I’m into audio and headphones, but since I’ve never been able to reliably discern a difference with hi-res audio, I no longer let it concern me.

        • PastyWaterSnake@lemmy.world

          I’ve bought pretty expensive equipment: a tube amplifier, many fancy headphones, optical DACs. A library full of FLAC files. I even purchased a $500 portable DAP. I’ve never been able to reliably tell a difference between FLAC and 320k MP3 files. At this point, it really doesn’t concern me anymore either, but I at least like to see my fancy tube amp light up.

          I will say, though, $300 seems to be the sweet-spot for headphones for me.

        • bitwolf@lemmy.one

          Imo the biggest bump is from MP3 to lossless. The drums sound more organic on FLACs, whereas on most MP3s they sound like a computer MIDI instrument.

          The biggest bump for me, though, was the change in headphones. It made my really old 256kbps AAC music sound bad.

          • vext01@lemmy.sdf.org

            Tried FLAC vs 192 Vorbis with various headphones, e.g. Moondrop Starfield, FiiO FA1, Grado SR80x…

            Can’t tell a difference. Kept using Vorbis.

      • Sombyr@lemmy.zip

        I’d thought I could hear a difference with hi-res audio, but after reading up on it I’m starting to think it may have been some issue with the tech I was using, whether my headphones or something else, that made compressed audio sound veeeery slightly staticky when high notes or loud parts of the track played.
        Personally though, even if it wasn’t, the price of the equipment wasn’t worth it for a difference that was only perceptible if I was listening for it. Not to mention it’s near impossible to find hi-res tracks from most bands. Most tracks claiming to be hi-res are just converted low-res tracks and thus have no actual difference in sound quality; the only difference is that the file is way larger for no good reason.

      • otp@sh.itjust.works

        They have tests you can take to see if you can hear the difference. A lot of people fail! Lol
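
        (Those tests are usually scored as ABX trials; a small sketch of the statistics, assuming 16 trials and a pure 50/50 guesser as the baseline.)

        ```python
        # How unlikely is a given ABX score if the listener is purely guessing (p = 0.5)?
        from math import comb

        def p_value(correct: int, trials: int) -> float:
            # Probability of scoring at least `correct` out of `trials` by chance alone.
            return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

        for score in (9, 12, 14, 16):
            print(f"{score}/16 correct -> p = {p_value(score, 16):.4f}")
        ```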

        • Lesrid@lemm.ee

          Usually percussion is where it’s easiest to notice the difference. But typically people prefer the relatively more compressed sound!

  • Snoopey@lemmy.world

    All I want is a 27/28-inch OLED 4K monitor with good HDR. I don’t care about the refresh rate as long as it’s 60Hz+.

    • dai@lemmy.world

      The minimum for me would be 120Hz. I’ve been using 120Hz since 2012 (12 years… man) and anything less feels like a massive step backwards. My old S10+ and my cheapie laptop feel sluggish in any animation/transition scenario.

    • bitwolf@lemmy.one

      I’m sticking with IPS until MicroLED matures enough for me to afford it.

      OLED was never designed to be used as a computer monitor, and I don’t want a monitor that only lasts a couple of years.

      Researchers just designed a special two-layer panel (thicker than current OLED) that doubles the lifespan to 10,000 hours at 50% brightness without degrading.

      I’m totally with you on good HDR though. When it works, it’s as night-and-day as 60 -> 144Hz felt for me.

        • bitwolf@lemmy.one

          It doesn’t only last two years; however, it begins to degrade after one year of illuminating blue, which would reduce the color accuracy.

          However, OLEDs are also very bad at color accuracy across their brightness range. Typically, at lower brightness their accuracy goes out the window.

          This isn’t as bad on smartphones (smartphones also apply additional mitigations such as subpixel rotation), but desktop computers typically display static images for much longer and so don’t use these mitigations, afaik.

  • ColeSloth@discuss.tchncs.de

    I don’t need or want a phone over 90Hz, or a PC screen over 180Hz. A faster phone screen is a waste of battery, and a PC screen over that is a waste of money.

    • Vlyn@lemmy.zip

      Then don’t buy them? With better screens coming out, the ones you do want to buy get cheaper.

      Back in the day 144Hz screens cost a premium; now you can have them for cheap.

      • Potatos_are_not_friends@lemmy.world

        I stopped buying TVs from 2000 until like two years ago, when I saw them on sale for like $200. Been living off of projectors and a home server. I skipped so many “innovations” like curved, flat, HD, 4K, TrueColor.

        Weird that it has an OS; that was a shocker.

        I look forward to what TVs bring in 2040.

        • Vlyn@lemmy.zip

          I mean OLEDs are damn amazing image quality wise, but I’m also not a fan of “smart” TVs. The apps can be useful (like native Netflix, Amazon video and so on), but 90% of the time I use my PC over HDMI.

        • Dark Arc@social.packetloss.gg

          I think there’s an argument to make screens faster. Graphics have hit a point where resolution isn’t going to give anything of substance… It’s now more about making lighting work “right” with ray tracing… I think the next thing might be making things as fluid as possible.

          So at least in the gaming space, these higher refresh rates make sense. There’s still fluidity that we as humans can notice that we’re not yet getting. E.g. if you shake your mouse like crazy, even on a 144Hz screen the cursor will jump around to discrete spots; it’s not a fluid motion (I’ve never seen a 180Hz screen, but I bet the same applies).
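
          (Rough numbers for the cursor-gap effect, assuming the cursor is drawn once per refresh; the flick speed is an illustrative guess, not a measurement.)

          ```python
          # How far the cursor "teleports" between two consecutive refreshes
          # during a fast flick across the screen.
          FLICK_SPEED_PX_S = 5000  # assumed flick speed in pixels per second

          for hz in (60, 144, 240, 480):
              gap = FLICK_SPEED_PX_S / hz
              print(f"{hz:>3} Hz -> ~{gap:.0f} px between drawn cursor positions")
          ```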

          • ColeSloth@discuss.tchncs.de

            You can see it when moving a mouse super quickly on a static background, but I never notice it happening in games. There’s probably something slightly noticeable in some online FPS games if you really paid attention and could lock your max fps at 120 with a 240Hz monitor, but that would be about it, and I don’t play FPS games competitively. I’m perfectly happy running 60fps at 120Hz myself.

  • BetaDoggo_@lemmy.world

    It won’t matter until we hit 600. 600 integer-scales to every common media framerate, so frame timings are always perfect. Really, they should be focusing on better and cheaper variable refresh rate, but that’s harder to market.
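
    (The 600 figure falls out of a least-common-multiple calculation over the usual content frame rates; a quick check, assuming 24/25/30/50/60 fps as “common”.)

    ```python
    # 600 is the least common multiple of the usual content frame rates,
    # so each of them maps to a whole number of refreshes per video frame.
    from math import lcm

    CONTENT_FPS = [24, 25, 30, 50, 60]
    print("LCM:", lcm(*CONTENT_FPS))  # 600

    for fps in CONTENT_FPS:
        print(f"{fps} fps -> one frame shown for {600 // fps} refreshes at 600 Hz")
    ```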

    • patatahooligan@lemmy.world

      Well, not really, because television broadcast standards do not specify integer framerates. E.g. North America uses ~59.94fps. It would take insanely high refresh rates to be able to play all common video formats, including TV broadcasts. Variable refresh rate can fix this, but only for a single fullscreen app.
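
      (For the curious: the NTSC-derived rate is exactly 60000/1001 fps, so no integer refresh rate divides it evenly; a small sketch of the mismatch that frame repetition has to absorb.)

      ```python
      # The NTSC-derived broadcast rate never divides evenly into an integer
      # refresh rate, so a fixed-rate display must occasionally repeat a frame.
      from fractions import Fraction

      video_fps = Fraction(60000, 1001)  # ~59.94006 fps

      for refresh_hz in (240, 600, 960):
          ratio = Fraction(refresh_hz) / video_fps  # refreshes per video frame
          print(f"{refresh_hz} Hz / {float(video_fps):.5f} fps = "
                f"{float(ratio):.4f} refreshes per frame (not an integer)")
      ```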

    • Vlyn@lemmy.zip

      I mean the 240 I use already does that. So would 360 or 480. No clue why you fixate on 600.

  • azenyr@lemmy.world

    The bigger the screen, the more you notice, because it covers more of your field of view. I would say 240Hz is the sweet spot. You can definitely feel the improvement from lower rates, but rates above it start to be barely noticeable. That said, I’d be fine with 144-165Hz if I wanted to save money and still get a great experience. Below 120Hz is unusable for me. Once you go high refresh, you cannot go back, ever. 60Hz feels like a slideshow. For gaming 60 is fine, but for work use and scrolling around I can’t stand 60. Yes people, a high refresh rate is useful even outside of gaming.

    Funny thing is, while gaming, even if my monitor and PC can do it, I rarely let my fps go above 120-140. I limit it in the game. The PC gets much quieter, uses less power, heats up less, and it’s smooth enough to enjoy great gameplay. I will never understand people who get a 4090 and play with unlocked fps just to get 2000 fps in Minecraft while their PC is screaming for air. Limit your fps at least to your Hz, people; have some care for your hardware. I know you get less input lag, but you are not Shroud; that 0.000001ms less input lag will not make a difference.
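
    (The cap itself is just a driver or in-game setting, but the idea behind a frame limiter is simple enough to sketch; this is an illustrative toy loop, not any engine’s actual code.)

    ```python
    # Toy frame limiter: do the frame's work, then sleep off the rest of the
    # frame-time budget instead of immediately rendering the next frame.
    import time

    TARGET_FPS = 120
    FRAME_TIME = 1.0 / TARGET_FPS

    def render_frame() -> None:
        pass  # stand-in for the game's real simulation + draw work

    next_deadline = time.perf_counter()
    for _ in range(TARGET_FPS):  # one second's worth of frames
        render_frame()
        next_deadline += FRAME_TIME
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)  # this idle time is where the GPU gets to rest
    ```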

    • morbidcactus@lemmy.ca

      I went from 1080p60 as my standard for literal decades to 3440x1440 @ 144Hz over the last 2 years and I can’t go back, mostly for non-gaming activities; I find the ultrawide better than multi-monitor for me, though I would love a vertical e-ink display for text. I also limit my fps to 120: I don’t like feeling like my PC is going to take off, and the place I rent is older, so the room I use for my office is smaller and heats up quickly.

    • Euphoma@lemmy.ml

      Minecraft actually gets better FPS when you don’t limit its FPS. When I play capped at 60 fps, it usually dips into the 50s and 40s, while when it’s unlimited there are no noticeable dips.

    • PLAVAT🧿S@sh.itjust.works

      Maybe this is what Jaden Smith meant when he famously stated:

      How Can Mirrors Be Real If Our Eyes Aren’t Real

      Wow, still blown away…

  • AutoTL;DR@lemmings.worldB

    This is the best summary I could come up with:


    After all, it wouldn’t be the first time manufacturers have battled over specs with debatable benefit to customers, whether that’s the “megahertz myth” or megapixel wars of the ‘00s or, more recently, smartphone display resolution.

    You can read an in-depth breakdown of the reasoning in this post in which they argue that we’ll have to go beyond 1000Hz refresh rates before screens can reduce flicker and motion blur to a level approaching the real world.

    Higher refresh rate monitors might be smoother, with better visual clarity and lower input latency for gamers — but at what point does it stop making sense to pay the price premium they carry, or prioritize them over other features like brightness?

    All of this also assumes that you’ve got the hardware to play games at these kinds of frame rates, and that you’re not tempted to sacrifice them in the name of turning on some visual eye candy.

    But even as a person who’s been enjoying watching the monitor spec arms race from afar, I’m not looking to imminently replace my 100Hz ultrawide LCD, which I’ve been using daily for over half a decade.

    Or, to use an even sillier example, it’s like drinking bad coffee after a pandemic spent obsessing over brewing the perfect cup at home.


    The original article contains 1,095 words, the summary contains 214 words. Saved 80%. I’m a bot and I’m open source!

  • M500@lemmy.ml

    I still use a 75Hz desktop monitor and 60Hz on my laptop. I can’t tell the difference.

    Even looking at the iPhone 15 Pro next to a 15, they look the same to me.

    • azenyr@lemmy.world

      To be fair, 60Hz to 75Hz is barely noticeable even by those who are used to noticing these things. If you can’t tell the difference, it’s understandable.

      Then, the bigger the screen, the more noticeable a high refresh rate is, because it covers more of your field of vision. So on a small iPhone screen it’s not easy for everyone to notice (and then there’s the fact that iPhones rarely go up to 120Hz anyway, which is a mess by itself but another topic, so your 15 Pro probably rarely runs at a noticeably different refresh rate anyway).

      However, if you get a 144Hz-or-above monitor at 24" or larger, you will IMMEDIATELY see the difference against your 60Hz monitor. Even moving the mouse feels more responsive and accurate. It makes targeting stuff with your mouse easier. Reading text while scrolling also becomes possible. It unlocks a new world, and it’s not only for gaming; casual and work use also benefit a lot from high refresh rates.

    • Liz@midwest.social

      When it comes to my phone, I can tell the difference between 90 and 60 Hz. I prefer 90 Hz, but I would be okay with 60. Ideally I’d like the highest refresh rate I can get within my budget when I have to buy a new device with a screen. One annoying limitation is that all video is played at 24-60 Hz. I notice it when things move with any kind of speed or when I’m looking at a background object while the camera moves.

      All things considered, it’s a minor annoyance. My quality of life wouldn’t change if I were limited to 30 Hz.

      • M500@lemmy.ml

        Now I need to find a 144Hz monitor. I don’t game outside of my Steam Deck and don’t play online, so I don’t care about that. But I’m always working with text.

  • ccdfa@lemm.ee

    My screen is still the same 144Hz. How do you make it go faster?