• BananaTrifleViolin@lemmy.world · +85 · edited · 10 months ago

    The actual answer is on Stack Exchange, in the comments.

    https://unix.stackexchange.com/questions/740319/why-is-gnome-fractional-scaling-1-7518248558044434-instead-of-1-75

    It is related to a mix of converting the actual display resolution to a virtual (scaled) resolution, and the use of single-precision floating point calculations.

    Essentially, my understanding is that it is storing the value needed to convert your actual number of pixels (2160 vertically, i.e. 2160p) into a virtual number of pixels (roughly 2160/1.75), but dividing by exactly 1.75 leaves you with fractions of a virtual pixel. So instead of 1.75 it scales by 1.75182… to land on a whole number of virtual pixels to work with. Then, on top of that, the stored figure is nudged slightly away from what we’d expect by floating point rounding.

    If you take the actual vertical resolution, 2160 pixels, and divide it by the 1233 virtual pixels it is trying to use, you need a conversion value of 1.75182… to avoid fractions of a pixel. If you used 1.75 you’d get 1234.2857… virtual pixels. So GNOME is storing the fraction that gives a clean, whole-pixel conversion rather than the round 1.75.
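
    To make that concrete, here is a rough sketch (not GNOME’s actual code, just the arithmetic described above, using the 2160 physical / 1233 virtual figures) of how the two effects stack; it should print back the value from the config file:

    ```c
    #include <stdio.h>

    int main(void)
    {
        /* Figures from the explanation above: 2160 physical pixels tall,
         * 1233 virtual pixels, requested scale 1.75. */
        double naive = 2160.0 / 1.75;     /* 1234.2857... -> fractional virtual pixels */
        double exact = 2160.0 / 1233.0;   /* ~1.7518248175..., a clean whole-pixel ratio */

        /* Squeezing that ratio through a 32-bit float (single precision)
         * nudges it to the nearest representable value, which reads back
         * as the long decimal everyone is puzzling over. */
        float f32 = (float)exact;

        printf("2160 / 1.75 = %.4f virtual pixels\n", naive);
        printf("2160 / 1233 = %.16f\n", exact);
        printf("as float32  = %.16f\n", (double)f32);  /* 1.7518248558044434 */
        return 0;
    }
    ```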

    Full credit to rakslice at Stack Exchange, who goes into more detail.

    • MotoAsh@lemmy.world · +56/-1 · 10 months ago

      Floating point error? Yeaahhh no. No. Just… no. That is NEVER as big as 0.01 unless the number is also insanely massive.

      The error is relative to the size of the number. It’s not magically off by significant fractions.
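
      For scale (a sketch, not tied to GNOME’s code): near a value like 1.75, a single rounding can only be off by about half the gap to the next representable float, which is nowhere near 0.01:

      ```c
      #include <stdio.h>
      #include <float.h>
      #include <math.h>

      int main(void)
      {
          /* Gap ("ulp") between adjacent representable values near 1.75;
           * one rounding step can be off by at most half of this. */
          printf("float  ulp near 1.75: %g\n", nextafterf(1.75f, 2.0f) - 1.75f); /* ~1.2e-07 */
          printf("double ulp near 1.75: %g\n", nextafter(1.75, 2.0) - 1.75);     /* ~2.2e-16 */

          /* Machine epsilon: one rounding's relative error is at most half of this. */
          printf("FLT_EPSILON = %g, DBL_EPSILON = %g\n", FLT_EPSILON, DBL_EPSILON);
          return 0;
      }
      ```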

      • Giooschi@lemmy.world · +3 · 10 months ago

        TBF the error can become that big if you do a bunch of unstable operations (i.e. operations that continue to increase the relative error), though that’s probably not what is happening here.
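
        A classic example of such an unstable step is subtracting two nearly equal values (not what GNOME is doing, just a sketch of the mechanism, with made-up numbers near 1.75):

        ```c
        #include <stdio.h>
        #include <math.h>

        int main(void)
        {
            /* Each input is stored to within ~6e-8 of its true value... */
            float a = 1.7518258f;
            float b = 1.7518248f;

            double reference = 1.7518258 - 1.7518248;   /* ~1e-6, computed in double */
            double in_float  = (double)(a - b);         /* difference of the rounded floats */

            /* ...but the subtraction cancels the leading digits, so those tiny
             * storage errors become a few percent of the much smaller true result. */
            printf("reference  : %.10g\n", reference);
            printf("with floats: %.10g\n", in_float);
            printf("relative error: %.1f%%\n",
                   100.0 * fabs(in_float - reference) / reference);
            return 0;
        }
        ```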

        • MotoAsh@lemmy.world · +3 · 10 months ago

          To get to 0.01 error, you’d need to add up trillions of trillions of floating point errors. It will not happen solely because of floating point unless you’re doing such crazy math that you shouldn’t be using primitives in the first place.

          • Giooschi@lemmy.world · +2 · 10 months ago

            That’s why I said unstable operations. Addition is considered a stable operation (for values with the same sign).
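
            As a rough illustration (nothing GNOME-specific): summing many same-sign values in single precision keeps the relative error tiny, unlike a cancelling subtraction:

            ```c
            #include <stdio.h>
            #include <math.h>

            int main(void)
            {
                /* Add 10,000 same-sign terms in single precision and compare
                 * against a double-precision reference. */
                float  sum_f = 0.0f;
                double sum_d = 0.0;
                for (int i = 0; i < 10000; i++) {
                    float v = 1.0f / 3.0f;
                    sum_f += v;
                    sum_d += (double)v;
                }
                /* Every addition rounds, but with same-sign terms the error only
                 * creeps up slowly (worst case on the order of n * epsilon, here
                 * about 1e-3, and usually far less), so the relative error stays
                 * well below anything like 0.01. */
                printf("relative error: %g\n", fabs((double)sum_f - sum_d) / sum_d);
                return 0;
            }
            ```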

      • Dave.@aussie.zone · +1 · edited · 10 months ago

        As the answer in the link explains, it’s adjustment of your scaling factor to the nearest whole pixel, plus a loss of precision rounding to/from single/double floating point values.

        So I’m not really sure of the point of this post. It’s not a question, as the link quite effectively answers it. It’s more just “here’s why your scaling factor looks weird in your gnome config file”, and it’s primarily the first reason - rounding to whole pixels.

    • gian @lemmy.grys.it · +1 · 10 months ago

      True, but it is not that difficult to truncate (or round) the value to two decimal places.
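
      A minimal sketch of that rounding in C (purely illustrative; as the other replies note, GNOME keeps the exact ratio on purpose):

      ```c
      #include <math.h>
      #include <stdio.h>

      int main(void)
      {
          double scale = 1.7518248558044434;
          /* Round to two decimal places before displaying or storing it. */
          printf("%.10g\n", round(scale * 100.0) / 100.0);  /* prints 1.75 */
          return 0;
      }
      ```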

    • ⲇⲅⲇ@lemmy.ml · +4/-22 · edited · 10 months ago

      Gnome is coded in JavaScript (lmao 🤣) so yeah, I think you are right.

      EDIT: Actually, even though JavaScript and other languages have this issue, the value 1.7518248558044434 doesn’t come from that issue. There is another reply that explains it and makes total sense. But it’s still pretty lame to know the desktop runs on JavaScript. (Yeah, I hate Gnome)

      • atzanteol@sh.itjust.works · +13 · 10 months ago

        It’s not a “language” issue, it’s a “computer” issue. This math is being done on the CPU.

        IEEE 754

        Some languages do provide for “arbitrary precision math” (Java’s BigDecimal for example) but it’s slower to do that. Not what you want if you’re multiplying a 4k matrix every millisecond.

      • TheGrandNagus@lemmy.world · +1 · edited · 10 months ago

        It’s mostly C.

        And Gnome is far from the only desktop that uses JS; KDE Plasma, for example, also uses a lot of JavaScript.

        It’s weird when people bash Gnome for using JS, when practically everybody else uses it a lot too. Shows that they’re just regurgitating “Gnome = bad!!!” nonsense.

        We get it, you think disliking Gnome is a quirky, edgy personality trait.

          • TheGrandNagus@lemmy.world · +1 · edited · 10 months ago

            There’s a lot more to your UX than just the Plasma desktop. And you’re also trying to pass off Gnome’s shell as being the whole Gnome desktop. Pretty disingenuous.

            • ⲇⲅⲇ@lemmy.ml · +1 · 10 months ago

              But at least the desktop itself doesn’t use JavaScript as much as Gnome does. Show me the repo with the percentages so I can see what you’re referring to.