https://xkcd.com/2867

Alt text:

It’s not just time zones and leap seconds. SI seconds on Earth are slower because of relativity, so there are time standards for space stuff (TCB, TCG) that use faster SI seconds than UTC/Unix time. T2 - T1 = [God doesn’t know and the Devil isn’t telling.]

      • phoneymouse@lemmy.world · 11 months ago

        If your system hasn’t been upgraded to 64-bit types by 2038, you’d deserve your overflow bug
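
        A back-of-the-envelope sketch in Python (not any particular OS’s code) of where a signed 32-bit time_t runs out compared to a 64-bit one:

            from datetime import datetime, timezone

            for bits in (32, 64):
                max_seconds = 2**(bits - 1) - 1  # largest value of a signed integer that wide
                try:
                    limit = datetime.fromtimestamp(max_seconds, tz=timezone.utc)
                    print(f"{bits}-bit time_t overflows at {limit}")  # 32-bit: 2038-01-19 03:14:07 UTC
                except (OverflowError, OSError, ValueError):
                    print(f"{bits}-bit time_t overflows further out than datetime can even represent")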

        • Appoxo@lemmy.dbzer0.com · 11 months ago

          Let’s just make it 128-bit so it’s not our problem anymore.
          Hell, let’s make it 256-bit because it sounds like AES256

          • phoneymouse@lemmy.world · 11 months ago

            64 bits is already enough not to overflow for 292 billion years. That’s 21 times longer than the estimated age of the universe.
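
            The arithmetic behind that, as a rough sketch (year length and universe age are approximate figures):

                SECONDS_PER_YEAR = 365.25 * 24 * 3600   # ~3.156e7
                AGE_OF_UNIVERSE_YEARS = 13.8e9          # commonly cited estimate

                max_64 = 2**63 - 1                      # largest signed 64-bit second count
                years = max_64 / SECONDS_PER_YEAR
                print(years / 1e9)                      # ~292 (billion years)
                print(years / AGE_OF_UNIVERSE_YEARS)    # ~21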

            • nybble41@programming.dev · 11 months ago

              If you want one-second resolution, sure. If you want nanoseconds, a 64-bit signed integer only gets you 292 years. With 128-bit integers you can get a range of over 5 billion years at zeptosecond (10^-21 second) resolution, which should be good enough for anyone. Because who doesn’t need to precisely distinguish times one zeptosecond apart five billion years from now‽
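
              Sketching those ranges (Python ints are arbitrary precision, so the 128-bit case is easy to check):

                  SECONDS_PER_YEAR = 365.25 * 24 * 3600

                  # signed 64-bit counter of nanoseconds
                  print((2**63 - 1) / 1e9 / SECONDS_PER_YEAR)          # ~292 years

                  # signed 128-bit counter of zeptoseconds (10^-21 s)
                  print((2**127 - 1) / 1e21 / SECONDS_PER_YEAR / 1e9)  # ~5.4 billion years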

              • Hamartiogonic@sopuli.xyz · 11 months ago

                If you run a realistic physical simulation of a star, and you include every subatomic particle in it, you’re going to have to use very small time increments. Computers can’t handle anywhere near that many particles yet, but mark my words, physicists of the future are going to want to run this simulation as soon as we have the computers to do it. Also, the simulation should predict events billions of years in the future, so you may need to build a new time tracking system to handle that.

                • nybble41@programming.dev · 11 months ago

                  Good point. You’d need at least 215 bits to represent all measurably distinct times (in multiples of the Planck time, approximately 10^-43 seconds) out to the projected heat death of the universe at 100 trillion (10^14) years. That should be sufficient for even the most detailed and lengthy simulation.
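
                  Checking that bit count (Planck time and the heat-death timescale are both rough figures):

                      import math

                      PLANCK_TIME = 5.39e-44                  # seconds, roughly 10^-43
                      SECONDS_PER_YEAR = 365.25 * 24 * 3600
                      HEAT_DEATH_YEARS = 1e14                 # 100 trillion years

                      ticks = HEAT_DEATH_YEARS * SECONDS_PER_YEAR / PLANCK_TIME
                      print(math.log2(ticks))                 # ~215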

    • The_Lurker@lemmy.world · 11 months ago

      Swatch’s Internet Beats are making more and more sense every time Daylight Savings forces a timezone change. Why are we still using base 12 for time anyway?
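
      For reference, a .beat is 1/1000 of a day (86.4 seconds), counted from midnight UTC+1 with no DST and no zones, which is the whole appeal. A rough sketch of the conversion (function name is just illustrative):

          from datetime import datetime, timedelta, timezone

          BMT = timezone(timedelta(hours=1))   # "Biel Mean Time": fixed UTC+1, no DST

          def swatch_beats(dt):
              local = dt.astimezone(BMT)
              seconds_today = local.hour * 3600 + local.minute * 60 + local.second
              return seconds_today / 86.4      # 1000 .beats per day

          print(f"@{swatch_beats(datetime.now(timezone.utc)):06.2f}")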