• borZ0 the t1r3D b3aR@lemmy.world · 2 days ago

    I like a lot of your responses. I agree about nostalgia being a main driver of his article. However, I think the bit about how a doctor needs to know how a medical tool functions, etc., is a little misplaced. I think the author was referring to the makers of the device not understanding what they're making, not so much the end user. I ALSO think the author would prefer broader technical literacy, but his core argument seemed to be that those making things don't understand the tech they're built upon, and that unintended consequences can occur when that happens. Worse, if the current technology has been abstracted enough times, eventually no one will know enough to fix it.

    • partial_accumen@lemmy.world · 2 days ago

      I think the author was referring to the makers of the device not understanding what they're making, not so much the end user.

      Just to make sure I’m following your thread of thought, are you referring to this part of the author’s opinion piece or something else in his text?

      “This wouldn’t matter if it were just marketing hyperbole, but the misunderstanding has real consequences. Companies are making billion-dollar bets on technologies they don’t understand, while actual researchers struggle to separate legitimate progress from venture capital fever dreams. We’re drowning in noise generated by people who mistake familiarity with terminology for comprehension of the underlying principles.”

      • borZ0 the t1r3D b3aR@lemmy.world · 22 hours ago

        Yes, but also the bit about when someone creates an application without understanding the underlying way that it actually functions. Like, I can make a web app, but I don't need to understand memory allocation to do it. The maker of the app is a level or two of abstraction away from what the bare metal of the computer is being told to do.
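
        To make that concrete, here's a minimal sketch (Python; every name in it is made up for illustration): a working web endpoint where every object is allocated and freed by the runtime, and the app author never has to think about memory at all.

        ```python
        # Toy sketch: a web endpoint with zero explicit memory management.
        # Every string, dict, and bytes object below is allocated and
        # garbage-collected by the Python runtime on the app author's behalf.
        from http.server import BaseHTTPRequestHandler, HTTPServer

        class HelloHandler(BaseHTTPRequestHandler):
            def do_GET(self):
                greeting = {"message": "hello from several layers above the metal"}
                body = str(greeting).encode("utf-8")
                self.send_response(200)
                self.send_header("Content-Type", "text/plain")
                self.end_headers()
                self.wfile.write(body)

        if __name__ == "__main__":
            # Serves on localhost:8000 until interrupted.
            HTTPServer(("127.0.0.1", 8000), HelloHandler).serve_forever()
        ```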

        • partial_accumen@lemmy.world · 21 hours ago

          Gotcha, thank you for the extra context. I'll respond to your original statement now that I understand it better:

          I ALSO think the author would prefer broader technical literacy, but his core argument seemed to be that those making things don't understand the tech they're built upon, and that unintended consequences can occur when that happens.

          I think the author’s argument on that is also not a great one.

          Let's take your web app example. As you said, you can make the app, but you don't understand the memory allocation. Why? Because the high-level language or framework you wrote it in does the memory management and garbage collection. However, there are many, many, MANY more layers of abstraction besides just your code and the interpreter. Do you know the web server front to back? Do you know which ring your app or the web server is operating in inside the OS (ring 3, BTW)? Do you know how the IP stack works in the server? Do you know how the networking works that resolves names to IP addresses or routes the traffic appropriately? Do you know how the firewalls work that the traffic passes through when it leaves the server? Back on the server, do you know how the operating system makes calls to the hardware via device drivers (ring 1), or how those calls are handled by the OS kernel (ring 0)? Do you know how the system bus works on the motherboard, or how the L1, L2, and L3 caches affect the operation and performance of the server overall? How about the fact that assembly language isn't even the bottom of the abstraction stack? Below that, all of this data is merely an abstraction of binary, which is really just the presence or absence of voltage on a pin or in a bit of a register in ICs scattered across the system.

          I’ll say probably not. And thats just fine! Why? Because unless your web app is going to be loaded onto a spacecraft with a 20 to 40 year life span and you’ll never be able to touch it again, then having all of that extra knowledge and understanding only have slight impacts on the web app for its entire life. Once you get one or maybe two levels of abstraction down, the knowledge is a novelty not a requirement. There’s also exceptions to this if you’re writing software for embedded systems where you have limited system resources, but again, this is an edge case that very very few people will ever need to worry about. The people in those generally professions do have the deep understanding of those platforms they’re responsible for.

          Focus on your web app. Make sure it's solving the problem it was written to solve. Yes, you might need to dive a bit deeper to eke out some performance, but that comes with time and experience anyway. The author talks like even the most novice people need the deepest understanding through all layers of abstraction. I think that is too much of a burden, especially when it acts as a barrier to people being able to jump in and use the technology to solve problems.

          Perhaps the best example of the world I think the author wants would be the 1960s Apollo program. This was a time when the pinnacle of technology was being deployed in real time to solve world-moving problems. Humankind was trying to land on the moon! The most heroic optimization of machines and procedures had to be accomplished for this to have even a chance of going right. The best of the best had to know every. little. thing. about. everything. People's lives were at stake! National pride was at stake! Failure was NOT an option! All of that speaks to more of what the author wants for everyone today.

          However, that’s trying to solve a problem that doesn’t exist today. Compute power today is CHEAP!!! High level program languages and frameworks are so easy to understand that programming it is accessible to everyone with a device and a desire to use it. We’re not going to the moon with this. Its the kid down the block that figured out how to use If This Then That to make a light bulb turn on when he farts into a microphone. The beauty is the accessibility. The democratization of compute. We don’t need gatekeepers demanding the deepest commitment to understanding before the primitive humans are allowed to use fire.

          Are there going to be problems or things that don't work? Yes. Will the net benefit of cheap and readily available compute in the hands of everyone be greater than the detriments? I believe yes. It appears the author disagrees with me.

          /sorry for the wall of text

            • borZ0 the t1r3D b3aR@lemmy.world · 5 hours ago

            As with your original comment, I like your argument. :) Additionally, I dig the wall of text. A WoT, written well, leaves little ambiguity and helps focus the conversation. I don't disagree on any particular point. I agree that it's a net positive for programming to be approachable to more people, and that it can't be approachable to many while requiring Apollo-era genius and a deep understanding of technology. It would be a very different world if only PhDs could program computers. To that end, I believe the article author is overstating a subtle concern that I still think is theoretically relevant and important to explore.

            If, over the fullness of decades, programming becomes so approachable (i.e., you tell an AI in plain language what you want and it makes it flawlessly), people will have less incentive to learn the foundational concepts required to make the same program “from scratch”. Extending that train of thought, we could reach a point where a fundamental “middle technology” fails and there simply isn't anyone who understands how to fix the problem. I suspect there will always be hobbyists and engineers who maintain esoteric knowledge for a variety of reasons. But with all the levels of abstraction and failure points inadvertently built into code over so much time, it's possible to imagine a situation where essentially no one understands the library of the language that a core dependency was written in decades before. Not only would it be a challenge to fix, it could be hard to find in the first place.

            If the break happens in your favorite cocktail recipe app, it's inconvenient. If the break happens in a necessary system relied on by fintech to move people's money from purchase to vendor to bank to vendor to person, the scale and importance of the break is devastating to the world. Even if you can seek out and find the few who have knowledge enough to solve the problem, the time spent with such a necessary function of modern life unavailable would be catastrophic. And if a corporation, in an effort to save money, opts to hire a cheap “vibe coder” in the '20s and something they “vibe” winds up in important stacks, it could build fault lines into future code that may be used for who-knows-what decades from now.

            There are a lot of ifs in my examples. It may never happen, and we'll get the advantage of all the ideas that are able to be made reality through accessibility. However, it's better to think about it now rather than contend with the eventuality all at once when a catastrophe occurs. You're right that doom and gloom isn't helpful, but I don't think the broader idea is without merit.

            • partial_accumen@lemmy.world · 5 hours ago

              There are a lot of ifs in my examples. It may never happen, and we'll get the advantage of all the ideas that are able to be made reality through accessibility. However, it's better to think about it now rather than contend with the eventuality all at once when a catastrophe occurs. You're right that doom and gloom isn't helpful, but I don't think the broader idea is without merit.

              There are some actual real-life examples that match your hypotheticals, but the missing piece is the scale of the consequences. What has generally occurred is that the fallout from the old thing failing wasn't that big of a deal, or that a modern solution could be designed and built, completely replacing the legacy solution even without a full understanding of it.

              A really, really small example of this is from my old 1980s Commodore 64 computer. At the time it used a very revolutionary sound chip to make music and sound effects. It was called the SID chip. Here's one of them, constructed in 1987.

              It combined digital technologies (which are still used today) with analog technologies (that nobody makes anymore in the same way). Sadly, these chips also have a habit of dying over time because of how they were originally manufactured. With the supply of these continuously shrinking, there were efforts to come up with a modern replacement. Keep in mind these are hobbyists. What they came up with was this:

              This is essentially a whole Raspberry Pi computer that fits in the same socket in the 1980s Commodore 64, accepts the music instructions coming from the computer, and runs custom-written software to produce the same desired output as the legacy digital/analog SID chip designed in 1982. The computing power in this modern SID replacement is more than 30x that of the entire Commodore 64 from the '80s! It could be considered overkill to use so much computing power where the original didn't, but again, compute is dirt cheap today. This new part isn't expensive either; it's about $35 to buy.
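
              In code terms, the pattern is just modern software presenting the old interface. Very roughly this shape (a hypothetical Python sketch; the register numbering and the "synthesis" are simplified stand-ins, not the real SID register map):

              ```python
              # Hypothetical sketch of the general pattern: modern code that speaks a
              # legacy chip's interface. The register handling and "audio" below are
              # simplified illustrations, not the real SID's behavior.
              class LegacyChipEmulator:
                  def __init__(self):
                      self.registers = [0] * 32  # the old chip exposed a small register file

                  def write(self, register: int, value: int) -> None:
                      # The vintage machine keeps poking registers exactly as it did in the '80s...
                      self.registers[register] = value & 0xFF

                  def render_audio(self, num_samples: int) -> list[int]:
                      # ...while a modern CPU with cycles to spare recomputes the output the
                      # original digital/analog hardware used to produce.
                      volume = self.registers[24] & 0x0F  # treat register 24 as "volume" for illustration
                      return [volume] * num_samples       # stand-in for real waveform synthesis

              emu = LegacyChipEmulator()
              emu.write(24, 0x0F)         # legacy software sets its "volume" register
              print(emu.render_audio(4))  # the modern side produces the equivalent output
              ```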

              This is what I think will happen when our legacy systems finally die without the knowledge left to service or maintain them: modern engineers using modern technologies will replace them, providing the same function.

              • borZ0 the t1r3D b3aR@lemmy.world · 3 hours ago

                I certainly hope so! Human ingenuity has gotten us here. I'm interacting with you across who knows how much distance, using a handheld device that folds up. …but just because we've gotten ahead of trouble and found solutions thus far doesn't mean that an unintended bit of code, or a hardware fault, or a lack of imagination can't cause consequences further down the road. I appreciate your optimism and pragmatic understanding. You seem to be a solution-driven person who believes in our ability to reason and fix things. We'll definitely need that type of attitude and approach when and if something goes sideways.

                • partial_accumen@lemmy.world · 2 hours ago

                  …but just because we've gotten ahead of trouble and found solutions thus far doesn't mean that an unintended bit of code, or a hardware fault, or a lack of imagination can't cause consequences further down the road.

                  Absolutely true.

                  I guess my thought is that the benefits of our rapid growth outweigh the consequences of forgotten technology. I'll admit, though, I'm not unbiased; I have a vested interest. I do very well professionally being the bridge between some older technologies and modern ones myself.