It could be the key to making tomorrow’s smart tech sustainable.

  • CmdrShepard@lemmy.one

    Seems interesting but it has the stink of “buzzword marketing” all over it. The example given in the article about using it for wake words is just using a microphone connected to the device. Microphones and speakers are both analog devices that all digital phones have already. Also the fact that it’s an IC that’s programmable leads me to believe it’s not analog at all, or else how can it be programmed?

    I also thought it was funny to talk about environmental damage from all these digital sensors and then using thermometers filled with mercury as an example of an analog sensor. Mercury is a heavy metal and extremely toxic to most lifeforms, which is why we don’t use it in thermometers anymore.

    • El Barto@lemmy.world

      leads me to believe it’s not analog at all, or else how can it be programmed?

      Not to take away from your main point, but analog things can be programmed: think of those old-school power-socket timers, or that toy car that follows a line drawn on the floor. Maybe those programmable units are tiny Babbage Analytical Engines? But yeah, in the end, I side with you.

      I also thought it was funny to talk about environmental damage from all these digital sensors and then using thermometers filled with mercury as an example of an analog sensor.

      What’s even funnier is that he called a thermometer “a computer.” Eh no. You can’t make thermometers compute anything.

    • ekky43@lemmy.dbzer0.com

      Lots of buzzwords indeed; the author apparently doesn’t even know what a smart sensor is, as they describe a regular sensor in their first paragraph.

      That said, programmable ICs are nothing new: a Field-Programmable Gate Array (FPGA) is the programmable version of your regular, ‘stupid’ gate array, and field-programmable analog arrays (FPAAs) exist for the analog side. Though, while a random IC might cost you less than half a dollar, an FPGA will run you around $100 for a simple chip.
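      As a toy sketch of what “programming” a gate array even means: each logic cell in an FPGA is essentially a lookup table (LUT), and the configuration bitstream just fills in the truth tables. A minimal Python model (names purely illustrative, not any vendor’s API):

```python
def make_lut(truth_bits):
    """A 2-input lookup table, the basic cell of an FPGA-style chip.
    The four truth_bits play the role of the configuration bitstream:
    they decide which Boolean function this cell computes."""
    assert len(truth_bits) == 4
    def cell(a, b):
        return truth_bits[(a << 1) | b]   # the inputs select a stored bit
    return cell

xor_gate  = make_lut([0, 1, 1, 0])   # same "silicon"...
nand_gate = make_lut([1, 1, 1, 0])   # ...different configuration

print(xor_gate(1, 0), nand_gate(1, 1))  # 1 0
```

      Loading a different bit pattern turns the very same cell into a different gate, which is all “reprogramming” amounts to in this toy model.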

      On the other hand, skipping the CPU or GPU and their clock-speed limitations should speed up the AI considerably, though parallel programming (not concurrent programming, and not multi-core “parallel” programming either) is much harder and comes with almost no safety compared to serial programming.

      • El Barto@lemmy.world

        What do you mean by “almost no safety”?

        Edit: OP edited his comment to clarify. He only had “parallel” without the “no multi-thread/multi-core” bit.

        • ekky43@lemmy.dbzer0.com

          A CPU is a very complex gate array which handles bothersome tasks such as synchronization (race conditions) and memory access, and presents you with a very limited set of instructions. All serial programming builds upon this limited instruction set, and those instructions have been thoroughly tested over the past six decades.

          Not to say that CPU architecture or microcode is fail-safe, but the chance of your computer blue-screening because of a failure of your CPU is rather small.

          Now, parallel programming (the low level variant, not the hijacked definition) is the art of “wiring” those gate arrays. A CPU is actually made using parallel programming, so all the safeties it presents for serial programming will not be present in parallel programming, as parallel programming does not use a CPU.

          EDIT: the above is of course simplified; there exist multiple architectures, collected into more common instruction sets such as amd64, armhf, arm64, etc., but even the most barebones processing unit contains a lot of safeties and niceties that parallel programming does not have.

          • El Barto@lemmy.world

            You lost me at parallel programming not using a CPU.

            Perhaps you mean that it uses the lower levels of the CPU.

            But regardless, I see that you mean that parallel programming involves almost no safety at the hardware level. Which is a weird thing to say since “serial programming” at the assembly level also offers no safety (e.g. if your program runs at ring 0.)

            • ekky43@lemmy.dbzer0.com

              I think you are misunderstanding me. Are you perhaps thinking about multithreading or multi core? Because some people have also started calling that “parallel”, even if it is nothing like low-level parallel.

              A CPU does not build upon a CPU, a CPU builds upon transistors which are collected into gates, and which can be assembled into the correct order using parallel programming.

              EDIT: as an example, you do not actually need a computer to parallel program. Get yourself a box of transistors, some cable, and a soldering iron, and you can build some very rudimentary gate arrays, like a flip-flop.
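              That flip-flop can be sketched in Python as two cross-coupled NOR gates whose feedback loop is iterated until it settles; a toy model of the circuit, not a timing-accurate simulation:

```python
def nor(a, b):
    """A NOR gate on 0/1 ints."""
    return 1 - (a | b)

def sr_latch(pulses):
    """SR latch: the output of each NOR gate feeds an input of the
    other. We iterate the feedback loop until it stops changing,
    which is how the circuit 'remembers' a bit with no clock at all."""
    q, qn = 0, 1                      # assume a known power-up state
    history = []
    for s, r in pulses:               # (set, reset) inputs per step
        for _ in range(8):            # let the feedback loop settle
            nq, nqn = nor(r, qn), nor(s, q)
            if (nq, nqn) == (q, qn):
                break
            q, qn = nq, nqn
        history.append(q)
    return history

# set, hold, reset, hold: Q remembers the last set/reset
print(sr_latch([(1, 0), (0, 0), (0, 1), (0, 0)]))  # [1, 1, 0, 0]
```

              Note the assumed power-up state: a real latch wakes up in whichever state it happens to fall into, which is exactly the uninitialized-gate problem mentioned elsewhere in this thread.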

              This link might give a better understanding of our confusion.

              EDIT 2: One could perhaps illustrate the confusion which this topic is often victim of as such:

              Transistors are part of the hardware and are parallel programmed to form complex gate arrays called “Processors”, which feature instruction sets used by machine code, which is made using assembly, which is called “serial programming”, which enables high-complexity operations such as multi-core “parallel” programming.

              I’m talking about the former “PGA parallel programming”, and not the latter “multi-core parallel programming”.

              • El Barto@lemmy.world

                I understand all that. I wrote my first 6502 assembler program in 1989 - and it was fun, by the way!

                I am also aware that today’s CPUs are nothing like the 8-bit CPUs of the 80s. So we’re on the same page in that respect.

                I understand what you’re saying now. You’re talking about programmable gate arrays, which is cool. But I still don’t understand how “parallel programming” gate arrays comes with almost no safety compared to “serial programming” gate arrays. If you are not careful in either mode, you can introduce serious bugs in the programming.

                • ekky43@lemmy.dbzer0.com

                  Right, apologies for dumbing it down so far; I find it hard to properly gauge the knowledge of others on the internet, so I just try to play it safe.

                  I wasn’t aware that one could serial program gate arrays since, as far as I know, the definition of serial programming is code governed by a processor, which prohibits anything but serial execution of commands. So it’s new to me that gate arrays can run serial code without any governing or serializing process, since gate arrays by themselves are anything but serial. Or rather, you need to synchronize anything and everything that is supposed to be serial yourself, or use pre-built and pre-synced blocks, I guess.

                  Anyway, going by the definition that serial programming can only be performed under some kind of governing or synchronizing authority, that alone would be another layer of safety.

                  As “serial” implies, it rid us of, or at least lessened, those timing-related issues, some of which included:

                  • All the problems of accessing in-use resources that multi-core serial “parallel” programming reintroduced.
                  • Making a block and not properly timing it, so the clock changes while it’s still flipping gates, producing unexpected behavior.
                  • As above, just generally having to time everything: too many clock blocks or sync checks cost unnecessary speed, while too few might produce unexpected behavior.
                  • Over/underclocking and other slight power and clock variations.
                  • Uninitialized gates producing random behavior.
                  • And by extension: the power-up process not being exactly the same every time, resulting in more unexpected behavior. Very annoying to debug when it looks all right to start with.
                  • Reading through seconds of timing diagrams (that’s a lot of reading with a clock period of nanoseconds).
                  • Block placement and connection problems.
                  • Using gate array layouts/code with differing transistor specs.

                  And the list goes on, but you know.
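                  As a toy illustration of the gate-timing point above: a signal ANDed with its own one-step-delayed inverse is logically always 0, yet the delay lets a glitch through on every rising edge. A minimal Python simulation (one time step standing in for one gate delay):

```python
def hazard_trace(a_wave):
    """out(t) = a(t) AND NOT(a(t-1)): the inverter 'sees' the input
    one time step late. Logically 'a AND NOT a' is always 0, but the
    delay produces a one-step glitch on every rising edge of a."""
    trace, prev_a = [], a_wave[0]
    for a in a_wave:
        trace.append(a & (1 - prev_a))   # AND gate with delayed inverse
        prev_a = a                       # the inverter's delayed view
    return trace

print(hazard_trace([0, 0, 1, 1, 1, 0, 0]))  # [0, 0, 1, 0, 0, 0, 0]
```

                  That spurious 1 is exactly the kind of thing a settled, clocked (i.e. serialized) design hides from you, and an unclocked gate array does not.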

                  Serial also has a lot of pitfalls, and you can definitely screw things up bad, but at least you don’t have to think much about clock or timing, or memory placement, unless communicating between devices or cores, and those sync problems tend to be rather tame and simple compared to intra-processor problems.

                  At least from my experience.

  • Flipper@feddit.de

    As my prof put it: “Every few years it’s announced that analog computing is coming back. But everyone is still using digital computing only”

  • rorschah@lemdro.id

    If you’re interested in learning more, there is a Veritasium video about analog chips that explains quite well how they work and what their use cases are.

  • kubica@kbin.social

    I’m not sure from that information whether they are internally analog or not. But it at least sounds like smart limit switches, and it does make sense that those would be more efficient than having a computer monitor a signal and compare it to the desired value.

    • Salamendacious@lemmy.worldOP

      This impressed me:

      In August, IBM unveiled a prototype of a low-power analog chip designed specifically for speech recognition — it was able to detect 12 “wake words” more quickly and just as accurately as a digital system.

      I’m always wary of extraordinary claims (Theranos and all) but this could potentially be interesting.

  • HenriVolney@sh.itjust.works

    Very interesting!

    Happy to see people reminding everybody of the analog nature of the world, in spite of big corps pushing for a cold, fully digital metaverse.