I know the typical answer is “no, because all the libraries are in Python”… but I am kind of baffled why more porting isn’t going on, especially to Go, given that Go, like Python, is stupid easy to learn, yet much faster to run. Truly not trying to start a flame war or anything. I am just a bigger fan of Go than Python, and I was thinking that coming into 2024, especially with all the huge money in AI now, we’d see a LOT more movement toward Go’s much faster runtime, with code that is largely as easy, if not easier, to write and maintain. Not sure about Rust… it may run a little faster than Go, but the language is much more difficult to learn and use. It has been growing in popularity, though, so I was curious whether it is a potential option.

There are some Go libs I’ve found, but the few I have seem to be three, four, or more years old. I was hoping there would be things like PyTorch and the like converted to Go.

I was even curious: with the power of GPT-4 or DeepSeek Coder or similar, how hard would it be to run conversions of Python libraries to Go? Is anyone working on that, or is it pretty much impossible to do?

    • Dry-Vermicelli-682@alien.top (OP) · 1 year ago

      But the actual training code… isn’t there a crap ton of code that trains the model so that the model can respond with NLP and other capabilities? There has to be code behind all that somewhere. The ability for the “logic” of the AI to do what it does… that code is Python as well, yeah? I would assume that Go or Rust or C would execute much faster, and this AI could be much faster (and use less memory, no Python runtime, etc.)? Or is there already some backend C/C++ code that does ALL of that AI logic/guts, and Python, even for training models, is still just glue that calls into the C/C++ layer?

      • m98789@alien.top · 1 year ago

        Correct. Even for training the models, all the Python code you see is really just a friendly interface over highly optimized C/CUDA code.

        There are no loops or matrix multiplications being done in Python. All the heavy lifting is done in lower-level, highly optimized code.
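A rough illustration of this point, using NumPy as a stand-in for the compiled back ends (the matrix sizes here are invented for the demo): the same matrix multiply written as Python-level loops versus one call that dispatches to compiled BLAS code produces identical numbers, but only one of them does its looping in Python.

```python
import numpy as np

# The same matrix multiply, twice: once as Python-level loops,
# once via NumPy's `@`, which dispatches to compiled BLAS code.
def matmul_python(a, b):
    n, k, m = len(a), len(b), len(b[0])
    out = [[0.0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            s = 0.0
            for p in range(k):
                s += a[i][p] * b[p][j]
            out[i][j] = s
    return out

rng = np.random.default_rng(0)
a = rng.random((64, 64))
b = rng.random((64, 64))

slow = np.array(matmul_python(a.tolist(), b.tolist()))
fast = a @ b  # one call; all the looping happens in compiled code

print(np.allclose(slow, fast))  # same result, wildly different speed
```

Time both versions on larger matrices and the gap grows to orders of magnitude, which is why the interpreter language on top matters far less than people expect.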

        • Dry-Vermicelli-682@alien.top (OP) · 1 year ago

          So most of the Python AI coding folks… aren’t writing CUDA/high-end math/algorithm-style code… they’re just using the library, similar to any other SDK?

          • m98789@alien.top · 1 year ago

            Yes. Even the authors of AI frameworks like PyTorch aren’t usually writing the low-level CUDA code for NNs. They are wrapping NVIDIA’s cuDNN library, which contains highly optimized CUDA code for NN operations.
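The “wrapping” pattern can be shown in miniature with the standard library. This sketch calls a function from the C math library via `ctypes` (libm’s `cos` stands in here, since actually calling cuDNN requires an NVIDIA GPU and the cuDNN SDK); PyTorch does the same thing at vastly larger scale with cuDNN kernels.

```python
import ctypes
import ctypes.util

# Python declaring and calling a function from a compiled C library.
# This is the essence of what framework bindings do under the hood.
libm = ctypes.CDLL(ctypes.util.find_library("m"))
libm.cos.argtypes = [ctypes.c_double]
libm.cos.restype = ctypes.c_double

print(libm.cos(0.0))  # 1.0 -- computed in C, merely *called* from Python
```

In practice frameworks use C-extension modules or pybind11 rather than `ctypes`, but the division of labor is the same: Python describes what to run, compiled code runs it.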

          • adarshiscool@alien.top · 1 year ago

            Yes, of course! Don’t make the mistake of thinking any particular area of CS is fundamentally different from the classic case: we stand on the shoulders of great libraries :)

            Also, it’s important to note that the “logic” you are trying to find is not written in Python or any other programming language; it is held in the weights and architecture of the model you use (which, hint: is also something you will be borrowing from an academic lab or the research arm of a tech firm rather than writing yourself!).