I know the typical answer is “no, because all the libs are in Python”… but I am kind of baffled why more porting isn’t going on, especially to Go, given that Go, like Python, is stupid easy to learn and yet much faster to run. Truly not trying to start a flame war or anything. I am just a bigger fan of Go than Python, and I was thinking that coming into 2024, especially with all the huge money in AI now, we’d see a LOT more movement toward Go’s much faster runtime, with code that is largely as easy, if not easier, to write and maintain. Not sure about Rust… it may run a little faster than Go, but the language is much more difficult to learn/use. Still, it has been growing in popularity, so I was curious whether it is a potential option.
There are some Go libs I’ve found, but the few I have found seem to be 3, 4, or more years old. I was hoping there would be things like PyTorch and the like converted to Go.
I was even curious, given the power of GPT-4 or DeepSeek Coder or similar, how hard it would be to run conversions from Python libraries to Go, and whether anyone is working on that or whether it is pretty much impossible to do.
I do a great deal of work with AI using Go.
TL;DR: I strongly recommend using Go with the llama.cpp bindings (https://github.com/ggerganov/llama.cpp).
Go is particularly well-suited for LLMs, but in a different way than most people discuss here. Firstly, Go is a better language for LLMs to learn and write. This becomes evident when comparing Go with other languages across a variety of coding tasks using ChatGPT. The code generated for Go is often much better: more code can be correctly generated per prompt, and more complicated logic is often handled well in Go, while it’s difficult to generate in other languages. This is because Go has less variability; there are fewer alternative ways to do something. Go is very uniformly structured across codebases due to enforced formatting, shares most symbols with C and C++, and has a very complete standard library, making it unambiguous which libraries to include. Training on and generating Go code with LLMs is far more effective than with any other language I’ve tried, especially if you use a local model with a Go LoRA, or train your own model on Go.
Go is also well-suited for efficiently handling blocking data tasks such as loading large models from disk to RAM or GPU, or from remote RAM to local RAM, thereby achieving the best available I/O performance (often the bottleneck).
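To give a rough idea of what I mean, here is a minimal sketch of pulling a large model file into RAM with concurrent reads using only the standard library. The path, worker count, and chunking strategy are placeholder assumptions, not any particular library’s API:

```go
// Minimal sketch: load a large file into RAM with concurrent reads.
// The path, worker count, and chunking are illustrative assumptions.
package main

import (
	"fmt"
	"io"
	"os"
	"sync"
)

func loadFile(path string, workers int) ([]byte, error) {
	f, err := os.Open(path)
	if err != nil {
		return nil, err
	}
	defer f.Close()

	info, err := f.Stat()
	if err != nil {
		return nil, err
	}
	size := info.Size()
	buf := make([]byte, size)
	chunk := (size + int64(workers) - 1) / int64(workers)

	var wg sync.WaitGroup
	errs := make(chan error, workers)
	for i := 0; i < workers; i++ {
		off := int64(i) * chunk
		if off >= size {
			break
		}
		end := off + chunk
		if end > size {
			end = size
		}
		wg.Add(1)
		go func(off, end int64) {
			defer wg.Done()
			// *os.File supports parallel ReadAt calls, so each worker
			// fills its own slice of the buffer independently.
			if _, err := f.ReadAt(buf[off:end], off); err != nil && err != io.EOF {
				errs <- err
			}
		}(off, end)
	}
	wg.Wait()
	close(errs)
	if err := <-errs; err != nil {
		return nil, err
	}
	return buf, nil
}

func main() {
	data, err := loadFile("model.bin", 8) // placeholder path and worker count
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Printf("loaded %d bytes\n", len(data))
}
```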
Additionally, Go is the language of choice for web service client and server implementations, working well in contexts where you are using ChatGPT, GitHub, HuggingFace, and other API-based AI and supporting services.
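As a rough illustration of that client side, here is a hedged sketch of calling a chat-completion style endpoint with nothing but the standard library. The URL, JSON shape, and model name follow the OpenAI chat completions API as commonly documented; treat them as assumptions and adjust for whichever service you actually target:

```go
// Minimal sketch of an API client for a chat-completion style service.
// Endpoint, payload shape, and model name are assumptions based on the
// commonly documented OpenAI chat completions API.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
	"os"
	"time"
)

type message struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

type chatRequest struct {
	Model    string    `json:"model"`
	Messages []message `json:"messages"`
}

type chatResponse struct {
	Choices []struct {
		Message message `json:"message"`
	} `json:"choices"`
}

func main() {
	body, err := json.Marshal(chatRequest{
		Model:    "gpt-4", // illustrative model name
		Messages: []message{{Role: "user", Content: "Say hello in Go."}},
	})
	if err != nil {
		panic(err)
	}

	req, err := http.NewRequest("POST",
		"https://api.openai.com/v1/chat/completions", bytes.NewReader(body))
	if err != nil {
		panic(err)
	}
	req.Header.Set("Authorization", "Bearer "+os.Getenv("OPENAI_API_KEY"))
	req.Header.Set("Content-Type", "application/json")

	client := &http.Client{Timeout: 60 * time.Second}
	resp, err := client.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	var out chatResponse
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		panic(err)
	}
	if len(out.Choices) > 0 {
		fmt.Println(out.Choices[0].Message.Content)
	}
}
```

Nothing in that sketch needs a third-party client library, which is part of why Go works so well as glue between these hosted AI services.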
Finally, Go is compilable and cross-compilable to a single executable, making the distribution and remote deployment of applications much less error-prone.
I strongly recommend Go for AI work because of how well it interacts with all these aspects of the AI ecosystem.
WOW… fantastic. This is the first (only?) response I’ve seen of the dozens here that says any of this. As a long-time back-end dev, switching to Go for that was amazing. I was FAR faster and more productive than in Java, Python, or Node.js, especially once I built my own set of frameworks that I quickly copy/paste (or import). I love how fast/easy it is to stand up a Docker wrapper as well. Very tiny images with very fast runtimes.
My point in this post partly stemmed from the need to install so much runtime to use Python, and from the speed/memory/etc. overhead of running Python apps. If it is just basic glue code, as some have said, then it’s not a big deal. But since I know Go and love the language, I want to use it over anything else if I can.