Hi everyone!
A couple of days ago, I finally decided to follow Andrej Karpathy’s micrograd tutorial. I liked the concepts so much I implemented it in C++ and added more functionality:
- Optimizers: Adam, SGD with momentum
- Activation functions: tanh, sigmoid, ReLU
I tried to make both the API and the library internals as clean as possible, so that anyone can understand the basic algorithms behind deep learning. Here is the project’s repo and a getting-started guide. Let me know what you think :)
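For anyone who hasn’t gone through micrograd: the core idea is a scalar value type that records the expression graph through overloaded operators, then runs reverse-mode backprop over that graph. Below is a rough, self-contained sketch of what such an engine can look like in C++. The names (`Value`, `make_value`, `backward`) and the `shared_ptr`-based graph are just my own illustrative assumptions, not necessarily how the repo above is structured:

```cpp
#include <cmath>
#include <functional>
#include <iostream>
#include <memory>
#include <unordered_set>
#include <vector>

// Illustrative micrograd-style scalar node: stores data, gradient,
// its parent nodes, and a closure that propagates the gradient to them.
struct Value {
    double data = 0.0;
    double grad = 0.0;
    std::vector<std::shared_ptr<Value>> parents;
    std::function<void()> backward_fn = [] {};

    explicit Value(double d) : data(d) {}
};

using Val = std::shared_ptr<Value>;

Val make_value(double d) { return std::make_shared<Value>(d); }

Val operator+(const Val& a, const Val& b) {
    auto out = make_value(a->data + b->data);
    out->parents = {a, b};
    Value* o = out.get();  // raw pointer avoids a shared_ptr cycle via the lambda
    out->backward_fn = [a, b, o] {
        a->grad += o->grad;
        b->grad += o->grad;
    };
    return out;
}

Val operator*(const Val& a, const Val& b) {
    auto out = make_value(a->data * b->data);
    out->parents = {a, b};
    Value* o = out.get();
    out->backward_fn = [a, b, o] {
        a->grad += b->data * o->grad;
        b->grad += a->data * o->grad;
    };
    return out;
}

Val tanh(const Val& a) {
    double t = std::tanh(a->data);
    auto out = make_value(t);
    out->parents = {a};
    Value* o = out.get();
    out->backward_fn = [a, o, t] { a->grad += (1.0 - t * t) * o->grad; };
    return out;
}

// Reverse-mode autodiff: topologically sort the graph, then run each
// node's backward_fn from the output back toward the inputs.
void backward(const Val& root) {
    std::vector<Value*> order;
    std::unordered_set<Value*> visited;
    std::function<void(Value*)> build = [&](Value* v) {
        if (visited.count(v)) return;
        visited.insert(v);
        for (auto& p : v->parents) build(p.get());
        order.push_back(v);
    };
    build(root.get());

    root->grad = 1.0;
    for (auto it = order.rbegin(); it != order.rend(); ++it) (*it)->backward_fn();
}

int main() {
    // y = tanh(w * x + b); with these values the pre-activation is 0,
    // so tanh'(0) = 1 and we expect dy/dw = x = 2 and dy/db = 1.
    auto x = make_value(2.0), w = make_value(-0.5), b = make_value(1.0);
    auto y = tanh(w * x + b);
    backward(y);
    std::cout << "y = " << y->data
              << "  dy/dw = " << w->grad
              << "  dy/db = " << b->grad << "\n";
}
```

From there, a plain SGD step is just looping over the parameters and doing `p->data -= lr * p->grad` after a backward pass; momentum and Adam only add some per-parameter state on top of that.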
I really like this pattern!
Someone starts a cool ML project in Python. Another person re-creates it more efficiently in C++.
Then someone else rebuilds it in Rust.
Indeed. 😀 In fact, I recently made a version in Zig: https://github.com/nurpax/zigrograd
Oh, today I discovered there’s a programming language called Zig.