Micrograd-C is a simple automatic differentiation library implemented in C, essentially a C port of Karpathy's Micrograd. It provides a basic framework for creating and manipulating scalar values with gradient support, allowing backpropagation through a computation graph.
- Automatic Differentiation: Compute gradients for scalar operations using backpropagation.
- Scalar Operations: Supports addition, multiplication, power, ReLU, sigmoid, and tanh functions.
- Backward Pass: Propagate gradients through a computation graph.
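For example, if `d = (a * b) + c`, the backward pass applies the chain rule from the output back to the leaves: the gradient of `d` with respect to `c` is 1, with respect to `a` is `b`, and with respect to `b` is `a`; each result is stored in the corresponding scalar's gradient field. A short usage sketch in C is included after the function list below.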
To get started with Micrograd-C, clone this repository and compile the provided code.
- A C compiler (e.g., `gcc`)
- A C++ compiler (e.g., `g++`)
- Clone the repository:

  ```bash
  git clone https://github.com/shivendrra/micrograd.c.git
  cd micrograd.c
  ```
- Compile the code:
  - For the C version:

    ```bash
    gcc csrc/scalar.c test.c -o test -lm
    ```
  - For the C++ version:

    ```bash
    g++ cppsrc/value.cpp cppsrc/module.cpp main.cpp -o main -lm
    ```
After compiling, run the example:
- For the C version:

  ```bash
  ./test
  ```
- For the C++ version:

  ```bash
  ./main
  ```
Here is an example of what you can expect when running the program:
```
Scalar[data=(2), grad=0.0074]
Scalar[data=(3), grad=0.0049]
Scalar[data=(5), grad=0]
Scalar[data=(6), grad=0.0025]
Scalar[data=(6), grad=0]
Scalar[data=(1), grad=0]
Scalar[data=(0.9975), grad=1]
```
The repository contains:

- A header file defining the `Scalar` structure and function prototypes for scalar operations and backward propagation.
- A C file (`csrc/scalar.c`) containing all the necessary functions for the `Scalar` value structure (ops & backprop).
- A header file defining the `Neuron`, `Layer`, & `MLP` structures and function prototypes for creating a small MLP.
- A C file containing all the basic functions for building and implementing an MLP in C using `Scalar` values.
- A test file (`test.c`) demonstrating the usage of the Micrograd-C library by creating scalar values, performing operations, and computing gradients.
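As a rough illustration, the `Scalar` structure likely holds the forward value, its gradient, and links to the nodes it was computed from. The sketch below is an assumption for orientation only; the field names and layout in the actual header may differ:

```c
// Illustrative sketch only; see the Scalar header in the repository for the real definition.
typedef struct Scalar {
  double data;                           // forward value
  double grad;                           // d(output)/d(this), filled in by backward()
  struct Scalar** children;              // operands this value was computed from
  int num_children;                      // number of child nodes
  void (*backward_fn)(struct Scalar*);   // local gradient rule for the producing op
} Scalar;
```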
- `initialize_scalars`: Initializes a scalar value with a given data value and its children.
- `add_val`, `mul_val`, `pow_val`, `relu`, `sigmoid`, `tan_h`, `sub_val`: Functions for scalar operations.
- `backward`: Computes gradients for all scalar values in the computation graph.
- `print`: Prints the scalar data and gradient values.
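A minimal usage sketch built from the functions listed above. The include path and the exact signatures (in particular how children are passed to `initialize_scalars`) are assumptions and may differ from the actual code:

```c
#include <stdio.h>
#include "csrc/scalar.h"   // assumed header path

int main(void) {
  // Leaf scalars; the child-list arguments here are an assumption
  Scalar* a = initialize_scalars(2.0, NULL, 0);
  Scalar* b = initialize_scalars(3.0, NULL, 0);

  // Build a small computation graph: out = tanh(a * b)
  Scalar* c = mul_val(a, b);
  Scalar* out = tan_h(c);

  // Propagate gradients from the output back to the leaves
  backward(out);

  // Inspect data and gradient values
  print(a);
  print(b);
  print(out);
  return 0;
}
```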
Feel free to contribute by submitting issues or pull requests. Your contributions are welcome!