This repo builds a neural network library from scratch, following Karpathy's exercise of building micrograd, a PyTorch-inspired micro-library for neural networks.
Topics Covered
- Calculus
- Backpropagation
- Forward pass
- PyTorch
- Multi-layer perceptrons
- Gradient descent
- Learning rate
- Training the network
- Loss function
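To give a flavor of the topics above, here is a minimal sketch of a micrograd-style scalar autograd engine: a `Value` class that records the computation graph during the forward pass and applies the chain rule in reverse during backpropagation. The class and variable names here are illustrative, not necessarily this repo's actual API.

```python
import math

class Value:
    """A scalar that tracks its computation graph for backpropagation (illustrative sketch)."""
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # addition routes the upstream gradient to both inputs
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # d(a*b)/da = b, d(a*b)/db = a
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def tanh(self):
        t = math.tanh(self.data)
        out = Value(t, (self,))
        def _backward():
            # d(tanh x)/dx = 1 - tanh(x)^2
            self.grad += (1 - t * t) * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # topologically sort the graph, then apply the chain rule in reverse order
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

# forward pass through a single tanh neuron: o = tanh(x1*w1 + x2*w2 + b)
x1, x2 = Value(2.0), Value(0.0)
w1, w2, b = Value(-3.0), Value(1.0), Value(6.8813735870195432)
n = x1 * w1 + x2 * w2 + b
o = n.tanh()
o.backward()
print(x1.grad, w1.grad)  # gradients of o with respect to x1 and w1
```

A training loop would repeat this pattern: run the forward pass to compute a loss, call `backward()` to get gradients, then nudge each parameter's `data` opposite its `grad`, scaled by a learning rate (gradient descent).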