# micrograd_pp


Micrograd++ is a minimalistic wrapper around NumPy that adds support for automatic differentiation. It also provides various composable classes ("layers") and other tools to simplify building neural networks.

Micrograd++ draws inspiration from Andrej Karpathy's awesome micrograd library, prioritizing simplicity and readability over speed. Unlike micrograd, which operates on scalar values, Micrograd++ supports tensor inputs (specifically, NumPy arrays). This makes it possible to train larger networks.

## Usage

Micrograd++ is not yet installable via pip. For now, clone the Micrograd++ repository to your home directory and make it importable in any script or notebook by first executing the snippet below:

```python
import os
import sys

sys.path.insert(0, os.path.expanduser("~/micrograd-pp/python"))
```
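
With the path in place, basic usage follows the micrograd pattern of building a computation and calling `.backward()` on the result. The sketch below is illustrative only: `.backward` appears in the feature list, but the `Constant` wrapper, the operator overloads, the `sum` method, and the `grad` attribute are assumptions about the API rather than documented behavior.

```python
import numpy as np

import micrograd_pp as mpp  # module name assumed from the repository layout

# Hypothetical sketch: wrap a NumPy array, build an expression, and
# backpropagate. Only .backward is confirmed by this README; the rest
# is assumed.
x = mpp.Constant(np.array([1.0, 2.0, 3.0]))  # tensor wrapper name assumed
y = (x * x).sum()                            # y = x1^2 + x2^2 + x3^2
y.backward()                                 # reverse-mode autodiff
print(x.grad)                                # expect dy/dx = 2x = [2. 4. 6.]
```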

## Examples

## Features

- Core
  - [x] Reverse-mode automatic differentiation (`.backward`)
  - [x] GPU support
- Layers
  - [x] `BatchNorm1d`
  - [x] `Dropout`
  - [x] `Embedding`
  - [x] `LayerNorm`
  - [x] `Linear`
  - [x] `MultiheadAttention`
  - [x] `ReLU`
  - [x] `Sequential`
- Optimizers
  - [ ] Adam
  - [x] Stochastic Gradient Descent (SGD)
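
To give a sense of how the layers and optimizer compose, here is a hypothetical sketch of a single training step. The class names `Linear`, `ReLU`, `Sequential`, and `SGD` come from the feature list above, but their constructor signatures, the `Constant` wrapper, and the optimizer interface are assumptions, not the documented API.

```python
import numpy as np

import micrograd_pp as mpp  # module name assumed

# Hypothetical two-layer perceptron; the layer class names appear in the
# feature list, but these constructor signatures are assumptions.
model = mpp.Sequential(
    mpp.Linear(3, 16),
    mpp.ReLU(),
    mpp.Linear(16, 1),
)

x = mpp.Constant(np.random.randn(32, 3))       # batch of 32 inputs (wrapper name assumed)
target = mpp.Constant(np.random.randn(32, 1))  # regression targets

pred = model(x)                        # forward pass
err = pred - target
loss = (err * err).sum()               # squared-error loss
loss.backward()                        # reverse-mode autodiff

opt = mpp.SGD(lr=0.01)                 # optimizer signature assumed
opt.step()                             # apply the gradient update
```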