
Deep Neural Networks from Scratch

A mini "framework" for deep neural networks, written using only NumPy. Check out the interactive notebook. Currently, only binary classification and multi-class classification problems are supported.

Features:

  • Backpropagation implementation
  • Custom initialization: He, Xavier
  • Custom hyperparameters: learning rate, number and size of layers, number of iterations
  • Custom regularization: L2 regularization, dropout
  • Gradient descent optimizers: Momentum, Adam, RMSProp (a few of these techniques are roughly sketched below)
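
Several of the techniques listed above can be written compactly in plain NumPy. The sketch below illustrates He/Xavier initialization, inverted dropout, and a single Adam update step. It is illustrative only: the function names are made up for the example and are not this repository's API.

```python
# Minimal NumPy sketches of He/Xavier initialization, inverted dropout,
# and one Adam step. Illustrative only -- not code from this repository.
import numpy as np

rng = np.random.default_rng(0)

def initialize_weights(n_in, n_out, method="he"):
    """He init scales by sqrt(2/n_in) (suited to ReLU); Xavier by sqrt(1/n_in)."""
    scale = np.sqrt(2.0 / n_in) if method == "he" else np.sqrt(1.0 / n_in)
    W = rng.standard_normal((n_out, n_in)) * scale
    b = np.zeros((n_out, 1))
    return W, b

def dropout_forward(A, keep_prob=0.8):
    """Inverted dropout: zero out units, then rescale so the expected activation is unchanged."""
    mask = (rng.random(A.shape) < keep_prob).astype(A.dtype)
    return A * mask / keep_prob, mask

def adam_update(param, grad, v, s, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam step: exponential moving averages of the gradient and its square, with bias correction."""
    v = beta1 * v + (1 - beta1) * grad
    s = beta2 * s + (1 - beta2) * grad**2
    v_hat = v / (1 - beta1**t)
    s_hat = s / (1 - beta2**t)
    param = param - lr * v_hat / (np.sqrt(s_hat) + eps)
    return param, v, s

# Tiny demo: initialize one layer, apply dropout to a fake activation,
# and take a single Adam step on the weights using a fake gradient.
W, b = initialize_weights(n_in=4, n_out=3, method="he")
A, _ = dropout_forward(rng.standard_normal((3, 5)), keep_prob=0.8)
grad_W = rng.standard_normal(W.shape)
W, vW, sW = adam_update(W, grad_W, np.zeros_like(W), np.zeros_like(W), t=1)
```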

Notes:

  • The duplicate .py files (e.g. nn_binary_classification.py) are Jupytext pairings of the notebooks, used for diffing Jupyter Notebook changes. Normally, we'd version-control only the .py files and ignore the .ipynb pairings; however, for quick viewing on GitHub and general convenience, I'm keeping both formats in the repository.

TODO:

  • Add API documentation
  • Replace nn_utils with custom functions
  • Add method docstrings
  • Add features (two of these are roughly sketched after this list):
    • Mini-batch gradient descent
    • Batch norm
    • Softmax
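
For reference, two of the planned features can be sketched briefly in NumPy: shuffled mini-batch iteration and a numerically stable softmax. This is only an illustration of the ideas, not code from this repository, and the helper names are invented for the example.

```python
# Rough sketches of mini-batch gradient descent iteration and softmax.
# Illustrative only -- not code from this repository.
import numpy as np

def iterate_minibatches(X, Y, batch_size, rng):
    """Yield shuffled (X, Y) mini-batches; examples are stored as columns."""
    m = X.shape[1]
    order = rng.permutation(m)
    for start in range(0, m, batch_size):
        idx = order[start:start + batch_size]
        yield X[:, idx], Y[:, idx]

def softmax(Z):
    """Numerically stable softmax over each column of Z."""
    Z_shifted = Z - Z.max(axis=0, keepdims=True)
    expZ = np.exp(Z_shifted)
    return expZ / expZ.sum(axis=0, keepdims=True)

# Tiny demo: 4 features, 10 examples, 3 classes (one-hot labels).
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 10))
Y = np.eye(3)[rng.integers(0, 3, 10)].T
for Xb, Yb in iterate_minibatches(X, Y, batch_size=4, rng=rng):
    probs = softmax(rng.standard_normal((3, Xb.shape[1])))  # stand-in logits
    loss = -np.mean(np.sum(Yb * np.log(probs + 1e-12), axis=0))  # cross-entropy
```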

License:

Copyright 2020-2021 Deepankara Reddy. BSD 2-Clause license.
