- Reimplement material from deeplearning.ai in F# to gain deeper understanding.
- Build a generic .NET Core DNN library for other projects.
- TBD
- Written in F#/.NET Core 3 using Math.NET Numerics on MKL
- Componentized - each of the aspects below can be tested and extended on its own
- Developed entirely via test-driven development (TDD)
- Static Computation Graph
- Tensor abstraction (uniform API over Matrix/Vector with minimal broadcasting support)
- Minimal transfer learning
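The static computation graph listed above is the backbone of the library: operations are recorded as graph nodes so that gradients can be accumulated by a reverse walk. The library itself is F#; as a language-neutral sketch of the idea (all names here are illustrative, not the library's API), a minimal scalar version looks like:

```python
class Node:
    """One vertex of a static computation graph: a value, a gradient
    accumulator, and (parent, local-gradient) edges for backprop."""
    def __init__(self, value=0.0):
        self.value, self.grad, self.parents = value, 0.0, []

def add(a, b):
    out = Node(a.value + b.value)
    out.parents = [(a, lambda g: g), (b, lambda g: g)]
    return out

def mul(a, b):
    out = Node(a.value * b.value)
    out.parents = [(a, lambda g: g * b.value), (b, lambda g: g * a.value)]
    return out

def backward(out):
    # Reverse topological walk, accumulating chain-rule contributions.
    out.grad = 1.0
    order, seen = [], set()
    def topo(n):
        if id(n) in seen:
            return
        seen.add(id(n))
        for p, _ in n.parents:
            topo(p)
        order.append(n)
    topo(out)
    for n in reversed(order):
        for p, local in n.parents:
            p.grad += local(n.grad)
```

For `z = x*x + y` at `x = 3, y = 4`, `backward(z)` yields `x.grad = 6` and `y.grad = 1`, matching the hand-computed derivatives.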
- He
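He initialization scales each weight's standard deviation by the layer's fan-in, which keeps activation variance roughly stable through ReLU layers. A hedged Python sketch of the scheme (the function name and signature are illustrative, not this library's API):

```python
import math
import random

def he_init(fan_in, fan_out, rng=None):
    # He initialization: weights ~ N(0, sqrt(2 / fan_in)).
    rng = rng or random.Random(0)
    std = math.sqrt(2.0 / fan_in)
    return [[rng.gauss(0.0, std) for _ in range(fan_out)]
            for _ in range(fan_in)]
```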
- Linear
- Sigmoid
- ReLU
- TanH
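Each of the four activations above needs both a forward function and a derivative for backprop. A compact scalar sketch in Python (illustrative only; the library's F# versions operate on its Tensor abstraction):

```python
import math

# Each activation paired with its derivative for backpropagation.
def linear(x):    return x
def d_linear(x):  return 1.0

def sigmoid(x):   return 1.0 / (1.0 + math.exp(-x))
def d_sigmoid(x):
    s = sigmoid(x)
    return s * (1.0 - s)

def relu(x):      return max(0.0, x)
def d_relu(x):    return 1.0 if x > 0 else 0.0

def tanh(x):      return math.tanh(x)
def d_tanh(x):    return 1.0 - math.tanh(x) ** 2
```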
- Mean squared error
- Binary cross-entropy with logits
- Categorical cross-entropy with logits
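Computing the cross-entropy losses "with logits" matters numerically: folding the sigmoid/softmax into the loss allows a log-sum-exp formulation that avoids overflow for large logits. A hedged scalar sketch of the stable formulations (function names are illustrative):

```python
import math

def bce_with_logits(z, y):
    # Stable binary cross-entropy on a raw logit z, target y in {0, 1}:
    # max(z, 0) - z*y + log(1 + exp(-|z|))
    return max(z, 0.0) - z * y + math.log1p(math.exp(-abs(z)))

def softmax(zs):
    m = max(zs)                           # shift by max for stability
    exps = [math.exp(z - m) for z in zs]
    s = sum(exps)
    return [e / s for e in exps]

def categorical_ce_with_logits(zs, k):
    # -log softmax(zs)[k], computed via log-sum-exp
    m = max(zs)
    lse = m + math.log(sum(math.exp(z - m) for z in zs))
    return lse - zs[k]
```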
- Batch Gradient Descent
- Momentum
- Adam
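The three optimizers differ only in how the raw gradient is transformed before the weight update. As a single-parameter sketch (illustrative names; the library applies these element-wise over its Tensors):

```python
import math

def sgd_step(w, g, lr=0.01):
    # Plain gradient descent: step against the gradient.
    return w - lr * g

def momentum_step(w, g, v, lr=0.01, beta=0.9):
    # Momentum: exponentially average gradients to damp oscillation.
    v = beta * v + g
    return w - lr * v, v

def adam_step(w, g, m, v, t, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
    # Adam: bias-corrected first/second moment estimates, t >= 1.
    m = b1 * m + (1 - b1) * g
    v = b2 * v + (1 - b2) * g * g
    m_hat = m / (1 - b1 ** t)
    v_hat = v / (1 - b2 ** t)
    return w - lr * m_hat / (math.sqrt(v_hat) + eps), m, v
```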
- L1
- L2
- Dropout
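L1/L2 enter training as extra gradient terms on each weight, while dropout randomly silences units during the forward pass. A hedged sketch of all three (the inverted-dropout scaling keeps the expected activation unchanged, so no rescaling is needed at inference):

```python
import random

def l1_grad(w, lam):
    # Subgradient of lam * |w|.
    return lam * (1.0 if w > 0 else -1.0 if w < 0 else 0.0)

def l2_grad(w, lam):
    # Gradient of (lam / 2) * w^2.
    return lam * w

def dropout(xs, p_keep, rng=None):
    # Inverted dropout: drop each unit with prob (1 - p_keep),
    # scale survivors by 1/p_keep.
    rng = rng or random.Random(0)
    return [x / p_keep if rng.random() < p_keep else 0.0 for x in xs]
```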
- Stochastic/Mini-Batch Gradient Descent
- Normalization
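Mini-batch training shuffles the data and iterates over fixed-size slices, and input normalization standardizes each feature (remembering the training-set statistics for reuse at test time). An illustrative sketch of both, assuming row-major samples:

```python
import math
import random

def minibatches(data, size, rng):
    # Shuffle indices, then yield consecutive slices of `size` samples.
    idx = list(range(len(data)))
    rng.shuffle(idx)
    for i in range(0, len(idx), size):
        yield [data[j] for j in idx[i:i + size]]

def normalize(batch):
    # Standardize each feature column to zero mean, unit variance;
    # return the statistics so they can be reused at test time.
    n, dims = len(batch), len(batch[0])
    means = [sum(row[d] for row in batch) / n for d in range(dims)]
    stds = [math.sqrt(sum((row[d] - means[d]) ** 2 for row in batch) / n)
            or 1.0  # guard constant features against divide-by-zero
            for d in range(dims)]
    normed = [[(row[d] - means[d]) / stds[d] for d in range(dims)]
              for row in batch]
    return normed, means, stds
```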
- MNIST
- ?Multilabel?
- ?Regression?
- TBD: compare with NumPy
- Implement Tensor functions on GPU
- Implement CNN and RNN class networks based on the Computation Graph
- Enable dynamic version of the Computation Graph
- ConvNetJS
- Hacker's guide to Neural Networks: Computation Graphs
- CS231n Winter 2016: Lecture 4: Backpropagation basics
- CS231n Winter 2016: Lecture 4: Jacobians
- fsharpforfunandprofit - Catamorphisms
- Introduction to Machine Learning - Neural Network
- Exhaustive list of loss functions
- https://towardsdatascience.com/cross-entropy-for-classification-d98e7f974451
- https://levelup.gitconnected.com/killer-combo-softmax-and-cross-entropy-5907442f60ba
- https://peterroelants.github.io/posts/cross-entropy-logistic/
- https://deepnotes.io/softmax-crossentropy#cross-entropy-loss
- https://deepai.org/machine-learning-glossary-and-terms/softmax-layer