Copyright (c) Pius Braun 2018
Neural networks are one of the most important methods in machine learning.
This project implements some of the widely used components of neural networks in C++,
using backpropagation and stochastic gradient descent.
The result is a very basic machine-learning kernel that can train a neural network with different neuron and cost functions.
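For context, the mini-batch update that stochastic gradient descent performs, with backpropagation supplying the gradients, is the standard rule below. The notation (learning rate η, mini-batch size m, per-example cost C_x) is mine, not necessarily the project's:

```latex
% Standard SGD update over a mini-batch of m examples x,
% with learning rate \eta and per-example cost C_x:
w \leftarrow w - \frac{\eta}{m} \sum_x \frac{\partial C_x}{\partial w},
\qquad
b \leftarrow b - \frac{\eta}{m} \sum_x \frac{\partial C_x}{\partial b}
```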
Several cost functions and several neuron (activation) functions are implemented.
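As an illustration of what such components look like, here is a minimal C++ sketch of a quadratic cost and a sigmoid neuron function, two common choices. Whether they match the functions this project actually implements is an assumption, and the names are mine, not the project's API:

```cpp
// Illustrative sketch only (not this project's API): a quadratic cost
// and a sigmoid neuron function, built on Eigen like the rest of the code.
#include <Eigen/Dense>

// Quadratic cost C = 1/2 * ||a - y||^2 for network output a and target y.
double quadratic_cost(const Eigen::VectorXd& a, const Eigen::VectorXd& y) {
    return 0.5 * (a - y).squaredNorm();
}

// Element-wise sigmoid activation sigma(z) = 1 / (1 + exp(-z)).
Eigen::VectorXd sigmoid(const Eigen::VectorXd& z) {
    return (1.0 / (1.0 + (-z.array()).exp())).matrix();
}
```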
All parameters of the network can be configured from the command line.
As an example, the network trains on the MNIST dataset and achieves an accuracy of up to 98.5 %.
Section | Content |
---|---|
Specification | describes the Math behind Neural Networks |
Implementation | explains the Code |
References | lists the books, websites, tools, and third-party libraries |
If you want to contribute to the project:
- Write a specification of what you intend to achieve and of the math behind it.
- Send the specification to me: pius.braun@t-online.de.
- Update the source code in a sandbox on your own system and test it for bugs.
- Run tests similar to those in the Test section of the documentation.
There is room for improvement:
- I did not implement the code for validation data.
- The data input is restricted to the IDX format as defined by Yann LeCun (an IDX header reader is sketched below). CSV would be better.
- The results are stored in a CSV file without any really useful structure. Maybe there are better ideas for storing training results.
- The network is fully connected in all layers. Convolutional networks should be better for some purposes.
- Some matrix operations in `backprop()` and `feedforward()` may run faster if I could dig deeper into the Eigen matrix code (see the `noalias()` sketch below).
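On the Eigen point: one documented lever is `noalias()`, which promises that the destination of a product assignment does not overlap the operands, so Eigen can skip the temporary it would otherwise create. Whether this helps depends on the actual expressions in `backprop()` and `feedforward()`; the function below is an illustrative sketch, not the project's code:

```cpp
#include <Eigen/Dense>

// Sketch: z = W * a + b normally evaluates the product into a temporary.
// noalias() tells Eigen that z does not alias W, a, or b, so the result
// can be written straight into z.
void weighted_input(const Eigen::MatrixXd& W, const Eigen::VectorXd& a,
                    const Eigen::VectorXd& b, Eigen::VectorXd& z) {
    z.noalias() = W * a + b;
}
```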
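For anyone picking up the IDX input item: the header layout is documented on Yann LeCun's MNIST page (two zero bytes, a type byte, a dimension-count byte, then one big-endian 32-bit size per dimension). A minimal reader sketch, with names of my own choosing:

```cpp
#include <cstdint>
#include <fstream>
#include <stdexcept>
#include <vector>

// Reads one big-endian 32-bit unsigned integer from the stream.
static uint32_t read_be32(std::ifstream& in) {
    unsigned char b[4];
    in.read(reinterpret_cast<char*>(b), 4);
    return (uint32_t(b[0]) << 24) | (uint32_t(b[1]) << 16) |
           (uint32_t(b[2]) << 8)  |  uint32_t(b[3]);
}

// Returns the size of each dimension, e.g. {60000, 28, 28} for the
// MNIST training images. The third magic byte (0x08 = unsigned byte)
// encodes the element type and is not checked here.
std::vector<uint32_t> read_idx_dims(std::ifstream& in) {
    uint32_t magic = read_be32(in);
    if ((magic >> 16) != 0)            // first two bytes must be zero
        throw std::runtime_error("not an IDX file");
    unsigned n_dims = magic & 0xFF;    // fourth byte: dimension count
    std::vector<uint32_t> dims(n_dims);
    for (auto& d : dims) d = read_be32(in);
    return dims;
}
```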