Implementation of a Neural Network written in Python. Uses the RMSProp optimizer and supports the sigmoid, tanh, and ReLU activation functions.

sshreyas999/Neural-Network-Implementation

Neural Network Implementation

A simple neural network implementation with one hidden layer. The sigmoid, tanh, and ReLU activation functions are supported, and the RMSProp optimizer is used to carry out backpropagation.

Quick Links

  1. Code

  2. Data

  3. Report - Logs, Proofs & Analysis.

Prerequisites

The implementation was written using the Spyder IDE and requires basic packages such as pandas, numpy, seaborn, and matplotlib. To write metrics out to an Excel sheet, the xlrd and openpyxl packages are also required.

Code Breakdown

Complete code can be found here.

Breakdown of classes:

  1. The dataManager class is responsible for reading in the raw data and preparing it for learning. It cleans the dataset, separates the attributes from the target, and splits the set into train and test.

  2. The NeuralNet class is responsible for learning with a given activation function (sigmoid, tanh, or ReLU), using RMSProp as an added optimization. It performs all the computations for learning and also calculates and stores metrics.

  3. The modelComparer class takes in different sets of parameters and trains the model repeatedly, storing all the metrics so the models can be compared to find the best one. Because it takes a while to run, this code has been commented out; the results are included in the report. There is also an option to write the model metrics to an Excel file. An example can be found here.
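To illustrate the building blocks the NeuralNet class combines, here is a minimal sketch of the three supported activation functions and a single RMSProp update step. All function names and signatures below are illustrative, not the repo's actual API:

```python
import numpy as np

# The three activations the implementation supports
# (names here are assumed for illustration).
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    return np.tanh(x)

def relu(x):
    return np.maximum(0.0, x)

def rmsprop_update(w, grad, cache, lr=0.01, decay=0.9, eps=1e-8):
    """One RMSProp step: keep a decaying average of the squared
    gradient, then scale the step by its root, so each weight
    effectively gets its own adaptive learning rate."""
    cache = decay * cache + (1.0 - decay) * grad ** 2
    w = w - lr * grad / (np.sqrt(cache) + eps)
    return w, cache
```

The point of RMSProp over plain gradient descent is the per-weight normalization: weights with consistently large gradients take smaller steps, which tends to stabilize backpropagation.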

Dataset

The dataset is hosted on S3, but can also be found here.

The main goal is to predict the probability of heart failure, encoded as DEATH_EVENT. There are 299 observations with 13 attributes per observation. Note that this dataset is only one example used to demonstrate the implementation; other datasets can be loaded via the dataManager class.
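To show the kind of attribute/target separation and train/test split the dataManager class performs, here is a sketch using a synthetic stand-in for the real CSV. The column names, random seed, and 80/20 split ratio are assumptions for illustration only:

```python
import numpy as np
import pandas as pd

# Synthetic stand-in for the real dataset
# (299 observations, 13 attributes, plus the DEATH_EVENT target).
rng = np.random.default_rng(0)
df = pd.DataFrame(rng.random((299, 13)),
                  columns=[f"attr_{i}" for i in range(13)])
df["DEATH_EVENT"] = rng.integers(0, 2, size=299)

# Separate attributes from the target.
X = df.drop(columns=["DEATH_EVENT"]).to_numpy()
y = df["DEATH_EVENT"].to_numpy()

# Shuffle, then split 80/20 into train and test sets.
idx = rng.permutation(len(X))
split = int(0.8 * len(X))
X_train, X_test = X[idx[:split]], X[idx[split:]]
y_train, y_test = y[idx[:split]], y[idx[split:]]
```

Shuffling before splitting matters here: if the raw file is ordered by outcome, a contiguous split would give the train and test sets very different class balances.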

Results

Apart from the modelComparer class in the code, a separate analysis has been carried out with the Heart Failure dataset. Multiple trials have been conducted and documented in the report, which also provides a thorough explanation of the optimizer; it can be found here.
