This repository contains a Keras implementation of a Residual Network (ResNet), a deep learning architecture designed to tackle the vanishing gradient problem in very deep networks.
- Efficient Training: ResNet makes it practical to train networks with hundreds or even thousands of layers.
- Skip Connections: The core innovation is the skip (shortcut) connection, which lets gradients flow through alternate paths around each block.
- Addressing Vanishing Gradients: By adding a block's input to its output, ResNet mitigates the vanishing gradient problem and keeps learning effective in deep neural networks.
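A basic residual block with an identity skip connection can be sketched in Keras as follows; the layer sizes and input shape here are illustrative placeholders, not the repository's exact configuration:

```python
import tensorflow as tf
from tensorflow.keras import layers

def residual_block(x, filters):
    """Two conv layers whose output is added back to the input (skip connection)."""
    shortcut = x
    y = layers.Conv2D(filters, 3, padding="same")(x)
    y = layers.BatchNormalization()(y)
    y = layers.Activation("relu")(y)
    y = layers.Conv2D(filters, 3, padding="same")(y)
    y = layers.BatchNormalization()(y)
    # The skip connection: add the unchanged input to the conv branch.
    y = layers.Add()([shortcut, y])
    return layers.Activation("relu")(y)

# Illustrative input shape; channel count must match `filters` for an identity skip.
inputs = tf.keras.Input(shape=(32, 32, 64))
outputs = residual_block(inputs, 64)
model = tf.keras.Model(inputs, outputs)
```

Because the block only adds the branch output to its input, the spatial and channel dimensions are preserved, which is what lets gradients bypass the conv layers during backpropagation.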
The implemented ResNet consists of multiple residual blocks, each using a skip connection to ease gradient flow. The architecture is inspired by the original ResNet50 architecture proposed by He et al. in 2015.
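For reference, ResNet50 stacks bottleneck blocks (1×1, 3×3, 1×1 convolutions) rather than plain two-conv blocks. A hedged sketch of such a block, with illustrative shapes that may differ from this repository's code:

```python
import tensorflow as tf
from tensorflow.keras import layers

def bottleneck_block(x, filters, stride=1):
    """ResNet50-style bottleneck: 1x1 reduce -> 3x3 -> 1x1 expand, plus shortcut."""
    shortcut = x
    y = layers.Conv2D(filters, 1, strides=stride)(x)
    y = layers.BatchNormalization()(y)
    y = layers.Activation("relu")(y)
    y = layers.Conv2D(filters, 3, padding="same")(y)
    y = layers.BatchNormalization()(y)
    y = layers.Activation("relu")(y)
    y = layers.Conv2D(4 * filters, 1)(y)
    y = layers.BatchNormalization()(y)
    # Project the shortcut when the spatial size or channel count changes.
    if stride != 1 or shortcut.shape[-1] != 4 * filters:
        shortcut = layers.Conv2D(4 * filters, 1, strides=stride)(shortcut)
        shortcut = layers.BatchNormalization()(shortcut)
    y = layers.Add()([shortcut, y])
    return layers.Activation("relu")(y)

inputs = tf.keras.Input(shape=(56, 56, 64))   # illustrative stage-1 shape
outputs = bottleneck_block(inputs, 64)
model = tf.keras.Model(inputs, outputs)
```

The 1×1 convolutions reduce and then restore channel depth, which keeps the expensive 3×3 convolution cheap and is what allows ResNet50 to reach 50 layers at a manageable cost.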
Make sure you have the following installed:
- Python 3.x
- TensorFlow
- Keras
- Jupyter Notebook (if you want to run the Jupyter file)
Run the ResNet implementation in a Jupyter notebook or integrate it into your own projects.
To train the model, use the provided dataset. Make sure to adjust hyperparameters as needed.
- `python train.py`
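The training step boils down to compiling and fitting the model. The snippet below is a minimal sketch using a tiny stand-in model and synthetic data; the repository's actual model constructor, dataset, and hyperparameters will differ:

```python
import numpy as np
import tensorflow as tf

# Hypothetical stand-in for the repo's ResNet; shapes and sizes are placeholders.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(8, 8, 3)),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Synthetic stand-in for the provided dataset.
x_train = np.random.rand(16, 8, 8, 3).astype("float32")
y_train = np.random.randint(0, 10, size=(16,))

history = model.fit(x_train, y_train, epochs=1, batch_size=8, verbose=0)
```

Hyperparameters such as the optimizer, learning rate, batch size, and epoch count are the ones to adjust for your own data.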
Evaluate the model on a test dataset to assess its performance.
- `python evaluate.py`
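Evaluation is a single `model.evaluate` call on held-out data. A minimal sketch with a hypothetical stand-in model and synthetic test data (in practice you would load the trained ResNet instead):

```python
import numpy as np
import tensorflow as tf

# Hypothetical stand-in; replace with the trained model,
# e.g. tf.keras.models.load_model(...) with your saved-model path.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(8, 8, 3)),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(loss="sparse_categorical_crossentropy", metrics=["accuracy"])

# Synthetic stand-in for the test dataset.
x_test = np.random.rand(16, 8, 8, 3).astype("float32")
y_test = np.random.randint(0, 10, size=(16,))

# Returns the loss followed by each compiled metric.
loss, acc = model.evaluate(x_test, y_test, verbose=0)
```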
The training and evaluation runs report accuracy and loss metrics, giving insight into the performance of the trained model.
- Keras
- TensorFlow
- He et al. for introducing the ResNet architecture
- François Chollet for the Keras library