
Capsule-Net-on-MNIST

CNNs (convolutional neural networks) are awesome. They are one of the reasons deep learning is so popular today, and they can do things that people long thought computers would not be capable of. Nonetheless, they have their limits and some fundamental drawbacks.

Hinton: “The pooling operation used in convolutional neural networks is a big mistake and the fact that it works so well is a disaster.”

Capsules introduce a new building block that can be used in deep learning to better model hierarchical relationships inside the internal knowledge representation of a neural network. The intuition behind them is simple and elegant. Sara Sabour, Nicholas Frosst and Geoffrey Hinton released a paper titled “Dynamic Routing Between Capsules”, and when one of the godfathers of deep learning releases a paper, it is bound to be groundbreaking.
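To make the routing-by-agreement idea concrete, here is a minimal PyTorch sketch of the squashing non-linearity and the routing loop described in the paper. It is an illustration only, not the code in this repository; the `squash` and `dynamic_routing` helper names and the tensor shapes are assumptions for the example.

```python
import torch
import torch.nn.functional as F

def squash(s, dim=-1, eps=1e-8):
    # Squashing non-linearity: keeps the direction of a capsule vector and
    # shrinks its length into [0, 1), so the length can act as a probability.
    squared_norm = (s ** 2).sum(dim=dim, keepdim=True)
    scale = squared_norm / (1.0 + squared_norm)
    return scale * s / torch.sqrt(squared_norm + eps)

def dynamic_routing(u_hat, num_iterations=3):
    # u_hat: prediction vectors, shape (batch, in_caps, out_caps, out_dim).
    b = torch.zeros(u_hat.shape[:-1], device=u_hat.device)  # routing logits b_ij
    for _ in range(num_iterations):
        c = F.softmax(b, dim=2)                       # coupling coefficients c_ij
        s = (c.unsqueeze(-1) * u_hat).sum(dim=1)      # weighted sum over input capsules
        v = squash(s)                                 # (batch, out_caps, out_dim)
        b = b + (u_hat * v.unsqueeze(1)).sum(dim=-1)  # agreement <u_hat, v> updates b_ij
    return v

# Example: 1152 primary capsules routed to 10 digit capsules of dimension 16.
u_hat = torch.randn(32, 1152, 10, 16)
v = dynamic_routing(u_hat)   # (32, 10, 16); each output vector's length scores one class
```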

Basic Architecture:

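The MNIST pipeline in the paper is a 9×9 convolution with 256 channels, a PrimaryCaps layer of 32 eight-dimensional capsule maps, and a DigitCaps layer of ten 16-dimensional capsules obtained by routing. Below is a rough PyTorch sketch of that layer stack (decoder and margin loss omitted); it reuses the `squash` and `dynamic_routing` helpers above, and the `CapsNetSketch` class and its exact reshapes are assumptions for illustration, not necessarily how this repository implements it.

```python
import torch
import torch.nn as nn

class CapsNetSketch(nn.Module):
    # Layer stack from the paper: Conv1 -> PrimaryCaps -> DigitCaps (decoder omitted).
    def __init__(self, routing_iterations=3):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 256, kernel_size=9)                    # 28x28 -> 20x20
        self.primary = nn.Conv2d(256, 32 * 8, kernel_size=9, stride=2)   # -> 6x6, 32 capsule maps of dim 8
        # Transformation matrices W_ij mapping each of the 32*6*6 = 1152 primary
        # capsules (dim 8) to predictions for 10 digit capsules (dim 16).
        self.W = nn.Parameter(0.01 * torch.randn(1, 1152, 10, 16, 8))
        self.routing_iterations = routing_iterations

    def forward(self, x):
        x = torch.relu(self.conv1(x))
        u = self.primary(x)                                    # (B, 256, 6, 6)
        u = u.view(x.size(0), 32, 8, 36).permute(0, 1, 3, 2)   # group channels into capsules
        u = squash(u.reshape(x.size(0), 1152, 8))              # (B, 1152, 8)
        u_hat = (self.W @ u.unsqueeze(2).unsqueeze(-1)).squeeze(-1)  # (B, 1152, 10, 16)
        v = dynamic_routing(u_hat, self.routing_iterations)          # (B, 10, 16)
        return v.norm(dim=-1)                                  # capsule lengths ~ class scores
```

For example, `CapsNetSketch()(torch.randn(2, 1, 28, 28))` returns a `(2, 10)` tensor of capsule lengths, one per digit class.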

Requirements:

- Python 3
- PyTorch
- TorchVision
- TorchNet

Benchmarks:

The highest test accuracy obtained was above 99.4%, reached at the 20th epoch. The upward trend of the test accuracy suggests the model would reach even higher accuracy with further training.
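For context on how such a number can be measured, here is a minimal sketch of an MNIST test-accuracy loop that treats the longest digit capsule as the prediction; the `test_accuracy` helper is hypothetical and not necessarily how this repository computes its benchmark.

```python
import torch
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

def test_accuracy(model, device="cpu"):
    # Evaluate classification accuracy on the MNIST test set by taking the
    # digit capsule with the largest length as the predicted class.
    test_set = datasets.MNIST("data", train=False, download=True,
                              transform=transforms.ToTensor())
    loader = DataLoader(test_set, batch_size=128)
    model.eval()
    correct = 0
    with torch.no_grad():
        for images, labels in loader:
            lengths = model(images.to(device))          # (B, 10) capsule lengths
            correct += (lengths.argmax(dim=1).cpu() == labels).sum().item()
    return correct / len(test_set)
```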