Visualizations-of-the-interpretations-of-CNNs

Using Sensitivity Heatmaps, Deconvolution, Guided Backpropagation, Class Activation Mapping (CAM), Grad-CAM, and Guided Grad-CAM to interpret the predictions of different CNN models.
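For orientation, here is a minimal Grad-CAM sketch (an illustrative re-implementation, not this repository's exact code): hooks on an assumed target layer (`model.layer4` of a torchvision ResNet-18) capture the layer's activations and gradients, and the class heatmap is the ReLU of the gradient-weighted activation sum.

```python
# Minimal Grad-CAM sketch; model and target layer are illustrative choices.
import torch
import torch.nn.functional as F
from torchvision import models

model = models.resnet18(pretrained=True).eval()
target_layer = model.layer4  # assumed target: any late conv block works

activations, gradients = {}, {}

def fwd_hook(module, inp, out):
    activations["value"] = out.detach()

def bwd_hook(module, grad_in, grad_out):
    gradients["value"] = grad_out[0].detach()

target_layer.register_forward_hook(fwd_hook)
target_layer.register_backward_hook(bwd_hook)  # register_full_backward_hook in newer PyTorch

def grad_cam(x, class_idx=None):
    """x: (1, 3, H, W) ImageNet-normalized image; returns an (H, W) heatmap in [0, 1]."""
    logits = model(x)
    if class_idx is None:
        class_idx = logits.argmax(dim=1).item()
    model.zero_grad()
    logits[0, class_idx].backward()
    A = activations["value"]                                # (1, C, h, w)
    w = gradients["value"].mean(dim=(2, 3), keepdim=True)   # global-average-pooled gradients
    cam = F.relu((w * A).sum(dim=1, keepdim=True))          # weighted sum over channels + ReLU
    cam = F.interpolate(cam, size=x.shape[2:], mode="bilinear", align_corners=False)
    return ((cam - cam.min()) / (cam.max() - cam.min() + 1e-8))[0, 0]
```

Multiplying the resulting heatmap elementwise with a Guided Backpropagation saliency map yields Guided Grad-CAM.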

Dependencies: python 3.6.12; torchvision 0.8.1; opencv 4.5.0; numpy 1.19.2; matplotlib 3.3.2
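A quick way to check that an installed environment matches the versions above (a convenience snippet, not part of the repository):

```python
# Print installed versions to compare against the dependency list above.
import cv2
import matplotlib
import numpy
import torchvision

print("torchvision:", torchvision.__version__)  # expected 0.8.1
print("opencv:     ", cv2.__version__)          # expected 4.5.0
print("numpy:      ", numpy.__version__)        # expected 1.19.2
print("matplotlib: ", matplotlib.__version__)   # expected 3.3.2
```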

PLUS: If you are interested in exploring model interpretability for PyTorch further, have a look at https://captum.ai/. It provides easy-to-use APIs for visualizing and interpreting deep learning models.
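For example, Captum's LayerGradCam produces Grad-CAM attributions in a few lines (a sketch based on the documented API; the layer choice and target class index are assumptions):

```python
# Captum Grad-CAM sketch; layer4 and target class 243 are illustrative choices.
import torch
from torchvision import models
from captum.attr import LayerAttribution, LayerGradCam

model = models.resnet18(pretrained=True).eval()
x = torch.randn(1, 3, 224, 224)             # placeholder; use a normalized image
layer_gc = LayerGradCam(model, model.layer4)
attr = layer_gc.attribute(x, target=243)    # Grad-CAM for class index 243
heatmap = LayerAttribution.interpolate(attr, (224, 224))  # upsample to input size
```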

Credits:

  1. https://github.com/utkuozbulak/pytorch-cnn-visualizations
  2. https://github.com/FrancescoSaverioZuppichini/A-journey-into-Convolutional-Neural-Network-visualization-
  3. https://github.com/kazuto1011/grad-cam-pytorch
