A flexible PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments
Knowledge distillation papers
Knowledge distillation implemented in TensorFlow
Transfer learning on VGG16 using Keras with the Caltech256 and Urban Tribes datasets. Dark knowledge in transfer learning.
My reading list focused on offensive web automation
Official code for "SCL-IKD: intermediate knowledge distillation via supervised contrastive representation learning", published in Applied Intelligence
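Several of the repositories above implement Hinton-style knowledge distillation, where the "dark knowledge" is the teacher's temperature-softened output distribution that the student is trained to match. A minimal NumPy sketch of that standard distillation loss (the function names and the temperature value are illustrative, not taken from any of the listed repos):

```python
import numpy as np

def softmax(logits, T=1.0):
    """Softmax over the last axis, with temperature T softening the distribution."""
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=4.0):
    """Dark-knowledge loss: KL divergence between the temperature-softened
    teacher and student distributions, scaled by T^2 so gradient magnitudes
    stay comparable across temperatures (as in Hinton et al., 2015)."""
    p = softmax(teacher_logits, T)  # soft teacher targets
    q = softmax(student_logits, T)  # soft student predictions
    kl = np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=-1)
    return (T ** 2) * kl.mean()
```

In practice this term is combined with the ordinary cross-entropy on hard labels, weighted by a mixing coefficient.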