Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distiller
Updated Apr 24, 2023 · Jupyter Notebook
Awesome Knowledge Distillation
PaddleSlim is an open-source library for deep model compression and architecture search.
Awesome Knowledge-Distillation: knowledge distillation papers (2014–2021), organized by category.
PyTorch implementation of various methods for continual learning (XdG, EWC, SI, LwF, FROMP, DGR, BI-R, ER, A-GEM, iCaRL, Generative Classifier) in three different scenarios.
PyTorch implementation of various knowledge distillation (KD) methods.
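Most of the KD repositories above build on the classic soft-target objective of Hinton et al. (2015). A minimal sketch of that loss in PyTorch, with illustrative names and hyperparameters (temperature `T` and mixing weight `alpha` are assumptions, not any particular repo's defaults):

```python
# Minimal knowledge-distillation loss sketch (Hinton et al., 2015 style).
# Names, T, and alpha are illustrative, not a specific toolkit's API.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Blend soft-target KL divergence (teacher -> student) with
    hard-label cross-entropy on the ground-truth labels."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),   # student log-probs at temperature T
        F.softmax(teacher_logits / T, dim=1),       # teacher probs at temperature T
        reduction="batchmean",
    ) * (T * T)  # rescale so gradient magnitudes match the hard loss
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# Toy usage: batch of 8 samples, 10 classes.
student_logits = torch.randn(8, 10, requires_grad=True)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student_logits, teacher_logits, labels)
loss.backward()
```

The `T * T` factor follows the original paper's recommendation to keep the soft-target gradients on the same scale as the hard-label gradients.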
A PyTorch-based knowledge distillation toolkit for natural language processing
A collection of industry-classic and cutting-edge papers in recommendation, advertising, and search.
The official repo for [NeurIPS'22] "ViTPose: Simple Vision Transformer Baselines for Human Pose Estimation" and [TPAMI'23] "ViTPose++: Vision Transformer for Generic Body Pose Estimation"
mobilev2-yolov5s pruning and distillation, with ncnn and TensorRT deployment support. Ultra-light but with better performance!
irresponsible innovation. Try now at https://chat.dev/
🤗 Optimum Intel: Accelerate inference with Intel optimization tools
The Official Repo for "Quick Start Guide to Large Language Models"
A collection of high-quality Chinese pre-trained models: state-of-the-art large models, the fastest small models, and dedicated similarity models.
A Python library for adversarial machine learning focusing on benchmarking adversarial robustness.
Quantization library for PyTorch. Support low-precision and mixed-precision quantization, with hardware implementation through TVM.
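The core operation behind low-precision quantization libraries like the one above is uniform quantize-dequantize ("fake quantization") used to simulate reduced precision during training. A minimal symmetric int8 sketch, with illustrative names (this is not any specific library's API):

```python
# Symmetric uniform fake-quantization sketch (illustrative, not a
# specific library's API): map floats to the int8 grid, then back.
import torch

def fake_quantize(x, num_bits=8):
    """Quantize x to a signed num_bits integer grid and dequantize,
    so the returned tensor carries int8-level rounding error."""
    qmax = 2 ** (num_bits - 1) - 1          # 127 for int8
    scale = x.abs().max() / qmax            # one scale for the whole tensor
    q = torch.clamp(torch.round(x / scale), -qmax - 1, qmax)
    return q * scale

# Toy usage: quantize a random weight matrix.
w = torch.randn(4, 4)
w_q = fake_quantize(w)
```

Per-tensor symmetric scaling is the simplest scheme; real toolkits typically also offer per-channel scales and asymmetric (zero-point) variants for activations.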
(CVPR 2022) A minimalist, mapless, end-to-end self-driving stack for joint perception, prediction, planning and control.
MEAL V2: Boosting Vanilla ResNet-50 to 80%+ Top-1 Accuracy on ImageNet without Tricks. In NeurIPS 2020 workshop.
A brain-inspired version of generative replay for continual learning with deep neural networks (e.g., class-incremental learning on CIFAR-100; PyTorch code).