LeeDL Tutorial (《李宏毅深度学习教程》, the "Apple Book" 🍎, recommended by Prof. Hung-yi Lee 👍). PDF download: https://github.com/datawhalechina/leedl-tutorial/releases
Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distiller
AIMET is a library that provides advanced quantization and compression techniques for trained neural network models.
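AIMET has its own API; purely as a generic illustration of the kind of post-training quantization such toolkits automate, here is a minimal sketch using PyTorch's built-in dynamic quantization (the toy model and shapes are made up for the example):

```python
import torch
import torch.nn as nn

# A toy float32 model standing in for a trained network.
model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))

# Post-training dynamic quantization: weights of nn.Linear layers are
# converted to int8; activations are quantized on the fly at inference.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 128)
print(quantized(x).shape)  # torch.Size([1, 10])
```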
Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019)
Model Compression Toolkit (MCT) is an open-source project for optimizing neural network models for deployment on efficient, constrained hardware. It provides researchers, developers, and engineers with advanced quantization and compression tools for deploying state-of-the-art neural networks.
Neural Network Quantization & Low-Bit Fixed-Point Training for Hardware-Friendly Algorithm Design
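As a rough illustration of what low-bit fixed-point quantization means in practice, here is a minimal sketch of uniform affine quantization in PyTorch; the function names, bit width, and affine scheme are illustrative assumptions, not this repository's API:

```python
import torch

def quantize_uniform(x: torch.Tensor, num_bits: int = 8):
    """Uniform affine quantization of a tensor to `num_bits` integer codes.

    Returns the codes plus the (scale, zero_point) needed to dequantize;
    a minimal model of low-bit fixed-point arithmetic.
    """
    qmin, qmax = 0, 2 ** num_bits - 1
    scale = ((x.max() - x.min()) / (qmax - qmin)).clamp_min(1e-8)
    zero_point = qmin - torch.round(x.min() / scale)
    q = torch.clamp(torch.round(x / scale + zero_point), qmin, qmax)
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    return scale * (q - zero_point)

w = torch.randn(4, 4)
q, s, z = quantize_uniform(w, num_bits=4)
print((w - dequantize(q, s, z)).abs().max())  # quantization error
```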
Group Fisher Pruning for Practical Network Compression (ICML 2021)
Using ideas from product quantization for state-of-the-art neural network compression.
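For readers unfamiliar with product quantization, the following is a minimal sketch of the core idea: split each weight row into sub-vectors and replace each sub-vector with the index of its nearest k-means centroid. All function names and codebook sizes are hypothetical choices for illustration, not this repository's implementation; scikit-learn's `KMeans` handles the clustering:

```python
import numpy as np
from sklearn.cluster import KMeans

def product_quantize(W: np.ndarray, n_subvectors: int = 4, n_centroids: int = 16):
    """Compress the rows of W with product quantization.

    Each row is split into `n_subvectors` equal chunks; a separate k-means
    codebook is learned per chunk, and each chunk is stored as a centroid
    index (log2(n_centroids) bits) instead of float32 values.
    """
    rows, cols = W.shape
    chunk = cols // n_subvectors
    codebooks, codes = [], []
    for i in range(n_subvectors):
        sub = W[:, i * chunk:(i + 1) * chunk]
        km = KMeans(n_clusters=n_centroids, n_init=10).fit(sub)
        codebooks.append(km.cluster_centers_)
        codes.append(km.labels_)
    return codebooks, np.stack(codes, axis=1)

def reconstruct(codebooks, codes):
    return np.hstack([cb[c] for cb, c in zip(codebooks, codes.T)])

W = np.random.randn(256, 64).astype(np.float32)
codebooks, codes = product_quantize(W)
W_hat = reconstruct(codebooks, codes)
print(np.mean((W - W_hat) ** 2))  # reconstruction error
```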
Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019)
MUSCO: MUlti-Stage COmpression of neural networks
Knowledge Distillation with Adversarial Samples Supporting Decision Boundary (AAAI 2019)
Group Sparsity: The Hinge Between Filter Pruning and Decomposition for Network Compression (CVPR 2020)
This is the official implementation of "DHP: Differentiable Meta Pruning via HyperNetworks".
PyTorch implementation of "Learning Filter Basis for Convolutional Neural Network Compression" (ICCV 2019)
Deep Neural Network Compression based on Student-Teacher Network
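The exact objective used in this repository isn't stated here; as a reference point, the following is a minimal sketch of the classic Hinton-style student-teacher distillation loss in PyTorch, with the temperature and mixing weight `alpha` as illustrative hyperparameters:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature: float = 4.0, alpha: float = 0.5):
    """Student-teacher distillation loss (Hinton et al., 2015).

    Blends cross-entropy on hard labels with a KL term that pulls the
    student's temperature-softened distribution toward the teacher's.
    The T^2 factor keeps gradient magnitudes comparable across temperatures.
    """
    hard = F.cross_entropy(student_logits, labels)
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2
    return alpha * hard + (1 - alpha) * soft

# Toy usage: 8 examples, 10 classes.
s = torch.randn(8, 10, requires_grad=True)
t = torch.randn(8, 10)
y = torch.randint(0, 10, (8,))
print(distillation_loss(s, t, y))
```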
💍 Efficient tensor decomposition-based filter pruning
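The repo above prunes convolutional filters via tensor decomposition; as a simpler cousin of the same idea, here is a minimal sketch of truncated-SVD factorization of a fully connected layer into two thinner ones (a deliberate simplification, not this repository's method; the layer sizes and rank are arbitrary):

```python
import torch
import torch.nn as nn

def low_rank_factorize(layer: nn.Linear, rank: int) -> nn.Sequential:
    """Replace a Linear layer with two thinner ones via truncated SVD.

    W (out x in) ~= U_r S_r V_r^T, so y = W x + b becomes
    y = U_r (S_r V_r^T x) + b: the first layer has `rank` outputs and no
    bias, the second restores the original output dimension.
    """
    W = layer.weight.data  # (out_features, in_features)
    U, S, Vh = torch.linalg.svd(W, full_matrices=False)
    first = nn.Linear(layer.in_features, rank, bias=False)
    second = nn.Linear(rank, layer.out_features, bias=layer.bias is not None)
    first.weight.data = torch.diag(S[:rank]) @ Vh[:rank]  # (rank, in)
    second.weight.data = U[:, :rank]                      # (out, rank)
    if layer.bias is not None:
        second.bias.data = layer.bias.data.clone()
    return nn.Sequential(first, second)

layer = nn.Linear(512, 512)
compressed = low_rank_factorize(layer, rank=64)
x = torch.randn(1, 512)
print((layer(x) - compressed(x)).abs().max())  # approximation error
```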
Notes and implementations for Prof. Hung-yi Lee's ML 2020 machine learning course
Code for "Variational Depth Search in ResNets" (https://arxiv.org/abs/2002.02797)
Overparameterization and overfitting are common concerns when designing and training deep neural networks. Network pruning is an effective strategy for reducing or limiting network complexity, but it often involves time- and compute-intensive procedures to identify the most important connections and the best-performing hyperparameters. We s…
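Since the description above is truncated before the repo's own method, here is a minimal sketch of the common magnitude-pruning baseline such work is positioned against, using PyTorch's built-in `torch.nn.utils.prune` utilities (the toy model and the 50% sparsity level are arbitrary choices for illustration):

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))

# Magnitude pruning baseline: zero out the 50% smallest-magnitude weights
# in every Linear layer (unstructured L1 pruning).
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.5)

# Make the pruning permanent (folds the mask into the weight tensor).
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.remove(module, "weight")

total = sum(p.numel() for p in model.parameters())
zeros = sum((p == 0).sum().item() for p in model.parameters())
print(f"sparsity: {zeros / total:.2%}")
```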