It's All In the Teacher: Zero-Shot Quantization Brought Closer to the Teacher [CVPR 2022 Oral]

This folder contains the official implementation of the paper It's All In the Teacher: Zero-Shot Quantization Brought Closer to the Teacher, built on top of the GDFQ, Qimera, and AutoReCon frameworks.

Figure: AIT performance comparison

Requirements

  • Python 3.6
  • PyTorch 1.10.1
  • See requirements.txt for the remaining dependencies

Setup

We recommend using a Python virtual environment to run this code.

You can install the requirements with the command below.

pip install -r requirements.txt
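
As a minimal sketch of the full setup (assuming the built-in venv module and an arbitrary environment name ait-env, neither of which is prescribed by the repository):

# create and activate a virtual environment (the name "ait-env" is arbitrary)
python3 -m venv ait-env
source ait-env/bin/activate
# install the pinned dependencies
pip install -r requirements.txt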

Folder Structure

ait_code
├── figs
├── AutoReCon_AIT
│   ├── main.py
│   ├── optimizer.py                                # GI implementation
│   ├── option.py 
│   ├── trainer.py
│   ├── {DATASET}_{NETWORK}.hocon                   # Setting files
│   ├── run_{DATASET}_{NETWORK}_{BITWIDTH}bit.sh    # Train scripts
│   └── ...                                         # Utils
├── GDFQ_AIT
│   └── ...                                         # Similar to above
├── Qimera_AIT
│   └── ...                                         # Similar to above
├── LICENSE.md
├── README.md
└── requirements.txt

Training

For ImageNet training, change the path of the validation set in the corresponding .hocon file. To train the models described in the paper, run one of the following commands:

./run_cifar10_4bit.sh
./run_cifar100_4bit.sh
./run_imgnet_resnet18_4bit.sh
./run_imgnet_resnet50_4bit.sh
./run_imgnet_mobilenet_v2_4bit.sh

The script names are the same for all experiment frameworks.
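
For example, to launch 4-bit ImageNet ResNet-18 training on top of the Qimera-based framework (the choice of framework and script here is only illustrative; any of the scripts above can be substituted):

# run from inside the chosen framework directory
cd Qimera_AIT
./run_imgnet_resnet18_4bit.sh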

Major Arguments

  • --conf_path : path to the .hocon setting file
  • --ce_scale : coefficient of the cross-entropy loss
  • --kd_scale : coefficient of the KL-divergence loss
  • --passing_threshold : update ratio of quantized parameters per step
  • --alpha_iter : GI search step limit
  • --adalr : enable GI
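
The provided scripts wrap a call to main.py; if you want to invoke the trainer directly, a hypothetical command could look like the one below. The .hocon file name follows the {DATASET}_{NETWORK} pattern shown in the folder structure, the numeric values are placeholders rather than the settings used in the paper, and the exact argument semantics are defined in option.py.

# illustrative direct invocation; values are placeholders, not paper settings
cd GDFQ_AIT
python main.py --conf_path imgnet_resnet18.hocon \
    --ce_scale 0.5 --kd_scale 1.0 \
    --passing_threshold 0.1 \
    --alpha_iter 10 --adalr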

License

This project is licensed under the terms of the GNU General Public License v3.0.
