This work shows how to train neural networks with quantized weights directly via backpropagation. The code quantizes both weights and activations, and it runs LeNet-300 on MNIST and ResNet18 on CIFAR10. More details are in this PDF.
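As a rough sketch of the idea (not the repository's exact implementation), the snippet below shows uniform fake-quantization of weights and activations with a straight-through estimator, so gradients pass through the rounding step during backpropagation. The function names, the DoReFa-style normalization, and the bit-width handling are illustrative assumptions.

```python
# Minimal sketch (not the repository's exact code): uniform fake-quantization
# with a straight-through estimator so backprop treats round() as identity.
import torch

class FakeQuantize(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x, bits):
        # Quantize x in [0, 1] onto 2^bits - 1 uniform levels.
        levels = 2 ** bits - 1
        return torch.round(x * levels) / levels

    @staticmethod
    def backward(ctx, grad_output):
        # Straight-through estimator: gradient of round() is taken as 1.
        return grad_output, None

def quantize_weights(w, wbits):
    # Illustrative DoReFa-style scheme: normalize weights into [0, 1],
    # quantize, then map back to [-1, 1].
    w = torch.tanh(w)
    w = w / (2 * w.abs().max()) + 0.5
    return 2 * FakeQuantize.apply(w, wbits) - 1

def quantize_activations(a, abits):
    # Clip activations to [0, 1] before quantizing (assumes bounded activations).
    return FakeQuantize.apply(torch.clamp(a, 0.0, 1.0), abits)
```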
--wbits specifies the number of bits for weights
--abits specifies the number of bits for activations
Example (4-bit weights and 4-bit activations):

python Trainer.py --wbits 4 --abits 4
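The flags above would typically be read with argparse and passed down to the quantizers. A hypothetical sketch follows; the actual argument handling in Trainer.py may differ.

```python
# Hypothetical sketch of how Trainer.py might consume --wbits and --abits;
# the real script's argument handling may differ.
import argparse

parser = argparse.ArgumentParser(description="Train with quantized weights and activations")
parser.add_argument("--wbits", type=int, default=4, help="number of bits for weights")
parser.add_argument("--abits", type=int, default=4, help="number of bits for activations")
args = parser.parse_args()

# The bit widths would then be forwarded to the quantization functions,
# e.g. quantize_weights(w, args.wbits) and quantize_activations(a, args.abits).
```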