Releases: Xilinx/brevitas
Release version 0.2.1
Release version 0.2.1.
Changelog:
- Fix a few issues when using QuantTensors w/ zero point.
- Fix the Hadamard layer, whose implementation had fallen behind the current QuantLayer and QuantTensor semantics.
- Make sure that the training flag in a QuantTensor is always set by the Module generating it.
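For context, the sketch below shows the kind of QuantTensor metadata these fixes touch: the zero point and the training flag carried alongside the quantized values. It assumes the brevitas.nn API of recent versions (QuantIdentity with return_quant_tensor=True, and QuantTensor fields named value, scale, zero_point and training); exact names in 0.2.1 may differ.

```python
import torch
from brevitas.nn import QuantIdentity

# Assumed API: a quant layer that returns a QuantTensor instead of a plain tensor.
quant_id = QuantIdentity(bit_width=8, return_quant_tensor=True)
quant_id.train()

out = quant_id(torch.randn(2, 3))

# A QuantTensor carries quantization metadata alongside the values.
# Field names (value, scale, zero_point, training) are assumptions
# based on recent Brevitas versions.
print(out.value.shape)   # dequantized values
print(out.scale)         # quantization scale
print(out.zero_point)    # zero point, relevant for asymmetric quantization
print(out.training)      # training flag set by the module that produced the tensor
```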
Release version 0.2.0
First release on PyPI, version 0.2.0.
*FC topologies with TensorNorm as last layer
Updated *FC networks from @maltanar with TensorNorm instead of BatchNorm as the last layer, to ease deployment to FINN.
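The idea is to normalize over the whole tensor with a single learned scale and bias, which is simpler to absorb into a FINN dataflow graph than BatchNorm's per-channel statistics. The module below is an illustrative re-implementation of that idea, not the TensorNorm shipped with the *FC models; the constructor and the momentum convention are assumptions.

```python
import torch
import torch.nn as nn

class TensorNorm(nn.Module):
    """Illustrative per-tensor normalization with a single learned scale
    and bias (hypothetical re-implementation, not Brevitas' own)."""

    def __init__(self, eps: float = 1e-4, momentum: float = 0.1):
        super().__init__()
        self.eps = eps
        self.momentum = momentum
        self.weight = nn.Parameter(torch.ones(1))
        self.bias = nn.Parameter(torch.zeros(1))
        # Running statistics over the whole tensor, not per channel.
        self.register_buffer("running_mean", torch.zeros(1))
        self.register_buffer("running_var", torch.ones(1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.training:
            mean, var = x.mean(), x.var()
            with torch.no_grad():
                self.running_mean.lerp_(mean.reshape(1), self.momentum)
                self.running_var.lerp_(var.reshape(1), self.momentum)
        else:
            mean, var = self.running_mean, self.running_var
        return (x - mean) / torch.sqrt(var + self.eps) * self.weight + self.bias
```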
Pretrained 4b MobileNet V1 r2
Updated pretrained MobileNet V1 w/ 4b weights in the first layer.
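The note above concerns the weight bit width of the first convolution. The sketch below shows how a per-layer weight bit width is typically passed to a Brevitas layer; the weight_bit_width keyword follows recent Brevitas versions and is an assumption for 0.2.x, and the layer dimensions are illustrative.

```python
from brevitas.nn import QuantConv2d

# First layer of the r2 MobileNet V1 quantized to 4b weights as well.
# Keyword and defaults follow recent Brevitas versions (assumption for 0.2.x).
first_conv = QuantConv2d(
    in_channels=3, out_channels=32,
    kernel_size=3, stride=2, padding=1,
    weight_bit_width=4)
```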
CNV test reference vectors r0
Reference test vectors for the CNV models, r0.
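A hedged sketch of how such reference vectors are typically used in a regression test: run the model on the stored inputs and compare against the stored outputs. The file names, npz layout, and tolerance below are assumptions, not the repo's actual test harness.

```python
import numpy as np
import torch

def check_against_reference(model,
                            input_npz="cnv_ref_input.npz",    # hypothetical file name
                            output_npz="cnv_ref_output.npz",  # hypothetical file name
                            atol=1e-5):
    """Compare model outputs against stored reference vectors."""
    ref_in = torch.from_numpy(np.load(input_npz)["arr_0"])
    ref_out = torch.from_numpy(np.load(output_npz)["arr_0"])
    model.eval()
    with torch.no_grad():
        out = model(ref_in)
    assert torch.allclose(out, ref_out, atol=atol), "CNV output drifted from reference"
```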
BNN-PYNQ topologies
CNV, LFC, SFC, TFC topologies, originally designed for BNN-PYNQ, trained with Brevitas. Thanks to @maltanar and @ussamazahid96.
Matching txt files contain batch-by-batch accuracy results, taken directly from the evaluation scripts.
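A minimal sketch of consuming one of those txt files, assuming one accuracy value per line; the actual format written by the evaluation scripts may differ, and the file name is hypothetical.

```python
def mean_accuracy(txt_path: str) -> float:
    """Average the batch-by-batch accuracies stored in a txt file.
    Assumes one accuracy value per line (format is an assumption)."""
    with open(txt_path) as f:
        accs = [float(line.strip()) for line in f if line.strip()]
    return sum(accs) / len(accs)

print(mean_accuracy("cnv_1w1a.txt"))  # hypothetical file name
```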
Pretrained 4b QuartzNet r0
Pretrained 4b QuartzNet for automatic speech recognition.
Pretrained 8b QuartzNet r0
Pretrained 8b QuartzNet encoder and decoder for automatic speech recognition.
Pretrained 8b MelGAN r0
Pretrained quantized 8b MelGAN vocoder trained on LJSpeech.
Pretrained 4b ProxylessNAS Mobile14 w/ Hadamard classifier r0
Pretrained quantized ProxylessNAS Mobile14 with everything at 4b (except the input and the weights of the first layer, which are at 8 bits) and a Hadamard classifier as the last layer.
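A Hadamard classifier replaces the learned final fully-connected layer with a projection onto a slice of a fixed Hadamard matrix, so the last layer carries no learned weight matrix. The sketch below is an illustrative version built on scipy's Hadamard construction, not Brevitas' own implementation; the scalar scale and the truncation of the matrix are assumptions.

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F
from scipy.linalg import hadamard

class HadamardClassifier(nn.Module):
    """Illustrative fixed-projection classifier: logits are a slice of a
    Hadamard matrix applied to the features. Not Brevitas' own implementation;
    the learned scalar scale and the truncation are assumptions."""

    def __init__(self, in_features: int, num_classes: int):
        super().__init__()
        # Sylvester Hadamard matrices exist for power-of-two sizes.
        size = 2 ** math.ceil(math.log2(max(in_features, num_classes)))
        H = torch.from_numpy(hadamard(size)).float()
        self.register_buffer("proj", H[:num_classes, :in_features])
        self.scale = nn.Parameter(torch.tensor(1.0))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Fixed +/-1 projection; only a scalar scale is learned.
        return self.scale * F.linear(x, self.proj)
```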