Examples (mobilenetv1): update to r2 w/ 4b first layer
Signed-off-by: Alessandro Pappalardo <volcacius@users.noreply.github.com>
volcacius committed Aug 11, 2020
1 parent 2c0019f commit 7703823
Showing 4 changed files with 9 additions and 7 deletions.
README.md (2 changes: 1 addition & 1 deletion)
@@ -127,7 +127,7 @@ Below a list of relevant pretrained models available, currently for evaluation o

| Name | Scaling Type | First layer weights | Weights | Activations | Avg pool | Top1 | Top5 | Pretrained model | Retrained from |
|--------------|----------------------------|---------------------|---------|-------------|----------|-------|-------|-------------------------------------------------------------------------------------------------|---------------------------------------------------------------|
-| MobileNet V1 | Floating-point per channel | 8 bit | 4 bit | 4 bit | 4 bit | 71.14 | 90.10 | [Download](https://github.com/Xilinx/brevitas/releases/download/quant_mobilenet_v1_4b-r1/quant_mobilenet_v1_4b-0100a667.pth) | [link](https://github.com/osmr/imgclsmob/tree/master/pytorch) |
+| MobileNet V1 | Floating-point per channel | 4 bit | 4 bit | 4 bit | 4 bit | 70.36 | 89.61 | [Download](https://github.com/Xilinx/brevitas/releases/download/quant_mobilenet_v1_4b-r2/quant_mobilenet_v1_4b-e75a094f.pth) | [link](https://github.com/osmr/imgclsmob/tree/master/pytorch) |
| ProxylessNAS Mobile14 w/ Hadamard classifier | Floating-point per channel | 8 bit | 4 bit | 4 bit | 4 bit | 73.52 | 91.46 | [Download](https://github.com/Xilinx/brevitas/releases/download/quant_proxylessnas_mobile14_hadamard_4b-r0/quant_proxylessnas_mobile14_hadamard_4b-4acbfa9f.pth) | [link](https://github.com/osmr/imgclsmob/tree/master/pytorch) |
| ProxylessNAS Mobile14 | Floating-point per channel | 8 bit | 4 bit | 4 bit | 4 bit | 74.42 | 92.04 | [Download](https://github.com/Xilinx/brevitas/releases/download/quant_proxylessnas_mobile14_4b-r0/quant_proxylessnas_mobile14_4b-e10882e1.pth) | [link](https://github.com/osmr/imgclsmob/tree/master/pytorch) |
| ProxylessNAS Mobile14 | Floating-point per channel | 8 bit | 4 bit, 5 bit | 4 bit, 5 bit | 4 bit | 75.01 | 92.33 | [Download](https://github.com/Xilinx/brevitas/releases/download/quant_proxylessnas_mobile14_4b5b-r0/quant_proxylessnas_mobile14_4b5b-2bdf7f8d.pth) | [link](https://github.com/osmr/imgclsmob/tree/master/pytorch) |
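The r2 checkpoint in the updated row can be fetched like any other release asset. A minimal sketch, assuming only the URL from the table above and standard `torch.hub` usage; depending on how the checkpoint was serialized, the loaded object may need unwrapping before being passed to `load_state_dict`:

```python
import torch

# URL taken verbatim from the r2 row in the table above.
URL = ("https://github.com/Xilinx/brevitas/releases/download/"
       "quant_mobilenet_v1_4b-r2/quant_mobilenet_v1_4b-e75a094f.pth")

# Downloads to the local torch hub cache and loads onto CPU.
state_dict = torch.hub.load_state_dict_from_url(URL, map_location="cpu")
print(list(state_dict)[:5])  # inspect a few entries to check the layout
```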
brevitas_examples/imagenet_classification/README.md (2 changes: 1 addition & 1 deletion)
@@ -7,7 +7,7 @@ Below in the table is a list of example pretrained models made available for ref

| Name | Cfg | Scaling Type | First layer weights | Weights | Activations | Avg pool | Top1 | Top5 | Pretrained model | Retrained from |
|--------------|-----------------------|----------------------------|---------------------|---------|-------------|----------|-------|-------|-------------------------------------------------------------------------------------------------|---------------------------------------------------------------|
-| MobileNet V1 | quant_mobilenet_v1_4b | Floating-point per channel | 8 bit | 4 bit | 4 bit | 4 bit | 71.14 | 90.10 | [Download](https://github.com/Xilinx/brevitas/releases/download/quant_mobilenet_v1_4b-r1/quant_mobilenet_v1_4b-0100a667.pth) | [link](https://github.com/osmr/imgclsmob/tree/master/pytorch) |
+| MobileNet V1 | quant_mobilenet_v1_4b | Floating-point per channel | 4 bit | 4 bit | 4 bit | 4 bit | 70.36 | 89.61 | [Download](https://github.com/Xilinx/brevitas/releases/download/quant_mobilenet_v1_4b-r2/quant_mobilenet_v1_4b-e75a094f.pth) | [link](https://github.com/osmr/imgclsmob/tree/master/pytorch) |
| ProxylessNAS Mobile14 w/ Hadamard classifier | quant_proxylessnas_mobile14_hadamard_4b | Floating-point per channel | 8 bit | 4 bit | 4 bit | 4 bit | 73.52 | 91.46 | [Download](https://github.com/Xilinx/brevitas/releases/download/quant_proxylessnas_mobile14_hadamard_4b-r0/quant_proxylessnas_mobile14_hadamard_4b-4acbfa9f.pth) | [link](https://github.com/osmr/imgclsmob/tree/master/pytorch) |
| ProxylessNAS Mobile14 | quant_proxylessnas_mobile14_4b | Floating-point per channel | 8 bit | 4 bit | 4 bit | 4 bit | 74.42 | 92.04 | [Download](https://github.com/Xilinx/brevitas/releases/download/quant_proxylessnas_mobile14_4b-r0/quant_proxylessnas_mobile14_4b-e10882e1.pth) | [link](https://github.com/osmr/imgclsmob/tree/master/pytorch) |
| ProxylessNAS Mobile14 | quant_proxylessnas_mobile14_4b5b | Floating-point per channel | 8 bit | 4 bit, 5 bit | 4 bit, 5 bit | 4 bit | 75.01 | 92.33 | [Download](https://github.com/Xilinx/brevitas/releases/download/quant_proxylessnas_mobile14_4b5b-r0/quant_proxylessnas_mobile14_4b5b-2bdf7f8d.pth) | [link](https://github.com/osmr/imgclsmob/tree/master/pytorch) |
@@ -1,10 +1,11 @@
[MODEL]
ARCH: quant_mobilenet_v1
WIDTH_SCALE: 1.0
-PRETRAINED_URL: https://github.com/Xilinx/brevitas/releases/download/quant_mobilenet_v1_4b-r1/quant_mobilenet_v1_4b-0100a667.pth
+PRETRAINED_URL: https://github.com/Xilinx/brevitas/releases/download/quant_mobilenet_v1_4b-r2/quant_mobilenet_v1_4b-e75a094f.pth

[QUANT]
BIT_WIDTH: 4
+FIRST_LAYER_BIT_WIDTH: 4

[PREPROCESS]
MEAN_0: 0.485
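For reference, the new `FIRST_LAYER_BIT_WIDTH` key is plain `configparser` syntax, so it is read the same way as the existing `BIT_WIDTH` key. A minimal sketch mirroring the `cfg.getint` calls added to `quant_mobilenet_v1` below:

```python
from configparser import ConfigParser

# Inline stand-in for the [QUANT] section of the cfg file edited above.
cfg = ConfigParser()
cfg.read_string("""
[QUANT]
BIT_WIDTH: 4
FIRST_LAYER_BIT_WIDTH: 4
""")

bit_width = cfg.getint('QUANT', 'BIT_WIDTH')                          # -> 4
first_layer_bit_width = cfg.getint('QUANT', 'FIRST_LAYER_BIT_WIDTH')  # -> 4
```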
@@ -36,8 +36,6 @@
from .common import *


-FIRST_LAYER_BIT_WIDTH = 8
-

class DwsConvBlock(nn.Module):
    def __init__(self,
@@ -109,6 +107,7 @@ class MobileNet(nn.Module):
    def __init__(self,
                 channels,
                 first_stage_stride,
+                first_layer_bit_width,
                 bit_width,
                 in_channels=3,
                 num_classes=1000):
@@ -120,7 +119,7 @@ def __init__(self,
            out_channels=init_block_channels,
            kernel_size=3,
            stride=2,
-           weight_bit_width=FIRST_LAYER_BIT_WIDTH,
+           weight_bit_width=first_layer_bit_width,
            activation_scaling_per_channel=True,
            act_bit_width=bit_width)
        self.features.add_module('init_block', init_block)
@@ -162,13 +161,15 @@ def quant_mobilenet_v1(cfg):
    first_stage_stride = False
    width_scale = float(cfg.get('MODEL', 'WIDTH_SCALE'))
    bit_width = cfg.getint('QUANT', 'BIT_WIDTH')
+   first_layer_bit_width = cfg.getint('QUANT', 'FIRST_LAYER_BIT_WIDTH')

    if width_scale != 1.0:
        channels = [[int(cij * width_scale) for cij in ci] for ci in channels]

    net = MobileNet(channels=channels,
                    first_stage_stride=first_stage_stride,
-                   bit_width=bit_width)
+                   bit_width=bit_width,
+                   first_layer_bit_width=first_layer_bit_width)
    return net


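Taken together, the change moves the first-layer precision from a module-level constant into the cfg, so switching between the 8b and 4b first-layer variants is a config edit rather than a code edit. A hypothetical end-to-end sketch; the import path and cfg file name are assumptions for illustration, not confirmed by this diff:

```python
from configparser import ConfigParser

# Assumed import path for the builder shown in the diff above.
from brevitas_examples.imagenet_classification.models import quant_mobilenet_v1

cfg = ConfigParser()
cfg.read('quant_mobilenet_v1_4b.ini')  # assumed name of the cfg file edited above

# FIRST_LAYER_BIT_WIDTH now flows cfg -> quant_mobilenet_v1 -> MobileNet -> init_block.
net = quant_mobilenet_v1(cfg)
```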
