How can a QAT output model support subsequent deployment? #87

Hi~ First of all, thank you very much for your work. I recently adapted the DoReFa quantization part of micronet to my own project (a detection model) and ran into a few issues I'd like to discuss:

Comments
1. The only difference at inference is that weight quantization is applied to the whole weight tensor up front, while activation quantization is the only step performed at inference time; you may want to double-check this.
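A minimal sketch of that split, assuming DoReFa-style uniform quantizers (the helpers and the `QuantInferenceConv` wrapper below are hypothetical illustrations, not micronet's actual API): the weight tensor is quantized once, offline, and only activations are quantized on each forward pass.

```python
import torch
import torch.nn as nn

def dorefa_quantize(x, bits):
    """Uniform quantization of a tensor in [0, 1] to 2**bits - 1 levels."""
    n = 2 ** bits - 1
    return torch.round(x * n) / n

def quantize_weight(w, bits):
    """DoReFa-style weight quantization: tanh-normalize to [0, 1],
    quantize uniformly, then rescale back to [-1, 1]."""
    w = torch.tanh(w)
    w = w / (2 * torch.abs(w).max()) + 0.5
    return 2 * dorefa_quantize(w, bits) - 1

class QuantInferenceConv(nn.Module):
    """Deployment-time conv: weights are pre-quantized once in __init__,
    activations are quantized on the fly in forward()."""
    def __init__(self, conv, w_bits=8, a_bits=8):
        super().__init__()
        self.conv = conv
        self.a_bits = a_bits
        with torch.no_grad():
            # one-shot, whole-tensor weight quantization before deployment
            conv.weight.copy_(quantize_weight(conv.weight, w_bits))

    def forward(self, x):
        # only activation quantization runs at inference time;
        # DoReFa assumes activations clipped to [0, 1] first
        x = dorefa_quantize(x.clamp(0, 1), self.a_bits)
        return self.conv(x)

# usage: wrap a trained layer, then run inference as usual
layer = QuantInferenceConv(nn.Conv2d(3, 16, 3), w_bits=8, a_bits=8)
y = layer(torch.rand(1, 3, 32, 32))
```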
Thank you very much 🙏. After a few days of investigation I've found that the model produced by QAT is indeed very sensitive: its compatibility with different quantization parameters or algorithms is essentially zero. I recently came across a paper that discusses exactly this problem: https://arxiv.org/abs/2002.07686