Inference issue with Quantized yolov8 OpenVINO model #2853
Unanswered
chandra-ps612 asked this question in Q&A
Replies: 0 comments
Hello @MaximProshin,

I have recently been doing post-training quantization with NNCF on a custom YOLOv8 model (the uncompressed OpenVINO model). The uncompressed model generates the expected inference results on the test set, but its quantized version is unable to produce inference results on the same dataset.
Creating and instantiating CustomDataset Class
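A minimal sketch of what such a calibration dataset could look like, assuming NCHW float32 input as YOLOv8 expects. The class name, image source, and preprocessing steps here are illustrative placeholders, not the actual code from my project:

```python
# Hypothetical sketch: a calibration data source for NNCF post-training
# quantization. Assumes images are already loaded as HWC uint8 arrays.
import numpy as np

class CustomDataset:
    """Yields preprocessed images as NCHW float32 arrays."""
    def __init__(self, images):
        self.images = images

    def __len__(self):
        return len(self.images)

    def __getitem__(self, idx):
        x = self.images[idx].astype(np.float32) / 255.0  # scale to [0, 1]
        x = np.transpose(x, (2, 0, 1))                   # HWC -> CHW
        return np.expand_dims(x, 0)                      # add batch dim -> NCHW

def transform_fn(item):
    # NNCF calls this on each item to produce the model input.
    return item

# Wrapping for NNCF (requires nncf to be installed):
# import nncf
# calibration_dataset = nncf.Dataset(CustomDataset(images), transform_fn)
```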
Optimize model using NNCF Post-training Quantization API
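For reference, a hedged sketch of the quantization call I am describing, using the public `nncf.quantize` API on an OpenVINO IR model. The model path, the `ignored_scope` contents, and the function name are placeholders, not my exact configuration:

```python
# Hypothetical sketch of NNCF post-training quantization for an OpenVINO
# IR model. Imports are deferred so the function can be defined without
# nncf/openvino installed; paths and ignored-scope entries are placeholders.
def quantize_yolov8(model_xml, calibration_dataset):
    """Quantize an OpenVINO IR model with NNCF and save the INT8 result."""
    import nncf
    import openvino as ov

    core = ov.Core()
    model = core.read_model(model_xml)

    quantized = nncf.quantize(
        model,
        calibration_dataset,
        preset=nncf.QuantizationPreset.MIXED,
        # Illustrative only: operation types to exclude from quantization.
        ignored_scope=nncf.IgnoredScope(types=["Sigmoid"]),
    )
    ov.save_model(quantized, model_xml.replace(".xml", "_int8.xml"))
    return quantized
```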
Installed packages
I also tried without the ignored_scope argument, but I still get the same result. Please help me out in this context.