Error when converting ONNX to TRT #112
Comments
@Yi-hash1 The exported ONNX has static input shapes; refer to the ONNX export command in this repository.
[04/10/2024-16:11:31] [W] [TRT] onnx2trt_utils.cpp:366: Your ONNX model has been generated with INT64 weights, while TensorRT does not natively support INT64. Attempting to cast down to INT32.
[04/10/2024-16:11:32] [I] Finish parsing network model
[04/10/2024-16:11:32] [E] Static model does not take explicit shapes since the shape of inference tensors will be determined by the model itself
[04/10/2024-16:11:32] [E] Network And Config setup failed
[04/10/2024-16:11:32] [E] Building engine failed
[04/10/2024-16:11:32] [E] Failed to create engine from model.
[04/10/2024-16:11:32] [E] Engine set up failed
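The "Static model does not take explicit shapes" error indicates that trtexec was invoked with shape flags (e.g. `--shapes`/`--minShapes`) on an ONNX file whose input dimensions are fixed. A minimal sketch of the two ways out, assuming a file named `model.onnx` and an input tensor named `images` with a 1x3x640x640 shape (both are placeholders, not taken from this issue; check your model's actual input name and dimensions):

```shell
# Option 1: build from the static ONNX as-is; no shape flags are allowed.
trtexec --onnx=model.onnx --saveEngine=model.engine

# Option 2: re-export the ONNX with dynamic axes (as the maintainer suggests,
# following the export command in the repository), then supply a shape range.
# "images" and the shapes below are assumptions; adjust them to your model.
trtexec --onnx=model_dynamic.onnx --saveEngine=model.engine \
    --minShapes=images:1x3x640x640 \
    --optShapes=images:1x3x640x640 \
    --maxShapes=images:4x3x640x640
```

The INT64-to-INT32 cast warning earlier in the log is benign; TensorRT downcasts INT64 weights automatically and parsing completes, as the "Finish parsing network model" line shows.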