num_io_tensors get error of TensorRT 8.5 when running on GPU 4090 #3803
Comments
You can try the following:
trtexec --onnx=superpoint_lightglue.onnx --loadEngine=superpoint_lightglue.engine --verbose 2>&1 | tee log
cat log | grep "Using random values for input"
cat log | grep "Using random values for output"
This will show all inputs and outputs.
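The grep step above can be sketched in Python as well. This is a minimal, hedged example: the sample log lines below are illustrative only, and the exact timestamp/prefix format of trtexec's verbose output may differ between TensorRT versions.

```python
import re

# Illustrative sample of trtexec --verbose output; the exact line format
# may vary between TensorRT versions (only the "Using random values for
# input/output <name>" phrase is relied on here).
log = """\
[03/05/2024-10:00:00] [I] Using random values for input kpts0
[03/05/2024-10:00:00] [I] Using random values for input desc0
[03/05/2024-10:00:01] [I] Using random values for output matches0
[03/05/2024-10:00:01] [I] Using random values for output mscores0
"""

def io_tensors(log_text):
    """Collect input/output tensor names from trtexec verbose log lines."""
    pattern = re.compile(r"Using random values for (input|output) (\S+)")
    names = {"input": [], "output": []}
    for kind, name in pattern.findall(log_text):
        names[kind].append(name)
    return names

print(io_tensors(log))
# -> {'input': ['kpts0', 'desc0'], 'output': ['matches0', 'mscores0']}
```

Comparing the `input` list against the four inputs declared in the ONNX file makes a mismatch like the one reported here easy to spot.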
I tried this command and got the following output. It shows only two inputs, but the ONNX file has four input tensors.
@peter5232 Please run trtexec --onnx=superpoint_lightglue.onnx --saveEngine=superpoint_lightglue.engine --verbose 2>&1 | tee build.log and then upload the build.log file.
Checking inputs/outputs with Netron is not always reliable; sometimes Netron cannot see hidden inputs/outputs.
@zerollzeng I came across one case where a 39 MB ONNX file showed nothing when opened in Netron, but trtexec could still build it successfully.
Description
I have four input tensors: [ "kpts0", "kpts1", "desc0", "desc1" ].
I converted the engine with the following command (see the linked onnx file):
trtexec --onnx=superpoint_lightglue.onnx --saveEngine=superpoint_lightglue.engine
But when I use the Python API to query the IO tensors, I only get desc0, desc1, matches0, and mscores0. The output I get is as follows.
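For reference, a minimal sketch of how the IO tensors can be enumerated with the TensorRT 8.5+ explicit-tensor Python API (`num_io_tensors`, `get_tensor_name`, `get_tensor_mode`). The engine filename matches the trtexec command above; this assumes a machine with TensorRT and a compatible GPU, so it is a sketch rather than a verified reproduction.

```python
import tensorrt as trt

def list_io_tensors(engine):
    """Split the engine's IO tensors into input and output name lists."""
    inputs, outputs = [], []
    for i in range(engine.num_io_tensors):
        name = engine.get_tensor_name(i)
        if engine.get_tensor_mode(name) == trt.TensorIOMode.INPUT:
            inputs.append(name)
        else:
            outputs.append(name)
    return inputs, outputs

logger = trt.Logger(trt.Logger.WARNING)
with open("superpoint_lightglue.engine", "rb") as f, trt.Runtime(logger) as runtime:
    engine = runtime.deserialize_cuda_engine(f.read())

# For the model in this issue, four input names would be expected here,
# but only two are reported.
print(list_io_tensors(engine))
```

If some declared ONNX inputs are missing from this list, it usually means the builder folded them away (e.g. constant-folded or unused inputs), which the trtexec build log should confirm.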
Environment
TensorRT Version: v8.5.3 and v8.6.1
NVIDIA GPU: 4090
NVIDIA Driver Version: 535.129.03
CUDA Version: 11.8
CUDNN Version: 8.9.6
Operating System:
Python Version (if applicable): 3.11
Tensorflow Version (if applicable):
PyTorch Version (if applicable): 2.1.0
Baremetal or Container (if so, version):