
Not running in ONNX Runtime #151

Open
vaghawan opened this issue Jul 31, 2024 · 2 comments


vaghawan commented Jul 31, 2024

Hi,
I'm currently using Pipeless v1 on a Jetson Orin with JetPack 5.1.2. I have the ONNX Runtime installed, and if I run the inference code standalone, outside of Pipeless, it runs fine. But when I run it through Pipeless using the ONNX Runtime, it throws the following error:

terminate called after throwing an instance of 'onnxruntime::OnnxRuntimeException'
  what():  /home/user/onnxruntime/include/onnxruntime/core/framework/ort_value.h:85 const T& OrtValue::Get() const [with T = onnxruntime::Tensor] IsTensor() was false. Trying to get a Tensor, but got: (null)

This happens with the provided onnx-yolo example as well. And if I use the Ultralytics YOLO .pt version of the model instead, Pipeless gets stuck and never reaches the point where it processes frames.

Any help would be appreciated.

Thanks

miguelaeh (Collaborator) commented:

Hi @vaghawan ,

I have not seen that error before. Which version of the ONNX Runtime are you using?

vaghawan (Author) commented:

It's version 1.17.0: onnxruntime_gpu-1.17.0-cp38-cp38-linux_aarch64.whl

It's available as an NVIDIA-compiled wheel file, and it seems this is the only option on Orin devices. These builds are available here: https://elinux.org/Jetson_Zoo#ONNX_Runtime
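Since the standalone code works but Pipeless fails, a quick way to isolate the problem is to confirm the wheel itself outside of Pipeless: print the runtime version and the available execution providers. This is a minimal sketch, not part of either project's code; whether `CUDAExecutionProvider` shows up depends on the Jetson CUDA setup.

```python
# Sanity check for the onnxruntime-gpu wheel on Jetson.
# If this prints 1.17.0 and lists CUDAExecutionProvider, the wheel
# install is fine, and the IsTensor() failure more likely comes from
# how the host application builds its OrtValue inputs.
def check_runtime():
    try:
        import onnxruntime as ort
    except ImportError as exc:
        return f"onnxruntime not installed: {exc}"
    providers = ort.get_available_providers()
    return f"version={ort.__version__} providers={providers}"

if __name__ == "__main__":
    print(check_runtime())
```

If the version or providers look wrong here, the problem is the environment rather than Pipeless.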
