Having trouble exporting model to ONNX #1821
-
Hey, thanks for the great model, but I seem to be having some issues with exporting to ONNX. I wanted to create a model that could run faster and (possibly) lighter on a server. So far I've been using a docTR model that I am fine-tuning for text recognition on my own dataset. I've been using PyTorch, since I've been having some issues with TensorFlow. I am now trying to export the model to ONNX, but I've been running into an issue while following the documentation. Here's the code
and here's the error
Any ideas on what the issue is?
Replies: 1 comment 4 replies
-
Hi @JulianAndhika 👋,
You missed setting the model to `exportable` :)
Best,
Felix