[Detector Support]: #11841
Replies: 3 comments 1 reply
-
There may be some issue with floats in that part; @blakeblackshear may be able to answer this better than I can. But I want to note that even if the model were accepted, you would not see object detections, because your labelmap and your objects -> track list do not match.
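For reference, the labels listed under objects -> track have to exist in the labelmap the model was exported with. A hypothetical config.yml sketch (the paths and label names are placeholders, not values from this thread):

```yaml
model:
  path: /config/model.tflite
  labelmap_path: /config/label.txt   # must contain every label tracked below

objects:
  track:
    - person
    - courier_logo   # only works if "courier_logo" is a line in label.txt
```

If a tracked label is missing from the labelmap, detections for it are silently dropped even when the model itself loads fine.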
-
The error message is saying that your model is expecting float32, but Frigate is passing a uint8 input. Your model doesn't have the inputs (and probably outputs) that are expected by Frigate. I would start by developing a simple python script that runs your model on a test image. Then compare that to what Frigate's source code does. You will either need to modify Frigate with a custom detector or modify your model to align with the expected inputs and outputs.
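A minimal sketch of such a script, assuming TensorFlow is installed and the model is at a placeholder path `model.tflite` (adjust both to your setup; `tflite_runtime.interpreter.Interpreter` works the same way):

```python
import os

import numpy as np


def to_model_dtype(frame: np.ndarray, dtype) -> np.ndarray:
    """Cast a uint8 frame to whatever dtype the model's input tensor expects."""
    return frame if frame.dtype == dtype else frame.astype(dtype)


def inspect_and_run(model_path: str):
    """Print a .tflite model's input/output signature and run one dummy frame."""
    import tensorflow as tf  # assumed installed for this one-off check

    interpreter = tf.lite.Interpreter(model_path=model_path)
    interpreter.allocate_tensors()
    inp = interpreter.get_input_details()[0]
    out = interpreter.get_output_details()[0]
    print("input :", inp["shape"], inp["dtype"])   # Frigate feeds uint8 here
    print("output:", out["shape"], out["dtype"])

    # Simulate what Frigate does: a uint8 frame of the model's input shape.
    frame = np.random.randint(0, 255, size=tuple(inp["shape"])).astype(np.uint8)
    # A float32 model needs this cast -- exactly the step the error is about.
    interpreter.set_tensor(inp["index"], to_model_dtype(frame, inp["dtype"]))
    interpreter.invoke()
    return interpreter.get_tensor(out["index"])


if os.path.exists("model.tflite"):  # placeholder path; point at your export
    print(inspect_and_run("model.tflite").shape)
```

If the printed input dtype is float32, you have reproduced the mismatch: Frigate's detector hands the interpreter uint8 pixels, so the `set_tensor` call fails unless the model itself accepts uint8.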
-
I have the same issue; my model was converted from a PyTorch model.
-
Describe the problem you are having
We are trying to use a custom model (built on the Azure Custom Vision service) with our Frigate service, as we have found that DeepStack does not quite meet the accuracy levels we expect.
We have trained our models on specific courier company logos and on named faces for facial recognition.
We saved the .tflite and label.txt files in the same location as our Frigate folders (we run Debian Bookworm).
However, when we update the config.yml file with the detector settings for the custom model, it breaks Frigate. The specific error message and config.yml entries are shown below.
In summary, the error expects FLOAT32, which is the format we exported our Azure-trained model in, but Frigate appears to treat the model as UINT8, which it isn't.
We have inspected the .tflite file to validate that it is indeed FLOAT32.
We would be eternally grateful for any guidance or support, as we have exhausted every avenue of research in trying to solve this problem, with no luck at all.
Error Message
ValueError: Cannot set tensor: Got value of type UINT8 but expected type FLOAT32 for input 0, name: Placeholder
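One way to resolve this mismatch on the model side (rather than patching Frigate with a custom detector) is to re-export the float32 model as a fully-integer-quantized .tflite whose input tensor is uint8. A hedged sketch using the standard TensorFlow Lite converter API; the directory, output path, and sample images are placeholders:

```python
def export_uint8_tflite(saved_model_dir: str, out_path: str, sample_images):
    """Re-export a float32 SavedModel as a uint8-input .tflite.

    sample_images: a handful of representative float32 arrays shaped like
    the model input, used to calibrate the quantization ranges.
    """
    import tensorflow as tf  # assumed installed for the one-off export

    def rep_data():
        for img in sample_images:
            yield [img]

    converter = tf.lite.TFLiteConverter.from_saved_model(saved_model_dir)
    converter.optimizations = [tf.lite.Optimize.DEFAULT]
    converter.representative_dataset = rep_data
    converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
    converter.inference_input_type = tf.uint8    # matches what Frigate passes
    converter.inference_output_type = tf.uint8
    with open(out_path, "wb") as f:
        f.write(converter.convert())
```

Whether Azure Custom Vision's exported graph survives full-integer quantization is not something this thread confirms; if it does not, the alternative is the custom-detector route suggested in the replies.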
Config.yml entry for custom model
detectors:
  custom_model:
objects:
  track:
model:
  labelmap:
Version
Frigate v3.9
Frigate config file
docker-compose file or Docker CLI command
Relevant log output
Operating system
Debian
Install method
Docker Compose
Coral version
CPU (no coral)
Any other information that may be helpful
No response