Where is the feature of blurring previous frames in case of missed detections implemented? #79

Open
aaronpohle opened this issue Sep 6, 2023 · 13 comments · May be fixed by #87

@aaronpohle

According to the readme, the dashcamcleaner has a nice feature:
"Sometimes, a license plate might be missed for just one frame. This one frame, usually 1/30th of a second long, still means the license plate or face could easily be identified - a computationally very cheap (as opposed to increasing the inference scale) way to fix such false negatives can be the frame memory option. In essence, it blurs not only the detected boxes in the current frame, it also blurs regions that were detected in n frames before."

Where is it implemented, and how can I influence the number of previously blurred frames?
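For context only (this is not dashcamcleaner's actual code), a frame-memory option along the lines of the readme description can be sketched roughly as follows. It assumes per-frame detections are already available as integer (x, y, w, h) boxes; the function name and parameters are made up for illustration.

```python
# Hypothetical sketch of a frame-memory blur, not taken from dashcamcleaner.
# Assumes each frame's detections are integer (x, y, w, h) boxes.
from collections import deque

import cv2


def blur_with_frame_memory(frames, detections_per_frame, memory=3, kernel=(51, 51)):
    """Blur current detections plus everything detected in the last `memory` frames."""
    history = deque(maxlen=memory + 1)  # boxes of the current frame + previous `memory` frames
    for frame, boxes in zip(frames, detections_per_frame):
        history.append(boxes)
        out = frame.copy()
        # Blur the union of all remembered boxes, so a single missed detection stays covered.
        for x, y, w, h in (box for frame_boxes in history for box in frame_boxes):
            roi = out[y:y + h, x:x + w]
            out[y:y + h, x:x + w] = cv2.GaussianBlur(roi, kernel, 0)
        yield out
```

A larger `memory` keeps plates covered across longer detection gaps, at the cost of over-blurring regions the plate has already left.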

@RegularTom91

Did you use a 60 fps video for your test?

@joshinils
Contributor

@tfaehse removed it in favor of faster batch processing

@graebec

graebec commented Sep 7, 2023

It would be nice to have it back, just to avoid flashing license plates within scenes.

@aaronpohle
Author

Thank you for the answer, I found the commit:
d5e792e

Now I can look into how to enable the feature again, probably at the cost of performance. But I have cases in which this feature would help and matters more than speed.

@graebec

graebec commented Sep 7, 2023

> Thank you for the answer, I found the commit: d5e792e
>
> Now I can look into how to enable the feature again, probably at the cost of performance. But I have cases in which this feature would help and matters more than speed.

Could you please share with us in case you find a way?

@joshinils
Contributor

Also, this issue is now a duplicate of #68; both are asking the same thing.

@nils8107

nils8107 commented Sep 8, 2023

I also would like it back :-)

@Counterdoc

Same here, this feature would help a lot. Unfortunately, I have a lot of flickering between some frames.

@tfaehse
Owner

tfaehse commented Dec 2, 2023

I'll try to address this over the holidays; I didn't know it was actually popular :)

@Mauzifus

I would really like to have the feature back - it doesn't matter if it takes longer - it's still faster than correcting the errors by hand.

@JZ4142

JZ4142 commented Jan 29, 2024

Hi,
is there any news on this topic? It would be great to have the feature back.

@thkukuk

thkukuk commented Jan 31, 2024

I also would like to have this feature back :)
Speed is no issue, as I "clean" the videos in batch mode on a server in the background, but fixing all this "flickering" manually is a huge amount of work. Flickering always means the face or plate is visible for one or two frames.

@tfaehse
Owner

tfaehse commented Feb 5, 2024

This PR should fix this, hopefully: #87

Basically, the pipeline was (pretty massively) changed to:

  1. Read frames, run detection for all frames
  2. Run forward/backward tracking
  3. Read frames, write blurred frames for all frames

Internally, this tracker (https://github.com/tryolabs/norfair, love it) uses Kalman filters to estimate the movement of boxes it can track. In the best case, this works better than a frame memory, because the position is not simply copied from the previous frames but is a better estimate. In the worst case it's the same as before, because no real motion can be estimated. I hope this works well for you! :)

Next up, the detector has to be improved.
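As an illustration of the tracking idea (not the code from #87), bridging short detection gaps with norfair can look roughly like this. It assumes per-frame detections as (x, y, w, h) boxes tracked via their centre point; `detections_per_frame` and the parameter values are made up, and the built-in string distance requires a reasonably recent norfair version.

```python
# Illustrative sketch only, not the code from PR #87.
# Assumes per-frame detections as (x, y, w, h) boxes, tracked via their centre point.
import numpy as np
from norfair import Detection, Tracker

tracker = Tracker(
    distance_function="euclidean",  # built-in string distances need a recent norfair release
    distance_threshold=30,          # max centre distance (px) to match a detection to a track
    hit_counter_max=15,             # keep predicting a lost box for up to ~15 frames
)

for frame_boxes in detections_per_frame:  # detections_per_frame: hypothetical iterable of box lists
    detections = [
        Detection(points=np.array([[x + w / 2, y + h / 2]]), data=(w, h))
        for (x, y, w, h) in frame_boxes
    ]
    for obj in tracker.update(detections=detections):
        cx, cy = obj.estimate[0]        # Kalman-filtered centre; predicted when the detector misses
        w, h = obj.last_detection.data  # reuse the last known box size for blurring
        # blur the region of size (w, h) around (cx, cy) here
```

In the best case the predicted centre follows the plate's motion instead of freezing at the last detected position, which is the improvement over a plain frame memory described above.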

@tfaehse tfaehse self-assigned this Feb 5, 2024
@tfaehse tfaehse linked a pull request Feb 5, 2024 that will close this issue