
GroundingDINO's high false positive rate makes it difficult to detect object hallucination #13

ipheiman opened this issue Aug 22, 2024 · 1 comment


@ipheiman

Hi authors,
Amazing work on post-hoc hallucination mitigation, I am excited to try this out!

I am looking at the individual steps and experimenting with GroundingDINO, but found that it has a tendency to produce false positives, which is counter-productive when flagging hallucinated objects. This issue has also been raised in GroundingDINO's repo: IDEA-Research/GroundingDINO#84

I was wondering if you encountered something similar when developing your work. It would be great to hear your thoughts on this, thanks!

@xjtupanda
Collaborator

xjtupanda commented Aug 22, 2024

Thanks for your attention to our work!

We've encountered similar issues but haven't found a perfect solution, since the problem is inherent to the detection model. Most of the time, though, GroundingDINO works just fine.

I think this is a gap where improvements could be made. You could try using a VQA model (e.g., BLIP-2) to double-check whether an object really exists. Alternatively, revise the pipeline: perform a global detection first (e.g., with recognize-anything) and then check whether each detected object matches one of those tags.
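The second suggestion (cross-checking detections against global image tags) can be sketched roughly as follows. This is a minimal illustration, not the authors' actual pipeline: the helper name `filter_detections`, the score threshold, and the hard-coded tag set are all hypothetical; in practice the tags would come from a tagging model such as recognize-anything, and the detections from GroundingDINO.

```python
def filter_detections(detections, image_tags, score_threshold=0.35):
    """Keep a detection only if its label also appears among the
    global tags assigned to the whole image by a tagging model.

    detections: list of (label, confidence) pairs from the detector.
    image_tags: set of tags produced for the whole image.
    score_threshold: hypothetical cutoff to drop low-confidence boxes.
    """
    tags = {t.lower() for t in image_tags}
    kept = []
    for label, score in detections:
        # A detection surviving both the confidence cutoff and the
        # tag cross-check is less likely to be a false positive.
        if score >= score_threshold and label.lower() in tags:
            kept.append((label, score))
    return kept

# Illustrative values: "unicorn" is a likely detector false positive,
# since the global tagger never assigned that tag to the image.
dets = [("dog", 0.72), ("unicorn", 0.41), ("frisbee", 0.28)]
tags = {"dog", "grass", "frisbee"}
print(filter_detections(dets, tags))  # [('dog', 0.72)]
```

Here "unicorn" is dropped because it fails the tag cross-check, and "frisbee" because it falls below the confidence threshold, so only detections supported by both signals survive.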
