
Inability to get rotation error lower than ~2.5deg on KITTI, even with overfitting #11

AndreiBarsan opened this issue Aug 28, 2020 · 3 comments

AndreiBarsan commented Aug 28, 2020

First off, I'd like to thank you for making this code base openly available. It is very quick to set up and easy to work with, and I was able to reproduce the results on KITTI with little effort. Thank you!

My question is specifically about the rotation estimation, and I am aware it probably doesn't matter too much given the option to run refinement on the global registration result. However, I am curious about the mechanism behind it.

Specifically, when training DGR on KITTI, I noticed that the relative rotation error (RRE) never seems to go below approximately 2.5 degrees, even after multiple epochs.

Moreover, I am unable to reduce the error even when overfitting on a minibatch of just four KITTI samples. I can get the RTE as low as 5 cm, but the rotation error never drops below 2.56 degrees, even after 100+ iterations on the same batch. Reducing the translation weight doesn't help either. It seems every sample gets stuck at 0.0447 rad (≈ 2.56 degrees) of rotation error and simply can't go lower.

The error never gets any lower, even after tens of iterations:

    08/28 20:32:14 Rot error: tensor([0.0447, 0.0447, 0.0447, 0.0447], grad_fn=<AcosBackward>)

Have you encountered this while working on Deep Global Registration? It's not a major issue, but I am wondering whether it could be due to how the model is structured (backpropagating the error through the correspondences) or to some other reason.

Thank you,
Andrei

P.S. Here is an easy way to support overfitting in trainer.py:

First, one can modify get_data:

    def get_data(self, iterator):
        # When overfitting, fetch one batch on the first call and keep
        # returning it, so the model trains on the same samples every step.
        if self.overfit:
            if self._cached_data is None:
                self._cached_data = next(iterator)
            else:
                print("Returning the same old batch!")
            return self._cached_data
        # Normal operation: advance the data loader iterator.
        return next(iterator)

And just set self.overfit = True and self._cached_data = None in the constructor, pass the flag in via the config, etc.
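
For instance (a minimal sketch; the config attribute name here is an assumption, not taken from the actual trainer.py):

    # Hypothetical additions to the trainer's __init__:
    self.overfit = getattr(config, 'overfit', False)  # enable via config
    self._cached_data = None  # populated on the first get_data() call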


Kausta commented Mar 15, 2021

Hello,

While trying to overfit the model (with no data augmentation, batch normalization, or regularization), we observed the same issue under multiple different training scenarios. The rotation error gets stuck at 2.563 degrees and does not go lower, even when registering a small point cloud directly with itself.

Our observations show that this does not depend on voxel size: the commenter above had the same issue on KITTI, and we reproduced it with 3DMatch features using voxel sizes of 0.025 and 0.5.

Have you previously encountered this issue, or do you have an idea of what may be causing it? We weren't able to find where the error comes from, but it is interesting that its lower bound is the same in all cases: 2.56 degrees.

Thanks,
Caner


Kausta commented Mar 18, 2021

Update: We found the cause of the error. It comes from the clamping operation applied in the batch_rotation_error function: clamping the argument of arccos to 0.999 restricts the minimum rotation error to arccos(0.999) ≈ 0.0447 rad ≈ 2.563 degrees. We can send a pull request if you are interested in fixing this.
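
For reference, here is a minimal, self-contained reconstruction of the problem (not necessarily the repository's exact implementation, but it follows the standard geodesic-angle formula and reproduces the observed 0.0447 rad floor; the commented-out clamp is one possible fix):

    import torch

    def batch_rotation_error(rots1, rots2):
        # Geodesic angle between batches of 3x3 rotation matrices:
        #   theta = arccos((trace(R1^T R2) - 1) / 2)
        r = torch.einsum('bij,bik->bjk', rots1, rots2)  # R1^T @ R2
        trace = r.diagonal(dim1=-2, dim2=-1).sum(-1)
        # The culprit: clamping to 0.999 means the reported error can
        # never drop below arccos(0.999) ~ 0.0447 rad ~ 2.563 degrees.
        cos_theta = ((trace - 1.0) / 2.0).clamp(-0.999, 0.999)
        # A possible fix: clamp only as far as numerical safety requires,
        # e.g. cos_theta = ((trace - 1.0) / 2.0).clamp(-1 + 1e-7, 1 - 1e-7)
        return torch.acos(cos_theta)

    # Registering the identity with itself should yield zero error,
    # but the clamp floors it at exactly the value seen in training logs:
    R = torch.eye(3).unsqueeze(0)
    print(batch_rotation_error(R, R))  # tensor([0.0447])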

AndreiBarsan (Author) commented

Dear Caner,

I hadn't managed to dig deeper into this anomaly since posting the issue.

That's some amazing bug detective work! Are you able to overfit to lower rotational errors now?

Kind regards,
Andrei
