
Why delete the sigmoid in the output layers? #18

Open
tongxueqing opened this issue Jul 23, 2022 · 3 comments

Comments

@tongxueqing
Thanks for your wonderful work. I wonder why you deleted the sigmoid in the output layers of DenseNet.py in the later version? I also guess it would be reasonable for self.fc_dist(out) to be followed by a softmax, since gt_distribution sums to one. (https://github.com/fnzhan/Illumination-Estimation/blob/master/RegressionNetwork/DenseNet.py)
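The softmax suggestion can be illustrated with a small self-contained sketch (plain numpy, not the repository's actual head; the `raw` values are made up for illustration): softmax maps the raw outputs of a distribution head like `self.fc_dist(out)` to non-negative values that sum to one, matching a `gt_distribution` that sums to one.

```python
import numpy as np

def softmax(logits):
    # Subtract the max before exponentiating for numerical stability.
    z = logits - np.max(logits)
    e = np.exp(z)
    return e / e.sum()

# Hypothetical raw outputs of a distribution head such as self.fc_dist(out).
raw = np.array([2.0, -1.0, 0.5, 3.0])
probs = softmax(raw)
print(probs.sum())  # 1.0: a valid probability distribution
```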

@fnzhan
Owner

fnzhan commented Jul 24, 2022

Hi, I just found that including the sigmoid makes the network more difficult to converge during training.
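One plausible reason (my own hedged sketch, not something stated in the repository): the sigmoid's gradient is at most 0.25 and vanishes for large |x|, so a head that must produce values near its saturated regions receives tiny gradients, which can slow convergence.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_grad(x):
    # d/dx sigmoid(x) = sigmoid(x) * (1 - sigmoid(x))
    s = sigmoid(x)
    return s * (1.0 - s)

print(sigmoid_grad(0.0))  # 0.25, the maximum of the derivative
print(sigmoid_grad(8.0))  # ~3e-4: near saturation the gradient almost vanishes
```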

@tongxueqing
Author

Thanks for your timely reply. Can you share how large the subset is when you begin to add a learning-rate schedule? Also, I noticed you do not save the ambient term in test.py; is that because the ambient term is not important?

@tongxueqing
Author

https://github.com/fnzhan/Illumination-Estimation/blob/master/RegressionNetwork/DenseNet.py
The if condition never holds; is this a bug, or is it intentional?
[Screenshot attached: 2022-07-29, 3:21 PM]
