
hi train with my own data #1

Open
zhangyunming opened this issue Jul 13, 2020 · 5 comments

Comments

zhangyunming commented Jul 13, 2020

I trained the v3 model with my own data, which has labels 0: background and 1: portrait, just two classes, so n_classes=1. But your mask labels are 1-255, my training loss does not change, and the test results are all black (0). What should I do? Thanks.
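
For reference, a minimal preprocessing sketch of the usual fix for this symptom: remap 0-255 masks to {0, 1} before training a binary (n_classes=1) head. This is not code from this repo; the directory names and threshold value are assumptions.

```python
# Hypothetical preprocessing sketch: remap 0/255 portrait masks to 0/1
# so they match a binary head (n_classes = 1). Paths and the threshold
# are assumptions, not part of this repo.
import glob
import os

import numpy as np
from PIL import Image


def binarize_mask(mask_path, out_path, threshold=127):
    """Load a grayscale mask with values in 0-255 and save it as 0/1."""
    mask = np.array(Image.open(mask_path).convert("L"))
    binary = (mask > threshold).astype(np.uint8)  # portrait -> 1, background -> 0
    Image.fromarray(binary).save(out_path)


if __name__ == "__main__":
    os.makedirs("masks_binary", exist_ok=True)
    for path in glob.glob("masks/*.png"):  # assumed mask directory
        binarize_mask(path, os.path.join("masks_binary", os.path.basename(path)))
```

If the masks stay at 0/255 while the model outputs probabilities in [0, 1], a sigmoid + BCE or Dice loss sees targets of 255 (or constant zeros after truncation), which matches the "loss not changing, all-black output" symptom.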

@zhangyunming changed the title from "hi can you give a train log?" to "hi train with my own data" on Jul 13, 2020
zhangyunming (Author) commented
The bug is resolved now; it is similar to pytorch-unet.

avBuffer (Owner) commented
Hey, you could switch to UNet2Plus and try again. If the error is the same, something is wrong with your data or the loss function you are using; otherwise, something is wrong in the UNet3Plus model structure.

avBuffer (Owner) commented
Hey, thanks! I tried it and found that the training loss started changing after about 18 steps. In fact, from step 0 to step 17 the training loss changed only slightly. You may need to train for more steps.

jialeqaq commented Jul 7, 2023

I am training on my own data; the DICE score is low and constant. How do I solve this?

jialeqaq commented Jul 8, 2023

Quoting the original post: "I trained the v3 model with my own data, which has labels 0: background and 1: portrait, just two classes, so n_classes=1. But your mask labels are 1-255, my training loss does not change, and the test results are all black (0). What should I do? Thanks."

Hello, why is my DICE score so low on my training data, and why does it get worse the more I train?
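
For reference, a minimal sketch (assumptions, not this repo's code) of computing the Dice coefficient on a binarized prediction, which is useful for checking whether the masks are really 0/1 and whether a low, constant DICE just means an all-background output:

```python
# Hypothetical sanity check: compute the Dice coefficient between a
# predicted probability map and a ground-truth binary mask.
# The 0.5 threshold and the smoothing constant are assumptions.
import numpy as np


def dice_coefficient(pred_probs, gt_mask, threshold=0.5, smooth=1e-6):
    """Dice = 2*|P & G| / (|P| + |G|) on binarized arrays."""
    pred = (np.asarray(pred_probs) > threshold).astype(np.float32)
    gt = (np.asarray(gt_mask) > 0).astype(np.float32)
    intersection = (pred * gt).sum()
    return (2.0 * intersection + smooth) / (pred.sum() + gt.sum() + smooth)


# Example: an all-background prediction against a mask containing a portrait
# region gives Dice close to 0, which matches the "all black output" symptom
# described earlier in this thread.
```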
