
difficulty #12

Open
PanXGzhu opened this issue Oct 29, 2024 · 2 comments

Comments

@PanXGzhu

When I run training on the FashionIQ dataset, it produces the following warning:
Training loop started
  0%|          | 0/140 [00:00<?, ?it/s]
/home/px/miniconda3/envs/cir_sprc/lib/python3.9/site-packages/torch/optim/lr_scheduler.py:139: UserWarning: Detected call of `lr_scheduler.step()` before `optimizer.step()`. In PyTorch 1.1.0 and later, you should call them in the opposite order: `optimizer.step()` before `lr_scheduler.step()`. Failure to do this will result in PyTorch skipping the first value of the learning rate schedule. See more details at https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate
  warnings.warn("Detected call of `lr_scheduler.step()` before `optimizer.step()`.
[0/30] loss_itc: 2.515, loss_rtc: 3.010, loss_align: 0.195, : 100%
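For reference, the call ordering that PyTorch expects (and that silences this warning) can be sketched with a minimal standalone example. The toy model, optimizer, and scheduler below are illustrative placeholders, not the actual training code from this repository:

```python
import torch

# Toy setup (hypothetical, for illustration only).
model = torch.nn.Linear(2, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=1, gamma=0.1)

for epoch in range(2):
    optimizer.zero_grad()
    loss = model(torch.ones(1, 2)).sum()
    loss.backward()
    optimizer.step()   # optimizer first...
    scheduler.step()   # ...then the scheduler, so the first LR value is not skipped
```

If the project's training loop calls `scheduler.step()` before `optimizer.step()` (or once per batch where it was written for once per epoch), reordering those two calls should remove the warning without otherwise changing the training logic.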

Also, why does the progress bar show 0% when the training loop has started?

I really hope you can help me solve this problem. Thank you.


@PanXGzhu
Author

[Screenshot attachment: 99286A18A9048BD4371CB160C34EF3FC.png]
