When I run training on the FashionIQ dataset again, it produces the following warning:
Training loop started
  0%|          | 0/140 [00:00<?, ?it/s]
/home/px/miniconda3/envs/cir_sprc/lib/python3.9/site-packages/torch/optim/lr_scheduler.py:139: UserWarning: Detected call of `lr_scheduler.step()` before `optimizer.step()`. In PyTorch 1.1.0 and later, you should call them in the opposite order: `optimizer.step()` before `lr_scheduler.step()`. Failure to do this will result in PyTorch skipping the first value of the learning rate schedule. See more details at https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate
  warnings.warn("Detected call of `lr_scheduler.step()` before `optimizer.step()`. "
[0/30] loss_itc: 2.515, loss_rtc: 3.010, loss_align: 0.195: 100%
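As I understand it, the warning is only about the call order of the two `step()` calls. A minimal sketch of the order PyTorch expects is below; this is not this repo's actual training loop, and `model`, `train_loader`, and the optimizer/scheduler choices are placeholders I made up for illustration:

```python
import torch

# Placeholder model, optimizer, and scheduler (assumptions, not the repo's config)
model = torch.nn.Linear(10, 2)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
scheduler = torch.optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=30)

for epoch in range(30):
    for images, captions in train_loader:   # placeholder data loader
        loss = model(images).sum()          # placeholder loss
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()                    # step the optimizer first ...
    scheduler.step()                        # ... then step the scheduler
```

If `scheduler.step()` is called before the first `optimizer.step()`, PyTorch skips the first value of the learning-rate schedule, which is what the warning describes.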
Also, why does the progress bar still show 0% right after the training loop starts?
I really hope you can help me solve this problem. Thank you.