Labels: bug, good first issue, hacktoberfest, templates
Description
Describe the bug
The following warning is shown in the CI (step-504):
/opt/hostedtoolcache/Python/3.6.13/x64/lib/python3.6/site-packages/torch/optim/lr_scheduler.py:134:
UserWarning: Detected call of `lr_scheduler.step()` before `optimizer.step()`.
In PyTorch 1.1.0 and later, you should call them in the opposite order: `optimizer.step()` before `lr_scheduler.step()`.
Failure to do this will result in PyTorch skipping the first value of the learning rate schedule.
See more details at https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate
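For reference, the order the PyTorch docs recommend is `optimizer.step()` first and `lr_scheduler.step()` afterwards. A minimal sketch in plain PyTorch (hypothetical model and dummy data, not the template's actual training code) that does not trigger the warning:

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 2)                      # hypothetical model
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=1, gamma=0.9)

for epoch in range(2):
    for _ in range(4):                        # dummy iterations
        x = torch.randn(4, 10)
        y = torch.randint(0, 2, (4,))
        loss = nn.functional.cross_entropy(model(x), y)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()                      # update parameters first
    scheduler.step()                          # then advance the LR schedule
```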
Reproduction
Steps to reproduce
python main.py \
--data_path <path_to_dataset> \
--train_batch_size 4 \
--eval_batch_size 4 \
--num_workers 2 \
--max_epochs 2 \
--train_epoch_length 4 \
--eval_epoch_length 4
Expected result
No warning should be shown.
Environment info
Output of `python -m torch.utils.collect_env`:
OS: Linux
torch: 1.9.0
torchvision: 0.10.0
ignite: 0.4.5
If you would like to tackle this issue, please comment that you want to work on it and see the contributing guide.