chore: update submodules (#189)
Co-authored-by: ydcjeff <[email protected]>
github-actions[bot] and ydcjeff authored Oct 3, 2023
1 parent b908cda commit c881068
Showing 3 changed files with 3 additions and 3 deletions.
src/how-to-guides/03-time-profiling.md: 2 changes (1 addition & 1 deletion)
@@ -16,7 +16,7 @@ This example demonstrates how you can get the time breakdown for:
 - Individual epochs during training
 - Total training time
 - Individual [`Events`](https://pytorch-ignite.ai/concepts/02-events-and-handlers#events)
-- All [`Handlers`](https://pytorch-ignite.ai/concepts/02-events-and-handlers#handlers) correspoding to an `Event`
+- All [`Handlers`](https://pytorch-ignite.ai/concepts/02-events-and-handlers#handlers) corresponding to an `Event`
 - Individual `Handlers`
 - Data loading and Data processing.

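The file touched above is the time-profiling how-to for PyTorch-Ignite. For orientation, here is a minimal sketch of the kind of usage that guide covers, assuming `BasicTimeProfiler` is importable from `ignite.handlers` (recent Ignite releases; older ones exposed it under `ignite.contrib.handlers`) with the documented `attach` / `get_results` / `print_results` methods:

```python
from ignite.engine import Engine
from ignite.handlers import BasicTimeProfiler

# Stub update function; the guide profiles a real training step instead.
def train_step(engine, batch):
    pass

trainer = Engine(train_step)

# Attach the profiler before running so it can hook every lifecycle event.
profiler = BasicTimeProfiler()
profiler.attach(trainer)

trainer.run(range(8), max_epochs=2)

# Print the per-event, per-handler and dataflow time breakdown.
profiler.print_results(profiler.get_results())
```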
src/how-to-guides/04-fastai-lr-finder.md: 2 changes (1 addition & 1 deletion)
@@ -72,7 +72,7 @@ optimizer = torch.optim.RMSprop(model.parameters(), lr=1e-06)
 criterion = nn.CrossEntropyLoss()
 ```
 
-We will first train the model with a fixed learning rate (lr) of 1e-06 and inspect our results. Let's save the initial state of the model and the optimizer to restore them later for comparision.
+We will first train the model with a fixed learning rate (lr) of 1e-06 and inspect our results. Let's save the initial state of the model and the optimizer to restore them later for comparison.
 
 
 ```python
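The prose line changed in this hunk describes the step taken right before the guide's next code block. A minimal sketch of that step, using a hypothetical stand-in model (only the RMSprop optimizer at lr=1e-06 matches the hunk header above) and standard PyTorch `state_dict` handling:

```python
import copy

import torch
import torch.nn as nn

# Hypothetical stand-in for the guide's model; the real one is defined earlier
# in that file. The optimizer mirrors the hunk header above.
model = nn.Linear(10, 2)
optimizer = torch.optim.RMSprop(model.parameters(), lr=1e-06)

# Snapshot the untrained weights and optimizer state so the later comparison
# restarts from identical conditions.
init_model_state = copy.deepcopy(model.state_dict())
init_opt_state = copy.deepcopy(optimizer.state_dict())

# ... train with lr=1e-06 and inspect the results ...

# Restore the saved state before the next experiment.
model.load_state_dict(init_model_state)
optimizer.load_state_dict(init_opt_state)
```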
static/examples: 2 changes (1 addition & 1 deletion)
