
Commit e0929a4

Authored Dec 30, 2018

remove more spurious perplexity references

1 parent 97e3e13 commit e0929a4

File tree

1 file changed: +3 −3 lines changed
word_language_model/README.md

Lines changed: 3 additions & 3 deletions
````diff
@@ -5,9 +5,9 @@ By default, the training script uses the Wikitext-2 dataset, provided.
 The trained model can then be used by the generate script to generate new text.

 ```bash
-python main.py --cuda --epochs 6        # Train a LSTM on Wikitext-2 with CUDA, reaching perplexity of 117.61
-python main.py --cuda --epochs 6 --tied # Train a tied LSTM on Wikitext-2 with CUDA, reaching perplexity of 110.44
-python main.py --cuda --tied            # Train a tied LSTM on Wikitext-2 with CUDA for 40 epochs, reaching perplexity of 87.17
+python main.py --cuda --epochs 6        # Train a LSTM on Wikitext-2 with CUDA
+python main.py --cuda --epochs 6 --tied # Train a tied LSTM on Wikitext-2 with CUDA
+python main.py --cuda --tied            # Train a tied LSTM on Wikitext-2 with CUDA for 40 epochs
 python generate.py                      # Generate samples from the trained LSTM model.
 ```
````
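The figures this commit removes (117.61, 110.44, 87.17) are word-level perplexities. As a point of reference, not part of this repository's code, perplexity is conventionally computed as the exponential of the average per-token cross-entropy loss, which is how numbers like these are typically derived from a training run:

```python
import math

def perplexity(total_nll: float, num_tokens: int) -> float:
    """Perplexity is exp of the mean per-token negative
    log-likelihood (cross-entropy, in nats) over the corpus."""
    return math.exp(total_nll / num_tokens)

# A model that assigns uniform probability over a 100-word
# vocabulary has cross-entropy ln(100) per token, so its
# perplexity is exactly 100.
print(perplexity(math.log(100.0), 1))
```

Because these values depend on hyperparameters, hardware, and library versions, pinning specific perplexities in a README tends to go stale, which is presumably why the commit drops them.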

0 commit comments

Comments
 (0)
Please sign in to comment.