Is this model particularly memory intensive? Building the loss model for seq_length 10 hangs after consuming 16GB of RAM.
This might be caused by the fact that I am using TensorFlow 1.0.0-rc1 (because of #26: legacy_seq2seq is present in 1.0.0-rc1 but not in 0.12.1). I've modified your code to run on this version, and it trains correctly with shorter sequences, but it still eats memory.
Is this normal, or is it some sort of memory leak introduced by the newer version of TensorFlow?
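For reference, the modification I mean is roughly the following sketch: resolving the seq2seq module from whichever location the installed TensorFlow provides (the module paths below are my assumption of where it lives in each version; `resolve_seq2seq` is a hypothetical helper name).

```python
import importlib


def resolve_seq2seq():
    """Return the seq2seq module for the installed TensorFlow, or None.

    Tries the TF 1.0.0-rc1 location first (tf.contrib.legacy_seq2seq),
    then falls back to the TF 0.12.x location (tf.nn.seq2seq's backing
    module). Returns None if TensorFlow is not installed at all.
    """
    candidates = (
        "tensorflow.contrib.legacy_seq2seq",  # TF 1.0.0-rc1 and later
        "tensorflow.python.ops.seq2seq",      # TF 0.12.x
    )
    for name in candidates:
        try:
            return importlib.import_module(name)
        except ImportError:
            continue
    return None
```

The rest of the code then calls functions like `sequence_loss` through the returned module instead of the hard-coded `tf.nn.seq2seq` path.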