
[BUG] Clarify non-optionality of TensordictPrimer #2362

Open
matteobettini opened this issue Aug 5, 2024 · 0 comments
Labels: bug (Something isn't working)

matteobettini (Contributor) commented Aug 5, 2024

Without the primer, the collector does not feed any hidden state to the policy

In the RNN tutorial it is stated that the primer is optional and that it is only used to store the hidden states in the buffer.

This is not true in practice. Without the primer, the collector does not feed the hidden states to the policy during execution, which silently causes the RNN to lose all recurrence.

The tutorial makes it seem like the only reason the primer is there is to store hidden states in the buffer (which I would also strongly advise against, as it makes any algorithm on-policy and can lead to severe issues).

I think the tutorial needs to be changed to remove any claim that the primer is optional. Instead, it should state clearly that the primer is required: if a user removes it, the tutorial silently runs without recurrence.

Reproduce

To reproduce, comment out this line

env.append_transform(lstm.make_tensordict_primer())

and print the policy input at this line

policy_output = self.policy(policy_input)

You will see that no hidden state is fed to the RNN during execution, and no errors or warnings are raised. A minimal self-contained version of the same check is sketched below.
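
Here is a minimal, self-contained sketch of the check (my own example, not the tutorial code verbatim; the env and key names are illustrative and may differ across TorchRL versions). With the primer appended, the recurrent state keys show up in the reset tensordict and are carried by the collector from step to step; without it, they are simply absent:

```python
from torchrl.envs import GymEnv, TransformedEnv, InitTracker
from torchrl.modules import LSTMModule

env = TransformedEnv(GymEnv("CartPole-v1"), InitTracker())
lstm = LSTMModule(
    input_size=env.observation_spec["observation"].shape[-1],
    hidden_size=64,
    in_key="observation",
    out_key="embed",
)

# Comment this line out to reproduce the silent loss of recurrence:
env.append_transform(lstm.make_tensordict_primer())

td = env.reset()
# With the primer, the reset tensordict contains zero-initialised recurrent
# state entries ("recurrent_state_h" / "recurrent_state_c"), which the
# collector then carries from step to step and feeds back to the policy.
# Without the primer, those keys are missing and the LSTM silently restarts
# from zeros at every step.
print(sorted(td.keys()))
```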

The future I want to live in

In the future I want to live in, there are no primers. The TorchRL components are able to look at the policy outputs and carry forward whatever is in "next". To me the primer is a pain point unique to TorchRL: users doing recurrence in other libraries have no equivalent of it, will probably forget it, and will hit major silent bugs.
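
To illustrate the idea (a purely hypothetical sketch, not an existing TorchRL API): the collector itself would build the next policy input from whatever ends up under "next", so recurrent state written there would be carried forward without having to declare it in the env specs via a primer:

```python
# Hypothetical sketch of the desired collector behaviour; `stepped_td` is the
# tensordict returned by env.step() after the policy has been called on it.
def next_policy_input(stepped_td):
    # Take everything under "next" as the new root, including any keys the
    # policy wrote there (e.g. updated recurrent states), instead of keeping
    # only the keys declared in the env specs.
    return stepped_td.get("next").exclude("reward")
```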
