Zero the gradient buffers of all parameters and backprops with random gradients:
```python
net.zero_grad()
out.backward(torch.randn(1, 10))
```
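For context, here is a minimal self-contained sketch of what those two calls do, using an assumed toy `nn.Linear` model in place of the tutorial's full network: `zero_grad()` clears any accumulated gradient buffers, and calling `backward()` on a non-scalar output requires an explicit upstream gradient tensor, which the tutorial fills with random values.

```python
import torch
import torch.nn as nn

# Assumed toy model standing in for the tutorial's network.
net = nn.Linear(16, 10)
inp = torch.randn(1, 16)
out = net(inp)  # shape (1, 10)

net.zero_grad()                   # clear the gradient buffers of all parameters
out.backward(torch.randn(1, 10))  # backprop a random upstream gradient (same shape as out)

# After backward, each parameter holds a gradient of matching shape.
print(net.weight.grad.shape)  # torch.Size([10, 16])
```

Since the random gradient carries no learning signal, the resulting parameter gradients are meaningless for training, which is presumably why the step confuses readers.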
What is the purpose of this? It is not part of standard ML workflows and can be confusing to beginners. (As evidence, I am helping some people learn the basics of ML, and I got questions about this line. That is how I found out about it!)
If there is no good reason for it, then I suggest:
- dropping these few lines
- changing the wording of other parts of the page if needed, e.g. "at this point we covered... calling backward"