There is an issue on the following page: https://docs.pytorch.org/tutorials/beginner/basics/optimization_tutorial.html
In the Full Implementation section, the training loop does not call optimizer.zero_grad() at the top of the backpropagation block, as recommended in the paragraph preceding that section; instead, it calls it last.
Actual code:
# Backpropagation
loss.backward()
optimizer.step()
optimizer.zero_grad()
Recommended code:
# Backpropagation
optimizer.zero_grad()
loss.backward()
optimizer.step()
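For context, here is a minimal sketch of how the tutorial's train_loop would read with the recommended ordering. PyTorch accumulates gradients by default, so optimizer.zero_grad() resets them before each backward pass. The function and variable names follow the tutorial; treat this as an illustration of the proposed fix, not a verbatim copy of the current page:

def train_loop(dataloader, model, loss_fn, optimizer):
    size = len(dataloader.dataset)
    model.train()
    for batch, (X, y) in enumerate(dataloader):
        # Compute prediction and loss
        pred = model(X)
        loss = loss_fn(pred, y)

        # Backpropagation: reset accumulated gradients first,
        # then compute new gradients and take an optimizer step
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

        if batch % 100 == 0:
            loss, current = loss.item(), batch * len(X)
            print(f"loss: {loss:>7f}  [{current:>5d}/{size:>5d}]")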
If you could point me to how to make this change in the documentation, I would be glad to do it.