Hi @wittyalias (good GitHub name too btw),

Good question!

As far as PyTorch is concerned, both of these are the same in terms of requires_grad.

Your assumption is right that nn.Parameter takes the requires_grad parameter of the tensor passed to it.

But also, even if we didn't set requires_grad=True, nn.Parameter has it set to True by default, see: https://pytorch.org/docs/stable/generated/torch.nn.parameter.Parameter.html
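
A quick way to confirm the default (just an illustrative check, not from the videos):

import torch
from torch import nn

print(nn.Parameter(torch.randn(1)).requires_grad) # -> True, even though we never set it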

I set it explicitly in the videos as a demonstration.

But you can also check that the two above are the same via:

import torch
from torch import nn

# Grad outside torch.randn()
grad_outside = nn.Parameter(torch.randn(1, # <- start with random weights (this will get adjusted as the model learns)
                                        dtype=torch.float), # <- the tensor itself is created without requires_grad
                            requires_grad=True) # <- requires_grad is set on the nn.Parameter instead
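
And the counterpart with requires_grad set inside torch.randn() (a minimal sketch; the variable name grad_inside is just for illustration):

# Grad inside torch.randn()
grad_inside = nn.Parameter(torch.randn(1, # <- start with random weights
                                       dtype=torch.float,
                                       requires_grad=True)) # <- requires_grad is set on the tensor this time

print(grad_outside.requires_grad) # True
print(grad_inside.requires_grad)  # True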

Answer selected by wittyalias