I am not able to get what the problem is? #1247
Unanswered
apurvkumar023
asked this question in
Q&A
```python
lossy = []
epochs = 1

### Training
# 0. Loop through the data
for epoch in range(epochs):
    # Set the model to training mode (makes layers like dropout and
    # batchnorm behave as they should during training)
    model_0.train()

    # 1. Forward pass
    y_pred = model_0(X_train)

    # 2. Calculate the loss
    loss = loss_fn(y_pred, y_train)

    # Record the loss so we can graph it later
    lossy.append(loss.item())
    print(f"loss: {loss}")

    # 3. Zero the optimizer's gradients -- PyTorch accumulates gradients
    # on every backward() call, so without this the gradients from
    # previous iterations would pile up
    optimizer.zero_grad()

    # 4. Backpropagate the loss with respect to the model's parameters
    loss.backward()

    # 5. Step the optimizer (perform gradient descent)
    optimizer.step()

### Testing
model_0.eval()  # sets eval mode (dropout/batchnorm); it does NOT turn off
                # gradient tracking -- wrap inference in torch.no_grad() for that
print(model_0.state_dict())
```
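For reference, the five numbered steps are just plain gradient descent. Here is a dependency-free sketch of the same loop (a hypothetical one-parameter model `y = w * x` with made-up data, no PyTorch), spelling out by hand what forward, loss, zero-grad, backward, and step each do, and showing that the loss should shrink every epoch:

```python
# Plain-Python sketch of the same five steps: fit y = w * x with MSE loss.
# Hypothetical data; the true parameter is w = 2.0.
X_train = [1.0, 2.0, 3.0, 4.0]
y_train = [2.0, 4.0, 6.0, 8.0]

w = 0.0        # the single model parameter
lr = 0.01      # learning rate (size of each optimizer step)
losses = []

for epoch in range(100):
    # 1. forward pass
    y_pred = [w * x for x in X_train]
    # 2. loss (mean squared error)
    loss = sum((yp - yt) ** 2 for yp, yt in zip(y_pred, y_train)) / len(X_train)
    losses.append(loss)
    # 3. zero the gradient -- like optimizer.zero_grad(); without this
    # the gradient from the previous epoch would keep accumulating
    grad = 0.0
    # 4. "backward": accumulate d(loss)/dw, like loss.backward()
    grad += sum(2 * (yp - yt) * x
                for yp, yt, x in zip(y_pred, y_train, X_train)) / len(X_train)
    # 5. optimizer step: move w against the gradient
    w -= lr * grad

print(losses[0], losses[-1])  # loss shrinks toward 0 as w approaches 2.0
```

If a loop like this runs but the loss never moves, the usual suspects are a learning rate of 0, the optimizer not being constructed with the model's parameters, or the step/backward calls never actually executing.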
I tried ChatGPT but still can't find the answer. The loss value and the model's parameter values are not changing. I don't know where I am making the mistake!