How to extract gradients from a tensor and add them to the data of another tensor? #2343
Replies: 1 comment
-
```rust
let mut weights = Var::randn(0_f32, 1_f32, &[27, 27], &DEVICE)?;
for k in 0..50 {
    let loss = forward_pass(&xs, &ys, &weights)?;
    println!("Iteration {k} - loss = {}", loss.to_scalar::<f32>()?);
    // Note: this builds a GradStore from `weights` itself, removes the
    // `weights` entry, and drops the store, so it has no lasting effect.
    weights.backward()?.remove(&weights);
    // Gradients of `loss` with respect to every Var in the graph.
    let loss_grad = loss.backward()?;
    let weights_grad = loss_grad.get(&weights).unwrap();
    // Gradient-descent step with lr = 50, rebuilding the Var from the result.
    weights = Var::from_tensor(
        &weights
            .broadcast_sub(&weights_grad.broadcast_mul(&Tensor::new(&[50_f32], &DEVICE)?)?)?,
    )?;
}
```

Currently using this, but am unsure if this is the best way.
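
A more idiomatic variant (a sketch, modeled on the pattern candle-nn's SGD optimizer uses) keeps a single `Var` and updates it in place with `Var::set`, reading the gradient out of the `GradStore` returned by `loss.backward()`. The sum-of-squares loss below is only a stand-in for the bigram `forward_pass` so the snippet is self-contained, and the learning rate is arbitrary:

```rust
use candle_core::{Device, Result, Var};

fn main() -> Result<()> {
    let device = Device::Cpu;
    // Toy objective (sum of squares) standing in for the bigram forward pass.
    let weights = Var::randn(0_f32, 1_f32, (27, 27), &device)?;
    let lr = 0.1_f64;
    for k in 0..50 {
        let loss = weights.sqr()?.sum_all()?;
        println!("Iteration {k} - loss = {}", loss.to_scalar::<f32>()?);
        // backward() walks the graph from `loss` and returns a GradStore:
        // a map from tensor id to gradient, populated for the Vars involved.
        let grads = loss.backward()?;
        if let Some(grad) = grads.get(&weights) {
            // Update the Var's storage in place; no need to rebuild the Var.
            weights.set(&weights.sub(&(grad * lr)?)?)?;
        }
    }
    Ok(())
}
```

With this shape there is no need for the extra `weights.backward()` call or the `Var::from_tensor` rebuild each iteration; candle-nn's `SGD` optimizer wraps essentially the same update if you prefer not to write it by hand.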
-
Taken from Karpathy's lecture (makemore_part1_bigrams.ipynb). Here `W` and `loss` are PyTorch tensors.
I'm having trouble understanding `GradStore`. How can I implement the same thing using candle tensors?
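
One way to frame the mapping (a sketch, not an official equivalence): PyTorch stores the gradient on the tensor itself as `W.grad`, whereas candle's `loss.backward()` returns every gradient at once in a `GradStore` keyed by tensor id, with `Var` playing the role of a trainable leaf. A hypothetical helper (the `apply_grad` name and `lr` parameter are illustrative, not candle API) showing the analogue of `W.data += -lr * W.grad`:

```rust
use candle_core::{backprop::GradStore, Result, Var};

// Illustrative helper (not a candle API): apply one gradient-descent step
// to a Var, given the GradStore returned by `loss.backward()?`.
fn apply_grad(w: &Var, grads: &GradStore, lr: f64) -> Result<()> {
    if let Some(grad) = grads.get(w) {
        // candle counterpart of PyTorch's `W.data += -lr * W.grad`
        w.set(&w.sub(&(grad * lr)?)?)?;
    }
    Ok(())
}
```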