Re-enabling gradients in Validation loop? #7116
-
I'm trying to use Layer-wise Relevance Propagation as part of the training and validation loops of my model, but it requires gradients to be present in order to calculate the relevance. I'm trying to figure out the best way of approaching this in a Lightning-friendly way. I can already do it for the training loop, but the validation loop is what's giving me problems. The only way I can think to do it at the moment is, once the validation loop finishes, to re-run the validation dataset with gradients enabled without calling the optimizer's `step()`.
-
You can call `torch.set_grad_enabled(True)` in your validation loop or in any hook you want. Doesn't that work? And perhaps also call `model.train()` if you have normalization layers, dropout, or layers of that kind.
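A minimal sketch of how that could look inside `validation_step`, assuming Lightning's default `no_grad` wrapping around validation. The input-gradient computation here is just an illustrative stand-in for the actual LRP relevance calculation, which is not shown in this thread:

```python
import torch
import torch.nn.functional as F
import pytorch_lightning as pl


class LitModel(pl.LightningModule):
    def validation_step(self, batch, batch_idx):
        x, y = batch
        # Lightning runs validation under no_grad, so re-enable
        # gradient tracking locally for this block only.
        with torch.set_grad_enabled(True):
            x = x.requires_grad_()  # track gradients w.r.t. the input
            logits = self(x)
            loss = F.cross_entropy(logits, y)
            # Input gradients are now available; an LRP implementation
            # would use the graph built above instead of this call.
            (input_grads,) = torch.autograd.grad(loss, x)
        self.log("val_loss", loss)
```

Note that nothing here calls the optimizer, so model weights are untouched. On recent Lightning versions validation runs under `torch.inference_mode` by default, which `set_grad_enabled` cannot override; if that applies to you, passing `Trainer(inference_mode=False)` should make Lightning fall back to `no_grad` so the context manager above works.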