
Hi @kiranvad,

The problem with the first implementation is that `model.likelihood` and `log_prob` are executed outside the scope of `functional_call`, yet they access the noise variance parameter and the lengthscales, respectively.

You can use the `_reparametrize_module` context manager in the following way to ensure the parameters are replaced for all operations. I separated the individual calls out to highlight which line accesses which component of the Hessian. Removing any individual line from the `_reparametrize_module` context will cause the corresponding rows of the resulting Hessian to be zero.

from torch.nn.utils.stateless import _reparametrize_module

def log_likelihood_fn(theta):
    pa…
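To illustrate the idea on a self-contained example (the snippet above is truncated), here is a minimal sketch with a hypothetical one-parameter toy module standing in for the GP model. The key point is the same: every operation that reads a module parameter must happen inside `_reparametrize_module`, so that `torch.autograd.functional.hessian` sees the dependence on `theta`.

```python
import torch
import torch.nn as nn
from torch.nn.utils.stateless import _reparametrize_module

# Hypothetical toy model: its "log likelihood" is a quadratic in `weight`,
# so the Hessian is known in closed form and easy to check.
class ToyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.weight = nn.Parameter(torch.tensor([1.0]))

    def forward(self, x):
        return self.weight * x

model = ToyModel()
x = torch.tensor([2.0])
y = torch.tensor([3.0])

def log_likelihood_fn(theta):
    params = {"weight": theta}
    # All parameter accesses must happen inside the context manager;
    # anything outside it would see the module's original parameter
    # and contribute zero rows to the Hessian.
    with _reparametrize_module(model, params):
        pred = model(x)
        return -0.5 * ((pred - y) ** 2).sum()

theta0 = torch.tensor([1.0])
hess = torch.autograd.functional.hessian(log_likelihood_fn, theta0)
print(hess)  # d²/dθ² of -0.5*(θ*x - y)² is -x² = -4
```

The same pattern extends to the GP case: pack all hyperparameters into `theta`, map them to the module's parameter names, and keep both the likelihood call and `log_prob` inside the `with` block.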

Answer selected by kiranvad