How come the StudentTLikelihood is 10 dimensional? #2313
-
I would expect it to be 1-dimensional for 1-dimensional observations, but I'm guessing there is a good reason for the (to me) surprising behaviour. Can anyone please explain this to me? Thanks! See the snippet below for an example.

```python
import gpytorch
import torch


class GPModel(gpytorch.models.ApproximateGP):
    def __init__(self, inducing_points):
        variational_distribution = gpytorch.variational.NaturalVariationalDistribution(inducing_points.size(0))
        variational_strategy = gpytorch.variational.VariationalStrategy(
            self, inducing_points, variational_distribution, learn_inducing_locations=True
        )
        super(GPModel, self).__init__(variational_strategy)
        self.mean_module = gpytorch.means.ConstantMean()
        self.covar_module = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel())

    def forward(self, x):
        mean_x = self.mean_module(x)
        covar_x = self.covar_module(x)
        return gpytorch.distributions.MultivariateNormal(mean_x, covar_x)


x = torch.linspace(0, 10, 100)
model = GPModel(inducing_points=x)
likelihood = gpytorch.likelihoods.StudentTLikelihood()

model(x).mean.shape              # torch.Size([100]), nothing out of the ordinary here
likelihood(model(x)).mean.shape  # torch.Size([10, 100]), surprising!
```
Replies: 2 comments
-
See the documentation (currently available only under "latest", will be under "stable" upon next release): https://docs.gpytorch.ai/en/latest/likelihoods.html#gpytorch.likelihoods.Likelihood.__call__

Also see #1955 for a discussion on this topic.
-
Thank you! I didn't realise the likelihood does Monte Carlo sampling automatically 👍