SVGP Prediction Expression #1761
-
Hi, I am completely new to Gaussian Process Regression and was looking for some help understanding what the SVGP prediction expression looks like in terms of the parameters obtained from model.named_parameters() and likelihood.named_parameters(). For context, I wish to train an SVGP using GPyTorch and then deploy it manually on a platform that does not support GPyTorch, so I need to understand the prediction expression. I followed the example at https://docs.gpytorch.ai/en/stable/examples/04_Variational_and_Approximate_GPs/SVGP_Regression_CUDA.html, with the exception that I used my own training and test data; my model definition is otherwise identical to the one in that example.
According to titsias09a, the posterior mean of the SVGP is given by

K_{xm} K_{mm}^{-1} \mu

where \mu is the variational mean, K_{mm} is the kernel matrix over the inducing points, and K_{xm} is the cross-covariance between the test and inducing points. Since the mean of the targets in my dataset should be 0, I calculated the posterior mean as described in titsias09a, and also using the following expression (which takes a constant non-zero mean into consideration):

c + K_{xm} K_{mm}^{-1} (\mu - c)

Note that here c is a constant scalar (obtained from the model's constant mean). However, both of these expressions evaluate to something that does not match the predictions from GPyTorch on the same test data. I am not sure whether I am using incorrect expressions or wrongly interpreting the parameters. Any help would be appreciated. Thanks in advance!
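To make this concrete, here is a toy NumPy sketch of the two expressions I am evaluating. The kernel, hyperparameters, and numbers below are made-up stand-ins for illustration, not my actual model or data:

```python
import numpy as np

def svgp_mean_unwhitened(K_xm, K_mm, mu, c=0.0, jitter=1e-6):
    """Posterior mean c + K_{xm} K_{mm}^{-1} (mu - c).

    With c = 0 this reduces to the zero-mean expression K_{xm} K_{mm}^{-1} mu.
    A small jitter is added to K_mm for numerical stability before solving.
    """
    m = K_mm.shape[0]
    w = np.linalg.solve(K_mm + jitter * np.eye(m), mu - c)
    return c + K_xm @ w

def rbf(a, b, lengthscale=1.0, outputscale=1.0):
    """Toy 1-D RBF kernel between point sets a and b."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return outputscale * np.exp(-0.5 * d2 / lengthscale**2)

Z = np.array([0.0, 1.0, 2.0])    # inducing locations (made up)
X = np.array([0.5, 1.5])         # test locations (made up)
mu = np.array([0.3, -0.1, 0.4])  # variational mean (made up)
K_mm = rbf(Z, Z)
K_xm = rbf(X, Z)

print(svgp_mean_unwhitened(K_xm, K_mm, mu))         # zero-mean case
print(svgp_mean_unwhitened(K_xm, K_mm, mu, c=2.0))  # constant-mean case
```

One sanity check: if the test points coincide with the inducing points (so K_xm = K_mm), both expressions should return \mu exactly.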
Replies: 1 comment
-
This is due to the fact that the variational strategy is in the *whitened* parameterization (see section 5.1 here, although the concept dates back to here). So you want to compute

K_{xm} K_{mm}^{-1/2} \mu

in the zero-mean case, and

c + K_{xm} K_{mm}^{-1/2} (\mu - c)

in the non-zero-mean case. The inverse matrix square roots should be computed via a Cholesky factorization.
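A minimal NumPy sketch of the whitened computation, using the lower Cholesky factor L of K_{mm} as the matrix square root. Note the kernel and numbers are toy stand-ins, and the choice of L^{-T} (rather than L^{-1}) below is my reading of the whitening convention; you should verify it against the actual implementation:

```python
import numpy as np

def svgp_mean_whitened(K_xm, K_mm, mu, c=0.0, jitter=1e-6):
    """Posterior mean c + K_{xm} K_{mm}^{-1/2} (mu - c).

    K_{mm}^{-1/2} is taken as L^{-T}, where L is the lower Cholesky
    factor of K_{mm} (an assumed convention; check your library).
    """
    m = K_mm.shape[0]
    L = np.linalg.cholesky(K_mm + jitter * np.eye(m))
    # Solve L^T w = (mu - c) instead of forming an explicit inverse.
    w = np.linalg.solve(L.T, mu - c)
    return c + K_xm @ w

def rbf(a, b, lengthscale=1.0, outputscale=1.0):
    """Toy 1-D RBF kernel between point sets a and b."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return outputscale * np.exp(-0.5 * d2 / lengthscale**2)

Z = np.array([0.0, 1.0, 2.0])    # inducing locations (made up)
X = np.array([0.5, 1.5])         # test locations (made up)
mu = np.array([0.3, -0.1, 0.4])  # whitened variational mean (made up)
K_mm = rbf(Z, Z)
K_xm = rbf(X, Z)

print(svgp_mean_whitened(K_xm, K_mm, mu))         # zero-mean case
print(svgp_mean_whitened(K_xm, K_mm, mu, c=2.0))  # constant-mean case
```

Since K_{xm} L^{-T} = K_{xm} K_{mm}^{-1} L, this is algebraically the unwhitened expression applied to the "de-whitened" mean L \mu, which is a useful cross-check.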