Methods for improving kernel conditioning - in the flat limit and beyond #2213
-
Hi @Otruffinet, thanks for starting this discussion topic. I think there are indeed some nice ideas in there. @dme65 has worked quite a bit on RBF interpolation in the past. We have been using such models in certain high-dimensional settings with large data sets. I am less familiar with the preconditioning ideas stemming from that, but it sounds like these could be quite useful. Have you implemented any of these and tried them out with gpytorch?
-
Hello @Balandat, no I haven't, I'm sorry. My interest in this topic was not so much practical as theoretical: in fact, I rediscovered this convergence by chance, and then did some research to find out whether the result was already known to the community...
-
Hello,
I've recently come across an interesting line of work which I wonder if you are aware of, as it was undertaken in the RBF community rather than the GP one. It all stems from the observation that in the limit where a (stationary) kernel's input lengthscale tends to infinity (the "flat limit"), the equations of radial basis function (RBF) interpolation (equivalent to GP regression with zero noise) converge to those of polynomial interpolation. For instance, for an SE kernel in 1D, when sigma --> +infty, RBF interpolation simply boils down to... Lagrange interpolation. From the tests I've made, the convergence is very fast in practice: even sigma = 2 (on a domain of [-1, 1]) yields results very close to the asymptotic value.
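To make this concrete, here is a minimal numpy sketch (my own illustration, not code from the papers) comparing the noise-free SE interpolant with the Lagrange polynomial through the same nodes:

```python
import numpy as np

# Interpolation nodes and targets on [-1, 1]
x = np.linspace(-1.0, 1.0, 7)
y = np.sin(np.pi * x)
x_test = np.linspace(-1.0, 1.0, 200)

def se_interpolant(x, y, x_test, lengthscale):
    # Noise-free GP posterior mean / RBF interpolant with an SE kernel
    K = np.exp(-(x[:, None] - x[None, :]) ** 2 / (2.0 * lengthscale**2))
    k_star = np.exp(-(x_test[:, None] - x[None, :]) ** 2 / (2.0 * lengthscale**2))
    return k_star @ np.linalg.solve(K, y)

# Lagrange interpolant through the same nodes (degree n-1 polynomial)
lagrange = np.polyval(np.polyfit(x, y, len(x) - 1), x_test)

for ls in [0.5, 1.0, 2.0]:
    err = np.max(np.abs(se_interpolant(x, y, x_test, ls) - lagrange))
    print(f"lengthscale = {ls}: max |RBF - Lagrange| = {err:.2e}")
```

The printed gap shrinks quickly as the lengthscale grows - until the ill-conditioning of the kernel matrix takes over, which is precisely what motivates the methods below.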
This has inspired several methods for improving the conditioning of these interpolation systems. In [1] and [2], R. Schaback suggests a preconditioner for this flat limit; in [3], [4] and other works, B. Fornberg introduces several algorithms for stably approximating RBF interpolation, all based on Taylor expansions of the kernel - expansions which, truncated at first order, yield the aforementioned flat limit.
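The ill-conditioning these methods address is easy to reproduce: the condition number of the SE kernel matrix blows up as the lengthscale grows, even though the flat-limit interpolant itself is perfectly well behaved (again my own quick illustration, not from the papers):

```python
import numpy as np

# Condition number of the SE kernel matrix as the lengthscale grows
x = np.linspace(-1.0, 1.0, 10)
sq_dist = (x[:, None] - x[None, :]) ** 2
for ls in [0.5, 1.0, 2.0, 4.0, 8.0]:
    K = np.exp(-sq_dist / (2.0 * ls**2))
    print(f"lengthscale = {ls}: cond(K) = {np.linalg.cond(K):.2e}")
```

In double precision the matrix becomes numerically singular well before the flat limit is reached, hence the appeal of reformulations that avoid forming and solving with K directly.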
I believe this topic is little known to the GP community, and could arouse its interest, at least for its theoretical elegance. Maybe some parts of this work could be of practical use to you, for instance the preconditioner of [1], [2] in situations of very small noise and relatively large lengthscales. I've encountered this situation myself, and it has for instance messed with my use of your LeaveOneOutPseudoLikelihood function from gpytorch.mlls (which performs a Cholesky decomposition instead of your more advanced linear algebra schemes).
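For reference, here is a minimal sketch of the setup I mean (hedged: it assumes a recent gpytorch version, and raising the Cholesky jitter is just one workaround I'm aware of, not a fix from the papers):

```python
import torch
import gpytorch

class ExactGPModel(gpytorch.models.ExactGP):
    def __init__(self, train_x, train_y, likelihood):
        super().__init__(train_x, train_y, likelihood)
        self.mean_module = gpytorch.means.ConstantMean()
        self.covar_module = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel())

    def forward(self, x):
        return gpytorch.distributions.MultivariateNormal(
            self.mean_module(x), self.covar_module(x)
        )

train_x = torch.linspace(-1, 1, 50, dtype=torch.float64)
train_y = torch.sin(3 * train_x)

# Very small noise + large lengthscale: the regime where the Cholesky
# factorization inside LeaveOneOutPseudoLikelihood becomes fragile.
likelihood = gpytorch.likelihoods.GaussianLikelihood(
    noise_constraint=gpytorch.constraints.GreaterThan(1e-8)
).double()
likelihood.noise = 1e-6
model = ExactGPModel(train_x, train_y, likelihood).double()
model.covar_module.base_kernel.lengthscale = 2.0

mll = gpytorch.mlls.LeaveOneOutPseudoLikelihood(likelihood, model)
model.train()
likelihood.train()

# Working in float64 and raising the Cholesky jitter often rescues the
# near-singular case, at the cost of a slightly perturbed objective.
with gpytorch.settings.cholesky_jitter(double=1e-8):
    loss = -mll(model(train_x), train_y)
print(loss.item())
```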
Anyway, thanks for the great work; this library is simply amazing!
Olivier Truffinet
[1] R. Schaback, Multivariate Interpolation by Polynomials and Radial Basis Functions, Constructive Approximation, 2004
[2] R. Schaback, Limit Problems for Interpolation by Analytic Radial Basis Functions, Journal of Computational and Applied Mathematics, 2006
[3] B. Fornberg, E. Larsson and N. Flyer, Stable Computations with Gaussian Radial Basis Functions, SIAM Journal on Scientific Computing, 2011
[4] G. B. Wright and B. Fornberg, Stable Computations with Flat Radial Basis Functions Using Vector-Valued Rational Approximations, Journal of Computational Physics, 2017