Simon Barthelmé - Kernel matrices in the flat limit
Kernel matrices are ubiquitous in statistics and machine learning. Within
Bayesian statistics they occur most often as covariance matrices of Gaussian
processes, in non-parametric or semi-parametric models. Most of the
theoretical work on kernel methods has focused on large-$n$ asymptotics,
characterising the behaviour of kernel matrices as the amount of data
increases. Fixed-sample analysis is much more difficult outside of simple
cases, such as locations on a regular grid.
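For concreteness, here is a minimal sketch of such a kernel matrix as a GP covariance, assuming the common squared-exponential kernel $k(x, x') = \exp(-(x - x')^2 / 2\ell^2)$; the sample locations and length-scale are arbitrary illustrative choices, not taken from the talk:

```python
import numpy as np

def kernel_matrix(x, lengthscale=1.0):
    """Squared-exponential kernel: K[i, j] = exp(-(x_i - x_j)^2 / (2 l^2))."""
    d = x[:, None] - x[None, :]                      # pairwise differences
    return np.exp(-d ** 2 / (2 * lengthscale ** 2))

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(-1, 1, 20))   # a fixed sample of n = 20 locations
K = kernel_matrix(x)                  # covariance of the GP evaluated at x

# One draw from the zero-mean GP; a small jitter keeps the Cholesky factor defined
f = np.linalg.cholesky(K + 1e-10 * np.eye(20)) @ rng.standard_normal(20)
```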

In this talk I will describe a fixed-sample analysis, first studied in the
context of approximation theory by Fornberg & Driscoll (2002), called the
"flat limit". In flat-limit asymptotics, the goal is to characterise kernel
methods as the length-scale of the kernel function tends to infinity, so that
the kernel appears flat over the range of the data. Even though flat kernel
matrices may seem trivial, because their rank tends to one, detailed analysis
reveals very interesting structure. We have been able to show that the
eigenvalues and eigenvectors in that regime are closely related to orthogonal
polynomials or splines, depending on the smoothness of the kernel.
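This eigenstructure is easy to probe numerically. The sketch below (an illustration under assumptions, not code from the talk) takes a smooth Gaussian kernel and checks how well its leading eigenvectors align with the orthonormal polynomials on the sample points as the length-scale grows; the alignments should approach one:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(-1, 1, 20))        # fixed sample, n = 20

def kernel_matrix(x, l):
    d = x[:, None] - x[None, :]
    return np.exp(-d ** 2 / (2 * l ** 2))  # Gaussian (smooth) kernel

# Orthonormal polynomials on the sample points: QR of the Vandermonde matrix
Q, _ = np.linalg.qr(np.vander(x, 4, increasing=True))

for l in [1.0, 3.0, 10.0]:
    w, U = np.linalg.eigh(kernel_matrix(x, l))  # eigenvalues in ascending order
    U = U[:, ::-1]                              # put leading eigenvectors first
    # |cosine| between the k-th eigenvector and the degree-k orthonormal polynomial
    align = [abs(U[:, k] @ Q[:, k]) for k in range(4)]
    print(f"l = {l:4.1f}  alignments:", np.round(align, 3))
```

Keeping the length-scale moderate here is deliberate: nearly flat kernel matrices are severely ill-conditioned in floating point, so very large length-scales would drown the trailing eigenvectors in rounding error.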

With the results on the spectrum of kernel matrices in hand, one may study a
wide range of kernel methods. In this talk I'll describe an application to
Gaussian process regression. The results show that, in the flat limit,
Gaussian process regression tends to (multivariate) polynomial regression or
(polyharmonic) spline regression, depending on the kernel. Since these methods
are simpler, the results may have practical implications for GP regression.
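A closely related effect can be seen in the noiseless (interpolation) case, which goes back to the Fornberg & Driscoll reference above: as the length-scale grows, the Gaussian-kernel interpolant approaches the polynomial interpolant through the same points. A minimal numerical sketch, with illustrative data of my own choosing:

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(-1, 1, 6))    # n = 6 interpolation nodes
y = np.sin(2 * x)                     # noiseless observations
xs = np.linspace(-1, 1, 200)          # evaluation grid

def k(a, b, l):
    """Gaussian kernel with length-scale l."""
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * l ** 2))

# The degree n-1 polynomial interpolant through the same points
poly = np.polyval(np.polyfit(x, y, len(x) - 1), xs)

for l in [1.0, 3.0, 10.0]:
    gp = k(xs, x, l) @ np.linalg.solve(k(x, x, l), y)   # kernel interpolant
    print(f"l = {l:4.1f}  max |kernel - poly| = {np.max(np.abs(gp - poly)):.2e}")
```

The printed gap should shrink as the length-scale increases, illustrating the convergence the talk generalises to the regression setting.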

May 19, 2021 12:00 PM in Paris
