Anna Korba - Variational Inference of overparameterized Bayesian Neural Networks: a theoretical and empirical study
This work studies Variational Inference (VI) for training Bayesian Neural Networks (BNNs) in the overparameterized regime, i.e., when the number of neurons tends to infinity. More specifically, we consider overparameterized two-layer BNNs and point out a critical issue in mean-field VI training. The problem arises from the decomposition of the evidence lower bound (ELBO) into two terms: the first corresponding to the likelihood function of the model, and the second to the Kullback-Leibler (KL) divergence between the prior distribution and the variational posterior. In particular, we show both theoretically and empirically that a trade-off between these two terms survives in the overparameterized regime only when the KL term is appropriately re-scaled with respect to the ratio between the number of observations and the number of neurons. We also illustrate our theoretical results with numerical experiments that highlight the critical choice of this ratio.
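As a minimal sketch of the re-scaled objective the abstract describes: the ELBO decomposes into an expected log-likelihood term and a KL term, and the KL term is down-weighted by the observations-to-neurons ratio. The function names and the exact form of the scaling factor below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def gaussian_kl(mu_q, sig_q, mu_p, sig_p):
    """Closed-form KL(q || p) between diagonal Gaussians, summed over dimensions."""
    return float(np.sum(
        np.log(sig_p / sig_q)
        + (sig_q**2 + (mu_q - mu_p)**2) / (2.0 * sig_p**2)
        - 0.5
    ))

def scaled_elbo(expected_loglik, kl, n_obs, n_neurons):
    """ELBO with the KL term re-weighted by lam = n_obs / n_neurons.

    This scaling is an assumption for illustration: without some such
    re-weighting, the KL term (which grows with the number of neurons)
    would dominate the objective in the overparameterized regime.
    """
    lam = n_obs / n_neurons
    return expected_loglik - lam * kl
```

For example, with a variational posterior equal to the prior the KL vanishes and the scaled ELBO reduces to the expected log-likelihood alone.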

May 31, 2022 05:00 PM in Paris



Anna Korba
Assistant Professor @ENSAE/CREST
Anna Korba has been an Assistant Professor in the statistics department at ENSAE/CREST since September 2020. She obtained her PhD from Telecom ParisTech in 2018 under the supervision of Stephan Clémençon. From December 2018 to August 2020, she was a postdoctoral researcher at University College London, working with Arthur Gretton in the Gatsby Computational Neuroscience Unit.