The Surprisingly Overlooked Efficiency of Sequential Monte Carlo (and how to make it even more efficient) - Nicolas Chopin
SMC (Sequential Monte Carlo) samplers present several clear advantages over MCMC (Markov chain Monte Carlo). In particular, they require little tuning (or, more precisely, their tuning is easy to automate for a given problem); they are easy to parallelize; and they allow for estimating the marginal likelihood of the target distribution. In this talk, I will discuss how SMC may be used in various problems in machine learning and computational statistics, and why it remains slightly overlooked in these areas. One possible reason (among several others) is a difficulty with SMC samplers that may have been overlooked in the literature: to obtain optimal performance, one may need to apply a large number of MCMC steps at each iteration.
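
To make that structure concrete, here is a minimal, self-contained sketch of a generic tempered SMC sampler in Python/NumPy. It only illustrates the usual reweight/resample/move pattern and is not the speaker's implementation; the toy bimodal target, the random-walk step size, the fixed temperature ladder, and the number of MCMC moves per iteration (n_mcmc, whose choice is exactly the difficulty mentioned above) are arbitrary choices made for the example.

import numpy as np

rng = np.random.default_rng(0)

def log_prior(x):                 # reference distribution: N(0, 3^2), unnormalised
    return -0.5 * (x / 3.0) ** 2

def log_target(x):                # toy bimodal target, unnormalised
    return np.logaddexp(-0.5 * (x - 3.0) ** 2, -0.5 * (x + 3.0) ** 2)

def smc_sampler(n_particles=1000, n_steps=20, n_mcmc=5, rw_scale=0.5):
    lambdas = np.linspace(0.0, 1.0, n_steps + 1)      # fixed tempering ladder
    x = 3.0 * rng.standard_normal(n_particles)        # start with draws from the prior
    log_z = 0.0                                       # running log normalising-constant estimate
    for lam_prev, lam in zip(lambdas[:-1], lambdas[1:]):
        # 1. Reweight: incremental weights for moving from temperature lam_prev to lam.
        log_w = (lam - lam_prev) * (log_target(x) - log_prior(x))
        log_z += np.log(np.mean(np.exp(log_w - log_w.max()))) + log_w.max()
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        # 2. Resample (multinomial, and at every step, for simplicity).
        x = x[rng.choice(n_particles, size=n_particles, p=w)]
        # 3. Move: n_mcmc random-walk Metropolis steps targeting the tempered density.
        log_tempered = lambda y: (1.0 - lam) * log_prior(y) + lam * log_target(y)
        for _ in range(n_mcmc):
            prop = x + rw_scale * rng.standard_normal(n_particles)
            accept = np.log(rng.random(n_particles)) < log_tempered(prop) - log_tempered(x)
            x = np.where(accept, prop, x)
    return x, log_z

particles, log_evidence = smc_sampler()
print(log_evidence)   # estimate of the log ratio of normalising constants

Increasing n_mcmc improves mixing of the move step but multiplies the cost of every temperature stage, which is the tuning tension referred to above.
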
I will also present a recent paper (joint work with Hai-Dang Dau) in which we develop a new type of SMC sampler where all the intermediate Markov steps are used as "particles". This typically makes the resulting algorithm more efficient and, more importantly, much more robust to user choices, and thus ultimately easier to use.
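
As a rough reading of that idea (my own sketch based on the abstract, not the paper's exact algorithm, and omitting the reweighting needed to make the resulting estimators valid), the move step above could be replaced by something like the following, which resamples M starting points and keeps every state of each length-P chain as a particle. It reuses numpy and the random-walk proposal from the sketch above.

def move_waste_free(x, log_w, n_keep, chain_len, log_tempered, rng, rw_scale=0.5):
    # Resample M = n_keep starting points from the current weighted particles.
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    cur = x[rng.choice(x.size, size=n_keep, p=w)]
    states = [cur]
    # Run a length-P chain from each start and keep every intermediate state.
    for _ in range(chain_len - 1):
        prop = cur + rw_scale * rng.standard_normal(n_keep)
        accept = np.log(rng.random(n_keep)) < log_tempered(prop) - log_tempered(cur)
        cur = np.where(accept, prop, cur)
        states.append(cur)
    return np.concatenate(states)    # all M * P states form the next particle set

How these M * P particles are then weighted at the next temperature is the subject of the paper presented in the talk.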

Dec 2, 2020 05:00 PM in Paris



Speakers

Nicolas Chopin
Professor @ENSAE
Nicolas Chopin (PhD, Université Pierre et Marie Curie, Paris, 2003) has been a Professor of Statistics at ENSAE, Paris, since 2006. He was previously a lecturer at Bristol University (UK). He is a current or former associate editor for Annals of Statistics, Biometrika, Journal of the Royal Statistical Society, Statistics and Computing, and Statistical Methods & Applications. He has served as a member (2013-14) and secretary (2015-16) of the research section committee of the Royal Statistical Society. He received a Savage Award for his doctoral dissertation in 2002. His research interests include computational statistics, Bayesian inference, and machine learning.