Manage episode 362954552 series 2635823
Max Kochurov’s State of Bayes Lecture Series: https://www.youtube.com/playlist?list=PL1iMFW7frOOsh5KOcfvKWM12bjh8zs9BQ
Sign up here for upcoming lessons: https://www.meetup.com/pymc-labs-online-meetup/events/293101751/
We talk a lot about different MCMC methods on this podcast, because they are the workhorses of Bayesian models. But other methods exist to infer the posterior distributions of your models — like Sequential Monte Carlo (SMC) for instance. You’ve never heard of SMC? Well perfect, because Nicolas Chopin is gonna tell you all about it in this episode!
A lecturer at ENSAE, the French national school of statistics and economics, since 2006, Nicolas is one of the world experts on SMC. Before that, he graduated from Ecole Polytechnique and… ENSAE, where he did his PhD from 1999 to 2003.
Outside of work, Nicolas enjoys spending time with his family, practicing aikido, and reading a lot of books.
Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Chad Scherrer, Nathaniel Neitzke, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Raul Maldonado, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Trey Causey, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady and Kurt TeKolste.
Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)
Links from the show:
- Old episodes relevant to these topics:
- LBS #14, Hidden Markov Models & Statistical Ecology, with Vianey Leos-Barajas: https://learnbayesstats.com/episode/14-hidden-markov-models-statistical-ecology-with-vianey-leos-barajas/
- LBS #41, Think Bayes, with Allen Downey: https://learnbayesstats.com/episode/41-think-bayes-allen-downey/
- Nicolas’ show notes:
- Nicolas on Mastodon: firstname.lastname@example.org
- 2-hour introduction to particle filters: https://www.youtube.com/watch?v=mE_PJ9ASc8Y
- Nicolas’ website: https://nchopin.github.io/
- Nicolas on GitHub: https://github.com/nchopin
- Nicolas on LinkedIn: https://www.linkedin.com/in/nicolas-chopin-442a78102/
- Nicolas’ blog (shared with others): https://statisfaction.wordpress.com/
- INLA original paper: https://people.bath.ac.uk/man54/SAMBa/ITTs/ITT2/EDF/INLARueetal2009.pdf
- Nicolas’ book, An introduction to Sequential Monte Carlo: https://nchopin.github.io/books.html
- Laplace’s Demon, A Seminar Series about Bayesian Machine Learning at Scale: https://ailab.criteo.com/laplaces-demon-bayesian-machine-learning-at-scale/
- Paper about Expectation Propagation, Leave Pima Indians Alone – Binary Regression as a Benchmark for Bayesian Computation: https://projecteuclid.org/journals/statistical-science/volume-32/issue-1/Leave-Pima-Indians-Alone--Binary-Regression-as-a-Benchmark/10.1214/16-STS581.full
- Blackjax website: https://blackjax-devs.github.io/blackjax/
In episode 82, Nicolas Chopin is our guest. He is a graduate of the Ecole Polytechnique and currently lectures at ENSAE in France.
He is a specialist in Sequential Monte Carlo (SMC) samplers and explains in detail what they are, clearing up some confusion about what SMC stands for and when to use these methods.
We discuss the advantages of SMC over other samplers commonly used for Bayesian models, such as MCMC or Gibbs samplers.
Besides a detailed look at SMC, we also cover INLA, which stands for Integrated Nested Laplace Approximation.
INLA can be a fast, approximate inference method for specific kinds of models. It works well for geographic data and relationships, such as relationships between regions in a country.
We also discuss the difficulties with, and the future of, SMC, INLA, and probabilistic sampling in general.
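If you want a concrete feel for the SMC/particle-filter ideas discussed in the episode before diving into Nicolas’ book or his `particles` library, here is a minimal sketch of a bootstrap particle filter on a toy 1-D linear-Gaussian state-space model. This is an illustration written for these show notes, not code from the episode or from `particles`; the model parameters (0.9 autoregression, unit noises) and particle count are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy linear-Gaussian state-space model (assumed for illustration):
#   x_t = 0.9 * x_{t-1} + N(0, 1)   (hidden state)
#   y_t = x_t + N(0, 1)             (observation)
T = 100
x = np.zeros(T)
for t in range(1, T):
    x[t] = 0.9 * x[t - 1] + rng.normal()
y = x + rng.normal(size=T)

# Bootstrap particle filter: propagate particles through the state
# equation, weight them by the observation likelihood, then resample.
N = 1000
particles = rng.normal(size=N)  # initial particles from the stationary-ish prior
means = np.zeros(T)
for t in range(T):
    if t > 0:
        particles = 0.9 * particles + rng.normal(size=N)  # propagate
    logw = -0.5 * (y[t] - particles) ** 2  # Gaussian log-likelihood, up to a constant
    w = np.exp(logw - logw.max())
    w /= w.sum()
    means[t] = np.sum(w * particles)       # filtering mean E[x_t | y_{1:t}]
    idx = rng.choice(N, size=N, p=w)       # multinomial resampling
    particles = particles[idx]

rmse = np.sqrt(np.mean((means - x) ** 2))
print(f"filtering RMSE: {rmse:.2f}")
```

Because the observation noise has standard deviation 1, the filtering means should track the hidden states with an error noticeably below 1. Real SMC implementations (like those in Nicolas’ book and library) use adaptive resampling, better resampling schemes, and the SMC-sampler generalization to static Bayesian inference, but the propagate/weight/resample loop above is the core idea.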
Please note that the transcript was generated automatically and may therefore contain errors. Feel free to reach out if you're willing to correct them.
This podcast uses the following third-party services for analysis:
Podcorn - https://podcorn.com/privacy