Archive for webinar

dependable AI/ML [France is AI]

Posted in Statistics on January 21, 2021 by xi'an

the surprisingly overlooked efficiency of SMC

Posted in Books, Statistics, University life on December 15, 2020 by xi'an

At the Laplace demon’s seminar today (whose cool name I cannot tire of!), Nicolas Chopin gave a webinar with the above equally cool title. And the first slide debunking myths about SMCs:

The second part of the talk was about a recent arXival I had missed, written by Nicolas with his student Hai-Dang Dau, on increasing the number of MCMC steps when moving the particles, called waste-free SMC: only a fraction of the particles is resampled and moved, but each is moved through enough MCMC steps that the resulting system achieves a sort of independence from previous iterations of the SMC. (Hai-Dang Dau and Nicolas Chopin had to tailor their own convergence proof for this modification of the usual SMC, producing a single-run assessment of the asymptotic variance.)
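To make the waste-free idea concrete, here is a rough sketch of one resample-move step as I understand it, not the authors’ code: resample a small number M of particles, push each through P = N/M Metropolis steps, and keep every intermediate state as the new particle system. The Gaussian toy target, the random-walk kernel, and all function names are illustrative assumptions.

```python
import numpy as np

def waste_free_resample_move(particles, logweights, logtarget,
                             n_resampled, rw_scale=0.5, rng=None):
    """Sketch of one waste-free SMC step: resample M starting points,
    move each through P = N/M Metropolis steps, and keep every
    intermediate state so the particle system is of size N again."""
    rng = np.random.default_rng() if rng is None else rng
    N = len(particles)
    M = n_resampled
    P = N // M                              # MCMC steps per resampled particle
    # multinomial resampling of the M starting points
    w = np.exp(logweights - logweights.max())
    w /= w.sum()
    starts = particles[rng.choice(N, size=M, p=w)]
    new_particles = []
    for x in starts:
        lp = logtarget(x)
        for _ in range(P):                  # random-walk Metropolis moves
            prop = x + rw_scale * rng.standard_normal()
            lp_prop = logtarget(prop)
            if np.log(rng.uniform()) < lp_prop - lp:
                x, lp = prop, lp_prop
            new_particles.append(x)         # keep every intermediate state
    return np.array(new_particles)          # equally weighted, size N

# toy usage: reweight a diffuse N(0,25) cloud towards a standard normal target
rng = np.random.default_rng(0)
x0 = rng.normal(0, 5, size=1000)
logw = -0.5 * x0**2 + 0.5 * (x0 / 5)**2     # log target minus log proposal
x1 = waste_free_resample_move(x0, logw, lambda x: -0.5 * x**2,
                              n_resampled=50, rng=rng)
```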

On the side, I heard about a very neat (if possibly toyish) example of estimating the number of Latin squares:

And the other item of information is that Nicolas’ and Omiros’ book, An Introduction to Sequential Monte Carlo, has now appeared! (Looking forward to reading the parts I had not yet read.)

ABC with inflated tolerance

Posted in Mountains, pictures, Statistics, Travel, University life on December 8, 2020 by xi'an

[photo: Joutsniemi, Leivonmäki national park, Finland, by Jukka Paakkinen]

For the last One World ABC seminar of the year 2020, this coming Thursday, Matti Vihola is speaking from Finland on his recent Biometrika paper “On the use of ABC-MCMC with inflated tolerance and post-correction”. To attend the talk, all that is required is to register on the seminar webpage.

The Markov chain Monte Carlo (MCMC) implementation of ABC is often sensitive to the tolerance parameter: low tolerance leads to poor mixing and large tolerance entails excess bias. We propose an approach that involves using a relatively large tolerance for the MCMC sampler to ensure sufficient mixing, and post-processing of the output which leads to estimators for a range of finer tolerances. We introduce an approximate confidence interval for the related post-corrected estimators and propose an adaptive ABC-MCMC algorithm, which finds a balanced tolerance level automatically based on acceptance rate optimization. Our experiments suggest that post-processing-based estimators can perform better than direct MCMC targeting a fine tolerance, that our confidence intervals are reliable, and that our adaptive algorithm can lead to reliable inference with little user specification.
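As a rough illustration of the post-processing step described in the abstract (my own sketch, not the paper’s algorithm): once the ABC-MCMC chain has been run at an inflated tolerance with the simulated distances stored, estimators for any finer tolerance can be read off by keeping only the states whose distance falls below it, without re-running the chain. The function and variable names below are hypothetical.

```python
import numpy as np

def post_corrected_estimate(theta_chain, dist_chain, f, eps_fine):
    """Sketch of tolerance post-correction: from an ABC-MCMC run at a
    large (inflated) tolerance, re-estimate E[f(theta)] under a finer
    tolerance by retaining only the states whose stored distance is
    below that finer tolerance."""
    keep = np.asarray(dist_chain) <= eps_fine
    if not keep.any():
        raise ValueError("no chain state survives this tolerance")
    return np.mean([f(t) for t in np.asarray(theta_chain)[keep]])

# hypothetical usage with a stored chain and a grid of finer tolerances
# theta_chain, dist_chain = run_abc_mcmc(eps_large)   # not shown
# for eps in (0.5, 0.2, 0.1):
#     print(eps, post_corrected_estimate(theta_chain, dist_chain, np.mean, eps))
```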

my talk in Newcastle

Posted in Mountains, pictures, Running, Statistics, Travel, University life on November 13, 2020 by xi'an

I will be talking (or rather zooming) at the statistics seminar at the University of Newcastle this afternoon on the paper Component-wise approximate Bayesian computation via Gibbs-like steps that just got accepted by Biometrika (yay!). Sadly not been there for real, as I would have definitely enjoyed reuniting with friends and visiting again this multi-layered city after discovering it for the RSS meeting of 2013, which I attended along with Jim Hobert and where I re-discussed the re-Read DIC paper. Before traveling south to Warwick to start my new appointment there. (I started with a picture of Seoul taken from the slopes of Gwanaksan about a year ago as a reminder of how much had happened or failed to happen over the past year…Writing 2019 as the year was unintentional but reflected as well on the distortion of time induced by the lockdowns!)
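For readers curious about the Gibbs-like flavour of the paper’s title, here is a very simplified sketch of the idea of component-wise ABC, written by me for illustration only and not the algorithm of the paper: each parameter component is refreshed in turn by an ABC rejection step driven by a component-specific summary statistic, the other components being held fixed. Independent component priors and all names below are assumptions.

```python
import numpy as np

def abc_gibbs(theta0, priors, simulate, summaries, data_summ, eps, n_iter, rng=None):
    """Toy Gibbs-like ABC sweep: component j is refreshed by accepting the
    first prior draw whose simulated data match the observed data on the
    j-th summary within tolerance eps, the other components held fixed."""
    rng = np.random.default_rng() if rng is None else rng
    theta = np.array(theta0, dtype=float)
    chain = []
    for _ in range(n_iter):
        for j in range(len(theta)):
            while True:                      # ABC rejection step for component j
                cand = theta.copy()
                cand[j] = priors[j](rng)     # draw component j from its prior
                sim = simulate(cand, rng)
                if abs(summaries[j](sim) - data_summ[j]) <= eps:
                    theta = cand
                    break
        chain.append(theta.copy())
    return np.array(chain)
```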


David Frazier’s talk on One World ABC seminar tomorrow [watch for the time!]

Posted in pictures, Statistics, Travel, University life on October 14, 2020 by xi'an

My friend and coauthor from Melbourne is giving the One World ABC seminar tomorrow, on Robust and Efficient Approximate Bayesian Computation: A Minimum Distance Approach. He will be talking at 10:30 UK time, 11:30 Brussels time, and 20:30 Melbourne time! Be on time!