Archive for Approximate Bayesian computation

computing Bayes 2.0

Posted in Books, Statistics, University life with tags Approximate Bayesian computation, arXiv, ASSA, Australia, Bayesian computing, MCMC, Monash University, Monte Carlo methods, Monte Carlo Statistical Methods, review, revision, survey on December 11, 2020 by xi'an

Our survey paper on “computing Bayes”, written with my friends Gael Martin [who led this project most efficiently!] and David Frazier, has now been revised and resubmitted, with the new version now available on arXiv. Recognising that the entire range of the literature cannot be encompassed within a single review, especially regarding the theoretical advances made on MCMC, the revised version is more focussed on the approximate solutions (when considering MCMC as “exact”!). As put by one of the referees [who were all very supportive of the paper], “the authors are very brave. To cover in a review paper the computational methods for Bayesian inference is indeed a monumental task and in a way an hopeless one”. This is also the opportunity to congratulate Gael on her election to the Academy of Social Sciences of Australia last month. (Along with her colleague from Monash, Rob Hyndman.)

A is for…
Posted in Statistics with tags ABC, Approximate Bayesian computation, cartoon, lockdown, pandemic, The New Yorker on December 10, 2020 by xi'an

ABC with inflated tolerance
Posted in Mountains, pictures, Statistics, Travel, University life with tags ABC, ABC-MCMC, Approximate Bayesian computation, Finland, MCMC, One World ABC Seminar, tolerance, University of Jyväskylä, webinar on December 8, 2020 by xi'an

For the last One World ABC seminar of the year 2020, this coming Thursday, Matti Vihola is speaking from Finland on his recent Biometrika paper “On the use of ABC-MCMC with inflated tolerance and post-correction”. To attend the talk, all that is required is a registration on the seminar webpage.
The Markov chain Monte Carlo (MCMC) implementation of ABC is often sensitive to the tolerance parameter: low tolerance leads to poor mixing and large tolerance entails excess bias. We propose an approach that involves using a relatively large tolerance for the MCMC sampler to ensure sufficient mixing, and post-processing of the output which leads to estimators for a range of finer tolerances. We introduce an approximate confidence interval for the related post-corrected estimators and propose an adaptive ABC-MCMC algorithm, which finds a balanced tolerance level automatically based on acceptance rate optimization. Our experiments suggest that post-processing-based estimators can perform better than direct MCMC targeting a fine tolerance, that our confidence intervals are reliable, and that our adaptive algorithm can lead to reliable inference with little user specification.
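To fix ideas, here is a minimal Python sketch of the inflated-tolerance-plus-post-processing idea described in the abstract, on a toy Gaussian model with a uniform ABC kernel and a flat prior: run the ABC-MCMC chain with a deliberately large tolerance, store the distances, and re-use the same output to produce estimators for finer tolerances by filtering on the stored distances. The model, tolerances and tuning constants below are made up for illustration; the actual post-corrected estimators, confidence intervals and adaptive tuning of the paper are more involved.

import numpy as np

rng = np.random.default_rng(0)

# Toy model: n iid N(theta, 1) observations, summary statistic = sample mean.
n, theta_true = 50, 1.5
y_obs = rng.normal(theta_true, 1.0, n)
s_obs = y_obs.mean()

def distance(theta):
    """Distance between simulated and observed summary statistics."""
    s_sim = rng.normal(theta, 1.0, n).mean()
    return abs(s_sim - s_obs)

def abc_mcmc(n_iter, eps, sigma_prop=0.5):
    """ABC-MCMC with a uniform kernel, flat prior and tolerance eps."""
    theta = s_obs                      # start near the data to avoid a stuck start
    d = distance(theta)
    thetas = np.empty(n_iter)
    dists = np.empty(n_iter)
    for t in range(n_iter):
        prop = theta + sigma_prop * rng.normal()
        d_prop = distance(prop)
        # flat prior + symmetric proposal: accept iff the proposal is within tolerance
        if d_prop <= eps:
            theta, d = prop, d_prop
        thetas[t], dists[t] = theta, d
    return thetas, dists

# Run the chain with a deliberately inflated tolerance to keep mixing healthy.
thetas, dists = abc_mcmc(n_iter=50_000, eps=0.5)

# Post-processing: re-use the same output to form estimators for a range of
# finer tolerances, here by simply filtering on the stored distances.
for delta in (0.5, 0.2, 0.1, 0.05):
    keep = dists <= delta
    if keep.any():
        print(f"delta={delta:4.2f}: post-corrected posterior mean "
              f"{thetas[keep].mean():.3f} (kept {keep.mean():.1%} of the chain)")

Shrinking delta trades Monte Carlo precision (fewer retained draws) against ABC bias, which is exactly the balance the adaptive tolerance choice in the paper is meant to automate.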