Archive for ABC

Scott Sisson’s ABC seminar in Paris [All about that Bayes]

Posted in pictures, Statistics, Travel, University life on January 20, 2020 by xi'an

At the “All about that Bayes” seminar tomorrow (Tuesday 21, 3 p.m., room 42, AgroParisTech, 16 rue Claude Bernard, Paris 5ème), Scott Sisson, of the School of Mathematics and Statistics at UNSW and visiting Paris-Dauphine this month, will give a talk on

Approximate posteriors and data for Bayesian inference

Abstract
For various reasons, including large datasets and complex models, approximate inference is becoming increasingly common. In this talk I will provide three vignettes of recent work. These cover a) approximate Bayesian computation for Gaussian process density estimation, b) likelihood-free Gibbs sampling, and c) MCMC for approximate (rounded) data.
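For readers new to the area, the basic rejection-ABC scheme underlying such vignettes fits in a few lines; the Gaussian toy model, uniform prior, and tolerance below are my own illustrative choices, not Sisson's setting:

```python
import numpy as np

rng = np.random.default_rng(0)

# pretend-observed data from N(theta_true, 1); summary = sample mean
theta_true = 2.0
y_obs = rng.normal(theta_true, 1.0, size=100)
s_obs = y_obs.mean()

def rejection_abc(n_props=50_000, eps=0.05):
    """Textbook rejection ABC: keep the prior draws whose simulated
    summary lands within eps of the observed summary."""
    theta = rng.uniform(-5.0, 5.0, size=n_props)                   # prior draws
    s_sim = rng.normal(theta, 1.0, size=(100, n_props)).mean(axis=0)
    return theta[np.abs(s_sim - s_obs) <= eps]

post = rejection_abc()   # approximate posterior sample, concentrated near s_obs
```

Shrinking eps trades acceptance rate against approximation quality, which is precisely what the more elaborate schemes in the talk try to sidestep.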

BayesComp’20

Posted in Books, pictures, Statistics, Travel, University life on January 10, 2020 by xi'an

First, I really have to congratulate my friend Jim Hobert for a great organisation of the meeting, adopting my favourite minimalist principles (no name tag, no “goodies” apart from the conference schedule, no official talks). Without any pretense at objectivity, I also very much appreciated the range of topics and the sweet frustration of having to choose between two or three sessions at any given time. Here are some notes taken during some of the talks (with no implicit judgement on the talks not mentioned, cf. the above frustration, as well as very short nights making sudden lapses in concentration highly likely).

On Day 1, Paul Fearnhead’s inaugural plenary talk was on continuous time Monte Carlo methods, mostly bouncy particle and zig-zag samplers, with a detailed explanation of the simulation of the switching times, which likely brought the audience up to speed even if they had never heard of them. And an opening on PDMPs used as equivalents to reversible jump MCMC, reminding me of the continuous time (point process) solutions of Matthew Stephens for mixture inference (and of Preston, Ripley, Møller).
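The simulation of the switching times can be illustrated on the simplest possible case, a one-dimensional zig-zag sampler for a standard Gaussian target, where the integrated event rate inverts in closed form; this toy sketch is mine, not taken from the talk:

```python
import numpy as np

rng = np.random.default_rng(0)

def zigzag_1d_gaussian(n_events=10_000):
    """Zig-zag sampler for pi(x) ∝ exp(-x²/2), velocity theta in {-1, +1}.

    The switching rate is lambda(x, theta) = max(0, theta * x), whose
    integrated rate Lambda(T) is piecewise quadratic and inverts exactly,
    so each switching time is simulated without thinning."""
    x, theta, t = 0.0, 1.0, 0.0
    times, xs, thetas = [0.0], [x], [theta]
    for _ in range(n_events):
        u = rng.exponential(1.0)
        a = theta * x
        if a >= 0:
            tau = -a + np.sqrt(a * a + 2 * u)   # solve a*T + T²/2 = u
        else:
            tau = -a + np.sqrt(2 * u)           # rate is zero until t = -a
        x += theta * tau                        # deterministic linear flow
        theta = -theta                          # flip velocity at the event
        t += tau
        times.append(t); xs.append(x); thetas.append(theta)
    return np.array(times), np.array(xs), np.array(thetas)

def second_moment(times, xs, thetas):
    """Time-average of x(t)² along the piecewise-linear trajectory."""
    total = 0.0
    for i in range(len(times) - 1):
        tau = times[i + 1] - times[i]
        x, th = xs[i], thetas[i]
        # integral of (x + th*s)² over s in [0, tau], using th² = 1
        total += x * x * tau + x * th * tau**2 + tau**3 / 3.0
    return total / times[-1]

times, xs, thetas = zigzag_1d_gaussian()
```

The time-average of x² along the trajectory should approach the target variance, here 1; for general targets the rate no longer inverts in closed form and one resorts to thinning with an upper bound.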

The same morn, I heard of highly efficient techniques to handle very large matrices and p>n variable selection from Akihiko Nishimura, and from Ruth Baker of a delayed-acceptance ABC using a cheap proxy model, somewhat different from indirect inference. I found the reliance on the ESS somewhat puzzling, given the intractability of the likelihood (and the low reliability of the frequency estimate), and the lack of connection with the “real” posterior. In the same ABC session, Umberto Picchini spoke on joint work with Richard Everitt (Warwick) linking ABC and pseudo-marginal MCMC via the bootstrap. Actually, the notion of an ABC likelihood was already proposed, as pseudo-marginal ABC, by Anthony Lee, Christophe Andrieu and Arnaud Doucet in their discussion of Fearnhead and Prangle (2012), but I wonder at the focus on unbiasedness when the quantity is not the truth, i.e., the “real” likelihood. It would seem more appropriate to attempt better kernel estimates of the distribution of the summary itself. The same session also involved David Frazier, who linked our work on ABC for misspecified models with an on-going investigation of synthetic likelihood.

Later, there was a surprise occurrence of the Bernoulli factory in a talk by Radu Herbei on Gaussian process priors with accept-reject algorithms, leading to exact MCMC, although the computing implementation remains uncertain. And several discussions during the poster session, including one on the planning of a 2021 workshop in Oaxaca centred on objective Bayes advances, as we received the acceptance of our proposal by BIRS today!

On Day 2, David Blei gave a plenary introduction to variational Bayes inference and latent Dirichlet allocation, somewhat too introductory for my taste, although other participants enjoyed the exposition. He also mentioned a recent JASA paper on the frequentist consistency of variational Bayes that I should check. Speaking later with PhD students, I heard they really enjoyed this opening on an area they did not know that well.

A talk by Kengo Kamatani (whom I visited last summer) presented improved ergodicity rates for heavy-tailed targets and Crank-Nicolson modifications of the random walk proposal (which use an AR(1) representation instead of the random walk), with the clever idea of adding the scale of the proposal as an extra parameter with a prior of its own, gaining one order of magnitude in the convergence speed (i.e., from d to 1 and from d² to d, where d is the dimension), which is quite impressive (and just published in JAP).

Veronika Rockova linked Bayesian variable selection and machine learning via ABC, with conditions on the prior for model consistency. And a novel approach using part of the data to learn an ABC partial posterior, which reminded me of the partial Bayes factors of the 1990s, although it is presumably unrelated. And a replacement of the original rejection ABC by multi-armed bandits, where each variable is represented by an arm, called ABC Bayesian forests. Recalling the simulation trick behind Thompson's approach, reproduced for the inclusion or exclusion of variates, and producing a fixed estimate of the (marginal) inclusion probabilities, which makes it sound like a prior-feedback form of empirical Bayes. This was followed by a talk by Gregor Kastner on MCMC handling of large time series with specific priors and a massive number of parameters.
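The Crank-Nicolson (AR(1)) proposal mentioned above can be sketched in a few lines; the toy likelihood, dimension, and tuning below are my own choices, picked only to exhibit the dimension-robust acceptance rate:

```python
import numpy as np

rng = np.random.default_rng(3)

d, tau = 100, 0.1            # dimension; precision of a toy likelihood

def log_lik(x):
    """Toy log-likelihood tilting the N(0, I_d) reference measure."""
    return -0.5 * tau * np.dot(x, x)

def pcn_chain(n_iter=2_000, rho=0.95):
    """MCMC with the (preconditioned) Crank-Nicolson proposal

        x' = rho * x + sqrt(1 - rho²) * xi,   xi ~ N(0, I_d),

    an AR(1) move that leaves the N(0, I_d) reference invariant, so the
    acceptance ratio involves only the likelihood, not the prior; this is
    what keeps the acceptance rate stable as d grows, unlike the plain
    random walk."""
    x = rng.standard_normal(d)
    acc = 0
    for _ in range(n_iter):
        xp = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(d)
        if np.log(rng.uniform()) < log_lik(xp) - log_lik(x):
            x, acc = xp, acc + 1
    return x, acc / n_iter

x_last, acc_rate = pcn_chain()   # acceptance stays high even in d = 100
```

Treating the scale rho itself as a parameter with its own prior, as in the talk, would add a Gibbs-type update on rho on top of this loop.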

The afternoon also had a wealth of exciting talks and missed opportunities (in the other sessions!), ending up with a strong if unintended French bias, since I listened to Christophe Andrieu, Gabriel Stoltz, Umut Simsekli, and Manon Michel on different continuous-time processes, with Umut linking GANs, multidimensional optimal transport, sliced-Wasserstein distances, generative models, and new stochastic differential equations. Manon Michel gave a highly intuitive talk on creating non-reversibility, getting rid of refreshment rates in PDMPs to kill any form of reversibility.

ABC in Grenoble, 19-20 March 2020 [registration open]

Posted in Mountains, Statistics, Travel, University life on January 7, 2020 by xi'an

Reminding readers that the next occurrence of the “ABC in…” workshops will very soon take place in Grenoble, France, on 19-20 March 2020. Confirmed speakers and sessions (with more to come) are

Misspecified models

Links with Machine Learning

  • Flora Jay (Université d’Orsay, France) TBA
  • Pierre-Alexandre Mattei (Inria Sophia Antipolis – Méditerranée, France) Partially Exchangeable Networks and Architectures for Learning Summary Statistics in Approximate Bayesian Computation
  • Dennis Prangle (Newcastle University, UK) Scalable approximate inference for state space models with normalising flows

As in most earlier editions of the “ABC in…” workshops (ABC in Paris, London, Roma, &tc.), we are aiming at a workshop atmosphere and, thanks to local sponsors, there are no registration fees. Registration is nonetheless compulsory, and now open!

I also remind ‘Og’s readers that Grenoble can easily be reached by fast trains from Paris, Roissy, Geneva and Lyon. (There are also flights to Grenoble airport from Warwick, as well as Bristol, Edinburgh, London, Manchester, Rotterdam, Stockholm, and Warsaw, but this is less convenient than flying to Lyon Saint-Exupéry airport and catching a direct train there.) To add to the appeal of the place, the workshop occurs during the skiing season, with three mountain ranges in close vicinity, making ABski a genuine possibility for the weekend after!

repulsive postdoc!

Posted in Statistics on December 20, 2019 by xi'an

Rémi Bardenet has been awarded an ERC grant on Monte Carlo integration via repulsive point processes and is now looking for a postdoc starting next March. (Our own ABSINT ANR grant still has an open offer of a postdoctoral position on approximate Bayesian methods, feel free to contact me if potentially interested.)

BayesComp 2020 at a glance

Posted in Statistics, Travel, University life on December 18, 2019 by xi'an

off to Vancouver

Posted in Mountains, pictures, Running, Statistics, Travel, University life on December 7, 2019 by xi'an

Today I am flying to Vancouver for an ABC workshop, the second Symposium on Advances in Approximate Bayesian Inference, a pre-NeurIPS workshop following five earlier editions, in some of which I took part. With an intense and exciting programme. I am not attending the following NeurIPS, as I had not submitted any paper (and was not considering relying on a lottery!). Instead, I will give a talk on ABC at UBC on Monday at 4 p.m., as, coincidence, coincidence!, I was independently invited by UBC to the IAM-PIMS Distinguished Colloquium series. I will hence be speaking on ABC on a broader scale than in the workshop, focusing on ABC-Gibbs. (With alas no time for climbing, missing an opportunity for a winter attempt at The Stawamus Chief!)

ABC for fish [PhD position]

Posted in Statistics on December 5, 2019 by xi'an

Richard Everitt (Warwick) is currently seeking a PhD candidate to work on approximate Bayesian computation methods applied to individual-based models of fisheries, in collaboration with the government agency Centre for Environment, Fisheries and Aquaculture Science (CEFAS). Details on how to apply can be found on the page of the doctoral training partnership (one can search for the project on this site). The application deadline is Friday, January 10, 2020, and the position is open to UK and EU students.