Archive for ABC consistency

Mark Beaumont on One World ABC webinar [30 May, 9am]

Posted in Books, pictures, Statistics, Travel, University life on May 17, 2024 by xi'an

For the final talk of this Spring season of the One World ABC webinar, we are very glad to welcome Mark Beaumont, a central figure in the development of ABC methods and inference! (And a coauthor of our ABC-PMC paper.)

Model misspecification in population genomics
Mark Beaumont
University of Bristol
30th May 2024, 9.00am UK time

Abstract
In likelihood-free settings, problematic effects of model misspecification can manifest themselves during computation, leading to nonsensical answers and, in particular, to convergence problems in sequential algorithms. This issue has been well studied in the last 10 years, leading to a number of methods for robust inference. In practical applications, likelihood-free methods tend to be applied to the output of complex simulations where there is a choice of summary statistics that can be computed. One approach to handling misspecification is to simply not use summary statistics whose values under prior simulations of the model cannot be matched with those observed in the data. This presentation gives a brief review of methods for detecting and handling misspecification in ABC and SBI, and then discusses approaches that we have explored in a population genomic modelling framework.
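To make the summary screening idea concrete, here is a minimal Python sketch of one way to read it: drop any summary whose observed value falls outside the bulk of its prior-predictive distribution, then run a plain ABC rejection step on the retained summaries. The simulate and prior_sample functions, the coverage level, and the quantile-based tolerance are illustrative assumptions of mine, not the speaker's actual implementation.

import numpy as np

def screen_summaries(simulate, prior_draws, s_obs, coverage=0.99):
    # Keep only summaries whose observed value lies inside the central
    # `coverage` band of the prior-predictive distribution (illustrative rule).
    sims = np.array([simulate(theta) for theta in prior_draws])   # shape (m, k)
    lo = np.quantile(sims, (1 - coverage) / 2, axis=0)
    hi = np.quantile(sims, 1 - (1 - coverage) / 2, axis=0)
    return (s_obs >= lo) & (s_obs <= hi)                          # boolean mask

def abc_rejection(simulate, prior_sample, s_obs, keep, n_sims=10_000, q=0.01):
    # Plain ABC rejection restricted to the retained summaries `keep`.
    thetas = np.array([prior_sample() for _ in range(n_sims)])
    sims = np.array([simulate(theta) for theta in thetas])
    dist = np.linalg.norm(sims[:, keep] - s_obs[keep], axis=1)
    eps = np.quantile(dist, q)            # tolerance set as a small distance quantile
    return thetas[dist <= eps]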

discrepancy-based ABC posteriors via Rademacher complexity

Posted in Statistics on April 18, 2024 by xi'an

Sirio Legramanti, Daniele Durante, and Pierre Alquier just arXived a massive paper on the concentration of discrepancy-based ABC posteriors via Rademacher complexity, a framework that covers MMD- and Wasserstein-distance-based ABC methods. The paper provides sufficient conditions under which a discrepancy within the class of integral probability semimetrics guarantees uniform convergence and concentration of the induced ABC posterior, without necessarily requiring regularity conditions on the underlying data generating process or on the assumed statistical model, meaning that misspecified cases are also covered. In particular, the authors derive upper and lower bounds on the limiting acceptance probabilities for the ABC posterior to remain well-defined for a large enough sample size. They thus deliver an improved understanding of the factors that govern the uniform convergence and concentration properties of discrepancy-based ABC posteriors under a fairly unified perspective, which I deem a significant advance on the several papers my coauthors Espen Bernton, David Frazier, Mathieu Gerber, Pierre Jacob, Gael Martin, Judith Rousseau, Robin Ryder, and yours truly produced in that domain over the past years (although our Series B misspecification paper does not appear in the reference list!).

“…as highlighted by the authors, these [convergence] conditions (i) can be difficult to verify for several discrepancies, (ii) do not allow to assess whether some of these discrepancies can achieve convergence and concentration uniformly over P(Y), and (iii) often yield bounds which hinder an in–depth understanding of the factors regulating these limiting properties”

The first result is that, asymptotically in n and for a fixed large-enough tolerance, the ABC posterior is always well-defined, but only within a Rademacher ball of the pseudo-true posterior, a ball larger than the tolerance ε when the Rademacher complexity does not vanish in n (a feature on which my intuition is found lacking, since it seems to relate solely to the class of functions adopted in the definition of said discrepancy!). When the tolerance ε(n) decreases to its minimum, as in our paper, the speed of concentration is similar to ours, with a speed slower than √n, assuming the tolerance ε(n) decreases to its minimum slower than √n but faster than the Rademacher complexity.

“…the bound we derive crucially depends on [the Rademacher complexity], which is specific to each discrepancy D and plays a fundamental role in controlling the rate of concentration of the ABC posterior.”

The paper also opens towards non-iid settings (as in our Wasserstein paper) and generalized likelihood-free Bayesian inference à la Bissiri et al. (2016). A most interesting take on the universality of ABC convergence, thus, although assuming bounded function spaces from the start.
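For readers wondering what a discrepancy-based ABC posterior amounts to computationally, here is a minimal Python sketch of an accept/reject version, with a Gaussian-kernel MMD standing in for a generic integral probability semimetric. The simulator, prior sampler, bandwidth, and quantile-based tolerance are placeholders of mine, unrelated to the paper's actual theory or experiments.

import numpy as np

def mmd2(x, y, bandwidth=1.0):
    # Squared maximum mean discrepancy with a Gaussian kernel, one member
    # of the integral probability semimetric class; x and y are (n, d) arrays.
    def gram(a, b):
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * bandwidth ** 2))
    return gram(x, x).mean() + gram(y, y).mean() - 2 * gram(x, y).mean()

def discrepancy_abc(y_obs, prior_sample, simulate, n_draws=5_000, q=0.01):
    # Keep the parameter values whose simulated sample falls within a
    # tolerance of the observed sample in MMD; epsilon is set empirically
    # as a small quantile of the simulated discrepancies.
    thetas = np.array([prior_sample() for _ in range(n_draws)])
    discs = np.array([mmd2(simulate(theta), y_obs) for theta in thetas])
    eps = np.quantile(discs, q)
    return thetas[discs <= eps]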

Metropolis-Hastings via Classification [One World ABC seminar]

Posted in Statistics, University life on May 27, 2021 by xi'an

Today, Veronika Rockova is giving a webinar on her paper with Tetsuya Kaji, Metropolis-Hastings via classification, at the One World ABC seminar, at 11.30am UK time. (The paper was also presented at the Oxford Stats seminar last February.) Please register if you are not already a member of the 1W ABC mailing list.

Metropolis-Hastings via classification

Posted in pictures, Statistics, Travel, University life on February 23, 2021 by xi'an

Veronika Rockova (from Chicago Booth) gave a talk on this theme at the Oxford Stats seminar this afternoon. She started with a survey of ABC, synthetic likelihoods, and pseudo-marginal methods to motivate her approach via GANs, namely learning an approximation of the likelihood from the GAN discriminator. Her explanation of the GAN-type estimate was crystal clear and made me wonder at the connection with Geyer's 1994 logistic estimator of the likelihood (a form of discriminator with a fixed generator). She also expressed the ABC approximation hence created as the actual posterior times an exponential tilt, which she proved is of order 1/n, and showed that a random variant of the algorithm (where the shift is averaged) is unbiased. Most interestingly, the method requires no calibration and no tolerance, except indirectly when building the discriminator, and no summary statistic. There remains a noteworthy tension between getting the correct shape and the correct location.
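Since the post only alludes to the Geyer connection, here is a hypothetical Python sketch of the classifier trick in its simplest form: a logistic regression trained to separate simulations from two parameter values yields, through its logit at the observed data, an estimate of the log likelihood ratio, which can then be plugged into a Metropolis-Hastings acceptance step. This is only meant to convey the flavour of the idea, not the Kaji and Rockova algorithm itself; the reference parameter, sample sizes, and symmetric proposal are assumptions of mine.

import numpy as np
from sklearn.linear_model import LogisticRegression

def log_lik_ratio(x_obs, simulate, theta, theta_ref, m=2000):
    # Train a classifier to separate draws from p(.|theta) and p(.|theta_ref);
    # with balanced classes, its logit at x_obs estimates
    # log p(x_obs|theta) - log p(x_obs|theta_ref)  (Geyer-style logistic trick).
    x1 = np.array([simulate(theta) for _ in range(m)])
    x0 = np.array([simulate(theta_ref) for _ in range(m)])
    X = np.vstack([x1, x0])
    y = np.concatenate([np.ones(m), np.zeros(m)])
    clf = LogisticRegression(max_iter=1000).fit(X, y)
    p = clf.predict_proba(np.atleast_2d(x_obs))[0, 1]
    return np.log(p) - np.log1p(-p)

def mh_step(theta, x_obs, simulate, log_prior, propose, theta_ref):
    # One Metropolis-Hastings step with the classifier-based estimate standing
    # in for the intractable log likelihood (symmetric proposal assumed).
    theta_prop = propose(theta)
    log_alpha = (log_prior(theta_prop) - log_prior(theta)
                 + log_lik_ratio(x_obs, simulate, theta_prop, theta_ref)
                 - log_lik_ratio(x_obs, simulate, theta, theta_ref))
    return theta_prop if np.log(np.random.rand()) < log_alpha else theta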

talk at CISEA 2019

Posted in Statistics, University life on June 18, 2019 by xi'an

Here are my slides for the overview talk I am giving at CISEA 2019, in Abidjan, highly reminiscent of earlier talks, except for the second slide!