## Archive for Dublin

## talk at Trinity College

Posted in pictures, Statistics, Travel, University life with tags ABC, ABC convergence, asymptotics, Dublin, Eire, Ireland, seminar, Trinity College Dublin on May 7, 2017 by xi'an

**T**omorrow noon, I will give a talk at Trinity College Dublin on the asymptotic properties of ABC. (Here are the slides from the talk I gave in Berlin last month.)

## Dublin [porter]

Posted in pictures, Travel, Wines with tags beer, Dublin, Guinness, Ireland, maths house, Porter, University of Warwick on January 20, 2016 by xi'an

## light and widely applicable MCMC: approximate Bayesian inference for large datasets

Posted in Books, Statistics, University life, Wines with tags ABC, big data, character recognition, delayed acceptance, Dublin, Ireland, Markov chains, MCMC algorithm, reversible jump MCMC, Russian roulette, subsampling on March 24, 2015 by xi'an

**F**lorian Maire (whose thesis was discussed in this post), Nial Friel, and Pierre Alquier (all in Dublin at some point) have arXived today a paper with the above title, aimed at quickly analysing large datasets. As reviewed in the early pages of the paper, this proposal follows a growing number of techniques advanced in the past years, like pseudo-marginals, Russian roulette, unbiased likelihood estimators, Firefly Monte Carlo, adaptive subsampling, sub-likelihoods, telescoping debiased likelihoods, and even our very own delayed acceptance algorithm. (Which is incorrectly described as restricted to iid data, by the way!)

The lightweight approach is based on an ABC idea of working through a summary statistic that plays the role of a pseudo-sufficient statistic. The main theoretical result in the paper is indeed that, when subsampling in an exponential family, subsamples preserving the sufficient statistics (modulo a rescaling) are optimal in terms of distance to the true posterior. Subsamples are thus weighted in terms of the (transformed) difference between the full data statistic and the subsample statistic, assuming they are both normalised to be comparable. I am quite (positively) intrigued by this idea in that it allows to somewhat compare inference based on two different samples. The weights of the subsets are then used in a pseudo-posterior that treats the subset as an auxiliary variable (and the weight as a substitute to the “missing” likelihood). This may sound a wee bit convoluted (!) but the algorithm description is not yet complete: simulating jointly from this pseudo-target is impossible because of the huge number of possible subsets. The authors thus suggest to run an MCMC scheme targeting this joint distribution, with a proposed move on the set of subsets and a proposed move on the parameter set conditional on whether or not the proposed subset has been accepted.
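As a rough illustration of the weighting idea (my own toy sketch, not the authors' exact scheme), consider a Gaussian model where the sufficient statistic is the sample mean: subsamples whose statistic falls close to the full-data statistic receive a high weight through an ABC-style kernel, with a hypothetical tolerance `eps`:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data from an exponential family (Gaussian, known variance):
# the sufficient statistic is the sample mean, which is already
# size-invariant, so no further rescaling is needed here.
n, m = 10_000, 100          # full data size, subsample size
data = rng.normal(loc=2.0, scale=1.0, size=n)
full_stat = data.mean()     # full-data sufficient statistic

def subsample_weight(subset, eps=0.1):
    """Weight a subsample by the distance between its sufficient
    statistic and the full-data one, through a Gaussian kernel.
    `eps` is a hypothetical tolerance parameter."""
    sub_stat = subset.mean()
    return np.exp(-0.5 * ((sub_stat - full_stat) / eps) ** 2)

# Draw a few candidate subsamples and compare their weights:
# subsets whose mean is close to the full-data mean score near 1.
subsets = [rng.choice(data, size=m, replace=False) for _ in range(5)]
weights = [subsample_weight(s) for s in subsets]
```

In the paper's pseudo-posterior these weights would then stand in for the "missing" likelihood contribution of the discarded observations.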

From an ABC perspective, the difficulty in calibrating the tolerance ε sounds more acute than usual, as the size of the subset comes as an additional computing parameter. Bootstrapping options seem impossible to implement in a large-data setting.

An MCMC issue with this proposal is that designing the move across the subset space is both paramount for its convergence properties and lacking in geometric intuition. Indeed, two subsets with similar summary statistics may be very far apart… Funny enough, in the representation of the joint Markov chain, the parameter subchain is secondary, if crucial to avoid intractable normalising constants. It is also unclear to me, from reading the paper maybe too quickly, whether or not the separate moves, when switching and when not switching subsets, retain the proper balance condition for the pseudo-joint to remain the stationary distribution. Stationarity for the subset Markov chain is straightforward by design, but it is not so for the parameter. In case of a switched subset, simulating from the true full conditional given the subset would work, but running only a fixed number L of MCMC steps would not.
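To make the two-stage move concrete, here is a minimal sketch (again my own reconstruction on the Gaussian toy model, not the paper's algorithm) of an MCMC scheme on the pseudo-joint: a proposed subset swap accepted through the weight ratio, followed by a Metropolis move on the parameter conditional on the current subset, which is precisely where the balance question arises:

```python
import numpy as np

rng = np.random.default_rng(1)

n, m, eps = 10_000, 100, 0.1
data = rng.normal(loc=2.0, scale=1.0, size=n)
full_stat = data.mean()

def log_weight(subset):
    # ABC-style log-weight of a subset (toy stand-in for the
    # sufficient-statistic-based weighting of the paper)
    return -0.5 * ((subset.mean() - full_stat) / eps) ** 2

def sub_loglik(theta, subset):
    # subsample log-likelihood, Gaussian with unit variance
    return -0.5 * np.sum((subset - theta) ** 2)

subset = rng.choice(data, size=m, replace=False)
theta, chain = 0.0, []
for _ in range(2_000):
    # 1. propose a new subset, accept through the weight ratio
    prop = rng.choice(data, size=m, replace=False)
    if np.log(rng.uniform()) < log_weight(prop) - log_weight(subset):
        subset = prop
    # 2. Metropolis random-walk move on theta given the current subset
    theta_prop = theta + 0.1 * rng.normal()
    if np.log(rng.uniform()) < sub_loglik(theta_prop, subset) - sub_loglik(theta, subset):
        theta = theta_prop
    chain.append(theta)
```

The sketch glosses over the very point at issue: whether alternating these two conditional moves, with the parameter updated by a fixed number of steps rather than an exact conditional draw, leaves the pseudo-joint stationary.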

The lightweight technology therein shows its muscles on a handwritten digit recognition example where it beats regular MCMC by a factor of 10 to 20, using only 100 datapoints instead of the 10⁴ original datapoints. While very nice and realistic, this example may be misleading in that 100 digit realisations may be enough to find a tolerable approximation to the true MAP. I was also intrigued by the processing of the probit example, until I realised the authors had integrated the covariate out and inferred about the mean of that covariate, which means it is not a genuine probit model.

## National Gallery of Ireland

Posted in pictures, R, Travel with tags architecture, books, Dublin, histogram, Mathematica, National Gallery of Ireland, R, sculpture on October 16, 2011 by xi'an

**D**uring a short if profitable visit to Dublin for an SFI meeting on Tuesday/Friday, I had the opportunity to visit the National Gallery of Ireland in my sole hour of free time (as my classy hotel was very close). The building itself is quite nice, well-inserted between brick houses from the outside, while providing impressive height, space, and light from the inside.

**T**he masterpiece gallery is quite small (unless I missed a floor!), if filled with masterpieces like a painting by Caillebotte I did not know.

**T**he modern art gallery was taken over by a temporary (and poorly exposed) exhibit that includes live happenings (five persons wearing monkish outfits standing around a mummy floating in mid-air), tags (!), and two interesting pieces: one was made of several tables filled with piles of books glued together and sculpted, giving an output that looked like 2-D histograms, and reminding me of the fear histograms discussed on Statisfaction by Julyan a few days ago. (Note the Mathematica book in the last picture!) While I love books very much, I am also quite interested in sculptures involving books, like the one I saw a few years ago where the artist had grown different cereals on opened books: although it may sound like an easy trick (food for thought and all that), the result was amazing and impressive!

**T**he second piece was a beautiful board illuminated by diodes which felt very warm and comforting, maybe in reminiscence of the maternal womb, of candles, or of myriads of galaxies, but very powerful in any case. (I usually dislike constructs involving light, like the neon sculptures of the 80's, so I started with an a priori against it.) I could have stayed there for hours…