## Xiao-Li Meng’s inception [in Paris]

**X**iao-Li Meng will give a talk in Paris next September 1st, so I advertise it now, before my Parisian readers leave the city for their August retreat. Here is the abstract, explaining the above title:

Statistical Inception for the MCMC Dream: The kick is in the residual (augmentation)!

Xiao-Li Meng

Department of Statistics, Harvard University

The development of MCMC algorithms via data augmentation (DA), or equivalently auxiliary variables, has some resemblance to the theme plot of the recent Hollywood hit Inception. We MCMC designers all share essentially the same “3S” dream, that is, to create algorithms that are simple, stable, and speedy. Within that grand dream, however, we have created a rather complex web of tools, with some of them producing very similar algorithms but for unclear reasons, and others that were thought to be of different origins but are actually layered when viewed from a suitable distance. These include conditional augmentation, marginal augmentation, PX-DA, partially non-centering parameterization, sandwiched algorithms, interweaving strategies, ASIS, etc. It turns out that there is a simple statistical insight that can unify essentially all these methods conceptually, and it also provides practical guidelines for their DA constructions. It is the simple concept of regression residuals, which are constructed to be orthogonal to the regression functions. All these methods, in one form or another, effectively build a residual augmentation. Given a DA distribution f(T, A), where T is our targeted variable (i.e., f(T) is our targeted distribution) and A is the augmented variable, there are two broad classes of residuals depending on whether we regress T on A or A on T. In this talk we will demonstrate how methods like conditional augmentation and partially non-centering parameterization build their residual augmentations by regressing A on T, whereas methods such as marginal augmentation and ASIS effectively use residual augmentations from regressing T on A. For either class, the attempted orthogonality helps to reduce the dependence among MCMC draws, and when the orthogonality leads to true independence, as occurs in some special cases, we reach the dream of producing i.i.d. draws. (The talk is based on an upcoming discussion article, especially its rejoinder: Yu and Meng, 2011, JCGS.)
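To make the abstract’s point concrete, here is a toy sketch of my own (not from the talk): a Gibbs sampler on a bivariate normal joint f(T, A) with correlation rho, versus the same sampler after replacing A by the regression residual A − E[A|T] = A − rho·T. In this Gaussian case the residual is exactly independent of T, so the residual-augmented chain delivers i.i.d. draws of T, the extreme case the abstract mentions; the standard chain has lag-1 autocorrelation rho².

```python
import numpy as np

# Toy illustration of residual augmentation (my own construction, assuming a
# standard bivariate normal joint f(T, A) with corr(T, A) = rho).
rng = np.random.default_rng(0)
rho = 0.9
n = 20_000

def lag1_autocorr(x):
    """Sample lag-1 autocorrelation of a chain."""
    x = x - x.mean()
    return (x[:-1] @ x[1:]) / (x @ x)

# Standard DA Gibbs: alternate
#   T | A ~ N(rho*A, 1 - rho^2)   and   A | T ~ N(rho*T, 1 - rho^2).
# The induced T-chain is AR(1) with coefficient rho^2.
t_std = np.empty(n)
a = 0.0
for i in range(n):
    t = rng.normal(rho * a, np.sqrt(1 - rho**2))
    a = rng.normal(rho * t, np.sqrt(1 - rho**2))
    t_std[i] = t

# Residual augmentation: Gibbs on (T, R) with R = A - rho*T, the residual
# from regressing A on T.  Here R is independent of T, so both conditionals
# are free of the other variable and the T-draws are i.i.d. N(0, 1).
t_res = np.empty(n)
for i in range(n):
    t = rng.normal(0.0, 1.0)                   # T | R: does not depend on R
    r = rng.normal(0.0, np.sqrt(1 - rho**2))   # R | T: does not depend on T
    t_res[i] = t

print(f"lag-1 autocorr, standard augmentation: {lag1_autocorr(t_std):.3f}")
print(f"lag-1 autocorr, residual augmentation: {lag1_autocorr(t_res):.3f}")
```

With rho = 0.9 the standard chain’s lag-1 autocorrelation is close to rho² = 0.81 while the residual-augmented chain’s is near zero; in realistic models the residual is only approximately orthogonal, so one gets reduced (not zero) dependence.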

**T**he talk will take place at Institut Henri Poincaré, Thursday Sept. 1, at 15:00, as part of the Big’MC seminars.

January 19, 2022 at 3:32 am

[…] the dimension with auxiliary variables can improve convergence; see the recent work by Xiao-Li Meng in JCGS as […]

December 5, 2011 at 12:12 am

[…] augmentation and alternating subspace-spanning resampling, which reminded me of the recent talk by Xiao-Li Meng in Paris. Chapter 3 motivates the Metropolis-Hastings algorithm as able to handle varying dimension […]

October 27, 2011 at 6:06 am

[…] all our insomnia remembered”, and “needing inception”, in connection with the talk Xiao-Li gave in Paris two months ago….), and above all the fascinating puzzle of linking […]

August 27, 2011 at 5:08 pm

Just attended his talk in London (first European show ;-)): it was brilliant!

August 21, 2011 at 9:05 am

Inception (the film) has won the 2011 Hugo Award for “Best Dramatic Presentation, Long Form”. I hope (with high confidence) Xiao-Li will also give a dramatic presentation in a long form!!!