Archive for Monte Carlo Statistical Methods

rethinking the ESS published!

Posted in Statistics on May 3, 2022 by xi'an

Our paper Rethinking the Effective Sample Size, with Victor Elvira (the driving force behind the paper!) and Luca Martino, has now been published in the International Statistical Review! As discussed earlier on this blog, we wanted to re-evaluate the pros and cons of the effective sample size (ESS) as a tool for assessing the quality [or lack thereof] of a Monte Carlo approximation, particularly in the context of importance sampling. Following its 1992 construction by Augustine Kong, this approximation has been widely used over the last 25 years, in part due to its simplicity as a practical rule of thumb. However, we show in this paper that the assumptions made in its derivation make it difficult to consider it a reasonable approximation of the ESS. Note that this reevaluation does not cover the use of ESS for Markov chain Monte Carlo algorithms, although there would also be much to tell about it..!
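For reference, Kong's rule of thumb computes, from the normalised importance weights, ESS ≈ 1/∑ w̄ᵢ². A minimal sketch, with a toy Gaussian target and proposal of my own choosing (not an example from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# toy importance sampling: N(0,1) target, N(0,2^2) proposal (illustrative choice only)
M = 10_000
x = rng.normal(0.0, 2.0, size=M)
# log target minus log proposal, up to a common constant
log_w = (-0.5 * x**2) - (-0.5 * (x / 2.0)**2 - np.log(2.0))

w = np.exp(log_w - log_w.max())       # stabilised unnormalised weights
w_bar = w / w.sum()                   # normalised weights

ess_kong = 1.0 / np.sum(w_bar**2)     # Kong's rule-of-thumb ESS
print(f"ESS approx {ess_kong:.1f} out of M = {M}")
```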

ensemble Metropolis-Hastings

Posted in Books, Kids, Statistics on October 14, 2021 by xi'an

A question on X validated about ensemble MCMC samplers had me try twice to justify the Metropolis-Hastings ratio the authors used. To recap, ensemble sampling moves a cloud of points (just like our bouncy particle sampler) one point X at a time, by using another point Z as a pivot or origin and moving X randomly along the line [XZ]. In the paper, the distribution of the rescaling is symmetric in the sense that f(z)=f(1/z). I indeed started by perceiving the basic step of the sampler as a Metropolis-within-Gibbs step along a random direction, but this did not work since the direction depends on the current X. I then wondered about a possible importance sampling interpretation compensating for the change of scale, but it led to the wrong power anyway, before hitting the fact that this was actually a change of radius in the space with origin Z, leaving the angular coordinates invariant. Which explains the power (n-1) in the Metropolis ratio, in agreement with a switch to polar coordinates.
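For illustration, here is a minimal sketch of a stretch-type ensemble move exhibiting the z^(n-1) radial Jacobian in the acceptance ratio; the Goodman-and-Weare rescaling density g(z) ∝ 1/√z on [1/a, a] and the Gaussian toy target are my own stand-ins, not the exact sampler from the paper discussed in the question:

```python
import numpy as np

rng = np.random.default_rng(1)

def log_target(x):
    # toy target: standard Gaussian in n dimensions (illustrative assumption)
    return -0.5 * np.sum(x**2)

def stretch_sweep(walkers, log_target, a=2.0):
    """One sweep of a stretch-type ensemble move: each walker X is rescaled
    towards a randomly chosen companion Z, with the z**(n-1) radial Jacobian
    entering the Metropolis ratio (polar coordinates centred at Z)."""
    K, n = walkers.shape
    for k in range(K):
        X = walkers[k]
        j = rng.choice([i for i in range(K) if i != k])   # pivot walker Z
        Z = walkers[j]
        # Goodman-Weare rescaling: z has density proportional to 1/sqrt(z) on [1/a, a]
        z = (1.0 + (a - 1.0) * rng.random()) ** 2 / a
        Y = Z + z * (X - Z)                                # move along the ray from Z through X
        log_ratio = (n - 1) * np.log(z) + log_target(Y) - log_target(X)
        if np.log(rng.random()) < log_ratio:
            walkers[k] = Y
    return walkers

walkers = rng.normal(size=(10, 5))     # K = 10 walkers in n = 5 dimensions
for _ in range(1_000):
    walkers = stretch_sweep(walkers, log_target)
```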

population quasi-Monte Carlo

Posted in Books, Statistics on January 28, 2021 by xi'an

“Population Monte Carlo (PMC) is an important class of Monte Carlo methods, which utilizes a population of proposals to generate weighted samples that approximate the target distribution”

A return of the prodigal son, with this arXival by Huang, Joseph, and Mak of a paper on population Monte Carlo using quasi-random sequences. The construct is based on an earlier notion of Joseph and Mak, support points, which are defined wrt a given target distribution F as minimising the variability of a sample from F away from these points. (I would have used instead my late friend Bernhard Flury's principal points!) The proposal uses Owen-style scrambled Sobol points, followed by a deterministic mixture weighting à la PMC, and then importance support resampling to find the next location parameters of the proposal mixture (which is why I included an unrelated mixture surface as my post picture!). This importance support resampling is obviously less variable than the more traditional ways of resampling, but the cost moves from O(M) to O(M²).

“The main computational complexity of the algorithm is O(M²) from computing the pairwise distance of the M weighted samples”

The covariance parameters are updated as in our 2008 paper. This new proposal is interesting and reasonable, with apparently significant gains, although I would have liked to see a clearer discussion of the actual computing costs of PQMC.
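To fix ideas, here is a rough sketch of the ingredients (not the authors' code): a scrambled Sobol point set pushed through a Gaussian mixture of proposals, deterministic-mixture importance weights à la PMC, and a plain multinomial resampling step standing in for the support-point-based resampling of the paper; target, proposal parameters, and dimensions are toy choices of mine.

```python
import numpy as np
from scipy.stats import qmc, norm, multivariate_normal
from scipy.special import logsumexp

rng = np.random.default_rng(2)
d, N, J = 2, 256, 3   # dimension, QMC points per proposal, number of proposals (toy choices)

# toy target: a correlated bivariate Gaussian (my choice, not one of the paper's examples)
target = multivariate_normal(mean=np.zeros(d), cov=np.array([[1.0, 0.8], [0.8, 1.0]]))

# current mixture of Gaussian proposals: location parameters mu_j, common covariance Sigma
mus = rng.normal(scale=2.0, size=(J, d))
Sigma = np.eye(d)
L = np.linalg.cholesky(Sigma)

# Owen-style scrambled Sobol points, Gaussianised by the inverse CDF and reused for each proposal
u = qmc.Sobol(d, scramble=True, seed=3).random(N)
z = norm.ppf(u)
x = np.concatenate([mu + z @ L.T for mu in mus])    # (J*N, d) weighted sample locations

# deterministic-mixture importance weights à la PMC (balance heuristic over the J proposals)
log_mix = logsumexp([multivariate_normal(mu, Sigma).logpdf(x) for mu in mus], axis=0) - np.log(J)
log_w = target.logpdf(x) - log_mix
w = np.exp(log_w - logsumexp(log_w))                # normalised weights

# pick the next location parameters; plain multinomial-style draw here, a stand-in
# for the O(M^2) support-point-based resampling of the paper
next_mus = x[rng.choice(len(x), size=J, replace=False, p=w)]
```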

Rao-Blackwellisation in the MCMC era

Posted in Books, Statistics, University life on January 6, 2021 by xi'an

A few months ago, as indicated on this blog, I was contacted by the ISR editors to write a piece on Rao-Blackwellisation, towards a special issue celebrating Calyampudi Radhakrishna Rao's 100th birthday. Gareth Roberts and I came up with this survey, now on arXiv, discussing different aspects of Monte Carlo and Markov chain Monte Carlo that pertain to Rao-Blackwellisation, one way or another. As I discussed the topic with several friends over the Fall, it appeared that the difficulty was more in setting the boundaries than in finding connections: in a way, anything involving conditioning, demarginalising, or resorting to auxiliary variates is a form of Rao-Blackwellisation. When re-reading the 1990 JASA paper by Gelfand and Smith, where I first saw the link between the Rao-Blackwell theorem and simulation, I realised my memory of it had drifted from the original, since the authors proposed there an approximation of the marginal based on replicas rather than the original Markov chain, much closer to Tanner and Wong (1987) than I thought. It is only later that the true notion took shape. [Since the current version is still a draft, any comment or suggestion would be most welcome!]
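As a reminder of the flavour of this kind of Rao-Blackwellisation, here is a toy Gibbs sampler on a bivariate Normal where the marginal density of one component is estimated by averaging its full conditional over the simulated values of the other component; this is a textbook illustration of my own, not an example taken from the survey.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)
rho, T = 0.8, 5_000

# two-block Gibbs sampler on a standard bivariate Normal with correlation rho (toy example)
x = np.empty(T)
y = np.empty(T)
x_cur = 0.0
for t in range(T):
    y_cur = rng.normal(rho * x_cur, np.sqrt(1 - rho**2))   # Y | X = x ~ N(rho x, 1 - rho^2)
    x_cur = rng.normal(rho * y_cur, np.sqrt(1 - rho**2))   # X | Y = y ~ N(rho y, 1 - rho^2)
    x[t], y[t] = x_cur, y_cur

# Rao-Blackwellised estimate of the marginal density of X on a grid:
# average the full conditional density of X given the simulated Y's
grid = np.linspace(-3, 3, 61)
rb_density = norm.pdf(grid[:, None], loc=rho * y[None, :],
                      scale=np.sqrt(1 - rho**2)).mean(axis=1)

# compare with the exact N(0,1) marginal of X
print(np.max(np.abs(rb_density - norm.pdf(grid))))
```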

computing Bayes 2.0

Posted in Books, Statistics, University life on December 11, 2020 by xi'an

Our survey paper on “computing Bayes”, written with my friends Gael Martin [who led this project most efficiently!] and David Frazier, has now been revised and resubmitted, with the new version now available on arXiv. Recognising that the entire range of the literature cannot be encompassed within a single review, especially wrt the theoretical advances made on MCMC, the revised version is more focussed on approximative solutions (when considering MCMC as “exact”!). As put by one of the referees [who were all very supportive of the paper], “the authors are very brave. To cover in a review paper the computational methods for Bayesian inference is indeed a monumental task and in a way an hopeless one”. This is the opportunity to congratulate Gael on her election to the Academy of Social Sciences of Australia last month, along with her colleague from Monash, Rob Hyndman.
