Archive for Annals of Statistics

improper priors, incorporated

Posted in Books, Statistics, University life with tags , , , , , , , , on January 11, 2012 by xi'an

“If a statistical procedure is to be judged by a criterion such as a conventional loss function (…) we should not expect optimal results from a probabilistic theory that demands multiple observations and multiple parameters.” P. McCullagh & H. Han

Peter McCullagh and Han Han have just published in the Annals of Statistics a paper on Bayes’ theorem for improper mixtures. This is a fascinating piece of work, even though some parts elude me… The authors propose a framework based on Kingman’s Poisson point processes that allows (countable) improper priors to be included in a coherent probabilistic framework. This framework requires the definition of a test set A in the sampling space, the observations then being the events Y∩A, where Y is an infinite random set when the prior is infinite. It is therefore difficult to fit this representation within a genuine Bayesian framework, i.e. one with a single observation corresponding to a single parameter value. In that sense it seems closer to the original empirical Bayes, à la Robbins.
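As I understand the construction (a sketch in my own notation, not the paper’s), the parameter–observation pairs form a Poisson point process whose intensity combines the improper prior measure ν with the sampling density, and the test set A restricts attention to a sub-process of finite mean:

```latex
X=\{(\theta_i,Y_i)\}\sim \mathrm{PP}\big(\nu(\mathrm d\theta)\,f(y\mid\theta)\,\mathrm dy\big),
\qquad
\mathbb{E}\,\#\big(X\cap(\Theta\times A)\big)
=\int_\Theta\int_A f(y\mid\theta)\,\mathrm dy\,\nu(\mathrm d\theta)<\infty,
```

so that conditional, Bayes-type calculations on the observed events in A remain well defined even though ν(Θ)=∞.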

“An improper mixture is designed for a generic class of problems, not necessarily related to one another scientifically, but all having the same mathematical structure.” P. McCullagh & H. Han

The paper thus misses, in my opinion, a clear link with the design of improper priors. Nor does it offer a resolution of the improper-prior Bayes factor conundrum. However, it provides a perfectly valid environment for working with improper priors. For instance, the final section on the marginalisation “paradoxes” is illuminating in this respect, as it does not require taking a limit of proper priors.

MCMC with errors

Posted in R, Statistics, University life with tags , , , , , , , on March 25, 2011 by xi'an

I received this email last week from Ian Langmore, a postdoc in Columbia:

I’m looking for literature on a subject and can’t find it:  I have a Metropolis sampler where the acceptance probability is evaluated with some error.  This error is not simply error in evaluation of the target density.  It occurs due to the method by which we approximate the acceptance probability.

This is a sensible question, albeit a wee vague… The closest piece of work I can think of is the recent paper by Christophe Andrieu and Gareth Roberts in the Annals of Statistics (2009), following an original proposal by Marc Beaumont. I think there is an early-1990s paper by Gareth Roberts and Jeff Rosenthal where they consider the impact of approximation effects, like real-number representation, on convergence, but I cannot find it. Of course, the recent particle MCMC JRSS B discussion paper by Christophe Andrieu, Arnaud Doucet and Roman Holenstein is a way to bypass the problem. (In a sense, ABC is a rudimentary answer as well.) And there must be many other papers on this topic I am not aware of…
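The pseudo-marginal argument of Beaumont and of Andrieu & Roberts is that, if the intractable quantity in the acceptance ratio is replaced by a non-negative unbiased estimate, and the estimate attached to the current state is recycled rather than refreshed, the chain still targets the exact distribution. A minimal sketch (toy N(0,1) target and a noise model of my own invention, not from any of the papers above):

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_density(x, rng, n=64):
    """Non-negative unbiased estimate of the unnormalised N(0,1) density.
    The mean-one multiplicative noise stands in for e.g. an importance
    sampling or particle estimate of an intractable density."""
    true = np.exp(-0.5 * x**2)
    weights = rng.exponential(1.0, size=n)  # positive, mean one
    return true * weights.mean()

def pseudo_marginal_mh(n_iter, rng, step=1.0):
    x = 0.0
    dens = noisy_density(x, rng)
    chain = np.empty(n_iter)
    for t in range(n_iter):
        y = x + step * rng.standard_normal()
        dens_y = noisy_density(y, rng)
        # accept with the *estimated* ratio; crucially, the estimate for
        # the current state is carried along, never recomputed
        if rng.random() < dens_y / dens:
            x, dens = y, dens_y
        chain[t] = x
    return chain

chain = pseudo_marginal_mh(20_000, rng)
print(chain.mean(), chain.var())  # close to the exact N(0,1) moments
```

Despite the noise in every acceptance ratio, the marginal distribution of the chain in x is exactly the target; the price is paid in stickiness (higher autocorrelation) when the estimate has large variance.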

Vanilla on-line

Posted in Statistics, University life with tags , , , , on February 18, 2011 by xi'an

The Vanilla Rao–Blackwellization of Metropolis–Hastings algorithms paper with Randal Douc is now published in the Annals of Statistics (Volume 39, Number 1 (2011), pages 261–277) and available on-line via Project Euclid. We are currently working with Pierre Jacob on an extension of this idea towards parallelisation.

Seminar in Stanford

Posted in Statistics, University life with tags , , , on August 7, 2010 by xi'an

Following a kind invitation from Bala Rajaratnam during JSM 2010, I will give a special seminar at Stanford University this coming Monday at 4:30, on my recent Annals paper with Randal Douc, “A vanilla Rao–Blackwellisation of Metropolis–Hastings algorithms”. Here are the slides.

Vanilla Rao-Blackwellisation accepted

Posted in Statistics, University life with tags , , on June 13, 2010 by xi'an

The revision of our Vanilla Rao–Blackwellisation paper was accepted by the Annals of Statistics as I was leaving for València 9. This is very good news indeed! Especially because I came back from València 9 with an idea on how to extend the Rao–Blackwellisation…


Posted in Books, Statistics with tags , , , , , , on March 31, 2010 by xi'an

Yesterday, I finally arXived my notes on Keynes’ book A Treatise On Probability but, due to the new way the arXiv website operates, there is no indication of the page associated with the submitted paper before it gets accepted, so I cannot prepare an ’Og entry until this acceptance, wasting perfect timing! Anyway, this is the first draft of the notes and it has not yet been submitted to a journal. As the new user interface on the arXiv webpage now displays all past papers, I added a category to our 2007 Annals paper with Randal Douc, Arnaud Guillin and Jean-Michel Marin, which means it appeared again in today’s list…

Today I completed my revision of the review of Burdzy’s The Search for Certainty for Bayesian Analysis, so the new version will be on arXiv tomorrow morning. The changes are really minor, as Bayesian Analysis mostly requested that I smooth down my criticisms. I also added a few more quotes and some sentences in the conclusion. I wonder whether this paper will appear with a discussion, since three are already written!

At last, let me point out three recent interesting postings on arXiv, even though I do not have time to discuss them in more depth: one by Peter Green on Colouring and breaking sticks: random distributions and heterogeneous clustering, one by Nicolas Chopin, Tony Lelièvre and Gabriel Stoltz on Free energy methods for efficient exploration of mixture posterior densities, and one by Sophie Donnet and Jean-Michel Marin on An empirical Bayes procedure for the selection of Gaussian graphical models.

Savage-Dickey rejected

Posted in Statistics, University life with tags , , , , on January 16, 2010 by xi'an

Our Savage–Dickey paper has been rejected by the Annals of Statistics for being too obscure. I completely understand the Editor’s perspective that this resolution of ours has very little bearing on statistical practice and on the community as a whole (“the readership at large”, as I used to write for Series B). But I fear both screeners have missed the main point of our paper, which is that papers and books using the Savage–Dickey ratio all start with an assumption that does not make sense from a measure-theoretic point of view… One screener argued that our point is moot, given that everyone agrees on the same version of the density, as otherwise “would even a likelihood function be properly defined?” But this is not true: a likelihood

ℓ(θ|x) = f(x|θ)

is the value of the density at the observed value of the random variable. Since this observed value is by nature random, it is not possible to define a specific version of the density function at x… This may, alas, be related to the progressive disappearance of measure theory from Statistics programs: when my third-year exchange students go abroad, it is rarer and rarer to find a place that offers measure theory below the PhD level.
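For context (in standard notation, not quoting our paper), the Savage–Dickey representation of the Bayes factor for testing H₀: θ = θ₀ in the presence of a nuisance parameter ψ reads

```latex
B_{01}(x)=\frac{\pi_1(\theta_0\mid x)}{\pi_1(\theta_0)}
\qquad\text{under the assumption}\qquad
\pi_1(\psi\mid\theta_0)=\pi_0(\psi),
```

and both the marginal posterior and prior densities on the left, as well as the conditional prior in the assumption, are only defined up to null sets, hence involve choosing specific versions of those densities at the measure-zero point θ₀, which is precisely where the difficulty lies.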