Archive for Gibbs

An alternative proof of DA convergence

Posted in Statistics on September 17, 2009 by xi'an

I came across a curio today while looking at recent postings on arXiv, namely a different proof of the convergence of the Data Augmentation algorithm, more than twenty years after it was proposed by Martin Tanner and Wing Hung Wong in a 1987 JASA paper… The convergence under the positivity condition is of course a direct consequence of the ergodic theorem, as shown for instance by Tierney (1994), but the note by Yaming Yu uses instead a Kullback divergence

K(p^{(t)},\pi) = \int \log\{ p^{(t)}(x)/\pi(x) \} p^{(t)}(\text{d}x)

and shows, as Liu, Wong and Kong (Biometrika, 1994) do for the variance, that this divergence is monotonically decreasing in t. The proof is interesting in that only functional (i.e., non-ergodic) arguments are used, even though I am a wee bit surprised at IEEE Transactions on Information Theory publishing this type of arcane mathematics… Note that the above divergence is the “wrong” one in that it measures the divergence from p^{(t)}, not from \pi. The convergence thus involves a sequence of divergences rather than a single one. (Of course, this has no consequence on the corollary that the total variation distance goes to zero.)
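The monotone decrease is easy to check numerically. Here is a minimal sketch on a hypothetical two-by-two joint distribution (not taken from the paper): the Data Augmentation kernel alternates the exact conditionals y|x and x|y, the target marginal \pi(x) is invariant, and K(p^{(t)},\pi) is computed along the iterations.

```python
import numpy as np

# Hypothetical joint distribution pi(x, y) on {0,1} x {0,1};
# positivity holds since all entries are strictly positive.
joint = np.array([[0.30, 0.10],
                  [0.15, 0.45]])          # rows index x, columns index y

pi_x = joint.sum(axis=1)                                   # target marginal pi(x)
cond_y_given_x = joint / joint.sum(axis=1, keepdims=True)  # pi(y|x), rows sum to 1
cond_x_given_y = joint / joint.sum(axis=0, keepdims=True)  # pi(x|y), columns sum to 1

def da_step(p):
    """One Data Augmentation sweep: marginalise y ~ pi(y|x), then x ~ pi(x|y)."""
    p_y = p @ cond_y_given_x               # distribution of y after the first draw
    return cond_x_given_y @ p_y            # distribution of x after the second draw

def kl(p, q):
    """Kullback divergence K(p, q) = sum_x p(x) log{ p(x)/q(x) }."""
    return float(np.sum(p * np.log(p / q)))

p = np.array([0.95, 0.05])                 # arbitrary starting distribution p^(0)
divs = []
for _ in range(10):
    divs.append(kl(p, pi_x))
    p = da_step(p)

# K(p^(t), pi) decreases monotonically in t (data-processing inequality
# for a Markov kernel leaving pi invariant)
assert all(d1 >= d2 for d1, d2 in zip(divs, divs[1:]))
```

The decrease here is a consequence of the data-processing inequality, since the DA kernel is a Markov transition with \pi as invariant distribution; Yu's note establishes the same monotonicity by functional arguments.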

An incomplete but revised history of Markov Chain Monte Carlo

Posted in Statistics on August 30, 2009 by xi'an

Our incomplete MCMC history paper with George Casella has now been refereed, with (almost inevitably) a lot of requests for further quotes and other branches as well as, paradoxically, a demand to get down to 20 pages! The revised (albeit mostly unchanged) version has been resubmitted, for publication in the MCMC handbook Xiao-Li Meng is editing. We managed to stay within the 20 pages, except for the 5-page bibliography, which is obviously a main part of the paper… The new version is available on arXiv.

Great week!

Posted in Statistics on April 7, 2009 by xi'an

Within a week, I got acceptance letters from Bayesian Analysis, Biometrika, and Statistical Science for the papers ABC likelihood-free methods for model choice in Gibbs random fields (arXiv posting), ABC-PMC (arXiv posting), and Theory of Probability revisited (arXiv posting). What a great week! Further, the paper on Jeffreys will be a discussion paper, with two discussions from Andrew Gelman and Dennis Lindley already received. The ABC-PMC paper, now called Adaptive approximate Bayesian computation, will appear in the Miscellanea section of Biometrika, which I feel is a bit lower key; nonetheless, having this re-evaluation of ABC-PRC, and of the substitute ABC-PMC we propose, in a top-tier journal is the main thing.

ABC methods for model choice in Gibbs random fields

Posted in Statistics on February 19, 2009 by xi'an

(Figure: 1tqga from Thermotoga maritima)

We have resubmitted to Bayesian Analysis a revised version of our paper “ABC methods for model choice in Gibbs random fields”, available on arXiv. The only major change is the addition of a second protein example in the biophysical illustration. The core idea in this paper is that, for Gibbs random fields and in particular for Ising models, when comparing several neighbourhood structures, the computation of the posterior probabilities of the models/structures under competition can be operated by likelihood-free simulation techniques akin to the Approximate Bayesian Computation (ABC) algorithm often discussed here. The key to this resolution is that, due to the specific structure of Gibbs random field distributions, there exists a sufficient statistic across models which allows for an exact (rather than approximate) simulation from the posterior probabilities of the models. Obviously, when the structures grow more complex, it becomes necessary to introduce a true ABC step with a tolerance threshold \epsilon in order to avoid running the algorithm for too long. Our toy example shows that the accuracy of the approximation of the Bayes factor can be greatly improved by resorting to the original ABC approach, since it allows for the inclusion of many more simulations. In the biophysical application to the choice of a folding structure for two proteins, we also demonstrate that we can implement the ABC solution on realistic datasets and, in the examples processed there, that the Bayes factors allow for a ranking that more standard methods (FROST, TM-score) do not provide.
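To illustrate the flavour of this model-choice-by-simulation idea, here is a minimal toy sketch (not the paper's actual algorithm or models): two competing structures for a binary sequence, a trivial independence model versus a 1-D Ising-type model, with the number of agreeing neighbour pairs acting as the statistic compared across models, and a tolerance threshold \epsilon on that statistic. The interaction value, prior ranges, and Gibbs-run lengths are all hypothetical choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20                         # length of the binary sequence (toy size)

def suff_stat(x):
    """Statistic shared across models: number of agreeing neighbour pairs
    (the 1-D analogue of the Ising interaction count)."""
    return int(np.sum(x[:-1] == x[1:]))

def simulate(model, theta, sweeps=30):
    """Model 0: independent uniform spins. Model 1: 1-D Ising-type model
    with interaction theta, sampled by a short Gibbs run (approximate draw)."""
    x = rng.integers(0, 2, n)
    if model == 0:
        return x
    for _ in range(sweeps):
        for i in range(n):
            nb = ([x[i - 1]] if i > 0 else []) + ([x[i + 1]] if i < n - 1 else [])
            w1 = np.exp(theta * sum(1 for b in nb if b == 1))
            w0 = np.exp(theta * sum(1 for b in nb if b == 0))
            x[i] = int(rng.random() < w1 / (w0 + w1))
    return x

# Pseudo-observed data from model 1 (hypothetical interaction value)
x_obs = simulate(1, 0.8)
s_obs = suff_stat(x_obs)

eps, accepted = 1, []                      # tolerance threshold epsilon
for _ in range(1000):
    m = int(rng.integers(0, 2))            # uniform prior on the two structures
    theta = rng.uniform(0.0, 2.0)          # prior on the interaction parameter
    if abs(suff_stat(simulate(m, theta)) - s_obs) <= eps:
        accepted.append(m)

post = sum(accepted) / len(accepted)       # estimated P(model 1 | data)
print(f"estimated posterior probability of model 1: {post:.2f}")
```

In the paper's setting the cross-model sufficiency of the statistic makes the \epsilon = 0 version an exact simulation from the model posterior; the positive tolerance used above is the practical compromise discussed in the post.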
