**I** had been privileged to have a look at a preliminary version of the now-published retrospective written by Mike Titterington on the first 100 issues of *Biometrika* (more exactly, “*from volume 28 onwards*”, as the title states). Mike was the dedicated editor of *Biometrika* for many years and edited a nice book for the 100th anniversary of the journal. He started from the 100 most highly cited papers within the journal to build a coherent chronological coverage. From a Bayesian perspective, this retrospective starts with Maurice Kendall trying to reconcile frequentists and non-frequentists in 1949, while having a hard time with fiducial statistics. Then Dennis Lindley makes it to the top 100 in 1957 with the Lindley-Jeffreys paradox. From 1958 till 1961, Darroch is quoted several times for his (fine) formalisation of the capture-recapture experiments we were to study much later (Biometrika, 1992) with Ed George… In the 1960′s, Bayesian papers became more visible, including Don Fraser (1961) and Arthur Dempster’s Dempster-Shafer theory of evidence, as well as George Box and co-authors (1965, 1968) and Arnold Zellner (1964). Keith Hastings’ 1970 paper stands as the fifth most highly cited paper, even though it was ignored for almost two decades. The number of Bayesian papers kept increasing, including Binder’s (1978) cluster estimation, Efron and Morris’ (1972) James-Stein estimators, and Efron and Thisted’s (1978) terrific evaluation of Shakespeare’s vocabulary. From then on, the number of Bayesian papers gets too large to cover in its entirety. The 1980′s saw papers by Julian Besag (1977, 1989, and 1989 with Peter Clifford, which was yet another MCMC precursor) and Luke Tierney’s work (1989) on Laplace approximation. Carter and Kohn’s (1994) MCMC algorithm on state space models made it to the top 40, while Peter Green’s (1995) reversible jump algorithm came close to Hastings’ (1970) record, being the 8th most highly cited paper.
Since the more recent papers do not make it to the top 100 list, Mike Titterington’s coverage gets more exhaustive as it approaches the present, with an almost complete coverage of the final years. Overall, a fascinating journey through the years and through the reasons why *Biometrika* is such a great journal, and has constantly been so.

## Archive for reversible jump

## Biometrika, volume 100

Posted in Books, Statistics, University life with tags Bayesian statistics, Biometrika, Dempster-Shafer theory, Dennis Lindley, Hastings, Jeffreys-Lindley paradox, Julian Besag, Karl Pearson, Luke Tierney, MCMC, MCMC algorithms, Mike Titterington, Peter Clifford, Peter Green, reversible jump on March 5, 2013 by xi'an

## reversible jump on HMMs

Posted in Books, Mountains, pictures, Statistics, Travel, University life with tags Bayesian inference, Glasgow, hidden Markov models, MCMC algorithms, regime-switching time series, reversible jump, simulation, split and merge moves, variational Bayes methods on December 19, 2011 by xi'an

**H**ere is an email I received a few weeks ago about a paper written more than a decade ago in Glasgow with Tobias Rydén and Mike Titterington:

Sorry to bother you. I am a PhD student in economics. Recently, I have become very interested in your paper “Bayesian inference in hidden Markov models through the reversible jump Markov chain Monte Carlo method”. I would like to use your method to estimate a regime-switching economic model. Unfortunately, I do not exactly understand your paper, hence I am writing to ask for your help. My questions are:

- Is a split or merge move determined at the same time or sequentially? If the moves are determined at the same time, then accepting a split move implies that we cannot accept a merge move any more in the same sweep. If the moves are determined sequentially, it means that we can accept a split move first, then accept a merge move in the same sweep.
[Answer: The first interpretation is correct, except that the type of move is first selected at random; then only the corresponding move is generated and potentially accepted.]
- In the paper, you discuss in detail how to generate new transition probabilities in a split move. However, you did not discuss (probably, I am wrong) how to generate probabilities in each new state (the series Z_{t} in your paper). Could you please tell me how to generate the series Z_{t}? [Answer: check eqn (3).]
- My economic model involves multiple series (a vector hidden Markov model); could you refer me to some other papers for the vector model?
[Answer: If the observed series is multidimensional, the extension is formally straightforward, if potentially prone to slow mixing and low acceptance rates. If the hidden Markov chain is multidimensional, I have not seen a version of reversible jump in this setting. Maybe an extension of the variational methods described in Ghahramani and Jordan would help.]
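The mechanics behind the first answer above — select the move type at random first, then propose and accept or reject only that one move within the sweep — can be illustrated on a toy trans-dimensional target. The sketch below is *not* the split-and-merge HMM algorithm of the paper; it is a minimal reversible jump sampler on an assumed two-model target (model weights 0.7/0.3, standard normal components, identity Jacobian for the birth move), meant only to show the move-selection logic:

```python
import math
import random

def log_normal(x):
    """Log density of N(0, 1) at x."""
    return -0.5 * x * x - 0.5 * math.log(2 * math.pi)

# Toy target (assumed for illustration): model k in {1, 2} with weights
# 0.7/0.3, and theta | k ~ N(0, I_k).
PRIOR_K = {1: 0.7, 2: 0.3}

def log_target(k, theta):
    return math.log(PRIOR_K[k]) + sum(log_normal(t) for t in theta)

def rjmcmc(n_iter, seed=42):
    random.seed(seed)
    k, theta = 1, [0.0]
    counts = {1: 0, 2: 0}
    for _ in range(n_iter):
        if random.random() < 0.5:
            # Dimension-changing move: the move TYPE is drawn first, and only
            # that move is proposed -- so accepting a "birth" (split-like) move
            # precludes a "death" (merge-like) move within the same sweep.
            if k == 1:
                # birth: append u ~ N(0,1); Jacobian of (theta, u) -> theta' is 1
                u = random.gauss(0.0, 1.0)
                new = theta + [u]
                log_alpha = log_target(2, new) - log_target(1, theta) - log_normal(u)
                if random.random() < math.exp(min(0.0, log_alpha)):
                    k, theta = 2, new
            else:
                # death: drop the last coordinate (exact reverse of birth)
                u = theta[-1]
                log_alpha = log_target(1, theta[:-1]) + log_normal(u) - log_target(2, theta)
                if random.random() < math.exp(min(0.0, log_alpha)):
                    k, theta = 1, theta[:-1]
        else:
            # within-model random-walk Metropolis update
            prop = [t + random.gauss(0.0, 0.5) for t in theta]
            if random.random() < math.exp(min(0.0, log_target(k, prop) - log_target(k, theta))):
                theta = prop
        counts[k] += 1
    return counts

counts = rjmcmc(50_000)
frac1 = counts[1] / sum(counts.values())
print(round(frac1, 2))  # empirically close to 0.7, the assumed weight of model 1
```

With both components standard normal and a matching N(0,1) proposal for u, the acceptance ratio reduces to the ratio of model weights, so the chain’s occupancy of model 1 should converge to the assumed 0.7 — a quick sanity check that the birth/death pair satisfies detailed balance across dimensions.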

to which I replied that the questions showed a deep lack of understanding of what reversible jump is, and that the PhD student should first check the literature, for instance the great introductory paper by Charlie Geyer in the **Handbook of Markov chain Monte Carlo**, and then the original papers by Green (1995) and Richardson and Green (1997).

## Handbook of Markov chain Monte Carlo

Posted in Books, R, Statistics, University life with tags ABC, adaptive MCMC methods, base-jumping, Biometrika, book review, edited book, Gaussian state spaces, history of statistics, Markov chains, MCMC, Monte Carlo Statistical Methods, perfect sampling, R, reversible jump, simulation on September 22, 2011 by xi'an

**A**t JSM, John Kimmel gave me a copy of the **Handbook of Markov chain Monte Carlo**, as I had not (yet?!) received it. This handbook is edited by Steve Brooks, Andrew Gelman, Galin Jones, and Xiao-Li Meng, all first-class jedis of the MCMC galaxy. I had not had a chance to get a look at the book until now, as Jean-Michel Marin took it home for me from Miami, but, as he remarked in giving it back to me last week, the outcome truly is excellent! Of course, authors and editors being friends of mine, the reader may worry about the objectivity of this assessment; however the quality of the contents is clearly there and the book appears as a worthy successor to the tremendous *Markov chain Monte Carlo in Practice* by Wally Gilks, Sylvia Richardson and David Spiegelhalter. (I can attest to the involvement of the editors from the many rounds of reviews we exchanged about our MCMC history chapter!) The style of the chapters is rather homogeneous and there are a few R codes here and there. So, while I will still stick to our *Monte Carlo Statistical Methods* book for teaching MCMC to my graduate students next month, I think the book can well be used at a teaching level as well as a reference on the state-of-the-art MCMC technology. Continue reading

## Posterior model probabilities computed from model-specific Gibbs output [arXiv:1012.0073]

Posted in Books, Statistics with tags Bayesian Analysis, Bayesian model choice, Gibbs sampling, MCMC algorithms, pseudo-priors, reversible jump on December 9, 2010 by xi'an

“Expressing RJMCMC as simple Gibbs sampling provides the key innovation of our formulation: it allows us to fit models one at a time using ordinary MCMC and then compute model weights or Bayes factors by post-processing the Monte Carlo output.”

**R**ichard Barker (from the University of Otago, Dunedin, New Zealand) and William Link posted this new paper on arXiv. A point in their abstract attracted my attention, namely that they produce a “representation [that] allows [them] to fit models one at a time using ordinary MCMC and then compute model weights or Bayes factors by post-processing the Monte Carlo output”. This is quite interesting in that most attempts at building Bayes factors approximations from separate chains running each on a separate model have led to erroneous solutions. It appears however that the paper builds upon a technique fully exposed in the book written by the authors. Continue reading

## CoRe in CiRM [end]

Posted in Books, Kids, Mountains, pictures, R, Running, Statistics, Travel, University life with tags ABC, Bayes factor, Bayesian Core, Bayesian model choice, birth-and-death process, calanques, Chib's approximation, CIRM, Introducing Monte Carlo Methods with R, Luminy, Marseille, Morgiou, reversible jump, Sugiton on July 18, 2010 by xi'an

**B**ack home after those two weeks in CiRM for our “research in pair” invitation to work on the new edition of *Bayesian Core*, I am very grateful for the support we received from CiRM and, through it, from SMF and CNRS. Being “locked” away in such a remote place brought a considerable increase in concentration and decrease in stress levels. Although I was planning for more, we have made substantial advances on five chapters of the book (out of nine), including a completely new chapter (Chapter 8) on hierarchical models and a thorough rewriting of the normal chapter (Chapter 2), which, along with Chapter 1 (largely inspired from Chapter 1 of *Introducing Monte Carlo Methods with R*, itself inspired from the first edition of *Bayesian Core*!), is nearly done. Chapter 9 on image processing is also quite close to completion, with just the result of a batch simulation running on the Linux server in Dauphine to include in the ABC section. The only remaining major change is the elimination of reversible jump from the mixture chapter (to be replaced with Chib’s approximation) and from the time-series chapter (to be simplified into a birth-and-death process). Going back to the CiRM environment, I think we were lucky to come during the vacation season, as there is hardly anyone on the campus, which means no cars and no noise. The (good) feeling of remoteness is not as extreme as in Oberwolfach, but it is truly a quality environment. Besides, being able to work 24/7 in the math library is a major plus, as we could go and grab any reference we needed to check. (Presumably, CiRM is lacking in terms of statistics books compared with Oberwolfach, but it still provided most of the references we were looking for.) At last, the freedom to walk right out of the Centre into the national park for a run, a climb or even a swim (in Morgiou, rather than Sugiton) makes working there very tantalising indeed! I thus dearly hope I can enjoy this opportunity again in the near future…