Bayesian computation: a summary of the current state, and samples backwards and forwards
“The Statistics and Computing journal gratefully acknowledges the contributions for this special issue, celebrating 25 years of publication. In the past 25 years, the journal has published innovative, distinguished research by leading scholars and professionals. Papers have been read by thousands of researchers world-wide, demonstrating the global importance of this field. The Statistics and Computing journal looks forward to many more years of exciting research as the field continues to expand.” Mark Girolami, Editor in Chief for The Statistics and Computing journal
Our joint review [by Peter Green, Krzysztof Łatuszyński, Marcelo Pereyra, and myself] of the important features of Bayesian computation [open access!] has already appeared in the special 25th anniversary issue of Statistics & Computing! Along with the following papers
- Statistics and computing: the genesis of data science, David J. Hand, Founding Editor
- EM for mixtures: Initialization requires special care, Jean-Patrick Baudry, Gilles Celeux
- Sequential Monte Carlo methods for Bayesian elliptic inverse problems, Alexandros Beskos, Ajay Jasra, Ege A. Muzaffer, Andrew M. Stuart
- Bayesian inference via projections, Ricardo Silva, Alfredo Kalaitzis
- Computing functions of random variables via reproducing kernel Hilbert space representations, Bernhard Schölkopf, Krikamol Muandet, Kenji Fukumizu, Stefan Harmeling, Jonas Peters
- The Poisson transform for unnormalised statistical models, Simon Barthelmé, Nicolas Chopin
- Scalable estimation strategies based on stochastic approximations: classical results and new insights, Panos Toulis, Edoardo M. Airoldi
- de Finetti Priors using Markov chain Monte Carlo computations, Sergio Bacallado, Persi Diaconis, Susan Holmes
- Simulation-efficient shortest probability intervals, Ying Liu, Andrew Gelman, Tian Zheng
- Flexible parametric bootstrap for testing homogeneity against clustering and assessing the number of clusters, Christian Hennig, Chien-Ju Lin
which means very good company, indeed! And happy B’day to Statistics & Computing!
June 26, 2015 at 7:17 am
Thanks! This is a great summary! I have been looking forward to such a comprehensive review of big Bayesian computation for a while. I have one quick comment on the parallel MCMC part, as it relates to my own work. Neiswanger et al. (2013) does not really solve the mixture-component explosion issue, even though they use the “one-at-a-time” trick. Supposing each subset posterior has 10,000 samples, their method needs to multiply at least two kernel densities, which already yields 10⁸ mixture components, each associated with a different weight. An independent Metropolis sampler works poorly in such a situation: it is likely to get stuck at one “good” component and never move away, so it handles multi-modality poorly. We recently wrote a new paper that fully addresses this component explosion issue by using multi-scale histograms (http://arxiv.org/pdf/1506.03164v1.pdf, comments gratefully welcomed!). We include an example in the paper where the posterior possesses two heterogeneous modes; Neiswanger et al. (2013) fails to allocate the correct probability mass to the two modes, for the reason above.
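[To make the combinatorial point in the comment above concrete, here is a minimal univariate sketch of the Gaussian-mixture product underlying the subset-posterior combination step. It assumes equal bandwidths and a hypothetical helper `product_of_kdes` (not from any paper's code): multiplying two Gaussian KDEs with N components each gives an exact Gaussian mixture with N² weighted components, which is why two subsets of 10,000 samples already produce 10⁸ components.]

```python
import numpy as np

def product_of_kdes(samples1, samples2, h):
    """Exact product of two equal-bandwidth Gaussian KDEs.

    KDE_k(x) = (1/N) sum_i N(x; x_i^{(k)}, h^2).  The product is a
    Gaussian mixture with one component per pair (i, j):
    mean (x_i + x_j)/2, variance h^2/2, and weight
    w_ij proportional to exp(-(x_i - x_j)^2 / (4 h^2)).
    """
    mu = (samples1[:, None] + samples2[None, :]) / 2.0
    diff = samples1[:, None] - samples2[None, :]
    w = np.exp(-diff**2 / (4.0 * h**2))
    w /= w.sum()  # normalise the N1*N2 component weights
    return mu.ravel(), w.ravel()

rng = np.random.default_rng(0)
x1 = rng.normal(0.0, 1.0, size=100)   # 100 draws from subset posterior 1
x2 = rng.normal(0.0, 1.0, size=100)   # 100 draws from subset posterior 2
means, weights = product_of_kdes(x1, x2, h=0.1)
print(len(means))  # 100 * 100 = 10000 components from just two subsets
```

[With 10,000 samples per subset the same product has 10⁸ components, and each further subset multiplies the count by another factor of N — the explosion the multi-scale-histogram approach is designed to avoid.]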
June 26, 2015 at 9:07 am
Thanks for the comments. Your paper is on my [long] reading list!