Monte Carlo Statistical Methods third edition

Last week, George Casella and I worked around the clock on starting the third edition of Monte Carlo Statistical Methods by detailing the changes to make and designing the new table of contents. The new edition will not see a revolution in the presentation of the material but rather a more mature perspective on what matters most in statistical simulation:

Chapter 1 (Introduction) will include real datasets with more complex motivating models (e.g., Bayesian lasso).

Chapter 2 (Random generation) will update the section on uniform generators and include a section on ABC. (I will also include a note on the irrelevance of hardware random generators.)

Chapter 3 (Monte Carlo integration) will include a reference to INLA, the integrated nested Laplace approximation of Rue, Martino and Chopin, as well as to our recent vanilla Rao-Blackwellisation paper.

Chapter 4 (Control of convergence) will remove the multivariate normal approximation of the beginning, replace it with the Brownian bound solution already presented in Introducing Monte Carlo Methods with R, and include connections with the Read Paper of Kong et al. and the multiple mixture paper of Owen and Zhou.

Chapter 5 (Stochastic optimisation) will include some examples from Introducing Monte Carlo Methods with R and add recent results on EM standard errors by Cappé and Moulines.

Chapter 6 (Markov chains) will now be split into two chapters. The first will deal with the basics of Markov chains, independent of MCMC algorithms. The second will be devoted to the theory specific to MCMC use, including regeneration, Peskun ordering, batch means, spectral analysis…

Chapter 7 (Metropolis-Hastings algorithms) will include a very basic example at the beginning and cover algorithms beyond the random walk. Adaptive MCMC will also be covered in this chapter.
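
Since the chapter will open with a very basic example, here is a minimal sketch of what such an opening random-walk Metropolis illustration could look like (written in Python for self-containment, even though the book's code is in R; the standard-normal target and the scale 2.4 are illustrative choices of mine, not the book's):

```python
import math
import random

def rw_metropolis(log_target, x0, scale, n_iter, seed=0):
    """Random-walk Metropolis: propose x' = x + N(0, scale^2) and accept
    with probability min(1, pi(x') / pi(x))."""
    rng = random.Random(seed)
    x, lp = x0, log_target(x0)
    chain = []
    for _ in range(n_iter):
        prop = x + rng.gauss(0.0, scale)           # symmetric Gaussian proposal
        lp_prop = log_target(prop)
        if math.log(rng.random()) < lp_prop - lp:  # Metropolis acceptance step
            x, lp = prop, lp_prop
        chain.append(x)                            # repeat x when rejecting
    return chain

# toy target: standard normal, log-density up to an additive constant
chain = rw_metropolis(lambda t: -0.5 * t * t, x0=0.0, scale=2.4, n_iter=10_000)
```

With a well-scaled proposal, the empirical mean of `chain` should be close to 0 and its empirical variance close to 1.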

Chapter 8 (Slice sampling) and Chapter 9 (2-stage Gibbs sampling) will be reunited, as in the first edition of the book (!). The new chapter will also compare mixture models with product partition models and, hopefully, do a better job of covering Liu, Wong and Kong (1994).

Chapter 10 (general Gibbs sampling) will cover mixed linear models and hierarchical models in more detail. It will include entries on parameter expansion, Dirichlet processes, JAGS, and the Bayesian lasso.

Chapter 11 (Reversible jump algorithms) will face an in-depth change to become a chapter on computational techniques for model choice. This means covering intra-model as well as inter-model computational tools like bridge, path, umbrella, and nested sampling, harmonic means, Chib’s representation, &tc. We will considerably reduce the entry on reversible jump and cover other stochastic search methods, like the shotgun stochastic search of Hans, Dobra and West.

Chapter 12 (Diagnosing convergence) will focus more precisely on the methods that have survived the test of time, removing some parts and illustrating the remaining methods with coda output. Batch means and effective sample size will be part of the diagnostics.
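
As a rough sketch of the two diagnostics just mentioned, batch means and the effective sample size can be computed as follows (a toy Python version under my own simplifying choices, e.g., 20 equal-length batches; coda's actual implementations differ):

```python
import math
import random

def batch_means_se(chain, n_batches=20):
    """Batch-means estimate of the Monte Carlo standard error of the mean:
    split the chain into batches and treat the batch means as (roughly)
    independent replicates of the overall mean."""
    b = len(chain) // n_batches            # batch length
    means = [sum(chain[i * b:(i + 1) * b]) / b for i in range(n_batches)]
    overall = sum(means) / n_batches
    var_bm = sum((m - overall) ** 2 for m in means) / (n_batches - 1)
    return math.sqrt(var_bm / n_batches)

def effective_sample_size(chain):
    """ESS = sample variance / squared standard error of the mean, i.e. the
    number of iid draws carrying the same information as the correlated chain."""
    n = len(chain)
    mean = sum(chain) / n
    var = sum((x - mean) ** 2 for x in chain) / (n - 1)
    return var / batch_means_se(chain) ** 2
```

For an iid chain the ESS is close to the actual chain length, while for a slowly mixing MCMC chain it can be orders of magnitude smaller.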

Chapter 13 (Perfect sampling) will disappear into a section of Chapter 6 (Markov chains), as we fear perfect sampling has remained more of an elegant theoretical construct than a genuinely implementable technique, despite the fascination it inspired in the community, us included.

Chapter 14 (Iterated and Sequential Importance Sampling) will become a chapter on Iterated and Sequential Monte Carlo, with an extensive rewriting in order to include some of the most recent advances in particle systems, including the 2009 Read Paper of Andrieu, Doucet and Holenstein.
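
As a pointer to the particle methods this chapter will cover, a bare-bones bootstrap particle filter (sequential importance sampling with resampling) for a toy linear Gaussian model might look as follows; the AR(1) model and all parameter values are illustrative choices of mine:

```python
import math
import random

def bootstrap_filter(ys, n_particles=500, phi=0.9, sigma_x=1.0, sigma_y=1.0, seed=0):
    """Bootstrap particle filter for the state-space model
    x_t = phi * x_{t-1} + N(0, sigma_x^2),  y_t = x_t + N(0, sigma_y^2);
    returns the sequence of filtering means E[x_t | y_1:t]."""
    rng = random.Random(seed)
    parts = [rng.gauss(0.0, sigma_x) for _ in range(n_particles)]
    means = []
    for y in ys:
        # propagate particles through the state equation (prior dynamics as proposal)
        parts = [phi * x + rng.gauss(0.0, sigma_x) for x in parts]
        # weight each particle by the observation likelihood
        ws = [math.exp(-0.5 * ((y - x) / sigma_y) ** 2) for x in parts]
        total = sum(ws)
        ws = [w / total for w in ws]
        means.append(sum(w * x for w, x in zip(ws, parts)))
        # multinomial resampling to fight weight degeneracy
        parts = rng.choices(parts, weights=ws, k=n_particles)
    return means
```

The resampling step is what distinguishes this filter from plain sequential importance sampling and keeps the particle weights from collapsing onto a single trajectory.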

Overall, the goal is to make the book more focussed on well-established techniques, reinforcing the theoretical backup whenever possible, as well as to cover recent developments of importance in the field. Given the availability of the companion Introducing Monte Carlo Methods with R, we will not cover the practicals of R implementation, even though we will make all R code available once the revision is completed. We hope to be done by next summer, even though the simultaneous handling of three other books will certainly be a liability for me…

12 Responses to “Monte Carlo Statistical Methods third edition”

  1. I was wondering if the third edition has been released? Thanks.

    • Alas, George’s sudden demise put a stop to these plans and I have not been re-considering a third edition since then.

      • So sorry. Casella wrote several books that have greatly improved my understanding of statistics. Currently I am reading the Monte Carlo book (2nd ed.) by you two.

      • Yes, George was quite engaged with this third edition and came to Paris in 2010 to start working on it. We did not make it on time, though, mostly my fault!, and the longer I wait the more difficult it gets, as the entire book would need to be updated, I fear!

  2. […] sampling is clear there. Which makes me think we should present the comparison that way in the next edition of Monte Carlo Statistical Methods. […]

  3. Dear Prof. Robert

    is there any update on the 3rd edition, in terms of expected release date? I am wondering whether to get the 2nd edition now or wait for the 3rd one, in case the latter does not come out too far away in time.

    Kind regards

  4. […] belief that something needs to be done to counteract restricted supports. However, there is no mathematical reason for doing so! Consider the following […]

  5. […] came to see me the other day with a bivariate Poisson distribution and a question about using EM in this framework. The problem boils down to adding one correlation parameter and an extra term in […]

  6. Is there a listserv to be notified when the 3rd edition becomes eligible for early purchase sign-ups through, say, Amazon? I have both 2nd edition and Intro MCM with R, and am using them *heavily*.

    Thanks.

  7. […] I will most certainly use this material in my graduate courses and also include part of it in the revision of Monte Carlo Statistical […]
