## Monte Carlo Statistical Methods third edition

**L**ast week, George Casella and I worked around the clock on starting the third edition of *Monte Carlo Statistical Methods* by detailing the changes to make and designing the new table of contents. The new edition will not see a revolution in the presentation of the material but rather a more mature perspective on what matters most in statistical simulation:

Chapter 1 (Introduction) will include real datasets with more complex motivating models (e.g., Bayesian lasso).

Chapter 2 (Random generation) will update the section on uniform generators and include a section on ABC. (I will also include a note on the irrelevance of hardware random generators.)
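
As a taster for the new ABC section, here is a minimal ABC-by-rejection sketch (my own toy illustration, not an excerpt from the chapter): infer the mean of a N(θ, 1) sample through its average, with a N(0, 1) prior, keeping prior draws whose simulated average falls within ε of the observed one. The model, sample size, and tolerance are all my choices.

```python
# A minimal ABC rejection sampler (illustrative sketch, not from the book):
# infer the mean of a N(theta, 1) model from its sample average,
# with a N(0, 1) prior on theta.
import random

random.seed(42)
data = [random.gauss(2.0, 1.0) for _ in range(50)]
obs_stat = sum(data) / len(data)          # observed summary statistic

def abc_rejection(n_keep, eps):
    """Keep prior draws whose simulated summary lies within eps of the data."""
    accepted = []
    while len(accepted) < n_keep:
        theta = random.gauss(0.0, 1.0)    # draw from the prior
        sim = [random.gauss(theta, 1.0) for _ in range(50)]
        if abs(sum(sim) / len(sim) - obs_stat) < eps:
            accepted.append(theta)
    return accepted

sample = abc_rejection(200, 0.2)
print(sum(sample) / len(sample))          # close to the posterior mean
```

Shrinking `eps` trades acceptance rate for accuracy; with `eps = 0` and a sufficient summary this would reproduce exact sampling from the posterior, which is precisely what is infeasible in genuine ABC settings where the likelihood cannot be evaluated.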

Chapter 3 (Monte Carlo integration) will include a reference to INLA, the integrated nested Laplace approximation of Rue, Martino and Chopin, as well as to our recent vanilla Rao-Blackwellisation paper.
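
The general flavour of Rao-Blackwellisation (the generic conditioning idea, not the specific "vanilla" scheme of the paper) can be conveyed on a toy hierarchy of my choosing: both estimators below are unbiased for E[X], but the conditioned one has smaller variance.

```python
# A toy Rao-Blackwellisation (illustrative sketch, unrelated to the
# specific "vanilla" scheme of the paper): estimate E[X] in the
# hierarchy Y ~ Gamma(3, 1), X | Y ~ Poisson(Y).
import math
import random

random.seed(1)

def poisson(lam):
    """Knuth's Poisson sampler (adequate for small rates)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

n = 20000
ys = [random.gammavariate(3.0, 1.0) for _ in range(n)]
xs = [poisson(y) for y in ys]

naive = sum(xs) / n                 # crude Monte Carlo estimate of E[X]
rao_b = sum(ys) / n                 # uses E[X | Y] = Y: same mean, less variance

var_naive = sum((x - naive) ** 2 for x in xs) / n
var_rb = sum((y - rao_b) ** 2 for y in ys) / n
print(naive, rao_b)                 # both near E[X] = 3
print(var_naive > var_rb)           # conditioning never increases variance
```

Here Var(X) = 6 while Var(E[X|Y]) = Var(Y) = 3, so the Rao-Blackwellised average halves the variance at no extra simulation cost.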

Chapter 4 (Control of convergence) will remove the multivariate normal approximation from the beginning, replace it with the Brownian bound solution already presented in **Introducing Monte Carlo Methods with R**, and include connections with the Read Paper of Kong et al. and the multiple mixture paper of Owen and Zhou.

Chapter 5 (Stochastic optimisation) will include some examples from **Introducing Monte Carlo Methods with R** and add recent results on EM standard errors by Cappé and Moulines.

Chapter 6 (Markov chains) will now be split into two chapters. The first will deal with the basics of Markov chains, independent of MCMC algorithms. The second will be devoted to the theory specific to MCMC use, including regeneration, Peskun ordering, batch means, spectral analysis…
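
The basics covered in the first of those chapters can indeed be illustrated with no MCMC in sight. A sketch of mine, for a two-state chain with a transition matrix chosen so the stationary distribution is (5/6, 1/6):

```python
# A minimal illustration of Markov chain basics (not from the book):
# the empirical occupation frequencies of an ergodic two-state chain
# converge to its stationary distribution.
import random

random.seed(7)
P = [[0.9, 0.1],   # transition matrix: row = current state
     [0.5, 0.5]]

state, visits, n = 0, [0, 0], 100000
for _ in range(n):
    state = 0 if random.random() < P[state][0] else 1
    visits[state] += 1

freq0 = visits[0] / n
print(freq0)   # near the stationary probability 5/6 = 0.833...
```

Solving π = πP by hand gives π₀ = 5π₁, hence π = (5/6, 1/6), which the long-run frequencies recover.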

Chapter 7 (Metropolis-Hastings algorithms) will include a very basic example at the beginning and cover algorithms beyond the random walk. Adaptive MCMC will also be covered in this chapter.
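
In the spirit of the very basic opening example mentioned above, here is a random-walk Metropolis-Hastings sketch of my own (target and proposal scale are my choices, not the book's):

```python
# A very basic random-walk Metropolis-Hastings sketch: target N(0, 1),
# with a Gaussian random-walk proposal of scale 1.
import math
import random

random.seed(3)

def log_target(x):
    return -0.5 * x * x          # log of the N(0,1) density, up to a constant

x, chain = 0.0, []
for _ in range(50000):
    prop = x + random.gauss(0.0, 1.0)            # random-walk proposal
    if math.log(random.random()) < log_target(prop) - log_target(x):
        x = prop                                  # accept; otherwise keep x
    chain.append(x)

mean = sum(chain) / len(chain)
var = sum(c * c for c in chain) / len(chain) - mean ** 2
print(mean, var)    # near 0 and 1, the target's moments
```

Working on the log scale, as above, is the standard safeguard against underflow in the acceptance ratio.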

Chapter 8 (Slice sampling) and Chapter 9 (2-stage Gibbs sampling) will be reunited, as in the first edition of the book (!). The new chapter will also compare mixture models with product partition models, and hopefully do a better job of covering Liu, Wong and Kong (1994).
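
The two-stage Gibbs sampler itself fits in a few lines; here is a sketch of mine (not from either edition) alternating the two full conditionals of a bivariate normal with correlation ρ:

```python
# A two-stage Gibbs sampler sketch: alternate the full conditionals of a
# standard bivariate normal with correlation rho; the draws recover that
# correlation empirically.
import math
import random

random.seed(11)
rho = 0.8
sd = math.sqrt(1.0 - rho * rho)     # conditional standard deviation

x = y = 0.0
xs, ys = [], []
for _ in range(50000):
    x = random.gauss(rho * y, sd)   # X | Y = y ~ N(rho*y, 1 - rho^2)
    y = random.gauss(rho * x, sd)   # Y | X = x ~ N(rho*x, 1 - rho^2)
    xs.append(x)
    ys.append(y)

n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) / n
print(cov)   # near rho = 0.8
```

As ρ grows toward 1 the two conditionals become nearly degenerate and the chain mixes ever more slowly, a classic cautionary point about Gibbs sampling.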

Chapter 10 (general Gibbs sampling) will cover mixed linear models and hierarchical models in more detail. It will include entries on parameter expansion, Dirichlet processes, JAGS, and the Bayesian lasso.

Chapter 11 (Reversible jump algorithms) will undergo an in-depth change to become a chapter on computational techniques for model choice. This means covering intra-model as well as inter-model computational tools like bridge, path, umbrella, nested sampling, harmonic means, Chib’s representation, &tc. We will considerably reduce the entry on reversible jump and cover other stochastic search methods, like the shotgun stochastic search of Hans, Dobra and West.
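
Among those evidence estimators, the harmonic mean identity is the easiest to sketch, and its notorious instability is part of the point to be made in the chapter. Here is a toy demonstration of mine on a conjugate model where the exact marginal likelihood is known (the tight N(0, 1/3) prior is deliberately chosen so the estimator's variance stays finite, which is not true in general):

```python
# Harmonic mean estimator of the marginal likelihood on a tame conjugate
# model: x | theta ~ N(theta, 1), theta ~ N(0, 1/3), one observation x = 1.
# Conjugacy gives posterior N(x/4, 1/4) and evidence N(x; 0, 4/3).
import math
import random

random.seed(5)
x = 1.0                      # single observation

def likelihood(theta):
    """N(x; theta, 1) density."""
    return math.exp(-0.5 * (x - theta) ** 2) / math.sqrt(2 * math.pi)

draws = [random.gauss(x / 4, 0.5) for _ in range(100000)]   # exact posterior draws
harm = 1.0 / (sum(1.0 / likelihood(t) for t in draws) / len(draws))

exact = math.exp(-0.5 * x * x / (4 / 3)) / math.sqrt(2 * math.pi * 4 / 3)
print(harm, exact)   # the two should be close in this tame example
```

With a diffuse prior the inverse likelihoods acquire heavy tails and the same estimator can be arbitrarily bad, which is why the chapter will favour bridge and path alternatives.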

Chapter 12 (Diagnosing convergence) will focus more precisely on the methods that survived the test of time, removing some parts and illustrating remaining methods with coda output. Batch means and effective sample size will be part of the diagnostics.
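
A batch-means effective sample size, of the kind those diagnostics rely on, can be sketched on an AR(1) chain where the answer is known (the chain, batch length, and autocorrelation are my choices for illustration):

```python
# Batch-means effective sample size sketch: for an AR(1) chain with
# autocorrelation phi = 0.5, the integrated autocorrelation time is
# (1 + phi)/(1 - phi) = 3, so the ESS should be about n / 3.
import random

random.seed(9)
phi, n, b = 0.5, 100000, 100          # b = batch length
x, chain = 0.0, []
for _ in range(n):
    x = phi * x + random.gauss(0.0, 1.0)
    chain.append(x)

mean = sum(chain) / n
s2 = sum((c - mean) ** 2 for c in chain) / n          # marginal variance

means = [sum(chain[i:i + b]) / b for i in range(0, n, b)]
mb = sum(means) / len(means)
var_bm = sum((m - mb) ** 2 for m in means) / (len(means) - 1)

ess = n * s2 / (b * var_bm)           # batch-means effective sample size
print(ess)   # roughly n / 3
```

The quantity `b * var_bm` estimates the asymptotic variance of the ergodic average, so the ratio above measures how many independent draws the correlated chain is worth.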

Chapter 13 (Perfect sampling) will disappear into a section of Chapter 6 (Markov chains), as we fear perfect sampling has remained more of an elegant theoretical construct than a genuinely implementable technique, despite the fascination it inspired in the community, us included.

Chapter 14 (Iterated and Sequential Importance Sampling) will become a chapter on Iterated and Sequential Monte Carlo, with an extensive rewriting in order to include some of the most recent advances in particle systems, including the 2009 Read Paper of Andrieu, Doucet and Holenstein.

**O**verall, the goal is to make the book more focussed on well-established techniques, reinforcing the theoretical backup whenever possible, as well as to cover recent developments of importance in the field. Given the availability of the companion **Introducing Monte Carlo Methods with R**, we will not cover the practicals of **R** implementation, even though we will make all **R** codes available once the revision is completed. We hope to be done by next summer, even though the simultaneous handling of three other books will certainly be a liability for me…

September 12, 2011 at 4:07 pm

Dear Prof. Robert,

Is there any update on the 3rd edition, in terms of an expected release date? I am wondering whether to get the 2nd edition now or wait for the 3rd one, in case the latter is not too far off.

Kind regards

September 12, 2011 at 4:53 pm

Sorry, the 3rd edition is still in limbo, mostly due to the editorial changes at Springer (and also to our busy schedules!)

February 24, 2011 at 5:46 pm

Is there a listserv to be notified when the 3rd edition becomes eligible for early purchase sign-ups through, say, Amazon? I have both 2nd edition and Intro MCM with R, and am using them *heavily*.

Thanks.

February 24, 2011 at 6:41 pm

Jan: thanks for the question but I have not yet started on the new edition so this should take quite a while…
