Archive for Monte Carlo methods

of first importance

Posted in Books, Kids, Statistics, University life on June 14, 2022 by xi'an

My PhD student Charly Andral came to me with the question of the birthdate of importance sampling. I was under the impression that it had been created at the same time as the plain Monte Carlo method, being essentially the same thing, since

\int_{\mathfrak X} h(x)f(x)\,\text{d}x = \int_{\mathfrak X} h(x)\frac{f(x)}{g(x)}\,g(x)\,\text{d}x

hence due to von Neumann or Ulam, but he could not find a reference earlier than a 1949 proceedings publication by Herman Kahn in a seminar on scientific computation run by IBM. Despite writing a series of Monte Carlo papers in the late 1940’s and 1950’s, Kahn is not well-known in these circles (although mentioned in Fishman’s book), while being popular to some extent for his theorisation of nuclear war escalation and deterrence. (I wonder if the concept is developed in some of his earlier 1948 papers. A footnote in a 1951 paper with Goertzel signals that the approach was called quota sampling in their earlier papers. Charly has actually traced the earliest proposal as being Kahn’s, in a 14 June 1949 RAND preprint, beating Goertzel’s Oak Ridge National Laboratory preprint on quota sampling and importance functions by five days.)

(As a further marginal note, Kahn wrote an earlier preprint on Monte Carlo methods with T.E. Harris in April 1949, the same Harris as in Harris recurrence.)
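To make the identity concrete, here is a minimal R sketch of importance sampling (a toy example of my own, not drawn from any of the papers above: target f a standard Normal, h(x) = x², proposal g a Student t with 3 degrees of freedom):

set.seed(1)
n <- 1e5
x <- rt(n, df = 3)                 # n draws from the proposal g, a Student t (df = 3)
w <- dnorm(x) / dt(x, df = 3)      # importance weights f(x)/g(x)
mean(x^2 * w)                      # importance sampling estimate of E_f[X^2] = 1

The estimate should land close to 1, the second moment of the standard Normal, even though the draws come from g rather than f.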

rethinking the ESS published!

Posted in Statistics on May 3, 2022 by xi'an

Our paper Rethinking the Effective Sample Size, with Victor Elvira (the driving force behind the paper!) and Luca Martino, has now been published in the International Statistical Review! As discussed earlier on this blog, we wanted to re-evaluate the pros and cons of the effective sample size (ESS) as a tool for assessing the quality [or lack thereof] of a Monte Carlo approximation. It is particularly exploited in the context of importance sampling. The approximation proposed by Augustine Kong in 1992 has been widely used over the last 25 years, in part due to its simplicity as a practical rule of thumb. However, we show in this paper that the assumptions made in its derivation make it difficult to consider it a reasonable approximation of the ESS. Note that this reevaluation does not cover the use of ESS for Markov chain Monte Carlo algorithms, although there would also be much to tell about it..!
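For the record, Kong's rule of thumb takes the ESS as one over the sum of the squared normalised importance weights, roughly n/(1+cv²(w)). A self-contained R sketch on a toy importance sampler (the Normal target and Student t proposal being my own illustration, not taken from the paper):

set.seed(2)
n <- 1e5
x <- rt(n, df = 3)                 # draws from a Student t proposal
w <- dnorm(x) / dt(x, df = 3)      # unnormalised importance weights for a Normal target
wbar <- w / sum(w)                 # normalised weights
1 / sum(wbar^2)                    # Kong's ESS approximation, always between 1 and n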

computing Bayes 2.0

Posted in Books, Statistics, University life on December 11, 2020 by xi'an

Our survey paper on “computing Bayes”, written with my friends Gael Martin [who led this project most efficiently!] and David Frazier, has now been revised and resubmitted, with the new version now available on arXiv. Recognising that the entire range of the literature cannot be encompassed within a single review, esp. wrt the theoretical advances made on MCMC, the revised version is more focussed on the approximative solutions (when considering MCMC as “exact”!). As put by one of the referees [who were all very supportive of the paper], “the authors are very brave. To cover in a review paper the computational methods for Bayesian inference is indeed a monumental task and in a way an hopeless one”. This is the opportunity to congratulate Gael on her election to the Academy of Social Sciences of Australia last month. (Along with her colleague from Monash, Rob Hyndman.)

MCqMC 2020 live and free and online

Posted in pictures, R, Statistics, Travel, University life on July 27, 2020 by xi'an

The MCqMC 2020 conference that was supposed to take place in Oxford on 9-14 August has been turned into a free on-line conference, since travelling remains a challenge for most of us. Tutorials and plenaries will be live with questions on Zoom, with live-streaming and recorded copies on YouTube. They will probably run during 14:00-17:00 UK time (GMT+1), 15:00-18:00 CEST (GMT+2), and 9:00-12:00 ET. (Which will prove a wee bit of a challenge for West Coast and most of Asia and Australasia researchers, which is why for our One World IMS-Bernoulli conference we asked plenary speakers to duplicate their talks.) All other talks will be pre-recorded by contributors and uploaded to a website, with an online Q&A discussion section for each. As a reminder, here are the tutorials and plenaries:

Invited plenary speakers:

Aguêmon Yves Atchadé (Boston University)
Jing Dong (Columbia University)
Pierre L’Écuyer (Université de Montréal)
Mark Jerrum (Queen Mary University London)
Peter Kritzer (RICAM Linz)
Thomas Muller (NVIDIA)
David Pfau (Google DeepMind)
Claudia Schillings (University of Mannheim)
Mario Ullrich (JKU Linz)

Tutorials:

Fred Hickernell (IIT) — Software for Quasi-Monte Carlo Methods
Aretha Teckentrup (Edinburgh) — Markov chain Monte Carlo methods

Rao-Blackwellisation, a review in the making

Posted in Statistics on March 17, 2020 by xi'an

Recently, I have been contacted by a mainstream statistics journal to write a review of Rao-Blackwellisation techniques in computational statistics, in connection with an issue celebrating C.R. Rao's 100th birthday. As many, many techniques can be interpreted as weak forms of Rao-Blackwellisation, e.g. all auxiliary variable approaches, I am clearly facing an abundance of riches and would thus welcome suggestions from Og's readers on the major advances in Monte Carlo methods that can be connected with the Rao-Blackwell-Kolmogorov theorem. (On the personal and anecdotal side, I only met C.R. Rao once, in 1988, when he came for a seminar at Purdue University where I was spending the year.)
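As a reminder of the conditioning at work, here is a minimal R sketch of Rao-Blackwellisation as variance reduction, on a toy hierarchical model of my own choosing (not one of the techniques to be reviewed): with X|Y=y ~ Poisson(y) and Y ~ Exp(1), both averages below estimate E[X] = 1, but the conditional expectation E[X|Y] = Y has half the variance of X.

set.seed(3)
n <- 1e4
y <- rexp(n)                       # auxiliary variable Y ~ Exp(1)
x <- rpois(n, lambda = y)          # X | Y ~ Poisson(Y)
mean(x)                            # plain Monte Carlo estimate of E[X] = 1
mean(y)                            # Rao-Blackwellised estimate, since E[X|Y] = Y
c(var(x), var(y))                  # Var(X) = 2 exceeds Var(Y) = 1, hence the gain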
