Archive for Clifton

resampling methods

Posted in Books, pictures, Running, Statistics, Travel, University life on December 6, 2017 by xi'an

A paper that was arXived [and that I missed!] last summer is a work on resampling by Mathieu Gerber, Nicolas Chopin (CREST), and Nick Whiteley. Resampling is used to sample from a weighted empirical distribution and to correct for very small weights in a weighted sample that otherwise lead to degeneracy in sequential Monte Carlo (SMC). Since this step is based on random draws, it induces noise (while improving the estimation of the target); reducing this noise is thus desirable, hence the appeal of replacing plain multinomial resampling with more advanced schemes. The initial motivation is sequential Monte Carlo, where resampling is rife and seemingly compulsory, but this also applies to importance sampling when considering several schemes at once. I remember discussing alternative schemes with Nicolas, who was then completing his PhD, as well as with Olivier Cappé, Randal Douc, and Eric Moulines at the time (circa 2004) when we were working on the Hidden Markov book. And getting then only a somewhat vague idea as to why systematic resampling failed to converge.
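As a crude illustration of the baseline scheme (a minimal Python sketch of my own, not code from the paper), plain multinomial resampling simply draws the n ancestor indices i.i.d. from the weighted empirical distribution, which is where the extra noise comes from:

```python
import numpy as np

def multinomial_resample(weights, rng=None):
    """Plain multinomial resampling: draw n ancestor indices i.i.d.
    from the weighted empirical distribution defined by the weights."""
    rng = np.random.default_rng() if rng is None else rng
    w = np.asarray(weights, dtype=float)
    w /= w.sum()                        # normalise the weights
    n = len(w)
    return rng.choice(n, size=n, p=w)   # indices of the surviving particles

# toy check: tiny weights tend to vanish, large ones get duplicated
print(multinomial_resample([0.70, 0.25, 0.03, 0.01, 0.01]))
```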

In this paper, Mathieu, Nicolas and Nick show that stratified sampling (where an independent uniform is generated on every interval of length 1/n) enjoys some form of consistency, while systematic sampling (where the “same” uniform is generated on every interval of length 1/n) does not necessarily enjoy this consistency. There actually exist cases where convergence does not occur. However, a residual version of systematic sampling (where systematic sampling is applied to the residual fractional parts of the n-enlarged weights) is itself consistent.
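To make the two schemes concrete, here is a short Python sketch (my own function names and a simplified rendering, not the paper's algorithms): stratified sampling uses one independent uniform per interval [i/n,(i+1)/n), systematic sampling reuses a single uniform shifted across all intervals, and both invert the cumulated weights:

```python
import numpy as np

def _invert_cdf(weights, u):
    """Map points u in [0,1) to particle indices through the weight cdf."""
    cdf = np.cumsum(np.asarray(weights, dtype=float))
    cdf /= cdf[-1]                      # guard against rounding drift
    return np.searchsorted(cdf, u)

def stratified_resample(weights, rng=None):
    """One independent uniform in each interval of length 1/n."""
    rng = np.random.default_rng() if rng is None else rng
    n = len(weights)
    u = (np.arange(n) + rng.random(n)) / n
    return _invert_cdf(weights, u)

def systematic_resample(weights, rng=None):
    """The 'same' uniform shifted to every interval of length 1/n."""
    rng = np.random.default_rng() if rng is None else rng
    n = len(weights)
    u = (np.arange(n) + rng.random()) / n
    return _invert_cdf(weights, u)
```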

The paper also studies the surprising feature uncovered by Kitagawa (1996) that stratified sampling applied to an ordered sample brings an error of O(1/n²) between the cdfs rather than the usual O(1/n). It took me a while to even understand the distinction between the original and the ordered versions (maybe because Nicolas used the empirical cdf during his SAD (Stochastic Algorithm Day!) talk, an ecdf that is the same for the ordered and the initial samples). And both systematic and deterministic sampling become consistent in this case. The result was shown in dimension one by Kitagawa (1996) but extends to larger dimensions via the magical trick of the Hilbert curve.
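In dimension one, the ordered version simply amounts to sorting the particles before the stratified step, as in the following self-contained sketch (again my own illustration, leaving aside the Hilbert curve ordering needed in higher dimensions):

```python
import numpy as np

def ordered_stratified_resample(x, weights, rng=None):
    """Stratified resampling after sorting the (one-dimensional) particles,
    the ordered variant for which Kitagawa (1996) reports an O(1/n^2) error
    between cdfs instead of the usual O(1/n)."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x, dtype=float)
    w = np.asarray(weights, dtype=float)
    order = np.argsort(x)                     # order the particles by value
    cdf = np.cumsum(w[order])
    cdf /= cdf[-1]                            # normalise the cumulated weights
    n = len(w)
    u = (np.arange(n) + rng.random(n)) / n    # stratified uniforms
    return x[order][np.searchsorted(cdf, u)]  # resampled particles
```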

forgotten snapshot from Bristol

Posted in Statistics on February 27, 2013 by xi'an

house in Clifton, Bristol, Sept. 28, 2012

Bristol rainbow

Posted in pictures, Running, Travel on September 30, 2012 by xi'an


structure and uncertainty, Bristol, Sept. 27

Posted in pictures, Running, Statistics, Travel, University life on September 28, 2012 by xi'an

The last sessions at the SuSTain workshop were equally riveting but I alas had to leave early to get a noon flight—as it happens, while I expected to get home early enough to work, run, cook, and do maths with my daughter, my taxi got stuck in an endless traffic jam and I only had time for the maths!—, hence missing the talks by Chris Holmes—second time after Kyoto!—, Sofia Massa, and Arnoldo Frigessi… I am glad I managed to get Michael Newton’s and Forrest Crawford’s talks, though, as Michael presented a highly pedagogical entry to computational concepts related to systems biology (a potential candidate for an MCMSki IV talk?) and Forrest discussed some birth-and-death processes, including the Yule process, that allowed for closed-form expressions of their Laplace transforms via continued fractions. (Continued fractions, one of my favourite mathematical objects!!! Rarely appearing in statistics, though…) I have to check on Forrest’s recent papers to understand how widely this approach applies to phylogenetic trees, but this opens a fairly interesting alternative to ABC!

This was a highly enjoyable meeting, first and foremost due to the quality of the talks and of their scheduling, but also to the pleasure of seeing again many friends of many years—notice how I carefully avoided using “old friends”!—, to the relaxed and open atmosphere of the workshop—in the terrific location of Goldney Hall—and of course to the unofficial celebration of Peter Green’s deeds and contributions to the field, the profession, and the statistics group in Bristol! Deeds and contributions so far, as I am sure he will keep contributing in many ways in the coming years and decades, as already shown by his committed involvement in the very recent creation of BayesComp. I thus most gladly join the other participants of this workshop both to thank him most sincerely for those many and multifaceted contributions and to wish him all the best for the coming decades!

As an aside, I also enjoyed being “back” in Bristol once again, as I do like the city, the surrounding Somerset countryside, nearby South Wales, and the wide running possibilities (from the Downs to the Mendip Hills!). While I have sampled many great hotels in Bristol and Clifton over the years, I now rank the Avon Gorges Hotel, where I stayed this time, quite high on the list, both for its convenient (running!) location and its top-quality facilities (incl. high-speed WiFi!).

up the Downs

Posted in pictures, Running, Travel on September 27, 2012 by xi'an
