Archive for Statisfaction

assessing MCMC convergence

Posted in Books, Statistics, University life on June 6, 2019 by xi'an

When MCMC became mainstream in the 1990s, there was a flurry of proposals to check, assess, and even guarantee convergence to the stationary distribution, as discussed in our MCMC book. Along with Chantal Guihenneuc and Kerrie Mengersen, we also maintained for a while a review webpage categorising these. Niloy Biswas and Pierre Jacob have recently posted a paper where they propose the use of couplings (and unbiased MCMC) towards deriving bounds on different metrics between the target and the current distribution of the Markov chain. Two chains are created from a given kernel and coupled with a lag of L, meaning that after a while, the two chains become one with a time difference of L. (The supplementary material contains many details on how to induce coupling.) The distance to the target can then be bounded by a sum of distances between the two chains until they merge. The above picture from the paper is a comparison of a Pólya-Gamma sampler with several HMC samplers for a logistic target (not involving the Pima Indian dataset!). The larger the lag L, the more accurate the bound. But the larger the lag, the more expensive the assessment of how many steps are needed for convergence. Especially when considering that the evaluation requires restarting the chains from scratch and rerunning until they couple again, rather than continuing one run, which can only bring the chain closer to stationarity and to being distributed from the target. I thus wonder at the possibility of some Rao-Blackwellisation of the simulations used in this assessment (while realising once more that assessing convergence almost inevitably requires another order of magnitude than convergence itself!). Without a clear idea of how to do it… For instance, keeping the values of the chain(s) at the time of coupling is not directly helpful to create a sample from the target, since they are not distributed from that target.
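A rough sketch of how the simulated lag-L meeting times could be turned into a convergence bound: the max/ceiling form below is only my reading of the total-variation version of the bound and should be checked against the paper, and the meeting times are made up for illustration.

```r
## Sketch: upper bound on the distance to the target at iteration t,
## computed from meeting times of independent coupled runs with lag L.
## The exact form of the bound should be checked against Biswas & Jacob's
## paper; this is one reading of "a sum of distances until the chains merge".
tv_upper_bound <- function(taus, L, t) {
  # taus: meeting times from independent coupled runs with lag L
  mean(pmax(0, ceiling((taus - L - t) / L)))
}
## hypothetical meeting times from five coupled runs, lag L = 50
taus <- c(120, 85, 230, 160, 95)
sapply(c(0, 50, 100, 200), function(t) tv_upper_bound(taus, L = 50, t = t))
```

The bound decreases to zero as t grows past the largest meeting time, which is what makes it usable as a (conservative) convergence diagnostic.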

[Pierre also wrote a blog post about the paper on Statisfaction that is definitely much clearer and more pedagogical than the above.]

correlation for maximal coupling

Posted in Books, Kids, pictures, R, Statistics, University life on January 3, 2018 by xi'an

An interesting (if vaguely formulated) question on X validated: given two Gaussian variates that are maximally coupled, what is the correlation between these variates? The answer depends on the parameters of both Gaussians, with a correlation of one when both Gaussians are identical. Answering the question by simulation (as I could not figure out the analytical formula on Boxing Day…) led me back to Pierre Jacob’s entry on the topic on Statisfaction, where simulating the maximal coupling stems from the decompositions

p(x)=p(x)∧q(x)+{p(x)-p(x)∧q(x)}  and  q(x)=p(x)∧q(x)+{q(x)-p(x)∧q(x)}

and incidentally to the R function image.plot (from the R library fields) for including the side legend.
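A minimal R sketch of that simulation, following the standard rejection construction behind the above decompositions (sample from the common component p∧q if possible, otherwise from the residual of q); the Gaussian parameters below are illustrative, not those of the X validated question.

```r
## Maximal coupling of p = N(mu1, sig1) and q = N(mu2, sig2) by rejection,
## in the spirit of Pierre Jacob's Statisfaction entry.
rmaxcouple <- function(mu1, sig1, mu2, sig2) {
  x <- rnorm(1, mu1, sig1)
  if (runif(1) * dnorm(x, mu1, sig1) <= dnorm(x, mu2, sig2))
    return(c(x, x))                      # common component p∧q: X = Y
  repeat {                               # residual component of q
    y <- rnorm(1, mu2, sig2)
    if (runif(1) * dnorm(y, mu2, sig2) > dnorm(y, mu1, sig1)) return(c(x, y))
  }
}
## Monte Carlo estimate of the correlation between the coupled variates
sims <- t(replicate(1e4, rmaxcouple(0, 1, 1, 2)))
cor(sims[, 1], sims[, 2])
```

Running it with identical parameters in both arguments returns pairs with X = Y every time, hence a correlation of one, as stated above.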

truncated normal algorithms

Posted in Books, pictures, R, Statistics, University life on January 4, 2017 by xi'an

Nicolas Chopin (CREST) just posted an entry on Statisfaction about the comparison of truncated Normal algorithms run by Alan Rogers, from the University of Utah. Nicolas wrote a paper in Statistics and Computing about a simulation method proposing a Ziggurat-type algorithm for this purpose, a paper I do not remember reading, thanks to my diminishing memory buffer! As shown in the picture below, when truncating to the half-line (a,∞), this method improves upon my accept-reject approach except in the far tails.

On the top graph, made by Alan Rogers, my uniform proposal (r) seems to be doing better for a Normal truncated to (a,b) when b<0, or when a gets large and close to b. Nicolas’ ziggurat (c) works better than the Gaussian accept-reject method (c) on the positive part. (I wonder what the exponential proposal (e) stands for, in terms of scale parameter.)
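For reference, a minimal R sketch of the translated-exponential accept-reject sampler for the one-sided truncation (a,∞) with a>0, the scheme behind my accept-reject approach in the far tail; the rate below is the optimal choice as I recall it, and the code is only a sketch, not Alan Rogers’ benchmarked implementation.

```r
## Accept-reject for N(0,1) restricted to (a, Inf), a > 0, using a
## translated exponential proposal with (near-)optimal rate.
rtnorm_tail <- function(n, a) {
  lambda <- (a + sqrt(a^2 + 4)) / 2          # exponential rate
  out <- numeric(n)
  for (i in seq_len(n)) {
    repeat {
      x <- a + rexp(1, rate = lambda)        # proposal supported on (a, Inf)
      if (runif(1) <= exp(-(x - lambda)^2 / 2)) break   # acceptance step
    }
    out[i] <- x
  }
  out
}
## sanity check against the theoretical truncated mean dnorm(a)/pnorm(-a)
mean(rtnorm_tail(1e4, a = 3)); dnorm(3) / pnorm(-3)
```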

in the time of cholera

Posted in Books, Kids, pictures, Travel on April 6, 2014 by xi'an

Path storage in the particle filter

Posted in Books, Kids, Statistics, Travel, University life on July 24, 2013 by xi'an

In the train to Annecy, I read the recently arXived paper by my former PhD student Pierre Jacob (now at NUS), along with Lawrence Murray (Perth) and Sylvain Rubenthaler (Nice), where they obtain precise degeneracy rates of the regular particle filter applied to hidden Markov models with a compact observation space, precise enough to consider storing the entire paths at a linear occupancy rate. Interestingly, the distance to the most recent common ancestor is of order N log N, if N is the number of particles. And the number of nodes is O(N log N) as well. This means indeed that the whole paths can be stored, which offers a lot of potential in terms of Rao-Blackwellisation and parallelisation. I was first bemused by a silly misunderstanding about the purpose of the algorithm: it is directed at inference at the current time index, not over the whole past and not over the parameters of the model, for otherwise how could we consider that the algorithm has converged when it degenerates to a single path at some finite horizon in the past? Pierre has also written a further comment on the paper on Statisfaction.
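As a toy illustration of what gets stored, here is the raw ancestry bookkeeping of a bootstrap particle filter on a made-up linear-Gaussian model, with a backward trace to the most recent common ancestor of the live particles; this is only a sketch of the degeneracy phenomenon, not the authors’ storage algorithm.

```r
## Toy bootstrap particle filter keeping ancestor indices, so that the
## ancestry tree and the coalescence of the surviving lineages can be
## inspected. Model, parameters and multinomial resampling are illustrative.
set.seed(1)
T_len <- 500; N <- 50
x_true <- y <- numeric(T_len)
x_true[1] <- rnorm(1)
for (t in 2:T_len) x_true[t] <- 0.9 * x_true[t - 1] + rnorm(1, sd = 0.5)
y <- x_true + rnorm(T_len, sd = 1)

particles <- anc <- matrix(NA, N, T_len)
particles[, 1] <- rnorm(N); anc[, 1] <- 1:N
w <- dnorm(y[1], particles[, 1], 1)
for (t in 2:T_len) {
  a <- sample.int(N, N, replace = TRUE, prob = w)   # multinomial resampling
  anc[, t] <- a                                     # store ancestor indices
  particles[, t] <- 0.9 * particles[a, t - 1] + rnorm(N, sd = 0.5)
  w <- dnorm(y[t], particles[, t], 1)               # bootstrap weights
}

## trace the N surviving lineages backwards until they coalesce
lineage <- 1:N
coal_time <- NA
for (t in T_len:2) {
  lineage <- anc[lineage, t]
  if (length(unique(lineage)) == 1) { coal_time <- t - 1; break }
}
coal_time   # time index of the most recent common ancestor (NA if none)
```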

#2 blog for the statistics geek?!

Posted in Uncategorized on January 24, 2011 by xi'an

Julien pointed out to me that the ‘Og was #2 in this list of “40 fascinating blogs for the ultimate statistics geek”… Dunno how to take it! I also note Statisfaction ranked as #4 and Freakonometrics as #5, which sounds like the ranking is a wee bit haphazard, the latter blog having at least four times as much traffic as the ‘Og and focussing solely on statistics, actuarial science, R programming, and related scientific questions! (Still, verrry nice to make it to a list!)

Statisfaction

Posted in R, Statistics, University life on September 3, 2010 by xi'an

A collective blog has been started by the statistics students and postdocs at CREST, in the wake of the Valencia meeting. It is called Statisfaction. (The Rolling Stones of Statistics?! Actually, Andrew Gelman also has a post with that title… And it is even part of the Urban Dictionary!) Since I have no responsibility nor even a say in the contents of this independent blog, I cannot help but recommend following it! The latest posting is about Peter Müller’s slides for his Santa Cruz course in Bayesian nonparametrics being available on line.