Archive for random number generator
random number generation
Posted in Books, Kids, pictures with tags Ayn Rand, objectivism, random number generator, randomness, simulation on October 27, 2013 by xi'an

atmospheric random generator?!
Posted in Books, Mountains, pictures, Statistics, Travel with tags Mersenne twister, Patagonia, PHP, rand, random number generator, randomness, surfing, The Cleanest Line on April 10, 2012 by xi'an

As I was glancing through The Cleanest Line, the enjoyable blog of (the outdoor clothing company) Patagonia—enjoyable as long as one keeps in mind Patagonia is a company, although with commendable ethical and ecological goals—I came upon the entry "And the Winner of "Chasing Waves" is …", where the name of the winner of the book Chasing Waves was revealed. (Not that I am particularly into surfing…!) The interesting point to which I am coming so circumlocutorily (!) is that they used a random generator based on atmospheric noise to select the winner! I particularly like the claim that this generator "for many purposes is better than the pseudorandom number algorithms typically used in computer programs". For which purpose exactly?!
Now, to be (at least a wee) fair, the site of random.org contains an explanation about the quality of their generator. I am however surprised by the comparison they run with the rand() function from PHP on Microsoft Windows, since it produces a visible divergence from uniformity on a bitmap graph… Further investigation led to this explanation of the phenomenon, namely the inadequacy of the PHP language rather than of the underlying (pseudo)random generator. (It had been a while since I had a go at this randomness controversy!)
resampling and [GPU] parallelism
Posted in Statistics, University life with tags GPU, particle MCMC, Raftery and Lewis' number of iterations, random number generator, resampling, stratified resampling, systematic resampling on March 13, 2012 by xi'an

In a recent note posted on arXiv, Lawrence Murray compares the implementation of resampling schemes for parallel systems like GPUs. Given a system of weighted particles, (x_{i},ω_{i}), there are several ways of drawing a sample according to those weights:
 regular multinomial resampling, where each point in the (new) sample is one of the x_{i}, selected with probability proportional to ω_{i}, meaning there is a uniform generated for each point;
 stratified resampling, where the weights are added, divided into equal pieces and a uniform is sampled on each piece, which means that points with large weights are sampled at least once and those with small weights at most once;
 systematic resampling, which is the same as the above except that the same uniform is used for each piece,
 Metropolis resampling, where a Markov chain converges to the distribution proportional to (ω_{1},…, ω_{P}) on {1,…,P}.
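The first three schemes above can be sketched in a few lines of numpy; this is a minimal illustration of the definitions, not code from Murray's note, and the function names are mine. The only difference between stratified and systematic resampling is whether each stratum gets its own uniform or all strata share a single one.

```python
import numpy as np

def multinomial_resample(w, rng):
    # one uniform per output point: N i.i.d. draws with probabilities w
    N = len(w)
    return rng.choice(N, size=N, p=w)

def stratified_resample(w, rng):
    # one uniform per stratum [k/N, (k+1)/N): a particle with weight
    # above 1/N is sampled at least once, one below 1/N at most twice
    N = len(w)
    u = (np.arange(N) + rng.random(N)) / N
    return np.searchsorted(np.cumsum(w), u)

def systematic_resample(w, rng):
    # same strata, but a single shared uniform for all of them
    N = len(w)
    u = (np.arange(N) + rng.random()) / N
    return np.searchsorted(np.cumsum(w), u)

rng = np.random.default_rng(0)
w = np.array([0.1, 0.2, 0.3, 0.4])  # normalised weights
idx = systematic_resample(w, rng)   # indices of the resampled particles
```

The cumulative-sum plus `searchsorted` trick is what makes stratified and systematic resampling O(N log N) (or O(N) with a merge), but the prefix sum is also what makes them awkward to parallelise on a GPU.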
The first three resamplers are common in the particle system literature (incl. Nicolas Chopin's PhD thesis), but difficult to adapt to GPUs (and I always feel uncomfortable with the fact that systematic uses a single uniform!), while the last one is more unusual, but actually well-suited for a parallel implementation. While Lawrence Murray suggests using Raftery and Lewis' (1992) assessment of the required number of Metropolis iterations to "achieve convergence", I would instead suggest taking advantage of the toric nature of the space (as represented above) to run a random walk and wait for the equivalent of a complete cycle. In any case, this is a cool illustration of the new challenges posed by parallel implementations (like the development of proper random generators).
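What makes the Metropolis scheme attractive on a GPU is that each output index can run its own independent chain, with no cumulative sum over the weights. A minimal sketch of the idea, under my own naming and with a fixed number B of iterations per chain (the quantity Raftery and Lewis' diagnostic would be used to calibrate):

```python
import numpy as np

def metropolis_resample(w, B, rng):
    # each of the N output indices runs an independent Metropolis chain
    # targeting the distribution proportional to w on {0,...,N-1};
    # only the ratio w[j]/w[i] is needed, so w need not be normalised
    N = len(w)
    out = np.empty(N, dtype=int)
    for n in range(N):
        i = n                            # start each chain at its own index
        for _ in range(B):
            j = rng.integers(N)          # uniform proposal over all indices
            if rng.random() < w[j] / w[i]:
                i = j                    # accept the move
        out[n] = i
    return out
```

On a GPU each chain would be mapped to one thread; the loop above is only the serial analogue. The price is that the output is biased for any finite B, which is why the choice of B matters.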
GPUs in computational statistics
Posted in pictures, Statistics, Travel with tags Coventry, CPU, GPU, haggis, random number generator, Robert Burns, Stilton, University of Warwick on January 27, 2012 by xi'an

The workshop in Warwick yesterday went on very quickly! The room was packed. The first three talks were by myself, Christophe and Pierre, so less about GPUs and more about simulation techniques which could benefit from or even require implementation on GPUs. (I did manage to have complete slides this time… More seriously, Christophe's talk set me thinking on the issue of estimating the likelihood function in ways different (?) from the one used in ABC.) The second half was more novel for me, in that the three talks went into the computing and computer aspects of GPUs, with Chris Holmes doing sparse [Lasso-like] regression on a scale otherwise [i.e. w/o GPUs] impossible, Chris [fourth Chris in the list of speakers!] Barnes explaining ABC for molecular biology and design (a point I plan to discuss in a later post), with even more details about the architecture and programming of GPUs, and Michael Stumpf delivering a grand finale, with essentially three talks in one: network analysis (incl. terrific movie bits incorporated within the beamer slides!), GPUs vs. CPUs and older alternatives, and random generators on GPU, commenting on a recent paper by Salmon et al. (SC, 2011) and showing that true gains in efficiency from using GPUs involved a heavy involvement in the hardware structure… A very exciting day followed by Stilton cheese degustation and haggis (if not poems) to celebrate Burns' night!

Some hae meat and canna eat,
And some wad eat that want it;
But we hae meat, and we can eat,
And sae let the Lord be thankit.