## RNG impact on MCMC [or lack thereof]

Posted in Books, R, Statistics, Travel, University life with tags , , , , , , , on July 13, 2017 by xi'an

Following the talk at MCM 2017 about the strange impact of the random number generator on the outcome of an MCMC algorithm, I tried the following code in Montréal airport on the banana target of Haario et al. (1999), copied from Soetaert and Laine and using the modMCMC function of the FME package:

library(FME)
Banana <- function(x1, x2) {
  return(x2 - (x1^2 + 1)) }
pmultinorm <- function(vec, mean, Cov) {
  diff <- vec - mean
  ex <- -0.5 * t(diff) %*% solve(Cov) %*% diff
  rdet <- sqrt(det(Cov))
  power <- -length(diff) * 0.5
  return((2. * pi)^power / rdet * exp(ex)) }
BananaSS <- function(p) {
  P <- c(p[1], Banana(p[1], p[2]))
  Cov <- matrix(nr = 2, data = c(1, 0.9, 0.9, 1))
  # -2 log-density of the banana target, as in the Soetaert and Laine vignette
  return(-2 * sum(log(pmultinorm(P, mean = 0, Cov = Cov)))) }
N = 1e3
ejd = matrix(0, 4, N)
RNGkind("Mars")
for (t in 1:N){
  MCMC <- modMCMC(f = BananaSS, p = c(0, 0.7),
                  jump = diag(nrow = 2, x = 5), niter = 1e3)
  # expected jump distance from the starting value, first generator
  # (rows 2-4 of ejd filled likewise under other RNGkind settings)
  ejd[1, t] = mean((MCMC$pars[-1, 2] - MCMC$pars[1, 2])^2)}


since this divergence from the initial condition seemed to reflect the experiment of the speaker at MCM 2017. Unsurprisingly, no difference came from using the different RNGs in R (which may fail to contain those incriminated by the study)…
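Swapping generators boils down to a call to RNGkind. As a hedged, self-contained sketch (the second moment of a N(0,1) variate stands in here for the MCMC expected jump distance, and the kind names are those listed in ?RNGkind), one can check that R's built-in generators agree on a plain Monte Carlo estimate:

```r
# Compare R's built-in uniform generators on a toy Monte Carlo target:
# the second moment of a N(0,1) variate, whose true value is 1.
kinds <- c("Wichmann-Hill", "Marsaglia-Multicarry", "Super-Duper",
           "Mersenne-Twister", "Knuth-TAOCP-2002", "L'Ecuyer-CMRG")
est <- sapply(kinds, function(k) {
  RNGkind(kind = k)       # switch the underlying uniform generator
  set.seed(1)             # the seed is interpreted under the current kind
  mean(rnorm(1e5)^2)      # Monte Carlo estimate of E[X^2] = 1
})
RNGkind("default")        # restore the Mersenne-Twister default
round(est, 3)             # near-identical estimates across generators
```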

## Monty Python generator

Posted in Books, Kids, pictures, R, Statistics, University life with tags , , , , , , , on November 23, 2016 by xi'an

By some piece of luck I came across a paper by the late George Marsaglia, genial contributor to the field of simulation, and Wai Wan Tsang, entitled The Monty Python method for generating random variables. As shown by the below illustration, the concept is to flip the piece H outside the rectangle back inside the rectangle, exploiting the remaining area above the density. The fantastic part is actually that “since areas of the rectangle add to 1, the slim in-between area is exactly the tail area”! So the tiny bit between G and the flipped H is the remaining tail.

In the case of a Gamma Ga(a,1) variate, the authors express this variate as the transform of another variate with a nearly symmetric density, on which the Monty Python method applies. The transform is

$q(x)=(a-1/3)(1 + x/\sqrt{16a})^3$

with -√16a<x. The second nice trick is that the density of x is provided for free by the Gamma Ga(a,1) density and the transform, thanks to the change of variable formula. One lingering question is obviously how to handle the tail part. This is handled separately in the paper, with a rather involved algorithm, but since the area of the tail is tiny, a mere 1.2% in the case of the Gaussian density, this instance occurs rarely. Very clever, if highly specialised! (The case of a<1 has to be processed indirectly, by multiplying a Ga(a+1,1) variate by a uniform variate raised to the power 1/a.)
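As a hedged illustration (the shape values and helper names below are my own, and rgamma stands in for the paper's own generator), the change-of-variable density can be checked numerically, and the a<1 boosting trick sketched, in a few lines of R:

```r
# Density of x implied by the transform q(x) = (a - 1/3)(1 + x/sqrt(16a))^3:
# by the change of variable formula, f_x(x) = dgamma(q(x), a) * q'(x)
# on x > -sqrt(16a).
a  <- 2.5
q  <- function(x) (a - 1/3) * (1 + x / sqrt(16 * a))^3
dq <- function(x) 3 * (a - 1/3) * (1 + x / sqrt(16 * a))^2 / sqrt(16 * a)
fx <- function(x) dgamma(q(x), shape = a) * dq(x)
# q maps (-sqrt(16a), Inf) monotonically onto (0, Inf), so fx integrates to 1:
integrate(fx, lower = -sqrt(16 * a), upper = Inf)$value  # ≈ 1

# Boosting trick for a < 1: if Y ~ Ga(a+1,1) and U ~ U(0,1) independent,
# then Y * U^(1/a) ~ Ga(a,1).
a0 <- 0.5
set.seed(42)
x <- rgamma(1e5, shape = a0 + 1) * runif(1e5)^(1 / a0)
mean(x)  # should be close to E[Ga(a0,1)] = a0 = 0.5
```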

I also found out that there exists a Monte Python software, which is an unrelated Monte Carlo code in python [hence the name] for cosmological inference. Including nested sampling, unsurprisingly.

## Le Monde sans puzzle

Posted in Books, Kids, Statistics, University life with tags , , , , , , , on June 17, 2014 by xi'an

This week's Le Monde mathematical puzzle is purely geometric, hence inappropriate for an R resolution. In the Science & Médecine leaflet, there is however an interesting central page about random generators, from their multiple usages in daily life to the consequences of poor generators on cryptography and data safety. The article compiles an interview of Jean-Paul Delahaye on the topic with recent illustrations from cybersecurity. One final section gets rather incomprehensible: when discussing the dangers of seed generation, it states that “a poor management of the entropy means that a hacker can saturate the seed and take over the original randomness, weakening the whole system”. I am sure there is something real behind the imagery, but this does not make sense… Another insert mentions a possible random generator built out of the light detectors on a smartphone. And quantum physics. The society IDQ can indeed produce ultra-rapid random generators that way, and it also ran randomness tests summarised here, using in particular George Marsaglia’s diehard battery.

Another column reports that a robot passed the Turing test last week, on Turing‘s death anniversary. Meaning that 33% of the jury was convinced the robot’s answers were given by a human. This reminded me of the Most Human Human book sent to me by my friends from BYU. (A marginalia found in Le Monde is that the test was organised by Kevin Warwick…from the University of Coventry, a funny reversal of the University of Warwick sitting in Coventry! However, checking on his website showed that he neither has nor had any affiliation with this university, being at the University of Reading instead.)