## hands-on probability 101

Posted in Books, Kids, pictures, Statistics, University life on April 3, 2021 by xi'an

When solving a rather simple probability question on X validated, namely the joint uniformity of the pair $(X,Y)=(A-B+\mathbb{I}_{A<B},\,C-B+\mathbb{I}_{C<B})$

when A,B,C are iid U(0,1), I chose a rather pedestrian way and derived the joint distribution of (A-B,C-B), which turns out to be made of 8 components over the (-1,1)² domain. And to conclude at the uniformity of the above pair, I added a hand-made picture to explain why the coverage by (X,Y) of any (red) square within (0,1)² was uniform, by virtue of the symmetry between the coverage by (A-B,C-B) of four copies of the (red) square, using colour tabs that were sitting on my desk! It did not seem to convince the originator of the question, who kept answering with more questions (or worse, an ever-changing question, reproduced in real time on math.stackexchange!), revealing there that said originator was tutoring an undergraduate student. But this was a light moment in a dreary final day before a new lockdown.
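The claimed joint uniformity is easy to confirm by simulation: adding the indicator terms simply wraps A-B and C-B back into [0,1), i.e. a mod-1 operation. A minimal Monte Carlo sketch (my own re-implementation, not the original hand-made argument):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
A, B, C = rng.random((3, n))
# A-B+1{A<B} is just (A-B) mod 1, and likewise for C-B
X = (A - B) % 1.0
Y = (C - B) % 1.0
# crude joint-uniformity check: bin counts over a 10x10 grid of (0,1)^2
counts, _, _ = np.histogram2d(X, Y, bins=10, range=[[0, 1], [0, 1]])
print(counts.max() / counts.min())  # close to 1 under joint uniformity
```

Conditional on B, each wrapped difference is uniform on [0,1) and independent of B, which is the symmetry the colour tabs were illustrating.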

## deterministic moves in Metropolis-Hastings

Posted in Books, Kids, R, Statistics on July 10, 2020 by xi'an

A curio on X validated where a hybrid Metropolis-Hastings scheme involves a deterministic transform, once in a while. The idea is to flip the sample from one mode, ν, towards the other mode, μ, with a symmetry of the kind

μ-α(x+μ) and ν-α(x+ν)

with α a positive coefficient. Or the reciprocal,

-μ+(μ-x)/α and -ν+(ν-x)/α

for… reversibility reasons. In that case, the acceptance probability includes the Jacobian of the transform to the proposal, just as in reversible jump MCMC. Why the (annoying) Jacobian? Because it accounts for the change of measure induced by the transform: a deterministic move compresses or expands volume by a factor |T′(x)|, which must enter the Metropolis-Hastings ratio for detailed balance to hold.
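To make the change-of-measure point concrete, here is a minimal Python sketch checking that the Jacobian-corrected acceptance satisfies detailed balance for the above pair of maps. The standard Normal target and the values of μ and α are arbitrary stand-ins, not taken from the question:

```python
import math

def phi(x):  # standard Normal density, an illustrative target (assumption)
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

mu, alpha = 1.0, 0.7                        # illustrative values (assumptions)
T = lambda x: mu - alpha * (x + mu)         # forward deterministic move, |T'| = alpha
Tinv = lambda y: -mu + (mu - y) / alpha     # its reciprocal, |Tinv'| = 1/alpha

x = 0.3
y = T(x)
a_fwd = min(1.0, phi(y) * alpha / phi(x))   # Jacobian-corrected acceptance x -> y
a_rev = min(1.0, phi(x) / (alpha * phi(y))) # Jacobian-corrected acceptance y -> x
# detailed balance in measure: phi(x)dx * a_fwd = phi(y)dy * a_rev, with dy = alpha*dx
print(abs(phi(x) * a_fwd - phi(y) * alpha * a_rev))  # ~0
```

Dropping the α factors from the acceptance probabilities breaks the identity, which is the discrepancy a Jacobian-free implementation would exhibit.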

Returning to the curio, the originator of the question had spotted some discrepancy between the target and the MCMC sample, as the moments did not fit well enough. For a similar toy model, a balanced Normal mixture, and an artificial flip consisting of

x’=±1-x/2 or x’=±2-2x

implemented by

```r
# balanced Normal mixture target with modes near ±1 (unnormalised)
gnorm=function(x).5*dnorm(x,mean=-1)+.5*dnorm(x,mean=1)
T=1e4
mh=rep(0,T)
for(t in 2:T){
  u=runif(5)
  if(u[1]<.5){ #uniform random-walk move
    mhp=mh[t-1]+2*u[2]-1
    mh[t]=ifelse(u[3]<gnorm(mhp)/gnorm(mh[t-1]),mhp,mh[t-1])
  }else{ #deterministic flip: contraction (dx=1) or expansion (dx=2)
    dx=1+(u[4]<.5)
    mhp=ifelse(dx==1,
      ifelse(mh[t-1]<0,1,-1)-mh[t-1]/2,
      2*ifelse(mh[t-1]<0,-1,1)-2*mh[t-1])
    #acceptance includes the Jacobian dx/(3-dx), i.e. 1/2 or 2
    mh[t]=ifelse(u[5]<dx*gnorm(mhp)/gnorm(mh[t-1])/(3-dx),mhp,mh[t-1])
  }
}
```

I could not spot said discrepancy beyond Monte Carlo variability.

## an elegant result on exponential spacings

Posted in Statistics on April 19, 2017 by xi'an

A question on X validated I spotted in the train back from Lyon got me desperately seeking a reference in Devroye’s Generation Bible despite the abyssal wireless and a group of screeching urchins a few seats away from me… The question is about why $\sum_{i=1}^{n}(Y_i - Y_{(1)}) \sim \text{Gamma}(n-1, 1)$

when the Y’s are standard exponentials. Since this reminded me immediately of exponential spacings, thanks to our Devroye fan-club reading group in Warwick, I tried to download Devroye’s Chapter V and managed after a few aborts (and a significant increase in decibels from the family corner). The result by Sukhatme (1937) is in plain sight as Theorem 2.3 and is quite elegant as it relies on the fact that $\sum_{i=1}^n (y_i-y_{(1)})=\sum_{j=2}^n (n-j+1)(y_{(j)}-y_{(j-1)})$, where the normalised spacings $(n-j+1)(y_{(j)}-y_{(j-1)})$ are themselves iid standard exponentials,

hence sums up as a mere linear change of variables! (Pandurang Vasudeo Sukhatme (1911–1997) was an Indian statistician who worked on human nutrition and got the Guy Medal of the RSS in 1963.)
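The Gamma(n-1,1) claim is also easy to confirm by simulation; a small numpy-based check of my own (the values of n and the number of replications are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
n, reps = 10, 100_000
# reps independent samples of n standard exponentials
Y = rng.exponential(size=(reps, n))
# recentred sum: by Sukhatme's theorem, a sum of n-1 iid Exp(1) spacings
S = (Y - Y.min(axis=1, keepdims=True)).sum(axis=1)
print(S.mean(), S.var())  # both close to n - 1 = 9 for a Gamma(n-1, 1)
```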