Archive for FiveThirtyEight

stack overload

Posted in Books, Kids, R on March 3, 2021 by xi'an

The riddle this week is rather straightforward to state: stacking identical objects (bars of length and mass two, say) on top of one another so that the centre of each new bar is uniformly distributed along the previous bar, what is the distribution of the number of bars when the stack collapses? If I am not confused, the stack collapses the first time the centre of gravity of an upper stack leaves the interval covered by the bar just below, namely

\left|\frac{1}{N-j} \sum_{i=j+1}^N x_i -x_j\right|>1

where the x_i are the bar centres, or equivalently

\max_{2\le j\le N-1} \left|\frac{1}{N-j} \sum_{i=j+1}^N \sum_{k=j+1}^i\epsilon_k \right|>1

where the ε_k's are iid U(-1,1). This is straightforward to code in R by looking at means of cumulated sums.
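A minimal simulation sketch (my own code rather than a published solution, with the collapse check restricted to 2≤j≤N-1 as in the formula above) could look like

collaps=function(maxN=1e3){
  x=c(0,cumsum(runif(maxN-1,-1,1))) #bar centres, each uniform along the bar below
  for(N in 3:maxN) for(j in 2:(N-1))
    if(abs(mean(x[(j+1):N])-x[j])>1) return(N) #centre of gravity off bar j
  return(NA)} #no collapse within maxN bars (most unlikely)
nbars=replicate(1e4,collaps()) #empirical distribution of the collapse size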

new order

Posted in Books, Kids, R, Statistics on February 5, 2021 by xi'an

The latest riddle from The Riddler was straightforward: given four iid Normal variates, X¹,X²,X³,X⁴, what is the probability that X¹+X²<X³+X⁴ given that X¹<X³? The answer is ¾ and it actually does not depend on the distribution of the variates. The bonus question is however much harder: what is this probability when there are 2N iid Normal variates?
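A quick simulation check of the ¾ value (a sketch of mine):

nsim=1e6
x=matrix(rnorm(4*nsim),ncol=4)
keep=x[,1]<x[,3] #conditioning event
mean(x[keep,1]+x[keep,2]<x[keep,3]+x[keep,4]) #close to 0.75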

I posted the question on math.stackexchange, then on X validated, but received no hint at a possible simplification of the probability, and then erased the questions. Given the shape of the domain where the bivariate Normal density is integrated, it seems most likely that there is no closed-form expression. (None was proposed by the Riddler.) Computing this probability by simulation and regressing its logarithm against the logarithm of the problem size suggests it decreases roughly like a power −⅓, the fitted slope being −0.31:

> summary(lm(log(p)~log(r)))

Residuals:
      Min        1Q    Median        3Q       Max 
-0.013283 -0.010362 -0.000606  0.007835  0.039915 

Coefficients:
             Estimate Std. Error t value Pr(>|t|)    
(Intercept) -0.111235   0.008577  -12.97 4.11e-13 ***
log(r)      -0.311361   0.003212  -96.94  < 2e-16 ***
---
Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

Residual standard error: 0.01226 on 27 degrees of freedom
Multiple R-squared:  0.9971,	Adjusted R-squared:  0.997 
F-statistic:  9397 on 1 and 27 DF,  p-value: < 2.2e-16

around the table

Posted in Books, pictures, R, Statistics on December 2, 2020 by xi'an

[Image: Monty Python and the Holy Grail, the round table in Camelot]

The Riddler has a variant on the classical (discrete) random walk around a circle where every state (but the starting point) has the same probability 1/(n-1) of being visited last. This surprising result stems almost immediately from the fact that, leaving from 0, the probability that state a is visited counterclockwise before state b>a is visited clockwise is b/(a+b). The variant includes (or seems to include) the starting state 0 as counting for the last visit (as a return to the origin). In that case, all n states, including the origin, except the two neighbours of 0, namely 1 and n-1, have the same probability of being last. This can also be checked with an R code that approximates (inner loop) the probability that a given state is visited last and records how often this probability is the largest (outer loop):

#N, the number of states around the circle, must be set beforehand
w=0*(1:N) #frequency of being the most likely last state
for(t in 1:1e6){
 o=0*w #estimated probabilities of being visited last
 for(v in 1:1e6) #sample orders of visits (300 steps, assumed enough to reach all states)
   o[i]=o[i<-1+unique(cumsum(sample(c(-1,1),300,rep=T))%%N)[N]]+1
 w[j]=w[j<-order(o)[N]]+1}

However, upon (jogging) reflection, the double loop is a waste of energy and

o=0*(1:N)
for(v in 1:1e8)
   o[i]=o[i<-1+unique(cumsum(sample(c(-1,1),500,rep=T))%%N)[N]]+1

should be enough to check that all n positions but the two neighbours of the origin have the same probability of being last visited. Removing the remaining loop should be feasible by considering all subchains starting at one of the 0's, since this is a renewal state, but I cannot fathom how to code it succinctly. A more detailed coverage of the original problem (that is, omitting the starting point) was published on R-bloggers the Monday after the riddle appeared, following a blog post by David Robinson on Variance Explained.

R codegolf challenge: is there a way to shorten the above R for loop into a single-line command?!
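As an aside (and not an answer to the codegolf challenge!), the b/(a+b) probability invoked above is the standard gambler's-ruin identity for a symmetric walk hitting -a before +b, which a few lines of simulation confirm (a sketch of mine, for one arbitrary pair of distances):

a=3;b=7
hit=replicate(1e5,{s=0
  while(s>-a&s<b)s=s+sample(c(-1,1),1)
  s==-a})
mean(hit) #should be close to b/(a+b)=0.7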

Bernoulli factory in the Riddler

Posted in Books, Kids, R, Statistics on December 1, 2020 by xi'an

“Mathematician John von Neumann is credited with figuring out how to take a p biased coin and “simulate” a fair coin. Simply flip the coin twice. If it comes up heads both times or tails both times, then flip it twice again. Eventually, you’ll get two different flips — either a heads and then a tails, or a tails and then a heads, with each of these two cases equally likely. Once you get two different flips, you can call the second of those flips the outcome of your “simulation.” For any value of p between zero and one, this procedure will always return heads half the time and tails half the time. This is pretty remarkable! But there’s a downside to von Neumann’s approach — you don’t know how long the simulation will last.” The Riddler
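For illustration, here is a small R rendering of the procedure described in the quote (a sketch of mine):

vonNeumann=function(p){repeat{
  f=runif(2)<p #two biased flips, TRUE meaning heads
  if(f[1]!=f[2])return(f[2])}} #return the second of two differing flips
mean(replicate(1e5,vonNeumann(.9))) #close to 0.5 whatever p in (0,1)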

The associated riddle (the first one of the post-T era!) is to figure out the values of p for which an algorithm can be derived to simulate a fair coin in at most three flips of a p-biased coin. With a single flip, p=½ sounds like the unique solution. For two flips, p²=½, (1-p)²=½ and 2p(1-p)=½ work, but so do p+(1-p)p=½ and (1-p)+p(1-p)=½, and the number of cases grows further when allowing at most three flips. However, since there are 2³=8 different sequences of three flips, there are 2⁸ ways to aggregate these events and thus at most 2⁸ resulting probabilities (including 0 and 1). Running a quick R code and checking each of these sums for proximity to ½ leads to

[1] 0.2062997 0.7937005 #p^3
[1] 0.2113249 0.7886753 #p^3+(1-p)^3
[1] 0.2281555 0.7718448 #p^3+p(1-p)^2
[1] 0.2372862 0.7627143 #p^3+(1-p)^3+p(1-p)^2
[1] 0.2653019 0.7346988 #p^3+2p(1-p)^2
[1] 0.2928933 0.7071078 #p^2
[1] 0.3154489 0.6845518 #p^3+2p^2(1-p)
[1] 0.352201  0.6477993 #p^3+p(1-p)^2+p^2(1-p)
[1] 0.4030316 0.5969686 #p^3+p(1-p)^2+3(1-p)p^2
[1] 0.5

which correspond to

1-p³=½, p³+(1-p)³=½, (1-p)³+(1-p)p²=½, p³+(1-p)³+p²(1-p)=½, (1-p)³+2(1-p)p²=½, 1-p²=½, p³+(1-p)³+p²(1-p)=½, (1-p)³+p(1-p)²+p²(1-p)=½, (1-p)³+p²(1-p)+3p(1-p)²=½, p³+p(1-p)²+3p²(1-p)=½, p³+2p(1-p)²+3(1-p)p²=½, p=½,

(plus the symmetric ones), leading to 19 different values of p producing a “fair coin”. Missing any other combination?!

Another way to look at the problem is to find all roots of the 2^{2^n} equations

a_0p^n+a_1p^{n-1}(1-p)+\cdots+a_{n-1}p(1-p)^{n-1}+a_n(1-p)^n=1/2

where

0\le a_i\le{n \choose i}

(None of these solutions is rational, by the way, except p=½.) I also tried this route with a slightly longer R code, calling polyroot, and found the same 19 roots for three flips, [at least] 271 for four, and [at least] 8641 for five (when The Riddler says 8635!), with some imprecision in the exact number of roots due to rather poor numerical rounding by polyroot. (Since the coefficients in the above are not directly those of the polynomial, I went through an alternate representation as a polynomial in (1-p)/p, with a straightforward derivation of the coefficients.)
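For the record, here is a sketch of what such a brute-force enumeration may look like in the three-flip case (my own reconstruction, not the code mentioned above: it works directly with the coefficients of the polynomial in p rather than in (1-p)/p):

pcoef=function(k){ #coefficients (powers 0 to 3) of p^k (1-p)^(3-k)
  co=rep(0,4)
  for(j in 0:(3-k))co[k+j+1]=choose(3-k,j)*(-1)^j
  co}
heads=rowSums(expand.grid(rep(list(0:1),3))) #number of heads in each of the 8 sequences
sbst=as.matrix(expand.grid(rep(list(0:1),8)))[-1,] #all non-empty aggregations
sols=c()
for(s in 1:nrow(sbst)){
  co=colSums(do.call(rbind,lapply(heads[sbst[s,]==1],pcoef))) #coefficients of the aggregate
  co[1]=co[1]-.5 #solve aggregate probability = 1/2
  while(length(co)>1&&co[length(co)]==0)co=co[-length(co)] #drop zero leading terms
  if(length(co)>1){
    z=polyroot(co)
    z=Re(z[abs(Im(z))<1e-6]) #keep real roots
    sols=c(sols,z[z>0&z<1])}}
length(unique(round(sols,5))) #19 distinct values (up to polyroot rounding)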

sampling w/o replacement except when replacing

Posted in Books, Kids, R on November 3, 2020 by xi'an

Another Riddle(r), considering a box with M myrtle balls and D dandelion balls. Balls are drawn without replacement as long as they are of the same colour as the initial draw of the current run; otherwise the last ball is put back and the process restarts, until all balls are drawn. The funny thing is that, unless M=0 or D=0, the probability that the last ball drawn is myrtle is always ½! This can be easily checked by simulation (when M=2 and D=8):

r=function()sample(0:1,1,prob=c(d,m)) #draw one ball, 1 for myrtle, 0 for dandelion
F=0 #number of experiments ending with a myrtle ball
for(t in 1:1e6){
  m=2;d=8
  i=r();m=m-!!i;d=d-!i
  while(!!m*d){ #as long as both colours remain
    j=r();i=ifelse(i==j,j,r()) #same colour: keep it; else put it back and redraw
    m=m-!!i;d=d-!i}
  F=F+(m>0)} #dandelions exhausted first, so the last ball is myrtle
F/1e6 #should be close to 1/2

Now the proof that the probability is ½ is quite straightforward for M=1 (or D=1), but I cannot find a quick fix for larger values. I thus reasoned by recursion: the probability of emptying a given colour first in a single uninterrupted run is d!m!/(d+m)!, whatever the colour and whatever d>0, m>0, hence half a chance to finish with myrtle in that event. Any shorter run of a given colour reduces the value of either d or m, at which point the recursion assumption that the probability is ½ applies…
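A quick numerical check of that d!m!/(d+m)! value (a sketch of mine, using the fact that a single uninterrupted run behaves like the prefix of a uniformly shuffled urn), with m=2 and d=8:

m=2;d=8
emp=mean(replicate(1e5,all(sample(rep(0:1,c(d,m)))[1:m]==1))) #myrtle (coded 1) emptied in one run
c(emp,factorial(d)*factorial(m)/factorial(d+m)) #both close to 1/45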