## The answer is e, what was the question?!

A rather exotic question on X validated: since π can be approximated by random sampling over a unit square, is there an equivalent for approximating e? This is an interesting question, as, indeed, why not focus on e rather than π after all?! But very quickly the sheer artificiality of the problem comes back to hit one in the face… With no restriction, it is straightforward to think of a Monte Carlo average that converges to e as the number of simulations grows to infinity. However, such methods, like Poisson and normal simulations, require transcendental functions like the sine, cosine, or exponential… But then someone came up with a connection to the great Russian probabilist Gnedenko, who gave as an exercise that the average number of uniforms one needs to add for their sum to exceed 1 is exactly e, because this expectation writes as $\sum_{n=0}^\infty\frac{1}{n!}=e$.
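Gnedenko's exercise is easy to check by simulation; a minimal sketch in R, where the count of uniforms needed for the running sum to exceed 1 has expectation e:

```r
set.seed(123)
# number of uniforms needed for the running sum to exceed 1; E[count] = e
gnedenko_count <- function() {
  s <- 0; k <- 0
  while (s <= 1) {
    s <- s + runif(1)
    k <- k + 1
  }
  k
}
est <- mean(replicate(1e5, gnedenko_count()))
est  # close to exp(1) = 2.71828...
```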

(The result was later detailed in The American Statistician as an introductory simulation exercise akin to Buffon’s needle.) This is a brilliant solution as it does not involve anything but a standard uniform generator. I do not think it relates in any close way to generation from a Poisson process with parameter λ=1, where the probability of exceeding one in a single step is e⁻¹; deriving a geometric variable from that process hence leads to an unbiased estimator of e as well. As an aside, W. Huber proposed the following elegantly concise line of R code to implement an approximation of e:

```r
n <- 1e6  # n must be set beforehand
1/mean(n*diff(sort(runif(n+1))) > 1)
```

Hard to beat, isn’t it?! (Although it is more exactly a Monte Carlo approximation of $\left(1-\frac{1}{n}\right)^{-n}$, which adds a further level of approximation to the solution….)
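As for the Poisson-process connection mentioned above, a hedged sketch (it needs the exponential transform that the uniform-only solutions avoid): the number of Exp(1) draws until one exceeds 1 is geometric with success probability e⁻¹, hence has expectation e:

```r
set.seed(123)
# geometric count of Exp(1) draws until one exceeds 1; success prob = exp(-1)
geom_count <- function() {
  k <- 0
  repeat {
    k <- k + 1
    if (rexp(1) > 1) return(k)
  }
}
est_geom <- mean(replicate(1e5, geom_count()))
est_geom  # close to exp(1)
```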

### 10 Responses to “The answer is e, what was the question?!”

1. Zen Says:

By the way, the only condition is that the sequence is exchangeable. There is no need to suppose that the r.v.’s are iid uniforms. The result is much more general.
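Assuming the rank property alluded to is the record version — for any exchangeable sequence of continuous random variables, P(X₁ > ⋯ > X_k) = 1/k!, so the index of the first ascent has expectation e whatever the distribution — here is a quick sketch with normal draws:

```r
set.seed(123)
# index of the first ascent: P(no ascent among the first k draws) = 1/k!,
# hence E[index] = e, for any continuous distribution (only ranks matter)
first_ascent <- function(rdist) {
  prev <- rdist(1)
  k <- 1
  repeat {
    x <- rdist(1)
    k <- k + 1
    if (x > prev) return(k)
    prev <- x
  }
}
est_norm <- mean(replicate(1e5, first_ascent(rnorm)))
est_norm  # close to exp(1) even though the draws are normal, not uniform
```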

• xi'an Says:

True. This is a rank property.

2. Zen Says:

This wonderful vectorized R one-liner by Huber made me laugh out loud about the “C in R” code I wrote when answering this question last year!

http://stats.stackexchange.com/questions/148962/suppose-x-1-x-2-dotsc-x-n-are-i-i-d-random-variables-when-is-the-sequenc?rq=1

• xi'an Says:

This is very concise indeed. I am seeking a similar line for the Gnedenko solution, aiming at analysing a long uniform vector in one go with no recursion.
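Not the sought one-liner, but one possible vectorized sketch along these lines, processing a single long uniform matrix with no recursion (the cap of 20 uniforms per replication is an assumption; needing more than 20 has probability below 1/20! and is thus negligible):

```r
set.seed(123)
M <- 1e5                              # number of replications
u <- matrix(runif(20 * M), nrow = M)  # at most 20 uniforms per replication
# per row: count the partial sums below 1, then add 1 to get Gnedenko's count
est_vec <- mean(apply(u, 1, function(x) sum(cumsum(x) < 1))) + 1
est_vec  # close to exp(1)
```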

• xi'an Says:

I also have a one-line solution for Forsythe’s method! Soon to be posted on the ‘Og, but already updated within my resolution on X validated.

3. Georges Henry Says:

As correctly pointed out by Alex Thiery, the only non-trivial thing to prove is that $Pr(U_1+\cdots+U_k<1)=1/k!$ when the $U_i$‘s are independent and uniform on (0,1). The best way is to imagine that $(U_1,\ldots,U_k)$ is uniform over the cube $(0,1)^k$ and to consider the volume of the simplex defined by $U_1+\cdots+U_k<1$.
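That volume computation is also easy to confirm numerically; a quick Monte Carlo sketch for k = 4, where the simplex has volume 1/4! = 1/24:

```r
set.seed(123)
k <- 4; M <- 1e6
# fraction of uniform points of the cube (0,1)^k falling in the simplex u1+...+uk < 1
vol_est <- mean(colSums(matrix(runif(k * M), nrow = k)) < 1)
c(vol_est, 1 / factorial(k))  # both close to 1/24
```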

• xi'an Says:

Very nice and intuitive explanation as to why the probability is 1/k!

4. Alex Thiery Says:

That’s neat!

One can also think about it in terms of random truncations (a.k.a. Russian roulette). Indeed, an unbiased estimate of e is given by the sum from k=0 to k=N of 1/[k! · P(k ≤ N)], where N is any reasonable integer-valued random variable.

If N is the number of uniform random variables one can add while the sum stays below 1, then one precisely has P(k ≤ N) = 1/k!, every term of the sum equals one, and the estimator reduces to N + 1, with expectation e: this gives back the mentioned result. But there are many other choices…
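As an illustration of the “many other choices”, a sketch (my own, not from the comment) with a geometric N under R’s rgeom convention (support 0, 1, 2, …, so P(k ≤ N) = (1−p)^k), which keeps the randomly truncated sum unbiased for e:

```r
set.seed(123)
p <- 0.3
roulette <- function() {
  N <- rgeom(1, p)                     # failures before first success: P(k <= N) = (1-p)^k
  k <- 0:N
  sum(1 / (factorial(k) * (1 - p)^k))  # unbiased for e = sum_k 1/k!
}
est_rr <- mean(replicate(1e5, roulette()))
est_rr  # close to exp(1)
```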

• xi'an Says:

Right! The Glynn–Rhee approach indeed has the power to turn this into an unbiased estimator. Thanks.
