**I**n connection with the Bernoulli factory post of last week, Richard Brent arXived a short historical note recalling George Forsythe's algorithm for simulating variables with density proportional to $e^{-G(x)}$ when $0\le G(x)\le 1$ (the extension to any upper bound is straightforward). The idea is to avoid computing the exponential function by simulating uniforms $u_1,u_2,\ldots$ until

$$G(x) \ge u_1 \ge u_2 \ge \cdots \ge u_{n-1} < u_n,$$

since the probability of this event is

$$\dfrac{G(x)^{n-1}}{(n-1)!} - \dfrac{G(x)^n}{n!},$$
the expectation of the stopping index *n* is $e^{G(x)}$, and the probability that *n* is even is $1-e^{-G(x)}$, so accepting *x* exactly when *n* is odd delivers the target density. This turns into a generation method if the support of *G* is bounded. In relation to the Bernoulli factory problem, I think this has potential applications in that, when the function *G(x)* is replaced with an unbiased estimator, the subsequent steps remain valid. This approach would indeed involve computing a single value of *G(x)*, but that is also the case with Latuszyński et al.'s solution and ours… So I am uncertain as to whether or not this has practical implications. (Brent mentions normal simulation, but this is more history than methodology.)
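As a concrete illustration, the odd/even trick above can be sketched in a few lines of Python. This is only a minimal sketch of Forsythe's scheme, not Brent's code: the names `forsythe_accept` and `sample` are mine, and the sampler assumes a uniform proposal on [0,1] with $0\le G(x)\le 1$, as in the note.

```python
import math
import random

def forsythe_accept(gx, rng=random):
    """One Forsythe trial: return True with probability exp(-gx),
    for 0 <= gx <= 1, using only uniform draws and comparisons
    (no call to the exponential function)."""
    # Draw uniforms while the decreasing run gx >= u_1 >= ... holds;
    # stop at the first index n with u_{n-1} < u_n.
    prev = gx
    n = 0
    while True:
        n += 1
        u = rng.random()
        if u > prev:
            # The run breaks at index n; P(n odd) = exp(-gx).
            return n % 2 == 1
        prev = u

def sample(G, rng=random):
    """Rejection sampler for the density proportional to exp(-G(x))
    on [0,1], assuming 0 <= G(x) <= 1 there."""
    while True:
        x = rng.random()
        if forsythe_accept(G(x), rng):
            return x
```

For instance, `sample(lambda t: t)` draws from the truncated exponential density $e^{-x}/(1-e^{-1})$ on [0,1], with an acceptance probability of $e^{-G(x)}$ at each proposed point.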