## a simpler (?) birthday problem

Posted in Books, Kids, Statistics on April 9, 2022 by xi'an

A monthly birthday problem from the Riddler:

What was the probability that none of the 40 people had birthdays this month? What is the probability that there is at least one month in the year during which none of the 40 people had birthdays (not necessarily this month)?

Assuming the same number of days in all months, the probability that one individual is not born in March is 11/12, and hence the probability that none of 40 (independent!) persons is born in March is (11/12)⁴⁰, about 3%. The second question can be solved by reading Feller’s chapter on the combination of events (1970, Chapter IV, p.102). By inclusion–exclusion, the probability that every month sees at least one birthday is

$\sum_{i=0}^{12} (-1)^i {12\choose i}(1-i/12)^{40}=0.6732162$

which can be checked by a quick R simulation. The complement, 0.327, is thus close to 11 × 0.03!
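The author mentions checking this with a quick R simulation; here is a Python analogue (a sketch, not the author's code) that evaluates the inclusion–exclusion sum and compares it with a Monte Carlo estimate:

```python
from math import comb
import random

# Inclusion–exclusion: probability that all 12 months receive at least
# one of the 40 birthdays, months being equiprobable and independent
p_all = sum((-1)**i * comb(12, i) * (1 - i/12)**40 for i in range(13))

# Monte Carlo analogue of the quick R simulation
random.seed(1)
n = 100_000
hits = sum(
    len({random.randrange(12) for _ in range(40)}) == 12
    for _ in range(n)
)
print(p_all, hits / n)  # both close to 0.6732
```

The Monte Carlo estimate agrees with the exact value to within sampling error (about ±0.003 at this simulation size).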

## Hitch’s tricks

Posted in Books, Travel on November 27, 2020 by xi'an

As I was watching the first minutes of Alfred Hitchcock’s under-rated 1944 Lifeboat (with John Steinbeck as the script writer!), the series of objects floating past the lifeboat to convey the preliminary mutual sinking of an Allied ship and a Nazi U-boat included a cover of the New Yorker. Which, while iconic, sounds like a weird inclusion, given that this is the very first issue of the magazine, from February 1925, hardly the first thing I would carry across the Atlantic in wartime! Maybe being iconic was the very reason to keep this issue rather than a more recent one; another mystery about the great Hitch’s allusions and clues interspersed throughout his films.

## Jeffreys priors for hypothesis testing [Bayesian reads #2]

Posted in Books, Statistics, University life on February 9, 2019 by xi'an

A second (re)visit to a reference paper I gave to my OxWaSP students for the last round of this CDT joint program. Indeed, this may be my first complete read of Susie Bayarri and Gonzalo Garcia-Donato’s 2008 Series B paper, inspired by Jeffreys’, Zellner’s and Siow’s proposals in the Normal case. (Disclaimer: I was not the JRSS B editor for this paper.) Which I saw as a talk at the O’Bayes 2009 meeting in Philly.

The paper aims at constructing formal rules for objective proper priors in testing embedded hypotheses, in the spirit of Jeffreys’ Theory of Probability “hidden gem” (Chapter 3). The proposal is based on symmetrised versions of the Kullback–Leibler divergence κ between null and alternative, used through a transform such as an inverse power of 1+κ, with the power taken large enough to make the prior proper, and eventually multiplied by a reference measure (i.e., an arbitrary choice of dominating measure). It can be generalised to any intrinsic loss (not to be confused with an intrinsic prior à la Berger and Pericchi!). The result is approximately Cauchy or Student’s t by a Taylor expansion. To be compared with Jeffreys’ original prior, equal to the derivative of the atan transform of the root divergence (!). A delicate calibration by an effective sample size, lacking a general definition.
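Schematically (my notation, not necessarily the paper’s), the divergence-based construction described above amounts to a prior of the form

$\pi^D(\theta)\,\propto\,\pi^N(\theta)\,\left\{1+\kappa(\theta,\theta_0)\right\}^{-q}$

where π^N is the reference (dominating) measure, θ₀ the null value, κ the symmetrised Kullback–Leibler divergence between null and alternative, and q a power chosen large enough for π^D to be proper.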

At the start the authors rightly insist on letting the nuisance parameter ν differ between models but, as we all often do, they relapse back to having the “same ν” in both models for integrability reasons. Nuisance parameters make the definition of the divergence prior somewhat harder. Or somewhat arbitrary. Indeed, as in reference prior settings, the authors work first conditional on the nuisance, then use a prior on ν that may be improper by the “same” argument. (Although conditioning is not the proper term if the marginal prior on ν is improper.)

The paper also contains an interesting case of the translated Exponential, where the resulting prior is a Student’s t with 2 degrees of freedom. And another one of mixture models, albeit in the simple case of a location parameter on one component only.

## three ½ [out of 159] versions of Johnny B. Goode

Posted in Statistics on April 2, 2017 by xi'an

$\int_0^\infty(1-F(x))\text{d}x = \mathbb{E}_F[X]$