Reference priors for the law of natural induction

Jim Berger, José Bernardo and Dongchu Sun have written a paper on reference/objective priors for hypergeometric sampling or, in more historical terms, for the law of natural induction. (This is an invited discussion paper for the Revista de la Real Academia de Ciencias, Serie A Matemáticas.) This paper is interesting in that, by using an argument previously made by Harold Jeffreys in Theory of Probability, it overcomes a serious difficulty with the standard Bayesian analysis of the law of natural induction, which gives the probability that all members of a population (say, swans) are of the same kind as those observed so far (say, white swans). In the standard approach, the probability that the next draw is of the same kind as the n previous ones is (n+1)/(n+2), while the probability that all N members are of the same kind is (n+1)/(N+1). These two probabilities are paradoxically at odds when N is much larger than n. Recasting the problem in a hypothesis testing framework, as Jeffreys did, Jim Berger, José Bernardo and Dongchu Sun derive a reference prior that leads to probabilities behaving like (n+1/2)/(n+1) and (approximately) \sqrt{n}/(\sqrt{n}+1/\sqrt{\pi}), hence making them compatible. Of course, this is not going to solve the (unrelated) debate about the relevance of this rule for finding black swans, but this is a fairly interesting result!
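The contrast between the two analyses is easy to check numerically. The following is a minimal sketch using only the four expressions quoted above; the function names are mine, and the last two formulas are the (approximate) reference-prior versions reported in the paper, not exact posterior computations.

```python
import math

def laplace_next(n):
    # Standard (uniform-prior) probability that the next draw
    # matches the n observations so far: (n+1)/(n+2)
    return (n + 1) / (n + 2)

def laplace_all(n, N):
    # Standard probability that all N members of the population
    # are of the observed kind: (n+1)/(N+1)
    return (n + 1) / (N + 1)

def reference_next(n):
    # Reference-prior analogue quoted in the post: (n+1/2)/(n+1)
    return (n + 0.5) / (n + 1)

def reference_all(n):
    # Approximate reference-prior probability that all members
    # are of the observed kind: sqrt(n)/(sqrt(n) + 1/sqrt(pi))
    return math.sqrt(n) / (math.sqrt(n) + 1 / math.sqrt(math.pi))

n, N = 100, 10**6
print(laplace_next(n))    # high: about 0.99
print(laplace_all(n, N))  # near zero, despite the high next-draw probability
print(reference_next(n))  # high: about 0.995
print(reference_all(n))   # also high: about 0.95, no longer paradoxical
```

With n = 100 white swans observed out of a population of a million, the standard approach is nearly certain the next swan is white yet gives essentially zero probability that all swans are, whereas the two reference-prior quantities agree.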

One Response to “Reference priors for the law of natural induction”

  1. […] Besides being awfully nice, these comments mostly focus on Laplace’s succession law which, as posted earlier, is an endless source for debate! Stephen notes the trick of keeping weights at the […]
