a resolution of the Jeffreys-Lindley paradox

Posted in Books, Statistics, University life on April 24, 2019 by xi'an

“…it is possible to have the best of both worlds. If one allows the significance level to decrease as the sample size gets larger (…) there will be a finite number of errors made with probability one. By allowing the critical values to diverge slowly, one may catch almost all the errors.” (p.1527)

When commenting on another post, Michael Naaman pointed out to me his 2016 Electronic Journal of Statistics paper where he resolves the Jeffreys-Lindley paradox. The argument there is to let the Type I error go to zero as the sample size n goes to infinity, but slowly enough that the Type II error goes to zero as well, which guarantees a finite number of errors, with probability one, as n grows to infinity. For the Jeffreys-Lindley paradox, this translates into a posterior probability of the null that converges to zero with n, driven by a pivotal quantity within this posterior probability going to infinity, hence making it (most) agreeable with the Type I error going to zero. Except that there is little reason to assume this pivotal quantity goes to infinity with n, given that its distribution remains constant in n. Assuming it remains constant is less unrealistic, by comparison! That there exists a hypothetical sequence of observations such that the p-value and the posterior probability agree, even exactly, does not "solve" the paradox in my opinion.
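To make the mechanics concrete, here is a minimal numerical sketch of mine (not taken from Naaman's paper), assuming the standard Gaussian point-null setting: X₁,…,Xₙ iid N(θ,1), H₀: θ=0 against a N(0,τ²) alternative with equal prior weights and τ=1, so that the pivotal quantity z = √n x̄ is N(0,1) under the null for every n. Holding z fixed at 1.96 reproduces the paradox, while letting the critical value diverge slowly, say as √(2 log n), sends both the p-value and the posterior probability of the null to zero.

```python
# A sketch of the paradox and of the slowly-diverging-critical-value fix,
# under hypothetical modelling choices (not from the paper or the post):
# X_1,...,X_n ~ N(theta, 1), H0: theta = 0 vs H1: theta ~ N(0, tau^2),
# equal prior weights, tau = 1. The pivotal quantity z = sqrt(n)*xbar
# is N(0,1) under H0 for every n.
import numpy as np
from scipy.stats import norm

TAU2 = 1.0  # prior variance under the alternative (hypothetical choice)

def posterior_null(z, n, tau2=TAU2):
    """P(H0 | z) with equal prior odds, via the Bayes factor B01 for H0."""
    b01 = np.sqrt(1 + n * tau2) * np.exp(-0.5 * z**2 * n * tau2 / (1 + n * tau2))
    return b01 / (1 + b01)

print(f"{'n':>10} {'p(1.96)':>9} {'P(H0|1.96)':>11} {'p(z_n)':>9} {'P(H0|z_n)':>10}")
for n in [10, 10**2, 10**4, 10**6, 10**8]:
    z_fix = 1.96                   # constant pivotal quantity: p-value stuck at 5%
    z_n = np.sqrt(2 * np.log(n))   # critical value diverging slowly with n
    print(f"{n:>10} {2*norm.sf(z_fix):>9.3g} {posterior_null(z_fix, n):>11.3f} "
          f"{2*norm.sf(z_n):>9.3g} {posterior_null(z_n, n):>10.3g}")
```

In this toy setting, the fixed 1.96 keeps the p-value at 5% while the posterior probability of the null climbs towards one, whereas the slowly diverging critical value makes both vanish; but nothing in the model forces the observed pivotal quantity to track √(2 log n), which is the crux of the objection above.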