relativity is the keyword

As I was teaching my introduction to Bayesian Statistics this morning, ending with the chapter on tests of hypotheses, I found myself reflecting [out loud] on the relative nature of posterior quantities. Just as when I introduced the role of priors in Bayesian analysis the day before, I stressed the relativity of quantities coming out of the BBB [Big Bayesian Black Box], namely that whatever comes out of a Bayesian procedure is to be understood, scaled, and relativised against its prior equivalent, i.e., that the reference measure or gauge is the prior. This is sort of obvious, clearly, but bringing the argument forward from the start avoids all sorts of misunderstanding and disagreement, in that it excludes the claims of absoluteness and certainty that may come with the production of a posterior distribution. It also removes the endless debate about the determination of the prior, by making each prior a reference on its own. With the additional possibility of calibration by simulation under the assumed model. Or an alternative one. Again, nothing new there, but I got rather excited by this presentation choice, as it seems to clarify the path to Bayesian modelling and to avoid misapprehensions.

Further, the curious case of the Bayes factor (or of the posterior probability) could possibly be resolved most satisfactorily in this framework, as the [dreaded] dependence on the model prior probabilities then becomes a matter of relativity! Those posterior probabilities depend directly and almost linearly on the prior probabilities, but they should not be interpreted in an absolute sense as the ultimate and unique probability of the hypothesis (which anyway means nothing in terms of the observed experiment). In other words, this posterior probability does not need to be scaled against a U(0,1) distribution. Or against the p-value, if anyone wishes to do so. By the end of the lecture, I was even wondering [not so loudly] whether this perspective allowed for a resolution of the Lindley-Jeffreys paradox, as the resulting number could be set relative to the choice of the [arbitrary] normalising constant.
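To make the near-linear dependence concrete, here is a minimal sketch (my own, not from the post; the helper `post_prob` is hypothetical) of the posterior probability of the null as a function of the prior weight ρ and of the Bayes factor B in favour of the null:

```python
def post_prob(rho, B):
    # posterior probability of the null, as a function of the prior
    # weight rho on the null and the Bayes factor B in its favour
    return rho * B / (rho * B + 1.0 - rho)

for rho in (0.1, 0.5, 0.9):
    # with B = 1 the posterior probability equals the prior weight exactly,
    # and for B close to 1 the dependence on rho is almost linear
    print(rho, post_prob(rho, 1.0), post_prob(rho, 3.0))
```

Dividing the posterior odds post_prob/(1 − post_prob) by the prior odds ρ/(1 − ρ) returns B, whatever ρ: this is precisely the relative reading of the posterior probability advocated above.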

To illustrate the earlier point, here is a comparison of the distribution of the posterior probability of the null when the prior probability of the Gaussian null is .1, .5, and .9, and when the data are simulated either under the alternative (with a Normal N(0,10) prior) or under the null. The impact of the prior weight vanishes in this comparison, once the relative value of the probability is accounted for.
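A minimal simulation along these lines (my own sketch, assuming a single observation x ~ N(μ,1) per dataset, a point null μ = 0, and the N(0,10) prior on μ under the alternative; the exact settings are not given in the post):

```python
import numpy as np

rng = np.random.default_rng(1)

def norm_pdf(x, sd):
    # density of a N(0, sd^2) distribution evaluated at x
    return np.exp(-0.5 * (x / sd) ** 2) / (sd * np.sqrt(2.0 * np.pi))

def posterior_prob_null(x, rho, tau2=10.0):
    # posterior probability of H0: mu = 0 given one observation x ~ N(mu, 1),
    # with prior weight rho on H0 and mu ~ N(0, tau2) under the alternative
    m0 = norm_pdf(x, 1.0)                    # marginal likelihood under H0
    m1 = norm_pdf(x, np.sqrt(1.0 + tau2))    # marginal likelihood under H1
    return rho * m0 / (rho * m0 + (1.0 - rho) * m1)

n = 10_000
x_null = rng.normal(0.0, 1.0, n)             # data simulated under the null
mu = rng.normal(0.0, np.sqrt(10.0), n)       # mu drawn from its N(0,10) prior
x_alt = rng.normal(mu, 1.0)                  # data simulated under the alternative

for rho in (0.1, 0.5, 0.9):
    p0 = posterior_prob_null(x_null, rho)
    p1 = posterior_prob_null(x_alt, rho)
    # dividing posterior odds by prior odds recovers the Bayes factor,
    # which does not depend on rho: the prior weight drops out entirely
    bf_null = (p0 / (1.0 - p0)) * (1.0 - rho) / rho
    print(rho, p0.mean().round(3), p1.mean().round(3), bf_null.mean().round(3))
```

The raw posterior probabilities shift with ρ, but the last column, the Bayes factor recovered by rescaling against the prior odds, is identical across the three prior weights.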

3 Responses to “relativity is the keyword”

1. Very much agree with posterior quantities not being absolute, that they are relative to the prior (and to the data generating model assumed), and that calibrations using simulations from assumed priors and data generating models (both assumed and used versus not assumed but used) can be very sensible (and need wider encouragement).

But, “a prior is a prior is a prior” is formally true but disastrous in practice – the prior (and data model) can misrepresent the reality one is trying to learn about so badly as to make any analyses assuming them harmful. So developing peer review expertise for priors is badly needed (which I wrote about in Two Cheers for Bayes, 1996, https://www.ncbi.nlm.nih.gov/pubmed/8889349 ).

Also, I think you will find these ideas used in Mike Evans’s solution to the Lindley-Jeffreys paradox.

Keith O’Rourke

2. This is a very nice way of thinking. We practitioners of Bayesian inference need to share results across studies. How can we use this idea? [Conditional on the difficulty of comparing, or even describing, several complex multivariate priors]

• Thank you. The main messages here are that (a) there is no principled way to rank priors, i.e., “a prior is a prior is a prior”, and (b) probability statements are made relative to the probability model induced by the prior choice of the prior (!), which means that comparing priors via their outcome is not particularly meaningful. Comparing posteriors with the same priors, if this is what you mean by “across studies”, is relevant.
