Is Jeffreys’ prior unique?
“A striking characterisation showing the central importance of Fisher’s information in a differential framework is due to Cencov (1972), who shows that it is the only invariant Riemannian metric under symmetry conditions.” N. Polson, PhD Thesis, University of Nottingham, 1988
Following a discussion on Cross Validated, I wonder whether or not the affirmation holds that Jeffreys’ prior is the only prior construction rule that remains invariant under arbitrary (if smooth enough) reparameterisation. In the discussion, Paulo Marques mentioned Nikolaj Nikolaevič Čencov’s book, Statistical Decision Rules and Optimal Inference, a Russian book from 1972, of which I had not heard previously and which seems too theoretical [from Paulo’s comments] to explain why this rule would be the sole one. As I kept looking for Čencov’s references on the Web, I found Nick Polson’s thesis and the above quote. So maybe Nick could tell us more!
However, my uncertainty about the uniqueness of Jeffreys’ rule stems from the fact that, if I decide on a favourite or reference parametrisation (as Jeffreys indirectly does when selecting the parametrisation associated with a constant Fisher information) and on a prior derivation from the sampling distribution for this parametrisation, I have derived a parametrisation-invariant principle. Possibly silly and uninteresting from a Bayesian viewpoint, but nonetheless invariant.
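For concreteness (this check is standard, not part of the original post): under a smooth one-to-one reparameterisation $latex \varphi = h(\theta)$, the Fisher information transforms as a quadratic form, so the square-root prior absorbs exactly the Jacobian required by the change-of-variables formula,

$latex I(\varphi) = I(\theta)\left(\frac{\mathrm{d}\theta}{\mathrm{d}\varphi}\right)^{2} \quad\Longrightarrow\quad \sqrt{I(\varphi)}\,\mathrm{d}\varphi = \sqrt{I(\theta)}\left|\frac{\mathrm{d}\theta}{\mathrm{d}\varphi}\right|\mathrm{d}\varphi = \sqrt{I(\theta)}\,\mathrm{d}\theta;$

in the multivariate case the squared derivative becomes $latex J^{\mathsf{T}} I(\theta)\, J$ for the Jacobian matrix $latex J$, and the determinant plays the same role.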
March 24, 2015 at 11:44 am
Hi Professor Robert,
While studying Haar measures I came across the following paper:
George, E.I. and McCulloch, R. (1993). On obtaining invariant prior distributions,
which might relate to the uniqueness of Jeffreys’s prior. I believe that the paper states the following:
1. Ω-invariance: All priors that are proportional to $latex \sqrt{\det D''}$, where D is a divergence measure and D″ its Hessian, are parametrisation invariant.
2. S-invariance: Divergence measures of the form $latex D(p, q) = \int d_0\big(p(x), q(x)\big)\,\mathrm{d}x$ are sample-space invariant if $latex d_0$ is homogeneous. Such a $latex d_0$ can be written as $latex d_0(p, q) = p\, d(q/p)$, which relates these measures to Csiszár’s f-divergences. As examples they discuss $latex D_0$, the squared Hellinger distance, and $latex D_1$, the KL divergence; both have (a multiple of) the Fisher information as their Hessian (see the symbolic check after this list).
3. Group-invariance: According to Kass (1980, 1989) Jeffreys prior is the left Haar density whenever the data can be written as being acted upon by the parameter (left-group action).
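To make the Hessian claim in 1.–2. concrete, here is a minimal symbolic check (my own sketch, using a Bernoulli family of my choosing, not an example taken from the paper) that the KL divergence and the squared Hellinger distance both have a multiple of the Fisher information as their Hessian at $latex q = p$:

import sympy as sp

t, u = sp.symbols('theta phi', positive=True)  # parameters, taken in (0, 1)

# Toy family: Bernoulli(theta) versus Bernoulli(phi)
kl = t*sp.log(t/u) + (1 - t)*sp.log((1 - t)/(1 - u))      # KL divergence, D1
hel2 = sp.Rational(1, 2)*((sp.sqrt(t) - sp.sqrt(u))**2
                          + (sp.sqrt(1 - t) - sp.sqrt(1 - u))**2)  # squared Hellinger, D0

fisher = 1/(t*(1 - t))  # Fisher information of Bernoulli(theta)

# "Hessian": second derivative in the second argument, evaluated at phi = theta
kl_hess = sp.diff(kl, u, 2).subs(u, t)
hel2_hess = sp.diff(hel2, u, 2).subs(u, t)

print(sp.simplify(kl_hess/fisher))    # 1   : Hessian of KL equals I(theta)
print(sp.simplify(hel2_hess/fisher))  # 1/4 : Hessian of Hellinger^2 is I(theta)/4

The factor 1/4 in the Hellinger case is why “proportional to the Fisher information” is the safe phrasing.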
I’m not sure whether these three statements point towards the uniqueness of Jeffreys’s prior, nor whether the converse of 1 holds: do you know whether a parametrisation-invariant prior is necessarily proportional to the root of a divergence’s Hessian? Clearly, there is a relation between Jeffreys’s prior, the Hellinger distance and the KL divergence, but is this relationship uniquely connected to the Riemannian metric on model space?
Cheers,
Alexander
March 3, 2015 at 2:58 pm
Xian
Here’s a modern-day version of Cencov:
http://m.pnas.org/content/108/25/10078.full.pdf
Nick
March 3, 2015 at 7:14 pm
Thank you, Nick! While it may be modern-day (!), it is not comprehensible enough for me to make the link with the uniqueness of Jeffreys’s prior as the invariant prior.
March 3, 2015 at 12:48 pm
X,
The uniqueness argument comes from the uniqueness of the Fisher-Rao metric, not from the resulting measure itself.
Basically, Fisher-Rao is the only metric consistent with the usual properties of frequentist statistics, such as invariance under sufficient statistics. Then, using the fact that a metric induces a unique measure (not guaranteed to be a probability measure, of course), one can argue that the Jeffreys prior is unique for a given likelihood function.
If you relax the conditions on Fisher-Rao (in particular, if you do mathematically suspect things like adding the Hessian of a prior density), then you no longer get a unique metric, and hence no longer a unique Jeffreys prior.
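A quick numerical sketch of that last step (my example, not the commenter’s): for the Bernoulli model, the Jeffreys density computed directly in the logit parametrisation coincides with the mean-parametrisation density transported through the change of variables, exactly as the induced-measure argument predicts.

import numpy as np

theta = np.linspace(0.05, 0.95, 19)   # Bernoulli mean parametrisation
eta = np.log(theta/(1 - theta))       # logit parametrisation

sqrt_I_theta = np.sqrt(1/(theta*(1 - theta)))           # Jeffreys density in theta
sqrt_I_eta = np.sqrt(np.exp(eta)/(1 + np.exp(eta))**2)  # Jeffreys density in eta

# Change of variables: the two densities must agree up to d(theta)/d(eta)
jacobian = theta*(1 - theta)
print(np.allclose(sqrt_I_eta, sqrt_I_theta*jacobian))   # True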
March 3, 2015 at 4:03 am
I took some notes about this eight years ago… My memory is connected with Polson’s quote: the “only” (in a certain sense) Riemannian metric is the Rao-Jeffreys metric (aka information metric or Fisher metric)

$latex g_{ij}(\theta) = \mathbb{E}_{\theta}\!\left[\frac{\partial \log p(x\mid\theta)}{\partial \theta_{i}}\,\frac{\partial \log p(x\mid\theta)}{\partial \theta_{j}}\right],$

and Jeffreys’ prior is (proportional to) the induced measure

$latex \pi(\theta)\,\mathrm{d}\theta \propto \sqrt{\det g(\theta)}\,\mathrm{d}\theta.$
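For a concrete instance of this recipe (a standard textbook computation, not taken from the comment): for the family $latex N(\mu, \sigma^2)$ parametrised by $latex (\mu, \sigma)$, the Fisher metric is $latex g(\mu, \sigma) = \mathrm{diag}(1/\sigma^2,\ 2/\sigma^2)$, so the induced measure recovers the familiar $latex \pi(\mu, \sigma) \propto \sqrt{\det g(\mu, \sigma)} = \sqrt{2}/\sigma^{2} \propto \sigma^{-2}$.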