Is Jeffreys’ prior unique?

“A striking characterisation showing the central importance of Fisher’s information in a differential framework is due to Cencov (1972), who shows that it is the only invariant Riemannian metric under symmetry conditions.” N. Polson, PhD Thesis, University of Nottingham, 1988

Following a discussion on Cross Validated, I wonder whether the affirmation that Jeffreys’ prior is the only prior construction rule that remains invariant under arbitrary (if smooth enough) reparameterisations actually holds. In the discussion, Paulo Marques mentioned Nikolaj Nikolaevič Čencov’s book, Statistical Decision Rules and Optimal Inference, a Russian book from 1972, of which I had not heard previously and which seems too theoretical [from Paulo’s comments] to explain why this rule would be the sole one. As I kept looking for Čencov’s references on the Web, I found Nick Polson’s thesis and the above quote. So maybe Nick could tell us more!

However, my uncertainty about the uniqueness of Jeffreys’ rule stems from the following: if I decide on a favourite or reference parametrisation (as Jeffreys indirectly does when selecting the parametrisation associated with a constant Fisher information) and derive a prior from the sampling distribution in this parametrisation, I have also produced a parametrisation-invariant principle. Possibly silly and uninteresting from a Bayesian viewpoint, but nonetheless invariant.
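To make the invariance claim concrete, here is a small numerical sketch of my own (not from Čencov or Jeffreys), using the Bernoulli model: the Jeffreys prior in the success-probability parametrisation, pushed through the change of variable to the logit scale, coincides with the Jeffreys prior computed directly from the Fisher information on the logit scale.

```python
import math

def sigmoid(t):
    return 1.0 / (1.0 + math.exp(-t))

def jeffreys_p(p):
    # Jeffreys prior (unnormalised) for Bernoulli(p): sqrt(I(p)) = 1/sqrt(p(1-p))
    return 1.0 / math.sqrt(p * (1.0 - p))

def jeffreys_theta_direct(theta):
    # On the logit scale theta = log(p/(1-p)), the Fisher information is
    # I(theta) = p(1-p), so the Jeffreys prior is sqrt(p(1-p))
    p = sigmoid(theta)
    return math.sqrt(p * (1.0 - p))

def jeffreys_theta_by_change_of_variable(theta):
    # Transform the p-scale prior with the Jacobian dp/dtheta = p(1-p)
    p = sigmoid(theta)
    return jeffreys_p(p) * p * (1.0 - p)

# The two constructions agree pointwise, illustrating the invariance
for theta in (-2.0, 0.0, 1.5):
    a = jeffreys_theta_direct(theta)
    b = jeffreys_theta_by_change_of_variable(theta)
    assert abs(a - b) < 1e-12
```

Of course, this only illustrates the invariance of Jeffreys’ rule itself, not its uniqueness among invariant rules, which is the actual question.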

5 Responses to “Is Jeffreys’ prior unique?”

  1. Hi Professor Robert,

    While studying Haar measures I came across the following paper:

George, E.I. and McCulloch, R. (1993). On obtaining invariant prior distributions,

    which might relate to the uniqueness of Jeffreys’s prior. I believe that the paper states the following:

    1. Ω-invariance: All priors that are proportional to det(D”), where D is a divergence measure and D” its Hessian, are parametrisation invariant.
    2. S-invariance: Divergence measures of the form D_{0}(P,Q)=\int d_{0}(p, q) \,\text{d} \mu are sample-space invariant if d_{0} is homogeneous. Such a divergence measure can be related to a divergence measure D_{1}(P, Q)=\int p\, d(q / p) \,\text{d} \mu such that d_{0}(p, q)= p\, d(q/p), which is related to Csiszár’s f-divergence measures. As an example they discuss D_{0} being the squared Hellinger distance and D_{1} the KL-divergence; both divergence measures have the Fisher information as their Hessian.
    3. Group-invariance: According to Kass (1980, 1989) Jeffreys prior is the left Haar density whenever the data can be written as being acted upon by the parameter (left-group action).

    I’m not sure whether these three statements point towards the uniqueness of Jeffreys’s prior, and in particular whether the converse of 1 holds: do you know whether a parametrisation-invariant prior is necessarily proportional to the root of a divergence’s Hessian? Clearly, there is a relation between Jeffreys’s prior, the Hellinger distance and the KL-divergence, but is this relationship uniquely connected to the Riemannian metric on model space?
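    For what it’s worth, the claim in point 2 that the squared Hellinger distance and the KL-divergence both have the Fisher information as Hessian (up to a constant factor depending on the normalising convention) can be checked numerically on a Bernoulli model. This is a sketch of mine, not from the George and McCulloch paper:

```python
import math

def kl(p, q):
    # KL divergence between Bernoulli(p) and Bernoulli(q)
    return p * math.log(p / q) + (1.0 - p) * math.log((1.0 - p) / (1.0 - q))

def hellinger_sq(p, q):
    # squared Hellinger distance (without the optional 1/2 factor)
    return (math.sqrt(p) - math.sqrt(q))**2 + \
           (math.sqrt(1.0 - p) - math.sqrt(1.0 - q))**2

def hessian_at_diagonal(div, p, h=1e-4):
    # central second difference of q -> div(p, q) at q = p
    return (div(p, p + h) - 2.0 * div(p, p) + div(p, p - h)) / h**2

p = 0.3
fisher = 1.0 / (p * (1.0 - p))  # Fisher information of Bernoulli(p)

# KL has Hessian equal to the Fisher information (ratio 1);
# the squared Hellinger distance, in this convention, gives half of it
assert abs(hessian_at_diagonal(kl, p) / fisher - 1.0) < 1e-3
assert abs(hessian_at_diagonal(hellinger_sq, p) / fisher - 0.5) < 1e-3
```

    So both divergences indeed expand quadratically with a curvature proportional to the Fisher information, which is what makes det(D″)-based constructions land on Jeffreys’s prior.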


  2. Xian
    Here’s a modern day version of Cencov

    • Thank you, Nick! While it may be modern day (!), it does not sound comprehensible enough for me to make the link with the uniqueness of Jeffreys’s prior as the invariant prior.

  3. betanalpha Says:


    The uniqueness argument comes from the uniqueness of the Fisher-Rao metric, not the resulting measure itself.

    Basically Fisher-Rao is the only metric that is consistent with the usual properties of Frequentist statistics, such as sufficiency and the like. Then, using the fact that a metric induces a unique measure (not guaranteed to be a probability measure, of course), you can argue that the Jeffreys’ prior is unique to a given likelihood function.

    If you relax the condition on Fisher-Rao (in particular if you do mathematically suspect things like adding the Hessian of a prior density) then you no longer get a unique metric and hence no more unique Jeffreys’ prior.

  4. I took some notes about this eight years ago… My recollection agrees with Polson’s quote: the “only” (in a certain sense) Riemannian metric is the Rao–Jeffreys metric (aka the information metric or Fisher metric) g_{ij}=\mathrm{E}[\partial_i \ell\, \partial_j \ell], and Jeffreys’ prior is (proportional to) the induced measure \sqrt{\mathrm{det}(g_{ij})}.
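    To make that concrete, here is a little Monte Carlo sketch of mine (the normal model N(μ, σ²) is an arbitrary choice for illustration): estimating g_{ij} = E[∂_i ℓ ∂_j ℓ] from simulated scores and taking √det(g) recovers the familiar Jeffreys prior, proportional to 1/σ².

```python
import math
import random

def score(x, mu, sigma):
    # gradient of log N(x | mu, sigma^2) with respect to (mu, sigma)
    d_mu = (x - mu) / sigma**2
    d_sigma = (x - mu)**2 / sigma**3 - 1.0 / sigma
    return d_mu, d_sigma

def fisher_mc(mu, sigma, n=200_000, seed=0):
    # Monte Carlo estimate of g_ij = E[score_i * score_j]
    rng = random.Random(seed)
    g = [[0.0, 0.0], [0.0, 0.0]]
    for _ in range(n):
        s = score(rng.gauss(mu, sigma), mu, sigma)
        for i in range(2):
            for j in range(2):
                g[i][j] += s[i] * s[j] / n
    return g

mu, sigma = 0.0, 2.0
g = fisher_mc(mu, sigma)
# exact Fisher information is diag(1/sigma^2, 2/sigma^2),
# so the induced measure sqrt(det g) equals sqrt(2)/sigma^2
det = g[0][0] * g[1][1] - g[0][1] * g[1][0]
print(math.sqrt(det), math.sqrt(2.0) / sigma**2)
```

    The volume element √det(g) is exactly what a Riemannian metric assigns as its natural (unnormalised) measure, which is the sense in which the uniqueness of the metric transfers to the prior.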
