## which parameters are U-estimable?

Posted in Books, Kids, Statistics, University life with tags , , , , , , , on January 13, 2015 by xi'an

Today (01/06) was a double epiphany in that I realised that one of my long-time beliefs about unbiased estimators did not hold. Indeed, when checking on Cross Validated, I found this question: For which distributions is there a closed-form unbiased estimator for the standard deviation? And the answers include the normal case, for which there indeed exists an unbiased estimator of σ, namely

$\frac{\Gamma((n-1)/2)}{\Gamma(n/2)}\,2^{-1/2}\sqrt{\sum_{k=1}^n(x_k-\bar{x})^2}$

which derives directly from the chi-square distribution of the sum of squares divided by σ². Thinking further about it, if only a posteriori!, it is now fairly obvious given that σ is a scale parameter. Better, any power of σ can be similarly estimated in an unbiased manner, since

$\mathbb{E}\left[\left\{\sum_{k=1}^n(x_k-\bar{x})^2\right\}^{\alpha/2}\right] \propto\sigma^\alpha\,.$

And this property extends to all location-scale models.
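As a quick Monte Carlo sanity check of both the σ estimator above and the more general claim about powers of σ (my own sketch, not from the post; the function name and the constant derived from E[(χ²_k)^p] = 2^p Γ(k/2+p)/Γ(k/2) are my additions):

```python
import math
import numpy as np

rng = np.random.default_rng(0)
n, sigma, reps = 10, 2.0, 400_000

# reps independent N(0, sigma^2) samples of size n
x = rng.normal(0.0, sigma, size=(reps, n))
# sum of squared deviations, distributed as sigma^2 * chi2_{n-1}
s2 = ((x - x.mean(axis=1, keepdims=True)) ** 2).sum(axis=1)

def unbiased_sigma_power(s2, n, alpha):
    """Unbiased estimator of sigma**alpha built from the sum of squares s2.

    Uses E[(chi2_k)^p] = 2^p Gamma(k/2 + p) / Gamma(k/2) with k = n - 1 and
    p = alpha / 2; log-gammas keep the normalising constant stable.
    """
    log_c = (math.lgamma((n - 1) / 2)
             - math.lgamma((n - 1 + alpha) / 2)
             - (alpha / 2) * math.log(2))
    return math.exp(log_c) * s2 ** (alpha / 2)

for alpha in (1, 2, 3):
    # each average should be close to sigma**alpha
    print(alpha, unbiased_sigma_power(s2, n, alpha).mean(), sigma ** alpha)
```

For α=2 the constant collapses to 1/(n-1), recovering the usual unbiased variance estimator; α=1 recovers the Γ-ratio estimator displayed above.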

So how on Earth was I so convinced that there was no unbiased estimator of σ?! I think it stems from reading too quickly a result in, I think, Lehmann and Casella, due to Peter Bickel and Erich Lehmann, which states that, for a convex family of distributions F, there exists an unbiased estimator of a functional q(F) (for a sample size n large enough) if and only if q(αF+(1-α)G) is a polynomial in α over 0≤α≤1. Because of this, I had this [wrong!] impression that only polynomials of the natural parameters of exponential families can be estimated by unbiased estimators… Note that Bickel’s and Lehmann’s theorem does not apply to the problem here because the collection of Gaussian distributions is not convex (a mixture of Gaussians is not a Gaussian).

This leaves open the question as to which transforms of the parameter(s) are unbiasedly estimable (or U-estimable) for a given parametric family, like the normal N(μ,σ²). I checked in Lehmann’s first edition earlier today and could not find an answer, besides the definition of U-estimability. Not only is the question interesting per se, but the answer could correct my long-standing impression that unbiasedness is a rare event, i.e., that the collection of transforms of the model parameter that are U-estimable is a very small subset of the whole collection of transforms.

## A final if still incomplete history of Markov Chain Monte Carlo

Posted in Books, Statistics, University life with tags , , , , on May 18, 2010 by xi'an

To keep up with the habit of posting about any (!) arXiving of my papers, the final revision of our paper with George Casella on some recollections on the history of MCMC has been accepted for publication in the Handbook of Markov Chain Monte Carlo: Methods and Applications, edited by Steve Brooks, Andrew Gelman, Galin Jones, and Xiao-Li Meng, and is re-arXived as well. As stressed by the title, the coverage is anything but exhaustive and we mostly stress the parts of MCMC “history” we witnessed and in which we took part. One referee took issue with this stance, asking for a more scholarly work with less-standard entries covered in this chapter. This was a perfectly reasonable request from the handbook perspective, but we had neither the time nor the will to turn into part-time historians, as a large chunk of the MCMC history takes place in the foreign realms of particle physics (exemplified in, say, Landau and Binder) and signal processing! The same referee objected to our light, too light, style, as e.g. the (mock-)inclusion of

Definition: epiphany n. A spiritual event in which the essence of a given object of manifestation appears to the subject, as in a sudden flash of recognition.

within the text. (I remember being puzzled by the term when I read it for the first time in Joyce’s Portrait of the Artist As a Young Man as it has a purely ecclesiastical meaning in French…) Again reasonable from the referee’s viewpoint, but handbooks should allow for more freedom of style than journals, unless all chapters get edited by the same managing editor as in Markov Chain Monte Carlo in Practice, where Wally Gilks did an incredible job!