Archive for best equivariant estimator

more concentration, everywhere

Posted in R, Statistics on January 25, 2019 by xi'an

Although it may sound like an excessive notion of optimality, one can hope to obtain an estimator δ of a unidimensional parameter θ that is always closer to θ than any other estimator. In distribution if not almost surely, meaning the cdf of (δ−θ) is steeper than that of other estimators enjoying the same cdf value at zero (for instance ½, to make them all median-unbiased). When I saw this question on X validated, I thought of the Cauchy location example, where there is no uniformly optimal estimator, albeit a large collection of unbiased ones. But a simulation experiment shows that the MLE does better than the competition, at least better than the four of them considered here (since I also tried the Pitman estimator, via Christian Hennig's smoothmest R package). The differences between their empirical cdfs and that of the MLE make it clearer below (with tomato for a score correction, gold for the Pitman estimator, sienna for the 38% trimmed mean, and blue for the median). I wonder at a general theory along these lines. There is a vague similarity with Pitman nearness or closeness, but without the paradoxes induced by that criterion. More in the spirit of stochastic dominance, which may be achievable for location invariant and mean unbiased estimators…
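A minimal sketch of the kind of experiment involved (not the code from the post: the score-corrected estimator is left out, the Pitman estimator is computed by a crude numerical integration as a posterior mean under a flat prior rather than through smoothmest, and sample size and number of replications are arbitrary choices):

set.seed(101)
theta <- 0   # true location, taken as zero w.l.o.g. by invariance
n <- 11      # sample size (arbitrary)
M <- 1e3     # number of simulated samples

# Pitman (minimum-risk equivariant) estimator of a Cauchy location, i.e. the
# posterior mean under a flat prior, via a crude but adequate grid average
pitman <- function(x) {
  u <- seq(median(x) - 20, median(x) + 20, length.out = 2001)
  w <- sapply(u, function(t) prod(dcauchy(x, t)))
  sum(u * w) / sum(w)
}

# Cauchy MLE by one-dimensional (local) optimisation around the median
mle <- function(x)
  optimize(function(t) -sum(dcauchy(x, t, log = TRUE)),
           interval = median(x) + c(-10, 10))$minimum

est <- t(replicate(M, {
  x <- rcauchy(n, location = theta)
  c(mle = mle(x),
    pitman = pitman(x),
    trimmed = mean(x, trim = 0.38),  # 38% trimmed mean
    median = median(x))
}))

# empirical cdfs of the absolute errors: a more concentrated estimator has
# its cdf above the others at every error level
errs <- abs(est - theta)
cols <- c(mle = "black", pitman = "gold", trimmed = "sienna", median = "blue")
plot(sort(errs[, "mle"]), (1:M) / M, type = "s", col = cols["mle"],
     xlim = c(0, 2), xlab = "|error|", ylab = "empirical cdf")
for (nm in c("pitman", "trimmed", "median"))
  lines(sort(errs[, nm]), (1:M) / M, type = "s", col = cols[nm])
legend("bottomright", legend = names(cols), col = cols, lty = 1)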

Pitman medal for Kerrie Mengersen

Posted in pictures, Statistics, Travel, University life on December 20, 2016 by xi'an

My friend and co-author of many years, Kerrie Mengersen, just received the 2016 Pitman Medal, the prize of the Statistical Society of Australia. Congratulations to Kerrie for a well-deserved recognition of her massive contributions to Australian, Bayesian, computational, and modelling statistics, and to data science as a whole. (In case you wonder about the picture above, she has not yet lost the medal, but is instead looking for jaguars in the Amazon.)

This medal is named after EJG Pitman, Australian probabilist and statistician, whose name is attached to an estimator, a lemma, a measure of efficiency, a test, and a measure of comparison between estimators. His estimator is the best equivariant (or invariant) estimator, which can be expressed as a Bayes estimator under the relevant right Haar measure, despite having no Bayesian motivation to start with. His lemma is the Pitman-Koopman-Darmois lemma, which states that outside exponential families, sufficiency is essentially useless (except for exotic cases like the Uniform distributions, whose support depends on the parameter). Darmois actually published the result first, in 1935, but in French, in the Comptes Rendus de l'Académie des Sciences. And the measure of comparison is Pitman nearness or closeness, on which I wrote a paper with my friends Gene Hwang and Bill Strawderman, a paper we thought would be the final word on the measure, as it pointed out several major deficiencies of this concept. But the literature continued to grow after that..!
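(As a one-line reminder of the Bayesian reading, in the location case and under squared error loss, the Pitman estimator is the posterior mean under the flat, i.e. right Haar, prior,

δ*(x) = ∫ θ ∏ᵢ f(xᵢ − θ) dθ / ∫ ∏ᵢ f(xᵢ − θ) dθ,

which is also the representation used to compute it numerically in the Cauchy example above.)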