GMLEB outperforms James-Stein?

At the monthly meeting of the Apprentissage et Sparsité group run by Sacha Tsybakov at CREST, Ismael Castillo will discuss tomorrow the recent paper General maximum likelihood empirical Bayes estimation of normal means by Wenhua Jiang and Cun-Hui Zhang, just published in the Annals of Statistics (37(4), 1647-1684). (The paper is also available on arXiv.) An interesting sentence from the abstract is that “the GMLEB outperforms the James-Stein and several state-of-the-art threshold estimators in a wide range of settings without much down side”. This attracted my attention, given my earlier work on James-Stein estimators, and I took a quick look at the paper to see what new aspects could be uncovered about 50 years after James and Stein’s original paper. The setting is the original normal mean estimation problem under squared error loss, and the GMLEB estimate is built upon the non-parametric maximum likelihood estimate of the mixing distribution G,

\widehat{G}_n = \arg\max_{G\in\mathcal{G}} \prod_{i=1}^n \int \varphi(x_i - u)\, G(\mathrm{d}u)

where \varphi denotes the standard normal density, as the (empirical) Bayes estimator associated with \widehat G_n. The domination advertised in the abstract seems to relate to an integrated squared error loss under an unknown G, which therefore does not clash with the robust minimaxity of the original James-Stein estimator… Anyway, if you are interested and in Paris next Thursday, Dec. 3, the discussion runs from 3pm to 4:30pm at ENSAE, Salle S8.
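For intuition, here is a minimal Python sketch of the two steps above: a grid-based EM approximation to the non-parametric MLE \widehat G_n, followed by the plug-in posterior means under the fitted mixing distribution. The grid support, the number of EM iterations, and the function names are all illustrative assumptions on my part; Jiang and Zhang's own computation may well differ.

```python
import numpy as np

def npmle_em(x, n_grid=200, n_iter=500):
    """Grid-based approximation to the NPMLE of the mixing distribution G,
    fitted by EM over the mixture weights (an illustrative stand-in)."""
    # Fixed support points for G spanning the observed range
    u = np.linspace(x.min(), x.max(), n_grid)
    w = np.full(n_grid, 1.0 / n_grid)  # initial mixing weights
    # phi(x_i - u_j): standard normal density, an (n, n_grid) matrix
    lik = np.exp(-0.5 * (x[:, None] - u[None, :]) ** 2) / np.sqrt(2 * np.pi)
    for _ in range(n_iter):
        # E-step: responsibility of support point u_j for observation x_i
        post = lik * w
        post /= post.sum(axis=1, keepdims=True)
        # M-step: weights become the average responsibilities
        w = post.mean(axis=0)
    return u, w

def gmleb_estimate(x, u, w):
    """Empirical Bayes posterior means of the normal means under \hat G_n."""
    lik = np.exp(-0.5 * (x[:, None] - u[None, :]) ** 2) / np.sqrt(2 * np.pi)
    num = lik @ (w * u)  # integral of u * phi(x_i - u) dG(u)
    den = lik @ w        # marginal density of x_i under the mixture
    return num / den

# Toy example: a sparse vector of normal means
rng = np.random.default_rng(0)
theta = np.concatenate([np.zeros(180), rng.normal(3.0, 1.0, 20)])
x = theta + rng.standard_normal(200)
u, w = npmle_em(x)
theta_hat = gmleb_estimate(x, u, w)
print(np.mean((theta_hat - theta) ** 2))  # empirical squared error
```

Once G is restricted to the grid, the infinite-dimensional maximisation over \mathcal{G} reduces to optimising the mixture weights, which is exactly what the EM iterations do.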
