GMLEB outperforms James-Stein?

At tomorrow's monthly meeting of the Apprentissage et Sparsité group run by Sacha Tsybakov at CREST, Ismael Castillo will discuss the recent paper General maximum likelihood empirical Bayes estimation of normal means by Wenhua Jiang and Cun-Hui Zhang, just published in the Annals of Statistics (37(4), 1647-1684). (The paper is available on arXiv.) An interesting sentence from the abstract is that “the GMLEB outperforms the James-Stein and several state-of-the-art threshold estimators in a wide range of settings without much downside”. This attracted my attention, given my earlier work on James-Stein estimators, and I took a quick look at the paper to see what new aspects could be uncovered some fifty years after James and Stein's original paper. The setting is the original normal means estimation problem under squared error loss, and the GMLEB estimate is based on the nonparametric maximum likelihood estimate of the mixing distribution G,

\widehat G_n = \arg\max_{G\in\mathcal{G}} \prod_{i=1}^n \int \varphi(x_i-u)\, G(\mathrm{d}u),

in that it is the (empirical) Bayes estimator associated with \widehat G_n. The domination advertised in the abstract seems to relate to an integrated squared error loss under an unknown G, and thus does not clash with the robust minimaxity of the original James-Stein estimator… Anyway, if you are interested and in Paris next Thursday, Dec. 3, the discussion runs from 3pm to 4:30pm at ENSAE, Salle S8.
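For readers who want to play with the estimator, here is a minimal Python sketch of the idea: the maximisation over \mathcal{G} is approximated by restricting G to a fixed grid of support points and running an EM algorithm on the mixing weights, then plugging \widehat G_n into the posterior mean. This is only an illustration under my own assumptions (the grid, the iteration count, and the name gmleb are mine), not the actual algorithm of Jiang and Zhang.

import numpy as np
from scipy.stats import norm

def gmleb(x, grid_size=200, n_iter=500):
    """NPMLE of the mixing distribution G on a fixed grid (via EM),
    followed by the plug-in posterior mean for each observation."""
    x = np.asarray(x, dtype=float)
    u = np.linspace(x.min(), x.max(), grid_size)   # support points for G
    L = norm.pdf(x[:, None] - u[None, :])          # phi(x_i - u_j)
    w = np.full(grid_size, 1.0 / grid_size)        # initial mixing weights
    for _ in range(n_iter):
        post = L * w                               # E-step: responsibilities
        post /= post.sum(axis=1, keepdims=True)
        w = post.mean(axis=0)                      # M-step: update weights of G
    post = L * w                                   # posterior under G-hat
    post /= post.sum(axis=1, keepdims=True)
    return post @ u                                # E[u | x_i], the EB estimate

# toy sparse-means example: 50 signals, 450 zero means
rng = np.random.default_rng(0)
theta = np.concatenate([rng.normal(3.0, 1.0, 50), np.zeros(450)])
x = theta + rng.standard_normal(500)
print(np.mean((gmleb(x) - theta) ** 2))           # empirical squared error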
