Archive for the pictures Category

BAYSM’18 [British summer conference series]

Posted in Kids, pictures, Statistics, Travel, University life on January 22, 2018 by xi'an

only in India… [jatp]

Posted in pictures, Travel on January 21, 2018 by xi'an

Last time I visited India, I highlighted an advertisement for a machine-learning match-making system. This time, while eating a last aloo paratha for breakfast in Kolkata, I noticed ads on the back of my Air India boarding passes for Hindu temples that are actually run by the State of Gujarat rather than by the temples themselves. (Gujarat is India's westernmost state.)

Rosso di Montalcino

Posted in pictures, Travel, Wines on January 20, 2018 by xi'an

The Norse Farce [cuppas]

Posted in Kids, pictures, University life on January 20, 2018 by xi'an

Today, I received the Norse Farce cups I had designed with the help of Thomas! While they are easy enough to replicate on sites like Vistaprint, I have a few left in case some of the Og's readers are interested!

Bayesian workers, unite!

Posted in Books, Kids, pictures, Statistics, University life on January 19, 2018 by xi'an

This afternoon, Alexander Ly is defending his PhD thesis at the University of Amsterdam. While I cannot attend, I want to celebrate the event and a remarkable thesis around the Bayes factor [even though we disagree on its role!] and the Jeffreys's Amazing Statistics Program (!), otherwise known as JASP. Plus commend the coolest thesis cover I have ever seen, made by the JASP graphical designer Viktor Beekman and representing Harold Jeffreys leading empirical science workers in the best tradition of socialist realism! Alexander wrote a post on the JASP blog to describe the thesis, the cover, and his plans for the future. Congratulations!

a summer of British conferences!

Posted in pictures, Statistics, Travel, University life on January 18, 2018 by xi'an

best unbiased estimators

Posted in Books, Kids, pictures, Statistics, University life on January 18, 2018 by xi'an

A question that came up on X validated today kept me busy for most of the day! It relates to an earlier question on the best unbiased nature of a maximum likelihood estimator, to which I pointed out the simple case of the Normal variance, where the estimator is biased (but improves upon the mean square error). Here, the question is whether or not the maximum likelihood estimator of a location parameter, once corrected for its bias, is the best unbiased estimator (in the sense of minimal variance). The question is quite interesting in that it links to the mathematical statistics of the 1950s, of Charles Stein, Erich Lehmann, Henry Scheffé, and Debabrata Basu. For instance, if there exists a complete sufficient statistic for the problem, then there exists a best unbiased estimator of the location parameter, by virtue of the Lehmann-Scheffé theorem (it is also a consequence of Basu's theorem). And the existence is pretty limited in that, outside the two exponential families with a location parameter, no other distribution meets this condition, I believe. However, even when there is no complete sufficient statistic, best unbiased estimators may still exist, as shown by Bondesson. But Lehmann and Scheffé, in their magisterial 1950 Sankhyā paper, exhibit a counter-example, namely the U(θ-1,θ+1) distribution, with density

f(x|θ) = ½ 𝕀(θ-1 ≤ x ≤ θ+1),

since no non-constant function of θ allows for a best unbiased estimator.
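The claim above about the Normal variance is easy to check by simulation: the maximum likelihood estimator divides the sum of squares by n and is biased downward, yet it beats the unbiased (divide-by-n-1) version in mean square error. A minimal Monte Carlo sketch (not from the post, sample sizes and seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps, sigma2 = 10, 100_000, 1.0

# reps independent Normal samples of size n with variance sigma2
x = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))

# MLE of the variance: divide by n (ddof=0), biased downward
s2_mle = np.var(x, axis=1)
# unbiased estimator: divide by n-1 (ddof=1)
s2_unb = np.var(x, axis=1, ddof=1)

def mse(est):
    return np.mean((est - sigma2) ** 2)

print("bias MLE      :", np.mean(s2_mle) - sigma2)  # negative
print("bias unbiased :", np.mean(s2_unb) - sigma2)  # near zero
print("MSE  MLE      :", mse(s2_mle))               # smaller
print("MSE  unbiased :", mse(s2_unb))
```

(For the Normal, the divisor minimising the mean square error is in fact n+1, so neither of the two above is MSE-optimal.)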

Looking in particular at the location parameter of a Cauchy distribution, I realised that the Pitman best equivariant estimator is unbiased as well [for all location problems] and hence dominates the (equivariant) maximum likelihood estimator, which is unbiased in this symmetric case. However, as detailed in a nice paper of Gabriela Freue on this problem, I further discovered that there is no uniformly minimum variance estimator and no uniformly minimum variance unbiased estimator! (And that the Pitman estimator enjoys a closed-form expression, as opposed to the maximum likelihood estimator.) This sounds a bit paradoxical but simply means that there exist different unbiased estimators whose variance functions are not ordered, and hence not comparable, either between them or with the variance of the Pitman estimator.
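Under squared error loss, the Pitman estimator of a location parameter is the posterior mean under a flat prior, so it can be approximated numerically even without Freue's closed form. A hedged sketch for a standard Cauchy sample (the grid half-width and size are arbitrary choices; the n^{-2n} likelihood tails make truncation harmless):

```python
import numpy as np

def cauchy_lik(theta, x):
    # likelihood of each grid value theta: product of standard
    # Cauchy densities evaluated at x_i - theta
    return np.prod(1.0 / (np.pi * (1.0 + (x[None, :] - theta[:, None]) ** 2)),
                   axis=1)

def pitman(x, half_width=50.0, m=20001):
    # Pitman (minimum-risk equivariant) estimator under squared error:
    # posterior mean of theta under a flat prior, via grid integration
    grid = np.linspace(np.median(x) - half_width, np.median(x) + half_width, m)
    w = cauchy_lik(grid, x)
    return np.sum(grid * w) / np.sum(w)

rng = np.random.default_rng(1)
theta0 = 2.0
x = theta0 + rng.standard_cauchy(7)
print("Pitman estimate:", pitman(x))
```

A quick sanity check of equivariance: shifting the sample by a constant shifts the estimate by (essentially) the same constant, up to grid discretisation.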