Archive for Cornell University

admissible estimators that are not Bayes

Posted in Statistics on December 30, 2017 by xi'an

A question that popped up on X validated made me search for a while for point estimators that are both admissible (under a certain loss function) and not generalised Bayes (under the same loss function), before asking Larry Brown, Jim Berger, or Ed George. The answer came through Larry’s book on exponential families, with the two examples attached. (Thanks to our 1989 collaboration with Roger Farrell at Cornell U, I already knew about the existence of testing procedures that are both admissible and not Bayes.) The most surprising feature is that the associated loss function is strictly convex, as I would have thought that a less convex loss would have made it easier to find such counter-examples.
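(For readers who have not met the terminology, here is a minimal reminder of the standard definitions, not a quote from Larry’s book: write $L(\theta,d)$ for the loss and $R(\theta,\delta)=\mathbb{E}_\theta[L(\theta,\delta(X))]$ for the risk of an estimator $\delta$. Then $\delta$ is admissible if no competitor dominates it,

$$\nexists\,\delta^\prime:\quad R(\theta,\delta^\prime)\le R(\theta,\delta)\ \text{ for all }\theta\quad\text{and}\quad R(\theta_0,\delta^\prime)<R(\theta_0,\delta)\ \text{ for some }\theta_0,$$

and it is a generalised Bayes estimator if, for some possibly improper prior measure $\pi$, it minimises the integrated posterior loss,

$$\delta^\pi(x)\in\arg\min_d \int L(\theta,d)\,f(x\mid\theta)\,\pi(\mathrm{d}\theta).$$

The counter-examples in Larry’s book exhibit admissible estimators that are of no such form for any $\pi$, even though the loss is strictly convex.)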

machine learning and the future of realism

Posted in Books, Kids, Statistics, University life on May 4, 2017 by xi'an

Giles and Cliff Hooker arXived a paper last week with this intriguing title. (Giles Hooker is an associate professor of statistics and biology at Cornell U, with an interesting blog on the notion of models, while Cliff Hooker is a professor of philosophy at Newcastle U, Australia.)

“Our conclusion is that simplicity is too complex”

The debate in this short paper is whether or not machine learning relates to a model. Or is it concerned with sheer (“naked”) prediction? And then does it pertain to science any longer?! While the distinction sounds obvious at first, explaining why science is more than the prediction of effects given causes is much less so, even though prediction sounds more pragmatic and engineer-like than scientific. (Furthermore, prediction has a somewhat negative flavour in French, being used as a synonym for divination and opposed to prévision.) In more philosophical terms, prediction offers no ontological feature. As for a machine learning structure like a neural network being scientific or a-scientific, its black-box nature makes it much more the latter than the former, in that it brings no explanation for the connection between input and output, between regressand and regressors. It further lacks the potential for universality of scientific models. For instance, as mentioned in the paper, Newton’s law of gravitation applies to any pair of bodies with mass, while a neural network built on a series of observations could not be assessed or guaranteed outside the domain where those observations were taken. Plus, it would miss the simple inverse-square law established by Newton. Most fascinating questions, undoubtedly, putting the stress on models from a totally different perspective from last week at the RSS.
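(For reference, the simple square law alluded to is Newton’s inverse-square law of universal gravitation: any two bodies of masses $m_1$ and $m_2$ a distance $r$ apart attract each other with force

$$F=G\,\frac{m_1 m_2}{r^{2}},$$

where $G$ is the gravitational constant, an explicit functional form that a black-box fit to a finite set of observations would not produce by itself.)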

As for machine learning being a challenge to realism, I am none the wiser after reading the paper. Utilising machine learning tools to produce predictions of causes given effects does not seem to modify the structure of the World, and only marginally our understanding of it, since these tools bring no explanation per se. What would lead to anti-realism is the adoption of such tools as substitutes for scientific theories and models.

Cayuga 1989

Posted in Kids, Travel, Wines on August 20, 2015 by xi'an


hasta luego, Susie!

Posted in Statistics, University life on August 20, 2014 by xi'an

I just heard that our dear, dear friend Susie Bayarri passed away early this morning, on August 19, in València, Spain… I had known Susie for many, many years, our first meeting being in Purdue in 1987, and we shared many, many great times during simultaneous visits to Purdue University and Cornell University in the 1990s. During a workshop at Cornell organised by George Casella (to become the unforgettable Camp Casella!), we shared a flat together and our common breakfasts led her to make fun of my abnormal consumption of cereals forever after, a recurrent joke each time we met! Another time, we were coming back from the movie theatre in Lafayette in Susie’s car when we got stopped for going through a red light. Although she tried very hard, her humour and Spanish verve were for once insufficient to convince her interlocutor.

Susie was a great Bayesian, contributing to the foundations of Bayesian testing in her numerous papers and through the direction of deep PhD theses in Valencia, as well as to queueing systems and computer models. She was also incredibly active in ISBA, from the very start of the society, and was one of its first presidents. She also contributed to the Objective Bayes section of ISBA, especially in the construction of the O’Bayes meetings. She gave a great tutorial on Bayes factors at the last O’Bayes conference at Duke last December, full of jokes and passion, despite being already weakened by her cancer…

So, hasta luego, Susie, from all your friends. I know we shared the same attitude about our Catholic education and our first names heavily laden with religious meaning, but I’d still like to believe that your rich and contagious laugh now resonates throughout the cosmos. So, hasta luego, Susie, and un abrazo to all of us missing her.

ski with deviation

Posted in Kids, Mountains, pictures, Travel on March 29, 2014 by xi'an

I just learned that a micro-brew brand of handmade skis has connections with statistics and, who knows, could become a sponsor of the next MCMSki… Indeed, the brand is called deviation (as in standard deviation), is located in Gresham, Oregon, and sells locally made skis and snowboards with names like The Moment Generator or The Mode! The logo clearly indicates a statistical connection.

As it happens, two of the founding partners of deviation, Tim and Peter Wells, are the sons of my long-time friend Marty Wells from Cornell University. When I first met them, they were great kids, young enough to give no inkling they would end up producing beautiful hardwood core skis in a suburb of Portland, Oregon!!! Best wishes to them and to deviation, the most statistical of all ski brands! (Here is a report in The Oregonian that tells the story of how deviation was created.)
