Bayes 250

While I left Paris under a thunderstorm, the weather in London was warm and sunny, and I enjoyed a nice walk to the RSS, with a Betsey Trotwood pub on the way that obviously delighted the David Copperfield fan in me! The Bayes 250 meeting started with the videoed interview of Dennis Lindley conducted by (and thanks to) Tony O’Hagan in his Devonshire home. I hope the video gets online soon, as it is remarkable in rendering Dennis’ views on Bayesian statistics, full of humour and unremitting in his defence of the Bayesian approach. (And as I missed a few points due to an imperfect sound system.) “Coherence is all” could best summarise this interview. And the sincere regret that Bayesianism has not taken over…

The talks started with Gareth Roberts explaining why MCMC remains possible in infinite dimensions despite the curse of dimensionality. (Starting his talk with a “Rev. Bayes meets Newton, Markov and Metropolis” slideshow.) Then, after a lunch break where some participants eloped to Bayes’ tomb next door (!), Sylvia Richardson presented a broad vision of Bayesian biostatistics, answering in my opinion some of Dennis’ worries that Bayes had not taken off widely enough (my rephrasing). Dennis Prangle chose to give an overview of ABC, echoing my perspective that it is more of a new kind of inference with Bayesian justifications than a mere computational tool. Michael Jordan talked about Kingman’s paintbox (in relation to Tamara Broderick’s talk I had enjoyed so much in Kyoto) before rushing back to Paris, Phil Dawid gave a somewhat a-Bayesian talk about the frequentist (in)validation of predictors, in connection with his calibration talk in Padova a few months ago, Iain Murray explained his NADE modelling tool, mixing neural nets with mixtures, and Yee Whye Teh concluded the talks of the day with a presentation of his Gibbs sampler for jump processes that I found most interesting (I later realised this was a paper I had missed at Bayes 250 in Edinburgh by leaving early!). The day ended with a few posters, including one by Maria Lomelli Garcia and Yee Whye Teh on alpha-stable processes that provided a new auxiliary variable representation of clear appeal. (The day actually ended for good with a light and enjoyable dinner in this most improbable Renaissance Hotel that literally stands at the end of the tracks of St Pancras…)
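For readers less familiar with ABC, here is a minimal rejection-ABC sketch, not from any of the talks: a toy normal model where the likelihood is only accessed through simulation. The prior, the tolerance eps, the summary statistic (the sample mean) and the distance are all illustrative choices of mine, which is precisely why the output is an approximate posterior of its own rather than a mere computational shortcut.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: 50 observations from N(2, 1); pretend the likelihood is intractable
# and can only be accessed by simulating pseudo-data from the model.
y_obs = rng.normal(2.0, 1.0, size=50)
s_obs = y_obs.mean()                                   # chosen summary statistic

def abc_rejection(n_draws=100_000, eps=0.05):
    """Rejection ABC: keep prior draws whose simulated summary lands within eps of the observed one."""
    theta = rng.normal(0.0, 5.0, size=n_draws)         # draws from a N(0, 5^2) prior
    pseudo = rng.normal(theta[:, None], 1.0, size=(n_draws, y_obs.size))
    dist = np.abs(pseudo.mean(axis=1) - s_obs)         # distance between summaries
    return theta[dist < eps]                           # accepted draws ~ ABC posterior

post = abc_rejection()
print(post.size, post.mean(), post.std())
```

The accepted draws target the posterior of theta given that the simulated summary falls within eps of the observed summary, not the exact posterior, which is the sense in which ABC defines its own inferential framework.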

The second day was just as rich: after [a run in Regent’s Park and] a welcome from the current RSS president (John Pullinger, who happens to live in Tunbridge Wells, of all places!), Michael Goldstein gave a spirited defence of Bayesian statistics as a projection device (putting expectation ahead of probability, as in de Finetti and Hartigan), Andrew Golightly discussed particle filter approximations based on discretised diffusions, fighting degeneracy via bridging, and Nicky Best managed to give three talks in one (!) around Bayesian epidemiology, beginning with a “Rev. Bayes meets Dr. Snow” (who started spatial epidemiology with his famous cholera map). Then Christophe Andrieu presented what were, for me, new and exciting results, showing via Peskun and convex orderings that averaging more unbiased estimates of the likelihood function improves, both theoretically and practically, the performance of the associated exact-approximation MCMC algorithm. This was followed by Ben Calderhead, who summarised his recently arXived paper with Mark Girolami and co-authors on using Bayesian analysis to evaluate the uncertainty associated with the numerical resolution of differential equations, connecting with the older paper by Persi Diaconis on the topic (a paper I remember discussing with George Casella in an Ithaca café while we were waiting for his car to be fixed…). I wonder whether the approach could be used to handle the constant estimation paradox raised by Larry Wasserman (and discussed on the ‘Og as well)…

Under the title of “the misspecified Bayesian”, Stephen Walker sketched ongoing work with Chris Holmes, work that resonated deeply with some of my current musings about the nature of Bayesian inference on intractable problems, hence giving me new prospects on ABC validation and extension. More precisely, he showed us a way to handle problems where only some aspect of the model is of interest and where a pseudo-model that (asymptotically) manages this aspect can be found. The paper should soon be arXived and I will certainly discuss it at more length then! Simon Wilson did a “Rev. Bayes meets Dr. Linnaeus” introduction and talked about estimating the number of new discoveries of (unknown) species, a problem that I find fascinating even though I find the current solutions, based on an essentially hypergeometric model, somewhat oversimplifying. Chris Yau introduced us to his current work on cancer analysis and to his way of managing the complexity of the mutation process by hierarchical models, and Peter Green ended the presentations with a survey (or survol) of his work on inference for decomposable graphs, with online exhibits.
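To make the exact-approximation (pseudo-marginal) mechanism behind Christophe Andrieu’s results a bit more concrete, here is a toy sketch of mine, not taken from the talk: the true likelihood is replaced in the Metropolis-Hastings ratio by a non-negative unbiased estimate (here the exact likelihood corrupted by unit-mean log-normal noise, a made-up stand-in), the chain still targets the exact posterior, and averaging more unbiased estimates lowers the estimator variance, which in this toy typically shows up as a higher acceptance rate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy posterior: one observation y ~ N(theta, 1), prior theta ~ N(0, 10).
y = 1.5

def log_prior(theta):
    return -0.5 * theta**2 / 10.0

def unbiased_lik_estimate(theta, n_estimates, noise_sd=1.0):
    """Average of n_estimates non-negative unbiased estimates of the likelihood."""
    exact = np.exp(-0.5 * (y - theta)**2)                 # true likelihood (hidden from the sampler)
    w = rng.lognormal(-0.5 * noise_sd**2, noise_sd, size=n_estimates)  # E[w] = 1
    return exact * w.mean()

def pseudo_marginal_mh(n_iter=5000, n_estimates=1, step=1.0):
    theta = 0.0
    lik_hat = unbiased_lik_estimate(theta, n_estimates)
    chain = np.empty(n_iter)
    for t in range(n_iter):
        prop = theta + step * rng.normal()
        lik_hat_prop = unbiased_lik_estimate(prop, n_estimates)
        log_alpha = (np.log(lik_hat_prop) + log_prior(prop)
                     - np.log(lik_hat) - log_prior(theta))
        if np.log(rng.uniform()) < log_alpha:
            theta, lik_hat = prop, lik_hat_prop           # keep the estimate attached to the state
        chain[t] = theta
    return chain

# Acceptance rate as a rough proxy for mixing, for increasingly averaged estimates.
for n in (1, 10, 100):
    acc = np.mean(np.diff(pseudo_marginal_mh(n_estimates=n)) != 0)
    print(n, round(acc, 3))
```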

The meeting concluded with Adrian Smith giving a personal reminiscence of the (poor) state of Bayesian statistics in the ’60s and ’70s, paying tribute to his advisor Dennis Lindley for keeping the faith against strong opposition and for ensuring the survival of the field into the next generation. (And linking once again with John Kingman.) As hopefully shown by my summary, the field is definitely alive nowadays and has accomplished much by overcoming the computational hurdles. (As shown further by our incoming Statistical Science vignettes, there are many cases where Bayesian analysis looks like the only available answer.) However, the new challenges raised by Big Data may well jeopardise this revival of a 250-year-old principle by pushing towards quick-and-dirty (and less principled) inference techniques. What really made this meeting so successful, in my opinion, is that a lot of the talks we heard in Errol Street over those two days presented progress towards handling these new challenges. Hence, there still is hope for Bayesian techniques in the coming century!

2 Responses to “Bayes 250”

  1. Gianluca Baio Says:

    Well — perfect summary! Well done for the organisation!

