ISBA 2012 [#3]

A third and again very intense day at ISBA 2012: as Steve Scott said, “we are getting Bayes-ed out”… It started for me with Robert Kohn’s particle filter session, where Julien Cornebise gave us programming recommendations to improve our code, performance, and the overall impact of our research, passionately pleading for an object-oriented approach that would make everything we program much more portable. Scott Sisson presented a new approach to density estimation for ABC purposes, using first a marginal estimation for each component of the statistic vector, then a normal mixture copula on the normal transforms of the inverse cdfs, and Robert concluded with an extension of PMCMC to eliminate nuisance parameters by importance sampling, a topic we will discuss again when I visit Sydney in two weeks. The second session of the morning was ABC II, where David Nott spoke about the combination of ABC with Bayes linear tools, a paper Scott had presented in Banff last spring, Michael Blum summarised the survey on the selection of summary statistics discussed earlier on the ‘Og, Jean-Michel spoke about our (recently accepted) LDA paper, acknowledging our initial (2005) misgivings about ABC (!), and Olie Ratmann concluded the session with a fairly exciting new notion of using a testing perspective to define acceptable draws. While I clearly enjoyed the amount of “ABC talks” during this meeting, several attendees mentioned to me that it was a bit overwhelming… Well, my impression is that this conveyed high and loud the message that ABC is now truly part of the Bayesian toolbox, and that further theoretical exploration would be most welcome.

The afternoon saw another session I was involved in organising, along with Marc Suchard, on parallel computing for Bayesian calculations. Marc motivated the use of GPUs for a huge medical dataset, showing impressive gains in time for a MAP calculation, with promises of a more complete Bayesian processing. Steve Scott gave the distributed computing version of the session, with Google’s requirements for a huge and superfast logistic regression, Jarad Niemi went into the (highly relevant!) details of random number generation on GPUs, and Kenichiro McAlinn described an application to portfolio selection using GPUs. (The topic attracted a huge crowd and the room was packed!) I am sorry the parallel session on Bayesian success stories was taking place at the same time, as it related very much to our on-going project with Kerrie Mengersen (we are currently waiting for the returns from selected authors). Then it was time for a bit of joint work, along with a succulent matcha ice-cream in Kyoto station, and another fairly exhausting if quality poster session.

I am sorry to miss the sessions on Friday (and got “flak” from Arnaud for missing his lecture!) as these were promising as well. (Again, anyone for a guest post?!) Overall, I come home exhausted but richer for the exchanges and all I learned from a very good and efficient meeting. Not to mention this first experience of Japan. (Written from Kansai Osaka airport on a local machine.)
