This (early) summer, a conference on missing data will be organised in Rennes, Brittany, with the support of the French Statistical Society [SFdS]. (Check the website if interested; Rennes is a mere two hours from Paris by fast train.)
The next Nordic-Baltic Biometric Conference will take place in Reykjavik next June, a few days after the O-Bayes 15 meeting in València. I will attend the conference as the organisers were kind enough to invite me to give a talk, with high hopes of taking a few days off to go hiking day and night! Registration is now open, as is the call for abstracts.
So today was the NIPS 2014 workshop, “ABC in Montréal“, which started with a fantastic talk by Juliane Liepe on some exciting applications of ABC to the migration of immune cells, with the analysis of movies showing those cells acting to heal a damaged fly wing or a cut fish tail. Quite amazing videos, really. (With the great opening line of ‘We have all cut a finger at some point in our lives’!) The statistical model behind those movies was a random walk on a grid, with different drift and bias features serving as model characteristics. Frank Wood managed to deliver his talk despite a severe case of food poisoning, with a great illustration of probabilistic programming that made me understand (at last!) the very idea of probabilistic programming. And Vikash Mansinghka presented some applications in image analysis. Those two talks led me to realise why probabilistic programming is so close to ABC, with a programming touch! Hence why I was invited to talk today! Then Dennis Prangle exposed his latest version of lazy ABC, which I have already commented on the ‘Og and which is somewhat connected with our delayed acceptance algorithm, to the point that maybe something common can stem out of the two notions. Michael Blum ended the day with provocative answers to the provocative questions of Ted Meeds as to whether or not machine learning needed ABC (Ans. No!) and whether or not machine learning could help ABC (Ans. ???), with a happy mix-up between mechanistic and phenomenological models that helped generate discussion from the floor.
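For the curious, the kind of grid model mentioned above is easy to sketch in a few lines. This is a minimal toy simulator under my own assumptions — the `p_right` and `p_up` drift parameters are purely illustrative and not those of the talk:

```python
import random

def biased_walk(steps, p_right=0.6, p_up=0.5):
    """Toy biased random walk on a 2-D grid: at each step the walker moves
    one cell in each direction, with a drift controlled by p_right / p_up.
    (A toy sketch of this class of models, not the talk's actual model.)"""
    x = y = 0
    path = [(x, y)]
    for _ in range(steps):
        x += 1 if random.random() < p_right else -1
        y += 1 if random.random() < p_up else -1
        path.append((x, y))
    return path
```

Inferring drift parameters like `p_right` from observed trajectories by simulation is precisely the kind of setting where ABC applies, since the likelihood of a full movie of cell positions is intractable.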
The posters were also of much interest, with calibration as a distance measure by Michael Gutmann, in continuation of the poster he gave at MCMski, and Aaron Smith presenting his work with Luke Bornn, Natesh Pillai, and Dawn Woodard on why a single pseudo-sample is enough for ABC efficiency. This gave me the opportunity to discuss with him the apparent contradiction with the result of Krzysztof Łatuszyński and Anthony Lee that geometric convergence of ABC-MCMC is only attained with a random number of pseudo-samples… And to wonder if there is a geometric-versus-binomial dilemma in this setting, namely whether or not simulating pseudo-samples until one is accepted would be more efficient than just running one and discarding it in case it is too far. So, although the audience was not that large (when compared with the other “ABC in…” meetings and when considering the 2500+ attendees at NIPS over the week!), it was a great day where I learned a lot, did not doze off during talks (!), [and even had an epiphany of sorts on the treadmill when I realised I just had to take longer steps to reach 16km/h without hyperventilating!] So thanks to my fellow organisers, Neil D. Lawrence, Ted Meeds, Max Welling, and Richard Wilkinson for setting the program of that day! And, by the way, where’s the next “ABC in…”?! (Finland, maybe?)
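The geometric-versus-binomial question can be made concrete with a toy comparison of the two acceptance strategies. This is a sketch under my own toy Gaussian pseudo-data generator, not the actual setting of the papers discussed above:

```python
import random

def simulate(theta):
    # toy pseudo-data: one Gaussian draw centred at theta
    return random.gauss(theta, 1.0)

def one_shot_accept(theta, y_obs, eps):
    """Binomial strategy: draw a single pseudo-sample and accept or
    discard the proposal according to whether it falls within eps."""
    return abs(simulate(theta) - y_obs) <= eps

def until_accept_cost(theta, y_obs, eps, max_tries=10_000):
    """Geometric strategy: simulate pseudo-samples until one is accepted,
    returning the (geometrically distributed) number of simulations used."""
    for n in range(1, max_tries + 1):
        if abs(simulate(theta) - y_obs) <= eps:
            return n
    return max_tries  # cap, in case acceptance is extremely unlikely
```

With acceptance probability p, the one-shot strategy accepts a binomial fraction p of proposals at one simulation each, while the repeat-until strategy spends a geometric 1/p simulations per acceptance — so both cost about 1/p simulations per accepted value on average, and any efficiency gap between them would have to come from the mixing of the resulting chains, which is exactly the open question above.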
Here are the slides of my upcoming discussion of Ron Gallant’s paper, to be given tomorrow.
Great poster session last night and at lunch today. Saw an ABC poster (by Dennis Prangle, following our random forest paper) and several MCMC posters (by Marco Banterle, who actually won one of the speed-meeting mini-project awards!, Michael Betancourt, Anne-Marie Lyne, and Murray Pollock), and then a rather different poster on Mondrian forests, which generalise random forests to sequential data (by Balaji Lakshminarayanan). The talks all had interesting aspects or glimpses about big data and some of the unnecessary hype around it (them?!), along with exposing the nefarious ambition of Amazon to become the Earth’s only seller!, but I particularly enjoyed the astronomy afternoon and even more particularly Steve Roberts’ sweep through astronomy machine learning. Steve characterised variational Bayes as picking your choice of sufficient statistics, which made me wonder why there were no stronger connections between variational Bayes and ABC. He also quoted the book The Fourth Paradigm: Data-Intensive Scientific Discovery by Tony Hey as putting forward interesting notions. (A book review for the next vacations?!) And also mentioned zooniverse, a citizen science website I was not aware of, with a Bayesian analysis of the learning curves of those annotating citizens (in the case of supernova classification). Big deal, indeed!!!
We received the good news from the Program Committee of the next ISBA World meeting in Cancún, Quintana Roo, México, that our proposal of a short course on ABC methods was accepted. So, along with Jean-Michel Marin, I hope to introduce ABC to a large group of interested participants on either July 13 or the morning of July 14. Here is the abstract for the short course:
ABC methods appeared in 1999 to solve complex genetics problems where the likelihood of the model was impossible to compute. They are now a standard tool in the statistical genetics community but have also addressed many other problems where likelihood computation is an issue, including dynamic models in signal processing and financial data analysis. However, these methods suffer to some degree from calibration difficulties that make them rather volatile in their implementation and thus render them suspicious to users of more traditional Monte Carlo methods. Nonetheless, ABC techniques have several claims to validity: first, they are connected with econometric methods like indirect inference. Second, they can be expressed in terms of various non-parametric estimators of the likelihood or of the posterior density and follow standard convergence patterns. Lastly, they appear as regular Bayesian inference over noisy data. The lectures cover those validation steps but also detail the different implementations of ABC algorithms and the calibration of their parameters. The second part of the course illustrates those issues in the special case of the coalescent model used in population genetics, where many of the early advances of ABC were first implemented.
and the complete description is available on the ISBA website.
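For readers new to the topic, the basic ABC idea described in the abstract condenses into a minimal rejection sampler. The following is a toy sketch with a Gaussian model, a uniform prior, and the sample mean as summary statistic — all illustrative choices of mine, not the coalescent setting of the course:

```python
import random

def abc_rejection(y_obs, prior_sample, simulate, distance, eps, n_prop=10_000):
    """Minimal ABC rejection sampler: keep the proposed parameters whose
    simulated pseudo-data fall within eps of the observed summary."""
    accepted = []
    for _ in range(n_prop):
        theta = prior_sample()              # 1. draw from the prior
        y_sim = simulate(theta)             # 2. simulate pseudo-data
        if distance(y_sim, y_obs) <= eps:   # 3. accept if close enough
            accepted.append(theta)
    return accepted

# Toy run: normal mean, uniform prior, sample mean of 20 draws as summary
random.seed(42)
post = abc_rejection(
    y_obs=1.2,
    prior_sample=lambda: random.uniform(-5, 5),
    simulate=lambda t: sum(random.gauss(t, 1) for _ in range(20)) / 20,
    distance=lambda a, b: abs(a - b),
    eps=0.1,
)
```

The accepted values approximate draws from the posterior of the mean, and shrinking `eps` trades acceptance rate against approximation quality — the calibration difficulty the abstract alludes to.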
In addition, the special sessions proposed by the BayesComp and O’Bayes ISBA sections have been accepted too. (As well as sessions for most of the other ISBA sections.) Thanks to Raquel Prado and to the whole scientific committee for working hard towards another successful ISBA World meeting! Note that early bird conference registration opens today, so make sure to book your seat for Cancún!