## Archive for ABC

## talk in Linz [first slide]

Posted in Mountains, pictures, Running, University life with tags ABC, Austria, Boston, IFAS, JKU, JSM, Linz, model choice, Pöstlingberg, talk on September 17, 2014 by xi'an

## ABC@NIPS: call for papers

Posted in Statistics, Travel, University life with tags ABC, BayesComp, Canada, ISBA@NIPS, likelihood-free methods, machine learning, Montréal, NIPS 2014, Québec, simulation on September 9, 2014 by xi'an

*In connection with the previous announcement of ABC in Montréal, a call for papers that came out today:*

NIPS 2014 Workshop: ABC in Montreal

**December 12, 2014**

**Montréal, Québec, Canada**

Approximate Bayesian computation (ABC) or likelihood-free (LF) methods have developed largely under the radar of the machine learning community, but are important tools for a large segment of the scientific community. This is particularly true for systems and population biology, computational psychology, computational chemistry, etc. Recent work has both applied machine learning models and algorithms to general ABC inference (neural networks, random forests, Gaussian processes) and applied ABC inference to machine learning (e.g. using ABC with computer graphics to solve computer vision problems). In general, however, there is significant room for collaboration between the two communities.
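To make the "likelihood-free" idea concrete for readers outside the ABC community, the basic rejection form of ABC can be sketched as follows (a toy example of my own, not taken from the call: a Gaussian mean is inferred by comparing simulated and observed summaries, never evaluating a likelihood):

```python
import numpy as np

rng = np.random.default_rng(0)

def simulator(theta, n=100):
    # the generative model: data from N(theta, 1)
    return rng.normal(theta, 1.0, size=n)

def summary(x):
    # a single summary statistic of the data
    return x.mean()

def abc_rejection(observed, n_draws=10_000, eps=0.1):
    s_obs = summary(observed)
    accepted = []
    for _ in range(n_draws):
        theta = rng.uniform(-5, 5)           # draw from the prior
        s_sim = summary(simulator(theta))    # simulate and summarise
        if abs(s_sim - s_obs) < eps:         # keep theta if summaries match
            accepted.append(theta)
    return np.array(accepted)

obs = rng.normal(1.0, 1.0, size=100)   # pretend-observed data
post = abc_rejection(obs)              # approximate posterior sample
```

The accepted `theta` values approximate the posterior given the summary; the tolerance `eps` and the choice of `summary` control the quality of the approximation, which is where most of the workshop topics below come in.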

The workshop will consist of invited and contributed talks, poster spotlights, and a poster session. Rather than a panel discussion we will encourage open discussion between the speakers and the audience!

Examples of topics of interest in the workshop include (but are not limited to):

* Applications of ABC to machine learning, e.g., computer vision, inverse problems

* ABC in Systems Biology, Computational Science, etc.

* ABC Reinforcement Learning

* Machine learning simulator models, e.g., NN models of simulation responses, GPs etc.

* Selection of sufficient statistics

* Online and post-hoc error

* ABC with very expensive simulations and acceleration methods (surrogate modeling, choice of design/simulation points)

* ABC with probabilistic programming

* Posterior evaluation of scientific problems/interaction with scientists

* Post-computational error assessment

* Impact on resulting ABC inference

* ABC for model selection

**Submission:**

## statistical challenges in neuroscience

Posted in Books, pictures, Statistics, Travel with tags ABC, computer experiment model, Gaussian processes, indirect inference, neurosciences, University of Warwick, workshop on September 4, 2014 by xi'an

**Y**et another workshop around! Still at Warwick, organised by Simon Barthelmé, Nicolas Chopin and Adam Johansen on the theme of statistical aspects of neuroscience. Being nearby I attended a few lectures today, but most talks are more topical than my current interest in the matter, plus workshop fatigue starts to appear!, and hence I will keep a low attendance for the rest of the week to take advantage of my visit here to make some progress in my research and in the preparation of the teaching semester. (Maybe paradoxically, I attended a non-neuroscience talk by listening to Richard Wilkinson's coverage of ABC methods, with an interesting stress on meta-models and the link with computer experiments.) Given that we are currently re-revising our paper with Matt Moore and Kerrie Mengersen (and now Chris Drovandi), I find it interesting to see a sort of convergence in our community towards a re-re-interpretation of ABC as producing an approximation of the distribution of the summary statistic itself, rather than of the original data, using auxiliary or indirect or pseudo-models like Gaussian processes. (Making the link with Mark Girolami's talk this morning.)

## big data, big models, it is a big deal! [posters & talks]

Posted in Books, Kids, pictures, Statistics, Travel, University life with tags ABC, Amazon, astronomy, astrostatistics, big data, conference, England, galaxies, pulsars, Statistics, supernovae, The Fourth Paradigm, The Large Synoptic Survey Telescope, University of Warwick, variational Bayes methods, workshop on September 3, 2014 by xi'an

**G**reat poster session yesterday night and at lunch today. Saw an ABC poster (by Dennis Prangle, following our random forest paper) and several MCMC posters (by Marco Banterle, who actually won one of the speed-meeting mini-project awards!, Michael Betancourt, Anne-Marie Lyne, Murray Pollock), and then a rather different poster on Mondrian forests, which generalise random forests to sequential data (by Balaji Lakshminarayanan). The talks all had interesting aspects or glimpses about big data and some of the unnecessary hype about it (them?!), along with exposing the nefarious views of Amazon to become the Earth's only seller!, but I particularly enjoyed the astronomy afternoon and even more particularly Steve Roberts' sweep through machine learning in astronomy. Steve characterised variational Bayes as picking your choice of sufficient statistics, which made me wonder why there are no stronger connections between variational Bayes and ABC. He also quoted the book The Fourth Paradigm: Data-Intensive Scientific Discovery by Tony Hey as putting forward interesting notions. (A book review for the next vacations?!) He also mentioned zooniverse, a citizen science website I was not aware of, with a Bayesian analysis of the learning curve of those annotating citizens (in the case of supernova classification). Big deal, indeed!!!

## NIPS workshops (Dec. 12-13, 2014, Montréal)

Posted in Kids, Statistics, Travel, University life with tags ABC, ABC in Montréal, Canada, delayed acceptance, machine learning, Montréal, NIPS 2014, prefetching, Québec on August 25, 2014 by xi'an

**F**ollowing a proposal put forward by Ted Meeds, Max Welling, Richard Wilkinson, Neil Lawrence and myself, our **ABC in Montréal** workshop has been accepted by the NIPS 2014 committee and will thus take place on either Friday, Dec. 12, or Saturday, Dec. 13, at the end of the main NIPS meeting (Dec. 8-10). (Despite the title, this workshop is not part of the *ABC in …* series I started five years ago. It will only last a single day with a few invited talks and no poster. And no free wine & cheese party.) On top of this workshop, our colleagues Vikash K Mansinghka, Daniel M Roy, Josh Tenenbaum, Thomas Dietterich, and Stuart J Russell have also been successful in their bid for the *3rd NIPS Workshop on Probabilistic Programming*, which will presumably be held on the opposite day to ours, as Vikash is speaking at our workshop, while I am speaking in theirs. I am as yet undecided as to whether or not to attend the main conference, given that I am already travelling a lot this semester and have to teach two courses, incl. a large undergraduate statistical inference course… Obviously, I will try to attend if our joint paper is accepted by the editorial board! Even though Marco will then be the speaker.

## ABC model choice by random forests [guest post]

Posted in pictures, R, Statistics, University life with tags ABC, ABC model choice, arXiv, classification, Dennis Prangle, Elements of Statistical Learning, machine learning, model posterior probabilities, posterior predictive, PPER, random forests on August 11, 2014 by xi'an

*[Dennis Prangle sent me his comments on our ABC model choice by random forests paper. Here they are! And I appreciate very much contributors commenting on my paper or others, so please feel free to join.]*

**T**his paper proposes a new approach to likelihood-free model choice based on random forest classifiers. These are fit to simulated model/data pairs and then run on the observed data to produce a predicted model. A novel “posterior predictive error rate” is proposed to quantify the degree of uncertainty placed on this prediction. Another interesting use of this is to tune the threshold of the standard ABC rejection approach, which is outperformed by random forests.
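The fit-then-predict scheme described above can be sketched in a few lines (a toy example of my own with sklearn, not the paper's actual models or summaries: a Gaussian versus a heavier-tailed Student-t model, distinguished via three summary statistics):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)

def simulate_summaries(model, n=50):
    # simulate data under the chosen model, reduce to summary statistics
    x = rng.normal(0, 1, size=n) if model == 0 else rng.standard_t(3, size=n)
    return [x.mean(), x.std(), np.abs(x).max()]

# training set: simulated model-index / summary pairs
models = rng.integers(0, 2, size=2000)
summaries = np.array([simulate_summaries(m) for m in models])
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(summaries, models)

# held-out pairs give a rough misclassification rate; for real data the
# predicted model would be rf.predict([summaries_of_observed_data])
models_test = rng.integers(0, 2, size=500)
summaries_test = np.array([simulate_summaries(m) for m in models_test])
print(rf.score(summaries_test, models_test))
```

The held-out score here plays the role of a prior (not posterior) error rate; the PPER discussed below instead draws the test pairs from an approximation to the posterior predictive.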

The paper has lots of thought-provoking new ideas and was an enjoyable read, as well as giving me the encouragement I needed to read another chapter of the indispensable *Elements of Statistical Learning*. However, I'm not fully convinced by the approach yet, for a few reasons which are below along with other comments.

**Alternative schemes**

The paper shows that random forests outperform rejection-based ABC. I'd like to see a comparison to more efficient ABC model choice algorithms such as that of Toni et al. (2009). I'd also like to see whether the output of random forests could be used as summary statistics within ABC rather than as a separate inference method.

**Posterior predictive error rate (PPER)**

This is proposed to quantify the performance of a classifier given a particular data set. The PPER is the proportion of times the classifier’s most favoured model is incorrect for simulated model/data pairs drawn from an approximation to the posterior predictive. The approximation is produced by a standard ABC analysis.
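The PPER definition above translates directly into an estimator; a minimal sketch (my own illustration, with a hypothetical fitted classifier and a stand-in for the ABC posterior sample):

```python
import numpy as np

rng = np.random.default_rng(2)

def pper(predict, post_models, post_params, simulate, summarise):
    """Proportion of misclassified model/data pairs drawn from the
    (ABC-approximated) posterior predictive."""
    errors = 0
    for m, theta in zip(post_models, post_params):
        x = simulate(m, theta)                 # posterior predictive draw
        errors += predict(summarise(x)) != m   # most-favoured model wrong?
    return errors / len(post_models)

# toy models: model 0 is N(theta, 1), model 1 is N(theta, 3)
simulate = lambda m, theta: rng.normal(theta, 1.0 if m == 0 else 3.0, size=100)
summarise = lambda x: x.std()
predict = lambda s: 0 if s < 2.0 else 1        # stand-in fitted classifier

post_models = rng.integers(0, 2, size=500)     # stand-in ABC posterior sample
post_params = rng.normal(0, 1, size=500)
rate = pper(predict, post_models, post_params, simulate, summarise)
print(rate)
```

In the paper the `post_models`/`post_params` pairs would come from a standard ABC run on the observed data, which is exactly the dependence questioned below.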

Misclassification could be due to (a) a poor classifier or (b) uninformative data, so the PPER aggregates these two sources of uncertainty. I think it is still very desirable to have an estimate of the uncertainty due to (b) only, i.e., a posterior weight estimate. However the PPER is useful. Firstly, end users may sometimes only care about the aggregated uncertainty. Secondly, relative PPER values for a fixed dataset are a useful measure of uncertainty due to (a), for example in tuning the ABC threshold. Finally, one drawback of the PPER is the dependence on an ABC estimate of the posterior: how robust are the results to the details of how this is obtained?

**Classification**

This paper illustrates an important link between ABC and machine learning classification methods: model choice can be viewed as a classification problem. There are some other links: some classifiers make good model choice summary statistics (Prangle et al. 2014) or good estimates of ABC-MCMC acceptance ratios for parameter inference problems (Pham et al. 2014). So the good performance of random forests makes them seem a generally useful tool for ABC (indeed they are used in the Pham et al. paper).