efficient learning in ABC

Jean-Michel Marin just posted on arXiv a joint paper of ours, Efficient learning in ABC algorithms. This paper, to whose elaboration [if not redaction] I contributed, is one of the chapters of Mohammed Sedki’s thesis. (Mohammed is about to defend this thesis, which is currently with the reviewers. A preliminary version of the paper was presented at ABC in London and it is now in revision with Statistics and Computing, hence missing the special issue!)

The paper builds on the sequential ABC scheme of Del Moral et al. (2012), already discussed in this post, where the tolerance level at each step is adapted from the previous iterations as a quantile of the distances. (The quantile level is based on the current effective sample size.) In a “systematic” step, the particles closest to the observations are preserved and duplicated, while those farther away are discarded and replaced by resampling among the survivors. The resulting population of particles is then perturbed by an adaptive (random walk) kernel, and the algorithm stops when the tolerance level no longer decreases or when the acceptance rate of the Metropolis step is too low. Pierre Pudlo and Mohammed Sedki experimented with a parallel implementation of the algorithm, with a speed-up almost linear in the number of cores. It is less clear that the same would hold on a GPU, because of the communication requirements. Overall, the new algorithm brings a significant improvement in computing time when compared with earlier versions, mainly because the number of simulations from the model is minimised. (It was tested on a rather complex population scenario retracing the invasion of honeybees in Europe, in connection with the previous post!)
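To make the adaptive-tolerance loop more concrete, here is a minimal sketch in Python on a toy Gaussian model. It only illustrates the general scheme described above (quantile-based tolerance, duplication of the closest particles, adaptive random-walk move, stopping on a stalled tolerance or a low acceptance rate); all names, the toy model, and the fixed quantile level alpha are my own choices, not the authors’ implementation or their honeybee scenario.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model (hypothetical): data ~ N(theta, 1), prior theta ~ N(0, 10),
# summary statistic = sample mean.
y_obs = rng.normal(2.0, 1.0, size=50)
s_obs = y_obs.mean()

def simulate_summary(theta):
    """Simulate a dataset under theta and return its summary statistic."""
    return rng.normal(theta, 1.0, size=50).mean()

def prior_sample(n):
    return rng.normal(0.0, np.sqrt(10.0), size=n)

def prior_logpdf(theta):
    return -0.5 * theta**2 / 10.0

N = 1000           # number of particles
alpha = 0.5        # quantile level for the next tolerance (kept fixed here;
                   # in the paper it is tied to the current effective sample size)
min_accept = 0.05  # stop when the Metropolis acceptance rate drops below this

theta = prior_sample(N)
dist = np.abs(np.array([simulate_summary(t) for t in theta]) - s_obs)
eps = np.inf

while True:
    # 1. New tolerance: a quantile of the current distances.
    eps_new = np.quantile(dist, alpha)
    if eps_new >= eps:          # tolerance no longer decreases: stop
        break
    eps = eps_new

    # 2. "Systematic" step: keep the closest particles and duplicate them
    #    by resampling among the survivors only.
    keep = dist <= eps
    idx = rng.choice(np.flatnonzero(keep), size=N, replace=True)
    theta, dist = theta[idx], dist[idx]

    # 3. Perturb with an adaptive random-walk (Metropolis) kernel whose
    #    scale follows the spread of the current particle cloud.
    scale = 2.0 * theta.std()
    prop = theta + rng.normal(0.0, scale, size=N)
    prop_dist = np.abs(np.array([simulate_summary(t) for t in prop]) - s_obs)
    log_ratio = prior_logpdf(prop) - prior_logpdf(theta)
    accept = (prop_dist <= eps) & (np.log(rng.uniform(size=N)) < log_ratio)
    theta[accept], dist[accept] = prop[accept], prop_dist[accept]

    # 4. Stop when the Metropolis acceptance rate becomes too low.
    if accept.mean() < min_accept:
        break

print(f"final tolerance: {eps:.3f}, posterior mean estimate: {theta.mean():.3f}")
```

Since each particle is simulated and moved independently within an iteration, the expensive calls to simulate_summary are exactly the part that parallelises across cores, which is consistent with the near-linear gain reported above.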
