## running ABC when the likelihood is available

**T**oday I refereed a paper where the authors used ABC to bypass convergence (and implementation) difficulties with their MCMC algorithm. And I am still pondering whether or not this strategy makes sense. If only because ABC needs to handle the same complexity and the same amount of parameters as an MCMC algorithm. While shooting “in the dark” by using the prior or a coarse substitute to the posterior. And I wonder at the relevance of simulating new data when the [true] likelihood value [at the observed data] can be computed. This would sound to me like the relevant and unique “statistics” worth considering…
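The "shooting in the dark" point can be made concrete with a toy sketch (entirely hypothetical, not the model from the refereed paper): inferring the mean of a Normal(θ, 1) sample, where the likelihood is fully tractable and ABC is therefore strictly unnecessary. Basic ABC rejection draws θ from the prior, simulates a full dataset, and keeps θ only when a summary of the simulated data falls within ε of the observed summary — spending many simulations to recover what a single likelihood evaluation provides.

```python
import random
import statistics

# Hypothetical toy setting: n observations from Normal(true_theta, 1),
# with a flat Uniform(-10, 10) prior on theta.
random.seed(42)
n = 50
true_theta = 2.0
data = [random.gauss(true_theta, 1.0) for _ in range(n)]
obs_mean = statistics.fmean(data)  # sufficient statistic here

def abc_rejection(n_sims=20000, eps=0.05):
    """Basic ABC rejection: sample theta from the prior, simulate a
    dataset of the same size, accept theta when the simulated sample
    mean is within eps of the observed one."""
    accepted = []
    for _ in range(n_sims):
        theta = random.uniform(-10, 10)  # prior draw: "shooting in the dark"
        sim_mean = statistics.fmean(
            random.gauss(theta, 1.0) for _ in range(n)
        )
        if abs(sim_mean - obs_mean) < eps:
            accepted.append(theta)
    return accepted

post = abc_rejection()
print(len(post), statistics.fmean(post))
```

Most of the 20,000 prior draws are wasted (the acceptance rate is on the order of 2ε divided by the prior width), whereas the exact likelihood at the observed data is available in closed form for the same cost as one simulation — which is the post's objection in miniature.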

September 19, 2017 at 9:21 am

Very interesting point. I recently had some discussions about this, since my work led me to use an “approximate” strategy rather than a standard MCMC. In my cases, I didn’t have an explicit likelihood, since some integrals were involved; still, in both cases a standard MCMC could have been set up. In the first case, ABC turned out to be much faster and quite accurate in the simulation study. The second case involved various auxiliary-variable tricks in order to set up a sampler (only one full conditional was not an explicit distribution). In that case, the MCMC was quite unstable, I believe because of the high correlation among the parameters. On the other hand, my bootstrap likelihood algorithm led to very good results in the simulation study (although it was very, very slow). But I tend to consider your BC-EL and my BC-BL as an “exact” class of algorithms rather than placing them under the same umbrella as accept/reject algorithms. Therefore, I don’t see it as a “sin” when well-motivated.

From your post, I guess they didn’t have a particularly good reason other than laziness… :)

September 19, 2017 at 5:17 pm

Thanks, Fabrizio! It would be most interesting to understand why this ABC resolution turns out faster than the alternative MCMC.