## ABC+EL=no D(ata)

**I**t took us a loooong while *[for various and uninteresting reasons]* but we finally completed a paper on ABC using empirical likelihood (EL), a project that started with me listening to Brunero Liseo’s tutorial at O’Bayes 2011 in Shanghai… Brunero presented empirical likelihood as a semi-parametric technique w/o much of a Bayesian connection, and this got me thinking of a possible recycling within ABC. I won’t get into the details of empirical likelihood here, referring instead to Art Owen’s book *Empirical Likelihood* for a comprehensive entry. The core idea of empirical likelihood is to use a maximum entropy discrete distribution supported by the data and constrained by estimating equations related to the parameters of interest/of the model. As such, it is a non-parametric approach in the sense that the distribution of the data does not need to be specified, only some of its characteristics. Econometricians have been quite busy developing this kind of approach over the years (see, e.g., Gouriéroux and Monfort’s *Simulation-Based Econometric Methods*). However, this empirical likelihood technique can also be seen as a convergent approximation to the likelihood, and hence exploited in cases when the exact likelihood cannot be derived, for instance as a substitute for the exact likelihood in Bayes’ formula. Here is, for instance, a comparison of a true normal-normal posterior with a sample of 10³ points simulated using the empirical likelihood based on the moment constraint.
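The maximum entropy profiling described above can be sketched in a few lines. Below is a minimal Python sketch (my own illustration, not the paper’s code) of the profile empirical log-likelihood for a single mean constraint E[X] = μ, obtained by solving the usual Lagrange-multiplier equation; the function name `empirical_log_likelihood` is of my own choosing.

```python
import numpy as np
from scipy.optimize import brentq

def empirical_log_likelihood(x, mu):
    """Profile empirical log-likelihood ratio for the constraint E[X] = mu.
    Maximises sum(log w_i) subject to sum(w_i) = 1 and sum(w_i*(x_i - mu)) = 0,
    which gives w_i = 1 / (n * (1 + lam*(x_i - mu))) where lam solves the
    estimating equation g(lam) = sum((x_i - mu) / (1 + lam*(x_i - mu))) = 0."""
    z = np.asarray(x) - mu
    n = len(z)
    if z.min() >= 0 or z.max() <= 0:
        return -np.inf  # mu lies outside the convex hull of the data
    # the admissible lam keeps every 1 + lam*z_i strictly positive
    lo = -1.0 / z.max() + 1e-10
    hi = -1.0 / z.min() - 1e-10
    g = lambda lam: np.sum(z / (1.0 + lam * z))  # monotone decreasing in lam
    lam = brentq(g, lo, hi)
    w = 1.0 / (n * (1.0 + lam * z))
    return np.sum(np.log(n * w))  # equals 0 at mu = sample mean
```

The log-ratio is maximal (zero) at the sample mean, where all weights collapse back to the uniform 1/n, and decreases as μ moves away, mimicking a genuine log-likelihood.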

**T**he paper we wrote with Kerrie Mengersen and Pierre Pudlo thus examines the consequences of using an empirical likelihood in ABC contexts. Although we called the derived algorithm ABC_{el}, it differs from genuine ABC algorithms in that *it does not simulate pseudo-data*. Hence the title of this post. (The title of the paper is “*Approximate Bayesian computation via empirical likelihood*“. It should be arXived by the time the post appears: “*Your article is scheduled to be announced at Mon, 28 May 2012 00:00:00 GMT*“.) We had indeed started looking at a simulated data version, but it was rather poor, and we thus opted for an importance sampling version where the parameters are simulated from an importance distribution (e.g., the prior) and then weighted by the empirical likelihood (times a regular importance factor if the importance distribution is not the prior). The above graph is an illustration in a toy example.
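The importance-sampling version described above can be sketched as follows: draw parameters from the prior and weight each draw by its empirical likelihood, with no pseudo-data simulated. This is a minimal self-contained sketch for the toy normal-mean example (my own illustration under a flat-ish N(0, 10²) prior, not the paper’s code).

```python
import numpy as np
from scipy.optimize import brentq

def el_logratio(x, mu):
    # profile empirical log-likelihood ratio for the moment constraint E[X] = mu
    z = np.asarray(x) - mu
    if z.min() >= 0 or z.max() <= 0:
        return -np.inf  # mu outside the convex hull of the data
    g = lambda lam: np.sum(z / (1.0 + lam * z))
    lam = brentq(g, -1.0 / z.max() + 1e-10, -1.0 / z.min() - 1e-10)
    return -np.sum(np.log1p(lam * z))

def abc_el_sampler(x, n_draws, prior_rvs, rng):
    """ABC_el by importance sampling: parameters are simulated from the
    prior and weighted by their empirical likelihood (no pseudo-data)."""
    theta = prior_rvs(n_draws, rng)
    logw = np.array([el_logratio(x, t) for t in theta])
    logw -= logw.max()              # stabilise before exponentiating
    w = np.exp(logw)
    return theta, w / w.sum()

# toy example: N(0, 10^2) prior on the mean, data simulated from N(2, 1)
rng = np.random.default_rng(1)
x = rng.normal(2.0, 1.0, 100)
theta, w = abc_el_sampler(x, 5000, lambda n, r: r.normal(0.0, 10.0, n), rng)
post_mean = np.sum(w * theta)       # weighted posterior mean, near the data mean
```

Since the importance distribution here is the prior, the weights reduce to the empirical likelihood itself; a non-prior proposal would require the extra importance factor mentioned above.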

**T**he difficulty with the method is in connecting the parameters (of interest/of the assumed distribution) with moments of the (iid) data. While this operates rather straightforwardly for quantile distributions, it is less clear for dynamic models like ARCH and GARCH, where we have to reconstruct the underlying iid process. (There, ABC_{el} clearly improves upon ABC for the GARCH(1,1) model but remains less informative than a regular MCMC analysis. Incidentally, this study led to my earlier post on the unreliable garch() function in the tseries package!) And it is even harder for population genetic models, where parameters like divergence dates, effective population sizes, mutation rates, &tc., cannot be expressed as moments of the distribution of the sample at a given locus. In particular, the datapoints are not iid. Pierre Pudlo then had the brilliant idea to resort instead to a composite likelihood, approximating the intra-locus likelihood by a product of pairwise likelihoods over all pairs of genes in the sample at a given locus. Indeed, in Kingman’s coalescent theory, the pairwise likelihoods can be expressed in closed form, hence we can derive the pairwise composite scores. The comparison with optimal ABC outcomes shows an improvement brought by ABC_{el} in the approximation, at an overall computing cost that is negligible relative to ABC (i.e., it takes minutes to produce the ABC_{el} outcome, compared with hours for ABC).
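The pairwise composite trick can be stated generically: the intractable intra-locus likelihood is replaced by a product over all pairs of genes, i.e. a sum of pairwise log-likelihoods. A minimal sketch follows; the closed-form coalescent pairwise likelihood itself is model-specific and not reproduced here, so `pair_loglik` stands for a hypothetical user-supplied function returning the log-likelihood of one pair.

```python
from itertools import combinations

def pairwise_composite_loglik(genes, pair_loglik, theta):
    """Composite log-likelihood at one locus: sum of log pairwise
    likelihoods over all pairs of genes in the sample, where
    pair_loglik(a, b, theta) is the closed-form log-likelihood
    of the pair (a, b) under parameter theta."""
    return sum(pair_loglik(a, b, theta) for a, b in combinations(genes, 2))
```

The product over pairs is what makes the approximation tractable: each pairwise term is available in closed form under the coalescent, whereas the full joint likelihood of the sample is not.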

**W**e are now looking for extensions and improvements of ABC_{el}, both at the methodological and at the genetic levels, and we would of course welcome any comment at this stage. The paper has been submitted to *PNAS*, as we hope it should appeal to the ABC community at large, i.e. beyond statisticians…

November 18, 2014 at 5:34 pm

[…] in getting maximum empirical likelihood estimator. This reminded me of another paper about “Approximate Bayesian Computation (ABC) via Empirical Likelihood“, which uses empirical likelihood to get improvement in the approximation at an overall […]

June 1, 2012 at 12:53 am

Christian

I did an implementation of the raw ABC and ABC-el algorithms for the stochastic volatility model using conditional moments, and also using smoothed moments for inducing independence in the empirical likelihood function, e.g. Kitamura (1997).

I still need to perform a more detailed analysis of the results and a Monte Carlo study, but the results seem very interesting. For real data, the posterior expectation obtained by ABC-el using smoothed moments is almost equal to that obtained by the MCMC method of Kim, Shephard and Chib (1998), using only 2000 samples.

If you are interested I can send you the results and a more detailed description of the method.

Márcio

June 1, 2012 at 6:11 am

This is highly interesting, most obviously! Please send me the results or a link to them.

May 29, 2012 at 5:25 am

A truly innovative article!

I believe that a stochastic volatility model is a most interesting application for the ABC methodology using empirical likelihood, as it allows one to build moment conditions directly, e.g. the moment conditions used in GMM estimation: Torben G. Andersen and Bent E. Sørensen, “GMM Estimation of a Stochastic Volatility Model: A Monte Carlo Study”, Journal of Business & Economic Statistics, Vol. 14, No. 3 (July 1996), pp. 328-352.

An alternative to reconstructing an iid process for dependent data in the empirical likelihood context is the use of smoothing/blocking in the moment conditions, as defined in Kitamura, Y. (1997), “Empirical Likelihood Methods with Weakly Dependent Data”, Annals of Statistics 25(5), 2084-2102, and Smith, R. (2001), “GEL Criteria for Moment Condition Models”.


Márcio Laurini

May 29, 2012 at 6:55 am

Thanks for the references. Stochastic volatility was on my list of to-do models, indeed!

May 29, 2012 at 4:47 am

Even though I’m outside the ABC community, I have to say this sounds like a pretty major advance!

May 29, 2012 at 6:50 am

Thanks!