## approximate likelihood

**T**oday, I read a newly arXived paper by Stephen Gratton on a method called GLASS, for *General Likelihood Approximate Solution Scheme*… The starting point is the same as with ABC or synthetic likelihood, namely a collection of summary statistics and an intractable likelihood. The author proposes to use as a substitute a maximum entropy solution based on these summary statistics and their assumed moments under the theoretical model. What remains unclear in the paper is whether these assumed moments are available in closed form. If they are not, the method would appear to be a variant of the synthetic likelihood [aka simulated moments] approach, in that the expectations of the summary statistics under the theoretical model, for a given value of the parameter, are obtained through Monte Carlo approximations. (All the examples therein allow for closed-form expressions.)
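To make the maximum entropy idea concrete, here is a minimal sketch (not the paper's actual GLASS construction, whose details are not reproduced here): given target moments for a set of sufficient statistics, the maximum entropy density on a discrete grid takes the exponential-family form p(x) ∝ exp(λ·T(x)), with λ obtained by minimising the convex dual log Z(λ) − λ·m. The grid, the choice of statistics (first two moments), and the function names are all illustrative assumptions.

```python
import numpy as np
from scipy.special import logsumexp
from scipy.optimize import minimize

def maxent_pmf(grid, Tx, m):
    """Maximum entropy pmf on `grid` subject to sum_i p_i T_j(x_i) = m_j.

    Tx : (n, k) matrix of statistics T_j evaluated on the grid points.
    m  : length-k vector of target moments.
    The solution has exponential-family form p_i ∝ exp(lam . T(x_i)),
    with lam minimising the convex dual log Z(lam) - lam . m.
    """
    dual = lambda lam: logsumexp(Tx @ lam) - lam @ m
    lam = minimize(dual, np.zeros(Tx.shape[1]), method="BFGS").x
    logp = Tx @ lam - logsumexp(Tx @ lam)
    return np.exp(logp)

# Illustration: constrain mean to 0 and second moment to 1; the maxent
# solution is then (a discretised) standard normal density.
grid = np.linspace(-5.0, 5.0, 401)
Tx = np.column_stack([grid, grid**2])
p = maxent_pmf(grid, Tx, np.array([0.0, 1.0]))
```

At the dual optimum the gradient condition E_p[T] = m holds, so the fitted pmf reproduces the target moments; with mean/variance constraints this recovers the Gaussian shape, as expected from the classical maxent characterisation.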

September 7, 2017 at 8:49 pm

I saw the synthetic likelihood connection as applying more to the hypothetical “ideal case” Stephen alludes to. For the case where some number of moments is available in closed form and the likelihood is completed via maximum entropy, I see this as being most similar to empirical likelihood: the prod pi_i empirical likelihood penalty is equivalent to the max ent penalty, but restricted to the samples, with the empirical distribution as reference density rather than integrated over the parameter space. It is possibly also similar to other ‘estimating equations’/‘moment restriction’ ideas.
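For readers unfamiliar with the empirical likelihood construction invoked above, here is a standard textbook sketch (following Owen's formulation, not anything from the GLASS paper): maximise prod pi_i over weights on the observed samples subject to a moment restriction, here the simplest case of a mean constraint. The profile solution is pi_i = 1/(n(1 + λ g_i)) with g_i = x_i − μ and λ solving the score equation; the function name and the brentq root-finding choice are illustrative.

```python
import numpy as np
from scipy.optimize import brentq

def el_logratio(x, mu):
    """Empirical likelihood log-ratio sum_i log(n * pi_i) for a mean mu.

    Maximises prod pi_i subject to sum pi_i = 1 and sum pi_i (x_i - mu) = 0.
    Lagrangian solution: pi_i = 1 / (n * (1 + lam * g_i)), g_i = x_i - mu,
    with lam the root of the score sum_i g_i / (1 + lam * g_i) = 0.
    """
    g = x - mu
    if g.min() >= 0 or g.max() <= 0:
        return float("-inf")  # mu outside the convex hull of the data
    score = lambda lam: np.sum(g / (1.0 + lam * g))
    # Feasible interval keeps all weights positive: 1 + lam * g_i > 0.
    lo = -1.0 / g.max() + 1e-10
    hi = -1.0 / g.min() - 1e-10
    lam = brentq(score, lo, hi)
    return -np.sum(np.log1p(lam * g))

rng = np.random.default_rng(0)
x = rng.normal(size=50)
# The ratio is maximal (zero) at the sample mean and decreases away from it.
at_mean = el_logratio(x, x.mean())
away = el_logratio(x, x.mean() + 0.5)
```

The point of the analogy in the comment is visible here: the weights pi_i play the role of the maximum entropy density, but the optimisation runs only over the observed samples, with the empirical distribution as the reference measure.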