I tried to read the paper but found it extremely convoluted. From what I understand, this is a pseudo-likelihood method with a simulation component, but there is no approximation assessment as in ABC, as far as I can judge. The pseudo-model stands as is.

The simulated method of moments is a particular case of indirect inference in which the instrumental (auxiliary) model is the correctly specified model itself, and is used only to generate the moment conditions, which are not available in analytic form.

In the article by Ron and Rob, the statistical model can be overidentified (the number of parameters of the statistical model is greater than the number of parameters of the scientific model), and so the mapping is not necessarily one-to-one.

The connection between moment conditions and approximate Bayesian inference is explored in:

http://www.dynare.org/wp-repo/dynarewp008.pdf

INDIRECT LIKELIHOOD INFERENCE

MICHAEL CREEL AND DENNIS KRISTENSEN

ABSTRACT. Given a sample from a fully specified parametric model, let Zn be a given finite-dimensional statistic – for example, an initial estimator or a set of sample moments. We propose to (re-)estimate the parameters of the model by maximizing the likelihood of Zn. We call this the maximum indirect likelihood (MIL) estimator. We also propose a computationally tractable Bayesian version of the estimator which we refer to as a Bayesian Indirect Likelihood (BIL) estimator. In most cases, the density of the statistic will be of unknown form, and we develop simulated versions of the MIL and BIL estimators. We show that the indirect likelihood estimators are consistent and asymptotically normally distributed, with the same asymptotic variance as that of the corresponding efficient two-step GMM estimator based on the same statistic. However, our likelihood-based estimators, by taking into account the full finite-sample distribution of the statistic, are higher order efficient relative to GMM-type estimators. Furthermore, in many cases they enjoy a bias reduction property similar to that of the indirect inference estimator. Monte Carlo results for a number of applications including dynamic and nonlinear panel data models, a structural auction model and two DSGE models show that the proposed estimators indeed have attractive finite sample properties.

Correct! There is no assumption of that kind. Once a set of statistics is chosen, ABC approximates the posterior distribution associated with those statistics. If the set is too small and fails to identify the parameters, the posterior will be the prior (on some combinations of the parameters)…
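To illustrate that last point, here is a minimal sketch (a made-up toy model, plain rejection ABC) of what happens when the chosen statistic carries no information about the parameter: the accepted draws simply reproduce the prior.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy model: data are N(mu, 1), but the summary statistic is the sample
# variance, which is ancillary for mu. The statistic therefore fails to
# identify the parameter.
def simulate(mu, n=100):
    return rng.normal(mu, 1.0, size=n)

s_obs = simulate(0.7).var()

# ABC rejection with a flat prior on (-5, 5).
draws = rng.uniform(-5, 5, size=20_000)
dist = np.array([abs(simulate(m).var() - s_obs) for m in draws])
accepted = draws[dist < 0.05]

# The accepted draws still spread over the whole prior range:
# the ABC "posterior" on mu is just the prior.
print(accepted.min(), accepted.max(), accepted.mean())
```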

One thing that I believe is limiting their approach is the “identifiability” of the mapping (one-to-one mapping) between the scientific and statistical models.

I am new to the ABC literature (and this is the main motivation why I started reading your blog regularly)… please Christian correct me if I am wrong… but ABC – I believe – does not assume any form of “identifiability”.

The way I see it is that if there are two sets of parameters that generate the same summary statistics through the simulated data, we still have valid ABC posteriors (even if they will probably appear multimodal).
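A quick toy illustration of this (hypothetical model, plain rejection ABC): if the summary statistic cannot distinguish two parameter values, the ABC posterior is still perfectly well defined, it just puts mass on both.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model in which theta and -theta are observationally equivalent:
# data are N(theta**2, 1), so the summary statistic (the sample mean)
# cannot distinguish the sign of theta.
def simulate(theta, n=100):
    return rng.normal(theta**2, 1.0, size=n)

theta_true = 1.5
s_obs = simulate(theta_true).mean()

# Plain ABC rejection sampler with a flat prior on (-3, 3).
draws = rng.uniform(-3, 3, size=50_000)
sims = np.array([simulate(t).mean() for t in draws])
accepted = draws[np.abs(sims - s_obs) < 0.05]

# The ABC posterior is valid but bimodal: mass near +1.5 and near -1.5.
print((accepted > 0).mean(), (accepted < 0).mean())
```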

Yes, this is right: the simulated method of moments is a special case of simulated inference and thus relates to ABC algorithms as well.

Thanks Marcio, I will look at the paper today!

Another connection between Bayesian methods and indirect inference is:

“On the Determination of General Scientific Models With Application to Asset Pricing”

A. Ronald Gallant and Robert E. McCulloch. Journal of the American Statistical Association, March 1, 2009, 104(485): 117–131. doi:10.1198/jasa.2009.0008.

from the abstract:

The habit model exhibits four characteristics that are often present in models developed from scientific considerations: (1) a likelihood is not available; (2) prior information is available; (3) a portion of the prior information is expressed in terms of functionals of the model that cannot be converted into an analytic prior on model parameters; (4) the model can be simulated. The underpinning of our approach is that, in addition, (5) a parametric statistical model for the data, determined without reference to the scientific model, is known. In general one can expect to be able to determine a model that satisfies (5) because very richly parameterized statistical models are easily accommodated. We develop a computationally intensive, generally applicable, Bayesian strategy for estimation and inference for scientific models that meet this description together with methods for assessing model adequacy. An important adjunct to the method is that a map from the parameters of the scientific model to functionals of the scientific and statistical models becomes available. This map is a powerful tool for understanding the properties of the scientific model.

The way I see it is that in SMM the “summary statistics” are a collection of moments thought to be relevant by economists. The way they are matched is through a moment-score equation, following the GMM literature. You could formulate similar ABC algorithms…
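A sketch of what such a moment-matching ABC algorithm might look like (a toy Gaussian model, the SMM-style moment vector as summary statistic, and bootstrap-estimated inverse variances standing in for the GMM weighting matrix; every name and choice here is illustrative, not taken from the papers above):

```python
import numpy as np

rng = np.random.default_rng(1)

# "Economist's moments" used as the ABC summary statistic.
def moments(y):
    return np.array([y.mean(), y.var()])

def simulate(mu, sigma, n=200):
    return rng.normal(mu, sigma, size=n)

y_obs = simulate(1.0, 2.0)
m_obs = moments(y_obs)

# Crude stand-in for the GMM weighting matrix: inverse variances of the
# moments, estimated by bootstrapping the observed sample.
boot = np.array([moments(rng.choice(y_obs, size=len(y_obs))) for _ in range(500)])
w = 1.0 / boot.var(axis=0)

# ABC by weighted moment matching: keep the draws whose simulated
# moments are closest to the observed ones.
prior_mu = rng.uniform(-5, 5, size=20_000)
prior_sigma = rng.uniform(0.1, 5, size=20_000)
dist = np.array([
    np.sum(w * (moments(simulate(m, s)) - m_obs) ** 2)
    for m, s in zip(prior_mu, prior_sigma)
])
keep = dist.argsort()[:200]  # accept the best 1%
print(prior_mu[keep].mean(), prior_sigma[keep].mean())
```

The accepted draws concentrate near the parameter values whose moments match the data, which is the ABC analogue of driving the weighted moment conditions to zero in SMM/GMM.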

D.