## ABC with no prior

> “I’m trying to fit a complex model to some data that take a large amount of time to run. I’m also unable to write down a Likelihood function to this problem and so I turned to approximate Bayesian computation (ABC). Now, given the slowness of my simulations, I used Sequential ABC (…) In fact, contrary to the concept of Bayesian statistics (new knowledge updating old knowledge) I would like to remove all the influence of the priors from my estimates.”

A question from X validated to which I have little to contribute, as the originator of the problem had the utmost difficulty understanding that ABC cannot be run without a probability structure on the parameter space. Maybe a fiducialist in disguise?! To this end, the poster simulated from a collection of priors and took the best 5% across the priors, which is akin to either running a mixture prior or to using ABC to conduct prior choice, which reminds me of a paper by Toni et al. Not that it helps remove “all the influence of the priors”, of course…
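For readers wondering why the prior is unavoidable, a minimal rejection-ABC sketch makes the point (the Gaussian toy model, tolerance, and summary statistic below are my own illustrative choices, not the questioner's setup): the very first step of the algorithm draws the parameter from the prior, so removing the prior removes the algorithm itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "intractable" model: we can simulate data given theta, but we
# pretend the likelihood is unavailable.  Data summarised by the mean.
obs = rng.normal(loc=2.0, scale=1.0, size=50)
s_obs = obs.mean()

def simulate(theta, size=50):
    return rng.normal(loc=theta, scale=1.0, size=size)

# Rejection ABC: the prior is the *only* source of candidate parameter
# values -- without a distribution on theta there is nothing to sample.
def abc_rejection(prior_sampler, n_sims=20_000, eps=0.05):
    draws = []
    for _ in range(n_sims):
        theta = prior_sampler()          # step 1: draw from the prior
        s_sim = simulate(theta).mean()   # step 2: simulate and summarise
        if abs(s_sim - s_obs) < eps:     # step 3: keep if close to data
            draws.append(theta)
    return np.array(draws)

post = abc_rejection(lambda: rng.uniform(-10, 10))  # flat prior on (-10, 10)
```

The accepted draws approximate the posterior under that flat prior; change the prior and the output changes accordingly, which is exactly the dependence the questioner hoped to escape.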

An unrelated item of uninteresting trivia is that a question I posted in 2012 on behalf of my former student Gholamossein Gholami, about the possibility of using EM to derive a Weibull maximum likelihood estimator (instead of sheer numerical optimisation), has passed 10⁴ views. But no answer so far!
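For context, the “sheer numerical optimisation” route mentioned above is itself fairly tame: profiling out the Weibull scale parameter leaves a single score equation in the shape parameter. A minimal sketch of that standard route (my own illustrative code, not taken from the question or its eventual answer):

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import weibull_min

rng = np.random.default_rng(1)
x = weibull_min.rvs(c=1.5, scale=2.0, size=500, random_state=rng)

def weibull_mle(x):
    """Weibull MLE via profile likelihood.

    For a fixed shape k, the scale MLE is lambda_hat(k) = mean(x**k)**(1/k);
    substituting it back leaves the classical one-dimensional score equation
        sum(x**k * log x) / sum(x**k) - 1/k - mean(log x) = 0,
    solved here by simple bracketing.
    """
    logx = np.log(x)

    def score(k):
        xk = x ** k
        return (xk * logx).sum() / xk.sum() - 1.0 / k - logx.mean()

    k_hat = brentq(score, 0.01, 50.0)           # root of the profile score
    lam_hat = np.mean(x ** k_hat) ** (1.0 / k_hat)
    return k_hat, lam_hat

k_hat, lam_hat = weibull_mle(x)
```

On simulated data the estimates land close to the generating values (shape 1.5, scale 2), which is why the EM alternative is interesting mostly as a “trick” rather than a practical necessity.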

April 30, 2018 at 12:46 pm

I just now left a response at Cross Validated, on that Weibull MLE question. Does that count as an answer?

May 1, 2018 at 8:47 am

Thank you, David, this is a stunning answer in that I could not have imagined using this latent variable representation, mixing observations and parameters in a deterministic way. Great “trick”, really!!!

April 30, 2018 at 1:00 am

If you don’t like priors there’s always indirect inference, right? https://arxiv.org/abs/0908.0433

April 30, 2018 at 8:08 am

Thanks for pointing out the connection, Corey. While indirect inference was introduced by my mentors in graduate school, then colleagues at CREST (and a soon-to-be colleague at Warwick), and while it is a brilliant concept, I do not see indirect inference as a proxy for Bayesian inference, in that it only produces an estimator (or a sequence thereof) and not a full inferential apparatus in the sense a posterior distribution provides. It is not solely a Monte Carlo method, granted, but it did not X my mind to mention it in this X validated answer. Will do now!
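To make the contrast concrete, here is a minimal indirect-inference sketch on a toy MA(1) model (the model, the auxiliary statistic, and all names below are my own illustrative choices, not from the linked paper): a candidate parameter is tuned until an auxiliary statistic computed on simulated data matches its observed counterpart, which yields a point estimate and nothing resembling a posterior.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(2)

# Toy structural model: an MA(1) process, x_t = e_t + theta * e_{t-1}.
def simulate_ma1(theta, noise):
    return noise[1:] + theta * noise[:-1]

# Auxiliary statistic: the lag-1 autocorrelation, trivial to compute
# even though we pretend the model's likelihood is out of reach.
def acf1(x):
    x = x - x.mean()
    return (x[1:] * x[:-1]).sum() / (x * x).sum()

obs = simulate_ma1(0.5, rng.normal(size=2_001))  # "observed" data
rho_obs = acf1(obs)

# Indirect inference: choose theta so the auxiliary statistic on simulated
# data matches the observed one.  A fixed noise panel E (common random
# numbers) keeps the objective smooth and deterministic in theta.
E = rng.normal(size=(20, 2_001))

def ii_objective(theta):
    rho_sim = np.mean([acf1(simulate_ma1(theta, e)) for e in E])
    return (rho_sim - rho_obs) ** 2

res = minimize_scalar(ii_objective, bounds=(-0.95, 0.95), method="bounded")
theta_hat = res.x  # a point estimate only: no posterior comes with it
```

The output is a single matched value of theta, which illustrates the comment above: the method delivers an estimator, not a distribution over the parameter.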