**E**arlier this year, F. Llorente, L. Martino, D. Delgado, and J. Lopez-Santiago arXived an updated version of their massive survey on marginal likelihood computation. Which I can only warmly recommend to anyone interested in the matter! Or looking for a base camp to initiate a graduate project. They break the methods into four families:

- Deterministic approximations (e.g., Laplace approximations)
- Methods based on density estimation (e.g., Chib’s method, aka the candidate’s formula)
- Importance sampling, including sequential Monte Carlo, with a subsection connecting with MCMC
- Vertical representations (mostly, nested sampling)

Besides sheer computation, the survey also touches upon issues like improper priors and alternatives to Bayes factors. The parts I would have covered in more detail are reversible jump MCMC and the long-lasting impact of Geyer’s reverse logistic regression (with its noise-contrastive extension), even though the link with bridge sampling is briefly mentioned there. There is even a table reporting on the coverage of earlier surveys. Of course, the following postnote of the manuscript

*The Christian Robert’s blog deserves a special mention, since Professor C. Robert has devoted several entries of his blog with very interesting comments regarding the marginal likelihood estimation and related topics.*

does not in the least make me less objective! Some of the final recommendations:

*use of Naive Monte Carlo* [simulate from the prior] *should be always considered* [assuming a proper prior!]
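For concreteness, the naive Monte Carlo estimator just averages the likelihood over draws from the prior, p(y) ≈ (1/N) Σᵢ p(y|θᵢ) with θᵢ ~ π(θ). A minimal sketch on a made-up conjugate Gaussian model (the model, data, and sample sizes are all illustrative choices of mine, not from the survey):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative conjugate toy model (my choice, not from the survey):
# y_i ~ N(theta, 1), i = 1..20, with a proper prior theta ~ N(0, 1).
y = rng.normal(0.5, 1.0, size=20)

def log_likelihood(theta):
    """Vectorised log p(y | theta) for an array of theta values."""
    return (-0.5 * len(y) * np.log(2 * np.pi)
            - 0.5 * np.sum((y[None, :] - theta[:, None]) ** 2, axis=1))

# Naive Monte Carlo: p(y) ~ (1/N) sum_i p(y | theta_i), theta_i ~ prior,
# computed on the log scale with a log-sum-exp for numerical stability.
N = 100_000
theta = rng.normal(0.0, 1.0, size=N)   # draws from the (proper!) prior
log_evidence = np.logaddexp.reduce(log_likelihood(theta)) - np.log(N)
```

In this conjugate case the exact evidence is available in closed form, which is what makes the toy useful as a sanity check; with a diffuse or improper prior the same recipe degrades or breaks down entirely, hence the bracketed caveat above.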

*a multiple-try method is a good choice within the MCMC schemes*
*optimal umbrella sampling estimator is difficult and costly to implement, so its best performance may not be achieved in practice*
*adaptive importance sampling uses the posterior samples to build a suitable normalized proposal, so it benefits from localizing samples in regions of high posterior probability while preserving the properties of standard importance sampling*
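The quoted recipe (fit a normalized proposal to posterior samples, then run plain importance sampling) can be sketched as follows; the Gaussian model, the proposal family, and all tuning constants are my own illustrative choices, with exact posterior draws standing in for MCMC output:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative conjugate toy model (mine, not the survey's):
# y_i ~ N(theta, 1), theta ~ N(0, 1).
y = rng.normal(0.5, 1.0, size=20)
n = len(y)

def log_joint(theta):
    """log p(y | theta) + log p(theta), vectorised over theta."""
    log_lik = (-0.5 * n * np.log(2 * np.pi)
               - 0.5 * np.sum((y[None, :] - theta[:, None]) ** 2, axis=1))
    log_prior = -0.5 * np.log(2 * np.pi) - 0.5 * theta ** 2
    return log_lik + log_prior

# Step 1: posterior samples. Here they come from the known conjugate posterior;
# in practice they would be MCMC output.
post_draws = rng.normal(np.sum(y) / (n + 1.0), 1.0 / np.sqrt(n + 1.0), size=5_000)

# Step 2: fit a simple *normalized* proposal to those samples
# (a Gaussian, deliberately widened to keep the weights stable).
q_mean, q_sd = post_draws.mean(), 1.5 * post_draws.std()

# Step 3: standard importance sampling for the evidence,
# p(y) ~ (1/N) sum_i p(y, theta_i) / q(theta_i), theta_i ~ q.
N = 50_000
theta = rng.normal(q_mean, q_sd, size=N)
log_q = (-0.5 * np.log(2 * np.pi) - np.log(q_sd)
         - 0.5 * ((theta - q_mean) / q_sd) ** 2)
log_evidence = np.logaddexp.reduce(log_joint(theta) - log_q) - np.log(N)
```

Because the proposal is localized where the posterior mass sits, the weights p(y,θ)/q(θ) are far better behaved than under prior sampling, while the estimator keeps the unbiasedness of standard importance sampling.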

*Chib’s method is a good alternative, that provide very good performances [but is not always available]*
*the success* [of nested sampling] *in the literature is surprising*.
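Since Chib’s method keeps coming up in those recommendations, here is its core identity on the same kind of toy conjugate model (an illustrative Gaussian example of mine, where the posterior ordinate is exact rather than estimated from Gibbs output as in Chib’s actual method): for any θ*, log p(y) = log p(y|θ*) + log p(θ*) − log p(θ*|y).

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative conjugate toy model (mine, not the survey's):
# y_i ~ N(theta, 1), theta ~ N(0, 1), so p(theta | y) is available exactly.
y = rng.normal(0.5, 1.0, size=20)
n = len(y)

# Chib's identity holds at any point theta*; a high-density point such as the
# posterior mean is the usual choice.
theta_star = np.sum(y) / (n + 1.0)

log_lik = -0.5 * n * np.log(2 * np.pi) - 0.5 * np.sum((y - theta_star) ** 2)
log_prior = -0.5 * np.log(2 * np.pi) - 0.5 * theta_star ** 2

# The posterior is N(theta_star, 1/(n+1)); its log density at its own mean:
post_sd = 1.0 / np.sqrt(n + 1.0)
log_post = -0.5 * np.log(2 * np.pi) - np.log(post_sd)

# Candidate's formula: log p(y) = log p(y|theta*) + log p(theta*) - log p(theta*|y)
log_evidence = log_lik + log_prior - log_post
```

The whole difficulty of the method in realistic settings is of course the denominator: the posterior ordinate p(θ*|y) must be estimated from MCMC output, which is exactly why it is "not always available".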
