block-wise pseudo-marginals

One justification for pseudo-marginal Metropolis-Hastings algorithms is the completion or demarginalisation of the initial target with the random variates used to compute the unbiased estimator of the target or likelihood. In a recent arXival, M.-N. Tran, Robert Kohn, M. Quiroz and M. Villani explore the idea of only updating part of those auxiliary random variates at each iteration, hence the block in the title. The idea is to "reduce the variability in the ratio of the likelihood estimates", but I think it also reduces the moves of the sampler by creating a strong correlation between successive likelihood estimates. Of course, a different appeal of the approach is when facing a large product of densities, large enough to prevent computing the overall approximation at once and thus requiring blockwise approximations. As in, e.g., consensus Monte Carlo and other "big data" (re)solutions. The convergence results provided in the paper are highly stylised (like assuming the log of the unbiased likelihood estimator is normally distributed and that the simulation is run from the prior), but they lead to a characterisation of the inefficiency of the pseudo-marginal algorithm, the inefficiency being defined as the ratio of the variances when using the true likelihood and when using the limiting unbiased estimator. There is, however, no corresponding result for selecting the number of blocks G, which is set to G=100 in the paper.
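To make the mechanics concrete, here is a minimal sketch of the blockwise idea, in Python, on a toy model of my own choosing rather than anything from the paper: the latents x_i ~ N(θ,1) with y_i|x_i ~ N(x_i,1), the likelihood estimated unbiasedly by importance sampling from the prior of the latents, and the underlying standard normals u kept as part of the chain state, with only one of G blocks refreshed per iteration. The step size, sample sizes, and the small G are all arbitrary illustration choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# toy model (illustrative, not the authors'): y_i = x_i + eps_i,
# x_i ~ N(theta, 1), eps_i ~ N(0, 1), so p(y_i | theta) = N(y_i; theta, 2)
n, N, G = 200, 10, 20            # data size, MC samples per datum, blocks
theta_true = 1.0
y = theta_true + rng.normal(size=n) + rng.normal(size=n)

def loglik_hat(theta, u):
    """Unbiased IS estimate of the likelihood, driven by fixed normals u (n x N)."""
    x = theta + u                                # latents x ~ N(theta, 1) via u
    w = np.exp(-0.5 * (y[:, None] - x) ** 2) / np.sqrt(2 * np.pi)
    return np.sum(np.log(np.mean(w, axis=1)))    # sum of log per-datum averages

blocks = np.array_split(np.arange(n), G)         # partition the auxiliaries
theta, u = 0.0, rng.normal(size=(n, N))
ll = loglik_hat(theta, u)
chain = []
for it in range(20000):
    # propose a new theta and refresh the auxiliaries of ONE random block only
    theta_prop = theta + 0.1 * rng.normal()
    k = rng.integers(G)
    u_prop = u.copy()
    u_prop[blocks[k]] = rng.normal(size=(len(blocks[k]), N))
    ll_prop = loglik_hat(theta_prop, u_prop)
    # flat prior on theta, symmetric walk; the N(0,1) density of the refreshed
    # block cancels against its proposal, leaving the likelihood-estimate ratio
    if np.log(rng.uniform()) < ll_prop - ll:
        theta, u, ll = theta_prop, u_prop, ll_prop
    chain.append(theta)

print("posterior mean ~", np.mean(chain[5000:]))
```

The point of the sketch is the correlation mentioned above: since only one block of u changes per move, the numerator and denominator estimates share most of their randomness, which tames the variance of their ratio at the price of auxiliary variates that evolve slowly along the chain.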
