http://www-stat.stanford.edu/~owen/students/SuChenThesis.pdf

You will find some very nice mathematical writing there.

You can combine QMC with importance sampling when only a small part of the space is relevant. [It works best if you use transformations of a finite number of uniform variables. Acceptance-rejection can be accommodated awkwardly at best.] More generally, QMC is applied to the uniform variables from which the non-uniform ones are computed. So, for example, if the target distribution is a spiky Gaussian, the QMC points will be equidistributed with respect to that spike.
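Here is a minimal sketch of that idea (my own illustration, not from the thesis): estimating the small tail probability P(X > 4) for X ~ N(0,1). The QMC points are 1-D van der Corput uniforms; each one is pushed through the inverse CDF of a proposal N(4, 1) centred on the relevant region, and the draws are reweighted by the density ratio.

```python
# QMC + importance sampling sketch: P(X > 4), X ~ N(0,1).
# The proposal N(4, 1) covers the "spike" (the rare tail region).
from statistics import NormalDist

def van_der_corput(i, base=2):
    """Radical inverse of i in the given base: a low-discrepancy point in (0,1)."""
    u, denom = 0.0, 1.0
    while i > 0:
        denom *= base
        i, rem = divmod(i, base)
        u += rem / denom
    return u

def tail_prob_qmc(n=2**14, threshold=4.0):
    target = NormalDist()                   # N(0, 1)
    proposal = NormalDist(mu=threshold)     # N(4, 1), centred on the tail
    total = 0.0
    for i in range(1, n + 1):               # skip i=0: u=0 is outside inv_cdf's domain
        u = van_der_corput(i)
        x = proposal.inv_cdf(u)             # non-uniform draw from the QMC uniform
        if x > threshold:
            # importance weight: target density / proposal density
            total += target.pdf(x) / proposal.pdf(x)
    return total / n

print(tail_prob_qmc())  # close to 1 - Phi(4), about 3.17e-5
```

A plain Monte Carlo estimate of the same probability would need on the order of millions of draws just to see a handful of hits past 4; here the proposal puts about half the QMC points in the tail.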

In finite-dimensional quadrature, QMC works best for smooth integrands. We see similar things in MCMC: big gains for some Gibbs samplers, smaller gains for Metropolis-Hastings.
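A toy comparison (my own sketch, not from the post) of the smoothness point: integrating f(u) = exp(u) on [0,1], whose exact value is e - 1. On a smooth 1-D integrand like this, QMC points typically beat the ~n^(-1/2) Monte Carlo rate by a wide margin.

```python
# Plain MC vs. QMC (van der Corput points) on a smooth 1-D integrand.
import math
import random

def van_der_corput(i, base=2):
    """Radical inverse of i in the given base: a low-discrepancy point in (0,1)."""
    u, denom = 0.0, 1.0
    while i > 0:
        denom *= base
        i, rem = divmod(i, base)
        u += rem / denom
    return u

def estimate(points):
    """Equal-weight quadrature estimate of the integral of exp on [0,1]."""
    return sum(math.exp(u) for u in points) / len(points)

n = 4096
exact = math.e - 1.0
rng = random.Random(0)
mc_err = abs(estimate([rng.random() for _ in range(n)]) - exact)
qmc_err = abs(estimate([van_der_corput(i) for i in range(1, n + 1)]) - exact)
print(f"MC error:  {mc_err:.2e}")
print(f"QMC error: {qmc_err:.2e}")
```

For a discontinuous or spiky integrand the gap shrinks, which is one intuition for why MCQMC gains depend on the sampler: a Gibbs update can be a smooth function of its uniforms, while an accept/reject step in Metropolis-Hastings is not.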

For some specialized settings (time series simulations), Su Chen was able to prove that MCQMC attains a better convergence rate than plain MCMC.
