lifting – a nonreversible MCMC algorithm

Today, I took a look at a recently arXived paper posted in physics, lifting – A nonreversible MCMC algorithm by Marija Vucelja, but I simply could not understand the concept of lifting, presumably because of the physics perspective, and also because the paper is mostly a review referring to the author’s earlier work. The notion of lifting is to create a duplicate of a regular Markov chain with a given stationary distribution, so as to cancel reversibility and hence speed up the exploration of the state space. The central innovation in the paper seems to lie in imposing a lifted reversibility, which means using the reverse dynamics on the lifted version of the chain, that is, the dual proposal

\tilde{q}(x,y)=\dfrac{\pi(y)q(y,x)}{\pi(x)}\,.
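Note that this dual proposal is a genuine Markov kernel exactly when π is stationary for q, since its rows then sum to (πq)(x)/π(x)=1. A quick numerical check on an arbitrary toy kernel (my own notation, not the paper’s):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
# arbitrary positive transition kernel q on n states
q = rng.random((n, n))
q /= q.sum(axis=1, keepdims=True)

# stationary distribution of q: leading left eigenvector, normalised
w, v = np.linalg.eig(q.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi /= pi.sum()

# dual (time-reversed) proposal: q~(x,y) = pi(y) q(y,x) / pi(x)
q_tilde = pi[None, :] * q.T / pi[:, None]

print(np.allclose(q_tilde.sum(axis=1), 1.0))  # True: q~ is a proper kernel
```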

However, the paper does not make explicit how the resulting Markov transition matrix on the augmented space is derived from the original matrix. I now realise my description is most likely giving the impression of two coupled Markov chains, which is not the case: the new setting is made of a duplicated sample space, in the sense of Nummelin’s split chain (but without the specific meaning for the binary variable found in Nummelin!). In the case of the 1-d Ising model, implementing the method means for instance picking a site at random, proposing to change its spin value through a Metropolis acceptance step and then, if the proposal is rejected, possibly switching to the corresponding value in the dual part of the state. Given the elementary nature of the original proposal, I fail to see where the improvement can occur… I’d be most interested in seeing a version of this lifting in a realistic statistical setting.
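To make the duplicated-space idea concrete, here is a minimal sketch, not of the paper’s Ising construction, but of the classic one-dimensional lifted Metropolis chain (in the Diaconis–Holmes–Neal / Gustafson spirit) on a toy discrete target: the state space is doubled with a direction bit, each copy only proposes moves in its own direction with the usual Metropolis acceptance, and a rejection switches to the dual copy instead of staying put. All names and the target are my own choices for illustration:

```python
import numpy as np

n = 10
rng = np.random.default_rng(0)
pi = rng.random(n)
pi /= pi.sum()                          # arbitrary target on {0,...,n-1}

# Lifted transition matrix on 2n states: indices 0..n-1 form the "+"
# copy (only proposes x -> x+1), indices n..2n-1 the "-" copy
# (only proposes x -> x-1).
P = np.zeros((2 * n, 2 * n))
for x in range(n):
    a_plus = min(1.0, pi[x + 1] / pi[x]) if x < n - 1 else 0.0
    a_minus = min(1.0, pi[x - 1] / pi[x]) if x > 0 else 0.0
    if x < n - 1:
        P[x, x + 1] = a_plus            # (x,+) -> (x+1,+): accepted move
    P[x, n + x] = 1.0 - a_plus          # rejection: flip to the "-" copy
    if x > 0:
        P[n + x, n + x - 1] = a_minus   # (x,-) -> (x-1,-): accepted move
    P[n + x, x] = 1.0 - a_minus         # rejection: flip to the "+" copy

# The lifted chain is nonreversible but keeps pi x Uniform{+,-} invariant.
mu = np.concatenate([pi, pi]) / 2.0
print(np.allclose(mu @ P, mu))          # True: mu is stationary
flow = np.diag(pi) @ P[:n, :n]          # probability flow within the "+" copy
print(np.allclose(flow, flow.T))        # False: one-directional, so nonreversible
```

The point of the construction is visible in the last check: within each copy all accepted moves go the same way, so the chain sweeps across the state space ballistically rather than diffusively, while rejections merely hand over to the reverse dynamics on the dual copy.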

One Response to “lifting – a nonreversible MCMC algorithm”

  1. Although it isn’t statistical, the paper “Liftings of Tree-Structured Markov Chains” by Hayes and Sinclair (2010) gives what they claim to be the first use of lifting to improve the mixing of a chain that is actually used in practice; certainly it is the only use that I am aware of. The basic technique in the paper is to use coarse information about the target distribution in order to choose a good lift of the Markov chain. Although the state space is simple, the target distribution can be quite general and the resulting lifts can be quite complicated and ‘non-uniform’. The same considerations can be used for statistical examples, if you have a ‘good’ approximation of the posterior, though I don’t have any sense as to how this approach compares to other types of preprocessing for MCMC.
