## maximal couplings of the Metropolis-Hastings algorithm

Posted in Statistics, University life on November 17, 2020 by xi'an

As a sequel to their JRSS B paper, John O’Leary, Guanyang Wang, and [my friend, co-author and former student!] Pierre E. Jacob have recently posted a follow-up paper on maximal coupling for Metropolis-Hastings algorithms, where maximal is to be understood in terms of the largest possible probability for the coupled chains to be equal, according to the bound set by the coupling inequality. It made me realise that there is a heap of very recent work in this area.

A question that came up when reading the paper with our PhD students is whether or not the coupled chains stay identical after meeting once. When the chains face two different targets this seems impossible to sustain, and indeed Lemma 2 seems to show that they do not. A strong lemma, which does not [need to] state what happens outside the diagonal Δ.

One of the essential tricks is to optimise several kinds of maximal coupling, including one for the Bernoulli-like choice of whether or not to move, as given on p.3.
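For the Bernoulli-like move indicator, a maximal coupling is achieved by reusing a single uniform draw for both chains. A minimal sketch (my own illustration, not the authors' exact construction):

```python
import random

def coupled_bernoullis(p1, p2, rng=random.random):
    """Draw (B1, B2) with B1 ~ Bernoulli(p1) and B2 ~ Bernoulli(p2),
    maximising P(B1 = B2) by sharing one uniform draw."""
    u = rng()
    return u < p1, u < p2

# The coupling is maximal: P(B1 != B2) = |p1 - p2|, which is exactly
# the total variation distance between the two Bernoulli laws.
```

Any other joint distribution with the same marginals makes the two indicators disagree at least as often, by the coupling inequality.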

Algorithm 1 came as a novelty to me, as it first seemed (to me!) that the two chains might never meet, but this was before I read the small print that the transition (proposal) kernel is maximally coupled with itself. Algorithm 2, meanwhile, may correspond to the earliest example of Metropolis-Hastings coupling I have seen, namely in 1999 in Crete, in connection with a talk by Laird Breyer and Gareth Roberts at a workshop of our ESSS network. As explained by the authors, this solution is not always a maximal coupling, for the reason that

min(q¹,q²) min(α¹,α²) ≤ min(q¹α¹, q²α²)

(with q standing for the transition kernel and α for the acceptance probability). Lemma 1 is interesting in that it describes the probability to un-meet (!) as the area between one of the move densities and the minimum of the two.
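The inequality holds pointwise, since each product qⁱαⁱ is at least min(q¹,q²)·min(α¹,α²); coupling the kernels and the acceptances separately therefore meets no more often than coupling the products directly. A toy numerical check (values are arbitrary, purely illustrative):

```python
import random

random.seed(1)
for _ in range(1000):
    q1, q2 = random.random(), random.random()  # kernel densities at a point
    a1, a2 = random.random(), random.random()  # acceptance probabilities
    # separate couplings meet with rate min(q)min(a), jointly with min(qa)
    assert min(q1, q2) * min(a1, a2) <= min(q1 * a1, q2 * a2) + 1e-12
```

The gap between the two sides, integrated over the state space, is exactly what Algorithm 2 loses relative to a maximal coupling.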

The first solution is to couple by plain accept-reject, with the first chain providing the proposed value and, when it is rejected [i.e., not in C], with generation from the remainder or residual of the second target, in a form of completed acceptance-rejection (accepting when above rather than below, i.e., in A or A’). This can be shown to be a maximal coupling. Another coupling, using reflection residuals, works better but requires some spherical structure in the kernel. A further coupling on the acceptance of the Metropolis-Hastings move seems to bring an extra degree of improvement.
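This accept-reject construction is easy to sketch for two unit-variance Gaussian proposals (function names and the Gaussian choice are mine, for illustration only):

```python
import math
import random

def npdf(x, mu, sigma=1.0):
    """Gaussian density at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def maximal_coupling(mu_p, mu_q, rng=random):
    """Sample (X, Y) with X ~ N(mu_p,1), Y ~ N(mu_q,1) such that
    P(X = Y) equals the integral of min(p, q), i.e. the coupling is maximal."""
    x = rng.gauss(mu_p, 1.0)
    # accept x as the common value with probability min(1, q(x)/p(x))
    if rng.uniform(0.0, npdf(x, mu_p)) <= npdf(x, mu_q):
        return x, x
    # otherwise draw Y from the residual of q, i.e. the part of q above min(p, q):
    # accept when the vertical draw lands ABOVE p, completing the accept-reject step
    while True:
        y = rng.gauss(mu_q, 1.0)
        if rng.uniform(0.0, npdf(y, mu_q)) > npdf(y, mu_p):
            return x, y
```

The first accept step succeeds with probability ∫min(p,q), and the residual loop leaves the marginal of Y untouched, so the meeting probability attains the coupling-inequality bound.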

In the introduction, the alternatives for the acceptance probability α(·,·), e.g. Metropolis-Hastings versus Barker, are mentioned, but would it make a difference to the preferred maximal coupling to use one rather than the other?

A further comment is that, in larger dimensions, I mean larger than one!, a Gibbsic form of coupling could be considered, which would certainly decrease the coupling probability but may still speed up the overall convergence by coupling more often. As the authors note, “maximality is sometimes less important than other properties of a coupling, such as the contraction behavior when a meeting does not occur.” (p.8)
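To make the comment concrete, one could couple a d-dimensional proposal coordinate by coordinate, maximally in each coordinate. This is purely my illustration of the suggestion, not anything from the paper:

```python
import math
import random

def npdf(x, mu):
    """Standard-deviation-one Gaussian density at x."""
    return math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2 * math.pi)

def coupled_coordinate(mu1, mu2, rng=random):
    """One-dimensional maximal coupling of N(mu1,1) and N(mu2,1)
    via the accept-reject construction."""
    x = rng.gauss(mu1, 1.0)
    if rng.uniform(0.0, npdf(x, mu1)) <= npdf(x, mu2):
        return x, x
    while True:
        y = rng.gauss(mu2, 1.0)
        if rng.uniform(0.0, npdf(y, mu2)) > npdf(y, mu1):
            return x, y

def gibbs_style_coupling(mu1, mu2, rng=random):
    """Couple two d-dimensional Gaussian draws coordinate by coordinate.
    A full meeting requires every coordinate to meet simultaneously, so the
    joint meeting probability is the product of the one-dimensional ones --
    smaller than for a joint maximal coupling, but coordinates that do meet
    may still pull the chains closer together."""
    pairs = [coupled_coordinate(a, b, rng) for a, b in zip(mu1, mu2)]
    return [p[0] for p in pairs], [p[1] for p in pairs]
```

The trade-off is visible in the product form: each extra dimension multiplies the full-meeting probability by a factor below one, which is precisely why contraction between meetings can matter more than maximality.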

As a final aside, I noted that Vaserstein is not a typo, as Leonid Vaseršteĭn is a Russian-American mathematician, currently at Penn State.

## To Susie [by Kerrie Mengersen]

Posted in pictures, Statistics, University life on August 21, 2014 by xi'an

[Here is a poem written by my friend Kerrie for the last ISBA cabaret in Cancun, to Susie who could not make it to a Valencia meeting for the first time… Along with a picture of Susie, Alicia and Jennifer taking part in another ISBA cabaret in Knossos, Crete, in 2000.]

This is a parody of a classic Australian bush poem, ‘The Man from Snowy River’, that talks of an amazing horseman in the rugged mountain bush of Australia, who out-performed the ‘cracks’ and became a legend. That’s how I think of Susie, so this very bad poem comes with a big thanks for being such an inspiration, a great colleague and a good friend.

There was movement in the stats world as the emails caught alight
For the cult from Reverend Bayes had got away
And had joined the ‘ISBA’ forces, and were calling for a fight
So all the cracks had gathered to the fray.

All the noted statisticians from the countries near and far
For the Bayesians love their meetings where the sandy beaches are
And the Fishers snuffed the battle with delight.

There were Jim and Ed and Robert, who were ‘fathers of the Bayes’
They were known as the whiskey drinking crowd
But they’d invented all the theory in those Valencia days
Yes, they were smart, but oh boy were they loud!

And Jose M Bernardo came down to lend a hand
A finer Bayesian never wrote a prior
And Mike West, Duke of Bayesians, also joined the band
And brought down all the graduates he could hire

Sonia and Maria strapped their laptops to the cause
And Anto, Chris and Peter ran – in thongs!
Sirs Adrian and David came with armour and a horse
While Brad and Gareth murdered battle songs

And one was there, a Spaniard, blonde and fierce and proud
With a passion for statistics and for fun
She’d been there with the founders of the nouveau Bayesian crowd
And kept those Fisher stats folk on the run

But Jim’s subjective prior made him doubt her power to fight
Mike Goldstein said, ‘That girl will never do,
In the heat of battle, deary, you just don’t have the might
This stoush will be too rough for such as you.’

But Berger and Bernardo came to Susie’s side
We think we ought to let her in, they said
For we warrant she’ll be with us when the blood has fairly dried
For Susie is Valencia born and bred.

She did her Bayesian training in the classic Spanish way
Where the stats is twice as hard and twice as rough
And she knows nonparametrics, which is useful in a fray
She’s soft outside, but inside, man she’s tough!

She went. They found those Fisher stats folk sunning on the beach
And as they grabbed their laptops from the sand
Jim Berger muttered fiercely, ‘right, twist any head you reach
We cannot let those Fish get out of hand.’

Alicia, grab a Dirichlet and break them with a stick
Chris, it’s easy, just like ABC
And Sylvia, a mixture model ought to do the trick
But just you leave that Ronnie up to me.

Jose battled them with inference and curdled Neyman’s blood
And posteriors lined the beaches like sandbags for a flood
And Jim threw whiskey bottles as they fled.

And when the Bayesians and the Fishers were washed up on the sand
The fight was almost judged to be a tie
But it was Susie who kept going, who led the final charge
For she didn’t want objective Bayes to die

She set the beach on fire as she galloped through the fray
Hurling P and F tests through the foam
‘til the Fishers raised surrender and called the fight a day
And shut their laptops down and sailed for home.

And now at ISBA meetings where the Bayesians spend their days
To laugh and learn and share a drink or two
A glass is always toasted: to Susie, Queen of Bayes
And the cheering echoes loudly round the crew.

She will be remembered for setting Bayesian stats on fire
For her contributions to the field are long
And her passion and her laughter will continue to inspire
The Bayesian from Valencia lives on!