## expectation-propagation from Les Houches

As CHANCE book editor, I received the other day from Oxford University Press the proceedings of an École de Physique des Houches session on Statistical Physics, Optimisation, Inference, and Message-Passing Algorithms that took place there from September 30 to October 11, 2013. While it is mostly unrelated to Statistics, and since Igor Carron already reviewed the book more than a year ago, I skimmed through the few chapters connected to my interests, from Devavrat Shah's chapter on graphical models and belief propagation, to Andrea Montanari's denoising and sparse regression, including LASSO, and only read in some detail Manfred Opper's expectation propagation chapter. This chapter made me realise (or re-realise, as I had presumably forgotten an earlier explanation!) that expectation propagation can be seen as a sort of variational approximation that produces, by a sequence of iterations, the distribution within a certain parametric (exponential) family that is the closest to the distribution of interest. By writing the Kullback-Leibler divergence the opposite way from the usual variational approximation, the solution equates the expectation of the natural sufficient statistic under both models… Another interesting aspect of this chapter is the connection with estimating normalising constants. (I noticed a slight typo on p.269 in the final form of the Kullback approximation q() to p().)
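To make the moment-matching property concrete, here is a minimal numerical sketch (my own toy example, not from the book): when minimising KL(p‖q) over a Gaussian family, the optimum q matches the expected sufficient statistics (x, x²) of p, so perturbing the matched moments can only increase the divergence.

```python
import numpy as np

# Toy target p: a two-component Gaussian mixture (hypothetical example),
# evaluated on a fine grid for numerical integration.
w = np.array([0.3, 0.7])
mus = np.array([-2.0, 1.0])
sigmas = np.array([0.5, 1.0])

xs = np.linspace(-8, 8, 4001)
dx = xs[1] - xs[0]
p = (w[:, None]
     * np.exp(-(xs - mus[:, None]) ** 2 / (2 * sigmas[:, None] ** 2))
     / (sigmas[:, None] * np.sqrt(2 * np.pi))).sum(axis=0)

# Moment matching: the Gaussian q minimising KL(p||q) shares the
# expectations of the natural sufficient statistics x and x^2 with p.
m1 = np.sum(xs * p) * dx
m2 = np.sum(xs ** 2 * p) * dx
mu_q, var_q = m1, m2 - m1 ** 2

def kl_p_q(mu, var):
    """Numerical KL(p||q) for a candidate Gaussian q = N(mu, var)."""
    q = np.exp(-(xs - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask])) * dx

# Check: moving away from the matched moments only increases KL(p||q).
base = kl_p_q(mu_q, var_q)
assert base <= kl_p_q(mu_q + 0.3, var_q)
assert base <= kl_p_q(mu_q, var_q * 1.3)
```

This is the reverse of the usual variational objective KL(q‖p), whose Gaussian minimiser would instead be mode-seeking rather than moment-matching.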

February 3, 2016 at 11:22 am

There is a great review of the topic of variational approximations by Wainwright and Jordan. They describe a unified variational framework, give connections to exponential families, and show that belief propagation and EP are special cases within that framework.

https://www.eecs.berkeley.edu/~wainwrig/Papers/WaiJor08_FTML.pdf

We had a reading group at Gatsby on this; here are some slides (from March 23, 2015):

http://wittawat.com/mljc/doku.php

February 3, 2016 at 5:33 pm

Thank you Heiko!