mixture post-partum

Now that the mixtures estimation workshop is over, I can sit and reflect upon it. It has been an exciting three days and, without taking any responsibility for it (!), I think the quality of both the talks and the exchanges has been fantastic! I obviously cannot go into detail over every talk in the programme, but I (sincerely!) fail to spot a single talk that did not bring (me) some new light on mixture (or latent variable) modelling. For instance, even though we had and still have somewhat divergent views on the topic, I enjoyed immensely the opening session on label switching with John Geweke and Sylvia Frühwirth-Schnatter. I actually do not think the divergence is immense: while I consider that most MCMC algorithms fail to converge when they do not reproduce the exchangeability in the posterior, and that this lack of convergence is clearly illustrated by the numerical bias found in Chib's approximation of the evidence, I agree that, in a rather common occurrence, a simple average over all permutations brings a satisfactory solution. However, the difficulty is in knowing when this occurs: a perfectly switching algorithm is a sure bet for convergence, while a non-switching algorithm is open to interpretation…

Similarly, the large range of talks on non-parametric Bayes (a.k.a. machine learning!) brought to my attention new directions in the field, first and foremost thanks to a broad overview provided (twice) by Michael Jordan. I also got a better appreciation of mixture-of-experts models and of their connection with mixtures of regressions, along with their high potential for social network modelling. The more computational talks were equally informative, including Olivier Cappé's work on on-line EM, which I was not aware of despite our geographic and thematic proximity!
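To illustrate the permutation correction discussed in the label-switching session, here is a minimal (and entirely hypothetical) sketch: a "stuck" MCMC output for a k=2 mixture whose sampler never switches labels, and a `symmetrise` helper that averages any label-dependent functional over all k! relabellings, the same symmetrisation that corrects estimators such as Chib's approximation of the evidence. The draws, function names, and numbers are mine, not from any talk at the workshop.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "stuck" MCMC output for a k=2 mixture: the draws of the
# component means never switch labels (component 1 stays near 0,
# component 2 stays near 3), so the sampler ignores the symmetric
# copy of the posterior mode.
draws = np.column_stack([rng.normal(0.0, 0.1, 1000),
                         rng.normal(3.0, 0.1, 1000)])

def symmetrise(f, draws):
    """Average a label-dependent functional f over all k! relabellings
    of the draws, restoring the exchangeability the sampler missed."""
    k = draws.shape[1]
    perms = itertools.permutations(range(k))
    return np.mean([f(draws[:, p]) for p in perms], axis=0)

# The naive estimate of E[mu_1] depends on the arbitrary labelling;
# the symmetrised one recovers the exchangeable posterior mean,
# (0 + 3) / 2 = 1.5 in this toy example.
naive = draws[:, 0].mean()
corrected = symmetrise(lambda d: d[:, 0].mean(), draws)
```

Of course, the k! cost of the exact average is only bearable for small k, which is part of why knowing *when* the correction is needed matters in practice.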
I had heard Paul Fearnhead's talk on sequential Monte Carlo at a previous conference, but it generated new ideas about testing the degeneracy effect and seeing how robust, if at all, the aggregation provided by the factorisation through a sufficient statistic is. (With potential applications to the related particle learning approach, which should suffer from the same degeneracy.)

This meeting thus leaves me with post-partum excitement, rather than depression, and with the feeling that it will be as profitable to me (and hopefully to the other participants) as were the first two meetings in Aussois (1995) and Edinburgh (2001). Looking forward to the 2015 edition in a mountain near you?!
