Archive for Bayesian Core

label switching by optimal transport: Wasserstein to the rescue

Posted in Books, Statistics, Travel on November 28, 2019 by xi'an

A new arXival by Pierre Monteiller et al. on resolving label switching by optimal transport. To appear in NeurIPS 2019, next month (where I will be, but extra muros, as I have not registered for the conference). Among other things, the paper was inspired by an answer of mine on X validated, presumably a première (and a dernière?!). Rather than picketing [in the likely unpleasant weather] on the pavement outside the conference centre, here are my raw reactions to the proposal made in the paper. (Usual disclaimer: I was not involved in the review of this paper.)

“Previous methods such as the invariant losses of Celeux et al. (2000) and pivot alignments of Marin et al. (2005) do not identify modes in a principled manner.”

Unprincipled, me?! We did not aim at identifying all modes but only one of them, since the posterior distribution is invariant under reparameterisation. Without any bad feeling (!), I still maintain my position that using a permutation invariant loss function is a most principled and Bayesian approach towards a proper resolution of the issue. Even though figuring out the resulting Bayes estimate may prove tricky.

The paper thus adopts a different approach, towards giving a manageable meaning to the average of the mixture distributions over all permutations, not in a linear Euclidean sense but thanks to a Wasserstein barycentre. Which indeed allows for an averaged mixture density, although a point-by-point estimate that does not require switching to occur at all was already proposed in earlier papers of ours. Including the Bayesian Core. As shown above. What was first unclear to me is how necessary the Wasserstein formalism proves to be in this context. In fact, the major difference with the above picture is that the estimated barycentre is a mixture with the same number of components. Computing time? Bayesian estimate?
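By way of contrast with the Wasserstein barycentre, here is a minimal Python sketch (mine, not the paper's, and with hypothetical names) of the pivot alignment of Marin et al. (2005) quoted above: each MCMC draw of the component means is relabelled by the permutation bringing it closest to a reference "pivot" draw, e.g. the MAP draw.

```python
# Hypothetical sketch of pivot relabelling (in the spirit of Marin et al., 2005):
# each MCMC draw of the K component means is permuted so as to be closest
# to a reference ("pivot") draw, undoing label switching.
import itertools
import numpy as np

def pivot_relabel(draws, pivot):
    """draws: (T, K) array of component means; pivot: (K,) reference draw."""
    K = draws.shape[1]
    out = np.empty_like(draws)
    for t, theta in enumerate(draws):
        # pick the permutation of labels minimising distance to the pivot
        best = min(itertools.permutations(range(K)),
                   key=lambda p: np.sum((theta[list(p)] - pivot) ** 2))
        out[t] = theta[list(best)]
    return out

# toy example: three draws from a 2-component chain, the middle one switched
draws = np.array([[0.1, 3.0], [2.9, 0.0], [0.2, 3.1]])
pivot = np.array([0.0, 3.0])
print(pivot_relabel(draws, pivot))  # middle row comes back as [0.0, 2.9]
```

The exhaustive search over permutations is only viable for small K, which is precisely the practical impossibility alluded to below and handled by importance sampling in our own work.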

Green’s approach to the problem via a point process representation [briefly mentioned on page 6] of the mixture itself, as for instance presented in our mixture analysis handbook, should have been considered. As well as issues about Bayes factors examined in Gelman et al. (2003) and our more recent work with Kate Jeong Eun Lee. Where the practical impossibility of considering all possible permutations is processed by importance sampling.

An idle thought that came to me while reading this paper (in Seoul) was that a more challenging problem would be to face a model invariant under the action of a group with only a subset of known elements of that group. Or simply too many elements in the group. In which case averaging over the orbit would become an issue.

mea culpa!

Posted in Books, Kids, R, Statistics, University life on October 9, 2017 by xi'an

An entry about our Bayesian Essentials book on X validated alerted me to a typo in the derivation of the Gaussian posterior…! When deriving the posterior (which was left as an exercise in Bayesian Core), I just forgot the term expressing the divergence between the prior mean and the sample mean. Mea culpa!!!
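For completeness, in the standard conjugate setting $x_1,\ldots,x_n\sim\mathcal N(\mu,\sigma^2)$ with $\sigma^2$ known and prior $\mu\sim\mathcal N(\mu_0,\tau^2)$, completing the square gives
\begin{align*}
\pi(\mu\mid x) &\propto \exp\left\{-\frac{n(\bar x-\mu)^2}{2\sigma^2}\right\}\exp\left\{-\frac{(\mu-\mu_0)^2}{2\tau^2}\right\}\\
&\propto \exp\left\{-\frac{\sigma^2+n\tau^2}{2\sigma^2\tau^2}\left(\mu-\frac{\sigma^2\mu_0+n\tau^2\bar x}{\sigma^2+n\tau^2}\right)^2\right\},
\end{align*}
the factor dropped along the way being $\exp\{-n(\bar x-\mu_0)^2/2(\sigma^2+n\tau^2)\}$, the divergence term between prior mean and sample mean, irrelevant for the posterior of $\mu$ but essential for the marginal likelihood.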

a typo that went under the radar

Posted in Books, R, Statistics, University life on January 25, 2017 by xi'an

A chance occurrence on X validated: a question about an incomprehensible formula for Bayesian model choice which, most unfortunately!, appeared in Bayesian Essentials with R! Eeech! It looks like one line in our LaTeX file got erased and the likelihood part in the denominator altogether vanished. Apologies to all readers confused by this nonsensical formula!
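For readers tripped up by the truncated formula, the generic Bayes factor between models $\mathfrak M_1$ and $\mathfrak M_2$ does carry the likelihood in both numerator and denominator integrals,
$$
B_{12}(x)=\frac{\displaystyle\int_{\Theta_1} f_1(x\mid\theta_1)\,\pi_1(\theta_1)\,\mathrm d\theta_1}{\displaystyle\int_{\Theta_2} f_2(x\mid\theta_2)\,\pi_2(\theta_2)\,\mathrm d\theta_2},
$$
which is the standard form the corrected line should reduce to (the exact expression in the book being left to the errata).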

Bayesian Essentials with R [book review]

Posted in Books, R, Statistics, University life on July 28, 2016 by xi'an

[A review of Bayesian Essentials that appeared in Technometrics two weeks ago, with the first author being rechristened Jean-Michael!]

“Overall this book is a very helpful and useful introduction to Bayesian methods of data analysis. I found the use of R, the code in the book, and the companion R package, bayess, to be helpful to those who want to begin using Bayesian methods in data analysis. One topic that I would like to see added is the use of Bayesian methods in change point problems, a topic that we found useful in a recent article and which could be added to the time series chapter. Overall this is a solid book and well worth considering by its intended audience.”
David E. BOOTH
Kent State University

solution manual for Bayesian Essentials with R

Posted in Books, Kids, Statistics, University life on March 18, 2015 by xi'an

The solution manual to our Bayesian Essentials with R has just been arXived. If I link this completion with the publication date of the book itself, it sure took an unreasonable time to come out, sadly with no obvious reason and even less justification for the delay… Given the large overlap with the solution manual of the previous edition, Bayesian Core, this version should have been completed much much earlier but, paradoxically if in-line with the lengthy completion of the book itself, this previous manual is one of the causes for the delay, as we thought the overlap allowed for self-study readers to check some of the exercises. Prodded by Hannah Bracken from Springer-Verlag, and unable to hire an assistant towards this task, I eventually decided to spend the few days required to clean up this solution manual, with the unintentional help from my sorry excuse for an Internet provider who accidentally cut my home connection for a whole week so far…!

In the course of writing solutions, I stumbled upon one inexplicably worded exercise about the Lehmer–Schur algorithm for testing stationarity, an exercise that I had to rewrite from scratch. Apologies to any reader of Bayesian Essentials with R getting stuck on that exercise!!!
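For readers stuck there, a hedged Python sketch of the check the exercise is after: rather than running the Lehmer–Schur recursion itself, one can locate the roots of the AR($p$) characteristic polynomial $1-\varphi_1 z-\cdots-\varphi_p z^p$ numerically and verify they all lie outside the unit circle (this root-finding shortcut is my substitution, not the book's solution).

```python
# Sketch: test AR(p) stationarity by checking that all roots of the
# characteristic polynomial 1 - phi_1 z - ... - phi_p z^p lie strictly
# outside the unit circle (equivalent to the Lehmer-Schur style criterion).
import numpy as np

def is_stationary(phi):
    """phi: sequence of AR coefficients (phi_1, ..., phi_p)."""
    # numpy.roots expects coefficients from the highest degree down:
    # -phi_p z^p - ... - phi_1 z + 1
    coefs = [-c for c in reversed(phi)] + [1.0]
    roots = np.roots(coefs)
    return bool(np.all(np.abs(roots) > 1.0))

print(is_stationary([0.5]))       # AR(1) with |phi| < 1: stationary
print(is_stationary([1.2]))       # explosive AR(1): not stationary
print(is_stationary([0.5, 0.3]))  # AR(2) inside the stationarity triangle
```

For an AR(1) the criterion collapses to $|\varphi_1|<1$, which the first two calls illustrate.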