comments on Watson and Holmes


“The world is full of obvious things which nobody by any chance ever observes.” The Hound of the Baskervilles

In connection with the forthcoming publication of James Watson’s and Chris Holmes’ Approximating models and robust decisions in Statistical Science, Judith Rousseau and I wrote a discussion of the paper, which was arXived yesterday.

“Overall, we consider that the calibration of the Kullback-Leibler divergence remains an open problem.” (p.18)

While the paper connects with earlier ones by Chris and coauthors, and possibly despite the overall critical tone of our comments(!), I really appreciate the renewed interest in robustness advocated in this paper. I was going to write Bayesian robustness, but to distinguish it from the perspective adopted in the 1990s, where robustness was mostly about the prior, I would rather call this a Bayesian approach to model robustness from a decisional perspective. It comes with definite innovations, like considering the impact of posterior uncertainty over the decision space, with uncertainty defined e.g. in terms of Kullback-Leibler neighbourhoods, or placing a Dirichlet process distribution on the posterior. This may step out of the standard Bayesian approach, but it remains of definite interest! (And note that this discussion of ours [reluctantly!] refrained from capitalising on the names of the authors to build easy puns linked with the most Bayesian of all detectives!)
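For concreteness, here is a minimal sketch (mine, not from the paper) of what a Kullback-Leibler neighbourhood amounts to in the Gaussian case, where the divergence has a closed form: a KL ball of radius ε around a reference density p₀ collects all q with KL(q‖p₀) ≤ ε. The example also shows the asymmetry of the divergence, one reason its calibration is delicate.

```python
from math import log

def kl_normal(mu1, s1, mu2, s2):
    """Closed-form KL( N(mu1, s1^2) || N(mu2, s2^2) ), in nats."""
    return log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5

# A KL neighbourhood of radius eps around N(0,1) contains every q
# with kl(q || N(0,1)) <= eps; membership is cheap to check here:
eps = 0.1
print(kl_normal(0.1, 1.0, 0.0, 1.0) <= eps)   # a slightly shifted normal is inside

# KL is not symmetric, so "distance eps" has no canonical scale:
print(kl_normal(0.0, 1.0, 0.0, 2.0))  # ~0.318 nats
print(kl_normal(0.0, 2.0, 0.0, 1.0))  # ~0.807 nats, in the other direction
```

The two final values differ noticeably, which is one way of seeing why there is no intuitive universal calibration of a KL radius ε.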

2 Responses to “comments on Watson and Holmes”

  1. Dan Simpson Says:

    This is an interesting point about the “big information” limit. My intuition would be that, in that case, KL neighbourhoods are perfect, as the posterior converges to the nearest point in KL divergence from the generating mechanism. So they’re exploring neighbourhoods of the right shape asymptotically. But you can never know if you’re going far enough. And one of the big problems with anything based on KL balls is that they don’t have an intuitive scale. We got around that in a very boring case by imposing a notion of scale in a better parameterisation and transforming back, but that’s probably not an extendable idea.

    Btw – arXiv’s LaTeX engine appears to have eaten the reference to our paper (thanks!) and left it out of the bibliography…

    • Missing reference: not even arXiv’s fault! I forgot to switch from plain text Simpson et al. (2015) to \cite{simpson:etal:2015}… Sorry!
