MCMSki IV [day 1]
The first day of MCMSki IV went by rather smoothly, I do believe, as most speakers were there, with just a dozen participants missing. And no one broke a limb or vanished over a cliff when skiing. The only trouble I had was picking between the parallel sessions, a definite drawback of the format I had supported from the start… Any proposal for a guest post from participants is welcome!!!
Chris Holmes gave an exciting first plenary talk, in the spirit of the “substitute” prior talk Stephen Walker had given in London for Bayes 250. This time, Chris talked about robustifying Bayesian inference by creating Kullback-Leibler neighbourhoods and taking the least favourable prior within such a neighbourhood, least favourable in a decision-theoretic sense, i.e., relative to a loss function. While this approach has the drawbacks of relying on a minimax principle and of requiring the specification of a loss function, I find it nonetheless very appealing. And recovering tempered distributions is just cool! (The paper should be arXived within days.)
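If I attempt to write down what I took away from the talk (so this is my own reconstruction, not necessarily Chris' exact notation, with π standing for the reference distribution), the minimax step reads as

```latex
% my reconstruction of the minimax step, not Chris' exact notation
\[
a^\star \;=\; \arg\min_{a}\;
\max_{\nu \,:\, \mathrm{KL}(\nu \,\|\, \pi) \le \varepsilon}\;
\mathbb{E}_{\nu}\!\left[ L(a,\theta) \right],
\]
% and the inner, least favourable distribution is an exponential tilting,
% hence a tempering, of the reference distribution:
\[
\nu^\star(\theta) \;\propto\; \pi(\theta)\,\exp\{\lambda\, L(a,\theta)\},
\qquad \lambda > 0 \ \text{tied to the radius } \varepsilon.
\]
```

which is exactly where the tempered distributions pop out.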
I then went to the Convergence of MCMC algorithms session, with Galin Jones and Jim Hobert presenting uniform ergodicity results. Jim analysed the Pólya-Gamma data augmentation approach to the logistic regression model advocated by Nick Polson, James Scott, and Jesse Windle, which parallels the older latent variable solution of Albert and Chib (1993). Krys Łatuszyński showed that the spectral gap does not vary between the deterministic-scan, random-scan, and random-order Gibbs samplers (with a maybe tongue-in-cheek Solidarity in the title that could refer to the iconic Solidarność… Or not.) And Éric Moulines established the uniform ergodicity of a particle filter algorithm, showing that the number of particles had to grow as T^{1+ε} for any positive ε. (And managed to get the paper arXived today!)
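As an aside, since Jim's talk connected the Pólya-Gamma scheme with Albert and Chib (1993), here is a minimal sketch of the latter, the probit Gibbs sampler based on truncated-normal latent variables; the toy data, Gaussian prior, and chain length are all made up for illustration.

```python
# A minimal Albert & Chib (1993) Gibbs sampler for probit regression.
# Toy data and prior are made up for illustration.
import numpy as np
from scipy.stats import truncnorm

rng = np.random.default_rng(0)

# Simulated data: n observations, p covariates.
n, p = 200, 3
X = rng.normal(size=(n, p))
beta_true = np.array([1.0, -2.0, 0.5])
y = (X @ beta_true + rng.normal(size=n) > 0).astype(float)

# Gaussian prior beta ~ N(0, tau2 * I).
tau2 = 100.0
V = np.linalg.inv(X.T @ X + np.eye(p) / tau2)  # cov. of beta | z
C = np.linalg.cholesky(V)

beta = np.zeros(p)
draws = []
for it in range(2000):
    # 1. z_i | beta, y_i: N(x_i' beta, 1) truncated to (0, inf)
    #    if y_i = 1 and to (-inf, 0) if y_i = 0.
    m = X @ beta
    a = np.where(y == 1.0, -m, -np.inf)  # standardised lower bounds
    b = np.where(y == 1.0, np.inf, -m)   # standardised upper bounds
    z = truncnorm.rvs(a, b, loc=m, scale=1.0, random_state=rng)
    # 2. beta | z: N(V X'z, V), drawn via the Cholesky factor of V.
    beta = V @ (X.T @ z) + C @ rng.normal(size=p)
    draws.append(beta.copy())

# Posterior mean after burn-in, to compare with beta_true.
print(np.mean(draws[500:], axis=0))
```

The Pólya-Gamma sampler has the same two-block structure, with the truncated normals replaced by Pólya-Gamma draws for the latent variables.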
January 7, 2014 at 7:56 am
I thought Chris was talking about a neighborhood around the approximation of the posterior, and not around the prior.