So this is the second meeting on computational statistics in a row for me and several other participants! Now in Edinburgh, in the terrific location of the ICMS, near the University of Edinburgh. The overlap with the previous meeting in Bristol is actually very limited: yesterday I only saw one talk by Nial Friel I had already heard in Bristol (plus one from Jim Hobert he had delivered in Banff!). And of course most of the participants were not in Bristol, so they got the most from these talks. The day went on quite smoothly and quickly, despite the tight schedule, and we managed to keep to it within a five-minute confidence band… Gareth Roberts gave the first talk of the day, an overview of convergence speeds for Gibbs samplers that insisted on the importance of decomposing the model into hierarchical components, in connection with one of Xiao-Li Meng's favourite themes, namely that different conditional decompositions lead to different convergence behaviours. Christophe Andrieu and Jim Hobert kept to the same theme of convergence properties of MCMC samplers, Christophe presenting recent work on using two Lyapunov control functions to assess adaptive MCMC. The second theme of the day was connected with normalising constants: Yves Atchadé expanded on path sampling to construct confidence evaluations, and Nial Friel compared auxiliary variable techniques with ABC approximations. (The path sampling identity is a magical mystery to me: magical because the equality is true, a mystery because the implementation depends very much on calibration choices that are both delicate and influential. Yves addressed the impact of the discretisation on the error.) Nicolas Chopin also considered approximation impacts on long-memory process estimation, where I think ABC could serve as a calibration (something we have to discuss at CREST when we are back).
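For the record, the path sampling identity in question, in the formulation of Gelman and Meng, applies to a family of unnormalised densities q_t with normalising constants Z_t:

```latex
% Path sampling identity: for unnormalised densities q_t(x) with
% normalising constants Z_t = \int q_t(x)\,dx, t \in [0,1],
\log\frac{Z_1}{Z_0}
  \;=\; \int_0^1 \mathbb{E}_{p_t}\!\left[\frac{\partial}{\partial t}\,\log q_t(X)\right]\mathrm{d}t,
\qquad p_t(x) = q_t(x)/Z_t .
```

The equality is exact, but the choice of the path t ↦ q_t and the discretisation of the outer integral are precisely the delicate calibration choices mentioned above.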
Omiros Papaspiliopoulos gave his talk on the same paper Gareth presented in Banff and Bristol, but from his own perspective, which made the presentation quite worthwhile. Darren Wilkinson and Andrew Golightly talked about the complexity of conducting inference for biochemical Markov processes related to SDEs, again evaluating the impact of approximations. Andrew covered in particular a delayed rejection, or rather delayed acceptance, method where a cheap surrogate avoids computing the complex target by first rejecting the most unlikely values (with the drawback of requiring two acceptance steps). Maria de Iorio introduced us to metabonomics (“the latest of the omics”!) with models that relate to spectral analysis (and thus reminded me of some astronomy models) and to wavelets (for the background noise), the estimation procedure seemingly related to source separation techniques found in signal processing (?). And Dan Lawson ended the first-day session with a fairly original presentation of the Dirichlet process in population genetics: this was the first talk I ever saw featuring the picture of a vomiting monster (see below for the Warhammer monster)…! During the poster session, Ian Murray gave us a quick explanation of his work on extending MCMC validation beyond proper random generators, a paper I had wanted to discuss here. And certainly will, now that it has been pre-processed for me!
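The delayed-acceptance idea Andrew described can be sketched in a few lines; this is a generic minimal version (not the specifics of his talk), assuming a symmetric random-walk proposal, an expensive log-target and a cheap surrogate log-density, all of which are my own illustrative choices:

```python
import math
import random

def delayed_acceptance_mh(log_target, log_surrogate, x0, n_iter, step=1.0, seed=0):
    """Delayed-acceptance Metropolis-Hastings with a symmetric random walk.

    Stage 1 screens proposals using only the cheap surrogate; the
    expensive target is evaluated solely for proposals surviving stage 1,
    and stage 2 corrects for the surrogate error so the chain still
    targets the exact distribution.
    """
    rng = random.Random(seed)
    x = x0
    lt_x, ls_x = log_target(x), log_surrogate(x)
    samples = []
    for _ in range(n_iter):
        y = x + rng.gauss(0.0, step)
        ls_y = log_surrogate(y)
        # Stage 1: cheap screening acceptance (symmetric proposal).
        if math.log(rng.random()) < ls_y - ls_x:
            lt_y = log_target(y)
            # Stage 2: correct the surrogate error against the true target.
            if math.log(rng.random()) < (lt_y - lt_x) - (ls_y - ls_x):
                x, lt_x, ls_x = y, lt_y, ls_y
        samples.append(x)
    return samples

# Toy illustration: "expensive" target N(0,1), crude surrogate N(0, 1.5^2).
log_target = lambda x: -0.5 * x * x
log_surrogate = lambda x: -0.5 * x * x / 2.25
chain = delayed_acceptance_mh(log_target, log_surrogate, 0.0, 20000)
print(sum(chain) / len(chain))  # sample mean of the chain
```

The point of the two-stage structure is that most bad proposals are killed off by the surrogate alone, at the price of an extra uniform draw (the two acceptance steps mentioned above); the product of the two acceptance probabilities still satisfies detailed balance with respect to the exact target.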