The deadline for discussions of the talk by Paul Fearnhead and Dennis Prangle is next Monday, so all potential discussants are invited to send their 400-word piece to the Royal Statistical Society. I have now written two discussions, covering most of the points I had prepared for my oral discussion. I am also collecting written contributions from the various authors to compile an arXiv document, as on earlier occasions.
Today, I took part in the Read Paper session of the Royal Statistical Society, first by presenting an overview of MCMC methods, then by giving a short discussion of the paper by Mark Girolami and Ben Calderhead. Both the pre-ordinary and the ordinary sessions were very well attended, and it is a real pity that this was the first session I attended where the talk was not given in the main lecture room. (Which, sadly enough, was already booked.) Instead, the meeting took place in the Council room, half the size, which meant people had to remain standing for the whole session… Anyhow, Mark Girolami gave two great talks in which geometric intuition was predominant. The following 13 oral discussions were quite diverse, ranging from machine learning to Bayesian model choice to infinite-dimensional simulation, and I am convinced the written discussion will be even richer. (Discussions have to be sent before October 27.) Here are my own slides, focussing on the discretisation issue.
There will be an RSS Read Paper session on October 13, given by Mark Girolami and Ben Calderhead, on Riemann manifold Langevin and Hamiltonian Monte Carlo methods, which I definitely plan to attend. Here is the abstract:
The paper proposes Metropolis adjusted Langevin and Hamiltonian Monte Carlo sampling methods defined on the Riemann manifold to resolve the shortcomings of existing Monte Carlo algorithms when sampling from target densities that may be high dimensional and exhibit strong correlations. The methods provide fully automated adaptation mechanisms that circumvent the costly pilot runs that are required to tune proposal densities for Metropolis-Hastings or indeed Hamiltonian Monte Carlo and Metropolis adjusted Langevin algorithms. This allows for highly efficient sampling even in very high dimensions where different scalings may be required for the transient and stationary phases of the Markov chain. The methodology proposed exploits the Riemann geometry of the parameter space of statistical models and thus automatically adapts to the local structure when simulating paths across this manifold, providing highly efficient convergence and exploration of the target density. The performance of these Riemann manifold Monte Carlo methods is rigorously assessed by performing inference on logistic regression models, log-Gaussian Cox point processes, stochastic volatility models and Bayesian estimation of dynamic systems described by non-linear differential equations. Substantial improvements in the time-normalized effective sample size are reported when compared with alternative sampling approaches. MATLAB code that is available from the authors allows replication of all the results reported.
and, as usual, 400-word written comments can be submitted without any restriction.
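For readers less familiar with the first building block mentioned in the abstract, here is a minimal sketch of a plain Metropolis-adjusted Langevin (MALA) step on a standard Gaussian target. This is not the Riemann manifold version of the paper (which replaces the fixed step size by a position-dependent metric); the step size and target are purely illustrative.

```python
import numpy as np

# Plain MALA sketch on a standard Gaussian target N(0, I).
# Illustrative only: eps is hand-picked, whereas the paper's point is
# precisely to avoid such tuning via the local Riemann geometry.

def log_target(x):
    return -0.5 * np.dot(x, x)          # log N(0, I), up to a constant

def grad_log_target(x):
    return -x

def mala_step(x, eps, rng):
    # Langevin proposal: drift along the gradient plus Gaussian noise
    mean_fwd = x + 0.5 * eps**2 * grad_log_target(x)
    y = mean_fwd + eps * rng.standard_normal(x.shape)
    # Metropolis correction accounts for the asymmetric proposal
    mean_bwd = y + 0.5 * eps**2 * grad_log_target(y)
    log_q_fwd = -np.sum((y - mean_fwd) ** 2) / (2 * eps**2)
    log_q_bwd = -np.sum((x - mean_bwd) ** 2) / (2 * eps**2)
    log_alpha = log_target(y) - log_target(x) + log_q_bwd - log_q_fwd
    if np.log(rng.uniform()) < log_alpha:
        return y, True
    return x, False

rng = np.random.default_rng(0)
x = np.zeros(5)
accepted = 0
samples = []
for _ in range(5000):
    x, acc = mala_step(x, eps=0.8, rng=rng)
    accepted += acc
    samples.append(x.copy())
samples = np.asarray(samples)
print(accepted / 5000, samples[2000:].mean(), samples[2000:].var())
```

The acceptance ratio must include both proposal densities because the Langevin drift makes the proposal asymmetric; dropping that correction would bias the chain, which is exactly the "Metropolis adjusted" part of the name.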