Archive for JSM 2010
R.I.P. StatProb?
Posted in Books, R, Statistics, University life with tags JSM 2010, RSS, Springer-Verlag, statprob, Vancouver on November 23, 2010 by xi'an
As posted in early August from JSM 2010 in Vancouver, StatProb was launched as a way to promote an on-line encyclopedia/wiki with the scientific backing of expert reviewers. This was completely novel and I was quite excited to take part in the venture as a representative of the Royal Statistical Society. Most unfortunately, the departure of the originator of the project, John Kimmel, from Springer-Verlag (the publisher backing the project) a few weeks later put an almost sure stop to the experiment, exposing both the lack of incentive to invest a not-inconsiderable amount of our time in editing the entries and the need for part-time operators to handle LaTeX and other editorial issues… The core of the matter is, I think, that the “reward” for getting involved in the wiki is sadly too limited from an academic perspective to balance the investment (all the more so because most members of the editorial board were senior researchers). This was clear, for instance, in the search for a person to take charge of the LaTeX aspects of the submissions: I could not find a strong enough reason to convince a younger colleague to dedicate part of his (limitless!) energy to this task, apart from service to the community… So, in the end, and in agreement with the Royal Statistical Society, I have sadly resigned from the board of StatProb, along with George Casella and Nando de Freitas.
BC Stout
Posted in pictures, Wines with tags beer, British Columbia, Burgers and fries, Heroica stout, JSM 2010, Steamworks, Vancouver on October 17, 2010 by xi'an
A good hearty stout from Vancouver that seemed to weigh on the heads of those two fellows pictured in the Duke Statistics Alumni Newsletter… (Only seemed, since we had not even had a taste then. Presumably the aftereffects of burgers and fries.)
Parallel processing of independent Metropolis-Hastings algorithms
Posted in R, Statistics, University life with tags Chris Holmes, cloud computing, GPU, importance sampling, JSM 2010, Metropolis-Hastings, parallelisation, Rao-Blackwellisation, Valencia 9 on October 12, 2010 by xi'an
With Pierre Jacob, my PhD student, and Murray Smith, from the National Institute of Water and Atmospheric Research, Wellington, who actually started us on this project at the last and latest Valencia meeting, we have completed a paper on using parallel computing in independent Metropolis-Hastings algorithms. The paper is arXived and the abstract goes as follows:
In this paper, we consider the implications of the fact that parallel raw-power can be exploited by a generic Metropolis–Hastings algorithm if the proposed values are independent. In particular, we present improvements to the independent Metropolis–Hastings algorithm that significantly decrease the variance of any estimator derived from the MCMC output, for a null computing cost since those improvements are based on a fixed number of target density evaluations. Furthermore, the techniques developed in this paper do not jeopardize the Markovian convergence properties of the algorithm, since they are based on the Rao–Blackwell principles of Gelfand and Smith (1990), already exploited in Casella and Robert (1996), Atchadé and Perron (2005) and Douc and Robert (2010). We illustrate those improvements both on a toy normal example and on a classical probit regression model, but insist on the fact that they are universally applicable.
I am quite excited about the results in this paper, which took advantage of (a) older works of mine on Rao-Blackwellisation, (b) Murray’s interests in costly likelihoods, and (c) our mutual excitement when hearing about GPU parallel possibilities from Chris Holmes’ talk in Valencia. (As well as directions drafted in an exciting session in Vancouver!) The (free) gains over standard independent Metropolis-Hastings estimates are equivalent to those of importance sampling, while keeping the Markov structure of the original chain. Given that 100 or more parallel threads can be exploited on current GPU cards, this is clearly a field with much potential! The graph below gives the variance improvements brought by three Rao-Blackwell estimates taking advantage of parallelisation over the initial MCMC estimate (first entry), along with the importance sampling estimate (last entry), using only 10 parallel threads.
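To make the Rao-Blackwell idea concrete, here is a minimal R sketch (not the code from the paper, and with a toy target, proposal, and function h chosen purely for illustration): all the proposals of an independent Metropolis-Hastings sampler are drawn and their importance weights evaluated up-front, which is the step that parallelises trivially, and the accept/reject uniforms are then integrated out to give a simple Rao-Blackwell estimate, set against the vanilla MCMC average and the importance sampling estimate. The estimators in the paper go further, conditioning on the whole block of proposals.

```r
## Minimal sketch (illustrative assumptions, not the paper's code):
## independent Metropolis-Hastings on a toy N(0,1) target with a N(0,9) proposal.
set.seed(1)
niter  <- 1e4
target <- function(x) dnorm(x, 0, 1)   # toy target density
qdens  <- function(x) dnorm(x, 0, 3)   # independent proposal density
h      <- function(x) x^2              # function of interest (E[h] = 1 here)

## proposals and importance weights: iid, hence trivially parallelisable
y <- rnorm(niter, 0, 3)
w <- target(y) / qdens(y)

## the only sequential part: the accept/reject recursion of the chain
x <- alpha <- numeric(niter)
x[1] <- y[1]; wcur <- w[1]; alpha[1] <- 1
for (t in 2:niter) {
  alpha[t] <- min(1, w[t] / wcur)      # independent MH acceptance probability
  if (runif(1) < alpha[t]) {
    x[t] <- y[t]; wcur <- w[t]
  } else {
    x[t] <- x[t - 1]
  }
}

## simple Rao-Blackwell estimate: integrate out the accept/reject uniform,
## E[h(X_t) | x_{t-1}, y_t] = alpha_t h(y_t) + (1 - alpha_t) h(x_{t-1})
hprev <- c(h(x[1]), h(x[-niter]))      # h(x_{t-1}); the t = 1 term vanishes since alpha[1] = 1
hrb   <- alpha * h(y) + (1 - alpha) * hprev

c(mcmc          = mean(h(x)),
  rao_blackwell = mean(hrb),
  importance    = sum(w * h(y)) / sum(w))
```

Over repeated runs, the Rao-Blackwell and importance sampling columns typically show lower variance than the raw MCMC average, in the spirit of the comparison in the graph above.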
JSM 2011
Posted in Statistics, Travel, University life with tags ASA, Bayesian model choice, IMS, ISBA, JSM 2010, JSM 2011, Miami, philosophy of sciences on September 20, 2010 by xi'an
We are hardly back from JSM 2010 and we already have to think forward to JSM 2011! On Saturday morning I got the news both that the session I had proposed on behalf of ISBA was accepted by the ASA as an invited session on Bayesian Model Assessment (with Merlise Clyde, Andrew Gelman, Feng Liang and Jean-Michel Marin) and that the session on Controversies in the philosophy of Bayesian statistics proposed by Andrew Gelman, to which he had kindly invited me (along with Jim Berger, Rob Kass and Cosma Shalizi), was also accepted as an invited session. (I am not particularly attracted by the beaches of Miami, but JSM is a good opportunity to interact with North American statisticians and to gather some of the ongoing trends in the field…)
JSM 2010 [end]
Posted in Books, Statistics, Travel, University life with tags Erich Lehmann, JSM 2010, Vancouver on August 6, 2010 by xi'an
On Wednesday morning, before boarding my plane to San Francisco, I attended the first two talks in the Erich Lehmann memorial session. The first talk, by Juliet Shaffer, related Lehmann’s work on multiple testing to the recent developments on FDRs and FNRs. In particular, she mentioned the decision-theoretic foundations of those false discovery indicators, but seemed unaware of our 2005 JASA paper with Peter Müller, Giovanni Parmigiani and Judith Rousseau, where we set up a decision-theoretic framework able to handle all four indicators. Peter Bickel surveyed the works of Erich Lehmann in a very personal and compelling way. I have always considered both books by Lehmann on estimation and testing as major references, and I still think students of statistics should be exposed to them. A nitpicking remark about Peter Bickel’s biography of Erich Lehmann: he mentioned that Lehmann was born in Strasbourg, France, during the German occupation in the first World War, while Strasbourg was actually part of Germany at the time, having been annexed after the 1870 war… Sadly, I missed Persi Diaconis’ talk for fear of missing my flight (only to discover once I had boarded the plane that the pilots were 90 minutes away!!!)
Overall, I have mixed feelings about the meeting: I met very interesting people and heard a few talks that gave me food for thought, but I feel that the scientific tension I brought back from Washington D.C. last year was not palpable in Vancouver. (Maybe it was my fault for waking up too early to keep my concentration over the day, as others did find the meeting exciting! Still, I went to many sessions with very few attendees and had a hard time filling my schedule…)