Archive for C.R. Rao

advancements in Bayesian methods and implementations

Posted in Books, Statistics, University life with tags Bayesian Analysis, Bayesian testing, C.R. Rao, Elsevier, Gibbs posterior, Handbook of Advancements in Bayesian Methods and Implementation, manifold exploration, martingales, multiple tests on November 10, 2022 by xi'an

The handbook of (recent) advances in Bayesian methods is now out (at the Elsevierian price of $250!) with chapters on Gibbs posteriors [Ryan Martin & Nicolas Syring], martingale distributions [Walker], selective inference [Daniel García Racines & Alastair Young], manifold simulations [Sumio Watanabe], MCMC for GLMMs [Vivek Roy], and multiple testing [Noirrit Chandra and Sourabh Bhattacharya]. (Along with my chapter on 50 shades of Bayesian testing.) The volume celebrates the 102 years of C.R. Rao, one of its three editors (as well as of the series), along with Arni Srivastava Rao and Alastair Young.

Blackwell-Rosenbluth Awards 2021
Posted in Statistics, University life with tags Arianna Rosenbluth, awards, Bayesian conference, C.R. Rao, David Blackwell, fencing, ISBA, j-ISBA, MCMC, Metropolis-Hastings algorithm, Rao-Blackwell theorem, University of Warwick on November 1, 2021 by xi'an

Congratulations to the winners of the newly created award! This j-ISBA award is intended for junior researchers in different areas of Bayesian statistics, and is named after David Blackwell and Arianna Rosenbluth. The winners will present their work at the newly created JB³ seminars on 10 and 12 November, both at 1pm UTC. (The awards are split into two time zones, covering the Americas and the rest of the World.)
UTC+0 to UTC+13
Marta Catalano, Warwick University
Samuel Livingstone, University College London
Dootika Vats, Indian Institute of Technology Kanpur
UTC-12 to UTC-1
Trevor Campbell, University of British Columbia
Daniel Kowal, Rice University
Yixin Wang, University of Michigan
RB4MCMC@ISR
Posted in Statistics with tags 100th birthday, arXiv, C.R. Rao, International Statistical Review, Rao-Blackwell theorem, Rao-Blackwellisation, typos, University of Warwick on August 18, 2021 by xi'an

Our survey paper on Rao-Blackwellisation (and the first Robert&Roberts published paper!) just appeared on-line as part of the International Statistical Review mini-issue in honour of C.R. Rao on the occasion of his 100th birthday. (With an unfortunate omission of my affiliation with Warwick!) While the papers are unfortunately behind a paywall (except for a few weeks!), the arXiv version remains available (and presumably with fewer typos!).
Rao-Blackwellisation in the MCMC era
Posted in Books, Statistics, University life with tags auxiliary variables, birthday, C.R. Rao, conditioning, David Blackwell, demarginalisation, International Statistical Review, MCMC, Monte Carlo Statistical Methods, Rao-Blackwell theorem, Rao-Blackwellisation on January 6, 2021 by xi'an

A few months ago, as indicated on this blog, I was contacted by the ISR editors to write a piece on Rao-Blackwellisation, towards a special issue celebrating Calyampudi Radhakrishna Rao's 100th birthday. Gareth Roberts and I came up with this survey, now on arXiv, discussing different aspects of Monte Carlo and Markov chain Monte Carlo that pertain to Rao-Blackwellisation, one way or another. As I discussed the topic with several friends over the Fall, it appeared that the difficulty was more in setting the boundaries than in finding connections: in a way, anything conditioning or demarginalising or resorting to auxiliary variates is a form of Rao-Blackwellisation. When re-reading the 1990 JASA paper by Gelfand and Smith, where I first saw the link between the Rao-Blackwell theorem and simulation, I realised my memory of it had drifted from the original, since the authors proposed there an approximation of the marginal based on replicas rather than the original Markov chain, being much closer to Tanner and Wong (1987) than I thought. It is only later that the true notion took shape. [Since the current version is still a draft, any comment or suggestion would be most welcomed!]
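The variance-reduction idea behind all this can be seen in a toy sketch (mine, not from the survey): to estimate E[X], replacing each draw of X by its conditional expectation E[X|Y] never increases the variance, by the Rao-Blackwell theorem. Here, with the arbitrary choice Y~Exp(1) and X|Y~N(Y,1), both estimators target E[X]=1, but the conditional one drops the within-Y noise entirely.

```python
import numpy as np

rng = np.random.default_rng(0)

def estimators(n):
    """Return (naive, Rao-Blackwellised) Monte Carlo estimates of E[X]=1."""
    y = rng.exponential(1.0, size=n)   # Y ~ Exp(1)
    x = rng.normal(loc=y, scale=1.0)   # X | Y ~ N(Y, 1)
    naive = x.mean()                   # plain Monte Carlo average of X
    rao_blackwell = y.mean()           # average of E[X|Y] = Y, computed exactly
    return naive, rao_blackwell

# Replicate both estimators to compare their sampling variability:
# Var(X) = 2 for the naive version versus Var(E[X|Y]) = Var(Y) = 1
# for the Rao-Blackwellised one, so the latter's spread is about half.
reps = np.array([estimators(1000) for _ in range(500)])
print("means:", reps.mean(axis=0))   # both close to 1
print("variances:", reps.var(axis=0))
```

The same conditioning trick underlies the Gelfand and Smith marginal approximations mentioned above, with the Markov chain (or its replicas) playing the role of Y.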
[The Art of] Regression and other stories
Posted in Books, R, Statistics, University life with tags Aki Vehtari, amazon associates, Americanisms, Andrew Gelman, book review, C.R. Rao, Cambridge University Press, causality, cum grano salis, cup, garden, glm, instrumental variables, Jennifer Hill, non-response, political science, R, robustness, stan_glm, US elections 2016 on July 23, 2020 by xi'an

CoI: Andrew sent me this new book [scheduled for 23 July on amazon] of his with Jennifer Hill and Aki Vehtari. Which I read in my garden over a few sunny morns. And as Andrew and Aki are good friends of mine, this review is definitely subjective and biased! Hence to take with a spoonful of salt.
The “other stories” in the title is a very nice touch. And a clever idea. As the construction of regression models comes as a story to tell, from gathering and checking the data, to choosing the model specifications, to analysing the output and setting the safety lines on its interpretation and usages. I added “The Art of” in my own title as the exercise sounds very much like an art and very little like a technical or even less mathematical practice. Even though the call to the resident stan_glm R function is ubiquitous.
The style itself is very story-like, very far from a mathematical statistics book as, e.g., C.R. Rao’s Linear Statistical Inference and Its Applications. Or his earlier Linear Models which I got while drafted in the Navy. While this makes the “Stories” part most relevant, I also wonder how I could teach from this book to my own undergrad students without acquiring first (myself) the massive expertise represented by the opinions and advice on what is correct and what is not in constructing and analysing linear and generalised linear models. In the sense that I would find justifying or explaining opinionated sentences an amathematical challenge. On the other hand, it would make for a great remote course material, leading the students through the many chapters and letting them experiment with the code provided therein, creating new datasets and checking modelling assumptions. The debate between Bayesian and likelihood solutions is quite muted, with a recommendation for weakly informative priors superseded by the call for exploring the impact of one’s assumption. (Although the horseshoe prior makes an appearance, p.209!) The chapter on math and probability is somewhat superfluous as I hardly fathom a reader entering this book without a certain amount of math and stats background. (While the book warns about over-trusting bootstrap outcomes, I find the description in the Simulation chapter a wee bit too vague.) The final chapters about causal inference are quite impressive in their coverage but clearly require a significant amount of investment from the reader to truly ingest these 110 pages.
“One thing that can be confusing in statistics is that similar analyses can be performed in different ways.” (p.121)
Unsurprisingly, the authors warn the reader about simplistic and unquestioning usages of linear models and software, with a particularly strong warning about significance. (Remember Abandon Statistical Significance?!) And they keep (rightly) arguing about the importance of fake data comparisons (although this can be overly confident at times). Great Chapter 11 on assumptions, diagnostics and model evaluation. And terrific Appendix B on 10 pieces of advice for improving one's regression model. Although there are two or three pages on the topic, at the very end, I would have also appreciated a more balanced and constructive coverage of machine learning as it remains a form of regression, which can be evaluated by simulation of fake data and assessed by X validation, hence quite within the range of the book.
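The fake-data comparison the authors advocate is easy to sketch outside of R as well; a minimal illustration (my own, with arbitrarily chosen coefficients, not an example from the book): simulate from a linear model with known parameters, refit, and check that the fit recovers the truth within sampling error.

```python
import numpy as np

rng = np.random.default_rng(42)

# Known "truth" for the fake-data check (intercept 2.0, slope -1.0, arbitrary).
n = 1000
true_coefs = np.array([2.0, -1.0])
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])          # design matrix with intercept
y = X @ true_coefs + rng.normal(scale=0.5, size=n)

# Refit by least squares and compare with the generating coefficients.
fit, *_ = np.linalg.lstsq(X, y, rcond=None)
print(fit)   # close to [2.0, -1.0]
```

Repeating this over many simulated datasets (and over deliberately misspecified ones) is exactly the kind of check that extends naturally to the machine-learning tools mentioned above.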
The document reads quite well, even pleasantly once one is over the shock at the limited amount of math formulas!, my only grumble being a terrible handwritten graph for building copters (Figure 1.9) and the numerous and sometimes gigantic square root symbols throughout the book. At a more meaningful level, it may feel somewhat US-centric, at least given the large fraction of examples dedicated to US elections. (Even though restating the precise predictions made by decent models on the eve of the 2016 election is worthwhile.) The Oscar for the best section title goes to “Cockroaches and the zero-inflated negative binomial model” (p.248)! But overall this is a very modern, stats-centred, engaging and careful book on the most common tool of statistical modelling! More stories to come maybe?!