## Archive for Bayesian Essentials with R

## free Springer textbooks [incl. Bayesian essentials]

Posted in Statistics with tags All of Statistics, Bayesian Essentials with R, Bayesian textbook, coronavirus epidemics, lockdown, Nature, quarantine, R, Springer-Verlag, textbook on May 4, 2020 by xi'an

## ENSEA & CISEA 2019

Posted in Books, pictures, Statistics, Travel, University life with tags Abidjan, Africa, Bayesian Essentials with R, CISEA 2019, econometrics, ENSAE, ENSEA, Francophonie, Ivory Coast, The Bayesian Choice, West Africa on June 26, 2019 by xi'an

I found my (short) trip to Abidjan for the CISEA 2019 conference quite fantastic, as it allowed me to meet with old friends, from the earliest days at CREST and even before, and to meet new ones. Including local students of ENSEA who had taken a Bayesian course out of my Bayesian Choice book. And who had questions about the nature of priors and the difficulty they had in accepting that several replies were possible with the same data! I wish I had had more time to discuss the relativity of Bayesian statements with them, but this was a great and rare opportunity to find avid readers of my books! I also had a long chat with another student worried about the use or mis-use of reversible jump algorithms to draw inference on time-series models in Bayesian Essentials, a chat that actually demonstrated his perfect understanding of the matter. And it was fabulous to meet so many statisticians and econometricians from West Africa, most of them French-speaking. My only regret is not having any free time to visit Abidjan or the neighbourhood, as the schedule of the conference did not allow for it [or even for a timely posting of a post!], especially as it regularly ran overtime. (But it did provide for a wide range of new local dishes that I definitely enjoyed tasting!) We are now discussing further opportunities to visit there, e.g. by teaching a short course at the Masters or PhD level.

## mea culpa!

Posted in Books, Kids, R, Statistics, University life with tags Bayesian Analysis, Bayesian Core, Bayesian Essentials with R, Book, cross validated, Gaussian model, typo on October 9, 2017 by xi'an

**A**n entry about our Bayesian Essentials book on X validated alerted me to a typo in the derivation of the Gaussian posterior! When deriving the posterior (which was left as an exercise in Bayesian Core), I simply forgot the term expressing the divergence between the prior mean and the sample mean. Mea culpa!!!
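For the record, here is a minimal sketch of the conjugate derivation in generic notation (not necessarily the book's), showing where the forgotten divergence term arises:

```latex
% x_1,\dots,x_n \sim N(\theta,\sigma^2) iid, \sigma^2 known,
% with conjugate prior \theta \sim N(\mu,\tau^2)
\pi(\theta\mid x_1,\dots,x_n)
 \propto \exp\left\{-\frac{n(\theta-\bar x)^2}{2\sigma^2}
                    -\frac{(\theta-\mu)^2}{2\tau^2}\right\}
 \propto \exp\left\{-\frac{(\theta-\mu_n)^2}{2\omega^2}\right\},
\qquad
\omega^2=\left(\frac{n}{\sigma^2}+\frac{1}{\tau^2}\right)^{-1},\quad
\mu_n=\omega^2\left(\frac{n\bar x}{\sigma^2}+\frac{\mu}{\tau^2}\right).
% Completing the square leaves the extra factor
\exp\left\{-\frac{n(\bar x-\mu)^2}{2(\sigma^2+n\tau^2)}\right\}
```

The last factor is the divergence term between the prior mean and the sample mean: it cancels in the posterior on θ, but cannot be dropped when computing the marginal likelihood.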

## relabelling in Bayesian mixtures by pivotal units

Posted in Statistics with tags Bayesian Essentials with R, finite mixtures, label switching, relabelling, ResearchGate on September 14, 2017 by xi'an

**Y**et another paper on relabelling for mixtures, when one would think everything and more has already been said and written on the topic… This one appeared in Statistics and Computing last August and I only became aware of it through ResearchGate, which sent me an unsolicited email that this paper quoted one of my own papers. As well as Bayesian Essentials.

The current paper by

The next part of the paper compares this approach with seven other solutions found in the literature, from Matthew Stephens’ (2000) to our permutation reordering. Which does pretty well in terms of MSE in the simulation study (see the massive Table 3), while being much cheaper to implement than the proposed pivotal relabelling (Table 4). And which, contrary to the authors’ objection, *does not require* the precise computation of the MAP since, as indicated in our paper, the relative maximum based on the MCMC iterations can be used as a proxy. I am thus less than convinced by the improvement brought by this alternative…
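As a concrete (and purely illustrative) sketch of relabelling against a MAP proxy, the R snippet below relabels MCMC draws of k component means by matching each iteration to the highest-posterior draw; function and variable names are mine, not taken from either paper:

```r
# Relabel MCMC draws of k mixture means by permutation matching.
# mu: (iterations x k) matrix of component means, one row per MCMC draw
# logpost: log-posterior at each draw; its maximiser is used as a MAP proxy
relabel <- function(mu, logpost) {
  k <- ncol(mu)
  # enumerate all k! permutations of 1:k (fine for small k)
  perms <- as.matrix(expand.grid(rep(list(1:k), k)))
  perms <- perms[apply(perms, 1, function(p) length(unique(p)) == k), , drop = FALSE]
  pivot <- mu[which.max(logpost), ]  # highest-posterior draw as pivot
  t(apply(mu, 1, function(m) {
    # pick the permutation of labels closest (in L2) to the pivot
    d <- apply(perms, 1, function(p) sum((m[p] - pivot)^2))
    m[perms[which.min(d), ]]
  }))
}
```

For moderate k the k! permutations are enumerable; for larger k one would swap in an assignment solver (e.g. the Hungarian algorithm) instead of brute-force enumeration.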

## a typo that went under the radar

Posted in Books, R, Statistics, University life with tags Bayesian Core, Bayesian Essentials with R, Bayesian model choice, cross validated, Jean-Michel Marin, model posterior probabilities, R, typos on January 25, 2017 by xi'an

**A** chance occurrence on X validated: a question about an incomprehensible formula for Bayesian model choice which, most unfortunately!, appeared in Bayesian Essentials with R! Eeech! It looks like one line in our LaTeX file got erased and the likelihood part in the denominator vanished altogether. Apologies to all readers confused by this nonsensical formula!
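For readers who stumbled on the garbled version, the intended formula is the standard one (written here in generic notation, not necessarily the book's), with the marginal likelihoods restored in the denominator:

```latex
P(M_i \mid x)
 = \frac{p(M_i)\, m_i(x)}{\sum_{j} p(M_j)\, m_j(x)},
\qquad
m_i(x) = \int f_i(x \mid \theta_i)\,\pi_i(\theta_i)\,\mathrm{d}\theta_i,
```

where $p(M_i)$ is the prior probability of model $M_i$ and $m_i(x)$ its marginal likelihood.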

## tractable Bayesian variable selection: beyond normality

Posted in R, Statistics, University life with tags Bayesian Essentials with R, calibration, marginal density, maximum likelihood estimation, parametric family, R, two-piece error model, University of Warwick on October 17, 2016 by xi'an

**D**avid Rossell and Francisco Rubio (both from Warwick) arXived a month ago a paper on non-normal variable selection. They use two-piece error models that preserve manageable inference and allow for simple computational algorithms, but also characterise the behaviour of the resulting variable selection process under model misspecification. Interestingly, they show that the existence of asymmetries or heavy tails leads to power losses when using the Normal model. The two-piece error distribution is made of two halves of location-scale transforms of the same reference density on the two sides of the common location parameter. In this paper, the reference density is either Gaussian or Laplace (i.e., double exponential). In both cases the (log-)likelihood has a nice compact expression (although it does not allow for a useful sufficient statistic). One is the L² (Gaussian) version, the other the L¹ (Laplace) version, which is presumably the main reason for using this formalism based on only two parametric families of distributions. (As mentioned in an earlier post, I do not consider those distributions as mixtures because the component of a given observation can always be identified. And because, as shown in the current paper, maximum likelihood estimates can be easily derived.) The prior construction follows the non-local prior principles of Johnson and Rossell (2010, 2012) also discussed in earlier posts. The construction is very detailed and hence highlights how many calibration steps are needed in the process.
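To make the construction concrete, here is a minimal R sketch of a generic two-piece density, gluing two location-scale halves of a symmetric reference density at the mode (a generic sketch under my own naming, not the authors' code):

```r
# Two-piece density: the reference f0 (symmetric about 0) is rescaled by
# s1 left of the mode mu and by s2 right of it; the common factor
# 2/(s1+s2) makes the whole thing integrate to one for any such f0.
dtwopiece <- function(x, mu = 0, s1 = 1, s2 = 1, f0 = dnorm) {
  2 / (s1 + s2) * ifelse(x < mu, f0((x - mu) / s1), f0((x - mu) / s2))
}

# Laplace (double exponential) reference, for the L1 version
dlapl <- function(z) exp(-abs(z)) / 2
```

With s1 = s2 one recovers the symmetric reference; s1 ≠ s2 induces the asymmetry the paper exploits.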

“Bayes factor rates are the same as when the correct model is assumed [but] model misspecification often causes a decrease in the power to detect truly active variables.”

When there are too many models to compare at once, the authors propose a random walk on the finite set of models (which does not require advanced measure-theoretic tools like reversible jump MCMC). One interesting aspect is that moving away from the normal to another member of this small family is driven by the marginal density of the data under each model, which means moving only to interesting alternatives. But also sticking to the normal for adequate datasets. In a sense this is not extremely surprising given that the marginal likelihoods (model-wise) are available. It is also interesting that on real datasets, one of the four models is heavily favoured against the others, be it Normal (6.3) or Laplace (6.4). And that the four-model framework returns almost identical values when compared with a single (most likely) model. Although not immensely surprising when acknowledging that the frequency of the most likely model is 0.998 and 0.998, respectively.
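Such a random walk over a finite model set can be sketched in a few lines of R, using precomputed log marginal likelihoods; the model names and numbers below are invented for illustration, not taken from the paper:

```r
# Metropolis random walk on a finite set of models. With a uniform prior
# over models and a symmetric proposal, the stationary distribution is
# proportional to the marginal likelihoods, so visit frequencies
# estimate posterior model probabilities.
set.seed(42)
logm <- c(normal = -120, laplace = -115,
          tp_normal = -118, tp_laplace = -116)  # hypothetical values
K <- length(logm)
cur <- 1
visits <- integer(K)
for (t in 1:2e5) {
  prop <- sample((1:K)[-cur], 1)            # uniform proposal over other models
  if (log(runif(1)) < logm[prop] - logm[cur]) cur <- prop  # Metropolis accept
  visits[cur] <- visits[cur] + 1
}
freq <- visits / sum(visits)                # posterior model probability estimates
```

Working on the log scale avoids underflow when the marginal likelihoods themselves are astronomically small.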

“Our framework represents a middle-ground to add flexibility in a parsimonious manner that remains analytically and computationally tractable, facilitating applications where either p is large or n is too moderate to fit more flexible models accurately.”

Overall, I find the experiment quite conclusive and do not object [much] to this choice of parametric family in that it is always more general and generic than the sempiternal Gaussian model. That we picked in our Bayesian Essentials, following tradition. In a sense, it would be natural to pick the most general possible parametric family that allows for fast computations, if this notion does make any sense…

## Bayesian Essentials with R [book review]

Posted in Books, R, Statistics, University life with tags Bayesian Core, Bayesian Essentials with R, book review, Jean-Michel Marin, Kent State University, R, Technometrics, time series on July 28, 2016 by xi'an

[A review of Bayesian Essentials that appeared in Technometrics two weeks ago, with the first author being rechristened Jean-Michael!]

“Overall this book is a very helpful and useful introduction to Bayesian methods of data analysis. I found the use of R, the code in the book, and the companion R package, bayess, to be helpful to those who want to begin using Bayesian methods in data analysis. One topic that I would like to see added is the use of Bayesian methods in change point problems, a topic that we found useful in a recent article and which could be added to the time series chapter. Overall this is a solid book and well worth considering by its intended audience.”

David E. BOOTH

Kent State University