## Archive for Bayesian textbook

## bayess’ back! [on CRAN]

Posted in Books, R, Statistics, University life with tags Bayesian Essentials with R, Bayesian textbook, CRAN, package, R, update, version 1.5, version on September 22, 2022 by xi'an

## deterministic moves in Metropolis-Hastings

Posted in Books, Kids, R, Statistics with tags Bayesian textbook, change of variables, cross validated, deterministic mixtures, Jacobian, MCMC, Metropolis-Hastings algorithm, Monte Carlo Statistical Methods, reversible jump MCMC on July 10, 2020 by xi'an

**A** curio on X validated, where a hybrid Metropolis-Hastings scheme involves a deterministic transform, once in a while. The idea is to flip the sample from one mode, ν, towards the other mode, μ, with a symmetry of the kind

μ-α(x+μ) and ν-α(x+ν)

with α a positive coefficient. Or the reciprocal,

-μ+(μ-x)/α and -ν+(ν-x)/α

for… reversibility reasons. In that case, the acceptance probability simply includes the Jacobian of the transform from the current value to the proposal, just as in reversible jump MCMC.
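As a quick numerical sanity check (with arbitrary values of μ, α and x, picked purely for illustration), the reciprocal map does invert the first one:

```python
# numeric check (arbitrary μ, α, x) that x ↦ -μ + (μ - x)/α
# inverts the flip x ↦ μ - α(x + μ)
mu, alpha, x = 1.3, 0.7, -2.1
y = mu - alpha * (x + mu)          # forward flip towards mode μ
x_back = -mu + (mu - y) / alpha    # reciprocal map
assert abs(x_back - x) < 1e-12     # recovers the original value
```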

Why the (annoying) Jacobian? As explained in the above slides (and other references), the Jacobian is there to account for the change of measure induced by the transform.
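To illustrate the point, here is a minimal Python sketch, not the scheme from the question, mixing a plain random walk with the deterministic scaling map x ↦ −x/2 and its global inverse x ↦ −2x. Since the two maps are selected with equal probability, the selection probabilities cancel and only the Jacobian, ½ or 2, remains in the Metropolis-Hastings ratio. The target, a balanced Normal mixture with modes at ±1, is an arbitrary choice of mine.

```python
import math, random

def g(x):
    # assumed target: balanced mixture of N(-1, 0.5²) and N(1, 0.5²),
    # up to its normalising constant
    s = 0.5
    return math.exp(-0.5 * ((x + 1) / s) ** 2) + math.exp(-0.5 * ((x - 1) / s) ** 2)

def sampler(T=200000, seed=1):
    random.seed(seed)
    mh = [1.0]
    for _ in range(T - 1):
        cur = mh[-1]
        u = random.random()
        if u < 0.5:                        # plain random-walk proposal
            prop, jac = cur + random.uniform(-1, 1), 1.0
        elif u < 0.75:                     # deterministic map T1: x -> -x/2
            prop, jac = -cur / 2, 0.5      # |dT1/dx| = 1/2
        else:                              # its inverse T2: x -> -2x
            prop, jac = -2 * cur, 2.0      # |dT2/dx| = 2
        # Metropolis-Hastings ratio, with the Jacobian for the deterministic maps
        mh.append(prop if random.random() < jac * g(prop) / g(cur) else cur)
    return mh
```

Because T2 is everywhere the inverse of T1, detailed balance holds exactly, and the empirical moments match the target (mean 0, variance 0.25 + 1 = 1.25) up to Monte Carlo error.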

Returning to the curio, the originator of the question had spotted some discrepancy between the target and the MCMC sample, as the moments did not fit well enough. Running a similar toy model, a balanced Normal mixture, with an artificial flip consisting of

x’=±1-x/2 or x’=±2-2x

implemented by

```r
# loop and target density reconstructed; gnorm is presumably the (unnormalised)
# balanced Normal mixture density, here with modes near ±1
gnorm=function(x)dnorm(x,mean=-1)+dnorm(x,mean=1)
T=1e5;mh=rep(1,T)
for(t in 2:T){
  u=runif(5)
  if(u[1]<.5){
    mhp=mh[t-1]+2*u[2]-1
    mh[t]=ifelse(u[3]<gnorm(mhp)/gnorm(mh[t-1]),mhp,mh[t-1])
  }else{
    dx=1+(u[4]<.5)
    mhp=ifelse(dx==1,
      ifelse(mh[t-1]<0,1,-1)-mh[t-1]/2,
      2*ifelse(mh[t-1]<0,-1,1)-2*mh[t-1])
    mh[t]=ifelse(u[5]<dx*gnorm(mhp)/gnorm(mh[t-1])/(3-dx),mhp,mh[t-1])
  }
}
```

I could not spot said discrepancy beyond Monte Carlo variability.

## free Springer textbooks [incl. Bayesian essentials]

Posted in Statistics with tags All of Statistics, Bayesian Essentials with R, Bayesian textbook, coronavirus epidemics, lockdown, Nature, quarantine, R, Springer-Verlag, textbook on May 4, 2020 by xi'an

## dominating measure

Posted in Books, pictures, Statistics, Travel, University life with tags Bayesian textbook, Carnegie Mellon University, conjugate priors, cross validated, dominating measure, Jay Kadane, Pittsburgh, posterior distribution on March 21, 2019 by xi'an

**Y**et another question on X validated reminded me of a discussion I once had with Jay Kadane when visiting Carnegie Mellon in Pittsburgh, namely the fundamentally ill-posed nature of conjugate priors. Indeed, when considering the definition of a conjugate family as a parameterised family F of distributions over the parameter space Θ that is stable under updating to the posterior distribution, this property is completely dependent (if there is such a notion as completely dependent!) on the dominating measure adopted on the parameter space Θ. Adopted is the word, as there is no default, reference, natural, &tc. measure that promotes one specific measure on Θ as being *the* dominating measure. This is a well-known difficulty that also sticks out in most “objective Bayes” problems, as well as with maximum entropy priors.

This means for instance that, while the Gamma distributions constitute a conjugate family for a Poisson likelihood, so do the truncated Gamma distributions. And so do the distributions whose density (against a Lebesgue measure over an arbitrary subset of (0,∞)) is the product of a Gamma density by an arbitrary function of θ. I readily acknowledge that the standard conjugate priors as introduced in every Bayesian textbook are standard because they facilitate (to a certain extent) posterior computations. But, just as there exists an infinity of MaxEnt priors associated with an infinity of dominating measures, there exists an infinity of conjugate families, each associated with an infinity of dominating measures. And the fundamental reason is that the sampling model (which induces the shape of the conjugate family) does not provide a measure on the parameter space Θ.
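As a concrete illustration of the Gamma–Poisson case (with arbitrary hyperparameters a, b and made-up data), the posterior associated with a Gamma(a, b) prior is Gamma(a+Σxᵢ, b+n), and the very same update applies to a Gamma prior truncated to any subset of (0,∞), since truncation only changes the normalising constant:

```python
import math

a, b = 2.0, 1.0          # hypothetical Gamma(a, b) hyperparameters
data = [3, 0, 2, 4, 1]   # made-up Poisson observations
S, n = sum(data), len(data)

def unnorm_posterior(th):
    # Gamma(a, b) prior (up to a constant) times the Poisson likelihood
    return th ** (a - 1) * math.exp(-b * th) * math.exp(-n * th) * th ** S

def gamma_density(th, shape, rate):
    return rate ** shape * th ** (shape - 1) * math.exp(-rate * th) / math.gamma(shape)

# the ratio of the unnormalised posterior to the Gamma(a+S, b+n) density is
# constant in θ, confirming conjugacy; restricting θ to a subset multiplies it
# by the same truncation constant, hence truncated Gammas are conjugate too
grid = [0.5, 1.0, 1.5, 2.0, 3.0]
ratios = [unnorm_posterior(t) / gamma_density(t, a + S, b + n) for t in grid]
```

The point of the post stands unchanged: the update rule comes from the likelihood alone, while the family it stabilises depends on the chosen dominating measure.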

## I’m getting the point

Posted in Statistics with tags Bayesian statistics, Bayesian textbook, conjugate priors, cross validated, final exam, StackExchange, teaching on February 14, 2019 by xi'an

**A** long-winded X validated discussion on the [textbook] mean-variance conjugate posterior for the Normal model left me [mildly] depressed about the point and use of answering questions on this forum. Especially as it came at the same time as a catastrophic outcome for my mathematical statistics exam. Possibly an incentive to quit X validated as one quits smoking, although this is not the first attempt…