Archive for Bayesian textbook

terrible graph and chilies [not a book review]

Posted in Books, Kids, pictures, Statistics, University life on January 29, 2024 by xi'an

A question on X validated led me to this Bayesian book with a chill cover (except that it first made me seek a word from the chili sequence!), because of a graph within it that confused the OP of that question. Here is the graph:

It represents three *prior* densities at once, namely the (uniform) prior density of a Binomial probability θ, the prior density of its transform θ², and the prior density of the other transform θ¹⁰. This makes no sense, since the horizontal axis is indexed simultaneously by values of the three random variables: a particular abscissa like 0.4 corresponds to three distinct values, namely θ=0.4, θ²=0.4, and θ¹⁰=0.4… In other words, the “probability of event occurring, f(θ)” corresponds to *three different* events and *three different* f's. Another needless confusion is that the red and dashed density curves appear everywhere above one another, which is impossible since both are probability densities. And the boxed legend does not help.
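For the record, the change-of-variable formula yields the correct transformed priors: if θ is uniform on (0,1), then Y=θᵏ has density y^(1/k−1)/k on (0,1), so the three curves are densities of three *different* variables and call for three distinct scales or labels. A minimal R sketch (k=2 and k=10 as in the graph; the function name dpow is mine):

```r
# change of variables: if theta ~ U(0,1), then Y = theta^k has density y^(1/k-1)/k on (0,1)
dpow=function(y,k) y^(1/k-1)/k
y=seq(.01,.99,by=.01)
# one density per curve, each in its own variable
matplot(y,cbind(dunif(y),dpow(y,2),dpow(y,10)),type="l",lty=1:3,col=1:3,
        xlab="y",ylab="density")
legend("topright",lty=1:3,col=1:3,
       legend=c(expression(theta),expression(theta^2),expression(theta^10)))
# sanity check: each transformed density still integrates to one
integrate(dpow,0,1,k=2)$value
integrate(dpow,0,1,k=10)$value
```

Each transformed density piles mass near zero (the exponent 1/k−1 is negative), which is what a correct version of the graph should show.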

Bayesian inference from the ground up [no book review]

Posted in Books, Kids, Statistics, University life on November 7, 2023 by xi'an

bayess’ back! [on CRAN]

Posted in Books, R, Statistics, University life on September 22, 2022 by xi'an

deterministic moves in Metropolis-Hastings

Posted in Books, Kids, R, Statistics on July 10, 2020 by xi'an

A curio on X validated where a hybrid Metropolis-Hastings scheme involves a deterministic transform, once in a while. The idea is to flip the sample from one mode, ν, towards the other mode, μ, with a symmetry of the kind

μ-α(x+μ) and ν-α(x+ν)

with α a positive coefficient. Or the reciprocal,

-μ+(μ-x)/α and -ν+(ν-x)/α

for… reversibility reasons. In that case, the acceptance probability involves the Jacobian of the transform to the proposal, multiplying the usual target ratio, just as in reversible jump MCMC.

Why the (annoying) Jacobian? As explained in the above slides (and other references), the Jacobian is there to account for the change of measure induced by the transform.
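Concretely, for a deterministic proposal y=T(x), the acceptance probability reads min{1, π(T(x)) |T′(x)|/π(x)}, and when a map is paired with its inverse the two Jacobians multiply to one, which is what reversibility requires. A small R check, with an assumed balanced Normal mixture target (means ±2, purely for illustration) and the flip x′=1−x/2 paired with its inverse x′=2−2x:

```r
# deterministic MH move: accept y=T(x) with probability min(1, targ(y)*|T'(x)|/targ(x))
targ=function(x) .5*dnorm(x,-2)+.5*dnorm(x,2) # assumed toy target
T1=function(x) 1-x/2 # flip, Jacobian 1/2
T2=function(y) 2-2*y # inverse map, Jacobian 2
x=0.3; y=T1(x)
stopifnot(isTRUE(all.equal(T2(y),x)))  # T2 undoes T1
rf=targ(y)*.5/targ(x) # forward acceptance ratio, with Jacobian 1/2
rb=targ(x)*2/targ(y)  # reverse acceptance ratio, with Jacobian 2
stopifnot(isTRUE(all.equal(rf*rb,1))) # the ratios are reciprocal: reversibility holds
```

Omitting either Jacobian breaks this reciprocity and hence detailed balance, which is exactly the "change of measure" point made above.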

Returning to the curio, the originator of the question had spotted some discrepancy between the target and the MCMC sample, as the moments did not fit well enough. For a similar toy model, a balanced Normal mixture, and an artificial flip consisting of

x’=±1-x/2 or x’=±2-2x

implemented by

  # gnorm is the target density, assumed here to be the balanced Normal
  # mixture with means ±2, which the flips map onto one another
  gnorm=function(x).5*dnorm(x,-2)+.5*dnorm(x,2)
  T=1e4;mh=rep(0,T) # chain length and initialisation (assumed)
  for(t in 2:T){
    u=runif(5)
    if(u[1]<.5){ # uniform random walk move
      mhp=mh[t-1]+2*u[2]-1
      mh[t]=ifelse(u[3]<gnorm(mhp)/gnorm(mh[t-1]),mhp,mh[t-1])
    }else{ # deterministic flip, with Jacobian factor dx/(3-dx)
      dx=1+(u[4]<.5)
      mhp=ifelse(dx==1,
                 ifelse(mh[t-1]<0,1,-1)-mh[t-1]/2,
                 2*ifelse(mh[t-1]<0,-1,1)-2*mh[t-1])
      mh[t]=ifelse(u[5]<dx*gnorm(mhp)/gnorm(mh[t-1])/(3-dx),mhp,mh[t-1])
    }
  }

I could not spot said discrepancy beyond Monte Carlo variability.
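Such a check is easy to replicate: compare the empirical moments of the chain with the theoretical ones of the mixture, which are E[X]=0 and E[X²]=1+2²=5 when the component means are taken at ±2 as above. A self-contained R sketch, using a plain random-walk Metropolis chain as a stand-in for the hybrid sampler:

```r
# assumed target: balanced Normal mixture with means +/-2
gnorm=function(x).5*dnorm(x,-2)+.5*dnorm(x,2)
set.seed(42)
T=1e5;mh=rep(0,T)
for(t in 2:T){ # plain random-walk Metropolis benchmark
  mhp=mh[t-1]+runif(1,-1,1)
  if(runif(1)<gnorm(mhp)/gnorm(mh[t-1])) mh[t]=mhp else mh[t]=mh[t-1]
}
# empirical versus theoretical moments: should be close to (0, 5)
c(mean(mh),mean(mh^2))
```

Any residual gap of the order of the Monte Carlo standard error is expected, which is presumably what the originator of the question mistook for a discrepancy.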

free Springer textbooks [incl. Bayesian essentials]

Posted in Statistics on May 4, 2020 by xi'an