Example 7.3: what a mess!

A rather obscure question on Metropolis-Hastings algorithms on X Validated ended up being about our first illustration in Introducing Monte Carlo Methods with R, and exposed some inconsistencies in the following example… Example 7.2 is based on a [toy] joint Beta x Binomial target, which leads to a basic Gibbs sampler. We thought this was straightforward, but it may confuse readers who think of Gibbs sampling as a tool for posterior simulation since, in this case, there is neither observation nor posterior, but simply a (joint) target in (x,θ).
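
For concreteness, here is a minimal R sketch of such a two-stage Gibbs sampler for a Beta x Binomial target, alternating between the conditionals X | θ ~ Bin(n, θ) and θ | x ~ Beta(x + a, n − x + b); the values of n, a, b and the chain length below are illustrative choices, not necessarily those of the book.

```r
# Minimal two-stage Gibbs sampler for a joint Beta x Binomial target,
# alternating X | theta ~ Bin(n, theta) and theta | x ~ Beta(x + a, n - x + b).
n <- 15; a <- 3; b <- 7              # illustrative constants
Nsim <- 1e4                          # chain length
X <- integer(Nsim); theta <- numeric(Nsim)
theta[1] <- rbeta(1, a, b)           # arbitrary starting value
X[1] <- rbinom(1, n, theta[1])
for (t in 2:Nsim) {
  X[t]     <- rbinom(1, n, theta[t - 1])        # X | theta
  theta[t] <- rbeta(1, X[t] + a, n - X[t] + b)  # theta | x
}
# the pairs (X[t], theta[t]) approximate draws from the joint target
```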

And then it indeed came out that we had incorrectly written Example 7.3 on the [toy] Normal posterior, at times using a Normal prior on the mean with a [prior] variance scaled by the sampling variance and at other times a Normal prior on the mean with a [prior] variance unscaled by the sampling variance. I am rather amazed that this did not show up earlier, although there were already typos listed for that example.
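
For reference, here is a minimal R sketch of the Gibbs sampler for the unscaled-prior version, built on the conditionals quoted (and corrected) in the comments below: θ | x, σ² is Normal and σ² | x, θ is inverse gamma. The data, the hyperparameters θ0, τ², a, b, and the starting values are illustrative choices, not those of the book.

```r
# Minimal Gibbs sampler for the unscaled-prior version of Example 7.3:
# x_1,...,x_n ~ N(theta, sigma^2), theta ~ N(theta0, tau^2), and an
# inverse-gamma-type prior on sigma^2 matching the conditional
# pi(sigma^2 | x, theta) quoted in the comments.
set.seed(1)
x <- rnorm(30, mean = 2, sd = 1.5)        # illustrative data
n <- length(x)
theta0 <- 0; tau2 <- 10; a <- 3; b <- 3   # illustrative hyperparameters
Nsim <- 1e4
theta <- numeric(Nsim); sigma2 <- numeric(Nsim)
theta[1] <- mean(x); sigma2[1] <- var(x)  # starting values
for (t in 2:Nsim) {
  s2 <- sigma2[t - 1]
  # theta | x, sigma^2 is Normal with precision n/sigma^2 + 1/tau^2
  post_var  <- s2 * tau2 / (n * tau2 + s2)
  post_mean <- (tau2 * sum(x) + s2 * theta0) / (n * tau2 + s2)
  theta[t]  <- rnorm(1, post_mean, sqrt(post_var))
  # sigma^2 | x, theta is inverse gamma with shape n/2 + a and
  # rate sum((x - theta)^2)/2 + 1/b, sampled as 1/Gamma
  sigma2[t] <- 1 / rgamma(1, shape = n / 2 + a,
                          rate = sum((x - theta[t])^2) / 2 + 1 / b)
}
```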

Responses to “Example 7.3: what a mess!”

  1. I also notice that the correct conditional distributions should read:
    $\displaystyle \pi(\theta|\mathbf{x},\sigma^2) \propto e^{-\sum_i (x_i - \theta)^2/(2\sigma^2)}\, e^{-(\theta - \theta_0)^2/(2\tau^2)}$ and $\displaystyle \pi(\sigma^2|\mathbf{x},\theta) \propto \left(\frac{1}{\sigma^2}\right)^{(n + 2a + 2)/2} e^{-\frac{1}{2\sigma^2}}\left(\sum_i (x_i - \theta)^2 + 2/b\right)$.

    • The second conditional should read: $\displaystyle \pi(\sigma^2|\mathbf{x},\theta) \propto \left(\frac{1}{\sigma^2}\right)^{(n + 2a + 2)/2} e^{-\frac{1}{2\sigma^2}\left(\sum_i (x_i - \theta)^2 + 2/b\right)}$.

  2. Hi,

    First of all, thank you for your blog; it is very refreshing to read and keeps me updated on many different and interesting topics. Second, I want to point out another erratum, in the footnote of Figure 3.6 on page 78 of IMCM with R: I believe it should say “down left/right” in the 3rd and 4th references to figures. See you at ISBA! Best wishes!

