**W**hile in Denver, at JSM, I came across [across validated!] this seemingly challenging problem of finding the posterior of the 10³ long probability vector of a Multinomial M(10⁶,p) when only observing the range of a realisation of M(10⁶,p). This sounded challenging because the distribution of the pair (min,max) is not available in closed form. (Although this allowed me to find a paper on the topic by the late Shanti Gupta, who was chair at Purdue University when I visited 32 years ago…) This seemed to call for ABC (especially since I was about to give an introductory lecture on the topic!, law of the hammer…), but the simulation of datasets compatible with the extreme values of both minimum and maximum, m=80 and M=12000, proved difficult when using a uniform Dirichlet prior on the probability vector, since these extremes called for both small and large values of the probabilities. However, I later realised that the problem could be brought down to a Multinomial with only three categories and the observation (m, M, n−m−M), with n=10⁶, leading to an obvious Dirichlet posterior and a predictive for the remaining 10³−2 categories.
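The reduction above admits a direct simulation, here a minimal sketch in Python: collapse the 10³ cells into three categories (the cell achieving the minimum, the cell achieving the maximum, and everything else), exploit Dirichlet–Multinomial conjugacy, and split the leftover mass over the remaining cells. The uniform split of the "rest" mass via a flat Dirichlet is an illustrative assumption, not a detail from the post.

```python
import numpy as np

rng = np.random.default_rng(42)

n, k = 10**6, 10**3          # Multinomial size and number of cells
m, M = 80, 12000             # observed minimum and maximum counts
counts = np.array([m, M, n - m - M])

# Uniform Dirichlet(1,1,1) prior on the 3-category probabilities gives a
# conjugate Dirichlet(1+m, 1+M, 1+n-m-M) posterior.
post_draws = rng.dirichlet(1 + counts, size=5000)

# Predictive for the remaining k-2 probabilities: split the "rest" mass
# uniformly, i.e. via a flat Dirichlet over the k-2 collapsed cells
# (an assumed, exchangeable completion of the reduced model).
rest_mass = post_draws[:, 2]
splits = rng.dirichlet(np.ones(k - 2), size=5000)
pred_probs = rest_mass[:, None] * splits
```

Each row of `post_draws` together with the matching row of `pred_probs` is then a draw of the full probability vector, with no ABC tolerance or summary statistics needed.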

## Archive for Colorado

## a problem that did not need ABC in the end

Posted in Books, pictures, Statistics, Travel with tags ABC, Approximate Bayesian computation, Colorado, cross validated, dawn, Denver, high rise, introductory opening lecture, jatp, JSM 2019, law of the hammer, multinomial distribution, predictive on August 8, 2019 by xi'an

## Denver snapshot [jatp]

Posted in pictures, Travel, Wines with tags IPA, beer, Belgian beer, Colorado, Denver, Fat Tyre, jatp, JSM 2019, USA, Voodoo Ranger IPA on July 28, 2019 by xi'an

## Introductory overview lecture: the ABC of ABC [JSM19 #1]

Posted in Statistics with tags ABC, American Statistical Association, Approximate Bayesian computation, approximate Bayesian inference, causal inference, Colorado, Denver, evidence, forensic statistics, Joint Statistical Meeting, JSM 2019, lecture on July 28, 2019 by xi'an

**H**ere are my slides [more or less] for the introductory overview lecture I am giving today at JSM 2019, 4:00-5:50, CC-Four Seasons I. There is obviously quite an overlap with earlier courses I gave on the topic, although I refrained here from mentioning any specific application (like population genetics) to focus on statistical and computational aspects.

Along with the other introductory overview lectures in this edition of JSM:

- Sunday 28, 2:00-3:50, CC-Four Seasons I: CSI at the JSM: Forensic Statistics and the Value of Scientific Evidence in Court by Hal Stern (University of California, Irvine)
- Monday 29, 8:30-10:20, CC-205: Assessing Procedures vs. Assessing Evidence by Michael Lavine (University of Massachusetts, Amherst)
- Monday 29, 2:00-3:50, CC-205: Causal inference in modern statistics by Jennifer Hill (New York University) and Avi Feller (UC Berkeley)
- Tuesday 30, 8:30-10:20, CC-205: Modern Risk Analysis by Walter Piegorsch (University of Arizona) and David Banks (Duke University)

## off to Denver! [JSM2019]

Posted in Statistics with tags ABC, American Statistical Association, Colorado, Denver, Fort Collins, JSM 2019, Long Peak, Richard Tweedie, Rockies, USA on July 27, 2019 by xi'an

**A**s of today, I am attending JSM 2019 in Denver, giving an "Introductory Overview Lecture" on The ABC of Approximate Bayesian Computation on Sunday afternoon and chairing an ABC session on Monday morning. As far as I know these are the only ABC sessions at JSM this year… And hence the only sessions I will be attending. (I have not been to Denver and the area since 1993, when I visited Kerrie Mengersen and Richard Tweedie in Fort Collins. And hiked up to Long Peak with Gerard. Alas, no time for climbing in the Rockies this time.)

## amazing appendix

Posted in Books, Statistics, Travel, University life with tags auxiliary variable, Colorado, Fort Collins, Gibbs sampler, Julian Besag, MCMC, Metropolis-within-Gibbs algorithm, Monte Carlo Statistical Methods, Oxford, random simulation, simulation, Statistical Science on February 13, 2018 by xi'an

**I**n the first appendix of the 1995 Statistical Science paper of Besag, Green, Higdon and Mengersen on MCMC, "Bayesian Computation and Stochastic Systems", stands a fairly neat result I was not aware of (and which Arnaud Doucet, with his unrivalled knowledge of the literature!, pointed out to me in Oxford, sparing me the tedium of trying to prove it afresh!). I remember well reading a version of the paper in Fort Collins, Colorado, in 1993 (I think!) but nothing about this result.

It goes as follows: when running a Metropolis-within-Gibbs sampler for component x¹ of a collection of variates x¹,x²,…, thus aiming at simulating from the full conditional of x¹ given x⁻¹ by making a proposal q(x|x¹,x⁻¹), it is perfectly acceptable to use a proposal that depends on a parameter α (no surprise so far!) *and* to generate this parameter α anew at each iteration (still unsurprising, as α can be taken as an auxiliary variable) *and* to have the distribution of this parameter α depend on the other variates x²,…, i.e., x⁻¹. This is the surprising part, as one would think that adding α as an auxiliary variable would mess up the update of x⁻¹. But the proof as found in the 1995 paper [page 35] does not need to consider α as such, as it establishes global balance directly. (Or maybe still detailed balance when writing the whole Gibbs sampler as a cycle of Metropolis steps.) Terrific! And a whiff mysterious..!