Archive for ISBA

an extra day for registering for ISBA²²

Posted in Mountains, pictures, Running, Statistics, Travel, University life on April 15, 2022 by xi'an

O’Bayes 2022 in UC Santa X

Posted in Statistics on March 4, 2022 by xi'an

Bayes Comp 2023

Posted in Mountains, Statistics, Travel, University life on November 23, 2021 by xi'an

The official website for Bayes Comp 2023, taking place in Levi, Northern Finland, 15-17 March 2023, is on! And it’s beautiful.

Blackwell-Rosenbluth Awards 2021

Posted in Statistics, University life on November 1, 2021 by xi'an

Congratulations to the winners of the newly created award! This j-ISBA award, named after David Blackwell and Arianna Rosenbluth, is intended for junior researchers in different areas of Bayesian statistics. They will present their work at the newly created JB³ seminars on 10 and 12 November, both at 1pm UTC. (The awards are split into two time zones, corresponding to the Americas and the rest of the world.)

UTC+0 to UTC+13

Marta Catalano, Warwick University
Samuel Livingstone, University College London
Dootika Vats, Indian Institute of Technology Kanpur

UTC-12 to UTC-1

Trevor Campbell, University of British Columbia
Daniel Kowal, Rice University
Yixin Wang, University of Michigan

empirically Bayesian [wISBApedia]

Posted in Statistics on August 9, 2021 by xi'an

Last week I was pointed to a puzzling entry in the “empirical Bayes” Wikipedia page. The introduction section indeed contains a description of an iterative simulation method that involves a hyperprior p(η), even though the empirical Bayes perspective does not involve a hyperprior.

While the entry is vague and lacks formulae

These suggest an iterative scheme, qualitatively similar in structure to a Gibbs sampler, to evolve successively improved approximations to p(θ|y) and p(η|y). First, calculate an initial approximation to p(θ|y) ignoring the η dependence completely; then calculate an approximation to p(η|y) based upon the initial approximate distribution of p(θ|y); then use this p(η|y) to update the approximation for p(θ|y); then update p(η|y); and so on.

it sounds essentially equivalent to a Gibbs sampler, possibly a multiple-try Gibbs sampler (unless the author had another notion in mind, alas impossible to guess since no reference is included).
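To make the distinction concrete, here is a minimal sketch (in Python, on a toy normal-normal hierarchy that appears neither in the Wikipedia entry nor in this post) contrasting the two-block Gibbs sampler the quoted passage seems to describe with a genuine empirical Bayes plug-in, where η is estimated by marginal maximum likelihood instead of being given a hyperprior.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy normal-normal hierarchy (purely illustrative, not from the post):
#   y_i | theta_i ~ N(theta_i, 1),  theta_i | eta ~ N(eta, 1),  eta ~ N(0, 100)
y = rng.normal(loc=2.0, scale=np.sqrt(2.0), size=50)
n = y.size
tau2 = 100.0  # hyperprior variance on eta (only used by the Gibbs version)

# --- the alternating scheme of the quoted passage, i.e. a two-block Gibbs sampler ---
T = 5000
eta = 0.0
thetas, etas = np.zeros((T, n)), np.zeros(T)
for t in range(T):
    # draw from p(theta | y, eta): conjugate normal, mean (y_i + eta)/2, variance 1/2
    theta = rng.normal((y + eta) / 2.0, np.sqrt(0.5))
    # draw from p(eta | theta): conjugate normal given the current theta's
    post_var = 1.0 / (n + 1.0 / tau2)
    eta = rng.normal(post_var * theta.sum(), np.sqrt(post_var))
    thetas[t], etas[t] = theta, eta

# --- genuine empirical Bayes: no hyperprior, eta is estimated from the data ---
# marginally y_i ~ N(eta, 2), so the marginal MLE of eta is simply the sample mean
eta_hat = y.mean()
eb_means = (y + eta_hat) / 2.0            # plug-in posterior means of the theta_i
gibbs_means = thetas[1000:].mean(axis=0)  # posterior means from the sampler

print("eta_hat (EB):", eta_hat, " E[eta|y] (Gibbs):", etas[1000:].mean())
```

In the plug-in version no p(η) appears anywhere, which is exactly why the alternating scheme quoted above reads as (hierarchical) Bayes rather than empirical Bayes.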

Beyond this specific case, where I think the entire paragraph should be erased from the “empirical Bayes” Wikipedia page, I discussed the general problem of some poor Bayesian entries in Wikipedia with Robin Ryder, who came up with the neat idea of running (collective) Wikipedia editing labs at ISBA conferences. If we could further give an ISBA label to these entries, as a certificate of “Bayesian orthodoxy” (!), it would be terrific!
