introduced in Stoehr et al. (2014) and the posterior (predictive) error rates in Pudlo, Marin et al. The only common feature between both local error rates is that they depend on the observed data (through some summaries, because of ABC).

The first one is the conditional expected value of the misclassification loss, knowing that the data (or, more precisely, some summaries of the data) are what we have observed. Hence, when we integrate this conditional error over the marginal distribution of the summaries of the data, we recover the misclassification error integrated over the whole prior space.
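In symbols (notation mine, not taken from either paper): writing $m$ for the model index, $\hat{m}(\cdot)$ for the model-choice classifier, and $S$ for the summary statistics, this reads

```latex
% conditional (local) error rate, given the observed summaries s_obs
\mathrm{err}(s_{\mathrm{obs}})
  = \mathbb{E}\left[\mathbb{I}\{\hat{m}(S)\neq m\}\,\middle|\,S=s_{\mathrm{obs}}\right],
% and integrating over the marginal distribution \pi_S of the summaries
% recovers the prior-integrated misclassification error
\int \mathrm{err}(s)\,\pi_S(s)\,\mathrm{d}s
  = \mathbb{P}\left(\hat{m}(S)\neq m\right).
```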

The posterior (predictive) error rate (presented in the paper “*ABC via random forests*”) relies on an expected value over the predictive distribution, knowing the observed data. Thus, it includes a second integral over the data space, which does not appear in the conditional error rate of Stoehr et al., and its computation requires new simulations drawn from the posterior distribution.
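With the same (assumed) notation, the posterior predictive version adds an expectation over a fresh summary $S'$ simulated from the posterior predictive distribution, which is where the second integral over the data space enters:

```latex
\mathrm{err}_{\mathrm{pred}}(s_{\mathrm{obs}})
  = \mathbb{E}\left[\mathbb{I}\{\hat{m}(S')\neq m\}\,\middle|\,s_{\mathrm{obs}}\right],
\qquad
(m,\theta)\sim\pi(m,\theta\mid s_{\mathrm{obs}}),\quad
S'\sim f(\,\cdot\mid m,\theta).
```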

As a consequence, the conditional error rate of Stoehr et al. is on the same footing as the posterior probabilities (see, for instance, Proposition 2 of the paper), a feature not shared by the posterior predictive error.
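A toy numerical sketch of the contrast (my own construction, not the actual procedure of either paper: the models, priors, summaries, and the k-nearest-neighbour classifier are all illustrative choices). The conditional error is estimated locally among reference simulations close to the observed summaries, while the posterior predictive error requires simulating *new* data sets from the (crude ABC) posterior:

```python
# Toy illustration of conditional vs posterior predictive error rates in
# ABC model choice. All modelling choices here are assumptions for the sketch.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier, NearestNeighbors

rng = np.random.default_rng(0)

def simulate(m, theta, n=20):
    """Data from model m: Normal(theta, 1) if m == 0, Laplace(theta, 1) if m == 1."""
    if m == 0:
        return rng.normal(theta, 1.0, n)
    return rng.laplace(theta, 1.0, n)

def summary(x):
    # two summary statistics: sample mean and mean absolute deviation
    return np.array([x.mean(), np.abs(x - np.median(x)).mean()])

# reference table: (model, theta, summaries) simulated from the prior
N = 5000
models = rng.integers(0, 2, N)
thetas = rng.normal(0.0, 2.0, N)          # prior on theta
S = np.array([summary(simulate(m, t)) for m, t in zip(models, thetas)])

clf = KNeighborsClassifier(n_neighbors=50).fit(S, models)

# observed data (here the truth is model 1)
s_obs = summary(simulate(1, 0.5))

# --- conditional error rate: misclassification among neighbours of s_obs ---
nn = NearestNeighbors(n_neighbors=200).fit(S)
idx = nn.kneighbors(s_obs.reshape(1, -1), return_distance=False)[0]
cond_err = np.mean(clf.predict(S[idx]) != models[idx])

# --- posterior predictive error: requires NEW simulations ---
# crude ABC posterior: the (m, theta) pairs attached to the nearest neighbours
err = 0.0
for j in idx:
    s_new = summary(simulate(models[j], thetas[j]))   # fresh data set
    err += clf.predict(s_new.reshape(1, -1))[0] != models[j]
post_pred_err = err / len(idx)

print(f"conditional error rate:          {cond_err:.3f}")
print(f"posterior predictive error rate: {post_pred_err:.3f}")
```

Note how the second estimate cannot be read off the reference table alone: the loop draws fresh data sets, mirroring the extra integral over the data space.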

The solution given by Busser and Cohen in Le Monde for n=10 allows one, as you mentioned, to derive a general analytical solution for n even. The case of n odd is still pending. Did you find something?

This game is a good opportunity to celebrate the birth centennial of Martin Gardner (21 October 1914 – 22 May 2010), who inspired so many people through his Mathematical Games column in Scientific American. See a brief recollection of his main topical columns, on polyominoes, Penrose tiles, the Game of Life, RSA cryptography, etc., in the last issue of SA by S. Mulcahy and D. Richards.

all the n^{-2k} numbers, right?

I wrote a critique of Kieran Healy and James Moody’s paper *Data Visualization in Sociology*; in that paper, they too attempted to use the Minard graphic as a teaching example.

I’m looking forward to the next lectures in this course.
