Archive for the Statistics Category

BayesComp Satellite [AG:DC] program

Posted in Statistics on February 1, 2023 by xi'an

The programme for our [AG:DC] 12-14 March satellite of BayesComp 2023 in Levi, Finland, is now on-line. (There will be a gondola shuttle running from town to hotel for all sessions.)

snapshot from Martinique² [jatp]

Posted in Statistics on January 31, 2023 by xi'an

ABC with path signatures [One World ABC seminar, 2/2/23]

Posted in Books, pictures, Running, Statistics, Travel, University life on January 29, 2023 by xi'an

The next One World ABC seminar is by Joel Dyer (Oxford) at 1:30pm (UK time) on 02 February.

Title: Approximate Bayesian Computation with Path Signatures

Abstract: Simulation models often lack tractable likelihood functions, making likelihood-free inference methods indispensable. Approximate Bayesian computation (ABC) generates likelihood-free posterior samples by comparing simulated and observed data through some distance measure, but existing approaches are often poorly suited to time series simulators, for example due to an independent and identically distributed data assumption. In this talk, we will discuss our work on the use of path signatures in ABC as a means to handling the sequential nature of time series data of different kinds. We will begin by discussing popular approaches to ABC and how they may be extended to time series simulators. We will then introduce path signatures, and discuss how signatures naturally lead to two instances of ABC for time series simulators. Finally, we will demonstrate that the resulting signature-based ABC procedures can produce competitive Bayesian parameter inference for simulators generating univariate, multivariate, irregularly spaced, and even non-Euclidean sequences.

Reference: J. Dyer, P. Cannon, S. M. Schmon (2022). Approximate Bayesian Computation with Path Signatures. arXiv preprint arXiv:2106.12555.
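
For readers unfamiliar with the vanilla version, here is a minimal R sketch of distance-based ABC rejection, with a placeholder summary function where the talk would instead plug in truncated path signatures; the toy AR(1) simulator, uniform prior, summaries, and tolerance below are my own hypothetical illustrations, not the paper's setup.

# minimal ABC rejection sketch (hypothetical toy setup, not the paper's)
# prior: theta ~ U(0,1); simulator: a toy AR(1) series driven by theta
simulate_path <- function(theta, len = 100) {
  x <- numeric(len)
  for (t in 2:len) x[t] <- theta * x[t - 1] + rnorm(1)
  x
}
# placeholder summaries; the talk replaces these with a truncated path signature
summarise <- function(x) c(mean(x), sd(x), acf(x, lag.max = 1, plot = FALSE)$acf[2])
abc_reject <- function(x_obs, N = 1e4, eps = 0.5) {
  s_obs <- summarise(x_obs)
  kept <- numeric(0)
  for (i in 1:N) {
    theta <- runif(1)                          # draw from the prior
    s_sim <- summarise(simulate_path(theta))   # summarise the simulated series
    if (sqrt(sum((s_sim - s_obs)^2)) < eps)    # keep if summaries are close enough
      kept <- c(kept, theta)
  }
  kept
}
x_obs <- simulate_path(0.7)
posterior_sample <- abc_reject(x_obs)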

latest math stats exam

Posted in Books, Kids, R, Statistics, University life on January 28, 2023 by xi'an


As I finished grading our undergrad math stats exam (at Paris Dauphine) over the weekend, which was very straightforward this year, all the more because most questions had already been asked in weekly quizzes or during practicals, some answers struck me as atypical (but ChatGPT is not to blame!). For instance, in question 1, (c) received a fair share of wrong eliminations on the ground that g is not necessarily bounded, rather than being contradicted by (b) being false. (ChatGPT managed to solve that question, except for the L² convergence!)

Question 2 was much less successful than we expected, with most failures due to a catastrophic change of parameterisation for computing the mgf, which could have been skipped altogether given that this is a Bernoulli model, right?! Many students also wasted quite a while computing the Fisher information for the Binomial distribution in Question 3… (ChatGPT managed to solve that question!)
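
As a reminder of why the detour was unnecessary (standard formulas, not lifted from the exam sheet), the Fisher information of a single Bernoulli(p) observation is

\[ I(p) = \mathbb{E}_p\!\left[\left(\frac{\partial}{\partial p}\,\log\left\{p^X(1-p)^{1-X}\right\}\right)^{\!2}\right] = \frac{1}{p(1-p)}, \]

and, by independence, the Binomial(n,p) information is just n times this, n/{p(1-p)}, with no new computation needed.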

Question 4 was intentionally confusing and, while most (of those who dealt with the R questions) spotted the opposition between sample and distribution, hence picking (f), a few fell into the trap of (d).

Question 7 was also, surprisingly, only incompletely covered by a significant fraction of the students, who missed the sufficiency in (c). (ChatGPT did not manage to solve that question, starting with the inverted statement that “a minimal sufficient statistic is a sufficient statistic that is not a function of any other sufficient statistic”…)

And Question 8 was rarely answered in full, even though many recalled Basu’s theorem for (a) [more rarely for (d)] and flunked (c). A large chunk of them argued that the ancillarity of the statistics in (a) and (d) made them [distributionally] independent of μ, and therefore [probabilistically] independent of the empirical mean! (Again flunked by ChatGPT, which confused completeness and sufficiency.)
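
For the record, the missing step is Basu’s theorem itself (standard statement, recalled here under the usual assumption that the empirical mean is complete sufficient for μ in the model at hand):

\[ T \text{ complete sufficient and } A \text{ ancillary} \ \Longrightarrow\ T \perp\!\!\!\perp A, \]

so ancillarity of A only guarantees that its distribution is free of μ; the independence from the empirical mean follows from the mean being complete sufficient, not from ancillarity alone.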

Bayesian thinking for toddler & Bayesian probabilities for babies [book reviews]

Posted in Statistics on January 27, 2023 by xi'an

My friend E.-J. Wagenmakers sent me a copy of Bayesian Thinking for Toddlers, “a must-have for any toddler with even a passing interest in Ockham’s razor and the prequential principle.” E.-J. wrote the story and Viktor Beekman (of thesis’ cover fame!) drew the illustrations. The book can be read for free at https://psyarxiv.com/w5vbp/, but cannot be purchased, as publishers were not interested and self-publishing was not available at a high enough quality level. Hence, in the end, 200 copies were printed as JASP material, with me the happy owner of one of them. The story follows two young girls competing for dinosaur expertise and being rewarded with cookies, in proportion to the probability of providing the correct answer to two dinosaur questions. Toddlers may prove less enthusiastic than grown-ups about the message, but they will love the drawings (and the questions, if they are into dinosaurs).

This reminded me of the Bayesian Probabilities for Babies book by Chris Ferrie, which details the computation of the probability that a cookie contains candy when the first bite holds none. It is more genuinely intended for young kids, in shape and design, as can be checked in a YouTube video, with a hypothetical population of cookies (with and without candy) serving as a proxy for the prior distribution. I hope no baby will be traumatised by being exposed so early to the notions of prior and posterior. Only data can tell, twenty years from now, whether the book induced a spike or a collapse in the proportion of Bayesian statisticians!
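
As a back-of-the-envelope illustration of the book’s computation, in R and with made-up numbers (I do not have the book’s counts at hand): suppose half of a batch of cookies contain candy and a first bite misses the candy three times out of four when it is there.

# hypothetical cookie population, not the book's actual numbers
p_candy         <- 1 / 2    # prior probability that a cookie contains candy
p_miss_if_candy <- 3 / 4    # a first bite misses the candy when it is there
p_miss_if_plain <- 1        # a candy-free cookie never shows candy
# Bayes' rule: P(candy | first bite holds none)
p_candy * p_miss_if_candy /
  (p_candy * p_miss_if_candy + (1 - p_candy) * p_miss_if_plain)
# 0.4286, so a candy-free first bite still leaves decent hope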

[Disclaimer about potential self-plagiarism: this post or an edited version will potentially appear in my Books Review section in CHANCE.]
