## Archive for Royal Society

## Dear President of the European Commission

Posted in Kids, Travel, University life with tags Brexit, Brussels, ETH Zurich, EU, European Commission, European science, European Solidarity, European Union, Horizon Europe, Northern Ireland, Republic of Ireland, Royal Society, Stick to Science, UK politics, Wellcome Trust on July 12, 2022 by xi'an

## Adrian Smith on [lack of] Horizon Europe

Posted in pictures, University life with tags Adrian Smith, Brexit, Horizon Europe, Royal Society, Stick to Science on July 6, 2022 by xi'an

## sunset on the horizon…

Posted in pictures, Travel, University life with tags Adrian Smith, better together, Brexit, ERC, EU, European Research Council, Horizon Europe, Royal Society, UK, UK politics on June 6, 2022 by xi'an

*“The window for association is closing fast, and we need to ensure that political issues do not get in the way of a sensible solution. We have always been very clear that association is the preferred outcome for protecting decades of collaborative research, and the benefits this has brought to people’s lives across the continent and beyond.”* Adrian Smith

As reported in today’s Guardian, a terrible impact of ~~BoJo’s Vote Leave~~ ~~Brexit~~ the UK Government failing to implement the Northern Ireland protocol is that it threatens ERC funding for UK scientists, as the UK’s associate membership of Horizon Europe was part of the Brexit negotiations, whose “deal” has been delayed as a result. About a hundred new ERC grant recipients currently located in the UK must either relocate to (eager) EU universities or give up this most prestigious funding…

## David Cox (1924-2022)

Posted in Books, Statistics, University life with tags ABC, Applied probability, Applied stochastic processes, Biometrika, Birmingham, Copley Medal, Cornell University, Cox process, CREST, David Cox, England, experimental design, FRS, Glasgow, Guy Medal in Gold, International Prize in Statistics, Ithaca, Kettering Prize for Cancer Research, mathematical statistics, Mike Titterington, New York, obituary, Royal Society, statistical methodology, University of Oxford on January 20, 2022 by xi'an

**I**t is with much sadness that I heard from Oxford yesterday night that David Cox had passed away. Hither goes a giant of the field, whose contributions to theoretical and methodological statistics are enormous and whose impact on society is truly exceptional. He was the first recipient of the International Prize in Statistics in 2016 (aka the “Nobel of Statistics”) among many awards, and a Fellow of the Royal Society among many other recognitions. He was also the editor of *Biometrika* for 25 years (!) and was still submitting papers to the journal a few months ago. Statistical Science published a conversation between Nancy Reid and him that tells a lot about the man and his amazing modesty. While I had met him in 1989, when he was visiting Cornell University as a distinguished visitor (and when I drove him to the house of Anne and George Casella for dinner once), then again in the 1990s when he came on a two-day visit to CREST, we only really had a significant conversation in 2011 (!), when David and I attended the colloquium in honour of Mike Titterington in Glasgow and he proved to be most interested in the ABC algorithm. He published a connected paper in *Biometrika* the year after, with Christiana Katsonaki. We met a few more times later, always in Oxford, to again discuss ABC. On each occasion, he was incredibly kind and considerate.

## false confidence, not fake news!

Posted in Books, Statistics with tags Bayes factors, confidence distribution, epistemic probability, Jeffreys-Lindley paradox, Proceedings of the Royal Society, Royal Society on May 28, 2021 by xi'an

“…aerospace researchers have recognized a counterintuitive phenomenon in satellite conjunction analysis, known as probability dilution. That is, as uncertainty in the satellite trajectories increases, the epistemic probability of collision eventually decreases. Since trajectory uncertainty is driven by errors in the tracking data, the seemingly absurd implication of probability dilution is that lower quality data reduce the risk of collision.”

**I**n 2019, Balch, Martin, and Ferson published a false confidence theorem [false confidence, not false theorem!] in the Proceedings of the Royal [astatistical] Society, motivated by satellite conjunction (i.e., fatal encounter) analysis. But discussing in fine the very meaning of a confidence statement. And returning to the century-old opposition between randomness and epistemic uncertainty, aleatory versus epistemic probabilities.

“…the counterintuitiveness of probability dilution calls this [use of epistemic probability] into question, especially considering [its] unsettled status in the statistics and uncertainty quantification communities.”

The practical import of the paper is unclear in that the opposition between aleatory and epistemic probabilities does not really apply when the model connecting the observables with the positions of the satellites is unknown and replaced with a stylised parametric model. When ignoring this aspect of uncertainty, the debate is mostly moot.

“…the problem with probability dilution is not the mathematics (…) if (…) inappropriate, that inappropriateness must be rooted in a mismatch between the mathematics of probability theory and the epistemic uncertainty to which they are applied in conjunction analysis.”

The probability dilution phenomenon as described in the paper is that, when (posterior) uncertainty increases, the posterior probability of collision eventually decreases, which makes sense since poor precision implies the observed distance is less trustworthy and the satellite could be anywhere. To conclude that increasing the prior or epistemic uncertainty makes the satellites safer from collision is thus fairly absurd, as it only concerns the confidence in the statement that there will be a collision. But I agree with the conclusion that the statement of a low posterior probability is a misleading risk metric because, just like p-values, it is a.s. taken at face value. Bayes factors do relativise this statement [but are not mentioned in the paper]. But with the spectre of the Jeffreys-Lindley paradox looming in the background.
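To see the dilution in a minimal sketch (my own hypothetical one-dimensional set-up, not the authors’ example): take the epistemic distribution of the true miss distance θ, given an observed distance x, as N(x, σ²), call |θ| < R a collision, and watch the collision probability as the tracking noise σ grows.

```python
import math

def Phi(z):
    """Standard normal CDF, via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def collision_belief(x, sigma, R):
    """Epistemic probability that |theta| < R when theta ~ N(x, sigma^2)."""
    return Phi((R - x) / sigma) - Phi((-R - x) / sigma)

# hypothetical numbers: observed miss distance and collision radius
x, R = 1.0, 0.5
for sigma in [0.1, 0.5, 1.0, 5.0, 50.0]:
    print(f"sigma = {sigma:5.1f}   P(collision) = {collision_belief(x, sigma, R):.4f}")
```

The probability first rises (the observation becomes less decisive) and then dilutes towards zero as σ grows, so worse data apparently make the satellites “safer”.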

The authors’ notion of *false confidence* is formally a highly probable [in the sample space] report of a high belief in a subset A of the parameter set when the true parameter does not belong to A. Which holds for all epistemic probabilities in the sense that there always exists such a set A. A theorem that I see as related to the fact that integrating an epistemic probability statement [conditional on the data x] wrt the true sampling distribution [itself conditional on the parameter θ] is not coherent from a probabilistic standpoint. The resolution of the paradox follows a principle set by Ryan Martin and Chuanhai Liu, such that “it is almost a tautology that a statistical approach satisfying this criterion will not suffer from the severe false confidence phenomenon”, although it sounds to me that this is a weak patch on a highly perforated tyre, namely the erroneous interpretation of probabilistic statements as frequentist ones.
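The same toy Gaussian set-up (again my own hypothetical illustration, with made-up values of θ, σ, and R) exhibits the phenomenon: put the true θ inside the collision set, make σ large, and the epistemic belief in the non-collision set A, which excludes the truth, is high for essentially every sample.

```python
import math
import random

def Phi(z):
    """Standard normal CDF, via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

random.seed(1)
theta, sigma, R = 0.0, 10.0, 0.5   # the truth IS a collision: |theta| < R
n, high_belief = 10_000, 0
for _ in range(n):
    x = random.gauss(theta, sigma)                       # noisy observation
    b_coll = Phi((R - x) / sigma) - Phi((-R - x) / sigma)  # belief in collision
    if 1.0 - b_coll > 0.9:                               # strong belief in A = {no collision}
        high_belief += 1
print(f"belief in no-collision exceeds 0.9 for {high_belief / n:.0%} of samples")
```

Here the belief in a collision is bounded by 2R times the maximal density, about 0.04 when σ = 10, so the report “no collision, with belief above 0.9” occurs with sampling probability one despite the collision being certain.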