## Archive for Theory of Probability

## Albert Jacquard (1925-2013)

Posted in Books, Statistics with tags Albert Jacquard, Bayes theorem, Bayesian statistics, INED, INSEE, population genetics, Theory of Probability on September 21, 2013 by xi'an

**A**lbert Jacquard passed away last week. He was a humanist, engaged in the defence of the outcasts (*laissés pour compte*) of society, like the homeless and illegal immigrants. He had a regular two-minute column on France Culture that I used to listen to (when driving at that time of the day). The obituaries published in recent days put this side of the character forward, while saying very little about his scientific legacy. He was a statistician, first at INSEE, then at INED. After getting a PhD in genetics from Stanford in 1968, he went back to INED as a population geneticist and wrote in 1978 his most famous book, *Éloge de la Différence*, against racial theories; it was the first in a long series of popular-science and philosophical books. Among his scientific books, he wrote the entry on Probabilités in the popular series “Que Sais-Je?”, which sold more than 40,000 copies and was used by generations of students. (Within its 125 pages, the imposed length of a “Que Sais-Je?”, the book covers Bayes’ theorem and, more importantly, the Bayesian approach to estimating unknown probabilities!)

## who’s afraid of the big B wolf?

Posted in Books, Statistics, University life with tags Allan Birnbaum, asymptotics, Bayes factor, Bayesian inference, Jeffreys-Lindley paradox, likelihood ratio, p-values, prior selection, testing of hypotheses, The Likelihood Principle, Theory of Probability on March 13, 2013 by xi'an

**A**ris Spanos just published a paper entitled “Who should be afraid of the Jeffreys-Lindley paradox?” in the journal *Philosophy of Science*. This piece is a continuation of the debate about frequentist versus likelihoodist versus Bayesian (should it be Bayesianist?! or Laplacist?!) testing approaches, exposed in Mayo and Spanos’ *Error and Inference* and discussed in several posts of the ‘Og. I started reading the paper in conjunction with a paper I am currently writing for a special volume in honour of Dennis Lindley, a paper that I will discuss later on the ‘Og…

“…the postdata severity evaluation (…) addresses the key problem with Fisherian p-values in the sense that the severity evaluation provides the “magnitude” of the warranted discrepancy from the null by taking into account the generic capacity of the test (that includes n) in question as it relates to the observed data” (p.88)

**F**irst, the antagonistic style of the paper reminds me of Spanos’ previous works in that it relies on repeated value judgements (such as *“Bayesian charge”*, *“blatant misinterpretation”*, *“Bayesian allegations that have undermined the credibility of frequentist statistics”*, *“both approaches are far from immune to fallacious interpretations”*, *“only crude rules of thumbs”*, &tc.) and rhetorical sleights of hand. (See, e.g., “In *contrast*, the severity account *ensures* learning from data by employing *trustworthy* evidence (…), the *reliability* of evidence being calibrated in terms of the *relevant* error probabilities” [my stress].) Connectedly, Spanos often resorts to an unusual [at least for statisticians] vocabulary that amounts to newspeak. Here are some illustrations: *“summoning the generic capacity of the test”*, *“substantively significant”*, *“custom tailoring the generic capacity of the test”*, *“the fallacy of acceptance”*, *“the relevance of the generic capacity of the particular test”*; yes, the term *“generic capacity”* occurs there with a truly high frequency.

## 17 equations that changed the World (#2)

Posted in Books, Statistics with tags 17 equations That Changed the World, BBC, Black and Scholes formula, book review, Dojima rice exchange, Edwin Jaynes, financial crisis, Harold Jeffreys, Henri Poincaré, Ian Stewart, Michelson-Morley, Stephen Wolfram, The Black Swan, The Universe in zero words, Theory of Probability, Vladimir Arnold, wikipedia, xkcd on October 16, 2012 by xi'an

*(continuation of the book review)*

“If you placed your finger at that point, the two halves of the string would still be able to vibrate in the sin 2x pattern, but not in the sin x one. This explains the Pythagorean discovery that a string half as long produced a note one octave higher.” (p.143)
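As a hedged aside not taken from the book: the quoted octave fact follows from the standing-wave frequencies of a string fixed at both ends, f_n = n·v/(2L). A minimal sketch (the wave speed v and length L below are illustrative assumptions of mine):

```python
# Standing-wave frequencies of a string of length L fixed at both ends:
# f_n = n * v / (2 * L), where v is the wave speed along the string.
def mode_frequency(n, v, L):
    """Frequency of the n-th vibration mode (n = 1 is the fundamental)."""
    return n * v / (2 * L)

v, L = 340.0, 1.0                     # illustrative values, not from the book
f_full = mode_frequency(1, v, L)      # fundamental of the full string
f_half = mode_frequency(1, v, L / 2)  # fundamental of the half-length string
print(f_half / f_full)                # 2.0: halving the string raises the pitch one octave
```

Pressing a finger at the midpoint suppresses the odd modes (sin x) while leaving the even ones (sin 2x) free, hence the octave.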

**T**he following chapters are all about Physics: the wave equation, Fourier’s transform and the heat equation, Navier-Stokes’ equation(s), Maxwell’s equation(s) (as in *The Universe in Zero Words*), the second law of thermodynamics, E=mc² (of course!), and Schrödinger’s equation. I won’t go so much into details for those chapters, even though they are remarkably written. For instance, the chapter on waves made me understand the notion of harmonics in a much more intuitive and lasting way than previous readings. (This chapter 8 also mentions the “*English mathematician Harold Jeffreys*”, while Jeffreys was primarily a geophysicist. And a Bayesian statistician with major impact on the field, his *Theory of Probability* arguably being the first modern Bayesian book. Interestingly, Jeffreys also was the first one to find approximations to Schrödinger’s equation, though he is not mentioned in this later chapter.) Chapter 9 mentions the heat equation but is truly about Fourier’s transform, which Fourier used as a tool and which later became a universal technique. It also covers Lebesgue’s integration theory, wavelets, and JPEG compression. Chapter 10 on Navier-Stokes’ equation also mentions climate sciences, where it takes a (reasonable) stand. Chapter 11 on Maxwell’s equations is a short introduction to electromagnetism, with radio the obvious illustration. (Maybe not the best chapter in the book.)

## not only defended but also applied [to appear]

Posted in Books, Statistics, University life with tags Andrew Gelman, arXiv, ASA, discussion paper, Harold Jeffreys, The American Statistician, Theory of Probability, William Feller on June 12, 2012 by xi'an

**O**ur paper with Andrew Gelman, *“Not only defended but also applied”: the perceived absurdity of Bayesian inference*, has been reviewed for the second time and is to appear in *The American Statistician* as a discussion paper. Terrific news! This is my first discussion paper in *The American Statistician* (and the second in total, the first one being the re-read of Jeffreys’ *Theory of Probability*).

*[The updated version is now on arXiv.]*

## May I believe I am a Bayesian?!

Posted in Books, Statistics, University life with tags Bayesian inference, E.T. Jaynes, foundations, Harold Jeffreys, Markets and Morals, Rationality, Ronald Fisher, Stephen Hawking, Theory of Probability, Thomas Bayes on January 21, 2012 by xi'an

“…the argument is false that because some ideal form of this approach to reasoning seems excellent in theory it therefore follows that in practice using this and only this approach to reasoning is the right thing to do.” Stephen Senn, 2011

**D**eborah Mayo, Aris Spanos, and Kent Staley have edited a special issue of *Rationality, Markets and Morals (RMM)* (a rather weird combination, esp. for a journal name!) on “*Statistical Science and Philosophy of Science: Where Do (Should) They Meet in 2011 and Beyond?*”, for which comments are open. Stephen Senn has a paper therein entitled *You May Believe You Are a Bayesian But You Are Probably Wrong*, in his usual witty, entertaining, and… Bayesian-bashing style! I find it very kind of him to allow us to remain in the wrong, very kind indeed…

**N**ow, the paper somehow intersects with the comments Stephen made on our review of Harold Jeffreys’ *Theory of Probability* a while ago. It contains a nice introduction to the four great systems of statistical inference, embodied by de Finetti, Fisher, Jeffreys, and Neyman plus Pearson. The main criticism of Bayesianism à la de Finetti is that it is so perfect as to be otherworldly. And, since this perfection is lost in the practical implementation, there is no compelling reason to be a Bayesian. Worse, all practical Bayesian implementations conflict with Bayesian principles. Hence a Bayesian author “in practice is wrong”. Stephen concludes with a call for eclecticism, quite in line with his usual style since this is likely to antagonise everyone. (I wonder whether or not having no final dot to the paper has a philosophical meaning. Since I have been caught over-interpreting book covers, I will not say more!) As I will try to explain below, I believe Stephen has paradoxically fallen victim of over-theorising/philosophising himself! (I refer the interested reader to the above post, as well as to my comments on Don Fraser’s “*Is Bayes posterior quick and dirty confidence?*”, for more related points. Esp. about Senn’s criticisms of objective Bayes on page 52, which are not so central to this discussion… Same thing for the different notions of probability [p.49] and the relative difficulties of the terms in (2) [p.50]. Deborah Mayo has a “deconstructed” version of Stephen’s paper on her blog, with a much deeper if deBayesian philosophical discussion. And then Andrew Jaffe wrote a post in reply to Stephen’s paper, whose points I cannot discuss for lack of time, but with an interesting mention of Jaynes as missing in Senn’s pantheon.)

“The Bayesian theory is a theory on how to remain perfect but it does not explain how to become good.” Stephen Senn, 2011

**W**hile associating theories with characters is a reasonable rhetorical device, especially with larger-than-life characters such as the ones above!, I think it deters the reader from a philosophical questioning of the theory behind the (big) man. (In fact, it is a form of bullying or, more politely (?), of having big names shoved down your throat as a form of argument.) In particular, Stephen freezes the (Bayesian reasoning about the) Bayesian paradigm in its de Finetti phase-state, arguing about what de Finetti thought and believed. While this is historically interesting, I do not see why we should care at the *praxis* level. (I have made similar comments on this blog about the unpleasant aspects of being associated with one character, esp. the mysterious Reverend Bayes!) But this is not my main point.

“…in practice things are not so simple.” Stephen Senn, 2011

**T**he core argument in Senn’s diatribe is that reality is always more complex than the theory allows for, and thus that a Bayesian has to compromise on her/his perfect theory with reality/practice in order to reach decisions. A kind of philosophical equivalent to Achilles and the tortoise. However, it seems to me that the very fact that the Bayesian paradigm is a learning principle implies that imprecisions and imperfections are naturally incorporated into the decision process, thus avoiding the apparent infinite regress (*Regress ins Unendliche*) of having to run a Bayesian analysis to derive the prior for the Bayesian analysis at the level below (which is how I interpret Stephen’s first paragraph in Section 3). By refusing the transformation of a perfect albeit ideal Bayesian into a practical if imperfect Bayesian (or coherent learner, or whatever name does not sound like being a member of a sect!), Stephen falls short of incorporating the *contrainte de réalité* into his own paradigm. The further criticisms about prior justification, construction, and evaluation (pp.59-60) are also of that kind, namely preventing the statistician from incorporating a degree of (probabilistic) uncertainty into her/his analysis.
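The point that the learning mechanism absorbs imprecision in the prior can be sketched with a toy conjugate example, a Beta-Binomial model of my own choosing (not one appearing in Senn’s paper): two analysts starting from quite different priors end up with near-identical posteriors once enough data accumulates.

```python
# Two analysts hold different Beta priors on the same unknown probability p.
# After observing k successes in n trials, the Beta(a, b) prior updates to
# Beta(a + k, b + n - k); the posterior mean is (a + k) / (a + b + n).
def posterior_mean(a, b, k, n):
    return (a + k) / (a + b + n)

rate = 0.3                                # assumed underlying frequency for the sketch
for n in (10, 100, 10_000):
    k = round(rate * n)
    flat = posterior_mean(1, 1, k, n)     # uniform prior
    biased = posterior_mean(20, 5, k, n)  # prior strongly favouring large p
    print(n, round(flat, 3), round(biased, 3))
```

Both posterior means approach 0.3 as n grows; the disagreement induced by the “imperfect” choice of prior shrinks at rate O(1/n), which is one way to read the claim that a learning principle accommodates imperfection.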

**I**n conclusion, reading Stephen’s piece was a pleasant and thought-provoking moment. I am glad to be allowed to believe I am a Bayesian, even though I do not believe it is a belief! The *praxis* of thousands of scientists using Bayesian tools with their personal degree of subjective involvement is an evolving organism that reaches much further than the highly stylised construct of de Finetti (or of de Finetti restaged by Stephen!), and it appropriately gets away from claims to being perfect or right. Or even to being more philosophical.