Archive for The Monty Hall problem

a new Monty Hall riddle

Posted in Books, Kids, Mountains, pictures, R, Statistics, Travel on May 22, 2020 by xi'an

The Riddler was sort of feeling the rising boredom of being under lockdown when proposing the following variant of the Monty Hall puzzle:

There are zero to three goats, with probability ¼ each, and they are allocated to distinct doors, uniformly among the three doors of the show. After the player chooses a door, Monty opens another door hiding a goat, or signals that this is impossible. Given that he did open a door, what is the probability that the player’s door does not hide a goat?

Indeed, a straightforward conditional probability computation over the eight possible goat configurations, of which six allow Monty to open a door, leads to a probability of 3/8 that the player’s door does not hide a goat, as confirmed by the following R code:

s=sample
m=c(0,0) #counts of (player's door goat-free & Monty opens, Monty opens), with door 1 as the player's door
for(t in 1:1e6)m=m+(range(s(1:3,s(1:3,1)))>1) #draw 1-3 goat doors; min>1: door 1 goat-free, max>1: Monty can open
m[1]/m[2] #approximately 3/8 (the zero-goat case never lets Monty open a door, hence can be omitted)
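
For what it is worth, the 3/8 value can also be recovered by exact enumeration of the eight goat configurations, as in this short sketch of mine, where door 1 again stands for the player’s door:

cfg <- expand.grid(d1=0:1, d2=0:1, d3=0:1)  #1 means a goat behind that door
p <- 1/4/choose(3, rowSums(cfg))            #P(N goats) times P(configuration | N goats)
opens <- cfg$d2 + cfg$d3 > 0                #a goat behind door 2 or 3: Monty can open
sum(p[opens & cfg$d1 == 0])/sum(p[opens])   #returns 3/8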

Bayesian filtering and smoothing [book review]

Posted in Books, Statistics, Travel, University life on February 25, 2015 by xi'an

When in Warwick last October, I met Simo Särkkä, who told me he had published an IMS monograph on Bayesian filtering and smoothing the year before. I thought it would be an appropriate book to review for CHANCE and tried to get a copy from Oxford University Press, unsuccessfully. I thus bought my own copy, which I received two weeks ago, and took the opportunity of my Czech vacations to read it… [A warning pre-empting accusations of self-plagiarism: this is a preliminary draft of a review to appear in CHANCE under my true name!]

“From the Bayesian estimation point of view both the states and the static parameters are unknown (random) parameters of the system.” (p.20)

Bayesian filtering and smoothing is an introduction to the topic that essentially starts from ground zero. Chapter 1 motivates the use of filtering and smoothing through examples and highlights the naturally Bayesian approach to the problem(s). Two graphs illustrate the difference between filtering and smoothing by plotting, for the same series of observations, the successive confidence bands. The performances are obviously poorer with filtering, but those intervals are point-wise rather than joint, i.e., the graphs do not provide genuine confidence bands. (The exercise section of that chapter is superfluous in that it suggests re-reading Kalman’s original paper and rephrases the Monty Hall paradox in a story unconnected with filtering!) Chapter 2 gives an introduction to Bayesian statistics in general, with a few pages on Bayesian computational methods. A first remark is that the above quote is both correct and mildly confusing, in that the parameters can be consistently estimated, while the latent states cannot. A second remark is that justifying the MAP as associated with the 0-1 loss is incorrect in continuous settings. The third chapter deals with the batch updating of the posterior distribution, i.e., the fact that the posterior at time t is the prior at time t+1, with applications to state-space systems including the Kalman filter. The fourth to sixth chapters concentrate on this Kalman filter and its extensions, and I find them somewhat unsatisfactory in that the collection of such filters is overwhelming for a neophyte. And no assessment of the estimation error when the model is misspecified appears at this stage. And, as usual, I find the unscented Kalman filter hard to fathom! The same feeling applies to the smoothing chapters, from Chapter 8 to Chapter 10, which mimic the earlier ones.
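
For readers unfamiliar with the recursion at the core of these chapters, here is a minimal R sketch of the Kalman filter for a univariate random-walk-plus-noise model, a toy illustration of my own rather than an example taken from the book:

set.seed(1)
n <- 100; q <- .1; r <- 1            #state and observation noise variances
x <- cumsum(rnorm(n, sd=sqrt(q)))    #latent random walk: x_t = x_{t-1} + N(0,q)
y <- x + rnorm(n, sd=sqrt(r))        #observations: y_t = x_t + N(0,r)
m <- 0; P <- 10                      #prior mean and variance at time 0
filt <- matrix(NA, n, 2)             #filtering means and variances
for (t in 1:n){
  P <- P + q                         #prediction: the time t-1 posterior becomes the time t prior
  K <- P/(P + r)                     #Kalman gain
  m <- m + K*(y[t] - m)              #update: posterior mean given y_1,...,y_t
  P <- (1 - K)*P                     #posterior variance
  filt[t,] <- c(m, P)
}

The prediction step turns the time t-1 posterior into the time t prior, which is exactly the batch updating described in the third chapter, and a smoother would add a backward pass through the stored means and variances.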

no more car talk

Posted in Books, Kids, Travel on November 9, 2014 by xi'an

When I first went to the US in 1987, I switched from listening to the French public radio to listening to NPR, the National Public Radio network. However, it was not until I met both George Casella and Bernhard Flury that I started listening to “Car Talk”, the Sunday morning talk-show by the Magliozzi brothers where listeners would call to expose their car problems and get jokes and sometimes advice in reply. Both George and Bernhard were big fans of the show, much more for the unbelievably high spirits it provided than for any deep interest in mechanics. And indeed there was something of the spirit of Zen and the art of motorcycle maintenance in that show, namely that, through mechanical issues, people would come to expose deeper worries that the Magliozzi brothers would help bring out, playing the role of garage-shack psychiatrists… Which made me listen to them, despite my complete lack of interest in cars, mechanics, and repair in general.

One of George’s moments of fame was when he wrote to the Magliozzi brothers about Monty Hall’s problem, because they had botched their explanation as to why one should always change door. And they read his letter on the air, with the line “Who is this Casella guy from Cornell University? A professor? A janitor?”, since George had just signed George Casella, Cornell University. Besides, Bernhard was such a fan of the show that he taped every single morning show, which he would later replay on long car trips (I do not know how his family enjoyed the exposure to the show, though!). And so he happened to have this line about George on tape, which he sent him a few weeks later… I am reminiscing about all this because I saw in the NYT today that the older brother, Tom Magliozzi, had just died. Some engines, alas, cannot be fixed… But I am sure there will be a queue of former car addicts in some heavenly place, eager to ask him their questions about their favourite car. Thanks for the ride, Tom!

my statistician friend

Posted in Books, Kids, Running, Statistics, University life on April 7, 2013 by xi'an

A video made in Padova (and shown during a break at the workshop): watch out for Bayes’ theorem!

“la formule qui décrypte le monde”

Posted in Books, Statistics, University life on November 6, 2012 by xi'an

“It is only in the 1980s that the American mathematician Judea Pearl showed that, by aligning hundreds of Bayes formulas, it was possible to take into account the multiple causes of a complex phenomenon.” (my translation)

As a curious coincidence, the latest issue of Science & Vie appeared on the day I was posting about Peter Coles’s warnings on scientific communication. The cover title of the magazine is the title of this post, The formula decrypting the World, and it is of course about… Bayes’ formula, no-one else’s!!! The major section (16 pages) in this French popular-science magazine is indeed dedicated to Bayesian statistics, and even more so to Bayesian networks, with the usual stylistic excesses of journalism. As it happens, one of the journalists in charge of this issue came to discuss the topic with me a long while ago in Paris-Dauphine, and I remember the experience as being not particularly pleasant, since I had trouble communicating the ideas of Bayesian statistics in layman’s terms. In the end, this rather lengthy interview produced two quotes from me, one that could be mine (in connection with some sentences from Henri Poincaré) and another that is definitely apocryphal (yes, indeed, the one above! I am adamant I could not have mentioned Judea Pearl, whose work I am not familiar with, and even less this bizarre image of hundreds of Bayes’ theorems… Presumably, this got mixed up with a quote from another interviewed Bayesian. The same misquoting occurred for my friend Jean-Michel Marin!).

Among the illustrations selected as vignettes in the magazine are the Monty Hall paradox (which is an exercise in conditioning, not in statistical reasoning!), signal processing for microscope images, Bayesian networks for robots, population genetics (and the return of the musk ox!), stellar cloud formation, tsunami prediction, microarray analysis, climate meta-analysis (with a quote from Noel Cressie), post-Higgs particle physics, the invalidation of ESP studies by Wagenmakers (missing the fact that the reply by Bem, Utts, and Johnson is equally Bayesian), and quantum physics. From a more remote perspective, those are all scientific studies using Bayesian statistics to establish important and novel results. However, it would have been easy to come up with equally important and novel results demonstrated via classical non-Bayesian approaches, such as the discovery of the Higgs boson. Now, I understand the difficulty in conveying to the layman the difference resulting from using Bayesian reasoning to support a scientific argument; however, this accumulation of superlatives opens the door to suspicions of bias and truncated perspectives… The second half of the report is less about statistics and more about psychology and learning, expanding on the notion that the brain operates in ways similar to Bayesian learning and networks.
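
As an aside, since the Monty Hall paradox is indeed a conditioning exercise, here is a short simulation of mine (not from the magazine) recovering the 2/3 probability that switching wins, given the door Monty opens:

set.seed(3)
car <- sample(1:3, 1e5, replace=TRUE)                                         #the player always picks door 1
open <- ifelse(car==2, 3, ifelse(car==3, 2, sample(2:3, 1e5, replace=TRUE)))  #Monty opens a goat door other than door 1, at random when he has a choice
mean(car[open==3]==2)                                                         #P(car behind door 2 | Monty opens door 3), approximately 2/3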
