Archive for CERN

the many nuances of Bayesian testing [CERminar]

Posted in Statistics on January 19, 2022 by xi'an

CERminar

how can we tell someone “be natural”? [#2]

Posted in Books, Kids, pictures, University life on November 17, 2013 by xi'an

Following my earlier high school composition (or, as my daughter would stress, a first draft of vague ideas towards a composition!), I came upon an article in the Science leaflet of Le Monde (of October 25) by the physicist Marco Zito (already commented on the ‘Og): “How natural is Nature?”. The following is my (commented) translation of the column. I cannot say I understand more than half of the words, or much of its meaning, although checking some Wikipedia entries helped. (I wonder how many readers made it to the end of this tribune.)

The above question is related to physics in that (a) the electroweak interaction scale is about the mass of the Higgs boson, at which scale [of order 100 GeV] the electromagnetic and the weak forces are of the same intensity. And (b) there exists a gravitation scale, Planck’s mass, which is the energy [about 1.2209×10¹⁹ GeV] where gravitation [general relativity] and quantum physics must be considered simultaneously. The difficulty is that this second fundamental scale differs from the first one, being larger by 17 orders of magnitude [so what?!]. The difference is puzzling, as a world with two fundamental scales that are so far apart does not sound natural [how does he define natural?]. The mass of the Higgs boson depends on the other elementary particles and on the fluctuations of the related fields. Those fluctuations can be very large, of the same order as Planck’s scale. The sum of all those terms [which terms, dude?!] has no reason to be small. In most possible universes, the mass of this boson should thus compare with Planck’s mass, hence a contradiction [uh?!].
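For scale (my own back-of-the-envelope, not in the column): the ratio of the two numbers is roughly 1.2209×10¹⁹ GeV / 10² GeV ≈ 10¹⁷, hence the seventeen orders of magnitude.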

And then enters this apparently massive probabilistic argument:

If you ask passers-by to each select a number between two large bounds, like -10,000 and 10,000, it is very unlikely to obtain exactly zero as the sum of those numbers. So if you observe zero as the sum, you will consider the result is not «natural» [I’d rather say that the probabilistic model is wrong]. The physicists’ reasoning so far was: «Nature cannot be unnatural. Thus the problem of the mass of the Higgs boson must have a solution at energy scales that can be explored by CERN. We could then uncover a new and interesting physics.» Sadly, CERN has not (yet?) discovered new particles or new interactions. There is therefore no «natural» solution. Some of us imagine an unknown symmetry that bounds the mass of the Higgs boson.
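To put a rough number on how «unnatural» an exact zero would be, here is a minimal sketch (mine, in Python, not part of the column); the number of passers-by and the bounds are arbitrary choices for illustration.

    import math
    import random

    n_people, bound = 100, 10_000

    # local CLT approximation: P(sum = 0) is about 1 / (sigma * sqrt(2*pi)),
    # where sigma = bound * sqrt(n_people / 3) is the sd of the sum of the picks
    sigma = bound * math.sqrt(n_people / 3.0)
    print(f"normal approximation : {1 / (sigma * math.sqrt(2 * math.pi)):.1e}")

    # crude Monte Carlo check; expect only a handful of exact zeros, or none at all
    trials = 200_000
    hits = sum(
        sum(random.randint(-bound, bound) for _ in range(n_people)) == 0
        for _ in range(trials)
    )
    print(f"Monte Carlo ({trials:,} runs): {hits / trials:.1e}")

Both numbers come out around 7×10⁻⁶, i.e. an exact zero essentially never happens under this uniform model.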

And a conclusion that could work for a high school philosophy homework:

This debate is typical of how science proceeds forward. Current theories are used to predict beyond what has been explored so far. This extrapolation works for a little while, but some facts eventually come to invalidate them [sounds like philosophy of science 101, no?!]. Hence the importance of validating our theories through experiment, so as to abstain from attributing to Nature discourses that only reflect our own prejudices.

This Le Monde Science leaflet also had a short entry on a meteorite called Hypatia, because it was found in Egypt, home to the 4th century Alexandria mathematician Hypatia. And a book review of (the French translation of) Perfect Rigor, a second-hand biography of Grigory Perelman by Masha Gessen. (Terrible cover, by the way: don’t they know at Houghton Mifflin that the integral sign is an elongated S, for sum, and not an f?! Andrew and I happened to discuss and deplore the other day this ridiculous tendency to mix wrong math symbols and Greek letters in the titles of general-public math books. The title itself is not much better: what is imperfect rigor?!) And the Le Monde math puzzle #838

about randomness (im Hamburg)

Posted in Statistics, Travel, University life on February 20, 2013 by xi'an

[exhibit on the DESY campus, Hamburg, Germany, Feb. 19, 2013]

True randomness was the topic of the ‘Random numbers; fifty years later’ talk at DESY by Frederick James from CERN. I had discussed a while ago a puzzling book related to this topic. This talk went along a rather different route, focussing on random generators. James put forward the claim that there are computer-based physical generators that are truly random. (He had this assertion that statisticians do not understand randomness because they do not know quantum mechanics.) He distinguished those from pseudo-random generators: “nobody understood why they were (almost) random”, “IBM did not know how to generate random numbers”… But he then spent the whole talk discussing those pseudo-random generators. Among other pieces of trivia, James mentioned that George Marsaglia was the one who exhibited the hyperplane features of congruential generators (see the small illustration below). And that Knuth achieved no successful definition of what randomness is in his otherwise wonderful books! James thus introduced Kolmogorov’s mixing (not Kolmogorov’s complexity, mind you!) as advocated by Soviet physicists to underlie randomness, even though it produced nothing useful for RNGs in the 60’s. He then moved to the famous paper by Ferrenberg, Landau and Wong (1992), which I remember reading more or less at the time, in connection with the critical slowing-down phenomenon at phase transitions in Ising model simulations, and with the Wolff cluster algorithm of flipping many sites at once (which exhibited long-term dependences in the generators). Most interestingly, a central character in this story is Martin Lüscher, based at DESY, who recast the standard generator of the time, RCARRY, as one of the dynamical systems studied by those Soviet mathematicians,

X’=AX

showing that it enjoyed Kolmogorov mixing, but with a very poor Lyapunov coefficient. I partly lost track there, as RCARRY was not perfect, and on how this Kolmogorov mixing would relate to the long-term dependencies. One explanation by James was that this property is only asymptotic. (I would even say statistical!) Also interestingly, the 1994 paper by Lüscher gives the number of steps necessary to attain complete mixing, namely 15 steps, which thus works as a cutoff point. (I wonder why a 15-step RCARRY is slower, since A¹⁵ can be computed at once… It may be due to the fact that A is sparse while A¹⁵ is not.) James mentioned that Marsaglia’s Diehard battery of tests is now obsolete and superseded by Pierre L’Ecuyer’s TestU01.
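Back to Marsaglia’s hyperplane remark above: a tiny illustration (mine, not from the talk), using IBM’s infamous RANDU congruential generator, whose consecutive triples all obey the same linear relation modulo 2³¹ and hence sit on only 15 planes in the unit cube.

    M = 2**31

    def randu(seed=1, n=10_000):
        """The (infamous) RANDU congruential generator: x -> 65539 * x mod 2^31."""
        x, out = seed, []
        for _ in range(n):
            x = (65539 * x) % M
            out.append(x)
        return out

    xs = randu()
    # every consecutive triple satisfies the same planar relation exactly,
    # which is why plotted triples of RANDU outputs collapse onto a few planes
    assert all((xs[i + 2] - 6 * xs[i + 1] + 9 * xs[i]) % M == 0
               for i in range(len(xs) - 2))
    print("all RANDU triples satisfy 9*x[i] - 6*x[i+1] + x[i+2] = 0 (mod 2^31)")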
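And on the parenthetical about computing the 15-step update in one go, a toy sketch (mine; the modulus, dimension, and matrix are made up for illustration, this is not Lüscher’s actual RCARRY/RANLUX) of the fact that iterating X’ = AX (mod m) fifteen times is the same as a single multiplication by the precomputed A¹⁵:

    import numpy as np

    m = 2**24 - 3                         # arbitrary toy modulus, not RCARRY's
    rng = np.random.default_rng(0)
    A = rng.integers(0, m, size=(4, 4))   # stand-in transition matrix
    x = rng.integers(0, m, size=4)        # stand-in state vector

    def matpow_mod(B, k, m):
        """B**k mod m by repeated squaring (entries stay far below the int64 limit)."""
        result = np.eye(B.shape[0], dtype=np.int64)
        while k:
            if k & 1:
                result = (result @ B) % m
            B = (B @ B) % m
            k >>= 1
        return result

    A15 = matpow_mod(A, 15, m)

    y = x.copy()
    for _ in range(15):                   # fifteen single steps...
        y = (A @ y) % m
    assert np.array_equal(y, (A15 @ x) % m)   # ...equal one step with the precomputed power
    print("15 single updates == one multiplication by A^15 (mod m)")

The density point stands, though: A here is dense, whereas the appeal of RCARRY-type recursions is precisely that one step touches only a couple of state entries.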

In conclusion, I did very much like this presentation from an insider, but still do not feel it makes a contribution to the debate on randomness, as it stayed put on pseudorandom generators. To keep the connection with von Neumann, they all produce wrong answers from a randomness point of view, if not from a statistical one. (A final quote from the talk: “Among statisticians and number theorists who are supposed to be specialists, they do not know about Kolmogorov mixing.”) [Discussing with Fred James at the reception after the talk was obviously extremely pleasant, as he happened to know a lot of my Bayesian acquaintances!]