Archive for realism

machine learning and the future of realism

Posted in Books, Kids, Statistics, University life on May 4, 2017 by xi'an

Giles and Cliff Hooker arXived a paper last week with this intriguing title. (Giles Hooker is an associate professor of statistics and biology at Cornell U, with an interesting blog on the notion of models, while Cliff Hooker is a professor of philosophy at Newcastle U, Australia.)

“Our conclusion is that simplicity is too complex”

The debate in this short paper is whether or not machine learning relates to a model. Or is it concerned with sheer (“naked”) prediction? And then does it pertain to science any longer?! While it sounds obvious at first, defining why science is more than prediction of effects given causes is much less obvious, although prediction sounds more pragmatic and engineer-like than scientific. (Furthermore, prediction has a somewhat negative flavour in French, being used as a synonym for divination and opposed to prévision.) In more philosophical terms, prediction offers no ontological feature. As for a machine learning structure like a neural network being scientific or a-scientific, its black box nature makes it much more the latter than the former, in that it brings no explanation for the connection between input and output, between regressed and regressors. It further lacks the potential for universality of scientific models. For instance, as mentioned in the paper, Newton’s law of gravitation applies to any pair of weighted bodies, while a neural network built on a series of observations could not be assessed or guaranteed outside the domain where those observations were taken. Plus, it would miss the simple square law established by Newton. Most fascinating questions, undoubtedly, which put the stress on models from a totally different perspective than last week’s meeting at the RSS.
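The extrapolation point is easy to demonstrate on a toy example of my own (not taken from the paper), substituting a flexible polynomial fit for the neural network: trained on inverse-square “gravity” data observed for distances in [1, 2], the fit matches the data very well there, yet is wildly off once queried at r = 10, and its coefficients carry no trace of the r⁻² law.

```python
# Toy illustration (not from the Hookers' paper): a flexible black-box fit,
# here a degree-5 polynomial standing in for a neural network, trained on
# inverse-square "gravity" data observed for r in [1, 2], then queried at r = 10.

def polyfit(xs, ys, degree):
    """Least-squares polynomial fit (in centred x) via normal equations."""
    x0 = sum(xs) / len(xs)              # centre the covariate for conditioning
    ts = [x - x0 for x in xs]
    n = degree + 1
    a = [[sum(t ** (i + j) for t in ts) for j in range(n)] for i in range(n)]
    b = [sum(y * t ** i for t, y in zip(ts, ys)) for i in range(n)]
    # Gaussian elimination with partial pivoting
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = a[r][col] / a[col][col]
            for c in range(col, n):
                a[r][c] -= f * a[col][c]
            b[r] -= f * b[col]
    coef = [0.0] * n
    for i in reversed(range(n)):
        coef[i] = (b[i] - sum(a[i][j] * coef[j] for j in range(i + 1, n))) / a[i][i]
    return x0, coef

def predict(fit, x):
    x0, coef = fit
    return sum(c * (x - x0) ** i for i, c in enumerate(coef))

# "observations" of an inverse-square law on the domain [1, 2]
rs = [1 + k / 50 for k in range(51)]
fit = polyfit(rs, [r ** -2 for r in rs], degree=5)

in_range_err = max(abs(predict(fit, r) - r ** -2) for r in rs)
extrap_err = abs(predict(fit, 10.0) - 10.0 ** -2)
# in_range_err is tiny, extrap_err is enormous: the black box interpolates
# the data without ever discovering the square law that generated it
```

Newton’s law extrapolates because it encodes a structural claim about any pair of bodies; the fitted object above only summarises the observed range.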

As for machine learning being a challenge to realism, I am none the wiser after reading the paper. Utilising machine learning tools to produce predictions of causes given effects does not seem to modify the structure of the World, and hardly modifies our understanding of it, since they do not bring explanation per se. What would lead to anti-realism is the adoption of those tools as substitutes for scientific theories and models.

on de Finetti’s instrumentalist philosophy of probability

Posted in Books, Statistics, Travel, University life on January 5, 2016 by xi'an

Pont Alexandre III, Paris, May 8, 2012. On our way to the old-fashioned science museum, Palais de la Découverte, we had to cross the bridge on foot as the nearest métro station was closed, due to N. Sarkozy taking part in a war memorial ceremony there…

On Wednesday January 6, there is a conference in Paris [10:30, IHPST, 13, rue du Four, Paris 6] by Joseph Berkovitz (University of Toronto) on the philosophy of probability of Bruno de Finetti. Too bad this is during MCMSkv!

Here is the abstract of the talk:

“De Finetti is one of the founding fathers of the modern theory of subjective probability, where probabilities are coherent degrees of belief. De Finetti held that probabilities are inherently subjective and he argued that none of the objective interpretations of probability makes sense. While his theory has been influential in science and philosophy, it has encountered various objections. In particular, it has been argued that de Finetti’s concept of probability is too permissive, licensing degrees of belief that we would normally call imprudent. Further, de Finetti is commonly conceived as giving an operational, behaviorist definition of degrees of belief and accordingly of probability. Thus, the theory is said to inherit the difficulties embodied in operationalism and behaviorism. We argue that these and some other objections to de Finetti’s theory are unfounded as they overlook various central aspects of de Finetti’s philosophy of probability. We then propose a new interpretation of de Finetti’s theory that highlights these central aspects and explains how they are an integral part of de Finetti’s instrumentalist philosophy of probability. Building on this interpretation of de Finetti’s theory, we draw some lessons for the realist-instrumentalist controversy about the nature of science.”

the ultimate simulation

Posted in Books, University life on February 23, 2014 by xi'an

Another breakfast read of the New York Times that engaged enough of my attention to write a post (an easily done feat!): besides a lengthy introduction, Edward Frenkel, the author of the column, considers the Platonic issue of whether or not “mathematical entities actually exist in and of themselves”, an issue also central to Neal Stephenson’s Anathem. He then suddenly switches to another philosophical debate, realism versus idealism, the latter view being that reality only exists in the mind. And seriously (?) considers the question of whether or not we live in a computer simulation… Uh?! There is actually research going on with this assumption, as shown by the arXiv paper the column links to. This is also called the Matrix Hypothesis on Wikipedia. While I understand the appeal of arguing that we cannot distinguish between living in a real world and living in the simulation of a real world (this is a modern extension of Plato’s cave), I do not get the point of addressing the issue in a Physics paper. It seems more appropriate for science-fiction literature. Like Philip K. Dick’s…

snapshot from Budapest (#5)

Posted in pictures, Running, Travel on August 4, 2013 by xi'an


Truly random [again]

Posted in Books, R, Statistics, University life on December 10, 2010 by xi'an

“The measurement outputs contain at the 99% confidence level 42 new random bits. This is a much stronger statement than passing or not passing statistical tests, which merely indicate that no obvious non-random patterns are present.” arXiv:0911.3427

As often, I bought La Recherche in the station newsagent for the wrong reason! The cover of the December issue was about “God and Science” and I thought this issue would bring some interesting and deep arguments in connection with my math and realism post. The debate is in fact very short, does not go into any depth, reproduces the Hawking quote that started the earlier post, and recycles the same graph about cosmology I used last summer in Vancouver!

However, there are other interesting entries about probabilistic proof checking in Mathematics and truly random numbers… The first part is on an ACM paper on the PCP theorem by Irit Dinur, but is too terse as is (while the theory behind presumably escapes my abilities!). The second part is about a paper in Nature published by Pironio et al. and arXived as well. It is entitled “Random numbers certified by Bell’s Theorem” and is also one of the laureates of the La Recherche prize this year. I was first annoyed by the French coverage of the paper, mentioning that “a number was random with a probability of 99%” (?!) and that “a sequence of numbers is perfectly random” (re-?!). The original paper however states much the same thing, hence stressing the different meaning associated with randomness by those physicists, namely “the unpredictable character of the outcomes” and “universally-composable security”. The above “probability of randomness” is actually a p-value (associated with the null hypothesis that Bell’s inequality is not violated) that is equal to 0.00077. (So the above quote is somewhat paradoxical!) The huge apparatus used to produce those random events is not very efficient: on average, 7 binary random numbers are detected per hour… A far cry from the “truly random” generator produced by Intel!
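The contrast drawn in the opening quote, between merely passing statistical tests and certifying randomness, can be made concrete with the simplest such test, the NIST “monobit” frequency test: under the null hypothesis of i.i.d. fair bits, the normalised bit sum is approximately standard normal, which yields a p-value. A minimal Python sketch of my own (nothing to do with the Pironio et al. apparatus):

```python
import math
import random

def monobit_p_value(bits):
    """NIST SP 800-22 frequency (monobit) test: two-sided p-value under the
    null hypothesis that the bits are i.i.d. fair coin flips."""
    n = len(bits)
    s = sum(1 if b else -1 for b in bits)        # +1/-1 random-walk endpoint
    return math.erfc(abs(s) / math.sqrt(2 * n))  # s ~ N(0, n) under the null

random.seed(42)                                  # deterministic demo stream
good = [random.getrandbits(1) for _ in range(10_000)]
p_good = monobit_p_value(good)   # no evidence against the null, typically

p_bad = monobit_p_value([1] * 10_000)  # a constant stream: p-value collapses
```

Passing such a test only says that one particular non-random pattern went undetected, which is precisely the “much stronger statement” contrast made in the arXiv quote: the Bell-inequality argument instead bounds the predictability of the outcomes by any observer, rather than merely reporting the absence of an obvious pattern.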

P.S. As a coincidence, Julien Cornebise pointed out to me that there is a supplement in the journal about “Le Savoir du Corps” which is in fact handled by the pharmaceutical company Servier, currently under investigation for its drug Mediator… A very annoying breach of basic journalistic ethics in my opinion!