Archive for Edinburgh

limited shelf validity

Posted in Books, pictures, Statistics, University life on December 11, 2019 by xi'an

A great article from Steve Stigler in the new, multi-scaled, and so exciting Harvard Data Science Review, magisterially operated by Xiao-Li Meng, on the limitations of old datasets. Illustrated by three famous datasets used by three equally famous statisticians, Quetelet, Bortkiewicz, and Gosset, none of whom were fundamentally interested in the data for their own sake. First, Quetelet’s data was (wrongly) reconstructed and he missed the opportunity to beat Galton at discovering correlation. Second, Bortkiewicz went looking (or even cherry-picking!) for rare events in yearly mortality tables minutely divided among causes such as military horse kicks. The third dataset is not Guinness’s, but a test between two sleeping pills, conducted rather crudely on inmates of a psychiatric institution in Kalamazoo, with further mishandling by Gosset himself. Manipulations that turn the data into dead data, as Steve put it. (And illustrates with the above skull collection picture. As well as warning against attempts at resuscitating dead data into what could be called “zombie data”.)

“Successful resurrection is only slightly more common than in Christian theology.”

His global perspective on dead data is that they should stop being used rather than having their (shelf) life extended by turning into benchmarks recycled over and over as proofs of concept. If only (my two cents) because this leads to calibrating (and choosing) methods that do well on these benchmarks. Another example that could have been added to the skulls above is the Galaxy velocity dataset that makes frequent appearances in works estimating Gaussian mixtures. Which Radford Neal signaled at the 2001 ICMS workshop on mixture estimation as an inappropriate use of the dataset, since astrophysical arguments weighed against mixture modelling.

“…the role of context in shaping data selection and form—context in temporal, political, and social as well as scientific terms—has been shown to be a powerful and interesting phenomenon.”

The potential for “dead-er” data (my neologism!) increases with the epoch, in that the careful sleuth work Steve (and others) conducted on these historical datasets is absolutely impossible with the current massive datasets. Massive and proprietary. And presumably discarded once the associated neural net is designed and sold. Leaving the burden of unmasking the potential (or highly probable?) biases to others. Most interestingly, this echoes a “comment” in Nature of 17 October by Sabina Leonelli on the transformation of data from a national treasure to a commodity whose “ownership can confer and signal power”. But her call for openness and governance of research data seems as illusory as other attempts to sever the GAFAs from their extra-territorial privileges…

Bayes plaque

Posted in Books, pictures, Statistics, Travel, University life on November 22, 2019 by xi'an

at the centre of Bayes

Posted in Mountains, pictures, Statistics, Travel, University life on October 14, 2019 by xi'an

HW AMS & EPSRC MAC-MIGS CDT seminar

Posted in Statistics on October 10, 2019 by xi'an

Some explanation for all these acronyms! I am giving an Actuarial Mathematics & Statistics (AMS) seminar at Heriot-Watt (HW) University, in Edinburgh, tomorrow. But in the (new) Bayes Centre, at the University of Edinburgh, rather than on the campus of Heriot-Watt, as this is also the launching day of the Centre for Doctoral Training (CDT) on Mathematical Modelling, Analysis, & Computation (MAC) shared between Heriot-Watt and the University of Edinburgh, funded by the EPSRC and located in the Maxwell Institute Graduate School (MIGS) in its Bayes Centre. My talk will be on ABC convergence and misspecification.

in a house of lies [book review]

Posted in Books, Travel on August 7, 2019 by xi'an

While I found the previous Rankin Rebus novels a wee bit disappointing, this latest installment in the stories of the Edinburghian ex-detective is a true pleasure! Maybe because it takes the pretext of a “cold case” suddenly resurfacing to bring back to life characters met in earlier novels of the series. And the borderline practice of DI Rebus himself. Which should matter less at a stage when Rebus has been retired for 10 years (I could not believe it had been that long!, but I feel like I followed Rebus for most of his career…) The plot is quite strong, with none of the last-minute revelations found in some earlier volumes, and with a secondary plot that is much more modern and poignant. I also suspect some of the new characters will reappear in the next books, as well as the consequences of a looming Brexit [pushed by a loony PM] on the Scottish underworld… (No, I do not mean Tories!)

thermodynamic integration plus temperings

Posted in Statistics, Travel, University life on July 30, 2019 by xi'an

Biljana Stojkova and David Campbell recently arXived a paper on the use of parallel simulated tempering for thermodynamic integration, towards producing estimates of marginal likelihoods. Resulting in the rather unwieldy acronym PT-STWNC, for “Parallel Tempering – Simulated Tempering Without Normalizing Constants”. Remember that parallel tempering runs T chains in parallel for T different powers of the likelihood (from 0 to 1), potentially swapping chain values at each iteration, while simulated tempering monitors a single chain that explores both the parameter space and the temperature range. Requiring a prior on the temperature. Whose optimal if unrealistic choice was found by Geyer and Thompson (1995) to be proportional to the (unknown) inverse normalising constant (albeit over a finite set of temperatures). When the new temperature is instead proposed via a random walk, the Metropolis-within-Gibbs update of the temperature τ involves these normalising constants.
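As a reading aid (and nothing more), here is a minimal parallel tempering sketch in Python, run on a made-up bimodal toy target rather than anything from the paper: the likelihood, prior, step size, and temperature ladder are all illustrative assumptions.

```python
import numpy as np

def log_like(theta):
    # toy bimodal likelihood (illustrative assumption, not the paper's target)
    return np.logaddexp(-0.5 * (theta - 3.0) ** 2, -0.5 * (theta + 3.0) ** 2)

def log_prior(theta):
    # standard Normal prior (another assumption)
    return -0.5 * theta ** 2

def parallel_tempering(betas, n_iter=10000, step=1.0, seed=0):
    rng = np.random.default_rng(seed)
    T = len(betas)
    theta = np.zeros(T)               # current state of each tempered chain
    chain = np.empty((n_iter, T))
    for it in range(n_iter):
        # random-walk Metropolis within each chain, targeting L^beta x prior
        for t in range(T):
            prop = theta[t] + step * rng.normal()
            log_acc = (betas[t] * (log_like(prop) - log_like(theta[t]))
                       + log_prior(prop) - log_prior(theta[t]))
            if np.log(rng.uniform()) < log_acc:
                theta[t] = prop
        # attempt a swap between two adjacent temperatures; the prior terms
        # cancel, leaving only the powered likelihoods in the swap ratio
        t = rng.integers(T - 1)
        log_acc = ((betas[t] - betas[t + 1])
                   * (log_like(theta[t + 1]) - log_like(theta[t])))
        if np.log(rng.uniform()) < log_acc:
            theta[t], theta[t + 1] = theta[t + 1], theta[t]
        chain[it] = theta
    return chain

betas = np.linspace(0.0, 1.0, 6)      # powers of the likelihood, 0 = prior
chain = parallel_tempering(betas)
```

Simulated tempering would instead augment a single chain with a random temperature index (or a continuous τ, as here), which is where the normalising constants sneak into the acceptance ratio.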

“This approach is explored as proof of concept and not in a general sense because the precision of the approximation depends on the quality of the interpolator which in turn will be impacted by smoothness and continuity of the manifold, properties which are difficult to characterize or guarantee given the multi-modal nature of the likelihoods.”

To bypass this issue, the authors pick for their (formal) prior on the temperature τ a prior such that the profile posterior distribution on τ is constant, i.e., the joint distribution at τ and at the mode [of the conditional posterior distribution of the parameter] is constant. This choice makes for a closed-form prior, provided this mode of the tempered posterior can de facto be computed for each value of τ. (However, it is unclear to me why the exact mode would need to be used.) The resulting Metropolis ratio becomes independent of the normalising constants. The final version of the algorithm runs an extra exchange step between this simulated tempering version and the untempered version, i.e., the original unnormalised posterior. For the marginal likelihood, thermodynamic integration is invoked, following Friel and Pettitt (2008), using simulated tempering samples of (θ,τ) pairs (associated instead with the above constant profile posterior) and simple Riemann integration of the expected log posterior. The paper stresses the gain due to a continuous temperature scale, as it “removes the need for optimal temperature discretization schedule.” The method is applied to the Galaxy (mixture) dataset in order to compare it with the earlier approach of Friel and Pettitt (2008), resulting in (a) a selection of the mixture with five components, (b) much more variability between the estimated marginal likelihoods for different numbers of components than in the earlier approach (where the estimates hardly move with k), and (c) a trimodal distribution on the means [and a unimodal one on the variances]. This example is however hard to interpret, since there are many contradicting interpretations for the various numbers of components in the model. (I recall Radford Neal giving an impromptu talk at an ICMS workshop in Edinburgh in 2001 to warn us we should not use the dataset without a clear(er) understanding of the astrophysics behind it. If I remember well, he excluded all low values for the number of components as inappropriate… I also remember taking two days off with Peter Green to go climbing Creag Meagaidh, the only authorised climbing place around during the foot-and-mouth epidemic.) In conclusion, after presumably too light a read (I did not referee the paper!), it remains unclear to me why the combination of the various tempering schemes brings a noticeable improvement over existing ones, at a given computational cost, as the temperature distribution does not seem to favour spending time in the regions where the target is changing most quickly. As such, the algorithm rather appears as a special form of exchange algorithm.
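To make the thermodynamic integration step concrete, a continuation of the toy sketch above (reusing its chain, betas, and log_like): the identity log m(y) = ∫₀¹ E_β[log L(θ)] dβ is approximated by a trapezoidal rule over the discrete ladder, which is precisely the discretization that the paper’s continuous temperature scale dispenses with.

```python
# thermodynamic integration à la Friel & Pettitt (2008) on the toy example:
# estimate E_beta[log L(theta)] at each rung, then integrate over beta
burn = 1000                                     # discard burn-in draws
expected_loglike = np.array(
    [log_like(chain[burn:, t]).mean() for t in range(len(betas))])
# trapezoidal rule over the temperature ladder
log_marginal = np.sum(np.diff(betas)
                      * (expected_loglike[1:] + expected_loglike[:-1]) / 2)
print(f"thermodynamic integration estimate of log m(y): {log_marginal:.3f}")
```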

Fate & Fortune [book review]

Posted in Books, Travel on February 10, 2019 by xi'an

After enjoying very much the first book, Hue & Cry, in the Hew Cullan series by Shirley McKay, I bought the following ones and read Fate & Fortune over the vacation break. If anything, I enjoyed this one even more, as it disclosed other aspects of 16th-century Scotland, still with the oppressive domination of the Kirk, the highly puritan Church of Scotland, over all aspects of everyday life, but also a more rational form of law, plus the first instances of caitch, imported from the French jeu de paume. And the medical approach of the time against an epidemic of syphilis. And the dangerous life of printers at the time, always in danger of arrest or worse. As usual with historical whodunits, it is hard to guess what is genuinely from the 1580s and what has been imported from the present era, but this is a most pleasant (light and short) book to read!