A few days ago I found, on the page Jeff Rosenthal has dedicated to Hastings, that he passed away peacefully on May 13, 2016, in Victoria, British Columbia, where he had lived for 45 years as a professor at the University of Victoria, after holding positions at the University of Toronto, the University of Canterbury (New Zealand), and Bell Labs (New Jersey). As pointed out by Jeff, Hastings’ main paper is his 1970 Biometrika description of Markov chain Monte Carlo methods, Monte Carlo sampling methods using Markov chains and their applications, which would take close to twenty years to become known to the statistics world at large, although one can trace a path through Peskun (his only PhD student), Besag, and others. I am sorry it took so long for the news to reach me, and also sorry it apparently went unnoticed by most of the computational statistics community.
A rather weird question on X validated this week was about devising a manual way to simulate (a few) normal variates. By manual I presume the author of the question means without resorting to a computer or any other business machine. Now, I do not know of any real phenomenon that is exactly and provably Normal. As analysed in a great philosophy of science paper by Aidan Lyon, the standard explanations for a real phenomenon being Normal are almost invariably false, even those invoking the Central Limit Theorem. Hence I cannot think of a mechanical device that would directly return realisations from a Normal distribution with known parameters. However, since it is possible to simulate Uniform U(0,1) variates by hand [up to a given precision], using a chronometer or a wheel, versions of the Box-Müller algorithm that do not rely on logarithmic or trigonometric functions are feasible: for instance, generate pairs of Exponential variates, x and y, until 2y>(1-x)², and return x [with a random sign attached]. And generating Exponential variates is easy provided a radioactive material with known half-life is available, along with a Geiger counter, or, failing that, by calling von Neumann’s exponential generator, as detailed in Devroye’s simulation book.
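As a sanity check of the rejection step above, here is a minimal Python sketch (function name mine); the two Exponential draws are produced by the usual −log(U) inversion, which is exactly the step one would replace by a Geiger counter or von Neumann's generator when working by hand:

```python
import math
import random

def normal_by_rejection(rng=random.random):
    """Standard Normal via two Exp(1) variates x, y: accept x when
    2y > (x-1)^2, which yields a half-normal, then attach a random
    sign. The -log(U) inversions below stand in for a physical
    exponential source (radioactive decay, von Neumann's generator)."""
    while True:
        x = -math.log(1.0 - rng())   # Exp(1) by inversion
        y = -math.log(1.0 - rng())   # second, independent Exp(1)
        if 2.0 * y > (x - 1.0) ** 2:
            # accepted: x is half-normal, so randomise the sign
            return x if rng() < 0.5 else -x
```

The acceptance condition works because e^(−x) · e^(−(x−1)²/2) is proportional to e^(−x²/2), so the accepted x follows the half-normal density.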
After proposing this solution, I received a comment from the author of the question asking for a simpler solution based, e.g., on the Central Limit Theorem, presumably applied to simple iid random variables such as coin tosses or dice experiments. While I used the CLT for simulating Normal variables in my very early days [just after programming on punched cards!], I do not think it a very good or efficient method, as the tails converge very slowly to normality. By comparison, using the same number of coin tosses to create a sufficient number of binary digits of a Uniform variate produces a computer-precision exact Uniform variate, which can be exploited in Box-Müller-like algorithms to return exact Normal variates, even by hand if necessary. [For some reason, this question attracted a lot of traffic and an encyclopaedic answer on X validated, despite being borderline to the point of being proposed for closure.]
Dennis Prangle, Richard G. Everitt and Theodore Kypraios just arXived a new paper on ABC, aiming at handling high-dimensional data with latent variables, thanks to a cascading (or nested) approximation of the probability of a near coincidence between the observed data and the ABC simulated data. The approach amalgamates a rare-event simulation method based on SMC, pseudo-marginal Metropolis-Hastings and, of course, ABC. The rare event is the near coincidence of the observed summary and of a simulated summary, an event so rare that regular ABC is forced to accept not-so-near coincidences, especially as the dimension increases. I mentioned nested above purposely because I find that the rare-event simulation method of Cérou et al. (2012) has a nested sampling flavour, in that each move of the particle system (in the sample space) is made according to a constrained MCMC move, the constraint deriving from the distance between the observed and simulated samples. Finding an efficient move of that kind may prove difficult or impossible. The authors opt for a slice sampler, proposed by Murray and Graham (2016); however, they assume that the distribution of the latent variables is uniform over a unit hypercube, an assumption I do not fully understand. For the pseudo-marginal aspect, note that while the approach produces a better and faster evaluation of the likelihood, it remains an ABC likelihood and not the original likelihood. Because the estimate of the ABC likelihood is monotonic in the number of terms, a proposal can be terminated earlier without inducing a bias in the method.
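To illustrate the rare-event mechanism on a toy case, here is a minimal sketch of adaptive multilevel splitting in the spirit of Cérou et al. (2012), with latent variables uniform on the unit hypercube as in the paper's setup; everything else (the toy distance, the random-walk Metropolis rejuvenation standing in for the slice sampler of Murray and Graham, the parameter values) is my own assumption, not the authors' algorithm:

```python
import random

def splitting_probability(distance, eps, dim=2, n=400, keep=0.5,
                          max_levels=60, rng=random):
    """Adaptive multilevel splitting: estimate the rare-event probability
    P(distance(u) < eps) for u uniform on the unit hypercube, lowering
    the distance threshold level by level and rejuvenating survivors
    with a constrained Metropolis move."""
    particles = [[rng.random() for _ in range(dim)] for _ in range(n)]
    prob = 1.0
    for _ in range(max_levels):
        dists = sorted(distance(u) for u in particles)
        level = dists[int(keep * n) - 1]        # new threshold: keep-quantile
        if level <= eps:                        # rare event reached: finish
            hits = sum(1 for u in particles if distance(u) < eps)
            return prob * hits / n
        survivors = [u for u in particles if distance(u) <= level]
        prob *= len(survivors) / n              # conditional survival factor
        # resample survivors to n particles and move each with a random-walk
        # Metropolis step constrained below the current threshold (a crude
        # stand-in for the constrained slice sampler of the paper)
        particles = []
        for _ in range(n):
            u = list(rng.choice(survivors))
            prop = [x + rng.gauss(0.0, level) for x in u]
            if all(0.0 <= x <= 1.0 for x in prop) and distance(prop) <= level:
                u = prop
            particles.append(u)
    return prob * sum(1 for u in particles if distance(u) < eps) / n
```

The product of the per-level survival fractions estimates the probability of the near coincidence, which plain Monte Carlo would almost never observe once the threshold is small.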
This is certainly an innovative approach of clear interest and I hope we will discuss it at length at our BIRS ABC 15w5025 workshop next February. At this stage of light reading, I am slightly overwhelmed by the combination of so many computational techniques into a single algorithm. The authors argue there is very little calibration involved, but so many steps depend on as many configuration choices.
Julie Josse contacted me to advertise a postdoc position at École Polytechnique, in Palaiseau, south of Paris. “The fellowship is focusing on missing data. Interested graduates should apply as early as possible since the position will be filled when a suitable candidate is found. The Centre for Applied Mathematics (CMAP) is looking for highly motivated individuals able to develop a general multiple imputation method for multivariate continuous and categorical variables and its implementation in the free R software. The successful candidate will be part of a research group in the statistics team working on missing values. The postdoc will also have excellent opportunities to collaborate with researchers in public health, with partners, on the analysis of a large register from the Paris Hospital (APHP) to model the decisions and events when severe trauma patients are handled by emergency doctors. Candidates should contact Julie Josse at polytechnique.edu.”
Seth Flaxman (Oxford), Dougal J. Sutherland (UCL), Yu-Xiang Wang (CMU), and Yee Whye Teh (Oxford) published on arXiv this morning an analysis of the US election, in what they most appropriately call a post-mortem, using the ecological inference approach already employed after Obama’s re-election, and producing graphs like the following one:
I do not remember precisely why I bought this book, but it is most likely because it popped up in a list of suggested books on an Amazon page. And I certainly feel grateful for the suggestion, as this is one of the best books I have read in the past years, and not just the best fantasy or the best Gothic novel, clearly.
Clarke’s Jonathan Strange & Mr Norrell was published in 2004 and soon got high-ranked in most best-seller lists, winning both the Hugo and the Locus prizes that same year. But, once again, while it caters to my tastes in fantasy literature, I find the book spans much more, recreating an alternative 19th Century literature where fairies and magic play a role in the Napoleonic Wars, including Waterloo. The tone and style are reminiscent of Dickens, the Brontës, and Austen, but also of Gothic 19th Century masters like Ann Radcliffe, Bram Stoker and Mary Shelley. Even the grammar is modified into archaic or pseudo-archaic versions. But more importantly and enticingly, the beautiful style reproduces some of Dickens’s light irony about the author and the characters themselves. Utterly enjoyable!
The story itself is about a new era of English magic launched by the two characters on the cover, after centuries of musty study of magic without the power or the will to practise any form of it. (The book enjoys close to 200 footnotes documenting the history of magic in the past centuries, in a pastiche of scholarly works of older days.) While those two characters can manage incredible feats, they seem to have a rather empirical knowledge of the nature of magic and of how they stand with respect to the ancient magicians of the fairy kingdoms that border Northern England. There is no indication in the book that magical abilities are found in other nations, which is most advantageous when fighting the French! A central axis of the plot is the opposition between Norrell and Strange, the former hoping to take complete control of English magic (and buying any book related to the topic to secure it in a private library), the latter freely dispensing his art and taking in students. They also clash over the position to take on the fairy or Raven King, John Uskglass, from excluding him from the modern era to acknowledging his essential role in the existence of English magic. They separate and start fighting one another through books and newspaper articles, Strange leaving for Venezia after losing his wife. Eventually, they have to reunite to fight the Raven King together and save Strange’s wife, even though the final outcome is somewhat and pleasantly unexpected. (Mind this is a crude summary for a novel of more than 1,000 pages!)
While it seems the author is preparing a sequel, the book stands quite well by itself and I feel another book is somewhat unnecessary: Dickens did not write a sequel to David Copperfield or another perspective on (the Gothic) Great Expectations. But in any case Susanna Clarke wrote a masterpiece there, a feat I hope she can repeat in the future with an altogether different book. (And while I liked The Quincunx very much for similar reasons, I deem Jonathan Strange & Mr Norrell far, far superior in its recreation of Victorian Gothic!)