Archive for Britain
Last year, I posted a review of Ishiguro’s “When we were orphans”, with the comment that, while I enjoyed the novel and appreciated its multiple layers, I missed a strong enough grasp of the characters… I brought back from New York Ishiguro’s latest novel, “The Buried Giant“, with high expectations, doubled by the location of the story in an Arthurian setting, at a time when Britons had not yet been subsumed into Anglo-Saxon culture or forced to migrate to little Britain (Brittany). I was looking forward to a re-creation of the Arthurian cycle, possibly with a post-modern twist. (Plus, the book as an object is quite nice, with black-edged pages.)
“I respect what I think he was trying to do, but for me it didn’t work. It couldn’t work. No writer can successfully use the ‘surface elements’ of a literary genre — far less its profound capacities — for a serious purpose, while despising it to the point of fearing identification with it. I found reading the book painful. It was like watching a man falling from a high wire while he shouts to the audience, ‘Are they going to say I’m a tight-rope walker?’” Ursula K. Le Guin, March 2, 2015.
Alas, thrice alas, after reading it within a fortnight, I am quite disappointed by the book. Which, like the giant, would have better remained buried. Ishiguro pursues his delving into the notion of memories and remembrances, with the twisted reality they convey. After the detective cum historical novel of “When we were orphans”, he moves to the allegory of the early medieval tale, where characters have to embark upon a quest and face supernatural dangers like pixies and ogres. But mostly suffer from a collective amnesia they cannot shake. The idea is quite clever and once again attractive, but the resulting story sounds too artificial and contrived to invest me in the fate of its characters. As an aside, the two central characters, Beatrix and Axl, hardly have Briton names. Beatrix is of Latin origin and means traveller, while Axl is of Scandinavian origin and means father of peace. Appropriate symbols for their roles in the allegory, obviously. But this also makes me wonder how deep the allegory is, that is, how many levels of references and stories are hidden behind the bland trek of A & B through a fantasy Britain.
A book review in The Guardian links this book with Tolkien’s Lord of the Rings. I fail to see the connection: Tolkien was immersed his whole life in Norse sagas and Saxon tales, creating his own myth out of his studies without a thought for parody or allegory. Here, the whole universe is misty and vague, and characters act with no reason or rationale. The whole episode in the monastery and the subsequent tunnel exploration make no sense in terms of the story, and I cannot fathom what they are supposed to stand for. The theme of the ferryman carrying couples to an island where they may rest, together or not, sounds too obvious to just mean this. What else does it stand for?! The encounters with the rag woman, first in the Roman ruins where she threatens to cut a rabbit’s neck, then in a boat where she acts as a decoy, are completely obscure as to what they are supposed to mean. Maybe this accumulation of senseless events is the whole point of the book, but such a degree of deconstruction does not make for a pleasant read. Eventually, I came to hope that the mists would rise again and carry away all past memories of “The Buried Giant“!
My friend and Warwick colleague Gareth Roberts just published a paper in Nature with Ellen Brooks-Pollock and Matt Keeling from the University of Warwick on the modelling of bovine tuberculosis dynamics in Britain and on the impact of control measures. The data come from the Cattle Tracing System and the VetNet national testing database. The mathematical model is based on a stochastic process and its six parameters are estimated by sequential Monte Carlo ABC (SMC-ABC). The summary statistics chosen in the model are the number of infected farms per county per year and the number of reactors (cattle failing a test) per county per year.
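The paper’s fit obviously cannot be reproduced without the cattle data, but the SMC-ABC principle it relies on — simulate from the model, keep parameter values whose summary statistics fall within a shrinking tolerance of the observed ones, resampling and perturbing particles between tolerance levels — can be sketched on a toy one-parameter model. Everything below (the Gaussian stand-in for the epidemic simulator, the tolerance schedule, the kernel width) is illustrative and not from the paper:

```python
import random
import statistics

def simulate(theta, n=50, rng=random):
    # Toy stochastic model standing in for the epidemic simulator:
    # Gaussian counts with mean theta and variance theta (Poisson-like).
    return [rng.gauss(theta, theta ** 0.5) for _ in range(n)]

def summary(data):
    # A single summary statistic, playing the role of the yearly
    # infected-farm and reactor counts in the paper.
    return statistics.mean(data)

def smc_abc(observed, prior_low=0.1, prior_high=10.0,
            n_particles=200, tolerances=(2.0, 1.0, 0.5), seed=42):
    rng = random.Random(seed)
    s_obs = summary(observed)
    # First population: plain rejection ABC at the loosest tolerance,
    # sampling from a uniform prior on (prior_low, prior_high).
    particles = []
    while len(particles) < n_particles:
        theta = rng.uniform(prior_low, prior_high)
        if abs(summary(simulate(theta, rng=rng)) - s_obs) < tolerances[0]:
            particles.append(theta)
    # Later populations: resample a particle, perturb it with a Gaussian
    # kernel, accept if the simulated summary meets the tighter tolerance.
    for eps in tolerances[1:]:
        new_particles = []
        while len(new_particles) < n_particles:
            theta = rng.choice(particles) + rng.gauss(0.0, 0.5)
            if not (prior_low <= theta <= prior_high):
                continue  # zero prior density outside the support
            if abs(summary(simulate(theta, rng=rng)) - s_obs) < eps:
                new_particles.append(theta)
        particles = new_particles
    return particles
```

(A complete SMC-ABC sampler would also carry importance weights correcting for the perturbation kernel; they are omitted here for brevity.)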
“Therefore, we predict that control of local badger populations and hence control of environmental transmission will have a relatively limited effect on all measures of bovine TB incidence.”
This advanced modelling of a comprehensive dataset on TB in Britain quickly gained a high profile, as it addresses the highly controversial (not to say plain stupid) culling of badgers (who also carry TB) advocated by the government. The study concludes that “only generic measures such as more national testing, whole herd culling or vaccination that affect all routes of transmission are effective at controlling the spread of bovine TB”, while the elimination of badgers from the English countryside would have a limited effect. Good news for badgers! And the Badger Trust. Unsurprisingly, the study was immediately rejected by the UK farming minister! Not only does he object to the herd culling solution for economic reasons, but he “cannot accept the paper’s findings”. Maybe he does not like ABC… More seriously, the media oversimplified the findings of the study, “as usual”, with e.g. The Guardian headline of “tuberculosis threat requires mass cull of cattle”.
While I left Paris under a thunderstorm, the weather in London was warm and sunny, and I enjoyed a nice walk to the RSS. With a Betsey Trotwood pub on the way that obviously delighted the David Copperfield fan in me! The Bayes 250 meeting started with the videoed interview of Dennis Lindley by and thanks to Tony O’Hagan in his Devonshire home. I hope the video gets on-line soon, as it is remarkable in rendering Dennis’ views on Bayesian statistics, full of humour and unremitting in his defence of the Bayesian approach. (All the more so as I missed a few points due to an imperfect sound system.) “Coherence is all” could best summarise this interview. And the sincere regret that Bayesianism has not taken over…
The talks started with Gareth Roberts explaining why MCMC was possible in infinite dimension despite the dimensionality curse. (Starting his talk with a Rev. Bayes meets Newton, Markov and Metropolis slideshow.) Then, after a lunch break where some participants eloped to Bayes’ tomb next door (!), Sylvia Richardson presented a broad vision of Bayesian biostatistics, answering in my opinion some of Dennis’ worries that Bayes had not taken off widely enough (my rephrasing). Dennis Prangle also chose to give an overview of ABC, rejoining my perspective that it is more of a new kind of inference with Bayesian justifications than a mere computational tool, Michael Jordan talked about Kingman’s paintbox (in relation with Tamara Broderick’s talk I had enjoyed so much in Kyoto) before rushing back to Paris, Phil Dawid gave a somewhat a-Bayesian talk about the frequentist (in)validation of predictors, in connection with his calibration talk in Padova a few months ago, Iain Murray explained his NADE modelling tool, mixing neural nets with mixtures, and Yee Whye Teh concluded the talks of the day with a presentation of his Gibbs sampler for jump processes that I found most interesting (I later realised this was a paper I had missed in Bayes 250 in Edinburgh by leaving early!). The day ended with a few posters, including one by Maria Lomelli Garcia and Yee Whye Teh on alpha-stable processes that provided a new auxiliary variable representation of clear appeal. (The day actually ended for good with a light and enjoyable dinner in this most improbable Renaissance Hotel that literally stands at the end of the tracks of St Pancras…)
The second day was just as rich: after [a run in Regent’s Park and] a welcome from the current RSS president (John Pullinger, who happens to live in Tunbridge Wells, of all places!), Michael Goldstein gave a spirited defence of Bayesian statistics as a projection device (putting expectation ahead of probability, as in de Finetti and Hartigan), Andrew Golightly discussed particle filter approximations based on discretised diffusions and fighting degeneracy via bridging, Nicky Best managed to give three talks in one (!) around Bayesian epidemiology, beginning with a Rev. Bayes meets Dr. Snow (who started spatial epidemiology with his famous cholera map). Then Christophe Andrieu presented what were new & exciting results for me, showing by Peskun and convex orderings that using more accurate unbiased estimates of the likelihood function improves, theoretically as well as practically, the performance of the associated Exact Approximation MCMC algorithm. This was followed by Ben Calderhead, who summarised his recently arXived paper with Mark Girolami and co-authors on using Bayesian analysis to evaluate the uncertainty associated with the numerical resolution of differential equations, connecting with the older paper by Persi Diaconis on the topic (a paper I remember discussing with George Casella in an Ithaca café while we were waiting for his car to be fixed…). I wonder whether the approach could be used to handle the constant estimation paradox raised by Larry Wasserman (and discussed on the ‘Og as well)… Under the title of “the misspecified Bayesian”, Stephen Walker sketched an on-going work with Chris Holmes, work that resonated deeply with some of my current musings about the nature of Bayesian inference on intractable problems. Hence giving me new prospects on ABC validation and extension.
More precisely, he showed us a way to handle problems where only some aspect of the model is of interest and where a pseudo-model that (asymptotically) manages this aspect can be found. The paper should soon be arXived and I will certainly discuss it more at length then! Simon Wilson did a “Rev. Bayes meets Dr. Linnaeus” introduction and talked about the estimation of the number of new discoveries of (as yet unknown) species, a problem that I find fascinating even though I find the current solutions of an essentially hypergeometric model somewhat oversimplifying. Chris Yau introduced us to his current work on cancer analysis and to his way of managing the complexity of the mutation process by hierarchical models, and Peter Green ended the presentations with a survey of his work on doing inference on decomposable graphs, with online exhibits.
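Among the day’s themes, the Exact Approximation (pseudo-marginal) idea behind Christophe Andrieu’s talk is easy to illustrate: plugging an unbiased estimate of the likelihood into the Metropolis-Hastings ratio, and recycling the current state’s estimate rather than refreshing it, leaves the exact posterior invariant. The toy latent Gaussian model, flat prior, and tuning values below are my own illustration, not taken from the talk:

```python
import math
import random

def normal_pdf(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def likelihood_hat(theta, y, n_mc, rng):
    # Unbiased Monte Carlo estimate of the marginal likelihood
    # p(y | theta) = ∫ N(y; z, 1) N(z; theta, 1) dz, which is exactly
    # N(y; theta, sqrt(2)) but is pretended intractable here.
    zs = [rng.gauss(theta, 1.0) for _ in range(n_mc)]
    return sum(normal_pdf(y, z, 1.0) for z in zs) / n_mc

def pseudo_marginal_mh(y, n_iter=5000, n_mc=10, step=1.0, seed=3):
    # Random-walk Metropolis-Hastings on theta under a flat prior, with
    # the true likelihood replaced by its unbiased estimate.
    rng = random.Random(seed)
    theta, like = 0.0, likelihood_hat(0.0, y, n_mc, rng)
    chain = []
    for _ in range(n_iter):
        prop = theta + rng.gauss(0.0, step)
        prop_like = likelihood_hat(prop, y, n_mc, rng)
        # Key pseudo-marginal rule: the current estimate is recycled,
        # never recomputed, which keeps the exact posterior invariant.
        if rng.random() < prop_like / like:
            theta, like = prop, prop_like
        chain.append(theta)
    return chain
```

With y = 1.0 the chain targets the exact N(1, √2) posterior despite only ever seeing noisy likelihood values; Andrieu’s point, in this vocabulary, is that lowering the variance of `likelihood_hat` (e.g. raising `n_mc`) can only improve the sampler in the Peskun sense.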
The meeting concluded with Adrian Smith giving a personal reminiscence of the (poor) state of Bayesian statistics in the 1960s and 1970s, paying tribute to his advisor Dennis Lindley for keeping the faith against strong opposition and for ensuring the survival of the field into the next generation. (And linking once again with John Kingman.) As hopefully shown by my summary, the field is definitely alive nowadays and has accomplished much by managing the computational hurdles. (As shown further by our incoming Statistical Science vignettes, there are many cases where Bayesian analysis looks like the only available answer.) However, the new challenges raised by Big Data may well jeopardise this revival of a 250-year-old principle by moving to quick-and-dirty (and less principled) inference techniques. What really made this meeting so successful, in my opinion, is that a lot of the talks we heard in Errol Street over those two days were exposing progress being made towards handling the new challenges. Hence, there still is hope for Bayesian techniques in the coming century!
I just found out that Gareth Roberts and Terry Speed have been elected as Fellows of the Royal Society (FRS). Congratulations to both for this prestigious recognition of their major contributions to Science! (Another Fellow elected this year is Bill Bryson, in recognition of his scientific popularisation books, including one on the Royal Society that I reviewed for CHANCE a few months ago.)
On the way back from a good session at the climbing gym, bagging a 6B+ dihedral, I got an email that Odyssey, the latest climbing movie from the UK climbing video company Hot Aches, was available for free viewing till November 20. Enjoy: this is Brit trad climbing at its best! (And here is the conversion chart, for they use UK grades…)