And yet another roman noir taking place in Iceland! My bedside read over the past two months was “Someone to watch over me” by Yrsa Sigurðardóttir. (It took that long because I was mostly away in July and August, not because the book was boring me to sleep every night!) It is a fairly unusual book in several respects: the setting is an institution for mentally handicapped patients that was set on fire, killing five of the patients; the investigator is an Icelandic lawyer, Þóra Guðmundsdóttir, along with her German unemployed-banker boyfriend; the action takes place at the height [or bottom!] of the Icelandic [and beyond!] economic crisis, when most divorce settlements are about splitting the debts of the household and when replacing a computer becomes an issue; some of the protagonists, including the main suspects, are mentally ill; and the police and justice are strangely absent from most of the story. The book tells a lot about Icelandic society, where a hit-and-run is so unheard of that the police are clueless. Or seem to be. And where people see ghosts. Or think they do, as the author plays (heavily?) on the uncertainty about those ghosts. (At least, there are no elves. Nor trolls.) Definitely more in tune with the “true” Iceland than Available Dark. (Well, as far as I can tell!) The mystery itself is a wee bit stretched and the final resolution slightly disappointing, implying some unlikely behaviour from the major characters. In particular, I do not buy the explanation motivating the arson itself. Terrible cover too. And not a great title in English (Watch me or Look at me would have been better), given the many books, movies and songs with the same title. Nonetheless, I liked the overall atmosphere of the book very much, enough to recommend it.
As usual, Indriðason’s books are more about the past (of the characters as well as of the whole country) than about current times. Voices does not depart from this pattern, all the more because it is one of the earliest Inspector Erlendur books. Behind the murder of a hotel employee at the fringe of homelessness lies the almost constant questioning in Indriðason’s books of the difficult or even impossible relations between parents and children and/or between siblings, and of the long-lasting consequences of this generation gap. The murder itself is but a pretext for investigations on that theme and the murder resolution is far from the central point of the book. The story itself is thus less compelling than others I have read, maybe because the main character spends so much time closeted in his hotel room. But it nonetheless fits well within the Erlendur series. And although it is unrelated to the story, the cover reminded me very much of the Gullfoss waterfall.
The second book, Strange Shores, is the farthest from a detective story in the whole series. Indeed, Erlendur is back at his childhood cottage in Eastern Iceland, looking for a resolution of his childhood trauma, the loss of his younger brother during a snowstorm. He also investigates another snowstorm disappearance, interrogating the few survivors and reluctant witnesses from that time. Outside any legal mandate. Sometimes very much outside! While the story is not completely plausible, both in the present and in the past, it remains a striking novel, even on its own. (Although it could read better after the earlier novels in the series.) Not only does the resolution of the additional disappearance bring additional pain and no comfort to those involved, but the ending of Erlendur’s own quest is quite ambiguous. As the book reached its final pages, I could not decide whether he had reached redemption and deliverance and the potential to save his own children, or whether he was beyond redemption, entering another circle of Hell. As explained by the author in an interview, this is intentional and not the consequence of my poor understanding: “Readers of Strange Shores are not quite certain what to make of the ending regarding Erlendur, and I’m quite happy to leave them in the dark!”. With the main character of this series focussing more on missing persons than on detective work, what’s next?!
In a somewhat desperate rush (started upon my return from Iceland and terminated on my return from Edinburgh), Marco Banterle, Clara Grazian and I managed to complete and submit our paper by last Friday evening… It is now arXived as well. The full title of the paper is Accelerating Metropolis-Hastings algorithms: Delayed acceptance with prefetching and the idea behind the generic acceleration is (a) to divide the acceptance step into parts, towards a major reduction in computing time that outranks the corresponding reduction in acceptance probability and (b) to exploit this division to build a dynamic prefetching algorithm. The division is to break the prior × likelihood target into a product such that some terms are much cheaper than others. Or equivalently to represent the acceptance-rejection ratio in the Metropolis-Hastings algorithm as

π(y) q(y,x) / π(x) q(x,y) = ρ₁(x,y) × ⋯ × ρ_d(x,y),
again with significant differences in the computing cost of those terms. Indeed, this division can be exploited by checking each term sequentially, in the sense that the overall acceptance probability

α(x,y) = min{1, ρ₁(x,y)} × ⋯ × min{1, ρ_d(x,y)}
is associated with the right (posterior) target! This lemma can be directly checked via the detailed balance condition, but it is also a consequence of a 2005 paper by Andrés Christen and Colin Fox on using approximate transition densities (with the same idea of gaining time: in case of an early rejection, the exact target need not be computed). While the purpose of the recent [commented] paper by Doucet et al. is fundamentally orthogonal to ours, a special case of this decomposition of the acceptance step in the Metropolis–Hastings algorithm can be found therein. The division of the likelihood into parts also allows for a precomputation of the target based solely on a subsample, hence gaining time and allowing for a natural prefetching version, following recent developments in this direction. (Discussed on the ‘Og.) We study the novel method within two realistic environments, the first one made of logistic regression targets, using benchmarks found in the earlier prefetching literature, and the second one handling an original analysis of a parametric mixture model via genuine Jeffreys priors. [As I made preliminary notes along those weeks, using the ‘Og as a notebook, several posts in the coming days will elaborate on the above.]
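To fix ideas, here is a minimal sketch of the sequential acceptance step (a toy illustration of my own, with a made-up two-factor Gaussian target and a symmetric random-walk proposal, not the implementation from our paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy target split into a cheap and an expensive factor:
# pi(x) ∝ pi_1(x) * pi_2(x); both are Gaussian pieces here for illustration.
def log_factor_cheap(x):      # e.g. the prior, fast to evaluate
    return -0.5 * x**2

def log_factor_expensive(x):  # e.g. the likelihood, costly in real problems
    return -0.25 * (x - 1.0)**2

def delayed_acceptance_mh(x0, n_iter=10_000, scale=1.0):
    """Delayed-acceptance Metropolis-Hastings with a symmetric proposal.

    Each factor rho_k = pi_k(y)/pi_k(x) is tested in turn; an early
    rejection avoids evaluating the expensive factor at all.
    """
    x = x0
    chain = np.empty(n_iter)
    for i in range(n_iter):
        y = x + scale * rng.normal()
        # Stage 1: cheap factor; reject early if this test fails.
        if np.log(rng.uniform()) < log_factor_cheap(y) - log_factor_cheap(x):
            # Stage 2: expensive factor, only reached upon stage-1 acceptance.
            if np.log(rng.uniform()) < log_factor_expensive(y) - log_factor_expensive(x):
                x = y
        chain[i] = x
    return chain

chain = delayed_acceptance_mh(0.0, n_iter=50_000)
```

Since the proposal is symmetric, each factor ρ_k reduces to the ratio π_k(y)/π_k(x), and the product of stage-wise acceptances min{1, ρ₁} × min{1, ρ₂} satisfies detailed balance for the full target, as the lemma above states.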
In the latest Sunday Review of the New York Times, the Norwegian novelist Jo Nesbo has a tribune on revenge against misdeeds and on law as institutionalised revenge, a dimension somewhat hidden in the current justifications of the legal system(s). (As an aside, he mentions the example of the Icelandic Alþingi where justice was dispensed once a year, resulting in beheadings, stake burnings, and drownings in the pond depicted above…) This came a few days after another tribune on a similar topic by Charles Blow, following the “botched Oklahoma execution of Clayton Lockett”, entitled “Eye-for-eye incivility” (an understatement if any!), and arguing about the economic inefficiency of the death penalty. Besides the basic moral quandaries of taking someone else’s life, perfectly summarised by Franquin in the following dark strip:
This sequence of tribunes links to one of my pet theories, which is that imprisonment is the most inadequate way of addressing crime and law breaking in (modern?) societies. Setting fully aside the moral notions of revenge and punishment, which aim more at the victim or the victim’s relatives than at the perpetrator, and of redemption and remorse, which are at best hypothetical and inspired by religious considerations, I do wonder why economists have not tried to come up with more rational and game-theoretic ways of dealing with law-breakers than locking them up all together and expecting them to behave forever after the end of their term. More globally, I find it quite surprising that no one ever seems to question the very notion of sending people to jail. Indeed, it does not bring any clear benefit to society as a whole. One of the usual arguments I receive on those occasions is that imprisonment keeps dangerous people away. But that seems a fairly weak notion: (i) most violent offenders are not dangerous in an absolute berserker sense but only because local circumstances made them violent at a given occurrence in space and time, (ii) those offenders are only put away for a while (in most civilised countries), (iii) they are not getting any less dangerous while in prison, and (iv) it does not apply to the vast majority of people jailed. Furthermore, from a pure supply-versus-demand perspective, this may be counterproductive: e.g., putting some thieves away in jail for a while simply gives an opportunity to other thieves to take advantage of the “thieving market”.
The Freakonomics blog has some entries on the topic (somewhat supportive of my notion that most criminals act in an overall rational way, for which incentives and disincentives could be considered), but it still fails to address the larger picture… I showed this post to Andrew, who pointed me (of course!) to his blog, as several entries therein also consider the issue, like this one on the puzzles of criminal justice. Or prison terms for financial fraud? But I would push the argument further and call for an ultimate abolition of the carceral system, seeking efficient and generalised alternatives to imprisonment, as detailed in this U.N. report I just came across. Indeed, I think a time will come when imprisonment will be seen as being as irrational as witch-burning is considered today.
Just before I left for Iceland, Matias Quiroz, Mattias Villani and Robert Kohn arXived a paper entitled “Speeding up MCMC by efficient data subsampling”. Somewhat connected with the earlier papers by Korattikara et al. and Bardenet et al., both discussed on the ‘Og, the idea is to replace the log-likelihood with an unbiased subsampled version and to correct for the resulting bias in the exponentiation of this (Horvitz-Thompson or Hansen-Hurwitz) estimator. They ground their approach within the (currently cruising!) pseudo-marginal paradigm, even though their likelihood estimates are not completely unbiased. Since the optimal weights in the sampling step are proportional to the log-likelihood terms, they need to build a surrogate of the true likelihood, using either a Gaussian process or a spline approximation. This is all in all a very interesting contribution to the on-going debate about increasing MCMC speed when dealing with large datasets and ungainly likelihood functions. The proposed solution however has a major drawback in that the entire dataset must be stored at all times to ensure unbiasedness. For instance, the paper considers a bivariate probit model with a sample of 500,000 observations. Which must be available at all times. Further, unless I am confused, the subsampling step requires computing the surrogate likelihood for all observations before any subsampling can take place, another costly requirement.
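As a hedged illustration of the subsampling step (a toy Gaussian log-likelihood and surrogate weights of my own choosing, not the authors’ code), a Hansen-Hurwitz style estimate of the full-data log-likelihood looks as follows:

```python
import numpy as np

rng = np.random.default_rng(1)

# Full dataset and a parameter value theta at which to estimate the log-likelihood.
n = 10_000
data = rng.normal(loc=2.0, scale=1.0, size=n)

def log_lik_terms(theta, x):
    # Per-observation Gaussian log-density, unit variance.
    return -0.5 * (x - theta)**2 - 0.5 * np.log(2 * np.pi)

def hansen_hurwitz_loglik(theta, data, m, weights):
    """Hansen-Hurwitz estimator of sum_i log f(x_i | theta).

    Draws m indices with replacement, with probabilities proportional to
    the (surrogate) weights, and reweights the sampled terms so that the
    estimator is unbiased for the full-data log-likelihood.
    """
    p = weights / weights.sum()
    idx = rng.choice(len(data), size=m, p=p)
    terms = log_lik_terms(theta, data[idx])
    return np.mean(terms / p[idx])

# Surrogate weights: in the paper these come from a cheap (Gaussian process
# or spline) approximation of the log-likelihood terms; here I simply use
# the absolute values of the exact terms as a stand-in.
theta = 2.0
w = np.abs(log_lik_terms(theta, data)) + 1e-6
estimate = hansen_hurwitz_loglik(theta, data, m=500, weights=w)
exact = log_lik_terms(theta, data).sum()
```

With weights nearly proportional to the absolute log-likelihood terms, the estimator has very low variance, which is why the surrogate matters; note also that exponentiating such an estimator does not give an unbiased likelihood estimate, hence the correction discussed in the paper.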
It took me a fairly long while to realise there was a map of Iceland as a tag-cloud on the back of the AISTATS 2014 tee-shirt! As it was far too large for me, I thought about leaving it at the conference desk last week. I did bring it back for someone of the proper size though and discovered the above when unfolding the tee… Nice, but still not my size!