extending ABC to high dimensions via Gaussian copula

Posted in Books, pictures, Statistics, Travel, Uncategorized, University life on April 28, 2015 by xi'an

Li, Nott, Fan, and Sisson arXived last week a new paper on ABC methodology that I read on my way to Warwick this morning. The central idea in the paper is (i) to estimate marginal posterior densities for the components of the model parameter by non-parametric means; and (ii) to consider all pairs of components to deduce the correlation matrix R of the Gaussian (quantile) transforms of the pairwise rank statistics. From those two low-dimensional estimates, the authors derive a joint Gaussian-copula distribution by using marginal cdf and Gaussian quantile transforms and the correlation matrix R, to end up with a meta-Gaussian representation

f(\theta)=\dfrac{1}{|R|^{1/2}}\exp\{\eta^\prime(I-R^{-1})\eta/2\}\prod_{i=1}^p g_i(\theta_i)

where the η’s are the Gaussian quantile transforms of the cdf transforms of the θ’s, that is,

\eta_i=\Phi^{-1}(G_i(\theta_i))

Or rather

\eta_i=\Phi^{-1}(\hat{G}_i(\theta_i))

given that the g’s are estimated.

This is obviously an approximation of the joint in that, even in the most favourable case when the g’s are perfectly estimated, and thus each transformed component η perfectly Gaussian, the joint distribution of the η’s is not necessarily Gaussian… But it sounds quite interesting, provided the cost of running all those transforms is not overwhelming. For instance, if the g’s are kernel density estimators, evaluating them involves sums over a possibly large number of terms.
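To fix ideas, here is a minimal sketch (in Python, mine rather than the authors’) of the meta-Gaussian reconstruction from a single (n, p) array of approximate posterior draws; in the paper each marginal and each pairwise correlation would instead come from its own low-dimensional ABC run with its own summaries. The use of scipy’s gaussian_kde, the rank-based estimate of R, and the clipping constant are illustrative assumptions.

```python
# Minimal sketch of a meta-Gaussian (Gaussian copula) reconstruction,
# assuming `samples` is an (n, p) array of approximate posterior draws.
import numpy as np
from scipy import stats

def fit_meta_gaussian(samples):
    n, p = samples.shape
    # marginal density estimates g_i via kernel density estimation
    kdes = [stats.gaussian_kde(samples[:, i]) for i in range(p)]
    # normalised ranks, then Gaussian quantile transform of each component
    ranks = np.argsort(np.argsort(samples, axis=0), axis=0) + 1
    eta = stats.norm.ppf(ranks / (n + 1.0))
    # correlation matrix R of the Gaussianised ranks
    R = np.corrcoef(eta, rowvar=False)
    return kdes, R

def log_meta_gaussian_density(theta, samples, kdes, R):
    p = len(theta)
    # hat-G_i(theta_i) from the empirical cdf of each sampled component
    u = np.array([np.mean(samples[:, i] <= theta[i]) for i in range(p)])
    u = np.clip(u, 1e-6, 1 - 1e-6)          # keep Phi^{-1} finite
    eta = stats.norm.ppf(u)
    Rinv = np.linalg.inv(R)
    _, logdetR = np.linalg.slogdet(R)
    # log of |R|^{-1/2} exp{ eta'(I - R^{-1}) eta / 2 }
    log_copula = -0.5 * logdetR + 0.5 * eta @ (np.eye(p) - Rinv) @ eta
    log_marginals = sum(np.log(kdes[i](theta[i])[0]) for i in range(p))
    return log_copula + log_marginals
```

Each evaluation of the log-density then costs p kernel-density evaluations plus a p×p solve, which is where the cost remark above bites when n is large.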

One thing that bothers me in the approach, albeit mostly at a conceptual level since I realise the practical appeal, is the use of different summary statistics for approximating the different uni- and bi-dimensional marginals. This makes for an incoherent joint distribution, again at a conceptual level, as I do not see immediate practical consequences… Those local summaries also have to be identified, component by component, which adds another layer of computational cost to the approach, even when using a semi-automatic approach as in Fearnhead and Prangle (2012), although the whole algorithm relies on a single reference table.

The examples in the paper are (i) the banana-shaped “Gaussian” distribution of Haario et al. (1999) that we used in our PMC papers, with a twist; and (ii) a g-and-k quantile distribution. The twist in the banana (!) is that the banana distribution is the prior associated with the mean of a Gaussian observation. In that case, the meta-Gaussian representation seems to hold almost perfectly, even in p=50 dimensions. (If I remember correctly, the hard part in analysing the banana distribution was reaching the tails, which are extremely elongated in at least one direction.) For the g-and-k quantile distribution, the same holds, even for a regular ABC. What would be of further interest is to exhibit examples where the meta-Gaussian is clearly an approximation. If such cases exist.
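For readers who have not met these two benchmarks, here are quick sketches (again mine, using the usual textbook parameterisations rather than the exact settings of the paper): the twisted “banana” Gaussian of Haario et al., with bananicity constant b, and the g-and-k distribution, which is defined through its quantile function and hence trivially simulated although its density has no closed form, which is precisely what makes it a standard ABC benchmark.

```python
import numpy as np
from scipy import stats

def log_banana_density(x, b=0.03, s1=100.0):
    # twisted Gaussian of Haario et al.: untwist the second coordinate,
    # then evaluate a N(0, diag(s1, 1, ..., 1)) log-density
    y = np.array(x, dtype=float)
    y[1] = y[1] + b * y[0] ** 2 - s1 * b
    scales = np.ones_like(y)
    scales[0] = np.sqrt(s1)
    return np.sum(stats.norm.logpdf(y, scale=scales))

def gk_quantile(u, A=3.0, B=1.0, g=2.0, k=0.5, c=0.8):
    # g-and-k distribution, defined through its quantile function
    z = stats.norm.ppf(u)
    return A + B * (1 + c * (1 - np.exp(-g * z)) / (1 + np.exp(-g * z))) * (1 + z ** 2) ** k * z

# simulating from the g-and-k only requires uniform draws
draws = gk_quantile(np.random.uniform(size=10 ** 4))
```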

the travelling politician problem

Posted in pictures, Statistics, Travel, University life on April 27, 2015 by xi'an

probabilistic numerics

Posted in pictures, Running, Statistics, Travel, University life on April 27, 2015 by xi'an

I attended a highly unusual workshop while in Warwick last week. Unusual for me, obviously. It was about probabilistic numerics, i.e., the use of probabilistic or stochastic arguments in the numerical resolution of (possibly) deterministic problems. The notion in this approach is fairly Bayesian in that it makes use of prior information or belief about the quantity of interest, e.g., a function, to construct a (usually Gaussian-process) prior and derive both an estimator that is identical to a numerical method (e.g., Runge-Kutta or trapezoidal integration) and uncertainty or variability around this estimator. While I did not grasp much more than the classy introductory talk by Philipp Hennig, this concept sounds fairly interesting, if only because of the Bayesian connection, and I wonder if we will soon see a probabilistic numerics section at ISBA! More seriously, placing priors on functions or functionals is a highly formal perspective (as in Bayesian non-parametrics) and it makes me wonder how much of the data (evaluations of the function at a given set of points) and how much of the prior is reflected in the output [variability]. (Obviously, one could also ask a similar question of statistical analyses!) For instance, issues of singularity arise among those stochastic-process priors.
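As a toy illustration of that estimator-plus-uncertainty principle (my own sketch, not anything presented at the workshop), here is a Bayesian quadrature routine in which a Gaussian-process prior with a squared-exponential kernel is placed on the integrand; the length-scale, the jitter term, and the test function are arbitrary choices.

```python
# Bayesian quadrature sketch: a GP prior on the integrand turns numerical
# integration into posterior inference, returning an estimate and a variance.
import numpy as np
from scipy import integrate, stats

def bayes_quadrature(f, a, b, nodes, ell=0.5, jitter=1e-8):
    x = np.asarray(nodes, dtype=float)
    fx = f(x)
    # squared-exponential kernel matrix at the nodes (with jitter for stability)
    K = np.exp(-0.5 * ((x[:, None] - x[None, :]) / ell) ** 2) + jitter * np.eye(len(x))
    # kernel integrated against Lebesgue measure on [a, b], once and twice
    z = ell * np.sqrt(2 * np.pi) * (stats.norm.cdf((b - x) / ell) - stats.norm.cdf((a - x) / ell))
    zz, _ = integrate.dblquad(lambda s, t: np.exp(-0.5 * ((s - t) / ell) ** 2), a, b, a, b)
    w = np.linalg.solve(K, z)                     # quadrature weights implied by the GP
    estimate = w @ fx                             # posterior mean of the integral
    variance = zz - z @ np.linalg.solve(K, z)     # posterior variance (data-independent)
    return estimate, variance

est, var = bayes_quadrature(np.sin, 0.0, np.pi, np.linspace(0.0, np.pi, 9))
# est approximates the true value 2; var quantifies the residual numerical uncertainty
```

The posterior mean is a weighted sum of the function evaluations, i.e., a quadrature rule, while the posterior variance, which does not depend on the function values, is what probabilistic numerics offers as a measure of numerical uncertainty.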

Another question that stemmed from this talk is whether or not more efficient numerical methods can be derived that way, in addition to recovering the most classical ones. Somehow, given the idealised nature of the prior, it feels like priors could be more easily compared or ranked than in classical statistical problems, since the aim is to figure out the value of an integral or the solution to an ODE. (Or maybe not, since again almost the same could be said about estimating a normal mean.)

fare well, Paula!

Posted in pictures, Running on April 26, 2015 by xi'an

the forever war [book review]

Posted in Books, Kids on April 26, 2015 by xi'an

Another book I bought somewhat on a whim, although I cannot remember which one… The latest edition has a preface by John Scalzi, author of Old Man’s War and its sequels, where he acknowledged he would not have written this series had he previously read The Forever War. Which strikes me as ironic, as I found Scalzi’s novels way better. Deeper. And obviously not getting obsolete as quickly! (As an aside, Scalzi is returning to the Old Man’s War universe with a new novel, The End of All Things.)

“…it’s easy to compute your chances of being able to fight it out for ten years. It comes to about two one-thousandths of one percent. Or, to put it another way, get an old-fashioned six-shooter and play Russian Roulette with four of the six chambers loaded. If you can do it ten times in a row without decorating the opposite wall, congratulations! You’re a civilian.”

This may be the main issue with The Forever War: the fact that it sounds so antiquated, and hence makes reading the novel like an exercise in Creative Writing 101, spotting how the author was so rooted in the 1970’s that he could not project far enough into the future to make his novel sustainable. The main issue in the suspension of disbelief required to proceed through the book is the low-tech configuration of Haldeman’s future. Even though intergalactic travel is possible via the traditional portals found in almost every sci-fi book, computers are blatantly missing from the picture, and so is artificial intelligence. (2001: A Space Odyssey was made in 1968, right?!) The economics of a forever-warring Earth are quite vague and unconvincing. There are no clever tactics in the war against the Taurans. Even the battle scenes are far from exciting, especially the parts where they fight with swords and arrows. And the treatment of sexuality has not aged well. So all that remains in favour of the story (and presumably made the success of the book) is the description of the ground soldier’s life, which could almost transcribe verbatim to another war and another era. End of the story. (Unsurprisingly, while being the first book picked for the SF Masterworks collection, The Forever War did not make it into the 2011 series…)

bruggen in Amsterdam

Posted in pictures, Running, Travel on April 25, 2015 by xi'an


ontological argument

Posted in Books, Kids, pictures on April 25, 2015 by xi'an
