Archive for SimStat2019

dodging bullets, IEDs, and fingerprint detection at SimStat19

Posted in pictures, Statistics, University life on September 10, 2019 by xi'an

I attended a fairly interesting forensic science session at SimStat 2019 in Salzburg, as it concentrated on evidence and measures of evidence rather than on strict applications of Bayesian methodology to forensic problems, even though American agencies like the FBI and various police departments were involved. It was a highly coherent session and I had a pleasant discussion with some of the speakers afterwards. For instance, my friend Alicia Carriquiry presented an approach to determine from images of bullets whether or not they were fired from the same gun, an interesting case of a point null hypothesis where the point null makes complete sense. The work has been published in the Annals of Applied Statistics and is used in practice.

The second talk, by Danica Ommen, was on fiducial forensics for IEDs, asking whether or not the copper wires used in the bombs are the same, another point null illustration. It also raised an interesting question about the dependence of the alternative prior on the distribution of material chosen, as this prior is supposed to cover all possible origins of the disputed item. More interestingly, the talk launched into a discussion of making decisions based on finite samples and unknown parameters, not that specific to forensics, with a definitely surprising representation of the Bayes factor as an expected likelihood ratio, which first reminded me of Aitkin's (1991) infamous posterior likelihood (!) before it dawned on me that this was a form of bridge sampling identity (spelled out at the end of this post), where the likelihood ratio only involves parameters common to both models, making it an expression well-defined under both models. The identity can be generalised by considering a ratio of integrated likelihoods, the extreme case being the ratio equal to the Bayes factor itself.

The following two talks, by Larry Tang and Christopher Saunders, also focused on the likelihood ratio and its statistical estimation, debating the coherence of using a score function and presenting a functional ABC algorithm where the prior is a (functional) Dirichlet prior. Thus a definitely relevant session from a Bayesian perspective!
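To spell out the bridge sampling identity mentioned above (a reconstruction on my part and in my own notation, not necessarily the speaker's formulation): assuming for simplicity that both models share the parameter θ and its prior π, with respective likelihoods f₀ and f₁, and writing π₁(θ|x) ∝ f₁(x|θ)π(θ) for the posterior under model 1, the Bayes factor is an expected likelihood ratio,

\[
B_{01}=\frac{m_0(x)}{m_1(x)}
=\int \frac{f_0(x\mid\theta)}{f_1(x\mid\theta)}\,
\frac{f_1(x\mid\theta)\,\pi(\theta)}{m_1(x)}\,\mathrm{d}\theta
=\mathbb{E}_{\pi_1(\theta\mid x)}\!\left[\frac{f_0(x\mid\theta)}{f_1(x\mid\theta)}\right],
\]

an expression well-defined under both models since only the common θ appears in the ratio; when the models carry additional parameters, substituting integrated likelihoods for f₀ and f₁ recovers the general case.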


Salzburg castle [jatp]

Posted in Mountains, pictures, Travel on September 9, 2019 by xi'an

likelihood-free inference by ratio estimation

Posted in Books, Mountains, pictures, Running, Statistics, Travel, University life on September 9, 2019 by xi'an

“This approach for posterior estimation with generative models mirrors the approach of Gutmann and Hyvärinen (2012) for the estimation of unnormalised models. The main difference is that here we classify between two simulated data sets while Gutmann and Hyvärinen (2012) classified between the observed data and simulated reference data.”

A 2018 arXiv posting by Owen Thomas et al. (including my colleague at Warwick, Rito Dutta, CoI warning!) is about estimating the likelihood (and the posterior) when it is intractable. Likelihood-free, but not ABC, since the ratio of the likelihood to the marginal is estimated in a non- or semi-parametric (and biased) way. Following Geyer's (1994) fabulous estimate of an unknown normalising constant via logistic regression, the current paper, which I read in preparation for my discussion in the ABC optimal design session in Salzburg, uses probabilistic classification and an exponential family representation of the ratio, opposing data simulated from the density to data simulated from the marginal, assuming both can be readily produced. The logistic regression minimizing the asymptotic classification error is the logistic transform of the log-ratio, so that for a finite (double) sample this minimization leads to an empirical version of the ratio. Or to a smooth version if the log-ratio is represented as a convex combination of summary statistics, turning the approximation into an exponential family, which is a clever way to come full circle towards ABC notions and synthetic likelihood, albeit with the difference that the exponential family parameters β(θ), which are indeed conditional on the parameter θ, are estimated by minimizing the classification error.

Actually the paper introduces a further penalisation or regularisation term on those parameters β(θ), which could have been processed by a Bayesian Lasso instead. This step is essentially driving the selection of the summaries, except that it is done for each value of the parameter θ, at the expense of an X-validation step. This is quite an original approach, as far as I can tell, but I wonder at the link with more standard density estimation methods, in particular in terms of the precision of the resulting estimate (and the speed of convergence with the sample size, if convergence there is).
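As a toy illustration of the logistic regression trick at the heart of the paper, here is a minimal sketch of my own (not the authors' code): a plain scikit-learn logistic regression on the summaries (x, x²) stands in for their penalised exponential-family version, and the Gaussian example is chosen so that the true log-ratio is exactly quadratic in x.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n, theta = 5000, 1.5

# label 1: draws from the likelihood p(x|theta), here N(theta, 1)
x1 = rng.normal(theta, 1.0, n)
# label 0: draws from the marginal m(x), here the prior predictive
# under a N(0, 2^2) prior on theta, i.e. N(0, 5) [variance 1 + 4]
x0 = rng.normal(0.0, np.sqrt(5.0), n)

# summary statistics (x, x^2): enough to capture a Gaussian log-ratio
X = np.concatenate([x0, x1])[:, None]
feats = np.hstack([X, X ** 2])
y = np.r_[np.zeros(n), np.ones(n)]

# ridge-type regularisation via C, echoing the paper's penalty on beta(theta)
clf = LogisticRegression(C=10.0).fit(feats, y)

# with equal sample sizes, the fitted logit estimates
# log p(x|theta) - log m(x), here evaluated at x = 1
logit_hat = clf.decision_function(np.array([[1.0, 1.0]]))[0]
logit_true = -0.5 * (1.0 - theta) ** 2 + 0.5 * np.log(5.0) + 1.0 ** 2 / 10.0
print(f"estimated log-ratio {logit_hat:.3f} vs true {logit_true:.3f}")

As n grows (and the regularisation weakens), the fitted logit converges to the true log-ratio, and multiplying the prior by the estimated ratio then delivers a posterior approximation, which is, as I understand it, the approach of the paper.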

likelihood-free Bayesian design [SimStat 2019 discussion]

Posted in Statistics on September 5, 2019 by xi'an

über Salzburg [jatp]

Posted in Mountains, pictures, Running, Travel on September 5, 2019 by xi'an

off to SimStat2019, Salzburg

Posted in Mountains, Running, Statistics, University life on September 2, 2019 by xi'an

Today, I am off to Salzburg for the SimStat 2019 workshop, or more formally the 10th International Workshop on Simulation and Statistics, where I give a talk on ABC. The program of the workshop is quite rich and diverse, and I do not think I will have time to take advantage of the Hohe Tauern or the Berchtesgaden Alps to go climbing, especially since I am also discussing papers in an ABC session.