Archive for approximate Bayesian inference

One World ABC seminar [term #2]

Posted in Statistics on September 29, 2020 by xi'an

The One World ABC seminar continues on-line this semester! Talks take place every other Thursday at 11:30 UK time (12:30 Central European time). Incoming speakers are

with presenters to be confirmed for 15 and 29 October. Anyone interested in presenting at this webinar in the near future should not hesitate to contact Massimiliano Tamborrino in Warwick or any of the other organisers of the seminar!

special issue of Entropy

Posted in Statistics on September 11, 2020 by xi'an

computational advances in approximate Bayesian methods [at JSM]

Posted in Statistics on August 5, 2020 by xi'an

Another broadcast for an ABC (or rather ABM) session at JSM, organised and chaired by Robert Kohn, taking place tomorrow at 10am ET (2pm GMT), with variational and ABC talks:

454 * Thu, 8/6/2020, 10:00 AM – 11:50 AM Virtual
Computational Advances in Approximate Bayesian Methods — Topic Contributed Papers
Section on Bayesian Statistical Science
Organizer(s): Robert Kohn, University of New South Wales
Chair(s): Robert Kohn, University of New South Wales
10:05 AM Sparse Variational Inference: Bayesian Coresets from Scratch
Trevor Campbell, University of British Columbia
10:25 AM Fast Variational Approximation for Multivariate Factor Stochastic Volatility Model
David Gunawan, University of Wollongong; Robert Kohn, University of New South Wales; David Nott, National University of Singapore
10:45 AM High-Dimensional Copula Variational Approximation Through Transformation
Michael Smith, University of Melbourne; Ruben Loaiza-Maya, Monash University; David Nott, National University of Singapore
11:05 AM Mini-Batch Metropolis-Hastings MCMC with Reversible SGLD Proposal
Rachel Wang, University of Sydney; Tung-Yu Wu, Stanford University; Wing Hung Wong, Stanford University
11:25 AM Weighted Approximate Bayesian Computation via Large Deviations Theory
Cecilia Viscardi, University of Florence; Michele Boreale, University of Florence; Fabio Corradi, University of Florence; Antonietta Mira, Università della Svizzera Italiana (USI)
11:45 AM Floor Discussion

Savage Award session today at JSM

Posted in Kids, Statistics, Travel, University life on August 3, 2020 by xi'an

Pleased to broadcast the JSM session dedicated to the 2020 Savage Award, taking place today at 13:00 ET (17:00 GMT), with two of the Savage nominees being former OxWaSP students (and Warwick PhD students). For those who have not registered for JSM, the talks are also available on Bayeslab. (As it happens, I was also a member of the committee this year, but do not think this could be deemed a CoI!)

112 Mon, 8/3/2020, 1:00 PM – 2:50 PM Virtual
Savage Award Session — Invited Papers
International Society for Bayesian Analysis (ISBA)
Organizer(s): Maria De Iorio, University College London
Chair(s): Maria De Iorio, University College London
1:05 PM Bayesian Dynamic Modeling and Forecasting of Count Time Series
Lindsay Berry, Berry Consultants
1:30 PM Machine Learning Using Approximate Inference: Variational and Sequential Monte Carlo Methods
Christian Andersson Naesseth, Columbia University
1:55 PM Recent Advances in Bayesian Probabilistic Numerical Integration
Francois-Xavier Briol, University College London
2:20 PM Factor regression for dimensionality reduction and data integration techniques with applications to cancer data
Alejandra Avalos Pacheco, Harvard Medical School
2:45 PM Floor Discussion

improving synthetic likelihood

Posted in Books, Statistics, University life on July 9, 2020 by xi'an

Chris Drovandi gave an after-dinner [QUT time!] talk for the One World ABC webinar on a recent paper he wrote with Jacob Priddle, Scott Sisson and David Frazier. The approach runs a regular MCMC step on a synthetic likelihood approximation to the posterior, or rather on a (simulation-based) unbiased estimator of it.

By evaluating the variance of the log-likelihood estimator, the authors show that the number of simulations must grow with the squared dimension d² of the summary statistic to keep this variance under control. They suggest PCA decorrelation of the summary statistic components as a means to reduce the variance, since the requirement then scales with d rather than d². Rather idly, I wonder at the final relevance of precisely estimating the (synthetic) likelihood, considering it is not the true likelihood. Moving from d² to d seems directly related to estimating a full covariance matrix for the Normal synthetic distribution of the summary statistic versus estimating a diagonal one. The usual complaint that performances depend heavily on the choice of the summary statistic also applies here, in particular when its dimension d is much larger than the dimension of the parameter (as in the MA example), although this does not seem to impact the scale of the variance.
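To make the full-versus-diagonal covariance point concrete, here is a minimal sketch of the standard synthetic likelihood estimator: simulate n summary statistics at a parameter value, fit a Normal, and evaluate the observed summary under it. The `simulate` callable and all names are illustrative assumptions, not the authors' code; the `diagonal` flag only mimics the effect of working with decorrelated summaries (estimating d variances instead of O(d²) covariance entries), it is not the paper's whitening transformation itself.

```python
import numpy as np
from scipy.stats import multivariate_normal

def synthetic_loglik(theta, simulate, s_obs, n=100, diagonal=False, rng=None):
    """Estimate the synthetic log-likelihood of the observed summary s_obs.

    simulate(theta, rng) must return a d-dimensional summary statistic
    (hypothetical interface). With diagonal=True only marginal variances
    are estimated, as would suffice after decorrelating the summaries.
    """
    rng = np.random.default_rng() if rng is None else rng
    sims = np.array([simulate(theta, rng) for _ in range(n)])  # shape (n, d)
    mu = sims.mean(axis=0)
    if diagonal:
        # d free parameters: variances only
        cov = np.diag(sims.var(axis=0, ddof=1))
    else:
        # O(d^2) free parameters: full sample covariance
        cov = np.cov(sims, rowvar=False)
    return multivariate_normal.logpdf(s_obs, mean=mu, cov=cov)

# toy usage: Gaussian summaries centred at theta (illustrative model)
rng = np.random.default_rng(0)
simulate = lambda theta, rng: rng.normal(theta, 1.0, size=2)
s_obs = np.array([0.1, -0.2])
ll_full = synthetic_loglik(np.zeros(2), simulate, s_obs, n=200, rng=rng)
ll_diag = synthetic_loglik(np.zeros(2), simulate, s_obs, n=200, diagonal=True, rng=rng)
```

Since both estimators are random in the n simulations, an MCMC chain using them targets the synthetic posterior only approximately, which is where the variance-control requirement on n enters.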