Archive for coreset

computational advances in approximate Bayesian methods [at JSM]

Posted in Statistics on August 5, 2020 by xi'an

Another broadcast for an ABC (or rather ABM) session at JSM, organised and chaired by Robert Kohn, taking place tomorrow at 10am ET, i.e., 2pm GMT, with variational and ABC talks:

454 * Thu, 8/6/2020, 10:00 AM – 11:50 AM Virtual
Computational Advances in Approximate Bayesian Methods — Topic Contributed Papers
Section on Bayesian Statistical Science
Organizer(s): Robert Kohn, University of New South Wales
Chair(s): Robert Kohn, University of New South Wales
10:05 AM Sparse Variational Inference: Bayesian Coresets from Scratch
Trevor Campbell, University of British Columbia
10:25 AM Fast Variational Approximation for Multivariate Factor Stochastic Volatility Model
David Gunawan, University of Wollongong; Robert Kohn, University of New South Wales; David Nott, National University of Singapore
10:45 AM High-Dimensional Copula Variational Approximation Through Transformation
Michael Smith, University of Melbourne; Ruben Loaiza-Maya, Monash University; David Nott, National University of Singapore
11:05 AM Mini-Batch Metropolis-Hastings MCMC with Reversible SGLD Proposal
Rachel Wang, University of Sydney; Tung-Yu Wu, Stanford University; Wing Hung Wong, Stanford University
11:25 AM Weighted Approximate Bayesian Computation via Large Deviations Theory
Cecilia Viscardi, University of Florence; Michele Boreale, University of Florence; Fabio Corradi, University of Florence; Antonietta Mira, Università della Svizzera Italiana (USI)
11:45 AM Floor Discussion

ISBA 18 tidbits

Posted in Books, Mountains, pictures, Running, Statistics, Travel, University life on July 2, 2018 by xi'an

Among a continuous sequence of appealing sessions at this ISBA 2018 meeting [says a member of the scientific committee!], I happened to attend two talks [with a wee bit of overlap] by Sid Chib in two consecutive sessions, because his co-author Anna Simoni (CREST) was unfortunately sick. Their work was about models defined by a collection of moment conditions, as often happens in econometrics, developed in a recent JASA paper by Chib, Shin, and Simoni (2017), with an extension to defining conditional expectations through a functional basis. The main approach relies on exponentially tilted empirical likelihoods, which reminded me of the empirical likelihood [BCel] implementation we ran with Kerrie Mengersen and Pierre Pudlo a few years ago, as a substitute for ABC. This issue made me wonder how Bayesian the estimating-equation concept is, as it should somewhat involve a nonparametric prior under the moment constraints.
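For readers unfamiliar with the device, the exponentially tilted empirical likelihood replaces a parametric likelihood with data weights exponentially tilted towards satisfying the moment conditions: the tilting parameter solves a convex dual problem, and the product of the resulting weights plays the rôle of a likelihood. A minimal sketch in Python, assuming a single mean restriction g(x,θ) = x − θ (the function names and this toy moment condition are mine, not from the paper):

```python
import numpy as np
from scipy.optimize import minimize

def etel_log_likelihood(theta, x):
    # moment function: g(x, theta) = x - theta (a simple mean restriction)
    g = x - theta
    # inner dual problem: lambda minimizes the empirical mean of exp(lambda * g)
    obj = lambda lam: np.mean(np.exp(lam * g))
    lam = minimize(obj, x0=0.0, method="BFGS").x[0]
    # exponentially tilted weights, normalised to sum to one
    w = np.exp(lam * g)
    w /= w.sum()
    # the ETEL "log-likelihood" is the sum of the log weights
    return np.sum(np.log(w))

rng = np.random.default_rng(0)
x = rng.normal(1.0, 1.0, size=200)
# the ETEL surface peaks at the sample mean, where the weights are uniform
print(etel_log_likelihood(x.mean(), x))
```

At θ equal to the sample mean the constraint holds with uniform weights 1/n, so the maximum value is −n log n; tilting away from it costs log-likelihood, which is what drives posterior concentration under the moment restriction.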

Note that Sid’s [talks and] papers are disconnected from ABC, as everything comes in closed form, apart from the empirical likelihood derivation, as we actually found in our own work! But this could become a substitute model for ABC uses, for instance defining the parameter θ of the model through identifying equations. Would that impose too much input from the modeller? I figure I came up with this notion mostly because of the emphasis on proxy models the previous day at ABC in ‘burgh! Another connected item of interest in the work is the possibility of accounting for misspecification of these moment conditions by introducing a vector of errors with a spike & slab distribution, although I am not sure this is 100% necessary without getting further into the paper(s) [blame conference pressure on my time].

Another highlight was attending a fantastic poster session Monday night on computational methods, except I would have needed four more hours to get through each and every poster. This new version of ISBA has split the posters between two sites (great) and themes (not so great!), while I would have preferred more sites covering all themes over all nights, to lower the noise (still bearable this year) and to increase the chance of checking all posters of interest in a particular theme…

Mentioning as well a great talk by Dan Roy about assessing deep learning performances by what he calls non-vacuous error bounds, namely through PAC-Bayesian bounds. One major comment of his was that deep learning models are much closer to non-parametric models (with the number of parameters rising with the number of observations) than to parametric ones, meaning that generative adversarial constructs like the one I discussed a few days ago may face a fundamental difficulty, as models are taken at face value there.

On closed-form solutions, a closed-form Bayes factor for component selection in mixture models by Fúquene, Steel and Rossell that resembles the Savage-Dickey version, without the measure-theoretic difficulties, but with non-local priors. And closed-form conjugate priors for the probit regression model, using unified skew-normal priors, as exhibited by Daniele Durante. These are products of Normal cdfs and pdfs, and allow for closed-form marginal likelihoods and marginal posteriors as well. (The approach is not exactly conjugate as the prior and the posterior are not in the same family.)
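The key identity behind these closed forms, if I state it correctly, is that the probit likelihood is itself a multivariate Normal cdf, so a Gaussian prior integrates out exactly (the notation below is mine, not Durante's):

```latex
% With V = diag(2y_1-1, ..., 2y_n-1) X and prior beta ~ N_p(xi, Omega),
% the probit likelihood collapses into a single n-dimensional Normal cdf:
\prod_{i=1}^{n} \Phi\bigl((2y_i - 1)\, x_i^{\top}\beta\bigr)
  = \Phi_n\bigl(V\beta;\, I_n\bigr),
\qquad
p(y) = \int \Phi_n\bigl(V\beta;\, I_n\bigr)\,
            \mathrm{N}_p\bigl(\beta;\, \xi, \Omega\bigr)\, d\beta
     = \Phi_n\bigl(V\xi;\, I_n + V\Omega V^{\top}\bigr),
```

whence the marginal likelihood is a single Gaussian orthant probability and the posterior lands in the unified skew-normal family.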

And in the final session I attended, there were two talks on scalable MCMC: one on coresets by Trevor Campbell and Tamara Broderick, which will require some time and effort to assimilate, and another one using Poisson subsampling, by Matias Quiroz and co-authors, which did not completely convince me (but this was the end of a long day…).
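For context on what both talks build upon: the naive baseline in this scalable-MCMC literature is plain uniform subsampling with inverse-probability weights, which gives an unbiased but noisy estimate of the full-data log-likelihood; coreset constructions replace the uniform draw with a carefully optimised sparse weighted subset. A quick Python sketch of the baseline only (the function names and the Gaussian toy model are my own, and this is not the Campbell–Broderick algorithm):

```python
import numpy as np

def subsampled_loglik(loglik_fn, data, m, theta, rng):
    """Unbiased estimate of the full-data log-likelihood from a
    uniform subsample of size m, each point weighted by n/m."""
    n = len(data)
    idx = rng.choice(n, size=m, replace=False)
    return (n / m) * sum(loglik_fn(data[i], theta) for i in idx)

rng = np.random.default_rng(1)
data = rng.normal(0.0, 1.0, size=10_000)
loglik = lambda x, th: -0.5 * (x - th) ** 2  # Gaussian log-density, constants dropped
full = loglik(data, 0.0).sum()               # exact full-data value, for comparison
approx = subsampled_loglik(loglik, data, 500, 0.0, rng)
```

The variance of this estimator is what a coreset is designed to beat: instead of m random points, one seeks a small weighted subset whose weighted log-likelihood stays close to the full one uniformly over θ, which is where the hard optimisation work lies.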

All in all, this has been a great edition of the ISBA meetings, if quite intense due to a non-stop schedule, with a very efficient organisation that made parallel sessions manageable and poster sessions back to a reasonable scale [although I did not once manage to cross the street to the other session]. Being in unreasonably sunny Edinburgh helped a lot, obviously! I am a wee bit disappointed that no one else followed my call to wear a kilt, but I had low expectations to start with… And too bad I missed the Ironman 70.3 Edinburgh by one day!