Archive for conference

ISBA 2021.2

Posted in Kids, Mountains, pictures, Running, Statistics, Travel, University life, Wines on June 30, 2021 by xi'an

A second day that started earlier and more smoothly, with a 100% local j-ISBA session. (Not counting the invigorating swim in Morgiou!) It featured talks by junior researchers, from Master's to postdoc level, as this ISBA mirror meeting was primarily designed for them, so that they could all present their work, gain more visibility for their research, and interact more with the participants. CIRM also insisted on this aspect of the workshop, which was well attended.

I alas had to skip the poster session [and the joys of gather.town], despite skipping lunch [BND], due to organisational constraints. I then attended the Approximate Bayesian computation session, including one talk by Geoff Nicholls on confidence estimation for ABC, following up on the talk given by Kate the previous evening. And one by Florian Maire on learning the bound in accept-reject algorithms on the go, as in Caffo et al. (2002), which I found quite exciting and opening new possibilities, especially if the Markov chain thus produced can be recycled towards unbiasedness without getting the constant right! For instance, by Rao-Blackwellisation, multiple mixtures, black-box importance sampling, whatever. (This also reminded me of the earlier Goffinet et al. 1996.)
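For readers unfamiliar with the bound-learning trick, here is a minimal sketch in the spirit of Caffo et al. (2002), not of Florian's method, with an illustrative Gamma target and Exponential proposal of my own choosing: the envelope constant M is simply bumped up whenever a proposed value violates it, so the sampler is only asymptotically exact.

```python
# Minimal "empirical supremum" accept-reject sketch (Caffo et al., 2002):
# the bound M on f(x)/g(x) is unknown and learned on the go.
# Target and proposal below are illustrative choices, not from the talk.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
target = stats.gamma(a=3.0)        # target density f (assumed example)
proposal = stats.expon(scale=3.0)  # proposal density g (assumed example)

def esup_rejection(n, M=1.0):
    """Accept-reject with the envelope constant updated on the fly."""
    out = []
    while len(out) < n:
        x = proposal.rvs(random_state=rng)
        ratio = target.pdf(x) / proposal.pdf(x)
        if ratio > M:              # bound violated: enlarge the estimate
            M = ratio
        if rng.uniform() <= ratio / M:
            out.append(x)
    return np.array(out), M

samples, M_hat = esup_rejection(10_000)
print(f"learned bound ≈ {M_hat:.3f}, sample mean ≈ {samples.mean():.3f}")
# true bound is ≈ 1.83 for this pair and the Gamma(3,1) mean is 3
```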

This was followed by another Bayesian (modeling and) computation session, with my old friend Peter Müller talking about mixture inference with dependent priors (and a saturated colour scheme!), and Matteo Ruggieri [who could not make it to CIRM!] on computable Bayesian inference for HMMs, providing an impressive improvement upon particle filters for approximating the evidence. Also bringing the most realistic Chinese restaurant with conveyor belt! And Mingyuan Zhou using optimal transport to define a distance between distributions, with two different conditional distributions depending on which marginal is fixed first. And a connection with GANs (of course!).

And it was great to watch and listen to my friend Alicia Carriquiry talking on forensic statistics and the case for (or not?!) Bayes factors, and remembering Dennis Lindley. And my friend Jim Berger on frequentism versus Bayes! Consistency seems an innocuous requirement, as most Bayes procedures are consistent. Empirical coverage is another kind of consistency, isn’t it?

A remark I made when re-typing the program for CIRM is that there are surprisingly few talks about COVID-19 overall, maybe due to the program being mostly set for ISBA 2020 in Kunming. Maybe because we are more cautious than the competition…?!

And, at last, despite a higher density of boars around the CIRM facilities, no one got hurt yesterday! Unless one counts the impact of the French defeat at the Euro 2021 on the football fans here…

ISBA 2021.1

Posted in Kids, Mountains, pictures, Running, Statistics, Travel, University life, Wines on June 29, 2021 by xi'an

An infinite (mixture) session was truly the first one I could attend on Day 1, as a heap of unexpected last-minute issues kept me busy or on edge for the beginning of the day (if not preventing me from a dawn dip in the Calanque de Morgiou). Using the CIRM video system for Zoom talks required more preparation than I had anticipated and we barely made it in time for the first session, while I had to collect Zoom links for all speakers present in Luminy. Plus allocate sessions to the rooms provided by CIRM, twice, since there was a mishap with the other workshop hosted at CIRM. And reassure speakers made anxious by the absence of a clear schedule. Chairing the second ABC session was also a tense moment, from checking that every speaker could connect and share slides, to ensuring they kept on schedule (and they did on both counts, ta!), to checking for questions at the end. I spotted a possible connection between Takuo Matsubara's Stein approximation in the ABC setup and a related paper by Liu and Lee I had read just a few days ago. Alas, it was too early to relax, as an inverter in the CIRM room burned out and led to a local power failure. Fortunately, power was restored prior to the mixture session! (As several boars were spotted on the campus yesternight, I hope no tragic encounter happens before the end of the meeting!!!)

So the mixture session proposed new visions on inferring K, the number of components, some of which reminded me of… my first talk at CIRM, where I was trying to get rid of empty components at each MCMC step, albeit in a much more rudimentary way, obviously. And I later had the wonderful surprise of hearing Xiao-Li's lecture start with an excerpt from Car Talk, the hilarious Sunday-morning radio talk-show about the art of used-car maintenance on National Public Radio (NPR) that George Casella could not miss (and where a letter he wrote them about a mistaken probability computation was mentioned!).

The final session of the day was an invited ABC session I chaired (after being exfiltrated from the CIRM dinner table!), with Kate Lee, Ryan Giordano, and Julien Stoehr as speakers. Besides Julien's talk on our Gibbs-ABC paper, both other talks shared a concern with the frequentist properties of the ABC posterior, to be used either as a control tool or as a faster assessment of the variability of the (Monte Carlo) ABC output.

EM degeneracy

Posted in pictures, Statistics, Travel, University life on June 16, 2021 by xi'an

At the MHC 2021 conference today (which I biked to and attended in person, for real!, a first since BayesComp!) I listened to Christophe Biernacki exposing the dangers of EM applied to mixtures in the presence of missing data, namely that the algorithm has a rising probability of reaching a degenerate solution, that is, a single-observation component, rising with the proportion of missing data. This is not hugely surprising as there is a real (global) mode at this solution. If single-observation components are prohibited, they should not be accepted in the EM update. Just as in Bayesian analyses with improper priors, the likelihood should bar single- or double-observation components… Which of course makes EM harder to implement. Or not?! MCEM, SEM and Gibbs are obviously straightforward to modify in this case.
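To make the degenerate mode concrete, here is a tiny numerical check of my own (not Christophe's illustration): pin one Gaussian component on a single observation and let its standard deviation shrink, and the two-component mixture log-likelihood grows without bound, so EM has every "right" to drift there.

```python
# Toy illustration of the mixture-likelihood degeneracy: one component
# glued to a single observation with vanishing variance makes the
# log-likelihood diverge. Data and settings are illustrative only.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
x = rng.normal(size=100)                     # data from a plain N(0,1)

def mixture_loglik(x, w, mu1, sd1, mu2, sd2):
    """Log-likelihood of a two-component Gaussian mixture."""
    dens = w * norm.pdf(x, mu1, sd1) + (1 - w) * norm.pdf(x, mu2, sd2)
    return np.log(dens).sum()

for sd1 in (1.0, 0.1, 0.01, 0.001):
    # first component pinned on the first observation, variance shrinking
    ll = mixture_loglik(x, w=0.05, mu1=x[0], sd1=sd1, mu2=0.0, sd2=1.0)
    print(f"sd1 = {sd1:7.3f}   log-likelihood = {ll:10.2f}")
# the log-likelihood keeps increasing as sd1 -> 0: a spurious global mode
```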

Judith Rousseau also gave a fascinating talk on the properties of non-parametric mixtures, from a surprisingly light set of conditions for identifiability to posterior consistency. With an interesting simultaneous use of several priors, which is a particular case of cut models, namely a well-defined joint distribution that cannot be a posterior, although this does not impact simulation issues. And a nice trick turning a hidden Markov chain into a fully finite hidden Markov chain, as this is sufficient to recover Bernstein-von Mises asymptotics. If inefficient. Sylvain Le Corff presented a pseudo-marginal sequential sampler for smoothing, when the transition densities are replaced by unbiased estimators. With connections to approximate Bayesian computation smoothing. This proves harder than I first imagined because of the backward-sampling operations…
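As a reminder of where the transition density enters, here is a bare-bones forward-filtering backward-sampling sketch on a toy linear-Gaussian model of my own (not from the talk): the backward weights require pointwise evaluations of the transition density, which is precisely the step that becomes delicate once only an unbiased estimator of that density is available.

```python
# Bare-bones FFBS on a toy AR(1)-plus-noise model, to highlight the
# pointwise transition-density evaluations in the backward pass.
# Model and parameter values are illustrative, not from the talk.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
phi, sig, tau, T, N = 0.9, 0.5, 1.0, 50, 500

# simulate a hidden AR(1) state and noisy observations
x_true = np.zeros(T)
for t in range(1, T):
    x_true[t] = phi * x_true[t - 1] + sig * rng.normal()
y = x_true + tau * rng.normal(size=T)

# forward pass: bootstrap particle filter
parts, wts = np.zeros((T, N)), np.zeros((T, N))
parts[0] = rng.normal(scale=sig / np.sqrt(1 - phi**2), size=N)
wts[0] = norm.pdf(y[0], parts[0], tau)
wts[0] /= wts[0].sum()
for t in range(1, T):
    anc = rng.choice(N, size=N, p=wts[t - 1])     # multinomial resampling
    parts[t] = phi * parts[t - 1, anc] + sig * rng.normal(size=N)
    wts[t] = norm.pdf(y[t], parts[t], tau)
    wts[t] /= wts[t].sum()

# backward pass: draw one smoothing trajectory
traj = np.zeros(T)
j = rng.choice(N, p=wts[-1])
traj[-1] = parts[-1, j]
for t in range(T - 2, -1, -1):
    # backward weights need the transition density evaluated pointwise;
    # this is the step that breaks when only an unbiased estimator is at hand
    bw = wts[t] * norm.pdf(traj[t + 1], phi * parts[t], sig)
    bw /= bw.sum()
    j = rng.choice(N, p=bw)
    traj[t] = parts[t, j]

print("MSE of one smoothed trajectory vs the truth:", np.mean((traj - x_true) ** 2))
```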

ICWES18 in Warwick [11-14 September 2020]

Posted in Statistics on March 10, 2020 by xi'an

ISBA2020 program

Posted in Kids, Statistics, Travel, University life on January 29, 2020 by xi'an

The scheduled program for ISBA 2020 is now online. And full of exciting sessions, many with a computational focus. With dear hopes that the 2019-nCoV epidemic will have abated by then (and not solely for the sake of the conference, most obviously!). While early registration ends on 15 April, the deadline for junior travel support falls this month. And so does the deadline for contributions.