Archive for University of Helsinki

congrats, Dr. Clarté!

Posted in Books, pictures, Statistics, Travel, University life with tags ABC, ABC in Helsinki, ABC-Gibbs, fantasy, hybrid Monte Carlo, PhD thesis, PSL Research University, Teams, thesis defence, Université Paris Dauphine, University of Helsinki on October 9, 2021 by xi'an

Grégoire Clarté, whom I co-supervised with Robin Ryder, successfully defended his PhD thesis, on sign language classification, ABC-Gibbs, and collective non-linear MCMC, last Wednesday! Congrats to the now Dr. Clarté for this achievement, and all the best for his coming Nordic adventure, as he is starting a postdoc at the University of Helsinki with Aki Vehtari and others. It was quite fun to work with Grégoire over these years, and to discuss an unlimited number of unrelated topics, incl. fantasy books, teas, cooking, and the role of conferences and travel in academic life! The defence itself proved a challenge, as four members of the jury, incl. myself, were “present remotely” and frequently interrupted him over gaps in the Teams transmission, which nonetheless perfectly broadcast the honks of the permanent traffic jam at Porte Dauphine… (And alas we could not share a celebratory cup with him!)

European statistics in Finland [EMS17]
Posted in Books, pictures, Running, Statistics, Travel, University life with tags ABC, AISTATS 2016, Amazon, AMIS, Bayesian optimisation, deterministic mixtures, EMS 2017, Europe, European Meeting of Statisticians, exact Monte Carlo, Helsinki, INLA, particle filters, probabilistic numerics, University of Helsinki on August 2, 2017 by xi'an

While this European Meeting of Statisticians had a wide range of talks and topics, I found it more low key than the previous one I attended in Budapest, maybe because there was hardly any talk there in applied probability. (But there were some sessions in mathematical statistics, and Mark Girolami gave a great entry to differential geometry and MCMC, in the spirit of his 2010 discussion paper, using our recent trip to Montréal as an example of a geodesic!) In the Bayesian software session [organised by Aki Vehtari], Javier González gave a very neat introduction to Bayesian optimisation: he showed how optimisation can be turned into Bayesian inference or, more specifically, cast as a Bayesian decision problem, using a loss function related to the problem of interest. The point in following a Bayesian path [or probabilistic numerics] is to reduce uncertainty by the medium of prior measures on functions, although resorting [as usual] to Gaussian processes, whose arbitrariness I somehow dislike within the infinity of priors (aka stochastic processes) on functions! One of his strong arguments was that the approach includes the possibility of design in picking the next observation point (as done in some ABC papers of Michael Gutmann and co-authors, incl. the following talk at EMS 2017), but again the devil may be in the implementation when it comes to minimising an objective function… The notion of the myopia of optimisation techniques was another good point: only looking one step ahead into the future diminishes the returns of the optimisation, and an alternative presented at AISTATS 2016 [that I do not remember seeing in Cádiz] goes against this myopia.
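Since turning optimisation into a Bayesian decision problem may be easier to grasp on a toy example, here is a minimal sketch of a Gaussian process surrogate driven by the myopic, one-step-ahead expected-improvement acquisition discussed above; the objective f, the squared-exponential kernel, and all settings are made-up illustrations, not anything from González's talk:

```python
import numpy as np
from scipy.stats import norm

def f(x):  # hypothetical objective to minimise
    return np.sin(3 * x) + 0.5 * x ** 2

def gp_posterior(X, y, Xs, ell=0.5, jitter=1e-8):
    # GP regression with a unit-variance squared-exponential kernel
    k = lambda a, b: np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)
    K = k(X, X) + jitter * np.eye(len(X))
    Ks = k(X, Xs)
    v = np.linalg.solve(K, Ks)
    mu = Ks.T @ np.linalg.solve(K, y)
    sd = np.sqrt(np.maximum(1.0 - np.sum(Ks * v, axis=0), 1e-12))
    return mu, sd

rng = np.random.default_rng(0)
X = rng.uniform(-2.0, 2.0, size=3)   # small initial design
y = f(X)
grid = np.linspace(-2.0, 2.0, 401)
for _ in range(10):                  # myopic, one-step-ahead acquisition loop
    mu, sd = gp_posterior(X, y, grid)
    z = (y.min() - mu) / sd
    ei = (y.min() - mu) * norm.cdf(z) + sd * norm.pdf(z)  # expected improvement
    x_next = grid[np.argmax(ei)]     # design step: pick the next evaluation point
    X, y = np.append(X, x_next), np.append(y, f(x_next))
print("approximate minimiser:", X[np.argmin(y)])
```

The argmax of the expected improvement is the design step mentioned above: it trades off a low posterior mean (exploitation) against a high posterior uncertainty (exploration), but only one step ahead, hence the myopia.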
Umberto Picchini also gave a talk on exploiting synthetic likelihoods in a Bayesian fashion (in connection with the talk he gave last year at MCqMC 2016). I wondered at the use of INLA for this Gaussian representation, as well as at the impact of the parameterisation of the summary statistics. And the session organised by Jean-Michel involved Jimmy Olsson, Murray Pollock (Warwick), and myself, with great talks from both other speakers, on PaRIS and PaRISian algorithms by Jimmy, and on a wide range of exact simulation methods for continuous-time processes by Murray, both managing to convey the intuition behind their results while avoiding the massive mathematics at work there. By comparison, I must have been quite unclear during my own talk, since someone interrupted me about how Owen & Zhou (2000) justified their deterministic mixture importance sampling representation, and then left when I could not make sense of his questions [or because it was lunchtime already].
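For what it is worth, here is a minimal sketch of the deterministic mixture representation of Owen & Zhou (2000): each proposal contributes a fixed number of draws, and every draw is weighted by the full mixture density rather than by the density of the component that produced it. The target, proposals, and sample sizes below are made-up illustrations:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
target = norm(0, 1)                 # toy target: X ~ N(0,1)
h = lambda x: x ** 2                # estimate E[X^2] = 1
props = [norm(-2, 1), norm(2, 1)]   # two hypothetical proposal components
n_per = 5000                        # deterministic allocation per component
xs = np.concatenate([q.rvs(n_per, random_state=rng) for q in props])
# weight every draw by the equal-weight mixture density of all proposals
mix = np.mean([q.pdf(xs) for q in props], axis=0)
w = target.pdf(xs) / mix
print("deterministic-mixture estimate of E[X^2]:", np.mean(w * h(xs)))
```

Weighting by the mixture keeps the estimator unbiased while bounding the weights wherever at least one component covers the integrand, which is the gist of the deterministic mixture argument.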