## Archive for probabilistic numerics

## séminaire P de S

Posted in Statistics, University life, Books, pictures with tags Biometrika, computer-based proof, extreme value theory, Institut Henri Poincaré, neural network, Paris, probabilistic numerics, séminaire, seminar, stochastic gradient descent on February 18, 2020 by xi'an

**A**s I was in Paris and free for the occasion (!), I attended the Paris Statistics seminar this afternoon, in the Latin Quarter. The first talk, by Kweku Abraham, was on Bayesian inverse problems, setting a prior on the quantity of interest, γ, rather than on its transform G(γ), which is observed with noise. I am always perturbed by the juggling of different distances, like L² versus Kullback-Leibler, in non-parametric frameworks. It reminded me of probabilistic numerics, at least in its framework, since the crux of the talk was 100% about convergence. The second talk, by Lénaïc Chizat, was on convex neural networks corresponding to an infinite number of neurons, with surprising properties, including implicit bias. And the third talk, by Anne Sabourin, was on PCA for extremes, which assumed very little about the model but more about the geometry of the distribution, like extremes being concentrated on a subspace. As I was rather tired from an intense week at Warwick, and after a weekend of reading grant applications and Biometrika submissions (!), my foggy brain kept switching to these proposals, trying to make connections with the talks, not completely inappropriately in two cases out of three. (I am afraid the same may happen tomorrow at our probability seminar on computer-based proofs!)

## Bayesian probabilistic numerical methods

Posted in Books, pictures, Statistics, University life with tags ANOVA, Bayesian nonparametrics, probabilistic numerics, SIAM, Siam Review, Society for Industrial and Applied Mathematics, University of Warwick on December 5, 2019 by xi'an

“…in isolation, the error of a numerical method can often be studied and understood, but when composed into a pipeline the resulting error structure may be non-trivial and its analysis becomes more difficult. The real power of probabilistic numerics lies in its application to pipelines of numerical methods, where the probabilistic formulation permits analysis of variance (ANOVA) to understand the contribution of each discretisation to the overall numerical error.”

**J**on Cockayne (Warwick), Chris Oates (formerly Warwick), T.J. Sullivan, and Mark Girolami (formerly Warwick) got their survey on Bayesian probabilistic numerical methods published in the SIAM (Society for Industrial and Applied Mathematics) Review, which is quite a feat given the non-statistical flavour of the journal (although Art Owen is now one of the editors of the review). As already reported in some posts on the ‘Og, the concept relies on the construction of a prior measure over a set of potential solutions, with numerical methods assessed against the associated posterior measure. Not only is this framework more compelling in a conceptual sense, but it also leads to novel probabilistic numerical methods managing to solve quite challenging numerical tasks. Congrats to the authors!

## probabilistic methods in computational statistics [workshop]

Posted in pictures, Statistics, Travel, University life with tags Évry, computational statistics, France, INT, partly observed Markov models, probabilistic numerics, Randal Douc, RER C, Télécom SudParis, workshop on November 5, 2019 by xi'an

**A** one-day workshop is organised at Télécom SudParis, Évry, on 22 November by R. Douc, F. Portier and F. Roueff, on the “hot topics” connected with probabilistic methods in computational statistics. The workshop is funded by the project “Big-Pomm”, which strengthens the links between LTCI (Télécom ParisTech) and SAMOVAR (Télécom SudParis) around research projects involving partially observed Markov models. Participation in the workshop is free, but registration is required to have access to the lunch buffet (40 participants max). (Évry is located 20km south of Paris, with trains on the RER C line.)

## Bayesian conjugate gradients [open for discussion]

Posted in Books, pictures, Statistics, University life with tags Bayesian Analysis, Bayesian methods for hackers, discussion paper, probabilistic numerics, probabilistic programming, University of Warwick on June 25, 2019 by xi'an

**W**hen fishing for an illustration for this post on Google, I came upon this Bayesian Methods for Hackers cover, a book about which I have no clue whatsoever (!) but that mentions probabilistic programming. Which serves as a perfect (?!) introduction to the call for discussion in Bayesian Analysis of the incoming Bayesian conjugate gradient method by Jon Cockayne, Chris Oates (formerly Warwick), Ilse Ipsen and Mark Girolami (still partially Warwick!), since indeed the paper is about probabilistic numerics à la Mark and co-authors. It deals, surprisingly, with solving the deterministic equation Ax=b by Bayesian methods: the method produces a posterior distribution on the solution x⁰, given a fixed computing effort, which makes it pertain to anytime algorithms. It also relates to an earlier 2015 paper by Philipp Hennig where the posterior is on A⁻¹ rather than on x⁰ (which is quite a surprising if valid approach to the problem!). The computing effort translates here into computing projections of Ax along search directions, which can be made compatible with conjugate gradient steps. Interestingly, the choice of the prior on x is quite important, including for setting a low or high convergence rate… **Deadline is August 04!**
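To fix ideas, here is a minimal sketch of this probabilistic-numerics view of a linear solver: put a Gaussian prior on the solution x of Ax=b and condition on a few projections Sᵀb = SᵀAx, one per "iteration". This is only an illustration of the conditioning step, not the paper's exact BayesCG algorithm (which chooses the search directions sequentially); the function name and interface are mine.

```python
import numpy as np

def bayes_linear_solve(A, b, x0, Sigma0, S):
    """Gaussian posterior on the solution of Ax = b after observing the
    projections S^T b = S^T A x (one column of S per solver step).
    Sketch of the probabilistic-numerics idea, not the paper's BayesCG."""
    K = Sigma0 @ A.T @ S                  # prior cross-covariance with the data
    Lam = S.T @ A @ K                     # covariance of the observed projections
    resid = S.T @ (b - A @ x0)            # projected residual at the prior mean
    x_m = x0 + K @ np.linalg.solve(Lam, resid)        # posterior mean
    Sigma_m = Sigma0 - K @ np.linalg.solve(Lam, K.T)  # posterior covariance
    return x_m, Sigma_m
```

With as many directions as unknowns the posterior mean is the exact solution and the covariance collapses to zero; with fewer directions, i.e. a truncated computing budget, the remaining covariance quantifies the numerical error, which is the anytime aspect mentioned above.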

## European statistics in Finland [EMS17]

Posted in Books, pictures, Running, Statistics, Travel, University life with tags ABC, AISTATS 2016, Amazon, AMIS, Bayesian optimisation, deterministic mixtures, EMS 2017, Europe, European Meeting of Statisticians, exact Monte Carlo, Helsinki, INLA, particle filters, probabilistic numerics, University of Helsinki on August 2, 2017 by xi'an

**W**hile this European meeting of statisticians had a wide range of talks and topics, I found it more low key than the previous one I attended in Budapest, maybe because there was hardly any talk in applied probability. (But there were some sessions in mathematical statistics, and Mark Girolami gave a great entry to differential geometry and MCMC, in the spirit of his 2010 discussion paper, using our recent trip to Montréal as an example of a geodesic!) In the Bayesian software session [organised by Aki Vehtari], Javier González gave a very neat introduction to Bayesian optimisation: he showed how optimisation can be turned into Bayesian inference, or more specifically into a Bayesian decision problem, using a loss function related to the problem of interest. The point in following a Bayesian path [or probabilistic numerics] is to reduce uncertainty by the medium of prior measures on functions, although resorting [as usual] to Gaussian processes, whose arbitrariness I somehow dislike within the infinity of priors (aka stochastic processes) on functions! One of his strong arguments was that the approach includes the possibility for design in picking the next observation point (as done in some ABC papers of Michael Gutmann and co-authors, including the following talk at EMS 2017), but again the devil may be in the implementation when looking at minimising an objective function… The notion of the myopia of optimisation techniques was another good point: only looking one step ahead into the future diminishes the returns of the optimisation, and an alternative presented at AISTATS 2016 [that I do not remember seeing in Cádiz] goes against this myopia.
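The loop described above can be sketched in a few lines: a Gaussian process prior on the objective, and the next evaluation point chosen by maximising a (myopic, one-step-ahead) acquisition function over a grid. This is a toy illustration of the general idea, not González's talk material; the kernel, expected-improvement criterion, and all settings are my own choices.

```python
import numpy as np
from math import erf, sqrt, pi

def rbf(X1, X2, ls=0.15):
    """Squared-exponential kernel on scalar inputs."""
    return np.exp(-0.5 * ((X1[:, None] - X2[None, :]) / ls) ** 2)

def gp_posterior(X_obs, y_obs, X_new, jitter=1e-6):
    """GP posterior mean and standard deviation at X_new (zero prior mean)."""
    K = rbf(X_obs, X_obs) + jitter * np.eye(len(X_obs))
    Ks = rbf(X_obs, X_new)
    mu = Ks.T @ np.linalg.solve(K, y_obs)
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks), axis=0)
    return mu, np.sqrt(np.clip(var, 1e-12, None))

def expected_improvement(mu, sd, best):
    """One-step-ahead (myopic) EI acquisition for minimisation."""
    z = (best - mu) / sd
    Phi = 0.5 * (1.0 + np.array([erf(t / sqrt(2.0)) for t in z]))
    phi = np.exp(-0.5 * z ** 2) / sqrt(2.0 * pi)
    return (best - mu) * Phi + sd * phi

def bayes_opt(f, n_iter=12, grid=np.linspace(0.0, 1.0, 201)):
    """Minimise f on [0,1] by sequentially evaluating the EI maximiser."""
    X = np.array([0.0, 0.5, 1.0])            # small initial design
    y = np.array([f(x) for x in X])
    for _ in range(n_iter):
        mu, sd = gp_posterior(X, y, grid)
        x_next = grid[np.argmax(expected_improvement(mu, sd, y.min()))]
        X = np.append(X, x_next)
        y = np.append(y, f(x_next))
    return X[np.argmin(y)], y.min()
```

The design aspect is in the `argmax` step: the next observation is picked where the posterior promises the largest improvement, trading off the mean against the remaining uncertainty, which is exactly the one-step myopia criticised in the talk.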

Umberto Picchini also gave a talk on exploiting synthetic likelihoods in a Bayesian fashion (in connection with the talk he gave last year at MCqMC 2016). I wondered at the use of INLA for this Gaussian representation, as well as at the impact of the parameterisation of the summary statistics. The session organised by Jean-Michel involved Jimmy Olsson, Murray Pollock (Warwick) and myself, with great talks from both other speakers, on PaRIS and PaRISian algorithms by Jimmy, and on a wide range of exact simulation methods for continuous-time processes by Murray, both managing to convey the intuition behind their results while avoiding the massive mathematics at work there. By comparison, I must have been quite unclear during my own talk, since someone interrupted me about how Owen & Zhou (2000) justified their deterministic mixture importance sampling representation, and then left when I could not make sense of his questions [or because it was lunchtime already].
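For readers puzzled by the same question, the deterministic mixture representation of Owen & Zhou (2000) simply takes a fixed number of draws from each proposal and weights every draw by the target density over the equal-weight mixture of all proposal densities (the balance heuristic), which bounds the weights whenever one proposal covers a region well. A minimal sketch, with a toy target and proposals of my own choosing:

```python
import numpy as np

def normal_pdf(x, mu, sigma=1.0):
    """Density of N(mu, sigma^2)."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

def deterministic_mixture_is(h, target_pdf, proposals, n_per, rng):
    """Estimate E_target[h(X)] with n_per draws from each proposal,
    weighting every draw by target/(equal-weight mixture of proposal pdfs)."""
    xs = np.concatenate([sample(n_per, rng) for sample, _ in proposals])
    mix_pdf = np.mean([pdf(xs) for _, pdf in proposals], axis=0)
    return np.mean(target_pdf(xs) / mix_pdf * h(xs))

# toy check: E[X^2] = 1 under the N(0,1) target, via two off-centre proposals
rng = np.random.default_rng(0)
proposals = [
    (lambda n, g: g.normal(-1.0, 1.0, n), lambda x: normal_pdf(x, -1.0)),
    (lambda n, g: g.normal(2.0, 1.0, n), lambda x: normal_pdf(x, 2.0)),
]
est = deterministic_mixture_is(lambda x: x ** 2, lambda x: normal_pdf(x, 0.0),
                               proposals, 20_000, rng)
```

The point of dividing by the mixture rather than by each draw's own proposal is variance control: a draw landing where its own proposal is thin but another proposal is dense no longer receives a huge weight.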

## plenary talks at JSM 2017 in Baltimore

Posted in Statistics with tags Abraham Wald, Baltimore, Bernstein-von Mises theorem, Emmanuel Candès, IMS, IMS Medallion, JSM 2017, Judith Rousseau, Mark Girolami, Maryland, probabilistic numerics on May 25, 2017 by xi'an

## MCM 2017

Posted in pictures, Statistics, Travel, University life with tags Approximate Bayesian computation, Canada, MCMC, Monte Carlo integration, Monte Carlo Statistical Methods, Montréal, probabilistic numerics, Québec, Robert Charlebois, scalability, stochastic gradient on February 10, 2017 by xi'an

**J**e reviendrai à Montréal, as the song by Robert Charlebois goes, for the MCM 2017 meeting there, on July 3-7. I was invited to give a plenary talk by the organisers of the conference, along with

Steffen Dereich, WWU Münster, Germany

Paul Dupuis, Brown University, Providence, USA

Mark Girolami, Imperial College London, UK

Emmanuel Gobet, École Polytechnique, Palaiseau, France

Aicke Hinrichs, Johannes Kepler University, Linz, Austria

Alexander Keller, NVIDIA Research, Germany

Gunther Leobacher, Johannes Kepler University, Linz, Austria

Art B. Owen, Stanford University, USA

Note that, while special sessions are already selected, including one on Stochastic Gradient methods for Monte Carlo and Variational Inference, organised by Victor Elvira and Ingmar Schuster (my only contribution to this session being the suggestion that they organise it!), proposals for contributed talks will be selected based on one-page abstracts, to be submitted by March 1.