Archive for probabilistic numerics

Savage Award session today at JSM

Posted in Kids, Statistics, University life, Travel on August 3, 2020 by xi'an

Pleased to broadcast the JSM session dedicated to the 2020 Savage Award, taking place today at 13:00 ET (17:00 GMT), with two of the Savage nominees being former OxWaSP students (and Warwick PhD students). For those who have not registered for JSM, the talks are also available on Bayeslab. (As it happens, I was also a member of the committee this year, but do not think this could be deemed a CoI!)

112 Mon, 8/3/2020, 1:00 PM – 2:50 PM Virtual
Savage Award Session — Invited Papers
International Society for Bayesian Analysis (ISBA)
Organizer(s): Maria De Iorio, University College London
Chair(s): Maria De Iorio, University College London
1:05 PM Bayesian Dynamic Modeling and Forecasting of Count Time Series
Lindsay Berry, Berry Consultants
1:30 PM Machine Learning Using Approximate Inference: Variational and Sequential Monte Carlo Methods
Christian Andersson Naesseth, Columbia University
1:55 PM Recent Advances in Bayesian Probabilistic Numerical Integration
Francois-Xavier Briol, University College London
2:20 PM Factor Regression for Dimensionality Reduction and Data Integration Techniques with Applications to Cancer Data
Alejandra Avalos Pacheco, Harvard Medical School
2:45 PM Floor Discussion

séminaire P de S

Posted in Books, pictures, Statistics, University life on February 18, 2020 by xi'an

As I was in Paris and free for the occasion (!), I attended the Paris Statistics seminar this afternoon, in the Latin Quarter. With a first talk by Kweku Abraham on Bayesian inverse problems, setting a prior on the quantity of interest, γ, rather than on its transform G(γ), which is observed with noise. I am always perturbed by the juggling of different distances, like L² versus Kullback-Leibler, in non-parametric frameworks. Reminding me of probabilistic numerics, at least in the framework, since the crux of the talk was 100% about convergence. And a second talk by Lénaïc Chizat on convex neural networks corresponding to an infinite number of neurons, with surprising properties, including an implicit bias. And a third talk by Anne Sabourin on PCA for extremes, which assumed very little about the model but more about the geometry of the distribution, like extremes being concentrated on a subspace. As I was rather tired from an intense week at Warwick, and after a weekend of reading grant applications and Biometrika submissions (!), my foggy brain kept switching to these proposals, trying to make connections with the talks, not completely inappropriately in two cases out of three. (I am afraid the same may happen tomorrow at our probability seminar on computer-based proofs!)

Bayesian probabilistic numerical methods

Posted in Books, pictures, Statistics, University life on December 5, 2019 by xi'an

“…in isolation, the error of a numerical method can often be studied and understood, but when composed into a pipeline the resulting error structure may be non-trivial and its analysis becomes more difficult. The real power of probabilistic numerics lies in its application to pipelines of numerical methods, where the probabilistic formulation permits analysis of variance (ANOVA) to understand the contribution of each discretisation to the overall numerical error.”

Jon Cockayne (Warwick), Chris Oates (formerly Warwick), T.J. Sullivan, and Mark Girolami (formerly Warwick) got their survey on Bayesian probabilistic numerical methods in the SIAM (Society for Industrial and Applied Mathematics) Review, which is quite a feat given the non-statistical flavour of the journal (although Art Owen is now one of the editors of the review). As already reported in some posts on the ‘Og, the concept relies on the construction of a prior measure over a set of potential solutions, and numerical methods are assessed against the associated posterior measure. Not only is this framework more compelling in a conceptual sense, but it also leads to novel probabilistic numerical methods managing to solve quite challenging numerical tasks. Congrats to the authors!
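To make the prior-over-solutions idea concrete, here is a minimal, entirely hypothetical sketch (numpy only, not taken from the survey) of Bayesian quadrature, the simplest probabilistic numerical method: a Gaussian process prior on the integrand turns a handful of function evaluations into a Gaussian posterior over the integral, whose mean serves as the quadrature rule. The kernel choice, node count, and jitter below are arbitrary illustration values.

```python
# Hypothetical Bayesian quadrature sketch: GP prior on the integrand f,
# posterior mean of the integral of f over [0, 1] given n evaluations.
import numpy as np
from math import erf, sqrt, pi

def std_normal_cdf(t):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(t / sqrt(2.0)))

def bayes_quad_mean(f, n=8, ell=0.25, jitter=1e-8):
    """Posterior mean of the integral of f on [0, 1] under an RBF-kernel GP prior."""
    x = np.linspace(0.0, 1.0, n)
    y = f(x)
    # Gram matrix of the RBF kernel k(x, x') = exp(-(x - x')^2 / (2 ell^2))
    K = np.exp(-(x[:, None] - x[None, :]) ** 2 / (2 * ell ** 2))
    # Kernel mean embedding z_i = integral of k(x, x_i) over [0, 1],
    # available in closed form for the RBF kernel
    z = sqrt(2 * pi) * ell * np.array(
        [std_normal_cdf((1 - xi) / ell) - std_normal_cdf(-xi / ell) for xi in x])
    # Quadrature weights: w = K^{-1} z (jitter added for conditioning)
    w = np.linalg.solve(K + jitter * np.eye(n), z)
    return w @ y

est = bayes_quad_mean(np.exp)  # target: e - 1
```

The posterior variance (not computed here) is what the quoted ANOVA-style pipeline analysis would propagate: each discretisation contributes its own Gaussian uncertainty to the downstream error budget.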

probabilistic methods in computational statistics [workshop]

Posted in pictures, Statistics, Travel, University life on November 5, 2019 by xi'an

A one-day workshop on "hot topics" in probabilistic methods for computational statistics is organised at Telecom SudParis, Évry, on 22 November by R. Douc, F. Portier and F. Roueff. The workshop is funded by the project "Big-Pomm", which strengthens the links between LTCI (Telecom ParisTech) and SAMOVAR (Telecom SudParis) around research projects involving partially observed Markov models. Participation in the workshop is free, but registration is required for access to the lunch buffet (40 participants max). (Évry is located 20km south of Paris, with trains on the RER C line.)

Bayesian conjugate gradients [open for discussion]

Posted in Books, pictures, Statistics, University life on June 25, 2019 by xi'an

When fishing for an illustration for this post on Google, I came upon this Bayesian Methods for Hackers cover, a book about which I have no clue whatsoever (!) but that mentions probabilistic programming. Which serves as a perfect (?!) introduction to the call for discussion in Bayesian Analysis of the incoming Bayesian conjugate gradient method by Jon Cockayne, Chris Oates (formerly Warwick), Ilse Ipsen and Mark Girolami (still partially Warwick!), since indeed the paper is about probabilistic numerics à la Mark and co-authors. Surprisingly, it deals with solving the deterministic equation Ax=b by Bayesian methods. The method produces a posterior distribution on the solution x⁰, given a fixed computing effort, which makes it pertain to anytime algorithms. It also relates to an earlier 2015 paper by Philipp Hennig where the posterior is on A⁻¹ rather than on x⁰ (which is quite a surprising, if valid, approach to the problem!). The computing effort is translated here into computing projections of Ax along (random) search directions, which can be made compatible with conjugate gradient steps. Interestingly, the choice of the prior on x is quite important, including in setting a low or high convergence rate… Deadline is August 04!