Martin Hairer gets Breakthrough Prize (and \$3M)

Posted in Books, University life on September 14, 2020 by xi'an

Just heard the news that Fields Medallist Martin Hairer (formerly U of Warwick) got the 2021 Breakthrough Prize in Mathematics for his unification theory of stochastic partial differential equations, which he likens to a form of Taylor expansion in the massive Inventiones paper describing this breakthrough. (Looking at the previous winners of the prize, who also make up its selection committee, this represents a break from focussing primarily on algebraic geometry! If not from sticking to male recipients…)

We introduce a new notion of “regularity structure” that provides an algebraic framework allowing to describe functions and/or distributions via a kind of “jet” or local Taylor expansion around each point. The main novel idea is to replace the classical polynomial model which is suitable for describing smooth functions by arbitrary models that are purpose-built for the problem at hand. In particular, this allows to describe the local behaviour not only of functions but also of large classes of distributions. We then build a calculus allowing to perform the various operations (multiplication, composition with smooth functions, integration against singular kernels) necessary to formulate fixed point equations for a very large class of semi-linear PDEs driven by some very singular (typically random) input. This allows, for the first time, to give a mathematically rigorous meaning to many interesting stochastic PDEs arising in physics. The theory comes with convergence results that allow to interpret the solutions obtained in this way as limits of classical solutions to regularised problems, possibly modified by the addition of diverging counterterms. These counterterms arise naturally through the action of a “renormalisation group” which is defined canonically in terms of the regularity structure associated to the given class of PDEs. Our theory also allows to easily recover many existing results on singular stochastic PDEs (KPZ equation, stochastic quantisation equations, Burgers-type equations) and to understand them as particular instances of a unified framework. One surprising insight is that in all of these instances local solutions are actually “smooth” in the sense that they can be approximated locally to arbitrarily high degree as linear combinations of a fixed family of random functions/distributions that play the role of “polynomials” in the theory. 
As an example of a novel application, we solve the long-standing problem of building a natural Markov process that is symmetric with respect to the (finite volume) measure describing the $\Phi^4_3$ Euclidean quantum field theory. It is natural to conjecture that the Markov process built in this way describes the Glauber dynamic of 3-dimensional ferromagnets near their critical temperature.

mixtures, Hermite polynomials, and ideals

Posted in Books, Kids, Statistics, University life on September 24, 2015 by xi'an

A three-page note arXived today is [University of Colorado?!] Andrew Clark's “Expanding the Computation of Mixture Models by the use of Hermite Polynomials and Ideals”. With a typo on Hermite‘s name in the pdf title. The whole point of the note is to demonstrate that mixtures of different types of distributions (like t and Gaussian) are manageable. A truly stupendous result… As if no one had ever mixed different distributions before.

“Using Hermite polynomials and computing ideals allows the investigator to mix distributions from distinct families.”

The second point of the paper is to derive the mixture weights from an algebraic equation based on the Hermite polynomials of the components, which implies that the components and the mixture distribution itself are already known. Which thus does not seem particularly relevant for mixture estimation…
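To underline how routine mixing distributions from distinct families actually is, here is a minimal sketch (the weight 0.3 and the t(3)/standard-Gaussian components are arbitrary choices for illustration, not taken from the note) that writes down such a mixture density directly and checks that it integrates to one:

```python
import math

def norm_pdf(x, mu=0.0, sigma=1.0):
    # standard Gaussian density
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def t_pdf(x, nu=3):
    # Student-t density with nu degrees of freedom
    c = math.gamma((nu + 1) / 2) / (math.sqrt(nu * math.pi) * math.gamma(nu / 2))
    return c * (1 + x * x / nu) ** (-(nu + 1) / 2)

def mix_pdf(x, w=0.3):
    # two-component mixture of a Student-t(3) and a standard Gaussian
    return w * t_pdf(x) + (1 - w) * norm_pdf(x)

# sanity check: the mixture density integrates to one (trapezoidal rule on [-50, 50])
xs = [-50 + 0.01 * i for i in range(10001)]
total = sum(0.01 * 0.5 * (mix_pdf(a) + mix_pdf(b)) for a, b in zip(xs, xs[1:]))
print(round(total, 3))  # → 1.0
```

No Hermite polynomials or ideals required: a convex combination of densities is a density, whatever the component families.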

reaching transcendence for Gaussian mixtures

Posted in Books, R, Statistics on September 3, 2015 by xi'an

“…likelihood inference is in a fundamental way more complicated than the classical method of moments.”

Carlos Amendola, Mathias Drton, and Bernd Sturmfels arXived a paper this Friday on “maximum likelihood estimates for Gaussian mixtures are transcendental”. By which they mean that trying to solve the five likelihood equations for a two-component Gaussian mixture does not lead to an algebraic function of the data. (When excluding the trivial global maxima spiking at any observation.) This is not highly surprising when considering two observations, 0 and x, from a mixture of N(0,1/2) and N(μ,1/2) because the likelihood equation

$(x-\mu)\exp\{\mu^2\}+x-2\mu-\mu\exp\{-\mu(2x-\mu)\}=0$

involves both exponential and algebraic terms. While this is not directly impacting (statistical) inference, this result has the computational consequence that the number of critical points ‘and also the maximum number of local maxima, depends on the sample size and increases beyond any bound’, which means that EM faces increasing difficulties in finding a global finite maximum as the sample size increases…
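The two-observation toy model above can be explored numerically: since no algebraic formula for the stationary points exists, one falls back on root-finding. A minimal sketch (assuming equal weights ½ on the two components, and taking x = 2 purely for illustration) that locates a non-trivial critical point of the log-likelihood by bisection on its numerical derivative:

```python
import math

def loglik(mu, x=2.0):
    # log-likelihood (up to an additive constant) of observations {0, x}
    # under the equal-weight mixture 0.5*N(0,1/2) + 0.5*N(mu,1/2)
    f0 = 1.0 + math.exp(-mu ** 2)                     # density at 0, up to a constant
    fx = math.exp(-x ** 2) + math.exp(-(x - mu) ** 2)  # density at x, up to a constant
    return math.log(f0) + math.log(fx)

def score(mu, x=2.0, h=1e-6):
    # numerical derivative of the log-likelihood in mu (central difference)
    return (loglik(mu + h, x) - loglik(mu - h, x)) / (2 * h)

# the score changes sign on [1.5, 2.0], so a critical point lies in between
lo, hi = 1.5, 2.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if score(lo) * score(mid) <= 0:
        hi = mid
    else:
        lo = mid
mu_hat = 0.5 * (lo + hi)
print(round(mu_hat, 4))  # a local maximiser strictly inside (1.5, 2)
```

With more observations the number of such sign changes (hence of critical points) grows without bound, which is exactly why EM gets into trouble on larger samples.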