Archive for stochastic processes

from least squares to signal processing and particle filtering

Posted in Books, Kids, Statistics, University life on June 6, 2017 by xi'an

Nozer Singpurwalla, Nick Polson, and Refik Soyer have just arXived a remarkable survey on the history of signal processing, from Gauß, Yule, Kolmogorov and Wiener, to Ragazzini, Shannon, Kálmán [who, I was surprised to learn, died in Gainesville last year!], Gibbs sampling, and the particle filters of the 1990’s.
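For readers who have never seen one of those 1990’s filters in action, here is a minimal sketch of a bootstrap particle filter on a toy linear-Gaussian state-space model (a setting where the Kalman filter would be exact, so the particle approximation can be checked); the model, parameters, and code are my own illustration, not taken from the survey.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear-Gaussian state-space model (hypothetical parameters):
#   x_t = 0.9 x_{t-1} + N(0, 1),   y_t = x_t + N(0, 1)
T, N = 100, 1000
phi, sig_x, sig_y = 0.9, 1.0, 1.0

# Simulate a latent trajectory and its noisy observations.
x = np.zeros(T)
for t in range(1, T):
    x[t] = phi * x[t - 1] + sig_x * rng.normal()
y = x + sig_y * rng.normal(size=T)

# Bootstrap particle filter: propagate through the prior, weight by the likelihood, resample.
particles = rng.normal(size=N)            # particles for x_0
filter_means = np.zeros(T)
for t in range(T):
    if t > 0:
        particles = phi * particles + sig_x * rng.normal(size=N)
    logw = -0.5 * ((y[t] - particles) / sig_y) ** 2   # Gaussian observation log-weights
    w = np.exp(logw - logw.max())
    w /= w.sum()
    filter_means[t] = np.sum(w * particles)           # estimate of E[x_t | y_{1:t}]
    particles = rng.choice(particles, size=N, p=w)    # multinomial resampling

print(filter_means[-5:])                  # filtered means at the last five time steps
```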

Sunday morning reading

Posted in Books, Kids, University life on June 30, 2016 by xi'an

A very interesting issue of Nature that I read this morning while having breakfast. A post-Brexit read of a pre-Brexit issue. Apart from the several articles arguing against Brexit and its dire consequences for British science [though preaching to the converted: what percentage of Brexit voters reads Nature?!], a short vignette on the differences between fields in the average time spent refereeing a paper (maths takes twice as long as the social sciences, and academics older than 65 take half the time of researchers under 36!). A letter calling for action against predatory publishers. And the first maths paper published since I started reading Nature on an almost-regular basis: it studies mean first-passage times for non-Markov random walks. Which are specified by time-homogeneous increments. It is sort of a weird maths paper in that I do not see where the mathematical novelty stands or why the paper only contains half a dozen formulas… Maybe not a maths paper after all.
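For readers curious about the object in question, here is a small Monte Carlo sketch of a mean first-passage time in the simplest (Markov) case, a random walk with i.i.d., hence time-homogeneous, Gaussian increments hitting a fixed threshold; the drift, threshold, and code are my own arbitrary choices, not taken from the Nature paper, which deals with the non-Markov case.

```python
import numpy as np

rng = np.random.default_rng(1)

def first_passage_time(threshold=5.0, drift=0.1, max_steps=10_000):
    """Steps until a Gaussian random walk with drift first exceeds `threshold`."""
    s = 0.0
    for t in range(1, max_steps + 1):
        s += drift + rng.normal()      # time-homogeneous increment
        if s >= threshold:
            return t
    return np.nan                      # did not hit the threshold within max_steps

# Monte Carlo estimate of the mean first-passage time over 2,000 replications
times = np.array([first_passage_time() for _ in range(2_000)])
print("mean first-passage time:", np.nanmean(times))
```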

métro static

Posted in Kids, Statistics, Travel on July 19, 2015 by xi'an

[In the train shuttle at Birmingham airport, two young guys, maybe back from SPA 2015, discussing signal processing:]

– In Bayesian statistics, they use a different approach to testing hypotheses… You see, they put priors on the different hypotheses…

– But in the end it all boils down to concentration inequalities…

Statistics month in Marseilles (CIRM)

Posted in Books, Kids, Mountains, pictures, Running, Statistics, Travel, University life, Wines on June 24, 2015 by xi'an

[Calanque de Morgiou, Marseille, July 7, 2010]

Next February, the fabulous Centre International de Recherche en Mathématiques (CIRM) in Marseilles, France, will hold a Statistics month, with a programme spread over five weeks.

Each week will see minicourses of a few hours (2-3) and advanced talks, leaving time for interactions and collaborations. (I will give one of those minicourses on Bayesian foundations.) The scientific organisers of the B’ week are Gilles Celeux and Nicolas Chopin.

The CIRM is a wonderful meeting place, in the mountains between Marseilles and Cassis, with many trails to walk and run, and hundreds of fantastic climbing routes in the Calanques at all levels. (In February, the sea is too cold to contemplate swimming. The good side is that it is not too warm to climb and the risk of bush fire is very low!) We stayed there with Jean-Michel Marin a few years ago when preparing Bayesian Essentials. The maths and stats library is well-stocked, with permanent access for quiet working sessions. This is the French counterpart of the equally fantastic German Mathematisches Forschungsinstitut Oberwolfach. There will be financial support available from the supporting societies and research bodies, at least for young participants, and the costs, if any, are low for excellent food and excellent lodging. Definitely not a scam conference!

probabilistic numerics

Posted in pictures, Running, Statistics, Travel, University life on April 27, 2015 by xi'an

I attended a highly unusual workshop while in Warwick last week. Unusual for me, obviously. It was about probabilistic numerics, i.e., the use of probabilistic or stochastic arguments in the numerical resolution of (possibly) deterministic problems. The notion in this approach is fairly Bayesian in that it makes use of prior information or belief about the quantity of interest, e.g., a function, to construct a (usually Gaussian) process prior and derive both an estimator that is identical to a numerical method (e.g., Runge-Kutta or trapezoidal integration) and a measure of uncertainty or variability around this estimator. While I did not grasp much more than the classy introductory talk by Philipp Hennig, this concept sounds fairly interesting, if only because of the Bayesian connection, and I wonder if we will soon see a probabilistic numerics section at ISBA! More seriously, placing priors on functions or functionals is a highly formal perspective (as in Bayesian non-parametrics) and it makes me wonder how much of the data (evaluations of a function at a given set of points) and how much of the prior is reflected in the output [variability]. (Obviously, one could ask a similar question of statistical analyses in general!) For instance, issues of singularity arise among those stochastic process priors.
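To make the "a Gaussian process prior recovers a classical rule" point concrete, here is a minimal Bayesian quadrature sketch under a Brownian-motion prior, a textbook toy case in which the posterior mean of the integral reproduces the trapezoidal rule; the integrand and design points are my own arbitrary choices, not anything presented at the workshop.

```python
import numpy as np

# Bayesian quadrature sketch: place a Brownian-motion prior f ~ GP(0, k), k(s,t) = min(s,t),
# observe f at a few design points in (0,1], and compute the posterior law of I = ∫₀¹ f(t) dt.
# In this toy case the posterior mean of I coincides with the trapezoidal rule
# (with the extra node f(0) = 0 implied by the prior).

f = np.sin                                   # test integrand (arbitrary choice)
t = np.array([0.2, 0.4, 0.6, 0.8, 1.0])      # design points where f is evaluated
y = f(t)

K = np.minimum.outer(t, t)                   # prior covariance of the observations
z = t - t**2 / 2                             # z_i = ∫₀¹ min(s, t_i) ds = Cov(I, f(t_i))
w = np.linalg.solve(K, z)                    # quadrature weights K^{-1} z

post_mean = w @ y                            # posterior mean of the integral
post_var = 1/3 - z @ w                       # prior var of I is ∫₀¹∫₀¹ min(s,s') ds ds' = 1/3

# Classical composite trapezoidal rule on the same nodes, with the node at 0 added
nodes, vals = np.r_[0.0, t], np.r_[0.0, y]
trap = np.sum(np.diff(nodes) * (vals[:-1] + vals[1:]) / 2)

print(post_mean, trap, np.sqrt(post_var))    # post_mean equals trap up to round-off
print(1 - np.cos(1.0))                       # exact value of ∫₀¹ sin(t) dt
```

The posterior standard deviation printed at the end is precisely the "uncertainty or variability around this estimator" mentioned above, and its dependence on the prior rather than on the data is one way to phrase the question raised in the previous paragraph.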

Another question that stemmed from this talk is whether or not more efficient numerical methods can be derived that way, in addition to recovering the most classical ones. Somewhat, somehow, given the idealised nature of the prior, it feels like priors could be more easily compared or ranked than in classical statistical problems, since the aim is to figure out the value of an integral or the solution to an ODE. (Or maybe not, since again almost the same could be said about estimating a normal mean.)