Since this is an event unlikely to occur that frequently, let me point out that Université Paris-Dauphine got a nominal mention in Nature two weeks ago, through an article covering the recent Abel Prize awarded to Yves Meyer and his work on wavelets across a collection of French institutions, including Paris-Dauphine, where he was a professor in the maths department (CEREMADE) from 1985 till 1996. (Except for including a somewhat distantly related picture of an oscilloscope and a mention of the Higgs boson, the Nature article is quite nice!)
Archive for wavelets
Just heard the great news that the Abel Prize for 2017 goes to Yves Meyer! Yves Meyer is an emeritus professor at the École Normale Supérieure de Cachan and has made fundamental contributions to number theory, operator theory, and harmonic analysis. He is one of the originators of the theory of wavelets and multiresolution analysis. Among other recognitions and prizes, he was an invited speaker at the International Congress of Mathematicians in 1970 (Nice), in 1983 (Warsaw), and in 1990 (Kyoto), and was awarded the Gauß Prize in 2010. Congratulations and total respect to Yves Meyer!!!
The Reading Classics Seminar today was about (the classic) Donoho and Johnstone's denoising through wavelets, a 1995 JASA paper entitled Adapting to unknown smoothness via wavelet shrinkage. Two themes (shrinkage and wavelets) I discovered during my PhD years. (Although I did not work on wavelets then, I simply attended seminars where wavelets were the new thing.) My student Anouar Seghir gave a well-prepared presentation, introducing wavelets and Stein's unbiased estimator of the risk. He clearly worked hard on the paper. However, I still feel the talk focussed too much on the maths and not enough on the motivations. For instance, I failed to understand why the variance in the white noise was known and where the sparsity indicator came from. (This is anyway a common flaw in those Reading Classics presentations.) The presentation was helped by an on-line demonstration in Matlab, using the resident wavelet code. Here are the slides:
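The shrinkage idea at the heart of the paper is easy to sketch. Below is a minimal illustration in Python (not the SureShrink procedure of the paper, which picks the threshold level by level by minimising Stein's unbiased risk estimate): one level of the Haar transform, a MAD estimate of the unknown noise standard deviation from the detail coefficients, and soft thresholding at the universal threshold. The toy step signal and all variable names are my own illustrative choices.

```python
import numpy as np

def haar_transform(x):
    """One level of the orthonormal Haar wavelet transform."""
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    return approx, detail

def haar_inverse(approx, detail):
    """Invert one level of the Haar transform."""
    x = np.empty(2 * len(approx))
    x[0::2] = (approx + detail) / np.sqrt(2)
    x[1::2] = (approx - detail) / np.sqrt(2)
    return x

def soft_threshold(w, t):
    """Donoho-Johnstone soft thresholding: shrink coefficients towards zero by t."""
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

rng = np.random.default_rng(0)
n = 1024
signal = np.where(np.arange(n) < n // 2, 1.0, -1.0)   # a toy step function
noisy = signal + 0.3 * rng.standard_normal(n)

approx, detail = haar_transform(noisy)
# The noise variance is *not* assumed known: estimate sigma by the median
# absolute deviation of the finest-scale details, then use the universal
# (VisuShrink) threshold sigma * sqrt(2 log n).
sigma = np.median(np.abs(detail)) / 0.6745
t = sigma * np.sqrt(2 * np.log(n))
denoised = haar_inverse(approx, soft_threshold(detail, t))

print(np.mean((noisy - signal) ** 2), np.mean((denoised - signal) ** 2))
```

Even this one-level sketch roughly halves the mean squared error on the step signal, since the clean signal has (almost) no energy in the detail coefficients while the noise spreads evenly across scales.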
“In the end, it really is just a matter of choosing the relevant parts of mathematics and ignoring the rest. Of course, the hard part is deciding what is irrelevant.”
Somehow, I had missed the first edition of this book and thus I started reading it this afternoon with a newcomer's eyes (obviously, I will not comment on the differences with the first edition, sketched by the author in the Preface). Past the initial surprise of discovering it was a mathematics book rather than an algorithmic book, I became engrossed in my reading and could not let it go! Numerical Analysis for Statisticians, by Kenneth Lange, is a wonderful book. It provides most of the necessary background in calculus and some algebra to conduct rigorous numerical analyses of statistical problems. This includes expansions, eigen-analysis, optimisation, integration, approximation theory, and simulation, in less than 600 pages. It may be due to the fact that I was reading the book in my garden, with the background noise of the wind in the tree leaves, but I cannot find any solid fact to grumble about! Not even about the MCMC chapters! I simply enjoyed Numerical Analysis for Statisticians from beginning till end.
“Many fine textbooks (…) are hardly substitutes for a theoretical treatment emphasizing mathematical motivations and derivations. However, students do need exposure to real computing and thoughtful numerical exercises. Mastery of theory is enhanced by the nitty gritty of coding.”
From the above, it may sound as if Numerical Analysis for Statisticians does not fulfill its purpose and is too much of a mathematical book. Be assured this is not the case: the contents are firmly grounded in calculus (analysis) but the (numerical) algorithms are only one code away. An illustration (among many) is found in Section 8.4: Finding a Single Eigenvalue, where Kenneth Lange shows how the Rayleigh quotient algorithm of the previous section can be exploited to this aim, when supplemented with a good initial guess based on Gerschgorin's circle theorem. This is brilliantly executed in two pages and the code is just one keyboard away. The EM algorithm is immersed into a larger M[&]M perspective. Problems are numerous and mostly of a high standard, meaning one (including me) has to sit and think about them. References are kept to a minimum: they are mostly (highly recommended) books, plus a few papers primarily exploited in the problem sections. (When reading the Preface, I found that “John Kimmel, [his] long suffering editor, exhibited extraordinary patience in encouraging [him] to get on with this project”. The quality of Numerical Analysis for Statisticians is also a testimony to John's editorial acumen!)
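To give a flavour of how close the maths sits to the code, here is a hedged Python sketch of the idea behind that section: Rayleigh quotient iteration started from a Gerschgorin disc centre. The particular heuristic for picking the disc (the centre of largest modulus) and the small test matrix are my own illustrative assumptions, not Lange's code.

```python
import numpy as np

def gerschgorin_guess(A):
    """Initial eigenvalue guess: the Gerschgorin disc centre (diagonal
    entry) of largest modulus -- an illustrative heuristic."""
    centers = np.diag(A)
    return centers[np.argmax(np.abs(centers))]

def rayleigh_quotient_iteration(A, mu, tol=1e-10, max_iter=50):
    """Rayleigh quotient iteration: shifted inverse-power steps, with the
    shift updated to the current Rayleigh quotient at each step."""
    n = A.shape[0]
    v = np.ones(n) / np.sqrt(n)
    for _ in range(max_iter):
        try:
            w = np.linalg.solve(A - mu * np.eye(n), v)
        except np.linalg.LinAlgError:
            break  # the shift hit an exact eigenvalue
        v = w / np.linalg.norm(w)
        mu_new = v @ A @ v      # Rayleigh quotient of the current iterate
        if abs(mu_new - mu) < tol:
            return mu_new, v
        mu = mu_new
    return mu, v

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
lam, v = rayleigh_quotient_iteration(A, gerschgorin_guess(A))
print(lam)  # a single eigenvalue of A, with eigenvector v
```

With a decent starting shift the iteration converges cubically for symmetric matrices, which is what makes the Gerschgorin initial guess worth the two pages.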
“Every advance in computer architecture and software tempts statisticians to tackle numerically harder problems. To do so intelligently requires a good working knowledge of numerical analysis. This book equips students to craft their own software and to understand the advantages and disadvantages of different numerical methods. Issues of numerical stability, accurate approximation, computational complexity, and mathematical modeling share the limelight in a broad yet rigorous overview of those parts of numerical analysis most relevant to statisticians.”
While I am reacting so enthusiastically to the book (imagine, there is even a full chapter on continued fractions!), it may be that my French maths background is biasing my evaluation and that graduate students over the world would find the book too hard. However, I do not think so: the style of Numerical Analysis for Statisticians is very fluid and the rigorous mathematics are mostly at the level of undergraduate calculus. The more advanced topics like wavelets, Fourier transforms and Hilbert spaces are very well introduced and do not require prerequisites in complex calculus or functional analysis. (Although I take no joy in this, even measure theory does not appear to be a prerequisite!) On the other hand, a good background in statistics is a prerequisite. This book will clearly involve a lot of work from the reader, but the respect shown by Kenneth Lange to those readers will motivate them enough to keep going until they have assimilated those essential notions. Numerical Analysis for Statisticians is also recommended for more senior researchers, and not only for building one or two courses on the basics of statistical computing. It contains most of the mathematical foundations that we need, even if we do not know we need them! Truly an essential book.