Archive for École Normale Supérieure

machine learning à l'Académie, au Collège, et dans Le Monde
Posted in Books, Statistics, University life with tags Académie des Sciences, École Normale Supérieure, Collège de France, data science, Guillaume Budé, neural network, Paris, Stéphane Mallat, wavelets on January 5, 2018 by xi'an

A back-cover story in Le Monde “Sciences & Médecine” on Stéphane Mallat, professor at École Normale Supérieure, recently elected to the (French) Académie des Sciences and to the Collège de France, to a newly created Chair in Data Sciences. Known for his work on wavelets, image compression, and neural networks, Stéphane Mallat will give his first lecture on Data Sciences at the Collège de France, in downtown Paris, on January 11. Entrance is free and open to everyone. (The Collège de France is a unique institution, created by Guillaume Budé and supported by François Ier in 1530 to teach topics not then taught at the Sorbonne, including mathematics, as indicated by its motto Docet Omnia! Professors are nominated by the current faculty and, prior to Stéphane Mallat, the professor closest to statistics was Edmond Malinvaud.)

BimPressioNs [BNP11]
Posted in Books, pictures, Statistics, Travel, University life, Wines with tags École Normale Supérieure, Bayesian nonparametrics, BNP11, empirical likelihood, French cheese, Hamiltonian, IHP, Noel Cressie, NPR, Paris on June 29, 2017 by xi'an

While my participation in BNP 11 has so far been more at the janitor level [although not gaining George Casella's reputation on NPR!] than at the scientific one, since we had opted for the least expensive, unstaffed option for coffee breaks in order to keep the registration fees at a minimum [although I would gladly have gone all the way and removed all coffee breaks, if only because such breaks produce so much garbage!], I had fairly good chats at the second poster session, in particular around empirical likelihoods and HMC for discrete parameters, the former based on the general Cressie-Read formulation and the latter on the recently arXived paper of Nishimura et al., which I wanted to read. Plus many other good chats, full stop, around terrific cheese platters!
This morning, the coffee breaks were much more under control and I managed to enjoy [and chair] the entire session on empirical likelihood, with absolutely fantastic talks from Nils Hjort and Art Owen (the third speaker having gone AWOL, possibly a direct consequence of Trump’s travel ban).
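For readers unfamiliar with the empirical likelihood discussions above, here is a textbook-style sketch of Owen's empirical likelihood ratio for a scalar mean, one member of the Cressie-Read family of divergences; it is not taken from any of the posters or talks, the data are simulated, and the tolerance in the root-finding bracket is an arbitrary choice.

```python
# Minimal sketch of Owen-style empirical likelihood for a scalar mean.
import numpy as np
from scipy.optimize import brentq

def log_el_ratio(x, mu):
    """Log empirical likelihood ratio for the mean mu of the sample x."""
    d = x - mu
    if d.min() >= 0 or d.max() <= 0:
        return -np.inf                       # mu outside the convex hull of x
    eps = 1e-8
    # the Lagrange multiplier lies strictly between these two bounds
    lo, hi = -1.0 / d.max() + eps, -1.0 / d.min() - eps
    lam = brentq(lambda l: np.sum(d / (1.0 + l * d)), lo, hi)
    return -np.sum(np.log1p(lam * d))        # equals the sum of log(n * w_i)

# -2 log R(mu) is asymptotically chi-square(1), hence usable for confidence sets
x = np.random.default_rng(0).normal(loc=0.3, size=50)
print(-2 * log_el_ratio(x, 0.0))
```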
exciting week[s]
Posted in Mountains, pictures, Running, Statistics with tags ABC, ABC validation, École Normale Supérieure, Bayesian nonparametrics, BNP11, Domaine Coste Moynier, Grés de Montpellier, mixtures of distributions, PCI Comput Stats, PCI Evol Biol, Peer Community, Pic Saint Loup, Saint Christol, Université de Montpellier, Wasserstein distance on June 27, 2017 by xi'an

The past week was quite exciting, despite the heat wave that hit Paris and kept me from sleeping and running! First, I made a two-day visit to Jean-Michel Marin in Montpellier, where we discussed the potential Peer Community In Computational Statistics (PCI Comput Stats) with the people behind PCI Evol Biol at INRA, hopefully taking shape in the coming months! One evening we also went through a few vineyards in Saint Christol with Jean-Michel and Arnaud, including a long chat with the owner of Domaine Coste Moynier. [Whose domain includes the above parcel with views of Pic Saint-Loup.] And last but not least, some work planning about approximate MCMC.
On top of this, we submitted our paper on ABC with Wasserstein distances [to be arXived in an extended version in the coming weeks], as well as our revised paper on ABC consistency, thanks to highly constructive comments from the editorial board that induced a much improved version in my opinion, and we received a very positive response from JCGS for our paper on weak priors for mixtures! Next week should be exciting as well, with BNP 11 taking place in downtown Paris, at École Normale!!!
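Since the ABC-with-Wasserstein paper is only mentioned here, not detailed, the following is merely a generic rejection-ABC sketch in which the usual summary-statistic discrepancy is replaced by a one-dimensional Wasserstein distance between the observed and simulated samples; the toy Normal location model, prior range, and tolerance are made-up choices for illustration, not those of the paper.

```python
# Rejection ABC with a 1-D Wasserstein distance between raw samples.
import numpy as np
from scipy.stats import wasserstein_distance

def wasserstein_abc(y_obs, n_draws=10_000, tol=0.1, rng=None):
    rng = np.random.default_rng(rng)
    accepted = []
    for _ in range(n_draws):
        theta = rng.uniform(-10, 10)                      # draw from a flat prior
        y_sim = rng.normal(theta, 1.0, size=len(y_obs))   # simulate pseudo-data
        if wasserstein_distance(y_obs, y_sim) < tol:      # compare full samples
            accepted.append(theta)
    return np.asarray(accepted)

y_obs = np.random.default_rng(1).normal(2.0, 1.0, size=200)
post = wasserstein_abc(y_obs)
print(post.mean(), post.std())
```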
Michael Jordan à Normale demain
Posted in Statistics, University life with tags École Normale Supérieure, bag of little bootstraps, big data, bootstrap, Michael Jordan, Paris on October 1, 2012 by xi'an

Tomorrow (Tuesday, Oct. 2), Michael Jordan (Berkeley) will give a seminar at École Normale Supérieure, in French:
Tuesday, October 2, 2012, at 11:15am. (The seminar will be preceded by coffee at 11am.)
Salle W, Toits du DMA, École Normale Supérieure, 45 rue d'Ulm, Paris 5e.

Divide-and-Conquer and Statistical Inference for “Big Data”
Michael I. Jordan
Fondation Sciences Mathématiques de Paris & University of California, Berkeley

In this talk we present some recent results on inference for “Big Data”. Divide-and-conquer is an essential computational tool for tackling large-scale data-processing problems, especially given the recent growth of distributed systems, but this paradigm raises difficulties when it comes to statistical inference. Consider, for instance, the fundamental problem of obtaining confidence intervals for estimators. The bootstrap principle suggests resampling the data repeatedly to obtain fluctuations, and hence confidence intervals, but this is infeasible at large scale. If one relies on subsamples instead, the resulting fluctuations are not on the correct scale. We present a new approach, the “bag of little bootstraps”, which circumvents this problem and can be applied to large-scale data. We will also discuss large-scale matrix completion, where the divide-and-conquer approach is a practical tool that nonetheless raises theoretical issues. The theoretical support is provided by concentration theorems for random matrices, and in this connection we present a new approach to the concentration of random matrices based on Stein's method. [Joint work with Ariel Kleiner, Lester Mackey, Purna Sarkar, Ameet Talwalkar, Richard Chen, Brendan Farrell, and Joel Tropp.]
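To make the “bag of little bootstraps” idea from the abstract concrete, here is a minimal numerical sketch for the sample mean of a one-dimensional dataset; the subsample exponent gamma, the numbers of subsamples and resamples, and the function name are illustrative choices of mine, not those of the talk or of the Kleiner et al. paper.

```python
# Bag of little bootstraps (BLB) sketch for the sample mean of a 1-D dataset.
import numpy as np

def blb_ci_halfwidth(x, gamma=0.6, n_subsets=20, n_resamples=100,
                     alpha=0.05, rng=None):
    """Average (1 - alpha) confidence-interval half-width for the mean via BLB."""
    rng = np.random.default_rng(rng)
    n = len(x)
    b = int(np.ceil(n ** gamma))                  # small subsample size b = n^gamma
    half_widths = []
    for _ in range(n_subsets):
        sub = rng.choice(x, size=b, replace=False)           # one little subsample
        means = []
        for _ in range(n_resamples):
            # a resample of *full* size n, stored as multinomial counts over b points
            counts = rng.multinomial(n, np.full(b, 1.0 / b))
            means.append(np.average(sub, weights=counts))
        q_lo, q_hi = np.quantile(means, [alpha / 2, 1 - alpha / 2])
        half_widths.append((q_hi - q_lo) / 2)     # quality assessment on this subsample
    return float(np.mean(half_widths))            # averaged across subsamples

data = np.random.default_rng(0).normal(size=100_000)
print(blb_ci_halfwidth(data))
```

The point, as in the abstract, is that each inner resample only ever touches b ≪ n distinct data points, so the full-size bootstrap computation never has to materialise n values, while the resulting fluctuations remain on the correct scale.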