This is the cover page of Marco Banterle's thesis; Marco will defend it on Thursday [July 21, 13:00], at a rather quiet time for French universities, which is one reason for advertising it here. The thesis is built around several of Marco's papers, on delayed acceptance, dimension expansion, and Gaussian copulas for graphical models. The defence is open to everyone, so feel free to join if you are near Paris-Dauphine!
Archive for Paris
Last week, at the same time as the workshop on retrospective Monte Carlo in Warwick, there was a Monte Carlo conference in Paris, closing a Monte Carlo cycle run by Institut Louis Bachelier from October 2015 till June 2016. It took place in the convent of Les Cordeliers, downtown Paris [hence the title] and I alas could not attend the talks. As I organised a session on Bayesian (approximate) computations, with Richard Everitt, Jere Koskela, and Chris Sherlock as speakers (and Robin Ryder as chair), here are the slides of the speakers (actually, Jere most kindly agreed to give Chris' talk as Chris was too sick to travel to Paris):
This Monday, I made a most pleasant trip to the Observatoire de Paris, whose campus is located in Meudon and no longer in Paris. (There also is an Observatoire de Paris campus in downtown Paris, created in 1667, where no observation can take place.) Most pleasant for many reasons. First, I was to meet with Frédéric Arenou and two visiting astrostatisticians from Kolkata, India, whom I met in Bangalore two years ago. Working on a neat if not simple issue of inverted mean estimation. Second, because the place is beautiful, with great views of Paris (since the Observatoire is on a ridge), and with a classical-looking building actually made of recycled castle parts after the Franco-Prussian war of 1870, and because Frédéric gave us a grand tour of the place. And third, because I went there by bike through the Forêt de Meudon, which I did not suspect was that close to home and which I crossed on downhill muddy trails that made me feel far away from Paris! And it gave me the opportunity to test the mettle of a new mountain bike elsewhere than against Parisian SUVs. (This was the first day of a relatively intense biking week, which really helped with the half-marathon training: San Francisco ½ is in less than a month!!! And I am in wave 2!)
As my daughter is working at a McDonald's close to Paris-Dauphine [as a summer job], I paid a neighbourly visit two days ago and had a salad there! While there was nothing exciting about the salad, it was my first meal at a McDonald's in at least twenty-five years (although I may have had an occasional tea there in the meanwhile) and there was nothing wrong with it either. Judging solely from my daughter's (limited) experience, I am actually impressed by the degree of Taylorism in the preparation and handling of food and the management of staff. Not that I am contemplating going back to this chain in the next twenty years, for the food served there remains junk food, but the industrial size of the company means that health and safety regulations and labour laws are more likely to be respected there than in a small local restaurant. Again judging solely from my daughter's experience.
In an apt contrast, we went to celebrate her admission to medical school last weekend and picked a bento restaurant in Le Marais that had good press. And was open on a Sunday evening. The place is called Nanashis and looks like an immense railway dining hall. Somewhat noisy but ultimately not unpleasant. And very good if pricey cold soba noodles. (Just avoid the wine. And possibly the desserts, since our homemade matcha cake can compete with theirs!)
Next week, on June 7, at 4pm, Michael will give a seminar at INRIA, rue du Charolais, Paris 12 (map). Here is the abstract:
A Variational Perspective on Accelerated Methods in Optimization
Accelerated gradient methods play a central role in optimization, achieving optimal rates in many settings. While many generalizations and extensions of Nesterov’s original acceleration method have been proposed, it is not yet clear what is the natural scope of the acceleration concept. In this paper, we study accelerated methods from a continuous-time perspective. We show that there is a Lagrangian functional that we call the Bregman Lagrangian which generates a large class of accelerated methods in continuous time, including (but not limited to) accelerated gradient descent, its non-Euclidean extension, and accelerated higher-order gradient methods. We show that the continuous-time limit of all of these methods corresponds to travelling the same curve in space-time at different speeds, and in this sense the continuous-time setting is the natural one for understanding acceleration. Moreover, from this perspective, Nesterov’s technique and many of its generalizations can be viewed as a systematic way to go from the continuous-time curves generated by the Bregman Lagrangian to a family of discrete-time accelerated algorithms. [Joint work with Andre Wibisono and Ashia Wilson.]
(Interested readers need to register to attend the lecture.)
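For readers unfamiliar with acceleration, here is a minimal discrete-time illustration (not the paper's continuous-time construction): plain gradient descent versus Nesterov's accelerated method, with the constant-momentum schedule for strongly convex problems, on a toy quadratic. The function, step size, and all names here are illustrative choices, not taken from the talk.

```python
import numpy as np

# Toy strongly convex quadratic: f(x) = 0.5 x^T A x, grad f(x) = A x.
# Condition number kappa = L/mu = 10 (illustrative choice).
A = np.diag([1.0, 10.0])

def f(x):
    return 0.5 * x @ A @ x

def grad(x):
    return A @ x

# Plain gradient descent with step size 1/L.
def plain_gd(x0, steps=100, lr=0.1):
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

# Nesterov's accelerated method: gradient step at the look-ahead
# point y, then momentum extrapolation with beta = (sqrt(kappa)-1)/(sqrt(kappa)+1).
def nesterov(x0, steps=100, lr=0.1, kappa=10.0):
    beta = (np.sqrt(kappa) - 1) / (np.sqrt(kappa) + 1)
    x = y = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x_next = y - lr * grad(y)        # gradient step at look-ahead point
        y = x_next + beta * (x_next - x)  # momentum extrapolation
        x = x_next
    return x

x0 = [5.0, 5.0]
print("plain GD :", f(plain_gd(x0)))
print("Nesterov :", f(nesterov(x0)))
```

On this example the accelerated iterates contract at roughly the (1 − 1/√κ) rate rather than the (1 − 1/κ) rate of plain descent, which is the discrete-time face of the speed-up the abstract describes.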