Archive for the Mountains Category

The Terminal [#2]

Posted in Mountains, pictures, Travel on February 19, 2017 by xi'an

For the third time within a year, I have been stuck in an airport hotel after missing a connection! This time on my way to Calgary, thanks to fog over Paris and Amsterdam. And to Air France refusing to switch me to an earlier flight from Paris. Not as strictly stuck as in Delhi, as I could get outside into a sort of no man’s land between runways and expressways, or even reach downtown Amsterdam by public transportation, but with 24 hours to wait for the next flight. The most frustrating part is missing the ice-climbing day I had organised in Banff…

神々の山嶺 [the summit of the gods]

Posted in Books, Kids, Mountains, pictures on February 19, 2017 by xi'an

The Summit of the Gods is a five-volume manga created by Jiro Taniguchi, who just passed away. While I do not find the mountaineering part of the story realistic [as in the above strip], with feats and strength that seem beyond even the top himalayists like Reinhold Messner, Pierre Beghin, Abele Blanc, or Ueli Steck (to name a few), I keep re-reading the series for the unique style of the drawing, the story (despite the above), and the atmosphere of solo climbing in the 1970s and 1980s, especially as a testimony to Japanese climbers, as well as for the perfect rendition of the call of the mountains… Reading Taniguchi’s obituaries over the weekend, I realised he was much more popular in France, where he won a prize for his drawing at the BD Festival in Angoulême in 2005, than in Japan.

off to Banff [17w5024]

Posted in Mountains, pictures, Running, Statistics, Travel, University life on February 18, 2017 by xi'an

Today, I fly from Paris to Amsterdam to Calgary to attend the ABC’ory workshop (17w5024) at the Banff International Research Station (BIRS) that Luke Bornn, Jukka Corander, Gael Martin, Dennis Prangle, Richard Wilkinson, and I put together. The meeting is to brainstorm about the foundations of ABC for statistical inference, rather than about the computational aspects of ABC, but the schedule is quite flexible for other directions!

Les Diablerets skyline [jatp]

Posted in Kids, Mountains, pictures, Statistics, Travel on February 12, 2017 by xi'an

The short course I gave in Les Diablerets, Switzerland, was highly enjoyable, at least for me!, as it gave me the opportunity to present an overview of the field just before our workshop in Banff, and to stay in a fantastic skiing area for four days! While I found out that my limited skiing skills have become even more limited, the constant fresh snow falling during my stay and the very small number of people on the slopes made the outdoors highly enjoyable, all the more because the temperatures were quite tolerable. The above picture was taken on the only morning it did not snow, with a nice cloud inversion over the valley separating France and Switzerland.

whiteout in Les Diablerets [jatp]

Posted in Mountains, pictures, Travel on February 5, 2017 by xi'an

incredible India

Posted in Kids, Mountains, pictures, Running, Travel on January 15, 2017 by xi'an

[The following is a long and fairly naïve rant about India and its contradictions, with no pretence to anything other than writing down some impressions from my last trip. JATP: just another tourist post!]

Incredible India (or Incredible !ndia) is the slogan chosen by the Indian Ministry of Tourism to promote India. And it is indeed an incredible country, from its incredibly diverse landscapes [and not only the Himalayas!] and ecosystems, to its incredibly huge range of languages [although I found out during this trip that the differences between Urdu and Hindi are more communitarian and religious than linguistic, as they both derive from Hindustani, although the alphabets completely differ] and religions [a mixed blessing], to its incredibly rich history and culture, to its incredibly wide array of local cuisines [as shown by the Bengali sample below, where the mustard-seed fish cooked in banana leaves and the fried banana flowers are not visible!] and even wines [like Sula Vineyards, which offers a pretty nice Viognier]. Not to mention incredibly savoury teas from Darjeeling and Assam.

empirical Bayes, reference priors, entropy & EM

Posted in Mountains, Statistics, Travel, University life on January 9, 2017 by xi'an

Klebanov and co-authors from Berlin arXived this paper a few weeks ago and it took me a quiet evening in Darjeeling to read it. It starts from the premises that led Robbins to introduce empirical Bayes in 1956 (although Robbins’ paper does not appear in the references), namely repeated experiments run with different parameters. Except that it turns non-parametric in estimating the prior. And, to avoid resorting to the non-parametric MLE, which is the empirical distribution, it adds a smoothness penalty function to the picture. (Warning: I am not a big fan of the non-parametric MLE!) The idea seems to go back to Good, who acknowledged that using the entropy as penalty lacks reparameterisation invariance. Hence the authors suggest instead using as penalty function on the prior a joint relative entropy on both the parameter and the prior, which amounts to the average of the Kullback-Leibler divergence between the sampling distribution and the predictive based on the prior. Which is then independent of the parameterisation. And of the dominating measure. This is the only tangible connection with reference priors found in the paper.
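
Just to fix the notation (mine, not necessarily the authors’): writing f(x|θ) for the sampling density and m_π(x) = ∫ f(x|θ) π(dθ) for the associated predictive, the penalty, as I understand it, reads

\[
\mathcal{P}(\pi)=\int_{\Theta}\mathrm{KL}\big(f(\cdot\mid\theta)\,\big\|\,m_\pi\big)\,\pi(\mathrm{d}\theta)
=\int_{\Theta}\int_{\mathcal{X}} f(x\mid\theta)\,\log\frac{f(x\mid\theta)}{m_\pi(x)}\,\mathrm{d}x\,\pi(\mathrm{d}\theta),
\]

that is, the mutual information between θ and x under the joint π(dθ)f(x|θ)dx. Being a functional of the distributions alone, it is indeed invariant under reparameterisation of θ and free of any dominating measure; it is also the criterion asymptotically maximised by reference priors, hence the connection.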

The authors then introduce a non-parametric EM algorithm, where the unknown prior becomes the “parameter” and the M step amounts to optimising an entropy in terms of this prior. With an infinite amount of data, the true prior (meaning the overall distribution of the genuine parameters in this repeated-experiment framework) is a fixed point of the algorithm. However, it seems that the only way the algorithm can be implemented is via a discretisation of the parameter space, which opens a whole Pandora’s box of issues, from the discretisation size to dimensionality problems. And to the difficulty of motivating the approach by regularisation arguments, since the final product remains an atomic distribution.
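
As a toy check of the discretised version, here is a sketch in Python (all choices mine, not the authors’: the Normal likelihood, the grid, and the function name npmle_em are arbitrary, and I leave out the entropy penalty, so this boils down to the classical non-parametric MLE EM fixed-point iteration):

import numpy as np

def npmle_em(x, grid, sigma=1.0, n_iter=500):
    """EM fixed point for an atomic prior supported on a fixed grid.

    Toy model (my choice, not the paper's): x_i | theta_i ~ N(theta_i, sigma^2)
    and theta_i ~ pi. Without the entropy penalty this is the classical
    non-parametric MLE EM.
    """
    # (unnormalised) likelihood matrix L[i, j] = f(x_i | grid[j]);
    # the Gaussian normalising constant cancels in the ratios below
    L = np.exp(-0.5 * ((x[:, None] - grid[None, :]) / sigma) ** 2)
    pi = np.full(grid.size, 1.0 / grid.size)  # uniform starting prior
    for _ in range(n_iter):
        m = L @ pi                    # predictive m_pi(x_i) at each datum
        # E step: posterior weights pi_j L[i,j] / m_i; M step: average over i
        pi = pi * (L / m[:, None]).mean(axis=0)
        pi /= pi.sum()                # renormalise against numerical drift
    return pi

# repeated-experiments toy data: the true prior is a two-point mixture
rng = np.random.default_rng(0)
theta = rng.choice([-2.0, 2.0], size=1000)
x = theta + rng.normal(size=1000)
grid = np.linspace(-5.0, 5.0, 101)
pi_hat = npmle_em(x, grid)
print(grid[pi_hat > 0.05])  # mass should sit near -2 and 2

Note that pi_hat comes out atomic whatever the true prior, which is precisely the regularisation complaint above.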

While the alternative of estimating the marginal density of the data by kernels and then aiming at the closest entropy prior is discussed, I find it surprising that the paper does not consider the rather natural alternative of setting a prior on the prior, e.g., via Dirichlet processes.
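
For the record, the hierarchy I have in mind would be (with α a concentration parameter and G₀ a base measure, both my notation)

\[
\pi\sim\mathrm{DP}(\alpha,G_0),\qquad \theta_i\mid\pi\overset{\text{iid}}{\sim}\pi,\qquad x_i\mid\theta_i\sim f(\cdot\mid\theta_i),\quad i=1,\dots,n,
\]

which returns a whole posterior distribution on the prior π, rather than a single penalised point estimate.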