Archive for the Travel Category

incredible India

Posted in Kids, Mountains, pictures, Running, Travel on January 15, 2017 by xi'an

[The following is a long and fairly naïve rant about India and its contradictions, with no pretence to anything other than writing down some impressions from my last trip. JATP: Just another tourist post!]

Incredible India (or Incredible !ndia) is the slogan chosen by the Indian Ministry of Tourism to promote India. And it is indeed an incredible country, from its incredibly diverse landscapes [and not only the Himalayas!] and eco-systems, to its incredibly huge range of languages [although I found out during this trip that the differences between Urdu and Hindi are more communitarian and religious than linguistic, as they both derive from Hindustani, although the alphabets completely differ] and religions [a mixed blessing], to its incredibly rich history and culture, to its incredibly wide offer of local cuisines [as shown by the Bengali sample below, where the mustard seed fish cooked in banana leaves and the fried banana flowers are not visible!] and even wines [like Sula Vineyards, which offers a pretty nice Viognier]. Not to mention incredibly savoury teas from Darjeeling and Assam.

Le Monde puzzle [#990]

Posted in Books, Kids, pictures, Statistics, Travel, University life on January 12, 2017 by xi'an

To celebrate the new year (assuming it is worth celebrating!), Le Monde mathematical puzzle came up with the following:

Two sequences (x¹,x²,…) and (y¹,y²,…) are defined as follows: the current value of x is either the previous value or twice the previous value, while the current value of y is the sum of the values of x up to now. What is the minimum number of steps to reach 2016 or 2017?

By considering that all consecutive powers of 2 must appear at least once, the puzzle boils down to finding the minimal number of replications needed to cover the remainder, namely the target minus the sum of all these powers of 2. Which itself boils down to deriving the binary decomposition of that remainder. Hence the basic R code (using intToBits):

 deco=function(k){
   # largest m such that 2^0+...+2^m still fits below k
   m=trunc(log2(k))
   while (sum(2^(0:m))>k) m=m-1
   if (sum(2^(0:m))==k) return(rep(1,m+1))
   # binary decomposition of the remainder gives the extra replications
   return(c(rep(1,m+1),as.integer(intToBits(k-sum(2^(0:m))))))}

which produces

> sum(deco(2016))
[1] 16
> sum(deco(2017))
[1] 16
> sum(deco(1789))
[1] 18

un lagarto en las Cataratas del Iguazú [guest jatp]

Posted in Kids, pictures, Travel on January 10, 2017 by xi'an


learning and inference for medical discovery in Oxford [postdoc]

Posted in Kids, pictures, Statistics, Travel, University life on January 10, 2017 by xi'an

[Here is a call for a two-year postdoc in Oxford sent to me by Arnaud Doucet. For those worried about moving to Britain, I think that, given the current pace—or lack thereof—of the negotiations with the EU, it is very likely that Britain will not have Brexited two years from now.]

Numerous medical problems, ranging from screening to diagnosis to treatment of chronic diseases to management of care in hospitals, require the development of novel statistical models and methods. These models and methods need to address the unique characteristics of medical data, such as sampling bias, heterogeneity, non-stationarity, and informative censoring. Existing state-of-the-art machine learning and statistics techniques often fail to exploit those characteristics. Additionally, the focus needs to be on probabilistic models which are interpretable by clinicians, so that the inference results can be integrated within medical decision-making.

We have access to unique datasets for clinical deterioration of patients in the hospital, for cancer screening, and for treatment of chronic diseases. Preliminary work has been tested and implemented at UCLA Medical Center, resulting in significantly improved management of care in this hospital.

The successful applicant will be expected to develop new probabilistic models and learning methods inspired by these applications. The focus will be primarily on methodological and theoretical developments, and involve collaborating with Oxford researchers in machine learning, computational statistics and medicine to bring these developments to practice.

The post-doctoral researcher will be jointly supervised by Prof. Mihaela van der Schaar and Prof. Arnaud Doucet. Both of them have a strong track-record in advising PhD students and post-doctoral researchers who subsequently became successful academics in statistics, engineering sciences, computer science and economics. The position is for 2 years.

empirical Bayes, reference priors, entropy & EM

Posted in Mountains, Statistics, Travel, University life on January 9, 2017 by xi'an

Klebanov and co-authors from Berlin arXived this paper a few weeks ago and it took me a quiet evening in Darjeeling to read it. It starts from the premises that led Robbins to introduce empirical Bayes in 1956 (although that paper does not appear in the references), where repeated experiments with different parameters are run. Except that it turns non-parametric in estimating the prior. And, to avoid resorting to the non-parametric MLE, which is the empirical distribution, it adds a smoothness penalty function to the picture. (Warning: I am not a big fan of the non-parametric MLE!) The idea seems to go back to Good, who acknowledged that using the entropy as penalty fails in terms of reparameterisation invariance. Hence the authors suggest instead using as penalty function on the prior a joint relative entropy on both the parameter and the prior, which amounts to the average of the Kullback-Leibler divergences between the sampling distributions and the predictive based on the prior. Which is then independent of the parameterisation. And of the dominating measure. This is the only tangible connection with reference priors found in the paper.
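If I write down this penalty from the description above (not from the paper itself), with sampling density f(x|θ), prior π, and prior predictive m_π, it reads

```latex
\mathcal{H}(\pi) = \int_\Theta \mathrm{KL}\big(f(\cdot\mid\theta)\,\big\|\,m_\pi\big)\,\pi(\mathrm{d}\theta),
\qquad m_\pi(x) = \int_\Theta f(x\mid\theta)\,\pi(\mathrm{d}\theta),
```

and both the Kullback-Leibler divergence and the mixture m_π are indeed unchanged under a reparameterisation of θ, with no dominating measure on the parameter space involved.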

The authors then introduce a non-parametric EM algorithm, where the unknown prior becomes the “parameter” and the M step means optimising an entropy in terms of this prior. With an infinite amount of data, the true prior (meaning the overall distribution of the genuine parameters in this repeated experiment framework) is a fixed point of the algorithm. However, it seems that the only way it can be implemented is via discretisation of the parameter space, which opens a whole Pandora's box of issues, from discretisation size to dimensionality problems. And to motivating the approach by regularisation arguments, since the final product remains an atomic distribution.
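The fixed-point iteration on a discretised parameter space can be sketched as follows, in a minimal simulation (binomial experiments, a Beta(2,5) “true prior”) that omits the entropy penalty of the paper, i.e., this is plain Robbins-style nonparametric estimation of the prior by EM, not the authors' penalised version:

```python
from math import comb

import numpy as np

rng = np.random.default_rng(0)

# repeated experiments: theta_i drawn from an unknown prior, x_i ~ Bin(n, theta_i)
n, n_exp = 20, 500
theta = rng.beta(2, 5, size=n_exp)
x = rng.binomial(n, theta)

# discretise the parameter space (the step the paper relies on)
grid = np.linspace(0.01, 0.99, 99)
# likelihood matrix: L[i, j] = P(x_i | theta = grid[j])
L = np.array([[comb(n, int(xi)) * t**xi * (1 - t)**(n - xi) for t in grid]
              for xi in x])

prior = np.full(grid.size, 1 / grid.size)   # flat starting prior on the grid
for _ in range(200):
    post = L * prior                        # E step: posterior of each theta_i
    post /= post.sum(axis=1, keepdims=True)
    prior = post.mean(axis=0)               # M step: average posterior mass

print(float(grid @ prior))                  # close to the true prior mean 2/7
```

The fitted prior remains an atomic distribution on the grid, which is exactly the regularisation issue raised above.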

While the alternative of estimating the marginal density of the data by kernels and then aiming at the closest entropy prior is discussed, I find it surprising that the paper does not consider the rather natural alternative of setting a prior on the prior, e.g., via Dirichlet processes.

tea rampage

Posted in Mountains, pictures, Travel on January 8, 2017 by xi'an

While in India, I took the opportunity to buy Darjeeling tea in large quantities (15 boxes!) as teas from a wide variety of gardens and types were available, including my favourite, Muscatel. Some of which are for family and friends. We also drove through huge tea gardens on our way back to Bagdogra airport, with new buds already growing on the tea bushes. (The pictures below are taken from the Long View tea estate.)

the terminal

Posted in Books, pictures, Travel on January 7, 2017 by xi'an

The Terminal is this (terrible) movie featuring Tom Hanks stuck in an airport international zone for an indefinite time (based on a true story that saw Mehran Karimi Nasseri remain in Terminal 1 of Roissy for 18 years, when being there for a few hours is already unbearable!). In a similar spirit, we got quarantined in the international zone of Delhi airport for 24 hours, thanks to a missed connection. And to an Air India representative who could not be bothered to find another route, let us out to visit the city, or even provide us access to our bags. So we ended up waiting in the airport short-stay hotel, around the clock, with a bed, food and wifi. Not the end of the World, obviously! And with a rather unique view of the registration desks below.