Archive for Paris

to bike or not to bike

Posted in Kids, pictures, Running, Travel on March 22, 2020 by xi'an

A recent debate between the candidates for the Paris mayorship, including a former Health minister and physician, led to arguments as to whether or not biking in Paris is healthy. It is obviously beneficial for the community, but the question is rather about the personal benefits versus dangers of riding a bike to work daily: extra physical activity on the one hand, exposure to air pollution and risk of accidents on the other. The accident rate did increase during the recent strikes, but at a lesser rate (153%) than the number of cyclists in the streets of Paris (260%). While I do not find the air particularly stinky or unpleasant on my daily 25km, except in the frequent jams between Porte d’Auteuil and Porte de la Muette, and while I have not noticed a direct impact on my breathing or general shape, I try to avoid rush hours, especially on the way back home with a good climb near Porte de Versailles (all the more on days when it is jammed solid with delivery trucks for the nearby exhibition centre). As for accidents, the rule is to maintain constant vigilance and to anticipate potential fishtails, as well as to avoid most bike paths, which I find much more accident-prone than main streets… (Green lights are also more dangerous than red lights, in my opinion!) Presumably, so far at least, the benefits outweigh the costs!

Dauphine blocked for a few hours

Posted in Statistics on March 14, 2020 by xi'an

are pseudopriors required in Bayesian model selection?

Posted in Books, Kids, pictures, Statistics, University life on February 29, 2020 by xi'an

An interesting question from X validated about constructing pseudo-priors for Bayesian model selection. Namely, are pseudo-priors needed for the concept itself or merely for its implementation? The only case where I am aware of pseudo-priors being used is in Bayesian MCMC algorithms such as Carlin and Chib (1995), where these distributions complement the posterior distribution conditional on a single model (index) into a joint distribution across the parameters of all models. The trick of this construction is that the pseudo-priors can be essentially anything, including data-dependent choices. And while they impact the ability of the resulting Markov chain to move between model spaces, they have no bearing on the resulting inference, either when choosing a model or when estimating the parameters of a chosen model. The concept of pseudo-priors was also central to the mis-interpretations found in Congdon (2006) and Scott (2002), which Jean-Michel Marin and I reanalysed in Bayesian Analysis (2008) as the distinction between model-based posteriors and joint pseudo-posteriors.
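To make the construction concrete, here is a minimal sketch of a Carlin and Chib (1995)-type Gibbs sampler on a toy comparison of a zero-mean Normal model against a free-mean Normal model. The simulated data, the (data-dependent) pseudo-prior, and all names below are illustrative assumptions of mine, not part of the original question:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)

# toy data: n observations with unit variance
y = rng.normal(0.3, 1.0, size=50)
n, ybar = len(y), y.mean()
tau2 = 5.0            # prior variance of mu under model 2

def log_lik_M1():     # M1: y_i ~ N(0, 1), no free parameter
    return norm.logpdf(y, 0.0, 1.0).sum()

def log_lik_M2(mu):   # M2: y_i ~ N(mu, 1)
    return norm.logpdf(y, mu, 1.0).sum()

# pseudo-prior on mu when M1 is the current model: any proper density works,
# here a data-dependent choice close to the M2 posterior, to help mixing
pseudo_mean, pseudo_sd = ybar, 1.0 / np.sqrt(n)

def carlin_chib(iters=20_000):
    model, mu = 2, ybar
    states = np.empty(iters, dtype=int)
    for i in range(iters):
        # 1. update mu given the current model
        if model == 2:   # exact conditional posterior under M2
            post_var = 1.0 / (n + 1.0 / tau2)
            mu = rng.normal(post_var * n * ybar, np.sqrt(post_var))
        else:            # under M1, mu is drawn from the pseudo-prior
            mu = rng.normal(pseudo_mean, pseudo_sd)
        # 2. update the model indicator given mu (equal prior model weights);
        #    the pseudo-prior density enters the weight of M1
        log_w1 = log_lik_M1() + norm.logpdf(mu, pseudo_mean, pseudo_sd)
        log_w2 = log_lik_M2(mu) + norm.logpdf(mu, 0.0, np.sqrt(tau2))
        p2 = 1.0 / (1.0 + np.exp(log_w1 - log_w2))
        model = 2 if rng.uniform() < p2 else 1
        states[i] = model
    return states

states = carlin_chib()
print("P(M2 | y) ≈", (states == 2).mean())
```

Changing pseudo_mean or pseudo_sd only alters how often the chain switches between models; the long-run frequency of model 2 still estimates the same posterior model probability, which is the point made above.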

unbiased MCMC with couplings [4pm, 26 Feb., Paris]

Posted in Books, pictures, Statistics, University life on February 24, 2020 by xi'an

On Wednesday, 26 February, Pierre Jacob (Harvard U, currently visiting Paris-Dauphine) is giving a seminar on unbiased MCMC methods with couplings at AgroParisTech, bvd Claude Bernard, Paris 5ème, Room 32, at 4pm, in the All about that Bayes seminar.

MCMC methods yield estimators that converge to the integrals of interest in the limit of the number of iterations. This iterative asymptotic justification is not ideal: first, it stands at odds with current trends in computing hardware, with increasingly parallel architectures; second, the choice of “burn-in” or “warm-up” is arduous. This talk will describe recently proposed estimators that are unbiased for the expectations of interest while having a finite computing cost and a finite variance. They can thus be generated independently in parallel and averaged over. The method also provides practical upper bounds on the distance (e.g. total variation) between the marginal distribution of the chain at a finite step and its invariant distribution. The key idea is to generate “faithful” couplings of Markov chains, whereby pairs of chains coalesce after a random number of iterations. This talk will provide an overview of this line of research.
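As a companion to the abstract, here is a minimal sketch of the lag-one coupled estimator, assuming a one-dimensional Gaussian target and random-walk Metropolis-Hastings with a maximal coupling of the proposals; every name and tuning value below is my own illustrative assumption, not part of the talk:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def log_target(x):
    # standard Normal target, purely for illustration
    return -0.5 * x ** 2

def mh_step(x, sigma=1.0):
    # one marginal random-walk Metropolis-Hastings step
    prop = rng.normal(x, sigma)
    if np.log(rng.uniform()) < log_target(prop) - log_target(x):
        return prop
    return x

def maximal_coupling(mu1, mu2, sigma):
    # maximal coupling of N(mu1, sigma^2) and N(mu2, sigma^2):
    # returns identical values with the largest possible probability
    x = rng.normal(mu1, sigma)
    if rng.uniform() * norm.pdf(x, mu1, sigma) <= norm.pdf(x, mu2, sigma):
        return x, x
    while True:
        y = rng.normal(mu2, sigma)
        if rng.uniform() * norm.pdf(y, mu2, sigma) > norm.pdf(y, mu1, sigma):
            return x, y

def coupled_mh_step(x, y, sigma=1.0):
    # joint proposal plus a common uniform, so the two chains can coalesce
    # and, once equal, remain equal (a "faithful" coupling)
    px, py = maximal_coupling(x, y, sigma)
    logu = np.log(rng.uniform())
    xn = px if logu < log_target(px) - log_target(x) else x
    yn = py if logu < log_target(py) - log_target(y) else y
    return xn, yn

def unbiased_estimate(h, k=10, max_iter=10_000):
    # single-term estimator H_k = h(X_k) + sum_{t=k+1}^{tau-1} [h(X_t) - h(Y_{t-1})]
    x, y = rng.normal(), rng.normal()   # X_0 and Y_0 from the same initial law
    x = mh_step(x)                      # X_1: the X chain runs one step ahead
    t = 1
    est = h(x) if k == 1 else 0.0
    correction = 0.0
    while t < max_iter and not (x == y and t >= k):
        t += 1
        x, y = coupled_mh_step(x, y)    # state is now (X_t, Y_{t-1})
        if t == k:
            est = h(x)
        elif t > k and x != y:
            correction += h(x) - h(y)   # bias correction until the meeting time
    return est + correction

# average independent replicates, each with a finite (random) computing cost;
# for the standard Normal target, E[X^2] = 1
reps = [unbiased_estimate(lambda x: x ** 2, k=10) for _ in range(500)]
print(np.mean(reps), np.std(reps) / np.sqrt(len(reps)))
```

Each call to unbiased_estimate stops at the (random but finite) meeting time, so the replicates can be produced in parallel and simply averaged, which is the parallelisation argument of the abstract.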

Col d’Orcia

Posted in Wines on February 20, 2020 by xi'an

7 years later…

Posted in Statistics on February 20, 2020 by xi'an

séminaire P de S

Posted in Books, pictures, Statistics, University life on February 18, 2020 by xi'an

As I was in Paris and free for the occasion (!), I attended the Paris Statistics seminar this afternoon, in the Latin Quarter. With a first talk by Kweku Abraham on Bayesian inverse problems, where the prior is set on the quantity of interest, γ, rather than on its transform G(γ), which is observed with noise. I am always perturbed by the juggling of different distances, like L² versus Kullback-Leibler, in non-parametric frameworks. Reminding me of probabilistic numerics, at least in the framework, since the crux of the talk was 100% about convergence. And a second talk by Lénaïc Chizat on convex neural networks corresponding to an infinite number of neurons, with surprising properties, including implicit bias. And a third talk by Anne Sabourin on PCA for extremes, which assumed very little about the model but more about the geometry of the distribution, like extremes being concentrated on a subspace. As I was rather tired from an intense week at Warwick, and after a weekend of reading grant applications and Biometrika submissions (!), my foggy brain kept switching to these proposals, trying to make connections with the talks, not completely inappropriately in two cases out of three. (I am afraid the same may happen tomorrow at our probability seminar on computer-based proofs!)