Archive for Grenoble

ABC in Grenoble, 19-20 March 2020 [registration open]

Posted in Mountains, Statistics, Travel, University life on January 7, 2020 by xi'an

A reminder that the next occurrence of the “ABC in…” workshops will very soon take place in Grenoble, France, on 19-20 March 2020. Confirmed speakers and sessions (with more to come) are

Misspecified models

Links with Machine Learning

  • Flora Jay (Université d’Orsay, France) TBA
  • Pierre-Alexandre Mattei (Inria Sophia Antipolis – Méditerranée, France) Partially Exchangeable Networks and Architectures for Learning Summary Statistics in Approximate Bayesian Computation
  • Dennis Prangle (Newcastle University, UK) Scalable approximate inference for state space models with normalising flows

As in most earlier versions of the “ABC in…” workshops (ABC in Paris, London, Roma, &tc.), we are aiming at a workshop atmosphere and, thanks to local sponsors, there are no registration fees, but registration is compulsory. And it is now open!

I also remind ‘Og’s readers that Grenoble can easily be reached by fast train from Paris, Roissy, Geneva, and Lyon. (There are also flights to Grenoble airport from Warwick, as well as Bristol, Edinburgh, London, Manchester, Rotterdam, Stockholm, and Warsaw, but this is less convenient than flying to Lyon Saint-Exupéry airport and then catching a direct train at the airport.) To add to the appeal of the place, the workshop takes place during the skiing season, with three mountain ranges in the close vicinity, making ABski a genuine possibility for the weekend after!

ABC in Grenoble, 19-20 March 2020

Posted in Mountains, pictures, Running, Statistics, Travel, University life on May 22, 2019 by xi'an

The next occurrence of the “ABC in…” workshops will take place in Grenoble, France, on 19-20 March 2020. Both the local organising and international scientific committees have been constituted, and the program should soon be constructed, with calls for contributions launched at the same time. As in most earlier versions of the workshops (ABC in Paris, London, Roma, &tc.), we are aiming at a workshop atmosphere and, thanks to local sponsors, the registration fees, if any, will be low.

Grenoble can easily be reached by fast train from Paris, Roissy, Geneva, and Lyon. (There are also flights to Grenoble airport from Warwick, as well as Bristol, Edinburgh, London, Manchester, Rotterdam, Stockholm, and Warsaw, but this is less convenient than flying to Lyon Saint-Exupéry airport and catching a fast train at the airport.) To add to the appeal of the place, the workshop takes place during the skiing season, with three mountain ranges in the close vicinity, making ABski a genuine possibility for the weekend after!

statlearn 2019, Grenoble

Posted in Statistics on March 22, 2019 by xi'an

In case you are near the French Alps, STATLEARN 2019 will take place in Grenoble the week after next, on 04 and 05 April. The program is quite exciting, registration is free of charge and still open, and the mountains are next door!

Bayesian workshop in the French Alps

Posted in Statistics on June 22, 2018 by xi'an

[Astrostat summer school] sunrise [jatp]

Posted in Statistics on October 10, 2017 by xi'an

[summer Astrostat school] room with a view [jatp]

Posted in Mountains, pictures, R, Running, Statistics, Travel, University life on October 9, 2017 by xi'an

I just arrived in Autrans, on the Plateau du Vercors overlooking Grenoble, and the view is fabulistic! Trees have started to turn red and yellow, the weather is very mild, and my duties are restricted to teaching ABC to a group of enthusiastic astronomers and cosmologists..! This is my second advanced course on ABC in the mountains this year, hard to beat (except by a third course). The surroundings are so serene and peaceful that I even conceded to installing RStudio for my course, instead of sticking to my favourite vim editor and command line.

Bayesian methods in cosmology

Posted in Statistics on January 18, 2017 by xi'an

A rather massive document was arXived a few days ago by Roberto Trotta on Bayesian methods for cosmology, in conjunction with an earlier winter school, the 44th Saas Fee Advanced Course on Astronomy and Astrophysics, “Cosmology with wide-field surveys”. While I never had the opportunity to give a winter school in Saas Fee, I will give a course on ABC next month to statistics graduates in another Swiss dream location, Les Diablerets. And next Fall a course on ABC again, but to astronomers and cosmologists, in Autrans, near Grenoble.

The course document is an 80-page introduction to probability and statistics, in particular Bayesian inference and Bayesian model choice, including exercises and references. As such, it is rather standard, in that the material could be found as well in textbooks. Statistics textbooks.

When introducing the Bayesian perspective, Roberto Trotta advances several arguments in favour of this approach. The first is that it is generally easier to follow a Bayesian approach than to seek a non-Bayesian one while recovering long-term properties. (Although there are inconsistent Bayesian settings.) The second is that Bayesian modelling handles nuisance parameters naturally, because there are essentially no nuisance parameters: they are simply integrated out of the posterior. (Even though preventing small world modelling may lead to difficulties, as in the Robbins-Wasserman paradox.) The remaining two reasons are the incorporation of prior information and the appeal of conditioning on the actual data.
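The nuisance-parameter point can be made concrete with a minimal sketch (mine, not from Trotta's document): for normal data with unknown mean μ and nuisance variance σ², drawing from the joint posterior and simply discarding the σ² draws yields samples from the marginal posterior of μ, which under the standard noninformative prior is a Student t.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Toy data: normal with unknown mean mu and nuisance variance sigma^2.
y = rng.normal(loc=2.0, scale=3.0, size=30)
n, ybar, s2 = len(y), y.mean(), y.var(ddof=1)

# Under the noninformative prior p(mu, sigma^2) ∝ 1/sigma^2, draw from the
# joint posterior: sigma^2 is scaled inverse chi-square, mu | sigma^2 normal.
# Discarding sigma^2 integrates the nuisance parameter out.
sigma2 = (n - 1) * s2 / rng.chisquare(n - 1, size=100_000)
mu = rng.normal(ybar, np.sqrt(sigma2 / n))

# The marginal posterior of mu is Student t_{n-1} centred at ybar with
# scale s/sqrt(n); compare a posterior quantile with the analytic value.
print(np.quantile(mu, 0.975))
print(ybar + stats.t.ppf(0.975, df=n - 1) * np.sqrt(s2 / n))
```

The two printed quantiles agree up to Monte Carlo error, which is the whole point: no separate machinery is needed to eliminate σ².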

The document also includes the nice illustration above of the concentration of measure as the dimension of the parameter increases. (Although one should not over-interpret it: the concentration does not occur in the same way for a normal distribution, for instance.) It further spends quite some space on the Bayes factor, its scaling as a natural Occam's razor, and the comparison with p-values, before (unsurprisingly) introducing nested sampling. And the Savage-Dickey ratio. The conclusion of this model choice section proposes some open problems, with a rather unorthodox (in the Bayesian sense) line on the justification of priors and the notion of a “correct” prior (yeech!), plus a musing about adopting a loss function, with which I quite agree.
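For readers who want to see concentration of measure numerically, here is a quick sketch (my own illustration, not reproducing the figure from the document): for a d-dimensional standard normal, the norm of a draw concentrates in a thin shell around √d, with the relative spread shrinking as d grows.

```python
import numpy as np

rng = np.random.default_rng(0)

# Thin-shell effect: for x ~ N(0, I_d), ||x|| concentrates around sqrt(d),
# and the relative spread of the norm decreases as the dimension d grows.
for d in (2, 20, 200, 2000):
    x = rng.standard_normal((10_000, d))
    norms = np.linalg.norm(x, axis=1)
    print(f"d={d:5d}  mean ||x|| = {norms.mean():7.2f}"
          f"  (sqrt(d) = {np.sqrt(d):7.2f})"
          f"  relative sd = {norms.std() / norms.mean():.3f}")
```

Running this shows the relative standard deviation of the norm dropping steadily with d, which is exactly why, in high dimensions, essentially no posterior mass sits near the mode.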