Archive for O’Bayes

O’Bayes17, next December in Austin

Posted in pictures, Statistics, Travel, University life on April 5, 2017 by xi'an

The next edition of the OBayes meetings is taking place this December 10-13 in Austin, Texas, on the campus of the University of Texas (UT), organised by Carlos Carvalho, Peter Mueller, James Scott, and Tom Shively. Following a tradition of more than 20 years (I went to most meetings, although I missed the very first conference in West Lafayette, Indiana, and only stayed 27 hours in Shanghai!, and I adopted the O'Bayes logo for the Aussois meeting, even though the number there referred to the year rather than to the edition!!), this meeting brings together researchers interested in objective Bayes theory, methodology, and applications, and related topics, provides opportunities for young researchers, and helps establish new collaborations and partnerships. (This is the biennial meeting of the Objective Bayes section of the International Society for Bayesian Analysis, of which I happen to be the current president.)

The list of speakers and discussants this year is quite impressive and far-reaching, and everyone is more than welcome to present a poster at the workshop. The first (Sun)day will see a series of tutorials given by members of the scientific committee (myself included), followed by three days of invited talks with discussions, plus a poster session on Monday night. And possibly a desert excursion on Thursday! It should be a great meeting and I most warmly invite all 'Og's readers to join us in Texas!

Forte di Bard

Posted in Kids, Mountains, pictures, Travel, University life, Wines on August 4, 2016 by xi'an

After our aborted attempt at Monte Rosa, Abele Blanc treated us to a quick visit to Forte di Bard, a 19th Century military fortress in the Valley of Aosta [a first version of which was razed by Napoleon's troops in 1800], on top of the medieval village of Bard. Ironically, the current fortress never saw action, as Napoleon's siege was the last invasion of the kingdom of Savoy by French troops.

The buildings are impressive, so seamlessly connected to the rock spur that supports them that they appear to have grown out of it. They reminded me of Vauban's fortresses, with the same feeling that they were already outdated by the time they were built. (On the French Savoy side, there is a series of fortresses that similarly saw no battle, as they were designed to keep the French out and became useless overnight when this part of Savoy was ceded to France in exchange for its support of the unification of Italy. For instance, there is such a fort in Aussois, which now houses a hostel, a gastronomical restaurant [which we enjoyed at O'Bayes 03], and a via ferrata…)

The fortress has recently been beautifully renovated with the help of the Italian State and of the European Union. It now houses conferences and art exhibits, like those on Marc Chagall and Elliott Erwitt that we briefly saw, although we missed the massive museum of the Alps… A few dozen kilometers from Torino, it would be a perfect location for a small workshop, albeit not large enough for a future MCMski.

hasta luego, Susie!

Posted in Statistics, University life on August 20, 2014 by xi'an

I just heard that our dear, dear friend Susie Bayarri passed away early this morning, on August 19, in València, Spain… I had known Susie for many, many years, our first meeting being in Purdue in 1987, and we shared many, many great times during simultaneous visits to Purdue University and Cornell University in the 1990's. During a workshop at Cornell organised by George Casella (to become the unforgettable Camp Casella!), we shared a flat, and our common breakfasts led her to make fun of my abnormal consumption of cereals forever after, a recurrent joke each time we met! Another time, we were coming back from the movie theatre in Lafayette in Susie's car when we got stopped for going through a red light. Although she tried very hard, her humour and Spanish verve were for once insufficient to convince her interlocutor.

Susie was a great Bayesian, contributing to the foundations of Bayesian testing in her numerous papers and through the direction of deep PhD theses in Valencia, as well as to queuing systems and computer models. She was also incredibly active in ISBA, from the very start of the Bayesian society, and was one of the first ISBA presidents. She definitely contributed to the Objective Bayes section of ISBA as well, especially in the construction of the O'Bayes meetings. She gave a great tutorial on Bayes factors at the last O'Bayes conference at Duke last December, full of jokes and passion, despite being already weakened by her cancer…

So, hasta luego, Susie!, from all your friends. I know we shared the same attitude about our Catholic education and our first names heavily laden with religious meaning, but I’d still like to believe that your rich and contagious laugh now resonates throughout the cosmos. So, hasta luego, Susie, and un abrazo to all of us missing her.

AppliBUGS day celebrating Jean-Louis Foulley

Posted in pictures, Statistics, University life on June 10, 2014 by xi'an

[Sunset from Paris-Dauphine, Nov. 12, 2010]

In case you are in Paris tomorrow and free, there will be an AppliBUGS day focussing on the contributions of our friend Jean-Louis Foulley (a regular contributor to the 'Og!). The meeting takes place in the amphitheatre on the second floor of ENGREF-Montparnasse (19 av du Maine, 75015 Paris, Métro Montparnasse Bienvenüe). I will give a part of the O'Bayes tutorial on alternatives to the Bayes factor.

penalising model component complexity

Posted in Books, Mountains, pictures, Statistics, University life on April 1, 2014 by xi'an

“Prior selection is the fundamental issue in Bayesian statistics. Priors are the Bayesian’s greatest tool, but they are also the greatest point for criticism: the arbitrariness of prior selection procedures and the lack of realistic sensitivity analysis (…) are a serious argument against current Bayesian practice.” (p.23)

A paper that I first read and annotated in the very early hours of the morning in Banff, when temperatures were down in the mid minus 20's, has now appeared on arXiv: “Penalising model component complexity: A principled, practical approach to constructing priors” by Thiago Martins, Dan Simpson, Andrea Riebler, Håvard Rue, and Sigrunn Sørbye. It is a highly timely and pertinent paper on the selection of default priors! It shows that the field of “objective” Bayes is still full of open problems and significant advances, and it makes a great argument for the future president [that I am] of the O'Bayes section of ISBA to encourage young Bayesian researchers to consider this branch of the field.

“On the other end of the hunt for the holy grail, “objective” priors are data-dependent and are not uniformly accepted among Bayesians on philosophical grounds.” (p.2)

Apart from the above quote (objective priors are not data-dependent! presumably a typo for model-dependent), I like the introduction very much, appreciating the reference to the very recent Kamary (2014), which just got rejected by TAS for quoting my blog post way too much… and which we jointly resubmitted to Statistics and Computing. The paper is maybe missing the alternative solution of going hierarchical as far as needed and ending up with default priors [at the top of the ladder]. And it does not discuss the difficulty of specifying the sensitivity of weakly informative priors.

“Most model components can be naturally regarded as a flexible version of a base model.” (p.3)

The starting point for the modelling is the base model. How easy is it to define this base model? Does it [always?] translate into a null hypothesis formulation? Is there an automated derivation? I assume this somewhat follows from the “block” idea, which I do like, but how generic is model construction by blocks?
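To make the base-model idea concrete, here is a toy example of my own (not taken from the paper): take a Gaussian component N(0, 1+ξ) whose base model is ξ=0, and compute the paper's distance d(ξ)=√(2 KLD) to that base in closed form. Function names are mine; only the definition of d follows the paper.

```python
import math

def kld_gauss(var1, var0):
    """KLD( N(0, var1) || N(0, var0) ) between centred univariate Gaussians."""
    return 0.5 * (var1 / var0 - 1.0 - math.log(var1 / var0))

def distance(xi, base_var=1.0):
    """Distance d(xi) = sqrt(2 KLD) of the flexible model N(0, base_var + xi)
    from its base model (xi = 0), following the paper's definition of d."""
    return math.sqrt(2.0 * kld_gauss(base_var + xi, base_var))

# the base model sits at distance zero, and d grows with the flexibility xi
print(distance(0.0), distance(0.5) < distance(2.0))  # 0.0 True
```

The point of the toy is that the flexibility parameter is measured on a single, parameterisation-invariant scale: the distance from the base.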


“Occam’s razor is the principle of parsimony, for which simpler model formulations should be preferred until there is enough support for a more complex model.” (p.4)

I also like this idea of putting a prior on the distance from the base! Even more because it is parameterisation invariant (at least at the hyperparameter level). (This vaguely reminded me of a paper we wrote with George a while ago, replacing tests with distance evaluations.) And because it gives a definitive meaning to Occam's razor. However, unless the hyperparameter ξ is one-dimensional, this does not define a prior on ξ per se. I equally like Eqn (2), as it shows how the base constraint takes one away from Jeffreys' prior. Plus, if one takes the Kullback divergence as an intrinsic loss function, this also sounds related to Holmes's and Walker's substitute-loss pseudopriors, no? Now, Eqn (2) does not sound right in the general case. Unless one implicitly takes a uniform prior on the Kullback sphere of radius d? There is a feeling of one-d-ness in the description of the paper (at least until page 6) and I wanted to see how it extends to models with many (≥2) hyperparameters, until I reached Section 6, where the authors state exactly that! There is also a potential difficulty in that d(ξ) cannot be computed in a general setting. (Assuming that d(ξ) has a non-vanishing Jacobian, as on page 19, sounds rather unrealistic.) Still about Section 6, handling reference priors on correlation matrices is a major endeavour, which should produce a steady flow of followers..!
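In the one-dimensional case where the distance prior does pin down a prior on the hyperparameter, a minimal sketch (function names and the numerical check are mine) goes as follows: for the standard deviation σ of a Gaussian random effect, the distance to the base model σ=0 is proportional to σ itself, so the exponential prior on the distance becomes an exponential prior on σ, with rate λ calibrated through a user-specified tail condition P(σ > U) = α.

```python
import math

def pc_prior_sd(sigma, U=1.0, alpha=0.05):
    """PC prior density for a standard deviation sigma: an exponential prior on
    the distance d(sigma) = sigma from the base model sigma = 0, with rate
    lambda calibrated so that P(sigma > U) = alpha."""
    lam = -math.log(alpha) / U          # solves exp(-lam * U) = alpha
    return lam * math.exp(-lam * sigma)

# crude midpoint-rule check that the density integrates to one on (0, inf)
h = 1e-3
total = sum(pc_prior_sd((i + 0.5) * h) * h for i in range(200_000))
print(round(total, 3))  # ≈ 1.0
```

The calibration step is what makes the construction practical: the user only has to state a plausible upper scale U and a tail probability α, instead of picking hyperparameters of an inverse gamma by habit.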

“The current practice of prior specification is, to be honest, not in a good shape. While there has been a strong growth of Bayesian analysis in science, the research field of “practical prior specification” has been left behind.” (p.23)

There are still quantities to specify and calibrate in the PC priors, which may actually be deemed a good thing by Bayesians (and some modellers). But overall I think this paper and its message constitute a terrific step for Bayesian statistics and I hope the paper can make it to a major journal.