## O’Bayes 2015 [day #3]

**T**he third day of the meeting was a good illustration of the diversity of the themes *[says a member of the scientific committee!]*, ranging from “traditional” O’Bayes talks (on reference priors by the father of all reference priors (!), José Bernardo, on re-examinations of expected posterior priors, on properties of Bayes factors, and on new versions of the Lindley-Jeffreys paradox) to the radically different approach of Simpson et al., presented by Håvard Rue. I was obviously most interested in expected posterior priors!, with the new notion brought in by Dimitris Fouskakis, Ioannis Ntzoufras and David Draper of lessening the impact of the minimal training sample on the resulting prior by raising the likelihood to a power lower than one. Since this change seems to go beyond the “minimal” in minimal sample size, I am somewhat puzzled that this can be achieved, but the normal example shows it is indeed possible. The next difficulty then lies in calibrating this power, as I do not see any intuitive justification for a specific value.

The central talk of the day was in my opinion Håvard’s, as it challenged most tenets of the Objective Bayes approach, presented in a most eager tone, even though it did not generate particularly heated comments from the audience. I have already discussed here an earlier version of this paper and I keep on thinking this proposal for PC priors is a major breakthrough in the way we envision priors and their derivation. I was thus sorry to hear the paper had not been selected as a Read Paper by the Royal Statistical Society, as it would have nicely suited an open discussion, but I hope it finds another outlet that allows for one! As an aside, Håvard discussed the degrees of freedom of a Student’s t as a particularly challenging case for prior construction, although I would instead have analysed the problem from a model choice perspective (on an unusually continuous space of models).
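To see how a power lower than one dampens the impact of the minimal training sample, here is a toy sketch of the normal case with known variance (my own illustration, not the construction of Fouskakis, Ntzoufras and Draper): raising the likelihood of n observations to a power a is equivalent to keeping an effective sample size of a·n.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 1.0  # known sampling standard deviation
n = 5        # "minimal" training sample size
x = rng.normal(2.0, sigma, size=n)

def powered_posterior(x, a, sigma=1.0):
    """Posterior for a normal mean under a flat prior when the
    likelihood is raised to the power a (0 < a <= 1): tempering
    leaves the posterior mean at the sample mean but inflates the
    posterior variance to sigma^2 / (a * n)."""
    n = len(x)
    return x.mean(), sigma**2 / (a * n)

mean_full, var_full = powered_posterior(x, a=1.0)  # regular posterior
mean_temp, var_temp = powered_posterior(x, a=0.5)  # tempered posterior
print(var_temp / var_full)  # halving the power doubles the variance: 2.0
```

In other words, the tempered sample of size n carries no more information than a full-likelihood sample of size a·n, which is how the impact of the minimal sample can drop below that of the nominal minimal size.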

As this conference day had a free evening, I took the tram with friends to the town beach and we had a fantastic [if hurried] dinner in a small bodega [away from the uninspiring beach front] called Casa Montaña, a place decorated with huge barrels, offering amazing tapas and wines, a perfect finale to my Spanish trip. Too bad we had to vacate the dinner room for the next batch of customers…

June 6, 2015 at 9:22 pm

I don’t think I’ve ever seen continuous model choice (except cross-validation, but that’s a touch different to what I assume you’re talking about). Do you have a reference in mind? (Everything Bayesian I can think of is very tied to a discrete model space).

And thanks for the nice words about the PC priors! I was really happy with the interest they generated at the meeting. (And pleasantly surprised – you never know how these things will go…)

June 7, 2015 at 12:07 pm

Neither have I. But in essence nothing prevents one from exploring a continuous space of models if all one seeks is the most probable model. Er… rather the model with the highest evidence, since I cannot think of a probability distribution across a continuous space of models…
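To caricature what such an exploration could look like, here is a hypothetical sketch (the setting and names are my own, not anyone's published method): treat the Student's t degrees of freedom ν as a continuous model index, fix location and scale so that the evidence of the model indexed by ν reduces to the t_ν likelihood of the sample, and maximise that evidence over ν.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import t as student_t

rng = np.random.default_rng(1)
# a sample from a t distribution with 4 degrees of freedom
x = student_t.rvs(df=4.0, size=500, random_state=rng)

def neg_log_evidence(nu):
    # With location and scale fixed, the model indexed by nu has no
    # free parameter left, so its evidence is just the t_nu likelihood.
    return -student_t.logpdf(x, df=nu).sum()

# explore the continuous "model space" nu in (1, 50)
best = minimize_scalar(neg_log_evidence, bounds=(1.0, 50.0),
                       method="bounded")
print(best.x)  # evidence-maximising degrees of freedom
```

With unknown location and scale, the evidence of each model M_ν would instead require integrating those parameters out, but the continuous search over ν would proceed the same way.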