## Archive for the University life Category

## en route to Boston!

Posted in pictures, Running, Travel, University life with tags Bayesian foundations, BFF4, Boston harbour, Charles river, fiducial inference, frequentist inference, Harvard University, USA on April 29, 2017 by xi'an

## Bayes, reproducibility and the Quest for Truth

Posted in Books, Statistics, University life with tags Bayesian foundations, frequency properties, frequentist coverage, L'Aquila, Statistical Science, truth on April 27, 2017 by xi'an

Don Fraser, Mylène Bédard, and three coauthors have written a paper with the above dramatic title in Statistical Science about the reproducibility of Bayesian inference in the framework of what they call a mathematical prior. It connects with the quick-and-dirty tag Don earlier attributed to Bayesian credible intervals.

“We provide simple (…) counter-examples to general claims that Bayes can offer accuracy for statistical inference. To obtain this accuracy with Bayes, more effort is required compared to recent likelihood methods (…) [and] accuracy beyond first order is routinely not available (…) An alternative is to view default Bayes as an exploratory technique and then ask does it do as it overtly claims? Is it reproducible as understood in contemporary science? (…) No one has answers although speculative claims abound.” (p. 1)

The early stages of the paper question the nature of a prior distribution in terms of objectivity and reproducibility, which strikes me as a return to older debates on the nature of probability. They also insist, dubiously in my view, on the reality of a prior, when the said reality is customarily and implicitly assumed for the sampling distribution. While we "can certainly ask how [a posterior] quantile relates to the true value of the parameter", I see no compelling reason why the associated quantile should be endowed with a frequentist coverage meaning, i.e., be anything more than a normative indication of the deviation from the true value. (Assuming there is such a parameter.) To consider that the credible interval of interest can be "objectively" assessed by simulation experiments evaluating its coverage is thus doomed from the start (since there is no reason for the nominal coverage to hold) and situated on the wrong plane, since it stems from the hypothetical frequentist model for a range of parameter values. Instead, I find simulations from (generating) models useful in a general ABC sense, namely that by producing realisations from the predictive one can assess to which degree of roughness the data is compatible with the formal construct. To bind reproducibility to the frequentist framework thus sounds wrong [to me], as it is itself model-based. In other words, I do not find the definition of reproducibility used in the paper to be objective (literally bouncing back from the Gelman and Hennig Read Paper).

At several points in the paper, the legal consequences of using a subjective prior are evoked as legally binding and implicitly as dangerous, with the example of the L'Aquila expert trial. I have trouble seeing the relevance of this entry, as an adverse lawyer is just as entitled to attack the expert on her or his sampling model. More fundamentally, I feel quite uneasy about bringing this type of argument into the debate!

## Gregynog #2 [jatp]

Posted in Kids, pictures, Running, Statistics, Travel, University life with tags Aberystwyth, Britain, Charles Hanbury-Tracy, concrete, Dennis Lindley, Gregynog, Gregynog Statistical Conference, jatp, Powys, seminar, Tregynon, University of Wales, University of Warwick, Wales on April 26, 2017 by xi'an

## ABC postdoc in Oslo

Posted in Kids, Mountains, pictures, Travel, University life with tags ABC, French elections, Norway, Oslo, postdoc, postdoctoral position, Scandinavia, UiO, University of Oslo on April 26, 2017 by xi'an

Jukka Corander sent me the announcement that he is opening a three-year postdoctoral position at the University of Oslo, to work with him and his team on ABC projects. This sounds like quite an exciting offer, and it gives the successful candidate the opportunity to live in the most enjoyable city of Oslo for several years in fairly comfy conditions! **The deadline is May 31.** (If I were at a stage of my career where applying made sense, I would definitely apply. Without even waiting for the outcome of the French elections on May 7!)

## marginal likelihoods from MCMC

Posted in Books, pictures, Statistics, University life with tags ABC, arXiv, Bayesian Methods in Cosmology, curse of dimensionality, evidence, INLA, k-nearest neighbour, marginal likelihood, nested sampling, Planck experiment, San Antonio, satellite on April 26, 2017 by xi'an

A new arXiv entry on ways to approximate marginal likelihoods based on MCMC output, by astronomers (apparently). With an application to the 2015 Planck satellite analysis of cosmic microwave background radiation data, which reminded me of our joint work with the cosmologists of the Paris Institut d'Astrophysique ten years ago. In the literature review, the authors miss several surveys on the approximation of those marginals, including our San Antonio chapter on Bayes factor approximations, but mention our ABC survey somewhat inappropriately, since it does not advocate the use of ABC for such a purpose. (They mention as well variational Bayes approximations, INLA, and powered likelihoods, if not nested sampling.)

The proposal of this paper is to identify the marginal *m* [actually denoted *a* there] as the normalising constant of an unnormalised posterior density. To do so, the authors estimate the posterior by a non-parametric approach, namely a k-nearest-neighbour estimate, with the additional twist of producing a sort of Bayesian posterior on the constant *m*. [And the unusual notion of number density, used for the unnormalised posterior.] The Bayesian estimation of *m* relies on a Poisson sampling assumption on the k-nearest-neighbour distribution. (Sort of, since k is actually fixed, not random.)
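To make the underlying identity concrete: since the posterior density is p(θ) = q(θ)/m for the unnormalised q, one has m = q(θ)/p(θ) at any θ, so plugging a kNN density estimate of p at the MCMC draws yields an estimate of m. The sketch below is my own minimal illustration of that generic idea, not the paper's exact (Bayesian, number-density) construction; the function name and the choice of a median over samples are mine.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import gammaln

def knn_evidence(samples, log_q, k=100):
    """Estimate the normalising constant m of an unnormalised posterior.

    samples : (N, d) array of MCMC draws from the posterior
    log_q   : (N,) log unnormalised posterior density at those draws
    Uses the classical kNN density estimate p_hat = k / (N * V_k),
    where V_k is the volume of the d-ball reaching the k-th neighbour,
    then m = q/p evaluated (robustly, via a median) over the draws.
    """
    n, d = samples.shape
    tree = cKDTree(samples)
    # query k+1 neighbours so the point itself (distance 0) is excluded
    dist, _ = tree.query(samples, k=k + 1)
    r_k = dist[:, -1]
    # log volume of the d-dimensional ball of radius r_k
    log_vol = d * np.log(r_k) + (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    log_p_hat = np.log(k) - np.log(n) - log_vol
    # log m = log q - log p; median over draws for robustness
    return np.exp(np.median(log_q - log_p_hat))

# toy check: posterior N(0,1) with q(x) = exp(-x^2/2), true m = sqrt(2*pi)
rng = np.random.default_rng(0)
x = rng.standard_normal((20000, 1))
m_hat = knn_evidence(x, -0.5 * x[:, 0] ** 2, k=100)
```

Even in this one-dimensional toy example the estimator is only moderately accurate, and the curse of dimensionality mentioned below makes the kNN density estimate deteriorate quickly as d grows.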

If the above sounds confusing and imprecise, it is because I am myself rather mystified by the whole approach and find it difficult to see the point of this alternative. The Bayesian numerics do not seem to serve any purpose other than producing a MAP estimate. And using a non-parametric density estimate opens a Pandora's box of difficulties, the most obvious one being the curse of dimension(ality). This reminded me of the paper by Delyon and Portier I commented on earlier, where they achieve super-efficient convergence when using a kernel estimator, but at a considerable cost and with a similar sensitivity to dimension.