## Archive for frequentist inference

## O’Bayes 2019 conference program

Posted in Kids, pictures, Statistics, Travel, University life with tags Bayesian conference, Bayesian model selection, BNP12, England, frequentist inference, imprecise probabilities, ISBA, O'Bayes 2019, objective Bayes, prior selection, Statistical learning, University of Warwick on May 13, 2019 by xi'an

**T**he full and definitive program of the O’Bayes 2019 conference in Warwick is now on line, including discussants for all papers, and the three [and free] tutorials on Friday afternoon, 28 June, on model selection (M. Barbieri), recent MCMC advances (G.O. Roberts), and BART (E.I. George). Registration remains open at the reduced rate and poster submissions can still be sent to me by all conference participants.

## double yolk priors [a reply from the authors]

Posted in Books, Statistics, University life with tags Brad Efron, David Spiegelhalter, egg double yolks, empirical Bayes methods, frequentist inference, reply on March 14, 2018 by xi'an

*[Here is an email I received from Subhadeep Mukhopadhyay, one of the authors of the paper I discussed yesterday.]*

There are two different polynomial systems in play, *Leg_j(u)* and *T_j = Leg_j(G(θ))*. One is an orthonormal polynomial system of *L₂[0,1]* and the other one is a polynomial in the rank-transform *G(θ)*, orthonormal in *L₂[G]*.

Consider the ratio *f(G⁻¹(u))/g(G⁻¹(u))*, which is the *d(u;G,F)* over the unit interval. Now, this new transformed function is a proper density.

*d(G(θ))* can be expanded [in eq (2.2)] into the *T_j* basis (NOT in the Leg-basis), as it lives in the Hilbert space *L₂(G)* with weight *g*.
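To make the comparison density *d(u;G,F)* mentioned above concrete, here is a minimal numerical sketch. The Beta(2,2) guess prior *g* (cdf *G*) and the Beta(5,2) density *f* are purely illustrative assumptions, not choices taken from the paper; the check that *d* integrates to one follows from the substitution *u = G(θ)*.

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

# Illustrative (assumed) choices: g is a Beta(2,2) "guess" prior with cdf G,
# and f is a Beta(5,2) density; neither comes from the paper itself.
g_dist = stats.beta(2, 2)
f_dist = stats.beta(5, 2)

def d(u, eps=1e-300):
    """Comparison density d(u; G, F) = f(G^{-1}(u)) / g(G^{-1}(u)) on (0,1)."""
    theta = g_dist.ppf(u)  # rank-transform back: theta = G^{-1}(u)
    return f_dist.pdf(theta) / max(g_dist.pdf(theta), eps)

# Substituting u = G(theta), du = g(theta) dtheta turns the integral of d
# into the integral of f, so d is a proper density on the unit interval.
total, _ = quad(d, 0.0, 1.0)
print(round(total, 6))  # ≈ 1.0
```

Here *d(u) = 5 θ³* with *θ = G⁻¹(u)*, a bounded smooth function, so the quadrature confirms the normalisation claim directly.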

## double yolk priors

Posted in Statistics with tags Brad Efron, David Spiegelhalter, egg double yolks, empirical Bayes methods, frequentist inference on March 13, 2018 by xi'an

“To develop a “defendable and defensible” Bayesian learning model, we have to go beyond blindly ‘turning the crank’ based on a “go-as-you-like” [approximate guess] prior. A lackluster attitude towards prior modeling could lead to disastrous inference, impacting various fields from clinical drug development to presidential election forecasts. The real questions are: How can we uncover the blind spots of the conventional wisdom-based prior? How can we develop the science of prior model-building that combines both data and science [DS-prior] in a testable manner – a double-yolk Bayesian egg?”

**I** came, through R bloggers, upon this presentation of a paper by Subhadeep Mukhopadhyay and Douglas Fletcher, Bayesian modelling via goodness of fit, that aims at solving all existing problems with classical Bayesian solutions, apparently! (Also with apparently no awareness of David Spiegelhalter’s take on the matter.) As illustrated by both quotes, above and below:

“The two key issues of modern Bayesian statistics are: (i) establishing principled approach for distilling statistical prior that is consistent with the given data from an initial believable scientific prior; and (ii) development of a Bayes-frequentist consolidated data analysis workflow that is more effective than either of the two separately.”

(I wonder who else in this Universe would characterise “modern Bayesian statistics” in such a non-Bayesian way! And love the notion of distillation applied to priors!) The setup is actually one of empirical Bayes inference where repeated values of the parameter θ drawn from the prior are behind independent observations. Which is not the usual framework for a statistical analysis, where a single value of the parameter is supposed to hide behind the data, but most convenient for frequency based arguments behind empirical Bayes methods (which is the case here). The paper adopts a far-from-modern discourse on the “truth” of “the” prior… (Which is always conjugate in that Universe!) Instead of recognising the relativity of a statistical analysis based on a given prior.

When I tried to read the paper any further, I hit a wall as I could not understand the principle described therein. And how it “consolidates Bayes and frequentist, parametric and nonparametric, subjective and objective, quantile and information-theoretic philosophies”. Presumably the lack of oxygen at the altitude of Chamonix… Given an “initial guess” at the prior, g, a conjugate prior (in dimension one with an invertible cdf), a family of priors is created in what first looks like a form of non-parametric exponential tilting of g. But a closer look [at (2.1)] exposes the “family” as the tautological π(θ) = g(θ) × π(θ)/g(θ). The ratio is expanded into a Legendre polynomial series. Which use in Bayesian statistics dates a wee bit further back than indicated in the paper (see, e.g., Friedman, 1985; Diaconis, 1986). With the side issue that the resulting approximation does not integrate to one. Another side issue is that the coefficients of the Legendre truncated series are approximated by simulations from the prior [Step 3 of the Type II algorithm], rarely an efficient approach to the posterior.
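The simulation-based estimation of the Legendre coefficients can be sketched as follows. This is a hedged illustration, not the paper's code: the Beta(2,2) guess prior g and the Beta(5,2) source of the simulated θ's are assumed for the example, and a plain linear truncation of the series is used.

```python
import numpy as np
from scipy import stats
from scipy.special import eval_legendre

# Assumed, illustrative setup (not from the paper): the "initial guess"
# g is a Beta(2,2) prior with cdf G, and the theta draws come from a
# Beta(5,2) prior standing in for the "true" prior.
rng = np.random.default_rng(0)
g_dist = stats.beta(2, 2)
pi_dist = stats.beta(5, 2)

def leg(j, u):
    """Orthonormal shifted Legendre polynomial of degree j on [0,1]."""
    return np.sqrt(2 * j + 1) * eval_legendre(j, 2.0 * u - 1.0)

# Monte Carlo estimate of the series coefficients c_j = E[Leg_j(G(theta))],
# mirroring the simulation-based step criticised above.
theta = pi_dist.rvs(100_000, random_state=rng)
u = g_dist.cdf(theta)
m = 4
c = np.array([leg(j, u).mean() for j in range(1, m + 1)])

def tilted(t):
    """Truncated tilted prior g(t) * [1 + sum_j c_j Leg_j(G(t))]."""
    ut = g_dist.cdf(t)
    s = sum(cj * leg(j, ut) for j, cj in enumerate(c, start=1))
    return g_dist.pdf(t) * (1.0 + s)

# Trapezoidal check of the normalising constant of the truncation.
grid = np.linspace(0.0, 1.0, 20_001)
vals = tilted(grid)
dx = grid[1] - grid[0]
integral = float(np.sum((vals[:-1] + vals[1:]) / 2) * dx)
```

Note that with this raw linear truncation the normalising constant stays at one by orthogonality of the Leg_j (j ≥ 1) to the constant function, although nothing prevents the truncated density from dipping below zero.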

## distributions for parameters [seminar]

Posted in Books, Statistics, University life with tags Bayesian paradigm, BFF4, Canada, CANSSI, confidence distribution, COPSS Award, fiducial inference, foundations, frequentist inference, Nancy Reid, National Academy of Science, seminar, Université Paris Dauphine, University of Toronto on January 22, 2018 by xi'an

**N**ext Thursday, January 25, Nancy Reid will give a seminar in Paris-Dauphine on distributions for parameters that covers different statistical paradigms and brings new light on the foundations of statistics. (Coffee is at 10am in the Maths department common room and the talk is at 10:15 in room A, second floor.)

Nancy Reid is University Professor of Statistical Sciences and the Canada Research Chair in Statistical Theory and Applications at the University of Toronto, and an internationally acclaimed statistician, as well as a 2014 Fellow of the Royal Society of Canada. In 2015, she received the Order of Canada, was elected a foreign associate of the National Academy of Sciences in 2016, and has been awarded many other prestigious statistical and science honours, including the Committee of Presidents of Statistical Societies (COPSS) Award in 1992.

Nancy Reid’s research focuses on finding more accurate and efficient methods to deduce and conclude facts from complex data sets to ultimately help scientists find specific solutions to specific problems.

There is currently some renewed interest in developing distributions for parameters, often without relying on prior probability measures. Several approaches have been proposed and discussed in the literature and in a series of “Bayes, fiducial, and frequentist” workshops and meeting sessions. Confidence distributions, generalized fiducial inference, inferential models, belief functions, are some of the terms associated with these approaches. I will survey some of this work, with particular emphasis on common elements and calibration properties. I will try to situate the discussion in the context of the current explosion of interest in big data and data science.

## en route to Boston!

Posted in pictures, Running, Travel, University life with tags Bayesian foundations, BFF4, Boston harbour, Charles river, fiducial inference, frequentist inference, Harvard University, USA on April 29, 2017 by xi'an

## beyond objectivity, subjectivity, and other ‘bjectivities

Posted in Statistics with tags Andrew Gelman, Christian Hennig, discussion paper, Errol Street, frequentist inference, London, objectivism, Read paper, Royal Statistical Society, RSS, Series A, statistical modelling, subjective versus objective Bayes, subjectivity on April 12, 2017 by xi'an

**H**ere is my discussion of Gelman and Hennig at the Royal Statistical Society, which I am about to deliver!

## objective and subjective RSS Read Paper next week

Posted in Books, pictures, Statistics, Travel, University life, Wines with tags Andrew Gelman, Christian Hennig, discussion paper, England, frequentist inference, London, objective Bayes, objectivism, Philosophy of Science, Read paper, Royal Statistical Society, RSS, Series A, subjective versus objective Bayes, subjectivity on April 5, 2017 by xi'an

**A**ndrew Gelman and Christian Hennig will give a Read Paper presentation next Wednesday, April 12, 5pm, at the Royal Statistical Society, London, on their paper “Beyond subjective and objective in statistics“. Which I hope to attend, and otherwise to write a discussion. Since the discussion (to be published in Series A) is open to everyone, I strongly encourage ‘Og’s readers to take a look at the paper and the “radical” views therein to hopefully contribute to this discussion. Either as a written discussion or as comments on this very post.