**A**mong the flurry of papers arXived around the ICML 2019 deadline, I read on my way back from Oxford a paper by Wiqvist et al. on learning summary statistics for ABC with neural nets. It points to another recent paper by Jiang et al. (2017, Statistica Sinica), which constructed a neural network predicting each component of the parameter vector from the (raw) input data, as an automated non-parametric regression of sorts. Creel (2017) does the same but with summary statistics. The current paper builds on Jiang et al. (2017) by adding the constraint that exchangeability and partial exchangeability features should be reflected in the neural net prediction function, with applications to Markovian models. Thanks to a factorisation theorem for d-block invariant models, the authors impose partial exchangeability for order-d Markov models by combining two neural networks whose composition satisfies this factorisation. The concept is exemplified on one-dimensional g-and-k distributions and alpha-stable distributions, both made of independent observations, and on the AR(2) and MA(2) models, as in our 2012 ABC survey paper. Since the latter is not Markovian, the authors experiment with different orders and reach the conclusion that an order of 10 is most appropriate, although this may be impacted by being able to handle the true likelihood.
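As a rough sketch of this factorisation (my own toy illustration, with random linear-tanh maps standing in for the trained networks of Wiqvist et al.), the order-d summary combines an outer network ρ applied to the first d observations with the sum of an inner network φ over all sliding blocks of length d+1, which is exactly what makes the statistic invariant under permutations of those blocks:

```python
import numpy as np

def phi(block, W):
    # inner "network": maps a block of d+1 consecutive values to k features
    # (a single random tanh layer here, standing in for a trained net)
    return np.tanh(W @ block)

def rho(head, pooled, V):
    # outer "network": combines the first d observations with pooled features
    return V @ np.concatenate([head, pooled])

def pen_summary(x, d, W, V):
    """Partially exchangeable summary for an order-d Markov series:
    S(x) = rho(x[:d], sum_i phi(x[i:i+d+1]))."""
    pooled = sum(phi(x[i:i + d + 1], W) for i in range(len(x) - d))
    return rho(x[:d], pooled, V)

rng = np.random.default_rng(0)
d, n, k = 2, 50, 4
W = rng.normal(size=(k, d + 1))       # inner-net weights (hypothetical)
V = rng.normal(size=(3, d + k))       # outer-net weights, 3-dim parameter
x = rng.normal(size=n)

s = pen_summary(x, d, W, V)           # 3-dim summary/prediction
```

Summing φ over blocks is what enforces d-block invariance: shuffling the blocks leaves the pooled term, hence S, unchanged.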

## Archive for Oxford

## a pen for ABC

Posted in Books, pictures, Statistics, Travel, University life with tags ABC, alpha-stable processes, exchangeability, g-and-k distributions, ICML, MA(q) model, Markov model, neural network, Oxford, partial exchangeability on February 13, 2019 by xi'an

## Jeffreys priors for hypothesis testing [Bayesian reads #2]

Posted in Books, Statistics, University life with tags Arnold Zellner, Bayes factor, Bayesian tests of hypotheses, CDT, class, classics, Gaussian mixture, improper priors, Jeffreys prior, JRSSB, Kullback-Leibler divergence, Oxford, PhD course, Saint Giles cemetery, Susie Bayarri, Theory of Probability, University of Oxford on February 9, 2019 by xi'an

A second (re)visit to a reference paper I gave to my OxWaSP students for the last round of this CDT joint programme. Indeed, this may be my first complete read of Susie Bayarri and Gonzalo Garcia-Donato's 2008 Series B paper, inspired by Jeffreys', Zellner's and Siow's proposals in the Normal case. *(Disclaimer: I was not the JRSS B editor for this paper.)* Which I saw as a talk at the O'Bayes 2009 meeting in Phillie.

The paper aims at constructing formal rules for objective proper priors in testing embedded hypotheses, in the spirit of Jeffreys’ Theory of Probability “hidden gem” (Chapter 3). The proposal is based on symmetrised versions of the Kullback-Leibler divergence κ between null and alternative used in a transform like an inverse power of 1+κ. With a power large enough to make the prior proper. Eventually multiplied by a reference measure (i.e., the arbitrary choice of a dominating measure.) Can be generalised to any intrinsic loss (not to be confused with an intrinsic prior à la Berger and Pericchi!). Approximately Cauchy or Student’s t by a Taylor expansion. To be compared with Jeffreys’ original prior equal to the derivative of the atan transform of the root divergence (!). A delicate calibration by an effective sample size, lacking a general definition.
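To make the transform concrete, here is a minimal numerical sketch (my own illustration, not code from the paper) for testing a N(0,1) null against a N(θ,1) alternative: the symmetrised Kullback-Leibler divergence is κ(θ)=θ², and the inverse power (1+κ)^(-q) with q=1 is already proper, recovering a Cauchy-shaped prior:

```python
import numpy as np

def sym_kl(theta):
    # symmetrised Kullback-Leibler divergence between N(theta,1) and N(0,1):
    # both directed divergences equal theta^2/2, so their sum is theta^2
    return theta ** 2

def divergence_prior(theta, q=1.0):
    # divergence-based prior: reference (Lebesgue) measure times (1+kappa)^(-q);
    # q must be large enough for propriety, and q=1 already suffices here
    return (1.0 + sym_kl(theta)) ** (-q)

# numerical check of propriety: the normalising constant approaches pi,
# i.e. the q=1 prior is the standard Cauchy density up to the 1/pi factor
theta = np.linspace(-50.0, 50.0, 200_001)
y = divergence_prior(theta)
mass = np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(theta))  # trapezoidal rule
```

The same recipe applies with any intrinsic loss in place of the symmetrised divergence, only the propriety-ensuring power q changes.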

At the start the authors rightly insist on having the nuisance parameter ν differ for each model but… as we all often do, they relapse into having the "same ν" in both models for integrability reasons. Nuisance parameters make the definition of the divergence prior somewhat harder. Or somewhat arbitrary. Indeed, as in reference prior settings, the authors work first conditional on the nuisance, then use a prior on ν that may be improper by the "same" argument. (Although *conditioning* is not the proper term if the marginal prior on ν is improper.)

The paper also contains an interesting case of the translated Exponential, where the prior is a Student's t with 2 degrees of freedom. And another one of mixture models, albeit in the simple case of a location parameter on one component only.

## cement homeless

Posted in pictures, Travel with tags homeless, Isaac Cordal, New York city, Oxford, Oxford city council, sculpture, shelter, The Guardian, Unfinished People in NYC on February 2, 2019 by xi'an

## BNP12

Posted in pictures, Statistics, Travel, University life with tags Bayesian nonparametrics, BNP12, Coventry, England, ISBA, Midlands, O'Bayes 2019, objective Bayes, Oxford, support, UK, University of Oxford, University of Warwick on October 9, 2018 by xi'an

**T**he next BNP (Bayesian nonparametric) conference is taking place in Oxford (UK), prior to the O'Bayes 2019 conference in Warwick, on June 24-28 and June 29-July 2, respectively. At this stage, the Scientific Committee of BNP12 invites submissions for possible contributed talks. The deadline for submitting a title/abstract is 15th December 2018, and the submission of applications for travel support also closes on 15th December 2018. Currently, there are 35 awards, which can be either travel awards or accommodation awards. The support is for junior researchers (students currently enrolled in a DPhil (PhD) programme, or having graduated after 1st October 2015). The applicant agrees to present her/his work at the conference as a poster or orally if awarded the travel support.

As for O'Bayes 2019, we are currently composing the programme, following the 20-year tradition of these O'Bayes meetings of having the Scientific Committee (Marilena Barbieri, Ed George, Brunero Liseo, Luis Pericchi, Judith Rousseau and myself) invite about 25 speakers to present their recent work and 25 discussants to… discuss these works. With a first day of introductory tutorials to Bayes, O'Bayes and beyond. I (successfully) proposed this date and location to the O'Bayes board to take advantage of the nonparametric Bayes community present in the vicinity, so that they could attend both meetings at limited cost and carbon impact.