Archive for University of Oxford

The New Yorker [April 4, 2022]

Posted in Books, Kids, pictures, Travel, University life on April 30, 2022 by xi'an

I had not bought a paper issue of The New Yorker from a newsstand for ages, possibly decades, so I jumped on the opportunity on my way back from Rutgers! This April 4 issue contained several impressive articles, which I read over the following weeks. (As I discovered during the lockdown, one issue a week is too much material!) The most moving one is about Mackenzie Fierceton, a brilliant Penn graduate who escaped familial abuse to pursue sociology studies at Penn, to the point of receiving a prestigious Rhodes scholarship in 2021 to start a PhD in social policy at the University of Oxford. Only for the scholarship to be rescinded by the Rhodes Trust, after action from Penn. While I cannot judge of the facts and arguments based on this sole article (whose 12 pages are definitely impressive in their detail and careful depiction of the whole story), the corporate-damage-control attitude of Penn in this affair is appalling. And revealing of an awfully biased perception of abuse and psychological damage as being limited to low-income classes. All the best to this student in the pursuit of her studies and ideals.

Big Bayes postdoctoral position in Oxford [UK]

Posted in Statistics on March 3, 2022 by xi'an

Forwarding a call for postdoctoral applications from Prof Judith Rousseau, with deadline 30 March:

Seeking a Postdoctoral Research Assistant, to join our group at the Department of Statistics. The Postdoctoral Research Assistant will be carrying out research for the ERC project General Theory for Big Bayes, reporting to Professor Judith Rousseau. They will provide guidance to junior members of the research group such as PhD students, and/or project volunteers.

The aim of this project is to develop a general theory for the analysis of Bayesian methods in complex and high (or infinite) dimensional models, covering not only a fine understanding of the posterior distributions but also an analysis of the output of the algorithms used to implement the approaches. The main objectives of the project are (briefly): 1) asymptotic analysis of the posterior distribution of complex high dimensional models; 2) interactions between the asymptotic theory of high dimensional posterior distributions and computational complexity. We will also enrich these theoretical developments with 3) strongly related domains of application, namely neuroscience, terrorism and crime, and ecology.

The postholder will hold or be close to completion of a PhD/DPhil in statistics, together with relevant experience. They will have the ability to manage their own academic research and associated activities, and have previous experience of contributing to publications/presentations. They will contribute ideas for new research projects and research income generation. Ideally, the postholder will also have experience in theoretical properties of Bayesian procedures and/or approximate Bayesian methods.

robust inference using posterior bootstrap

Posted in Books, Statistics, University life on February 18, 2022 by xi'an

The famous 1994 Read Paper by Michael Newton and Adrian Raftery was entitled Approximate Bayesian inference with the weighted likelihood bootstrap, where the bootstrap aspect lies in randomly (exponentially) weighting each observation in the iid sample through a power of the corresponding density, a proposal that happened at about the same time as Tony O'Hagan suggested the related fractional Bayes factor. (The paper may also be equally famous for suggesting the harmonic mean estimator of the evidence!, although it only appeared as an appendix to the paper.) What is unclear to me is the nature of the distribution g(θ) associated with the weighted bootstrap sample, conditional on the original sample, since the outcome is the result of a random Exponential sample and of an optimisation step. This comes with no impact of the prior (which could have been used as a penalisation factor), an omission corrected by Michael and Adrian via an importance step involving the estimation of g(·).
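For concreteness, here is a minimal sketch of the weighted likelihood bootstrap in the simplest possible setting, a Normal mean with known variance, where each weighted MLE is a closed-form weighted average. The function name and model choice are mine, for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.0, size=100)  # iid sample, known unit variance

def weighted_bootstrap(x, B=1000):
    """Weighted likelihood bootstrap, sketched for a Normal(mu, 1) model:
    each draw re-weights the log-likelihood with iid Exponential(1) weights
    and returns the weighted MLE of mu, here a closed-form weighted mean."""
    draws = np.empty(B)
    for b in range(B):
        w = rng.exponential(scale=1.0, size=len(x))  # random weights
        draws[b] = np.sum(w * x) / np.sum(w)         # weighted MLE of mu
    return draws

sample = weighted_bootstrap(x)  # approximate posterior draws of mu
```

The resulting draws concentrate around the MLE with spread of order 1/√n, mimicking a posterior without ever evaluating a prior, which is precisely the missing-prior issue discussed above.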

At the Algorithm Seminar today in Warwick, Emilie Pompe presented recent research, including work written jointly with Pierre Jacob [which I have not yet read], that does exactly this inclusion of the log prior as a penalisation factor, along with an extra weight different from one, motivated by the possibility of misspecification. It includes a new approach to cut models. An alternative mentioned during the talk, which reminds me of GANs, is to generate a pseudo-sample from the prior predictive and add it to the original sample. (Some attendees commented on the dependence of the latter version on the chosen parameterisation, an issue that had X'ed my mind as well.)
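A hedged sketch of what such a prior-penalised weighted bootstrap could look like, in a toy Normal-Normal conjugate setting where the weighted MAP is available in closed form. The exponential prior weight w0 and the tempering factor c below are my illustrative reading, not necessarily the exact construction in Pompe and Jacob's work:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(loc=1.5, scale=1.0, size=50)  # iid sample, unit variance

def posterior_bootstrap(x, mu0=0.0, tau2=1.0, B=1000, c=1.0):
    """Prior-penalised weighted bootstrap sketch: each draw maximises
        sum_i w_i log f(x_i | theta) + w0 log pi(theta),
    for a Normal(theta, 1) likelihood and Normal(mu0, tau2) prior, so the
    weighted MAP is a closed-form precision-weighted average. The factor c
    rescales the likelihood weights (c != 1 mimics tempering the likelihood
    under misspecification)."""
    out = np.empty(B)
    for b in range(B):
        w = c * rng.exponential(size=len(x))  # likelihood weights
        w0 = rng.exponential()                # random prior weight
        out[b] = (w0 * mu0 / tau2 + np.sum(w * x)) / (w0 / tau2 + np.sum(w))
    return out

draws = posterior_bootstrap(x)
```

With w0 fixed to zero this collapses back to the plain weighted likelihood bootstrap, which is one way to see the prior term as a penalisation.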

David Cox (1924-2022)

Posted in Books, Statistics, University life on January 20, 2022 by xi'an

It is with much sadness that I heard from Oxford yesterday night that David Cox had passed away. Hither goes a giant of the field, whose contributions to theoretical and methodological statistics are enormous and whose impact on society is truly exceptional. He was the first recipient of the International Prize in Statistics in 2016 (aka the "Nobel of Statistics") among many awards, and a Fellow of the Royal Society among many other recognitions. He was also the editor of Biometrika for 25 years (!) and was still submitting papers to the journal a few months ago. Statistical Science published a conversation between Nancy Reid and him that tells a lot about the man and his amazing modesty. While I had met him in 1989, when he was visiting Cornell University as a distinguished visitor (and when I drove him to the house of Anne and George Casella for dinner once), then again in the 1990s when he came on a two-day visit to CREST, we only really had a significant conversation in 2011 (!), when David and I attended the colloquium in honour of Mike Titterington in Glasgow and he proved most interested in the ABC algorithm. He published a connected paper in Biometrika the year after, with Christiana Kartsonaki. We met a few more times later, always in Oxford, to again discuss ABC. On each occasion, he was incredibly kind and considerate.

improving bridge samplers by GANs

Posted in Books, pictures, Statistics on July 20, 2021 by xi'an

Hanwen Xing from Oxford recently posted a paper on arXiv about using GANs to improve the overlap between the densities in bridge sampling, bringing out new connections with noise contrastive estimation. The idea is to optimise a transform of one of the densities h(·) to bring it closer to the other density k(·), using for instance normalising flows. (The call to transforms for bridge sampling is not new, dating at least to Voter in 1985, the year I was starting my PhD!) Furthermore, using an f-divergence as a measure of functional distance allows for a reasonably straightforward update of the transform. That can be reformulated as a GAN target, which is somewhat natural in that the transform aims at confusing simulations from the transform of h and from k. This is quite an interesting proposal, even though calculating the optimal transform is time-consuming and subject to the curse of dimensionality. I also wonder whether iterating the optimisation, one density after the other, would bring further improvement.
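As a toy illustration of why transforming one density helps bridge sampling, here is the basic bridge identity with a geometric bridge function, where a fixed location shift stands in for the learned normalising flow. This is emphatically a sketch of the general principle, not Xing's method, and all names are mine:

```python
import numpy as np

rng = np.random.default_rng(2)

# Two unnormalised Gaussian densities with poor overlap: h ~ N(0,1), k ~ N(3,1).
h = lambda t: np.exp(-0.5 * t**2)
k = lambda t: np.exp(-0.5 * (t - 3.0)**2)

def bridge_estimate(h, k, xs_h, xs_k):
    """Bridge sampling identity with the geometric bridge alpha = 1/sqrt(h*k):
    estimates the ratio Z_k / Z_h of normalising constants from draws of each
    (normalised) density."""
    alpha = lambda t: 1.0 / np.sqrt(h(t) * k(t))
    return np.mean(k(xs_h) * alpha(xs_h)) / np.mean(h(xs_k) * alpha(xs_k))

xs_h = rng.normal(0.0, 1.0, 10_000)
xs_k = rng.normal(3.0, 1.0, 10_000)
r_raw = bridge_estimate(h, k, xs_h, xs_k)  # noisy: the densities barely overlap

# A location shift as a toy "transform" increasing the overlap: draws of k are
# shifted back by 3 and k is replaced by its pull-back, leaving Z_k unchanged.
k_T = lambda t: k(t + 3.0)
r_shift = bridge_estimate(h, k_T, xs_h, xs_k - 3.0)
```

Both estimators target the true ratio (here 1, since both densities share the same normalising constant), but the transformed version makes the two densities coincide and so has essentially no Monte Carlo error, which is the overlap improvement the GAN transform seeks in higher dimensions.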
