Archive for Bayesian computing

ISBA2020 program

Posted in Kids, Statistics, Travel, University life on January 29, 2020 by xi'an

The scheduled program for ISBA 2020 is now on-line. And full of exciting sessions, many with a computational focus. With dear hopes that the 2019-nCoV epidemic will have abated by then (and not solely for the sake of the conference, most obviously!). While early registration ends on 15 April, the deadline for junior travel support ends this month. And so does the deadline for contributions.

MCMC, with common misunderstandings

Posted in Books, pictures, R, Statistics, University life on January 27, 2020 by xi'an

As I was asked to write a chapter on MCMC methods for a forthcoming Handbook of Computational Statistics and Data Science, published by Wiley, rather than cautiously declining!, I decided to recycle the answers I wrote on X validated to what I considered to be the most characteristic misunderstandings about MCMC and other computing methods, using as background the introduction produced by Wu Changye in his PhD thesis. Waiting for the opinion of the editors of the Handbook on this Q&A style. The outcome is certainly lighter than other recent surveys, like the one we wrote with Peter Green, Krys Latuszinski, and Marcelo Pereyra for Statistics and Computing, or the one with Victor Elvira, Nick Tawn, and Changye Wu.

don’t be late for BayesComp’2020

Posted in Statistics on October 4, 2019 by xi'an

An important reminder that October 14 is the deadline for regular registration to BayesComp 2020 as late fees will apply afterwards!!! The conference looks attractive enough to agree to pay more, but still…

deadlines for BayesComp’2020

Posted in pictures, Statistics, Travel, University life on August 17, 2019 by xi'an

While I forgot to send a reminder that August 15 was the early-registration deadline for BayesComp 2020, here are the further deadlines and dates:

  1. BayesComp 2020 takes place on January 7-10, 2020, in Gainesville, Florida, USA
  2. Registration is open at regular rates until October 14, 2019
  3. The deadline for submission of poster proposals is December 15, 2019
  4. The deadline for travel support applications is September 20, 2019
  5. There are four free tutorials on January 7, 2020, on Stan, NIMBLE, SAS, and AutoStat

BayesComp 20: call for contributed sessions!

Posted in pictures, Statistics, Travel, University life on March 20, 2019 by xi'an

Just to remind readers of the incoming deadline for BayesComp sessions:

The deadline for providing a title and brief abstract for the session is April 1, 2019. Please provide the names and affiliations of the organizer and the three speakers (the organizer can be one of them). Each session lasts 90 minutes and each talk should be 30 minutes long, including Q&A. Contributed sessions can also consist of tutorials on the use of novel software. Decisions will be made by April 15, 2019. Please send your proposals to Christian Robert, co-chair of the scientific committee. We look forward to seeing you at BayesComp 20!

In case you do not feel like organising a whole session by yourself, contact the ISBA section you feel an affinity with and suggest building the session together!

a come-back of the harmonic mean estimator

Posted in Statistics on September 6, 2018 by xi'an

Are we in for a return of the harmonic mean estimator?! Allen Caldwell and co-authors arXived a new document that Allen also sent me, following a technique that bears similarities with our earlier approach with Darren Wraith, the difference lying in the more careful and practical construction of the partition set and the use of multiple hypercubes, which is the smart thing. I visited Allen's group at the Max Planck Institut für Physik (Heisenberg) in München (Garching) in 2015 and we confronted our perspectives on harmonic means at that time. The approach followed in the paper starts from what I would call the canonical Gelfand and Dey (1995) representation with a uniform prior, namely that the integral of an arbitrary non-negative function [or unnormalised density] ƒ can be connected with the integral of the said function ƒ over a smaller set Δ with finite measure [or volume]. And therefore to simulations from the density ƒ restricted to this set Δ. Which can be recycled by the harmonic mean identity towards producing an estimate of the integral of ƒ over the set Δ. When considering a partition, these integrals sum up to the integral of interest, but this is not necessarily the only exploitation one can make of the fundamental identity. The most novel part stands in constructing an adaptive partition based on the sample, made of hypercubes obtained after whitening the sample, keeping only points with a large enough density and sufficient separation to avoid overlap. (I am unsure a genuine partition is needed.) In order to avoid selection biases, the original sample is separated into two groups, used independently. Integrals that lie too far away from the others are removed as well. This construction may sound a bit daunting in the number of steps it involves and in the poor fit of a Normal to a hypercube (or conversely), but it seems to shy away from the number one issue with the basic harmonic mean estimator, the almost certain infinite variance. Although it would be nice to be completely certain this doom is avoided. I still wonder at the degeneracy of the approximation of the integral with the dimension, as well as at other ways of exploiting this always fascinating [if fraught with dangers] representation. And at comparing variances.
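The restricted harmonic mean identity can be sketched in a few lines of Python. This is a toy one-dimensional version, not the paper's adaptive multi-hypercube construction: the Gaussian kernel target, the single interval Δ=[-1,1], and the sample size are all illustrative choices of mine. The point it illustrates is that, when ƒ is bounded away from zero on a bounded Δ, the terms 1/ƒ(Xᵢ) are bounded and the estimator's variance stays finite, unlike the basic harmonic mean estimator.

```python
import math
import random

random.seed(42)

def f(x):
    """Unnormalised target: the standard Gaussian kernel exp(-x²/2)."""
    return math.exp(-0.5 * x * x)

# Restricted set Δ = [-1, 1], with volume (Lebesgue measure) 2,
# playing the role of one "hypercube" of a partition.
lo, hi = -1.0, 1.0
vol = hi - lo

# Simulate from f restricted to Δ by rejection from the uniform on Δ:
# f ≤ 1 on Δ, so accept x ~ U(Δ) with probability f(x).
samples = []
while len(samples) < 20_000:
    x = random.uniform(lo, hi)
    if random.random() < f(x):
        samples.append(x)

# Harmonic mean identity on the restricted set:
#   E_{f|Δ}[1/f(X)] = vol(Δ) / Z_Δ   where  Z_Δ = ∫_Δ f(x) dx,
# hence Z_Δ is estimated by vol(Δ) over the average of the 1/f(Xᵢ).
inv_mean = sum(1.0 / f(x) for x in samples) / len(samples)
Z_delta = vol / inv_mean

# Exact value for comparison: ∫_{-1}^{1} exp(-x²/2) dx = √(2π)·erf(1/√2)
truth = math.sqrt(2 * math.pi) * math.erf(1.0 / math.sqrt(2.0))
print(f"estimate: {Z_delta:.4f}  exact: {truth:.4f}")
```

Since 1/ƒ ranges only over [1, e¹ᐟ²] on Δ, the estimate concentrates quickly around the exact value; the difficulty the paper addresses is choosing the sets Δ adaptively in higher dimensions, where a fixed hypercube either misses most of the mass or lets ƒ get arbitrarily close to zero.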

advanced computational methods for complex models in Biology [talk]

Posted in Books, pictures, Statistics, Travel, University life on September 29, 2016 by xi'an

St Pancras. London, Jan. 26, 2012

Here are the slides of the presentation I gave at the EPSRC Advanced Computational Methods for Complex Models in Biology workshop at University College London last week, introducing random forests as proper summaries for both model choice and parameter estimation (with considerable overlap with earlier slides, obviously!). The other talks of that highly interesting day on computational biology were mostly about ancestral graphs, using Wright-Fisher diffusions for coalescents, plus a comparison of expectation-propagation and ABC on a genealogy model by Mark Beaumont, and the decision-theoretic approach to HMM order estimation by Chris Holmes. In addition, it gave me the opportunity to come back to the Department of Statistics at UCL more than twenty years after my previous visit, at a time when my friend Costas Goutis was still there. And to realise it had moved from its historical premises years ago. (I wonder what happened to the two staircases built, if I remember correctly, to reduce frictions between Fisher and Pearson…)