The full program is now available on the conference webpage of BayesComp 20, 7-10 Jan 2020. There are eleven invited sessions, including one j-ISBA session, and a further thirteen contributed sessions selected by the scientific committee. Calls are still open for tutorials on Tuesday 07 January (with two already planned, on Nimble and AutoStat) and for posters. Now is the best time to register! Note also that travel support should be available for junior researchers.
Archive for Bayesian computation
BayesComp 20 [full program]
Posted in pictures, R, Statistics, Travel, University life with tags AutoStat, AutoStat Institute, BayesComp 2020, Bayesian computation, breaking news, conference, Florida, Gainesville, ISBA, Nimble, poster session, travel support, tutorial, University of Florida on April 15, 2019 by xi'an
BayesComp 20 [registration open]
Posted in pictures, Statistics, Travel with tags BayesComp 2020, Bayesian computation, breaking news, conference, Florida, Gainesville, ISBA, poster session, registration fees, University of Florida on April 3, 2019 by xi'an
The registration page is now open for BayesComp 20, in Gainesville, Florida, 7-10 Jan 2020. The fees are quite moderate, imho, given that they cover all breaks (if not the conference dinner). The deadline for these early rates is August 14. There will also be travel support from various sponsors, with the application deadline being September 20. Contributed sessions will be announced soon, with possible openings for last-minute breakthrough sessions. Calls are still open for tutorials on 07 January and for posters.
IMS workshop [day 3]
Posted in pictures, R, Statistics, Travel, University life with tags Bayesian computation, Birch, delayed sampling, high dimensions, hypocoercivity, IMS, Institute for Mathematical Sciences, Lapland, MCqMC 2018, National University Singapore, nonreversible diffusion, NUS, ODE, partly deterministic processes, probabilistic programming, Rao-Blackwellisation, Rennes, Singapore, Wang-Landau algorithm, workshop on August 30, 2018 by xi'an
I made the “capital” mistake of walking across the entire NUS campus this morning, which is quite green and pretty, but which almost enjoys an additional dimension brought by such intense humidity that one feels one has to wade through it!, a feature I had managed to completely erase from my memory of my previous visit there. Anyway, nothing of any relevance. One talk in the morning was by Markus Eisenbach on tools used by physicists to speed up Monte Carlo methods, like the Wang-Landau flat histogram, towards computing the partition function, or the distribution of the energy levels, definitely addressing issues close to my interests, but somewhat beyond my reach given the different language and stress, as often in physics. (I mean, as often in the physics talks I attend.) One idea that came out clearly to me was to bypass a (flat) histogram target and aim directly at a constant-slope cdf for the energy levels. (But I got scared away by the Fourier transforms!)
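For readers unfamiliar with the Wang-Landau flat-histogram idea mentioned above, here is a minimal sketch on a toy problem, estimating the (log) density of states of the number of ones in a binary string, which should come out proportional to the binomial coefficients. The function name, tuning constants, and toy target are my own illustrative choices, not anything from the talk.

```python
import math
import random

def wang_landau(n=8, log_f_final=1e-4, flat=0.8, seed=0):
    """Minimal Wang-Landau flat-histogram sketch.

    Estimates the log density of states g(k), where k is the number of
    ones in a binary string of length n, so exp(g(k)) should end up
    proportional to the binomial coefficient C(n, k)."""
    rng = random.Random(seed)
    state = [0] * n
    k = 0                       # current energy level (number of ones)
    log_g = [0.0] * (n + 1)     # running log density-of-states estimate
    hist = [0] * (n + 1)        # visit histogram for the current stage
    log_f = 1.0                 # log of the modification factor
    while log_f > log_f_final:
        i = rng.randrange(n)             # propose flipping one random bit,
        k_new = k + (1 - 2 * state[i])   # which moves k by +1 or -1
        # accept with min(1, g(k)/g(k_new)) so visits to the levels flatten out
        if math.log(rng.random()) < log_g[k] - log_g[k_new]:
            state[i] ^= 1
            k = k_new
        log_g[k] += log_f
        hist[k] += 1
        # once the histogram is "flat enough", halve f and start a new stage
        if min(hist) > flat * sum(hist) / len(hist):
            hist = [0] * (n + 1)
            log_f /= 2.0
    return log_g
```

Up to the arbitrary additive constant, the returned log_g should track log C(n, k), which is the sense in which the flat-histogram target delivers the distribution of the energy levels.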
Lawrence Murray then discussed some features of the Birch probabilistic programming language he is currently developing, especially a fairly fascinating concept of delayed sampling, which connects with locally-optimal proposals and Rao-Blackwellisation. Which I plan to get back to later [and hopefully sooner than later!].
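The delayed-sampling idea can be caricatured in a few lines: keep a conjugate random variable in closed form, condition on its observed children analytically (the Rao-Blackwellisation), and only draw a concrete value when the program actually demands one. This is my own toy Normal-Normal illustration of the concept, not Birch code; the class and method names are invented for the sketch.

```python
import math
import random

class DelayedGaussian:
    """Toy sketch of delayed sampling: a Gaussian kept in closed form
    (mean, variance), conditioned analytically on Gaussian observations,
    and realised by simulation only when its value is demanded."""

    def __init__(self, mean, var):
        self.mean, self.var = mean, var
        self.sampled = None

    def observe(self, y, noise_var):
        # conjugate Normal-Normal update: no sampling, hence Rao-Blackwellised
        assert self.sampled is None, "already realised, cannot marginalise"
        prec = 1.0 / self.var + 1.0 / noise_var
        self.mean = (self.mean / self.var + y / noise_var) / prec
        self.var = 1.0 / prec

    def value(self, rng=random):
        # the draw is delayed until this point, then cached
        if self.sampled is None:
            self.sampled = rng.gauss(self.mean, math.sqrt(self.var))
        return self.sampled
```

For instance, starting from a N(0,1) prior and observing y=2 with unit noise gives the exact posterior N(1, 1/2) without ever simulating the variable, which is also why the scheme yields locally-optimal (posterior) proposals in particle filters.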
In the afternoon, Maria de Iorio gave a talk about the construction of nonparametric priors that create dependence between a sequence of functions, a notion I had not thought of before, with an array of possibilities when using the stick-breaking construction of Dirichlet processes.
And Christophe Andrieu gave a very smooth and helpful entry to partly deterministic Markov processes (PDMPs) in preparation for the talks he is giving next week for the continuation of the workshop at IMS. He started with the guided random walk of Gustafson (1998), which was extended a bit later in the nonreversible paper of Diaconis, Holmes, and Neal (2000). Although I had a vague idea of the contents of these papers, the role of the velocity ν became much clearer. And premonitory of the advances made by the more recent PDMP proposals. There is obviously a continuation with the equally pedagogical talk Christophe gave at MCqMC in Rennes two months [and half the globe] ago, but the focus being somewhat different, it really felt like a new talk [my short-term memory may also play some role in this feeling!, as I now remember the discussion of Hildebrand (2002) for nonreversible processes]. An introduction to the topic I would recommend to anyone interested in this new branch of Monte Carlo simulation! To be followed by the most recently arXived hypocoercivity paper by Christophe and coauthors.
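To make the role of the velocity ν concrete, here is a minimal one-dimensional sketch of Gustafson's (1998) guided random walk: the chain is lifted with a direction v in {-1, +1} that is kept as long as proposals are accepted and only flipped on a rejection, which is what breaks reversibility. The target (a standard Normal), step distribution, and names are my own illustrative choices.

```python
import math
import random

def guided_walk(log_pdf, x0=0.0, step=1.0, n_steps=20000, seed=1):
    """One-dimensional sketch of Gustafson's (1998) guided random walk.

    The state is augmented with a velocity v in {-1, +1}: proposals keep
    moving in the direction v while accepted, and v only flips on a
    rejection, suppressing the diffusive back-and-forth of plain
    random-walk Metropolis."""
    rng = random.Random(seed)
    x, v = x0, 1
    chain = []
    for _ in range(n_steps):
        y = x + v * step * rng.random()  # directed proposal of random length
        # usual Metropolis ratio; the velocity flip keeps the target invariant
        # on the lifted space while making the chain nonreversible
        if math.log(rng.random()) < log_pdf(y) - log_pdf(x):
            x = y        # accept: keep the current direction
        else:
            v = -v       # reject: reverse direction
        chain.append(x)
    return chain
```

The same lifting of the state space by a persistent velocity is exactly what the recent PDMP samplers push to the continuous-time limit.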
positions at QUT stats
Posted in Statistics with tags academic position, Australia, Bayesian computation, Brisbane, data science, Gold Coast, postdoctoral position, Queensland University of Technology, QUT on September 4, 2017 by xi'an
Chris Drovandi sent me the information that the Statistics Group at QUT, Brisbane, is advertising three positions:
- Professor in Statistical Data Science (remuneration package from $AUD206,729 per year)
- Lecturer in Statistical Data Science (remuneration package of $AUD108,796 to $AUD129,209 per year)
- PhD Scholarship in Computational Bayesian Statistics ($AUD35,000 per year tax-free for 3 years, with an additional $AUD5,000 per year for project costs)
This is a great opportunity, in a very active group and a great location, which I have visited several times, so if interested apply before October 1.
Bayesian model selection without evidence
Posted in Books, Statistics, University life with tags Bayes factor, Bayesian computation, evidence, Metropolis-Hastings algorithm, Monte Carlo Statistical Methods, normalising constant, Peter Green, reversible jump MCMC on September 20, 2016 by xi'an
“The new method circumvents the challenges associated with accurate evidence calculations by computing posterior odds ratios using Bayesian parameter estimation”
One paper leading to another, I had a look at Hee et al.'s 2015 paper on Bayes factor estimation. The “novelty” stands in introducing the model index as an extra parameter in a single model encompassing all models under comparison, the “new” parameterisation being in (θ,n) rather than in θ. With the distinction that the parameter θ is now made of the union of all parameters across all models. Which reminds us very much of the Carlin and Chib (1995) approach to the problem. (Peter Green, in his Biometrika (1995) paper on reversible jump MCMC, uses instead a direct sum of parameter spaces.) The authors indeed suggest simulating jointly (θ,n) in an MCMC or nested sampling scheme. Rather than being updated by arbitrary transforms as in Carlin and Chib (1995), the useless parameters from the other models are kept constant… The goal being to estimate P(n|D), the marginal posterior on the model index, aka the posterior probability of model n.
Now, I am not quite certain that keeping the other parameters constant is a valid move: given a uniform prior on n and an equally uniform proposal, the acceptance probability simplifies into the regular Metropolis-Hastings ratio for model n. Hence the move is valid within model n. If it is rejected, I presume the previous pair (θ⁰,n⁰) is repeated. Wait!, actually, this is slightly more elaborate: if a new value of n, m, is proposed, then the acceptance ratio involves the posteriors for both n⁰ and m, possibly only the likelihoods when the proposal is the prior. So the move will directly depend on the likelihood ratio in this simplified case, which indicates the scheme could be correct after all. Except that this neglects the measure-theoretic subtleties that led to reversible jump symmetry and hence makes me wonder. In other words, it follows exactly the same pattern as reversible jump without the constraints of the latter… Free lunch, anyone?!
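To see the product-space construction in action, here is a toy Carlin and Chib (1995)-style sampler on the joint (θ,n) space, with the pseudo-prior for the inactive parameter taken equal to its prior, in which case the model-switching acceptance ratio does reduce to the likelihood ratio. Note this is not exactly the scheme under discussion (which keeps the inactive parameters constant rather than refreshing them); the models, data, and names are all my own illustration.

```python
import math
import random

def model_switch_mcmc(y, n_iter=50000, seed=2):
    """Toy Carlin & Chib (1995)-style sampler on the joint (mu, n) space,
    with the pseudo-prior for the inactive parameter set to its prior.

    Model 1: y_i ~ N(0, 1), no free parameter.
    Model 2: y_i ~ N(mu, 1), with prior mu ~ N(0, 1).
    Returns the Monte Carlo estimate of P(n = 2 | y)."""
    rng = random.Random(seed)
    m, s, s2 = len(y), sum(y), sum(v * v for v in y)

    def loglik(n, mu):
        # common (2*pi)^(-m/2) factor dropped: only ratios matter
        if n == 1:
            return -0.5 * s2
        return -0.5 * (s2 - 2 * mu * s + m * mu * mu)

    mu, n, in_model_2 = 0.0, 1, 0
    for _ in range(n_iter):
        if n == 1:
            mu = rng.gauss(0.0, 1.0)     # inactive: refresh from pseudo-prior
        else:
            prop = mu + rng.gauss(0.0, 0.5)  # within-model MH step on mu
            dlog = (loglik(2, prop) - 0.5 * prop * prop) \
                 - (loglik(2, mu) - 0.5 * mu * mu)
            if math.log(rng.random()) < dlog:
                mu = prop
        # propose flipping the model index; with uniform prior on n and
        # pseudo-prior = prior, the acceptance ratio is the likelihood ratio
        cand = 3 - n
        if math.log(rng.random()) < loglik(cand, mu) - loglik(n, mu):
            n = cand
        in_model_2 += (n == 2)
    return in_model_2 / n_iter
```

In this conjugate toy case the answer is available in closed form, P(2|y) = B/(1+B) with B = (m+1)^(-1/2) exp{(Σy)²/2(m+1)}, so the sampler can be checked directly; the interesting (and worrying) step is precisely what replaces the prior refresh when the inactive parameters are instead frozen.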
Statistics & Computing [toc]
Posted in Books, Statistics with tags academic journals, Bayesian computation, Monte Carlo Statistical Methods, Springer-Verlag, Statistics and Computing on June 29, 2016 by xi'an
The latest [June] issue of Statistics & Computing is full of interesting Bayesian and Monte Carlo entries, some of which are even open access!

Computation of Gaussian orthant probabilities in high dimension
James Ridgway, pages 899-916
afternoon on Bayesian computation
Posted in Statistics, Travel, University life with tags advanced Monte Carlo methods, Antonietta Mira, Arnaud Doucet, Bayesian computation, CRiSM, estimating a constant, Ingmar Schuster, Monte Carlo Statistical Methods, pub, United Kingdom, Université Paris-Dauphine, University of Oxford, University of Reading, University of Warwick on April 6, 2016 by xi'an
Richard Everitt organises an afternoon workshop on Bayesian computation in Reading, UK, on April 19, the day before the Estimating Constants workshop in Warwick, following a successful afternoon last year. Here is the programme:
12:30-13:15 Antonietta Mira, Università della Svizzera italiana
13:15-13:45 Ingmar Schuster, Université Paris-Dauphine
13:45-14:15 François-Xavier Briol, University of Warwick
14:15-14:45 Jack Baker, University of Lancaster
14:45-15:15 Alexander Mihailov, University of Reading
15:15-15:45 Coffee break
15:45-16:30 Arnaud Doucet, University of Oxford
16:30-17:00 Philip Maybank, University of Reading
17:00-17:30 Elske van der Vaart, University of Reading
17:30-18:00 Reham Badawy, Aston University
18:15-late Pub and food (SCR, UoR campus)
and the general abstract:
The Bayesian approach to statistical inference has seen major successes in the past twenty years, finding applications in many areas of science, engineering, finance, and elsewhere. The main drivers of these successes were developments in Monte Carlo methods and the wide availability of desktop computers. More recently, the use of standard Monte Carlo methods has become infeasible due to the size and complexity of the data now available. This has been countered by the development of next-generation Monte Carlo techniques, which are the topic of this meeting.
The meeting takes place in the Nike Lecture Theatre, Agriculture Building [building number 59].