## Archive for MCMC

## coordinate sampler on-line

Posted in Statistics with tags coordinate sampler, Gibbs sampler, MCMC, non-reversible diffusion, PDMP, piecewise deterministic, Statistics & Computing on March 13, 2020 by xi'an

## unbiased MCMC with couplings [4pm, 26 Feb., Paris]

Posted in Books, pictures, Statistics, University life with tags AgroParisTech, All about that Bayes, burns, Claude Bernard, Harvard University, maximal coupling, MCMC, Paris, PSL, seminar, unbiased MCMC, Université Paris Dauphine on February 24, 2020 by xi'an

**O**n Wednesday, 26 February, Pierre Jacob (Harvard U, currently visiting Paris-Dauphine) is giving a seminar on unbiased MCMC methods with couplings at AgroParisTech, bvd Claude Bernard, Paris 5ème, Room 32, at 4pm, as part of the All about that Bayes seminar series.

MCMC methods yield estimators that converge to integrals of interest in the limit of the number of iterations. This iterative asymptotic justification is not ideal: first, it stands at odds with current trends in computing hardware, with increasingly parallel architectures; second, the choice of “burn-in” or “warm-up” is arduous. This talk will describe recently proposed estimators that are unbiased for the expectations of interest while having a finite computing cost and a finite variance. They can thus be generated independently in parallel and averaged over. The method also provides practical upper bounds on the distance (e.g. total variation) between the marginal distribution of the chain at a finite step and its invariant distribution. The key idea is to generate “faithful” couplings of Markov chains, whereby pairs of chains coalesce after a random number of iterations. This talk will provide an overview of this line of research.
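To make the abstract more concrete, here is a minimal R sketch of the coupled-chains construction for a standard normal target with random-walk Metropolis proposals. The target, proposal scale, and all function names are illustrative choices of mine, not material from the talk: two chains run with a maximal coupling of their proposals and a shared uniform in the accept step, so that they coalesce exactly and stay together afterwards, and the tail of the chain is replaced by a telescoping correction term.

```r
# standard normal target (any log-density would do in this sketch)
log_target <- function(x) -x^2 / 2

# maximal coupling of N(mu1, s) and N(mu2, s): marginally correct draws
# that are equal with the largest possible probability
max_coupling <- function(mu1, mu2, s) {
  p <- rnorm(1, mu1, s)
  if (runif(1) <= dnorm(p, mu2, s) / dnorm(p, mu1, s)) return(c(p, p))
  repeat {
    q <- rnorm(1, mu2, s)
    if (runif(1) > dnorm(q, mu1, s) / dnorm(q, mu2, s)) return(c(p, q))
  }
}

# single random-walk Metropolis step
mh_step <- function(x, s) {
  prop <- rnorm(1, x, s)
  if (log(runif(1)) < log_target(prop) - log_target(x)) prop else x
}

# coupled kernel: the shared uniform in the accept step makes the coupling
# "faithful" -- once the two chains meet, they never separate again
coupled_step <- function(x, y, s) {
  prop <- max_coupling(x, y, s)
  lu <- log(runif(1))
  c(if (lu < log_target(prop[1]) - log_target(x)) prop[1] else x,
    if (lu < log_target(prop[2]) - log_target(y)) prop[2] else y)
}

# unbiased estimator H_k = h(X_k) + sum_{l=k+1}^{tau-1} {h(X_l) - h(Y_{l-1})},
# with the X chain run one step ahead of the Y chain (lag one)
unbiased_estimate <- function(h, k = 10, s = 1, max_iter = 1e4) {
  X <- rnorm(1); Y <- rnorm(1)            # independent starting points
  X <- c(X, mh_step(X[1], s))             # advance X one step ahead
  t <- 1
  while (X[t + 1] != Y[t] && t < max_iter) {
    z <- coupled_step(X[t + 1], Y[t], s)
    X <- c(X, z[1]); Y <- c(Y, z[2])
    t <- t + 1
  }
  tau <- t                                # meeting time: X_tau == Y_{tau-1}
  while (length(X) < k + 1) X <- c(X, mh_step(X[length(X)], s))
  est <- h(X[k + 1])                      # h(X_k), with R's 1-based indexing
  if (tau > k + 1)
    for (l in (k + 1):(tau - 1)) est <- est + h(X[l + 1]) - h(Y[l])
  est
}

set.seed(1)
draws <- replicate(1000, unbiased_estimate(identity, k = 20))
mean(draws)   # unbiased for E[X] = 0 under the N(0,1) target
```

Since each call to `unbiased_estimate` has finite expected cost and is unbiased, the calls can be farmed out to parallel workers and simply averaged, which is the point made about parallel architectures in the abstract.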

## Roberto Casarin in Warwick [joint Stats/Econometrics seminar series]

Posted in Statistics with tags autoregressive model, Bayesian econometrics, Coventry, MCMC, PARAFAC, tensor, UK, University of Warwick on February 11, 2020 by xi'an

**M**y friend, coauthor and former student Roberto Casarin (from Ca' Foscari Venezia) is giving a talk tomorrow in Warwick:

Bayesian Dynamic Tensor Regression (joint with Billio, M., Iacopini, M., and Kaufmann, S.)

Tensor-valued data (i.e. multidimensional data) are becoming increasingly available and call for suitable econometric tools. We propose a new dynamic linear regression model for tensor-valued response variables and covariates that encompasses some well-known multivariate models as special cases. We exploit the PARAFAC low-rank decomposition to provide a parsimonious parametrization and to incorporate sparsity effects. Our contribution is twofold: first, we extend multivariate econometric models to account for tensor-valued responses and covariates; second, we define a tensor autoregressive process (TAR) and the associated impulse response function for studying shock propagation. Inference is carried out in the Bayesian framework via Markov chain Monte Carlo (MCMC). We apply the TAR model to the study of time-varying multilayer economic networks concerning international trade and international capital stocks, and provide an impulse response analysis for assessing the propagation of trade and financial shocks across countries, over time, and between layers.

The seminar will take place on Thursday Feb. 13 at 14:00 in OC0.01 (Oculus), University of Warwick, Coventry, UK.
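To see why a low-rank parametrization matters here, one can look at a toy special case (of my own making, not the model of the paper): a bilinear AR(1) for a d x d matrix-valued series, X_t = B1 X_{t-1} B2' + E_t, with rank-one coefficient matrices. An unrestricted VAR(1) on vec(X_t) would need d⁴ coefficients, while PARAFAC-type structures need only a handful of factor vectors.

```r
d <- 10
# parameter counts: unrestricted VAR(1) on vec(X_t) vs. structured forms
c(full = (d^2)^2,        # 10,000 coefficients
  bilinear = 2 * d^2,    # B1 and B2 unrestricted: 200
  rank_one = 4 * d)      # rank-one factors of a 4-way coefficient tensor: 40

set.seed(1)
u <- rnorm(d); v <- rnorm(d)
B1 <- 0.9 * tcrossprod(u) / sum(u^2)   # rank one, spectral radius 0.9
B2 <- 0.9 * tcrossprod(v) / sum(v^2)
X <- matrix(0, d, d)
for (t in 1:500) X <- B1 %*% X %*% t(B2) + matrix(rnorm(d^2), d, d)
# vec(X_t) follows a VAR(1) with coefficient B2 %x% B1, spectral radius
# 0.81 < 1, so the process is stationary despite its tiny parameter count
```

This is only meant to illustrate the parsimony argument in the abstract; the paper's TAR model, its PARAFAC decomposition, and its Bayesian inference scheme are of course richer than this two-matrix sketch.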

## MCMC, with common misunderstandings

Posted in Books, pictures, R, Statistics, University life with tags ABC, Bayesian computing, computational statistics, Gibbs sampling, Handbook of Computational Statistics and Data Science, HMC, IMS Lawrence D. Brown PhD Student Award, MCMC, PhD thesis, Q&A format, Statistics and Computing, survey, variational Bayes methods on January 27, 2020 by xi'an

**A**s I was asked to write a chapter on MCMC methods for a forthcoming *Handbook of Computational Statistics and Data Science*, published by Wiley, rather than cautiously declining!, I decided to recycle the answers I wrote on X validated to what I considered to be the most characteristic misunderstandings about MCMC and other computing methods, using as background the introduction produced by Wu Changye in his PhD thesis. I am now waiting for the opinion of the *Handbook* editors on this Q&A style. The outcome is certainly lighter than other recent surveys, like the one we wrote with Peter Green, Krys Łatuszyński, and Marcelo Pereyra for Statistics and Computing, or the one with Victor Elvira, Nick Tawn, and Changye Wu.

## an elegant sampler

Posted in Books, Kids, R, University life with tags cross validated, MCMC, Metropolis-Hastings algorithm, R, random walk, sampling from an atomic population, simplex, uniform simulation on January 15, 2020 by xi'an

**F**ollowing an X validated question on how to simulate a multinomial vector with a fixed average, W. Huber produced a highly elegant and efficient resolution with the compact R code

```r
tabulate(sample.int((k-1)*n, s-n) %% n + 1, n) + 1
```

where *k* is the number of classes, *n* the number of draws, and *s* is *n* times the fixed average. The R function *sample.int* is an alternative to *sample* that seems faster. Reading the outcome of

```r
sample.int((k-1)*n, s-n)
```

as the nonzero positions in an *n x (k-1)* matrix and adding a row of *n* 1’s leads to a simulation of integers between 1 and *k* by counting the 1’s in each of the *n* columns, which is the meaning of the above picture, where the colour code is added after counting the number of 1’s. Since there are *s* 1’s in this matrix, the sum is automatically equal to *s*. Since the *s-n* positions are chosen uniformly over the *n x (k-1)* locations, the outcome is uniform. The rest of the R code is a brutally efficient way to translate the idea into a function. (By comparison, I brute-forced the question by suggesting a basic Metropolis algorithm.)
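A quick sanity check of the one-liner, with parameter values of my own choosing (*k* = 4 classes, *n* = 10 draws, average 2.5, hence *s* = 25): every draw sums exactly to *s* and each component stays in {1, …, *k*}, since each of the *n* residue classes modulo *n* contains exactly *k-1* of the available positions.

```r
set.seed(1)
k <- 4; n <- 10; s <- 25
draw <- function() tabulate(sample.int((k - 1) * n, s - n) %% n + 1, n) + 1

x <- draw()
sum(x)      # always exactly s = 25
range(x)    # components all lie between 1 and k = 4
```

Replicating `draw()` many times and checking `colSums` confirms the constraint holds on every draw, not merely on average.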