Archive for mostly Monte Carlo seminar

MCMC without evaluating the target [aatB-mMC joint seminar, 24 April]

Posted in pictures, Statistics, Travel, University life on April 11, 2024 by xi'an

On 24 April 2024, Guanyang Wang (Rutgers University, visiting ESSEC) will give a joint All about that Bayes – mostly Monte Carlo seminar on

MCMC when you do not want to evaluate the target distribution

In sampling tasks, the target distribution is typically known only up to a normalizing constant. In many situations, however, evaluating even the unnormalized target can be costly or infeasible. This issue arises, for instance, when sampling from a Bayesian posterior for a large dataset, or from a ‘doubly intractable’ distribution. We provide a framework that unifies various MCMC algorithms, including several minibatch MCMC algorithms and the exchange algorithm. This framework not only simplifies the theoretical analysis of existing algorithms but also leads to new ones. Similar frameworks exist in the literature, but they concentrate on different objectives.
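Since the abstract mentions the exchange algorithm as one instance of MCMC without target evaluation, here is a minimal sketch of it (not from the talk): a toy Gaussian model where the likelihood normalizing constant Z(θ) is deliberately never computed, only exact simulation from the model being assumed available. The model, names, and tuning are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: f(y; theta) = exp(-(y - theta)^2 / 2) / Z(theta).
# Z(theta) happens to be constant here, but the algorithm below never
# touches it: both Z terms cancel in the acceptance ratio.
def log_f(y, theta):
    """Unnormalized log-likelihood."""
    return -0.5 * (y - theta) ** 2

def sample_from_model(theta):
    """Exact simulation from the model, the key requirement of exchange."""
    return theta + rng.standard_normal()

y_obs = 2.0        # single observed datum, flat prior on theta
theta = 0.0
chain = []
for _ in range(20000):
    prop = theta + rng.standard_normal()   # symmetric random-walk proposal
    y_aux = sample_from_model(prop)        # auxiliary data at proposed value
    # exchange acceptance ratio: the intractable constants Z(.) cancel
    log_alpha = (log_f(y_obs, prop) + log_f(y_aux, theta)
                 - log_f(y_obs, theta) - log_f(y_aux, prop))
    if np.log(rng.random()) < log_alpha:
        theta = prop
    chain.append(theta)

post = np.array(chain[5000:])   # discard burn-in; posterior here is N(2, 1)
```

Under a flat prior the posterior is N(y_obs, 1), so the chain mean should settle near 2 even though the acceptance step only ever evaluates the unnormalized f.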

The talk takes place at 4pm CEST, in room 8 at PariSanté Campus, Paris 15.

mostly MC [April]

Posted in Books, Kids, Statistics, University life on April 5, 2024 by xi'an

mostly M[ar]C[h]

Posted in Books, Kids, Statistics, University life on February 27, 2024 by xi'an

mostly MC[bruary]

Posted in Books, Kids, Statistics, University life on February 18, 2024 by xi'an

combining normalizing flows and QMC

Posted in Books, Kids, Statistics on January 23, 2024 by xi'an

My PhD student Charly Andral [presented at the mostly Monte Carlo seminar and] arXived a new preprint yesterday, on training a normalizing flow network as an importance sampler (as in Gabrié et al.) or an independent Metropolis proposal, and exploiting its invertibility to feed it quasi-Monte Carlo low-discrepancy sequences and boost its efficiency. (Training the flow is not covered by the paper.) This extends the recent study of He et al. (presented at MCM 2023 in Paris) to the normalizing flow setting. In the current experiments, the randomized QMC samples are computed with the SciPy package (Roy et al. 2023), where the Sobol’ sequence is based on Joe and Kuo (2008), with scrambling after Matoušek (1998), and the Halton sequence is based on Owen (2017). (No pure QMC was harmed in the process!) The flows are constructed with the FlowMC package. As expected, the QMC version brings a significant improvement in the quality of the Monte Carlo approximations for equivalent computing times, with however a rapid decrease in efficiency as the dimension of the targeted distribution increases. On the other hand, the architecture of the flow makes little difference, while the type of RQMC sequence does, the advantage apparently going to the scrambled Sobol’ sequence.
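The mechanism at play can be sketched in a few lines of SciPy, with the obvious caveat that this is not the paper's code: a scrambled Sobol' sequence is pushed through an invertible map into proposal space, and the resulting points are reweighted by (self-normalized) importance sampling. For the sketch, the trained flow's inverse is replaced by the Gaussian inverse CDF, so the proposal is simply a standard normal, and the target is a hypothetical shifted Gaussian known up to a constant.

```python
import numpy as np
from scipy.stats import norm, qmc

# Scrambled Sobol' points in [0,1)^d: the RQMC input of the method
d = 2
sobol = qmc.Sobol(d=d, scramble=True, seed=42)
u = sobol.random_base2(m=13)          # 2^13 = 8192 points

# Stand-in for the trained flow's inverse map: the Gaussian inverse CDF,
# turning the low-discrepancy points into proposal draws
x = norm.ppf(u)

# Unnormalized target: N((1,1), I) with its normalizing constant dropped
log_target = -0.5 * np.sum((x - 1.0) ** 2, axis=1)
log_proposal = np.sum(norm.logpdf(x), axis=1)

# Self-normalized importance sampling estimate of E[X_1] (= 1 here)
w = np.exp(log_target - log_proposal)
est = np.sum(w * x[:, 0]) / np.sum(w)
```

Replacing `sobol.random_base2(m=13)` with i.i.d. uniforms recovers plain importance sampling, which is the comparison the experiments make; the `qmc.Halton` generator can be swapped in the same way.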