## Archive for CRiSM

## contemporary issues in hypothesis testing

Posted in pictures, Statistics, Travel, University life with tags Andrew Gelman, Bayes factors, Bayesian foundations, Bayesian statistics, Coventry, CRiSM, England, Fall, hypothesis testing, Jim Berger, Joris Mulder, statistical tests, University of Warwick, workshop on May 3, 2016 by xi'an

**N**ext Fall, on 15-16 September, I will take part in a CRiSM workshop on hypothesis testing in our department at Warwick. Registration is now open [until Sept 2], with a moderate registration fee of £40 and a call for posters. Jim Berger and Joris Mulder will both deliver a plenary talk there, while Andrew Gelman will alas give a remote talk from New York. (A terrific poster, by the way!)

## estimating constants [impression soleil levant]

Posted in pictures, Running, Statistics, Travel, University life with tags adiabatic Monte Carlo, bridge sampling, Coventry, CRiSM, Gibbs random field, harmonic mean estimator, Ising model, noise-contrastive estimation, normalising constant, particle filters, path sampling, power posterior, pseudo-marginal MCMC, thermodynamic integration, University of Warwick on April 25, 2016 by xi'an

**T**he CRiSM workshop on estimating constants, which took place here in Warwick from April 20 till April 22, was quite enjoyable *[says most objectively one of the organisers!]*, with all speakers present to deliver their talks (!) and around sixty participants, including 17 posters. It remains an exciting aspect of the field that so many and such different perspectives are available on the “doubly intractable” problem of estimating a normalising constant. Several talks and posters concentrated on Ising models, which always sound a bit artificial to me, but are also perfect testing grounds for approximations to classical algorithms.

On top of [clearly interesting!] talks associated with papers I had already read [and commented here], I had not previously heard about Pierre Jacob’s coupling SMC sequence, which paper is not yet out *[no spoiler then!]*. Or about Michael Betancourt’s adiabatic Monte Carlo and its connection with the normalising constant. Nicolas Chopin talked about the unnormalised Poisson process I discussed a while ago, with this feature that the normalising constant itself becomes an additional parameter. And that integration can be replaced with (likelihood) maximisation. The approach, which is based on a reference distribution (and an artificial logistic regression à la Geyer), reminded me of bridge sampling. And indirectly of path sampling, esp. when Merrilee Hurn gave us a very cool introduction to power posteriors in the following talk. Also mentioning the controlled thermodynamic integration of Chris Oates and co-authors I discussed a while ago. (Too bad that Chris Oates could not make it to this workshop!) And also pointing out that thermodynamic integration could be a feasible alternative to nested sampling.
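For readers unfamiliar with power posteriors, here is a minimal sketch of thermodynamic integration (my own illustration, not taken from Merrilee Hurn's talk) on a toy conjugate Gaussian model, chosen so that every power posterior can be sampled exactly and the evidence is known in closed form:

```python
import numpy as np

rng = np.random.default_rng(0)
x = 1.5  # single observation, model x ~ N(theta, 1), prior theta ~ N(0, 1)

def loglik(theta):
    return -0.5 * np.log(2 * np.pi) - 0.5 * (x - theta) ** 2

# power posterior p_t(theta) ∝ N(x; theta, 1)^t × N(theta; 0, 1):
# here it is Gaussian with precision 1 + t and mean t·x/(1 + t),
# so each rung can be sampled exactly instead of running MCMC
ts = np.linspace(0.0, 1.0, 50)
means = []
for t in ts:
    prec = 1.0 + t
    theta = rng.normal(t * x / prec, np.sqrt(1.0 / prec), size=20_000)
    means.append(loglik(theta).mean())

# thermodynamic identity: log Z = ∫₀¹ E_{p_t}[log L] dt (trapezoidal rule)
means = np.array(means)
log_Z_ti = float(np.sum(np.diff(ts) * (means[1:] + means[:-1]) / 2))

# exact evidence for this conjugate toy model: x ~ N(0, 2) marginally
log_Z_exact = -0.5 * np.log(2 * np.pi * 2.0) - x ** 2 / 4.0
print(log_Z_ti, log_Z_exact)
```

In realistic models the exact sampling step is replaced by an MCMC run at each temperature, which is precisely where the computing budget goes.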

Another novel aspect was found in Yves Atchadé’s talk about sparse high-dimensional matrices with priors made of mutually exclusive measures and quasi-likelihood approximations. A simplified version of the approach consists in taking a non-identified, non-constrained matrix that is later projected onto one of those measure supports. While I was aware of his noise-contrastive estimation of normalising constants, I had not previously heard Michael Gutmann give a talk on that approach (linking to Geyer’s 1994 mythical paper!). And I do remain nonplussed at the possibility of including the normalising constant as an additional parameter [in a computational and statistical sense]…! Both Chris Sherlock and Christophe Andrieu talked about novel aspects of pseudo-marginal techniques, Chris on the lack of variance reduction brought by averaging unbiased estimators of the likelihood and Christophe on the case of large datasets, recovering better performance in latent variable models by estimating the ratio rather than taking a ratio of estimators. (With Christophe pointing out that this was an exceptional case when harmonic mean estimators could be considered!)
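The pseudo-marginal principle underlying both talks can be sketched on a toy latent-variable model (my own illustration, with a plain importance-sampling estimate of the likelihood standing in for the more sophisticated estimators discussed there): plugging a non-negative unbiased likelihood estimator into the Metropolis-Hastings ratio still targets the exact posterior.

```python
import numpy as np

rng = np.random.default_rng(1)
y = 2.0  # model: y | z ~ N(z, 1), z | theta ~ N(theta, 1), prior theta ~ N(0, 25)

def lik_hat(theta, n=32):
    # unbiased estimator of the "intractable" likelihood p(y | theta),
    # averaging over simulated latent variables z
    z = rng.normal(theta, 1.0, size=n)
    return np.mean(np.exp(-0.5 * (y - z) ** 2) / np.sqrt(2 * np.pi))

def log_prior(theta):
    return -theta ** 2 / (2 * 25.0)

# pseudo-marginal Metropolis-Hastings: the noisy likelihood estimate is
# recycled in the acceptance ratio, which leaves the exact posterior invariant
theta, L = 0.0, lik_hat(0.0)
chain = []
for _ in range(20_000):
    prop = theta + rng.normal(0.0, 1.0)
    L_prop = lik_hat(prop)
    if np.log(rng.uniform()) < np.log(L_prop) - np.log(L) + log_prior(prop) - log_prior(theta):
        theta, L = prop, L_prop
    chain.append(theta)

post = np.array(chain[5_000:])
# exact posterior in this conjugate toy model: theta | y ~ N(25y/27, 50/27)
print(post.mean(), 25 * y / 27)
```

Averaging several such estimators, as in Chris Sherlock's talk, only changes the distribution of `lik_hat`, not the validity of the algorithm.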

## afternoon on Bayesian computation

Posted in Statistics, Travel, University life with tags advanced Monte Carlo methods, Antonietta Mira, Arnaud Doucet, Bayesian computation, CRiSM, estimating a constant, Ingmar Schuster, Monte Carlo Statistical Methods, pub, United Kingdom, Université Paris Dauphine, University of Oxford, University of Reading, University of Warwick on April 6, 2016 by xi'an

**R**ichard Everitt organises an afternoon workshop on Bayesian computation in Reading, UK, on April 19, the day before the Estimating Constants workshop in Warwick, following a successful afternoon last year. Here is the programme:

1230-1315 Antonietta Mira, Università della Svizzera italiana

1315-1345 Ingmar Schuster, Université Paris-Dauphine

1345-1415 Francois-Xavier Briol, University of Warwick

1415-1445 Jack Baker, University of Lancaster

1445-1515 Alexander Mihailov, University of Reading

1515-1545 Coffee break

1545-1630 Arnaud Doucet, University of Oxford

1630-1700 Philip Maybank, University of Reading

1700-1730 Elske van der Vaart, University of Reading

1730-1800 Reham Badawy, Aston University

1815-late Pub and food (SCR, UoR campus)

and the general abstract:

The Bayesian approach to statistical inference has seen major successes in the past twenty years, finding application in many areas of science, engineering, finance and elsewhere. The main drivers of these successes were developments in Monte Carlo methods and the wide availability of desktop computers. More recently, the use of standard Monte Carlo methods has become infeasible due to the size and complexity of data now available. This has been countered by the development of next-generation Monte Carlo techniques, which are the topic of this meeting.

The meeting takes place in the Nike Lecture Theatre, Agriculture Building [building number 59].

## CRiSM workshop on estimating constants [#2]

Posted in pictures, Statistics, Travel, University life, Wines with tags Bayesian computing, CRiSM, evidence, Monte Carlo Statistical Methods, normalising constant, partition, poster session, University of Warwick, workshop, Zeeman building on March 31, 2016 by xi'an

**T**he schedule for the CRiSM workshop on estimating constants that Nial Friel, Helen Ogden and I host next April 20-22 at the University of Warwick is now set as follows. (The plain registration fee is £40 and accommodation on the campus is available through the online form.)

**April 20, 2016**

11:45 — 12:30: Adam Johansen

*12:30 — 14:00: Lunch*

14:00 — 14:45: Anne-Marie Lyne

14:45 — 15:30: Pierre Jacob

*15:30 — 16:00: Break*

16:00 — 16:45: Roberto Trotta

17:00 — 18:00: ‘Elevator’ talks

18:00 — 20:00: Poster session, Cheese and wine

**April 21, 2016**

9:00 — 9:45: Michael Betancourt

9:45 — 10:30: Nicolas Chopin

*10:30 — 11:00: Coffee break*

11:00 — 11:45: Merrilee Hurn

11:45 — 12:30: Jean-Michel Marin

*12:30 — 14:00: Lunch*

14:00 — 14:45: Sumit Mukherjee

14:45 — 15:30: Yves Atchadé

*15:30 — 16:00: Break*

16:00 — 16:45: Michael Gutmann

16:45 — 17:30: Panayiota Touloupou

*19:00 — 22:00: Dinner*

**April 22, 2016**

9:00 — 9:45: Chris Sherlock

9:45 — 10:30: Christophe Andrieu

*10:30 — 11:00: Coffee break*

11:00 — 11:45: Antonietta Mira

## contemporary issues in hypothesis testing

Posted in pictures, Statistics, Travel, University life with tags AR models, Bayes factor, Bayesian hypothesis testing, CRiSM, Glencoe, skyrace, University of Warwick, workshop on March 2, 2016 by xi'an

**A**t Warwick U., CRiSM is sponsoring another workshop, this time on hypothesis testing, next Sept. 15 and 16 (just before the Glencoe race!). Registration and poster submission are already open. Most obviously, given my current interests, I am quite excited by the prospect of taking part in this workshop (and sorry that Andrew can only take part by teleconference!).

## conference deadlines [register now!!]

Posted in Kids, pictures, Statistics, Travel, University life with tags AISTATS 2016, Cadiz, Coventry, CRiSM, ISBA 2016, Italy, registration fees, Santa Margherita di Pula, Sardinia, Spain, University of Warwick on February 11, 2016 by xi'an

**R**egistration is now open for our [fabulous!] CRiSM workshop on estimating [normalising] constants, in Warwick, on April 20-22 this year. While it is almost free (*almost* as in £40.00!), we strongly suggest you register asap, if only to secure a bedroom on the campus at a moderate rate of £55.00 per night (breakfast included!). Plus, we would like to organise the poster session(s) and the associated “elevator” talks for the poster presenters.

While the deadline for early registration at AISTATS is now truly over, we also encourage all researchers interested in this [great] conference to register as early as possible, if only [again] to secure a room at the conference location, the Parador Hotel in Cádiz. (Otherwise, there are plenty of rentals in the neighbourhood.)

Last but not least, the early registration for ISBA 2016 in Santa Margherita di Pula, Sardinia, is still open till February 29, after which the rate moves immediately to late registration fees. The same deadline applies to bedroom reservations in the resort, with apparently only a few rooms left for some of the nights. Rentals and hotels around are also getting filled rather quickly.

## Bayesian model comparison with intractable constants

Posted in Books, Kids, pictures, Statistics, Travel, University life with tags ABC, auxiliary variable, bias vs. variance, CRiSM, estimating constants, importance sampling, Monte Carlo Statistical Methods, normalising constant, pseudo-marginal MCMC, SMC, unbiased estimation, University of Warwick on February 8, 2016 by xi'an

**R**ichard Everitt, Adam Johansen (Warwick), Ellen Rowing and Melina Evdemon-Hogan have updated [on arXiv] a survey paper on the computation of Bayes factors in the presence of intractable normalising constants. Apparently destined for *Statistics and Computing*, judging from the style. A great entry, in particular for those attending the CRiSM workshop Estimating Constants in a few months!

A question that came to me from reading the introduction to the paper is why a method like Møller et al.’s (2006) auxiliary variable trick should be considered more “exact” than the pseudo-marginal approach of Andrieu and Roberts (2009), since the latter can equally be seen as an auxiliary variable approach. The answer was on the next page (!), as it is indeed a special case of Andrieu and Roberts (2009). Murray et al. (2006) also belongs to this group, with a product-type importance sampling estimator based on a sequence of tempered intermediaries… As noted by the authors, there is a whole spectrum of related methods in this area, some qualifying as exact-approximate, others as inexact-approximate or noisy versions.
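The product-type estimator can be sketched as a “stepping stone” decomposition on a toy conjugate Gaussian model (an illustration of the generic idea, not of Murray et al.’s specific scheme): the overall constant is written as a product of ratios between consecutive tempered distributions, each ratio being estimated by importance sampling from the previous one.

```python
import numpy as np

rng = np.random.default_rng(2)
x = 1.5  # toy model: x ~ N(theta, 1), prior theta ~ N(0, 1), Z = N(x; 0, 2)

def loglik(theta):
    return -0.5 * np.log(2 * np.pi) - 0.5 * (x - theta) ** 2

# Z = ∏_k Z_{t_k} / Z_{t_{k-1}}, each ratio estimated by importance
# sampling from the tempered distribution p_{t_{k-1}} ∝ prior × L^{t_{k-1}}
# (a Gaussian here, with precision 1 + t, so it can be sampled exactly)
ts = np.linspace(0.0, 1.0, 21)
log_Z = 0.0
for t0, t1 in zip(ts[:-1], ts[1:]):
    prec = 1.0 + t0
    theta = rng.normal(t0 * x / prec, np.sqrt(1.0 / prec), size=10_000)
    log_Z += np.log(np.mean(np.exp((t1 - t0) * loglik(theta))))  # L^{t1-t0} weights

log_Z_exact = -0.5 * np.log(2 * np.pi * 2.0) - x ** 2 / 4.0
print(log_Z, log_Z_exact)
```

The appeal of the decomposition is that each incremental weight L^{t1-t0} is nearly constant when the temperature steps are small, so each ratio is estimated with low variance.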

Their main argument is to support importance sampling as the method of choice, including sequential Monte Carlo (SMC) for large dimensional parameters. The auxiliary variable of Møller et al. (2006) is then part of the importance scheme. In the first toy example, a Poisson is opposed to a Geometric distribution, as in our ABC model choice papers, for which a multiple auxiliary variable approach dominates both ABC and Simon Wood’s synthetic likelihood for a given computing cost. I did not spot which artificial choice was made for the Z(θ)’s in both models, since the constants are entirely known in those densities. A very interesting section of the paper is when envisioning *biased* approximations to the intractable density, if only because the importance weights are most often biased due to the renormalisation (possibly by resampling), and because the variance derivations are then intractable as well. Due to this intractability, the paper can only approach the impact of those approximations via empirical experiments. This however leads to the question of how to evaluate the validity of the approximation in settings where the truth and even its magnitude are unknown… Cross-validation and bootstrap type evaluations may prove too costly in realistic problems. Using biased solutions thus mostly remains an open problem in my opinion.

The SMC part of the paper is equally interesting, if only because it focuses on the data-thinning idea studied by Chopin (2002) and many other papers in recent years. This made me wonder why an alternative relying on a sequence of approximations to the target with *tractable* normalising constants could not be considered. A whole sequence of auxiliary variable completions sounds highly demanding in terms of computing budget and also requires a corresponding sequence of calibrations. (Now, ABC fares no better, since it requires heavy simulations and repeated calibrations, while further exhibiting a damning missing link with the target density.) Unfortunately, embarking upon a theoretical exploration of the properties of approximate SMC is quite difficult, as shown by the strong assumptions made in the paper to bound the total variation distance to the true target.
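A generic SMC sampler of the kind alluded to above can be sketched as follows on a toy conjugate Gaussian target (a tempering illustration of my own, not the data-thinning scheme of the paper): reweight through a sequence of tempered targets, resample, rejuvenate with a Metropolis-Hastings move, and accumulate the normalising constant from the incremental weights.

```python
import numpy as np

rng = np.random.default_rng(4)
x = 1.5  # toy model: x ~ N(theta, 1), prior theta ~ N(0, 1), Z = N(x; 0, 2)

def loglik(theta):
    return -0.5 * np.log(2 * np.pi) - 0.5 * (x - theta) ** 2

N = 5_000
theta = rng.normal(0.0, 1.0, size=N)  # particles from the prior (t = 0)
log_Z = 0.0
ts = np.linspace(0.0, 1.0, 21)
for t0, t1 in zip(ts[:-1], ts[1:]):
    # incremental weights L^(t1 - t0), whose average estimates Z_{t1}/Z_{t0}
    logw = (t1 - t0) * loglik(theta)
    log_Z += np.log(np.mean(np.exp(logw)))
    # multinomial resampling according to the normalised weights
    w = np.exp(logw - logw.max())
    theta = rng.choice(theta, size=N, p=w / w.sum())
    # one MH rejuvenation step targeting p_{t1} ∝ prior × L^{t1}
    prop = theta + rng.normal(0.0, 0.5, size=N)
    log_ratio = t1 * (loglik(prop) - loglik(theta)) - 0.5 * (prop ** 2 - theta ** 2)
    accept = np.log(rng.uniform(size=N)) < log_ratio
    theta = np.where(accept, prop, theta)

log_Z_exact = -0.5 * np.log(2 * np.pi * 2.0) - x ** 2 / 4.0
print(log_Z, log_Z_exact)
```

Data thinning replaces the tempering sequence by partial posteriors built on growing subsets of the data, but the reweight-resample-move skeleton and the running estimate of the constant are the same.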