This “week in Warwick” was not chosen at random, as I was aware there was a workshop on non-reversible MCMC going on. (Even though CRiSM sponsored so many workshops in September that almost any week would have worked for the above sentence!) It has always been something of a mystery to me that non-reversibility can make a massive difference in practice, even though I am quite aware that it does, and I can grasp some of the theoretical arguments for why. So it was quite rewarding to sit in this Warwick amphitheatre and learn about overdamped Langevin algorithms and other non-reversible diffusions, to see results where convergence times moved from n to √n, and to grasp some of the appeal of lifting, albeit in finite state spaces. Plus, the cartoon presentation of Hamiltonian Monte Carlo by Michael Betancourt was a great moment, not only because of the satellite bursting into flames on the screen but also because it gave a very welcome intuition about why reversibility is inefficient and HMC appealing. So I am grateful to my two colleagues, Joris Bierkens and Gareth Roberts, for organising this exciting workshop, with a most profitable schedule favouring few but long talks. My next visit to Warwick will also coincide with a workshop, on intractable likelihoods, next November, this time part of the new Alan Turing Institute programme.
At AISTATS 2014, the last (European) edition, I agreed to be program co-chair for AISTATS 2016, along with Arthur Gretton from the Gatsby Unit at UCL. (AISTATS stands for Artificial Intelligence and Statistics.) Thanks to Arthur’s efforts and dedication (the organisation of an AISTATS meeting is far more complex than for any conference I have organised so far!), the meeting is taking shape. First, it will take place in Cádiz, Andalucía, Spain, on May 9-11, 2016. (A place more in keeping with the conference’s palm-tree logo than the previous location in Reykjavik, even though I would be the last one to complain that it took place in Iceland!)
Second, the call for submissions is now open. The process is similar to other machine learning conferences: papers are first submitted for the conference proceedings, then undergo a tight and demanding reviewing process, with a rebuttal period for the authors to reply to the reviewers’ comments, and only the accepted papers can be presented as posters, some of which are further selected for an oral presentation. The major dates for submitting to AISTATS 2016 are:
| Proceedings track paper submission deadline | 23:59 UTC, Oct 9, 2015 |
| Proceedings track initial reviews available | Nov 16, 2015 |
| Proceedings track author feedback deadline | Nov 23, 2015 |
| Proceedings track paper decision notifications | Dec 20, 2015 |
I was quite impressed by the quality and intensity of the AISTATS 2014 conference, which is why I so readily accepted being program co-chair, and hence predict an equally rewarding AISTATS 2016, thus encouraging all interested ‘Og’s readers to consider submitting a paper there! Even though I confess it will make for a rather busy first semester of 2016, between MCMSki V in January, the CIRM Statistics month in February, the CRiSM workshop on Estimating constants in April, AISTATS 2016 in May, and ISBA 2016 in June…
While I have discussed in the past on the ‘Og the difference I see between estimating an unknown parameter of a distribution and evaluating a normalising constant, evaluating such constants, and hence handling [properly] doubly intractable models, is obviously of the utmost importance! For this reason, Nial Friel, Helen Ogden and I have put together a CRiSM workshop on the topic (with the tongue-in-cheek title of Estimating constants!), to be held at the University of Warwick next April 20-22.
The CRiSM workshop will focus on computational methods for approximating challenging normalising constants arising in Monte Carlo, likelihood and Bayesian models. Such methods may be used in a wide range of problems: to compute intractable likelihoods, to find the evidence in Bayesian model selection, and to compute the partition function in physics. The meeting will bring together different communities working on these related problems, some of which have developed original, if little advertised, solutions. It will also highlight the novel challenges associated with large data and highly complex models. Besides a dozen invited talks, the schedule will feature two afternoon poster sessions with short (2-5 minute) oral previews, called ‘Elevator’ talks.
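To fix ideas on what “estimating a constant” means computationally, here is a minimal sketch of the simplest such method, importance sampling, applied to a toy unnormalised density where the true constant is known in closed form. (This is an illustration of the generic problem, not a method from the workshop programme; the target, the N(0,2²) proposal, and the sample size are all choices made for the example.)

```python
import numpy as np

# Unnormalised target: p~(x) = exp(-x^2/2); its normalising constant is sqrt(2*pi)
def p_tilde(x):
    return np.exp(-0.5 * x**2)

# Gaussian importance distribution q = N(0, s^2), chosen wider than the target
def q_pdf(x, s=2.0):
    return np.exp(-0.5 * (x / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

rng = np.random.default_rng(0)
x = rng.normal(0.0, 2.0, size=100_000)   # draws from q
weights = p_tilde(x) / q_pdf(x)          # importance weights p~(x)/q(x)
Z_hat = weights.mean()                   # unbiased estimate of Z = int p~(x) dx

print(Z_hat, np.sqrt(2.0 * np.pi))       # estimate vs. true value, sqrt(2*pi)
```

The estimator is unbiased because E_q[p̃(X)/q(X)] = ∫ p̃(x) dx = Z; the hard part, and the point of the workshop, is that in realistic doubly intractable models no proposal q with manageable weight variance is readily available.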
While 2016 is going to be quite busy with all kinds of meetings (MCMSkv, ISBA 2016, the CIRM Statistics month, AISTATS 2016, …), this should be an exciting two-day workshop, given the ongoing activity in this area, and I thus suggest that interested readers mark the dates in their diaries. I will obviously keep you posted about registration and accommodation once those details become available.
I attended a highly unusual workshop while in Warwick last week. Unusual for me, obviously. It was about probabilistic numerics, i.e., the use of probabilistic or stochastic arguments in the numerical resolution of (possibly) deterministic problems. The approach is fairly Bayesian in that it makes use of prior information or beliefs about the quantity of interest, e.g., a function, to construct a (usually Gaussian) process prior and to derive both an estimator that is identical to a classical numerical method (e.g., Runge-Kutta or trapezoidal integration) and a measure of uncertainty or variability around this estimator. While I did not grasp much more than the classy introductory talk by Philipp Hennig, this concept sounds fairly interesting, if only because of the Bayesian connection, and I wonder if we will soon see a probabilistic numerics section at ISBA! More seriously, placing priors on functions or functionals is a highly formal perspective (as in Bayesian non-parametrics) and it makes me wonder how much of the data (evaluations of the function at a given set of points) and how much of the prior are reflected in the output [variability]. (Obviously, one could ask a similar question of standard statistical analyses!) For instance, issues of singularity arise among those stochastic process priors.
Another question that stemmed from this talk is whether or not more efficient numerical methods can be derived this way, in addition to recovering the most classical ones. Somehow, given the idealised nature of the prior, it feels like priors could be more easily compared or ranked than in classical statistical problems, since the aim is to figure out the value of an integral or the solution to an ODE. (Or maybe not, since again almost the same could be said about estimating a normal mean.)
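The claim that a Gaussian process prior can reproduce a classical quadrature rule can be checked in a few lines. The sketch below (my own toy illustration, not from the talk) uses the standard result that under a Brownian-motion prior, k(s,t) = min(s,t), the posterior mean given exact function evaluations is piecewise linear between the design points, so integrating it recovers the trapezoidal rule; the test function sin(3x) and the five design points are arbitrary choices.

```python
import numpy as np

# Brownian-motion (Wiener) covariance kernel: k(s, t) = min(s, t)
def brownian_kernel(s, t):
    return np.minimum(s, t)

def gp_posterior_mean(x_obs, y_obs, x_star):
    """Conditional mean of a zero-mean GP given noise-free evaluations."""
    K = brownian_kernel(x_obs[:, None], x_obs[None, :])
    K_star = brownian_kernel(x_star[:, None], x_obs[None, :])
    alpha = np.linalg.solve(K + 1e-10 * np.eye(len(x_obs)), y_obs)
    return K_star @ alpha

def trapezoid(y, x):
    """Trapezoidal rule, written out to avoid version-dependent numpy names."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

# Design points, kept away from 0 where the Brownian prior is pinned at f(0)=0
x_obs = np.linspace(0.1, 1.0, 5)
y_obs = np.sin(3.0 * x_obs)

# Integrate the posterior mean over [x_obs[0], x_obs[-1]] on a fine grid
x_fine = np.linspace(x_obs[0], x_obs[-1], 2001)
mu = gp_posterior_mean(x_obs, y_obs, x_fine)
bayes_estimate = trapezoid(mu, x_fine)

# The posterior mean is piecewise linear between design points, so its
# integral coincides with the classical trapezoidal rule on those points
classical_estimate = trapezoid(y_obs, x_obs)
print(bayes_estimate, classical_estimate)
```

The extra ingredient the probabilistic viewpoint buys, beyond recovering the classical estimate, is the posterior variance of the integral, which is where the data-versus-prior question raised above becomes concrete.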
Next week is my week at the University of Warwick, where I will give a talk at the CRiSM seminar on Thursday, along with my friend Olli Ratmann. Except that I also got invited to the Luxembourg annual statistics conference the same week, meaning I will travel to Luxembourg on Wednesday to give my talk there. (First time ever.) And the easiest way from Coventry is to fly through Paris. In preparation for this travelling schedule bordering on the insane, I have printed a whole heap of arXiv preprints… Keep posted!
Today, I am attending a workshop on the use of graphics processing units in statistics at Warwick, supported by CRiSM, presenting our recent work with Randal Douc, Pierre Jacob and Murray Smith. (I will use the same slides as in Telecom two months ago, hopefully avoiding the loss of integral and summation signs this time!) Pierre Jacob will talk about Wang-Landau.
This year’s useR! conference will take place in Warwick, on August 16-18. It is organised by the Department of Statistics and funded by CRiSM and Revolution Analytics (the providers of the R tee-shirt!). I wish I could attend, but mid-August is usually devoted to genuine (post-JSM) family vacations.