Archive for Scotland
The ISBA 2018 World Meeting will take place in Edinburgh, Scotland, on 24-29 June 2018. (Since there was some confusion about the date, it is worth stressing that these new dates are definitive!) Note also that there are other relevant conferences and workshops in the surrounding weeks:
- a possible ABC in Edinburgh the previous weekend, 23-24 June [to be confirmed!]
- the Young Bayesian Meeting (BaYSM) in Warwick, 2-3 July 2018 [with a potential short course on fundamentals of simulation in the following days, to be confirmed!]
- MCqMC 2018 in Rennes, 1-6 July 2018
- ICML 2018 in Stockholm, 10-15 July 2018
- the 2018 International Biometrics Conference in Barcelona, 8-13 July 2018
Following my earlier post on that paper by Matt Graham and Amos Storkey (University of Edinburgh), I have now read through it. The beginning is somewhat unsettling, albeit mildly so, as it starts by mentioning notions like variational auto-encoders, generative adversarial nets, and simulator models, by which they mean generative models represented by a (differentiable) function g that essentially turns basic variates with density p into the variates of interest (with intractable density). A setting similar to Meeds’ and Welling’s optimisation Monte Carlo. Another proximity pointed out in the paper is Meeds et al.’s Hamiltonian ABC.
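For concreteness, here is a minimal sketch of what such a simulator model looks like, with a made-up differentiable map g (the actual models in the paper are of course far more involved):

```python
import numpy as np

# Hypothetical toy simulator: a differentiable map g pushes simple base
# variates u ~ N(0, 1), with known density p, forward to the data of
# interest. The density of x = g(u, theta) is typically intractable even
# though simulating from the model is trivial.
def g(u, theta):
    # any differentiable transform will do for illustration
    return theta[0] + theta[1] * u + 0.1 * u ** 3

rng = np.random.default_rng(0)
u = rng.standard_normal(1000)     # base variates with tractable density
x = g(u, theta=(1.0, 2.0))        # simulated data with intractable density
```

The point of the representation is that all randomness sits in u, so conditioning on the observed data amounts to constraining u.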
“…the probability of generating simulated data exactly matching the observed data is zero.”
The section on the standard ABC algorithms mentions the fact that ABC MCMC can be (re-)interpreted as a pseudo-marginal MCMC, albeit one targeting the ABC posterior instead of the original posterior. The starting point of the paper is the above quote, which echoes a conversation I had with Gabriel Stoltz a few weeks ago, when he presented me his free energy method and when I could not see how to connect it with ABC, because having an exact match seemed to cancel the appeal of ABC, all parameter simulations then producing an exact match under the right constraint. However, the paper maintains this can be done, by looking at the joint distribution of the parameters, latent variables, and observables, under the implicit restriction imposed by keeping the observables constant. Which defines a manifold. The mathematical validation is achieved by designing the density over this manifold, which looks like

ρ(u) ∝ p(u) |det J(u)J(u)ᵀ|^(-1/2),  with J(u) = ∂g⁰(u)/∂u,
if the constraint can be rewritten as g⁰(u)=0. (This actually follows from a 2013 paper by Diaconis, Holmes, and Shahshahani.) In the paper, the simulation is conducted by Hamiltonian Monte Carlo (HMC), the leapfrog steps consisting of an unconstrained move followed by a projection onto the manifold. This however sounds somewhat intense in that it involves a quasi-Newton resolution at each step. I also find it surprising that this projection step does not jeopardise the stationary distribution of the process, as the argument found therein about the approximation of the approximation is not particularly deep. But the main thing that remains unclear to me after reading the paper is how the constraint that the pseudo-data be equal to the observable data can be turned into a closed form condition like g⁰(u)=0. As mentioned above, the authors assume a generative model based on uniform (or other simple) random inputs but this representation seems impossible to achieve in reasonably complex settings.
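To make the projection step concrete, here is a minimal sketch of a Gauss-Newton projection onto the manifold {u : g⁰(u)=0}, using a hypothetical two-dimensional constraint of my own rather than the authors' implementation:

```python
import numpy as np

# Hypothetical constraint: the simulated output must exactly match an
# observed value y_obs = 1, with g(u) = u[0]**2 + u[1], so
# g0(u) = g(u) - y_obs must be driven to zero.
def g0(u):
    return np.array([u[0] ** 2 + u[1] - 1.0])

def jacobian(u):
    return np.array([[2.0 * u[0], 1.0]])

def project(u, tol=1e-10, max_iter=50):
    # quasi-Newton resolution: pull the point back onto {u : g0(u) = 0}
    for _ in range(max_iter):
        c = g0(u)
        if np.max(np.abs(c)) < tol:
            break
        J = jacobian(u)
        # Gauss-Newton step: u <- u - J^T (J J^T)^{-1} g0(u)
        u = u - J.T @ np.linalg.solve(J @ J.T, c)
    return u

u = project(np.array([2.0, 3.0]))
```

In the paper this projection follows each unconstrained leapfrog move of the HMC kernel, which is where the per-step cost comes from.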
Above is the solution produced by a team at the University of Waterloo to the travelling salesman problem of linking all pubs in the UK (which includes pubs in Northern Ireland as well as some Scottish islands, even though I doubt there is no pub at all on the Isle of Skye! They also missed a lot of pubs in Glasgow! And, worst gaffe of all, they did not include the Clachaig Inn, probably the best pub on Earth…). This path links over 24 thousand pubs, which is less than the largest travelling salesman problem solved to date, except that this instance used the exact distances provided by Google Maps. Of course, it would somehow make more sense to increase the distances by random amounts as the pub visits accumulate, unless the visitor sticks to tonic. Or tea.
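For contrast with the exact Waterloo solution, the classic nearest-neighbour heuristic for the travelling salesman problem takes only a few lines (here on made-up pub coordinates, and with no optimality guarantee whatsoever):

```python
import numpy as np

# Greedy nearest-neighbour tour: from the current pub, always walk to the
# closest pub not yet visited. Cheap, but typically far from optimal.
def nearest_neighbour_tour(points):
    n = len(points)
    unvisited = set(range(1, n))
    tour = [0]
    while unvisited:
        last = points[tour[-1]]
        nxt = min(unvisited, key=lambda i: np.linalg.norm(points[i] - last))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

rng = np.random.default_rng(1)
pubs = rng.uniform(size=(50, 2))   # fictitious pub locations
tour = nearest_neighbour_tour(pubs)
```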
In his plenary talk this morning, Arnaud Doucet discussed the application of pseudo-marginal techniques to the latent variable models he has been investigating for many years, and their limiting behaviour in terms of efficiency, with the idea of introducing correlation in the estimation of the likelihood ratio, reducing the complexity from O(T²) to O(T√T). With the very surprising conclusion that the correlation must go to 1 at a precise rate to achieve this reduction, since perfect correlation would induce a bias. A massive piece of work, indeed!
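The correlated pseudo-marginal idea can be sketched on a toy model of my own (not Doucet et al.'s setting): the auxiliary variates driving the likelihood estimator are moved by an autoregression with correlation ρ close to one, instead of being refreshed at each MCMC step, so that the noise in the acceptance ratio partly cancels:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy noisy log-likelihood estimator, driven entirely by the auxiliary
# variates u (an importance-sampling estimate for one observation y).
def loglik_hat(theta, u):
    y = 0.5
    x = theta + u                     # latent draws given u
    w = np.exp(-0.5 * (y - x) ** 2)   # pseudo-likelihood weights
    return np.log(np.mean(w))

rho, n_aux = 0.99, 100
theta, u = 0.0, rng.standard_normal(n_aux)
ll = loglik_hat(theta, u)
chain = []
for _ in range(200):
    theta_prop = theta + 0.3 * rng.standard_normal()
    # correlated refresh of the auxiliary variates, instead of a full redraw
    u_prop = rho * u + np.sqrt(1 - rho ** 2) * rng.standard_normal(n_aux)
    ll_prop = loglik_hat(theta_prop, u_prop)
    if np.log(rng.uniform()) < ll_prop - ll:  # flat prior, symmetric proposal
        theta, u, ll = theta_prop, u_prop, ll_prop
    chain.append(theta)
```

With ρ = 0 this reduces to the standard pseudo-marginal scheme; the talk's point is the rate at which ρ must approach 1.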
The next session of the morning was another instance of conflicting talks and I hopped from one room to the next to listen to Hani Doss’s empirical Bayes estimation with intractable constants (where maybe SAME could be of interest), Youssef Marzouk’s transport maps for MCMC, which sounds like an attractive idea provided the construction of the map remains manageable, and Paul Russell’s adaptive importance sampling, which somehow sounded connected with our population Monte Carlo approach. (With the additional step of considering transport maps.)
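As a reminder of what population Monte Carlo does, here is my own toy version of the adapt-weight-resample cycle, with a Gaussian target standing in for anything interesting:

```python
import numpy as np

rng = np.random.default_rng(3)

def log_target(x):
    # toy target: N(3, 1), up to an additive constant
    return -0.5 * (x - 3.0) ** 2

mu, sigma, n = 0.0, 2.0, 500
for _ in range(10):
    x = mu + sigma * rng.standard_normal(n)       # draws from current proposal
    # importance weights: target over N(mu, sigma^2) proposal (log scale)
    logw = log_target(x) + 0.5 * ((x - mu) / sigma) ** 2 + np.log(sigma)
    w = np.exp(logw - logw.max())
    w /= w.sum()
    idx = rng.choice(n, size=n, p=w)              # multinomial resampling
    mu, sigma = x[idx].mean(), max(x[idx].std(), 0.1)  # adapt the proposal
```

After a few iterations the proposal has drifted onto the target; the adaptive schemes discussed in the talk refine this basic cycle.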
An interesting item of information I got from the final announcements at MCqMC 2016, just before heading to Monash, Melbourne, is that MCqMC 2018 will take place in the city of Rennes, Brittany, on July 2-6. Not only is it a nice location on its own, but it is most conveniently located in space and time to attend ISBA 2018 in Edinburgh the week after! Just moving from one Celtic city to another Celtic city. Along with other planned satellite workshops, this occurrence should make ISBA 2018 more attractive [if need be!] for participants from overseas.
A new Rankin, a new Rebus! (New as in 2015, since I waited to buy the paperback version.) Sounds like Ian Rankin cannot let his favourite character rest in retirement and hence sets him back into action, along with the newer Malcolm Fox [working in the Complaints] and most major characters of the Rebus series. Including the unbreakable villain, Big Ger Cafferty. This is as classical as you get, borrowing from half a dozen former Rebus novels, not to mention the neo-Holmes novel I reviewed a while ago. But it is gritty, deadly efficient and captivating. I read the book within a few days of returning from Warwick.
About the title: it is a song by The Associates that plays a role in the book. I did not know this band, but looking it up got me to a clip that used an excerpt from The Night of the Hunter. Fantastic movie, one of my favourites.