Archive for SMC²

ISBA regional meeting in Varanasi (day 3)

Posted in pictures, Statistics, Travel, University life on January 11, 2013 by xi'an

plaque in the department of mathematical sciences, BHU, Varanasi, Uttar Pradesh, Jan. 10, 2013

On the last/my day of the ISBA meeting in Varanasi, I attended a few talks before being kindly driven to the airport (early, too early, but with the unpredictable traffic there, it was better to err on the side of caution!). In the dynamical model session, Simon Wilson presented a way to approximate posteriors for HMMs based on Chib’s (or Bayes’!) formula, while Jonathan Stroud exposed another approach to state-space model approximation involving a move of the state parameter based on a normal approximation of its conditional distribution given the observables, an approximation which seemed acceptable for the cloud analysis model he was processing. Nicolas Chopin then gave a quick introduction to particle MCMC, all the way to SMC². (As a stern chairman of the session, I know Nicolas felt he did not have enough time, but he did a really good job of motivating those different methods, in particular in explaining why the auxiliary variable approach makes the unbiased estimator of the likelihood a valid MCMC method.) Peter Green’s plenary talk was about an emission tomography image analysis whose statistical processing turned into a complex (Bernstein-von Mises) convergence theorem (a preliminary version of which I saw in Bristol during Natalia Bochkina’s talk).
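
That last point (why plugging an unbiased estimator of the likelihood into Metropolis-Hastings still targets the exact posterior) can be illustrated on a toy model of my own making, entirely unrelated to the talks: a single observation y with a Gaussian latent layer, a flat prior on θ, and the likelihood estimated by averaging the conditional density over simulated latents. A minimal Python sketch, not anyone's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
y = 1.5  # a single (made-up) observation

def loglik_hat(theta, M=100):
    # unbiased estimator of p(y | theta): average the conditional density
    # of y over M latent draws z ~ N(theta, 1), so that marginally
    # y | theta ~ N(theta, 2)
    z = rng.normal(theta, 1.0, size=M)
    return np.log(np.mean(np.exp(-0.5 * (y - z) ** 2) / np.sqrt(2 * np.pi)))

def pseudo_marginal_mh(n_iter=5000, step=1.0):
    theta, ll = 0.0, loglik_hat(0.0)
    chain = np.empty(n_iter)
    for i in range(n_iter):
        prop = theta + step * rng.normal()
        ll_prop = loglik_hat(prop)
        # flat prior, symmetric proposal: compare likelihood *estimates*;
        # on rejection we keep ll rather than re-estimating it, which is
        # what keeps the chain exact
        if np.log(rng.uniform()) < ll_prop - ll:
            theta, ll = prop, ll_prop
        chain[i] = theta
    return chain

chain = pseudo_marginal_mh()
```

The crucial detail is that the estimate ll is carried along with the chain and reused at the next iteration; re-estimating it at the current value would break the auxiliary-variable argument.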

boats on the Ganges before sunset, Jan. 8, 2013

Overall, as forewarned by and expected from the program, this ISBA meeting was of the highest scientific quality. (I only wish I had had Hindu god abilities to duplicate myself and attend several parallel sessions at the same time!) Besides, much besides!, the warm attention paid to everyone by the organisers was just simply un-be-lie-vable! The cultural program was on par with the scientific program. The numerous graduate students and faculty involved in the workshop organisation had a minute knowledge of our schedules and locations, and were constantly anticipating our needs and moves. Almost to a fault, i.e. to a point that was close to embarrassing for our cultural habits. I am therefore immensely grateful [personally and as former ISBA president] to all those people who contributed to the success of this ISBA meeting, first and foremost Professor Satyanshu Upadhyay, who worked relentlessly towards this goal for many months! (As a conference organiser, I realise I was and am simply unable to provide this level of welcome to the participants, even for much smaller meetings… The contrast with my previous conference in Berlin could not be more extreme: for a much higher registration fee, the return was very, very limited.) I will forever (at least until my next reincarnation!) keep the memory of this meeting as a very special one, quite besides its giving me the opportunity of my first visit to India.

ABC and SMC²

Posted in Books, Statistics, University life on January 17, 2012 by xi'an

Today, Nicolas Chopin gave his talk at CREST. While he tried to encompass as much as possible of the background on ABC for a general audience (he is also giving the same talk in Nice this week), the talk led me to think about the parameterisation of ABC methods. He chose a non-parametric presentation, as in Fearnhead and Prangle. From this viewpoint, the choice of the kernel K and of the distance measure should not matter so much, when compared with the choice of the bandwidth/tolerance. Further, the non-parametric flavour tells us that the tolerance itself can be optimised for a given sample size, i.e. in ABC settings a given number of iterations. When looking at MCMC-ABC, I briefly wondered whether or not the tolerance should be optimised against the Metropolis-Hastings acceptance probability, because from this perspective the ratio of the kernels is a ratio of estimators of the densities of the data at both values of the parameter(s), rather than an estimator of the ratio.
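
To make that remark concrete, here is a hypothetical numpy sketch of MCMC-ABC on a toy Gaussian model (the data, summary, tolerance and proposal scale are all illustrative choices of mine, not anything from the talk): the acceptance ratio is the ratio of the kernel values at the two simulated summaries, i.e. a ratio of noisy density estimates at both parameter values.

```python
import numpy as np

rng = np.random.default_rng(1)
y_obs = rng.normal(2.0, 1.0, size=50)  # pretend observed data
s_obs = y_obs.mean()                   # summary statistic

def kernel(d, eps):
    # Gaussian smoothing kernel on the summary distance; eps plays the
    # role of the bandwidth/tolerance discussed above
    return np.exp(-0.5 * (d / eps) ** 2)

def abc_mcmc(n_iter=20000, eps=0.1, step=0.5):
    theta, k = 0.0, 0.0
    while k == 0.0:  # initialise at a state with positive kernel value
        z = rng.normal(theta, 1.0, size=50)
        k = kernel(abs(z.mean() - s_obs), eps)
    chain = np.empty(n_iter)
    for i in range(n_iter):
        prop = theta + step * rng.normal()
        z = rng.normal(prop, 1.0, size=50)
        k_prop = kernel(abs(z.mean() - s_obs), eps)
        # flat prior + symmetric proposal: the acceptance ratio is a
        # ratio of kernel values, i.e. of noisy density estimates
        if rng.uniform() * k < k_prop:
            theta, k = prop, k_prop
        chain[i] = theta
    return chain

chain = abc_mcmc()
```

Shrinking eps pushes the acceptance rate down, which is one way of experiencing the bandwidth-versus-cost trade-off numerically.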

The second part of Nicolas’ talk was about SMC² and hence unrelated to ABC, except that he mentioned that SMC provides an (unbiased) approximation of the Metropolis-Hastings acceptance probability. Which is also an interpretation of the ideal ABC (zero tolerance) and of noisy ABC. (Plus, Marc Beaumont is a common denominator for both perspectives!) Unfortunately, Nicolas ran out of time (because of a tight schedule) and did not give much detail on SMC². Overall, his motivational introduction was quite worth it, though! Here are his slides:
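
The unbiased-likelihood ingredient behind particle MCMC and SMC² can be sketched with a bootstrap particle filter on a toy linear-Gaussian state-space model (all constants below are illustrative assumptions of mine, not taken from the talk):

```python
import numpy as np

rng = np.random.default_rng(2)

# simulate a toy linear-Gaussian state-space model:
#   x_t = 0.9 x_{t-1} + N(0,1),  y_t = x_t + N(0,1),  x_1 ~ N(0,1)
T = 50
x = np.zeros(T)
y = np.zeros(T)
for t in range(T):
    x[t] = (0.9 * x[t - 1] if t > 0 else 0.0) + rng.normal()
    y[t] = x[t] + rng.normal()

def bootstrap_loglik(obs, N=500, rho=0.9):
    # bootstrap particle filter: the product of the per-step averages of
    # the weights is an unbiased estimate of the likelihood p(y_{1:T})
    ll = 0.0
    particles = np.zeros(N)
    for t, yt in enumerate(obs):
        if t == 0:
            particles = rng.normal(size=N)  # x_1 ~ N(0,1)
        else:
            particles = rho * particles + rng.normal(size=N)
        w = np.exp(-0.5 * (yt - particles) ** 2) / np.sqrt(2 * np.pi)
        ll += np.log(w.mean())
        # multinomial resampling at every step
        particles = rng.choice(particles, size=N, p=w / w.sum())
    return ll

ll_hat = bootstrap_loglik(y)
```

It is precisely this unbiased estimate that gets plugged into the Metropolis-Hastings ratio in particle MCMC, and that SMC² manages sequentially over a population of parameter values.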

This talk also led me to reflect on the organisation of my incoming PhD class on ABC: it should

  • justify MCMC-ABC from first principles, for regular and noisy ABC;
  • aggregate the literature on non-parametric justifications of ABC (the 2002 paper by Beaumont et al. remains a reference there);
  • understand the link with indirect inference (the book by Gouriéroux and Monfort is sitting on my desk!);
  • answer the questions "is ABC the sole and unique solution?" and "how can we evaluate/estimate the error?".

ABC and Monte Carlo seminar in CREST

Posted in Statistics, University life on January 13, 2012 by xi'an

On Monday (Jan. 16, 3pm, CREST-ENSAE, Room S08), Nicolas Chopin will present a talk on:

Dealing with intractability: recent advances in Bayesian Monte-Carlo methods for intractable likelihoods
(joint works with P. Jacob, O. Papaspiliopoulos and S. Barthelmé)

This talk will start with a review of recent advances in Monte Carlo methodology for intractable problems; that is, problems involving intractable quantities, typically intractable likelihoods. I will discuss in turn ABC-type methods (a.k.a. likelihood-free), auxiliary variable methods for dealing with intractable normalising constants (e.g. the exchange algorithm), and MC²-type algorithms, a recent extension of which is the PMCMC algorithm (Andrieu et al., 2010). Then, I will present two recent pieces of work in this direction. First, and more briefly, I’ll present the ABC-EP algorithm (Chopin and Barthelmé, 2011). I’ll also discuss some possible future research in ABC theory. Second, I’ll discuss the SMC² algorithm (Chopin, Jacob and Papaspiliopoulos, 2011), a new type of MC² algorithm that makes it possible to perform sequential analysis for virtually any state-space model, including models with an intractable Markov transition.
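
As an illustration of the auxiliary variable idea mentioned in the abstract, here is a hedged Python sketch of the exchange algorithm on a toy one-parameter family, q(y|theta) = exp(-theta * y^2), whose normalising constant is treated as unknown; the model, prior and tuning constants are mine, not part of the talk.

```python
import numpy as np

rng = np.random.default_rng(3)

# unnormalised model q(y | theta) = exp(-theta * y^2), theta > 0; its
# constant Z(theta) is treated as unknown, but exact simulation from
# the model is available: y ~ N(0, 1 / (2 theta))
y = rng.normal(0.0, 1.0, size=30)  # synthetic data (true theta = 0.5)

def log_q(data, theta):
    return -theta * np.sum(data ** 2)

def exchange(n_iter=10000, step=0.3):
    theta = 1.0  # flat (improper) prior on theta > 0, for illustration
    chain = np.empty(n_iter)
    for i in range(n_iter):
        prop = abs(theta + step * rng.normal())  # reflected walk, theta > 0
        # auxiliary data drawn exactly under the proposed value
        w = rng.normal(0.0, 1.0 / np.sqrt(2.0 * prop), size=len(y))
        # swapping the roles of y and w cancels Z(theta) and Z(prop)
        log_a = (log_q(y, prop) + log_q(w, theta)
                 - log_q(y, theta) - log_q(w, prop))
        if np.log(rng.uniform()) < log_a:
            theta = prop
        chain[i] = theta
    return chain

chain = exchange()
```

Because the auxiliary data w is simulated exactly from the model at the proposed value, the intractable constants cancel from the acceptance ratio, which is the whole point of the algorithm.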

Catching up faster by switching sooner

Posted in R, Statistics, University life on October 26, 2011 by xi'an

Here is our discussion (with Nicolas Chopin) of last Wednesday’s Read Paper by T. van Erven, P. Grünwald and S. de Rooij (Centrum voor Wiskunde en Informatica, Amsterdam), entitled Catching up faster by switching sooner: a predictive approach to adaptive estimation with an application to the Akaike information criterion–Bayesian information criterion dilemma. It is still open for written discussions, to be published in Series B. Even though the topic is quite tangential to our interests, the fact that the authors evolve in a Bayesian environment called for the following (my main contribution being to point out that the procedure is not Bayesian, as it fails to incorporate the switch in the predictive (6), hence using the same data for all models under competition…):

Figure 1 – Bayes factors of Model 2 vs. Model 1 (gray line) and Model 3 vs. Model 1 (dark line), plotted against the number of observations, i.e. of iterations, when comparing three stochastic volatility models; see Chopin et al. (2011) for full details.

This paper is an interesting attempt at a particularly important problem. We nonetheless believe more classical tools should be used instead if the models are truly relevant to the inference conducted by the authors: Figure 1, reproduced from Chopin et al. (2011), plots [against time] the Bayes factors of Models 2 and 3 vs. Model 1, where all models are state-space models of increasing complexity, fitted to some real data. In this context, one often observes that more complex models need more time to “ascertain themselves”. On the other hand, even BMA-based prediction is a very challenging computational problem (the only generic solution currently being the SMC² algorithm of the aforementioned paper), and we believe that the proposed predictive strategy will remain too computationally expensive for practical use with nonlinear state-space models.
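
For readers wanting to reproduce the flavour of Figure 1 without the state-space machinery, here is a conjugate toy of my own (much simpler than the stochastic volatility models of the figure), where the running log Bayes factor of a larger model against a fixed null is accumulated from prequential predictive densities:

```python
import numpy as np

rng = np.random.default_rng(4)
y = rng.normal(0.3, 1.0, size=200)  # data mildly favouring the larger model

def log_norm_pdf(x, mu, var):
    return -0.5 * (np.log(2 * np.pi * var) + (x - mu) ** 2 / var)

# Model 1: y_t ~ N(0, 1), no free parameter
log_ev1 = np.cumsum(log_norm_pdf(y, 0.0, 1.0))

# Model 2: y_t ~ N(theta, 1), theta ~ N(0, 1); the evidence accumulates
# one prequential predictive density at a time (conjugate updates)
log_ev2 = np.empty(len(y))
mu, var, acc = 0.0, 1.0, 0.0
for t, yt in enumerate(y):
    acc += log_norm_pdf(yt, mu, var + 1.0)  # predictive is N(mu, var + 1)
    log_ev2[t] = acc
    v_new = 1.0 / (1.0 / var + 1.0)         # posterior update for theta
    mu = v_new * (mu / var + yt)
    var = v_new

log_bf = log_ev2 - log_ev1  # running log Bayes factor, Model 2 vs. Model 1
```

In the setting of the figure, the two log-evidence sequences would instead be accumulated from particle-filter estimates of the predictive densities, model by model, which is what SMC² delivers as a by-product.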

For other classes of models, since the provable methods put forward by this paper are based on “frozen strategies”, which are hard to defend from a modelling perspective, and since the more reasonable “basic switch” strategy seems to perform as well numerically, we would be curious to see how the proposed methods compare to predictive distributions obtained from genuine Bayesian models. A true change point model for instance would generate a coherent prediction strategy, which is not equivalent to the basic switch strategy. (Indeed, for one thing, the proposal made by the authors utilises the whole past to compute the switching probabilities, rather than allocating the proper portion of the data to the relevant model. In this sense, the proposal is “using the data [at least] twice” in a pseudo-Bayesian setting, similar to Aitkin’s, 1991.) More generally, the authors seem to focus on situations where the true generative process belongs to a non-parametric class, and the competing models form an infinite sequence of richer and richer—but also more and more complex—parametric models, which is a very sensible set-up in practice. Then, we wonder whether or not it would make more sense to set the prior distribution over the switch parameter s in such a way that (a) switches only occur from one model to another model of greater complexity and (b) the number of switches is infinite.

For ABC readers, note the future Read Paper meeting on December 14 by Paul Fearnhead and Dennis Prangle.

A new series of mishaps

Posted in Kids, R, University life on March 13, 2011 by xi'an

Following the slight difficulties of last week, I had a hard week on the computer front: indeed, on Monday, I received my 2007 macbook from the repair shop, with a new video card, courtesy of Apple. Unfortunately, this started a series of problems. First, the old macbook stopped recognizing the NVIDIA video card and, while it worked under VESA, the system got so unstable that I had no choice but to install Maverick Meerkat (Ubuntu 10.10) to make it work again… This proved rather unsuccessful in that the new installation periodically logs me out, especially when I am under Firefox or Acrobat… (Most items are working, though, including the bluetooth Apple mouse… Anyway, Pierre kindly offered to take a look over the weekend.) This was not the end of my worries, though, as I stupidly typed sudo chown xian on my new macbook while on / (when I thought I was on /home/xian). This was very very s.t.u.p.i.d. (I committed this major error during Pierre’s talk two days ago, while typing my post on SMC²!) and it also ruined the whole system, despite a manual attempt at recreating the ownership of the system directories… This also ended in a reinstallation of Maverick Meerkat from scratch, which surprisingly did not produce the same outcome as the first time, with hardly anything working, even under the minimal Ubuntu configuration. So this morning I once again started from scratch, using the whole disk and letting the installation CD make all the decisions. Except for the sound (!), everything seems to be working now, including the webcam (once I installed iSight). Then tonight, the hinge of the macbook Air I keep at home (and often borrowed by my daughter) broke, a frequent problem with macbook airs; I had just brought one from the department to the repair shop on Monday when exchanging it for my repaired macbook… And to make the day complete, this morning the driver of a neighbouring car kindly pointed out to me a flat tire on my car! 
I am glad I recovered my bike with a new axle earlier this week…