Archive for Pierre Pudlo
souvenirs de Luminy
Posted in Books, Kids, pictures, Statistics, Travel, University life with tags amazon associates, applied Bayesian analysis, Bayesian data analysis, case studies, CIRM, Jean Morlet Chair, Kerrie Mengersen, Lecture Notes in Mathematics, Luminy, Marseille, Pierre Pudlo, Société Mathématique de France, Springer-Verlag, Université Aix Marseille on July 6, 2020 by xi'an
insufficient statistics for ABC model choice
Posted in Books, Kids, Statistics, University life with tags ABC, arXiv, Cross Validation, Gibbs random field, hidden Markov models, Markov random field, Monte Carlo Statistical Methods, paradigm shift, Pierre Pudlo, predictive loss, simulation, summary statistics on October 17, 2014 by xi'an
[Here is a revised version of my comments on the paper by Julien Stoehr, Pierre Pudlo, and Lionel Cucala, now to appear [both paper and comments] in the Statistics and Computing special MCMSki 4 issue.]
Approximate Bayesian computation techniques are the 2000’s successors of MCMC methods, handling new models where MCMC algorithms are at a loss, in the same way the latter managed in the 1990’s to cover models that regular Monte Carlo approaches could not reach. While they first sounded like “quick-and-dirty” solutions, only to be considered until more elaborate solutions could (not) be found, they have progressively been incorporated within the statistician’s toolbox as a novel form of non-parametric inference handling partly defined models. A statistically relevant feature of those ABC methods is that they require replacing the data with lower-dimensional summaries or statistics, because of the complexity of the former. In almost every case where calling ABC is the only solution, those summaries are not sufficient and the method thus implies a loss of statistical information, at least at a formal level, since relying on the raw data is out of the question. This forced reduction of statistical information raises many relevant questions, from the choice of summary statistics to the consistency of the ensuing inference.
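For illustration, here is a minimal sketch of ABC rejection sampling on a toy Gaussian model (everything here, from the model to the summary and the tolerance, is a hypothetical choice of mine, not taken from the paper), showing how the raw data only enters through the summary statistic:

```python
import numpy as np

rng = np.random.default_rng(0)

def summary(y):
    # low-dimensional summary replacing the raw data
    return np.array([y.mean(), y.std()])

def simulate(theta, n=100):
    # toy generative model: Gaussian with unknown mean (hypothetical)
    return rng.normal(theta, 1.0, size=n)

y_obs = simulate(theta=2.0)
s_obs = summary(y_obs)

accepted = []
for _ in range(100_000):
    theta = rng.normal(0.0, 10.0)             # draw from the prior
    s_sim = summary(simulate(theta))          # summarise pseudo-data
    if np.linalg.norm(s_sim - s_obs) < 0.1:   # tolerance epsilon
        accepted.append(theta)

# accepted draws approximate the posterior of theta given S(y_obs),
# not given the full data y_obs: the information loss discussed above
print(len(accepted), np.mean(accepted))
```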
In this paper of the special MCMSki 4 issue of Statistics and Computing, Stoehr et al. attack the recurrent problem of selecting summary statistics for ABC in a hidden Markov random field, since there is no fixed-dimension sufficient statistic in that case. The paper provides a very broad overview of the issues and difficulties related to ABC model choice, which has been the focus of advanced research only for a few years. Most interestingly, the authors define a novel, local, and somewhat Bayesian misclassification rate, an error that is conditional on the observed value and derived from the ABC reference table: the posterior predictive error rate.
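In symbols (my notation, not necessarily the authors’: M denotes the random model index, Y the corresponding pseudo-data, and m̂ the classification rule built on the ABC reference table), this error rate can be written as

$$\tau\big(S(y^{\text{obs}})\big)=\mathbb{P}\big[\,\hat m(S(Y))\neq M\,\big|\,S(Y)=S(y^{\text{obs}})\,\big],$$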
integrating over both the model index m and the corresponding random variable Y (and the hidden intermediary parameter) given the observation. Or rather given the transform of the observation by the summary statistic S. The authors even go further and define the error rate of a classification rule based on a first (collection of) statistic, conditional on a second (collection of) statistic (see Definition 1). A notion rather delicate to validate on a fully Bayesian basis. And they advocate replacing the unreliable (estimates of the) posterior probabilities with this local error rate, estimated by traditional non-parametric kernel methods. Methods that are calibrated by cross-validation. Given a reference summary statistic, this perspective leads (at least in theory) to selecting the optimal summary statistic as the one leading to the minimal local error rate. Besides its application to hidden Markov random fields, which is of interest per se, this paper thus opens a new vista on calibrating ABC methods and evaluating their true performance conditional on the actual data. (The advocated abandonment of the posterior probabilities could almost justify the denomination of a paradigm shift. This is also the approach advocated in our random forest paper.)
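To make the construction concrete, here is a minimal sketch of estimating such a local error rate from a reference table, with a k-nearest-neighbour classifier standing in for the kernel methods of the paper and the number of neighbours calibrated by cross-validation (the reference table, the summaries, and all settings below are my own hypothetical choices, not the authors’ code):

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(1)

# hypothetical ABC reference table: model index m and 2-d summaries s,
# standing in for simulations from two competing models
n = 10_000
m = rng.integers(0, 2, size=n)
s = rng.normal(loc=m[:, None] * np.array([1.0, 0.5]), scale=1.0)

# non-parametric classifier of m given s, with k chosen by cross-validation
cv = GridSearchCV(KNeighborsClassifier(),
                  {"n_neighbors": [10, 50, 100, 500]}, cv=5)
cv.fit(s, m)

# local error rate at the observed summary: an estimate of the
# misclassification probability conditional on S(y_obs)
s_obs = np.array([[0.8, 0.4]])
probs = cv.predict_proba(s_obs)[0]
local_error = 1.0 - probs.max()
print(cv.best_params_, local_error)
```

Comparing summary statistics then amounts to repeating this estimation for each candidate statistic and retaining the one with the smallest local error rate at the observed value.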
last Big MC [seminar] before summer [June 19, 3pm]
Posted in pictures, Statistics, University life with tags ABC, Big'MC, Chris Holmes, IHP, Institut Henri Poincaré, machine learning, Monte Carlo, Montpellier, Paris, Pierre Pudlo, seminar, University of Oxford on June 17, 2014 by xi'an
Last session of our Big'MC seminar at Institut Henri Poincaré this year, on
Thursday, June 19, with
Chris Holmes (Oxford) at 3pm on
Robust statistical decisions via re-weighted Monte Carlo samples
and Pierre Pudlo (I3M, Université Montpellier 2) at 4:15pm on [our joint work]
ABC and machine learning