## AISTATS 2016 [call for submissions]

Posted in pictures, Statistics, Travel, University life with tags , , , , , , , , , , , , on August 21, 2015 by xi'an

At the last (European) AISTATS 2014 meeting, I agreed to be the program co-chair for AISTATS 2016, along with Arthur Gretton from the Gatsby Unit at UCL. (AISTATS stands for Artificial Intelligence and Statistics.) Thanks to Arthur’s efforts and dedication (the organisation of an AISTATS meeting is far more complex than that of any conference I have organised so far!), the meeting is taking shape. First, it will take place in Cadiz, Andalucía, Spain, on May 9-11, 2016. (A place more related to the conference palm-tree logo than the previous location in Reykjavik, even though I would be the last one to complain that it took place in Iceland!)

Second, the call for submissions is now open. The process is similar to other machine learning conferences in that papers are first submitted for the conference proceedings, then undergo a severe and tight reviewing process, with a response period for the authors to respond to the reviewers’ comments, and that only the accepted papers can be presented as posters, some of which are selected for an additional oral presentation. The major dates for submitting to AISTATS 2016 are

- Proceedings track paper submission deadline: 23:59 UTC, Oct 9, 2015
- Proceedings track initial reviews available: Nov 16, 2015
- Proceedings track author feedback deadline: Nov 23, 2015
- Proceedings track paper decision notifications: Dec 20, 2015

Submission instructions, including the electronic submission site, are available at this address.

I was quite impressed by the quality and intensity of the AISTATS 2014 conference, which is why I so readily accepted being program co-chair, and hence predict an equally rewarding AISTATS 2016, thus encouraging all interested ‘Og’s readers to consider submitting a paper there! Even though I confess it will make for a rather busy first semester of 2016, between MCMSki V in January, the CIRM Statistics month in February, the CRiSM workshop on Estimating constants in April, AISTATS 2016 thus in May, and ISBA 2016 in June…

## carrer de Thomas Bayes [formulas magistrales]

Posted in pictures, Statistics, Travel with tags , , , , on June 11, 2015 by xi'an

## An objective prior that unifies objective Bayes and information-based inference

Posted in Books, pictures, Statistics, Travel, University life with tags , , , , , , , on June 8, 2015 by xi'an

During the Valencia O’Bayes 2015 meeting, Colin LaMont and Paul Wiggins arxived a paper entitled “An objective prior that unifies objective Bayes and information-based inference”. It would have been interesting to have the authors in Valencia, as they make bold claims about their w-prior being uniformly and maximally uninformative. Plus achieving the unification advertised in the title of the paper. Meaning that the free energy (the log transform of the inverse evidence) is the Akaike information criterion.

The paper starts by defining a true prior distribution (presumably in analogy with the true value of the parameter?) and generalised posterior distributions as associated with any arbitrary prior. (Some notations are imprecise, check (3) with the wrong denominator or the predictive that is supposed to cover N new observations on p.2…) It then introduces a discretisation by considering all models within a certain Kullback divergence δ to be indistinguishable. (A definition that does not account for the asymmetry of the Kullback divergence.) From there, it most surprisingly [given the above discretisation] derives a density on the whole parameter space

$\pi(\theta) \propto \det I(\theta)^{1/2}\,(N/2\pi\delta)^{K/2}$

where N is the number of observations and K the dimension of θ. Dimension which may vary. The dependence on N of the above is a result of using the predictive on N points instead of one. The w-prior is however defined differently: “as the density of indistinguishable models such that the multiplicity is unity for all true models”. Where the log transform of the multiplicity is the expected log marginal likelihood minus the expected log predictive [all expectations under the sampling distribution, conditional on θ]. Rather puzzling in that it involves the “true” value of the parameter—another notational imprecision, since it has to hold for all θ’s—as well as possibly improper priors. When the prior is improper, the log-multiplicity is a difference of two terms such that the first term depends on the constant used with the improper prior, while the second one does not… Unless the multiplicity constraint also determines the normalising constant?! But this does not seem to be the case when considering the following section on normalising the w-prior. Mentioning a “cutoff” for the integration that seems to pop out of nowhere. Curiouser and curiouser. Due to this unclear handling of infinite mass priors, and since the claimed properties of uniform and maximal uninformativeness are not established in any formal way, and since the existence of a non-asymptotic solution to the multiplicity equation is not demonstrated either, I quickly lost interest in the paper. Which does not contain any worked-out example. Read at your own risk!
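To see what the displayed density amounts to in the simplest case, here is a minimal sketch (an illustration of the formula only, not of the authors’ full multiplicity construction): a Gaussian location model with known variance, for which the Fisher information is constant in θ. The function names and parameter values are hypothetical.

```python
import math

def w_prior_unnormalised(theta, fisher_det, N, K, delta):
    # Unnormalised density pi(theta) ∝ det I(theta)^{1/2} * (N / (2*pi*delta))^{K/2}
    return math.sqrt(fisher_det(theta)) * (N / (2 * math.pi * delta)) ** (K / 2)

# Gaussian location model x_i ~ N(theta, sigma^2) with sigma known:
# the Fisher information is I(theta) = 1/sigma^2, constant in theta,
# so the density is flat (hence improper) in theta, as with the Jeffreys prior.
sigma, N, delta = 2.0, 100, 0.1
prior = lambda t: w_prior_unnormalised(t, lambda _: 1.0 / sigma**2, N, K=1, delta=delta)

print(prior(0.0) == prior(5.0))  # True: constant in theta
```

Within a single model the (N/2πδ)^{K/2} factor is absorbed into the normalisation and changes nothing; it only bites when comparing models of different dimensions K, which is presumably where the advertised link to the Akaike information criterion enters.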

## Valencia snapshot [#2]

Posted in Kids, pictures, Running, Travel with tags , , , , on June 5, 2015 by xi'an

## Valencia snapshot

Posted in pictures, Statistics, Travel, University life, Wines with tags , , , , on June 4, 2015 by xi'an

## jogging for Susie

Posted in Running, Statistics, Travel, University life with tags , , , , on June 2, 2015 by xi'an

A few of us met (somewhat) early this morning to run together in memory of Susie, wearing the bright red tee-shirts given to us by the O’Bayes 2015 conference organisers. And going along the riverbed that circles the old town of Valencià. Till next run, Susie!

## O’Bayes15: my tutorial

Posted in Books, Kids, pictures, Running with tags , , , , , , on June 1, 2015 by xi'an

Here are the slides I made for a short tutorial I will deliver this afternoon for the opening day of the International Workshop on Objective Bayes Methodology, O-Bayes15, held in the city of Valencià, so intricately linked with Bayesians and Bayesianism. The more so as we are celebrating this time the career and life of our dear friend Susie. Celebrating with talks and stories, morning runs and afternoon drinks, laughs, tears, and more laughs, even though they cannot match Susie’s unique and vibrant communicative laugh. I will remember how, at O’Bayes 13, Susie was the one who delivered this tutorial. And how, despite physical frailty and fatigue, she did so with her usual energy and mental strength. And obviously again with laughter. I will also remember that the last time I visited Valencià, it was for Anabel Forte’s thesis defence, upon invitation from Susie, and that we had a terrific time, from discussing objective Bayes ideas to eating and drinking local goodies, to walking around the grandiose monuments just built (which presumably contributed to ruining the City of Valencià for quite a while!)