Archive for ISI

transformation MCMC

Posted in Books, pictures, Statistics, Travel, University life on January 3, 2022 by xi'an

For reasons too long to describe here, I recently came across a 2013 paper by Dutta and Bhattacharya (from ISI Kolkata) entitled MCMC based on deterministic transforms, which sounded a bit dubious until I realised the deterministic label applies to the choice of the transformation and not to the Metropolis-Hastings proposal… The core of the proposed method is to make a proposal that simultaneously considers a move and its inverse, namely from x to either x’=T(x,ε) or x”=T⁻¹(x,ε), where ε is an independent random noise, possibly degenerate on a manifold of lesser dimension. Due to this symmetry, the acceptance probability is then the ratio of the target densities, multiplied by the Jacobian of T in x (as in reversible jump). I tried the method on a mixture of Gamma distributions target (in red) with an Exponential scale change and the resulting sample indeed fitted said target.
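For concreteness, here is a minimal Python sketch of such a scheme with a multiplicative (Exponential scale change) transform; the two-component Gamma mixture below is an arbitrary stand-in for the target, not the exact mixture of the above experiment:

```python
import numpy as np
from scipy.stats import gamma

rng = np.random.default_rng(0)

# Stand-in target: a two-component Gamma mixture (weights and shapes are
# arbitrary illustrative choices, not those of the original experiment).
def target(x):
    if x <= 0:
        return 0.0
    return 0.3 * gamma.pdf(x, a=2.0, scale=1.0) + 0.7 * gamma.pdf(x, a=7.0, scale=0.5)

def scale_tmcmc(n_iter=50_000, x0=1.0):
    """Multiplicative transform move: x -> x*eps (forward) or x/eps (backward),
    with eps ~ Exp(1), accepted with probability min(1, target ratio * Jacobian)."""
    x, px = x0, target(x0)
    chain = np.empty(n_iter)
    for t in range(n_iter):
        eps = rng.exponential(1.0)              # independent positive noise
        if rng.random() < 0.5:
            prop, jac = x * eps, eps            # forward move, Jacobian eps
        else:
            prop, jac = x / eps, 1.0 / eps      # backward move, Jacobian 1/eps
        pprop = target(prop)
        if rng.random() * px < pprop * jac:     # Metropolis-Hastings acceptance
            x, px = prop, pprop
        chain[t] = x
    return chain

chain = scale_tmcmc()   # a histogram of chain should then match the target
```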

The authors even make an argument in favour of a unidimensional noise, although this amounts to running an implicit Gibbs sampler. The argument rests on the reduced simulation cost of ε, even though the full-dimensional transform x’=T(x,ε) still needs to be computed. And, as noted in the paper, this also requires checking for irreducibility. The claim of higher efficiency found therein is thus mostly unsubstantiated…
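Under my reading of this single-noise idea, an additive version in several dimensions could look like the following sketch, where one scalar ε drives every coordinate through random ± signs (the half-normal noise and the standard Normal target are purely illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)

def additive_tmcmc_step(x, log_target, scale=1.0):
    """One step with a single scalar noise: every coordinate moves by +eps or -eps
    (Jacobian 1), so only one eps is simulated per iteration."""
    eps = abs(rng.normal(0.0, scale))               # one scalar noise for all coordinates
    signs = rng.choice([-1.0, 1.0], size=x.shape)   # coordinate-wise forward/backward choice
    prop = x + signs * eps
    if np.log(rng.random()) < log_target(prop) - log_target(x):
        return prop
    return x

# Hypothetical standard Normal target in 5 dimensions, for illustration only.
log_target = lambda x: -0.5 * np.sum(x ** 2)
x = np.zeros(5)
for _ in range(1_000):
    x = additive_tmcmc_step(x, log_target)
```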

“The detailed balance requirement also demands that, given x, the regions covered by the forward and the backward transformations are disjoint.”

The above statement is also surprising in that the generic detailed balance condition does not impose such a restriction.

 

bootstrap in Nature

Posted in Statistics on December 29, 2018 by xi'an

A news item in the latest issue of Nature I received is about Brad Efron winning the “Nobel Prize of Statistics” this year. The bootstrap is certainly an invention worth the recognition, not to mention Efron’s contribution to empirical Bayes analysis, even though I remain overall reserved about the very notion of a Nobel prize in any field… The item comes with an appropriate quote from XXL, who called the bootstrap method the ‘best statistical pain reliever ever produced’!

on confidence distributions

Posted in Books, pictures, Statistics, Travel, University life on January 10, 2018 by xi'an

Regina Liu’s talk at ISI this morning on fusion learning and confidence distributions led me to think anew about this strange notion of confidence distributions, building a distribution on the parameter space without a prior to go with it, implicitly or explicitly, and vaguely differing from fiducial inference. (As an aside, the Wikipedia page on confidence distributions rather heavily supports the concept and was primarily written by someone from Rutgers, where the modern version was developed. [And as an aside inside the aside, Schweder and Hjort’s book is sitting in my office, waiting for me!])

Recall that a confidence distribution is a sample-dependent distribution on the parameter space, whose cdf evaluated at the “true” value of the parameter is distributed as a uniform U(0,1) [as a function of the sample], and which is thereafter used as a posterior distribution. (Again, almost always without a prior to go with it, which is an incoherence from a probabilistic perspective, not to mention the issue of operating without a pre-defined dominating measure. This measure issue is truly bothering me!) This seems to include fiducial distributions based on a pivot, unless I am confused, as noted in the review by Nadarajah et al. Moreover, turning an existing (frequentist) confidence interval procedure into a pseudo-posterior in order to create a new (frequentist) procedure does not carry additional validation per se, as it clearly depends on the choice of the initialising procedure. (Not even mentioning the lack of invariance and the intricacy of multidimensional extensions.)
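As a small sanity check of this defining property, consider the textbook Normal-mean case with known variance, where the confidence distribution evaluated at the true parameter value is indeed uniform over repeated samples (the numbers below are arbitrary):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

# Normal-mean case with known sigma: the confidence distribution is
# C(theta | xbar) = Phi(sqrt(n) * (theta - xbar) / sigma); evaluated at the
# true theta0, it should be U(0,1) over repeated samples.
theta0, sigma, n = 2.0, 1.0, 25
xbar = rng.normal(theta0, sigma / np.sqrt(n), size=100_000)   # repeated samples
u = norm.cdf(np.sqrt(n) * (theta0 - xbar) / sigma)            # C evaluated at theta0

print(u.mean(), u.std())   # close to 1/2 and 1/sqrt(12) ≈ 0.289 if uniform
```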

Better together in Kolkata [slides]

Posted in Books, pictures, Statistics, Travel, University life on January 4, 2018 by xi'an

Here are the slides of the talk on modularisation I am giving today at the PC Mahalanobis 125 Conference in Kolkata, mostly borrowed from Pierre’s talk at O’Bayes 2017 last month:

[which made me realise Slideshare has discontinued the option to update one’s presentation, forcing users to create a new presentation for each update!] Incidentally, the amphitheatre at ISI is located right on top of a geological exhibit room with a reconstituted Barapasaurus tagorei, so I will figuratively ride a dinosaur during my talk!

Jayanta Kumar Ghosh [1937-2017]

Posted in Books, pictures, Statistics, Travel, University life on October 2, 2017 by xi'an

Just heard from Sonia and Judith that our friend and fellow Bayesian Jayanta K Ghosh (জয়ন্ত কুমার ঘোষ in Bengali) passed away a few days ago in Lafayette. He was a wonderful man, very kind to everyone and open to discussing all aspects of Bayesian theory and methodology. While he worked on many branches of statistics, he is best known to Bayesians for his contributions to Bayesian asymptotics, from Bernstein-von Mises convergence theorems to the frequentist validation of non-informative priors, to the Bayesian analysis of infinite-dimensional problems, including consistency of posteriors and rates of convergence, and to Bayesian and empirical Bayes model selection rules in high-dimensional problems. He also wrote an introductory textbook on Bayesian statistics ten years ago with Mohan Delampady and Tapas Samanta, as well as a monograph on higher-order asymptotics. I had known since this summer that J K was quite sick and am quite sad to learn of his demise. He will be missed by all for his gentleness, and by Bayesians for his contributions to the fields of objective and non-parametric Bayesian statistics…
