Archive for moments

French Econometrics [discussion]

Posted in Books, pictures, Statistics, University life with tags 10th French Econometrics Conference, architecture, econometrics, empirical likelihood, France, moments, Paris, Paris School of Economics on November 30, 2018 by xi'an

This Friday, I am briefly taking part in the 10th French Econometrics Conference as a discussant of Anna Simoni’s (CREST) talk, based on a paper co-written with Sid Chib and Minchul Shin. The conference takes place at the Paris School of Economics (PSE), on the south end of Paris, in an impressive new building. The topic of the paper is a Bayesian empirical likelihood approach to the econometric notion of moment models, which I discussed here during ISBA last summer since Sid spoke (twice!) there.

weakly informative reparameterisations

Posted in Books, pictures, R, Statistics, University life with tags Bayesian modelling, Edinburgh, Gaussian mixture, JCGS, location-scale parameterisation, moments, non-informative priors, publication, R package, Ultimixt on February 14, 2018 by xi'an

Our paper, weakly informative reparameterisations of location-scale mixtures, with Kaniav Kamary and Kate Lee, got accepted by JCGS! Great news, which comes in perfect timing for Kaniav as she is currently applying for positions. The paper proposes a Bayesian modelling of unidimensional mixtures based on the first and second moment constraints, since these turn the remainder of the parameter space into a compact set. While we had already developed an associated R package, Ultimixt, the current editorial policy of JCGS requires the R code used to produce all results to be attached to the submission, and it took us a few more weeks than it should have to produce a directly executable code, due to internal library incompatibilities. (For this entry, I was looking for a link to our special JCGS issue with my picture of Edinburgh, but realised I did not have this picture.)
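To give a rough feel for these constraints (a minimal illustration with arbitrary numerical values, not the reparameterisation of the paper), the following R lines check the two moment identities for a two-component Gaussian mixture and exhibit the bounded interval they impose on a component mean:

# minimal illustration (not the paper's reparameterisation): the first two
# moments of a Gaussian mixture as functions of the component parameters
w <- c(0.3, 0.7)     # arbitrary weights
mu <- c(-1.5, 2.0)   # arbitrary component means
sig <- c(0.8, 1.2)   # arbitrary component standard deviations

mix_mean <- sum(w * mu)                           # E[X] = sum_i w_i mu_i
mix_var <- sum(w * (sig^2 + mu^2)) - mix_mean^2   # var(X) = sum_i w_i (sig_i^2 + mu_i^2) - E[X]^2

# Monte Carlo check of the two identities
x <- ifelse(runif(1e6) < w[1], rnorm(1e6, mu[1], sig[1]), rnorm(1e6, mu[2], sig[2]))
c(mix_mean, mean(x))
c(mix_var, var(x))

# once E[X] and var(X) are fixed, w_1 (mu_1 - E[X])^2 cannot exceed var(X),
# hence mu_1 is confined to the interval
mix_mean + c(-1, 1) * sqrt(mix_var / w[1])

The same bound holds for each component mean and standard deviation, which gives a flavour of the compactness mentioned above.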
occupancy rules
Posted in Kids, R, Statistics with tags moment derivation, moments, multinomial distribution, occupancy, R, Stack Exchange, Stirling number, surjection on May 23, 2016 by xi'an

While the last riddle on The Riddler was rather anticlimactic, namely to find the mean of the number Y of empty bins in a uniform multinomial with n bins and m draws, with solution

E[Y] = n (1 − 1/n)^m

[which still has a link with e in that the fraction of empty bins converges to e⁻¹ when n=m], it led me to a more involved investigation of the distribution of Y. The probability that exactly k bins are non-empty can be shown directly, by inclusion-exclusion, to be

P(n − Y = k) = C(n,k) Σ_{i=0}^{k} (−1)^i C(k,i) (k−i)^m / n^m

with an R representation given by
miss <- function(n, m){
  # exact distribution of the number Y of empty bins, over Y = 0, ..., n-1
  p <- rep(0, n)
  for (k in 1:n)
    # count of configurations with exactly k non-empty bins (inclusion-exclusion)
    p[k] <- choose(n, k) * sum((-1)^((k-1):0) * choose(k, 1:k) * (1:k)^m)
  return(rev(p) / n^m)  # reverse so that entry j corresponds to Y = j-1
}
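As a quick sanity check, with (arbitrarily) n=5 bins and m=7 draws, the output of miss() sums to one and recovers the mean n(1−1/n)^m quoted above:

n <- 5; m <- 7
p <- miss(n, m)
sum(p)                 # should be 1
sum((0:(n-1)) * p)     # mean number of empty bins
n * (1 - 1/n)^m        # closed-form mean, for comparison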
Still, I wanted to take advantage of the moments of Y, since it writes as a sum of n indicators counting the empty cells. However, the higher moments of Y are not as straightforward as its expectation and I struggled with the representation until I came upon the formula

E[Y^k] = Σ_{i=1}^{k} i! S(k,i) C(n,i) (1 − i/n)^m

where S(k,i) denotes the Stirling number of the second kind… or, equivalently, i!S(k,i) is the number of surjections from a set of size k onto a set of size i. This leads to the distribution of Y by inverting the moment equations, as in the following R code:
# Stirling2() as provided by, e.g., the copula package
library(copula)

diss <- function(n, m){
  # recover the distribution of Y from its raw moments E[Y^k], k=1,...,n-1,
  # plus the constraint that the probabilities sum to one
  A <- matrix(0, n, n)
  mome <- rep(0, n)
  A[n, ] <- rep(1, n)
  mome[n] <- 1
  for (k in 1:(n-1)){
    A[k, ] <- (0:(n-1))^k             # Y takes values 0, ..., n-1
    for (i in 1:k)
      mome[k] <- mome[k] + factorial(i) * as.integer(Stirling2(k, i)) *
        choose(n, i) * (1 - i/n)^m    # E[Y^k] via the Stirling representation
  }
  return(solve(A, mome))
}
which I still checked by raw simulations from the multinomial:
zample <- function(n, m, T=1e4){
  # T replications of m uniform draws over n bins
  x <- matrix(sample(1:n, m*T, rep=TRUE), nrow=T)
  # number of distinct (hence non-empty) bins in each replication
  x <- apply(x, 1, function(z) length(unique(z)))
  return(n - x)  # number of empty bins
}
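For instance, still with n=5 and m=7, the exact distribution, its moment-based reconstruction, and the empirical frequencies can be put side by side:

n <- 5; m <- 7
round(rbind(exact = miss(n, m),
            momnt = diss(n, m),
            # zample() uses T=1e4 replications by default
            simul = as.vector(table(factor(zample(n, m), levels = 0:(n-1)))) / 1e4), 4)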
borderline infinite variance in importance sampling
Posted in Books, Kids, Statistics with tags continuity, importance sampling, infinite variance estimators, moments, Monte Carlo experiment, Monte Carlo Statistical Methods on November 23, 2015 by xi'an

As I was still musing about last week's posts on infinite variance importance sampling and its potential corrections, I wondered whether or not there was a fundamental difference between “just” having a [finite] variance and “just” having none, in conjunction with Aki's post. To get a better feeling, I ran a quick experiment with Exp(1) as the target and Exp(a) as the importance distribution. When estimating E[X]=1, the above graph opposes a=1.95 to a=2.05 (variance versus no variance, bright yellow versus wheat), a=2.95 to a=3.05 (third moment versus none, bright yellow versus wheat), and a=3.95 to a=4.05 (fourth moment versus none, bright yellow versus wheat). The graph below is the same for the estimation of E[exp(X/2)]=2, which involves an integrand that is not square integrable under the target and hence seems to require higher moments for the importance weight. It is hard to derive universal theories from those two graphs, however they show some protection against sudden drifts in the estimation sequence.

As an aside [not really!], apart from our rather confidential Confidence bands for Brownian motion and applications to Monte Carlo simulation with Wilfrid Kendall and Jean-Michel Marin, I do not know of many studies that consider the sequence of averages time-wise rather than across realisations at a given time, and I still think this is a more relevant perspective for simulation purposes.
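A minimal sketch of this type of experiment (taking Exp(a) as the exponential distribution with rate a, restricting to the first pair a=1.95 versus a=2.05, and making arbitrary graphical choices) plots the running importance sampling estimate of E[X]=1:

# running importance sampling averages for an Exp(1) target, with Exp(a)
# proposals on either side of the finite-variance boundary a=2
set.seed(1)
N <- 1e5
run_is <- function(a, N){
  x <- rexp(N, rate = a)                      # draws from the importance distribution
  w <- dexp(x, rate = 1) / dexp(x, rate = a)  # importance weights
  cumsum(w * x) / (1:N)                       # running estimates of E[X]=1
}
plot(run_is(1.95, N), type = "l", col = "yellow3", ylim = c(0.8, 1.2),
     xlab = "iterations", ylab = "running estimate")
lines(run_is(2.05, N), col = "wheat3")
abline(h = 1, lty = 2)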
truncated t’s [typo]
Posted in pictures, Statistics with tags arXiv, Elben, Hamburg, moments, truncated t distribution, typo on March 14, 2014 by xi'an

Last night, I received this email from Piero Foscari (in Hamburg) about my moment derivations for the absolute and the positive t distribution:
There might be two typos in the final second moment formula and its derivation (assuming no silly symmetric mistakes in my validation code): the first ν ought to be −ν, and there should be a corresponding scaling factor also for the boundary μ in P_{μ,ν−2} since it arises from a change of variable. Btw in the text reference to Fig. 2 |X| wasn't updated to X+. I hope that this is of some use.
and I checked that indeed I had forgotten the scale factor ν/(ν-2) in the t distribution with ν-2 degrees of freedom as well as the sign… So I modified the note and rearXived it. Sorry about this lack of attention to the derivation!
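A quick way to keep such derivations honest is to check the symmetric special cases numerically: for a standard t with ν degrees of freedom and ν>2, E[X²]=E[|X|²]=ν/(ν−2), echoing the ν/(ν−2) factor mentioned above, while E[|X|]=2√ν Γ((ν+1)/2)/(√π (ν−1) Γ(ν/2)) for ν>1, as in this small R check (with an arbitrary ν):

# numerical check of the second and first absolute moments of a standard t
nu <- 5   # arbitrary degrees of freedom, nu > 2
integrate(function(x) x^2 * dt(x, df = nu), -Inf, Inf)$value     # E[X^2] = E[|X|^2]
nu / (nu - 2)
integrate(function(x) abs(x) * dt(x, df = nu), -Inf, Inf)$value  # E[|X|]
2 * sqrt(nu) * gamma((nu + 1)/2) / (sqrt(pi) * (nu - 1) * gamma(nu/2))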