Archive for characteristic function

simulating a sum of Uniforms

Posted in Statistics on May 1, 2020 by xi'an

When considering the distribution of the sum (or average) of N Uniform variates, called Irwin-Hall for the sum and Bates for the average, simulating the N Uniforms and adding them up incurs a cost linear in N. The density of the resulting variate is well-known,

f_X(x;N)=\dfrac{1}{2(N-1)!}\sum_{k=0}^N (-1)^k{N \choose k} (x-k)^{N-1}\text{sign}(x-k)

but evaluating it is similarly of order N. Furthermore, controlling the terms in the alternating sum may prove delicate, as shown by the R function unifed::dirwin.hall(), whose code

# excerpt from unifed::dirwin.hall(): alternating sum of the density terms
for (k in 0:floor(x)) ret1 <- ret1 + (-1)^k * choose(n, k) * 
    (x - k)^(n - 1)

quickly becomes unreliable (although I managed an easy fix by using logs and a reference value for the magnitude of the terms in the summation; see the sketch at the end of this post). There is, however, a quick solution provided by [of course!] Devroye (NURVG, Section XIV.3, p.708), using the fact that the characteristic function of the Irwin-Hall distribution [for Uniforms over (-1,1)] is quite straightforward,

\Phi_N(t) = [\sin(t)/t]^N

which means the density can be bounded from above and results in an algorithm (NURVG, Section XIV.3, p.714) with complexity at most N to the power 5/8, even though this is not clearly spelled out in the book. Obviously, it can be objected that for N large enough, like N=20, the difference between the true distribution and the CLT approximation is quite negligible (reminding me of my early simulating days, when generating a Normal meant averaging a dozen Uniforms and rescaling properly!). But this is not an exact approach, and the correction proves too costly, as shown by Section XIV.4 on the simulation of sums in NURVG. So… the game is afoot!
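As a quick toy check of that last point (my own illustration, not taken from NURVG), one can measure how far the standardised Irwin-Hall variate sits from the Normal in R:

# toy check: standardised sum of N=20 Uniform(0,1) variates against N(0,1);
# the old "dozen Uniforms" Normal generator is the N=12 case
set.seed(123)
N <- 20
S <- rowSums(matrix(runif(1e5 * N), ncol = N))  # 1e5 Irwin-Hall draws
Z <- (S - N / 2) / sqrt(N / 12)                 # CLT standardisation
ks.test(Z, "pnorm")                             # the discrepancy with the exact Normal is tiny here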
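And, for the record, here is a minimal sketch of the kind of log-scale fix mentioned above (my own rendering, not the actual patch): each term is evaluated on the log scale and rescaled by the largest one before entering the alternating sum, which prevents individual terms from overflowing (the cancellation itself obviously remains).

# density of the sum of n Uniform(0,1) variates at 0 < x < n,
# with terms handled on the log scale and a reference value M
dirwin_hall_log <- function(x, n) {
  k  <- 0:floor(x)
  lt <- lchoose(n, k) + (n - 1) * log(x - k)  # log magnitude of each term
  M  <- max(lt)                               # reference value
  exp(M - lfactorial(n - 1)) * sum((-1)^k * exp(lt - M))
}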

αBC

Posted in Statistics on January 4, 2010 by xi'an

The week before last, Peters, Sisson and Fan posted a new ABC paper on arXiv, where they propose to use ABC to run Bayesian inference on α-stable distributions. Since those distributions are defined via a property of their characteristic functions,

\varphi(t) = \exp \left\{ \iota t \mu - |ct|^\alpha(1-\iota \beta \text{sign}(t)\Phi ) \right\}

where \iota^2=-1, \Phi=\tan(\pi\alpha/2) (for \alpha\neq 1), and \alpha,\beta,\mu,c are the parameters of the distribution, it is certainly of interest to find a way to handle those complex distributions by likelihood-free methods. My former student Roberto Casarin worked on this problem in the univariate case and proposed an MCMC approach, but he alas did not manage to get it published, due to another paper appearing at the same time… From a quick perusal, I think Peters et al.’s approach relies on a discretisation of the characteristic function which, while maybe unavoidable, modifies the nature of the problem. Within this framework, the paper examines the performance of several sets of summary statistics in terms of mean square error about the true parameter values, and compares the approach with MCMC when available. Because the behaviour of a genuine Bayesian estimate is unknown in this case, this comparison, while interesting, does not tell us whether or not the ABC approximation is doing well in this complex setting.
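What makes the likelihood-free route natural here is that exact simulation from an α-stable distribution remains cheap even though the density is intractable, e.g. via the Chambers-Mallows-Stuck representation. Here is a minimal R sketch of that generator (standard parameterisation, α≠1), followed by a toy ABC rejection step; the uniform priors, quantile summaries and 1% tolerance are purely illustrative and not those used in the paper.

# Chambers-Mallows-Stuck generator for alpha-stable variates (alpha != 1)
rstable <- function(n, alpha, beta, c = 1, mu = 0) {
  U <- runif(n, -pi / 2, pi / 2)               # uniform angle
  W <- rexp(n)                                 # unit exponential
  zeta <- beta * tan(pi * alpha / 2)
  B <- atan(zeta) / alpha
  S <- (1 + zeta^2)^(1 / (2 * alpha))
  X <- S * sin(alpha * (U + B)) / cos(U)^(1 / alpha) *
       (cos(U - alpha * (U + B)) / W)^((1 - alpha) / alpha)
  c * X + mu
}

# toy ABC rejection for (alpha, beta), with illustrative quantile summaries
set.seed(42)
obs <- rstable(500, alpha = 1.7, beta = 0.5)
summ <- function(x) quantile(x, c(.05, .25, .5, .75, .95))
sobs <- summ(obs)
M <- 1e4
theta <- cbind(alpha = runif(M, 1.1, 2), beta = runif(M, -1, 1))
dist <- apply(theta, 1, function(th)
  sqrt(sum((summ(rstable(500, th[1], th[2])) - sobs)^2)))
abc <- theta[dist < quantile(dist, .01), ]     # keep the 1% closest simulations
colMeans(abc)                                  # crude ABC posterior means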
