Archive for existence of unbiased estimators

on completeness

Posted in Books, Kids, Statistics on November 19, 2020 by xi'an

Another X validated question that proved a bit of a challenge, enough for me to return to its resolution over consecutive days. The question was about the completeness of the natural sufficient statistic associated with a sample from the shifted exponential distribution

f(x;\theta) = \frac{1}{\theta^2}\exp\{-\theta^{-2}(x-\theta)\}\mathbb{I}_{x>\theta}

[weirdly called negative exponential in the question] meaning the (minimal) sufficient statistic is made of the first order statistic and of the sample sum (or average), or equivalently

T=(X_{(1)},\sum_{i=2}^n \{X_{(i)}-X_{(1)}\})

Finding the joint distribution of T is rather straightforward as the first component is again a drifted Exponential and the second a Gamma variate with shape parameter n-1 and scale θ². (Devroye’s Bible can be invoked since the Gamma distribution follows from his section on Exponential spacings, p.211.) While the derivation of a function with constant expectation is straightforward for the alternate exponential distribution

f(x;\theta) = \frac{1}{\theta}\exp\{-\theta^{-1}(x-\theta)\}\mathbb{I}_{x>\theta}

since the ratio of the components of T has a fixed distribution, it proved harder in the current case, as I was seeking a parameter-free transform. When attempting to explain the difficulty on my office board, I realised I was seeking the wrong property, since a constant expectation was enough. Removing the dependence on θ from the expectation was simpler and, writing Y for the second component of T, led to

\mathbb{E}_\theta\left[\frac{X_{(1)}}{Y}-\frac{\Gamma(n-2)}{\Gamma(n-3/2)}Y^{-1/2}\right]=\frac{\Gamma(n-2)}{n\Gamma(n-1)}

as but one version of a transform with fixed expectation, which establishes that T is not complete. This also led me to wonder about the range of possible functions of θ one could use as scale and still retrieve the incompleteness of T. Any power of θ should work, but what about exp(θ²) or sin²(θ³), i.e., functions of θ for which there exists no unbiased estimator…?
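
To double-check numerically, here is a quick Monte Carlo verification in R (a sketch of mine, with the arbitrary choices n=10 and 10⁵ replications, not code from the original answer): it simulates the shifted exponential sample, builds both components of T, and confirms that the expectation above stays at Γ(n-2)/nΓ(n-1), i.e. 1/80 for n=10, whatever the value of θ.

# Monte Carlo check (sketch of mine) that the expectation of the above
# transform of T = (X_(1), Y) does not depend on theta
mc_check <- function(theta, n = 10, M = 1e5) {
  x  <- matrix(theta + theta^2 * rexp(n * M), nrow = M)  # M samples of size n
  x1 <- apply(x, 1, min)                                 # first order statistic
  y  <- rowSums(x) - n * x1                              # sum of the X_(i) - X_(1)
  mean(x1 / y - gamma(n - 2) / gamma(n - 3/2) * y^(-1/2))
}
mc_check(theta = 0.5)                 # both close to the constant
mc_check(theta = 3)                   # gamma(n-2) / (n * gamma(n-1))
gamma(10 - 2) / (10 * gamma(10 - 1))  # = 1/80 = 0.0125 for n = 10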

another Bernoulli factory

Posted in Books, Kids, pictures, R, Statistics on May 18, 2020 by xi'an

A question that came up on X validated asks for help in figuring out the UMVUE (uniformly minimum variance unbiased estimator) of (1-θ)^½ when observing an iid Bernoulli B(θ) sample. As it happens, there is no unbiased estimator of this quantity based on a fixed number of draws, since (1-θ)^½ is not a polynomial in θ, and hence no UMVUE. But there exists a Bernoulli factory producing a coin with probability (1-θ)^½ from draws of a coin with probability θ, hence a means to produce unbiased estimators of this quantity, although of course the UMVUE notion does not make sense in this sequential framework. While Nacu & Peres (2005) were uncertain there was a Bernoulli factory for θ^½, witness their Question #1, Mendo (2018) and Thomas and Blanchet (2018) showed that there does exist a Bernoulli factory solution for θ^a, 0≤a≤1, with constructive arguments that only require the series expansion of θ^½. In my answer to that question, using a straightforward R code, I tested the proposed algorithm, which indeed produces an unbiased estimate of θ^½… (Most surprisingly, the question got closed as a “self-study” question, which sounds absurd since it could not occur as an exercise or an exam question, unless the instructor is particularly clueless.)
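
As an illustration, here is a sketch of mine of the series-expansion construction (in the spirit of Mendo, 2018, rather than the exact code posted in my answer), based on 1-√p = Σ_{k≥1} c_k(1-p)^k with c_1 = 1/2 and c_{k+1} = c_k·(2k-1)/(2(k+1)); applying it to the inverted coin delivers an unbiased estimate of (1-θ)^½.

# Bernoulli factory for sqrt(p), in the spirit of Mendo (2018); sketch of mine
sqrt_factory <- function(flip) {     # flip() returns one draw of the input coin
  k <- 1; c_k <- 1/2; tail <- 1      # tail = 1 - sum of the earlier c_j's
  repeat {
    if (flip() == 1) return(1)              # heads: output 1
    if (runif(1) < c_k / tail) return(0)    # tails: output 0 w.p. c_k / tail
    tail <- tail - c_k
    c_k  <- c_k * (2 * k - 1) / (2 * (k + 1))
    k    <- k + 1
  }
}
theta <- 0.3                          # feed the factory the inverted B(theta) coin
out <- replicate(1e5, sqrt_factory(function() 1 - rbinom(1, 1, theta)))
mean(out)                             # about sqrt(1 - 0.3) = 0.837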

unbiased estimators that do not exist

Posted in Statistics on January 21, 2019 by xi'an

When looking at questions on X validated, I came across this seemingly obvious request for an unbiased estimator of P(X=k), when X~B(n,p). Except that X is not observed, only Y~B(s,p) with s<n. Since P(X=k) is a polynomial in p, I was expecting such an unbiased estimator to exist. But it does not, for the reason that any function of Y, which only takes s+1 values, including the MLE of P(X=k), has an expectation that is a polynomial in p of degree at most s, while P(X=k) is a polynomial in p of degree exactly n>s. It is thus straightforward to establish properly that no unbiased estimator exists. But this remains an interesting additional example of the rarity of the existence of unbiased estimators, to be saved for a future mathematical statistics exam!
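
For the curious, a small numerical illustration in R (a sketch of mine, with the arbitrary values n=5, s=3, k=2): solving for the δ(Y) that matches E_p[δ(Y)] with P(X=k) at s+1 values of p determines δ entirely, and the resulting bias at other values of p is not zero, in agreement with the impossibility.

# Illustration (sketch of mine): no delta(Y) can be unbiased for P(X = k)
n <- 5; s <- 3; k <- 2
p_fit <- c(.2, .4, .6, .8)                                # s + 1 calibration points
A <- outer(p_fit, 0:s, function(p, y) dbinom(y, s, p))    # weights in E_p[delta(Y)]
delta <- solve(A, dbinom(k, n, p_fit))                    # delta matching P(X = k) there
p_new <- seq(.05, .95, by = .1)
bias <- sapply(p_new, function(p) sum(delta * dbinom(0:s, s, p))) - dbinom(k, n, p_new)
round(bias, 4)    # non-zero away from p_fit: unbiasedness cannot hold for all p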
