Properties of estimators | Easy


A Variance Limit Result and Its Converse

Suppose $X_1, X_2, X_3, \ldots$ is a sequence of random variables, all with the same expectation $\mu$. A consequence of Chebyshev's theorem is:

If $\lim_{n \to \infty} V[X_n] = 0$, then $X_n$ converges in probability to $\mu$.
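(For completeness, here is the one-line Chebyshev bound behind this implication, which the problem statement takes as given: for every $\epsilon > 0$,

$$P\big(|X_n - \mu| \ge \epsilon\big) \le \frac{V[X_n]}{\epsilon^2} \xrightarrow[n \to \infty]{} 0.)$$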

Prove that the converse is true or provide a counterexample:

If $X_n$ converges in probability to $\mu$, then $\lim_{n \to \infty} V[X_n] = 0$.

Solution:

The converse statement is false. For a counterexample, let each $X_i$ be discrete with pmf,

$$f_i(x) = \begin{cases}1-\frac{1}{i}, & x=0 \\ \frac{1}{2i}, & x \in \{-i, i\} \\ 0, & \text{otherwise} \end{cases}$$

Since $X_i$ is symmetric about 0, $E[X_i] = 0 = \mu$. Moreover, for any $\epsilon > 0$, the window $(-\epsilon, \epsilon)$ contains 0, so $P(X_i \in (-\epsilon, \epsilon)) \ge f_i(0) = 1 - \frac{1}{i}$. This approaches 1 as $i \to \infty$, so $X_n$ converges in probability to $\mu = 0$.
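(Equivalently, as a tail bound: for any fixed $\epsilon$ with $0 < \epsilon \le i$,

$$P\big(|X_i - 0| \ge \epsilon\big) = P\big(X_i \in \{-i, i\}\big) = \frac{1}{i} \longrightarrow 0,$$

which is exactly the definition of convergence in probability to 0.)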

We can also compute,

$$V[X_i] = E[ (X_i - 0)^2 ] = E[X_i^2] = \sum_{x \in \operatorname{supp}(X_i)} x^2 f_i(x) = (-i)^2 \cdot \frac{1}{2i} + 0^2 \cdot \left(1 - \frac{1}{i}\right) + i^2 \cdot \frac{1}{2i} = i$$

Hence $V[X_i] = i \to \infty$ as $i \to \infty$, so the variance does not tend to zero, and the converse fails.
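As a quick numerical sanity check, here is a minimal simulation sketch (assuming NumPy; the tolerance $\epsilon = 0.5$ and the sample size are arbitrary illustrative choices, not part of the original solution). It samples from $f_i$ for increasing $i$ and shows the empirical tail probability shrinking like $1/i$ while the sample variance grows like $i$:

```python
import numpy as np

rng = np.random.default_rng(seed=0)
eps = 0.5            # arbitrary tolerance for the in-probability check
n_samples = 100_000  # arbitrary sample size per i

for i in (10, 100, 1000):
    # Draw from f_i: P(X = 0) = 1 - 1/i, P(X = -i) = P(X = i) = 1/(2i)
    x = rng.choice([-i, 0, i], size=n_samples,
                   p=[1 / (2 * i), 1 - 1 / i, 1 / (2 * i)])
    tail = np.mean(np.abs(x) >= eps)  # empirical P(|X_i| >= eps); theory: 1/i
    var = x.var()                     # empirical V[X_i]; theory: i
    print(f"i={i:5d}  P(|X_i| >= {eps}) ~ {tail:.4f}  V[X_i] ~ {var:.1f}")
```

The empirical tail probability heads to 0 (so $X_i \to 0$ in probability) even as the variance blows up, matching the computation above.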