Suppose that $X_1, X_2, \ldots$ is a sequence of i.i.d. random variables, each uniformly distributed on $[0,a]$, where $a > 0$ is an unknown parameter. Define $M_n = \frac{2}{n} \sum_{i=1}^n X_i$.
a. Prove that each $M_n$ is an unbiased estimator for $a$.
b. Prove that $\{M_n\}$ is a consistent estimator for $a$.
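A sketch of one possible argument, using the standard facts $E[X_i] = a/2$ and $V[X_i] = a^2/12$ for the uniform distribution on $[0,a]$:

$$E[M_n] = \frac{2}{n}\sum_{i=1}^n E[X_i] = \frac{2}{n} \cdot n \cdot \frac{a}{2} = a,$$

so each $M_n$ is unbiased. By independence, $V[M_n] = \frac{4}{n^2} \cdot n \cdot \frac{a^2}{12} = \frac{a^2}{3n} \to 0$, so Chebyshev's inequality gives, for every $\epsilon > 0$,

$$P(|M_n - a| \geq \epsilon) \leq \frac{a^2}{3n\epsilon^2} \to 0,$$

which is consistency.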
Suppose that $\{X_1, X_2, \ldots\}$ is a sequence of independent Bernoulli random variables with common parameter $p$. For each $n \in \{1, 2, \ldots\}$, define the estimator $M_n = X_1$.
a. Prove that each $M_n$ is an unbiased estimator for $p$.
b. Prove that $\{M_n\}$ is not a consistent estimator for $p$.
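One way this can go, assuming the nondegenerate case $0 < p < 1$ (for $p \in \{0,1\}$ the estimator is constant and trivially consistent): unbiasedness is immediate, since $E[M_n] = E[X_1] = p$ for every $n$. For part b, fix $\epsilon < \min(p, 1-p)$. Since $X_1 \in \{0,1\}$, the deviation $|X_1 - p|$ equals either $p$ or $1-p$, so

$$P(|M_n - p| \geq \epsilon) = P(|X_1 - p| \geq \epsilon) = 1 \quad \text{for all } n,$$

which cannot tend to $0$.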
Suppose that you buy $n$ silicon chips, numbered $1, 2, \ldots, n$. Let the random variable $L_i$ represent the lifespan of chip $i$, and assume the $L_i$ are i.i.d. with common expectation $2$. The manufacturer agrees to refund you an amount (in dollars) equal to $\left(4 - \frac{1}{n}\sum_{i=1}^n L_i\right)^2$.
Prove that your refund converges in probability to 4 dollars as $n \to \infty$.
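A sketch under the usual assumptions (the $L_i$ are i.i.d. with finite mean, so the weak law of large numbers applies): let $\bar{L}_n = \frac{1}{n}\sum_{i=1}^n L_i$. By the WLLN, $\bar{L}_n \xrightarrow{P} 2$, and since $g(x) = (4 - x)^2$ is continuous, the continuous mapping theorem gives

$$g(\bar{L}_n) \xrightarrow{P} g(2) = (4-2)^2 = 4.$$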
Suppose $X_1, X_2, X_3, \ldots$ is a sequence of random variables, all with the same expectation $\mu$. A consequence of Chebyshev's theorem is:
If $\lim_{n \to \infty} V[X_n] = 0$, then $X_n$ converges in probability to $\mu$.
Prove that the converse is true or provide a counterexample:
If $X_n$ converges in probability to $\mu$, then $\lim_{n \to \infty} V[X_n] = 0$.
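The converse fails. One counterexample of many (any sequence with rare but increasingly large excursions works): take

$$X_n = \begin{cases} \mu + n & \text{with probability } \frac{1}{2n},\\ \mu - n & \text{with probability } \frac{1}{2n},\\ \mu & \text{with probability } 1 - \frac{1}{n}. \end{cases}$$

Then $E[X_n] = \mu$ for every $n$, and for any fixed $\epsilon > 0$ we have $P(|X_n - \mu| \geq \epsilon) \leq 1/n \to 0$, so $X_n$ converges in probability to $\mu$; yet $V[X_n] = n^2 \cdot \frac{1}{n} = n \to \infty$, so the variances do not tend to $0$.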