Problem (BLP | Easy): Closer Look at Moment Conditions

Suppose X and Y have joint mass function given by,

$$f(x,y) = \begin{cases} 0.25, & (x,y) \in \{ (1,1), (1,-1), (-1,1), (-1,-1) \} \\
0, & \text{otherwise} \end{cases}$$

Consider a linear predictor, $g(x) = \beta_0 + \beta_1 x$, and let $\epsilon = Y - g(X)$ be the error term.
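
Before working through the parts below, it can help to encode the joint PMF so the arithmetic can be checked numerically. The following is a minimal sketch; the dictionary `pmf` and the helper `E` are illustrative names, not part of the problem.

```python
from itertools import product

# Joint PMF from the problem: probability 0.25 on each of the four points.
pmf = {(x, y): 0.25 for x, y in product([1, -1], [1, -1])}

def E(h):
    """E[h(X, Y)] under the joint PMF, computed by LOTUS."""
    return sum(p * h(x, y) for (x, y), p in pmf.items())

# Example usage: the probabilities sum to one.
print(E(lambda x, y: 1.0))  # 1.0
```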

  1. Compute $E[\epsilon]$ in terms of $\beta_0$ and $\beta_1$.

Now set $\beta_0$ to fulfill the first moment condition: $E[\epsilon] = 0$.

  2. Compute $E[X \epsilon]$ in terms of $\beta_1$.

Now set $\beta_1$ to fulfill the second moment condition: $E[X \epsilon] = 0$.

  3. What is the BLP?

Solution:
  1. $E[\epsilon] = E[Y - \beta_0 - \beta_1 X] = E[Y] - \beta_0 - \beta_1 E[X]$.

    From the PMF, the marginal probabilities are $P(X=1) = P(X=-1) = 0.5$ and $P(Y=1) = P(Y=-1) = 0.5$.

    Using these probabilities, $E[Y] = 1 \times 0.5 + (-1) \times 0.5 = 0$ and $E[X] = 1 \times 0.5 + (-1) \times 0.5 = 0$.

    Therefore, $E[\epsilon] = 0 - \beta_0 - \beta_1 \times 0 = -\beta_0$.

    For $E[\epsilon] = 0$, we need $\beta_0 = 0$.

  2. With $\beta_0 = 0$ from part (1), $E[X \epsilon] = E[X(Y - \beta_1 X)] = E[XY - \beta_1 X^2] = E[XY] - \beta_1 E[X^2]$.

    To find $E[X^2]$, we use the probabilities from part (1) and LOTUS: $E[X^2] = 1^2 \times 0.5 + (-1)^2 \times 0.5 = 1$.

    To find $E[XY]$, we have $E[XY] = 1 \times 1 \times 0.25 + 1 \times (-1) \times 0.25 + (-1) \times 1 \times 0.25 + (-1) \times (-1) \times 0.25 = 0$.

    So $E[X \epsilon] = 0 - \beta_1 \times 1 = -\beta_1$.

    To get $E[X \epsilon] = 0$, we need $\beta_1 = 0$.

  3. The BLP is $g(x) = 0 \times x + 0 = 0$, the horizontal line through the origin. Numerical cross-checks of these computations follow below.
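
As a sanity check on parts (1) and (2), the sketch below repeats the PMF setup so it runs on its own and evaluates $E[\epsilon]$ and $E[X\epsilon]$ for an arbitrary choice of coefficients; the names `pmf`, `E`, `eps`, `beta0`, and `beta1` are illustrative.

```python
from itertools import product

# Joint PMF: probability 0.25 on each of the four points (repeated so this runs standalone).
pmf = {(x, y): 0.25 for x, y in product([1, -1], [1, -1])}

def E(h):
    """E[h(X, Y)] under the joint PMF (LOTUS)."""
    return sum(p * h(x, y) for (x, y), p in pmf.items())

def eps(x, y):
    """Error of the linear predictor g(x) = beta0 + beta1 * x."""
    return y - (beta0 + beta1 * x)

# For any coefficients, E[eps] = -beta0 and E[X eps] = -beta1 under this PMF.
beta0, beta1 = 0.5, -0.25
print(E(eps))                          # -0.5  (= -beta0)
print(E(lambda x, y: x * eps(x, y)))   #  0.25 (= -beta1)

# Both moment conditions therefore hold exactly when beta0 = beta1 = 0.
beta0, beta1 = 0.0, 0.0
print(E(eps), E(lambda x, y: x * eps(x, y)))   # 0.0 0.0
```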
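
For part (3), the same two moment conditions can also be solved in closed form: $E[\epsilon] = 0$ gives $\beta_0 = E[Y] - \beta_1 E[X]$, and substituting into $E[X\epsilon] = 0$ gives $\beta_1 = \mathrm{Cov}(X,Y)/\mathrm{Var}(X)$. The sketch below, again self-contained, computes both coefficients this way and confirms the BLP is the zero function; it is a cross-check, not part of the original solution.

```python
from itertools import product

# Joint PMF: probability 0.25 on each of the four points.
pmf = {(x, y): 0.25 for x, y in product([1, -1], [1, -1])}

def E(h):
    """E[h(X, Y)] under the joint PMF (LOTUS)."""
    return sum(p * h(x, y) for (x, y), p in pmf.items())

# Closed-form solution of the two moment conditions:
#   beta1 = Cov(X, Y) / Var(X),  beta0 = E[Y] - beta1 * E[X]
EX, EY = E(lambda x, y: x), E(lambda x, y: y)
cov_xy = E(lambda x, y: x * y) - EX * EY   # 0.0
var_x = E(lambda x, y: x * x) - EX ** 2    # 1.0
beta1 = cov_xy / var_x                     # 0.0
beta0 = EY - beta1 * EX                    # 0.0
print(beta0, beta1)                        # 0.0 0.0 -> BLP g(x) = 0
```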