Suppose $X$ and $Y$ have joint density given by,
$$f(x,y) = \begin{cases} 1, & 0 \leq x \leq 1,\ 0 \leq y \leq 1 \\
0, & \text{otherwise}
\end{cases}
$$
Prove that the linear predictor $g(x) = 0.5$ fulfills the first population moment condition, $E[\epsilon] = 0$, where $\epsilon = Y - g(X)$.
Prove that the linear predictor $g(x) = 0.5$ fulfills the second population moment condition, $E[X\epsilon] = 0$.
Conclude that $g$ is the best linear predictor (BLP) of $Y$ given $X$.
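A quick Monte Carlo check can accompany the proof. The sketch below (Python with NumPy, illustrative only and not part of the exercise) draws from the uniform density on the unit square and verifies that both sample moments are near zero for $g(x) = 0.5$.

```python
import numpy as np

# Monte Carlo sanity check of the two population moment conditions
# under the uniform density on the unit square, with g(x) = 0.5.
rng = np.random.default_rng(0)
n = 1_000_000
x = rng.uniform(0.0, 1.0, size=n)
y = rng.uniform(0.0, 1.0, size=n)  # X and Y are independent under this density

eps = y - 0.5                      # epsilon = Y - g(X) with g(x) = 0.5

print("E[eps]   ~", eps.mean())        # should be close to 0
print("E[X*eps] ~", (x * eps).mean())  # should be close to 0
```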
Suppose $X$ and $Y$ have joint mass function given by,
$$f(x,y) = \begin{cases} 0.25, & (x,y) \in \{ (1,1), (1,-1), (-1,1), (-1,-1) \} \\
0, & \text{otherwise}
\end{cases}
$$
Consider a linear predictor, $g(x) = \beta_0 + \beta_1 x$, and let $\epsilon = Y - g(X)$ be the error term.
Set $\beta_0$ so that the first moment condition, $E[\epsilon] = 0$, is satisfied.
Then set $\beta_1$ so that the second moment condition, $E[X \epsilon] = 0$, is satisfied.
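Whatever values you obtain can be checked mechanically. The sketch below (Python, illustrative only) enumerates the four support points, computes the needed expectations, and solves the two moment conditions as a linear system in $(\beta_0, \beta_1)$.

```python
import numpy as np

# Support of the joint mass function and its probabilities.
points = [(1, 1), (1, -1), (-1, 1), (-1, -1)]
probs = [0.25] * 4

def E(h):
    """Expectation of h(x, y) under the joint mass function."""
    return sum(p * h(x, y) for (x, y), p in zip(points, probs))

# With eps = Y - b0 - b1*X, the conditions E[eps] = 0 and E[X eps] = 0 become
#   b0        + b1*E[X]   = E[Y]
#   b0*E[X]   + b1*E[X^2] = E[XY]
A = np.array([[1.0, E(lambda x, y: x)],
              [E(lambda x, y: x), E(lambda x, y: x * x)]])
b = np.array([E(lambda x, y: y), E(lambda x, y: x * y)])
beta0, beta1 = np.linalg.solve(A, b)
print("beta0 =", beta0, " beta1 =", beta1)
```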
Consider random variables $X$ and $Y$. In the technique called regression through the origin, we are interested in linear predictors of the form,
$$g(x) = b_1 x$$
In other words, linear predictors that pass through the origin. Given such a predictor, define $\epsilon = Y - g(X)$ as always. We are interested in minimizing mean squared error:
$$\beta_1 = \text{argmin}_{b_1} E[\epsilon^2]$$
Examine the proof on page 77 of Agnostic Statistics and consider how it would be different for regression through the origin.
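As a reference point for that comparison, here is a minimal sketch of the first-order condition for regression through the origin, assuming differentiation under the expectation is justified:
$$
\frac{d}{db_1} E\big[(Y - b_1 X)^2\big] = -2\,E\big[X(Y - b_1 X)\big] = 0
\quad \Longrightarrow \quad
\beta_1 = \frac{E[XY]}{E[X^2]},
$$
provided $E[X^2] > 0$. Only the single moment condition $E[X\epsilon] = 0$ appears here; with no intercept term, nothing in the minimization forces $E[\epsilon] = 0$.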
In $\mathbb{R}^2$, define square region $A$ with corners at $\{(0,1), (1,1), (1,2), (0,2)\}$ and square region $B$ with corners at $\{(0.5,0), (1.5,0), (1.5,1), (0.5,1)\}$. Suppose that random variables $X$ and $Y$ have joint density given by,
$$f(x,y) = \begin{cases} 0.5, & (x,y) \in A \cup B \\
0, & \text{otherwise}
\end{cases}
$$
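For whatever quantities this density is used to compute, a sampler is a handy sanity check. The sketch below (Python, illustrative only) draws from the density by choosing one of the two unit squares with probability 0.5 each and sampling uniformly within it, then prints a few sample moments.

```python
import numpy as np

# Sample from the density that equals 0.5 on A ∪ B: each unit square carries
# probability 0.5, so pick a square at random, then a uniform point inside it.
rng = np.random.default_rng(0)
n = 1_000_000
in_A = rng.random(n) < 0.5

# Square A: [0, 1] x [1, 2];  square B: [0.5, 1.5] x [0, 1].
x = np.where(in_A, rng.uniform(0.0, 1.0, n), rng.uniform(0.5, 1.5, n))
y = np.where(in_A, rng.uniform(1.0, 2.0, n), rng.uniform(0.0, 1.0, n))

print("E[X] ~", x.mean(), " E[Y] ~", y.mean())
print("Cov(X, Y) ~", np.cov(x, y)[0, 1])
```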
Suppose random variables $X$ and $Y$ have joint density given by,
$$f(x,y) = \begin{cases} 1/x, & 0 < x < 1,\ 0 < y < x \\
0, & \text{otherwise}
\end{cases}
$$
This is a long (but revealing) way to compute $V[Y]$.
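Assuming the intended route conditions on $X$ (the law of total variance), the sketch below (Python, illustrative only) checks that route against a direct Monte Carlo estimate, using the fact that this density factors as $X \sim \text{Uniform}(0,1)$ and $Y \mid X = x \sim \text{Uniform}(0, x)$.

```python
import numpy as np

# The joint density f(x, y) = 1/x on 0 < y < x < 1 factors as
#   X ~ Uniform(0, 1)  and  Y | X = x ~ Uniform(0, x).
rng = np.random.default_rng(0)
n = 1_000_000
x = rng.uniform(0.0, 1.0, size=n)
y = rng.uniform(0.0, x)            # Y | X = x  ~  Uniform(0, x)

# Direct Monte Carlo estimate of V[Y].
print("V[Y] (direct)         ~", y.var())

# Law-of-total-variance route: V[Y] = E[V[Y|X]] + V[E[Y|X]],
# with V[Y|X] = X^2 / 12 and E[Y|X] = X / 2 for a Uniform(0, X) conditional.
print("E[V[Y|X]] + V[E[Y|X]] ~", (x**2 / 12).mean() + (x / 2).var())
```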