# A variance-bias decomposition of L1 norm

Suppose we have real random variables ${X,X',Y,Y'}$ with ${X,X'\sim F}$ and ${Y,Y'\sim G}$, where ${F}$ and ${G}$ are cumulative distribution functions, ${X,X',Y,Y'}$ are all independent, and ${\mathbb{E}|X|,\mathbb{E}|Y|}$ are finite. We prove the following theorem.

Theorem 1 (Variance-bias decomposition of the ${L1}$ norm) For the independent random variables ${X,X',Y,Y'}$ defined above, we have

$\displaystyle 2\mathbb{E}|X-Y| = \mathbb{E}|X-X'| + \mathbb{E}|Y-Y'| +2 \int (G(u)-F(u))^2du.$

Thus

$\displaystyle \begin{aligned} 2\mathbb{E}|X-Y| \geq \mathbb{E}|X-X'| + \mathbb{E}|Y-Y'| \end{aligned}$ \ \ \ \ \ (1)

with equality if and only if ${G=F}$.
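The identity is easy to check numerically. Below is a minimal Monte Carlo sketch; the choice ${X\sim N(0,1)}$, ${Y\sim N(1,2^2)}$ is purely an illustrative assumption. The left-hand side should match the two spread terms plus the squared-CDF-gap integral up to simulation error.

```python
import numpy as np
from scipy import integrate, stats

rng = np.random.default_rng(0)
n = 1_000_000

# Illustrative choice: X, X' ~ N(0, 1) and Y, Y' ~ N(1, 2^2), all independent.
x, xp = rng.normal(0, 1, n), rng.normal(0, 1, n)
y, yp = rng.normal(1, 2, n), rng.normal(1, 2, n)

lhs = 2 * np.mean(np.abs(x - y))                             # 2 E|X - Y|
spread = np.mean(np.abs(x - xp)) + np.mean(np.abs(y - yp))   # E|X-X'| + E|Y-Y'|

# The "bias" term: the integral of (G(u) - F(u))^2 over the real line.
F, G = stats.norm(0, 1).cdf, stats.norm(1, 2).cdf
bias, _ = integrate.quad(lambda u: (G(u) - F(u)) ** 2, -np.inf, np.inf)

print(lhs, spread + 2 * bias)  # the two numbers should agree up to Monte Carlo error
```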

The quantity ${\mathbb{E}|X-X'|}$ is usually referred to as the mean absolute difference, and it measures the spread of a distribution. I don’t know an established term for the quantity ${\mathbb{E}|X-Y|}$, but what it measures is the difference between the distributions ${F}$ and ${G}$. I think cross mean difference would be a nice name.

The equality can be considered an analogue of the well-known variance-bias decomposition of estimators/predictors in statistics. If we think of using ${X}$ to estimate/predict ${Y}$, then the expected error (the cross mean difference) in terms of absolute value (the ${L1}$ norm, in more advanced terms) is the sum of the mean absolute differences of ${X}$ and ${Y}$, i.e., $\mathbb{E}|X-X'|, \mathbb{E}|Y-Y'|$, which can be considered variance, and the difference between the two distributions, i.e., ${\int(F-G)^2}$, which can be considered bias.

There is an analogue in terms of the usual square loss (or ${L2}$ norm) and it is

$\displaystyle 2\mathbb{E}(X-Y)^2 = \mathbb{E}(X-X')^2 + \mathbb{E}(Y-Y')^2 + 2(\mathbb{E}X-\mathbb{E}Y)^2.$

Under this setting (which additionally requires ${X}$ and ${Y}$ to have finite second moments), we also see a decomposition of the estimation/prediction error in terms of the variances of ${X}$ and ${Y}$, i.e., ${\mathbb{E}(X-X')^2, \mathbb{E}(Y-Y')^2}$, and the difference of means, which can be considered bias as well.
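As a quick sanity check of the ${L2}$ analogue, here is a small simulation sketch; the distributions Exp(1) and Uniform(0,3) are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Illustrative choice: X, X' ~ Exp(1) and Y, Y' ~ Uniform(0, 3), all independent.
x, xp = rng.exponential(1.0, n), rng.exponential(1.0, n)
y, yp = rng.uniform(0.0, 3.0, n), rng.uniform(0.0, 3.0, n)

lhs = 2 * np.mean((x - y) ** 2)                       # 2 E(X - Y)^2
rhs = (np.mean((x - xp) ** 2) + np.mean((y - yp) ** 2)
       + 2 * (x.mean() - y.mean()) ** 2)              # variances + squared mean gap

# Here E(X-Y)^2 = Var X + Var Y + (EX - EY)^2 = 1 + 0.75 + 0.25 = 2,
# so both sides should be close to 4.
```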

The theorem assumes ${X,Y}$ both have finite first moments. In case either ${X}$ or ${Y}$ has no finite first moment, the decomposition and the inequality still hold, as can be seen by inspecting the proof below. However, equality in inequality (1) then no longer necessarily implies that ${F=G}$.

Proof: The trick to establish the equality is to write the quantity ${\mathbb{E}|X-Y|}$ in the following form.

$\displaystyle \begin{aligned} 2\mathbb{E}|X-Y| & = 2\mathbb{E}(X-Y) 1_{\{ X\geq Y\}} + 2\mathbb{E}(Y-X) 1_{\{ Y\geq X\}}\\ & =2 \mathbb{E}\int 1_{\{Y\leq u\leq X\}}du + 2 \mathbb{E}\int 1_{\{ X\leq u\leq Y\}}du\\ & =2 \int \mathbb{P}(Y\leq u\leq X)du+ 2\int \mathbb{P}(X\leq u\leq Y)du\\ & =2\int G(u)(1-F(u))du +2\int F(u)(1-G(u))du\\ & = 2\int G(u)(1-F(u)) + F(u)(1-G(u)) du. \end{aligned}$ \ \ \ \ \ (2)

The third equality is due to Fubini’s theorem, and the fourth follows from the independence of ${X}$ and ${Y}$ (here ${\mathbb{P}(X\geq u)=1-F(u)}$ for all but the at most countably many atoms of ${F}$, which do not affect the integral, and similarly for ${G}$). Similarly, we have

$\displaystyle \mathbb{E}|X-X'| +\mathbb{E}|Y-Y'| =2 \int F(u)(1-F(u)) + G(u)(1-G(u)) du.$
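For example, with ${F}$ the Uniform(0,1) CDF this formula gives ${\mathbb{E}|X-X'| = 2\int_0^1 u(1-u)\,du = 1/3}$, which a short simulation confirms (a sketch, using Uniform(0,1) purely for illustration):

```python
import numpy as np
from scipy import integrate

rng = np.random.default_rng(2)
n = 1_000_000

# Two independent Uniform(0,1) draws; E|U - U'| should be 1/3.
u, up = rng.random(n), rng.random(n)
mad = np.mean(np.abs(u - up))

# The CDF formula: 2 * integral of F(t)(1 - F(t)) dt with F(t) = t on [0, 1].
formula, _ = integrate.quad(lambda t: 2 * t * (1 - t), 0.0, 1.0)  # = 2*(1/2 - 1/3) = 1/3
```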

Thus the difference between ${2\mathbb{E}|X-Y|}$ and ${\mathbb{E}|X-X'|+\mathbb{E}|Y-Y'|}$, which is finite because ${\mathbb{E}|X|,\mathbb{E}|Y|<\infty}$, is

$\displaystyle \begin{aligned} &2\mathbb{E}|X-Y|-\mathbb{E}|X-X'|-\mathbb{E}|Y-Y'|\\ =&2\int G(u)(1-F(u)) + F(u)(1-G(u)) du -2 \int F(u)(1-F(u)) + G(u)(1-G(u)) du\\ = &2\int (G(u)-F(u))^2du \geq 0. \end{aligned}$ \ \ \ \ \ (3)

Thus the decomposition in the theorem and inequality (1) are established. Now we argue that equality in inequality (1) holds if and only if ${F=G}$. If ${F=G}$, then inequality (1) obviously becomes an equality. Conversely, if inequality (1) becomes an equality, then by the last line of (3) we have

$\displaystyle \int (G(u)-F(u))^2 du = 0.$

This means that

$\displaystyle G(u) = F(u)\quad \text{almost everywhere}.$

But since ${G}$ and ${F}$ are right continuous, it follows that for all ${u\in \mathbb{R}}$,

$\displaystyle G(u) = F(u).$

$\Box$