Suppose we have real random variables $X_1, X_2 \sim F$ and $Y_1, Y_2 \sim G$, where $F$ and $G$ are cumulative distribution functions, $X_1, X_2, Y_1, Y_2$ are all independent, and $E|X_1|, E|Y_1|$ are finite. We prove the following theorem.

**Theorem 1 (Variance-bias decomposition of the $L^1$ norm).** For the independent random variables defined above, we have

$$E|X_1 - Y_1| = \frac{1}{2}E|X_1 - X_2| + \frac{1}{2}E|Y_1 - Y_2| + \int_{-\infty}^{\infty}\big(F(t) - G(t)\big)^2\,dt,$$

and consequently

$$E|X_1 - Y_1| \;\ge\; \frac{1}{2}E|X_1 - X_2| + \frac{1}{2}E|Y_1 - Y_2|, \tag{1}$$

with equality if and only if $F = G$.

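As a quick sanity check, the decomposition can be verified numerically by Monte Carlo. Below is a minimal sketch in Python; the choice $F = N(0,1)$ and $G = N(1,4)$ is arbitrary, purely for illustration.

```python
# Monte Carlo sanity check of the L^1 variance-bias decomposition.
# Hypothetical example distributions: F = N(0, 1), G = N(1, 4).
import math
import random

random.seed(0)
n = 200_000
x1 = [random.gauss(0, 1) for _ in range(n)]
x2 = [random.gauss(0, 1) for _ in range(n)]
y1 = [random.gauss(1, 2) for _ in range(n)]
y2 = [random.gauss(1, 2) for _ in range(n)]

def mean_abs_diff(a, b):
    """Empirical E|A - B| from paired samples."""
    return sum(abs(u - v) for u, v in zip(a, b)) / len(a)

lhs = mean_abs_diff(x1, y1)      # E|X1 - Y1|, the "cross mean difference"
gini_x = mean_abs_diff(x1, x2)   # E|X1 - X2|, mean absolute difference of F
gini_y = mean_abs_diff(y1, y2)   # E|Y1 - Y2|, mean absolute difference of G

def Phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Riemann sum of the bias term  int (F(t) - G(t))^2 dt  on a wide grid.
dt = 0.001
bias = sum((Phi(t) - Phi((t - 1) / 2)) ** 2 * dt
           for t in (i * dt for i in range(-20_000, 20_001)))

rhs = 0.5 * gini_x + 0.5 * gini_y + bias
print(lhs, rhs)  # the two sides agree up to Monte Carlo error
```
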
The quantity $E|X_1 - X_2|$ is usually referred to as the **mean absolute difference** and it measures the spread of a distribution. I don’t know the term for the quantity $E|X_1 - Y_1|$, but what it measures is the difference between the distributions $F$ and $G$. I think **cross mean difference** would be a nice name.

The equality can be considered as an analogue of the well-known variance-bias decomposition of estimators/predictors in statistics. If we think we are using $Y_1$ to estimate/predict $X_1$, then the expected error (the cross mean difference) in terms of absolute value ($L^1$ norm in more advanced terms) is the sum of the mean absolute differences of $X$ and $Y$, *i.e.*, $\frac{1}{2}E|X_1 - X_2| + \frac{1}{2}E|Y_1 - Y_2|$, which can be considered as variance, and the difference between the two distributions, *i.e.*, $\int_{-\infty}^{\infty}(F(t) - G(t))^2\,dt$, which can be considered as bias.

There is an analogue in terms of the usual square loss (or $L^2$ norm), and it is

$$E(X_1 - Y_1)^2 = \operatorname{Var}(X_1) + \operatorname{Var}(Y_1) + \big(EX_1 - EY_1\big)^2. \tag{2}$$

Under this setting, we also see a decomposition of the estimation/prediction error in terms of the variances of $X_1$ and $Y_1$, *i.e.*, $\operatorname{Var}(X_1) + \operatorname{Var}(Y_1)$, and the difference of the means, $(EX_1 - EY_1)^2$, can be considered as bias as well.
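For completeness, the square-loss identity can be derived in two lines from the independence of $X_1$ and $Y_1$ (note that it requires finite second moments, a stronger assumption than the theorem's):

$$\begin{aligned}
E(X_1 - Y_1)^2 &= \operatorname{Var}(X_1 - Y_1) + \big(E(X_1 - Y_1)\big)^2 \\
&= \operatorname{Var}(X_1) + \operatorname{Var}(Y_1) + \big(EX_1 - EY_1\big)^2,
\end{aligned}$$

since $\operatorname{Var}(X_1 - Y_1) = \operatorname{Var}(X_1) + \operatorname{Var}(Y_1)$ for independent $X_1$ and $Y_1$.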

The theorem assumes both $X_1$ and $Y_1$ have finite first moments. In case either $X_1$ or $Y_1$ has no finite first moment, the equality of the decomposition and inequality (1) are still true (with both sides possibly infinite) by inspecting the proof below. But then equality in inequality (1) does not necessarily imply that $F = G$: for example, if the $X_i$ are standard Cauchy and the $Y_i$ are Cauchy with a different location, both sides of (1) are infinite even though $F \neq G$.

*Proof:* The trick to establish the equality is to write the quantity $E|X_1 - Y_1|$ in the following form:

$$\begin{aligned}
E|X_1 - Y_1| &= E\int_{-\infty}^{\infty}\mathbf{1}\{X_1 \wedge Y_1 \le t < X_1 \vee Y_1\}\,dt \\
&= E\int_{-\infty}^{\infty}\big(\mathbf{1}\{X_1 \le t < Y_1\} + \mathbf{1}\{Y_1 \le t < X_1\}\big)\,dt \\
&= \int_{-\infty}^{\infty}\big(P(X_1 \le t < Y_1) + P(Y_1 \le t < X_1)\big)\,dt \\
&= \int_{-\infty}^{\infty}\big(F(t)(1 - G(t)) + G(t)(1 - F(t))\big)\,dt.
\end{aligned}$$

The third equality is due to Fubini’s theorem and the fourth is because of the independence between $X_1$ and $Y_1$. Similarly, we have

$$E|X_1 - X_2| = 2\int_{-\infty}^{\infty} F(t)\big(1 - F(t)\big)\,dt, \qquad E|Y_1 - Y_2| = 2\int_{-\infty}^{\infty} G(t)\big(1 - G(t)\big)\,dt.$$

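These integral representations are easy to check numerically. A small sketch below uses the Exponential(1) distribution, an arbitrary choice for illustration; there both $E|X_1 - X_2|$ and $2\int_0^{\infty} F(t)(1 - F(t))\,dt$ equal exactly $1$.

```python
# Checking E|X1 - X2| = 2 * int F(t)(1 - F(t)) dt for one concrete
# distribution: X ~ Exp(1), where both sides equal exactly 1.
import math
import random

random.seed(1)
n = 200_000
emp = sum(abs(random.expovariate(1.0) - random.expovariate(1.0))
          for _ in range(n)) / n

# F(t) = 1 - exp(-t) for t >= 0; Riemann sum of 2 F (1 - F) over [0, 40]
dt = 0.001
integral = sum(2.0 * (1.0 - math.exp(-t)) * math.exp(-t) * dt
               for t in (i * dt for i in range(40_000)))

print(emp, integral)  # both close to the exact value 1
```
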
Thus the difference of $E|X_1 - Y_1|$ and $\frac{1}{2}E|X_1 - X_2| + \frac{1}{2}E|Y_1 - Y_2|$, which is finite because $E|X_1|, E|Y_1| < \infty$, is

$$\begin{aligned}
&E|X_1 - Y_1| - \frac{1}{2}E|X_1 - X_2| - \frac{1}{2}E|Y_1 - Y_2| \\
&\qquad= \int_{-\infty}^{\infty}\Big(F(t)\big(1 - G(t)\big) + G(t)\big(1 - F(t)\big) - F(t)\big(1 - F(t)\big) - G(t)\big(1 - G(t)\big)\Big)\,dt \\
&\qquad= \int_{-\infty}^{\infty}\big(F(t) - G(t)\big)^2\,dt \;\ge\; 0.
\end{aligned} \tag{3}$$

Thus the equality of the decomposition in the theorem and the inequality (1) are established. Now we argue that equality in inequality (1) holds if and only if $F = G$. If $F = G$, then inequality (1) obviously becomes an equality. Conversely, if inequality (1) becomes an equality, then by the last line of (3), we have

$$\int_{-\infty}^{\infty}\big(F(t) - G(t)\big)^2\,dt = 0.$$

This means that

$$F(t) = G(t) \quad \text{for almost every } t \in \mathbb{R}.$$

But $F$ and $G$ are right continuous, so for any $t$ we may pick a sequence $t_n \downarrow t$ with $F(t_n) = G(t_n)$ and let $n \to \infty$; hence we have, for all $t \in \mathbb{R}$,

$$F(t) = G(t). \qquad \blacksquare$$