Wikipedia:Reference desk/Archives/Mathematics/2010 March 31

Mathematics desk
Welcome to the Wikipedia Mathematics Reference Desk Archives
The page you are currently viewing is a transcluded archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


March 31


Variance


Suppose we have identically distributed random variables X_1, ..., X_n. Then Var(X_1 + ... + X_n) = n Var(X_1).
Is the above true only if the X_i are all independent? If they are not independent, can the variance of the sum differ from n Var(X_1)? —Preceding unsigned comment added by 70.68.120.162 (talk) 03:23, 31 March 2010 (UTC)[reply]

In general, Var(A+B) = Var(A) + Var(B) + 2 Cov(A, B). So you can see that even in the case of two variables with identical distributions and non-zero covariance, the formula breaks down. However, it is possible to have rvs that are not independent, but have zero covariance, so independence is sufficient for your formula to hold, but not necessary. —Preceding unsigned comment added by ConMan 05:58, 31 March 2010 (UTC)[reply]
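The identity above is easy to check numerically. A quick sketch (not part of the original thread) using NumPy, with two identically distributed normal variables whose correlation is chosen to be about 0.5, so the covariance term is clearly nonzero:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Two identically distributed (standard normal) variables with correlation ~0.5.
a = rng.standard_normal(n)
b = 0.5 * a + np.sqrt(1 - 0.5**2) * rng.standard_normal(n)

lhs = np.var(a + b)
rhs = np.var(a) + np.var(b) + 2 * np.cov(a, b)[0, 1]

# Both sides are close to 1 + 1 + 2*0.5 = 3, while Var(A) + Var(B) alone is only 2.
print(lhs, rhs)
```

With dependence, Var(A+B) lands near 3 rather than the 2 the naive formula n Var(X) would predict, which is exactly the failure described above.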
The simplest counterexample is when all variables are equal. Then Var(X_1 + ... + X_n) = Var(nX_1) = n^2 Var(X_1). -- Meni Rosenfeld (talk) 08:11, 31 March 2010 (UTC)[reply]

...and more generally if you have more than two random variables, you have something like this:

var(X + Y + Z) = var(X) + var(Y) + var(Z) + 2cov(X,Y) + 2cov(X,Z) + 2cov(Y,Z),

and so on. So if the sum of the covariances is 0, then the variance of the sum equals the sum of the variances, even if the individual covariances are not 0. Michael Hardy (talk) 18:59, 31 March 2010 (UTC)[reply]
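A concrete instance of this last point, sketched numerically (my own construction, not from the thread, and the three variables here are not identically distributed): take independent standard normals U, V, W and set X = U, Y = U + V, Z = -0.5U + W. Then Cov(X,Y) = 1, Cov(X,Z) = -0.5, Cov(Y,Z) = -0.5, so every pairwise covariance is nonzero yet they sum to 0, and the variance of the sum still equals the sum of the variances.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1_000_000
u, v, w = rng.standard_normal((3, n))

# Pairwise covariances: Cov(X,Y)=1, Cov(X,Z)=-0.5, Cov(Y,Z)=-0.5.
# All nonzero, but 1 - 0.5 - 0.5 = 0.
x = u
y = u + v
z = -0.5 * u + w

var_of_sum = np.var(x + y + z)
sum_of_vars = np.var(x) + np.var(y) + np.var(z)

# Both are close to 4.25 = Var(1.5U + V + W) = 2.25 + 1 + 1.
print(var_of_sum, sum_of_vars)
```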