# Covariance, Variance Of Sums: 7 Important Facts

## COVARIANCE, VARIANCE OF SUMS, AND CORRELATIONS OF RANDOM VARIABLES

The statistical parameters of random variables of different natures are easy to obtain and understand using the definition of the expectation of a random variable; in the following we find several such parameters with the help of mathematical expectation.

## Moments of the number of events that occur

We already know that the expectations of the powers of a random variable are its moments, and we know how to find the expectation of a random variable from the events that occur. Now we are interested in the expectation involving pairs of events. If X represents the number of events that occur among the events A1, A2, …, An, define the indicator variable Ii as

$I_{i}=\begin{cases} 1, &\text{if } A_{i} \text{ occurs} \\ 0, &\text{otherwise} \end{cases}$

the expectation of X in discrete sense will be

$E[X]= E\left [ \sum_{i=1}^{n} I_{i} \right ] =\sum_{i=1}^{n} E[I_{i}] =\sum_{i=1}^{n} P\left ( A_{i} \right )$

because the random variable X is

$X=\sum_{i=1}^{n} I_{i}$

now to find the expectation for the number of pairs of events that occur, we use the combination

$\binom{X}{2} = \sum_{i< j} I_{i}I_{j}$

this gives expectation as

$E\left [ \binom{X}{2} \right ]=\sum_{i< j} E[I_{i}I_{j}] = \sum_{i< j} P(A_{i}A_{j})$

$E\left [ \frac{X(X-1)}{2} \right ] = \sum_{i< j}^{} P(A_{i}A_{j})$

$E[X^{2}] -E[X] =2 \sum_{i< j}^{} P(A_{i}A_{j})$

from this we get the expectation of X squared, and the variance follows from

$Var(X)=E[X^{2}] -(E[X])^{2}$

Using this approach we now compute such moments for different kinds of random variables.

## Moments of binomial random variables

If p is the probability of success in each of n independent trials, let Ai denote the event that trial i is a success, so that

$P(A_{i}A_{j})=p^{2} \ \ \text{when} \ \ i\neq j$

$E\left [ \binom{X}{2} \right ]= \sum_{i< j}^{} p^{2} = \binom{n}{2}p^{2}$

$E[X(X-1)] =n(n-1)p^{2}$

$E[X^{2}] -E[X] =n(n-1)p^{2}$

and hence the variance of binomial random variable will be

$Var(X)=E[X^{2}] -(E[X])^{2}=n(n-1)p^{2} +np - (np)^{2}=np(1-p)$

because

$E[X] =\sum_{i=1}^{n} P(A_{i}) =np$
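The identities derived above can be checked numerically against the binomial pmf; a minimal sketch in Python, where the values of n and p are arbitrary illustrative choices, not from the text:

```python
from math import comb

# Binomial moments via the pair-of-events identity:
# E[X(X-1)] = 2 * sum_{i<j} P(A_i A_j) = n(n-1)p^2, hence Var(X) = np(1-p).
n, p = 10, 0.3  # illustrative parameters

# exact moments computed directly from the binomial pmf
pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
EX = sum(k * pmf[k] for k in range(n + 1))
EX2 = sum(k**2 * pmf[k] for k in range(n + 1))

# identities from the derivation above
assert abs(EX - n * p) < 1e-9
assert abs(EX2 - EX - n * (n - 1) * p**2) < 1e-9
assert abs((EX2 - EX**2) - n * p * (1 - p)) < 1e-9
```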

if we generalize for k events

$P(A_{i_{1}}A_{i_{2}} \cdot \cdot \cdot A_{i_{k}})=p^{k}$

$E[X(X-1) \cdot \cdot \cdot \cdot (X-k+1) ] =n(n-1) \cdot\cdot\cdot (n-k+1)p^{k}$

this expectation can be obtained successively for values of k greater than or equal to 3; let us find it for k = 3

$E[X(X-1)(X-2) ] =n(n-1)(n-2)p^{3}$

$E[X^{3} -3X^{2} +2X] =n(n-1)(n-2)p^{3}$

$E[X^{3}] =3E[X^{2}] -2E[X] + n(n-1)(n-2)p^{3}$

$=3n(n-1)p^{2} +np + n(n-1)(n-2)p^{3}$

using this iteration we can get

$E[X^{k}], \ \ k\geq 3$
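The k = 3 formula above can likewise be checked exactly against the binomial pmf; a short sketch (n and p are again arbitrary illustrative values):

```python
from math import comb

# Check the third-moment formula for a binomial random variable:
# E[X^3] = 3n(n-1)p^2 + np + n(n-1)(n-2)p^3
n, p = 8, 0.4  # illustrative parameters
pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
EX3 = sum(k**3 * pmf[k] for k in range(n + 1))
formula = 3 * n * (n - 1) * p**2 + n * p + n * (n - 1) * (n - 2) * p**3
assert abs(EX3 - formula) < 1e-9
```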

## Moments of hypergeometric random variables

We will understand the moments of this random variable with the help of an example. Suppose n pens are randomly selected from a box containing N pens, of which m are blue. Let Ai denote the event that the i-th pen selected is blue. Then X, the number of blue pens selected, equals the number of the events A1, A2, …, An that occur, and because the i-th pen selected is equally likely to be any of the N pens, of which m are blue,

$P(A_{i}) =\frac{m}{N} \ \ , \ \ E[X]=\sum_{i=1}^{n}P(A_{i}) =\frac{nm}{N}$

and so

$P(A_{i}A_{j}) = P(A_{i}) P(A_{j}/A_{i}) =\frac{m}{N} \frac{m-1}{N-1}$

$E\left [ \binom{X}{2} \right ] =\sum_{i< j}^{}\frac{m(m-1)}{N(N-1)} =\binom{n}{2}\frac{m(m-1)}{N(N-1)}$

$E[X(X-1)] =n(n-1)\frac{m(m-1)}{N(N-1)}$

this gives

$E[X^{2}] =n(n-1)\frac{m(m-1)}{N(N-1)} + E[X]$

so the variance of hypergeometric random variable will be

$Var(X)=E[X^{2}]-(E[X])^{2}$

$= n(n-1)\frac{m(m-1)}{N(N-1)} +\frac{nm}{N} -\frac{n^{2}m^{2}}{N^{2}}$

$=\frac{nm}{N} \left [ \frac{(n-1)(m-1)}{N-1} +1 -\frac{nm}{N} \right ]$

in similar way for the higher moments

$P(A_{i_{1}} A_{i_{2}} \cdot \cdot \cdot \cdot A_{i_{k}}) =\frac{m(m-1)\cdot \cdot \cdot \cdot (m-k+1)}{N(N-1)\cdot \cdot \cdot \cdot (N-k+1)}$

$E\left [ \binom{X}{k} \right ] = \binom{n}{k} \frac{m(m-1)\cdot \cdot \cdot \cdot (m-k+1)}{N(N-1)\cdot \cdot \cdot \cdot (N-k+1)}$

hence

$E[X(X-1) \cdot \cdot \cdot (X-k+1) ] =n(n-1) \cdot \cdot \cdot \cdot (n-k+1) \frac{m(m-1)\cdot \cdot \cdot \cdot (m-k+1)}{N(N-1)\cdot \cdot \cdot \cdot (N-k+1)}$
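These factorial moments can be verified exactly from the hypergeometric pmf $P(X=k)=\binom{m}{k}\binom{N-m}{n-k}/\binom{N}{n}$; a minimal sketch with arbitrary illustrative N, m, n:

```python
from math import comb

# Hypergeometric moments: n pens drawn from N of which m are blue.
# Verify E[X] = nm/N and E[X(X-1)] = n(n-1)m(m-1)/(N(N-1)) exactly.
N, m, n = 20, 8, 5  # illustrative parameters
pmf = {k: comb(m, k) * comb(N - m, n - k) / comb(N, n) for k in range(n + 1)}
EX = sum(k * q for k, q in pmf.items())
EXX1 = sum(k * (k - 1) * q for k, q in pmf.items())
assert abs(EX - n * m / N) < 1e-9
assert abs(EXX1 - n * (n - 1) * m * (m - 1) / (N * (N - 1))) < 1e-9

# variance from the derivation above
var = EXX1 + EX - EX**2
assert abs(var - (n * m / N) * ((n - 1) * (m - 1) / (N - 1) + 1 - n * m / N)) < 1e-9
```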

## Moments of the negative hypergeometric random variables

Consider a package containing n + m vaccines, of which n are special and m are ordinary. The vaccines are removed one at a time, with each removal equally likely to be any of the vaccines remaining in the package. Let the random variable Y denote the number of vaccines that need to be withdrawn until a total of r special vaccines have been removed; Y has the negative hypergeometric distribution, which relates to the hypergeometric distribution as the negative binomial relates to the binomial. To find the probability mass function, note that Y = k requires the k-th draw to give a special vaccine after the first k − 1 draws give r − 1 special and k − r ordinary vaccines, so

$P(Y=k)=\frac{\binom{n}{r-1}\binom{m}{k-r}}{\binom{n+m}{k-1}} \frac{n-r+1}{n+m-k+1}$

now write the random variable Y as

Y=r+X

where X is the number of ordinary vaccines withdrawn before the r-th special one. Letting Ai denote the event that the i-th ordinary vaccine is withdrawn before the r-th special vaccine,

$E[Y]=r+E[X] =r + \sum_{i=1}^{m} P(A_{i})$

$E[Y]=r+ m\frac{r}{n+1}=\frac{r(n+m+1)}{n+1}$

as

$P(A_{i})=\frac{r}{n+1}$

hence to find the variance of Y we must know the variance of X so

$E(X(X-1))=2\sum_{i< j}^{} P(A_{i}A_{j})$

$P(A_{i}A_{j}) =\frac{r(r+1)}{(n+1)(n+2)}, \quad \sum_{i< j}^{} P(A_{i}A_{j}) =\binom{m}{2}\frac{r(r+1)}{(n+1)(n+2)}$

$E[X(X-1)]=2\binom{m}{2}\frac{r(r+1)}{(n+1)(n+2)}$

$E[X^{2}] = m(m-1)\frac{r(r+1)}{(n+1)(n+2)} + E[X]$

$Var(Y)=Var(X) = m(m-1)\frac{r(r+1)}{(n+1)(n+2)} + m \frac{r}{n+1} - \left ( m\frac{r}{n+1} \right )^{2}$

hence

$Var(Y) =\frac{mr(n+1-r)(m+n+1)}{(n+1)^{2}(n+2)}$
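The mean and variance formulas for Y can be checked by simulating the withdrawal process; a Monte Carlo sketch where n, m, r and the seed are arbitrary illustrative choices:

```python
import random
from statistics import mean, variance

# Monte Carlo check of E[Y] = r(n+m+1)/(n+1) and
# Var(Y) = m r (n+1-r)(m+n+1) / ((n+1)^2 (n+2)),
# where Y is the number of draws until the r-th special vaccine appears.
n, m, r = 6, 4, 3  # illustrative parameters
random.seed(0)

def draws_until_r_special():
    pack = ['S'] * n + ['O'] * m
    random.shuffle(pack)        # random removal order
    special = 0
    for count, item in enumerate(pack, start=1):
        special += item == 'S'
        if special == r:
            return count

samples = [draws_until_r_special() for _ in range(200_000)]
EY = r * (n + m + 1) / (n + 1)
VY = m * r * (n + 1 - r) * (m + n + 1) / ((n + 1)**2 * (n + 2))
assert abs(mean(samples) - EY) < 0.02
assert abs(variance(samples) - VY) < 0.05
```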

## COVARIANCE

The relationship between two random variables can be represented by the statistical parameter covariance. Before defining the covariance of two random variables X and Y, recall that if X and Y are independent, the expectation of the product of two functions g and h of X and Y respectively gives

$E[g(X)h(Y)]= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(x)h(y) f(x,y)dx dy$

$= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(x)h(y) f_{X}(x) f_{Y}(y) dx dy$

$= \int_{-\infty}^{\infty} h(y) f_{Y}(y) dy \int_{-\infty}^{\infty} g(x)f_{X}(x) dx$

$=E[h(Y)] E[g(X)]$

$E[g(X)h(Y)]=E[h(Y)] E[g(X)]$

using this relation of expectation we can define covariance as

The covariance between random variables X and Y, denoted by Cov(X,Y), is defined as

$Cov(X,Y)=E[(X-E[X])(Y-E[Y])]$

using definition of expectation and expanding we get

$Cov(X,Y)=E[XY-E[X]Y -XE[Y] +E[Y]E[X] ]$

$=E[XY] - E[X]E[Y] - E[X]E[Y] +E[X]E[Y]$

$=E[XY] - E[X]E[Y]$

it is clear that if the random variables X and Y are independent then

$Cov(X,Y)=0$

but the converse is not true for example if

$P(X=0)=P(X=1)=P(X=-1)=\frac{1}{3}$

and defining the random variable Y as

$Y= \begin{cases} 0 &\text{if } X \neq 0 \\ 1 &\text{if } X =0 \end{cases}$

so, since XY = 0 with probability 1 and E[X] = 0,

$Cov(X,Y)=E[XY] -E[X]E[Y]=0$

here clearly X and Y are not independent, yet their covariance is zero.
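The counterexample above can be computed exactly by summing over the three-point support of X:

```python
# X is uniform on {-1, 0, 1}; Y = 1 if X == 0 else 0.
# Cov(X, Y) = 0 even though Y is completely determined by X.
support = [(-1, 1/3), (0, 1/3), (1, 1/3)]
EX = sum(x * p for x, p in support)
EY = sum((1 if x == 0 else 0) * p for x, p in support)
EXY = sum(x * (1 if x == 0 else 0) * p for x, p in support)
cov = EXY - EX * EY
assert cov == 0.0
# yet Y is a function of X, so the two are not independent
```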

## Properties of covariance

Covariance between random variables X and Y has some properties as follows

$\ \ (i) \ \ Cov(X,Y)=Cov(Y,X)$

$\ \ (ii) \ \ Cov(X,X)=Var(X)$

$\ \ (iii) \ \ Cov(aX, Y)=aCov(X,Y)$

$\ \ (iv) \ \ Cov\left ( \sum_{i=1}^{n} X_{i} , \sum_{j=1}^{m} Y_{j} \right ) = \sum_{i=1}^{n} \sum_{j=1}^{m} Cov(X_{i}, Y_{j})$

Using the definition of covariance, the first three properties are immediate, and the fourth property follows by writing

$E\left [ \sum_{i=1}^{n} X_{i} \right ] =\sum_{i=1}^{n} \mu_{i} , \ \ E\left [ \sum_{j=1}^{m} Y_{j} \right ] =\sum_{j=1}^{m} \nu_{j}$

and then expanding the product of the centered sums by the definition of covariance.

## Variance of the sums

The important result from these properties is the variance of a sum of random variables, obtained as

$Var\left ( \sum_{i=1}^{n} X_{i} \right ) =Cov\left ( \sum_{i=1}^{n} X_{i} , \sum_{j=1}^{n} X_{j} \right )$

$= \sum_{i=1}^{n} \sum_{j=1}^{n} Cov(X_{i}, X_{j})$

$= \sum_{i=1}^{n} Var(X_{i}) + \sum \sum_{i\neq j}^{} Cov(X_{i},X_{j})$

$Var\left ( \sum_{i=1}^{n} X_{i} \right ) =\sum_{i=1}^{n}Var(X_{i}) +2 \sum \sum_{i< j}^{} Cov(X_{i},X_{j})$

If Xi ‘s are pairwise independent then

$Var\left ( \sum_{i=1}^{n} X_{i} \right ) =\sum_{i=1}^{n}Var(X_{i})$
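The variance-of-a-sum identity can be checked exactly on a small hand-made joint distribution; the values and probabilities below are arbitrary illustrative choices:

```python
# Exact check of Var(X1+X2) = Var(X1) + Var(X2) + 2 Cov(X1, X2)
# on a toy joint pmf for two dependent Bernoulli-like variables.
joint = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}

def E(f):
    """Expectation of f(X1, X2) under the joint pmf."""
    return sum(f(x1, x2) * p for (x1, x2), p in joint.items())

var_sum = E(lambda a, b: (a + b)**2) - E(lambda a, b: a + b)**2
v1 = E(lambda a, b: a**2) - E(lambda a, b: a)**2
v2 = E(lambda a, b: b**2) - E(lambda a, b: b)**2
cov = E(lambda a, b: a * b) - E(lambda a, b: a) * E(lambda a, b: b)
assert abs(var_sum - (v1 + v2 + 2 * cov)) < 1e-12
```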

## Example: Variance of a binomial random variable

If X is the random variable

$X=X_{1} + \cdot \cdot \cdot \cdot + X_{n}$

where Xi are the independent Bernoulli random variables such that

$X_{i}=\begin{cases} 1 &\text{if the i-th trial is a success} \\ 0 &\text{otherwise } \end{cases}$

then find the variance of a binomial random variable X with parameters n and p.

Solution:

since

$Var\left ( \sum_{i=1}^{n} X_{i} \right ) =\sum_{i=1}^{n} Var(X_{i})$

$Var(X) =Var(X_{1}) + \cdot \cdot \cdot \cdot +Var(X_{n})$

so for single variable we have

$Var(X_{i}) =E[X_{i}^{2}] -(E[X_{i}])^{2}$

$=E[X_{i}] -(E[X_{i}])^{2} \ \ Since \ \ X_{i}^{2} =X_{i}$

$=p-p^{2}$

so the variance is

$Var(X)=np(1-p)$

## Example

Let Xi be independent and identically distributed random variables with mean $\mu$ and variance $\sigma^{2}$, let $\overline{X} =\sum_{i=1}^{n} X_{i}/n$ be their sample mean, and define the sample variance

$S^{2}=\sum_{i=1}^{n}\frac{(X_{i} -\overline{X})^{2}}{n-1}$

then compute

$\ \ (a) \ \ Var(\overline{X}) \ \ and \ \ (b) \ \ E[S^{2}]$

solution:

By using the above property and definition we have

$\ \ (a) \ \ Var(\overline{X}) =\left ( \frac{1}{n} \right )^{2} Var\left ( \sum_{i=1}^{n} X_{i} \right )$

$=\left ( \frac{1}{n} \right )^{2} \sum_{i=1}^{n} Var(X_{i}) \ \ by \ \ independence$

$=\frac{\sigma ^{2}}{n}$

now for the sample variance, use the identity

$(n-1)S^{2} =\sum_{i=1}^{n} (X_{i}-\mu)^{2} -n(\overline{X} -\mu)^{2}$

and take the expectation of both sides

$(n-1)E[S^{2}] =\sum_{i=1}^{n} E[(X_{i} -\mu)^{2}] -nE[(\overline{X} -\mu )^{2}]$

$=n\sigma^{2} -n\frac{\sigma^{2}}{n} =(n-1)\sigma^{2}$

so that $\ \ (b) \ \ E[S^{2}] =\sigma^{2}$
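Both results, $Var(\overline{X})=\sigma^{2}/n$ and $E[S^{2}]=\sigma^{2}$, can be checked by Monte Carlo; a sketch using uniform samples on [0, 1] (so $\sigma^{2}=1/12$), with n and the seed chosen arbitrarily:

```python
import random
from statistics import mean

# Monte Carlo check that Var(Xbar) = sigma^2/n and E[S^2] = sigma^2
# for i.i.d. uniform(0,1) samples (mu = 1/2, sigma^2 = 1/12).
random.seed(1)
n, reps = 5, 100_000  # illustrative sample size and repetitions
xbars, s2s = [], []
for _ in range(reps):
    xs = [random.random() for _ in range(n)]
    xb = mean(xs)
    xbars.append(xb)
    s2s.append(sum((x - xb)**2 for x in xs) / (n - 1))  # sample variance

sigma2 = 1 / 12
var_xbar = mean(v**2 for v in xbars) - mean(xbars)**2
assert abs(var_xbar - sigma2 / n) < 0.002
assert abs(mean(s2s) - sigma2) < 0.002
```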

## Example:

Find the covariance of indicator functions for the events A and B.

Solution:

for the events A and B the indicator functions are

$I_{A}=\begin{cases} 1 &\text{if A occurs} \\ 0 &\text{otherwise } \end{cases}$

$I_{B}=\begin{cases} 1 &\text{if B occurs} \\ 0 &\text{otherwise } \end{cases}$

so the expectation of these are

$E[I_{A}] =P(A)$

$E[I_{B}] =P(B)$

$E[I_{A}I_{B}] =P(AB)$

thus the covariance is

$Cov(I_{A},I_{B}) = P(AB) - P(A)P(B)$

$= P(B)[P(A/B) - P(A)]$
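The result can be illustrated on concrete events; the die-roll events A and B below are an assumed example, not from the text:

```python
# Cov(I_A, I_B) = P(AB) - P(A)P(B) for events on a fair die roll:
# A = "roll is even", B = "roll <= 3" (illustrative events).
outcomes = range(1, 7)
PA = sum(1 for w in outcomes if w % 2 == 0) / 6               # 1/2
PB = sum(1 for w in outcomes if w <= 3) / 6                   # 1/2
PAB = sum(1 for w in outcomes if w % 2 == 0 and w <= 3) / 6   # only 2 -> 1/6
cov = PAB - PA * PB
assert abs(cov - (-1/12)) < 1e-12
# equivalently P(B)[P(A|B) - P(A)] = (1/2)(1/3 - 1/2) = -1/12
```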

## Example:

Show that

$Cov(X_{i}- \overline{X}, \overline{X}) =0$

where Xi are independent random variables with common variance $\sigma^{2}$.

Solution:

The covariance using the properties and definition will be

$Cov(X_{i}- \overline{X}, \overline{X}) = Cov(X_{i}, \overline{X}) - Cov(\overline{X}, \overline{X})$

$= Cov\left ( X_{i}, \frac{1}{n} \sum_{j=1}^{n} X_{j} \right ) - Var(\overline{X})$

$= \frac{1}{n} \sum_{j=1}^{n} Cov(X_{i},X_{j}) - \frac{\sigma ^{2}}{n}$

$= \frac{\sigma ^{2}}{n} - \frac{\sigma ^{2}}{n} =0$
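This zero covariance can also be confirmed by exact enumeration over a small independent sample; fair coin flips are an assumed illustrative choice:

```python
from itertools import product

# Exact enumeration check that Cov(X_i - Xbar, Xbar) = 0 for
# i.i.d. fair coin flips X1, X2, X3 (each outcome has probability 1/8).
samples = list(product([0, 1], repeat=3))

def E(f):
    """Expectation over the uniform distribution on the 8 outcomes."""
    return sum(f(xs) for xs in samples) / len(samples)

xbar = lambda xs: sum(xs) / 3
d = lambda xs: xs[0] - xbar(xs)                    # X_1 - Xbar
cov = E(lambda xs: d(xs) * xbar(xs)) - E(d) * E(xbar)
assert abs(cov) < 1e-12
```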

## Example:

Calculate the mean and variance of the random variable S, the sum of n sampled values, in the following setting: each of a set of N people has an opinion about a certain subject, measured by a real number v that represents the person's "strength of feeling" about the subject. Let $v_{i}$ represent the (unknown) strength of feeling of person i. To collect information, a sample of n of the N people is taken randomly, and these n people are questioned so that their values $v_{i}$ are obtained.

Solution

let us define the indicator function as

$I_{i}=\begin{cases} 1 &\text{if person i is in the random sample } \\ 0 &\text{otherwise } \end{cases}$

thus we can express S as

$S = \sum_{i=1}^{N} v_{i}I_{i}$

and its expectation as

$E[S] = \sum_{i=1}^{N} v_{i}E[I_{i}]$

this gives the variance as

$Var(S) =\sum_{i=1}^{N} Var(v_{i}I_{i}) +2\sum_{}^{}\sum_{i< j}^{} Cov(v_{i}I_{i}, v_{j}I_{j})$

$=\sum_{i=1}^{N} v_{i}^{2} Var(I_{i}) +2\sum_{}^{}\sum_{i< j}^{} v_{i}v_{j} Cov(I_{i}, I_{j})$

since

$E[I_{i}] =\frac{n}{N}$

$E[I_{i} I_{j}] =\frac{n}{N} \frac{n-1}{N-1}$

we have

$Var (I_{i}) =\frac{n}{N}\left ( 1- \frac{n}{N} \right )$

$Cov(I_{i}, I_{j}) = \frac{n(n-1)}{N(N-1)} -\left ( \frac{n}{N} \right )^{2}$

$= \frac{-n(N-n)}{N^{2}(N-1)}$

$E[S] =n\sum_{i=1}^{N}\frac{v_{i}}{N} =n\overline{v}$

$Var(S)=\frac{n}{N}\frac{N-n}{N} \sum_{i=1}^{N}v_{i}^{2} -\frac{2n(N-n)}{N^{2}(N-1)} \sum \sum_{i< j}^{} v_{i}v_{j}$

we know the identity

$(v_{1} + \cdot \cdot \cdot + v_{N})^{2} =\sum_{i=1}^{N}v_{i}^{2} +2 \sum \sum_{i< j}^{} v_{i}v_{j}$

so

$Var(S) =\frac{n(N-n)}{N-1} \left ( \frac{\sum_{i=1}^{N}v_{i}^{2}}{N} -\overline{v}^{2} \right )$

in particular, if each $v_{i}$ equals 1 or 0 according to whether the person is for or against the subject, and a fraction p of the N people are in favor, then

$E[S]= n\overline{v}= np \ \ since \ \ \overline{v}=\frac{Np}{N}=p$

$Var(S)= \frac{n(N-n)}{N-1} \left ( \frac{Np}{N} -p^{2} \right )$

$= \frac{n(N-n)}{N-1} p(1-p)$

so the mean and variance for the said random variable will be

$E\left [ \frac{S}{n} \right ] =p$

$Var\left ( \frac{S}{n} \right )=\frac{N-n}{n(N-1)}p(1-p)$
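The mean and variance of the sample proportion S/n can be checked by simulating sampling without replacement; N, n, p and the seed below are arbitrary illustrative choices:

```python
import random
from statistics import mean

# Monte Carlo check of E[S/n] = p and
# Var(S/n) = (N-n)/(n(N-1)) * p(1-p) for sampling without replacement.
random.seed(2)
N, n, p = 50, 10, 0.4  # illustrative population size, sample size, proportion
v = [1] * int(N * p) + [0] * (N - int(N * p))  # Np people in favor

props = []
for _ in range(100_000):
    props.append(sum(random.sample(v, n)) / n)  # draw n without replacement

assert abs(mean(props) - p) < 0.005
var_props = mean(x**2 for x in props) - mean(props)**2
expected = (N - n) / (n * (N - 1)) * p * (1 - p)
assert abs(var_props - expected) < 0.002
```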

## Conclusion:

The relationship between two random variables is captured by the covariance, and using the covariance we obtained the variance of sums of random variables; the covariance and the various moments were obtained with the help of the definition of expectation. If you require further reading, go through

https://en.wikipedia.org/wiki/Expectation

A first course in probability by Sheldon Ross

Schaum’s Outlines of Probability and Statistics

An introduction to probability and statistics by ROHATGI and SALEH.