Moment Generating Functions: 13 Important Facts


Moment generating function    

The moment generating function is a very important function that generates the moments of a random variable, such as the mean, standard deviation, and variance, so with its help alone we can find the basic moments as well as the higher moments. In this article we will see moment generating functions for different discrete and continuous random variables. The moment generating function (MGF) is defined with the help of mathematical expectation and denoted by M(t) as


[latex]M(t)=E\left[e^{t X}\right][/latex]

and using the definition of expectation for the discrete and continuous random variable this function will be

[latex]M(t)=\left\{\begin{array}{ll}
\sum_{x} e^{t x} p(x) & \text { if } X \text { is discrete with mass function } p(x) \\
\int_{-\infty}^{\infty} e^{t x} f(x) d x & \text { if } X \text { is continuous with density } f(x)
\end{array}\right.
[/latex]

Differentiating this function and then substituting t = 0 generates the respective moments. For example, the first moment, i.e. the mean, is obtained by differentiating once as

[latex]\begin{aligned}
M^{\prime}(t) &=\frac{d}{d t} E\left[e^{t X}\right] \\
&=E\left[\frac{d}{d t}\left(e^{t X}\right)\right] \\
&=E\left[X e^{t X}\right]
\end{aligned}[/latex]

Here we have assumed that differentiation can be interchanged with the expectation, that is,

[latex]\frac{d}{d t}\left[\sum_{x} e^{t x} p(x)\right]=\sum_{x} \frac{d}{d t}\left[e^{t x} p(x)\right][/latex]

and

[latex]\frac{d}{d t}\left[\int e^{t x} f(x) d x\right]=\int \frac{d}{d t}\left[e^{t x} f(x)\right] d x[/latex]

so at t = 0 the first moment is

[latex]M^{\prime}(0)=E[X][/latex]

and

[latex]\begin{aligned}
M^{\prime \prime}(t) &=\frac{d}{d t} M^{\prime}(t) \\
&=\frac{d}{d t} E\left[X e^{t X}\right] \\
&=E\left[\frac{d}{d t}\left(X e^{t X}\right)\right] \\
&=E\left[X^{2} e^{t X}\right]\\
M^{\prime \prime}(0)&=E\left[X^{2}\right]
\end{aligned}[/latex]

In general we can say that

[latex]M^{(n)}(t)=E\left[X^{n} e^{t X}\right] \quad n \geq 1[/latex]

hence

[latex]M^{(n)}(0)=E\left[X^{n}\right] \quad n \geq 1[/latex]
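
These identities can be checked symbolically. Below is a minimal Python sketch (not part of the original derivation) that builds M(t) with sympy for a small hypothetical probability mass function, differentiates it, and evaluates at t = 0; the values 1, 2, 3 and their probabilities are arbitrary assumptions chosen only for illustration.

```python
import sympy as sp

t = sp.symbols('t')

# Hypothetical pmf (assumption): X takes values 1, 2, 3 with probabilities 1/5, 1/2, 3/10
values = [1, 2, 3]
probs = [sp.Rational(1, 5), sp.Rational(1, 2), sp.Rational(3, 10)]

# M(t) = E[e^{tX}] for a discrete random variable
M = sum(p * sp.exp(t * x) for x, p in zip(values, probs))

mean = sp.diff(M, t).subs(t, 0)                  # M'(0)  = E[X]   -> 21/10
second_moment = sp.diff(M, t, 2).subs(t, 0)      # M''(0) = E[X^2] -> 49/10
variance = sp.simplify(second_moment - mean**2)  # E[X^2] - (E[X])^2 -> 49/100

print(mean, second_moment, variance)
```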

Moment generating function of Binomial distribution||Mean and Variance of Binomial distribution using moment generating function

For a random variable X that is binomially distributed with parameters n and p, the moment generating function follows from the binomial probability mass function as

[latex]\begin{aligned}
M(t) &=E\left[e^{t X}\right] \\
&=\sum_{k=0}^{n} e^{t k}\left(\begin{array}{l}
n \\
k
\end{array}\right) p^{k}(1-p)^{n-k} \\
&=\sum_{k=0}^{n}\left(\begin{array}{l}
n \\
k
\end{array}\right)\left(p e^{t}\right)^{k}(1-p)^{n-k} \\
&=\left(p e^{t}+1-p\right)^{n}
\end{aligned}[/latex]

where the last step uses the binomial theorem. Now differentiating and putting t = 0 gives

[latex]M^{\prime}(t)=n\left(p e^{t}+1-p\right)^{n-1} p e^{t}\\
E[X]=M^{\prime}(0)=n p[/latex]

which is the mean, or first moment, of the binomial distribution. Similarly, the second moment is obtained from the second derivative:

[latex]M^{\prime \prime}(t)=n(n-1)\left(p e^{t}+1-p\right)^{n-2}\left(p e^{t}\right)^{2}+n\left(p e^{t}+1-p\right)^{n-1} p e^{t}\\
E\left[X^{2}\right]=M^{\prime \prime}(0)=n(n-1) p^{2}+n p[/latex]

so the variance of the binomial distribution will be

[latex]\begin{aligned}
\operatorname{Var}(X) &=E\left[X^{2}\right]-(E[X])^{2} \\
&=n(n-1) p^{2}+n p-n^{2} p^{2} \\
&=n p(1-p)
\end{aligned}[/latex]

which are the standard mean and variance of the binomial distribution. In the same way the higher moments can also be found using this moment generating function.
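
As a sanity check, the same mean and variance can be recovered by symbolically differentiating the binomial MGF derived above. A short sympy sketch, with n and p kept symbolic:

```python
import sympy as sp

t = sp.symbols('t')
n, p = sp.symbols('n p', positive=True)

M = (p * sp.exp(t) + 1 - p) ** n                   # binomial MGF derived above

mean = sp.simplify(sp.diff(M, t).subs(t, 0))       # expected: n*p
second = sp.simplify(sp.diff(M, t, 2).subs(t, 0))  # expected: n*(n-1)*p**2 + n*p
variance = sp.simplify(second - mean**2)           # should reduce to n*p*(1 - p)

print(mean, second, variance)
```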

Moment generating function of Poisson distribution||Mean and Variance of Poisson distribution using moment generating function

If we have a random variable X which is Poisson distributed with parameter λ, then the moment generating function for this distribution will be

[latex]\begin{aligned}
M(t) &=E\left[e^{t X}\right] \\
&=\sum_{n=0}^{\infty} \frac{e^{t n} e^{-\lambda} \lambda^{n}}{n !} \\
&=e^{-\lambda} \sum_{n=0}^{\infty} \frac{\left(\lambda e^{t}\right)^{n}}{n !}\\
&=e^{-\lambda} e^{\lambda e^{t}}\\
&=e^ {\left\{\lambda\left(e^{t}-1\right)\right\}}
\end{aligned}[/latex]

now differentiating this will give

[latex]\begin{aligned}
M^{\prime}(t) &=\lambda e^{t} e^{\left\{\lambda\left(e^{t}-1\right)\right\} }\\
M^{\prime \prime}(t) &=\left(\lambda e^{t}\right)^{2} e^{\left\{\lambda\left(e^{t}-1\right)\right\}}+\lambda e^{t} e^{ \left\{\lambda\left(e^{t}-1\right)\right\}}
\end{aligned}[/latex]

this gives

[latex]\begin{aligned}
E[X] &=M^{\prime}(0)=\lambda \\
E\left[X^{2}\right] &=M^{\prime \prime}(0)=\lambda^{2}+\lambda \\
\operatorname{Var}(X) &=E\left[X^{2}\right]-(E[X])^{2} \\
&=\lambda
\end{aligned}[/latex]

which gives the mean and variance of the Poisson distribution, both equal to λ, as expected.

Moment generating function of Exponential distribution||Mean and Variance of Exponential distribution using moment generating function

The moment generating function for the exponential random variable X, following the definition, is

[latex]\begin{aligned}
M(t) &=E\left[e^{t X}\right] \\
&=\int_{0}^{\infty} e^{t x} \lambda e^{-\lambda x} d x \\
&=\lambda \int_{0}^{\infty} e^{-(\lambda-t) x} d x \\
&=\frac{\lambda}{\lambda-t} \quad \text { for } t<\lambda
\end{aligned}[/latex]

where the value of t must be less than the parameter λ; differentiating now gives

[latex]M^{\prime}(t)=\frac{\lambda}{(\lambda-t)^{2}} \quad M^{\prime \prime}(t)=\frac{2 \lambda}{(\lambda-t)^{3}}[/latex]

which provides the moments

[latex]E[X]=M^{\prime}(0)=\frac{1}{\lambda} \quad E\left[X^{2}\right]=M^{\prime \prime}(0)=\frac{2}{\lambda^{2}}[/latex]

clearly

[latex]\begin{aligned}
\operatorname{Var}(X) &=E\left[X^{2}\right]-(E[X])^{2} \\
&=\frac{1}{\lambda^{2}}
\end{aligned}[/latex]

which are the mean and variance of the exponential distribution.
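
These two moments can be reproduced by differentiating the MGF λ/(λ − t) symbolically. A minimal sympy sketch:

```python
import sympy as sp

t = sp.symbols('t')
lam = sp.symbols('lambda', positive=True)

M = lam / (lam - t)                        # exponential MGF, valid for t < lambda

mean = sp.diff(M, t).subs(t, 0)            # 1/lambda
second = sp.diff(M, t, 2).subs(t, 0)       # 2/lambda**2
variance = sp.simplify(second - mean**2)   # 1/lambda**2

print(mean, second, variance)
```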

Moment generating function of Normal distribution||Mean and Variance of Normal distribution using moment generating function

The moment generating function for continuous distributions is defined in the same way as for discrete ones, so the moment generating function for the standard normal distribution, using its probability density function, is

[latex]\begin{aligned}
M_{Z}(t) &=E\left[e^{t Z}\right] \\
&=\frac{1}{\sqrt{2 \pi}} \int_{-\infty}^{\infty} e^{t x} e^{-x^{2} / 2} d x
\end{aligned}[/latex]

We can evaluate this integral by completing the square:

[latex]\begin{array}{l}
=\frac{1}{\sqrt{2 \pi}} \int_{-\infty}^{\infty} e^{ \left\{-\frac{\left(x^{2}-2 t x\right)}{2}\right\} }d x \\
=\frac{1}{\sqrt{2 \pi}} \int_{-\infty}^{\infty} e^{ \left\{-\frac{(x-t)^{2}}{2}+\frac{t^{2}}{2}\right\}} d x \\
=e^{t^{2} / 2} \frac{1}{\sqrt{2 \pi}} \int_{-\infty}^{\infty} e^{-(x-t)^{2} / 2} d x \\
=e^{t^{2} / 2}
\end{array}[/latex]

since the value of integration is 1. Thus the moment generating function for the standard normal variate will be

[latex]M_{Z}(t)=e^{t^{2} / 2}[/latex]

From this we can find the moment generating function of any general normal random variable by using the relation

[latex]X=\mu+\sigma Z[/latex]

thus

[latex]\begin{aligned}
M_{X}(t) &=E\left[e^{t X}\right] \\
&=E\left[e^{t(\mu+\sigma Z)}\right] \\
&=E\left[e^{t \mu} e^{t \sigma Z}\right] \\
&=e^{t \mu} E\left[e^{t \sigma Z}\right] \\
&=e^{t \mu} M_{Z}(t \sigma) \\
&=e^{t \mu} e^{(t \sigma)^{2} / 2} \\
&=e^{\left\{\frac{\sigma^{2} t^{2}}{2}+\mu t\right\}}
\end{aligned}[/latex]

so differentiation gives us

[latex]\begin{array}{l}
M_{X}^{\prime}(t)=\left(\mu+t \sigma^{2}\right) \exp \left\{\frac{\sigma^{2} t^{2}}{2}+\mu t\right\} \\
M_{X}^{\prime \prime}(t)=\left(\mu+t \sigma^{2}\right)^{2} \exp \left\{\frac{\sigma^{2} t^{2}}{2}+\mu t\right\}+\sigma^{2} \exp \left\{\frac{\sigma^{2} t^{2}}{2}+\mu t\right\}
\end{array}[/latex]

thus

[latex]\begin{aligned}
E[X] &=M^{\prime}(0)=\mu \\
E\left[X^{2}\right] &=M^{\prime \prime}(0)=\mu^{2}+\sigma^{2}
\end{aligned}[/latex]

so the variance will be

[latex]\begin{aligned}
\operatorname{Var}(X) &=E\left[X^{2}\right]-(E[X])^{2} \\
&=\sigma^{2}
\end{aligned}[/latex]
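
The mean and variance of the general normal distribution can likewise be verified by differentiating the MGF obtained above. A short sympy sketch with symbolic μ and σ:

```python
import sympy as sp

t = sp.symbols('t')
mu = sp.symbols('mu', real=True)
sigma = sp.symbols('sigma', positive=True)

M = sp.exp(sigma**2 * t**2 / 2 + mu * t)          # normal MGF derived above

mean = sp.diff(M, t).subs(t, 0)                   # mu
second = sp.expand(sp.diff(M, t, 2).subs(t, 0))   # mu**2 + sigma**2
variance = sp.simplify(second - mean**2)          # sigma**2

print(mean, second, variance)
```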

Moment generating function of Sum of random variables

The moment generating function of a sum of independent random variables has the important property that it equals the product of the individual moment generating functions. That is, for independent random variables X and Y, the moment generating function of the sum X + Y is

[latex]\begin{aligned}
M_{X+Y}(t) &=E\left[e^{t(X+Y)}\right] \\
&=E\left[e^{t X} e^{t Y}\right] \\
&=E\left[e^{t X}\right] E\left[e^{t Y}\right] \\
&=M_{X}(t) M_{Y}(t)
\end{aligned}[/latex]

where the factorization in the third step holds because X and Y are independent, so by the property of mathematical expectation the expectation of the product equals the product of expectations. In the sections that follow we will find the moment generating functions of sums for different distributions.
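
Before that, the product property itself can be sanity-checked numerically: for independent samples, the empirical value of E[e^{t(X+Y)}] should closely match the product of the empirical values of E[e^{tX}] and E[e^{tY}]. A rough Monte Carlo sketch with numpy; the choice of a Poisson X, an exponential Y, the point t = 0.3, and the sample size are all arbitrary assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, t = 1_000_000, 0.3                    # arbitrary sample size and evaluation point

x = rng.poisson(lam=2.0, size=n_samples)         # X ~ Poisson(2), arbitrary choice
y = rng.exponential(scale=1.0, size=n_samples)   # Y ~ Exp(1), independent of X

lhs = np.mean(np.exp(t * (x + y)))                      # empirical M_{X+Y}(t)
rhs = np.mean(np.exp(t * x)) * np.mean(np.exp(t * y))   # empirical M_X(t) * M_Y(t)
print(lhs, rhs)                                  # the two values should nearly agree
```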

Sum of Binomial random variables

If the random variables X and Y are binomially distributed with parameters (n, p) and (m, p) respectively, then the moment generating function of their sum X + Y will be

[latex]\begin{aligned}
M_{X+Y}(t)=M_{X}(t) M_{Y}(t) &=\left(p e^{t}+1-p\right)^{n}\left(p e^{t}+1-p\right)^{m} \\
&=\left(p e^{t}+1-p\right)^{m+n}
\end{aligned}[/latex]

which is the binomial moment generating function with parameters (n + m, p), so the sum is again binomial.

Sum of Poisson random variables

If the independent random variables X and Y are Poisson distributed with respective means λ1 and λ2, then the moment generating function of their sum X + Y is

[latex]\begin{aligned}
M_{X+Y}(t) &=M_{X}(t) M_{Y}(t) \\
&=\exp \left\{\lambda_{1}\left(e^{t}-1\right)\right\} \exp \left\{\lambda_{2}\left(e^{t}-1\right)\right\} \\
&=\exp \left\{\left(\lambda_{1}+\lambda_{2}\right)\left(e^{t}-1\right)\right\}
\end{aligned}[/latex]

where

[latex]\lambda_{1}+\lambda_{2}[/latex]

is the mean of the Poisson random variable X + Y.

Sum of Normal random variables

     Consider the independent normal random variables X and Y with the parameters

[latex]\left(\mu_{1}, \sigma_{1}^{2}\right) \ and \ \left(\mu_{2}, \sigma_{2}^{2}\right)[/latex]

then the sum X + Y is normally distributed with parameters

[latex]\mu_{1}+\mu_{2} \ and \ \sigma_{1}^{2}+\sigma_{2}^{2}[/latex]

so the moment generating function will be

[latex]\begin{aligned}
M_{X+Y}(t) &=M_{X}(t) M_{Y}(t) \\
&=e^{\left\{\frac{\sigma_{1}^{2} t^{2}}{2}+\mu_{1} t\right\}} e^{\left\{\frac{\sigma_{2}^{2} t^{2}}{2}+\mu_{2} t\right\}} \\
&=e^{\left\{\frac{\left(\sigma_{1}^{2}+\sigma_{2}^{2}\right) t^{2}}{2}+\left(\mu_{1}+\mu_{2}\right) t\right\}}
\end{aligned}[/latex]

which is the moment generating function of a normal distribution whose mean and variance are the sums of the individual means and variances.

Sum of a random number of random variables

To find the moment generating function of the sum of a random number of random variables, consider the random variable

[latex]Y=\sum_{i=1}^{N} X_{i}[/latex]

where X1, X2, … is a sequence of independent and identically distributed random variables and N is a nonnegative integer-valued random variable independent of this sequence. Conditioning on N = n, the moment generating function satisfies

[latex]\begin{aligned}
E\left[\exp \left\{t \sum_{1}^{N} X_{i}\right\} \mid N=n\right] &=E\left[\exp \left\{t \sum_{1}^{n} X_{i}\right\} \mid N=n\right] \\
&=E\left[\exp \left\{t \sum_{1}^{n} X_{i}\right\}\right] \\
&=\left[M_{X}(t)\right]^{n}
\end{aligned}[/latex]

[latex]\text{where } M_{X}(t)=E\left[e^{t X_{i}}\right]\\ \text{Thus } E\left[e^{t Y} \mid N\right]=\left(M_{X}(t)\right)^{N}\\ M_{Y}(t)=E\left[\left(M_{X}(t)\right)^{N}\right][/latex]

Differentiating the moment generating function of Y then gives

[latex]M_{Y}^{\prime}(t)=E\left[N\left(M_{X}(t)\right)^{N-1} M_{X}^{\prime}(t)\right][/latex]

hence

[latex]\begin{aligned}
E[Y] &=M_{Y}^{\prime}(0) \\
&=E\left[N\left(M_{X}(0)\right)^{N-1} M_{X}^{\prime}(0)\right] \\
&=E\left[N E[X]\right] \\
&=E[N] E[X]
\end{aligned}[/latex]

In a similar way, differentiating twice gives

[latex]M_{Y}^{\prime \prime}(t)=E\left[N(N-1)\left(M_{X}(t)\right)^{N-2}\left(M_{X}^{\prime}(t)\right)^{2}+N\left(M_{X}(t)\right)^{N-1} M_{X}^{\prime \prime}(t)\right][/latex]

which gives

[latex]\begin{aligned}
E\left[Y^{2}\right] &=M_{Y}^{\prime \prime}(0) \\
&=E\left[N(N-1)(E[X])^{2}+N E\left[X^{2}\right]\right] \\
&=(E[X])^{2}\left(E\left[N^{2}\right]-E[N]\right)+E[N] E\left[X^{2}\right] \\
&=E[N]\left(E\left[X^{2}\right]-(E[X])^{2}\right)+(E[X])^{2} E\left[N^{2}\right] \\
&=E[N] \operatorname{Var}(X)+(E[X])^{2} E\left[N^{2}\right]
\end{aligned}[/latex]

thus the variance will be

[latex]\begin{aligned}
\operatorname{Var}(Y) &=E[N] \operatorname{Var}(X)+(E[X])^{2}\left(E\left[N^{2}\right]-(E[N])^{2}\right) \\
&=E[N] \operatorname{Var}(X)+(E[X])^{2} \operatorname{Var}(N)
\end{aligned}[/latex]
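
These two formulas can be checked by simulation. The sketch below is an illustration rather than part of the original derivation: it assumes N ~ Poisson(4) and Xi exponential with mean 2 (both arbitrary choices) and compares the sample mean and variance of Y with E[N]E[X] and E[N]Var(X) + (E[X])²Var(N).

```python
import numpy as np

rng = np.random.default_rng(1)
trials = 100_000
lam, scale = 4.0, 2.0                        # E[N] = Var(N) = 4, E[X] = 2, Var(X) = 4

n = rng.poisson(lam=lam, size=trials)        # random number of terms N per trial
# Y = sum of N iid Exponential(scale) terms, one value per trial
y = np.array([rng.exponential(scale=scale, size=k).sum() for k in n])

print(y.mean(), lam * scale)                        # E[Y] = E[N]E[X] = 8
print(y.var(), lam * scale**2 + scale**2 * lam)     # E[N]Var(X) + (E[X])^2 Var(N) = 32
```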

Example of Chi-square random variable

Calculate the moment generating function of the Chi-squared random variable with n-degree of freedom.

Solution: Consider the chi-squared random variable with n degrees of freedom, defined as

[latex]Z_{1}^{2}+\cdots+Z_{n}^{2}[/latex]

where Z1, …, Zn are independent standard normal variables; since the Zi are independent, the moment generating function is

[latex]M(t)=\left(E\left[e^{t Z^{2}}\right]\right)^{n}[/latex]

so it gives

[latex]\begin{aligned}
E\left[e^{t Z^{2}}\right] &=\frac{1}{\sqrt{2 \pi}} \int_{-\infty}^{\infty} e^{t x^{2}} e^{-x^{2} / 2} d x \\
&=\frac{1}{\sqrt{2 \pi}} \int_{-\infty}^{\infty} e^{-x^{2} / 2 \sigma^{2}} d x \quad \text { where } \sigma^{2}=(1-2 t)^{-1} \\
&=\sigma \\
&=(1-2 t)^{-1 / 2}
\end{aligned}[/latex]

since the normal density with mean 0 and variance σ² integrates to 1 (this requires 1 − 2t > 0, i.e. t < 1/2). Raising this to the power n gives

[latex]M(t)=(1-2 t)^{-n / 2}[/latex]

which is the required moment generating function of the chi-squared distribution with n degrees of freedom.
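
A quick numerical check of this MGF: simulate sums of n squared standard normals and compare the empirical mean of e^{tW} with (1 − 2t)^{−n/2}. In the sketch below, n = 3, t = 0.1 (which must stay below 1/2), and the sample size are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
n, t, samples = 3, 0.1, 500_000              # degrees of freedom, evaluation point (t < 1/2)

z = rng.standard_normal(size=(samples, n))
w = (z**2).sum(axis=1)                       # chi-squared with n degrees of freedom

print(np.mean(np.exp(t * w)))                # empirical M(t)
print((1 - 2 * t) ** (-n / 2))               # (1 - 2t)^{-n/2} from the derivation
```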

Example of Uniform random variable

Find the moment generating function of a random variable X which, given Y = p, is binomially distributed with parameters n and p, where Y is uniformly distributed on the interval (0, 1).

Solution: To find the moment generating function of X, first condition on Y = p:

[latex]E\left[e^{t X} \mid Y=p\right]=\left(p e^{t}+1-p\right)^{n}[/latex]

using the binomial moment generating function. Since Y is a uniform random variable on the interval (0, 1), taking the expectation over Y gives

[latex]
\begin{array}{l}
E\left[e^{t X}\right]=\int_{0}^{1}\left(p e^{t}+1-p\right)^{n} d p
\\=\frac{1}{e^{t}-1} \int_{1}^{e^{t}} y^{n} d y\\
=\frac{1}{n+1} \frac{e^{t(n+1)}-1}{e^{t}-1} \\
=\frac{1}{n+1}\left(1+e^{t}+e^{2 t}+\cdots+e^{n t}\right)
\end{array}
\\\text{by substituting } y=p e^{t}+1-p
[/latex]
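
The final expression (1 + e^t + ⋯ + e^{nt})/(n + 1) is exactly the MGF of a random variable that is uniform on {0, 1, …, n}, so X takes each of these n + 1 values with equal probability. A short simulation sketch illustrating this; the value n = 5 and the sample size are assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
n, samples = 5, 500_000

p = rng.uniform(0.0, 1.0, size=samples)      # Y = p drawn uniformly on (0, 1)
x = rng.binomial(n, p)                       # X | Y = p  ~  Binomial(n, p)

# each value 0, 1, ..., n should appear with probability roughly 1/(n+1)
print(np.bincount(x, minlength=n + 1) / samples)
```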

Joint moment generating function

The joint moment generating function for n random variables X1, X2, …, Xn is defined as

[latex]M\left(t_{1}, \ldots, t_{n}\right)=E\left[e^{t_{1} X_{1}+\cdots+t_{n} X_{n}}\right][/latex]

where t1, t2, …, tn are real numbers. From the joint moment generating function we can recover each individual moment generating function as

[latex]M_{X_{i}}(t)=E\left[e^{t X_{i}}\right]=M(0, \ldots, 0, t, 0, \ldots, 0)[/latex]

Theorem: The random variables X1, X2, …, Xn are independent if and only if the joint moment generating function factorizes as

[latex]M\left(t_{1}, \ldots, t_{n}\right)=M_{X_{1}}\left(t_{1}\right) \cdots M_{X_{n}}\left(t_{n}\right)[/latex]

Proof: Let us assume that the given random variables X1,X2,…,Xn are independent then

[latex]\begin{aligned}
M\left(t_{1}, \ldots, t_{n}\right) &=E\left[e^{\left(t_{1} X_{1}+\cdots+t_{n} X_{n}\right)}\right] \\
&=E\left[e^{t_{1} X_{1}} \ldots e^{t_{n} X_{n}}\right] \\
&=E\left[e^{t_{1} X_{1}}\right] \cdots E\left[e^{t_{n} X_{n}}\right] \quad \text { by independence } \\
&=M_{X_{1}}\left(t_{1}\right) \cdots M_{X_{n}}\left(t_{n}\right)
\end{aligned}[/latex]

Now assume that the joint moment generating function satisfies the equation

[latex]M\left(t_{1}, \ldots, t_{n}\right)=M_{X_{1}}\left(t_{1}\right) \cdots M_{X_{n}}\left(t_{n}\right)[/latex]

Conversely, to prove that X1, X2, …, Xn are independent we use the result that the joint moment generating function uniquely determines the joint distribution (this is another important result, which requires its own proof). Since the product of the individual moment generating functions is exactly the joint MGF of independent random variables with these marginals, the joint distribution must be the one in which the random variables are independent; hence the necessary and sufficient condition is proved.

Example of Joint Moment generating function

1. Calculate the joint moment generating function of X + Y and X − Y, where X and Y are independent normal random variables, each with mean μ and variance σ².

Solution: Using the independence of X and Y, the joint moment generating function of X + Y and X − Y is

[latex]\begin{aligned}
E\left[e^{t(X+Y)+s(X-Y)}\right] &=E\left[e^{(t+s) X+(t-s) Y}\right] \\
&=E\left[e^{(t+s) X}\right] E\left[e^{(t-s) Y}\right] \\
&=e^{\mu(t+s)+\sigma^{2}(t+s)^{2} / 2} e^{\mu(t-s)+\sigma^{2}(t-s)^{2} / 2} \\
&=e^{2 \mu t+\sigma^{2} t^{2}} e^{\sigma^{2} s^{2}}
\end{aligned}[/latex]

Since the joint moment generating function determines the joint distribution, and this one factorizes into a function of t alone times a function of s alone, X + Y and X − Y are independent random variables.
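
A small numerical illustration of this independence: for independent normals X and Y with the same variance, the sample covariance of X + Y and X − Y should be close to zero (and for jointly normal variables, zero correlation is equivalent to independence). The parameter values below are arbitrary assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)
mu, sigma, samples = 1.0, 2.0, 1_000_000     # assumed mean, standard deviation, sample size

x = rng.normal(mu, sigma, size=samples)
y = rng.normal(mu, sigma, size=samples)

s, d = x + y, x - y
print(np.cov(s, d)[0, 1])                    # sample covariance of X+Y and X-Y, near 0
```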

2. Suppose the number of events in an experiment is Poisson distributed with mean λ, and each event is counted independently with probability p. Show that the numbers of counted and uncounted events are independent, with respective means λp and λ(1 − p).

Solution: Let X be the total number of events and Xc the number of counted events, so the number of uncounted events is X − Xc. Conditioning on X = n, the joint moment generating function is

[latex]\begin{aligned}
E\left[e^{s X_{c}+t\left(X-X_{c}\right)} \mid X=n\right] &=e^{t n} E\left[e^{(s-t) X_{c}} \mid X=n\right] \\
&=e^{t n}\left(p e^{s-t}+1-p\right)^{n} \\
&=\left(p e^{s}+(1-p) e^{t}\right)^{n}
\end{aligned}[/latex]

where the second step uses the moment generating function of the binomial distribution; hence, conditionally on X,

[latex]E\left[e^{s X_{c}+t\left(X-X_{c}\right)} \mid X\right]=\left(p e^{s}+(1-p) e^{t}\right)^{X}[/latex]

and taking the expectation over X, using the Poisson moment generating function of X, gives

[latex]E\left[e^{s X_{c}+t\left(X-X_{c}\right)}\right]=E\left[\left(p e^{s}+(1-p) e^{t}\right)^{X}\right]\\
\begin{aligned}
E\left[e^{s X_{c}+t\left(X-X_{c}\right)}\right] &=e^{\lambda\left(p e^{s}+(1-p) e^{t}-1\right)} \\
&=e^{\lambda p\left(e^{s}-1\right)} e^{\lambda(1-p)\left(e^{t}-1\right)}
\end{aligned}[/latex]
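
This joint moment generating function factorizes into a function of s alone and a function of t alone, namely the MGFs of Poisson(λp) and Poisson(λ(1 − p)) variables, so the counted and uncounted events are independent with the stated means. A rough simulation sketch confirming the means and the near-zero covariance; the values λ = 6, p = 0.3, and the sample size are assumptions:

```python
import numpy as np

rng = np.random.default_rng(5)
lam, p, samples = 6.0, 0.3, 500_000          # assumed rate, counting probability, sample size

x = rng.poisson(lam=lam, size=samples)       # total number of events X ~ Poisson(lambda)
xc = rng.binomial(x, p)                      # counted events X_c | X ~ Binomial(X, p)
xu = x - xc                                  # uncounted events

print(xc.mean(), lam * p)                    # approx 1.8
print(xu.mean(), lam * (1 - p))              # approx 4.2
print(np.cov(xc, xu)[0, 1])                  # near 0, consistent with independence
```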

Conclusion:

Using the standard definition of the moment generating function, the moments of different distributions such as the binomial, Poisson, exponential, and normal were discussed, and the moment generating functions of sums of these random variables, whether discrete or continuous, as well as joint moment generating functions, were obtained with suitable examples. If you require further reading, go through the books below.

For more articles on Mathematics, please see our Mathematics page.

A First Course in Probability by Sheldon Ross

Schaum's Outlines of Probability and Statistics

An Introduction to Probability and Statistics by Rohatgi and Saleh
