Inverse Gamma Distribution | Its 6 Important Properties

Inverse gamma distribution and moment generating function of gamma distribution

      Continuing from the gamma distribution, this article covers the inverse gamma distribution, the moment generating function of the gamma distribution, and the measures of central tendency (mean, mode and median) of the gamma distribution, using some basic properties of the gamma distribution.

gamma distribution properties

Some of the important properties of the gamma distribution are listed as follows.

  • The probability density function for the gamma distribution is

f(x) = \begin{cases} \frac{\lambda e^{-\lambda x}(\lambda x)^{\alpha -1}}{\Gamma (\alpha )} &\ x\geq 0 \\ 0 &\ x < 0 \end{cases}

or

f(x) = \begin{cases} \frac{ e^{-\frac{x}{\beta }}(x)^{\alpha -1}}{\beta ^{\alpha }\Gamma (\alpha )} , &\ x\geq 0 \\ 0 &\ x < 0 \end{cases}

where the gamma function is

\Gamma (\alpha )=\int_{0}^{\infty} e^{-y}y^{\alpha -1} dy

  • The cumulative distribution function for the gamma distribution is

F(a)=P(X\in (-\infty,a] ) =\int_{-\infty}^{a} f(x)dx

where f(x) is the probability density function given above; in particular, the cdf is

F(x) = \begin{cases} 0 , &\ x\leq 0 \\ \frac{1}{\Gamma (\alpha )\beta ^{\alpha }}\int_{0}^{x}y^{\alpha -1}e^{-y/\beta } dy , &\ x > 0 \end{cases}
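As a quick numerical sanity check, the shape-scale pdf and cdf above can be compared against scipy, which uses the same convention (a sketch in Python; the values α=2, β=1.5 and x=3 are arbitrary illustrative choices):

```python
# Hedged sanity check of the shape-scale form of the gamma pdf and cdf.
# scipy.stats.gamma uses the same convention: a = alpha (shape), scale = beta.
import math

from scipy.integrate import quad
from scipy.stats import gamma

alpha, beta = 2.0, 1.5  # arbitrary illustrative parameters

def gamma_pdf(x):
    # f(x) = e^{-x/beta} x^{alpha-1} / (beta^alpha Gamma(alpha)), x >= 0
    return math.exp(-x / beta) * x ** (alpha - 1) / (beta ** alpha * math.gamma(alpha))

x = 3.0
assert abs(gamma_pdf(x) - gamma.pdf(x, a=alpha, scale=beta)) < 1e-12

# The cdf is the integral of the pdf from 0 to x
cdf_value, _ = quad(gamma_pdf, 0, x)
assert abs(cdf_value - gamma.cdf(x, a=alpha, scale=beta)) < 1e-8
```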

  • The mean and variance of the gamma distribution are

E[X]=\frac{\alpha}{\lambda}

and

Var(X)=\frac{\alpha}{\lambda^{2}}

respectively or

E[X]=\alpha\beta

and

Var(X)=\alpha\beta^{2}
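These moment formulas are easy to verify numerically; a small sketch with scipy (the values α=3, β=2 are arbitrary):

```python
# Check E[X] = alpha*beta and Var(X) = alpha*beta^2 in the shape-scale form.
from scipy.stats import gamma

alpha, beta = 3.0, 2.0  # arbitrary illustrative parameters
mean, var = gamma.stats(a=alpha, scale=beta, moments='mv')

assert abs(mean - alpha * beta) < 1e-12
assert abs(var - alpha * beta ** 2) < 1e-12
```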

  • The moment generating function M(t) for the gamma distribution is

M(t) = \left ( \frac{1}{1-\beta t} \right )^{\alpha } \ \ if \ \ t< \frac{1}{\beta }

or

= \left ( \frac{\lambda }{\lambda - t} \right )^{\alpha } \ \ if \ \ t< \lambda

  • The curves for the pdf and cdf are as shown in the graphs.

[Figure: pdf and cdf curves]
  • The inverse gamma distribution is obtained by taking the reciprocal of the gamma random variable; its probability density function is

f(x) = \begin{cases} \frac{\beta ^{\alpha }}{\Gamma (\alpha )} x^{-\alpha -1} e^{-\beta /x} &\ x > 0 \\ 0 &\ x \leq 0 \end{cases}

  • The sum of independent gamma random variables with a common rate parameter is again gamma distributed, with the shape parameters added.

inverse gamma distribution | normal inverse gamma distribution

                If, in the gamma distribution with the probability density function

f(x) = \begin{cases} \frac{\lambda e^{-\lambda x}(\lambda x)^{\alpha -1}}{\Gamma (\alpha )} &\ x\geq 0 \\ 0 &\ x < 0 \end{cases}

or

f(x) = \begin{cases} \frac{ e^{-\frac{x}{\beta }}(x)^{\alpha -1}}{\beta ^{\alpha }\Gamma (\alpha )} , &\ x\geq 0 \\ 0 &\ x < 0 \end{cases}

we take the reciprocal of the random variable, then the resulting probability density function (writing β for the rate parameter of the original gamma variable) is

f(x) = \begin{cases} \frac{\beta ^{\alpha }}{\Gamma (\alpha )} x^{-\alpha -1} e^{-\beta /x} &\ x > 0 \\ 0 &\ x \leq 0 \end{cases}

Thus the random variable with this probability density function is known to be the inverse gamma random variable or inverse gamma distribution or inverted gamma distribution.

More precisely, if Y = 1/X, where X is gamma distributed with shape α and rate β, the change of variables gives

f_{Y}(y) = f_{X}(1/y)\left | \frac{\mathrm{d} }{\mathrm{d} y}y^{-1} \right |

= \frac{\beta ^{\alpha }}{\Gamma (\alpha )}y^{-(\alpha -1)}e^{-\beta /y} \, y^{-2}

= \frac{\beta ^{\alpha }}{\Gamma (\alpha )}y^{-\alpha -1}e^{-\beta /y}

Whichever parameterization (λ or β) we use, the probability density function obtained as the reciprocal of the gamma distribution is the probability density function of the inverse gamma distribution.

Cumulative distribution function or cdf of inverse gamma distribution

                The cumulative distribution function for the inverse gamma distribution is the distribution function

F(a)=P(X\in (-\infty,a] ) =\int_{-\infty}^{a} f(x)dx

in which f(x) is the probability density function of the inverse gamma distribution,

f(x) = \begin{cases} \frac{\beta ^{\alpha }}{\Gamma (\alpha )} x^{-\alpha -1} e^{-\beta /x} &\ x > 0 \\ 0 &\ x \leq 0 \end{cases}

Mean and variance of the inverse gamma distribution

  Following the usual definitions of expectation and variance, the mean and variance of the inverse gamma distribution are

E[X]=\frac{\beta }{\alpha -1} \ \ , \alpha > 1

and

Var[X]=\frac{\beta ^{2}}{(\alpha -1)^{2}(\alpha -2)} \ \ , \alpha > 2
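These formulas can be checked against scipy's invgamma, which uses the same shape α and scale β convention (α=4, β=2 are arbitrary; α must exceed 2 for the variance to exist):

```python
# Check the inverse gamma mean and variance formulas numerically.
from scipy.stats import invgamma

alpha, beta = 4.0, 2.0  # arbitrary, with alpha > 2 so the variance exists
mean, var = invgamma.stats(a=alpha, scale=beta, moments='mv')

assert abs(mean - beta / (alpha - 1)) < 1e-12
assert abs(var - beta ** 2 / ((alpha - 1) ** 2 * (alpha - 2))) < 1e-12
```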

Mean and variance of the inverse gamma distribution proof

        To derive the mean and variance of the inverse gamma distribution, we use the probability density function

f(x) = \begin{cases} \frac{\beta ^{\alpha }}{\Gamma (\alpha )} x^{-\alpha -1} e^{-\beta /x} &\ x > 0 \\ 0 &\ x \leq 0 \end{cases}

and the definition of expectation; we first find the expectation of any power of x as

E(X^{n})=\frac{\beta ^{\alpha }}{\Gamma (\alpha )}\int_{0}^{\infty}x^{n}x^{-\alpha -1} e^{-\beta /x} dx

E(X^{n})=\frac{\beta ^{\alpha }}{\Gamma (\alpha )}\int_{0}^{\infty}x^{n-\alpha -1} e^{-\beta /x} dx

E(X^{n})=\frac{\beta ^{\alpha }}{\Gamma (\alpha )} \frac{\Gamma (\alpha -n)}{\beta^{\alpha -n}}

=\frac{\beta ^{n}\Gamma (\alpha-n)}{(\alpha -1)\cdots(\alpha -n)\Gamma (\alpha -n)}

=\frac{\beta ^{n}}{(\alpha -1)\cdots(\alpha -n)}

in the above integral we used the density function as

f(x)=\frac{\beta ^{\alpha }}{\Gamma (\alpha )}x^{-\alpha -1} e^{-\beta /x}

now, for α greater than one and n equal to one,

E(X)=\frac{\beta }{\alpha -1}

similarly, for n=2 and α greater than 2,

E(X^{2})=\frac{\beta^{2} }{(\alpha -1)(\alpha -2)}

using these expectations gives the variance as

Var(X)=E(X^{2}) -E(X)^{2} =\frac{\beta^{2} }{(\alpha -1)^{2}(\alpha -2)}

Inverse gamma distribution plot | Inverse gamma distribution graph

                Since the inverse gamma distribution arises from the reciprocal of a gamma random variable, it is instructive to observe the nature of the curves of the inverse gamma distribution, which has probability density function

f(x) = \begin{cases} \frac{\beta ^{\alpha }}{\Gamma (\alpha )} x^{-\alpha -1} e^{-\beta /x} &\ x > 0 \\ 0 &\ x \leq 0 \end{cases}

and the cumulative distribution function by following

F(a)=P(X\in (-\infty,a] ) =\int_{-\infty}^{a} f(x)dx

[Graph: Inverse gamma distribution]

Description: graphs of the probability density function and cumulative distribution function with the value of α fixed at 1 and the value of β varying.

[Graph: Inverse gamma distribution]

Description: graphs of the probability density function and cumulative distribution function with the value of α fixed at 2 and the value of β varying.

[Graph: Inverse gamma distribution]

Description: graphs of the probability density function and cumulative distribution function with the value of α fixed at 3 and the value of β varying.

[Graph: Inverse gamma distribution]

Description: graphs of the probability density function and cumulative distribution function with the value of β fixed at 1 and the value of α varying.

[Graph: Inverse gamma distribution]

Description: graphs of the probability density function and cumulative distribution function with the value of β fixed at 2 and the value of α varying.

[Graph: Inverse gamma distribution]

Description: graphs of the probability density function and cumulative distribution function with the value of β fixed at 3 and the value of α varying.

moment generating function of gamma distribution

Before discussing the moment generating function of the gamma distribution, let us recall some concepts relating to moment generating functions.

Moments

    The moment of the random variable is defined with the help of expectation as

{\mu_{r} }'=E(X^{r})

this is known as the r-th moment of the random variable X; it is the moment about the origin and is commonly known as the raw moment.

     If we take the r-th moment of the random variable about the mean μ as

{\mu_{r} }=E[(X-\mu)^{r}]

this moment about the mean is known as the central moment, and the expectation is taken according to the nature of the random variable:

{\mu_{r} }=\sum_{x} (x-\mu)^{r} f(x) \ \ (discrete \ \ variable)

{\mu_{r} }=\int_{-\infty}^{\infty}(x-\mu)^{r} f(x) dx \ \ (continuous \ \ variable)

putting r = 0, 1, 2 in the central moment gives some initial moments:

{\mu}_{0}=1, \ \ {\mu}_{1}=0 , \ \ {\mu}_{2}=\sigma ^{2}

Using the binomial expansion in the central moments, we can easily get the relationship between the central and raw moments:

{\mu}_{r}={\mu}'_{r}- \binom{r}{1}{\mu}'_{r-1}{\mu}+\cdots+(-1)^{j}\binom{r}{j}{\mu}'_{r-j}{\mu}^{j} + \cdots+(-1)^{r}{\mu}'_{0}{\mu}^{r}

some of the initial relationships are as follows

{\mu}'_{1}={\mu} \ \ and \ \ {\mu}'_{0}=1 , \ \ \ {\mu}_{2}= {\mu}'_{2}- {\mu}^{2} , \ \ {\mu}_{3}= {\mu}'_{3}-3{\mu}'_{2}{\mu} +2{\mu}^{3} , \ \ {\mu}_{4}= {\mu}'_{4}-4{\mu}'_{3}{\mu}+6{\mu}'_{2}{\mu}^{2} -3{\mu}^{4}

Moment generating function

   The moments can be generated with the help of a function; that function is known as the moment generating function and is defined as

M_{X}(t)=E(e^{tX})

this function generates the moments with the help of the expansion of the exponential function, in either of the forms

M_{X}(t)=\sum_{x} e^{tx}f(x) \ \ (discrete \ \ variable)

M_{X}(t)=\int_{-\infty}^{\infty} e^{tx}f(x) dx \ \ (continuous \ \ variable)

or, using the Taylor expansion,

M_{X}(t)=1+\mu t+\mu'_{2}\frac{t^{2}}{2!} +\cdots+\mu'_{r}\frac{t^{r}}{r!}+\cdots

differentiating this expanded function with respect to t gives the different moments as

\mu'_{r}=\frac{\mathrm{d^{r}} }{\mathrm{d} t^{r}}M_{X}(t)\Big|_{t=0 }

or, in another way, if we take the derivative directly,

M'(t)=\frac{\mathrm{d} }{\mathrm{d} t} E[e^{tX}] = E\left [ \frac{\mathrm{d} }{\mathrm{d} t}(e^{tX}) \right ] =E\left [ Xe^{tX} \right ]

since for both discrete

\frac{\mathrm{d} }{\mathrm{d} t}\left [ \sum_{x}e^{tx}p(x) \right ] =\sum_{x}\frac{\mathrm{d} }{\mathrm{d} t}[e^{tx}p(x)]

and continuous we have

\frac{\mathrm{d} }{\mathrm{d} t}\left [ \int e^{tx}f(x)dx \right ] =\int \frac{\mathrm{d} }{\mathrm{d} t}[e^{tx}f(x)]dx

so for t=0 we will get

M'(0)=E[X]

likewise

M''(t)=\frac{\mathrm{d} }{\mathrm{d} t} M'(t) =\frac{\mathrm{d} }{\mathrm{d} t} E[Xe^{tX}] =E\left [ \frac{\mathrm{d} }{\mathrm{d} t} (Xe^{tX})\right ] =E[X^{2}e^{tX}]

so that

M''(0)=E[X^{2}]

and in general

M^{(n)}(t)=E[X^{n}e^{tX}] \ \ n\geq 1

M^{(n)}(0)=E[X^{n}] \ \ n\geq 1
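To illustrate that derivatives of the mgf at t=0 give the raw moments, one can differentiate a concrete mgf symbolically; a sketch with sympy, using the exponential distribution (mgf λ/(λ−t), the α=1 gamma case):

```python
# Symbolic check that derivatives of the mgf at t=0 give the raw moments.
import sympy as sp

t, lam = sp.symbols('t lam', positive=True)
M = lam / (lam - t)  # mgf of the exponential distribution (gamma with alpha = 1)

m1 = sp.diff(M, t).subs(t, 0)      # M'(0)  -> E[X]   = 1/lam
m2 = sp.diff(M, t, 2).subs(t, 0)   # M''(0) -> E[X^2] = 2/lam^2

assert sp.simplify(m1 - 1 / lam) == 0
assert sp.simplify(m2 - 2 / lam ** 2) == 0
```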

there are two important relations for moment generating functions (the second holds when X and Y are independent):

M_{(X+a)/b}(t)=e^{at/b} M_{X}(t/b)

M_{X+Y}(t)=M_{X}(t) M_{Y}(t)

moment generating function of a gamma distribution | mgf of gamma distribution | moment generating function for gamma distribution

Now for the gamma distribution the moment generating function M(t) for the pdf

f(x) = \begin{cases} \frac{ e^{-\frac{x}{\beta }}(x)^{\alpha -1}}{\beta ^{\alpha }\Gamma (\alpha )} , &\ x\geq 0 \\ 0 &\ x < 0 \end{cases}

is

M(t)=\left ( \frac{1}{1-\beta t} \right )^{\alpha } \ \ if \ \ t< \frac{1}{\beta }

and for the pdf

f(x) = \begin{cases} \frac{\lambda e^{-\lambda x}(\lambda x)^{\alpha -1}}{\Gamma (\alpha )} &\ x\geq 0 \\ 0 &\ x < 0 \end{cases}

the moment generating function is

M(t)=\left ( \frac{\lambda }{\lambda -t} \right )^{\alpha } \ \ if \ \ t< \lambda

gamma distribution moment generating function proof | mgf of gamma distribution proof

    First, take the probability density function in the form

f(x) = \begin{cases} \frac{\lambda e^{-\lambda x}(\lambda x)^{\alpha -1}}{\Gamma (\alpha )} &\ x\geq 0 \\ 0 &\ x < 0 \end{cases}

and using the definition of moment generating function M(t) we have

M_{X}(t)=E(e^{tX})

M_{X}(t)=E\left [ e^{tX} \right ]

=\frac{\lambda ^{\alpha }}{\Gamma (\alpha )}\int_{0}^{\infty} e^{tx}e^{-\lambda x}x^{\alpha -1} dx

=\frac{\lambda ^{\alpha }}{\Gamma (\alpha )}\int_{0}^{\infty} e^{-(\lambda -t)x}x^{\alpha -1} dx

=\left ( \frac{\lambda }{\lambda -t} \right )^{\alpha }\frac{1}{\Gamma (\alpha )} \int_{0}^{\infty} e^{-y}y^{\alpha -1} dy \ \ \ \ [by \ \ y=(\lambda -t)x , \ \ t< \lambda ]

=\left ( \frac{\lambda }{\lambda -t} \right )^{\alpha }

We can find the mean and variance of the gamma distribution from its moment generating function: differentiating this function twice with respect to t gives

M'(t)=\frac{\alpha \lambda ^{\alpha }}{(\lambda -t)^{\alpha +1}} \ \ , \ \ M''(t)= \frac{\alpha (\alpha +1)\lambda ^{\alpha }}{(\lambda -t)^{\alpha +2}}

putting t=0 in these derivatives gives

E[X]=\frac{\alpha }{\lambda }

and

E[X^{2}]=\frac{\alpha(\alpha +1)}{\lambda^{2} }

Now putting the values of these expectations in

Var(X)= E[X^{2}] -E[X]^{2}

Var(X)= \frac{\alpha (\alpha +1)}{\lambda ^{2}} -\frac{\alpha ^{2}}{\lambda^{2}} =\frac{\alpha ^{2}+\alpha }{\lambda ^{2}} -\frac{\alpha ^{2}}{\lambda^{2}} = \frac{\alpha }{\lambda ^{2}}
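The same differentiation can be carried out symbolically; a sketch with sympy verifying E[X]=α/λ and Var(X)=α/λ² from the mgf:

```python
# Symbolic verification of the gamma mean and variance via the mgf.
import sympy as sp

t, lam, alpha = sp.symbols('t lam alpha', positive=True)
M = (lam / (lam - t)) ** alpha  # mgf of the gamma distribution, valid for t < lam

EX = sp.diff(M, t).subs(t, 0)       # E[X]
EX2 = sp.diff(M, t, 2).subs(t, 0)   # E[X^2]
var = sp.simplify(EX2 - EX ** 2)

assert sp.simplify(EX - alpha / lam) == 0
assert sp.simplify(var - alpha / lam ** 2) == 0
```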

alternatively, for the pdf of the form

f(x) = \begin{cases} \frac{ e^{-\frac{x}{\beta }}(x)^{\alpha -1}}{\beta ^{\alpha }\Gamma (\alpha )} , &\ x\geq 0 \\ 0 &\ x < 0 \end{cases}

the moment generating function will be

M(t)=\frac{1}{\Gamma (\alpha )\beta ^{\alpha }}\int_{0}^{\infty}e^{x(t-1/\beta )} x^{\alpha -1} dx = \left ( \frac{1}{1-\beta t} \right )^{\alpha }\int_{0}^{\infty} \frac{y^{\alpha -1} e^{-y}}{\Gamma (\alpha )} dy = (1-\beta t)^{-\alpha } \ \ , \ \ t< \frac{1}{\beta }

and differentiating and putting t=0 gives the mean and variance as follows:

E[X]=M'(t) \Big|_{t=0} =\alpha \beta \ \ , \ \ E[X^{2}] =M''(t) \Big|_{t=0}=\alpha (\alpha +1)\beta ^{2} \ \ , \ \ Var(X) =\alpha \beta ^{2}

2nd moment of gamma distribution

   The second moment of the gamma distribution is obtained by differentiating the moment generating function twice and putting t=0 in the second derivative, which gives

E[X^{2}]=\frac{\alpha (\alpha +1)}{\lambda ^{2}}

third moment of gamma distribution

                The third moment of the gamma distribution can be found by differentiating the moment generating function three times and putting t=0 in the third derivative, which gives

E[X^{3}]=\frac{\alpha (\alpha +1)(\alpha +2)}{\lambda ^{3}}

or directly by integrating as

E[X^{3}]=\int_{0}^{\infty}x^{3}f_{X}(x)dx =\int_{0}^{\infty}\frac{\lambda ^{\alpha }x^{3+\alpha -1}e^{-\lambda x}}{\Gamma (\alpha )} dx

= \frac{1}{\lambda^{3}}\int_{0}^{\infty} \frac{\lambda ^{\alpha +3}x^{3+\alpha -1}e^{-\lambda x}}{\Gamma (\alpha )} dx

= \frac{\Gamma (\alpha +3)}{\lambda ^{3}\Gamma (\alpha )} \int_{0}^{\infty} \frac{\lambda ^{\alpha +3}x^{3+\alpha -1}e^{-\lambda x}}{\Gamma (\alpha +3 )} dx = \frac{\Gamma (\alpha +3)}{\lambda ^{3}\Gamma (\alpha )} = \frac{\alpha (\alpha +1)(\alpha +2)}{\lambda ^{3}}

where the last integral equals one because its integrand is the gamma density with shape α+3.
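A quick numerical check of the third moment with scipy (α and λ are arbitrary; scipy's scale corresponds to 1/λ in the rate form):

```python
# Check E[X^3] = alpha(alpha+1)(alpha+2)/lambda^3 numerically.
from scipy.stats import gamma

alpha, lam = 2.5, 1.5  # arbitrary illustrative parameters
third_moment = gamma.moment(3, alpha, scale=1 / lam)
expected = alpha * (alpha + 1) * (alpha + 2) / lam ** 3

assert abs(third_moment - expected) < 1e-8
```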

sigma for gamma distribution

   The sigma, or standard deviation, of the gamma distribution is obtained by taking the square root of its variance, which is either

Var(X)=\alpha \beta ^{2}

or

Var(X)= \frac{\alpha}{\lambda ^{2}}

for any defined value of alpha, beta and lambda.

characteristic function of gamma distribution | gamma distribution characteristic function

      If the variable t in the moment generating function is purely imaginary, t=iω, then the function is known as the characteristic function of the gamma distribution, denoted and expressed as

\phi_{X}(\omega )=M_{X}(i\omega )=E(e^{i\omega X})

since for any random variable the characteristic function is

\phi_{X}(\omega )= \sum_{x} e^{i\omega x}f(x) \ \ (discrete \ \ variable)

\phi_{X}(\omega )= \int_{-\infty}^{\infty} e^{i\omega x}f(x) dx \ \ (continuous\ \ variable)

Thus for the gamma distribution, following its pdf, the characteristic function is

\phi_{X}(\omega )= (1-i\beta \omega )^{-\alpha }

following

\int_{0}^{\infty}x^{\alpha -1}e^{-x(1-i\beta \omega )/\beta } dx =\left ( \frac{1-i\beta \omega }{\beta } \right )^{-\alpha }\int_{0}^{\infty}y^{\alpha -1}e^{-y} dy=\Gamma (\alpha )\beta ^{\alpha }(1-i\beta \omega )^{-\alpha }

There is another form of this characteristic function as well: if

M_{X}(t)=(1-\frac{2h}{n}t)^{-n/2}

then

\phi_{X} (t)=(1-\frac{2h}{n}it)^{-n/2}

sum of gamma distributions | sum of exponential distribution gamma

  To know the result of the sum of gamma distributions, we must first understand the sum of independent continuous random variables. Let the continuous random variables X and Y have probability density functions f_X and f_Y; then the cumulative distribution function for the sum of the random variables is

F_{X+Y}(a)=P (X +Y \leq a )

=\iint_{x+y\leq a} f_{X}(x)f_{Y}(y)dx \ dy

=\int_{-\infty}^{\infty}\int_{-\infty}^{a-y}f_{X}(x)f_{Y}(y)dx \ dy

=\int_{-\infty}^{\infty}\left ( \int_{-\infty}^{a-y}f_{X}(x)dx \right ) f_{Y}(y) dy

=\int_{-\infty}^{\infty}F_{X} (a-y) f_{Y}(y)dy

differentiating this convolution integral for the probability density functions of X and Y gives the probability density function for the sum of the random variables:

f_{X+Y}(a) =\frac{\mathrm{d} }{\mathrm{d} a}\int_{-\infty}^{\infty}F_{X}(a-y)f_{Y}(y) dy

= \int_{-\infty}^{\infty}\frac{\mathrm{d} }{\mathrm{d} a}F_{X}(a-y)f_{Y}(y) dy

= \int_{-\infty}^{\infty}f_{X}(a-y)f_{Y}(y) dy

Now let us prove that if X and Y are gamma random variables with a common rate parameter, then their sum is also a gamma random variable with the shape parameters added.

considering the probability density function of the form

f(x) = \begin{cases} \frac{\lambda e^{-\lambda x}(\lambda x)^{\alpha -1}}{\Gamma (\alpha )} &\ x\geq 0 \\ 0 &\ x < 0 \end{cases}

for the random variable X take the shape parameter α as s, and for the random variable Y take it as t; using the probability density for the sum of random variables we have

f_{X+Y}(a) =\frac{1}{\Gamma (s)\Gamma (t)}\int_{0}^{a}\lambda e^{-\lambda (a-y)} (\lambda (a-y))^{s-1}\lambda e^{-\lambda y} (\lambda y)^{t-1} dy

= C e^{-\lambda a}\int_{0}^{a}(a-y)^{s-1}y^{t-1} dy = C e^{-\lambda a}a^{s+t-1}\int_{0}^{1}(1-x)^{s-1}x^{t-1} dx \ \ \ \ [by \ \ y=ax ]

where C is independent of a; since the density must integrate to one, its value will be

f_{X+Y}(a) =\frac{\lambda e^{-\lambda a}(\lambda a)^{s+t-1}}{\Gamma (s+t)}

which is the probability density function of a gamma random variable with shape parameter s+t; hence the sum of independent gamma distributions (with a common rate) again represents a gamma distribution, with the respective shape parameters added.
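This closure under addition is easy to see in simulation; a sketch (the shapes s, t and rate λ are arbitrary) comparing the empirical distribution of X+Y with Gamma(s+t) via a Kolmogorov-Smirnov statistic:

```python
# Simulation check: X ~ Gamma(s, rate lam), Y ~ Gamma(t, rate lam) independent
# implies X + Y ~ Gamma(s + t, rate lam).
import numpy as np
from scipy.stats import gamma, kstest

rng = np.random.default_rng(0)
s, t, lam = 2.0, 3.0, 1.5  # arbitrary illustrative parameters
n = 100_000

x = rng.gamma(shape=s, scale=1 / lam, size=n)
y = rng.gamma(shape=t, scale=1 / lam, size=n)

# KS statistic of the sample of sums against the Gamma(s + t) cdf
stat, _ = kstest(x + y, gamma(a=s + t, scale=1 / lam).cdf)
assert stat < 0.01  # empirical cdf tracks the Gamma(s + t) cdf closely
```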

mode of gamma distribution

    To find the mode of gamma distribution let us consider the probability density function as

f(x) = \begin{cases} \frac{\lambda e^{-\lambda x}(\lambda x)^{\alpha -1}}{\Gamma (\alpha )} &\ x\geq 0 \\ 0 &\ x < 0 \end{cases}

now, differentiating this pdf with respect to x, we get

f'(x) = \frac{\lambda ^{\alpha }}{\Gamma (\alpha )}e^{-\lambda x} [(\alpha -1)x^{\alpha -2}-\lambda x^{\alpha -1}] = \frac{\lambda ^{\alpha }}{\Gamma (\alpha )}e^{-\lambda x} x^{\alpha -2}[(\alpha -1)-\lambda x]

this is zero for x=0 or x=(α-1)/λ, and these are the only critical points of the density. For α greater than one, x=0 makes the pdf zero, so it cannot be the mode, and the mode is (α-1)/λ; for α less than or equal to one, the density decreases as x increases from zero to infinity, so there is no interior maximum. Hence, for α>1, the mode of the gamma distribution is

\textbf{mode} =\mathbf{\frac{\alpha -1}{\lambda }} \ \ , \ \ \alpha > 1
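A grid search over the density confirms this; a sketch with scipy (α=3, λ=2 are arbitrary, so the mode should be (3−1)/2 = 1):

```python
# Numeric check that the gamma pdf peaks at (alpha - 1)/lambda for alpha > 1.
import numpy as np
from scipy.stats import gamma

alpha, lam = 3.0, 2.0  # arbitrary, with alpha > 1
xs = np.linspace(1e-6, 10, 200_001)
x_peak = xs[np.argmax(gamma.pdf(xs, a=alpha, scale=1 / lam))]

assert abs(x_peak - (alpha - 1) / lam) < 1e-3
```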

median of gamma distribution

The median of the gamma distribution can be expressed with the help of the inverse of the lower incomplete gamma function γ as

\textbf{median} =\frac{1}{\lambda }\gamma ^{-1} \left ( \alpha , \frac{\Gamma (\alpha )}{2} \right )

or

\textbf{median} =\beta \gamma ^{-1}\left ( \alpha , \frac{\Gamma (\alpha )}{2} \right )

Moreover, for a gamma distribution with shape parameter n+1 and unit scale, the median satisfies the bounds

n+\frac{2}{3}< median(n)< \min \left ( n+\log 2 , \ n+\frac{2}{3}+(2n+2)^{-1} \right )

which give the asymptotic expansion

median(n)=n+\frac{2}{3}+\frac{8}{405n} -\frac{64}{5103n^{2}}+\cdots
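In practice the median is computed by inverting the regularized incomplete gamma function; a sketch with scipy (α and λ are arbitrary):

```python
# The median m solves P(alpha, lam*m) = 1/2, where P is the regularized
# lower incomplete gamma function, so m = gammaincinv(alpha, 0.5) / lam.
from scipy.special import gammaincinv
from scipy.stats import gamma

alpha, lam = 3.0, 2.0  # arbitrary illustrative parameters
m = gammaincinv(alpha, 0.5) / lam

assert abs(gamma.cdf(m, a=alpha, scale=1 / lam) - 0.5) < 1e-12
assert abs(m - gamma.median(a=alpha, scale=1 / lam)) < 1e-10
```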

gamma distribution shape

     The gamma distribution takes different shapes depending on the shape parameter: when the shape parameter is one, the gamma distribution equals the exponential distribution, and as the shape parameter increases, the skewness of the curve of the gamma distribution decreases; in other words, the shape of the curve changes with the shape parameter.

skewness of gamma distribution

    The skewness of any distribution can be observed from the probability density function of that distribution and the skewness coefficient

\gamma_{1}=\frac{E\left [ \left ( X -\mu \right )^{3} \right ]}{\sigma ^{3}} =\frac{\mu_{3}}{\sigma ^{3}}

for the gamma distribution we have

E(X^{k})=\frac{(\alpha +k-1)(\alpha +k-2)\cdots\alpha }{\beta ^{k}}

so

\gamma _{1}=\frac{ \frac{(\alpha +2)(\alpha +1)\alpha }{\beta ^{3}}-3\frac{\alpha }{\beta }\frac{\alpha }{\beta ^{2}}-\frac{\alpha ^{3}}{\beta ^{3}} }{\left ( \frac{\alpha }{\beta ^{2}} \right )^{\frac{3}{2}}}=\frac{2}{\sqrt{\alpha }}

this shows that the skewness depends only on α: as α increases to infinity the curve becomes more symmetric and sharp, and as α goes to zero the gamma density curve becomes more positively skewed, which can be observed in the density graphs.
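That the skewness is 2/√α, independent of the scale, can be checked directly; a sketch with scipy:

```python
# Skewness of the gamma distribution depends only on the shape parameter.
import math
from scipy.stats import gamma

for alpha in [0.5, 1.0, 4.0, 25.0]:
    skew = gamma.stats(a=alpha, scale=3.0, moments='s')  # scale is irrelevant
    assert abs(skew - 2 / math.sqrt(alpha)) < 1e-10
```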

generalized gamma distribution | shape and scale parameter in gamma distribution | three parameter gamma distribution | multivariate gamma distribution

The three-parameter (generalized) gamma distribution has probability density function

f(x)=\frac{ (\frac{x-\mu}{\beta })^{\gamma -1}e^{-\frac{x-\mu}{\beta }}}{\beta \Gamma (\gamma )} \ \ x\geq \mu ; \gamma ,\beta > 0

where γ, μ and β are the shape, location and scale parameters respectively. By assigning specific values to these parameters we can get the two-parameter gamma distribution; specifically, putting μ=0, β=1 gives the standard gamma distribution

f(x)=\frac{x^{\gamma -1}e^{-x}}{\Gamma(\gamma)} \ \ x\geq 0 ; \gamma > 0

Using this three-parameter gamma probability density function, we can find the expectation and variance by following their respective definitions.
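In scipy the three-parameter (shape, location, scale) family is just the gamma distribution with a loc offset; a sketch (γ, μ, β arbitrary) checking that the mean is μ+γβ and the variance is γβ²:

```python
# Three-parameter gamma = two-parameter gamma shifted by the location mu.
from scipy.stats import gamma

g, mu, beta = 2.0, 1.0, 3.0  # arbitrary shape, location and scale
dist = gamma(a=g, loc=mu, scale=beta)
mean, var = dist.stats(moments='mv')

assert abs(mean - (mu + g * beta)) < 1e-12
assert abs(var - g * beta ** 2) < 1e-12
```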

Conclusion:

The focus of this article was the inverse gamma distribution, the reciprocal of the gamma distribution, in comparison with the gamma distribution, along with the measures of central tendency of the gamma distribution obtained with the help of the moment generating function. If you require further reading, go through the suggested books and links. For more posts on mathematics, visit our mathematics page.

https://en.wikipedia.org/wiki/Gamma_distribution

A First Course in Probability by Sheldon Ross

Schaum's Outlines of Probability and Statistics

An Introduction to Probability and Statistics by Rohatgi and Saleh

About DR. MOHAMMED MAZHAR UL HAQUE

I am Dr. Mohammed Mazhar Ul Haque, an Assistant Professor in Mathematics with 12 years of teaching experience and broad knowledge of pure mathematics, particularly algebra, together with a strong ability in problem designing and solving and in motivating candidates to enhance their performance.
I love to contribute to Lambdageeks to make mathematics simple, interesting and self-explanatory for beginners as well as experts.
Let's connect through LinkedIn - https://www.linkedin.com/in/dr-mohammed-mazhar-ul-haque-58747899/