Gamma Distribution Exponential Family | Its 5 Important Properties

Content

  1. Special form of Gamma distributions and relationships of Gamma distribution
  2. Gamma distribution exponential family
  3. Relationship between gamma and normal distribution
  4. Poisson gamma distribution | poisson gamma distribution negative binomial
  5. Weibull gamma distribution
  6. Application of gamma distribution in real life | gamma distribution uses | application of gamma distribution in statistics 
  7. Beta gamma distribution | relationship between gamma and beta distribution
  8. Bivariate gamma distribution
  9. Double gamma distribution
  10. Relation between gamma and exponential distribution | exponential and gamma distribution | gamma exponential distribution
  11. Fit gamma distribution
  12. Shifted gamma distribution
  13. Truncated gamma distribution
  14. Survival function of gamma distribution
  15. MLE of gamma distribution | maximum likelihood gamma distribution | likelihood function of gamma distribution
  16. Gamma distribution parameter estimation method of moments | method of moments estimator gamma distribution
  17. Confidence interval for gamma distribution
  18. Gamma distribution conjugate prior for exponential distribution | gamma prior distribution | posterior distribution poisson gamma
  19. Gamma distribution quantile function
  20. Generalized gamma distribution
  21. Beta generalized gamma distribution

Special form of Gamma distributions and relationships of Gamma distribution

In this article we will discuss the special forms of the gamma distribution and its relationships with different continuous and discrete random variables; some estimation methods that use the gamma distribution in sampling from a population are also briefly discussed.

Gamma distribution exponential family

The gamma distribution forms a two-parameter exponential family, a large and widely applicable family of distributions: many real-life problems can be modelled within it, and quick, useful calculations within the exponential family can be carried out easily. In the two-parameter case, if we take the probability density function as

\frac{x^{\alpha -1}e^{-x/\lambda }}{\lambda ^{\alpha }\Gamma (\alpha )}I_{x> 0}

then, if we restrict α (alpha) to a known value, this two-parameter family reduces to a one-parameter exponential family in λ (lambda),

f(x\mid \lambda )=e^{-x/\lambda -\alpha \log \lambda }\frac{x^{\alpha -1}}{\Gamma (\alpha )}I_{x> 0}

and if instead λ (lambda) is known, the one-parameter family in α is

f(x\mid \alpha )=e^{\alpha \log x-\alpha \log \lambda -\log \Gamma (\alpha )}\frac{e^{-x/\lambda }}{x}I_{x> 0}
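For reference, a compact way to see the two-parameter exponential family structure is to collect everything into a single exponential, with natural parameters (α - 1, -1/λ), sufficient statistics (log x, x) and log-partition term α log λ + log Γ(α):

f(x\mid \alpha ,\lambda )=\exp\left \{ (\alpha -1)\log x-\frac{x}{\lambda }-\alpha \log \lambda -\log \Gamma (\alpha ) \right \}I_{x> 0}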

Relationship between gamma and normal distribution

If in the probability density function of the gamma distribution we take alpha close to 50, the density function looks as follows:

[Figure: gamma distribution density curve for a large shape parameter]

As the shape parameter of the gamma distribution increases, the density curve looks more and more like the normal curve. If the shape parameter alpha tends to infinity the gamma distribution becomes more symmetric and closer to normal, but since the support of the gamma distribution is semi-infinite (x > 0) while the normal distribution is supported on the whole real line, the gamma distribution becomes symmetric yet never identical to the normal distribution.
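A minimal numerical sketch of this convergence, assuming Python with NumPy and SciPy (the shape values chosen here are only illustrative):

```python
import numpy as np
from scipy import stats

# Gamma(alpha, scale=1) has mean alpha and variance alpha, so we compare it
# with a Normal(alpha, sqrt(alpha)) density for growing alpha.
for alpha in (2, 10, 50, 200):
    x = np.linspace(stats.gamma.ppf(0.001, alpha), stats.gamma.ppf(0.999, alpha), 500)
    gamma_pdf = stats.gamma.pdf(x, alpha)
    normal_pdf = stats.norm.pdf(x, loc=alpha, scale=np.sqrt(alpha))
    # The maximum absolute difference between the two densities shrinks as alpha grows
    print(alpha, np.max(np.abs(gamma_pdf - normal_pdf)))
```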

poisson gamma distribution | poisson gamma distribution negative binomial

The Poisson-gamma distribution and the binomial distribution are distributions of discrete random variables: they deal with discrete values, specifically successes and failures in the form of Bernoulli trials, each of which gives only a random success or failure as its result. The mixture of the Poisson and gamma distributions, also known as the negative binomial distribution, describes the outcome of repeated Bernoulli trials, and it can be parameterized in different ways. If the r-th success occurs on the x-th trial it can be parameterized as

P(X_{1}=x|p,r)=\binom{x-1}{r-1}p^{r}(1-p)^{x-r}

and if X counts the number of failures before the r-th success, then it can be parameterized as

P(X_{2}=x|p,r)=\binom{x+r-1}{x}p^{r}(1-p)^{x}

and considering the values of r and p

r=\frac{\mu^{2}}{\sigma ^{2}-\mu}

p=\frac{r}{r+\mu}

the general form of the parameterization for the negative binomial or poisson gamma distribution is

P(X=x)=\binom{x+r-1}{x}p^{r}(1-p)^{x} \ \ x=0,1,2,…

and an alternative one is

P(X=x)=\binom{x+r-1}{x} \left ( \frac{\alpha }{\alpha +1} \right )^{r} \left ( \frac{1}{\alpha +1} \right )^{x} \ \ x=0,1,2,…

This distribution is called a negative binomial because of the coefficient

\binom{x+r-1}{x}=\frac{(x+r-1)(x+r-2)\cdots r}{x!}=(-1)^{x}\frac{(-r-(x-1))(-r-(x-2))\cdots (-r)}{x!}=(-1)^{x}\frac{(-r)(-r-1)\cdots (-r-(x-1))}{x!}=(-1)^{x}\binom{-r}{x}

and this negative binomial or Poisson-gamma distribution is well defined, since the total probability of this distribution equals one:

1=p^{r}p^{-r}=p^{r}(1-q)^{-r}=p^{r}\sum_{x=0}^{\infty}\binom{-r}{x}(-q)^{x}=p^{r}\sum_{x=0}^{\infty}(-1)^{x}\binom{-r}{x}q^{x}=\sum_{x=0}^{\infty}\binom{x+r-1}{x}p^{r}q^{x}

The mean and variance for this negative binomial or Poisson-gamma distribution are

E(X)=\frac{r(1-p)}{p}

var(X)=\frac{r(1-p)}{p^{2}}

The relation between the Poisson and the gamma distribution can be obtained by the following calculation:

P(X=x)=\frac{1}{\Gamma (\alpha) \beta ^{\alpha }}\int_{0}^{\infty}\frac{e^{-\lambda }\lambda ^{x}}{x!}\lambda ^{\alpha -1}e^{-\lambda /\beta } d\lambda

=\frac{1}{x!\Gamma (\alpha)\beta ^{\alpha }}\int_{0}^{\infty}\lambda ^{\alpha +x-1}e^{-\lambda (1+1/\beta )}d\lambda

=\frac{1}{\Gamma (x+1)\Gamma (\alpha )\beta ^{\alpha }} \Gamma (\alpha +x)\left ( \frac{\beta }{\beta +1} \right )^{\alpha +x}

=\binom{\alpha +x-1}{x}\left ( \frac{1}{\beta +1} \right )^{\alpha } \left ( 1-\frac{1}{\beta +1} \right )^{x}

Thus the negative binomial is the mixture of the Poisson and gamma distributions, and this distribution is used in modelling day-to-day problems where a mixture of discrete and continuous behaviour is required.
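The mixture identity derived above can be checked numerically; a short sketch assuming SciPy, with illustrative values of α and β:

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

alpha, beta = 3.0, 2.0   # illustrative gamma shape and scale

def poisson_gamma_pmf(x):
    # Integrate the Poisson(x | lam) pmf against the Gamma(alpha, scale=beta) density of lam
    integrand = lambda lam: stats.poisson.pmf(x, lam) * stats.gamma.pdf(lam, alpha, scale=beta)
    return quad(integrand, 0, np.inf)[0]

# Negative binomial with r = alpha and success probability p = 1/(beta + 1)
p = 1.0 / (beta + 1.0)
for x in range(5):
    print(x, poisson_gamma_pmf(x), stats.nbinom.pmf(x, alpha, p))
```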


Weibull gamma distribution

There are generalizations of the exponential distribution that involve the Weibull as well as the gamma distribution. The Weibull distribution has the probability density function

f(x) = \begin{cases} 0 & x \leq v \\ \frac{\beta }{\alpha }\left ( \frac{x-v}{\alpha } \right )^{\beta -1} \exp\left \{ -\left ( \frac{x-v}{\alpha } \right )^{\beta } \right \} & x > v \end{cases}

and cumulative distribution function as

F(x) = \begin{cases} 0 & x \leq v \\ 1- \exp\left \{ -\left ( \frac{x-v}{\alpha } \right )^{\beta } \right \} & x > v \end{cases}

The pdf and cdf of the gamma distribution were already discussed above. The main connection between the Weibull and the gamma distribution is that both are generalizations of the exponential distribution; the difference between them is that when the power of the variable is greater than one the Weibull distribution gives quick results, while for a power less than one the gamma distribution gives quick results.
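Both densities can be evaluated side by side; a brief sketch assuming SciPy, with the location v set to zero and illustrative shape and scale values:

```python
import numpy as np
from scipy import stats

x = np.linspace(0.01, 5, 200)
shape, scale = 1.5, 1.0          # illustrative values

weibull_pdf = stats.weibull_min.pdf(x, shape, scale=scale)
gamma_pdf = stats.gamma.pdf(x, shape, scale=scale)
print(np.max(np.abs(weibull_pdf - gamma_pdf)))   # the densities differ when shape != 1

# With shape = 1 both reduce to the exponential distribution with the same scale
print(np.allclose(stats.weibull_min.pdf(x, 1, scale=scale),
                  stats.gamma.pdf(x, 1, scale=scale)))
```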

We will not discuss the generalized Weibull-gamma distribution here; it requires a separate discussion.

application of gamma distribution in real life | gamma distribution uses | application of gamma distribution in statistics 

There are a number of applications where the gamma distribution is used to model a situation, such as aggregating insurance claims, accumulated rainfall amounts, manufacturing and distribution of a product, the traffic on a specific website, telecom exchanges and so on. In fact, the gamma distribution predicts the waiting time until the n-th next event occurs, so there are many applications of the gamma distribution in real life.

beta gamma distribution | relationship between gamma and beta distribution

The beta distribution is the distribution of a random variable with probability density function

f(x) = \begin{cases} \ \frac{1}{B(a,b)}x^{a-1}(1-x)^{b-1} &\ 0< x < 1 \\ \ 0 &\ otherwise \end{cases}

where

B(a,b)= \int_{0}^{1}x^{a-1}(1-x)^{b-1} dx

which has the relationship with gamma function as

B(a,b)= \frac{\Gamma (a)\Gamma (b)}{\Gamma (a+b)}

and the beta distribution is related to the gamma distribution as follows: if X is a gamma random variable with shape parameter α and scale one, and Y is an independent gamma random variable with shape parameter β and scale one, then the random variable X/(X+Y) follows a beta distribution,

or, if X is Gamma(α, 1) and Y is Gamma(β, 1), independent of each other, then the random variable X/(X+Y) is Beta(α, β),

and also, in the sense of convergence in distribution, if a random variable has the Beta(k, n) distribution then n times that variable converges to the Gamma(k, 1) distribution as n tends to infinity:

\mathbf{\lim_{n \to \infty} nB(k,n) =\Gamma (k,1)}
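This relationship between the gamma and beta distributions is easy to verify by simulation; a sketch assuming NumPy and SciPy, with illustrative parameters α = 2 and β = 5:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
a, b = 2.0, 5.0                       # illustrative shape parameters
X = rng.gamma(shape=a, scale=1.0, size=100_000)
Y = rng.gamma(shape=b, scale=1.0, size=100_000)
Z = X / (X + Y)

# Compare the simulated ratio with the Beta(a, b) distribution
print(stats.kstest(Z, "beta", args=(a, b)))     # large p-value: consistent with Beta(a, b)
print(Z.mean(), a / (a + b))                    # sample mean vs. theoretical mean
```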

bivariate gamma distribution

A two-dimensional or bivariate random variable is continuous if there exists a function f(x,y) such that the joint distribution function is

F(x,y)=\int_{-\infty}^{x}\left [ \int_{-\infty}^{y}f(u,v) dv \right ]du

where

F(+\infty,+\infty)=\lim_{x \to +\infty, y \to +\infty } \int_{-\infty}^{x}\int_{-\infty}^{y} f(u,v)dvdu

= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f(u,v)dvdu =1

and the joint probability density function is obtained by

\frac{\partial^2 F(x,y)}{\partial x \partial y }= f(x,y)

There are a number of bivariate gamma distributions; one of them is the bivariate gamma distribution with probability density function

f(x,y)=\frac{\beta ^{\alpha +\gamma }}{\Gamma (\alpha )\Gamma (\gamma )}x^{\alpha -1}(y-x)^{\gamma -1}e^{-\beta y}, \ \ 0< x< y, \ \ \alpha ,\beta ,\gamma > 0
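As a quick sanity check that this bivariate density integrates to one over the region 0 < x < y, here is a sketch assuming SciPy, with illustrative parameter values:

```python
import numpy as np
from scipy.integrate import dblquad
from scipy.special import gamma as gamma_fn

alpha, beta, gam = 2.0, 1.5, 3.0   # illustrative parameters

def pdf(x, y):
    # Bivariate gamma density on the region 0 < x < y
    return (beta ** (alpha + gam) / (gamma_fn(alpha) * gamma_fn(gam))
            * x ** (alpha - 1) * (y - x) ** (gam - 1) * np.exp(-beta * y))

# Outer variable y over (0, inf), inner variable x over (0, y)
total, _ = dblquad(pdf, 0, np.inf, 0, lambda y: y)
print(total)   # should be close to 1
```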

double gamma distribution

The double gamma distribution is a bivariate distribution of gamma random variables with shape parameters α1 and α2 and scale one, with joint probability density function

f_{Y_{1}Y_{2}}(y_{1},y_{2})=\frac{1}{\Gamma (\alpha _{1})\Gamma (\alpha _{2})}y_{1}^{\alpha_{1} -1}y_{2}^{\alpha_{2} -1} \exp(-y_{1} -y_{2}), \ \ y_{1}> 0, y_{2}> 0

This density defines the double gamma distribution for the respective random variables, and the moment generating function for the double gamma distribution is

\mathbf{M_{Y_{1}Y_{2}}(t,s)=\left ( \frac{1}{1-t} \right )^{\alpha _{1}} \left (\frac{1}{1-s} \right )^{\alpha _{2}} }

relation between gamma and exponential distribution | exponential and gamma distribution | gamma exponential distribution

Since the exponential distribution is the distribution with probability density function

f(x) = \begin{cases} \lambda e^{-\lambda x} & x\geq 0 \\ 0 & x< 0 \end{cases}

and the gamma distribution has the probability density function

f(x) = \begin{cases} \frac{\lambda e^{-\lambda x}(\lambda x)^{\alpha -1}}{\Gamma (\alpha )} & x\geq 0 \\ 0 & x < 0 \end{cases}

clearly, if we set the value of alpha equal to one we get the exponential distribution; that is, the gamma distribution is nothing but a generalization of the exponential distribution. It predicts the waiting time until the occurrence of the next n-th event, while the exponential distribution predicts the waiting time until the occurrence of the next event.
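The waiting-time interpretation can be illustrated by simulation: the sum of n independent Exponential(λ) inter-arrival times has a Gamma(n, 1/λ) distribution. A sketch assuming NumPy and SciPy, with illustrative values n = 5 and λ = 2:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n, lam = 5, 2.0                         # illustrative: 5 events, rate 2 per unit time

# Waiting time until the 5th event = sum of 5 exponential inter-arrival times
waits = rng.exponential(scale=1 / lam, size=(100_000, n)).sum(axis=1)

# Compare with Gamma(shape=n, loc=0, scale=1/lam)
print(stats.kstest(waits, "gamma", args=(n, 0, 1 / lam)))
print(waits.mean(), n / lam)            # sample mean vs. theoretical mean n/lambda
```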

fit gamma distribution

Fitting given data to a gamma distribution means finding the probability density function, which involves the shape, location and scale parameters; finding these parameters for the application at hand and then calculating the mean, variance, standard deviation and moment generating function constitutes fitting the gamma distribution. Since different real-life problems are modelled with the gamma distribution, the information for a given situation must be fitted to it, and various techniques for doing so already exist in various environments, e.g. in R, Matlab, Excel and so on.
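As one example, a minimal sketch of such a fit in Python with SciPy (the data here are simulated only to illustrate the call, and `floc=0` pins the location parameter at zero):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
data = rng.gamma(shape=2.5, scale=1.8, size=5_000)   # illustrative data

# Fit shape, location and scale by maximum likelihood; fix the location at 0
shape_hat, loc_hat, scale_hat = stats.gamma.fit(data, floc=0)
print(shape_hat, loc_hat, scale_hat)

# Summary quantities of the fitted distribution
fitted = stats.gamma(shape_hat, loc=loc_hat, scale=scale_hat)
print(fitted.mean(), fitted.var(), fitted.std())
```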

shifted gamma distribution

Whenever an application requires the distribution to be shifted, the two-parameter gamma distribution is generalized to a three-parameter (or otherwise generalized) gamma distribution that shifts the shape, location and scale; such a gamma distribution is known as a shifted gamma distribution.

truncated gamma distribution

If we restrict the range or domain of the gamma distribution for given shape, scale and location parameters, the restricted gamma distribution is known as a truncated gamma distribution, depending on the conditions imposed.

survival function of gamma distribution

The survival function for the gamma distribution is defined as the function S(x) as follows:

S(x)=1-\frac{\Gamma_{x} (\gamma )}{\Gamma (\gamma )} \ \ x\geq 0 ; \gamma > 0 \ where \ \ \Gamma_{x}(a) =\int_{0}^{x} t^{a-1}e^{-t} dt
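Numerically the survival function is simply one minus the cumulative distribution function (the regularized lower incomplete gamma function); a sketch assuming SciPy, with an illustrative shape γ = 2:

```python
import numpy as np
from scipy import stats
from scipy.special import gammainc

gamma_shape = 2.0                      # illustrative shape parameter
x = np.array([0.5, 1.0, 2.0, 5.0])

# S(x) = 1 - regularized lower incomplete gamma, which is SciPy's survival function sf
print(1 - gammainc(gamma_shape, x))
print(stats.gamma.sf(x, gamma_shape))  # same values
```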

mle of gamma distribution | maximum likelihood gamma distribution | likelihood function of gamma distribution

We know that maximum likelihood takes a sample from the population as a representative, and uses this sample as an estimator of the probability density function, maximizing over the parameters of the density function. Before going to the gamma distribution, recall some basics: for a random variable X whose probability density function has parameter theta, the likelihood function is

L(\theta ; x_{1},x_{2},\ldots ,x_{n}) =f_{\theta }(x_{1}, x_{2},\ldots ,x_{n} ),

this we can express as

L(\theta ; x_{1},x_{2},\ldots ,x_{n}) =\prod_{i=1}^{n}f_{\theta }(x_{i})

and the method of maximizing this likelihood function is to find the value of theta satisfying

L(\hat{\theta } ; x_{1},x_{2},\ldots ,x_{n}) =\sup_{\theta \in \Theta } L(\theta ; x_{1},x_{2},\ldots ,x_{n})

if such a theta satisfies this equation; and since the log is a monotone function, we can write this in terms of the log as

\log L(\hat{\theta } ; x_{1},x_{2},\ldots ,x_{n}) =\sup_{\theta \in \Theta } \log L(\theta ; x_{1},x_{2},\ldots ,x_{n})

and such a supremum exists if

\frac{\partial \log L(\hat{\theta }; x_{1},\ldots ,x_{n}) }{\partial \theta_{j} }=0, \ \ j=1,2,\ldots ,k, \ \ \theta =(\theta _{1},\ldots ,\theta _{k})

Now we apply maximum likelihood to the gamma distribution as follows:

f(x | \alpha ,\beta )=\prod_{i=1}^{n}f(x_{i} | \alpha ,\beta )=\left ( \frac{\beta ^{\alpha }}{\Gamma (\alpha )} \right )^{n}\prod_{i=1}^{n}x_{i}^{\alpha -1} exp(-\beta x_{i}) \propto \beta ^{n\alpha } exp\left ( -\beta \sum_{i=1}^{n}x_{i} \right )

the log likelihood of the function will be

\ell (\beta | \alpha ,x) \propto n\alpha \log \beta -\beta n \bar{x} \propto \alpha \log \beta - \bar{x} \beta

and so

0=\frac{\partial l}{\partial \beta } =\frac{\alpha }{\beta } -\bar{x},

and hence

\hat{\beta }= \frac{\alpha }{\bar{x}}

This can be achieved also as

\textbf{L}(\alpha ,\beta | x)=\left ( \frac{\beta ^{\alpha }}{\Gamma (\alpha )} x_{1}^{\alpha -1} e^{-\beta x_{1}} \right )\cdots \left ( \frac{\beta ^{\alpha }}{\Gamma (\alpha )} x_{n}^{\alpha -1} e^{-\beta x_{n}} \right ) =\left ( \frac{\beta ^{\alpha }}{\Gamma (\alpha )} \right)^{n} (x_{1}x_{2}\cdots x_{n})^{\alpha -1} e^{-\beta (x_{1}+x_{2}+\cdots +x_{n})}

and taking the logarithm gives

\ln \textbf{L}(\alpha ,\beta | x)=n(\alpha \ln \beta -\ln \Gamma (\alpha ))+(\alpha -1)\sum_{i=1}^{n} \ln x_{i} -\beta \sum_{i=1}^{n}x_{i}

and the parameters can be obtained by differentiating:

\frac{\partial }{\partial \alpha }\ln \textbf{L}(\hat{\alpha }, \hat{\beta } |x)=n(\ln \hat{\beta }-\frac{\mathrm{d} }{\mathrm{d} \alpha } \ln \Gamma (\hat{\alpha }))+\sum_{i=1}^{n} \ln x_{i}=0

\frac{\partial }{\partial \beta }\ln \textbf{L}(\hat{\alpha }, \hat{\beta } |x)=n \frac{\hat{\alpha }}{\hat{\beta }} -\sum_{i=1}^{n}x_{i}=0 \ \ \text{or} \ \ \bar{x}=\frac{\hat{\alpha }}{\hat{\beta }}

and substituting the estimate of beta from the second equation into the first gives

n(\ln \hat{\alpha } -\ln \bar{x} -\frac{\mathrm{d} }{\mathrm{d} \alpha } \ln \Gamma (\hat{\alpha }) )+\sum_{i=1}^{n} \ln x_{i}=0
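This last equation in the estimate of alpha has no closed-form solution and is usually solved numerically; it involves the digamma function. A sketch assuming SciPy, on data simulated with illustrative true parameters:

```python
import numpy as np
from scipy.special import digamma
from scipy.optimize import brentq

rng = np.random.default_rng(3)
x = rng.gamma(shape=3.0, scale=1.0 / 2.0, size=10_000)  # illustrative: alpha = 3, rate beta = 2

xbar, mean_log_x = x.mean(), np.log(x).mean()

# Solve ln(alpha) - psi(alpha) = ln(xbar) - mean(ln x), from the equation above
def score(alpha):
    return np.log(alpha) - digamma(alpha) - (np.log(xbar) - mean_log_x)

alpha_hat = brentq(score, 1e-6, 1e6)   # root-finding on a wide bracket
beta_hat = alpha_hat / xbar            # rate parameter from xbar = alpha/beta
print(alpha_hat, beta_hat)
```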

gamma distribution parameter estimation method of moments | method of moments estimator gamma distribution

We can calculate the moments of the population and of the sample with the help of expectations of n-th order; the method of moments equates these moments of the distribution and of the sample to estimate the parameters. Suppose we have a sample of gamma random variables with probability density function

f(x|\alpha ,\lambda )=\frac{\lambda ^{\alpha }}{\Gamma (\alpha )}x^{\alpha -1}e^{-\lambda x} , \ \ x\geq 0

We know that the first two moments for this probability density function are

\mu _{1}=\frac{\alpha }{\lambda } \ \ \ \mu _{2}=\frac{\alpha (\alpha +1) }{\lambda ^{2}}

so

{\lambda } =\frac{\alpha}{\mu _{1}}

and if we substitute this lambda into the second moment we get

\frac{\mu _{2}}{\mu _{1}^{2}}=\frac{\alpha +1}{\alpha }

and from this the value of alpha is

\alpha=\frac{\mu _{1}^{2}}{\mu _{2}-\mu _{1}^{2}}

and now lambda will be

\lambda =\frac{\mu _{1}^{2}}{\mu _{2}-\mu _{1}^{2}} \cdot \frac{1}{\mu _{1}} \ \ \ \ \ =\frac{\mu _{1}}{\mu _{2}-\mu _{1}^{2}}

and the moment estimators using the sample mean and variance will be

\hat{\alpha }=\frac{\bar{X}^{2}}{\hat{\sigma }^{2}} \ \ \ \ \ \hat{\lambda }=\frac{\bar{X}}{\hat{\sigma }^{2}}
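A brief sketch of these moment estimators in Python, on data simulated with illustrative true values so the estimates can be compared:

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.gamma(shape=2.0, scale=1.0 / 3.0, size=20_000)  # illustrative: alpha = 2, rate lambda = 3

xbar = x.mean()
s2 = x.var()                 # sample variance (ddof=1 is also commonly used)

alpha_hat = xbar ** 2 / s2   # alpha-hat = Xbar^2 / sigma-hat^2
lambda_hat = xbar / s2       # lambda-hat = Xbar / sigma-hat^2
print(alpha_hat, lambda_hat)
```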

confidence interval for gamma distribution

A confidence interval for the gamma distribution is a way to estimate a parameter and its uncertainty: it tells us at what percentage the interval is expected to contain the true value of the parameter. This confidence interval is obtained from observations of random variables, and since it is obtained from random data it is itself random; to obtain a confidence interval for the gamma distribution there are different techniques for different applications that we have to follow.
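One widely used, general-purpose technique is the bootstrap; a sketch assuming SciPy, giving an approximate 95% percentile interval for the shape parameter on simulated data with illustrative values:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
data = rng.gamma(shape=2.0, scale=1.5, size=500)   # illustrative sample

def shape_estimate(sample):
    a_hat, _, _ = stats.gamma.fit(sample, floc=0)
    return a_hat

# Percentile bootstrap: refit the shape on resamples drawn with replacement
boot = np.array([shape_estimate(rng.choice(data, size=data.size, replace=True))
                 for _ in range(1000)])
low, high = np.percentile(boot, [2.5, 97.5])
print(shape_estimate(data), (low, high))
```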

gamma distribution conjugate prior for exponential distribution | gamma prior distribution | posterior distribution poisson gamma

The posterior and the prior distribution are terminologies of Bayesian probability theory. A prior distribution is said to be conjugate to a likelihood if the resulting posterior distribution belongs to the same family as the prior. In terms of theta, let us show that the gamma distribution is a conjugate prior for the exponential distribution.

If the probability density function of the gamma distribution in terms of theta is

f_{\Theta }(\theta )=\frac{\beta ^{\alpha }\theta ^{\alpha -1}e^{-\beta \theta }}{\Gamma (\alpha )}

and the given data are exponentially distributed given theta, that is,

f_{X_{i}|\Theta }(x_{i}|\theta )=\theta e^{-\theta x_{i}}

so the joint distribution will be

f(X|\Theta )=\theta^{n} e^{-\theta \sum x_{i}}

and using the relation

\textbf{Posterior} \propto \textbf{Likelihood} \times \textbf{Prior}

we have

f_{\Theta |X}(\theta |x) \propto \theta ^{n}e^{-\theta \sum x_{i}} \times \theta ^{\alpha -1}e^{-\beta \theta }

=\theta ^{n +\alpha -1} e^{-\theta (\sum x_{i} + \beta )}

\therefore \theta| X \sim \textbf{Gamma}(n+\alpha , \sum x_{i} +\beta )

Similarly, if the data are Poisson counts with rate lambda and the prior on lambda is Gamma(α, β), the same calculation gives the posterior

f_{\Lambda | X} (\lambda |x) \propto \lambda ^{\sum x_{i}+\alpha -1} e^{-(n+\beta )\lambda }

\therefore \lambda | X \sim \textbf{Gamma}(\sum x_{i}+\alpha , n+\beta )

So the gamma distribution is a conjugate prior for the exponential distribution (and likewise for the Poisson), since the posterior is again a gamma distribution.
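A short sketch of this conjugate update in Python, with illustrative prior hyperparameters α = 2, β = 1 and simulated exponential data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
alpha0, beta0 = 2.0, 1.0                 # illustrative Gamma(alpha, beta) prior on theta
theta_true = 1.7
data = rng.exponential(scale=1 / theta_true, size=200)

# Conjugate update for an exponential likelihood: Gamma(alpha + n, beta + sum(x_i))
alpha_post = alpha0 + data.size
beta_post = beta0 + data.sum()
posterior = stats.gamma(alpha_post, scale=1 / beta_post)   # beta is a rate parameter

print(posterior.mean())                  # posterior mean of theta, close to theta_true
print(posterior.interval(0.95))          # 95% credible interval
```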

gamma distribution quantile function

The quantile function of the gamma distribution is the function that gives the points of the gamma distribution corresponding to the rank order of its values; it is the inverse of the cumulative distribution function, and different languages provide different algorithms and functions for the quantiles of the gamma distribution.
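For example, in SciPy the gamma quantile function is `ppf`, the inverse of the cdf; a minimal sketch with an illustrative shape of 2 and scale of 3:

```python
from scipy import stats

shape, scale = 2.0, 3.0                      # illustrative parameters
dist = stats.gamma(shape, scale=scale)

# Quantile (inverse cdf): the point below which a given fraction of the mass lies
print(dist.ppf([0.25, 0.5, 0.75, 0.95]))
print(dist.cdf(dist.ppf(0.5)))               # round trip back to 0.5
```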

generalized gamma distribution

As the gamma distribution is itself a generalization of the exponential family of distributions, adding more parameters to this distribution gives us the generalized gamma distribution, which is a further generalization of this family. Different physical requirements give different generalizations; one of the frequent ones uses the probability density function

f(x)=\frac{(\frac{x-\mu }{\beta })^{\gamma -1} exp (-\frac{x-\mu }{\beta })}{\beta \Gamma (\gamma )} \ \ x\geq \mu ;\gamma ,\beta > 0

the cumulative distribution function for such generalized gamma distribution can be obtained by

F(x)=\frac{\Gamma _{x}(\gamma )}{\Gamma (\gamma )} \ \ x\geq 0, \gamma > 0

where the numerator represents the incomplete gamma function as

\Gamma _{x}(a)=\int_{0}^{x}t^{a-1}e^{-t}dt

using this incomplete gamma function the survival function for the generalized gamma distribution can be obtained as

S(x)=1-\frac{\Gamma _{x}(\gamma )}{\Gamma (\gamma )} \ \ x\geq 0, \gamma > 0

Another version of this three-parameter generalized gamma distribution has the probability density function

f(t)=\frac{\beta }{\Gamma (k)\theta } \left ( \frac{t}{\theta } \right )^{k\beta -1} e^{-\left ( \frac{t}{\theta } \right )^{\beta }}

where k, β, θ are parameters greater than zero. This generalization has convergence issues; to overcome them the parameters are replaced as follows:

\mu =\ln (\theta )+\frac{1}{\beta } \ln \left ( \frac{1}{\lambda ^{2}} \right ) \ \ \ \sigma =\frac{1}{\beta \sqrt{k}} \ \ \ \lambda =\frac{1}{\sqrt{k}} \ \ \ \text{where} \ \ -\infty< \mu < \infty , \ \sigma > 0 , \ 0< \lambda

Using this parameterization the convergence of the density function is obtained, so the more general gamma distribution with convergence is the distribution with probability density function

f(t)=\begin{cases}\frac{|\lambda |}{\sigma t}\cdot \frac{1}{\Gamma \left ( \frac{1}{\lambda ^{2}} \right )}\cdot \exp\left [ \frac{\lambda \cdot \frac{\ln (t)-\mu }{\sigma }+\ln \left ( \frac{1}{\lambda ^{2}} \right )-e^{\lambda \cdot \frac{\ln (t)-\mu }{\sigma }}}{\lambda ^{2}} \right ] &\text{if } \lambda \neq 0\\ \frac{1}{t\sigma \sqrt{2\pi }}e^{-\frac{1}{2}\left ( \frac{\ln (t)-\mu }{\sigma } \right )^{2}} &\text{if } \lambda =0\end{cases}
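SciPy ships a generalized gamma distribution, `scipy.stats.gengamma`, parameterized by two shape parameters a and c plus location and scale; this is a different but related parameterization of the three-parameter family above, so the sketch below is only illustrative:

```python
import numpy as np
from scipy import stats

a, c, scale = 2.0, 1.5, 1.0           # illustrative gengamma parameters
x = np.linspace(0.01, 6, 5)

print(stats.gengamma.pdf(x, a, c, scale=scale))
print(stats.gengamma.sf(x, a, c, scale=scale))   # survival function, 1 - cdf

# With c = 1 the generalized gamma reduces to the ordinary gamma distribution
print(np.allclose(stats.gengamma.pdf(x, a, 1.0), stats.gamma.pdf(x, a)))
```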

Beta generalized gamma distribution

The generalized gamma distribution involves the parameter beta in its density function, and from it the beta generalized gamma distribution is constructed. The generalized gamma density is

g_{\beta ,\gamma ,c}(x)=\frac{c\lambda ^{c\beta }}{\Gamma (\beta )}x^{c\beta -1}\exp\left \{ -(\lambda x)^{c} \right \}, \ \ x> 0

with cumulative distribution function as

G_{\beta ,\gamma ,c}(x)=\frac{\gamma (\beta ,(\lambda x)^{c})}{\Gamma (\beta )},

which was already discussed in detail in the discussion of the gamma distribution; the beta generalized gamma distribution is then defined with the cdf

F(x)=I_{G}(x)(a,b)=\frac{1}{B(a,b)}\int_{0}^{G(x)}\omega ^{a-1}(1-\omega )^{b-1}d\omega ,

where B(a,b) is the beta function, and the probability density function can be obtained by differentiation; the density function will be

f(x)=\frac{g(x)}{B(a,b)}G(x)^{a-1}\left \{ 1-G(x) \right \}^{b-1}

Here G(x) is the cumulative distribution function of the generalized gamma distribution defined above; if we substitute this value, then the cumulative distribution function of the beta generalized gamma distribution is

F(x)=I_{\gamma (\beta ,(\lambda x)^{c})/\Gamma (\beta )}(a,b)=\frac{1}{B(a,b)}\int_{0}^{{\gamma (\beta ,(\lambda x)^{c})/\Gamma (\beta )}}\omega ^{a-1} (1-\omega )^{b-1} d\omega

and the probability density function

f(x)=\frac{c\lambda ^{c\beta }x^{c\beta -1}\exp\left \{ -(\lambda x)^{c} \right \}\gamma (\beta ,(\lambda x)^{c})^{a-1}\left \{ \Gamma (\beta )-\gamma (\beta ,(\lambda x)^{c}) \right \}^{b-1}}{B(a,b)\Gamma (\beta )^{a+b-1}}

the remaining properties can be extended for this beta generalized gamma distribution with usual definitions.

Conclusion:

There are different forms and generalizations of the gamma distribution and of the gamma distribution exponential family for different real-life situations; the possible forms and generalizations were covered here, together with estimation methods for the gamma distribution in population sampling. If you require further reading on the gamma distribution exponential family, please go through the link and books below. For more topics on Mathematics please visit our page.

https://en.wikipedia.org/wiki/Gamma_distribution

A first course in probability by Sheldon Ross

Schaum’s Outlines of Probability and Statistics

An introduction to probability and statistics by ROHATGI and SALEH

About DR. MOHAMMED MAZHAR UL HAQUE

I am Dr. Mohammed Mazhar Ul Haque, an Assistant Professor in Mathematics with 12 years of teaching experience, extensive knowledge of pure mathematics, precisely algebra, and a strong ability in problem designing and solving, as well as in motivating candidates to enhance their performance.
I love to contribute to Lambdageeks to make Mathematics simple, interesting and self-explanatory for beginners as well as experts.
Let's connect through LinkedIn - https://www.linkedin.com/in/dr-mohammed-mazhar-ul-haque-58747899/