Geometric Random Variable | Its Important Characteristics

Some additional discrete random variables and their parameters

    A discrete random variable together with its probability mass function determines a probability distribution, and depending on the nature of the random variable the distribution takes different names, such as the binomial distribution or the Poisson distribution. We have already seen two such types of discrete random variable, the binomial and the Poisson, along with their statistical parameters. Since most random variables are characterized by the nature of their probability mass function, we will now look at some more types of discrete random variable and their statistical parameters.

Geometric Random variable and its distribution

      A geometric random variable counts the number of independent trials performed until the first success occurs, i.e. if we perform an experiment repeatedly and the first n-1 trials are all failures while the n-th trial is a success, then X = n. The probability mass function of such a discrete random variable is

P(X=n)=(1-p)^{n-1} p, \quad n=1,2,3,4,\ldots

For this random variable the necessary condition on the outcomes of the independent trials is that every result before the success must be a failure.

Thus, in brief, a random variable that follows the above probability mass function is known as a geometric random variable.

It is easily verified that these probabilities sum to 1, as they must for a probability distribution:

\sum_{n=1}^{\infty}P(X=n)=p\sum_{n=1}^{\infty}(1-p)^{n-1}=\frac{p}{1-(1-p)}=1

Thus a random variable with this probability mass function is said to follow the geometric distribution.
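The sum above can also be checked numerically. A quick Python sketch (the helper name `geometric_pmf` is illustrative, not from the text) truncates the infinite series and shows the partial sums are essentially 1:

```python
# A quick numerical sketch: the geometric probabilities (1-p)^(n-1) * p
# sum to 1 for any success probability p in (0, 1); here the infinite
# series is truncated at 1000 terms.
def geometric_pmf(n, p):
    """P(X = n) for a geometric random variable, n = 1, 2, 3, ..."""
    return (1 - p) ** (n - 1) * p

for p in (0.2, 0.5, 0.9):
    total = sum(geometric_pmf(n, p) for n in range(1, 1001))
    print(f"p = {p}: partial sum = {total:.10f}")  # each very close to 1
```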

Expectation of Geometric random variable

    As the expectation is one of the important parameters of a random variable, the expectation of a geometric random variable is

E[X]=\frac{1}{p}

where p is the probability of success.


To derive this, start from the definition of expectation, and let the probability of failure be q = 1 - p:

E[X]=\sum_{i=1}^{\infty} i \, P(X=i)

E[X]=\sum_{i=1}^{\infty} i \, q^{i-1}p

=\sum_{i=1}^{\infty}(i-1+1)\, q^{i-1}p

=\sum_{i=1}^{\infty}(i-1)\, q^{i-1}p+\sum_{i=1}^{\infty}q^{i-1}p

=\sum_{j=0}^{\infty} j \, q^{j}p+1 \quad \text{(setting } j=i-1\text{)}

=q\sum_{j=1}^{\infty} j \, q^{j-1}p+1

=qE[X]+1

Since (1-q)E[X]=pE[X]=1, we get

E[X]=\frac{1}{p}

Thus the expected value, or mean, of a geometric random variable is simply the reciprocal of the probability of success.

Variance and standard deviation of the geometric random variable

In a similar way we can obtain the other important statistical parameters, the variance and the standard deviation, for the geometric random variable; they are

Var(X)=\frac{1-p}{p^{2}} \quad \text{and} \quad \sigma_X=\frac{\sqrt{1-p}}{p}

To obtain these values we use the relation

Var(X)=E[X^{2}]-(E[X])^{2}


So let us first calculate E[X^{2}].


set\ \ q=1-p; as before,

E[X]= \sum_{i=1}^{\infty}i \, q^{i-1}p

and from the definition of the second moment,

E[X^{2}]= \sum_{i=1}^{\infty}i^{2}\, q^{i-1}p

E[X^{2}]= \sum_{i=1}^{\infty}(i-1+1)^{2}\, q^{i-1}p

E[X^{2}]= \sum_{i=1}^{\infty}(i-1)^{2}\, q^{i-1}p+\sum_{i=1}^{\infty}2(i-1)\, q^{i-1}p+\sum_{i=1}^{\infty}q^{i-1}p

E[X^{2}]= \sum_{j=0}^{\infty}j^{2}\, q^{j}p+2\sum_{j=0}^{\infty}j\, q^{j}p+1

E[X^{2}]=q \sum_{j=1}^{\infty}j^{2}\, q^{j-1}p+2q\sum_{j=1}^{\infty}j\, q^{j-1}p+1

E[X^{2}]=qE[X^{2}]+2qE[X]+1

since \ \ E[X]= \frac{1}{p} \ \ and \ \ 1-q=p, this gives

pE[X^{2}]=\frac{2q}{p}+1

E[X^{2}]=\frac{2q+p}{p^{2}}=\frac{q+q+p}{p^{2}}=\frac{q+1}{p^{2}}

So we now write

Var(X)= \frac{q+1}{p^{2}}-\frac{1}{p^{2}}=\frac{q}{p^{2}}=\frac{1-p}{p^{2}}

thus we have

var(X)=\frac{1-p}{p^{2}} \quad \text{and} \quad \sigma_X=\sqrt{\frac{1-p}{p^{2}}}=\frac{\sqrt{1-p}}{p}
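Both formulas can be verified numerically. A minimal Python sketch truncates the defining series (the helper name `geometric_moments` is illustrative):

```python
# A sketch: verify E[X] = 1/p and Var(X) = (1-p)/p^2 by truncating the
# defining series of the geometric distribution at a large term count.
def geometric_moments(p, terms=10_000):
    """Return (mean, variance) of a geometric RV by direct summation."""
    pmf = [(1 - p) ** (n - 1) * p for n in range(1, terms + 1)]
    mean = sum(n * pmf[n - 1] for n in range(1, terms + 1))
    second = sum(n * n * pmf[n - 1] for n in range(1, terms + 1))
    return mean, second - mean ** 2

p = 0.25
mean, var = geometric_moments(p)
print(mean, 1 / p)          # both near 4.0
print(var, (1 - p) / p**2)  # both near 12.0
```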


Negative Binomial Random Variable

    This random variable falls into another class of discrete random variable because of the nature of its probability mass function. A negative binomial random variable counts the number of trials n of independent experiments needed to obtain r successes, the n-th trial being the r-th success.

    \[P\{X=n\}=\left(\begin{array}{c} n-1 \\ r-1 \end{array}\right) p^{r}(1-p)^{n-r}, \quad n=r, r+1, \ldots\]

In other words, a random variable with the above probability mass function is a negative binomial random variable with parameters (r, p). Note that if we set r=1 the negative binomial distribution reduces to the geometric distribution. We can specifically check that the probabilities sum to 1:

    \[\sum_{n=r}^{\infty} P\{X=n\}=\sum_{n=r}^{\infty}\left(\begin{array}{c} n-1 \\ r-1 \end{array}\right) p^{r}(1-p)^{n-r}=1\]

Expectation, Variance and standard deviation of the negative binomial random variable

The expectation and variance of the negative binomial random variable are

E[X]=\frac{r}{p} \quad \operatorname{Var}(X)=\frac{r(1-p)}{p^{2}}

With the help of the probability mass function of the negative binomial random variable and the definition of expectation, we can write

\begin{aligned} E\left[X^{k}\right] &=\sum_{n=r}^{\infty} n^{k}\left(\begin{array}{c} n-1 \\ r-1 \end{array}\right) p^{r}(1-p)^{n-r} \\ &=\frac{r}{p} \sum_{n=r}^{\infty} n^{k-1}\left(\begin{array}{c} n \\ r \end{array}\right) p^{r+1}(1-p)^{n-r} && \text{since } n\left(\begin{array}{c} n-1 \\ r-1 \end{array}\right)=r\left(\begin{array}{c} n \\ r \end{array}\right) \\ &=\frac{r}{p} \sum_{m=r+1}^{\infty}(m-1)^{k-1}\left(\begin{array}{c} m-1 \\ r \end{array}\right) p^{r+1}(1-p)^{m-(r+1)} && \text{by setting } m=n+1 \\ &=\frac{r}{p} E\left[(Y-1)^{k-1}\right] \end{aligned}

Here Y is a negative binomial random variable with parameters (r+1, p). Now putting k=1 we get

    \[E[X]=\frac{r}{p}\]

and for k=2,

    \[E\left[X^{2}\right] =\frac{r}{p} E[Y-1] =\frac{r}{p}\left(\frac{r+1}{p}-1\right)\]

Thus for variance


    \[\operatorname{Var}(X) =\frac{r}{p}\left(\frac{r+1}{p}-1\right)-\left(\frac{r}{p}\right)^{2} =\frac{r(1-p)}{p^{2}}\]

Example: A die is thrown repeatedly until the face 5 appears four times; find the expectation and variance of the number of throws. Since the random variable associated with this independent experiment is a negative binomial random variable with r=4 and probability of success p=1/6 of getting a 5 in one throw,

as we know for negative binomial random variable 

    \[E[X]=\frac{r}{p} \\ \operatorname{Var}(X)=\frac{r(1-p)}{p^{2}}\]


    \[E[X] =24 \qquad \operatorname{Var}(X) =\frac{4\left(\frac{5}{6}\right)}{\left(\frac{1}{6}\right)^{2}}=120\]
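These values can be checked directly from the negative binomial probability mass function, truncating the infinite sums at a large trial count. A Python sketch (the helper name `nb_pmf` is illustrative):

```python
import math

# A sketch checking E[X] = r/p = 24 and Var(X) = r(1-p)/p^2 = 120 for
# the die example (r = 4, p = 1/6) by direct summation of the pmf.
def nb_pmf(n, r, p):
    """P(X = n): the r-th success occurs on trial n (n >= r)."""
    return math.comb(n - 1, r - 1) * p**r * (1 - p) ** (n - r)

r, p = 4, 1 / 6
mean = sum(n * nb_pmf(n, r, p) for n in range(r, 2000))
second = sum(n * n * nb_pmf(n, r, p) for n in range(r, 2000))
var = second - mean**2
print(round(mean, 4), round(var, 4))  # near 24 and 120
```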

Hypergeometric random variable

       If we choose a sample of size n from a total of N objects, of which m are of a first type and N-m of a second type, then the random variable X counting how many objects of the first type are selected has the probability mass function

P\{X=i\}=\frac{\left(\begin{array}{c}m \\ i\end{array}\right)\left(\begin{array}{c}N-m \\ n-i\end{array}\right)}{\left(\begin{array}{l}N \\ n\end{array}\right)} \quad i=0,1, \ldots, n

For example, suppose a sack contains N books, of which m are mathematics books and N-m are physics books, and a sample of n books is taken at random without replacement. If we let the random variable X denote the number of mathematics books selected, then the probability of each selection is given by the above probability mass function.

  In other words the random variable with the above probability mass function is known to be the hypergeometric random variable.

Example: In a collection of lots of electronic components, each lot of size 10, 30% of the lots contain four defective components and 70% contain one defective. A lot is accepted if three randomly chosen components from it are all non-defective. Calculate what percent of the lots will be rejected.

Here let A be the event that the lot is accepted. Using

P\{X=i\}=\frac{\left(\begin{array}{c}m \\ i\end{array}\right)\left(\begin{array}{c}N-m \\ n-i\end{array}\right)}{\left(\begin{array}{l}N \\ n\end{array}\right)} \quad i=0,1, \ldots, n

N=10, m=4, n=3

P(A \mid \text{lot has 4 defectives})=\frac{\left(\begin{array}{l}4 \\ 0\end{array}\right)\left(\begin{array}{l}6 \\ 3\end{array}\right)}{\left(\begin{array}{c}10 \\ 3\end{array}\right)}

for N=10, m=1, n=3

P(A \mid \text{lot has 1 defective})=\frac{\left(\begin{array}{l}1 \\ 0\end{array}\right)\left(\begin{array}{l}9 \\ 3\end{array}\right)}{\left(\begin{array}{c}10 \\ 3\end{array}\right)}

\begin{aligned} P(A) &=P(A \mid \text { lot has } 4 \text { defectives }) \frac{3}{10}+P(A \mid \text { lot has } 1 \text { defective }) \frac{7}{10} \\ &=\frac{\left(\begin{array}{l}4 \\ 0\end{array}\right)\left(\begin{array}{l}6 \\ 3\end{array}\right)}{\left(\begin{array}{c}10 \\ 3\end{array}\right)}\left(\frac{3}{10}\right)+\frac{\left(\begin{array}{l}1 \\ 0\end{array}\right)\left(\begin{array}{l}9 \\ 3\end{array}\right)}{\left(\begin{array}{c}10 \\ 3\end{array}\right)}\left(\frac{7}{10}\right) \\ &=\frac{54}{100} \end{aligned}

Thus 46% of the lots will be rejected.
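The computation above can be sketched in Python (the helper name `hypergeom_pmf` is illustrative):

```python
import math

# A sketch of the lot-acceptance computation using the hypergeometric pmf.
def hypergeom_pmf(i, N, m, n):
    """P(X = i): i items of the first type in a sample of n from N (m of first type)."""
    return math.comb(m, i) * math.comb(N - m, n - i) / math.comb(N, n)

# A lot is accepted when all 3 sampled components are non-defective (i = 0).
p4 = hypergeom_pmf(0, N=10, m=4, n=3)  # P(accept | 4 defectives) = 1/6
p1 = hypergeom_pmf(0, N=10, m=1, n=3)  # P(accept | 1 defective) = 7/10
p_accept = 0.3 * p4 + 0.7 * p1
print(f"accepted: {p_accept:.0%}, rejected: {1 - p_accept:.0%}")  # 54%, 46%
```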

Expectation, Variance and standard deviation of the hypergeometric random variable

    The expectation, variance and standard deviation of the hypergeometric random variable with parameters n, m, and N are

E[X]=\frac{n m}{N} \quad \operatorname{Var}(X)=n p(1-p)\left(1-\frac{n-1}{N-1}\right), \quad \text{where } p=\frac{m}{N}

or  for the large value of N

\operatorname{Var}(X) \approx n p(1-p)

and standard deviation is the square root of the variance.

By considering the definition of the probability mass function of the hypergeometric random variable and of the expectation, we can write

\begin{aligned} E\left[X^{k}\right] &=\sum_{i=0}^{n} i^{k} P\{X=i\} \\ &=\sum_{i=1}^{n} i^{k}\left(\begin{array}{c}m \\ i\end{array}\right)\left(\begin{array}{c}N-m \\ n-i\end{array}\right) /\left(\begin{array}{c}N \\ n\end{array}\right) \end{aligned}

Here, by using the following identities for combinations,

i\left(\begin{array}{c}m \\ i\end{array}\right)=m\left(\begin{array}{c}m-1 \\ i-1\end{array}\right) \quad \text { and } \quad n\left(\begin{array}{c}N \\ n\end{array}\right)=N\left(\begin{array}{c}N-1 \\ n-1\end{array}\right)

so it would be

\begin{aligned} E\left[X^{k}\right] &=\frac{n m}{N} \sum_{i=1}^{n} i^{k-1}\left(\begin{array}{c}m-1 \\ i-1\end{array}\right)\left(\begin{array}{c}N-m \\ n-i\end{array}\right) /\left(\begin{array}{c}N-1 \\ n-1\end{array}\right) \\ &=\frac{n m}{N} \sum_{j=0}^{n-1}(j+1)^{k-1}\left(\begin{array}{c}m-1 \\ j\end{array}\right)\left(\begin{array}{c}N-m \\ n-1-j\end{array}\right) /\left(\begin{array}{c}N-1 \\ n-1\end{array}\right) \\ &=\frac{n m}{N} E\left[(Y+1)^{k-1}\right] \end{aligned}

Here Y is a hypergeometric random variable with parameters (n-1, m-1, N-1). Now if we put k=1 we get

E[X]=\frac{n m}{N}

and for k=2

    \[\begin{aligned} E\left[X^{2}\right] &=\frac{n m}{N} E[Y+1] \\ &=\frac{n m}{N}\left[\frac{(n-1)(m-1)}{N-1}+1\right] \end{aligned}\]

so variance would be

\operatorname{Var}(X)=\frac{n m}{N}\left[\frac{(n-1)(m-1)}{N-1}+1-\frac{n m}{N}\right]

for p=m/N and

    \[\\ \frac{m-1}{N-1}=\frac{N p-1}{N-1}=p-\frac{1-p}{N-1}\]

we get

    \[\\ \operatorname{Var}(X)=n p\left[(n-1) p-(n-1) \frac{1-p}{N-1}+1-n p\right] \\\]

For very large values of N this obviously reduces to

    \[\operatorname{Var}(X) \approx n p(1-p)\]
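A quick Python sketch compares the exact variance with the approximation np(1-p) as N grows, holding n fixed and p = m/N = 0.2 (the helper name `hyper_var` is illustrative):

```python
# A sketch: the exact hypergeometric variance approaches the binomial
# value np(1-p) as the population size N grows.
def hyper_var(N, m, n):
    p = m / N
    return n * p * (1 - p) * (1 - (n - 1) / (N - 1))

n = 10
approx = n * 0.2 * 0.8  # np(1-p) = 1.6
for N in (50, 500, 5000):
    m = N // 5          # keeps p = m/N = 0.2
    print(N, round(hyper_var(N, m, n), 4), approx)
```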

Zeta (Zipf) random variable

        A discrete random variable is said to be Zeta if its probability mass function is given by

    \[\\ \qquad P\{X=k\}=\frac{C}{k^{\alpha+1}} \quad k=1,2, \ldots\]

for positive values of alpha, where C is the normalizing constant that makes the probabilities sum to 1.
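Since the probabilities must sum to 1, C is the reciprocal of the series sum of k^{-(α+1)}, i.e. of the Riemann zeta function at α+1 (an added remark, not in the text). A small Python sketch estimates it for α = 1:

```python
# A sketch: estimate the normalizing constant C of the zeta (Zipf) pmf
# for alpha = 1 by truncating the series sum of k^(-(alpha+1)).
alpha = 1.0
s = sum(k ** -(alpha + 1) for k in range(1, 200_000))
C = 1 / s
print(round(C, 4))  # close to 6/pi^2 ≈ 0.6079
```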

In the similar way we can find the values of the expectation, variance and standard deviation.

     In a similar way, by using just the definition of the probability mass function and of the mathematical expectation, we can summarize a number of properties for each of these discrete random variables, for example the expected value of a sum of random variables:

For random variables X_{1}, X_{2}, \ldots, X_{n},

E\left[\sum_{i=1}^{n} X_{i}\right]=\sum_{i=1}^{n} E\left[X_{i}\right]
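This linearity can be illustrated by simulation, for example with two geometric random variables whose means we already know (a Python sketch; `geometric_sample` is an illustrative name):

```python
import random

# A sketch illustrating E[sum X_i] = sum E[X_i] with two simulated
# geometric random variables (p = 0.5 and p = 0.25): the sample mean of
# their sum should be near 1/0.5 + 1/0.25 = 6.
random.seed(0)

def geometric_sample(p):
    """Number of Bernoulli(p) trials up to and including the first success."""
    n = 1
    while random.random() >= p:
        n += 1
    return n

trials = 200_000
avg_sum = sum(geometric_sample(0.5) + geometric_sample(0.25)
              for _ in range(trials)) / trials
print(round(avg_sum, 2))  # near 6
```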


   In this article we focused mainly on some additional discrete random variables, their probability mass functions, distributions, and the statistical parameters mean (expectation), standard deviation and variance. We gave a brief introduction and simple examples just to convey the idea; the detailed study remains to be discussed. In the next articles we will move on to continuous random variables and related concepts. If you want further reading, go through the suggested link below. For more topics on mathematics, please follow this link.

Schaum’s Outlines of Probability and Statistics


I am Dr. Mohammed Mazhar Ul Haque, Assistant Professor in Mathematics, with 12 years of experience in teaching, vast knowledge in pure mathematics, precisely in algebra, and an immense ability for designing and solving problems and motivating candidates to enhance their performance.
I love contributing to Lambdageeks to make mathematics simple, interesting and self-explanatory for beginners as well as experts.
Let's connect through LinkedIn -