Inverse Gamma Distribution: 21 Important Facts

Inverse gamma distribution and moment generating function of the gamma distribution

In continuation with the gamma distribution, we will see the concepts of the inverse gamma distribution and the moment generating function, together with the measures of central tendency (mean, mode, and median) of the gamma distribution, by following some of the basic properties of the gamma distribution.

gamma distribution properties

Some of the important properties of the gamma distribution are listed as follows.

1. The probability density function for the gamma distribution is

f(x) = λe^{-λx}(λx)^{α-1}/Γ(α), x ≥ 0

or

f(x) = x^{α-1}e^{-x/β}/(β^α Γ(α)), x > 0

where the gamma function is

Γ(α) = ∫₀^∞ e^{-y} y^{α-1} dy

2. The cumulative distribution function for the gamma distribution is

F(a) = P(X ≤ a) = ∫₀^a f(x) dx

where f(x) is the probability density function given above; in particular the cdf is

F(x) = γ(α, λx)/Γ(α)

where γ(α, λx) is the lower incomplete gamma function.

3. The mean and variance of the gamma distribution are

E[X] = α/λ

and

Var(X) = α/λ²

respectively, or, in the scale parametrization,

E[X] = α·β

and

Var(X) = α·β²
4. The moment generating function M(t) for the gamma distribution is

M(t) = (λ/(λ − t))^α, t < λ

or

M(t) = (1 − βt)^{-α}, t < 1/β
5. The curves of the pdf and cdf are shown in the graphs later in this article.
6. The inverse gamma distribution is obtained by taking the reciprocal of a gamma random variable; its probability density function is

f(y) = (β^α/Γ(α)) y^{-α-1} e^{-β/y}, y > 0
7. The sum of independent gamma random variables with a common rate parameter is again gamma distributed, with shape parameter equal to the sum of the individual shape parameters (a numerical sketch of these properties follows below).
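As a quick numerical sanity check of properties 1-4, the following sketch (not part of the original derivations; parameter values are illustrative) uses scipy's gamma distribution in the shape-scale form:

```python
# Hedged numerical check of the listed gamma properties with scipy,
# using illustrative values alpha = 3, beta = 2 (shape-scale form).
import math
from scipy import stats

a, b = 3.0, 2.0
X = stats.gamma(a, scale=b)

print(X.mean(), a * b)       # mean E[X] = alpha * beta
print(X.var(), a * b**2)     # variance Var(X) = alpha * beta^2
print(X.cdf(4.0))            # cdf, the integral of the pdf over [0, 4]

t = 0.1                      # any t < 1/beta
print(X.expect(lambda x: math.exp(t * x)),  # E[e^{tX}], computed numerically
      (1 - b * t) ** (-a))                  # closed form (1 - beta*t)^(-alpha)
```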

inverse gamma distribution | normal inverse gamma distribution

If in the gamma distribution the probability density function is

f(x) = λe^{-λx}(λx)^{α-1}/Γ(α), x ≥ 0

or

f(x) = x^{α-1}e^{-x/β}/(β^α Γ(α)), x > 0

and we take the reciprocal (inverse) of the variable, Y = 1/X, then by the change-of-variable technique the probability density function of Y will be

f_Y(y) = f_X(1/y) |d(y^{-1})/dy| = (β^α/Γ(α)) (1/y)^{α-1} e^{-β/y} y^{-2} = (β^α/Γ(α)) y^{-α-1} e^{-β/y}, y > 0

The random variable with this probability density function is known as the inverse gamma random variable, inverse gamma distribution, or inverted gamma distribution.

The above probability density function can be written in either parametrization (λ or β); in each case the density obtained from the reciprocal of a gamma random variable is the probability density function of the inverse gamma distribution.
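To make the reciprocal construction concrete, here is a small simulation sketch (illustrative parameters; it assumes scipy's invgamma follows the same α, β convention as above): simulating 1/X for gamma-distributed X and comparing empirical and closed-form cdf values.

```python
# Simulation sketch (illustrative alpha, beta): the reciprocal of a
# Gamma(alpha, rate beta) sample should follow invgamma(alpha, scale=beta).
import numpy as np
from scipy import stats

alpha, beta = 3.0, 2.0
rng = np.random.default_rng(0)

x = rng.gamma(shape=alpha, scale=1 / beta, size=200_000)  # rate beta -> scale 1/beta
y = 1.0 / x                                               # inverse gamma samples

for q in (0.5, 1.0, 2.0):
    print(q, round((y <= q).mean(), 4),
          round(stats.invgamma(alpha, scale=beta).cdf(q), 4))  # should agree
```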

Cumulative distribution function or cdf of inverse gamma distribution

The cumulative distribution function for the inverse gamma distribution is the distribution function

F(a) = P(Y ≤ a) = ∫₀^a f(y) dy

in which f(y) is the probability density function of the inverse gamma distribution given above.

Mean and variance of the inverse gamma distribution

The mean and variance of the inverse gamma distribution, following the usual definitions of expectation and variance, are

E[X] = β/(α − 1), α > 1

and

Var(X) = β²/((α − 1)²(α − 2)), α > 2

Mean and variance of the inverse gamma distribution proof

To get the mean and variance of the inverse gamma distribution we use the probability density function

f(x) = (β^α/Γ(α)) x^{-α-1} e^{-β/x}, x > 0

and the definition of expectation. We first find the expectation of any power of x:

E[Xⁿ] = ∫₀^∞ xⁿ (β^α/Γ(α)) x^{-α-1} e^{-β/x} dx

      = β^n Γ(α − n)/Γ(α)

      = β^n/((α − 1)(α − 2)···(α − n))

In the above integral we used the substitution u = β/x, which reduces it to the gamma-function integral. Now, for α greater than one and n equal to one,

E[X] = β/(α − 1)

and similarly, for n = 2 and α greater than 2,

E[X²] = β²/((α − 1)(α − 2))

Using these expectations gives the value of the variance as

Var(X) = E[X²] − (E[X])² = β²/((α − 1)²(α − 2))
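A quick hedged check of these two formulas with scipy (illustrative α = 4, β = 3, chosen so that α > 2 and the variance exists):

```python
# Sketch checking the inverse gamma mean and variance against scipy.
from scipy import stats

alpha, beta = 4.0, 3.0
Y = stats.invgamma(alpha, scale=beta)

print(Y.mean(), beta / (alpha - 1))                        # beta/(alpha - 1)
print(Y.var(), beta**2 / ((alpha - 1)**2 * (alpha - 2)))   # matches the proof
```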

Inverse gamma distribution plot | Inverse gamma distribution graph

The inverse gamma distribution arises as the reciprocal of a gamma random variable, so while observing the gamma distribution it is good to observe the nature of the curves of the inverse gamma distribution, which has probability density function

f(y) = (β^α/Γ(α)) y^{-α-1} e^{-β/y}, y > 0

and cumulative distribution function

F(a) = ∫₀^a f(y) dy

Inverse gamma distribution graph

Description: graphs for the probability density function and cumulative distribution function by fixing the value of α as 1 and varying the value of β.

Description: graphs for the probability density function and cumulative distribution function by fixing the value of α as 2 and varying the value of β.

Description: graphs for the probability density function and cumulative distribution function by fixing the value of α as 3 and varying the value of β.

Description: graphs for the probability density function and cumulative distribution function by fixing the value of β as 1 and varying the value of α.

Description: graphs for the probability density function and cumulative distribution function by fixing the value of β as 2 and varying the value of α.

Description: graphs for the probability density function and cumulative distribution function by fixing the value of β as 3 and varying the value of α.

moment generating function of gamma distribution

Before understanding the concept of the moment generating function for the gamma distribution, let us recall some concepts of moments and the moment generating function.

Moments

The r-th moment of a random variable X is defined with the help of expectation as

μ′_r = E[X^r]

This is known as the r-th moment of the random variable X about the origin, and is commonly known as the raw moment.

If we take the r-th moment of the random variable about the mean μ,

μ_r = E[(X − μ)^r]

this moment about the mean is known as the central moment, and the expectation is taken according to the nature of the random variable:

μ_r = Σ_x (x − μ)^r p(x)   (discrete case)

μ_r = ∫ (x − μ)^r f(x) dx   (continuous case)

If we put particular values of r in the central moment then we get some initial moments:

μ₁ = 0, μ₂ = σ²

If we take the binomial expansion in the central moments then we can easily get the relationship between the central and raw moments:

μ_r = E[(X − μ)^r] = Σ_{j=0}^{r} C(r, j) (−μ)^j μ′_{r−j}

Some of the initial relationships are

μ₂ = μ′₂ − μ², μ₃ = μ′₃ − 3μ′₂μ + 2μ³

Moment generating function

The moments can be generated with the help of a function; that function is known as the moment generating function, and it is defined as

M_X(t) = E[e^{tX}]

This function generates the moments with the help of the expansion of the exponential function,

e^{tX} = 1 + tX + t²X²/2! + ... + t^r X^r/r! + ...

so, using Taylor's form,

M_X(t) = 1 + μ′₁t + μ′₂ t²/2! + ... + μ′_r t^r/r! + ...

Differentiating this expanded function with respect to t and evaluating at zero gives the different moments as

μ′_r = d^r/dt^r M_X(t) |_{t=0}

In another way, if we take the derivative directly,

M′(t) = d/dt E[e^{tX}] = E[X e^{tX}]

since for the discrete case

M′(t) = Σ_x x e^{tx} p(x)

and for the continuous case we have

M′(t) = ∫ x e^{tx} f(x) dx

so for t = 0 we get

M′(0) = E[X]

Likewise,

M″(0) = E[X²]

as

M″(t) = E[X² e^{tX}]

and in general

M⁽ⁿ⁾(0) = E[Xⁿ]

There are two important relations for moment generating functions:

(a) M_{X+a}(t) = e^{at} M_X(t)

(b) M_{X+Y}(t) = M_X(t) M_Y(t) for independent random variables X and Y

moment generating function of a gamma distribution | mgf of gamma distribution | moment generating function for gamma distribution

Now, for the gamma distribution, the moment generating function M(t) for the pdf

f(x) = λe^{-λx}(λx)^{α-1}/Γ(α), x ≥ 0

is

M(t) = (λ/(λ − t))^α, t < λ

and for the pdf

f(x) = x^{α-1}e^{-x/β}/(β^α Γ(α)), x > 0

the moment generating function is

M(t) = (1 − βt)^{-α}, t < 1/β

gamma distribution moment generating function proof | mgf of gamma distribution proof

Now first take the form of the probability density function as

f(x) = λe^{-λx}(λx)^{α-1}/Γ(α), x ≥ 0

and, using the definition of the moment generating function M(t), we have

M(t) = E[e^{tX}] = ∫₀^∞ e^{tx} λe^{-λx}(λx)^{α-1}/Γ(α) dx = (λ/(λ − t))^α, t < λ

where the integral is evaluated by recognizing, after the substitution u = (λ − t)x, the gamma-function integral. We can find the mean and variance of the gamma distribution with the help of the moment generating function, by differentiating this function twice with respect to t:

M′(t) = αλ^α/(λ − t)^{α+1}, M″(t) = α(α + 1)λ^α/(λ − t)^{α+2}

If we put t = 0 then the first value will be

E[X] = M′(0) = α/λ

and

E[X²] = M″(0) = α(α + 1)/λ²

Now, putting the values of these expectations in

Var(X) = E[X²] − (E[X])² = α(α + 1)/λ² − (α/λ)² = α/λ²
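The differentiation above can also be done symbolically; this sympy sketch (an illustration, not part of the original proof) recovers the mean and variance from M(t) = (λ/(λ − t))^α:

```python
# Symbolic sketch with sympy: differentiating M(t) = (lam/(lam - t))**alpha
# at t = 0 recovers the mean and variance derived above.
import sympy as sp

t = sp.symbols("t", real=True)
lam, alpha = sp.symbols("lam alpha", positive=True)
M = (lam / (lam - t)) ** alpha

EX = sp.simplify(sp.diff(M, t).subs(t, 0))        # M'(0) = E[X]
EX2 = sp.simplify(sp.diff(M, t, 2).subs(t, 0))    # M''(0) = E[X^2]
print(EX)                                         # alpha/lam
print(sp.simplify(EX2 - EX**2))                   # alpha/lam**2
```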

Alternatively, for the pdf of the form

f(x) = x^{α-1}e^{-x/β}/(β^α Γ(α)), x > 0

the moment generating function will be

M(t) = ∫₀^∞ e^{tx} x^{α-1}e^{-x/β}/(β^α Γ(α)) dx = (1/(1 − βt))^α ∫₀^∞ y^{α-1}e^{-y}/Γ(α) dy = (1 − βt)^{-α}, t < 1/β

(using the substitution y = x(1 − βt)/β), and differentiating and putting t = 0 will give the mean and variance as

E[X] = αβ, Var(X) = αβ²

2nd moment of gamma distribution

The second moment of the gamma distribution, obtained by differentiating the moment generating function twice and putting t = 0 in the second derivative, is

E[X²] = M″(0) = α(α + 1)/λ²

third moment of gamma distribution

The third moment of the gamma distribution can be found by differentiating the moment generating function three times and putting t = 0 in the third derivative of the mgf; we get

E[X³] = M‴(0) = α(α + 1)(α + 2)/λ³

or directly, by integrating,

E[X³] = ∫₀^∞ x³ λe^{-λx}(λx)^{α-1}/Γ(α) dx = α(α + 1)(α + 2)/λ³

 sigma for gamma distribution

The sigma or standard deviation of the gamma distribution can be found by taking the square root of the variance of the gamma distribution of either type:

σ = √(α/λ²) = √α/λ

or

σ = √(αβ²) = β√α

for any defined value of alpha, beta, and lambda.

characteristic function of gamma distribution | gamma distribution characteristic function

If the variable t in the moment generating function is replaced by a purely imaginary number, t = iω, then the function is known as the characteristic function of the gamma distribution, denoted and expressed as

φ_X(ω) = E[e^{iωX}] = M_X(iω)

As for any random variable, the characteristic function is

φ_X(ω) = E[e^{iωX}]

Thus for the gamma distribution the characteristic function, following the pdf of the gamma distribution, is

φ(t) = (1/(β^α Γ(α))) ∫₀^∞ e^{itx} x^{α-1} e^{-x/β} dx = (1 − iβt)^{-α}

following

∫₀^∞ e^{itx} x^{α-1} e^{-x/β} dx = (β/(1 − iβt))^α ∫₀^∞ u^{α-1} e^{-u} du = Γ(α) β^α (1 − iβt)^{-α}

There is another form of this characteristic function: since

φ_X(t) = M_X(it)

the characteristic function of the gamma distribution can be written directly from the moment generating function as

φ_X(t) = (1 − iβt)^{-α}

sum of gamma distributions | sum of exponential distribution gamma

To know the result for the sum of gamma distributions we must first understand the sum of independent random variables in the continuous case. For this, let the continuous random variables X and Y have probability density functions f_X and f_Y; then the cumulative distribution function of the sum of the random variables is

F_{X+Y}(a) = P{X + Y ≤ a} = ∫∫_{x+y≤a} f_X(x) f_Y(y) dx dy = ∫_{−∞}^{∞} F_X(a − y) f_Y(y) dy

Differentiating this convolution integral of the probability density functions of X and Y gives the probability density function of the sum of the random variables:

f_{X+Y}(a) = (d/da) ∫_{−∞}^{∞} F_X(a − y) f_Y(y) dy = ∫_{−∞}^{∞} (d/da) F_X(a − y) f_Y(y) dy = ∫_{−∞}^{∞} f_X(a − y) f_Y(y) dy

Now let us prove that if X and Y are gamma random variables with respective density functions, then their sum is also gamma distributed, with the same rate parameter and the shape parameters added.

Considering the probability density function of the form

f(x) = λe^{-λx}(λx)^{α-1}/Γ(α), x ≥ 0

take α = s for the random variable X and α = t for the random variable Y. Using the probability density function of the sum of random variables, we have

f_{X+Y}(a) = (1/(Γ(s)Γ(t))) ∫₀^a λe^{-λ(a−y)}(λ(a − y))^{s−1} λe^{-λy}(λy)^{t−1} dy = C e^{-λa} a^{s+t−1}

where C is independent of a. Since a density must integrate to one, normalizing identifies C, and the value will be

f_{X+Y}(a) = λe^{-λa}(λa)^{s+t−1}/Γ(s + t)

which represents the probability density function of the sum of X and Y, and is itself a gamma density; hence the sum of independent gamma random variables is again gamma distributed, with the respective sum of the shape parameters.
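A Monte Carlo sketch supports the result (shapes s, t and rate λ below are illustrative): samples of X + Y pass a Kolmogorov-Smirnov test against the gamma(s + t, λ) distribution.

```python
# Monte Carlo sketch: X + Y for independent X ~ Gamma(s, lam) and
# Y ~ Gamma(t, lam) should follow Gamma(s + t, lam).
import numpy as np
from scipy import stats

s, t, lam = 2.0, 3.5, 1.5
rng = np.random.default_rng(1)

x = rng.gamma(shape=s, scale=1 / lam, size=100_000)
y = rng.gamma(shape=t, scale=1 / lam, size=100_000)

ks = stats.kstest(x + y, stats.gamma(s + t, scale=1 / lam).cdf)
print(ks.pvalue)   # typically large: no evidence against Gamma(s + t, lam)
```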

mode of gamma distribution

To find the mode of the gamma distribution, let us consider the probability density function

f(x) = λe^{-λx}(λx)^{α-1}/Γ(α), x ≥ 0

Now differentiate this pdf with respect to x; we get the derivative

f′(x) = (λ^α/Γ(α)) e^{-λx} x^{α−2} [(α − 1) − λx]

This is zero for x = 0 or x = (α − 1)/λ.

These are the only critical points at which the first derivative is zero. If alpha is greater than or equal to one then x = 0 will not be the mode, because there the pdf is zero, so the mode is (α − 1)/λ; and for alpha strictly less than one the density decreases from infinity to zero as x increases from zero to infinity, so no interior mode is possible. Hence the mode of the gamma distribution is

mode = (α − 1)/λ, for α ≥ 1

median of gamma distribution

The median of the gamma distribution can be found with the help of the inverse of its cumulative distribution function: the median ν is the point satisfying

F(ν) = ∫₀^ν f(x) dx = 1/2

or

ν = F^{-1}(1/2)

There is no simple closed form, but provided the shape parameter is n + 1 with scale one and n is large, a series expansion gives

median(n) = n + 2/3 + 8/(405n) − 64/(5103n²) + ...
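The series can be checked against scipy's exact median; the comparison below assumes, as stated above, shape parameter n + 1 and scale one:

```python
# Sketch: scipy's exact gamma median versus the series approximation above,
# read here as applying to shape n + 1 and scale 1 (assumption of this check).
from scipy import stats

for n in (5, 20, 100):
    exact = stats.gamma(n + 1).median()
    approx = n + 2/3 + 8/(405*n) - 64/(5103*n**2)
    print(n, round(exact, 6), round(approx, 6))   # agree to several decimals
```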

gamma distribution shape

The gamma distribution takes different shapes depending on the shape parameter: when the shape parameter is one, the gamma distribution is equal to the exponential distribution, but as we increase the shape parameter the skewness of the curve of the gamma distribution decreases; in other words, the shape of the curve of the gamma distribution changes along with the standard deviation.

skewness of gamma distribution

The skewness of any distribution can be observed from the probability density function of that distribution and the skewness coefficient

γ₁ = E[(X − μ)³]/σ³ = μ₃/σ³

For the gamma distribution we have

E[X^k] = (α + k − 1)(α + k − 2)···α / λ^k

so

γ₁ = 2/√α

This shows the skewness depends on alpha only: as alpha increases to infinity the curve becomes more symmetric and sharp, and as alpha goes to zero the gamma density curve becomes more positively skewed, which can be observed in the density graphs.
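A short check of the 2/√α formula against scipy's skewness (illustrative shape values):

```python
# Sketch: scipy's skewness for the gamma distribution against 2/sqrt(alpha).
from scipy import stats

for alpha in (1, 4, 25, 100):
    s = float(stats.gamma(alpha).stats(moments="s"))
    print(alpha, round(s, 6), round(2 / alpha**0.5, 6))   # identical values
```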

generalized gamma distribution | shape and scale parameter in gamma distribution | three parameter gamma distribution | multivariate gamma distribution

f(x) = ((x − μ)/β)^{γ−1} e^{−(x−μ)/β} / (β Γ(γ)), x ≥ μ

where γ, μ, and β are the shape, location, and scale parameters respectively. By assigning specific values to these parameters we can get the two-parameter gamma distribution; specifically, if we put μ = 0, β = 1 then we get the standard gamma distribution as

f(x) = x^{γ−1} e^{−x}/Γ(γ), x ≥ 0

Using this three-parameter gamma distribution probability density function, we can find the expectation and variance by following their definitions respectively.

Conclusion:

The concept of the reciprocal of the gamma distribution, that is, the inverse gamma distribution, in comparison with the gamma distribution, and the measures of central tendency of the gamma distribution obtained with the help of the moment generating function, were the focus of this article. If you require further reading, go through the suggested books and links. For more posts on mathematics, visit our mathematics page.

https://en.wikipedia.org/wiki/Gamma_distribution

A first course in probability by Sheldon Ross

Schaum’s Outlines of Probability and Statistics

An introduction to probability and statistics by ROHATGI and SALEH

Gamma Distribution: 7 Important Properties You Should Know

Gamma Distribution

The gamma distribution is one of the widely used continuous distributions. As we know, a continuous random variable deals with values in continuous intervals, and so does the gamma distribution, with its specific probability density function. In the discussion that follows we cover in detail the concept, properties, and results, with examples, of the gamma random variable and gamma distribution.

Gamma random variable or Gamma distribution | what is gamma distribution | define gamma distribution | gamma distribution density function | gamma distribution probability density function | gamma distribution proof

A continuous random variable with probability density function

f(x) = λe^{-λx}(λx)^{α-1}/Γ(α) for x ≥ 0, and f(x) = 0 otherwise

is known as a gamma random variable, or said to follow the gamma distribution, where α > 0, λ > 0 and the gamma function is

Γ(α) = ∫₀^∞ e^{-y} y^{α-1} dy

We have the very frequently used property of the gamma function, obtained by integration by parts:

Γ(α) = ∫₀^∞ e^{-y} y^{α-1} dy

     = [−e^{-y} y^{α-1}]₀^∞ + ∫₀^∞ e^{-y} (α − 1) y^{α-2} dy

     = (α − 1) Γ(α − 1)

If we continue the process starting from an integer n, then

Γ(n) = (n − 1) Γ(n − 1)

     = (n − 1)(n − 2) Γ(n − 2)

     = (n − 1)(n − 2)···3 · 2 · 1 · Γ(1)

and lastly the value of gamma of one will be

Γ(1) = ∫₀^∞ e^{-y} dy = 1

thus the value will be

Γ(n) = (n − 1)!

cdf of gamma distribution | cumulative gamma distribution | integration of gamma distribution

The cumulative distribution function (cdf) of the gamma random variable, or simply the distribution function of the gamma random variable, is the same as that of any continuous random variable, provided the probability density function is the gamma density, i.e.

F(a) = P(X ≤ a) = ∫_{−∞}^{a} f(x) dx

Here the probability density function is as defined above for the gamma distribution; the cumulative distribution function we can also write as

F(a) = ∫₀^a λe^{-λx}(λx)^{α-1}/Γ(α) dx

In both of the above forms the value of the pdf is

f(x) = λe^{-λx}(λx)^{α-1}/Γ(α) for x ≥ 0, and 0 otherwise

where α > 0, λ > 0 are real numbers.

Gamma distribution formula | formula for gamma distribution | gamma distribution equation | gamma distribution derivation

To find the probability for the gamma random variable, the probability density function we have to use, for given α > 0, λ > 0, is

f(x) = λe^{-λx}(λx)^{α-1}/Γ(α), x ≥ 0

and using the above pdf, the distribution for the gamma random variable we can obtain by

P(a ≤ X ≤ b) = ∫_a^b λe^{-λx}(λx)^{α-1}/Γ(α) dx

Thus the gamma distribution formula requires the pdf value and the limits for the gamma random variable, as per the requirement.

Gamma distribution example


Show that the total probability for the gamma distribution is one with the given probability density function, i.e.

f(x) = λe^{-λx}(λx)^{α-1}/Γ(α), x ≥ 0

for λ > 0, α > 0.
Solution:
Using the formula for the gamma distribution,

∫_{−∞}^{∞} f(x) dx = ∫_{−∞}^{∞} λe^{-λx}(λx)^{α-1}/Γ(α) dx

since the probability density function for the gamma distribution is

f(x) = λe^{-λx}(λx)^{α-1}/Γ(α) for x ≥ 0, and 0 otherwise

which is zero for all values less than zero, the probability will now be

∫₀^∞ λe^{-λx}(λx)^{α-1}/Γ(α) dx = (1/Γ(α)) ∫₀^∞ e^{-y} y^{α-1} dy   (substituting y = λx)

Using the definition of the gamma function,

Γ(α) = ∫₀^∞ e^{-y} y^{α-1} dy

and the substitution, we get

(1/Γ(α)) · Γ(α) = 1

thus

∫_{−∞}^{∞} f(x) dx = 1
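The same conclusion can be verified numerically; this sketch (illustrative α and λ) integrates the pdf over [0, ∞) with scipy:

```python
# Sketch: numerically integrating the gamma pdf over [0, inf) with scipy
# (alpha and lam are illustrative values).
import math
from scipy import integrate

alpha, lam = 2.5, 1.7

def pdf(x):
    # f(x) = lam * e^(-lam x) * (lam x)^(alpha - 1) / Gamma(alpha)
    return lam * math.exp(-lam * x) * (lam * x) ** (alpha - 1) / math.gamma(alpha)

total, _ = integrate.quad(pdf, 0, math.inf)
print(total)   # ~1.0
```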

Gamma distribution mean and variance | expectation and variance of gamma distribution | expected value and variance of gamma distribution | Mean of gamma distribution | expected value of gamma distribution | expectation of gamma distribution


In the following discussion we will find the mean and variance of the gamma distribution with the help of the standard definitions of expectation and variance of continuous random variables.

The expected value or mean of the continuous random variable X with probability density function

f(x) = λe^{-λx}(λx)^{α-1}/Γ(α), x ≥ 0

that is, of the gamma random variable X, will be

E[X] = α/λ

mean of gamma distribution proof | expected value of gamma distribution proof

To obtain the expected value or mean of the gamma distribution we will follow the gamma function definition and property.
First, by the definition of expectation of a continuous random variable and the probability density function of the gamma random variable, we have

E[X] = ∫₀^∞ x · λe^{-λx}(λx)^{α-1}/Γ(α) dx

     = (1/(λΓ(α))) ∫₀^∞ λ e^{-λx}(λx)^{α} dx

     = (1/(λΓ(α))) ∫₀^∞ e^{-y} y^{α} dy   (substituting y = λx)

By cancelling the common factor and using the definition of the gamma function,

E[X] = Γ(α + 1)/(λΓ(α))

now, as we have the property of the gamma function

Γ(α + 1) = αΓ(α)

the value of the expectation will be

E[X] = αΓ(α)/(λΓ(α))

thus the mean or expected value of the gamma random variable or gamma distribution is

E[X] = α/λ

variance of gamma distribution | variance of a gamma distribution

The variance of the gamma random variable with the given probability density function

f(x) = λe^{-λx}(λx)^{α-1}/Γ(α), x ≥ 0

or the variance of the gamma distribution, will be

Var(X) = α/λ²

variance of gamma distribution proof


As we know, the variance is the difference of expected values:

Var(X) = E[X²] − (E[X])²

For the gamma distribution we already have the value of the mean

E[X] = α/λ

Now let us first calculate the value of E[X²]. By the definition of expectation for a continuous random variable we have

E[X²] = ∫_{−∞}^{∞} x² f(x) dx

Since the function f(x) is the probability density function of the gamma distribution,

f(x) = λe^{-λx}(λx)^{α-1}/Γ(α) for x ≥ 0, and 0 otherwise

the integral will be from zero to infinity only:

E[X²] = ∫₀^∞ x² λe^{-λx}(λx)^{α-1}/Γ(α) dx

      = (1/(λ²Γ(α))) ∫₀^∞ e^{-y} y^{α+1} dy   (substituting y = λx)

So, by the definition of the gamma function, we can write

E[X²] = Γ(α + 2)/(λ²Γ(α))

      = (α + 1)αΓ(α)/(λ²Γ(α))

Thus, using the property of the gamma function, we get the value

E[X²] = α(α + 1)/λ²

Now, putting the values of these expectations in

Var(X) = E[X²] − (E[X])²

       = α(α + 1)/λ² − (α/λ)²

thus the value of the variance of the gamma distribution or gamma random variable is

Var(X) = α/λ²

Gamma distribution parameters | two parameter gamma distribution | 2 variable gamma distribution


The gamma distribution with the parameters λ > 0, α > 0 and the probability density function

f(x) = λe^{-λx}(λx)^{α-1}/Γ(α), x ≥ 0

has statistical parameters mean and variance

E[X] = α/λ

and

Var(X) = α/λ²

Since λ is a positive real number, to simplify and ease handling, another way is to set λ = 1/β; this gives the probability density function in the form

f(x) = x^{α-1}e^{-x/β}/(β^α Γ(α)), x > 0

In brief, the distribution function or cumulative distribution function for this density we can express as

F(a) = ∫₀^a x^{α-1}e^{-x/β}/(β^α Γ(α)) dx

This gamma density function gives the mean and variance as

E[X] = αβ

and

Var(X) = αβ²

which is obvious by the substitution.
Both forms are commonly used: either the gamma distribution with parameters α and λ, denoted gamma(α, λ), or the gamma distribution with parameters α and β, denoted gamma(α, β), with the respective statistical parameters mean and variance in each form.
Both are nothing but the same distribution.
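The equivalence of the two parametrizations is easy to check numerically; in scipy both are expressed through the same scale argument (a sketch with illustrative values):

```python
# Sketch: the rate form gamma(alpha, lam) and the scale form gamma(alpha, beta)
# coincide when beta = 1/lam (illustrative values below).
from scipy import stats

alpha, lam = 3.0, 2.0
beta = 1 / lam
X = stats.gamma(alpha, scale=beta)       # the same object in either reading

print(X.mean(), alpha / lam, alpha * beta)        # alpha/lam = alpha*beta
print(X.var(), alpha / lam**2, alpha * beta**2)   # alpha/lam^2 = alpha*beta^2
```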

Gamma distribution plot | gamma distribution graph| gamma distribution histogram

The nature of the gamma distribution we can easily visualize with the help of graphs for specific values of the parameters; here we draw the plots of the probability density function and the cumulative distribution function for some values of the parameters.
Let us take the probability density function as

f(x) = x^{α-1}e^{-x/β}/(β^α Γ(α)), x > 0

then the cumulative distribution function will be

F(a) = ∫₀^a x^{α-1}e^{-x/β}/(β^α Γ(α)) dx

Description: graphs for the probability density function and cumulative distribution function by fixing the value of alpha as 1 and varying the value of beta.

Description: graphs for the probability density function and cumulative distribution function by fixing the value of alpha as 2 and varying the value of beta.

Description: graphs for the probability density function and cumulative distribution function by fixing the value of alpha as 3 and varying the value of beta.

Description: graphs for the probability density function and cumulative distribution function by fixing the value of beta as 1 and varying the value of alpha.

Description: graphs for the probability density function and cumulative distribution function by fixing the value of beta as 2 and varying the value of alpha.

Description: graphs for the probability density function and cumulative distribution function by fixing the value of beta as 3 and varying the value of alpha.

In general, the different curves as alpha varies are shown below.

Gamma distribution graph

Gamma distribution table | standard gamma distribution table


The numerical values of the incomplete gamma function

F(x; α) = (1/Γ(α)) ∫₀^x e^{-y} y^{α-1} dy

are tabulated in standard references.

The gamma distribution numerical values used for sketching the plots of the probability density function and cumulative distribution function for some initial values are as follows:

| x   | f(x), α=1, β=1 | f(x), α=2, β=2 | f(x), α=3, β=3 | P(x), α=1, β=1 | P(x), α=2, β=2 | P(x), α=3, β=3 |
|-----|----------------|----------------|----------------|----------------|----------------|----------------|
| 0   | 1              | 0              | 0              | 0              | 0              | 0              |
| 0.1 | 0.904837418    | 0.02378073561  | 1.791140927E-4 | 0.09516258196  | 0.001209104274 | 6.020557215E-6 |
| 0.2 | 0.8187307531   | 0.0452418709   | 6.929681371E-4 | 0.1812692469   | 0.00467884016  | 4.697822176E-5 |
| 0.3 | 0.7408182207   | 0.06455309823  | 0.001508062363 | 0.2591817793   | 0.01018582711  | 1.546530703E-4 |
| 0.4 | 0.670320046    | 0.08187307531  | 0.00259310613  | 0.329679954    | 0.01752309631  | 3.575866931E-4 |
| 0.5 | 0.6065306597   | 0.09735009788  | 0.003918896875 | 0.3934693403   | 0.02649902116  | 6.812970042E-4 |
| 0.6 | 0.5488116361   | 0.1111227331   | 0.005458205021 | 0.4511883639   | 0.03693631311  | 0.001148481245 |
| 0.7 | 0.4965853038   | 0.1233204157   | 0.007185664583 | 0.5034146962   | 0.04867107888  | 0.001779207768 |
| 0.8 | 0.4493289641   | 0.1340640092   | 0.009077669195 | 0.5506710359   | 0.06155193555  | 0.002591097152 |
| 0.9 | 0.4065696597   | 0.1434663341   | 0.01111227331  | 0.5934303403   | 0.07543918015  | 0.003599493183 |
| 1   | 0.3678794412   | 0.1516326649   | 0.01326909834  | 0.6321205588   | 0.09020401043  | 0.004817624203 |
| 1.1 | 0.3328710837   | 0.1586611979   | 0.01552924352  | 0.6671289163   | 0.1057277939   | 0.006256755309 |
| 1.2 | 0.3011942119   | 0.1646434908   | 0.01787520123  | 0.6988057881   | 0.1219013822   | 0.007926331867 |
| 1.3 | 0.272531793    | 0.1696648775   | 0.0202907766   | 0.727468207    | 0.1386244683   | 0.00983411477  |
| 1.4 | 0.2465969639   | 0.1738048563   | 0.02276101124  | 0.7534030361   | 0.1558049836   | 0.01198630787  |
| 1.5 | 0.2231301601   | 0.1771374573   | 0.02527211082  | 0.7768698399   | 0.1733585327   | 0.01438767797  |
| 1.6 | 0.201896518    | 0.1797315857   | 0.02781137633  | 0.798103482    | 0.1912078646   | 0.01704166775  |
| 1.7 | 0.1826835241   | 0.1816513461   | 0.03036713894  | 0.8173164759   | 0.2092823759   | 0.01995050206  |
| 1.8 | 0.1652988882   | 0.1829563469   | 0.03292869817  | 0.8347011118   | 0.2275176465   | 0.02311528775  |
| 1.9 | 0.1495686192   | 0.1837019861   | 0.03548626327  | 0.8504313808   | 0.2458550043   | 0.02653610761  |
| 2   | 0.1353352832   | 0.1839397206   | 0.03803089771  | 0.8646647168   | 0.2642411177   | 0.03021210849  |
| 2.1 | 0.1224564283   | 0.1837173183   | 0.04055446648  | 0.8775435717   | 0.2826276143   | 0.03414158413  |
| 2.2 | 0.1108031584   | 0.183079096    | 0.04304958625  | 0.8891968416   | 0.3009707242   | 0.03832205271  |
| 2.3 | 0.1002588437   | 0.1820661424   | 0.04550957811  | 0.8997411563   | 0.3192309458   | 0.04275032971  |
| 2.4 | 0.09071795329  | 0.1807165272   | 0.04792842284  | 0.9092820467   | 0.3373727338   | 0.04742259607  |
| 2.5 | 0.08208499862  | 0.179065498    | 0.05030071858  | 0.9179150014   | 0.3553642071   | 0.052334462    |
| 2.6 | 0.07427357821  | 0.1771456655   | 0.05262164073  | 0.9257264218   | 0.373176876    | 0.05748102674  |
| 2.7 | 0.06720551274  | 0.1749871759   | 0.05488690407  | 0.9327944873   | 0.3907853875   | 0.0628569343   |
| 2.8 | 0.06081006263  | 0.1726178748   | 0.05709272688  | 0.9391899374   | 0.4081672865   | 0.06845642568  |
| 2.9 | 0.05502322006  | 0.1700634589   | 0.05923579709  | 0.9449767799   | 0.4253027942   | 0.07427338744  |
| 3   | 0.04978706837  | 0.1673476201   | 0.0613132402   | 0.9502129316   | 0.4421745996   | 0.08030139707  |
Gamma Distribution Graph

finding alpha and beta for gamma distribution | how to calculate alpha and beta for gamma distribution | gamma distribution parameter estimation


For a gamma distribution, to find alpha and beta we take the mean and variance of the gamma distribution,

E[X] = αβ

and

Var(X) = αβ²

Now we get the value of beta as

Var(X)/E[X] = αβ²/(αβ) = β

so

β = Var(X)/E[X]

and

α = E[X]/β

thus

α = (E[X])²/Var(X)

Taking just these ratios of moments from the gamma distribution (or from sample data), we get the values of alpha and beta.
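This is exactly the method-of-moments recipe; the following sketch applies it to simulated data (the true parameter values below are illustrative):

```python
# Method-of-moments sketch: estimate alpha and beta from simulated data
# using beta = var/mean and alpha = mean^2/var, as derived above.
import numpy as np

rng = np.random.default_rng(7)
sample = rng.gamma(shape=2.5, scale=1.8, size=50_000)   # alpha = 2.5, beta = 1.8

m, v = sample.mean(), sample.var()
print(round(m**2 / v, 3), round(v / m, 3))              # alpha_hat, beta_hat
```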

gamma distribution problems and solutions | gamma distribution example problems | gamma distribution tutorial | gamma distribution question

1. Consider that the time required to resolve a customer's problem is gamma distributed, in hours, with mean 1.5 and variance 0.75. What would be the probability that the problem-resolving time exceeds 2 hours? If the time exceeds 2 hours, what would be the probability that the problem will be resolved in at least 5 hours?

Solution: Since the random variable is gamma distributed with mean 1.5 and variance 0.75, we can find the values of alpha and beta: β = 0.75/1.5 = 0.5 (i.e. λ = 2) and α = 1.5²/0.75 = 3. With the help of these values, the probabilities will be

P(X > 2) = 13e^{-4} = 0.2381

and

P(X > 5 | X > 2) = (61/13)e^{-6} = 0.011631
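The same two numbers can be checked with scipy (a sketch; gamma(3, scale=0.5) encodes α = 3, λ = 2):

```python
# Sketch checking problem 1 with scipy: alpha = 3, beta = 0.5 (so lam = 2).
from scipy import stats

X = stats.gamma(3, scale=0.5)
p_gt_2 = X.sf(2)                 # survival function, P(X > 2) = 13 e^{-4}
p_cond = X.sf(5) / X.sf(2)       # P(X > 5 | X > 2) = (61/13) e^{-6}
print(round(p_gt_2, 4), round(p_cond, 6))   # ~0.2381 and ~0.011632
```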

2. Negative feedback per week from users is modelled by a gamma distribution with alpha 2 and beta 4. After restructuring for quality, 12 negative feedbacks came in a week. From this information, can we say the restructuring improved performance?

Solution: As this is modelled by a gamma distribution with α = 2, β = 4,

we find the mean and standard deviation as μ = E(X) = αβ = 2 · 4 = 8 and σ = √(αβ²) = 4√2 ≈ 5.66.

Since the value X = 12 is within one standard deviation of the mean, we cannot say whether or not the restructuring of quality caused an improvement; the information given is insufficient to prove the improvement.

3. Let X be gamma distributed with parameters α = 1/2, λ = 1/2; find the probability density function of the random variable Y = √X.

Solution: Let us calculate the cumulative distribution function of Y:

F_Y(y) = P(Y ≤ y) = P(√X ≤ y) = P(X ≤ y²) = F_X(y²)

Now, differentiating this with respect to y gives the probability density function of Y as

f_Y(y) = 2y f_X(y²) = √(2/π) e^{-y²/2}

and the range for y will be from 0 to infinity.


Conclusion:

The gamma distribution is one of the important, widely applicable distributions of the exponential family in probability and statistics. All basic to higher-level concepts related to the gamma distribution were discussed so far; if you require further reading, please go through the books mentioned below. You can also visit our mathematics page for more topics.

https://en.wikipedia.org/wiki/Gamma_distribution
A first course in probability by Sheldon Ross
Schaum’s Outlines of Probability and Statistics
An introduction to probability and statistics by ROHATGI and SALEH

Probability Theory: 9 Facts You Should Know


A brief Description of Probability theory

In the previous articles, the probability we discussed was at a very basic level. Probability is a means of expressing the likelihood that an event occurs. In pure mathematics the concept of probability is described in the form of probability theory, which is widely used in areas of real life as well as in different branches of philosophy, science, gambling, finance, statistics, and mathematics, for finding the likelihood of events.

Probability theory is the branch of mathematics which deals with random experiments and their outcomes; the core objects for the analysis of a random experiment are events, random variables, stochastic processes, non-deterministic events, etc.

Providing an example: when we toss a coin or a die, the individual outcome is random, but when we repeat such a trial a number of times the results settle into a particular statistical pattern which we can predict by studying the law of large numbers or the central limit theorem. Likewise, we can use probability theory in the day-to-day activity of human beings, e.g. large sets of data can be analysed by quantitative analysis; for systems about which we have insufficient information we can use probability theory, e.g. complex systems in statistical mechanics, or physical phenomena at atomic scales in quantum mechanics.

There are a number of real-life situations and applications where a probabilistic situation occurs; probability theory is used there, provided one is familiar with the concepts and the handling of the results and relations of probability theory. In the following we differentiate some of these situations with the help of some terms of probability theory.

Discrete probability

Discrete probability theory is the study of random experiments in which the result can be counted numerically, so here the restriction is that whatever events occur must form a countable subset of the given sample space. It includes experiments of tossing a coin or dice, random walks, picking cards from a deck, balls in bags, etc.

Continuous probability

Continuous probability theory is the study of random experiments in which the result lies within continuous intervals, so here the restriction is that the events that occur must be expressible as continuous intervals forming a subset of the sample space.

Measure-theoretic probability

Measure-theoretic probability theory deals with both discrete and continuous random outcomes, and differentiates which measure has to be used in which situation. Measure-theoretic probability theory also deals with probability distributions that are neither discrete nor continuous nor a mixture of both.

So to study probability we must first know the nature of the random experiment: whether it is discrete, continuous, a mixture of both, or neither; depending on this we can set the strategy we have to follow. We will discuss all the situations one by one.

EXPERIMENT

Any action that produces a result or an outcome is called an experiment. There are two types of experiment.

| Deterministic Experiments | Non-deterministic Experiments (or Random Experiments) |
|---|---|
| Any experiment whose outcome we can predict in advance under some conditions. | Any experiment whose outcome or result we cannot predict in advance. |
| For example, the flow of current in a specific circuit, which we know by physical laws from the power provided. | For example, tossing an unbiased coin: we don't know whether heads or tails will come. |
| We don't need probability theory for the outcome of such experiments. | We need probability theory for the outcome of such experiments. |

The theory of probability is basically built on the model of a random experiment, that is, an experiment whose outcome is unpredictable with certainty before the experiment is run. It is normally assumed that the experiment can be repeated indefinitely under essentially the same circumstances.

This presumption is important because the theory of probability is concerned with long-term behaviour as the experiment is repeated. Naturally, a proper definition of a random experiment needs a careful definition of specifically what information about the experiment is being recorded, that is, a careful definition of what constitutes an outcome.

SAMPLE SPACE

As already discussed, the sample space is nothing but the set of all possible outcomes of a non-deterministic or random experiment. In mathematical analysis, the random variable, which is the outcome of such an experiment, is a real-valued function denoted by X, i.e. X: A ⊆ S → ℝ, which we will discuss in detail later. Here also we can categorize sample spaces as finite or infinite; infinite sample spaces can be discrete or continuous.

| Finite Sample Spaces | Infinite Discrete Sample Spaces |
|---|---|
| Tossing a coin or anything with two different results: {H, T} | Repeatedly tossing a coin until the first head shows; possible outcomes: {H, TH, TTH, TTTH, ...} |
| Throwing a die: {1, 2, 3, 4, 5, 6} | Throwing a die repeatedly till 6 comes |
| Drawing a card from a deck of 52 cards | Drawing a card and replacing it till a queen comes |
| Choosing a birthday from a year: {1, 2, 3, 4, ..., 365} | Arrival time of two consecutive trains |

EVENT

An event, as we already know, is a subset of the sample space of the random experiment for which we are discussing the probability. In other words, any element of the power set of a finite sample space is an event; for an infinite sample space we have to exclude some subsets.

| Independent events | Dependent events |
|---|---|
| The occurrence of one event has no effect on other events | The occurrence of one event affects the other events |
| For example, tossing a coin | For example, drawing a card without returning it |
| The probabilities of the events are not affected | The probabilities of the events are affected |
| P(A ∩ B) = P(A) × P(B) | P(A ∩ B) = P(A) × P(B|A), where P(B|A) is the conditional probability of B given A |

RANDOM VARIABLE

The understanding of the random variable is very important for the study of probability theory. The random variable is very helpful for generalizing the concept of probability, giving a mathematical handle on probability questions, and the use of measure-theoretic probability is based on random variables. A random variable, which is the outcome of a random experiment, is a real-valued function denoted by X, i.e. X: A ⊆ S → ℝ.

| Discrete Random Variable | Continuous Random Variable |
|---|---|
| Countable outcome of a random experiment | Outcome of a random experiment within a range |
| For a coin toss, the possible events are heads or tails, so the random variable takes the values X = 1 if heads and X = 0 if tails | A real number between zero and one |
| For throwing a die, X = 1, 2, 3, 4, 5, 6 | For the time of travelling, X ∈ (3, 4) |
A random variable can be thought of as an unknown value that may change every time it is inspected. Thus, a random variable can be thought of as a function mapping the sample space of a random process to the real numbers.

Probability Distributions

A probability distribution is defined as the assignment of probabilities to the values of a random variable,

so obviously, depending on the nature of the random variable, we can categorize distributions as follows.

| Discrete Probability Distribution | Continuous Probability Distribution |
|---|---|
| If the random variable is discrete then the probability distribution is known as a discrete probability distribution | If the random variable is continuous then the probability distribution is known as a continuous probability distribution |
| For example, the number of tails when tossing a coin two times: the results are TT, HH, TH, HT, so X (no. of tails): 0, 1, 2 with P(X): 1/4, 1/2, 1/4 | A continuous probability distribution differs from a discrete one: for the random variable X, the probability P(X ≤ a) can be considered as the area under the density curve |

In a similar way, dealing with the probability of a random variable depends on the nature of the random variable, so the concepts we use will depend on whether the random variable is discrete or continuous.

Conclusion:

In this article we mainly discussed the scenario of probability, how we can deal with probability, and compared some concepts. Before discussing the core subject, this discussion is important so that we know clearly where the problems we deal with stand. In the consecutive articles we will relate probability to random variables and discuss some familiar terms related to probability theory. If you want further reading, then go through:

Schaum’s Outlines of Probability and Statistics

https://en.wikipedia.org/wiki/Probability

For more topics on mathematics please check this page.

Normal Random Variable : 3 Important Facts


Normal Random variable and Normal distribution

A random variable with an uncountable set of values is known as a continuous random variable, and its probability density function, with integration as the area under the curve, gives the continuous distribution. Now we will focus on one of the most used and frequent continuous random variables, the normal random variable, which has another name: Gaussian random variable, or Gaussian distribution.

Normal random variable

A normal random variable is a continuous random variable with probability density function

f(x) = (1/(σ√(2π))) e^{-(x-μ)²/(2σ²)}, −∞ < x < ∞

having mean μ and variance σ2 as the statistical parameters and geometrically the probability density function has the bell shaped curve which is symmetric about the mean μ.

Normal random variable

We know that the probability density function has total probability one, so

∫_{−∞}^{∞} (1/(σ√(2π))) e^{-(x-μ)²/(2σ²)} dx = 1

By putting y = (x − μ)/σ this reduces to showing

I = ∫_{−∞}^{∞} e^{-y²/2} dy = √(2π)

Squaring,

I² = ∫_{−∞}^{∞} ∫_{−∞}^{∞} e^{-(y² + z²)/2} dy dz

This double integral can be solved by converting it into polar form:

I² = ∫₀^{2π} ∫₀^∞ e^{-r²/2} r dr dθ = 2π

which is the required value, so the integral I is verified.

If X is normally distributed with parameters μ and σ², then Y = aX + b is also normally distributed, with parameters aμ + b and a²σ².

Expectation and Variance of Normal Random variable

The expected value and the variance of the normal random variable we will get with the help of the standardized variable

Z = (X − μ)/σ

where X is normally distributed with parameters mean μ and standard deviation σ. Then

E[Z] = (1/√(2π)) ∫_{−∞}^{∞} x e^{-x²/2} dx = 0

and, since the mean of Z is zero, we have the variance as

Var(Z) = E[Z²] = (1/√(2π)) ∫_{−∞}^{∞} x² e^{-x²/2} dx

By using integration by parts,

Var(Z) = (1/√(2π)) [−x e^{-x²/2}]_{−∞}^{∞} + (1/√(2π)) ∫_{−∞}^{∞} e^{-x²/2} dx = 0 + 1 = 1

so E[X] = μ + σE[Z] = μ and Var(X) = σ²Var(Z) = σ².

For the variable Z the graphical interpretation is the standard bell curve centred at zero.

The area under the curve for this variable Z, which is known as the standard normal variable, is calculated for reference (given in the table below); as the curve is symmetric, for negative values the area is the same as that for positive values.

Φ(z) = P(Z ≤ z) = (1/√(2π)) ∫_{−∞}^{z} e^{-x²/2} dx
| z   | 0.00    | 0.01    | 0.02    | 0.03    | 0.04    | 0.05    | 0.06    | 0.07    | 0.08    | 0.09    |
|-----|---------|---------|---------|---------|---------|---------|---------|---------|---------|---------|
| 0.0 | 0.50000 | 0.50399 | 0.50798 | 0.51197 | 0.51595 | 0.51994 | 0.52392 | 0.52790 | 0.53188 | 0.53586 |
| 0.1 | 0.53983 | 0.54380 | 0.54776 | 0.55172 | 0.55567 | 0.55962 | 0.56356 | 0.56749 | 0.57142 | 0.57535 |
| 0.2 | 0.57926 | 0.58317 | 0.58706 | 0.59095 | 0.59483 | 0.59871 | 0.60257 | 0.60642 | 0.61026 | 0.61409 |
| 0.3 | 0.61791 | 0.62172 | 0.62552 | 0.62930 | 0.63307 | 0.63683 | 0.64058 | 0.64431 | 0.64803 | 0.65173 |
| 0.4 | 0.65542 | 0.65910 | 0.66276 | 0.66640 | 0.67003 | 0.67364 | 0.67724 | 0.68082 | 0.68439 | 0.68793 |
| 0.5 | 0.69146 | 0.69497 | 0.69847 | 0.70194 | 0.70540 | 0.70884 | 0.71226 | 0.71566 | 0.71904 | 0.72240 |
| 0.6 | 0.72575 | 0.72907 | 0.73237 | 0.73565 | 0.73891 | 0.74215 | 0.74537 | 0.74857 | 0.75175 | 0.75490 |
| 0.7 | 0.75804 | 0.76115 | 0.76424 | 0.76730 | 0.77035 | 0.77337 | 0.77637 | 0.77935 | 0.78230 | 0.78524 |
| 0.8 | 0.78814 | 0.79103 | 0.79389 | 0.79673 | 0.79955 | 0.80234 | 0.80511 | 0.80785 | 0.81057 | 0.81327 |
| 0.9 | 0.81594 | 0.81859 | 0.82121 | 0.82381 | 0.82639 | 0.82894 | 0.83147 | 0.83398 | 0.83646 | 0.83891 |
| 1.0 | 0.84134 | 0.84375 | 0.84614 | 0.84849 | 0.85083 | 0.85314 | 0.85543 | 0.85769 | 0.85993 | 0.86214 |
| 1.1 | 0.86433 | 0.86650 | 0.86864 | 0.87076 | 0.87286 | 0.87493 | 0.87698 | 0.87900 | 0.88100 | 0.88298 |
| 1.2 | 0.88493 | 0.88686 | 0.88877 | 0.89065 | 0.89251 | 0.89435 | 0.89617 | 0.89796 | 0.89973 | 0.90147 |
| 1.3 | 0.90320 | 0.90490 | 0.90658 | 0.90824 | 0.90988 | 0.91149 | 0.91308 | 0.91466 | 0.91621 | 0.91774 |
| 1.4 | 0.91924 | 0.92073 | 0.92220 | 0.92364 | 0.92507 | 0.92647 | 0.92785 | 0.92922 | 0.93056 | 0.93189 |
| 1.5 | 0.93319 | 0.93448 | 0.93574 | 0.93699 | 0.93822 | 0.93943 | 0.94062 | 0.94179 | 0.94295 | 0.94408 |
| 1.6 | 0.94520 | 0.94630 | 0.94738 | 0.94845 | 0.94950 | 0.95053 | 0.95154 | 0.95254 | 0.95352 | 0.95449 |
| 1.7 | 0.95543 | 0.95637 | 0.95728 | 0.95818 | 0.95907 | 0.95994 | 0.96080 | 0.96164 | 0.96246 | 0.96327 |
| 1.8 | 0.96407 | 0.96485 | 0.96562 | 0.96638 | 0.96712 | 0.96784 | 0.96856 | 0.96926 | 0.96995 | 0.97062 |
| 1.9 | 0.97128 | 0.97193 | 0.97257 | 0.97320 | 0.97381 | 0.97441 | 0.97500 | 0.97558 | 0.97615 | 0.97670 |
| 2.0 | 0.97725 | 0.97778 | 0.97831 | 0.97882 | 0.97932 | 0.97982 | 0.98030 | 0.98077 | 0.98124 | 0.98169 |
| 2.1 | 0.98214 | 0.98257 | 0.98300 | 0.98341 | 0.98382 | 0.98422 | 0.98461 | 0.98500 | 0.98537 | 0.98574 |
| 2.2 | 0.98610 | 0.98645 | 0.98679 | 0.98713 | 0.98745 | 0.98778 | 0.98809 | 0.98840 | 0.98870 | 0.98899 |
| 2.3 | 0.98928 | 0.98956 | 0.98983 | 0.99010 | 0.99036 | 0.99061 | 0.99086 | 0.99111 | 0.99134 | 0.99158 |
| 2.4 | 0.99180 | 0.99202 | 0.99224 | 0.99245 | 0.99266 | 0.99286 | 0.99305 | 0.99324 | 0.99343 | 0.99361 |
| 2.5 | 0.99379 | 0.99396 | 0.99413 | 0.99430 | 0.99446 | 0.99461 | 0.99477 | 0.99492 | 0.99506 | 0.99520 |
| 2.6 | 0.99534 | 0.99547 | 0.99560 | 0.99573 | 0.99585 | 0.99598 | 0.99609 | 0.99621 | 0.99632 | 0.99643 |
| 2.7 | 0.99653 | 0.99664 | 0.99674 | 0.99683 | 0.99693 | 0.99702 | 0.99711 | 0.99720 | 0.99728 | 0.99736 |
| 2.8 | 0.99744 | 0.99752 | 0.99760 | 0.99767 | 0.99774 | 0.99781 | 0.99788 | 0.99795 | 0.99801 | 0.99807 |
| 2.9 | 0.99813 | 0.99819 | 0.99825 | 0.99831 | 0.99836 | 0.99841 | 0.99846 | 0.99851 | 0.99856 | 0.99861 |
| 3.0 | 0.99865 | 0.99869 | 0.99874 | 0.99878 | 0.99882 | 0.99886 | 0.99889 | 0.99893 | 0.99896 | 0.99900 |
| 3.1 | 0.99903 | 0.99906 | 0.99910 | 0.99913 | 0.99916 | 0.99918 | 0.99921 | 0.99924 | 0.99926 | 0.99929 |
| 3.2 | 0.99931 | 0.99934 | 0.99936 | 0.99938 | 0.99940 | 0.99942 | 0.99944 | 0.99946 | 0.99948 | 0.99950 |
| 3.3 | 0.99952 | 0.99953 | 0.99955 | 0.99957 | 0.99958 | 0.99960 | 0.99961 | 0.99962 | 0.99964 | 0.99965 |
| 3.4 | 0.99966 | 0.99968 | 0.99969 | 0.99970 | 0.99971 | 0.99972 | 0.99973 | 0.99974 | 0.99975 | 0.99976 |
| 3.5 | 0.99977 | 0.99978 | 0.99978 | 0.99979 | 0.99980 | 0.99981 | 0.99981 | 0.99982 | 0.99983 | 0.99983 |

Since we have used the substitution

Z = (X − μ)/σ

keep in mind that Z is the standard normal variate, whereas the continuous random variable X is a normally distributed random variable with mean μ and standard deviation σ.

So to find the distribution function for the random variable we use the conversion to the standard normal variate as

F_X(a) = P(X ≤ a) = P((X − μ)/σ ≤ (a − μ)/σ) = Φ((a − μ)/σ)

for any value of a.
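As a sketch of this conversion (illustrative μ = 3, σ = 3, a = 5), both sides of the identity can be evaluated with scipy:

```python
# Sketch of the standardization identity P(X <= a) = Phi((a - mu)/sigma).
from scipy import stats

mu, sigma, a = 3.0, 3.0, 5.0
print(stats.norm(mu, sigma).cdf(a))         # P(X <= a) directly
print(stats.norm().cdf((a - mu) / sigma))   # Phi((a - mu)/sigma), same value
```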

Example: In the standard normal curve find the area between the points 0 and 1.2.

If we follow the table, the value for z = 1.2 under the column 0.00 is 0.88493 and the value for z = 0 is 0.50000, so the required area is

P(0 ≤ Z ≤ 1.2) = 0.88493 − 0.50000 = 0.38493

Example: Find the area under the standard normal curve from −0.46 to 2.21.

From the shaded region we can split this region into −0.46 to 0 and 0 to 2.21; because the normal curve is symmetric about the y-axis, the area from −0.46 to 0 is the same as the area from 0 to 0.46. Thus, from the table,

area between z = −0.46 and z = 0 = Φ(0.46) − 0.50000 = 0.67724 − 0.50000 = 0.17724

and

area between z = 0 and z = 2.21 = Φ(2.21) − 0.50000 = 0.98645 − 0.50000 = 0.48645

so we can write it as

Total area = (area between z = −0.46 and z = 0) + (area between z = 0 and z = 2.21)

= 0.17724 + 0.48645

= 0.66369

Example: If X is a normal random variable with mean 3 and variance 9, find the following probabilities:

P(2 < X < 5)

P(X > 0)

P(|X − 3| > 6)

Solution: Since we have μ = 3 and σ = 3,

P(2 < X < 5) = P((2 − 3)/3 < Z < (5 − 3)/3) = P(−1/3 < Z < 2/3)

so, splitting into the intervals −1/3 to 0 and 0 to 2/3, we get the solution from the tabulated values (Φ(2/3) ≈ 0.74537, Φ(1/3) ≈ 0.62930):

P(2 < X < 5) = Φ(2/3) − Φ(−1/3) = Φ(2/3) − (1 − Φ(1/3))

= 0.74537 − 1 + 0.62930 = 0.37467

and

P(X > 0) = P(Z > (0 − 3)/3) = P(Z > −1) = 1 − Φ(−1) = Φ(1) = 0.84134

and

P(|X − 3| > 6) = P(X > 9) + P(X < −3) = P(Z > 2) + P(Z < −2)

= 2(1 − Φ(2)) = 2(1 − 0.97725) = 0.0455

Example: An observer in a paternity case states that the length (in days) of human gestation is normally distributed with parameters mean 270 and variance 100. The suspect, alleged to be the father of the child, provided proof that he was out of the country during a period that started 290 days before the birth of the child and ended 240 days before the birth. Find the probability that the mother could have had the very long or very short pregnancy indicated by the witness.

Let X denote the normally distributed random variable for gestation, and suppose the suspect is the father of the child. In that case the birth of the child happening within the specified time has the probability

P(X > 290 or X < 240) = P(Z > 2) + P(Z < −3) = (1 − 0.97725) + (1 − 0.99865) ≈ 0.0241

Relation between Normal random variable and Binomial random variable

In the case of the binomial distribution the mean is np and the variance is npq, so if we standardize such a binomial random variable using this mean and standard deviation, with n very large (and np(1 − p) not too small), the standard normal variable formed with the help of this mean and variance is

Z = (X − np)/√(np(1 − p))

Here, in terms of Bernoulli trials, X counts the number of successes in n trials. As n increases and tends to infinity, this standardized variable converges to the standard normal variable.

The relation between the binomial and the standard normal variate we can find with the help of the following theorem.

DeMoivre–Laplace limit theorem

If Sₙ denotes the number of successes that occur when n independent trials, each resulting in a success with probability p, are performed then, for any a < b,

P{a ≤ (Sₙ − np)/√(np(1 − p)) ≤ b} → Φ(b) − Φ(a)

as n → ∞.

Example: With the help of the normal approximation to the binomial random variable, find the probability of the occurrence of exactly 20 tails when a fair coin is tossed 40 times.

Solution: Suppose the random variable X represents the number of tails. Since the binomial random variable is discrete and the normal random variable is continuous, to convert the discrete into the continuous we apply the continuity correction and write

P(X = 20) = P(19.5 < X < 20.5) = P((19.5 − 20)/√10 < Z < (20.5 − 20)/√10) ≈ P(−0.16 < Z < 0.16) = 2Φ(0.16) − 1 ≈ 0.1272

and if we solve the given example with the help of the binomial distribution directly we get

P(X = 20) = C(40, 20)(1/2)⁴⁰ ≈ 0.1254
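The two computations can be reproduced with scipy as follows (a sketch of the continuity-corrected approximation):

```python
# Sketch: exact binomial vs the continuity-corrected normal approximation
# for 20 tails in 40 fair tosses.
from scipy import stats

n, p = 40, 0.5
exact = stats.binom(n, p).pmf(20)

mu, sigma = n * p, (n * p * (1 - p)) ** 0.5
approx = stats.norm(mu, sigma).cdf(20.5) - stats.norm(mu, sigma).cdf(19.5)
print(round(exact, 4), round(approx, 4))
# ~0.1254 vs ~0.1256 (rounding z to 0.16 in the printed table gives 0.1272)
```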

Example: To decide the efficiency of a certain nourishment in decreasing the level of cholesterol in the blood circulation, 100 people are placed on the nourishment. The cholesterol count is observed for a defined time after providing the nourishment. If from this sample 65 percent or more have a lowered cholesterol count, then the nourishment will be approved. What is the probability that the nutritionist approves the new nourishment if, actually, it has no effect on the cholesterol level?

Solution: Let the random variable X denote the number of people whose cholesterol level is lowered. If the nourishment has no effect, the probability of a lowered level is 1/2 for each person, so X is binomial with n = 100 and p = 1/2, and the probability that the result is approved even though there is no effect of the nourishment is

P(X ≥ 65) = P(X ≥ 64.5)   (continuity correction)

= P((X − 50)/5 ≥ (64.5 − 50)/5) = P(Z ≥ 2.9)

= 1 − Φ(2.9) = 1 − 0.99813 ≈ 0.0019

Conclusion:

In this article the concept of the continuous random variable known as the normal random variable, and its distribution with probability density function, were discussed, and the statistical parameters mean and variance of the normal random variable were given. The conversion of a normally distributed random variable to the standard normal variate, and the area under the curve for that standard normal variate, were given in tabulated form; one relation with a discrete random variable was also mentioned, with examples. If you want further reading then go through:

Schaum’s Outlines of Probability and Statistics

https://en.wikipedia.org/wiki/Probability.

For more topics on mathematics please check this page.

Continuous Random Variable: 3 Important Facts


Continuous random variable, types and its distribution

A random variable which takes finitely or countably infinitely many values is known as a discrete random variable, and pairing its values with probabilities forms the distribution of the discrete random variable. Now, for a random variable that takes uncountably many values, we are going to discuss what the probability and the remaining characteristics would be. Thus, in brief, a continuous random variable is a random variable whose set of values is uncountable. Real-life examples of continuous random variables are the lifespan of electrical or electronic components and the arrival time of a specific public vehicle at a stop.

Continuous random variable and probability density function

A random variable X is a continuous random variable if there is a non-negative real-valued function f on ℝ such that, for every subset B ⊆ ℝ,

P{X ∈ B} = ∫_B f(x) dx

This function f is known as the probability density function of the given random variable X.

The probability density function obviously satisfies the probability axioms:

f(x) ≥ 0 for all x

and, since from the axioms of probability we know that the total probability is one,

∫_{−∞}^{∞} f(x) dx = P{X ∈ (−∞, ∞)} = 1

For a continuous random variable the probability is calculated in terms of such a function f. Suppose we want to find the probability for the continuous interval [a, b]; then it would be

P{a ≤ X ≤ b} = ∫_a^b f(x) dx

As we know, the integration represents the area under the curve, so this probability is the area between a and b under the density curve.

By equating a = b the value will be

P{X = a} = ∫_a^a f(x) dx = 0

and, in a similar way, the probability for a value less than or equal to a specific value a will be

P{X ≤ a} = P{X < a} = ∫_{−∞}^{a} f(x) dx

Example: The continuous working time of an electronic component is expressed as a continuous random variable with probability density function

f(x) = λe^{-x/100} for x ≥ 0, and 0 otherwise.

Find the probability that the component will work effectively between 50 and 150 hours, and the probability that it works less than 100 hours.

Since the random variable is a continuous random variable, the probability density function given in the question yields the total probability as

1 = ∫_{−∞}^{∞} f(x) dx = λ ∫₀^∞ e^{-x/100} dx = 100λ

So we get the value of λ:

λ = 1/100

For the probability from 50 hours to 150 hours we have

P{50 < X < 150} = ∫_{50}^{150} (1/100) e^{-x/100} dx = e^{-1/2} − e^{-3/2} ≈ 0.383

and in a similar way the probability of less than 100 hours will be

P{X < 100} = ∫₀^{100} (1/100) e^{-x/100} dx = 1 − e^{-1} ≈ 0.632
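A quick arithmetic check of these two probabilities, assuming the exponential form f(x) = (1/100)e^{-x/100} found above:

```python
# Arithmetic sketch of the two probabilities for the density above.
import math

p_50_150 = math.exp(-0.5) - math.exp(-1.5)    # P(50 < X < 150)
p_lt_100 = 1 - math.exp(-1)                   # P(X < 100)
print(round(p_50_150, 4), round(p_lt_100, 4)) # ~0.3834 and ~0.6321
```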

Example: A computer-based device has a number of chipsets, with lifespan (in hours) given by the probability density function

f(x) = 100/x² for x > 100, and 0 otherwise.

After 150 hours, find the probability that we have to replace exactly 2 chipsets out of 5 in total.

Let Eᵢ be the event that the i-th chipset has to be replaced within 150 hours. The probability of such an event will be

P(Eᵢ) = ∫_{100}^{150} (100/x²) dx = 1 − 2/3 = 1/3

As all the chips work independently, the probability that exactly 2 are to be replaced will be

P = C(5, 2) (1/3)² (2/3)³ = 80/243 ≈ 0.329

Cumulative distribution function

The cumulative distribution function for a continuous random variable is defined with the help of the probability density function as

F(a) = P{X ≤ a} = ∫_{−∞}^{a} f(x) dx

or, in another form,

P{a < X ≤ b} = F(b) − F(a)

We can obtain the probability density function from the distribution function as

f(a) = (d/da) F(a)

Mathematical Expectation and Variance of continuous random variable

Expectation

The mathematical expectation or mean of a continuous random variable X with probability density function f can be defined as

E[X] = ∫_{−∞}^{∞} x f(x) dx

1. For any real-valued function g of the continuous random variable X, the expectation will be

E[g(X)] = ∫_{−∞}^{∞} g(x) f(x) dx

where g is the real-valued function.

2. For any non-negative continuous random variable Y, the expectation will be

E[Y] = ∫₀^∞ P{Y > y} dy

3. For any constants a and b,

E[aX + b] = aE[X] + b

Variance

The variance of the continuous random variable X with mean (expectation) μ can be defined, in the same way as for a discrete random variable, as

Var(X) = E[(X − μ)²]

with the useful identity

Var(X) = E[X²] − (E[X])²

The proofs of all the above properties of expectation and variance we can easily obtain by just following the steps we used for discrete random variables, together with the definitions of expectation, variance, and probability in terms of a continuous random variable.

Example: If the probability density function of the continuous random variable X is given by

22 2

then find the expectation and variance of the continuous random variable X.

Solution:  For the given probability density function

23 1

the expected value by the definition will be

24 1

Now, to find the variance, we require E[X²]:

25 1

Since

26 1

so

27

Uniform random variable

If the continuous random variable X has the probability density function given by

f(x) = 1 for 0 < x < 1, and 0 otherwise

over the interval (0, 1), then this distribution is known as the uniform distribution, and the random variable is known as a uniform random variable.

For any constants a and b such that 0 < a < b < 1,

P{a ≤ X ≤ b} = ∫_a^b f(x) dx = b − a
Continuous random variable: Uniform random variable

Expectation and Variance of Uniform random variable

For the uniform continuous random variable X on the general interval (α, β), the expectation by definition will be

E[X] = ∫_α^β x/(β − α) dx = (β² − α²)/(2(β − α)) = (α + β)/2

and the variance we will get if we first find E[X²]:

E[X²] = ∫_α^β x²/(β − α) dx = (β³ − α³)/(3(β − α)) = (α² + αβ + β²)/3

so

Var(X) = E[X²] − (E[X])² = (α² + αβ + β²)/3 − ((α + β)/2)²

       = (β − α)²/12
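A short check of these two formulas with scipy's uniform distribution (illustrative interval (2, 7); note scipy parametrizes by loc and scale = β − α):

```python
# Sketch checking the uniform(alpha, beta) mean and variance formulas.
from scipy import stats

a, b = 2.0, 7.0
U = stats.uniform(loc=a, scale=b - a)

print(U.mean(), (a + b) / 2)          # (alpha + beta)/2
print(U.var(), (b - a) ** 2 / 12)     # (beta - alpha)^2 / 12
```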

Example: At a particular station the trains for the given destination arrive with frequency of 15 minutes form 7 A.M. For the passenger who is on the station at a time between 7 to 7.30 distributed uniformly what will be the probability that the passenger get train within 5 minutes and what will be probability for more than 10 minutes.

Solution: As the time from 7 to 7.30 at which the passenger arrives at the railway station is distributed uniformly, denote it by the uniform random variable X; the interval will be (0, 30).

Since, to get a train within 5 minutes, the passenger must be at the station between 7.10 and 7.15 or between 7.25 and 7.30, the probability will be

P{10 < X < 15} + P{25 < X < 30} = 5/30 + 5/30

= 1/3

In a similar manner, to get a train after waiting more than 10 minutes the passenger must be at the station between 7 and 7.05 or between 7.15 and 7.20, so the probability will be

P{0 < X < 5} + P{15 < X < 20} = 5/30 + 5/30 = 1/3

Example: Find the probabilities for the uniform random variable X distributed over the interval (0, 10)

for X < 3, X > 6 and 3 < X < 8.

Solution: Since the random variable is uniformly distributed, the probabilities will be

P{X < 3} = 3/10, P{X > 6} = 4/10, P{3 < X < 8} = 5/10 = 1/2

Example (Bertrand's Paradox): For a random chord of a circle, what is the probability that the length of that random chord is greater than the side of the equilateral triangle inscribed in the same circle?

This problem does not make clear what a "random chord" is, so it is usually reformulated, e.g. in terms of the endpoints or the angle of the chord, and then the answer 1/3 is obtained; other reformulations lead to different answers, which is why it is called a paradox.

Conclusion:

In this article the concept of the continuous random variable and its distribution with probability density function were discussed, and the statistical parameters mean and variance of the continuous random variable were given. The uniform random variable and its distribution, one type of continuous random variable, were given with examples; in the successive articles we will focus on some important types of continuous random variables with suitable examples and properties. If you want further reading then go through:

Schaum’s Outlines of Probability and Statistics

https://en.wikipedia.org/wiki/Probability

If you want to read more topics on Mathematics then go through Mathematics Page.

Geometric Random Variable: 7 Important Characteristics


Some additional discrete random variables and their parameters

A discrete random variable together with its probability mass function determines the probability distribution, and depending on the nature of the discrete random variable the probability distribution may have different names, such as the binomial distribution, Poisson distribution, etc. We have already seen such types of discrete random variable (the binomial random variable and the Poisson random variable), with the statistical parameters for these random variables. Most random variables are characterized by the nature of their probability mass function; now we will see some more types of discrete random variables and their statistical parameters.

Geometric Random variable and its distribution

A geometric random variable is the random variable assigned to independent trials performed until the occurrence of the first success after consecutive failures, i.e. if we perform an experiment n times, getting failures in the first n − 1 trials and success on the n-th. The probability mass function of such a discrete random variable is

P{X = n} = (1 − p)^{n−1} p, n = 1, 2, ...

For this random variable, the necessary condition on the outcomes of the independent trials is that all the initial results must be failures before the first success.

Thus, in brief, a random variable which follows the above probability mass function is known as a geometric random variable.

It is easily observed that the sum of these probabilities is 1, as required for a probability mass function:

Σ_{n=1}^{∞} P{X = n} = Σ_{n=1}^{∞} (1 − p)^{n−1} p = p/(1 − (1 − p)) = 1

Thus the distribution of the geometric random variable with such a probability mass function is the geometric distribution.

Know more about Continuous random variable

Expectation of Geometric random variable

As the expectation is one of the important parameters of a random variable, the expectation of the geometric random variable is

E[X] = 1/p

where p is the probability of success.

since

E[X] = Σ_{n=1}^{∞} n q^{n−1} p

letting the probability of failure be q = 1 − p, so

E[X] = Σ_{n=1}^{∞} (n − 1 + 1) q^{n−1} p

     = Σ_{n=1}^{∞} (n − 1) q^{n−1} p + Σ_{n=1}^{∞} q^{n−1} p

     = q Σ_{m=1}^{∞} m q^{m−1} p + 1   (substituting m = n − 1)

E[X] = qE[X] + 1

(1 − q)E[X] = 1

pE[X] = 1

thus we get

E[X] = 1/p

Thus the expected value or mean of a geometric random variable is simply the inverse of the probability of success.

To get details about Normal Random Variable

Variance and standard deviation of the geometric random variable

In a similar way we can obtain the other important statistical parameters, the variance and the standard deviation, for the geometric random variable, and they are

Var(X) = (1 − p)/p²

and

σ = √(1 − p)/p

To obtain these values we use the relation

Var(X) = E[X²] − (E[X])²

So let us first calculate

E[X²]

Set q = 1 − p; then

E[X²] = Σ_{n=1}^{∞} n² q^{n−1} p = Σ_{n=1}^{∞} (n − 1 + 1)² q^{n−1} p

      = Σ_{n=1}^{∞} (n − 1)² q^{n−1} p + Σ_{n=1}^{∞} 2(n − 1) q^{n−1} p + Σ_{n=1}^{∞} q^{n−1} p

      = qE[X²] + 2qE[X] + 1

so

(1 − q)E[X²] = 2q/p + 1

E[X²] = (2q + p)/p² = (q + 1)/p²

thus we have

Var(X) = E[X²] − (E[X])² = (q + 1)/p² − 1/p² = q/p² = (1 − p)/p²
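A Monte Carlo sketch (illustrative p = 0.3) supports both formulas; numpy's geometric sampler counts trials up to and including the first success, matching the definition above:

```python
# Monte Carlo sketch for the geometric results E[X] = 1/p, Var(X) = (1-p)/p^2.
import numpy as np

p = 0.3
rng = np.random.default_rng(2)
x = rng.geometric(p, size=200_000)        # trials until the first success

print(round(x.mean(), 3), 1 / p)          # ~3.333 = 1/p
print(round(x.var(), 3), (1 - p) / p**2)  # ~7.778 = (1 - p)/p^2
```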

Negative Binomial Random Variable

This random variable falls under another discrete distribution because of the nature of its probability mass function: in the negative binomial random variable and its distribution, in n trials of an independent experiment, r successes must be obtained, with the r-th success occurring on the n-th trial:

P{X = n} = C(n − 1, r − 1) p^r (1 − p)^{n−r}, n = r, r + 1, ...

In other words, a random variable with the above probability mass function is a negative binomial random variable with parameters (r, p). Note that if we restrict r = 1, the negative binomial distribution reduces to the geometric distribution. We can specifically check that the total probability is one:

Σ_{n=r}^{∞} C(n − 1, r − 1) p^r (1 − p)^{n−r} = 1

Expectation, Variance and standard deviation of the negative binomial random variable

The expectation and variance of the negative binomial random variable will be

E[X] = r/p and Var(X) = r(1 − p)/p²

With the help of the probability mass function of the negative binomial random variable and the definition of expectation, we can write

E[X^k] = (r/p) E[(Y − 1)^{k−1}]

where Y is nothing but a negative binomial random variable with parameters (r + 1, p). Now, putting k = 1, we get

E[X] = r/p

and putting k = 2 gives E[X²] = (r/p) E[Y − 1] = (r/p)((r + 1)/p − 1); thus for the variance

Var(X) = E[X²] − (E[X])² = (r/p)((r + 1)/p − 1) − (r/p)² = r(1 − p)/p²

Example: If a die is thrown repeatedly until the face 5 has appeared 4 times, find the expectation and variance of the number of throws. Since the random variable associated with this independent experiment is a negative binomial random variable with r = 4 and probability of success p = 1/6 of getting 5 in one throw,

as we know, for a negative binomial random variable,

E[X] = r/p = 4/(1/6) = 24  and  Var(X) = r(1-p)/p² = 4 × (5/6)/(1/6)² = 120
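A small Monte Carlo sketch (plain Python; the run count is an arbitrary choice) can confirm these two values by repeatedly throwing a simulated die until the fourth 5 appears:

```python
import random

# Simulate the example: throw a die until the face 5 appears r = 4 times,
# and compare the sample mean/variance with r/p = 24 and r(1-p)/p^2 = 120.
r, p, runs = 4, 1 / 6, 100_000
samples = []
for _ in range(runs):
    successes = trials = 0
    while successes < r:
        trials += 1
        if random.random() < p:
            successes += 1
    samples.append(trials)

mean = sum(samples) / runs
var = sum((x - mean) ** 2 for x in samples) / runs
print(mean, var)   # ~ 24 and ~ 120
```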

Hypergeometric random variable

If we choose a sample of size n from a total of N objects, of which m are of one type and N-m of another, then the random variable counting how many objects of the first type are selected has the probability mass function

P(X = k) = C(m, k) C(N-m, n-k) / C(N, n),   k = 0, 1, …, n

For example, suppose from a sack containing N books, of which m are mathematics and N-m are physics books, a sample of n books is taken randomly without replacement. If we assign the random variable to denote the number of mathematics books selected, then the probability mass function for such a selection is as given above.

  In other words the random variable with the above probability mass function is known to be the hypergeometric random variable.


Example: Of the lots of some electronic components, 30% of the lots have four defective components and 70% have one defective component; the size of a lot is 10. To accept a lot, three components are chosen at random and checked, and the lot is accepted if all three are non-defective. Calculate what percent of the lots get rejected.

Here consider A as the event that the lot is accepted; conditioning on the two lot types,

P(A) = 0.3 × P(no defective chosen | 4 defectives) + 0.7 × P(no defective chosen | 1 defective)

For N = 10, m = 4, n = 3:

P(X = 0) = C(4, 0)C(6, 3)/C(10, 3) = 20/120 = 1/6

for N = 10, m = 1, n = 3:

P(X = 0) = C(1, 0)C(9, 3)/C(10, 3) = 84/120 = 7/10

so P(A) = 0.3 × (1/6) + 0.7 × (7/10) = 0.54, and thus 46% of the lots will be rejected.
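The same figure can be checked directly from the hypergeometric probability mass function; the sketch below (standard library only) mixes over the two lot types:

```python
from math import comb

def hyper_pmf(k, N, m, n):
    # P(X = k): k items of the first type in a sample of n drawn
    # without replacement from N items, m of which are of that type.
    return comb(m, k) * comb(N - m, n - k) / comb(N, n)

# P(accept) = P(no defective among the 3 checked), mixed over the lot types.
p_accept = 0.3 * hyper_pmf(0, 10, 4, 3) + 0.7 * hyper_pmf(0, 10, 1, 3)
print(p_accept, 1 - p_accept)   # 0.54 and 0.46 -> 46% rejected
```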

Expectation, Variance and standard deviation of the hypergeometric random variable

The expectation and variance for the hypergeometric random variable with parameters n, m and N are

E[X] = nm/N  and  Var(X) = (nm/N)[(n-1)(m-1)/(N-1) + 1 - nm/N]

or, for large values of N,

Var(X) ≈ np(1-p)  with p = m/N

and standard deviation is the square root of the variance.

By considering the definition of the probability mass function of the hypergeometric random variable and the definition of expectation we can write

E[X^k] = Σ_{i=0}^{n} i^k C(m, i) C(N-m, n-i) / C(N, n)

Here, by using the relations and identities of combinations, namely i·C(m, i) = m·C(m-1, i-1) and n·C(N, n) = N·C(N-1, n-1), we have

E[X^k] = (nm/N) E[(Y + 1)^{k-1}]

where Y plays the role of a hypergeometric random variable with the respective parameters n-1, m-1 and N-1. Now if we put k = 1 we get

E[X] = nm/N

and for k = 2

E[X²] = (nm/N) E[Y + 1] = (nm/N)[(n-1)(m-1)/(N-1) + 1]

so the variance is

Var(X) = (nm/N)[(n-1)(m-1)/(N-1) + 1 - nm/N]

For p = m/N this simplifies to

Var(X) = np[(n-1)(m-1)/(N-1) + 1 - np] = np(1-p)(1 - (n-1)/(N-1))

and for very large values of N it obviously becomes

Var(X) ≈ np(1-p)
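This limiting behaviour can be seen numerically; in the sketch below (the values n = 5 and p = 0.3 are arbitrary) the hypergeometric variance, computed straight from the pmf, approaches np(1-p) as N grows while p = m/N is held fixed:

```python
from math import comb

def hyper_mean_var(N, m, n):
    # Mean and variance computed directly from the hypergeometric pmf.
    pmf = [comb(m, k) * comb(N - m, n - k) / comb(N, n) for k in range(n + 1)]
    mean = sum(k * p for k, p in enumerate(pmf))
    var = sum(k * k * p for k, p in enumerate(pmf)) - mean**2
    return mean, var

n, p = 5, 0.3
for N in (20, 200, 2000):
    m = int(p * N)           # keeps p = m/N exactly 0.3 here
    print(N, hyper_mean_var(N, m, n), (n * p, n * p * (1 - p)))
# the variance column approaches np(1-p) = 1.05 as N grows
```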

Zeta (Zipf) random variable

A discrete random variable X is said to be a Zeta (Zipf) random variable if its probability mass function is given by

P(X = k) = C / k^{α+1},   k = 1, 2, …

for positive values of α, where C is the normalizing constant that makes the probabilities sum to one.

In the similar way we can find the values of the expectation, variance and standard deviation.

In a similar way, by using just the definition of the probability mass function and the mathematical expectation, we can summarize a number of properties for each discrete random variable, for example the expected value of a sum of random variables: for random variables

X1, X2, X3, …, Xn

E[X1 + X2 + … + Xn] = E[X1] + E[X2] + … + E[Xn]
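A tiny simulation (two fair dice, an arbitrary choice) illustrates this additivity of expectations:

```python
import random

# E[X1 + X2] = E[X1] + E[X2]: the mean of the sum of two fair dice
# should be close to 3.5 + 3.5 = 7.
runs = 100_000
totals = [random.randint(1, 6) + random.randint(1, 6) for _ in range(runs)]
print(sum(totals) / runs)   # ~ 7.0
```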

Conclusion:

In this article we mainly focused on some additional discrete random variables, their probability mass functions, distributions and the statistical parameters mean or expectation, standard deviation and variance. A brief introduction and simple examples were discussed just to give the idea; the detailed study remains. In the next articles we will move on to continuous random variables and related concepts; if you want further reading then go through the suggested links below. For more topics on mathematics, please follow this link.

Schaum’s Outlines of Probability and Statistics

https://en.wikipedia.org/wiki/Probability

Binomial Random Variable: 3 Interesting Facts To Know


Binomial & Poisson random variable and its properties

The random variable which deals with the success and failure outcomes of a random experiment repeated n times is known as the binomial random variable; the definition of its probability mass function deals only with the probability of success p and the probability of failure q. The definition with examples we have already seen; now, with that understanding, we will see some of the properties of this discrete random variable.

Expectation and Variance of the binomial random variable

The expectation and variance of a binomial random variable with n repetitions and p as the probability of success are

E[X]= np

and Var(X) = np(1-p)

Now consider, for showing these two results, the expectation of the random variable raised to the power k, by following the definition of the probability mass function for the binomial random variable and the identity i·C(n, i) = n·C(n-1, i-1):

E[X^k] = Σ_{i=0}^{n} i^k C(n, i) p^i (1-p)^{n-i} = np Σ_{i=1}^{n} i^{k-1} C(n-1, i-1) p^{i-1} (1-p)^{n-i} = np E[(Y + 1)^{k-1}]

where Y is another binomial random variable with n-1 trials and p as the probability of success. If we take the value k = 1, then we get

E[X]= np

and if we substitute k=2 we will get

E[X²] = np E[Y + 1]

= np[(n-1)p + 1]

so we easily get

Var(X) = E[X²] - (E[X])²

= np[(n-1)p + 1] - (np)²

= np(1-p)
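These two results are easy to confirm numerically from the pmf itself; in this sketch n = 10 and p = 0.3 are arbitrary choices:

```python
from math import comb

# Check E[X] = np and Var(X) = np(1-p) directly from the binomial pmf.
n, p = 10, 0.3
pmf = [comb(n, i) * p**i * (1 - p)**(n - i) for i in range(n + 1)]
EX  = sum(i * q for i, q in enumerate(pmf))
EX2 = sum(i * i * q for i, q in enumerate(pmf))
print(EX, n * p)                      # 3.0
print(EX2 - EX**2, n * p * (1 - p))   # 2.1
```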

Example: An unbiased coin is tossed 100 times; for the number of tails appearing, find the mean, variance and standard deviation of such an experiment.

Tails on one toss has the probability of success p = 1/2 = 0.5

so the mean of such an experiment is

E[X] = np

since the experiment is binomial, having only success or failure over n repetitions,

μ = np = 100 × (0.5) = 50

Similarly the variance and the standard deviation will be

Var(X) = σ² = np(1-p)

σ = √(np(1-p))

The values would be

σ² = (100)(0.5)(0.5) = 25  and  σ = 5

Example: Find the mean and standard deviation of the number of defective bolts in a lot of 400 bolts from a manufacturing company whose probability of defectiveness is 0.1.

Here n = 400 and p = 0.1, so the mean is np = 400 × 0.1 = 40.

Since

σ² = np(1-p) = 400 × 0.1 × 0.9 = 36

the standard deviation will be

σ = 6

Example: Find the probability of exactly, fewer than, and at least 2 successes if the mean and variance of a binomial random variable are 4 and 2 respectively.

Since mean = np= 4

and variance = np(1-p) = 2,

so 4(1-p)=2

(1-p)=1/2

p=1-(1/2)

putting this value in mean we get

np = 4

n(1/2)=4

n=8

The probability of exactly 2 successes will be

P(X = 2) = 8C2 p²q⁶ = 28 × (1/2)² × (1/2)⁶ = 28/256 = 7/64

The probability of fewer than 2 successes will be

P(X < 2) = P(0) + P(1) = 8C0 p⁰q⁸ + 8C1 p¹q⁷

= (1/256) + 8 × (1/2) × (1/2)⁷ = 9/256

The probability of at least 2 successes is

P(X ≥ 2) = 1 - P(X < 2)

= 1 - [P(0) + P(1)] = 1 - (9/256) = 247/256
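The three answers can be verified with a few lines of standard-library Python:

```python
from math import comb

# n = 8, p = 1/2, as derived above.
n, p = 8, 0.5
P = lambda k: comb(n, k) * p**k * (1 - p)**(n - k)

print(P(2))                 # 28/256 = 7/64, exactly 2 successes
print(P(0) + P(1))          # 9/256, fewer than 2 successes
print(1 - P(0) - P(1))      # 247/256, at least 2 successes
```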

Poisson Random Variable

The discrete random variable X taking the values 0, 1, 2, … is known as a Poisson random variable if, for some λ > 0, its probability mass function is

p(i) = P(X = i) = e^{-λ} λ^i / i!,   i = 0, 1, 2, …

This is a genuine probability mass function, as

Σ_{i=0}^{∞} e^{-λ} λ^i / i! = e^{-λ} Σ_{i=0}^{∞} λ^i / i! = e^{-λ} e^{λ} = 1

When n is very large and the probability of success p is very small, the Poisson random variable with its probability mass function becomes a good approximation of the binomial random variable with the respective p.m.f., because the expectation np in this case is moderate, and we take λ = np.

Example: Find the probability that there is at least one typing error on a page of a book, given that the number of errors on a single page follows a Poisson distribution with mean 1/2.

Let the discrete random variable X denote the errors on the page; then X has the probability mass function

P(X = i) = e^{-λ} λ^i / i!  with  λ = 1/2

P(X ≥ 1) = 1 - P(X = 0) = 1 - e^{-1/2} ≈ 0.393

Example: Find the probability that a sample of 10 items produced by a machine with a 0.1 chance of defective production has at most one defective item.

This we can solve both by the binomial probability mass function and by the Poisson probability mass function, so we solve it by Poisson with λ = np = 10 × 0.1 = 1:

P(X ≤ 1) = P(0) + P(1) = e^{-1} + e^{-1} = 2/e ≈ 0.7358
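To see how close the approximation is here, the sketch below evaluates P(X ≤ 1) both ways:

```python
from math import comb, exp, factorial

# Binomial vs. Poisson for P(at most one defective): n = 10, p = 0.1, lambda = 1.
n, p = 10, 0.1
lam = n * p
binomial = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in (0, 1))
poisson  = sum(exp(-lam) * lam**k / factorial(k) for k in (0, 1))
print(binomial)   # ~ 0.7361
print(poisson)    # 2/e ~ 0.7358
```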

Expectation and variance of the Poisson random variable

The expectation and variance of a Poisson random variable with n repetitions and p as the probability of success are

E[X]= np= λ

and          

Var(X) = np= λ

Before showing the result we must keep in mind that the Poisson random variable is an approximation of the binomial random variable with np = λ. Now the expectation, using the probability mass function, is

E[X] = Σ_{i=0}^{∞} i e^{-λ} λ^i / i!

= Σ_{i=1}^{∞} e^{-λ} λ^i / (i-1)!

= λ e^{-λ} Σ_{i=1}^{∞} λ^{i-1} / (i-1)!

= λ e^{-λ} e^{λ} = λ

This means the mathematical expectation of the Poisson random variable equals its parameter. Similarly, for calculating the variance and standard deviation of the Poisson random variable we require the expectation of the square of X, so

E[X²] = Σ_{i=0}^{∞} i² e^{-λ} λ^i / i!

= λ Σ_{i=1}^{∞} i e^{-λ} λ^{i-1} / (i-1)!

= λ Σ_{j=0}^{∞} (j+1) e^{-λ} λ^j / j!   (putting j = i-1)

= λ [Σ_{j=0}^{∞} j e^{-λ} λ^j / j! + Σ_{j=0}^{∞} e^{-λ} λ^j / j!] = λ(λ + 1)

The last step is evident, since the two sums are the expectation and the sum of the probabilities respectively.

Thus the value of the variance we get is

Var(X) = E[X²] - (E[X])² = λ(λ + 1) - λ² = λ

so in the case of the Poisson random variable the mean and variance have the same value, i.e. the parameter λ = np.

The Poisson random variable is a good approximation for diverse processes, e.g. finding the number of earthquake occurrences within some specific time duration, the number of electrons emitted from a heated cathode during a fixed time, the possible number of deaths during a specified time, or the number of wars within a specific year, etc.

Example: Calculate the probability that the total number of passengers arriving in two days is less than 2, if the number of passengers arriving per day follows a Poisson random variable with mean 5, i.e. λ = np = 5 and

P(X = i) = e^{-5} 5^i / i!

If the number of passengers in two days is to be less than 2, the possible cases are

First day   Second day   In total
0           0            0
0           1            1
1           0            1

so the probability will be the combination of these two independent days:

P(0)P(0) + P(0)P(1) + P(1)P(0)

= e^{-5}e^{-5} + e^{-5}(5e^{-5}) + (5e^{-5})e^{-5}

= e^{-10}[1 + 5 + 5]

= 11e^{-10}

= 11 × 4.54 × 10^{-5}

= 4.994 × 10^{-4}
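The same answer follows more directly from the additivity of independent Poisson variables: the two-day total is itself Poisson with mean 5 + 5 = 10, so P(total < 2) = e^{-10}(1 + 10). A one-line check:

```python
from math import exp

# Two-day total ~ Poisson(10), so P(total < 2) = P(0) + P(1) = e^-10 * (1 + 10).
lam = 10
print(exp(-lam) * (1 + lam))   # ~ 4.994e-04, matching the case-by-case sum
```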

Example: Calculate the probability of 4 or more faulty condensers in a pack of 100 condensers, given that the manufacturing defect rate for condensers is 1%.

Here p = 1% = 0.01 and n = 100, so we can use the Poisson random variable's probability mass function with

mean λ = np = 100 × 0.01 = 1

P(X = i) = e^{-1} 1^i / i! = e^{-1} / i!

so the probability of 4 or more faulty condensers will be

P(X ≥ 4) = 1 - [P(0) + P(1) + P(2) + P(3)]

= 1 - e^{-1}[1 + 1 + 1/2 + 1/6] = 1 - (8/3)e^{-1} ≈ 0.019

Example: If there is a 0.002 chance for a product to be defective in manufacturing, and the products are sold in packs of 10, what is the expected number of packets with no defective, one defective, and two defective products in a consignment of 50000 packets?

Here, for a single pack, the probability of a defect is p = 0.002 and n = 10,

so the mean is λ = np = 0.002 × 10 = 0.020 and

P(X = i) = e^{-0.02} (0.02)^i / i!

We find each case as

P(0) = e^{-0.02} ≈ 0.9802,  so 50000 × 0.9802 ≈ 49010 packets

P(1) = 0.02 e^{-0.02} ≈ 0.0196,  so 50000 × 0.0196 ≈ 980 packets

P(2) = ((0.02)²/2) e^{-0.02} ≈ 0.000196,  so 50000 × 0.000196 ≈ 10 packets

So it is clear that the numbers of packets with zero, one and two defective products will be approximately 49010, 980 and 10 respectively.
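These expected counts follow from multiplying each Poisson probability by the 50000 packets, as in this sketch:

```python
from math import exp, factorial

# Expected number of packets (out of 50000) containing 0, 1, 2 defectives.
lam, packets = 0.02, 50_000
for k in range(3):
    prob = exp(-lam) * lam**k / factorial(k)
    print(k, round(packets * prob))   # ~ 49010, 980, 10
```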

Conclusion:

In this article we discussed some properties of the binomial random variable and the Poisson random variable in connection with a random experiment. The distributions, probability mass functions, expectation, variance and standard deviation were illustrated with examples for better understanding. In the next articles we will try to cover some more discrete random variables; if you want further reading then go through the Mathematics Page.

Schaum’s Outlines of Probability and Statistics

https://en.wikipedia.org/wiki/Probability

Probability Mass Function: 5 Examples

Discrete Random Variable and Mathematical Expectation-II

As we are already familiar with the discrete random variable, the random variable which takes a countable number of possible values in a sequence, the two important concepts related to discrete random variables are the probability of the discrete random variable and the distribution function; we restrict the names of such probability and distribution function as follows.

Probability Mass function (p.m.f)

The probability mass function is the probability of the discrete random variable, so for any discrete random variable's values x1, x2, x3, x4, …, xk the corresponding probabilities P(x1), P(x2), P(x3), P(x4), …, P(xk) are the corresponding probability mass functions.

Specifically, for X = a, P(a) = P(X = a) is its p.m.f.

We hereafter use the term probability mass function for the probability of discrete random variables. All the characteristics of probability obviously apply to the probability mass function, such as positivity and all values of the p.m.f. summing to one, etc.

Cumulative Distribution Function (c.d.f)/Distribution Function

The distribution function defined as

F(x) = P(X ≤ x)

for a discrete random variable with probability mass function is the cumulative distribution function (c.d.f.) of the random variable,

and the mathematical expectation for such a random variable we defined as

E[X] = Σ_x x P(x)

we now see some of the results of mathematical expectations

  1. If x1, x2, x3, x4, … are the discrete random variable's values with respective probabilities P(x1), P(x2), P(x3), P(x4), …, then the expectation of a real-valued function g is

E[g(X)] = Σ_i g(x_i) P(x_i)

Example: For the following probability mass function, find E(X³):

x:     -1    0    1

P(x):  0.2  0.5  0.3

Here g(X) = X³, so

E[X³] = Σ_i x_i³ P(x_i)

E[X³] = (-1)³ × 0.2 + (0)³ × 0.5 + (1)³ × 0.3

E[X³] = 0.1
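The same computation, done programmatically over the pmf:

```python
# E[g(X)] for g(X) = X^3 over the pmf of the example above.
pmf = {-1: 0.2, 0: 0.5, 1: 0.3}
EX3 = sum(x**3 * p for x, p in pmf.items())
print(EX3)   # ~ 0.1
```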

In a similar way, for any nth order we can write

E[X^n] = Σ_i x_i^n P(x_i)

which is known as the nth moment.

2. If a and b are constants then

E[aX + b]=aE[X] + b

This we can understand easily as

E[aX + b] = Σ_i (ax_i + b) P(x_i)

= a Σ_i x_i P(x_i) + b Σ_i P(x_i)

= aE[X] + b

Variance in terms of Expectation.

For the mean denoted by μ, the variance of the discrete random variable X, denoted by Var(X) or σ², in terms of expectation will be

Var(X) = E[(X - μ)²]

and this we can further simplify as

Var(X) = E[(X - μ)²]

= Σ_i (x_i - μ)² P(x_i)

= Σ_i (x_i² - 2μx_i + μ²) P(x_i)

= Σ_i x_i² P(x_i) - 2μ Σ_i x_i P(x_i) + μ² Σ_i P(x_i)

= E[X²] - 2μ² + μ²

= E[X²] - μ²

this means we can write the variance as the difference between the expectation of the square of the random variable and the square of the expectation of the random variable,

i.e. Var(X) = E[X²] - (E[X])²

Example: When a die is thrown, calculate the variance.

Solution: Here we know that when a die is thrown the probabilities for each face are

p(1) = p(2) = p(3) = p(4) = p(5) = p(6) = 1/6

hence for calculating the variance we find the expectation of the random variable and of its square as

E[X] = 1·(1/6) + 2·(1/6) + 3·(1/6) + 4·(1/6) + 5·(1/6) + 6·(1/6) = 7/2

E[X²] = 1²·(1/6) + 2²·(1/6) + 3²·(1/6) + 4²·(1/6) + 5²·(1/6) + 6²·(1/6) = 91/6

and we just obtained the variance as

Var(X) = E[X²] - (E[X])²

so

Var(X) = (91/6) - (7/2)² = 35/12
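Exact fractions make the same calculation transparent:

```python
from fractions import Fraction

# Variance of a fair die via Var(X) = E[X^2] - (E[X])^2, in exact arithmetic.
faces = range(1, 7)
EX  = sum(Fraction(x, 6) for x in faces)        # 7/2
EX2 = sum(Fraction(x * x, 6) for x in faces)    # 91/6
print(EX2 - EX**2)   # 35/12
```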

One of the important identities for the variance is:

  1. For arbitrary constants a and b we have

Var(aX + b) = a² Var(X)

This we can show easily as

Var(aX + b) = E[(aX + b - aμ - b)²]

= E[a²(X - μ)²]

= a² E[(X - μ)²]

= a² Var(X)

Bernoulli Random variable

The Swiss mathematician James Bernoulli defined the Bernoulli random variable as a random variable having either success or failure as the only two outcomes of a random experiment,

i.e When the outcome is success X=1

When the outcome is failure X=0

So the probability mass function for the Bernoulli random variable is

p(0) = P{X=0}=1-p

p(1) =P{X=1}=p

where p is the probability of success and 1-p will be the probability of failure.

Here we can take 1-p=q also where q is the probability of failure.

As this type of random variable is obviously discrete so this is one of discrete random variable.

Example: Tossing a coin.

Binomial Random Variable

If, for a random experiment having only success or failure as outcomes, we perform n trials, so that each time we get either success or failure, then the random variable X representing the number of successes of such an n-trial random experiment is known as a binomial random variable.

In other words, if p is the probability of success in a single Bernoulli trial and q = 1 - p is the probability of failure, then the probability of the event happening 'x or i' times in n trials is

p(i) = nCi p^i (1-p)^{n-i},   i = 0, 1, 2, …, n

Example: If we toss two coins six times, with getting heads counted as success and the remaining occurrences as failures, then the probability of x successes among the six repetitions, where p is the probability of success on one repetition, is

P(X = x) = 6Cx p^x (1-p)^{6-x}

and in a similar way we can calculate for any such experiment.

The binomial random variable has the name 'binomial' because its probabilities are the successive terms of the binomial expansion

(p + q)^n = Σ_{i=0}^{n} nCi p^i q^{n-i}

If we put n = 1, this turns into the Bernoulli random variable.

Example: If five coins are tossed and the outcomes are taken to be independent, what are the probabilities for the number of heads that occur?

Here, if we take the random variable X as the number of heads, then X turns into a binomial random variable with n = 5 and probability of success ½.

So by following the probability mass function for the binomial random variable we will get

P(X = 0) = 5C0 (1/2)⁵ = 1/32

P(X = 1) = 5C1 (1/2)⁵ = 5/32

P(X = 2) = 5C2 (1/2)⁵ = 10/32

P(X = 3) = 5C3 (1/2)⁵ = 10/32

P(X = 4) = 5C4 (1/2)⁵ = 5/32,   P(X = 5) = 5C5 (1/2)⁵ = 1/32

Example:

In a certain company the probability that a product from production is defective is 0.01. The company manufactures and sells the product in packs of 10 and offers its customers a money-back guarantee that at most 1 of the 10 products is defective; what proportion of sold packs must the company replace?

Here, if X is the random variable representing the number of defective products, then it is of the binomial type with n = 10 and p = 0.01, and the probability that a pack will be returned is

P(X > 1) = 1 - P(X = 0) - P(X = 1) = 1 - (0.99)¹⁰ - 10(0.01)(0.99)⁹ ≈ 0.004
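A direct evaluation of this proportion (standard library only):

```python
from math import comb

# Proportion of packs replaced: P(X >= 2) for X ~ Binomial(10, 0.01).
n, p = 10, 0.01
P = lambda k: comb(n, k) * p**k * (1 - p)**(n - k)
print(1 - P(0) - P(1))   # ~ 0.0043, i.e. roughly 0.4% of packs
```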

Example (chuck-a-luck / wheel of fortune): In a specific game of fortune in a hotel, a player bets on any of the numbers from 1 to 6; three dice are then rolled, and if the number bet on by the player appears once, twice or three times the player wins that many units, i.e. 1 unit if it appears once, 2 units if on two dice and 3 units if on three dice, while he loses 1 unit if it does not appear at all. Check, with the help of probability, whether the game is fair for the player or not.

If we assume there are no unfair means with the dice and no con techniques, then, taking the outcomes of the dice to be independent, the probability of success for each die is 1/6 and of failure 1 - 1/6 = 5/6, so this turns out to be an example of a binomial random variable with n = 3.

So first we calculate the winning probabilities, assigning X as the player's winnings:

P(X = -1) = 3C0 (1/6)⁰(5/6)³ = 125/216

P(X = 1) = 3C1 (1/6)(5/6)² = 75/216

P(X = 2) = 3C2 (1/6)²(5/6) = 15/216

P(X = 3) = 3C3 (1/6)³ = 1/216

Now, to check whether the game is fair for the player, we calculate the expectation of the random variable:

E[X] = (-125 + 75 + 30 + 3)/216

= -17/216

This means that on average the player loses 17 units in every 216 games, so the game is not fair for the player.
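The expectation can be reproduced in exact fractions:

```python
from fractions import Fraction

# Expected winnings per game of chuck-a-luck.
p, q = Fraction(1, 6), Fraction(5, 6)
pmf = {-1: q**3, 1: 3 * p * q**2, 2: 3 * p**2 * q, 3: p**3}
print(sum(x * pr for x, pr in pmf.items()))   # -17/216
```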

Conclusion:

In this article we discussed some of the basic properties of a discrete random variable, the probability mass function and the variance. In addition we saw some types of discrete random variables; before we start on the continuous random variable we will try to cover all the types and properties of the discrete random variable. If you want further reading then go through:

Schaum’s Outlines of Probability and Statistics

https://en.wikipedia.org/wiki/Probability

For more Topics on Mathematics, please follow this link

The Conditional Probability: 7 Interesting Facts To Know


Conditional probability

Conditional probability theory came out of the concept of taking huge risks; many issues nowadays stem from games of chance, such as tossing coins, throwing dice and playing cards.

Conditional probability theory is applied in many different domains, and the flexibility of conditional probability provides tools for many different needs; probability theory and the samples related to it concern the study of the probability of the happening of events.

Consider X and Y, two events of a random experiment. Then the probability of the happening of X under the circumstance that Y has already happened, with P(Y) ≠ 0, is known as the conditional probability and is denoted by P(X/Y).

Therefore P(X/Y) = the probability of the happening of X, provided that Y has already happened:

P(X/Y) = P(X ⋂ Y)/P(Y) = n(X ⋂ Y)/n(Y)

Similarly, P(Y/X) = the probability of the occurrence of Y, given that X has already happened:

P(Y/X) = P(X ⋂ Y)/P(X) = n(X ⋂ Y)/n(X)

In brief, P(X/Y) specifies the probability of the occurrence of X when Y occurs, and similarly P(Y/X) specifies the probability of Y happening when X happens.

What is Multiplication theorem on Probability?

If X and Y are two events of a random experiment, then

P(X ∩ Y) = P(X) · P(Y/X), if P(X) ≠ 0

P(X ∩ Y) = P(Y) · P(X/Y), if P(Y) ≠ 0

What is the multiplication theorem for independent events?

If X and Y are both self-supporting (independent) events connected to a random experiment, then P(X ∩ Y) = P(X)·P(Y),

i.e., the probability of the simultaneous happening of two independent events is equal to the product of their probabilities. By the multiplication theorem we have P(X ∩ Y) = P(X)·P(Y/X),

and as X and Y are independent events, P(Y/X) = P(Y),

which implies P(X ∩ Y) = P(X)·P(Y).

When events are mutually exclusive:

If X and Y are mutually exclusive events, then n(X ∩ Y) = 0 and P(X ∩ Y) = 0, so

P(X U Y)=P(X) +P(Y)

For any three events X, Y, Z which are mutually exclusive, 

P(X ∩ Y)= P(Y ∩ Z) =P(Z ∩ X) =P(X ∩ Y ∩ Z) =0

P (X ⋃ Y ⋃ Z) = P(X) + P(Y) + P(Z)

When events are independent:

If X and Y are unconstrained (or independent) events, then

P(X ∩ Y) = P(X).P(Y)

P(X U Y) = P(X) + P(Y) – P(X). P(Y)

Let X and Y be two events connected with a random experiment; then

P(X ∩ Y′) = P(X) - P(X ∩ Y)

P(X′ ∩ Y) = P(Y) - P(X ∩ Y)

If Y ⊂ X, then

(a) P(X ∩ Y′) = P(X) - P(Y)

(b) P(Y) ≤ P(X)

Similarly, if X ⊂ Y, then

(a) P(X′ ∩ Y) = P(Y) - P(X)

(b) P(X) ≤ P(Y)

The probability of the occurrence of neither X nor Y is

P(X′ ∩ Y′) = 1 - P(X ∪ Y)

Example: If a single card is picked from a pack of cards, what is the chance that it is either a spade or a king?

solution:

P (A) = P (a spade card) =13/52

P (B) = P (a king card) =4/52

P (either a spade or a king card) = P (A or B)

=P(A∪B)=P(A)+P(B)-P(A∩B)

=P(A)+P(B)-P(A)P(B)

=13/52+4/52-{(13/52)*(4/52)}

=4/13

Example: One person is known to hit a target in 3 out of 4 chances, while another person is known to hit the target in 2 out of 3 chances. Find the probability that the target is hit at all when both of them try.

solution:

probability of the target being hit by the first person = P(A) = 3/4

probability of the target being hit by the second person = P(B) = 2/3

The two events are not mutually exclusive, since both persons may hit the same target; as the attempts are independent, the required probability is P(A or B)

=P(A∪B)=P(A)+P(B)-P(A∩B)

=P(A)+P(B)-P(A)P(B)

=3/4+2/3-{(3/4)*(2/3)}

=11/12

Example: If A and B are two events such that P(A) = 0.4, P(A+B) = 0.7 and P(AB) = 0.2, then what is P(B)?

solution: Since we have P(A+B)=P(A) +P(B) -P(AB)

=> 0.7=0.4+ P(B)-0.2

=> P(B) =0.5

Example: A card is selected at random from a pack of cards. What is the probability of the card being a red colour card or a queen?

Solution: The required probability is

P(Red ∪ Queen) = P(Red) + P(Queen) - P(Red ⋂ Queen)

= 26/52 + 4/52 - 2/52 = 28/52 = 7/13

Example: If the probability of X failing the test is 0.3 and the probability of Y failing is 0.2, then find the probability that X or Y failed the test.

Solution: Here P(X)=0.3 , P(Y)=0.2

Now P(X ∪ Y)= P(X) +P(Y) -P(X ⋂ Y)

Since these are independent events, so

P(X ⋂ Y) =P(X) . P(Y)

Thus required probability is 0.3+0.2 -0.06=0.44

Example: The chance of failing in Physics is 20% and the chance of failing in Mathematics is 10%. What is the chance of failing in at least one subject?

Solution: Let P(A) = 20/100 = 1/5 and P(B) = 10/100 = 1/10.

Since the events are independent, we have to find

P(A ∪ B) = P(A) + P(B) - P(A)·P(B)

= (1/5) + (1/10) - (1/5)·(1/10) = (3/10) - (1/50) = 14/50

So the chance of failing in at least one subject is (14/50) × 100 = 28%.

Example: The probabilities of a question being solved by each of three students are 1/2, 1/4 and 1/6 respectively. What is the chance that the question is answered?

Solution: The question is answered if

(i) it is solved by exactly one student,

(ii) it is answered by two students concurrently, or

(iii) it is answered by all three students together.

With P(A) = 1/2, P(B) = 1/4, P(C) = 1/6 and the events independent,

P(A ∪ B ∪ C) = P(A) + P(B) + P(C) - [P(A)·P(B) + P(B)·P(C) + P(C)·P(A)] + P(A)·P(B)·P(C)

= (1/2) + (1/4) + (1/6) - [(1/2)(1/4) + (1/4)(1/6) + (1/6)(1/2)] + (1/2)(1/4)(1/6) = 33/48

Example: A random variable X has the probability distribution

X:     1     2     3     4     5     6     7     8

P(X): 0.15  0.23  0.12  0.10  0.20  0.08  0.07  0.05

For the events E = {X is a prime number} and F = {X < 4}, find P(E ∪ F).

Solution:

E ={ X is a prime number}

P(E) = P(2) +P(3)+ P(5) +P(7) =0.62

F ={X < 4}, P(F) =P(1)+P(2)+P(3)=0.50

and P(E ⋂ F) = P(2)+ P(3) =0.35

P(E ∪ F) =P(E)+P(F) – P(E ⋂ F)

      = 0.62+0.50 – 0.35 = 0.77

Example: Three coins are tossed. If at least one of them shows a tail, what is the chance that all three coins show tails?

Solution: Consider E the event in which all three coins show tails and F the event in which at least one coin shows a tail.

F= {HHT, HTH, THH, HTT, THT, TTH, TTT}

and E = {TTT}

Required probability = P(E/F) = P(E ⋂ F)/P(F) = (1/8)/(7/8) = 1/7

Total probability and Bayes' rule

The law of total probability:

For a sample space S and n mutually exclusive and exhaustive events E1, E2, …, En associated with a random experiment, if X is a specific event which happens with E1 or E2 or … or En, then

P(X) = Σ_{i=1}^{n} P(Ei) P(X/Ei)

Bayes' rule:

Consider S a sample space and E1, E2, …, En n incongruous (or mutually exclusive) events such that

S = E1 ∪ E2 ∪ … ∪ En  and  Ei ∩ Ej = ∅ for i ≠ j,

and P(Ei) > 0 for i = 1, 2, …, n.

We can think of the Ei as the factors leading to the outcome of an experiment. The probabilities P(Ei), i = 1, 2, …, n, are known as prior probabilities. If the experiment results in an outcome of the event X, where P(X) > 0, then we have to find the probability that the observed event X was due to cause Ei; that is, we look for the conditional probability P(Ei/X). These probabilities are known as posterior probabilities, given by Bayes' rule as

P(Ei/X) = P(Ei) P(X/Ei) / Σ_{k=1}^{n} P(Ek) P(X/Ek)

Example: There are 3 boxes, known to contain 2 blue and 3 green marbles; 4 blue and 1 green marbles; and 3 blue and 7 green marbles respectively. A marble is drawn at random from one of the boxes and found to be green. What is the probability that it was drawn from the box containing the most green marbles?

Solution: Consider the following events:

A -> the marble drawn is green;

E1 -> Box 1 is chosen;

E2 -> Box 2 is chosen;

E3 -> Box 3 is chosen.

Then

P(E1) = P(E2) = P(E3) = 1/3,  P(A/E1) = 3/5,  P(A/E2) = 1/5,  P(A/E3) = 7/10

Required probability = P(E3/A)

= P(E3)P(A/E3) / [P(E1)P(A/E1) + P(E2)P(A/E2) + P(E3)P(A/E3)] = 7/15
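The posterior computation generalizes to any priors and likelihoods; a sketch of Bayes' rule for this example, in exact fractions:

```python
from fractions import Fraction

# P(E3 | green) by Bayes' rule; the three boxes are equally likely a priori.
priors = [Fraction(1, 3)] * 3
likelihoods = [Fraction(3, 5), Fraction(1, 5), Fraction(7, 10)]  # P(green | box i)
evidence = sum(pr * lk for pr, lk in zip(priors, likelihoods))
print(priors[2] * likelihoods[2] / evidence)   # 7/15
```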

Example: In an entrance test there are multiple-choice questions, with four probable answers to each question of which one is correct. The chance that a pupil knows the right answer to a particular question is 90%. If he gets the right answer to a particular question, then what is the chance that he was guessing?

Solution: We define the following events:

A1: he knows the answer;

A2: he does not know the answer (he guesses);

E: he gets the right answer.

P(A1) = 9/10, P(A2) = 1 - 9/10 = 1/10, P(E/A1) = 1,

P(E/A2) = 1/4

So the required probability, by Bayes' rule, is

P(A2/E) = P(A2)P(E/A2) / [P(A1)P(E/A1) + P(A2)P(E/A2)]

= (1/10)(1/4) / [(9/10)(1) + (1/10)(1/4)] = (1/40)/(37/40) = 1/37

Example: Bucket A contains 4 yellow and 3 black marbles and bucket B contains 4 black and 3 yellow marbles. A bucket is taken at random, a marble is drawn from it, and it is noted to be yellow. What is the probability that it came from bucket B?

Solution: This is based on Bayes' theorem.

Probability of picking bucket A: P(A) = 1/2

Probability of picking bucket B: P(B) = 1/2

Probability of a yellow marble picked from bucket A = P(A)·P(G/A) = (1/2) × (4/7) = 2/7

Probability of a yellow marble picked from bucket B = P(B)·P(G/B) = (1/2) × (3/7) = 3/14

Total probability of a yellow marble = (2/7) + (3/14) = 1/2

Probability that the yellow marble was drawn from bucket B:

P(B/G) = P(B)·P(G/B) / {P(A)·P(G/A) + P(B)·P(G/B)} = [(1/2) × (3/7)] / {[(1/2) × (4/7)] + [(1/2) × (3/7)]} = 3/7

Conclusion:

In this article we mainly discussed conditional probability and Bayes' theorem, with examples of the direct and dependent consequences of a trial. In consecutive articles we will relate probability to random variables and discuss some familiar terms related to probability theory; if you want further reading then go through:

Schaum’s Outlines of Probability and Statistics and Wikipedia page.

For further study, please refer our mathematics page.

Permutations And Combinations: 3 Important Facts To Remember

After discussing the definitions and basic concepts we will enlist all the results and relations of permutation and combination; depending on those we will get more familiar with the concepts of permutation and combination by solving miscellaneous examples.

Points to remember (Permutation)

  1. The number of ways of ordering n different objects taken r at a time = nPr = n(n-1)(n-2)…(n-r+1) = n!/(n-r)!
  2. The number of arrangements of n different objects taken all together at a time = nPn = n!
  3. nP0 = n!/n! = 1
  4. nPr = n · n-1Pr-1
  5. 0! = 1
  6. 1/(-r)! = 0, i.e. (-r)! = ∞ (r ∈ N)
  7. The number of ways of filling r places where every place can be filled by any one of n objects = the count of permutations = the number of ways of filling r places = n^r

Example: How many numbers between 999 and 10000 can be generated with the help of the digits 0, 2, 3, 6, 7, 8 if no digit is repeated?

Solution: The numbers between 999 and 10000 are all four-digit numbers.

The number of four-digit arrangements of the six digits 0, 2, 3, 6, 7, 8 is 6P4 = 6!/(6-4)! = 360.

But among these are also the arrangements which begin with 0, and those are really three-digit numbers.

Taking 0 as the initial digit, the number of ways to arrange the remaining 3 places from the five digits 2, 3, 6, 7, 8 is 5P3 = 5!/(5-3)! = 60.

So the required count = 360 - 60 = 300.
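The count is a one-liner with the standard library's permutation function (Python 3.8+):

```python
from math import perm

# Four-digit numbers from the digits 0,2,3,6,7,8 without repetition:
# all arrangements of 4 of the 6 digits, minus those that start with 0.
print(perm(6, 4) - perm(5, 3))   # 360 - 60 = 300
```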

Example: In how many ways can n books be set out in a row so that two particular books are not together?

Solution: The total number of orderings of n different books = n!.

If the two particular books are always together, the number of ways = (n-1)! × 2!, so the number of arrangements in which the two books are not together = n! - 2(n-1)!.

Example: In how many ways can 10 balls be divided between two boys, one getting two and the other eight?

Solution: A gets 2, B gets 8: 10!/(2!8!) = 45

A gets 8, B gets 2: 10!/(8!2!) = 45

so there are 45 + 45 = 90 ways to divide the balls.

Example: Find the number of arrangements of the letters of the word "CALCUTTA".

Solution: The word has 8 letters with C, A and T each repeated twice, so the required number of ways = 8!/(2!2!2!) = 5040.

Example: Twenty people have been invited to a party. In how many different ways can they and the host sit at a round table, if two particular people have to sit on either side of the host?

Solution: There will be 20 + 1 = 21 persons in all.

The two specified persons and the host are considered as one unit, so there remain 21 - 3 + 1 = 19 units to be arranged around the table in 18! ways.

But the two particular persons on either side of the host can themselves be arranged in 2! ways.

Hence there are 2! × 18! ways.

Example: In how many ways can a garland be made from exactly 10 flowers?

Solution: A garland of n flowers can be made in (n-1)!/2 ways (a circular arrangement in which clockwise and anticlockwise orders coincide).

Using 10 flowers, a garland can be prepared in 9!/2 different ways.

Example: Find how many four-digit numbers can be formed from the digits 0, 1, 2, 3, 4, 5, 6, 7 such that each number begins with 1 and does not end in 0.

Solution: After securing 1 at the first position, the remaining 3 of the 4 places can be filled from the other seven digits in 7P3 = 7!/(7-3)! = 5 × 6 × 7 = 210 ways.

But among these are the numbers whose fourth digit is zero; the number of such ways = 6P2 = 6!/(6-2)! = 30.

Total ways = 7P3 - 6P2 = 210 - 30 = 180

Keep these Points in mind for Combination

  • The number of combinations of n objects, of which p are identical, taken r at a time is

n-pCr + n-pCr-1 + n-pCr-2 + …… + n-pC0, if r ≤ p, and n-pCr + n-pCr-1 + n-pCr-2 + ….. + n-pCr-p, if r > p

  1. n choose 0 or n choose n is 1: nC0 = nCn = 1, nC1 = n.
  2. nCr + nCr-1 = n+1Cr
  3. nCx = nCy ⇔ x = y or x + y = n
  4. n · n-1Cr-1 = (n-r+1) nCr-1
  5. nC0 + nC2 + nC4 + …. = nC1 + nC3 + nC5 + ….. = 2^{n-1}
  6. 2n+1C0 + 2n+1C1 + 2n+1C2 + …… + 2n+1Cn = 2^{2n}
  7. nCn + n+1Cn + n+2Cn + n+3Cn + ……….. + 2n-1Cn = 2nCn+1
  8. The number of combinations of n dissimilar things taken all at a time: nCn = n!/{n!(n-n)!} = 1/0! = 1

In continuation we will solve some examples  

Example: If 15Cr = 15Cr+5, what is the value of r?

Solution: Here we use the identity nCr = nCn-r on the left side of the equation:

15Cr = 15Cr+5 => 15C15-r = 15Cr+5

=> 15 - r = r + 5 => 2r = 10 => r = 10/2 = 5

so the value of r is 5, i.e. the problem is 15 CHOOSE 5.

Example: If 2nC3 : nC2 = 44 : 3, find the value of r such that nCr = 15.

Solution: Here the given term is the ratio of 2n choose 3 and n choose 2, so by the definition of combination

(2n)!/{(2n-3)! × 3!} × {2! × (n-2)!}/n! = 44/3

=> (2n)(2n-1)(2n-2)/{3n(n-1)} = 44/3

=> 4(2n-1) = 44 => 2n = 12 => n = 6

Now 6Cr = 15 => 6Cr = 6C2 or 6C4 => r = 2 or 4,

so the problem turns out to be 6 choose 2 or 6 choose 4.

Example: If nCr-1 = 36, nCr = 84 and nCr+1 = 126, then what is the value of r?

Solution: Here nCr-1 / nCr = 36/84 and nCr / nCr+1 = 84/126.

(n)!/{(n-r+1)! × (r-1)!} × {(r)! × (n-r)!}/(n)! = 36/84

r/(n-r+1) = 3/7 => 7r = 3n - 3r + 3

=> 3n - 10r = -3, and similarly from the second ratio we get

4n - 10r = 6

On solving, we get n = 9, r = 3,

so the problem turns out to be 9 choose 3, 9 choose 2 and 9 choose 4.

Example: Everyone in a room shakes hands with everyone else, and the total count of handshakes is 66. Find the number of persons in the room.

Solution: nC2 = 66 => n!/{2!(n-2)!} = 66 => n(n-1) = 132 => n = 12

so the value of n is 12, meaning the total number of people in the room is 12, and the problem is 12 choose 2.

Example: In a football tournament 153 games were played, and every team played every other team exactly once. Find the number of teams involved in the tournament.

Solution:

Here nC2 = 153 => n!/{2!(n-2)!} = 153 => n(n-1)/2 = 153 => n = 18

so the total number of teams that participated in the tournament was 18, and the combination is 18 choose 2.

Example: During the Deepawali ceremony each club member sends greeting cards to the others. If there are 20 members in the club, what is the total number of greeting cards exchanged by the members?

Solution: Since any two members exchange cards with each other in two ways (one card each way), we count 20 choose 2 twice:

2 × 20C2 = 2 × 20!/{2!(20-2)!} = 2 × 190 = 380, so there would be 380 greeting cards exchanged.

Example: Six plus '+' and four minus '-' symbols are to be arranged in a straight line so that no two '-' symbols are together; find the total number of ways.

Solution: Arrange the six '+' signs first; this creates 7 vacant places (the gaps, including the two ends): - + - + - + - + - + - + -. The four '-' signs can be put into any 4 of these 7 places.

Hence the required number of ways = 7C4 = 35.

Example: If nC21 = nC6, find nC15.

Solution: Given nC21 = nC6, so

21 + 6 = n => n = 27

Hence 27C15 = 27!/{15!(27-15)!} = 17383860

which is 27 choose 15.
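A quick check with the standard library (Python 3.8+):

```python
from math import comb

# nC21 = nC6 forces n = 21 + 6 = 27; then evaluate 27 choose 15.
print(comb(27, 15))   # 17383860
```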

Conclusion

Some examples were taken based on the relations and results above; any number of examples could be given for each result, but the important thing I wanted to show here is how we can use any result depending on the situation. If you require further reading you can go through the content below, and for any personal help you are free to contact us. Some related content:

For more topics on Mathematics, please check this link.

SCHAUM’S OUTLINE OF Theory and Problems of DISCRETE MATHEMATICS

https://en.wikipedia.org/wiki/Permutation

https://en.wikipedia.org/wiki/Combination