In probability theory, Chebyshev’s inequality and the central limit theorem deal with situations where we want to approximate the probability distribution of a sum of a large number of random variables. Before looking at the limit theorems, we first study some inequalities that provide bounds on probabilities when only the mean and variance are known.


## Markov’s inequality

Markov’s inequality states that for a random variable X taking only non-negative values and for any a>0,

[latex]P\{X \geq a\} \leq \frac{E[X]}{a}[/latex]

To prove this, for a>0 consider the indicator random variable

[latex]I=\left\{\begin{array}{ll}

1 & \text { if } X \geq a \\

0 & \text { otherwise }

\end{array}\right.[/latex]

Since X ≥ 0, and I = 1 only when X ≥ a,

[latex]I \leq \frac{X}{a}[/latex]

Now, taking expectations on both sides of this inequality, we get

[latex]E[I] \leq \frac{E[X]}{a}[/latex]

and since I is an indicator variable,

[latex]E[I]=P\{X \geq a\}[/latex]

which gives Markov’s inequality for a>0 as

[latex]P\{X \geq a\} \leq \frac{E[X]}{a}[/latex]
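As a quick sanity check of the derivation above, the following sketch tests the inequality empirically; the exponential(1) random variable is an arbitrary non-negative example chosen here, not one from the text.

```python
import random

# Monte Carlo check of Markov's inequality P{X >= a} <= E[X]/a for a
# nonnegative random variable; X ~ Exponential(1) is an illustrative choice.
random.seed(0)
n = 100_000
samples = [random.expovariate(1.0) for _ in range(n)]  # E[X] = 1

a = 2.0
empirical = sum(x >= a for x in samples) / n   # true value is e^{-2} ~ 0.135
bound = (sum(samples) / n) / a                 # sample estimate of E[X]/a ~ 0.5

print(empirical, bound)
```

The bound holds comfortably here: the true tail probability e^{-2} ≈ 0.135 is well below E[X]/a = 0.5, illustrating that Markov’s bound can be quite loose.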

## Chebyshev’s inequality

For a random variable X with finite mean μ and variance σ², Chebyshev’s inequality states that for any k>0

[latex]P\{|X-\mu| \geq k\} \leq \frac{\sigma^{2}}{k^{2}}[/latex]

where σ² and μ are the variance and mean of the random variable. To prove this, we apply Markov’s inequality to the non-negative random variable

[latex](X-\mu)^{2}[/latex]

with a = k². Hence

[latex]P\left\{(X-\mu)^{2} \geq k^{2}\right\} \leq \frac{E\left[(X-\mu)^{2}\right]}{k^{2}}[/latex]

this equation is equivalent to

[latex]P\{|X-\mu| \geq k\} \leq \frac{E\left[(X-\mu)^{2}\right]}{k^{2}}=\frac{\sigma^{2}}{k^{2}}[/latex]

as clearly

[latex](X-\mu)^{2} \geq k^{2} \text{ if and only if } |X-\mu| \geq k[/latex]
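The inequality can also be checked numerically; here is a minimal sketch using a Gaussian sample, which is an arbitrary choice of distribution for illustration.

```python
import random

# Empirical check of Chebyshev's inequality: the fraction of samples with
# |X - mu| >= k must stay below sigma^2 / k^2.
random.seed(1)
n = 100_000
mu, sigma = 0.0, 1.0
xs = [random.gauss(mu, sigma) for _ in range(n)]

k = 2.0
empirical = sum(abs(x - mu) >= k for x in xs) / n  # true value ~ 0.0455
bound = sigma ** 2 / k ** 2                        # Chebyshev bound = 0.25

print(empirical, bound)
```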

## Examples of Markov’s and Chebyshev’s inequalities

- If the weekly production of a specific item is taken as a random variable with mean 50, find the probability that the production exceeds 75 in a week, and what is the probability that the weekly production is between 40 and 60, given that the variance for that week is 25?

Solution: Consider the random variable X for the production of the item for a week then to find the probability of production exceeding 75 we will use Markov’s inequality as

[latex]P\{X>75\} \leq \frac{E[X]}{75}=\frac{50}{75}=\frac{2}{3}[/latex]

Now, for the probability that the production is between 40 and 60 with variance 25, we use Chebyshev’s inequality:

[latex]P\{|X-50| \geq 10\} \leq \frac{\sigma^{2}}{10^{2}}=\frac{1}{4}[/latex]

so

[latex]P\{|X-50|<10\} \geq 1-\frac{1}{4}=\frac{3}{4} [/latex]

this shows that the probability that the weekly production is between 40 and 60 is at least 3/4.

2. Show that the upper bound provided by Chebyshev’s inequality is generally not close to the actual value of the probability.

Solution:

Consider the random variable X uniformly distributed over the interval (0,10), so with mean 5 and variance 25/3. By Chebyshev’s inequality we can write

[latex]P\{|X-5|>4\} \leq \frac{25}{3(16)} \approx 0.52[/latex]

but the actual probability will be

[latex]P\{|X-5|>4\}=0.20[/latex]

so the bound is far from the actual probability. Likewise, if we take the random variable X as normally distributed with mean μ and variance σ², then Chebyshev’s inequality gives

[latex]P\{|X-\mu|>2 \sigma\} \leq \frac{1}{4}[/latex]

but the actual probability is

[latex]P\{|X-\mu|>2 \sigma\}=P\left\{\left|\frac{X-\mu}{\sigma}\right|>2\right\}=2[1-\Phi(2)] \approx 0.0456[/latex]
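Both comparisons above can be reproduced in a few lines; the standard normal CDF Φ is computed from the error function, and all constants come from the two examples.

```python
from math import erf, sqrt

def Phi(x):
    # standard normal CDF via the error function
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

# Uniform(0, 10): mean 5, variance 25/3; Chebyshev bound on P{|X - 5| > 4}
cheby_uniform = (25 / 3) / 16      # ~ 0.52
exact_uniform = 0.20               # P{X < 1} + P{X > 9} = 0.1 + 0.1

# Normal(mu, sigma^2): Chebyshev bound vs exact for P{|X - mu| > 2 sigma}
cheby_normal = 1 / 4
exact_normal = 2 * (1 - Phi(2))    # ~ 0.0455

print(cheby_uniform, exact_uniform, cheby_normal, exact_normal)
```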

## Weak Law of Large Numbers

The weak law for a sequence of random variables follows from a result showing that Chebyshev’s inequality can be used as a tool in proofs, for example to prove

[latex]P\{X=E[X]\}=1[/latex]

when the variance is zero; that is, the only random variables having variance equal to 0 are those which are constant with probability 1. To see this, note that by Chebyshev’s inequality, for every n ≥ 1,

[latex]P\left\{|X-\mu|>\frac{1}{n}\right\}=0[/latex]

Letting

[latex]n \rightarrow \infty[/latex]

and using the continuity property of probability, we get

[latex]\begin{aligned}

0=\lim _{n \rightarrow \infty} P\left\{|X-\mu|>\frac{1}{n}\right\} &=P\left\{\lim _{n \rightarrow \infty}\left\{|X-\mu|>\frac{1}{n}\right\}\right\} \\

&=P\{X \neq \mu\}

\end{aligned}[/latex]

which proves the result.

**Weak Law of Large Numbers**: For a sequence of independent and identically distributed random variables X_{1},X_{2},…, each having finite mean E[X_{i}]=μ, for any ε>0

[latex]P\left\{\left|\frac{X_{1}+\cdots+X_{n}}{n}-\mu\right| \geq \varepsilon\right\} \rightarrow 0 \quad \text { as } \quad n \rightarrow \infty[/latex]

To prove this, we assume that the variance is also finite for each random variable in the sequence, so the expectation and variance of the sample mean are

[latex]E\left[\frac{X_{1}+\cdots+X_{n}}{n}\right]=\mu \quad \text { and } \quad \operatorname{Var}\left(\frac{X_{1}+\cdots+X_{n}}{n}\right)=\frac{\sigma^{2}}{n}[/latex]

Now, Chebyshev’s inequality gives the upper bound

[latex]P\left\{\left|\frac{X_{1}+\cdots+X_{n}}{n}-\mu\right| \geq \varepsilon\right\} \leq \frac{\sigma^{2}}{n \varepsilon^{2}}[/latex]

and letting n tend to infinity gives

[latex]P\left\{\left|\frac{X_{1}+\cdots+X_{n}}{n}-\mu\right| \geq \varepsilon\right\} \rightarrow 0 \quad \text { as } \quad n \rightarrow \infty[/latex]
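The convergence in the statement above can be seen directly in simulation; the Uniform(0,1) distribution (with μ = 1/2), the tolerance ε, and the trial counts below are illustrative choices.

```python
import random

# Weak law in action: the probability that the sample mean of n Uniform(0,1)
# variables misses mu = 1/2 by at least eps shrinks as n grows.
random.seed(2)
mu, eps, trials = 0.5, 0.05, 2000

def tail_prob(n):
    miss = 0
    for _ in range(trials):
        mean = sum(random.random() for _ in range(n)) / n
        if abs(mean - mu) >= eps:
            miss += 1
    return miss / trials

p10, p1000 = tail_prob(10), tail_prob(1000)
print(p10, p1000)
```

For n = 10 the miss probability is large (around 0.58), while for n = 1000 it is essentially zero, matching the σ²/(nε²) decay from Chebyshev’s bound.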

## Central Limit theorem

The central limit theorem is one of the most important results in probability theory: it says that the distribution of the sum of a large number of independent random variables is approximately normal. In addition to giving a method for finding approximate probabilities for sums of independent random variables, the central limit theorem also explains why the empirical frequencies of so many natural populations exhibit bell-shaped (normal) curves. Before giving the detailed statement of this theorem, we use the following result:

“If the sequence of random variables Z_{1},Z_{2},… have distribution functions F_{Z_n} and moment generating functions M_{Z_n}, and Z has distribution function F_{Z} and moment generating function M_{Z}, then

[latex]\text{if } M_{Z_{n}}(t) \rightarrow M_{Z}(t) \text{ for all } t \text{, then } F_{Z_{n}}(t) \rightarrow F_{Z}(t) \text{ for all } t \text{ at which } F_{Z}(t) \text{ is continuous.”}[/latex]

**Central Limit theorem**: For a sequence of independent and identically distributed random variables X_{1},X_{2},…, each having mean μ and variance σ², the distribution of the sum

[latex]\frac{X_{1}+\cdots+X_{n}-n \mu}{\sigma \sqrt{n}}[/latex]

tends to the standard normal as n tends to infinity; that is, for every real value a,

[latex]P\left\{\frac{X_{1}+\cdots+X_{n}-n \mu}{\sigma \sqrt{n}} \leq a\right\} \rightarrow \frac{1}{\sqrt{2 \pi}} \int_{-\infty}^{a} e^{-x^{2} / 2} d x \quad \text { as } n \rightarrow \infty[/latex]

Proof: To prove the result, first consider the case of mean zero and variance one, i.e. μ=0 and σ²=1, and assume that the moment generating function M of each X_{i} exists and is finite. Then the moment generating function of the random variable X_{i}/√n is

[latex]E\left[\exp \left\{\frac{t X_{i}}{\sqrt{n}}\right\}\right]=M\left(\frac{t}{\sqrt{n}}\right)[/latex]

hence the moment generating function of the sum ΣX_{i}/√n is

[latex]\left[M\left(\frac{t}{\sqrt{n}}\right)\right]^{n} [/latex]

Now let us take L(t)=logM(t)

so

[latex]\begin{aligned}

L(0) &=0 \\

L^{\prime}(0) &=\frac{M^{\prime}(0)}{M(0)} \\

&=\mu \\

&=0 \\

L^{\prime \prime}(0) &=\frac{M(0) M^{\prime \prime}(0)-\left[M^{\prime}(0)\right]^{2}}{[M(0)]^{2}} \\

&=E\left[X^{2}\right] \\

&=1

\end{aligned}[/latex]

To complete the proof we first show

[latex][M(t / \sqrt{n})]^{n} \rightarrow e^{t^{2} / 2} \text{ as }n \rightarrow \infty[/latex]

by showing its equivalent form

[latex]n L(t / \sqrt{n}) \rightarrow t^{2} / 2[/latex]

since

[latex]\begin{aligned}

\lim _{n \rightarrow \infty} \frac{L(t / \sqrt{n})}{n^{-1}}&=\lim _{n \rightarrow \infty} \frac{-L^{\prime}(t / \sqrt{n}) n^{-3 / 2} t}{-2 n^{-2}} \quad \text{by L’Hôpital’s rule }\\

&=\lim _{n \rightarrow \infty}\left[\frac{L^{\prime}(t / \sqrt{n}) t}{2 n^{-1 / 2}}\right] \\

&=\lim _{n \rightarrow \infty}\left[\frac{-L^{\prime \prime}(t / \sqrt{n}) n^{-3 / 2} t^{2}}{-2 n^{-3 / 2}}\right] \quad \text{ again by L’Hôpital’s rule }\\

&=\lim _{n \rightarrow \infty}\left[L^{\prime \prime}\left(\frac{t}{\sqrt{n}}\right) \frac{t^{2}}{2}\right] \\

&=\frac{t^{2}}{2}\end{aligned}[/latex]

hence this shows the result for mean zero and variance 1, and the same result follows in the general case by taking

[latex]X_{i}^{*}=\left(X_{i}-\mu\right) / \sigma[/latex]

and for each a we have

[latex]P\left\{\frac{X_{1}+\cdots+X_{n}-n \mu}{\sigma \sqrt{n}} \leq a\right\} \rightarrow \Phi(a)[/latex]
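The convergence just proved can be observed numerically; the sketch below standardizes sums of Uniform(0,1) variables (an arbitrary choice of distribution) and compares the empirical CDF at one point against Φ.

```python
import random
from math import erf, sqrt

# CLT check: the standardized sum of n Uniform(0,1) variables should have
# CDF close to Phi; n and the trial count are illustrative choices.
random.seed(3)
n, trials = 30, 20_000
mu, sigma = 0.5, sqrt(1 / 12)

def standardized_sum():
    s = sum(random.random() for _ in range(n))
    return (s - n * mu) / (sigma * sqrt(n))

a = 1.0
empirical = sum(standardized_sum() <= a for _ in range(trials)) / trials
Phi_a = 0.5 * (1 + erf(a / sqrt(2)))   # Phi(1) ~ 0.8413

print(empirical, Phi_a)
```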

## Example of the Central Limit theorem

To estimate the distance (in light-years) of a star from his laboratory, an astronomer uses a measuring technique, but because of changing atmospheric conditions each measurement is subject to some error. He therefore plans to take a sequence of measurements and use their average as the estimated distance. If he considers the measured values to be independent and identically distributed random variables with mean d (the actual distance) and variance 4 light-years squared, how many measurements must he make to be 95 percent certain that the estimated value is within 0.5 light-years of the actual value?

Solution: Let us consider the measurements as a sequence of independent random variables X_{1},X_{2},…,X_{n}, so by the central limit theorem we can write

[latex]Z_{n}=\frac{\sum_{i=1}^{n} X_{i}-n d}{2 \sqrt{n}}[/latex]

which is approximately standard normal, so the probability is

[latex]P\left\{-0.5 \leq \frac{\sum_{i=1}^{n} X_{i}}{n}-d \leq 0.5\right\}=P\left\{-0.5 \frac{\sqrt{n}}{2} \leq Z_{n} \leq 0.5 \frac{\sqrt{n}}{2}\right\} \approx \Phi\left(\frac{\sqrt{n}}{4}\right)-\Phi\left(\frac{-\sqrt{n}}{4}\right)=2 \Phi\left(\frac{\sqrt{n}}{4}\right)-1[/latex]

so to be 95 percent certain of this accuracy, the astronomer should take n* measurements, where

[latex]2 \Phi\left(\frac{\sqrt{n^{*}}}{4}\right)-1=0.95 \quad \text { or } \quad \Phi\left(\frac{\sqrt{n^{*}}}{4}\right)=0.975[/latex]

so from the normal distribution table we can write it as

[latex]\frac{\sqrt{n^{*}}}{4}=1.96 \quad \text { or } \quad n^{*}=(7.84)^{2} \approx 61.47[/latex]

which says the measurement should be done 62 times. This can also be approached with the help of Chebyshev’s inequality, by taking

[latex]E\left[\sum_{i=1}^{n} \frac{X_{i}}{n}\right]=d \quad \operatorname{Var}\left(\sum_{i=1}^{n} \frac{X_{i}}{n}\right)=\frac{4}{n}[/latex]

so the inequality results in

[latex]P\left\{\left|\sum_{i=1}^{n} \frac{X_{i}}{n}-d\right|>0.5\right\} \leq \frac{4}{n(0.5)^{2}}=\frac{16}{n} [/latex]

hence setting 16/n = 0.05 gives n = 320, so with 320 measurements we can be 95 percent certain that the error in the estimated distance of the star will be less than 0.5 light-years.
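Both sample-size calculations above reduce to a couple of arithmetic steps, sketched here with the constants taken directly from the example.

```python
from math import ceil, erf, sqrt

def Phi(x):
    return 0.5 * (1 + erf(x / sqrt(2)))

# CLT route: 2*Phi(sqrt(n)/4) - 1 = 0.95  =>  sqrt(n)/4 = 1.96
n_clt = (1.96 * 4) ** 2            # 61.4656 -> round up to 62 measurements
# Chebyshev route: 16/n <= 0.05    =>  n >= 320 measurements
n_cheby = 16 / 0.05

print(ceil(n_clt), n_cheby, 2 * Phi(sqrt(62) / 4) - 1)
```

The gap between 62 and 320 measurements shows how much sharper the normal approximation is than the distribution-free Chebyshev bound.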

2. The number of students admitted to an engineering course is Poisson distributed with mean 100. It was decided that if 120 or more students are admitted, the teaching will be in two sections; otherwise, in one section only. What is the probability that there will be two sections for the course?

Solution: By following the Poisson distribution the exact solution will be

[latex]e^{-100} \sum_{i=120}^{\infty} \frac{(100)^{i}}{i !} [/latex]

which does not readily give a particular numerical value. If we consider the random variable X as the number of students admitted, then by the central limit theorem (with the continuity correction)

[latex]P\{X \geq 120\}=P\{X \geq 119.5\} [/latex]

which can be

[latex]\begin{array}{l}

=P\left\{\frac{X-100}{\sqrt{100}} \geq \frac{119.5-100}{\sqrt{100}}\right\} \\

\approx 1-\Phi(1.95) \\

\approx 0.0256

\end{array} [/latex]

which is the numerical value.
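It is instructive to compare the approximation with the exact Poisson sum from the beginning of the solution; the sketch below evaluates both, truncating the infinite sum at an upper limit chosen large enough that the remaining terms are negligible.

```python
from math import erf, exp, lgamma, log, sqrt

def poisson_tail(lam, k, upper=2000):
    # P{X >= k} for X ~ Poisson(lam); each term is computed in log space
    # to avoid overflow in 100**i / i!
    return sum(exp(-lam + i * log(lam) - lgamma(i + 1)) for i in range(k, upper))

def Phi(x):
    return 0.5 * (1 + erf(x / sqrt(2)))

exact = poisson_tail(100, 120)                  # the exact sum from the text
approx = 1 - Phi((119.5 - 100) / sqrt(100))     # CLT with continuity correction

print(exact, approx)
```

The normal approximation (about 0.0256) slightly underestimates the exact tail because the Poisson distribution is right-skewed.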

3. Calculate the probability that the sum on ten dice, when rolled, is between 30 and 40, inclusive.

Solution: Denoting the outcome of the i-th die by X_{i} for i = 1, …, 10, the mean and variance of each are

[latex]E\left(X_{i}\right)=\frac{7}{2}, \quad \operatorname{Var}\left(X_{i}\right)=E\left[X_{i}^{2}\right]-\left(E\left[X_{i}\right]\right)^{2}=\frac{35}{12} [/latex]

thus following the central limit theorem we can write

[latex]\begin{aligned}

P\{29.5 \leq X \leq 40.5\} &=P\left\{\frac{29.5-35}{\sqrt{\frac{350}{12}}} \leq \frac{X-35}{\sqrt{\frac{350}{12}}} \leq \frac{40.5-35}{\sqrt{\frac{350}{12}}}\right\} \\

& \approx 2 \Phi(1.0184)-1 \\

& \approx 0.692

\end{aligned} [/latex]

which is the required probability.
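A direct simulation of ten dice confirms that the CLT estimate of roughly 0.692 is accurate; the trial count below is an arbitrary choice.

```python
import random

# Simulate rolling ten dice and count how often the total falls in [30, 40].
random.seed(4)
trials = 50_000
hits = 0
for _ in range(trials):
    total = sum(random.randint(1, 6) for _ in range(10))
    if 30 <= total <= 40:
        hits += 1
empirical = hits / trials

print(empirical)
```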

4. For independent random variables X_{i} uniformly distributed on the interval (0,1), what is the approximate value of the probability

[latex]P\left\{\sum_{i=1}^{10} X_{i}>6\right\} [/latex]

Solution: For the uniform distribution we know that the mean and variance are

[latex]E\left[X_{i}\right]=\frac{1}{2} \qquad \operatorname{Var}\left(X_{i}\right)=\frac{1}{12} [/latex]

Now using the central limit theorem we can write

[latex]\begin{aligned}

P\left\{\sum_{1}^{10} X_{i}>6\right\} &=P\left\{\frac{\sum_{1}^{10} X_{i}-5}{\sqrt{10\left(\frac{1}{12}\right)}}>\frac{6-5}{\sqrt{10\left(\frac{1}{12}\right)}}\right\} \\

& \approx 1-\Phi(\sqrt{1.2}) \\

& \approx 0.1367

\end{aligned} [/latex]

thus the probability that the sum exceeds 6 is about 14 percent.
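As with the dice example, a quick simulation checks the normal approximation of about 0.1367; the trial count is an arbitrary choice.

```python
import random

# Simulate sums of ten Uniform(0,1) variables and estimate P{sum > 6}.
random.seed(5)
trials = 100_000
hits = sum(sum(random.random() for _ in range(10)) > 6 for _ in range(trials))
empirical = hits / trials

print(empirical)
```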

5. An examiner must grade 50 exams whose grading times are independent, with mean 20 minutes and standard deviation 4 minutes. Find the probability that the examiner finishes grading the first 25 exams within the first 450 minutes.

Solution: Let the time required to grade the i-th exam be the random variable X_{i}, so the total time X for the first 25 exams is

[latex]X=\sum_{i=1}^{25} X_{i} [/latex]

Since this task of grading 25 exams should finish within 450 minutes, we want

[latex]P\{X \leq 450\} [/latex]

[latex]E[X]=\sum_{i=1}^{25} E\left[X_{i}\right]=25(20)=500[/latex]

[latex]\operatorname{Var}(X)=\sum_{i=1}^{25} \operatorname{Var}\left(X_{i}\right)=25(16)=400[/latex]

here using the central limit theorem

[latex]\begin{aligned}

P\{X \leq 450\} &=P\left\{\frac{X-500}{\sqrt{400}} \leq \frac{450-500}{\sqrt{400}}\right\} \\

& \approx P\{Z \leq-2.5\} \\

&=P\{Z \geq 2.5\} \\

&=1-\Phi(2.5) \approx 0.006

\end{aligned}[/latex]

which is the required probability.
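The arithmetic of this example is compact enough to verify directly, using the constants from the solution.

```python
from math import erf, sqrt

def Phi(x):
    return 0.5 * (1 + erf(x / sqrt(2)))

mean = 25 * 20                 # E[X] = 500 minutes
var = 25 * 4 ** 2              # Var(X) = 400
z = (450 - mean) / sqrt(var)   # = -2.5
p = Phi(z)                     # P{X <= 450} ~ 0.0062

print(p)
```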

## Central Limit theorem for independent random variables

For a sequence of independent random variables X_{1},X_{2},… that are not necessarily identically distributed, with each X_{i} having mean μ_{i} and variance σ_{i}², the conclusion still holds provided that

- each X_{i} is uniformly bounded
- the sum of the variances is infinite; then

[latex]P\left\{\frac{\sum_{i=1}^{n}\left(X_{i}-\mu_{i}\right)}{\sqrt{\sum_{i=1}^{n} \sigma_{i}^{2}}} \leq a\right\} \rightarrow \Phi(a) \quad \text { as } n \rightarrow \infty[/latex]

**Strong Law of Large Numbers**

The strong law of large numbers is a crucial concept of probability theory which says that the average of a sequence of identically distributed random variables converges, with probability one, to the mean of that distribution.

**Statement**: For a sequence of independent and identically distributed random variables X_{1},X_{2},…, each having a finite mean μ, with probability one

[latex]\frac{X_{1}+X_{2}+\cdots+X_{n}}{n} \rightarrow \mu \quad \text { as } \quad n \rightarrow \infty[/latex]

Proof: To prove this, consider first the case where the mean of each random variable is zero, and assume that the fourth moment E[X_{i}^{4}] = K is finite. Define the partial sum

[latex]S_{n}=\sum_{i=1}^{n} X_{i}[/latex]

and now consider its fourth power:

[latex]\begin{aligned}

E\left[S_{n}^{4}\right]=& E\left[\left(X_{1}+\cdots+X_{n}\right)\left(X_{1}+\cdots+X_{n}\right)\right.\\

&\left.\times\left(X_{1}+\cdots+X_{n}\right)\left(X_{1}+\cdots+X_{n}\right)\right]

\end{aligned}[/latex]

Expanding the right-hand side produces terms of the form

[latex]X_{i}^{4}, \quad X_{i}^{3} X_{j}, \quad X_{i}^{2} X_{j}^{2}, \quad X_{i}^{2} X_{j} X_{k}, \quad \quad X_{i} X_{j} X_{k} X_{l}[/latex]

Since the X_{i} are independent with mean zero, the expectations of the cross terms vanish:

[latex]\begin{aligned}

E\left[X_{i}^{3} X_{j}\right] &=E\left[X_{i}^{3}\right] E\left[X_{j}\right]=0 \\

E\left[X_{i}^{2} X_{j} X_{k}\right] &=E\left[X_{i}^{2}\right] E\left[X_{j}\right] E\left[X_{k}\right]=0 \\

E\left[X_{i} X_{j} X_{k} X_{l}\right] &=0 \\

\end{aligned}[/latex]

Counting the number of ways each type of pair can occur, the expansion gives

[latex]\begin{aligned}

E\left[S_{n}^{4}\right] &=n E\left[X_{i}^{4}\right]+6\left(\begin{array}{l}

n \\

2

\end{array}\right) E\left[X_{i}^{2} X_{j}^{2}\right] \\

&=n K+3 n(n-1) E\left[X_{i}^{2}\right] E\left[X_{j}^{2}\right]

\end{aligned}[/latex]

since

[latex]0 \leq \operatorname{Var}\left(X_{i}^{2}\right)=E\left[X_{i}^{4}\right]-\left(E\left[X_{i}^{2}\right]\right)^{2}[/latex]

so

[latex]\left(E\left[X_{i}^{2}\right]\right)^{2} \leq E\left[X_{i}^{4}\right]=K[/latex]

we get

[latex]E\left[S_{n}^{4}\right] \leq n K+3 n(n-1) K[/latex]

which implies the inequality

[latex]E\left[\frac{S_{n}^{4}}{n^{4}}\right] \leq \frac{K}{n^{3}}+\frac{3 K}{n^{2}}[/latex]

hence

[latex]E\left[\sum_{n=1}^{\infty} \frac{S_{n}^{4}}{n^{4}}\right]=\sum_{n=1}^{\infty} E\left[\frac{S_{n}^{4}}{n^{4}}\right]<\infty[/latex]

Since this expectation is finite, the series converges with probability one, and hence its terms converge to zero:

[latex]\lim _{n \rightarrow \infty} \frac{S_{n}^{4}}{n^{4}}=0[/latex]

which implies

[latex]\frac{S_{n}}{n} \rightarrow 0 \quad \text { as } \quad n \rightarrow \infty[/latex]

If the mean of each random variable is not zero, then applying the result to the deviations X_{i} − μ gives, with probability one,

[latex]\lim _{n \rightarrow \infty} \sum_{i=1}^{n} \frac{\left(X_{i}-\mu\right)}{n}=0[/latex]

or

[latex]\lim _{n \rightarrow \infty} \sum_{i=1}^{n} \frac{X_{i}}{n}=\mu[/latex]

which is the required result.

## One Sided Chebyshev Inequality

The one-sided Chebyshev inequality for a random variable X with mean zero and finite variance σ² states that for a>0,

[latex]P\{X \geq a\} \leq \frac{\sigma^{2}}{\sigma^{2}+a^{2}}[/latex]

To prove this, consider for b>0 that

[latex]X \geq a \text{ is equivalent to } X+b \geq a+b[/latex]

which gives

[latex]P\{X \geq a\}=P\{X+b \geq a+b\}\\

\leq P\left\{(X+b)^{2} \geq(a+b)^{2}\right\}

[/latex]

so using Markov’s inequality and E[X] = 0,

[latex]P\left\{(X+b)^{2} \geq(a+b)^{2}\right\} \leq \frac{E\left[(X+b)^{2}\right]}{(a+b)^{2}}=\frac{\sigma^{2}+b^{2}}{(a+b)^{2}}[/latex]

and minimizing the right-hand side over b>0 (the minimum is attained at b=σ²/a) gives the required inequality. For a random variable with mean μ and variance σ², applying this result to X−μ and μ−X, we can write

[latex]P\{X-\mu \geq a\} \leq \frac{\sigma^{2}}{\sigma^{2}+a^{2}}

\\P\{\mu-X \geq a\} \leq \frac{\sigma^{2}}{\sigma^{2}+a^{2}}[/latex]

This further can be written as

[latex]\begin{array}{l}

P\{X \geq \mu+a\} \leq \frac{\sigma^{2}}{\sigma^{2}+a^{2}} \\

P\{X \leq \mu-a\} \leq \frac{\sigma^{2}}{\sigma^{2}+a^{2}}

\end{array}[/latex]

## Example:

Find an upper bound on the probability that the weekly production of a company, which is random, will be at least 120, if the production has mean 100 and variance 400.

Solution:

Using the one-sided Chebyshev inequality,

[latex]P\{X \geq 120\}=P\{X-100 \geq 20\} \leq \frac{400}{400+(20)^{2}}=\frac{1}{2}[/latex]

so the probability that the production within a week is at least 120 is at most 1/2. By comparison, the bound obtained from Markov’s inequality is

[latex]P\{X \geq 120\} \leq \frac{E[X]}{120}=\frac{5}{6}[/latex]

so the one-sided Chebyshev inequality gives the sharper upper bound.
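The comparison between the two bounds is a one-liner each, using the constants from the example.

```python
mu, var = 100, 400
a = 20                                  # X >= 120 means X - mu >= 20
one_sided = var / (var + a ** 2)        # one-sided Chebyshev bound = 1/2
markov = mu / 120                       # Markov bound E[X]/120 = 5/6

print(one_sided, markov)
```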

## Example:

One hundred pairs are formed from two hundred people consisting of one hundred men and one hundred women. Find an upper bound on the probability that at most thirty pairs will consist of a man and a woman.

Solution:

Let the random variable X_{i} as

[latex]X_{i}=\left\{\begin{array}{ll}1 & \text { if man } i \text { is paired with a woman} \\ 0 & \text { otherwise }\end{array}\right.[/latex]

so the number of man-woman pairs can be expressed as

[latex]X=\sum_{i=1}^{100} X_{i}[/latex]

Since each man is equally likely to be paired with any of the remaining 199 people, of whom one hundred are women, the mean is

[latex]E\left[X_{i}\right]=P\left\{X_{i}=1\right\}=\frac{100}{199}[/latex]

in the same way if i and j are not equal then

[latex]\begin{aligned}

E\left[X_{i} X_{j}\right] &=P\left\{X_{i}=1, X_{j}=1\right\} \\

&=P\left\{X_{i}=1\right\} P\left\{X_{j}=1 \mid X_{i}=1\right\}=\frac{100}{199} \frac{99}{197}

\end{aligned}[/latex]

as

[latex]P\left\{X_{j}=1 \mid X_{i}=1\right\}=99 / 197[/latex]

hence we have

[latex]begin{aligned}

E[X] &=\sum_{i=1}^{100} E\left[X_{i}\right] \\

&=(100) \frac{100}{199} \\

& \approx 50.25 \\

\operatorname{Var}(X) &=\sum_{i=1}^{100} \operatorname{Var}\left(X_{i}\right)+2 \sum_{i<j} \sum \operatorname{Cov}\left(X_{i}, X_{j}\right) \\

&=100 \frac{100}{199} \frac{99}{199}+2\left(\begin{array}{c}

100 \\

2

\end{array}\right)\left[\frac{100}{199} \frac{99}{197}-\left(\frac{100}{199}\right)^{2}\right] \\

& \approx 25.126

\end{aligned}[/latex]

Using Chebyshev’s inequality,

[latex]P\{X \leq 30\} \leq P\{|X-50.25| \geq 20.25\} \leq \frac{25.126}{(20.25)^{2}} \approx 0.061[/latex]

which tells us that the probability of at most 30 man-woman pairs is less than 6 percent. We can improve the bound by using the one-sided Chebyshev inequality:

[latex]\begin{aligned}

P\{X \leq 30\} &=P\{X \leq 50.25-20.25\} \\

& \leq \frac{25.126}{25.126+(20.25)^{2}} \\

& \approx 0.058

\end{aligned}[/latex]
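The setup of this example can be simulated directly; the pairing model below (shuffle the 200 people and pair them consecutively, which yields a uniformly random matching) is an assumed but natural reading of the problem.

```python
import random

# Simulate random pairings of 100 men (ids 0..99) and 100 women (100..199)
# and count the man-woman pairs, to check E[X] ~ 50.25 and P{X <= 30}.
random.seed(6)

def mixed_pairs():
    people = list(range(200))
    random.shuffle(people)
    return sum((people[2 * i] < 100) != (people[2 * i + 1] < 100)
               for i in range(100))

trials = 20_000
counts = [mixed_pairs() for _ in range(trials)]
mean_pairs = sum(counts) / trials
p_at_most_30 = sum(c <= 30 for c in counts) / trials

print(mean_pairs, p_at_most_30)
```

In simulation the event {X ≤ 30} is far rarer than either Chebyshev bound suggests (it is more than four standard deviations below the mean), which again illustrates how conservative these bounds are.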

## Chernoff Bound

If the moment generating function of X is known, then for t>0

[latex]\begin{aligned}

P\{X \geq a\}&=P\left\{e^{t X} \geq e^{t a}\right\}\\

&\leq E\left[e^{t X}\right] e^{-t a}

\end{aligned}[/latex]

as

[latex]M(t)=E\left[e^{t X}\right][/latex]

In the same way, for t<0,

[latex]\begin{aligned}

P\{X \leq a\} &=P\left\{e^{t X} \geq e^{t a}\right\} \\

& \leq E\left[e^{t X}\right] e^{-t a}

\end{aligned}[/latex]

Thus the Chernoff bounds can be stated as

[latex]\begin{array}{ll}

P\{X \geq a\} \leq e^{-t a} M(t) & \text { for all } t>0 \\

P\{X \leq a\} \leq e^{-t a} M(t) & \text { for all } t<0

\end{array}[/latex]

These inequalities hold for every admissible value of t, positive or negative, so the best bound is obtained by choosing the t that minimizes e^{-ta}M(t).
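The minimization over t can be done numerically; the sketch below uses a hypothetical helper `chernoff_bound` (not from the text) that scans a grid of t values, applied to the standard normal moment generating function where the optimum t = a is known.

```python
from math import exp

def chernoff_bound(mgf, a, ts):
    # hypothetical helper: minimize e^{-t a} M(t) over a grid of t values
    return min(exp(-t * a) * mgf(t) for t in ts)

# Standard normal, M(t) = e^{t^2/2}: the optimum is t = a, giving e^{-a^2/2}
a = 2.0
grid = [i / 1000 for i in range(1, 5001)]           # t in (0, 5]
bound = chernoff_bound(lambda t: exp(t * t / 2), a, grid)

print(bound, exp(-a * a / 2))
```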

## Chernoff bounds for the standard normal random variable

The Chernoff bounds for the standard normal random variable whose moment generating function

[latex]M(t)=e^{t^{2} / 2}[/latex]

is

[latex]P\{Z \geq a\} \leq e^{-t a} e^{t^{2} / 2} \quad \text { for all } \quad t>0[/latex]

so minimizing the right-hand side over t (the minimum of t²/2 − ta is attained at t = a) gives, for a>0,

[latex]P\{Z \geq a\} \leq e^{-a^{2} / 2}[/latex]

and for a<0 it is

[latex]P\{Z \leq a\} \leq e^{-a^{2} / 2}[/latex]

## Chernoff bounds for the Poisson random variable

The Chernoff bounds for the Poisson random variable whose moment generating function

[latex]M(t)=e^{\lambda\left(e^{t}-1\right)}[/latex]

is

[latex]P\{X \geq i\} \leq e^{\lambda\left(e^{t}-1\right)} e^{-i t} \quad t>0[/latex]

so minimizing the right-hand side over t (the minimum is attained at e^{t} = i/λ, valid when i > λ) gives

[latex]P\{X \geq i\} \leq e^{\lambda(i / \lambda-1)}\left(\frac{\lambda}{i}\right)^{i}[/latex]

which simplifies to

[latex]P\{X \geq i\} \leq \frac{e^{-\lambda}(e \lambda)^{i}}{i^{i}}[/latex]
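To see how this bound compares with the exact tail, the sketch below evaluates it for λ = 100 and i = 120 (the values from the Poisson example earlier in this article), computing everything in log space to avoid overflow.

```python
from math import exp, lgamma, log

# Poisson Chernoff bound e^{-lam}(e*lam)^i / i^i vs the exact tail P{X >= i}.
lam, i = 100, 120
log_bound = -lam + i * (1 + log(lam)) - i * log(i)
bound = exp(log_bound)                              # ~ 0.15

exact = sum(exp(-lam + k * log(lam) - lgamma(k + 1)) for k in range(i, 2000))

print(bound, exact)
```

The Chernoff bound (about 0.15) is valid but several times larger than the exact probability, as is typical for exponential tail bounds.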

## Example on Chernoff Bounds

In a game, a player is equally likely to win or lose each round, independently of past results. Find the Chernoff bound for the probability that the player’s cumulative winnings after n plays are at least a, in particular for P{S_{10} ≥ 6}.

Solution: Let X_{i} denote the player’s winnings on the i-th play (+1 for a win, −1 for a loss), so

[latex]P\left\{X_{i}=1\right\}=P\left\{X_{i}=-1\right\}=\frac{1}{2}[/latex]

for the sequence of n plays let

[latex]S_{n}=\sum_{i=1}^{n} X_{i}[/latex]

so the moment generating function will be

[latex]E\left[e^{t X_{i}}\right]=\frac{e^{t}+e^{-t}}{2}[/latex]

Using the series expansions of the exponential terms,

[latex]\begin{aligned}

e^{t}+e^{-t} &=1+t+\frac{t^{2}}{2 !}+\frac{t^{3}}{3 !}+\cdots+\left(1-t+\frac{t^{2}}{2 !}-\frac{t^{3}}{3 !}+\cdots\right) \\

&=2\left\{1+\frac{t^{2}}{2 !}+\frac{t^{4}}{4 !}+\cdots\right\} \\

&=2 \sum_{n=0}^{\infty} \frac{t^{2 n}}{(2 n) !} \\

& \leq 2 \sum_{n=0}^{\infty} \frac{\left(t^{2} / 2\right)^{n}}{n !} \quad \text {since }(2 n) ! \geq n ! 2^{n} \\

&=2 e^{t^{2} / 2}

\end{aligned}[/latex]

so we have

[latex]E\left[e^{t X}\right] \leq e^{t^{2} / 2}[/latex]

Now, applying the multiplicative property of moment generating functions for independent summands,

[latex]\begin{aligned}

E\left[e^{t S_{n}}\right] &=\left(E\left[e^{t X}\right]\right)^{n} \\

& \leq e^{n t^{2} / 2}

\end{aligned}[/latex]

Minimizing the resulting bound e^{-ta}e^{nt²/2} over t>0 (the minimum is attained at t = a/n) gives the inequality

[latex]P\left\{S_{n} \geq a\right\} \leq e^{-a^{2} / 2 n} \quad a>0[/latex]

hence

[latex]P\left\{S_{10} \geq 6\right\} \leq e^{-36 / 20} \approx 0.1653[/latex]
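The event {S_{10} ≥ 6} requires at least 8 wins in 10 fair rounds, so its exact probability is (C(10,8)+C(10,9)+C(10,10))/2^{10}; the sketch below compares that exact value and a simulation against the Chernoff bound.

```python
import random
from math import comb, exp

# Exact value of P{S_10 >= 6}: at least 8 wins out of 10 fair rounds.
exact = (comb(10, 8) + comb(10, 9) + comb(10, 10)) / 2 ** 10   # 56/1024

# Monte Carlo estimate of the same probability.
random.seed(7)
trials = 200_000
hits = sum(sum(random.choice((-1, 1)) for _ in range(10)) >= 6
           for _ in range(trials))
empirical = hits / trials

bound = exp(-36 / 20)   # the Chernoff bound ~ 0.1653

print(exact, empirical, bound)
```

The exact probability (about 0.055) is roughly a third of the Chernoff bound, which is nevertheless useful because it decays exponentially in a²/n.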

Conclusion:

We discussed the inequalities and the limit theorems for large numbers of random variables, together with worked examples illustrating the bounds on probabilities; normal and Poisson random variables and moment generating functions were used to demonstrate the concepts. For further reading, go through the books below, or for more articles on probability, follow our Mathematics pages.

A First Course in Probability by Sheldon Ross

Schaum’s Outlines of Probability and Statistics

An Introduction to Probability and Statistics by Rohatgi and Saleh