Jointly Distributed Random Variables | Their Important Properties & 5 Examples


Jointly distributed random variables

     Jointly distributed random variables are two or more random variables considered together with a joint probability distribution; in other words, in an experiment where the different outcomes occur with a common probability, the variables are said to be jointly distributed, or to have a joint distribution. Such situations occur frequently when dealing with problems of chance.

Joint distribution function | Joint Cumulative probability distribution function | joint probability mass function | joint probability density function

    For the random variables X and Y the distribution function or joint cumulative distribution function is

F(a,b)= P\left \{ X\leq a, Y\leq b \right \} , \ \ -\infty< a , b< \infty

where the form of the joint probability depends on whether the random variables X and Y are discrete or continuous, and the individual distribution functions for X and Y can be obtained from this joint cumulative distribution function as

F_{X}(a)=P\left \{ X\leq a \right \} \\ = P \left \{ X\leq a, Y< \infty \right \} \\ =P\left ( \lim_{b \to \infty}\left \{ X\leq a, Y\leq b \right \} \right ) \\ =\lim_{b \to \infty} P \left \{ X\leq a, Y\leq b \right \} \\ = \lim_{b \to \infty} F(a,b) \\ \equiv F(a, \infty)

similarly for Y as

F_{Y} (b)=P\left \{ Y\leq b \right \} \\ =\lim_{a \to \infty} F(a,b) \\ \equiv F(\infty, b)

these individual distribution functions of X and Y are known as marginal distribution functions when the joint distribution is under consideration. These distributions are very helpful for computing probabilities such as

P\left \{ X> a, Y> b \right \} = 1-P\left ( \left \{ X> a, Y> b \right \}^{c} \right )   \\ =1-P\left ( \left \{ X> a \right \}^{c}\cup \left \{ Y> b \right \}^{c} \right )   \\ =1- P\left ( \left \{ X\leq a \right \}\cup \left \{ Y\leq b \right \} \right )   \\ =1-\left [ P\left \{ X\leq a \right \} +P\left \{ Y\leq b \right \}-P\left \{ X\leq a , Y\leq b\right \}\right ]    \\ =1- F_{X}(a)-F_{Y}(b)+F(a,b)

and

P\left \{ a_{1}< X\leq a_{2} , b_{1}< Y\leq b_{2} \right \}   \\ =F(a_{2},b_{2})+F(a_{1},b_{1})-F(a_{1},b_{2})-F(a_{2},b_{1})

in addition the joint probability mass function for the random variables X and Y is defined as

p(x,y)=P\left \{ X=x, Y=y \right \}

the individual (marginal) probability mass or density functions for X and Y can be obtained with the help of this joint probability mass or density function; in terms of discrete random variables,

p_{X}(x)=P\left \{ X=x \right \}    \\ =\sum_{y:p(x,y)> 0}p(x,y)   \\ p_{Y}(y)=P\left \{ Y=y \right \} =\sum_{x:p(x,y)> 0}p(x,y)

and in terms of continuous random variables the joint probability density function f(x,y) is defined by

P\left \{ (X,Y)\in C \right \}=\int_{(x,y)\in C}\int f(x,y)dxdy

where C is any set in the two-dimensional plane, and the joint distribution function for continuous random variables will be

F(a,b)=P\left \{ X\in (-\infty,a], Y\in (-\infty,b] \right \}    \\ =\int_{-\infty}^{b}\int_{-\infty}^{a} f(x,y)dxdy

and the joint probability density function can be obtained from this distribution function by differentiating

f(a,b)=\frac{\partial^2 }{\partial a \partial b} F(a,b)

and the marginal probabilities can be obtained from the joint probability density function as

P\left \{ X\in A \right \}=P\left \{ X\in A,Y\in (-\infty,\infty) \right \} \\ =\int_{A}\int_{-\infty}^{\infty}f(x,y)dydx \\ =\int_{A}f_{X}(x)dx

as

f_{X}(x)=\int_{-\infty}^{\infty}f(x,y)dy

and

f_{Y}(y)=\int_{-\infty}^{\infty}f(x,y)dx

which are the marginal probability density functions of X and Y respectively.

Examples of joint distributions

  1. Find the joint probabilities for the random variables X and Y, representing the numbers of mathematics and statistics books respectively, when 3 books are taken at random from a set containing 3 mathematics, 4 statistics and 5 physics books.

p(0,0)=\binom{5}{3}/\binom{12}{3}=\frac{10}{220}   \\ p(0,1)=\binom{4}{1} \binom{5}{2}/\binom{12}{3}=\frac{40}{220}    \\ p(0,2)=\binom{4}{2} \binom{5}{1}/\binom{12}{3}=\frac{30}{220}    \\ p(0,3)=\binom{4}{3}/\binom{12}{3}=\frac{4}{220}    \\ p(1,0)=\binom{3}{1} \binom{5}{2}/\binom{12}{3}=\frac{30}{220}     \\ p(1,1)=\binom{3}{1} \binom{4}{1} \binom{5}{1}/\binom{12}{3}=\frac{60}{220}     \\ p(1,2)=\binom{3}{1} \binom{4}{2}/\binom{12}{3}=\frac{18}{220}     \\ p(2,0)=\binom{3}{2} \binom{5}{1}/\binom{12}{3}=\frac{15}{220}    \\ p(2,1)=\binom{3}{2} \binom{4}{1}/\binom{12}{3}=\frac{12}{220}     \\ p(3,0)=\binom{3}{3}/\binom{12}{3}=\frac{1}{220}    \
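
These joint probabilities can also be checked mechanically. The following short Python sketch (standard library only; the variable names M, S, P for the book counts are my own labels, not from the text) enumerates the same values of p(x, y).

```python
# Enumerate the joint pmf p(x, y): x mathematics and y statistics books in a
# random sample of 3 drawn from 3 mathematics, 4 statistics and 5 physics books.
from math import comb

M, S, P, n = 3, 4, 5, 3            # counts of each book type and the sample size
total = comb(M + S + P, n)          # C(12, 3) = 220 equally likely samples

for x in range(n + 1):              # number of mathematics books
    for y in range(n - x + 1):      # number of statistics books
        z = n - x - y               # remaining books must be physics
        if z <= P:
            count = comb(M, x) * comb(S, y) * comb(P, z)
            print(f"p({x},{y}) = {count}/{total} = {count/total:.4f}")
            # e.g. p(0,0) = 10/220, p(1,1) = 60/220, p(3,0) = 1/220
```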

  • Find the joint probability mass function of the numbers of boys and girls in a family chosen at random from a sample in which 15% of families have no children, 20% have 1 child, 35% have 2 children and 30% have 3 children, assuming each child is equally likely to be a boy or a girl.

We find the joint probabilities by using the definition as

[Image: Jointly distributed random variables – example calculation]

and this we can illustrate in the tabular form as follows

[Image: Jointly distributed random variables – table of the joint distribution for this example]
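
Since the worked table is shown only as an image above, here is a small Python sketch (my own reconstruction of the computation, assuming each child is independently and equally likely to be a boy or a girl) that tabulates the joint probability mass function of the numbers of boys and girls.

```python
# Joint pmf of (boys, girls) for a family drawn from a population with
# 15% / 20% / 35% / 30% of families having 0 / 1 / 2 / 3 children.
from math import comb

children_dist = {0: 0.15, 1: 0.20, 2: 0.35, 3: 0.30}   # P(number of children)

joint = {}
for n, p_n in children_dist.items():
    for boys in range(n + 1):
        girls = n - boys
        # conditional on n children, the number of boys is Binomial(n, 1/2)
        joint[(boys, girls)] = joint.get((boys, girls), 0.0) + p_n * comb(n, boys) * 0.5 ** n

for (b, g), p in sorted(joint.items()):
    print(f"P(B={b}, G={g}) = {p:.4f}")   # e.g. P(B=0,G=0)=0.1500, P(B=1,G=1)=0.1750
```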
  • Calculate the probabilities

 (a) \ P\left \{ X> 1, Y< 1 \right \} , \ \ (b) \ P\left \{ X< Y \right \}, \ \ and \ \ (c) \ P\left \{ X< a \right \}

if for the random variables X and Y the joint probability density function is given by

f(x,y) = \begin{cases} 2e^{-x}e^{-2y} & 0< x< \infty , \ \ 0< y< \infty  \\  0 &\text{otherwise} \end{cases}

with the help of the definition of joint probability for continuous random variables

=\int_{-\infty}^{b}\int_{-\infty}^{a}f(x,y)dxdy

and the given joint density function the first probability for the given range will be

P\left \{ X> 1,Y< 1 \right \}=\int_{0}^{1}\int_{1}^{\infty}2e^{-x} e^{-2y} dxdy

=\int_{0}^{1}2e^{-2y} \left ( -e^{-x}\lvert_{1}^{\infty} \right )dy

=e^{-1}\int_{0}^{1}2e^{-2y}dy

=e^{-1}(1-e^{-2})

in the similar way the probability

P\left \{ X< Y \right \}=\int_{(x,y):x< y}\int 2e^{-x}e^{-2y}dxdy

=\int_{0}^{\infty}\int_{0}^{y}2e^{-x}e^{-2y}dxdy

=\int_{0}^{\infty}2e^{-2y}(1-e^{-y})dy

=\int_{0}^{\infty}2e^{-2y}dy - \int_{0}^{\infty}2e^{-3y}dy =1-\frac{2}{3}=\frac{1}{3}

and finally

P\left \{ X< a \right \}=\int_{0}^{a}\int_{0}^{\infty}2e^{-2y}e^{-x}dydx

=\int_{0}^{a}e^{-x}dx

=1-e^{-a}
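
As a sanity check, the three probabilities can also be verified numerically. The sketch below assumes NumPy and SciPy are available and uses a sample value a = 0.5 for part (c).

```python
# Numerical check of the probabilities for f(x, y) = 2 e^{-x} e^{-2y}, x, y > 0.
import numpy as np
from scipy.integrate import dblquad

f = lambda x, y: 2.0 * np.exp(-x) * np.exp(-2.0 * y)

# (a) P{X > 1, Y < 1}: inner variable y in (0, 1), outer variable x in (1, inf)
p_a, _ = dblquad(lambda y, x: f(x, y), 1.0, np.inf, lambda x: 0.0, lambda x: 1.0)
print(p_a, np.exp(-1) * (1 - np.exp(-2)))        # both ~ 0.3181

# (b) P{X < Y}: inner variable x in (0, y), outer variable y in (0, inf)
p_b, _ = dblquad(lambda x, y: f(x, y), 0.0, np.inf, lambda y: 0.0, lambda y: y)
print(p_b)                                        # ~ 1/3

# (c) P{X < a} for an illustrative value a = 0.5
a = 0.5
p_c, _ = dblquad(lambda y, x: f(x, y), 0.0, a, lambda x: 0.0, lambda x: np.inf)
print(p_c, 1 - np.exp(-a))                        # both ~ 0.3935
```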

  • Find the probability density function of the quotient X/Y of random variables X and Y if their joint probability density function is

f(x,y) = \begin{cases} e^{-(x+y)} & 0< x< \infty , \ \ 0< y< \infty \\  0 &\text{otherwise} \end{cases}

To find the probability density function of X/Y, we first find its distribution function and then differentiate the obtained result,

so by the definition of joint distribution function and given probability density function we have

F_{X/Y}(a)=P\left \{ \frac{X}{Y}\leq a \right \}

=\int_{x/y\leq a}\int e^{-(x+y)}dxdy

=\int_{0}^{\infty}\int_{0}^{ay}e^{-(x+y)}dxdy

=\int_{0}^{\infty}\left ( 1-e^{-ay} \right )e^{-y}dy \\ = \left [ -e^{-y} +\frac{e^{-(a+1)y}}{a+1} \right ]_{0}^{\infty}

=1-\frac{1}{a+1}

thus by differentiating this distribution function with respect to a we will get the density function as

f_{\frac{X}{Y}}(a)=\frac{1}{(a+1)^{2}}

for a between zero and infinity.
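
A quick way to convince ourselves of this result is a Monte Carlo check. The sketch below (assuming NumPy) samples X and Y as independent exponential variables with rate 1, whose joint density is e^{-(x+y)}, and compares the empirical distribution of X/Y with 1 - 1/(a+1).

```python
# Monte Carlo check of the CDF of X/Y when X, Y are independent Exp(1).
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(1.0, size=1_000_000)
y = rng.exponential(1.0, size=1_000_000)
ratio = x / y

for a in (0.5, 1.0, 2.0, 5.0):
    print(a, (ratio <= a).mean(), 1 - 1 / (a + 1))   # empirical vs. derived CDF
```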

Independent random variables and joint distribution

     In a joint distribution, the two random variables X and Y are said to be independent if

P\left \{ X \in A, Y \in B\right \} =P\left \{ X \in A \right \} P\left \{ Y \in B \right \}

where A and B are any sets of real numbers. As we already know in terms of events, independent random variables are random variables whose defining events are independent.

Thus for any values of a and b

P\left \{ X\leq a, Y\leq b \right \} =P\left \{X\leq a \right \}P\left \{Y\leq b \right \}

and the joint distribution or cumulative distribution function for the independent random variables X and Y will be

F(a,b)=F_{X}(a)F_{Y}(b) \ \ for \ \ all \ \ a,b

if we consider the discrete random variables X and Y then

p(x,y)=p_{X}(x)p_{Y}(y) \ \ for \ \ all \ \ x,y

since

P\left \{ X\in A, Y\in B \right \} =\sum_{y\in B}\sum_{x \in A}p(x,y)

=\sum_{y\in B}^{}\sum_{x \in A}^{}p_{X}(x)p_{Y}(y)

=\sum_{y\in B}p_{Y}(y) \sum_{x\in A}p_{X}(x)

= P\left \{ Y \in B \right \} P\left \{ X \in A \right \}

similarly for the continuous random variable also

f(x,y)=f_{X}(x)f_{Y}(y) \ \ for \ \ all \ \ x,y

Examples of independent joint distributions

  1. If the number of patients entering a hospital on a specific day is Poisson distributed with parameter λ, and each patient is male with probability p and female with probability (1-p), show that the numbers of male and female patients entering the hospital are independent Poisson random variables with parameters λp and λ(1-p).

Denote the numbers of male and female patients by the random variables X and Y respectively; then

P\left \{ X=i, Y=j \right \}= P\left \{ X=i, Y=j|X +Y=i+j \right \}P\left \{ X+Y=i+j \right \}+P\left \{ X=i,Y=j|X +Y\neq i+j \right \}P\left \{ X+Y\neq i+j \right \}

Since the event {X = i, Y = j} is impossible when X + Y ≠ i + j, the conditional probability in the second term is zero, so this reduces to

P\left \{ X=i, Y=j \right \}= P\left \{ X=i, Y=j|X +Y=i+j \right \}P\left \{ X+Y=i+j \right \}

Since X + Y is the total number of patients entering the hospital, which is Poisson distributed,

P\left { X+Y=i+j \right }=e^{-\lambda }\frac{\lambda ^{i+j}}{(i+j)!}

Since the probability of a male patient is p and of a female patient is (1-p), the number of males out of a fixed total of i+j patients follows a binomial probability, so

P\left \{ X=i, Y=j|X + Y=i+j \right \}=\binom{i+j}{i}p^{i}(1-p)^{j}

using these two values we will get the above joint probability as

P\left \{ X=i, Y=j \right \}=\binom{i+j}{i}p^{i}(1-p)^{j}e^{-\lambda} \frac{\lambda ^{i+j}}{(i+j)!}

=e^{-\lambda} \frac{(\lambda p)^{i}}{i! j!}\left [ \lambda (1-p) \right ]^{j}

=e^{-\lambda p} \frac{(\lambda p)^i}{i!} e^{-\lambda (1-p)} \frac{\left [ \lambda (1-p) \right ]^{j}}{j!}

thus the marginal probabilities for the numbers of male and female patients will be

P\left \{ X=i \right \} =e^{-\lambda p} \frac{(\lambda p)^i}{i!} \sum_{j} e^{-\lambda (1-p)} \frac{\left [ \lambda (1-p) \right ]^{j}}{j!} = e^{-\lambda p} \frac{(\lambda p)^i}{i!}

and

P\left \{ Y=j \right \} =e^{-\lambda (1-p)} \frac{\left [ \lambda (1-p) \right ]^{j}}{j!}

which shows that both of them are Poisson random variables with parameters λp and λ(1-p); and since the joint probability mass function factors into the product of these marginals, X and Y are independent.

2. Find the probability that the first of two people to arrive at a meeting (a person and a client) has to wait longer than ten minutes for the other, if each arrives independently at a time uniformly distributed between 12 pm and 1 pm.

Let the random variables X and Y denote the arrival times (in minutes after 12 pm) of the person and the client; then, by symmetry, the required probability is

2P \left \{ X+10 < Y  \right \} =2 \int_{X+10 < Y} \int f(x,y)dxdy

=2 \int_{X+10 < Y} \int f_{X}(x) f_{Y}(y)dxdy

=2 \int_{10}^{60} \int_{0}^{y-10} \left (\frac{1}{60}\right )^{2} dxdy

=\frac{2}{(60)^{2}}\int_{10}^{60} (y-10)dy

=\frac{25}{36}
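
The value 25/36 can be confirmed with a small simulation. The sketch below (assuming NumPy) measures arrival times in minutes after 12 pm, so the first to arrive waits more than ten minutes exactly when |X - Y| > 10.

```python
# Monte Carlo check: both arrival times uniform on (0, 60) minutes.
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0, 60, size=1_000_000)
y = rng.uniform(0, 60, size=1_000_000)

print((np.abs(x - y) > 10).mean(), 25 / 36)   # both ~ 0.6944
```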

3. Calculate the probability

P\left \{ X\geq YZ \right \}

where X, Y and Z are independent uniform random variables over the interval (0,1).

here the probability will be

P\left \{ X\geq YZ \right \} = \int \int_{x\geq yz}\int f_{X,Y,Z} (x,y,z) dxdydz

for the uniform distribution the density function

f_{X,Y,Z} (x,y,z) =f_{X} (x) f_{Y}(y) f_{Z}(z) =1 , \ \ 0\leq x\leq 1 , \ \ 0\leq y\leq 1 , \ \ 0\leq z\leq 1

for the given range so

=\int_{0}^{1}\int_{0}^{1}\int_{yz}^{1} dxdydz

=\int_{0}^{1}\int_{0}^{1} (1-yz) dydz

=\int_{0}^{1}\left ( 1-\frac{z}{2} \right ) dz

=\frac{3}{4}
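
Again, a short Monte Carlo sketch (assuming NumPy) confirms the value 3/4.

```python
# Monte Carlo check of P(X >= YZ) for X, Y, Z independent Uniform(0, 1).
import numpy as np

rng = np.random.default_rng(2)
x, y, z = rng.uniform(size=(3, 1_000_000))

print((x >= y * z).mean(), 3 / 4)   # both ~ 0.75
```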

SUMS OF INDEPENDENT RANDOM VARIABLES BY JOINT DISTRIBUTION

  For the sum of independent continuous random variables X and Y with probability density functions f_X and f_Y, the cumulative distribution function will be

F_{X+Y} (a)= P\left \{ X+Y\leq a \right \}

= \int_{x+y\leq a}\int f_{X} (x)f_{Y}(y)dxdy

= \int_{-\infty}^{\infty}\int_{-\infty}^{a-y} f_{X}(x)f_{Y}(y)dxdy

= \int_{-\infty}^{\infty}\int_{-\infty}^{a-y} f_{X}(x) dx f_{Y}(y)dy

= \int_{-\infty}^{\infty} F_{X} (a-y) f_{Y}(y)dy

Differentiating this cumulative distribution function gives the probability density function of the sum of these independent variables:

f_{X+Y} (a)=\frac{\mathrm{d} }{\mathrm{d} a}\int_{-\infty}^{\infty} F_{X} (a-y)f_{Y} (y)dy

f_{X+Y} (a)=\int_{-\infty}^{\infty} \frac{\mathrm{d} }{\mathrm{d} a} F_{X} (a-y)f_{Y} (y)dy

=\int_{-\infty}^{\infty} f_{X} (a-y)f_{Y} (y)dy
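
This convolution formula can be illustrated numerically. The sketch below (assuming NumPy and SciPy) convolves two exponential densities with rate 1, for which the sum is known to have the Gamma(2, 1) density a e^{-a}.

```python
# Numerical convolution f_{X+Y}(a) = ∫ f_X(a - y) f_Y(y) dy for two Exp(1) densities.
import numpy as np
from scipy.integrate import quad

f_X = lambda t: np.exp(-t) * (t > 0)   # Exp(1) density, zero for t <= 0
f_Y = f_X

def f_sum(a):
    # integrate over the region where both factors are nonzero: 0 < y < a
    val, _ = quad(lambda y: f_X(a - y) * f_Y(y), 0.0, max(a, 0.0))
    return val

for a in (0.5, 1.0, 2.0):
    print(a, f_sum(a), a * np.exp(-a))   # convolution vs. the closed form a e^{-a}
```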

Using these two results, we now look at some continuous random variables and the sums of such independent variables.

sum of independent uniform random variables

   For independent random variables X and Y uniformly distributed over the interval (0,1), the probability density function of each of these variables is

f_{X}(a)=f_{Y}(a) = \begin{cases} 1 & \ 0< a< 1 \\ \ \ 0 & \text{ otherwise } \end{cases}

so for the sum X+Y we have

f_{X+Y}(a) = \int_{0}^{1}f_{X}(a-y)dy

for any value of a between zero and one

f_{X+Y}(a)= \int_{0}^{a}dy =a

while if we restrict a to between one and two, it will be

f_{X+Y}(a)= \int_{a-1}^{a}dy =2-a

this gives the triangular-shaped density function

f_{X+Y}(a) = \begin{cases} \ a & 0\leq a \leq 1 \\ \ 2-a & \ 1< a< 2 \\ \ 0 & \text{ otherwise } \end{cases}

if we generalize to n independent uniform random variables X1 to Xn, then their distribution function

F_{n}(x)=P\left ( X_{1} + \cdots + X_{n} \leq x \right )

by mathematical induction will be

F_{n}(x)=\frac{x^{n}}{n!} , 0\leq x\leq 1
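
This formula F_n(x) = x^n/n! on 0 ≤ x ≤ 1 can be checked by simulation. The sketch below (assuming NumPy) uses n = 3 and x = 0.8 as illustrative values of my own choosing.

```python
# Monte Carlo check of F_n(x) = x^n / n! for the sum of n Uniform(0, 1) variables.
import numpy as np
from math import factorial

rng = np.random.default_rng(3)
n, x = 3, 0.8
sums = rng.uniform(size=(1_000_000, n)).sum(axis=1)

print((sums <= x).mean(), x ** n / factorial(n))   # both ~ 0.0853
```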

sum of independent Gamma random variables

    If we have two independent gamma random variables X and Y with parameters (s, λ) and (t, λ) respectively, each with the usual density function of the form

f(y)= \frac{\lambda e^{-\lambda y}(\lambda y)^{t-1}}{\Gamma (t)} \ \ , 0< y< \infty

then, following the convolution formula above, the density for the sum of these independent gamma random variables is

f_{X+Y}(a)=\frac{1}{\Gamma (s)\Gamma (t)}\int_{0}^{a}\lambda e^{-\lambda (a-y)}\left [ \lambda (a-y) \right ]^{s-1}\lambda e^{-\lambda y} (\lambda y)^{t-1}dy

=K e^{-\lambda a} \int_{0}^{a}\left [ (a-y) \right ]^{s-1}(y)^{t-1}dy

=K e^{-\lambda a} a^{s+t-1} \int_{0}^{1} (1-x)^{s-1}x^{t-1} dx \ \ by \ \ letting \ \ x=\frac{y}{a}

=C e^{-\lambda a} a^{s+t-1}

f_{X+Y}(a)=\frac{\lambda e^{-\lambda a} (\lambda a)^{s+t-1}}{\Gamma (s+t)}

which is the gamma density with parameters (s + t, λ); this shows that the sum of independent gamma random variables with a common rate λ is again gamma distributed
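
This result can also be checked empirically. The sketch below (assuming NumPy and SciPy, with illustrative parameter values of my own choosing) compares samples of the sum of independent Gamma(s, λ) and Gamma(t, λ) variables against the Gamma(s+t, λ) distribution using a Kolmogorov–Smirnov test.

```python
# Empirical check that Gamma(s, λ) + Gamma(t, λ) behaves like Gamma(s + t, λ).
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
s, t, lam = 2.0, 3.5, 1.5

x = rng.gamma(shape=s, scale=1 / lam, size=500_000)
y = rng.gamma(shape=t, scale=1 / lam, size=500_000)

# a large p-value indicates the sum is consistent with the Gamma(s + t, λ) CDF
print(stats.kstest(x + y, stats.gamma(a=s + t, scale=1 / lam).cdf))
```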

sum of independent exponential random variables

    In a similar way, since an exponential random variable is a gamma random variable with a specific choice of parameters, the density function and distribution function for the sum of independent exponential random variables can be obtained by substituting those values into the gamma result.

Sum of independent normal random variable | sum of independent Normal distribution

                If we have n independent normal random variables Xi, i = 1, 2, …, n with respective means μi and variances σi², then their sum is also a normal random variable with mean Σμi and variance Σσi².

    We first show this for two independent normal random variables: X with parameters 0 and σ², and Y with parameters 0 and 1. To find the probability density function for the sum X + Y, set

c=\frac{1}{2\sigma ^{2}} +\frac{1}{2} =\frac{1+\sigma ^{2}}{2\sigma ^{2}}

in the convolution formula for the density of the sum

f_{X+Y}(a)=\int_{-\infty}^{\infty}f_{X}(a-y)f_{Y}(y)dy

with the help of the definition of the density function of the normal distribution,

  f_{X}(a-y)f_{Y}(y)=\frac{1}{\sqrt{2\pi }\sigma } \exp\left \{ -\frac{(a-y)^{2}}{2\sigma ^{2}} \right \}\frac{1}{\sqrt{2\pi }}\exp\left \{ -\frac{y^{2}}{2} \right \}

=\frac{1}{2\pi \sigma } \exp \left \{ -\frac{a^{2}}{2\sigma ^{2}} \right \} \exp \left \{ -c\left ( y^{2} -2y\frac{a}{1+\sigma ^{2}}\right ) \right \}

thus the density function will be

f_{X+Y}(a)=\frac{1}{2\pi \sigma }\exp \left \{ -\frac{a^{2}}{2\sigma ^{2}} \right \} \exp \left \{ \frac{a^{2}}{2\sigma ^{2}(1+\sigma ^{2})} \right \} \times \int_{-\infty}^{\infty} \exp \left \{ -c\left ( y-\frac{a}{1+\sigma ^{2}} \right )^{2} \right \} dy

=\frac{1}{2\pi \sigma } \exp \left \{ - \frac{a^{2}}{2(1+\sigma ^{2})} \right \} \int_{-\infty}^{\infty} \exp \left \{ -cx^{2} \right \} dx

=C \exp \left \{ -\frac{a^{2}}{2(1+\sigma ^{2})} \right \}

which is nothing but the density function of a normal distribution with mean 0 and variance (1+σ²). Following the same argument, we can write

X_{1} + X_{2}=\sigma_{2}\left ( \frac{X_{1}-\mu_{1}}{\sigma_{2}}+\frac{X_{2}-\mu_{2}}{\sigma_{2}} \right ) +\mu_{1} +\mu_{2}

with the usual means and variances; expanding and applying the above result, we observe that the sum is normally distributed with mean equal to the sum of the respective means and variance equal to the sum of the respective variances,

thus in the same way the sum of n such variables will be a normally distributed random variable with mean Σμi and variance Σσi².

Sums of independent Poisson random variables

If we have two independent Poisson random variables X and Y with parameters λ1 and λ2, then their sum X+Y is also a Poisson random variable, i.e. Poisson distributed.

Since X and Y are Poisson distributed and the event {X+Y = n} can be written as the union of the disjoint events {X = k, Y = n-k} for k = 0, …, n, we have

P \left \{ X+Y =n \right \} =\sum_{k=0}^{n}P\left \{ X=k, Y=n-k \right \}

=\sum_{k=0}^{n}P\left \{ X=k \right \}P\left \{ Y=n-k \right \}

=\sum_{k=0}^{n}e^{-\lambda_{1}} \frac{\lambda_{1}^{k}}{k!}e^{-\lambda_{2}}\frac{\lambda_{2}^{n-k}}{(n-k)!}

by using the probability mass functions of the independent Poisson random variables, so that

=e^{-(\lambda_{1}+\lambda_{2})} \sum_{k=0}^{n} \frac{\lambda_{1}^{k}\lambda_{2}^{n-k}}{k!(n-k)!}

=\frac{e^{-(\lambda_{1}+\lambda_{2})}}{n!}\sum_{k=0}^{n} \frac{n!}{k!(n-k)!} \lambda_{1}^{k}\lambda_{2}^{n-k}

=\frac{e^{-(\lambda_{1}+\lambda_{2})}}{n!} (\lambda_{1}+\lambda_{2})^{n}

so we get that the sum X+Y is also Poisson distributed with mean λ1+λ2.
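
The same convolution can be evaluated numerically. The sketch below (assuming SciPy, with illustrative values λ1 = 2, λ2 = 3, n = 4 of my own choosing) compares the convolution sum with the Poisson(λ1 + λ2) probability mass function.

```python
# Check that the convolution of Poisson(λ1) and Poisson(λ2) pmfs is Poisson(λ1 + λ2).
from scipy import stats

lam1, lam2, n = 2.0, 3.0, 4

# the same sum over k as in the derivation above
p = sum(stats.poisson.pmf(k, lam1) * stats.poisson.pmf(n - k, lam2) for k in range(n + 1))
print(p, stats.poisson.pmf(n, lam1 + lam2))   # both ~ 0.1755
```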

Sums of independent binomial random variables

                If we have two independent binomial random variables X and Y with parameters (n, p) and (m, p), then their sum X+Y is also a binomial random variable, i.e. binomially distributed with parameters (n+m, p).

Let us compute the probability of the sum using the definition of the binomial distribution:

P\left \{ X+Y= k \right \} =\sum_{i=0}^{n}P\left \{ X=i, Y=k-i \right \}

=\sum_{i=0}^{n}P\left \{ X=i \right \} P\left \{ Y=k-i \right \}

=\sum_{i=0}^{n}\binom{n}{i}p^{i}q^{n-i}\binom{m}{k-i}p^{k-i}q^{m-k+i}

where \ \ q=1-p \ \ and \ \ \binom{r}{j}=0 \ \ when \ \ j< 0 ; \ \ using \ \ the \ \ combinatorial \ \ identity

\binom{m+n}{k}=\sum_{i=0}^{n}\binom{n}{i}\binom{m}{k-i}

which gives

P\left \{ X+Y=k \right \}=p^{k}q^{n+m-k}\sum_{i=0}^{n}\binom{n}{i}\binom{m}{k-i} = \binom{n+m}{k}p^{k}q^{n+m-k}

so the sum X+Y is also binomially distributed with parameter (n+m, p).
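
As before, this convolution can be checked numerically. The sketch below (assuming SciPy, with illustrative values n = 4, m = 6, p = 0.3, k = 5 of my own choosing) compares the convolution of the two binomial pmfs with the Binomial(n+m, p) pmf.

```python
# Check that the convolution of Binomial(n, p) and Binomial(m, p) pmfs is Binomial(n + m, p).
from scipy import stats

n, m, p, k = 4, 6, 0.3, 5

conv = sum(stats.binom.pmf(i, n, p) * stats.binom.pmf(k - i, m, p) for i in range(n + 1))
print(conv, stats.binom.pmf(k, n + m, p))   # both ~ 0.1029
```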

Conclusion:

The concept of jointly distributed random variables, which describes the distribution of more than one random variable at a time, has been discussed, along with the basic concept of independent random variables through the joint distribution and the sums of independent random variables for several distributions with their parameters. If you require further reading, go through the books mentioned below. For more posts on mathematics, please click here.

https://en.wikipedia.org

A First Course in Probability by Sheldon Ross

Schaum's Outlines of Probability and Statistics

An Introduction to Probability and Statistics by Rohatgi and Saleh
