(14) Joint Distribution
TRANSCRIPT
7/30/2019
APPLIED STATISTICS AND COMPUTING LAB

Joint Distribution
In this section we give an introduction to the joint distribution of random variables.
We give examples of discrete and continuous joint distributions. From the joint distribution of x1, ..., xp we obtain the individual distribution (which we call the marginal distribution) of each of x1, x2, ..., xp. We define the concept of conditional distribution. We briefly study the concept of independence of random variables and its relation to uncorrelatedness. First, let us consider a few examples.
Example 1: A college has 2 specialists in long distance running, 4 specialists in tennis and 6 top level cricketers among its students. The college plans to send 3 sportsmen from the above for participating in the university sports and games. The three sportsmen are selected randomly from among the above 12. Let x1 and x2 denote respectively the number of long distance specialists and the number of tennis specialists chosen. The joint probability mass function of x1 and x2 is defined as p(i, j) = P{x1 = i, x2 = j} for i = 0, 1, 2 and j = 0, 1, 2, 3. Obtain the joint probability mass function of x1 and x2.
Solution: Writing C(n, k) for the number of ways of choosing k out of n,
p(0, 0) = P{x1 = 0, x2 = 0} = C(6,3)/C(12,3) = 20/220
p(0, 1) = P{x1 = 0, x2 = 1} = C(4,1)·C(6,2)/C(12,3) = 60/220
Similarly, p(0, 2) = C(4,2)·C(6,1)/C(12,3) = 36/220
p(0, 3) = C(4,3)/C(12,3) = 4/220
p(1, 0) = C(2,1)·C(6,2)/C(12,3) = 30/220
p(1, 1) = C(2,1)·C(4,1)·C(6,1)/C(12,3) = 48/220
p(1, 2) = C(2,1)·C(4,2)/C(12,3) = 12/220
p(1, 3) = 0 since the number of persons chosen is 3.
p(2, 0) = C(2,2)·C(6,1)/C(12,3) = 6/220
p(2, 1) = C(2,2)·C(4,1)/C(12,3) = 4/220
p(2, 2) = 0
p(2, 3) = 0
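As a check, all ten probabilities above follow from one counting rule. A minimal sketch in Python (the helper name p is ours, not from the text):

```python
from math import comb
from fractions import Fraction

# 2 long distance runners, 4 tennis players, 6 cricketers; 3 of the
# 12 are chosen at random, so every selection has probability 1/C(12,3).
TOTAL = comb(12, 3)  # 220

def p(i, j):
    """P{x1 = i, x2 = j}: i runners, j tennis players, 3 - i - j cricketers."""
    k = 3 - i - j
    if k < 0:
        return Fraction(0)  # would require more than 3 persons
    return Fraction(comb(2, i) * comb(4, j) * comb(6, k), TOTAL)

print(p(0, 0), p(1, 1), p(1, 3))  # 1/11 12/55 0  (i.e. 20/220, 48/220, 0)
print(sum(p(i, j) for i in range(3) for j in range(4)))  # 1
```

Exact fractions are used so the probabilities can be compared directly with the values above.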
The values taken by x1 and x2 and the corresponding probabilities constitute the joint distribution of x1 and x2 and can be expressed in a tabular form as follows:

Table 1
Joint distribution of x1 and x2

Value taken by x2:       0          1          2         3       Row sum
x1 = 0                20/220     60/220     36/220     4/220     120/220
x1 = 1                30/220     48/220     12/220       0        90/220
x1 = 2                 6/220      4/220       0          0        10/220
Column sum            56/220    112/220     48/220     4/220        1
From the joint distribution table it is easy to write down the distributions of x1 and x2, which we call the marginal distributions of x1 and x2 respectively.
P{x1 = 0} = P{x1 = 0, x2 = 0} + P{x1 = 0, x2 = 1} + P{x1 = 0, x2 = 2} + P{x1 = 0, x2 = 3}
= 20/220 + 60/220 + 36/220 + 4/220 = 120/220.
Notice that P{x1 = 0} is the row-sum corresponding to x1 = 0 in the above table. Accordingly this is recorded as the row-sum corresponding to x1 = 0. Similarly the second and third row-sums are the probabilities of x1 = 1 and x1 = 2 respectively.
Thus the marginal distribution of x1 is
Table 2
Marginal distribution of x1

Value          0         1         2
Probability  120/220   90/220   10/220
Similarly, the marginal distribution of x2 is obtained using the column sums in Table 1. Thus the marginal distribution of x2 is

Table 3
Marginal distribution of x2

Value          0         1         2        3
Probability  56/220   112/220   48/220    4/220
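The row-sum and column-sum computations can be sketched in code as well, assuming the counting rule of Example 1 (the helper name p is ours):

```python
from math import comb
from fractions import Fraction

def p(i, j):
    # P{x1 = i, x2 = j} for Example 1: i runners, j tennis players, rest cricketers
    k = 3 - i - j
    return Fraction(comb(2, i) * comb(4, j) * comb(6, k), comb(12, 3)) if k >= 0 else Fraction(0)

# Marginal of x1 via row sums; marginal of x2 via column sums.
marg_x1 = [sum(p(i, j) for j in range(4)) for i in range(3)]
marg_x2 = [sum(p(i, j) for i in range(3)) for j in range(4)]
print(marg_x1)  # 120/220, 90/220, 10/220 (in lowest terms)
print(marg_x2)  # 56/220, 112/220, 48/220, 4/220 (in lowest terms)
```

Each marginal sums to 1, as it must for a probability distribution.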
Suppose we are given the additional information that no long distance running specialist is selected, or in other words, we know that x1 = 0. Then what are the probabilities for x2 = 0, 1, 2, 3 given this additional information? Notice that we are looking for the conditional probabilities P{x2 = j | x1 = 0} for j = 0, 1, 2, 3. We can compute them using the row corresponding to x1 = 0.
P{x2 = 0 | x1 = 0} = P{x1 = 0, x2 = 0}/P{x1 = 0} = (20/220)/(120/220) = 20/120 = 1/6
P{x2 = 1 | x1 = 0} = (60/220)/(120/220) = 60/120 = 1/2
P{x2 = 2 | x1 = 0} = (36/220)/(120/220) = 36/120 = 3/10
P{x2 = 3 | x1 = 0} = (4/220)/(120/220) = 4/120 = 1/30
The distribution of x2 given x1 = 0 is called the conditional distribution of x2 given x1 = 0 and can be expressed neatly in the following table.

Table 4
Conditional distribution of x2 | x1 = 0

Value          0     1     2      3
Probability   1/6   1/2   3/10   1/30
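The conditional column is just the x1 = 0 row of the joint table rescaled by P{x1 = 0}; a sketch (again with our own helper p for the Example 1 probabilities):

```python
from math import comb
from fractions import Fraction

def p(i, j):
    # P{x1 = i, x2 = j} for Example 1
    k = 3 - i - j
    return Fraction(comb(2, i) * comb(4, j) * comb(6, k), comb(12, 3)) if k >= 0 else Fraction(0)

p_row0 = sum(p(0, j) for j in range(4))      # P{x1 = 0} = 120/220
cond = [p(0, j) / p_row0 for j in range(4)]  # P{x2 = j | x1 = 0}
print(cond)  # 1/6, 1/2, 3/10, 1/30
```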
E1. In Example 1, let x3 = number of cricketers chosen. Write down the joint distribution of x2 and x3. Obtain the marginal distributions of x2 and x3. Obtain the conditional distribution of x3 given x2 = 1.
E2. In Example 1, let pijk denote P{x1 = i, x2 = j, x3 = k}. Obtain pijk for i = 0, 1, 2; j = 0, 1, 2, 3 and k = 0, 1, 2, ..., 6. The values of x1, x2 and x3 and the corresponding pijk constitute the joint distribution of x1, x2 and x3.
The random variables x1, x2 and x3 in Example 1 and E1 are discrete. x = (x1, x2, x3)^t is called a discrete random vector and the distribution of x (the joint distribution of x1, x2 and x3) in such a case is called a discrete multivariate distribution.
On the other hand, we say that x1, ..., xp are jointly continuous (x = (x1, ..., xp)^t is continuous) if there exists a function f(u1, ..., up) defined for all u1, ..., up having the property that for every set A of p-tuples,
P((x1, ..., xp) ∈ A) = ∫ ... ∫ f(u1, ..., up) du1 ... dup,
where the integration is over all (u1, ..., up) ∈ A.
The function f(u1, ..., up) is called the probability density function of x (or the joint probability density function of x1, ..., xp). If A1, ..., Ap are sets of real numbers such that A = {(u1, ..., up) : ui ∈ Ai, i = 1, ..., p} we can write
P{(x1, ..., xp) ∈ A} = P{xi ∈ Ai, i = 1, ..., p} = ∫_{Ap} ... ∫_{A1} f(u1, ..., up) du1 ... dup.
The distribution function of x = (x1, ..., xp)^t is defined as
F(a1, a2, ..., ap) = P{x1 ≤ a1, ..., xp ≤ ap} = ∫_{-∞}^{ap} ... ∫_{-∞}^{a1} f(u1, ..., up) du1 ... dup.
It follows, upon differentiation, that
f(a1, ..., ap) = ∂^p F(a1, ..., ap)/(∂a1 ... ∂ap),
provided the partial derivatives are defined.
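This differentiation identity can be checked symbolically; here is a sketch using sympy and the bivariate density 2e^(-u1 - 2u2) that appears later in this section:

```python
import sympy as sp

u1, u2, a1, a2 = sp.symbols('u1 u2 a1 a2', positive=True)
f = 2 * sp.exp(-u1 - 2 * u2)  # a bivariate density on u1 > 0, u2 > 0

# Distribution function F(a1, a2): integral of f over (0, a1) x (0, a2)
F = sp.integrate(f, (u1, 0, a1), (u2, 0, a2))

# Differentiating F with respect to a1 and a2 recovers the density.
recovered = sp.diff(F, a1, a2)
print(sp.simplify(recovered - f.subs({u1: a1, u2: a2})))  # 0
```

The lower limits are 0 rather than -∞ only because this density vanishes for negative arguments.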
Another interpretation of the density function of x can be given using the following: for small da1, ..., dap,
P{ai < xi ≤ ai + dai, i = 1, ..., p} ≈ f(a1, ..., ap) da1 ... dap.
Now partition x as x1 = (x1, ..., xr)^t and x2 = (x_{r+1}, ..., xp)^t, and write u1 = (u1, ..., ur)^t and u2 = (u_{r+1}, ..., up)^t. The conditional density of x1 given x2 = u2 is
f_{x1|x2}(u1 | u2) = f(u1, ..., up)/f_{x2}(u_{r+1}, ..., up),
and for small du1, ..., dup,
f_{x1|x2}(u1 | u2) du1 ... dur ≈ P{ui < xi ≤ ui + dui, i = 1, ..., p}/P{uj < xj ≤ uj + duj, j = r+1, ..., p}
= P{ui < xi ≤ ui + dui, i = 1, ..., r | uj < xj ≤ uj + duj, j = r+1, ..., p}.
Thus for small values of du1, ..., dup, f_{x1|x2}(u1 | u2) du1 ... dur represents the conditional probability that xi lies between ui and ui + dui, i = 1, ..., r, given that xj lies between uj and uj + duj, j = r+1, ..., p.
Let us now consider a few examples.
Example 2: Consider x = (x1, x2)^t with density
fx(u1, u2) = 2 - u1 - u2 for 0 < u1 < 1, 0 < u2 < 1, and 0 elsewhere.
Obtain the marginal density of x1.
Solution: For 0 < u1 < 1,
f_{x1}(u1) = ∫_0^1 (2 - u1 - u2) du2 = 2 - u1 - 1/2 = 3/2 - u1.
Thus
f_{x1}(u1) = 3/2 - u1 for 0 < u1 < 1, and 0 elsewhere.
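The integration can be checked symbolically. A sketch with sympy, using the density 2 - u1 - u2 on the unit square as read off from the computation above:

```python
import sympy as sp

u1, u2 = sp.symbols('u1 u2', positive=True)
f = 2 - u1 - u2  # joint density on 0 < u1 < 1, 0 < u2 < 1

f1 = sp.integrate(f, (u2, 0, 1))      # marginal of x1: 3/2 - u1
total = sp.integrate(f1, (u1, 0, 1))  # should be 1 for a valid density
print(f1, total)
```

Integrating the marginal over (0, 1) gives 1, confirming that the joint function is a genuine density.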
Example 3: Consider x = (x1, x2)^t with density
fx(u1, u2) = 2 e^(-u1 - 2u2) for 0 < u1 < ∞, 0 < u2 < ∞, and 0 elsewhere.
(a) Obtain the marginal density of x2.
(b) Obtain the conditional density of x1 given x2 = u2.
Solution: (a) Clearly f_{x2}(u2) = 0 whenever u2 ≤ 0. For u2 > 0,
f_{x2}(u2) = ∫_0^∞ 2 e^(-u1 - 2u2) du1 = 2 e^(-2u2).
(b) For u2 > 0 and u1 > 0,
f_{x1|x2}(u1 | u2) = fx(u1, u2)/f_{x2}(u2) = e^(-u1),
which does not depend on u2.
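Both parts can be verified symbolically; a sketch with sympy:

```python
import sympy as sp

u1, u2 = sp.symbols('u1 u2', positive=True)
f = 2 * sp.exp(-u1 - 2 * u2)  # joint density for u1 > 0, u2 > 0

f2 = sp.integrate(f, (u1, 0, sp.oo))  # marginal of x2: 2*exp(-2*u2)
cond = sp.simplify(f / f2)            # conditional of x1 given x2 = u2: exp(-u1)
print(f2)
print(cond)
```

Since the conditional density is free of u2, the joint density factors into its marginals here, i.e. x1 and x2 are independent.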
We give below a relationship between uncorrelatedness and independence.
Result 1. Let x1 and x2 be independent. Then Cov(x1, x2) = 0.
Proof: Let f_{x1}(u1) and f_{x2}(u2) be the densities of x1 and x2, where u1 = (u1, ..., ur)^t and u2 = (u_{r+1}, ..., up)^t. Then the joint density of x1 and x2 (i.e., the density of x = (x1^t, x2^t)^t) is
fx(u) = f_{x1}(u1)·f_{x2}(u2), where u = (u1^t, u2^t)^t.
Cov(x1, x2) = E(x1 x2^t) - E(x1)·E(x2^t)
= ∫ u1 u2^t f_{x1}(u1) f_{x2}(u2) du1 ... dup - (∫ u1 f_{x1}(u1) du1 ... dur)(∫ u2^t f_{x2}(u2) du_{r+1} ... dup) = 0,
since the first integral in the previous expression splits into the product of the two later integrals.
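Result 1 can also be illustrated empirically. A sketch with numpy, simulating the two independent exponentials of the example above (the seed and the sample size 200_000 are arbitrary choices of ours):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
x1 = rng.exponential(scale=1.0, size=n)  # Exp(1)
x2 = rng.exponential(scale=0.5, size=n)  # Exp(2): rate 2 means scale 1/2
# For independent samples the sample covariance should be near 0.
print(np.cov(x1, x2)[0, 1])
```

The printed value fluctuates around 0 with magnitude of order 1/sqrt(n).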
However the converse is not true as shown through the following exercise.
E3: Let x have the following distribution:

Value         -3   -1   1   3
Probability

(a) Show that the distribution of x^2 is

Value         1   9
Probability

(b) Write down the joint distribution of x and x^2.
(c) Show that x and x^2 are uncorrelated.
(d) Show that x and x^2 are not independent.
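The exercise leaves the probabilities to the reader; here is a sketch ASSUMING the common choice of equal probabilities 1/4 on each value (any assignment symmetric about 0 works), which gives Cov(x, x^2) = 0 while x and x^2 are clearly not independent:

```python
from fractions import Fraction

# ASSUMPTION: probability 1/4 on each value; not stated in the exercise.
dist = {-3: Fraction(1, 4), -1: Fraction(1, 4), 1: Fraction(1, 4), 3: Fraction(1, 4)}

E = lambda g: sum(prob * g(v) for v, prob in dist.items())
cov = E(lambda v: v**3) - E(lambda v: v) * E(lambda v: v**2)
print(cov)  # 0, so x and x^2 are uncorrelated

# Not independent: P{x = 3, x^2 = 1} = 0, yet P{x = 3}*P{x^2 = 1} = 1/8.
print(dist[3] * (dist[-1] + dist[1]))  # 1/8
```

Uncorrelatedness here rests on E(x) = E(x^3) = 0, which any symmetric distribution delivers.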
E4: Consider a random vector x = (x1, x2)^t with the density
fx(u1, u2) = c u1 u2 (...) for 0 < u1 < 1, 0 < u2 < 1, and 0 otherwise.