STAT 111 Chapter 6: Jointly Distributed Random Variables

Upload: neal-mcdowell
Posted on 23-Dec-2015

Page 1

STAT 111

Chapter 6

Jointly Distributed Random Variables

Page 2

In many experiments it is necessary to consider the properties of two or more random variables simultaneously. In the following, we shall be concerned with the bivariate case, that is, with situations where we are interested at the same time in a pair of random variables. Later, we shall extend this discussion to the multivariate case, covering any finite number of random variables.

If X and Y are two discrete random variables, the probability distribution for their simultaneous occurrence can be represented by a function with values f (x, y) for any pair of values (x, y) within the range of the random variables X and Y. It is common to refer to this function as the joint probability distribution

of X and Y, and it is defined as

f (x, y) = P(X = x, Y = y)


Page 3

Definition: The function f(x, y) is a joint probability distribution (or joint probability mass function) of the discrete random variables X and Y if and only if its values satisfy the conditions:

1. f(x, y) ≥ 0 for all (x, y);

2. ∑x ∑y f(x, y) = 1, where the double summation extends over all (x, y).

Based on this definition, the joint cumulative distribution function gives the joint probability that X ≤ x and Y ≤ y:

F(x, y) = P(X ≤ x, Y ≤ y) = ∑_{xi ≤ x} ∑_{yj ≤ y} f(xi, yj)

for -∞ < x < ∞, -∞ < y < ∞.


Page 4

Example: Let X denote the number of heads and Y the number of heads minus the number of tails when three balanced coins are tossed. Find the joint probability distribution of X and Y.

The possible values of (X, Y) are (0, -3), (1, -1), (2, 1), (3, 3), with

P(X = 0, Y = -3) = 1/8,  P(X = 1, Y = -1) = 3/8,  …

These probabilities can most easily be expressed in tabular form, as in this table:

Y \ X    0      1      2      3      Sum
-3       1/8    0      0      0      1/8
-1       0      3/8    0      0      3/8
1        0      0      3/8    0      3/8
3        0      0      0      1/8    1/8
Sum      1/8    3/8    3/8    1/8    1
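As a check, the joint table above can be reproduced by brute-force enumeration of the 2³ equally likely outcomes; a minimal Python sketch:

```python
from itertools import product
from fractions import Fraction

# Enumerate the 8 equally likely outcomes of 3 tosses of a balanced coin
# and tabulate X = number of heads, Y = heads minus tails.
pmf = {}
for outcome in product("HT", repeat=3):
    heads = outcome.count("H")
    x, y = heads, heads - (3 - heads)
    pmf[(x, y)] = pmf.get((x, y), Fraction(0)) + Fraction(1, 8)

print(pmf[(0, -3)])  # 1/8
print(pmf[(1, -1)])  # 3/8
```

Each outcome contributes probability 1/8, and accumulating them by the pair (x, y) gives exactly the table entries.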

Page 5

Example: Suppose that 3 balls are randomly selected from an urn containing 3 red, 4 white, and 5 blue balls. Let X denote the number of red balls chosen and Y the number of white balls chosen. Find the joint probability function of X and Y, P(X + Y ≤ 2), P(X = 1, Y = 2), P(X = 0, 0 ≤ Y ≤ 2), and P(X > Y).

X = # red, Y = # white; urn: 3 R, 4 W, 5 B; 3 balls drawn.

X \ Y    0        1        2        3
0        10/220   40/220   30/220   4/220
1        30/220   60/220   18/220   0
2        15/220   12/220   0        0
3        1/220    0        0        0

1. X + Y ≤ 2 for the following values of (X, Y): {(0, 0), (1, 0), (2, 0), (0, 1), (1, 1), (0, 2)}; therefore,

P(X + Y ≤ 2) = 10/220 + 40/220 + 30/220 + 30/220 + 60/220 + 15/220 = 185/220

2. P(X = 1, Y = 2) = f(1, 2) = 18/220

3. P(X = 0, 0 ≤ Y ≤ 2) = f(0, 0) + f(0, 1) + f(0, 2) = 10/220 + 40/220 + 30/220 = 80/220

4. P(X > Y) = f(1, 0) + f(2, 0) + f(2, 1) + f(3, 0) = 30/220 + 15/220 + 12/220 + 1/220 = 58/220


For example, f(0, 0) = P(X = 0, Y = 0) = C(5, 3) / C(12, 3) = 10/220.
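Since each f(x, y) is a ratio of binomial coefficients, the requested probabilities can be verified mechanically; a sketch using Python's math.comb:

```python
from math import comb
from fractions import Fraction

def f(x, y):
    """Joint pmf: x of 3 red, y of 4 white, the rest of 5 blue; 3 balls from 12."""
    if x < 0 or y < 0 or x + y > 3:
        return Fraction(0)
    return Fraction(comb(3, x) * comb(4, y) * comb(5, 3 - x - y), comb(12, 3))

support = [(x, y) for x in range(4) for y in range(4)]
print(sum(f(x, y) for x, y in support if x + y <= 2))  # 37/44  (= 185/220)
print(f(1, 2))                                         # 9/110  (= 18/220)
print(sum(f(x, y) for x, y in support if x > y))       # 29/110 (= 58/220)
```

Using Fraction keeps every probability exact, so the printed values are the reduced forms of the fractions over 220.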

Page 6

Example

Let the joint probability mass function of X and Y be given in the following table:

Y \ X    0       1       2
0        1/16    1/8     1/8
1        1/16    1/8     1/16
2        1/8     1/4     1/16

Find:

1. P(X = 1, Y ≤ 0) = f(1, 0) = 1/8

2. P(X = 2, Y ≤ 0) = f(2, 0) = 1/8

3. P(X = 2, X + Y = 4) = f(2, 2) = 1/16

4. P(1 ≤ X < 3, Y ≥ 1) = f(1, 1) + f(1, 2) + f(2, 1) + f(2, 2) = 1/8 + 1/4 + 1/16 + 1/16 = 1/2

5. P(X ≤ 2) = 1 - P(X > 2) = 1 - 0 = 1

6. F(1, 1) = P(X ≤ 1, Y ≤ 1) = f(1, 0) + f(1, 1) + f(0, 0) + f(0, 1) = 1/8 + 1/8 + 1/16 + 1/16 = 3/8


Page 7

Example

Determine the value of c so that the following functions represent joint probability distributions of the random variables X and Y:

1. f(x, y) = c x y, for x = 1, 2, 3; y = 1, 2, 3.

∑∑ f(x, y) = c[1 + 2 + 3 + 2 + 4 + 6 + 3 + 6 + 9] = 36c = 1, so c = 1/36

2. f(x, y) = c|x - y|, for x = -2, 0, 2; y = 1, 2, 3.

∑∑ f(x, y) = c[3 + 4 + 5 + 1 + 2 + 3 + 1 + 0 + 1] = 20c = 1, so c = 1/20
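Both bracketed sums can be checked mechanically; a quick sketch:

```python
from fractions import Fraction

# f(x, y) = c*x*y over x, y = 1, 2, 3: the bracketed sum is 36, so c = 1/36.
s1 = sum(x * y for x in (1, 2, 3) for y in (1, 2, 3))
# f(x, y) = c*|x - y| over x = -2, 0, 2 and y = 1, 2, 3: the sum is 20, so c = 1/20.
s2 = sum(abs(x - y) for x in (-2, 0, 2) for y in (1, 2, 3))

print(Fraction(1, s1), Fraction(1, s2))  # 1/36 1/20
```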


Page 8

All the preceding definitions concerning two random variables can be generalized to the multivariate case, where there are n random variables. The values of the joint probability distribution of the discrete random variables X1,X2 , …, Xn, defined over the same sample space S, are given by

f(x1, x2, ..., xn) = P(X1 = x1, X2 = x2, ..., Xn = xn)

for all n-tuples (x1, x2, ..., xn) within the range of the random variables. Also, the values of their joint cumulative distribution function are given by

F ( x1 , x2 ,… , xn ) = P ( X1 ≤ x1 , X2 ≤ x2, …Xn ≤ xn)


Page 9

Example

Considering n flips of a balanced coin, let X1 be the number of heads (0 or 1) obtained on the first flip, X2 the number of heads obtained on the second flip, ..., and Xn the number of heads obtained on the nth flip. Find the joint probability distribution of these n random variables.

f(x1, x2, ..., xn) = P(X1 = x1, X2 = x2, ..., Xn = xn) = (1/2) × (1/2) × ... × (1/2) = 1/2^n

where each Xi can take on the value 0 or 1.
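Because the flips are independent, the joint pmf is a product of n factors of 1/2; a minimal sketch:

```python
from fractions import Fraction

def joint(xs):
    """Joint pmf of n independent balanced-coin indicators: each factor is 1/2."""
    assert all(x in (0, 1) for x in xs)  # each Xi is 0 or 1
    return Fraction(1, 2) ** len(xs)

print(joint((1, 0, 1, 1)))  # 1/16, i.e. 1/2**4
```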


Page 10

Marginal Distributions: If X and Y are discrete random variables and f(x, y) is the value of their joint probability distribution at (x, y), the function given by

fX(x) = ∑y f(x, y)

for each x within the range of X, is called the marginal distribution of X. Similarly, the function given by

fY(y) = ∑x f(x, y)

for each y within the range of Y, is called the marginal distribution of Y. Note that, because these probabilities are obtained from the margins of the joint table, we call them marginal distributions.


Page 11

Example: Assume that the random variables X and Y have the following joint probability mass function. Find the marginal distributions of X and Y.

X \ Y    0        1         2        3        Sum
0        10/220   40/220    30/220   4/220    84/220
1        30/220   60/220    18/220   0        108/220
2        15/220   12/220    0        0        27/220
3        1/220    0         0        0        1/220
Sum      56/220   112/220   48/220   4/220    1

Marginal of X:

x        0        1         2        3        Sum
f(x)     84/220   108/220   27/220   1/220    1

Marginal of Y:

y        0        1         2        3        Sum
f(y)     56/220   112/220   48/220   4/220    1
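The marginals are just row and column sums of the joint table; a sketch reusing the urn example's pmf:

```python
from math import comb
from fractions import Fraction

def f(x, y):
    # joint pmf of the urn example (3 red, 4 white, 5 blue; 3 balls drawn)
    if x + y > 3:
        return Fraction(0)
    return Fraction(comb(3, x) * comb(4, y) * comb(5, 3 - x - y), comb(12, 3))

fx = {x: sum(f(x, y) for y in range(4)) for x in range(4)}  # row sums
fy = {y: sum(f(x, y) for x in range(4)) for y in range(4)}  # column sums

print(fx[0])  # 21/55 (= 84/220)
print(fy[1])  # 28/55 (= 112/220)
```

Each marginal sums to 1, as it must for a probability distribution.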

Page 12

Example Assume that the random variables X and Y have the joint probability mass function given as

f(x, y) = λ^(x+y) e^(-2λ) / (x! y!),   x = 0, 1, 2, ...;  y = 0, 1, 2, ...

Find the marginal distribution of X.

fX(x) = ∑_{y=0}^{∞} λ^(x+y) e^(-2λ) / (x! y!)

      = (λ^x e^(-2λ) / x!) ∑_{y=0}^{∞} λ^y / y!

      = (λ^x e^(-2λ) / x!) e^λ     [using e^λ = ∑_{t=0}^{∞} λ^t / t!]

      = λ^x e^(-λ) / x!
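The closed form can be sanity-checked numerically by truncating the infinite sum over y; λ and x below are arbitrary test values, not from the text:

```python
import math

lam, x = 1.7, 2  # arbitrary test values

# marginal fX(x): truncated sum over y of lam**(x+y) * exp(-2*lam) / (x! * y!)
marginal = sum(lam ** (x + y) * math.exp(-2 * lam)
               / (math.factorial(x) * math.factorial(y))
               for y in range(100))
closed = lam ** x * math.exp(-lam) / math.factorial(x)  # Poisson(lam) pmf at x

print(abs(marginal - closed) < 1e-12)  # True
```

The marginal is the Poisson(λ) pmf, so the truncated series agrees with the closed form to floating-point precision.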


Page 13

Example: Let the joint distribution of X and Y be given as

f(x, y) = (x + y)/30,   x = 0, 1, 2, 3;  y = 0, 1, 2

y \ x    0       1       2       3
0        0       1/30    1/15    1/10
1        1/30    1/15    1/10    2/15
2        1/15    1/10    2/15    1/6

Find the marginal distribution functions of X and Y.

Marginal of X:

f(x) = ∑_{y=0}^{2} (x + y)/30 = [(x + 0) + (x + 1) + (x + 2)]/30 = (3x + 3)/30 = (x + 1)/10

x        0       1       2       3
f(x)     1/10    1/5     3/10    2/5

Similarly, the marginal of Y is f(y) = (3 + 2y)/15:

y        0       1       2
f(y)     1/5     1/3     7/15

Page 14

Conditional Distribution: Recall that for any two events A and B, the conditional probability of A given B is defined as

P(A | B) = P(A ∩ B) / P(B)

provided P(B) ≠ 0. Suppose now that A and B are the events X = x and Y = y, so that we can write

P(X = x | Y = y) = P(X = x, Y = y) / P(Y = y)     (1)

provided P(Y = y) = fY(y) ≠ 0, where P(X = x, Y = y) = f(x, y) is the value of the joint probability distribution of X and Y at (x, y) and fY(y) is the value of the marginal distribution of Y at y. Denoting the probability in equation (1) by f(x | y), to indicate that x is a variable and y is fixed, we have the following definition.


Page 15

Definition: If f(x, y) is the value of the joint probability distribution of the discrete random variables X and Y at (x, y) and fY(y) is the value of the marginal distribution of Y at y, the function given by

f(x | y) = P(X = x | Y = y) = f(x, y) / fY(y)

for each x within the range of X, is called the conditional distribution of X given Y = y. Similarly, if fX(x) is the value of the marginal distribution of X at x, the function given by

f(y | x) = P(Y = y | X = x) = f(x, y) / fX(x)

for each y within the range of Y, is called the conditional distribution of Y given X = x.


Page 16

Example 1: The joint probability mass function of X and Y is given by

f(1, 1) = 1/8,  f(1, 2) = 1/4,  f(2, 1) = 1/8,  f(2, 2) = 1/2

Y \ X    1      2      Sum
1        1/8    1/8    2/8
2        1/4    1/2    6/8
Sum      3/8    5/8    1

1. Compute the conditional mass function of X given Y = i, i = 1, 2.

2. Compute P(XY ≤ 3) = f(1, 1) + f(1, 2) + f(2, 1) = 1/8 + 1/4 + 1/8 = 1/2

3. P(X/Y > 1) = f(2, 1) = 1/8

Marginal of Y:

y        1      2
f(y)     2/8    6/8

The conditional mass function of X given Y = 1:

x            1      2      Sum
f(x | y=1)   1/2    1/2    1

The conditional mass function of X given Y = 2:

x            1      2      Sum
f(x | y=2)   1/3    2/3    1
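The conditional pmfs above come from dividing each joint entry by the appropriate marginal of Y; a sketch:

```python
from fractions import Fraction

# joint pmf of Example 1, keyed by (x, y)
joint = {(1, 1): Fraction(1, 8), (1, 2): Fraction(1, 4),
         (2, 1): Fraction(1, 8), (2, 2): Fraction(1, 2)}

def cond_x_given_y(y):
    """f(x | y) = f(x, y) / fY(y)."""
    fy = sum(p for (xv, yv), p in joint.items() if yv == y)  # marginal of Y at y
    return {xv: p / fy for (xv, yv), p in joint.items() if yv == y}

print(cond_x_given_y(1))  # X | Y=1 is uniform: 1/2 each
print(cond_x_given_y(2))  # X | Y=2 gives 1/3 and 2/3
```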

Page 17

Independent Random Variables:

In Chapter 3 we stated that two events A and B are independent if and only if

P(A ∩ B) = P(A) P(B)

that is, their joint probability is equal to the product of their marginal probabilities. In the following, the notion of independence of random variables is presented.


Page 18

Definition: The random variables X and Y are said to be independent if and only if

f(x, y) = fX(x) fY(y)  for all possible values of X and Y.

In terms of the joint cumulative distribution function, X and Y are independent if and only if

F(x, y) = FX(x) FY(y)  for all possible values of X and Y,

or

P(X ∈ A, Y ∈ B) = P(X ∈ A) P(Y ∈ B)  for every A, B.


Page 19

Thus, loosely speaking, X and Y are independent if knowing the value of one does not change the distribution of the other. Random variables that are not independent are said to be dependent.

Checking for independence of discrete random variables requires a thorough investigation, since it is possible to have the product of the marginal distributions equal to the joint probability distribution for some but not all combinations of (x, y). If one can find any point (x, y) for which

f(x, y) ≠ fX(x) · fY(y),

then the discrete random variables X and Y are not independent.
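The caveat above suggests checking the product rule at every support point, not just a few; a sketch of such a check:

```python
from fractions import Fraction

def is_independent(joint, xs, ys):
    """True iff f(x, y) == fX(x) * fY(y) at EVERY point (x, y)."""
    fx = {x: sum(joint.get((x, y), Fraction(0)) for y in ys) for x in xs}
    fy = {y: sum(joint.get((x, y), Fraction(0)) for x in xs) for y in ys}
    return all(joint.get((x, y), Fraction(0)) == fx[x] * fy[y]
               for x in xs for y in ys)

# the pmf of Example 1 (page 16) fails the check
joint = {(1, 1): Fraction(1, 8), (1, 2): Fraction(1, 4),
         (2, 1): Fraction(1, 8), (2, 2): Fraction(1, 2)}
print(is_independent(joint, (1, 2), (1, 2)))  # False
```

A joint pmf built as a product of marginals would pass the same check at every cell.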

Page 20

If X is independent of Y, then the conditional mass function is the same as the unconditional one. This follows because, if X is independent of Y, then

f(x | y) = P(X = x | Y = y)

         = P(X = x, Y = y) / P(Y = y)

         = P(X = x) P(Y = y) / P(Y = y)

         = P(X = x)


Page 21

Definition

Let X1, X2, ..., Xn be n discrete random variables having densities f1, f2, ..., fn, respectively. These random variables are said to be independent if their joint density function f is given by

f(x1, x2, ..., xn) = fX1(x1) fX2(x2) ... fXn(xn)

for all (x1, x2, ..., xn) within their range.


Page 22

Example

Show that the random variables of Example 1 are not independent.

f(1, 1) = 1/8,  f(1, 2) = 1/4,  f(2, 1) = 1/8,  f(2, 2) = 1/2

fX(1) = 3/8,  fX(2) = 5/8,  fY(1) = 2/8,  fY(2) = 3/4

f(1, 1) = 1/8 ≠ 6/64 = fX(1) · fY(1), so X and Y are not independent.


Page 23

Example: The random variables X and Y are specified by

P(X = 1) = 2/3,  P(X = 0) = 1/3

P(Y = 1) = 1/4,  P(Y = -1) = 3/4

Construct the joint distribution of X and Y, assuming that X and Y are independent.

Y \ X    0       1
-1       1/4     1/2
1        1/12    2/12
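Under the independence assumption, the joint table is just the outer product of the two marginals; a sketch:

```python
from fractions import Fraction

px = {0: Fraction(1, 3), 1: Fraction(2, 3)}   # marginal of X
py = {-1: Fraction(3, 4), 1: Fraction(1, 4)}  # marginal of Y

# independence: f(x, y) = fX(x) * fY(y) for every cell
joint = {(x, y): px[x] * py[y] for x in px for y in py}

print(joint[(0, -1)])  # 1/4
print(joint[(1, 1)])   # 1/6
```

The four products reproduce the table above, and they sum to 1 automatically because each marginal sums to 1.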