Chapter 5: Joint Probability Distributions and Random Samples

5.1. Jointly Distributed Random Variables
Joint Probability Mass Function

Let X and Y be two discrete rv's defined on the sample space of an experiment. The joint probability mass function p(x, y) is defined for each pair of numbers (x, y) by

$$p(x, y) = P(X = x \text{ and } Y = y)$$

Let A be a set consisting of pairs of (x, y) values; then

$$P[(X, Y) \in A] = \sum_{(x, y) \in A} p(x, y)$$
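As a small illustration (not part of the original slides), a joint pmf can be stored as a dictionary keyed by (x, y) pairs, and P[(X, Y) ∈ A] is then the sum of p(x, y) over the pairs in A. The probabilities and the event A below are hypothetical:

```python
# Hypothetical joint pmf of two discrete rv's X and Y, stored as {(x, y): p(x, y)}.
# The probabilities below are illustrative only; they must sum to 1.
p = {
    (0, 0): 0.30, (0, 1): 0.10,
    (1, 0): 0.10, (1, 1): 0.20,
    (2, 0): 0.05, (2, 1): 0.25,
}

# P[(X, Y) in A]: sum p(x, y) over the pairs that belong to A.
A = {(x, y) for (x, y) in p if x + y >= 2}        # example event: X + Y >= 2
prob_A = sum(p[pair] for pair in A)
print(prob_A)                                      # 0.50 for these numbers
```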
Marginal Probability Mass Functions
The marginal probability mass functions of X and Y, denoted pX(x) and pY(y), are given by

$$p_X(x) = \sum_y p(x, y) \qquad\qquad p_Y(y) = \sum_x p(x, y)$$
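Continuing the same hypothetical pmf as a sketch, the marginals are obtained by summing out the other variable:

```python
from collections import defaultdict

# Same hypothetical joint pmf as in the earlier sketch.
p = {(0, 0): 0.30, (0, 1): 0.10, (1, 0): 0.10, (1, 1): 0.20, (2, 0): 0.05, (2, 1): 0.25}

# Marginal pmf's: p_X(x) = sum over y of p(x, y) and p_Y(y) = sum over x of p(x, y).
p_X, p_Y = defaultdict(float), defaultdict(float)
for (x, y), pxy in p.items():
    p_X[x] += pxy
    p_Y[y] += pxy

print(dict(p_X))   # {0: 0.40, 1: 0.30, 2: 0.30}  (up to floating-point rounding)
print(dict(p_Y))   # {0: 0.45, 1: 0.55}
```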
Joint Probability Density Function

Let X and Y be continuous rv's. Then f(x, y) is a joint probability density function for X and Y if for any two-dimensional set A

$$P[(X, Y) \in A] = \iint_A f(x, y)\, dx\, dy$$

If A is the two-dimensional rectangle {(x, y): a ≤ x ≤ b, c ≤ y ≤ d}, then

$$P[(X, Y) \in A] = \int_a^b \int_c^d f(x, y)\, dy\, dx$$
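A numerical sketch of the rectangle formula, assuming the illustrative density f(x, y) = x + y on the unit square (which integrates to 1 there); scipy's dblquad integrates the inner (y) variable first, matching the dy dx order above:

```python
from scipy.integrate import dblquad

# Illustrative joint pdf on the unit square: f(x, y) = x + y for 0 <= x, y <= 1.
f = lambda x, y: x + y

# P(a <= X <= b, c <= Y <= d) = integral_a^b integral_c^d f(x, y) dy dx.
# dblquad expects its integrand as func(y, x) and integrates the inner variable first.
a, b, c, d = 0.0, 0.5, 0.0, 0.5
prob, err = dblquad(lambda y, x: f(x, y), a, b, c, d)
print(prob)   # 0.125 for this f and this rectangle
```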
P[(X, Y) ∈ A] = volume under the density surface f(x, y) above the region A (the shaded rectangle in the slide's figure).
Marginal Probability Density Functions
The marginal probability density functions of X and Y, denoted fX(x) and fY(y), are given by
$$f_X(x) = \int_{-\infty}^{\infty} f(x, y)\, dy \quad \text{for } -\infty < x < \infty$$

$$f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\, dx \quad \text{for } -\infty < y < \infty$$
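For the same illustrative density f(x, y) = x + y on the unit square, a sketch of computing the marginal of X by integrating out y over its support [0, 1] (the closed form is fX(x) = x + 1/2):

```python
from scipy.integrate import quad

# Same illustrative joint pdf as before: f(x, y) = x + y on the unit square.
f = lambda x, y: x + y

# Marginal of X: f_X(x) = integral over y of f(x, y); here the support of y is [0, 1].
def f_X(x):
    val, _ = quad(lambda y: f(x, y), 0.0, 1.0)
    return val

print(f_X(0.3))   # ~0.8, matching the closed form f_X(x) = x + 1/2
```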
Independent Random Variables

Two random variables X and Y are said to be independent if for every pair of x and y values

$$p(x, y) = p_X(x) \cdot p_Y(y)$$

when X and Y are discrete, or

$$f(x, y) = f_X(x) \cdot f_Y(y)$$

when X and Y are continuous. If these conditions are not satisfied for all (x, y), then X and Y are dependent.
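A direct check of the product criterion on the hypothetical pmf used in the earlier sketches; those numbers give a dependent pair, since the product condition already fails at (0, 0):

```python
import math

# Hypothetical joint pmf from the earlier sketches.
p = {(0, 0): 0.30, (0, 1): 0.10, (1, 0): 0.10, (1, 1): 0.20, (2, 0): 0.05, (2, 1): 0.25}
p_X = {x: sum(v for (xx, _), v in p.items() if xx == x) for x in {x for x, _ in p}}
p_Y = {y: sum(v for (_, yy), v in p.items() if yy == y) for y in {y for _, y in p}}

# Independence requires p(x, y) = p_X(x) * p_Y(y) for every pair (x, y).
independent = all(math.isclose(p[(x, y)], p_X[x] * p_Y[y]) for (x, y) in p)
print(independent)   # False: e.g. p(0, 0) = 0.30 but p_X(0) * p_Y(0) = 0.40 * 0.45 = 0.18
```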
Multivariate Random Variables
If X1, X2,…, Xn are all discrete random variables, the joint pmf of the variables is the function
$$p(x_1, \ldots, x_n) = P(X_1 = x_1, \ldots, X_n = x_n)$$

If the variables are continuous, the joint pdf is the function f such that for any n intervals [a1, b1], …, [an, bn],

$$P(a_1 \le X_1 \le b_1, \ldots, a_n \le X_n \le b_n) = \int_{a_1}^{b_1} \cdots \int_{a_n}^{b_n} f(x_1, \ldots, x_n)\, dx_n \cdots dx_1$$
Independence – More Than Two Random Variables
The random variables X1, X2, …, Xn are independent if for every subset $X_{i_1}, X_{i_2}, \ldots, X_{i_k}$ of the variables, the joint pmf or pdf of the subset is equal to the product of the marginal pmf's or pdf's.
Conditional Distributions:
Definition: Let (X, Y) be a discrete r.v. with joint pmf p(xi, yj). The conditional pmf's are defined by

$$P(X = x_i \mid Y = y_j) = p_{X|Y}(x_i \mid y_j) = \frac{p(x_i, y_j)}{p_Y(y_j)}, \qquad p_Y(y_j) \ne 0$$
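For the same hypothetical pmf as the earlier sketches, the conditional pmf of X given Y = y is just the y-column of the joint pmf rescaled by pY(y):

```python
# Hypothetical joint pmf (same numbers as the earlier sketches).
p = {(0, 0): 0.30, (0, 1): 0.10, (1, 0): 0.10, (1, 1): 0.20, (2, 0): 0.05, (2, 1): 0.25}

def conditional_pmf_X_given_Y(y):
    """p_{X|Y}(x | y) = p(x, y) / p_Y(y), defined only when p_Y(y) > 0."""
    p_y = sum(v for (_, yy), v in p.items() if yy == y)
    if p_y == 0:
        raise ValueError("p_Y(y) must be nonzero")
    return {x: p[(x, y)] / p_y for (x, yy) in p if yy == y}

print(conditional_pmf_X_given_Y(0))   # {0: 0.667, 1: 0.222, 2: 0.111} approximately
```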
Conditional Probability Functions

Let X and Y be two continuous rv's with joint pdf f(x, y) and marginal X pdf fX(x). Then for any x value for which fX(x) > 0, the conditional probability density function of Y given that X = x is

$$f_{Y|X}(y \mid x) = \frac{f(x, y)}{f_X(x)}, \qquad -\infty < y < \infty$$
If X and Y are discrete, replacing pdf’s by pmf’s gives the conditional probability mass function of Y when X = x.
Conditional Expectation and Variance
• The conditional expectation of a r.v. X given that Y = y is defined in the discrete case by

$$E(X \mid Y = y) = \sum_{x_i} x_i\, p_{X|Y}(x_i \mid y)$$

• and in the continuous case by

$$E(X \mid Y = y) = \int_{-\infty}^{\infty} x\, f_{X|Y}(x \mid y)\, dx$$
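In the discrete case the conditional expectation is a weighted average over the conditional pmf; a sketch using the same hypothetical joint pmf as before:

```python
# Hypothetical joint pmf (same numbers as the earlier sketches).
p = {(0, 0): 0.30, (0, 1): 0.10, (1, 0): 0.10, (1, 1): 0.20, (2, 0): 0.05, (2, 1): 0.25}

def E_X_given_Y(y):
    """E(X | Y = y) = sum over x of x * p(x, y) / p_Y(y)."""
    p_y = sum(v for (_, yy), v in p.items() if yy == y)
    return sum(x * v for (x, yy), v in p.items() if yy == y) / p_y

print(E_X_given_Y(0))   # (0*0.30 + 1*0.10 + 2*0.05) / 0.45 ≈ 0.444
```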
• Furthermore, the conditional variance of X given Y = y is defined by

$$V(X \mid Y = y) = E(X^2 \mid Y = y) - [E(X \mid Y = y)]^2$$

Example

Let (X, Y) ~ f(x, y) = (1/2)e^{-x} for x ≥ 0, |y| < x. Integrating out y gives fX(x) = x e^{-x}, so f_{Y|X}(y | x) = 1/(2x) for |y| < x, and

$$E(Y \mid X = x) = \int_{-x}^{x} y \cdot \frac{1}{2x}\, dy = 0$$
5.2. Expected Values, Covariance & Correlation

Let X and Y be jointly distributed rv's with pmf p(x, y) or pdf f(x, y) according to whether the variables are discrete or continuous. Then the expected value of a function h(X, Y), denoted E[h(X, Y)] or $\mu_{h(X,Y)}$, is

$$E[h(X, Y)] = \begin{cases} \displaystyle\sum_x \sum_y h(x, y)\, p(x, y) & \text{discrete} \\[6pt] \displaystyle\int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} h(x, y)\, f(x, y)\, dx\, dy & \text{continuous} \end{cases}$$
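In the discrete case E[h(X, Y)] is just a probability-weighted sum of h over the support; a sketch using the hypothetical pmf from Section 5.1 and the illustrative choice h(x, y) = max(x, y):

```python
# Hypothetical joint pmf (same numbers as the earlier sketches).
p = {(0, 0): 0.30, (0, 1): 0.10, (1, 0): 0.10, (1, 1): 0.20, (2, 0): 0.05, (2, 1): 0.25}

# E[h(X, Y)] = sum over (x, y) of h(x, y) * p(x, y) in the discrete case.
h = lambda x, y: max(x, y)                      # example function of (X, Y)
E_h = sum(h(x, y) * pxy for (x, y), pxy in p.items())
print(E_h)   # 1.0 for these numbers
```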
Covariance

The covariance between two rv's X and Y is

$$\operatorname{Cov}(X, Y) = E[(X - \mu_X)(Y - \mu_Y)] = \begin{cases} \displaystyle\sum_x \sum_y (x - \mu_X)(y - \mu_Y)\, p(x, y) & \text{discrete} \\[6pt] \displaystyle\int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} (x - \mu_X)(y - \mu_Y)\, f(x, y)\, dx\, dy & \text{continuous} \end{cases}$$
Short-cut Formula for Covariance
$$\operatorname{Cov}(X, Y) = E(XY) - \mu_X \mu_Y$$
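A sketch checking that the definitional form and the shortcut formula agree on the hypothetical pmf used earlier:

```python
# Hypothetical joint pmf (same numbers as the earlier sketches).
p = {(0, 0): 0.30, (0, 1): 0.10, (1, 0): 0.10, (1, 1): 0.20, (2, 0): 0.05, (2, 1): 0.25}

mu_X = sum(x * v for (x, _), v in p.items())
mu_Y = sum(y * v for (_, y), v in p.items())

# Definition: E[(X - mu_X)(Y - mu_Y)]        Shortcut: E(XY) - mu_X * mu_Y
cov_def = sum((x - mu_X) * (y - mu_Y) * v for (x, y), v in p.items())
cov_short = sum(x * y * v for (x, y), v in p.items()) - mu_X * mu_Y
print(cov_def, cov_short)   # both ≈ 0.205 here (E(XY) = 0.70, mu_X * mu_Y = 0.495)
```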
Correlation
The correlation coefficient of X and Y, denoted by Corr(X, Y), $\rho_{X,Y}$, or just ρ, is defined by

$$\rho_{X,Y} = \frac{\operatorname{Cov}(X, Y)}{\sigma_X \sigma_Y}$$
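Computing the correlation coefficient for the same hypothetical pmf; the value always falls in [−1, 1]:

```python
import math

# Hypothetical joint pmf (same numbers as the earlier sketches).
p = {(0, 0): 0.30, (0, 1): 0.10, (1, 0): 0.10, (1, 1): 0.20, (2, 0): 0.05, (2, 1): 0.25}

mu_X = sum(x * v for (x, _), v in p.items())
mu_Y = sum(y * v for (_, y), v in p.items())
var_X = sum((x - mu_X) ** 2 * v for (x, _), v in p.items())
var_Y = sum((y - mu_Y) ** 2 * v for (_, y), v in p.items())
cov = sum(x * y * v for (x, y), v in p.items()) - mu_X * mu_Y

rho = cov / (math.sqrt(var_X) * math.sqrt(var_Y))
print(rho)   # about 0.50 for these numbers; always between -1 and 1
```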
Correlation Proposition
1. If a and c are either both positive or both negative, Corr(aX + b, cY + d) = Corr(X, Y)
2. For any two rv's X and Y, −1 ≤ Corr(X, Y) ≤ 1.
Correlation Proposition
1. If X and Y are independent, then ρ = 0, but ρ = 0 does not imply independence.

2. ρ = 1 or ρ = −1 if and only if Y = aX + b for some numbers a and b with a ≠ 0.
5.3. Statistics and their Distributions
Statistic
A statistic is a function of the random variables in a sample, e.g.

$$\bar{X} = \frac{1}{n} \sum_{i=1}^{n} X_i$$

where X1, X2, …, Xn are r.v.'s. An observation of this vector of r.v.'s from a specific random sample is denoted by (x1, x2, …, xn); in this case

$$\bar{x} = \frac{1}{n} \sum_{i=1}^{n} x_i$$

is a specific value of $\bar{X}$.
Random Samples
The rv's X1, …, Xn are said to form a (simple) random sample of size n if
1. The Xi’s are independent rv’s.
2. Every Xi has the same probability distribution.
Simulation Experiments

The following characteristics must be specified (a minimal simulation sketch follows this list):
1. The statistic of interest.
2. The population distribution.
3. The sample size n.
4. The number of replications k.
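As an illustration of those four choices, here is a minimal sketch, assuming the statistic is the sample mean, an exponential population with mean 2, n = 10, and k = 1000 replications (all of these choices are made up for the example):

```python
import numpy as np

rng = np.random.default_rng(0)

# 1. statistic of interest: the sample mean
# 2. population distribution: exponential with mean 2 (an illustrative choice)
# 3. sample size n, and 4. number of replications k
n, k = 10, 1000
xbars = np.array([rng.exponential(scale=2.0, size=n).mean() for _ in range(k)])

# The k simulated values approximate the sampling distribution of X-bar.
print(xbars.mean(), xbars.std())   # roughly 2 and 2/sqrt(10) ≈ 0.63
```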
5.4. The Distribution of the Sample Mean
Using the Sample Mean
Let X1, …, Xn be a random sample from a distribution with mean value μ and standard deviation σ. Then

1. $E(\bar{X}) = \mu_{\bar{X}} = \mu$

2. $V(\bar{X}) = \sigma^2_{\bar{X}} = \sigma^2 / n$ and $\sigma_{\bar{X}} = \sigma / \sqrt{n}$

In addition, with To = X1 + … + Xn (the sample total), $E(T_o) = n\mu$, $V(T_o) = n\sigma^2$, and $\sigma_{T_o} = \sqrt{n}\,\sigma$.
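A quick numerical check of this proposition by simulation, assuming a Uniform(0, 10) population as an illustrative choice (so μ = 5 and σ² = 100/12):

```python
import numpy as np

rng = np.random.default_rng(1)
mu, var = 5.0, 100.0 / 12.0          # mean and variance of Uniform(0, 10)
n, k = 25, 200_000

samples = rng.uniform(0.0, 10.0, size=(k, n))
xbar = samples.mean(axis=1)          # sample means
total = samples.sum(axis=1)          # sample totals T_o

print(xbar.mean(), xbar.var())       # ≈ mu and var / n   (5 and about 0.333)
print(total.mean(), total.var())     # ≈ n * mu and n * var   (125 and about 208.3)
```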
Normal Population Distribution
Let X1, …, Xn be a random sample from a normal distribution with mean value μ and standard deviation σ. Then for any n, $\bar{X}$ is normally distributed, as is To.
The Central Limit Theorem
Let X1, …, Xn be a random sample from a distribution with mean value μ and variance σ². Then if n is sufficiently large, $\bar{X}$ has approximately a normal distribution with $\mu_{\bar{X}} = \mu$ and $\sigma^2_{\bar{X}} = \sigma^2/n$, and To also has approximately a normal distribution with $\mu_{T_o} = n\mu$ and $\sigma^2_{T_o} = n\sigma^2$. The larger the value of n, the better the approximation.
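A small illustration of the CLT, assuming an exponential(1) population (mean 1, standard deviation 1) and n = 40; the standardized sample mean should then behave approximately like a standard normal variable:

```python
import numpy as np

rng = np.random.default_rng(2)
n, k = 40, 100_000
mu, sigma = 1.0, 1.0                       # exponential(1) has mean 1 and sd 1

xbar = rng.exponential(scale=1.0, size=(k, n)).mean(axis=1)
z = (xbar - mu) / (sigma / np.sqrt(n))     # standardized sample means

# If the CLT approximation is good, P(Z <= 1) should be close to about 0.841,
# the standard normal value.
print(np.mean(z <= 1.0))
```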
The Central Limit Theorem
[Figure: the population distribution compared with the sampling distribution of $\bar{X}$ for small to moderate n and for large n.]
Rule of Thumb
If n > 30, the Central Limit Theorem can be used.
Approximate Lognormal Distribution
Let X1,…, Xn be a random sample from a distribution for which only positive values are possible [P(Xi > 0) = 1]. Then if n is sufficiently large, the product Y = X1X2…Xn has approximately a lognormal distribution.
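A sketch of why the product tends to lognormality: log Y = Σ log Xi is a sum, so the CLT applies to it. The Uniform(1, 2) factors below are an arbitrary illustrative choice:

```python
import numpy as np

rng = np.random.default_rng(3)
n, k = 50, 100_000

# Y = X1 * X2 * ... * Xn with positive Xi's (here Uniform(1, 2), an illustrative choice).
x = rng.uniform(1.0, 2.0, size=(k, n))
log_y = np.log(x).sum(axis=1)              # log Y is a sum, so the CLT applies to it

# If Y is approximately lognormal, then log Y is approximately normal:
# its sample skewness should be near 0.
skew = np.mean(((log_y - log_y.mean()) / log_y.std()) ** 3)
print(skew)
```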
5.5. The Distribution of a Linear Combination
Linear Combination
Given a collection of n random variables X1, …, Xn and n numerical constants a1, …, an, the rv

$$Y = a_1 X_1 + \cdots + a_n X_n = \sum_{i=1}^{n} a_i X_i$$

is called a linear combination of the Xi's.
Expected Value of a Linear Combination
Let X1, …, Xn have mean values μ1, μ2, …, μn and variances σ1², σ2², …, σn², respectively.

Whether or not the Xi's are independent,

$$E(a_1 X_1 + \cdots + a_n X_n) = a_1 E(X_1) + \cdots + a_n E(X_n) = a_1 \mu_1 + \cdots + a_n \mu_n$$
Variance of a Linear Combination
If X1, …, Xn are independent,

$$V(a_1 X_1 + \cdots + a_n X_n) = a_1^2 V(X_1) + \cdots + a_n^2 V(X_n) = a_1^2 \sigma_1^2 + \cdots + a_n^2 \sigma_n^2$$

and

$$\sigma_{a_1 X_1 + \cdots + a_n X_n} = \sqrt{a_1^2 \sigma_1^2 + \cdots + a_n^2 \sigma_n^2}$$
Variance of a Linear Combination
For any X1, …, Xn,

$$V(a_1 X_1 + \cdots + a_n X_n) = \sum_{i=1}^{n} \sum_{j=1}^{n} a_i a_j \operatorname{Cov}(X_i, X_j)$$
Difference Between Two Random Variables
$$E(X_1 - X_2) = E(X_1) - E(X_2)$$

and, if X1 and X2 are independent,

$$V(X_1 - X_2) = V(X_1) + V(X_2)$$
Difference Between Normal Random Variables
If X1, X2, …, Xn are independent, normally distributed rv's, then any linear combination of the Xi's also has a normal distribution. The difference X1 − X2 between two independent, normally distributed variables is itself normally distributed.