Jacobian Transformation



Page 1: Jacobian Transformation

8/20/2019 Jacobian Transformation

http://slidepdf.com/reader/full/jacobian-transformation 1/7

Math 425

Intro to Probability

Lecture 30

Kenneth Harris

[email protected]

Department of Mathematics, University of Michigan

April 3, 2009

Kenneth Harris (Math 425)   Math 425 Intro to Probability Lecture 30   April 3, 2009 1 / 32

Introduction: One Function of Random Variables

Functions of a Random Variable: Density

Let g(x) = y be a one-to-one function whose derivative is nonzero on some region A of the real line. Suppose g maps A onto B, so that there is an inverse map x = h(y) from B back to A.

Let X be a continuous random variable with known density f_X(x), and let Y = g(X). Then the density of Y is

    f_Y(y) = f_X(h(y)) · |d/dy h(y)|.

Note: Compare to Ross, Theorem 5.7.1, page 243.
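As a quick numerical sanity check (not part of the original slides), the formula can be verified by simulation for a simple choice such as g(x) = x² on (0, 1), where h(y) = √y and f_Y(y) = 1/(2√y):

```python
import math
import random

random.seed(0)

# X ~ Uniform(0, 1), Y = g(X) = X^2.  Here h(y) = sqrt(y), so
# f_Y(y) = f_X(h(y)) * |h'(y)| = 1 / (2 * sqrt(y)) on (0, 1).
n = 200_000
ys = [random.random() ** 2 for _ in range(n)]

# Empirical P(Y <= 0.25) vs the integral of f_Y over (0, 0.25],
# which is sqrt(0.25) = 0.5.
empirical = sum(y <= 0.25 for y in ys) / n
analytic = math.sqrt(0.25)
print(empirical, analytic)
```

The empirical probability should agree with the analytic value to within ordinary Monte Carlo error.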


Two Functions of Two Random Variables

Problem

Let the continuous random variables (X, Y) have joint density f_{X,Y}(x, y), and let A = {(x, y) : f_{X,Y}(x, y) > 0}. Then (X, Y) determines a point with xy-coordinates in the region A.

Consider the continuous random variables (U, V) given by

    U = g_1(X, Y)        V = g_2(X, Y).

(U, V) determines a point with uv-coordinates in some region B.

Problem. If the transformation from xy-coordinates to uv-coordinates given by

    u = g_1(x, y)        v = g_2(x, y)

is nice on A, then we can produce the joint density f_{U,V}(u, v) for the random variable (U, V).


Two Functions of Two Random Variables

Definition: Nice Transformations

Definition
A transformation from xy-coordinates to uv-coordinates (xy ⇒ uv) given by

    u = g_1(x, y)        v = g_2(x, y)

is nice on A if:

1. The partial derivatives ∂u/∂x, ∂u/∂y, ∂v/∂x, and ∂v/∂y exist and are continuous on A.

2. The Jacobian of the transformation is nonzero on A:

    J(x, y) = | ∂u/∂x  ∂u/∂y |  =  (∂u/∂x)(∂v/∂y) − (∂u/∂y)(∂v/∂x)  ≠  0
              | ∂v/∂x  ∂v/∂y |

whenever (x, y) ∈ A.


Two Functions of Two Random Variables

Change of coordinates

A nice transformation (xy ⇒ uv) on A amounts simply to a change of coordinates of the plane from xy-coordinates to uv-coordinates. We can recover the original xy-coordinates from the new uv-coordinates.

Suppose (xy ⇒ uv) is a nice transformation on A,

    u = g_1(x, y)        v = g_2(x, y),

to uv-coordinates on a region B. Then there is a reverse transformation (uv ⇒ xy) from uv-coordinates to xy-coordinates,

    x = h_1(u, v)        y = h_2(u, v),

which maps B onto A and which is also nice on B.


Two Functions of Two Random Variables

Jacobians

The Jacobian of the original transformation (xy ⇒ uv) is the determinant

    J(x, y) = | ∂u/∂x  ∂u/∂y |  =  (∂u/∂x)(∂v/∂y) − (∂u/∂y)(∂v/∂x).
              | ∂v/∂x  ∂v/∂y |

The Jacobian of the inverse transformation (uv ⇒ xy) is the determinant

    J(u, v) = | ∂x/∂u  ∂x/∂v |  =  (∂x/∂u)(∂y/∂v) − (∂x/∂v)(∂y/∂u).
              | ∂y/∂u  ∂y/∂v |

Since (xy ⇒ uv) is nice on A,

    J(x, y) ≠ 0 whenever (x, y) ∈ A, and
    J(u, v) ≠ 0 whenever (u, v) ∈ B.

Furthermore, the two Jacobian determinants are inverses:

    J(x, y) = J(u, v)^{−1}.
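The inverse relationship is easy to check numerically. The sketch below (not from the slides) uses the polar transformation r = √(x² + y²), θ = arctan(y/x), estimating the forward partials by central differences and multiplying by the analytic inverse Jacobian J(r, θ) = r:

```python
import math

def forward(x, y):
    # (xy => r-theta): r = sqrt(x^2 + y^2), theta = arctan(y/x)
    return math.hypot(x, y), math.atan2(y, x)

def jacobian_xy(x, y, h=1e-6):
    # J(x, y) = (du/dx)(dv/dy) - (du/dy)(dv/dx) via central differences,
    # with u = r and v = theta.
    du_dx = (forward(x + h, y)[0] - forward(x - h, y)[0]) / (2 * h)
    du_dy = (forward(x, y + h)[0] - forward(x, y - h)[0]) / (2 * h)
    dv_dx = (forward(x + h, y)[1] - forward(x - h, y)[1]) / (2 * h)
    dv_dy = (forward(x, y + h)[1] - forward(x, y - h)[1]) / (2 * h)
    return du_dx * dv_dy - du_dy * dv_dx

x, y = 1.2, -0.7
r, theta = forward(x, y)
J_rtheta = r                     # analytic Jacobian of (r-theta => xy)
product = jacobian_xy(x, y) * J_rtheta
print(product)   # ≈ 1
```

The product J(x, y) · J(r, θ) comes out (numerically) equal to 1, as the identity predicts.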


Two Functions of Two Random Variables

Main Theorem

Theorem

Let (X, Y) be continuous random variables with joint density f_{X,Y}(x, y), and let (U, V) be random variables given by

    U = g_1(X, Y)        V = g_2(X, Y).

Suppose the (xy ⇒ uv) transformation

    u = g_1(x, y)        v = g_2(x, y)

is nice on A = {(x, y) : f_{X,Y}(x, y) > 0}. Let the inverse (uv ⇒ xy) from B to A be

    x = h_1(u, v)        y = h_2(u, v).

Then the joint density of (U, V) is given for (u, v) ∈ B by either equation

    f_{U,V}(u, v) = f_{X,Y}(h_1(u, v), h_2(u, v)) · |J(u, v)|
    f_{U,V}(u, v) = f_{X,Y}(h_1(u, v), h_2(u, v)) · |J(x, y)|^{−1},

whichever is more convenient to compute.


Two Functions of Two Random Variables

Picture of Theorem

    f_{U,V}(u, v) · du · dv  ≈  P{(U, V) ∈ ΔB}
    P{(X, Y) ∈ ΔA}  ≈  f_{X,Y}(x, y) · |J(u, v)| · du · dv

[Figure: a small rectangle ΔB with sides du and dv in the uv-plane maps to a small region ΔA in the xy-plane; Area(ΔB) ≈ du·dv, while Area(ΔA) ≈ |J(u, v)|·du·dv, and P{(U, V) ∈ ΔB} = P{(X, Y) ∈ ΔA}.]


Two Functions of Two Random Variables

Sketch of Proof of Theorem

Let B′ ⊆ B and suppose (uv ⇒ xy) maps B′ to A′ ⊆ A. Then

    P{(U, V) ∈ B′} = P{(X, Y) ∈ A′}
                   = ∫∫_{(x,y) ∈ A′} f_{X,Y}(x, y) dx dy
                   = ∫∫_{(u,v) ∈ B′} f_{X,Y}(h_1(u, v), h_2(u, v)) · |J(u, v)| du dv,

using the Change of Variables Theorem of analysis.

Intuitively, we can break B′ into small regions ΔB which (uv ⇒ xy) transforms to small regions ΔA of A′, where for any (u, v) ∈ ΔB:

    f_{U,V}(u, v) · Area(ΔB)  ≈  f_{X,Y}(h_1(u, v), h_2(u, v)) · Area(ΔA),

where Area(ΔA) ≈ Area(ΔB) · |J(u, v)|.

Differentiate the integrals to get the transformation rule:

    f_{U,V}(u, v) = f_{X,Y}(h_1(u, v), h_2(u, v)) · |J(u, v)|   if (u, v) ∈ B,
                  = 0                                           otherwise.


Example

Functions of a Random Variable: Density

Example. Let X and Y be continuous random variables with joint density f_{X,Y}(x, y) and with X ≠ 0. Consider

    U = XY        V = X.

The transformation (xy ⇒ uv) is given by

    u = xy        v = x.

The inverse transformation (uv ⇒ xy) is given by

    x = v        y = u/v.

The Jacobian for the (uv ⇒ xy) transformation is

    J(u, v) = | 0     1     |  =  −1/v.
              | 1/v   −u/v² |


Example

Functions of a Random Variable: Density

So, the joint density is

    f_{U,V}(u, v) = f_{X,Y}(h_1(u, v), h_2(u, v)) · |J(u, v)|
                  = f_{X,Y}(v, u/v) · 1/|v|.

We can compute the marginal f_U(u) = f_{XY}(u) by

    f_{XY}(u) = ∫_{−∞}^{∞} f_{X,Y}(v, u/v) · 1/|v| dv.
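As a concrete instance (added here as a check, not on the original slides): when X and Y are independent and uniform on (0, 1), the integral reduces to ∫_u^1 dv/v = ln(1/u) for 0 < u < 1, a standard result that simulation confirms:

```python
import math
import random

random.seed(1)

# X, Y ~ Uniform(0, 1) independent; U = XY.
# Marginal formula: f_U(u) = integral over v in (u, 1) of 1/v dv = ln(1/u).
n = 200_000
us = [random.random() * random.random() for _ in range(n)]

# P(U <= 1/2) = integral_0^{1/2} ln(1/u) du = (1/2)(1 + ln 2) ≈ 0.8466
empirical = sum(u <= 0.5 for u in us) / n
analytic = 0.5 * (1 + math.log(2))
print(empirical, analytic)
```

The empirical CDF value at 1/2 matches the analytic one up to Monte Carlo error.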


Examples: Polar Coordinates

Rectangle to Polar coordinates

It is often convenient to change from rectangular coordinates xy to polar coordinates rθ. The transformation (xy ⇒ rθ) is

    r = √(x² + y²)        θ = arctan(y/x),

where r > 0 and −π < θ ≤ π. The inverse transformation (rθ ⇒ xy) from polar back to rectangular is

    x = r cos θ        y = r sin θ.

The transformation (xy ⇒ rθ) is nice in the punctured plane R² − {(0, 0)}. (Verified in three slides.)


Examples: Polar Coordinates

Converting Rectangle to Polar

Rectangular xy-coordinates to polar rθ-coordinates:

    r = √(x² + y²),  r > 0        θ = arctan(y/x),  −π < θ ≤ π.

Polar rθ-coordinates to rectangular xy-coordinates:

    x = r cos θ        y = r sin θ,        −∞ < x, y < ∞.

[Figure: the point (x, y) = (r, θ) in the plane, with x = r cos θ, y = r sin θ, r = √(x² + y²), and θ = arctan(y/x).]


Examples: Polar Coordinates

Converting Rectangle to Polar

Plot of tan θ = y/x on (−π, π]. The four quadrants of the plane are

    I: x, y > 0        II: x < 0, y > 0        III: x, y < 0        IV: x > 0, y < 0.

[Figure: graph of tan θ for −π < θ ≤ π, with the branches corresponding to quadrants II, III, I, IV marked.]


Examples: Polar Coordinates

Problem: Rectangle to Polar

Problem. Let (X, Y) be randomly chosen in some region R of the xy-plane with joint density f_{X,Y}(x, y).

Consider the random variables giving the polar coordinates,

    R = √(X² + Y²)        Θ = arctan(Y/X),

where R > 0 and −π < Θ ≤ π.

The Jacobian is easiest to compute on the rθ-plane:

    J(r, θ) = | cos θ     sin θ   |  =  r cos² θ + r sin² θ  =  r.
              | −r sin θ  r cos θ |

The joint distribution of (R, Θ) is

    f_{R,Θ}(r, θ) = r · f_{X,Y}(r cos θ, r sin θ),        r > 0, −π < θ ≤ π.


Examples: Polar Coordinates

Example

Example. Let (X, Y) be uniformly distributed in R = the unit disk. So,

    f_{X,Y}(x, y) = 1/π        when x² + y² ≤ 1.

So,

    f_{R,Θ}(r, θ) = r · f_{X,Y}(r cos θ, r sin θ) = r/π,        0 < r ≤ 1, −π < θ ≤ π.

The marginals are

    f_R(r) = ∫_{−π}^{π} (r/π) dθ = 2r,        0 < r ≤ 1,

    f_Θ(θ) = ∫_0^1 (r/π) dr = 1/(2π),        −π < θ ≤ π.

Thus, Θ is uniformly distributed on (−π, π].
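A Monte Carlo sketch of this example (not on the original slides), sampling uniformly from the disk by rejection from the enclosing square and checking both marginals:

```python
import math
import random

random.seed(2)

def sample_disk():
    # Uniform point on the unit disk by rejection from [-1, 1]^2.
    while True:
        x, y = 2 * random.random() - 1, 2 * random.random() - 1
        if x * x + y * y <= 1:
            return x, y

n = 100_000
rs, thetas = [], []
for _ in range(n):
    x, y = sample_disk()
    rs.append(math.hypot(x, y))
    thetas.append(math.atan2(y, x))

# f_R(r) = 2r  =>  P(R <= 1/2) = (1/2)^2 = 1/4
p_r = sum(r <= 0.5 for r in rs) / n
# Theta uniform on (-pi, pi]  =>  P(Theta <= 0) = 1/2
p_t = sum(t <= 0 for t in thetas) / n
print(p_r, p_t)
```

Both empirical probabilities should land near 0.25 and 0.5 respectively.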


Examples: Polar Coordinates

Example

Example. Let X and Y be independent and normally distributed in the plane with (µ = 0, σ²). So,

    f_{X,Y}(x, y) = 1/(2πσ²) · e^{−(x² + y²)/2σ²}.

Since f_{R,Θ}(r, θ) = r · f_{X,Y}(r cos θ, r sin θ),

    f_{R,Θ}(r, θ) = r/(2πσ²) · e^{−(r² cos² θ + r² sin² θ)/2σ²}
                  = r/(2πσ²) · e^{−r²/2σ²},        r > 0, −π < θ ≤ π.

The marginals are

    f_R(r) = ∫_{−π}^{π} r/(2πσ²) · e^{−r²/2σ²} dθ = (r/σ²) e^{−r²/2σ²},        r > 0,

    f_Θ(θ) = ∫_0^∞ r/(2πσ²) · e^{−r²/2σ²} dr = 1/(2π),        −π < θ ≤ π.

Thus, Θ is uniformly distributed on (−π, π], and R has the Rayleigh distribution (the distance of (X, Y) from the origin).
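The Rayleigh marginal can be checked by simulation (a sketch added here, not on the slides): with σ = 1, the Rayleigh CDF is 1 − e^{−r²/2}, so P(R ≤ 1) = 1 − e^{−1/2}:

```python
import math
import random

random.seed(3)

# X, Y independent N(0, 1); R = sqrt(X^2 + Y^2) should be Rayleigh:
# f_R(r) = r * e^{-r^2/2}, with CDF 1 - e^{-r^2/2}.
n = 200_000
rs = [math.hypot(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(n)]

empirical = sum(r <= 1.0 for r in rs) / n
analytic = 1 - math.exp(-0.5)    # ≈ 0.3935
print(empirical, analytic)
```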

Examples: Polar Coordinates

Example

Example. Let R be exponentially distributed with mean 2 and Θ be uniformly distributed on (−π, π], with R and Θ independent. The joint distribution is

    f_{R,Θ}(r, θ) = 1/(2π) · (1/2)e^{−r/2},        r > 0, −π < θ ≤ π.

Let X and Y be the random variables determined by

    X = √R cos Θ        Y = √R sin Θ.

Solve for r, θ in the transformation x = √r cos θ and y = √r sin θ:

    r = x² + y²        θ = arctan(y/x).

The Jacobian determinant is easiest to compute using rθ-coordinates:

    J(r, θ) = | cos θ/(2√r)   sin θ/(2√r) |  =  cos² θ/2 + sin² θ/2  =  1/2.
              | −√r sin θ     √r cos θ    |


Examples: Polar Coordinates

Example – continued

Recall:

    f_{R,Θ}(r, θ) = 1/(2π) · (1/2)e^{−r/2},        r > 0, −π < θ ≤ π.

So,

    f_{X,Y}(x, y) = f_{R,Θ}(x² + y², arctan(y/x)) · 2
                  = 1/(2π) · e^{−(x² + y²)/2}.

X and Y are independent, normally distributed random variables with (µ = 0, σ² = 1). The marginals are obtained by integrating f_{X,Y}(x, y):

    f_X(x) = 1/√(2π) · e^{−x²/2}        f_Y(y) = 1/√(2π) · e^{−y²/2}.


Examples: Polar Coordinates

Example – continued

Let U and V be uniformly distributed on (0, 1).

Consider the random variable Θ:

    Θ = 2πV − π.

So, Θ is uniformly distributed on (−π, π).

Consider the random variable R:

    R = 2 ln(1/U);        solving, u = e^{−r/2}.

By Proposition 5.7.1 (Ross, page 243),

    f_R(r) = f_U(e^{−r/2}) · |d/dr e^{−r/2}| = (1/2)e^{−r/2}.

So, R is exponentially distributed with mean 2.
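A quick simulation of this inverse-CDF step (added as a check, not on the slides) confirms that R = 2 ln(1/U) behaves like an exponential with mean 2:

```python
import math
import random

random.seed(4)

# U ~ Uniform(0, 1); R = 2 ln(1/U) should be exponential with mean 2.
n = 200_000
rs = [2 * math.log(1 / (1 - random.random())) for _ in range(n)]
# (1 - random.random() lies in (0, 1], avoiding log of 0)

mean = sum(rs) / n
# Exponential(mean 2) CDF: P(R <= 2) = 1 - e^{-1} ≈ 0.632
empirical = sum(r <= 2 for r in rs) / n
print(mean, empirical)
```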


Examples: Polar Coordinates

Example – continued

Let X and Y be the random variables determined by

    X = √R cos Θ        Y = √R sin Θ.

Then X and Y are independent standard normal random variables!

We can simulate a standard normal random variable X using two independent uniform random variables U and V on (0, 1):

    X = √(2 ln(1/U)) · cos(2πV − π).
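This simulation recipe is straightforward to code. The sketch below (not part of the original slides) implements the formula verbatim and checks the sample mean and variance:

```python
import math
import random

random.seed(5)

def std_normal():
    # Simulate a standard normal from two independent Uniform(0, 1)
    # variables, as on the slide: X = sqrt(2 ln(1/U)) * cos(2 pi V - pi).
    u = 1 - random.random()        # in (0, 1], avoids log(0)
    v = random.random()
    return math.sqrt(2 * math.log(1 / u)) * math.cos(2 * math.pi * v - math.pi)

n = 200_000
xs = [std_normal() for _ in range(n)]

mean = sum(xs) / n
var = sum(x * x for x in xs) / n - mean ** 2
print(mean, var)   # near 0 and 1
```

(This is one half of the Box–Muller construction; the companion variable Y = √(2 ln(1/U)) · sin(2πV − π) is an independent standard normal.)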


Examples: Polar Coordinates

Converting Rectangle to Polar

Simulating a standard normal random variable with a pair of independent uniform random variables on (0, 1).

[Figure: two histograms of simulated values on (−3, 3) against the standard normal density, one with 1000 data points and one with 10,000 data points.]


Example of Joint Distribution

Example

Example. Let X and Y be independent and uniformly distributed on (0, 1]. Find the joint probability density function for the random variables

    U = X/Y        V = XY.

Individually, the distributions of X and Y are

    f_X(x) = 1 if 0 < x ≤ 1, 0 otherwise;        f_Y(y) = 1 if 0 < y ≤ 1, 0 otherwise.

So, the joint distribution f_{X,Y}(x, y) is

    f_{X,Y}(x, y) = f_X(x) · f_Y(y) = 1 if 0 < x, y ≤ 1, 0 otherwise.


Example of Joint Distribution

Example – continued

The transformation into uv-coordinates,

    u = x/y        v = xy,

is one-to-one and has an inverse

    x = √(uv)        y = √(v/u).

The Jacobian determinant is easiest when computed in xy-coordinates:

    J(x, y) = | 1/y   −x/y² |  =  x/y + x/y  =  2x/y  =  2u.
              | y     x     |

So, u, v > 0 and

    f_{U,V}(u, v) = f_{X,Y}(√(uv), √(v/u)) · 1/(2u)
                  = 1/(2u)   if 0 < √(uv), √(v/u) ≤ 1,
                  = 0        otherwise.


Example of Joint Distribution

Example – continued

It remains to compute the bounds on u and v:

    0 < √(uv), √(v/u) ≤ 1   =⇒   0 < v ≤ 1/u   and   0 < v ≤ u.

Only one of these ranges need be retained, depending upon whether u ∈ (0, 1] or u ∈ [1, ∞):

    f_{U,V}(u, v) = 1/(2u)   if 0 < u < 1 and 0 < v ≤ u,
                             or if u ≥ 1 and 0 < v ≤ 1/u,
                  = 0        otherwise.


Example of Joint Distribution

Example – continued

Recall:

    f_{U,V}(u, v) = 1/(2u)   if 0 < u < 1 and 0 < v ≤ u,
                             or if u ≥ 1 and 0 < v ≤ 1/u,
                  = 0        otherwise.

We compute the marginals:

    f_U(u) = ∫_0^u 1/(2u) dv = 1/2           if 0 < u < 1,
           = ∫_0^{1/u} 1/(2u) dv = 1/(2u²)   if u ≥ 1,
           = 0                               otherwise;

    f_V(v) = ∫_v^{1/v} 1/(2u) du = ln(1/v)   if 0 < v ≤ 1,
           = 0                               otherwise.
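Both marginals can be checked by simulation (a sketch added here, not on the original slides): f_U gives P(U ≤ 1) = ∫_0^1 1/2 du = 1/2, and f_V gives P(V ≤ 1/2) = ∫_0^{1/2} ln(1/v) dv = (1/2)(1 + ln 2):

```python
import math
import random

random.seed(6)

# X, Y independent Uniform(0, 1]; U = X / Y, V = X * Y.
n = 200_000
us, vs = [], []
for _ in range(n):
    x, y = 1 - random.random(), 1 - random.random()   # each in (0, 1]
    us.append(x / y)
    vs.append(x * y)

# f_U(u) = 1/2 on (0, 1)  =>  P(U <= 1) = 1/2
p_u = sum(u <= 1 for u in us) / n
# f_V(v) = ln(1/v)  =>  P(V <= 1/2) = (1/2)(1 + ln 2) ≈ 0.8466
p_v = sum(v <= 0.5 for v in vs) / n
print(p_u, p_v)
```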


Example of Joint Distribution

Example – continued

Plot of the area determined by

    0 < u < 1   =⇒   0 < v ≤ u        and        u ≥ 1   =⇒   0 < v ≤ 1/u.

[Figure: the region in the uv-plane lying under v = u for 0 < u < 1 and under v = 1/u for u ≥ 1, plotted for 0 < u ≤ 4, 0 < v ≤ 1.]
