Post on 13-Jan-2016


Scientific Computing

Singular Value Decomposition (SVD)

SVD - Overview

SVD is a way to decompose singular (or nearly singular) matrices, i.e. matrices that do not have inverses. This includes square matrices whose determinant is zero (or nearly zero) and all rectangular matrices.

SVD - Basics

The SVD of an m-by-n matrix A is given by the formula:

A = U D V^T

where:
U is an m-by-m matrix of the orthonormal eigenvectors of AA^T (U is orthogonal)
V^T is the transpose of an n-by-n matrix containing the orthonormal eigenvectors of A^T A (V is orthogonal)
D is an n-by-n diagonal matrix of the singular values, which are the square roots of the eigenvalues of A^T A

SVD

• In Matlab: [U,D,V] = svd(A,0)

A = U · diag(w_1, …, w_n) · V^T

where w_1, …, w_n are the singular values of A.

The Algorithm

Derivation of the SVD can be broken down into two major steps [2]:
1. Reduce the initial matrix to bidiagonal form using Householder transformations (reflections)
2. Diagonalize the resulting matrix using orthogonal transformations (rotations)

Initial Matrix → Bidiagonal Form → Diagonal Form

Householder Transformations

Recall: A Householder matrix is a reflection defined as:

H = I − 2ww^T

where w is a unit vector with |w|_2 = 1. We have the following properties:

H = H^T
H^-1 = H^T
H^2 = I (identity matrix)

If H is multiplied by another matrix (on the right or left), it results in a new matrix with zeroed-out elements in a selected row/column, based on the values chosen for w.
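These properties are easy to check numerically. A minimal pure-Python sketch (no libraries; `householder` and `matmul` are my own helper names):

```python
def householder(w):
    # H = I - 2 w w^T for a unit vector w
    n = len(w)
    return [[(1.0 if i == j else 0.0) - 2 * w[i] * w[j] for j in range(n)]
            for i in range(n)]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

w = [3 / 5, 4 / 5, 0.0]   # unit vector: |w|_2 = 1
H = householder(w)
H2 = matmul(H, H)         # equals the identity, so H^-1 = H^T = H
```

Since H is symmetric and its own inverse, applying it twice returns every vector to where it started, which is exactly what a reflection should do.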

Applying Householder

To derive the bidiagonal matrix, we apply successive Householder matrices on the left (columns) and right (rows):

M_1 = H_1 M,  M_2 = M_1 K_1,  M_3 = H_2 M_2,  …,  M_n = B

Each left multiplication by an H zeroes the entries below the diagonal in one column; each right multiplication by a K zeroes the entries past the superdiagonal in one row. After all the steps, only the bidiagonal matrix B remains.

Application con't

Householder Calculation

Columns: Recall that we zero out the column below the (k,k) entry as follows (note that there are m rows). Let (column vector, size m):

v_k = (0, …, 0, a_k,k + sgn(a_k,k)·s_k, a_k+1,k, …, a_m,k)^T,  where s_k = (a_k,k^2 + … + a_m,k^2)^(1/2)

and

w_k = v_k / |v_k|_2,  H_k = I − 2 w_k w_k^T

Note: the first k−1 entries of w_k are zero. Thus,

H_k = (I − 2 w_k w_k^T) = [ I  0 ]
                          [ 0  M ]

where I is a (k−1)-by-(k−1) identity matrix and M is the part of the reflection that acts on rows k through m.
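One column step can be sketched in plain Python (0-based indexing; `house_column` and the small 3×4 test matrix are my own illustrations, not from the slides):

```python
import math

def house_column(A, k):
    # Build H_k = I - 2 w w^T that zeroes A[i][k] for i > k
    m = len(A)
    s = math.sqrt(sum(A[i][k] ** 2 for i in range(k, m)))
    v = [0.0] * m
    v[k] = A[k][k] + math.copysign(s, A[k][k])   # a_kk + sgn(a_kk) * s_k
    for i in range(k + 1, m):
        v[i] = A[i][k]
    nv = math.sqrt(sum(t * t for t in v))
    w = [t / nv for t in v]                      # w_k = v_k / |v_k|_2
    return [[(1.0 if i == j else 0.0) - 2 * w[i] * w[j] for j in range(m)]
            for i in range(m)]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

A = [[4, 3, 0, 2], [2, 1, 2, 1], [4, 4, 0, 3]]   # 3x4 test matrix
H1 = house_column(A, 0)
HA = matmul(H1, A)   # first column becomes (-6, 0, 0)^T
```

The first column of A has norm 6, so the reflection sends it to −sgn(a_11)·6·e_1 while mixing the remaining columns.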

Householder Calculation

Rows: To zero out the row past the (k,k+1) entry, let (row vector, size n):

v_k = (0, …, 0, a_k,k+1 + sgn(a_k,k+1)·s_k, a_k,k+2, …, a_k,n),  where s_k = (a_k,k+1^2 + … + a_k,n^2)^(1/2)

and

w_k = v_k / |v_k|_2 (w_k is a row vector),  K_k = I − 2 w_k^T w_k

Note: the first k entries of w_k are zero. Thus,

K_k = (I − 2 w_k^T w_k) = [ I  0 ]
                          [ 0  M ]

where I is a k-by-k identity matrix and M is the part of the reflection that acts on columns k+1 through n.

Example

To derive H_1 for the given matrix A:

A = [ 4  3  0  2 ]
    [ 2  1  2  1 ]
    [ 4  4  0  3 ]

We have:

v_1^T = (4, 2, 4) + sgn(4)·6·(1, 0, 0) = (10, 2, 4),  since s_1 = |(4, 2, 4)|_2 = 6

Thus,

w_1^T = v_1^T / |v_1|_2 = (10, 2, 4)/√120 = (5/√30, 1/√30, 2/√30)

So,

H_1 = I − 2 w_1 w_1^T = [ −2/3   −1/3   −2/3  ]
                        [ −1/3   14/15  −2/15 ]
                        [ −2/3   −2/15  11/15 ]

Example con't

Then,

H_1 A = [ −6   −5     −0.667  −3.667 ]
        [  0   −0.6    1.867  −0.133 ]
        [  0    0.8   −0.267   0.733 ]

For K_1, we zero out the first row past the (1,2) entry. With s_1 = |(−5, −0.667, −3.667)|_2 = 6.236:

v_1 = (0, −5 − 6.236, −0.667, −3.667) = (0, −11.236, −0.667, −3.667)

w_1 = v_1 / |v_1|_2 = (0, −0.949, −0.0563, −0.3098)

K_1 = I − 2 w_1^T w_1 = [ 1    0        0        0      ]
                        [ 0   −0.8017  −0.107   −0.588  ]
                        [ 0   −0.107    0.9937  −0.0349 ]
                        [ 0   −0.588   −0.0349   0.8081 ]

Example con't

Then,

H_1 A K_1 =

[ −6   −5     −0.667  −3.667 ]   [ 1    0        0        0      ]   [ −6    6.236    0       0      ]
[  0   −0.6    1.867  −0.133 ] · [ 0   −0.8017  −0.107   −0.588  ] = [  0    0.3598   1.9236  0.1799 ]
[  0    0.8   −0.267   0.733 ]   [ 0   −0.107    0.9937  −0.0349 ]   [  0   −1.0441  −0.3761  0.1315 ]
                                 [ 0   −0.588   −0.0349   0.8081 ]

We can start to see the bidiagonal form.

Example con't

If we carry this out one more time we get B = HAK with:

B = [ −6    6.236    0       0      ]
    [  0   −1.1044   0.9846  0      ]
    [  0    0       −1.678   0.3257 ]

H = [ −0.667  −0.522  −0.532 ]
    [ −0.333  −0.430   0.839 ]
    [ −0.667   0.737   0.113 ]

K = [ 1    0        0       0      ]
    [ 0   −0.802    0.0674 −0.594  ]
    [ 0   −0.107   −0.994   0.0315 ]
    [ 0   −0.588    0.0888  0.804  ]
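The full two-sided process can be written in plain Python (no libraries; the function names are mine). Under the sign convention v = x + sgn(x_1)|x|·e_1 it reproduces the bidiagonal shape of B and the magnitudes of its entries for the 3×4 test matrix:

```python
import math

def reflect_w(x):
    # Householder w for vector x: v = x + sgn(x_1)|x| e_1, w = v / |v|_2
    s = math.sqrt(sum(t * t for t in x))
    if s == 0:
        return None
    v = list(x)
    v[0] += math.copysign(s, x[0])
    nv = math.sqrt(sum(t * t for t in v))
    return [t / nv for t in v]

def bidiagonalize(A):
    # Alternate left (column) and right (row) reflections: B = H_n...H_1 A K_1...K_m
    m, n = len(A), len(A[0])
    B = [row[:] for row in A]
    for k in range(min(m, n)):
        w = reflect_w([B[i][k] for i in range(k, m)])        # zero column k below (k,k)
        if w:
            for j in range(n):
                d = sum(w[i - k] * B[i][j] for i in range(k, m))
                for i in range(k, m):
                    B[i][j] -= 2 * w[i - k] * d
        if k < n - 2:
            w = reflect_w([B[k][j] for j in range(k + 1, n)])  # zero row k past (k,k+1)
            if w:
                for i in range(m):
                    d = sum(B[i][j] * w[j - k - 1] for j in range(k + 1, n))
                    for j in range(k + 1, n):
                        B[i][j] -= 2 * d * w[j - k - 1]
    return B

A = [[4, 3, 0, 2], [2, 1, 2, 1], [4, 4, 0, 3]]
B = bidiagonalize(A)   # upper bidiagonal; |B[0][0]| = 6, |B[0][1]| = 6.236...
```

A useful sanity check: reflections are orthogonal, so the Frobenius norm of B must equal that of A (here both are √80).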

The QR Algorithm

As seen, the initial matrix is placed into bidiagonal form which results in the following decomposition :

A = HBK with H = H1...Hn and K = Km…K1

The next step takes B and converts it to the final diagonal form using successive rotation transformations (as reflections would disrupt upper triangular form).

Givens Rotations

A Givens rotation is used to rotate a plane about two coordinate axes and can be used to zero elements, similar to the Householder reflection.

It is represented by a matrix of the form:

G(i, j, θ) = [ 1  …  0  …   0  …  0 ]
             [ ⋮     ⋱              ]
             [ 0  …  c  …  −s  …  0 ]
             [ ⋮         ⋱          ]
             [ 0  …  s  …   c  …  0 ]
             [ ⋮             ⋱      ]
             [ 0  …  0  …   0  …  1 ]

where c = cos(θ) sits in entries (i,i) and (j,j), and s = sin(θ) appears with opposite signs in entries (j,i) and (i,j).

Note: The multiplication GA affects only rows i and j of A. Likewise, the multiplication AG^t affects only columns i and j.

Givens rotation

The zeroing of an element is performed by computing c and s in the following system:

[ c  −s ] [ a ]   [ √(a^2 + b^2) ]
[ s   c ] [ b ] = [ 0            ]

where b is the element being zeroed and a is the element next to b in the preceding column/row.

This results in:

c = a / √(a^2 + b^2),  s = −b / √(a^2 + b^2)
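A minimal sketch of computing c and s and applying the 2×2 rotation in plain Python (`givens` is my own helper name):

```python
import math

def givens(a, b):
    # c, s such that [[c, -s], [s, c]] @ (a, b)^T = (sqrt(a^2+b^2), 0)^T
    r = math.hypot(a, b)
    if r == 0:
        return 1.0, 0.0
    return a / r, -b / r

a, b = 3.0, 4.0
c, s = givens(a, b)
r1 = c * a - s * b   # rotated first component, approximately 5
r2 = s * a + c * b   # the zeroed element, approximately 0
```

Unlike a reflection, the rotation touches only the two chosen rows (or columns), which is why it can clean up a bidiagonal matrix without destroying the zeros already created.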

Givens Example

In our previous example, we had used Householder transformations to get a bidiagonal matrix:

We can use rotation matrices to zero out the off-diagonal terms

Matlab: [U,D,V]=svd(A,0)

B = [ −6    6.236    0       0      ]
    [  0   −1.1044   0.9846  0      ]
    [  0    0       −1.678   0.3257 ]

SVD Applications

Calculation of the inverse of A:

[1] Given: A = U D V^T
[2] Multiply by A^-1: I = A^-1 U D V^T
[3] Multiply by V: V = A^-1 U D
[4]* Multiply by D^-1: V D^-1 = A^-1 U
[5] Multiply by U^T: V D^-1 U^T = A^-1 U U^T
[6] Rearranging: A^-1 = V D^-1 U^T

So, for an m×n matrix A, define the (pseudo) inverse to be V D^-1 U^T.
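The identity above can be checked numerically. A plain-Python sketch, assuming hand-picked orthogonal factors U, V (2×2 rotations) and a diagonal D rather than a computed SVD:

```python
import math

def rot(t):
    # 2x2 rotation matrix: orthogonal, so its transpose is its inverse
    c, s = math.cos(t), math.sin(t)
    return [[c, -s], [s, c]]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def transpose(X):
    return [list(r) for r in zip(*X)]

U, V = rot(0.7), rot(-0.3)            # hand-picked orthogonal factors
D = [[3.0, 0.0], [0.0, 1.0]]          # singular values on the diagonal
Dinv = [[1 / 3.0, 0.0], [0.0, 1.0]]

A = matmul(matmul(U, D), transpose(V))        # A = U D V^T
Ainv = matmul(matmul(V, Dinv), transpose(U))  # A^-1 = V D^-1 U^T

I2 = matmul(Ainv, A)   # approximately the 2x2 identity
```

All the cancellation happens because U and V are orthogonal: V D^-1 U^T · U D V^T collapses to V V^T = I.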

SVD Applications con’t

• SVD can tell how close a square matrix A is to being singular.
• The ratio of the largest singular value to the smallest singular value, c = σ_max / σ_min, is called the condition number, and it tells us how close a matrix is to being singular:
• A is singular if c is infinite.
• A is ill-conditioned if c is too large (machine dependent).
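For a 2×2 matrix the singular values, and hence the condition number, can be computed by hand from the eigenvalues of A^T A. A plain-Python sketch (`cond_2x2` is my own helper name):

```python
import math

def cond_2x2(A):
    # Singular values of a 2x2 A are the square roots of the eigenvalues of A^T A
    (a, b), (c, d) = A
    p, q, r = a * a + c * c, a * b + c * d, b * b + d * d   # entries of A^T A
    tr, det = p + r, p * r - q * q
    disc = math.sqrt(max(tr * tr / 4 - det, 0.0))           # quadratic formula
    lo, hi = tr / 2 - disc, tr / 2 + disc
    smin, smax = math.sqrt(max(lo, 0.0)), math.sqrt(hi)
    return float('inf') if smin == 0 else smax / smin

cond_2x2([[1, 0], [0, 1]])       # 1: perfectly conditioned
cond_2x2([[1, 2], [2, 4]])       # inf: exactly singular
cond_2x2([[1, 2], [2, 4.0001]])  # very large: nearly singular
```

The second row of the last matrix is almost twice the first, so the smallest singular value is tiny and the ratio explodes.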

SVD Applications con't

Data Fitting Problem

SVD Applications con't

Image processing

In Matlab: [U,W,V] = svd(A)
NewImg = U(:,1)*W(1,1)*V(:,1)'

This keeps only the largest singular value and its singular vectors, i.e. a rank-1 approximation of the image.

SVD Applications con’t

• SVD is used as a method for noise reduction.
• Let a matrix A represent the noisy signal:
  – compute the SVD,
  – and then discard the small singular values of A.

• It can be shown that the small singular values mainly represent the noise, and thus the rank-k matrix Ak represents a filtered signal with less noise.

Digital Signal Processing (DSP)
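A toy sketch of this idea in plain Python: the singular vectors and values below are hand-picked (not computed), with a large "signal" component and a small "noise" component; dropping the small one changes the matrix only by the noise magnitude.

```python
import math

r = 1 / math.sqrt(2)
u1, u2 = (r, r), (r, -r)   # orthonormal left singular vectors
v1, v2 = (r, -r), (r, r)   # orthonormal right singular vectors
s1, s2 = 10.0, 0.01        # large "signal" and small "noise" singular values

def outer(s, u, v):
    # rank-1 term s * u v^T of the SVD sum
    return [[s * ui * vj for vj in v] for ui in u]

def add(X, Y):
    return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

A  = add(outer(s1, u1, v1), outer(s2, u2, v2))  # noisy matrix
A1 = outer(s1, u1, v1)                          # rank-k truncation A_k with k = 1

err = max(abs(A[i][j] - A1[i][j]) for i in range(2) for j in range(2))
# err is s2/2 = 0.005: the discarded part is exactly the small "noise" component
```

In practice the SVD of the noisy matrix is computed numerically and the cutoff rank k is chosen by inspecting the decay of the singular values.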
