
Page 1: Matrix Multiplication – Chapter II (best.eng.buffalo.edu/Research/Lecture Series 2013/Matrix Analysis.pdf)

Matrix Multiplication

Chapter II- Matrix Analysis

By Gokturk Poyrazoglu

The State University of New York at Buffalo – BEST Group – Winter Lecture Series

Page 2:

Outline

1. Basic Linear Algebra

2. Vector Norms

3. Matrix Norms

4. The Singular Value Decomposition

5. SVD Properties

Page 3:

Basic Ideas from Linear Algebra

Independence:

An indexed family of vectors is linearly independent if none of them
can be written as a linear combination of finitely many other vectors
in the family.

Equivalently, a subset S of a vector space A is linearly dependent if
there exist finitely many distinct vectors a1, a2, ..., an in S and
scalars α1, α2, ..., αn, not all zero, such that

α1·a1 + α2·a2 + ... + αn·an = 0

Subspace (span):

Given a collection of vectors a1, a2, ..., an, the set of all linear
combinations of these vectors is a subspace, called their span.
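As a quick numerical sketch of both definitions (NumPy; the vectors here are hypothetical examples): stacking the vectors as columns of a matrix, the family is linearly independent exactly when the matrix rank equals the number of vectors, and the rank is also the dimension of the span.

```python
import numpy as np

# Hypothetical vectors; a3 = a1 + a2, so the family is linearly dependent.
a1 = np.array([1.0, 0.0, 2.0])
a2 = np.array([0.0, 1.0, 1.0])
a3 = a1 + a2

A = np.column_stack([a1, a2, a3])

# Independent iff the rank equals the number of vectors.
rank = np.linalg.matrix_rank(A)
independent = (rank == A.shape[1])

# The span of {a1, a2, a3} is a subspace of R^3 of dimension rank.
dim_span = rank
```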

Page 4:

Basic Ideas from Linear Algebra

Maximal Linearly Independent Subset:

The subset {ai1, ai2, …, ain} is a maximal linearly independent subset
of {a1, a2, …, an} if it is linearly independent and is not properly
contained in any linearly independent subset of {a1, a2, …, an}.

Basis:

If the subset {ai1, ai2, …, ain} is maximal and its span equals the
span of {a1, a2, …, an}, then the maximal subset {ai1, ai2, …, ain}
is a basis for that span.

Dimension :

All bases for a subspace S have the same number of elements.

This number is the dimension of S.

Page 5:

Range – Null Space – Rank

Range:       ran(A) = { y : y = Ax for some x }

Null Space:  null(A) = { x : Ax = 0 }

Rank:        rank(A) = dim(ran(A))

Page 6:

Matrix Inverse

Consider n-by-n square matrices A and X.

X is the inverse of A if  AX = XA = I.

If such an X exists, then A is said to be nonsingular;
otherwise, A is singular.

Properties:

(AB)^-1 = B^-1 A^-1        (A^-1)^T = (A^T)^-1
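A minimal sketch of the definition (NumPy; the matrices are hypothetical examples): for a nonsingular A the computed inverse satisfies AX = XA = I, while asking for the inverse of a singular matrix fails.

```python
import numpy as np

# A hypothetical nonsingular 2x2 matrix (det = 10).
A = np.array([[4.0, 7.0],
              [2.0, 6.0]])

X = np.linalg.inv(A)              # X is the inverse: AX = XA = I
I = np.eye(2)
ok = np.allclose(A @ X, I) and np.allclose(X @ A, I)

# A singular matrix has no inverse; np.linalg.inv raises LinAlgError.
S = np.array([[1.0, 2.0],
              [2.0, 4.0]])        # rank 1, hence singular
try:
    np.linalg.inv(S)
    singular_detected = False
except np.linalg.LinAlgError:
    singular_detected = True
```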

Page 7:

Vector Orthogonality

Orthogonal:

A set of vectors {x1, x2, …, xn} is orthogonal if  xi^T xj = 0  whenever i ≠ j.

Orthonormal:

A set of vectors {x1, x2, …, xn} is orthonormal if  xi^T xj = δij,

where δij is the Kronecker delta (1 if i = j, 0 otherwise).

Page 8:

Matrix Orthogonality

Orthogonal Matrix:

A square matrix Q is said to be orthogonal if  Q^T Q = Q Q^T = I.

Properties:

Columns of Q are orthonormal vectors.

If V1 is a rectangular matrix with orthonormal columns, we can find a
matrix V2 such that V = [V1 V2] is an orthogonal matrix.

The determinant of Q is either +1 or -1.
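These properties can be exercised numerically. One standard way to extend a matrix V1 with orthonormal columns to a full orthogonal matrix is to take a complete QR factorization and keep the extra columns of the Q factor as V2 (the matrices below are hypothetical random examples):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 4x2 matrix with orthonormal columns, from a thin QR.
V1, _ = np.linalg.qr(rng.standard_normal((4, 2)))

# Complete QR of V1: the last columns of Q span the orthogonal
# complement of ran(V1), so they serve as V2.
Q_full, _ = np.linalg.qr(V1, mode="complete")
V2 = Q_full[:, 2:]
V = np.column_stack([V1, V2])

is_orthogonal = np.allclose(V.T @ V, np.eye(4))
det_ok = np.isclose(abs(np.linalg.det(V)), 1.0)   # det is +1 or -1
```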

Page 9:

Determinant

det(A) = Σ_{j=1..n} (-1)^(1+j) a1j det(A1j)

where A1j is the (n-1)-by-(n-1) matrix obtained by deleting the 1st row
and jth column of A.

Properties (for square matrices):

1. det(AB) = det(A) det(B)

2. det(A^T) = det(A)

3. det(cA) = c^n det(A)

4. det(A^-1) = 1 / det(A), when A is nonsingular

5. det(A) ≠ 0 if and only if A is nonsingular

Page 10:

Eigenvalues

The set of matrix A's eigenvalues is

λ(A) = { λ : det(A - λI) = 0 }

If the eigenvalues of A are real, index them from largest to smallest:

λ1 ≥ λ2 ≥ … ≥ λn

Properties:

1. Matrices A and B are similar if  B = X^-1 A X  for some nonsingular X.

Similar matrices have exactly the same eigenvalues.
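The similarity property is easy to confirm numerically: build B = X^-1 A X from a random nonsingular X and compare the two spectra (A and X below are hypothetical random examples).

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical A and a random X (almost surely nonsingular).
A = rng.standard_normal((4, 4))
X = rng.standard_normal((4, 4))
B = np.linalg.inv(X) @ A @ X      # B is similar to A

# Sort both spectra the same way before comparing.
eig_A = np.sort_complex(np.linalg.eigvals(A))
eig_B = np.sort_complex(np.linalg.eigvals(B))

same_spectrum = np.allclose(eig_A, eig_B)
```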

Page 11:

Eigenvectors

For each eigenvalue λ of A there is a nonzero vector x such that

A x = λ x

Such a vector x is said to be an eigenvector of A associated with the
eigenvalue λ.

Note:

If an n-by-n square matrix A has n linearly independent eigenvectors,
then A is diagonalizable.
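Diagonalizability can be demonstrated directly: with the eigenvectors as the columns of X and the eigenvalues on a diagonal Λ, A = X Λ X^-1. A sketch with a hypothetical symmetric matrix (symmetric matrices always have n independent eigenvectors):

```python
import numpy as np

# Hypothetical symmetric matrix: eigenvalues 3 and 1.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

lam, X = np.linalg.eig(A)     # columns of X satisfy A x = lambda x

# n independent eigenvectors  =>  A = X diag(lam) X^-1.
diagonalizable = np.linalg.matrix_rank(X) == A.shape[0]
reconstructed = X @ np.diag(lam) @ np.linalg.inv(X)
ok = np.allclose(reconstructed, A)
```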

Page 12:

Vector Norms

A vector norm is a function f : R^n → R that satisfies the following:

1. f(x) ≥ 0, with f(x) = 0 if and only if x = 0

2. f(x + y) ≤ f(x) + f(y)

3. f(αx) = |α| f(x) for any scalar α

Notation:  f(x) = ||x||

p-norms:  ||x||_p = ( |x1|^p + … + |xn|^p )^(1/p),  p ≥ 1

Important norms (1, 2, ∞):

||x||_1 = Σ |xi|      ||x||_2 = ( Σ xi^2 )^(1/2)      ||x||_∞ = max_i |xi|
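The three important norms can be evaluated with `np.linalg.norm` (the vector is a hypothetical example chosen so all three values come out exact):

```python
import numpy as np

x = np.array([3.0, -4.0, 0.0])        # hypothetical example vector

norm1 = np.linalg.norm(x, 1)          # |3| + |-4| + |0| = 7
norm2 = np.linalg.norm(x, 2)          # sqrt(9 + 16) = 5
norm_inf = np.linalg.norm(x, np.inf)  # max(|3|, |-4|, |0|) = 4
```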

Page 13:

Vector Norm Properties

Hölder inequality:  |x^T y| ≤ ||x||_p ||y||_q,  where 1/p + 1/q = 1

The special case p = q = 2 is the Cauchy–Schwarz inequality:

|x^T y| ≤ ||x||_2 ||y||_2

The 2-norm is preserved under orthogonal transformation.

If Q is an orthogonal matrix:  ||Qx||_2 = ||x||_2
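Both properties can be checked on random data (the vectors and the orthogonal Q below are hypothetical; Q comes from the QR factorization of a random matrix):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.standard_normal(5)
y = rng.standard_normal(5)

# Cauchy-Schwarz: |x^T y| <= ||x||_2 ||y||_2
cs_holds = abs(x @ y) <= np.linalg.norm(x) * np.linalg.norm(y) + 1e-12

# 2-norm preserved under orthogonal transformation: ||Qx||_2 = ||x||_2
Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))   # random orthogonal Q
norm_preserved = np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x))
```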

Page 14:

Absolute Error – Relative Error

Absolute error in x̂:  ||x̂ - x||

Relative error in x̂:  ||x̂ - x|| / ||x||

Special case for relative error: the ∞-norm. If

||x̂ - x||_∞ / ||x||_∞ ≈ 10^(-p)

then the largest component of x̂ has approximately p correct significant
digits.
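A small numeric illustration of the significant-digits rule (the values of x and x̂ are hypothetical): here the relative ∞-norm error is 3×10^-5, and indeed the largest component, 100.003 vs 100.0, agrees to about 4–5 significant digits.

```python
import numpy as np

# Hypothetical exact x and an approximation x_hat.
x = np.array([100.0, 1.0])
x_hat = np.array([100.003, 1.0])

rel_err = np.linalg.norm(x_hat - x, np.inf) / np.linalg.norm(x, np.inf)
p = -np.log10(rel_err)      # roughly the number of correct digits
```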

Page 15:

Matrix Norms

Frobenius norm:  ||A||_F = ( Σ_i Σ_j |aij|^2 )^(1/2)

p-norm:  ||A||_p = sup_{x ≠ 0} ||Ax||_p / ||x||_p

Meaning: ||A||_p is the p-norm of the largest vector obtained by
applying A to a unit p-norm vector.
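The "largest vector" interpretation can be probed by sampling: applying A to many random unit vectors never exceeds ||A||_2, and the sampled maximum approaches it (A and the sample size are hypothetical choices):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])                # hypothetical example matrix

fro = np.linalg.norm(A, "fro")            # sqrt(1 + 4 + 9 + 16) = sqrt(30)
two = np.linalg.norm(A, 2)                # the matrix 2-norm

# Sample unit 2-norm vectors z and record the largest ||A z||_2.
rng = np.random.default_rng(3)
Z = rng.standard_normal((2, 1000))
Z /= np.linalg.norm(Z, axis=0)            # normalize each column
sampled_max = np.linalg.norm(A @ Z, axis=0).max()
```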

Page 16:

Matrix Norm Properties

1. ||AB||_p ≤ ||A||_p ||B||_p  (submultiplicativity)

2. ||A||_2 ≤ ||A||_F ≤ sqrt(n) ||A||_2

3. max_{i,j} |aij| ≤ ||A||_2 ≤ sqrt(mn) max_{i,j} |aij|

Sequence Convergence:

Consider a sequence of matrices {Ak}; then {Ak} converges to A if

lim_{k→∞} ||Ak - A|| = 0

Page 17:

Orthogonal Invariance

Multiplying matrix A by orthogonal matrices Q and Z does not change the
Frobenius norm or the 2-norm of A:

||QAZ||_F = ||A||_F      ||QAZ||_2 = ||A||_2

Page 18:

The Singular Value Decomposition

Any m×n matrix M can be factored as M = U Σ V*, where:

1. U is an m×m unitary matrix (an orthogonal matrix if M is real),

2. The matrix Σ is an m×n diagonal matrix with nonnegative real numbers on the diagonal,

3. Matrix V* denotes the conjugate transpose of the n×n unitary matrix V (the transpose when V is real),

4. The diagonal entries of Σ are known as the singular values of M,

5. In this case, the diagonal matrix Σ is uniquely determined by M (though the matrices U and V are not).
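A minimal sketch of the factorization with NumPy (M is a hypothetical real matrix; `np.linalg.svd` returns U, the singular values in descending order, and V* directly):

```python
import numpy as np

# Hypothetical real 3x2 matrix M; for real M, U and V are orthogonal.
M = np.array([[1.0, 0.0],
              [0.0, 2.0],
              [1.0, 1.0]])

U, s, Vt = np.linalg.svd(M)      # full SVD: U is 3x3, Vt is 2x2

# Rebuild the 3x2 diagonal Sigma and check M = U Sigma V*.
Sigma = np.zeros((3, 2))
Sigma[:2, :2] = np.diag(s)
ok = np.allclose(U @ Sigma @ Vt, M)

nonneg = np.all(s >= 0)              # singular values are nonnegative
ordered = np.all(np.diff(s) <= 0)    # returned in descending order
```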

Page 19:

SVD Properties

*Detailed proof is given in slide 28

Page 20:

SVD Properties

The corollary states that:

1. The columns of V (right-singular vectors) are eigenvectors of ATA.

2. The columns of U (left-singular vectors) are eigenvectors of AAT.

3. The non-zero elements of Σ (non-zero singular values) are the

square roots of the non-zero eigenvalues of AAT or ATA.
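The third statement is easy to verify numerically: the singular values of a random A should match the square roots of the eigenvalues of A^T A (A below is a hypothetical random matrix).

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((4, 3))          # hypothetical rectangular matrix

s = np.linalg.svd(A, compute_uv=False)   # singular values, descending

# A^T A is symmetric, so eigvalsh applies; it returns eigenvalues ascending.
eig = np.linalg.eigvalsh(A.T @ A)

# sqrt of the eigenvalues (reversed to descending) = singular values.
match = np.allclose(np.sqrt(eig[::-1]), s)
```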

Page 21:

SVD Properties

Page 22:

SVD Properties

Page 23:

SVD Properties

The Eckart–Young theorem states that the closest rank-k matrix to A (in
the 2-norm) is Ak, the truncation of the SVD to its k largest singular
triplets (rank minimization).

It also states that the smallest singular value of A is the 2-norm
distance from A to the set of all rank-deficient matrices.
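A sketch of the rank-k truncation and of the 2-norm error it achieves, which by the Eckart–Young theorem equals the (k+1)-th singular value (A and k are hypothetical choices):

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.standard_normal((6, 5))          # hypothetical full-rank matrix

U, s, Vt = np.linalg.svd(A)
k = 2
# Best rank-k approximation: keep the k largest singular triplets.
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

rank_ok = np.linalg.matrix_rank(A_k) == k
# Eckart-Young: ||A - A_k||_2 = sigma_{k+1}.
err_ok = np.isclose(np.linalg.norm(A - A_k, 2), s[k])
```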

Page 24:

Complex SVD

Unitary Matrix:

A complex square matrix U is unitary if  U* U = U U* = I,

where U* denotes the conjugate transpose of U.

Complex SVD:

For a complex square matrix A, there exist unitary matrices U and V
such that  A = U Σ V*,  with Σ diagonal with real nonnegative entries.

Page 25:

Sensitivity of Square Systems

We want to solve Ax = b.

How do perturbations in A and b affect the solution x?

SVD Analysis
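One standard result of this analysis is the bound ||δx|| / ||x|| ≤ κ2(A) · ||δb|| / ||b|| for perturbations of b, where κ2(A) = σmax/σmin is the 2-norm condition number. A sketch with a hypothetical ill-conditioned A built from an SVD with a tiny smallest singular value (all numbers here are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(6)

# Build A = U diag(1, 0.5, 1e-6) V^T: condition number ~ 1e6.
U, _ = np.linalg.qr(rng.standard_normal((3, 3)))
V, _ = np.linalg.qr(rng.standard_normal((3, 3)))
A = U @ np.diag([1.0, 0.5, 1e-6]) @ V.T
b = rng.standard_normal(3)

x = np.linalg.solve(A, b)

# Perturb b slightly and re-solve.
db = 1e-8 * rng.standard_normal(3)
x_pert = np.linalg.solve(A, b + db)

rel_change_x = np.linalg.norm(x_pert - x) / np.linalg.norm(x)
rel_change_b = np.linalg.norm(db) / np.linalg.norm(b)
kappa = np.linalg.cond(A, 2)             # sigma_max / sigma_min

bound_holds = rel_change_x <= kappa * rel_change_b * (1 + 1e-9)
```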

Page 26:

Extra Proof Slides

Page 27:

Proof of SVD Theorem (Slide 18)

Page 28:

Proof of SVD Theorem (Slide 19)

Page 29:

Proof of

The Eckart–Young Theorem (Slide 23)

Page 30:

Nearness to Singularity

The question is:

If det(A) = 0 is equivalent to singularity, is det(A) ≈ 0 equivalent to
near singularity?

The answer is NO.
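A classic counterexample (the size and scaling here are illustrative): a 100×100 diagonal matrix with 0.1 on the diagonal has determinant 10^-100, yet its condition number is exactly 1, so it is as far from singular as a matrix can be. Near-singularity is measured by the condition number (or the smallest singular value), not the determinant.

```python
import numpy as np

n = 100
# D is perfectly conditioned (kappa = 1) yet has a tiny determinant.
D = 0.1 * np.eye(n)

det_D = np.linalg.det(D)        # 10^{-100}: "det ≈ 0"
kappa_D = np.linalg.cond(D, 2)  # 1.0: nowhere near singular
```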

Page 31:

Matrix 2-norm

Consider a unit 2-norm vector z