TRANSCRIPT
Quantum One: Lecture 19
Representation Independent Properties of Linear Operators
In the last lecture, we derived the canonical commutation relations obeyed by the Cartesian components of the position and wavevector (or momentum) operators.
We then began a study of unitary operators and showed that any two sets of orthonormal basis vectors are connected by a unitary operator and by its adjoint.
We saw how to transform between two discrete representations, using the matrices that represent the unitary operators connecting them, and extended this idea to continuous representations, noting that the Fourier transform relation between position and momentum actually represents a unitary transformation between those two representations.
In this lecture we learn about a number of properties that the matrices that represent a given linear operator share, i.e., representation independent properties.
We begin by introducing what is called the Trace of an Operator.
The Trace of an Operator
The trace of an operator A, denoted by Tr(A), is the sum of the diagonal elements of any matrix [A] representing the operator (which we also denote by Tr([A])), i.e., for any discrete ONB of states {|i⟩},

Tr(A) = Tr([A]) = Σᵢ ⟨i|A|i⟩ = Σᵢ Aᵢᵢ.
It is easy to show that the trace of a product of matrices (or operators) is invariant under a cyclic permutation of the elements in the product.
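This cyclic invariance is easy to verify numerically. The sketch below (NumPy, with arbitrary random matrices as a stand-in example, not anything from the lecture) checks that Tr(ABC) = Tr(BCA) = Tr(CAB):

```python
import numpy as np

rng = np.random.default_rng(0)
# Three arbitrary 3x3 matrices; any square matrices of equal size will do.
A, B, C = (rng.standard_normal((3, 3)) for _ in range(3))

# The trace of a product is invariant under cyclic permutation of the factors.
t_abc = np.trace(A @ B @ C)
t_bca = np.trace(B @ C @ A)
t_cab = np.trace(C @ A @ B)
assert np.isclose(t_abc, t_bca) and np.isclose(t_abc, t_cab)

# But it is NOT invariant under an arbitrary (non-cyclic) reordering:
t_bac = np.trace(B @ A @ C)  # generally differs from t_abc
```

Note that only cyclic reorderings are safe; swapping two adjacent factors generally changes the trace.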
The Trace of an Operator
Note that if [A] represents a linear operator A in some representation, and [U] is a unitary matrix with adjoint [U⁺], then

[A′] = [U][A][U⁺]

is the matrix representing A in some other representation, and

Tr([A′]) = Tr([U][A][U⁺]) = Tr([A][U⁺][U]) = Tr([A]),

which shows that the trace of any matrix [A], [A′], [A″], … that represents A is the same. Thus, Tr(A) is a representation independent property.
In a representation in which [A] is diagonal, the diagonal elements are just the eigenvalues of A, and so

Tr(A) = Σₐ a,

the sum of the eigenvalues of A (each appearing as many times as it does along the diagonal).
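Both facts can be checked with a short numerical sketch. Here the unitary is built from the QR decomposition of a random complex matrix, which is an assumption of this example rather than anything in the lecture:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
# Build a unitary matrix U via QR decomposition of a random complex matrix.
M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
U, _ = np.linalg.qr(M)
assert np.allclose(U.conj().T @ U, np.eye(n))  # U is unitary

A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A_prime = U @ A @ U.conj().T  # [A'] = [U][A][U+]

# The trace is representation independent ...
assert np.isclose(np.trace(A_prime), np.trace(A))
# ... and equals the sum of the eigenvalues of A.
assert np.isclose(np.trace(A), np.linalg.eigvals(A).sum())
```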
The Determinant of an Operator
The determinant of an operator A, denoted by det(A), or more simply by |A|, is the determinant of any matrix that represents that operator, i.e.,
if in some discrete representation of states {|i⟩}, the matrix [A] represents A, then

det(A) = det([A]).

Basic familiarity with general properties of the determinant of a matrix will be assumed.
The Determinant of an Operator
For example, any determinant can be expanded in minors, until one gets down to 2×2 matrices.
The determinant of a 2×2 matrix is

|a b|
|c d| = ad − bc.
The determinant of a diagonal matrix is just the product of the diagonal elements.
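A minimal numerical check of both statements, using arbitrary example entries:

```python
import numpy as np

# det of a 2x2 matrix is ad - bc.
a, b, c, d = 1.0, 2.0, 3.0, 4.0
M = np.array([[a, b], [c, d]])
assert np.isclose(np.linalg.det(M), a * d - b * c)  # 1*4 - 2*3 = -2

# det of a diagonal matrix is the product of its diagonal elements.
D = np.diag([2.0, 3.0, 5.0])
assert np.isclose(np.linalg.det(D), 2.0 * 3.0 * 5.0)  # 2*3*5 = 30
```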
The Determinant of an Operator
Thus, e.g., the identity operator, which in any discrete representation is represented by the identity matrix, has a determinant of unity:

det(1) = det([1]) = 1.
In addition, it turns out that the determinant of a product of matrices is equal to the product of their determinants, i.e., if [ABC] = [A][B][C], then
det[ABC] = det([A][B][C]) = det[A] det[B] det[C].
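A quick numerical sketch of the last two statements, with arbitrary random matrices standing in for [A], [B], [C]:

```python
import numpy as np

rng = np.random.default_rng(2)

# The identity matrix has determinant 1.
assert np.isclose(np.linalg.det(np.eye(4)), 1.0)

# det of a product equals the product of the dets.
A, B, C = (rng.standard_normal((4, 4)) for _ in range(3))
lhs = np.linalg.det(A @ B @ C)
rhs = np.linalg.det(A) * np.linalg.det(B) * np.linalg.det(C)
assert np.isclose(lhs, rhs)
```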
The Determinant of an Operator
It follows that if [A] and [A′] = [U][A][U⁺] represent a linear operator A in two different representations connected by the unitary operator U, then

det[A′] = det([U][A][U⁺]) = det[U] det[A] det[U⁺] = det[A],

in which we have used the result in both directions, recombining the product of determinants into the determinant of the product, i.e.,
det([U][U⁺]) = det(1) = 1.
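This invariance can be verified with the same sort of sketch used for the trace (NumPy; the unitary again comes from a QR decomposition, an assumption of this example):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4
# A random unitary U and a random complex matrix A.
U, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

# det([U][U+]) = det(1) = 1, so det survives the change of representation.
assert np.isclose(np.linalg.det(U @ U.conj().T), 1.0)
A_prime = U @ A @ U.conj().T
assert np.isclose(np.linalg.det(A_prime), np.linalg.det(A))
```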
The Determinant of an Operator
Note that in a representation in which A is diagonal, the diagonal elements are just the eigenvalues of A, and so

det(A) = Πₐ a,

the product of the eigenvalues of A.
Finally, you may recall that a necessary and sufficient condition for the inverse of a matrix to exist is that its determinant not vanish.
This condition extends to any operator represented by such a matrix, i.e.
If det(A) = 0, then A is non-invertible or singular.
If det(A) ≠ 0, then there exists an inverse operator A⁻¹ such that AA⁻¹=A⁻¹A=1.
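The two cases can be illustrated with small example matrices (chosen for this sketch, not taken from the lecture):

```python
import numpy as np

# A singular operator: the rows are proportional, so det = 0 and no inverse exists.
S = np.array([[1.0, 2.0], [2.0, 4.0]])
assert np.isclose(np.linalg.det(S), 0.0)

# A nonsingular operator: det != 0, and A A^-1 = A^-1 A = 1.
A = np.array([[2.0, 1.0], [0.0, 3.0]])
assert not np.isclose(np.linalg.det(A), 0.0)
A_inv = np.linalg.inv(A)
assert np.allclose(A @ A_inv, np.eye(2))
assert np.allclose(A_inv @ A, np.eye(2))
```

Calling `np.linalg.inv(S)` on the singular matrix would raise a `LinAlgError`, the numerical counterpart of non-invertibility.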
Eigenvalues, Eigenvectors, and Eigenspaces
Recall that a nonzero vector |χ⟩ is said to be an eigenvector of the operator A with eigenvalue a (where generally, a ∈ C) if it satisfies the eigenvalue equation
A|χ⟩ = a|χ⟩.
The set of eigenvalues {a} for which solutions to this equation exist is referred to as the spectrum of the operator A, and we write spectrum(A)={a}.
The spectrum of an arbitrary operator can be real, complex, continuous, discrete, mixed, bounded, or unbounded.
Comment: A number of basic features follow from the eigenvalue equation.
1. If |χ⟩ is an eigenvector of A, then so is any multiple λ|χ⟩ of |χ⟩. This follows from the fact that A is a linear operator, so that
A(λ|χ⟩) = λA|χ⟩ = λa|χ⟩ = a(λ|χ⟩).
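This scaling property is easy to confirm numerically; the small Hermitian matrix below is an arbitrary stand-in operator:

```python
import numpy as np

# A small Hermitian matrix as a stand-in operator.
A = np.array([[2.0, 1.0], [1.0, 2.0]])
vals, vecs = np.linalg.eigh(A)
a, chi = vals[0], vecs[:, 0].astype(complex)
assert np.allclose(A @ chi, a * chi)  # A|chi> = a|chi>

# Any multiple lam*|chi> is again an eigenvector with the same eigenvalue a.
lam = 3.7 - 1.2j
assert np.allclose(A @ (lam * chi), a * (lam * chi))
```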
Eigenvalues, Eigenvectors, and Eigenspaces
Thus, only the direction in Hilbert space of a given eigenvector is unique. This means that we are always free to construct eigenvectors that are appropriately normalized.
2. By taking the adjoint of the eigenvalue equation A|χ⟩ = a|χ⟩,
we see that if |χ⟩ is an eigenket of A with eigenvalue a, then

⟨χ|A⁺ = a*⟨χ|,

which implies that ⟨χ| is an eigenbra of A⁺ with eigenvalue a*.
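Numerically, the eigenbra relation can be checked by treating ⟨χ| as the conjugated row vector; the random non-Hermitian matrix here is an arbitrary example so that the eigenvalue is genuinely complex:

```python
import numpy as np

rng = np.random.default_rng(4)
# A generic (non-Hermitian) complex matrix, so eigenvalues are complex.
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
vals, vecs = np.linalg.eig(A)
a, chi = vals[0], vecs[:, 0]
assert np.allclose(A @ chi, a * chi)  # A|chi> = a|chi>

# The bra <chi| (the conjugated row vector) is an eigenbra of A+ with eigenvalue a*.
bra = chi.conj()
assert np.allclose(bra @ A.conj().T, np.conj(a) * bra)
```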
Eigenvalues, Eigenvectors, and Eigenspaces
An eigenvalue a of an operator A is degenerate if there exists more than one linearly independent eigenvector corresponding to that eigenvalue.
The degeneracy of an eigenvalue a is equal to the maximum number of linearly independent eigenvectors associated with it.
We also say that an eigenvalue with degeneracy g is g-fold degenerate.
An eigenvalue with only one linearly independent eigenvector is said to be nondegenerate.
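The degeneracy of an eigenvalue a is the dimension of the null space of (A − a·1). A minimal sketch, using a diagonal example matrix chosen for illustration:

```python
import numpy as np

# diag(2, 2, 5): eigenvalue 2 is two-fold degenerate, 5 is nondegenerate.
A = np.diag([2.0, 2.0, 5.0])

def degeneracy(A, a):
    """Dimension of the null space of (A - a*1), i.e. the maximum number
    of linearly independent eigenvectors with eigenvalue a."""
    n = A.shape[0]
    return n - np.linalg.matrix_rank(A - a * np.eye(n))

assert degeneracy(A, 2.0) == 2
assert degeneracy(A, 5.0) == 1
```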
Eigenvalues, Eigenvectors, and Eigenspaces
It is easy to show that any linear superposition of the linearly independent eigenstates of A with the same eigenvalue is also an eigenstate of A with that eigenvalue, i.e., if

A|χᵢ⟩ = a|χᵢ⟩ for i = 1, 2, …, g,

then the action of A on any linear combination

|χ⟩ = Σᵢ λᵢ|χᵢ⟩

of these vectors is

A|χ⟩ = Σᵢ λᵢ A|χᵢ⟩ = Σᵢ λᵢ a|χᵢ⟩ = a|χ⟩.
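The same diagonal example matrix makes this concrete: any combination of two degenerate eigenvectors is again an eigenvector with the shared eigenvalue.

```python
import numpy as np

# diag(2, 2, 5): e1 and e2 are independent eigenvectors with the same eigenvalue 2.
A = np.diag([2.0, 2.0, 5.0])
chi1 = np.array([1.0, 0.0, 0.0])
chi2 = np.array([0.0, 1.0, 0.0])

# Any linear combination of them is again an eigenvector with eigenvalue 2.
lam1, lam2 = 0.6, 0.8j
chi = lam1 * chi1 + lam2 * chi2
assert np.allclose(A @ chi, 2.0 * chi)
```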
Eigenvalues, Eigenvectors, and Eigenspaces
As we have seen, any set of linearly independent vectors forms a basis for a subspace of S, namely, the subspace
formed from all possible linear combinations of those vectors.
Thus any set of g linearly independent eigenvectors of A with the same g-fold degenerate eigenvalue a forms a basis for a g-dimensional subspace Sₐ, each vector of which is an eigenvector of A with that eigenvalue.
We refer to the subspace Sₐ as the eigenspace of A associated with the eigenvalue a.
Eigenvalues, Eigenvectors, and Eigenspaces
Within the eigenspace Sₐ we may form linear combinations of the linearly independent eigenvectors, using the Gram-Schmidt orthogonalization procedure to construct an orthonormal basis

{|a, τ⟩}

of eigenvectors for this eigenspace. Here τ is a discrete index that labels the different orthonormal basis states that span this subspace.
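Numerically, Gram-Schmidt on a set of vectors is exactly what a QR decomposition performs. A sketch in the same diagonal example eigenspace used above:

```python
import numpy as np

A = np.diag([2.0, 2.0, 5.0])
# Two linearly independent (but non-orthogonal) eigenvectors with eigenvalue 2:
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([1.0, 1.0, 0.0])

# QR decomposition performs Gram-Schmidt: the columns of Q are orthonormal
# combinations of v1 and v2, so they still lie in the same eigenspace.
Q, _ = np.linalg.qr(np.column_stack([v1, v2]))
assert np.allclose(Q.T @ Q, np.eye(2))  # orthonormal
assert np.allclose(A @ Q, 2.0 * Q)      # each column is still an eigenvector
```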
Eigenvalues, Eigenvectors, and Eigenspaces
Thus each linear operator A of the state space can be associated with
1. a set of eigenvalues (each of which has a certain degeneracy g),
2. a set of eigenvectors {|a, τ⟩},
3. and a set of eigenspaces Sₐ associated with each of the different eigenvalues of that operator.
Eigenvalues and Eigenvectors of Hermitian Operators
The second postulate associates observables with linear Hermitian operators A. The reason for this largely stems from the special properties associated with such operators. These include:
Reality of the Eigenvalues - If A is a Hermitian operator, so that A = A⁺, and |χ⟩ is one of its eigenvectors, then
A|χ⟩ = a|χ⟩,

and so

⟨χ|A|χ⟩ = a⟨χ|χ⟩.

But for a Hermitian operator the adjoint of this last equation is

⟨χ|A⁺|χ⟩ = ⟨χ|A|χ⟩ = a*⟨χ|χ⟩.

Comparing the last two relations, we deduce that a = a*.
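A numerical sketch of this reality property: build a Hermitian matrix from an arbitrary random complex matrix and check that its eigenvalues have no imaginary part (up to floating-point noise).

```python
import numpy as np

rng = np.random.default_rng(5)
# Build a Hermitian matrix: H = M + M+ satisfies H = H+.
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
H = M + M.conj().T
assert np.allclose(H, H.conj().T)

# Its eigenvalues are real (up to floating-point noise).
vals = np.linalg.eigvals(H)
assert np.allclose(vals.imag, 0.0, atol=1e-10)
```

In practice one would call `np.linalg.eigvalsh`, which exploits Hermiticity and returns real eigenvalues directly.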
Eigenvalues and Eigenvectors of Hermitian Operators
Thus, we see that the eigenvalues of Hermitian operators are necessarily real.
Earlier, we showed that expectation values of Hermitian operators are real.
The two statements are obviously closely related, and reduce to the same thing in any representation in which the Hermitian operator is diagonal.
The requirement that measurable quantities be real-valued motivates the identification of observables with Hermitian operators.
Note that, because of the reality of the eigenvalues, the adjoint of the eigenvalue equation for a Hermitian operator takes the simple form

⟨χ|A = a⟨χ|.
72
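As a numerical illustration of the reality result (an addition to the transcript, not part of the lecture; the matrix size, seed, and tolerance are arbitrary choices), one can check with NumPy that a Hermitian matrix built from a random complex matrix has eigenvalues with vanishing imaginary parts:

```python
import numpy as np

# Build a Hermitian matrix H = (M + M†)/2 from an arbitrary complex matrix M.
rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
H = (M + M.conj().T) / 2
assert np.allclose(H, H.conj().T)  # Hermitian by construction

# np.linalg.eig makes no Hermiticity assumption, so the eigenvalues it
# returns are complex in general; here their imaginary parts vanish.
w, _ = np.linalg.eig(H)
print(np.max(np.abs(w.imag)))  # essentially zero
```

Because eig treats H as a general matrix, the vanishing imaginary parts are a genuine consequence of Hermiticity, not an artifact of the routine.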
Orthogonality of Eigenvectors for Hermitian Operators
Let |χ⟩ and |χ′⟩ be eigenvectors of a Hermitian operator A corresponding to eigenvalues a and a′, respectively. Thus, we can write
A|χ⟩ = a|χ⟩ and A|χ′⟩ = a′|χ′⟩.
Taking the inner product of the first of these with |χ′⟩ we find that
⟨χ′|A|χ⟩ = a⟨χ′|χ⟩.
But the adjoint of the eigenvalue equation for a′ is
⟨χ′|A = a′⟨χ′|,
where we have used the reality of the eigenvalues deduced above. Taking the inner product of this equation on the right with |χ⟩, we find that
⟨χ′|A|χ⟩ = a′⟨χ′|χ⟩.
Equating these two expressions for the matrix element ⟨χ′|A|χ⟩ we find that
Orthogonality of Eigenvectors for Hermitian Operators
⟨χ′|A|χ⟩ = a⟨χ′|χ⟩ = a′⟨χ′|χ⟩,
so a⟨χ′|χ⟩ = a′⟨χ′|χ⟩,
or (a − a′)⟨χ′|χ⟩ = 0.
There are two ways in which this product can vanish.
Either a = a′, in which case we haven't found out anything,
or a ≠ a′, in which case ⟨χ′|χ⟩ = 0,
showing that the eigenstates of a Hermitian operator corresponding to different eigenvalues are necessarily orthogonal.
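The orthogonality conclusion can likewise be checked numerically (again an illustrative addition; the particular 2×2 matrix below is an arbitrary Hermitian example with two distinct eigenvalues):

```python
import numpy as np

# An arbitrary 2x2 Hermitian matrix with two distinct eigenvalues (1 and 4).
A = np.array([[2.0, 1 - 1j],
              [1 + 1j, 3.0]])
assert np.allclose(A, A.conj().T)

w, V = np.linalg.eigh(A)         # eigh is specialized to Hermitian matrices
chi, chi_p = V[:, 0], V[:, 1]    # eigenvectors for eigenvalues w[0] != w[1]

# The inner product ⟨χ′|χ⟩ vanishes, exactly as the argument above predicts.
print(abs(np.vdot(chi_p, chi)))  # essentially zero
```

Note that np.vdot conjugates its first argument, so it computes precisely the Dirac inner product ⟨χ′|χ⟩ used in the derivation.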
Orthogonality of Eigenspaces for Hermitian Operators
The fact that eigenvectors of a Hermitian operator corresponding to different eigenvalues are necessarily orthogonal implies something about the eigenspaces of such operators.
Indeed, we have seen that each vector in the eigenspace S_a of A is an eigenvector of A corresponding to eigenvalue a.
Thus if A is a Hermitian operator, and a ≠ a′, every vector in S_a must be orthogonal to every vector in S_a′.
In such a situation, we say that the eigenspaces S_a and S_a′ of A corresponding to different eigenvalues are themselves orthogonal.
Thus Hermitian operators have orthogonal eigenspaces.
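A degenerate example makes the eigenspace statement concrete (an illustrative sketch: the diagonal matrix, seed, and dimension are arbitrary). Here a Hermitian matrix is built with a doubly degenerate eigenvalue, and every vector in the two-dimensional eigenspace is seen to be orthogonal to the remaining eigenvector:

```python
import numpy as np

# A Hermitian matrix with a doubly degenerate eigenvalue: conjugate
# D = diag(1, 1, 2) by a unitary U obtained from a QR factorization.
rng = np.random.default_rng(1)
U, _ = np.linalg.qr(rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3)))
A = U @ np.diag([1.0, 1.0, 2.0]) @ U.conj().T

w, V = np.linalg.eigh(A)          # eigenvalues sorted: [1, 1, 2] up to rounding
# Columns 0 and 1 span the two-dimensional eigenspace S_1; column 2 spans S_2.
for i in (0, 1):
    print(abs(np.vdot(V[:, 2], V[:, i])))  # each essentially zero: S_1 ⊥ S_2
```

Since any vector in S_1 is a linear combination of the first two columns, its orthogonality to S_2 follows from the two overlaps printed above.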
In this lecture, we defined a number of representation independent properties of linear operators, including the trace and the determinant of a linear operator, which are equal to the corresponding values for any matrix that represents it.
These two properties were seen to be closely related to the spectral properties of the operator, in that each can be expressed in terms of its collection of eigenvalues, i.e., in terms of its spectrum.
We thus began a discussion of the so-called eigen-properties of a linear operator, which include its spectrum of eigenvalues (and their degeneracies), its eigenvectors, and its eigenspaces, which are the subspaces of S containing all the eigenvectors of a linear operator A having the same eigenvalue.
We then proved two important facts about Hermitian operators: (1) they have real eigenvalues, and (2) eigenvectors and eigenspaces associated with different eigenvalues are necessarily orthogonal.
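The representation independence of the trace and determinant summarized above can also be verified directly (an illustrative addition; the matrix size and seed are arbitrary): under a unitary change of basis both quantities are unchanged, and they equal the sum and product of the eigenvalues, respectively.

```python
import numpy as np

# The trace and determinant of the matrix representing an operator are
# invariant under a unitary change of basis, and equal the sum and the
# product of its eigenvalues, respectively.
rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
U, _ = np.linalg.qr(rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4)))
B = U.conj().T @ A @ U           # the same operator in a different basis

w = np.linalg.eigvals(A)
print(np.allclose(np.trace(A), np.trace(B)), np.allclose(np.trace(A), w.sum()))
print(np.allclose(np.linalg.det(A), np.linalg.det(B)),
      np.allclose(np.linalg.det(A), w.prod()))
```

Note that A here need not be Hermitian: trace and determinant are representation independent for any linear operator, which is why a general complex matrix is used.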
In the next lecture, we address the question of how one actually goes about solving the eigenvalue equation for a given linear operator A in order to find its eigenvalues (and their degeneracies) and eigenvectors.