Section PEE Properties of Eigenvalues and Eigenvectors
Subsection BPE Basic Properties of Eigenvalues
Theorem EDELI. Eigenvectors with Distinct Eigenvalues are Linearly Independent.
Suppose that \(A\) is a square matrix and \(S=\set{\vectorlist{x}{p}}\) is a set of eigenvectors with eigenvalues \(\scalarlist{\lambda}{p}\) such that \(\lambda_i\neq\lambda_j\) whenever \(i\neq j\text{.}\) Then \(S\) is a linearly independent set.
Proof.
We will establish by induction (Proof Technique I) that any set of \(k\) eigenvectors of \(A\) with distinct eigenvalues \(\scalarlist{\lambda}{k}\) is a linearly independent set. Suppose \(A\) has size \(n\text{.}\)
Base Case.
When \(k=1\text{,}\) \(\set{\vect{x}_1}\) is a set with a single nonzero vector and thus is linearly independent.
Induction Step.
Begin with a relation of linear dependence on the set \(\set{\vectorlist{x}{k}}\text{,}\)
\begin{equation*}
a_1\vect{x}_1+a_2\vect{x}_2+a_3\vect{x}_3+\cdots+a_k\vect{x}_k=\zerovector\text{.}
\end{equation*}
Multiply both sides by \(A-\lambda_kI_n\text{.}\) Since \(\left(A-\lambda_kI_n\right)\vect{x}_i=A\vect{x}_i-\lambda_k\vect{x}_i=\left(\lambda_i-\lambda_k\right)\vect{x}_i\) for each \(i\text{,}\) and the term with \(i=k\) vanishes, we obtain
\begin{equation*}
a_1\left(\lambda_1-\lambda_k\right)\vect{x}_1+a_2\left(\lambda_2-\lambda_k\right)\vect{x}_2+\cdots+a_{k-1}\left(\lambda_{k-1}-\lambda_k\right)\vect{x}_{k-1}=\zerovector\text{.}
\end{equation*}
This equation is a relation of linear dependence on the set \(\set{\vectorlist{x}{k-1}}\text{,}\) which is a linearly independent set by the induction hypothesis. So the scalars must all be zero by Definition LI. That is, \(a_i\left(\lambda_i-\lambda_k\right)=0\) for \(1\leq i\leq k-1\text{.}\) However, we have the hypothesis that the eigenvalues are distinct, so \(\lambda_i-\lambda_k\neq 0\) for \(1\leq i\leq k-1\text{.}\) So Theorem ZPZF implies \(a_i=0\) for \(1\leq i\leq k-1\text{.}\)
This reduces the original relation of linear dependence on \(\set{\vectorlist{x}{k}}\) to the simpler equation \(a_k\vect{x}_k=\zerovector\text{.}\) By Theorem SMEZV we conclude that \(a_k=0\) or \(\vect{x}_k=\zerovector\text{.}\) Eigenvectors are never the zero vector (Definition EEM), so \(a_k=0\text{.}\) Now all of the scalars \(a_i\text{,}\) \(1\leq i\leq k\) are zero, and so the only relation of linear dependence on the set \(\set{\vectorlist{x}{k}}\) is trivial. So by Definition LI, the set \(\set{\vectorlist{x}{k}}\) is linearly independent.
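Though not part of the proof, a quick Sage experiment can exhibit this independence concretely. The matrix below is an arbitrary choice with three distinct eigenvalues; stacking one eigenvector per eigenvalue as the columns of a matrix of full rank confirms the set is linearly independent.

    # an arbitrary matrix with three distinct eigenvalues: 4, 2, -3
    A = matrix(QQ, [[4, 1, 1], [0, 2, 5], [0, 0, -3]])
    evs = A.eigenvectors_right()          # (eigenvalue, eigenvectors, multiplicity) triples
    S = column_matrix([vecs[0] for _, vecs, _ in evs])   # one eigenvector per eigenvalue
    S.rank() == 3                         # full rank, so the set is linearly independent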
Theorem MNEM. Maximum Number of Eigenvalues of a Matrix.
Suppose that A is a square matrix of size n. Then A cannot have more than n distinct eigenvalues.
Proof.
We argue by contradiction (Proof Technique CD). Assume that \(A\) has \(n+1\) or more distinct eigenvalues. Then there is a set of \(n+1\) or more eigenvectors of \(A\text{,}\) with distinct eigenvalues. This is a set of \(n+1\) or more vectors from \(\complex{n}\) that will be linearly independent by Theorem EDELI. But this contradicts Theorem MVSLD, so our assumption is false, and there are no more than \(n\) distinct eigenvalues.
Theorem SMZE. Singular Matrices have Zero Eigenvalues.
Suppose A is a square matrix. Then A is singular if and only if 0 is an eigenvalue of A.
Proof.
We have the following equivalences:
\begin{align*}
A\text{ is singular}&\iff\text{there exists }\vect{x}\neq\zerovector\text{ such that }A\vect{x}=\zerovector&&\text{Definition NM}\\
&\iff\text{there exists }\vect{x}\neq\zerovector\text{ such that }A\vect{x}=0\vect{x}&&\text{Theorem ZSSM}\\
&\iff\lambda=0\text{ is an eigenvalue of }A&&\text{Definition EEM}
\end{align*}
Theorem NME7. Nonsingular Matrix Equivalences, Round 7.
Suppose that A is a square matrix of size n. The following are equivalent.
- \(A\) is nonsingular.
- \(A\) row-reduces to the identity matrix.
- The null space of \(A\) contains only the zero vector, \(\nsp{A}=\set{\zerovector}\text{.}\)
- The linear system \(\linearsystem{A}{\vect{b}}\) has a unique solution for every possible choice of \(\vect{b}\text{.}\)
- The columns of \(A\) are a linearly independent set.
- \(A\) is invertible.
- The column space of \(A\) is \(\complex{n}\text{,}\) \(\csp{A}=\complex{n}\text{.}\)
- The columns of \(A\) are a basis for \(\complex{n}\text{.}\)
- The rank of \(A\) is \(n\text{,}\) \(\rank{A}=n\text{.}\)
- The nullity of \(A\) is zero, \(\nullity{A}=0\text{.}\)
- \(\lambda=0\) is not an eigenvalue of \(A\text{.}\)
Proof.
The equivalence of the first and last statements is Theorem SMZE, reformulated by negating each statement in the equivalence. So we are able to improve on Theorem NME6 with this addition.
Sage NME8. Nonsingular Matrix Equivalences, Round 8.
Zero eigenvalues are another marker of singular matrices. We illustrate with two matrices, the first nonsingular, the second not.
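The original Sage cell is not reproduced here; a minimal sketch along those lines, with two arbitrarily chosen matrices, could read:

    A = matrix(QQ, [[2, 1], [1, 2]])    # nonsingular, eigenvalues 3 and 1
    A.is_singular(), A.eigenvalues()    # False, and no zero eigenvalue
    B = matrix(QQ, [[1, 2], [2, 4]])    # singular: second row is twice the first
    B.is_singular(), B.eigenvalues()    # True, and 0 appears as an eigenvalue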
Theorem ESMM. Eigenvalues of a Scalar Multiple of a Matrix.
Suppose \(A\) is a square matrix, \(\lambda\) is an eigenvalue of \(A\text{,}\) and \(\alpha\in\complexes\) is a scalar. Then \(\alpha\lambda\) is an eigenvalue of \(\alpha A\text{.}\)
Proof.
Let \(\vect{x}\neq\zerovector\) be one eigenvector of \(A\) for \(\lambda\text{.}\) Then
\begin{align*}
\left(\alpha A\right)\vect{x}&=\alpha\left(A\vect{x}\right)&&\text{Theorem MMSMM}\\
&=\alpha\left(\lambda\vect{x}\right)&&\text{Definition EEM}\\
&=\left(\alpha\lambda\right)\vect{x}\text{.}
\end{align*}
So \(\vect{x}\neq\zerovector\) is an eigenvector of \(\alpha A\) for the eigenvalue \(\alpha\lambda\text{.}\)
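A one-line Sage check of this fact, with an arbitrary matrix and scalar, might look like:

    A = matrix(QQ, [[2, 1], [1, 2]])    # eigenvalues 3 and 1
    alpha = 5
    sorted(A.eigenvalues()), sorted((alpha * A).eigenvalues())
    # ([1, 3], [5, 15]) -- each eigenvalue is scaled by alpha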
Theorem EOMP. Eigenvalues Of Matrix Powers.
Suppose \(A\) is a square matrix, \(\lambda\) is an eigenvalue of \(A\text{,}\) and \(s\geq 0\) is an integer. Then \(\lambda^s\) is an eigenvalue of \(A^s\text{.}\)
Proof.
Let \(\vect{x}\neq\zerovector\) be one eigenvector of \(A\) for \(\lambda\text{.}\) Then we proceed by induction on \(s\) (Proof Technique I). First, for \(s=0\text{,}\) employing Theorem MMIM and Property OC to establish the base case,
\begin{align*}
A^0\vect{x}&=I_n\vect{x}\\
&=\vect{x}&&\text{Theorem MMIM}\\
&=1\vect{x}&&\text{Property OC}\\
&=\lambda^0\vect{x}\text{.}
\end{align*}
So \(\lambda^s\) is an eigenvalue of \(A^s\) in this special case. For the induction step, we assume the theorem is true for \(s\text{,}\) and find
\begin{align*}
A^{s+1}\vect{x}&=A\left(A^s\vect{x}\right)\\
&=A\left(\lambda^s\vect{x}\right)&&\text{Induction hypothesis}\\
&=\lambda^s\left(A\vect{x}\right)&&\text{Theorem MMSMM}\\
&=\lambda^s\left(\lambda\vect{x}\right)&&\text{Definition EEM}\\
&=\lambda^{s+1}\vect{x}\text{.}
\end{align*}
So \(\vect{x}\neq\zerovector\) is an eigenvector of \(A^{s+1}\) for \(\lambda^{s+1}\text{,}\) and induction tells us the theorem is true for all \(s\geq 0\text{.}\)
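Again, a small Sage check (arbitrary matrix, arbitrary power) illustrates the claim:

    A = matrix(QQ, [[2, 1], [1, 2]])    # eigenvalues 3 and 1
    s = 4
    sorted((A^s).eigenvalues())         # [1, 81], that is, 1^4 and 3^4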
Theorem EPM. Eigenvalues of the Polynomial of a Matrix.
Suppose A is a square matrix and λ is an eigenvalue of A. Let q(x) be a polynomial in the variable x. Then q(λ) is an eigenvalue of the matrix q(A).
Proof.
Let \(\vect{x}\neq\zerovector\) be one eigenvector of \(A\) for \(\lambda\text{,}\) and write \(q(x)=a_0+a_1x+a_2x^2+\cdots+a_mx^m\text{.}\) Then
\begin{align*}
q(A)\vect{x}&=\left(a_0I_n+a_1A+a_2A^2+\cdots+a_mA^m\right)\vect{x}\\
&=a_0I_n\vect{x}+a_1A\vect{x}+a_2A^2\vect{x}+\cdots+a_mA^m\vect{x}&&\text{Theorem MMDAA}\\
&=a_0\vect{x}+a_1\lambda\vect{x}+a_2\lambda^2\vect{x}+\cdots+a_m\lambda^m\vect{x}&&\text{Theorem EOMP}\\
&=\left(a_0+a_1\lambda+a_2\lambda^2+\cdots+a_m\lambda^m\right)\vect{x}\\
&=q(\lambda)\vect{x}\text{.}
\end{align*}
So \(\vect{x}\neq\zerovector\) is an eigenvector of \(q(A)\) for the eigenvalue \(q(\lambda)\text{.}\)
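Sage evaluates polynomials at square matrices directly, so an illustrative check with an arbitrary matrix and polynomial is short:

    A = matrix(QQ, [[2, 1], [1, 2]])    # eigenvalues 3 and 1
    x = polygen(QQ)
    q = x^2 - 3*x + 5                   # an arbitrary polynomial
    q(3), q(1)                          # 5 and 3
    sorted(q(A).eigenvalues())          # [3, 5] -- exactly q(1) and q(3)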
Theorem EIM. Eigenvalues of the Inverse of a Matrix.
Suppose \(A\) is a square nonsingular matrix and \(\lambda\) is an eigenvalue of \(A\text{.}\) Then \(\lambda^{-1}\) is an eigenvalue of the matrix \(\inverse{A}\text{.}\)
Proof.
Notice that since \(A\) is assumed nonsingular, \(\inverse{A}\) exists by Theorem NI, but more importantly, \(\lambda^{-1}=1/\lambda\) does not involve division by zero since Theorem SMZE prohibits this possibility.
Let \(\vect{x}\neq\zerovector\) be one eigenvector of \(A\) for \(\lambda\text{.}\) Suppose \(A\) has size \(n\text{.}\) Then
\begin{align*}
\vect{x}&=I_n\vect{x}&&\text{Theorem MMIM}\\
&=\left(\inverse{A}A\right)\vect{x}&&\text{Definition MI}\\
&=\inverse{A}\left(A\vect{x}\right)&&\text{Theorem MMA}\\
&=\inverse{A}\left(\lambda\vect{x}\right)&&\text{Definition EEM}\\
&=\lambda\left(\inverse{A}\vect{x}\right)&&\text{Theorem MMSMM}
\end{align*}
and multiplying both sides by \(\frac{1}{\lambda}\) gives \(\inverse{A}\vect{x}=\frac{1}{\lambda}\vect{x}\text{.}\) So \(\vect{x}\neq\zerovector\) is an eigenvector of \(\inverse{A}\) for the eigenvalue \(\frac{1}{\lambda}\text{.}\)
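With an arbitrary nonsingular matrix, Sage makes the reciprocal pattern visible:

    A = matrix(QQ, [[2, 1], [1, 2]])    # nonsingular, eigenvalues 3 and 1
    sorted(A.eigenvalues()), sorted(A.inverse().eigenvalues())
    # ([1, 3], [1/3, 1]) -- the eigenvalues of the inverse are reciprocals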
Theorem ERMCP. Eigenvalues of Real Matrices come in Conjugate Pairs.
Suppose \(A\) is a square matrix with real entries and \(\vect{x}\) is an eigenvector of \(A\) for the eigenvalue \(\lambda\text{.}\) Then \(\conjugate{\vect{x}}\) is an eigenvector of \(A\) for the eigenvalue \(\conjugate{\lambda}\text{.}\)
Proof.
We have
\begin{align*}
A\conjugate{\vect{x}}&=\conjugate{A}\,\conjugate{\vect{x}}&&A\text{ has real entries}\\
&=\conjugate{A\vect{x}}&&\text{Theorem MMCC}\\
&=\conjugate{\lambda\vect{x}}&&\text{Definition EEM}\\
&=\conjugate{\lambda}\,\conjugate{\vect{x}}\text{.}&&\text{Theorem CRSM}
\end{align*}
So \(\conjugate{\vect{x}}\) is an eigenvector of \(A\) for the eigenvalue \(\conjugate{\lambda}\text{.}\)
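A real matrix with complex eigenvalues shows the pairing immediately; the rotation matrix here is an arbitrary choice:

    A = matrix(QQ, [[0, -1], [1, 0]])   # real entries (rotation by a quarter turn)
    A.eigenvalues()                     # the conjugate pair I and -I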
Subsection EHM Eigenvalues of Hermitian Matrices
Recall that a matrix is Hermitian (or self-adjoint) if \(A=\adjoint{A}\) (Definition HM). In the case where \(A\) is a matrix whose entries are all real numbers, being Hermitian is identical to being symmetric (Definition SYM). Keep this in mind as you read the next two theorems. Their hypotheses could be changed to “suppose \(A\) is a real symmetric matrix.”
Theorem HMRE. Hermitian Matrices have Real Eigenvalues.
Suppose that \(A\) is a Hermitian matrix and \(\lambda\) is an eigenvalue of \(A\text{.}\) Then \(\lambda\in{\mathbb R}\text{.}\)
Proof.
Let \(\vect{x}\neq\zerovector\) be one eigenvector of \(A\) for the eigenvalue \(\lambda\text{.}\) Then
\begin{align*}
\lambda\innerproduct{\vect{x}}{\vect{x}}&=\innerproduct{\vect{x}}{\lambda\vect{x}}&&\text{Theorem IPSM}\\
&=\innerproduct{\vect{x}}{A\vect{x}}&&\text{Definition EEM}\\
&=\innerproduct{A\vect{x}}{\vect{x}}&&\text{Theorem HMIP}\\
&=\innerproduct{\lambda\vect{x}}{\vect{x}}&&\text{Definition EEM}\\
&=\conjugate{\lambda}\innerproduct{\vect{x}}{\vect{x}}&&\text{Theorem IPSM}
\end{align*}
so \(\left(\lambda-\conjugate{\lambda}\right)\innerproduct{\vect{x}}{\vect{x}}=0\text{.}\)
Since \(\vect{x}\neq\zerovector\text{,}\) by Theorem PIP we know \(\innerproduct{\vect{x}}{\vect{x}}\neq 0\text{.}\) Then by Theorem ZPZF, \(\lambda - \conjugate{\lambda}=0\text{,}\) and so \(\lambda = \conjugate{\lambda}\text{.}\) If a complex number is equal to its conjugate, then its imaginary part is zero, and therefore it is a real number.
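A small Hermitian matrix, chosen arbitrarily for illustration, confirms the phenomenon in Sage:

    A = matrix(QQbar, [[2, 1 + I], [1 - I, 3]])   # a small Hermitian matrix
    A == A.conjugate_transpose()                  # True, so A is Hermitian
    A.eigenvalues()                               # [1, 4] -- both real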
Theorem HMOE. Hermitian Matrices have Orthogonal Eigenvectors.
Suppose that A is a Hermitian matrix and x and y are two eigenvectors of A for different eigenvalues. Then x and y are orthogonal vectors.
Proof.
Let \(\vect{x}\) be an eigenvector of \(A\) for \(\lambda\) and let \(\vect{y}\) be an eigenvector of \(A\) for a different eigenvalue \(\rho\text{.}\) So we have \(\lambda-\rho\neq 0\text{.}\) Notice that \(\lambda\) and \(\rho\) are real numbers by Theorem HMRE, so conjugation has no effect on them. Then
\begin{align*}
\lambda\innerproduct{\vect{x}}{\vect{y}}&=\conjugate{\lambda}\innerproduct{\vect{x}}{\vect{y}}&&\text{Theorem HMRE}\\
&=\innerproduct{\lambda\vect{x}}{\vect{y}}&&\text{Theorem IPSM}\\
&=\innerproduct{A\vect{x}}{\vect{y}}&&\text{Definition EEM}\\
&=\innerproduct{\vect{x}}{A\vect{y}}&&\text{Theorem HMIP}\\
&=\innerproduct{\vect{x}}{\rho\vect{y}}&&\text{Definition EEM}\\
&=\rho\innerproduct{\vect{x}}{\vect{y}}&&\text{Theorem IPSM}
\end{align*}
so \(\left(\lambda-\rho\right)\innerproduct{\vect{x}}{\vect{y}}=0\text{.}\)
Because \(\lambda\) and \(\rho\) are presumed to be different, \(\lambda-\rho\neq 0\text{,}\) and Theorem ZPZF implies that \(\innerproduct{\vect{x}}{\vect{y}}=0\text{.}\) In other words, \(\vect{x}\) and \(\vect{y}\) are orthogonal vectors according to Definition OV.
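The same Hermitian matrix used above serves to check orthogonality in Sage (an illustrative sketch, not the text's own example):

    A = matrix(QQbar, [[2, 1 + I], [1 - I, 3]])   # Hermitian, eigenvalues 1 and 4
    evs = A.eigenvectors_right()
    x = evs[0][1][0]                              # an eigenvector for one eigenvalue
    y = evs[1][1][0]                              # an eigenvector for the other
    x.hermitian_inner_product(y)                  # 0 -- the eigenvectors are orthogonal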
Reading Questions PEE Reading Questions
1.
How can you identify a nonsingular matrix just by looking at its eigenvalues?
2.
How many different eigenvalues may a square matrix of size n have?
3.
What is amazing about the eigenvalues of a Hermitian matrix and why is it amazing?
Exercises PEE Exercises
M10.
Grab several square matrices whose eigenvalues you know (previous examples or exercises are fine), form the transpose \(\transpose{A}\text{,}\) and compute its eigenvalues. After a few examples, formulate a conjecture.
You should very quickly observe that \(A\) and \(\transpose{A}\) have identical eigenvalues. However, their eigenvectors are unpredictably different. So our techniques in this section will not lead to a viable proof. Similarity, a topic in this chapter (Section SD), and specifically Theorem PSMS, is an avenue for a proof but requires some advanced topics that we cannot address. Finally, in our last chapter, the determinant (Chapter D) provides a proof (Theorem ETM).
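Sage makes the experiment quick; the matrix below is an arbitrary starting point for your own trials:

    A = matrix(QQ, [[1, 2, 3], [4, 5, 6], [7, 8, 9]])             # any square matrix will do
    A.charpoly() == A.transpose().charpoly()                      # True: identical eigenvalues
    A.eigenvectors_right() == A.transpose().eigenvectors_right()  # False in general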
M20.
This exercise will show we can use a polynomial to convert one matrix into another, with predictable changes in its eigenvalues. In Example ESMS4 the \(4\times 4\) symmetric matrix \(C\) is shown to have the three eigenvalues \(\lambda=3,1,-1\text{.}\) Suppose we wanted a \(4\times 4\) matrix that has the three eigenvalues \(\lambda=4,0,-2\text{.}\) We can employ Theorem EPM by finding a polynomial that converts \(3\) to \(4\text{,}\) \(1\) to \(0\text{,}\) and \(-1\) to \(-2\text{.}\) Such a polynomial is called an interpolating polynomial, and in this example we can use
\begin{equation*}
r(x)=\frac{1}{4}x^2+x-\frac{5}{4}\text{.}
\end{equation*}
(a)
Verify that the polynomial r(x) converts the eigenvalues as advertised.
We will not discuss how to concoct the interpolating polynomial, \(r(x)\text{,}\) but a text on numerical analysis should provide the details. For now, it should be routine to verify that \(r(3)=4\text{,}\) \(r(1)=0\) and \(r(-1)=-2\text{.}\)
(b)
In the style of Example PM, compute the matrix r(C).
Now compute
\begin{equation*}
r(C)=\frac{1}{4}C^2+C-\frac{5}{4}I_4\text{.}
\end{equation*}
(c)
Compute the eigenvalues of r(C) directly and verify that they are as expected.
The eigenvalues of \(r(C)\) are \(\lambda=4,0,-2\text{,}\) as Theorem EPM predicts. Notice that the multiplicities are the same, and the eigenspaces of \(C\) and \(r(C)\) are identical.
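Since the entries of \(C\) are not reproduced above, the Sage sketch below substitutes a stand-in diagonal matrix \(D\) with the same eigenvalues \(\lambda=3,1,-1\text{;}\) the conversion of eigenvalues by \(r(x)\) works identically.

    # a stand-in with eigenvalues 3, 1, -1 (not the C of Example ESMS4)
    D = matrix(QQ, [[3, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, -1]])
    rD = (1/4)*D^2 + D - (5/4)*identity_matrix(QQ, 4)
    sorted(rD.eigenvalues())            # [-2, 0, 0, 4] -- eigenvalues converted as promised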
T20.
Suppose that A is a square matrix. Prove that a single vector may not be an eigenvector of A for two different eigenvalues.
Suppose that the vector \(\vect{x}\neq\zerovector\) is an eigenvector of \(A\) for the two eigenvalues \(\lambda\) and \(\rho\text{,}\) where \(\lambda\neq\rho\text{.}\) Then \(\lambda-\rho\neq 0\text{,}\) and we also have
\begin{align*}
\left(\lambda-\rho\right)\vect{x}&=\lambda\vect{x}-\rho\vect{x}\\
&=A\vect{x}-A\vect{x}&&\text{Definition EEM}\\
&=\zerovector\text{.}
\end{align*}
By Theorem SMEZV, either \(\lambda-\rho=0\) or \(\vect{x}=\zerovector\text{,}\) which are both contradictions.
T22.
Suppose that \(U\) is a unitary matrix with eigenvalue \(\lambda\text{.}\) Prove that \(\lambda\) has modulus 1, i.e., \(\modulus{\lambda}=1\text{.}\) This says that all of the eigenvalues of a unitary matrix lie on the unit circle of the complex plane.
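Before writing the proof, you might gather numerical evidence in Sage; the quarter-turn rotation below is an arbitrary unitary example.

    U = matrix(QQbar, [[0, -1], [1, 0]])                        # real orthogonal, hence unitary
    U * U.conjugate_transpose() == identity_matrix(QQbar, 2)    # True
    [abs(ev) for ev in U.eigenvalues()]                         # [1, 1] -- all on the unit circle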