Section PEE Properties of Eigenvalues and Eigenvectors

The previous section introduced eigenvalues and eigenvectors, and concentrated on their existence and determination. This section is more about theorems, and the various properties that eigenvalues and eigenvectors enjoy. Like a good $4\times 100\text{ meter}$ relay, we will lead off with one of our better theorems and save the very best for the anchor leg.

Subsection BPE Basic Properties of Eigenvalues

Theorem EDELI Eigenvectors with Distinct Eigenvalues are Linearly Independent

Suppose that $A$ is an $n\times n$ square matrix and $S=\set{\vectorlist{x}{p}}$ is a set of eigenvectors with eigenvalues $\scalarlist{\lambda}{p}$ such that $\lambda_i\neq\lambda_j$ whenever $i\neq j$. Then $S$ is a linearly independent set.
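
As a quick Sage sketch of this theorem (the matrix below is our own illustrative choice, not one from the text), we can collect one eigenvector for each distinct eigenvalue and confirm linear independence by checking the rank of the matrix the eigenvectors form.

    # A has distinct eigenvalues 2 and 5
    A = matrix(QQ, [[4, 1], [2, 3]])
    # one eigenvector per eigenvalue, from the (eigenvalue, vectors, multiplicity) triples
    vecs = [triple[1][0] for triple in A.eigenvectors_right()]
    # full rank means the set of eigenvectors is linearly independent
    matrix(vecs).rank() == len(vecs)    # True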

There is a simple connection between the eigenvalues of a matrix and whether or not the matrix is nonsingular.

Theorem SMZE Singular Matrices have Zero Eigenvalues

Suppose $A$ is a square matrix. Then $A$ is singular if and only if $\lambda=0$ is an eigenvalue of $A$.
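
A minimal Sage check of this equivalence, using a rank-deficient matrix of our own choosing:

    A = matrix(QQ, [[1, 2], [2, 4]])    # second row is a multiple of the first
    A.is_singular()                     # True
    0 in A.eigenvalues()                # True, as Theorem SMZE predicts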

With an equivalence about singular matrices we can update our list of equivalences about nonsingular matrices.

Theorem NME8 Nonsingular Matrix Equivalences, Round 8

Suppose that $A$ is a square matrix of size $n$. The following are equivalent.

  1. $A$ is nonsingular.
  2. $A$ row-reduces to the identity matrix.
  3. The null space of $A$ contains only the zero vector, $\nsp{A}=\set{\zerovector}$.
  4. The linear system $\linearsystem{A}{\vect{b}}$ has a unique solution for every possible choice of $\vect{b}$.
  5. The columns of $A$ are a linearly independent set.
  6. $A$ is invertible.
  7. The column space of $A$ is $\complex{n}$, $\csp{A}=\complex{n}$.
  8. The columns of $A$ are a basis for $\complex{n}$.
  9. The rank of $A$ is $n$, $\rank{A}=n$.
  10. The nullity of $A$ is zero, $\nullity{A}=0$.
  11. The determinant of $A$ is nonzero, $\detname{A}\neq 0$.
  12. $\lambda=0$ is not an eigenvalue of $A$.

Sage NME8 Nonsingular Matrix Equivalences, Round 8
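
As a sketch of how these equivalences can be verified computationally (the matrix here is our own example, assuming a Sage environment), each condition can be tested directly and all of them agree:

    A = matrix(QQ, [[1, 2], [0, 3]])    # a nonsingular example
    checks = [A.is_invertible(),
              A.rank() == A.nrows(),
              A.right_kernel().dimension() == 0,
              A.det() != 0,
              0 not in A.eigenvalues()]
    all(checks)                         # True: the conditions agree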

Certain changes to a matrix change its eigenvalues in a predictable way.

Theorem ESMM Eigenvalues of a Scalar Multiple of a Matrix

Suppose $A$ is a square matrix and $\lambda$ is an eigenvalue of $A$. Then $\alpha\lambda$ is an eigenvalue of $\alpha A$.
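
For instance, in Sage (with an illustrative matrix of our own), scaling the matrix by $\alpha=3$ scales each eigenvalue by 3:

    A = matrix(QQ, [[4, 1], [2, 3]])
    A.eigenvalues()          # 2 and 5
    (3*A).eigenvalues()      # 6 and 15, each original eigenvalue times 3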

Unfortunately, there are no parallel theorems about the sum or product of arbitrary matrices. But we can prove a similar result for powers of a matrix.

Theorem EOMP Eigenvalues Of Matrix Powers

Suppose $A$ is a square matrix, $\lambda$ is an eigenvalue of $A$, and $s\geq 0$ is an integer. Then $\lambda^s$ is an eigenvalue of $A^s$.
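
A quick Sage check, cubing a matrix of our own choosing with eigenvalues 2 and 5:

    A = matrix(QQ, [[4, 1], [2, 3]])    # eigenvalues 2 and 5
    (A^3).eigenvalues()                 # 8 and 125, i.e. 2^3 and 5^3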

While we cannot prove that the sum of two arbitrary matrices behaves in any reasonable way with regard to eigenvalues, we can work with the sum of dissimilar powers of the same matrix. We have already seen two connections between eigenvalues and polynomials, in the proof of Theorem EMHE and the characteristic polynomial (Definition CP). Our next theorem strengthens this connection.

Theorem EPM Eigenvalues of the Polynomial of a Matrix

Suppose $A$ is a square matrix and $\lambda$ is an eigenvalue of $A$. Let $q(x)$ be a polynomial in the variable $x$. Then $q(\lambda)$ is an eigenvalue of the matrix $q(A)$.
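
As a small Sage sketch (both the matrix and the polynomial are our own choices), we can evaluate a polynomial at a matrix and compare eigenvalues:

    R.<x> = QQ[]
    q = 2*x^2 - 3*x + 7
    A = matrix(QQ, [[4, 1], [2, 3]])    # eigenvalues 2 and 5
    q(A).eigenvalues()                  # q(2) = 9 and q(5) = 42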

Example BDE Building desired eigenvalues

Inverses and transposes also behave predictably with regard to their eigenvalues.

Theorem EIM Eigenvalues of the Inverse of a Matrix

Suppose $A$ is a square nonsingular matrix and $\lambda$ is an eigenvalue of $A$. Then $\lambda^{-1}$ is an eigenvalue of the matrix $\inverse{A}$.
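
In Sage, with the same illustrative matrix as above:

    A = matrix(QQ, [[4, 1], [2, 3]])    # nonsingular, eigenvalues 2 and 5
    A.inverse().eigenvalues()           # 1/2 and 1/5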

The proofs of the theorems above have a similar style to them. They all begin by grabbing an eigenvalue-eigenvector pair and adjusting it in some way to reach the desired conclusion. You should add this to your toolkit as a general approach to proving theorems about eigenvalues.

So far we have been able to reserve the characteristic polynomial for strictly computational purposes. However, sometimes a theorem about eigenvalues can be proved easily by employing the characteristic polynomial (rather than using an eigenvalue-eigenvector pair). The next theorem is an example of this.

Theorem ETM Eigenvalues of the Transpose of a Matrix

Suppose $A$ is a square matrix and $\lambda$ is an eigenvalue of $A$. Then $\lambda$ is an eigenvalue of the matrix $\transpose{A}$.
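
This is easy to see in Sage, since a matrix and its transpose share a characteristic polynomial (again with our own example matrix):

    A = matrix(QQ, [[4, 1], [2, 3]])
    A.charpoly() == A.transpose().charpoly()    # True
    A.transpose().eigenvalues()                 # 2 and 5, same as A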

If a matrix has only real entries, then the computation of the characteristic polynomial (Definition CP) will result in a polynomial with coefficients that are real numbers. Complex numbers could result as roots of this polynomial, but they are roots of quadratic factors with real coefficients, and as such, come in conjugate pairs. The next theorem proves this, and a bit more, without mentioning the characteristic polynomial.

Theorem ERMCP Eigenvalues of Real Matrices come in Conjugate Pairs

Suppose $A$ is a square matrix with real entries and $\vect{x}$ is an eigenvector of $A$ for the eigenvalue $\lambda$. Then $\conjugate{\vect{x}}$ is an eigenvector of $A$ for the eigenvalue $\conjugate{\lambda}$.

This phenomenon is amply illustrated in Example CEMS6, where the four complex eigenvalues come in two pairs, and the two basis vectors of the eigenspaces are complex conjugates of each other. Theorem ERMCP can be a time-saver for computing eigenvalues and eigenvectors of real matrices with complex eigenvalues, since the conjugate eigenvalue and eigenspace can be inferred from the theorem rather than computed.
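
A short Sage illustration with a real matrix of our own that has no real eigenvalues:

    A = matrix(QQ, [[3, -2], [4, -1]])   # real entries
    A.eigenvalues()                      # 1 - 2*I and 1 + 2*I, a conjugate pair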

Subsection ME Multiplicities of Eigenvalues

A polynomial of degree $n$ has exactly $n$ complex roots, counted with multiplicity (this is the Fundamental Theorem of Algebra). From this fact about polynomial equations we can say more about the algebraic multiplicities of eigenvalues.

Theorem DCP Degree of the Characteristic Polynomial

Suppose that $A$ is a square matrix of size $n$. Then the characteristic polynomial of $A$, $\charpoly{A}{x}$, has degree $n$.

Theorem NEM Number of Eigenvalues of a Matrix

Suppose that $\scalarlist{\lambda}{k}$ are the distinct eigenvalues of a square matrix $A$ of size $n$. Then \begin{equation*} \sum_{i=1}^{k}\algmult{A}{\lambda_i}=n \end{equation*}
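
Both Theorem DCP and Theorem NEM are visible in a short Sage computation (the matrix is our own example):

    A = matrix(QQ, [[2, 1, 0], [0, 2, 0], [0, 0, 3]])
    p = A.charpoly()
    p.degree()      # 3, the size of A (Theorem DCP)
    p.factor()      # (x - 2)^2 * (x - 3): algebraic multiplicities 2 + 1 = 3 (Theorem NEM)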

Theorem ME Multiplicities of an Eigenvalue

Suppose that $A$ is a square matrix of size $n$ and $\lambda$ is an eigenvalue of $A$. Then \begin{equation*} 1\leq\geomult{A}{\lambda}\leq\algmult{A}{\lambda}\leq n \end{equation*}
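
The gap between the two multiplicities is easy to exhibit in Sage with a non-diagonalizable example of our own:

    A = matrix(QQ, [[2, 1], [0, 2]])    # eigenvalue 2 with algebraic multiplicity 2
    # geometric multiplicity: dimension of the eigenspace for lambda = 2
    (A - 2*identity_matrix(2)).right_kernel().dimension()    # 1
    # so 1 <= 1 <= 2 <= 2, consistent with Theorem ME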

Theorem MNEM Maximum Number of Eigenvalues of a Matrix

Suppose that $A$ is a square matrix of size $n$. Then $A$ cannot have more than $n$ distinct eigenvalues.

Subsection EHM Eigenvalues of Hermitian Matrices

Recall that a matrix $A$ is Hermitian (or self-adjoint) if $A=\adjoint{A}$ (Definition HM). In the case where $A$ is a matrix whose entries are all real numbers, being Hermitian is identical to being symmetric (Definition SYM). Keep this in mind as you read the next two theorems. Their hypotheses could be changed to “suppose $A$ is a real symmetric matrix.”

Theorem HMRE Hermitian Matrices have Real Eigenvalues

Suppose that $A$ is a Hermitian matrix and $\lambda$ is an eigenvalue of $A$. Then $\lambda\in{\mathbb R}$.

Notice the appealing symmetry to the justifications given for the steps of this proof. In the center is the ability to pitch a Hermitian matrix from one side of the inner product to the other.
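
A small Sage check with a Hermitian matrix of our own:

    A = matrix(QQbar, [[2, 3 + I], [3 - I, 5]])
    A.is_hermitian()    # True
    A.eigenvalues()     # 0 and 7, both real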

Look back and compare Example ESMS4 and Example CEMS6. In Example CEMS6 the matrix has only real entries, yet the characteristic polynomial has roots that are complex numbers, and so the matrix has complex eigenvalues. However, in Example ESMS4, the matrix has only real entries, but is also symmetric, and hence Hermitian. So by Theorem HMRE, we were guaranteed eigenvalues that are real numbers.

In many physical problems, a matrix of interest will be real and symmetric, or Hermitian. Then if the eigenvalues are to represent physical quantities of interest, Theorem HMRE guarantees that these values will be real numbers.

The eigenvectors of a Hermitian matrix also enjoy a pleasing property that we will exploit later.

Theorem HMOE Hermitian Matrices have Orthogonal Eigenvectors

Suppose that $A$ is a Hermitian matrix and $\vect{x}$ and $\vect{y}$ are two eigenvectors of $A$ for different eigenvalues. Then $\vect{x}$ and $\vect{y}$ are orthogonal vectors.

Notice again how the key step in this proof is the fundamental property of a Hermitian matrix (Theorem HMIP) — the ability to swap $A$ across the two arguments of the inner product. We will build on these results and continue to see some more interesting properties in Section OD.
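
Continuing the Sage sketch from above (same illustrative Hermitian matrix), eigenvectors for the two distinct eigenvalues are orthogonal under the Hermitian inner product:

    A = matrix(QQbar, [[2, 3 + I], [3 - I, 5]])   # Hermitian, eigenvalues 0 and 7
    evs = A.eigenvectors_right()
    x = evs[0][1][0]                              # an eigenvector for one eigenvalue
    y = evs[1][1][0]                              # an eigenvector for the other
    x.hermitian_inner_product(y)                  # 0, so x and y are orthogonal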