Section EE Eigenvalues and Eigenvectors

In this section, we will define the eigenvalues and eigenvectors of a matrix, and see how to compute them. More theoretical properties will be taken up in the next section.

Subsection EEM Eigenvalues and Eigenvectors of a Matrix

We start with the principal definition for this chapter.

Definition EEM Eigenvalues and Eigenvectors of a Matrix

Suppose that $A$ is a square matrix of size $n$, $\vect{x}\neq\zerovector$ is a vector in $\complex{n}$, and $\lambda$ is a scalar in $\complex{\null}$. Then we say $\vect{x}$ is an eigenvector of $A$ with eigenvalue $\lambda$ if \begin{equation*} A\vect{x}=\lambda\vect{x} \end{equation*}

Before going any further, perhaps we should convince you that such things ever happen at all. Understand the next example, but do not concern yourself with where the pieces come from. We will have methods soon enough to be able to discover these eigenvectors ourselves.

Example SEE Some eigenvalues and eigenvectors
Sage EE Eigenvalues and Eigenvectors
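
As a quick check of Definition EEM, the following Sage sketch verifies one eigenvalue-eigenvector pair directly against the defining equation. The matrix and vector are illustrative choices of our own, not data drawn from Example SEE.

    # Verify Definition EEM directly: A*x equals lambda*x
    A = matrix(QQ, [[2, 1], [1, 2]])   # a sample matrix of our own choosing
    x = vector(QQ, [1, 1])             # a candidate eigenvector
    lam = 3                            # the claimed eigenvalue
    A*x == lam*x                       # True: x is an eigenvector of A for lambda = 3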

Example SEE hints at a number of intriguing properties, and there are many more. We will explore the general properties of eigenvalues and eigenvectors in Section PEE, but in this section we will concern ourselves with the question of actually computing eigenvalues and eigenvectors. First we need a bit of background material on polynomials and matrices.

Subsection PM Polynomials and Matrices

A polynomial is a combination of powers, multiplication by scalar coefficients, and addition (with subtraction just being the inverse of addition). We never have occasion to divide when computing the value of a polynomial. So it is with matrices. We can add and subtract matrices, we can multiply matrices by scalars, and we can form powers of square matrices by repeated applications of matrix multiplication. We do not normally divide matrices (though sometimes we can multiply by an inverse). If a matrix is square, all the operations constituting a polynomial will preserve the size of the matrix. So it is natural to consider evaluating a polynomial with a matrix, effectively replacing the variable of the polynomial by a matrix. We will demonstrate with an example.

Example PM Polynomial of a matrix
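
Here is a minimal Sage sketch of the idea, with a polynomial and matrix of our own choosing (Example PM carries out a similar computation with its own data). Note how the constant term of the polynomial becomes a scalar multiple of the identity matrix.

    # Evaluate p(x) = x^2 - 3x + 2 at a square matrix A
    A = matrix(QQ, [[1, 2], [0, 3]])   # a sample matrix of our own choosing
    I2 = identity_matrix(QQ, 2)
    byhand = A^2 - 3*A + 2*I2          # powers, scalar multiples, and sums of matrices
    R.<x> = QQ[]
    p = x^2 - 3*x + 2
    p(A) == byhand                     # True; Sage substitutes the matrix for x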

Subsection EEE Existence of Eigenvalues and Eigenvectors

Before we embark on computing eigenvalues and eigenvectors, we will prove that every matrix has at least one eigenvalue (and an eigenvector to go with it). Later, in Theorem MNEM, we will determine the maximum number of eigenvalues a matrix may have.

The determinant (Definition DM) will be a powerful tool in Subsection EE.CEE when it comes time to compute eigenvalues. However, it is possible, with some more advanced machinery, to compute eigenvalues without ever making use of the determinant. Sheldon Axler does just that in his book, Linear Algebra Done Right. Here and now, we give Axler's “determinant-free” proof that every matrix has an eigenvalue. The result is not too startling, but the proof is most enjoyable.

Theorem EMHE Every Matrix Has an Eigenvalue

Suppose $A$ is a square matrix. Then $A$ has at least one eigenvalue.

The proof of Theorem EMHE is constructive (it contains an unambiguous procedure that leads to an eigenvalue), but it is not meant to be practical. We will illustrate the theorem with an example, the purpose being to provide a companion for studying the proof and not to suggest this is the best procedure for computing an eigenvalue.

Example CAEHW Computing an eigenvalue the hard way

Subsection CEE Computing Eigenvalues and Eigenvectors

Fortunately, we need not rely on the procedure of Theorem EMHE each time we need an eigenvalue. It is the determinant, and specifically Theorem SMZD, that provides the main tool for computing eigenvalues. Here is an informal sequence of equivalences that is the key to determining the eigenvalues and eigenvectors of a matrix, \begin{equation*} A\vect{x}=\lambda\vect{x}\iff A\vect{x}-\lambda I_n\vect{x}=\zerovector\iff \left(A-\lambda I_n\right)\vect{x}=\zerovector \end{equation*}

So, for an eigenvalue $\lambda$ and associated eigenvector $\vect{x}\neq\zerovector$, the vector $\vect{x}$ will be a nonzero element of the null space of $A-\lambda I_n$, while the matrix $A-\lambda I_n$ will be singular and therefore have zero determinant. These ideas are made precise in Theorem EMRCP and Theorem EMNS, but for now this brief discussion should suffice as motivation for the following definition and example.
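
Before we state the definition, here is a small Sage check of the singularity criterion, using a sample matrix and scalars of our own choosing.

    # If lambda is an eigenvalue of A, then A - lambda*I is singular
    A = matrix(QQ, [[4, 1], [2, 3]])   # a sample matrix with eigenvalues 2 and 5
    I2 = identity_matrix(QQ, 2)
    (A - 5*I2).det() == 0              # True: 5 is an eigenvalue
    (A - 1*I2).det() == 0              # False: 1 is not an eigenvalue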

Definition CP Characteristic Polynomial

Suppose that $A$ is a square matrix of size $n$. Then the characteristic polynomial of $A$ is the polynomial $\charpoly{A}{x}$ defined by \begin{equation*} \charpoly{A}{x}=\detname{A-xI_n} \end{equation*}
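
Definition CP translates directly into Sage, as in the sketch below with a sample matrix of our own. One caution: Sage's built-in charpoly method computes $\detname{xI_n-A}$ instead, which agrees with Definition CP only up to a factor of $(-1)^n$.

    # Characteristic polynomial per Definition CP: det(A - x*I)
    n = 3
    A = matrix(QQ, [[1, 2, 0], [0, 1, 2], [2, 0, 1]])  # a sample matrix of our own
    R.<x> = QQ[]
    p = (A - x*identity_matrix(R, n)).det()
    p == (-1)^n * A.charpoly('x')      # True; Sage's convention is det(x*I - A)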

Example CPMS3 Characteristic polynomial of a matrix, size 3

The characteristic polynomial is our main computational tool for finding eigenvalues, and will sometimes be used to aid us in determining the properties of eigenvalues.

Theorem EMRCP Eigenvalues of a Matrix are Roots of Characteristic Polynomials

Suppose $A$ is a square matrix. Then $\lambda$ is an eigenvalue of $A$ if and only if $\charpoly{A}{\lambda}=0$.

Example EMS3 Eigenvalues of a matrix, size 3
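
In Sage, Theorem EMRCP reduces the computation of eigenvalues to root-finding, as the following sketch illustrates with a sample matrix of our own.

    # Eigenvalues are the roots of the characteristic polynomial
    A = matrix(QQ, [[4, 1], [2, 3]])   # same sample matrix as above
    R.<x> = QQ[]
    p = (A - x*identity_matrix(R, 2)).det()
    p.roots()          # the roots 2 and 5, each paired with its multiplicity
    A.eigenvalues()    # Sage's direct computation returns the same eigenvalues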

Let us now turn our attention to the computation of eigenvectors.

Definition EM Eigenspace of a Matrix

Suppose that $A$ is a square matrix and $\lambda$ is an eigenvalue of $A$. Then the eigenspace of $A$ for $\lambda$, $\eigenspace{A}{\lambda}$, is the set of all the eigenvectors of $A$ for $\lambda$, together with the inclusion of the zero vector.

Example SEE hinted that the set of eigenvectors for a single eigenvalue might have some closure properties, and with the addition of the one vector that is never an eigenvector, $\zerovector$, we indeed get a whole subspace.

Theorem EMS Eigenspace for a Matrix is a Subspace

Suppose $A$ is a square matrix of size $n$ and $\lambda$ is an eigenvalue of $A$. Then the eigenspace $\eigenspace{A}{\lambda}$ is a subspace of the vector space $\complex{n}$.

Theorem EMS tells us that an eigenspace is a subspace (and hence a vector space in its own right). Our next theorem tells us how to quickly construct this subspace.

Theorem EMNS Eigenspace of a Matrix is a Null Space

Suppose $A$ is a square matrix of size $n$ and $\lambda$ is an eigenvalue of $A$. Then \begin{equation*} \eigenspace{A}{\lambda}=\nsp{A-\lambda I_n} \end{equation*}

You might notice the close parallels (and differences) between the proofs of Theorem EMRCP and Theorem EMNS. Since Theorem EMNS describes the set of all the eigenvectors of $A$ as a null space we can use techniques such as Theorem BNS to provide concise descriptions of eigenspaces. Theorem EMNS also provides a trivial proof for Theorem EMS.
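
A short Sage sketch of Theorem EMNS, with the same sample matrix used above: the eigenspace appears as the right kernel of $A-\lambda I_n$.

    # The eigenspace for lambda = 2 is the null space of A - 2*I
    A = matrix(QQ, [[4, 1], [2, 3]])
    lam = 2
    E = (A - lam*identity_matrix(QQ, 2)).right_kernel()
    v = E.basis()[0]                   # a basis vector for the eigenspace
    A*v == lam*v                       # True, confirming v is an eigenvector for lambda = 2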

Example ESMS3 Eigenspaces of a matrix, size 3

Subsection ECEE Examples of Computing Eigenvalues and Eigenvectors

There are no theorems in this subsection, just a selection of examples meant to illustrate the range of possibilities for the eigenvalues and eigenvectors of a matrix. These examples can all be done by hand, though the computation of the characteristic polynomial would be very time-consuming and error-prone. It can also be difficult to factor an arbitrary polynomial, but since most of the eigenvalues in these examples are integers, the hunt for roots is easier than it might be in general. These examples are meant to look similar to a concatenation of Example CPMS3, Example EMS3, and Example ESMS3. First, we will sneak in a pair of definitions so we can illustrate them throughout this sequence of examples.

Definition AME Algebraic Multiplicity of an Eigenvalue

Suppose that $A$ is a square matrix and $\lambda$ is an eigenvalue of $A$. Then the algebraic multiplicity of $\lambda$, $\algmult{A}{\lambda}$, is the highest power of $(x-\lambda)$ that divides the characteristic polynomial, $\charpoly{A}{x}$.

Since an eigenvalue $\lambda$ is a root of the characteristic polynomial, there is always a factor of $(x-\lambda)$, and the algebraic multiplicity is just the power of this factor in a factorization of $\charpoly{A}{x}$. So in particular, $\algmult{A}{\lambda}\geq 1$. Compare the definition of algebraic multiplicity with the next definition.

Definition GME Geometric Multiplicity of an Eigenvalue

Suppose that $A$ is a square matrix and $\lambda$ is an eigenvalue of $A$. Then the geometric multiplicity of $\lambda$, $\geomult{A}{\lambda}$, is the dimension of the eigenspace $\eigenspace{A}{\lambda}$.

Every eigenvalue must have at least one eigenvector, so the associated eigenspace cannot be trivial, and so $\geomult{A}{\lambda}\geq 1$.
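
The two multiplicities can differ. The following Sage sketch, with a matrix of our own choosing, exhibits a repeated eigenvalue whose eigenspace is only one-dimensional; the examples that follow explore this range of behavior in more detail.

    # Algebraic vs. geometric multiplicity for a sample matrix
    A = matrix(QQ, [[2, 1], [0, 2]])   # our own choice, not from the examples below
    R.<x> = QQ[]
    p = (A - x*identity_matrix(R, 2)).det()
    factor(p)          # (x - 2)^2, so the algebraic multiplicity of 2 is 2
    I2 = identity_matrix(QQ, 2)
    (A - 2*I2).right_kernel().dimension()   # 1, so the geometric multiplicity of 2 is 1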

Example EMMS4 Eigenvalue multiplicities, matrix of size 4
Example ESMS4 Eigenvalues, symmetric matrix of size 4
Example HMEM5 High multiplicity eigenvalues, matrix of size 5
Example CEMS6 Complex eigenvalues, matrix of size 6
Example DEMS5 Distinct eigenvalues, matrix of size 5
Sage CEVAL Computing Eigenvalues
Sage CEVEC Computing Eigenvectors