\(\newcommand{\orderof}[1]{\sim #1} \newcommand{\Z}{\mathbb{Z}} \newcommand{\reals}{\mathbb{R}} \newcommand{\real}[1]{\mathbb{R}^{#1}} \newcommand{\complexes}{\mathbb{C}} \newcommand{\complex}[1]{\mathbb{C}^{#1}} \newcommand{\conjugate}[1]{\overline{#1}} \newcommand{\modulus}[1]{\left\lvert#1\right\rvert} \newcommand{\zerovector}{\vect{0}} \newcommand{\zeromatrix}{\mathcal{O}} \newcommand{\innerproduct}[2]{\left\langle#1,\,#2\right\rangle} \newcommand{\norm}[1]{\left\lVert#1\right\rVert} \newcommand{\dimension}[1]{\dim\left(#1\right)} \newcommand{\nullity}[1]{n\left(#1\right)} \newcommand{\rank}[1]{r\left(#1\right)} \newcommand{\ds}{\oplus} \newcommand{\detname}[1]{\det\left(#1\right)} \newcommand{\detbars}[1]{\left\lvert#1\right\rvert} \newcommand{\trace}[1]{t\left(#1\right)} \newcommand{\sr}[1]{#1^{1/2}} \newcommand{\spn}[1]{\left\langle#1\right\rangle} \newcommand{\nsp}[1]{\mathcal{N}\!\left(#1\right)} \newcommand{\csp}[1]{\mathcal{C}\!\left(#1\right)} \newcommand{\rsp}[1]{\mathcal{R}\!\left(#1\right)} \newcommand{\lns}[1]{\mathcal{L}\!\left(#1\right)} \newcommand{\per}[1]{#1^\perp} \newcommand{\augmented}[2]{\left\lbrack\left.#1\,\right\rvert\,#2\right\rbrack} \newcommand{\linearsystem}[2]{\mathcal{LS}\!\left(#1,\,#2\right)} \newcommand{\homosystem}[1]{\linearsystem{#1}{\zerovector}} \newcommand{\rowopswap}[2]{R_{#1}\leftrightarrow R_{#2}} \newcommand{\rowopmult}[2]{#1R_{#2}} \newcommand{\rowopadd}[3]{#1R_{#2}+R_{#3}} \newcommand{\leading}[1]{\boxed{#1}} \newcommand{\rref}{\xrightarrow{\text{RREF}}} \newcommand{\elemswap}[2]{E_{#1,#2}} \newcommand{\elemmult}[2]{E_{#2}\left(#1\right)} \newcommand{\elemadd}[3]{E_{#2,#3}\left(#1\right)} \newcommand{\scalarlist}[2]{{#1}_{1},\,{#1}_{2},\,{#1}_{3},\,\ldots,\,{#1}_{#2}} \newcommand{\vect}[1]{\mathbf{#1}} \newcommand{\colvector}[1]{\begin{bmatrix}#1\end{bmatrix}} \newcommand{\vectorcomponents}[2]{\colvector{#1_{1}\\#1_{2}\\#1_{3}\\\vdots\\#1_{#2}}} \newcommand{\vectorlist}[2]{\vect{#1}_{1},\,\vect{#1}_{2},\,\vect{#1}_{3},\,\ldots,\,\vect{#1}_{#2}} \newcommand{\vectorentry}[2]{\left\lbrack#1\right\rbrack_{#2}} \newcommand{\matrixentry}[2]{\left\lbrack#1\right\rbrack_{#2}} \newcommand{\lincombo}[3]{#1_{1}\vect{#2}_{1}+#1_{2}\vect{#2}_{2}+#1_{3}\vect{#2}_{3}+\cdots +#1_{#3}\vect{#2}_{#3}} \newcommand{\matrixcolumns}[2]{\left\lbrack\vect{#1}_{1}|\vect{#1}_{2}|\vect{#1}_{3}|\ldots|\vect{#1}_{#2}\right\rbrack} \newcommand{\transpose}[1]{#1^{t}} \newcommand{\inverse}[1]{#1^{-1}} \newcommand{\submatrix}[3]{#1\left(#2|#3\right)} \newcommand{\adj}[1]{\transpose{\left(\conjugate{#1}\right)}} \newcommand{\adjoint}[1]{#1^\ast} \newcommand{\set}[1]{\left\{#1\right\}} \newcommand{\setparts}[2]{\left\lbrace#1\,\middle|\,#2\right\rbrace} \newcommand{\card}[1]{\left\lvert#1\right\rvert} \newcommand{\setcomplement}[1]{\overline{#1}} \newcommand{\charpoly}[2]{p_{#1}\left(#2\right)} \newcommand{\eigenspace}[2]{\mathcal{E}_{#1}\left(#2\right)} \newcommand{\eigensystem}[3]{\lambda&=#2&\eigenspace{#1}{#2}&=\spn{\set{#3}}} \newcommand{\geneigenspace}[2]{\mathcal{G}_{#1}\left(#2\right)} \newcommand{\algmult}[2]{\alpha_{#1}\left(#2\right)} \newcommand{\geomult}[2]{\gamma_{#1}\left(#2\right)} \newcommand{\indx}[2]{\iota_{#1}\left(#2\right)} \newcommand{\ltdefn}[3]{#1\colon #2\rightarrow#3} \newcommand{\lteval}[2]{#1\left(#2\right)} \newcommand{\ltinverse}[1]{#1^{-1}} \newcommand{\restrict}[2]{{#1}|_{#2}} \newcommand{\preimage}[2]{#1^{-1}\left(#2\right)} \newcommand{\rng}[1]{\mathcal{R}\!\left(#1\right)} \newcommand{\krn}[1]{\mathcal{K}\!\left(#1\right)} 
\newcommand{\compose}[2]{{#1}\circ{#2}} \newcommand{\vslt}[2]{\mathcal{LT}\left(#1,\,#2\right)} \newcommand{\isomorphic}{\cong} \newcommand{\similar}[2]{\inverse{#2}#1#2} \newcommand{\vectrepname}[1]{\rho_{#1}} \newcommand{\vectrep}[2]{\lteval{\vectrepname{#1}}{#2}} \newcommand{\vectrepinvname}[1]{\ltinverse{\vectrepname{#1}}} \newcommand{\vectrepinv}[2]{\lteval{\ltinverse{\vectrepname{#1}}}{#2}} \newcommand{\matrixrep}[3]{M^{#1}_{#2,#3}} \newcommand{\matrixrepcolumns}[4]{\left\lbrack \left.\vectrep{#2}{\lteval{#1}{\vect{#3}_{1}}}\right|\left.\vectrep{#2}{\lteval{#1}{\vect{#3}_{2}}}\right|\left.\vectrep{#2}{\lteval{#1}{\vect{#3}_{3}}}\right|\ldots\left|\vectrep{#2}{\lteval{#1}{\vect{#3}_{#4}}}\right.\right\rbrack} \newcommand{\cbm}[2]{C_{#1,#2}} \newcommand{\jordan}[2]{J_{#1}\left(#2\right)} \newcommand{\hadamard}[2]{#1\circ #2} \newcommand{\hadamardidentity}[1]{J_{#1}} \newcommand{\hadamardinverse}[1]{\widehat{#1}} \newcommand{\lt}{<} \newcommand{\gt}{>} \newcommand{\amp}{&} \)

Section NM Nonsingular Matrices

In this section we specialize further and consider matrices with equal numbers of rows and columns, which when considered as coefficient matrices lead to systems with equal numbers of equations and variables. We will see in the second half of the course (Chapter D, Chapter E, Chapter LT, Chapter R) that these matrices are especially important.

Subsection NM Nonsingular Matrices

Our theorems will now establish connections between systems of equations (homogeneous or otherwise), augmented matrices representing those systems, coefficient matrices, constant vectors, the reduced row-echelon form of matrices (augmented and coefficient) and solution sets. Be very careful in your reading, writing and speaking about systems of equations, matrices and sets of vectors. A system of equations is not a matrix, a matrix is not a solution set, and a solution set is not a system of equations. Now would be a great time to review the discussion about speaking and writing mathematics in Proof Technique L.

Definition SQM Square Matrix

A matrix with \(m\) rows and \(n\) columns is square if \(m=n\text{.}\) In this case, we say the matrix has size \(n\text{.}\) To emphasize the situation when a matrix is not square, we will call it rectangular.

We can now present one of the central definitions of linear algebra.

Definition NM Nonsingular Matrix

Suppose \(A\) is a square matrix. Suppose further that the solution set to the homogeneous linear system of equations \(\linearsystem{A}{\zerovector}\) is \(\set{\zerovector}\text{,}\) in other words, the system has only the trivial solution. Then we say that \(A\) is a nonsingular matrix. Otherwise we say \(A\) is a singular matrix.

We can investigate whether any square matrix is nonsingular or not, no matter if the matrix is derived somehow from a system of equations or if it is simply a matrix. The definition says that to perform this investigation we must construct a very specific system of equations (homogeneous, with the matrix as the coefficient matrix) and look at its solution set. We will have theorems in this section that connect nonsingular matrices with systems of equations, creating more opportunities for confusion. Convince yourself now of two observations, (1) we can decide nonsingularity for any square matrix, and (2) the determination of nonsingularity involves the solution set for a certain homogeneous system of equations.
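This investigation is easy to carry out by machine. The routine below is a minimal sketch in SymPy (our choice of tool here; any exact-arithmetic system would do, and the helper name is_nonsingular is ours). It applies Definition NM directly: the homogeneous system \(\linearsystem{A}{\zerovector}\) has only the trivial solution exactly when row-reducing \(A\) leaves no free variables, i.e. when every column is a pivot column.

```python
from sympy import Matrix

def is_nonsingular(A: Matrix) -> bool:
    """Definition NM, computationally: A is nonsingular exactly when
    the homogeneous system LS(A, 0) has only the trivial solution,
    i.e. when row-reducing A leaves no free variables."""
    if not A.is_square:
        raise ValueError("singular/nonsingular applies only to square matrices")
    _, pivots = A.rref()           # pivot columns of the coefficient matrix
    return len(pivots) == A.cols   # no free variables: only the trivial solution

print(is_nonsingular(Matrix([[1, 2], [3, 4]])))  # True
```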

Notice that it makes no sense to call a system of equations nonsingular (the term does not apply to a system of equations), nor does it make any sense to call a \(5\times 7\) matrix singular (the matrix is not square).

Notice that we will not discuss the coefficient matrix of Example HISAD as being singular or nonsingular, since that matrix is not square.

The next theorem combines with our main computational technique (row reducing a matrix) to make it easy to recognize a nonsingular matrix. But first a definition.

Definition IM Identity Matrix

The \(m\times m\) identity matrix, \(I_m\text{,}\) is defined by \begin{align*} \matrixentry{I_m}{ij}&= \begin{cases} 1 & \text{if }i=j\\ 0 & \text{if }i\neq j \end{cases} \end{align*} for \(1\leq i,\,j\leq m\text{.}\)

Notice that an identity matrix is square, and in reduced row-echelon form. Also, every column is a pivot column, and every possible pivot column appears once.
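These observations take only a moment to confirm computationally; here is a small SymPy check (a sketch, continuing with the tooling chosen above).

```python
from sympy import eye

I4 = eye(4)               # the 4 x 4 identity matrix I_4
R, pivots = I4.rref()     # rref() returns the reduced matrix and its pivot columns
print(R == I4)            # True: I_4 is already in reduced row-echelon form
print(pivots)             # (0, 1, 2, 3): every column is a pivot column
```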

Theorem NMRRI Nonsingular Matrices Row Reduce to the Identity Matrix

Suppose that \(A\) is a square matrix and \(B\) is a row-equivalent matrix in reduced row-echelon form. Then \(A\) is nonsingular if and only if \(B\) is the identity matrix.

Proof

Notice that since this theorem is an equivalence, it will always allow us to determine whether a square matrix is nonsingular or singular. Here are two examples of this, continuing our study of Archetype A and Archetype B.
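In the same spirit, here is a SymPy version of those two checks. The matrices below are the coefficient matrices of Archetype A and Archetype B as transcribed from the archetype appendix, so verify them against your copy.

```python
from sympy import Matrix, eye

coeffA = Matrix([[1, -1, 2], [2, 1, 1], [1, 1, 0]])     # Archetype A
coeffB = Matrix([[-7, -6, -12], [5, 5, 7], [1, 0, 4]])  # Archetype B

# Theorem NMRRI: nonsingular if and only if the rref is the identity matrix
print(coeffA.rref()[0] == eye(3))  # False: Archetype A is singular
print(coeffB.rref()[0] == eye(3))  # True:  Archetype B is nonsingular
```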

Subsection NSNM Null Space of a Nonsingular Matrix

Nonsingular matrices and their null spaces are intimately related, as the next two examples illustrate.

These two examples illustrate the next theorem, which is another equivalence.

Theorem NMTNS Nonsingular Matrices have Trivial Null Spaces

Suppose that \(A\) is a square matrix. Then \(A\) is nonsingular if and only if the null space of \(A\) is the set containing only the zero vector, i.e. \(\nsp{A}=\set{\zerovector}\text{.}\)

Proof
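Continuing with the same two archetype matrices, SymPy's nullspace method makes the theorem concrete (again a sketch, with the matrix entries as transcribed above).

```python
from sympy import Matrix

coeffA = Matrix([[1, -1, 2], [2, 1, 1], [1, 1, 0]])     # Archetype A, singular
coeffB = Matrix([[-7, -6, -12], [5, 5, 7], [1, 0, 4]])  # Archetype B, nonsingular

print(coeffB.nullspace())  # []: N(B) = {0}, as Theorem NMTNS demands
print(coeffA.nullspace())  # one basis vector, so the null space is infinite
```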

The next theorem pulls a lot of big ideas together. Theorem NMUS tells us that we can learn much about solutions to a system of linear equations with a square coefficient matrix by just examining a similar homogeneous system.

Theorem NMUS Nonsingular Matrices and Unique Solutions

Suppose that \(A\) is a square matrix. Then \(A\) is nonsingular if and only if the system \(\linearsystem{A}{\vect{b}}\) has a unique solution for every choice of the constant vector \(\vect{b}\text{.}\)

Proof

This theorem helps to explain part of our interest in nonsingular matrices. If a matrix is nonsingular, then no matter what vector of constants we pair it with, using the matrix as the coefficient matrix will always yield a linear system of equations with a solution, and the solution is unique. To determine if a matrix has this property (nonsingularity) it is enough to just solve one linear system, the homogeneous system with the matrix as coefficient matrix and the zero vector as the vector of constants (or any other vector of constants, see Exercise MM.T10).

Formulating the negation of the second part of this theorem is a good exercise. A singular matrix has the property that for some value of the vector \(\vect{b}\text{,}\) the system \(\linearsystem{A}{\vect{b}}\) does not have a unique solution (which means that it has no solution or infinitely many solutions). We will be able to say more about this case later (see the discussion following Theorem PSPHS).
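Both halves of this discussion can be watched in action. In the sketch below (SymPy again, with randMatrix supplying arbitrary constant vectors), the nonsingular Archetype B matrix yields a unique solution for every \(\vect{b}\) we try, while the singular Archetype A matrix paired with one particular \(\vect{b}\) has no solution at all.

```python
from sympy import Matrix, eye, randMatrix

coeffA = Matrix([[1, -1, 2], [2, 1, 1], [1, 1, 0]])     # singular
coeffB = Matrix([[-7, -6, -12], [5, 5, 7], [1, 0, 4]])  # nonsingular

for _ in range(5):
    b = randMatrix(3, 1, min=-9, max=9)      # an arbitrary vector of constants
    R, pivots = coeffB.row_join(b).rref()    # row-reduce the augmented matrix
    assert R[:, :3] == eye(3)                # [ I_3 | x ]: exactly one solution

R, pivots = coeffA.row_join(Matrix([1, 0, 0])).rref()
print(3 in pivots)  # True: a pivot in the final column, so no solution exists
```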

Square matrices that are nonsingular have a long list of interesting properties, which we will start to catalog in the following, recurring, theorem. Of course, singular matrices will then have all of the opposite properties. The following theorem is a list of equivalences.

We want to understand just what is involved in understanding and proving a theorem that says several conditions are equivalent. So have a look at Proof Technique ME before studying the first in this series of theorems.

Theorem NME1 Nonsingular Matrix Equivalences, Round 1

Suppose that \(A\) is a square matrix. The following are equivalent.

1. \(A\) is nonsingular.
2. \(A\) row-reduces to the identity matrix.
3. The null space of \(A\) contains only the zero vector, \(\nsp{A}=\set{\zerovector}\text{.}\)
4. The linear system \(\linearsystem{A}{\vect{b}}\) has a unique solution for every possible choice of \(\vect{b}\text{.}\)

Proof
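As a computational round-up (a sketch; the helper name and the dictionary labels are ours), each condition of the theorem can be tested independently, and the theorem promises the answers always agree.

```python
from sympy import Matrix, eye

def nme1_checks(A: Matrix) -> dict:
    """Test the equivalent conditions of Theorem NME1 for a square matrix A.
    The theorem guarantees the three computed answers always agree."""
    n = A.cols
    R, pivots = A.rref()
    return {
        "row-reduces to the identity (NMRRI)": R == eye(n),
        "trivial null space (NMTNS)": A.nullspace() == [],
        "unique solution for every b (NMUS)": len(pivots) == n,
    }

print(nme1_checks(Matrix([[-7, -6, -12], [5, 5, 7], [1, 0, 4]])))
# all three values are True: Archetype B is nonsingular
```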

Finally, you may have wondered why we refer to a matrix as nonsingular when it creates systems of equations with single solutions (Theorem NMUS)! I have wondered the same thing. We will have an opportunity to address this just before Theorem NPNT, and again when we get to Theorem SMZD. Can you wait that long?

Subsection Reading Questions

1

In your own words state the definition of a nonsingular matrix.

2

What is the easiest way to recognize if a square matrix is nonsingular or not?

3

Suppose we have a system of equations and its coefficient matrix is nonsingular. What can you say about the solution set for this system?

Subsection Exercises

In Exercises C30–C33 determine if the matrix is nonsingular or singular. Give reasons for your answer.

C30

\begin{equation*} \begin{bmatrix} -3 & 1 & 2 & 8\\ 2 & 0 & 3 & 4\\ 1 & 2 & 7 & -4\\ 5 & -1 & 2 & 0 \end{bmatrix} \end{equation*}

Solution
C31

\begin{equation*} \begin{bmatrix} 2 & 3 & 1 & 4\\ 1 & 1 & 1 & 0\\ -1 & 2 & 3 & 5\\ 1 & 2 & 1 & 3 \end{bmatrix} \end{equation*}

Solution
C32

\begin{equation*} \begin{bmatrix} 9 & 3 & 2 & 4\\ 5 & -6 & 1 & 3\\ 4 & 1 & 3 & -5 \end{bmatrix} \end{equation*}

Solution
C33

\begin{equation*} \begin{bmatrix} -1 & 2 & 0 & 3 \\ 1 & -3 & -2 & 4 \\ -2 & 0 & 4 & 3 \\ -3 & 1 & -2 & 3 \end{bmatrix} \end{equation*}

Solution
C40

Each of the archetypes below is a system of equations with a square coefficient matrix, or is itself a square matrix. Determine if these matrices are nonsingular or singular. Comment on the null space of each matrix.

Archetype A, Archetype B, Archetype F, Archetype K, Archetype L

C50

Find the null space of the matrix \(E\) below. \begin{align*} E&= \begin{bmatrix} 2 & 1 & -1 & -9 \\ 2 & 2 & -6 & -6 \\ 1 & 2 & -8 & 0 \\ -1 & 2 & -12 & 12 \end{bmatrix} \end{align*}

Solution
M30

Let \(A\) be the coefficient matrix of the system of equations below. Is \(A\) nonsingular or singular? Explain what you could infer about the solution set for the system based only on what you have learned about \(A\) being singular or nonsingular. \begin{align*} -x_1+5x_2&=-8\\ -2x_1+5x_2+5x_3+2x_4&=9\\ -3x_1-x_2+3x_3+x_4&=3\\ 7x_1+6x_2+5x_3+x_4&=30 \end{align*}

Solution

For Exercises M51–M52 say as much as possible about each system's solution set. Be sure to make it clear which theorems you are using to reach your conclusions.

M51

6 equations in 6 variables, singular coefficient matrix.

Solution
M52

A system with a nonsingular coefficient matrix, not homogeneous.

Solution
T10

Suppose that \(A\) is a square matrix, and \(B\) is a matrix in reduced row-echelon form that is row-equivalent to \(A\text{.}\) Prove that if \(A\) is singular, then the last row of \(B\) is a zero row.

Solution
T12

Using Definition RREF and Definition IM carefully, give a proof of the following equivalence: \(A\) is a square matrix in reduced row-echelon form where every column is a pivot column if and only if \(A\) is the identity matrix.

T30

Suppose that \(A\) is a nonsingular matrix and \(A\) is row-equivalent to the matrix \(B\text{.}\) Prove that \(B\) is nonsingular.

Solution
T31

Suppose that \(A\) is a square matrix of size \(n\times n\) and that we know there is a single vector \(\vect{b}\in\complex{n}\) such that the system \(\linearsystem{A}{\vect{b}}\) has a unique solution. Prove that \(A\) is a nonsingular matrix. (Notice that this is very similar to Theorem NMUS, but is not exactly the same.)

Solution
T90

Provide an alternative for the second half of the proof of Theorem NMUS, without appealing to properties of the reduced row-echelon form of the coefficient matrix. In other words, prove that if \(A\) is nonsingular, then \(\linearsystem{A}{\vect{b}}\) has a unique solution for every choice of the constant vector \(\vect{b}\text{.}\) Construct this proof without using Theorem REMEF or Theorem RREFU.

Solution