From A First Course in Linear Algebra

Version 1.08

© 2004.

Licensed under the GNU Free Documentation License.

http://linear.ups.edu/

In this section we specialize and consider matrices with equal numbers of rows
and columns, which when considered as coefficient matrices lead to systems with
equal numbers of equations and variables. We will see in the second half of the
course (Chapter D, Chapter E, Chapter LT, Chapter R) that these matrices are
especially important.

Our theorems will now establish connections between systems of equations (homogeneous or otherwise), augmented matrices representing those systems, coefficient matrices, constant vectors, the reduced row-echelon form of matrices (augmented and coefficient) and solution sets. Be very careful in your reading, writing and speaking about systems of equations, matrices and sets of vectors. A system of equations is not a matrix, a matrix is not a solution set, and a solution set is not a system of equations. Now would be a great time to review the discussion about speaking and writing mathematics in Technique L.

Definition SQM

Square Matrix

A matrix with $m$ rows and $n$ columns is square if $m=n$. In this case, we say the matrix has size $n$. To emphasize the situation when a matrix is not square, we will call it rectangular. $\triangle$

We can now present one of the central definitions of linear algebra.

Definition NM

Nonsingular Matrix

Suppose $A$ is a square matrix. Suppose further that the solution set to the homogeneous linear system of equations $\mathcal{LS}(A,\,0)$ is $\{0\}$, i.e. the system has only the trivial solution. Then we say that $A$ is a nonsingular matrix. Otherwise we say $A$ is a singular matrix. $\triangle$

We can investigate whether any square matrix is nonsingular or not, no matter if the matrix is derived somehow from a system of equations or if it is simply a matrix. The definition says that to perform this investigation we must construct a very specific system of equations (homogeneous, with the matrix as the coefficient matrix) and look at its solution set. We will have theorems in this section that connect nonsingular matrices with systems of equations, creating more opportunities for confusion. Convince yourself now of two observations: (1) we can decide nonsingularity for any square matrix, and (2) the determination of nonsingularity involves the solution set for a certain homogeneous system of equations.

Notice that it makes no sense to call a system of equations nonsingular (the term does not apply to a system of equations), nor does it make any sense to call a $5\times 7$ matrix singular (the matrix is not square).

Example S

A singular matrix, Archetype A

Example HISAA shows that the coefficient matrix derived from Archetype A, specifically
the $3\times 3$
matrix,

$$A=\begin{bmatrix} 1 & -1 & 2 \\ 2 & 1 & 1 \\ 1 & 1 & 0 \end{bmatrix}$$

is a singular matrix since there are nontrivial solutions to the homogeneous system $\mathcal{LS}(A,\,0)$. $\boxtimes$

Example NM

A nonsingular matrix, Archetype B

Example HUSAB shows that the coefficient matrix derived from Archetype B, specifically
the $3\times 3$
matrix,

$$B=\begin{bmatrix} -7 & -6 & -12 \\ 5 & 5 & 7 \\ 1 & 0 & 4 \end{bmatrix}$$

is a nonsingular matrix since the homogeneous system, $\mathcal{LS}(B,\,0)$, has only the trivial solution. $\boxtimes$

Notice that we will not discuss Example HISAD as being a singular or nonsingular coefficient matrix since the matrix is not square.

The next theorem combines with our main computational technique (row-reducing a matrix) to make it easy to recognize a nonsingular matrix. But first a definition.

Definition IM

Identity Matrix

The $m\times m$ identity matrix, $I_m$, is defined by

$$\left[I_m\right]_{ij}=\begin{cases} 1 & i=j \\ 0 & i\neq j \end{cases}$$

(This definition contains Notation IM.) $\triangle$

Example IM

An identity matrix

The $4\times 4$
identity matrix is

$$I_4=\begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}.$$

Notice that an identity matrix is square, and in reduced row-echelon form. So in particular, if we were to arrive at the identity matrix while bringing a matrix to reduced row-echelon form, then it would have all of the diagonal entries circled as leading 1’s.
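The observation that an identity matrix is already in reduced row-echelon form is easy to confirm computationally. Here is a minimal sketch using SymPy (a tooling choice of ours, not the text's; any computer algebra system with a `rref` routine would do):

```python
from sympy import eye

I4 = eye(4)              # the 4x4 identity matrix
rref, pivots = I4.rref()

print(rref == I4)        # True: I_4 is its own reduced row-echelon form
print(pivots)            # (0, 1, 2, 3): every column is a pivot column
```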

Theorem NMRRI

Nonsingular Matrices Row Reduce to the Identity matrix

Suppose that $A$ is a square matrix and $B$ is a row-equivalent matrix in reduced row-echelon form. Then $A$ is nonsingular if and only if $B$ is the identity matrix. $\square$

Proof ($\Leftarrow$) Suppose $B$ is the identity matrix. When the augmented matrix $[\,A\mid 0\,]$ is row-reduced, the result is $[\,B\mid 0\,]=[\,I_n\mid 0\,]$. The number of nonzero rows is equal to the number of variables in the linear system of equations $\mathcal{LS}(A,\,0)$, so $n=r$ and Theorem FVCS gives $n-r=0$ free variables. Thus, the homogeneous system $\mathcal{LS}(A,\,0)$ has just one solution, which must be the trivial solution. This is exactly the definition of a nonsingular matrix.

($\Rightarrow$) If $A$ is nonsingular, then the homogeneous system $\mathcal{LS}(A,\,0)$ has a unique solution, and has no free variables in the description of the solution set. The homogeneous system is consistent (Theorem HSC) so Theorem FVCS applies and tells us there are $n-r$ free variables. Thus, $n-r=0$, and so $n=r$. So $B$ has $n$ pivot columns among its total of $n$ columns. This is enough to force $B$ to be the $n\times n$ identity matrix $I_n$. $\blacksquare$

Notice that since this theorem is an equivalence it will always allow us to determine if a matrix is either nonsingular or singular. Here are two examples of this, continuing our study of Archetype A and Archetype B.

Example SRR

Singular matrix, row-reduced

The coefficient matrix for Archetype A is

$$A=\begin{bmatrix} 1 & -1 & 2 \\ 2 & 1 & 1 \\ 1 & 1 & 0 \end{bmatrix}$$

which when row-reduced becomes the row-equivalent matrix

$$B=\begin{bmatrix} 1 & 0 & 1 \\ 0 & 1 & -1 \\ 0 & 0 & 0 \end{bmatrix}.$$

Since this matrix is not the $3\times 3$ identity matrix, Theorem NMRRI tells us that $A$ is a singular matrix. $\boxtimes$

Example NSR

Nonsingular matrix, row-reduced

The coefficient matrix for Archetype B is

$$A=\begin{bmatrix} -7 & -6 & -12 \\ 5 & 5 & 7 \\ 1 & 0 & 4 \end{bmatrix}$$

which when row-reduced becomes the row-equivalent matrix

$$B=\begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}.$$

Since this matrix is the $3\times 3$ identity matrix, Theorem NMRRI tells us that $A$ is a nonsingular matrix. $\boxtimes$
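Theorem NMRRI reduces the nonsingularity question to a row-reduction, which we can mechanize. The sketch below, using SymPy (our choice of tool, not the text's), re-checks Example SRR and Example NSR:

```python
from sympy import Matrix, eye

def is_nonsingular(M):
    """Theorem NMRRI: a square matrix is nonsingular if and only if
    its reduced row-echelon form is the identity matrix."""
    if M.rows != M.cols:
        raise ValueError("nonsingularity applies only to square matrices")
    rref, _ = M.rref()
    return rref == eye(M.rows)

A = Matrix([[1, -1, 2], [2, 1, 1], [1, 1, 0]])     # Archetype A
B = Matrix([[-7, -6, -12], [5, 5, 7], [1, 0, 4]])  # Archetype B

print(is_nonsingular(A))  # False: A is singular (Example SRR)
print(is_nonsingular(B))  # True: B is nonsingular (Example NSR)
```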

Nonsingular matrices and their null spaces are intimately related, as the next two examples illustrate.

Example NSS

Null space of a singular matrix

Given the coefficient matrix from Archetype A,

$$A=\begin{bmatrix} 1 & -1 & 2 \\ 2 & 1 & 1 \\ 1 & 1 & 0 \end{bmatrix}$$

the null space of $A$ is the set of solutions to the homogeneous system of equations $\mathcal{LS}(A,\,0)$, and this solution set (hence the null space) was constructed in Example HISAA as

$$\mathcal{N}(A)=\left\{\left.\begin{bmatrix} -x_3 \\ x_3 \\ x_3 \end{bmatrix}\,\right|\, x_3\in\mathbb{C}\right\}$$

Example NSNM

Null space of a nonsingular matrix

Given the coefficient matrix from Archetype B,

$$A=\begin{bmatrix} -7 & -6 & -12 \\ 5 & 5 & 7 \\ 1 & 0 & 4 \end{bmatrix}$$

the homogeneous system $\mathcal{LS}(A,\,0)$ has a solution set constructed in Example HUSAB that contains only the trivial solution, so the null space has only a single element,

$$\mathcal{N}(A)=\left\{\begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}\right\}$$

These two examples illustrate the next theorem, which is another equivalence.

Theorem NMTNS

Nonsingular Matrices have Trivial Null Spaces

Suppose that $A$ is a square matrix. Then $A$ is nonsingular if and only if the null space of $A$, $\mathcal{N}(A)$, contains only the zero vector, i.e. $\mathcal{N}(A)=\{0\}$. $\square$

Proof The null space of a square matrix, $A$, is equal to the set of solutions to the homogeneous system, $\mathcal{LS}(A,\,0)$. A matrix is nonsingular if and only if the set of solutions to the homogeneous system, $\mathcal{LS}(A,\,0)$, has only a trivial solution. These two observations may be chained together to construct the two proofs necessary for each half of this theorem. $\blacksquare$
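Theorem NMTNS can also be checked computationally. SymPy's `nullspace` method (a tooling assumption on our part; the text itself computes null spaces by row-reducing) returns a basis for $\mathcal{N}(A)$, which is empty exactly when the null space is trivial:

```python
from sympy import Matrix

A = Matrix([[1, -1, 2], [2, 1, 1], [1, 1, 0]])     # Archetype A, singular
B = Matrix([[-7, -6, -12], [5, 5, 7], [1, 0, 4]])  # Archetype B, nonsingular

print(A.nullspace())  # one basis vector, so N(A) is larger than {0}
print(B.nullspace())  # []: the empty basis, so N(B) = {0}
```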

The next theorem pulls a lot of ideas together. Two proof techniques are applicable to its proof. So first, head out and read two more proof techniques: Technique CD and Technique U. Theorem NMUS tells us that we can learn a lot about solutions to a system of linear equations with a square coefficient matrix by examining a similar homogeneous system.

Theorem NMUS

Nonsingular Matrices and Unique Solutions

Suppose that $A$ is a square matrix. $A$ is a nonsingular matrix if and only if the system $\mathcal{LS}(A,\,b)$ has a unique solution for every choice of the constant vector $b$. $\square$

Proof ($\Leftarrow$) The hypothesis for this half of the proof is that the system $\mathcal{LS}(A,\,b)$ has a unique solution for every choice of the constant vector $b$. We will make a very specific choice for $b$: $b=0$. Then we know that the system $\mathcal{LS}(A,\,0)$ has a unique solution. But this is precisely the definition of what it means for $A$ to be nonsingular (Definition NM). That almost seems too easy! Notice that we have not used the full power of our hypothesis, but there is nothing that says we must use a hypothesis to its fullest.

If the first half of the proof seemed easy, perhaps we’ll have to work a bit harder to get the implication in the opposite direction. We provide two different proofs for the second half. The first is suggested by Asa Scherer and relies on the uniqueness of the reduced row-echelon form of a matrix (Theorem RREFU), a result that we could have proven earlier, but we have decided to delay until later. The second proof is lengthier and more involved, but does not rely on the uniqueness of the reduced row-echelon form of a matrix, a result we have not proven yet. It is also a good example of the types of proofs we will encounter throughout the course.

($\Rightarrow$, Round 1) We assume that $A$ is nonsingular, so we know there is a sequence of row operations that will convert $A$ into the identity matrix $I_n$ (Theorem NMRRI). Form the augmented matrix $A'=[\,A\mid b\,]$ and apply this same sequence of row operations to $A'$. The result will be the matrix $B'=[\,I_n\mid c\,]$, which is in reduced row-echelon form. It should be clear that $c$ is a solution to $\mathcal{LS}(A,\,b)$. Furthermore, since $B'$ is unique (Theorem RREFU), the vector $c$ must be unique, and therefore is the unique solution of $\mathcal{LS}(A,\,b)$.

($\Rightarrow$, Round 2) We will assume $A$ is nonsingular, and try to solve the system $\mathcal{LS}(A,\,b)$ without making any assumptions about $b$. To do this we will begin by constructing a new homogeneous linear system of equations that looks very much like the original. Suppose $A$ has size $n$ (why must it be square?) and write the original system as,

$$\begin{aligned}
a_{11}x_1+a_{12}x_2+a_{13}x_3+\cdots+a_{1n}x_n&=b_1\\
a_{21}x_1+a_{22}x_2+a_{23}x_3+\cdots+a_{2n}x_n&=b_2\\
a_{31}x_1+a_{32}x_2+a_{33}x_3+\cdots+a_{3n}x_n&=b_3\\
&\;\;\vdots\\
a_{n1}x_1+a_{n2}x_2+a_{n3}x_3+\cdots+a_{nn}x_n&=b_n
\end{aligned}\qquad(\ast)$$

Form the new, homogeneous system in $n$ equations with $n+1$ variables, by adding a new variable $y$, whose coefficients are the negatives of the constant terms,

$$\begin{aligned}
a_{11}x_1+a_{12}x_2+a_{13}x_3+\cdots+a_{1n}x_n-b_1y&=0\\
a_{21}x_1+a_{22}x_2+a_{23}x_3+\cdots+a_{2n}x_n-b_2y&=0\\
a_{31}x_1+a_{32}x_2+a_{33}x_3+\cdots+a_{3n}x_n-b_3y&=0\\
&\;\;\vdots\\
a_{n1}x_1+a_{n2}x_2+a_{n3}x_3+\cdots+a_{nn}x_n-b_ny&=0
\end{aligned}\qquad(\ast\ast)$$

Since this is a homogeneous system with more variables than equations ($m=n+1>n$), Theorem HMVEI says that the system has infinitely many solutions. We will choose one of these solutions, any one of these solutions, so long as it is not the trivial solution. Write this solution as

$$x_1=c_1\qquad x_2=c_2\qquad x_3=c_3\qquad\ldots\qquad x_n=c_n\qquad y=c_{n+1}$$

We know that at least one value of the $c_i$ is nonzero, but we will now show that in particular $c_{n+1}\neq 0$. We do this using a proof by contradiction (Technique CD). So suppose the $c_i$ form a solution as described, and in addition that $c_{n+1}=0$. Then we can write the $i$-th equation of system $(\ast\ast)$ as,

$$a_{i1}c_1+a_{i2}c_2+a_{i3}c_3+\cdots+a_{in}c_n-b_i(0)=0$$

which becomes

$$a_{i1}c_1+a_{i2}c_2+a_{i3}c_3+\cdots+a_{in}c_n=0$$

Since this is true for each $i$, we have that $x_1=c_1,\,x_2=c_2,\,x_3=c_3,\,\dots,\,x_n=c_n$ is a solution to the homogeneous system $\mathcal{LS}(A,\,0)$ formed with a nonsingular coefficient matrix. This means that the only possible solution is the trivial solution, so $c_1=0,\,c_2=0,\,c_3=0,\,\dots,\,c_n=0$. So, assuming simply that $c_{n+1}=0$, we conclude that all of the $c_i$ are zero. But this contradicts our choice of the $c_i$ as not being the trivial solution to the system $(\ast\ast)$. So $c_{n+1}\neq 0$.

We now propose and verify a solution to the original system $\left(\ast \right)$. Set

$$x_1=\frac{c_1}{c_{n+1}}\qquad x_2=\frac{c_2}{c_{n+1}}\qquad x_3=\frac{c_3}{c_{n+1}}\qquad\ldots\qquad x_n=\frac{c_n}{c_{n+1}}$$

Notice how it was necessary that we know that $c_{n+1}\neq 0$ for this step to succeed. Now, evaluate the $i$-th equation of system $(\ast)$ with this proposed solution, and recognize in the third line that $c_1$ through $c_{n+1}$ appear as if they were substituted into the left-hand side of the $i$-th equation of system $(\ast\ast)$,

$$\begin{aligned}
&a_{i1}\frac{c_1}{c_{n+1}}+a_{i2}\frac{c_2}{c_{n+1}}+a_{i3}\frac{c_3}{c_{n+1}}+\cdots+a_{in}\frac{c_n}{c_{n+1}}\\
&\quad=\frac{1}{c_{n+1}}\left(a_{i1}c_1+a_{i2}c_2+a_{i3}c_3+\cdots+a_{in}c_n\right)\\
&\quad=\frac{1}{c_{n+1}}\left(a_{i1}c_1+a_{i2}c_2+a_{i3}c_3+\cdots+a_{in}c_n-b_ic_{n+1}\right)+b_i\\
&\quad=\frac{1}{c_{n+1}}\left(0\right)+b_i\\
&\quad=b_i
\end{aligned}$$

Since this equation is true for every $i$, we have found a solution to system $(\ast)$. To finish, we still need to establish that this solution is unique.

With one solution in hand, we will entertain the possibility of a second solution. So assume system $\left(\ast \right)$ has two solutions,

$$\begin{aligned}
x_1&=d_1 & x_2&=d_2 & x_3&=d_3 & &\dots & x_n&=d_n\\
x_1&=e_1 & x_2&=e_2 & x_3&=e_3 & &\dots & x_n&=e_n
\end{aligned}$$

Then,

$$\begin{aligned}
&a_{i1}\left(d_1-e_1\right)+a_{i2}\left(d_2-e_2\right)+a_{i3}\left(d_3-e_3\right)+\cdots+a_{in}\left(d_n-e_n\right)\\
&\quad=\left(a_{i1}d_1+a_{i2}d_2+a_{i3}d_3+\cdots+a_{in}d_n\right)-\left(a_{i1}e_1+a_{i2}e_2+a_{i3}e_3+\cdots+a_{in}e_n\right)\\
&\quad=b_i-b_i\\
&\quad=0
\end{aligned}$$

This is the $i$-th equation of the homogeneous system $\mathcal{LS}(A,\,0)$ evaluated with $x_j=d_j-e_j$, $1\le j\le n$. Since $A$ is nonsingular, we must conclude that this solution is the trivial solution, and so $0=d_j-e_j$, $1\le j\le n$. That is, $d_j=e_j$ for all $j$ and the two solutions are identical, meaning any solution to $(\ast)$ is unique. $\blacksquare$

This important theorem deserves several comments. First, notice that the proposed solution (${x}_{i}=\frac{{c}_{i}}{{c}_{n+1}}$) appeared in the Round 2 proof with no motivation whatsoever. This is just fine in a proof. A proof should convince you that a theorem is true. It is your job to read the proof and be convinced of every assertion. Questions like “Where did that come from?” or “How would I think of that?” have no bearing on the validity of the proof.

Second, this theorem helps to explain part of our interest in nonsingular matrices. If a matrix is nonsingular, then no matter what vector of constants we pair it with, using the matrix as the coefficient matrix will always yield a linear system of equations with a solution, and the solution is unique. To determine if a matrix has this property (nonsingularity) it is enough to just solve one linear system, the homogeneous system with the matrix as coefficient matrix and the zero vector as the vector of constants (or any other vector of constants, see Exercise MM.T10).
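To see Theorem NMUS in action, we can let a computer algebra system solve $\mathcal{LS}(B,\,b)$ for a completely symbolic vector of constants. This is a sketch assuming SymPy, using the nonsingular coefficient matrix from Archetype B:

```python
from sympy import Matrix, symbols

B = Matrix([[-7, -6, -12], [5, 5, 7], [1, 0, 4]])  # nonsingular (Archetype B)
b1, b2, b3 = symbols("b1 b2 b3")
b = Matrix([b1, b2, b3])

# Because B is nonsingular, LS(B, b) has exactly one solution no
# matter what b is; LUsolve exhibits that solution symbolically.
x = B.LUsolve(b)
print(x)

# Substituting back confirms B*x = b for every choice of b.
print((B * x - b).expand())
```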

Finally, formulating the negation of the second part of this theorem is a good exercise. A singular matrix has the property that for some value of the vector $b$, the system $\mathcal{LS}(A,\,b)$ does not have a unique solution (which means that it has no solution or infinitely many solutions). We will be able to say more about this case later (see the discussion following Theorem PSPHS). Square matrices that are nonsingular have a long list of interesting properties, which we will start to catalog in the following, recurring, theorem. Of course, singular matrices will then have all of the opposite properties. The following theorem is a list of equivalences. We want to understand just what is involved with understanding and proving a theorem that says several conditions are equivalent. So have a look at Technique ME before studying the first in this series of theorems.

Theorem NME1

Nonsingular Matrix Equivalences, Round 1

Suppose that $A$
is a square matrix. The following are equivalent.

- $A$ is nonsingular.
- $A$ row-reduces to the identity matrix.
- The null space of $A$ contains only the zero vector, $\mathcal{N}(A)=\{0\}$.
- The linear system $\mathcal{LS}(A,\,b)$ has a unique solution for every possible choice of $b$.

Proof That $A$ is nonsingular is equivalent to each of the subsequent statements by, in turn, Theorem NMRRI, Theorem NMTNS and Theorem NMUS. So the statement of this theorem is just a convenient way to organize all these results. $\blacksquare$
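Since Theorem NME1 asserts that several conditions are equivalent, any computational check of them must come out unanimous. The sketch below (assuming SymPy; the determinant test standing in for the unique-solutions condition is a later topic of the book, swapped in here purely for computational convenience) evaluates three of the conditions directly:

```python
from sympy import Matrix, eye

def nme1_report(M):
    """Check three Theorem NME1 conditions for a square matrix M.
    By the theorem, the reported truth values must all agree."""
    n = M.rows
    return {
        "row-reduces to I_n": M.rref()[0] == eye(n),
        "N(M) = {0}": M.nullspace() == [],
        "unique solution for every b": M.det() != 0,  # stand-in test
    }

A = Matrix([[1, -1, 2], [2, 1, 1], [1, 1, 0]])  # Archetype A, singular
print(nme1_report(A))  # every condition reports False, as the theorem demands
```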

- What is the definition of a nonsingular matrix?
- What is the easiest way to recognize a nonsingular matrix?
- Suppose we have a system of equations and its coefficient matrix is nonsingular. What can you say about the solution set for this system?

In Exercises C30–C33 determine if the matrix is nonsingular or singular. Give
reasons for your answer.

C30

$$\begin{bmatrix} -3 & 1 & 2 & 8 \\ 2 & 0 & 3 & 4 \\ 1 & 2 & 7 & -4 \\ 5 & -1 & 2 & 0 \end{bmatrix}$$

Contributed by Robert Beezer Solution [201]

C31

$$\begin{bmatrix} 2 & 3 & 1 & 4 \\ 1 & 1 & 1 & 0 \\ -1 & 2 & 3 & 5 \\ 1 & 2 & 1 & 3 \end{bmatrix}$$

Contributed by Robert Beezer Solution [201]

C32

$$\begin{bmatrix} 9 & 3 & 2 & 4 \\ 5 & -6 & 1 & 3 \\ 4 & 1 & 3 & -5 \end{bmatrix}$$

Contributed by Robert Beezer Solution [202]

C33

$$\begin{bmatrix} -1 & 2 & 0 & 3 \\ 1 & -3 & -2 & 4 \\ -2 & 0 & 4 & 3 \\ -3 & 1 & -2 & 3 \end{bmatrix}$$

Contributed by Robert Beezer Solution [202]

C40 Each of the archetypes below is a system of equations with a square
coefficient matrix, or is itself a square matrix. Determine if these matrices are
nonsingular, or singular. Comment on the null space of each matrix.

Archetype A

Archetype B

Archetype F

Archetype K

Archetype L

Contributed by Robert Beezer

M30 Let $A$ be the coefficient matrix of the system of equations below. Is $A$ nonsingular or singular? Explain what you could infer about the solution set for the system based only on what you have learned about $A$ being singular or nonsingular.

$$\begin{aligned}
-x_1+5x_2&=-8\\
-2x_1+5x_2+5x_3+2x_4&=9\\
-3x_1-x_2+3x_3+x_4&=3\\
7x_1+6x_2+5x_3+x_4&=30
\end{aligned}$$

Contributed by Robert Beezer Solution [202]

For Exercises M51–M52 say as much as possible about each system’s
solution set. Be sure to make it clear which theorems you are using to reach your
conclusions.

M51 6 equations in 6 variables, singular coefficient matrix.

Contributed by Robert Beezer Solution [203]

M52 A system with a nonsingular coefficient matrix, not homogeneous.

Contributed by Robert Beezer Solution [203]

T10 Suppose that $A$ is a singular matrix, and $B$ is a matrix in reduced row-echelon form that is row-equivalent to $A$. Prove that the last row of $B$ is a zero row.

Contributed by Robert Beezer Solution [203]

C30 Contributed by Robert Beezer Statement [197]

The matrix row-reduces to

$$\begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}$$

which is the $4\times 4$ identity matrix. By Theorem NMRRI the original matrix must be nonsingular.

C31 Contributed by Robert Beezer Statement [197]

Row-reducing the matrix yields,

$$\begin{bmatrix} 1 & 0 & 0 & -2 \\ 0 & 1 & 0 & 3 \\ 0 & 0 & 1 & -1 \\ 0 & 0 & 0 & 0 \end{bmatrix}$$

Since this is not the $4\times 4$ identity matrix, Theorem NMRRI tells us the matrix is singular.

C32 Contributed by Robert Beezer Statement [198]

The matrix is not square, so neither term is applicable. See Definition NM, which
is stated for just square matrices.

C33 Contributed by Robert Beezer Statement [198]

Theorem NMRRI tells us we can answer this question by simply row-reducing the
matrix. Doing this we obtain,

$$\begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}$$

Since the reduced row-echelon form of the matrix is the $4\times 4$ identity matrix $I_4$, we know that the original matrix is nonsingular.

M30 Contributed by Robert Beezer Statement [199]

Row-reducing the coefficient matrix of the system of equations yields the $4\times 4$ identity matrix $I_4$ (Definition IM). By Theorem NMRRI, we know the coefficient matrix is nonsingular. According to Theorem NMUS we know that the system is guaranteed to have a unique solution, based only on the extra information that the coefficient matrix is nonsingular.

M51 Contributed by Robert Beezer Statement [199]

Theorem NMRRI tells us that the coefficient matrix will not row-reduce to the identity matrix. So if we were to row-reduce the augmented matrix of this system of equations, we would not get a unique solution. So by Theorem PSSLS the remaining possibilities are no solutions, or infinitely many.

M52 Contributed by Robert Beezer Statement [200]

Any system with a nonsingular coefficient matrix will have a unique solution by
Theorem NMUS. If the system is not homogeneous, the solution cannot be the
zero vector (Exercise HSE.T10).

T10 Contributed by Robert Beezer Statement [200]

Let $n$ denote the size of the square matrix $A$. By Theorem NMRRI the hypothesis that $A$ is singular implies that $B$ is not the identity matrix $I_n$. If $B$ had $n$ pivot columns, then it would have to be $I_n$, so $B$ must have fewer than $n$ pivot columns. But the number of nonzero rows in $B$ ($r$) is equal to the number of pivot columns as well. So among the $n$ rows of $B$ there are fewer than $n$ nonzero rows, and $B$ must contain at least one zero row. By Definition RREF, this row must be at the bottom of $B$.