##### Definition SQM: Square Matrix

A matrix with \(m\) rows and \(n\) columns is *square* if \(m=n\text{.}\) In this case, we say the matrix has *size* \(n\text{.}\) To emphasize the situation when a matrix is not square, we will call it *rectangular*.

In this section we specialize further and consider matrices with equal numbers of rows and columns, which when considered as coefficient matrices lead to systems with equal numbers of equations and variables. We will see in the second half of the course (Chapter D, Chapter E, Chapter LT, Chapter R) that these matrices are especially important.

Our theorems will now establish connections between systems of equations (homogeneous or otherwise), augmented matrices representing those systems, coefficient matrices, constant vectors, the reduced row-echelon form of matrices (augmented and coefficient) and solution sets. Be very careful in your reading, writing and speaking about systems of equations, matrices and sets of vectors. A system of equations is not a matrix, a matrix is not a solution set, and a solution set is not a system of equations. Now would be a great time to review the discussion about speaking and writing mathematics in Proof Technique L.

We can now present one of the central definitions of linear algebra.

Suppose \(A\) is a square matrix. Suppose further that the solution set to the homogeneous linear system of equations \(\linearsystem{A}{\zerovector}\) is \(\set{\zerovector}\text{,}\) in other words, the system has *only* the trivial solution. Then we say that \(A\) is a *nonsingular* matrix. Otherwise we say \(A\) is a *singular* matrix.

We can investigate whether any square matrix is nonsingular or not, no matter if the matrix is derived somehow from a system of equations or if it is simply a matrix. The definition says that to perform this investigation we must construct a very specific system of equations (homogeneous, with the matrix as the coefficient matrix) and look at its solution set. We will have theorems in this section that connect nonsingular matrices with systems of equations, creating more opportunities for confusion. Convince yourself now of two observations, (1) we can decide nonsingularity for any square matrix, and (2) the determination of nonsingularity involves the solution set for a certain homogeneous system of equations.
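
The definition is directly computable. As a rough sketch (SymPy is assumed, and the two matrices below are illustrative, not drawn from any archetype), we can build the homogeneous system \(\linearsystem{A}{\zerovector}\) and inspect its solution set:

```python
# Deciding nonsingularity straight from the definition: solve the
# homogeneous system LS(A, 0) and inspect the solution set.
# Illustrative matrices; SymPy is assumed.
from sympy import Matrix, linsolve, symbols, zeros

x1, x2 = symbols("x1 x2")
A = Matrix([[1, 2], [3, 4]])   # will turn out to be nonsingular
B = Matrix([[1, 2], [2, 4]])   # second row is twice the first

sols_A = linsolve((A, zeros(2, 1)), x1, x2)
sols_B = linsolve((B, zeros(2, 1)), x1, x2)

# sols_A contains only the trivial solution, so A is nonsingular;
# sols_B is a one-parameter family of solutions, so B is singular.
print(sols_A)
print(sols_B)
```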

Notice that it makes no sense to call a system of equations nonsingular (the term does not apply to a system of equations), nor does it make any sense to call a \(5\times 7\) matrix singular (the matrix is not square).

Notice that we will not discuss the coefficient matrix of Example HISAD as singular or nonsingular, since that matrix is not square.

The next theorem combines with our main computational technique (row reducing a matrix) to make it easy to recognize a nonsingular matrix. But first a definition.

The \(m\times m\) *identity matrix*, \(I_m\text{,}\) is defined by
\begin{align*}
\matrixentry{I_m}{ij}&=
\begin{cases}
1 & \text{if }i=j\\
0 & \text{if }i\neq j
\end{cases}
\end{align*}
for \(1\leq i,\,j\leq m\text{.}\)

Notice that an identity matrix is square, and in reduced row-echelon form. Also, every column is a pivot column, and every possible pivot column appears once.
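
The entry-by-entry formula translates directly into code. A small sketch (SymPy assumed; the size \(m=4\) is arbitrary) that builds the identity matrix from the formula and checks the observations above:

```python
# Build the identity matrix from its defining formula: [I_m]_{ij} = 1
# if i = j and 0 otherwise, then compare with SymPy's built-in eye().
# SymPy is assumed; the size m = 4 is arbitrary.
from sympy import Matrix, eye

m = 4
I_m = Matrix(m, m, lambda i, j: 1 if i == j else 0)

assert I_m == eye(m)           # matches the built-in identity
assert I_m.rref()[0] == I_m    # already in reduced row-echelon form
```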

Suppose that \(A\) is a square matrix and \(B\) is a row-equivalent matrix in reduced row-echelon form. Then \(A\) is nonsingular if and only if \(B\) is the identity matrix.

Notice that since this theorem is an equivalence it will always allow us to determine if a matrix is either nonsingular or singular. Here are two examples of this, continuing our study of Archetype A and Archetype B.
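
As a computational sketch of the theorem (SymPy assumed; the two matrices below are illustrative stand-ins, not the archetypes themselves), we row-reduce and compare the result to the identity:

```python
# Testing nonsingularity via the theorem: row-reduce a square matrix
# and compare the result to the identity matrix of the same size.
# SymPy is assumed; the matrices are illustrative.
from sympy import Matrix, eye

A = Matrix([[1, 2], [3, 4]])
B = Matrix([[1, 2], [2, 4]])

RA, _ = A.rref()   # the unique reduced row-echelon form of A
RB, _ = B.rref()

assert RA == eye(2)                      # A is nonsingular
assert RB == Matrix([[1, 2], [0, 0]])    # B is singular: a zero row appears
```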

Nonsingular matrices and their null spaces are intimately related, as the next two examples illustrate.

These two examples illustrate the next theorem, which is another equivalence.

Suppose that \(A\) is a square matrix. Then \(A\) is nonsingular if and only if the null space of \(A\) is the set containing only the zero vector, i.e. \(\nsp{A}=\set{\zerovector}\text{.}\)
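
In computational terms (SymPy assumed, illustrative matrices), SymPy's `nullspace` returns a basis for \(\nsp{A}\), so an empty basis means the null space is exactly \(\set{\zerovector}\):

```python
# Nonsingular if and only if the null space is {0}.  An empty basis
# from nullspace() means the null space contains only the zero vector.
# SymPy is assumed; the matrices are illustrative.
from sympy import Matrix

A = Matrix([[1, 2], [3, 4]])
B = Matrix([[1, 2], [2, 4]])

assert A.nullspace() == []        # N(A) = {0}: A is nonsingular
v = B.nullspace()[0]              # a nontrivial null space vector
assert B * v == Matrix([0, 0])    # v really solves LS(B, 0)
```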

The next theorem pulls a lot of big ideas together. Theorem NMUS tells us that we can learn much about solutions to a system of linear equations with a square coefficient matrix by just examining a similar homogeneous system.

Suppose that \(A\) is a square matrix. \(A\) is a nonsingular matrix if and only if the system \(\linearsystem{A}{\vect{b}}\) has a unique solution for every choice of the constant vector \(\vect{b}\text{.}\)

This theorem helps to explain part of our interest in nonsingular matrices. If a matrix is nonsingular, then no matter what vector of constants we pair it with, using the matrix as the coefficient matrix will *always* yield a linear system of equations with a solution, and the solution is unique. To determine if a matrix has this property (nonsingularity) it is enough to just solve one linear system, the homogeneous system with the matrix as coefficient matrix and the zero vector as the vector of constants (or any other vector of constants, see Exercise MM.T10).
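
A quick sketch of this property (SymPy assumed; the matrix and the sample constant vectors are illustrative): a nonsingular coefficient matrix yields exactly one solution no matter which \(\vect{b}\) we choose.

```python
# With a nonsingular coefficient matrix, LS(A, b) has a unique
# solution for every constant vector b.  SymPy is assumed; the
# matrix and the sample vectors are illustrative.
from sympy import Matrix

A = Matrix([[1, 2], [3, 4]])      # nonsingular: empty null space basis
assert A.nullspace() == []

for b in (Matrix([0, 0]), Matrix([1, 1]), Matrix([-5, 7])):
    x = A.solve(b)                # the unique solution to LS(A, b)
    assert A * x == b             # and it really is a solution
```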

Formulating the negation of the second part of this theorem is a good exercise. A singular matrix has the property that for *some* value of the vector \(\vect{b}\text{,}\) the system \(\linearsystem{A}{\vect{b}}\) does not have a unique solution (which means that it has no solution or infinitely many solutions). We will be able to say more about this case later (see the discussion following Theorem PSPHS).

Square matrices that are nonsingular have a long list of interesting properties, which we will start to catalog in the following, recurring, theorem. Of course, singular matrices will then have all of the opposite properties. The following theorem is a list of equivalences.

We want to understand just what is involved with understanding and proving a theorem that says several conditions are equivalent. So have a look at Proof Technique ME before studying the first in this series of theorems.

Suppose that \(A\) is a square matrix. The following are equivalent.

- \(A\) is nonsingular.
- \(A\) is row-equivalent to the identity matrix.
- The null space of \(A\) contains only the zero vector, \(\nsp{A}=\set{\zerovector}\text{.}\)
- The linear system \(\linearsystem{A}{\vect{b}}\) has a unique solution for every possible choice of \(\vect{b}\text{.}\)

Finally, you may have wondered why we refer to a matrix as *nonsingular* when it creates systems of equations with *single* solutions (Theorem NMUS)! I have wondered the same thing. We will have an opportunity to address this just before Theorem NPNT, and again when we get to Theorem SMZD. Can you wait that long?

In your own words state the definition of a nonsingular matrix.

What is the *easiest* way to recognize if a square matrix is nonsingular or not?

Suppose we have a system of equations and its coefficient matrix is nonsingular. What can you say about the solution set for this system?

In Exercises C30–C33 determine if the matrix is nonsingular or singular. Give reasons for your answer.

C30. \begin{equation*} \begin{bmatrix} -3 & 1 & 2 & 8\\ 2 & 0 & 3 & 4\\ 1 & 2 & 7 & -4\\ 5 & -1 & 2 & 0 \end{bmatrix} \end{equation*}

C31. \begin{equation*} \begin{bmatrix} 2 & 3 & 1 & 4\\ 1 & 1 & 1 & 0\\ -1 & 2 & 3 & 5\\ 1 & 2 & 1 & 3 \end{bmatrix} \end{equation*}

C32. \begin{equation*} \begin{bmatrix} 9 & 3 & 2 & 4\\ 5 & -6 & 1 & 3\\ 4 & 1 & 3 & -5 \end{bmatrix} \end{equation*}

C33. \begin{equation*} \begin{bmatrix} -1 & 2 & 0 & 3 \\ 1 & -3 & -2 & 4 \\ -2 & 0 & 4 & 3 \\ -3 & 1 & -2 & 3 \end{bmatrix} \end{equation*}

Each of the archetypes below is a system of equations with a square coefficient matrix, or is itself a square matrix. Determine if these matrices are nonsingular or singular. Comment on the null space of each matrix.

Archetype A, Archetype B, Archetype F, Archetype K, Archetype L

Find the null space of the matrix \(E\) below. \begin{align*} E&= \begin{bmatrix} 2 & 1 & -1 & -9 \\ 2 & 2 & -6 & -6 \\ 1 & 2 & -8 & 0 \\ -1 & 2 & -12 & 12 \end{bmatrix} \end{align*}

Let \(A\) be the coefficient matrix of the system of equations below. Is \(A\) nonsingular or singular? Explain what you could infer about the solution set for the system based only on what you have learned about \(A\) being singular or nonsingular. \begin{align*} -x_1+5x_2&=-8\\ -2x_1+5x_2+5x_3+2x_4&=9\\ -3x_1-x_2+3x_3+x_4&=3\\ 7x_1+6x_2+5x_3+x_4&=30 \end{align*}

For Exercises M51–M52 say as much as possible about each system's solution set. Be sure to make it clear which theorems you are using to reach your conclusions.

Suppose that \(A\) is a square matrix, and \(B\) is a matrix in reduced row-echelon form that is row-equivalent to \(A\text{.}\) Prove that if \(A\) is singular, then the last row of \(B\) is a zero row.

Using Definition RREF and Definition IM carefully, give a proof of the following equivalence: \(A\) is a square matrix in reduced row-echelon form where every column is a pivot column if and only if \(A\) is the identity matrix.

Suppose that \(A\) is a nonsingular matrix and \(A\) is row-equivalent to the matrix \(B\text{.}\) Prove that \(B\) is nonsingular.

Suppose that \(A\) is a square matrix of size \(n\) and that we know there is a *single* vector \(\vect{b}\in\complex{n}\) such that the system \(\linearsystem{A}{\vect{b}}\) has a unique solution. Prove that \(A\) is a nonsingular matrix. (Notice that this is very similar to Theorem NMUS, but is not exactly the same.)

Provide an alternative for the second half of the proof of Theorem NMUS, without appealing to properties of the reduced row-echelon form of the coefficient matrix. In other words, prove that if \(A\) is nonsingular, then \(\linearsystem{A}{\vect{b}}\) has a unique solution for every choice of the constant vector \(\vect{b}\text{.}\) Construct this proof without using Theorem REMEF or Theorem RREFU.
