\(\newcommand{\orderof}[1]{\sim #1} \newcommand{\Z}{\mathbb{Z}} \newcommand{\reals}{\mathbb{R}} \newcommand{\real}[1]{\mathbb{R}^{#1}} \newcommand{\complexes}{\mathbb{C}} \newcommand{\complex}[1]{\mathbb{C}^{#1}} \newcommand{\conjugate}[1]{\overline{#1}} \newcommand{\modulus}[1]{\left\lvert#1\right\rvert} \newcommand{\zerovector}{\vect{0}} \newcommand{\zeromatrix}{\mathcal{O}} \newcommand{\innerproduct}[2]{\left\langle#1,\,#2\right\rangle} \newcommand{\norm}[1]{\left\lVert#1\right\rVert} \newcommand{\dimension}[1]{\dim\left(#1\right)} \newcommand{\nullity}[1]{n\left(#1\right)} \newcommand{\rank}[1]{r\left(#1\right)} \newcommand{\ds}{\oplus} \newcommand{\detname}[1]{\det\left(#1\right)} \newcommand{\detbars}[1]{\left\lvert#1\right\rvert} \newcommand{\trace}[1]{t\left(#1\right)} \newcommand{\sr}[1]{#1^{1/2}} \newcommand{\spn}[1]{\left\langle#1\right\rangle} \newcommand{\nsp}[1]{\mathcal{N}\!\left(#1\right)} \newcommand{\csp}[1]{\mathcal{C}\!\left(#1\right)} \newcommand{\rsp}[1]{\mathcal{R}\!\left(#1\right)} \newcommand{\lns}[1]{\mathcal{L}\!\left(#1\right)} \newcommand{\per}[1]{#1^\perp} \newcommand{\augmented}[2]{\left\lbrack\left.#1\,\right\rvert\,#2\right\rbrack} \newcommand{\linearsystem}[2]{\mathcal{LS}\!\left(#1,\,#2\right)} \newcommand{\homosystem}[1]{\linearsystem{#1}{\zerovector}} \newcommand{\rowopswap}[2]{R_{#1}\leftrightarrow R_{#2}} \newcommand{\rowopmult}[2]{#1R_{#2}} \newcommand{\rowopadd}[3]{#1R_{#2}+R_{#3}} \newcommand{\leading}[1]{\boxed{#1}} \newcommand{\rref}{\xrightarrow{\text{RREF}}} \newcommand{\elemswap}[2]{E_{#1,#2}} \newcommand{\elemmult}[2]{E_{#2}\left(#1\right)} \newcommand{\elemadd}[3]{E_{#2,#3}\left(#1\right)} \newcommand{\scalarlist}[2]{{#1}_{1},\,{#1}_{2},\,{#1}_{3},\,\ldots,\,{#1}_{#2}} \newcommand{\vect}[1]{\mathbf{#1}} \newcommand{\colvector}[1]{\begin{bmatrix}#1\end{bmatrix}} \newcommand{\vectorcomponents}[2]{\colvector{#1_{1}\\#1_{2}\\#1_{3}\\\vdots\\#1_{#2}}} \newcommand{\vectorlist}[2]{\vect{#1}_{1},\,\vect{#1}_{2},\,\vect{#1}_{3},\,\ldots,\,\vect{#1}_{#2}} \newcommand{\vectorentry}[2]{\left\lbrack#1\right\rbrack_{#2}} \newcommand{\matrixentry}[2]{\left\lbrack#1\right\rbrack_{#2}} \newcommand{\lincombo}[3]{#1_{1}\vect{#2}_{1}+#1_{2}\vect{#2}_{2}+#1_{3}\vect{#2}_{3}+\cdots +#1_{#3}\vect{#2}_{#3}} \newcommand{\matrixcolumns}[2]{\left\lbrack\vect{#1}_{1}|\vect{#1}_{2}|\vect{#1}_{3}|\ldots|\vect{#1}_{#2}\right\rbrack} \newcommand{\transpose}[1]{#1^{t}} \newcommand{\inverse}[1]{#1^{-1}} \newcommand{\submatrix}[3]{#1\left(#2|#3\right)} \newcommand{\adj}[1]{\transpose{\left(\conjugate{#1}\right)}} \newcommand{\adjoint}[1]{#1^\ast} \newcommand{\set}[1]{\left\{#1\right\}} \newcommand{\setparts}[2]{\left\lbrace#1\,\middle|\,#2\right\rbrace} \newcommand{\card}[1]{\left\lvert#1\right\rvert} \newcommand{\setcomplement}[1]{\overline{#1}} \newcommand{\charpoly}[2]{p_{#1}\left(#2\right)} \newcommand{\eigenspace}[2]{\mathcal{E}_{#1}\left(#2\right)} \newcommand{\eigensystem}[3]{\lambda&=#2&\eigenspace{#1}{#2}&=\spn{\set{#3}}} \newcommand{\geneigenspace}[2]{\mathcal{G}_{#1}\left(#2\right)} \newcommand{\algmult}[2]{\alpha_{#1}\left(#2\right)} \newcommand{\geomult}[2]{\gamma_{#1}\left(#2\right)} \newcommand{\indx}[2]{\iota_{#1}\left(#2\right)} \newcommand{\ltdefn}[3]{#1\colon #2\rightarrow#3} \newcommand{\lteval}[2]{#1\left(#2\right)} \newcommand{\ltinverse}[1]{#1^{-1}} \newcommand{\restrict}[2]{{#1}|_{#2}} \newcommand{\preimage}[2]{#1^{-1}\left(#2\right)} \newcommand{\rng}[1]{\mathcal{R}\!\left(#1\right)} \newcommand{\krn}[1]{\mathcal{K}\!\left(#1\right)} 
\newcommand{\compose}[2]{{#1}\circ{#2}} \newcommand{\vslt}[2]{\mathcal{LT}\left(#1,\,#2\right)} \newcommand{\isomorphic}{\cong} \newcommand{\similar}[2]{\inverse{#2}#1#2} \newcommand{\vectrepname}[1]{\rho_{#1}} \newcommand{\vectrep}[2]{\lteval{\vectrepname{#1}}{#2}} \newcommand{\vectrepinvname}[1]{\ltinverse{\vectrepname{#1}}} \newcommand{\vectrepinv}[2]{\lteval{\ltinverse{\vectrepname{#1}}}{#2}} \newcommand{\matrixrep}[3]{M^{#1}_{#2,#3}} \newcommand{\matrixrepcolumns}[4]{\left\lbrack \left.\vectrep{#2}{\lteval{#1}{\vect{#3}_{1}}}\right|\left.\vectrep{#2}{\lteval{#1}{\vect{#3}_{2}}}\right|\left.\vectrep{#2}{\lteval{#1}{\vect{#3}_{3}}}\right|\ldots\left|\vectrep{#2}{\lteval{#1}{\vect{#3}_{#4}}}\right.\right\rbrack} \newcommand{\cbm}[2]{C_{#1,#2}} \newcommand{\jordan}[2]{J_{#1}\left(#2\right)} \newcommand{\hadamard}[2]{#1\circ #2} \newcommand{\hadamardidentity}[1]{J_{#1}} \newcommand{\hadamardinverse}[1]{\widehat{#1}} \newcommand{\lt}{<} \newcommand{\gt}{>} \newcommand{\amp}{&} \)

Appendix A Archetypes

WordNet (an open-source lexical database) gives the following definition of “archetype”: something that serves as a model or a basis for making copies.

Our archetypes are typical examples of systems of equations, matrices and linear transformations. They have been designed to demonstrate the range of possibilities, allowing you to compare and contrast them. Several are of a size and complexity that is usually not presented in a textbook, but should do a better job of being “typical.”

We have made frequent reference to many of these throughout the text, such as the recurring comparisons between Archetype A and Archetype B. Some we have left for you to investigate, such as Archetype J, which parallels Archetype I.

How should you use the archetypes? First, consult the description of each one as it is mentioned in the text. See how other facts about the example might illuminate whatever property or construction is being described in the example. Second, each property has a short description that usually includes references to the relevant theorems. Perform the computations and understand the connections to the listed theorems. Third, each property has a small checkbox in front of it. Use the archetypes like a workbook and chart your progress by “checking-off” those properties that you understand.

The following chart summarizes some (but not all) of the properties described for each archetype. Notice that while there are several types of objects, there are fundamental connections between them. That some lines of the table do double-duty is meant to convey some of these connections. Consult this table when you wish to quickly find an example of a certain phenomenon.

Archetype A

⬜  Summary   Linear system of three equations, three unknowns. Singular coefficient matrix with dimension 1 null space. Integer eigenvalues and a degenerate eigenspace for coefficient matrix.
⬜  Definition  A system of linear equations (Definition SLE). \begin{align*} x_1 -x_2 +2x_3 & =1\\ 2x_1+ x_2 + x_3 & =8\\ x_1 + x_2 & =5 \end{align*}
⬜  Solutions  Some solutions to the system of linear equations, not necessarily exhaustive (Definition SSLE): \begin{gather*} x_1 = 2,\quad x_2 = 3,\quad x_3 = 1\\ x_1 = 3,\quad x_2 = 2,\quad x_3 = 0 \end{gather*}
⬜  Augmented Matrix  Augmented matrix of the linear system of equations (Definition AM): \begin{equation*} \begin{bmatrix} 1 & -1 & 2 & 1\\ 2 & 1 & 1 & 8\\ 1 & 1 & 0 & 5 \end{bmatrix} \end{equation*}
⬜  Row-Reduced Augmented Matrix  Matrix in reduced row-echelon form, row-equivalent to the augmented matrix. (Definition RREF) \begin{equation*} \begin{bmatrix} \leading{1} & 0 & 1 & 3\\ 0 & \leading{1} & -1 & 2\\ 0 & 0 & 0 & 0 \end{bmatrix} \end{equation*}
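The archetypes do not prescribe any software, but as a minimal sketch (assuming SymPy, which is not part of the text) the row reduction above can be reproduced mechanically:

from sympy import Matrix

# Augmented matrix of Archetype A (Definition AM)
aug = Matrix([[1, -1, 2, 1],
              [2,  1, 1, 8],
              [1,  1, 0, 5]])

R, pivots = aug.rref()
print(R)       # the reduced row-echelon form displayed above
print(pivots)  # (0, 1): zero-based pivot columns, i.e. D = {1, 2}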
⬜  Augmented Matrix Analysis  Analysis of the augmented matrix (Definition RREF). \begin{align*} r&=2&D&=\set{1,\,2}&F&=\set{3,\,4} \end{align*}
⬜  Vector Form of Solutions  Vector form of the solution set to the system of equations (Theorem VFSLS). Notice the relationship between the free variables and the set \(F\) above. Also, notice the pattern of 0's and 1's in the entries of the vectors corresponding to elements of the set \(F\) in the larger examples. \begin{equation*} \colvector{x_1\\x_2\\x_3}=\colvector{3\\2\\0} + x_3\colvector{-1\\1\\1} \end{equation*}
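As a hedged companion check (again assuming SymPy), substituting the vector form back into the system confirms that it solves the system for every value of the free variable:

from sympy import Matrix, symbols

x3 = symbols('x3')
A = Matrix([[1, -1, 2], [2, 1, 1], [1, 1, 0]])
b = Matrix([1, 8, 5])

# particular solution plus the free-variable multiple (Theorem VFSLS)
x = Matrix([3, 2, 0]) + x3 * Matrix([-1, 1, 1])
print((A * x - b).expand())  # the zero vector, for every value of x3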
⬜  Associated Homogeneous System  Given a system of equations we can always build a new, related, homogeneous system (Definition HS) by converting the constant terms to zeros and retaining the coefficients of the variables. Properties of this new system will have precise relationships with various properties of the original system. \begin{align*} x_1 -x_2 +2x_3 & = 0\\ 2x_1+ x_2 + x_3 & = 0\\ x_1 + x_2\quad\quad & = 0 \end{align*}
⬜  Solutions, Homogeneous System  Some solutions to the associated homogeneous system of linear equations, not necessarily exhaustive (Definition SSLE). Review Theorem HSC as you consider these solutions. \begin{gather*} x_1 = 0,\quad x_2 = 0,\quad x_3 = 0\\ x_1 = -1,\quad x_2 = 1,\quad x_3 = 1\\ x_1 = -5,\quad x_2 = 5,\quad x_3 = 5 \end{gather*}
⬜  Row-Reduced Augmented Matrix, Homogeneous System  Form the augmented matrix of the homogeneous linear system, and use row operations to convert to reduced row-echelon form. Notice how the entries of the final column remain zeros. \begin{equation*} \begin{bmatrix} \leading{1} & 0 & 1 & 0 \\ 0 & \leading{1} & -1 & 0\\ 0 & 0 & 0 & 0 \end{bmatrix} \end{equation*}
⬜  Augmented Matrix Analysis, Homogeneous System  Analysis of the augmented matrix for the homogeneous system (Definition RREF). Compare this with the same analysis of the original system, especially in the case where the original system is inconsistent (Theorem RCLS). \begin{align*} r&=2&D&=\set{1,\,2}&F&=\set{3,\,4} \end{align*}
⬜  Coefficient Matrix  For any system of equations we can isolate the coefficient matrix, which will be identical to the coefficient matrix of the associated homogeneous system. For the remainder of the discussion of this system of equations, we will analyze just the coefficient matrix. \begin{equation*} \begin{bmatrix} 1 & -1 & 2\\ 2 & 1 & 1\\ 1 & 1 & 0 \end{bmatrix} \end{equation*}
⬜  Row-Reduced Coefficient Matrix  Row-equivalent matrix in reduced row-echelon form (Definition RREF). \begin{equation*} \begin{bmatrix} \leading{1} & 0 & 1\\ 0 & \leading{1} & -1\\ 0 & 0 & 0 \end{bmatrix} \end{equation*}
⬜  Coefficient Matrix Analysis  Analysis of the reduced row-echelon form of the matrix (Definition RREF). For archetypes that begin as systems of equations, compare this analysis with the analysis for the coefficient matrices of the original system, and of the associated homogeneous system. \begin{align*} r&=2&D&=\set{1,\,2}&F&=\set{3} \end{align*}
⬜  Nonsingular Matrix?  Is the matrix nonsingular or singular? Singular. Notice that the row-reduced version of the matrix is not the identity matrix and apply Theorem NMRRI. At the same time, examine the sizes of the sets \(D\) and \(F\) from the analysis of the reduced row-echelon version of the matrix.
⬜  Null Space  The null space of the matrix. The set of vectors used in the span construction is a linearly independent set of column vectors that spans the null space of the matrix (Theorem SSNS, Theorem BNS). Solve a homogeneous system with this matrix as the coefficient matrix and write the solutions in vector form (Theorem VFSLS) to see these vectors arise. Compare the entries of these vectors for indices in \(D\) versus entries for indices in \(F\text{.}\) \begin{equation*} \set{\colvector{-1\\1\\1}} \end{equation*}
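A sketch of the same computation in SymPy (an assumption, not something the text specifies) recovers the basis vector directly:

from sympy import Matrix

A = Matrix([[1, -1, 2],
            [2,  1, 1],
            [1,  1, 0]])
print(A.nullspace())  # [Matrix([[-1], [1], [1]])], matching Theorem BNS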
⬜  Column Space, Original Columns  The column space of the matrix, expressed as the span of a set of linearly independent vectors that are also columns of the matrix. These columns have indices that form the set \(D\) above (Theorem BCS). \begin{equation*} \set{\colvector{1\\2\\1},\,\colvector{-1\\1\\1}} \end{equation*}
⬜  Column Space, Extended Echelon Form  The column space of the matrix, as it arises from the extended echelon form of the matrix. The matrix \(L\) is computed as described in Definition EEF. This is followed by the column space, described as the span of a set of linearly independent vectors; by Theorem FS this span equals the null space of \(L\text{,}\) and the basis is computed according to Theorem BNS. When \(r=m\text{,}\) the matrix \(L\) has no rows and the column space is all of \(\complex{m}\text{.}\) \begin{equation*} L=\begin{bmatrix}1&-2&3\end{bmatrix} \end{equation*} \begin{equation*} \set{\colvector{-3\\0\\1},\,\colvector{2\\1\\0}} \end{equation*}
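One hedged way to see Definition EEF in action (assuming SymPy) is to row-reduce the matrix augmented with an identity matrix and read \(L\) from the rows below the first \(r\):

from sympy import Matrix, eye

A = Matrix([[1, -1, 2],
            [2,  1, 1],
            [1,  1, 0]])
N, _ = A.row_join(eye(3)).rref()  # extended echelon form (Definition EEF)
r = A.rank()
L = N[r:, A.cols:]    # rows past the first r, columns from the identity part
print(L)              # Matrix([[1, -2, 3]])
print(L.nullspace())  # a basis for the column space (Theorem FS, Theorem BNS)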
⬜  Column Space, Row Space of Transpose  The column space of the matrix, expressed as the span of a set of linearly independent vectors. These vectors are computed by bringing the transpose of the matrix into reduced row-echelon form, tossing out the zero rows, and writing the remaining nonzero rows as column vectors. By Theorem CSRST and Theorem BRS, and in the style of Example CSROI, this yields a linearly independent set of vectors that span the column space. \begin{equation*} \set{\colvector{1\\0\\-\frac{1}{3}},\,\colvector{0\\1\\{\frac{2}{3}}}} \end{equation*}
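Sketched in SymPy (assumed tooling, not from the text), the same recipe is short: row-reduce the transpose and keep the nonzero rows as column vectors.

from sympy import Matrix

A = Matrix([[1, -1, 2],
            [2,  1, 1],
            [1,  1, 0]])
R, _ = A.T.rref()
basis = [R.row(i).T for i in range(R.rows) if any(R.row(i))]
print(basis)  # the two column vectors displayed above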
⬜  Row Space  Row space of the matrix, expressed as a span of a set of linearly independent vectors, obtained from the nonzero rows of the row-equivalent matrix in reduced row-echelon form. (Theorem BRS) \begin{equation*} \set{\colvector{1\\0\\1},\,\colvector{0\\1\\-1}} \end{equation*}
⬜  Inverse Matrix?  The matrix is singular, and by Theorem NI does not have an inverse (Definition MI).
⬜  Subspace Dimensions  Subspace dimensions associated with the matrix (Definition ROM, Definition NOM). Verify Theorem RPNC. \begin{align*} \text{rank}&=2&\text{nullity}&=1&\text{columns}&=3 \end{align*}
⬜  Determinant  Value of the determinant of the matrix. The matrix is singular so the determinant is \(0\) (Theorem SMZD). Notice that zero is an eigenvalue of the matrix (Theorem SMZE).
⬜  Eigenvalues, Eigenspaces  Eigenvalues and bases for eigenspaces (Definition EEM, Definition EM). Compute a matrix-vector product (Definition MVP) for each eigenvector as an interesting check. \begin{align*} \eigensystem{A}{0}{\colvector{-1\\1\\1}}\\ \eigensystem{A}{2}{\colvector{1\\5\\3}} \end{align*}
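A minimal cross-check, assuming SymPy: eigenvects() returns each eigenvalue with its algebraic multiplicity and a basis for its eigenspace (possibly scaled differently than the vectors above).

from sympy import Matrix

A = Matrix([[1, -1, 2],
            [2,  1, 1],
            [1,  1, 0]])
for value, alg_mult, basis in A.eigenvects():
    print(value, alg_mult, basis)
# lambda = 0 arrives with algebraic multiplicity 2 but only one basis
# vector; lambda = 2 with multiplicity 1 (compare the multiplicities below)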
⬜  Eigenvalue Multiplicities  Geometric and algebraic multiplicities (Definition GME, Definition AME). \begin{align*} \geomult{A}{0}&=1&\algmult{A}{0}&=2\\ \geomult{A}{2}&=1&\algmult{A}{2}&=1 \end{align*}
⬜  Diagonalizable  Diagonalizable (Definition DZM)?

No, \(\geomult{A}{0}\neq\algmult{A}{0}\text{,}\) Theorem DMFE.

Archetype B

⬜  Summary   System with three equations, three unknowns. Nonsingular coefficient matrix. Distinct integer eigenvalues for coefficient matrix.
⬜  Definition  A system of linear equations (Definition SLE). \begin{align*} -7x_1 -6 x_2 - 12x_3 &=-33\\ 5x_1 + 5x_2 + 7x_3 &=24\\ x_1 +4x_3 &=5 \end{align*}
⬜  Solutions  Some solutions to the system of linear equations, not necessarily exhaustive (Definition SSLE): \begin{gather*} x_1 = -3,\quad x_2 = 5,\quad x_3 = 2 \end{gather*}
⬜  Augmented Matrix  Augmented matrix of the linear system of equations (Definition AM): \begin{equation*} \begin{bmatrix} -7&-6&- 12&-33\\ 5&5&7&24\\ 1&0&4&5 \end{bmatrix} \end{equation*}
⬜  Row-Reduced Augmented Matrix  Matrix in reduced row-echelon form, row-equivalent to the augmented matrix. (Definition RREF) \begin{equation*} \begin{bmatrix} \leading{1}&0&0&-3\\ 0&\leading{1}&0&5\\ 0&0&\leading{1}&2 \end{bmatrix} \end{equation*}
⬜  Augmented Matrix Analysis  Analysis of the augmented matrix (Definition RREF). \begin{align*} r&=3&D&=\set{1,\,2,\,3}&F&=\set{4} \end{align*}
⬜  Vector Form of Solutions  Vector form of the solution set to the system of equations (Theorem VFSLS). Notice the relationship between the free variables and the set \(F\) above. Also, notice the pattern of 0's and 1's in the entries of the vectors corresponding to elements of the set \(F\) in the larger examples. \begin{equation*} \colvector{x_1\\x_2\\x_3}=\colvector{-3\\5\\2} \end{equation*}
⬜  Associated Homogeneous System  Given a system of equations we can always build a new, related, homogeneous system (Definition HS) by converting the constant terms to zeros and retaining the coefficients of the variables. Properties of this new system will have precise relationships with various properties of the original system. \begin{align*} -7x_1 -6 x_2 - 12x_3 &=0\\ 5x_1 + 5x_2 + 7x_3 &=0\\ x_1 +4x_3 &=0 \end{align*}
⬜  Solutions, Homogeneous System  Some solutions to the associated homogeneous system of linear equations, not necessarily exhaustive (Definition SSLE). Review Theorem HSC as you consider these solutions. \begin{gather*} x_1 = 0,\quad x_2 = 0,\quad x_3 = 0 \end{gather*}
⬜  Row-Reduced Augmented Matrix, Homogeneous System  Form the augmented matrix of the homogeneous linear system, and use row operations to convert to reduced row-echelon form. Notice how the entries of the final column remain zeros. \begin{equation*} \begin{bmatrix} \leading{1} & 0 & 0 & 0\\ 0 & \leading{1} & 0 & 0\\ 0 & 0 & \leading{1} & 0 \end{bmatrix} \end{equation*}
⬜  Augmented Matrix Analysis, Homogeneous System  Analysis of the augmented matrix for the homogeneous system (Definition RREF). Compare this with the same analysis of the original system, especially in the case where the original system is inconsistent (Theorem RCLS). \begin{align*} r&=3&D&=\set{1,\,2,\,3}&F&=\set{4} \end{align*}
⬜  Coefficient Matrix  For any system of equations we can isolate the coefficient matrix, which will be identical to the coefficient matrix of the associated homogeneous system. For the remainder of the discussion of this system of equations, we will analyze just the coefficient matrix. \begin{equation*} \begin{bmatrix} -7&-6&-12\\ 5&5&7\\ 1&0&4 \end{bmatrix} \end{equation*}
⬜  Row-Reduced Coefficient Matrix  Row-equivalent matrix in reduced row-echelon form (Definition RREF). \begin{equation*} \begin{bmatrix} \leading{1} & 0 & 0\\ 0 & \leading{1} & 0\\ 0 & 0 & \leading{1} \end{bmatrix} \end{equation*}
⬜  Coefficient Matrix Analysis  Analysis of the reduced row-echelon form of the matrix (Definition RREF). For archetypes that begin as systems of equations, compare this analysis with the analysis for the coefficient matrices of the original system, and of the associated homogeneous system. \begin{align*} r&=3&D&=\set{1,\,2,\,3}&F&=\set{\ } \end{align*}
⬜  Nonsingular Matrix?  Is the matrix nonsingular or singular? Nonsingular. Notice that the row-reduced version of the matrix is the identity matrix and apply Theorem NMRRI. At the same time, examine the sizes of the sets \(D\) and \(F\) from the analysis of the reduced row-echelon version of the matrix.
⬜  Null Space  The null space of the matrix. The set of vectors used in the span construction is a linearly independent set of column vectors that spans the null space of the matrix (Theorem SSNS, Theorem BNS). Solve a homogeneous system with this matrix as the coefficient matrix and write the solutions in vector form (Theorem VFSLS) to see these vectors arise. Compare the entries of these vectors for indices in \(D\) versus entries for indices in \(F\text{.}\) \begin{equation*} \set{\ } \end{equation*}
⬜  Column Space, Original Columns  The column space of the matrix, expressed as the span of a set of linearly independent vectors that are also columns of the matrix. These columns have indices that form the set \(D\) above (Theorem BCS). \begin{equation*} \set{\colvector{-7\\5\\1},\,\colvector{-6\\5\\0},\,\colvector{-12\\7\\4}} \end{equation*}
⬜  Column Space, Extended Echelon Form  The column space of the matrix, as it arises from the extended echelon form of the matrix. The matrix \(L\) is computed as described in Definition EEF. This is followed by the column space, described as the span of a set of linearly independent vectors; by Theorem FS this span equals the null space of \(L\text{,}\) and the basis is computed according to Theorem BNS. When \(r=m\text{,}\) the matrix \(L\) has no rows and the column space is all of \(\complex{m}\text{.}\) \begin{equation*} L=\begin{bmatrix}\end{bmatrix} \end{equation*} \begin{equation*} \set{\colvector{1\\0\\0},\,\colvector{0\\1\\0},\,\colvector{0\\0\\1}} \end{equation*}
⬜  Column Space, Row Space of Transpose  The column space of the matrix, expressed as the span of a set of linearly independent vectors. These vectors are computed by bringing the transpose of the matrix into reduced row-echelon form, tossing out the zero rows, and writing the remaining nonzero rows as column vectors. By Theorem CSRST and Theorem BRS, and in the style of Example CSROI, this yields a linearly independent set of vectors that span the column space. \begin{equation*} \set{\colvector{1\\0\\0},\,\colvector{0\\1\\0},\,\colvector{0\\0\\1}} \end{equation*}
⬜  Row Space  Row space of the matrix, expressed as a span of a set of linearly independent vectors, obtained from the nonzero rows of the row-equivalent matrix in reduced row-echelon form. (Theorem BRS) \begin{equation*} \set{\colvector{1\\0\\0},\,\colvector{0\\1\\0},\,\colvector{0\\0\\1}} \end{equation*}
⬜  Inverse Matrix?  The matrix is nonsingular, and by Theorem NI has an inverse (Definition MI). The inverse can be computed with the procedure in Theorem CINM. \begin{equation*} \begin{bmatrix} -10 & -12 & -9\\ \frac{13}{2} & 8 & \frac{11}{2}\\ \frac{5}{2} & 3 & \frac{5}{2} \end{bmatrix} \end{equation*}
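As a sketch (SymPy assumed, and not part of the text), the inverse and the defining property of Definition MI are one call and one product away:

from sympy import Matrix, eye

B = Matrix([[-7, -6, -12],
            [ 5,  5,   7],
            [ 1,  0,   4]])
Binv = B.inv()
print(Binv)                # the matrix displayed above
print(B * Binv == eye(3))  # True (Definition MI)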
⬜  Subspace Dimensions  Subspace dimensions associated with the matrix (Definition ROM, Definition NOM). Verify Theorem RPNC. \begin{align*} \text{rank}&=3&\text{nullity}&=0&\text{columns}&=3 \end{align*}
⬜  Determinant  Value of the determinant of the matrix. The matrix is nonsingular so the determinant is nonzero (Theorem SMZD). Notice that zero is not an eigenvalue of the matrix (Theorem SMZE). \begin{equation*} \text{determinant}=-2 \end{equation*}
⬜  Eigenvalues, Eigenspaces  Eigenvalues and bases for eigenspaces (Definition EEM, Definition EM). Compute a matrix-vector product (Definition MVP) for each eigenvector as an interesting check. \begin{align*} \eigensystem{B}{-1}{\colvector{-5\\3\\1}}\\ \eigensystem{B}{1}{\colvector{-3\\2\\1}}\\ \eigensystem{B}{2}{\colvector{-2\\1\\1}} \end{align*}
⬜  Eigenvalue Multiplicities  Geometric and algebraic multiplicities (Definition GME, Definition AME). \begin{align*} \geomult{B}{-1}&=1&\algmult{B}{-1}&=1\\ \geomult{B}{1}&=1&\algmult{B}{1}&=1\\ \geomult{B}{2}&=1&\algmult{B}{2}&=1 \end{align*}
⬜  Diagonalizable  Diagonalizable (Definition DZM)?

Yes, distinct eigenvalues, Theorem DED.

⬜  Diagonalization  The diagonalization (Theorem DC). \begin{equation*} \begin{bmatrix}-1&-1&-1\\2&3&1\\-1&-2&1 \end{bmatrix}\begin{bmatrix} -7&-6&-12\\ 5&5&7\\ 1&0&4 \end{bmatrix} \begin{bmatrix}-5&-3&-2\\3&2&1\\1&1&1 \end{bmatrix}=\begin{bmatrix}-1&0&0\\0&1&0\\0&0&2 \end{bmatrix} \end{equation*}
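A hedged confirmation of Theorem DC, assuming SymPy; diagonalize() may order eigenvalues and scale eigenvectors differently than the display above:

from sympy import Matrix

B = Matrix([[-7, -6, -12],
            [ 5,  5,   7],
            [ 1,  0,   4]])
S, D = B.diagonalize()
print(S.inv() * B * S == D)  # True (Theorem DC)
print(D)                     # diagonal matrix of eigenvalues -1, 1, 2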

Archetype C

⬜  Summary   System with three equations, four variables. Consistent. Null space of coefficient matrix has dimension 1.
⬜  Definition  A system of linear equations (Definition SLE). \begin{align*} 2x_1 - 3x_2 + x_3 - 6x_4 &= -7 \\ 4x_1 +x_2 +2x_3 + 9x_4 &= -7 \\ 3x_1 +x_2 +x_3 + 8x_4 &= -8 \end{align*}
⬜  Solutions  Some solutions to the system of linear equations, not necessarily exhaustive (Definition SSLE): \begin{gather*} x_1 = -7,\quad x_2 = -2,\quad x_3 = 7,\quad x_4 = 1\\ x_1 = -1,\quad x_2 = 7,\quad x_3 = 4,\quad x_4 = -2 \end{gather*}
⬜  Augmented Matrix  Augmented matrix of the linear system of equations (Definition AM): \begin{equation*} \begin{bmatrix} 2 & -3 & 1 & -6 & -7\\ 4 & 1 & 2 & 9 & -7\\ 3 & 1 & 1 & 8 & -8 \end{bmatrix} \end{equation*}
⬜  Row-Reduced Augmented Matrix  Matrix in reduced row-echelon form, row-equivalent to the augmented matrix. (Definition RREF) \begin{equation*} \begin{bmatrix} \leading{1} & 0 & 0 & 2 & -5\\ 0 & \leading{1} & 0 & 3 & 1 \\ 0 & 0 & \leading{1} & -1 & 6 \end{bmatrix} \end{equation*}
⬜  Augmented Matrix Analysis  Analysis of the augmented matrix (Definition RREF). \begin{align*} r&=3&D&=\set{1,\,2,\,3}&F&=\set{4,\,5} \end{align*}
⬜  Vector Form of Solutions  Vector form of the solution set to the system of equations (Theorem VFSLS). Notice the relationship between the free variables and the set \(F\) above. Also, notice the pattern of 0's and 1's in the entries of the vectors corresponding to elements of the set \(F\) in the larger examples. \begin{equation*} \colvector{x_1\\x_2\\x_3\\x_4}=\colvector{-5\\1\\6\\0} + x_4\colvector{-2\\-3\\1\\1} \end{equation*}
⬜  Associated Homogeneous System  Given a system of equations we can always build a new, related, homogeneous system (Definition HS) by converting the constant terms to zeros and retaining the coefficients of the variables. Properties of this new system will have precise relationships with various properties of the original system. \begin{align*} 2x_1 - 3x_2 + x_3 - 6x_4 &= 0 \\ 4x_1 +x_2 +2x_3 + 9x_4 &= 0 \\ 3x_1 +x_2 +x_3 + 8x_4 &= 0 \end{align*}
⬜  Solutions, Homogeneous System  Some solutions to the associated homogeneous system of linear equations, not necessarily exhaustive (Definition SSLE). Review Theorem HSC as you consider these solutions. \begin{gather*} x_1 = 0,\quad x_2 = 0,\quad x_3 = 0,\quad x_4=0\\ x_1 = -2,\quad x_2 = -3,\quad x_3 = 1,\quad x_4=1\\ x_1 = -4,\quad x_2 = -6,\quad x_3 = 2,\quad x_4=2 \end{gather*}
⬜  Row-Reduced Augmented Matrix, Homogeneous System  Form the augmented matrix of the homogeneous linear system, and use row operations to convert to reduced row-echelon form. Notice how the entries of the final column remain zeros. \begin{equation*} \begin{bmatrix} \leading{1} & 0 & 0 & 2 & 0\\ 0 & \leading{1} & 0 & 3 & 0 \\ 0 & 0 & \leading{1} & -1 & 0 \end{bmatrix} \end{equation*}
⬜  Augmented Matrix Analysis, Homogeneous System  Analysis of the augmented matrix for the homogeneous system (Definition RREF). Compare this with the same analysis of the original system, especially in the case where the original system is inconsistent (Theorem RCLS). \begin{align*} r&=3&D&=\set{1,\,2,\,3}&F&=\set{4,\,5} \end{align*}
⬜  Coefficient Matrix  For any system of equations we can isolate the coefficient matrix, which will be identical to the coefficient matrix of the associated homogeneous system. For the remainder of the discussion of this system of equations, we will analyze just the coefficient matrix. \begin{equation*} \begin{bmatrix} 2 & -3 & 1 & -6 \\ 4 & 1 & 2 & 9 \\ 3 & 1 & 1 & 8 \end{bmatrix} \end{equation*}
⬜  Row-Reduced Coefficient Matrix  Row-equivalent matrix in reduced row-echelon form (Definition RREF). \begin{equation*} \begin{bmatrix} \leading{1} & 0 & 0 & 2 \\ 0 & \leading{1} & 0 & 3 \\ 0 & 0 & \leading{1} & -1 \end{bmatrix} \end{equation*}
⬜  Coefficient Matrix Analysis  Analysis of the reduced row-echelon form of the matrix (Definition RREF). For archetypes that begin as systems of equations, compare this analysis with the analysis for the coefficient matrices of the original system, and of the associated homogeneous system. \begin{align*} r&=3&D&=\set{1,\,2,\,3}&F&=\set{4} \end{align*}
⬜  Nonsingular Matrix?  Is the matrix nonsingular or singular? The question is moot, since the matrix is not square. Notice that there are many other properties that only make sense for square matrices, such as the inverse matrix, the determinant, eigenvalues and diagonalization.
⬜  Null Space  The null space of the matrix. The set of vectors used in the span construction is a linearly independent set of column vectors that spans the null space of the matrix (Theorem SSNS, Theorem BNS). Solve a homogeneous system with this matrix as the coefficient matrix and write the solutions in vector form (Theorem VFSLS) to see these vectors arise. Compare the entries of these vectors for indices in \(D\) versus entries for indices in \(F\text{.}\) \begin{equation*} \set{\colvector{-2\\-3\\1\\1}} \end{equation*}
⬜  Column Space, Original Columns  The column space of the matrix, expressed as the span of a set of linearly independent vectors that are also columns of the matrix. These columns have indices that form the set \(D\) above (Theorem BCS). \begin{equation*} \set{\colvector{2\\4\\3},\,\colvector{-3\\1\\1},\,\colvector{1\\2\\1}} \end{equation*}
⬜  Column Space, Extended Echelon Form  The column space of the matrix, as it arises from the extended echelon form of the matrix. The matrix \(L\) is computed as described in Definition EEF. This is followed by the column space, described as the span of a set of linearly independent vectors; by Theorem FS this span equals the null space of \(L\text{,}\) and the basis is computed according to Theorem BNS. When \(r=m\text{,}\) the matrix \(L\) has no rows and the column space is all of \(\complex{m}\text{.}\) \begin{equation*} L=\begin{bmatrix}\end{bmatrix} \end{equation*} \begin{equation*} \set{\colvector{1\\0\\0},\,\colvector{0\\1\\0},\,\colvector{0\\0\\1}} \end{equation*}
⬜  Column Space, Row Space of Transpose  The column space of the matrix, expressed as the span of a set of linearly independent vectors. These vectors are computed by bringing the transpose of the matrix into reduced row-echelon form, tossing out the zero rows, and writing the remaining nonzero rows as column vectors. By Theorem CSRST and Theorem BRS, and in the style of Example CSROI, this yields a linearly independent set of vectors that span the column space. \begin{equation*} \set{\colvector{1\\0\\0},\,\colvector{0\\1\\0},\,\colvector{0\\0\\1}} \end{equation*}
⬜  Row Space  Row space of the matrix, expressed as a span of a set of linearly independent vectors, obtained from the nonzero rows of the row-equivalent matrix in reduced row-echelon form. (Theorem BRS) \begin{equation*} \set{\colvector{1\\0\\0\\2},\,\colvector{0\\1\\0\\3 },\,\colvector{0\\0\\1\\ -1}} \end{equation*}
⬜  Subspace Dimensions  Subspace dimensions associated with the matrix (Definition ROM, Definition NOM). Verify Theorem RPNC. \begin{align*} \text{rank}&=3&\text{nullity}&=1&\text{columns}&=4 \end{align*}

Archetype D

⬜  Summary   System with three equations, four variables. Consistent. Null space of coefficient matrix has dimension 2. Coefficient matrix identical to that of Archetype E, vector of constants is different.
⬜  Definition  A system of linear equations (Definition SLE). \begin{align*} 2x_1 + x_2 + 7x_3 - 7x_4 &= 8 \\ -3x_1 + 4x_2 -5x_3 - 6x_4 &= -12 \\ x_1 +x_2 + 4x_3 - 5x_4 &= 4 \end{align*}
⬜  Solutions  Some solutions to the system of linear equations, not necessarily exhaustive (Definition SSLE): \begin{gather*} x_1 = 0,\quad x_2 = 1,\quad x_3 = 2,\quad x_4 = 1\\ x_1 = 4,\quad x_2 = 0,\quad x_3 = 0,\quad x_4 = 0\\ x_1 = 7,\quad x_2 = 8,\quad x_3 = 1,\quad x_4 = 3 \end{gather*}
⬜  Augmented Matrix  Augmented matrix of the linear system of equations (Definition AM): \begin{equation*} \begin{bmatrix} 2 & 1 & 7 & -7 & 8\\ -3 & 4 & -5 & -6 & -12\\ 1 & 1 & 4 & -5 & 4 \end{bmatrix} \end{equation*}
⬜  Row-Reduced Augmented Matrix  Matrix in reduced row-echelon form, row-equivalent to the augmented matrix. (Definition RREF) \begin{equation*} \begin{bmatrix} \leading{1} & 0 & 3 & -2 & 4 \\ 0 & \leading{1} & 1 & -3 & 0\\ 0 & 0 & 0 & 0 & 0 \end{bmatrix} \end{equation*}
⬜  Augmented Matrix Analysis  Analysis of the augmented matrix (Definition RREF). \begin{align*} r&=2&D&=\set{1,\,2}&F&=\set{3,\,4,\,5} \end{align*}
⬜  Vector Form of Solutions  Vector form of the solution set to the system of equations (Theorem VFSLS). Notice the relationship between the free variables and the set \(F\) above. Also, notice the pattern of 0's and 1's in the entries of the vectors corresponding to elements of the set \(F\) in the larger examples. \begin{equation*} \colvector{x_1\\x_2\\x_3\\x_4}=\colvector{4\\0\\0\\0} + x_3\colvector{-3\\-1\\1\\0}+ x_4\colvector{2\\3\\0\\1} \end{equation*}
⬜  Associated Homogeneous System  Given a system of equations we can always build a new, related, homogeneous system (Definition HS) by converting the constant terms to zeros and retaining the coefficients of the variables. Properties of this new system will have precise relationships with various properties of the original system. \begin{align*} 2x_1 + x_2 + 7x_3 - 7x_4 &= 0 \\ -3x_1 + 4x_2 -5x_3 - 6x_4 &= 0 \\ x_1 +x_2 + 4x_3 - 5x_4 &= 0 \end{align*}
⬜  Solutions, Homogeneous System  Some solutions to the associated homogeneous system of linear equations, not necessarily exhaustive (Definition SSLE). Review Theorem HSC as you consider these solutions. \begin{gather*} x_1 = 0,\quad x_2 = 0,\quad x_3 = 0,\quad x_4=0\\ x_1 = -3,\quad x_2 = -1,\quad x_3 = 1,\quad x_4=0\\ x_1 = 2,\quad x_2 = 3,\quad x_3 = 0,\quad x_4=1\\ x_1 = -1,\quad x_2 = 2,\quad x_3 = 1,\quad x_4=1 \end{gather*}
⬜  Row-Reduced Augmented Matrix, Homogeneous System  Form the augmented matrix of the homogeneous linear system, and use row operations to convert to reduced row-echelon form. Notice how the entries of the final column remain zeros. \begin{equation*} \begin{bmatrix} \leading{1} & 0 & 3 & -2 & 0 \\ 0 & \leading{1} & 1 & -3 & 0\\ 0 & 0 & 0 & 0 & 0 \end{bmatrix} \end{equation*}
⬜  Augmented Matrix Analysis, Homogeneous System  Analysis of the augmented matrix for the homogeneous system (Definition RREF). Compare this with the same analysis of the original system, especially in the case where the original system is inconsistent (Theorem RCLS). \begin{align*} r&=2&D&=\set{1,\,2}&F&=\set{3,\,4,\,5} \end{align*}
⬜  Coefficient Matrix  For any system of equations we can isolate the coefficient matrix, which will be identical to the coefficient matrix of the associated homogeneous system. For the remainder of the discussion of this system of equations, we will analyze just the coefficient matrix. \begin{equation*} \begin{bmatrix} 2 & 1 & 7 & -7\\ -3 & 4 & -5 & -6\\ 1 & 1 & 4 & -5 \end{bmatrix} \end{equation*}
⬜  Row-Reduced Coefficient Matrix  Row-equivalent matrix in reduced row-echelon form (Definition RREF). \begin{equation*} \begin{bmatrix} \leading{1} & 0 & 3 & -2\\ 0 & \leading{1} & 1 & -3\\ 0 & 0 & 0 & 0 \end{bmatrix} \end{equation*}
⬜  Coefficient Matrix Analysis  Analysis of the reduced row-echelon form of the matrix (Definition RREF). For archetypes that begin as systems of equations, compare this analysis with the analysis for the coefficient matrices of the original system, and of the associated homogeneous system. \begin{align*} r&=2&D&=\set{1,\,2}&F&=\set{3,\,4} \end{align*}
⬜  Nonsingular Matrix?  Is the matrix nonsingular or singular? The question is moot, since the matrix is not square. Notice that there are many other properties that only make sense for square matrices, such as the inverse matrix, the determinant, eigenvalues and diagonalization.
⬜  Null Space  The null space of the matrix. The set of vectors used in the span construction is a linearly independent set of column vectors that spans the null space of the matrix (Theorem SSNS, Theorem BNS). Solve a homogeneous system with this matrix as the coefficient matrix and write the solutions in vector form (Theorem VFSLS) to see these vectors arise. Compare the entries of these vectors for indices in \(D\) versus entries for indices in \(F\text{.}\) \begin{equation*} \set{\colvector{-3\\-1\\1\\0},\,\colvector{2\\3\\0\\1}} \end{equation*}
⬜  Column Space, Original Columns  The column space of the matrix, expressed as the span of a set of linearly independent vectors that are also columns of the matrix. These columns have indices that form the set \(D\) above (Theorem BCS). \begin{equation*} \set{\colvector{2\\-3\\1},\,\colvector{1\\4\\1}} \end{equation*}
⬜  Column Space, Extended Echelon Form  The column space of the matrix, as it arises from the extended echelon form of the matrix. The matrix \(L\) is computed as described in Definition EEF. This is followed by the column space, described as the span of a set of linearly independent vectors; by Theorem FS this span equals the null space of \(L\text{,}\) and the basis is computed according to Theorem BNS. When \(r=m\text{,}\) the matrix \(L\) has no rows and the column space is all of \(\complex{m}\text{.}\) \begin{equation*} L=\begin{bmatrix}1&\frac{1}{7}&-\frac{11}{7}\end{bmatrix} \end{equation*} \begin{equation*} \set{\colvector{\frac{11}{7}\\0\\1},\,\colvector{-\frac{1}{7}\\1\\0}} \end{equation*}
⬜  Column Space, Row Space of Transpose  The column space of the matrix, expressed as the span of a set of linearly independent vectors. These vectors are computed by bringing the transpose of the matrix into reduced row-echelon form, tossing out the zero rows, and writing the remaining nonzero rows as column vectors. By Theorem CSRST and Theorem BRS, and in the style of Example CSROI, this yields a linearly independent set of vectors that span the column space. \begin{equation*} \set{\colvector{1\\0\\\frac{7}{11}},\,\colvector{0\\1\\\frac{1}{11}}} \end{equation*}
⬜  Row Space  Row space of the matrix, expressed as a span of a set of linearly independent vectors, obtained from the nonzero rows of the row-equivalent matrix in reduced row-echelon form. (Theorem BRS) \begin{equation*} \set{\colvector{1\\0\\3\\-2},\,\colvector{0\\1\\1\\-3}} \end{equation*}
⬜  Subspace Dimensions  Subspace dimensions associated with the matrix (Definition ROM, Definition NOM). Verify Theorem RPNC. \begin{align*} \text{rank}&=2&\text{nullity}&=2&\text{columns}&=4 \end{align*}

Archetype E

⬜  Summary   System with three equations, four variables. Inconsistent. Null space of coefficient matrix has dimension 2. Coefficient matrix identical to that of Archetype D, constant vector is different.
⬜  Definition  A system of linear equations (Definition SLE). \begin{align*} 2x_1 + x_2 + 7x_3 - 7x_4 &= 2 \\ -3x_1 + 4x_2 -5x_3 - 6x_4 &= 3 \\ x_1 +x_2 + 4x_3 - 5x_4 &= 2 \end{align*}
⬜  Solutions  Some solutions to the system of linear equations, not necessarily exhaustive (Definition SSLE): the system is inconsistent, so there are no solutions to list.
⬜  Augmented Matrix  Augmented matrix of the linear system of equations (Definition AM): \begin{equation*} \begin{bmatrix} 2 & 1 & 7 & -7 & 2\\ -3 & 4 & -5 & -6 & 3\\ 1 & 1 & 4 & -5 & 2 \end{bmatrix} \end{equation*}
⬜  Row-Reduced Augmented Matrix  Matrix in reduced row-echelon form, row-equivalent to the augmented matrix. (Definition RREF) \begin{equation*} \begin{bmatrix} \leading{1} & 0 & 3 & -2 & 0\\ 0 & \leading{1} & 1 & -3 & 0\\ 0 & 0 & 0 & 0 & \leading{1} \end{bmatrix} \end{equation*}
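Sketched in SymPy (an assumed tool, not prescribed by the text), Theorem RCLS becomes a one-line test: the system is inconsistent exactly when the last column of the augmented matrix is a pivot column.

from sympy import Matrix

aug = Matrix([[ 2, 1,  7, -7, 2],
              [-3, 4, -5, -6, 3],
              [ 1, 1,  4, -5, 2]])
R, pivots = aug.rref()
print(pivots)                  # (0, 1, 4): the final column is a pivot column
print(aug.cols - 1 in pivots)  # True, so the system is inconsistent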
⬜  Augmented Matrix Analysis  Analysis of the augmented matrix (Definition RREF). \begin{align*} r&=3&D&=\set{1,\,2,\,5}&F&=\set{3,\,4} \end{align*}
⬜  Vector Form of Solutions  Vector form of the solution set to the system of equations (Theorem VFSLS). Notice the relationship between the free variables and the set \(F\) above. Also, notice the pattern of 0's and 1's in the entries of the vectors corresponding to elements of the set \(F\) in the larger examples. Since this system is inconsistent, there is no solution set to express in vector form.
⬜  Associated Homogeneous System  Given a system of equations we can always build a new, related, homogeneous system (Definition HS) by converting the constant terms to zeros and retaining the coefficients of the variables. Properties of this new system will have precise relationships with various properties of the original system. \begin{align*} 2x_1 + x_2 + 7x_3 - 7x_4 &= 0 \\ -3x_1 + 4x_2 -5x_3 - 6x_4 &= 0 \\ x_1 +x_2 + 4x_3 - 5x_4 &= 0 \end{align*}
⬜  Solutions, Homogeneous System  Some solutions to the associated homogeneous system of linear equations, not necessarily exhaustive (Definition SSLE). Review Theorem HSC as you consider these solutions. \begin{gather*} x_1 = 0,\quad x_2 = 0,\quad x_3 = 0,\quad x_4=0\\ x_1 = 4,\quad x_2 = 13,\quad x_3 = 2,\quad x_4=5 \end{gather*}
⬜  Row-Reduced Augmented Matrix, Homogeneous System  Form the augmented matrix of the homogeneous linear system, and use row operations to convert to reduced row-echelon form. Notice how the entries of the final column remain zeros. \begin{equation*} \begin{bmatrix} \leading{1} & 0 & 3 & -2 & 0\\ 0 & \leading{1} & 1 & -3 & 0\\ 0 & 0 & 0 & 0 & 0 \end{bmatrix} \end{equation*}
⬜  Augmented Matrix Analysis, Homogeneous System  Analysis of the augmented matrix for the homogeneous system (Definition RREF). Compare this with the same analysis of the original system, especially in the case where the original system is inconsistent (Theorem RCLS). \begin{align*} r&=2&D&=\set{1,\,2}&F&=\set{3,\,4,\,5} \end{align*}
⬜  Coefficient Matrix  For any system of equations we can isolate the coefficient matrix, which will be identical to the coefficient matrix of the associated homogeneous system. For the remainder of the discussion of this system of equations, we will analyze just the coefficient matrix. \begin{equation*} \begin{bmatrix} 2 & 1 & 7 & -7 \\ -3 & 4 & -5 & -6 \\ 1 & 1 & 4 & -5 \end{bmatrix} \end{equation*}
⬜  Row-Reduced Coefficient Matrix  Row-equivalent matrix in reduced row-echelon form (Definition RREF). \begin{equation*} \begin{bmatrix} \leading{1} & 0 & 3 & -2 \\ 0 & \leading{1} & 1 & -3 \\ 0 & 0 & 0 & 0 \end{bmatrix} \end{equation*}
⬜  Coefficient Matrix Analysis  Analysis of the reduced row-echelon form of the matrix (Definition RREF). For archetypes that begin as systems of equations, compare this analysis with the analysis for the coefficient matrices of the original system, and of the associated homogeneous system. \begin{align*} r&=2&D&=\set{1,\,2}&F&=\set{3,\,4} \end{align*}
⬜  Nonsingular Matrix?  Is the matrix nonsingular or singular? The question is moot, since the matrix is not square. Notice that there are many other properties that only make sense for square matrices, such as the inverse matrix, the determinant, eigenvalues and diagonalization.
⬜  Null Space  The null space of the matrix. The set of vectors used in the span construction is a linearly independent set of column vectors that spans the null space of the matrix (Theorem SSNS, Theorem BNS). Solve a homogeneous system with this matrix as the coefficient matrix and write the solutions in vector form (Theorem VFSLS) to see these vectors arise. Compare the entries of these vectors for indices in \(D\) versus entries for indices in \(F\text{.}\) \begin{equation*} \set{\colvector{-3\\-1\\1\\0},\,\colvector{2\\3\\0\\1}} \end{equation*}
⬜  Column Space, Original Columns  The column space of the matrix, expressed as the span of a set of linearly independent vectors that are also columns of the matrix. These columns have indices that form the set \(D\) above (Theorem BCS). \begin{equation*} \set{\colvector{2\\-3\\1},\,\colvector{1\\4\\1}} \end{equation*}
⬜  Column Space, Extended Echelon Form  The column space of the matrix, as it arises from the extended echelon form of the matrix. The matrix \(L\) is computed as described in Definition EEF. This is followed by the column space, described as the span of a set of linearly independent vectors; by Theorem FS this span equals the null space of \(L\text{,}\) and the basis is computed according to Theorem BNS. When \(r=m\text{,}\) the matrix \(L\) has no rows and the column space is all of \(\complex{m}\text{.}\) \begin{equation*} L=\begin{bmatrix}1&\frac{1}{7}&-\frac{11}{7}\end{bmatrix} \end{equation*} \begin{equation*} \set{\colvector{\frac{11}{7}\\0\\1},\,\colvector{-\frac{1}{7}\\1\\0}} \end{equation*}
⬜  Column Space, Row Space of Transpose  The column space of the matrix, expressed as the span of a set of linearly independent vectors. These vectors are computed by bringing the transpose of the matrix into reduced row-echelon form, tossing out the zero rows, and writing the remaining nonzero rows as column vectors. By Theorem CSRST and Theorem BRS, and in the style of Example CSROI, this yields a linearly independent set of vectors that span the column space. \begin{equation*} \set{\colvector{1\\0\\\frac{7}{11}},\,\colvector{0\\1\\\frac{1}{11}}} \end{equation*}
⬜  Row Space  Row space of the matrix, expressed as a span of a set of linearly independent vectors, obtained from the nonzero rows of the row-equivalent matrix in reduced row-echelon form. (Theorem BRS) \begin{equation*} \set{\colvector{1\\0\\3\\-2},\,\colvector{0\\1\\1\\-3}} \end{equation*}
⬜  Subspace Dimensions  Subspace dimensions associated with the matrix (Definition ROM, Definition NOM). Verify Theorem RPNC. \begin{align*} \text{rank}&=2&\text{nullity}&=2&\text{columns}&=4 \end{align*}

Archetype F

⬜  Summary   System with four equations, four variables. Nonsingular coefficient matrix. Integer eigenvalues, one has “high” multiplicity.
⬜  Definition  A system of linear equations (Definition SLE). \begin{align*} 33x_1 - 16x_2 + 10x_3 - 2x_4 &= -27 \\ 99x_1 - 47x_2 + 27x_3 - 7x_4 &= -77 \\ 78x_1 - 36x_2 + 17x_3 - 6x_4 &= -52 \\ -9x_1 + 2x_2 + 3x_3 +4x_4 &= 5 \end{align*}
⬜  Solutions  Some solutions to the system of linear equations, not necessarily exhaustive (Definition SSLE): \begin{gather*} x_1 = 1,\quad x_2 = 2,\quad x_3 = -2,\quad x_4 = 4 \end{gather*}
⬜  Augmented Matrix  Augmented matrix of the linear system of equations (Definition AM): \begin{equation*} \begin{bmatrix} 33 & -16 & 10 & -2 & -27\\ 99 & -47 & 27 & -7 & -77\\ 78 & -36 & 17 & -6 & -52\\ -9 & 2 & 3 & 4 & 5 \end{bmatrix} \end{equation*}
⬜  Row-Reduced Augmented Matrix  Matrix in reduced row-echelon form, row-equivalent to the augmented matrix. (Definition RREF) \begin{equation*} \begin{bmatrix} \leading{1} & 0 & 0 & 0 & 1\\ 0 & \leading{1} & 0 & 0 & 2\\ 0 & 0 & \leading{1} & 0 & -2 \\ 0 & 0 & 0 & \leading{1} & 4 \end{bmatrix} \end{equation*}
⬜  Augmented Matrix Analysis  Analysis of the augmented matrix (Definition RREF). \begin{align*} r&=4&D&=\set{1,\,2,\,3,\,4}&F&=\set{5} \end{align*}
⬜  Vector Form of Solutions  Vector form of the solution set to the system of equations (Theorem VFSLS). Notice the relationship between the free variables and the set \(F\) above. Also, notice the pattern of 0's and 1's in the entries of the vectors corresponding to elements of the set \(F\) in the larger examples. \begin{equation*} \colvector{x_1\\x_2\\x_3\\x_4}=\colvector{1\\2\\-2\\4} \end{equation*}
⬜  Associated Homogeneous System  Given a system of equations we can always build a new, related, homogeneous system (Definition HS) by converting the constant terms to zeros and retaining the coefficients of the variables. Properties of this new system will have precise relationships with various properties of the original system. \begin{align*} 33x_1 - 16x_2 + 10x_3 - 2x_4 &= 0 \\ 99x_1 - 47x_2 + 27x_3 - 7x_4 &= 0 \\ 78x_1 - 36x_2 + 17x_3 - 6x_4 &= 0 \\ -9x_1 + 2x_2 + 3x_3 +4x_4 &= 0 \end{align*}
⬜  Solutions, Homogeneous System  Some solutions to the associated homogeneous system of linear equations, not necessarily exhaustive (Definition SSLE). Review Theorem HSC as you consider these solutions. \begin{gather*} x_1 = 0,\quad x_2 = 0,\quad x_3 = 0,\quad x_4=0 \end{gather*}
⬜  Row-Reduced Augmented Matrix, Homogeneous System  Form the augmented matrix of the homogeneous linear system, and use row operations to convert to reduced row-echelon form. Notice how the entries of the final column remain zeros. \begin{equation*} \begin{bmatrix} \leading{1} & 0 & 0 & 0 & 0\\ 0 & \leading{1} & 0 & 0 & 0\\ 0 & 0 & \leading{1} & 0 & 0 \\ 0 & 0 & 0 & \leading{1} & 0 \end{bmatrix} \end{equation*}
⬜  Augmented Matrix Analysis, Homogeneous System  Analysis of the augmented matrix for the homogeneous system (Definition RREF). Compare this with the same analysis of the original system, especially in the case where the original system is inconsistent (Theorem RCLS). \begin{align*} r&=4&D&=\set{1,\,2,\,3,\,4}&F&=\set{5} \end{align*}
⬜  Coefficient Matrix  For any system of equations we can isolate the coefficient matrix, which will be identical to the coefficient matrix of the associated homogeneous system. For the remainder of the discussion of this system of equations, we will analyze just the coefficient matrix. \begin{equation*} \begin{bmatrix} 33 & -16 & 10 & -2 \\ 99 & -47 & 27 & -7 \\ 78 & -36 & 17 & -6 \\ -9 & 2 & 3 & 4 \end{bmatrix} \end{equation*}
⬜  Row-Reduced Coefficient Matrix  Row-equivalent matrix in reduced row-echelon form (Definition RREF). \begin{equation*} \begin{bmatrix} \leading{1} & 0 & 0 & 0 \\ 0 & \leading{1} & 0 & 0 \\ 0 & 0 & \leading{1} & 0 \\ 0 & 0 & 0 & \leading{1} \end{bmatrix} \end{equation*}
⬜  Coefficient Matrix Analysis  Analysis of the reduced row-echelon form of the matrix (Definition RREF). For archetypes that begin as systems of equations, compare this analysis with the analysis for the coefficient matrices of the original system, and of the associated homogeneous system. \begin{align*} r&=4&D&=\set{1,\,2,\,3,\,4}&F&=\set{\ } \end{align*}
⬜  Nonsingular Matrix?  Is the matrix nonsingular or singular? Nonsingular. Notice that the row-reduced version of the matrix is the identity matrix and apply Theorem NMRRI. At the same time, examine the sizes of the sets \(D\) and \(F\) from the analysis of the reduced row-echelon version of the matrix.
⬜  Null Space  The null space of the matrix. The set of vectors used in the span construction is a linearly independent set of column vectors that spans the null space of the matrix (Theorem SSNS, Theorem BNS). Solve a homogeneous system with this matrix as the coefficient matrix and write the solutions in vector form (Theorem VFSLS) to see these vectors arise. Compare the entries of these vectors for indices in \(D\) versus entries for indices in \(F\text{.}\) \begin{equation*} \set{\ } \end{equation*}
⬜  Column Space, Original Columns  The column space of the matrix, expressed as the span of a set of linearly independent vectors that are also columns of the matrix. These columns have indices that form the set \(D\) above (Theorem BCS). \begin{equation*} \set{\colvector{33\\99\\78\\-9},\,\colvector{-16\\-47\\-36\\2},\,\colvector{10\\27\\17\\3},\,\colvector{-2\\-7\\-6\\4}} \end{equation*}
⬜  Column Space, Extended Echelon Form  The column space of the matrix, as it arises from the extended echelon form of the matrix. The matrix \(L\) is computed as described in Definition EEF. This is followed by the column space, described as the span of a set of linearly independent vectors; by Theorem FS this span equals the null space of \(L\text{,}\) and the basis is computed according to Theorem BNS. When \(r=m\text{,}\) the matrix \(L\) has no rows and the column space is all of \(\complex{m}\text{.}\) \begin{equation*} L=\begin{bmatrix}\end{bmatrix} \end{equation*} \begin{equation*} \set{\colvector{1\\0\\0\\0},\,\colvector{0\\1\\0\\0},\,\colvector{0\\0\\1\\0},\,\colvector{0\\0\\0\\1}} \end{equation*}
⬜  Column Space, Row Space of Transpose  The column space of the matrix, expressed as the span of a set of linearly independent vectors. These vectors are computed by bringing the transpose of the matrix into reduced row-echelon form, tossing out the zero rows, and writing the remaining nonzero rows as column vectors. By Theorem CSRST and Theorem BRS, and in the style of Example CSROI, this yields a linearly independent set of vectors that span the column space. \begin{equation*} \set{\colvector{1\\0\\0\\0},\,\colvector{0\\1\\0\\0},\,\colvector{0\\0\\1\\0},\,\colvector{0\\0\\0\\1}} \end{equation*}
⬜  Row Space  Row space of the matrix, expressed as a span of a set of linearly independent vectors, obtained from the nonzero rows of the row-equivalent matrix in reduced row-echelon form. (Theorem BRS) \begin{equation*} \set{\colvector{1\\0\\0\\0},\,\colvector{0\\1\\0\\0},\,\colvector{0\\0\\1\\0},\,\colvector{0\\0\\0\\1}} \end{equation*}
⬜  Inverse Matrix?  The matrix is nonsingular, and by Theorem NI has an inverse (Definition MI). The inverse can be computed with the procedure in Theorem CINM. \begin{equation*} \begin{bmatrix} -\left( \frac{86}{3} \right) & \frac{38}{3} & -\left( \frac{11}{3} \right) & \frac{7} {3} \\ -\left( \frac{129}{2} \right) & \frac{86}{3} & -\left( \frac{17}{2} \right) & \frac{31}{6} \\ -13 & 6 & -2 & 1 \\ -\left( \frac{45}{2} \right) & \frac{29}{3} & -\left( \frac{5}{2} \right) & \frac{13}{6} \end{bmatrix} \end{equation*}
⬜  Subspace Dimensions  Subspace dimensions associated with the matrix (Definition ROM, Definition NOM). Verify Theorem RPNC. \begin{align*} \text{rank}&=4&\text{nullity}&=0&\text{columns}&=4 \end{align*}
⬜  Determinant  Value of the determinant of the matrix. The matrix is nonsingular so the determinant is nonzero (Theorem SMZD). Notice that zero is not an eigenvalue of the matrix (Theorem SMZE). \begin{equation*} \text{determinant}=-18 \end{equation*}
⬜  Eigenvalues, Eigenspaces  Eigenvalues and bases for eigenspaces (Definition EEM, Definition EM). Compute a matrix-vector product (Definition MVP) for each eigenvector as an interesting check. \begin{align*} \eigensystem{F}{-1}{\colvector{1\\2\\0\\1}}\\ \eigensystem{F}{2}{\colvector{2\\5\\2\\1}}\\ \eigensystem{F}{3}{\colvector{1\\1\\0\\7},\,\colvector{17\\45\\21\\0}} \end{align*}
⬜  Eigenvalue Multiplicities  Geometric and algebraic multiplicities (Definition GME, Definition AME). \begin{align*} \geomult{F}{-1}&=1&\algmult{F}{-1}&=1\\ \geomult{F}{2}&=1&\algmult{F}{2}&=1\\ \geomult{F}{3}&=2&\algmult{F}{3}&=2 \end{align*}
⬜  Diagonalizable  Diagonalizable (Definition DZM)?

Yes, full eigenspaces, Theorem DMFE.

⬜  Diagonalization  The diagonalization (Theorem DC). \begin{equation*} \begin{bmatrix}12&-5&1&-1\\-39&18&-7&3\\ \frac{27}{7}&-\frac{13}{7}&\frac{6}{7}&-\frac{1}{7}\\ \frac{26}{7}&-\frac{12}{7}&\frac{5}{7}&-\frac{2}{7} \end{bmatrix}\begin{bmatrix} 33 & -16 & 10 & -2 \\ 99 & -47 & 27 & -7 \\ 78 & -36 & 17 & -6 \\ -9 & 2 & 3 & 4 \end{bmatrix} \begin{bmatrix}1&2&1&17\\2&5&1&45\\0&2&0&21\\1&1&7&0 \end{bmatrix}=\begin{bmatrix}-1&0&0&0\\0&2&0&0\\0&0&3&0\\0&0&0&3 \end{bmatrix} \end{equation*}

Archetype G

⬜  Summary   System with five equations, two variables. Consistent. Null space of coefficient matrix has dimension 0. Coefficient matrix identical to that of Archetype H, constant vector is different.
⬜  Definition  A system of linear equations (Definition SLE). \begin{align*} 2x_1 + 3x_2 &= 6 \\ -x_1 + 4x_2 &= -14 \\ 3x_1 +10x_2 &= -2 \\ 3x_1 - x_2 &= 20 \\ 6x_1 + 9x_2 &= 18 \end{align*}
⬜  Solutions  Some solutions to the system of linear equations, not necessarily exhaustive (Definition SSLE): \begin{gather*} x_1 = 6,\quad x_2 = -2 \end{gather*}
⬜  Augmented Matrix  Augmented matrix of the linear system of equations (Definition AM): \begin{equation*} \begin{bmatrix} 2 & 3 & 6 \\ -1 & 4 & -14 \\ 3 & 10 & -2 \\ 3 & -1 & 20 \\ 6 & 9 & 18 \end{bmatrix} \end{equation*}
⬜  Row-Reduced Augmented Matrix  Matrix in reduced row-echelon form, row-equivalent to the augmented matrix. (Definition RREF) \begin{equation*} \begin{bmatrix} \leading{1} & 0 & 6 \\ 0 & \leading{1} & -2 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix} \end{equation*}
⬜  Augmented Matrix Analysis  Analysis of the augmented matrix (Definition RREF). \begin{align*} r&=2&D&=\set{1,\,2}&F&=\set{3} \end{align*}
⬜  Vector Form of Solutions  Vector form of the solution set to the system of equations (Theorem VFSLS). Notice the relationship between the free variables and the set \(F\) above. Also, notice the pattern of 0's and 1's in the entries of the vectors corresponding to elements of the set \(F\) in the larger examples. \begin{equation*} \colvector{x_1\\x_2}=\colvector{6\\-2} \end{equation*}
⬜  Associated Homogeneous System  Given a system of equations we can always build a new, related, homogeneous system (Definition HS) by converting the constant terms to zeros and retaining the coefficients of the variables. Properties of this new system will have precise relationships with various properties of the original system. \begin{align*} 2x_1 + 3x_2 &= 0\\ -x_1 + 4x_2 &= 0 \\ 3x_1 +10x_2 &= 0 \\ 3x_1 - x_2 &= 0 \\ 6x_1 + 9x_2 &= 0 \end{align*}
⬜  Solutions, Homogeneous System  Some solutions to the associated homogeneous system of linear equations, not necessarily exhaustive (Definition SSLE). Review Theorem HSC as you consider these solutions. \begin{gather*} x_1 = 0,\quad x_2 = 0 \end{gather*}
⬜  Row-Reduced Augmented Matrix, Homogeneous System  Form the augmented matrix of the homogeneous linear system, and use row operations to convert to reduced row-echelon form. Notice how the entries of the final column remain zeros. \begin{equation*} \begin{bmatrix} \leading{1} & 0 & 0 \\ 0 & \leading{1} & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix} \end{equation*}
⬜  Augmented Matrix Analysis, Homogeneous System  Analysis of the augmented matrix for the homogeneous system (Definition RREF). Compare this with the same analysis of the original system, especially in the case where the original system is inconsistent (Theorem RCLS). \begin{align*} r&=2&D&=\set{1,\,2}&F&=\set{3} \end{align*}
⬜  Coefficient Matrix  For any system of equations we can isolate the coefficient matrix, which will be identical to the coefficient matrix of the associated homogeneous system. For the remainder of the discussion of this system of equations, we will analyze just the coefficient matrix. \begin{equation*} \begin{bmatrix} 2 & 3\\ -1 & 4 \\ 3 & 10\\ 3 & -1\\ 6 & 9 \end{bmatrix} \end{equation*}
⬜  Row-Reduced Coefficient Matrix  Row-equivalent matrix in reduced row-echelon form (Definition RREF). \begin{equation*} \begin{bmatrix} \leading{1} & 0 \\ 0 & \leading{1} \\ 0 & 0 \\ 0 & 0 \\ 0 & 0 \end{bmatrix} \end{equation*}
⬜  Coefficient Matrix Analysis  Analysis of the reduced row-echelon form of the matrix (Definition RREF). For archetypes that begin as systems of equations, compare this analysis with the analysis for the coefficient matrices of the original system, and of the associated homogeneous system. \begin{align*} r&=2&D&=\set{1,\,2}&F&=\set{\ } \end{align*}
⬜  Nonsingular Matrix?  Is the matrix nonsingular or singular? The question is moot, since the matrix is not square. Notice that there are many other properties that only make sense for square matrices, such as the inverse matrix, the determinant, eigenvalues and diagonalization.
⬜  Null Space  The null space of the matrix. The set of vectors used in the span construction is a linearly independent set of column vectors that spans the null space of the matrix (Theorem SSNS, Theorem BNS). Solve a homogeneous system with this matrix as the coefficient matrix and write the solutions in vector form (Theorem VFSLS) to see these vectors arise. Compare the entries of these vectors for indices in \(D\) versus entries for indices in \(F\text{.}\) \begin{equation*} \set{\ } \end{equation*}
⬜  Column Space, Original Columns  The column space of the matrix, expressed as the span of a set of linearly independent vectors that are also columns of the matrix. These columns have indices that form the set \(D\) above (Theorem BCS). \begin{equation*} \set{\colvector{2\\-1\\3\\3\\6},\,\colvector{3\\4\\10\\-1\\9}} \end{equation*}
⬜  Column Space, Extended Echelon Form  The column space of the matrix, as it arises from the extended echelon form of the matrix. The matrix \(L\) is computed as described in Definition EEF. This is followed by the column space described as the span of a set of linearly independent vectors that equals the null space of \(L\text{,}\) computed according to Theorem FS and Theorem BNS. When \(r=m\text{,}\) the matrix \(L\) has no rows and the column space is all of \(\complex{m}\text{.}\) \begin{equation*} L=\begin{bmatrix}1&0&0&0&-\frac{1}{3}\\0&1&0&1&-\frac{1}{3}\\0&0&1&1&-1\end{bmatrix} \end{equation*} \begin{equation*} \set{\colvector{\frac{1}{3}\\\frac{1}{3}\\1\\0\\1},\,\colvector{0\\-1\\-1\\1\\0}} \end{equation*}
⬜  Column Space, Row Space of Transpose  The column space of the matrix, expressed as the span of a set of linearly independent vectors. These vectors are computed by bringing the transpose of the matrix into reduced row-echelon form, tossing out the zero rows, and writing the remaining nonzero rows as column vectors. By Theorem CSRST and Theorem BRS, and in the style of Example CSROI, this yields a linearly independent set of vectors that span the column space. \begin{equation*} \set{\colvector{1\\0\\2\\1\\3},\,\colvector{0\\1\\1\\-1\\0}} \end{equation*}
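This recipe automates nicely. As an aside, a short SymPy sketch that row-reduces the transpose and keeps the nonzero rows, reproducing the two vectors above:

```python
from sympy import Matrix

A = Matrix([
    [ 2,  3],
    [-1,  4],
    [ 3, 10],
    [ 3, -1],
    [ 6,  9],
])

R, _ = A.T.rref()              # reduced row-echelon form of the transpose
for i in range(R.rows):
    if any(R.row(i)):          # discard the zero rows
        print(R.row(i))        # read each survivor as a column vector
```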
⬜  Row Space  Row space of the matrix, expressed as a span of a set of linearly independent vectors, obtained from the nonzero rows of the row-equivalent matrix in reduced row-echelon form. (Theorem BRS) \begin{equation*} \set{\colvector{1\\0},\,\colvector{0\\1}} \end{equation*}
⬜  Subspace Dimensions  Subspace dimensions associated with the matrix (Definition ROM, Definition NOM). Verify Theorem RPNC. \begin{align*} \text{rank}&=2&\text{nullity}&=0&\text{columns}&=2 \end{align*}

Archetype H

⬜  Summary   System with five equations, two variables. Inconsistent, overdetermined. Null space of coefficient matrix has dimension 0. Coefficient matrix identical to that of Archetype G, constant vector is different.
⬜  Definition  A system of linear equations (Definition SLE). \begin{align*} 2x_1 + 3x_2 &= 5 \\ -x_1 + 4x_2 &= 6 \\ 3x_1 +10x_2 &= 2 \\ 3x_1 - x_2 &= -1 \\ 6x_1 + 9x_2 &= 3 \end{align*}
⬜  Solutions  The system of linear equations is inconsistent, so it has no solutions (Definition SSLE, Theorem RCLS).
⬜  Augmented Matrix  Augmented matrix of the linear system of equations (Definition AM): \begin{equation*} \begin{bmatrix} 2 & 3 & 5 \\ -1 & 4 & 6 \\ 3 & 10 & 2 \\ 3 & -1 & -1 \\ 6 & 9 & 3 \end{bmatrix} \end{equation*}
⬜  Row-Reduced Augmented Matrix  Matrix in reduced row-echelon form, row-equivalent to the augmented matrix. (Definition RREF) \begin{equation*} \begin{bmatrix} \leading{1} & 0 & 0 \\ 0 & \leading{1} & 0 \\ 0 & 0 & \leading{1} \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix} \end{equation*}
⬜  Augmented Matrix Analysis  Analysis of the augmented matrix (Definition RREF). \begin{align*} r&=3&D&=\set{1,\,2,\,3}&F&=\set{\ } \end{align*}
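Theorem RCLS reads the inconsistency directly off this matrix: the final column of the augmented matrix is a pivot column. As an aside, a SymPy sketch of that test:

```python
from sympy import Matrix

aug = Matrix([
    [ 2,  3,  5],
    [-1,  4,  6],
    [ 3, 10,  2],
    [ 3, -1, -1],
    [ 6,  9,  3],
])

_, pivots = aug.rref()
# A pivot in the last column (0-based index 2) certifies inconsistency.
print((aug.cols - 1) in pivots)    # True for Archetype H
```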
⬜  Vector Form of Solutions  The system is inconsistent, so the solution set is empty and there is no vector form to display (Theorem VFSLS).
⬜  Associated Homogeneous System  Given a system of equations we can always build a new, related, homogeneous system (Definition HS) by converting the constant terms to zeros and retaining the coefficients of the variables. Properties of this new system will have precise relationships with various properties of the original system. \begin{align*} 2x_1 + 3x_2 &= 0 \\ -x_1 + 4x_2 &= 0 \\ 3x_1 +10x_2 &= 0 \\ 3x_1 - x_2 &= 0 \\ 6x_1 + 9x_2 &= 0 \end{align*}
⬜  Solutions, Homogeneous System  Some solutions to the associated homogeneous system of linear equations, not necessarily exhaustive (Definition SSLE). Review Theorem HSC as you consider these solutions. \begin{gather*} x_1 = 0,\quad x_2 = 0 \end{gather*}
⬜  Row-Reduced Augmented Matrix, Homogeneous System  Form the augmented matrix of the homogeneous linear system, and use row operations to convert to reduced row-echelon form. Notice how the entries of the final column remain zeros. \begin{equation*} \begin{bmatrix} \leading{1} & 0 & 0 \\ 0 & \leading{1} & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix} \end{equation*}
⬜  Augmented Matrix Analysis, Homogeneous System  Analysis of the augmented matrix for the homogeneous system (Definition RREF). Compare this with the same analysis of the original system, especially in the case where the original system is inconsistent (Theorem RCLS). \begin{align*} r&=2&D&=\set{1,\,2}&F&=\set{3} \end{align*}
⬜  Coefficient Matrix  For any system of equations we can isolate the coefficient matrix, which will be identical to the coefficient matrix of the associated homogeneous system. For the remainder of the discussion of this system of equations, we will analyze just the coefficient matrix. \begin{equation*} \begin{bmatrix} 2 & 3 \\ -1 & 4 \\ 3 & 10 \\ 3 & -1 \\ 6 & 9 \end{bmatrix} \end{equation*}
⬜  Row-Reduced Coefficient Matrix  Row-equivalent matrix in reduced row-echelon form (Definition RREF). \begin{equation*} \begin{bmatrix} \leading{1} & 0 \\ 0 & \leading{1} \\ 0 & 0 \\ 0 & 0 \\ 0 & 0 \end{bmatrix} \end{equation*}
⬜  Coefficient Matrix Analysis  Analysis of the reduced row-echelon form of the matrix (Definition RREF). For archetypes that begin as systems of equations, compare this analysis with the analysis for the coefficient matrices of the original system, and of the associated homogeneous system. \begin{align*} r&=2&D&=\set{1,\,2}&F&=\set{\ } \end{align*}
⬜  Nonsingular Matrix?  Is the matrix nonsingular or singular? The question is moot, since the matrix is not square. Notice that there are many other properties that only make sense for square matrices, such as the inverse matrix, the determinant, eigenvalues and diagonalization.
⬜  Null Space  The null space of the matrix. The set of vectors used in the span construction is a linearly independent set of column vectors that spans the null space of the matrix (Theorem SSNS, Theorem BNS). Solve a homogeneous system with this matrix as the coefficient matrix and write the solutions in vector form (Theorem VFSLS) to see these vectors arise. Compare the entries of these vectors for indices in \(D\) versus entries for indices in \(F\text{.}\) \begin{equation*} \set{\ } \end{equation*}
⬜  Column Space, Original Columns  The column space of the matrix, expressed as the span of a set of linearly independent vectors that are also columns of the matrix. These columns have indices that form the set \(D\) above (Theorem BCS). \begin{equation*} \set{\colvector{2\\-1\\3\\3\\6},\,\colvector{3\\4\\10\\-1\\9}} \end{equation*}
⬜  Column Space, Extended Echelon Form  The column space of the matrix, as it arises from the extended echelon form of the matrix. The matrix \(L\) is computed as described in Definition EEF. This is followed by the column space described as the span of a set of linearly independent vectors that equals the null space of \(L\text{,}\) computed according to Theorem FS and Theorem BNS. When \(r=m\text{,}\) the matrix \(L\) has no rows and the column space is all of \(\complex{m}\text{.}\) \begin{equation*} L=\begin{bmatrix}1&0&0&0&-\frac{1}{3}\\0&1&0&1&-\frac{1}{3}\\0&0&1&1&-1\end{bmatrix} \end{equation*} \begin{equation*} \set{\colvector{\frac{1}{3}\\\frac{1}{3}\\1\\0\\1},\,\colvector{0\\-1\\-1\\1\\0}} \end{equation*}
⬜  Column Space, Row Space of Transpose  The column space of the matrix, expressed as the span of a set of linearly independent vectors. These vectors are computed by bringing the transpose of the matrix into reduced row-echelon form, tossing out the zero rows, and writing the remaining nonzero rows as column vectors. By Theorem CSRST and Theorem BRS, and in the style of Example CSROI, this yields a linearly independent set of vectors that span the column space. \begin{equation*} \set{\colvector{1\\0\\2\\1\\3},\,\colvector{0\\1\\1\\-1\\0}} \end{equation*}
⬜  Row Space  Row space of the matrix, expressed as a span of a set of linearly independent vectors, obtained from the nonzero rows of the row-equivalent matrix in reduced row-echelon form. (Theorem BRS) \begin{equation*} \set{\colvector{1\\0},\,\colvector{0\\1}} \end{equation*}
⬜  Subspace Dimensions  Subspace dimensions associated with the matrix (Definition ROM, Definition NOM). Verify Theorem RPNC. \begin{align*} \text{rank}&=2&\text{nullity}&=0&\text{columns}&=2 \end{align*}

Archetype I

⬜  Summary   System with four equations, seven variables. Consistent. Null space of coefficient matrix has dimension 4.
⬜  Definition  A system of linear equations (Definition SLE). \begin{align*} x_1 +4x_2 - x_4 + 7x_6 - 9x_7 &= 3\\ 2x_1 + 8x_2 - x_3 + 3x_4 + 9x_5 - 13x_6 + 7x_7 &= 9\\ 2x_3 -3x_4 -4x_5 +12x_6 -8x_7 &= 1\\ -x_1 - 4x_2 + 2x_3 +4x_4 + 8x_5 - 31x_6 + 37x_7 &= 4 \end{align*}
⬜  Solutions  Some solutions to the system of linear equations, not necessarily exhaustive (Definition SSLE): \begin{gather*} x_1=-25, x_2=4, x_3=22, x_4=29, x_5=1, x_6=2, x_7=-3\\ x_1=-7, x_2=5, x_3=7, x_4=15, x_5=-4, x_6=2, x_7=1\\ x_1=4, x_2=0, x_3=2, x_4=1, x_5=0, x_6=0, x_7=0 \end{gather*}
⬜  Augmented Matrix  Augmented matrix of the linear system of equations (Definition AM): \begin{equation*} \begin{bmatrix} 1 & 4 & 0 & -1 & 0 & 7 & -9 & 3 \\ 2 & 8 & -1 & 3 & 9 & -13 & 7 & 9 \\ 0 & 0 & 2 & -3 & -4 & 12 & -8 & 1 \\ -1 & -4 & 2 & 4 & 8 & -31 & 37 & 4 \end{bmatrix} \end{equation*}
⬜  Row-Reduced Augmented Matrix  Matrix in reduced row-echelon form, row-equivalent to the augmented matrix. (Definition RREF) \begin{equation*} \begin{bmatrix} \leading{1} & 4 & 0 & 0 & 2 & 1 & -3 & 4 \\ 0 & 0 & \leading{1} & 0 & 1 & -3 & 5 & 2 \\ 0 & 0 & 0 & \leading{1} & 2 & -6 & 6 & 1 \\ 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 \end{bmatrix} \end{equation*}
⬜  Augmented Matrix Analysis  Analysis of the augmented matrix (Definition RREF). \begin{align*} r&=3&D&=\set{1,\,3,\,4}&F&=\set{2,\,5,\,6,\,7,\,8} \end{align*}
⬜  Vector Form of Solutions  Vector form of the solution set to the system of equations (Theorem VFSLS). Notice the relationship between the free variables and the set \(F\) above. Also, notice the pattern of 0's and 1's in the entries of the vectors corresponding to elements of the set \(F\) in the larger examples. \begin{equation*} \colvector{x_1\\x_2\\x_3\\x_4\\x_5\\x_6\\x_7}= \colvector{4\\0\\2\\1\\0\\0\\0} + x_2\colvector{-4\\1\\0\\0\\0\\0\\0}+ x_5\colvector{-2\\0\\-1\\-2\\1\\0\\0}+ x_6\colvector{-1\\0\\3\\6\\0\\1\\0}+ x_7\colvector{3\\0\\-5\\-6\\0\\0\\1} \end{equation*}
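As an aside, the vector form can be verified symbolically: substituting it into the system must return the constant vector for every choice of the free variables. A SymPy sketch (the vector names are ours):

```python
from sympy import Matrix, symbols

A = Matrix([
    [ 1,  4,  0, -1,  0,   7,  -9],
    [ 2,  8, -1,  3,  9, -13,   7],
    [ 0,  0,  2, -3, -4,  12,  -8],
    [-1, -4,  2,  4,  8, -31,  37],
])
b = Matrix([3, 9, 1, 4])

part = Matrix([4, 0, 2, 1, 0, 0, 0])      # particular solution
v2 = Matrix([-4, 1, 0, 0, 0, 0, 0])
v5 = Matrix([-2, 0, -1, -2, 1, 0, 0])
v6 = Matrix([-1, 0, 3, 6, 0, 1, 0])
v7 = Matrix([3, 0, -5, -6, 0, 0, 1])

x2, x5, x6, x7 = symbols("x2 x5 x6 x7")
x = part + x2*v2 + x5*v5 + x6*v6 + x7*v7
print((A*x - b).expand())    # the zero vector, identically in x2, x5, x6, x7
```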
⬜  Associated Homogeneous System  Given a system of equations we can always build a new, related, homogeneous system (Definition HS) by converting the constant terms to zeros and retaining the coefficients of the variables. Properties of this new system will have precise relationships with various properties of the original system. \begin{align*} x_1 +4x_2 - x_4 + 7x_6 - 9x_7 &= 0\\ 2x_1 + 8x_2 - x_3 + 3x_4 + 9x_5 - 13x_6 + 7x_7 &= 0\\ 2x_3 -3x_4 -4x_5 +12x_6 -8x_7 &= 0\\ -x_1 - 4x_2 + 2x_3 +4x_4 + 8x_5 - 31x_6 + 37x_7 &= 0 \end{align*}
⬜  Solutions, Homogeneous System  Some solutions to the associated homogeneous system of linear equations, not necessarily exhaustive (Definition SSLE). Review Theorem HSC as you consider these solutions. \begin{gather*} x_1=0, x_2=0, x_3=0, x_4=0, x_5=0, x_6=0, x_7=0\\ x_1=3, x_2=0, x_3=-5, x_4=-6, x_5=0, x_6=0, x_7=1\\ x_1=-1, x_2=0, x_3=3, x_4=6, x_5=0, x_6=1, x_7=0\\ x_1=-2, x_2=0, x_3=-1, x_4=-2, x_5=1, x_6=0, x_7=0\\ x_1=-4, x_2=1, x_3=0, x_4=0, x_5=0, x_6=0, x_7=0\\ x_1=-4, x_2=1, x_3=-3, x_4=-2, x_5=1, x_6=1, x_7=1 \end{gather*}
⬜  Row-Reduced Augmented Matrix, Homogeneous System  Form the augmented matrix of the homogeneous linear system, and use row operations to convert to reduced row-echelon form. Notice how the entries of the final column remain zeros. \begin{equation*} \begin{bmatrix} \leading{1} & 4 & 0 & 0 & 2 & 1 & -3 & 0\\ 0 & 0 & \leading{1} & 0 & 1 & -3 & 5 & 0 \\ 0 & 0 & 0 & \leading{1} & 2 & -6 & 6 & 0 \\ 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 \end{bmatrix} \end{equation*}
⬜  Augmented Matrix Analysis, Homogeneous System  Analysis of the augmented matrix for the homogeneous system (Definition RREF). Compare this with the same analysis of the original system, especially in the case where the original system is inconsistent (Theorem RCLS). \begin{align*} r&=3&D&=\set{1,\,3,\,4}&F&=\set{2,\,5,\,6,\,7,\,8} \end{align*}
⬜  Coefficient Matrix  For any system of equations we can isolate the coefficient matrix, which will be identical to the coefficient matrix of the associated homogeneous system. For the remainder of the discussion of this system of equations, we will analyze just the coefficient matrix. \begin{equation*} \begin{bmatrix} 1 & 4 & 0 & -1 & 0 & 7 & -9 \\ 2 & 8 & -1 & 3 & 9 & -13 & 7\\ 0 & 0 & 2 & -3 & -4 & 12 & -8\\ -1 & -4 & 2 & 4 & 8 & -31 & 37 \end{bmatrix} \end{equation*}
⬜  Row-Reduced Coefficient Matrix  Row-equivalent matrix in reduced row-echelon form (Definition RREF). \begin{equation*} \begin{bmatrix} \leading{1} & 4 & 0 & 0 & 2 & 1 & -3\\ 0 & 0 & \leading{1} & 0 & 1 & -3 & 5\\ 0 & 0 & 0 & \leading{1} & 2 & -6 & 6\\ 0 & 0 & 0 & 0 & 0 & 0 & 0 \end{bmatrix} \end{equation*}
⬜  Coefficient Matrix Analysis  Analysis of the reduced row-echelon form of the matrix (Definition RREF). For archetypes that begin as systems of equations, compare this analysis with the analysis for the coefficient matrices of the original system, and of the associated homogeneous system. \begin{align*} r&=3&D&=\set{1,\,3,\,4}&F&=\set{2,\,5,\,6,\,7} \end{align*}
⬜  Nonsingular Matrix?  Is the matrix nonsingular or singular? The question is moot, since the matrix is not square. Notice that there are many other properties that only make sense for square matrices, such as the inverse matrix, the determinant, eigenvalues and diagonalization.
⬜  Null Space  The null space of the matrix. The set of vectors used in the span construction is a linearly independent set of column vectors that spans the null space of the matrix (Theorem SSNS, Theorem BNS). Solve a homogeneous system with this matrix as the coefficient matrix and write the solutions in vector form (Theorem VFSLS) to see these vectors arise. Compare the entries of these vectors for indices in \(D\) versus entries for indices in \(F\text{.}\) \begin{equation*} \set{ \colvector{-4\\1\\0\\0\\0\\0\\0},\, \colvector{-2\\0\\-1\\-2\\1\\0\\0},\, \colvector{-1\\0\\3\\6\\0\\1\\0},\, \colvector{3\\0\\-5\\-6\\0\\0\\1} } \end{equation*}
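As an aside, SymPy's nullspace method performs exactly this computation, building one basis vector per free column in the style of Theorem BNS:

```python
from sympy import Matrix

A = Matrix([
    [ 1,  4,  0, -1,  0,   7,  -9],
    [ 2,  8, -1,  3,  9, -13,   7],
    [ 0,  0,  2, -3, -4,  12,  -8],
    [-1, -4,  2,  4,  8, -31,  37],
])

# Each basis vector has a 1 in one free position (an index in F) and 0s in
# the other free positions, matching the pattern noted above.
for v in A.nullspace():
    print(v.T)
```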
⬜  Column Space, Original Columns  The column space of the matrix, expressed as the span of a set of linearly independent vectors that are also columns of the matrix. These columns have indices that form the set \(D\) above (Theorem BCS). \begin{equation*} \set{\colvector{1\\2\\0\\-1},\,\colvector{0\\-1\\2\\2},\,\colvector{-1\\3\\-3\\4}} \end{equation*}
⬜  Column Space, Extended Echelon Form  The column space of the matrix, as it arises from the extended echelon form of the matrix. The matrix \(L\) is computed as described in Definition EEF. This is followed by the column space described as the span of a set of linearly independent vectors that equals the null space of \(L\text{,}\) computed according to Theorem FS and Theorem BNS. When \(r=m\text{,}\) the matrix \(L\) has no rows and the column space is all of \(\complex{m}\text{.}\) \begin{equation*} L=\begin{bmatrix}1&-\frac{12}{31}&-\frac{13}{31}&\frac{7}{31}\end{bmatrix} \end{equation*} \begin{equation*} \set{\colvector{-\frac{7}{31}\\0\\0\\1},\,\colvector{\frac{13}{31}\\0\\1\\0},\,\colvector{\frac{12}{31}\\1\\0\\0}} \end{equation*}
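As an aside, the matrix \(L\) can be extracted mechanically from one row reduction, under this reading of Definition EEF: row-reduce \(A\) augmented with an identity matrix, split the result as \(\left[B\mid J\right]\text{,}\) and keep the rows of \(J\) that sit beside the zero rows of \(B\text{.}\) A SymPy sketch (the variable names are ours; the null space basis may appear in a different order than above):

```python
from sympy import Matrix, eye

A = Matrix([
    [ 1,  4,  0, -1,  0,   7,  -9],
    [ 2,  8, -1,  3,  9, -13,   7],
    [ 0,  0,  2, -3, -4,  12,  -8],
    [-1, -4,  2,  4,  8, -31,  37],
])
m, n = A.shape

N, pivots = Matrix.hstack(A, eye(m)).rref()
r = sum(1 for p in pivots if p < n)     # pivots landing in the A-part: r = 3
L = N[r:, n:]                           # rows of J beside the zero rows of B
print(L)                                # Matrix([[1, -12/31, -13/31, 7/31]])
for v in L.nullspace():                 # spans the column space (Theorem FS)
    print(v.T)
```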
⬜  Column Space, Row Space of Transpose  The column space of the matrix, expressed as the span of a set of linearly independent vectors. These vectors are computed by bringing the transpose of the matrix into reduced row-echelon form, tossing out the zero rows, and writing the remaining nonzero rows as column vectors. By Theorem CSRST and Theorem BRS, and in the style of Example CSROI, this yields a linearly independent set of vectors that span the column space. \begin{equation*} \set{\colvector{1\\0\\0\\-\frac{31}{7}},\,\colvector{0\\1\\0\\\frac{12}{7}},\,\colvector{0\\0\\1\\\frac{13}{7}}} \end{equation*}
⬜  Row Space  Row space of the matrix, expressed as a span of a set of linearly independent vectors, obtained from the nonzero rows of the row-equivalent matrix in reduced row-echelon form. (Theorem BRS) \begin{equation*} \set{\colvector{1\\4\\0\\0\\2\\1\\-3},\,\colvector{0\\0\\1\\0\\1\\-3\\5},\,\colvector{0\\0\\0\\1\\2\\-6\\6}} \end{equation*}
⬜  Subspace Dimensions  Subspace dimensions associated with the matrix (Definition ROM, Definition NOM). Verify Theorem RPNC. \begin{align*} \text{rank}&=3&\text{nullity}&=4&\text{columns}&=7 \end{align*}

Archetype J

⬜  Summary   System with six equations, nine variables. Consistent. Null space of coefficient matrix has dimension 5.
⬜  Definition  A system of linear equations (Definition SLE). \begin{align*} x_1 +2x_2 -2x_3 +9x_4 +3x_5 -5x_6 -2x_7 +x_8 +27x_9 &= -5 \\ 2x_1 +4x_2 +3x_3 +4x_4 -x_5 +4x_6 +10x_7 +2x_8 -23x_9 &= 18 \\ x_1 +2x_2 +x_3 +3x_4 +x_5 +x_6 +5x_7 +2x_8 -7x_9 &= 6 \\ 2x_1 +4x_2 +3x_3 +4x_4 -7x_5 +2x_6 +4x_7 -11x_9 &= 20 \\ x_1 +2x_2 +5x_4 +2x_5 -4x_6 +3x_7 +8x_8 +13x_9 &= -4 \\ -3x_1 -6x_2 -x_3 -13x_4 +2x_5 -5x_6 -4x_7 +13x_8 +10x_9 &= -29 \end{align*}
⬜  Solutions  Some solutions to the system of linear equations, not necessarily exhaustive (Definition SSLE): \begin{gather*} x_1=6, x_2= 0, x_3= -1, x_4= 0, x_5= -1, x_6= 2, x_7= 0, x_8= 0, x_9= 0\\ x_1=4, x_2=1, x_3=-1, x_4=0, x_5=-1, x_6=2, x_7=0, x_8=0, x_9= 0\\ x_1=-17, x_2=7, x_3=3, x_4=2, x_5=-1, x_6=14, x_7=-1, x_8=3, x_9=2\\ x_1=-11, x_2=-6, x_3=1, x_4=5, x_5=-4, x_6=7, x_7=3, x_8=1, x_9=1 \end{gather*}
⬜  Augmented Matrix  Augmented matrix of the linear system of equations (Definition AM): \begin{equation*} \begin{bmatrix} 1 & 2 & -2 & 9 & 3 & -5 & -2 & 1 & 27 & -5 \\ 2 & 4 & 3 & 4 & -1 & 4 & 10 & 2 & -23 & 18 \\ 1 & 2 & 1 & 3 & 1 & 1 & 5 & 2 & -7 & 6 \\ 2 & 4 & 3 & 4 & -7 & 2 & 4 & 0 & -11 & 20 \\ 1 & 2 & 0 & 5 & 2 & -4 & 3 & 8 & 13 & -4 \\ -3 & -6 & -1 & -13 & 2 & -5 & -4 & 13 & 10 & -29 \end{bmatrix} \end{equation*}
⬜  Row-Reduced Augmented Matrix  Matrix in reduced row-echelon form, row-equivalent to the augmented matrix. (Definition RREF) \begin{equation*} \begin{bmatrix} \leading{1} & 2 & 0 & 5 & 0 & 0 & 1 & -2 & 3 & 6 \\ 0 & 0 & \leading{1} & -2 & 0 & 0 & 3 & 5 & -6 & -1 \\ 0 & 0 & 0 & 0 & \leading{1} & 0 & 1 & 1 & -1 & -1 \\ 0 & 0 & 0 & 0 & 0 & \leading{1} & 0 & -2 & -3 & 2 \\ 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 \end{bmatrix} \end{equation*}
⬜  Augmented Matrix Analysis  Analysis of the augmented matrix (Definition RREF). \begin{align*} r&=4&D&=\set{1,\,3,\,5,\,6}&F&=\set{2,\,4,\,7,\,8,\,9,\,10} \end{align*}
⬜  Vector Form of Solutions  Vector form of the solution set to the system of equations (Theorem VFSLS). Notice the relationship between the free variables and the set \(F\) above. Also, notice the pattern of 0's and 1's in the entries of the vectors corresponding to elements of the set \(F\) in the larger examples. \begin{equation*} \colvector{x_1\\x_2\\x_3\\x_4\\x_5\\x_6\\x_7\\x_8\\x_9}= \colvector{6\\0\\-1\\0\\-1\\2\\0\\0\\0} + x_2\colvector{-2\\1\\0\\0\\0\\0\\0\\0\\0}+ x_4\colvector{-5\\0\\2\\1\\0\\0\\0\\0\\0}+ x_7\colvector{-1\\0\\-3\\0\\-1\\0\\1\\0\\0}+ x_8\colvector{2\\0\\-5\\0\\-1\\2\\0\\1\\0}+ x_9\colvector{-3\\0\\6\\0\\1\\3\\0\\0\\1} \end{equation*}
⬜  Associated Homogeneous System  Given a system of equations we can always build a new, related, homogeneous system (Definition HS) by converting the constant terms to zeros and retaining the coefficients of the variables. Properties of this new system will have precise relationships with various properties of the original system. \begin{align*} x_1 +2x_2 -2x_3 +9x_4 +3x_5 -5x_6 -2x_7 +x_8 +27x_9 &= 0 \\ 2x_1 +4x_2 +3x_3 +4x_4 -x_5 +4x_6 +10x_7 +2x_8 -23x_9 &= 0 \\ x_1 +2x_2 +x_3 +3x_4 +x_5 +x_6 +5x_7 +2x_8 -7x_9 &= 0 \\ 2x_1 +4x_2 +3x_3 +4x_4 -7x_5 +2x_6 +4x_7 -11x_9 &= 0 \\ x_1 +2x_2 +5x_4 +2x_5 -4x_6 +3x_7 +8x_8 +13x_9 &= 0 \\ -3x_1 -6x_2 -x_3 -13x_4 +2x_5 -5x_6 -4x_7 +13x_8 +10x_9 &= 0 \end{align*}
⬜  Solutions, Homogeneous System  Some solutions to the associated homogeneous system of linear equations, not necessarily exhaustive (Definition SSLE). Review Theorem HSC as you consider these solutions. \begin{gather*} x_1=0, x_2=0, x_3=0, x_4=0, x_5=0, x_6=0, x_7=0, x_8=0, x_9=0\\ x_1=-2, x_2=1, x_3=0, x_4=0, x_5=0, x_6=0, x_7=0, x_8=0, x_9=0\\ x_1=-23, x_2=7, x_3=4, x_4=2, x_5=0, x_6=12, x_7=-1, x_8=3, x_9=2\\ x_1=-17, x_2=-6, x_3=2, x_4=5, x_5=-3, x_6=5, x_7=3, x_8=1, x_9=1 \end{gather*}
⬜  Row-Reduced Augmented Matrix, Homogeneous System  Form the augmented matrix of the homogeneous linear system, and use row operations to convert to reduced row-echelon form. Notice how the entries of the final column remain zeros. \begin{equation*} \begin{bmatrix} \leading{1} & 2 & 0 & 5 & 0 & 0 & 1 & -2 & 3 & 0 \\ 0 & 0 & \leading{1} & -2 & 0 & 0 & 3 & 5 & -6 & 0 \\ 0 & 0 & 0 & 0 & \leading{1} & 0 & 1 & 1 & -1 & 0 \\ 0 & 0 & 0 & 0 & 0 & \leading{1} & 0 & -2 & -3 & 0 \\ 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 \end{bmatrix} \end{equation*}
⬜  Augmented Matrix Analysis, Homogeneous System  Analysis of the augmented matrix for the homogeneous system (Definition RREF). Compare this with the same analysis of the original system, especially in the case where the original system is inconsistent (Theorem RCLS). \begin{align*} r&=4&D&=\set{1,\,3,\,5,\,6}&F&=\set{2,\,4,\,7,\,8,\,9,\,10} \end{align*}
⬜  Coefficient Matrix  For any system of equations we can isolate the coefficient matrix, which will be identical to the coefficient matrix of the associated homogeneous system. For the remainder of the discussion of this system of equations, we will analyze just the coefficient matrix. \begin{equation*} \begin{bmatrix} 1 & 2 & -2 & 9 & 3 & -5 & -2 & 1 & 27 \\ 2 & 4 & 3 & 4 & -1 & 4 & 10 & 2 & -23 \\ 1 & 2 & 1 & 3 & 1 & 1 & 5 & 2 & -7 \\ 2 & 4 & 3 & 4 & -7 & 2 & 4 & 0 & -11 \\ 1 & 2 & 0 & 5 & 2 & -4 & 3 & 8 & 13 \\ -3 & -6 & -1 & -13 & 2 & -5 & -4 & 13 & 10 \end{bmatrix} \end{equation*}
⬜  Row-Reduced Coefficient Matrix  Row-equivalent matrix in reduced row-echelon form (Definition RREF). \begin{equation*} \begin{bmatrix} \leading{1} & 2 & 0 & 5 & 0 & 0 & 1 & -2 & 3 \\ 0 & 0 & \leading{1} & -2 & 0 & 0 & 3 & 5 & -6 \\ 0 & 0 & 0 & 0 & \leading{1} & 0 & 1 & 1 & -1 \\ 0 & 0 & 0 & 0 & 0 & \leading{1} & 0 & -2 & -3 \\ 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 \end{bmatrix} \end{equation*}
⬜  Coefficient Matrix Analysis  Analysis of the reduced row-echelon form of the matrix (Definition RREF). For archetypes that begin as systems of equations, compare this analysis with the analysis for the coefficient matrices of the original system, and of the associated homogeneous system. \begin{align*} r&=4&D&=\set{1,\,3,\,5,\,6}&F&=\set{2,\,4,\,7,\,8,\,9} \end{align*}
⬜  Nonsingular Matrix?  Is the matrix nonsingular or singular? The question is moot, since the matrix is not square. Notice that there are many other properties that only make sense for square matrices, such as the inverse matrix, the determinant, eigenvalues and diagonalization.
⬜  Null Space  The null space of the matrix. The set of vectors used in the span construction is a linearly independent set of column vectors that spans the null space of the matrix (Theorem SSNS, Theorem BNS). Solve a homogeneous system with this matrix as the coefficient matrix and write the solutions in vector form (Theorem VFSLS) to see these vectors arise. Compare the entries of these vectors for indices in \(D\) versus entries for indices in \(F\text{.}\) \begin{equation*} \set{ \colvector{-2\\1\\0\\0\\0\\0\\0\\0\\0},\, \colvector{-5\\0\\2\\1\\0\\0\\0\\0\\0},\, \colvector{-1\\0\\-3\\0\\-1\\0\\1\\0\\0},\, \colvector{2\\0\\-5\\0\\-1\\2\\0\\1\\0},\, \colvector{-3\\0\\6\\0\\1\\3\\0\\0\\1} } \end{equation*}
⬜  Column Space, Original Columns  The column space of the matrix, expressed as the span of a set of linearly independent vectors that are also columns of the matrix. These columns have indices that form the set \(D\) above (Theorem BCS). \begin{equation*} \set{\colvector{1\\2\\1\\2\\1\\-3},\,\colvector{-2\\3\\1\\3\\0\\-1},\,\colvector{3\\-1\\1\\-7\\2\\2},\,\colvector{-5\\4\\1\\2\\-4\\-5}} \end{equation*}
⬜  Column Space, Extended Echelon Form  The column space of the matrix, as it arises from the extended echelon form of the matrix. The matrix \(L\) is computed as described in Definition EEF. This is followed by the column space described as the span of a set of linearly independent vectors that equals the null space of \(L\text{,}\) computed according to Theorem FS and Theorem BNS. When \(r=m\text{,}\) the matrix \(L\) has no rows and the column space is all of \(\complex{m}\text{.}\) \begin{equation*} L=\begin{bmatrix}1&0&\frac{186}{131}&\frac{51}{131}&-\frac{188}{131}&\frac{77}{131}\\0&1&-\frac{272}{131}&-\frac{45}{131}&\frac{58}{131}&-\frac{14}{131}\end{bmatrix} \end{equation*} \begin{equation*} \set{ \colvector{-\frac{77}{131}\\\frac{14}{131}\\0\\0\\0\\1},\, \colvector{\frac{188}{131}\\-\frac{58}{131}\\0\\0\\1\\0},\, \colvector{-\frac{51}{131}\\\frac{45}{131}\\0\\1\\0\\0},\, \colvector{-\frac{186}{131}\\\frac{272}{131}\\1\\0\\0\\0} } \end{equation*}
⬜  Column Space, Row Space of Transpose  The column space of the matrix, expressed as the span of a set of linearly independent vectors. These vectors are computed by bringing the transpose of the matrix into reduced row-echelon form, tossing out the zero rows, and writing the remaining nonzero rows as column vectors. By Theorem CSRST and Theorem BRS, and in the style of Example CSROI, this yields a linearly independent set of vectors that span the column space. \begin{equation*} \set{\colvector{1\\0\\0\\0\\-1\\-\frac{29}{7}},\,\colvector{0\\1\\0\\0\\-\frac{11}{2}\\-\frac{94}{7}},\,\colvector{0\\0\\1\\0\\10\\22},\,\colvector{0\\0\\0\\1\\\frac{3}{2}\\3}} \end{equation*}
⬜  Row Space  Row space of the matrix, expressed as a span of a set of linearly independent vectors, obtained from the nonzero rows of the row-equivalent matrix in reduced row-echelon form. (Theorem BRS) \begin{equation*} \set{\colvector{1\\2\\0\\5\\0\\0\\1\\-2\\3},\,\colvector{0\\0\\1\\-2\\0\\0\\3\\5\\-6},\,\colvector{0\\0\\0\\0\\1\\0\\1\\1\\-1},\,\colvector{0\\0\\0\\0\\0\\1\\0\\-2\\-3}} \end{equation*}
⬜  Subspace Dimensions  Subspace dimensions associated with the matrix (Definition ROM, Definition NOM). Verify Theorem RPNC. \begin{align*} \text{rank}&=4&\text{nullity}&=5&\text{columns}&=9 \end{align*}

Archetype K

⬜  Summary   Square matrix of size 5. Nonsingular. 3 distinct eigenvalues, 2 of multiplicity 2.
⬜  Definition  A square matrix (Definition SQM). Notice how the following analysis parallels the analysis of the coefficient matrix of the linear systems in previous archetypes, yet there is no system discussed explicitly for this archetype. \begin{equation*} \begin{bmatrix} 10 & 18 & 24 & 24 & -12 \\ 12 & -2 & -6 & 0 & -18 \\ -30 & -21 & -23 & -30 & 39 \\ 27 & 30 & 36 & 37 & -30 \\ 18 & 24 & 30 & 30 & -20 \end{bmatrix} \end{equation*}
⬜  Row-Reduced Coefficient Matrix  Row-equivalent matrix in reduced row-echelon form (Definition RREF). \begin{equation*} \begin{bmatrix} \leading{1} & 0 & 0 & 0 & 0 \\ 0 & \leading{1} & 0 & 0 & 0 \\ 0 & 0 & \leading{1} & 0 & 0 \\ 0 & 0 & 0 & \leading{1} & 0 \\ 0 & 0 & 0 & 0 & \leading{1} \end{bmatrix} \end{equation*}
⬜  Coefficient Matrix Analysis  Analysis of the reduced row-echelon form of the matrix (Definition RREF). For archetypes that begin as systems of equations, compare this analysis with the analysis for the coefficient matrices of the original system, and of the associated homogeneous system. \begin{align*} r&=5&D&=\set{1,\,2,\,3,\,4,\,5}&F&=\set{\ } \end{align*}
⬜  Nonsingular Matrix?  Is the matrix nonsingular or singular? Nonsingular. Notice that the row-reduced version of the matrix is the identity matrix and apply Theorem NMRRI. At the same time, examine the sizes of the sets \(D\) and \(F\) from the analysis of the reduced row-echelon version of the matrix.
⬜  Null Space  The null space of the matrix. The set of vectors used in the span construction is a linearly independent set of column vectors that spans the null space of the matrix (Theorem SSNS, Theorem BNS). Solve a homogeneous system with this matrix as the coefficient matrix and write the solutions in vector form (Theorem VFSLS) to see these vectors arise. Compare the entries of these vectors for indices in \(D\) versus entries for indices in \(F\text{.}\) \begin{equation*} \set{\ } \end{equation*}
⬜  Column Space, Original Columns  The column space of the matrix, expressed as the span of a set of linearly independent vectors that are also columns of the matrix. These columns have indices that form the set \(D\) above (Theorem BCS). \begin{equation*} \set{\colvector{10\\12\\-30\\27\\18},\,\colvector{18\\-2\\-21\\30\\24},\,\colvector{24\\-6\\-23\\36\\30},\,\colvector{24\\0\\-30\\37\\30},\,\colvector{-12\\-18\\39\\-30\\-20}} \end{equation*}
⬜  Column Space, Extended Echelon Form  The column space of the matrix, as it arises from the extended echelon form of the matrix. The matrix \(L\) is computed as described in Definition EEF. This is followed by the column space described as the span of a set of linearly independent vectors that equals the null space of \(L\text{,}\) computed according to Theorem FS and Theorem BNS. When \(r=m\text{,}\) the matrix \(L\) has no rows and the column space is all of \(\complex{m}\text{.}\) \begin{equation*} L=\begin{bmatrix}\end{bmatrix} \end{equation*} \begin{equation*} \set{ \colvector{1\\0\\0\\0\\0},\, \colvector{0\\1\\0\\0\\0},\, \colvector{0\\0\\1\\0\\0},\, \colvector{0\\0\\0\\1\\0},\, \colvector{0\\0\\0\\0\\1} } \end{equation*}
⬜  Column Space, Row Space of Transpose  The column space of the matrix, expressed as the span of a set of linearly independent vectors. These vectors are computed by bringing the transpose of the matrix into reduced row-echelon form, tossing out the zero rows, and writing the remaining nonzero rows as column vectors. By Theorem CSRST and Theorem BRS, and in the style of Example CSROI, this yields a linearly independent set of vectors that span the column space. \begin{equation*} \set{\colvector{1\\0\\0\\0\\0},\,\colvector{0\\1\\0\\0\\0},\,\colvector{0\\0\\1\\0\\0},\,\colvector{0\\0\\0\\1\\0},\,\colvector{0\\0\\0\\0\\1}} \end{equation*}
⬜  Row Space  Row space of the matrix, expressed as a span of a set of linearly independent vectors, obtained from the nonzero rows of the row-equivalent matrix in reduced row-echelon form. (Theorem BRS) \begin{equation*} \set{\colvector{1\\0\\0\\0\\0},\,\colvector{0\\1\\0\\0\\0},\,\colvector{0\\0\\1\\0\\0},\,\colvector{0\\0\\0\\1\\0},\,\colvector{0\\0\\0\\0\\1}} \end{equation*}
⬜  Inverse Matrix?  The matrix is nonsingular, and by Theorem NI has an inverse (Definition MI). The inverse can be computed with the procedure in Theorem CINM. \begin{equation*} \begin{bmatrix} 1 & -\frac{9}{4} & -\frac{3}{2} & 3 & -6 \\ \frac{21}{2} & \frac{43}{4} & \frac{21}{2} & 9 & -9 \\ -15 & -\frac{21}{2} & -11 & -15 & \frac{39}{2} \\ 9 & \frac{15}{4} & \frac{9}{2} & 10 & -15 \\ \frac{9}{2} & \frac{3}{4} & \frac{3}{2} & 6 & -\frac{19}{2} \end{bmatrix} \end{equation*}
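As an aside, SymPy works in exact rational arithmetic, so the fractions above can be checked verbatim:

```python
from sympy import Matrix, eye

K = Matrix([
    [ 10,  18,  24,  24, -12],
    [ 12,  -2,  -6,   0, -18],
    [-30, -21, -23, -30,  39],
    [ 27,  30,  36,  37, -30],
    [ 18,  24,  30,  30, -20],
])

Kinv = K.inv()
print(K * Kinv == eye(5))   # True
print(K.det())              # 16, nonzero, consistent with Theorem SMZD
```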
⬜  Subspace Dimensions  Subspace dimensions associated with the matrix (Definition ROM, Definition NOM). Verify Theorem RPNC. \begin{align*} \text{rank}&=5&\text{nullity}&=0&\text{columns}&=5 \end{align*}
⬜  Determinant  Value of the determinant of the matrix. The matrix is nonsingular so the determinant is nonzero (Theorem SMZD). Notice that zero is not an eigenvalue of the matrix (Theorem SMZE). \begin{equation*} \text{determinant}=16 \end{equation*}
⬜  Eigenvalues, Eigenspaces  Eigenvalues, and bases for eigenspaces (Definition EEM, Definition EM). Compute a matrix-vector product (Definition MVP) for each eigenvector as an interesting check. \begin{align*} \eigensystem{K}{-2}{\colvector{2\\-2\\1\\0\\1},\,\colvector{-1\\2\\-2\\1\\0}}\\ \eigensystem{K}{1}{\colvector{4\\-10\\7\\0\\2},\,\colvector{-4\\18\\-17\\5\\0}}\\ \eigensystem{K}{4}{\colvector{1\\-1\\0\\1\\1}}\\ %& forces align environment \end{align*}
⬜  Eigenvalue Multiplicities  Geometric and algebraic multiplicities (Definition GME, Definition AME). \begin{align*} \geomult{K}{-2}&=2&\algmult{K}{-2}&=2\\ \geomult{K}{1}&=2&\algmult{K}{1}&=2\\ \geomult{K}{4}&=1&\algmult{K}{4}&=1 \end{align*}
⬜  Diagonalizable  Diagonalizable (Definition DZM)?

Yes, full eigenspaces, Theorem DMFE.

⬜  Diagonalization  The diagonalization (Theorem DC). \begin{equation*} \begin{bmatrix}-4&-3&-4&-6&7\\-7&-5&-6&-8&10\\ 1&-1&-1&1&-3\\1&0&0&1&-2\\2&5&6&4&0 \end{bmatrix}\begin{bmatrix} 10 & 18 & 24 & 24 & -12 \\ 12 & -2 & -6 & 0 & -18 \\ -30 & -21 & -23 & -30 & 39 \\ 27 & 30 & 36 & 37 & -30 \\ 18 & 24 & 30 & 30 & -20 \end{bmatrix} \begin{bmatrix}2&-1&4&-4&1\\-2&2&-10&18&-1\\ 1&-2&7&-17&0\\0&1&0&5&1\\1&0&2&0&1 \end{bmatrix}=\begin{bmatrix}-2&0&0&0&0\\0&-2&0&0&0\\ 0&0&1&0&0\\0&0&0&1&0\\0&0&0&0&4 \end{bmatrix} \end{equation*}
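As an aside, SymPy will report the eigenvalues with both multiplicities and assemble a diagonalization; the similarity transform it chooses may differ from the one above by the ordering and scaling of the eigenvectors:

```python
from sympy import Matrix

K = Matrix([
    [ 10,  18,  24,  24, -12],
    [ 12,  -2,  -6,   0, -18],
    [-30, -21, -23, -30,  39],
    [ 27,  30,  36,  37, -30],
    [ 18,  24,  30,  30, -20],
])

for lam, alg, basis in K.eigenvects():
    print(lam, alg, len(basis))    # eigenvalue, algebraic, geometric multiplicity

S, D = K.diagonalize()             # K == S * D * S**(-1)
print(S.inv() * K * S == D)        # True
```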

Archetype L

⬜  Summary   Square matrix of size 5. Singular, nullity 2. 2 distinct eigenvalues, each of “high” multiplicity.
⬜  Definition  A square matrix (Definition SQM). Notice how the following analysis parallels the analysis of the coefficient matrix of the linear systems in previous archetypes, yet there is no system discussed explicitly for this archetype. \begin{equation*} \begin{bmatrix} -2 & -1 & -2 & -4 & 4 \\ -6 & -5 & -4 & -4 & 6 \\ 10 & 7 & 7 & 10 & -13 \\ -7 & -5 & -6 & -9 & 10 \\ -4 & -3 & -4 & -6 & 6 \\ \end{bmatrix} \end{equation*}
⬜  Row-Reduced Coefficient Matrix  Row-equivalent matrix in reduced row-echelon form (Definition RREF). \begin{equation*} \begin{bmatrix} \leading{1} & 0 & 0 & 1 & -2 \\ 0 & \leading{1} & 0 & -2 & 2 \\ 0 & 0 & \leading{1} & 2 & -1 \\ 0 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0 \end{bmatrix} \end{equation*}
⬜  Coefficient Matrix Analysis  Analysis of the reduced row-echelon form of the matrix (Definition RREF). For archetypes that begin as systems of equations, compare this analysis with the analysis for the coefficient matrices of the original system, and of the associated homogeneous system. \begin{align*} r&=3&D&=\set{1,\,2,\,3}&F&=\set{4,\,5} \end{align*}
⬜  Nonsingular Matrix?  Is the matrix nonsingular or singular? Singular. Notice that the row-reduced version of the matrix is not the identity matrix and apply Theorem NMRRI. At the same time, examine the sizes of the sets \(D\) and \(F\) from the analysis of the reduced row-echelon version of the matrix.
⬜  Null Space  The null space of the matrix. The set of vectors used in the span construction is a linearly independent set of column vectors that spans the null space of the matrix (Theorem SSNS, Theorem BNS). Solve a homogeneous system with this matrix as the coefficient matrix and write the solutions in vector form (Theorem VFSLS) to see these vectors arise. Compare the entries of these vectors for indices in \(D\) versus entries for indices in \(F\text{.}\) \begin{equation*} \set{\colvector{-1\\2\\-2\\1\\0},\,\colvector{2\\-2\\1\\0\\1}} \end{equation*}
⬜  Column Space, Original Columns  The column space of the matrix, expressed as the span of a set of linearly independent vectors that are also columns of the matrix. These columns have indices that form the set \(D\) above (Theorem BCS). \begin{equation*} \set{\colvector{-2\\-6\\10\\-7\\-4},\,\colvector{-1\\-5\\7\\-5\\-3},\,\colvector{-2\\-4\\7\\-6\\-4}} \end{equation*}
⬜  Column Space, Extended Echelon Form  The column space of the matrix, as it arises from the extended echelon form of the matrix. The matrix \(L\) is computed as described in Definition EEF. This is followed by the column space described as the span of a set of linearly independent vectors that equals the null space of \(L\text{,}\) computed according to Theorem FS and Theorem BNS. When \(r=m\text{,}\) the matrix \(L\) has no rows and the column space is all of \(\complex{m}\text{.}\) \begin{equation*} L=\begin{bmatrix}1&0&-2&-6&5\\0&1&4&10&-9\end{bmatrix} \end{equation*} \begin{equation*} \set{ \colvector{-5\\9\\0\\0\\1},\, \colvector{6\\-10\\0\\1\\0},\, \colvector{2\\-4\\1\\0\\0} } \end{equation*}
⬜  Column Space, Row Space of Transpose  The column space of the matrix, expressed as the span of a set of linearly independent vectors. These vectors are computed by bringing the transpose of the matrix into reduced row-echelon form, tossing out the zero rows, and writing the remaining nonzero rows as column vectors. By Theorem CSRST and Theorem BRS, and in the style of Example CSROI, this yields a linearly independent set of vectors that span the column space. \begin{equation*} \set{\colvector{ 1\\0\\0\\\frac{9}{4}\\\frac{5}{2}},\,\colvector{0\\1\\0\\\frac{5}{4}\\\frac{3}{2}},\,\colvector{0\\0\\1\\\frac{1}{2}\\1}} \end{equation*}
⬜  Row Space  Row space of the matrix, expressed as a span of a set of linearly independent vectors, obtained from the nonzero rows of the row-equivalent matrix in reduced row-echelon form. (Theorem BRS) \begin{equation*} \set{\colvector{1\\0\\0\\1\\-2},\,\colvector{0\\1\\0\\-2\\2},\,\colvector{0\\0\\1\\2\\-1}} \end{equation*}
⬜  Inverse Matrix?  The matrix is singular, and by Theorem NI does not have an inverse (Definition MI).
⬜  Subspace Dimensions  Subspace dimensions associated with the matrix (Definition ROM, Definition NOM). Verify Theorem RPNC. \begin{align*} \text{rank}&=3&\text{nullity}&=2&\text{columns}&=5 \end{align*}
⬜  Determinant  Value of the determinant of the matrix. The matrix is singular so the determinant is \(0\) (Theorem SMZD). Notice that zero is an eigenvalue of the matrix (Theorem SMZE).
⬜  Eigenvalues, Eigenspaces  Eigenvalues, and bases for eigenspaces (Definition EEM, Definition EM). Compute a matrix-vector product (Definition MVP) for each eigenvector as an interesting check. \begin{align*} \eigensystem{L}{-1}{\colvector{-5\\9\\0\\0\\1},\,\colvector{6\\-10\\0\\1\\0},\,\colvector{2\\-4\\1\\0\\0}}\\ \eigensystem{L}{0}{\colvector{2\\-2\\1\\0\\1},\,\colvector{-1\\2\\-2\\1\\0}}\\ %& forces align environment \end{align*}
⬜  Eigenvalue Multiplicities  Geometric and algebraic multiplicities (Definition GME, Definition AME). \begin{align*} \geomult{L}{-1}&=3&\algmult{L}{-1}&=3\\ \geomult{L}{0}&=2&\algmult{L}{0}&=2 \end{align*}
⬜  Diagonalizable  Diagonalizable (Definition DZM)?

Yes, full eigenspaces, Theorem DMFE.

⬜  Diagonalization  The diagonalization (Theorem DC). \begin{equation*} \begin{bmatrix}4&3&4&6&-6\\7&5&6&9&-10\\ -10&-7&-7&-10&13\\-4&-3&-4&-6&7\\-7&-5&-6&-8&10 \end{bmatrix}\begin{bmatrix} -2 & -1 & -2 & -4 & 4 \\ -6 & -5 & -4 & -4 & 6 \\ 10 & 7 & 7 & 10 & -13 \\ -7 & -5 & -6 & -9 & 10 \\ -4 & -3 & -4 & -6 & 6 \\ \end{bmatrix} \begin{bmatrix}-5&6&2&2&-1\\9&-10&-4&-2&2\\ 0&0&1&1&-2\\0&1&0&0&1\\1&0&0&1&0 \end{bmatrix}=\begin{bmatrix}-1&0&0&0&0\\0&-1&0&0&0\\ 0&0&-1&0&0\\0&0&0&0&0\\0&0&0&0&0 \end{bmatrix} \end{equation*}

Archetype M

⬜  Summary   Linear transformation with a domain larger than its codomain, so it is guaranteed to not be injective. Happens to not be surjective.
⬜  Definition  A linear transformation (Definition LT). \begin{equation*} \ltdefn{T}{\complex{5}}{\complex{3}},\quad \lteval{T}{\colvector{x_1\\x_2\\x_3\\x_4\\x_5}}= \colvector{x_1 + 2 x_2 + 3 x_3 + 4 x_4 + 4 x_5\\ 3 x_1 + x_2 + 4 x_3 - 3 x_4 + 7 x_5\\ x_1 - x_2 - 5 x_4 + x_5} \end{equation*}
⬜  Kernel  A basis for the kernel of the linear transformation (Definition KLT). \begin{equation*} \set{\colvector{-2\\-1\\0\\0\\1},\,\colvector{2\\-3\\0\\1\\0},\,\colvector{-1\\-1\\1\\0\\0} } \end{equation*}
⬜  Injective?  Is the linear transformation injective (Definition ILT)? No.

Since the kernel is nontrivial, Theorem KILT tells us that the linear transformation is not injective. Also, since the rank cannot exceed 3, we are guaranteed to have a nullity of at least 2, just from checking the dimensions of the domain and the codomain. In particular, verify that \begin{align*} \lteval{T}{\colvector{1\\2\\-1\\4\\5}}&=\colvector{38\\24\\-16}& \lteval{T}{\colvector{0\\-3\\0\\5\\6}}&=\colvector{38\\24\\-16}\text{.} \end{align*} This demonstration that \(T\) is not injective is constructed with the observation that \begin{align*} \colvector{0\\-3\\0\\5\\6}&=\colvector{1\\2\\-1\\4\\5}+\colvector{-1\\-5\\1\\1\\1}\\ \end{align*} and \begin{align*} \vect{z}&=\colvector{-1\\-5\\1\\1\\1}\in\krn{T} \end{align*} so the vector \(\vect{z}\) effectively “does nothing” in the evaluation of \(T\text{.}\)
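As an aside, the two evaluations (and the membership of \(\vect{z}\) in the kernel) take only a few lines to reproduce; the helper function below is ours, transcribed from the definition of \(T\text{:}\)

```python
# Archetype M's linear transformation, written out entry by entry.
def T(x1, x2, x3, x4, x5):
    return (x1 + 2*x2 + 3*x3 + 4*x4 + 4*x5,
            3*x1 + x2 + 4*x3 - 3*x4 + 7*x5,
            x1 - x2 - 5*x4 + x5)

print(T(1, 2, -1, 4, 5))     # (38, 24, -16)
print(T(0, -3, 0, 5, 6))     # (38, 24, -16), the same output
print(T(-1, -5, 1, 1, 1))    # (0, 0, 0), so z lies in the kernel
```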

⬜  Spanning Set for Range  A spanning set for the range of a linear transformation (Definition RLT) can be constructed easily by evaluating the linear transformation on a standard basis (Theorem SSRLT). \begin{equation*} \set{\colvector{1\\3\\1},\,\colvector{2\\1\\-1},\,\colvector{3\\4\\0},\,\colvector{4\\-3\\-5},\,\colvector{4\\7\\1}} \end{equation*}
⬜  Range  A basis for the range of the linear transformation (Definition RLT). If the linear transformation is injective, then the spanning set just constructed is guaranteed to be linearly independent (Theorem ILTLI) and is therefore a basis of the range with no changes. Injective or not, this spanning set can be converted to a “nice” linearly independent spanning set by making the vectors the rows of a matrix (perhaps after using a vector representation), row-reducing, and retaining the nonzero rows (Theorem BRS), and perhaps un-coordinatizing. \begin{equation*} \set{\colvector{1\\0\\-\frac{4}{5}},\,\colvector{0\\1\\\frac{3}{5}} } \end{equation*}
⬜  Surjective?  Is the linear transformation surjective (Definition SLT)? No.

Notice that the range is not all of \(\complex{3}\) since it has dimension 2, not 3. In particular, verify that \(\colvector{3\\4\\5}\not\in\rng{T}\text{,}\) by setting the output equal to this vector and seeing that the resulting system of linear equations has no solution, i.e. is inconsistent. So the preimage, \(\preimage{T}{\colvector{3\\4\\5}}\text{,}\) is empty. This alone is sufficient to see that the linear transformation is not onto.
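As an aside, the inconsistency is quick to confirm with SymPy's linsolve, using the matrix representation of \(T\) recorded at the end of this archetype:

```python
from sympy import Matrix, linsolve, symbols

A = Matrix([
    [1,  2, 3,  4, 4],
    [3,  1, 4, -3, 7],
    [1, -1, 0, -5, 1],
])
b = Matrix([3, 4, 5])

x = symbols("x1:6")
print(linsolve((A, b), *x))   # EmptySet: no preimage, so b is not in the range
```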

⬜  Subspace Dimensions  Subspace dimensions associated with the linear transformation (Definition ROLT, Definition NOLT). Verify Theorem RPNDD, and examine parallels with earlier results for matrices. \begin{align*} \text{rank}&=2&\text{nullity}&=3&\text{domain}&=5 \end{align*}
⬜  Invertible?  Is the linear transformation invertible (Definition IVLT; examine parallels with the existence of matrix inverses)? No.

Neither injective nor surjective, so not invertible (Theorem ILTIS).

⬜  Matrix Representation  Matrix representation of the linear transformation, as described in Theorem MLTCV. (See also Example MOLT.) If \(A\) is the matrix below, then \(\lteval{T}{\vect{x}} = A\vect{x}\text{.}\) This computation may also be viewed as an application of Definition MR and Theorem FTMR from Section MR, where the bases are chosen to be the standard bases of \(\complex{m}\) (Definition SUV). \begin{equation*} \begin{bmatrix} 1&2&3&4&4\\ 3&1&4&-3&7\\ 1&-1&0&-5&1 \end{bmatrix} \end{equation*}
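As an aside, with the matrix representation in hand the kernel of \(T\) is simply the null space of \(A\text{,}\) so the basis in the Kernel item above can be recomputed directly (possibly in a different order):

```python
from sympy import Matrix

A = Matrix([
    [1,  2, 3,  4, 4],
    [3,  1, 4, -3, 7],
    [1, -1, 0, -5, 1],
])

for v in A.nullspace():   # a basis for the kernel of T
    print(v.T)
```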

Archetype N

⬜  Summary   Linear transformation with domain larger than its codomain, so it is guaranteed to not be injective. Happens to be onto.
⬜  Definition  A linear transformation (Definition LT). \begin{equation*} \ltdefn{T}{\complex{5}}{\complex{3}},\quad \lteval{T}{\colvector{x_1\\x_2\\x_3\\x_4\\x_5}}= \colvector{2 x_1 + x_2 + 3 x_3 - 4 x_4 + 5 x_5\\ x_1 - 2 x_2 + 3 x_3 - 9 x_4 + 3 x_5\\ 3 x_1 + 4 x_3 - 6 x_4 + 5 x_5} \end{equation*}
⬜  Kernel  A basis for the kernel of the linear transformation (Definition KLT). \begin{equation*} \set{\colvector{1\\-1\\-2\\0\\1},\,\colvector{-2\\-1\\3\\1\\0} } \end{equation*}
⬜  Injective?  Is the linear transformation injective (Definition ILT)? No.

Since the kernel is nontrivial, Theorem KILT tells us that the linear transformation is not injective. Also, since the rank cannot exceed 3, we are guaranteed to have a nullity of at least 2, just from checking the dimensions of the domain and the codomain. In particular, verify that \begin{align*} \lteval{T}{\colvector{-3\\1\\-2\\-3\\1}}&=\colvector{6\\19\\6}& \lteval{T}{\colvector{-4\\-4\\-2\\-1\\4}}&=\colvector{6\\19\\6}\text{.} \end{align*} This demonstration that \(T\) is not injective is constructed with the observation that \begin{align*} \colvector{-4\\-4\\-2\\-1\\4}&=\colvector{-3\\1\\-2\\-3\\1}+\colvector{-1\\-5\\0\\2\\3}\\ \end{align*} and \begin{align*} \vect{z}&=\colvector{-1\\-5\\0\\2\\3}\in\krn{T} \end{align*} so the vector \(\vect{z}\) effectively “does nothing” in the evaluation of \(T\text{.}\)

⬜  Spanning Set for Range  A spanning set for the range of a linear transformation (Definition RLT) can be constructed easily by evaluating the linear transformation on a standard basis (Theorem SSRLT). \begin{equation*} \set{\colvector{2\\1\\3},\,\colvector{1\\-2\\0},\,\colvector{3\\3\\4},\,\colvector{-4\\-9\\-6},\,\colvector{5\\3\\5}} \end{equation*}
⬜  Range  A basis for the range of the linear transformation (Definition RLT). If the linear transformation is injective, then the spanning set just constructed is guaranteed to be linearly independent (Theorem ILTLI) and is therefore a basis of the range with no changes. Injective or not, this spanning set can be converted to a “nice” linearly independent spanning set by making the vectors the rows of a matrix (perhaps after using a vector representation), row-reducing, and retaining the nonzero rows (Theorem BRS), and perhaps un-coordinatizing. \begin{equation*} \set{\colvector{1\\0\\0},\,\colvector{0\\1\\0},\,\colvector{0\\0\\1} } \end{equation*}
⬜  Surjective?  Is the linear transformation surjective (Definition SLT)? Yes.

Notice that the basis for the range above is the standard basis for \(\complex{3}\text{.}\) So the range is all of \(\complex{3}\) and thus the linear transformation is surjective.

⬜  Subspace Dimensions  Subspace dimensions associated with the linear transformation (Definition ROLT, Definition NOLT). Verify Theorem RPNDD, and examine parallels with earlier results for matrices. \begin{align*} \text{rank}&=3&\text{nullity}&=2&\text{domain}&=5 \end{align*}
⬜  Invertible?  Is the linear transformation invertible (Definition IVLT; examine parallels with the existence of matrix inverses)? No.

Not injective, and the relative sizes of the domain and codomain mean the linear transformation can never be injective, so it cannot be invertible (Theorem ILTIS).

⬜  Matrix Representation  Matrix representation of the linear transformation, as described in Theorem MLTCV. (See also Example MOLT.) If \(A\) is the matrix below, then \(\lteval{T}{\vect{x}} = A\vect{x}\text{.}\) This computation may also be viewed as an application of Definition MR and Theorem FTMR from Section MR, where the bases are chosen to be the standard bases of \(\complex{m}\) (Definition SUV). \begin{equation*} \begin{bmatrix} 2&1&3&-4&5\\ 1&-2&3&-9&3\\ 3&0&4&-6&5 \end{bmatrix} \end{equation*}

Archetype O

⬜  Summary   Linear transformation with a domain smaller than the codomain, so it is guaranteed to not be onto. Happens to not be one-to-one.
⬜  Definition  A linear transformation (Definition LT). \begin{equation*} \ltdefn{T}{\complex{3}}{\complex{5}},\quad \lteval{T}{\colvector{x_1\\x_2\\x_3}}= \colvector{-x_1 + x_2 - 3 x_3\\ -x_1 + 2 x_2 - 4 x_3\\ x_1 + x_2 + x_3\\ 2 x_1 + 3 x_2 + x_3\\ x_1 + 2 x_3 } \end{equation*}
⬜  Kernel  A basis for the kernel of the linear transformation (Definition KLT). \begin{equation*} \set{\colvector{-2\\1\\1}} \end{equation*}
⬜  Injective?  Is the linear transformation injective (Definition ILT)? No.

Since the kernel is nontrivial, Theorem KILT tells us that the linear transformation is not injective. Note that the relative dimensions of the domain and codomain offer no guarantee here, since the domain is the smaller of the two; it is the nontrivial kernel alone that settles the question. In particular, verify that \begin{align*} \lteval{T}{\colvector{5\\-1\\3}}&=\colvector{-15\\-19\\7\\10\\11}& \lteval{T}{\colvector{1\\1\\5}}&=\colvector{-15\\-19\\7\\10\\11}\text{.} \end{align*} This demonstration that \(T\) is not injective is constructed with the observation that \begin{align*} \colvector{1\\1\\5}&=\colvector{5\\-1\\3}+\colvector{-4\\2\\2}\\ \end{align*} and \begin{align*} \vect{z}&=\colvector{-4\\2\\2}\in\krn{T} \end{align*} so the vector \(\vect{z}\) effectively “does nothing” in the evaluation of \(T\text{.}\)

⬜  Spanning Set for Range  A spanning set for the range of a linear transformation (Definition RLT) can be constructed easily by evaluating the linear transformation on a standard basis (Theorem SSRLT). \begin{equation*} \set{\colvector{-1\\-1\\1\\2\\1},\,\colvector{1\\2\\1\\3\\0},\, \colvector{-3\\-4\\1\\1\\2}} \end{equation*}
⬜  Range  A basis for the range of the linear transformation (Definition RLT). If the linear transformation is injective, then the spanning set just constructed is guaranteed to be linearly independent (Theorem ILTLI) and is therefore a basis of the range with no changes. Injective or not, this spanning set can be converted to a “nice” linearly independent spanning set by making the vectors the rows of a matrix (perhaps after using a vector representation), row-reducing, and retaining the nonzero rows (Theorem BRS), and perhaps un-coordinatizing. \begin{equation*} \set{\colvector{1\\0\\-3\\-7\\-2},\,\colvector{0\\1\\2\\5\\1} } \end{equation*}
⬜  Surjective?  Is the linear transformation surjective (Definition SLT)? No.

The dimension of the range is 2, and the codomain (\(\complex{5}\)) has dimension 5. So the transformation is not onto. Notice too that since the domain \(\complex{3}\) has dimension 3, it is impossible for the range to have a dimension greater than 3, and no matter what the actual definition of the function, it cannot possibly be onto.

To be more precise, verify that \(\colvector{2\\3\\1\\1\\1}\not\in\rng{T}\text{,}\) by setting the output equal to this vector and seeing that the resulting system of linear equations has no solution, i.e. is inconsistent. So the preimage, \(\preimage{T}{\colvector{2\\3\\1\\1\\1}}\text{,}\) is empty. This alone is sufficient to see that the linear transformation is not onto.

⬜  Subspace Dimensions  Subspace dimensions associated with the linear transformation (Definition ROLT, Definition NOLT). Verify Theorem RPNDD, and examine parallels with earlier results for matrices. \begin{align*} \text{rank}&=2&\text{nullity}&=1&\text{domain}&=3 \end{align*}
⬜  Invertible?  Is the linear transformation invertible (Definition IVLT; examine parallels with the existence of matrix inverses)? No.

Not injective, and the relative dimensions of the domain and codomain prohibit any possibility of being surjective.

⬜  Matrix Representation  Matrix representation of the linear transformation, as described in Theorem MLTCV. (See also Example MOLT.) If \(A\) is the matrix below, then \(\lteval{T}{\vect{x}} = A\vect{x}\text{.}\) This computation may also be viewed as an application of Definition MR and Theorem FTMR from Section MR, where the bases are chosen to be the standard bases of \(\complex{m}\) (Definition SUV). \begin{equation*} \begin{bmatrix} -1&1&-3\\ -1&2&-4\\ 1&1&1\\ 2&3&1\\ 1&0&2 \end{bmatrix} \end{equation*}

Archetype P

⬜  Summary   Linear transformation with a domain smaller than its codomain, so it is guaranteed to not be surjective. Happens to be injective.
⬜  Definition  A linear transformation (Definition LT). \begin{equation*} \ltdefn{T}{\complex{3}}{\complex{5}},\quad \lteval{T}{\colvector{x_1\\x_2\\x_3}}= \colvector{-x_1 + x_2 + x_3\\ -x_1 + 2 x_2 + 2 x_3\\ x_1 + x_2 + 3 x_3\\ 2 x_1 + 3 x_2 + x_3\\ -2 x_1 + x_2 + 3 x_3} \end{equation*}
⬜  Kernel  A basis for the kernel of the linear transformation (Definition KLT). \begin{equation*} \set{\ } \end{equation*}
⬜  Injective?  Is the linear transformation injective (Definition ILT)? Yes.

Since \(\krn{T}=\set{\zerovector}\text{,}\) Theorem KILT tells us that \(T\) is injective.

⬜  Spanning Set for Range  A spanning set for the range of a linear transformation (Definition RLT) can be constructed easily by evaluating the linear transformation on a standard basis (Theorem SSRLT). \begin{equation*} \set{\colvector{-1\\-1\\1\\2\\-2},\,\colvector{1\\2\\1\\3\\1},\,\colvector{1\\2\\3\\1\\3}} \end{equation*}
⬜  Range  A basis for the range of the linear transformation (Definition RLT). If the linear transformation is injective, then the spanning set just constructed is guaranteed to be linearly independent (Theorem ILTLI) and is therefore a basis of the range with no changes. Injective or not, this spanning set can be converted to a “nice” linearly independent spanning set by making the vectors the rows of a matrix (perhaps after using a vector representation), row-reducing, and retaining the nonzero rows (Theorem BRS), and perhaps un-coordinatizing. \begin{equation*} \set{\colvector{1\\0\\0\\-10\\6},\,\colvector{0\\1\\0\\7\\-3},\,\colvector{0\\0\\1\\-1\\1}} \end{equation*}
⬜  Surjective?  Is the linear transformation surjective (Definition SLT)? No.

The dimension of the range is 3, and the codomain (\(\complex{5}\)) has dimension 5. So the transformation is not surjective. Notice too that since the domain \(\complex{3}\) has dimension 3, it is impossible for the range to have a dimension greater than 3, and no matter what the actual definition of the function, it cannot possibly be surjective in this situation.

To be more precise, verify that \(\colvector{2\\1\\-3\\2\\6}\not\in\rng{T}\text{,}\) by setting the output equal to this vector and seeing that the resulting system of linear equations has no solution, i.e. is inconsistent. So the preimage, \(\preimage{T}{\colvector{2\\1\\-3\\2\\6}}\text{,}\) is empty. This alone is sufficient to see that the linear transformation is not onto.

⬜  Subspace Dimensions  Subspace dimensions associated with the linear transformation (Definition ROLT, Definition NOLT). Verify Theorem RPNDD, and examine parallels with earlier results for matrices. \begin{align*} \text{rank}&=3&\text{nullity}&=0&\text{domain}&=3 \end{align*}
⬜  Invertible?  Is the linear transformation invertible (Definition IVLT; examine parallels with the existence of matrix inverses)? No.

The relative dimensions of the domain and codomain prohibit any possibility of being surjective, so apply Theorem ILTIS.

⬜  Matrix Representation  Matrix representation of the linear transformation, as described in Theorem MLTCV. (See also Example MOLT.) If \(A\) is the matrix below, then \(\lteval{T}{\vect{x}} = A\vect{x}\text{.}\) This computation may also be viewed as an application of Definition MR and Theorem FTMR from Section MR, where the bases are chosen to be the standard bases of \(\complex{m}\) (Definition SUV). \begin{equation*} \begin{bmatrix} -1&1&1\\ -1&2&2\\ 1&1&3\\ 2&3&1\\ -2&1&3 \end{bmatrix} \end{equation*}

Archetype Q

⬜  Summary   Linear transformation with equal-sized domain and codomain, so it has the potential to be invertible, but in this case is not. Neither injective nor surjective. Diagonalizable, though.
⬜  Definition  A linear transformation (Definition LT). \begin{equation*} \ltdefn{T}{\complex{5}}{\complex{5}},\quad \lteval{T}{\colvector{x_1\\x_2\\x_3\\x_4\\x_5}}= \colvector{-2 x_1 + 3 x_2 + 3 x_3 - 6 x_4 + 3 x_5\\ -16 x_1 + 9 x_2 + 12 x_3 - 28 x_4 + 28 x_5\\ -19 x_1 + 7 x_2 + 14 x_3 - 32 x_4 + 37 x_5\\ -21 x_1 + 9 x_2 + 15 x_3 - 35 x_4 + 39 x_5\\ -9 x_1 + 5 x_2 + 7 x_3 - 16 x_4 + 16 x_5} \end{equation*}
⬜  Kernel  A basis for the kernel of the linear transformation (Definition KLT). \begin{equation*} \set{\colvector{3\\4\\1\\3\\3}} \end{equation*}
⬜  Injective?  Is the linear transformation injective (Definition ILT)? No.

Since the kernel is nontrivial Theorem KILT tells us that the linear transformation is not injective. In particular, verify that \begin{align*} \lteval{T}{\colvector{1\\3\\-1\\2\\4}}&=\colvector{4\\55\\72\\77\\31}& \lteval{T}{\colvector{4\\7\\0\\5\\7}}&=\colvector{4\\55\\72\\77\\31}\text{.} \end{align*} This demonstration that \(T\) is not injective is constructed with the observation that \begin{align*} \colvector{4\\7\\0\\5\\7}&=\colvector{1\\3\\-1\\2\\4}+\colvector{3\\4\\1\\3\\3}\\ \end{align*} and \begin{align*} \vect{z}&=\colvector{3\\4\\1\\3\\3}\in\krn{T} \end{align*} so the vector \(\vect{z}\) effectively “does nothing” in the evaluation of \(T\text{.}\)
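Both evaluations can be reproduced by machine through the matrix representation recorded below; a SymPy sketch (assumed tooling, not part of the archetype):

    from sympy import Matrix

    # Matrix representation of T, from the Matrix Representation item below.
    A = Matrix([
        [ -2, 3,  3,  -6,  3],
        [-16, 9, 12, -28, 28],
        [-19, 7, 14, -32, 37],
        [-21, 9, 15, -35, 39],
        [ -9, 5,  7, -16, 16],
    ])
    u = Matrix([1, 3, -1, 2, 4])
    z = Matrix([3, 4, 1, 3, 3])      # the kernel basis vector
    print(A * z)                     # the zero vector
    print(A * u == A * (u + z))      # True: two different inputs, one output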

⬜  Spanning Set for Range  A spanning set for the range of a linear transformation (Definition RLT) can be constructed easily by evaluating the linear transformation on a standard basis (Theorem SSRLT). \begin{equation*} \set{\colvector{-2\\-16\\-19\\-21\\-9},\,\colvector{3\\9\\7\\9\\5},\,\colvector{3\\12\\14\\15\\7},\,\colvector{-6\\-28\\-32\\-35\\-16},\,\colvector{3\\28\\37\\39\\16}} \end{equation*}
⬜  Range  A basis for the range of the linear transformation (Definition RLT). If the linear transformation is injective, then the spanning set just constructed is guaranteed to be linearly independent (Theorem ILTLI) and is therefore a basis of the range with no changes. Injective or not, this spanning set can be converted to a “nice” linearly independent spanning set by making the vectors the rows of a matrix (perhaps after using a vector representation), row-reducing, and retaining the nonzero rows (Theorem BRS), and perhaps un-coordinatizing. \begin{equation*} \set{\colvector{1\\0\\0\\0\\1},\,\colvector{0\\1\\0\\0\\-1},\,\colvector{0\\0\\1\\0\\-1},\,\colvector{0\\0\\0\\1\\2}} \end{equation*}
⬜  Surjective?  Is the linear transformation surjective (Definition SLT)? No.

The dimension of the range is 4, and the codomain (\(\complex{5}\)) has dimension 5. So \(\rng{T}\neq\complex{5}\) and by Theorem RSLT the transformation is not surjective.

To be more precise, verify that \(\colvector{-1\\2\\3\\-1\\4}\not\in\rng{T}\text{,}\) by setting the output equal to this vector and seeing that the resulting system of linear equations has no solution, i.e. is inconsistent. So the preimage, \(\preimage{T}{\colvector{-1\\2\\3\\-1\\4}}\text{,}\) is empty. This alone is sufficient to see that the linear transformation is not onto.

⬜  Subspace Dimensions  Subspace dimensions associated with the linear transformation (Definition ROLT, Definition NOLT). Verify Theorem RPNDD, and examine parallels with earlier results for matrices. \begin{align*} \text{rank}&=4&\text{nullity}&=1&\text{domain}&=5 \end{align*}
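Theorem RPNDD can be checked mechanically: the rank plus the nullity of any matrix representation must equal the dimension of the domain. A one-line SymPy check (assumed tooling), reusing the matrix representation below:

    from sympy import Matrix

    A = Matrix([
        [ -2, 3,  3,  -6,  3],
        [-16, 9, 12, -28, 28],
        [-19, 7, 14, -32, 37],
        [-21, 9, 15, -35, 39],
        [ -9, 5,  7, -16, 16],
    ])
    rank, nullity = A.rank(), len(A.nullspace())
    print(rank, nullity, rank + nullity == A.cols)  # 4 1 True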
⬜  Invertible?  Is the linear transformation invertible (Definition IVLT, and examine parallels with the existence of matrix inverses)? No.

Neither injective nor surjective. Notice that since the domain and codomain have the same dimension, either the transformation is both onto and one-to-one (making it invertible) or else it is both not onto and not one-to-one (as in this case) by Theorem RPNDD.

⬜  Matrix Representation  Matrix representation of the linear transformation, as described in Theorem MLTCV. (See also Example MOLT.) If \(A\) is the matrix below, then \(\lteval{T}{\vect{x}} = A\vect{x}\text{.}\) This computation may also be viewed as an application of Definition MR and Theorem FTMR from Section MR, where the bases are chosen to be the standard bases of \(\complex{m}\) (Definition SUV). \begin{equation*} \begin{bmatrix} -2&3&3&-6&3\\ -16&9&12&-28&28\\ -19&7&14&-32&37\\ -21&9&15&-35&39\\ -9&5&7&-16&16 \end{bmatrix} \end{equation*}
⬜  Eigenvalues, Eigenspaces  Eigenvalues, and bases for eigenspaces (Definition EELT, Theorem EER). Evaluate the linear transformation with each eigenvector as an interesting check. \begin{align*} \eigensystem{T}{-1}{\colvector{0\\2\\3\\3\\1}}\\ \eigensystem{T}{0}{\colvector{3\\4\\1\\3\\3}}\\ \eigensystem{T}{1}{\colvector{5\\3\\0\\0\\2},\,\colvector{-3\\1\\0\\2\\0},\,\colvector{1\\-1\\2\\0\\0}}\\ %& forces align environment \end{align*}
⬜  Diagonal Matrix Representation  A diagonal matrix representation relative to a basis of eigenvectors. \begin{equation*} \text{basis}=\set{\colvector{0\\2\\3\\3\\1},\,\colvector{3\\4\\1\\3\\3},\, \colvector{5\\3\\0\\0\\2},\,\colvector{-3\\1\\0\\2\\0},\, \colvector{1\\-1\\2\\0\\0}} \end{equation*} \begin{equation*} \text{matrix representation}=\begin{bmatrix} -1&0&0&0&0\\ 0&0&0&0&0\\ 0&0&1&0&0\\ 0&0&0&1&0\\ 0&0&0&0&1 \end{bmatrix} \end{equation*}
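To check this diagonalization by machine (a SymPy sketch, with the eigenvectors exactly as given above): form the matrix \(S\) whose columns are the basis vectors and verify that \(\inverse{S}AS\) is the diagonal matrix just displayed.

    from sympy import Matrix, diag

    A = Matrix([
        [ -2, 3,  3,  -6,  3],
        [-16, 9, 12, -28, 28],
        [-19, 7, 14, -32, 37],
        [-21, 9, 15, -35, 39],
        [ -9, 5,  7, -16, 16],
    ])
    S = Matrix([               # columns are the eigenvectors, in order
        [0, 3,  5, -3,  1],
        [2, 4,  3,  1, -1],
        [3, 1,  0,  0,  2],
        [3, 3,  0,  2,  0],
        [1, 3,  2,  0,  0],
    ])
    print(S.inv() * A * S == diag(-1, 0, 1, 1, 1))  # True
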

Archetype R

⬜  Summary   Linear transformation with equal-sized domain and codomain. Injective, surjective, invertible, diagonalizable, the works.
⬜  Definition  A linear transformation (Definition LT). \begin{equation*} \ltdefn{T}{\complex{5}}{\complex{5}},\quad \lteval{T}{\colvector{x_1\\x_2\\x_3\\x_4\\x_5}}= \colvector{-65 x_1 + 128 x_2 + 10 x_3 - 262 x_4 + 40 x_5\\ 36 x_1 - 73 x_2 - x_3 + 151 x_4 - 16 x_5\\ -44 x_1 + 88 x_2 + 5 x_3 - 180 x_4 + 24 x_5\\ 34 x_1 - 68 x_2 - 3 x_3 + 140 x_4 - 18 x_5\\ 12 x_1 - 24 x_2 - x_3 + 49 x_4 - 5 x_5} \end{equation*}
⬜  Kernel  A basis for the kernel of the linear transformation (Definition KLT). \begin{equation*} \set{\ } \end{equation*}
⬜  Injective?  Is the linear transformation injective (Definition ILT)? Yes.

Since the kernel is trivial Theorem KILT tells us that the linear transformation is injective.

⬜  Spanning Set for Range  A spanning set for the range of a linear transformation (Definition RLT) can be constructed easily by evaluating the linear transformation on a standard basis (Theorem SSRLT). \begin{equation*} \set{ \colvector{-65\\36\\-44\\34\\12},\, \colvector{128\\-73\\88\\-68\\-24},\, \colvector{10\\-1\\5\\-3\\-1},\, \colvector{-262\\151\\-180\\140\\49},\, \colvector{40\\-16\\24\\-18\\-5}} \end{equation*}
⬜  Range  A basis for the range of the linear transformation (Definition RLT). If the linear transformation is injective, then the spanning set just constructed is guaranteed to be linearly independent (Theorem ILTLI) and is therefore a basis of the range with no changes. Injective or not, this spanning set can be converted to a “nice” linearly independent spanning set by making the vectors the rows of a matrix (perhaps after using a vector representation), row-reducing, and retaining the nonzero rows (Theorem BRS), and perhaps un-coordinatizing. \begin{equation*} \set{\colvector{1\\0\\0\\0\\0},\,\colvector{0\\1\\0\\0\\0},\,\colvector{0\\0\\1\\0\\0},\,\colvector{0\\0\\0\\1\\0},\,\colvector{0\\0\\0\\0\\1}} \end{equation*}
⬜  Surjective?  Is the linear transformation surjective (Definition SLT)? Yes.

A basis for the range is the standard basis of \(\complex{5}\text{,}\) so \(\rng{T}=\complex{5}\) and Theorem RSLT tells us \(T\) is surjective. Or, the dimension of the range is 5, and the codomain (\(\complex{5}\)) has dimension 5. So the transformation is surjective.

⬜  Subspace Dimensions  Subspace dimensions associated with the linear transformation (Definition ROLT, Definition NOLT). Verify Theorem RPNDD, and examine parallels with earlier results for matrices. \begin{align*} \text{rank}&=5&\text{nullity}&=0&\text{domain}&=5 \end{align*}
⬜  Invertible?  Is the linear transformation invertible (Definition IVLT, and examine parallels with the existence of matrix inverses)? Yes.

Both injective and surjective (Theorem ILTIS). Notice that since the domain and codomain have the same dimension, either the transformation is both injective and surjective (making it invertible, as in this case) or else it is both not injective and not surjective.

⬜  Matrix Representation  Matrix representation of the linear transformation, as described in Theorem MLTCV. (See also Example MOLT.) If \(A\) is the matrix below, then \(\lteval{T}{\vect{x}} = A\vect{x}\text{.}\) This computation may also be viewed as an application of Definition MR and Theorem FTMR from Section MR, where the bases are chosen to be the standard bases of \(\complex{m}\) (Definition SUV). \begin{equation*} \begin{bmatrix} -65&128&10&-262&40\\ 36&-73&-1&151&-16\\ -44&88&5&-180&24\\ 34&-68&-3&140&-18\\ 12&-24&-1&49&-5 \end{bmatrix} \end{equation*}
⬜  Eigenvalues, Eigenspaces  Eigenvalues, and bases for eigenspaces (Definition EELT, Theorem EER). Evaluate the linear transformation with each eigenvector as an interesting check. \begin{align*} \eigensystem{T}{-1}{\colvector{-57\\0\\-18\\14\\5},\,\colvector{2\\1\\0\\0\\0}}\\ \eigensystem{T}{1}{\colvector{-10\\-5\\-6\\0\\1},\,\colvector{2\\3\\1\\1\\0}}\\ \eigensystem{T}{2}{\colvector{-6\\3\\-4\\3\\1}}\\ %& forces align environment \end{align*}
⬜  Diagonal Matrix Representation  A diagonal matrix representation relative to a basis of eigenvectors. \begin{equation*} \text{basis}=\set{\colvector{-57\\0\\-18\\14\\5},\,\colvector{2\\1\\0\\0\\0},\,\colvector{-10\\-5\\-6\\0\\1},\,\colvector{2\\3\\1\\1\\0},\,\colvector{-6\\3\\-4\\3\\1}} \end{equation*} \begin{equation*} \text{matrix representation}=\begin{bmatrix} -1&0&0&0&0\\ 0&-1&0&0&0\\ 0&0&1&0&0\\ 0&0&0&1&0\\ 0&0&0&0&2\end{bmatrix} \end{equation*}

Archetype S

⬜  Summary   Domain is column vectors, codomain is matrices. Domain is dimension 3 and codomain is dimension 4. Not injective, not surjective.
⬜  Definition  A linear transformation (Definition LT). \begin{equation*} \ltdefn{T}{\complex{3}}{M_{22}},\quad \lteval{T}{\colvector{a\\b\\c}}= \begin{bmatrix} a-b&2a+2b+c\\ 3a+b+c&-2a-6b-2c \end{bmatrix} \end{equation*}
⬜  Kernel  A basis for the kernel of the linear transformation (Definition KLT). \begin{equation*} \set{\colvector{-1\\-1\\4}} \end{equation*}
⬜  Injective?  Is the linear transformation injective (Definition ILT)? No.

Since the kernel is nontrivial Theorem KILT tells us that the linear transformation is not injective. In particular, verify that \begin{align*} \lteval{T}{\colvector{2\\1\\3}}&=\begin{bmatrix}1&9\\10&-16\end{bmatrix} & \lteval{T}{\colvector{0\\-1\\11}}&=\begin{bmatrix}1&9\\10&-16\end{bmatrix}\text{.} \end{align*} This demonstration that \(T\) is not injective is constructed with the observation that \begin{align*} \colvector{0\\-1\\11}&=\colvector{2\\1\\3}+\colvector{-2\\-2\\8}\\ \end{align*} and \begin{align*} \vect{z}&=\colvector{-2\\-2\\8}\in\krn{T} \end{align*} so the vector \(\vect{z}\) effectively “does nothing” in the evaluation of \(T\text{.}\)

⬜  Spanning Set for Range  A spanning set for the range of a linear transformation (Definition RLT) can be constructed easily by evaluating the linear transformation on a standard basis (Theorem SSRLT). \begin{equation*} \set{\begin{bmatrix}1&2\\3&-2\end{bmatrix},\, \begin{bmatrix}-1&2\\1&-6\end{bmatrix},\, \begin{bmatrix}0&1\\1&-2\end{bmatrix}} \end{equation*}
⬜  Range  A basis for the range of the linear transformation (Definition RLT). If the linear transformation is injective, then the spanning set just constructed is guaranteed to be linearly independent (Theorem ILTLI) and is therefore a basis of the range with no changes. Injective or not, this spanning set can be converted to a “nice” linearly independent spanning set by making the vectors the rows of a matrix (perhaps after using a vector representation), row-reducing, and retaining the nonzero rows (Theorem BRS), and perhaps un-coordinatizing. \begin{equation*} \set{ \begin{bmatrix}1&0\\1&2\end{bmatrix},\, \begin{bmatrix}0&1\\1&-2\end{bmatrix} } \end{equation*}
⬜  Surjective?  Is the linear transformation surjective (Definition SLT)? No.

The dimension of the range is 2, and the codomain (\(M_{22}\)) has dimension 4. So the transformation is not surjective. Notice too that since the domain \(\complex{3}\) has dimension 3, it is impossible for the range to have a dimension greater than 3, and no matter what the actual definition of the function, it cannot possibly be surjective in this situation.

To be more precise, verify that \(\begin{bmatrix}2& -1\\1& 3\end{bmatrix}\not\in\rng{T}\text{,}\) by setting the output of \(T\) equal to this matrix and seeing that the resulting system of linear equations has no solution, i.e. is inconsistent. So the preimage, \(\preimage{T}{\begin{bmatrix}2& -1\\1& 3\end{bmatrix}}\text{,}\) is empty. This alone is sufficient to see that the linear transformation is not onto.

⬜  Subspace Dimensions  Subspace dimensions associated with the linear transformation (Definition ROLT, Definition NOLT). Verify Theorem RPNDD, and examine parallels with earlier results for matrices. \begin{align*} \text{rank}&=2&\text{nullity}&=1&\text{domain}&=3 \end{align*}
⬜  Invertible?  Is the linear transformation invertible (Definition IVLT, and examine parallels with the existence of matrix inverses)? No.

Not injective (Theorem ILTIS), and the relative dimensions of the domain and codomain prohibit any possibility of being surjective.

⬜  Matrix Representation  Matrix representation of the linear transformation, as given by Definition MR and explained by Theorem FTMR. \begin{equation*} \text{domain basis}=\set{\colvector{1\\0\\0},\,\colvector{0\\1\\0},\,\colvector{0\\0\\1}} \end{equation*} \begin{equation*} \text{codomain basis}=\set{\begin{bmatrix}1&0\\0&0\end{bmatrix},\, \begin{bmatrix}0&1\\0&0\end{bmatrix},\, \begin{bmatrix}0&0\\1&0\end{bmatrix},\, \begin{bmatrix}0&0\\0&1\end{bmatrix}} \end{equation*} \begin{equation*} \text{matrix representation}=\begin{bmatrix} 1 & -1 & 0 \\ 2 & 2 & 1 \\ 3 & 1 & 1 \\ -2 & -6 & -2 \end{bmatrix} \end{equation*}
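Theorem FTMR says this matrix converts coordinate vectors of inputs into coordinate vectors of outputs. A small SymPy check (assumed tooling), reusing the evaluation \(\lteval{T}{\colvector{2\\1\\3}}\) shown earlier for this archetype:

    from sympy import Matrix

    M = Matrix([
        [ 1, -1,  0],
        [ 2,  2,  1],
        [ 3,  1,  1],
        [-2, -6, -2],
    ])
    v = Matrix([2, 1, 3])
    # T(v) = [[1, 9], [10, -16]], whose coordinate vector relative to the
    # codomain basis (reading the matrix entry by entry) is (1, 9, 10, -16).
    print(M * v)  # Matrix([[1], [9], [10], [-16]])
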

Archetype T

⬜  Summary   Domain and codomain are polynomials. Domain has dimension 5, while codomain has dimension 6. Is injective, cannot be surjective.
⬜  Definition  A linear transformation (Definition LT). \begin{equation*} \ltdefn{T}{P_4}{P_5},\quad\lteval{T}{p(x)}=(x-2)p(x) \end{equation*}
⬜  Kernel  A basis for the kernel of the linear transformation (Definition KLT). \begin{equation*} \set{\ } \end{equation*}
⬜  Injective?  Is the linear transformation injective (Definition ILT)? Yes.

Since the kernel is trivial Theorem KILT tells us that the linear transformation is injective.

⬜  Spanning Set for Range  A spanning set for the range of a linear transformation (Definition RLT) can be constructed easily by evaluating the linear transformation on a standard basis (Theorem SSRLT). \begin{equation*} \set{ x-2,\, x^2-2x,\, x^3-2x^2,\, x^4-2x^3,\, x^5-2x^4} \end{equation*}
⬜  Range  A basis for the range of the linear transformation (Definition RLT). If the linear transformation is injective, then the spanning set just constructed is guaranteed to be linearly independent (Theorem ILTLI) and is therefore a basis of the range with no changes. Injective or not, this spanning set can be converted to a “nice” linearly independent spanning set by making the vectors the rows of a matrix (perhaps after using a vector representation), row-reducing, and retaining the nonzero rows (Theorem BRS), and perhaps un-coordinatizing. \begin{equation*} \set{ -\frac{1}{32}x^5+1,\, -\frac{1}{16}x^5+x,\, -\frac{1}{8}x^5+x^2,\, -\frac{1}{4}x^5+x^3,\, -\frac{1}{2}x^5+x^4 } \end{equation*}
⬜  Surjective?  Is the linear transformation surjective (Definition SLT)? No.

The dimension of the range is 5, and the codomain (\(P_5\)) has dimension 6. So the transformation is not surjective. Notice too that since the domain \(P_4\) has dimension 5, it is impossible for the range to have a dimension greater than 5, and no matter what the actual definition of the function, it cannot possibly be surjective in this situation.

To be more precise, verify that \(1+x+x^2+x^3+x^4\not\in\rng{T}\text{,}\) by setting the output equal to this vector and seeing that the resulting system of linear equations has no solution, i.e. is inconsistent. (Indeed, every element of the range has the form \((x-2)p(x)\) and so evaluates to zero at \(x=2\text{,}\) while this polynomial evaluates to \(1+2+4+8+16=31\) there.) So the preimage, \(\preimage{T}{1+x+x^2+x^3+x^4}\text{,}\) is empty. This alone is sufficient to see that the linear transformation is not onto.

⬜  Subspace Dimensions  Subspace dimensions associated with the linear transformation (Definition ROLT, Definition NOLT). Verify Theorem RPNDD, and examine parallels with earlier results for matrices. \begin{align*} \text{rank}&=5&\text{nullity}&=0&\text{domain}&=5 \end{align*}
⬜  Invertible?  Is the linear transformation invertible (Definition IVLT, and examine parallels with the existence of matrix inverses)? No.

The relative dimensions of the domain and codomain prohibit any possibility of being surjective, so apply Theorem ILTIS.

⬜  Matrix Representation  Matrix representation of the linear transformation, as given by Definition MR and explained by Theorem FTMR. \begin{equation*} \text{domain basis}=\set{1,\,x,\,x^2,\,x^3,\,x^4} \end{equation*} \begin{equation*} \text{codomain basis}=\set{1,\,x,\,x^2,\,x^3,\,x^4,\,x^5} \end{equation*} \begin{equation*} \text{matrix representation}=\begin{bmatrix} -2 & 0 & 0 & 0 & 0 \\ 1 & -2 & 0 & 0 & 0 \\ 0 & 1 & -2 & 0 & 0 \\ 0 & 0 & 1 & -2 & 0 \\ 0 & 0 & 0 & 1 & -2 \\ 0 & 0 & 0 & 0 & 1 \end{bmatrix} \end{equation*}
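Each column of this matrix holds the coefficients of \((x-2)x^j\) relative to the codomain basis. A sketch of how the whole representation could be assembled with SymPy (an assumption of this aside; the archetype itself only asserts the final matrix):

    from sympy import symbols, Matrix

    x = symbols('x')
    cols = []
    for j in range(5):  # domain basis 1, x, x^2, x^3, x^4
        expr = ((x - 2) * x**j).expand()
        # coordinates relative to the codomain basis 1, x, ..., x^5
        cols.append([expr.coeff(x, i) for i in range(6)])
    M = Matrix(cols).T  # columns are the coordinate vectors
    print(M)
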

Archetype U

⬜  Summary   Domain is matrices, codomain is column vectors. Domain has dimension 6, while codomain has dimension 4. Cannot be injective, is surjective.
⬜  Definition  A linear transformation (Definition LT). \begin{equation*} \ltdefn{T}{M_{23}}{\complex{4}},\quad \lteval{T}{\begin{bmatrix}a&b&c\\d&e&f\end{bmatrix}}= \colvector{a+2b+12c-3d+e+6f\\2a-b-c+d-11f\\a+b+7c+2d+e-3f\\a+2b+12c+5e-5f} \end{equation*}
⬜  Kernel  A basis for the kernel of the linear transformation (Definition KLT). \begin{equation*} \set{ \begin{bmatrix} 3 & -4 & 0\\1 & 2 & 1 \end{bmatrix} ,\, \begin{bmatrix} -2& -5& 1\\0 & 0 & 0 \end{bmatrix} } \end{equation*}
⬜  Injective?  Is the linear transformation injective (Definition ILT)? No.

Since the kernel is nontrivial Theorem KILT tells us that the linear transformation is not injective. Also, since the rank cannot exceed 4, we are guaranteed to have a nullity of at least 2, just from checking dimensions of the domain and the codomain. In particular, verify that \begin{align*} \lteval{T}{\begin{bmatrix}1&10&-2\\3&-1&1\end{bmatrix}}&=\colvector{-7\\-14\\-1\\-13} & \lteval{T}{\begin{bmatrix}5&-3&-1\\5&3&3\end{bmatrix}}&=\colvector{-7\\-14\\-1\\-13}\text{.} \end{align*} This demonstration that \(T\) is not injective is constructed with the observation that \begin{align*} \begin{bmatrix}5&-3&-1\\5&3&3\end{bmatrix} &=\begin{bmatrix}1&10&-2\\3&-1&1\end{bmatrix}+\begin{bmatrix}4&-13&1\\2&4&2\end{bmatrix}\\ \end{align*} and \begin{align*} \vect{z}&=\begin{bmatrix}4&-13&1\\2&4&2\end{bmatrix}\in\krn{T} \end{align*} so the vector \(\vect{z}\) effectively “does nothing” in the evaluation of \(T\text{.}\)

⬜  Spanning Set for Range  A spanning set for the range of a linear transformation (Definition RLT) can be constructed easily by evaluating the linear transformation on a standard basis (Theorem SSRLT). \begin{equation*} \set{\colvector{1\\2\\1\\1},\, \colvector{2\\-1\\1\\2},\, \colvector{12\\-1\\7\\12},\, \colvector{-3\\1\\2\\0},\, \colvector{1\\0\\1\\5},\, \colvector{6\\-11\\-3\\-5}} \end{equation*}
⬜  Range  A basis for the range of the linear transformation (Definition RLT). If the linear transformation is injective, then the spanning set just constructed is guaranteed to be linearly independent (Theorem ILTLI) and is therefore a basis of the range with no changes. Injective or not, this spanning set can be converted to a “nice” linearly independent spanning set by making the vectors the rows of a matrix (perhaps after using a vector representation), row-reducing, and retaining the nonzero rows (Theorem BRS), and perhaps un-coordinatizing. \begin{equation*} \set{ \colvector{1\\0\\0\\0},\, \colvector{0\\1\\0\\0},\, \colvector{0\\0\\1\\0},\, \colvector{0\\0\\0\\1} } \end{equation*}
⬜  Surjective?  Is the linear transformation surjective (Definition SLT)? Yes.

A basis for the range is the standard basis of \(\complex{4}\text{,}\) so \(\rng{T}=\complex{4}\) and Theorem RSLT tells us \(T\) is surjective. Or, the dimension of the range is 4, and the codomain (\(\complex{4}\)) has dimension 4. So the transformation is surjective.

⬜  Subspace Dimensions  Subspace dimensions associated with the linear transformation (Definition ROLT, Definition NOLT). Verify Theorem RPNDD, and examine parallels with earlier results for matrices. \begin{align*} \text{rank}&=4&\text{nullity}&=2&\text{domain}&=6 \end{align*}
⬜  Invertible?  Is the linear transformation invertible (Definition IVLT, and examine parallels with the existence of matrix inverses)? No.

The relative dimensions of the domain and codomain prohibit any possibility of being injective, so apply Theorem ILTIS.

⬜  Matrix Representation  Matrix representation of the linear transformation, as given by Definition MR and explained by Theorem FTMR. \begin{equation*} \text{domain basis}=\set{\begin{bmatrix}1&0&0\\0&0&0\end{bmatrix},\, \begin{bmatrix}0&1&0\\0&0&0\end{bmatrix},\, \begin{bmatrix}0&0&1\\0&0&0\end{bmatrix},\, \begin{bmatrix}0&0&0\\1&0&0\end{bmatrix},\, \begin{bmatrix}0&0&0\\0&1&0\end{bmatrix},\, \begin{bmatrix}0&0&0\\0&0&1\end{bmatrix}} \end{equation*} \begin{equation*} \text{codomain basis}=\set{\colvector{1\\0\\0\\0},\, \colvector{0\\1\\0\\0},\, \colvector{0\\0\\1\\0},\, \colvector{0\\0\\0\\1}} \end{equation*} \begin{equation*} \text{matrix representation}=\begin{bmatrix} 1 & 2 & 12 & -3 & 1 & 6 \\ 2 & -1 & -1 & 1 & 0 & -11 \\ 1 & 1 & 7 & 2 & 1 & -3 \\ 1 & 2 & 12 & 0 & 5 & -5 \end{bmatrix} \end{equation*}

Archetype V

⬜  Summary   Domain is polynomials, codomain is matrices. Both domain and codomain have dimension 4. Injective, surjective, invertible. Square matrix representation, but domain and codomain are unequal, so no eigenvalue information.
⬜  Definition  A linear transformation (Definition LT). \begin{equation*} \ltdefn{T}{P_3}{M_{22}},\quad\lteval{T}{a+bx+cx^2+dx^3}= \begin{bmatrix} a+b & a-2c\\ d & b-d \end{bmatrix} \end{equation*}
⬜  Kernel  A basis for the kernel of the linear transformation (Definition KLT). \begin{equation*} \set{\ } \end{equation*}
⬜  Injective?  Is the linear transformation injective (Definition ILT)? Yes.

Since the kernel is trivial Theorem KILT tells us that the linear transformation is injective.

⬜  Spanning Set for Range  A spanning set for the range of a linear transformation (Definition RLT) can be constructed easily by evaluating the linear transformation on a standard basis (Theorem SSRLT). \begin{equation*} \set{\begin{bmatrix}1&1\\0&0\end{bmatrix},\, \begin{bmatrix}1&0\\0&1\end{bmatrix},\, \begin{bmatrix}0&-2\\0&0\end{bmatrix},\, \begin{bmatrix}0&0\\1&-1\end{bmatrix}} \end{equation*}
⬜  Range  A basis for the range of the linear transformation (Definition RLT). If the linear transformation is injective, then the spanning set just constructed is guaranteed to be linearly independent (Theorem ILTLI) and is therefore a basis of the range with no changes. Injective or not, this spanning set can be converted to a “nice” linearly independent spanning set by making the vectors the rows of a matrix (perhaps after using a vector representation), row-reducing, and retaining the nonzero rows (Theorem BRS), and perhaps un-coordinatizing. \begin{equation*} \set{ \begin{bmatrix}1&0\\0&0\end{bmatrix},\, \begin{bmatrix}0&1\\0&0\end{bmatrix},\, \begin{bmatrix}0&0\\1&0\end{bmatrix},\, \begin{bmatrix}0&0\\0&1\end{bmatrix} } \end{equation*}
⬜  Surjective?  Is the linear transformation surjective (Definition SLT)? Yes.

A basis for the range is the standard basis of \(M_{22}\text{,}\) so \(\rng{T}=M_{22}\) and Theorem RSLT tells us \(T\) is surjective. Or, the dimension of the range is 4, and the codomain (\(M_{22}\)) has dimension 4. So the transformation is surjective.

⬜  Subspace Dimensions  Subspace dimensions associated with the linear transformation (Definition ROLT, Definition NOLT). Verify Theorem RPNDD, and examine parallels with earlier results for matrices. \begin{align*} \text{rank}&=4&\text{nullity}&=0&\text{domain}&=4 \end{align*}
⬜  Invertible?  Is the linear transformation invertible (Definition IVLT, and examine parallels with the existence of matrix inverses)? Yes.

Both injective and surjective (Theorem ILTIS). Notice that since the domain and codomain have the same dimension, either the transformation is both injective and surjective (making it invertible, as in this case) or else it is both not injective and not surjective.

⬜  Matrix Representation  Matrix representation of the linear transformation, as given by Definition MR and explained by Theorem FTMR. \begin{equation*} \text{domain basis}=\set{1,\,x,\,x^2,\,x^3} \end{equation*} \begin{equation*} \text{codomain basis}=\set{\begin{bmatrix}1&0\\0&0\end{bmatrix},\, \begin{bmatrix}0&1\\0&0\end{bmatrix},\, \begin{bmatrix}0&0\\1&0\end{bmatrix},\, \begin{bmatrix}0&0\\0&1\end{bmatrix}} \end{equation*} \begin{equation*} \text{matrix representation}=\begin{bmatrix} 1 & 1 & 0 & 0 \\ 1 & 0 & -2 & 0 \\ 0 & 0 & 0 & 1 \\ 0 & 1 & 0 & -1 \end{bmatrix} \end{equation*}
⬜  Inverse Linear Transformation  The inverse linear transformation (Definition IVLT). Verify that \(T^{-1}\left(T\left(\vect{x}\right)\right)=\vect{x}\) and \(T\left(T^{-1}\left(\vect{x}\right)\right)=\vect{x}\text{.}\) \begin{equation*} \ltdefn{\ltinverse{T}}{M_{22}}{P_3},\quad\lteval{\ltinverse{T}}{\begin{bmatrix}a&b\\c&d\end{bmatrix}}=(a - c - d)+ (c + d)x +\frac{1}{2}(a - b - c - d)x^2+cx^3 \end{equation*}
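Relative to the bases used above, the matrix representation of \(\ltinverse{T}\) should be the inverse of the matrix representation of \(T\) (Theorem IMR). A SymPy cross-check (assumed tooling; the second matrix is transcribed from the formula for \(\ltinverse{T}\)):

    from sympy import Matrix, Rational, eye

    B = Matrix([               # representation of T
        [1, 1,  0,  0],
        [1, 0, -2,  0],
        [0, 0,  0,  1],
        [0, 1,  0, -1],
    ])
    C = Matrix([               # representation of T^{-1}, from its formula
        [1, 0, -1, -1],
        [0, 0,  1,  1],
        [Rational(1, 2), Rational(-1, 2), Rational(-1, 2), Rational(-1, 2)],
        [0, 0,  1,  0],
    ])
    print(B * C == eye(4), C == B.inv())  # True True
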

Archetype W

⬜  Summary   Domain is polynomials, codomain is polynomials. Domain and codomain both have dimension 3. Injective, surjective, invertible, 3 distinct eigenvalues, diagonalizable.
⬜  Definition  A linear transformation (Definition LT). \begin{align*} &\ltdefn{T}{P_2}{P_2}& \lteval{T}{a+bx+cx^2}&= \left(19a+6b-4c\right)+ \left(-24a-7b+4c\right)x+ \left(36a+12b-9c\right)x^2 \end{align*}
⬜  Kernel  A basis for the kernel of the linear transformation (Definition KLT). \begin{equation*} \set{\ } \end{equation*}
⬜  Injective?  Is the linear transformation injective (Definition ILT)? Yes.

Since the kernel is trivial Theorem KILT tells us that the linear transformation is injective.

⬜  Spanning Set for Range  A spanning set for the range of a linear transformation (Definition RLT) can be constructed easily by evaluating the linear transformation on a standard basis (Theorem SSRLT). \begin{equation*} \set{19-24x+36x^2,\, 6-7x+12x^2,\, -4+4x-9x^2} \end{equation*}
⬜  Range  A basis for the range of the linear transformation (Definition RLT). If the linear transformation is injective, then the spanning set just constructed is guaranteed to be linearly independent (Theorem ILTLI) and is therefore a basis of the range with no changes. Injective or not, this spanning set can be converted to a “nice” linearly independent spanning set by making the vectors the rows of a matrix (perhaps after using a vector representation), row-reducing, and retaining the nonzero rows (Theorem BRS), and perhaps un-coordinatizing. \begin{equation*} \set{1,\,x,\,x^2} \end{equation*}
⬜  Surjective?  Is the linear transformation surjective (Definition SLT)? Yes.

A basis for the range is the standard basis of \(P_2\text{,}\) so \(\rng{T}=P_2\) and Theorem RSLT tells us \(T\) is surjective. Or, the dimension of the range is 3, and the codomain (\(P_2\)) has dimension 3. So the transformation is surjective.

⬜  Subspace Dimensions  Subspace dimensions associated with the linear transformation (Definition ROLT, Definition NOLT). Verify Theorem RPNDD, and examine parallels with earlier results for matrices. \begin{align*} \text{rank}&=3&\text{nullity}&=0&\text{domain}&=3 \end{align*}
⬜  Invertible?  Is the linear transformation invertible (Definition IVLT, and examine parallels with the existence of matrix inverses)? Yes.

Both injective and surjective (Theorem ILTIS). Notice that since the domain and codomain have the same dimension, either the transformation is both injective and surjective (making it invertible, as in this case) or else it is both not injective and not surjective.

⬜  Matrix Representation  Matrix representation of the linear transformation, as given by Definition MR and explained by Theorem FTMR. \begin{equation*} \text{domain basis}=\set{1,\,x,\,x^2} \end{equation*} \begin{equation*} \text{codomain basis}=\set{1,\,x,\,x^2} \end{equation*} \begin{equation*} \text{matrix representation}=\begin{bmatrix} 19 & 6 & -4 \\ -24 & -7 & 4 \\ 36 & 12 & -9 \end{bmatrix} \end{equation*}
⬜  Inverse Linear Transformation  The inverse linear transformation (Definition IVLT). Verify that \(T^{-1}\left(T\left(\vect{x}\right)\right)=\vect{x}\) and \(T\left(T^{-1}\left(\vect{x}\right)\right)=\vect{x}\text{.}\) \begin{equation*} \ltdefn{\ltinverse{T}}{P_2}{P_2},\quad \lteval{\ltinverse{T}}{a+bx+cx^2} = (-5a-2b+\frac{4}{3}c)+ (24a+9b-\frac{20}{3}c)x + (12a+4b-\frac{11}{3}c)x^2 \end{equation*}
⬜  Eigenvalues, Eigenspaces  Eigenvalues, and bases for eigenspaces (Definition EELT, Theorem EER). Evaluate the linear transformation with each eigenvector as an interesting check. \begin{align*} \eigensystem{T}{-1}{2x+3x^2}\\ \eigensystem{T}{1}{-1+3x}\\ \eigensystem{T}{3}{1-2x+x^2}\\ %& forces align environment \end{align*}
⬜  Diagonal Matrix Representation  A diagonal matrix representation relative to a basis of eigenvectors. \begin{equation*} \text{basis}=\set{2x+3x^2,\,-1+3x,\,1-2x+x^2} \end{equation*} \begin{equation*} \text{matrix representation}=\begin{bmatrix} -1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 3 \end{bmatrix} \end{equation*}

Archetype X

⬜  Summary   Domain and codomain are square matrices. Domain and codomain both have dimension 4. Not injective, not surjective, not invertible, 3 distinct eigenvalues, diagonalizable.
⬜  Definition  A linear transformation (Definition LT). \begin{align*} &\ltdefn{T}{M_{22}}{M_{22}}& \lteval{T}{\begin{bmatrix}a&b\\c&d\end{bmatrix}}&= \begin{bmatrix} -2a+15b+3c+27d & 10b+6c+18d \\ a-5b -9d & -a-4b-5c-8d \end{bmatrix} \end{align*}
⬜  Kernel  A basis for the kernel of the linear transformation (Definition KLT). \begin{equation*} \set{ \begin{bmatrix} -6 & -3 \\ 2 & 1 \end{bmatrix} } \end{equation*}
⬜  Injective?  Is the linear transformation injective (Definition ILT)? No.

Since the kernel is nontrivial Theorem KILT tells us that the linear transformation is not injective. In particular, verify that \begin{align*} \lteval{T}{\begin{bmatrix}-2&0\\1&4\end{bmatrix}}&=\begin{bmatrix}115&78\\-38&-35\end{bmatrix} & \lteval{T}{\begin{bmatrix}4&3\\-1&3\end{bmatrix}}&=\begin{bmatrix}115&78\\-38&-35\end{bmatrix}\text{.} \end{align*} This demonstration that \(T\) is not injective is constructed with the observation that \begin{align*} \begin{bmatrix}4&3\\-1&3\end{bmatrix} &=\begin{bmatrix}-2&0\\1&4\end{bmatrix}+\begin{bmatrix}6&3\\-2&-1\end{bmatrix}\\ \end{align*} and \begin{align*} \vect{z}&=\begin{bmatrix}6&3\\-2&-1\end{bmatrix}\in\krn{T} \end{align*} so the vector \(\vect{z}\) effectively “does nothing” in the evaluation of \(T\text{.}\)

⬜  Spanning Set for Range  A spanning set for the range of a linear transformation (Definition RLT) can be constructed easily by evaluating the linear transformation on a standard basis (Theorem SSRLT). \begin{equation*} \set{\begin{bmatrix}-2&0\\1&-1\end{bmatrix},\, \begin{bmatrix}15&10\\-5&-4\end{bmatrix},\, \begin{bmatrix}3&6\\0&-5\end{bmatrix},\, \begin{bmatrix}27&18\\-9&-8\end{bmatrix}} \end{equation*}
⬜  Range  A basis for the range of the linear transformation (Definition RLT). If the linear transformation is injective, then the spanning set just constructed is guaranteed to be linearly independent (Theorem ILTLI) and is therefore a basis of the range with no changes. Injective or not, this spanning set can be converted to a “nice” linearly independent spanning set by making the vectors the rows of a matrix (perhaps after using a vector representation), row-reducing, and retaining the nonzero rows (Theorem BRS), and perhaps un-coordinatizing. \begin{equation*} \set{ \begin{bmatrix}1 & 0 \\ -\frac{1}{2} & 0\end{bmatrix},\, \begin{bmatrix}0 & 1 \\ \frac{1}{4} & 0\end{bmatrix},\, \begin{bmatrix}0 & 0 \\ 0 & 1\end{bmatrix} } \end{equation*}
⬜  Surjective?  Is the linear transformation surjective (Definition SLT)? No.

The dimension of the range is 3, and the codomain (\(M_{22}\)) has dimension 4. So \(\rng{T}\neq M_{22}\) and by Theorem RSLT the transformation is not surjective.

To be more precise, verify that \(\begin{bmatrix}2 & 4\\ 3 & 1\end{bmatrix}\not\in\rng{T}\text{,}\) by setting the output of \(T\) equal to this matrix and seeing that the resulting system of linear equations has no solution, i.e. is inconsistent. So the preimage, \(\preimage{T}{\begin{bmatrix}2 & 4\\ 3 & 1\end{bmatrix}}\text{,}\) is empty. This alone is sufficient to see that the linear transformation is not onto.

⬜  Subspace Dimensions  Subspace dimensions associated with the linear transformation (Definition ROLT, Definition NOLT). Verify Theorem RPNDD, and examine parallels with earlier results for matrices. \begin{align*} \text{rank}&=3&\text{nullity}&=1&\text{domain}&=4 \end{align*}
⬜  Invertible?  Is the linear transformation invertible (Definition IVLT, and examine parallels with the existence of matrix inverses)? No.

Neither injective nor surjective (Theorem ILTIS). Notice that since the domain and codomain have the same dimension, either the transformation is both injective and surjective or else it is both not injective and not surjective (making it not invertible, as in this case).

⬜  Matrix Representation  Matrix representation of the linear transformation, as given by Definition MR and explained by Theorem FTMR. \begin{equation*} \text{domain basis}=\set{\begin{bmatrix}1&0\\0&0\end{bmatrix},\, \begin{bmatrix}0&1\\0&0\end{bmatrix},\, \begin{bmatrix}0&0\\1&0\end{bmatrix},\, \begin{bmatrix}0&0\\0&1\end{bmatrix}} \end{equation*} \begin{equation*} \text{codomain basis}=\set{\begin{bmatrix}1&0\\0&0\end{bmatrix},\, \begin{bmatrix}0&1\\0&0\end{bmatrix},\, \begin{bmatrix}0&0\\1&0\end{bmatrix},\, \begin{bmatrix}0&0\\0&1\end{bmatrix}} \end{equation*} \begin{equation*} \text{matrix representation}=\begin{bmatrix} -2 & 15 & 3 & 27 \\ 0 & 10 & 6 & 18 \\ 1 & -5 & 0 & -9 \\ -1 & -4 & -5 & -8 \end{bmatrix} \end{equation*}
⬜  Eigenvalues, Eigenspaces  Eigenvalues, and bases for eigenspaces (Definition EELT, Theorem EER). Evaluate the linear transformation with each eigenvector as an interesting check. \begin{align*} \eigensystem{T}{0}{\begin{bmatrix}-6 & -3 \\ 2 & 1\end{bmatrix}}\\ \eigensystem{T}{1}{\begin{bmatrix}-7 & -2\\3 & 0\end{bmatrix},\,\begin{bmatrix}-1 & -2 \\ 0 & 1\end{bmatrix}}\\ \eigensystem{T}{-2}{\begin{bmatrix}-3 & -2 \\ 1 & 1\end{bmatrix}}\\ %& forces align environment \end{align*}
⬜  Diagonal Matrix Representation  A diagonal matrix representation relative to a basis of eigenvectors. \begin{equation*} \text{basis}=\set{\begin{bmatrix}-6 & -3 \\ 2 & 1\end{bmatrix},\, \begin{bmatrix}-7 & -2 \\ 3 & 0\end{bmatrix},\, \begin{bmatrix}-1 & -2 \\ 0 & 1\end{bmatrix},\, \begin{bmatrix}-3 & -2 \\ 1 & 1\end{bmatrix}} \end{equation*} \begin{equation*} \text{matrix representation}=\begin{bmatrix} 0 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & -2 \end{bmatrix} \end{equation*}
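Since matrices in \(M_{22}\) must be coordinatized before any matrix arithmetic, a machine check of this diagonalization (a SymPy sketch, tooling assumed) works with coordinate versions of the eigenvectors; the columns of \(S\) below list the entries of each basis matrix row by row, matching the bases used in the matrix representation above.

    from sympy import Matrix, diag

    A = Matrix([               # matrix representation of T, from above
        [-2, 15,  3, 27],
        [ 0, 10,  6, 18],
        [ 1, -5,  0, -9],
        [-1, -4, -5, -8],
    ])
    S = Matrix([               # columns: coordinate vectors of the eigenbasis
        [-6, -7, -1, -3],
        [-3, -2, -2, -2],
        [ 2,  3,  0,  1],
        [ 1,  0,  1,  1],
    ])
    print(S.inv() * A * S == diag(0, 1, 1, -2))  # True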