Archetype B   

From A First Course in Linear Algebra
Version 2.20
© 2004.
Licensed under the GNU Free Documentation License.
http://linear.ups.edu/

Summary: System with three equations, three unknowns. Nonsingular coefficient matrix. Distinct integer eigenvalues for the coefficient matrix.

A system of linear equations (Definition SLE):

\begin{align*}
-7x_1 - 6x_2 - 12x_3 &= -33 \\
5x_1 + 5x_2 + 7x_3 &= 24 \\
x_1 + 4x_3 &= 5
\end{align*}

Some solutions to the system of linear equations (not necessarily exhaustive):
x_1 = -3,\quad x_2 = 5,\quad x_3 = 2
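Substituting these values into each equation confirms the solution:

\begin{align*}
-7(-3) - 6(5) - 12(2) &= 21 - 30 - 24 = -33 \\
5(-3) + 5(5) + 7(2) &= -15 + 25 + 14 = 24 \\
(-3) + 4(2) &= -3 + 8 = 5
\end{align*}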

Augmented matrix of the linear system of equations (Definition AM):

\begin{bmatrix}
-7 & -6 & -12 & -33 \\
5 & 5 & 7 & 24 \\
1 & 0 & 4 & 5
\end{bmatrix}

Matrix in reduced row-echelon form, row-equivalent to augmented matrix:

\begin{bmatrix}
1 & 0 & 0 & -3 \\
0 & 1 & 0 & 5 \\
0 & 0 & 1 & 2
\end{bmatrix}
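A minimal SymPy sketch that reproduces this row reduction, assuming SymPy is available:

from sympy import Matrix

# Augmented matrix of the linear system (coefficients and constants)
aug = Matrix([[-7, -6, -12, -33],
              [ 5,  5,   7,  24],
              [ 1,  0,   4,   5]])

# rref() returns the reduced row-echelon form and the pivot column indices
R, pivots = aug.rref()
print(R)       # the matrix displayed above
print(pivots)  # (0, 1, 2): zero-based indices of the pivot columns, i.e. D = {1, 2, 3}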

Analysis of the augmented matrix (Notation RREFA):

r = 3 \qquad D = \left\{1,\,2,\,3\right\} \qquad F = \left\{4\right\}

Vector form of the solution set to the system of equations (Theorem VFSLS). Notice the relationship between the free variables and the set F above. Also, notice the pattern of 0’s and 1’s in the entries of the vectors corresponding to elements of the set F for the larger examples.
\begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} -3 \\ 5 \\ 2 \end{bmatrix}

Given a system of equations we can always build a new, related, homogeneous system (Definition HS) by converting the constant terms to zeros and retaining the coefficients of the variables. Properties of this new system will have precise relationships with various properties of the original system.

\begin{align*}
-7x_1 - 6x_2 - 12x_3 &= 0 \\
5x_1 + 5x_2 + 7x_3 &= 0 \\
x_1 + 4x_3 &= 0
\end{align*}

Some solutions to the associated homogeneous system of linear equations (not necessarily exhaustive):
x_1 = 0,\quad x_2 = 0,\quad x_3 = 0

Form the augmented matrix of the homogeneous linear system, and use row operations to convert to reduced row-echelon form. Notice how the entries of the final column remain zeros:

\begin{bmatrix}
1 & 0 & 0 & 0 \\
0 & 1 & 0 & 0 \\
0 & 0 & 1 & 0
\end{bmatrix}

Analysis of the augmented matrix for the homogeneous system (Notation RREFA). Notice that this analysis can differ from that of the original system only when the original system is inconsistent; here the original system is consistent, so the two analyses agree:

r = 3 \qquad D = \left\{1,\,2,\,3\right\} \qquad F = \left\{4\right\}

Coefficient matrix of the original system of equations, and of the associated homogeneous system. This matrix, rather than the systems of equations, will be the subject of further analysis.

\begin{bmatrix}
-7 & -6 & -12 \\
5 & 5 & 7 \\
1 & 0 & 4
\end{bmatrix}

Matrix brought to reduced row-echelon form:

\begin{bmatrix}
1 & 0 & 0 \\
0 & 1 & 0 \\
0 & 0 & 1
\end{bmatrix}

Analysis of the row-reduced matrix (Notation RREFA):

r = 3 \qquad D = \left\{1,\,2,\,3\right\} \qquad F = \left\{\ \right\}

Is the matrix (the coefficient matrix) nonsingular or singular? (Theorem NMRRI) At the same time, examine the size of the set F above. Notice that this property does not apply to matrices that are not square.
Nonsingular.
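A minimal SymPy check of this conclusion, assuming SymPy is available: the coefficient matrix row-reduces to the identity (Theorem NMRRI), so every column is a pivot column and F is empty.

from sympy import Matrix, eye

A = Matrix([[-7, -6, -12],
            [ 5,  5,   7],
            [ 1,  0,   4]])

R, pivots = A.rref()
print(R == eye(3))  # True: A row-reduces to the identity, so A is nonsingular
print(len(pivots))  # 3: every column is a pivot column, so F is empty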

The null space of the matrix. The set of vectors used in the span construction is a linearly independent set of column vectors that spans the null space of the matrix (Theorem SSNS, Theorem BNS). Solve the homogeneous system with this matrix as the coefficient matrix and write the solutions in vector form (Theorem VFSLS) to see these vectors arise. Since the matrix is nonsingular, the null space is trivial and the spanning set is empty.
\left \langle \left \{\ \right \}\right \rangle
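In a SymPy sketch, the trivial null space shows up as an empty basis:

from sympy import Matrix

A = Matrix([[-7, -6, -12],
            [ 5,  5,   7],
            [ 1,  0,   4]])

# nullspace() returns a basis for the null space; an empty list means
# the homogeneous system has only the trivial solution
print(A.nullspace())  # []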

Column space of the matrix, expressed as the span of a set of linearly independent vectors that are also columns of the matrix. These columns have indices that form the set D above. (Theorem BCS)
\left\langle \left\{ \begin{bmatrix} -7 \\ 5 \\ 1 \end{bmatrix},\ \begin{bmatrix} -6 \\ 5 \\ 0 \end{bmatrix},\ \begin{bmatrix} -12 \\ 7 \\ 4 \end{bmatrix} \right\} \right\rangle
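Assuming SymPy, columnspace() returns a basis drawn from the original columns, matching the set D above:

from sympy import Matrix

A = Matrix([[-7, -6, -12],
            [ 5,  5,   7],
            [ 1,  0,   4]])

# columnspace() returns a basis consisting of the pivot columns of A,
# i.e. the columns indexed by D
for v in A.columnspace():
    print(v.T)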

The column space of the matrix, as it arises from the extended echelon form of the matrix. The matrix L is computed as described in Definition EEF. The column space is then described by a set of linearly independent vectors that span the null space of L, computed according to Theorem FS and Theorem BNS. When r = m, as is the case here, the matrix L has no rows and the column space is all of ℂ^m.
L = \left[\ \right]
\left\langle \left\{ \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix},\ \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix},\ \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} \right\} \right\rangle

Column space of the matrix, expressed as the span of a set of linearly independent vectors. These vectors are computed by row-reducing the transpose of the matrix into reduced row-echelon form, tossing out the zero rows, and writing the remaining nonzero rows as column vectors. By Theorem CSRST and Theorem BRS, and in the style of Example CSROI, this yields a linearly independent set of vectors that span the column space.
\left\langle \left\{ \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix},\ \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix},\ \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} \right\} \right\rangle

Row space of the matrix, expressed as a span of a set of linearly independent vectors, obtained from the nonzero rows of the equivalent matrix in reduced row-echelon form. (Theorem BRS)
\left\langle \left\{ \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix},\ \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix},\ \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} \right\} \right\rangle
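Assuming SymPy, rowspace() returns the nonzero rows of the reduced row-echelon form, matching the basis above:

from sympy import Matrix

A = Matrix([[-7, -6, -12],
            [ 5,  5,   7],
            [ 1,  0,   4]])

# rowspace() returns the nonzero rows of the reduced row-echelon form,
# here the three standard unit vectors
for r in A.rowspace():
    print(r)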

Inverse matrix, if it exists. The inverse is not defined for matrices that are not square, and if the matrix is square, then the matrix must be nonsingular. (Definition MI, Theorem NI)
\begin{bmatrix}
-10 & -12 & -9 \\
\frac{13}{2} & 8 & \frac{11}{2} \\
\frac{5}{2} & 3 & \frac{5}{2}
\end{bmatrix}
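A SymPy sketch that computes the inverse exactly and confirms it against Definition MI:

from sympy import Matrix, eye

A = Matrix([[-7, -6, -12],
            [ 5,  5,   7],
            [ 1,  0,   4]])

Ainv = A.inv()             # exact rational entries, matching the matrix above
print(A * Ainv == eye(3))  # True, as required by Definition MI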

Subspace dimensions associated with the matrix. (Definition NOM, Definition ROM) Verify Theorem RPNC.

\text{Matrix columns: } 3 \qquad \text{Rank: } 3 \qquad \text{Nullity: } 0
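A quick SymPy check of Theorem RPNC for this matrix:

from sympy import Matrix

A = Matrix([[-7, -6, -12],
            [ 5,  5,   7],
            [ 1,  0,   4]])

rank = A.rank()
nullity = len(A.nullspace())
print(rank, nullity, rank + nullity == A.cols)  # 3 0 True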

Determinant of the matrix, which is only defined for square matrices. The matrix is nonsingular if and only if the determinant is nonzero (Theorem SMZD). (Product of all eigenvalues?)
\text{Determinant} = -2
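Here the determinant is indeed the product of the eigenvalues listed below:

(-1)(1)(2) = -2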

Eigenvalues, and bases for eigenspaces. (Definition EEM, Definition EM)

\begin{align*}
\lambda &= -1 & \mathcal{E}_B(-1) &= \left\langle \left\{ \begin{bmatrix} -5 \\ 3 \\ 1 \end{bmatrix} \right\} \right\rangle \\
\lambda &= 1 & \mathcal{E}_B(1) &= \left\langle \left\{ \begin{bmatrix} -3 \\ 2 \\ 1 \end{bmatrix} \right\} \right\rangle \\
\lambda &= 2 & \mathcal{E}_B(2) &= \left\langle \left\{ \begin{bmatrix} -2 \\ 1 \\ 1 \end{bmatrix} \right\} \right\rangle
\end{align*}
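A SymPy sketch that recovers these eigenvalues and verifies each eigenvector (Definition EEM):

from sympy import Matrix

B = Matrix([[-7, -6, -12],
            [ 5,  5,   7],
            [ 1,  0,   4]])

# eigenvects() returns (eigenvalue, algebraic multiplicity, basis) triples
for val, mult, basis in B.eigenvects():
    v = basis[0]
    # each of the eigenvalues -1, 1, 2 has multiplicity 1, and B*v = val*v holds
    print(val, mult, B * v == val * v)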

Geometric and algebraic multiplicities. (Definition GME, Definition AME)

\begin{align*}
\gamma_B(-1) &= 1 & \alpha_B(-1) &= 1 \\
\gamma_B(1) &= 1 & \alpha_B(1) &= 1 \\
\gamma_B(2) &= 1 & \alpha_B(2) &= 1
\end{align*}

Diagonalizable? (Definition DZM)
Yes, distinct eigenvalues, Theorem DED.

The diagonalization. (Theorem DC)

\begin{align*}
&\begin{bmatrix} -1 & -1 & -1 \\ 2 & 3 & 1 \\ -1 & -2 & 1 \end{bmatrix}
\begin{bmatrix} -7 & -6 & -12 \\ 5 & 5 & 7 \\ 1 & 0 & 4 \end{bmatrix}
\begin{bmatrix} -5 & -3 & -2 \\ 3 & 2 & 1 \\ 1 & 1 & 1 \end{bmatrix} \\
&\quad = \begin{bmatrix} -1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 2 \end{bmatrix}
\end{align*}
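A SymPy sketch that confirms this diagonalization (Theorem DC), using the eigenvectors above as the columns of S:

from sympy import Matrix, diag

B = Matrix([[-7, -6, -12],
            [ 5,  5,   7],
            [ 1,  0,   4]])

# columns of S are the eigenvectors listed above, ordered by eigenvalue -1, 1, 2
S = Matrix([[-5, -3, -2],
            [ 3,  2,  1],
            [ 1,  1,  1]])

print(S.inv() * B * S == diag(-1, 1, 2))  # True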