Section LI  Linear Independence

From A First Course in Linear Algebra
Version 2.99
© 2004.
Licensed under the GNU Free Documentation License.
http://linear.ups.edu/

Subsection LISV: Linearly Independent Sets of Vectors

Theorem SLSLC tells us that a solution to a homogeneous system of equations is a linear combination of the columns of the coefficient matrix that equals the zero vector. We used just this situation to our advantage (twice!) in Example SCAD where we reduced the set of vectors used in a span construction from four down to two, by declaring certain vectors as surplus. The next two definitions will allow us to formalize this situation.

Definition RLDCV
Relation of Linear Dependence for Column Vectors
Given a set of vectors S = \left \{{u}_{1},\kern 1.95872pt {u}_{2},\kern 1.95872pt {u}_{3},\kern 1.95872pt \mathop{\mathop{…}},\kern 1.95872pt {u}_{n}\right \}, a true statement of the form

{α}_{1}{u}_{1} + {α}_{2}{u}_{2} + {α}_{3}{u}_{3} + \mathrel{⋯} + {α}_{n}{u}_{n} = 0

is a relation of linear dependence on S. If this statement is formed in a trivial fashion, i.e. {α}_{i} = 0, 1 ≤ i ≤ n, then we say it is the trivial relation of linear dependence on S.

Definition LICV
Linear Independence of Column Vectors
The set of vectors S = \left \{{u}_{1},\kern 1.95872pt {u}_{2},\kern 1.95872pt {u}_{3},\kern 1.95872pt \mathop{\mathop{…}},\kern 1.95872pt {u}_{n}\right \} is linearly dependent if there is a relation of linear dependence on S that is not trivial. In the case where the only relation of linear dependence on S is the trivial one, then S is a linearly independent set of vectors.

Notice that a relation of linear dependence is an equation. Though most of it is a linear combination, it is not a linear combination (that would be a vector). Linear independence is a property of a set of vectors. It is easy to take a set of vectors, and an equal number of scalars, all zero, and form a linear combination that equals the zero vector. When the easy way is the only way, then we say the set is linearly independent. Here are a couple of examples.
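The "easy way" is easy to see computationally. Here is a minimal Python sketch (the three vectors are arbitrary choices, not from the text) showing that the all-zero scalars always produce the zero vector:

```python
# Three arbitrary vectors from C^3 (real entries for simplicity).
vectors = [[1, 2, -1], [3, -4, 2], [4, -2, 1]]
alphas = [0, 0, 0]  # the trivial choice of scalars

# Form the linear combination entry by entry.
combo = [sum(a * v[i] for a, v in zip(alphas, vectors)) for i in range(3)]
print(combo)  # [0, 0, 0] -- the trivial relation always holds
```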

Example LDS
Linearly dependent set in {ℂ}^{5}
Consider the set of n = 4 vectors from {ℂ}^{5},

S = \left \{\left [\array{ 2\cr −1 \cr 3\cr 1 \cr 2 } \right ],\kern 1.95872pt \left [\array{ 1\cr 2 \cr −1\cr 5 \cr 2 } \right ],\kern 1.95872pt \left [\array{ 2\cr 1 \cr −3\cr 6 \cr 1 } \right ],\kern 1.95872pt \left [\array{ −6\cr 7 \cr −1\cr 0 \cr 1 } \right ]\right \}

To determine linear independence we first form a relation of linear dependence,

{ α}_{1}\left [\array{ 2\cr −1 \cr 3\cr 1 \cr 2 } \right ]+{α}_{2}\left [\array{ 1\cr 2 \cr −1\cr 5 \cr 2 } \right ]+{α}_{3}\left [\array{ 2\cr 1 \cr −3\cr 6 \cr 1 } \right ]+{α}_{4}\left [\array{ −6\cr 7 \cr −1\cr 0 \cr 1 } \right ] = 0

We know that {α}_{1} = {α}_{2} = {α}_{3} = {α}_{4} = 0 is a solution to this equation, but that is of no interest whatsoever. That is always the case, no matter what four vectors we might have chosen. We are curious to know if there are other, nontrivial, solutions. Theorem SLSLC tells us that we can find such solutions as solutions to the homogeneous system ℒS\kern -1.95872pt \left (A,\kern 1.95872pt 0\right ) where the coefficient matrix has these four vectors as columns,

A = \left [\array{ 2 & 1 & 2 &−6\cr −1 & 2 & 1 & 7 \cr 3 &−1&−3&−1\cr 1 & 5 & 6 & 0 \cr 2 & 2 & 1 & 1 } \right ]

Row-reducing this coefficient matrix yields,

\left [\array{ \text{1}&0&0&−2\cr 0&\text{1 } &0 & 4 \cr 0&0&\text{1}&−3\cr 0&0 &0 & 0 \cr 0&0&0& 0 } \right ]

We could solve this homogeneous system completely, but for this example all we need is one nontrivial solution. Setting the lone free variable to any nonzero value, such as {x}_{4} = 1, yields the nontrivial solution

x = \left [\array{ 2\cr −4 \cr 3\cr 1 } \right ]

Completing our application of Theorem SLSLC, we then have

2\left [\array{ 2\cr −1 \cr 3\cr 1 \cr 2 } \right ]+(−4)\left [\array{ 1\cr 2 \cr −1\cr 5 \cr 2 } \right ]+3\left [\array{ 2\cr 1 \cr −3\cr 6 \cr 1 } \right ]+1\left [\array{ −6\cr 7 \cr −1\cr 0 \cr 1 } \right ] = 0

This is a relation of linear dependence on S that is not trivial, so we conclude that S is linearly dependent.
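We can check this relation of linear dependence numerically. A quick Python sketch, with the vectors and scalars copied from Example LDS:

```python
# The four vectors of the set S from Example LDS.
u1 = [2, -1, 3, 1, 2]
u2 = [1, 2, -1, 5, 2]
u3 = [2, 1, -3, 6, 1]
u4 = [-6, 7, -1, 0, 1]
# The scalars of the nontrivial solution found above.
alphas = [2, -4, 3, 1]

# Form the linear combination entry by entry.
combo = [sum(a * v[i] for a, v in zip(alphas, (u1, u2, u3, u4)))
         for i in range(5)]
print(combo)  # [0, 0, 0, 0, 0]: a nontrivial relation that equals zero
```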

Example LIS
Linearly independent set in {ℂ}^{5}
Consider the set of n = 4 vectors from {ℂ}^{5},

T = \left \{\left [\array{ 2\cr −1 \cr 3\cr 1 \cr 2 } \right ],\kern 1.95872pt \left [\array{ 1\cr 2 \cr −1\cr 5 \cr 2 } \right ],\kern 1.95872pt \left [\array{ 2\cr 1 \cr −3\cr 6 \cr 1 } \right ],\kern 1.95872pt \left [\array{ −6\cr 7 \cr −1\cr 1 \cr 1 } \right ]\right \}

To determine linear independence we first form a relation of linear dependence,

{ α}_{1}\left [\array{ 2\cr −1 \cr 3\cr 1 \cr 2 } \right ]+{α}_{2}\left [\array{ 1\cr 2 \cr −1\cr 5 \cr 2 } \right ]+{α}_{3}\left [\array{ 2\cr 1 \cr −3\cr 6 \cr 1 } \right ]+{α}_{4}\left [\array{ −6\cr 7 \cr −1\cr 1 \cr 1 } \right ] = 0

We know that {α}_{1} = {α}_{2} = {α}_{3} = {α}_{4} = 0 is a solution to this equation, but that is of no interest whatsoever. That is always the case, no matter what four vectors we might have chosen. We are curious to know if there are other, nontrivial, solutions. Theorem SLSLC tells us that we can find such solutions as solutions to the homogeneous system ℒS\kern -1.95872pt \left (B,\kern 1.95872pt 0\right ) where the coefficient matrix has these four vectors as columns. Row-reducing this coefficient matrix yields,

\eqalignno{ B = \left [\array{ 2 & 1 & 2 &−6\cr −1 & 2 & 1 & 7 \cr 3 &−1&−3&−1\cr 1 & 5 & 6 & 1 \cr 2 & 2 & 1 & 1 } \right ] &\mathop{\longrightarrow}\limits_{}^{\text{RREF}}\left [\array{ \text{1}&0&0&0\cr 0&\text{1 } &0 &0 \cr 0&0&\text{1}&0\cr 0&0 &0 &\text{1} \cr 0&0&0&0 } \right ] & & }

From the form of this matrix, we see that there are no free variables, so the solution is unique, and because the system is homogeneous, this unique solution is the trivial solution. So we now know that there is but one way to combine the four vectors of T into a relation of linear dependence, and that one way is the easy and obvious way. In this situation we say that the set, T, is linearly independent.

Example LDS and Example LIS relied on solving a homogeneous system of equations to determine linear independence. We can codify this process in a time-saving theorem.

Theorem LIVHS
Linearly Independent Vectors and Homogeneous Systems
Suppose that A is an m × n matrix and S = \left \{{A}_{1},\kern 1.95872pt {A}_{2},\kern 1.95872pt {A}_{3},\kern 1.95872pt \mathop{\mathop{…}},\kern 1.95872pt {A}_{n}\right \} is the set of vectors in {ℂ}^{m} that are the columns of A. Then S is a linearly independent set if and only if the homogeneous system ℒS\kern -1.95872pt \left (A,\kern 1.95872pt 0\right ) has a unique solution.

Proof   (⇐) Suppose that ℒS\kern -1.95872pt \left (A,\kern 1.95872pt 0\right ) has a unique solution. Since it is a homogeneous system, this solution must be the trivial solution x = 0. By Theorem SLSLC, this means that the only relation of linear dependence on S is the trivial one. So S is linearly independent.

(⇒) We will prove the contrapositive. Suppose that ℒS\kern -1.95872pt \left (A,\kern 1.95872pt 0\right ) does not have a unique solution. Since it is a homogeneous system, it is consistent (Theorem HSC), and so must have infinitely many solutions (Theorem PSSLS). One of these infinitely many solutions must be nontrivial (in fact, almost all of them are), so choose one. By Theorem SLSLC this nontrivial solution will give a nontrivial relation of linear dependence on S, so we can conclude that S is a linearly dependent set.

Since Theorem LIVHS is an equivalence, we can use it to determine the linear independence or dependence of any set of column vectors, just by creating a corresponding matrix and analyzing the row-reduced form. Let’s illustrate this with two more examples.

Example LIHS
Linearly independent, homogeneous system
Is the set of vectors

S = \left \{\left [\array{ 2\cr −1 \cr 3\cr 4 \cr 2 } \right ],\kern 1.95872pt \left [\array{ 6\cr 2 \cr −1\cr 3 \cr 4 } \right ],\kern 1.95872pt \left [\array{ 4\cr 3 \cr −4\cr 5 \cr 1 } \right ]\right \}

linearly independent or linearly dependent?

Theorem LIVHS suggests we study the matrix whose columns are the vectors in S,

A = \left [\array{ 2 & 6 & 4\cr −1 & 2 & 3 \cr 3 &−1&−4\cr 4 & 3 & 5 \cr 2 & 4 & 1 } \right ]

Specifically, we are interested in the size of the solution set for the homogeneous system ℒS\kern -1.95872pt \left (A,\kern 1.95872pt 0\right ). Row-reducing A, we obtain

\left [\array{ \text{1}&0&0\cr 0&\text{1 } &0 \cr 0&0&\text{1}\cr 0&0 &0 \cr 0&0&0} \right ]

Now, r = 3, so there are n − r = 3 − 3 = 0 free variables and we see that ℒS\kern -1.95872pt \left (A,\kern 1.95872pt 0\right ) has a unique solution (Theorem HSC, Theorem FVCS). By Theorem LIVHS, the set S is linearly independent.

Example LDHS
Linearly dependent, homogeneous system
Is the set of vectors

S = \left \{\left [\array{ 2\cr −1 \cr 3\cr 4 \cr 2 } \right ],\kern 1.95872pt \left [\array{ 6\cr 2 \cr −1\cr 3 \cr 4 } \right ],\kern 1.95872pt \left [\array{ 4\cr 3 \cr −4\cr −1 \cr 2 } \right ]\right \}

linearly independent or linearly dependent?

Theorem LIVHS suggests we study the matrix whose columns are the vectors in S,

A = \left [\array{ 2 & 6 & 4\cr −1 & 2 & 3 \cr 3 &−1&−4\cr 4 & 3 &−1 \cr 2 & 4 & 2 } \right ]

Specifically, we are interested in the size of the solution set for the homogeneous system ℒS\kern -1.95872pt \left (A,\kern 1.95872pt 0\right ). Row-reducing A, we obtain

\left [\array{ \text{1}&0&−1\cr 0&\text{1 } & 1 \cr 0&0& 0\cr 0&0 & 0 \cr 0&0& 0 } \right ]

Now, r = 2, so there are n − r = 3 − 2 = 1 free variables and we see that ℒS\kern -1.95872pt \left (A,\kern 1.95872pt 0\right ) has infinitely many solutions (Theorem HSC, Theorem FVCS). By Theorem LIVHS, the set S is linearly dependent.

As an equivalence, Theorem LIVHS gives us a straightforward way to determine if a set of vectors is linearly independent or dependent.

Review Example LIHS and Example LDHS. They are very similar, differing only in the last two slots of the third vector. This resulted in slightly different matrices when row-reduced, and slightly different values of r, the number of nonzero rows. Notice, too, that we are less interested in the actual solution set, and more interested in its form or size. These observations allow us to make a slight improvement in Theorem LIVHS.

Theorem LIVRN
Linearly Independent Vectors, r and n
Suppose that A is an m × n matrix and S = \left \{{A}_{1},\kern 1.95872pt {A}_{2},\kern 1.95872pt {A}_{3},\kern 1.95872pt \mathop{\mathop{…}},\kern 1.95872pt {A}_{n}\right \} is the set of vectors in {ℂ}^{m} that are the columns of A. Let B be a matrix in reduced row-echelon form that is row-equivalent to A and let r denote the number of non-zero rows in B. Then S is linearly independent if and only if n = r.

Proof   Theorem LIVHS says the linear independence of S is equivalent to the homogeneous linear system ℒS\kern -1.95872pt \left (A,\kern 1.95872pt 0\right ) having a unique solution. Since ℒS\kern -1.95872pt \left (A,\kern 1.95872pt 0\right ) is consistent (Theorem HSC) we can apply Theorem CSRN to see that the solution is unique exactly when n = r.
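Theorem LIVRN translates directly into a computation: place the vectors as the columns of a matrix, row-reduce, count the nonzero rows r, and compare with n. Below is a Python sketch using exact rational arithmetic; the function names are our own, not from the text. It is checked against the sets of Example LIHS and Example LDHS.

```python
from fractions import Fraction

def rref_rank(rows):
    """Row-reduce a matrix (a list of rows) and return r, the number
    of nonzero rows in its reduced row-echelon form."""
    m = [[Fraction(x) for x in row] for row in rows]
    nrows, ncols = len(m), len(m[0])
    r = 0
    for c in range(ncols):
        pivot = next((i for i in range(r, nrows) if m[i][c] != 0), None)
        if pivot is None:
            continue  # no pivot here: a free-variable column
        m[r], m[pivot] = m[pivot], m[r]
        p = m[r][c]
        m[r] = [x / p for x in m[r]]          # scale to get a leading 1
        for i in range(nrows):
            if i != r and m[i][c] != 0:
                m[i] = [a - m[i][c] * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def linearly_independent(columns):
    """Theorem LIVRN: the n column vectors are independent iff n = r."""
    rows = [list(t) for t in zip(*columns)]   # vectors become matrix columns
    return rref_rank(rows) == len(columns)

# The sets S from Example LIHS and Example LDHS.
S_lihs = [[2, -1, 3, 4, 2], [6, 2, -1, 3, 4], [4, 3, -4, 5, 1]]
S_ldhs = [[2, -1, 3, 4, 2], [6, 2, -1, 3, 4], [4, 3, -4, -1, 2]]
print(linearly_independent(S_lihs))  # True
print(linearly_independent(S_ldhs))  # False
```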

So now here’s an example of the most straightforward way to determine if a set of column vectors is linearly independent or linearly dependent. While this method can be quick and easy, don’t forget the logical progression from the definition of linear independence through homogeneous systems of equations which makes it possible.

Example LDRN
Linearly dependent, r < n
Is the set of vectors

S = \left \{\left [\array{ 2\cr −1 \cr 3\cr 1 \cr 0\cr 3 } \right ],\kern 1.95872pt \left [\array{ 9\cr −6 \cr −2\cr 3 \cr 2\cr 1 } \right ],\kern 1.95872pt \left [\array{ 1\cr 1 \cr 1\cr 0 \cr 0\cr 1 } \right ],\kern 1.95872pt \left [\array{ −3\cr 1 \cr 4\cr 2 \cr 1\cr 2 } \right ],\kern 1.95872pt \left [\array{ 6\cr −2 \cr 1\cr 4 \cr 3\cr 2 } \right ]\right \}

linearly independent or linearly dependent? Theorem LIVRN suggests we place these vectors into a matrix as columns and analyze the row-reduced version of the matrix,

\left [\array{ 2 & 9 &1&−3& 6\cr −1 &−6 &1 & 1 &−2 \cr 3 &−2&1& 4 & 1\cr 1 & 3 &0 & 2 & 4 \cr 0 & 2 &0& 1 & 3\cr 3 & 1 &1 & 2 & 2 } \right ]\mathop{\longrightarrow}\limits_{}^{\text{RREF}}\left [\array{ \text{1}&0&0&0&−1\cr 0&\text{1 } &0 &0 & 1 \cr 0&0&\text{1}&0& 2\cr 0&0 &0 &\text{1 } & 1 \cr 0&0&0&0& 0\cr 0&0 &0 &0 & 0 } \right ]

Now we need only compute that r = 4 < 5 = n to recognize, via Theorem LIVRN, that S is a linearly dependent set. Boom!

Example LLDS
Large linearly dependent set in {ℂ}^{4}
Consider the set of n = 9 vectors from {ℂ}^{4},

R = \left \{\left [\array{ −1\cr 3 \cr 1\cr 2 } \right ],\kern 1.95872pt \left [\array{ 7\cr 1 \cr −3\cr 6 } \right ],\kern 1.95872pt \left [\array{ 1\cr 2 \cr −1\cr −2 } \right ],\kern 1.95872pt \left [\array{ 0\cr 4 \cr 2\cr 9 } \right ],\kern 1.95872pt \left [\array{ 5\cr −2 \cr 4\cr 3 } \right ],\kern 1.95872pt \left [\array{ 2\cr 1 \cr −6\cr 4 } \right ],\kern 1.95872pt \left [\array{ 3\cr 0 \cr −3\cr 1 } \right ],\kern 1.95872pt \left [\array{ 1\cr 1 \cr 5\cr 3 } \right ],\kern 1.95872pt \left [\array{ −6\cr −1 \cr 1\cr 1 } \right ]\right \}.

To employ Theorem LIVHS, we form a 4 × 9 coefficient matrix, C,

C = \left [\array{ −1& 7 & 1 &0& 5 & 2 & 3 &1&−6\cr 3 & 1 & 2 &4 &−2 & 1 & 0 &1 &−1 \cr 1 &−3&−1&2& 4 &−6&−3&5& 1\cr 2 & 6 &−2 &9 & 3 & 4 & 1 &3 & 1 } \right ].

To determine if the homogeneous system ℒS\kern -1.95872pt \left (C,\kern 1.95872pt 0\right ) has a unique solution or not, we would normally row-reduce this matrix. But in this particular example, we can do better. Theorem HMVEI tells us that since the system is homogeneous with n = 9 variables in m = 4 equations, and n > m, there must be infinitely many solutions. Since there is not a unique solution, Theorem LIVHS says the set is linearly dependent.

The situation in Example LLDS is slick enough to warrant formulating as a theorem.

Theorem MVSLD
More Vectors than Size implies Linear Dependence
Suppose that S = \left \{{u}_{1},\kern 1.95872pt {u}_{2},\kern 1.95872pt {u}_{3},\kern 1.95872pt \mathop{\mathop{…}},\kern 1.95872pt {u}_{n}\right \} is a set of vectors in {ℂ}^{m}, and that n > m. Then S is a linearly dependent set.

Proof   Form the m × n coefficient matrix A that has the column vectors {u}_{i}, 1 ≤ i ≤ n as its columns. Consider the homogeneous system ℒS\kern -1.95872pt \left (A,\kern 1.95872pt 0\right ). By Theorem HMVEI this system has infinitely many solutions. Since the system does not have a unique solution, Theorem LIVHS says the columns of A form a linearly dependent set, which is the desired conclusion.
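The size test of Theorem MVSLD requires no row-reduction at all: a single comparison of n against m settles the question. A sketch (the function name is our own), applied to the set R of Example LLDS:

```python
def dependent_by_size(vectors):
    """Theorem MVSLD: n vectors from C^m with n > m must be linearly
    dependent. Returns True when the size test alone decides."""
    n = len(vectors)          # number of vectors
    m = len(vectors[0])       # size of each vector
    return n > m

# The nine vectors of R from Example LLDS live in C^4, and 9 > 4.
R = [[-1, 3, 1, 2], [7, 1, -3, 6], [1, 2, -1, -2], [0, 4, 2, 9],
     [5, -2, 4, 3], [2, 1, -6, 4], [3, 0, -3, 1], [1, 1, 5, 3],
     [-6, -1, 1, 1]]
print(dependent_by_size(R))  # True: dependent, no row-reduction needed
```

Note the test only decides in one direction: when n ≤ m the theorem is silent and we must fall back on row-reducing.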

Subsection LINM: Linear Independence and Nonsingular Matrices

We will now specialize to sets of n vectors from {ℂ}^{n}. This will put Theorem MVSLD off-limits, while Theorem LIVHS will involve square matrices. Let’s begin by contrasting Archetype A and Archetype B.

Example LDCAA
Linearly dependent columns in Archetype A
Archetype A is a system of linear equations with coefficient matrix,

A = \left [\array{ 1&−1&2\cr 2& 1 &1 \cr 1& 1 &0 } \right ]

Do the columns of this matrix form a linearly independent or dependent set? By Example S we know that A is singular. According to the definition of nonsingular matrices, Definition NM, the homogeneous system ℒS\kern -1.95872pt \left (A,\kern 1.95872pt 0\right ) has infinitely many solutions. So by Theorem LIVHS, the columns of A form a linearly dependent set.

Example LICAB
Linearly independent columns in Archetype B
Archetype B is a system of linear equations with coefficient matrix,

B = \left [\array{ −7&−6&−12\cr 5 & 5 & 7 \cr 1 & 0 & 4 } \right ]

Do the columns of this matrix form a linearly independent or dependent set? By Example NM we know that B is nonsingular. According to the definition of nonsingular matrices, Definition NM, the homogeneous system ℒS\kern -1.95872pt \left (B,\kern 1.95872pt 0\right ) has a unique solution. So by Theorem LIVHS, the columns of B form a linearly independent set.

That Archetype A and Archetype B have opposite properties for the columns of their coefficient matrices is no accident. Here’s the theorem, and then we will update our equivalences for nonsingular matrices, Theorem NME1.

Theorem NMLIC
Nonsingular Matrices have Linearly Independent Columns
Suppose that A is a square matrix. Then A is nonsingular if and only if the columns of A form a linearly independent set.

Proof   This is a proof where we can chain together equivalences, rather than proving the two halves separately.

\eqalignno{ \text{$A$ nonsingular}&\kern 3.26288pt \mathrel{⇔}\kern 3.26288pt \text{$ℒS\kern -1.95872pt \left (A,\kern 1.95872pt 0\right )$ has a unique solution} &&\text{Definition NM} &&&& \cr &\kern 3.26288pt \mathrel{⇔}\kern 3.26288pt \text{columns of $A$ are linearly independent}&&\text{Theorem LIVHS}&&&& }

Here’s an update to Theorem NME1.

Theorem NME2
Nonsingular Matrix Equivalences, Round 2
Suppose that A is a square matrix. The following are equivalent.

  1. A is nonsingular.
  2. A row-reduces to the identity matrix.
  3. The null space of A contains only the zero vector, N\kern -1.95872pt \left (A\right ) = \left \{0\right \}.
  4. The linear system ℒS\kern -1.95872pt \left (A,\kern 1.95872pt b\right ) has a unique solution for every possible choice of b.
  5. The columns of A form a linearly independent set.

Proof   Theorem NMLIC is yet another equivalence for a nonsingular matrix, so we can add it to the list in Theorem NME1.

Subsection NSSLI: Null Spaces, Spans, Linear Independence

In Subsection SS.SSNS we proved Theorem SSNS which provided n − r vectors that could be used with the span construction to build the entire null space of a matrix. As we have hinted in Example SCAD, and as we will see again going forward, linearly dependent sets carry redundant vectors with them when used in building a set as a span. Our aim now is to show that the vectors provided by Theorem SSNS form a linearly independent set, so in one sense they are as efficient as possible a way to describe the null space. Notice that the vectors {z}_{j}, 1 ≤ j ≤ n − r first appear in the vector form of solutions to arbitrary linear systems (Theorem VFSLS). The exact same vectors appear again in the span construction in the conclusion of Theorem SSNS. Since this second theorem specializes to homogeneous systems the only real difference is that the vector c in Theorem VFSLS is the zero vector for a homogeneous system. Finally, Theorem BNS will now show that these same vectors are a linearly independent set. We’ll set the stage for the proof of this theorem with a moderately large example. Study the example carefully, as it will make it easier to understand the proof.

Example LINSB
Linear independence of null space basis
Suppose that we are interested in the null space of a 3 × 7 matrix, A, which row-reduces to

B = \left [\array{ \text{1}&0&−2&4&0&3& 9\cr 0&\text{1 } & 5 &6 &0 &7 & 1 \cr 0&0& 0 &0&\text{1}&8&−5 } \right ]

The set F = \left \{3,\kern 1.95872pt 4,\kern 1.95872pt 6,\kern 1.95872pt 7\right \} is the set of indices for our four free variables that would be used in a description of the solution set for the homogeneous system ℒS\kern -1.95872pt \left (A,\kern 1.95872pt 0\right ). Applying Theorem SSNS we can begin to construct a set of four vectors whose span is the null space of A, a set of vectors we will reference as T.

N\kern -1.95872pt \left (A\right ) = \left \langle T\right \rangle = \left \langle \left \{{z}_{1},\kern 1.95872pt {z}_{2},\kern 1.95872pt {z}_{3},\kern 1.95872pt {z}_{4}\right \}\right \rangle = \left \langle \left \{\left [\array{ \cr \cr 1 \cr 0\cr \cr 0 \cr 0 } \right ],\kern 1.95872pt \left [\array{ \cr \cr 0 \cr 1\cr \cr 0 \cr 0 } \right ],\kern 1.95872pt \left [\array{ \cr \cr 0 \cr 0\cr \cr 1 \cr 0 } \right ],\kern 1.95872pt \left [\array{ \cr \cr 0 \cr 0\cr \cr 0 \cr 1 } \right ]\right \}\right \rangle

So far, we have constructed as much of these individual vectors as we can, based just on the knowledge of the contents of the set F. This has allowed us to determine the entries in slots 3, 4, 6 and 7, while we have left slots 1, 2 and 5 blank. Without doing any more, let’s ask: is T linearly independent? Begin with a relation of linear dependence on T, and see what we can learn about the scalars,

\eqalignno{ 0 & = {α}_{1}{z}_{1} + {α}_{2}{z}_{2} + {α}_{3}{z}_{3} + {α}_{4}{z}_{4} & & \cr \left [\array{ 0\cr 0 \cr 0\cr 0 \cr 0\cr 0 \cr 0 } \right ] & = {α}_{1}\left [\array{ \cr \cr 1 \cr 0\cr \cr 0 \cr 0 } \right ] + {α}_{2}\left [\array{ \cr \cr 0 \cr 1\cr \cr 0 \cr 0 } \right ] + {α}_{3}\left [\array{ \cr \cr 0 \cr 0\cr \cr 1 \cr 0 } \right ] + {α}_{4}\left [\array{ \cr \cr 0 \cr 0\cr \cr 0 \cr 1 } \right ] & & \cr & = \left [\array{ \cr \cr {α}_{ 1} \cr 0\cr \cr 0 \cr 0 } \right ] + \left [\array{ \cr \cr 0 \cr {α}_{2}\cr \cr 0\cr 0 } \right ] + \left [\array{ \cr \cr 0 \cr 0\cr \cr {α}_{ 3} \cr 0 } \right ] + \left [\array{ \cr \cr 0 \cr 0\cr \cr 0 \cr {α}_{4} } \right ] = \left [\array{ \cr \cr {α}_{ 1}\cr {α}_{ 2}\cr \cr {α}_{ 3}\cr {α}_{ 4} } \right ] & & }

Applying Definition CVE to the two ends of this chain of equalities, we see that {α}_{1} = {α}_{2} = {α}_{3} = {α}_{4} = 0. So the only relation of linear dependence on the set T is a trivial one. By Definition LICV the set T is linearly independent. The important feature of this example is how the “pattern of zeros and ones” in the four vectors led to the conclusion of linear independence.

The proof of Theorem BNS is really quite straightforward, and relies on the “pattern of zeros and ones” that arise in the vectors {z}_{i}, 1 ≤ i ≤ n − r in the entries that correspond to the free variables. Play along with Example LINSB as you study the proof. Also, take a look at Example VFSAD, Example VFSAI and Example VFSAL, especially at the conclusion of Step 2 (temporarily ignore the construction of the constant vector, c). This proof is also a good first example of how to prove a conclusion that states a set is linearly independent.

Theorem BNS
Basis for Null Spaces
Suppose that A is an m × n matrix, and B is a row-equivalent matrix in reduced row-echelon form with r nonzero rows. Let D = \{{d}_{1},\kern 1.95872pt {d}_{2},\kern 1.95872pt {d}_{3},\kern 1.95872pt \mathop{\mathop{…}},\kern 1.95872pt {d}_{r}\} and F = \{{f}_{1},\kern 1.95872pt {f}_{2},\kern 1.95872pt {f}_{3},\kern 1.95872pt \mathop{\mathop{…}},\kern 1.95872pt {f}_{n−r}\} be the sets of column indices where B does and does not (respectively) have leading 1’s. Construct the n − r vectors {z}_{j}, 1 ≤ j ≤ n − r of size n as

{ \left [{z}_{j}\right ]}_{i} = \left \{\array{ 1 \quad &\text{if $i ∈ F$, $i = {f}_{j}$} \cr 0 \quad &\text{if $i ∈ F$, $i\mathrel{≠}{f}_{j}$} \cr −{\left [B\right ]}_{k,{f}_{j}}\quad &\text{if $i ∈ D$, $i = {d}_{k}$} } \right .

Define the set S = \left \{{z}_{1},\kern 1.95872pt {z}_{2},\kern 1.95872pt {z}_{3},\kern 1.95872pt \mathop{\mathop{…}},\kern 1.95872pt {z}_{n−r}\right \}. Then

  1. N\kern -1.95872pt \left (A\right ) = \left \langle S\right \rangle .
  2. S is a linearly independent set.

Proof   Notice first that the vectors {z}_{j}, 1 ≤ j ≤ n − r are exactly the same as the n − r vectors defined in Theorem SSNS. Also, the hypotheses of Theorem SSNS are the same as the hypotheses of the theorem we are currently proving. So it is then simply the conclusion of Theorem SSNS that tells us that N\kern -1.95872pt \left (A\right ) = \left \langle S\right \rangle . That was the easy half, but the second part is not much harder. What is new here is the claim that S is a linearly independent set.

To prove the linear independence of a set, we need to start with a relation of linear dependence and somehow conclude that the scalars involved must all be zero, i.e. that the relation of linear dependence only happens in the trivial fashion. So to establish the linear independence of S, we start with

{α}_{1}{z}_{1} + {α}_{2}{z}_{2} + {α}_{3}{z}_{3} + \mathrel{⋯} + {α}_{n−r}{z}_{n−r} = 0.

For each j, 1 ≤ j ≤ n − r, consider the equality of the individual entries of the vectors on both sides of this equality in position {f}_{j},

\eqalignno{ 0 & ={ \left [0\right ]}_{{f}_{j}} & & & & \cr & ={ \left [{α}_{1}{z}_{1} + {α}_{2}{z}_{2} + {α}_{3}{z}_{3} + \mathrel{⋯} + {α}_{n−r}{z}_{n−r}\right ]}_{{f}_{j}} & &\text{Definition CVE} & & & & \cr & ={ \left [{α}_{1}{z}_{1}\right ]}_{{f}_{j}} +{ \left [{α}_{2}{z}_{2}\right ]}_{{f}_{j}} +{ \left [{α}_{3}{z}_{3}\right ]}_{{f}_{j}} + \mathrel{⋯} +{ \left [{α}_{n−r}{z}_{n−r}\right ]}_{{f}_{j}} & &\text{Definition CVA} & & & & \cr & = {α}_{1}{\left [{z}_{1}\right ]}_{{f}_{j}} + {α}_{2}{\left [{z}_{2}\right ]}_{{f}_{j}} + {α}_{3}{\left [{z}_{3}\right ]}_{{f}_{j}} + \mathrel{⋯}+ & & & & \cr &\quad \quad {α}_{j−1}{\left [{z}_{j−1}\right ]}_{{f}_{j}} + {α}_{j}{\left [{z}_{j}\right ]}_{{f}_{j}} + {α}_{j+1}{\left [{z}_{j+1}\right ]}_{{f}_{j}} + \mathrel{⋯}+ & & & & \cr &\quad \quad {α}_{n−r}{\left [{z}_{n−r}\right ]}_{{f}_{j}} & &\text{Definition CVSM} & & & & \cr & = {α}_{1}(0) + {α}_{2}(0) + {α}_{3}(0) + \mathrel{⋯}+ & & & & \cr &\quad \quad {α}_{j−1}(0) + {α}_{j}(1) + {α}_{j+1}(0) + \mathrel{⋯} + {α}_{n−r}(0) & &\text{Definition of ${z}_{j}$} & & & & \cr & = {α}_{j} & & & & }

So for all j, 1 ≤ j ≤ n − r, we have {α}_{j} = 0, which is the conclusion that tells us that the only relation of linear dependence on S = \left \{{z}_{1},\kern 1.95872pt {z}_{2},\kern 1.95872pt {z}_{3},\kern 1.95872pt \mathop{\mathop{…}},\kern 1.95872pt {z}_{n−r}\right \} is the trivial one. Hence, by Definition LICV the set is linearly independent, as desired.
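The construction in Theorem BNS can be carried out mechanically once a row-equivalent matrix B is in reduced row-echelon form. Below is a Python sketch (the function name and list-of-rows representation are our own choices, not from the text), applied to the row-reduced matrix of Example LINSB. Notice how each z_j gets a 1 in slot f_j and 0 in the other free slots — the "pattern of zeros and ones."

```python
def null_space_basis(B):
    """Build the vectors z_j of Theorem BNS from a matrix B that is
    already in reduced row-echelon form (a sketch; no input checks)."""
    n = len(B[0])
    # D: indices of pivot columns (leading 1 of each nonzero row);
    # F: the remaining, free-variable columns.
    D = []
    for row in B:
        piv = next((c for c, x in enumerate(row) if x != 0), None)
        if piv is not None:
            D.append(piv)
    F = [c for c in range(n) if c not in D]
    basis = []
    for f in F:
        z = [0] * n
        z[f] = 1                      # the 1 in slot f_j
        for k, d in enumerate(D):
            z[d] = -B[k][f]           # -[B]_{k, f_j} in the pivot slots
        basis.append(z)
    return basis

# The row-reduced matrix from Example LINSB.
B = [[1, 0, -2, 4, 0, 3, 9],
     [0, 1, 5, 6, 0, 7, 1],
     [0, 0, 0, 0, 1, 8, -5]]
for z in null_space_basis(B):
    print(z)
```

Each vector produced satisfies Bz = 0, and the identity pattern in the free slots forces every scalar in a relation of linear dependence to be zero, exactly as in the proof above.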

Example NSLIL
Null space spanned by linearly independent set, Archetype L
In Example VFSAL we previewed Theorem SSNS by finding a set of two vectors such that their span was the null space for the matrix in Archetype L. Writing the matrix as L, we have

N\kern -1.95872pt \left (L\right ) = \left \langle \left \{\left [\array{ −1\cr 2 \cr −2\cr 1 \cr 0 } \right ],\kern 1.95872pt \left [\array{ 2\cr −2 \cr 1\cr 0 \cr 1 } \right ]\right \}\right \rangle

Solving the homogeneous system ℒS\kern -1.95872pt \left (L,\kern 1.95872pt 0\right ) resulted in recognizing {x}_{4} and {x}_{5} as the free variables. So look in entries 4 and 5 of the two vectors above and notice the pattern of zeros and ones that provides the linear independence of the set.

Subsection READ: Reading Questions

  1. Let S be the set of three vectors below.
    S = \left \{\left [\array{ 1\cr 2 \cr −1} \right ],\kern 1.95872pt \left [\array{ 3\cr −4 \cr 2} \right ],\kern 1.95872pt \left [\array{ 4\cr −2 \cr 1} \right ]\right \}

    Is S linearly independent or linearly dependent? Explain why.

  2. Let S be the set of three vectors below.
    S = \left \{\left [\array{ 1\cr −1 \cr 0} \right ],\kern 1.95872pt \left [\array{ 3\cr 2 \cr 2} \right ],\kern 1.95872pt \left [\array{ 4\cr 3 \cr −4} \right ]\right \}

    Is S linearly independent or linearly dependent? Explain why.

  3. Based on your answer to the previous question, is the matrix below singular or nonsingular? Explain.
    \left [\array{ 1 &3& 4\cr −1 &2 & 3 \cr 0 &2&−4} \right ]

Subsection EXC: Exercises

Determine if the sets of vectors in Exercises C20–C25 are linearly independent or linearly dependent. When the set is linearly dependent, exhibit a nontrivial relation of linear dependence.
C20 \left \{\left [\array{ 1\cr −2 \cr 1 } \right ],\kern 1.95872pt \left [\array{ 2\cr −1 \cr 3 } \right ],\kern 1.95872pt \left [\array{ 1\cr 5 \cr 0 } \right ]\right \}  
Contributed by Robert Beezer Solution [440]

C21 \left \{\left [\array{ −1\cr 2 \cr 4\cr 2 } \right ],\kern 1.95872pt \left [\array{ 3\cr 3 \cr −1\cr 3 } \right ],\kern 1.95872pt \left [\array{ 7\cr 3 \cr −6\cr 4 } \right ]\right \}  
Contributed by Robert Beezer Solution [440]

C22 \left \{\left [\array{ −2\cr −1 \cr −1 } \right ],\kern 1.95872pt \left [\array{ 1\cr 0 \cr −1 } \right ],\kern 1.95872pt \left [\array{ 3\cr 3 \cr 6 } \right ],\kern 1.95872pt \left [\array{ −5\cr −4 \cr −6 } \right ],\kern 1.95872pt \left [\array{ 4\cr 4 \cr 7 } \right ]\right \}  
Contributed by Robert Beezer Solution [441]

C23 \left \{\left [\array{ 1\cr −2 \cr 2\cr 5 \cr 3 } \right ],\kern 1.95872pt \left [\array{ 3\cr 3 \cr 1\cr 2 \cr −4 } \right ],\kern 1.95872pt \left [\array{ 2\cr 1 \cr 2\cr −1 \cr 1 } \right ],\kern 1.95872pt \left [\array{ 1\cr 0 \cr 1\cr 2 \cr 2 } \right ]\right \}  
Contributed by Robert Beezer Solution [441]

C24 \left \{\left [\array{ 1\cr 2 \cr −1\cr 0 \cr 1 } \right ],\kern 1.95872pt \left [\array{ 3\cr 2 \cr −1\cr 2 \cr 2 } \right ],\kern 1.95872pt \left [\array{ 4\cr 4 \cr −2\cr 2 \cr 3 } \right ],\kern 1.95872pt \left [\array{ −1\cr 2 \cr −1\cr −2 \cr 0 } \right ]\right \}  
Contributed by Robert Beezer Solution [442]

C25 \left \{\left [\array{ 2\cr 1 \cr 3\cr −1 \cr 2 } \right ],\kern 1.95872pt \left [\array{ 4\cr −2 \cr 1\cr 3 \cr 2 } \right ],\kern 1.95872pt \left [\array{ 10\cr −7 \cr 0\cr 10 \cr 4 } \right ]\right \}  
Contributed by Robert Beezer Solution [443]

C30 For the matrix B below, find a set S that is linearly independent and spans the null space of B, that is, N\kern -1.95872pt \left (B\right ) = \left \langle S\right \rangle .

B = \left [\array{ −3&1&−2& 7\cr −1 &2 & 1 & 4 \cr 1 &1& 2 &−1 } \right ]

 
Contributed by Robert Beezer Solution [444]

C31 For the matrix A below, find a linearly independent set S so that the null space of A is spanned by S, that is, N\kern -1.95872pt \left (A\right ) = \left \langle S\right \rangle .

A = \left [\array{ −1&−2&2&1&5\cr 1 & 2 &1 &1 &5 \cr 3 & 6 &1&2&7\cr 2 & 4 &0 &1 &2 } \right ]

 
Contributed by Robert Beezer Solution [445]

C32 Find a set of column vectors, T, such that (1) the span of T is the null space of B, \left \langle T\right \rangle = N\kern -1.95872pt \left (B\right ) and (2) T is a linearly independent set.

\eqalignno{ B & = \left [\array{ 2 & 1 & 1 & 1\cr −4 &−3 & 1 &−7 \cr 1 & 1 &−1& 3 } \right ] & & }

 
Contributed by Robert Beezer Solution [447]

C33 Find a set S so that S is linearly independent and N\kern -1.95872pt \left (A\right ) = \left \langle S\right \rangle , where N\kern -1.95872pt \left (A\right ) is the null space of the matrix A below.

\eqalignno{ A & = \left [\array{ 2&3& 3 & 1 & 4\cr 1&1 &−1 &−1 &−3 \cr 3&2&−8&−1& 1 } \right ] & & }

 
Contributed by Robert Beezer Solution [448]

C50 Consider each archetype that is a system of equations and consider the solutions listed for the homogeneous version of the archetype. (If only the trivial solution is listed, then assume this is the only solution to the system.) From the solution set, determine if the columns of the coefficient matrix form a linearly independent or linearly dependent set. In the case of a linearly dependent set, use one of the sample solutions to provide a nontrivial relation of linear dependence on the set of columns of the coefficient matrix (Definition RLDCV). Indicate when Theorem MVSLD applies and connect this with the number of variables and equations in the system of equations.
Archetype A
Archetype B
Archetype C
Archetype D/Archetype E
Archetype F
Archetype G/Archetype H
Archetype I
Archetype J

 
Contributed by Robert Beezer

C51 For each archetype that is a system of equations consider the homogeneous version. Write elements of the solution set in vector form (Theorem VFSLS) and from this extract the vectors {z}_{j} described in Theorem BNS. These vectors are used in a span construction to describe the null space of the coefficient matrix for each archetype. What does it mean when we write a null space as \left \langle \left \{\ \right \}\right \rangle ?
Archetype A
Archetype B
Archetype C
Archetype D/Archetype E
Archetype F
Archetype G/Archetype H
Archetype I
Archetype J

 
Contributed by Robert Beezer

C52 For each archetype that is a system of equations consider the homogeneous version. Sample solutions are given and a linearly independent spanning set is given for the null space of the coefficient matrix. Write each of the sample solutions individually as a linear combination of the vectors in the spanning set for the null space of the coefficient matrix.
Archetype A
Archetype B
Archetype C
Archetype D/Archetype E
Archetype F
Archetype G/Archetype H
Archetype I
Archetype J

 
Contributed by Robert Beezer

C60 For the matrix A below, find a set of vectors S so that (1) S is linearly independent, and (2) the span of S equals the null space of A, \left \langle S\right \rangle = N\kern -1.95872pt \left (A\right ). (See Exercise SS.C60.)

A = \left [\array{ 1 & 1 & 6 &−8\cr 1 &−2 & 0 & 1 \cr −2& 1 &−6& 7 } \right ]

 
Contributed by Robert Beezer Solution [449]

M20 Suppose that S = \left \{{v}_{1},\kern 1.95872pt {v}_{2},\kern 1.95872pt {v}_{3}\right \} is a set of three vectors from {ℂ}^{873}. Prove that the set

\eqalignno{ T & = \left \{2{v}_{1} + 3{v}_{2} + {v}_{3},\kern 1.95872pt {v}_{1} − {v}_{2} − 2{v}_{3},\kern 1.95872pt 2{v}_{1} + {v}_{2} − {v}_{3}\right \} & & }

is linearly dependent.  
Contributed by Robert Beezer Solution [450]

M21 Suppose that S = \left \{{v}_{1},\kern 1.95872pt {v}_{2},\kern 1.95872pt {v}_{3}\right \} is a linearly independent set of three vectors from {ℂ}^{873}. Prove that the set

\eqalignno{ T & = \left \{2{v}_{1} + 3{v}_{2} + {v}_{3},\kern 1.95872pt {v}_{1} − {v}_{2} + 2{v}_{3},\kern 1.95872pt 2{v}_{1} + {v}_{2} − {v}_{3}\right \} & & }

is linearly independent.  
Contributed by Robert Beezer Solution [452]

M50 Consider the set of vectors from {ℂ}^{3}, W, given below. Find a set T that contains three vectors from W and such that W = \left \langle T\right \rangle .

W = \left \langle \left \{{v}_{1},\kern 1.95872pt {v}_{2},\kern 1.95872pt {v}_{3},\kern 1.95872pt {v}_{4},\kern 1.95872pt {v}_{5}\right \}\right \rangle = \left \langle \left \{\left [\array{ 2\cr 1 \cr 1 } \right ],\kern 1.95872pt \left [\array{ −1\cr −1 \cr 1 } \right ],\kern 1.95872pt \left [\array{ 1\cr 2 \cr 3 } \right ],\kern 1.95872pt \left [\array{ 3\cr 1 \cr 3 } \right ],\kern 1.95872pt \left [\array{ 0\cr 1 \cr −3 } \right ]\right \}\right \rangle

 
Contributed by Robert Beezer Solution [454]

M51 Consider the subspace W = \left \langle \left \{{v}_{1},\kern 1.95872pt {v}_{2},\kern 1.95872pt {v}_{3},\kern 1.95872pt {v}_{4}\right \}\right \rangle . Find a set S so that (1) S is a subset of W, (2) S is linearly independent, and (3) W = \left \langle S\right \rangle . Write each vector not included in S as a linear combination of the vectors that are in S.

\eqalignno{ {v}_{1} & = \left [\array{ 1\cr −1 \cr 2 } \right ] &{v}_{2} & = \left [\array{ 4\cr −4 \cr 8 } \right ] &{v}_{3} & = \left [\array{ −3\cr 2 \cr −7 } \right ] &{v}_{4} & = \left [\array{ 2\cr 1 \cr 7 } \right ] & & & & & & & & }

 
Contributed by Manley Perkel Solution [456]

T10 Prove that if a set of vectors contains the zero vector, then the set is linearly dependent. (Ed. “The zero vector is death to linearly independent sets.”)  
Contributed by Martin Jackson

T12 Suppose that S is a linearly independent set of vectors, and T is a subset of S, T ⊆ S (Definition SSET). Prove that T is linearly independent.  
Contributed by Robert Beezer

T13 Suppose that T is a linearly dependent set of vectors, and T is a subset of S, T ⊆ S (Definition SSET). Prove that S is linearly dependent.  
Contributed by Robert Beezer

T15 Suppose that \left \{{v}_{1},\kern 1.95872pt {v}_{2},\kern 1.95872pt {v}_{3},\kern 1.95872pt \mathop{\mathop{…}},\kern 1.95872pt {v}_{n}\right \} is a set of vectors. Prove that

\eqalignno{ \left \{{v}_{1} − {v}_{2},\kern 1.95872pt {v}_{2} − {v}_{3},\kern 1.95872pt {v}_{3} − {v}_{4},\kern 1.95872pt \mathop{\mathop{…}},\kern 1.95872pt {v}_{n} − {v}_{1}\right \} & & }

is a linearly dependent set.

 
Contributed by Robert Beezer Solution [457]

T20 Suppose that \left \{{v}_{1},\kern 1.95872pt {v}_{2},\kern 1.95872pt {v}_{3},\kern 1.95872pt {v}_{4}\right \} is a linearly independent set in {ℂ}^{35}. Prove that

\left \{{v}_{1},\kern 1.95872pt {v}_{1} + {v}_{2},\kern 1.95872pt {v}_{1} + {v}_{2} + {v}_{3},\kern 1.95872pt {v}_{1} + {v}_{2} + {v}_{3} + {v}_{4}\right \}

is a linearly independent set.  
Contributed by Robert Beezer Solution [458]

T50 Suppose that A is an m × n matrix with linearly independent columns and the linear system ℒS\kern -1.95872pt \left (A,\kern 1.95872pt b\right ) is consistent. Show that this system has a unique solution. (Notice that we are not requiring A to be square.)  
Contributed by Robert Beezer Solution [460]

Subsection SOL: Solutions

C20 Contributed by Robert Beezer Statement [430]
With three vectors from {ℂ}^{3}, we can form a square matrix having these three vectors as its columns. We do so, and row-reduce to obtain,

\left [\array{ \text{1}&0&0\cr 0&\text{1 } &0 \cr 0&0&\text{1}} \right ]

the 3 × 3 identity matrix. So by Theorem NME2 the original matrix is nonsingular and its columns are therefore a linearly independent set.

C21 Contributed by Robert Beezer Statement [430]
Theorem LIVRN says we can answer this question by putting these vectors into a matrix as columns and row-reducing. Doing this we obtain,

\left [\array{ \text{1}&0&0\cr 0&\text{1 } &0 \cr 0&0&\text{1}\cr 0&0 &0} \right ]

With n = 3 (3 vectors, 3 columns) and r = 3 (3 leading 1’s) we have n = r and the theorem says the vectors are linearly independent.

C22 Contributed by Robert Beezer Statement [430]
Five vectors from {ℂ}^{3}. Since 5 > 3, Theorem MVSLD says the set is linearly dependent. Boom.

C23 Contributed by Robert Beezer Statement [430]
Theorem LIVRN suggests we analyze a matrix whose columns are the vectors of S,

A = \left [\array{ 1 & 3 & 2 &1\cr −2 & 3 & 1 &0 \cr 2 & 1 & 2 &1\cr 5 & 2 &−1 &2 \cr 3 &−4& 1 &2 } \right ]

Row-reducing the matrix A yields,

\left [\array{ \text{1}&0&0&0\cr 0&\text{1 } &0 &0 \cr 0&0&\text{1}&0\cr 0&0 &0 &\text{1} \cr 0&0&0&0 } \right ]

We see that r = 4 = n, where r is the number of nonzero rows and n is the number of columns. By Theorem LIVRN, the set S is linearly independent.
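The row reduction above is easy to reproduce with software. Here is a minimal sketch using SymPy (assuming it is available; any computer algebra system, or hand computation, works just as well):

```python
from sympy import Matrix

# Columns are the four vectors of S from Exercise C23
A = Matrix([
    [ 1,  3,  2, 1],
    [-2,  3,  1, 0],
    [ 2,  1,  2, 1],
    [ 5,  2, -1, 2],
    [ 3, -4,  1, 2],
])

R, pivots = A.rref()   # reduced row-echelon form and its pivot columns
r = len(pivots)        # r, the number of nonzero rows
n = A.cols             # n, the number of columns
print(r == n)          # True: by Theorem LIVRN, S is linearly independent
```

Since r = n = 4, Theorem LIVRN again gives linear independence.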

C24 Contributed by Robert Beezer Statement [430]
Theorem LIVRN suggests we analyze a matrix whose columns are the vectors from the set,

A = \left [\array{ 1 & 3 & 4 &−1\cr 2 & 2 & 4 & 2 \cr −1&−1&−2&−1\cr 0 & 2 & 2 &−2 \cr 1 & 2 & 3 & 0 } \right ]

Row-reducing the matrix A yields,

\left [\array{ \text{1}&0&1& 2\cr 0&\text{1 } &1 &−1 \cr 0&0&0& 0\cr 0&0 &0 & 0 \cr 0&0&0& 0 } \right ]

We see that r = 2 ≠ 4 = n, where r is the number of nonzero rows and n is the number of columns. By Theorem LIVRN, the set S is linearly dependent.

C25 Contributed by Robert Beezer Statement [430]
Theorem LIVRN suggests we analyze a matrix whose columns are the vectors from the set,

A = \left [\array{ 2 & 4 &10\cr 1 &−2 &−7 \cr 3 & 1 & 0\cr −1 & 3 & 10 \cr 2 & 2 & 4 } \right ]

Row-reducing the matrix A yields,

\left [\array{ \text{1}&0&−1\cr 0&\text{1 } & 3 \cr 0&0& 0\cr 0&0 & 0 \cr 0&0& 0 } \right ]

We see that r = 2 ≠ 3 = n, where r is the number of nonzero rows and n is the number of columns. By Theorem LIVRN, the set S is linearly dependent.

C30 Contributed by Robert Beezer Statement [430]
The requested set is described by Theorem BNS. It is easiest to find by using the procedure of Example VFSAL. Begin by row-reducing the matrix, viewing it as the coefficient matrix of a homogeneous system of equations. We obtain,

\left [\array{ \text{1}&0&1&−2\cr 0&\text{1 } &1 & 1 \cr 0&0&0& 0 } \right ]

Now build the vector form of the solutions to this homogeneous system (Theorem VFSLS). The free variables are {x}_{3} and {x}_{4}, corresponding to the columns without leading 1’s,

\left [\array{ {x}_{1} \cr {x}_{2} \cr {x}_{3} \cr {x}_{4} } \right ] = {x}_{3}\left [\array{ −1\cr −1 \cr 1\cr 0 } \right ]+{x}_{4}\left [\array{ 2\cr −1 \cr 0\cr 1 } \right ]

The desired set S is simply the constant vectors in this expression, and these are the vectors {z}_{1} and {z}_{2} described by Theorem BNS.

S = \left \{\left [\array{ −1\cr −1 \cr 1\cr 0 } \right ],\kern 1.95872pt \left [\array{ 2\cr −1 \cr 0\cr 1 } \right ]\right \}
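The construction of Theorem BNS can be checked computationally. A minimal sketch with SymPy (an assumption, not part of the text): since row operations do not change the null space, we may feed it the reduced row-echelon form obtained above.

```python
from sympy import Matrix

# Reduced row-echelon form from above; its null space equals N(A)
R = Matrix([
    [1, 0, 1, -2],
    [0, 1, 1,  1],
    [0, 0, 0,  0],
])

# SymPy builds one basis vector per free variable, just as in Theorem BNS
basis = R.nullspace()
for z in basis:
    print(list(z))
```

The two vectors printed agree with the set S above.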

C31 Contributed by Robert Beezer Statement [431]
Theorem BNS provides formulas for n − r vectors that will meet the requirements of this question. These vectors are the same ones listed in Theorem VFSLS when we solve the homogeneous system ℒS\kern -1.95872pt \left (A,\kern 1.95872pt 0\right ), whose solution set is the null space (Definition NSM).

To apply Theorem BNS or Theorem VFSLS we first row-reduce the matrix, resulting in

B = \left [\array{ \text{1}&2&0&0& 3\cr 0&0 &\text{1 } &0 & 6 \cr 0&0&0&\text{1}&−4\cr 0&0 &0 &0 & 0 } \right ]

We see that n − r = 5 − 3 = 2 and F = \left \{2, 5\right \}, so the vector form of a generic solution vector is

\left [\array{ {x}_{1} \cr {x}_{2} \cr {x}_{3} \cr {x}_{4} \cr {x}_{5} } \right ] = {x}_{2}\left [\array{ −2\cr 1 \cr 0\cr 0 \cr 0 } \right ]+{x}_{5}\left [\array{ −3\cr 0 \cr −6\cr 4 \cr 1 } \right ]

So we have

N\kern -1.95872pt \left (A\right ) = \left \langle \left \{\left [\array{ −2\cr 1 \cr 0\cr 0 \cr 0 } \right ],\kern 1.95872pt \left [\array{ −3\cr 0 \cr −6\cr 4 \cr 1 } \right ]\right \}\right \rangle

C32 Contributed by Robert Beezer Statement [431]
The conclusion of Theorem BNS gives us everything this question asks for. We need the reduced row-echelon form of the matrix so we can determine the number of vectors in T, and their entries.

\eqalignno{ \left [\array{ 2 & 1 & 1 & 1\cr −4 &−3 & 1 &−7 \cr 1 & 1 &−1& 3 } \right ] &\mathop{\longrightarrow}\limits_{}^{\text{RREF}}\left [\array{ \text{1}&0& 2 &−2\cr 0&\text{1 } &−3 & 5 \cr 0&0& 0 & 0 } \right ] & & }

We can build the set T immediately via Theorem BNS, but we will illustrate its construction in two steps. Since F = \left \{3,\kern 1.95872pt 4\right \}, we will have two vectors and can distribute strategically placed ones, and many zeros. Then we distribute the negatives of the appropriate entries of the non-pivot columns of the reduced row-echelon matrix.

\eqalignno{ T & = \left \{\left [\array{ \cr \cr 1 \cr 0 } \right ],\kern 1.95872pt \left [\array{ \cr \cr 0 \cr 1 } \right ]\right \} &T & = \left \{\left [\array{ −2\cr 3 \cr 1\cr 0 } \right ],\kern 1.95872pt \left [\array{ 2\cr −5 \cr 0\cr 1 } \right ]\right \} & & & & }
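As a check, the same set falls out of a direct null space computation. A sketch with SymPy (assumed available):

```python
from sympy import Matrix

B = Matrix([
    [ 2,  1,  1,  1],
    [-4, -3,  1, -7],
    [ 1,  1, -1,  3],
])

T = B.nullspace()   # one vector per non-pivot column, as in Theorem BNS
for z in T:
    assert B * z == Matrix([0, 0, 0])   # each z really lies in N(B)
print([list(z) for z in T])
```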

C33 Contributed by Robert Beezer Statement [432]
A direct application of Theorem BNS will provide the desired set. We require the reduced row-echelon form of A.

\eqalignno{ \left [\array{ 2&3& 3 & 1 & 4\cr 1&1 &−1 &−1 &−3 \cr 3&2&−8&−1& 1 } \right ] &\mathop{\longrightarrow}\limits_{}^{\text{RREF}}\left [\array{ \text{1}&0&−6&0& 3\cr 0&\text{1 } & 5 &0 &−2 \cr 0&0& 0 &\text{1}& 4 } \right ] & & }

The non-pivot columns have indices F = \left \{3,\kern 1.95872pt 5\right \}. We build the desired set in two steps, first placing the requisite zeros and ones in locations based on F, then placing the negatives of the entries of columns 3 and 5 in the proper locations. This is all specified in Theorem BNS.

\eqalignno{ S & = \left \{\left [\array{ \cr \cr 1\cr \cr 0 } \right ],\kern 1.95872pt \left [\array{ \cr \cr 0\cr \cr 1 } \right ]\right \} = \left \{\left [\array{ 6\cr −5 \cr 1\cr 0 \cr 0 } \right ],\kern 1.95872pt \left [\array{ −3\cr 2 \cr 0\cr −4 \cr 1 } \right ]\right \} & & }
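It is worth verifying that the two vectors of S behave as advertised. A short sketch in Python with SymPy (an assumed dependency), checking both requirements of the exercise:

```python
from sympy import Matrix

A = Matrix([
    [2, 3,  3,  1,  4],
    [1, 1, -1, -1, -3],
    [3, 2, -8, -1,  1],
])

# The two vectors produced by Theorem BNS above
S = [Matrix([6, -5, 1, 0, 0]), Matrix([-3, 2, 0, -4, 1])]

for z in S:
    assert A * z == Matrix([0, 0, 0])   # each vector lies in N(A)

# The matrix with S as columns has rank 2, so S is linearly independent
assert Matrix.hstack(*S).rank() == 2
print("S is a linearly independent subset of N(A)")
```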

C60 Contributed by Robert Beezer Statement [434]
Theorem BNS says that if we find the vector form of the solutions to the homogeneous system ℒS\kern -1.95872pt \left (A,\kern 1.95872pt 0\right ), then the fixed vectors (one per free variable) will have the desired properties. Row-reduce A, viewing it as the augmented matrix of a homogeneous system with an invisible column of zeros as the last column,

\left [\array{ \text{1}&0&4&−5\cr 0&\text{1 } &2 &−3 \cr 0&0&0& 0 } \right ]

Moving to the vector form of the solutions (Theorem VFSLS), with free variables {x}_{3} and {x}_{4}, solutions to the consistent system (it is homogeneous, Theorem HSC) can be expressed as

\left [\array{ {x}_{1} \cr {x}_{2} \cr {x}_{3} \cr {x}_{4} } \right ] = {x}_{3}\left [\array{ −4\cr −2 \cr 1\cr 0 } \right ]+{x}_{4}\left [\array{ 5\cr 3 \cr 0\cr 1 } \right ]

Then with S given by

S = \left \{\left [\array{ −4\cr −2 \cr 1\cr 0 } \right ],\kern 1.95872pt \left [\array{ 5\cr 3 \cr 0\cr 1 } \right ]\right \}

Theorem BNS guarantees the set has the desired properties.
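The same set can be recovered in one step by a null space computation. A minimal sketch with SymPy (assuming it is installed):

```python
from sympy import Matrix

A = Matrix([
    [ 1,  1,  6, -8],
    [ 1, -2,  0,  1],
    [-2,  1, -6,  7],
])

S = A.nullspace()   # Theorem BNS: one basis vector per free variable
for z in S:
    assert A * z == Matrix([0, 0, 0])
print([list(z) for z in S])
```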

M20 Contributed by Robert Beezer Statement [435]
By Definition LICV, we can complete this problem by finding scalars, {α}_{1},\kern 1.95872pt {α}_{2},\kern 1.95872pt {α}_{3}, not all zero, such that

\eqalignno{ {α}_{1}\left (2{v}_{1} + 3{v}_{2} + {v}_{3}\right ) + {α}_{2}\left ({v}_{1} − {v}_{2} − 2{v}_{3}\right ) + {α}_{3}\left (2{v}_{1} + {v}_{2} − {v}_{3}\right ) & = 0 & & }

Using various properties in Theorem VSPCV, we can rearrange this vector equation to

\eqalignno{ \left (2{α}_{1} + {α}_{2} + 2{α}_{3}\right ){v}_{1} + \left (3{α}_{1} − {α}_{2} + {α}_{3}\right ){v}_{2} + \left ({α}_{1} − 2{α}_{2} − {α}_{3}\right ){v}_{3} & = 0 & & }

We can certainly make this vector equation true if we can determine values for the α’s such that

\eqalignno{ 2{α}_{1} + {α}_{2} + 2{α}_{3} & = 0 & & \cr 3{α}_{1} − {α}_{2} + {α}_{3} & = 0 & & \cr {α}_{1} − 2{α}_{2} − {α}_{3} & = 0 & & }

Aah, a homogeneous system of equations. And it has infinitely many non-zero solutions. By the now familiar techniques, one such solution is {α}_{1} = 3, {α}_{2} = 4, {α}_{3} = −5, which you can check in the original relation of linear dependence on T above.

Note that simply writing down the three scalars, and demonstrating that they provide a nontrivial relation of linear dependence on T, could be considered an ironclad solution. But it wouldn't have been very informative if we had done only that here. Compare this solution very carefully with Solution LI.M21.
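The claimed scalars are easy to check against the homogeneous system above; a quick sketch in Python:

```python
# Check that (alpha_1, alpha_2, alpha_3) = (3, 4, -5) solves the
# homogeneous system derived in the solution above
a1, a2, a3 = 3, 4, -5

assert 2*a1 +   a2 + 2*a3 == 0
assert 3*a1 -   a2 +   a3 == 0
assert   a1 - 2*a2 -   a3 == 0
print("nontrivial relation of linear dependence confirmed")
```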

M21 Contributed by Robert Beezer Statement [435]
By Definition LICV we can complete this problem by proving that if we assume that

\eqalignno{ {α}_{1}\left (2{v}_{1} + 3{v}_{2} + {v}_{3}\right ) + {α}_{2}\left ({v}_{1} − {v}_{2} + 2{v}_{3}\right ) + {α}_{3}\left (2{v}_{1} + {v}_{2} − {v}_{3}\right ) & = 0 & & }

then we must conclude that {α}_{1} = {α}_{2} = {α}_{3} = 0. Using various properties in Theorem VSPCV, we can rearrange this vector equation to

\eqalignno{ \left (2{α}_{1} + {α}_{2} + 2{α}_{3}\right ){v}_{1} + \left (3{α}_{1} − {α}_{2} + {α}_{3}\right ){v}_{2} + \left ({α}_{1} + 2{α}_{2} − {α}_{3}\right ){v}_{3} & = 0 & & }

Because the set S = \left \{{v}_{1},\kern 1.95872pt {v}_{2},\kern 1.95872pt {v}_{3}\right \} was assumed to be linearly independent, by Definition LICV we must conclude that

\eqalignno{ 2{α}_{1} + {α}_{2} + 2{α}_{3} & = 0 & & \cr 3{α}_{1} − {α}_{2} + {α}_{3} & = 0 & & \cr {α}_{1} + 2{α}_{2} − {α}_{3} & = 0 & & }

Aah, a homogeneous system of equations. And it has a unique solution, the trivial solution. So, {α}_{1} = {α}_{2} = {α}_{3} = 0, as desired. It is an inescapable conclusion from our assumption of a relation of linear dependence above. Done.

Compare this solution very carefully with Solution LI.M20, noting especially how this problem required (and used) the hypothesis that the original set be linearly independent, and how this solution feels more like a proof, while the previous problem could be solved with a fairly simple demonstration of any nontrivial relation of linear dependence.
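One quick way to see that the homogeneous system above has only the trivial solution is to check that its coefficient matrix is nonsingular; a sketch in Python using the determinant test (a tool treated formally later in the book):

```python
# Coefficient matrix of the homogeneous system in the alphas above
M = [[2,  1,  2],
     [3, -1,  1],
     [1,  2, -1]]

# 3x3 determinant by cofactor expansion along the first row
det = (M[0][0] * (M[1][1]*M[2][2] - M[1][2]*M[2][1])
     - M[0][1] * (M[1][0]*M[2][2] - M[1][2]*M[2][0])
     + M[0][2] * (M[1][0]*M[2][1] - M[1][1]*M[2][0]))

print(det)   # 16, nonzero: only the trivial solution exists
```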

M50 Contributed by Robert Beezer Statement [436]
We want to first find some relations of linear dependence on \left \{{v}_{1},\kern 1.95872pt {v}_{2},\kern 1.95872pt {v}_{3},\kern 1.95872pt {v}_{4},\kern 1.95872pt {v}_{5}\right \} that will allow us to “kick out” some vectors, in the spirit of Example SCAD. To find relations of linear dependence, we formulate a matrix A whose columns are {v}_{1},\kern 1.95872pt {v}_{2},\kern 1.95872pt {v}_{3},\kern 1.95872pt {v}_{4},\kern 1.95872pt {v}_{5}. Then we consider the homogeneous system of equations ℒS\kern -1.95872pt \left (A,\kern 1.95872pt 0\right ) by row-reducing its coefficient matrix (remember that if we formulated the augmented matrix we would just add a column of zeros). After row-reducing, we obtain

\left [\array{ \text{1}&0&0&2&−1\cr 0&\text{1 } &0 &1 &−2 \cr 0&0&\text{1}&0& 0 } \right ]

From this we find that solutions can be obtained employing the free variables {x}_{4} and {x}_{5}. With appropriate choices we will be able to conclude that vectors {v}_{4} and {v}_{5} are unnecessary for creating W via a span. By Theorem SLSLC the choices of free variables below lead to solutions and linear combinations, which are then rearranged.

\eqalignno{ {x}_{4} = 1,{x}_{5} = 0&& ⇒&&(−2){v}_{1} + (−1){v}_{2} + (0){v}_{3} + (1){v}_{4} + (0){v}_{5} = 0&& ⇒&&{v}_{4} = 2{v}_{1} + {v}_{2} &&&&&&&&&& \cr {x}_{4} = 0,{x}_{5} = 1&& ⇒&&(1){v}_{1} + (2){v}_{2} + (0){v}_{3} + (0){v}_{4} + (1){v}_{5} = 0 && ⇒&&{v}_{5} = −{v}_{1} − 2{v}_{2}&&&&&&&&&& \cr && && && && && }

Since {v}_{4} and {v}_{5} can be expressed as linear combinations of {v}_{1} and {v}_{2} we can say that {v}_{4} and {v}_{5} are not needed for the linear combinations used to build W (a claim that we could establish carefully with a pair of set equality arguments). Thus

W = \left \langle \left \{{v}_{1},\kern 1.95872pt {v}_{2},\kern 1.95872pt {v}_{3}\right \}\right \rangle = \left \langle \left \{\left [\array{ 2\cr 1 \cr 1 } \right ],\kern 1.95872pt \left [\array{ −1\cr −1 \cr 1 } \right ],\kern 1.95872pt \left [\array{ 1\cr 2 \cr 3 } \right ]\right \}\right \rangle

That \left \{{v}_{1},\kern 1.95872pt {v}_{2},\kern 1.95872pt {v}_{3}\right \} is a linearly independent set can be established quickly with Theorem LIVRN.

There are other answers to this question, but notice that any relation of linear dependence on {v}_{1},\kern 1.95872pt {v}_{2},\kern 1.95872pt {v}_{3},\kern 1.95872pt {v}_{4},\kern 1.95872pt {v}_{5} will have a zero coefficient on {v}_{3}, so this vector can never be eliminated from the set used to build the span.
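The pivot columns and the two relations of linear dependence found above can be verified together. A sketch with SymPy (assumed available):

```python
from sympy import Matrix

# The five vectors spanning W
v = [Matrix([2, 1, 1]), Matrix([-1, -1, 1]), Matrix([1, 2, 3]),
     Matrix([3, 1, 3]), Matrix([0, 1, -3])]
A = Matrix.hstack(*v)

R, pivots = A.rref()
print(pivots)   # (0, 1, 2): the first three columns are the pivot columns

# The two relations of linear dependence derived above
assert v[3] == 2*v[0] + v[1]      # v4 = 2 v1 + v2
assert v[4] == -v[0] - 2*v[1]     # v5 = -v1 - 2 v2
```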

M51 Contributed by Robert Beezer Statement [436]
This problem can be solved using the approach in Solution LI.M50. We will provide a solution here that is more ad hoc, but note that we will have a more straightforward procedure given by the upcoming Theorem BS.

{v}_{1} is a non-zero vector, so in a set all by itself we have a linearly independent set. As {v}_{2} is a scalar multiple of {v}_{1}, the equation − 4{v}_{1} + {v}_{2} = 0 is a relation of linear dependence on \left \{{v}_{1},\kern 1.95872pt {v}_{2}\right \}, so we will pass on {v}_{2}. No such relation of linear dependence exists on \left \{{v}_{1},\kern 1.95872pt {v}_{3}\right \}, though on \left \{{v}_{1},\kern 1.95872pt {v}_{3},\kern 1.95872pt {v}_{4}\right \} we have the relation of linear dependence 7{v}_{1} + 3{v}_{3} + {v}_{4} = 0. So take S = \left \{{v}_{1},\kern 1.95872pt {v}_{3}\right \}, which is linearly independent.

Then

\eqalignno{ {v}_{2} & = 4{v}_{1} + 0{v}_{3} &{v}_{4} & = −7{v}_{1} − 3{v}_{3} & & & & }

The two equations above are enough to justify the set equality

\eqalignno{ W & = \left \langle \left \{{v}_{1},\kern 1.95872pt {v}_{2},\kern 1.95872pt {v}_{3},\kern 1.95872pt {v}_{4}\right \}\right \rangle = \left \langle \left \{{v}_{1},\kern 1.95872pt {v}_{3}\right \}\right \rangle = \left \langle S\right \rangle & & }

There are other solutions (for example, swap the roles of {v}_{1} and {v}_{2}), but by upcoming theorems we can confidently claim that any solution will be a set S with exactly two vectors.
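Each claim in this ad hoc argument can be confirmed directly. A sketch with SymPy (an assumed dependency):

```python
from sympy import Matrix

v1 = Matrix([1, -1, 2])
v2 = Matrix([4, -4, 8])
v3 = Matrix([-3, 2, -7])
v4 = Matrix([2, 1, 7])

assert v2 == 4*v1                              # -4 v1 + v2 = 0
assert 7*v1 + 3*v3 + v4 == Matrix([0, 0, 0])   # relation on {v1, v3, v4}
assert Matrix.hstack(v1, v3).rank() == 2       # S = {v1, v3} is independent
print("S = {v1, v3} satisfies all three requirements")
```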

T15 Contributed by Robert Beezer Statement [438]
Consider the following linear combination

\eqalignno{ 1\left ({v}_{1} − {v}_{2}\right )+ &1\left ({v}_{2} − {v}_{3}\right ) + 1\left ({v}_{3} − {v}_{4}\right ) + \mathrel{⋯} + 1\left ({v}_{n} − {v}_{1}\right ) & & \cr & = {v}_{1} − {v}_{2} + {v}_{2} − {v}_{3} + {v}_{3} − {v}_{4} + \mathrel{⋯} + {v}_{n} − {v}_{1} & & \cr & = {v}_{1} + 0 + 0 + \mathrel{⋯} + 0 − {v}_{1} & & \cr & = 0 & & }

This is a nontrivial relation of linear dependence (Definition RLDCV), so by Definition LICV the set is linearly dependent.
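The telescoping can be illustrated numerically; the four vectors below are arbitrary choices made purely for illustration, and any vectors whatsoever give the same cancellation:

```python
# Four arbitrary vectors (any choice works; these are illustrative)
vs = [[1, 2, 3], [4, 0, -1], [2, 2, 2], [-5, 1, 0]]

# Sum v1-v2, v2-v3, v3-v4, v4-v1, each with coefficient 1
total = [0, 0, 0]
n = len(vs)
for i in range(n):
    diff = [a - b for a, b in zip(vs[i], vs[(i + 1) % n])]
    total = [t + d for t, d in zip(total, diff)]

print(total)   # [0, 0, 0]: a nontrivial relation of linear dependence
```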

T20 Contributed by Robert Beezer Statement [438]
Our hypothesis and our conclusion use the term linear independence, so it will get a workout. To establish linear independence, we begin with the definition (Definition LICV) and write a relation of linear dependence (Definition RLDCV),

{α}_{1}\left ({v}_{1}\right ) + {α}_{2}\left ({v}_{1} + {v}_{2}\right ) + {α}_{3}\left ({v}_{1} + {v}_{2} + {v}_{3}\right ) + {α}_{4}\left ({v}_{1} + {v}_{2} + {v}_{3} + {v}_{4}\right ) = 0

Using the distributive and commutative properties of vector addition and scalar multiplication (Theorem VSPCV) this equation can be rearranged as

\left ({α}_{1} + {α}_{2} + {α}_{3} + {α}_{4}\right ){v}_{1} + \left ({α}_{2} + {α}_{3} + {α}_{4}\right ){v}_{2} + \left ({α}_{3} + {α}_{4}\right ){v}_{3} + \left ({α}_{4}\right ){v}_{4} = 0

However, this is a relation of linear dependence (Definition RLDCV) on a linearly independent set, \left \{{v}_{1},\kern 1.95872pt {v}_{2},\kern 1.95872pt {v}_{3},\kern 1.95872pt {v}_{4}\right \} (this was our lone hypothesis). By the definition of linear independence (Definition LICV) the scalars must all be zero. This is the homogeneous system of equations,

\eqalignno{ {α}_{1} + {α}_{2} + {α}_{3} + {α}_{4} & = 0 & & \cr {α}_{2} + {α}_{3} + {α}_{4} & = 0 & & \cr {α}_{3} + {α}_{4} & = 0 & & \cr {α}_{4} & = 0 & & }

Row-reducing the coefficient matrix of this system (or backsolving) gives the conclusion

\eqalignno{ {α}_{1} = 0 & &{α}_{2} = 0 & &{α}_{3} = 0 & &{α}_{4} = 0 & & & & & & & & }

This means, by Definition LICV, that the original set

\left \{{v}_{1},\kern 1.95872pt {v}_{1} + {v}_{2},\kern 1.95872pt {v}_{1} + {v}_{2} + {v}_{3},\kern 1.95872pt {v}_{1} + {v}_{2} + {v}_{3} + {v}_{4}\right \}

is linearly independent.
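The heart of the proof is that the coefficient matrix of the homogeneous system in the α's is triangular with ones on the diagonal, hence nonsingular. A sketch with SymPy (assumed available) makes this concrete:

```python
from sympy import Matrix

# Coefficient matrix of the homogeneous system in the alphas above;
# triangular with ones on the diagonal, hence nonsingular
C = Matrix([
    [1, 1, 1, 1],
    [0, 1, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 0, 1],
])

assert C.det() == 1            # nonsingular
assert C.nullspace() == []     # only the trivial solution
print("all alphas must be zero")
```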

T50 Contributed by Robert Beezer Statement [439]
Let A = \left [{A}_{1}|{A}_{2}|{A}_{3}|\mathop{\mathop{…}}|{A}_{n}\right ]. ℒS\kern -1.95872pt \left (A,\kern 1.95872pt b\right ) is consistent, so we know the system has at least one solution (Definition CS). We would like to show that there is no more than one solution to the system. Employing Technique U, suppose that x and y are two solution vectors for ℒS\kern -1.95872pt \left (A,\kern 1.95872pt b\right ). By Theorem SLSLC we know we can write,

\eqalignno{ b & ={ \left [x\right ]}_{1}{A}_{1} +{ \left [x\right ]}_{2}{A}_{2} +{ \left [x\right ]}_{3}{A}_{3} + \mathrel{⋯} +{ \left [x\right ]}_{n}{A}_{n} & & \cr b & ={ \left [y\right ]}_{1}{A}_{1} +{ \left [y\right ]}_{2}{A}_{2} +{ \left [y\right ]}_{3}{A}_{3} + \mathrel{⋯} +{ \left [y\right ]}_{n}{A}_{n} & & }

Then

\eqalignno{ 0 & = b − b & & \cr & = \left ({\left [x\right ]}_{1}{A}_{1} +{ \left [x\right ]}_{2}{A}_{2} + \mathrel{⋯} +{ \left [x\right ]}_{n}{A}_{n}\right ) −\left ({\left [y\right ]}_{1}{A}_{1} +{ \left [y\right ]}_{2}{A}_{2} + \mathrel{⋯} +{ \left [y\right ]}_{n}{A}_{n}\right ) & & \cr & = \left ({\left [x\right ]}_{1} −{\left [y\right ]}_{1}\right ){A}_{1} + \left ({\left [x\right ]}_{2} −{\left [y\right ]}_{2}\right ){A}_{2} + \mathrel{⋯} + \left ({\left [x\right ]}_{n} −{\left [y\right ]}_{n}\right ){A}_{n} & & }

This is a relation of linear dependence (Definition RLDCV) on a linearly independent set (the columns of A). So the scalars must all be zero,

\eqalignno{ {\left [x\right ]}_{1} −{\left [y\right ]}_{1} & = 0 &{\left [x\right ]}_{2} −{\left [y\right ]}_{2} & = 0 &\mathop{\mathop{…}} & &{\left [x\right ]}_{n} −{\left [y\right ]}_{n} & = 0 & & & & & & & & }

Rearranging these equations yields the statement that {\left [x\right ]}_{i} ={ \left [y\right ]}_{i}, for 1 ≤ i ≤ n. However, this is exactly how we define vector equality (Definition CVE), so x = y and the system has only one solution.
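A small numerical instance illustrates the theorem. The 3 × 2 matrix and right-hand side below are hypothetical choices made for this sketch (using SymPy, an assumed dependency): the columns are linearly independent, b lies in their span, and the solver finds exactly one solution.

```python
from sympy import Matrix, symbols, linsolve

# A hypothetical non-square system: independent columns, consistent b
A = Matrix([[1, 0], [0, 1], [1, 1]])
b = Matrix([2, 3, 5])          # b = 2*(col 1) + 3*(col 2)

x, y = symbols("x y")
sols = linsolve((A, b), [x, y])
print(sols)                    # exactly one solution, as the theorem predicts
```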