# Section LDS Linear Dependence and Spans

In any linearly dependent set there is always one vector that can be written as a linear combination of the others. This is the substance of the upcoming Theorem DLDS. Perhaps this will explain the use of the word “dependent.” In a linearly dependent set, at least one vector “depends” on the others (via a linear combination).
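For instance, consider a hypothetical nontrivial relation of linear dependence (the scalars here are made up for illustration, not taken from the text). Any vector carrying a nonzero scalar can be isolated and so written as a linear combination of the others:

```latex
% Hypothetical relation of linear dependence on {v_1, v_2, v_3},
% with the nonzero scalar 4 on v_3:
\begin{align*}
2\vect{v}_1-3\vect{v}_2+4\vect{v}_3&=\zerovector\\
\vect{v}_3&=-\frac{2}{4}\vect{v}_1+\frac{3}{4}\vect{v}_2
          =-\frac{1}{2}\vect{v}_1+\frac{3}{4}\vect{v}_2
\end{align*}
```

Reversing the steps converts such a vector equality back into a nontrivial relation of linear dependence, which is the other half of the equivalence.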

Indeed, because Theorem DLDS is an equivalence (Proof Technique E) some authors use this condition as a definition (Proof Technique D) of linear dependence. Then linear independence is defined as the logical opposite of linear dependence. Of course, we have chosen to take Definition LICV as our definition, and then follow with Theorem DLDS as a theorem.

# Subsection LDSS Linearly Dependent Sets and Spans

If we use a linearly dependent set to construct a span, then we can always create the same infinite set with a starting set that is one vector smaller in size. We will illustrate this behavior in Example RSC5. However, this will not be possible if we build a span from a linearly independent set. So in a certain sense, using a linearly independent set to formulate a span is the best possible way — there are not any extra vectors being used to build up all the necessary linear combinations. OK, here is the theorem, and then the example.

##### Proof

This theorem can be used, sometimes repeatedly, to whittle down the size of a set of vectors used in a span construction. We have seen some of this already in Example SCAD, but in the next example we will detail some of the subtleties.
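The whittling-down behavior is easy to check computationally. Here is a small SymPy sketch with made-up vectors (the names and numbers are illustrative, not from the text): since the third vector is the sum of the first two, removing it leaves the span unchanged.

```python
from sympy import Matrix

# Illustrative vectors (not from the text); note v3 = v1 + v2,
# so {v1, v2, v3} is a linearly dependent set.
v1, v2, v3 = Matrix([1, 0, 1]), Matrix([2, 1, 3]), Matrix([3, 1, 4])
S = Matrix.hstack(v1, v2, v3)
R = Matrix.hstack(v1, v2)  # toss out the "dependent" vector v3

# The spans coincide: stacking the two sets together adds no new
# directions, so all three ranks agree.
same_span = S.rank() == R.rank() == Matrix.hstack(S, R).rank()
print(same_span)  # True
```

Repeating this removal, one dependent vector at a time, is exactly the "whittling down" described above.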


# Subsection COV Casting Out Vectors

In Example RSC5 we used four vectors to create a span. With a relation of linear dependence in hand, we were able to “toss out” one of these four vectors and create the same span from a subset of just three vectors from the original set of four. We did have to take some care as to just which vector we tossed out. In the next example, we will be more methodical about just how we choose to eliminate vectors from a linearly dependent set while preserving a span.

Example COV deserves your careful attention, since this important example motivates the following very fundamental theorem.

##### Proof

In Example COV, we tossed out vectors one at a time. But in each instance, we rewrote the offending vector as a linear combination of those vectors with the column indices of the pivot columns of the reduced row-echelon form of the matrix of columns. In the proof of Theorem BS, we accomplish this reduction in one big step. In Example COV we arrived at a linearly independent set at exactly the same moment that we ran out of free variables to exploit. This was not a coincidence; it is the substance of the conclusion of linear independence in Theorem BS.
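The pivot-column selection behind this reduction can be sketched with SymPy (the vectors below are made up for illustration, not taken from the text). `Matrix.rref()` returns the reduced row-echelon form together with the indices of the pivot columns, and the casting-out procedure keeps exactly the original columns in those pivot positions.

```python
from sympy import Matrix

# Illustrative vectors (not from the text); the third column equals the
# sum of the first two, so the set of columns is linearly dependent.
cols = [Matrix([1, 0, 1]), Matrix([2, 1, 3]),
        Matrix([3, 1, 4]), Matrix([0, 0, 1])]
A = Matrix.hstack(*cols)

# rref() yields (reduced row-echelon form, indices of pivot columns);
# keep only the original columns sitting in pivot positions.
_, pivots = A.rref()
kept = [cols[j] for j in pivots]

print(pivots)                 # (0, 1, 3): column 2 is cast out
print(len(kept) == A.rank())  # True: the kept columns are independent
```

Since the number of kept columns equals the rank, no free variables remain, mirroring the moment in Example COV when linear independence was achieved.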

Here is a straightforward application of Theorem BS.

# Subsection Reading Questions

##### 1

Let $S$ be the linearly dependent set of three vectors below. \begin{equation*} S=\set{\colvector{1\\10\\100\\1000},\,\colvector{1\\1\\1\\1},\,\colvector{5\\23\\203\\2003}} \end{equation*} Write one vector from $S$ as a linear combination of the other two and include this vector equality in your response. (You should be able to do this on sight, rather than doing some computations.) Convert this expression into a nontrivial relation of linear dependence on $S\text{.}$

##### 2

Explain why the word “dependent” is used in the definition of linear dependence.

##### 3

Suppose that $Y=\spn{P}=\spn{Q}\text{,}$ where $P$ is a linearly dependent set and $Q$ is linearly independent. Would you rather use $P$ or $Q$ to describe $Y\text{?}$ Why?

# Subsection Exercises

##### C20

Let $T$ be the set of columns of the matrix $B$ below. Define $W=\spn{T}\text{.}$ Find a set $R$ so that (1) $R$ has 3 vectors, (2) $R$ is a subset of $T\text{,}$ and (3) $W=\spn{R}\text{.}$ \begin{equation*} B= \begin{bmatrix} -3 & 1 & -2 & 7\\ -1 & 2 & 1 & 4\\ 1 & 1 & 2 & -1 \end{bmatrix} \end{equation*}

##### C40

Verify that the set $R^\prime=\set{\vect{v}_1,\,\vect{v}_2,\,\vect{v}_4}$ at the end of Example RSC5 is linearly independent.

##### C50

Consider the set of vectors from $\complex{3}\text{,}$ $W\text{,}$ given below. Find a linearly independent set $T$ that contains three vectors from $W$ and such that $\spn{W}=\spn{T}\text{.}$ \begin{equation*} W= \set{\vect{v}_1,\,\vect{v}_2,\,\vect{v}_3,\,\vect{v}_4,\,\vect{v}_5} =\set{ \colvector{2\\1\\1},\, \colvector{-1\\-1\\1},\, \colvector{1\\2\\3},\, \colvector{3\\1\\3},\, \colvector{0\\1\\-3} }\text{.} \end{equation*}

##### C51

Given the set $S$ below, find a linearly independent set $T$ so that $\spn{T}=\spn{S}\text{.}$ \begin{equation*} S=\set{ \colvector{2\\-1\\2},\, \colvector{3\\0\\1},\, \colvector{1\\1\\-1},\, \colvector{5\\-1\\3} }\text{.} \end{equation*}

##### C52

Let $W$ be the span of the set of vectors $S$ below, $W=\spn{S}\text{.}$ Find a set $T$ so that (1) the span of $T$ is $W\text{,}$ $\spn{T}=W\text{,}$ (2) $T$ is a linearly independent set, and (3) $T$ is a subset of $S\text{.}$ \begin{align*} S&= \set{ \colvector{1 \\ 2 \\ -1},\, \colvector{2 \\ -3 \\ 1},\, \colvector{4 \\ 1 \\ -1},\, \colvector{3 \\ 1 \\ 1},\, \colvector{3 \\ -1 \\ 0} }\text{.} \end{align*}

##### C55

Let $T$ be the set of vectors \begin{equation*} T=\set{ \colvector{1 \\ -1 \\ 2},\, \colvector{3 \\ 0 \\ 1},\, \colvector{4 \\ 2 \\ 3},\, \colvector{3 \\ 0 \\ 6} }\text{.} \end{equation*} Find two different subsets of $T\text{,}$ named $R$ and $S\text{,}$ so that $R$ and $S$ each contain three vectors, and so that $\spn{R}=\spn{T}$ and $\spn{S}=\spn{T}\text{.}$ Prove that both $R$ and $S$ are linearly independent.

##### C70

Reprise Example RES by creating a new version of the vector $\vect{y}\text{.}$ In other words, form a new, different linear combination of the vectors in $R$ to create a new vector $\vect{y}$ (but do not simplify the problem too much by choosing any of the five new scalars to be zero). Then express this new $\vect{y}$ as a combination of the vectors in $P\text{.}$

##### M10

At the conclusion of Example RSC4 two alternative solutions, sets $T^{\prime}$ and $T^{*}\text{,}$ are proposed. Verify these claims by proving that $\spn{T}=\spn{T^{\prime}}$ and $\spn{T}=\spn{T^{*}}\text{.}$

##### T40

Suppose that $\vect{v}_1$ and $\vect{v}_2$ are any two vectors from $\complex{m}\text{.}$ Prove the following set equality. \begin{equation*} \spn{\set{\vect{v}_1,\,\vect{v}_2}} = \spn{\set{\vect{v}_1+\vect{v}_2,\,\vect{v}_1-\vect{v}_2}}\text{.} \end{equation*}
