\(\newcommand{\orderof}[1]{\sim #1} \newcommand{\Z}{\mathbb{Z}} \newcommand{\reals}{\mathbb{R}} \newcommand{\real}[1]{\mathbb{R}^{#1}} \newcommand{\complexes}{\mathbb{C}} \newcommand{\complex}[1]{\mathbb{C}^{#1}} \newcommand{\conjugate}[1]{\overline{#1}} \newcommand{\modulus}[1]{\left\lvert#1\right\rvert} \newcommand{\zerovector}{\vect{0}} \newcommand{\zeromatrix}{\mathcal{O}} \newcommand{\innerproduct}[2]{\left\langle#1,\,#2\right\rangle} \newcommand{\norm}[1]{\left\lVert#1\right\rVert} \newcommand{\dimension}[1]{\dim\left(#1\right)} \newcommand{\nullity}[1]{n\left(#1\right)} \newcommand{\rank}[1]{r\left(#1\right)} \newcommand{\ds}{\oplus} \newcommand{\detname}[1]{\det\left(#1\right)} \newcommand{\detbars}[1]{\left\lvert#1\right\rvert} \newcommand{\trace}[1]{t\left(#1\right)} \newcommand{\sr}[1]{#1^{1/2}} \newcommand{\spn}[1]{\left\langle#1\right\rangle} \newcommand{\nsp}[1]{\mathcal{N}\!\left(#1\right)} \newcommand{\csp}[1]{\mathcal{C}\!\left(#1\right)} \newcommand{\rsp}[1]{\mathcal{R}\!\left(#1\right)} \newcommand{\lns}[1]{\mathcal{L}\!\left(#1\right)} \newcommand{\per}[1]{#1^\perp} \newcommand{\augmented}[2]{\left\lbrack\left.#1\,\right\rvert\,#2\right\rbrack} \newcommand{\linearsystem}[2]{\mathcal{LS}\!\left(#1,\,#2\right)} \newcommand{\homosystem}[1]{\linearsystem{#1}{\zerovector}} \newcommand{\rowopswap}[2]{R_{#1}\leftrightarrow R_{#2}} \newcommand{\rowopmult}[2]{#1R_{#2}} \newcommand{\rowopadd}[3]{#1R_{#2}+R_{#3}} \newcommand{\leading}[1]{\boxed{#1}} \newcommand{\rref}{\xrightarrow{\text{RREF}}} \newcommand{\elemswap}[2]{E_{#1,#2}} \newcommand{\elemmult}[2]{E_{#2}\left(#1\right)} \newcommand{\elemadd}[3]{E_{#2,#3}\left(#1\right)} \newcommand{\scalarlist}[2]{{#1}_{1},\,{#1}_{2},\,{#1}_{3},\,\ldots,\,{#1}_{#2}} \newcommand{\vect}[1]{\mathbf{#1}} \newcommand{\colvector}[1]{\begin{bmatrix}#1\end{bmatrix}} \newcommand{\vectorcomponents}[2]{\colvector{#1_{1}\\#1_{2}\\#1_{3}\\\vdots\\#1_{#2}}} \newcommand{\vectorlist}[2]{\vect{#1}_{1},\,\vect{#1}_{2},\,\vect{#1}_{3},\,\ldots,\,\vect{#1}_{#2}} \newcommand{\vectorentry}[2]{\left\lbrack#1\right\rbrack_{#2}} \newcommand{\matrixentry}[2]{\left\lbrack#1\right\rbrack_{#2}} \newcommand{\lincombo}[3]{#1_{1}\vect{#2}_{1}+#1_{2}\vect{#2}_{2}+#1_{3}\vect{#2}_{3}+\cdots +#1_{#3}\vect{#2}_{#3}} \newcommand{\matrixcolumns}[2]{\left\lbrack\vect{#1}_{1}|\vect{#1}_{2}|\vect{#1}_{3}|\ldots|\vect{#1}_{#2}\right\rbrack} \newcommand{\transpose}[1]{#1^{t}} \newcommand{\inverse}[1]{#1^{-1}} \newcommand{\submatrix}[3]{#1\left(#2|#3\right)} \newcommand{\adj}[1]{\transpose{\left(\conjugate{#1}\right)}} \newcommand{\adjoint}[1]{#1^\ast} \newcommand{\set}[1]{\left\{#1\right\}} \newcommand{\setparts}[2]{\left\lbrace#1\,\middle|\,#2\right\rbrace} \newcommand{\card}[1]{\left\lvert#1\right\rvert} \newcommand{\setcomplement}[1]{\overline{#1}} \newcommand{\charpoly}[2]{p_{#1}\left(#2\right)} \newcommand{\eigenspace}[2]{\mathcal{E}_{#1}\left(#2\right)} \newcommand{\eigensystem}[3]{\lambda&=#2&\eigenspace{#1}{#2}&=\spn{\set{#3}}} \newcommand{\geneigenspace}[2]{\mathcal{G}_{#1}\left(#2\right)} \newcommand{\algmult}[2]{\alpha_{#1}\left(#2\right)} \newcommand{\geomult}[2]{\gamma_{#1}\left(#2\right)} \newcommand{\indx}[2]{\iota_{#1}\left(#2\right)} \newcommand{\ltdefn}[3]{#1\colon #2\rightarrow#3} \newcommand{\lteval}[2]{#1\left(#2\right)} \newcommand{\ltinverse}[1]{#1^{-1}} \newcommand{\restrict}[2]{{#1}|_{#2}} \newcommand{\preimage}[2]{#1^{-1}\left(#2\right)} \newcommand{\rng}[1]{\mathcal{R}\!\left(#1\right)} \newcommand{\krn}[1]{\mathcal{K}\!\left(#1\right)} 
\newcommand{\compose}[2]{{#1}\circ{#2}} \newcommand{\vslt}[2]{\mathcal{LT}\left(#1,\,#2\right)} \newcommand{\isomorphic}{\cong} \newcommand{\similar}[2]{\inverse{#2}#1#2} \newcommand{\vectrepname}[1]{\rho_{#1}} \newcommand{\vectrep}[2]{\lteval{\vectrepname{#1}}{#2}} \newcommand{\vectrepinvname}[1]{\ltinverse{\vectrepname{#1}}} \newcommand{\vectrepinv}[2]{\lteval{\ltinverse{\vectrepname{#1}}}{#2}} \newcommand{\matrixrep}[3]{M^{#1}_{#2,#3}} \newcommand{\matrixrepcolumns}[4]{\left\lbrack \left.\vectrep{#2}{\lteval{#1}{\vect{#3}_{1}}}\right|\left.\vectrep{#2}{\lteval{#1}{\vect{#3}_{2}}}\right|\left.\vectrep{#2}{\lteval{#1}{\vect{#3}_{3}}}\right|\ldots\left|\vectrep{#2}{\lteval{#1}{\vect{#3}_{#4}}}\right.\right\rbrack} \newcommand{\cbm}[2]{C_{#1,#2}} \newcommand{\jordan}[2]{J_{#1}\left(#2\right)} \newcommand{\hadamard}[2]{#1\circ #2} \newcommand{\hadamardidentity}[1]{J_{#1}} \newcommand{\hadamardinverse}[1]{\widehat{#1}} \newcommand{\lt}{<} \newcommand{\gt}{>} \newcommand{\amp}{&} \)

Section LDS Linear Dependence and Spans

In any linearly dependent set there is always one vector that can be written as a linear combination of the others. This is the substance of the upcoming Theorem DLDS. Perhaps this will explain the use of the word “dependent.” In a linearly dependent set, at least one vector “depends” on the others (via a linear combination).

Indeed, because Theorem DLDS is an equivalence (Proof Technique E), some authors use this condition as a definition (Proof Technique D) of linear dependence. Then linear independence is defined as the logical opposite of linear dependence. Of course, we have chosen to take Definition LICV as our definition, and then follow with Theorem DLDS as a theorem.

Subsection LDSS Linearly Dependent Sets and Spans

If we use a linearly dependent set to construct a span, then we can always create the same infinite set with a starting set that is one vector smaller in size. We will illustrate this behavior in Example RSC5. However, this will not be possible if we build a span from a linearly independent set. So in a certain sense, using a linearly independent set to formulate a span is the best possible way: there are no extra vectors being used to build up all the necessary linear combinations. OK, here is the theorem, and then the example.

Theorem DLDS. Dependency in Linearly Dependent Sets.

Suppose that \(S=\set{\vectorlist{u}{n}}\) is a set of vectors. Then \(S\) is a linearly dependent set if and only if there is an index \(t\text{,}\) \(1\leq t\leq n\text{,}\) such that \(\vect{u}_{t}\) is a linear combination of the vectors \(\vect{u}_{1},\,\vect{u}_{2},\,\vect{u}_{3},\,\ldots,\,\vect{u}_{t-1},\,\vect{u}_{t+1},\,\ldots,\,\vect{u}_{n}\text{.}\)

Proof
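As a quick illustration of the theorem (a small example of our own, separate from the text's Example RSC5), consider the set \(S=\set{\vect{u}_1,\,\vect{u}_2,\,\vect{u}_3}\) below. The nontrivial relation of linear dependence on the left can be solved for \(\vect{u}_3\text{,}\) expressing it as a linear combination of the other two vectors. \begin{align*} \vect{u}_1+2\vect{u}_2-\vect{u}_3 &= \colvector{1\\1\\0}+2\colvector{0\\1\\1}-\colvector{1\\3\\2} =\zerovector & \vect{u}_3&=\vect{u}_1+2\vect{u}_2 \end{align*} Since \(\vect{u}_3\) is a linear combination of \(\vect{u}_1\) and \(\vect{u}_2\text{,}\) any linear combination of all three vectors can be rewritten using just the first two, so \(\spn{\set{\vect{u}_1,\,\vect{u}_2,\,\vect{u}_3}}=\spn{\set{\vect{u}_1,\,\vect{u}_2}}\text{.}\)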

This theorem can be used, sometimes repeatedly, to whittle down the size of a set of vectors used in a span construction. We have seen some of this already in Example SCAD, but in the next example we will detail some of the subtleties.

Subsection COV Casting Out Vectors

In Example RSC5 we used four vectors to create a span. With a relation of linear dependence in hand, we were able to “toss out” one of these four vectors and create the same span from a subset of just three vectors from the original set of four. We did have to take some care as to just which vector we tossed out. In the next example, we will be more methodical about just how we choose to eliminate vectors from a linearly dependent set while preserving a span.

Example COV deserves your careful attention, since this important example motivates the following very fundamental theorem.

Theorem BS. Basis of a Span.

Suppose that \(S=\set{\vectorlist{v}{n}}\) is a set of column vectors. Define \(W=\spn{S}\) and let \(A\) be the matrix whose columns are the vectors from \(S\text{.}\) Let \(B\) be a matrix in reduced row-echelon form that is row-equivalent to \(A\text{,}\) and suppose the pivot columns of \(B\) have indices \(D=\set{d_1,\,d_2,\,d_3,\,\ldots,\,d_r}\text{.}\) Then

1. \(T=\set{\vect{v}_{d_1},\,\vect{v}_{d_2},\,\vect{v}_{d_3},\,\ldots,\,\vect{v}_{d_r}}\) is a linearly independent set.

2. \(W=\spn{T}\text{.}\)

Proof

In Example COV, we tossed out vectors one at a time. But in each instance, we rewrote the offending vector as a linear combination of the vectors whose indices were the pivot column indices of the reduced row-echelon form of the matrix of columns. In the proof of Theorem BS, we accomplish this reduction in one big step. In Example COV we arrived at a linearly independent set at exactly the same moment that we ran out of free variables to exploit. This was not a coincidence; it is the substance of our conclusion of linear independence in Theorem BS.
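To make the casting-out recipe concrete, here is a small computational sketch of the Theorem BS procedure (our own illustration; the vectors and the use of SymPy's rref() are assumptions, as the text does not prescribe any software). We place the vectors of a spanning set as the columns of a matrix, row-reduce, and keep exactly the original columns whose indices are pivot column indices.

# Sketch of the Theorem BS procedure with SymPy (hypothetical data).
from sympy import Matrix

# Columns of A are the vectors of a (deliberately dependent) spanning set:
# column 1 is twice column 0, and column 3 is column 0 plus column 2.
A = Matrix([[1, 2, 0, 1],
            [2, 4, 1, 3],
            [1, 2, 1, 2]])

# rref() returns the reduced row-echelon form and the pivot column indices.
B, pivots = A.rref()
print(pivots)  # (0, 2): columns 0 and 2 are the pivot columns

# Theorem BS: the original columns indexed by the pivot columns form a
# linearly independent set whose span equals the span of all the columns.
T = [A.col(j) for j in pivots]
for v in T:
    print(v.T)  # [1, 2, 1] and [0, 1, 1], printed as rows

Note that SymPy indexes columns from 0, while the text numbers the vectors of a set from 1, so the pivot indices \((0,\,2)\) correspond to \(\vect{v}_{1}\) and \(\vect{v}_{3}\text{.}\)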

Here is a straightforward application of Theorem BS.

Subsection Reading Questions

1

Let \(S\) be the linearly dependent set of three vectors below. \begin{equation*} S=\set{\colvector{1\\10\\100\\1000},\,\colvector{1\\1\\1\\1},\,\colvector{5\\23\\203\\2003}} \end{equation*} Write one vector from \(S\) as a linear combination of the other two and include this vector equality in your response. (You should be able to do this on sight, rather than doing some computations.) Convert this expression into a nontrivial relation of linear dependence on \(S\text{.}\)

2

Explain why the word “dependent” is used in the definition of linear dependence.

3

Suppose that \(Y=\spn{P}=\spn{Q}\text{,}\) where \(P\) is a linearly dependent set and \(Q\) is linearly independent. Would you rather use \(P\) or \(Q\) to describe \(Y\text{?}\) Why?

Subsection Exercises

C20

Let \(T\) be the set of columns of the matrix \(B\) below. Define \(W=\spn{T}\text{.}\) Find a set \(R\) so that (1) \(R\) has 3 vectors, (2) \(R\) is a subset of \(T\text{,}\) and (3) \(W=\spn{R}\text{.}\) \begin{equation*} B= \begin{bmatrix} -3 & 1 & -2 & 7\\ -1 & 2 & 1 & 4\\ 1 & 1 & 2 & -1 \end{bmatrix} \end{equation*}

Solution
C40

Verify that the set \(R^\prime=\set{\vect{v}_1,\,\vect{v}_2,\,\vect{v}_4}\) at the end of Example RSC5 is linearly independent.

C50

Consider the set of vectors from \(\complex{3}\text{,}\) \(W\text{,}\) given below. Find a linearly independent set \(T\) that contains three vectors from \(W\) and such that \(\spn{W}=\spn{T}\text{.}\) \begin{equation*} W= \set{\vect{v}_1,\,\vect{v}_2,\,\vect{v}_3,\,\vect{v}_4,\,\vect{v}_5} =\set{ \colvector{2\\1\\1},\, \colvector{-1\\-1\\1},\, \colvector{1\\2\\3},\, \colvector{3\\1\\3},\, \colvector{0\\1\\-3} }\text{.} \end{equation*}

Solution
C51

Given the set \(S\) below, find a linearly independent set \(T\) so that \(\spn{T}=\spn{S}\text{.}\) \begin{equation*} S=\set{ \colvector{2\\-1\\2},\, \colvector{3\\0\\1},\, \colvector{1\\1\\-1},\, \colvector{5\\-1\\3} }\text{.} \end{equation*}

Solution
C52

Let \(W\) be the span of the set of vectors \(S\) below, \(W=\spn{S}\text{.}\) Find a set \(T\) so that (1) the span of \(T\) is \(W\text{,}\) \(\spn{T}=W\text{,}\) (2) \(T\) is a linearly independent set, and (3) \(T\) is a subset of \(S\text{.}\) \begin{align*} S&= \set{ \colvector{1 \\ 2 \\ -1},\, \colvector{2 \\ -3 \\ 1},\, \colvector{4 \\ 1 \\ -1},\, \colvector{3 \\ 1 \\ 1},\, \colvector{3 \\ -1 \\ 0} }\text{.} \end{align*}

Solution
C55

Let \(T\) be the set of vectors \begin{equation*} T=\set{ \colvector{1 \\ -1 \\ 2},\, \colvector{3 \\ 0 \\ 1},\, \colvector{4 \\ 2 \\ 3},\, \colvector{3 \\ 0 \\ 6} }\text{.} \end{equation*} Find two different subsets of \(T\text{,}\) named \(R\) and \(S\text{,}\) so that \(R\) and \(S\) each contain three vectors, and so that \(\spn{R}=\spn{T}\) and \(\spn{S}=\spn{T}\text{.}\) Prove that both \(R\) and \(S\) are linearly independent.

Solution
C70

Reprise Example RES by creating a new version of the vector \(\vect{y}\text{.}\) In other words, form a new, different linear combination of the vectors in \(R\) to create a new vector \(\vect{y}\) (but do not simplify the problem too much by choosing any of the five new scalars to be zero). Then express this new \(\vect{y}\) as a combination of the vectors in \(P\text{.}\)

M10

At the conclusion of Example RSC4 two alternative solutions, sets \(T^{\prime}\) and \(T^{*}\text{,}\) are proposed. Verify these claims by proving that \(\spn{T}=\spn{T^{\prime}}\) and \(\spn{T}=\spn{T^{*}}\text{.}\)

T40

Suppose that \(\vect{v}_1\) and \(\vect{v}_2\) are any two vectors from \(\complex{m}\text{.}\) Prove the following set equality. \begin{equation*} \spn{\set{\vect{v}_1,\,\vect{v}_2}} = \spn{\set{\vect{v}_1+\vect{v}_2,\,\vect{v}_1-\vect{v}_2}}\text{.} \end{equation*}

Solution