
Section LISS Linear Independence and Spanning Sets

A vector space is defined as a set with two operations, meeting ten properties (Definition VS). Just as the definition of the span of a set of vectors only required knowing how to add vectors and how to multiply vectors by scalars, so it is with linear independence. A definition of a linearly independent set of vectors in an arbitrary vector space requires only knowing how to form linear combinations and equate them with the zero vector. Since every vector space must have a zero vector (Property Z), we always have a zero vector at our disposal.

In this section we will also put a twist on the notion of the span of a set of vectors. Rather than beginning with a set of vectors and creating a subspace that is the span, we will instead begin with a subspace and look for a set of vectors whose span equals the subspace.

The combination of linear independence and spanning will be very important going forward.

Subsection LI Linear Independence

Our previous definition of linear independence (Definition LICV) employed a relation of linear dependence that was a linear combination on one side of an equality and a zero vector on the other side. As a linear combination in a vector space (Definition LC) depends only on vector addition and scalar multiplication, and every vector space must have a zero vector (Property Z), we can extend our definition of linear independence from the setting of \(\complex{m}\) to the setting of a general vector space \(V\) with almost no changes. Compare these next two definitions with Definition RLDCV and Definition LICV.

Definition RLD Relation of Linear Dependence

Suppose that \(V\) is a vector space. Given a set of vectors \(S=\set{\vectorlist{u}{n}}\text{,}\) an equation of the form \begin{equation*} \lincombo{\alpha}{u}{n}=\zerovector \end{equation*} is a relation of linear dependence on \(S\text{.}\) If this equation is formed in a trivial fashion, i.e. \(\alpha_i=0\text{,}\) \(1\leq i\leq n\text{,}\) then we say it is a trivial relation of linear dependence on \(S\text{.}\)

Definition LI Linear Independence

Suppose that \(V\) is a vector space. The set of vectors \(S=\set{\vectorlist{u}{n}}\) from \(V\) is linearly dependent if there is a relation of linear dependence on \(S\) that is not trivial. In the case where the only relation of linear dependence on \(S\) is the trivial one, then \(S\) is a linearly independent set of vectors.

Notice the emphasis on the word “only.” This might remind you of the definition of a nonsingular matrix, where if the matrix is employed as the coefficient matrix of a homogeneous system then the only solution is the trivial one.
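The nonsingular-matrix analogy also suggests a computation. In a vector space like \(M_{22}\text{,}\) a relation of linear dependence becomes, entry by entry, a homogeneous system of equations, so the definition can be tested mechanically. Here is a minimal sketch in Python with SymPy, using three hypothetical matrices chosen only for illustration.

```python
import sympy as sp

# Three hypothetical matrices in M22 (illustration only).
M1 = sp.Matrix([[1, 2], [0, 1]])
M2 = sp.Matrix([[0, 1], [1, 0]])
M3 = sp.Matrix([[1, 0], [1, 1]])

# A relation of linear dependence a1*M1 + a2*M2 + a3*M3 = O is, entry by
# entry, the homogeneous system A*[a1, a2, a3]^T = 0, where each column
# of A is one of the matrices flattened into a vector.
A = sp.Matrix.hstack(*[M.reshape(4, 1) for M in (M1, M2, M3)])

# The set is linearly independent exactly when the only relation of
# linear dependence is the trivial one, i.e. the null space of A is {0}.
print(A.nullspace())        # [] here, so this set is linearly independent
print(A.rank() == A.cols)   # equivalent test: True
```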

Subsection SS Spanning Sets

In a vector space \(V\text{,}\) suppose we are given a set of vectors \(S\subseteq V\text{.}\) Then we can immediately construct a subspace, \(\spn{S}\text{,}\) using Definition SS and then be assured by Theorem SSS that the construction does provide a subspace. We now turn the situation upside-down. Suppose we are first given a subspace \(W\subseteq V\text{.}\) Can we find a set \(S\) so that \(\spn{S}=W\text{?}\) Typically \(W\) is infinite and we are searching for a finite set of vectors \(S\) that we can combine in linear combinations and “build” all of \(W\text{.}\)

I like to think of \(S\) as the raw materials that are sufficient for the construction of \(W\text{.}\) If you have nails, lumber, wire, copper pipe, drywall, plywood, carpet, shingles, paint (and a few other things), then you can combine them in many different ways to create a house (or infinitely many different houses for that matter). A fast-food restaurant may have beef, chicken, beans, cheese, tortillas, taco shells and hot sauce and from this small list of ingredients build a wide variety of items for sale. Or maybe a better analogy comes from Ben Cordes — the additive primary colors (red, green and blue) can be combined to create many different colors by varying the intensity of each. The intensity is like a scalar multiple, and the combination of the three intensities is like vector addition. The three individual colors, red, green and blue, are the elements of the spanning set.

Because we will use terms like “spanned by” and “spanning set,” there is the potential for confusion with “the span.” Come back and reread the first paragraph of this subsection whenever you are uncertain about the difference. Here is the working definition.

Definition SSVS Spanning Set of a Vector Space

Suppose \(V\) is a vector space. A subset \(S\) of \(V\) is a spanning set of \(V\) if \(\spn{S}=V\text{.}\) In this case, we also frequently say \(S\) spans \(V\text{.}\)

The definition of a spanning set requires that two sets (subspaces actually) be equal. If \(S\) is a subset of \(V\text{,}\) then \(\spn{S}\subseteq V\text{,}\) always. Thus it is usually only necessary to prove that \(V\subseteq\spn{S}\text{.}\) Now would be a good time to review Definition SE.
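When the vectors of \(V\) have a natural description by coordinates, the containment \(V\subseteq\spn{S}\) can often be checked with a rank computation: every vector of \(V\) must be a linear combination of the elements of \(S\text{.}\) Below is a minimal sketch in Python with SymPy, using a hypothetical candidate spanning set for \(P_2\text{,}\) the polynomials of degree 2 or less.

```python
import sympy as sp

# Hypothetical candidate spanning set for P2, each polynomial recorded
# by its coefficient vector relative to the monomials 1, x, x^2.
S = [sp.Matrix([1, 1, 0]),   # 1 + x
     sp.Matrix([0, 1, 1]),   # x + x^2
     sp.Matrix([1, 0, 1])]   # 1 + x^2

A = sp.Matrix.hstack(*S)

# Every polynomial of P2 is a linear combination of the candidates
# exactly when the columns of A reach all of C^3, i.e. rank(A) = 3.
print(A.rank() == 3)   # True: this candidate set spans P2
```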

Given a subspace and a set of vectors, as in Example SSP4, it can take some work to determine that the set actually is a spanning set. An even harder problem is to be confronted with a subspace and required to construct a spanning set with no guidance. We will now work an example of this flavor, but some of the steps will be unmotivated. Fortunately, we will have some better tools for this type of problem later on.
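One small sketch of this kind of construction, in Python with SymPy, appears below. The subspace is hypothetical, chosen only for illustration: the matrices in \(M_{22}\) whose diagonal entries sum to zero. Solve the defining equation, write down a generic element of the subspace, and read off one “building block” per free variable.

```python
import sympy as sp

# Hypothetical subspace W of M22: matrices [[a, b], [c, d]] with a + d = 0.
# Solving the defining equation gives d = -a, leaving a, b, c free.
a, b, c = sp.symbols('a b c')
generic = sp.Matrix([[a, b], [c, -a]])

# One spanning matrix per free variable, read off by differentiating
# the generic element with respect to that variable.
S = [generic.diff(s) for s in (a, b, c)]
print(S)
# [Matrix([[1, 0], [0, -1]]), Matrix([[0, 1], [0, 0]]), Matrix([[0, 0], [1, 0]])]

# Check: a generic element of W is exactly the linear combination
# a*S[0] + b*S[1] + c*S[2], so span(S) = W.
assert a*S[0] + b*S[1] + c*S[2] == generic
```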

Subsection VR Vector Representation

In Chapter R we will take up the matter of representations fully, where Theorem VRRB will be critical for Definition VR. We will now motivate and prove a key theorem that tells us how to “represent” a vector. This theorem could wait, but working with it now will provide some extra insight into the nature of linearly independent spanning sets. Here is the theorem.

Theorem VRRB Vector Representation Relative to a Basis

Suppose that \(V\) is a vector space and \(B=\set{\vectorlist{v}{m}}\) is a linearly independent set that spans \(V\text{.}\) Let \(\vect{w}\) be any vector in \(V\text{.}\) Then there exist unique scalars \(\scalarlist{a}{m}\) such that \begin{equation*} \vect{w}=\lincombo{a}{v}{m}\text{.} \end{equation*}

Proof

Since \(B\) spans \(V\text{,}\) we have \(\vect{w}\in V=\spn{B}\text{,}\) so at least one such list of scalars exists (Definition SS). To see that the scalars are unique, suppose we have two representations, \begin{equation*} \vect{w}=\lincombo{a}{v}{m} \end{equation*} and \begin{equation*} \vect{w}=\lincombo{b}{v}{m}\text{.} \end{equation*} Then \begin{equation*} \zerovector=\vect{w}-\vect{w}=\left(a_{1}-b_{1}\right)\vect{v}_{1}+\left(a_{2}-b_{2}\right)\vect{v}_{2}+\left(a_{3}-b_{3}\right)\vect{v}_{3}+\cdots+\left(a_{m}-b_{m}\right)\vect{v}_{m} \end{equation*} is a relation of linear dependence on \(B\) (Definition RLD). Because \(B\) is linearly independent, this relation must be the trivial one (Definition LI), so \(a_{i}-b_{i}=0\text{,}\) that is, \(a_{i}=b_{i}\text{,}\) for \(1\leq i\leq m\text{.}\) The two representations are identical, so the scalars representing \(\vect{w}\) are unique.

The converse of Theorem VRRB is true as well, but is not important enough to rise beyond an exercise (see Exercise LISS.T51).

This is a very typical use of the hypothesis that a set is linearly independent — obtain a relation of linear dependence and then conclude that the scalars must all be zero. The result of this theorem tells us that we can write any vector in a vector space as a linear combination of the vectors in a linearly independent spanning set, but only just. There is only enough raw material in the spanning set to write each vector one way as a linear combination. So in this sense, we could call a linearly independent spanning set a “minimal spanning set.” These sets are so important that we will give them a simpler name (“basis”) and explore their properties further in the next section.
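In computational terms, Theorem VRRB says the scalars in such a representation are well defined, so they can be found by solving a single linear system. A minimal sketch in Python with SymPy, using a hypothetical linearly independent spanning set of \(\complex{3}\text{:}\)

```python
import sympy as sp

# Hypothetical linearly independent spanning set of C^3.
v1 = sp.Matrix([1, 0, 1])
v2 = sp.Matrix([0, 1, 1])
v3 = sp.Matrix([1, 1, 0])
A = sp.Matrix.hstack(v1, v2, v3)
w = sp.Matrix([2, 3, 5])

# A is square with nonzero determinant, so A*x = w has exactly one
# solution: the unique scalars promised by Theorem VRRB.
assert A.det() != 0
x = A.solve(w)
print(x.T)   # Matrix([[2, 3, 0]]), i.e. w = 2*v1 + 3*v2 + 0*v3
assert 2*v1 + 3*v2 + 0*v3 == w
```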

Subsection Reading Questions

1

Is the set of matrices below linearly independent or linearly dependent in the vector space \(M_{22}\text{?}\) Why or why not? \begin{equation*} \set{ \begin{bmatrix} 1&3\\-2&4 \end{bmatrix},\, \begin{bmatrix} -2&3\\3&-5 \end{bmatrix},\, \begin{bmatrix} 0&9\\-1&3 \end{bmatrix} } \end{equation*}

2

Explain the difference between the following two uses of the term “span”:

  1. \(S\) is a subset of the vector space \(V\) and the span of \(S\) is a subspace of \(V\text{.}\)
  2. \(W\) is a subspace of the vector space \(Y\) and \(T\) spans \(W\text{.}\)
3

The set \begin{equation*} S=\set{ \colvector{6\\2\\1},\, \colvector{4\\-3\\1},\, \colvector{5\\8\\2} } \end{equation*} is linearly independent and spans \(\complex{3}\text{.}\) Write the vector \(\vect{x}=\colvector{-6\\2\\2}\) as a linear combination of the elements of \(S\text{.}\) How many ways are there to answer this question, and which theorem allows you to say so?

Subsection Exercises

C20

In the vector space of \(2\times 2\) matrices, \(M_{22}\text{,}\) determine if the set \(S\) below is linearly independent. \begin{equation*} S=\set{ \begin{bmatrix} 2&-1\\1&3 \end{bmatrix},\, \begin{bmatrix} 0&4\\-1&2 \end{bmatrix},\, \begin{bmatrix} 4&2\\1&3 \end{bmatrix} } \end{equation*}

Solution
C21

In the crazy vector space \(C\) (Example CVS), is the set \(S=\set{(0,\,2),\ (2,\,8)}\) linearly independent?

Solution
C22

In the vector space of polynomials \(P_3\text{,}\) determine if the set \(S\) is linearly independent or linearly dependent. \begin{equation*} S=\set{2 +x -3x^2 -8x^3,\, 1+ x + x^2 +5x^3,\, 3 -4x^2 -7x^3} \end{equation*}

Solution
C23

Determine if the set \(S=\set{(3,\,1),\,(7,\,3)}\) is linearly independent in the crazy vector space \(C\) (Example CVS).

Solution
C24

In the vector space of real-valued functions \(F = \setparts{f}{f:\mathbb{R}\rightarrow\mathbb{R}}\text{,}\) determine if the following set \(S\) is linearly independent. \begin{equation*} S = \set{\sin^2{x}, \cos^2{x}, 2} \end{equation*}

Solution
C25

Let \begin{equation*} S = \set{ \begin{bmatrix} 1& 2\\2 & 1 \end{bmatrix}, \begin{bmatrix} 2 & 1\\ -1 & 2\end{bmatrix}, \begin{bmatrix} 0 & 1\\ 1 & 2\end{bmatrix} }\text{.} \end{equation*}

  1. Determine if \(S\) spans \(M_{22}\text{.}\)
  2. Determine if \(S\) is linearly independent.
Solution
C26

Let \begin{equation*} S = \set{ \begin{bmatrix} 1& 2\\2 & 1 \end{bmatrix}, \begin{bmatrix} 2 & 1\\ -1 & 2\end{bmatrix}, \begin{bmatrix} 0 & 1\\ 1 & 2\end{bmatrix}, \begin{bmatrix} 1 & 0\\1 & 1 \end{bmatrix}, \begin{bmatrix} 1 & 4\\0 & 3\end{bmatrix} }\text{.} \end{equation*}

  1. Determine if \(S\) spans \(M_{22}\text{.}\)
  2. Determine if \(S\) is linearly independent.
Solution
C30

In Example LIM32, find another nontrivial relation of linear dependence on the linearly dependent set of \(3\times 2\) matrices, \(S\text{.}\)

C40

Determine if the set \(T=\set{x^2-x+5,\,4x^3-x^2+5x,\,3x+2}\) spans the vector space of polynomials with degree 4 or less, \(P_4\text{.}\)

Solution
C41

The set \(W\) is a subspace of \(M_{22}\text{,}\) the vector space of all \(2\times 2\) matrices. Prove that \(S\) is a spanning set for \(W\text{.}\) \begin{align*} W&=\setparts{\begin{bmatrix}a&b\\c&d\end{bmatrix}}{2a-3b+4c-d=0} & S=&\set{ \begin{bmatrix}1&0\\0&2\end{bmatrix},\, \begin{bmatrix}0&1\\0&-3\end{bmatrix},\, \begin{bmatrix}0&0\\1&4\end{bmatrix} } \end{align*}

Solution
C42

Determine if the set \(S=\set{(3,\,1),\,(7,\,3)}\) spans the crazy vector space \(C\) (Example CVS).

Solution
M10

Halfway through Example SSP4, we need to show that the system of equations \begin{equation*} \linearsystem{ \begin{bmatrix} 0 & 0 & 0 & 1\\ 0 & 0 & 1 & -8\\ 0 & 1 & -6 & 24\\ 1 & -4 & 12 & -32\\ -2 & 4 & -8 & 16\\ \end{bmatrix} } {\colvector{a\\b\\c\\d\\e}} \end{equation*} is consistent for every choice of the vector of constants satisfying \(16a+8b+4c+2d+e=0\text{.}\)

Express the column space of the coefficient matrix of this system as a null space, using Theorem FS. From this, use Theorem CSCS to establish that the system is always consistent. Notice that this approach removes from Example SSP4 the need to row-reduce a symbolic matrix.

Solution
T20

Suppose that \(S\) is a finite linearly independent set of vectors from the vector space \(V\text{.}\) Let \(T\) be any subset of \(S\text{.}\) Prove that \(T\) is linearly independent.

Solution
T40

Prove the following variant of Theorem EMMVP that has a weaker hypothesis: Suppose that \(C=\set{\vectorlist{u}{p}}\) is a linearly independent spanning set for \(\complex{n}\text{.}\) Suppose also that \(A\) and \(B\) are \(m\times n\) matrices such that \(A\vect{u}_i=B\vect{u}_i\) for every \(1\leq i\leq p\text{.}\) Then \(A=B\text{.}\)

Can you weaken the hypothesis even further while still preserving the conclusion?

T50

Suppose that \(V\) is a vector space and \(\vect{u},\,\vect{v}\in V\) are two vectors in \(V\text{.}\) Use the definition of linear independence to prove that \(S=\set{\vect{u},\,\vect{v}}\) is a linearly dependent set if and only if one of the two vectors is a scalar multiple of the other. Prove this directly in the context of an abstract vector space (\(V\)), without simply giving an upgraded version of Theorem DLDS for the special case of just two vectors.

Solution
T51

Carefully formulate the converse of Theorem VRRB and provide a proof.

Solution