
Section LI: Linear Independence

Linear independence is one of the most fundamental conceptual ideas in linear algebra, along with the notion of a span. So this section, and the subsequent Section LDS, will explore this new idea.

Subsection LISV: Linearly Independent Sets of Vectors

Theorem SLSLC tells us that a solution to a homogeneous system of equations is a linear combination of the columns of the coefficient matrix that equals the zero vector. We used just this situation to our advantage (twice!) in Example SCAD, where we reduced the set of vectors used in a span construction from four down to two by declaring certain vectors as surplus. The next two definitions will allow us to formalize this situation.

Definition RLDCV: Relation of Linear Dependence for Column Vectors

Given a set of vectors \(S=\set{\vectorlist{u}{n}}\text{,}\) a true statement of the form \begin{equation*} \lincombo{\alpha}{u}{n}=\zerovector \end{equation*} is a relation of linear dependence on \(S\text{.}\) If this statement is formed in a trivial fashion, i.e. \(\alpha_i=0\text{,}\) \(1\leq i\leq n\text{,}\) then we say it is the trivial relation of linear dependence on \(S\text{.}\)
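For a quick illustration, take \(\vect{u}_1=\colvector{1\\2}\) and \(\vect{u}_2=\colvector{2\\4}\text{.}\) Then \begin{equation*} 2\vect{u}_1+(-1)\vect{u}_2=\colvector{2\\4}-\colvector{2\\4}=\zerovector \end{equation*} is a relation of linear dependence on \(\set{\vect{u}_1,\,\vect{u}_2}\text{,}\) and it is not trivial, since the scalars \(2\) and \(-1\) are not both zero.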

Definition LICV: Linear Independence of Column Vectors

The set of vectors \(S=\set{\vectorlist{u}{n}}\) is linearly dependent if there is a relation of linear dependence on \(S\) that is not trivial. If instead the only relation of linear dependence on \(S\) is the trivial one, then \(S\) is a linearly independent set of vectors.

Notice that a relation of linear dependence is an equation. Though its left-hand side is a linear combination, the relation itself is not a linear combination: a linear combination is a vector, while a relation is a statement about vectors. Linear independence is a property of a set of vectors. It is easy to take a set of vectors, and an equal number of scalars, all zero, and form a linear combination that equals the zero vector. When this easy way is the only way, we say the set is linearly independent. Here are a couple of examples.
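For instance, the set \(\set{\colvector{1\\0},\,\colvector{0\\1}}\) is linearly independent: any relation \begin{equation*} \alpha_1\colvector{1\\0}+\alpha_2\colvector{0\\1}=\colvector{\alpha_1\\\alpha_2}=\zerovector \end{equation*} forces \(\alpha_1=\alpha_2=0\text{,}\) so the trivial relation is the only relation of linear dependence on the set. By contrast, the set \(\set{\vect{u}_1,\,\vect{u}_2}\) in the illustration following Definition RLDCV is linearly dependent.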

Example LDS and Example LIS relied on solving a homogeneous system of equations to determine linear independence. We can codify this process in a time-saving theorem.


Since Theorem LIVHS is an equivalence, we can use it to determine the linear independence or dependence of any set of column vectors, just by creating a matrix and analyzing the row-reduced form. Let us illustrate this with two more examples.

Because Theorem LIVHS is an equivalence, the test is also easy to carry out with software.
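Here is a minimal sketch, assuming the SymPy library (our addition; the text performs these row reductions by hand):

```python
# Sketch of the Theorem LIVHS test: the set S is linearly independent
# exactly when the homogeneous system whose coefficient matrix has the
# vectors of S as columns admits only the trivial solution.
from sympy import Matrix

def is_linearly_independent(vectors):
    A = Matrix.hstack(*[Matrix(v) for v in vectors])  # vectors as columns
    return len(A.nullspace()) == 0  # empty null space basis: trivial only

print(is_linearly_independent([[1, 0], [0, 1]]))  # True: independent
print(is_linearly_independent([[1, 2], [2, 4]]))  # False: dependent
```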

Review Example LIHS and Example LDHS. They are very similar, differing only in the last two slots of the third vector. This resulted in slightly different matrices when row-reduced, and slightly different values of \(r\text{,}\) the number of nonzero rows. Notice, too, that we are less interested in the actual solution set, and more interested in its form or size. These observations allow us to make a slight improvement in Theorem LIVHS.
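The improved test amounts to counting nonzero rows after row reduction: with the \(n\) vectors of \(S\) as the columns of a matrix, the set is linearly independent exactly when \(r=n\text{.}\) A sketch, again assuming SymPy and using hypothetical data:

```python
from sympy import Matrix

# Hypothetical set of three vectors from C^3, placed as columns.
vectors = [[1, -1, 2], [2, 1, 1], [1, 5, -4]]
A = Matrix.hstack(*[Matrix(v) for v in vectors])

_, pivots = A.rref()        # rref() also reports the pivot columns
r, n = len(pivots), A.cols  # r nonzero rows, n vectors
print("independent" if r == n else "dependent")  # here r = 2 < 3 = n
```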


So now here is an example of the most straightforward way to determine if a set of column vectors is linearly independent or linearly dependent. While this method can be quick and easy, do not forget the logical progression from the definition of linear independence through homogeneous systems of equations that makes it possible.

The situation in Example LLDS is slick enough to warrant formulating as a theorem.
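As an illustration of the resulting theorem (our example): any set of three vectors from \(\complex{2}\text{,}\) such as \(\set{\colvector{1\\0},\,\colvector{0\\1},\,\colvector{1\\1}}\text{,}\) must be linearly dependent, since the associated homogeneous system has more variables than equations and therefore has nontrivial solutions. Indeed, \begin{equation*} \colvector{1\\0}+\colvector{0\\1}-\colvector{1\\1}=\zerovector \end{equation*} is a nontrivial relation of linear dependence on this set.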


Subsection LINM: Linear Independence and Nonsingular Matrices

We will now specialize to sets of \(n\) vectors from \(\complex{n}\text{.}\) This will put Theorem MVSLD off-limits, while Theorem LIVHS will involve square matrices. Let us begin by contrasting Archetype A and Archetype B.

That Archetype A and Archetype B have opposite properties for the columns of their coefficient matrices is no accident. Here is the theorem, and then we will update our equivalences for nonsingular matrices, Theorem NME1.

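A small computational sketch of this equivalence (ours, assuming SymPy and a hypothetical matrix): a square matrix is nonsingular exactly when its columns form a linearly independent set.

```python
from sympy import Matrix, eye

A = Matrix([[1, 2],
            [3, 5]])            # hypothetical square matrix
R, _ = A.rref()
print(R == eye(2))              # True: A row-reduces to the identity,
                                # so A is nonsingular (Theorem NMRRI)
print(len(A.nullspace()) == 0)  # True: equivalently, the columns of A
                                # are a linearly independent set
```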

Here is the update to Theorem NME1.


Subsection NSSLI: Null Spaces, Spans, Linear Independence

In Subsection SS.SSNS we proved Theorem SSNS, which provided \(n-r\) vectors that could be used with the span construction to build the entire null space of a matrix. As we hinted in Example SCAD, and as we will see again going forward, linearly dependent sets carry redundant vectors with them when used in building a set as a span. Our aim now is to show that the vectors provided by Theorem SSNS form a linearly independent set, so in one sense they are as efficient a description of the null space as possible. Notice that the vectors \(\vect{z}_j\text{,}\) \(1\leq j\leq n-r\text{,}\) first appear in the vector form of solutions to arbitrary linear systems (Theorem VFSLS). The exact same vectors appear again in the span construction in the conclusion of Theorem SSNS. Since this second theorem specializes to homogeneous systems, the only real difference is that the vector \(\vect{c}\) in Theorem VFSLS is the zero vector for a homogeneous system. Finally, Theorem BNS will now show that these same vectors are a linearly independent set. We will set the stage for the proof of this theorem with a moderately large example. Study the example carefully, as it will make it easier to understand the proof.

The proof of Theorem BNS is really quite straightforward, and relies on the “pattern of zeros and ones” that arises in the vectors \(\vect{z}_i\text{,}\) \(1\leq i\leq n-r\text{,}\) in the entries corresponding to the non-pivot columns. Play along with Example LINSB as you study the proof. Also, take a look at Example VFSAD, Example VFSAI and Example VFSAL, especially at the conclusion of Step 2 (temporarily ignoring the construction of the constant vector, \(\vect{c}\)). This proof is also a good first example of how to prove that a set is linearly independent.

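To see the pattern concretely, here is a sketch (ours, assuming SymPy; for this hypothetical matrix, the output of nullspace() matches the vectors \(\vect{z}_j\) of Theorem SSNS):

```python
from sympy import Matrix

B = Matrix([[1, 2, 0, -1],
            [0, 0, 1, 3]])  # hypothetical matrix, already in RREF;
                            # pivots in columns 1 and 3, free variables
                            # x_2 and x_4
for z in B.nullspace():
    print(z.T)
# Prints [-2, 1, 0, 0] and [1, 0, -3, 1]: each vector has a 1 in one
# free-variable entry and a 0 in the other, which is the pattern the
# proof of Theorem BNS exploits.
```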

Subsection Reading Questions

1

Let \(S\) be the set of three vectors below. \begin{equation*} S=\set{\colvector{1\\2\\-1},\,\colvector{3\\-4\\2},\,\colvector{4\\-2\\1}} \end{equation*} Is \(S\) linearly independent or linearly dependent? Explain why.

2

Let \(S\) be the set of three vectors below. \begin{equation*} S=\set{\colvector{1\\-1\\0},\,\colvector{3\\2\\2},\,\colvector{4\\3\\-4}} \end{equation*} Is \(S\) linearly independent or linearly dependent? Explain why.

3

Is the matrix below singular or nonsingular? Explain your answer using only the final conclusion you reached in the previous question, along with one new theorem. \begin{equation*} \begin{bmatrix} 1 &3 &4\\ -1 &2 &3\\ 0 &2 &-4 \end{bmatrix} \end{equation*}

Subsection Exercises

Determine if the sets of vectors in Exercises C20–C25 are linearly independent or linearly dependent. When the set is linearly dependent, exhibit a nontrivial relation of linear dependence.
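For checking answers (a sketch of ours, assuming SymPy; the exercises are meant to be done by hand), note that any null space basis vector of the matrix whose columns are the given vectors supplies the scalars of a nontrivial relation of linear dependence:

```python
from sympy import Matrix

def nontrivial_relation(vectors):
    # Place the vectors as columns; a nullspace basis vector lists
    # scalars alpha_1, ..., alpha_n for a nontrivial relation.
    A = Matrix.hstack(*[Matrix(v) for v in vectors])
    basis = A.nullspace()
    return None if not basis else list(basis[0])  # None: independent

print(nontrivial_relation([[1, 2], [2, 4]]))  # [-2, 1]: -2*u1 + u2 = 0
```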

C20

\(\set{\colvector{1\\-2\\1},\,\colvector{2\\-1\\3},\,\colvector{1\\5\\0}}\)

C21

\(\set{\colvector{-1\\2\\4\\2},\,\colvector{3\\3\\-1\\3},\,\colvector{7\\3\\-6\\4}}\)

C22

\(\set{ \colvector{-2\\-1\\-1},\, \colvector{1\\0\\-1},\, \colvector{3\\3\\6},\, \colvector{-5\\-4\\-6},\, \colvector{4\\4\\7} }\)

C23

\(\set{ \colvector{1\\-2\\2\\5\\3},\, \colvector{3\\3\\1\\2\\-4},\, \colvector{2\\1\\2\\-1\\1},\, \colvector{1\\0\\1\\2\\2} }\)

C24

\(\set{ \colvector{1\\2\\-1\\0\\1},\, \colvector{3\\2\\-1\\2\\2},\, \colvector{4\\4\\-2\\2\\3},\, \colvector{-1\\2\\-1\\-2\\0} }\)

C25

\(\set{ \colvector{2\\1\\3\\-1\\2},\, \colvector{4\\-2\\1\\3\\2},\, \colvector{10\\-7\\0\\10\\4} }\)

C30

For the matrix \(B\) below, find a set \(S\) that is linearly independent and spans the null space of \(B\text{,}\) that is, \(\nsp{B}=\spn{S}\text{.}\) \begin{equation*} B= \begin{bmatrix} -3 & 1 & -2 & 7\\ -1 & 2 & 1 & 4\\ 1 & 1 & 2 & -1 \end{bmatrix} \end{equation*}

C31

For the matrix \(A\) below, find a linearly independent set \(S\) so that the null space of \(A\) is spanned by \(S\text{,}\) that is, \(\nsp{A}=\spn{S}\text{.}\) \begin{equation*} A= \begin{bmatrix} -1 & -2 & 2 & 1 & 5 \\ 1 & 2 & 1 & 1 & 5 \\ 3 & 6 & 1 & 2 & 7 \\ 2 & 4 & 0 & 1 & 2 \end{bmatrix} \end{equation*}

C32

Find a set of column vectors, \(T\text{,}\) such that (1) the span of \(T\) is the null space of \(B\text{,}\) \(\spn{T}=\nsp{B}\) and (2) \(T\) is a linearly independent set. \begin{align*} B&= \begin{bmatrix} 2 & 1 & 1 & 1 \\ -4 & -3 & 1 & -7 \\ 1 & 1 & -1 & 3 \end{bmatrix} \end{align*}

C33

Find a set \(S\) so that \(S\) is linearly independent and \(\nsp{A}=\spn{S}\text{,}\) where \(\nsp{A}\) is the null space of the matrix \(A\) below. \begin{align*} A&= \begin{bmatrix} 2 & 3 & 3 & 1 & 4 \\ 1 & 1 & -1 & -1 & -3 \\ 3 & 2 & -8 & -1 & 1 \end{bmatrix} \end{align*}

C50

Consider each archetype that is a system of equations and consider the solutions listed for the homogeneous version of the archetype. (If only the trivial solution is listed, then assume this is the only solution to the system.) From the solution set, determine if the columns of the coefficient matrix form a linearly independent or linearly dependent set. In the case of a linearly dependent set, use one of the sample solutions to provide a nontrivial relation of linear dependence on the set of columns of the coefficient matrix (Definition RLDCV). Indicate when Theorem MVSLD applies and connect this with the number of variables and equations in the system of equations.

Archetype A, Archetype B, Archetype C, Archetype D/Archetype E, Archetype F, Archetype G/Archetype H, Archetype I, Archetype J

C51

For each archetype that is a system of equations consider the homogeneous version. Write elements of the solution set in vector form (Theorem VFSLS) and from this extract the vectors \(\vect{z}_j\) described in Theorem BNS. These vectors are used in a span construction to describe the null space of the coefficient matrix for each archetype. What does it mean when we write a null space as \(\spn{\set{\ }}\text{?}\)

Archetype A, Archetype B, Archetype C, Archetype D/Archetype E, Archetype F, Archetype G/Archetype H, Archetype I, Archetype J

C52

For each archetype that is a system of equations consider the homogeneous version. Sample solutions are given and a linearly independent spanning set is given for the null space of the coefficient matrix. Write each of the sample solutions individually as a linear combination of the vectors in the spanning set for the null space of the coefficient matrix.

Archetype A, Archetype B, Archetype C, Archetype D/Archetype E, Archetype F, Archetype G/Archetype H, Archetype I, Archetype J

C60

For the matrix \(A\) below, find a set of vectors \(S\) so that (1) \(S\) is linearly independent, and (2) the span of \(S\) equals the null space of \(A\text{,}\) \(\spn{S}=\nsp{A}\text{.}\) (See Exercise SS.C60.) \begin{equation*} A= \begin{bmatrix} 1 & 1 & 6 & -8\\ 1 & -2 & 0 & 1\\ -2 & 1 & -6 & 7 \end{bmatrix} \end{equation*}

M20

Suppose that \(S=\set{\vect{v}_1,\,\vect{v}_2,\,\vect{v}_3}\) is a set of three vectors from \(\complex{873}\text{.}\) Prove that the set \begin{align*} T&= \set{ 2\vect{v}_1+3\vect{v}_2+\vect{v}_3,\, \vect{v}_1-\vect{v}_2-2\vect{v}_3,\, 2\vect{v}_1+\vect{v}_2-\vect{v}_3 } \end{align*} is linearly dependent.

M21

Suppose that \(S=\set{\vect{v}_1,\,\vect{v}_2,\,\vect{v}_3}\) is a linearly independent set of three vectors from \(\complex{873}\text{.}\) Prove that the set \begin{align*} T&= \set{ 2\vect{v}_1+3\vect{v}_2+\vect{v}_3,\, \vect{v}_1-\vect{v}_2+2\vect{v}_3,\, 2\vect{v}_1+\vect{v}_2-\vect{v}_3 } \end{align*} is linearly independent.

M50

Consider the set of vectors from \(\complex{3}\text{,}\) \(W\text{,}\) given below. Find a set \(T\) that contains three vectors from \(W\) and such that \(W=\spn{T}\text{.}\) \begin{equation*} W=\spn{\set{\vect{v}_1,\,\vect{v}_2,\,\vect{v}_3,\,\vect{v}_4,\,\vect{v}_5}}=\spn{\set{\colvector{2\\1\\1},\,\colvector{-1\\-1\\1},\,\colvector{1\\2\\3},\,\colvector{3\\1\\3},\,\colvector{0\\1\\-3}}} \end{equation*}

M51

Consider the subspace \(W=\spn{\set{\vect{v}_1,\,\vect{v}_2,\,\vect{v}_3,\,\vect{v}_4}}\text{.}\) Find a set \(S\) so that (1) \(S\) is a subset of \(W\text{,}\) (2) \(S\) is linearly independent, and (3) \(W=\spn{S}\text{.}\) Write each vector not included in \(S\) as a linear combination of the vectors that are in \(S\text{.}\) \begin{align*} \vect{v}_1&=\colvector{1\\-1\\2} & \vect{v}_2&=\colvector{4\\-4\\8} & \vect{v}_3&=\colvector{-3\\2\\-7} & \vect{v}_4&=\colvector{2\\1\\7} \end{align*}

T10

Prove that if a set of vectors contains the zero vector, then the set is linearly dependent. (Ed. “The zero vector is death to linearly independent sets.”)

T12

Suppose that \(S\) is a linearly independent set of vectors, and \(T\) is a subset of \(S\text{,}\) \(T\subseteq S\) (Definition SSET). Prove that \(T\) is linearly independent.

T13

Suppose that \(T\) is a linearly dependent set of vectors, and \(T\) is a subset of \(S\text{,}\) \(T\subseteq S\) (Definition SSET). Prove that \(S\) is linearly dependent.

T15

Suppose that \(\set{\vectorlist{v}{n}}\) is a set of vectors. Prove that \begin{gather*} \set{\vect{v}_1-\vect{v}_2,\,\vect{v}_2-\vect{v}_3,\,\vect{v}_3-\vect{v}_4,\,\dots,\,\vect{v}_n-\vect{v}_1} \end{gather*} is a linearly dependent set.

T20

Suppose that \(\set{\vect{v}_1,\,\vect{v}_2,\,\vect{v}_3,\,\vect{v}_4}\) is a linearly independent set in \(\complex{35}\text{.}\) Prove that \begin{equation*} \set{ \vect{v}_1,\, \vect{v}_1+\vect{v}_2,\, \vect{v}_1+\vect{v}_2+\vect{v}_3,\, \vect{v}_1+\vect{v}_2+\vect{v}_3+\vect{v}_4 } \end{equation*} is a linearly independent set.

T50

Suppose that \(A\) is an \(m\times n\) matrix with linearly independent columns and the linear system \(\linearsystem{A}{\vect{b}}\) is consistent. Show that this system has a unique solution. (Notice that we are not requiring \(A\) to be square.)
