\(\newcommand{\orderof}[1]{\sim #1} \newcommand{\Z}{\mathbb{Z}} \newcommand{\reals}{\mathbb{R}} \newcommand{\real}[1]{\mathbb{R}^{#1}} \newcommand{\complexes}{\mathbb{C}} \newcommand{\complex}[1]{\mathbb{C}^{#1}} \newcommand{\conjugate}[1]{\overline{#1}} \newcommand{\modulus}[1]{\left\lvert#1\right\rvert} \newcommand{\zerovector}{\vect{0}} \newcommand{\zeromatrix}{\mathcal{O}} \newcommand{\innerproduct}[2]{\left\langle#1,\,#2\right\rangle} \newcommand{\norm}[1]{\left\lVert#1\right\rVert} \newcommand{\dimension}[1]{\dim\left(#1\right)} \newcommand{\nullity}[1]{n\left(#1\right)} \newcommand{\rank}[1]{r\left(#1\right)} \newcommand{\ds}{\oplus} \newcommand{\detname}[1]{\det\left(#1\right)} \newcommand{\detbars}[1]{\left\lvert#1\right\rvert} \newcommand{\trace}[1]{t\left(#1\right)} \newcommand{\sr}[1]{#1^{1/2}} \newcommand{\spn}[1]{\left\langle#1\right\rangle} \newcommand{\nsp}[1]{\mathcal{N}\!\left(#1\right)} \newcommand{\csp}[1]{\mathcal{C}\!\left(#1\right)} \newcommand{\rsp}[1]{\mathcal{R}\!\left(#1\right)} \newcommand{\lns}[1]{\mathcal{L}\!\left(#1\right)} \newcommand{\per}[1]{#1^\perp} \newcommand{\augmented}[2]{\left\lbrack\left.#1\,\right\rvert\,#2\right\rbrack} \newcommand{\linearsystem}[2]{\mathcal{LS}\!\left(#1,\,#2\right)} \newcommand{\homosystem}[1]{\linearsystem{#1}{\zerovector}} \newcommand{\rowopswap}[2]{R_{#1}\leftrightarrow R_{#2}} \newcommand{\rowopmult}[2]{#1R_{#2}} \newcommand{\rowopadd}[3]{#1R_{#2}+R_{#3}} \newcommand{\leading}[1]{\boxed{#1}} \newcommand{\rref}{\xrightarrow{\text{RREF}}} \newcommand{\elemswap}[2]{E_{#1,#2}} \newcommand{\elemmult}[2]{E_{#2}\left(#1\right)} \newcommand{\elemadd}[3]{E_{#2,#3}\left(#1\right)} \newcommand{\scalarlist}[2]{{#1}_{1},\,{#1}_{2},\,{#1}_{3},\,\ldots,\,{#1}_{#2}} \newcommand{\vect}[1]{\mathbf{#1}} \newcommand{\colvector}[1]{\begin{bmatrix}#1\end{bmatrix}} \newcommand{\vectorcomponents}[2]{\colvector{#1_{1}\\#1_{2}\\#1_{3}\\\vdots\\#1_{#2}}} \newcommand{\vectorlist}[2]{\vect{#1}_{1},\,\vect{#1}_{2},\,\vect{#1}_{3},\,\ldots,\,\vect{#1}_{#2}} \newcommand{\vectorentry}[2]{\left\lbrack#1\right\rbrack_{#2}} \newcommand{\matrixentry}[2]{\left\lbrack#1\right\rbrack_{#2}} \newcommand{\lincombo}[3]{#1_{1}\vect{#2}_{1}+#1_{2}\vect{#2}_{2}+#1_{3}\vect{#2}_{3}+\cdots +#1_{#3}\vect{#2}_{#3}} \newcommand{\matrixcolumns}[2]{\left\lbrack\vect{#1}_{1}|\vect{#1}_{2}|\vect{#1}_{3}|\ldots|\vect{#1}_{#2}\right\rbrack} \newcommand{\transpose}[1]{#1^{t}} \newcommand{\inverse}[1]{#1^{-1}} \newcommand{\submatrix}[3]{#1\left(#2|#3\right)} \newcommand{\adj}[1]{\transpose{\left(\conjugate{#1}\right)}} \newcommand{\adjoint}[1]{#1^\ast} \newcommand{\set}[1]{\left\{#1\right\}} \newcommand{\setparts}[2]{\left\lbrace#1\,\middle|\,#2\right\rbrace} \newcommand{\card}[1]{\left\lvert#1\right\rvert} \newcommand{\setcomplement}[1]{\overline{#1}} \newcommand{\charpoly}[2]{p_{#1}\left(#2\right)} \newcommand{\eigenspace}[2]{\mathcal{E}_{#1}\left(#2\right)} \newcommand{\eigensystem}[3]{\lambda&=#2&\eigenspace{#1}{#2}&=\spn{\set{#3}}} \newcommand{\geneigenspace}[2]{\mathcal{G}_{#1}\left(#2\right)} \newcommand{\algmult}[2]{\alpha_{#1}\left(#2\right)} \newcommand{\geomult}[2]{\gamma_{#1}\left(#2\right)} \newcommand{\indx}[2]{\iota_{#1}\left(#2\right)} \newcommand{\ltdefn}[3]{#1\colon #2\rightarrow#3} \newcommand{\lteval}[2]{#1\left(#2\right)} \newcommand{\ltinverse}[1]{#1^{-1}} \newcommand{\restrict}[2]{{#1}|_{#2}} \newcommand{\preimage}[2]{#1^{-1}\left(#2\right)} \newcommand{\rng}[1]{\mathcal{R}\!\left(#1\right)} \newcommand{\krn}[1]{\mathcal{K}\!\left(#1\right)} 
\newcommand{\compose}[2]{{#1}\circ{#2}} \newcommand{\vslt}[2]{\mathcal{LT}\left(#1,\,#2\right)} \newcommand{\isomorphic}{\cong} \newcommand{\similar}[2]{\inverse{#2}#1#2} \newcommand{\vectrepname}[1]{\rho_{#1}} \newcommand{\vectrep}[2]{\lteval{\vectrepname{#1}}{#2}} \newcommand{\vectrepinvname}[1]{\ltinverse{\vectrepname{#1}}} \newcommand{\vectrepinv}[2]{\lteval{\ltinverse{\vectrepname{#1}}}{#2}} \newcommand{\matrixrep}[3]{M^{#1}_{#2,#3}} \newcommand{\matrixrepcolumns}[4]{\left\lbrack \left.\vectrep{#2}{\lteval{#1}{\vect{#3}_{1}}}\right|\left.\vectrep{#2}{\lteval{#1}{\vect{#3}_{2}}}\right|\left.\vectrep{#2}{\lteval{#1}{\vect{#3}_{3}}}\right|\ldots\left|\vectrep{#2}{\lteval{#1}{\vect{#3}_{#4}}}\right.\right\rbrack} \newcommand{\cbm}[2]{C_{#1,#2}} \newcommand{\jordan}[2]{J_{#1}\left(#2\right)} \newcommand{\hadamard}[2]{#1\circ #2} \newcommand{\hadamardidentity}[1]{J_{#1}} \newcommand{\hadamardinverse}[1]{\widehat{#1}} \newcommand{\lt}{<} \newcommand{\gt}{>} \newcommand{\amp}{&} \)

Section S Subspaces

A subspace is a vector space that is contained within another vector space. So every subspace is a vector space in its own right, but it is also defined relative to some other (larger) vector space. We will discover shortly that we are already familiar with a wide variety of subspaces from previous sections.

Subsection S Subspaces

Here is the principal definition for this section.

Definition S Subspace

Suppose that \(V\) and \(W\) are two vector spaces that have identical definitions of vector addition and scalar multiplication, and suppose that \(W\) is a subset of \(V\text{,}\) \(W\subseteq V\text{.}\) Then \(W\) is a subspace of \(V\text{.}\)

Let us look at an example of a vector space inside another vector space.

Subsection TS Testing Subspaces

In Example SC3 we proceeded through all ten of the vector space properties before believing that a subset was a subspace. But six of the properties were easy to prove, and we can lean on some of the properties of the vector space (the superset) to make the other four easier. Here is a theorem that will make it easier to test if a subset is a vector space. A shortcut if there ever was one.

Proof

So just three conditions, plus being a subset of a known vector space, gets us all ten properties. Fabulous! This theorem can be paraphrased by saying that a subspace is “a nonempty subset (of a vector space) that is closed under vector addition and scalar multiplication.”
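
To see what a check of these three conditions looks like concretely, here is a small Python sketch (using NumPy, which the text does not assume) that spot-checks the two closure conditions for a made-up subset of \(\complex{3}\text{,}\) the plane \(x_1+2x_2-x_3=0\text{.}\) Passing a finite spot-check like this is evidence, not a proof; a proof must argue for arbitrary vectors and scalars.

```python
# A numerical spot-check of the conditions paraphrased above for the
# illustrative (made-up) subset
#   W = { (x1, x2, x3) in C^3 : x1 + 2*x2 - x3 = 0 }.
import numpy as np

def in_W(v, tol=1e-12):
    """Membership test for the illustrative subset W."""
    return abs(v[0] + 2 * v[1] - v[2]) < tol

zero = np.zeros(3)
u = np.array([1.0, 1.0, 3.0])   # 1 + 2*1 - 3 = 0, so u is in W
v = np.array([2.0, -1.0, 0.0])  # 2 - 2 - 0 = 0, so v is in W
alpha = 4 - 2j                  # an arbitrary complex scalar

print(in_W(zero))       # True: W is nonempty (it contains the zero vector)
print(in_W(u + v))      # True: additive closure holds for this pair
print(in_W(alpha * u))  # True: scalar closure holds for this scalar and vector
```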

You might want to go back and rework Example SC3 in light of this result, perhaps seeing where we can now economize or where the work done in the example mirrored the proof and where it did not. We will press on and apply this theorem in a slightly more abstract setting.

Much of the power of Theorem TSS is that we can easily establish new vector spaces if we can locate them as subsets of other vector spaces, such as the vector spaces presented in Subsection VS.EVS.

It can be as instructive to consider some subsets that are not subspaces. Since Theorem TSS is an equivalence (see Proof Technique E) we can be assured that a subset is not a subspace if it violates one of the three conditions, and in any example of interest this will not be the “nonempty” condition. However, since a subspace has to be a vector space in its own right, we can also search for a violation of any one of the ten defining properties in Definition VS or any inherent property of a vector space, such as those given by the basic theorems of Subsection VS.VSP. Notice also that a violation need only be for a specific vector or pair of vectors.
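
As a concrete illustration of how a single violation settles the matter, here is a short Python check (an illustration added here, not one of the text's examples) using the set \(Y\) of vectors in \(\complex{2}\) with integer entries that appears in Exercise C26 (Example NSC2S): the scalar \(\alpha=\frac{1}{2}\) and the vector \(\colvector{1\\1}\) together violate closure under scalar multiplication, so \(Y\) is not a subspace.

```python
# One concrete violation is enough to rule out a subset as a subspace.
# Y = vectors in C^2 whose entries are both integers (Exercise C26,
# Example NSC2S); it fails closure under scalar multiplication.
from fractions import Fraction

def in_Y(v):
    """Membership test: both entries must be integers."""
    return all(x == int(x) for x in v)

u = (Fraction(1), Fraction(1))        # u is in Y
alpha = Fraction(1, 2)                # the offending scalar
au = tuple(alpha * x for x in u)      # alpha*u = (1/2, 1/2)

print(in_Y(u))    # True
print(in_Y(au))   # False: alpha*u has left Y, so scalar closure fails
```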

There are two examples of subspaces that are trivial. Suppose that \(V\) is any vector space. Then \(V\) is a subset of itself and is a vector space. By Definition S, \(V\) qualifies as a subspace of itself. The set containing just the zero vector \(Z=\set{\zerovector}\) is also a subspace as can be seen by applying Theorem TSS or by simple modifications of the techniques hinted at in Example VSS. Since these subspaces are so obvious (and therefore not too interesting) we will refer to them as being trivial.

Definition TS Trivial Subspaces

Given the vector space \(V\text{,}\) the subspaces \(V\) and \(\set{\zerovector}\) are each called a trivial subspace.

We can also use Theorem TSS to prove more general statements about subspaces, as illustrated in the next theorem.

Proof

Here is an example where we can exercise Theorem NSMS.
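
As a numerical companion (a SymPy sketch with a made-up matrix, not the example referred to above), the computation below takes two vectors from the null space of a matrix \(A\) and checks that an arbitrary-looking linear combination of them is again sent to the zero vector by \(A\text{,}\) which is the content of Theorem NSMS in action.

```python
# Theorem NSMS: the null space of a matrix is a subspace. Take two vectors
# in the null space of a (made-up) matrix A and watch a linear combination
# of them land back in the null space, i.e. A*(a*u + b*v) = 0.
from sympy import Matrix, Rational

A = Matrix([[1, 2, 1, 0],
            [2, 4, 0, 2]])

u, v = A.nullspace()          # a basis for the null space; here, two vectors
a, b = Rational(3), Rational(-5, 2)
w = a * u + b * v             # an arbitrary-looking linear combination

print(A * u == Matrix.zeros(2, 1))   # True: u is in the null space
print(A * v == Matrix.zeros(2, 1))   # True: v is in the null space
print(A * w == Matrix.zeros(2, 1))   # True: so is the combination
```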

Subsection TSS The Span of a Set

The span of a set of column vectors got a heavy workout in Chapter V and Chapter M. The definition of the span depended only on being able to formulate linear combinations. In any of our more general vector spaces we always have a definition of vector addition and of scalar multiplication. So we can build linear combinations and manufacture spans. This subsection contains two definitions that are just mild variants of definitions we have seen earlier for column vectors. If you have not already, compare them with Definition LCCV and Definition SSCV.

Definition LC Linear Combination

Suppose that \(V\) is a vector space. Given \(n\) vectors \(\vectorlist{u}{n}\) and \(n\) scalars \(\alpha_1,\,\alpha_2,\,\alpha_3,\,\ldots,\,\alpha_n\text{,}\) their linear combination is the vector \begin{equation*} \lincombo{\alpha}{u}{n}\text{.} \end{equation*}
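
Since Definition LC makes sense in any vector space, linear combinations can be formed just as easily with polynomials or matrices as with column vectors. The short SymPy sketch below (with made-up polynomials \(u_1,u_2,u_3\) and made-up scalars, purely for illustration) forms one such linear combination in a vector space of polynomials.

```python
# A linear combination formed in a vector space of polynomials; the
# vectors u1, u2, u3 and the scalars are made up for illustration.
from sympy import symbols, expand, Rational

x = symbols('x')

u1 = x**3 + 2*x
u2 = x**2 - 3
u3 = 4*x + 1

alpha1, alpha2, alpha3 = 2, Rational(-1, 2), 3

lc = expand(alpha1*u1 + alpha2*u2 + alpha3*u3)
print(lc)    # 2*x**3 - x**2/2 + 16*x + 9/2, a single vector in the space
```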

When we realize that we can form linear combinations in any vector space, then it is natural to revisit our definition of the span of a set, since it is the set of all possible linear combinations of a set of vectors.

Definition SS Span of a Set

Suppose that \(V\) is a vector space. Given a set of vectors \(S=\{\vectorlist{u}{t}\}\text{,}\) their span, \(\spn{S}\text{,}\) is the set of all possible linear combinations of \(\vectorlist{u}{t}\text{.}\) Symbolically, \begin{align*} \spn{S}&=\setparts{\lincombo{\alpha}{u}{t}}{\alpha_i\in\complexes,\,1\leq i\leq t}\\ &=\setparts{\sum_{i=1}^{t}\alpha_i\vect{u}_i}{\alpha_i\in\complexes,\,1\leq i\leq t}\text{.} \end{align*}

Proof

Let us again examine membership in a span.

Notice how Example SSP and Example SM32 contained questions about membership in a span, but these questions quickly became questions about solutions to a system of linear equations. This will be a common theme going forward.
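
Here is a sketch of that recipe in SymPy, using made-up vectors (not those of the examples or exercises): to decide whether \(\vect{b}\) is in \(\spn{S}\text{,}\) build the linear system whose coefficient matrix has the vectors of \(S\) as columns and whose vector of constants is \(\vect{b}\text{,}\) and ask whether it is consistent.

```python
# Membership in a span as a question about a linear system. The vectors
# u1, u2 and b below are made up just to show the mechanics.
from sympy import Matrix, linsolve, symbols

u1 = Matrix([1, 0, 2])
u2 = Matrix([1, 1, 1])
b  = Matrix([3, 1, 5])

a1, a2 = symbols('a1 a2')
A = Matrix.hstack(u1, u2)               # columns are the spanning vectors
solutions = linsolve((A, b), a1, a2)    # solve a1*u1 + a2*u2 = b

print(solutions)   # {(2, 1)}: consistent, so b = 2*u1 + 1*u2 is in the span
```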

Subsection SC Subspace Constructions

Several of the subsets of vector spaces that we worked with in Chapter M are also subspaces — they are closed under vector addition and scalar multiplication in \(\complex{m}\text{.}\)

Proof

That was easy! Notice that we could have used this same approach to prove that the null space is a subspace, since Theorem SSNS provided a description of the null space of a matrix as the span of a set of vectors. However, I much prefer the current proof of Theorem NSMS. Speaking of easy, here is a very easy theorem that exposes another of our constructions as creating subspaces.

Proof

One more.

Proof

So the span of a set of vectors, and the null space, column space, row space and left null space of a matrix are all subspaces, and hence are all vector spaces, meaning they have all the properties detailed in Definition VS and in the basic theorems presented in Section VS. We have worked with these objects as just sets in Chapter V and Chapter M, but now we understand that they have much more structure. In particular, being closed under vector addition and scalar multiplication means a subspace is also closed under linear combinations.
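
As a closing illustration (a SymPy sketch with a made-up matrix, not a computation from the text), each of these subspaces can be produced as the span of a finite set of vectors, and SymPy will return such spanning sets directly.

```python
# Spanning sets for the subspaces attached to a (made-up) 3 x 4 matrix A.
from sympy import Matrix

A = Matrix([[1, 2, 0, 1],
            [2, 4, 1, 4],
            [1, 2, 1, 3]])

print(A.nullspace())     # spanning set for the null space, a subspace of C^4
print(A.columnspace())   # spanning set for the column space, a subspace of C^3
print(A.rowspace())      # spanning set for the row space, a subspace of C^4
print(A.T.nullspace())   # spanning set for the left null space, a subspace of C^3
```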

Subsection Reading Questions

1

Summarize the three conditions that allow us to quickly test if a set is a subspace.

2

Consider the set of vectors \begin{equation*} W=\setparts{\colvector{a\\b\\c}}{3a-2b+c=5}\text{.} \end{equation*} Is the set \(W\) a subspace of \(\complex{3}\text{?}\) Explain your answer.

3

Name five general constructions of sets of column vectors (subsets of \(\complex{m}\)) that we now know as subspaces.

Subsection Exercises

C15

Working within the vector space \(\complex{3}\text{,}\) determine if \(\vect{b} = \colvector{4\\3\\1}\) is in the subspace \(W\text{,}\) \begin{equation*} W = \spn{\set{ \colvector{3\\2\\3}, \colvector{1\\0\\3}, \colvector{1\\1\\0}, \colvector{2\\1\\3} }}\text{.} \end{equation*}

Solution
C16

Working within the vector space \(\complex{4}\text{,}\) determine if \(\vect{b} = \colvector{1\\1\\0\\1}\) is in the subspace \(W\text{,}\) \begin{equation*} W =\spn{\set{ \colvector{1\\2\\-1\\1}, \colvector{1\\0\\3\\1}, \colvector{2\\1\\1\\2} }}\text{.} \end{equation*}

Solution
C17

Working within the vector space \(\complex{4}\text{,}\) determine if \(\vect{b} = \colvector{2\\1\\2\\1}\) is in the subspace \(W\text{,}\) \begin{equation*} W = \spn{\set{ \colvector{1\\2\\0\\2}, \colvector{1\\0\\3\\1}, \colvector{0\\1\\0\\2}, \colvector{1\\1\\2\\0} }}\text{.} \end{equation*}

Solution
C20

Working within the vector space \(P_3\) of polynomials of degree 3 or less, determine if \(p(x)=x^3+6x+4\) is in the subspace \(W\) below. \begin{equation*} W=\spn{\set{x^3+x^2+x,\,x^3+2x-6,\,x^2-5}} \end{equation*}

Solution
C21

Consider the subspace \begin{equation*} W=\spn{\set{ \begin{bmatrix} 2 & 1\\3 & -1 \end{bmatrix} ,\, \begin{bmatrix} 4 & 0\\2 & 3 \end{bmatrix} ,\, \begin{bmatrix} -3 & 1\\2 & 1 \end{bmatrix} }} \end{equation*} of the vector space of \(2\times 2\) matrices, \(M_{22}\text{.}\) Is \begin{equation*} C=\begin{bmatrix} -3 & 3\\6 & -4 \end{bmatrix} \end{equation*} an element of \(W\text{?}\)

Solution
C26

Show that the set \(Y=\setparts{\colvector{x_1\\x_2}}{x_1\in{\mathbb Z},\,x_2\in{\mathbb Z}}\) from Example NSC2S has Property AC.

M20

In \(\complex{3}\text{,}\) the vector space of column vectors of size 3, prove that the set \(Z\) is a subspace. \begin{equation*} Z=\setparts{\colvector{x_1\\x_2\\x_3}}{4x_1-x_2+5x_3=0} \end{equation*}

Solution
T20

A square matrix \(A\) of size \(n\) is upper triangular if \(\matrixentry{A}{ij}=0\) whenever \(i\gt j\text{.}\) Let \(UT_n\) be the set of all upper triangular matrices of size \(n\text{.}\) Prove that \(UT_n\) is a subspace of the vector space of all square matrices of size \(n\text{,}\) \(M_{nn}\text{.}\)

Solution
T30

Let \(P\) be the set of all polynomials, of any degree. The set \(P\) is a vector space. Let \(E\) be the subset of \(P\) consisting of all polynomials with only terms of even degree. Prove or disprove: the set \(E\) is a subspace of \(P\text{.}\)

Solution
T31

Let \(P\) be the set of all polynomials, of any degree. The set \(P\) is a vector space. Let \(F\) be the subset of \(P\) consisting of all polynomials with only terms of odd degree. Prove or disprove: the set \(F\) is a subspace of \(P\text{.}\)

Solution