Section LISS Linear Independence and Spanning Sets
A vector space is defined as a set with two operations, meeting ten properties (Definition VS). Just as the definition of the span of a set of vectors required only knowing how to add vectors and how to multiply vectors by scalars, so it is with linear independence. A definition of a linearly independent set of vectors in an arbitrary vector space requires only the ability to form linear combinations and equate them with the zero vector. Since every vector space must have a zero vector (Property Z), we always have a zero vector at our disposal.
In this section we will also put a twist on the notion of the span of a set of vectors. Rather than beginning with a set of vectors and creating a subspace that is the span, we will instead begin with a subspace and look for a set of vectors whose span equals the subspace.
The combination of linear independence and spanning will be very important going forward.
Subsection LI Linear Independence
Our previous definition of linear independence (Definition LICV) employed a relation of linear dependence: a linear combination on one side of an equality and the zero vector on the other. Since a linear combination in a vector space (Definition LC) depends only on vector addition and scalar multiplication, and every vector space must have a zero vector (Property Z), we can extend our definition of linear independence from the setting of $\complex{m}$ to the setting of a general vector space $V$ with almost no changes. Compare these next two definitions with Definition RLDCV and Definition LICV.
Definition RLD Relation of Linear Dependence
Suppose that $V$ is a vector space. Given a set of vectors $S=\set{\vectorlist{u}{n}}$, an equation of the form \begin{equation*} \lincombo{\alpha}{u}{n}=\zerovector \end{equation*} is a relation of linear dependence on $S$. If this equation is formed in a trivial fashion, i.e. $\alpha_i=0$, $1\leq i\leq n$, then we say it is a trivial relation of linear dependence on $S$.
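As a quick illustration (a small instance chosen for this purpose, separate from the examples below), in the vector space $P_2$ of polynomials of degree at most $2$, the set $S=\set{1+x,\,2+2x,\,x^2}$ admits the relation of linear dependence \begin{equation*} (-2)(1+x)+(1)(2+2x)+(0)(x^2)=\zerovector \end{equation*} where the zero vector is the zero polynomial. Since not every scalar here is zero, this relation is not trivial.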
Definition LI Linear Independence
Suppose that $V$ is a vector space. The set of vectors $S=\set{\vectorlist{u}{n}}$ from $V$ is linearly dependent if there is a relation of linear dependence on $S$ that is not trivial. If the only relation of linear dependence on $S$ is the trivial one, then $S$ is a linearly independent set of vectors.
Notice the emphasis on the word “only.” This might remind you of the definition of a nonsingular matrix: when such a matrix is employed as the coefficient matrix of a homogeneous system, the only solution is the trivial one.
Example LIP4 Linear independence in $P_4$
Example LIM32 Linear independence in $M_{32}$
Example LIC Linearly independent set in the crazy vector space
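In miniature, the computation behind examples of this kind runs as follows (a small illustration, not one of the examples above). Consider $T=\set{1+x,\,x+x^2,\,1+x^2}$ in $P_2$ and begin with a relation of linear dependence, \begin{align*} \zerovector&=a_1(1+x)+a_2(x+x^2)+a_3(1+x^2)\\ &=(a_1+a_3)+(a_1+a_2)x+(a_2+a_3)x^2. \end{align*} Equating each coefficient with the corresponding coefficient of the zero polynomial yields the homogeneous system \begin{equation*} a_1+a_3=0\qquad a_1+a_2=0\qquad a_2+a_3=0 \end{equation*} whose only solution is $a_1=a_2=a_3=0$. So the only relation of linear dependence on $T$ is the trivial one, and $T$ is linearly independent (Definition LI).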
Subsection SS Spanning Sets
In a vector space $V$, suppose we are given a set of vectors $S\subseteq V$. Then we can immediately construct a subspace, $\spn{S}$, using Definition SS, and be assured by Theorem SSS that the construction does provide a subspace. We now turn the situation upside-down. Suppose we are first given a subspace $W\subseteq V$. Can we find a set $S$ so that $\spn{S}=W$? Typically $W$ is infinite and we are searching for a finite set of vectors $S$ that we can combine in linear combinations to “build” all of $W$.
I like to think of $S$ as the raw materials that are sufficient for the construction of $W$. If you have nails, lumber, wire, copper pipe, drywall, plywood, carpet, shingles, paint (and a few other things), then you can combine them in many different ways to create a house (or infinitely many different houses for that matter). A fast-food restaurant may have beef, chicken, beans, cheese, tortillas, taco shells and hot sauce and from this small list of ingredients build a wide variety of items for sale. Or maybe a better analogy comes from Ben Cordes — the additive primary colors (red, green and blue) can be combined to create many different colors by varying the intensity of each. The intensity is like a scalar multiple, and the combination of the three intensities is like vector addition. The three individual colors, red, green and blue, are the elements of the spanning set.
Because we will use terms like “spanned by” and “spanning set,” there is the potential for confusion with “the span.” Come back and reread the first paragraph of this subsection whenever you are uncertain about the difference. Here is the working definition.
Definition SSVS Spanning Set of a Vector Space
Suppose $V$ is a vector space. A subset $S$ of $V$ is a spanning set of $V$ if $\spn{S}=V$. In this case, we also frequently say $S$ spans $V$.
The definition of a spanning set requires that two sets (subspaces actually) be equal. If $S$ is a subset of $V$, then $\spn{S}\subseteq V$, always. Thus it is usually only necessary to prove that $V\subseteq\spn{S}$. Now would be a good time to review Definition SE.
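As a quick illustration of this strategy (an instance chosen here for simplicity), let $V=P_2$ and $S=\set{1,\,x,\,x^2}$. Given an arbitrary vector $\vect{p}=a+bx+cx^2$ of $P_2$, we can write \begin{equation*} \vect{p}=a(1)+b(x)+c(x^2) \end{equation*} which exhibits $\vect{p}$ as an element of $\spn{S}$. So $P_2\subseteq\spn{S}$, the reverse containment is automatic, and $S$ is a spanning set of $P_2$.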
Example SSP4 Spanning set in $P_4$
Given a subspace and a set of vectors, as in Example SSP4, it can take some work to determine that the set actually is a spanning set. An even harder problem is to be confronted with a subspace and required to construct a spanning set with no guidance. We will now work an example of this flavor, though some of the steps will be unmotivated. Fortunately, we will have better tools for this type of problem later on.
Example SSM22 Spanning set in $M_{22}$
Example SSC Spanning set in the crazy vector space
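To see this method in compressed form, consider the subspace $W$ of $M_{22}$ of matrices whose diagonal entries sum to zero (an illustrative choice of subspace, not necessarily the one in Example SSM22). The defining condition gives $d=-a$, which lets us decompose a general element of $W$ as \begin{equation*} \begin{bmatrix}a&b\\c&-a\end{bmatrix}=a\begin{bmatrix}1&0\\0&-1\end{bmatrix}+b\begin{bmatrix}0&1\\0&0\end{bmatrix}+c\begin{bmatrix}0&0\\1&0\end{bmatrix} \end{equation*} so these three matrices constitute a spanning set for $W$.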
Subsection VR Vector Representation
In Chapter R we will take up the matter of representations fully, where Theorem VRRB will be critical for Definition VR. Here we motivate and prove a fundamental theorem that tells us how to “represent” a vector. This theorem could wait, but working with it now will provide some extra insight into the nature of linearly independent spanning sets. First an example, then the theorem.
Example AVR A vector representation
Theorem VRRB Vector Representation Relative to a Basis
Suppose that $V$ is a vector space and $B=\set{\vectorlist{v}{m}}$ is a linearly independent set that spans $V$. Let $\vect{w}$ be any vector in $V$. Then there exist unique scalars $a_1,\,a_2,\,a_3,\,\ldots,\,a_m$ such that \begin{equation*} \vect{w}=\lincombo{a}{v}{m}. \end{equation*}
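Here is a sketch of the standard argument. Because $B$ spans $V$, the vector $\vect{w}$ is a linear combination of the vectors in $B$, so scalars $a_1,\,a_2,\,a_3,\,\ldots,\,a_m$ with $\vect{w}=\lincombo{a}{v}{m}$ exist. For uniqueness, suppose scalars $b_1,\,b_2,\,b_3,\,\ldots,\,b_m$ also satisfy $\vect{w}=\lincombo{b}{v}{m}$. Subtracting the two expressions gives \begin{equation*} \zerovector=\vect{w}-\vect{w}=(a_1-b_1)\vect{v_1}+(a_2-b_2)\vect{v_2}+\cdots+(a_m-b_m)\vect{v_m} \end{equation*} a relation of linear dependence on $B$ (Definition RLD). Since $B$ is linearly independent, this relation must be trivial, so $a_i-b_i=0$ for $1\leq i\leq m$, and the two lists of scalars are identical.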
The converse of Theorem VRRB is true as well, but is not important enough to rise beyond an exercise (see Exercise LISS.T51).
This is a very typical use of the hypothesis that a set is linearly independent — obtain a relation of linear dependence and then conclude that the scalars must all be zero. The result of this theorem tells us that we can write any vector in a vector space as a linear combination of the vectors in a linearly independent spanning set, but only just. There is only enough raw material in the spanning set to write each vector one way as a linear combination. So in this sense, we could call a linearly independent spanning set a “minimal spanning set.” These sets are so important that we will give them a simpler name (“basis”) and explore their properties further in the next section.