Once the dimension of a vector space is known, determining whether a set of vectors is linearly independent, or whether it spans the vector space, can often be much easier. In this section we will state a workhorse theorem and then apply it to the column space and row space of a matrix. It will also help us describe a super-basis for $\complex{m}\text{.}$

We begin with a useful theorem that we will need later, and in the proof of the main theorem in this subsection. This theorem says that we can extend linearly independent sets, one vector at a time, by adding vectors from outside the span of the linearly independent set, all the while preserving the linear independence of the set.

##### Proof

In the story Goldilocks and the Three Bears, the young girl Goldilocks visits the empty house of the three bears while out walking in the woods. One bowl of porridge is too hot, one too cold, the third is just right. One chair is too hard, one too soft, the third is just right. So it is with sets of vectors — some are too big (linearly dependent), some are too small (they do not span), and some are just right (bases). Here is Goldilocks' Theorem.

##### Proof

There is a tension in the construction of a basis. Make a set too big and you will end up with relations of linear dependence among the vectors. Make a set too small and you will not have enough raw material to span the entire vector space. Make a set just the right size (the dimension) and you only need to have linear independence or spanning, and you get the other property for free. These roughly-stated ideas are made precise by Theorem G.

The structure and proof of this theorem also deserve comment. The hypotheses seem innocuous. We presume we know the dimension of the vector space in hand, then we mostly just look at the size of the set $S\text{.}$ From this we get big conclusions about spanning and linear independence. Each of the four proofs relies on ultimately contradicting Theorem SSLD, so in a way we could think of this entire theorem as a corollary of Theorem SSLD. (See Proof Technique LC.) The proofs of the third and fourth parts parallel each other in style: introduce $\vect{w}$ using Theorem ELIS or toss $\vect{v}_k$ using Theorem DLDS. Then obtain a contradiction to Theorem SSLD.

Theorem G is useful in both concrete examples and as a tool in other proofs. We will use it often to bypass verifying linear independence or spanning.
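For instance, the "one property for free" phenomenon can be observed computationally. The following is a sketch using SymPy (the three vectors are an arbitrary choice, not from the text): since the set has size equal to the dimension of the space, checking linear independence alone is enough to conclude it is a basis.

```python
# Illustration of Theorem G with an arbitrary set S of 3 vectors in a
# 3-dimensional space, so |S| = dim V = 3.
from sympy import Matrix

# columns of S are the vectors (1,0,2), (0,1,1), (1,1,4)
S = Matrix([[1, 0, 2],
            [0, 1, 1],
            [1, 1, 4]]).T

# rank 3 means the columns are linearly independent...
assert S.rank() == 3

# ...and since |S| = dim V, Theorem G says S also spans: any vector b
# is a linear combination of the columns (the system [S|b] is consistent).
b = Matrix([5, -2, 7])
assert Matrix.hstack(S, b).rank() == S.rank()
```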

A simple consequence of Theorem G is the observation that a proper subspace has strictly smaller dimension than its parent vector space. This may seem intuitively obvious, but it still requires proof, and we will cite this result later.

##### Proof

The final theorem of this subsection is an extremely powerful tool for establishing the equality of two sets that are subspaces. Notice that the hypotheses include the equality of two integers (dimensions) while the conclusion is the equality of two sets (subspaces). It is the extra “structure” of a vector space and its dimension that makes possible this huge leap from an integer equality to a set equality.

##### Proof

We now prove one of the most surprising theorems about matrices. Notice the paucity of hypotheses compared to the precision of the conclusion.

##### Proof

This says that the row space and the column space of a matrix have the same dimension, which should be very surprising. It does not say that the column space and the row space are identical. Indeed, if the matrix is not square, then the sizes (number of slots) of the vectors in each space are different, so the sets are not even comparable.

It is not hard to construct examples yourself that illustrate Theorem RMRT, since it applies equally well to any matrix. Grab a matrix, row-reduce it, count the nonzero rows or the number of pivot columns. That is the rank. Transpose the matrix, row-reduce that, count the nonzero rows or the pivot columns. That is the rank of the transpose. The theorem says the two will be equal. Every time. Here is an example anyway.
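The check described above takes only a few lines of SymPy (exact arithmetic, so no floating-point rank ambiguity); the $4\times 5$ matrix below is an arbitrary choice, deliberately rank-deficient to make the count interesting.

```python
# Checking Theorem RMRT on an arbitrary non-square matrix.
from sympy import Matrix

A = Matrix([[1, 2, 0, 1, 3],
            [2, 4, 1, 3, 7],
            [0, 0, 1, 1, 1],
            [1, 2, 1, 2, 4]])

# rank = number of nonzero rows of the reduced row-echelon form,
# equivalently the number of pivot columns
r  = A.rank()
rt = A.T.rank()

assert r == rt == 2   # the theorem promises equality, every time
```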

# Subsection DFS Dimension of Four Subspaces

That the rank of a matrix equals the rank of its transpose is a fundamental and surprising result. Moreover, by applying Theorem FS we can easily determine the dimension of all four fundamental subspaces associated with a matrix.

##### Proof

There are many different ways to state and prove this result, and indeed, the equality of the dimensions of the column space and row space is just a slight expansion of Theorem RMRT. However, we have restricted our techniques to applying Theorem FS and then determining dimensions with bases provided by Theorem BNS and Theorem BRS. This provides an appealing symmetry to the results and the proof.
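The dimension count for an $m\times n$ matrix of rank $r$ can be verified directly with SymPy; the matrix below is an arbitrary example ($m=4$, $n=5$, $r=2$), and the dimensions of the four subspaces come out as $n-r$, $r$, $r$, and $m-r$ respectively.

```python
# Dimensions of the four fundamental subspaces of an arbitrary matrix A.
from sympy import Matrix

A = Matrix([[1, 2, 0, 1, 3],
            [2, 4, 1, 3, 7],
            [0, 0, 1, 1, 1],
            [1, 2, 1, 2, 4]])
m, n = A.shape
r = A.rank()

dim_nsp = len(A.nullspace())      # null space of A
dim_csp = len(A.columnspace())    # column space of A
dim_rsp = len(A.rowspace())       # row space of A
dim_lns = len(A.T.nullspace())    # left null space = null space of A^T

assert (dim_nsp, dim_csp, dim_rsp, dim_lns) == (n - r, r, r, m - r)
```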


##### 1

Why does Theorem G have the title it does?

##### 3

Row-reduce the matrix $A$ to reduced row-echelon form. Without any further computations, compute the dimensions of the four subspaces, (a) $\nsp{A}\text{,}$ (b) $\csp{A}\text{,}$ (c) $\rsp{A}$ and (d) $\lns{A}\text{.}$ \begin{equation*} A= \begin{bmatrix} 1 & -1 & 2 & 8 & 5 \\ 1 & 1 & 1 & 4 & -1 \\ 0 & 2 & -3 & -8 & -6 \\ 2 & 0 & 1 & 8 & 4 \end{bmatrix} \end{equation*}

# Subsection Exercises

##### C10

Example SVP4 leaves several details for the reader to check. Verify these five claims.

##### C40

Determine if the set $T=\set{x^2-x+5,\,4x^3-x^2+5x,\,3x+2}$ spans the vector space of polynomials with degree 4 or less, $P_4\text{.}$ (Compare the solution to this exercise with Solution C40.1.)

##### T05

Trivially, if $U$ and $V$ are two subspaces of $W$ with $U = V\text{,}$ then $\dimension{U}=\dimension{V}\text{.}$ Combine this fact, Theorem PSSD, and Theorem EDYES all into one grand combined theorem. You might look to Theorem PIP for stylistic inspiration. (Notice this problem does not ask you to prove anything. It just asks you to roll up three theorems into one compact, logically equivalent statement.)

##### T10

Prove the following theorem, which could be viewed as a reformulation of parts (3) and (4) of Theorem G, or more appropriately as a corollary of Theorem G (Proof Technique LC).

Suppose $V$ is a vector space and $S$ is a subset of $V$ such that the number of vectors in $S$ equals the dimension of $V\text{.}$ Then $S$ is linearly independent if and only if $S$ spans $V\text{.}$

##### T15

Suppose that $A$ is an $m\times n$ matrix and let $\text{min}(m,\,n)$ denote the minimum of $m$ and $n\text{.}$ Prove that $\rank{A}\leq \text{min}(m,\,n)\text{.}$ (If $m$ and $n$ are two numbers, then $\text{min}(m,\,n)$ stands for the number that is the smaller of the two. For example $\text{min}(4,\,6)=4\text{.}$)

##### T20

Suppose that $A$ is an $m\times n$ matrix and $\vect{b}\in\complex{m}\text{.}$ Prove that the linear system $\linearsystem{A}{\vect{b}}$ is consistent if and only if $\rank{A}=\rank{\augmented{A}{\vect{b}}}\text{.}$
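As a numerical sanity check of this criterion (not a proof), the rank comparison can be carried out in SymPy; the small matrix and the two right-hand sides below are hypothetical examples, one chosen inside the column space and one outside it.

```python
# Consistency of LS(A, b) versus the rank test on the augmented matrix.
from sympy import Matrix

A = Matrix([[1, 2],
            [2, 4]])          # rank 1

b_good = Matrix([3, 6])       # in the column space of A
b_bad  = Matrix([3, 5])       # not in the column space of A

# consistent: appending b does not raise the rank
assert Matrix.hstack(A, b_good).rank() == A.rank()
# inconsistent: appending b raises the rank by one
assert Matrix.hstack(A, b_bad).rank() == A.rank() + 1
```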

##### T25

Suppose that $V$ is a vector space with finite dimension. Let $W$ be any subspace of $V\text{.}$ Prove that $W$ has finite dimension.

##### T33

Part of Exercise B.T50 is the half of the proof where we assume the matrix $A$ is nonsingular and prove that a set is a basis. In Solution T50.1 we proved directly that the set was both linearly independent and a spanning set. Shorten this part of the proof by applying Theorem G. Be careful, there is one subtlety.

##### T60

Suppose that $W$ is a vector space with dimension 5, and $U$ and $V$ are subspaces of $W\text{,}$ each of dimension 3. Prove that $U\cap V$ contains a nonzero vector. State a more general result.
