Section PD Properties of Dimension
Once the dimension of a vector space is known, determining whether a set of vectors is linearly independent, or whether it spans the vector space, can often be much easier. In this section we will state a workhorse theorem and then apply it to the column space and row space of a matrix. It will also help us describe a super-basis for $\complex{m}$.
Subsection GT Goldilocks' Theorem
We begin with a useful theorem that we will need later, and in the proof of the main theorem in this subsection. This theorem says that we can extend linearly independent sets, one vector at a time, by adding vectors from outside the span of the linearly independent set, all the while preserving the linear independence of the set.
Theorem ELIS Extending Linearly Independent Sets
Suppose $V$ is a vector space and $S$ is a linearly independent set of vectors from $V$. Suppose $\vect{w}$ is a vector such that $\vect{w}\not\in\spn{S}$. Then the set $S^\prime=S\cup\set{\vect{w}}$ is linearly independent.
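Theorem ELIS can be checked numerically for a concrete set. The following sketch (using numpy, which is an assumption here, not part of the text) encodes a linearly independent set $S$ in $\complex{3}$ as the columns of a matrix, picks a vector $\vect{w}$ outside $\spn{S}$, and verifies that appending $\vect{w}$ preserves linear independence by comparing ranks.

```python
import numpy as np

# S: two linearly independent vectors in R^3, stored as columns.
S = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])

# w lies outside span(S), which is the xy-plane.
w = np.array([[0.0], [0.0], [1.0]])

# Columns of a matrix are linearly independent exactly when the
# rank equals the number of columns.
assert np.linalg.matrix_rank(S) == 2

# Appending w raises the rank to 3, so S ∪ {w} is still
# linearly independent, as Theorem ELIS predicts.
S_prime = np.hstack([S, w])
assert np.linalg.matrix_rank(S_prime) == 3
```

Had $\vect{w}$ been chosen inside $\spn{S}$, the rank would not increase and the enlarged set would be linearly dependent.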
In the story Goldilocks and the Three Bears, the young girl Goldilocks visits the empty house of the three bears while out walking in the woods. One bowl of porridge is too hot, one too cold, the third is just right. One chair is too hard, one too soft, the third is just right. So it is with sets of vectors — some are too big (linearly dependent), some are too small (they do not span), and some are just right (bases). Here is Goldilocks' Theorem.
Theorem G Goldilocks
Suppose that $V$ is a vector space of dimension $t$. Let $S=\set{\vectorlist{v}{m}}$ be a set of vectors from $V$. Then
- If $m>t$, then $S$ is linearly dependent.
- If $m<t$, then $S$ does not span $V$.
- If $m=t$ and $S$ is linearly independent, then $S$ spans $V$.
- If $m=t$ and $S$ spans $V$, then $S$ is linearly independent.
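Each case of the theorem can be observed concretely in $\complex{3}$, where $t=3$. The following sketch (numpy assumed; not part of the text) uses the rank of a matrix whose columns are the vectors of $S$: rank less than $m$ means linear dependence, and rank less than $t$ means $S$ fails to span.

```python
import numpy as np

t = 3  # dimension of V = R^3
rank = np.linalg.matrix_rank

# m > t: four vectors in R^3 are always linearly dependent,
# since the rank can be at most 3 < 4.
big = np.random.rand(3, 4)
assert rank(big) < 4

# m < t: two vectors can never span R^3,
# since the rank can be at most 2 < 3.
small = np.random.rand(3, 2)
assert rank(small) < 3

# m = t: three linearly independent vectors (rank 3)
# automatically span R^3 -- independence and spanning
# coincide at the "just right" size.
just_right = np.array([[1.0, 1.0, 0.0],
                       [0.0, 1.0, 1.0],
                       [0.0, 0.0, 1.0]])
assert rank(just_right) == t
```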
There is a tension in the construction of a basis. Make a set too big and you will end up with relations of linear dependence among the vectors. Make a set too small and you will not have enough raw material to span the entire vector space. Make a set just the right size (the dimension) and you only need to have linear independence or spanning, and you get the other property for free. These roughly-stated ideas are made precise by Theorem G.
The structure and proof of this theorem also deserve comment. The hypotheses seem innocuous. We presume we know the dimension of the vector space in hand, then we mostly just look at the size of the set $S$. From this we get big conclusions about spanning and linear independence. Each of the four proofs relies on ultimately contradicting Theorem SSLD, so in a way we could think of this entire theorem as a corollary of Theorem SSLD. (See Proof Technique LC.) The proofs of the third and fourth parts parallel each other in style: introduce $\vect{w}$ using Theorem ELIS or toss $\vect{v}_k$ using Theorem DLDS. Then obtain a contradiction to Theorem SSLD.
Theorem G is useful in both concrete examples and as a tool in other proofs. We will use it often to bypass verifying linear independence or spanning.
Example BPR Bases for $P_n$, reprised
Example BDM22 Basis by dimension in $M_{22}$
Example SVP4 Sets of vectors in $P_4$
A simple consequence of Theorem G is the observation that a proper subspace has strictly smaller dimension than its parent vector space. This may seem intuitively obvious, but it still requires proof, and we will cite this result later.
Theorem PSSD Proper Subspaces have Smaller Dimension
Suppose that $U$ and $V$ are subspaces of the vector space $W$, such that $U\subsetneq V$. Then $\dimension{U}<\dimension{V}$.
The final theorem of this subsection is an extremely powerful tool for establishing the equality of two sets that are subspaces. Notice that the hypotheses include the equality of two integers (dimensions) while the conclusion is the equality of two sets (subspaces). It is the extra “structure” of a vector space and its dimension that makes possible this huge leap from an integer equality to a set equality.
Theorem EDYES Equal Dimensions Yields Equal Subspaces
Suppose that $U$ and $V$ are subspaces of the vector space $W$, such that $U\subseteq V$ and $\dimension{U}=\dimension{V}$. Then $U=V$.
Subsection RT Ranks and Transposes
We now prove one of the most surprising theorems about matrices. Notice the paucity of hypotheses compared to the precision of the conclusion.
Theorem RMRT Rank of a Matrix is the Rank of the Transpose
Suppose $A$ is an $m\times n$ matrix. Then $\rank{A}=\rank{\transpose{A}}$.
This says that the row space and the column space of a matrix have the same dimension, which should be very surprising. It does not say that the column space and the row space are identical. Indeed, if the matrix is not square, then the sizes (number of slots) of the vectors in each space differ, so the two sets are not even comparable.
It is not hard to construct examples yourself that illustrate Theorem RMRT, since it applies equally well to any matrix. Grab a matrix, row-reduce it, and count the nonzero rows or the pivot columns. That is the rank. Transpose the matrix, row-reduce that, and count the nonzero rows or the pivot columns. That is the rank of the transpose. The theorem says the two counts will be equal. Every time. Here is an example anyway.
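The experiment just described can be carried out mechanically. This sketch (numpy assumed; the particular $3\times 4$ matrix is an arbitrary choice, not from the text) computes the rank of a non-square matrix and of its transpose and confirms they agree.

```python
import numpy as np

# An arbitrary 3x4 matrix: row 2 = 2*(row 1) + [0,0,1,1],
# row 3 = row 1 + [0,0,1,1], so the rank is 2.
A = np.array([[1.0, 2.0, 0.0, 3.0],
              [2.0, 4.0, 1.0, 7.0],
              [1.0, 2.0, 1.0, 4.0]])

r = np.linalg.matrix_rank(A)      # rank of A
rt = np.linalg.matrix_rank(A.T)   # rank of the transpose

# Theorem RMRT: the two ranks agree, even though A is not square.
assert r == rt
```

Note that `matrix_rank` works numerically (via the singular value decomposition) rather than by row-reducing, but it counts the same quantity.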
Example RRTI Rank, rank of transpose, Archetype I
Subsection DFS Dimension of Four Subspaces
That the rank of a matrix equals the rank of its transpose is a fundamental and surprising result. By applying Theorem FS, we can easily determine the dimensions of all four fundamental subspaces associated with a matrix.
Theorem DFS Dimensions of Four Subspaces
Suppose that $A$ is an $m\times n$ matrix, and $B$ is a row-equivalent matrix in reduced row-echelon form with $r$ nonzero rows. Then
- $\dimension{\nsp{A}}=n-r$
- $\dimension{\csp{A}}=r$
- $\dimension{\rsp{A}}=r$
- $\dimension{\lns{A}}=m-r$
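All four dimension counts can be confirmed on a concrete matrix. This sketch uses sympy (an assumption, not part of the text; the matrix is the same arbitrary rank-2 example as above), whose exact-arithmetic methods return bases for each of the four subspaces, so the dimensions are just the sizes of those bases.

```python
import sympy as sp

# An arbitrary 3x4 matrix of rank 2 (m = 3, n = 4, r = 2).
A = sp.Matrix([[1, 2, 0, 3],
               [2, 4, 1, 7],
               [1, 2, 1, 4]])
m, n = A.shape
r = A.rank()

# Theorem DFS, one assertion per subspace:
assert len(A.nullspace()) == n - r     # dim N(A) = n - r
assert len(A.columnspace()) == r       # dim C(A) = r
assert len(A.rowspace()) == r          # dim R(A) = r
assert len(A.T.nullspace()) == m - r   # dim L(A) = m - r
```

Here the left null space $\lns{A}$ is computed as the null space of $\transpose{A}$, matching its definition.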
There are many different ways to state and prove this result, and indeed, the equality of the dimensions of the column space and row space is just a slight expansion of Theorem RMRT. However, we have restricted our techniques to applying Theorem FS and then determining dimensions with bases provided by Theorem BNS and Theorem BRS. This provides an appealing symmetry to the results and the proof.