The singular value decomposition is one of the more useful ways to represent any matrix, even rectangular ones. We can also view the singular values of a (rectangular) matrix as analogues of the eigenvalues of a square matrix. Our definitions and theorems in this section rely heavily on the properties of the matrix-adjoint products ($\adjoint{A}A$ and $A\adjoint{A}$), which we first met in Theorem CPSM. We start by examining some of the basic properties of these two matrices. Now would be a good time to review the basic facts about positive semi-definite matrices in Section PSM.

\subsect{MAP}{Matrix-Adjoint Product}
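As a concrete illustration to carry through this section, consider the real matrix below (so its adjoint is simply its transpose), together with its two matrix-adjoint products,
\begin{align*}
A&=\begin{bmatrix}1&1\\0&1\\1&0\end{bmatrix}&
\adjoint{A}A&=\begin{bmatrix}2&1\\1&2\end{bmatrix}&
A\adjoint{A}&=\begin{bmatrix}2&1&1\\1&1&0\\1&0&1\end{bmatrix}
\end{align*}
Notice that for an $m\times n$ matrix the two products have different sizes, $n\times n$ and $m\times m$ respectively, yet each is Hermitian and positive semi-definite. The next theorem says their eigenvalues are very closely related.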
Theorem EEMAP Eigenvalues and Eigenvectors of Matrix-Adjoint Product
Suppose that $A$ is an $m\times n$ matrix and $\adjoint{A}A$ has rank $r$. Let $\scalarlist{\lambda}{p}$ be the nonzero distinct eigenvalues of $\adjoint{A}A$ and let $\scalarlist{\rho}{q}$ be the nonzero distinct eigenvalues of $A\adjoint{A}$. Then,
\begin{enumerate}
\item $p=q$.
\item The distinct nonzero eigenvalues can be ordered such that $\lambda_i=\rho_i$, $1\leq i\leq p$.
\item Properly ordered, $\algmult{\adjoint{A}A}{\lambda_i}=\algmult{A\adjoint{A}}{\rho_i}$, $1\leq i\leq p$.
\item The rank of $\adjoint{A}A$ is equal to the rank of $A\adjoint{A}$.
\item There is an orthonormal basis, $\set{\vectorlist{x}{n}}$, of $\complex{n}$ composed of eigenvectors of $\adjoint{A}A$ and an orthonormal basis, $\set{\vectorlist{y}{m}}$, of $\complex{m}$ composed of eigenvectors of $A\adjoint{A}$ with the following properties. Order the eigenvectors so that $\vect{x}_i$, $r+1\leq i\leq n$, are the eigenvectors of $\adjoint{A}A$ for the zero eigenvalue. Let $\delta_i$, $1\leq i\leq r$, denote the nonzero eigenvalues of $\adjoint{A}A$. Then $A\vect{x}_i=\sqrt{\delta_i}\vect{y}_i$, $1\leq i\leq r$, and $A\vect{x}_i=\zerovector$, $r+1\leq i\leq n$. Finally, $\vect{y}_i$, $r+1\leq i\leq m$, are eigenvectors of $A\adjoint{A}$ for the zero eigenvalue.
\end{enumerate}

Proof
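Even before seeing a proof, the illustrative matrix $A$ above bears out each conclusion numerically. The characteristic polynomial of $\adjoint{A}A$ is $(\lambda-3)(\lambda-1)$, while the characteristic polynomial of $A\adjoint{A}$ is $\lambda(\lambda-3)(\lambda-1)$, so the nonzero eigenvalues $3$ and $1$ agree, multiplicity for multiplicity, and each product has rank $2$. For the final conclusion, the unit vector $\vect{x}_1=\frac{1}{\sqrt{2}}\begin{bmatrix}1\\1\end{bmatrix}$ is an eigenvector of $\adjoint{A}A$ for $\delta_1=3$, and
\begin{align*}
A\vect{x}_1&=\frac{1}{\sqrt{2}}\begin{bmatrix}2\\1\\1\end{bmatrix}
=\sqrt{3}\left(\frac{1}{\sqrt{6}}\begin{bmatrix}2\\1\\1\end{bmatrix}\right)
=\sqrt{\delta_1}\vect{y}_1
\end{align*}
where $\vect{y}_1$ is a unit eigenvector of $A\adjoint{A}$ for the eigenvalue $3$, as a quick matrix-vector product will confirm.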

\subsect{SVD}{Singular Value Decomposition}
The square roots of the eigenvalues of $\adjoint{A}A$ (or almost equivalently, $A\adjoint{A}$!) are known as the singular values of $A$. Since $\adjoint{A}A$ is positive semi-definite (Theorem CPSM), its eigenvalues are real and nonnegative, so these square roots are always defined. Here is the definition.
Definition SV Singular Values

Suppose $A$ is an $m\times n$ matrix. If the eigenvalues of $\adjoint{A}A$ are $\scalarlist{\delta}{n}$, then the singular values of $A$ are $\sqrt{\delta_1},\,\sqrt{\delta_2},\,\sqrt{\delta_3},\,\ldots,\,\sqrt{\delta_n}$.

$\square$
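For example, the matrix $A$ of our running illustration has $\adjoint{A}A=\begin{bmatrix}2&1\\1&2\end{bmatrix}$ with eigenvalues $\delta_1=3$ and $\delta_2=1$, so the singular values of $A$ are $\sqrt{3}$ and $1$. By Theorem EEMAP we would find the same nonzero values from the eigenvalues of $A\adjoint{A}$; only the extra zero eigenvalue of the larger product goes unused, which explains the ``almost equivalently'' above.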

Theorem EEMAP is a total setup for the singular value decomposition. This remarkable theorem, Theorem SVD below, says that any matrix can be broken into a product of three matrices. Two are square and unitary. In light of Theorem UMPIP, we can view these matrices as transforming vectors or coordinates in a rotational fashion. The middle matrix of this decomposition is rectangular, but is as close to being diagonal as a rectangular matrix can be. Viewed as a transformation, this matrix effects reflections, contractions or expansions along axes; it stretches vectors. So any matrix, viewed as a transformation, is the product of a rotation, a stretch and a rotation; a worked $2\times 2$ example follows the statement of Theorem SVD below.

The singular value decomposition can also be viewed as an application of our most general statement about matrix representations of linear transformations relative to different bases. Theorem MRCB concerns linear transformations $\ltdefn{T}{U}{V}$ where $U$ and $V$ are possibly different vector spaces. When $U$ and $V$ have different dimensions, the resulting matrix representation is rectangular. In Section CB we quickly specialized to the case where $U=V$, so that the matrix representations are square, and obtained one of our most central results, Theorem SCB. Theorem SVD is an application of the full generality of Theorem MRCB, where the relevant bases are now orthonormal sets.
Theorem SVD Singular Value Decomposition
Suppose $A$ is an $m\times n$ matrix of rank $r$ with nonzero singular values $\scalarlist{s}{r}$. Then $A=UD\adjoint{V}$ where $U$ is a unitary matrix of size $m$, $V$ is a unitary matrix of size $n$, and $D$ is an $m\times n$ matrix given by
\begin{align*}
\matrixentry{D}{ij}&=
\begin{cases}
s_i&\text{if }1\leq i=j\leq r\\
0&\text{otherwise}
\end{cases}
\end{align*}

Proof
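Even without the proof, we can see the theorem at work in a small example where every entry is exact. Consider
\begin{align*}
A&=\begin{bmatrix}0&-2\\1&0\end{bmatrix}&
\adjoint{A}A&=\begin{bmatrix}1&0\\0&4\end{bmatrix}
\end{align*}
so the eigenvalues of $\adjoint{A}A$ are $4$ and $1$, and the nonzero singular values are $s_1=2$ and $s_2=1$. Unit eigenvectors of $\adjoint{A}A$, ordered to put the larger eigenvalue first, form the columns of $V$, and following Theorem EEMAP we take $\vect{y}_i=\frac{1}{s_i}A\vect{x}_i$ as the columns of $U$,
\begin{align*}
V&=\begin{bmatrix}0&1\\1&0\end{bmatrix}&
U&=\begin{bmatrix}-1&0\\0&1\end{bmatrix}&
D&=\begin{bmatrix}2&0\\0&1\end{bmatrix}
\end{align*}
A direct computation confirms the decomposition,
\begin{align*}
UD\adjoint{V}=
\begin{bmatrix}-1&0\\0&1\end{bmatrix}
\begin{bmatrix}2&0\\0&1\end{bmatrix}
\begin{bmatrix}0&1\\1&0\end{bmatrix}
=\begin{bmatrix}0&-2\\1&0\end{bmatrix}=A
\end{align*}
Here $U$ and $V$ are unitary (the factors are not unique; other sign choices for the eigenvectors work equally well), while $D$ stretches the first coordinate by a factor of $2$ and leaves the second alone: the rotation-stretch-rotation description above, made concrete.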