Subsection T Trace

This section contributed by \andyzimmer.

\bigskip The matrix trace is a function that sends square matrices to scalars. In some ways it is reminiscent of the determinant, and like the determinant it has many useful and surprising properties.
Definition T Trace

Suppose $A$ is a square matrix of size $n$. Then the trace of $A$, $\trace{A}$, is the sum of the diagonal entries of $A$. Symbolically, \begin{align*} \trace{A} &=\sum_{i=1}^n \matrixentry{A}{ii} \end{align*}

$\square$
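For concreteness, here is the trace of a small matrix (a made-up $3\times 3$ example, not one from the text); notice that the off-diagonal entries play no role whatsoever:

```latex
\begin{align*}
\trace{\begin{bmatrix}
2 & -1 & 0 \\
4 & 3 & 5 \\
1 & 0 & -2
\end{bmatrix}}
&= 2 + 3 + (-2) = 3
\end{align*}
```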

The next three proofs make for excellent practice. In some books they would be left as exercises for the reader, as they are all “trivial” in the sense that they rely on nothing but the definition of the matrix trace.
Theorem TL Trace is Linear
Suppose $A$ and $B$ are square matrices of size $n$. Then $\trace{A+B}=\trace{A}+\trace{B}$. Furthermore, if $\alpha\in\complexes$, then $\trace{\alpha A}=\alpha\trace{A}$.

Proof
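One possible argument, using only Definition T and the fact that matrix addition and scalar multiplication act entry-by-entry:

```latex
\begin{align*}
\trace{A+B}
&=\sum_{i=1}^{n}\matrixentry{A+B}{ii}
 =\sum_{i=1}^{n}\left(\matrixentry{A}{ii}+\matrixentry{B}{ii}\right)\\
&=\sum_{i=1}^{n}\matrixentry{A}{ii}+\sum_{i=1}^{n}\matrixentry{B}{ii}
 =\trace{A}+\trace{B}\\
\trace{\alpha A}
&=\sum_{i=1}^{n}\matrixentry{\alpha A}{ii}
 =\sum_{i=1}^{n}\alpha\matrixentry{A}{ii}
 =\alpha\sum_{i=1}^{n}\matrixentry{A}{ii}
 =\alpha\trace{A}
\end{align*}
```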

Theorem TSRM Trace is Symmetric with Respect to Multiplication
Suppose $A$ and $B$ are square matrices of size $n$. Then $\trace{AB}=\trace{BA}$.

Proof
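A sketch, using Definition T, the entrywise formula for a matrix product, and a swap of the order of two finite sums:

```latex
\begin{align*}
\trace{AB}
&=\sum_{i=1}^{n}\matrixentry{AB}{ii}
 =\sum_{i=1}^{n}\sum_{j=1}^{n}\matrixentry{A}{ij}\matrixentry{B}{ji}\\
&=\sum_{j=1}^{n}\sum_{i=1}^{n}\matrixentry{B}{ji}\matrixentry{A}{ij}
 =\sum_{j=1}^{n}\matrixentry{BA}{jj}
 =\trace{BA}
\end{align*}
```

The symmetry is perhaps surprising, since matrix multiplication itself is not commutative; only the traces of $AB$ and $BA$ agree, not the products themselves.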

Theorem TIST Trace is Invariant Under Similarity Transformations
Suppose $A$ and $S$ are square matrices of size $n$ and $S$ is invertible. Then $\trace{\inverse{S}AS}=\trace{A}$.

Proof
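A sketch: associate the triple product as $\left(\inverse{S}A\right)S$ and apply Theorem TSRM once.

```latex
\begin{align*}
\trace{\inverse{S}AS}
&=\trace{\left(\inverse{S}A\right)S}
 =\trace{S\left(\inverse{S}A\right)}
 =\trace{\left(S\inverse{S}\right)A}
 =\trace{I_nA}
 =\trace{A}
\end{align*}
```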

Now we could define the trace of a linear transformation as the trace of any matrix representation of the transformation. Would this definition be well-defined? That is, will two different representations of the same linear transformation always have the same trace? Why? (Think Theorem SCB.) We will now prove one of the most interesting and surprising results about the trace.
Theorem TSE Trace is the Sum of the Eigenvalues
Suppose that $A$ is a square matrix of size $n$ with distinct eigenvalues $\scalarlist{\lambda}{k}$. Then \begin{align*} \trace{A} &=\sum_{i=1}^{k}\algmult{A}{\lambda_i}\lambda_i \end{align*}

Proof
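One route, sketched under the assumption (Schur's theorem) that every square complex matrix is similar to an upper triangular matrix: if $\inverse{S}AS=T$ with $T$ upper triangular, then the diagonal of $T$ lists each eigenvalue $\lambda_i$ exactly $\algmult{A}{\lambda_i}$ times, and Theorem TIST yields

```latex
\begin{align*}
\trace{A}
&=\trace{\inverse{S}AS}
 =\trace{T}
 =\sum_{i=1}^{k}\algmult{A}{\lambda_i}\lambda_i
\end{align*}
```

Note that the algebraic multiplicities sum to $n$, so all $n$ diagonal entries of $T$ are accounted for.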