### Section PSM: Positive Semi-Definite Matrices

From A First Course in Linear Algebra
Version 2.10
http://linear.ups.edu/

This Section is a Draft, Subject to Changes
Needs Numerical Examples

Positive semi-definite matrices (and their cousins, positive definite matrices) are square matrices which in many ways behave like non-negative (respectively, positive) real numbers. Results given here are employed in the decompositions of Section SVD, Section SR and Section PD.

#### Subsection PSM: Positive Semi-Definite Matrices

Definition PSM
Positive Semi-Definite Matrix
A square matrix $A$ of size $n$ is positive semi-definite if $A$ is Hermitian and for all $x \in \mathbb{C}^n$, $\langle Ax,\,x\rangle \ge 0$.

For a definition of positive definite replace the inequality in the definition with a strict inequality, and exclude the zero vector from the vectors x required to meet the condition. Similar variations allow definitions of negative definite and negative semi-definite. Our first theorem in this section gives us an easy way to build positive semi-definite matrices.
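Since this draft still needs numerical examples, here is a minimal pure-Python check of the definition on one concrete Hermitian matrix. The matrix, the sample vectors, and the helper names (`matvec`, `inner`) are our own illustration, not from the text; the inner product is linear in the first slot and conjugates the second, matching the convention used in the proofs below.

```python
def matvec(A, x):
    """Multiply matrix A (a list of rows) by vector x."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def inner(u, v):
    """Inner product <u, v> = sum of u_i times conjugate(v_i) on C^n."""
    return sum(ui * vi.conjugate() for ui, vi in zip(u, v))

# A Hermitian matrix (equal to its conjugate transpose); its eigenvalues
# are 1 and 4, so it is positive semi-definite.
A = [[2 + 0j, 1 - 1j],
     [1 + 1j, 3 + 0j]]

# For several sample vectors x, <Ax, x> should be real and non-negative.
samples = [[1 + 0j, 0j], [1j, 1 + 0j], [1 - 2j, -3 + 1j]]
values = [inner(matvec(A, x), x) for x in samples]
```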

Theorem CPSM
Creating Positive Semi-Definite Matrices
Suppose that $A$ is any $m \times n$ matrix. Then the matrices $A^*A$ and $AA^*$ are positive semi-definite matrices.

Proof   We will give the proof for the first matrix; the proof for the second is entirely similar. First we check that $A^*A$ is Hermitian,

$$\begin{aligned}
\left(A^*A\right)^* &= A^*\left(A^*\right)^* && \text{Theorem MMAD}\\
&= A^*A && \text{Theorem AA}
\end{aligned}$$

so by Definition HM, the matrix $A^*A$ is Hermitian. Second, for any $x \in \mathbb{C}^n$,

$$\begin{aligned}
\langle A^*Ax,\,x\rangle &= \langle Ax,\,\left(A^*\right)^*x\rangle && \text{Theorem AIP}\\
&= \langle Ax,\,Ax\rangle && \text{Theorem AA}\\
&\ge 0 && \text{Theorem PIP}
\end{aligned}$$

which is the second criterion in the definition of a positive semi-definite matrix (Definition PSM).

A statement very similar to the converse of this theorem is also true. Any positive semi-definite matrix can be realized as the product of a square matrix, $B$, with its adjoint, $B^*$. (See Exercise PSM.T20 after studying this entire section.) The matrices $A^*A$ and $AA^*$ will be important later when we define singular values (Section SVD).
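Theorem CPSM can also be checked numerically. The sketch below builds $A^*A$ from an arbitrary rectangular matrix and verifies the two defining properties; the matrix, the test vectors, and the helper names (`adjoint`, `matmul`, `matvec`, `inner`) are our own choices for illustration.

```python
def adjoint(A):
    """Adjoint (conjugate transpose) A^* of A."""
    return [[A[i][j].conjugate() for i in range(len(A))]
            for j in range(len(A[0]))]

def matmul(A, B):
    """Matrix product AB."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def inner(u, v):
    """<u, v> = sum of u_i times conjugate(v_i)."""
    return sum(ui * vi.conjugate() for ui, vi in zip(u, v))

# Any rectangular A will do (Theorem CPSM); this one is arbitrary.
A = [[1 + 1j, 0j, 2 + 0j],
     [3 + 0j, -1j, 1 - 1j]]
P = matmul(adjoint(A), A)          # P = A^* A, a 3 x 3 matrix

hermitian = (P == adjoint(P))      # entries are exact, so == is safe here
tests = [[1 + 0j, 2j, -1 + 0j], [1j, 1j, 1j], [2 - 1j, 0j, 3 + 2j]]
# Each value equals <Ax, Ax>, so it must be real and non-negative.
values = [inner(matvec(P, x), x) for x in tests]
```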

Positive semi-definite matrices can also be characterized by their eigenvalues, without any mention of inner products. This next result further reinforces the notion that positive semi-definite matrices behave like non-negative real numbers.

Theorem EPSM
Eigenvalues of Positive Semi-definite Matrices
Suppose that $A$ is a Hermitian matrix. Then $A$ is a positive semi-definite matrix if and only if whenever $λ$ is an eigenvalue of $A$, then $λ \ge 0$.

Proof   Notice first that since we are considering only Hermitian matrices in this theorem, it is always possible to compare eigenvalues with the real number zero, since eigenvalues of Hermitian matrices are all real numbers (Theorem HMRE). Let n denote the size of A.

(⇒) Let $x \ne 0$ be an eigenvector of $A$ for $λ$. Then by Theorem PIP we know $\langle x,\,x\rangle \ne 0$. So

$$\begin{aligned}
λ &= \frac{1}{\langle x,\,x\rangle}\,λ\langle x,\,x\rangle && \text{Property MICN}\\
&= \frac{1}{\langle x,\,x\rangle}\langle λx,\,x\rangle && \text{Theorem IPSM}\\
&= \frac{1}{\langle x,\,x\rangle}\langle Ax,\,x\rangle && \text{Definition EEM}
\end{aligned}$$

By Theorem PIP, $\langle x,\,x\rangle > 0$ and by Definition PSM we have $\langle Ax,\,x\rangle \ge 0$. With $λ$ expressed as the product of a positive quantity and a non-negative quantity, we have $λ \ge 0$.

(⇐) Suppose now that $λ_1,\,λ_2,\,λ_3,\,\dots,\,λ_n$ are the (not necessarily distinct) eigenvalues of the Hermitian matrix $A$, each of which is non-negative. Let $B = \{x_1,\,x_2,\,x_3,\,\dots,\,x_n\}$ be a set of associated eigenvectors for these eigenvalues. Since a Hermitian matrix is normal (Definition HM, Definition NM), Theorem OBNM allows us to choose this set of eigenvectors to also be an orthonormal basis of $\mathbb{C}^n$. Choose any $x \in \mathbb{C}^n$ and let $a_1,\,a_2,\,a_3,\,\dots,\,a_n$ be the scalars guaranteed by the spanning property of the basis $B$ such that

$$x = a_1x_1 + a_2x_2 + a_3x_3 + \cdots + a_nx_n = \sum_{i=1}^{n} a_ix_i$$

Since we have presumed A is Hermitian, we need only check the other defining property,

$$\begin{aligned}
\langle Ax,\,x\rangle
&= \Big\langle A\sum_{i=1}^{n}a_ix_i,\;\sum_{j=1}^{n}a_jx_j\Big\rangle && \text{Definition TSVS}\\
&= \Big\langle \sum_{i=1}^{n}Aa_ix_i,\;\sum_{j=1}^{n}a_jx_j\Big\rangle && \text{Theorem MMDAA}\\
&= \Big\langle \sum_{i=1}^{n}a_iAx_i,\;\sum_{j=1}^{n}a_jx_j\Big\rangle && \text{Theorem MMSMM}\\
&= \Big\langle \sum_{i=1}^{n}a_iλ_ix_i,\;\sum_{j=1}^{n}a_jx_j\Big\rangle && \text{Definition EEM}\\
&= \sum_{i=1}^{n}\sum_{j=1}^{n}\langle a_iλ_ix_i,\,a_jx_j\rangle && \text{Theorem IPVA}\\
&= \sum_{i=1}^{n}\sum_{j=1}^{n}a_iλ_i\overline{a_j}\langle x_i,\,x_j\rangle && \text{Theorem IPSM}\\
&= \sum_{i=1}^{n}a_iλ_i\overline{a_i}\langle x_i,\,x_i\rangle
   + \sum_{i=1}^{n}\sum_{\substack{j=1\\ j\ne i}}^{n}a_iλ_i\overline{a_j}\langle x_i,\,x_j\rangle && \text{Property CACN}\\
&= \sum_{i=1}^{n}a_iλ_i\overline{a_i}(1)
   + \sum_{i=1}^{n}\sum_{\substack{j=1\\ j\ne i}}^{n}a_iλ_i\overline{a_j}(0) && \text{Definition ONS}\\
&= \sum_{i=1}^{n}a_iλ_i\overline{a_i}\\
&= \sum_{i=1}^{n}λ_i\left\vert a_i\right\vert^2 && \text{Definition MCN}
\end{aligned}$$

With non-negative values for each eigenvalue $λ_i$, $1 \le i \le n$, and each squared modulus $|a_i|^2$ also non-negative, this sum is non-negative, which is exactly what Definition PSM requires to establish that $A$ is positive semi-definite.
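For a $2 \times 2$ Hermitian matrix the eigenvalues are the roots of the quadratic characteristic polynomial, so Theorem EPSM can be illustrated without any eigenvalue routine. In the sketch below, the matrices, the witness vector, and the helper name `eigs_2x2_hermitian` are our own illustrations:

```python
import math

def eigs_2x2_hermitian(M):
    """Eigenvalues of a 2x2 Hermitian M, from the characteristic
    polynomial t^2 - (tr M)t + det M; both roots are real (Theorem HMRE)."""
    tr = (M[0][0] + M[1][1]).real
    det = (M[0][0] * M[1][1] - M[0][1] * M[1][0]).real
    disc = math.sqrt(tr * tr - 4 * det)   # never negative for Hermitian M
    return [(tr - disc) / 2, (tr + disc) / 2]

A = [[2 + 0j, 1 - 1j],
     [1 + 1j, 3 + 0j]]   # Hermitian; eigenvalues 1 and 4, so PSD
B = [[1 + 0j, 2 + 0j],
     [2 + 0j, 1 + 0j]]   # Hermitian; eigenvalues -1 and 3, so not PSD

eigs_A = eigs_2x2_hermitian(A)
eigs_B = eigs_2x2_hermitian(B)

# A vector witnessing that B fails Definition PSM: <Bx, x> is negative.
x = [1 + 0j, -1 + 0j]
Bx = [B[0][0] * x[0] + B[0][1] * x[1], B[1][0] * x[0] + B[1][1] * x[1]]
witness = sum(u * v.conjugate() for u, v in zip(Bx, x))
```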

As positive semi-definite matrices are defined to be Hermitian, they are then normal and subject to orthonormal diagonalization (Theorem OD). Now consider the interpretation of orthonormal diagonalization as a rotation to principal axes, a stretch by a diagonal matrix and a rotation back (Subsection OD.OD). For a positive semi-definite matrix, the diagonal matrix has diagonal entries that are the non-negative eigenvalues of the original positive semi-definite matrix. So the “stretching” along each axis is never a reflection.
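The rotate-stretch-rotate reading above can be made concrete for a real symmetric matrix. In this small Python sketch the 45-degree angle and the eigenvalues 3 and 1 are chosen purely for illustration; multiplying out $QDQ^t$ recovers a positive semi-definite matrix:

```python
import math

# Orthonormal diagonalization A = Q D Q^t: rotate to principal axes
# (multiply by Q^t), stretch each axis by a non-negative eigenvalue
# from D, and rotate back (multiply by Q). Since no eigenvalue is
# negative, no axis is reflected.
c, s = math.cos(math.pi / 4), math.sin(math.pi / 4)
Q = [[c, -s], [s, c]]             # rotation by 45 degrees
D = [[3.0, 0.0], [0.0, 1.0]]      # non-negative eigenvalues

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

Qt = [[Q[j][i] for j in range(2)] for i in range(2)]
A = matmul(matmul(Q, D), Qt)      # approximately [[2, 1], [1, 2]]
```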

#### Subsection EXC: Exercises

T20 Suppose that $A$ is a positive semi-definite matrix of size $n$. Prove that there is a square matrix $B$ of size $n$ such that $A = BB^*$.
Contributed by Robert Beezer