Section 1.8 Positive Semi-Definite Matrices
Positive semi-definite matrices (and their cousins, positive definite matrices) are square matrices which in many ways behave like non-negative (respectively, positive) real numbers. These results will be useful as we study various matrix decompositions in Chapter 2.
Definition 1.8.1. Positive Semi-Definite Matrix.
A square matrix \(A\) of size \(n\) is positive semi-definite if \(A\) is Hermitian and for all \(\vect{x}\in\complex{n}\text{,}\) \(\innerproduct{\vect{x}}{A\vect{x}}\geq 0\text{.}\)
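As an illustrative sketch (not part of the text), the definition can be tested numerically for a small Hermitian matrix: sample many vectors \(\vect{x}\) and confirm the quadratic form \(\innerproduct{\vect{x}}{A\vect{x}}\) is real and non-negative. The helper names `inner` and `matvec`, and the matrix chosen, are hypothetical choices for this example.

```python
# Sketch: numerically checking Definition 1.8.1 on a sample Hermitian matrix.
# Convention: <u, v> = sum(conj(u_i) * v_i), conjugate-linear in the first slot.
import random

def inner(u, v):
    """Inner product on C^n, conjugate-linear in the first argument."""
    return sum(ui.conjugate() * vi for ui, vi in zip(u, v))

def matvec(A, x):
    """Matrix-vector product A x."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

# A sample Hermitian matrix (equal to its own conjugate transpose);
# its eigenvalues are 1 and 4, so it is positive semi-definite.
A = [[2 + 0j, 1 - 1j],
     [1 + 1j, 3 + 0j]]

random.seed(1)
for _ in range(1000):
    x = [complex(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(2)]
    q = inner(x, matvec(A, x))
    assert abs(q.imag) < 1e-12   # Hermitian => <x, Ax> is a real number
    assert q.real >= -1e-12      # positive semi-definite => non-negative
```

A random search like this can only provide evidence, not a proof; the theorems that follow characterize positive semi-definiteness exactly.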
Theorem 1.8.2. Creating Positive Semi-Definite Matrices.
Suppose that \(A\) is any \(m\times n\) matrix. Then the matrices \(\adjoint{A}A\) and \(A\adjoint{A}\) are positive semi-definite matrices.
Proof.
We will give the proof for the first matrix; the proof for the second is entirely similar. First we check that \(\adjoint{A}A\) is Hermitian, with an appeal to Definition HM,
\begin{align*}
\adjoint{\left(\adjoint{A}A\right)}&=\adjoint{A}\adjoint{\left(\adjoint{A}\right)}=\adjoint{A}A\text{.}
\end{align*}
Second, for any \(\vect{x}\in\complex{n}\text{,}\) Theorem AIP and Theorem PIP give,
\begin{align*}
\innerproduct{\vect{x}}{\adjoint{A}A\vect{x}}&=\innerproduct{A\vect{x}}{A\vect{x}}\geq 0\text{,}
\end{align*}
which is the second condition for a positive semi-definite matrix.
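To illustrate Theorem 1.8.2 numerically (a sketch with hypothetical helper names `adjoint`, `matmul`, `matvec`, `inner`, none from the text), one can start from a rectangular matrix \(A\), form \(\adjoint{A}A\), and confirm both conditions of the definition: the product is Hermitian, and its quadratic form is non-negative.

```python
# Sketch: for an arbitrary m x n matrix A, the n x n matrix A*A is Hermitian
# and <x, A*Ax> = <Ax, Ax> = ||Ax||^2 >= 0, as in Theorem 1.8.2.
import random

def adjoint(A):
    """Conjugate transpose of A."""
    return [[A[i][j].conjugate() for i in range(len(A))]
            for j in range(len(A[0]))]

def matmul(A, B):
    """Matrix product A B."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def inner(u, v):
    return sum(ui.conjugate() * vi for ui, vi in zip(u, v))

# A sample 3 x 2 matrix: not square, certainly not Hermitian.
A = [[1 + 2j, 0 + 1j],
     [3 + 0j, 1 - 1j],
     [0 + 0j, 2 + 0j]]

B = matmul(adjoint(A), A)          # the 2 x 2 matrix A*A
assert B == adjoint(B)             # first condition: Hermitian

random.seed(2)
for _ in range(500):
    x = [complex(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(2)]
    q = inner(x, matvec(B, x))
    # second condition: <x, A*Ax> is real and non-negative
    assert abs(q.imag) < 1e-9 and q.real >= -1e-9
```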
Theorem 1.8.3. Eigenvalues of Positive Semi-definite Matrices.
Suppose that \(A\) is a Hermitian matrix. Then \(A\) is a positive semi-definite matrix if and only if every eigenvalue \(\lambda\) of \(A\) has \(\lambda\geq 0\text{.}\)
Proof.
Notice first that since we are considering only Hermitian matrices in this theorem, it is always possible to compare eigenvalues with the real number zero, since eigenvalues of Hermitian matrices are all real numbers (Theorem HMRE).
(⇒) Let \(\vect{x}\neq 0\) be an eigenvector of \(A\) for \(\lambda\text{.}\) Since \(A\) is positive semi-definite,
\begin{align*}
0&\leq\innerproduct{\vect{x}}{A\vect{x}}=\innerproduct{\vect{x}}{\lambda\vect{x}}=\lambda\innerproduct{\vect{x}}{\vect{x}}\text{.}
\end{align*}
By Theorem PIP we know \(\innerproduct{\vect{x}}{\vect{x}}\gt 0\text{,}\) so we conclude that \(\lambda\geq 0\text{.}\)
(⇐) Let \(n\) denote the size of \(A\text{.}\) Suppose that \(\scalarlist{\lambda}{n}\) are the eigenvalues of the Hermitian matrix \(A\text{,}\) each of which is non-negative. Let \(B=\set{\vectorlist{\vect{x}}{n}}\) be a set of associated eigenvectors for these eigenvalues. Since a Hermitian matrix is normal (Definition 1.7.1), Theorem OBNM allows us to choose \(B\) to also be an orthonormal basis of \(\complex{n}\text{.}\) Choose any \(\vect{x}\in\complex{n}\) and let \(\scalarlist{a}{n}\) be the scalars guaranteed by the spanning property of the basis \(B\text{,}\) so \(\vect{x}=\sum_{i=1}^{n}a_i\vect{x}_i\text{.}\) Since we have presumed \(A\) is Hermitian, we need only check the second condition of the definition. The use of an orthonormal basis provides the simplification for the last equality.
\begin{align*}
\innerproduct{\vect{x}}{A\vect{x}}&=\innerproduct{\sum_{i=1}^{n}a_i\vect{x}_i}{A\sum_{j=1}^{n}a_j\vect{x}_j}\\
&=\innerproduct{\sum_{i=1}^{n}a_i\vect{x}_i}{\sum_{j=1}^{n}a_j\lambda_j\vect{x}_j}\\
&=\sum_{i=1}^{n}\sum_{j=1}^{n}\conjugate{a_i}a_j\lambda_j\innerproduct{\vect{x}_i}{\vect{x}_j}\\
&=\sum_{i=1}^{n}\conjugate{a_i}a_i\lambda_i
\end{align*}
The expression \(\conjugate{a_i}a_i\) is the modulus of \(a_i\) squared, hence is always non-negative. With the eigenvalues assumed non-negative, this final sum is clearly non-negative as well, as desired.
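The equivalence of Theorem 1.8.3 can also be seen numerically, sketched here for the \(2\times 2\) case (the helper names `eigenvalues_2x2_hermitian` and `quad_form`, and the sample matrices, are hypothetical, not from the text): eigenvalues come from the characteristic polynomial \(t^2-\trace{A}\,t+\detname{A}\), and their signs predict whether the quadratic form can go negative.

```python
# Sketch: Theorem 1.8.3 for 2 x 2 Hermitian matrices.  Non-negative
# eigenvalues <=> the quadratic form <x, Ax> is never negative.
import math
import random

def eigenvalues_2x2_hermitian(A):
    """Roots of t^2 - tr(A) t + det(A); both real when A is Hermitian."""
    tr = (A[0][0] + A[1][1]).real
    det = (A[0][0] * A[1][1] - A[0][1] * A[1][0]).real
    disc = math.sqrt(tr * tr - 4 * det)
    return ((tr - disc) / 2, (tr + disc) / 2)

def quad_form(A, x):
    """The real number <x, Ax> for Hermitian A."""
    Ax = [sum(a * xi for a, xi in zip(row, x)) for row in A]
    return sum(xi.conjugate() * yi for xi, yi in zip(x, Ax)).real

# Hermitian with eigenvalues 1 and 3: positive semi-definite.
P = [[2 + 0j, 1 + 0j], [1 + 0j, 2 + 0j]]
# Hermitian with eigenvalues -1 and 3: not positive semi-definite.
N = [[1 + 0j, 2 + 0j], [2 + 0j, 1 + 0j]]

assert min(eigenvalues_2x2_hermitian(P)) >= 0
assert min(eigenvalues_2x2_hermitian(N)) < 0

random.seed(3)
xs = [[complex(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(2)]
      for _ in range(500)]
assert all(quad_form(P, x) >= -1e-12 for x in xs)   # PSD: never negative
assert any(quad_form(N, x) < 0 for x in xs)         # some x witnesses failure
```

For the indefinite matrix \(N\text{,}\) any vector close to the eigenvector for \(\lambda=-1\) (such as \((1,-1)\)) makes the quadratic form negative, which is why random sampling finds a witness.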