The positive semi-definite matrix ties together many concepts, such as the quadratic form, the real symmetric matrix, and the singular value decomposition. It can be quite helpful to glue these things together here.

Quadratic Form

A quadratic function $f$ of $n$ variables, or equivalently of a vector $\x$ of length $n$, is a sum of second-order terms: $$ f(\x) = \sum_{i=1}^n \sum_{j=1}^n c_{ij} x_i x_j $$

The summation above can be written as the matrix product $\x^T A \x$, where $A$ is $n \times n$ and $a_{ij} = c_{ij}$. $\x^T A \x$ is called the quadratic form of $A$.

There is, however, a nicer choice for $A$. Define the $n \times n$ matrix $A$ such that $a_{ij} = \frac{1}{2} (c_{ij} + c_{ji})$. It is straightforward to verify that $A$ is a real symmetric matrix and $$ f(\x) = \x^T A \x $$
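
The symmetrization step can be sketched numerically; a minimal NumPy sketch, with randomly generated coefficients $c_{ij}$ as a hypothetical example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical coefficients c_ij of a quadratic function in n = 3 variables.
n = 3
C = rng.standard_normal((n, n))
x = rng.standard_normal(n)

# Direct double-sum evaluation of f(x) = sum_ij c_ij x_i x_j.
f_direct = sum(C[i, j] * x[i] * x[j] for i in range(n) for j in range(n))

# Symmetrized matrix A with a_ij = (c_ij + c_ji) / 2.
A = (C + C.T) / 2

# Both x^T C x and x^T A x evaluate to f(x), and A is symmetric.
assert np.isclose(f_direct, x @ C @ x)
assert np.isclose(f_direct, x @ A @ x)
assert np.allclose(A, A.T)
```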

Thus the discussion of quadratic forms usually centers on real symmetric matrices.

Positive Semi-definiteness

Let $A$ be a real symmetric matrix. $A$ is positive definite if and only if the quadratic form of $A$ is positive: for every $\x \ne 0$, $\x^T A \x > 0$. $A$ is positive semi-definite if and only if the quadratic form of $A$ is non-negative: for every $\x$, $\x^T A \x \ge 0$.

One possible reason why we define positive definiteness on real symmetric matrices is that any real matrix can be decomposed into the sum of a real symmetric matrix and a real skew-symmetric matrix whose quadratic form is zero: $$ A = \frac{A + A^T}{2} + \frac{A - A^T}{2} $$ The quadratic form of the skew-symmetric part $\frac{A - A^T}{2}$ is identically zero (for skew-symmetric $K$, the scalar $\x^T K \x$ equals its own transpose $-\x^T K \x$), so it makes no contribution to that of $A$; the only component of interest is the real symmetric $\frac{A + A^T}{2}$. So why not just focus on real symmetric matrices?
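
The symmetric/skew-symmetric split can be checked numerically; a NumPy sketch with a random matrix standing in for $A$:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))        # an arbitrary (non-symmetric) real matrix
x = rng.standard_normal(4)

S = (A + A.T) / 2   # symmetric part
K = (A - A.T) / 2   # skew-symmetric part

assert np.allclose(S + K, A)
# The skew-symmetric part contributes nothing to the quadratic form...
assert np.isclose(x @ K @ x, 0.0)
# ...so x^T A x equals x^T S x.
assert np.isclose(x @ A @ x, x @ S @ x)
```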

Note that there are also non-symmetric matrices with a non-negative quadratic form. Such a matrix is not very easy to find; as a guideline, drop the idea of finding one whose eigenvalues are all real. One example: $$ \x^T \begin{bmatrix} 1 & 1 \\ -1 & 1 \end{bmatrix} \x = x_1^2 + x_2^2 \ge 0 $$

Interestingly, the real parts of such a matrix's eigenvalues must be non-negative (and positive when the quadratic form is strictly positive, as in the example above).

PSD and Eigenvalues

Theorem: A real symmetric matrix $A$ is positive semi-definite if and only if $A$’s eigenvalues are non-negative.

Proof:

  • Necessity

    For every eigenpair $(\lambda, \v)$ of $A$ (with $\v \ne 0$), we have $\v^T A \v = \lambda \v^T \v \ge 0$, and since $\v^T \v > 0$, $\lambda \ge 0$.

  • Sufficiency

    Take $A$’s spectral decomposition $A = Q \Lambda Q^T$ where $Q Q^T = I$. For every $\x$, we have $$ \x^T A \x = \overbrace{\x^T Q}^{y^T} \Lambda \overbrace{Q^T \x}^{y} = \sum_{i=1}^n \lambda_i y_i^2 \ge 0 $$

Similarly, a real symmetric matrix $A$ is positive definite if and only if $A$’s eigenvalues are positive. It then follows that a positive definite matrix is invertible, because its eigenvalues are positive and thus its determinant (the product of its eigenvalues) is positive.
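
A quick NumPy sketch of the theorem, using a randomly generated $Q Q^T$ as a hypothetical PSD matrix and `eigvalsh` for symmetric eigenvalue problems:

```python
import numpy as np

rng = np.random.default_rng(2)
Q = rng.standard_normal((4, 4))
A = Q @ Q.T                            # a PSD matrix by construction

# eigvalsh is specialized for symmetric (Hermitian) matrices.
eigvals = np.linalg.eigvalsh(A)
assert np.all(eigvals >= -1e-10)       # non-negative up to round-off

# The quadratic form is non-negative for random probe vectors.
for _ in range(100):
    x = rng.standard_normal(4)
    assert x @ A @ x >= -1e-10
```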

Non-negative Diagonals

The diagonal entries of a PSD matrix $A$ must be non-negative. This is because for the basis vector $e_i = [0, \dots, \underset{i\text{-th}}{1}, \dots, 0]$, we have $e_i^T A e_i = a_{ii} \ge 0$. Likewise, the diagonal entries of a PD matrix must be positive. The converse fails, however: positive diagonal entries alone do not make a matrix PD, e.g. $\begin{bmatrix} 1 & 2 \\ 2 & 1 \end{bmatrix}$ has positive diagonal but an indefinite quadratic form.

Specifically, if a diagonal entry of a PSD matrix $A$ is zero, then the row and the column to which this diagonal entry belongs must be zero.

To show this, we first argue that $a_{ij}^2 \le a_{ii} a_{jj}$. For every $\lambda \in \R$, we have $$ \begin{gathered} (e_i + \lambda e_j)^T A (e_i + \lambda e_j) = e_i^T A e_i + \lambda (e_i^T A e_j + e_j^T A e_i) + \lambda^2 e_j^T A e_j \\ = a_{ii} + 2 a_{ij} \lambda + a_{jj} \lambda^2 \ge 0 \end{gathered} $$ The left-hand side is a quadratic function in $\lambda$ that never goes below zero, so it has at most one real root and its discriminant is non-positive: $4 a_{ij}^2 - 4 a_{ii} a_{jj} \le 0 \Rightarrow a_{ij}^2 \le a_{ii} a_{jj}$. (If $a_{jj} = 0$, the expression is linear in $\lambda$ and non-negativity already forces $a_{ij} = 0$.) Therefore, if the $j$-th diagonal entry of $A$ is zero, then for every $i = 1,\dots,n$ we have $a_{ij}^2 \le 0 \Rightarrow a_{ij} = 0$.

Another implication is that any $2 \times 2$ principal submatrix $\begin{bmatrix} a_{ii} & a_{ij} \\ a_{ji} & a_{jj} \end{bmatrix}$ of $A$ is always PSD. In fact, any submatrix obtained by removing the row and column of a main diagonal entry of a PSD matrix is PSD. Let $A_{kk}$ be the matrix obtained by removing $A$’s $k$-th row and column. To show the claim, take $\x \in \R^n$ and let $\bar \x = \x$ except that $\bar x_k = 0$. $A_{kk}$’s quadratic form can be written as $$ \sum_{i=1, i \ne k}^{n} \sum_{j=1, j \ne k}^n a_{ij} x_i x_j = \sum_{i=1}^n \sum_{j=1}^n a_{ij} x_i x_j \mathbb{1}[i \ne k, j \ne k] = \bar \x^T A \bar \x \ge 0 $$
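
The principal-submatrix claim can be probed numerically; a NumPy sketch with a random PSD matrix as a hypothetical example:

```python
import numpy as np

rng = np.random.default_rng(3)
Q = rng.standard_normal((5, 5))
A = Q @ Q.T                      # a PSD matrix by construction

# Deleting the k-th row and column leaves a PSD principal submatrix.
for k in range(5):
    A_kk = np.delete(np.delete(A, k, axis=0), k, axis=1)
    assert np.all(np.linalg.eigvalsh(A_kk) >= -1e-10)
```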

Decomposition

Definitive Decomposition

A positive semi-definite matrix $A$ can be decomposed into the product of a square matrix $Q$ and its transpose $Q^T$. In fact, a real symmetric matrix is positive semi-definite if and only if it can be decomposed this way: for the converse, note that $\x^T Q Q^T \x = \| Q^T \x \|^2 \ge 0$ for every $\x$.

Consider that a real symmetric matrix can be orthogonally diagonalized: $$ A = P \Lambda P^T $$ When $A$ is PSD, its eigenvalues are all non-negative. Thus, $\Lambda$ can be written as $\Lambda^{1/2} \Lambda^{1/2}$ and $$ \begin{aligned} A &= P (\Lambda^{1/2} \Lambda^{1/2}) P^T \\ &= P \Lambda^{1/2} (\Lambda^{1/2})^T P^T \\ &= \underbrace{(P \Lambda^{1/2})}_{Q} \underbrace{(P \Lambda^{1/2})^T}_{Q^T} \end{aligned} $$ $A$ is positive definite if and only if $Q$ is invertible.
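
A minimal NumPy sketch of this construction, assuming a randomly generated PSD matrix: compute $P$ and $\Lambda$ with `eigh`, form $Q = P \Lambda^{1/2}$, and verify $Q Q^T = A$.

```python
import numpy as np

rng = np.random.default_rng(4)
M = rng.standard_normal((4, 4))
A = M @ M.T                                  # a PSD matrix to decompose

# Orthogonal diagonalization A = P Lambda P^T.
lam, P = np.linalg.eigh(A)
lam = np.clip(lam, 0.0, None)                # clamp tiny negative round-off
Q = P @ np.diag(np.sqrt(lam))                # Q = P Lambda^{1/2}

assert np.allclose(Q @ Q.T, A)
```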

Square Root

A real symmetric matrix $A$ is PSD if and only if there is a PSD matrix $B$ satisfying $A = BB$. This $B$ is unique and is called the non-negative square root of $A$ (there are non-PSD matrices whose squares also equal $A$). $B$ is usually denoted $A^{1/2}$.

Consider that a real symmetric matrix $A$ can be orthogonally diagonalized as $A = P \Lambda P^T$. Setting $B = P \Lambda^{1/2} P^T$ would give the non-negative square root of $A$.
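
A NumPy sketch of the non-negative square root, again with a random PSD matrix as a hypothetical example:

```python
import numpy as np

rng = np.random.default_rng(5)
M = rng.standard_normal((4, 4))
A = M @ M.T                                  # a PSD matrix

# B = P Lambda^{1/2} P^T is the non-negative square root of A.
lam, P = np.linalg.eigh(A)
B = P @ np.diag(np.sqrt(np.clip(lam, 0.0, None))) @ P.T

assert np.allclose(B @ B, A)                     # B squares back to A
assert np.allclose(B, B.T)                       # B is symmetric...
assert np.all(np.linalg.eigvalsh(B) >= -1e-10)   # ...and PSD
```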

Note that this cannot be naively extrapolated to the claim that “any real symmetric matrix (not necessarily PSD) can be decomposed into two identical symmetric matrices (not necessarily PSD)”. Just consider that, for symmetric $B$, the product $BB = B^T B$ is always positive semi-definite; but not all real symmetric matrices are positive semi-definite.

In fact, any diagonalizable matrix has a square root (possibly complex), though not necessarily a PSD one.

Cholesky Decomposition

A PSD matrix $A$ can be written as $A = L L^T$, where $L$ is a lower-triangular matrix with non-negative diagonal. If $A$ is positive definite, then the diagonal of $L$ is positive and the Cholesky decomposition is unique.
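
A NumPy sketch using `np.linalg.cholesky`, which accepts only positive definite input (it raises `LinAlgError` on a singular or indefinite matrix); the shift by $4I$ below is just to make the random matrix safely PD:

```python
import numpy as np

rng = np.random.default_rng(6)
M = rng.standard_normal((4, 4))
A = M @ M.T + 4 * np.eye(4)      # shifted to be safely positive definite

L = np.linalg.cholesky(A)        # lower-triangular Cholesky factor

assert np.allclose(L @ L.T, A)
assert np.allclose(L, np.tril(L))        # L is lower-triangular
assert np.all(np.diag(L) > 0)            # with positive diagonal
```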

Schur Complement

Applying the Schur complement to a symmetric block matrix gives: if $C$ is positive definite, then

$$ \begin{bmatrix} A & B \\ B^T & C \end{bmatrix} \in \mathcal{S}_+^{m+n} \iff A - B C^{-1} B^T \in \mathcal{S}_+^m $$

Symmetrically, if $A$ is positive definite, the block matrix is PSD if and only if $C - B^T A^{-1} B \in \mathcal{S}_+^n$.
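
A NumPy sketch of the Schur-complement condition, with a random positive definite block matrix as a hypothetical example:

```python
import numpy as np

rng = np.random.default_rng(7)
m, n = 3, 2
M = rng.standard_normal((m + n, m + n))
X = M @ M.T + 1e-3 * np.eye(m + n)       # positive definite block matrix

A, B, C = X[:m, :m], X[:m, m:], X[m:, m:]

# With C positive definite, X is PSD iff the Schur complement
# S = A - B C^{-1} B^T is PSD.
S = A - B @ np.linalg.solve(C, B.T)
assert np.all(np.linalg.eigvalsh(C) > 0)
assert np.all(np.linalg.eigvalsh(S) >= -1e-10)
```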

Sylvester’s Condition

Sylvester’s criterion states that an $n \times n$ real symmetric matrix $M$ is positive definite if and only if all of the following matrices have a positive determinant:

  • the upper left 1-by-1 corner of $M$,
  • the upper left 2-by-2 corner of $M$,
  • the upper left 3-by-3 corner of $M$,
  • …
  • $M$ itself.

Note that non-negative leading principal minors are not sufficient for positive semi-definiteness: $\begin{bmatrix} 0 & 0 \\ 0 & -1 \end{bmatrix}$ has all leading principal minors equal to zero, yet it is not PSD. The semi-definite analogue requires all principal minors, not just the leading ones, to be non-negative.
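
The criterion translates directly into a determinant check; a minimal NumPy sketch (`is_pd_sylvester` is a hypothetical helper name):

```python
import numpy as np

def is_pd_sylvester(M: np.ndarray) -> bool:
    """Check positive definiteness via the leading principal minors."""
    return all(np.linalg.det(M[:k, :k]) > 0 for k in range(1, M.shape[0] + 1))

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])      # eigenvalues 1 and 3: positive definite
B = np.array([[1.0, 2.0],
              [2.0, 1.0]])      # eigenvalues 3 and -1: indefinite

assert is_pd_sylvester(A)
assert not is_pd_sylvester(B)
```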

External Materials

Course Notes - 29  Positive definite matrices (adityam.github.io)