Real Symmetric Matrix

Let \(A\) be an \(n \times n\) real-valued symmetric matrix. Its properties are as follows.

Real-valued Eigenvalues and Eigenvectors

Its eigenvalues, and hence its eigenvectors, are real-valued. Suppose \(A\) has a (possibly complex) eigenvalue \(\lambda\) with eigenvector \(x \ne 0\), so that \(Ax = \lambda x\). Denoting \(\lambda\)’s and \(x\)’s complex conjugates by \(\bar \lambda\) and \(\bar x\) respectively, and using that \(A\) is real (\(\bar A = A\)) and symmetric (\(A^T = A\)), conjugating both sides of \(Ax = \lambda x\) and then transposing gives \[ \begin{gather} \left. \begin{aligned} A\bar x &= \overline{Ax} \\ &= \overline{\lambda x} \\ &= \bar \lambda \bar x \end{aligned} \right\} \Rightarrow \begin{aligned} (A\bar x)^T &= (\bar \lambda \bar x)^T \\ \bar x^T A^T &= \bar \lambda \bar x^T \\ \bar x^T A &= \bar \lambda \bar x^T \end{aligned} \end{gather} \] Left-multiply \(\bar x^T\) on both sides of \(Ax = \lambda x\) to give: \[ \bar x^TAx = \bar x^T \lambda x = \lambda \bar x^T x \] Right-multiply \(x\) on both sides of \(\bar x^TA = \bar \lambda \bar x^T\) to give: \[ \bar x^TAx = \bar \lambda \bar x^T x \] Therefore \(\bar \lambda \bar x^T x = \lambda \bar x^T x\). Since \(\bar x^Tx = \sum_i |x_i|^2 > 0\), we have \(\bar \lambda = \lambda\). In other words, \(\lambda\) is real-valued; and since \(x\) then solves the real linear system \((A - \lambda I)x = 0\), the corresponding eigenvector can be taken real-valued as well.
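As a quick numerical sanity check of this property (a sketch using NumPy; the random matrix, the seed, and the variable names are illustrative choices, not part of the argument):

```python
import numpy as np

# Build a random real symmetric matrix by symmetrizing a random matrix.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = (M + M.T) / 2                       # A == A.T

# Use the general eigensolver, which does not assume symmetry.
eigvals, eigvecs = np.linalg.eig(A)

print(np.allclose(eigvals.imag, 0))     # True: eigenvalues have zero imaginary part
print(np.allclose(eigvecs.imag, 0))     # True: eigenvectors can be taken real-valued
```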

Sum of Real Symmetric Matrices

Let \(A\) and \(B\) be two real symmetric matrices. Let \(\lambda^-, \lambda^+\) be \(A\)’s smallest and largest eigenvalues, \(\mu^-, \mu^+\) be \(B\)’s smallest and largest eigenvalues, and \(\gamma^-, \gamma^+\) be \(A+B\)’s smallest and largest eigenvalues. Then, since \(x^T(A+B)x = x^TAx + x^TBx\) for every unit vector \(x\), it can be derived that \[ \lambda^- + \mu^- \le \gamma^- \le \gamma^+ \le \lambda^+ + \mu^+ \]
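This bound can be illustrated numerically (a sketch using NumPy; the random symmetric matrices here are just example inputs):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
A = rng.standard_normal((n, n)); A = (A + A.T) / 2   # random symmetric A
B = rng.standard_normal((n, n)); B = (B + B.T) / 2   # random symmetric B

lam = np.linalg.eigvalsh(A)        # eigenvalues of A in ascending order
mu = np.linalg.eigvalsh(B)         # eigenvalues of B in ascending order
gam = np.linalg.eigvalsh(A + B)    # eigenvalues of A + B in ascending order

# Smallest/largest eigenvalues of A + B are bounded by the sums of the extremes.
print(lam[0] + mu[0] <= gam[0] <= gam[-1] <= lam[-1] + mu[-1])   # True
```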

Orthogonal Eigenvectors

Its eigenvectors corresponding to different eigenvalues are orthogonal. Arbitrarily take two of \(A\)’s eigenvectors \(v_1, v_2\) with eigenvalues \(\lambda_1, \lambda_2\) such that \(\lambda_1 \ne \lambda_2\); we have

\[ \begin{align} &\begin{aligned} (Av_1)^Tv_2 &= v_1^TA^Tv_2 \\ &= v_1^TAv_2 \\ &= v_1^T \lambda_2v_2 \\ &= \lambda_2v_1^Tv_2 \\ \end{aligned} \\ &\begin{aligned} (Av_1)^Tv_2 &= \lambda_1v_1^Tv_2 \end{aligned} \end{align} \]

Therefore, \[ \begin{aligned} \lambda_1v_1^Tv_2 &= \lambda_2v_1^Tv_2 \\ (\lambda_1 - \lambda_2)v_1^Tv_2 &= 0 \end{aligned} \]

Since \(\lambda_1 \ne \lambda_2\), we have \(v_1^Tv_2 = 0\) and thus \(v_1\) and \(v_2\) are orthogonal.
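A small numerical check of this orthogonality (a sketch using NumPy; a random symmetric matrix almost surely has distinct eigenvalues, which is what the example relies on):

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((4, 4))
A = (M + M.T) / 2                        # random symmetric matrix, distinct eigenvalues

# The general eigensolver does not enforce orthogonality, yet for a symmetric
# matrix with distinct eigenvalues the returned unit eigenvectors are orthogonal.
w, V = np.linalg.eig(A)                  # columns of V are unit-norm eigenvectors
print(np.allclose(V.T @ V, np.eye(4)))   # True up to floating-point error
```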

Diagonalizable

It has \(n\) linearly independent eigenvectors and is thus diagonalizable. To see this, note that eigenvectors from different eigenspaces are orthogonal and thus linearly independent, while eigenvectors within the same eigenspace are also linearly independent because they are chosen to form a basis of that eigenspace.
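A minimal sketch of this diagonalization in NumPy (the random matrix is only an example):

```python
import numpy as np

rng = np.random.default_rng(3)
M = rng.standard_normal((5, 5))
A = (M + M.T) / 2                   # random symmetric matrix

w, P = np.linalg.eig(A)             # eigenvalues w, eigenvectors as columns of P

print(np.linalg.matrix_rank(P) == P.shape[0])             # True: n independent eigenvectors
print(np.allclose(A, P @ np.diag(w) @ np.linalg.inv(P)))  # True: A = P Λ P⁻¹
```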

“Easily Invertible”

Further to the diagonalizability property, suppose \(A = P\Lambda P^{-1}\). If \(A\) is not invertible, then by the relation between the matrix rank and the eigenvalues, some of \(\Lambda\)’s diagonal entries are zero. Adding \(\lambda I\) to \(A\), with \(\lambda\) chosen so that \(-\lambda\) is not an eigenvalue of \(A\), gives \[ \begin{aligned} A^\prime &= P\Lambda P^{-1} + \lambda PIP^{-1} \\ &= P(\Lambda + \lambda I)P^{-1} \end{aligned} \] where \(\lambda I\) shifts every diagonal entry of \(\Lambda\) by \(\lambda\), so none of them remains zero. Thus \(A^\prime\) has \(n\) nonzero eigenvalues and is invertible: a singular symmetric matrix \(A\) becomes invertible by adding \(\lambda I\).
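A minimal sketch of this shift trick (the \(2 \times 2\) matrix and the value \(\lambda = 1\) are example choices; any \(\lambda\) whose negative is not an eigenvalue of \(A\) works):

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 1.0]])          # symmetric, rank 1, hence singular
lam = 1.0                           # -lam is not an eigenvalue of A (eigenvalues are 0 and 2)

print(np.linalg.matrix_rank(A))               # 1: A is not invertible
A_shifted = A + lam * np.eye(2)
print(np.linalg.matrix_rank(A_shifted))       # 2: A + λI is invertible
print(np.linalg.eigvalsh(A_shifted))          # [1. 3.]: every eigenvalue shifted by λ
```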

Orthogonally Diagonalizable

Its diagonalization can be written in the form \(A = P\Lambda P^T\), where \(P^TP = I\), by properly selecting orthonormal eigenvectors.

Eigenvectors from different eigenspaces are already orthogonal. Eigenvectors from the same eigenspace are independent but not necessarily orthogonal. However, any linear combination of eigenvectors from the same eigenspace is still an eigenvector for that eigenvalue. Thus we can apply the Gram-Schmidt process within each eigenspace and obtain an orthogonal basis for it.

Finally, we pull together all the orthogonal eigenvectors, normalize them to unit vectors, and use them as the columns of the orthogonal matrix \(P\).

An orthogonal diagonalization is a diagonalization, and when the eigenvalues are non-negative it is also a singular value decomposition. In fact, an \(n \times n\) matrix \(A\) is orthogonally diagonalizable if and only if \(A\) is a symmetric matrix. Such an orthogonal diagonalization is also referred to as the spectral decomposition.
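In NumPy, `np.linalg.eigh` computes exactly this kind of decomposition for a symmetric input; a minimal sketch (the random matrix is only an example):

```python
import numpy as np

rng = np.random.default_rng(4)
M = rng.standard_normal((5, 5))
A = (M + M.T) / 2                   # random symmetric matrix

# eigh is the symmetric/Hermitian eigensolver: it returns ascending eigenvalues
# and an eigenvector matrix with orthonormal columns.
w, P = np.linalg.eigh(A)

print(np.allclose(P.T @ P, np.eye(5)))        # True: PᵀP = I
print(np.allclose(A, P @ np.diag(w) @ P.T))   # True: A = P Λ Pᵀ (spectral decomposition)
```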

Commutativity

If the product of two symmetric matrices \(A\) and \(B\) is symmetric, then \(A\) and \(B\) commute, i.e. \(AB = BA\). This is simply because \[ \begin{aligned} A B &= (A B)^T &&\iff \\ A B &= B^T A^T &&\iff \\ A B &= B A \end{aligned} \]
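A small numerical illustration (a sketch; the particular matrices are example choices showing that the two conditions fail or hold together):

```python
import numpy as np

A = np.diag([1.0, 2.0, 3.0])              # symmetric (diagonal)
B = np.array([[0.0, 1.0, 0.0],
              [1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0]])           # symmetric

AB = A @ B
print(np.allclose(AB, AB.T))              # False: AB is not symmetric here...
print(np.allclose(AB, B @ A))             # False: ...and correspondingly A and B do not commute

C = np.diag([4.0, 5.0, 6.0])              # another symmetric matrix, this one commuting with A
print(np.allclose(A @ C, C @ A))          # True
print(np.allclose(A @ C, (A @ C).T))      # True: and their product is symmetric
```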

Covariance Matrix

A covariance matrix is a special kind of real symmetric matrix: it is of the form \(A A^T\). It is positive semi-definite, and thus its eigenvalues are non-negative.

In fact, every matrix of the form \(A A^T\) is positive semi-definite, since \(x^T (A A^T) x = (A^T x)^T (A^T x) = \lVert A^T x \rVert^2 \ge 0\) for every \(x\).
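A quick numerical check (a sketch; the random \(4 \times 6\) matrix is an example stand-in for a data matrix):

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.standard_normal((4, 6))
C = A @ A.T                                     # symmetric, of the form A Aᵀ

print(np.allclose(C, C.T))                      # True: symmetric
print(np.all(np.linalg.eigvalsh(C) >= -1e-12))  # True: eigenvalues non-negative (up to round-off)
```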

External Material

real-valued eigenvalues and orthogonal eigenvectors
