Generating a Covariance Matrix

The covariance matrix of a random vector can usually be deduced from the distribution’s properties, or estimated from samples. But how do we generate an arbitrary covariance matrix from scratch? How should we populate the entries of a square matrix so that it forms a legitimate covariance matrix?

First Method

In general, we construct the target covariance matrix \(\Sigma\) by specifying its eigenvalues and its orthonormal eigenvectors (any real symmetric matrix, including a covariance matrix, can be constructed in this way). A diagonal matrix of eigenvalues (denoted \(D\)) is easy to synthesize; since a covariance matrix must be positive semidefinite, the eigenvalues should be nonnegative. It remains to show how to synthesize a square matrix that has orthonormal column vectors.

Let the dimension of the target covariance matrix be \(n\). Given an arbitrary square matrix \(M\) of dimension \(n\), we can decompose \(M M^T\), which is a real symmetric matrix, into \(U \Lambda U^T\), where \(U\) is the orthonormal matrix consisting of \(M M^T\)’s eigenvectors and \(\Lambda\) is the diagonal matrix populated with \(M M^T\)’s eigenvalues, by the spectral theorem for real symmetric matrices. If \(M\) is nonsingular (which holds with probability 1 when \(M\) is sampled randomly), the eigenvalues in \(\Lambda\) are strictly positive, so \(\Lambda^{-1/2}\) below is well defined.

Then we define \[ \begin{align} (M M^T)^{1/2} &\coloneqq U \Lambda^{1/2} U^T \\ (M M^T)^{-1/2} &\coloneqq U \Lambda^{-1/2} U^T \\ \end{align} \] We take \(E = (M M^T)^{-1/2} M\). Now \(E\) will contain the orthonormal column vectors as expected. To verify, \[ \begin{aligned} &E E^T = \left[ (M M^T)^{-1/2} M \right] \left[ M^T ((M M^T)^{-1/2})^T \right] \\ &= \left[ (M M^T)^{-1/2} \right] \left[ M M^T \right] \left[ ((M M^T)^{-1/2})^T \right] \\ &= \left[ U \Lambda^{-1/2} U^T \right] \left[ U \Lambda U^T \right] \left[ U \Lambda^{-1/2} U^T \right] \\ &= U \Lambda^{-1/2} \underbrace{\left[ U^T U \right]}_{I} \Lambda \underbrace{\left[ U^T U \right]}_{I} \Lambda^{-1/2} U^T \\ &= U \Lambda^{-1/2} \Lambda \Lambda^{-1/2} U^T = U U^T = I \end{aligned} \] Thus, the target covariance matrix can be constructed as \(\Sigma = E D E^T\).
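The construction above can be sketched in NumPy as follows; the dimension \(n\), the random seed, and the chosen eigenvalues `d` are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# An arbitrary square matrix M; i.i.d. Gaussian entries make it
# nonsingular with probability 1, so Lambda^{-1/2} exists.
M = rng.standard_normal((n, n))

# Eigendecomposition of the symmetric matrix M M^T = U Lambda U^T.
lam, U = np.linalg.eigh(M @ M.T)

# (M M^T)^{-1/2} = U Lambda^{-1/2} U^T, then E = (M M^T)^{-1/2} M.
inv_sqrt = U @ np.diag(lam ** -0.5) @ U.T
E = inv_sqrt @ M

# E has orthonormal columns: E E^T = I.
assert np.allclose(E @ E.T, np.eye(n))

# Pick the desired (nonnegative) eigenvalues and assemble Sigma.
d = np.array([4.0, 3.0, 2.0, 1.0])
Sigma = E @ np.diag(d) @ E.T

# Sigma is symmetric, and its eigenvalues are exactly the chosen d.
assert np.allclose(Sigma, Sigma.T)
assert np.allclose(np.sort(np.linalg.eigvalsh(Sigma)), np.sort(d))
```

Because we control both \(D\) and \(E\), we know \(\Sigma\)’s spectrum exactly before we even form it.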

Second Method

The easiest way to generate a legitimate covariance matrix is to synthesize an arbitrary square matrix \(A\) of dimension \(n\) and take \(\Sigma = A A^T\) (see here). For any real \(A\), the product \(A A^T\) is symmetric and positive semidefinite, hence a valid covariance matrix.
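A minimal NumPy sketch of this method (the dimension and seed are again arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

# Any square A yields a symmetric positive semidefinite Sigma = A A^T.
A = rng.standard_normal((n, n))
Sigma = A @ A.T

# Symmetry holds exactly; eigenvalues are nonnegative
# up to floating-point error.
assert np.allclose(Sigma, Sigma.T)
assert np.all(np.linalg.eigvalsh(Sigma) >= -1e-10)
```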

This way we obtain a covariance matrix very quickly, but we lose control over it: since \(A\) is totally arbitrary, we can say little about \(\Sigma\)’s eigenvalues, eigenvectors, etc.