Generate Symmetric Positive Definite Matrices

Date: Mon, 11 Jun 2007 00:48:25 -0500 From: Robert Kern <robert.kern@gmail.com>

David Cournapeau wrote:

Hi there,

    I need to generate random positive definite matrices, mainly for
testing purposes. Before, I was generating them using a random matrix A
given by randn and computing A'A. Unfortunately, if A is singular, so
is A'A. Is there a better way to do this than testing whether A is
singular? They do not need to follow a specific distribution (but I
would like to avoid them following a really special "pattern").

I would generate N random direction vectors (draw from a multivariate normal distribution with eye(N) as the covariance matrix and normalize the samples to be unit vectors). Resample any vector which happens to be nearly parallel to another (i.e. the dot product is within some eps of 1). Now, form a correlation matrix using the dot products of each of the unit vectors. Draw N random positive values from some positive distribution like log-normal or gamma. Multiply this vector on either side of the correlation matrix:

v * corr * v[:,newaxis]

You now have a random positive definite matrix which is even somewhat interpretable.
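A minimal NumPy sketch of this recipe (the function name, the log-normal choice for the positive scales, the tolerance, and redrawing the whole set of directions rather than individual vectors are my own simplifications):

import numpy as np

def random_spd(n, eps=1e-6, rng=None):
    # Illustrative sketch of the recipe above; names and defaults are mine.
    rng = np.random.default_rng(rng)
    while True:
        # n random directions: rows drawn from N(0, eye(n)), normalized to unit length.
        u = rng.standard_normal((n, n))
        u /= np.linalg.norm(u, axis=1, keepdims=True)
        # Pairwise dot products form the "correlation" (Gram) matrix.
        corr = u @ u.T
        # Redraw if any two directions are nearly parallel
        # (absolute dot product within eps of 1).
        off_diag = np.abs(corr[~np.eye(n, dtype=bool)])
        if np.all(off_diag < 1.0 - eps):
            break
    # n random positive scales (log-normal here), multiplied in on either side.
    v = rng.lognormal(size=n)
    return v * corr * v[:, np.newaxis]

The result is diag(v) corr diag(v): symmetric, positive definite as long as the directions are linearly independent, and its entries can still be read as scales and correlations.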

— Robert Kern

Date: Mon, 11 Jun 2007 03:01:20 -0400 From: "Anne Archibald" <peridot.faceted@gmail.com>

On 10/06/07, David Cournapeau <david@ar.media.kyoto-u.ac.jp> wrote:

Hi there,

    I need to generate random positive definite matrices, mainly for
testing purposes. Before, I was generating them using a random matrix A
given by randn and computing A'A. Unfortunately, if A is singular, so
is A'A. Is there a better way to do this than testing whether A is
singular? They do not need to follow a specific distribution (but I
would like to avoid them following a really special "pattern").

Every symmetric positive definite matrix (you didn't say symmetric but I assume you want it...) can be diagonalized using an orthogonal matrix. That is, you can always factor it as O'DO, where D is diagonal and has positive elements on the diagonal. So to get a random one, do it backwards: pick a set of eigenvalues from some distribution, then for each pair of axes, rotate the matrix by a random angle in the plane they form (which you can do with a cos/sin/-sin/cos matrix) with a randomly-decided reflection. This gives a distribution that's invariant under orthonormal change of basis; your only nonuniformities come in the choice of eigenvalues, and those you will probably want to control in testing. Eigenvalues of wildly differing sizes tend to lead to big numerical headaches, so you can experiment with the size of headache you want to give your code. If you want them all roughly the same size you could take normal variates and square them, but for better control I'd go with exponentiating a normal distribution.
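A rough NumPy sketch of this O'DO construction (the function name is mine; the pairwise rotations with randomly decided reflections follow the description above, and the eigenvalues in the example come from exponentiating normal variates, as suggested):

import numpy as np

def random_spd_from_eigenvalues(eigenvalues, rng=None):
    rng = np.random.default_rng(rng)
    w = np.asarray(eigenvalues, dtype=float)   # chosen positive eigenvalues
    n = w.size
    a = np.diag(w)                             # start from the diagonal matrix D
    for i in range(n):
        for j in range(i + 1, n):
            theta = rng.uniform(0.0, 2.0 * np.pi)
            c, s = np.cos(theta), np.sin(theta)
            g = np.eye(n)                      # rotation by theta in the (i, j) plane
            g[[i, j], [i, j]] = c
            g[i, j], g[j, i] = s, -s
            if rng.random() < 0.5:             # randomly-decided reflection
                g[i, :] *= -1.0
            a = g @ a @ g.T                    # orthogonal similarity: eigenvalues unchanged
    return a

# Example: eigenvalues from an exponentiated normal distribution.
rng = np.random.default_rng(0)
m = random_spd_from_eigenvalues(np.exp(rng.standard_normal(5)), rng)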

There's an alternative, probably faster, approach based on the uniqueness of the Cholesky square root of a matrix - just choose your eigenvalues, again, put their square roots on the diagonal, and randomly generate the upper triangle. Then form A'A and you should be able to get an arbitrary (symmetric) positive definite matrix. Unless you are cleverer than I am about choosing the distributions of the entries, it won't be nicely direction-independent.
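A possible sketch of this construction, assuming standard-normal entries for the strict upper triangle (that choice of distribution and the function name are mine; the comments only claim positive definiteness, which follows from the positive diagonal of the triangular factor):

import numpy as np

def random_spd_cholesky(values, rng=None):
    rng = np.random.default_rng(rng)
    d = np.sqrt(np.asarray(values, dtype=float))   # square roots of the chosen positive values
    n = d.size
    # Upper-triangular factor A: chosen positive diagonal plus a random strict upper triangle.
    a = np.triu(rng.standard_normal((n, n)), k=1) + np.diag(d)
    # A'A is symmetric, and positive definite because A has a positive diagonal
    # and is therefore invertible.
    return a.T @ a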

Now, a sort-of interesting question is: is there a natural distribution on the cone of positive definite matrices one could hope to draw from? Apart from direction invariance, the only other criterion I can think of is some sort of convexity condition - all the matrices on the line segment connecting two positive definite matrices are positive definite - so perhaps one could want the probability density for each matrix on the segment to be at least as high as at the endpoints? It's not clear to me that this is a sufficient (or appropriate) constraint on the PDF.

Anne

