Solve for a matrix A if A times A transpose equals B?

Hi,

Here is my question. A and B are matrices. If A times A-transpose = B, what is the explicit formula for solving for A?

Thank you!

Comments

  • In general it is not possible to solve for A. For example, taking B = I to be the identity matrix, the condition AA^T = I says exactly that the columns of A are orthonormal (or equivalently that the rows of A are orthonormal). It isn't possible to determine A from this fact alone: it could be that A = I, or that A = -I, or that A is any other diagonal matrix with each diagonal entry taken from {1, -1}; and if A and B are n×n matrices with n > 1, these examples are just the tip of the iceberg.
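    A quick numerical illustration of this non-uniqueness (a sketch assuming a NumPy environment; the specific matrices below are my own examples, not from the original post):

```python
import numpy as np

# Three different matrices A, each satisfying A A^T = I:
# the identity, a diagonal matrix with entries from {1, -1},
# and a rotation matrix (the "tip of the iceberg" for n > 1).
A1 = np.eye(2)
A2 = np.diag([1.0, -1.0])
theta = 0.3
A3 = np.array([[np.cos(theta), -np.sin(theta)],
               [np.sin(theta),  np.cos(theta)]])

for A in (A1, A2, A3):
    # All three satisfy the same equation, so B = I cannot pin down A.
    assert np.allclose(A @ A.T, np.eye(2))
```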

    The equation is not solvable at all if B isn't positive and symmetric (I'm assuming A and B have real entries here; the discussion must be changed slightly for complex entries). Recall that a matrix M is positive if for every column vector x the dot product of Mx with x is nonnegative, and that M is symmetric if M^T = M. Any matrix M of the form M = AA^T has both of these properties: (AA^T)^T = (A^T)^T A^T = AA^T, and the dot product of AA^Tx with x equals the dot product of A^Tx with itself, which is nonnegative. So B clearly has to have these properties.
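    These two necessary conditions are easy to test numerically. A minimal sketch in NumPy (the helper name `is_symmetric_psd` and the example matrices are my own, assumed for illustration):

```python
import numpy as np

def is_symmetric_psd(B, tol=1e-10):
    """Check the two necessary conditions on B: symmetry (B = B^T)
    and positivity (x . Bx >= 0 for all x, equivalently all
    eigenvalues of B are nonnegative)."""
    if not np.allclose(B, B.T, atol=tol):
        return False
    # eigvalsh is for symmetric matrices and returns real eigenvalues.
    return bool(np.all(np.linalg.eigvalsh(B) >= -tol))

B_good = np.array([[2.0, 1.0], [1.0, 2.0]])  # symmetric, eigenvalues 1 and 3
B_bad  = np.array([[0.0, 1.0], [0.0, 0.0]])  # not symmetric

assert is_symmetric_psd(B_good)
assert not is_symmetric_psd(B_bad)
```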

    If B has these properties, then there is a unique *positive and symmetric* matrix A that satisfies AA^T = B (actually A^2 = B, since A = A^T if A is symmetric). The way you can see this is to use a rather deep, technical theorem (often called the "spectral theorem" for symmetric positive real matrices): because B is positive and symmetric, it must be diagonalizable by an orthogonal matrix, and all of its eigenvalues must be nonnegative. So there are real matrices Q and D with D diagonal (having nonnegative entries down the diagonal) with QQ^T = I and B = Q^T D Q. If you let D^(1/2) denote the diagonal matrix with the square roots of the entries of D down its diagonal, it follows that the matrix A = Q^T D^(1/2) Q is positive, symmetric, and satisfies AA^T = B. The fact that it is the *only* positive symmetric matrix that does so takes a short separate argument to prove. If you don't mind, I'll skip that (since it's already much simpler than the fact that B can be diagonalized by an orthogonal matrix, which I am also leaving out). I'll point out that when B = I, the unique positive and symmetric solution to AA^T = I is the solution A = I.
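    The construction above can be carried out directly in NumPy, where `np.linalg.eigh` performs the orthogonal diagonalization (a sketch under that assumption; the function name `psd_sqrt` is mine):

```python
import numpy as np

def psd_sqrt(B):
    """The unique positive symmetric A with A A^T = A^2 = B, built as
    in the argument above: diagonalize B orthogonally, take square
    roots of the (nonnegative) eigenvalues, and undiagonalize."""
    # eigh returns eigenvalues w and orthonormal eigenvectors as the
    # columns of V, so B = V diag(w) V^T (V plays the role of Q^T).
    w, V = np.linalg.eigh(B)
    w = np.clip(w, 0.0, None)  # guard against tiny negative round-off
    return V @ np.diag(np.sqrt(w)) @ V.T

B = np.array([[2.0, 1.0], [1.0, 2.0]])
A = psd_sqrt(B)

assert np.allclose(A, A.T)                  # A is symmetric
assert np.all(np.linalg.eigvalsh(A) >= 0)   # A is positive
assert np.allclose(A @ A.T, B)              # A solves A A^T = B
```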

    Most commercial software packages know how to orthogonally diagonalize positive symmetric matrices (i.e., given B, to produce Q and D as above), so you can implement this rather easily. In fact, many packages have this facility built in: given a positive symmetric B, asking the software to compute B^(1/2) will produce the unique positive symmetric A with A^2 = B.
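    For instance, SciPy exposes a matrix square root as `scipy.linalg.sqrtm`; for a symmetric positive B it should return this unique positive symmetric root (a sketch assuming SciPy is available):

```python
import numpy as np
from scipy.linalg import sqrtm

B = np.array([[2.0, 1.0], [1.0, 2.0]])

# For symmetric positive B, sqrtm yields the positive symmetric root.
A = sqrtm(B)
# sqrtm may return an array with negligible imaginary parts; drop them.
A = np.real(A)

assert np.allclose(A, A.T)
assert np.allclose(A @ A.T, B)
```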

    Once you have one solution A to AA^T = B, then for any orthogonal matrix O (i.e., any matrix satisfying OO^T = I), AO is also a solution: (AO)(AO)^T = A(OO^T)A^T = AA^T = B. So once you have the unique positive symmetric solution, you can easily generate other solutions.
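    A short check of this recipe for generating further solutions (NumPy assumed; the rotation angle is an arbitrary choice of mine):

```python
import numpy as np

B = np.array([[2.0, 1.0], [1.0, 2.0]])

# The unique positive symmetric solution, via orthogonal diagonalization.
w, V = np.linalg.eigh(B)
A = V @ np.diag(np.sqrt(w)) @ V.T

# Any orthogonal O yields another solution AO; here O is a rotation.
theta = 0.7
O = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

assert np.allclose(O @ O.T, np.eye(2))          # O is orthogonal
assert np.allclose((A @ O) @ (A @ O).T, B)      # AO also solves the equation
assert not np.allclose(A @ O, A)                # and it differs from A
```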
