Low-rank matrix perturbations
Here are a couple of linear algebra identities that can be very useful, but aren't that widely known, somewhere between common knowledge and arcane. Neither result assumes any matrix has low rank, but their most common application, at least in my experience, is in the context of something of low rank added to something of full rank.
Sylvester's determinant theorem

The first is Sylvester's determinant theorem. If A is an n by k matrix and B is a k by n matrix, then

\[ \det(I_n + AB) = \det(I_k + BA) \]
where I is an identity matrix whose dimension is given by its subscript. The theorem is true for any k, but it's especially useful when k is small because the matrices on the right side are of size k. If k = 1, the right side is the determinant of a 1 by 1 matrix, i.e. just a number!
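Here's a quick numerical sanity check of the theorem, a minimal sketch in NumPy. The dimensions and random matrices are arbitrary choices for illustration, not anything dictated by the theorem itself.

```python
import numpy as np

rng = np.random.default_rng(42)
n, k = 50, 3  # arbitrary sizes for illustration, with k much smaller than n

A = rng.standard_normal((n, k))
B = rng.standard_normal((k, n))

# Left side: determinant of a large n by n matrix
lhs = np.linalg.det(np.eye(n) + A @ B)

# Right side: determinant of a small k by k matrix
rhs = np.linalg.det(np.eye(k) + B @ A)

print(lhs, rhs)  # equal up to floating point error
```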
Woodbury matrix inversion lemma

The second is known as the matrix inversion lemma or Woodbury's matrix identity. It says

\[ (A + UCV)^{-1} = A^{-1} - A^{-1}U\left(C^{-1} + VA^{-1}U\right)^{-1}VA^{-1} \]
which is a lot to take in at once. We'll unpack it a little at a time.
First of all, the matrices have whatever properties are necessary for the equation to make sense: A and C are square, invertible matrices, though possibly of different sizes.
Suppose A is n by n. Then U necessarily has n rows. Let k be the number of columns in U. Then C is k by k and V is k by n.
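To confirm that the dimensions line up and that the identity holds, here is a small NumPy sketch. The sizes, the diagonal shifts that keep A and C safely invertible, and the random matrices are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 6, 2  # arbitrary sizes: A is n by n, C is k by k

# The diagonal shifts are an assumption to keep A and C well-conditioned
A = rng.standard_normal((n, n)) + n * np.eye(n)
C = rng.standard_normal((k, k)) + k * np.eye(k)
U = rng.standard_normal((n, k))
V = rng.standard_normal((k, n))

# Invert the perturbed matrix directly
direct = np.linalg.inv(A + U @ C @ V)

# Invert it again via the Woodbury identity
A_inv = np.linalg.inv(A)
middle = np.linalg.inv(np.linalg.inv(C) + V @ A_inv @ U)
woodbury = A_inv - A_inv @ U @ middle @ V @ A_inv

print(np.allclose(direct, woodbury))  # True
```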
To simplify things a little, let C be the identity. Then the identity becomes

\[ (A + UV)^{-1} = A^{-1} - A^{-1}U\left(I + VA^{-1}U\right)^{-1}VA^{-1} \]
Think of k as small, maybe even 1. Then UV is a low-rank perturbation of A. Suppose you have computed the inverse of A, or know something about the inverse of A, and want to know how that inverse changes when you change A by adding a rank k matrix to it.
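This is where the lemma pays off computationally: updating a precomputed inverse after a rank k change requires inverting only a k by k matrix rather than redoing the full n by n inversion. A minimal sketch, again with arbitrary sizes and random matrices:

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 500, 2  # n large, k small

A = rng.standard_normal((n, n)) + n * np.eye(n)  # shifted to be invertible
U = rng.standard_normal((n, k))
V = rng.standard_normal((k, n))

A_inv = np.linalg.inv(A)  # assume this expensive n by n inverse is already in hand

# Updating for the rank k perturbation UV costs only a k by k inverse
small = np.linalg.inv(np.eye(k) + V @ A_inv @ U)
updated_inv = A_inv - A_inv @ U @ small @ V @ A_inv

print(np.allclose(updated_inv, np.linalg.inv(A + U @ V)))  # True
```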
To simplify things further, assume A is the identity matrix. Now the matrix inversion lemma reduces to something similar to Sylvester's theorem above:

\[ (I + UV)^{-1} = I - U(I + VU)^{-1}V \]
To make things clearer, we'll add subscripts on the identity matrices as above:

\[ (I_n + UV)^{-1} = I_n - U(I_k + VU)^{-1}V \]
If k is small, then the matrix between U and V on the right side is small. If k = 1, it's just a number, and its inverse is just its reciprocal.
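For the k = 1 case, the sketch below (with arbitrary random vectors u and v standing in for U and V) shows the inner inverse collapsing to a scalar reciprocal.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5
u = rng.standard_normal((n, 1))  # U is n by 1
v = rng.standard_normal((1, n))  # V is 1 by n

# I_1 + VU is 1 by 1, so its inverse is just a reciprocal
scalar = 1.0 + (v @ u).item()
inverse = np.eye(n) - (u @ v) / scalar

print(np.allclose(inverse, np.linalg.inv(np.eye(n) + u @ v)))  # True
```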