Convex function of diagonals and eigenvalues
Sam Walters posted an elegant theorem on his Twitter account this morning. The theorem follows the pattern of an equality for linear functions generalizing to an inequality for convex functions. We'll give a little background, state the theorem, and show an example application.
Let A be a real symmetric n × n matrix, or more generally a complex n × n Hermitian matrix, with entries $a_{ij}$. Note that the diagonal elements $a_{ii}$ are real numbers even if some of the other entries are complex. (A Hermitian matrix equals its conjugate transpose, which means the elements on the diagonal equal their own conjugates.)
A general theorem says that A has n real eigenvalues. Denote these eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_n$.
It is well known that the sum of the diagonal elements of A equals the sum of its eigenvalues:

$$\sum_{i=1}^n a_{ii} = \sum_{i=1}^n \lambda_i$$
We could trivially generalize this to say that for any linear function $f\colon \mathbb{R} \to \mathbb{R}$,

$$\sum_{i=1}^n f(a_{ii}) = \sum_{i=1}^n f(\lambda_i)$$

because we could pull any shifting and scaling constants out of the sum.
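Here's a quick numerical check of the trace identity, a minimal sketch using NumPy (the matrix size and random seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)

# A random complex matrix B gives a Hermitian matrix via B + B^*.
B = rng.standard_normal((5, 5)) + 1j * rng.standard_normal((5, 5))
A = B + B.conj().T

eigenvalues = np.linalg.eigvalsh(A)  # real, since A is Hermitian
diagonal = np.diag(A).real           # diagonal entries are real too

# Trace identity: sum of diagonal entries equals sum of eigenvalues.
print(np.isclose(diagonal.sum(), eigenvalues.sum()))  # True
```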
The theorem Sam Walters posted says that the equality above extends to an inequality if $f$ is convex:

$$\sum_{i=1}^n f(a_{ii}) \leq \sum_{i=1}^n f(\lambda_i)$$
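As a sanity check, here's a sketch verifying the inequality with the convex function $f(x) = x^2$ on a random symmetric matrix (the choice of $f$, the matrix, and its size are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)

B = rng.standard_normal((6, 6))
A = (B + B.T) / 2  # a random real symmetric matrix

eigenvalues = np.linalg.eigvalsh(A)
diagonal = np.diag(A)

# f(x) = x^2 is convex, so the sum over the diagonal should not
# exceed the sum over the eigenvalues.
print(np.square(diagonal).sum() <= np.square(eigenvalues).sum())  # True
```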
Here's an application of this theorem. Assume the eigenvalues of A are all positive and let $f(x) = -\log(x)$. (The diagonal elements of A are then positive as well, since A is positive definite, so the logarithms below are defined.) Then $f$ is convex, and

$$-\sum_{i=1}^n \log(a_{ii}) \leq -\sum_{i=1}^n \log(\lambda_i)$$
and so, after negating both sides and exponentiating,

$$\prod_{i=1}^n a_{ii} \geq \prod_{i=1}^n \lambda_i = \det A$$
i.e. the product of the diagonal elements of A is an upper bound on the determinant of A. (For positive definite matrices this is known as Hadamard's inequality.)
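Here's a numerical illustration of that bound, a minimal sketch on a randomly generated positive definite matrix (constructed as $BB^T$ plus a small multiple of the identity, an arbitrary way to guarantee positive eigenvalues):

```python
import numpy as np

rng = np.random.default_rng(2)

# B @ B.T is positive semidefinite; adding 0.1 I makes it positive
# definite, so all eigenvalues are positive as the theorem requires.
B = rng.standard_normal((5, 5))
A = B @ B.T + 0.1 * np.eye(5)

# det(A) is bounded above by the product of the diagonal entries.
print(np.linalg.det(A) <= np.prod(np.diag(A)))  # True
```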
This post illustrates two general principles:
- Linear equalities often generalize to convex inequalities.
- When you hear a new theorem about convex functions, see what it says about exp or -log, as in the sketch below.
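For example, taking $f = \exp$ in the theorem above says that the sum of $\exp(a_{ii})$ is at most the sum of $\exp(\lambda_i)$, which equals the trace of the matrix exponential of A. A minimal numerical sketch of that instance (the random matrix is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(3)

B = rng.standard_normal((6, 6))
A = (B + B.T) / 2  # a random real symmetric matrix

eigenvalues = np.linalg.eigvalsh(A)
diagonal = np.diag(A)

# With f = exp, the theorem gives sum(exp(a_ii)) <= sum(exp(lambda_i)),
# and the right-hand side is the trace of the matrix exponential of A.
print(np.exp(diagonal).sum() <= np.exp(eigenvalues).sum())  # True
```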