
Aligning one matrix with another

by
John
from John D. Cook

Suppose you have two n × n matrices, A and B, and you would like to find a rotation matrix Ω that lines up B with A. That is, you'd like to find Ω such that

A = ΩB.

This is asking too much, except in the trivial case of A and B being 1 × 1 matrices. You could view the matrix equation above as a set of n^2 equations in real numbers, but the space of orthogonal matrices only has n(n − 1)/2 degrees of freedom [1].

When an equation doesn't have an exact solution, the next best thing is to get as close as possible to a solution, typically in a least squares sense. The orthogonal Procrustes problem is to find an orthogonal matrix Ω minimizing the distance between A and ΩB. That is, we want to minimize

||A − ΩB||

subject to the constraint that Ω is orthogonal. The matrix norm used in this problem is the Frobenius norm, the square root of the sum of the squares of the matrix entries. The Frobenius norm is the 2-norm we get if we straighten the matrices into vectors of dimension n^2.
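That equivalence between the Frobenius norm and the vector 2-norm is easy to check numerically. A quick sketch (not from the post; the matrix here is just a random example):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

# Frobenius norm of the matrix ...
fro = np.linalg.norm(A, ord="fro")
# ... equals the ordinary 2-norm of the matrix flattened into a vector
vec2 = np.linalg.norm(A.ravel(), ord=2)

print(np.isclose(fro, vec2))  # True
```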

Peter Schönemann found a solution to the orthogonal Procrustes problem in 1964. His solution involves singular value decomposition (SVD). This shouldn't be surprising, since SVD also solves the problem of finding the closest thing to an inverse of a non-invertible matrix. (More on that here.)
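To illustrate that pseudoinverse connection, here is a sketch (not from the post) that builds the Moore–Penrose pseudoinverse of a singular matrix from its SVD, inverting only the nonzero singular values:

```python
import numpy as np

# A rank-1, hence non-invertible, 2 x 2 matrix
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

# SVD: A = U diag(s) V^T
U, s, Vt = np.linalg.svd(A)

# Invert only the singular values that are numerically nonzero
s_inv = np.array([1.0 / x if x > 1e-12 else 0.0 for x in s])
A_pinv = Vt.T @ np.diag(s_inv) @ U.T

# Agrees with NumPy's built-in pseudoinverse
print(np.allclose(A_pinv, np.linalg.pinv(A)))  # True
```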

Schönemann's solution is to set M = AB^T and find its singular value decomposition

M = UΣV^T.

Then

Ω = UV^T.

Python code

The following code illustrates solving the orthogonal Procrustes problem for random matrices.

import numpy as np

n = 3

# Generate random n x n matrices A and B
rng = np.random.default_rng(seed=20260211)
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

# Compute M = A B^T
M = A @ B.T

# SVD: M = U Sigma V^T
U, s, Vt = np.linalg.svd(M, full_matrices=False)

# R = U V^T
R = U @ Vt

# Verify that R^T R is very nearly the identity matrix
print("||R^T R - I||_F =", np.linalg.norm(R.T @ R - np.eye(n), ord="fro"))

In this example the Frobenius norm of R^T R − I is about 4 × 10^−16, so essentially R^T R = I to machine precision.
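As a sanity check on optimality, we can compare the Procrustes solution against random orthogonal matrices: none of them should bring ΩB closer to A. This is an illustrative sketch, not part of the original post; the random orthogonal matrices come from QR factorizations of Gaussian matrices.

```python
import numpy as np

n = 3
rng = np.random.default_rng(20260211)
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

# Procrustes solution: Omega = U V^T from the SVD of M = A B^T
U, _, Vt = np.linalg.svd(A @ B.T)
Omega = U @ Vt
best = np.linalg.norm(A - Omega @ B, ord="fro")

# Compare against random orthogonal matrices (Q factor of a Gaussian matrix)
ok = True
for _ in range(1000):
    Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    ok = ok and np.linalg.norm(A - Q @ B, ord="fro") >= best
print(ok)  # True: no random trial beats the Procrustes solution
```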


[1] Every column of an orthogonal matrix must have length 1, which gives n constraints. Furthermore, each pair of columns must be orthogonal, which gives n choose 2 = n(n − 1)/2 more constraints. We start with n^2 degrees of freedom, then remove n, and then n(n − 1)/2 more:

n^2 − n − n(n − 1)/2 = n(n − 1)/2.

For example, with n = 3 this gives 3 degrees of freedom, matching the three angles needed to specify a rotation in 3D.
