
Sums of Squares Q

Instead of a Householder transformation, you can also use Givens rotations to create zeros below the pivot element of a column vector. The product of all these Givens rotations is an orthogonal Hessenberg matrix, the sums of squares Q.
Why the name sums of squares Q?
This Hessenberg matrix can be created more easily using sums of squares. The first row is the vector v. Each following row agrees with v from the diagonal element onward. The subdiagonal element is calculated so that the row becomes orthogonal to the row above it, or equivalently to the first row: its value is the negative sum of squares from the diagonal onward, divided by the diagonal element of the row above. If that diagonal element is zero, the subdiagonal element is set to one and the remaining row elements to zero. A special case arises when the sum of squares itself is zero; then the remaining rows are filled with ones along the main diagonal. All rows are now pairwise orthogonal, and all that remains is to normalize them.
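As a sketch of this construction in pure Python (the function name and the tolerance eps are my own choices, not from the original):

```python
import math

def sums_of_squares_q(v, eps=1e-12):
    # Build the orthogonal Hessenberg matrix whose first row is v, normalized,
    # following the rules described above. Assumes v is not the zero vector.
    n = len(v)
    Q = [[0.0] * n for _ in range(n)]
    Q[0] = list(v)
    for i in range(1, n):
        tail = sum(x * x for x in v[i:])   # sum of squares from the diagonal on
        if tail < eps:
            # special case: the sum of squares is zero ->
            # remaining rows get ones along the main diagonal
            for j in range(i, n):
                Q[j][j] = 1.0
            break
        if abs(v[i - 1]) < eps:
            # diagonal element above is zero -> subdiagonal one, rest zero
            Q[i][i - 1] = 1.0
            continue
        # subdiagonal value that makes the row orthogonal to the first row
        Q[i][i - 1] = -tail / v[i - 1]
        for j in range(i, n):
            Q[i][j] = v[j]
    for row in Q:                          # finally, normalize every row
        norm = math.sqrt(sum(x * x for x in row))
        if norm > eps:
            for j in range(n):
                row[j] /= norm
    return Q

Q = sums_of_squares_q([1.0, 2.0, 3.0])
```

For v = (1, 2, 3) the unnormalized rows are (1, 2, 3), (-13, 2, 3), and (0, -9/2, 3); for v = (0, 0, 1) the construction reproduces the cyclic permutation matrix of the example further below.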
Properties:
The transpose of the sums of squares Q yields an eigenbasis for projection matrices such as vv^T and I - vv^T, and for Householder reflections: the rows after the first span the eigenspace orthogonal to v. In this role, however, it competes with the Householder matrix.
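A small numerical check of this claim (the vector (1, 2, 2) is an arbitrary choice of mine): conjugating the Householder reflection H = I - 2uu^T, u = v/|v|, with the sums of squares Q built from v should give the diagonal matrix diag(-1, 1, ..., 1).

```python
import math

v = [1.0, 2.0, 2.0]   # example vector (my choice; generic case, divisors non-zero)
n = len(v)

# unnormalized rows of the sums of squares Q for this v
rows = [list(v)]
for i in range(1, n):
    tail = sum(x * x for x in v[i:])
    rows.append([0.0] * (i - 1) + [-tail / v[i - 1]] + list(v[i:]))
Q = [[x / math.sqrt(sum(y * y for y in r)) for x in r] for r in rows]

# Householder reflection H = I - 2 u u^T with u = v normalized
nv = math.sqrt(sum(x * x for x in v))
u = [x / nv for x in v]
H = [[(1.0 if i == j else 0.0) - 2.0 * u[i] * u[j] for j in range(n)] for i in range(n)]

# D = Q H Q^T: the rows of Q after the first are eigenvectors of H for
# eigenvalue +1, the first row for eigenvalue -1, so D = diag(-1, 1, 1)
QH = [[sum(Q[i][k] * H[k][j] for k in range(n)) for j in range(n)] for i in range(n)]
D = [[sum(QH[i][k] * Q[j][k] for k in range(n)) for j in range(n)] for i in range(n)]
```

Replacing H by the projection I - vv^T/(v^T v) gives diag(0, 1, 1) in the same way.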
Furthermore: every orthogonal Hessenberg matrix with non-zero subdiagonal elements is a sums of squares Q, determined by its first row vector (up to factors of -1 on individual rows). For Hessenberg matrices with non-zero subdiagonal elements, the characteristic polynomial equals the minimal polynomial. Therefore, and because an orthogonal matrix is diagonalizable, the sums of squares Q has only simple eigenvalues.
Example:
If the rows of the identity matrix are cyclically shifted by one row, a sums of squares Q is created. This permutation matrix has the minimal polynomial x^n - 1. The powers of the matrix form a finite group with A^(n+1) = A. The eigenvalues are the n-th roots of unity, all separated by the same angle.
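This example can be checked numerically; here a sketch for n = 4 (my choice), with the rows shifted in the direction that keeps the result Hessenberg, i.e. ones on the subdiagonal and one in the top-right corner:

```python
import cmath

n = 4
# identity matrix with its rows cyclically shifted by one
A = [[1.0 if j == (i - 1) % n else 0.0 for j in range(n)] for i in range(n)]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)] for i in range(n)]

# the powers of A form a finite group: A^(n+1) = A
P = A
for _ in range(n):
    P = matmul(P, A)

# every n-th root of unity w is an eigenvalue, with eigenvector (1, w^-1, w^-2, ...)
def is_eigenpair(w):
    vec = [w ** (-i) for i in range(n)]
    Av = [sum(A[i][j] * vec[j] for j in range(n)) for i in range(n)]
    return all(abs(Av[i] - w * vec[i]) < 1e-9 for i in range(n))

roots = [cmath.exp(2j * cmath.pi * k / n) for k in range(n)]
```

For n = 4 the eigenvalues 1, i, -1, -i sit at 90-degree intervals on the unit circle.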

Ludwig Resch