The Householder rotation

P has the following form:

    P = 1 - 2\, w \otimes w,    (3.40)

where 1 is the identity (the Kronecker delta) and \otimes is the tensor product, i.e., (w \otimes w)_{ij} = w_{i} w_{j}. Of vector w we demand only that |w| = 1.
Matrix P is clearly symmetric, hence P^{T} = P. Furthermore, matrix P is its own inverse, and this can be seen as follows:

    P^{2} = (1 - 2\, w \otimes w)(1 - 2\, w \otimes w)
          = 1 - 4\, w \otimes w + 4\, (w \otimes w)(w \otimes w).

Now, observe that (w \otimes w)(w \otimes w) = (w \cdot w)\, w \otimes w and, of course, w \cdot w = 1, hence

    (w \otimes w)(w \otimes w) = w \otimes w.

Consequently:

    P^{2} = 1 - 4\, w \otimes w + 4\, w \otimes w = 1,

so P = P^{T} = P^{-1}, and P is orthogonal.
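These three properties are easy to confirm numerically. The following sketch (mine, not part of the text) builds P = 1 - 2 w⊗w for a random unit vector w using plain Python lists and checks symmetry and P² = 1:

```python
# Sanity check: P = 1 - 2 w (x) w is symmetric and its own inverse
# whenever |w| = 1.
import math
import random

random.seed(0)
n = 4
w = [random.gauss(0.0, 1.0) for _ in range(n)]
norm = math.sqrt(sum(wi * wi for wi in w))
w = [wi / norm for wi in w]                       # enforce |w| = 1

# P_ij = delta_ij - 2 w_i w_j
P = [[(1.0 if i == j else 0.0) - 2.0 * w[i] * w[j] for j in range(n)]
     for i in range(n)]

# Symmetry: P^T = P
sym_err = max(abs(P[i][j] - P[j][i]) for i in range(n) for j in range(n))

# P is its own inverse: P P = 1
PP = [[sum(P[i][k] * P[k][j] for k in range(n)) for j in range(n)]
      for i in range(n)]
inv_err = max(abs(PP[i][j] - (1.0 if i == j else 0.0))
              for i in range(n) for j in range(n))

assert sym_err < 1e-12 and inv_err < 1e-12
```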
We can use any vector u in place of w if we normalize it at the same time:

    P = 1 - u \otimes u / H,    (3.41)

where

    H = |u|^{2} / 2.    (3.42)
So far P isn't specific enough to deserve the proud name of a Householder matrix. But now we choose u to be

    u = x \mp |x|\, e_{1},    (3.43)

where e_{1} is the first basis vector.
The Householder operator with the u vector so defined rotates vector x onto e_{1}, and this can be seen as follows:

    P \cdot x = x - u\, (u \cdot x) / H.    (3.44)

What's u \cdot x?

    u \cdot x = |x|^{2} \mp |x|\, x_{1}.    (3.45)

At the same time H = |u|^{2}/2 = (|x|^{2} \mp 2 |x|\, x_{1} + |x|^{2})/2 = |x|^{2} \mp |x|\, x_{1}, so u \cdot x = H. Substituting equation (3.45) into (3.44) yields:

    P \cdot x = x - u = \pm |x|\, e_{1}.
Remember that P is not a projection operator: projection operators are not orthogonal. P is a rotation, which rotates vector x onto the e_{1} direction. In the process the length of the vector does not change, whereas it would have changed for a projection operator.
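A quick numerical illustration of this rotation (my own check, not from the text), taking the lower sign in (3.43), i.e. u = x - |x| e₁:

```python
# With u = x - |x| e1 and H = |u|^2 / 2, the operator P = 1 - u(x)u/H
# maps x onto |x| e1, preserving its length.
import math

x = [3.0, 1.0, 2.0]
n = len(x)
xnorm = math.sqrt(sum(xi * xi for xi in x))       # |x|
u = [x[0] - xnorm] + x[1:]                        # u = x - |x| e1
H = sum(ui * ui for ui in u) / 2.0                # H = |u|^2 / 2

udotx = sum(u[i] * x[i] for i in range(n))        # equals H, cf. (3.45)
Px = [x[i] - u[i] * udotx / H for i in range(n)]  # P.x = x - u (u.x)/H

assert abs(udotx - H) < 1e-12                     # (3.45): u.x = H
assert abs(Px[0] - xnorm) < 1e-12                 # first component is |x|
assert all(abs(c) < 1e-12 for c in Px[1:])        # the rest vanish
```

In production codes the sign in (3.43) is chosen to match the sign of x₁, so that the subtraction in u never suffers catastrophic cancellation.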
The procedure for transforming matrix A into a tridiagonal form works as follows. The first Householder operator, P_{1}, is selected to rotate the subcolumn of the first column:

    (a_{21}, a_{31}, \ldots, a_{n1})^{T}

onto

    (k, 0, \ldots, 0)^{T}.

And, to accomplish that, the operator has to have the following form:

    P_{1} = \begin{pmatrix} 1 & 0^{T} \\ 0 & {}^{(n-1)}P_{1} \end{pmatrix},

where {}^{(n-1)}P_{1} is an (n-1) \times (n-1) Householder matrix built from that subcolumn.
Multiplying A by P_{1} from the left:

    A' = P_{1} \cdot A

attacks the first subcolumn with {}^{(n-1)}P_{1}, and if that suboperator has been chosen to rotate the first subcolumn onto its first dimension, then the resulting matrix A' will look as follows:

    A' = \begin{pmatrix}
        a_{11} & a_{12} & \cdots & a_{1n} \\
        k      & \times & \cdots & \times \\
        0      & \times & \cdots & \times \\
        \vdots & \vdots &        & \vdots \\
        0      & \times & \cdots & \times
    \end{pmatrix}.

Observe that the first row will not be changed by that operation at all, but the lower right corner, of course, will get changed.
But now if we apply operator P_{1} to A' from the right, the same thing will happen to the first row (remember that P is symmetric), so that the resulting matrix A'' = P_{1} \cdot A \cdot P_{1} will look as follows:

    A'' = \begin{pmatrix}
        a_{11} & k      & 0      & \cdots & 0      \\
        k      & \times & \times & \cdots & \times \\
        0      & \times & \times & \cdots & \times \\
        \vdots & \vdots & \vdots &        & \vdots \\
        0      & \times & \times & \cdots & \times
    \end{pmatrix}.
The second Householder matrix is going to look as follows:

    P_{2} = \begin{pmatrix} I_{2} & 0 \\ 0 & {}^{(n-2)}P_{2} \end{pmatrix},

and, as you should understand by now, it will do to the subcolumn of the second column of A'' the same as P_{1} did to the subcolumn of the first column of A. In other words, it will rotate it onto its first direction, thus leaving only one term under the diagonal. The identity block in the upper left corner ensures that the tridiagonalization achieved so far will not be spoiled during successive rotations.

It is now obvious that in n-2 such steps the whole matrix must become tridiagonalized.
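The first step can be seen concretely in a small example (my own construction, not from the text): build P₁ explicitly for a 4 × 4 symmetric A and check that P₁·A·P₁ has zeros below the subdiagonal in the first column, and, by symmetry, past the superdiagonal in the first row.

```python
# One explicit Householder step on a 4 x 4 symmetric matrix.
import math

A = [[4.0, 1.0, 2.0, 2.0],
     [1.0, 3.0, 0.0, 1.0],
     [2.0, 0.0, 5.0, 1.0],
     [2.0, 1.0, 1.0, 2.0]]
n = len(A)

# Householder vector for the subcolumn (a_21, ..., a_n1):
x = [A[i][0] for i in range(1, n)]
xnorm = math.sqrt(sum(v * v for v in x))          # here |x| = 3
u = [x[0] - xnorm] + x[1:]
H = sum(v * v for v in u) / 2.0

# P1: identity in the first row/column, 1 - u(x)u/H in the rest.
P1 = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
for i in range(1, n):
    for j in range(1, n):
        P1[i][j] -= u[i - 1] * u[j - 1] / H

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

App = matmul(P1, matmul(A, P1))                   # A'' = P1 . A . P1
assert all(abs(App[i][0]) < 1e-12 for i in range(2, n))
assert all(abs(App[0][j]) < 1e-12 for j in range(2, n))
assert abs(App[1][0] - xnorm) < 1e-12             # the subdiagonal term is k = |x|
```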
Because

    P = 1 - u \otimes u / H,

we can write down our operations on A in more detail. First

    A \cdot P = A - A \cdot u \otimes u / H.    (3.46)

Introducing a new vector p:

    p = A \cdot u / H,    (3.47)

we get

    A \cdot P = A - p \otimes u.    (3.48)
So now, using the symmetry of A:

    A'' = P \cdot A \cdot P = A - p \otimes u - u \otimes p + 2 K\, u \otimes u,    (3.49)

where

    K = u \cdot p / (2H).    (3.50)

Our expression for A'' can be further simplified by introducing vector q:

    q = p - K u,    (3.51)

and observing that:

    A'' = A - q \otimes u - u \otimes q.    (3.52)
Let us summarise the flow of computation now. For a particular square submatrix of A we'll have

    u \rightarrow H \rightarrow p \rightarrow K \rightarrow q \rightarrow A'',

and the accumulated transform, which we are going to need for the eigenvectors, is:

    Q = P_{1} \cdot P_{2} \cdots P_{n-2}.    (3.53)
Of course, you now appreciate that if eigenvectors are needed, this is going to cost us additional operations, because we'll have to compute Q_{i} = Q_{i-1} \cdot P_{i} for every rotation, whereas without eigenvectors we can get away with just the u, H, p, K, q, A'' sequence.
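The whole reduction can be sketched as follows (my own arrangement of the formulas above, not the text's code; for simplicity it always takes the sign u = x - |x| e₁, whereas production codes pick the sign that avoids cancellation):

```python
# Householder tridiagonalization of a symmetric matrix, using the
# u, H, p, K, q sequence and accumulating Q = P1 . P2 ... P_{n-2}.
import math

def householder_tridiagonalize(A):
    """Reduce symmetric A (list of lists) to tridiagonal T,
    returning (T, Q) with A = Q . T . Q^T."""
    n = len(A)
    A = [row[:] for row in A]                     # work on a copy
    Q = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    for k in range(n - 2):
        x = [A[i][k] for i in range(k + 1, n)]    # subcolumn below diagonal
        xnorm = math.sqrt(sum(v * v for v in x))
        # Full-length u: zeros in the first k+1 slots keep the identity
        # block of P_k, cf. the block form of P_1 and P_2 above.
        u = [0.0] * (k + 1) + [x[0] - xnorm] + x[1:]
        H = sum(v * v for v in u) / 2.0
        if H == 0.0:                              # subcolumn already reduced
            continue
        p = [sum(A[i][j] * u[j] for j in range(n)) / H for i in range(n)]
        K = sum(u[i] * p[i] for i in range(n)) / (2.0 * H)
        q = [p[i] - K * u[i] for i in range(n)]
        for i in range(n):                        # A'' = A - q(x)u - u(x)q
            for j in range(n):
                A[i][j] -= q[i] * u[j] + u[i] * q[j]
        for i in range(n):                        # Q <- Q . P_k
            s = sum(Q[i][j] * u[j] for j in range(n)) / H
            for j in range(n):
                Q[i][j] -= s * u[j]
    return A, Q
```

If eigenvectors are not wanted, the final loop accumulating Q can simply be deleted, which is exactly the saving discussed above.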
Zdzislaw Meglicki
2001-02-26