$$p_{j+1} = A p_j - \sum_{i=1}^{j} \beta_{ij} p_i.$$

The resulting algorithm is called ORTHODIR [127]. Restarted and truncated versions of ORTHODIR can also be defined.
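As a concrete illustration, here is a minimal dense-matrix sketch of full ORTHODIR in numpy (the function name and structure are my own, not from the text): the new direction starts from $A p_j$ and is made $A^T A$-orthogonal to the previous directions, so that the vectors $A p_i$ are mutually orthogonal and each step minimizes the residual norm over the current Krylov subspace.

```python
import numpy as np

def orthodir(A, b, x0, m):
    """Sketch of full ORTHODIR for a real nonsingular A: the directions
    p_i satisfy (A p_i, A p_j) = 0 for i != j, and each step minimizes
    ||b - A x|| over the Krylov subspace spanned so far."""
    x = x0.copy()
    r = b - A @ x
    p = r / np.linalg.norm(r)
    P, AP = [], []                      # kept directions and their A-images
    for _ in range(m):
        Ap = A @ p
        alpha = (r @ Ap) / (Ap @ Ap)    # minimizes ||r - alpha * A p||
        x = x + alpha * p
        r = r - alpha * Ap
        P.append(p)
        AP.append(Ap)
        if np.linalg.norm(r) < 1e-12 * np.linalg.norm(b):
            break
        # Next direction: p_new = A p - sum_i beta_i p_i, with beta_i
        # chosen so that (A p_new, A p_i) = 0, i.e.
        # beta_i = (A (A p), A p_i) / (A p_i, A p_i).
        z = A @ Ap
        v = Ap.copy()
        for pi, Api in zip(P, AP):
            v = v - ((z @ Api) / (Api @ Api)) * pi
        p = v / np.linalg.norm(v)
    return x
```

In this sketch, restarted and truncated variants would correspond to periodically resetting, or keeping only the last few entries of, the lists `P` and `AP`.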

THE FABER–MANTEUFFEL THEOREM

As was seen in Section 6.6, when $A$ is symmetric the Arnoldi algorithm simplifies into the Lanczos procedure, which is defined through a three-term recurrence. As a consequence, FOM is mathematically equivalent to the Conjugate Gradient algorithm in this case. Similarly, the full GMRES algorithm gives rise to the Conjugate Residual algorithm. It is clear that the CG-type algorithms, i.e., algorithms defined through short-term recurrences, are more desirable than algorithms which require storing entire sequences of vectors, as in the GMRES process, since they require less memory and fewer operations per step.

Therefore, the question is: Is it possible to define algorithms which are based on an optimal Krylov subspace projection and which give rise to sequences involving short-term recurrences? An optimal Krylov subspace projection means a technique which minimizes a certain norm of the error, or residual, on the Krylov subspace. Such methods can be defined from the Arnoldi process.

If the Arnoldi process simplifies into an $s$-term recurrence, i.e., if $h_{ij} = 0$ for $i < j - s + 1$, then the conjugate directions in DIOM are also defined from an $s$-term recurrence. Similarly, the full GMRES algorithm would simplify into a DQGMRES algorithm involving a short recurrence. Therefore, for all practical purposes, it is sufficient to analyze what happens to the Arnoldi process (or FOM). We start by generalizing the CG result in a simple way, by considering the DIOM algorithm.

PROPOSITION  Let $A$ be a matrix such that

$$A^H v \in \mathcal{K}_s(A, v)$$

for any vector $v$. Then DIOM(s) is mathematically equivalent to the FOM algorithm.

Proof. The assumption is equivalent to the statement that, for any $v$, there is a polynomial $q_v$ of degree $\le s - 1$ such that $A^H v = q_v(A) v$. In the Arnoldi process, the scalars $h_{ij}$ are defined by $h_{ij} = (A v_j, v_i)$ and therefore

$$h_{ij} = (A v_j, v_i) = (v_j, A^H v_i) = (v_j, q_{v_i}(A) v_i).$$

Since $q_{v_i}$ is a polynomial of degree $\le s - 1$, the vector $q_{v_i}(A) v_i$ is a linear combination of the vectors $v_i, A v_i, \ldots, A^{s-1} v_i$, and therefore belongs to $\operatorname{span}\{v_1, \ldots, v_{i+s-1}\}$. As a result, if $i < j - s + 1$, i.e., if $j \ge i + s$, then $h_{ij} = 0$. Therefore, DIOM(s) will give the same approximate solution as FOM.
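This can be checked numerically. The sketch below (my own numpy code, not from the text) runs Arnoldi with modified Gram–Schmidt on a real symmetric matrix; here $A^H = A = q(A)$ with $q$ of degree $1 = s - 1$ for $s = 2$, so by the proposition the Hessenberg matrix should be tridiagonal, i.e., $h_{ij} \approx 0$ for $i < j - 1$.

```python
import numpy as np

def arnoldi(A, v, m):
    """Arnoldi with modified Gram-Schmidt; returns the leading
    m x m part of the Hessenberg matrix of coefficients h_ij."""
    n = A.shape[0]
    V = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    V[:, 0] = v / np.linalg.norm(v)
    for j in range(m):
        w = A @ V[:, j]
        for i in range(j + 1):
            H[i, j] = V[:, i] @ w       # h_ij = (A v_j, v_i)
            w = w - H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        V[:, j + 1] = w / H[j + 1, j]
    return H[:m, :m]
```

For a symmetric $A$ the entries above the first superdiagonal vanish up to rounding error, recovering the Lanczos three-term recurrence.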

In particular, if

$$A^H = q(A),$$

where $q$ is a polynomial of degree $\le s - 1$, then the result holds. However, the above relation implies that each eigenvector of $A$ is also an eigenvector of $A^H$. According to Theorem 1.2, this can be true only if $A$ is a normal matrix. As it turns out, the reverse is also true. That is, when $A$ is normal, there is a polynomial $q$ of degree $\le n - 1$ such that $A^H = q(A)$. Proving this is easy: when $A = Q \Lambda Q^H$, where $Q$ is unitary and $\Lambda$ diagonal, then $q(A) = Q q(\Lambda) Q^H$. By choosing the polynomial $q$ so that

$$q(\lambda_j) = \bar{\lambda}_j, \quad j = 1, \ldots, n$$

(such a $q$ of degree $\le n - 1$ exists, e.g., by Lagrange interpolation at the distinct eigenvalues of $A$), we obtain $q(A) = Q \bar{\Lambda} Q^H = A^H$, which is the desired result.
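The construction in this argument can be verified numerically. In the sketch below (my own numpy code, not from the text), a normal matrix is built as $A = Q \Lambda Q^H$ with a random unitary $Q$, the interpolation conditions $q(\lambda_j) = \bar{\lambda}_j$ are solved through a Vandermonde system, and $q(A)$ is evaluated by Horner's rule and compared with $A^H$.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5

# Random unitary Q (QR of a random complex matrix) and complex eigenvalues.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
lam = rng.standard_normal(n) + 1j * rng.standard_normal(n)   # distinct a.s.
A = Q @ np.diag(lam) @ Q.conj().T                            # normal by construction

# Coefficients of q with q(lambda_j) = conj(lambda_j), degree <= n - 1.
V = np.vander(lam, n, increasing=True)       # V[j, k] = lam_j ** k
c = np.linalg.solve(V, lam.conj())

# Evaluate q(A) = sum_k c_k A^k by Horner's rule.
qA = np.zeros_like(A)
for ck in reversed(c):
    qA = qA @ A + ck * np.eye(n)

err = np.linalg.norm(qA - A.conj().T)        # should be near machine precision
```

Note that the degree bound $n - 1$ is what the Vandermonde system encodes: one interpolation condition per eigenvalue.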

Let $\nu(A)$ be the smallest degree of all polynomials $q$ such that $A^H = q(A)$. Then the following lemma, due to Faber and Manteuffel [85], states an interesting relation between $s$ and $\nu(A)$.