application of each $P_j$ to a vector is performed as

$$ P_j v = (I - 2 w_j w_j^T)\, v = v - \sigma w_j \quad \text{with} \quad \sigma = 2 w_j^T v . $$

This is essentially the result of a dot-product of length $n - j + 1$ followed by a vector update of the same length, requiring a total of about $4(n - j + 1)$ operations for each application of $P_j$. Neglecting the last step, the number of operations due to the Householder transformations alone approximately totals

$$ \sum_{j=1}^{m} \sum_{i=1}^{j} 4 (n - i + 1) \approx 4 \sum_{j=1}^{m} \left( j n - \frac{j^2}{2} \right) \approx 2 m^2 n - \frac{2}{3} m^3 . $$
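As a quick sanity check (not part of the original text), the reflector application above can be sketched in NumPy; the vector length and random seed below are purely illustrative.

```python
# Sketch: applying a Householder reflector P = I - 2 w w^T to a vector v
# as v - (2 w^T v) w, i.e. one dot product (~2n flops) plus one vector
# update (~2n flops), instead of forming P explicitly.
import numpy as np

rng = np.random.default_rng(0)
n = 8
w = rng.standard_normal(n)
w /= np.linalg.norm(w)            # Householder vectors are unit vectors
v = rng.standard_normal(n)

sigma = 2.0 * (w @ v)             # dot product of length n
pv = v - sigma * w                # vector update of length n

# Agrees with the explicit matrix-vector product:
P = np.eye(n) - 2.0 * np.outer(w, w)
assert np.allclose(pv, P @ v)
# P is orthogonal and an involution: applying it twice restores v.
assert np.allclose(P @ (P @ v), v)
```

In the Arnoldi context the reflector at step $j$ acts only on the trailing $n - j + 1$ components, which is where the $4(n - j + 1)$ count comes from.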

The table below shows the costs of different orthogonalization procedures. GS stands for

Gram-Schmidt, MGS for Modified Gram-Schmidt, MGSR for Modified Gram-Schmidt

with reorthogonalization, and HO for Householder.

              GS           MGS          MGSR         HO
Flops         $2m^2n$      $2m^2n$      $4m^2n$      $2m^2n - \frac{2}{3}m^3$
Storage       $(m+1)n$     $(m+1)n$     $(m+1)n$     $(m+1)n - \frac{1}{2}m^2$

The number of operations shown for MGSR corresponds to the worst case scenario when a
second orthogonalization is performed each time. In practice, the number of operations is
usually closer to that of the standard MGS. Regarding storage, the vectors $v_i$, $i = 1, \ldots, m$,

need not be saved. In the algorithms for solving linear systems, these vectors are needed at

the end of the process. This issue will be covered with the Householder implementations

of these algorithms. For now, assume that only the $w_j$'s are saved. The small gain in memory
usage in the Householder version can be explained by the diminishing lengths of the

vectors required at each step of the Householder transformation. However, this difference

is negligible relative to the whole storage requirement of the algorithm, because $m \ll n$,
typically.
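To see how small the Householder correction terms are in practice, the estimates in the table above can be evaluated numerically; the values of $m$ and $n$ below are purely illustrative (chosen with $m \ll n$, as is typical).

```python
# Sketch: evaluating the flop and storage estimates from the table above
# for illustrative values of m and n.
m, n = 50, 100_000

flops = {
    "GS":   2 * m**2 * n,
    "MGS":  2 * m**2 * n,
    "MGSR": 4 * m**2 * n,              # worst case: reorthogonalize every step
    "HO":   2 * m**2 * n - (2 * m**3) // 3,
}
storage = {
    "GS":   (m + 1) * n,
    "MGS":  (m + 1) * n,
    "MGSR": (m + 1) * n,
    "HO":   (m + 1) * n - m**2 // 2,   # shorter w_j at each step
}

# The m^3 and m^2 correction terms are negligible when m << n:
assert flops["MGSR"] == 2 * flops["GS"]
assert flops["HO"] / flops["GS"] > 0.99
assert storage["HO"] / storage["GS"] > 0.99
```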

The Householder orthogonalization may be a reasonable choice when developing gen-

eral purpose, reliable software packages where robustness is a critical criterion. This is

especially true for solving eigenvalue problems since the cost of orthogonalization is then

amortized over several eigenvalue/eigenvector calculations. When solving linear systems,

the Modified Gram-Schmidt orthogonalization, with a reorthogonalization strategy based

on a measure of the level of cancellation, is more than adequate in most cases.
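A reorthogonalization strategy of the kind just described can be sketched as follows. The threshold `ETA` and the specific cancellation test (comparing the norm of the column before and after orthogonalization) are one common choice, assumed here rather than prescribed by the text.

```python
# Sketch: Modified Gram-Schmidt with selective reorthogonalization.
# A large drop in norm during orthogonalization signals cancellation,
# which triggers a second orthogonalization sweep.
import numpy as np

ETA = 1.0 / np.sqrt(2.0)  # assumed cancellation threshold

def mgs_reorth(A):
    """QR factorization A = Q R by MGS with reorthogonalization."""
    A = np.asarray(A, dtype=float)
    n, m = A.shape
    Q = np.zeros((n, m))
    R = np.zeros((m, m))
    for j in range(m):
        v = A[:, j].copy()
        norm_before = np.linalg.norm(v)
        for i in range(j):                 # first MGS sweep
            h = Q[:, i] @ v
            R[i, j] += h
            v -= h * Q[:, i]
        if np.linalg.norm(v) < ETA * norm_before:
            for i in range(j):             # heavy cancellation: sweep again
                h = Q[:, i] @ v
                R[i, j] += h
                v -= h * Q[:, i]
        R[j, j] = np.linalg.norm(v)
        Q[:, j] = v / R[j, j]
    return Q, R

# An ill-conditioned example where cancellation occurs:
eps = 1e-10
A = np.array([[1.0, 1.0, 1.0],
              [eps, 0.0, 0.0],
              [0.0, eps, 0.0],
              [0.0, 0.0, eps]])
Q, R = mgs_reorth(A)
assert np.linalg.norm(Q.T @ Q - np.eye(3)) < 1e-12   # orthogonality preserved
assert np.allclose(Q @ R, A)
```

In the worst case every column is orthogonalized twice, which is the $4m^2n$ figure quoted for MGSR in the table; in practice the test fires only occasionally.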

6.4  ARNOLDI'S METHOD FOR LINEAR SYSTEMS (FOM)

Given an initial guess $x_0$ to the original linear system $Ax = b$, we now consider an orthogonal projection method as defined in the previous chapter, which takes $\mathcal{L} = \mathcal{K} = \mathcal{K}_m(A, r_0)$, with

$$ \mathcal{K}_m(A, r_0) = \operatorname{span}\{ r_0, A r_0, A^2 r_0, \ldots, A^{m-1} r_0 \} , \tag{6.7} $$

in which $r_0 = b - A x_0$. This method seeks an approximate solution $x_m$ from the affine subspace $x_0 + \mathcal{K}_m$ of dimension $m$ by imposing the Galerkin condition

$$ b - A x_m \perp \mathcal{K}_m . \tag{6.8} $$

If $v_1 = r_0 / \| r_0 \|_2$ in Arnoldi's method, and we set $\beta \equiv \| r_0 \|_2$, then

$$ V_m^T A V_m = H_m $$

by (6.6) and