
The above step requires roughly as many operations as computing the last Arnoldi vector. Therefore, its cost is negligible relative to the cost of the Arnoldi loop.

ALGORITHM 6.9: GMRES with Householder Orthogonalization

1. Compute $r_0 = b - A x_0$, $z := r_0$.
2. For $j = 1, \ldots, m, m+1$ Do:
3.    Compute the Householder unit vector $w_j$ such that
4.    $(w_j)_i = 0$, $i = 1, \ldots, j-1$, and
5.    $(P_j z)_i = 0$, $i = j+1, \ldots, n$, where $P_j = I - 2 w_j w_j^T$;
6.    $h_{j-1} := P_j z$; If $j = 1$ then let $\beta := e_1^T h_0$;
7.    $v := P_1 P_2 \cdots P_j e_j$;
8.    If $j \leq m$ compute $z := P_j P_{j-1} \cdots P_1 A v$,
9. EndDo
10. Define $\bar{H}_m =$ the $(m+1) \times m$ upper part of the matrix $[h_1, \ldots, h_m]$.
11. Compute $y_m = \operatorname{argmin}_y \| \beta e_1 - \bar{H}_m y \|_2$. Let $y_m = (\eta_1, \eta_2, \ldots, \eta_m)^T$.
12. $z := 0$
13. For $j = m, m-1, \ldots, 1$ Do:
14.    $z := P_j (\eta_j e_j + z)$,
15. EndDo
16. Compute $x_m = x_0 + z$.
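As a concrete illustration, the steps above can be sketched in NumPy. This is a minimal dense-matrix sketch, not a production implementation: the function names (`householder_vector`, `householder_gmres`) are chosen here, and the small least-squares problem of line 11 is solved with `numpy.linalg.lstsq` rather than the progressive rotation-based approach discussed later in the text.

```python
import numpy as np

def householder_vector(z, j):
    """Householder unit vector w with w_i = 0 for i < j, chosen so that
    P z = (I - 2 w w^T) z has zeros in entries j+1, ..., n."""
    w = np.zeros_like(z)
    sigma = np.linalg.norm(z[j:])
    if sigma == 0.0:                      # nothing to annihilate
        return w, z.copy()
    alpha = -np.sign(z[j]) * sigma if z[j] != 0 else -sigma
    w[j:] = z[j:]
    w[j] -= alpha                         # w proportional to z - alpha*e_j
    w /= np.linalg.norm(w)
    return w, z - 2.0 * w * (w @ z)       # (w, P z)

def householder_gmres(A, b, x0, m):
    """One (non-restarted) cycle of GMRES with Householder orthogonalization."""
    n = len(b)
    z = b - A @ x0                        # line 1: z := r0
    W = []                                # only the w_j vectors are stored
    H = np.zeros((m + 1, m))
    beta = 0.0
    for j in range(m + 1):                # python j = 0..m is 1-based j = 1..m+1
        w, h = householder_vector(z, j)   # lines 3-5
        W.append(w)
        if j == 0:
            beta = h[0]                   # line 6: beta = e_1^T h_0 = +/- ||r0||
        else:
            k = min(m + 1, n)
            H[:k, j - 1] = h[:k]          # column h_{j-1} of [h_1, ..., h_m]
        if j < m:
            v = np.zeros(n); v[j] = 1.0   # line 7: v := P_1 ... P_j e_j
            for k in range(j, -1, -1):
                v -= 2.0 * W[k] * (W[k] @ v)
            z = A @ v                     # line 8: z := P_j ... P_1 A v
            for k in range(j + 1):
                z -= 2.0 * W[k] * (W[k] @ z)
    rhs = np.zeros(m + 1); rhs[0] = beta  # line 11: min || beta e1 - H y ||
    y, *_ = np.linalg.lstsq(H, rhs, rcond=None)
    z = np.zeros(n)                       # lines 12-15: z := P_j(eta_j e_j + z)
    for j in range(m - 1, -1, -1):
        z[j] += y[j]
        z -= 2.0 * W[j] * (W[j] @ z)
    return x0 + z                         # line 16
```

With $m = n$ the Krylov subspace spans the whole space (generically), so the computed iterate solves the system to roundoff, which gives a simple sanity check of the sketch.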

Note that now only the set of $w_j$ vectors needs to be saved. The scalar $\beta$ defined in line 6 is equal to $\pm \| r_0 \|_2$. This is because $P_1 z = \beta e_1$, where $\beta$ is defined by the equations (1.21) seen in Chapter 1, which define the first Householder transformation. As was observed earlier, the Householder factorization actually obtains the QR factorization (6.10) with $v = r_0$. We can also formulate GMRES directly from this factorization. Indeed, if $x = x_0 + V_m y$, then according to this factorization, the corresponding residual norm is equal to

$$\| \alpha_1 e_1 + \alpha_2 e_2 + \cdots + \alpha_{m+1} e_{m+1} - \bar{H}_m y \|_2$$

whose minimizer is the same as the one defined by the algorithm.
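The identity $P_1 z = \beta e_1$ with $|\beta| = \| z \|_2$ is easy to verify numerically. The small check below uses an arbitrary stand-in vector for $r_0$ and the standard cancellation-avoiding sign choice $\beta = -\operatorname{sign}(z_1)\,\| z \|_2$ (assumed here to match the convention of equations (1.21), which are not reproduced in this excerpt).

```python
import numpy as np

# Arbitrary stand-in for r0; any nonzero vector works.
z = np.array([3.0, 4.0, 0.0, 12.0])      # ||z||_2 = 13

# First Householder transformation: choose w so that P1 z = beta * e1.
beta = -np.sign(z[0]) * np.linalg.norm(z)
w = z.copy()
w[0] -= beta                              # w proportional to z - beta*e1
w /= np.linalg.norm(w)
P1z = z - 2.0 * w * (w @ z)               # apply P1 = I - 2 w w^T

# P1z is (beta, 0, 0, 0) and |beta| equals ||z||_2, as claimed.
```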


The details of implementation of the solution of the least-squares problem as well as

the estimate of the residual norm are identical with those of the Gram-Schmidt versions

and are discussed next.

PRACTICAL IMPLEMENTATION ISSUES

A clear difficulty with Algorithm 6.9 is that it does not provide the approximate solution $x_m$ explicitly at each step. As a result, it is not easy to determine when to stop. One remedy is to compute the approximate solution $x_m$ at regular intervals and check for convergence by a test on the residual, for example. However, there is a more elegant solution, which is related to the way in which the least-squares problem (6.24) is solved.
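The interval-checking remedy can be sketched generically. The snippet below is illustrative only: a simple Richardson iteration stands in for the solver (since reconstructing $x_m$ inside Algorithm 6.9 requires the full machinery above), and the function name and parameters are chosen here. The point is the pattern of forming the residual only every few steps rather than at every iteration.

```python
import numpy as np

def solve_with_periodic_check(A, b, x0, omega=0.1, check_every=5,
                              tol=1e-8, max_iter=500):
    """Iterate, but form the (relatively expensive) explicit residual
    b - A x only every `check_every` steps to test convergence."""
    x = x0.copy()
    for k in range(1, max_iter + 1):
        x = x + omega * (b - A @ x)       # one cheap update step (Richardson)
        if k % check_every == 0:          # periodic explicit residual test
            if np.linalg.norm(b - A @ x) < tol * np.linalg.norm(b):
                return x, k
    return x, max_iter
```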