Variation 1: Restarting

Similar to the FOM algorithm of the previous section, the GMRES algorithm becomes impractical when m is large because of the growth of memory and computational requirements as m increases. These requirements are identical with those of FOM. As with FOM, there are two remedies. One is based on restarting and the other on truncating the Arnoldi orthogonalization. The straightforward restarting option is described here.

ALGORITHM 6.11: Restarted GMRES

1. Compute r_0 = b - A x_0, beta = ||r_0||_2, and v_1 = r_0 / beta
2. Generate the Arnoldi basis and the matrix H̄_m using the Arnoldi algorithm
3.    starting with v_1
4. Compute y_m which minimizes ||beta e_1 - H̄_m y||_2 and x_m = x_0 + V_m y_m
5. If satisfied then Stop, else set x_0 := x_m and GoTo 1

Note that the implementation tricks discussed in the previous section can be applied, providing the residual norm at each sub-step j without computing the approximation x_j. This enables the program to exit as soon as this norm is small enough.
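As a concrete illustration, the restarted cycle above can be sketched in NumPy as follows. This is a minimal sketch, not the book's implementation: the function name and tolerances are our own, and the small least-squares problem in step 4 is solved directly with `numpy.linalg.lstsq` rather than with the progressive plane-rotation update mentioned above.

```python
import numpy as np

def gmres_restarted(A, b, x0, m=10, tol=1e-8, max_restarts=50):
    """Sketch of GMRES(m): build an m-step Arnoldi basis, solve the small
    least-squares problem min ||beta*e1 - Hbar y||_2, update x, restart."""
    n = len(b)
    x = x0.astype(float)
    for _ in range(max_restarts):
        r = b - A @ x                        # step 1: residual r0 = b - A x0
        beta = np.linalg.norm(r)
        if beta < tol:
            return x
        V = np.zeros((n, m + 1))             # Arnoldi basis vectors
        H = np.zeros((m + 1, m))             # (m+1) x m Hessenberg matrix
        V[:, 0] = r / beta                   # v1 = r0 / beta
        k = m
        for j in range(m):                   # steps 2-3: Arnoldi (modified Gram-Schmidt)
            w = A @ V[:, j]
            for i in range(j + 1):
                H[i, j] = V[:, i] @ w
                w -= H[i, j] * V[:, i]
            H[j + 1, j] = np.linalg.norm(w)
            if H[j + 1, j] < 1e-14:          # happy breakdown: solution lies in K_j
                k = j + 1
                break
            V[:, j + 1] = w / H[j + 1, j]
        e1 = np.zeros(k + 1)
        e1[0] = beta
        # step 4: y minimizes ||beta*e1 - Hbar y||_2
        y, *_ = np.linalg.lstsq(H[:k + 1, :k], e1, rcond=None)
        x = x + V[:, :k] @ y                 # x_m = x_0 + V_m y_m
    return x                                 # step 5 loop exhausted
```

A production code would instead update the QR factorization of H̄_m with Givens rotations as each column arrives, which is what makes the cheap per-step residual estimate possible.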

A well-known difficulty with the restarted GMRES algorithm is that it can stagnate when the matrix is not positive definite. The full GMRES algorithm is guaranteed to converge in at most n steps, but this would be impractical if there were many steps required for convergence. Obviously, a preconditioner for the linear system can be used to reduce the number of steps, or a better preconditioner if one is already in use. This issue will be covered later along with preconditioning techniques.

Example 6.2. Table 6.2 shows the results of applying the GMRES algorithm with no preconditioning to three of the test problems described in Section 3.7.

Matrix   Iters   Kflops   Residual   Error
F2DA        95     3841   0.32E-02   0.11E-03
F3D         67    11862   0.37E-03   0.28E-03
ORS        205     9221   0.33E+00   0.68E-04

Table 6.2: A test run of GMRES with no preconditioning.

See Example 6.1 for the meaning of the column headers in the table. In this test, the dimension of the Krylov subspace is m = 10. Observe that the problem ORS, which could not be solved by FOM(10), is now solved in 205 steps.

Variation 2: Truncated GMRES Versions

It is possible to derive an Incomplete version of the GMRES algorithm. This algorithm

is called Quasi-GMRES (QGMRES) for the sake of notational uniformity with other al-

gorithms developed in the literature (some of which will be seen in the next chapter). A

direct version called DQGMRES using exactly the same arguments as in Section 6.4.2 for

DIOM can also be derived. We begin by defining the QGMRES algorithm, in simple terms,

by replacing the Arnoldi Algorithm with Algorithm 6.6, the Incomplete Orthogonalization

procedure.

ALGORITHM 6.12: Quasi-GMRES

Run a modification of Algorithm 6.9 in which the Arnoldi process in lines 3 to 11

is replaced by the Incomplete Orthogonalization process and all other computa-

tions remain unchanged.

Similar to IOM, only the k previous v_i vectors must be kept at any given step. However, this version of GMRES will potentially save computations but not storage. This is because computing the solution by formula (6.23) requires the vectors v_i for i = 1, ..., m to be accessed. Fortunately, the approximate solution can be updated in a progressive manner, as in DIOM.
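The truncated Arnoldi loop that QGMRES substitutes for the full orthogonalization can be sketched as follows. The helper name `incomplete_arnoldi` is ours, and we assume the usual convention for the Incomplete Orthogonalization process: each new vector A v_j is orthogonalized against only the k most recent basis vectors, which is exactly what makes the resulting Hessenberg matrix banded.

```python
import numpy as np

def incomplete_arnoldi(A, v1, m, k):
    """Incomplete Orthogonalization sketch: orthogonalize A v_j against only
    the k most recent basis vectors, so each column of the (m+1) x m
    Hessenberg matrix H has at most k entries above the subdiagonal."""
    n = len(v1)
    V = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    V[:, 0] = v1 / np.linalg.norm(v1)
    for j in range(m):
        w = A @ V[:, j]
        # truncated Gram-Schmidt: only i = max(0, j-k+1), ..., j
        for i in range(max(0, j - k + 1), j + 1):
            H[i, j] = V[:, i] @ w
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-14:              # breakdown: Krylov space is exhausted
            break
        V[:, j + 1] = w / H[j + 1, j]
    return V, H
```

Because the inner loop touches only k vectors, the arithmetic cost per step is fixed; the storage saving only materializes once the solution update is made progressive, as discussed above.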

The implementation of this progressive version is quite similar to DIOM. First, note that if H̄_m is banded, as, for example, when m = 5 and k = 2,