associated with the original variable since $x = M^{-1}u$. Multiplying (9.19) through to the left by $M^{-1}$ and exploiting again (9.17), observe that the generic variable $x$ associated with a vector of the subspace (9.19) belongs to the affine subspace

$$
x_0 + M^{-1}\mathcal{K}_m \;=\; x_0 + \operatorname{span}\left\{ z_0,\; M^{-1}A\,z_0,\; \ldots,\; (M^{-1}A)^{m-1} z_0 \right\}, \qquad z_0 = M^{-1} r_0 .
$$

This is identical to the affine subspace (9.16) invoked in the left preconditioned variant. In

other words, for the right preconditioned GMRES, the approximate $x$-solution can also be expressed as

$$
x_m = x_0 + s_{m-1}(M^{-1}A)\, M^{-1} r_0 .
$$

However, now $s_{m-1}$ is a polynomial of degree $m-1$ which minimizes the norm

$$
\left\| \, b - A\left[\, x_0 + s_{m-1}(M^{-1}A)\, M^{-1} r_0 \,\right] \right\|_2 \qquad (9.20)
$$

among all polynomials $s$ of degree $\leq m-1$. What is surprising is that the two quantities which are minimized, namely, (9.18) and (9.20), differ only by a multiplication by $M^{-1}$. Specifically, the left preconditioned GMRES minimizes the preconditioned residual norm $\| M^{-1}(b - Ax) \|_2$, whereas the right preconditioned variant minimizes the residual norm $\| b - Ax \|_2$, where $x$ is taken over the same subspace in both cases.

PROPOSITION 9.1 The approximate solution obtained by left or right preconditioned GMRES is of the form

$$
x_m \;=\; x_0 + s_{m-1}(M^{-1}A)\, z_0 \;=\; x_0 + M^{-1} s_{m-1}(A M^{-1})\, r_0 ,
$$

where $z_0 = M^{-1} r_0$ and $s_{m-1}$ is a polynomial of degree $m-1$. The polynomial $s_{m-1}$ minimizes the residual norm $\| b - A x_m \|_2$ in the right preconditioning case, and the preconditioned residual norm $\| M^{-1}(b - A x_m) \|_2$ in the left preconditioning case.

In most practical situations, the difference in the convergence behavior of the two

approaches is not significant. The only exception is when $M$ is ill-conditioned, which could

lead to substantial differences.
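The proposition can be checked numerically. The sketch below is a toy illustration (the matrix, the Jacobi preconditioner, the sizes, and all variable names are my own assumptions, not from the text): it builds the common search space $\operatorname{span}\{z_0, M^{-1}Az_0, \ldots\}$ once, then minimizes the two different norms over it by least squares.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 50, 8                          # problem size and subspace dimension (illustrative)
A = np.diag(np.linspace(1, 100, n)) + 0.1 * rng.standard_normal((n, n))
b = rng.standard_normal(n)
M = np.diag(np.diag(A))               # Jacobi preconditioner, chosen only for simplicity
Minv = np.linalg.inv(M)

x0 = np.zeros(n)
r0 = b - A @ x0
z0 = Minv @ r0

# Common search space: span{z0, (M^-1 A) z0, ..., (M^-1 A)^(m-1) z0}
K = np.empty((n, m))
K[:, 0] = z0
for j in range(1, m):
    K[:, j] = Minv @ (A @ K[:, j - 1])

# Left preconditioning: minimize || M^-1 (b - A(x0 + K y)) ||_2
y_left, *_ = np.linalg.lstsq(Minv @ A @ K, Minv @ r0, rcond=None)
# Right preconditioning: minimize || b - A(x0 + K y) ||_2
y_right, *_ = np.linalg.lstsq(A @ K, r0, rcond=None)

x_left = x0 + K @ y_left
x_right = x0 + K @ y_right
print(np.linalg.norm(Minv @ (b - A @ x_left)))   # smallest preconditioned residual
print(np.linalg.norm(b - A @ x_right))           # smallest true residual
```

Both iterates lie in the same affine subspace; only the norm being minimized differs, which is the content of the proposition.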

FLEXIBLE VARIANTS

In the discussion of preconditioning techniques so far, it is implicitly assumed that the preconditioning matrix $M$ is fixed, i.e., it does not change from step to step. However, in some cases, no matrix $M$ is available. Instead, the operation $M^{-1}x$ is the result of some unspecified computation, possibly another iterative process. In such cases, it may well happen that $M^{-1}$ is not a constant operator. The previous preconditioned iterative procedures will not converge if $M^{-1}$ is not constant. There are a number of variants of iterative procedures developed in the literature that can accommodate variations in the preconditioner, i.e., that allow the preconditioner to vary from step to step. Such iterative procedures are called "flexible" iterations. One of these iterations, a flexible variant of the GMRES algorithm, is described

next.
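A concrete way to see why $M^{-1}$ may fail to be a constant operator: take the preconditioning operation to be a few sweeps of an inner iteration started from a fresh initial guess. The toy example below (all names and parameters are assumptions for illustration) applies the same "preconditioner" twice to the same vector and gets different results, so no fixed matrix represents it.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20
# Simple symmetric tridiagonal test matrix
A = (np.diag(np.full(n, 4.0))
     + np.diag(np.full(n - 1, -1.0), 1)
     + np.diag(np.full(n - 1, -1.0), -1))

def apply_preconditioner(v, sweeps=3):
    """'M^{-1} v' realized by a few Jacobi sweeps on A x = v, started
    from a random initial guess -- an unspecified inner computation,
    hence not a constant linear operator."""
    D = np.diag(A)
    x = rng.standard_normal(n)          # varying starting point each call
    for _ in range(sweeps):
        x = x + (v - A @ x) / D         # one Jacobi sweep
    return x

v = np.ones(n)
z1 = apply_preconditioner(v)
z2 = apply_preconditioner(v)
print(np.linalg.norm(z1 - z2))          # nonzero: the operator changed between calls
```

A standard preconditioned Krylov method implicitly assumes `z1 == z2` here; a flexible iteration does not.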

FLEXIBLE GMRES

We begin by examining the right preconditioned GMRES algorithm. In line 11 of Algorithm 9.5 the approximate solution $x_m$ is expressed as a linear combination of the preconditioned vectors $z_i = M^{-1} v_i$, $i = 1, \ldots, m$. These vectors are also computed in line 3, prior to their multiplication by $A$ to obtain the vector $w$. They are all obtained by applying the same preconditioning matrix $M^{-1}$ to the $v_i$'s. As a result it is not necessary to save them. Instead, we only need to apply $M^{-1}$ to the linear combination of the $v_i$'s, i.e., to $V_m y_m$ in line 11. Suppose now that the preconditioner could change at every step, i.e., that $z_j$ is given by

$$
z_j = M_j^{-1} v_j .
$$

Then it would be natural to compute the approximate solution as

$$
x_m = x_0 + Z_m\, y_m ,
$$

in which $Z_m = [z_1, \ldots, z_m]$
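The construction just described can be sketched as a flexible Arnoldi loop: each preconditioned vector $z_j = M_j^{-1} v_j$ is stored as a column of $Z_m$, and the approximate solution is formed as $x_m = x_0 + Z_m y_m$. The code below is a minimal sketch, not the book's algorithm verbatim; the function signature, the step-dependent preconditioner, and the test problem are my own illustrative assumptions.

```python
import numpy as np

def fgmres(A, b, x0, m, precond):
    """One cycle of flexible GMRES (sketch, no breakdown handling).
    precond(v, j) may apply a *different* operator M_j^{-1} at each step j;
    the preconditioned vectors z_j must therefore be stored in Z."""
    n = len(b)
    r0 = b - A @ x0
    beta = np.linalg.norm(r0)
    V = np.zeros((n, m + 1))            # Arnoldi basis
    Z = np.zeros((n, m))                # stored preconditioned vectors
    H = np.zeros((m + 1, m))            # Hessenberg matrix
    V[:, 0] = r0 / beta
    for j in range(m):
        Z[:, j] = precond(V[:, j], j)   # z_j = M_j^{-1} v_j  (stored!)
        w = A @ Z[:, j]
        for i in range(j + 1):          # modified Gram-Schmidt
            H[i, j] = V[:, i] @ w
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        V[:, j + 1] = w / H[j + 1, j]
    # y_m minimizes || beta e_1 - H y ||_2, as in standard GMRES
    e1 = np.zeros(m + 1)
    e1[0] = beta
    y, *_ = np.linalg.lstsq(H, e1, rcond=None)
    return x0 + Z @ y                   # x_m = x_0 + Z_m y_m

# Example use with a preconditioner that varies with the step index j
rng = np.random.default_rng(2)
n = 40
d = np.linspace(1, 10, n)
A = np.diag(d) + 0.01 * rng.standard_normal((n, n))
b = rng.standard_normal(n)
precond = lambda v, j: v / (d + 0.1 * j)    # M_j changes at every step
x = fgmres(A, b, np.zeros(n), 20, precond)
print(np.linalg.norm(b - A @ x))
```

Compared with right preconditioned GMRES, the only structural change is that the $z_j$'s are kept, at the cost of one extra vector of storage per step.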