    M^{-1} A x = M^{-1} b.

The above system, which has the same solution as the original system, is called a preconditioned system, and M is the preconditioning matrix or preconditioner. In other words, a relaxation scheme is equivalent to a fixed-point iteration on a preconditioned system.

For example, for the Jacobi, Gauss-Seidel, SOR, and SSOR iterations, these preconditioning matrices are, respectively,

    M_JA   = D,
    M_GS   = D - E,
    M_SOR  = (1/ω) (D - ωE),
    M_SSOR = (1/(ω(2-ω))) (D - ωE) D^{-1} (D - ωF),

where D is the diagonal of A, and -E and -F are its strict lower and strict upper triangular parts, so that A = D - E - F.

Thus, the Jacobi preconditioner is simply the diagonal of A, while the Gauss-Seidel preconditioner is the lower triangular part of A. The constant coefficients in front of the matrices M_SOR and M_SSOR only have the effect of scaling the equations of the preconditioned system uniformly. Therefore, they are unimportant in the preconditioning context.
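As a concrete illustration (ours, not part of the original text), these four matrices can be assembled with NumPy from the splitting A = D - E - F; the function name, the default ω, and the test matrix below are our own choices:

```python
import numpy as np

def relaxation_preconditioners(A, omega=1.5):
    """Build the Jacobi, Gauss-Seidel, SOR, and SSOR preconditioning
    matrices from the splitting A = D - E - F."""
    D = np.diag(np.diag(A))   # diagonal of A
    E = -np.tril(A, -1)       # strict lower part of A is -E
    F = -np.triu(A, 1)        # strict upper part of A is -F
    M_ja = D
    M_gs = D - E
    M_sor = (1.0 / omega) * (D - omega * E)
    M_ssor = (D - omega * E) @ np.linalg.inv(D) @ (D - omega * F) \
             / (omega * (2.0 - omega))
    return M_ja, M_gs, M_sor, M_ssor

A = np.array([[ 4.0, -1.0,  0.0],
              [-1.0,  4.0, -1.0],
              [ 0.0, -1.0,  4.0]])
M_ja, M_gs, M_sor, M_ssor = relaxation_preconditioners(A)
# Jacobi keeps only the diagonal; Gauss-Seidel keeps the lower triangle:
assert np.allclose(M_ja, np.diag(np.diag(A)))
assert np.allclose(M_gs, np.tril(A))
```

Note that for ω = 1 the SOR matrix reduces to the Gauss-Seidel matrix D - E, as expected from the formulas above.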

Note that the “preconditioned” system may be a full system. Indeed, there is no reason why M^{-1}A should be a sparse matrix (even though M may be sparse), since the inverse of a sparse matrix is not necessarily sparse. This limits the number of techniques that can be applied to solve the preconditioned system. Most of the iterative techniques used only require matrix-by-vector products. In this case, to compute w = M^{-1}Av for a given vector v, first compute r = Av and then solve the system Mw = r:

    r = A v,
    w = M^{-1} r.
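A minimal NumPy sketch of this two-step recipe (our illustration, not the book's code); with the Gauss-Seidel choice M = D - E the solve step is a triangular forward substitution, though a generic dense solve is used here for brevity:

```python
import numpy as np

def apply_preconditioned_matvec(A, M, v):
    """Compute w = M^{-1} A v: first r = A v, then solve M w = r."""
    r = A @ v                      # matrix-by-vector product
    return np.linalg.solve(M, r)   # solve M w = r

A = np.array([[ 4.0, -1.0],
              [-1.0,  4.0]])
M = np.tril(A)                 # Gauss-Seidel preconditioner M = D - E
v = np.array([1.0, 2.0])
w = apply_preconditioned_matvec(A, M, v)
assert np.allclose(M @ w, A @ v)   # w indeed satisfies M w = A v
```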

In some cases, it may be advantageous to exploit the splitting A = M - N and compute w = M^{-1}Av as w = (I - M^{-1}N)v by the procedure

    r = N v,
    w = M^{-1} r,
    w := v - w.
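The splitting-based variant can be sketched the same way (again our own illustration, with our own choice of M); it returns the same vector as forming Av and solving Mw = Av directly:

```python
import numpy as np

A = np.array([[ 4.0, -1.0],
              [-1.0,  4.0]])
M = np.tril(A)     # e.g. the Gauss-Seidel preconditioner
N = M - A          # so that A = M - N; here N is strictly upper triangular
v = np.array([1.0, 2.0])

# w = M^{-1} A v computed as w := v - M^{-1} (N v):
r = N @ v
w = v - np.linalg.solve(M, r)

# Same result as the direct computation of M^{-1} A v:
assert np.allclose(w, np.linalg.solve(M, A @ v))
```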

The matrix N may be sparser than A, and the matrix-by-vector product Nv may be less expensive than the product Av. A number of similar but somewhat more complex ideas have been exploited in the context of preconditioned iterative methods. A few of these will

be examined in Chapter 9.

4.2   CONVERGENCE

All the methods seen in the previous section define a sequence of iterates of the form

    x_{k+1} = G x_k + f,    (4.28)

in which G is a certain iteration matrix. The questions addressed in this section are: (a) if the iteration converges, then is the limit indeed a solution of the original system? (b) under which conditions does the iteration converge? (c) when the iteration does converge, how fast is it?
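A quick numerical check of questions (a)-(c) on a small example (our own, using the Jacobi splitting, for which the iteration matrix below has spectral radius 1/4, so convergence is fast):

```python
import numpy as np

A = np.array([[ 4.0, -1.0],
              [-1.0,  4.0]])
b = np.array([3.0, 3.0])

M = np.diag(np.diag(A))                 # Jacobi splitting: M = D
G = np.eye(2) - np.linalg.solve(M, A)   # iteration matrix G = I - M^{-1} A
f = np.linalg.solve(M, b)               # f = M^{-1} b

x = np.zeros(2)
for _ in range(50):
    x = G @ x + f                       # x_{k+1} = G x_k + f

assert np.allclose(A @ x, b)            # the limit solves the original system
```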

If the above iteration converges, its limit x satisfies

    x = G x + f.    (4.29)

In the case where the above iteration arises from the splitting A = M - N, it is easy to see that the solution x to the above system is identical to that of the original system Ax = b.

Indeed, in this case the sequence (4.28) has the form

    x_{k+1} = M^{-1} N x_k + M^{-1} b,

and its limit satisfies