It is now possible to imitate what was done for the standard FOM and GMRES algorithms.

The only missing link is the vector $\beta e_1$ in (6.21), which now becomes a matrix. Let $E_1$ be the $(m+p) \times p$ matrix whose upper $p \times p$ principal block is an identity matrix. Then, the relation (6.109) results in

$$
B - A X_m = B - A \left( X_0 + V_m Y \right)
          = R_0 - A V_m Y
          = V_1 R_1 - V_{m+p} \bar{H}_m Y
          = V_{m+p} \left( E_1 R_1 - \bar{H}_m Y \right). \tag{6.114}
$$
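The block Arnoldi relation invoked above can be illustrated numerically. The following is a minimal sketch (not from the text) of a column-wise, Ruhe-style block Arnoldi procedure; the function name `block_arnoldi` and the random test data are assumptions made for the illustration. It generates $V_{m+p}$ with orthonormal columns and the $(m+p) \times m$ band-Hessenberg matrix $\bar{H}_m$ satisfying $A V_m = V_{m+p} \bar{H}_m$.

```python
import numpy as np

def block_arnoldi(A, V1, m):
    """Column-wise (Ruhe-style) block Arnoldi: a minimal sketch.

    A  : (n, n) array.
    V1 : (n, p) array with orthonormal columns, e.g. from the QR
         factorization R0 = V1 @ R1 of the initial block residual.
    m  : number of columns of V_m to generate.

    Returns V of shape (n, m+p) with orthonormal columns and the
    (m+p, m) band-Hessenberg matrix Hbar such that
    A @ V[:, :m] == V @ Hbar (up to rounding).
    """
    n, p = V1.shape
    V = np.zeros((n, m + p))
    V[:, :p] = V1
    Hbar = np.zeros((m + p, m))
    for k in range(m):
        w = A @ V[:, k]
        for i in range(k + p):            # modified Gram-Schmidt sweep
            Hbar[i, k] = w @ V[:, i]
            w -= Hbar[i, k] * V[:, i]
        Hbar[k + p, k] = np.linalg.norm(w)
        V[:, k + p] = w / Hbar[k + p, k]  # assumes no breakdown occurs
    return V, Hbar
```

With $p = 1$ this reduces to the usual Arnoldi process; each new column of $\bar{H}_m$ has $p$ entries below the diagonal, which gives the band-Hessenberg structure.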

The vector

$$
\bar{g}_i \equiv E_1 R_1 e_i
$$

is a vector of length $m+p$ whose components are zero except those from 1 to $i$, which are extracted from the $i$-th column of the upper triangular matrix $R_1$. The matrix $\bar{H}_m$ is an $(m+p) \times m$ matrix. The block-FOM approximation would consist of deleting the last $p$ rows of $\bar{g}_i$ and $\bar{H}_m$ and solving the resulting system,

$$
H_m y_i = g_i .
$$

The approximate solution $x_i$ is then computed by (6.112).
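Deleting the last $p$ rows is the block analogue of the Galerkin condition: using the orthonormality of the columns of $V_{m+p}$ and the residual expression (6.114) applied to the $i$-th column,

$$
V_m^T \left( b_i - A x_i \right)
 = V_m^T V_{m+p} \left( \bar{g}_i - \bar{H}_m y_i \right)
 = \left[\, I_m \;\; 0 \,\right] \left( \bar{g}_i - \bar{H}_m y_i \right)
 = g_i - H_m y_i = 0 .
$$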

The block-GMRES approximation $x_i$ is the unique vector of the form $x_{0,i} + V_m y_i$ which minimizes the 2-norm of the individual columns of the block-residual (6.114). Since the column-vectors of $V_{m+p}$ are orthonormal, then from (6.114) we get,

$$
\left\| b_i - A x_i \right\|_2 = \left\| \bar{g}_i - \bar{H}_m y_i \right\|_2 .
$$

To minimize the residual norm, the function on the right-hand side must be minimized over $y_i$. The resulting least-squares problem is similar to the one encountered for GMRES. The only differences are in the right-hand side and the fact that the matrix is no longer Hessenberg, but band-Hessenberg. Rotations can be used in a way similar to the scalar case. However, $p$ rotations are now needed at each new step instead of only one. Thus, if $m = 6$ and $p = 2$, the matrix $\bar{H}_6$ and block right-hand side would be as follows:

$$
\bar{H}_6 = \begin{pmatrix}
h_{11} & h_{12} & h_{13} & h_{14} & h_{15} & h_{16} \\
h_{21} & h_{22} & h_{23} & h_{24} & h_{25} & h_{26} \\
h_{31} & h_{32} & h_{33} & h_{34} & h_{35} & h_{36} \\
       & h_{42} & h_{43} & h_{44} & h_{45} & h_{46} \\
       &        & h_{53} & h_{54} & h_{55} & h_{56} \\
       &        &        & h_{64} & h_{65} & h_{66} \\
       &        &        &        & h_{75} & h_{76} \\
       &        &        &        &        & h_{86}
\end{pmatrix}, \qquad
G = \begin{pmatrix}
g_{11} & g_{12} \\
       & g_{22} \\
       &        \\
       &        \\
       &        \\
       &        \\
       &        \\
       &
\end{pmatrix}.
$$
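The elimination cost can be sketched concretely. The following illustration (not from the text; the helper `givens` and the random test data are assumptions) reduces a band-Hessenberg matrix with $p = 2$ subdiagonals to upper triangular form, using exactly $p$ Givens rotations per new column instead of the single rotation of the scalar case, and then solves the resulting triangular system for the least-squares solution $Y$.

```python
import numpy as np

def givens(a, b):
    """Return (c, s) such that [[c, s], [-s, c]] @ [a, b] = [r, 0]."""
    r = np.hypot(a, b)
    return (1.0, 0.0) if r == 0 else (a / r, b / r)

np.random.seed(1)
m, p = 6, 2
# Band-Hessenberg test matrix: h_ij = 0 for i > j + p.
Hbar = np.triu(np.random.randn(m + p, m), -p)
# Block right-hand side G = E1 @ R1, with R1 upper triangular.
G = np.zeros((m + p, p))
G[:p, :p] = np.triu(np.random.randn(p, p))

R, F, nrot = Hbar.copy(), G.copy(), 0
for j in range(m):                  # annihilate the subdiagonal of column j
    for i in range(j + p, j, -1):   # rows j+p, ..., j+1, bottom up
        c, s = givens(R[i - 1, j], R[i, j])
        Q = np.array([[c, s], [-s, c]])
        R[[i - 1, i], :] = Q @ R[[i - 1, i], :]
        F[[i - 1, i], :] = Q @ F[[i - 1, i], :]
        nrot += 1                   # p rotations per column in total

# min_Y ||G - Hbar @ Y|| column-wise: back-substitution on the square part.
Y = np.linalg.solve(R[:m, :], F[:m, :])
```

Since only orthogonal transformations are applied, `Y` coincides with the minimizer obtained from the normal least-squares solve of $\bar{H}_m Y \approx G$, and `nrot` equals $mp$.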