
In addition, the matrix $s(A)$ or $s(A)A$ does not need to be formed explicitly since $s(A)Av$ can be computed for any vector $v$ from a sequence of matrix-by-vector products.
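As a small illustration of this matrix-by-vector evaluation, the sketch below applies $s(A)A$ to a vector using only products with $A$, so the matrix $s(A)A$ is never assembled. The coefficient list and the test matrix are hypothetical, chosen only for the demonstration.

```python
import numpy as np

def apply_poly_times_A(A, coeffs, v):
    """Compute s(A) @ A @ v, where s(t) = coeffs[0] + coeffs[1]*t + ...,
    using only matrix-by-vector products; s(A) is never formed."""
    w = A @ v                      # w = A v
    result = np.zeros_like(w)
    p = w.copy()                   # p = A^k (A v), starting at k = 0
    for c in coeffs:
        result += c * p            # accumulate c_k A^k (A v)
        p = A @ p                  # advance to the next power, applied to a vector
    return result

# Example with the hypothetical polynomial s(t) = 1 + t:
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
v = np.array([1.0, 1.0])
direct = (np.eye(2) + A) @ A @ v   # forming s(A) explicitly, for comparison only
assert np.allclose(apply_poly_times_A(A, [1.0, 1.0], v), direct)
```

Each step costs one matrix-vector product, which is the operation this approach is built around.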

Initially, this approach was motivated by the good performance of matrix-vector operations on vector computers for long vectors, e.g., the Cyber 205. However, the idea itself is an old one and was suggested by Stiefel [204] for eigenvalue calculations in the mid-1950s. Next, some of the popular choices for the polynomial $s$ are described.

Neumann Polynomials

The simplest polynomial which has been used is the polynomial of the Neumann series expansion
$$ I + N + N^2 + \cdots + N^s , $$
in which
$$ N = I - \omega A $$
and $\omega$ is a scaling parameter. The above series comes from expanding the inverse of $\omega A$ using the splitting
$$ \omega A = I - (I - \omega A). $$
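As a quick numerical sanity check, the partial sums $I + N + \cdots + N^s$ do approach $(\omega A)^{-1}$ when the series converges. The $3\times 3$ matrix and the value of $\omega$ below are illustrative assumptions, with $\omega$ chosen so that the spectral radius of $N = I - \omega A$ is below one.

```python
import numpy as np

# Hypothetical symmetric test matrix (eigenvalues lie in [2, 5] by Gershgorin).
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])
omega = 0.2                 # puts the eigenvalues of N = I - omega*A inside [0, 0.6]
N = np.eye(3) - omega * A

# Accumulate the partial sum S = I + N + N^2 + ... + N^40.
S = np.eye(3)
P = np.eye(3)
for _ in range(40):
    P = P @ N               # P = N^k
    S = S + P

# The partial sum approximates the inverse of omega*A = I - N.
assert np.allclose(S, np.linalg.inv(omega * A), atol=1e-6)
```

The convergence rate is governed by the spectral radius of $N$, which is why the scaling parameter $\omega$ matters.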

This approach can also be generalized by using a splitting of the form
$$ \omega A = D - (D - \omega A), $$
where $D$ can be the diagonal of $A$ or, more appropriately, a block diagonal of $A$. Then,
$$ (\omega A)^{-1} = \left[ D \left( I - (I - \omega D^{-1} A) \right) \right]^{-1} = \left( I - (I - \omega D^{-1} A) \right)^{-1} D^{-1}. $$

Thus, setting
$$ N = I - \omega D^{-1} A $$
results in the approximate $s$-term expansion
$$ M^{-1} \equiv \left( I + N + \cdots + N^s \right) D^{-1}. $$

Since $D^{-1}A = \omega^{-1}(I - N)$, note that
$$ M^{-1}A = \left( I + N + \cdots + N^s \right) D^{-1} A = \frac{1}{\omega} \left( I + N + \cdots + N^s \right) (I - N) = \frac{1}{\omega} \left( I - N^{s+1} \right). $$
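The telescoping identity $M^{-1}A = \frac{1}{\omega}(I - N^{s+1})$ can be verified numerically. In the sketch below the test matrix, the choice $D = \mathrm{diag}(A)$, and the values of $\omega$ and $s$ are all illustrative assumptions; the identity itself holds for any nonzero $\omega$.

```python
import numpy as np

# Hypothetical test matrix; D is taken as its diagonal for illustration.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])
D_inv = np.diag(1.0 / np.diag(A))
omega = 0.9
s = 4

I = np.eye(3)
N = I - omega * D_inv @ A

# M^{-1} = (I + N + ... + N^s) D^{-1}
S = sum(np.linalg.matrix_power(N, k) for k in range(s + 1))
M_inv = S @ D_inv

# Check the closed form M^{-1} A = (1/omega) (I - N^{s+1}).
lhs = M_inv @ A
rhs = (I - np.linalg.matrix_power(N, s + 1)) / omega
assert np.allclose(lhs, rhs)
```

Because $(I + N + \cdots + N^s)(I - N) = I - N^{s+1}$ exactly, the check passes up to rounding error regardless of whether the Neumann series converges.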