
Consider the inner product on the space $\mathbb{P}_k$:

\[
\langle p, q \rangle = \int_a^b p(\lambda)\, q(\lambda)\, w(\lambda)\, d\lambda ,
\]

where $w(\lambda)$ is some non-negative weight function on $(a, b)$. Denote by $\|p\|_w$, and call the $w$-norm, the 2-norm induced by this inner product.
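As a rough numerical sketch, this weighted inner product and the induced $w$-norm can be approximated by Gauss–Legendre quadrature. The interval $(a, b) = (0, 1)$ and the weight $w(\lambda) = 1$ below are illustrative assumptions, not choices made in the text:

```python
import numpy as np

def weighted_inner(p, q, w, a, b, nquad=64):
    # Approximate <p, q> = \int_a^b p(l) q(l) w(l) dl by Gauss-Legendre quadrature.
    x, wts = np.polynomial.legendre.leggauss(nquad)
    lam = 0.5 * (b - a) * x + 0.5 * (b + a)  # map nodes from [-1, 1] to [a, b]
    return 0.5 * (b - a) * np.sum(wts * p(lam) * q(lam) * w(lam))

def w_norm(p, w, a, b):
    # The w-norm induced by the inner product.
    return np.sqrt(weighted_inner(p, p, w, a, b))

# Example: ||lambda||_w on (0, 1) with w = 1 equals sqrt(1/3).
w = lambda lam: np.ones_like(lam)
p = lambda lam: lam
print(w_norm(p, w, 0.0, 1.0))  # approx. 0.57735 = sqrt(1/3)
```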

We seek the polynomial $s_{k-1}$ which minimizes

\[
\| 1 - \lambda s(\lambda) \|_w \tag{12.11}
\]

over all polynomials $s$ of degree $\le k - 1$. Call $s_{k-1}$ the least-squares iteration polynomial, or simply the least-squares polynomial, and refer to $R_k(\lambda) \equiv 1 - \lambda s_{k-1}(\lambda)$ as the least-squares residual polynomial. A crucial observation is that the least-squares polynomial is well defined for arbitrary values of $a$ and $b$. Computing the polynomial $s_{k-1}(\lambda)$ is not a difficult task when the weight function $w$ is suitably chosen.

Computation of the least-squares polynomials. There are three ways to compute the least-squares polynomial $s_{k-1}$ defined in the previous section. The first approach is to use an explicit formula for $R_k$, known as the kernel polynomials formula,

\[
R_k(\lambda) = \frac{\sum_{i=0}^{k} q_i(0)\, q_i(\lambda)}{\sum_{i=0}^{k} q_i(0)^2} \tag{12.12}
\]

in which the $q_i$'s represent a sequence of polynomials orthogonal with respect to the weight function $w(\lambda)$. The second approach generates a three-term recurrence satisfied by the residual polynomials $R_k(\lambda)$. These polynomials are orthogonal with respect to the weight function $\lambda w(\lambda)$. From this three-term recurrence, we can proceed exactly as for the Chebyshev iteration to obtain a recurrence formula for the sequence of approximate solutions $x_k$.

Finally, a third approach solves the Normal Equations associated with the minimization of

(12.11), namely,

\[
\left\langle\, 1 - \lambda s_{k-1}(\lambda),\; \lambda Q_j(\lambda) \,\right\rangle = 0, \qquad j = 0, 1, \ldots, k - 1,
\]

where $Q_j$, $j = 0, 1, \ldots, k - 1$, is any basis of the space $\mathbb{P}_{k-1}$ of polynomials of degree $\le k - 1$.
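The normal-equations approach can be sketched numerically. The monomial basis $Q_j(\lambda) = \lambda^j$, the interval $(0, 1)$, and the unit weight used below are illustrative assumptions; with $s(\lambda) = \sum_i c_i \lambda^i$, each normal equation becomes a linear equation for the coefficients $c_i$:

```python
import numpy as np

def least_squares_s(k, a=0.0, b=1.0, nquad=64):
    """Solve the normal equations <1 - lam*s(lam), lam**(j+1)> = 0, j = 0..k-1,
    for s(lam) = sum_i c_i lam**i, with w(lam) = 1 on (a, b).
    Monomial basis and unit weight are illustrative choices."""
    x, wts = np.polynomial.legendre.leggauss(nquad)
    lam = 0.5 * (b - a) * x + 0.5 * (b + a)
    wq = 0.5 * (b - a) * wts  # quadrature weights for \int_a^b
    A = np.empty((k, k))
    rhs = np.empty(k)
    for j in range(k):
        rhs[j] = np.sum(wq * lam ** (j + 1))           # <1, lam Q_j>
        for i in range(k):
            A[j, i] = np.sum(wq * lam ** (i + j + 2))  # <lam**(i+1), lam**(j+1)>
    return np.linalg.solve(A, rhs)  # coefficients c_0, ..., c_{k-1}

c = least_squares_s(3)
# The residual polynomial R_k(lam) = 1 - lam * s(lam) should be small on (0, 1).
lam = np.linspace(0.01, 1.0, 5)
s = sum(ci * lam ** i for i, ci in enumerate(c))
print(1.0 - lam * s)
```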

These three approaches can all be useful in different situations. For example, the ¬rst

approach can be useful for computing least-squares polynomials of low degree explicitly.

For high-degree polynomials, the last two approaches are preferable for their better numerical behavior. The second approach is restricted to the case where $a \ge 0$, while the third is more general.

Since the degrees of the polynomial preconditioners are often low, e.g., not exceeding 5 or 10, we will give some details on the first formulation. Let $q_i(\lambda)$, $i = 0, 1, \ldots$, be the orthonormal polynomials with respect to $w(\lambda)$. It is known that the least-squares residual polynomial $R_k(\lambda)$ of degree $k$ is determined by the kernel polynomials formula (12.12). To obtain $s_{k-1}(\lambda)$, simply notice that

\[
s_{k-1}(\lambda) = \frac{1 - R_k(\lambda)}{\lambda}
= \frac{\sum_{i=0}^{k} q_i(0)\, t_i(\lambda)}{\sum_{i=0}^{k} q_i(0)^2}
\]

with

\[
t_i(\lambda) = \frac{q_i(0) - q_i(\lambda)}{\lambda}.
\]
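These formulas can be checked numerically. The sketch below assumes $w(\lambda) = 1$ on $(0, 1)$, for which the orthonormal $q_i$ are scaled shifted Legendre polynomials (an illustrative choice), and verifies that $1 - \lambda s_{k-1}(\lambda)$ reproduces the kernel-formula residual $R_k(\lambda)$:

```python
import numpy as np
from numpy.polynomial import legendre as L

def q(i, lam):
    # Orthonormal shifted Legendre polynomials on (0, 1): assumed weight w = 1.
    coef = np.zeros(i + 1)
    coef[i] = 1.0
    return np.sqrt(2 * i + 1) * L.legval(2 * lam - 1, coef)

def R(k, lam):
    # Kernel polynomials formula (12.12).
    num = sum(q(i, 0.0) * q(i, lam) for i in range(k + 1))
    den = sum(q(i, 0.0) ** 2 for i in range(k + 1))
    return num / den

def s(k, lam):
    # s_{k-1}(lam) = sum q_i(0) t_i(lam) / sum q_i(0)^2,
    # with t_i(lam) = (q_i(0) - q_i(lam)) / lam.
    q0 = [q(i, 0.0) for i in range(k + 1)]
    num = sum(q0[i] * (q0[i] - q(i, lam)) / lam for i in range(k + 1))
    den = sum(q0[i] ** 2 for i in range(k + 1))
    return num / den

lam = np.linspace(0.1, 1.0, 5)
print(np.allclose(1.0 - lam * s(3, lam), R(3, lam)))  # prints True
```

The consistency check holds by construction, since the numerator of $s_{k-1}$ is just $(1 - R_k(\lambda))/\lambda$ times the common denominator.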