What is $\mathcal{L}$ in this case? A not too commonly used alternative is to take $\mathcal{L} = A\mathcal{K}$, which amounts to solving a least-squares problem instead of a linear system. Develop algorithms for this case. What are the advantages and disadvantages of the two approaches (ignoring convergence rates)?
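To make the comparison concrete, here is a minimal numpy sketch of one projection step done both ways; the matrix $A$, the basis $V$ of $\mathcal{K}$, and all dimensions are illustrative assumptions, not data from the exercise:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 8, 3
A = rng.standard_normal((n, n)) + n * np.eye(n)  # well-conditioned test matrix
b = rng.standard_normal(n)
V = rng.standard_normal((n, m))                  # basis of the subspace K
r = b.copy()                                     # residual at x0 = 0

# Approach 1 (L = K): Galerkin condition, solve the m x m linear system
#   (V^T A V) y = V^T r
y_lin = np.linalg.solve(V.T @ A @ V, V.T @ r)

# Approach 2 (L = A K): minimal-residual condition, solve the least-squares
#   problem  min_y || r - A V y ||_2
y_ls, *_ = np.linalg.lstsq(A @ V, r, rcond=None)

# The least-squares choice minimizes the residual norm over the subspace,
# so it can never do worse than the Galerkin choice in the 2-norm:
print(np.linalg.norm(r - A @ V @ y_ls), np.linalg.norm(r - A @ V @ y_lin))
```

The sketch also illustrates the trade-off the question asks about: the least-squares variant guarantees a non-increasing residual norm but requires working with $AV$, while the linear-system variant only needs the smaller projected matrix $V^T A V$.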

14 Let the scalars $\omega_i$ in the additive projection procedure satisfy the constraint
\[
\sum_{i=1}^{p} \omega_i = 1 . \tag{5.27}
\]
It is not assumed that each $\omega_i$ is positive but only that $|\omega_i| \le 1$ for all $i$. The residual vector is given by the Formula (5.26) or, equivalently,
\[
r_{new} = \sum_{i=1}^{p} \omega_i \left( I - A P_i \right) r .
\]

a. Show that in the least-squares case, we have $\| r_{new} \|_2 \le \| r \|_2$ for any choice of $\omega_i$'s which satisfy the constraint (5.27).

b. We wish to choose a set of $\omega_i$'s such that the 2-norm of the residual vector $r_{new}$ is minimal. Determine this set of $\omega_i$'s, assuming that the vectors $(I - A P_i) r$ are all linearly independent.
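One standard way to compute such a constrained minimizer numerically is the Lagrange-multiplier solution of $\min \|Q\omega\|_2$ subject to $\sum_i \omega_i = 1$, where the columns of $Q$ stand in for the vectors $(I - AP_i)r$. This is only a sketch of that generic solution, with random columns, and is not necessarily the closed form the exercise expects:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 6, 3
Q = rng.standard_normal((n, p))   # columns stand in for (I - A P_i) r, assumed independent

G = Q.T @ Q                       # SPD Gram matrix (columns independent)
ones = np.ones(p)
z = np.linalg.solve(G, ones)
w = z / (ones @ z)                # Lagrange-multiplier solution of
                                  #   min ||Q w||_2  subject to  sum(w) = 1

# check against another admissible choice (perturbation sums to zero)
w2 = w + np.array([0.3, -0.2, -0.1])
print(np.linalg.norm(Q @ w), np.linalg.norm(Q @ w2))
```

Note that $G = Q^T Q$ is exactly the Symmetric Positive Definite matrix referred to in the next part.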

c. The ``optimal'' $\omega_i$'s provided in the previous question require the solution of a Symmetric Positive Definite linear system. Let $z_i \equiv V_i y_i$ be the ``search directions'' provided by each of the individual projection steps. To avoid this difficulty, a simpler strategy is used which consists of performing $p$ successive minimal residual iterations along these search directions, as is described below.

For $i = 1, \ldots, p$ Do:
    $\alpha_i := (r, A z_i) / (A z_i, A z_i)$
    $x := x + \alpha_i z_i$
    $r := r - \alpha_i A z_i$
EndDo

Show that $\| r_{new} \|_2 \le \| r \|_2$. Give a sufficient condition to ensure global convergence.
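The loop above can be sketched directly in numpy; the matrix, right-hand side, and search directions below are arbitrary assumptions made only so the sketch runs:

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 8, 3
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)
Z = rng.standard_normal((n, p))    # columns z_1, ..., z_p: the search directions

x = np.zeros(n)
r = b - A @ x
r0_norm = np.linalg.norm(r)

for i in range(p):                 # successive minimal residual steps
    Az = A @ Z[:, i]
    alpha = (r @ Az) / (Az @ Az)   # alpha_i := (r, A z_i) / (A z_i, A z_i)
    x = x + alpha * Z[:, i]
    r = r - alpha * Az

# each step minimizes ||r||_2 along its own direction, so the norm never grows
print(r0_norm, np.linalg.norm(r))
```

Each step subtracts the orthogonal projection of $r$ onto $Az_i$, which is exactly why $\|r_{new}\|_2 \le \|r\|_2$ holds step by step.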

15 Consider the iteration: $x_{k+1} = x_k + \alpha_k d_k$, where $d_k$ is a vector called the direction of search, and $\alpha_k$ is a scalar. It is assumed throughout that $d_k$ is a nonzero vector. Consider a method which determines $x_{k+1}$ so that the residual $\| r_{k+1} \|_2$ is the smallest possible.

a. Determine $\alpha_k$ so that $\| r_{k+1} \|_2$ is minimal.

b. Show that the residual vector $r_{k+1}$ obtained in this manner is orthogonal to $A d_k$.

c. Show that the residual vectors satisfy the relation:
\[
\| r_{k+1} \|_2 \le \| r_k \|_2 \, \sin \angle ( r_k , A d_k ) .
\]

d. Assume that at each step $k$, we have $( r_k , A d_k ) \ne 0$. Will the method always converge?

e. Now assume that $A$ is positive definite and select at each step $d_k \equiv r_k$. Prove that the method will converge for any initial guess $x_0$.
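The special case of part e can be tried out numerically. A minimal sketch, assuming an arbitrary symmetric positive definite test matrix (the exercise itself only requires positive definiteness):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 10
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)        # symmetric positive definite test matrix
b = rng.standard_normal(n)

x = np.zeros(n)
r = b - A @ x
norms = [np.linalg.norm(r)]
for _ in range(50):                # minimal residual iteration with d_k = r_k
    Ar = A @ r
    alpha = (r @ Ar) / (Ar @ Ar)   # the optimal alpha_k from part a
    x = x + alpha * r
    r = r - alpha * Ar
    norms.append(np.linalg.norm(r))

# the residual norms decrease monotonically, as parts a and c predict
print(norms[0], norms[-1])
```

Positive definiteness guarantees $(r_k, A r_k) \ne 0$ whenever $r_k \ne 0$, which is what makes every step produce a strict decrease.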

16 Consider the iteration: $x_{k+1} = x_k + \alpha_k d_k$, where $d_k$ is a vector called the direction of search, and $\alpha_k$ is a scalar. It is assumed throughout that $d_k$ is a vector which is selected in the form $d_k = A^T f_k$, where $f_k$ is some nonzero vector. Let $x_*$ be the exact solution. Now consider a method which at each step $k$ determines $x_{k+1}$ so that the error norm $\| x_* - x_{k+1} \|_2$ is the smallest possible.

a. Determine $\alpha_k$ so that $\| x_* - x_{k+1} \|_2$ is minimal and show that the error vector $e_{k+1} = x_* - x_{k+1}$ is orthogonal to $d_k$. The expression of $\alpha_k$ should not contain unknown quantities (e.g., $x_*$ or $e_k$).

b. Show that $\| e_{k+1} \|_2 \le \| e_k \|_2 \, \sin \angle ( e_k , d_k )$.
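A numerical sketch of this error-minimizing iteration, using the identity $(d_k, e_k) = (A^T f_k, e_k) = (f_k, A e_k) = (f_k, r_k)$ to evaluate $\alpha_k$ without unknowns. The choice $f_k = r_k$ and the test matrix below are assumptions made for the demonstration, not requirements of the exercise:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 10
A = rng.standard_normal((n, n)) + n * np.eye(n)   # nonsingular test matrix
x_star = rng.standard_normal(n)                    # exact solution, kept only to measure the error
b = A @ x_star

x = np.zeros(n)
err = [np.linalg.norm(x_star - x)]
for _ in range(100):
    r = b - A @ x
    f = r                          # one admissible choice of the nonzero vector f_k
    d = A.T @ f                    # search direction d_k = A^T f_k
    # alpha_k = (d_k, e_k)/(d_k, d_k) = (f_k, r_k)/(d_k, d_k): no unknown quantities
    alpha = (f @ r) / (d @ d)
    x = x + alpha * d
    err.append(np.linalg.norm(x_star - x))

# each step orthogonally projects the error, so its norm never increases
print(err[0], err[-1])
```

The loop uses $x_*$ only to record the error history; the update itself touches nothing but $A$, $b$, and $x$, which is the point of part a.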