A set of vectors $G = \{a_1, a_2, \ldots, a_r\}$ is said to be orthogonal if

$$
(a_i, a_j) = 0 \quad \text{when } i \neq j .
$$

It is orthonormal if, in addition, every vector of $G$ has a 2-norm equal to unity. A vector

that is orthogonal to all the vectors of a subspace $S$ is said to be orthogonal to this subspace. The set of all the vectors that are orthogonal to $S$ is a vector subspace called the orthogonal complement of $S$ and denoted by $S^\perp$. The space is the direct sum of $S$ and its orthogonal complement. Thus, any vector $x$ can be written in a unique fashion as the sum of a vector in $S$ and a vector in $S^\perp$. The operator which maps $x$ into its component in the subspace $S$ is the orthogonal projector onto $S$.
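As an illustrative sketch (not from the text), the orthogonal projector onto a subspace spanned by orthonormal vectors $q_1, \ldots, q_r$ maps $x$ to $\sum_i (x, q_i)\, q_i$. In the NumPy example below, the matrix `Q` holding the $q_i$ as columns and the vector `x` are assumed setup data:

```python
import numpy as np

# Hypothetical setup: Q has orthonormal columns spanning a subspace S of R^5.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.random((5, 2)))  # 5x2 matrix with orthonormal columns

x = rng.random(5)
x_S = Q @ (Q.T @ x)   # component of x in the subspace S (orthogonal projection)
x_perp = x - x_S      # component of x in the orthogonal complement of S

# x splits uniquely into these two components, which are orthogonal:
assert np.allclose(x_S + x_perp, x)
assert abs(x_S @ x_perp) < 1e-10
```

The map `x -> Q @ (Q.T @ x)` is linear and idempotent, as an orthogonal projector must be.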


Every subspace admits an orthonormal basis which is obtained by taking any basis and "orthonormalizing" it. The orthonormalization can be achieved by an algorithm known as the Gram-Schmidt process which we now describe. Given a set of linearly independent vectors $x_1, x_2, \ldots, x_r$, first normalize the vector $x_1$, which means divide it by its 2-norm, to obtain the scaled vector $q_1$ of norm unity. Then $x_2$ is orthogonalized against the vector $q_1$ by subtracting from $x_2$ a multiple of $q_1$ to make the resulting vector orthogonal to $q_1$, i.e.,

$$
x_2 \leftarrow x_2 - (x_2, q_1)\, q_1 .
$$

The resulting vector is again normalized to yield the second vector $q_2$. The $i$-th step of the Gram-Schmidt process consists of orthogonalizing the vector $x_i$ against all previous vectors $q_j$.

ALGORITHM: Gram-Schmidt

1. Compute $r_{11} := \|x_1\|_2$. If $r_{11} = 0$ Stop, else compute $q_1 := x_1 / r_{11}$.

2. For $j = 2, \ldots, r$ Do:
3.     Compute $r_{ij} := (x_j, q_i)$, for $i = 1, 2, \ldots, j-1$
4.     $\hat{q} := x_j - \sum_{i=1}^{j-1} r_{ij} q_i$
5.     $r_{jj} := \|\hat{q}\|_2$,
6.     If $r_{jj} = 0$ then Stop, else $q_j := \hat{q} / r_{jj}$
7. EndDo
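The steps above can be sketched in Python with NumPy. This is an illustrative implementation, not the book's code; the function name `gram_schmidt` and the test matrix are my own:

```python
import numpy as np

def gram_schmidt(X):
    """Classical Gram-Schmidt on the columns of X.
    Returns Q with orthonormal columns and upper triangular R with X = Q R.
    Raises ValueError if a column is linearly dependent on the previous ones
    (the r_jj = 0 case in which the algorithm stops)."""
    n, r = X.shape
    Q = np.zeros((n, r))
    R = np.zeros((r, r))
    for j in range(r):
        # lines 3-4: orthogonalize x_j against all previous q_i
        R[:j, j] = Q[:, :j].T @ X[:, j]       # r_ij = (x_j, q_i)
        q_hat = X[:, j] - Q[:, :j] @ R[:j, j]
        # lines 5-6: normalize the result
        R[j, j] = np.linalg.norm(q_hat)
        if R[j, j] == 0:
            raise ValueError("columns are linearly dependent")
        Q[:, j] = q_hat / R[j, j]
    return Q, R

X = np.array([[1., 1.], [0., 1.], [1., 0.]])
Q, R = gram_schmidt(X)
assert np.allclose(Q @ R, X)             # the relation X = Q R holds
assert np.allclose(Q.T @ Q, np.eye(2))   # the q_j are orthonormal
```

Note that the first pass ($j = 1$ in the algorithm, `j = 0` here) performs no subtraction, matching line 1 of the algorithm.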

It is easy to prove that the above algorithm will not break down, i.e., all $r$ steps will be completed if and only if the set of vectors $x_1, x_2, \ldots, x_r$ is linearly independent. From

lines 4 and 5, it is clear that at every step of the algorithm the following relation holds:

$$
x_j = \sum_{i=1}^{j} r_{ij}\, q_i .
$$

If $X = [x_1, x_2, \ldots, x_r]$, $Q = [q_1, q_2, \ldots, q_r]$, and if $R$ denotes the $r \times r$ upper triangular matrix whose nonzero elements are the $r_{ij}$ defined in the algorithm, then the above relation

can be written as

$$
X = QR .
$$

This is called the QR decomposition of the matrix $X$. From what was said above, the QR decomposition of a matrix exists whenever the column vectors of $X$ form a linearly independent set of vectors.
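In practice the factorization is usually obtained from a library routine (typically based on Householder reflections rather than Gram-Schmidt, but producing the same $X = QR$ relation). A short NumPy example:

```python
import numpy as np

# QR factorization of a matrix with linearly independent columns.
X = np.array([[1., 2.], [3., 4.], [5., 6.]])
Q, R = np.linalg.qr(X)   # "reduced" QR: Q is 3x2, R is 2x2 upper triangular

assert np.allclose(Q @ R, X)             # X = Q R
assert np.allclose(Q.T @ Q, np.eye(2))   # Q has orthonormal columns
assert np.allclose(R, np.triu(R))        # R is upper triangular
```

Library QR routines may return an $R$ whose diagonal entries are negative; the factorization is the same up to these sign choices.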

The above algorithm is the standard Gram-Schmidt process. There are alternative formulations of the algorithm which have better numerical properties. The best known of these is the Modified Gram-Schmidt (MGS) algorithm.

ALGORITHM: Modified Gram-Schmidt

1. Define $r_{11} := \|x_1\|_2$. If $r_{11} = 0$ Stop, else $q_1 := x_1 / r_{11}$.
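The modified algorithm can be sketched in the same style (again an illustrative implementation, not the book's code). The essential change from the classical version is that each inner product is taken against the partially orthogonalized vector rather than the original column, which improves numerical robustness:

```python
import numpy as np

def modified_gram_schmidt(X):
    """Modified Gram-Schmidt on the columns of X; returns Q, R with X = Q R."""
    n, r = X.shape
    Q = np.zeros((n, r))
    R = np.zeros((r, r))
    for j in range(r):
        q = X[:, j].copy()
        for i in range(j):
            # Key difference from classical GS: the inner product uses the
            # current, partially orthogonalized q, not the original x_j.
            R[i, j] = Q[:, i] @ q
            q = q - R[i, j] * Q[:, i]
        R[j, j] = np.linalg.norm(q)
        if R[j, j] == 0:
            raise ValueError("columns are linearly dependent")
        Q[:, j] = q / R[j, j]
    return Q, R

# Nearly dependent columns, where classical GS loses orthogonality faster.
X = np.array([[1., 1., 1.],
              [1e-8, 0., 0.],
              [0., 1e-8, 0.],
              [0., 0., 1e-8]])
Q, R = modified_gram_schmidt(X)
assert np.allclose(Q @ R, X)   # the factorization itself remains exact
```

In exact arithmetic the two variants produce identical $Q$ and $R$; the difference shows only in floating-point arithmetic, where MGS keeps the computed $q_j$ closer to orthogonal.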