
there exist scalars c_1, c_2, c_3, not all zero, such that

(5)    c_1v_1 + c_2v_2 + c_3v_3 = 0.

Conversely, we say:

v_1, v_2, v_3 are linearly independent if and only if

(6)    c_1v_1 + c_2v_2 + c_3v_3 = 0  ⟹  c_1 = c_2 = c_3 = 0.

Figure 11.4   L[v_1, v_2, v_3] is a plane if v_3 is a linear combination of v_1 and v_2.

It is straightforward now to generalize the concepts of linear dependence and linear independence to arbitrary finite collections of vectors in R^n by extending definitions (5) and (6) in the natural way.

Definition   Vectors v_1, v_2, …, v_k in R^n are linearly dependent if and only if there exist scalars c_1, c_2, …, c_k, not all zero, such that

c_1v_1 + c_2v_2 + ⋯ + c_kv_k = 0.

Vectors v_1, v_2, …, v_k in R^n are linearly independent if and only if c_1v_1 + ⋯ + c_kv_k = 0 for scalars c_1, …, c_k implies that c_1 = ⋯ = c_k = 0.

[11.1] LINEAR INDEPENDENCE

Example 11.1   The vectors

e_1 = (1, 0, 0, …, 0),   e_2 = (0, 1, 0, …, 0),   …,   e_n = (0, 0, …, 0, 1)

in R^n are linearly independent, because if c_1, …, c_n are scalars such that

c_1e_1 + c_2e_2 + ⋯ + c_ne_n = 0,

then (c_1, c_2, …, c_n) = (0, 0, …, 0). The last vector equation implies that c_1 = c_2 = ⋯ = c_n = 0.

Example 11.2   The vectors w_1, w_2, and w_3 are linearly dependent in R^3, since

w_1 - 2w_2 + w_3 = 0,

as can easily be verified.

Checking Linear Independence

How would one decide whether or not w_1, w_2, and w_3 in Example 11.2 are linearly independent, starting from scratch? To use definition (5), start with the equation

(7)    c_1w_1 + c_2w_2 + c_3w_3 = 0

and solve this system for all possible values of c_1, c_2, and c_3. Multiplying system (7) out yields


a system (8) of three linear equations in the variables c_1, c_2, and c_3. The matrix formulation of system (8) is

(9)    (w_1  w_2  w_3) c = 0,   where c = (c_1, c_2, c_3)^T.

Note that the coefficient matrix in (9) is simply the matrix whose columns are the original three vectors w_1, w_2, and w_3. So, the question of the linear independence of w_1, w_2, and w_3 reduces to a consideration of the coefficient matrix whose columns are w_1, w_2, and w_3. In that case, we reduce the coefficient matrix to its row echelon form and conclude that, because its row echelon form has a row of zeros, the coefficient matrix in (9) is singular and therefore that system (9) has a nonzero solution (in fact, infinitely many). One such solution is easily seen to be

c_1 = 1,   c_2 = -2,   and   c_3 = 1,

the coefficients we used in Example 11.2. We conclude that w_1, w_2, and w_3 are linearly dependent.
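The row-reduction test above is easy to carry out numerically. The sketch below uses hypothetical stand-in vectors (the original entries of w_1, w_2, w_3 are not preserved here), chosen so that they satisfy the same dependence w_1 - 2w_2 + w_3 = 0 found in the text; a rank below 3 plays the role of the row of zeros in the row echelon form.

```python
import numpy as np

# Hypothetical stand-in vectors (assumed, not from the text), chosen so
# that w1 - 2*w2 + w3 = 0, the dependence relation found above.
w1 = np.array([1.0, 4.0, 7.0])
w2 = np.array([2.0, 5.0, 8.0])
w3 = np.array([3.0, 6.0, 9.0])

# Coefficient matrix of system (9): its columns are w1, w2, w3.
A = np.column_stack([w1, w2, w3])

# A 3x3 matrix is singular exactly when its rank is below 3, i.e. when
# its row echelon form has a row of zeros.
print(np.linalg.matrix_rank(A))   # 2, so A is singular

# Check the particular nonzero solution c = (1, -2, 1) of A c = 0.
c = np.array([1.0, -2.0, 1.0])
print(np.allclose(A @ c, 0))      # True
```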

The analysis with w_1, w_2, and w_3 in the previous example can easily be generalized to prove the following theorem, by substituting general v_1, …, v_k in steps (7) to (9) for w_1, w_2, and w_3 in Example 11.2.

Theorem 11.1   Vectors v_1, …, v_k in R^n are linearly dependent if and only if the linear system

A c = 0

has a nonzero solution (c_1, …, c_k), where A is the n × k matrix whose columns are the vectors v_1, …, v_k under study:

A = (v_1  v_2  ⋯  v_k).
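Theorem 11.1 translates directly into a computable test: stack the vectors as the columns of A and ask whether A c = 0 has a nonzero solution, which for an n × k matrix happens exactly when rank(A) < k. A minimal sketch (the function name and sample vectors are illustrative, not from the text):

```python
import numpy as np

def linearly_dependent(vectors):
    """Theorem 11.1 as a test: v1, ..., vk are linearly dependent iff
    A c = 0 has a nonzero solution, i.e. iff rank(A) < k for the
    n-by-k matrix A whose columns are the vi."""
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) < A.shape[1]

# Two vectors on the same line through the origin are dependent:
print(linearly_dependent([np.array([1.0, 2.0]), np.array([2.0, 4.0])]))  # True
# The standard basis vectors of R^2 are independent:
print(linearly_dependent([np.array([1.0, 0.0]), np.array([0.0, 1.0])]))  # False
```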

The following is a restatement of Theorem 11.1 for the case k = n, using the fact that a square matrix is nonsingular if and only if its determinant is not zero.


Theorem 11.2   A set of n vectors v_1, …, v_n in R^n is linearly independent if and only if

det(v_1  v_2  ⋯  v_n) ≠ 0.

For example, the matrix whose columns are the vectors e_1, …, e_n in R^n in Example 11.1 is the identity matrix, whose determinant is one. We conclude from Theorem 11.2 that e_1, …, e_n form a linearly independent set of n-vectors.
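The determinant criterion of Theorem 11.2 is equally easy to check numerically. The dependent triple below is a hypothetical illustration (not one of the book's examples), built so that its third column is the sum of the first two:

```python
import numpy as np

# Theorem 11.2: n vectors in R^n are independent iff det(v1 ... vn) != 0.

# The columns of the identity matrix are e1, ..., en; det = 1 != 0,
# so the standard basis is independent.
print(np.linalg.det(np.eye(3)))          # 1.0

# A hypothetical dependent triple: the third column equals the sum of
# the first two, so the determinant is zero (up to rounding error).
A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
print(abs(np.linalg.det(A)) < 1e-9)      # True
```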

We can use Theorem 11.1 to derive a basic result about linear independence.

It generalizes the fact that any two vectors on a line are linearly dependent and

any three vectors in a plane are linearly dependent.

Theorem 11.3   If k > n, any set of k vectors in R^n is linearly dependent.

Proof   Let v_1, …, v_k be k vectors in R^n with k > n. By Theorem 11.1, the v_i's are linearly dependent if and only if the system
