matrix is always less than or equal to the number of columns, k. Therefore, if k

vectors span R^n, then n ≤ k. ■

EXERCISES

11.9 a) Write (2, 2) as a linear combination of (1, 2) and (1, 4).

b) Write (1, 2, 3) as a linear combination of (1, 1, 0), (1, 0, 1), and (0, 1, 1).
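A candidate answer to an exercise like part (a) can be checked numerically: writing a vector as a linear combination of spanning vectors is just solving a small linear system whose columns are those vectors. The sketch below assumes NumPy is available; it is an illustration, not the text's own method.

```python
import numpy as np

# Columns are the spanning vectors (1, 2) and (1, 4); we want
# coefficients a, b with a*(1, 2) + b*(1, 4) = (2, 2).
A = np.array([[1.0, 1.0],
              [2.0, 4.0]])
target = np.array([2.0, 2.0])

coeffs = np.linalg.solve(A, target)  # solves A @ coeffs = target

# Verify the linear combination reproduces the target vector.
assert np.allclose(A @ coeffs, target)
```

The same pattern works for part (b) with a 3 × 3 matrix whose columns are (1, 1, 0), (1, 0, 1), and (0, 1, 1).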


11.3 BASIS AND DIMENSION IN R^n

If we have a spanning set of vectors, we can always throw in 0 or any linear

combination of the vectors in the spanning set to create a larger spanning set.

But what we would really like to do is to go the other way and find an efficient

spanning set.

Example 11.7 Let W be the set of all linear combinations of v1 = (1, 1, 1),

v2 = (1, -1, -1), and v3 = (2, 0, 0) in R^3: W = L[v1, v2, v3]. Note that

v3 = v1 + v2. Thus, any vector which is a linear combination of v1, v2, and v3

can be written as a linear combination of just v1 and v2, because if w ∈ W, then

there are scalars a, b, and c such that

    w = a v1 + b v2 + c v3

      = a v1 + b v2 + c(v1 + v2)

      = (a + c)v1 + (b + c)v2.

The set {v1, v2} is a more "efficient" spanning set than is the set {v1, v2, v3}.
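The redundancy in Example 11.7 can be confirmed numerically. This sketch (assuming NumPy is available) checks that v3 = v1 + v2 and that dropping v3 leaves the span, as measured by matrix rank, unchanged:

```python
import numpy as np

v1 = np.array([1.0, 1.0, 1.0])
v2 = np.array([1.0, -1.0, -1.0])
v3 = np.array([2.0, 0.0, 0.0])

# v3 is itself a linear combination of v1 and v2 ...
assert np.allclose(v1 + v2, v3)

# ... so dropping it does not shrink the span: the matrix with all
# three vectors as columns has the same rank as the one with just two.
assert np.linalg.matrix_rank(np.column_stack([v1, v2, v3])) == 2
assert np.linalg.matrix_rank(np.column_stack([v1, v2])) == 2
```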

For the sake of efficiency, if v1, ..., vk span V, we would like to find the smallest

possible subset of v1, ..., vk that spans V. However, this is precisely the role of the

concept of linear independence that we considered in Section 11.1. If v1, ..., vk

are linearly independent, no one of these vectors is a linear combination of the

others, and therefore no proper subset of v1, ..., vk spans L[v1, ..., vk]. The set

v1, ..., vk spans L[v1, ..., vk] most efficiently. In this case, we call v1, ..., vk a

basis of L[v1, ..., vk]. Since L[v1, ..., vk] can be spanned by different sets of

vectors, as illustrated in Example 11.6, we define a basis more generally as any

set of linearly independent vectors that spans L[v1, ..., vk].

Definition Let v1, ..., vk be a fixed set of k vectors in R^n. Let V be the set

L[v1, ..., vk] spanned by v1, ..., vk. Then, if v1, ..., vk are linearly independent,

v1, ..., vk is called a basis of V. More generally, let w1, ..., wm be a collection of

vectors in V. Then, w1, ..., wm forms a basis of V if:

(a) w1, ..., wm span V, and

(b) w1, ..., wm are linearly independent.


Example 11.8 We conclude from Examples 11.1 and 11.5 that the unit vectors

    e1 = (1, 0, ..., 0), ..., en = (0, 0, ..., 1)

form a basis of R^n. Since this is such a natural basis, it is called the canonical

basis of R^n.

Example 11.9 Example 11.6 presents five collections of vectors that span R^2.

By Theorem 11.3, collections c and e are not linearly independent since each

contains more than two vectors. However, the collections in a, b, and d are

linearly independent (exercise), and therefore, each forms a basis of R^2.

Notice that each basis of R^2 singled out in Example 11.9 is composed of two

vectors. This is natural since R^2 is a plane and two linearly independent vectors

span a plane. The following theorem generalizes this result to R^n.

Theorem 11.7 Every basis of R^n contains n vectors.

Proof By Theorem 11.3, a basis of R^n cannot contain more than n elements;

otherwise, the set under consideration would not be linearly independent. By

Theorem 11.6, a basis of R^n cannot contain fewer than n elements; otherwise,

the set under consideration would not span R^n. It follows that a basis of R^n

must have exactly n elements. ■

We can combine Theorems 11.1, 11.2, and 11.5 and the fact that a square matrix

is nonsingular if and only if its determinant is nonzero to achieve the following

equivalence of the notions of linear independence, spanning, and basis for n vectors

in R^n.

Theorem 11.8 Let v1, ..., vn be a collection of n vectors in R^n. Form the

n × n matrix A whose columns are these vi's: A = (v1 v2 ... vn).

Then, the following statements are equivalent:

(a) v1, ..., vn are linearly independent;

(b) v1, ..., vn span R^n;

(c) v1, ..., vn form a basis of R^n; and

(d) the determinant of A is nonzero.
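Condition (d) gives a mechanical test for a basis: stack the n vectors as columns and compute one determinant. The sketch below assumes NumPy; the helper name and the 1e-12 tolerance are our own choices for illustration, not from the text.

```python
import numpy as np

def is_basis_of_Rn(vectors):
    """Condition (d) of Theorem 11.8: n vectors in R^n form a basis
    exactly when the matrix with those vectors as columns has nonzero
    determinant (tested up to a floating-point tolerance)."""
    A = np.column_stack(vectors)
    n, k = A.shape
    return n == k and abs(np.linalg.det(A)) > 1e-12

# The canonical basis of R^3 passes; the linearly dependent set
# from Example 11.7 (v3 = v1 + v2) fails.
assert is_basis_of_Rn([(1, 0, 0), (0, 1, 0), (0, 0, 1)])
assert not is_basis_of_Rn([(1, 1, 1), (1, -1, -1), (2, 0, 0)])
```

By the theorem, the same test simultaneously settles linear independence, spanning, and basis for a set of exactly n vectors in R^n.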

The fact that every basis of R^n contains exactly n vectors tells us that there

are n independent directions in R^n. We express this when we say that R^n is

n-dimensional. We can use the idea of basis to extend the concept of dimension to

other subsets of R^n. In particular, let V be the set L[v1, ..., vk] generated by the

set of vectors v1, ..., vk. If v1, ..., vk are linearly independent, they form a basis

of V. In Chapter 27, we prove that every basis of V has exactly k vectors (the

analogue of Theorem 11.7 for proper subsets of R^n). This number k of vectors in

every basis of V is called the dimension of V.
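This notion of dimension can be computed as a matrix rank: the dimension of V = L[v1, ..., vk] equals the rank of the matrix whose columns are the generating vectors. Returning to the vectors of Example 11.7 (a sketch assuming NumPy; `matrix_rank` uses a floating-point tolerance internally):

```python
import numpy as np

v1 = np.array([1.0, 1.0, 1.0])
v2 = np.array([1.0, -1.0, -1.0])
v3 = np.array([2.0, 0.0, 0.0])  # = v1 + v2, so it adds no new direction

# The rank counts the independent directions among the generators,
# which is the number of vectors in any basis of V.
dim_V = int(np.linalg.matrix_rank(np.column_stack([v1, v2, v3])))
assert dim_V == 2  # V is a plane inside R^3
```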

EXERCISES

11.12 Which of the following are bases of R^n?

INDEPENDENCE [l 11

250 LlNEAR

It concludes with a complete characterization of the size, that is, dimension, of the

set of solutions to a system of linear equations Ax = b.

Chapter 28: Applications of Linear Independence

This chapter presents applications of the material in Chapters 11 and 27 to portfolio

analysis, activity analysis, and voting paradoxes.

PART III

Calculus

of Several

Variables

C H A P T E R 1 2

Limits and

Open Sets

A central concern in economic theory is the effect of a small change in one

economic variable x on some other economic variable y. How will a change in x

to a nearby x′ affect y? Before we can make this effect precise, we need to have

a working knowledge of the concepts of small change and nearby. What does it

mean to say that one commodity bundle or input bundle is close to another? What

does a small change in prices mean? How can we quantify trends in prices or

consumption?

This chapter focuses on these questions by studying in some detail the notions

of sequence, limit, neighborhood, open set and closed set. From a mathematical

point of view, the limit of a sequence is the concept that separates high school