
Figure 10.33  The graphs of the IS- and LM-lines.

236  EUCLIDEAN SPACES  [10]

The parameters c₁, t₁, and a are naturally between 0 and 1. It is usually assumed that 0 < c₁(1 − t₁) + a < 1, so that the coefficient 1 − c₁(1 − t₁) − a of Y in the IS-equation is positive, the normal vector to the IS-line points northeast, and the IS-line has negative slope.

The normal vector to the LM-line is (m, −h), which points southeast, and so the LM-line has a positive slope m/h.

Using these diagrams, one can study geometrically the effects of changes in parameters or in exogenous variables, just as we did analytically in the exercises in Section 9.3. For example, if G or I⁰ increases or if t₀ decreases, then the right-hand side of the IS-equation increases and the IS-line shifts outward as in Figure 10.34. The result is an increase in the equilibrium Y and r, just as we found in Exercise 9.15. Note that this result would hold even if the slope of the IS-line were positive, as long as it were less than the slope of the LM-line.
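This comparative-statics conclusion can be checked numerically. The sketch below solves a linear IS-LM system for hypothetical parameter values (the specific numbers, the autonomous-spending term, and the interest-sensitivity coefficient b are illustrative assumptions, not the text's exact specification) and confirms that raising G raises both equilibrium Y and r.

```python
import numpy as np

# Hypothetical IS-LM coefficients, all chosen for illustration only:
c1, t1, a = 0.8, 0.25, 0.1   # MPC, marginal tax rate, marginal propensity to invest
b = 0.5                      # interest sensitivity of spending (assumed)
m, h = 0.3, 0.4              # money-demand parameters
Ms = 2.0                     # money supply

def equilibrium(G):
    # IS:  (1 - c1*(1 - t1) - a) Y + b r = autonomous spending + G
    # LM:  m Y - h r = Ms
    A = np.array([[1 - c1 * (1 - t1) - a, b],
                  [m, -h]])
    rhs = np.array([1.0 + G, Ms])   # 1.0 = other autonomous spending (assumed)
    return np.linalg.solve(A, rhs)  # returns (Y, r)

Y0, r0 = equilibrium(G=1.0)
Y1, r1 = equilibrium(G=1.5)   # fiscal expansion: G increases
print(Y1 > Y0 and r1 > r0)    # prints True: both equilibrium Y and r rise
```

The same computation with a positive (but small enough) Y-coefficient sign pattern reproduces the remark above about a positively sloped IS-line.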

EXERCISES

10.42  Use the diagram in Figure 10.33 to find the effect on Y and r of an increase in each of the variables I⁰, Mₛ, m, h, a₀, a, c₀, and t₁.

CHAPTER 11

Linear Independence

Many economic problems deal with number or size. How many equilibria does a model of an economy or a game have? How large is the production possibility set? Since these sets are often described as solutions of a system of equations, questions of size often reduce to questions about the size of the set of solutions to a particular system of equations. If there are finitely many solutions, the exact number of solutions gives a satisfactory answer. But if there are infinitely many solutions, the size of the solution set is best captured by its dimension. We have a good intuition about the difference between a one-dimensional line and a two-dimensional plane. In this chapter, we will give a precise definition of "dimension" for linear spaces. The key underlying concept is that of linear independence.

The most direct relevant mathematical question is the size, that is, the dimension, of the set of solutions of a system of linear equations Ax = b. Chapter 27 presents a sharp answer to this question via the Fundamental Theorem of Linear Algebra: the dimension of the solution set of Ax = b is the number of variables minus the rank of A. Chapter 27 also investigates the size of the set of right-hand sides b for which a given system Ax = b has a solution; and we present an in-depth description of the dimension of an abstract vector space. Chapter 28 presents applications of these concepts to portfolio analysis, voting paradoxes, and activity analysis. Those who have the time are encouraged to read Chapters 27 and 28 between Chapters 11 and 12.
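The dimension count just quoted is easy to illustrate numerically (the matrix below is made up for the example; any consistent system works the same way):

```python
import numpy as np

# Dimension of the solution set of Ax = b equals
# (number of variables) - rank(A), for a consistent system.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])   # second row is twice the first
n_variables = A.shape[1]          # 3
rank = np.linalg.matrix_rank(A)   # 1
print(n_variables - rank)         # prints 2: a two-dimensional solution set
```

Here the two equations are redundant, so the solutions of Ax = 0 form a plane (dimension 2) in R³.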

Linear independence is defined and characterized in Section 11.1. The complementary notion of span is the focus of Section 11.2. The concept of a basis for Euclidean space is introduced in Section 11.3.

11.1 LINEAR INDEPENDENCE

In Section 10.5, we noted that the set of all scalar multiples of a nonzero vector v is a straight line through the origin. In this chapter, we denote this set by L[v] and call it the line generated or spanned by v. See Figure 11.1. For example, if v = (1, 0, …, 0), then L[v] is the x₁-axis in Rⁿ. If v = (1, 1) in R², then L[v] is the diagonal line pictured in Figure 11.1.



Figure 11.1  The line L[v] spanned by vector v.

Definition

If we start with two nonzero vectors v₁ and v₂ (considered as vectors with their tails at the origin), we can take all possible linear combinations of v₁ and v₂ to obtain the set spanned by v₁ and v₂:

    L[v₁, v₂] = {c₁v₁ + c₂v₂ : c₁ and c₂ scalars}.

If v₁ is a multiple of v₂, then L[v₁, v₂] = L[v₂] is simply the line spanned by v₂, as in Figure 11.2. However, if v₁ is not a multiple of v₂, then together they generate a two-dimensional plane L[v₁, v₂], which contains the lines L[v₁] and L[v₂], as in Figure 11.3.
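Whether a given vector lies in the plane L[v₁, v₂] can also be tested mechanically: w belongs to the span exactly when appending w to the matrix with columns v₁ and v₂ does not raise its rank. A small sketch with illustrative vectors, assuming NumPy:

```python
import numpy as np

def in_span(w, v1, v2):
    # w is in L[v1, v2] iff rank([v1 v2 w]) == rank([v1 v2]).
    base = np.column_stack([v1, v2])
    extended = np.column_stack([v1, v2, w])
    return np.linalg.matrix_rank(extended) == np.linalg.matrix_rank(base)

v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
print(in_span(np.array([2.0, 3.0, 0.0]), v1, v2))   # prints True: in the plane
print(in_span(np.array([0.0, 0.0, 1.0]), v1, v2))   # prints False: off the plane
```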

If v₁ is a multiple of v₂, or vice versa, we say that v₁ and v₂ are linearly dependent. Otherwise, we say that v₁ and v₂ are linearly independent. We now develop a precise way of expressing these two concepts. If v₁ is a multiple of v₂,

Figure 11.2


Figure 11.3

we write

    v₁ = t₂v₂  or  v₁ − t₂v₂ = 0    (1)

for some scalar t₂. If v₂ is a multiple of v₁, we write

    v₂ = t₁v₁  or  t₁v₁ − v₂ = 0    (2)

for some scalar t₁. We can combine statements (1) and (2) by defining v₁ and v₂ to be linearly dependent if there exist scalars c₁ and c₂, not both zero, such that

    c₁v₁ + c₂v₂ = 0,  with c₁ or c₂ nonzero.    (3)

In Exercise 11.1 below, you are asked to show that (3) is a definition equivalent to (1) and (2).

From this point of view, we say that v₁ and v₂ are linearly independent if there are no scalars c₁ and c₂, at least one nonzero, such that (3) holds. A working version of this definition is the following: vectors v₁ and v₂ are linearly independent if

    c₁v₁ + c₂v₂ = 0  ⇒  c₁ = c₂ = 0.    (4)
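The working definition (4) translates directly into a computation: c₁v₁ + c₂v₂ = 0 has only the zero solution exactly when the matrix with columns v₁ and v₂ has rank 2. A minimal sketch, assuming NumPy:

```python
import numpy as np

def linearly_independent(v1, v2):
    # By (4): v1, v2 are independent iff the only solution of
    # c1*v1 + c2*v2 = 0 is c1 = c2 = 0, i.e. [v1 v2] has rank 2.
    return np.linalg.matrix_rank(np.column_stack([v1, v2])) == 2

print(linearly_independent([1, 0], [1, 1]))   # prints True: not multiples
print(linearly_independent([1, 2], [2, 4]))   # prints False: v2 = 2*v1
```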

This process extends to larger collections of vectors. The set of all linear combinations of three vectors v₁, v₂, and v₃,

    L[v₁, v₂, v₃] = {c₁v₁ + c₂v₂ + c₃v₃ : c₁, c₂, c₃ scalars},


yields a three-dimensional space, provided that no one of v₁, v₂, and v₃ is a linear combination of the other two. If, say, v₃ is a linear combination of v₁ and v₂, that is, v₃ = t₁v₁ + t₂v₂, while v₁ and v₂ are linearly independent, then L[v₁, v₂] is a plane and v₃ lies on this plane; so all combinations of v₁, v₂, and v₃, L[v₁, v₂, v₃], yield just the plane L[v₁, v₂], as pictured in Figure 11.4. As before, we say that v₁, v₂, and v₃ are linearly dependent if one of them can be written as a linear combination of the other two. The working version of this definition is that some nonzero combination of v₁, v₂, and v₃ yields the 0-vector:
