representation (13.62)²⁸ we find that the SSA patterns add to the same numbers:

\[
\sum_i \left(e^i_k\right)^2 = \text{constant}, \tag{13.63}
\]

for all lags k.

[Figure 13.10: The first four time EOFs of an AR(1) process with a = 0.8 obtained using window length m = 6. The patterns are normalized with the square root of the eigenvalue.]

13.6.4 Paired Eigenvectors and Oscillatory Components. We now consider, briefly, time series that contain an oscillatory component. For simplicity, we suppose that X_t is a pure cosine so that

\[
\mathbf{Y}_t = (X_t, \ldots, X_{t+m-1})^{\mathrm{T}}
            = \left(\cos\tfrac{2\pi t}{m}, \ldots, \cos\tfrac{2\pi (t+m-1)}{m}\right)^{\mathrm{T}}. \tag{13.64}
\]
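The delay-coordinate construction in (13.64) is easy to sketch numerically. The following is a minimal illustration, assuming NumPy; the `embed` helper and variable names are ours, not the book's notation:

```python
import numpy as np

def embed(x, m):
    """Map a series x_0..x_{n-1} to delay coordinates:
    row t is Y_t = (x_t, ..., x_{t+m-1}), as in (13.64)."""
    n = len(x) - m + 1
    return np.array([x[t:t + m] for t in range(n)])

t = np.arange(12)
x = np.cos(2 * np.pi * t / 6)   # pure cosine with period m = 6
Y = embed(x, 6)                 # each row is one delay-coordinate vector Y_t
```

Because the cosine has period m, the embedded vectors repeat after m time steps (Y_6 equals Y_0 here), which is what makes the finite set of time EOFs able to represent the whole trajectory.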

By equation (13.60), the time EOFs must be able to represent this structure. Suppose one of the time EOFs contains the cosine pattern, that is,

\[
\mathbf{e}^i = \left(1, \cos\tfrac{2\pi}{m}, \ldots, \cos\tfrac{2\pi(m-1)}{m}\right)^{\mathrm{T}}.
\]

Then Y_0 = e^i. However, one time step later, we have

\[
\begin{aligned}
\mathbf{Y}_1 &= \left(\cos\tfrac{2\pi}{m}, \ldots, \cos\tfrac{2\pi(m-1)}{m}, 1\right)^{\mathrm{T}} \\
&= \cos\tfrac{2\pi}{m}\left(1, \cos\tfrac{2\pi}{m}, \ldots, \cos\tfrac{2\pi(m-1)}{m}\right)^{\mathrm{T}}
 - \sin\tfrac{2\pi}{m}\left(0, \sin\tfrac{2\pi}{m}, \ldots, \sin\tfrac{2\pi(m-1)}{m}\right)^{\mathrm{T}} \\
&= \cos\tfrac{2\pi}{m}\,\mathbf{e}^i - \sin\tfrac{2\pi}{m}\,\mathbf{e}^j,
\end{aligned}
\]

where e^j = (0, sin(2π/m), ..., sin(2π(m−1)/m))^T is another eigenvector of Y_t. At time t,

\[
\mathbf{Y}_t = \cos\tfrac{2\pi t}{m}\,\mathbf{e}^i - \sin\tfrac{2\pi t}{m}\,\mathbf{e}^j
             = \alpha_i(t)\,\mathbf{e}^i + \alpha_j(t)\,\mathbf{e}^j,
\]

where α_i(t) = cos(2πt/m) and α_j(t) = −sin(2πt/m). Note that both coefficients have the same 'variance' (i.e., λ_i = λ_j), and that the coefficients are 90° out-of-phase.

While the example is artificial, the properties of the eigenvectors and coefficients above characterize what happens when X_t contains an oscillatory signal. That is, we expect to find a pair of degenerate EOFs with coefficients that vary coherently and are 90° out-of-phase with each other.²⁹ The pair of patterns and their coefficients may be written as one complex pattern and one complex coefficient.

13.6.5 SSA of White Noise. A white noise process {X_t} (see [10.2.3]) consists of a sequence of independent, identically distributed random variables. It has auto-covariance function γ_xx(τ) such that γ_xx(0) = Var(X_t) and γ_xx(τ) = 0 for nonzero τ. Thus

\[
\Sigma_{YY} = \mathrm{Var}(X_t)\,\mathcal{I},
\]

where Y_t is the delay-coordinate space version of X_t (13.58) and I is the m × m identity matrix. Hence Y_t has m eigenvalues λ_i = Var(X_t) and m degenerate eigenvectors. One possible set of eigenvectors are the unit vectors e^i = (0, ..., 1, ..., 0)^T with the 1 in the ith column.

13.6.6 SSA of Red Noise. Red noise processes³⁰ have exponentially decaying auto-covariance functions γ_xx(τ) = σ_X² a^|τ|, where σ_X² = Var(X_t). Thus

\[
\Sigma_{YY} = \sigma_X^2
\begin{pmatrix}
1 & a & \cdots & a^{m-1} \\
a & 1 & \cdots & a^{m-2} \\
\vdots & \vdots & \ddots & \vdots \\
a^{m-1} & a^{m-2} & \cdots & 1
\end{pmatrix}.
\]

²⁸ Note that (13.62) is just a special case of (13.7).
²⁹ Compare with the discussion of complex POP coefficients in Chapter 15.
³⁰ AR(1), or 'red noise,' processes were introduced in [10.3.2]. They can be represented by a stochastic difference equation X_t = aX_{t−1} + Z_t, where Z_t is white noise. The auto-covariance function was derived in [11.1.6]. We represent the lag-1 correlation coefficient by 'a' instead of 'α' to avoid confusion with our notation for the EOF coefficients.
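The degenerate-pair behaviour of [13.6.4] can be checked numerically. The sketch below, assuming NumPy (all variable names are illustrative), builds the lag-covariance matrix of a pure cosine with period m = 6 and confirms that exactly one pair of equal eigenvalues carries all the variance, with the paired coefficients in quadrature:

```python
import numpy as np

m = 6
t = np.arange(125)                       # 125 - m + 1 = 120 rows: whole periods
x = np.cos(2 * np.pi * t / m)            # pure cosine with period m
Y = np.array([x[i:i + m] for i in range(len(x) - m + 1)])

C = Y.T @ Y / len(Y)                     # lag-covariance matrix Sigma_YY (mean is 0)
lam, E = np.linalg.eigh(C)               # eigenvalues in ascending order

# Exactly one degenerate pair (lam[-1] == lam[-2] == m/4) carries all the
# variance; the remaining eigenvalues vanish.
alpha = Y @ E[:, -2:]                    # coefficients of the paired EOFs

# The paired coefficients are 90 degrees out of phase, so
# alpha_i(t)^2 + alpha_j(t)^2 is the same at every t, like cos^2 + sin^2.
radius2 = (alpha ** 2).sum(axis=1)
```

Note that any orthonormal basis of the degenerate two-dimensional subspace is a valid EOF pair, so the computed patterns may be rotated relative to the cosine/sine pair of the text; the quadrature property is unaffected.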

13.6: Singular Systems Analysis 315
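The white-noise result of [13.6.5] (Σ_YY = Var(X_t) I with a completely flat eigenvalue spectrum) can also be illustrated with a simulated sample; this is a sketch assuming NumPy, with names of our choosing:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 6, 20000
x = rng.standard_normal(n)               # white noise with Var(X_t) = 1
Y = np.array([x[i:i + m] for i in range(n - m + 1)])

C = Y.T @ Y / len(Y)                     # sample estimate of Sigma_YY
lam = np.linalg.eigvalsh(C)

# Sigma_YY = Var(X_t) * I: the estimated matrix is close to the identity
# and the spectrum is flat, so the eigenvectors are degenerate (any
# orthonormal basis, e.g. the unit vectors, will do).
```

The small spread among the sample eigenvalues is pure estimation noise; it shrinks as the sample length n grows.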

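For the red-noise covariance matrix of [13.6.6] the eigenvalue spectrum can be computed directly. A minimal sketch, assuming NumPy, using a = 0.8 as in Figure 13.10 (variable names are ours):

```python
import numpy as np

m, a, var_x = 6, 0.8, 1.0                # a = 0.8 as in Figure 13.10
tau = np.abs(np.arange(m)[:, None] - np.arange(m)[None, :])
C = var_x * a ** tau                     # Sigma_YY with entries var_x * a^|k-l|

lam = np.linalg.eigvalsh(C)[::-1]        # eigenvalues, largest first

# The red-noise spectrum decays monotonically: distinct, positive
# eigenvalues and no degenerate pair, hence no oscillatory EOF pair.
```

In contrast to the pure-cosine case, no two eigenvalues coincide, which is one diagnostic for distinguishing a genuine oscillation from red background noise.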