11: Parameters of Univariate and Bivariate Time Series

about the technical aspects of this subject.

$$\mathrm{E}\big(X_t \bar{X}_{t+\tau}\big) = r(\tau)\, e^{i\phi(\tau)},$$

where the amplitude $r(\tau)$ and phase $\phi(\tau)$ are functions of the lag $\tau$. Thus the product $x_t \bar{x}_{t+\tau}$ of two realizations $\tau$ time steps apart, averaged over many times $t$, will equal $r(\tau)e^{i\phi(\tau)}$. This tells us that, on average, a real $x_t$ is followed $\tau$ time steps later by a complex $x_{t+\tau}$ centred on $r(\tau)\big(\cos(\phi(\tau))x_t - i\sin(\phi(\tau))x_t\big)$. That is, the persistent part of $X_t$ follows a damped rotation in the complex plane.
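This damped rotation is easy to reproduce in a short simulation. The sketch below (assuming NumPy; the coefficient $\alpha = 0.95\,e^{i2\pi/40}$, the seed, and the helper name `corr` are illustrative choices, not from the text) generates a complex AR(1) process and estimates $\mathrm{E}(X_t \bar{X}_{t+\tau})$ at lag 10: its modulus should have decayed like $|\alpha|^{10}$ while its phase has rotated by $-10\arg(\alpha)$, a quarter turn.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical complex AR(1) process X_{t+1} = alpha * X_t + Z_t;
# alpha combines damping (0.95) with a rotation of 2*pi/40 per step.
alpha = 0.95 * np.exp(2j * np.pi / 40)
n = 200_000
x = np.empty(n, dtype=complex)
x[0] = 0.0
for t in range(n - 1):
    z = complex(rng.standard_normal(), rng.standard_normal())  # complex white noise
    x[t + 1] = alpha * x[t] + z

def corr(x, lag):
    # Sample estimate of E(X_t * conj(X_{t+lag})), normalized by the variance.
    return np.mean(x[:n - lag] * np.conj(x[lag:])) / np.mean(np.abs(x) ** 2)

rho = corr(x, 10)
# For this process rho(10) = conj(alpha)**10, so the amplitude is
# r(10) = 0.95**10 ~ 0.60 and the phase is phi(10) = -10 * (2*pi/40) = -pi/2.
print(abs(rho), np.angle(rho))
```

Plotting the real part of `corr(x, lag)` against its imaginary part for increasing lags traces the inward spiral seen in Figure 11.1.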

This behaviour is often seen in climate data. An example is the estimated auto-correlation function of the bivariate MJO index (Figure 10.3) that is shown in Figure 11.1. Since $\mathrm{Re}(\hat\rho(\tau))$ is approximately zero at about lag-10 days, we estimate that this bivariate index will rotate 90° to the right in about 10 days on average. Similarly, it will rotate about 180° in 22 days, and 270° in 37 days. The estimated auto-correlation function is certainly contaminated by sampling variation after about day 20 (see Section 12.1).

Figure 11.1: The auto-correlation function of a complex index of the Madden-and-Julian Oscillation. The dots represent the estimated auto-correlation function. The continuous line displays the theoretical auto-correlation function of a fitted complex AR(1) process. The real part of the auto-correlation function is represented by the vertical axis, and the imaginary part by the horizontal axis. From von Storch and Baumhefner [388].

11.1.4 Properties of the Auto-correlation Function. We note that the auto-correlation function is symmetric about the origin,
$$\rho(\tau) = \rho(-\tau),$$
and that it does not take values outside the interval $[-1, 1]$ (if $X_t$ is real) or outside the unit circle (if $X_t$ is complex). That is,
$$|\rho(\tau)| \le 1.$$

11.1.5 The Auto-correlation Function of White Noise. Because the elements of white noise are independent, it immediately follows that the auto-correlation function is
$$\rho(\tau) = \begin{cases} 1 & \text{if } \tau = 0 \\ 0 & \text{otherwise.} \end{cases}$$

11.1.6 The Yule–Walker Equations for an AR(p) Process. If we multiply a zero mean AR(p) process $X_t$ (10.6) by $X_{t-\tau}$, for $\tau = 1, \ldots, p$,
$$X_t X_{t-\tau} = \sum_{i=1}^{p} \alpha_i X_{t-i} X_{t-\tau} + Z_t X_{t-\tau}, \qquad (11.1)$$
and take expectations, we obtain a system of equations
$$\Sigma_p \alpha_p = \gamma_p \qquad (11.2)$$
that are known as the Yule–Walker equations. The equation relates the auto-covariances
$$\gamma_p = \big(\gamma(1), \gamma(2), \ldots, \gamma(p)\big)^{\mathrm{T}}$$
at lags $\tau = 1, \ldots, p$ to the process parameters
$$\alpha_p = (\alpha_1, \alpha_2, \ldots, \alpha_p)^{\mathrm{T}}$$
and the auto-covariances $\gamma(\tau)$ at lags $\tau = 0, \ldots, p-1$ through the $p \times p$ matrix
$$\Sigma_p = \begin{pmatrix}
\gamma(0) & \gamma(1) & \cdots & \gamma(p-1) \\
\gamma(1) & \gamma(0) & \cdots & \gamma(p-2) \\
\vdots & \vdots & \ddots & \vdots \\
\gamma(p-1) & \gamma(p-2) & \cdots & \gamma(0)
\end{pmatrix}.$$

This system of equations has two applications. First, if $\gamma(0), \ldots, \gamma(p)$ are known (or have been estimated from a time series), the parameters of the AR(p) process can be determined (or estimated) by solving (11.2) for $\alpha_p$. Once the parameters have been estimated, both the auto-covariance function for lags $\tau > p$ [11.1.7] and the spectrum (Section 11.2) of the unknown process can be estimated by the corresponding characterizations of the fitted AR(p) process. Second, if $\alpha_p$ is known, then (11.2) can be recast as a linear equation with unknowns $\gamma(1), \ldots, \gamma(p)$, given the variance of the process $\gamma(0)$. Thus the Yule–Walker equations can be used


to derive the first $p+1$ elements $1, \rho(1), \ldots, \rho(p)$ of the auto-correlation function. The full auto-covariance or auto-correlation function can now be derived by recursively extending equations (11.2). This is done by evaluating equation (11.1) for $\tau \ge p$ and taking expectations to obtain
$$\gamma(\tau) = \sum_{k=1}^{p} \alpha_k \gamma(k-\tau)$$
and
$$\rho(\tau) = \sum_{k=1}^{p} \alpha_k \rho(k-\tau). \qquad (11.3)$$

11.1.7 Auto-covariance and Auto-correlation Functions of Some Low-order AR(p) Processes.

• p = 1: The Yule–Walker equation (11.2) for an AR(1) process is
$$\alpha_1 \gamma(0) = \gamma(1).$$
Hence $\rho(1) = \alpha_1$. Applying (11.3) recursively we see that
$$\rho(\tau) = \alpha_1^{|\tau|}. \qquad (11.4)$$

• p = 2: The Yule–Walker equations (11.2) for an AR(2) process are
$$\alpha_1 \gamma(0) + \alpha_2 \gamma(1) = \gamma(1)$$
$$\alpha_1 \gamma(1) + \alpha_2 \gamma(0) = \gamma(2).$$
Using the first equation, we see that
$$\rho(1) = \frac{\alpha_1}{1 - \alpha_2}. \qquad (11.5)$$
Recursion (11.3) can be used to extend the auto-correlation function to higher lags. For example, since $\rho(2) = \alpha_1 \rho(1) + \alpha_2$, the auto-correlation at lag-2 is
$$\rho(2) = \frac{\alpha_1^2 - \alpha_2^2 + \alpha_2}{1 - \alpha_2}.$$

• p = 3: Writing out the Yule–Walker equations (11.2) for an AR(3) process and using the first two equations, we obtain
$$\rho(1) = \frac{\alpha_1 + \alpha_2 \alpha_3}{1 - \alpha_2 - \alpha_1\alpha_3 - \alpha_3^2}$$
$$\rho(2) = \frac{(\alpha_1 + \alpha_3)\alpha_1 + (1 - \alpha_2)\alpha_2}{1 - \alpha_2 - \alpha_1\alpha_3 - \alpha_3^2}.$$
Recursion relationship (11.3) can again be used to extend $\rho(\tau)$ to longer lags.

• p ≥ 4: The calculations required at higher orders become increasingly laborious, but no more complex.

Note that the auto-covariance function can be obtained by using (10.9) to compute the variance $\mathrm{Var}(X_t)$ and then applying
$$\gamma(\tau) = \mathrm{Var}(X_t)\, \rho(\tau).$$

11.1.8 Examples. We will now discuss the auto-correlation functions of the processes that were used as examples in [10.3.2]. Recall that there are two AR(1) processes with $\alpha_1 = 0.3$ and $0.9$, and two AR(2) processes with $(\alpha_1, \alpha_2) = (0.9, -0.8)$ and $(0.3, 0.3)$. Sample realizations of these processes are shown in Figures 10.7 and 10.9.

The auto-correlation functions of the AR(1) processes (Figure 11.2a) decrease monotonically. The value of the auto-correlation function for the $\alpha_1 = 0.3$ process is less than 0.5 for all nonzero lags; thus the persistence forecast is able to forecast less than 25% of process variance at any lag. When $\alpha_1 = 0.9$, it takes five time steps for the skill to fall below 25%; this process is much more persistent than the $\alpha_1 = 0.3$ process. This is consistent with the analysis of the distributions of the run length $L$ in [10.3.3].

The auto-correlation functions of the AR(2) processes are shown in Figure 11.2b. The first two auto-correlations of the $(\alpha_1, \alpha_2) = (0.3, 0.3)$ process are $\rho(1) = \rho(2) = 0.43$, and those for the $(\alpha_1, \alpha_2) = (0.9, -0.8)$ process are $\rho(1) = 0.5$, $\rho(2) = -0.35$. In the $(0.3, 0.3)$ case