the auto-correlation function is always positive and has a pattern similar to that of an AR(1) process. The (0.9, −0.8) case reveals considerably more structure. The main feature is a damped 'periodicity' of about six time steps in length. This result is also consistent with the run length analysis in [10.3.3].

The Yule–Walker equations (11.2) for an AR(3) process are

α1 γ(0) + α2 γ(1) + α3 γ(2) = γ(1)
α1 γ(1) + α2 γ(0) + α3 γ(1) = γ(2)
α1 γ(2) + α2 γ(1) + α3 γ(0) = γ(3).
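Given autocovariance values, this linear system can be solved directly for the AR(3) coefficients. A minimal sketch in Python (assuming NumPy; the autocovariance values are illustrative, not taken from the text):

```python
import numpy as np

# Illustrative autocovariances gamma(0)..gamma(3); in practice these would be
# estimated from an observed time series.
gamma = np.array([1.0, 0.6, 0.3, 0.1])

# Yule-Walker system (11.2) for an AR(3) process: the coefficient matrix is
# Toeplitz in gamma(0)..gamma(2), because gamma(-k) = gamma(k).
G = np.array([[gamma[abs(i - j)] for j in range(3)] for i in range(3)])
rhs = gamma[1:4]  # gamma(1), gamma(2), gamma(3)

alpha = np.linalg.solve(G, rhs)  # alpha_1, alpha_2, alpha_3
```

Dividing both sides by γ(0) gives the same system in terms of the auto-correlations ρ(1), ρ(2), ρ(3).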

220   11: Parameters of Univariate and Bivariate Time Series
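The damped 'periodicity' of roughly six time steps noted above for the (0.9, −0.8) process can be checked numerically. For an AR(2) process the Yule–Walker equations give ρ(1) = α1/(1 − α2) and, for τ ≥ 2, the recursion ρ(τ) = α1 ρ(τ−1) + α2 ρ(τ−2). A short sketch (plain Python; nothing assumed beyond the recursion itself):

```python
# Auto-correlation function of the AR(2) process with (alpha_1, alpha_2) = (0.9, -0.8),
# computed from the Yule-Walker recursion rho(t) = a1*rho(t-1) + a2*rho(t-2).
a1, a2 = 0.9, -0.8
rho = [1.0, a1 / (1.0 - a2)]  # rho(0) = 1, rho(1) = alpha_1 / (1 - alpha_2) = 0.5
for t in range(2, 13):
    rho.append(a1 * rho[t - 1] + a2 * rho[t - 2])

# rho(3) is strongly negative while rho(6) is again large and positive,
# reflecting a damped oscillation with a period of about six time steps.
print([round(r, 2) for r in rho])
```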

Figure 11.2: Auto-correlation functions of auto-regressive processes.
a) Two AR(1) processes with α1 = 0.3 (hatched bars) and 0.9 (solid bars).
b) Two AR(2) processes with (α1, α2) = (0.3, 0.3) (hatched bars) and (0.9, −0.8) (solid bars).

11.1.9 The General Form of the Auto-correlation Function of an AR(p) Process. The auto-correlation function of a weakly stationary AR(p) process can be expressed as

ρ(τ) = Σ_{k=1}^{p} a_k y_k^(−|τ|)    (11.6)

for all τ, where y_k, k = 1, . . . , p, are the roots of the characteristic polynomial (10.11), φ(B) = 1 − Σ_{k=1}^{p} α_k B^k (see, e.g., [195, 60]). Since the characteristic polynomial can be factored as a product of linear and quadratic functions, the roots y_k are either real or come in complex conjugate pairs. The constants a_k can be derived from the process parameters α_p. When y_k is real, the corresponding coefficient a_k is also real, and when y_k and y_l are complex conjugates, the corresponding coefficients a_k and a_l are also complex conjugates.

Regardless of whether the roots are real or complex, the weak stationarity assumption ensures that |y_k| > 1 for all k (see [10.3.5]). Thus each real root contributes a component to the auto-correlation function (11.6) that decays exponentially. Similarly, each pair of complex conjugate roots contributes an exponentially damped oscillation.

We now consider some specific cases. First, suppose Xt is a weakly stationary AR(1) process. The characteristic polynomial is φ(B) = 1 − α1 B and the only root is y1 = (α1)^(−1). Note that |y1| > 1 since |α1| < 1. Thus the auto-correlation function (11.6) consists of a single term ρ(τ) = a1 (α1)^|τ| that decays exponentially. The constant a1 = 1.

Now suppose Xt is an AR(2) process. We saw in [10.3.6] there are two types of AR(2) processes; one has a pair of decaying modes, the other has a single damped oscillatory mode. The first occurs when α1² + 4α2 > 0, in which case (10.11) has real roots y1 and y2, and the auto-correlation function (11.6) is the sum of two terms that decay exponentially.

The (0.3, 0.3) process (see [10.3.5]) belongs to this class. The roots of its characteristic polynomial are y1 = 1.39 and y2 = −2.39. The y1-mode has a monotonically decaying auto-correlation function a1 (y1)^(−|τ|) = a1 (0.72)^|τ|. The y2-mode has auto-correlation function a2 (y2)^(−|τ|) = a2 (−0.42)^|τ|, which decays even more quickly but has alternating sign.

The constants a1 and a2 can be calculated from (11.5) and (11.6). Since

ρ(0) = 1 = a1 + a2
ρ(1) = α1/(1 − α2) = a1 y1^(−1) + a2 y2^(−1),

it follows that

a1 = (ρ(1) − y2^(−1)) / (y1^(−1) − y2^(−1)),    a2 = (y1^(−1) − ρ(1)) / (y1^(−1) − y2^(−1)).    (11.7)

In this example, a1 = 0.74 and a2 = 0.26.

When α1² + 4α2 < 0, equation (10.11) has a pair of complex conjugate roots, y1 = y2* = y. Consequently, for positive τ, equation (11.6) reduces to

ρ(τ) = a1 y^(−τ) + a2 (y*)^(−τ),

where a1 = a2* = a. If we write y = r e^(iφ), this may be rewritten as

ρ(τ) = (2 Re(a) cos(τφ) − 2 Im(a) sin(τφ)) / r^τ.    (11.8)

To determine the complex constant a we first evaluate (11.8) at τ = 0 and obtain Re(a) = 1/2. We then evaluate (11.8) at τ = 1 and obtain

ρ(1) = (cos(φ) − 2 Im(a) sin(φ)) / r


so that

Im(a) = (cos(φ) − r ρ(1)) / (2 sin(φ))    (11.9)

where ρ(1) is given by (11.5). Finally, we see that the auto-correlation function (11.8) may be rewritten as

ρ(τ) = (√(1 + 4 Im(a)²) / r^τ) cos(τφ + ψ)

with tan(ψ) = 2 Im(a). Note that r = 1.12 and φ ≈ π/3 in the (0.9, −0.8) example, so that a = 0.5 + i 0.032 and ψ ≈ −π/50.

In general, the auto-correlation function of an AR(p) process is the sum of decaying exponentials (one for every real root of the characteristic polynomial) and damped oscillations (one for every pair of complex conjugate roots of the characteristic polynomial). Thus, the general auto-correlation function has the form

ρ(τ) = Σ_i a_i / y_i^τ + Σ_k (a_k / r_k^τ) cos(τφ_k + ψ_k).    (11.10)

We use this property in [11.2.7] when we discuss the general form and interpretation of the spectrum of an AR(p) process.

11.1.10 Uniqueness of the AR(p) Approximation to an Arbitrary Stationary Process. The following theorem is useful when fitting an AR(p) process to an observed time series.

Let Xt be a stationary process with auto-correlation function ρ. For each p ≥ 0 there is a

11.1.11 The Partial Auto-correlation Function. When Xt is a normal process, α_{τ,τ} is called the partial auto-correlation coefficient between Xt and Xt−τ (see [60]). A useful property of the partial auto-correlation function is that α_{τ,τ} becomes zero for τ > p when Xt is an AR(p) process. Thus an estimate of α_{τ,τ} is often plotted as a diagnostic to help identify the order of an AR process.

11.1.12 What is the Partial Auto-correlation Coefficient? Details. In technical terms, the partial auto-correlation coefficient α_{p,p} is the correlation between Xt and Xt−p when Xt−1, . . . , Xt−p+1 are held fixed. When Xt is a stationary normal process,

α_{p,p} = Cov(Xt, Xt−p | Gt = gt) / (σ_f σ_b)

where

σ_f² = Var(Xt | Gt = gt)
σ_b² = Var(Xt−p | Gt = gt),

and where, for notational convenience, Gt is the (p − 1)-dimensional random vector Gt = (Xt−1, . . . , Xt−p+1)^T. The value of this correlation does not depend upon the specific realization xt−1, . . . , xt−p+1 of Xt−1, . . . , Xt−p+1.²

The easiest way to understand the partial correlation coefficient is by means of an example. Therefore suppose Xt is a zero mean normal AR(1) process with parameter α1. For an arbitrary time t, let Y1 = Xt+1, Y2 = Xt and Y3 = Xt−1. These random variables have variance-covariance matrix