Figure caption (beginning truncated): … length 240. Approximate critical values for testing the null hypothesis that ατ,τ is zero (see equation (12.5)) at the 5% significance level are shown as dashed lines. Top: Partial auto-correlation function estimated from a time series generated from an AR(1) process with α1 = 0.9. Bottom: Partial auto-correlation function estimated from a time series generated from an MA(10) process with β1 = ··· = β10 = 1. The theoretical function is shown with solid dots connected by broken lines.

[12.2.2] describes a couple of common methods. Topics that we do cover include the Yule–Walker method and the method of maximum likelihood. We assume, for now, that all processes are ergodic and weakly stationary. Beran [45], Box and Jenkins [60], Brockwell and Davis [68], and Tong [367], amongst others, describe techniques for identifying and fitting non-stationary and long memory stationary processes. Huang and North [189] and Polyak [318] are examples of authors who describe the analysis of cyclo-stationary processes in a climate research setting (cf. [10.2.6]).
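Since the Yule–Walker method mentioned above is one of the fitting approaches covered in this section, a minimal numerical sketch may help fix ideas. It assumes NumPy; the simulation length, burn-in, and helper names are illustrative and not from the text, while the AR(2) coefficients (α1, α2) = (0.9, −0.8) are the example values used later in this section.

```python
import numpy as np

# Simulate an AR(2) process X_t = a1*X_{t-1} + a2*X_{t-2} + Z_t with
# (a1, a2) = (0.9, -0.8), one of the example processes used in this section.
rng = np.random.default_rng(0)
a1, a2 = 0.9, -0.8
n, burn = 20000, 500
z = rng.standard_normal(n + burn)
x = np.zeros(n + burn)
for t in range(2, n + burn):
    x[t] = a1 * x[t - 1] + a2 * x[t - 2] + z[t]
x = x[burn:]  # discard spin-up so the sample is approximately stationary

def sample_acf(y, max_lag):
    """Sample auto-correlation function at lags 0..max_lag."""
    yc = y - y.mean()
    denom = yc @ yc
    return np.array([yc[: len(yc) - k] @ yc[k:] / denom
                     for k in range(max_lag + 1)])

# Yule-Walker equations for an AR(p) fit: solve R a = r, where R is the
# Toeplitz matrix of auto-correlations at lags |i - j| and r holds lags 1..p.
p = 2
rho = sample_acf(x, p)
R = np.array([[rho[abs(i - j)] for j in range(p)] for i in range(p)])
a_hat = np.linalg.solve(R, rho[1 : p + 1])
print(a_hat)  # close to (0.9, -0.8)
```

Maximum likelihood estimation, the other method mentioned, generally requires numerical optimization of the likelihood and is not sketched here.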

12.2 Identifying and Fitting Auto-regressive Models

12.2.0 Overview. We will describe two approaches that are frequently used to identify and fit AR models to time series.

The Box–Jenkins method [60] is subjective in nature. Diagnostic aids, such as plots of the estimated auto-correlation function (cf. Section 12.1) and partial auto-correlation function (cf. [11.1.11]), and a practised eye, are used to make a first guess at the order of AR model to fit. The fitted model is then used to estimate the noise time series that forced the observed process, and the goodness-of-fit is determined by examining the estimated noise process. This process may be repeated several times, although care must be taken not to overfit the time series by choosing models with too many free parameters. An advantage of this subjective approach is that the analyst is closely involved with the data and is therefore better able to judge the goodness-of-fit of the model and the influence that idiosyncrasies in the data have on the fit.

However, note that the non-stationary models and methods described in the literature are often most relevant in an econometric setting. For example, Box and Jenkins [60] describe a class of models called auto-regressive integrated moving average, or ARIMA, models. ARIMA processes Xt are non-stationary stochastic processes that become weakly stationary ARMA processes after a differencing operator of some order has been applied. That is, they are processes that have a backshift operator (cf. [10.5.5]) representation of the form

    φ(B)(1 − B)^d Xt = θ(B)Zt    (12.6)

where all the roots of φ(B) lie outside the unit circle. The operator (1 − B) represents the first differencing operation Xt − Xt−1. The simplest model of this form is the random walk (cf. equation (10.4) in [10.2.8]), which has φ(B) = θ(B) = 1 and d = 1. As with the random walk, all ARIMA processes integrate noise without forgetting any of its effects.

² For example, when fitting a univariate AR model at every grid point of a time series of analysed fields, as in Trenberth [369].
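The random-walk case (φ(B) = θ(B) = 1, d = 1) can be verified numerically: integrating white noise and then applying the first differencing operator (1 − B) returns exactly the driving noise. A small sketch, assuming NumPy:

```python
import numpy as np

# A random walk X_t = X_{t-1} + Z_t is the simplest ARIMA process:
# differencing it once, (1 - B)X_t = X_t - X_{t-1}, recovers Z_t.
rng = np.random.default_rng(1)
z = rng.standard_normal(5000)   # white noise Z_t
x = np.cumsum(z)                # random walk: integrates the noise
dx = np.diff(x)                 # first differencing operation
print(np.allclose(dx, z[1:]))   # True: the differences are the noise itself
```

Higher values of d correspond to applying np.diff repeatedly, which is how trends and other polynomial-like non-stationarities are removed in the ARIMA framework.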

12: Estimating Covariance Functions and Spectra    256

The ARIMA class of models (12.6) is attractive because it provides a method that can be used to deal with many types of non-stationary behaviour (random walks, trends, explosive growth, etc.) simply by repeatedly applying the first differencing operator. Processes that can be made stationary in this way are often seen in the economic world (e.g., the accumulation of money by a financial institution) but seldom seen in the physical world except on short time scales (e.g., the accumulation of precipitation over short periods of time).

[Figure 12.3: sample full and partial auto-correlation functions for the two AR(2) time series. Panels: "Full ACF (0.9, −0.8)", "Full ACF (0.3, 0.3)", "Partial ACF (0.9, −0.8)", and "Partial ACF (0.3, 0.3)", plotted against lags 0 to 20.]

12.2.1 Making a First Guess of the Order. We will illustrate the method used to make a first guess of the order of the process with simulated time series from known processes.

First we consider the examples presented in [12.1.3] and [12.1.7]. Estimates of the full and partial auto-correlation functions computed from two time series of length 240 are shown in Figure 12.3. The samples were taken from the AR(2) processes with (α1, α2) = (0.9, −0.8) and (0.3, 0.3) that were discussed extensively in Chapters 10 and 11. The estimated auto-correlation functions (upper panels) are similar to their theoretical
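The first-guess procedure can be mimicked in code: estimate the partial auto-correlation function of a simulated series of length 240 and flag the lags exceeding the approximate 5% critical values. The bound is taken here to be ±1.96/√n, an assumed form consistent with the dashed lines described in the figure caption (equation (12.5) itself is not reproduced in this excerpt); helper names are illustrative.

```python
import numpy as np

# Simulate n = 240 values of the AR(2) process with (a1, a2) = (0.9, -0.8).
rng = np.random.default_rng(2)
a1, a2 = 0.9, -0.8
n, burn = 240, 500
z = rng.standard_normal(n + burn)
x = np.zeros(n + burn)
for t in range(2, n + burn):
    x[t] = a1 * x[t - 1] + a2 * x[t - 2] + z[t]
x = x[burn:]

def sample_acf(y, max_lag):
    yc = y - y.mean()
    denom = yc @ yc
    return np.array([yc[: len(yc) - k] @ yc[k:] / denom
                     for k in range(max_lag + 1)])

def sample_pacf(y, max_lag):
    """PACF at lag k = last Yule-Walker coefficient of an AR(k) fit."""
    rho = sample_acf(y, max_lag)
    pacf = []
    for k in range(1, max_lag + 1):
        R = np.array([[rho[abs(i - j)] for j in range(k)] for i in range(k)])
        pacf.append(np.linalg.solve(R, rho[1 : k + 1])[-1])
    return np.array(pacf)

pacf = sample_pacf(x, 20)
bound = 1.96 / np.sqrt(n)  # assumed approximate 5% critical value
flagged = np.nonzero(np.abs(pacf) > bound)[0] + 1  # lags exceeding the bound
print(flagged)  # lags 1 and 2 stand out, suggesting an AR(2) model
```

A real analysis would plot the PACF against these bounds, as in Figure 12.3, rather than merely list the flagged lags; note that at the 5% level roughly one in twenty null lags will exceed the bound by chance.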