These models, first made popular by Box and Jenkins [60], are widely used in some parts of geophysical science. We discuss them here for completeness. We also briefly discuss regime-dependent auto-regressive processes, which are nonlinear generalizations of the seasonal AR processes.

We begin by defining a moving average process.

10.5.2 Definition: Moving Average Processes. Moving average processes are a special class of stochastic processes that have finite memory τ_M. Such models represent physical systems that integrate the effects of only the last m encounters with a random forcing mechanism. A process X_t is said to be a moving average process of order q, or equivalently, an MA(q) process, if

    X_t = μ_X + Z_t + Σ_{l=1}^{q} β_l Z_{t−l}        (10.28)

where

1. μ_X is the mean of the process,
2. β_1, ..., β_q are constants such that β_q ≠ 0, and
3. {Z_t : t ∈ Z} is a white noise process.

A moving average process is stationary with mean μ_X and variance Var(X_t) = Var(Z_t)(1 + Σ_{l=1}^{q} β_l²).

where

1. {α_k : k = 0, 1, ...} is a sequence of coefficients such that Σ_{k=0}^{∞} |α_k| < ∞, and
2. {Z_t : t ∈ Z} is a white noise process.

10.5.4 Examples. Figure 10.15 shows finite samples of two MA(q) processes with q = 2 and 10, respectively, μ_X = 0, and Var(Z_t) = 1. We have set all coefficients β_l = 1 so that these MA(q) processes are running sums of length q + 1 of a white noise process. The variance of the MA(q) process is q + 1. The longer the summing interval for the 'forcing' process Z_t, the longer the memory and the longer the typical excursions of the 'responding' process X_t from the mean.

What are the characteristic times τ_M (10.1) for the MA(q) processes in Figure 10.15? Note that, writing β_0 = 1 and β_l = 0 for l < 0 or l > q,

    E(X_t X_{t+τ}) = Σ_{l,m=0}^{q} β_l β_m E(Z_{t−l} Z_{t+τ−m})
                   = Σ_{l=0}^{q} β_l β_{l−τ} Var(Z_t)    for |τ| ≤ q,
                   = 0                                   for |τ| > q.

Therefore, since we have implicitly assumed that Z_t (and hence X_t) is normally distributed, it follows that P(X_{t+τ} > 0 | X_t > 0) = 0.5 for all τ ≥ q + 1. Hence the characteristic time (10.1) of an MA(q) process is τ_M = q + 1.
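These properties can be checked numerically. The following sketch (variable names are our own; μ_X = 0 and Var(Z_t) = 1 as in the examples) simulates the running-sum MA(q) process with q = 2 and estimates its autocovariance function, which should be close to q + 1 at lag zero and vanish beyond lag q.

```python
import numpy as np

# Numerical check of the MA(q) properties stated above (illustrative sketch).
rng = np.random.default_rng(1)
q, n = 2, 200_000
beta = np.ones(q)                            # running-sum case: all beta_l = 1
z = rng.standard_normal(n + q)               # white noise forcing, Var(Z_t) = 1
weights = np.concatenate(([1.0], beta))      # (beta_0, beta_1, ..., beta_q)
x = np.convolve(z, weights, mode="valid")    # X_t = Z_t + sum_l beta_l Z_{t-l}

def acov(x, tau):
    """Sample autocovariance E(X_t X_{t+tau}) for a zero-mean series."""
    return np.mean(x[:-tau] * x[tau:]) if tau > 0 else np.mean(x * x)

print(round(acov(x, 0), 2))                        # close to q + 1 = 3
print([round(acov(x, t), 2) for t in range(1, 5)]) # about [2, 1, 0, 0]
```

The lag-zero estimate approximates Var(X_t) = q + 1 = 3, and the estimates for lags greater than q scatter near zero, as the covariance formula above predicts.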

214    10: Time Series and Stochastic Processes
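The characteristic-time argument above can likewise be illustrated by simulation (a sketch under our own naming, not part of the original text): for a Gaussian MA(q) process, X_t and X_{t+τ} are independent once τ ≥ q + 1, so conditioning on X_t > 0 carries no information.

```python
import numpy as np

# Estimate P(X_{t+tau} > 0 | X_t > 0) for a Gaussian MA(2) running sum.
rng = np.random.default_rng(2)
q, n = 2, 500_000
z = rng.standard_normal(n + q)
x = np.convolve(z, np.ones(q + 1), mode="valid")   # MA(2), mu_X = 0

def p_pos_given_pos(x, tau):
    """Estimate P(X_{t+tau} > 0 | X_t > 0) from one long realization."""
    a, b = x[:-tau] > 0, x[tau:] > 0
    return (a & b).sum() / a.sum()

for tau in (1, 2, 3, 5):
    print(tau, round(p_pos_given_pos(x, tau), 3))
# lags 1 and 2 stay well above 0.5; lags >= q + 1 = 3 sit near 0.5
```

The estimates drop to approximately 0.5 exactly at τ = q + 1, consistent with τ_M = q + 1.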

Figure 10.15: Top: A 240 time step realization of an MA(q) process with q = 2, μ_X = 0, and β_l = 1, for l = 1, ..., q. Bottom: As top, except q = 10.

10.5.5 Auto-regressive Moving Average Processes. An auto-regressive moving average (ARMA) process of order (p, q) [60] is simply an auto-regressive process of order p (10.6) that is forced by a zero mean moving average process of order q (10.28) instead of by white noise. An ARMA(p, q) process is formally defined as follows: X_t is said to be an auto-regressive moving average process of order (p, q) if

    (X_t − μ_X) − Σ_{i=1}^{p} α_i (X_{t−i} − μ_X) = Z_t + Σ_{j=1}^{q} β_j Z_{t−j}.        (10.31)

any weakly stationary ergodic process can be approximated arbitrarily closely by any of the three types of models. However, the ARMA models can approximate the behaviour of a given weakly stationary ergodic process to a specified level of accuracy with fewer parameters than can a pure AR or MA model. That is, they are more parsimonious than their AR or MA counterparts. The parsimony of the ARMA models is of some practical significance when fitting models to a finite data set because fewer parameters need to be estimated from a limited data resource. However, this comes at the cost of developing dynamical models that are forced by stochastic processes with memory. This may be desirable if specific knowledge that can be used to choose the memory of the forcing (i.e., the order of the moving average) appropriately is at hand. In the absence of such knowledge, however, the analyst risks obscuring the true dynamical nature of the process under study by resorting to the more parsimonious statistical model.

10.5.6 Invertible Linear Processes. All of the models described in this section can be represented formally in terms of a backward shift operator B that acts on the time index of the stochastic process. The operator B is defined so that

    B[X_t] = X_{t−1}.        (10.32)

AR, MA, and ARMA processes can all formally be written in terms of the back shift operator. Specifically, we define the auto-regressive operator φ(B) as the polynomial

    φ(B) = α_0 − Σ_{i=1}^{p} α_i B^i        (10.33)

and we define the moving average operator θ(B) as the polynomial

    θ(B) = 1 + Σ_{j=1}^{q} β_j B^j.        (10.34)
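The operator notation can be made concrete with a short simulation. The sketch below (our own naming, not from the text) steps the ARMA recursion (10.31) with μ_X = 0, which in operator form reads φ(B)[X_t] = θ(B)[Z_t] with α_0 = 1, and compares the sample variance of an ARMA(1, 1) realization with its theoretical value.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_arma(alpha, beta, n, burn=500):
    """Step the ARMA(p, q) recursion (10.31) with mu_X = 0:
       X_t = sum_i alpha_i X_{t-i} + Z_t + sum_j beta_j Z_{t-j}."""
    p, q = len(alpha), len(beta)
    z = rng.standard_normal(n + burn)
    x = np.zeros(n + burn)
    for t in range(n + burn):
        ar = sum(alpha[i] * x[t - 1 - i] for i in range(p) if t - 1 - i >= 0)
        ma = sum(beta[j] * z[t - 1 - j] for j in range(q) if t - 1 - j >= 0)
        x[t] = ar + z[t] + ma
    return x[burn:]   # drop the start-up transient so the sample is ~stationary

# ARMA(1, 1) with phi(B) = 1 - 0.6 B and theta(B) = 1 + 0.4 B
x = simulate_arma(alpha=[0.6], beta=[0.4], n=100_000)
print(round(x.var(), 2))  # near (1 + 0.4**2 + 2*0.6*0.4) / (1 - 0.6**2) = 2.5625
```

Note the economy of the representation: three parameters (α_1, β_1, Var(Z_t)) describe a process whose pure MA representation would require infinitely many coefficients, which is the parsimony argument made above.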