10.3.1 Definition: Auto-regressive Processes.

The dynamics of many physical processes can be approximated by first- or second-order ordinary linear differential equations, for example,

    a_2 \frac{d^2 x(t)}{dt^2} + a_1 \frac{dx(t)}{dt} + a_0 x(t) = z(t),

where z is some external forcing function. Standard time discretization yields

    a_2 (x_t + x_{t-2} - 2x_{t-1}) + a_1 (x_t - x_{t-1}) + a_0 x_t = z_t,

or

    x_t = \alpha_1 x_{t-1} + \alpha_2 x_{t-2} + \tilde{z}_t,    (10.5)

where

    \alpha_1 = \frac{a_1 + 2a_2}{a_0 + a_1 + a_2},
    \alpha_2 = -\frac{a_2}{a_0 + a_1 + a_2},
    \tilde{z}_t = \frac{1}{a_0 + a_1 + a_2} z_t.

If z_t is a white noise process, then (10.5) defines a second-order auto-regressive, or AR(2), process.

The variance of X_t is obtained by multiplying both sides of (10.8) by X_t - µ, and again taking expectations on both sides of the equation. We see that

    \operatorname{Var}(X_t) = \sum_{k=1}^{p} \alpha_k E\big((X_t - \mu)(X_{t-k} - \mu)\big) + E\big((X_t - \mu) Z_t\big)
                            = \sum_{k=1}^{p} \alpha_k \rho_k \operatorname{Var}(X_t) + \operatorname{Var}(Z_t),

where

    \rho_k = \frac{E\big((X_{t-k} - \mu)(X_t - \mu)\big)}{\operatorname{Var}(X_t)}.

Thus,

    \operatorname{Var}(X_t) = \frac{\operatorname{Var}(Z_t)}{1 - \sum_{k=1}^{p} \alpha_k \rho_k}.    (10.9)

The function ρ_k, k = 0, ±1, ..., is known as the auto-correlation function (see Chapter 11). We assume in the following, for convenience, that α_0 = 0 so that E(X_t) = µ = 0.
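The recursion (10.5) is straightforward to simulate. The following sketch is illustrative only: the coefficient values a_0, a_1, a_2 are hypothetical choices, not taken from the text, and the process is driven by unit-variance Gaussian white noise.

```python
# Illustrative sketch (hypothetical coefficients): simulating the AR(2)
# process x_t = alpha1 * x_{t-1} + alpha2 * x_{t-2} + z_t of eq. (10.5).
import random

def simulate_ar2(alpha1, alpha2, n, seed=0):
    """Generate n values of an AR(2) process driven by Gaussian white noise."""
    rng = random.Random(seed)
    x = [0.0, 0.0]  # start the recursion from zero initial conditions
    for _ in range(n):
        z = rng.gauss(0.0, 1.0)  # unit-variance white noise z_t
        x.append(alpha1 * x[-1] + alpha2 * x[-2] + z)
    return x[2:]

# Hypothetical ODE coefficients; alpha1, alpha2 follow the formulas above.
a2, a1, a0 = 1.0, 0.5, 1.0
alpha1 = (a1 + 2 * a2) / (a0 + a1 + a2)   # = 1.0
alpha2 = -a2 / (a0 + a1 + a2)             # = -0.4
series = simulate_ar2(alpha1, alpha2, n=10000)
```

For these coefficients the pair (α_1, α_2) = (1.0, −0.4) lies inside the stationarity region, so the simulated series fluctuates around zero with finite variance.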

10.3: Auto-regressive Processes 205

[Figure: frequency (percentage, logarithmic scale from 0.01 to 100.00) versus run length from 0 to 18, with curves for α_1 = 0, 0.3, and 0.9.]

Figure 10.8: The frequency distribution of the run length L as derived from 100 000 time step random realizations of three AR(1) processes X_t with different process parameters α_1. 50 095 runs were found for α_1 = 0, 40 280 runs for α_1 = 0.3, and 14 375 runs for α_1 = 0.9. The horizontal axis indicates the run length L.
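The run-counting experiment behind Figure 10.8 can be sketched as follows (an illustrative reconstruction, not the authors' original code, assuming a run is a maximal stretch of consecutive values with the same sign):

```python
# Illustrative reconstruction of the run-length Monte Carlo experiment:
# simulate an AR(1) process and tally the lengths of sign runs.
import random

def run_length_counts(alpha1, n=100_000, seed=1):
    """Simulate n steps of an AR(1) process and count runs of constant sign."""
    rng = random.Random(seed)
    counts = {}
    x = rng.gauss(0.0, 1.0)
    run, sign = 1, (x >= 0)
    for _ in range(n - 1):
        x = alpha1 * x + rng.gauss(0.0, 1.0)
        if (x >= 0) == sign:
            run += 1
        else:
            counts[run] = counts.get(run, 0) + 1  # a run just ended
            run, sign = 1, (x >= 0)
    counts[run] = counts.get(run, 0) + 1  # close the final run
    return counts

counts_white = run_length_counts(alpha1=0.0)  # white noise
counts_red = run_length_counts(alpha1=0.9)    # strong memory
total_white = sum(counts_white.values())
total_red = sum(counts_red.values())
```

For white noise a new run starts at roughly every second time step, so about 50 000 runs are expected from 100 000 steps, consistent with the 50 095 reported in the caption; with α_1 = 0.9 the runs are much longer and therefore far fewer.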

Figure 10.7: 240 time step realizations of AR(1) processes with α_1 = 0.3 (top) and 0.9 (bottom). Both processes are forced by unit variance normally distributed white noise.

10.3.3 AR(1) Processes.

AR(1) processes may be understood as discretized first-order differential equations. Such systems have only one degree of freedom and are unable to oscillate when the damping coefficient is positive. A nonzero value x_t at time t tends to be damped at an average rate of α_1 per time step.⁶ Obviously the system can only be stationary if α_1 < 1.⁷ Figure 10.7 shows realizations of AR(1) processes with α_1 = 0.3 and 0.9. The upper time series is very noisy and usually changes sign within just a few time steps; the lower one has markedly longer 'memory' and tends to keep the same sign for 10 and more consecutive time steps.

What is the variance of an AR(1) process? Because of the independence of X_{t-1} and the driving noise Z_t we find that

    \rho_1 = \frac{E(X_{t-1} X_t)}{\operatorname{Var}(X_t)} = \alpha_1

and thus, using (10.9),

    \operatorname{Var}(X_t) = \frac{\sigma_z^2}{1 - \alpha_1^2}.    (10.10)

Thus, the variance of the process is a linear function of the variance σ_z² of the 'input' noise Z_t and a nonlinear function of the memory parameter α_1. For processes with small memory, that is, α_1 ≈ 0, the variance of X_t is almost equal to the variance of Z_t. When α_1 > 0, Var(X_t) > Var(Z_t), and when α_1 is almost 1, the variance of X_t becomes very large. Expression (10.9) for the variance is not defined when α_1 = 1. Figure 10.7 neatly demonstrates that the variance of an AR(1) process increases with the process parameter α_1.

Now recall the run length random variable L, discussed in [10.2.3]. We were able to derive the distribution of L analytically for white noise (i.e., α_1 = 0). The derivation cannot be repeated when α_1 ≠ 0 because then elements of the process are serially correlated. We therefore estimated the distribution of L with a Monte Carlo experiment (see Section 6.3). The experiment was conducted by generating a time series of length 100 000 from an AR(1) process. The runs of length L = l were counted for each l > 0. The result of this exercise is shown in Figure 10.8.

When α_1 = 0, the Monte Carlo result agrees well with the analytical result (10.3) for L ≤ 10. For larger run lengths, the relative uncertainty of the estimate becomes large because so few runs are observed. The frequency of short runs (e.g.,

⁶ Specifically, E(X_{t+1} | X_t = x_t) = α_1 x_t.
⁷ The realizations of {X_t} grow explosively when α_1 > 1, and the process with α_1 = 1 behaves as a random walk (see [10.2.8]).
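Relation (10.10) is easy to check numerically. The sketch below is illustrative (the sample variance is only an estimate of the true variance): it simulates an AR(1) process with α_1 = 0.9 and unit-variance Gaussian noise, and compares the empirical variance with σ_z²/(1 − α_1²).

```python
# Illustrative check of Var(X_t) = sigma_z^2 / (1 - alpha1^2), eq. (10.10),
# for an AR(1) process driven by unit-variance Gaussian white noise.
import random

def ar1_sample_variance(alpha1, n=200_000, seed=2):
    """Estimate the variance of a long AR(1) realization."""
    rng = random.Random(seed)
    x, total, total_sq = 0.0, 0.0, 0.0
    for _ in range(n):
        x = alpha1 * x + rng.gauss(0.0, 1.0)
        total += x
        total_sq += x * x
    mean = total / n
    return total_sq / n - mean * mean  # sample variance

alpha1 = 0.9
empirical = ar1_sample_variance(alpha1)
theoretical = 1.0 / (1.0 - alpha1 ** 2)  # sigma_z^2 = 1, so Var = 1/0.19
```

With α_1 = 0.9 the theoretical variance is about 5.26, far larger than the unit variance of the driving noise, in line with the discussion of the memory parameter above.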

10: Time Series and Stochastic Processes    206

[Figure: frequency (percent, logarithmic scale from 0.01 to 100.00) versus interval length from 0 to 18, with curves for (α_1, α_2) = (0.3, 0.3) and (0.9, −0.8).]

Figure 10.10: The frequency distribution of the run length L as derived from 100 000 time step random realizations of two AR(2) processes X_t with different process parameters (α_1, α_2). There