Linear Stationary Processes. ARMA models

Page 1: Linear Stationary Processes.  ARMA models

Linear Stationary Processes. ARMA models

Page 2: Linear Stationary Processes.  ARMA models

• This lecture introduces the basic linear models for stationary processes.

• Considering only stationary processes is very restrictive since most economic variables are non-stationary.

• However, stationary linear models are used as building blocks in more complicated nonlinear and/or non-stationary models.

Page 3: Linear Stationary Processes.  ARMA models

Roadmap

1. The Wold decomposition
2. From the Wold decomposition to the ARMA representation
3. MA processes and invertibility
4. AR processes, stationarity and causality
5. ARMA processes, invertibility and causality

Page 4: Linear Stationary Processes.  ARMA models

The Wold Decomposition

Wold theorem in words:

Any stationary process {Zt} can be expressed as a sum of two components:

- a stochastic component: a linear combination of lags of a white noise process.

- a deterministic component, uncorrelated with the stochastic component.

Page 5: Linear Stationary Processes.  ARMA models

The Wold Theorem

If {Zt} is a nondeterministic stationary time series, then it can be written as

Zt = Σj≥0 ψj at−j + Vt,   with ψ0 = 1 and Σj≥0 ψj² < ∞,

where {at} is a white noise process and {Vt} is a deterministic component uncorrelated with {at}.

Page 6: Linear Stationary Processes.  ARMA models

Some Remarks on the Wold Decomposition, I

Page 7: Linear Stationary Processes.  ARMA models

Importance of the Wold decomposition

• Any stationary process can be written as a linear combination of lagged values of a white noise process (MA(∞) representation).

• This implies that if a process is stationary we immediately know how to write a model for it.

• Problem: we might need to estimate a lot of parameters (in most cases, an infinite number of them!).

• ARMA models are an approximation to the Wold representation. This approximation is more parsimonious (= fewer parameters).

Page 8: Linear Stationary Processes.  ARMA models

Birth of the ARMA(p,q) models

Under general conditions, the infinite lag polynomial of the Wold decomposition can be approximated by the ratio of two finite-lag polynomials:

ψ(L) ≈ θq(L) / φp(L)

Therefore

φp(L) Zt = θq(L) at
  AR(p)        MA(q)
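To see this approximation at work, the ratio θq(L)/φp(L) can be expanded back into the ψ-weights of the Wold (MA(∞)) form. The following is a minimal sketch in Python, assuming numpy and statsmodels are available; the ARMA(1,1) coefficients φ1 = 0.7 and θ1 = 0.4 are illustrative values, not taken from the slides.

# Expand theta_q(L) / phi_p(L) into the psi-weights of the MA(infinite) representation.
import numpy as np
from statsmodels.tsa.arima_process import arma2ma

phi1, theta1 = 0.7, 0.4          # illustrative ARMA(1,1) coefficients
ar = np.array([1.0, -phi1])      # phi(L) = 1 - 0.7 L (lag-polynomial coefficients)
ma = np.array([1.0, theta1])     # theta(L) = 1 + 0.4 L

psi = arma2ma(ar, ma, lags=10)   # first 10 psi-weights of psi(L) = theta(L) / phi(L)
print(psi)                       # psi_0 = 1 and psi_j = phi1**(j-1) * (phi1 + theta1) for j >= 1

Two parameters (φ1, θ1) thus stand in for the whole infinite sequence of ψ-weights, which is exactly the parsimony argument above.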

Page 9: Linear Stationary Processes.  ARMA models

MA processes

Page 10: Linear Stationary Processes.  ARMA models

MA(1) process (or ARMA(0,1))

Let {at} be a zero-mean white noise process with variance σ², and let

Zt = μ + at + θ at−1

- Expectation: E(Zt) = μ

- Variance: γ0 = Var(Zt) = (1 + θ²) σ²

- Autocovariance (first order): γ1 = θ σ²

Page 11: Linear Stationary Processes.  ARMA models

MA(1) processes (cont)

- Autocovariances of higher order: γj = 0 for j ≥ 2

- Autocorrelation: ρ1 = θ / (1 + θ²), and ρj = 0 for j ≥ 2

- Partial autocorrelation: the PACF does not cut off; it decays towards zero as the lag increases
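These moments are easy to confirm by simulation. A minimal sketch, assuming Python with numpy; θ = 0.6, σ = 1 and μ = 0 are illustrative values.

# Simulate a long MA(1) path and compare sample moments with the formulas above.
import numpy as np

rng = np.random.default_rng(0)
theta, sigma, n = 0.6, 1.0, 200_000        # illustrative values
a = rng.normal(0.0, sigma, n + 1)          # white noise a_t
z = a[1:] + theta * a[:-1]                 # Z_t = a_t + theta * a_{t-1}  (mu = 0)

print(z.var(), (1 + theta**2) * sigma**2)         # gamma_0
print(np.mean(z[1:] * z[:-1]), theta * sigma**2)  # gamma_1
print(np.mean(z[2:] * z[:-2]))                    # gamma_2, close to 0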

Page 12: Linear Stationary Processes.  ARMA models
Page 13: Linear Stationary Processes.  ARMA models

MA(1) processes (cont)

Stationarity: an MA(1) process is always covariance-stationary because its mean and autocovariances are finite and do not depend on time, whatever the value of θ.

Page 14: Linear Stationary Processes.  ARMA models

MA(q)

Zt = μ + at + θ1 at−1 + … + θq at−q

Moments:

- E(Zt) = μ

- γ0 = (1 + θ1² + … + θq²) σ²

- γj = (θj + θ1 θj+1 + … + θq−j θq) σ² for j = 1, …, q, and γj = 0 for j > q

An MA(q) process is covariance-stationary for the same reasons as an MA(1): its moments are finite and do not depend on time, whatever the values of the coefficients.
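The autocovariance formula can be checked numerically. A minimal simulation sketch, assuming Python with numpy; the MA(2) coefficients θ1 = 0.5 and θ2 = −0.3 are illustrative.

# Check the MA(q) autocovariance formula on an MA(2) example by simulation.
import numpy as np

rng = np.random.default_rng(1)
theta = np.array([1.0, 0.5, -0.3])   # (theta_0, theta_1, theta_2), with theta_0 = 1
sigma, n, q = 1.0, 500_000, 2

a = rng.normal(0.0, sigma, n + q)
z = sum(theta[i] * a[q - i : n + q - i] for i in range(q + 1))   # Z_t = sum_i theta_i a_{t-i}

for j in range(q + 2):
    gamma_formula = sigma**2 * sum(theta[i] * theta[i + j] for i in range(q + 1 - j)) if j <= q else 0.0
    gamma_sample = float(np.mean(z[j:] * z[: n - j]))
    print(j, round(gamma_formula, 3), round(gamma_sample, 3))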

Page 15: Linear Stationary Processes.  ARMA models

MA(∞)

Zt = μ + Σj≥0 ψj at−j

Is it covariance-stationary?

The process is covariance-stationary provided that

Σj≥0 ψj² < ∞

(the MA coefficients are square-summable).

Page 16: Linear Stationary Processes.  ARMA models

Invertibility

Definition: An MA(q) process is said to be invertible if it admits an autoregressive representation.

Theorem (necessary and sufficient condition for invertibility):

Let {Zt} be an MA(q) process, Zt = θq(L) at. Then {Zt} is invertible if and only if all the roots of θq(x) = 0 lie outside the unit circle.

The coefficients {πj} of the AR representation, π(L) Zt = at, are determined by the relation

π(L) = Σj≥0 πj L^j = 1 / θq(L)
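In practice the root condition is easy to check numerically. A minimal sketch, assuming Python with numpy and statsmodels; the MA(1) coefficient θ = 0.4 is an illustrative value.

# Invertibility check for an MA(1): theta(x) = 1 + 0.4 x has its root at x = -2.5.
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess

theta1 = 0.4                              # illustrative MA(1) coefficient
roots = np.roots([theta1, 1.0])           # roots of theta1 * x + 1 = 0 (highest degree first)
print(roots, np.all(np.abs(roots) > 1))   # [-2.5], outside the unit circle -> invertible

proc = ArmaProcess(ar=[1.0], ma=[1.0, theta1])
print(proc.isinvertible)                  # True: the same condition, via statsmodels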

Page 17: Linear Stationary Processes.  ARMA models

Identification of the MA(1)

Consider the autocorrelation functions of these two MA(1) processes:

Zt = at + θ at−1   and   Zt = at + (1/θ) at−1

The autocorrelation functions are:

ρ1 = θ / (1 + θ²)   and   ρ1 = (1/θ) / (1 + 1/θ²) = θ / (1 + θ²)

Thus, these two processes show an identical correlation pattern: the MA coefficient is not uniquely identified. In other words, any MA(1) process has two representations (one with MA parameter larger than 1 in absolute value, and the other with MA parameter smaller than 1 in absolute value).
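This identification problem can be seen directly with statsmodels' theoretical ACF. A minimal sketch; θ = 0.5 (and its reciprocal 2) are illustrative values.

# The ACFs of the MA(1) processes with theta = 0.5 and theta = 1/0.5 = 2 coincide.
from statsmodels.tsa.arima_process import arma_acf

acf_small = arma_acf(ar=[1.0], ma=[1.0, 0.5], lags=5)   # theta = 0.5 (invertible)
acf_large = arma_acf(ar=[1.0], ma=[1.0, 2.0], lags=5)   # theta = 2 (non-invertible)
print(acf_small)   # [1, 0.4, 0, 0, 0]
print(acf_large)   # identical: rho_1 = 2 / (1 + 2**2) = 0.4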

Page 18: Linear Stationary Processes.  ARMA models

Identification of the MA(1)

• If we identify the MA(1) through its autocorrelation structure, we would need to decide which value of θ to choose: the one greater than one in absolute value or the one smaller than one. We prefer representations that are invertible, so we choose the value with |θ| < 1.

Page 19: Linear Stationary Processes.  ARMA models

AR processes

Page 20: Linear Stationary Processes.  ARMA models

AR(1) process

Zt = c + φ Zt−1 + at

Stationarity: by recursive (backward) substitution,

Zt = c (1 + φ + φ² + …) + at + φ at−1 + φ² at−2 + …

Both sums are driven by the geometric progression 1, φ, φ², …, which converges only if |φ| < 1.

Remember!! 1 + φ + φ² + … = 1 / (1 − φ) when |φ| < 1.

Page 21: Linear Stationary Processes.  ARMA models

AR(1) (cont)

Hence, an AR(1) process is stationary if |φ| < 1.

Mean of a stationary AR(1): E(Zt) = c / (1 − φ)

Variance of a stationary AR(1): γ0 = σ² / (1 − φ²)

Page 22: Linear Stationary Processes.  ARMA models

Autocovariance of a stationary AR(1)

You need to solve a system of equations (multiply the AR(1) equation by Zt−j and take expectations):

γj = φ γj−1 ,  j ≥ 1

Autocorrelation of a stationary AR(1)

ACF: ρj = φ^j ,  j ≥ 0, so the ACF decays geometrically towards zero.
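A minimal numerical check of these AR(1) results, assuming Python with numpy and statsmodels; c = 1, φ = 0.8 and σ = 1 are illustrative values.

# Compare simulated AR(1) moments and the theoretical ACF with the closed-form expressions above.
import numpy as np
from statsmodels.tsa.arima_process import arma_acf

rng = np.random.default_rng(2)
c, phi, sigma, n = 1.0, 0.8, 1.0, 500_000     # illustrative values

eps = rng.normal(0.0, sigma, n)               # white noise a_t
z = np.empty(n)
z[0] = c / (1 - phi)                          # start at the stationary mean
for t in range(1, n):
    z[t] = c + phi * z[t - 1] + eps[t]

print(z.mean(), c / (1 - phi))                      # mean: c / (1 - phi) = 5
print(z.var(), sigma**2 / (1 - phi**2))             # variance: sigma^2 / (1 - phi^2)
print(arma_acf(ar=[1.0, -phi], ma=[1.0], lags=4))   # theoretical ACF: 1, phi, phi^2, phi^3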

Page 23: Linear Stationary Processes.  ARMA models

EXERCISE

Compute the partial autocorrelation function of an AR(1) process. Compare its pattern to that of the MA(1) process.

Page 24: Linear Stationary Processes.  ARMA models
Page 25: Linear Stationary Processes.  ARMA models
Page 26: Linear Stationary Processes.  ARMA models

AR(p)

Zt = c + φ1 Zt−1 + … + φp Zt−p + at ,  i.e.  φp(L) Zt = c + at

Stationarity: all p roots of the characteristic equation φp(x) = 0 lie outside the unit circle.

ACF: a system to solve for the first p autocorrelations (the Yule-Walker equations): p unknowns and p equations. The ACF decays as a mixture of exponentials and/or damped sine waves, depending on whether the roots are real or complex.

PACF: cuts off after lag p.
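The root condition for stationarity can be checked just like the invertibility condition for MA processes. A minimal sketch, assuming Python with numpy and statsmodels; the AR(2) coefficients φ1 = 0.5 and φ2 = 0.3 are illustrative.

# Stationarity check for an AR(2): all roots of phi(x) = 1 - 0.5 x - 0.3 x^2 must lie outside the unit circle.
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess

phi1, phi2 = 0.5, 0.3                     # illustrative AR(2) coefficients
roots = np.roots([-phi2, -phi1, 1.0])     # roots of -0.3 x^2 - 0.5 x + 1 (highest degree first)
print(roots, np.all(np.abs(roots) > 1))   # roots approx. 1.17 and -2.84 -> stationary

proc = ArmaProcess(ar=[1.0, -phi1, -phi2], ma=[1.0])
print(proc.isstationary)                  # True: the same condition, via statsmodels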

Page 27: Linear Stationary Processes.  ARMA models

Exercise

Compute the mean, the variance and the autocorrelation function of an AR(2) process.

Describe the pattern of the PACF of an AR(2) process.

Page 28: Linear Stationary Processes.  ARMA models

Causality and Stationarity

Consider the AR(1) process Zt = φ1 Zt−1 + at with |φ1| > 1. The backward substitution used before no longer converges, but iterating the equation forward yields a convergent representation of Zt in terms of future values of at, and that representation is stationary.

Page 29: Linear Stationary Processes.  ARMA models

Causality and Stationarity (II)

However, this stationary representation depends on future values of at.

It is customary to restrict attention to AR(1) processes with |φ1| < 1. Such processes are called stationary but also CAUSAL, or future-independent, AR representations.

Remark: any AR(1) process with |φ1| > 1 can be rewritten as an AR(1) process with |φ1| < 1 and a new white noise sequence.

Thus, we can restrict our analysis (without loss of generality) to processes with |φ1| < 1.

Page 30: Linear Stationary Processes.  ARMA models
Page 31: Linear Stationary Processes.  ARMA models

Causality (III)

Definition: An AR(p) process defined by the equation φp(L) Zt = at is said to be causal, or a causal function of {at}, if there exists a sequence of constants {ψj} with Σj≥0 |ψj| < ∞ such that

Zt = Σj≥0 ψj at−j

- A necessary and sufficient condition for causality is that all the roots of φp(x) = 0 lie outside the unit circle, i.e. φp(x) ≠ 0 for all |x| ≤ 1.

Page 32: Linear Stationary Processes.  ARMA models
Page 33: Linear Stationary Processes.  ARMA models

Relationship between AR(p) and MA(q)

A stationary (causal) AR(p) process admits an MA(∞) representation: Zt = φp(L)^(−1) at.

An invertible MA(q) process admits an AR(∞) representation: θq(L)^(−1) Zt = at.
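This duality can be made concrete by expanding an invertible MA(1) into the weights of its AR(∞) representation. A minimal sketch, assuming Python with statsmodels; θ = 0.4 is an illustrative value.

# Invert the MA(1) Z_t = a_t + 0.4 a_{t-1} into its AR(infinite) representation:
# pi(L) Z_t = a_t with pi_j = (-0.4)^j, absolutely summable because |0.4| < 1.
from statsmodels.tsa.arima_process import arma2ar

pi = arma2ar(ar=[1.0], ma=[1.0, 0.4], lags=6)
print(pi)   # approximately [1, -0.4, 0.16, -0.064, 0.0256, -0.01024]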

Page 34: Linear Stationary Processes.  ARMA models

ARMA(p,q) Processes

Page 35: Linear Stationary Processes.  ARMA models

ARMA(p,q)

Zt − φ1 Zt−1 − … − φp Zt−p = at + θ1 at−1 + … + θq at−q

or, in lag-operator notation,

φp(L) Zt = θq(L) at

Pure MA representation: Zt = [θq(L) / φp(L)] at = ψ(L) at

Pure AR representation: [φp(L) / θq(L)] Zt = π(L) Zt = at

Stationarity: all the roots of φp(x) = 0 lie outside the unit circle.

Invertibility: all the roots of θq(x) = 0 lie outside the unit circle.
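Both pure representations and the two root conditions can be inspected together with statsmodels' ArmaProcess. A minimal sketch; the ARMA(1,1) coefficients φ1 = 0.7 and θ1 = 0.4 are illustrative.

# ARMA(1,1): phi(L) Z_t = theta(L) a_t with phi(L) = 1 - 0.7 L and theta(L) = 1 + 0.4 L.
from statsmodels.tsa.arima_process import ArmaProcess

proc = ArmaProcess(ar=[1.0, -0.7], ma=[1.0, 0.4])   # illustrative coefficients
print(proc.isstationary, proc.isinvertible)         # root conditions on phi(x) and theta(x)
print(proc.arma2ma(lags=5))                         # psi-weights of the pure MA representation
print(proc.arma2ar(lags=5))                         # pi-weights of the pure AR representation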

Page 36: Linear Stationary Processes.  ARMA models

ARMA(1,1)

Zt = φ Zt−1 + at + θ at−1 ,  i.e.  (1 − φL) Zt = (1 + θL) at

Stationary if |φ| < 1; invertible if |θ| < 1.

Page 37: Linear Stationary Processes.  ARMA models

ACF of ARMA(1,1)

Multiplying the equation by Zt−j and taking expectations, you get this system of equations:

γ0 = φ γ1 + [1 + θ(φ + θ)] σ²
γ1 = φ γ0 + θ σ²
γj = φ γj−1 ,  j ≥ 2

Solving it gives ρ1 = (φ + θ)(1 + φθ) / (1 + θ² + 2φθ) and ρj = φ ρj−1 for j ≥ 2.
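A quick consistency check of this solution against statsmodels' theoretical ACF, assuming Python with numpy and statsmodels; φ = 0.7 and θ = 0.4 are illustrative values.

# Compare the closed-form ARMA(1,1) autocorrelations with the theoretical ACF from statsmodels.
import numpy as np
from statsmodels.tsa.arima_process import arma_acf

phi, theta = 0.7, 0.4                           # illustrative values
rho1 = (phi + theta) * (1 + phi * theta) / (1 + theta**2 + 2 * phi * theta)
rho = [1.0, rho1, phi * rho1, phi**2 * rho1]    # rho_j = phi * rho_{j-1} for j >= 2

print(np.round(rho, 4))
print(np.round(arma_acf(ar=[1.0, -phi], ma=[1.0, theta], lags=4), 4))   # should match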

Page 38: Linear Stationary Processes.  ARMA models

ACF: decays geometrically (at rate φ) from lag 1 onwards; it does not cut off.

PACF: also tails off gradually rather than cutting off. Since neither the ACF nor the PACF cuts off, an ARMA(1,1) is distinguished from a pure AR or a pure MA process.

Page 39: Linear Stationary Processes.  ARMA models
Page 40: Linear Stationary Processes.  ARMA models
Page 41: Linear Stationary Processes.  ARMA models

Summary

• Key concepts
– Wold decomposition
– ARMA as an approximation to the Wold decomposition
– MA processes: moments. Invertibility
– AR processes: moments. Stationarity and causality
– ARMA processes: moments, invertibility, causality and stationarity