Time Series Basics Fin250f: Lecture 8.1 Spring 2010 Reading: Brooks, chapter 5.1-5.7.
Time Series Basics
Fin250f: Lecture 8.1
Spring 2010
Reading: Brooks, chapter 5.1-5.7
Outline
Linear stochastic processes
Autoregressive process
Moving average process
Lag operator
Model identification: PACF/ACF, information criteria
Stochastic Processes
$$Y_t : (y_1, y_2, y_3, \ldots, y_T)$$
$$E(Y_t \mid y_{t-1}, y_{t-2}, \ldots) = E(Y_t \mid \Omega_{t-1})$$
where $\Omega_{t-1}$ is the information set at time $t-1$.
Time Series Definitions
Strictly stationary
Covariance stationary
Uncorrelated
White noise
Strictly Stationary
All distributional features are independent of time:
$$F(y_t, y_{t-1}, \ldots, y_{t-m}) \text{ independent of } t$$
$$E(Y_t \mid y_{t-1}, \ldots, y_{t-m}) \text{ independent of } t$$
Weak or Covariance Stationary
Variances and covariances independent of time
$$E(y_t) = \mu$$
$$E[(y_t - \mu)(y_t - \mu)] = \sigma^2 < \infty$$
$$E[(y_t - \mu)(y_{t+j} - \mu)] = \gamma_j$$
Autocorrelation
$$\tau_j = \frac{\gamma_j}{\gamma_0}$$
White Noise
$$E(y_t) = \mu$$
$$E[(y_t - \mu)(y_t - \mu)] = \sigma^2 < \infty$$
$$E[(y_t - \mu)(y_{t+j} - \mu)] = 0, \quad j > 0$$
White Noise in Words
Weakly stationary
All autocovariances are zero
Not necessarily independent
Time Series Estimates
$$\hat{\gamma}_j = \frac{1}{T-j} \sum_{t=1}^{T-j} (y_t - \mu)(y_{t+j} - \mu)$$
$$\hat{\tau}_j = \frac{\hat{\gamma}_j}{\hat{\gamma}_0}$$
White noise: $\hat{\tau}_j \sim N(0, 1/T)$
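The estimates above can be sketched in a few lines of NumPy. This is a minimal illustration, not a library implementation; the helper name `sample_acf` and the white-noise test series are made up for the example, and the $1/(T-j)$ divisor follows the formula on this slide.

```python
# Sketch: sample autocovariance and autocorrelation, per the formulas above.
import numpy as np

def sample_acf(y, max_lag):
    """Return tau_hat[1..max_lag], i.e., gamma_hat_j / gamma_hat_0."""
    y = np.asarray(y, dtype=float)
    T = len(y)
    d = y - y.mean()
    gamma0 = np.mean(d * d)              # gamma_hat_0 (j = 0 case)
    taus = []
    for j in range(1, max_lag + 1):
        gamma_j = np.sum(d[:T - j] * d[j:]) / (T - j)
        taus.append(gamma_j / gamma0)
    return np.array(taus)

rng = np.random.default_rng(0)
y = rng.standard_normal(2000)            # simulated white noise
tau = sample_acf(y, 10)
band = 1.96 / np.sqrt(len(y))            # approximate 95% white-noise band
print(band)
print(np.abs(tau).max())                 # typically inside the band
```

For white noise the estimated autocorrelations should scatter within roughly $\pm 1.96/\sqrt{T}$ of zero, which is exactly the band plotted on standard ACF charts.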
Ljung-Box Statistic
$$Q^* = T(T+2) \sum_{k=1}^{m} \frac{\hat{\tau}_k^2}{T-k}, \qquad Q^* \sim \chi^2_m$$
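A direct sketch of the statistic, using the sample-ACF convention from the previous slide (the $1/(T-k)$ divisor; textbook implementations sometimes divide by $T$ instead). The white-noise input and the function name are illustrative assumptions.

```python
# Sketch of the Ljung-Box Q* statistic from the formula above.
import numpy as np

def ljung_box(y, m):
    y = np.asarray(y, dtype=float)
    T = len(y)
    d = y - y.mean()
    gamma0 = np.mean(d * d)
    q = 0.0
    for k in range(1, m + 1):
        tau_k = (np.sum(d[:T - k] * d[k:]) / (T - k)) / gamma0
        q += tau_k ** 2 / (T - k)
    return T * (T + 2) * q               # compare against chi-squared(m)

rng = np.random.default_rng(1)
white = rng.standard_normal(1000)
print(ljung_box(white, 10))              # for white noise, on the order of m
```

Large values of $Q^*$ relative to $\chi^2_m$ critical values reject the white-noise hypothesis; a highly persistent series produces a $Q^*$ that is orders of magnitude larger.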
Linear Stochastic Processes
Linear models
Time series dependence
Common econometric frameworks
Engineering background
Autoregressive Process, Order 1: AR(1)
$$y_t = \mu + \phi y_{t-1} + u_t$$
AR(1) Properties
$$E(y_t) = \mu + \phi E(y_{t-1}) = \mu + \phi E(y_t)$$
$$E(y_t) = \frac{\mu}{1-\phi}$$
$$E_t(y_{t+1}) = \mu + \phi y_t$$
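The unconditional-mean result is easy to check by simulation. A minimal sketch, with illustrative parameter values $\mu = 1$, $\phi = 0.5$ (so the theoretical mean is $\mu/(1-\phi) = 2$):

```python
# Numerical check that an AR(1) has unconditional mean mu/(1 - phi).
import numpy as np

rng = np.random.default_rng(2)
mu, phi, T = 1.0, 0.5, 200_000
y = np.empty(T)
y[0] = mu / (1 - phi)                    # start at the stationary mean
for t in range(1, T):
    y[t] = mu + phi * y[t - 1] + rng.standard_normal()
print(y.mean())                          # close to 2.0
```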
More AR(1) Properties ($\mu = 0$)
$$y_t = \phi y_{t-1} + u_t$$
$$E(y_t^2) = E[(\phi y_{t-1} + u_t)(\phi y_{t-1} + u_t)]$$
$$E(y_t^2) = \phi^2 E(y_{t-1}^2) + 2\phi E(u_t y_{t-1}) + \sigma_u^2$$
$$E(y_t^2) = \phi^2 E(y_t^2) + \sigma_u^2$$
$$E(y_t^2) = \frac{\sigma_u^2}{1-\phi^2} = \mathrm{var}(y_t)$$
More AR(1) Properties ($\mu = 0$)
$$y_t = \phi y_{t-1} + u_t$$
$$E(y_t y_{t-1}) = E[(\phi y_{t-1} + u_t)(y_{t-1})]$$
$$\gamma_1 = E(y_t y_{t-1}) = \phi \sigma_y^2$$
$$\tau_1 = \frac{E(y_t y_{t-1})}{\sigma_y^2} = \phi$$
$$\tau_j = \phi^j$$
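Both the variance and the first-order autocorrelation results can be verified numerically. A sketch with the illustrative choice $\phi = 0.8$, $\sigma_u = 1$, so theory gives $\mathrm{var}(y_t) = 1/(1 - 0.64) \approx 2.78$ and $\tau_1 = 0.8$:

```python
# Numerical check: zero-mean AR(1) variance and first autocorrelation.
import numpy as np

rng = np.random.default_rng(3)
phi, T = 0.8, 200_000
u = rng.standard_normal(T)
y = np.empty(T)
y[0] = 0.0
for t in range(1, T):
    y[t] = phi * y[t - 1] + u[t]

print(y.var())                           # theory: 1/(1 - 0.64), about 2.78
tau1 = np.mean(y[1:] * y[:-1]) / y.var()
print(tau1)                              # theory: phi = 0.8
```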
AR(1): Zero mean form
$$y_{t+1} = \mu + \phi y_t + u_{t+1}$$
$$E(y_{t+1}) = \frac{\mu}{1-\phi} = \bar{y}$$
$$(y_{t+1} - \bar{y}) = \phi(y_t - \bar{y}) + u_{t+1}$$
AR(m) (Order m)
$$y_t = \mu + \sum_{j=1}^{m} \phi_j y_{t-j} + u_t$$
Moving Average Process of Order 1, MA(1)
$$y_t = \mu + \theta u_{t-1} + u_t$$
MA(1) Properties
$$y_t = \mu + \theta u_{t-1} + u_t$$
$$E(y_t) = \mu, \qquad E_t(y_{t+1}) = \mu + \theta u_t$$
$$E[(y_t - \mu)^2] = E[(\theta u_{t-1} + u_t)(\theta u_{t-1} + u_t)]$$
$$\mathrm{var}(y_t) = (1 + \theta^2)\sigma_u^2$$
$$\gamma_1 = E[(y_t - \mu)(y_{t-1} - \mu)] = E[(\theta u_{t-1} + u_t)(\theta u_{t-2} + u_{t-1})] = \theta \sigma_u^2$$
$$\tau_1 = \mathrm{cor}(y_t, y_{t-1}) = \frac{\theta}{1+\theta^2}, \qquad \tau_j = 0, \quad j \ge 2$$
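The MA(1) cut-off in the ACF is easy to see numerically. A sketch with the illustrative value $\theta = 0.9$, so $\tau_1 = 0.9/1.81 \approx 0.497$ and $\tau_2 \approx 0$:

```python
# Numerical check of the MA(1) autocorrelations derived above.
import numpy as np

rng = np.random.default_rng(4)
theta, T = 0.9, 200_000
u = rng.standard_normal(T + 1)
y = u[1:] + theta * u[:-1]               # MA(1) with mu = 0

d = y - y.mean()
g0 = np.mean(d * d)
tau1 = np.mean(d[1:] * d[:-1]) / g0
tau2 = np.mean(d[2:] * d[:-2]) / g0
print(tau1)                              # near 0.9 / 1.81, about 0.497
print(tau2)                              # near 0: ACF cuts off after lag 1
```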
MA(q)
$$y_t = \mu + \sum_{j=1}^{q} \theta_j u_{t-j} + u_t$$
$$\mathrm{var}(y_t) = (1 + \theta_1^2 + \ldots + \theta_q^2)\sigma_u^2$$
$$\mathrm{cov}(y_t, y_{t-j}) = (\theta_j + \theta_{j+1}\theta_1 + \ldots + \theta_q \theta_{q-j})\sigma_u^2$$
$$\mathrm{cov}(y_t, y_{t-j}) = 0, \quad j > q$$
Stationarity
Process not exploding
For AR(1): $|\phi| < 1$
All finite MA's are stationary
More complex beyond AR(1)
AR(1) -> MA(infinity)
$$y_t = \phi y_{t-1} + u_t$$
$$y_{t-1} = \phi y_{t-2} + u_{t-1}$$
$$y_t = \phi(\phi y_{t-2} + u_{t-1}) + u_t$$
$$y_t = \phi^2 y_{t-2} + \phi u_{t-1} + u_t$$
$$y_t = \phi^m y_{t-m} + \sum_{j=0}^{m-1} \phi^j u_{t-j}, \quad |\phi| < 1$$
$$y_t = \sum_{j=0}^{\infty} \phi^j u_{t-j}$$
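The substitution argument can be checked directly: rebuilding $y_t$ from a truncated MA($\infty$) sum reproduces the recursively simulated AR(1) up to an error of order $\phi^m$. The parameter values below are illustrative.

```python
# Sketch: truncated MA(inf) representation of an AR(1) matches the recursion.
import numpy as np

rng = np.random.default_rng(5)
phi, T, m = 0.7, 500, 40
u = rng.standard_normal(T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = phi * y[t - 1] + u[t]

t = T - 1
ma_approx = sum(phi**j * u[t - j] for j in range(m))
print(abs(y[t] - ma_approx))             # tiny: remainder is order phi**m
```

With $\phi = 0.7$ and $m = 40$, the dropped tail has weight $0.7^{40} \approx 6 \times 10^{-7}$, which is why the exponentially declining weights make the infinite-MA form usable in practice.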
Lag Operator (L)
$$L y_t = y_{t-1}$$
$$L^k y_t = y_{t-k}$$
$$L^k \mu = \mu$$
Using the Lag Operator (Mean adjusted form)
$$y_t - \bar{y} = \phi(y_{t-1} - \bar{y}) + u_t$$
$$y_t - \bar{y} = \phi L (y_t - \bar{y}) + u_t$$
$$(1 - \phi L)(y_t - \bar{y}) = u_t$$
An Important Feature of the Lag Operator
$$y_t = \phi y_{t-1} + u_t$$
$$(1 - \phi L) y_t = u_t$$
$$y_t = \frac{1}{1 - \phi L} u_t$$
$$y_t = \sum_{j=0}^{\infty} \phi^j L^j u_t = \sum_{j=0}^{\infty} (\phi L)^j u_t$$
$$\frac{1}{1 - \phi L} = \sum_{j=0}^{\infty} (\phi L)^j$$
MA(1) -> AR(infinity)
$$y_t = \mu + \theta u_{t-1} + u_t$$
$$y_t - \mu = (1 + \theta L) u_t$$
$$\frac{1}{1 + \theta L}(y_t - \mu) = u_t$$
$$\sum_{j=0}^{\infty} (-\theta L)^j (y_t - \mu) = u_t$$
MA->AR
$$y_t - \mu = -\sum_{j=1}^{\infty} (-\theta)^j (y_{t-j} - \mu) + u_t$$
$$y_t - \mu = \sum_{j=1}^{\infty} (-1)^{j-1} \theta^j (y_{t-j} - \mu) + u_t$$
$$|\theta| < 1 \quad \text{("invertibility")}$$
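Invertibility has a concrete meaning: for $|\theta| < 1$ the AR($\infty$) form lets you recover the unobserved shock $u_t$ from the observed $y$'s, $u_t = \sum_{j \ge 0} (-\theta)^j (y_{t-j} - \mu)$, with truncation error of order $\theta^J$. A sketch with illustrative values:

```python
# Sketch: recovering the MA(1) shock via the truncated AR(inf) inversion.
import numpy as np

rng = np.random.default_rng(6)
theta, T, J = 0.6, 500, 30
u = rng.standard_normal(T)
y = np.zeros(T)
y[1:] = u[1:] + theta * u[:-1]           # MA(1) with mu = 0

t = T - 1
u_hat = sum((-theta)**j * y[t - j] for j in range(J))
print(abs(u_hat - u[t]))                 # small: truncation error ~ theta**J
```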
AR's and MA's
Can convert any stationary AR to an infinite MA (exponentially declining weights)
Can only convert "invertible" MA's to AR's
Stationarity and invertibility: easy for AR(1), MA(1); more difficult for larger models
Combining AR and MA ARMA(p,q) (more later)
$$y_t = \mu + \sum_{i=1}^{p} \phi_i y_{t-i} + \sum_{j=1}^{q} \theta_j u_{t-j} + u_t$$
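Simulating the ARMA(p,q) equation is a direct recursion. A sketch for $p = q = 1$ with hypothetical parameters ($\mu = 0.5$, $\phi = 0.6$, $\theta = 0.3$); the AR(1) mean result still applies, since the MA terms have mean zero, so $E(y_t) = \mu/(1-\phi) = 1.25$:

```python
# Sketch: simulating an ARMA(1,1) per the equation above.
import numpy as np

rng = np.random.default_rng(7)
mu, phi, theta, T = 0.5, 0.6, 0.3, 10_000
u = rng.standard_normal(T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = mu + phi * y[t - 1] + theta * u[t - 1] + u[t]

print(y.mean())                          # theory: mu / (1 - phi) = 1.25
```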
Modeling Procedures: Box/Jenkins
Identification: determine the structure
  How many lags? AR, MA, or ARMA?
  Tricky
Estimation: estimate the parameters
Residual diagnostics
Next section: forecast performance and evaluation
Identification Tools
Diagnostics: ACF, partial ACF
Information criteria
Forecasts
Autocorrelation
$$\hat{\gamma}_j = \frac{1}{T-j} \sum_{t=1}^{T-j} (y_t - \mu)(y_{t+j} - \mu)$$
$$\hat{\tau}_j = \frac{\hat{\gamma}_j}{\hat{\gamma}_0}$$
White noise: $\hat{\tau}_j \sim N(0, 1/T)$
95% bands: $[-1.96\sqrt{1/T},\; 1.96\sqrt{1/T}]$
Partial Autocorrelation
Correlation between $y_t$ and $y_{t-k}$ after removing the effects of all shorter lags (< k)
The marginal forecast impact of $y_{t-k}$ given all the intervening information
Partial Autocorrelation
$$y_t = \mu + \beta_{1,1} y_{t-1} + u_t$$
$$y_t = \mu + \beta_{1,2} y_{t-1} + \beta_{2,2} y_{t-2} + u_t$$
$$y_t = \mu + \beta_{1,3} y_{t-1} + \beta_{2,3} y_{t-2} + \beta_{3,3} y_{t-3} + u_t$$
$$\mathrm{pacf} = [\beta_{1,1}, \beta_{2,2}, \beta_{3,3}, \ldots]$$
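The regression definition above translates directly into code: the $k$-th partial autocorrelation is the last coefficient from an OLS regression of $y_t$ on a constant and its first $k$ lags. A sketch (the helper name and AR(1) test data are illustrative):

```python
# Sketch: PACF by successive OLS regressions, exactly as defined above.
import numpy as np

def pacf_by_ols(y, max_lag):
    y = np.asarray(y, dtype=float)
    out = []
    for k in range(1, max_lag + 1):
        Y = y[k:]
        X = np.column_stack([np.ones(len(Y))] +
                            [y[k - j:len(y) - j] for j in range(1, k + 1)])
        beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
        out.append(beta[-1])             # beta_{k,k}
    return np.array(out)

# For an AR(1), PACF(1) should be near phi and PACF(k>1) near zero:
rng = np.random.default_rng(8)
phi, T = 0.7, 50_000
u = rng.standard_normal(T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = phi * y[t - 1] + u[t]
print(pacf_by_ols(y, 3))                 # roughly [0.7, 0, 0]
```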
For an AR(1)
$$y_t = \mu + \phi y_{t-1} + u_t$$
$$\mathrm{ACF}(j) = \tau_j = \phi^j$$
$$\mathrm{PACF}(1) = \phi, \qquad \mathrm{PACF}(j > 1) = 0$$
AR(1) ($\phi = 0.9$) [figure]
For an MA(1)
$$y_t = \mu + \theta u_{t-1} + u_t$$
$$\mathrm{ACF}(1) = \tau_1 = \frac{\theta}{1+\theta^2}$$
$$\mathrm{ACF}(j > 1) = 0$$
PACF = AR($\infty$):
$$y_t = \sum_{j=1}^{\infty} (-1)^{j-1} \theta^j y_{t-j} + u_t$$
$$\mathrm{PACF} = (\theta, -\theta^2, \theta^3, -\theta^4, \ldots)$$
MA(1) ($\theta = 0.9$) [figure]
General Features
Autoregressive: decaying ACF; PACF drops to zero beyond the model order (p)
Moving average: decaying PACF; ACF drops to zero beyond the model order (q)
Don't count on things looking so good in practice
Information Criteria
Akaike (AIC)
Schwarz Bayesian criterion (SBIC)
Hannan-Quinn (HQIC)
Objective: penalize model errors, penalize model complexity, favor simple/accurate models
Information Criteria
k = number of parameters
$$\mathrm{AIC} = \log(\hat{\sigma}^2) + \frac{2k}{T}$$
$$\mathrm{SBIC} = \log(\hat{\sigma}^2) + \frac{k}{T}\log(T)$$
$$\mathrm{HQIC} = \log(\hat{\sigma}^2) + \frac{2k}{T}\log(\log(T))$$
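A sketch of order selection with these formulas, using $\hat{\sigma}^2$ = mean squared OLS residual. The data are a simulated AR(2) (an assumed example, with made-up coefficients 0.5 and 0.3), so the criteria should favor $p = 2$; the heavier penalties (SBIC especially) are the most reliable at avoiding overfitting.

```python
# Sketch: AR order selection by AIC / SBIC / HQIC, per the formulas above.
import numpy as np

def fit_ar_sigma2(y, p):
    """OLS-fit an AR(p) with intercept; return the residual variance."""
    Y = y[p:]
    X = np.column_stack([np.ones(len(Y))] +
                        [y[p - j:len(y) - j] for j in range(1, p + 1)])
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    resid = Y - X @ beta
    return np.mean(resid ** 2)

rng = np.random.default_rng(9)
T = 20_000
u = rng.standard_normal(T)
y = np.zeros(T)
for t in range(2, T):
    y[t] = 0.5 * y[t - 1] + 0.3 * y[t - 2] + u[t]

for name, penalty in [("AIC",  lambda k: 2 * k / T),
                      ("SBIC", lambda k: k * np.log(T) / T),
                      ("HQIC", lambda k: 2 * k * np.log(np.log(T)) / T)]:
    # k counts the intercept plus the p lag coefficients
    ics = {p: np.log(fit_ar_sigma2(y, p)) + penalty(p + 1)
           for p in range(1, 6)}
    print(name, min(ics, key=ics.get))   # SBIC should settle on p = 2
```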
Estimation
Autoregressive (AR): OLS
  Biased (downward), but consistent, and approaches a normal distribution for large T
Moving average (MA) and ARMA:
  Numerical estimation procedures
  Built into many packages (e.g., the Matlab econometrics toolbox)
Residual Diagnostics
Get the model residuals (forecast errors)
Run this time series through various diagnostics: ACF, PACF, Ljung-Box, plots
Residuals should be white noise (no structure)
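The full loop can be sketched end to end: fit an AR(1) by OLS, then check that the residuals look like white noise against the $\pm 1.96/\sqrt{T}$ bands from the earlier slides. Parameter values are illustrative.

```python
# Sketch: fit an AR(1) by OLS, then run residual ACF diagnostics.
import numpy as np

rng = np.random.default_rng(10)
phi, T = 0.8, 20_000
u = rng.standard_normal(T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = phi * y[t - 1] + u[t]

# OLS regression of y_t on a constant and y_{t-1}
X = np.column_stack([np.ones(T - 1), y[:-1]])
beta, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
resid = y[1:] - X @ beta

# Residual sample ACF vs. the white-noise band
d = resid - resid.mean()
g0 = np.mean(d * d)
band = 1.96 / np.sqrt(len(d))
taus = [np.mean(d[j:] * d[:-j]) / g0 for j in range(1, 11)]
inside = np.mean(np.abs(taus) < band)
print(beta[1])                           # close to the true phi = 0.8
print(inside)                            # most lags inside the band: no structure
```

If the residual ACF showed spikes outside the bands (or a large Ljung-Box statistic), that would signal structure the model missed and send you back to the identification step.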