Posted on 24-Feb-2018
Lesson 17: Vector AutoRegressive Models
Umberto Triacca
Dipartimento di Ingegneria e Scienze dell'Informazione e Matematica, Università dell'Aquila
umberto.triacca@ec.univaq.it
Umberto Triacca Lesson 17: Vector AutoRegressive Models
Vector AutoRegressive models
The extension of ARMA models to a multivariate framework leads to:
Vector AutoRegressive (VAR) models
Vector Moving Average (VMA) models
Vector AutoRegressive Moving Average (VARMA) models
Vector AutoRegressive models
Vector AutoRegressive (VAR) models,
made famous by Chris Sims's paper Macroeconomics and Reality (Econometrica, 1980),
are among the most widely applied models in empirical economics.
Vector AutoRegressive models
Let {yt = (y1t, ..., yKt)′; t ∈ Z} be a K-variate random process.
Vector AutoRegressive models
We say that the process {yt; t ∈ Z} follows a vector autoregressive model of order p, denoted VAR(p), if

yt = ν + A1yt−1 + ... + Apyt−p + ut,  t ∈ Z

where
p is a positive integer,
the Ai are fixed (K × K) coefficient matrices,
ν = (ν1, ..., νK)′ is a fixed (K × 1) vector of intercept terms,
ut = (u1t, ..., uKt)′ is a K-dimensional white noise process with covariance matrix Σu.
The covariance matrix Σu is assumed to be nonsingular.
Example: Bivariate VAR(1)
[ y1t ]   [ ν1 ]   [ a11  a12 ] [ y1t−1 ]   [ u1t ]
[ y2t ] = [ ν2 ] + [ a21  a22 ] [ y2t−1 ] + [ u2t ]

or

y1t = ν1 + a11y1t−1 + a12y2t−1 + u1t
y2t = ν2 + a21y1t−1 + a22y2t−1 + u2t

where cov(u1t, u2s) = σ12 for t = s, and 0 otherwise.
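As a concrete illustration, the bivariate VAR(1) above can be simulated in a few lines of Python. The parameter values below (ν, the aij, and σ12) are hypothetical, chosen only so that the process is stable:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameter values, chosen so that the process is stable
# (both eigenvalues of A lie inside the unit circle)
nu = np.array([0.5, 1.0])
A = np.array([[0.5, 0.1],
              [0.2, 0.3]])
Sigma_u = np.array([[1.0, 0.3],
                    [0.3, 1.0]])   # sigma_12 = 0.3, so the noises are correlated

T = 500
y = np.zeros((T + 1, 2))           # row 0 plays the role of the presample value y0
u = rng.multivariate_normal(np.zeros(2), Sigma_u, size=T)

# y_t = nu + A y_{t-1} + u_t
for t in range(1, T + 1):
    y[t] = nu + A @ y[t - 1] + u[t - 1]
```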
Advantages of VAR models
1 They are easy to estimate.
2 They have good forecasting capabilities.
3 The researcher does not need to specify which variables are endogenous and which are exogenous: all variables are treated as endogenous.
4 In a VAR system it is very easy to test for Granger non-causality.
Problems with VARs
1 Many parameters. If there are K equations, one for each of the K variables, with p lags of each variable in every equation, then (K + pK²) parameters have to be estimated.
2 VARs are a-theoretical, since they use little economic theory. Thus VARs cannot be used to obtain economic policy prescriptions.
Estimation
In this lesson, the estimation of a vector autoregressive model is discussed.
Estimation
Consider the VAR(1)
y1t = ν1 + a11y1t−1 + a12y2t−1 + u1t
y2t = ν2 + a21y1t−1 + a22y2t−1 + u2t
where cov(u1t, u2s) = σ12 for t = s, and 0 otherwise.
Estimation
The model corresponds to two regressions with different dependent variables but identical explanatory variables.
We can estimate this model using the ordinary least squares (OLS) estimator, computed separately for each equation.
Estimation
It is assumed that a time series
y1 = [y11, y21]′, ..., yT = [y1T , y2T ]′
of the y variables is available. In addition, a presample value
y0 = [y10, y20]′
is assumed to be available.
Estimation
Consider the first equation
y1t = ν1 + a11y1t−1 + a12y2t−1 + u1t; t = 1, ...,T, i.e.
y11 = ν1 + a11y10 + a12y20 + u11
y12 = ν1 + a11y11 + a12y21 + u12
...
y1T = ν1 + a11y1T−1 + a12y2T−1 + u1T
Estimation
We define
y1 = [y11, ..., y1T]′,

X =
[ 1   y1,0     y2,0
  1   y1,1     y2,1
  ...
  1   y1,T−1   y2,T−1 ],

π1 = [ν1, a11, a12]′,
u1 = [u11, ..., u1T]′.
Thus
y1 = Xπ1 + u1.
The OLS estimator of π1 is given by
π̂1 = (X′X)−1X′y1
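A minimal sketch of this step in Python, assuming simulated data; the data-generating parameters below are hypothetical, used only to produce a series to estimate from:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a bivariate VAR(1); the "true" parameters below are hypothetical
nu = np.array([0.5, 1.0])
A = np.array([[0.5, 0.1],
              [0.2, 0.3]])
T = 2000
y = np.zeros((T + 1, 2))           # y[0] is the presample value y0
for t in range(1, T + 1):
    y[t] = nu + A @ y[t - 1] + rng.multivariate_normal(np.zeros(2), np.eye(2))

# X has rows [1, y1,t-1, y2,t-1]; y1 stacks the observations y11, ..., y1T
X = np.column_stack([np.ones(T), y[:-1, 0], y[:-1, 1]])
y1 = y[1:, 0]

# OLS estimator of pi1 = [nu1, a11, a12]': solve (X'X) pi1_hat = X'y1
pi1_hat = np.linalg.solve(X.T @ X, X.T @ y1)
```

With a long sample, pi1_hat recovers the first row of (ν, A) used in the simulation.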
Estimation
Consider the second equation
y2t = ν2 + a21y1t−1 + a22y2t−1 + u2t; t = 1, ...,T, i.e.
y21 = ν2 + a21y10 + a22y20 + u21
y22 = ν2 + a21y11 + a22y21 + u22
...
y2T = ν2 + a21y1T−1 + a22y2T−1 + u2T
Estimation
We define
y2 = [y21, ..., y2T]′,
πππ2 = [ν2, a21, a22]′,
u2 = [u21, ..., u2T ]′,
Estimation
Thus
y2 = Xπ2 + u2.
The OLS estimator of π2 is given by
π̂2 = (X′X)−1X′y2
OLS estimators
π̂1 = (X′X)−1X′y1 ⇒ π̂1 = [ν̂1, â11, â12]′
π̂2 = (X′X)−1X′y2 ⇒ π̂2 = [ν̂2, â21, â22]′
OLS Estimation
Are the OLS estimators of π1 and π2 efficient estimators?
Seemingly Unrelated Regressions Estimation
Recall that
cov(u1t, u2t) = σ12 ≠ 0.
When such contemporaneous correlation exists, it may be more efficient to estimate all equations jointly rather than each one separately by least squares.
The appropriate joint estimation technique is generalized least squares (GLS).
Seemingly Unrelated Regressions Equations
Let us consider the following set of two equations
y1t = β10 + β11x1t + β12z1t + u1t
y2t = β20 + β21x2t + β22z2t + u2t
Seemingly Unrelated Regressions Equations
The two equations can be written in the usual matrix algebra notation as
y1 = X1βββ1 + u1
y2 = X2βββ2 + u2
Seemingly Unrelated Regressions Equations
where
yi = [yi1, ..., yiT]′, i = 1, 2,

Xi =
[ 1   xi,1   zi,1
  1   xi,2   zi,2
  ...
  1   xi,T   zi,T ],  i = 1, 2,

βi = [βi0, βi1, βi2]′, i = 1, 2,

and
ui = [ui1, ..., uiT]′, i = 1, 2.
Seemingly Unrelated Regressions Equations
Further, we assume that
1 E[uit] = 0, i = 1, 2, t = 1, 2, ...,T;
2 var(uit) = σi², i = 1, 2, t = 1, 2, ...,T;
3 E[uit ujt] = σij, i, j = 1, 2;
4 E[uit ujs] = 0 for t ≠ s and i, j = 1, 2.
Seemingly Unrelated Regressions Equations
Now we write the two equations as the following 'super model'

[ y1 ]   [ X1   0  ] [ β1 ]   [ u1 ]
[ y2 ] = [ 0    X2 ] [ β2 ] + [ u2 ]

or

y = Xβ + u.

The first point to note is that least squares applied to this system is identical to applying least squares separately to each of the two equations.
Seemingly Unrelated Regressions Equations
Note that the covariance matrix of the joint disturbance vector u is given by

Φ =
[ σ1²IT   σ12IT     [ σ1²   σ12
  σ12IT   σ2²IT ] =   σ12   σ2² ] ⊗ IT = Σ ⊗ IT
Seemingly Unrelated Regressions Equations
The disturbance covariance matrix Φ has dimension (2T × 2T). An important point to note is that Φ cannot be written as a scalar multiplied by the 2T-dimensional identity matrix. Thus the best linear unbiased estimator of β is given by the generalized least squares estimator

β̂GLS = (X′Φ−1X)−1X′Φ−1y = [X′(Σ−1 ⊗ IT)X]−1X′(Σ−1 ⊗ IT)y

It has lower variance than the least squares estimator of β because it takes into account the contemporaneous correlation between the disturbances in different equations.
Seemingly Unrelated Regressions Equations
There are two conditions under which least squares is identical to generalized least squares.
1 The first is when all contemporaneous covariances are zero, σ12 = 0.
2 The second is when the explanatory variables in each equation are identical, X1 = X2.
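The second condition is exactly the VAR case. A small numerical sketch (with hypothetical regressors and a hypothetical Σ) confirms that when X1 = X2 the stacked GLS estimator coincides with equation-by-equation OLS:

```python
import numpy as np

rng = np.random.default_rng(2)
T = 200

# Identical regressors in both equations (the VAR case): X1 = X2 = X
X = np.column_stack([np.ones(T), rng.normal(size=T), rng.normal(size=T)])
Sigma = np.array([[1.0, 0.6],
                  [0.6, 2.0]])                  # contemporaneous correlation sigma_12 = 0.6
U = rng.multivariate_normal(np.zeros(2), Sigma, size=T)
y1 = X @ np.array([1.0, 0.5, -0.3]) + U[:, 0]
y2 = X @ np.array([-1.0, 0.2, 0.8]) + U[:, 1]

# Equation-by-equation OLS
b_ols = np.concatenate([np.linalg.solve(X.T @ X, X.T @ y1),
                        np.linalg.solve(X.T @ X, X.T @ y2)])

# Stacked GLS with Phi = Sigma kron I_T (using the true Sigma for the sketch)
Xs = np.block([[X, np.zeros_like(X)],
               [np.zeros_like(X), X]])
ys = np.concatenate([y1, y2])
Phi_inv = np.kron(np.linalg.inv(Sigma), np.eye(T))
b_gls = np.linalg.solve(Xs.T @ Phi_inv @ Xs, Xs.T @ Phi_inv @ ys)

print(np.max(np.abs(b_ols - b_gls)))  # numerically zero: OLS and GLS coincide
```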
Estimation of a VAR model
The use of the generalized least squares estimator does not lead to a gain in efficiency when each equation contains the same explanatory variables.
Thus the autoregressive coefficients of our VAR(1) model can be estimated, without loss of estimation efficiency, by ordinary least squares. This in turn is equivalent to estimating each equation separately by OLS.
Estimation of a VAR model
The (2 × 2) unknown covariance matrix Σ may be consistently estimated by Σ̂, whose elements are

σ̂ij = ûi′ûj / (T − 2p − 1),  for i, j = 1, 2,

where ûi = yi − Xπ̂i.
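A sketch of this residual-based estimator for a simulated bivariate VAR(1); all parameter values below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulate a bivariate VAR(1) with hypothetical parameters
nu = np.array([0.5, 1.0])
A = np.array([[0.5, 0.1],
              [0.2, 0.3]])
Sigma_u = np.array([[1.0, 0.3],
                    [0.3, 1.5]])
T = 5000
y = np.zeros((T + 1, 2))
for t in range(1, T + 1):
    y[t] = nu + A @ y[t - 1] + rng.multivariate_normal(np.zeros(2), Sigma_u)

# Estimate each equation by OLS and collect the residuals u_hat_i = y_i - X pi_hat_i
X = np.column_stack([np.ones(T), y[:-1, 0], y[:-1, 1]])
p = 1
resid = np.empty((T, 2))
for i in range(2):
    yi = y[1:, i]
    pi_hat = np.linalg.solve(X.T @ X, X.T @ yi)
    resid[:, i] = yi - X @ pi_hat

# sigma_hat_ij = u_hat_i' u_hat_j / (T - 2p - 1), here with K = 2, p = 1
Sigma_hat = resid.T @ resid / (T - 2 * p - 1)
```

With a long sample, Sigma_hat is close to the Sigma_u used in the simulation.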
Estimation of a VAR model
In general, the autoregressive coefficients of a K-dimensional VAR(p) model

yt = ν + A1yt−1 + ... + Apyt−p + ut

can be estimated, without loss of estimation efficiency, by ordinary least squares. For the first equation, we run the regression of

y1t on y1t−1, ..., yKt−1, ..., y1t−p, ..., yKt−p;

for the second equation, we regress

y2t on y1t−1, ..., yKt−1, ..., y1t−p, ..., yKt−p,

and so on.
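The equation-by-equation procedure for a general VAR(p) can be sketched as a short Python function; the simulated check at the end uses hypothetical parameters:

```python
import numpy as np

def var_ols(y, p):
    """Equation-by-equation OLS for a K-dimensional VAR(p).

    y : (T + p, K) array of observations; the first p rows serve as presample values.
    Returns a (K, 1 + K*p) matrix whose row i holds [nu_i, A_1[i, :], ..., A_p[i, :]].
    """
    T = y.shape[0] - p
    # Each row of X is [1, y_{t-1}', ..., y_{t-p}']
    X = np.column_stack([np.ones(T)] +
                        [y[p - j:p - j + T] for j in range(1, p + 1)])
    Y = y[p:]                                  # dependent observations y_1, ..., y_T
    coef = np.linalg.solve(X.T @ X, X.T @ Y)   # one OLS fit per column of Y
    return coef.T

# Hypothetical check: recover the coefficients of a simulated bivariate VAR(1)
rng = np.random.default_rng(4)
A = np.array([[0.5, 0.1], [0.2, 0.3]])
y = np.zeros((3001, 2))
for t in range(1, 3001):
    y[t] = np.array([0.5, 1.0]) + A @ y[t - 1] + rng.normal(size=2)
print(var_ols(y, 1))
```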
Conclusion
Since the VAR(p) model is just a Seemingly Unrelated Regression (SUR) model where each equation has the same explanatory variables, each equation may be estimated separately by ordinary least squares without losing efficiency relative to generalized least squares.