ARMA Autocorrelation Functions
Transcript of ARMA Autocorrelation Functions
-
7/28/2019 ARMA Autocorrelation Functions
For a moving average process, MA(q):

$$x_t = w_t + \theta_1 w_{t-1} + \theta_2 w_{t-2} + \cdots + \theta_q w_{t-q}.$$

So (with $\theta_0 = 1$)

$$\gamma(h) = \operatorname{cov}(x_{t+h}, x_t)
= E\left[\left(\sum_{j=0}^{q} \theta_j w_{t+h-j}\right)\left(\sum_{k=0}^{q} \theta_k w_{t-k}\right)\right]
= \begin{cases} \sigma_w^2 \sum_{j=0}^{q-h} \theta_j \theta_{j+h}, & 0 \le h \le q \\ 0, & h > q. \end{cases}$$
-
So the ACF is

$$\rho(h) = \begin{cases} \dfrac{\sum_{j=0}^{q-h} \theta_j \theta_{j+h}}{\sum_{j=0}^{q} \theta_j^2}, & 0 \le h \le q \\ 0, & h > q. \end{cases}$$

Notes:

In these expressions, $\theta_0 = 1$ for convenience.

$\rho(q) \ne 0$ but $\rho(h) = 0$ for $h > q$. This characterizes MA(q).
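The cutoff can be checked numerically. Below is a small Python sketch (Python rather than the SAS and R used elsewhere in these notes; the MA coefficient is a made-up illustrative value) that evaluates the ACF formula above:

```python
import numpy as np

def ma_acf(theta, max_lag):
    """ACF of an MA(q) process x_t = w_t + theta_1 w_{t-1} + ... + theta_q w_{t-q}.

    Uses rho(h) = sum_{j=0}^{q-h} theta_j theta_{j+h} / sum_j theta_j^2,
    with theta_0 = 1, and rho(h) = 0 for h > q.
    """
    th = np.concatenate(([1.0], np.asarray(theta, dtype=float)))  # theta_0 = 1
    q = len(th) - 1
    denom = np.sum(th ** 2)
    rho = []
    for h in range(max_lag + 1):
        if h <= q:
            # sum over j = 0, ..., q - h of theta_j * theta_{j+h}
            rho.append(np.sum(th[: q - h + 1] * th[h:]) / denom)
        else:
            rho.append(0.0)  # the ACF cuts off after lag q
    return np.array(rho)
```

For example, `ma_acf([0.5], 5)` gives $\rho(0) = 1$, $\rho(1) = 0.5/1.25 = 0.4$, and exact zeros beyond lag 1, showing the MA(1) cutoff.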
-
For an autoregressive process, AR(p):

$$x_t = \phi_1 x_{t-1} + \phi_2 x_{t-2} + \cdots + \phi_p x_{t-p} + w_t.$$

So

$$\gamma(h) = \operatorname{cov}(x_{t+h}, x_t)
= E\left[\left(\sum_{j=1}^{p} \phi_j x_{t+h-j} + w_{t+h}\right) x_t\right]
= \sum_{j=1}^{p} \phi_j \gamma(h-j) + \operatorname{cov}(w_{t+h}, x_t).$$
-
Because $x_t$ is causal, $x_t$ is $w_t$ plus a linear combination of $w_{t-1}, w_{t-2}, \ldots$.

So

$$\operatorname{cov}(w_{t+h}, x_t) = \begin{cases} \sigma_w^2, & h = 0 \\ 0, & h > 0. \end{cases}$$

Hence

$$\gamma(h) = \sum_{j=1}^{p} \phi_j \gamma(h-j), \quad h > 0,$$

and

$$\gamma(0) = \sum_{j=1}^{p} \phi_j \gamma(j) + \sigma_w^2.$$
-
If we know the parameters $\phi_1, \phi_2, \ldots, \phi_p$ and $\sigma_w^2$, these equations for $h = 0$ and $h = 1, 2, \ldots, p$ form $p + 1$ linear equations in the $p + 1$ unknowns $\gamma(0), \gamma(1), \ldots, \gamma(p)$.

The other autocovariances can then be found recursively from the equation for $h > p$.

Alternatively, if we know (or have estimated) $\gamma(0), \gamma(1), \ldots, \gamma(p)$, they form $p + 1$ linear equations in the $p + 1$ parameters $\phi_1, \phi_2, \ldots, \phi_p$ and $\sigma_w^2$.

These are the Yule-Walker equations.
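As a concrete illustration, here is a Python sketch (the AR(2) coefficient values are made up) that solves the equations in the first direction: given $\phi_1$, $\phi_2$ and $\sigma_w^2$, find $\gamma(0), \gamma(1), \gamma(2)$, then extend recursively:

```python
import numpy as np

# Hypothetical AR(2) parameters (made up for illustration)
phi1, phi2, sigma2 = 0.5, 0.3, 1.0

# Yule-Walker equations for h = 0, 1, 2 in the unknowns gamma(0), gamma(1), gamma(2):
#   gamma(0) = phi1*gamma(1) + phi2*gamma(2) + sigma2
#   gamma(1) = phi1*gamma(0) + phi2*gamma(1)
#   gamma(2) = phi1*gamma(1) + phi2*gamma(0)
A = np.array([
    [1.0,   -phi1,       -phi2],
    [-phi1, 1.0 - phi2,  0.0],
    [-phi2, -phi1,       1.0],
])
b = np.array([sigma2, 0.0, 0.0])
g = np.linalg.solve(A, b)          # gamma(0), gamma(1), gamma(2)

# Higher lags follow from the recursion gamma(h) = phi1*gamma(h-1) + phi2*gamma(h-2)
gamma = list(g)
for h in range(3, 10):
    gamma.append(phi1 * gamma[h-1] + phi2 * gamma[h-2])
```

Going the other direction (from estimated autocovariances to parameters) is the same linear system read with the roles of knowns and unknowns exchanged.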
-
For the ARMA(p, q) model with $p > 0$ and $q > 0$:

$$x_t = \phi_1 x_{t-1} + \phi_2 x_{t-2} + \cdots + \phi_p x_{t-p} + w_t + \theta_1 w_{t-1} + \theta_2 w_{t-2} + \cdots + \theta_q w_{t-q},$$

a generalized set of Yule-Walker equations must be used.

The moving average models ARMA(0, q) = MA(q) are the only ones with a closed form expression for $\rho(h)$.

For AR(p) and ARMA(p, q) with $p > 0$, the recursive equation means that for $h > \max(p, q + 1)$, $\gamma(h)$ is a sum of geometrically decaying terms, possibly damped oscillations.
-
The recursive equation is

$$\gamma(h) = \sum_{j=1}^{p} \phi_j \gamma(h-j), \quad h > q.$$

What kinds of sequences satisfy an equation like this?

Try $\gamma(h) = z^{-h}$ for some constant $z$.

The equation becomes

$$0 = z^{-h} - \sum_{j=1}^{p} \phi_j z^{-(h-j)} = z^{-h}\left(1 - \sum_{j=1}^{p} \phi_j z^{j}\right) = z^{-h} \phi(z).$$
-
So if $\phi(z) = 0$, then $\gamma(h) = z^{-h}$ satisfies the equation.

Since $\phi(z)$ is a polynomial of degree $p$, there are $p$ solutions, say $z_1, z_2, \ldots, z_p$.

So a more general solution is

$$\gamma(h) = \sum_{l=1}^{p} c_l z_l^{-h},$$

for any constants $c_1, c_2, \ldots, c_p$.

If $z_1, z_2, \ldots, z_p$ are distinct, this is the most general solution; if some roots are repeated, the general form is a little more complicated.
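This can be verified numerically. The following Python sketch (the AR(2) coefficients and starting values are arbitrary made-up numbers) generates $\gamma(h)$ from the recursion and checks that it matches the closed form $\sum_l c_l z_l^{-h}$ built from the roots of the AR polynomial:

```python
import numpy as np

# Hypothetical AR(2) coefficients (illustrative values giving distinct real roots)
phi1, phi2 = 0.5, 0.3

# Roots of the AR polynomial phi(z) = 1 - phi1*z - phi2*z^2.
# np.roots takes coefficients from the highest power down.
z = np.roots([-phi2, -phi1, 1.0])

# Generate gamma(h) from the recursion with arbitrary starting values
gamma = [1.0, 0.7]
for h in range(2, 12):
    gamma.append(phi1 * gamma[h-1] + phi2 * gamma[h-2])

# Solve for c1, c2 so that gamma(h) = c1*z1^(-h) + c2*z2^(-h),
# using the first two values as initial conditions.
M = np.array([[1.0,      1.0],
              [1.0/z[0], 1.0/z[1]]])
c = np.linalg.solve(M, np.array(gamma[:2]))

# The closed form reproduces every later term of the recursion.
closed = [c[0] * z[0]**(-h) + c[1] * z[1]**(-h) for h in range(12)]
```

Causality puts the roots outside the unit circle ($|z_l| > 1$), which is why $z_l^{-h}$ decays geometrically in $h$.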
-
If all $z_1, z_2, \ldots, z_p$ are real, this is a sum of geometrically decaying terms.

If any root is complex, its complex conjugate must also be a root, and these two terms may be combined into geometrically decaying sine-cosine terms.

The constants $c_1, c_2, \ldots, c_p$ are determined by initial conditions; in the ARMA case, these are the Yule-Walker equations.

Note that the various rates of decay are the zeros of $\phi(z)$, the autoregressive operator, and do not depend on $\theta(z)$, the moving average operator.
-
Example: ARMA(1, 1)

$$x_t = \phi x_{t-1} + \theta w_{t-1} + w_t.$$

The recursion is

$$\rho(h) = \phi \rho(h-1), \quad h = 2, 3, \ldots$$

So $\rho(h) = c \phi^h$ for $h = 1, 2, \ldots$, but $c \ne 1$.

Graphically, the ACF decays geometrically, but with a different value at $h = 0$.
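A Python sketch of this behavior, using the causal $\psi$-weight expansion of the ARMA(1, 1) ($\psi_0 = 1$, $\psi_j = \phi^{j-1}(\phi + \theta)$ for $j \ge 1$, a standard result), truncated at a large lag; the parameter values match the ARMAacf example in these notes:

```python
import numpy as np

phi, theta = 0.9, 0.5   # same parameters as the ARMAacf example

# Causal psi-weights for ARMA(1,1): psi_0 = 1, psi_j = phi**(j-1) * (phi + theta)
J = 200   # truncation point; phi**J is negligible
psi = np.concatenate(([1.0], (phi + theta) * phi ** np.arange(J)))

# gamma(h) = sigma_w^2 * sum_j psi_j psi_{j+h}; take sigma_w^2 = 1
def gamma(h):
    return np.sum(psi[: len(psi) - h] * psi[h:])

rho = np.array([gamma(h) / gamma(0) for h in range(25)])

# For h >= 2 the recursion rho(h) = phi * rho(h-1) holds, but
# rho(1) is about 0.944, not phi * rho(0) = 0.9: the geometric
# decay starts from a different value at h = 0.
```

The ratio $\rho(h+1)/\rho(h)$ is exactly $\phi$ from lag 1 onward, while the step from lag 0 to lag 1 also involves $\theta$.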
-
[Figure: ACF of an ARMA(1, 1) model with $\phi = 0.9$ and $\theta = 0.5$, computed in R as ARMAacf(ar = 0.9, ma = 0.5, 24) and plotted against lag (1 to 25): geometric decay from values near 1.]
-
The Partial Autocorrelation Function
An MA(q) can be identified from its ACF: non-zero to lag q,
and zero afterwards.
We need a similar tool for AR(p).
The partial autocorrelation function (PACF) fills that role.
-
Recall: for multivariate random variables X, Y, Z, the partial correlations of X and Y given Z are the correlations of:

the residuals of X from its regression on Z; and
the residuals of Y from its regression on Z.

Here regression means conditional expectation, or best linear prediction, based on population distributions, not a sample calculation.

In a time series, the partial autocorrelations are defined as

$$\phi_{hh} = \text{partial correlation of } x_{t+h} \text{ and } x_t \text{ given } x_{t+h-1}, x_{t+h-2}, \ldots, x_{t+1}.$$
-
For an autoregressive process, AR(p):

$$x_t = \phi_1 x_{t-1} + \phi_2 x_{t-2} + \cdots + \phi_p x_{t-p} + w_t,$$

if $h > p$, the regression of $x_{t+h}$ on $x_{t+h-1}, x_{t+h-2}, \ldots, x_{t+1}$ is

$$\phi_1 x_{t+h-1} + \phi_2 x_{t+h-2} + \cdots + \phi_p x_{t+h-p}.$$

So the residual is just $w_{t+h}$, which is uncorrelated with $x_{t+h-1}, x_{t+h-2}, \ldots, x_{t+1}$ and $x_t$.
-
So the partial autocorrelation is zero for $h > p$:

$$\phi_{hh} = 0, \quad h > p.$$

We can also show that $\phi_{pp} = \phi_p$, which is non-zero by assumption.

So $\phi_{pp} \ne 0$ but $\phi_{hh} = 0$ for $h > p$. This characterizes AR(p).
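The cutoff can be illustrated numerically. The sketch below (pure Python; the AR(2) coefficients are made-up values) computes the theoretical PACF from the ACF via the Durbin-Levinson recursion, a standard algorithm for this purpose:

```python
# Hypothetical AR(2): x_t = 0.5 x_{t-1} + 0.3 x_{t-2} + w_t
phi1, phi2 = 0.5, 0.3

# Theoretical ACF from the recursion, with rho(1) = phi1 / (1 - phi2)
rho = [1.0, phi1 / (1.0 - phi2)]
for h in range(2, 11):
    rho.append(phi1 * rho[h-1] + phi2 * rho[h-2])

def pacf_from_acf(rho, max_lag):
    """Durbin-Levinson recursion: returns [1, phi_11, phi_22, ...]."""
    pacf = [1.0]          # lag-0 slot, by convention
    phi_prev = []         # phi_{h-1, j} for j = 1, ..., h-1
    for h in range(1, max_lag + 1):
        num = rho[h] - sum(phi_prev[j] * rho[h - 1 - j] for j in range(h - 1))
        den = 1.0 - sum(phi_prev[j] * rho[j + 1] for j in range(h - 1))
        phi_hh = num / den
        phi_new = [phi_prev[j] - phi_hh * phi_prev[h - 2 - j] for j in range(h - 1)]
        phi_new.append(phi_hh)
        pacf.append(phi_hh)
        phi_prev = phi_new
    return pacf

pacf = pacf_from_acf(rho, 6)
# pacf[2] equals phi2, and pacf[h] is numerically zero for h > 2
```

For this AR(2), $\phi_{22} = \phi_2 = 0.3$ exactly, and $\phi_{hh} = 0$ for all $h > 2$, as claimed.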
-
The Inverse Autocorrelation Function

SAS's proc arima also shows the Inverse Autocorrelation Function (IACF).

The IACF of the ARMA(p, q) model

$$\phi(B) x_t = \theta(B) w_t$$

is defined to be the ACF of the inverse (or dual) process

$$\theta(B) x_t^{\text{(inverse)}} = \phi(B) w_t.$$

The IACF has the same property as the PACF: AR(p) is characterized by an IACF that is nonzero at lag p but zero for larger lags.
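A small Python sketch of the idea (the AR(1) coefficient is a made-up value): for an AR(1), $(1 - \phi B)x_t = w_t$, the dual process is $x_t = (1 - \phi B)w_t$, an MA(1), so its ACF, the IACF, cuts off after lag 1:

```python
import numpy as np

phi = 0.9   # illustrative AR(1) coefficient

# AR(1): (1 - phi*B) x_t = w_t.  The dual/inverse process swaps the
# operators: x_t = (1 - phi*B) w_t, i.e. an MA(1) with coefficient -phi.
theta = np.array([1.0, -phi])          # weights of the dual process

# IACF = ACF of the dual process
denom = np.sum(theta ** 2)
iacf = [np.sum(theta[: len(theta) - h] * theta[h:]) / denom
        if h < len(theta) else 0.0
        for h in range(6)]
# iacf[1] = -phi / (1 + phi**2), and iacf[h] = 0 for h >= 2:
# the IACF of an AR(1) cuts off after lag 1, mirroring the ACF of an MA(1).
```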
-
Summary: Identification of ARMA processes
AR(p) is characterized by a PACF or IACF that is:
nonzero at lag p;
zero for lags larger than p.
MA(q) is characterized by an ACF that is:
nonzero at lag q;
zero for lags larger than q.
For anything else, try ARMA(p, q) with p > 0 and q > 0.
-
For $p > 0$ and $q > 0$:

       AR(p)                  MA(q)                  ARMA(p, q)
ACF    Tails off              Cuts off after lag q   Tails off
PACF   Cuts off after lag p   Tails off              Tails off
IACF   Cuts off after lag p   Tails off              Tails off

Note: these characteristics are used to guide the initial choice of a model; estimation and model-checking will often lead to a different model.
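As a rough illustration of "tails off" versus "cuts off" with sample quantities (a Python simulation sketch; the AR(1) coefficient, series length, and seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a hypothetical AR(1) with phi = 0.8 (values made up for illustration)
phi, n = 0.8, 5000
w = rng.standard_normal(n + 200)
x = np.zeros(n + 200)
for t in range(1, n + 200):
    x[t] = phi * x[t-1] + w[t]
x = x[200:]                      # drop burn-in

def sample_acf(x, max_lag):
    """Sample autocorrelations r(0), ..., r(max_lag)."""
    x = x - x.mean()
    c0 = np.dot(x, x) / len(x)
    return [np.dot(x[:-h], x[h:]) / len(x) / c0 if h else 1.0
            for h in range(max_lag + 1)]

r = sample_acf(x, 3)
# For an AR(1), the sample ACF tails off geometrically (r[2] is near r[1]**2),
# while the lag-2 sample PACF is near zero:
pacf2 = (r[2] - r[1]**2) / (1 - r[1]**2)
```

The sample ACF stays well away from zero at lags 2 and 3 (tails off), while the lag-2 sample partial autocorrelation is close to zero (cuts off after lag 1), matching the AR column of the table.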
-
Other ARMA Identification Techniques
SAS's proc arima offers the MINIC option on the identify statement, which produces a table of SBC criteria for various values of p and q.

The identify statement has two other options: ESACF and SCAN.

Both produce tables in which the pattern of zero and non-zero values characterizes p and q.
See Section 3.4.10 in Brocklebank and Dickey.
-
options linesize = 80;
ods html file = 'varve3.html';

data varve;
  infile '../data/varve.dat';
  input varve;
  lv = log(varve);
run;

proc arima data = varve;
  title 'Use identify options to identify a good model';
  identify var = lv(1) minic esacf scan;
  estimate q = 1 method = ml;
  estimate q = 2 method = ml;
  estimate p = 1 q = 1 method = ml;
-
run;
proc arima output: http://www.stat.ncsu.edu/people/bloomfield/courses/st730/varve3.html