
2.2 Time Series Analysis

2.2.1 Preliminaries

2.2.2 Various Types of Stochastic Processes

2.2.3 Parameters of Univariate and Bivariate Time Series

2.2.4 Estimating Covariance Functions and Spectra

The central statistical model used in time series analysis is the stochastic process.

Definition

A stochastic process {Xt, t ∈ ℤ} is an ordered set of random variables, indexed by an integer t which usually represents time. A time series is a realization of a stochastic process.

Example: White Noise

The simplest stochastic process is white noise: an infinite sequence of zero-mean iid normal random variables.

A white noise has no memory, i.e. for any nonzero lag τ:

P(Xt+τ > 0 | Xt > 0) = ∫_0^∞ ∫_0^∞ f(x, y) dx dy / ∫_0^∞ f(x) dx = ( ∫_0^∞ f(x) dx × ∫_0^∞ f(y) dy ) / ∫_0^∞ f(x) dx = ∫_0^∞ f(y) dy = 0.5
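This memorylessness is easy to check empirically. A minimal numerical sketch (not from the lecture; the lag 5 and sample size are arbitrary choices):

```python
import numpy as np

# Empirical check that white noise has no memory:
# P(X_{t+tau} > 0 | X_t > 0) should be ~0.5 for any nonzero lag tau.
rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)       # iid N(0,1) white noise
tau = 5                                # arbitrary nonzero lag
cond = x[:-tau] > 0                    # condition on the event {X_t > 0}
p_cond = np.mean(x[tau:][cond] > 0)    # estimate of P(X_{t+tau} > 0 | X_t > 0)
```

The estimate lands near 0.5 regardless of the lag chosen.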

Properties of a Stochastic Process

A stochastic process {Xt, t ∈ ℤ} is said to be stationary if all stochastic properties are independent of the index t. It follows that

• every Xt has the same distribution function

• for all t and s, the parameters of the joint distribution function of Xt and Xs depend only on |t−s|

A stochastic process is weakly stationary if the mean E(Xt) is independent of time and the second moments E(Xt Xs) are functions of |t−s| only.

• The assumption of weak stationarity is less restrictive than that of stationarity and is often sufficient for the methods used in climate research

A stochastic process is weakly cyclo-stationary if the mean is a function of the time within a deterministic cycle and the central second moments are functions of |t−s| and the phase of the cycle.

Example of Non-stationarity

The 1958–1977 time series of monthly mean atmospheric CO2 concentration measured at the Mauna Loa Observatory in Hawaii. The time series can be considered a superposition of a stationary process Xt, a linear trend αt, and an oscillation with a period of 12 months.

Example: Random Walk

Given a white noise Zt, Xt = Σ_{j=1}^{t} Zj is a random walk. Xt is non-stationary in variance:

E(Xt) = Σ_{j=1}^{t} E(Zj) = 0

Var(Xt) = E( (Σ_{j=1}^{t} Zj)² ) = Σ_{j=1}^{t} Σ_{k=1}^{t} E(Zj Zk) = Σ_{j=1}^{t} E(Zj²) = tσ²

If an ensemble of random walks is considered, the center of gravity will not move, but the scatter increases continuously.
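The ensemble statement can be verified numerically. A minimal sketch (ensemble size and length are arbitrary choices):

```python
import numpy as np

# Ensemble of random walks: the centre of gravity stays near zero
# while Var(X_t) grows like t * sigma^2.
rng = np.random.default_rng(1)
n_walks, t_max = 5000, 200
z = rng.standard_normal((n_walks, t_max))   # white noise with sigma^2 = 1
x = np.cumsum(z, axis=1)                    # X_t = Z_1 + ... + Z_t, one walk per row
mean_end = x[:, -1].mean()                  # ensemble mean at t = t_max (~0)
var_end = x[:, -1].var()                    # ensemble variance at t = t_max (~t_max)
```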

Ergodicity

Ergodicity has to be assumed, since stationarity or weak stationarity alone is not enough to ensure that the moments of a process can be estimated from a single time series.

Definition (loose)

A time series is ergodic if it varies quickly enough in time that increasing amounts of information about the process parameters can be obtained by extending the time series.

Example of a Non-Ergodic Process

xt = a, where a is a realization of a random variable A with mean µ:

time average = (1/n) Σ_{t=1}^{n} xt = a ≠ ensemble average = E(A) = µ


Various Types of Stochastic Processes

We will concentrate on the auto-regressive (AR) processes as the most relevant type of stochastic processes for climate research, and discuss briefly their relation to a more general class of short memory processes, the auto-regressive moving average (ARMA) processes, and the relation to a class of long memory processes, the fractional auto-regressive integrated moving average (ARIMA) processes.

Definition: AR-Processes

{Xt, t ∈ ℤ} is an AR-process of order p if there exist real constants αk, k = 0,…,p, with αp ≠ 0, and a white noise process {Zt, t ∈ ℤ} such that

Xt = α0 + Σ_{k=1}^{p} αk Xt−k + Zt

• AR-processes are important, since given any weakly stationary ergodic process {Xt}, it is possible to find an AR-process {Yt} that approximates {Xt} arbitrarily closely

• AR-processes are popular, since they represent discretized ordinary differential equations

• The mean and variance are

E(Xt) = µ = α0 / (1 − Σ_{k=1}^{p} αk),   Var(Xt) = Var(Zt) / (1 − Σ_{k=1}^{p} αk ρ(k))

where ρ(k) is the auto-correlation function

• Yt = Xt − µ is a process with zero mean

AR(1)-Process: Xt=α1Xt-1+Zt

• AR(1)-processes (p = 1) have only one degree of freedom and are unable to oscillate

One finds

ρ(1) = α1,   ρ(τ) = α1^|τ|,   Var(Xt) = σZ² / (1 − α1²)

Thus, the variance of the process is a linear function of the variance of the input white noise Zt and a non-linear function of the parameter α1.

• A non-zero value of xt at time t tends to be damped, with an average damping rate of α1 per time step

• An AR(1)-process with a negative coefficient flips around zero. Such a process is considered inappropriate for the description of a climate series
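These two formulas can be checked by simulation. A minimal sketch (α1 = 0.9 and the sample size are arbitrary choices):

```python
import numpy as np

# Simulate X_t = a1 * X_{t-1} + Z_t and compare sample statistics with
# the theory: Var(X) = sigma_z^2 / (1 - a1^2) and rho(1) = a1.
rng = np.random.default_rng(2)
a1, n = 0.9, 500_000
z = rng.standard_normal(n)
x = np.empty(n)
x[0] = 0.0
for t in range(1, n):
    x[t] = a1 * x[t - 1] + z[t]
x = x[n // 10:]                             # discard spin-up from X_0 = 0
var_theory = 1.0 / (1.0 - a1**2)            # ~ 5.26
var_sample = x.var()
rho1 = np.corrcoef(x[:-1], x[1:])[0, 1]     # lag-1 autocorrelation ~ a1
```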


AR(2)-Process: Xt=α1Xt-1+α2Xt-2+Zt

AR(2)-processes (p=2) have only two degrees of freedom and can oscillate with one preferred frequency

The AR(2)-process with α1=0.9 and α2=-0.8 exhibits quasi-periodic behavior with a period of about 6 time steps

The AR(2)-process with α1=0.3 and α2=0.3 has behavior comparable to that of an AR(1)-process with long memory

Stationarity of AR-Processes

AR-processes can be non-stationary. An AR(1)-process with α1 = 2 and µ = 0 is stationary with respect to the mean but non-stationary with respect to the variance, since for Xt starting from X0 one has

Xt = 2Xt−1 + Zt,   E(Xt) = 2^t E(X0) = 0

Var(Xt) = 4 Var(Xt−1) + Var(Zt) = 4^t Var(X0) + (1/3)(4^t − 1) Var(Zt)

An AR(p)-process with AR coefficients αk, k = 1,…,p, is stationary if and only if all roots of the characteristic polynomial

p(y) = 1 − Σ_{k=1}^{p} αk y^k

lie outside the circle |y| = 1.

• The characteristic polynomial has p roots, yi, i = 1,…,p

• They can be real or appear in complex conjugate pairs

Condition for a stationary AR(1) process

p(y) = 1 − α1 y = 0  →  y = 1/α1

An AR(1)-process is stationary if |α1| < 1

Condition for a stationary AR(2) process

p(y) = 1 − α1 y − α2 y² = 0  →  y1,2 = (−α1 ± √(α1² + 4α2)) / (2α2)

• An AR(2)-process is stationary if α1 + α2 < 1, α2 − α1 < 1 and |α2| < 1

• AR(2) coefficients that satisfy these conditions lie in the triangle depicted in the figure. The curve α1² + 4α2 = 0 separates the region where the characteristic polynomial has a pair of complex conjugate roots (α1² + 4α2 < 0) from the region where it has two real roots (α1² + 4α2 > 0).
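The root condition is easy to check numerically. A minimal sketch using numpy.roots (the coefficient ordering is the only subtlety):

```python
import numpy as np

# Stationarity test for an AR(p) process: all roots of
# p(y) = 1 - alpha_1 y - ... - alpha_p y^p must satisfy |y| > 1.
def is_stationary(alpha):
    # np.roots expects coefficients from the highest power down:
    # -alpha_p, ..., -alpha_1, 1
    coeffs = [-a for a in alpha[::-1]] + [1.0]
    return bool(np.all(np.abs(np.roots(coeffs)) > 1.0))

ok_ar2 = is_stationary([0.9, -0.8])   # AR(2) from the slides: stationary
ok_ar1 = is_stationary([2.0])         # AR(1) with alpha_1 = 2: explosive
```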

More about the roots of the characteristic polynomial: the roots of the characteristic polynomial describe the 'typical temporal behavior' of the corresponding process.

Let yi, i = 1,…,p, be the roots of the characteristic polynomial p(y). Given a fixed i, set

Xt−k = yi^k,   k = 1,…,p

substitute these values into the corresponding process Xt = Σ_{k=1}^{p} αk Xt−k + Zt, and disregard the noise; this yields Xt = 1.

Each root yi thus identifies a set of 'typical' initial conditions that lead to Xt = 1 when the noise is disregarded. Since these initial conditions are linearly independent, any set of states (Xt−1,…,Xt−p) can be represented as a linear combination of them. In the absence of noise, the future evolution of these will be

Xt+τ = Σ_{i=1}^{p} βi yi^{−τ}

Example I

An AR(1)-process has only one root, y1 = 1/α1, so

Xt+τ = β1 α1^τ

Example II

The AR(2)-process with (α1, α2) = (0.3, 0.3) has roots y1 = 1.39 and y2 = −2.39:

Xt+τ = β1 y1^{−τ} + β2 y2^{−τ}

Example III

The AR(2)-process with (α1, α2) = (0.9, −0.8) has roots y1 = r·e^{iφ} and y1* = r·e^{−iφ} with r = 1.11, φ = π/3:

Xt+τ = β1 y1^{−τ} + β1* (y1*)^{−τ} ∝ r^{−τ} cos(φτ + ψ)
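The quoted roots can be recovered numerically from the characteristic polynomial. A minimal sketch:

```python
import numpy as np

# Recover the roots quoted in Examples II and III from
# p(y) = 1 - a1*y - a2*y^2 (np.roots takes the highest power first).
y_ex2 = np.roots([-0.3, -0.3, 1.0])    # (a1, a2) = (0.3, 0.3): two real roots
y_ex3 = np.roots([0.8, -0.9, 1.0])     # (a1, a2) = (0.9, -0.8): conjugate pair
r = np.abs(y_ex3[0])                   # modulus of the complex pair ~ 1.12
phi = abs(np.angle(y_ex3[0]))          # argument ~ pi/3
```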

The 'typical temporal behavior' of an AR-process in the absence of noise is characterized by damped modes, with or without oscillations.

• The damping is necessary, since otherwise the presence of the noise would make the process non-stationary (a random walk)

• The rate of damping is determined by the magnitude of the roots of the corresponding characteristic polynomial

• Complex roots lead to oscillations

Definition: Moving Average (MA) Processes

A process {Xt, t ∈ ℤ} is said to be a moving average process of order q (MA(q)) if

Xt = µX + Zt + Σ_{l=1}^{q} βl Zt−l

where

1. µX is the mean of the process
2. β1,…,βq are constants such that βq ≠ 0
3. {Zt; t ∈ ℤ} is a white noise process

An MA-process is stationary, with mean µX and variance

Var(Xt) = Var(Zt) (1 + Σ_{l=1}^{q} βl²)
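The variance formula can be checked by simulation. A minimal sketch (the MA(2) coefficients are illustrative choices, not from the lecture):

```python
import numpy as np

# Sample variance of an MA(2) process X_t = Z_t + b1*Z_{t-1} + b2*Z_{t-2}
# against the theory Var(X) = Var(Z) * (1 + b1^2 + b2^2).
rng = np.random.default_rng(5)
b1, b2 = 0.5, -0.3                       # illustrative coefficients
z = rng.standard_normal(500_000)
x = z[2:] + b1 * z[1:-1] + b2 * z[:-2]   # vectorized MA(2) filter
var_theory = 1.0 + b1**2 + b2**2         # = 1.34
var_sample = x.var()
```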

Definition: Auto-regressive Moving Average (ARMA) Processes

A process {Xt, t ∈ ℤ} is said to be an ARMA process of order (p,q) if

Xt − Σ_{i=1}^{p} αi Xt−i = Zt + Σ_{l=1}^{q} βl Zt−l

where

1. µX is the mean of the process
2. α1,…,αp and β1,…,βq are constants such that αp ≠ 0 and βq ≠ 0
3. {Zt; t ∈ ℤ} is a white noise process

There is substantial overlap between the classes of MA, AR, and ARMA models:

• Any weakly stationary ergodic process can be approximated arbitrarily closely by any of the three types of models

• ARMA models can often approximate the behavior of a given weakly stationary ergodic process to a specified level of accuracy with fewer parameters than a pure AR or MA model

Backward shift operator B

• B acts on the time index of the stochastic process. It is defined by

B[Xt] = Xt−1

and satisfies, for example,

(1 − B)Xt = Xt − Xt−1

• AR, MA and ARMA processes can all formally be written in terms of B. Specifically, define the AR operator

φ(B) = 1 − Σ_{i=1}^{p} αi B^i

and the MA operator

θ(B) = 1 + Σ_{j=1}^{q} βj B^j

AR, MA and ARMA processes are then formally stochastic processes satisfying

φ(B) Xt = Zt   (AR)
Xt = θ(B) Zt   (MA)
φ(B) Xt = θ(B) Zt   (ARMA)

Why B?

• It provides the tool needed to explore the connections between AR and MA models

• It allows other classes of models to be introduced

Definition: Auto-regressive Integrated Moving Average (ARIMA) Processes

• A process {Xt, t ∈ ℤ} is said to be an ARIMA process of order (p,q,d) if the d-th difference of Xt, (1 − B)^d Xt, satisfies an ARMA model of order (p,q), i.e.

φ(B)(1 − B)^d Xt = θ(B) Zt

• If −1/2 < d < 1/2, Xt is called a fractional ARIMA process

• An ARIMA process with a positive integer d is generally not stationary, whereas a fractional ARIMA process can be stationary

• Fractional ARIMA processes are also known as long-memory processes

Parameters of Time Series: The Auto-covariance Function

Let Xt be a real- or complex-valued stationary process with mean µ. Then

γ(τ) = E((Xt − µ)(Xt+τ − µ)*) = Cov(Xt, Xt+τ)

is the auto-covariance function of Xt, and the normalized function

ρ(τ) = γ(τ)/γ(0)

is the auto-correlation function of Xt. The argument τ is the lag. An auto-correlation function has the properties

ρ(−τ) = ρ(τ)*,   |ρ(τ)| ≤ 1

The auto-covariance function and the auto-correlation function have the same shape but differ in their units: the former is in units of Xt², the latter is dimensionless.

Example:

The auto-correlation function of a white noise is

ρ(τ) = 1 for τ = 0, and 0 otherwise

The Yule-Walker equations for an AR(p) process

If we multiply a zero-mean AR(p) process Xt by Xt−τ, for τ = 1,…,p,

Xt Xt−τ = Σ_{i=1}^{p} αi Xt−i Xt−τ + Zt Xt−τ

and take expectations, we obtain a system of equations

Σp α⃗p = γ⃗p

known as the Yule-Walker equations. These equations relate the auto-covariances

γ⃗p = (γ(1), γ(2),…, γ(p))^T

at lags τ = 1,…,p to the process parameters

α⃗p = (α1, α2,…, αp)^T

and the auto-covariances γ(τ) at lags τ = 0,…,p−1 through the p×p matrix

      ( γ(0)     γ(1)     …  γ(p−1) )
Σp =  ( γ(1)     γ(0)     …  γ(p−2) )
      (  ⋮         ⋮       ⋱    ⋮    )
      ( γ(p−1)   γ(p−2)   …  γ(0)   )

The Yule-Walker equations can be used to build an AR model

If the covariances γ(τ), τ = 0,…,p, are known (e.g. have been estimated from data), the parameters of the AR(p) process can be determined by solving the Yule-Walker equations for α1,…,αp.

The Yule-Walker equations can be used to determine auto-covariance functions

If α⃗p is known, the Yule-Walker equations can be solved for γ(1),…,γ(p), given the variance of the process, γ(0). The full auto-covariance function can then be derived by recursively extending the Yule-Walker equations. This is done by multiplying the original AR process by Xt−τ for τ > p, which yields

γ(τ) = Σ_{k=1}^{p} αk γ(τ − k)
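The model-building direction can be sketched in a few lines: estimate sample autocovariances, build the Toeplitz matrix Σp, and solve the linear system (a minimal sketch; the AR(1) test process is an arbitrary choice):

```python
import numpy as np

# Fit AR(p) coefficients by solving the Yule-Walker equations
# Sigma_p alpha_p = gamma_p built from sample autocovariances.
def yule_walker(x, p):
    x = np.asarray(x, float) - np.mean(x)
    n = len(x)
    gamma = np.array([np.sum(x[: n - k] * x[k:]) / n for k in range(p + 1)])
    sigma = np.array([[gamma[abs(i - j)] for j in range(p)] for i in range(p)])
    return np.linalg.solve(sigma, gamma[1:])

rng = np.random.default_rng(3)
z = rng.standard_normal(200_000)
x = np.empty_like(z)
x[0] = 0.0
for t in range(1, len(z)):               # simulate AR(1) with alpha_1 = 0.7
    x[t] = 0.7 * x[t - 1] + z[t]
alpha_hat = yule_walker(x, 1)            # should recover ~0.7
```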

Uniqueness of the AR(p) Approximation to an Arbitrary Stationary Process

The following theorem is useful when fitting an AR(p) process to an observed time series.

Let Xt be a stationary process with auto-correlation function ρ. For each p > 0 there is a unique AR(p) process with auto-correlation function ρp such that

ρp(τ) = ρ(τ) for |τ| ≤ p

The parameters α⃗p = (αp,1,…,αp,p)^T of the approximating process of order p are recursively related to those of the approximating process of order p−1 by

αp,k = α(p−1),k − αp,p α(p−1),(p−k),   k = 1,…,p−1

where

αp,p = ( ρ(p) − Σ_{k=1}^{p−1} α(p−1),k ρ(p−k) ) / ( 1 − Σ_{k=1}^{p−1} α(p−1),k ρ(k) )

starting from α1,1 = ρ(1).
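This recursion (the Durbin-Levinson scheme) is short enough to write out directly. A minimal sketch, tested on an AR(2) process whose true autocorrelations are known:

```python
# Durbin-Levinson recursion for the AR(p) approximation,
# starting from alpha_{1,1} = rho(1).
def durbin_levinson(rho, p):
    # rho[k] is the autocorrelation at lag k, with rho[0] = 1
    a = [rho[1]]                          # order-1 coefficient alpha_{1,1}
    for m in range(2, p + 1):
        num = rho[m] - sum(a[k] * rho[m - 1 - k] for k in range(m - 1))
        den = 1.0 - sum(a[k] * rho[k + 1] for k in range(m - 1))
        a_mm = num / den                  # alpha_{m,m}
        # alpha_{m,k} = alpha_{m-1,k} - alpha_{m,m} * alpha_{m-1,m-k}
        a = [a[k] - a_mm * a[m - 2 - k] for k in range(m - 1)] + [a_mm]
    return a

# AR(2) with (alpha_1, alpha_2) = (0.3, 0.3) has rho(1) = rho(2) = 3/7;
# the recursion should recover the original coefficients.
alpha = durbin_levinson([1.0, 3.0 / 7.0, 3.0 / 7.0], 2)
```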

Auto-covariance and auto-correlation functions of some AR(p) processes

p = 1: The Yule-Walker equation is

α1 γ(0) = γ(1)  ⇒  ρ(1) = α1,   ρ(τ) = α1^τ

p = 2: The Yule-Walker equations are

( γ(0)  γ(1) ) (α1)   ( γ(1) )
( γ(1)  γ(0) ) (α2) = ( γ(2) )

⇒  ρ(1) = α1/(1 − α2),   ρ(2) = α2 + α1²/(1 − α2)

Auto-correlation functions of two AR(1) processes with α1 = 0.3 (hatched) and 0.9 (solid), and of two AR(2) processes with (α1, α2) = (0.3, 0.3) (hatched) and (0.9, −0.8) (solid).

• The auto-covariance function of an AR(p) process is a sum of auto-covariance functions of AR(1) and AR(2) processes

• The weak-stationarity assumption ensures that |yk| > 1 for all k. Thus each real root contributes a component to the auto-correlation function that decays exponentially, and each pair of complex conjugate roots contributes an exponentially damped oscillation

The General Form of the Auto-correlation Function of an AR(p) Process

• The auto-correlation function of a weakly stationary AR(p) process can be expressed as

ρ(τ) = Σ_{k=1}^{p} ak yk^{−|τ|}

where yk, k = 1,…,p, are the roots of the characteristic polynomial, which are either real or appear in complex conjugate pairs. The coefficients ak can be derived from the process parameters α⃗p. When yk is real (complex), the corresponding ak is real (complex).

• For a pair of complex conjugate roots yk = rk e^{±iφk}, the two terms combine into an exponentially damped oscillation:

ak yk^{−|τ|} + ak* (yk*)^{−|τ|} ∝ rk^{−|τ|} cos(φk τ + ψk)

Definition I (H. von Storch & F. Zwiers)

Let Xt be an ergodic weakly stationary stochastic process with auto-covariance function γ(τ), τ = …,−1, 0, 1,…. Then the spectrum (or power spectrum) Γ of Xt is the Fourier transform F of the auto-covariance function γ, i.e.

Γ(ω) = F{γ}(ω) = Σ_{τ=−∞}^{∞} γ(τ) e^{−2πiωτ}

for all ω ∈ [−1/2, 1/2].

Parameter of Time Series: Spectrum

The Fourier transform F{·} is a mapping from the set of summable discrete series to a set of functions defined on the interval [−1/2, 1/2]. If s is a summable discrete series, its Fourier transform F{s} is the function that takes, for all ω ∈ [−1/2, 1/2], the value

F{s}(ω) = Σ_{j=−∞}^{∞} sj e^{−2πiωj}

The Fourier transform mapping is invertible:

sj = ∫_{−1/2}^{1/2} F{s}(ω) e^{2πiωj} dω

Properties of Γ(ω)

• Γ(ω) of a real-valued process is symmetric: Γ(ω) = Γ(−ω)

• Γ(ω) is continuous and differentiable for ω ∈ [−1/2, 1/2]; in particular dΓ(ω)/dω |_{ω=0} = 0

• γ(τ) can be obtained using the inverse Fourier transform:

γ(τ) = ∫_{−1/2}^{1/2} Γ(ω) e^{2πiωτ} dω

In particular, Var(Xt) = γ(0) = 2 ∫_{0}^{1/2} Γ(ω) dω

• Γ(ω) describes the distribution of variance across time scales

• Γ(ω) is a linear function of the auto-covariance function. That is, if γ(τ) = a1 γ1(τ) + a2 γ2(τ), then

Γ(ω) = a1 Γ1(ω) + a2 Γ2(ω)

Parameter of Time Series: Spectrum

Definition II (Koopmans; Brockwell & Davis)

• A stochastic process is represented as the inverse Fourier transform of a random complex-valued function (or measure) Zω defined in the frequency domain:

Xt = ∫ e^{2πiωt} dZω

(i.e. a stochastic process is a sum of random oscillations)

• The spectrum is defined as the expectation of the squared modulus of the random spectral measure Zω

• The auto-covariance function is then the inverse Fourier transform of the spectrum

Consider a periodic weakly stationary process

Xt = Σ_{j=−n}^{n} Zj e^{2πiωj t}

where ωj = 1/Tj, j = −n,…,n, Tj ∈ ℤ, and the Zj are complex random numbers satisfying Zj = Z−j*.

The first and second moments of the process are

E(Xt) = Σ_{j=−n}^{n} E(Zj) e^{2πiωj t}

E(Xt Xt+τ*) = Σ_{j=−n}^{n} E(|Zj|²) e^{−2πiωj τ} + Σ_{j≠k} E(Zj Zk*) e^{2πi(ωj−ωk)t} e^{−2πiωk τ}

Thus Xt is weakly stationary if

• Zj, j = 1,…,n, have zero mean

• Zj is uncorrelated with Zk for j ≠ k

Under the condition of weak stationarity one has

γ(τ) = Σ_{j=−n}^{n} E(|Zj|²) e^{−2πiωj τ} = E(|Z0|²) + 2 Σ_{j=1}^{n} E(|Zj|²) cos(2πωj τ)

and

lim_{n→∞} Σ_{τ=1}^{n} |γ(τ)| = ∞

The spectrum is a line spectrum: it is discrete rather than continuous! This is a long-memory process.
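The periodic, non-decaying autocovariance can be seen in a simulation of a single random-phase harmonic (a minimal sketch; frequency, lag, and ensemble size are arbitrary choices):

```python
import numpy as np

# The autocovariance of a random-phase harmonic X_t = cos(2*pi*w0*t + phi)
# is (1/2)*cos(2*pi*w0*tau): periodic, never decaying, hence not summable.
rng = np.random.default_rng(6)
w0 = 0.1
t = np.arange(300)
phi = rng.uniform(0.0, 2.0 * np.pi, size=(20_000, 1))
x = np.cos(2.0 * np.pi * w0 * t + phi)                # one realization per row
tau = 10
gamma_hat = np.mean(x[:, :-tau] * x[:, tau:])         # estimate of gamma(10)
gamma_theory = 0.5 * np.cos(2.0 * np.pi * w0 * tau)   # = 0.5 at tau = 10
```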

Definition I is more suitable for short-memory processes

γ(τ) decays sufficiently fast with increasing τ that

lim_{n→∞} Σ_{τ=1}^{n} |γ(τ)| < ∞

The variance is attributed to a continuous range of frequencies rather than a few discrete frequencies; otherwise we would have a process with a periodic covariance function and hence infinite memory.

Definition II is more suitable for long-memory processes or processes with distinct oscillations

γ(τ) does not decay sufficiently fast with increasing τ, so that γ(τ) is not summable:

lim_{n→∞} Σ_{τ=1}^{n} |γ(τ)| = ∞

The variance is attributed to a few discrete frequencies.

The Spectrum of AR(p) and MA(q) Processes

The spectrum of an AR(p) process with process parameters {α1,…,αp} and noise variance Var(Zt) = σZ² is

Γ(ω) = σZ² / |1 − Σ_{k=1}^{p} αk e^{−2πikω}|²

The spectrum of an MA(q) process with process parameters {β1,…,βq} and noise variance Var(Zt) = σZ² is

Γ(ω) = σZ² |1 + Σ_{l=1}^{q} βl e^{−2πilω}|²

The Spectrum of a White Noise Process: Γ(ω) = σZ²

The Spectrum of an AR(1) Process:

Γ(ω) = σZ² / |1 − α1 e^{−2πiω}|² = σZ² / (1 + α1² − 2α1 cos(2πω))

For 2πω << 1 one has cos(2πω) ≈ 1 − (2πω)²/2, so that

Γ(ω) ≈ σZ² / ((1 − α1)² + α1 (2πω)²)

Γ(ω) ≈ σZ² / (1 − α1)²   for (2πω)² << (1 − α1)²/α1

Γ(ω) ≈ σZ² / (α1 (2πω)²)   for (2πω)² >> (1 − α1)²/α1

There are no extremes in the interior of [0, 1/2], since

dΓ(ω)/dω = −(4πα1/σZ²) Γ(ω)² sin(2πω) ≠ 0 for ω ∈ (0, 1/2)

When α1 > 0, the spectral peak is located at frequency ω = 0. Such processes are referred to as red noise processes.

Power spectra of AR(1) processes with α1 = 0.3 (left) and α1 = 0.9 (right)
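The AR(1) spectrum integrates back to the process variance, as required by the property Var(Xt) = 2∫Γ(ω)dω over [0, 1/2]. A minimal numerical check:

```python
import numpy as np

# AR(1) spectrum Gamma(w) = sigma_z^2 / |1 - a1 * exp(-2*pi*i*w)|^2, checked
# against Var(X_t) = 2 * integral_0^{1/2} Gamma(w) dw = sigma_z^2 / (1 - a1^2).
a1, sigma2 = 0.9, 1.0
w = np.linspace(0.0, 0.5, 20_001)
spec = sigma2 / np.abs(1.0 - a1 * np.exp(-2j * np.pi * w)) ** 2
# trapezoidal rule for 2 * integral of the spectrum over [0, 1/2]
var_spec = 2.0 * np.sum(0.5 * (spec[1:] + spec[:-1]) * np.diff(w))
var_theory = sigma2 / (1.0 - a1**2)       # ~ 5.263
```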

Plotting Formats

Power spectra of AR(1) processes with α1 = 0.3 (left) and α1 = 0.9 (right)

The same spectra plotted in log-log format, which

• emphasizes low-frequency variations

• emphasizes power-law behavior

• but is not variance-conserving

Spectrum of an AR(2) Process

The power spectrum of an AR(2) process with parameters (α1, α2) is

Γ(ω) = σZ² / (1 + α1² + α2² − 2g(ω))

where

g(ω) = α1(1 − α2) cos(2πω) + α2 cos(4πω)

The spectrum has an extreme for ω ∈ (0, 1/2) when

|α1(1 − α2)| < 4|α2|

It is a maximum when α2 < 0 and a minimum when α2 > 0.

Power spectra of AR(2) processes with (α1, α2) = (0.3, 0.3) (left) and (α1, α2) = (0.9, −0.8) (right). The former has a minimum at ω ≈ 0.28, while the latter has a maximum at ω ≈ 0.17.
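The locations of the quoted extremes can be found numerically from the spectrum formula. A minimal sketch:

```python
import numpy as np

# Locate the interior extreme of the AR(2) spectrum numerically and compare
# with the slides: maximum near w = 0.17 for (0.9, -0.8), minimum near
# w = 0.28 for (0.3, 0.3).
def ar2_spectrum(w, a1, a2, sigma2=1.0):
    g = a1 * (1.0 - a2) * np.cos(2 * np.pi * w) + a2 * np.cos(4 * np.pi * w)
    return sigma2 / (1.0 + a1**2 + a2**2 - 2.0 * g)

w = np.linspace(0.001, 0.499, 100_000)
w_max = w[np.argmax(ar2_spectrum(w, 0.9, -0.8))]   # a2 < 0: a maximum
w_min = w[np.argmin(ar2_spectrum(w, 0.3, 0.3))]    # a2 > 0: a minimum
```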


The general form of the spectra of AR processes

Since the auto-covariance function is a sum of auto-covariance functions of AR(1) and AR(2) processes, and since the Fourier transform is linear, the spectrum of an AR(p) process is the Fourier transform of the sum of auto-covariance functions of AR(1) and AR(2) processes and hence the sum of auto-spectra of AR(1) and AR(2) processes

Interpretation of spectra of AR processes

The exponential decay of the auto-covariance functions of AR(p) processes implies that the spectra of AR(p) processes are continuous. A peak in the spectrum therefore cannot reflect the presence of a strictly periodic component, even though it may indicate the presence of damped eigen-oscillations with eigen-frequencies close to that of the peak.

Parameters of Time Series: The Cross-covariance Function

Let (Xt, Yt) represent a pair of stochastic processes that are jointly weakly stationary. Then the cross-covariance function γxy is given by

γxy(τ) = E((Xt − µX)(Yt+τ − µY)*)

where µX is the mean of Xt and µY is the mean of Yt, and the cross-correlation function ρxy is given by

ρxy(τ) = γxy(τ) / (σX σY)

where σX and σY are the standard deviations of the processes Xt and Yt, respectively.

To ensure that the cross-correlation function exists and is absolutely summable, one needs to assume that

• the means µX and µY are independent of time

• all second moments depend only on the lag:

E((Xt − µX)(Xs − µX)) = γxx(t − s),   E((Yt − µY)(Ys − µY)) = γyy(t − s)
E((Xt − µX)(Ys − µY)) = γxy(s − t),   E((Yt − µY)(Xs − µX)) = γyx(s − t)

• Σ_{τ=−∞}^{∞} |γab(τ)| < ∞ for ab = xx, xy, yy

Properties

1. γyx(τ) = γxy(−τ)*

2. |γxy(τ)|² ≤ γxx(0) γyy(0)

3. γx,αy+βz(τ) = α γxy(τ) + β γxz(τ)

Examples:

Cross-covariance between Xt and Yt = αXt:

γxy(τ) = α γxx(τ) — symmetric!

Cross-covariance between Xt and Yt = αXt + Zt, with Zt an independent white noise:

γyy(0) = α² γxx(0) + σz²,   γyy(τ) = α² γxx(τ) for τ ≠ 0,   γxy(τ) = α γxx(τ)

Cross-covariance between Xt and Yt = Xt+1 − Xt:

γxy(τ) = γxx(τ + 1) − γxx(τ)   ( ≈ dγxx(τ)/dτ )
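The second example can be checked empirically. A minimal sketch (α = 2 and a white-noise Xt are arbitrary choices made so the expected values are simple):

```python
import numpy as np

# Empirical cross-covariance between X_t and Y_t = a*X_t + Z_t: it should
# match gamma_xy(tau) = a * gamma_xx(tau). With white-noise X this is
# a * Var(X) at lag 0 and ~0 at all other lags.
def cross_cov(x, y, tau):
    x, y = x - x.mean(), y - y.mean()
    n = len(x)
    return np.sum(x[: n - tau] * y[tau:]) / n   # estimate of E[X_t Y_{t+tau}]

rng = np.random.default_rng(4)
n, a = 400_000, 2.0
x = rng.standard_normal(n)
y = a * x + rng.standard_normal(n)              # independent white noise added
g0 = cross_cov(x, y, 0)                         # ~ a * Var(X) = 2
g3 = cross_cov(x, y, 3)                         # ~ 0
```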

Examples:

Cross-covariance between an AR(1) process and its driving noise:

γxz(τ) = E(Xt+τ Zt) = α1^τ σz² for τ ≥ 0
γxz(τ) = 0 for τ < 0

Highly non-symmetric!

Estimated cross-correlation functions between two monthly indices of SST and SLP over the North Pacific: one from data (thin line) and the other from a stochastic climate model (heavy line). Negative lags: z leads; positive lags: x leads.

The Effect of Feedbacks on Cross-correlation Functions

The continuous version of an AR(1) process is a first-order differential equation

∂Xt/∂t = −λXt + Zt

where the 'forcing' Zt acts on Xt without feedback.

A system with feedback can be written as

∂Xt/∂t = −λXt + Zt + Nx,t
Zt = −λa Xt + Nz,t

• λa = 0: no feedback, ρxz = 0 when X leads

• λa > 0: negative feedback, ρxz is anti-symmetric

• λa < 0: positive feedback, ρxz is positive everywhere, with a maximum near lag zero


The Effect of Feedbacks on Cross-correlation Functions

Predicted correlation between Z (monthly mean turbulent heat flux) and X (monthly mean sea surface temperature) for different feedbacks (Frankignoul 1985)

Estimated correlation between Z (monthly mean turbulent heat flux) and X (monthly mean sea surface temperature), averaged over different latitudinal bands in the Atlantic ocean

Parameters of Time Series: The Cross-spectrum

Definition:

Let Xt and Yt be two weakly stationary stochastic processes with covariance functions γxx and γyy and cross-covariance function γxy. Then the cross-spectrum Γxy is defined as the Fourier transform of γxy:

Γxy(ω) = F{γxy}(ω) = Σ_{τ=−∞}^{∞} γxy(τ) e^{−2πiωτ},   ω ∈ [−1/2, 1/2]

The cross-spectrum is generally a complex-valued function, since the cross-covariance function is in general neither strictly symmetric nor anti-symmetric.

The cross-spectrum can be represented in different ways

1. The cross-spectrum can be decomposed into its real and imaginary parts as

Γxy(ω) = Λxy(ω) + iΨxy(ω)

where Λxy(ω) is the co-spectrum and Ψxy(ω) the quadrature spectrum.

2. The cross-spectrum can be written in polar coordinates as

Γxy(ω) = Axy(ω) e^{iΦxy(ω)}

where Axy(ω) is the amplitude spectrum and Φxy(ω) the phase spectrum, with

Axy(ω) = (Λxy²(ω) + Ψxy²(ω))^{1/2}

Φxy(ω) = tan^{−1}(Ψxy(ω)/Λxy(ω)) when Λxy(ω) ≠ 0 and Ψxy(ω) ≠ 0
Φxy(ω) = 0 if Λxy(ω) > 0, ±π if Λxy(ω) < 0, when Ψxy(ω) = 0
Φxy(ω) = π/2 if Ψxy(ω) > 0, −π/2 if Ψxy(ω) < 0, when Λxy(ω) = 0

3. The (squared) coherence spectrum is a dimensionless amplitude spectrum:

κxy(ω) = Axy²(ω) / (Γxx(ω) Γyy(ω))

Properties of the Cross-spectrum

For jointly weakly stationary processes Xt, Yt and Zt:

1. Γx,αy+βz(ω) = α Γxy(ω) + β Γxz(ω)

2. γxy(τ) = ∫_{−1/2}^{1/2} Γxy(ω) e^{2πiωτ} dω

3. 0 ≤ κxy(ω) ≤ 1

Properties of the Cross-spectrum of Real Weakly Stationary Processes

1. The co-spectrum is the Fourier transform of the symmetric part γxy_s(τ), and the quadrature spectrum is the Fourier transform of the anti-symmetric part γxy_a(τ) of the cross-covariance function:

Λxy(ω) = γxy_s(0) + 2 Σ_{τ=1}^{∞} γxy_s(τ) cos(2πτω)
Ψxy(ω) = −2 Σ_{τ=1}^{∞} γxy_a(τ) sin(2πτω)

with

γxy_s(τ) = (γxy(τ) + γxy(−τ))/2,   γxy_a(τ) = (γxy(τ) − γxy(−τ))/2

2. The co-spectrum is symmetric and the quadrature spectrum is anti-symmetric:

Λxy(ω) = Λxy(−ω),   Ψxy(ω) = −Ψxy(−ω)

3. The amplitude spectrum is positive and symmetric, and the phase spectrum is anti-symmetric

4. The coherence spectrum is symmetric

5. It is therefore sufficient to consider spectra for positive ω

Examples:

Cross-spectrum between Xt and Yt = αXt:

Γyy(ω) = α² Γxx(ω)
Γxy(ω) = α Γxx(ω)
Λxy(ω) = α Γxx(ω),   Ψxy(ω) = 0
Axy(ω) = α Γxx(ω)
κxy(ω) = 1

Cross-spectrum between Xt and Yt = αXt + Zt, with Zt an independent white noise:

Γyy(ω) = α² Γxx(ω) + σz²
Γxy(ω) = α Γxx(ω)
κxy(ω) = α² Γxx(ω) / (α² Γxx(ω) + σz²) < 1

Cross-spectrum between Xt and Yt = Xt+1 − Xt:

Γyy(ω) = |e^{2πiω} − 1|² Γxx(ω) = 2(1 − cos(2πω)) Γxx(ω)
Γxy(ω) = (e^{2πiω} − 1) Γxx(ω)
Λxy(ω) = −(1 − cos(2πω)) Γxx(ω)
Ψxy(ω) = sin(2πω) Γxx(ω)
Axy(ω) = √(2(1 − cos(2πω))) Γxx(ω) = √(Γxx(ω) Γyy(ω))
tan(Φxy(ω)) = −sin(2πω)/(1 − cos(2πω)) = −cot(πω),  so Φxy(ω) = π(ω + 1/2) for ω ≥ 0
κxy(ω) = 1 for ω ≠ 0

Properties of the Cross-spectrum of Real Weakly Stationary Processes (continued)

When the cross-covariance function is symmetric, the quadrature spectrum and hence the phase spectrum are zero for all ω. When the cross-covariance function is anti-symmetric, the co-spectrum vanishes and the phase spectrum is

Φxy(ω) = ±π/2

with the sign determined by the sign of the quadrature spectrum.