
Generalized Gaussian Bridges of Prediction-Invertible Processes

Tommi Sottinen¹ and Adil Yazigi

University of Vaasa, Finland

Modern Stochastics: Theory and Applications III

September 10, 2012, Kyiv, Ukraine

¹ http://www.uwasa.fi/~tsottine/


Abstract

A generalized bridge is the law of a stochastic process that is conditioned on multiple linear functionals of its path. We consider canonical representations of the bridges. In the canonical representation the linear spaces $L_t(X) = \overline{\mathrm{span}}\{X_s;\ s \le t\}$ coincide for all $t < T$ for both the original process and its bridge representation.

A Gaussian process $X = (X_t)_{t\in[0,T]}$ is prediction-invertible if it can be recovered (in law, at least) from its prediction martingale:

$$X_t = \int_0^t p_T^{-1}(t,s)\,\mathrm{d}\mathrm{E}\big[X_T \,\big|\, \mathcal{F}^X_s\big].$$

In discrete time all non-degenerate Gaussian processes are prediction-invertible. In continuous time this is most probably not true.


References

This work, S. and Yazigi (2012), combines and extends the results of Alili (2002) and Gasbarra, S. and Valkeila (2007).

Alili. Canonical decompositions of certain generalized Brownian bridges. Electron. Comm. Probab. 7 (2002), 27–36.

Gasbarra, S. and Valkeila. Gaussian bridges. Stochastic Analysis and Applications, Abel Symp. 2 (2007), 361–382.

S. and Yazigi. Generalized Gaussian Bridges. Preprint arXiv:1205.3405v1, May 2012.


Outline

1 Generalized (linearly conditioned) Bridges

2 Orthogonal Representation

3 Canonical Representation for Martingales

4 Prediction-Invertible Processes

5 Canonical Representation for Prediction-Invertible Processes

6 Open Questions


Generalized (linearly conditioned) Bridges

Let $X = (X_t)_{t\in[0,T]}$ be a continuous Gaussian process with positive definite covariance function $R$, mean function $m$ of bounded variation, and $X_0 = m(0)$. We consider the conditioning, or bridging, of $X$ on $N$ linear functionals $G_T = [G_T^i]_{i=1}^N$ of its paths:

$$G_T(X) = \int_0^T \mathbf{g}(t)\,\mathrm{d}X_t = \left[\int_0^T g_i(t)\,\mathrm{d}X_t\right]_{i=1}^N.$$

Remark (On Linear Independence of Conditionings)

We assume, without any loss of generality, that the functions $g_i$ are linearly independent. Indeed, if this is not the case, then the linearly dependent, or redundant, components of $\mathbf{g}$ can simply be removed from the conditioning (1) below without changing it.
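
As a concrete illustration (our addition, not from the slides), the conditioning functionals $G_T(X)$ can be approximated on a discretized path by Riemann–Stieltjes sums against the increments of $X$. A minimal numpy sketch, assuming standard Brownian motion as the example process and two hypothetical conditioning functions $g_1 \equiv 1$ (which recovers the endpoint) and $g_2(t) = t$:

```python
import numpy as np

rng = np.random.default_rng(0)
T, n = 1.0, 1000
t = np.linspace(0.0, T, n + 1)

# Example process: standard Brownian motion (a continuous Gaussian process).
dX = rng.normal(0.0, np.sqrt(T / n), size=n)
X = np.concatenate([[0.0], np.cumsum(dX)])

# Conditioning functions g_i; g1 = 1 recovers the endpoint X_T.
g = np.stack([np.ones(n), t[:-1]])           # shape (N, n)

# G_T(X) = [ int_0^T g_i(t) dX_t ]_{i=1}^N as Riemann-Stieltjes sums.
G_T = g @ dX
print(G_T, X[-1])                            # G_T[0] equals X_T exactly
```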


Generalized (linearly conditioned) Bridges

Informally, the generalized (Gaussian) bridge $X^{\mathbf{g};\mathbf{y}}$ is (the law of) the (Gaussian) process $X$ conditioned on the set

$$\left\{\int_0^T \mathbf{g}(t)\,\mathrm{d}X_t = \mathbf{y}\right\} = \bigcap_{i=1}^N \left\{\int_0^T g_i(t)\,\mathrm{d}X_t = y_i\right\}. \qquad (1)$$

The rigorous definition is given on the next slide.

Remark (Canonical Space Framework)

For the sake of convenience, we will work on the canonical filtered probability space $(\Omega, \mathcal{F}, \mathbb{F}, \mathbb{P})$, where $\Omega = C[0,T]$ and $\mathbb{P}$ corresponds to the Gaussian coordinate process $X_t(\omega) = \omega(t)$: $\mathbb{P} = \mathbf{P}[X \in \cdot\,]$. The filtration $\mathbb{F} = (\mathcal{F}_t)_{t\in[0,T]}$ is the intrinsic filtration of the coordinate process $X$, augmented with the null sets and made right-continuous, and $\mathcal{F} = \mathcal{F}_T$.


Generalized (linearly conditioned) Bridges

Definition (Generalized Gaussian Bridge)

The generalized bridge measure $\mathbb{P}^{\mathbf{g};\mathbf{y}}$ is the regular conditional law

$$\mathbb{P}^{\mathbf{g};\mathbf{y}} = \mathbb{P}^{\mathbf{g};\mathbf{y}}[X \in \cdot\,] = \mathbb{P}\left[X \in \cdot\ \middle|\ \int_0^T \mathbf{g}(t)\,\mathrm{d}X_t = \mathbf{y}\right].$$

A representation of the generalized Gaussian bridge is any process $X^{\mathbf{g};\mathbf{y}}$ satisfying

$$\mathbb{P}[X^{\mathbf{g};\mathbf{y}} \in \cdot\,] = \mathbb{P}^{\mathbf{g};\mathbf{y}}[X \in \cdot\,] = \mathbb{P}\left[X \in \cdot\ \middle|\ \int_0^T \mathbf{g}(t)\,\mathrm{d}X_t = \mathbf{y}\right].$$


Generalized (linearly conditioned) Bridges

Remark (Technical Observations)

1. Note that conditioning on the $\mathbb{P}$-null set is not a problem, since the canonical space of continuous processes is small enough to admit regular conditional laws.

2. As a measure, the generalized Gaussian bridge $\mathbb{P}^{\mathbf{g};\mathbf{y}}$ is unique, but it has several different representations $X^{\mathbf{g};\mathbf{y}}$. Indeed, for any representation of the bridge one can combine it with any $\mathbb{P}$-measure-preserving transformation to get a new representation.


Orthogonal Representation

Denote by $\langle\!\langle \mathbf{g}\rangle\!\rangle$ the matrix

$$\langle\!\langle \mathbf{g}\rangle\!\rangle_{ij} := \langle\!\langle g_i, g_j\rangle\!\rangle := \mathrm{Cov}\left[\int_0^T g_i(t)\,\mathrm{d}X_t,\ \int_0^T g_j(t)\,\mathrm{d}X_t\right].$$

Remark ($\langle\!\langle\cdot\rangle\!\rangle$ is invertible and depends on $\mathbf{g}$ and $R$ only)

1. Note that $\langle\!\langle \mathbf{g}\rangle\!\rangle$ depends neither on the mean of $X$ nor on the conditioned values $\mathbf{y}$: it depends only on the conditioning functions $\mathbf{g} = [g_i]_{i=1}^N$ and the covariance $R$.

2. Since the $g_i$'s are linearly independent and $R$ is positive definite, the matrix $\langle\!\langle \mathbf{g}\rangle\!\rangle$ is invertible.
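
To make the Gram matrix concrete (an addition, not from the slides): for standard Brownian motion the Itô isometry gives $\langle\!\langle g_i, g_j\rangle\!\rangle = \int_0^T g_i(t)\,g_j(t)\,\mathrm{d}t$, so $\langle\!\langle \mathbf{g}\rangle\!\rangle$ can be computed by quadrature and cross-checked by Monte Carlo. A sketch with the same hypothetical pair $g_1 \equiv 1$, $g_2(t) = t$:

```python
import numpy as np

rng = np.random.default_rng(1)
T, n, n_mc = 1.0, 1000, 20000
t = np.linspace(0.0, T, n + 1)[:-1]
g = np.stack([np.ones(n), t])                    # g1 = 1, g2(t) = t

# Ito isometry for Brownian motion: <<g_i, g_j>> = int_0^T g_i g_j dt.
gram = (g[:, None, :] * g[None, :, :]).sum(-1) * (T / n)
print(gram)         # approx [[1, 1/2], [1/2, 1/3]] for T = 1

# Monte Carlo cross-check: empirical covariance of the functionals.
dX = rng.normal(0.0, np.sqrt(T / n), size=(n_mc, n))
G = dX @ g.T                                     # samples of G_T(X)
print(np.cov(G.T))  # should be close to the quadrature answer
```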


Orthogonal Representation

Theorem (Orthogonal Representation)

The generalized Gaussian bridge $X^{\mathbf{g};\mathbf{y}}$ can be represented as

$$X^{\mathbf{g};\mathbf{y}}_t = X_t - \langle\!\langle \mathbf{1}_t, \mathbf{g}\rangle\!\rangle^\top \langle\!\langle \mathbf{g}\rangle\!\rangle^{-1} \left(\int_0^T \mathbf{g}(u)\,\mathrm{d}X_u - \mathbf{y}\right). \qquad (2)$$

Moreover, any generalized Gaussian bridge $X^{\mathbf{g};\mathbf{y}}$ is a Gaussian process with

$$\mathrm{E}\big[X^{\mathbf{g};\mathbf{y}}_t\big] = m(t) - \langle\!\langle \mathbf{1}_t, \mathbf{g}\rangle\!\rangle^\top \langle\!\langle \mathbf{g}\rangle\!\rangle^{-1} \left(\int_0^T \mathbf{g}(u)\,\mathrm{d}m(u) - \mathbf{y}\right),$$

$$\mathrm{Cov}\big[X^{\mathbf{g};\mathbf{y}}_t, X^{\mathbf{g};\mathbf{y}}_s\big] = \langle\!\langle \mathbf{1}_t, \mathbf{1}_s\rangle\!\rangle - \langle\!\langle \mathbf{1}_t, \mathbf{g}\rangle\!\rangle^\top \langle\!\langle \mathbf{g}\rangle\!\rangle^{-1} \langle\!\langle \mathbf{1}_s, \mathbf{g}\rangle\!\rangle.$$
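
For the classical case the formula can be checked by direct simulation (our addition, assuming Brownian motion, $N = 1$, $g \equiv 1$, $y = 0$): then $\langle\!\langle \mathbf{1}_t, g\rangle\!\rangle = t$, $\langle\!\langle g\rangle\!\rangle = T$, and (2) reduces to the familiar bridge $X_t - \frac{t}{T}X_T$ with covariance $s \wedge t - st/T$.

```python
import numpy as np

rng = np.random.default_rng(2)
T, n, n_mc = 1.0, 200, 50000
t = np.linspace(0.0, T, n + 1)

# Monte Carlo over Brownian paths, bridged by the orthogonal formula (2).
dX = rng.normal(0.0, np.sqrt(T / n), size=(n_mc, n))
X = np.concatenate([np.zeros((n_mc, 1)), np.cumsum(dX, axis=1)], axis=1)
bridge = X - (t / T) * X[:, -1:]          # X_t - <<1_t,g>> <<g>>^{-1} X_T

print(bridge[:, -1].max())                # endpoint pinned to 0 exactly
s_idx, t_idx = n // 4, n // 2             # check Cov = s∧t - s t / T
emp = np.mean(bridge[:, s_idx] * bridge[:, t_idx])
theo = t[s_idx] - t[s_idx] * t[t_idx] / T
print(emp, theo)
```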


Orthogonal Representation

Corollary (Mean-Conditioning Invariance)

Let $X$ be a centered Gaussian process with $X_0 = 0$ and let $m$ be a function of bounded variation. Let $X^{\mathbf{g}} := X^{\mathbf{g};\mathbf{0}}$ be a bridge where the conditioning functionals are conditioned to zero. Then

$$(X+m)^{\mathbf{g};\mathbf{y}}_t = X^{\mathbf{g}}_t + \left(m(t) - \langle\!\langle \mathbf{1}_t, \mathbf{g}\rangle\!\rangle^\top \langle\!\langle \mathbf{g}\rangle\!\rangle^{-1} \int_0^T \mathbf{g}(u)\,\mathrm{d}m(u)\right) + \langle\!\langle \mathbf{1}_t, \mathbf{g}\rangle\!\rangle^\top \langle\!\langle \mathbf{g}\rangle\!\rangle^{-1}\,\mathbf{y}.$$

Remark (Normalization)

Because of the corollary above we can, and will, assume in what follows that $m = 0$ and $\mathbf{y} = \mathbf{0}$.


Canonical Representation for Martingales

The problem with the orthogonal bridge representation (2) of $X^{\mathbf{g};\mathbf{y}}$ is that in order to construct it at any point $t \in [0,T)$ one needs the whole path of the underlying process $X$ up to time $T$. In this section and the following sections we construct a bridge representation that is canonical in the following sense:

Definition (Canonical Representation)

The bridge $X^{\mathbf{g};\mathbf{y}}$ is of canonical representation if, for all $t \in [0,T)$, $X^{\mathbf{g};\mathbf{y}}_t \in L_t(X)$ and $X_t \in L_t(X^{\mathbf{g};\mathbf{y}})$. Here $L_t(Y)$ is the closed linear subspace of $L^2(\Omega, \mathcal{F}, \mathbb{P})$ generated by the random variables $Y_s$, $s \le t$.


Canonical Representation for Martingales

Remark (Gaussian Specialities)

1. Since the conditional laws of Gaussian processes are Gaussian and Gaussian spaces are linear, the assumptions $X^{\mathbf{g};\mathbf{y}}_t \in L_t(X)$ and $X_t \in L_t(X^{\mathbf{g};\mathbf{y}})$ are the same as assuming that $X^{\mathbf{g};\mathbf{y}}_t$ is $\mathcal{F}^X_t$-measurable and $X_t$ is $\mathcal{F}^{X^{\mathbf{g};\mathbf{y}}}_t$-measurable (and, consequently, $\mathcal{F}^X_t = \mathcal{F}^{X^{\mathbf{g};\mathbf{y}}}_t$). This fact is very special to Gaussian processes. Indeed, in general conditioned processes such as generalized bridges are not linear transformations of the underlying process.

2. Another “Gaussian fact” here is that the bridges are Gaussian. In general conditioned processes do not belong to the same family of distributions as the original process.


Canonical Representation for Martingales

We shall require that the restricted measures $\mathbb{P}^{\mathbf{g};\mathbf{y}}_t := \mathbb{P}^{\mathbf{g};\mathbf{y}}|\mathcal{F}_t$ and $\mathbb{P}_t := \mathbb{P}|\mathcal{F}_t$ are equivalent for all $t < T$ (they are obviously singular for $t = T$). To this end we assume that the matrix

$$\langle\!\langle \mathbf{g}\rangle\!\rangle_{ij}(t) := \mathrm{E}\Big[\big(G^i_T(X) - G^i_t(X)\big)\big(G^j_T(X) - G^j_t(X)\big)\Big] = \mathrm{E}\left[\int_t^T g_i(s)\,\mathrm{d}X_s \int_t^T g_j(s)\,\mathrm{d}X_s\right]$$

is invertible for all $t < T$.

Remark (On Notation)

In the previous section we considered the matrix $\langle\!\langle \mathbf{g}\rangle\!\rangle$, but from now on we consider the function $\langle\!\langle \mathbf{g}\rangle\!\rangle(\cdot)$. Their connection is of course $\langle\!\langle \mathbf{g}\rangle\!\rangle = \langle\!\langle \mathbf{g}\rangle\!\rangle(0)$. We hope that this overloading of notation does not cause confusion to the reader.


Canonical Representation for Martingales

Let now $M$ be a Gaussian martingale with strictly increasing bracket $\langle M\rangle$ and $M_0 = 0$.

Remark (Bracket and Non-Degeneracy)

Note that the bracket is strictly increasing if and only if the covariance $R$ is positive definite. Indeed, for Gaussian martingales we have $R(t,s) = \mathrm{Var}(M_{t\wedge s}) = \langle M\rangle_{t\wedge s}$.

Define a Volterra kernel

$$\ell_{\mathbf{g}}(t,s) := -\mathbf{g}^\top(t)\,\langle\!\langle \mathbf{g}\rangle\!\rangle^{-1}(t)\,\mathbf{g}(s). \qquad (3)$$

Note that the kernel $\ell_{\mathbf{g}}$ depends on the process $M$ through its covariance $\langle\!\langle \cdot,\cdot\rangle\!\rangle$, and in the Gaussian martingale case we have

$$\langle\!\langle \mathbf{g}\rangle\!\rangle_{ij}(t) = \int_t^T g_i(s)\,g_j(s)\,\mathrm{d}\langle M\rangle_s.$$
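
A worked special case (added here for concreteness): for Brownian motion $M = W$ (so $\langle M\rangle_t = t$), $N = 1$ and $g \equiv 1$, the formulas above give

$$\langle\!\langle g\rangle\!\rangle(t) = \int_t^T \mathrm{d}s = T - t, \qquad \ell_g(t,s) = -\frac{1}{T-t},$$

so the kernel blows up as $t \to T$, reflecting the increasingly strong pull of the conditioning towards the endpoint.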


Canonical Representation for Martingales

The following lemma is the key observation in finding the canonical generalized bridge representation. Actually, it is a multivariate version of Proposition 6 of Gasbarra, S. and Valkeila (2007).

Lemma (Radon–Nikodym for Bridges)

Let $\ell_{\mathbf{g}}$ be given by (3) and let $M$ be a continuous Gaussian martingale with strictly increasing bracket $\langle M\rangle$ and $M_0 = 0$. Then

$$\log \frac{\mathrm{d}\mathbb{P}^{\mathbf{g}}_t}{\mathrm{d}\mathbb{P}_t} = \int_0^t \int_0^s \ell_{\mathbf{g}}(s,u)\,\mathrm{d}M_u\,\mathrm{d}M_s - \frac{1}{2}\int_0^t \left(\int_0^s \ell_{\mathbf{g}}(s,u)\,\mathrm{d}M_u\right)^{\!2} \mathrm{d}\langle M\rangle_s.$$
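
As a sanity check (our addition, not from the slides): in the Brownian case $g \equiv 1$ the inner integral is $\int_0^s \ell_g(s,u)\,\mathrm{d}M_u = -M_s/(T-s)$, and the likelihood ratio should integrate to one for $t < T$. A Monte Carlo sketch with a hypothetical Euler discretization:

```python
import numpy as np

rng = np.random.default_rng(3)
T, t_end, n, n_mc = 1.0, 0.5, 500, 20000
dt = t_end / n
s = np.linspace(0.0, t_end, n + 1)[:-1]

dM = rng.normal(0.0, np.sqrt(dt), size=(n_mc, n))
M = np.concatenate([np.zeros((n_mc, 1)), np.cumsum(dM, axis=1)], axis=1)

theta = -M[:, :-1] / (T - s)            # int_0^s l_g(s,u) dM_u for g = 1
log_Z = (theta * dM).sum(1) - 0.5 * (theta**2 * dt).sum(1)

print(np.exp(log_Z).mean())             # should be close to 1
```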


Canonical Representation for Martingales

Proof.

Let $p(\,\cdot\,; \boldsymbol{\mu}, \Sigma)$ be the Gaussian density on $\mathbb{R}^N$ and let

$$\alpha^{\mathbf{g}}_t(\mathrm{d}\mathbf{y}) := \mathbb{P}\big[G_T(M) \in \mathrm{d}\mathbf{y} \,\big|\, \mathcal{F}^M_t\big].$$

By Bayes' formula and the martingale property,

$$\frac{\mathrm{d}\mathbb{P}^{\mathbf{g}}_t}{\mathrm{d}\mathbb{P}_t} = \frac{\mathrm{d}\alpha^{\mathbf{g}}_t}{\mathrm{d}\alpha^{\mathbf{g}}_0}(\mathbf{0}) = \frac{p\big(\mathbf{0};\, G_t(M),\, \langle\!\langle \mathbf{g}\rangle\!\rangle(t)\big)}{p\big(\mathbf{0};\, G_0(M),\, \langle\!\langle \mathbf{g}\rangle\!\rangle(0)\big)}.$$

Denote $F(t, M_t) := -\frac{1}{2}\, G_t^\top \langle\!\langle \mathbf{g}\rangle\!\rangle^{-1}(t)\, G_t$ and the claim follows by using the Itô formula.


Canonical Representation for Martingales

Corollary (Semimartingale Decomposition)

The canonical bridge representation $M^{\mathbf{g}}$ satisfies the stochastic differential equation

$$\mathrm{d}M_t = \mathrm{d}M^{\mathbf{g}}_t - \int_0^t \ell_{\mathbf{g}}(t,s)\,\mathrm{d}M^{\mathbf{g}}_s\,\mathrm{d}\langle M\rangle_t, \qquad (4)$$

where $\ell_{\mathbf{g}}$ is given by (3). Moreover, $\langle M\rangle = \langle M^{\mathbf{g}}\rangle$.

Proof.

The claim follows by using Girsanov's theorem.
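
Rearranged, (4) says $\mathrm{d}M^{\mathbf{g}}_t = \mathrm{d}M_t + \big(\int_0^t \ell_{\mathbf{g}}(t,s)\,\mathrm{d}M^{\mathbf{g}}_s\big)\,\mathrm{d}\langle M\rangle_t$, which can be simulated forward in time. In the worked Brownian case $g \equiv 1$ the drift term is $-M^{\mathbf{g}}_t/(T-t)\,\mathrm{d}t$, i.e. the classical Brownian bridge SDE. A minimal Euler sketch (our addition, with a hypothetical grid):

```python
import numpy as np

rng = np.random.default_rng(4)
T, n = 1.0, 1000
dt = T / n
t = np.linspace(0.0, T, n + 1)

# Euler scheme for dM^g_t = dM_t - M^g_t / (T - t) dt  (g = 1, M = BM).
dM = rng.normal(0.0, np.sqrt(dt), size=n)
Mg = np.zeros(n + 1)
for k in range(n):
    Mg[k + 1] = Mg[k] + dM[k] - Mg[k] / (T - t[k]) * dt

print(Mg[-1])   # close to 0: the path is pinned at time T
```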


Canonical Representation for Martingales

Remark (Equivalence and Singularity)

Note that

$$\int_0^{T-} \int_0^s \ell_{\mathbf{g}}(s,u)^2\,\mathrm{d}u\,\mathrm{d}s < \infty.$$

In view of (4) this means that $M$ and $M^{\mathbf{g}}$ are equivalent on $[0,T)$. Indeed, equation (4) can be viewed as the Hitsuda representation between two equivalent Gaussian processes.

Also note that

$$\int_0^{T} \int_0^s \ell_{\mathbf{g}}(s,u)^2\,\mathrm{d}u\,\mathrm{d}s = \infty,$$

meaning, by the Hitsuda representation theorem, that $M$ and $M^{\mathbf{g}}$ are singular on $[0,T]$.


Canonical Representation for Martingales

Next we solve the stochastic differential equation (4) of Corollary 6. In general, solving a Volterra–Stieltjes equation like (4) in a closed form is difficult.

Of course, the general theory of Volterra equations suggests that the solution will be of the form (6) of the next theorem, where $\ell^*_{\mathbf{g}}$ is the resolvent kernel of $\ell_{\mathbf{g}}$ determined by the resolvent equation (7) given below.

Also, the general theory suggests that the resolvent kernel can be calculated implicitly by using the Neumann series. In our case the kernel $\ell_{\mathbf{g}}$ is a quadratic form that factorizes in its arguments. This allows us to calculate the resolvent $\ell^*_{\mathbf{g}}$ explicitly as (5) below.


Canonical Representation for Martingales

Theorem (Solution to an SDE)

Let $s \le t \in [0,T]$. Define the Volterra kernel

$$\ell^*_{\mathbf{g}}(t,s) := -\ell_{\mathbf{g}}(t,s)\,\frac{|\langle\!\langle \mathbf{g}\rangle\!\rangle|(t)}{|\langle\!\langle \mathbf{g}\rangle\!\rangle|(s)} = |\langle\!\langle \mathbf{g}\rangle\!\rangle|(t)\,\mathbf{g}^\top(t)\,\langle\!\langle \mathbf{g}\rangle\!\rangle^{-1}(t)\,\frac{\mathbf{g}(s)}{|\langle\!\langle \mathbf{g}\rangle\!\rangle|(s)}, \qquad (5)$$

where $|\langle\!\langle \mathbf{g}\rangle\!\rangle|$ denotes the determinant. Then the bridge $M^{\mathbf{g}}$ has the canonical representation

$$\mathrm{d}M^{\mathbf{g}}_t = \mathrm{d}M_t - \int_0^t \ell^*_{\mathbf{g}}(t,s)\,\mathrm{d}M_s\,\mathrm{d}\langle M\rangle_t, \qquad (6)$$

i.e., (6) is the solution to (4).
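
In the worked case $g \equiv 1$, $M$ Brownian, (5) gives $\ell^*_g(t,s) = (T-t)\cdot\frac{1}{T-t}\cdot\frac{1}{T-s} = \frac{1}{T-s}$, and integrating (6) yields the familiar canonical bridge $M^g_t = (T-t)\int_0^t \frac{\mathrm{d}M_s}{T-s}$. A quick numerical check (our addition):

```python
import numpy as np

rng = np.random.default_rng(5)
T, n = 1.0, 2000
dt = T / n
s = np.linspace(0.0, T, n + 1)[:-1]

dM = rng.normal(0.0, np.sqrt(dt), size=n)

# Canonical representation (6) integrated:
#   M^g_t = (T - t) * int_0^t dM_s / (T - s).
I = np.concatenate([[0.0], np.cumsum(dM / (T - s))])
t = np.linspace(0.0, T, n + 1)
Mg = (T - t) * I

print(Mg[-1])                    # exactly 0 at t = T
print(np.abs(Mg).max())          # a nontrivial bridge path in between
```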


Canonical Representation for Martingales

Proof.

Equation (6) is the solution to (4) if the kernel $\ell^*_{\mathbf{g}}$ satisfies the resolvent equation

$$\ell_{\mathbf{g}}(t,s) + \ell^*_{\mathbf{g}}(t,s) = \int_s^t \ell_{\mathbf{g}}(t,u)\,\ell^*_{\mathbf{g}}(u,s)\,\mathrm{d}\langle M\rangle_u. \qquad (7)$$

Indeed, suppose (6) is the solution to (4). This means that

$$\mathrm{d}M_t = \left(\mathrm{d}M_t - \int_0^t \ell^*_{\mathbf{g}}(t,s)\,\mathrm{d}M_s\,\mathrm{d}\langle M\rangle_t\right) - \int_0^t \ell_{\mathbf{g}}(t,s)\left(\mathrm{d}M_s - \int_0^s \ell^*_{\mathbf{g}}(s,u)\,\mathrm{d}M_u\,\mathrm{d}\langle M\rangle_s\right)\mathrm{d}\langle M\rangle_t,$$



In integral form, by using Fubini's theorem, this means that

$$M_t = M_t - \int_0^t\!\int_s^t \ell^*_{\mathbf{g}}(u,s)\,\mathrm{d}\langle M\rangle_u\,\mathrm{d}M_s - \int_0^t\!\int_s^t \ell_{\mathbf{g}}(u,s)\,\mathrm{d}\langle M\rangle_u\,\mathrm{d}M_s + \int_0^t\!\int_s^t\!\int_s^u \ell_{\mathbf{g}}(u,v)\,\ell^*_{\mathbf{g}}(v,s)\,\mathrm{d}\langle M\rangle_v\,\mathrm{d}\langle M\rangle_u\,\mathrm{d}M_s.$$

The resolvent criterion (7) follows by identifying the integrands in the $\mathrm{d}\langle M\rangle_u\,\mathrm{d}M_s$-integrals above.

Now it is straightforward, but tedious, to check that the resolvent equation holds. We omit the details.
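
For the worked case $g \equiv 1$, $M$ Brownian, the check is immediate (our addition): with $\ell_g(t,s) = -\frac{1}{T-t}$ and $\ell^*_g(t,s) = \frac{1}{T-s}$,

$$\ell_g(t,s) + \ell^*_g(t,s) = -\frac{1}{T-t} + \frac{1}{T-s} = \frac{s-t}{(T-t)(T-s)} = \int_s^t \left(-\frac{1}{T-t}\right)\frac{1}{T-s}\,\mathrm{d}u = \int_s^t \ell_g(t,u)\,\ell^*_g(u,s)\,\mathrm{d}u,$$

since here $\ell_g(t,u)$ and $\ell^*_g(u,s)$ do not depend on the integration variable $u$.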


Prediction-Invertible Processes

Let us now consider a Gaussian process $X$ that is not a martingale. For a (Gaussian) process $X$ its prediction martingale is the process $\hat X$ defined as

$$\hat X_t = \mathrm{E}\big[X_T \,\big|\, \mathcal{F}^X_t\big].$$

Since for Gaussian processes $\hat X_t \in L_t(X)$, we may write, at least formally, that

$$\hat X_t = \int_0^t p(t,s)\,\mathrm{d}X_s,$$

where the abstract kernel $p$ depends also on $T$ (since $\hat X$ depends on $T$).

In the next definition we assume, among other things, that the kernel $p$ exists as a real, and not only formal, function. We also assume that the kernel $p$ is invertible.


Prediction-Invertible Processes

Definition (Prediction-Invertibility)

A Gaussian process $X$ is prediction-invertible if there exists a kernel $p$ such that its prediction martingale $\hat X$ is continuous, can be represented as

$$\hat X_t = \int_0^t p(t,s)\,\mathrm{d}X_s,$$

and there exists an inverse kernel $p^{-1}$ such that, for all $t \in [0,T]$, $p^{-1}(t,\cdot) \in L^2([0,T], \mathrm{d}\langle\hat X\rangle)$ and $X$ can be recovered from $\hat X$ by

$$X_t = \int_0^t p^{-1}(t,s)\,\mathrm{d}\hat X_s.$$
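
A trivial but instructive example (our addition): if $X = M$ is itself a continuous Gaussian martingale, then $\hat X_t = \mathrm{E}[M_T \,|\, \mathcal{F}^M_t] = M_t$, so $M$ is prediction-invertible with $p(t,s) = p^{-1}(t,s) = \mathbf{1}_{[0,t)}(s)$, and the canonical representation of the previous section applies directly.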


Prediction-Invertible Processes

Remark

In general it seems to be a difficult problem to determine whether a Gaussian process is prediction-invertible or not. In the discrete-time non-degenerate case all Gaussian processes are prediction-invertible. In continuous time the situation is more difficult, as the next example illustrates.

Nevertheless, we can immediately see that we must have

$$R(t,s) = \int_0^{t\wedge s} p^{-1}(t,u)\,p^{-1}(s,u)\,\mathrm{d}\langle\hat X\rangle_u,$$

where

$$\langle\hat X\rangle_u = \mathrm{Var}\big(\mathrm{E}[X_T \,|\, \mathcal{F}_u]\big).$$

However, this criterion does not seem to be very helpful in practice.
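
To see why the discrete-time claim above holds (our illustration, not from the slides): on a grid, $\hat X_k = \mathrm{E}[X_n \mid X_1, \dots, X_k]$ is a linear map of the observations, given by a lower-triangular matrix, and $X$ can be recovered whenever that matrix is invertible. A sketch using Gaussian conditioning, assuming a fractional-Brownian-motion covariance with $H = 0.7$ as the example:

```python
import numpy as np

# Discrete-time illustration: prediction martingale of fractional
# Brownian motion (H = 0.7) on a grid, and recovery of X from it.
H, n = 0.7, 10
s = np.arange(1, n + 1, dtype=float)
R = 0.5 * (s[:, None]**(2*H) + s[None, :]**(2*H)
           - np.abs(s[:, None] - s[None, :])**(2*H))

# Row k of P maps (X_1, ..., X_n) to hatX_k = E[X_n | X_1, ..., X_k].
P = np.zeros((n, n))
for k in range(1, n + 1):
    P[k - 1, :k] = np.linalg.solve(R[:k, :k], R[:k, -1])

rng = np.random.default_rng(6)
X = np.linalg.cholesky(R) @ rng.normal(size=n)
hatX = P @ X

# Prediction-invertibility in discrete time: P is lower-triangular
# and (here) invertible, so X is recovered from hatX exactly.
print(np.allclose(np.linalg.solve(P, hatX), X))   # True
```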


Prediction-Invertible Processes

Example (Fine Points of Prediction-Invertibility)

Consider the Gaussian slope $X_t = t\xi$, $t \in [0,T]$, where $\xi$ is a standard normal random variable. Now, if we consider the “raw filtration” $\mathcal{G}^X_t = \sigma(X_s;\ s \le t)$, then $X$ is not prediction-invertible. Indeed, then $\hat X_0 = 0$ but $\hat X_t = X_T$ if $t \in (0,T]$. So, $\hat X$ is not continuous. On the other hand, the augmented filtration is simply $\mathcal{F}^X_t = \sigma(\xi)$ for all $t \in [0,T]$. So, $\hat X \equiv X_T$. Note, however, that in both cases the slope $X$ can be recovered from the prediction martingale: $X_t = \frac{t}{T}\,\hat X_t$.


Prediction-Invertible Processes

We want to represent integrals with respect to $X$ as integrals with respect to $\hat X$, and vice versa.

Definition (Linear Extensions of Kernels)

Let $X$ be prediction-invertible. Let $\mathrm{P}$ and $\mathrm{P}^{-1}$ extend the relations $\mathrm{P}[\mathbf{1}_t] = p(t,\cdot)$ and $\mathrm{P}^{-1}[\mathbf{1}_t] = p^{-1}(t,\cdot)$ linearly.

Lemma (Abstract vs. Concrete Wiener Integrals)

For $f$ and $g$ nice enough,

$$\int_0^T f(t)\,\mathrm{d}X_t = \int_0^T \mathrm{P}^{-1}[f](t)\,\mathrm{d}\hat X_t, \qquad \int_0^T g(t)\,\mathrm{d}\hat X_t = \int_0^T \mathrm{P}[g](t)\,\mathrm{d}X_t.$$


Canonical Representation for Prediction-Invertible Processes

The next (our main) theorem follows basically by rewriting

$$\int_0^T \mathbf{g}(t)\,\mathrm{d}X_t = \int_0^T \mathrm{P}^{-1}[\mathbf{g}](t)\,\mathrm{d}\hat X_t = \mathbf{0}.$$

Theorem (Canonical Representation)

Let $X$ be a prediction-invertible Gaussian process. Then

$$X^{\mathbf{g}}_t = X_t - \int_0^t \int_s^t p^{-1}(t,u)\,\mathrm{P}\big[\hat\ell^*_{\hat{\mathbf{g}}}(u,\cdot)\big](s)\,\mathrm{d}\langle\hat X\rangle_u\,\mathrm{d}X_s,$$

where $\hat{\mathbf{g}} = \mathrm{P}^{-1}[\mathbf{g}]$ and

$$\hat\ell^*_{\hat{\mathbf{g}}}(u,v) = |\langle\!\langle \hat{\mathbf{g}}\rangle\!\rangle_{\hat X}|(u)\,\hat{\mathbf{g}}^\top(u)\,\langle\!\langle \hat{\mathbf{g}}\rangle\!\rangle_{\hat X}^{-1}(u)\,\frac{\hat{\mathbf{g}}(v)}{|\langle\!\langle \hat{\mathbf{g}}\rangle\!\rangle_{\hat X}|(v)},$$

with $\langle\!\langle \cdot\rangle\!\rangle_{\hat X}$ computed with respect to the prediction martingale $\hat X$.


Canonical Representation for Prediction-Invertible Processes

Proof.

On the prediction-martingale level we have

$$\mathrm{d}\hat X^{\hat{\mathbf{g}}}_s = \mathrm{d}\hat X_s - \int_0^s \hat\ell^*_{\hat{\mathbf{g}}}(s,u)\,\mathrm{d}\hat X_u\,\mathrm{d}\langle\hat X\rangle_s.$$

Now, by operating with $p^{-1}$ and $\mathrm{P}$ we get

$$X^{\mathbf{g}}_t = X_t - \int_0^t p^{-1}(t,s)\left(\int_0^s \hat\ell^*_{\hat{\mathbf{g}}}(s,u)\,\mathrm{d}\hat X_u\right)\mathrm{d}\langle\hat X\rangle_s = X_t - \int_0^t p^{-1}(t,s)\int_0^s \mathrm{P}\big[\hat\ell^*_{\hat{\mathbf{g}}}(s,\cdot)\big](u)\,\mathrm{d}X_u\,\mathrm{d}\langle\hat X\rangle_s.$$

Finally, the claim follows by using Fubini's theorem.


Open Questions

1. What kind of condition (for the covariance) would imply prediction-invertibility?

2. How to calculate the prediction-inversion kernel $p^{-1}(t,s)$?

3. What would be an example of a continuous Gaussian process that is not prediction-invertible?

4. What would ensure that $L_s(X) \subsetneq L_t(X)$ for $s < t$, $L_s(X) = \bigcap_{t>s} L_t(X)$, and $L_t(X) = L_t(X^{\mathbf{g}})$? Would this be enough for prediction-invertibility?

– The End –
Thank you for your attention!
