
1

Introduction to Stochastic Models (GSLM 54100)

2

Outline

discrete-time Markov chain
  motivation
  examples
  transient behavior

3

Motivation

What happens if Xn’s are dependent?

many dependent systems, e.g., inventory across periods

state of a machine

customers unserved in a distribution system

[Figure: the state of a machine (excellent, good, fair, bad) evolving over time]

4

Motivation

any nice limiting results for dependent Xn’s?

no such result for general dependent Xn’s

nice results when the Xn's form a discrete-time Markov chain, e.g., the limits

  (1/N) Σ_{n=1}^{N} Xn → ?
  (1/N) Σ_{n=1}^{N} 1{Xn = s} → ?    as N → ∞

5

Discrete-Time, Discrete-State Stochastic Process

a stochastic process: a sequence of indexed random variables, e.g., {Xn}, {X(t)}

a discrete-time stochastic process: {Xn}

a discrete-state stochastic process, e.g., state ∈ {excellent, good, fair, bad}

set of states: {e, g, f, b}, which can be relabeled as {1, 2, 3, 4} or {0, 1, 2, 3}

states to describe the weather: {windy, rainy, cloudy, sunny}

6

Markov Property

a discrete-time, discrete-state stochastic process possesses the Markov property if

  P{Xn+1 = j | Xn = i, Xn−1 = in−1, . . . , X1 = i1, X0 = i0} = pij

for all i0, i1, …, in−1, i, j and all n ≥ 0

time frame: present = n, future = n + 1, past = {i0, i1, …, in−1}

meaning of the statement: given the present, the past and the future are conditionally independent

(without conditioning on the present, the past and the future are in general dependent)
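A minimal simulation sketch of the Markov property (assuming Python with numpy, which the slides do not specify; the 3-state matrix reuses the numbers of Example 4-3 later in these slides): the next state is sampled using only the current state, so the rest of the history never enters the computation.

```python
import numpy as np

rng = np.random.default_rng(0)

# one-step transition matrix on states {0, 1, 2}
# (the numbers are those of Example 4-3 later in these slides)
P = np.array([[0.5, 0.4, 0.1],
              [0.3, 0.4, 0.3],
              [0.2, 0.3, 0.5]])

def step(current_state, P, rng):
    """Sample X_{n+1} given X_n = current_state.

    Only the current state (row P[current_state]) is used; the rest of
    the history is irrelevant -- the Markov property in simulation form.
    """
    return rng.choice(len(P), p=P[current_state])

# simulate X_0, X_1, ..., X_10 starting from state 0
path = [0]
for _ in range(10):
    path.append(step(path[-1], P, rng))
print(path)
```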

7

One-Step Transition Probability Matrix

pij ≥ 0 for all i, j ≥ 0;   Σj pij = 1,  i = 0, 1, 2, …

P =
  [ p00  p01  p02  … ]
  [ p10  p11  p12  … ]
  [  ⋮    ⋮    ⋮      ]
  [ pi0  pi1  pi2  … ]
  [  ⋮    ⋮    ⋮      ]
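A small sketch (Python with numpy; the function name is illustrative, not from the slides) that checks the two defining conditions of a one-step transition probability matrix, non-negative entries and rows summing to one:

```python
import numpy as np

def is_transition_matrix(P, tol=1e-9):
    """Check the two defining conditions: pij >= 0 and each row sums to 1."""
    P = np.asarray(P, dtype=float)
    return bool(np.all(P >= 0)) and np.allclose(P.sum(axis=1), 1.0, atol=tol)

# the 3-state mood matrix of Example 4-3 below
P_mood = [[0.5, 0.4, 0.1],
          [0.3, 0.4, 0.3],
          [0.2, 0.3, 0.5]]
print(is_transition_matrix(P_mood))   # True
```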

8

Example 4-1 Forecasting the Weather

state ∈ {rain, not rain}

dynamics of the system:
  rains today ⇒ rains tomorrow w.p. α
  does not rain today ⇒ rains tomorrow w.p. β

weather of the system across the days: {Xn}

P =
  [ α   1−α ]
  [ β   1−β ]
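A hedged sketch for this example (Python with numpy; the numerical values of α and β are placeholders, since the slide leaves them symbolic):

```python
import numpy as np

def weather_matrix(alpha, beta):
    """Two-state chain of Example 4-1: state 0 = rain, state 1 = no rain.

    alpha = P(rain tomorrow | rain today)
    beta  = P(rain tomorrow | no rain today)
    """
    return np.array([[alpha, 1 - alpha],
                     [beta,  1 - beta]])

# alpha and beta are placeholder values: the slide leaves them symbolic
print(weather_matrix(alpha=0.7, beta=0.4))
```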

9

Example 4-2 A Communication System

digital signals take values 0 and 1; a signal remains unchanged with probability p on passing through a stage, independent of everything else

state = value of the signal ∈ {0, 1}

Xn: value of the signal before entering the nth stage

P =
  [ p    1−p ]
  [ 1−p  p   ]
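A small simulation sketch (Python with numpy; the values of p and the number of stages are placeholders I chose for illustration) estimating the probability that a signal entering as a 0 is still a 0 after several stages:

```python
import numpy as np

rng = np.random.default_rng(1)
p = 0.9          # placeholder: probability a stage leaves the signal unchanged
n_stages = 5     # placeholder number of stages

def transmit(x0, n_stages, p, rng):
    """Pass a bit through n_stages stages; each stage flips it w.p. 1 - p."""
    x = x0
    for _ in range(n_stages):
        if rng.random() > p:     # flipped with probability 1 - p
            x = 1 - x
    return x

# estimate P(signal still 0 after n_stages stages | it entered as 0)
trials = 100_000
hits = sum(transmit(0, n_stages, p, rng) == 0 for _ in range(trials))
print(hits / trials)
```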

10

Example 4-3 The Mood of a Person

mood ∈ {cheerful (C), so-so (S), glum (G)}
  cheerful today ⇒ C, S, or G tomorrow w.p. 0.5, 0.4, 0.1
  so-so today ⇒ C, S, or G tomorrow w.p. 0.3, 0.4, 0.3
  glum today ⇒ C, S, or G tomorrow w.p. 0.2, 0.3, 0.5

Xn: mood on the nth day, mood ∈ {C, S, G}

{Xn}: a 3-state Markov chain (state 0 = C, state 1 = S, state 2 = G)

P =
  [ 0.5  0.4  0.1 ]
  [ 0.3  0.4  0.3 ]
  [ 0.2  0.3  0.5 ]
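A simulation sketch for this chain (Python with numpy; the number of simulated days and the starting mood are arbitrary choices of mine) that records how often each mood occurs along one long sample path:

```python
import numpy as np

rng = np.random.default_rng(2)

# state 0 = C (cheerful), 1 = S (so-so), 2 = G (glum)
P = np.array([[0.5, 0.4, 0.1],
              [0.3, 0.4, 0.3],
              [0.2, 0.3, 0.5]])

# simulate the mood day by day and record how often each mood occurs
n_days = 100_000
state = 0                       # arbitrary starting mood
counts = np.zeros(3)
for _ in range(n_days):
    counts[state] += 1
    state = rng.choice(3, p=P[state])

print(counts / n_days)          # observed fraction of days spent in C, S, G
```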

11

Example 4.4 Transforming a Process into a DTMC

whether it rains tomorrow depends on the weather conditions of the last two days:
  rained for the past two days ⇒ will rain tomorrow w.p. 0.7
  rained today but not yesterday ⇒ will rain tomorrow w.p. 0.5
  rained yesterday but not today ⇒ will rain tomorrow w.p. 0.4
  did not rain in the past two days ⇒ will rain tomorrow w.p. 0.2

12

Example 4.4 Transforming a Process into a DTMC

state:
  0 if it rained both today and yesterday
  1 if it rained today but not yesterday
  2 if it rained yesterday but not today
  3 if it rained neither yesterday nor today

P =
  [ 0.7  0    0.3  0   ]
  [ 0.5  0    0.5  0   ]
  [ 0    0.4  0    0.6 ]
  [ 0    0.2  0    0.8 ]
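A sketch (Python with numpy; the function name is illustrative) that assembles this 4-state matrix from the four rain probabilities given on the previous slide, making explicit which transitions are impossible under the two-day state encoding:

```python
import numpy as np

def two_day_weather_matrix(p0=0.7, p1=0.5, p2=0.4, p3=0.2):
    """4-state DTMC of Example 4.4; p0..p3 = P(rain tomorrow | state 0..3).

    From state 0 or 1 (it rained today) tomorrow's state is 0 (rain) or 2
    (no rain); from state 2 or 3 (no rain today) it is 1 (rain) or 3 (no rain).
    """
    return np.array([
        [p0, 0.0, 1 - p0, 0.0],
        [p1, 0.0, 1 - p1, 0.0],
        [0.0, p2, 0.0, 1 - p2],
        [0.0, p3, 0.0, 1 - p3],
    ])

print(two_day_weather_matrix())
```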

13

Example 4.5 A Random Walk Model

a discrete-time Markov chain with state space {…, −2, −1, 0, 1, 2, …}

random walk: for 0 < p < 1,

pi,i+1 = p = 1 − pi,i−1,  i = 0, ±1, ±2, . . .
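A sketch of one random-walk sample path (Python with numpy; the step-up probability, the number of steps, and starting the walk at 0 are placeholder choices of mine):

```python
import numpy as np

rng = np.random.default_rng(3)
p = 0.5          # placeholder step-up probability
n_steps = 20

# X_{n+1} = X_n + 1 w.p. p, and X_n - 1 w.p. 1 - p, starting from X_0 = 0
steps = rng.choice([1, -1], size=n_steps, p=[p, 1 - p])
path = np.concatenate(([0], np.cumsum(steps)))
print(path)
```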

14

Example 4.6 A Gambling Model

each play of the game: the gambler gains $1 w.p. p and loses $1 otherwise

end of the game: the gambler is either broke or has accumulated $N

transition probabilities:
  pi,i+1 = p = 1 − pi,i−1,  i = 1, 2, . . . , N − 1;  p00 = pNN = 1

example for N = 4

state: Xn, the gambler's fortune after the nth play, Xn ∈ {0, 1, 2, 3, 4}

P =
  [ 1    0    0    0    0 ]
  [ 1−p  0    p    0    0 ]
  [ 0    1−p  0    p    0 ]
  [ 0    0    1−p  0    p ]
  [ 0    0    0    0    1 ]
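A simulation sketch of this gambling model (Python with numpy; the win probability, target N, and starting fortune are placeholder values) estimating the probability the gambler reaches $N before going broke:

```python
import numpy as np

rng = np.random.default_rng(4)
p, N, start = 0.5, 4, 2    # placeholder win probability, target, initial fortune

def play_until_absorbed(start, N, p, rng):
    """Play until the fortune reaches 0 (broke) or N; return the final fortune."""
    x = start
    while 0 < x < N:
        x += 1 if rng.random() < p else -1
    return x

# estimate P(the gambler reaches $N before going broke)
trials = 100_000
wins = sum(play_until_absorbed(start, N, p, rng) == N for _ in range(trials))
print(wins / trials)
```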

15

Example 4.7

the insurance premium paid in a year depends on the number of claims made in the previous year

16

Example 4.7

# of claims in a year ~ Poisson(λ):

  ak = e^{−λ} λ^k / k!,  k = 0, 1, 2, …

P =
  [ a0   a1   a2   1−a0−a1−a2 ]
  [ a0   0    a1   1−a0−a1    ]
  [ 0    a0   0    1−a0       ]
  [ 0    0    a0   1−a0       ]
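A sketch (Python with numpy; λ is a placeholder value, and the function names are mine, not from the slides) that computes the ak's from the Poisson(λ) pmf and assembles the transition matrix shown above:

```python
import numpy as np
from math import exp, factorial

def claim_probs(lam, k_max=2):
    """a_k = e^{-lam} * lam^k / k! for k = 0, ..., k_max."""
    return [exp(-lam) * lam**k / factorial(k) for k in range(k_max + 1)]

def premium_matrix(lam):
    a0, a1, a2 = claim_probs(lam)
    return np.array([
        [a0,  a1,  a2,  1 - a0 - a1 - a2],
        [a0,  0.0, a1,  1 - a0 - a1],
        [0.0, a0,  0.0, 1 - a0],
        [0.0, 0.0, a0,  1 - a0],
    ])

P = premium_matrix(lam=0.5)    # placeholder claim rate lambda
print(P)
print(P.sum(axis=1))           # each row sums to 1
```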

17

Transient Behavior

{Xn} for the weather condition:
  0 if it rained both today and yesterday
  1 if it rained today but not yesterday
  2 if it rained yesterday but not today
  3 if it rained neither yesterday nor today

suppose it rained yesterday but not today; what is the weather forecast for tomorrow? for 10 days from now? (a numerical sketch follows the matrix below)

P =
  [ 0.7  0    0.3  0   ]
  [ 0.5  0    0.5  0   ]
  [ 0    0.4  0    0.6 ]
  [ 0    0.2  0    0.8 ]
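A sketch answering the question on this slide (Python with numpy; np.linalg.matrix_power is used for P raised to the 10th power): starting in state 2 (rained yesterday but not today), the forecast for tomorrow is row 2 of P and the forecast for 10 days from now is row 2 of P^10; the probability of rain is the mass on states 0 and 1, the states in which it rains "today".

```python
import numpy as np

P = np.array([[0.7, 0.0, 0.3, 0.0],
              [0.5, 0.0, 0.5, 0.0],
              [0.0, 0.4, 0.0, 0.6],
              [0.0, 0.2, 0.0, 0.8]])

start = 2                                    # rained yesterday but not today

one_day = P[start]                           # state distribution tomorrow
ten_days = np.linalg.matrix_power(P, 10)[start]

# it rains "tomorrow" exactly when tomorrow's state is 0 or 1
print("P(rain tomorrow)         =", one_day[0] + one_day[1])
print("P(rain 10 days from now) =", ten_days[0] + ten_days[1])
```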

18

m-Step Transition Probability Matrix

one-step transition probability matrix, P = [pij], where pij = P(X1 = j|X0 = i)

m-step transition probability matrix: P(m) = [pij(m)], where

  pij(m) = P(Xm = j | X0 = i)

claim: P(m) = Pm
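A numerical check of the claim (Python with numpy; the chain is the 3-state weather chain of the next slide, and m, i, j, and the number of trials are illustrative choices): a Monte Carlo estimate of pij(m) should agree with the (i, j) entry of the matrix power Pm.

```python
import numpy as np

rng = np.random.default_rng(5)

# 3-state weather chain of the next slide: r = 0, c = 1, s = 2
P = np.array([[0.5, 0.4, 0.1],
              [0.3, 0.4, 0.3],
              [0.2, 0.3, 0.5]])
m, i, j = 2, 1, 0               # target: p_cr^(2) = P(X_2 = r | X_0 = c)

# Monte Carlo estimate of p_ij^(m)
trials = 200_000
hits = 0
for _ in range(trials):
    state = i
    for _ in range(m):
        state = rng.choice(3, p=P[state])
    hits += (state == j)

print(hits / trials)                          # simulated p_ij^(m)
print(np.linalg.matrix_power(P, m)[i, j])     # (P^m)_ij, here 0.33
```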

19

m-Step Transition Probability Matrix

Markov chain {Xn} for weather

Xn ∈ {r, c, s}, where r = rainy, c = cloudy, s = sunny

P =
  [ 0.5  0.4  0.1 ]
  [ 0.3  0.4  0.3 ]
  [ 0.2  0.3  0.5 ]

(rows and columns ordered r, c, s)

[Figure: two-step transition tree from X0 = c at n = 0, through X1 ∈ {r, c, s} at n = 1 (branch probabilities pcr, pcc, pcs), to X2 = r at n = 2 (branch probabilities prr, pcr, psr)]

pcr(2) = pcr prr + pcc pcr + pcs psr

20

m-Step Transition Probability Matrix (2)

pcr(2) = P(X2 = r | X0 = c)
       = P(X2 = r, X1 = r | X0 = c) + P(X2 = r, X1 = c | X0 = c) + P(X2 = r, X1 = s | X0 = c)
       = P(X2 = r | X1 = r, X0 = c) P(X1 = r | X0 = c)
         + P(X2 = r | X1 = c, X0 = c) P(X1 = c | X0 = c)
         + P(X2 = r | X1 = s, X0 = c) P(X1 = s | X0 = c)
       = P(X2 = r | X1 = r) P(X1 = r | X0 = c)          (by the Markov property)
         + P(X2 = r | X1 = c) P(X1 = c | X0 = c)
         + P(X2 = r | X1 = s) P(X1 = s | X0 = c)
       = prr pcr + pcr pcc + psr pcs
       = pcr prr + pcc pcr + pcs psr

claim: (P2)cr = (PP)cr = pcr prr + pcc pcr + pcs psr = pcr(2)

in general, (P2)ij = Σk pik pkj = pij(2), so P2 = P(2), and Pm = P(m)
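A numerical verification of this derivation (Python with numpy) for the 3-state weather chain: the two-step probability pcr(2) obtained by conditioning on X1 equals the (c, r) entry of P².

```python
import numpy as np

# weather chain with states r = 0, c = 1, s = 2
P = np.array([[0.5, 0.4, 0.1],
              [0.3, 0.4, 0.3],
              [0.2, 0.3, 0.5]])
r, c, s = 0, 1, 2

# two-step probability p_cr^(2) obtained by conditioning on X_1, as above
by_hand = P[c, r] * P[r, r] + P[c, c] * P[c, r] + P[c, s] * P[s, r]

# the (c, r) entry of P^2
by_matrix = (P @ P)[c, r]

print(by_hand, by_matrix)       # both print approximately 0.33
```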

21

m-Step Transition Probability Matrix

[Figure: pcr(2) computed as the (c, r) entry of P·P: the c-row of P, (pcr, pcc, pcs), multiplied into the r-column of P, (prr, pcr, psr), with rows and columns labeled r, c, s]

pcr(2) = pcr prr + pcc pcr + pcs psr;  in general, pij(2) is the (i, j) entry of P2