Markovian

Transcript of Markovian, 7/29/2019

    Markov Chain Models

    1 Introduction

Let T ⊆ [0, ∞) and let S be a countable set. We usually think of T as a set of possible times and S as a set of possible states of a system. Let (Ω, A, Pr) be a probability model, and let {X(t), t ∈ T} be a set of random variables defined on this model. We think of the event {X(t) = s} as the event that the process is in state s at time t. We are interested in studying situations where it makes sense to assume the following. For every finite sequence

t0 < t1 < · · · < tn < tn+1

of elements of T and every finite sequence

s0, s1, . . . , sn, sn+1

of elements of S,

if Pr(X(tn) = sn, . . . , X(t0) = s0) > 0 then

Pr(X(tn+1) = sn+1 | X(tn) = sn, . . . , X(t0) = s0) = Pr(X(tn+1) = sn+1 | X(tn) = sn).    (1)

The relation (1) is called the Markov Property, named after A. A. Markov (June 14, 1856 N.S. – July 20, 1922) of Markov inequality fame.

    2 Note on notation

If it does not result in double subscripts we will write Xt in place of X(t); for example, X5 instead of X(5). Since X(t) is a random variable, it maps Ω into S. Technically, X(t) : Ω → S, but we almost always take this for granted.

    3 Interpretation

A good way to think about the Markov Property is that it says that the conditional probability of X(tn+1) depends only on the value of X(t) for the last time t at which you observed the process. In what follows we give a number of examples.

    4 Balls in urns

Suppose we have two urns, each containing 10 balls. Altogether there are 10 white balls and 10 black balls. Once a minute, a ball is selected at random from each urn and placed into the opposite urn. We wish to keep track of the number of balls of each color in each urn. This is a simplified model of gas dynamics, where the colored balls represent two different types of gas molecules, and the urns represent the left and right sides of a near-vacuum chamber.


To model this, paint one urn red, the other green, and observe that if we know how many white balls are in the red urn, we know everything. Let Xn be the number of white balls in the red urn after n switches. It makes sense to assume that the Markov Property holds if we assume that each time we switch the balls we are choosing the balls at random from their respective urns. This might not be the case if the balls do not get mixed together each time there is a switch. Note that assuming the selections are made at random allows us to see that for k ∈ {0, 1, . . . , 10}:

Pr(Xn+1 = k − 1 | Xn = k) = (k/10)(k/10),

Pr(Xn+1 = k | Xn = k) = (k/10)((10 − k)/10) + ((10 − k)/10)(k/10),

Pr(Xn+1 = k + 1 | Xn = k) = ((10 − k)/10)((10 − k)/10),

and if |j − k| ∉ {0, 1} then Pr(Xn+1 = j | Xn = k) = 0. Assuming the Markov Property allows us, in principle, to find the mass function of Xn for any positive integer n if we know the mass function of X0. For example,

Pr(X2 = c) = ∑_{a=0}^{10} ∑_{b=0}^{10} Pr(X2 = c, X1 = b, X0 = a)

= ∑_{a=0}^{10} ∑_{b=0}^{10} Pr(X2 = c | X1 = b, X0 = a) Pr(X1 = b, X0 = a)

= ∑_{a=0}^{10} ∑_{b=0}^{10} Pr(X2 = c | X1 = b) Pr(X1 = b, X0 = a)

= ∑_{a=0}^{10} ∑_{b=0}^{10} Pr(X2 = c | X1 = b) Pr(X1 = b | X0 = a) Pr(X0 = a).

    We will see below how to write this more succinctly using matrix algebra.
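That double sum is just a vector–matrix computation, and it is easy to carry out numerically. A minimal Python sketch (the matrix entries are the transition probabilities derived above; exact fractions avoid rounding):

```python
# Sketch: the 11x11 transition matrix for the two-urn chain.  The state is
# the number of white balls in the red urn, k in {0, ..., 10}.
from fractions import Fraction

N = 10
P = [[Fraction(0) for _ in range(N + 1)] for _ in range(N + 1)]
for k in range(N + 1):
    P[k][k] = 2 * Fraction(k, N) * Fraction(N - k, N)          # colors match
    if k > 0:
        P[k][k - 1] = Fraction(k, N) * Fraction(k, N)          # red urn loses a white
    if k < N:
        P[k][k + 1] = Fraction(N - k, N) * Fraction(N - k, N)  # red urn gains a white

def step(dist):
    """One switch: Pr(X_{n+1} = c) = sum over k of dist[k] * P[k][c]."""
    return [sum(dist[k] * P[k][c] for k in range(N + 1)) for c in range(N + 1)]

# Example: start with all 10 white balls in the red urn.
dist0 = [Fraction(0)] * (N + 1)
dist0[10] = Fraction(1)
dist2 = step(step(dist0))   # the mass function of X_2
```

Each call to step advances the mass function of Xn by one switch.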

    4.1 Exercise

Suppose that each urn holds 15 balls, but there are now 20 black balls and 10 white balls. If Xn is the number of white balls in the red urn, find Pr(Xn+1 = k | Xn = j) for j, k ∈ {0, 1, . . . , 10}.

    4.2 Exercise

Suppose each urn holds 2 balls and there are 2 white balls and 2 black balls. Let Xn be the number of white balls in the red urn. Make a 3 × 3 matrix P where

    Pj,k = Pr(Xn+1 = k | Xn = j).


5 Gambler's Ruin

Fred and Ethel engage in a simple game of chance. At each play they toss a coin. If it comes up heads, Ethel gives Fred one dollar. If it comes up tails, Fred gives Ethel one dollar. They play until one of them has no money. If the coin tosses are mutually independent and the probability that the coin comes up heads is p ∈ (0, 1):

    Who winds up with all the money? How long does the game go on?

Suppose that Fred starts out with F dollars and Ethel starts out with E dollars. If we let En denote how much money Ethel has after n plays and Fn denote the amount of money Fred has, then En + Fn = E + F. We let the state space be {0, 1, 2, . . . , E + F} and let T be the non-negative integers. We have

Pr(En+1 = 0 | En = 0) = 1,

Pr(En+1 = E + F | En = E + F) = 1,

and if x ∈ {1, . . . , E + F − 1} then

Pr(En+1 = x + 1 | En = x) = 1 − p,
Pr(En+1 = x − 1 | En = x) = p.
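A quick Monte Carlo sketch of the game; the starting fortunes, the number of trials, and p = 1/2 below are illustrative choices, not fixed by the text:

```python
# Sketch: estimate Pr(Ethel ends up with all the money) by simulation.
import random

def play(ethel, total, p, rng):
    """Run one game to absorption; return True if Ethel ends with everything.
    Heads (probability p) means Ethel gives Fred a dollar."""
    while 0 < ethel < total:
        ethel += -1 if rng.random() < p else 1
    return ethel == total

rng = random.Random(0)
trials = 20_000
wins = sum(play(ethel=5, total=10, p=0.5, rng=rng) for _ in range(trials))
estimate = wins / trials   # for p = 1/2 this should be near 5/10 = 0.5
```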

    5.1 Exercise

Suppose between them Fred and Ethel have 4 dollars. Make a 5 × 5 matrix P where

    Pj,k = Pr(En+1 = k | En = j).

    5.2 Exercise

Let W be the event that eventually Ethel has all the money. Suppose that between them Fred and Ethel have $10. Let f(n) = Pr(W | Ethel has n dollars). Explain why

f(0) = 0; f(10) = 1;

f(n) = p f(n − 1) + (1 − p) f(n + 1) if n ∈ {1, 2, . . . , 9}.

Now show that if p = 1/2 then f(n) = n/10 is a solution of this problem.
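A numerical sanity check of the last claim, assuming p = 1/2; this only verifies the stated solution at the 11 states, it is not a proof:

```python
# Check that f(n) = n/10 satisfies f(n) = p f(n-1) + (1-p) f(n+1)
# with the boundary values f(0) = 0 and f(10) = 1, for p = 1/2.
p = 0.5
f = [n / 10 for n in range(11)]
residuals = [f[n] - (p * f[n - 1] + (1 - p) * f[n + 1]) for n in range(1, 10)]
```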


6 Gambler's ruin continued

Suppose Fred and Ethel cannot agree on who should supply the coin. Fred has a coin with probability p ∈ (0, 1) of heads and Ethel one with probability q ∈ (0, 1) of heads. They agree to change the game as follows. If Fred's coin comes up heads and Ethel's comes up tails, then Ethel gives Fred a dollar. If Ethel's coin comes up heads and Fred's comes up tails, Fred gives Ethel a dollar. If the coins match, no money changes hands. All other rules are the same, and En is Ethel's fortune after n plays. We still have

Pr(En+1 = 0 | En = 0) = 1,
Pr(En+1 = E + F | En = E + F) = 1,

    but now

Pr(En+1 = x + 1 | En = x) = (1 − p)q,
Pr(En+1 = x | En = x) = pq + (1 − p)(1 − q),
Pr(En+1 = x − 1 | En = x) = p(1 − q).

    6.1 Exercise

Suppose between them Fred and Ethel have 4 dollars. Make a 5 × 5 matrix P where

    Pj,k = Pr(En+1 = k | En = j).

    7 A branching model

Let N stand for the positive integers. Suppose that {Cn,k, n ∈ N, k ∈ N} is a collection of mutually independent random variables, each taking values in the non-negative integers. Assume that for each n, the random variables Cn,1, Cn,2, . . . all have the same mass function. For any sequence xk we make the convention that

∑_{k=1}^{0} xk := 0.

Now, put X0 = 1 and, for each n ≥ 0,

Xn+1 = ∑_{k=1}^{Xn} Cn+1,k.

For example,

X1 = ∑_{k=1}^{X0} C1,k = C1,1   and   X2 = ∑_{k=1}^{X1} C2,k,


so that if X1 = 3 then

X2 = C2,1 + C2,2 + C2,3.

The idea is that we start with X0 = 1 organism that reproduces asexually. It has C1,1 like organisms, giving us X1 = C1,1 organisms in generation 1. If X1 = 3 then each of these 3 organisms has offspring, numbering C2,1, C2,2 and C2,3 respectively, giving X2 = C2,1 + C2,2 + C2,3 organisms in generation 2, and so on. This was proposed as a model of the male line of descent in families by Galton and Watson in 1874, in a paper entitled "On the probability of extinction of families".

We can easily compute E[Xn] provided E[Cn,k] = c < ∞. For simplicity of the argument, suppose that Pr(Cn,k ≤ m) = 1 for some m > 0. Then Pr(Xn ≤ m^n) = 1. Next, for any real number x, define Ix : (−∞, ∞) → {0, 1} by Ix(x) = 1 and Ix(u) = 0 if u ≠ x. Then

∑_{N=0}^{m^n} IN(Xn) = 1.

Therefore

E[Xn+1] = E[ Xn+1 ∑_{N=0}^{m^n} IN(Xn) ] = ∑_{N=0}^{m^n} E[Xn+1 IN(Xn)].

Now, observe that

Xn+1 IN(Xn) = ( ∑_{k=1}^{Xn} Cn+1,k ) IN(Xn) = ( ∑_{k=1}^{N} Cn+1,k ) IN(Xn),

which expresses the left-hand side as the product of independent random variables. Therefore

E[Xn+1 IN(Xn)] = E[ ( ∑_{k=1}^{N} Cn+1,k ) IN(Xn) ]

= E[ ∑_{k=1}^{N} Cn+1,k ] E[IN(Xn)]

= ( ∑_{k=1}^{N} E[Cn+1,k] ) Pr(Xn = N)

= N c Pr(Xn = N).

This shows us that

E[Xn+1] = ∑_{N=0}^{m^n} N c Pr(Xn = N) = c E[Xn],


since Pr(Xn ≤ m^n) = 1. Therefore, by iteration, E[Xn+1] = c^{n+1} E[X0]. Since E[X0] = 1 we have shown that

E[Xn] = c^n.

Hence, from Markov's inequality,

Pr(Xn ≠ 0) = Pr(Xn ≥ 1) ≤ E[Xn] = c^n.

In particular, if c < 1 then Pr(Xn ≠ 0) → 0 as n → ∞. This makes sense, as if we are on average having fewer than one offspring per parent, we would expect the family to die out. It remains to be seen what happens if c ≥ 1.
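A simulation sketch of the branching model. The offspring distribution below (0, 1, or 2 children with probabilities 0.3, 0.4, 0.3, so that c = 1) is an illustrative assumption, chosen so that E[Xn] = c^n = 1 for every n:

```python
# Sketch: simulate generation sizes of a Galton-Watson process.
import random

def offspring(rng):
    """One draw from the illustrative offspring law: 0, 1, or 2 children."""
    u = rng.random()
    return 0 if u < 0.3 else (1 if u < 0.7 else 2)

def generation_sizes(n_gens, rng):
    """Return [X_0, X_1, ..., X_{n_gens}] for one run, starting from X_0 = 1."""
    sizes = [1]
    for _ in range(n_gens):
        sizes.append(sum(offspring(rng) for _ in range(sizes[-1])))
    return sizes

rng = random.Random(1)
runs = [generation_sizes(5, rng) for _ in range(5_000)]
mean_x5 = sum(r[5] for r in runs) / len(runs)   # should be near c**5 = 1.0
```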

    7.1 Exercise

Let Bn = {Xn = 0}. Explain why Bn+1 ⊇ Bn and why the sequence Pr(Xn = 0) is convergent.

    8 A queuing model

Let T and S be the non-negative integers. Suppose that the states indicate the number of customers in a bank. We assume that once a minute one of three things happens:

• A customer leaves the bank,
• A customer enters the bank,
• No one enters or leaves the bank.

Let Cn be the number of customers at minute n, and for each non-negative integer k suppose that bk + rk + dk = 1, where bk and dk are positive while rk is non-negative. Assume

Pr(Cn+1 = k + 1 | Cn = k) = bk,
Pr(Cn+1 = k | Cn = k) = rk,
Pr(Cn+1 = k − 1 | Cn = k) = dk.

We might assume, for example, that bk = b for all k. Can we compute E[Cn] or Var[Cn]?

    Such a process is also called a birth and death process.
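One thing we can always do is compute E[Cn] numerically by pushing the distribution of C0 through the chain. A sketch with illustrative constant rates b = 0.3, r = 0.5, d = 0.2 (taking d0 = 0, since no one can leave an empty bank, and truncating at a capacity N so the distribution vector stays finite):

```python
# Sketch: propagate the distribution of C_0 through a birth-and-death chain
# and read off E[C_20].  The rates and truncation level are illustrative.
N = 50                     # truncation level; b_k = 0 at k = N
b, d = 0.3, 0.2            # birth and death probabilities

def step(dist):
    """One minute of the chain: returns the distribution of C_{n+1}."""
    new = [0.0] * (N + 1)
    for k, mass in enumerate(dist):
        down = d if k > 0 else 0.0
        up = b if k < N else 0.0
        new[k] += mass * (1.0 - up - down)   # r_k: nobody enters or leaves
        if k > 0:
            new[k - 1] += mass * down
        if k < N:
            new[k + 1] += mass * up
    return new

dist = [0.0] * (N + 1)
dist[0] = 1.0              # start with an empty bank
for _ in range(20):
    dist = step(dist)
mean_c20 = sum(k * p for k, p in enumerate(dist))   # a numerical E[C_20]
```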

    9 Queuing model continued

It is unrealistic to suppose that the bank could hold an unlimited number of people, so we might assume that for some N > 0, bk = 0 for all k ≥ N. In biological models, N would be the carrying capacity of the ecosystem.


    10 The pure death model

Modify the queuing model to have bk = 0 for all k. By choosing suitable values for dk we can model radioactive decay. What we need is that dk should be proportional to k, and the initial state would be the total number of atoms of the decaying element present at the start of our observations.
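A sketch of this chain with the illustrative choice dk = qk, where q is a small per-atom, per-step decay probability. For this chain E[Xn+1] = (1 − q)E[Xn], so the expected count decays geometrically, mimicking the exponential decay law:

```python
# Pure death chain: from state k, drop to k-1 with probability d_k = q*k
# and stay at k otherwise.  q and n0 are illustrative choices.
q = 0.01     # per-step decay probability per atom (keeps k*q <= 1)
n0 = 20      # number of atoms at the start of the observations
steps = 200

dist = [0.0] * (n0 + 1)
dist[n0] = 1.0
for _ in range(steps):
    new = [0.0] * (n0 + 1)
    for k, mass in enumerate(dist):
        dk = q * k                      # death probability proportional to k
        new[k] += mass * (1.0 - dk)
        if k > 0:
            new[k - 1] += mass * dk
    dist = new

mean = sum(k * p for k, p in enumerate(dist))
# E[X_{n+1}] = (1 - q) E[X_n], so mean should equal n0 * (1 - q) ** steps
```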

11 A catastrophic failure model

Imagine we are walking along the non-negative integers. At each integer there is a coin, and the coin at n comes up heads with probability pn ∈ (0, 1). When we are at any integer, we toss the coin we find there. If it comes up heads, we go to the next bigger integer. If it comes up tails, we go to 0 (so from 0, tails means we just stay there). If we model this as a Markov chain, we have

Pr(Xn+1 = 1 | Xn = 0) = p0,
Pr(Xn+1 = 0 | Xn = 0) = 1 − p0,
Pr(Xn+1 = x + 1 | Xn = x) = px if x > 0,
Pr(Xn+1 = 0 | Xn = x) = 1 − px if x > 0.

In this set-up, one question is whether there is a positive probability that if we start at 0, we never return there. Notice that if Pr(X0 = 0) = 1, then

Pr(X0 = 0, X1 = 0) = Pr(X1 = 0 | X0 = 0) Pr(X0 = 0) = 1 − p0,

Pr(X0 = 0, X1 ≠ 0, X2 = 0)
= Pr(X0 = 0, X1 = 1, X2 = 0)
= Pr(X2 = 0 | X1 = 1, X0 = 0) Pr(X1 = 1 | X0 = 0) Pr(X0 = 0)
= Pr(X2 = 0 | X1 = 1) Pr(X1 = 1 | X0 = 0) Pr(X0 = 0)   (use the Markov property)
= (1 − p1)p0,

Pr(X0 = 0, X1 ≠ 0, X2 ≠ 0, X3 = 0)
= Pr(X0 = 0, X1 = 1, X2 = 2, X3 = 0)
= Pr(X3 = 0 | X2 = 2, X1 = 1, X0 = 0) Pr(X2 = 2 | X1 = 1, X0 = 0) Pr(X1 = 1 | X0 = 0) Pr(X0 = 0)
= Pr(X3 = 0 | X2 = 2) Pr(X2 = 2 | X1 = 1) Pr(X1 = 1 | X0 = 0) Pr(X0 = 0)   (use the Markov property)
= (1 − p2)p1p0.

We can see in this fashion that for n ≥ 2,

Pr(X0 = 0, X1 ≠ 0, . . . , Xn−1 ≠ 0, Xn = 0) = (1 − pn−1)pn−2 · · · p0.
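Letting n → ∞, the probability of leaving 0 and never returning is the infinite product p0 p1 p2 · · ·, which can be positive if pn → 1 fast enough. A sketch with the illustrative choice pn = 1 − 1/(n + 2)², whose partial products telescope to 1/2:

```python
# Sketch: partial products of p_0 * p_1 * ... for p_n = 1 - 1/(n + 2)**2.
# The exact limit here is 1/2, so never returning has positive probability.
def partial_product(n_terms):
    prod = 1.0
    for n in range(n_terms):
        prod *= 1.0 - 1.0 / (n + 2) ** 2
    return prod

p_escape = partial_product(1_000_000)   # close to the limit 0.5
```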


    12 Random walks

Suppose that Y1, Y2, . . . are mutually independent random variables taking values in the integers. Let T be the non-negative integers and let S be the integers. Put X0 = 0 and Xn+1 = Xn + Yn+1 for n ∈ {0, 1, 2, . . .}. This means that

Xn+1 = ∑_{k=1}^{n+1} Yk.

Suppose that Pr(X(tn) = sn, . . . , X(t0) = s0) > 0. We have

X(tn+1) − X(tn) = ∑_{k = tn + 1}^{tn+1} Yk,

so X(tn+1) − X(tn) is independent of X(t0), . . . , X(tn). Adopting the notation from (1), if Pr(X(tn) = sn, . . . , X(t0) = s0) > 0 then

Pr(X(tn+1) = sn+1 | X(tn) = sn, . . . , X(t0) = s0)

= Pr(X(tn+1) = sn+1, X(tn) = sn, . . . , X(t0) = s0) / Pr(X(tn) = sn, . . . , X(t0) = s0)

= Pr(X(tn+1) − X(tn) = sn+1 − sn, X(tn) = sn, . . . , X(t0) = s0) / Pr(X(tn) = sn, . . . , X(t0) = s0)

= Pr(X(tn+1) − X(tn) = sn+1 − sn) Pr(X(tn) = sn, . . . , X(t0) = s0) / Pr(X(tn) = sn, . . . , X(t0) = s0)

= Pr(X(tn+1) − X(tn) = sn+1 − sn)

= Pr(X(tn+1) − X(tn) = sn+1 − sn) Pr(X(tn) = sn) / Pr(X(tn) = sn)

= Pr(X(tn+1) − X(tn) = sn+1 − sn, X(tn) = sn) / Pr(X(tn) = sn)

= Pr(X(tn+1) = sn+1, X(tn) = sn) / Pr(X(tn) = sn)

= Pr(X(tn+1) = sn+1 | X(tn) = sn).

If the Yk all have the same distribution then the random walk is said to be stationary, meaning that the probability distribution of the increment X(tn+1) − X(tn) depends only on tn+1 − tn. Notice that increments of a random walk are independent if they occur over time intervals that do not overlap. We will have a lot to say about more general processes with this behaviour.
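A quick empirical look at stationary increments for a ±1 walk (the step probability p = 0.3 and the sample size are arbitrary choices): the increment X4 − X2 should have the same distribution as X2.

```python
# Sketch: compare the distribution of X_2 with that of X_4 - X_2.
import random

def walk(n, p, rng):
    """One path X_0, ..., X_n with steps +1 (prob p) and -1 (prob 1 - p)."""
    x = [0]
    for _ in range(n):
        x.append(x[-1] + (1 if rng.random() < p else -1))
    return x

rng = random.Random(4)
paths = [walk(4, 0.3, rng) for _ in range(40_000)]
frac_x2_zero = sum(w[2] == 0 for w in paths) / len(paths)
frac_incr_zero = sum(w[4] - w[2] == 0 for w in paths) / len(paths)
# both fractions estimate the same probability: one up-step and one down-step
```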

    12.1 Exercise

Suppose that Pr(Yn = 1) = p ∈ (0, 1) and Pr(Yn = −1) = 1 − p. Find an expression for Pr(X2n = 0) for any positive integer n.


    12.2 Exercise

Explain why Pr(X2n−1 = 0) = 0 for every positive integer n.

    12.3 Exercise

Let In = 1 if Xn = 0 and In = 0 if Xn ≠ 0. Interpret I2 + I4 + · · · + I2n in terms of the number of times our random walk visits 0.

    12.4 Exercise

Show that

lim_{N→∞} E[ ∑_{n=1}^{N} I2n ] < ∞

if p ≠ 1/2.

    13 The Poisson process

Suppose that we would like to model the arrival of customers in such a way that time is measured continuously rather than discretely. If so, we will need the time between arrivals to have no memory; that is, just because you know how long it has been since the last arrival, you have no information about how much longer you will have to wait. In other words, if T is the time until the next arrival we need Pr(T > s + t | T > t) = Pr(T > s). This is called the memoryless property, and it forces T to have an exponential distribution if the distribution function of T is to be continuous.

So, suppose that T1, T2, . . . is a sequence of mutually independent positive random variables such that for any t ≥ 0,

Pr(Tn > t) = exp(−t).

For any t ≥ 0, define Nt as follows. First define

G0 = 0,
Gn = Gn−1 + Tn = T1 + · · · + Tn if n ∈ {1, 2, . . .}.

We think of Gn as the time of the arrival of the nth customer. It is known that for n ≥ 1 and t > 0,

Pr(Gn > t) = ∫_t^∞ (1/(n − 1)!) u^{n−1} exp(−u) du = ∑_{k=0}^{n−1} (t^k / k!) exp(−t).

With this in mind, for t ≥ 0 we define

Nt = max{n : Gn ≤ t}.


Since Pr(Gn > 0) = 1 if n > 0, we have Pr(N0 = 0) = 1. If T1 > t then Nt = 0. Furthermore, since Gn is to represent the time of the arrival of the nth customer, we should have for any k ∈ {1, 2, . . .}

{Nt ≤ k − 1} = {Gk > t}.

This is indeed the case. Remember that G0 < G1 < G2 < · · · . If n is a positive integer, then

{Nt = n} = {Gn ≤ t} ∩ {Gn+1 > t},

so for each positive integer k,

{Nt ≤ k − 1} = ∪_{n=0}^{k−1} {Nt = n} = ∪_{n=0}^{k−1} ({Gn ≤ t} ∩ {Gn+1 > t}) = {Gk > t}.

This last equality is proven by induction on k. When k = 1, it follows from the definition of the Gk that

{G0 ≤ t} ∩ {G1 > t} = {G1 > t}

since G0 = 0. Now suppose the equality holds for k = N ≥ 1. Then for k = N + 1,

∪_{n=0}^{N} ({Gn ≤ t} ∩ {Gn+1 > t})

= ({GN ≤ t} ∩ {GN+1 > t}) ∪ [ ∪_{n=0}^{N−1} ({Gn ≤ t} ∩ {Gn+1 > t}) ]

= ({GN ≤ t} ∩ {GN+1 > t}) ∪ {GN > t}

= ({GN ≤ t} ∩ {GN+1 > t}) ∪ ({GN > t} ∩ {GN+1 > t})   (since GN < GN+1)

= {GN+1 > t},

as claimed.

It is technically complicated to show that {Nt, t ≥ 0} has the Markov property, and we will approach the Poisson process from a different angle.
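Even without that machinery, we can simulate Nt directly from its definition. As a sketch, we check the k = 0 case, where Pr(Nt = 0) = Pr(T1 > t) = exp(−t); the sample size is an arbitrary choice:

```python
# Sketch: build N_t from exponential interarrival times and compare the
# empirical value of Pr(N_t = 0) with exp(-t).
import math
import random

def n_t(t, rng):
    """N_t = max{n : G_n <= t}, where G_n is a sum of unit-rate exponentials."""
    g, n = 0.0, 0
    while True:
        g += rng.expovariate(1.0)   # the next interarrival time T_k
        if g > t:
            return n
        n += 1

t = 1.0
rng = random.Random(2)
samples = [n_t(t, rng) for _ in range(50_000)]
empirical_p0 = samples.count(0) / len(samples)
exact_p0 = math.exp(-t)             # Pr(N_t = 0) = exp(-t)
```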

14 A compound Poisson model for insurance claims

Suppose that Y1, Y2, . . . and C1, C2, . . . are mutually independent random variables. Suppose further that the Yk's are identically distributed Poisson random variables with positive expected value and the Cj's are identically distributed


positive integer valued random variables. Put

Nn = ∑_{k=1}^{n} Yk,   Xn = ∑_{j=1}^{Nn} Cj.

Nn represents the number of insurance claims filed through the nth business day at an insurance company, and Cj is the amount due the insured for the jth claim. Xn then represents the amount the insurance company must pay out by the end of the nth business day. Both {Nn, n ∈ {0, 1, . . .}} and {(Xn, Nn), n ∈ {0, 1, . . .}} are Markov chains, but it is not clear whether {Xn, n ∈ {0, 1, . . .}} alone is a Markov chain.
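A simulation sketch of the pair (Nn, Xn). The Poisson mean 3 and the uniform claim amounts on {1, . . . , 5} dollars are illustrative assumptions:

```python
# Sketch: simulate 30 business days of claim counts N_n and payouts X_n.
import math
import random

def poisson_sample(mu, rng):
    """Knuth's method: count uniforms until their product drops below e^-mu."""
    limit, prod, k = math.exp(-mu), 1.0, 0
    while True:
        prod *= rng.random()
        if prod <= limit:
            return k
        k += 1

rng = random.Random(3)
n_n, x_n = 0, 0
history = []
for day in range(30):
    y = poisson_sample(3.0, rng)                       # claims filed today
    n_n += y                                           # N_n: claims so far
    x_n += sum(rng.randint(1, 5) for _ in range(y))    # X_n: total paid out
    history.append((n_n, x_n))
```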

    15 Stationary, Independent Increment Processes

Markov processes can also be approached axiomatically. This can be tricky, as we have to show that the axioms can actually be satisfied. One important class of examples is called stationary, independent increments processes. We hope to exhibit a family of random variables {Xt : t ≥ 0} such that

• Pr(X0 = 0) = 1;
• if 0 ≤ t0 ≤ t1 ≤ t2 ≤ · · · ≤ tn then the random variables X(t1) − X(t0), X(t2) − X(t1), . . . , X(tn) − X(tn−1) are mutually independent;
• if 0 ≤ s ≤ t then X(t) − X(s) and X(t − s) have the same distribution function.

Some examples are:

• Poisson Process: For t > 0 and k ∈ {0, 1, 2, . . .},

Pr(X(t) = k) = (t^k / k!) exp(−t).

• Brownian Motion: For t > 0 and x ∈ (−∞, ∞),

Pr(X(t) ≤ x) = (1/√(2πt)) ∫_{−∞}^{x} exp(−u²/(2t)) du.

If A > 0, B > 0, C is a real number, and Xt is Brownian motion, then

S(t) = A exp(B Xt + C t)

is a simple model of the evolution of the price of a stock and is the basis for the Black–Scholes price for a stock option.


• Gamma Process: For t > 0 and x > 0,

Pr(X(t) > x) = (1/Γ(t)) ∫_x^∞ u^{t−1} exp(−u) du.

• Negative Binomial Process: For t > 0 and k ∈ {0, 1, 2, . . .},

Pr(X(t) = k) = ( t − 1 + k choose k ) (1 − p)^t p^k.

• Compound Poisson Process: Suppose that {Nt : t ≥ 0} is a Poisson process and Y1, Y2, . . . are mutually independent, identically distributed random variables which are also independent of the Poisson process. Then

X(t) = ∑_{k=1}^{Nt} Yk

is also a stationary, independent increments process. Such processes are used to model insurance claims. The negative binomial process is a special case of this where

Pr(Yk = n) = −p^n / (n log(1 − p)), n ∈ {1, 2, . . .}.

    16 A Markov chain with three states

If we completely understand the following example, we will understand the basic properties of every stationary Markov chain that has a finite state space. Since the states are just labels, let the state space be {1, 2, 3}.

Given any sequence of states s0, s1, . . . , the Markov property tells us that

Pr(X0 = s0, X1 = s1, . . . , Xn−1 = sn−1, Xn = sn)
= Pr(X0 = s0) Pr(X1 = s1 | X0 = s0) · · · Pr(Xn = sn | Xn−1 = sn−1),

so the behaviour of this chain is determined by the mass function of X0 and the conditional probabilities Pr(X1 = k | X0 = j) for j and k in {1, 2, 3}. Let us suppose for the moment that Pr(X0 = j) > 0 for every state j.

Let πn = [πn(1), πn(2), πn(3)], where πn(j) = Pr(X(n) = j), and let P be the 3 × 3 matrix with P(j, k) = Pr(X1 = k | X0 = j).

Note then that

π1(k) = Pr(X1 = k)

= ∑_{x=1}^{3} Pr(X1 = k, X0 = x)

= ∑_{x=1}^{3} Pr(X1 = k | X0 = x) Pr(X0 = x)

= ∑_{x=1}^{3} π0(x) P(x, k),


which we recognize in matrix form as

[π1(1), π1(2), π1(3)] = [π0(1), π0(2), π0(3)]  [ P(1, 1) P(1, 2) P(1, 3) ]
                                               [ P(2, 1) P(2, 2) P(2, 3) ]
                                               [ P(3, 1) P(3, 2) P(3, 3) ]

In fact, for any non-negative integer n,

πn+1(k) = Pr(Xn+1 = k)

= ∑_{x=1}^{3} Pr(Xn+1 = k, Xn = x)

= ∑_{x=1}^{3} Pr(Xn+1 = k | Xn = x) Pr(Xn = x)

= ∑_{x=1}^{3} πn(x) P(x, k),

so

[πn+1(1), πn+1(2), πn+1(3)] = [πn(1), πn(2), πn(3)]  [ P(1, 1) P(1, 2) P(1, 3) ]
                                                     [ P(2, 1) P(2, 2) P(2, 3) ]
                                                     [ P(3, 1) P(3, 2) P(3, 3) ]

More succinctly,

πn+1 = πn P.

By iterating,

πn = π0 P^n.

This shows us that we can understand the long-time behaviour of Xn if we understand P^n. For the purposes of the rest of this example, we will take

P = [ 1/3 1/2 1/6 ]
    [ 1/3 1/3 1/3 ]
    [ 1/4 1/2 1/4 ].

P has some interesting properties. The most important of these are:

• The elements of P are all between 0 and 1.

• The rows of P each sum to 1:

P(j, 1) + P(j, 2) + P(j, 3)

= Pr(X1 = 1 | X0 = j) + Pr(X1 = 2 | X0 = j) + Pr(X1 = 3 | X0 = j)

= [Pr(X1 = 1, X0 = j) + Pr(X1 = 2, X0 = j) + Pr(X1 = 3, X0 = j)] / Pr(X0 = j)


= Pr(X0 = j) / Pr(X0 = j)   (since the state space is {1, 2, 3})

= 1.

Now, since the rows of P sum to 1, the column vector [1, 1, 1]^t is a right eigenvector for P with eigenvalue 1. This means that P has a left eigenvector for the eigenvalue 1 as well; that is, there is a row vector π with πP = π, which we may normalize so that its entries sum to 1. By direct calculation we can see that in fact the entries of π are non-negative as well, so if we choose π0 = π then πn = π for all n; that is, the Xn all have the same distribution!
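The "direct calculation" can be sketched numerically by power iteration: starting from any initial distribution π0, the iterates πn = π0 P^n settle down to a left eigenvector π with πP = π. This is a numerical sketch, not the exact hand computation:

```python
# Power iteration for the stationary distribution of the 3x3 matrix above:
# repeatedly multiply a distribution by P until it stops changing.
P = [[1/3, 1/2, 1/6],
     [1/3, 1/3, 1/3],
     [1/4, 1/2, 1/4]]

def times_P(pi):
    """One step of pi -> pi P (a row vector times the matrix)."""
    return [sum(pi[x] * P[x][k] for x in range(3)) for k in range(3)]

pi = [1/3, 1/3, 1/3]       # any starting distribution works here
for _ in range(200):
    pi = times_P(pi)       # pi is now (numerically) stationary: pi P = pi
```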