Workshop on Stochastic Differential Equations and Statistical Inference for Markov Processes
January 19th – 22nd 2012, Lahore University of Management Sciences
Schedule
• Day 1 (Saturday 21st Jan): Review of Probability and Markov Chains
• Day 2 (Saturday 28th Jan): Theory of Stochastic Differential Equations
• Day 3 (Saturday 4th Feb): Numerical Methods for Stochastic Differential Equations
• Day 4 (Saturday 11th Feb): Statistical Inference for Markovian Processes
Today
• Review of Probability
• Simulation of Random Variables
• Review of Discrete Time Markov Chains
• Review of Continuous Time Markov Chains
REVIEW OF PROBABILITY
Why Probability Models?
• Are laws of nature truly probabilistic?
• Coding uncertainty in models
• Financial Markets, Biological Processes, Turbulence, Statistical Physics, Quantum Physics
Mathematical Foundations
• S is a collection of elements (outcomes of an experiment)
• Each (nice) subset of S is an event
• A is a collection of (nice) subsets of S
• The set function P : A → [0, 1] is called a probability measure iff P(S) = 1 and P(∪_i A_i) = Σ_i P(A_i) for any sequence of mutually disjoint events A_i
Independence
• Two events A and B are independent iff P(A ∩ B) = P(A)P(B)
• This means that the occurrence of one does not affect the occurrence of the other
Conditional Probability
• Probability of A given that B has occurred
• Denoted by P(A | B) = P(A ∩ B) / P(B)
• Independence can be reformulated as P(A | B) = P(A)
Random Variables
• A random variable X is a real-valued function defined on the sample space S such that {ω ∈ S : X(ω) ≤ x} is an event for every real x
• A = X(S) is the state space of the random variable
• If A is finite or countably infinite, X is discrete
• If A is an interval, X is continuous
Cumulative Distribution Function
• The cumulative distribution function of X is the function F(x) = P(X ≤ x)
• F is non-decreasing and right continuous, with F(x) → 0 as x → -∞ and F(x) → 1 as x → ∞
Probability Mass Function
• If X is a discrete random variable, the function p(x_i) = P(X = x_i) is called the probability mass function of X
• We also have Σ_i p(x_i) = 1
• The cdf satisfies F(x) = Σ_{x_i ≤ x} p(x_i)
Probability Density Function
• If X is a continuous random variable, the probability density function f is given by f(x) = dF(x)/dx, so that P(a ≤ X ≤ b) = ∫_a^b f(x) dx
• The cdf satisfies F(x) = ∫_{-∞}^x f(t) dt
Discrete Distributions
• Uniform on {1, 2, …, n}: P(X = k) = 1/n
• Bernoulli(p): P(X = 1) = p, P(X = 0) = 1 - p
• Binomial(n, p): P(X = k) = C(n, k) p^k (1 - p)^(n-k), k = 0, 1, …, n
• Poisson(λ): P(X = k) = e^(-λ) λ^k / k!, k = 0, 1, 2, …
Continuous Random Variables
• Uniform on [a, b]: f(x) = 1/(b - a) for a ≤ x ≤ b
• Exponential(λ): f(x) = λ e^(-λx) for x ≥ 0
• Gaussian N(μ, σ^2): f(x) = (1/√(2πσ^2)) e^(-(x-μ)^2 / (2σ^2))
Expectation of a R.V.
• The expectation is defined as E[X] = ∫ x f(x) dx for a continuous random variable
• For a discrete random variable, E[X] = Σ_i x_i p(x_i)
• What is it?
Expectation of Function of a R.V.
• “Law of the unconscious statistician”: E[g(X)] = ∫ g(x) f(x) dx (or Σ_i g(x_i) p(x_i) in the discrete case)
Moments
• The nth moment is given by E[X^n]
• What do they ‘mean’?
Multivariate Distributions
• Several random variables can be associated with the same sample space
• Can define a joint pmf or pdf
• In the case of a bivariate random vector (X1, X2), the joint cdf is F(x1, x2) = P(X1 ≤ x1, X2 ≤ x2)
Marginal pdf
• The marginal pdf of X1 is given by f_X1(x1) = ∫ f(x1, x2) dx2
• The marginal pdf of X2 is given by f_X2(x2) = ∫ f(x1, x2) dx1
Conditional Expectation
• Conditional expectation is given by E[X1 | X2 = x2] = ∫ x1 f(x1 | x2) dx1, where f(x1 | x2) = f(x1, x2) / f_X2(x2)
• Note that E[X1 | X2] is a function of a random variable, and hence a random variable itself!
Probability Generating Function
• The pgf of a random variable X is given by P(z) = E[z^X] = Σ_k p_k z^k
• The pmf can be recovered by taking derivatives evaluated at 0: p_k = P^(k)(0) / k! (a small symbolic check is sketched below)
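A minimal symbolic sketch of this recovery, assuming a Poisson(λ) pgf P(z) = exp(λ(z - 1)) as the example (the choice of distribution and the use of sympy are illustrative, not from the slides):

```python
import sympy as sp

z, lam = sp.symbols('z lambda', positive=True)

# pgf of a Poisson(lambda) random variable: P(z) = E[z^X] = exp(lambda*(z - 1))
pgf = sp.exp(lam * (z - 1))

# Recover the pmf from derivatives at z = 0: p_k = P^(k)(0) / k!
for k in range(4):
    p_k = sp.diff(pgf, z, k).subs(z, 0) / sp.factorial(k)
    print(k, sp.simplify(p_k))   # lambda**k * exp(-lambda) / k!
```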
Central Limit Theorem
• Why are many physical processes well modeled by Gaussians?
• Let X_1, X_2, …, X_n be i.i.d. random variables with finite mean μ and variance σ^2; then as n → ∞
the limiting distribution of (X_1 + … + X_n - nμ) / (σ√n)
is the standard normal N(0, 1)
Law of Large Numbers
• Let X_1, X_2, …, X_n be i.i.d. random variables with finite mean μ and variance; then (X_1 + … + X_n)/n → μ as n → ∞
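A minimal numerical check of both limit theorems, assuming Exponential(1) variates (so the true mean and variance are both 1); the sample sizes and seed are arbitrary choices, not from the workshop:

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 1_000, 2_000

# Exponential(1) samples: true mean 1, true variance 1
x = rng.exponential(scale=1.0, size=(trials, n))

# Law of large numbers: sample means concentrate around the true mean
sample_means = x.mean(axis=1)
print("average sample mean:", sample_means.mean())            # close to 1.0

# Central limit theorem: standardized sums look standard normal
z = (x.sum(axis=1) - n * 1.0) / np.sqrt(n * 1.0)
print("mean, std of standardized sums:", z.mean(), z.std())   # close to 0.0 and 1.0
```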
Numerics
• Simulate a 1-D random walk
– Calculate the mean
– Calculate the Variance
• Simulate a 2-D random walk
– Calculate the mean
– Calculate the Variance
(A sketch follows below.)
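One possible sketch of this exercise, assuming symmetric ±1 steps (and, for the 2-D case, independent ±1 steps in each coordinate); the step counts, number of walkers, and seed are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
steps, walkers = 1_000, 2_000

# 1-D symmetric random walk: each step is +1 or -1 with probability 1/2
increments = rng.choice([-1, 1], size=(walkers, steps))
final_positions = increments.sum(axis=1)
print("mean:", final_positions.mean())       # close to 0
print("variance:", final_positions.var())    # close to the number of steps

# 2-D walk (one possible choice): independent +1/-1 steps in each coordinate
increments_2d = rng.choice([-1, 1], size=(walkers, steps, 2))
final_2d = increments_2d.sum(axis=1)
print("mean (x, y):", final_2d.mean(axis=0))
print("variance (x, y):", final_2d.var(axis=0))
```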
Simulating a Binomially Distributed Random Variable
• Note: a sum of Bernoulli trials is a binomial
• Let X_i be a Bernoulli trial with probability ‘p’ of success
• Then X = X_1 + X_2 + … + X_n is binomial with parameters ‘n’ and ‘p’ (see the sketch below)
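A minimal sketch of this construction; the values of n, p, and the number of samples are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
n, p, samples = 20, 0.3, 100_000

# Each row holds n Bernoulli(p) trials; the row sum is Binomial(n, p)
bernoulli = (rng.random(size=(samples, n)) < p).astype(int)
binomial = bernoulli.sum(axis=1)

print("sample mean:", binomial.mean(), "vs n*p =", n * p)
print("sample variance:", binomial.var(), "vs n*p*(1-p) =", n * p * (1 - p))
```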
Continuous Random Variables
• Inverse Transform Method
– Suppose a random variable has cdf F(x)
– Then Y = F⁻¹(U), with U uniform on (0, 1), has the same cdf
• Generating the exponential: X = -ln(U)/λ
• Generate the exponential and compare with the exact cdf (a sketch follows after this list)
• Generate a r.v. with cdf
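A possible sketch of the inverse transform method for the exponential case, comparing the empirical cdf with the exact cdf F(x) = 1 - e^(-λx); the rate λ, the grid, and the sample size are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(3)
lam, samples = 2.0, 100_000

# Inverse transform: if U ~ Uniform(0, 1), then X = -ln(U)/lam is Exponential(lam),
# because F^{-1}(u) = -ln(1 - u)/lam and 1 - U has the same distribution as U
u = rng.random(samples)
x = -np.log(u) / lam

# Compare the empirical cdf with the exact cdf F(x) = 1 - exp(-lam * x)
grid = np.linspace(0.0, 3.0, 7)
for g in grid:
    empirical = (x <= g).mean()
    exact = 1.0 - np.exp(-lam * g)
    print(f"x = {g:.1f}   empirical = {empirical:.4f}   exact = {exact:.4f}")
```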
Rejection Method
• Simulate Y with density g and U uniform on (0, 1), where f(x) ≤ c g(x) for all x
• To simulate X with density f, look at the ratio f(Y) / (c g(Y))
• If U ≤ f(Y) / (c g(Y)), accept X = Y, else reject and repeat
• To simulate N(0, 1), let Y be Exponential(1)
• If U ≤ exp(-(Y - 1)^2 / 2), set |X| = Y and give X a random sign (a sketch follows below)
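A sketch of this rejection scheme for N(0, 1) with Exponential(1) proposals (the classical textbook construction); the function name, sample size, and seed are illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)

def standard_normal_by_rejection(size):
    """Sample Z ~ N(0, 1) by rejection: propose |Z| from an Exponential(1) density,
    accept with probability exp(-(y - 1)^2 / 2), then attach a random sign."""
    out = []
    while len(out) < size:
        y = rng.exponential(1.0)                     # proposal from g(y) = e^{-y}
        u = rng.random()
        if u <= np.exp(-0.5 * (y - 1.0) ** 2):       # accept with prob f(y)/(c g(y))
            sign = 1.0 if rng.random() < 0.5 else -1.0
            out.append(sign * y)
    return np.array(out)

z = standard_normal_by_rejection(50_000)
print("mean:", z.mean(), "variance:", z.var())       # close to 0 and 1
```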
Section Challenge
• Kruskal’s Paper and Simulation of the Kruskal Count
• The n-hat problem through various approaches and simulating the n-hat problem
STOCHASTIC PROCESSES
Boring Definitions
• A stochastic process is a collection of random variables {X(t) : t ∈ T}
– T is the index set, S is the common sample space
• For each fixed t, X(t) denotes a single random variable
• For each fixed ω ∈ S, X(·)(ω) is a function defined on T (a sample path)
Types of Stochastic Processes
• Discrete Time Discrete Space (DTMC)
• Discrete Time Continuous Space (Time Series)
• Continuous Time Discrete Space (CTMC)
• Continuous Time Continuous Space (SDE)
Discrete Time Discrete Space Processes
Discrete Time Markov Chains
Discrete Time Markov Chain
• The index set is discrete (finite or infinite)
• Markov Property: P(X_{n+1} = j | X_n = i, X_{n-1} = i_{n-1}, …, X_0 = i_0) = P(X_{n+1} = j | X_n = i)
Transition Probability Matrix
• The one step transition probability is defined as p_ji = P(X_{n+1} = j | X_n = i)
• If the transition probability does not depend on n, the process is stationary or homogeneous
• The transition matrix is P = (p_ji), whose columns sum to 1
N-step Transition Probability
• The n step transition probability is p_ji^(n) = P(X_{n+m} = j | X_m = i)
• How is this related to the one step transition probability?
• Guess: Perhaps as the nth power?
Chapman Kolmogorov Equations
• To get from i to j in n steps is equivalent to getting from i to k in s steps and from k to j in n - s steps, summed over all possible intermediate k’s:
p_ji^(n) = Σ_k p_jk^(n-s) p_ki^(s)
• The n step transition probabilities are just powers of the one step transition matrix: P^(n) = P^n (see the sketch below)
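A small sketch of this fact on a toy 3-state chain (the matrix entries are made up, stored in the column convention p_ji used above); the matrix power is checked against a direct Monte Carlo estimate:

```python
import numpy as np

# Toy 3-state chain in the column convention used above:
# P[j, i] = P(X_{n+1} = j | X_n = i), so each column sums to 1
P = np.array([[0.5, 0.1, 0.2],
              [0.3, 0.6, 0.2],
              [0.2, 0.3, 0.6]])

# n-step transition probabilities are the entries of P^n (Chapman-Kolmogorov)
n = 5
Pn = np.linalg.matrix_power(P, n)

# Sanity check of P(X_5 = 2 | X_0 = 0) against a direct Monte Carlo estimate
rng = np.random.default_rng(5)
trials, hits = 100_000, 0
for _ in range(trials):
    state = 0
    for _ in range(n):
        state = rng.choice(3, p=P[:, state])
    hits += (state == 2)
print(Pn[2, 0], "vs", hits / trials)
```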
Communication Classes
• Two states i and j ‘communicate’ (i ↔ j) if p_ji^(m) > 0 and p_ij^(n) > 0 for some m and n
• ↔ is an equivalence relation
• Each equivalence class is called a ‘class’ of the DTMC
• If there is only one class in a MC, it is irreducible
Class Properties
• Periodicity: the period of state i, ‘d(i)’, is the GCD of all n for which p_ii^(n) > 0
• First return time: f_ii^(n) = P(first return to state i occurs at step n)
• Transience & Recurrence
– Transience: Σ_n f_ii^(n) < 1 (return to i is not certain)
– Recurrence: Σ_n f_ii^(n) = 1 (return to i is certain)
Mean Return Time
• Let T_ii be the random variable defining the first return time to state i
• The mean of T_ii, μ_ii = Σ_n n f_ii^(n), is the mean return time
[Diagram: transient state vs. recurrent state]
First Passage Time
• The first passage time is defined as T_ij = min{n ≥ 1 : X_n = j, given X_0 = i}
Stationary Distribution
• For a DTMC a stationary distribution is a non-negative vector π with P π = π and Σ_i π_i = 1
• i.e. the eigenvector of P corresponding to eigenvalue 1 (a numerical sketch follows below)
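A possible numerical sketch, reusing the same toy column-stochastic matrix as above; the stationary distribution is extracted as the eigenvector of P for eigenvalue 1 and renormalized:

```python
import numpy as np

# Same toy column-stochastic transition matrix: P[j, i] = P(i -> j)
P = np.array([[0.5, 0.1, 0.2],
              [0.3, 0.6, 0.2],
              [0.2, 0.3, 0.6]])

# P pi = pi, so pi is the eigenvector of P with eigenvalue 1, normalized to sum to 1
eigvals, eigvecs = np.linalg.eig(P)
idx = np.argmin(np.abs(eigvals - 1.0))
pi = np.real(eigvecs[:, idx])
pi = pi / pi.sum()

print("stationary distribution:", pi)
print("check P @ pi:", P @ pi)   # should reproduce pi
```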
Existence Theorem for Stationary Distribution
• For a positive recurrent, aperiodic and irreducible DTMC there exists a unique stationary distribution π such that lim_{n→∞} p_ji^(n) = π_j for all i
Logistic Growth
• The transition probabilities are given by
where
• Note the correspondence with the deterministic model for
DTMC SIS Epidemic Model
• Compartmental Model
The Infected Class
• I is a random variable that describes the infected class, with state space {0, 1, 2, …, N}
• Two classes {0} and {1,2,….N}
• {0} is the absorbing class
• Average time in the infected state: if F is the submatrix of P corresponding to the transient states, the expected time spent in the transient states is obtained from the fundamental matrix (I - F)⁻¹
DTMC SIR Epidemic Model
• The transition probability is given by
with
Section Challenge
• Simulate
– Logistic Growth
– SIS Model
– SIR Model
• Compare the mean of the MC simulation with the solution of the corresponding deterministic model (a DTMC SIS sketch follows below)
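A hedged sketch of the SIS part of this challenge, assuming the usual birth-and-death form of the DTMC SIS model (one infection with probability βi(N - i)Δt/N and one recovery with probability γiΔt per step); all parameter values, the function name, and the seed are illustrative, not taken from the slides:

```python
import numpy as np

rng = np.random.default_rng(6)

# Illustrative parameters (not taken from the slides)
N, beta, gamma, dt = 100, 1.0, 0.5, 0.01
steps, paths, i0 = 2_000, 500, 5

def sis_dtmc_path():
    """One sample path of the DTMC SIS model in its usual birth-and-death form:
    at most one infection or one recovery per time step of length dt."""
    i = i0
    traj = [i]
    for _ in range(steps):
        b = beta * i * (N - i) / N * dt   # probability of one new infection
        d = gamma * i * dt                # probability of one recovery
        u = rng.random()
        if u < b:
            i += 1
        elif u < b + d:
            i -= 1
        traj.append(i)
    return np.array(traj)

# Mean over many sample paths, to be compared with the deterministic SIS model
mean_path = np.mean([sis_dtmc_path() for _ in range(paths)], axis=0)
print("mean number infected at the final time:", mean_path[-1])
```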
Continuous Time Discrete Space Processes
Continuous Time Markov Chains
Definitions
• The index set is an interval
• States are discrete
• Markov Property: P(X(t_{n+1}) = j | X(t_n) = i, …, X(t_0) = i_0) = P(X(t_{n+1}) = j | X(t_n) = i)
for any sequence of times t_0 < t_1 < … < t_n < t_{n+1}
Transition Probability
• The transition probability is given by p_ji(s, t) = P(X(t) = j | X(s) = i)
• If this only depends on the length t - s of the time interval, the chain is homogeneous and we write p_ji(t - s)
Chapman Kolmogorov Equations
• The transition probabilities are solutions of the Chapman-Kolmogorov Equations: p_ji(t + s) = Σ_k p_jk(s) p_ki(t)
Waiting Times
• The process stays at state X(0) for a random time W_1, then jumps to X(W_1)
• Stays in X(W_1) for a random time, then jumps to X(W_2), and so on
• The random variable W_n (the time of the nth jump) is the waiting time
• The inter-event time is T_n = W_{n+1} - W_n
Poisson Process
• CTMC with state space {0, 1, 2, 3, …} with
– X(0) = 0
– For Δt sufficiently small, P(X(t + Δt) - X(t) = 1) = λΔt + o(Δt) and P(X(t + Δt) - X(t) = 0) = 1 - λΔt + o(Δt)
• Satisfies (Kolmogorov Equations) dp_n(t)/dt = λ p_{n-1}(t) - λ p_n(t), where p_n(t) = P(X(t) = n)
• i.e. p_n(t) = e^(-λt) (λt)^n / n!, the Poisson Distribution (a simulation sketch follows below)
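A minimal simulation sketch, using the fact that the inter-event times of a Poisson process are i.i.d. Exponential(λ); the rate, the horizon, and the number of draws are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(7)
lam, t_max = 3.0, 10.0

# Inter-event times of a Poisson process are i.i.d. Exponential(lam);
# the event times are their cumulative sums
inter_event = rng.exponential(1.0 / lam, size=1_000)
event_times = np.cumsum(inter_event)
event_times = event_times[event_times <= t_max]

print("events in [0, t_max]:", len(event_times), "  expected:", lam * t_max)
```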
Generator Matrix
• Transition rates q_ji are defined in terms of transition probabilities: for j ≠ i, q_ji = lim_{Δt→0} p_ji(Δt)/Δt, and q_ii = -Σ_{j≠i} q_ji
• The rate matrix or ‘Generator Matrix’ is Q = (q_ji)
Embedded Chain
• If Y_n is the DTMC defined by Y_n = X(W_n), it is known as the embedded chain
• If T = (t_ji) is the transition matrix of the embedded chain, then for j ≠ i, t_ji = q_ji / (-q_ii), and t_ii = 0 (a sketch follows below)
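A small sketch of this normalization on a toy generator (the rate values are made up, stored in the column convention q_ji used above):

```python
import numpy as np

# Toy generator in the column convention used above:
# Q[j, i] = rate of jumping from i to j for j != i, and each column sums to zero
Q = np.array([[-3.0,  1.0,  2.0],
              [ 2.0, -4.0,  2.0],
              [ 1.0,  3.0, -4.0]])

# Embedded (jump) chain: off-diagonal rates divided by the exit rate of each state
exit_rates = -np.diag(Q)
T = Q / exit_rates[None, :]
np.fill_diagonal(T, 0.0)

print(T)                  # column-stochastic: each column sums to 1
print(T.sum(axis=0))
```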
Class Properties of Embedded Chain
• Many properties carry over from the embedded DTMC to the CTMC
– States that belong to the same class in the DTMC also belong to the same class in the CTMC
– If a state is recurrent in the DTMC, so it is in the CTMC
– If a class of the DTMC is closed, so is the corresponding class in the CTMC
– If the DTMC is irreducible, so is the CTMC
– Note: there is no concept of periodicity in the CTMC!
Kolmogorov Equations
• The forward equations are given by dP(t)/dt = Q P(t)
• The backward equations are dP(t)/dt = P(t) Q
Stationary Distribution
• For a positive recurrent, irreducible CTMC with generator matrix Q there exists a unique stationary distribution π
such that Q π = 0 and Σ_i π_i = 1
• Also lim_{t→∞} p_ji(t) = π_j for all i
Generating Functions and CTMC
• From the Kolmogorov Equations a PDE governing the pgf can be derived
• The RHS consists of P(z,t) and the derivatives of P(z,t)
Interevent Time
• For a CTMC, recall the inter-event time was defined as T_n = W_{n+1} - W_n
• Given that the chain is in state i, the inter-event time has an exponential distribution with parameter -q_ii = Σ_{j≠i} q_ji (a simulation sketch follows below)
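A possible Gillespie-style path simulator built directly on this fact: wait an Exponential(-q_ii) time in the current state i, then jump according to the embedded chain. The generator below is the same toy example as above, and the function name and parameters are illustrative, not the workshop's code:

```python
import numpy as np

rng = np.random.default_rng(8)

def simulate_ctmc(Q, x0, t_max):
    """One path of a CTMC with generator Q in the column convention (Q[j, i] = rate i -> j):
    wait an Exponential(-Q[i, i]) time in state i, then jump via the embedded chain."""
    t, state = 0.0, x0
    times, states = [t], [state]
    while True:
        exit_rate = -Q[state, state]
        if exit_rate <= 0:                          # absorbing state
            break
        t += rng.exponential(1.0 / exit_rate)       # exponential inter-event time
        if t > t_max:
            break
        probs = Q[:, state].copy()
        probs[state] = 0.0
        state = int(rng.choice(len(probs), p=probs / exit_rate))
        times.append(t)
        states.append(state)
    return np.array(times), np.array(states)

# Same toy generator as above (columns sum to zero)
Q = np.array([[-3.0,  1.0,  2.0],
              [ 2.0, -4.0,  2.0],
              [ 1.0,  3.0, -4.0]])
times, states = simulate_ctmc(Q, x0=0, t_max=10.0)
print(times[:5], states[:5])
```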
SIS Epidemic Model in Continuous Time
• Transition probabilities are given by
SIS CTMC Model
• The Kolmogorov Equations are
where
SIS CTMC Model
• The generator matrix is given by
• The FKE can be written as
SIR Epidemic Model in continuous Time
• The joint probability distribution is given by
Kolmogorov Equations
• The forward equations are given by
where
Asymptotic Results
• For large N and small I(0)=j
Section Challenge
• Simulate
– Logistic Growth
– SIS Model
– SIR Model
• Compare the mean of the CTMC with the mean of the DTMC and the solution of the corresponding deterministic model