
Hidden Markov Model

Some Definitions

• A finite automaton is defined by a set of states and a set of transitions between states that are taken based on the input observations

• A weighted finite-state automaton is a simple augmentation of the finite automaton in which each arc is associated with a probability, indicating how likely that path is to be taken
– The probabilities on all outgoing arcs from a particular state should sum to one
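
As a minimal sketch (the states and probabilities below are hypothetical, invented only for illustration), a weighted FSA can be stored as a map from each state to its outgoing arcs, and the sum-to-one constraint checked directly:

    # A toy weighted finite-state automaton: each state maps to its
    # outgoing arcs, each arc a (next_state, probability) pair.
    # States and probabilities are hypothetical.
    wfsa = {
        "q0": [("q0", 0.3), ("q1", 0.7)],
        "q1": [("q0", 0.4), ("q1", 0.6)],
    }

    # The probabilities on all outgoing arcs of a state must sum to one.
    for state, arcs in wfsa.items():
        total = sum(p for _, p in arcs)
        assert abs(total - 1.0) < 1e-9, f"{state}: outgoing mass {total}"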

Markov Chain

• A Markov chain is a special case of a weighted automaton in which the input sequence uniquely determines which states the automaton will go through

• A Markov chain is therefore only useful for assigning probabilities to unambiguous sequences
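
Because the states of a Markov chain are fully determined by the input, the probability of a sequence is just the product of the transition probabilities along its path. A minimal sketch (the two-state chain and its numbers are hypothetical):

    # Toy two-state Markov chain; all values are illustrative only.
    start = {"hot": 0.5, "cold": 0.5}
    trans = {
        ("hot", "hot"): 0.7, ("hot", "cold"): 0.3,
        ("cold", "hot"): 0.4, ("cold", "cold"): 0.6,
    }

    def chain_probability(seq):
        """P(q1..qT) = P(q1) * product over i of P(qi | qi-1)."""
        p = start[seq[0]]
        for prev, cur in zip(seq, seq[1:]):
            p *= trans[(prev, cur)]
        return p

    print(chain_probability(["hot", "hot", "cold"]))  # 0.5 * 0.7 * 0.3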

Hidden Markov Model (HMM)

• Hidden states
– The states are not directly observable in the world; instead they have to be inferred through other means

HMM

• An HMM consists of:
– A set of N states
– A set of O observations
– A special start state and end state
– Transition probabilities
– Emission probabilities

HMM

• Transition probability
– At each time instant the system may change its state from the current state to another state, or remain in the same state, according to a certain probability distribution

• Emission probability
– A sequence of observation likelihoods, each expressing the probability of a particular observation being emitted by a particular state
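
One way to hold these pieces is as an N x N transition matrix A, whose rows each sum to one, and an N x O emission matrix B, whose rows are distributions over the observation vocabulary. A sketch with hypothetical values:

    # A toy HMM with N = 2 states and O = 3 observation symbols.
    # All probabilities are hypothetical, for illustration only.
    states = ["s1", "s2"]
    vocab = ["a", "b", "c"]

    # A[i][j] = P(next state j | current state i); rows sum to one.
    A = [[0.8, 0.2],
         [0.3, 0.7]]

    # B[i][k] = P(vocab[k] emitted | state i); rows sum to one.
    B = [[0.5, 0.4, 0.1],
         [0.1, 0.3, 0.6]]

    # Initial distribution standing in for the special start state.
    pi = [0.6, 0.4]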

Example

• Imagine you are a climatologist in the year 2799 studying the history of global warming
– There are no records of the weather in Baltimore for the summer of 2007
– We have Jason Eisner’s diary, which records how many ice creams he ate each day
– Our goal is to estimate the climate from the observations we have. For simplicity we assume only two states, hot and cold.
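
Here the hidden states are HOT and COLD days and the observations are the daily ice cream counts 1, 2, 3. A sketch of this HMM with placeholder probabilities (the numbers are invented for illustration, not estimated from the diary):

    # Hidden states: the unobserved weather on each day.
    states = ["HOT", "COLD"]
    # Observations: number of ice creams eaten that day (1, 2, or 3).

    # Placeholder parameters, for illustration only.
    pi = {"HOT": 0.8, "COLD": 0.2}
    A = {"HOT":  {"HOT": 0.7, "COLD": 0.3},
         "COLD": {"HOT": 0.4, "COLD": 0.6}}
    B = {"HOT":  {1: 0.2, 2: 0.4, 3: 0.4},
         "COLD": {1: 0.5, 2: 0.4, 3: 0.1}}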

Markov Assumptions

• A first-order HMM makes two simplifying assumptions
– The probability of a particular state depends only on the previous state
– The probability of an output observation depends only on the state that produced the observation, and not on any other states or observations
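
In symbols, writing q_1 ... q_T for the hidden states and o_1 ... o_T for the observations (standard HMM notation, assumed here), the two assumptions read:

    Markov assumption:    P(q_i | q_1, ..., q_{i-1}) = P(q_i | q_{i-1})
    Output independence:  P(o_i | q_1, ..., q_T, o_1, ..., o_{i-1}) = P(o_i | q_i)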

HMM usage

• There are three important ways in which an HMM is used:

– Computing likelihood
• Given an HMM λ = (A, B) and an observation sequence O, determine the likelihood P(O | λ)

– Decoding
• Given an observation sequence O and an HMM λ = (A, B), discover the best hidden state sequence Q

– Learning
• Given an observation sequence O and the set of states in the HMM, learn the HMM parameters A and B

Computing Likelihood

• Let’s assume 3 1 3 is our observation sequence

• The real problem is that we do not know the hidden state sequence corresponding to the observation sequence

• The task is therefore to compute the total probability of the observations by summing over all possible hidden state sequences
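
A direct but exponentially expensive way to do this is to enumerate every hidden state sequence, score each against the observations, and add the results. A sketch using the placeholder ice cream parameters from earlier:

    from itertools import product

    states = ["HOT", "COLD"]
    pi = {"HOT": 0.8, "COLD": 0.2}
    A = {"HOT":  {"HOT": 0.7, "COLD": 0.3},
         "COLD": {"HOT": 0.4, "COLD": 0.6}}
    B = {"HOT":  {1: 0.2, 2: 0.4, 3: 0.4},
         "COLD": {1: 0.5, 2: 0.4, 3: 0.1}}

    def brute_force_likelihood(obs):
        """P(O) = sum over every hidden sequence Q of P(O, Q)."""
        total = 0.0
        for seq in product(states, repeat=len(obs)):  # N^T sequences
            p = pi[seq[0]] * B[seq[0]][obs[0]]
            for t in range(1, len(obs)):
                p *= A[seq[t - 1]][seq[t]] * B[seq[t]][obs[t]]
            total += p
        return total

    print(brute_force_likelihood([3, 1, 3]))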

Forward Algorithm

• For N states and an observation sequence of length T there can be up to N^T possible hidden state sequences, which becomes intractable when N and T are large

• Forward algorithm
– A kind of dynamic programming algorithm, i.e., an algorithm that uses a table to store intermediate values as it builds up the probability of the observation sequence
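
A sketch of the forward algorithm on the same placeholder parameters: after step t the table entry alpha[s] holds P(o_1 .. o_t, q_t = s), so the trellis is filled left to right in O(N^2 T) time instead of O(N^T):

    states = ["HOT", "COLD"]
    pi = {"HOT": 0.8, "COLD": 0.2}
    A = {"HOT":  {"HOT": 0.7, "COLD": 0.3},
         "COLD": {"HOT": 0.4, "COLD": 0.6}}
    B = {"HOT":  {1: 0.2, 2: 0.4, 3: 0.4},
         "COLD": {1: 0.5, 2: 0.4, 3: 0.1}}

    def forward(obs):
        """Return P(O | lambda) via the forward trellis."""
        # Initialise: alpha[s] = pi(s) * P(o_1 | s).
        alpha = {s: pi[s] * B[s][obs[0]] for s in states}
        # Induction: sum over predecessor states, then emit o_t.
        for o in obs[1:]:
            alpha = {s: B[s][o] * sum(alpha[r] * A[r][s] for r in states)
                     for s in states}
        return sum(alpha.values())

    print(forward([3, 1, 3]))  # matches the brute-force sum, far cheaper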