Hidden Markov Models First Story! Majid Hajiloo, Aria Khademi.


Page 1

Hidden Markov Models

First Story!

Majid Hajiloo, Aria Khademi

Page 2

Outline

This lecture is devoted to the problem of inference in probabilistic graphical models (aka Bayesian nets). The goal is for you to:

Practice marginalization and conditioning

Practice Bayes rule

Learn the HMM representation (model)

Learn how to do prediction and filtering using HMMs

Page 3

Assistive Technology

Assume you have a little robot that is trying to estimate the posterior probability that you are happy or sad

Page 4

Assistive Technology

Given that the robot has observed whether you are:

watching Game of Thrones (w)

sleeping (s)

crying (c)

Facebooking (f)

Let the unknown state be $X = h$ if you're happy and $X = s$ if you're sad.

Let $Y$ denote the observation, which can be w, s, c, or f.

Page 5

Assistive Technology

We want to answer queries such as $P(X = h \mid Y = w)$: the posterior probability that you are happy given that the robot observed you watching Game of Thrones.

Page 6

Assistive Technology

Assume that an expert has compiled the following prior and likelihood models:

[Tables: the prior $P(X)$ over the hidden state and the likelihood $P(Y \mid X)$ over the observations.]
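As a concrete illustration, here is a minimal Python sketch of such a query. The prior and likelihood numbers below are hypothetical stand-ins, not the expert's actual tables:

    # Hypothetical prior P(X) and likelihood P(Y | X); the expert's
    # actual numbers are not reproduced in this transcript.
    prior = {"h": 0.7, "s": 0.3}
    likelihood = {
        "h": {"w": 0.5, "s": 0.2, "c": 0.1, "f": 0.2},
        "s": {"w": 0.1, "s": 0.3, "c": 0.4, "f": 0.2},
    }

    def posterior(y):
        """Return P(X | Y = y) via Bayes rule."""
        joint = {x: prior[x] * likelihood[x][y] for x in prior}
        evidence = sum(joint.values())          # P(Y = y)
        return {x: p / evidence for x, p in joint.items()}

    print(posterior("w"))   # P(X | Y = w): are you happy given GoT?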

Page 7

Assistive Technology

But what if, instead of an absolute prior, what we have is a temporal (transition) prior? That is, we assume a dynamical system:

[Diagram: an initial distribution over the first state, followed by transitions from each state to the next, with an observation emitted at each step.]

Given a history of observations, say we want to compute the posterior probability that you are happy at step 3. That is, we want to estimate:

$P(X_3 = h \mid y_{1:3})$

Page 8

Dynamic Model

In general, we assume we have an initial distribution $P(X_0)$, a transition model $P(X_t \mid X_{t-1})$, and an observation model $P(Y_t \mid X_t)$.

For 2 time steps, the graphical model is:

$X_0 \rightarrow X_1 \rightarrow X_2$, with emissions $X_1 \rightarrow Y_1$ and $X_2 \rightarrow Y_2$
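A minimal sketch of this model as arrays (all numbers are hypothetical, for illustration only):

    import numpy as np

    # Hidden states: 0 = happy (h), 1 = sad (s); observations w, s, c, f.
    p0 = np.array([0.6, 0.4])                 # initial distribution P(X_0)
    A = np.array([[0.9, 0.1],                 # transition model P(X_t | X_{t-1});
                  [0.2, 0.8]])                # rows index the previous state
    B = np.array([[0.5, 0.2, 0.1, 0.2],       # observation model P(Y_t | X_t);
                  [0.1, 0.3, 0.4, 0.2]])      # rows index the current state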

Page 9

Optimal Filtering

Our goal is to compute, for all $t$, the posterior (aka filtering) distribution:

$P(X_t \mid y_{1:t})$

We derive a recursion to compute $P(X_t \mid y_{1:t})$, assuming that we have $P(X_{t-1} \mid y_{1:t-1})$ as input. The recursion has two steps:

1. Prediction

2. Bayesian update

Page 10

Prediction

First, we compute the state prediction by marginalizing out the previous state:

$P(X_t \mid y_{1:t-1}) = \sum_{x_{t-1}} P(X_t \mid x_{t-1}) \, P(x_{t-1} \mid y_{1:t-1})$
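In code, the prediction step is a single vector-matrix product. A sketch, reusing the hypothetical transition matrix from the dynamic-model example:

    import numpy as np

    A = np.array([[0.9, 0.1],
                  [0.2, 0.8]])            # P(X_t | X_{t-1}), hypothetical
    belief = np.array([0.7, 0.3])         # P(X_{t-1} | y_{1:t-1}), hypothetical

    predicted = belief @ A                # sum over x_{t-1}
    print(predicted)                      # P(X_t | y_{1:t-1}) = [0.69, 0.31]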

Page 11

Bayes Update

Second, we use Bayes rule to obtain the new filtering distribution:

$P(X_t \mid y_{1:t}) = \dfrac{P(y_t \mid X_t) \, P(X_t \mid y_{1:t-1})}{\sum_{x_t} P(y_t \mid x_t) \, P(x_t \mid y_{1:t-1})}$
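The update is an elementwise product with the observation likelihood, followed by normalization. Continuing the hypothetical numbers from the prediction sketch:

    import numpy as np

    predicted = np.array([0.69, 0.31])    # P(X_t | y_{1:t-1})
    lik = np.array([0.5, 0.1])            # P(y_t | X_t) for the observed y_t

    unnorm = lik * predicted              # numerator of Bayes rule
    belief = unnorm / unnorm.sum()        # P(X_t | y_{1:t})
    print(belief)                         # about [0.918, 0.082]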

Page 12

HMM Algorithm

Example. Assume an initial distribution, a transition model, and an observation model as defined above.

Prediction: push the current belief through the transition model to get $P(X_t \mid y_{1:t-1})$.

Bayes Update: if we observe $y_t$, the Bayes update yields the new filtering distribution $P(X_t \mid y_{1:t})$.
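Putting the two steps together gives the full filtering recursion. A self-contained sketch with the same hypothetical model (the slide's actual example numbers are not reproduced in this transcript):

    import numpy as np

    states = ["h", "s"]
    obs_index = {"w": 0, "s": 1, "c": 2, "f": 3}

    p0 = np.array([0.6, 0.4])                 # P(X_0), hypothetical
    A = np.array([[0.9, 0.1],
                  [0.2, 0.8]])                # P(X_t | X_{t-1}), hypothetical
    B = np.array([[0.5, 0.2, 0.1, 0.2],
                  [0.1, 0.3, 0.4, 0.2]])      # P(Y_t | X_t), hypothetical

    def hmm_filter(observations):
        """Return P(X_t | y_{1:t}) after each observation."""
        belief = p0
        beliefs = []
        for y in observations:
            predicted = belief @ A                    # 1. prediction
            unnorm = B[:, obs_index[y]] * predicted   # 2. Bayes update
            belief = unnorm / unnorm.sum()
            beliefs.append(belief)
        return beliefs

    # e.g. the robot sees you watching, then crying, then Facebooking
    for t, b in enumerate(hmm_filter(["w", "c", "f"]), start=1):
        print(f"t={t}:", dict(zip(states, b.round(3))))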

Page 13

The END