Transcript of 11 cv mil_models_for_chains_and_trees
Computer vision: models, learning and inference
Chapter 11 Models for chains and trees
Please send errata to [email protected]
Structure
• Chain and tree models
• MAP inference in chain models
• MAP inference in tree models
• Maximum marginals in chain models
• Maximum marginals in tree models
• Models with loops
• Applications
Computer vision: models, learning and inference. ©2011 Simon J.D. Prince
Chain and tree models
• Given a set of measurements {x1,...,xN} and world states {w1,...,wN}, infer the world states from the measurements
• Problem: if N is large then the model relating the two will have a very large number of parameters.
• Solution: build sparse models where we only describe subsets of the relations between variables.
Chain and tree models
Chain model: only model connections between a world variable and its preceding and subsequent variables
Tree model: connections between world variables are organized as a tree (no loops). For directed models, disregard the directionality of the connections
Assumptions
We’ll assume that
– World states are discrete
– There is one observed data variable for each world state
– The nth data variable is conditionally independent of all of the other data variables and world states, given its associated world state
Gesture Tracking
Directed model for chains (hidden Markov model)
Compatibility of measurement and world state
Compatibility of world state and previous world state
Undirected model for chains
Compatibility of measurement and world state
Compatibility of world state and previous world state
Equivalence of chain models
Directed:
Undirected:
Equivalence:
Chain model for sign language application
Observations are normally distributed but depend on sign k
World state is categorically distributed, parameters depend on previous world state
MAP inference in chain model
MAP inference:
Substituting in:
Directed model:
MAP inference in chain model
Takes the general form:
Unary term:
Pairwise term:
Dynamic programming
Maximizes functions of the form:
Set up as the cost of traversing a graph: each path from left to right is one possible configuration of world states
Dynamic programming
Algorithm:
1. Work through the graph, computing the minimum possible cost to reach each node
2. When we get to the last column, find the minimum
3. Trace back to see how we got there
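A minimal sketch of these three steps in Python, assuming numpy. The pairwise costs (0 to stay, 2 to change label by one, infinite otherwise) follow the worked example on the next slides; the unary cost table is hypothetical:

```python
import numpy as np

def chain_map_dp(unary, pairwise):
    """MAP (minimum-cost) labeling of a chain by dynamic programming.

    unary:    (N, K) array; unary[n, k] = cost of label k at node n.
    pairwise: (K, K) array; pairwise[j, k] = cost of moving from label
              j at node n-1 to label k at node n.
    Returns the minimum total cost and the best label sequence.
    """
    N, K = unary.shape
    cost = unary[0].copy()              # min cost to reach each label at node 0
    back = np.zeros((N, K), dtype=int)  # best predecessor label per node
    for n in range(1, N):
        # total[j, k]: cheapest-so-far at j, then step j -> k, plus unary at k
        total = cost[:, None] + pairwise + unary[n][None, :]
        back[n] = total.argmin(axis=0)
        cost = total.min(axis=0)
    # find the minimum in the last column, then trace back
    labels = [int(cost.argmin())]
    for n in range(N - 1, 0, -1):
        labels.append(int(back[n, labels[-1]]))
    return float(cost.min()), labels[::-1]

# Hypothetical unary costs for 3 nodes with 3 labels each
unary = np.array([[2.0, 0.8, 1.5],
                  [1.1, 2.4, 0.6],
                  [0.3, 1.9, 2.2]])
# Pairwise costs as in the worked example: 0 to stay, 2 per unit change,
# infinite for a change of more than one label
K = 3
pairwise = np.full((K, K), np.inf)
for j in range(K):
    for k in range(K):
        if abs(j - k) <= 1:
            pairwise[j, k] = 2.0 * abs(j - k)

best_cost, best_labels = chain_map_dp(unary, pairwise)
print(best_cost, best_labels)
```

Each column of `total` plays the role of the "possible routes" in the worked example; the row-wise minimum is the general rule on the later slide.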
Worked example
Unary costs; pairwise costs:
1. Zero cost to stay at the same label
2. Cost of 2 to change label by 1
3. Infinite cost for changing by more than one (not shown)
Worked example
Minimum cost to reach first node is just unary cost
Worked example
Minimum cost is minimum of two possible routes to get here
Route 1: 2.0 + 0.0 + 1.1 = 3.1
Route 2: 0.8 + 2.0 + 1.1 = 3.9
Worked example
Minimum cost is minimum of two possible routes to get here
Route 1: 2.0 + 0.0 + 1.1 = 3.1 (this is the minimum: note it down)
Route 2: 0.5 + 2.0 + 1.1 = 3.6
Worked example
General rule:
Worked example
Work through the graph, computing the minimum cost to reach each node
Worked example
Keep going until we reach the end of the graph
Worked example
Find the minimum possible cost to reach the final column
Worked example
Trace back the route by which we arrived here; this is the minimum-cost configuration
MAP inference for trees
Worked example
Worked example
Variables 1-4 proceed as for the chain example.
Worked example
At variable n=5 we must consider all pairs of routes from the two incoming branches into the current node.
Worked example
Variable 6 proceeds as normal.
Then we trace back through the variables, splitting at the junction.
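This leaf-to-root cost passing, with a minimum taken over each incoming branch at a junction and a trace back that splits there, can be sketched generically, assuming numpy. The tree, costs, and labels below are hypothetical, not the slide's example:

```python
import numpy as np

def tree_map_dp(unary, edges, pairwise):
    """MAP (minimum-cost) labeling on a tree by dynamic programming.

    unary: (N, K) costs. edges: (child, parent) pairs forming a tree whose
    root is the one node never appearing as a child. pairwise[j, k]: cost of
    a child taking label j while its parent takes label k.
    """
    N, K = unary.shape
    children = {n: [] for n in range(N)}
    for c, p in edges:
        children[p].append(c)
    root = next(n for n in range(N) if n not in {c for c, _ in edges})

    order, stack = [], [root]            # order lists parents before children
    while stack:
        n = stack.pop()
        order.append(n)
        stack.extend(children[n])

    cost = unary.astype(float)           # cost[n, k]: best subtree cost, n = k
    back = {}                            # back[(c, k)]: best label for child c
    for n in reversed(order):            # leaves first: pass costs to the root
        for c in children[n]:
            total = cost[c][:, None] + pairwise    # (child label, parent label)
            for k in range(K):
                back[(c, k)] = int(total[:, k].argmin())
            cost[n] = cost[n] + total.min(axis=0)  # min over each branch

    labels = {root: int(cost[root].argmin())}
    for n in order:                      # trace back, splitting at junctions
        for c in children[n]:
            labels[c] = back[(c, labels[n])]
    return float(cost[root].min()), [labels[n] for n in range(N)]

# Hypothetical tree: nodes 0 and 1 are leaves of two branches joining at
# node 2; node 3 is the root. Two labels; cost 0 to agree, 2 to differ.
unary = np.array([[2.0, 0.5], [0.0, 2.0], [1.0, 1.0], [0.0, 3.0]])
edges = [(0, 2), (1, 2), (2, 3)]
pairwise = np.array([[0.0, 2.0], [2.0, 0.0]])
best_cost, best_labels = tree_map_dp(unary, edges, pairwise)
print(best_cost, best_labels)
```

At the junction (node 2) the two branch minima are simply added, which is why the chain procedure carries over with only this one change.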
Marginal posterior inference
• Start by computing the marginal distribution over the Nth variable
• Then we'll consider how to compute the other marginal distributions
Computing one marginal distribution
Compute the posterior using Bayes' rule:
We compute this expression by writing out the joint probability:
Computing one marginal distribution
Problem: computing all K^N joint states and marginalizing explicitly is intractable.
Solution: Re-order terms and move summations to the right
Computing one marginal distribution
Define a function of variable w1 (the two rightmost terms)
Then compute a function of variable w2 in terms of the previous function
This leads to the recursive relation
Computing one marginal distribution
We work our way through the sequence using this recursion.
At the end we normalize the result to compute the posterior
The total number of summations is (N-1)K, as opposed to K^(N-1) for the brute-force approach.
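As a sanity check, this recursion is a few lines of code and can be compared against brute-force marginalization. The likelihood and transition tables below are random placeholders, not a real model:

```python
import numpy as np

rng = np.random.default_rng(0)
N, K = 6, 4
# Placeholder tables: like[n, k] plays Pr(x_n | w_n = k),
# trans[j, k] plays Pr(w_{n+1} = k | w_n = j), uniform prior on w_1.
like = rng.random((N, K))
trans = rng.random((K, K))
trans /= trans.sum(axis=1, keepdims=True)
prior = np.full(K, 1.0 / K)

# Forward recursion: f_1(k) = Pr(x_1|w_1=k) Pr(w_1=k),
# f_n(k) = Pr(x_n|w_n=k) * sum_j trans[j, k] f_{n-1}(j)
f = like[0] * prior
for n in range(1, N):
    f = like[n] * (trans.T @ f)
posterior_wN = f / f.sum()           # normalize to get Pr(w_N | x_1..N)

# Brute force for comparison: sum the joint over every setting of the chain
joint_wN = np.zeros(K)
for states in np.ndindex(*([K] * N)):
    p = prior[states[0]] * like[0, states[0]]
    for n in range(1, N):
        p *= trans[states[n - 1], states[n]] * like[n, states[n]]
    joint_wN[states[-1]] += p
brute = joint_wN / joint_wN.sum()

print(np.allclose(posterior_wN, brute))
```

The recursion touches each of the N-1 links once, while the brute-force loop enumerates every joint state.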
Forward-backward algorithm
• We could compute the other N-1 marginal posterior distributions using a similar set of computations
• However, this is inefficient as much of the computation is duplicated
• The forward-backward algorithm computes all of the marginal posteriors at once
Solution:
Compute all of the first terms using a recursion
Compute all of the second terms using a recursion
... and take products
Forward recursion
Using conditional independence relations
Conditional probability rule
This is the same recursion as before
Backward recursion
Using conditional independence relations
Conditional probability rule
This is another recursion of the form
Forward-backward algorithm
Compute the marginal posterior distribution as a product of two terms
Forward terms:
Backward terms:
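Putting the two recursions together gives all the marginals in one pass each way. A sketch assuming numpy; `like`, `trans`, and `prior` are hypothetical stand-ins for the likelihood, transition, and prior terms:

```python
import numpy as np

def forward_backward(like, trans, prior):
    """All marginal posteriors Pr(w_n | x_1..N) on a chain.

    like (N, K): like[n, k] plays Pr(x_n | w_n = k); trans (K, K) has rows
    summing to 1; prior (K,) is the distribution over w_1.
    """
    N, K = like.shape
    f = np.zeros((N, K))
    b = np.ones((N, K))                  # backward message at the end is 1
    f[0] = prior * like[0]
    for n in range(1, N):                # forward terms
        f[n] = like[n] * (trans.T @ f[n - 1])
    for n in range(N - 2, -1, -1):       # backward terms
        b[n] = trans @ (like[n + 1] * b[n + 1])
    post = f * b                         # product of the two terms
    return post / post.sum(axis=1, keepdims=True)

rng = np.random.default_rng(1)
N, K = 4, 3
like = rng.random((N, K))
trans = rng.random((K, K))
trans /= trans.sum(axis=1, keepdims=True)
prior = np.full(K, 1.0 / K)
post = forward_backward(like, trans, prior)

# Brute-force marginals over all joint states, for comparison
brute = np.zeros((N, K))
for states in np.ndindex(*([K] * N)):
    p = prior[states[0]] * like[0, states[0]]
    for n in range(1, N):
        p *= trans[states[n - 1], states[n]] * like[n, states[n]]
    for n in range(N):
        brute[n, states[n]] += p
brute /= brute.sum(axis=1, keepdims=True)
print(np.allclose(post, brute))
```

Note the duplicated work the slides mention is gone: each link of the chain is visited once forward and once backward, regardless of which marginal is wanted.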
Belief propagation
• The forward-backward algorithm is a special case of a more general technique called belief propagation
• Intermediate functions in forward and backward recursions are considered as messages conveying beliefs about the variables.
• We'll examine the sum-product algorithm.
• The sum-product algorithm operates on factor graphs.
Factor graphs
• One node for each variable
• One node for each function relating variables
Sum product algorithm
Forward pass
• Distributes evidence through the graph

Backward pass
• Collates the evidence

Both phases involve passing messages between nodes:
• The forward phase can proceed in any order, as long as outgoing messages are not sent until all incoming ones have been received
• The backward phase proceeds in the reverse of the forward order
Sum product algorithm
Three kinds of message:
• Messages from unobserved variables to functions
• Messages from observed variables to functions
• Messages from functions to variables
Sum product algorithm
Message type 1:
• Messages from unobserved variables z to function g
• Take the product of incoming messages
• Interpretation: combining beliefs

Message type 2:
• Messages from observed variables z to function g
• Interpretation: conveys the certain belief that the observed values are true
Sum product algorithm
Message type 3:
• Messages from a function g to variable z
• Takes beliefs from all incoming variables except the recipient and uses the function g to form a belief about the recipient

Computing marginal distributions:
• After the forward and backward passes, we compute the marginal distributions as the product of all incoming messages
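The three message types can be illustrated on a tiny chain factor graph, assuming numpy. The factor tables `g1`, `g12`, `g2` follow the slides' factor labels, but the numbers and observed values are made up for illustration:

```python
import numpy as np

# Two-variable chain factor graph: factors g1(x1, w1), g12(w1, w2),
# g2(x2, w2); x1 and x2 are observed.
rng = np.random.default_rng(2)
K = 3
g1 = rng.random((2, K))      # g1[x1, w1]
g2 = rng.random((2, K))      # g2[x2, w2]
g12 = rng.random((K, K))     # g12[w1, w2]
x1_obs, x2_obs = 0, 1

# Type 2: observed variable -> function (certain belief in observed value)
m_x1_g1 = np.eye(2)[x1_obs]
m_x2_g2 = np.eye(2)[x2_obs]
# Type 3: function -> variable (sum the function against incoming beliefs)
m_g1_w1 = g1.T @ m_x1_g1                 # equals g1[x1_obs, :]
m_g2_w2 = g2.T @ m_x2_g2
# Type 1: unobserved variable -> function (product of other incoming messages)
m_w1_g12 = m_g1_w1
# Type 3 again, through the pairwise factor
m_g12_w2 = g12.T @ m_w1_g12

# Marginal: product of all messages arriving at w2, then normalize
post_w2 = m_g12_w2 * m_g2_w2
post_w2 /= post_w2.sum()

# Check against direct enumeration of the joint
joint = g1[x1_obs][:, None] * g12 * g2[x2_obs][None, :]
print(np.allclose(post_w2, joint.sum(axis=0) / joint.sum()))
```

On a chain, the type-3 message through the pairwise factor is exactly the forward recursion of the forward-backward algorithm, which is the equivalence the next slides walk through.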
Sum product: forward pass
Message from x1 to g1:
By rule 2:
Sum product: forward pass
Message from g1 to w1:
By rule 3:
Sum product: forward pass
Message from w1 to g1,2:
By rule 1:
(product of all incoming messages)
Sum product: forward pass
Message from g1,2 to w2:
By rule 3:
Sum product: forward pass
Messages from x2 to g2 and g2 to w2:
Sum product: forward pass
Message from w2 to g2,3:
The same recursion as in the forward-backward algorithm
Sum product: forward pass
Message from wn to gn,n+1:
Sum product: backward pass
Message from wN to gN,N-1:
Sum product: backward pass
Message from gN,N-1 to wN-1:
Sum product: backward pass
Message from gn,n-1 to wn-1:
The same recursion as in the forward-backward algorithm
Sum product: collating evidence
• The marginal distribution is the product of all messages arriving at a node
• Proof:
Marginal posterior inference for trees
Apply sum-product algorithm to the tree-structured graph.
Tree structured graphs
This graph contains loops, but the associated factor graph has the structure of a tree.
We can still use belief propagation.
Learning in chains and trees
Supervised learning (where we know world states wn) is relatively easy.
Unsupervised learning (where we do not know world states wn) is more challenging. Use the EM algorithm:
• E-step – compute posterior marginals over states
• M-step – update model parameters
For the chain model (hidden Markov model) this is known as the Baum-Welch algorithm.
Grid-based graphs
Often in vision, we have one observation associated with each pixel in the image grid.
Why not dynamic programming?
When we trace back from the final node, the paths are not guaranteed to converge.
Approaches to inference for grid-based models
1. Prune the graph.
Remove edges until what remains is a tree
2. Combine variables.
Merge variables to form compound variables with more states, until what remains is a tree. Not practical for large grids.
3. Loopy belief propagation.
Just apply belief propagation. It is not guaranteed to converge, but in practice it works well.
4. Sampling approaches
Draw samples from the posterior (easier for directed models)
5. Other approaches
• Tree-reweighted message passing
• Graph cuts
Gesture Tracking
Stereo vision
• Two images taken from slightly different positions
• The matching point in image 2 is on the same scanline as in image 1
• The horizontal offset is called the disparity
• Disparity is inversely related to depth
• Goal: infer the disparities wm,n at pixel (m,n) from images x(1) and x(2)
Use likelihood:
Stereo vision
1. Independent pixels
Stereo vision
2. Scanlines as chain model (hidden Markov model)
Stereo vision
3. Pixels organized as tree (from Veksler 2005)
Pictorial Structures
Segmentation
Conclusion
• For the special case of chains and trees we can perform MAP inference and compute marginal posteriors efficiently.
• Unfortunately, many vision problems are defined on a pixel grid; these require special methods