Agenda
• Review of Proj 1
• Concepts for Proj 2
  – Probability
  – Morphology
• Image Blending
Proj 1
• Grades in EEE Dropbox
• Out of 40 points; followed the breakdown on the project webpage
• Soln is also in the dropbox
  – I strongly suggest you look at it
  – I still learn ‘tricks of the trade’ by looking at other people’s code
Demosaicing
Look at code
whiteBalance.m
clipHistogram.m
gammaCorrectOrig.m
gammaCorrect.m
Someone in class had a better implementation!
Agenda
• Review of Proj 1
• Concepts for Proj 2
  – Probability
  – Morphology
• Image Blending
Discrete Probability
• We have a biased 6-sided die. How do we encode the chances of possible rolls?
Discrete Probability
• We have a biased 6-sided die. How do we encode the chances of possible rolls?
• Table of 6 numbers that add to 1
• Write formally as P(R=1)=.1, P(R=2)=.2, …
• Here, R is a discrete random variable
• The function P(·) is sometimes called a probability mass function (pmf)
• In the continuous world, it’s called a pdf
Discrete Probability
• Assume we have a rand() function that can generate a random number in [0,1]
• Given the pmf, how can we now generate sample rolls according to P(R=1)=.1, P(R=2)=.2, …?
Discrete Probability
• Given the pmf, how can we now generate sample rolls according to P(R=1)=.1, P(R=2)=.2, …?
• Idea: stack line segments of length .1, .2, …
• Total height of stack = 1
• Randomly pick a height & see what segment we hit
• In MATLAB: sample = find(cumsum(pmf) >= rand, 1)
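The cumsum trick above is a one-liner in MATLAB; a minimal NumPy sketch of the same idea follows. Only P(R=1)=.1 and P(R=2)=.2 come from the slides; the remaining pmf values are made up so the table sums to 1.

```python
import numpy as np

def sample_roll(pmf, u=None):
    """Sample a die roll by stacking the probabilities into a cumulative
    sum and seeing which segment a uniform draw in [0,1] hits.

    Returns a 1-based roll to match the slides' P(R=1), P(R=2), ...
    """
    if u is None:
        u = np.random.rand()
    cdf = np.cumsum(pmf)
    # First index where the cumulative sum reaches u -- NumPy's analogue
    # of MATLAB's find(cumsum(pmf) >= rand, 1).
    return int(np.searchsorted(cdf, u, side="left")) + 1

# Biased die: only the first two entries are from the slides,
# the rest are assumed for illustration.
pmf = [0.1, 0.2, 0.1, 0.2, 0.1, 0.3]
print(sample_roll(pmf, u=0.05))  # falls in the first segment -> 1
print(sample_roll(pmf, u=0.25))  # 0.1 < 0.25 <= 0.3 -> 2
```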
Image histogram
• Consider a grayscale image with image intensities in {0, 1, …, 255}
• Construct a table of 256 values. Normalize so that the entries add to 1
• We can interpret the histogram as a pmf
• P(I=0)=.0006, P(I=1)=.003, …
• What would a ‘sample’ image look like?
Not so good, so let’s come up with a stronger model….
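To see why it's not so good, draw every pixel i.i.d. from the histogram pmf; a NumPy sketch (the toy gradient image is made up):

```python
import numpy as np

def histogram_pmf(img):
    """Normalized 256-bin histogram of an 8-bit grayscale image."""
    counts = np.bincount(img.ravel(), minlength=256)
    return counts / counts.sum()

def sample_image(pmf, shape, rng=np.random.default_rng(0)):
    """Draw each pixel independently from the histogram pmf: the result
    matches the global intensity statistics but has no spatial structure."""
    return rng.choice(256, size=shape, p=pmf).astype(np.uint8)

img = np.tile(np.arange(256, dtype=np.uint8), (4, 1))  # toy gradient image
pmf = histogram_pmf(img)
noise = sample_image(pmf, img.shape)   # same histogram, pure noise spatially
```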
Joint probability tables
• Let’s say we have 2 “loaded” dice
• Define a joint pmf by a table of 6×6 numbers that add to 1
• P(R1,R2) =
1 2 3 4 5 6
1 2/42 1/42 1/42 1/42 1/42 1/42
2 1/42 2/42 1/42 1/42 1/42 1/42
3 1/42 1/42 2/42 1/42 1/42 1/42
4 1/42 1/42 1/42 2/42 1/42 1/42
5 1/42 1/42 1/42 1/42 2/42 1/42
6 1/42 1/42 1/42 1/42 1/42 2/42
Joint probability tables
• P(R1,R2) =   (rows: R2, columns: R1)

        1     2     3     4     5     6
  1   2/42  1/42  1/42  1/42  1/42  1/42
  2   1/42  2/42  1/42  1/42  1/42  1/42
  3   1/42  1/42  2/42  1/42  1/42  1/42
  4   1/42  1/42  1/42  2/42  1/42  1/42
  5   1/42  1/42  1/42  1/42  2/42  1/42
  6   1/42  1/42  1/42  1/42  1/42  2/42

What’s P(R1=k) for k = 1..6?
What’s P(R2=k) for k = 1..6?
The marginals P(R1),P(R2) look fair, but the joint P(R1,R2) is clearly biased
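The claim above can be checked numerically: sum the joint table across each axis to get the marginals, and compare the joint to the product of marginals. A NumPy sketch:

```python
import numpy as np

# Joint pmf from the slides: 2/42 on the diagonal, 1/42 elsewhere
# (rows: R2, columns: R1 -- the table happens to be symmetric).
P = (np.ones((6, 6)) + np.eye(6)) / 42.0

P_R1 = P.sum(axis=0)  # marginalize out R2
P_R2 = P.sum(axis=1)  # marginalize out R1

print(P_R1)  # every entry is 7/42 = 1/6, so each die looks fair on its own
print(np.allclose(P, np.outer(P_R2, P_R1)))  # False: the dice are not independent
```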
Conditional Probability Tables (CPT)
• P(R1|R2) = P(R1,R2)/P(R2)
(rows: R2, columns: R1)

        1    2    3    4    5    6
  1   2/7  1/7  1/7  1/7  1/7  1/7
  2   1/7  2/7  1/7  1/7  1/7  1/7
  3   1/7  1/7  2/7  1/7  1/7  1/7
  4   1/7  1/7  1/7  2/7  1/7  1/7
  5   1/7  1/7  1/7  1/7  2/7  1/7
  6   1/7  1/7  1/7  1/7  1/7  2/7
What’s P(R1=x|R2=y) for x = 1..6, y=1..6?
Does the CPT contain the same information as the joint?
What if we knew both the P(R1,R2) and P(R2) tables?
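The two questions above can be answered in a few lines: dividing the joint by the marginal P(R2) gives the CPT, and multiplying back recovers the joint exactly. So (CPT, P(R2)) together carry the same information as the joint, while the CPT alone does not. A NumPy sketch:

```python
import numpy as np

P = (np.ones((6, 6)) + np.eye(6)) / 42.0   # joint from the slides (rows: R2, cols: R1)
P_R2 = P.sum(axis=1)                       # marginal P(R2), here uniform 1/6

cpt = P / P_R2[:, np.newaxis]              # P(R1|R2): each row is a distribution over R1
print(cpt[0, 0])                           # (2/42)/(7/42) = 2/7, matching the CPT slide

recovered = cpt * P_R2[:, np.newaxis]      # P(R1,R2) = P(R1|R2) P(R2)
print(np.allclose(recovered, P))           # True
```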
Shannon’s language model
• P(word|word_prev) =   (rows: word_prev, columns: word)

          run  walk    I  salt    to    …
  run     2/7   1/7  1/7   1/7   1/7  1/7
  walk    1/7   2/7  1/7   1/7   1/7  1/7
  I       1/7   1/7  2/7   1/7   1/7  1/7
  salt    1/7   1/7  1/7   2/7   1/7  1/7
  to      1/7   1/7  1/7   1/7   2/7  1/7
  …       1/7   1/7  1/7   1/7   1/7  2/7
What would be reasonable CPT values?
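Reasonable CPT values are usually estimated by counting: scan a corpus, count how often each word follows each previous word, and normalize each row to sum to 1. A minimal sketch (the toy corpus is made up):

```python
from collections import Counter, defaultdict

def bigram_cpt(words):
    """Estimate P(word | word_prev) by counting adjacent word pairs
    and normalizing each conditional distribution to sum to 1."""
    counts = defaultdict(Counter)
    for prev, cur in zip(words, words[1:]):
        counts[prev][cur] += 1
    return {prev: {w: c / sum(ctr.values()) for w, c in ctr.items()}
            for prev, ctr in counts.items()}

corpus = "i walk to the store and i run to the park".split()
cpt = bigram_cpt(corpus)
print(cpt["to"])   # {'the': 1.0} -- "to" is always followed by "the" here
print(cpt["i"])    # {'walk': 0.5, 'run': 0.5}
```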
Nth-order Markov models
• P(word|history) = P(word|word_prev1,word_prev2)
(rows: word_prev1, word_prev2; columns: word)

              run  walk    I  salt    to    …
  I, run      2/7   1/7  1/7   1/7   1/7  1/7
  you, walk   1/7   2/7  1/7   1/7   1/7  1/7
  I, I        1/7   1/7  2/7   1/7   1/7  1/7
  you, you    1/7   1/7  1/7   2/7   1/7  1/7
  …           1/7   1/7  1/7   1/7   2/7  1/7
  …           1/7   1/7  1/7   1/7   1/7  2/7
What would be reasonable CPT values?
Nth-order Markov models
• P(pixel|image) = P(pixel|neighborhood)
(rows: neighborhood; columns: pixel)

                1    2    3    4    …  256
  1,1,…,1     2/7  1/7  1/7  1/7  1/7  1/7
  1,1,…,2     1/7  2/7  1/7  1/7  1/7  1/7
  …           1/7  1/7  2/7  1/7  1/7  1/7
  …           1/7  1/7  1/7  2/7  1/7  1/7
  …           1/7  1/7  1/7  1/7  2/7  1/7
  256,…,256   1/7  1/7  1/7  1/7  1/7  2/7
How big is this table for a NxN window? (256^(N^2-1) neighborhoods x 256 intensities = 256^(N^2) entries, far too big to store explicitly)
Statistical modeling of texture
• Assume a stochastic model of texture (Markov Random Field)
• Stationarity: the stochastic model is the same regardless of position
• Markov property: p(pixel | rest of image) = p(pixel | neighborhood)
Implicit models
Synthesizing a pixel
(figure: non-parametric sampling from the input image)
– Instead of explicitly defining P(pixel|neighborhood), we search the input image for all sufficiently similar neighborhoods (and pick one match at random)
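The search step can be sketched as follows, in the spirit of Efros–Leung texture synthesis; this is a simplified sketch (full square windows only, no Gaussian weighting, borders skipped), not the authors' implementation:

```python
import numpy as np

def sample_pixel(image, neighborhood, eps=0.1, rng=np.random.default_rng(0)):
    """Score every k-by-k window of the input by SSD against the target
    neighborhood (center ignored), keep all windows close to the best
    match, and pick one center pixel at random."""
    image = image.astype(float)
    neighborhood = neighborhood.astype(float)
    k = neighborhood.shape[0]      # odd window size, e.g. 3
    c = k // 2
    mask = np.ones((k, k), bool)
    mask[c, c] = False             # compare the surround, not the center

    ssd, centers = [], []
    H, W = image.shape
    for y in range(H - k + 1):
        for x in range(W - k + 1):
            win = image[y:y + k, x:x + k]
            ssd.append(np.sum((win[mask] - neighborhood[mask]) ** 2))
            centers.append(win[c, c])
    ssd = np.array(ssd)
    ok = ssd <= (1 + eps) * ssd.min() + 1e-9   # all sufficiently similar matches
    return rng.choice(np.array(centers)[ok])

img = np.full((5, 5), 7, np.uint8)             # constant toy texture
print(sample_pixel(img, np.zeros((3, 3))))     # every window ties, so the answer is 7
```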
Agenda
• Review of Proj 1
• Concepts for Proj 2
  – Probability
  – Morphology
• Image Blending
Morphology
• Operations on binary images: I(x,y) in {0,1}
• Binary(x,y) = 1 if Grayscale(x,y) > threshold, 0 otherwise
• Why do this?
  1) Fewer bits to encode the image
  2) There are many operations we can perform on binary images
Example: dilation
Mask (below) is often called “structuring element”
1 1 1
1 1 1
1 1 1
Connectedness
1 1 1        0 1 0
1 1 1        1 1 1
1 1 1        0 1 0

8-connected neighbors    4-connected neighbors
Dilation: “I should turn on if any of my neighbors are turned on”
How do we define neighbor?
Example: finding borders of objects
Image Dilated with 3x3 mask Dilated - Image
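The border-finding recipe above (dilate with a 3x3 structuring element, then subtract the original) can be sketched with NumPy alone; a real pipeline would more likely use scipy.ndimage:

```python
import numpy as np

def dilate3x3(img):
    """Binary dilation with a 3x3 all-ones structuring element:
    a pixel turns on if it or any 8-connected neighbor is on."""
    H, W = img.shape
    padded = np.zeros((H + 2, W + 2), img.dtype)
    padded[1:-1, 1:-1] = img
    out = np.zeros_like(img)
    for dy in (0, 1, 2):            # OR together all 9 shifted copies
        for dx in (0, 1, 2):
            out |= padded[dy:dy + H, dx:dx + W]
    return out

img = np.zeros((7, 7), np.uint8)
img[2:5, 2:5] = 1                   # a 3x3 square of "on" pixels
border = dilate3x3(img) - img       # dilated minus original = object border
```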
Grayscale dilation
Erosion
• “I should turn off if any of my neighbors are off”
• Can we implement with dilation?
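Yes: erosion is the dual of dilation under complement, Erode(I) = NOT Dilate(NOT I) (for a symmetric structuring element). A NumPy sketch:

```python
import numpy as np

def dilate3x3(img):
    """3x3 binary dilation: on if anything in the window is on."""
    p = np.pad(img, 1)
    H, W = img.shape
    return np.maximum.reduce([p[dy:dy + H, dx:dx + W]
                              for dy in range(3) for dx in range(3)])

def erode3x3(img):
    """Erosion by duality: complement, dilate, complement again.
    A pixel stays on only if its whole 3x3 neighborhood is on."""
    return 1 - dilate3x3(1 - img)

img = np.zeros((7, 7), np.uint8)
img[1:6, 1:6] = 1
eroded = erode3x3(img)              # the 5x5 square shrinks to its 3x3 core
```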
Morphological opening
• Open(Image) = Dilate(Erode(Image)) (erode first, then dilate)
Binary hole filling
Agenda
• Review of Proj 1
• Concepts for Proj 2
  – Probability
  – Morphology
• Image Blending
Problems with direct cloning
From Perez et al. 2003
Solution: clone gradient
Idea: we’ll manipulate the gradient field of the image rather than the image itself
How can we compute df/dx & df/dy with a convolution (what are the filters)?
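The standard answer is forward differences, i.e. filtering with [-1, 1] along x and [-1, 1]ᵀ along y (up to the kernel flip in the convolution convention). A NumPy sketch:

```python
import numpy as np

def gradients(f):
    """Forward-difference approximations to df/dx and df/dy,
    equivalent to filtering with [-1, 1] along each axis."""
    fx = np.zeros_like(f, dtype=float)
    fy = np.zeros_like(f, dtype=float)
    fx[:, :-1] = f[:, 1:] - f[:, :-1]   # horizontal neighbor difference
    fy[:-1, :] = f[1:, :] - f[:-1, :]   # vertical neighbor difference
    return fx, fy

f = np.arange(16, dtype=float).reshape(4, 4)  # toy ramp image
fx, fy = gradients(f)
print(fx[0, 0], fy[0, 0])  # 1.0 4.0 for this ramp
```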
Seamless Poisson cloning
• Given vector field v (pasted gradient), find the value of f in the unknown region Ω that optimizes:

  min over f of  ∫∫_Ω |∇f − v|^2,  with f = f* on the boundary ∂Ω
(figure: pasted gradient, mask, background, unknown region)
Membrane interpolation
• What if v is null?
• Laplace equation (a.k.a. membrane equation): ∆f = 0
Let’s consider this problem in 1-D
• Min ∫ (df/dx)^2 dx such that f(a) = x1, f(b) = x2
• Using calculus of variations (we want a minimum, so differentiate the functional and set it to 0), we find that d^2f/dx^2 (the second derivative) must be 0 everywhere
• Makes sense: if f(x) were curving, the total sum of slope^2 could be made smaller
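The calculus-of-variations step can be written out explicitly; the Euler–Lagrange equation forces the second derivative to vanish, so the minimizer is the straight line between the endpoints:

```latex
E[f] = \int_a^b \left(\frac{df}{dx}\right)^2 dx, \qquad f(a) = x_1,\ f(b) = x_2
% Euler-Lagrange with L = (f')^2:
%   \frac{d}{dx}\frac{\partial L}{\partial f'} - \frac{\partial L}{\partial f}
%   = \frac{d}{dx}(2 f') = 0
%   \;\Rightarrow\; \frac{d^2 f}{dx^2} = 0
% hence f(x) = x_1 + \frac{x_2 - x_1}{b - a}(x - a), i.e. linear interpolation.
```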
In 2D: membrane interpolation
Discrete Poisson solver

• Minimize the variational problem:

  min over f of  Σ_{<p,q> ∩ Ω ≠ ∅} (f_p − f_q − v_pq)^2   (sum over all neighbor pairs <p,q> that intersect Ω)

  – discretized gradient: f_p − f_q
  – discretized v: v_pq = g(p) − g(q)

• Boundary condition: f_p = f*_p (only for boundary pixels p)

• Rearrange, and call N_p the neighbors of p:

  |N_p| f_p − Σ_{q ∈ N_p ∩ Ω} f_q = Σ_{q ∈ N_p ∩ ∂Ω} f*_q + Σ_{q ∈ N_p} v_pq

• Big yet sparse linear system
Discrete Poisson solver
Knowns: g = desired pixels (v_pq = g_p − g_q), f* = constrained border pixels, N_p = list of neighbors of pixel p
Unknowns: f = reconstructed pixels
Discrete Poisson solver
Knowns: g = desired pixels (v_pq = g_p − g_q), f* = constrained border pixels, N_p = list of neighbors of pixel p
Unknowns: f = reconstructed pixels
Write the equations as Af = b

A is a large sparse matrix, mostly 0s and ±1s; b is a vector of the right-hand sides of the above equations

MATLAB can solve (for small images) in a single command: f = A\b
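A minimal 1-D instance of this linear system (the v = 0 membrane case): each interior unknown satisfies 2 f_p − f_{p−1} − f_{p+1} = 0, and the known boundary values move to the right-hand side. Dense here for clarity; in practice A is sparse:

```python
import numpy as np

def membrane_1d(left, right, n):
    """Solve the 1-D discrete membrane problem A f = b for n interior
    unknowns between two fixed boundary values (the v = 0 case)."""
    A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    b = np.zeros(n)
    b[0] += left       # boundary terms move to the right-hand side
    b[-1] += right
    return np.linalg.solve(A, b)

f = membrane_1d(0.0, 4.0, 3)
print(f)  # [1. 2. 3.] -- the straight line between the boundary values
```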
Result (eye candy)
Manipulate the gradient
• Mix the gradients of g & f*: at each pixel, take the one with larger magnitude
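One way to realize "take the max", following the mixed-gradient rule of Perez et al.: at each pixel keep whichever of the background (f*) or pasted (g) gradients has larger magnitude. A NumPy sketch:

```python
import numpy as np

def mix_gradients(gx_f, gy_f, gx_g, gy_g):
    """Mixed guidance field: per pixel, keep the background (f*) or
    pasted (g) gradient, whichever has larger magnitude."""
    keep_f = gx_f**2 + gy_f**2 > gx_g**2 + gy_g**2
    return np.where(keep_f, gx_f, gx_g), np.where(keep_f, gy_f, gy_g)

# One-pixel example: the background gradient (3, 0) wins over (1, 1).
vx, vy = mix_gradients(np.array([[3.0]]), np.array([[0.0]]),
                       np.array([[1.0]]), np.array([[1.0]]))
```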
Photomontage
• http://grail.cs.washington.edu/projects/photomontage/photomontage.pdf