CSCE 641: Computer Graphics
Image Warping/Registration
Jinxiang Chai
Outline
Image warping
Image registration
Image Warping
Image filtering: change the range of an image: g(x) = h(f(x))
Image warping: change the domain of an image: g(x) = f(h(x))
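The distinction between filtering (changing the range) and warping (changing the domain) can be made concrete with a tiny sketch (the array values and the particular choices of h are hypothetical, using NumPy):

```python
import numpy as np

f = np.array([[10.0, 20.0], [30.0, 40.0]])  # a tiny 2x2 "image"

# Filtering changes the *range* of the image: g(x) = h(f(x)).
# Here h doubles every intensity; pixel positions are untouched.
g_filtered = 2.0 * f

# Warping changes the *domain* of the image: g(x) = f(h(x)).
# Here h swaps the coordinate axes (a transpose); intensities are untouched.
g_warped = f.T
```

The filtered image has new intensities at the old positions; the warped image has the old intensities at new positions.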
Image Warping
Why?
- texture mapping
- image processing (rotation, zoom in/out, etc.)
- image morphing/blending
- image editing
- image-based modeling & rendering
Parametric (Global) Warping
Examples of image warps:
- translation, rotation, aspect
- affine, perspective, cylindrical
Transformation Function
Transform the geometry of an image to a desired geometry.
Definition: Image Warping
Source image: the image used as the reference; its geometry is not changed.
Target image: the image obtained by transforming the reference image.
(x,y): coordinates of points in the reference image
(u,v): coordinates of points in the target image
f, g or F, G: the x and y components of a transformation function
Definition: Image Warping
Control points: distinctive points in the reference and target images. The coordinates of corresponding control points in the two images are used to determine a transformation function.
A Transformation Function
Used to compute the corresponding points between the source image S(x,y) and the target image T(u,v):
u = f(x,y)
v = g(x,y)
x = F(u,v)
y = G(u,v)
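As an example of such a pair of maps, a pure rotation by an angle θ has both the forward map (f, g) and the inverse map (F, G) in closed form (a sketch; the function names simply mirror the notation above):

```python
import math

theta = math.pi / 2  # rotate by 90 degrees

def forward(x, y):
    """Forward map: (u, v) = (f(x, y), g(x, y)) -- rotation by theta."""
    u = math.cos(theta) * x - math.sin(theta) * y
    v = math.sin(theta) * x + math.cos(theta) * y
    return u, v

def inverse(u, v):
    """Inverse map: (x, y) = (F(u, v), G(u, v)) -- rotation by -theta."""
    x = math.cos(theta) * u + math.sin(theta) * v
    y = -math.sin(theta) * u + math.cos(theta) * v
    return x, y
```

Composing the two maps returns every point to where it started, which is exactly what "inverse" means here.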
Warping Types
Simple mappings:
- Similarity
- Affine mapping
- Projective mapping
These can be applied globally, or piecewise over a subdivision of the plane:
- Piecewise affine over a triangulation
- Piecewise projective over a quadrilateralization
- Piecewise bilinear over a rectangular grid
Other, arbitrary functions can also be used, e.g.:
- Beier-Neely warp (popular for morphs)
- Store u(x,y) and v(x,y) in large arrays
Similarity Transform
• A combination of 2-D uniform scale, rotation, and translation transformations.
• Allows a square to be transformed into any rotated, scaled square.
• Angles between lines are preserved.
• 4 degrees of freedom (s, θ, tx, ty).
• The inverse is of the same form (also a similarity), given by the inverse of the 3×3 matrix below.
Has the form:
u = s·cos(θ)·x − s·sin(θ)·y + tx
v = s·sin(θ)·x + s·cos(θ)·y + ty
In matrix notation:
[u]   [ s·cosθ  −s·sinθ  tx ] [x]
[v] = [ s·sinθ   s·cosθ  ty ] [y]
[1]   [    0        0     1 ] [1]
Affine Transform
• A combination of 2-D scale, rotation, shear, and translation transformations.
• Allows a square to be distorted into any parallelogram.
• 6 degrees of freedom (a, b, c, d, e, f).
• The inverse is of the same form (also affine), given by the inverse of the 3×3 matrix below.
• Good for controlling a warp with triangles, since 3 points in 2D determine the 6 degrees of freedom.
Has the form:
u = a·x + b·y + c
v = d·x + e·y + f
In matrix notation:
[u]   [ a  b  c ] [x]
[v] = [ d  e  f ] [y]
[1]   [ 0  0  1 ] [1]
Projective Transform (a.k.a. "Perspective")
• Linear numerator and denominator:
u = (a·x + b·y + c) / (g·x + h·y + i)
v = (d·x + e·y + f) / (g·x + h·y + i)
• If g = h = 0, the affine transform falls out as a special case.
• Allows a square to be distorted into any quadrilateral.
• 8 degrees of freedom (a through h); we can choose i = 1 arbitrarily.
• The inverse is of the same form (also projective).
• Good for controlling a warp with quadrilaterals, since 4 points in 2D determine the 8 degrees of freedom.
In matrix notation (homogeneous coordinates):
[u']   [ a  b  c ] [x]
[v'] = [ d  e  f ] [y]
[w ]   [ g  h  i ] [1]
with u = u'/w and v = v'/w.
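All three families can be built as 3×3 homogeneous matrices and applied through one common routine (a sketch with NumPy; the helper names are mine, and the parameters mirror the notation above):

```python
import numpy as np

def similarity(s, theta, tx, ty):
    """Uniform scale s, rotation theta, translation (tx, ty)."""
    c, si = s * np.cos(theta), s * np.sin(theta)
    return np.array([[c, -si, tx], [si, c, ty], [0.0, 0.0, 1.0]])

def affine(a, b, c, d, e, f):
    return np.array([[a, b, c], [d, e, f], [0.0, 0.0, 1.0]])

def projective(a, b, c, d, e, f, g, h):
    # i is fixed to 1, as noted above
    return np.array([[a, b, c], [d, e, f], [g, h, 1.0]])

def apply_transform(M, x, y):
    """Apply a 3x3 transform to (x, y); the divide by w handles the projective case."""
    u, v, w = M @ np.array([x, y, 1.0])
    return u / w, v / w
```

For the similarity and affine matrices w is always 1, so the final division is a no-op; for the projective matrix it performs the perspective divide.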
Image Warping
Given a coordinate transform function (f, g) or (F, G) and a source image S(x,y), how do we compute the transformed image T(u,v)?
Forward Warping
Forward warping algorithm:
    for y = ymin to ymax
        for x = xmin to xmax
            u = f(x,y); v = g(x,y)
            copy pixel at source S(x,y) to T(u,v)
- Any problems with forward warping?
Forward Warping
Q: What if a transformed pixel lands between pixels?
A: Distribute its color among the neighboring pixels; this is known as "splatting".
Forward Warping
• Iterate over the source, sending pixels to the destination.
• Some source pixels map to multiple destination pixels.
• Some destination pixels may have no corresponding source pixel, leaving holes in the reconstruction.
• Must splat, etc.
    for y = ymin to ymax
        for x = xmin to xmax
            u = f(x,y); v = g(x,y)
            copy pixel at source S(x,y) to T(u,v)
- How do we remove the holes?
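A minimal forward warp makes the hole problem concrete. This sketch (helper names and the 2x-upscale map are my own illustration) upscales a 2x2 image by a factor of two: only one destination pixel in four receives a source pixel, and the rest stay holes.

```python
import numpy as np

def forward_warp(S, f, g, out_shape):
    """Forward warping: iterate over the source, send pixels to T.
    Destination pixels that nothing maps to keep the hole marker (-1)."""
    T = np.full(out_shape, -1.0)  # -1 marks holes
    h, w = S.shape
    for y in range(h):
        for x in range(w):
            u, v = f(x, y), g(x, y)
            ui, vi = int(round(u)), int(round(v))
            if 0 <= vi < out_shape[0] and 0 <= ui < out_shape[1]:
                T[vi, ui] = S[y, x]
    return T

S = np.arange(4.0).reshape(2, 2)
# h is a 2x upscale: (u, v) = (2x, 2y)
T = forward_warp(S, lambda x, y: 2 * x, lambda x, y: 2 * y, (4, 4))
```

Here 4 of the 16 destination pixels are filled and 12 remain holes, which is exactly what splatting or inverse warping must repair.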
Inverse Warping
Inverse warping algorithm:
    for v = vmin to vmax
        for u = umin to umax
            x = F(u,v); y = G(u,v)
            copy pixel at source S(x,y) to T(u,v)
Inverse Warping
Q: What if a pixel comes from "between" two pixels?
A: Interpolate its color from the neighboring pixels.
Inverse Warping
• Iterate over the destination, fetching pixels from the source.
• Source locations are non-integer, so the source image must be resampled.
• May oversample the source, but there are no holes.
• Simpler and better than forward mapping.
    for v = vmin to vmax
        for u = umin to umax
            x = F(u,v); y = G(u,v)
            copy pixel at source S(x,y) to T(u,v)
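The same 2x upscale done by inverse warping leaves no holes: every destination pixel looks up a source location (a sketch of my own, using nearest-neighbor rounding for the resampling step):

```python
import numpy as np

def inverse_warp(S, F, G, out_shape):
    """Inverse warping: iterate over the destination, fetch from the source."""
    T = np.zeros(out_shape)
    for v in range(out_shape[0]):
        for u in range(out_shape[1]):
            x, y = F(u, v), G(u, v)
            # Nearest-neighbor resampling, clamped to the source bounds
            xi = min(max(int(round(x)), 0), S.shape[1] - 1)
            yi = min(max(int(round(y)), 0), S.shape[0] - 1)
            T[v, u] = S[yi, xi]
    return T

S = np.arange(4.0).reshape(2, 2)
# Inverse of the 2x upscale: (x, y) = (u/2, v/2)
T = inverse_warp(S, lambda u, v: u / 2.0, lambda u, v: v / 2.0, (4, 4))
```

Every destination pixel gets a value; the cost is that source pixels may be sampled more than once, which is where the resampling filters below come in.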
Resampling
This is a 2D signal reconstruction problem!
Resampling Filter

Review: Signal Reconstruction in the Frequency Domain
Sampling f(x) at spacing T produces fs(x), whose spectrum Fs(u) replicates F(u) at intervals of 1/T. If f is band-limited to fmax, the original spectrum is recovered by multiplying with an ideal low-pass (box) filter:
F(u) = Fs(u) · box(u)
(The Fourier transform takes us from the spatial to the frequency domain; the inverse Fourier transform takes us back.)

Review: Signal Reconstruction in the Spatial Domain
Multiplication in the frequency domain is convolution in the spatial domain, so ideal reconstruction convolves the samples with a sinc:
f(x) = fs(x) * sinc(x)
Resampling
Compute a weighted sum over a pixel neighborhood:
- Weights are normalized values of a kernel function
- Equivalent to convolution at the samples with the kernel
- Find good filters using the ideas of previous lectures
Point Sampling (Nearest Neighbor)
- Copies the color of the pixel with the closest integer coordinates.
- A fast and efficient way to process textures if the size of the target is similar to the size of the reference.
- Otherwise, the result can be a chunky, aliased, or blurred image.
Bilinear Filter
Weighted sum of the four neighboring pixels.
Bilinear Filter
Sampling at S(x,y), where (i,j) is the top-left of the four surrounding samples, a is the horizontal fractional offset, and b is the vertical fractional offset:

S(x,y) = (1-a)*(1-b)*S(i,j) + a*(1-b)*S(i,j+1)
       + (1-a)*b*S(i+1,j) + a*b*S(i+1,j+1)

To optimize the above, use three linear interpolations:
Si = S(i,j) + a*(S(i,j+1) - S(i,j))
Sj = S(i+1,j) + a*(S(i+1,j+1) - S(i+1,j))
S(x,y) = Si + b*(Sj - Si)
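The optimized three-lerp form of bilinear interpolation translates directly into code (a sketch; a and b are the fractional offsets of (x, y) inside the pixel cell, with (i, j) the top-left sample):

```python
import numpy as np

def bilinear(S, x, y):
    """Bilinear resampling of image S at real-valued coordinates (x, y)."""
    j, i = int(np.floor(x)), int(np.floor(y))  # top-left integer sample (i, j)
    a, b = x - j, y - i                        # fractional offsets in [0, 1)
    # Two horizontal lerps, then one vertical lerp (the optimized form)
    Si = S[i, j] + a * (S[i, j + 1] - S[i, j])
    Sj = S[i + 1, j] + a * (S[i + 1, j + 1] - S[i + 1, j])
    return Si + b * (Sj - Si)

S = np.array([[0.0, 1.0], [2.0, 3.0]])
```

Sampling at the cell center averages all four pixels; sampling exactly on a grid point returns that pixel unchanged.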
Anisotropic Filter
Anisotropic means "non-uniform shape":
- Used in texture mapping
- A circle in screen space corresponds to an ellipse in texture space
- An isotropic filter in screen space therefore corresponds to an anisotropic filter in texture space
- Changing the viewpoint results in different filters
- Calculated at run time
- Many calculations are needed to draw a single pixel
Comparison of Resampling Filters
Inverse Warping and Resampling
Inverse warping algorithm:
    for v = vmin to vmax
        for u = umin to umax
            float x = F(u,v); float y = G(u,v);
            T(u,v) = resample_source(x, y, w);
Outline
Image warping
Image registration
Image Registration
Image warping: given h and f, compute g: g(x) = f(h(x))
Image registration: given f and g, compute h
Why Image Registration?
Lots of uses:
- Correcting for camera jitter (stabilization)
- Aligning images (mosaics)
- View morphing
- Special effects
- Image-based modeling/rendering
- Etc.
[Seitz 96]
Image Registration
How do we align two images automatically?
Two broad approaches:
- Feature-based alignment
  • Find a few matching features in both images
  • Compute the alignment from them
- Direct (pixel-based) alignment
  • Search for the alignment where most pixels agree
Outline
Image registration
- feature-based approach
- pixel-based approach
Feature-based Alignment
1. Find a few important features (a.k.a. interest points)
2. Match them across the two images
3. Compute the image transformation function h

How to choose features:
- Choose only points ("features") that are salient, i.e., likely to be present in the other image.
- How do we find these features?
  • Windows whose second-moment (gradient covariance) matrix has two large eigenvalues
  • Harris corner detector
  • Better features (SIFT features)
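A bare-bones Harris response illustrates the "two large eigenvalues" idea: R = det(M) − k·trace(M)² is large only where both eigenvalues of the gradient covariance matrix M are large. This is a sketch of my own (3×3 box window for the sums, and k = 0.04 as is conventional):

```python
import numpy as np

def harris_response(img, k=0.04):
    """Harris corner response R = det(M) - k * trace(M)^2 per pixel."""
    Iy, Ix = np.gradient(img.astype(float))  # image gradients (rows, cols)
    # Entries of the 2x2 gradient covariance (second-moment) matrix M,
    # summed over a 3x3 window around each pixel.
    Ixx, Iyy, Ixy = Ix * Ix, Iy * Iy, Ix * Iy
    def box3(A):
        P = np.pad(A, 1, mode="edge")
        return sum(P[dy:dy + A.shape[0], dx:dx + A.shape[1]]
                   for dy in range(3) for dx in range(3))
    Sxx, Syy, Sxy = box3(Ixx), box3(Iyy), box3(Ixy)
    return Sxx * Syy - Sxy ** 2 - k * (Sxx + Syy) ** 2

# A synthetic corner: a bright quadrant in a dark image
img = np.zeros((9, 9))
img[4:, 4:] = 1.0
```

On this image the response is positive at the corner, negative along the edges (one large eigenvalue), and zero in flat regions, which is exactly the behavior the detector thresholds on.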
Feature Detection
- Two images taken at the same place from different angles
- Related by a 3×3 projective transformation H
Feature Matching
- Two images taken at the same place from different angles
- Related by a 3×3 projective transformation H
How do we match features across images? What criterion?
Feature Matching
• Feature similarity (intensity/color or SIFT signature)
  - The pixels around corresponding features should have similar intensities
  - Measured by the sum of squared differences (SSD) or normalized cross-correlation
• Distance constraint
  - The displacement of matched features should be smaller than a given threshold
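Both similarity measures are short to write down. A sketch over grayscale patches (NumPy arrays of equal shape):

```python
import numpy as np

def ssd(P, Q):
    """Sum of squared differences: lower means a better match."""
    return float(np.sum((P - Q) ** 2))

def ncc(P, Q):
    """Normalized cross-correlation: in [-1, 1], higher means a better match.
    Invariant to affine intensity changes (gain and bias)."""
    p = P - P.mean()
    q = Q - Q.mean()
    return float(np.sum(p * q) / np.sqrt(np.sum(p * p) * np.sum(q * q)))
```

SSD is cheap but sensitive to exposure differences between the two images; NCC discounts a per-patch gain and bias, so a patch matches a brightened copy of itself perfectly.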
Feature-space Outlier Rejection
Can we now compute H from the remaining (blue) matches?
- No! There are still too many outliers...
- What can we do?
Robust estimation!
Robust Estimation: A Toy Example
How do we fit a line to a set of 2D points that contains outliers?
RANSAC: an iterative method for estimating the parameters of a mathematical model from a set of observed data that contains outliers.
RANSAC: RANdom SAmple Consensus
Objective: a robust fit of a model to a data set S that contains outliers.
Algorithm:
(i) Randomly select a sample of s data points from S and instantiate the model from this subset.
(ii) Determine the set of data points Si that are within a distance threshold t of the model. Si is the consensus set of the sample and defines the inliers of S.
(iii) If the size of Si is greater than some threshold T, re-estimate the model using all the points in Si and terminate.
(iv) If the size of Si is less than T, select a new subset and repeat the above.
(v) After N trials, the largest consensus set Si is selected, and the model is re-estimated using all the points in Si.
RANSAC
Repeat M times:
- Sample the minimal number of matches needed to estimate the two-view relation (affine, perspective, etc.).
- Calculate the number of inliers (or the posterior likelihood) for the relation.
- Choose the relation that maximizes the number of inliers.
RANSAC Line Fitting Example
Task: estimate the best line.
- Sample two points.
- Fit a line through them.
- Count the total number of points within a threshold distance of the line.
- Repeat until a good result is obtained.
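The loop above is only a few lines of code. This is a sketch (lines through two sampled points, inliers by perpendicular distance; for brevity it returns the best sampled line rather than performing the final least-squares refit):

```python
import random
import numpy as np

def ransac_line(points, n_iters=200, threshold=0.1, seed=0):
    """Fit a line a*x + b*y + c = 0 (with a^2 + b^2 = 1) to noisy points."""
    rng = random.Random(seed)
    best_line, best_inliers = None, 0
    pts = np.asarray(points, dtype=float)
    for _ in range(n_iters):
        # Minimal sample: two points determine a line
        (x1, y1), (x2, y2) = rng.sample(list(map(tuple, pts)), 2)
        a, b = y2 - y1, x1 - x2            # normal to the segment direction
        norm = np.hypot(a, b)
        if norm == 0:
            continue
        a, b = a / norm, b / norm
        c = -(a * x1 + b * y1)
        # Consensus set: points within a perpendicular-distance threshold
        dist = np.abs(a * pts[:, 0] + b * pts[:, 1] + c)
        inliers = int(np.sum(dist < threshold))
        if inliers > best_inliers:
            best_line, best_inliers = (a, b, c), inliers
    return best_line, best_inliers

# 8 points on the line y = x plus two gross outliers
pts = [(float(i), float(i)) for i in range(8)] + [(0.0, 5.0), (6.0, 1.0)]
line, n_in = ransac_line(pts)
```

A least-squares fit to all ten points would be dragged toward the outliers; RANSAC recovers the line through the eight inliers and simply ignores the other two.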
How Many Samples?
Choose N so that, with probability p, at least one random sample of s matches is free from outliers (e.g., p = 0.99). With outlier proportion e:

1 - (1 - (1 - e)^s)^N = p
N = log(1 - p) / log(1 - (1 - e)^s)

         proportion of outliers e
s     5%  10%  20%  25%  30%  40%  50%
2      2    3    5    6    7   11   17
3      3    4    7    9   11   19   35
4      3    5    9   13   17   34   72
5      4    6   12   17   26   57  146
6      4    7   16   24   37   97  293
7      4    8   20   33   54  163  588
8      5    9   26   44   78  272 1177

An affine transform needs s = 3 matches; a projective transform needs s = 4.
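The sample-count formula can be checked directly against the table above (a small sketch; the function name is mine):

```python
import math

def num_samples(p, e, s):
    """Number of RANSAC trials N so that, with probability p, at least one
    sample of size s is outlier-free, given outlier proportion e."""
    return math.ceil(math.log(1 - p) / math.log(1 - (1 - e) ** s))
```

For instance, estimating a projective transform (s = 4) with 50% outliers at p = 0.99 requires 72 samples, matching the table entry.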
RANSAC for Estimating a Projective Transformation
RANSAC loop:
1. Select four feature pairs (at random).
2. Compute the transformation matrix H (exactly).
3. Compute the inliers, where SSD(pi', H pi) < ε.
4. Keep the largest set of inliers.
5. Re-compute a least-squares estimate of H on all of the inliers.
Feature-based Registration
- Works for small or large motion.
- Models the motion within a patch or the whole image using a parametric transformation model.
- How do we deal with motions that cannot be described by a small number of parameters?
Outline
Image registration
- feature-based approach
- pixel-based approach
Next Lecture
More on image registration
Image mosaicing