Chap. 10 computational photography
Computational photography
Kazuhiro Sato
Computer Vision: Algorithms and Applications
Chapter 10
What is Computational Photography?
To enhance or extend the capabilities of digital photography
Contents
1. Photometric calibration
2. High dynamic range imaging
3. Super-resolution and blur removal
4. Image matting and compositing
5. Texture analysis and synthesis
1. Photometric calibration
By the way, how is an image formed?
imaging pipeline
First, we characterize:
• the mapping functions
• the amounts of noise
The mapping functions consist of:
• the radiometric response function
• vignetting
• the point spread function
1.1 Radiometric Response Function
[Figure: final brightness. Light from the scene (scene radiance) passes through the optics to the sensor plane; the response f maps the sensor irradiance to the final brightness I, and is often NOT linear.]
How can we determine the function?
• calibration chart
• polynomial approximation: I = Σ_{k=0}^{N} c_k E^k
• least squares (explained later)
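With paired chart irradiances and measurements, the polynomial fit above is a linear least-squares problem. A minimal NumPy sketch (the irradiances `E`, values `I`, and degree are illustrative):

```python
import numpy as np

def fit_response_polynomial(E, I, degree=3):
    """Fit the polynomial response I = sum_k c_k * E**k by least squares.

    E : irradiances of the calibration-chart patches (assumed known)
    I : the pixel values measured for those patches
    """
    A = np.vander(E, degree + 1, increasing=True)  # columns: E^0, E^1, ...
    c, *_ = np.linalg.lstsq(A, I, rcond=None)
    return c

# Toy example: a quadratic response is recovered exactly
E = np.linspace(0.0, 1.0, 20)
I = E ** 2
c = fit_response_polynomial(E, I, degree=2)
```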
1.2 Noise Level Estimation
1. Segment the input image into smooth regions
2. Fit a linear function to each segment
3. Plot the standard deviation of the residuals against the pixel value
4. Fit a lower envelope to the plot; the envelope gives the estimated noise level
[Plot: estimated noise level σ versus pixel value]
[Liu et al. β08]
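A rough NumPy sketch of steps 1 to 3, with simplifying assumptions: binning by the local mean stands in for the segmentation, and the residual against a 3x3 box mean stands in for the per-segment linear fit; the lower-envelope fit of step 4 is omitted:

```python
import numpy as np

def estimate_noise_level(image, n_bins=32):
    """Sketch of intensity-dependent noise estimation.

    Pixels are binned by their local 3x3 mean; the standard deviation of
    the residual (pixel minus local mean) in each bin approximates the
    noise level at that intensity.
    """
    img = image.astype(np.float64)
    h, w = img.shape
    pad = np.pad(img, 1, mode='edge')
    local_mean = sum(pad[i:i + h, j:j + w]
                     for i in range(3) for j in range(3)) / 9.0
    residual = img - local_mean
    lo, hi = local_mean.min(), local_mean.max()
    idx = np.clip(((local_mean - lo) / (hi - lo + 1e-12) * n_bins).astype(int),
                  0, n_bins - 1)
    level = np.full(n_bins, np.nan)   # NaN marks bins with too few samples
    for b in range(n_bins):
        mask = idx == b
        if mask.sum() > 50:
            level[b] = residual[mask].std()
    return level
```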
1.3 Vignetting
・Radial Gradient Symmetry [Zheng et al. '08]
Model: Z(r, θ) = I(r, θ) V(r)
Z: the image with vignetting, I: the vignetting-free image, V: the vignetting function
Without vignetting, the histogram of radial gradients ∂I/∂r is nearly symmetric; vignetting skews it (asymmetry).
Find the optimal V that minimizes the asymmetry using maximum a posteriori (MAP) estimation:
V = argmax_V p(V | Z) ∝ argmax_V p(Z | V) p(V)
1.4 Optical Blur Estimation
・PSF Estimation [Joshi et al. '08]
Estimate the PSF kernel using a calibration pattern:
K = argmin_K ||B − D(I ⊗ K)||²
B: sensed (blurred) image, I: predicted (sharp) image, D: downsampling operator
Solve within a Bayesian framework using a maximum a posteriori (MAP) estimate:
argmax_K p(K | B) = argmax_K p(B | K) p(K) / p(B)
= argmin_K L(B | K) + L(K)   (negative log-probabilities)
with
L(B | K) = ||B − D(I ⊗ K)||² / σ²
L(K) = λ_K ||∇K||²
・Recovering the PSF without calibration
1. Fit step edges to the elongated edges in the image (these play the role of the calibration pattern)
2. Apply least squares at every pixel surrounded by edges:
K = argmin_K ||B − D(I ⊗ K)||²
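Since convolution is linear in K, the least-squares kernel estimate can be solved directly. A sketch that drops the downsampling operator D and uses an illustrative Tikhonov regularizer:

```python
import numpy as np

def estimate_psf(sharp, blurred, ksize=3, lam=1e-3):
    """Least-squares PSF estimate K = argmin ||B - I*K||^2 + lam ||K||^2.

    Every interior pixel of the blurred image gives one linear equation
    in the ksize*ksize kernel taps (the downsampling D is omitted here).
    """
    h, w = blurred.shape
    r = ksize // 2
    A, b = [], []
    for y in range(r, h - r):
        for x in range(r, w - r):
            patch = sharp[y - r:y + r + 1, x - r:x + r + 1]
            A.append(patch[::-1, ::-1].ravel())  # flipped patch => convolution
            b.append(blurred[y, x])
    A = np.asarray(A)
    b = np.asarray(b)
    # Tikhonov-regularized normal equations
    K = np.linalg.solve(A.T @ A + lam * np.eye(ksize * ksize), A.T @ b)
    return K.reshape(ksize, ksize)
```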
2. High Dynamic Range Imaging
Dynamic Range
[Figure: luminance on a log scale, 0.01 to 10,000 cd/m²: moonlight, indoor lighting, sunlight, with the ranges covered by the human eye and a CCD camera.]
The luminance range of the natural world is far wider than a camera sensor can capture in a single exposure!
Creating a properly exposed photo (High dynamic range imaging)
[Figure: several shots at different exposures are merged to create a properly exposed photo.]
How to create such a photo?
1. Estimate the radiometric response function
2. Estimate a radiance map
3. Tone map the resulting HDR image into an 8-bit one
1. Estimate the radiometric response function [Debevec et al. '97]
A pixel value z_ij (pixel i in the j-th exposure, with exposure time t_j) is
z_ij = f(E_i t_j),  i.e.  f⁻¹(z_ij) = E_i t_j
Defining g(z) = log f⁻¹(z):
g(z_ij) = log E_i + log t_j
Recover g and the log E_i by minimizing
min E = Σ_i Σ_j w(z_ij) [g(z_ij) − log E_i − log t_j]² + λ Σ_z w(z) g''(z)²
where w(z) is a hat function:
w(z) = z − z_min    if z ≤ (z_min + z_max)/2
w(z) = z_max − z    if z > (z_min + z_max)/2
[Plot: the recovered response curve, log E_i versus pixel value z_ij]
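The minimization is linear in the unknowns g(0), ..., g(255) and log E_i, so it can be solved with one least-squares call. A compact sketch in the spirit of the Debevec-Malik solver (the λ value and the choice of sample pixels are left to the caller):

```python
import numpy as np

def gsolve(Z, log_t, lam=50.0, n_levels=256):
    """Recover g(z) = log f^-1(z) and the log irradiances log E_i.

    Z     : (n_pixels, n_exposures) int array of pixel values z_ij
    log_t : log exposure time of each shot
    Builds the linear system of the slide: weighted data equations,
    a scale anchor g(mid) = 0, and lam-weighted second differences.
    """
    n, p = Z.shape
    zmid = (n_levels - 1) / 2.0

    def w(z):
        z = np.asarray(z, dtype=np.float64)
        return np.where(z <= zmid, z, (n_levels - 1) - z)

    A = np.zeros((n * p + n_levels - 1, n_levels + n))
    b = np.zeros(A.shape[0])
    k = 0
    for i in range(n):                      # data term
        for j in range(p):
            wij = w(Z[i, j])
            A[k, Z[i, j]] = wij
            A[k, n_levels + i] = -wij
            b[k] = wij * log_t[j]
            k += 1
    A[k, n_levels // 2] = 1.0               # fix the scale: g(mid) = 0
    k += 1
    for z in range(1, n_levels - 1):        # smoothness: lam * w(z) * g''(z)
        wz = lam * w(z)
        A[k, z - 1], A[k, z], A[k, z + 1] = wz, -2 * wz, wz
        k += 1
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x[:n_levels], x[n_levels:]
```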
2. Estimate a radiance map [Mitsunaga et al. '99]
Merge the input images taken at different exposures into a composite radiance map:
log E_i = Σ_j w(z_ij) [g(z_ij) − log t_j] / Σ_j w(z_ij)
where the weighting function can be taken as w(z) = g(z) / g'(z).
[Figure: corresponding pixels z_i,1, z_i,2 from the differently exposed inputs are merged into a radiance map (grayscale).]
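A sketch of the merge formula; here the hat weight from the response-recovery slide is used in place of w(z) = g(z)/g'(z):

```python
import numpy as np

def merge_radiance(Z, g, log_t):
    """Merge differently exposed shots into a log-radiance map.

    log E_i = sum_j w(z_ij) [g(z_ij) - log t_j] / sum_j w(z_ij)
    Z : (n_pixels, n_exposures) int pixel values, g : response table.
    """
    zmid = 127.5
    w = np.where(Z <= zmid, Z, 255 - Z).astype(np.float64) + 1e-6
    num = (w * (g[Z] - log_t[None, :])).sum(axis=1)
    return num / w.sum(axis=1)
```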
3. Tone map the resulting HDR image into an 8-bit one
[Figure: an HDR image stores many more bits per pixel than an ordinary 8-bit image (8 bits / pixel).]
[Figure: input HDR image; gamma applied to each channel; gamma applied to luminance only.]
This global approach fails to preserve details in regions with widely varying exposures.
2.1 Tone mapping
・Global tone mapping using a transfer curve [Larson et al. '97]
・Local tone mapping using a bilateral filter [Durand et al. '02]
This approach does not create visible halos around the edges.
[Figure: result with low-pass filtering (visible halos) versus result with bilateral filtering (no halos).]
・Gradient domain tone mapping [Fattal et al. '02]
Attenuate large gradients of the log-luminance H:
G(x) = H'(x) Φ(x)   (Φ: attenuation map)
The new luminance L_out is combined with the original color image:
C_out = (C_in / L_in)^s · L_out,   s ∈ [0.4, 0.6]
[Figure: attenuation map and tone-mapped result.]
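The color reattachment step is a one-liner; a sketch assuming per-pixel luminance arrays:

```python
import numpy as np

def reattach_color(C_in, L_in, L_out, s=0.5):
    """Recombine the compressed luminance with the original colors:
    C_out = (C_in / L_in)**s * L_out, with s in [0.4, 0.6]."""
    ratio = C_in / (L_in[..., None] + 1e-12)   # per-channel color ratio
    return (ratio ** s) * L_out[..., None]
```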
2.2 Flash photography [Petschnigg et al. '04]
Combine flash and non-flash images to achieve better photos.
The joint bilateral filter's kernel:
w(i, j, k, l) = exp(−((i − k)² + (j − l)²) / (2 σ_d²)) · exp(−||F(i, j) − F(k, l)||² / (2 σ_r²))
(the first factor is the domain kernel, the second the range kernel, evaluated on the flash image F)
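A sketch of a single joint-bilateral weight, with illustrative σ values; F is the guidance (flash) image:

```python
import numpy as np

def joint_bilateral_weight(F, i, j, k, l, sigma_d=2.0, sigma_r=0.1):
    """One joint-bilateral weight: a spatial (domain) kernel times a range
    kernel evaluated on the guidance image F (e.g. the flash shot)."""
    spatial = np.exp(-((i - k) ** 2 + (j - l) ** 2) / (2 * sigma_d ** 2))
    rng = np.exp(-np.sum((F[i, j] - F[k, l]) ** 2) / (2 * sigma_r ** 2))
    return spatial * rng
```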
3. Super-resolution and blur removal
3.1 Color image demosaicing
[Figure: the Bayer RGB pattern in a camera sensor is interpolated to a full-color RGB image.]
・Bayesian demosaicing with a two-color prior [Bennett et al. '06]
[Figure: original; bilinear interpolation; two-color model [Bennett et al. '06].]
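For comparison with the two-color method, plain bilinear demosaicing can be sketched as a normalized neighbor average per channel (the RGGB layout is an assumption of this sketch):

```python
import numpy as np

def _conv3(a, ker):
    """Same-size 3x3 correlation with zero padding (kernel is symmetric)."""
    pad = np.pad(a, 1)
    out = np.zeros_like(a)
    for dy in range(3):
        for dx in range(3):
            out += ker[dy, dx] * pad[dy:dy + a.shape[0], dx:dx + a.shape[1]]
    return out

def bilinear_demosaic(raw):
    """Bilinear interpolation of an RGGB Bayer mosaic: each channel's
    missing samples are filled by a normalized average of the known
    neighbors (a simple baseline; the Bayesian method is more elaborate)."""
    h, w = raw.shape
    masks = np.zeros((h, w, 3), dtype=bool)
    masks[0::2, 0::2, 0] = True   # R sites
    masks[0::2, 1::2, 1] = True   # G sites on R rows
    masks[1::2, 0::2, 1] = True   # G sites on B rows
    masks[1::2, 1::2, 2] = True   # B sites
    ker = np.array([[0.25, 0.5, 0.25],
                    [0.5,  1.0, 0.5],
                    [0.25, 0.5, 0.25]])
    out = np.zeros((h, w, 3))
    for c in range(3):
        num = _conv3(np.where(masks[:, :, c], raw, 0.0), ker)
        den = _conv3(masks[:, :, c].astype(float), ker)
        out[:, :, c] = num / np.maximum(den, 1e-12)
    return out
```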
4. Image matting and compositing
Image "matting" and "compositing"?
[Figure: alpha-matting turns an input image into an alpha matte (matting), which is then used to produce a composited output (compositing).]
What is the problem in matting and compositing?
foreground: α = 1
background: α = 0
boundary: 0 < α < 1
We need the opacity α!
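The compositing step is just the matting equation applied per pixel:

```python
import numpy as np

def composite(F, B, alpha):
    """Matting equation C = alpha * F + (1 - alpha) * B, with the scalar
    alpha map broadcast across the color channels."""
    a = alpha[..., None]
    return a * F + (1.0 - a) * B
```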
[Figure: a failed matting example: input, alpha matte, composite, and inset.]
4.1 Blue screen matting [Smith et al. '96]
Assume the foreground object contains no blue:
C_o = [R_o, G_o, 0],  sensed color C_s = [R_s, G_s, B_s],  backing C_k = [0, 0, B_k]
The compositing equations
R_o = R_s
G_o = G_s
(1 − α_o) B_k = B_s
solve to
∴ C_o = [R_o, G_o, 0, α_o] = [R_s, G_s, 0, 1 − B_s / B_k]
・Two-screen matting [Smith et al. '96]
Shoot the same foreground C_o = [R_o, G_o, B_o] over two known backings:
C_s1 = [R_s1, G_s1, B_s1],  C_k1 = [0, 0, B_k1]
C_s2 = [R_s2, G_s2, B_s2],  C_k2 = [0, 0, B_k2]
The blue-channel equations
B_o + (1 − α_o) B_k1 = B_s1
B_o + (1 − α_o) B_k2 = B_s2
solve to
∴ C_o = [R_o, G_o, B_o, α_o],  with
B_o = (B_s2 B_k1 − B_s1 B_k2) / (B_k1 − B_k2),  α_o = 1 − (B_s1 − B_s2) / (B_k1 − B_k2)
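The two-screen system solves in closed form; a sketch on the blue channel:

```python
import numpy as np

def triangulation_matte(Bs1, Bs2, Bk1, Bk2):
    """Two-screen (triangulation) matting on the blue channel:
    solves  Bo + (1 - alpha) Bk1 = Bs1  and  Bo + (1 - alpha) Bk2 = Bs2
    for the opacity alpha and the foreground blue Bo."""
    alpha = 1.0 - (Bs1 - Bs2) / (Bk1 - Bk2)
    Bo = Bs1 - (1.0 - alpha) * Bk1
    return alpha, Bo
```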
4.2 Natural image matting
[Figure: input, hand-drawn trimap, alpha map, and foreground composite [Chuang et al. '01].]
・Knockout [Berman et al. '00]
The matting equation: C = α F + (1 − α) B
1. Estimate F and B' as weighted sums of the nearby known foreground/background pixels:
F = Σ_i w_i F_i,  B' = Σ_i w_i B_i
2. Refine B' to B'' (with F B'' ∥ B B')
3. Estimate α by axis projection, per channel c ∈ {R, G, B}:
α_c = (c(C) − c(B)) / (c(F) − c(B))
α_final = Σ_{c=R,G,B} [c(F) − c(B)] α_c / Σ_{c=R,G,B} [c(F) − c(B)]
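A sketch of the axis projection; the absolute-value channel weighting here is an assumption of this sketch:

```python
import numpy as np

def projection_alpha(C, F, B):
    """Per-channel axis projection alpha_c = (c(C) - c(B)) / (c(F) - c(B)),
    combined with weights proportional to the channel spread c(F) - c(B).
    Channels where F and B nearly coincide are unreliable and count less."""
    d = F - B
    safe = np.where(np.abs(d) < 1e-6, 1e-6, d)   # avoid division by ~0
    alpha_c = (C - B) / safe
    wgt = np.abs(d)
    return float(np.sum(wgt * alpha_c) / max(np.sum(wgt), 1e-12))
```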
・Bayesian approach [Chuang et al. '01]
From the observation C, find the most likely estimates of F, B, and α:
argmax_{F, B, α} P(F, B, α | C)
Taking logs gives the terms
L(C | F, B, α) = −||C − α F − (1 − α) B||² / σ_C²
L(F) = −(F − F̄)ᵀ Σ_F⁻¹ (F − F̄) / 2
L(B): same as the above with the background statistics,  L(α): constant
・Comparison of natural image matting methods
4.3 Optimization-based matting
・Border matting [Rother et al. '04]
1. Get a trimap by hard segmentation:
T_F (α = 1),  T_U,  T_B (α = 0)
2. Minimize the energy of the mapping over the unknown band T_U:
E = Σ_{n ∈ T_U} D̃_n(α_n) + Σ_{t=1}^{T} Ṽ(Δ_t, σ_t, Δ_{t+1}, σ_{t+1})
(data term + smoothness term)
・Color line (closed-form) matting [Levin et al. '08]
1. Estimate α in each local window w_k as an affine function of color:
α_i^estimated = a_k · (C_i − B_k) = a_k · C_i + b_k
2. Minimize the deviations of the actual α from the above estimate:
E_α = Σ_k ( Σ_{i ∈ w_k} (α_i − α_i^estimated)² + ε ||a_k||² )
(data term + regularization term, with local values a_k, b_k per window)
3. Eliminate a_k and b_k to get a final energy in α alone.
4.4 Smoke, shadow, and flash matting
・Smoke matting [Chuang et al. '02]
[Figure: input frame; removing the foreground object; estimated alpha matte; insertion of new objects.]
・Shadow matting [Chuang et al. '03]
4.5 Video matting [Chuang et al. '02]
5. Texture analysis and synthesis
Texture synthesis is …
[Figure: a small patch is synthesized into a similar-looking larger patch.]
Texture synthesis using non-parametric sampling [Efros et al. '99] [Efros et al. '01]
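The idea can be shown with a 1-D toy version of non-parametric sampling (the real method works on 2-D neighborhoods and samples randomly among near-best matches):

```python
import numpy as np

def synthesize_row(sample, out_len, win=3):
    """1-D toy of non-parametric sampling: each new value is copied from
    the position in the sample whose preceding `win` values best match
    the tail of the output so far (deterministic argmin here)."""
    out = list(sample[:win])                     # seed with the sample start
    for _ in range(out_len - win):
        ctx = np.asarray(out[-win:], dtype=float)
        errs = [float(np.sum((sample[i:i + win] - ctx) ** 2))
                for i in range(len(sample) - win)]
        best = int(np.argmin(errs))
        out.append(sample[best + win])           # copy the pixel that follows
    return np.array(out)
```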
5.1 Hole filling and inpainting
[Figure: original image and inpainted result.]
・Exemplar-based inpainting [Criminisi et al. '04]
1. Compute patch priorities:
P(p) = C(p) D(p)
C(p) = Σ_{q ∈ Ψ_p ∩ (I − Ω)} C(q) / |Ψ_p|   (confidence term)
D(p) = |∇I_p^⊥ · n_p| / α   (data term)
2. Find the patch most similar to the top-priority patch Ψ_p̂:
Ψ_q̂ = argmin_{Ψ_q ∈ Φ} d(Ψ_p̂, Ψ_q)
3. Update the confidence values:
C(q) = C(p̂),  ∀ q ∈ Ψ_p̂ ∩ Ω
[Figure: "onion peel" filling order [Criminisi et al. '04]; original image and removed region.]
The gradient along the region boundary is preserved.
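The priority of step 1 can be sketched for a single boundary pixel (the isophote and normal vectors are assumed to be precomputed):

```python
import numpy as np

def patch_priority(confidence, mask, p, isophote, normal, half=4, alpha=255.0):
    """Criminisi-style priority P(p) = C(p) * D(p) for a boundary pixel p.

    C(p): sum of confidence over the known pixels of the patch / patch area
    D(p): |isophote . normal| / alpha
    `mask` is True where the image is known; patch bounds are assumed valid.
    """
    y, x = p
    sl = (slice(y - half, y + half + 1), slice(x - half, x + half + 1))
    known = mask[sl]
    C = confidence[sl][known].sum() / known.size
    D = abs(float(np.dot(isophote, normal))) / alpha
    return C * D
```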
5.2 Non-photorealistic rendering
Non-photorealistic rendering using texture synthesis:
・Texture transfer
・Image analogies
・Texture transfer [Efros et al. '01]
[Figure: input images and their texture-transfer outputs.]
1. Find the most similar block:
E_s = α E_overlap + (1 − α) E_guidance
(α = 1 means ordinary texture synthesis.)
2. Find the minimum error boundary cut through the overlap error surface
e = (B_1^ov − B_2^ov)²
of the overlapping blocks B_1^ov and B_2^ov.
(The minimal cost path corresponds to the valley in the error surface.)
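Step 2 is a dynamic program over the overlap error surface; a sketch:

```python
import numpy as np

def min_error_boundary_cut(e):
    """Minimal-cost vertical path through the overlap error surface
    e = (B1_ov - B2_ov)**2, as in the quilting boundary cut."""
    h, w = e.shape
    cost = e.copy()
    for y in range(1, h):                     # accumulate minimal costs
        for x in range(w):
            lo, hi = max(0, x - 1), min(w, x + 2)
            cost[y, x] += cost[y - 1, lo:hi].min()
    path = [int(np.argmin(cost[-1]))]         # cheapest bottom cell
    for y in range(h - 2, -1, -1):            # backtrack upward
        x = path[-1]
        lo, hi = max(0, x - 1), min(w, x + 2)
        path.append(lo + int(np.argmin(cost[y, lo:hi])))
    return path[::-1], cost[-1].min()
```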
・Image analogies [Hertzmann et al. '01]
A : A′ :: B : B′
Given the unfiltered example A, the filtered (e.g. NPR-filtered) example A′, and the target B, synthesize the result image B′.
[Figure: unfiltered example A, filtered example A′, target B, and synthesized result B′.]
Thank you!