Computer Generated Electronic Holography of Natural Scene from 2D Multi-view Images and Depth Map
Takanori Senoh, Kenji Yamamoto, Ryutaro Oi, Tomoyuki Mishina and Makoto Okui Universal Media Research Center, National Institute of Information and Communications
Technology {senoh, k.yamamoto, oi, mishina, m-okui}@nict.go.jp
Abstract
Electronic Holography, satisfying all 3D visual cues, is the most promising approach to ideal 3D image presentation. However, several problems have to be solved to realize high quality 3D images, such as handling huge amounts of data or insufficient display device capability. In order to solve these problems, this paper proposes an approach to high quality holographic image generation from 2D multi-view images and depth map. The approach includes phantom imaging elimination and occlusion-hole paving processes, as well as compensation of astigmatism and color aberration caused by the optical elements. 1. Introduction
Electronic holography, as illustrated in Figure 1, captures the light wave information emitted or reflected from 3D objects as fringe patterns on image sensors. The fringes encode light intensity as contrast, and light source direction and distance as density distribution. When the fringes on a display are illuminated with coherent light such as a laser, the light diffracts in the same direction and with the same intensity as when it was recorded. As electronic holography reconstructs images of the original 3D objects, satisfying all 3D visual cues, it is widely studied for use in future 3DTV systems [1][2][3][4][5].
[Figure: recording: object light O from a laser-illuminated object and reference light R via a half-mirror interfere on the hologram as a fringe pattern (intensity and phase), H = |O+R|²; reconstruction: illuminating the fringes with R diffracts RH = R|O+R|² = |O|²R + |R|²R + R²O* + |R|²O, reproducing the object light O in its original direction along with unnecessary light.]
Figure 1. Hologram recording and reconstruction
Obtaining fringe patterns directly by illuminating objects with laser light is not suitable for human subjects or infinitely distant scenes, so the Computer-Generated Hologram (CGH) method is used for live scenes. CGH calculates fringe patterns from the position of the 3D object (direction and distance) and its luminance, as shown in Figure 2.
[Figure: for each object point of luminance I at distance d in a given direction, the object light O = I·exp(−jkd)/d, k = 2π/λ, interferes with the reference light R to form the hologram H = |O+R|².]
Figure 2. Computer Generated Hologram
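The CGH calculation of Figure 2 can be sketched in a few lines. A minimal Python example, with a hypothetical single object point and grid; a unit-amplitude plane reference wave R = 1 is an assumption not stated in the figure:

```python
import cmath
import math

def cgh_fringe(points, xs, ys, wavelength):
    """Hologram intensity H = |O + R|^2 on the z = 0 plane.
    points: (x, y, z, luminance) object points in metres; plane reference R = 1 assumed."""
    k = 2 * math.pi / wavelength
    H = []
    for y in ys:
        row = []
        for x in xs:
            O = 0j
            for px, py, pz, I in points:
                d = math.sqrt((x - px) ** 2 + (y - py) ** 2 + pz ** 2)
                O += I * cmath.exp(-1j * k * d) / d   # spherical wave O = I exp(-jkd)/d
            row.append(abs(O + 1.0) ** 2)             # interfere with reference, record intensity
        H.append(row)
    return H

# hypothetical values: 32x32 patch at the 10.4 um LCoS pitch of Sec. 2.1, green laser,
# one object point 10 cm behind the hologram plane
pitch, lam = 10.4e-6, 532e-9
xs = [(i - 16) * pitch for i in range(32)]
H = cgh_fringe([(0.0, 0.0, 0.1, 1e-3)], xs, xs, lam)
```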
Object distance is captured by a depth camera or derived from multi-view images. The depth camera measures the fly-back time t of light illuminating the object and calculates the distance d = ct/2 (c = light speed), as shown in Figure 3(a) [6]. Another method measures the disparity D = (x1 − x2) between stereo camera images and then calculates the object distance d = fL/D (f = camera focal length, L = distance between cameras), as shown in (b) [7][8]. For accurate depth estimation, the stereo cameras should ideally be parallel and the image sensor gains identical; otherwise, image rectification and color matching are necessary [9].
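The stereo relation d = fL/D is direct to evaluate. A small sketch with hypothetical values chosen to match the Section 5.2 setup (f = 15 mm, L = 15 cm, object at 90 cm):

```python
def depth_from_disparity(x1, x2, f, L):
    """d = f*L/(x1 - x2): object distance from the disparity between parallel
    stereo cameras; f: focal length, L: baseline, x1/x2: matched pixel
    positions on the two sensors (same length units as f and L)."""
    D = x1 - x2
    if D <= 0:
        raise ValueError("disparity must be positive for an object in front of the cameras")
    return f * L / D

# a 2.5 mm on-sensor disparity with f = 15 mm and L = 15 cm gives d = 0.9 m
d = depth_from_disparity(0.0025, 0.0, 0.015, 0.15)
```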
[Figure: (a) a light source beside the camera illuminates the object, and the fly-back time t gives the distance d = ct/2 (c: light speed); (b) matched pixel positions x1 and x2 in image1 and image2 of two cameras with interval L and focal length f give the distance d = fL/(x1 − x2).]
(a) depth camera (b) disparity calculation
Figure 3. Object distance measurement
2008 Second International Symposium on Universal Communication
978-0-7695-3433-6/08 $25.00 © 2008 IEEE
DOI 10.1109/ISUC.2008.46
Once object distance (depth) is captured, image objects are placed in 3D space according to their depths, as illustrated in Figure 4. Then, CGH is calculated [10].
[Figure: color and depth images (MPEG MVC test sequence [11]) place object points of intensity Q in 3D space; the object light O = Q·exp(−jkd)/d at distance d and direction interferes with the reference light R to form the hologram H = |R+O|².]
Figure 4. Hologram generation from depth images
Hologram generation from images and a depth map has several advantages: (1) high-resolution 3D images at the camera resolution, (2) compact data, requiring only an additional depth map, and (3) a light computational load, achieved by placing the hologram in the middle of the scene [12]. However, it also has some drawbacks that degrade reconstructed image quality. One is the phantom imaging effect, where background objects are visible through foreground objects. Another is the occlusion hole, where background pixels are missing. In addition, the optical elements used for hologram image reconstruction cause astigmatism and color aberration.
This paper proposes novel methods, experimentally confirmed, for generating high quality electronic holography from 2D images plus a depth map. The methods include phantom imaging elimination and occlusion-hole paving, as well as astigmatism and color aberration compensation.

2. Astigmatism

2.1. Electronic holography reconstruction
Figure 5 illustrates a color optical system to reconstruct electronic holography.
[Figure: red, green and blue lasers, each expanded through a collimator and spatial filter, illuminate one LCD each with the red, green and blue hologram signals; relay lenses, a mirror and half-mirrors combine the three reconstructions into 3D color image lights.]
Figure 5. Electronic holography reconstruction system
The holography display consists of three LCoS panels for RGB color, each with 1440×1050 pixels at a 10.4 μm pixel pitch. The illumination sources are RGB lasers with wavelengths of 660, 532 and 475 nm. Half-mirrors combine the separately reconstructed RGB holographic images into a full-color image.
2.2. Astigmatism of half-mirror
As half-mirrors are placed diagonally at the cross-point of two intersecting lights, the incident light angles in the horizontal (45°) and vertical (90°) directions differ significantly. This difference causes outgoing light position differences between the horizontal and vertical directions, producing different converging points (astigmatism).
[Figure: a light source diffusing horizontally and vertically is observed through a half-mirror hit at 45° horizontally and 90° vertically; the image of the horizontally diffusing light converges at distance d and that of the vertically diffusing light at distance e, so horizontal and vertical lines of the original image focus at different points (astigmatism).]
Figure 6. Astigmatism caused by half-mirror
The distance e between the original light source point and the converging point of the vertically diffusing light after exiting the mirror is given in equation (1).
e = 2t·cos(α−β)·(tanθ − tanφ) / [{cos(α−β) + sin(α−β)}·tanθ]  (1)
β = sin⁻¹(sinα / n),  α = π/4  (2)
φ = sin⁻¹(sinθ / n),  θ = diffusion angle  (3)
where α = horizontal incident angle of the horizontal light, β = refraction angle of the horizontal light, θ = vertical incident angle of the vertically diffusing light, φ = refraction angle of the vertically diffusing light, t = mirror thickness and n = refractive index of the half-mirror.
The distance d between the original light source point and the converging point of the horizontally diffusing light after exiting the mirror is given in equation (4).
d = t·{(tanα − tanβ) − (tanγ − tanδ)} / (tanα − tanγ)  (4)
γ = α − θ,  δ = sin⁻¹(sinγ / n)  (5)

where γ = horizontal incident angle of the horizontally diffusing light and δ = horizontal refraction angle of the horizontally diffusing light. The difference between e and d is the amount of astigmatism. As shown in Figure 7, astigmatism is proportional to the mirror thickness and only weakly dependent on the incident light diffusion angle when that angle is small.
[Figure: astigmatism Z0 (mm) versus diffusion angle θ (deg) over 0-2°, for mirror thicknesses t = 10, 20, 30 and 40 mm.]
Figure 7. Astigmatism dependency
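The refraction angles entering equations (1)-(5) follow Snell's law. A small Python sketch; the BK7-like index n = 1.5168 and the 1° diffusion angle are assumed example values:

```python
import math

def refraction_angles(n=1.5168, theta_deg=1.0):
    """Angles used in equations (1)-(5): alpha = pi/4 chief-ray incidence,
    beta = asin(sin(alpha)/n) per eq. (2), phi = asin(sin(theta)/n) per eq. (3)
    for the vertically diffusing light, and gamma = alpha - theta,
    delta = asin(sin(gamma)/n) per eq. (5) for the horizontally diffusing light."""
    alpha = math.pi / 4
    theta = math.radians(theta_deg)
    beta = math.asin(math.sin(alpha) / n)   # refraction of the chief ray
    phi = math.asin(math.sin(theta) / n)    # refraction of the vertical diffusion
    gamma = alpha - theta                   # horizontal diffused-ray incidence
    delta = math.asin(math.sin(gamma) / n)  # its refraction angle
    return alpha, beta, theta, phi, gamma, delta

alpha, beta, theta, phi, gamma, delta = refraction_angles()
```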
2.3. Astigmatism compensation
As astigmatism is a phenomenon whereby a light source distance is observed differently between horizontally diffusing light and vertically diffusing light, light sources having inverse astigmatism can compensate for it. A linear light source with an appropriate initial phase has inverse astigmatism, as illustrated in Figure 8. The initial phase is obtained by inverting the sign of the phase of light arriving at the line source from a point light source placed at the astigmatism distance, starting with phase 0. Such a linear light source emits light that diverges horizontally but converges vertically at the astigmatism distance.
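The initial phase of the compensating linear source, ψ(0, y, 0) = k(Z0 + y²/2Z0) from Figure 8, can be tabulated directly. A short sketch with hypothetical sample points and an assumed astigmatism distance Z0 = 5 mm:

```python
import math

def inverse_astigmatism_phase(ys, z0, wavelength):
    """Initial phase psi(0, y, 0) = k (Z0 + y^2 / 2 Z0) along the linear light
    source (Figure 8): the sign-inverted phase of a point source placed at the
    astigmatism distance Z0, making the emitted light converge vertically at Z0."""
    k = 2 * math.pi / wavelength
    return [k * (z0 + y * y / (2 * z0)) for y in ys]

# hypothetical values: three points along y, Z0 = 5 mm, green laser
phase = inverse_astigmatism_phase([0.0, 0.5e-3, 1.0e-3], 5e-3, 532e-9)
```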
[Figure: a linear light source along the y axis with initial phase ψ(0,y,0) = k(Z0 + y²/2Z0) replaces the original object point I(0,0,Z0); it emits horizontally diverging but vertically converging light, converging at the astigmatism distance Z0; hologram pixel H(XH,YH,ZH).]
Figure 8. Astigmatism compensating light source

2.4. Astigmatism compensation results
Figure 9 shows the results of astigmatism compensation, obtained by replacing the original object points with the above linear light sources. The astigmatism distance Z0 can be calculated as mentioned in 2.3, or obtained by measuring the distance between the lattice image's horizontal and vertical focal points. When the light diffusion angle is small (several degrees), a fixed compensation value works well; when the diffusion angle becomes larger, the shaped initial phase of the linear light source compensates the astigmatism. As the astigmatism distance is proportional to the thickness of the half-mirror, thinner half-mirrors yield less astigmatism. However, the unevenness of thin mirrors may cause image distortion [13].
Lattice: (a)(b) with astigmatism (c) compensated
Point: (a)(b) with astigmatism (c) compensated
Figure 9. Astigmatism compensation results
3. Color aberration

3.1. Color aberration of lens
In Figure 5, based on the single-sideband (SSB) method [14], relay lenses are used to eliminate unnecessary lights, such as incident illumination light transmitted straight without diffraction and conjugate light diffracting in directions other than that of the original object light. When a common relay lens is used to keep the optical system small, color aberration is caused by the difference in lens focal length among the RGB lights.
[Figure: the lens focal length f = r/{2(n − 1)} (r: curvature radius, n: refractive index) differs depending on color; for f = 500 mm at λ = 588 nm, the focal-length differences are δ ≈ 9 mm and δ ≈ 18 mm, so the hologram of a color object reconstructs an image with color aberration.]
Figure 10. Color aberration caused by lens
Because the inter-lens distance is physically fixed, the relay-lens system becomes imperfect at different wavelengths. Hence, the output image sizes and positions vary depending on the color (color aberration), as shown in Figure 10.
3.2. Color aberration compensation
This aberration can be compensated by adjusting the RGB object sizes and positions so that the output images have the same size and position among the RGB colors. The compensating object distance D' from the relay-lens input and the compensating object size A' are given in equations (6) and (7).
D' = f1{4f0(f0 − f1) − D(2f0 − f1)} / {4f0² − 6f0f1 + f1² − 2D(f0 − f1)}  (6)

A' = A{2D'(f1 − f0) + f0(2f1 − f0)} / f1²  (7)
where f0 = lens focal length at the normal wavelength (e.g. green), f1 = lens focal length at another wavelength (e.g. red or blue), D = original object distance from the relay lens and A = original object size measured from the optical axis. As shown in Figure 11, the compensating object distance differs from the original by about 2(f1 − f0), and the compensating object size is almost the same as the original object size. Here, the relay-lens glass is ordinary BK7 and the focal length is 500 mm at a wavelength of 588 nm.
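The compensation can be evaluated numerically. A minimal Python sketch of equations (6) and (7); the closed forms used here are a reconstruction, and the red focal length f1 = 509 mm is a hypothetical value based on the roughly 9 mm focal difference of the 500 mm BK7 relay lens:

```python
def compensate_color_aberration(D, A, f0, f1):
    """Compensating object distance D' and size A' (equations (6), (7),
    reconstructed forms). They satisfy the two stated properties:
    D' -> D as f1 -> f0, and D' - D ~ 2(f1 - f0)."""
    Dp = f1 * (4 * f0 * (f0 - f1) - D * (2 * f0 - f1)) / (
        4 * f0 ** 2 - 6 * f0 * f1 + f1 ** 2 - 2 * D * (f0 - f1))
    Ap = A * (2 * Dp * (f1 - f0) + f0 * (2 * f1 - f0)) / f1 ** 2
    return Dp, Ap

# hypothetical values: green f0 = 500 mm, red f1 = 509 mm, object at 500 mm, 20 mm tall
Dp, Ap = compensate_color_aberration(D=0.5, A=0.02, f0=0.5, f1=0.509)
```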
[Figure: compensated object distances D'R (dashed) and D'B (dotted) plotted against the normal object distance DG over 0-600 mm, and compensated object sizes A'R (dashed) and A'B (dotted) plotted against the normal object size AG over 0-40 mm; both sets of curves follow the diagonal closely.]
Figure 11. Color aberration compensation
3.3. Color aberration compensation results
As shown in Figure 11, color aberration can be compensated by simply shifting the object position without changing its size. Besides the lens, half-mirrors also cause color aberration when the RGB holographic lights pass through different numbers of mirrors. Figure 12 shows the result of color aberration compensation for both lenses and mirrors, achieved by adjusting the RGB hologram (LCD) positions [15].
(a) with aberration (b) compensated
Figure 12. Color aberration compensation results
4. Phantom imaging

4.1. Holography from depth image
When reconstructing 3D object space from a depth image, image space is preferable to the original object space, as illustrated in Figure 13. Image space is a non-linear 3D space (x,y,z) = (X,Y,Z)·f/(Z−f), into which real objects at (X,Y,Z) are projected through a camera lens of focal length f. The advantages of this space are: (1) simplicity: disparity is directly used as the object distance; (2) compactness: distant objects have small size and little depth; and (3) consistency with human vision: observing the images from one focal length away produces the same retinal images as the real objects [16].
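The mapping into image space is a single projective scaling; a minimal sketch:

```python
def to_image_space(X, Y, Z, f):
    """Project a real-object point (X, Y, Z) through a lens of focal length f
    into the non-linear image space (x, y, z) = (X, Y, Z) * f / (Z - f).
    Distant objects compress: as Z grows, z approaches the focal length f."""
    s = f / (Z - f)
    return X * s, Y * s, Z * s

# hypothetical values: a point 100 m away seen through a 15 mm lens
x, y, z = to_image_space(1.0, 0.0, 100.0, 0.015)
```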
[Figure: a lens of focal length f projects real 3D space (X,Y,Z) at depth D into image space (−x,−y,−z); infinity maps to the infinity plane at distance f, the hologram lies within the image space, and an observer at the focal distance receives the same retinal image as when viewing the real object.]
Figure 13. Image space
4.2. Phantom imaging in depth to holography
As camera images are single-view images with hidden surfaces removed, no hidden surface of a background object is visible when observed from the original camera position. However, as the hologram is calculated with diffusing lights from object points, some lights may penetrate foreground objects, as illustrated in Figure 14.
[Figure: a 2D image and depth map place foreground and background objects in 3D space; part of the background object's diffusion area passes through the foreground object (phantom imaging area) and reaches the hologram, so the holographic image focused on the foreground object shows the background through it.]
Figure 14. Phantom imaging effect
These lights cause the phantom imaging effect and degrade image quality. Especially when foreground objects are dark, they are not perceived if background lights penetrate them. The phantom imaging effect is removed by introducing a hidden-surface removal process into the hologram calculation [17]. The ordinary z-buffer method of computer graphics [18] cannot be directly applied to holography: as hologram pixel values are the sum of diffusing lights in equation (8), simply replacing a pixel value with the most foreground object light destroys the sum.
H = Σ I(X,Y)·cos{(2π/λ)·(Z + (X² + Y²)/2Z)}  (8)
Equation (8) is a Fresnel diffraction equation assuming |Z| >> |X|, |Y|. I(X,Y) is the luminance of the object pixel at point (X,Y), and Z is the distance between the object pixel and the hologram plane. The origin of the (X,Y,Z) coordinate system is at the current hologram pixel where the summation is performed, and Σ is the summation over the light diffusion area reaching that pixel. A novel phantom imaging elimination process integrated into this summation is described in the next section.

4.3. Phantom imaging elimination process
This process uses a z-buffer-like occlusion map for each hologram pixel, as shown in Figure 15. The map address is the diffusing light angle to the hologram pixel, and the map covers the whole diffusion area, which is determined by the hologram pixel pitch p and the light wavelength λ as arcsin(λ/2p). When summing up lights from object pixels with equation (8), their Z values are compared with the depth value stored in the occlusion map at the address given by the light diffusion angle. If the Z value shows that the current object point is in front of the stored one, the stored depth is replaced with the Z value of the currently added object light. In order to prevent delayed emission of the foremost object pixel, the scanning order of object pixels is determined as follows.
(a) First, object pixels in front of the hologram plane (Z < 0) are scanned from the outskirts to the center of the diffusion area, summing up object lights and checking the occlusion map values as mentioned above.
(b) Then, object pixels behind the hologram plane (Z > 0) are scanned from the center to the outskirts of the diffusion area, applying the same rule as in (a).
With this algorithm, overlapping foreground objects always come out earlier than background objects. Overlapping background objects are simply skipped in the light summation process; hence hologram pixel value replacement never occurs. In front of the hologram plane, if a foreground object is closer to the center than background objects in the outskirts, its light is added after the background object lights; in this case, however, the background object lights never penetrate the foreground object. In the same manner, behind the hologram plane, if a background object is farther from the center than foreground objects, its light never penetrates the foreground objects. Hence, the phantom imaging effect is eliminated.
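The scanning rules above can be sketched for a single hologram pixel. A simplified 1-D Python illustration, not the authors' implementation; the bin count, the plane reference wave R = 1 and the point values are assumptions:

```python
import cmath
import math

def hologram_pixel(points, wavelength, pitch, n_bins=64):
    """Sum object lights at one hologram pixel (at the origin) with a per-pixel
    occlusion map over the diffusion area (Section 4.3 sketch, 1-D).
    points: (X, Z, I) object pixels; Z < 0 is in front of the hologram plane,
    Z > 0 behind it. A light is skipped when a nearer point (smaller Z)
    already occupies its direction bin."""
    theta_max = math.asin(wavelength / (2 * pitch))    # maximal diffraction angle arcsin(lambda/2p)
    k = 2 * math.pi / wavelength
    occ = [None] * n_bins                              # occlusion map, addressed by light angle

    def angle_bin(X, Z):
        a = math.atan2(X, abs(Z))                      # diffusion angle toward the pixel
        if abs(a) > theta_max:
            return None                                # outside the diffusion area
        return int((a + theta_max) / (2 * theta_max) * (n_bins - 1))

    # (a) in front of the plane: outskirts -> center; (b) behind: center -> outskirts
    front = sorted((p for p in points if p[1] < 0), key=lambda p: -abs(p[0]))
    back = sorted((p for p in points if p[1] >= 0), key=lambda p: abs(p[0]))
    O = 0j
    for X, Z, I in front + back:
        b = angle_bin(X, Z)
        if b is None:
            continue
        if occ[b] is not None and occ[b] <= Z:         # a nearer point already emitted here
            continue
        occ[b] = Z if occ[b] is None else min(occ[b], Z)
        d = math.hypot(X, Z)
        O += I * cmath.exp(-1j * k * d) / d            # Fresnel-type contribution, eq. (8) style
    return abs(O + 1.0) ** 2                           # interfere with plane reference R = 1

# a background point hidden exactly behind a foreground point contributes nothing
H_both = hologram_pixel([(0.0, 0.01, 1.0), (0.0, 0.02, 1.0)], 532e-9, 10.4e-6)
H_front = hologram_pixel([(0.0, 0.01, 1.0)], 532e-9, 10.4e-6)
```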
[Figure: each hologram pixel H = |R + ΣOi|² holds an occlusion map over its diffusion area; object lights Oi are summed in the scan order of Section 4.3 (in front of the hologram: outskirts to center; behind: center to outskirts), and a small window of the map is replaced by the most foreground object depth. Case (a): center foreground object in front of the hologram and outskirts background object behind it. Case (b): center foreground object and outskirts background object, both behind the hologram. Case (c): outskirts foreground object and center background object, both behind the hologram. Case (d): outskirts foreground object and center background object, both in front of the hologram. Case (e): center foreground object and outskirts background object, both in front of the hologram.]
Figure 15. Phantom imaging elimination process
4.4. Phantom imaging elimination results
Figure 16 shows phantom imaging elimination results. The hologram is calculated by placing a red square board 20 mm in front of a white wall. Without phantom imaging elimination, the edges of the foreground object are almost invisible, especially the upper edge. With the phantom elimination process, the foreground object edges are preserved.
(a) 2D image (b) depth map
(c) with phantom (d) no phantom
Figure 16. Phantom elimination results

5. Occlusion hole

5.1. Occlusion hole in depth to holography
The occlusion hole is the opposite phenomenon to the phantom imaging effect. As camera images are single-view images, when holograms generated from such images are observed from a viewpoint other than that of the original camera, some parts of the uncaptured background object that are covered by the foreground object appear as dark holes emitting no light, as shown in Figure 17. This is called an "occlusion hole". Occlusion holes in full-parallax holograms exist all around the foreground objects.
[Figure: color and depth of a single view place foreground and background objects in 3D space; in the full-parallax hologram, illumination light is reconstructed as diffusing light, but where the background covered by the foreground object was not captured, no pixel emits light and the holographic image shot from the left shows a dark occlusion hole.]
Figure 17. Occlusion hole
[Figure: left, center and right views of the reconstructed foreground and background objects with vertically and horizontally diffusing light; occlusion holes are visible beside and below the foreground object, but not in the vertical diffusion direction that has been intercepted.]
Figure 18. Occlusion hole in full-parallax hologram
When the single-sideband (SSB) method is applied to eliminate unnecessary light, diffusing light is intercepted in one direction, and the occlusion hole is not seen from that direction, as no object light comes out there either. In Figure 18, upward-emitting light is eliminated; hence no hole is visible above the foreground object.
5.2. Occlusion hole paving
Occlusion holes are paved using horizontal and vertical multi-view images, as shown in Figure 19. These images are graphically generated to verify the proposed algorithm, assuming a red square board 90 cm in front of the center camera and a white wall more than 100 m away. The interval between the parallel cameras is 15 cm, and the camera focal lengths are 15 mm.
[Figure: left (L), center (C), right (R) and bottom (B) 2D multi-view cameras with f = 15 mm and 15 cm interval view a foreground object 90 cm away in front of a distant background object.]
(a) 2D multi-view cameras and objects
camera L view, camera C view, camera R view, camera B view
(b) 2D multi-view images
Figure 19. 2D multi-view cameras and images

As upward-emitting light is eliminated by the SSB unnecessary-light elimination process, the left (L), center (C), right (R) and bottom (B) views are used for occlusion hole paving. In a full-parallax hologram, all object pixels are checked one by one, and their depths are compared with those of the neighboring pixels. The relationship between a foreground object pixel and a background object pixel is illustrated in Figure 20. In case (a), the foreground object is in front of the hologram plane (Z = 0) and the background object is behind it. Although the occlusion hole looks infinite from the hologram pixel, it suffices to pave it linearly from the corresponding background pixel a to the maximal-diffraction-angle pixel b. The line ab is determined so that all lights coming to the hologram pixel must pass through the occlusion hole. The occluded-area pixels are provided by neighboring camera views. Their location in the neighboring view is estimated by adding the depth value of a visible background object around the occlusion hole to the address of the occlusion-hole pixel, when the depth value is given as disparity in pixel units. In case (b), where both the foreground and background objects are on one side of the hologram plane (Z = 0), the occlusion hole looks finite from the hologram pixel. It is paved linearly with the background pixels between pixel a and pixel b, which are projected from the foreground object pixel to the background object plane, assuming the visible background depth around the hole can be extended over it.
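The address estimation above can be sketched on a 1-D scanline. A simplified illustration with hypothetical pixel values (holes marked as None) and the left neighboring view as the source:

```python
def pave_occlusion_hole(center_view, side_view, depth_disparity):
    """Fill holes (None pixels) in a 1-D scanline of the center view from a
    neighboring view (Section 5.2 sketch). depth_disparity gives depth as
    disparity in pixel units; a hole pixel is fetched from the side view at
    (hole address + background disparity), the background depth around the
    hole being extended over it."""
    out = list(center_view)
    w = len(out)
    for i, v in enumerate(out):
        if v is not None:
            continue
        # the nearest visible background pixel around the hole supplies the disparity
        bg = next((j for j in range(i + 1, w) if out[j] is not None), None)
        if bg is None:
            bg = next(j for j in range(i - 1, -1, -1) if out[j] is not None)
        src = i + depth_disparity[bg]              # corresponding address in the side view
        if 0 <= src < len(side_view):
            out[i] = side_view[src]
    return out

# a one-pixel hole in the center view, paved from the side view
paved = pave_occlusion_hole(['w', None, 'w', 'w'], ['w', 'b', 'w', 'w'], [1, 1, 1, 1])
```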
[Figure: hologram pixel at the origin, foreground object pixel at depth dz in front, background object pixel a at depth d0 behind; the occlusion hole is paved linearly from pixel a to the maximal-diffraction-angle pixel b.]
(a) infinite line occlusion hole
[Figure: foreground object pixel at depth dz and background object pixel a at depth d0, both on the same side of the hologram plane; pixel b is projected from the foreground object pixel onto the background object plane, and the hole is paved between a and b.]
(b) finite line occlusion hole
Figure 20. Horizontal occlusion hole paving
When occlusion holes exist vertically as illustrated in Figure 21, they are paved in the same manner as horizontal occlusion holes.
[Figure: vertical geometry with background object pixel a at depth db and foreground object pixel at depth dz; the hole is paved from pixel a to the maximal-diffraction-angle pixel b.]
(a) infinite line occlusion hole
[Figure: vertical geometry with foreground and background object pixels on the same side of the hologram plane; the hole is paved between background object pixel a at depth db and projected pixel b.]
(b) finite line occlusion hole
Figure 21. Vertical occlusion hole paving
When the hologram pixel lies between the foreground object pixel and the background object pixel, as shown in Figure 22, this infinite occlusion hole must be paved with all lights coming from the background object plane within the maximal diffraction angle.
[Figure: the hologram pixel lies between the foreground object pixel (depth dz, in front) and background object pixel a (depth d0, behind); the hole is paved from pixel a out to the maximal-diffraction-angle pixels.]
Figure 22. Infinite plane occlusion hole paving
5.3. Occlusion hole paving results
Figure 23 shows occlusion hole paving results. The hologram is calculated by placing a red square board 10 mm in front of the hologram plane and a white wall 10 mm behind it. Before the paving process, occlusion holes were observed in three directions around the foreground object. After the paving process, no occlusion hole is observed.
(a) focusing on background object
(b) focusing on foreground object
left view center view right view Figure 23. Occlusion hole paving results
6. Natural scene electronic holography
Figure 24 shows the natural scene 2D multi-view images and depth map used for electronic holography generation. These images are captured by a 2D parallel camera array with a 5 cm horizontal and a 20 cm vertical camera interval. The captured images are rectified and registered to the ideal lattice points of the cameras [11]. Each image resolution is 640×480 pixels. The depth map is estimated with software provided by Nagoya University, using pixel matching between neighboring views and a graph-cut algorithm [19][20].
Electronic holography is generated from these multi-view images and depth map according to the proposed algorithms of phantom imaging elimination and occlusion hole paving. Figure 25 shows the reconstructed images, with color aberration compensated by adjusting the LCD positions as shown in Figure 5. Astigmatism is reduced by using thin half-mirrors of 3 mm thickness, whose surface unevenness is less than a wavelength; hence, linear light source compensation was not applied.
left view center view right view
bottom view depth map
Figure 24. 2D multi-view images and depth map
Figure 25. Natural scene electronic holography
7. Conclusion
A novel approach to high quality holographic image generation from 2D multi-view images and a depth map has been proposed. Phantom imaging elimination and occlusion hole paving, as well as astigmatism and color aberration compensation, are reported.

8. References

[1] P. St. Hilaire, S. A. Benton, M. Lucente, J. Kollin, H. Yoshikawa and J. Underkoffler, "Electronic display system for computational holography", Practical Holography IV, S. A. Benton, ed., Proc. SPIE 1212, 1990, pp.174-182.
[2] T. Mishina, “3D Image Reproduction by Electronic Holography”, Proc of Universal Communication International Symposium 2007, Japan, 2007, pp.172-175.
[3] Y. Frauel, T. J. Naughton, O. Matoba, E. Tajahuerce and B. Javidi, "Three-Dimensional Imaging and Processing Using Computational Holographic Imaging", Proceedings of the IEEE, Vol.94, No.3, Mar. 2006, pp.636-653.
[4] T. Fujii, K. Yamaguchi and H. Yoshikawa, “Improvement of Hidden-Surface Removal for Computer-Generated Holograms from Computer Graphics and its Application to Disk Holograms”, ITE Journal, Vol.62, No.45, Japan, April 2008, pp.527-532.
[5] T. Mishina, R. Oi, J. Arai, F. Okano and M. Okui, ”Three-dimensional image reconstruction of real objects with electronic holography using 4K2K liquid crystal panels”, Proc. of IDW’07, 2007, pp.2253-2254.
[6] M. Kawakita, K. Iizuka, H. Nakamura, I. Mizuno, T. Kurita, T. Aida, Y. Yamanouchi, H. Mitsumine, T. Fukaya, H. Kikuchi, and F. Sato, “High-definition real-time depth-mapping TV camera: HDTV Axi-Vision Camera ”, Optics Express, Vol. 12, Issue 12, 2004, pp. 2781-2794.
[7] A. Klaus, M. Sormann and K. Karner, “Segment-based stereo matching using belief propagation and a self-adapting dissimilarity measure”, International Conf. on Pattern Recognition, 2006.
[8] V. Kolmogorov and R. Zabih, "What Energy Functions Can Be Minimized via Graph Cuts?", IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol.26, No.2, 2004, pp.147-159.
[9] Z. Zhang, “A Flexible New Technique for Camera Calibration”, IEEE Trans. PAMI, Vol.22, No.11, 2000, pp.1330-1334.
[10] R. Oi, K. Yamamoto and M. Okui, “Electronic generation of holograms by using depth maps of real scenes”, Proc of SPIE, Vol.6912, 69120M, 2008, pp.1-11.
[11] M. Tanimoto, T. Fujii, T. Senoh, T. Aoki, and Y. Sugihara, “Test Sequences with Different Camera Arrangements for Call for Proposals on Multiview Video Coding”, ISO/IEC JTC1/SC29/WG11 M12338, July 2005.
[12] K. Yamamoto, R. Oi, T. Mishina and M. Okui, “Halfzone-plate Processing for Objects at the Both Sides of Hologram Display ” , Proc of SPIE, Vol.6912, 69120Q, Jan. 2008, pp.1-10.
[13] T. Senoh, T. Mishina and M. Okui, “A Study on Astigmatism Compensation for Electronic Holography”, ITE Journal,Vol.62,No.7, Japan, 2008,pp.1127-1131.
[14] O. Bryngdahl and A. Lohmann, “ Single-sideband holography”,J Opt. Soc. Am. Vol.58, 1968, pp.620-624.
[15] T. Senoh, T. Mishina and M. Okui, “A Study on Aberration Compensation for Electronic Holography” , ITE Tech Rep,Vol.32,No.27, Japan, June 2008,pp.9-12.
[16] T. Senoh, K. Yamamoto, R. Oi, T. Mishina and M. Okui, “Study on Occlusion of Electronic-Holography Generated from 2D Image plus Depth Map”, ITE Tech Rep, Vol.32, No.36, Japan, Sep. 2008, pp.21-24.
[17] Y. Sakamoto, K. Kanazawa and Y. Aoki, “Three-dimensional Imaging from Volume Data using Computer-Generated Hologram” , Proc of 3D Image Conference 2002,Japan, 2002,pp.209-212.
[18] N. Magnenat-Thalmann, and D. Thalmann,“ Image Synthesis”, Springer-Verlag, 1987, pp.84-107.
[19] M. Tanimoto, T. Fujii and K. Suzuki, “Multi-view depth map of Rena and Akko & Kayo”, ISO/IEC JTC1/SC29/WG11, M14888, October 2007.
[20] M. Tanimoto, T. Fujii and K. Suzuki, “Improvement of Depth Map Estimation and View Synthesis”, ISO/IEC JTC1/SC29/WG11, M15090, Jan. 2008.