
Calibration and Performance Evaluation of Omnidirectional Sensor with Compound Spherical Mirrors

Abstract

In this paper, we propose a stereo system that consists of a single camera with multiple omnidirectional mirrors. The system offers omnidirectional observation, portability, and real-time detection of near objects. To detect near objects, it is necessary to create a model of the shape and location of the mirrors. However, accurate modeling is difficult because of alignment errors in setting up the mirrors, which makes calibration of the system difficult as well. We propose a method of calibrating a multiple omnidirectional mirror sensor through observing a point light source at an infinite range. The detectable distances of the system are then calculated at each pixel. As a result, it is shown that the detectable distance of the sensor is enhanced by using a combination of various mirrors.

1 Introduction

Recently, many surveillance cameras have been placed in street environments for crime prevention. However, crimes are more often committed in deserted places. In such places, a wearable security system is more effective than a fixed one. A fixed security system requires many cameras even where few people pass by, which is not an efficient solution. On the other hand, since a wearable security system is individual equipment carried by each person, it can work anywhere. A wearable security system needs omnidirectional observation, portability, and real-time detection of near objects. A stereo system consisting of a single camera with multiple omnidirectional mirrors is proposed to fulfill these demands.

Several omnidirectional stereo systems have already been proposed. Two panoramic images acquired by rotating a camera can be used for omnidirectional stereo [1, 2]. However, because real-time detection requires images acquired at a single instant, a moving camera is not suitable for detecting objects in real time. Catadioptric stereo methods with two cameras [3, 5] use a parabolic and a hyperbolic mirror for each camera to generate an omnidirectional image. Because a sensor fastened to a person needs to be small and light, two cameras are too large for this purpose.

We have proposed a wearable system that satisfies the above requirements [6, 7]. Our system aims to detect objects closer than a threshold range: instead of searching for corresponding points along epipolar lines, it only determines whether an object is far away or not. Our sensor, therefore, can detect objects in real time.

For detecting near objects, the proposed method needs a lookup table of corresponding points for an infinite range. When creating the table, we need information about the shape of the mirrors and the location of the camera and mirrors. However, since accurate information is difficult to obtain, it is difficult to create the table without calibrating the system. This paper describes a method of calibration through observing a point light source at an infinite range. Furthermore, this method can use mirrors of any shape.

Another issue is to find the best mirror shape for the system. We discussed the arrangement of paraboloidal mirrors in [7]; in this paper we analyze the distances detectable by the system with mirrors of arbitrary shape. However, the multiple mirrors constituting the system make it difficult to calculate detectable distances theoretically. Thus, at every pixel we compute the distance at which an object still has sufficient disparity as the detectable distance. As a result, we obtain how the detectable distance changes when the size or the arrangement of each mirror is changed. This evaluation of applicable mirror sizes and arrangements is useful for improving the sensor's performance.

2 Omnidirectional Sensor with Compound Spherical Mirrors

Our omnidirectional sensor consists of compound spherical mirrors. Fig.1 shows an experimental setup of the camera. The sensor has a large mirror and 6 small ones.

Rays from an object hit these mirrors and the reflected rays are projected onto the image plane. Fig.3 shows an example of an image from the sensor. Because the centers of the side mirrors are not on the camera axis, the images on the side mirrors are distorted. The rays that hit different mirrors intersect each other at the object position; thus, we can compute the range of the object by triangulation. However, since the baseline is the distance between the points where the rays hit the mirrors, the baseline is quite short.


Figure 1: Omnidirectional camera with compound spherical mirrors

Figure 2: Configuration of compound spherical mirrors (vertical and horizontal views)

Figure 3: An example of an image of the sensor.

As it is difficult to compute the range of an object at an arbitrary distance, our method only determines whether an object is at infinite range or not. This approach is applicable to a narrow-baseline system and its computational cost is low.

3 Detection of Near Objects by Catadioptric Stereo

3.1 Computing Corresponding Points for an Infinite Range

In this section, we compute the corresponding points for an object at an infinite range that is observed in several of the sensor mirrors. Fig.4 shows the situation for the central mirror. O is the origin of the camera coordinate system, d is the distance from the camera origin to the center of the mirror C, and R is the radius of the mirror. When an object is projected onto the point x in the central mirror, a ray from the object hits the mirror at the point p, with ∠pCO = θ and ∠pOC = φ. The point x is defined using γ as

$$x = (c_x + l \cos\gamma,\; c_y + l \sin\gamma) \qquad (1)$$

in the image coordinate system, where (cx, cy) is the image center.


Figure 4: The ray direction reflected on the center mirror.

Then,

$$\tan\phi = \frac{l}{f} \qquad (2)$$

where f (in pixels) is the focal length of the camera. By considering the triangle △pCO,

$$\tan\phi = \frac{R \sin\theta}{d - R \cos\theta}. \qquad (3)$$

We compute θ by solving (3). The incident angle α is θ + φ, because the normal vector at p is the vector from C to p. Since the incident and reflected angles are the same,

$$\beta = \alpha + \theta = 2\theta + \phi. \qquad (4)$$

Thus, the ray vector u from the mirror to the object becomes

$$u = (\cos\gamma \sin\beta,\; \sin\gamma \sin\beta,\; -\cos\beta). \qquad (5)$$
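As a concrete illustration of Eqs. (1)-(5), the following sketch maps a pixel observed in the spherical center mirror to the ray direction u toward an object at infinity. This is our own illustration, not the authors' code: the function name center_mirror_ray and the use of scipy.optimize.brentq to solve Eq. (3) numerically are our choices; the parameters f, d, and R follow the definitions above.

```python
# A minimal sketch (not the authors' code) of Eqs. (1)-(5): from a pixel on the
# spherical center mirror to the ray direction u toward an object at infinity.
import numpy as np
from scipy.optimize import brentq

def center_mirror_ray(px, py, cx, cy, f, d, R):
    """Return the ray vector u (Eq. 5) for pixel (px, py) on the center mirror.

    f : focal length in pixels; d : camera-to-mirror-center distance;
    R : mirror radius (same length unit as d).
    """
    l = np.hypot(px - cx, py - cy)          # radial distance in the image, Eq. (1)
    gamma = np.arctan2(py - cy, px - cx)    # azimuth gamma
    phi = np.arctan2(l, f)                  # Eq. (2): tan(phi) = l / f

    if phi == 0.0:                          # pixel at the image center
        theta = 0.0
    else:
        # Eq. (3): tan(phi) = R sin(theta) / (d - R cos(theta)). Bracketing the
        # root in (0, arccos(R/d)) picks the nearer intersection with the sphere,
        # the one actually visible to the camera. Pixels outside the mirror image
        # have no root in this interval.
        g = lambda th: R * np.sin(th) - np.tan(phi) * (d - R * np.cos(th))
        theta = brentq(g, 0.0, np.arccos(R / d))

    beta = 2.0 * theta + phi                # Eq. (4), reflection at the sphere
    return np.array([np.cos(gamma) * np.sin(beta),
                     np.sin(gamma) * np.sin(beta),
                     -np.cos(beta)])        # Eq. (5)
```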

If the object is at an infinite range, the ray directions from the object to the mirrors are parallel. Fig.5 shows a situation in which the same object is found in one of the side mirrors. The incident ray vector is parallel to u. Since the unit vector v′0 from the origin O to the center of the side mirror C′ is known,

$$\cos\beta' = -u \cdot v'_0. \qquad (6)$$

In a manner similar to the case of the center mirror, we compute θ′ by solving

$$\tan\phi' = \frac{r \sin\theta'}{d' - r \cos\theta'}, \qquad (7)$$

where φ′ = β′ − 2θ′ and d′ = √(d² + R²).



Figure 5: The ray direction reflected on one of the side mirrors.

Let the unit vector w be perpendicular to v′0 and u; w is given by normalizing v′0 × u. Then, the vector v′ = (v′x, v′y, v′z) from the origin to the point p′, where the ray hits the side mirror, is computed by rotating v′0 around w by the angle φ′. Finally, the point x′ onto which the object is projected in the side mirror is computed as

$$x' = (c_x + l' \cos\gamma',\; c_y + l' \sin\gamma'), \qquad (8)$$

where

$$l' = f \, \frac{\sqrt{v_x'^2 + v_y'^2}}{v_z'}, \qquad \tan\gamma' = \frac{v_y'}{v_x'}. \qquad (9)$$
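Continuing the sketch, Eqs. (6)-(9) can be evaluated as below to obtain the corresponding point x′ in a side mirror. Again this is a hypothetical helper, not the paper's code: the name side_mirror_point, the numerical root finding for Eq. (7), and the explicit Rodrigues rotation are our own choices; v0 is assumed to be the unit vector toward the side-mirror center, dprime its distance from the origin, and r its radius.

```python
# A minimal sketch of Eqs. (6)-(9): given the ray direction u from the center
# mirror, find the pixel x' where the same object at infinity appears in a
# spherical side mirror.
import numpy as np
from scipy.optimize import brentq

def side_mirror_point(u, v0, dprime, r, cx, cy, f):
    beta_p = np.arccos(np.clip(-np.dot(u, v0), -1.0, 1.0))    # Eq. (6)

    # Eq. (7) with phi' = beta' - 2*theta': solve for theta' in [0, beta'/2].
    F = lambda th: (beta_p - 2.0 * th) - np.arctan2(r * np.sin(th),
                                                    dprime - r * np.cos(th))
    theta_p = brentq(F, 0.0, beta_p / 2.0)
    phi_p = beta_p - 2.0 * theta_p

    # Rotate v0 around w = normalize(v0 x u) by phi' (Rodrigues' formula). The
    # sign of the rotation depends on the cross-product convention; flip phi_p if
    # the projected point falls on the wrong side of the mirror image.
    w = np.cross(v0, u)
    w /= np.linalg.norm(w)
    v = (v0 * np.cos(phi_p) + np.cross(w, v0) * np.sin(phi_p)
         + w * np.dot(w, v0) * (1.0 - np.cos(phi_p)))

    # Eqs. (8)-(9): project the rotated direction v back onto the image plane.
    l_p = f * np.hypot(v[0], v[1]) / v[2]
    gamma_p = np.arctan2(v[1], v[0])
    return (cx + l_p * np.cos(gamma_p), cy + l_p * np.sin(gamma_p))
```

Evaluating center_mirror_ray and side_mirror_point for every pixel of the center-mirror region would yield the lookup table referred to below.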

Since the corresponding points are computed as preprocessing, our method refers to a lookup table to find the corresponding point when it computes the difference between the images of the mirrors.

3.2 Detecting Near Objects

If an object is close enough to the sensor, the projected point of the object differs from the corresponding point for an infinite range. Thus, our method takes into account the difference in intensity between the images of the mirrors. The intensity of the central mirror is I(x), and the intensity of the corresponding point of a side mirror i is Ii(x′). The criterion E(x) to detect near objects becomes

$$E(x) = \sum_{i=1}^{N} |I(x) - I_i(x')|, \qquad (10)$$

where N is the number of side mirrors onto which the object is projected.

Figure 6: Flowchart of detecting near objects

The actual processing follows the flowchart in Fig.6. The scene in this example is generated by ray tracing, and a person stands near the sensor. The image of a side mirror is transformed by projecting every point of the side mirror to the center image according to the correspondences. The near object is detected by taking the difference between the original center image and the transformed image. Since we obtain 6 difference images, we use their sum. After removing noise from the summed difference image by erosion and dilation, the person standing near the sensor is detected.
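A minimal sketch of this detection step is given below. It assumes the lookup table lut from Section 3.1 is available as one (H, W, 2) integer array of corresponding pixel coordinates per side mirror; the threshold value, kernel size, and the use of NumPy/OpenCV are illustrative choices, not taken from the paper.

```python
# A sketch of Eq. (10) followed by thresholding and erosion/dilation denoising.
import numpy as np
import cv2

def detect_near_objects(gray, lut, thresh=30, kernel_size=3):
    """gray : (H, W) grayscale sensor image; returns a binary near-object mask."""
    H, W = gray.shape
    E = np.zeros((H, W), dtype=np.float32)
    gray_f = gray.astype(np.float32)
    for corr in lut:                                    # one table per side mirror
        xs = corr[..., 0].clip(0, W - 1)                # corresponding x coordinates
        ys = corr[..., 1].clip(0, H - 1)                # corresponding y coordinates
        transformed = gray_f[ys, xs]                    # side mirror warped to center view
        E += np.abs(gray_f - transformed)               # accumulate Eq. (10)

    mask = (E > thresh).astype(np.uint8)                # near objects produce disparity
    kernel = np.ones((kernel_size, kernel_size), np.uint8)
    return cv2.dilate(cv2.erode(mask, kernel), kernel)  # remove isolated noise
```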

4 Detecting Corresponding Points for Calibration

For detecting near objects, the proposed method needs a lookup table of corresponding points for an infinite range.


Figure 7: Sensors attached on turntables for detecting corresponding points

Figure 8: System using parabolic mirrors for making parallel rays

When creating the table, we need information about the shape of the mirrors and the locations of the camera and mirrors. However, since accurate information is difficult to obtain, it is also difficult to create the table and to calibrate the system accurately.

We propose a method for detecting corresponding points through observing a point light source at an infinite range. This method can use the compound spherical mirrors, but it can also use mirrors of any other shape. Since the light source is at an infinite range, its light can be treated as parallel rays. The sensor is attached to two turntables and the light source is observed while the sensor is rotated. The light is observed at several points in an image because it is reflected by each mirror; we regard these points as corresponding points. The lookup table is obtained by finding these corresponding points while rotating the sensor. Fig.9 shows the flowchart for detecting corresponding points from an image. For simplicity, we use a red light source and a blue background. After extracting the red component to remove the background, the corresponding points are taken as the centers of gravity of the light source in the image.
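The per-image step of Fig.9 might look like the following sketch. The thresholds, the connected-component labeling, and the function name are our own illustrative choices; the paper only specifies extracting the red component and taking the center of gravity of each reflected light spot.

```python
# A sketch: isolate the red point-light reflections against the blue background
# and take the centroid of each blob as a corresponding point.
import numpy as np
from scipy import ndimage

def detect_corresponding_points(rgb, red_thresh=120, blue_margin=30):
    """rgb : (H, W, 3) uint8 image; returns a list of (x, y) centroids,
    one per mirror in which the light source is visible."""
    r = rgb[..., 0].astype(np.int16)
    b = rgb[..., 2].astype(np.int16)
    # keep pixels that are clearly red and clearly not the blue background
    mask = (r > red_thresh) & (r - b > blue_margin)

    labels, n = ndimage.label(mask)                       # one blob per reflection
    centroids = ndimage.center_of_mass(mask, labels, range(1, n + 1))
    return [(cx, cy) for cy, cx in centroids]             # (row, col) -> (x, y)
```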

Figure 9: Flowchart for detecting corresponding points (decompose the input image, extract the red component to remove the background, and compute the moments of the reflected light source)

Figure 10: Examples of mirror shapes for the sensor (8 side mirrors; parabolic mirrors)

5 Estimation of Detectable Distance

Another issue is to find the best mirror shape for the system. In this section, we analyze the distances detectable by the system with mirrors of arbitrary shape.

5.1 Simulation of Omnidirectional Sensor

In the previous section, we used compound spherical mirrors for the sensor. If a mirror of another shape is used, the detectable distance will change. For example, Fig.10 shows two examples: a large mirror with 8 small mirrors, and a large parabolic mirror with 6 small parabolic ones.

To evaluate the performance of a sensor with these mirrors, it is necessary to compute the detectable distances of the sensor either theoretically or experimentally.

5.2 Calculation of Detectable Distance

Because the detectable distance differs according to the position in the image, we calculate it for each pixel.

The corresponding points can be calculated by the equations described in Section 3.1. Fig.11 shows the ray direction reflected on the compound spherical mirrors. The ray from an infinite range is projected onto the point x after reflection by the center mirror, and onto the point x′ after reflection by the side mirror. Then, x and x′ are corresponding points.


Figure 11: Ray direction when the sensor detects objects

Figure 12: Location of the projected points on the image

If the sensor can detect an object that is sufficiently near, the ray from the object reflected on the side mirror is projected onto a point x′′, which is 1 pixel away from x′. Therefore, the object lies at the intersection of the ray projected onto x and the ray projected onto x′′. The detectable distance at x is the distance from C to the object.

However, even if all parameters are known, including those of the camera, the mirrors, and the corresponding points, computing the intersection point is still difficult if we use mirrors of intricate shapes. Therefore, we find an approximation of x′′ experimentally instead of computing the exact intersection point. We assume that x, x′ and x′′ take discrete positions on the image. Under this assumption, as shown in Fig.12, x′′ is a point 1 pixel from x′.

Suppose an object exists at the point X, the intersection of the ray reflected at p on one mirror and the ray reflected at p′′ on the other mirror. Since x and x′′ take discrete positions, the rays may not intersect exactly.


Figure 13: The system for detecting corresponding points

Thus, it is assumed that the object exists at the midpoint of the segment where these rays approach closest. Using independent parameters t1 and t2, the equations of these rays become

$$X_1 = u t_1 + p, \qquad X_2 = u' t_2 + p''. \qquad (11)$$

Since X is the average of X1 and X2, X becomes

$$X = \frac{X_1 + X_2}{2} \qquad (12)$$

where

$$f(t_1, t_2) = |X_1 - X_2| \qquad (13)$$

is minimized. We compute the distance from C to X for all neighboring pixels x′′ and choose the x′′ that gives the minimum distance between X1 and X2. The detectable distance is the distance from C to X computed with the chosen x′′.
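For mirrors of arbitrary shape the rays must be handled numerically, but the midpoint of Eqs. (11)-(13) itself has a closed-form least-squares solution. The sketch below is our own code (the function name and the parallel-ray guard are assumptions); it returns the estimated object point X and the residual gap |X1 − X2| used to select the best neighboring pixel x′′.

```python
# A sketch of Eqs. (11)-(13): midpoint of closest approach of two reflected rays.
import numpy as np

def ray_midpoint(p, u, p2, u2):
    """p, p2 : ray origins on the mirrors; u, u2 : ray directions.
    Returns (X, gap) where X is the estimated object point."""
    w0 = p - p2
    a, b, c = np.dot(u, u), np.dot(u, u2), np.dot(u2, u2)
    d, e = np.dot(u, w0), np.dot(u2, w0)
    denom = a * c - b * b
    if abs(denom) < 1e-12:                     # rays (nearly) parallel: no estimate
        return None, np.inf
    t1 = (b * e - c * d) / denom               # minimizes f(t1, t2) = |X1 - X2|, Eq. (13)
    t2 = (a * e - b * d) / denom
    X1, X2 = p + t1 * u, p2 + t2 * u2
    return (X1 + X2) / 2.0, np.linalg.norm(X1 - X2)   # Eq. (12) and the residual gap
```

The detectable distance at x is then the distance from C to the X obtained with the x′′ that gives the smallest gap.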

6 Experiment

6.1 Experiment on Detecting Corresponding Points

We conducted an experiment to detect the corresponding points of the sensor. The calibration system is shown in Fig.13. It consists of two turntables with the sensor attached, so that the sensor can be rotated independently about two axes. Since a ray from a point light source at an infinite range is a parallel ray, we use a parallel ray instead of a point light source at an infinite range. Each turntable is rotated through 360 degrees, and images are taken at every degree. We extract the red component to remove the background, because the measurement uses a red ray against a blue background. The corresponding points are the centers of gravity of the light source in the image.


Figure 14: Result of measuring corresponding points

Table 1: Errors of corresponding points with/without the peripheral area of the mirrors

              With    Without
RMS [pixels]  0.814   0.696
SD [pixels]   0.300   0.190

Fig.14 shows the result of detecting corresponding points. In this figure, corresponding points are drawn in the same color.

Next, we evaluate the accuracy of computing corresponding points by simulation. Since the image of a light source is not a single pixel but covers some area, an error occurs in estimating its center of gravity. We compare the corresponding points detected by this method with the theoretically computed ones, which are calculated by the equations described in Section 3.1.

We compared the results using the sensor that has a large mirror and 6 small mirrors. Their diameters are 50[mm] and 20[mm], the image size is 640×480[pixels], and the object is illuminated by parallel light. The simulation parameters are adjusted so that the light source is projected onto an area of about 10×10[pixels] in the image. We calculated the RMS and SD of the error, as shown in Table 1. If the peripheral area of the mirrors is used, the error becomes large; thus, we compare the results with and without the peripheral area of the mirrors. Since the errors are sufficiently small, our calibration method is validated as working without having to assume the shape of the sensor.
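For reference, statistics of the kind reported in Table 1 can be computed as in the following small sketch (our own helper; the optional mask for excluding the peripheral mirror area is an assumption about how the two columns were obtained).

```python
# A sketch: per-point pixel error between detected and theoretical corresponding
# points, reduced to RMS and standard deviation.
import numpy as np

def error_stats(detected, predicted, keep_mask=None):
    """detected, predicted : (N, 2) arrays of pixel coordinates."""
    err = np.linalg.norm(detected - predicted, axis=1)
    if keep_mask is not None:              # e.g. drop points in the peripheral area
        err = err[keep_mask]
    rms = np.sqrt(np.mean(err ** 2))
    return rms, err.std()
```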

Figure 15: Configuration of compound spherical mirrors (vertical and horizontal views)

Figure 16: Input image used for the simulation of detecting objects

Figure 17: Scene used for the simulation of detecting objects

6.2 Detecting Objects by Simulation

In this section, we simulate the detection of objects around the sensor. The sensor, shown in Fig.15, has a large mirror and 6 small mirrors. The diameters are 50[mm] and 20[mm], and the image size is 640×480[pixels].

The observed scene is shown in Fig.17 and the input image in Fig.16. The scene consists of several poles with radii of 20[cm], placed around the sensor at every 50[cm] of distance from the sensor. We detect the poles using the method described in Section 3. Fig.18 shows the detection result. Nearly a third of the poles can be detected: the sensor can detect that an object exists within 150[cm], but it does not detect an object that is farther than 200[cm].

6.3 Estimation of Detectable Distance

In this section, we calculate the detectable distance of the sensor shown in Fig.19. Fig.20 shows the detectable distance calculated by the method described in Section 5.2. The detectable distance is computed from the parallax between the center mirror and one of the side mirrors. Since detectable distances are computed for 6 pairs of mirrors, the final value at each pixel is obtained by taking the maximum of those distances. As a result, the sensor can detect an object that exists within a distance of about 2.0[m]. This result matches the result of the simulation above.
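Merging the per-pair results as described above can be written compactly; the sketch below is our own illustration and assumes each pair's detectable-distance map is an (H, W) array with NaN where that pair gives no estimate.

```python
# A sketch: combine the six per-pair detectable-distance maps by a per-pixel maximum.
import numpy as np

def combine_detectable_distance(pair_maps):
    """pair_maps : list of (H, W) arrays in millimetres, NaN where undefined."""
    stacked = np.stack(pair_maps, axis=0)
    return np.nanmax(stacked, axis=0)      # best (largest) distance per pixel
```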


Figure 18: Result of detecting near objects

Figure 19: Parameters of the sensor used to calculate the detectable distance

Next, we consider that the sensor may be able to detect objects farther away by using the parallax between the side mirrors. Thus, we calculate the corresponding points between the side mirrors and then the detectable distances by the method described in Section 5.2. Fig.21 shows the resulting detectable distances. Fig.22 and Fig.23 compare Fig.20 and Fig.21 at y = 150 and y = 90, respectively. As a result, the detectable distance is improved and the sensor can detect objects that are farther away.

Figure 22: Comparison of detectable distance [mm] (y = 150)

Figure 23: Comparison of detectable distance [mm] (y = 90)

7 Conclusion

This paper described a method for calibrating a multiple omnidirectional mirror sensor, and the distances detectable by the system were estimated for each pixel. Since accurate modeling is difficult because of alignment errors in setting up the mirrors, we proposed a method for detecting corresponding points through observing a point light source at an infinite range.

Figure 20: Detectable distance [mm]

Figure 21: Detectable distance using parallax of side mirrors [mm]

This method can use not only compound spherical mirrors but also mirrors of any shape. Moreover, we analyzed the distances detectable by the system with mirrors of arbitrary shape, computing for every pixel the detectable distance at which an object still has sufficient disparity in the image. Since mirrors of intricate shape make it difficult to calculate the detectable distance theoretically, we experimentally find an approximation of the intersection point instead of computing the exact one. As a result, it was shown that the detectable distance of the sensor is enhanced by using a combination of various mirrors.

References

[1] H. Ishiguro, M. Yamamoto and S. Tsuji: "Omni-directional stereo," IEEE Transactions on Pattern Analysis and Machine Intelligence, 14, 2, pp. 257-262, 1992.


[2] S. Peleg and M. Ben-Ezra: "Stereo panorama with a single camera," Proc. IEEE Conference on Computer Vision and Pattern Recognition, Ft. Collins, Colorado, pp. 395-401, 1999.

[3] J. Gluckman and S. Nayar: "Real-time omnidirectional and panoramic stereo," Proc. of Image Understanding Workshop, 1998.

[4] Y. Negishi, J. Miura and Y. Shirai: "Map Generation of a Mobile Robot by Integrating Omnidirectional Stereo and Laser Range Finder," Journal of the Robotics Society of Japan, 21, 6, pp. 690-696, 2003, in Japanese.

[5] A. Chaen, K. Yamazawa, N. Yokoya and H. Takemura: "Omnidirectional stereo vision using HyperOmni Vision," IEICE Tech. Rep., pp. 96-122, 1997, in Japanese.

[6] anonymous for review

[7] anonymous for review
