Image-Based Target Detection and Tracking

Aggelos K. Katsaggelos

Thrasyvoulos N. Pappas

Peshala V. Pahalawatta C. Andrew Segall

SensIT, Santa Fe, January 16, 2002


2

Introduction

Objective: Assess the impact of visual sensors on the benchmark and operational scenarios

• Project started June 15, 2001

• Video data acquisition

• Initial results with imaging/video sensors
  – for the Convoy Intelligence Scenario
  – detection, tracking, classification
  – image/video communication


3

Battlefield Scenario*

Gathering Intelligence on a Convoy

• Multiple civilian and military vehicles

• Vehicles travel on the road

• Vehicles may travel in either direction

• Vehicles may accelerate or decelerate

Objectives

• Track, image, and classify enemy targets

• Distinguish military vehicles from civilian vehicles and civilians

• Conserve power

[Diagram: non-imaging sensors and an imager deployed along the road]

* Jim Reich, Xerox PARC


4

Experimental Setup

Imager Type

• 2 USB cameras attached to laptops (uncalibrated)

• Obtained grayscale video at 15 fps

Imager Placement

• 13 ft from center of road, 60 ft apart

• Cameras placed at an angle relative to the road to capture large field of view

Test Cases:

• One target at constant velocity of 20 mph

• One target starts at 10 mph, increases to 20 mph

• One target starts at 10 mph, stops and idles for 1 min, and then accelerates

• Two targets from opposite directions at 20 mph

[Diagram: imager and non-imaging sensor placement relative to the road]


5

Tracking System

[Block diagram] Video Sequence → Background Removal → Position Estimation → Tracking → Object Location, with Camera Calibration performed offline


6

Background Model *

Basic Requirements

• Intensity distribution of background pixels can vary (sky, leaf, branch)

• Model must adapt quickly to changes

Basic Model

Let Pr(x_t) = probability that x_t belongs to the background:

\[
\Pr(x_t) \;=\; \frac{1}{N}\sum_{i=1}^{N} \frac{1}{\sqrt{2\pi\sigma^2}}\,
\exp\!\left(-\frac{(x_t - y_i)^2}{2\sigma^2}\right)
\]

• y_i = x_s, some s < t, i = 1, 2, …, N

• x_t is considered background if Pr(x_t) > Threshold

• Equivalent to a Gaussian mixture model

• σ is based on the MAD of consecutive background pixels

* Ahmed Elgammal, David Harwood, and Larry Davis, “Non-parametric Model for Background Subtraction,” Proc. 6th European Conference on Computer Vision, Dublin, Ireland, June/July 2000.

[Diagram: sliding window of the N most recent background samples y_1, y_2, y_3, …, y_N drawn from past frames x_s]
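As a concrete illustration of this test, here is a minimal sketch of the per-pixel kernel density estimate and the resulting foreground decision, assuming grayscale frames stored as NumPy arrays; the function names and the threshold value are illustrative, not part of the original system.

```python
import numpy as np

def background_probability(frame, history, sigma):
    """Kernel density estimate Pr(x_t): the average of N Gaussian kernels
    centered on the N most recent background samples y_1, ..., y_N."""
    # frame:   (H, W) current pixel intensities x_t
    # history: (N, H, W) recent background samples y_i
    # sigma:   (H, W) per-pixel kernel bandwidth (estimated on the next slide)
    diff = frame[None, :, :] - history
    kernels = np.exp(-diff**2 / (2.0 * sigma**2)) / (np.sqrt(2.0 * np.pi) * sigma)
    return kernels.mean(axis=0)

def foreground_mask(frame, history, sigma, threshold=1e-3):
    """Pixels whose background probability falls below the threshold are
    declared foreground (the slide keeps a pixel as background if Pr > T)."""
    return background_probability(frame, history, sigma) < threshold
```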


7

Estimation of Variance (σ²)

Sources of Variation

• Large changes in intensity due to movement of background (should not be included in σ)

• Intensity changes due to camera noise

Estimation Procedure

• Assume y_i ~ N(μ, σ²)

• Then (y_i − y_{i−1}) ~ N(0, 2σ²)

• Find the Median Absolute Deviation (MAD) of consecutive y_i's:

\[ m = \operatorname{median}_i \left| y_i - y_{i-1} \right| \]

• Use m to find σ:

\[ \sigma = \frac{m}{0.68\sqrt{2}} \]
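A minimal sketch of this MAD-based bandwidth estimate, assuming the same (N, H, W) history of background samples as in the previous snippet; purely illustrative.

```python
import numpy as np

def estimate_sigma(history):
    """Per-pixel kernel bandwidth from the median absolute difference of
    consecutive background samples:
        m = median_i |y_i - y_{i-1}|,   sigma = m / (0.68 * sqrt(2)).
    The median is insensitive to the occasional large jumps caused by
    background motion, so sigma mainly reflects camera noise."""
    m = np.median(np.abs(np.diff(history, axis=0)), axis=0)
    return m / (0.68 * np.sqrt(2.0))
```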


8

Segmentation Results

Foreground extraction of first target at 20 mph

Foreground extraction of second target at 20 mph


9

Camera Calibration

[Diagram: camera with focal length f at offset h from the road and angle θ; two road points at distances X_1 and X_2 (separation L) project to image coordinates d_1 and d_2]

1. \( X_1 = h \,/\, \tan\!\big(\theta - \tan^{-1}(d_1/f)\big) \)

2. \( X_2 = h \,/\, \tan\!\big(\theta - \tan^{-1}(d_2/f)\big) \)

3. \( L = X_1 - X_2 = h\left[\dfrac{1}{\tan\!\big(\theta - \tan^{-1}(d_1/f)\big)} - \dfrac{1}{\tan\!\big(\theta - \tan^{-1}(d_2/f)\big)}\right] \)

Variables to estimate: f and θ

Assumptions

• Ideal pinhole camera model

• Image plane is perpendicular to road surface
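Under these assumptions, equation 3 ties f and θ together for a pair of reference points with known separation L and measured image coordinates d_1, d_2. Below is a minimal sketch of recovering one consistent focal length for each candidate θ by one-dimensional root finding; the SciPy solver and the bracketing interval are assumptions for illustration, not part of the original calibration procedure.

```python
import numpy as np
from scipy.optimize import brentq

def ground_distance(d, f, theta, h):
    """Road distance to a point imaged at coordinate d (pixels from the image
    center), for focal length f (pixels), camera angle theta (radians), and
    camera offset h (feet), per equations 1-2 above."""
    return h / np.tan(theta - np.arctan(d / f))

def focal_length_for_theta(theta, d1, d2, L, h, f_lo=200.0, f_hi=3000.0):
    """Solve equation 3, L = X1 - X2, for f at a fixed theta."""
    def residual(f):
        return ground_distance(d1, f, theta, h) - ground_distance(d2, f, theta, h) - L
    return brentq(residual, f_lo, f_hi)
```

Sweeping θ over a range of candidate angles with a solver like this would trace out a focal-length-versus-theta curve of the kind shown on the next slide.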


10

Calibration Results

[Plot: focal length (pixels) vs. θ (degrees), for θ from about 12° to 28° and focal lengths from about 400 to 1600 pixels]


11

Tracking

Median Filtering

• Used to smooth spurious position data

• Leaves non-spurious data unchanged

Kalman Filtering

• Constant acceleration model

• Initial conditions set by our assumptions

• Used to track position and velocity
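A minimal sketch of this two-stage smoothing, median filtering followed by a constant-acceleration Kalman filter on the position observations; the noise covariances, initial state, and 15 fps time step are illustrative assumptions rather than the values used in the experiments.

```python
import numpy as np
from scipy.signal import medfilt

def track_positions(observations, dt=1.0 / 15, q=0.5, r=4.0):
    """Median-filter the raw position observations, then run a Kalman filter
    with a constant-acceleration state [position, velocity, acceleration]."""
    z = medfilt(np.asarray(observations, dtype=float), kernel_size=5)

    F = np.array([[1.0, dt, 0.5 * dt**2],   # constant-acceleration dynamics
                  [0.0, 1.0, dt],
                  [0.0, 0.0, 1.0]])
    H = np.array([[1.0, 0.0, 0.0]])          # only position is observed
    Q = q * np.eye(3)                        # process noise covariance (assumed)
    R = np.array([[r]])                      # measurement noise covariance (assumed)

    x = np.array([z[0], 0.0, 0.0])           # initial state from the scenario assumptions
    P = 10.0 * np.eye(3)
    estimates = []
    for zk in z:
        x = F @ x                            # predict
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + R                  # update with the new observation
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (np.array([zk]) - H @ x)
        P = (np.eye(3) - K @ H) @ P
        estimates.append(x.copy())
    return np.array(estimates)               # rows: [position, velocity, acceleration]
```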


12

Results: Target #1 20 mph

[Plots: Kalman filter position estimate (feet vs. time; Kalman estimate, observations, median output) and Kalman filter velocity estimate (mph vs. time)]


13

Results: Target #2 20 mph

[Plots: Kalman filter position estimate (feet vs. time; Kalman estimate, observations, median output) and Kalman filter velocity estimate (mph vs. time)]


14

Results: Target #1 10-20 mph

[Plots: Kalman filter position estimate (feet vs. time; Kalman estimate, observations, median output) and Kalman filter velocity estimate (mph vs. time)]


15

Results: Target #2 Stop-Start

[Plots: Kalman filter position estimate (feet vs. time; Kalman estimate, observations, median output) and Kalman filter velocity estimate (mph vs. time)]


16

Work in Progress

• Improving and automating the camera calibration process

• Improving foreground segmentation results using
  – background subtraction
  – image feature extraction (color, shape, texture)
  – spatial constraints in the form of MRFs
  – information from multiple cameras

• Estimating the accuracy of segmentation
  – use the result to improve the Kalman filter model

• Multiple object detection

• Object recognition

• Integration with other sensors


17

Other Issues

• Communication between sensors
  – when/what to communicate
  – power/delay/loss tradeoffs

• Communication of image/video
  – error resilience/concealment
  – low-power techniques

• Communication of data from multiple sensors
  – multi-modal error resilience


18

Low-Energy Video Communication*

Method for efficiently utilizing transmission energy in wireless video communication

Jointly consider source coding and transmission power management

Incorporate knowledge of the decoder concealment strategy and the channel state

Approach can help prolong battery life and reduce interference between users in a wireless network
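The cited approach selects source-coding parameters and transmission power jointly; one common way to sketch such a choice is a Lagrangian selection over per-packet operating points. The snippet below assumes a precomputed table of (expected distortion, transmission energy) pairs for each packet, with the distortions already reflecting channel state and decoder concealment; it only illustrates the idea and is not the algorithm of the cited paper.

```python
def pick_operating_points(packets, lam):
    """Per packet, choose the option minimizing energy + lam * expected distortion.
    packets: list (one entry per packet) of lists of dicts with keys
    'energy' and 'distortion'."""
    return [min(options, key=lambda o: o["energy"] + lam * o["distortion"])
            for options in packets]

def min_energy_under_distortion(packets, d_budget, lam_lo=0.0, lam_hi=1e6, iters=40):
    """Bisect on the Lagrange multiplier until the total expected distortion of
    the chosen operating points meets the budget, then report the total energy."""
    for _ in range(iters):
        lam = 0.5 * (lam_lo + lam_hi)
        picks = pick_operating_points(packets, lam)
        if sum(p["distortion"] for p in picks) > d_budget:
            lam_lo = lam   # distortion too high: weight it more heavily
        else:
            lam_hi = lam
    picks = pick_operating_points(packets, lam_hi)
    return sum(p["energy"] for p in picks), picks
```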

* C. Luna, Y. Eisenberg, T. N. Pappas, R. Berry, and A. K. Katsaggelos, "Transmission energy minimization in wireless video streaming applications," Proc. of Asilomar Conference on Signals, Systems, and Computers, Pacific Grove, California, Nov. 4-7, 2001.


Image-Based Target Detection and Tracking

Aggelos K. Katsaggelos

Thrasyvoulos N. Pappas

Peshala V. Pahalawatta C. Andrew Segall

SensIT, Santa Fe, January 16, 2002