
1

Video Based Real-world Remote Target Tracking On Smartphones

Qia Wang, Alex Lobzhanidze, Hyun I. Jang, Wenjun Zeng, Yi Shang

Dept. of Computer Science, University of Missouri, Columbia

2012 IEEE International Conference on Multimedia and Expo

2

Outline: Introduction, System Overview, Target Object Tracking, Remote Target Position Estimation, Extended Kalman Filtering, Implementation, Conclusion

3

Introduction

Moving object tracking is an important problem in many video applications, such as surveillance and traffic management.

Limitations of existing approaches:

a) object tracking is limited to the image plane, not the physical world [4];

b) a static background is assumed [1];

c) high computational complexity is incurred [6].

[1] R. Rosales and S. Sclaroff, “3D Trajectory Recovery for Tracking Multiple Objects and Trajectory Guided Recognition of Actions.”
[4] S. Weng, C. Kuo and S. Tu, “Video object tracking using adaptive Kalman filter.”
[6] M. Roh, T. Kim, J. Park and S. Lee, “Accurate object contour tracking based on boundary edge selection.”

4

Introduction

A new tracking algorithm and system for use on mobile smartphones.

Light-weight computing: because of the limited computation power of smartphones, complex algorithms with high time and space complexity will not fit.

Reasonable accuracy: to estimate a remote target's position from video, accurate tracking with the object boundary appropriately identified is very important.

Interactive user interface: input from the user's interaction is a unique feature of smartphones and is very useful for video tracking.

5

Introduction

With our system, a user holds the smartphone and takes a short video clip of the remote moving object, then simply draws a rough bounding box enclosing the remote target on the first frame.

Our algorithm then derives a tight bounding box around the target object in all video frames.

Assuming the physical width and height of the remote target are known, we then borrow the idea from the POSIT algorithm [13]: the remote target's position can be calculated from the four 3-D points of the target and their four correspondences on the image.

[13] D. F. DeMenthon and L. S. Davis, “Model-Based Object Pose in 25 Lines of Code.”
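To make the idea concrete, here is a minimal, hedged sketch of recovering the target's position from the four corners of its bounding box and their image correspondences. It uses OpenCV's solvePnP as a stand-in for POSIT [13]; the camera intrinsics, the target dimensions W and H, and the pixel coordinates are illustrative assumptions, not values from the paper.

```python
# Sketch: remote target position from four 3-D model points and their image
# correspondences (solvePnP used as a stand-in for POSIT; values are assumed).
import cv2
import numpy as np

W, H = 1.8, 1.5            # assumed physical width/height of the target (meters)
fx = fy = 1000.0           # assumed focal length in pixels
cx, cy = 640.0, 360.0      # assumed principal point (image center)

K = np.array([[fx, 0, cx],
              [0, fy, cy],
              [0,  0,  1]], dtype=np.float64)

# 3-D model points: the four corners of the target, in the target's own frame.
object_points = np.array([[0, 0, 0],
                          [W, 0, 0],
                          [W, H, 0],
                          [0, H, 0]], dtype=np.float64)

# Their correspondences on the image: the tracked bounding-box corners (pixels).
image_points = np.array([[600, 300],
                         [680, 300],
                         [680, 370],
                         [600, 370]], dtype=np.float64)

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, None)
if ok:
    print("Estimated target position relative to the camera (meters):", tvec.ravel())
```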

6

Introduction

We then use the Extended Kalman Filtering (EKF) model [12] to estimate a smooth trajectory and velocity of the moving target.

While the video is being taken, the smartphone's GPS coordinates and digital compass readings are recorded so that the moving object's trajectory can be drawn on the map.

7

Outline: Introduction, System Overview, Target Object Tracking, Remote Target Position Estimation, Extended Kalman Filtering, Implementation, Conclusion

8

System Overview

9

Outline: Introduction, System Overview, Target Object Tracking, Remote Target Position Estimation, Extended Kalman Filtering, Implementation, Conclusion

10

Target Object Tracking

Our goal is to generate a tight bounding box around the target for all frames in the video, to facilitate subsequent remote target position estimation.

Optical Flow Tracking [18]

Patch Classification

[18] B. D. Lucas and T. Kanade, “An Iterative Image Registration Technique with an Application to Stereo Vision.”

11

Target Object Tracking: Optical Flow Tracking

Object motion and background motion are easier to estimate with the help of the optical flow features.

(Figure on the slide shows the moving object's optical flow features; label: variance (noise).)
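A minimal sketch of the Lucas-Kanade optical-flow step [18], assuming OpenCV and two consecutive gray-scale frames; the file names and tracker parameters are assumptions for illustration only.

```python
# Sketch: pyramidal Lucas-Kanade optical flow between two consecutive frames.
import cv2
import numpy as np

prev = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
curr = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)

# Detect good features to track in the previous frame.
p0 = cv2.goodFeaturesToTrack(prev, maxCorners=200, qualityLevel=0.01, minDistance=7)

# Track them into the current frame with pyramidal Lucas-Kanade.
p1, status, err = cv2.calcOpticalFlowPyrLK(prev, curr, p0, None,
                                           winSize=(21, 21), maxLevel=3)

# Flow vectors of the successfully tracked features; motion that deviates
# strongly from the background flow suggests the moving object.
good_old = p0[status.ravel() == 1].reshape(-1, 2)
good_new = p1[status.ravel() == 1].reshape(-1, 2)
flow = good_new - good_old
print("median flow:", np.median(flow, axis=0), "variance:", np.var(flow, axis=0))
```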

12

Target Object Tracking: Patch Classification

To obtain a tight and accurate bounding box, we need to find the Canny edges that belong to the object boundary. Canny edges are detected first and added as a mask on top of the cropped image, as shown in Figure 3.
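A small, hedged sketch of this Canny-edge masking step; the crop coordinates and Canny thresholds are assumed values, not the paper's settings.

```python
# Sketch: detect Canny edges inside the cropped bounding box and overlay them
# as a mask on the crop (roughly as in Figure 3). Values below are assumed.
import cv2

frame = cv2.imread("frame_000.png")
x, y, w, h = 600, 300, 80, 70             # assumed bounding box from the tracker
crop = frame[y:y + h, x:x + w]

gray = cv2.cvtColor(crop, cv2.COLOR_BGR2GRAY)
edges = cv2.Canny(gray, 50, 150)          # binary edge map (255 on edge pixels)

# Overlay the edge mask on the cropped image for inspection.
overlay = crop.copy()
overlay[edges > 0] = (0, 0, 255)
cv2.imwrite("crop_with_canny_mask.png", overlay)
```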

13

Target Object Tracking

Intensity-based segmentation is then performed to get different intensity clusters (background patches and object patches, as shown on the slide).
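A hedged sketch of one way to perform intensity-based segmentation of the cropped image, using k-means on gray-scale values; the number of clusters is an assumption, not the paper's setting.

```python
# Sketch: cluster pixel intensities with k-means; connected regions of one
# cluster label form candidate background / object patches.
import cv2
import numpy as np

gray = cv2.imread("crop.png", cv2.IMREAD_GRAYSCALE)
samples = gray.reshape(-1, 1).astype(np.float32)

criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 20, 1.0)
K = 4                                     # assumed number of intensity clusters
_, labels, centers = cv2.kmeans(samples, K, None, criteria, 5,
                                cv2.KMEANS_RANDOM_CENTERS)

# Each pixel now carries a cluster label.
label_map = labels.reshape(gray.shape)
print("cluster intensity centers:", centers.ravel())
```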

14

Target Object Tracking

The Bayesian rule is then applied to determine which category a patch belongs to.

For a patch, its prior probability P(B) of belonging to the background and its prior probability P(O) of belonging to the object should be equal if no other information is available.
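In standard Bayesian form (the paper's exact likelihood model appears only as an image on the slide and is not reproduced here), the posterior for a patch l would read:

```latex
% Posterior that patch l belongs to the object (O) or the background (B),
% with equal priors P(O) = P(B) = 1/2 when no other information is available.
P(O \mid l) = \frac{P(l \mid O)\,P(O)}{P(l \mid O)\,P(O) + P(l \mid B)\,P(B)},
\qquad P(B \mid l) = 1 - P(O \mid l)
```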

15

Target Object Tracking

(Likelihood model shown as an equation on the slide; recoverable symbol labels: position of a pixel in a patch, patch L's gray-scale intensity value at frame t, and motion-compensated prediction residual errors.)

16

Target Object Tracking

(Classification rule shown as an equation on the slide; recoverable symbol label: threshold value.)

17

Target Object Tracking

Patches located along the sides of the bounding box have P(B) > P(O).

Patches located around the geometric center of the bounding box have P(O) > P(B).

18

Target Object Tracking

To obtain a tight bounding box, all we need is to detect the Canny edges belonging to the object.

K-Nearest-Neighbor-style rule: for each Canny edge pixel, find its neighboring pixels whose distance to it is less than R (R = 3); if these neighbors satisfy a threshold condition (threshold value shown on the slide), the pixel is treated as a Canny edge pixel belonging to the object.
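A hedged sketch of this neighborhood rule, assuming the decision is a simple vote over the fraction of within-radius neighbors that fall inside object patches; the threshold value and the voting form are assumptions, not the paper's exact rule.

```python
# Sketch: keep a Canny edge pixel only if enough of its neighbors within
# radius R lie inside object patches (voting rule and threshold are assumed).
import numpy as np

R = 3
THRESHOLD = 0.5                       # assumed fraction of object-patch neighbors

def object_edge_pixels(edge_mask, object_mask):
    """edge_mask, object_mask: boolean 2-D arrays of the same shape."""
    kept = np.zeros_like(edge_mask, dtype=bool)
    ys, xs = np.nonzero(edge_mask)
    h, w = edge_mask.shape
    for y, x in zip(ys, xs):
        y0, y1 = max(0, y - R), min(h, y + R + 1)
        x0, x1 = max(0, x - R), min(w, x + R + 1)
        window = object_mask[y0:y1, x0:x1]
        if window.mean() >= THRESHOLD:   # enough neighbors belong to the object
            kept[y, x] = True
    return kept
```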

19

Outline: Introduction, System Overview, Target Object Tracking, Remote Target Position Estimation, Extended Kalman Filtering, Implementation, Conclusion

20

Remote Target Position Estimation

21

Outline: Introduction, System Overview, Target Object Tracking, Remote Target Position Estimation, Extended Kalman Filtering, Implementation, Conclusion

22

Extended Kalman Filtering

To estimate the moving object's true trajectory.

$\vec{a} = \begin{pmatrix} a_x \\ a_y \end{pmatrix}$

$\vec{x}_{t+1} = \begin{pmatrix} pos_x + vel_x\,\Delta T + \tfrac{1}{2} a_x \Delta T^2 \\ pos_y + vel_y\,\Delta T + \tfrac{1}{2} a_y \Delta T^2 \\ vel_x + a_x \Delta T \\ vel_y + a_y \Delta T \end{pmatrix}$

23

Extended Kalman Filtering

(Block diagram on the slide: the Prediction step performs state prediction and covariance prediction; the Correction step computes the Kalman gain and corrects the state with the measurement, accounting for process noise and measurement error.)

Observation model, with observation matrix
$C = \begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \end{pmatrix}$
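A compact sketch of the prediction/correction cycle for the constant-acceleration state model above, with the observation matrix C selecting the measured position. The noise covariances and time step are assumed values; the linear form is shown for brevity, whereas the paper applies the extended (EKF) formulation.

```python
# Sketch: predict/correct cycle for state x = (pos_x, pos_y, vel_x, vel_y)
# with acceleration input a and observation matrix C = [[1,0,0,0],[0,1,0,0]].
# Q, R_meas, and dt are assumed values for illustration.
import numpy as np

dt = 0.1
A = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]])
B = np.array([[0.5 * dt**2, 0],
              [0, 0.5 * dt**2],
              [dt, 0],
              [0, dt]])
C = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]])
Q = np.eye(4) * 1e-3           # assumed process noise covariance
R_meas = np.eye(2) * 1e-1      # assumed measurement noise covariance

x = np.zeros(4)                # state estimate
P = np.eye(4)                  # state covariance
a = np.array([0.0, 0.0])       # acceleration input

def step(z):
    """One predict/correct cycle given a position measurement z = (x, y)."""
    global x, P
    # Prediction: state and covariance
    x = A @ x + B @ a
    P = A @ P @ A.T + Q
    # Correction: Kalman gain, then update with the measurement
    K = P @ C.T @ np.linalg.inv(C @ P @ C.T + R_meas)
    x = x + K @ (z - C @ x)
    P = (np.eye(4) - K @ C) @ P
    return x
```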

24

Outline: Introduction, System Overview, Target Object Tracking, Remote Target Position Estimation, Extended Kalman Filtering, Implementation, Conclusion

25

Implementation

26

Implementation

27

Implementation

a: Optical Flow, b: Canny Edge, c: Segmentation, d: Patch Motion Estimation, e: Connected Component Analysis

28

Implementation

29

Implementation

30

Outline: Introduction, System Overview, Target Object Tracking, Remote Target Position Estimation, Extended Kalman Filtering, Implementation, Conclusion

31

Conclusion

A novel, easy-to-use, video-based remote target tracking system on commodity smartphones is presented.

A new light-weight video object tracking algorithm is developed specifically for the smartphone platform, taking into account the smartphone's computation limitations while taking advantage of its user-friendly interface.