A Study on Data Fusion Techniques Used in Multiple Radar Tracking

Tan Xuan You¹, Lee Kar Heng²

Abstract

This project aimed to compare the resultant errors when Measurement Fusion (Plot Fusion) and Track Fusion were used to combine data from multiple sensors in a simulated environment analogous to Singapore's. The simulation and analysis were carried out entirely in a program running under MATLAB 6.1. Results showed that Measurement Fusion was more accurate when tracking objects following a path with many turns; however, the major source of error was not the fusion algorithm but the inclusion algorithm.

Introduction

Rationale

In the tumultuous post-9/11 world, it is important that Singapore take steps to enhance her security. Fusion algorithms allow data from multiple sensors to be combined, painting a more accurate picture of activity in her waterways and allowing more immediate action to be taken when a threat is identified.

Explanation of Measurement Fusion and Track Fusion

"Fusion" describes the integration of data from multiple sensors using a combination of mathematical algorithms and data-transfer architecture. Two basic architectures are used in theory, Measurement Fusion and Track Fusion; in practice, a combination, or hybrid, system is often used.

(a) Measurement Fusion

(b) Track Fusion

Figure 1 Data Fusion Architectures

Measurement Fusion (Figure 1a) can best be described as a central-level tracking system. In pure Measurement Fusion, raw data is obtained directly from the sensors and passed through a single process that generates a track of the object. Track Fusion (Figure 1b) can be described as sensor-level tracking. In pure Track Fusion, each sensor's raw data is passed through a process that generates the track of each object as detected by that sensor.

_________________________________
¹ SRP Student, Hwa Chong Junior College
² Sensor Systems Division, Defence Science Technology Agency

These tracks are then passed to a central process that combines these tracks to obtain the location and track history of the object. In practice, many systems use a combination of both architectures, or use feedback algorithms in calibration of sensor-level tracking. Some systems may also use the best individual track as the output track, to prevent misleading data from other sensors from compromising the overall accuracy of the sensor track.
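
For illustration, the difference between the two architectures can be reduced to a single position update from two radars. The MATLAB sketch below is not the project's code; the measurement values and covariances are assumed, and the formulas are the standard information-form combination of raw plots (Measurement Fusion) and the simple convex combination of track estimates (Track Fusion).

    % Minimal sketch (not the project code): one 2-D position update from two radars.
    z1 = [1020; 495];   z2 = [980; 510];            % raw position measurements (assumed, metres)
    R1 = diag([400 400]); R2 = diag([900 900]);     % measurement noise covariances (assumed)

    % Measurement Fusion (central-level): fuse the raw plots, then track the fused plot.
    W  = inv(R1) + inv(R2);                         % combined information
    zF = W \ (R1\z1 + R2\z2);                       % fused measurement fed to a single tracker

    % Track Fusion (sensor-level): each sensor runs its own tracker and reports a
    % track estimate with covariance; the central process combines the estimates.
    x1 = [1015; 500];  P1 = diag([350 350]);        % local track estimates (assumed)
    x2 = [985; 505];   P2 = diag([800 800]);
    PF = inv(inv(P1) + inv(P2));                    % simple convex combination
    xF = PF * (P1\x1 + P2\x2);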

Hypothesis

The hypothesis is that pure Track Fusion is more accurate in pinpointing the object's track, while pure Measurement Fusion is faster at pinpointing the object's location. This is because an average of the covariance estimates from many sensor-level tracks should give a rather accurate combined track, compared with covariance estimates based on raw data only. For an object's location at a particular point in time, however, Measurement Fusion can take the centroid of the raw data points collected, which is a feasible and accurate location estimate. Since military applications commonly deal with fast-moving objects, Track Fusion should on the whole be the superior fusion architecture for military use.

Materials and Methods

MATLAB 6.1 was used to simulate various situations using five simulated radars placed at locations along Singapore's coast¹. The simulated radars had detection probabilities of 80%, 90% or 95%, and update rates of 1 s, 2 s or 4 s. Targets were given different tracks, and their speeds varied from 10 knots to 60 knots. The raw data was then fused using either Measurement Fusion (perfect association algorithm) or Track Fusion (simple convex algorithm), and the errors in object position, azimuth and range were plotted against time. The following scenarios and radar locations, drawn to a rough scale, were used.
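
To make the simulated quantities concrete, the sketch below generates one radar scan of a target with a given detection probability and noisy range and azimuth measurements. It is illustrative only, not the project's simulator; the positions and noise standard deviations are assumptions.

    % Illustrative sketch of one radar scan (not the project's simulator).
    radar    = [3000; 1000];        % radar position (m), assumed
    target   = [8000; 4000];        % true target position (m), assumed
    Pd       = 0.90;                % detection probability (80%, 90% or 95% in the study)
    sigma_r  = 30;                  % range noise standard deviation (m), assumed
    sigma_az = 0.5*pi/180;          % azimuth noise standard deviation (rad), assumed

    if rand < Pd                                        % does this scan detect the target?
        d      = target - radar;
        r      = norm(d) + sigma_r*randn;               % noisy range
        az     = atan2(d(2), d(1)) + sigma_az*randn;    % noisy azimuth
        plotXY = radar + r*[cos(az); sin(az)];          % measured (plot) position
    end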

Figure 2 Simulation Scenarios

These routes include the paths taken by large ships through the Straits of Singapore, as well as other routes that are confusing and challenging for the fusion algorithms. A, B, …, E indicate the simulated radar positions.

¹ Locations taken from the Nautical Chart

Results and Analysis

Note: Errors in elevation and azimuth were not taken into consideration, as objects were deemed to be on the sea, which is at an assumed constant elevation.

Measurement Fusion versus Track Fusion

In the situation "Turn after C", when Track Fusion is used, the following error plot is obtained.

Figure 3 ‘Turn After C’ Tracking Using Track Fusion

This shows a large position error (the n-shaped curve) while the object is turning near the radar, and minimal position error when the object is moving in a straight line. This is probably due to the tracking algorithm: when the object turns, its low velocity makes it almost stationary with respect to the radar, which makes it almost impossible for the algorithm to predict its future location. Thus the position error peaks while the object is turning. The continuously changing range also produces a large range error (the curve near the horizontal axis), as the algorithm fails to predict the future range correctly.
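
The report does not state which tracking filter the simulator uses, but the effect of a turn on any straight-line predictor can be illustrated as follows. Assuming a constant-velocity prediction (a common default), a 90-degree turn within one update interval leaves a position error of roughly dt·v·√2; all numbers below are assumed.

    % Sketch of why a turn inflates position error under a constant-velocity
    % prediction (assumption; the simulator's actual tracker is not documented here).
    dt = 2;                              % update interval (s), one of the rates used
    x  = [5000; 2000; 10; 0];            % state [px; py; vx; vy]: heading east at 10 m/s
    F  = [1 0 dt 0;
          0 1 0 dt;
          0 0 1  0;
          0 0 0  1];
    xPred = F*x;                         % predicted straight-line position

    % If the target instead turns 90 degrees and heads north during the interval,
    % the prediction misses by about dt*speed*sqrt(2).
    xTrue  = [5000; 2000 + 10*dt; 0; 10];
    posErr = norm(xPred(1:2) - xTrue(1:2));   % roughly 28 m for these assumed values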

Other Deductions

The overview of the error plot for Measurement Fusion in "Trading Route" was particularly interesting.

Figure 4 ‘Trading Route’ Tracking Using Measurement Fusion

This shows a large positional error when the object is far from the radars (at the beginning and at the end). Obvious as this observation may be, it points out a flaw in the inclusion algorithm: while tracking the object, too many of the five available radars are used, producing a larger error than if only the closest radars were used. This was verified in the error plot for Measurement Fusion in the situation "Convoluted".

[Error plot: Position Error, Range Error, Azimuth Error and Elevation Error against time]

Figure 5 ‘Convoluted’ Tracking Using Measurement Fusion

The large error towards the end is not a result of fusion error but of accurate data from the nearest sensors being contaminated by inaccurate data from the more distant sensors. The range error near the point of maximum range error is shown below:

[Error plot, zoomed in near the point of maximum range error: Position Error, Range Error, Azimuth Error and Elevation Error]

Figure 6 Error Performance

This shows a relatively constant range error whose magnitude does not exceed 1000. This is probably because the range does not vary much over the entire trading route.

Other Problems Encountered

1. The program used many different files and segments to accomplish a given task. Program execution, although flexible, was tedious and messy.
2. The program was a simulation and reflected real-life situations only to a limited extent.
3. The graphical user interface made it difficult to create a realistic, accurate scenario with Singapore's radars. Lack of time, of training in MATLAB and of instruction in the program's intricacies made it impossible to decipher the code and manually add radar locations.
4. The simulation could not cope with objects passing close to the radars or with paths containing rapid, sharp turns. Some simulations had to be redone to compensate for this.
5. Many error reports were produced, but owing to an incomplete understanding of the program, some errors could not be resolved.

Improvements

The inclusion algorithm could be improved to first find the centroid of the detected object locations, x0, then remove the two detections with the largest error and recompute the centroid, x1, which should be a more accurate value. Another possibility would be to set a software "maximum range" for each radar: the algorithm would ignore data from a radar whenever the object's distance from it exceeded a certain value. This would keep errors due to large distances between the object and the radar to a minimum.
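
A minimal sketch of both suggestions follows; the detection values, radar positions and the 10 km gating threshold are assumptions chosen only to illustrate the logic.

    % Sketch of the two suggested improvements (illustrative values only).
    detections = [8010 7950 8100 9200 7990;          % one column per radar's measured
                  4005 3960 4080 4900 4010];         % target position (m), assumed
    radars     = [3000 6000 1000 20000 9000;         % radar positions (m), assumed
                  1000  500 4000 12000  200];

    % (1) Trimmed centroid: find x0, discard the two detections farthest from it,
    %     then recompute the centroid x1.
    n    = size(detections, 2);
    x0   = mean(detections, 2);
    err  = sqrt(sum((detections - x0*ones(1,n)).^2, 1));
    [errSorted, order] = sort(err);                  % ascending
    keep = order(1:n-2);                             % drop the two largest errors
    x1   = mean(detections(:, keep), 2);

    % (2) Software maximum range: ignore any radar whose distance to the rough
    %     target position exceeds a chosen limit.
    maxRange = 10000;                                % assumed gating threshold (m)
    dist     = sqrt(sum((radars - x0*ones(1,n)).^2, 1));
    xGated   = mean(detections(:, dist <= maxRange), 2);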

Conclusion

In Measurement Fusion versus Track Fusion, Track Fusion fares worse than Measurement Fusion when objects turn rapidly, for example when dealing with speedboats and light ships. Analysis has shown that fusion methods do not improve accuracy much when tracking large, slow-moving ships, so the systems can be redirected to faster-moving, more important targets. However, errors due to Measurement Fusion and Track Fusion are insignificant compared with errors from other sources, because the objects travel at relatively low velocities and any errors in sensor data are largely compensated for by the large amount of data available. Improvements could be made both to the simulation algorithm and to the fusion algorithms used in practical applications, which would keep errors to a minimum. The simulation program could also be further developed and simplified, so that further studies in this area can be carried out with greater ease. In all, besides using a hybrid of Measurement Fusion and Track Fusion, changes could be made to the basic algorithm, and methods such as best individual track could be used to provide the most accurate track for use.

Acknowledgements

My parents, for supporting me through my life, including this project. Mr Lee Kar Heng, my mentor, for being there and helping out despite various difficulties. Hwa Chong Junior College, The National University of Singapore and the Defence Science Technology Agency, for giving me the chance to do this project.

References

Klein, Lawrence A. Sensor and Data Fusion Concepts and Applications. Ed. Donald C. O'Shea. Bellingham, WA: SPIE Optical Engineering Press, 1993.

Symons, S. J., J. A. H. Miles and J. R. Moon. "Comparison of Plot and Track Fusion for Naval Sensor Integration." N.p., 2002. 1361-1366.

"Singapore Straits 1:200000 Nautical Chart." Maritime Port Authority Singapore, 1999. p. 202.