
Deliverable D 3.3 Report on real-time algorithm implementation and

performance evaluation

Reviewed: (no)

Project acronym: SMART

Starting date: 01/10/2016

Duration (in months): 36

Call (part) identifier: H2020-S2RJU-OC-2015-01-2

Grant agreement no: 730836

Due date of deliverable: Month 30

Actual submission date: 30/09/2019

Responsible/Author: Muhammad Abdul Haseeb , UB

Dissemination level: PU

Status: Draft

Ref. Ares(2019)6063278 - 30/09/2019


Document history

Revision Date Description

01 06.08.2019 First draft

02 18.09.2019 Second draft

03 30.09.2019 Submitted version

Report contributors

Name | Beneficiary Short Name | Details of contribution
Danijela Ristić Durrant | UB | Initial draft, document structure and Chapters 1, 2, 3, 4
Muhammad Abdul Haseeb | UB | Chapter 5, Sections 5.1, 5.2, 5.3
Milan Banić | UNI | Sub-section 5.2.2
Danijela Ristić Durrant | UB | Sub-section 5.3.3
Muhammad Abdul Haseeb | UB | Final review
Danijela Ristić Durrant | UB | Final document and submission


Table of Contents

1. Executive Summary
2. Abbreviations and acronyms
3. Background
4. Objective/Aim
5. SMART Obstacle Detection System (ODS)
5.1 Requirement Analysis
5.2 Hardware
5.2.1 Sensors
5.2.2 Network
5.2.3 Processing unit
5.3 Software
5.3.1 Machine learning-based algorithms timing considerations
5.3.3 Human-machine Interface
6. References


1. Executive Summary

The main goal of the SMART project is to increase the effectiveness and capacity of rail freight by contributing to the automation of railway cargo haul on European railways. The two SMART work streams are:

• Development of a prototype of an autonomous obstacle detection system (ODS),

• Development of a real-time marshalling yard management system.

The SMART solution for obstacle detection (OD) provides prototype hardware and software algorithms for detecting obstacles on the rail tracks ahead of the locomotive. The system combines different vision technologies (a thermal camera, a night vision sensor, i.e. a camera augmented with an image intensifier, multiple RGB cameras, and a laser scanner) to create a multi-sensor system for mid-range (up to 200 m) and long-range (up to 1000 m) obstacle detection during day and night operation, as well as during operation in poor visibility conditions.

This deliverable document (D3.3) reports the activities, effort and work undertaken in Work Package 3 (WP3, Development of software algorithms for obstacle detection on railway tracks) of the SMART project, focused on the real-time ODS hardware and software implementation.

The following documents provide additional perspectives for the present work:

1. D1.1 Obstacle Detection System Requirements and Specification
2. D2.1 Report on selected sensors for multi-sensory system for obstacle detection
3. D2.3 Report on sub-systems conformance testing
4. D2.4 Report on functional testing of fully integrated multi-sensor obstacle detection system
5. D3.1 Report on algorithms for 2D image processing
6. D3.2 Report on SMART data fusion and distance calculations
7. D7.1 Report on evaluation of developed SMART technologies


2. Abbreviations and acronyms

Abbreviation / Acronym Description

GUI Graphical user interface

GPS Global Positioning System

GPU Graphics processing unit

HMI Human-machine interface

ODS Obstacle Detection System

OD Obstacle Detection

ML Machine Learning

ROI Region of Interest (in an image)

ROS Robot operating system

RGB RGB (Red Green Blue) camera image

RTS Real-time systems

S2R JU Shift2Rail Joint Undertaking

SMART Smart Automation of Rail Transport

WP Work Package


3. Background

The present document constitutes Deliverable D3.3, “Report on real-time algorithm implementation and performance evaluation”, prepared in the framework of TD5.6 “Autonomous train operation” (task 2, 2016-2019) of IP5 (MAAP version November 2015).

4. Objective/Aim

This document reports on the real-time implementation of the algorithms for autonomous obstacle detection developed within the Obstacle Detection work stream of the SMART project. The SMART prototype of a novel, reliable on-board system for obstacle detection on railway mainlines has been developed with the long-term goal of integration into the planned Autonomous Train Operation (ATO) module over a standardized interface. In this way, SMART makes an important contribution to the vision of a fully automated rail freight system (IP5, TD5.6: “Autonomous Train Operation”).


5. SMART Obstacle Detection System (ODS)

Detecting and estimating the distance of various obstacles on the rail tracks in real time is a major challenge. In the following, the main development and performance characteristics of the real-time SMART Obstacle Detection System (ODS) are presented. The focus is on the vision-based obstacle detection system, since in the final realization of the SMART ODS the vision sensors (cameras) were used for mid- and long-range obstacle detection in the operational environment (details given in Deliverables D2.4 and D7.1). The SMART laser sensor, because of its limitations, was used only for mid-range dataset generation, that is, for providing the ground-truth data for off-line training of the machine learning models for obstacle detection (Deliverable D3.2).

Real-time systems (RTS) are defined as hardware- and software-based systems that assure a system response within specified time constraints. The required “event-to-response time”, or “system response time”, depends on many factors. For example, in a high-speed machine vision application, where the system itself or the captured scene is moving fast, timing and hardware selection are among the many factors that have to be taken into account when designing the system.

In addition, a challenging machine vision application is often highly bandwidth-demanding, bringing a huge amount of streamed data from the vision sensors to the processing unit. For example, a high-resolution CT scan, where dense data is needed, is a high-bandwidth application. As with high-speed applications, high-bandwidth applications are characterized by many factors that need to be considered when designing the real-time system.

The SMART ODS requires a high-bandwidth, low-latency network to connect all cameras to the central processing unit. Based on the nature of the application, both challenges, high speed and high bandwidth, were assumed during the development of the real-time SMART ODS for rail transport. In the next section, the requirements of the SMART ODS are discussed in detail.

5.1 Requirement Analysis

The requirements that make the SMART ODS a real-time system are grouped into two main categories, high-speed and high-bandwidth requirements, as listed below.

High-speed:

1. The ODS is mounted on a moving freight train running at 80 km/h, the speed of a conventional freight train.

2. The train stopping distance is approximately 700 meters for the above-mentioned speed, according to the regulations.

3. The scene/environment is moving or continuously changing.

High-bandwidth:

1. Mid-range (up to 200 m) and long-range (up to 1000 m) obstacle detection requires high-resolution images so that distant objects are visible in the image.


2. Simultaneous data acquisition from multiple vision sensors is required.

Based on the above-mentioned factors, the minimal required response time and bandwidth analysis was performed, as described in the following sections.

Minimum requirement approximation: In order to estimate the minimum requirements of the SMART ODS, let us assume a freight train moving at 80 km/h (22.22 m/s) on a straight rail track. According to railway regulations, the stopping distance of a freight train running at 80 km/h is about 700 meters.

The required event-to-response time t_res (time required by the perception module) should be less than the data acquisition time t_acq, which means the system should be able to process the captured data (camera frame) and respond before the acquisition of the next frame:

t_res < t_acq (1)

t_process = t_acq + t_res (2)

The stopping distance d_stop and the stopping time t_stop of a train equipped with the SMART ODS while running at speed v can be calculated as:

t_stop = t_process + t_braking + t_delay (3)

d_stop = v · t_stop (4)

where t_braking is the time required to fully stop the train after brake engagement, t_delay is the driver reaction time after the ODS warning, and t_process is the total processing time, i.e. the time required by the hardware and software components to deliver an output from the moment at which the event occurred. The braking time t_braking depends on many physical factors of the train itself [1]:

• the speed of the train when the brakes are applied;

• the deceleration rate available with a full-service brake application, which varies according to the coefficient of friction between wheel and rail;

• the delay from when the brakes are commanded by the train driver to when they actually become effective (brake delay time);

• the state of the wear of the brake pads and the air pressure available in the brake cylinders;

• the geography of the track, in particular, the track gradient the train travels over from when the brakes are commanded to where the front of the train stops;

• the mass distribution of the train.
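Equations (3) and (4) can be illustrated with a small numerical sketch. The values below are illustrative only: the train speed and the 700 m regulatory braking distance come from this section, t_process = 0.9 s is the all-sensors total processing time reported later in Table 3, and the driver reaction time t_delay = 1.0 s is an assumption for the sake of the example.

```python
# Illustrative sketch of equations (3)-(4). t_delay is an assumed value;
# t_process = 0.9 s corresponds to the all-sensors case in Table 3.

V = 80 / 3.6          # train speed in m/s (80 km/h is approx. 22.22 m/s)
T_PROCESS = 0.9       # total ODS processing time in s (Table 3, all sensors)
T_DELAY = 1.0         # assumed driver reaction time in s
D_BRAKING = 700.0     # regulatory braking distance at 80 km/h, in m

def reaction_distance(v, t_process, t_delay):
    """Distance travelled between the event and the brake engagement."""
    return v * (t_process + t_delay)

def stopping_distance(v, t_process, t_delay, d_braking):
    """Total stopping distance: reaction distance plus braking distance."""
    return reaction_distance(v, t_process, t_delay) + d_braking

d_react = reaction_distance(V, T_PROCESS, T_DELAY)
d_total = stopping_distance(V, T_PROCESS, T_DELAY, D_BRAKING)
print(f"reaction distance: {d_react:.1f} m, total stopping distance: {d_total:.1f} m")
```

With these values the train travels roughly 42 m before braking even starts, which is why obstacles must be detected well beyond the 700 m braking distance and motivates the long-range (up to 1000 m) detection requirement.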

5.2 Hardware

A real-time obstacle detection system for operational trains can be difficult to implement as a PC-based system because of the potentially high computational cost and algorithm complexity.


Emerging embedded vision systems for robotics and security applications allow information to be processed and appropriate actions to be taken swiftly. Such systems show high performance and high accuracy while requiring low energy consumption.

Despite the demanding requirements, a PC-based system was chosen for the real-time implementation when prototyping the SMART ODS, because it was sufficient to meet the prototype system requirements and because of project budget constraints. In order to cope with object detection and distance estimation as computationally expensive tasks, novel hardware-specific optimizations of the proposed image processing algorithms were performed, which allow the algorithms to run in real time.

The SMART hardware can be considered through three parts: Sensors, Network and

Processing Unit.

5.2.1 Sensors

The SMART ODS consists of 5 vision sensors: three RGB zooming cameras from The Imaging Source [2], a thermal camera from FLIR [3], and a night vision sensor consisting of a custom-made night vision lens mounted on a monochrome camera from The Imaging Source. Some specifications of the sensors, which were also considered during the selection and design of the SMART network and processing unit, are given below.

Table 1. SMART vision sensors

Vision Sensor Resolution (Pixels) Frequency (Hz) Size (Megapixels)

RGB 2592x1944 15 5MP

Thermal (TH) 640x512 9 0.328MP

Night Vision (NV) 2592x1944 15 5MP
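The figures in Table 1 make the network bandwidth requirement concrete. The sketch below gives a rough estimate of the aggregate raw data rate; the bytes-per-pixel values are assumptions for uncompressed transfer (3 B/pixel for RGB, 1 B/pixel for the monochrome night vision camera, 2 B/pixel for the thermal stream):

```python
# Rough raw-bandwidth estimate for the sensors in Table 1. The
# bytes-per-pixel values are assumptions (uncompressed transfer).

SENSORS = [
    # (name, width_px, height_px, fps, bytes_per_pixel, camera_count)
    ("RGB",          2592, 1944, 15, 3, 3),
    ("Thermal",       640,  512,  9, 2, 1),
    ("Night Vision", 2592, 1944, 15, 1, 1),
]

def total_bits_per_second(sensors):
    """Sum of raw per-sensor data rates in bits per second."""
    total = 0
    for name, w, h, fps, bpp, count in sensors:
        total += w * h * fps * bpp * 8 * count
    return total

bps = total_bits_per_second(SENSORS)
print(f"aggregate raw data rate: {bps / 1e9:.2f} Gbit/s")
```

Even under these assumptions the aggregate exceeds 6 Gbit/s, far above what a gigabit network can carry, which is consistent with the move to a 10-gigabit switch described in Section 5.2.2.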

5.2.2 Network

The estimation of the minimum requirements and the selection of the network hardware are crucial for applications such as the SMART ODS, which is characterised by large data sizes and high speed. Due to the large image sizes, a high-bandwidth network was needed. As described in Deliverable D2.1, the original network setup was based on a gigabit network. However, during sub-system conformance testing (Deliverable D2.4) it was established that a gigabit network is not sufficient to enable real-time sensor connection, and it was consequently decided to increase the network bandwidth to a 10-gigabit network. As a consequence, the network switch was changed from the originally planned ADVANTECH EKI-9512P to a NETGEAR XS708T, with the specifications given in Table 2. The selected 10-gigabit network switch was chosen to overcome the problem of data overflow and to enable acquiring maximum-resolution images at high speed.

Table 2. ODS demonstrator switch specifications [10]


Technology

Standards:
• IEEE 802.3 Ethernet
• IEEE 802.3u 100BASE-T (XS708T/XS716T only)
• IEEE 802.3ab 1000BASE-T
• IEEE 802.3an 10GBASE-T (10 Gbps Ethernet over copper twisted-pair cable)
• IEEE 802.3ae 10-Gigabit Ethernet over fiber (10GBASE-LRM), XS708T/XS716T only
• IEEE 802.3ae 10-Gigabit Ethernet over fiber (10GBASE-SR, 10GBASE-LR, 10GBASE-ER, 10GBASE-LX4)
• IEEE 802.3z Gigabit Ethernet 1000BASE-SX/LX
• IEEE 802.3x Full-Duplex Flow Control
• IEEE 802.1Q VLAN Tagging
• IEEE 802.3ad Trunking (LACP)
• IEEE 802.1AB LLDP with ANSI/TIA-1057 (LLDP-MED)
• IEEE 802.1p Class of Service
• IEEE 802.1D Spanning Tree (STP)
• IEEE 802.1s Multiple Spanning Tree (MSTP)
• IEEE 802.1w Rapid Spanning Tree (RSTP)
• IEEE 802.1x RADIUS Network Access Control
• IEEE 802.3az Energy Efficient Ethernet (EEE)

Interface:
• Ports: 8x 10GBASE-T copper, 2x SFP+ 1000/10GBASE-X fiber ports (shared)

Power Requirements:
• Input voltage: internal, 100-240 VAC, 50-60 Hz
• Power consumption: ~49.5 W

Physical Characteristics:
• Housing: aluminum shell
• IP rating: -
• Dimensions: 440 x 204 x 43 mm
• Weight: 2.61 kg
• Installation: rack mounting

Environmental Limits:
• Operating temperature: -5 °C to 55 °C
• Storage temperature: -10 °C to 70 °C
• Ambient relative humidity: max. 95% (non-condensing)

Standards and Certifications:
• Electromagnetic emissions and immunity: CE mark, commercial; FCC Part 15 Class A; VCCI Class A; EN 55022 (CISPR 22) Class A; C-Tick; EN 55024; CCC; 47 CFR FCC Part 15, Subpart B, Class A; ICES-003: 2016 Issue 6, Class A; ANSI C63.4:2014; IEC 60950-1:2005 (ed.2)+A1:2009+A2:2013; AN/NZS CISPR 22:2009+A1:2010 Class A
• Safety: CB mark, commercial; CSA certified (CSA 22.2 #950); UL listed (UL 1950)/cUL; IEC 950/EN 60950; EN 60950-1:2006+A11:2009+A1:2010+A12:2011+A2:2013; IEC 60950-1:2005 (ed.2)+A1:2009+A2:2013; AN/NZS 60950.1:2015; CCC (China Compulsory Certificate)

MTBF (mean time between failures): 276,197 h

It is important to note that at the moment of selection of the above-specified network switch, there were no commercially available 10-gigabit switches certified for railway applications. However, the finally selected switch is certified for short-circuit protection and for electromagnetic immunity and emissions, and these certificates were used to obtain the permit for mounting the ODS onto a vehicle in real traffic conditions. The permit was requested from the vehicle owner, Serbia Cargo. As the prototype of the SMART ODS is not an interoperability constituent and does not influence the locomotive control or the functioning of the locomotive's devices/subsystems, the noted certificates were sufficient to obtain the permit.

The change of the network setup also affected the finally selected power supply system. The power supply originally planned and described in D2.1 had to be re-designed because the finally chosen switch (like any other 10-gigabit switch available on the market at the moment of the final SMART ODS sensor-housing design) does not have Power over Ethernet (PoE) capabilities. It was decided that the SMART ODS is powered via a UPS (APC SMT2200IC), which is certified for electromagnetic immunity and emissions. The UPS also adds a layer of protection, as it has control logic that protects from power surges and short circuits. The UPS is located in the locomotive driver cab and is connected to the locomotive's 220 V power outlet. As the SMART OD sensors require different supply voltages, three AC-DC power supplies are installed in the ODS sensor housing. To power the switch and the PoE injector for the thermal camera, DIN-rail 220 V power outlets are also installed in the sensor housing. To prevent possible short circuits and to increase safety, additional fuses and current differential protection are positioned before the power supply elements located in the sensor housing (Figure 1).


Figure 1. The scheme of the power and data flow for SMART OD integrated system mounted onto the Serbia Cargo series 444 locomotive

The SMART network enables receiving data at about 7 Hz at maximum resolution simultaneously from all sensors. However, the subsequent software-based data synchronization lowers the data acquisition rate to 2 Hz. Hardware- or software-based synchronisation is needed to synchronise the data captured independently by the SMART vision sensors at 7 Hz, so that the same scene captured by all vision sensors is processed. This down-sampling of the acquired data does not affect the real-time performance of the SMART ODS and provides enough information. Considering the train speed of 80 km/h, the data acquisition frame rate of 7 FPS (one frame per approximately 143 milliseconds) means that new data is captured at about every 3 meters of travelled distance. The actual processing is done on two frames out of the seven captured frames, meaning that the SMART ODS processes an image taken every 500 milliseconds, or in other words at every 11.11 meters of travelled distance.
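The per-frame travelled distances quoted above follow directly from the train speed and the frame rates; a minimal check:

```python
# Distance travelled between consecutive frames at 80 km/h, for the
# acquisition rate (7 Hz) and the synchronized processing rate (2 Hz).

V = 80 / 3.6  # train speed in m/s

def metres_per_frame(v_mps, rate_hz):
    """Distance the train covers between two consecutive frames."""
    return v_mps / rate_hz

print(f"acquisition (7 Hz): {metres_per_frame(V, 7):.2f} m per frame")
print(f"processing  (2 Hz): {metres_per_frame(V, 2):.2f} m per frame")
```

This reproduces the approximately 3 m per acquired frame and 11.11 m per processed frame stated above.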

5.2.3 Processing unit

The SMART machine learning-based algorithms for obstacle detection and distance estimation require a high-performance processing unit to enable real-time processing. Because of the different sizes in which objects may appear in an image, a large image search space is involved in the SMART object detection algorithms, which increases the complexity and slows down the overall performance of the real-time SMART ODS implementation.


A PC-based processing unit was used for prototyping the real-time obstacle detection system. The processing unit was equipped with an Intel Core i9 CPU and two Nvidia GTX 1080 GPUs (graphics processing units) with 16 GB of memory. Parallel GPU computing enables real-time object detection and distance estimation.

5.3 Software

The SMART ODS software architecture was developed on the Robot Operating System (ROS) distribution Kinetic Kame [4] running on Linux (Ubuntu 16.04 64-bit, http://releases.ubuntu.com/16.04/). During the development of the hardware and software for the SMART ODS, the goal was to achieve high reliability, modularity, redundancy, performance and speed.

ROS works here as middleware that provides the virtual interface between different software modules such as the data acquisition, object detection, distance estimation, visualization and offline processing modules (Figure 2).

Figure 2. Block-diagram of ROS-based software architecture of SMART ODS

The ROS messaging middleware [4] provides Inter-Process Communication (IPC) or shared memory. The publisher/subscriber IPC mechanism enables communication between modules and sub-modules.
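The publisher/subscriber pattern that ROS topics provide can be illustrated with a minimal in-process sketch. This is plain Python, not the actual ROS API (in the real system `rospy` publishers, subscribers and ROS message types take the place of this toy broker), and the topic name is hypothetical:

```python
from collections import defaultdict

class TopicBroker:
    """Toy publish/subscribe broker illustrating the ROS topic pattern:
    modules communicate only through named topics, never directly."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        """Register a callback to be invoked for each message on `topic`."""
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        """Deliver `message` to every subscriber of `topic`."""
        for callback in self._subscribers[topic]:
            callback(message)

# Example: a perception module consumes frames from a (hypothetical) camera topic.
broker = TopicBroker()
received = []
broker.subscribe("/sensors/rgb/image_raw", received.append)
broker.publish("/sensors/rgb/image_raw", {"seq": 1, "data": "..."})
print(received)
```

The key property, also exploited in the SMART architecture, is that publisher and subscriber modules are decoupled: either side can be replaced or restarted without the other knowing.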

Sensor interface module: The sensor interface module is responsible for providing data


from the sensors to the other modules. Each sensor has its own dedicated data acquisition interface. Besides providing data from the sensors, the sensor interface modules are also responsible for sensor parameter configuration during the pre-run test and during the run. This means that the SMART ODS does not need to be completely stopped in order to change any sensor parameters; the parameters can be changed while the SMART ODS is running and the train is moving. The user can change the sensor parameters using the Dynamic Parameters Interface shown in Figure 2. The user interface interacts with the dynamic parameters configurator, which passes the user's configuration changes to the sensor interface modules. Some of the parameters that can be changed are: camera FPS, exposure time, field of view and resolution.

Perception module: after data synchronisation, the received sensor data are further processed by the machine learning and computer vision-based perception module. The perception module is the core building block of the whole SMART ODS software framework; it is responsible for processing the raw data from the sensors and for providing the meaningful information, i.e. rail track detection (ROI), object detection, recognition, tracking and distance estimation.

Global services module: this module includes a data logging module that is capable of recording the raw data from the sensors, the parameter configurations, the sensor parameters and the processed data. The recorded data can be retrieved to analyse the performance of the SMART ODS for further improvement and development. Furthermore, the global services module includes a map which, together with the GPS sensor input (https://www.hardkernel.com/shop/usb-gps-module/), is used to visualize the real-time position of the train. The map can be set up in offline mode, which means that a pre-saved map can be used if an internet connection is not available; online mapping can be used when internet access is possible.

User interface module: the user interface module visualizes all processed information from the perception module and the real-time position of the train on the map. The user interface assists the driver by providing information about obstacles on the rail tracks. The information on the user interface is designed only to alert the driver, who remains responsible for the final decision. In addition, the user interface provides the control interface for the responsible ODS technician to troubleshoot problems, configure parameters and monitor the sub-system modules.

5.3.1 Machine learning-based algorithms timing considerations

Machine learning has a fundamental role in the SMART ODS software development. The machine learning algorithms were developed with the Keras API (https://keras.io/) running on top of TensorFlow (https://www.tensorflow.org/). TensorFlow is one of the most widely used frameworks due to its high performance and processing speed in comparison to other available frameworks. The TensorFlow-based SMART machine learning algorithms were optimized to achieve a processing rate high enough for real-time processing. The achieved processing rate of the TensorFlow-based SMART ODS modules run on the SMART PC is 8 FPS, which means the SMART machine learning-based perception module is able to process a frame in 125 ms. Referring to equation (1) in Section 5.1,


t_res < t_acq (1)

the processing rate should be higher than the data acquisition rate. In the case of the SMART ODS, the data acquisition rate is 7 Hz; after data synchronization, data at 2 Hz are forwarded to the processing unit, where the SMART perception module is able to process at 8 Hz on the SMART PC. The processing rate is thus four times the actual data acquisition rate. This shows that the SMART algorithms are highly optimized to perform reliably in a real-time environment and could also handle a higher data acquisition rate.

As described in Deliverable D3.2, the state-of-the-art object detection algorithm YOLO (You Only Look Once) [5] was chosen as the object detector for the SMART DisNet-based object detection and distance estimation (Figure 3). YOLO uses a single deep-learning network for both object classification and object localization in an image, and it is considered a fast real-time multi-object detection algorithm with a high accuracy rate.

Figure 3. SMART machine learning DisNet -based system used for object distance estimation from a monocular camera image (Deliverable D3.2)

The DisNet-based algorithm was developed for object detection and distance estimation from a monocular camera and can be applied to all three SMART camera types: RGB, thermal and night vision. As described in Deliverable D3.2, the results of obstacle detection from a single camera are satisfactory, but they could be improved by performing sensor fusion, so there is interest in processing more than one camera image at a time. For the purpose of the timing consideration, the total processing time when only one sensor (RGB camera) is used was calculated and compared to the total processing time when more than one sensor is used: 2 (thermal + RGB camera) and 5 (all 5 SMART vision sensors: 3 RGB, thermal and night vision). The sensors' data acquisition times are given in Table 1 as well as in Table 3. The time performance table (Table 3) shows that the SMART ODS satisfies the minimal requirement approximation. However, the time could be reduced further with a better-performing processing unit.
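The core idea behind DisNet (Deliverable D3.2), that the geometry of a detected bounding box is mapped to a distance estimate, can be sketched as follows. The feature construction mirrors the inverse-size features described there; the final regressor, however, is here replaced by a single-term pinhole-style approximation with an assumed scale constant, since the real DisNet is a trained multi-layer network and is not reproduced here:

```python
import math

def bbox_features(box_w_px, box_h_px, img_w_px, img_h_px):
    """Inverse relative bounding-box size features: distant objects
    produce small boxes, so 1/size grows with distance."""
    w = box_w_px / img_w_px
    h = box_h_px / img_h_px
    d = math.hypot(w, h)
    return [1.0 / w, 1.0 / h, 1.0 / d]

def estimate_distance(features, scale_m=1.7):
    """Placeholder for the trained DisNet regressor: distance proportional
    to the inverse relative box height, with an assumed scale constant
    (scale_m). Illustration only, not the project's trained model."""
    inv_rel_height = features[1]
    return scale_m * inv_rel_height

# Example: a person-sized box in a 2592x1944 frame (Table 1 resolution).
feats = bbox_features(box_w_px=50, box_h_px=100, img_w_px=2592, img_h_px=1944)
print(f"estimated distance: {estimate_distance(feats):.1f} m")
```

The essential behaviour is preserved: halving the box height doubles the estimated distance, which is the monotone relationship the trained network refines with laser-derived ground truth.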

In Table 3, f refers to frequency and t_process is the total time required by the acquisition module and the perception module to provide an output in the case of a single camera, whereas the data synchronization time is considered instead of the data acquisition time in the case of multiple cameras.

Table 3. Time performance

Sensor(s) | Sensor Interface Module: Data Acquisition (f_acq, t_acq) | Sensor Interface Module: Data Synchronization (f_acq', t_acq') | Perception Module: Object detection, tracking and distance estimation (f_res, t_res) | Total Processing Time (f_process, t_process)

RGB Camera | 15 Hz, 66.6 ms | - | 8 Hz, 125 ms | 5.21 Hz, 191.6 ms

RGB + Thermal Camera | 9 Hz, 111 ms | 5 Hz, 200 ms | 6.5 Hz, 153.8 ms | 2.82 Hz, 353.8 ms

All sensors | 7 Hz, 142 ms | 2 Hz, 500 ms | 2.5 Hz, 400 ms | 1.11 Hz, 900 ms

As is obvious from the time performance table, the total processing time increases gradually when more than one camera is used in comparison to a single camera. The results of the performance evaluation of the SMART ODS (Deliverable D7.1) show that using one camera at a time gives obstacle detection and distance estimation results that meet the requirements of mid- and long-range object detection (with up to ±10% error). Fusing with the thermal camera processing results can lead to better performance (Deliverable D3.2) while still enabling real-time processing (Table 3). However, for using more than two vision sensors at a time, the processing time needs to be reduced, which could be achieved with a higher-performance processing unit. Nevertheless, the SMART ODS meets the real-time requirements by delivering the object detection and distance estimation results within a fraction of a second.

5.3.3 Human-machine Interface

As described in the introductory part of Section 5.3, the user interface (human-machine interface) module was developed for the real-time visualization of the sensor data processing results and for the real-time positioning of the train on the map, which is displayed to the driver (Figure 4).

Figure 4. (a) User Interface (UI) implemented in the locomotive driver cabin during the dynamic tests performed in May 2019. (b) Live camera image displayed to driver on the UI together with

the map and the warning on detected obstacle on the rail track ahead of the locomotive


The visualization in the UI is a web service built using the Leaflet JavaScript library for rendering real-time maps [7]. The web service uses roslibjs [8] to connect to ROS and read the sensor data. The complete structure of the web application is relatively simple, as shown in Figure 5.

Figure 5. The block diagram of the User Interface architecture

The Main Page of the User Interface has three sections:

Leaflet Map - Leaflet is an open-source JavaScript library for interactive maps. The browser communicates with ROS to extract the real-time GPS coordinates of the train, which are then rendered on the map.

Camera Screen - The camera screen is a video stream from one of the SMART cameras mounted on the train. The camera images are obtained from ROS in real time. The camera stream is displayed using the mjpeg server [9], a streaming server that subscribes to the image topic of a specific camera in the ROS-based SMART ODS software architecture and publishes that topic as a video stream via HTTP.

Obstacle Table - The obstacle table appears at the bottom of the screen with a warning sign if obstacle distances are available from the SMART obstacle detection and distance estimation module. A Leaflet map marker also appears at the obstacle distance, indicating the obstacle on the map. If no obstacles are detected, the obstacle distance array from ROS is empty and the web app does not show any warning sign or obstacle table. If the obstacle array contains one or more distance values, a real-time obstacle table is generated and displayed at the bottom with a blinking warning sign. These distance values are updated in the obstacle table as the obstacles get closer or leave the camera view. The application converts the estimated obstacle distance into GPS coordinates in the direction of the train motion and renders markers for the obstacles on the rail track on the Leaflet map.
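The conversion of an estimated obstacle distance into a map marker position can be sketched as a standard forward computation on a spherical Earth. This is a generic textbook formula, not the exact conversion used in the web app (which is not specified in this report); the bearing would come from the train's heading, and the example coordinates are arbitrary:

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius, spherical model

def project_point(lat_deg, lon_deg, bearing_deg, distance_m):
    """Great-circle forward computation: the point `distance_m` ahead of
    (lat, lon) along `bearing_deg` (0 = north), on a spherical Earth."""
    lat1 = math.radians(lat_deg)
    lon1 = math.radians(lon_deg)
    brg = math.radians(bearing_deg)
    ang = distance_m / EARTH_RADIUS_M  # angular distance in radians

    lat2 = math.asin(math.sin(lat1) * math.cos(ang)
                     + math.cos(lat1) * math.sin(ang) * math.cos(brg))
    lon2 = lon1 + math.atan2(math.sin(brg) * math.sin(ang) * math.cos(lat1),
                             math.cos(ang) - math.sin(lat1) * math.sin(lat2))
    return math.degrees(lat2), math.degrees(lon2)

# Example: obstacle reported 300 m ahead of a train heading north.
lat, lon = project_point(44.87, 20.64, 0.0, 300.0)
print(f"obstacle marker at ({lat:.5f}, {lon:.5f})")
```

The returned coordinates can then be handed to the map layer as a regular marker position.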

The web app uses Bootstrap to make the design responsive: the display adjusts itself dynamically depending on the screen size and the device used to view the application.


6 References

[1] https://pdfs.semanticscholar.org/bdd1/42932455dce2c08b8027bd9672aa0ed548f6.pdf
[2] The Imaging Source Europe GmbH. [Online]. Available: https://www.theimagingsource.com/products/zoom-cameras/gige-color/dfkz12gp031
[3] FLIR Tau 2 camera. [Online]. Available: http://www.flir.com/uploadedFiles/OEM/Products/LWIR-Cameras/Tau/FLIR-TAU-2-Datasheet.pdf
[4] Cousins, S. (2010). Welcome to ROS Topics. IEEE Robotics & Automation Magazine, Vol. 17, Issue 1.
[5] Redmon, J., Farhadi, A. (2018). YOLOv3: An Incremental Improvement. arXiv.
[6] SMART project: http://www.smartrail-automation-project.net
[7] https://leafletjs.com/
[8] http://wiki.ros.org/roslibjs
[9] http://wiki.ros.org/mjpeg_server
[10] https://www.netgear.com/