
Interface Specification Open Fusion Platform

www.ofp-projekt.de

February 2019

(Version 2.0)


Published under Creative Commons BY-ND 4.0 License

Contents

1. Overview
   1.1. Target of document
   1.2. Functional description
   1.3. Related Works
2. Standards
   2.1. Time synchronization (PTP)
   2.2. Coordinate systems
   2.3. V2X communication (ITS-G5)
   2.4. Safety
      2.4.1. Safety manager
      2.4.2. AUTOSAR and AUTOSAR Adaptive
   2.5. SI
3. Communication Concept
4. Data types – Generic
   4.1. Status
   4.2. Primitives
   4.3. Objects
   4.4. Ego Motion
   4.5. Pose
   4.6. Image
5. Module Manifest
6. Perception Layer
   6.1. Sensor description
      6.1.1. Sensor manifest
         6.1.1.1. Sensor manifest – Camera
         6.1.1.2. Sensor manifest – RADAR
         6.1.1.3. Sensor manifest – LiDAR
         6.1.1.4. Sensor manifest – Ultrasonic
         6.1.1.5. Sensor manifest – Vehicle Bus
         6.1.1.6. Sensor manifest – Vehicle-to-X (V2X)
      6.1.2. Map provider manifest
   6.2. Data Types
      6.2.1. RADAR
      6.2.2. Camera
      6.2.3. LiDAR
      6.2.4. Vehicle bus
      6.2.5. Vehicle Abstraction
         6.2.5.1. Detailed Specification
      6.2.6. Ultrasonic
      6.2.7. Map
         6.2.7.1. Simple Road Graph
         6.2.7.2. Semantic High-Accuracy Topographic Map
      6.2.8. V2X
         6.2.8.1. V2X messages
         6.2.8.2. V2X data input/output
7. Fusion layer
8. Environment model
   8.1. Timing concept
   8.2. Map data
      8.2.1. Road graph
      8.2.2. Semantic high-accurate topographic map
   8.3. Static environment model
      8.3.1. Freespace
      8.3.2. Obstacles
      8.3.3. Static objects
      8.3.4. Occupancy grid map
   8.4. Dynamic environment model
      8.4.1. Dynamic objects
   8.5. Vehicle status
      8.5.1. Ego motion
      8.5.2. Pose
Appendix
   Authors
   License
   References


Part 1: Basic Specifications

1. Overview

This document is created as a deliverable of the project Open Fusion Platform [OFP], which is publicly funded by the Federal Ministry of Education and Research (BMBF). Within the OFP project the partners are developing a near-series fusion platform with open interfaces as an enabler for highly and fully automated vehicles. By disclosing the interface description we want to enable other companies, institutes and universities to easily integrate their own products or prototypes into the OFP and thereby accelerate the development of new automated driving technologies. The OFP will be a generic platform for automated driving functions, but will demonstrate its capabilities at the end of the project by implementing the following main use case:

“An e-car autonomously parks and positions itself directly on top of a parking space with a wireless charging plate (valet parking). After the car is fully charged, it drives itself to a normal parking space without a charging plate.”

The following 10 main partners are working on the OFP project, together with 2 associated partners.

1.1. Target of document

This document describes the input and output interfaces of the Open Fusion Platform, including many details that are needed to incorporate new sensors or to use the fusion model for further functionalities. The document starts with the standards used and the internal communication concept of the OFP, and then details the input and output interfaces of the OFP. Furthermore, data types, the timing concept and the vehicle status are addressed within this document.

The document is not complete, nor will it ever be, as this topic is very complex.


This document shall give insights into the OFP to other interested parties, and is a starting point for using the OFP for their own products or prototypes. Through an initial workshop and further discussions with interested parties, this document will grow and hopefully become a useful tool for introducing the Open Fusion Platform to the world. Finally, this document shall become a starting point for standardizing the interfaces of such fusion platforms for automated driving cars.

The OFP project welcomes all types of collaboration, be it input to this document or hands-on work implementing new prototypes with the Open Fusion Platform. If you would like to collaborate, please get in contact with the coordinator of the OFP project, e.g. via e-mail: [email protected]

1.2. Functional description

The OFP system architecture is formed by a number of functional layers. The reason behind this architectural decision is that layers can easily be replaced, refined or reworked without influencing higher and not directly connected components. For example, if a specified hardware component such as a camera is replaced by a model with another specification (change of pixel resolution, etc.), the immediately connected layer could be affected either by interface changes or by necessary adaptations of components, while higher layers are not influenced and remain stable in terms of interface and provided functionality.

Figure 1 Overview Functional Layering

Figure 1 shows the functional layering used: hardware and software components are clustered inside the identified layers. The Sensor Layer includes typical sensors (e.g. hardware cameras), surrounding peripheral units (e.g. a hardware V2X module), and virtual sensors (e.g. software map data) as well. These components are typically not part of the core system but deliver the necessary input data for the core system’s functionality.

The provided signals or pre-processed information (signal -> data -> information) are requested by software components clustered in the Perception Layer.


The perceptual software components are specialized in the recognition (detection) of features in the data / information provided by the (real or virtual) sensors. Low-level validations (confirmations) and tracking are further main functionalities associated with the Perception Layer. The layer components are highly sensor dependent and are specialized for working on the respective sensor data.

The Fusion Layer aggregates useful information provided by the Perception Layer and delivers combined and interpreted information. Information gathered from various sensors forms new high-level information that is validated and enriched beyond what the individual attached sensors can estimate.

The Environment Model provides a consolidated view of the collected, processed, interpreted and validated information. The information from all sensors is aggregated and stored in one of the high-level models (static/dynamic environment, vehicle state).

The Application Layer components consume the high-level information from the Environment Model. The analysis of the situation and the planning of further actions are carried out by these applications (high-level components).

[Diagram for Figure 2 (class AbstractionLayer_Tegra_Sw): the software layers Application, Environment Model, Fusion, Perception and the Fusion Framework (GENIVI) sit on top of the basic software package BSW_Tegra – RTE (from AUTOSAR), Service Layer (+ E2E, + TimE), ECU Abstraction Layer, Complex Drivers, MCAL, OS – and the hardware package Hardware_HwPkg (from HardwareArchitecture).]

Figure 2 Layering Model Performance Part


The output from this layer is either used as feedback by the lower layers (especially perception and fusion layer components) or provided to the system’s environment, as illustrated in the Actors Layer.

The interfaces between layers shall remain stable in terms of methods (names and behaviours) and exchanged data (signature and semantics of data). As long as the defined contract (the interface specification) is fulfilled, components can be exchanged without side effects.

To address the performance and the safety aspects of the subsystems, two layering models are applied. Figure 2 illustrates the layering for the performance architecture; Figure 3 illustrates the layering for the safety architecture. Both layering concepts are used within the system under development and share the same basic concept.

In both charts, only the main layers are presented in order to simplify the illustration.

Figure 3 Layering Model Safety Part

All layers and functionalities illustrated in Figure 2 and Figure 3 are necessary for the main functionality of the OFP. These components are deployed on the performance processors / infrastructure of the system (i.e. Tegra X2).

To realize the necessary safety functionality of the system – to ensure hazardless behaviour – the layers illustrated in Figure 3 are necessary. The safety functionality (low-level as well as high-level) is deployed on a dedicated safety processor (i.e. Aurix).

[Diagram for Figure 3 (class AbstractionLayer_Aurix_Sw): AUTOSAR software components (SWC) and the RTE (from AUTOSAR) sit on top of the basic software package BSW_Aurix – Safety OS, Service Layer (+ E2E, + TimE), ECU Abstraction Layer, Complex Drivers, MCAL – and the hardware package Hardware_HwPkg (from HardwareArchitecture).]


1.3. Related Works

For hardware interfaces (e.g. CAN, LVDS, …) and basic software (e.g. AUTOSAR Adaptive), standards have been available for many years and are widely applied in automotive series products. This is not the case for problems specific to automated driving. A few initiatives, like OpenDrive [OD], already started a few years ago, but for many issues regarding sensor fusion and environmental modelling, no standards are available yet.

Within the first year of the OFP project (2016) a few new initiatives went public, like the Open Robinos white paper [OR] from Elektrobit, the Open Platform initiative [OP] from BMW, Mobileye and Intel, and the Adaptive AUTOSAR [AA] enhancement.

Open Robinos is an open specification for a functional software architecture with well-defined interfaces, software modules and control mechanisms (now merging into the SOFAM initiative: Standardized, Open Framework for Autonomous Mobility). Its aim is to invite partners, customers and tier-1 suppliers to create a reference platform for automated driving.

Where possible and meaningful, we try to incorporate other standards and will cite those standards within this document. The OFP interface specification will hopefully evolve over time and will only describe those parts which are not defined within other standardization efforts. If you think something is missing in this document, please contact the main author.

Version 1.0 of this OFP Interface Specification was an integral input to the new ISO specification proposal ISO 23150, “Data communication between sensors and data fusion unit for automated driving functions” [ISO23150]. The OFP project takes part in the new ISO process and will make sure that the Open Fusion Platform supports the newly defined standards of ISO 23150.

2. Standards

2.1. Time synchronization (PTP)

To ensure the synchronization of the fusion model data, the system needs a synchronized time basis. This is achieved with a time synchronization between the involved ECUs based on IEEE 802.1AS (gPTP).

The system can be the timing master in the complete setup, which requires a direct connection to a GPS sensor with a PPS signal for timing synchronization. A second setup can use an external timing master which synchronizes all slaves via gPTP.

The timing master is the Aurix safety ECU on the DrivePX2 board, which receives information from a GPS receiver and the PPS (pulse-per-second) signal to ensure a correct time basis.

Standard       Description
IEEE 802.1AS   Timing and Synchronization for Time-Sensitive Applications (gPTP)

Table 1 – TimeSync Standard
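For illustration, the following minimal C++ sketch (not OFP code; all names are our own) shows the two-way time-transfer arithmetic that PTP/gPTP uses to estimate a slave clock’s offset from the master, where t1/t4 are taken on the master and t2/t3 on the slave:

```cpp
#include <chrono>
#include <cstdio>

using Nanoseconds = std::chrono::nanoseconds;

// Two-way time transfer as used by PTP/gPTP:
// t1: master sends Sync, t2: slave receives Sync,
// t3: slave sends Delay_Req, t4: master receives Delay_Req.
// Assuming a symmetric path delay, offset and delay follow as below.
struct SyncResult {
    Nanoseconds offset;     // slave clock minus master clock
    Nanoseconds pathDelay;  // one-way propagation delay
};

SyncResult evaluateSync(Nanoseconds t1, Nanoseconds t2,
                        Nanoseconds t3, Nanoseconds t4) {
    SyncResult r;
    r.offset    = ((t2 - t1) - (t4 - t3)) / 2;
    r.pathDelay = ((t2 - t1) + (t4 - t3)) / 2;
    return r;
}

int main() {
    // Hypothetical example timestamps in nanoseconds.
    SyncResult r = evaluateSync(Nanoseconds(1000000), Nanoseconds(1500300),
                                Nanoseconds(2000000), Nanoseconds(2499900));
    std::printf("offset: %lld ns, path delay: %lld ns\n",
                static_cast<long long>(r.offset.count()),
                static_cast<long long>(r.pathDelay.count()));
    return 0;
}
```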


2.2. Coordinate systems

The GPS device that is used has to deliver coordinates in the WGS84 format. Starting from this interpreted signal information, the coordinates are converted to UTM coordinates.

The applied world reference system is the Universal Transverse Mercator (UTM) coordinate system [UTM]. This 2-dimensional Cartesian coordinate system is a horizontal position representation, i.e. the location is independent of the vertical position. Although this representation is realized with several map zones, it can be simplified for our project because only a very small part of the real world is used and the resulting inaccuracy is negligible.

The UTM coordinates can be transformed into other common coordinate systems.
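As an illustration (not part of the OFP deliverables), such a WGS84-to-UTM conversion can be done with the PROJ library; the sketch below assumes UTM zone 32N (EPSG:32632) and example coordinates:

```cpp
// Minimal WGS84 -> UTM conversion sketch using the PROJ C API.
// Assumes PROJ >= 6 is installed; EPSG:32632 (UTM zone 32N) is an example.
#include <proj.h>
#include <cstdio>

int main() {
    PJ_CONTEXT *ctx = proj_context_create();
    PJ *p = proj_create_crs_to_crs(ctx, "EPSG:4326", "EPSG:32632", nullptr);
    // Normalize axis order to longitude/latitude for easier use.
    PJ *norm = proj_normalize_for_visualization(ctx, p);
    proj_destroy(p);

    PJ_COORD wgs84 = proj_coord(10.45, 52.31, 0.0, 0.0);  // lon, lat (example)
    PJ_COORD utm = proj_trans(norm, PJ_FWD, wgs84);
    std::printf("easting: %.2f m, northing: %.2f m\n", utm.xy.x, utm.xy.y);

    proj_destroy(norm);
    proj_context_destroy(ctx);
    return 0;
}
```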

The car reference system (Fahrzeugreferenzsystem, see Figure 4) is a 3D Cartesian coordinate system that is fixed to the vehicle’s body (inner coordinate system, body frame). The origin of this right-handed coordinate system is a defined and calibrated point inside the car’s dimensions (typically the middle of the front axle). Seen from the driver’s perspective, the positive x-axis points forward and the clockwise rotation around it is the positive roll angle; the positive y-axis points to the left and the clockwise rotation around it is the positive pitch angle; the positive z-axis points upwards and the clockwise rotation around it is the positive yaw angle.

The car reference system used is specified in the norm ISO 8855.

Figure 4 Car Reference System

Between the various coordinate systems, transformations are necessary (Figure 5). For details please see the description in [ISO8855].

Figure 5 Coordinate Transformation
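A minimal sketch of such a transformation is given below (illustrative only; the function and type names are our own, with rotations applied in yaw-pitch-roll order around the ISO 8855 axes):

```cpp
// Sketch: transform a point from the car frame to a local reference
// frame given the vehicle pose (illustrative, not normative).
#include <array>
#include <cmath>
#include <cstdio>

using Vec3 = std::array<double, 3>;

// Rotation applied in the order yaw (z), pitch (y), roll (x),
// followed by the translation of the car origin.
Vec3 carToLocal(const Vec3& p, double roll, double pitch, double yaw,
                const Vec3& origin) {
    const double cr = std::cos(roll),  sr = std::sin(roll);
    const double cp = std::cos(pitch), sp = std::sin(pitch);
    const double cy = std::cos(yaw),   sy = std::sin(yaw);
    // R = Rz(yaw) * Ry(pitch) * Rx(roll)
    const double r[3][3] = {
        {cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr},
        {sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr},
        {-sp,     cp * sr,                cp * cr}};
    Vec3 out{};
    for (int i = 0; i < 3; ++i)
        out[i] = r[i][0] * p[0] + r[i][1] * p[1] + r[i][2] * p[2] + origin[i];
    return out;
}

int main() {
    const double kPi = 3.14159265358979323846;
    // A point 10 m ahead of the front axle, vehicle yawed 90 degrees left.
    Vec3 q = carToLocal({10, 0, 0}, 0, 0, kPi / 2, {100, 200, 0});
    std::printf("%.2f %.2f %.2f\n", q[0], q[1], q[2]);  // 100.00 210.00 0.00
    return 0;
}
```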


2.3. V2X communication (ITS-G5)

The wireless communication between cooperative vehicles, or between vehicles and the infrastructure, is achieved by employing the European ITS-G5 standard [ET11]. The standard is based on the IEEE 802.11p standard, which supports robust wireless ad-hoc communication between fast-moving stations in the 5.9 GHz frequency band.

Standard      Description
ETSI ITS-G5   The ETSI ITS-G5 standard is based on the IEEE 802.11p standard¹ and supports V2X communication in a wireless ad-hoc network in the 5.9 GHz frequency band allocated in Europe

Table 2 – V2X communication standard

2.4. Safety

The EB tresos Functional Safety products running on the NVIDIA Drive PX2 are based on the AUTOSAR standard and meet the ISO 26262 requirements up to Automotive Safety Integrity Level D (ASIL D). Additionally, these products conform to the IEC 61508 standard for non-automotive use.

The products used in the OFP system are:

• EB tresos Safety OS
Data protection: To provide a safe execution environment for safety-critical functions, the EB tresos Safety OS incorporates proven concepts such as microkernels and system calls from the aerospace and industrial markets. The result is a robust and protected Safety Operating System (OS) compatible with the latest AUTOSAR standard. The OS is independently certified for use in ASIL D applications such as electrical power steering, as well as for SIL 3 use in non-automotive projects.

• EB tresos Safety RTE
Data protection: The EB tresos Safety RTE takes care of the safe handling of RTE services between software in different partitions.

• EB tresos Safety TimE Protection
Execution protection: EB tresos Safety TimE Protection is a software module that enables the timing and execution supervision of safety-related applications. Thus, it provides freedom from interference between safety-related software modules with regard to time and execution. EB tresos Safety TimE Protection is independently certified for use in ASIL D applications such as electrical power steering, as well as for SIL 3 use in non-automotive projects.

¹ The IEEE 802.11p amendment is meanwhile part of the active IEEE 802.11-2016 standard, but the name is still used for better distinction from other standards.


• EB tresos Safety E2E Protection
Communication protection: EB tresos Safety E2E Protection is a set of modules that supports the transmission of safety-related data between ECUs. It consists of an end-to-end communication protection library and an end-to-end protection wrapper for integration into an AUTOSAR basic software stack.

Standard    Description
ISO 26262   ISO 26262 is an international standard for functional safety of electrical and/or electronic systems in production automobiles, defined by the International Organization for Standardization (ISO) in 2011
IEC 61508   IEC 61508 is an international standard intended to be a basic functional safety standard applicable to all kinds of industry

Table 3 – Reference Safety Standards

2.4.1. Safety manager

To fulfil the safety requirements of autonomous driving systems, the safety manager covers the following tasks:

• Program-flow monitoring
• Plausibility checks
• Hardware monitoring
• Deriving safety and error strategies

Program-flow monitoring

The program-flow monitoring supervises the execution time and execution sequence of the architecture. For this purpose, checkpoints in the software are supervised.

Name            Data Type   Unit     Description
Checkpoint ID   int                  Checkpoint ID to distinguish between different checkpoints.
Report          bool                 Reports the passing of the checkpoint.
Timestamp       int, int    ms, us   Timestamp of checkpoint passing.

Table 4 – Program Flow Monitoring Interfaces
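As an illustration of this interface, a checkpoint report matching Table 4 could be represented as follows (a minimal sketch; type and function names are assumptions, and the ms/us split of the timestamp is our interpretation):

```cpp
// Minimal sketch of a program-flow-monitoring checkpoint report,
// mirroring the fields of Table 4 (names are illustrative).
#include <chrono>
#include <cstdint>
#include <cstdio>

struct CheckpointReport {
    int32_t checkpointId;  // distinguishes between different checkpoints
    bool    report;        // true when the checkpoint was passed
    int32_t timestampMs;   // milliseconds part of the timestamp
    int32_t timestampUs;   // sub-millisecond microseconds part (assumption)
};

// Builds a report from the (assumed gPTP-synchronized) system clock.
CheckpointReport reportCheckpoint(int32_t id) {
    using namespace std::chrono;
    const int64_t us =
        duration_cast<microseconds>(system_clock::now().time_since_epoch())
            .count();
    // The ms value is wrapped so it fits into the int of Table 4.
    return CheckpointReport{id, true,
                            static_cast<int32_t>(us / 1000 % 1000000000),
                            static_cast<int32_t>(us % 1000)};
}

int main() {
    const CheckpointReport r = reportCheckpoint(42);
    std::printf("checkpoint %d passed at %d ms + %d us\n",
                r.checkpointId, r.timestampMs, r.timestampUs);
    return 0;
}
```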

Plausibility check

A standard plausibility check is the evaluation of data thresholds and timing conditions. Beyond such simple implementations, further interpretation steps are implemented in order to evaluate larger and more complex parts of the architecture. As an example, the result of a path planning based on a grid-based sensor data fusion can be checked against a second approach to sensor data fusion in order to ensure that the first computation chain is working correctly.


The result of such a safety- and error-finding-driven evaluation is a list of system states:

Name            Data Type   Unit     Description
Checkpoint ID   int                  Checkpoint ID to distinguish between different checkpoints.
Check result    bool[]               List of results for every checked state.
Timestamp       int, int    ms, us   Timestamp of checkpoint.

Table 5 – Plausibility check interfaces

Hardware monitoring

To monitor the status of the hardware, there are two possible mechanisms:

• BIST – built-in self-test
A built-in self-test (BIST) is a built-in mechanism that allows the ECU to perform several hardware tests.

• Question and answer
The safety ECU in the heterogeneous system used in the OFP asks predefined questions to the performance ECU as a challenge-and-response scheme. With this method, several hardware-based computing mechanisms of the performance ECU can be tested (see the sketch after Table 6).

The result of these tests is a list of system states.

Name            Data Type   Unit     Description
Checkpoint ID   int                  Checkpoint ID to distinguish between different checkpoints.
Check result    bool[]               List of results for every checked state.
Timestamp       int, int    ms, us   Timestamp of checkpoint.

Table 6 – Hardware monitoring interfaces
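The following minimal sketch illustrates the question-and-answer principle (the question catalogue, the computation and the transport between the ECUs are project-specific; everything named here is an assumption):

```cpp
// Illustrative question-and-answer check between the safety ECU and the
// performance ECU: the safety ECU knows the expected answer in advance.
#include <cstdint>
#include <cstdio>

struct Challenge {
    uint32_t questionId;
    uint32_t operandA;
    uint32_t operandB;
};

// Executed on the performance ECU: exercises an ALU/memory path.
// A real catalogue would cover many different hardware mechanisms.
uint32_t answer(const Challenge& c) {
    return (c.operandA * 31u) ^ c.operandB;
}

// Executed on the safety ECU: compares against the precomputed answer.
bool checkAnswer(uint32_t received, uint32_t expected) {
    return received == expected;
}

int main() {
    const Challenge c{7, 1234, 5678};
    const uint32_t expected = (1234u * 31u) ^ 5678u;  // from the catalogue
    const bool ok = checkAnswer(answer(c), expected);
    std::printf("question %u -> %s\n", c.questionId, ok ? "pass" : "fail");
    return 0;
}
```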

Deriving safety and error strategies

The list of check results is used to decide whether a specific behaviour can be activated based on its activation prerequisites or – if the behaviour is already active – which fail-operational mechanism shall be activated and which information shall be stored for later error handling.

2.4.2. AUTOSAR and AUTOSAR Adaptive

The Infineon Aurix TC297 “safety ECU” on the NVIDIA DrivePX2, which is used in the OFP system, runs an EB tresos AutoCore. The EB tresos AutoCore is an implementation of AUTOSAR-compliant basic software for automotive electronic control units (ECUs). The EB AutoCore is based on AUTOSAR 4.x and includes support for 3.x releases. Therefore, the software running on the safety ECU is conformant to the AUTOSAR standard.


On the NVIDIA Tegra processor (“performance ECU”) of the NVIDIA DrivePX2, EB corbos runs on Linux. EB corbos is an implementation of the AUTOSAR Adaptive specification. In comparison to the AUTOSAR Classic Platform, the AUTOSAR Runtime Environment of the Adaptive Platform links services and clients dynamically at runtime.

Standard                                      Description
AUTOSAR 3.x, AUTOSAR 4.x, AUTOSAR Adaptive    AUTOSAR (AUTomotive Open System ARchitecture) is a worldwide development partnership of automotive interested parties founded in 2003

Table 7 – AUTOSAR Standard

2.5. SI

The base units of the International System of Units used within the project are listed in Table 8.

Base Quantity                        SI Base Unit
Name                       Symbol    Name       Symbol
length                     l         metre      m
mass                       m         kilogram   kg
time, duration             t         second     s
electric current           I, i      ampere     A
thermodynamic temperature  T         kelvin     K

Table 8 – SI Base Units

In addition to the base units, coherent derived units are used. Some examples are given in Table 9.

Derived quantity            SI coherent derived unit
Name             Symbol     Name                       Symbol
speed, velocity  v          metre per second           m/s
acceleration     a          metre per second squared   m/s²

Table 9 – Examples of coherent derived units in the SI expressed in terms of base units

Certain coherent derived units also have their own special names and symbols, as shown in Table 10.

Derived quantity                Name            Symbol   Expressed in terms    Expressed in terms
                                                         of other SI units     of SI base units
plane angle                     radian          rad      1                     m/m
frequency                       hertz           Hz                             s⁻¹
force                           newton          N                              m kg s⁻²
power, radiant flux             watt            W        J/s                   m² kg s⁻³
electric potential difference,  volt            V        W/A                   m² kg s⁻³ A⁻¹
electromotive force
Celsius temperature             degree Celsius  °C                             K

Table 10 – Coherent derived units in the SI with special names and symbols

For further reading please see [SI06].


3. Communication Concept

The communication concept is split into two strategies (see Figure 6): data-driven communication and timing-driven communication. Starting from the lowest sensor layer, data / information is provided to the consumer immediately after gathering. This data-driven communication is performed through all layers and their components up to the Environment Model. The Environment Model follows the concept of the blackboard pattern.

From the Environment Model to the Application Layer, and subsequently to the Actors Layer, a timing-driven communication is realized. Within a scheduled task management, data is provided to the consumers. Depending on the consumers’ needs (e.g. bus communication, brake activation), various time slices (e.g. 50, 100, 250 ms) can be realized.

The data flow inside the application and actuator layers can also follow the data-driven communication concept, but to support the deployment of these two layers on dedicated ECUs, a timing-driven communication is required.

Figure 6 Communication concept between perception/fusion and application
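To make the two strategies concrete, the following minimal C++ sketch (all names are our own; the OFP middleware itself is not shown) delivers perception data into a blackboard-style environment model as soon as it arrives, while an application reads it in a fixed 50 ms time slice:

```cpp
// Sketch: data-driven delivery into a blackboard-style environment model,
// and timing-driven publication out of it in a fixed time slice.
#include <chrono>
#include <cstdio>
#include <mutex>
#include <thread>

struct ObjectList { int numObjects = 0; };

class Blackboard {  // simplified environment model
public:
    void update(const ObjectList& o) {           // data-driven write
        std::lock_guard<std::mutex> lock(m_mutex);
        m_latest = o;
    }
    ObjectList snapshot() const {                // timing-driven read
        std::lock_guard<std::mutex> lock(m_mutex);
        return m_latest;
    }
private:
    mutable std::mutex m_mutex;
    ObjectList m_latest;
};

int main() {
    Blackboard board;

    // Data-driven: a perception module pushes as soon as data arrives.
    std::thread producer([&board] {
        for (int i = 1; i <= 5; ++i) {
            board.update(ObjectList{i});
            std::this_thread::sleep_for(std::chrono::milliseconds(30));
        }
    });

    // Timing-driven: an application consumes every 50 ms time slice.
    for (int cycle = 0; cycle < 4; ++cycle) {
        std::this_thread::sleep_for(std::chrono::milliseconds(50));
        std::printf("cycle %d sees %d objects\n",
                    cycle, board.snapshot().numObjects);
    }
    producer.join();
    return 0;
}
```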

4. Data types – Generic

The OFP uses generic data types to enable standardized communication between all layers and the components inside these layers. All modules in the OFP have to support the generic data types if they want to consume data from and provide data to 3rd-party components.

The following classes are modelled in the system architecture, realized as a software architecture model.


4.1. Status

Figure 7 – Aggregate Class Status

The coordinate system defines how the status information is to be interpreted. A sensor coordinate system is defined by the direct measurements of the sensor; the car coordinate system is defined as a Cartesian vehicle coordinate system. The locally referenced system is a Cartesian coordinate system which is not related to the GPS positions; the globally referenced system is defined in UTM coordinates.

The aggregate class Status realizes the following interface functions:

Name of Function   Description
getStatus          Get the status for a specific axis.
getCovariance      Get the covariance matrix.

Table 11 – Description Functions Class Status

4.2. Primitives

The OFP defines some primitives known from computer graphics to describe abstract data structures. A polygon is defined as a closed set of lines where the start point of the first line segment is equal to the end point of the last segment.

Figure 8 Aggregate Class Polygon and Line

Line    Description
Start   The line’s start point
End     The end of the line

Table 12 – Description of Line

The classes of Figure 7, transcribed from the model (pkg Types) as C++ declarations:

    template <int dim = 3>
    class Status {
    public:
        const Matrix<dim, dim>& getCovariance() const;
        double getStatus(int axis) const;
    private:
        CoordinateSystem m_coordinateSystem;
        Matrix<dim, dim> m_covariance;
        double m_status[dim];
    };

    enum CoordinateSystem {
        Sensor = 0,
        Car = 1,
        LocalReferenced = 2,
        GlobalReferenced = 3
    };

    template <int rows = 3, int cols = 3>
    class Matrix {
    public:
        double operator()(int row, int col) const;
    private:
        double m_matrix[rows * cols];
    };

The classes of Figure 8, transcribed from the model (pkg Types); a Polygon aggregates many Lines:

    class Polygon {
    public:
        const std::vector<Line>& getSegments() const;
    private:
        std::vector<Line> m_segments;
    };

    class Line {
    public:
        const Status<3>& getStart() const;
        const Status<3>& getEnd() const;
    private:
        Status<3> m_start;
        Status<3> m_end;
    };


Polygon    Description
Segments   All line segments of the polygon

Table 13 – Description of Polygon
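A usage sketch for these primitives (not normative; it assumes the Types declarations above are implemented and that Status::getStatus(axis) yields the axis-th Cartesian coordinate in a common coordinate system):

```cpp
// Sketch: perimeter of a closed Polygon from its Line segments.
#include <cmath>

double perimeter(const Polygon& polygon) {
    double total = 0.0;
    for (const Line& line : polygon.getSegments()) {
        double squared = 0.0;
        for (int axis = 0; axis < 3; ++axis) {
            const double d =
                line.getEnd().getStatus(axis) - line.getStart().getStatus(axis);
            squared += d * d;
        }
        total += std::sqrt(squared);  // Euclidean length of one segment
    }
    return total;
}
```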

4.3. Objects

The aggregate class Object realizes the following interface functions:

Name of Function   Description
getCenter          Get the center data from Status.
getClass           Get the object class type.
getSize            Get the size from Status.
getType            Get the object type.
transform          Transforms the object into another coordinate system based on the given calibration information.

Table 14 – Description Functions Class Object

The classes ObjectType and ObjectClass realize the enumeration types that are used by the class Object.

4.4. Ego Motion

Figure 9 – Class Egomotion

The class Egomotion realizes the following interface functions:

Name of Function   Description
getRotationRate    Get the rotation rate for a given axis.
getVelocity        Get the velocity for a given coordinate axis.
transform          Transforms the egomotion into another coordinate system based on a given calibration.

Table 15 – Description Functions Class Egomotion

The class of Figure 9, transcribed from the model (pkg Types):

    class Egomotion {
    public:
        double getRotationRate(CoordinateAxis axis) const;
        double getVelocity(CoordinateAxis axis) const;
        bool transform(Calibration& calibration);
    };


4.5. Pose

Figure 10 – Class Pose

The class Pose realizes the following interface functions:

Name of Function   Description
getOrientation     Get the orientation from Status.
getPosition        Get the position from Status.
transform          Transforms the pose into another coordinate system based on a given calibration.

Table 16 – Description Functions Class Pose

4.6. Image

Figure 11 – Class Image

The class Image realizes the following interface functions:

Name of Function   Description
getImage           Get the whole image.
getImageArea       Get the selected area of the image.
transform          Rectifies the image based on a defined calibration.

Table 17 – Description Functions Class Image

The classes of Figures 10 and 11, transcribed from the model (pkg Types); an Image aggregates many Pixels:

    class Pose {
    public:
        const Status& getOrientation() const;
        const Status& getPosition() const;
        bool transform(Calibration& calibration);
    };

    class Image {
    public:
        const Pixel& getImage() const;
        Pixel& getImageArea(uint8, uint8, uint8, uint8);
        bool transform(Calibration& calibration);
    };


5. Module Manifest

The module manifest describes the software components in a running OFP environment.

A module can be a virtual sensor in the perception layer, which abstracts an existing sensor and provides the measured information in the standardized format; a fusion component in the fusion layer; or a function component in the application layer, which is responsible for the automated driving use cases. The dependencies between these components are known during the startup phase, and the safety ECU watches over the correct execution of the active components.

The module manifest is defined as a description of identifiers, inputs, outputs and the layer the module corresponds to. A module can be used by various use cases, and a use case is related to safety-relevant constraints such as a maximum speed.

The module manifest can use meta-data blocks to describe special features, constraints, use cases or proprietary information. The safety ECU and the framework on the performance ECU can identify which modules are relevant for which use cases and can enable and disable modules.

Figure 12 – Module manifest

The classes of Figure 12, transcribed from the model (pkg Module); a Module owns a tree of MetaData blocks:

    class Module {
    public:
        ModuleType getType() const;
        unsigned long getUID() const;
        const std::vector<MetaData*>& getMetaData() const;
        const Version& getVersion() const;
        const std::vector<UseCase>& getUseCases() const;
    private:
        ModuleLayer m_layer;
        unsigned long m_identifier;
        std::vector<MetaData*> m_metaData;
        Version m_version;
    };

    enum ModuleLayer { Perception = 0, Fusion = 1, Application = 2 };

    class MetaData {
    public:
        const std::string& getKey() const;
        MetaDataType getType() const;
        MetaData* findChild(std::string& key);
        MetaData* getParent() const;
    private:
        std::string m_key;
        MetaDataType m_type;
        std::vector<MetaData*> m_children;
    };

    enum MetaDataType { Undefined = 0, Node = 1, Entry = 2 };


Figure 13 – Meta data blocks

The meta-data block is a tree-based structure which can contain information about the module. It is possible to store global information using standard data formats, or proprietary information which can only be used by specific modules (see the usage sketch after the Figure 13 classes below).

Every software layer contains special modules which are identified by the ModuleType. Modules in the sensor layer convert proprietary data into the standard OFP data types and send them to the fusion or application layer; the fusion layer modules consume data from the perception layer or from other fusion modules; the application layer modules consume data from the underlying layers and do not provide data to other OFP components.

The classes of Figure 13, transcribed from the model (pkg Module); typed entries specialize MetaData, as indicated by the diagram:

    class MetaDataEntryBase : public MetaData {  // m_type = Entry
    public:
        MetaDataEntryType getValueType() const;
    };

    enum MetaDataEntryType {
        Other = 0, Boolean = 1, Int = 2, Float = 3, String = 4
    };

    template <typename T>
    class MetaDataEntry : public MetaDataEntryBase {
    public:
        const T& getValue() const;
    private:
        T m_value;
    };

    // Concrete bindings: MetaDataEntryBoolean (T = bool),
    // MetaDataEntryInt (T = int), MetaDataEntryFloat (T = float),
    // MetaDataEntryString (T = std::string).
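A usage sketch for such a tree (not normative; it assumes the Module declarations above are implemented): reading the integer entry "SensorDescriptor/Width" from one of a module's meta-data roots:

```cpp
// Sketch: resolving a typed entry in a meta-data tree; errors yield -1.
#include <string>

int readImageWidth(MetaData& root) {
    std::string descriptorKey = "SensorDescriptor";
    std::string widthKey = "Width";

    MetaData* node = root.findChild(descriptorKey);
    if (node == nullptr || node->getType() != MetaDataType::Node)
        return -1;

    MetaData* width = node->findChild(widthKey);
    if (width == nullptr || width->getType() != MetaDataType::Entry)
        return -1;

    // Use the getValueType() discriminator instead of RTTI to narrow
    // the entry to its typed subclass.
    auto* entry = static_cast<MetaDataEntryBase*>(width);
    if (entry->getValueType() != MetaDataEntryType::Int)
        return -1;
    return static_cast<MetaDataEntry<int>*>(entry)->getValue();
}
```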


Figure 14 – Layer-related module manifests

The modules have input and output ports to read and provide data in the form of standardized OFP types: input ports read data and output ports provide data. The meta-data blocks describe the provided data or the requirements (see the wiring sketch after the Figure 14/15 classes below).

Figure 15 Port definition with meta-data-blocks

The classes of Figures 14 and 15, transcribed from the model (pkg Module); every module aggregates Ports, and every Port carries MetaData blocks:

    class Port {
    public:
        PortType getPortType() const;
        Module getComponent() const;
        DataType getDataType() const;
        const std::vector<MetaData>& getMetaData() const;
    private:
        PortType m_type;
        Component m_component;
        DataType m_dataType = Unknown;
        std::vector<MetaData> m_metaData;
    };

    enum PortType { Input = 0, Output = 1 };

    class PerceptionModule : public Module {   // outputs only
    public:
        const std::vector<Port>& getOutputs() const;
    private:
        std::vector<Port> m_outputs;
    };

    class FusionModule : public Module {       // inputs and outputs
    public:
        const std::vector<Port>& getInputs() const;
        const std::vector<Port>& getOutputs() const;
    private:
        std::vector<Port> m_inputs;
        std::vector<Port> m_outputs;
    };

    class ApplicationModule : public Module {  // inputs only
    public:
        const std::vector<Port>& getInputs() const;
    private:
        std::vector<Port> m_inputs;
    };
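As an illustration of how a framework could wire modules together (the matching policy below is an assumption, not part of this specification), an input port can be served by any output port providing the same DataType:

```cpp
// Sketch of a framework-side wiring step: collect the output ports of a
// perception module that provide the DataType required by a fusion
// module's input port. Assumes the Module declarations above.
#include <vector>

std::vector<const Port*> findProviders(const PerceptionModule& producer,
                                       const Port& input) {
    std::vector<const Port*> matches;
    for (const Port& output : producer.getOutputs()) {
        if (output.getPortType() == PortType::Output &&
            output.getDataType() == input.getDataType())
            matches.push_back(&output);
    }
    return matches;
}
```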


Part 2: Perception Layer

6. Perception Layer

The perception layer contains all single-sensor algorithms and provides converter modules to convert from the proprietary sensor interfaces to the standardized generic and sensor-specific data structures. A fusion of a single sensor with the ego-motion is also considered part of the perception layer.

6.1. Sensor description

The OFP supports two kinds of sensor systems. The first kind is a single sensor, which is described by its type, data and built-in position. The second kind is a multi-sensor system, which describes multiple sensors as one single sensor; the integrated single sensors are nevertheless defined as normal sensors which can be requested by other modules.

A multi-sensor system (e.g. all ultrasonic sensors) collects the measurements from all connected single sensors and provides all measured data in one measurement list (the data types have to be equivalent).

6.1.1. Sensor manifest

6.1.1.1. Sensor manifest – Camera

The camera sensor is responsible for the Image interfaces. The sensor consists of two parts – the mechanical optic part and the mechanical / electrical imager part. Both parts are combined by construction.

The optical part is specified by vertical and horizontal opening angles. Further specifying details, such as distortion coefficients, are not under further investigation. The output to be used is processed from the imager’s captured photons, to charged electrical capacities, presented as electrical signals and transformed into information (i.e. interpreted data). The imager is defined by its width (horizontal dimension), its height (vertical dimension), the pattern of the imager (e.g. RGB, RCCC), and the bit resolution of the charged capacities manifested by the format (i.e. bits per picture element, e.g. 12 bit).

The provided information is stored in the meta-data block of the output port.


Figure 16 – Sensor manifest Camera

Kind of Information        Description
Sensor Description         Specification of module and definition of module type (entry).
Width                      Description of image width.
Height                     Description of image height.
Bits per Pixel             Description of used bits per pixel (e.g. 12 bit).
Characteristic             Description of sensor’s characteristic.
Horizontal Opening Angle   Description of sensor’s horizontal opening angle.
Vertical Opening Angle     Description of sensor’s vertical opening angle.
Pattern                    Description of pattern (e.g. RGB).
Blindness level            The sensor’s blindness level.

Table 18 – Camera sensor description
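Tying this to the meta-data mechanism of Section 5, the sketch below shows the kind of key/value content a camera module might attach to its image output port. Since the OFP does not define construction APIs, a simple stand-in tree structure is used here, and all values are invented examples; only the keys come from Table 18 and the ImageDescriptor of Figure 16:

```cpp
// Illustrative only: stand-in meta-data tree for a camera output port.
#include <string>
#include <vector>

struct DemoMeta {                 // stand-in for MetaData (Node or Entry)
    std::string key;
    std::string value;            // empty for nodes
    std::vector<DemoMeta> children;
};

DemoMeta describeCamera() {
    DemoMeta root{"SensorDescriptor", "", {}};
    root.children = {
        {"OpeningAngleHorizontal", "60.0", {}},  // degrees, example value
        {"OpeningAngleVertical", "40.0", {}},    // degrees, example value
        {"Pattern", "RGB", {}},
        {"BlindnessLevel", "0.0", {}},
        {"ImageDescriptor", "",
         {{"Width", "1280", {}}, {"Height", "960", {}}, {"Format", "12", {}}}},
    };
    return root;
}
```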

6.1.1.2. Sensor manifest – RADAR

The radar sends electromagnetic waves and receives echoes from reflecting objects in the environment. Basic information about the radar properties, e.g. base frequency and horizontal/vertical opening angle, is comprised in the radar sensor manifest (see Table 19). Furthermore, the radar delivers in regular cycles meta-data about the currently used modulation scheme and optional information about the sensor state (see Table 19).

Finally, the radar delivers the attributes range, radial speed, angle of arrival, and optional information about the quality of every recognized reflection within the target list (also see Section 6.2.1 RADAR).

[Diagram for Figure 16 (class CameraSensor): the CameraSensor is a PerceptionModule (m_layer = Perception) with an ImageOutport and an ObjectListOutput. Its SensorDescriptor meta-data node holds the entries OpeningAngleHorizontal, OpeningAngleVertical, Pattern and BlindnessLevel; the ImageOutport carries an ImageDescriptor node with the entries Width, Height and Format.]


Figure 17 – Sensor manifest RADAR

Kind of Information         Description
Sensor Description          Specification of module and definition of module type (entry).
Frequency                   Description of used base frequency.
Mid frequency               Mid frequency of the radar scan.
3dB beamwidth azimuth       Azimuth width of the radar beam.
3dB beamwidth elevation     Elevation width of the radar beam.
Data acquisition duration   Duration of the radar scan.
Max range                   Maximum radial detection range of the sensor.
Range gate length           Length of the range gate.

Table 19 – RADAR sensor description
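For orientation (an illustrative relation, not part of the manifest): for a frequency-modulated radar, the range gate length is tied to the sweep bandwidth B by ΔR = c / (2B); a bandwidth of 150 MHz, for example, yields ΔR ≈ 3·10⁸ / (2 · 1.5·10⁸) = 1 m.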

6.1.1.3. Sensor manifest – LiDAR

The LiDAR sensor typically scans an area and performs time-of-flight measurements of the reflection of a laser impulse for a grid within the field of view. The sensor is specified by vertical and horizontal opening angles, the maximum distance of the LiDAR beams and the resolution of a single scanned point. The backscattering of the laser impulse is measured via highly sensitive diodes, pre-processed and then presented as output from an ADC.

In the LiDAR sensor manifest, the fields of view, the resolution and the maximal distance are defined.
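As a reminder of the underlying relation (illustrative, not part of the manifest), the measured distance follows from the round-trip time t as R = c·t / 2; a round-trip time of 667 ns, for example, corresponds to R ≈ 3·10⁸ m/s · 667·10⁻⁹ s / 2 ≈ 100 m.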

[Diagram for Figure 17 (class RadarSensor): the RadarSensor is a PerceptionModule (m_layer = Perception) with a RADARRawTargetListOutport and an ObjectListPort (m_dataType = ObjectList). Its SensorDescriptor meta-data node holds the entries Frequency, OpeningAngleHorizontal, OpeningAngleVertical, RangeResolution, SpeedResolution, BlindnessLevel and InterferenceLevel.]


Figure 18 – Sensor manifest LiDAR

Kind of Information        Description
Sensor Description         Specification of module and definition of module type (entry).
Max Distance               Description of the maximum distance that could theoretically be achieved with the sensor.
Horizontal Opening Angle   Description of the horizontal opening angle of the LiDAR sensor.
Vertical Opening Angle     Description of the vertical opening angle of the LiDAR sensor.
Resolution                 Resolution of a single measurement point of the measurement matrix.
Blindness level            The blindness level of the sensor.

Table 20 – LiDAR sensor description

6.1.1.4. Sensor manifest – Ultrasonic

The ultrasonic sensor can be a multi-sensor system consisting of more than one ultrasonic sensor, or one single ultrasonic sensor.

[Diagram for Figure 18 (class LiDARSensor): the LiDARSensor is a PerceptionModule (m_layer = Perception) with a LiDARRawTargetListOutport and an optional ObjectListOutput. Its SensorDescriptor meta-data node holds the entries Range, OpeningAngleHorizontal, OpeningAngleVertical and BlindnessLevel.]


Figure 19 – Sensor manifest Ultrasonic

Kind of Information        Description
Sensor Description         Specification of module and definition of module type (entry).
Horizontal Opening Angle   The horizontal opening angle.
Vertical Opening Angle     The vertical opening angle.
Range                      The measurement range.
Blindness level            The blindness level of the sensor.

Table 21 – Ultrasonic sensor description

The ultrasonic output port does not contain additional information by default.

Figure 20 – Ultrasonic output port

[Diagram for Figure 19 (class UltrasonicSensor): the UltrasonicSensor is a PerceptionModule (m_layer = Perception) with a USRawTargetListOutport; its SensorDescriptor meta-data node holds the entries Range, OpeningAngleHorizontal and BlindnessLevel.]

[Diagram for Figure 20 (class USRawTargetListPort): the USRawTargetListPort is a Port with m_dataType = USRawTargetList; the USRawTargetListOutport sets m_type = Output.]


6.1.1.5. Sensor manifest – Vehicle Bus

The vehicle bus sensor is responsible for the CAN and FlexRay interfaces. The sensor provides a look-up table to translate the messages’ and signals’ names into unique IDs which can be parsed by the consuming modules. The look-up table is stored in the meta-data block of the output port.

Figure 21 – Vehicle bus sensor manifest

Kind of Information   Description
Sensor Description    Specification of module and definition of module type (entry).

Table 22 – Vehicle bus sensor description

The vehicle bus output port provides the meta information about the translation from message and signal names to unique IDs, like DBC files do. One message can contain multiple signals, and a VehicleMessageMap describes one message. The message's name is stored in the key, and the first entry in the list of TranslationEntries provides the mapping from the message's name to the ID.
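To illustrate how a consuming module might use this look-up table, the following sketch shows one possible (non-normative) in-memory representation; the names NameIdTable, MessageEntry and resolveSignal are assumptions and not part of this specification.

    // Illustrative sketch only (not part of the specification): one way a
    // consuming module could keep the resolved name-to-ID look-up table.
    #include <map>
    #include <optional>
    #include <string>

    struct NameIdTable {
        struct MessageEntry {
            unsigned long messageId;                         // first TranslationEntry
            std::map<std::string, unsigned long> signalIds;  // remaining entries
        };
        std::map<std::string, MessageEntry> messages;        // key = message name

        // Resolve a signal ID from a message name and a signal name.
        std::optional<unsigned long> resolveSignal(const std::string& message,
                                                   const std::string& signal) const {
            const auto m = messages.find(message);
            if (m == messages.end()) return std::nullopt;
            const auto s = m->second.signalIds.find(signal);
            if (s == m->second.signalIds.end()) return std::nullopt;
            return s->second;
        }
    };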

[Contents of Figure 21 (UML): VehicleBusSensor derives from Module::PerceptionModule (m_layer = Perception) and provides a VehicleMessageOutport (m_type = Output).]


Figure 22 – The vehicle bus output description

Contents of Figure 22 (UML):

    class VehicleMessagePort (derived from Module::Port)
      m_dataType = VehicleMessage

    class VehicleMessageOutport (derived from VehicleMessagePort)
      m_type = Output

    Module::Port
      - m_type : PortType
      - m_component : Component
      - m_dataType : DataType = Unknown
      + getPortType() : PortType
      + getComponent() : Module
      + getDataType() : DataType

    Module::MetaData
      - m_key : std::string
      - m_type : MetaDataType
      + getKey() : const std::string&
      + getType() : MetaDataType
      + findChild(std::string&) : MetaData*
      + getParent() : MetaData*

    «enumeration» MetaDataType (from Module)
      Undefined = 0, Node = 1, Entry = 2

    Module::MetaDataEntryBase (m_type = Entry)
      - m_unsigned : bool
      + getValueType() : MetaDataEntryType

    Module::MetaDataEntry (T : typename)
      - m_value : T
      + getValue() : const T&

    «enumeration» MetaDataEntryType (from Module)
      Other = 0, Boolean = 1, Int = 2, Long = 3, Float = 4, String = 5

    Module::MetaDataEntryULong = MetaDataEntry< T -> unsigned long >

    VehicleMessageMap (m_type = Entry): a meta-data entry whose key holds the message's name.
    TranslationEntry (MetaDataEntryULong): translates the message's name to a message ID
    (only one per VehicleMessageMap) and translates the signal names to signal IDs.

6.1.1.6. Sensor manifest – Vehicle-to-X (V2X)

Figure 23: Sensor manifest V2X



Kind of Information   Description
Sensor Description    Specification of module and definition of module type (entry).
Station ID            Unique ID of the V2X station with this sensor

Table 23: V2X sensor description

The information obtained by means of V2X communication with other cooperative vehicles or the infrastructure is preprocessed in the V2X applications, and the relevant data is exchanged through the corresponding interface. There are four main categories of data which are essential for the interface to the fusion platform.

It is important to note that the V2X sensor not only delivers input data to the fusion platform, but also requires input data from the fusion platform to create and disseminate V2X messages to other cooperative vehicles or the infrastructure (see sec. 6.2.8.1).

Object information is one fundamental element:

Name              Data type     Description
objectID          Integer       Unique ID of object
class             Enumeration   Object class (e.g. vehicle, infrastructure, pedestrian, unknown)
static            Boolean       Classification whether the object is static or dynamic
positionVector    Record        Position information in WGS84 (see Table 25)
motionVector      Record        Motion information (see Table 26)
dimensionVector   Record        Dimension information (see Table 27)
detectionTime     Integer       Detection time of object as UTC timestamp in ms

Table 24: object information

Name        Data type   Description
latitude    Float       Latitude in degree
longitude   Float       Longitude in degree
altitude    Integer     Altitude in cm

Table 25: positionVector

Name           Data type     Description
heading        Integer       Heading in degree
speed          Float         Speed in m/s
acceleration   Float         Acceleration in m/s²
direction      Enumeration   Movement direction (forward, backward, unknown)

Table 26: motionVector

Name     Data type   Description
length   Integer     Length in cm
width    Integer     Width in cm

Table 27: dimensionVector
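To illustrate Tables 24 to 27, the object information can be expressed as plain data structures. The following C++ sketch is a non-normative illustration; all type and member spellings are assumptions.

    // Illustration of Tables 24-27 as plain structs; the type and member
    // names are assumptions, not normative definitions of the interface.
    #include <cstdint>

    enum class ObjectClass { Unknown, Vehicle, Infrastructure, Pedestrian };
    enum class MovementDirection { Forward, Backward, Unknown };

    struct PositionVector {            // Table 25
        float latitude;                // degree (WGS84)
        float longitude;               // degree (WGS84)
        std::int32_t altitude;         // cm
    };

    struct MotionVector {              // Table 26
        std::int32_t heading;          // degree
        float speed;                   // m/s
        float acceleration;            // m/s²
        MovementDirection direction;
    };

    struct DimensionVector {           // Table 27
        std::int32_t length;           // cm
        std::int32_t width;            // cm
    };

    struct ObjectInfo {                // Table 24
        std::int32_t objectID;         // unique ID of object
        ObjectClass objectClass;
        bool isStatic;                 // static vs. dynamic object
        PositionVector positionVector;
        MotionVector motionVector;
        DimensionVector dimensionVector;
        std::int64_t detectionTime;    // UTC timestamp in ms
    };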

Another element provided by the V2X sensor is parking lot information. The data fields described in Table 28 are related to one single lot. Hence, information about an entire parking area results in a list of this lot information for each lot in the parking area:


Name              Data type     Description
lotID             Integer       Unique lot ID
positionVector    Record        Lot position in WGS84 (see Table 25)
dimensionVector   Record        Lot dimensions (see Table 27)
state             Float         Lot state (vacant, occupied) as probability value
type              Enumeration   Lot type (parking lot, charging lot)
occupancyTime     Integer       Remaining occupancy time of lot in min

Table 28: lot information

The next element provided by the V2X sensor is charging information:

Name                 Data type     Description
vehicleID            Integer       Unique vehicle ID
state                Enumeration   Charging state (charging, not charging)
chargingPercentage   Float         Current charging state in %
chargingTime         Integer       Remaining charging time in min
batteryCondition     Enumeration   Information about battery condition (e.g. good or critical lifetime)
failureState         Enumeration   Failure state information (e.g. charging failure, charging not possible, failure of the charging plate)

Table 29: Charging information

State information is also provided to monitor the state of single cooperative stations or the whole system:

Name        Data type     Description
stationID   Integer       Unique ID of the V2X station (vehicle or infrastructure)
state       Enumeration   State information (e.g. system failure)

Table 30: state information

6.1.2. Map provider manifest

The Digital Map is a virtual depiction of the features of an area on earth in a defined coordinate system. In the context of OFP, the map is considered as a single sensor providing a representation of the environment. The map is therefore part of the perception layer and described by its type and data.

The digital map will include two types of representations, characterized by their different levels of complexity, which will be accessible from within the framework/fusion platform:

• A road graph, which is a relatively sparse abstract representation of road networks with basic geometric and limited semantic information.
• A semantic high-accuracy topographic map, which contains information about the geometry, coloring and semantic attributes of map features.

The map data is accessible through the map sensor interface, which comprises generic sensor data access functions enriched by specific functions to access map elements.


Each representation includes several layers which are accessible from within the framework/fusion platform. Each layer contains:

• Read-only data.
• Temporary data with varying attributes or new additional features detected by the other sensors, with regulated read and write access.

Figure 24: Map provider manifest

Kind of Information   Description
Sensor Description    Specification of module and definition of module type (entry).
RoadGraph             Road graph representation
HdMap                 Semantic high-accuracy topographic map representation

Table 31: Map Sensor Description


6.2. Data Types

All data types derive from one generic data structure which contains information that is relevant for all algorithms.

Figure 25 Generic data structure with data type definition

The results of components are provided via shared-pointer containers that are realized as copy-on-update containers. This guarantees non-blocking simultaneous read access and, if multiple modules access the same data structure, only a small latency for write access.

Figure 26 Data container
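The following minimal C++ sketch illustrates the copy-on-update idea behind these containers; it is a non-normative illustration and not the Container/ContainerBase implementation of Figure 26.

    // Minimal sketch of the copy-on-update idea, assuming shared-pointer
    // snapshots; illustration only, not the normative implementation.
    #include <atomic>
    #include <memory>

    template <class T>
    class CopyOnUpdate {
    public:
        explicit CopyOnUpdate(T initial)
            : m_data(std::make_shared<const T>(std::move(initial))) {}

        // Non-blocking read access: every reader obtains an immutable snapshot.
        std::shared_ptr<const T> getReadableData() const {
            return std::atomic_load(&m_data);
        }

        // Write access: copy the payload, modify the copy, publish it
        // atomically. Concurrent readers keep their old snapshot, so reads
        // are never blocked; only the copy adds a small latency for writers.
        template <class UpdateFn>
        void update(UpdateFn&& fn) {
            auto copy = std::make_shared<T>(*std::atomic_load(&m_data));
            fn(*copy);
            std::atomic_store(&m_data, std::shared_ptr<const T>(std::move(copy)));
        }

    private:
        std::shared_ptr<const T> m_data;
    };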

6.2.1. RADAR

The radar fusion input provides a target list (see Table 32) and optionally an object list. Targets are provided in spherical sensor coordinates (range, azimuth angle, elevation angle). Furthermore, each target is characterized by a radial speed and an SNR (signal-to-noise ratio), which is an indicator of the received signal strength. The radar may further provide additional quality information regarding the estimated parameters. Optionally, the radar provides an object list according to section 4.3.

Figure 27 - Radar raw target description

Contents of Figure 25 (pkg Types):

    «enumeration» DataType
      Unknown = 0, Image = 1, RADARRawTarget = 2, RADARRawTargetList = 3,
      LiDARRawTarget = 4, LiDARRawTargetList = 5, Egomotion = 6, Pose = 7,
      GeoPosition = 8, GridMap = 9, Object = 10, ObjectList = 11,
      USRawTarget = 12, USRawTargetList = 13, Proprietary = 14

    class Data
      - m_creationTimestamp : double
      - m_datatype : DataType
      - m_uniqueId : unsigned long long
      + transform(Calibration&) : bool
      + getUniqueId() : unsigned long long
      + getDataType() : DataType

Contents of Figure 26 (pkg Types):

    class ContainerBase
      - m_refCounter : unsigned long* = 1
      - m_datatype : DataType = Unknown
      + ref() : void
      + deref() : bool
      + getDatatype() : DataType

    class Container<T> (T : class, derived from ContainerBase)
      - m_data : T* = nullptr
      + getReadableData() : const T&
      + getWritableData() : T*

Contents of Figure 27 (pkg Types):

    class RADARRawTarget (derived from Data)
      - m_angles : Status<2>
      - m_distance : Status<1>
      - m_snr : double
      - m_velocity : Status<1>
      + transform(Calibration&) : bool

    «enumeration» RADARRawTargetType
      Unknown = 0, Static = 1, Dynamic = 2


Raw Target List         Unit   Description
Distance                m      Measured distance of the detection.
Radial Velocity         m/s    Absolute radial (in direction of the segment) velocity of the detection.
Angle Azimuth           rad    Azimuth angle in radians of the detection.
Angle Elevation         rad    Elevation angle in radians of the detection.
Existence probability   0..1   Existence probability of the detection, using e.g. SNR and RCS, not based on history.

Table 32 – detailed description of Radar raw targets
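As a non-normative illustration, a raw target given in spherical sensor coordinates can be converted to Cartesian sensor coordinates as follows; the axis convention (x forward, y left, z up) is an assumption.

    // Illustration (assumed axis convention): converting one radar raw
    // target from spherical sensor coordinates (Table 32) into Cartesian
    // sensor coordinates; angles in radians.
    #include <cmath>

    struct CartesianTarget { double x, y, z; };

    CartesianTarget toCartesian(double range, double azimuth, double elevation) {
        const double cosEl = std::cos(elevation);
        return { range * cosEl * std::cos(azimuth),   // x: forward
                 range * cosEl * std::sin(azimuth),   // y: left
                 range * std::sin(elevation) };       // z: up
    }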

6.2.2. Camera

The camera provides images and optionally an object list. The object list is described in section 4.3. The image is a row-dominated array with information per pixel and a pixel format.

Figure 28 Image description

The object list is defined by 2D objects in image coordinates. All meta-information is equal to the generic object interface in section 4.3.

Figure 29 Image object description

Contents of Figure 28 (pkg Types):

    class Image
      - m_width : unsigned int = 0
      - m_height : unsigned int = 0
      - m_type : ImageType
      - m_data : void* = nullptr
      - m_dataSize : unsigned long = 0
      - m_rowSize : unsigned int = 0
      + getWidth() : unsigned int
      + getHeight() : unsigned int
      + getImageType() : ImageType
      + getPixel(unsigned int, unsigned int) : T*
      + getPixelSize() : unsigned long
      + getDataSize() : unsigned long
      + getRowSize() : unsigned long

    «enumeration» ImageType
      RGB888 = 0, RGBA8888 = 1, UInt8 = 2, UInt16 = 3, Float = 4

Contents of Figure 29 (pkg Types):

    class ImageObject
      - m_center : Status<2>
      - m_class : ObjectClass = Unknown
      - m_dimension : Status<2>
      - m_id : unsigned int
      - m_relativeVelocity : Status<2>
      + getCenter() : const Status<2>&
      + getClassification() : ObjectClass
      + getDimension() : const Status<2>&
      + getId() : unsigned int
      + getRelativeVelocity() : const Status<2>&
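The row-dominated storage implied by Image::getPixel() can be made concrete with a small, non-normative addressing sketch; the helper name pixelAddress is an assumption.

    // Sketch (assumption) of row-dominated pixel addressing: the byte offset
    // of pixel (x, y) is y * rowSize + x * pixelSize.
    #include <cstddef>
    #include <cstdint>

    const std::uint8_t* pixelAddress(const std::uint8_t* data,
                                     std::size_t rowSize,    // bytes per image row
                                     std::size_t pixelSize,  // bytes per pixel
                                     unsigned x, unsigned y) {
        return data + static_cast<std::size_t>(y) * rowSize
                    + static_cast<std::size_t>(x) * pixelSize;
    }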


Image object        Unit   Description
Center              Px     The pixel coordinate of the object's center
Dimension           Px     Width and height of the object
ID                         The unique identifier
Classification             Classification of the object
Relative velocity   Px/s   The relative velocity of the object

Table 31 – detailed description of an image object

6.2.3. LiDAR

The LiDAR provides a raw target list with all measured pings of one complete measurement cycle. A LiDAR target contains the Cartesian sensor coordinates, the polar coordinates, the intensity and the target's type.

Figure 30 LiDAR raw target

Figure 31 List of LiDAR raw targets

6.2.4. Vehicle bus

The vehicle bus data is separated into messages and signals. Both types are identified via a unique ID. A message can contain multiple signals, and the signals contain the data. The vehicle bus sensor (section 6.1.1.5) provides a look-up table to translate names of signals and messages to the unique IDs.

Contents of Figure 30 (pkg Types):

    class LiDARRawTarget
      - m_position : Status<3>
      - m_angles : Status<2>
      - m_distance : double
      - m_intensity : double
      - m_type : LiDARRawTargetType
      + transform(Calibration&) : bool

    «enumeration» LiDARRawTargetType
      Other = 0, Ground = 1, Rain = 2, Dirt = 3

Contents of Figure 31 (pkg Types):

    class LiDARRawTargetList (derived from Data, see Figure 25)
      + transform(Calibration&) : bool

    Note: all lists derive from std::vector<> with the corresponding entries as template argument.


Figure 32 The vehicle bus data structure

6.2.5. Vehicle Abstraction

Since the content of the vehicle bus is highly specific, it is reasonable to introduce a vehicle abstraction interface between the perception & fusion layer and the actual vehicle CAN bus. This decoupling of the high-level fusion system from the low-level vehicle control allows for efficient abstract function development and makes a fast transfer into various types of vehicles, including simulated variants, possible.

Figure 33 Vehicle abstraction

Contents of Figure 32 (pkg Types):

    «enumeration» DataType
      Unknown = 0, Image = 1, RADARRawTarget = 2, RADARRawTargetList = 3,
      LiDARRawTarget = 4, LiDARRawTargetList = 5, Egomotion = 6, Pose = 7,
      GeoPosition = 8, GridMap = 9, Object = 10, ObjectList = 11,
      USRawTarget = 12, USRawTargetList = 13, VehicleMessage = 14, Proprietary = 15

    class VehicleMessage (derived from Data, see Figure 25)
      - m_messageID : unsigned long
      contains n VehicleSignalBase entries

    class VehicleSignalBase
      - m_signalId : unsigned long
      - m_type : SignalType

    class VehicleSignal<T> (T : typename, derived from VehicleSignalBase)
      - m_value : T
      + getValue() : T

    «enumeration» SignalType
      Unknown = 0, Int8 = 1, UInt8 = 2, Int16 = 3, UInt16 = 4, Int32 = 5, UInt32 = 6, Float = 7

    Typed instantiations: VehicleSignalInt8 < T -> char >, VehicleSignalUInt8 < T -> unsigned char >,
    VehicleSignalInt16 < T -> short >, VehicleSignalUInt16 < T -> unsigned short >,
    VehicleSignalInt32 < T -> int >, VehicleSignalUInt32 < T -> unsigned int >,
    VehicleSignalFloat < T -> float >


A control unit including the accordant vehicle-specific software is therefore necessary. On the one hand, it extracts data from the CAN bus and sends it via the abstract vehicle output interface to the fusion system; on the other hand, it receives the abstract vehicle input interface with constant frequency and writes the information onto the vehicle-specific CAN bus.

The vehicle input and output interfaces contain all information necessary to supervise and control the full functionality of the vehicle, including vehicle motion, automation modes, lighting, and the information of further vehicle sensors.

We also define a vehicle HMI interface, which represents the driver input in simulated vehicles. It is not used in real vehicles.

To be able to efficiently adapt the fusion system or the automated driving functions to various vehicles (incl. simulations), the software shall be as generic as possible, parameterized only by real quantities which define physical properties (e.g., size), technical details (e.g., engine specification) or the capabilities of the behavior (automation modes etc.) of the host vehicle. These are represented by the vehicle specification interface (see Figure 34).

Figure 34 Vehicle specification


6.2.5.1. Detailed Specification

VehicleInput

Name | Description | Type | Comments

Motion control
steeringAngle | Set steering angle in rad | double | Single-track model assumption; mathematically positive (rotation to the left -> positive)
acceleration | Set acceleration in m/s² | double | Positive along gear-induced motion direction; longitudinal acceleration (scalar)
velocity | Set velocity in m/s | double | Positive along gear-induced motion direction; longitudinal velocity (scalar); alternative to setting an acceleration
controlMode | Set controller mode | uint8 | [0: acceleration, 1: velocity, 2: stop distance, 3: HMI-controlled]
stopDistance | Set stop distance in m | double | Positive along gear-induced motion direction
gearMode | Set gear mode | uint8 | [0: forward, 1: backward, 2: park, 3: idle]

Further control commands
indicatorState | Set indicator state | uint8, bitvector | [0: off, 1: on], [FL, FR, RL, RR, CL, CR, FullLeft, FullRight]
indicatorMode | Set indicator mode | uint8 | [0: normal, 1: warning, 2: direct]
headLightState | Set head light state | uint8 | [0: off, 1: normal, 2: high beam]
ignitionState | Set ignition state | uint8 | [0: off, 1: ignition on, 2: engine on]
parkingBrakeState | Set parking brake state | uint8 | [0: off, 1: on]
wiperMode | Set wiper mode | uint8, bitvector | [0: off, >0: different speeds], [front wiper, rear wiper, …]

System commands
automationMode | Set automation mode (high level) | uint8 | [0: manual, 1: longitudinal, 2: lateral, 3: full]
automationPermission | Set automation permission (low level) | uint8 | [0: off, 1: on] (general permission for automated driving)
emergencyBrake | Set emergency brake state | uint8 | [0: off, 1: on]
comfortHold | Set comfort hold state | uint8 | [0: off, 1: on], "soft emergency brake"
chargingMode | Set charging mode | uint8 | [0: no charging, 1: charging]
engineMode | Set engine / drive mode | uint8 | [0: electrical, 1: fuel, 2: auto, …]; tbd: hybrid (E, hybrid, etc.), fully electric (eco, sport, …)

Automation software meta data
heartBeat | Heartbeat (counter) | uint32 |
stopDistanceHeartBeat | Heartbeat value at start of stopping maneuver | uint32 |
timestamp | Current system time in µs/ns (tbd) | uint64 |
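The single-track model mentioned in the steeringAngle comments relates the set steering angle to the resulting yaw rate. The following sketch shows the common textbook relation as a non-normative illustration; the function name and the wheelbase parameter are assumptions.

    // Illustration of the kinematic single-track model assumption referenced
    // in the steeringAngle comment; non-normative, textbook relation only.
    #include <cmath>

    // Yaw rate (rad/s) resulting from a steering angle (rad) at a given
    // longitudinal velocity (m/s) for a vehicle with the given wheelbase (m).
    // A mathematically positive steering angle (rotation to the left) yields
    // a positive yaw rate in forward motion.
    double yawRateFromSteering(double steeringAngle, double velocity, double wheelbase) {
        return velocity * std::tan(steeringAngle) / wheelbase;
    }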


VehicleOutput

Name | Description | Type | Comments

Actual motion
steeringAngle | Actual steering angle in rad | double | Single-track model assumption; mathematically positive (rotation to the left -> positive)
acceleration | Actual acceleration in m/s² | double | Positive along gear-induced motion direction; longitudinal acceleration (scalar)
velocity | Actual velocity in m/s | double | Positive along gear-induced motion direction; longitudinal velocity (scalar)
controlMode | Actual controller mode | uint8 | [0: acceleration, 1: velocity, 2: stop distance, 3: HMI-controlled]
stopDistance | Actual stop distance in m | double | Positive along gear-induced motion direction
gearMode | Actual gear mode | uint8 | [0: forward, 1: backward, 2: park, 3: idle]
gear | Actual gear value (R, 1, 2, 3, 4, 5, …) | uint8 |

Vehicle state
indicatorState | Actual indicator state | uint8, bitvector | [0: off, 1: on], [FL, FR, RL, RR, CL, CR, FullLeft, FullRight]
indicatorMode | Actual indicator mode | uint8 | [0: normal, 1: warning, 2: direct]
headLightState | Actual head light state | uint8 | [0: off, 1: normal, 2: high beam]
ignitionState | Actual ignition state | uint8 | [0: off, 1: ignition on, 2: engine on]
parkingBrakeState | Actual parking brake state | uint8 | [0: off, 1: on]
wiperMode | Actual wiper mode | uint8, bitvector | [0: off, >0: different speeds], [front wiper, rear wiper, …]

System state
automationMode | Actual automation mode (high level) | uint8 | [0: manual, 1: longitudinal, 2: lateral, 3: full]
automationPermission | Actual automation permission (low level) | uint8 | [0: off, 1: on] (general permission for automated driving)
emergencyBrake | Actual emergency brake state | uint8 | [0: off, 1: on]
comfortHold | Actual comfort hold state | uint8 | [0: off, 1: on], "soft emergency brake"
chargingMode | Actual charging mode | uint8 | [0: no charging, 1: charging]
engineMode | Actual engine / drive mode | uint8 | [0: electrical, 1: fuel, 2: auto, …]; tbd: hybrid (E, hybrid, etc.), fully electric (eco, sport, …)

Vehicle meta data
heartBeat | Vehicle heartbeat (counter) | uint32 |
stopDistanceHeartBeat | Heartbeat value at start of stopping maneuver | uint32 |


timestamp | Current vehicle system time in µs/ns (tbd) | uint64 |

Vehicle sensors
acceleration6D | Acceleration (linear, angular) in m/s²; rad/s² | Array: 6 doubles | 6 DOF (X, Y, Z, Roll, Pitch, Yaw) w.r.t. vehicle coordinate system (DIN 70000)
steeringVelocity | Steering velocity in rad/s | double | Mathematically positive (rotation to the left -> positive)
steeringWheelAngle | Steering wheel angle in rad | double | Mathematically positive (rotation to the left -> positive)
velocity6D | Velocity (linear, angular) in m/s; rad/s | Array: 6 doubles | 6 DOF (X, Y, Z, Roll, Pitch, Yaw) w.r.t. vehicle coordinate system (DIN 70000)
position | GNSS position (long, lat in degree; alt in m) | Array: 3 doubles | Absolute position (Long, Lat, Alt) in WGS84 (current state)
positionVariances | GNSS variances around the current position | Array: 3 doubles | Variances; units tbd
headingNorth | Angle to north in rad | double | W.r.t. north
pitchAngle | Pitch angle in rad | double | W.r.t. the local tangential surface
rollAngle | Roll angle in rad | double | W.r.t. the local tangential surface
wheelSpeed | Single wheel velocities in m/s | Array: 4 doubles | Sign indicates motion direction, positive = forward; (FL, FR, RL, RR)
wheelSlipRatio | Wheel slip ratio of the 4 wheels [0, 1] | Array: 4 doubles |
wheelSlipAngle | Wheel slip angles in rad | Array: 4 doubles | (FL, FR, RL, RR)
wheelForce | 4 wheel forces in N | Array: 4 doubles | (FL, FR, RL, RR)
brakePedalPosition | Brake pedal position [0, 1] | double | 0 = neutral; 1 = down
brakePedalPressure | Brake pedal pressure in bar | double |
acceleratorPedalPosition | Accelerator pedal position [0, 1] | double | 0 = neutral; 1 = down
acceleratorPedalPressure | Accelerator pedal pressure in bar | double |

Additional vehicle states
batteryState | Charging state of battery [0, 1] | double |
doorStates | Door states (closed/open) | uint8, bitvector | [0: off, 1: on], [FL, FR, RL, RR, Trunk, Hood]
safetyBeltStates | Safety-belt state | uint8, bitvector | [0: off, 1: on], [FL, FR, RL, RR, RC]
ADASState | ADAS state | uint16, bitvector | ADAS states [0: off, 1: on], tbd
errorCode | Error codes | uint32, bitvector | Details tbd

Note: fields marked with a red background in the original layout are still in discussion.
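Bitvector fields such as indicatorState pack several flags into one integer. The following non-normative sketch shows how a single flag could be read; the concrete bit assignment is an assumption derived from the order given in the comments column.

    // Sketch (assumption): reading one flag from a uint8 bitvector field
    // such as indicatorState; the bit order [FL, FR, RL, RR, CL, CR,
    // FullLeft, FullRight] is assumed from the comment column.
    #include <cstdint>

    enum IndicatorBit : unsigned {
        FL = 0, FR = 1, RL = 2, RR = 3, CL = 4, CR = 5, FullLeft = 6, FullRight = 7
    };

    bool indicatorOn(std::uint8_t indicatorState, IndicatorBit bit) {
        return (indicatorState >> bit) & 0x1u;   // [0: off, 1: on]
    }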


6.2.6. Ultrasonic

The ultrasonic sensor provides distance measurements without additional information. If the ultrasonic sensor system contains more than one sensor, a list of measurements for one measurement cycle is provided.

Figure 35 Ultrasonic measurement

Figure 36 Ultrasonic measurement list

6.2.7. Map

6.2.7.1. Simple Road Graph

Following the OpenDrive 1.4 specification [OD1.4], the road graph is made of nodes and edges. The nodes of the global road graph represent single road segments (i.e., geometric paths and lane definitions) with specific attributes describing their geometric parameters and some additional attributes (e.g., driving restrictions, drivable directions, speed restrictions, etc.). Edges represent connections between nodes where a change in attributes can occur (e.g., an intersection, a change of the speed limit, a stop sign or a parking space entrance). The road geometry is described analytically by means of various elements, such as straight lines, circular sections, polynomials or clothoids. Nonetheless, queries through the API employ simple Cartesian coordinates.

Each road segment is identified by a unique integer ID and is assigned a number of attributes or meta-data.

Attributes      Data Type   Description
semantic type   String      E.g.: lane, biking, tram, parking space, …
roadSegmentId   Int         Unique for all RoadSegments
roadId          Int         Unique for all roads
laneId          Int         Unique for all lanes of a single road
laneSectionId   Int         Unique for all sections along a single lane

Table 33: Example of attributes for nodes and edges

Contents of Figure 35 (pkg Types):

    class USRawTarget (derived from Data, see Figure 25)
      - m_distance : Status<1>
      + transform(Calibration&) : bool

Contents of Figure 36 (pkg Types):

    class USRawTargetList (derived from Data, see Figure 25)
      + transform(Calibration&) : bool

    Note: all lists derive from std::vector<> with the corresponding entries as template argument.


The road segments can be queried by a proximity search from a Cartesian point:

Method                                Description
getClosestRoadSegmentFromPosition()   Given a point, return the RoadSegment with the minimal lateral distance to the segment.
getDistanceFromSegmentStart()         Get the longitudinal position along the RoadSegment for a queried point.
getLaneWidth()                        Get the width of the road at a specific distance from the road segment start point.
getPosition()                         Generate a single Cartesian 3D point on the RoadSegment at a specified distance from the segment start position.
getLength()                           Return the RoadSegment length in meters.
sampleTrackLine()                     Generate Cartesian 3D points along the defined list of RoadSegments for a specified list of distances from the track list origin.

Table 34 – Simple Road Graph query API
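A non-normative usage sketch of this query API is given below. The method names are taken from Table 34, but the signatures and the surrounding RoadGraph interface are assumptions for illustration.

    // Hypothetical interface wrapping some of the Table 34 queries; the
    // signatures are assumptions for illustration only.
    struct Point3d { double x, y, z; };

    class RoadGraph {
    public:
        virtual ~RoadGraph() = default;
        virtual int getClosestRoadSegmentFromPosition(const Point3d& p) const = 0;
        virtual double getDistanceFromSegmentStart(int segmentId, const Point3d& p) const = 0;
        virtual double getLaneWidth(int segmentId, double distanceFromStart) const = 0;
        virtual double getLength(int segmentId) const = 0;
    };

    // Example: determine the lane width at the ego position by first
    // projecting the position onto the closest road segment.
    double laneWidthAt(const RoadGraph& graph, const Point3d& egoPosition) {
        const int segment = graph.getClosestRoadSegmentFromPosition(egoPosition);
        const double s = graph.getDistanceFromSegmentStart(segment, egoPosition);
        return graph.getLaneWidth(segment, s);
    }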

6.2.7.2. Semantic High-Accuracy Topographic Map

The semantic high-accuracy topographic map contains information about the geometry, color and semantics of features like the drivable area, road markings, parking space types and vertical structures.

Attributes      Data Type       Description
Id              Int             ParkingSlot id
parkingClass    Enum            Normal/handicapped/charging/etc. parking space type
cornerListCcw   Vector<Point>   Polygon of the parking slot

Table 35: ParkingSlot data

Attributes     Data Type       Description
Id             Int             RoadMarking id
cornerList     Vector<Point>   Feature polygon
markingClass   Enum            Lane vs. park marking
markColor      Float[3]        Averaged RGB color of the marking
envColor       Float[3]        Averaged RGB color of the marking environment

Table 36: RoadMarking data

Attributes   Data Type       Description
Id           Int             Boundary id
bCurve       Vector<Point>   A list of points defining the boundary and its height

Table 37: Boundary curve data

Attributes       Data Type               Description
outerBorderCcw   Vector<Point>           Counter-clockwise outer polygon definition for the complete known space
innerBorderCw    Vector<Vector<Point>>   List of clockwise polygons that are to be subtracted from the outer polygon

Table 38: FreespacePolygon data


Method                   Description
getAllParkingSlots()     Retrieve all known ParkingSlots
getParkingSlotFromId()   Retrieve a ParkingSlot by id
getRoadMarkings()        Retrieve all known road markings including labels up to a specified distance around a point
getBoundaries()          Retrieve all vertical structures in the map
getDrivableArea()        Retrieve the freespace polygon

Table 39 – Semantic map query API

Figure 37: Map Layer

6.2.8. V2X

The V2X communication part needs an interface to the fusion platform and should provide an interface directly to the function module, too. The interface to the fusion platform serves the interchange of information regarding environmental situation awareness, as well as parking and charging management. The interface between the function module and V2X is used to interchange functional operating instructions and user requests by means of the HMI.

To determine the specific interfaces for the V2X communication, it is useful to look at the intended V2X messages first. Depending on the content of received messages, varying information is obtained. On the other hand, the information required to populate V2X messages that shall be sent to other recipients differs according to the desired message type.

Subsequently, the accessor methods for the provided data are described according to the interfaces.

6.2.8.1. V2X messages

It is intended to use the following V2X message types in this project:

• Cooperative Awareness Message (CAM, ETSI EN 302 637-2)
• Decentralized Environmental Notification Message (DENM, ETSI EN 302 637-3)
• Parking Management Message (PMM, will be specified in this project)
• Electromobility Management Message (EMM, will be specified in this project)
• Parking Command Message (PCM, will be specified in this project)

The set of messages is not fixed. In the future, additional message types may be included for extensions of the current use cases. For example, the Signal Phase and Timing message (SPaT) or the road topology message (TOPO or MAP) would be interesting in this context (both still under development, ISO TS 19091 / ETSI TS 103 301).

The creation of V2X messages is based on two different modes. While some, in particular safety-relevant, messages are disseminated periodically, other message types are triggered event-based. Hence, certain information is available in regular time intervals, whereas other information may occur irregularly.

In the following, the previously mentioned messages are described along with the related inputs and outputs, as well as the triggering condition. Inputs are a prerequisite to populate V2X messages that shall be sent to other recipients, whereas outputs are relevant for information that has to be passed from incoming V2X messages to the fusion platform or the function module.

Cooperative Awareness Message (CAM)

Description | Distributing information about the presence, position and basic status of neighbouring ITS stations (ITS-S). In particular, the communication of direct neighbours, i.e. within the area of one single hop, is essential. CAMs guarantee the essential cooperative perception in a V2X ad-hoc network. Hence, every communicating ITS-S must be able to receive and send them.
Triggering condition | Periodically
Input | Object information of the ego vehicle (e.g. position, timestamp, heading, velocity, direction, vehicle length/width)
Output | Object information (i.e. from another announcing cooperative vehicle, V2V)

Decentralized Environmental Notification Message (DENM)

Description | Providing information about a certain driving environment or traffic event to other ITS-S. The focus is predominantly on applications of cooperative Road Hazard Warning (RHW), to make other road users aware of dangers. Based on the information, receiving ITS stations can decide whether the event is relevant for them and trigger suitable actions on application level. If required, DENMs can be passed on to other ITS-S by the receiver (multi-hop).
Triggering condition | Event-based
Input | Object information (e.g. obstacle), state information (e.g. fatal system error)
Output | Object information (e.g. obstacle), state information (e.g. fatal system error)


Parking Management Message (PMM)

Description | Basically providing information about vacant parking or charging lots. Furthermore, it allows updating the occupancy view and can announce management information. A list of vacant parking/charging lots is given as a whitelist; the list can be updated by marking certain lots as vacant or occupied. Management information like instructions to find a vacant parking/charging lot, to take a defined lot, or specific user requests can be included.
Triggering condition | Periodically
Input | Parking lot information from the ego vehicle (i.e. updates for the list of vacant/occupied parking or charging lots) and functional responses or requests (e.g. parking state, current parking lot) to the cooperative infrastructure
Output | Parking lot information (i.e. list of vacant/occupied parking or charging lots) and functional requests or instructions (e.g. parking mode, assigned parking lot) from the cooperative infrastructure

Electromobility Management Message (EMM)

Description | Providing information relevant for electric vehicles and their management. The EMM can be a management request, like an inquiry for the current charging state of an electric vehicle, or an instruction, e.g. to start or stop charging at a certain charging station. On the other hand, it can contain information like the current charging state of an electric vehicle or the announcement of the start of the charging process.
Triggering condition | Event-based
Input | Charging information of the ego vehicle (e.g. charging state, remaining charging time) and functional responses or requests (e.g. charging state, target charging time) to the cooperative infrastructure
Output | Charging information (e.g. charging state, remaining charging time) from other cooperative vehicles and functional requests or instructions (e.g. to announce the current state, start/stop charging) from the cooperative infrastructure

Parking Command Message (PCM)

Description | Providing information relevant for parking, collection and charging commands to the vehicle. The PCM can be a management request from the valet parking driver, triggered via the valet parking mobile app. The PCM can also contain a suggested route from the start point to the destination point for the respective parking or collection scenario.
Triggering condition | Event-based
Input | User (e.g. driver) command information sent from the valet parking app to the ego vehicle (e.g. command type like parking, collection or charging, the planned route and the designated destination parking spot)
Output | User (e.g. driver) command information sent from the valet parking app to the ego vehicle (e.g. command type like parking, collection or charging, the planned route and the designated destination parking spot)


There is no standardization or definition of V2X messages including the information intended in PCM, PMM and EMM so far. As this specific information is essential for the use cases considered in this project, these new message types will be specified, defining the appropriate data structures. In the future, the definitions could be fed into an upcoming standardization, in particular with regard to the electromobility management messages.

6.2.8.2. V2X data input/output

Member                Description
getObjectInfoList()   Get object information from V2X sensor
setObjectInfo()       Set object information to V2X sensor

Table 40: Access object information

Member                Description
getLotInfoList()      Get list of lot information from V2X sensor
updateLotInfoList()   Update list of lot information through V2X sensor

Table 41: Access lot information

Member                  Description
getChargingInfoList()   Get list of charging information from V2X sensor
setChargingInfo()       Set charging information to V2X sensor

Table 42: Access charging information

Member               Description
getStateInfoList()   Get list of state information from V2X sensor
setStateInfo()       Set state information to V2X sensor

Table 43: Access state information


Part 3: Fusion and Environment Model

7. Fusion layer

The fusion layer contains components that read information from multiple sensors or components of the perception layer and fuse the information into one consistent environment model, which can then be read by the following layers.

For the components of the fusion layer, please see the module manifest (chapter 5).

The components of the fusion layer consume data from the sensors and the perception layer and provide information for other fusion layer components or the environment model.

8. Environment model

The environment model contains all consistent and validated information from the perception and fusion layer. The collected updates are sent periodically (see chapter 3) to the application layer.

The environment model is a set of data structures describing the ego-vehicle and the environment which follows the concept of the Blackboard pattern. The model provides updated information after defined timing triggers.

All information provided in this environment model is described in a fixed coordinate system. The coordinate system is defined by the external map, and the localization results are defined in the coordinate system of these external maps.

8.1. Timing concept

The environment model differentiates the scene into three different classes. Every class updates its information to the functional layer in its own timing cycle.

The slowest update rate is used for the static environment information in form of the grid map or updates in the road graph or the semantic high-accuracy topographic map.

The second class updates the object lists of the dynamic model or states of traffic lights, etc.

The fastest class describes the ego-vehicle state, which contains the driver state, the ego-motion and other safety-relevant information about the vehicle.

For further information about the data format, please see chapter 4.

8.2. Map data

The map data (see also section 6.2.7 for the map data description) contains information about the global environment, i.e. road graphs. The map data in the environment model is enriched by sensor and fusion information, i.e. lane information, the state of traffic lights and information about parking slots.


8.2.1. Road graph

The global road graph contains drivable streets, intersections and parking places in an abstract graph. Nodes in this graph describe intersections, and the path from a road to a parking place is described as a special case of an intersection. For the valet parking use-cases, handover sections are required, which are also represented as nodes.

The drivable direction of a road is stored in the meta-data block of each edge of the graph, together with the length, the maximal and average speed, the maximum curvature, and whether the road requires a road charge.

Parking spaces contain information about the dimension of the rectangular area and the heading. The node describes the center of the parking space. Additional information, like restrictions (e.g. disabled parking permissions), owners of private parking spaces, or a charging plate, is given in a meta-data area.

The global road graph also includes information about static (or nearly static) properties of the nodes and edges which is acquired dynamically (e.g. by means of C2I communication). This includes, for example, the occupancy of parking spaces, traffic jams, construction sites, road blockings, current average travel times or the occupancy of charging plates. For all this measured information, corresponding uncertainties are given.

Every node and every edge is identified by a unique ID and has relations to the information of the appropriate elements in the semantic high-accuracy topographic map.

Figure 38 OpenStreetMap-based road graph of Berlin

The map provides information about the complete map, or it can return a region of interest around a defined position (e.g. the ego vehicle coordinate). The result is described as an abstract road graph with information about following roads outside of the ROI.


8.2.2. Semantic high-accuracy topographic map

Figure 39 OpenDrive based semantic high-accurate topographic map of urban intersections in Braunschweig

The semantic high-accuracy topographic map contains precise information about the geometry and semantics of roads, lanes, intersections and parking spaces as well as various road infrastructure. Figure 39 shows an example of an urban intersection.

To model the geometry of the road, its centerline is modeled piecewise by arc segments, clothoids or polynomials. To model the lanes belonging to a road, the offsets and widths of each lane are modeled piecewise by polynomials. Furthermore, various lane properties are given, such as bus lane or merging lane. To describe the road network, road linkage is characterized for each road with its successor/predecessor linkage and junctions. Intersections and junctions are modelled by one-way roads that connect inbound lanes from a road to outbound lanes.


Figure 40 OpenDrive Example of parking area at DLR in Braunschweig (Tronis® and SUMO)

The position of road-specific objects, such as road markings like stopping lines and directional arrows, but also road infrastructure like traffic signs, traffic lights or barriers, is parametrized by a longitudinal offset along the centerline and a transversal offset to it. Lane markings are given per lane (at the outline of the lane). Parking spaces are also provided per road and use the same positioning along the road as other road infrastructure.

If any lanes, lane markings, parking spaces, traffic signs, occupancy of parking spaces, states of barriers, states of traffic lights, etc. are detected online (e.g. by the onboard sensors or via V2X communication) and are not already included in the map, they are added and linked to the appropriate elements which are already included in the map. When online-detected elements are already included in the stored information, the stored information is fused with the online-detected information.

Uncertainty information (existence probability, confidence, …) is added to every object in this map where it makes sense (especially for the objects which were detected online and added to or fused with the offline-generated map).

Additional information includes landmarks, which can be used for localization and as a global reference. A landmark contains its position and orientation as well as some descriptors in the meta-data block which help localization algorithms to identify the landmark.

8.3. Static environment model

The static model contains all measured and validated information about the static scene that is not included in the enriched map data.

8.3.1. Freespace

The freespace defines the cross-checked space between the car and the measurement range of the sensors. It defines a polygon which describes the freespace probability based on a given freespace certainty. If the sensors can detect road curbs and can link them to the external map, it is possible to describe the freespace on the sidewalk, providing the application a generic way to check whether it is safe to drive on the sidewalk in urgent cases. The height of the road curbs can be extracted from the map or the verified obstacle list (see section 8.3.2).

The freespace is differentiated into two different interpretations. The first interpretation describes a dynamic freespace which is fused in a single-shot way by all sensors to provide a validated, highly dynamic freespace. This gives the higher applications the possibility to react as fast as possible to changes in the driveway (i.e. a box is detected directly in front of the vehicle and was not detected in earlier measurement cycles). The polygon describes the freespace based on the current field of view of the sensors and can describe hidden areas as non-free.

The second interpretation describes a time-based fusion. This freespace contains information that is measured over several cycles and can contain freespace that currently cannot be seen by the sensors but was measured during the drive-by of this area.

Figure 41 Freespace description

The freespace contains several polygons. The outer polygon defines the maximum drivable area, and the inner borders describe borders inside the maximum free space. These inner borders can result, for example, from traffic islands. See Figure 40 for an example on a parking area; the dark blue area is the drivable area.

Figure 42 Example of a free space

Contents of Figure 41 (pkg Types):

    class Line
      - m_start : Status<3>
      - m_end : Status<3>
      + getStart() : const Status<3>&
      + getEnd() : const Status<3>&

    class FreeSpaceSegment
      - m_borderType : FreeSpaceSegmentType
      + getBorderType() : FreeSpaceSegmentType

    «enumeration» FreeSpaceSegmentType
      SensorMaxRange, SensorShadow, Obstacle

    class FreeSpaceBorder (contains 1..* FreeSpaceSegment)
      - m_freespaceSegments : std::vector<FreeSpaceSegment>
      + getFreeSpaceBorder() : const std::vector<FreeSpaceSegment>

    class FreeSpace
      - m_outerBorder : FreeSpaceBorder
      - m_innerBorders : std::vector<FreeSpaceBorder>
      + getOuterBorder() : const FreeSpaceBorder&
      + getInnerBorders() : const std::vector<FreeSpaceBorder>&


The tables below describe the content of the free space polygon classes.

FreeSpaceSegment      Unit   Description
Border type                  A description why this border exists.

Table 44 – detailed description of a free space segment

FreeSpaceBorder       Unit   Description
Free space segments          A closed-loop polygon of free space segments

Table 45 – detailed description of a free space border

FreeSpace             Unit   Description
Outer border                 The maximum drivable area. Points are sorted counter-clockwise.
Inner borders                A list of free space borders that fit completely in the outer border. The points are sorted clockwise.

Table 46 – detailed description of a free space
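As a non-normative illustration, a point-in-freespace test on the outer and inner borders could look as follows; the point type and function names are assumptions.

    // Sketch (assumption): testing whether a point lies in the drivable free
    // space, i.e. inside the counter-clockwise outer border and outside
    // every clockwise inner border. Uses the standard even-odd ray casting.
    #include <cstddef>
    #include <vector>

    struct Pt { double x, y; };

    bool inPolygon(const std::vector<Pt>& poly, Pt p) {
        bool inside = false;
        for (std::size_t i = 0, j = poly.size() - 1; i < poly.size(); j = i++) {
            if ((poly[i].y > p.y) != (poly[j].y > p.y)) {    // edge crosses the ray
                const double xCross = poly[i].x + (p.y - poly[i].y) *
                    (poly[j].x - poly[i].x) / (poly[j].y - poly[i].y);
                if (p.x < xCross) inside = !inside;
            }
        }
        return inside;
    }

    bool inFreeSpace(const std::vector<Pt>& outerBorder,
                     const std::vector<std::vector<Pt>>& innerBorders, Pt p) {
        if (!inPolygon(outerBorder, p)) return false;
        for (const auto& hole : innerBorders)
            if (inPolygon(hole, p)) return false;            // inside an inner border
        return true;
    }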

8.3.2. Obstacles

Obstacles are all kinds of obstacles that are detected by the onboard sensors and cannot be classified as potentially dynamic objects. If the sensors detect unclassifiable obstacles (i.e. boxes on the road), these obstacles are described by a generic obstacle description. The obstacle is defined as a set of lines that can be bounding boxes or only a set of lines describing the shape of an unknown obstacle as well as possible. The quality of the obstacle is described by an existence quality that is a fusion between measurements of the different sensors and meta-information from the HD map. A second indicator is the detection quality, which provides information about the visibility of the object to the sensors.

Figure 43 Description of an obstacle

Obstacle                Unit   Description
Line segments           m      A list of line segments. The height is encoded in the Z-coordinate.
Existence probability   %      Existence probability of the obstacle
Detection quality       %      Detection quality of the obstacle

Table 47 – detailed description of an obstacle

Contents of Figure 43 (pkg Types):

    class Obstacle
      - m_lineSegments : std::vector<Line>
      - m_existenceProbability : double
      - m_detectionQuality : double
      + getExistenceProbability() : double
      + getDetectionQuality() : double
      + getLineSegments() : const std::vector<Line>&


8.3.3. Static objects

Static objects are obstacles that are detected by the onboard sensors and can be classified as potentially dynamic objects. If possible, the objects are linked to the semantic high-accuracy topographic map. Potentially dynamic objects are, for example, parked cars, standing bicyclists or pedestrians that do not move.

Objects are defined in section 4.3.

8.3.4. Occupancy grid map

The occupancy grid map describes the static world around the ego vehicle. The measured data of the static environment is discretized into quadratic cells, and each cell is classified into one of three classes: free, unknown and occupied. The map is defined by a fixed size and the resolution of one cell. This results in a rectangular map, and a resize request results in a reset of the map. The result of an occupancy grid map can be provided as an image, as shown in Figure 44, or as a one-dimensional row-dominated array with the probability values for each cell.

The map moves along with the vehicle but is not rotated to the ego vehicle's heading.

It is additionally possible to extract a gradient map of measured heights of obstacles to estimate the drivable area and to detect obstacles which can be overrun.

Figure 44 LiDAR-based occupancy grid map


Figure 45 Definition of the Grid-map

The following table defines the member variables of the occupancy grid map.

Member       Description
width        The map's width
height       The map's height
resolution   The map's resolution
position     The current position of the car in the local coordinate system
heading      The car's heading in the local coordinate system
cells        The cells of the occupancy grid map

Table 48 Occupancy grid map member description

A cell of an occupancy grid map is described as follows:

Member           Description
occupancy        The cell contains a set of occupancy entries per sensor
heightGradient   The gradients in x- and y-direction of measured obstacle heights

Table 49 Cell description of an occupancy grid map
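As a non-normative illustration, the mapping from a metric position to an index in the row-dominated cell array could look as follows; the origin convention is an assumption.

    // Sketch (assumption): mapping a metric position to an index in the
    // one-dimensional row-dominated occupancy array.
    #include <cstddef>
    #include <optional>

    struct GridSpec {
        std::size_t width, height;   // map size in cells
        double resolution;           // metres per cell
        double originX, originY;     // metric position of cell (0, 0); assumption
    };

    std::optional<std::size_t> cellIndex(const GridSpec& g, double x, double y) {
        const long cx = static_cast<long>((x - g.originX) / g.resolution);
        const long cy = static_cast<long>((y - g.originY) / g.resolution);
        if (cx < 0 || cy < 0 ||
            cx >= static_cast<long>(g.width) || cy >= static_cast<long>(g.height))
            return std::nullopt;     // outside the fixed-size map
        return static_cast<std::size_t>(cy) * g.width + static_cast<std::size_t>(cx);
    }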

8.4. Dynamic environment model

The dynamic model contains two different classes of information. First, it contains the dynamically detected, classified and predicted objects. Second, it contains the ego vehicle description.

8.4.1. Dynamic objects

The OFP sorts the detected objects into three categories. All objects that never moved are defined as stationary objects (see section 8.3.3). The actually moving objects are classified as dynamic objects, and all objects that are currently stationary but were moving in the past are classified as potentially dynamic objects.

Contents of Figure 45 (pkg Types):

    class GridMap
      + transform(Calibration&) : bool
      + getWidth() : int
      + getHeight() : int
      + getResolution() : float
      + getCell(float, float) : const GridMapCell&
      + updatePose(Egomotion&) : void

    class GridMapCell
      + updateOccupancy(bool, float) : void
      + updateHeightGradient(float) : void
      + getHeight(float&, float&, float&) : bool


In addition to the categorization, a classification is provided with classes for cars, trucks, pedestrians, bicycles and unknown objects.

The objects are matched to the high-accuracy topographic map to improve the maneuver and trajectory prediction of obstacles. The link to clothoids can be used by planning algorithms for path prediction etc. For further information about the data format of objects, please see section 4.3.

8.5. Vehicle status

The vehicle status contains information about the ego-vehicle and is updated as fast as possible.

It contains the ego-motion and the pose in relation to the world or in relation to the local stationary coordinate system.

8.5.1. Ego motion

The ego motion describes the movement between two measurement cycles in the form of a 3D velocity and 3D rotation rates.
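A minimal sketch, assuming SI units and vehicle-fixed axes; the normative format is defined in section 4.4:

    // Minimal sketch (field names and units are assumptions; see
    // section 4.4 for the normative format).
    struct EgoMotion {
        float vx, vy, vz;                    // 3D velocity [m/s]
        float rollRate, pitchRate, yawRate;  // 3D rotation rates [rad/s]
    };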

Some functions require direct information about the ego motion from the vehicle’s sensors (e.g. the steering angle). Such functions can register for CAN-stream forwarding and receive the data directly after it has been interpreted. For further information about the data format, please see section 4.4.

8.5.2. Pose

The pose is defined as a translation and rotation of Cartesian coordinates in relation to the root coordinate system. The root coordinate system can be a local static Cartesian coordinate system or a world system, e.g. UTM. For further information about the data format, please see section 4.5.
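A minimal sketch, assuming an Euler-angle representation of the rotation; the normative format is defined in section 4.5:

    // Minimal sketch (field names and the Euler-angle representation are
    // assumptions; see section 4.5 for the normative format).
    struct Pose {
        double x, y, z;           // translation [m] w.r.t. the root coordinate system
        double roll, pitch, yaw;  // rotation [rad]
        // Root system: a local static Cartesian frame or a world frame, e.g. UTM.
    };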

The ego vehicle is matched to the tracks in the high-accuracy topographic map to estimate the next drivable lanes and tracks.

Appendix

Authors

Dr. Michael Schilling (HELLA), contact author for the OFP: [email protected]

Sven-Garrit Czarnian (HELLA Aglaia)

Kay Kossowan (HELLA Aglaia)

Thomas Dammeier (HELLA Aglaia)

Thomas Müller (HELLA Aglaia)

Alexander Burmeister (DLR)

Christian Löper (DLR)

Paulin Pekezou-Fouopi (DLR)

Thorsten Volz (Elektrobit)

Sebastian Gehrig (InnoSenT)

Judith Ngoumou (TWT Science & Innovation)

Tobias Breddermann (HELLA)


License

This document is published under the Creative Commons license CC BY-ND 4.0 (Attribution-NoDerivatives 4.0 International). For full license information, please visit: https://creativecommons.org/licenses/by-nd/4.0/

References

[AA] Adaptive AUTOSAR, https://www.autosar.org/standards/adaptive-platform/

[ET11] European Telecommunications Standards Institute: “ETSI EN 302 663. Intelligent Transport Systems (ITS); Access layer specification for Intelligent Transport Systems operating in the 5 GHz frequency band”, 2013.

[ISO8855] “Fahrdynamik” (vehicle dynamics), https://de.wikipedia.org/wiki/Fahrdynamik, accessed 5 Dec 2016.

[ISO23150] “Data communication between sensors and data fusion unit for automated driving functions”, https://www.iso.org/standard/74741.html

[IEC61508] Standard published by the International Electrotechnical Commission on rules applied in industry, titled “Functional Safety of Electrical/Electronic/Programmable Electronic Safety-related Systems” (E/E/PE, or E/E/PES), http://www.iec.ch/functionalsafety/standards/

[ISO26262] Titled “Road vehicles – Functional safety”, ISO 26262 is an ISO standard from 2011 for the functional safety of electrical and/or electronic systems in production automobiles, https://www.iso.org/obp/ui/#iso:std:iso:26262:-1:ed-1:v1:en

[OD] OpenDRIVE, http://www.opendrive.org/

[OD1.4] OpenDRIVE Specification 1.4, http://www.opendrive.org/docs/OpenDRIVEFormatSpecRev1.4H.pdf

[OFP] Open Fusion Platform, www.ofp-projekt.de

[OP] Open Platform, http://ir.mobileye.com/investor-relations/press-releases/press-release-details/2016/BMW-Group-Intel-and-Mobileye-Team-Up-to-Bring-Fully-Autonomous-Driving-to-Streets-by-2021/default.aspx

[OR] Open robinos, https://www.elektrobit.com/products/eb-robinos/eb-robinos-specification/

[SI06] Bureau International des Poids et Mesures: “Le Système international d’unités; The International System of Units (SI)”, Organisation Intergouvernementale de la Convention du Mètre, 8th edition, 2006, http://www.bipm.org/utils/common/pdf/si_brochure_8.pdf, accessed 5 Dec 2016.

[UTM] “Universal Transverse Mercator coordinate system”, https://en.wikipedia.org/wiki/Universal_Transverse_Mercator_coordinate_system