
Master of Science in Electrical Engineering
Department of Electrical Engineering, Linköping University, 2016

Stabilization, Sensor Fusion and Path Following for Autonomous Reversing of a Full-scale Truck and Trailer System

Patrik Nyberg


LiTH-ISY-EX–16/4982–SE

Supervisor: Niclas Evestedt, ISY, Linköpings universitet
            Oskar Ljungqvist, ISY, Linköpings universitet
            Henrik Pettersson, Scania
Examiner:   Daniel Axehill, ISY, Linköpings universitet

Division of Automatic Control
Department of Electrical Engineering
Linköping University
SE-581 83 Linköping, Sweden

Copyright © 2016 Patrik Nyberg


Abstract

This thesis investigates and implements the sensor fusion necessary to autonomously reverse a full-size truck and trailer system. This is done using a LiDAR mounted on the rear of the truck along with an RTK-GPS. It is shown that the relative angles between truck-dolly and dolly-trailer can be estimated, along with the global position and global heading of the trailer. This is then implemented in one of Scania's test vehicles, giving it the ability to continuously estimate these states.

A controller is then implemented, showing that the full-scale system can be stabilised in reverse motion. The controller is tested both on a static reference path and on a reference path received from a motion planner. In these tests, the controller is able to stabilise the system well, allowing the truck to perform complex manoeuvres backwards. A small lateral tracking error is present, which needs to be investigated further.


Acknowledgments

Firstly, I would like to thank Scania for the opportunity to do this master thesis. Everyone in the group has been of great help, especially my supervisor Henrik Pettersson.

Secondly, I would like to thank my supervisors Niclas Evestedt and Oskar Ljungqvist along with my examiner Daniel Axehill. All of them have been of great help and have been very involved in this thesis.

I would also like to express my gratitude to all my family and friends. Without their love and support none of this would have been possible; you truly mean everything to me.

Södertälje, June 2016
Patrik Nyberg


Contents

Notation

1 Introduction
  1.1 Background
  1.2 System overview
  1.3 Problem formulation
  1.4 Previous work
  1.5 Thesis outline

2 Platform
  2.1 Truck and trailer
  2.2 Steering
  2.3 Sensors
    2.3.1 LIDAR
    2.3.2 RTK-GPS

3 Modeling
  3.1 Overview
  3.2 Dynamic equations
  3.3 Measurement equations

4 Data Classification
  4.1 Data
  4.2 First filtration
  4.3 RANSAC

5 State estimation
  5.1 Extended Kalman filter
  5.2 Implementation

6 Results
  6.1 Data classification
    6.1.1 Simulation
    6.1.2 Experiments
  6.2 State estimation
    6.2.1 Simulation
    6.2.2 Ground truth
    6.2.3 Experiments
  6.3 Robustness evaluation
  6.4 Controller

7 Conclusion
  7.1 Conclusion
  7.2 Future work

Bibliography


Notation

Symbols

Symbol      Explanation
L1          Wheelbase of the truck
L2          Wheelbase of the dolly
L3          Wheelbase of the trailer
M1          Length of the truck's off-hitch
lk          Length from kingpin to front of trailer
θ1          Angle of truck in global coordinate system
θ3          Angle of the trailer in global coordinate system
β2          Angle between truck and dolly
β3          Angle between dolly and trailer
(x1, y1)    Position of truck in global coordinate system
(x3, y3)    Position of trailer in global coordinate system
α           Steering angle of the truck
v           Velocity in the centre of the truck's rear axle


1 Introduction

The work in this master thesis concerns the sensor fusion and controller needed for a truck to reverse along a desired path with a trailer. In this chapter the background and previous work are described. The problem formulation is stated along with an outline of the thesis.

The solution in this thesis will be a part of a fully autonomous system. Therefore the background is given, followed by a system overview for a fully autonomous system. The problem formulation then describes which part of this problem the thesis will solve.

1.1 Background

Reversing with a trailer is a task many drivers find difficult. Most car drivers have reversed a car with one trailer hitched and know that it is not easy. The main difference between a car with a trailer and a truck with a trailer is that the hitching is done through a dolly. The dolly is much like a normal car trailer, but instead of a box or loading area on top it has a turning plate to which the trailer is attached. This creates two main points which make reversing with a truck and trailer more difficult than with a car. The first is that there are now two joints, creating two angles: one from the truck to the dolly and one from the dolly to the trailer. The driver then has to consider both these angles while reversing. The second point is that the dolly is very hard for the driver to see. When looking through the mirrors in the cabin, the only visible part is the trailer. The driver then has to look for where the corners are and try to calculate how the dolly is directed compared to the truck.

For cars, there are now driver aids available for reversing a trailer. For example,


Figure 1.1: System overview. (Block diagram with the blocks High Level Planning, Motion Planning, Vehicle Control, Vehicle State Estimation, Sensing Environment and the Environment including the vehicle.)

VW now has a system where the driver can set a desired angle between the car and trailer. This allows the driver to much more easily reverse a trailer in a desired manoeuvre. For trucks, however, no such system exists on the market today.

For this there exist several plausible market exploitations. The most basic is that the truck measures the angles, giving the driver information about how the dolly and truck are angled. More complex methods include letting the driver set the desired angle, much like the VW system already in place for cars, or letting the driver set a point where the truck should be parked, which is then executed automatically.

For all levels of driver aids, it is important that the work can be reused as more complex methods are implemented. Therefore the solution should be ready for a fully autonomous solution.

1.2 System overview

An overview of a system for autonomous vehicles can be seen in Figure 1.1. The first part is high level planning. This can for instance be a system which decides that a certain material shall be transported from a source point to a certain destination. The system will then assign a truck for the task, and it is then up to the truck to solve the assignment.

The truck will then need some kind of motion planning which decides where to drive. This will create a reference for the controller, which then has the task of following the desired path.

The controller will need to handle different cases, for instance driving forward at both low and high speed or reversing with and without a trailer. For this, different sensors will be needed to give measurements of, for instance, position, heading and speed.

For autonomous trucks, one of the tasks left to solve is reversing with a trailer. An algorithm to solve this will be needed in the future, otherwise trucks will not be able to reverse. For trucks being sold today, this algorithm could also be used as some sort of driver aid, similar to the algorithms on the market today for


reversing a trailer with a car.

1.3 Problem formulation

The task for this thesis is to create an algorithm to allow a truck to reverse along a path with a dolly and trailer hitched. For this, a controller is needed to steer and throttle, but the relative angles between truck-dolly and dolly-trailer must be known to be able to control the system. The task at hand can then be split into two main parts: state estimation and controller.

To estimate the angles between truck, dolly and trailer, the most direct approach is to put a sensor which directly measures the angle on the joints. However, sensors on the dolly or trailer itself are not favoured, as this increases the cost of separate trailers. Therefore another solution is required. This thesis will investigate how this problem can be solved using a laser scanner, also known as LiDAR.

For the controller, the controller proposed in [9] will be implemented and tested.

The main tasks considered in the thesis are

• Create a data classification method and filter to estimate the necessary states for control of the vehicle.

• Collect data to evaluate state estimations.

• Implement and test a controller.

1.4 Previous work

Previous research has been conducted in this area. In [13] a system with one trailer is modelled. There have also been several papers deriving a dynamic model of a truck and trailer system with an arbitrary number of trailers, [1] [2] [4].

There has also been research where the system is modelled and the last trailer is seen as a virtual tractor which is steered. This has been implemented on a medium scale tractor, showing promising results in path tracking, [11].

Most papers use controllers tracking a specific point, most commonly the rear axle of the rear-most trailer. In [4] a different approach is used. In that paper the control error instead consists of the distance from all bodies to the reference.

Several papers also show that the system can be stabilised using linear control,[3] [11] [9].

In [5] a 2-trailer system is linearised. It is then shown to be stable in backwards motion using LQ-control on a platform with off-axle hitching consisting of a small radio controlled truck with trailer. In that work path following is not investigated.


No research has been found applying these models or controllers on a full scale truck and trailer system.

No previous work on the specific task of using LiDAR for angle estimation of a truck and trailer system has been found. In all previously mentioned research the angles are considered to be known, and the main contribution of this work is therefore to give a robust and versatile method to estimate the trailer angles needed for control of the reversing system.

1.5 Thesis outline

The thesis will be outlined as

• Chapter 2 describes the experimental platform used in the thesis.

• Chapter 3 presents the mathematical description of the system behaviour. The measurement equations used to estimate the relative angles between truck-dolly and dolly-trailer are derived.

• Chapter 4 presents the data at hand and how this data is treated to create measurements.

• Chapter 5 presents the filter used and how this estimates the states.

• Chapter 6 contains a short description of the controller that is tested and presents the results of this thesis for the three main parts: data classification, state estimation and controller.

• Chapter 7 contains the conclusions of this thesis along with suggestions for future work.


2 Platform

For data collection and evaluation of the algorithms, a full sized truck and trailer system is used. The truck at hand is a Scania G340. As the trailer, a semi-trailer with 2 axles at the front and 3 axles at the rear is used. The system is loaded with concrete to avoid slip and has a total weight of approximately 50 metric tons.

2.1 Truck and trailer

The truck used is called "Socius" and is a part of the test fleet used by Scania to develop autonomous trucks for mining applications. All mechanical parts are original, but in terms of sensors and computational power the truck has been modified.

The relevant modifications for this thesis are the addition of one LiDAR facing backwards along with two industrial computers installed in the glove box. Both the radars and the LiDAR publish messages on the DDS communication protocol.

One of the computers installed is used to run the algorithms developed in this thesis. This computer has been pre-programmed for running models of this kind and updates all modules with a frequency of 100 Hz.

To control the steering and throttle input, a system has been developed and put in place by Scania. The throttle input is sent as a speed or acceleration request directly to the engine via CAN, whilst the steering input is sent to an electric motor in the steering wheel. These can be controlled by publishing a message on DDS including the desired speed of the truck along with the desired curvature.

At all times the driver sitting in the driving seat can take control by grabbing the steering wheel or by touching the pedals. There is also an emergency stop button


in the cab.

Table 2.1: Lengths for the truck and trailer system

L1   4.66 m
M1   0.81 m
L2   3.75 m
L3   7.59 m

The effective lengths for the system can be seen in Table 2.1.

2.2 Steering

The steering in place is the original steering, which has an electric motor in the steering wheel which normally helps the driver so that the wheel turns easily. The system in place can receive a message containing the desired curvature, which is translated to a rotation of the steering wheel. This position is sent to the electric motor, which in this case turns the steering wheel to the desired position all by itself.

The dynamics of this system have not been modelled in detail. It is assumed that the total dynamics of the system is a transport delay combined with a first order system. Experiments conducted by Scania show that these sum up to about 0.5 seconds from signal to yaw-rate.

In this thesis the steering will be considered perfect when modelling the system.
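Although the steering is treated as perfect in the model, the assumed actuator behaviour can be illustrated with a small sketch. The split of the roughly 0.5 s into a 0.25 s transport delay and a 0.25 s time constant below is only an assumption made for illustration, and the class and parameter names are ours.

```python
from collections import deque

DT = 0.01  # sample time [s], matching the 100 Hz platform rate

class SteeringLag:
    """Illustrative steering lag: transport delay followed by a first-order system."""

    def __init__(self, dt=DT, delay=0.25, tau=0.25):
        self.dt = dt
        self.tau = tau
        self.buffer = deque([0.0] * max(1, round(delay / dt)))  # delay line
        self.alpha = 0.0                                         # actual steering angle [rad]

    def step(self, alpha_request):
        """Advance one sample: delay the requested angle, then low-pass filter it."""
        self.buffer.append(alpha_request)
        delayed = self.buffer.popleft()
        self.alpha += self.dt / self.tau * (delayed - self.alpha)
        return self.alpha
```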

2.3 Sensors

For this thesis, the sensor limitations are what is present on Scania's test vehicle Socius. The relevant sensors are one LiDAR facing backwards with adjustable height angle, one RTK-GPS, speed from a tachometer and the built-in measurement of the steering wheel, which is used to calculate the steering angle.

2.3.1 LIDAR

The LiDAR sensor used has 581 beams in each sweep with a field of view of 2.6 radians. The LiDAR does 3 sweeps per sample, and each time the laser moves from side to side it does so with a slight upward arc. This creates 3 virtual beams, which can be approximated to be at different vertical angles.

The distance and angle of each point measured are not exact; the distance is accurate down to a few centimetres and the angle error has not been modelled.

Each measurement has been pre-processed, where the pulse width of the return beam has been used to determine whether or not the hit is valid.


The LiDAR is mounted in the rear right corner, 94 centimetres above the ground. The LiDAR was then tilted upwards to make sure that the beams would hit the trailer.

2.3.2 RTK-GPS

The test vehicle is fitted with an RTK-GPS to give position and heading in a global coordinate system. The system tracks the phase of the carrier wave, enabling much more accurate positioning compared to normal GPS. The system used in the test vehicle Socius has an accuracy of about 5 cm. For the purpose of this thesis, these errors will not be considered as they will have little to no effect on the results.


3 Modeling

For both the filter and the controller, a mathematical description of the system is needed. This should describe how the system behaves and how the states change when different input is fed to the system.

This chapter will describe the dynamic equations for the states of this system. The measurement equations used to estimate these states will then be derived.

3.1 Overview

The truck and trailer system can be viewed as three bodies, as in Figure 3.1. The truck is oriented in a global coordinate system with position (x1, y1) and direction θ1; similarly, the trailer has position (x3, y3) and direction θ3.

The system has a set of lengths, marked in Figure 3.1. These are L1, which is the wheelbase of the truck, M1, which is the length of the off-hitch, L2, which is the length of the dolly, and L3, which is the wheelbase of the trailer.

A local coordinate system is fixed in the centre of the rear axle of the truck, with the positive x-direction straight forward, the y-direction to the left and the z-direction up, creating a right-handed system. The angle between the truck and dolly, β2, is then defined positive counter-clockwise.

Similarly, the relative angle between the dolly and trailer, β3, is defined positive counter-clockwise in a coordinate system fixed in the dolly.

To model this system, some approximations are made. In the model, only the effective wheelbase is relevant. As the truck, dolly and trailer all have multiple rear axles, these are approximated as one axle in the centre. This will hold when


Figure 3.1: Schematic figure of the truck and trailer system, showing the lengths L1, M1, L2 and L3, the steering angle α, the joint angles β2 and β3 and the trailer position (x3, y3). All angles are defined positive counter-clockwise.

there is little to no slip. Since the speed of all manoeuvres in this thesis will be low, this assumption will hold. Therefore all lengths (L1, L2, L3) described in this thesis are the effective lengths.

This system, with the described approximations, then consists of four axles, two rigid free joints, a kingpin hitching and an actuated front steering. It is then considered a general 2-trailer system. The lengths for this system are displayed in Table 2.1.

3.2 Dynamic equations

For this system, the inputs are the speed of the truck (denoted v) and the steering angle (denoted α). The states are chosen to be the position of the trailer (x3, y3), the angle of the trailer θ3, the angle between truck and dolly β2 and the angle between dolly and trailer β3. A dynamic model for a general truck and trailer system with these states has been derived in [1] [11] [2]. In these papers, the dynamic model is derived based on the holonomic and nonholonomic Pfaffian constraints present. These arise from the constrained motion of the system due to the assumption of no slip between wheel and ground.

The resulting nonlinear dynamic model was derived in [1] and is presented in Equation 3.1.


\dot{x}_3 = v \cos\beta_3 \cos\beta_2 \left(1 + \frac{M_1}{L_1}\tan\beta_2\tan\alpha\right)\cos\theta_3 \qquad (3.1a)

\dot{y}_3 = v \cos\beta_3 \cos\beta_2 \left(1 + \frac{M_1}{L_1}\tan\beta_2\tan\alpha\right)\sin\theta_3 \qquad (3.1b)

\dot{\theta}_3 = \frac{v \sin\beta_3 \cos\beta_2}{L_3}\left(1 + \frac{M_1}{L_1}\tan\beta_2\tan\alpha\right) \qquad (3.1c)

\dot{\beta}_3 = v \cos\beta_2\left(\frac{1}{L_2}\left(\tan\beta_2 - \frac{M_1}{L_1}\tan\alpha\right) - \frac{\sin\beta_3}{L_3}\left(1 + \frac{M_1}{L_1}\tan\beta_2\tan\alpha\right)\right) \qquad (3.1d)

\dot{\beta}_2 = v\left(\frac{\tan\alpha}{L_1} - \frac{\sin\beta_2}{L_2} + \frac{M_1}{L_1 L_2}\cos\beta_2\tan\alpha\right) \qquad (3.1e)
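To make the model concrete, the sketch below integrates Equation 3.1 with a simple forward-Euler step, using the effective lengths of Table 2.1. The function names and the integration scheme are our own choices and not taken from the thesis implementation.

```python
import math

# Effective lengths from Table 2.1 [m]
L1, M1, L2, L3 = 4.66, 0.81, 3.75, 7.59

def g2t_dynamics(state, v, alpha):
    """Right-hand side of Equation 3.1; state = (x3, y3, theta3, beta3, beta2)."""
    x3, y3, theta3, beta3, beta2 = state
    c = 1.0 + (M1 / L1) * math.tan(beta2) * math.tan(alpha)   # common factor
    dx3 = v * math.cos(beta3) * math.cos(beta2) * c * math.cos(theta3)
    dy3 = v * math.cos(beta3) * math.cos(beta2) * c * math.sin(theta3)
    dtheta3 = v * math.sin(beta3) * math.cos(beta2) / L3 * c
    dbeta3 = v * math.cos(beta2) * (
        (math.tan(beta2) - (M1 / L1) * math.tan(alpha)) / L2
        - math.sin(beta3) / L3 * c)
    dbeta2 = v * (math.tan(alpha) / L1 - math.sin(beta2) / L2
                  + M1 / (L1 * L2) * math.cos(beta2) * math.tan(alpha))
    return (dx3, dy3, dtheta3, dbeta3, dbeta2)

def simulate(state, v, alpha, t_end, dt=0.01):
    """Forward-Euler integration, e.g. for the constant v, constant alpha cases in Chapter 6."""
    for _ in range(int(t_end / dt)):
        derivs = g2t_dynamics(state, v, alpha)
        state = tuple(s + dt * ds for s, ds in zip(state, derivs))
    return state
```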

When implementing the controller later on, a linear model will be needed. This linearisation is carried out in [10]. The linearisation is done around the equilibrium point denoted pe = [β3,e, β2,e] and αe. In this equilibrium point, the angles β3,e, β2,e and αe are the required angles for the system to follow the trajectory, see [10] for details.

The linearised model is presented in Equation 3.2.

\dot{p} = v\left(A(p - p_e) + B(\alpha - \alpha_e)\right) \qquad (3.2)

where

A = v_3\begin{pmatrix} 0 & 1 & 0 & 0 \\ 0 & 0 & \frac{1}{L_3} & 0 \\ 0 & 0 & -\frac{1}{L_3} & \frac{1}{L_2} \\ 0 & 0 & 0 & -\frac{1}{L_2} \end{pmatrix}, \qquad B = v_3\begin{pmatrix} 0 \\ 0 \\ -\frac{M_1}{L_1 L_2} \\ \frac{L_2 + M_1}{L_1 L_2} \end{pmatrix} \qquad (3.3)

The characteristic polynomial for the matrix A is then

\det(\lambda I - A) = \lambda^2\left(\lambda + \frac{v_3}{L_3}\right)\left(\lambda + \frac{v_3}{L_2}\right) \qquad (3.4)

which has poles placed in

\lambda_{1,2} = 0 \qquad (3.5a)

\lambda_3 = -\frac{v_3}{L_3} \qquad (3.5b)

\lambda_4 = -\frac{v_3}{L_2} \qquad (3.5c)


Figure 3.2: Schematic figure of the truck and trailer system, showing L2, lk, lc, l1, φ, β2 and β3. The truck is facing downwards; the red line is the front of the trailer, which is measured by the LiDAR/radar.

The system will thus be unstable in backwards motion due to the two poles in the right half-plane. The stability will also be affected by the speed, as the poles will move to the right as the reversing speed increases.
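As a small numerical check of the pole locations, the eigenvalues of the matrix A reconstructed above (including its v3 factor) can be computed directly; the reversing speed used here is only an illustrative value.

```python
import numpy as np

L2, L3 = 3.75, 7.59   # effective lengths from Table 2.1 [m]
v3 = -0.7             # trailer velocity [m/s]; negative in reverse (illustrative value)

A = v3 * np.array([[0.0, 1.0,       0.0,       0.0],
                   [0.0, 0.0,  1.0 / L3,       0.0],
                   [0.0, 0.0, -1.0 / L3,  1.0 / L2],
                   [0.0, 0.0,       0.0, -1.0 / L2]])

# Expect two eigenvalues at 0 and two at -v3/L3 and -v3/L2, which are positive in reverse.
print(np.linalg.eigvals(A))
```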

3.3 Measurement equations

The measurements given by the LiDAR will be a cloud of points on the trailer's short side. The idea is to use this data to create estimates of the angle between the truck and trailer (denoted φ) and the distance from the centre of the front of the trailer to the hitching point (denoted lc), see Figure 3.2. How these estimates are created is presented in Chapter 4.

Given that φ and lc are being measured, these virtual measurements can be used as input to construct measurement equations that later will be used in the filtering step. Since the angle φ is defined as the angle difference between the truck and trailer, it follows that

φ = β2 + β3 (3.6)

Since the length of the dolly (L2) is known, a triangle giving β2 can be constructed so that


\beta_2 = \arcsin\left(\frac{l_c + l_1}{L_2}\right) \qquad (3.7)

where l1 is the length added due to the fact that the hitching point for the trailer on the dolly (called the kingpin) and the centre of the trailer front are separated by the length lk. Since the coordinate system is fixed in the body of the truck, lk will have angle φ relative to the coordinate system, giving the relation

l1 = lk sin(φ) (3.8)

Combining equations 3.6, 3.7 and 3.8 will give the measurement equation

lc = L2 sin(β2) − lk sin(β2 + β3) (3.9)

The other measurements given by the system are the position and heading of the truck relative to a global coordinate system. These measurements are given by the RTK-GPS and are denoted [x1, y1, θ1, UTM]. For this thesis, the UTM zone has been neglected. This is only a limitation that needs to be considered if the truck during a manoeuvre passes a border into a new UTM zone, which will not be the case for this thesis.

These measurements are then used in combination with the known lengths of the system and the relative angles to calculate the position of the rear axle of the trailer. Since the angle difference between the truck and trailer is defined as:

θ1 − θ3 = β2 + β3 (3.10)

the measurement equation for θ1 is:

θ1 = θ3 + β2 + β3 (3.11)

The resulting measurement equations h(x) are summarised in Equation 3.12.

x1 = x3 + M1 cos θ1 + L2 cos(θ1 + β2) + L3 cos θ3 (3.12a)

y1 = y3 + M1 sin θ1 + L2 sin(θ1 + β2) + L3 sin θ3 (3.12b)

θ1 = θ3 + β2 + β3 (3.12c)

lc = L2 sin β2 − lk sin(β2 + β3) (3.12d)

φ = β2 + β3 (3.12e)
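A direct transcription of Equation 3.12 into code is shown below. The lengths are those of Table 2.1; the kingpin offset lk is not listed there, so the value used is only a placeholder, and the function name is ours.

```python
import math

# Lengths from Table 2.1 [m]; lk (kingpin to trailer front) is not listed there,
# so the value below is a placeholder used only for illustration.
L2, L3, M1 = 3.75, 7.59, 0.81
lk = 1.0

def h(state):
    """Measurement model of Equation 3.12; state = (x3, y3, theta3, beta3, beta2)."""
    x3, y3, theta3, beta3, beta2 = state
    theta1 = theta3 + beta2 + beta3                                                    # (3.12c)
    x1 = x3 + M1 * math.cos(theta1) + L2 * math.cos(theta1 + beta2) + L3 * math.cos(theta3)  # (3.12a)
    y1 = y3 + M1 * math.sin(theta1) + L2 * math.sin(theta1 + beta2) + L3 * math.sin(theta3)  # (3.12b)
    lc = L2 * math.sin(beta2) - lk * math.sin(beta2 + beta3)                           # (3.12d)
    phi = beta2 + beta3                                                                # (3.12e)
    return (x1, y1, theta1, lc, phi)
```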


4 Data Classification

To be able to create the virtual measurements discussed in Section 3.3, some kind of data classification is needed. This classification method has to create the virtual measurements and be robust against outliers and other disturbances.

In this chapter the data is first presented, along with which disturbances and anomalies can be expected. The algorithm for creating the virtual measurements is then presented along with the motivation for the tuning of the design parameters.

4.1 Data

The data from the LiDAR is sent from the sensor to a computer with software from Volkswagen. This computer treats the data and sends some of the information out on the network. This treated and filtered data is then accessible for all applications running. The data fields relevant for this thesis are called 'distance' and 'valid'. The distance is the measured distance to the object and 'valid' shows whether or not the beam actually hit an object.

The first step is to convert all valid data points from spherical coordinates to Cartesian coordinates. Since only part of the information from the LiDAR is accessible, some approximations have to be made. The transformation is therefore carried out by checking which position, beam number and distance the measurement has. Each beam has a pre-programmed vertical angle, starting with beam 1 at zero degrees and adding 0.05 rad for every beam. This is an approximation; the actual angle is given by a more complex formula using information about where on the mirror in the sensor the return beam hits. For the horizontal angle it is pre-defined that the LiDAR measures between -1.26 and 1.26 rad. It is then assumed that all 581 measurements are evenly spaced in this interval, giving the


Figure 4.1: One data sample from the LiDAR after first filtration, removing other objects and containing hits on the trailer. (Axes: position [m].)

horizontal angle to be calculated as

\theta = 1.26 - \frac{2.52}{581}\, i \qquad (4.1)

where i ∈ [0, 580] is the position in the array.
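The conversion described above could look roughly as follows. The handling of the three virtual beams (one constant elevation step of 0.05 rad per sweep) is our reading of the description, and the function name is illustrative rather than taken from the software on the platform.

```python
import math

N_BEAMS = 581      # beams per sweep
FOV = 2.52         # horizontal span used in Equation 4.1 [rad]
VERT_STEP = 0.05   # approximated vertical angle added per virtual beam [rad]

def to_cartesian(distance, valid, beam_index, sweep_index):
    """Convert one LiDAR return to Cartesian coordinates in the sensor frame.
    beam_index i in [0, 580] gives the horizontal angle of Equation 4.1 and
    sweep_index in {0, 1, 2} selects one of the three virtual beams."""
    if not valid:
        return None
    theta = 1.26 - FOV / N_BEAMS * beam_index   # horizontal angle, Equation 4.1
    elev = VERT_STEP * sweep_index              # approximated constant vertical angle
    x = distance * math.cos(elev) * math.cos(theta)
    y = distance * math.cos(elev) * math.sin(theta)
    z = distance * math.sin(elev)
    return (x, y, z)
```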

One typical sample from the LiDAR can be seen in Figure 4.1. From this it is clear that most hits from the LiDAR, all in this case, are on the trailer. However, certain situations can give hits on other objects, as can be seen in Figure 4.2. In that case another test vehicle was on the test track and the LiDAR detected it and its trailer. Thus some kind of first filtration is needed, along with an algorithm for creating the virtual measurements.

As can be seen in Figure 4.1, there is a spread in the point cloud along the front of the trailer. The distance measurement will have a small amount of noise, as will the angle measurement. However, the biggest spread is due to the approximation that every beam has a constant vertical angle. In reality, the beams move with a slight arc upwards as they move from side to side. This creates small arcs in the measurements. As these are stacked on top of one another, they are hard to detect when all measurements are plotted.

As the shape of the trailer is known, it is known that the front is a straight line. However, if Figure 4.1 is studied more closely, it is clear that two shapes that do not fit the straight line are present. These are highlighted in Figure 4.3. They are the hoses and cables that connect the trailer to the truck, containing compressed air and power for the trailer's brakes. These will thus always be present in the data, and might move around as the trailer swings. Therefore the algorithm will need to handle these anomalies.


Figure 4.2: Sample from the LiDAR during tests. Blue are hits from the LiDAR, red is the position of the sensor. The truck is facing to the right. The object detected at the bottom is another test vehicle on the test track.

4.2 First filtration

As the area where the front of the trailer can be is very limited due to the fixed length of the dolly, which is considered known, a box in which the front of the trailer has to be can be created. After this a safety margin has to be added, to cover the cases where the LiDAR also hits the long side of the trailer. A first filtration of the data is then to exclude all points that lie outside of this box. For this thesis the box was decided to be (-1, -5) in x-position and (-4, 4) in y-position. The resulting image after this filtration can be seen in Figure 4.1.
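As a sketch, this first filtration is a simple box test on the converted (x, y) points in the truck-fixed coordinate system; the function name is ours.

```python
def first_filtration(points):
    """Keep only points inside the box from Section 4.2: x in (-5, -1), y in (-4, 4)."""
    return [(x, y) for (x, y) in points if -5.0 <= x <= -1.0 and -4.0 <= y <= 4.0]
```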

The biggest limitation of this filtration is that some object other than the trailer can still be present after the filtration. This is however unlikely and is assumed not to happen in this thesis. This is because the LiDAR has an upward angle, so for another object to be detected it has to be both very tall and very close. Even though unlikely, it is still possible that another object will be present, for instance if the truck is reversing close to a building. The user has to take this into consideration and bear in mind that the algorithm is untested in these cases.

4.3 RANSAC

The next step is then to estimate the angle of the front of the trailer. As previously mentioned, it is known that the measurement can be approximated as a straight line. The task is then to use the points in the point cloud and estimate a straight line fitting these points.

One approach for this is to randomly select two points and draw a straight linethrough them. All other points that then lie within a certain threshold from thisline are considered inliers. This is then repeated a set number of iterations, and


Figure 4.3: Typical data sample from the LiDAR sensor. Highlighted are persistent objects that differ from the straight line of the trailer front.

the line with the most inliers is considered the best estimate. This algorithm was first presented in [6] and is called RANSAC.

A short summary of the RANSAC algorithm is found in Algorithm 1 [6].

Algorithm 1 RANSAC algorithm

 1: procedure RANSAC
 2:   for i = 1 to numIterations do
 3:     p1 = rand()
 4:     p2 = rand()
 5:     l = straight line between points (p1, p2)
 6:     for j = 1 to numberOfPoints do
 7:       distance(j) = |l − pj|
 8:     end for
 9:     numberOfInliers = sum(distance < threshold)
10:     if numberOfInliers > bestNumberOfInliers then
11:       lineEstimate = l
12:       bestNumberOfInliers = numberOfInliers
13:     end if
14:   end for
15: end procedure
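A minimal Python sketch of Algorithm 1 is given below, with the iteration count and threshold set to the values motivated later in this section; the representation of a line as a point and a unit direction is our own choice.

```python
import math
import random

def ransac_line(points, num_iterations=150, threshold=0.05):
    """Fit a line to 2-D points with RANSAC (Algorithm 1). Returns the best line as a
    (point, unit_direction) pair together with its inliers."""
    best_inliers, best_line = [], None
    for _ in range(num_iterations):
        p1, p2 = random.sample(points, 2)
        dx, dy = p2[0] - p1[0], p2[1] - p1[1]
        norm = math.hypot(dx, dy)
        if norm == 0.0:
            continue
        # Unit normal of the candidate line; |(p - p1) . n| is the point-to-line distance.
        nx, ny = -dy / norm, dx / norm
        inliers = [p for p in points
                   if abs((p[0] - p1[0]) * nx + (p[1] - p1[1]) * ny) < threshold]
        if len(inliers) > len(best_inliers):
            best_inliers, best_line = inliers, (p1, (dx / norm, dy / norm))
    return best_line, best_inliers
```

The multi-line variant discussed at the end of this chapter simply removes the returned inliers and calls the same routine again on the remaining points.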

The design parameters when running RANSAC are the threshold and the number of iterations.

The number of iterations was tested against the data collected during the first test runs. Figure 4.4 shows the variance of RANSAC on one of the data collections where the truck is driven straight forward at low speed. The idea is that as the number of iterations grows, the variance should decrease towards the true


Figure 4.4: Variance of φ and lc for different numbers of iterations. Results from a data sample where the truck was driven straight forward at low speed.

variance. The result is that above 100 iterations, the gain of running more iterations is small. Since RANSAC has complexity O(n), the time will grow linearly with an increasing number of iterations.

The final value for the number of iterations was set to 150, even though the performance does not seem to improve above 100 iterations. This safety margin was decided upon as RANSAC is a non-deterministic algorithm, giving some protection from "unlucky" randomly drawn points.

For the threshold, several samples from the LiDAR were investigated. In Figure 4.5 a comparison of different thresholds is presented. For this application, the data shows that what the sensor is seeing is very close to a straight line. This allows a small threshold to be set. However, if the threshold was set below 0.05 meters, some points that are hits on the trailer and should be inliers were detected as outliers. Therefore the threshold was set to 0.05 meters.

The middle figure in Figure 4.5 shows RANSAC on one sample with threshold 0.05 meters. From this it is visible that the hits that are most likely on the trailer are also considered inliers. Worth noting is also that the persistent objects (see Figure 4.3) discussed in the beginning of this chapter are considered outliers.

The result of running RANSAC on a few typical data samples can be seen in Figure 4.6. This figure shows that when the trailer's angle relative to the truck, φ, is large enough, a larger number of points are on the long side of the trailer, which is then the side RANSAC will estimate. This also has the effect that the centre of the short side has to be estimated by knowing the width of the trailer and finding the corner. The possibility here is then to estimate both the long and short side of the trailer and then look for the intersection of these lines.

The algorithm is then extended to estimate more than one line. The idea is to run RANSAC, remove the points that were within the threshold and then run


Figure 4.5: RANSAC with the threshold set to 0.1, 0.05 and 0.025 meters. Inliers are marked blue and outliers red.

Figure 4.6: RANSAC on three typical samples of data from the LiDAR.


Figure 4.7: Illustration of the RANSAC algorithm. First RANSAC estimates one line and the points considered inliers are removed. The remaining points (middle figure) are then used to estimate the second line (rightmost figure).

RANSAC again on the remaining points. The behaviour is illustrated in Figure 4.7. The first image illustrates all points with the RANSAC estimate. The second figure shows all the outliers, and the third figure shows the second RANSAC estimate. After this the algorithm can be repeated.

Experiments on the data set collected show that running RANSAC with 3 lines gives good corner detection. This is because on some samples, the points from the hoses and wires on the front give a larger number of hits than the long side of the trailer, even though that side can be estimated. This is solved by running RANSAC three times. Running the algorithm a fourth time does not give an enhanced result, as the number of points left after the third iteration is in the general case very low.

When estimating the long side of the trailer in cases where the number of hits is low, generally under 20, the estimate can be sensitive to noise. This can give an estimate where the lines intersect, but with a relative angle far from 90 degrees. Since the trailer's corner should be 90 degrees, these samples can be excluded. After experiments, it was decided that if the angle difference from 90 degrees is greater than 0.05 rad (2.9 degrees), the first estimate is used and the estimate of the long side is considered false and not used.
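A sketch of the corner extraction described above is given below; it is hypothetical in its details. It intersects two line estimates and discards the second one if the lines deviate more than 0.05 rad from a right angle.

```python
import math

def corner_from_lines(line_a, line_b, max_dev_rad=0.05):
    """Intersect two RANSAC line estimates (front and long side of the trailer).
    Each line is a (point, unit_direction) pair. Returns the corner point, or None if
    the angle between the lines deviates more than max_dev_rad from 90 degrees."""
    (px, py), (dx, dy) = line_a
    (qx, qy), (ex, ey) = line_b
    # Angle between the two direction vectors, folded into [0, pi/2].
    ang = abs(math.atan2(dx * ey - dy * ex, dx * ex + dy * ey))
    ang = min(ang, math.pi - ang)
    if abs(ang - math.pi / 2) > max_dev_rad:
        return None
    # Solve p + t*d = q + s*e for the intersection point (Cramer's rule).
    det = dx * (-ey) - (-ex) * dy
    if abs(det) < 1e-9:
        return None
    t = ((qx - px) * (-ey) - (-ex) * (qy - py)) / det
    return (px + t * dx, py + t * dy)
```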


5 State estimation

The basic outline for estimating the states x = [x3, y3, θ3, β3, β2]T is to use the measurements z = [x1, y1, θ1, lc, φ]T along with the dynamic equations, which have α and v as inputs. The data classification will be used to create lc and φ, which will then be considered measurements by the filter.

5.1 Extended Kalman filter

The filter used is an extended Kalman filter (EKF); the dynamic model f(x, u) and measurement equations h(x) were presented in Chapter 3.

The Kalman filter can be used to estimate the states in a linear state space model. If, however, the system cannot be described by a linear model, an extended Kalman filter can be used. In this section, the Kalman filter for the general case is presented along with a short derivation of the extended Kalman filter.

The states in a linear state space model

x_{k+1} = F_k x_k + G_{u,k} u_k + G_{v,k} v_k \qquad (5.1a)

y_k = H_k x_k + D_k u_k + e_k \qquad (5.1b)

\mathrm{Cov}(v_k) = Q_k \qquad (5.1c)

\mathrm{Cov}(e_k) = R_k \qquad (5.1d)

are estimated using a Kalman filter (KF) [8] [7]. This uses a combination of a dynamic model, which describes how the states change with input uk, and measurements from sensors yk. The matrices Rk and Qk are design parameters that


decide how the observer should weigh the dynamic model versus the measurements. The time update for the Kalman filter is

\hat{x}_{k+1|k} = F_k \hat{x}_{k|k} + G_{u,k} u_k \qquad (5.2a)

P_{k+1|k} = F_k P_{k|k} F_k^T + G_{v,k} Q_k G_{v,k}^T \qquad (5.2b)

and the measurement update

\hat{x}_{k|k} = \hat{x}_{k|k-1} + K_k \varepsilon_k \qquad (5.3a)

P_{k|k} = P_{k|k-1} - K_k S_k K_k^T \qquad (5.3b)

where

S_k = H_k P_{k|k-1} H_k^T + R_k \qquad (5.4a)

K_k = P_{k|k-1} H_k^T (H_k P_{k|k-1} H_k^T + R_k)^{-1} \qquad (5.4b)

\varepsilon_k = y_k - H_k \hat{x}_{k|k-1} - D_k u_k \qquad (5.4c)

One can show that the Kalman filter gives the best linear unbiased estimate. If the distributions of vk, ek and x0 are Gaussian, it is also the minimum variance and maximum likelihood estimate, [8] [7].

In the nonlinear case, the state space model is

xk+1 = f (xk , uk , vk) (5.5a)

yk = h(xk , uk , ek) (5.5b)

which means, as mentioned above, that the Kalman filter cannot be used. However, if the Taylor expansion around \bar{x},

g(x) \approx g(\bar{x}) + g'(\bar{x})(x - \bar{x}) \qquad (5.6)

is used in every point, the Kalman filter can be used. Applying this to the Kalman filter, the extended Kalman filter (EKF) is obtained [12]. The time update is then

\hat{x}_{k+1|k} = f(\hat{x}_{k|k}) \qquad (5.7a)

P_{k+1|k} = Q_k + f'(\hat{x}_{k|k}) P_{k|k} (f'(\hat{x}_{k|k}))^T \qquad (5.7b)

and the measurement update


\hat{x}_{k|k} = \hat{x}_{k|k-1} + K_k \varepsilon_k \qquad (5.8a)

P_{k|k} = P_{k|k-1} - P_{k|k-1} (h'(\hat{x}_{k|k-1}))^T S_k^{-1}\, h'(\hat{x}_{k|k-1})\, P_{k|k-1} \qquad (5.8b)

where

S_k = R_k + h'(\hat{x}_{k|k-1}) P_{k|k-1} (h'(\hat{x}_{k|k-1}))^T \qquad (5.9a)

K_k = P_{k|k-1} (h'(\hat{x}_{k|k-1}))^T S_k^{-1} \qquad (5.9b)

\varepsilon_k = y_k - h(\hat{x}_{k|k-1}) \qquad (5.9c)
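Put together, one filter iteration following Equations 5.7-5.9 could be sketched as below; the function signature is ours, and the Jacobians are assumed to be supplied, for instance by the numerical differentiation described in Section 5.2.

```python
import numpy as np

def ekf_step(x, P, u, y, f, h, F_jac, H_jac, Q, R):
    """One EKF iteration: time update with the dynamic model f, then measurement
    update with the measurement model h. F_jac(x, u) and H_jac(x) return the
    Jacobians f'(x) and h'(x)."""
    # Time update (5.7)
    x_pred = f(x, u)
    F = F_jac(x, u)
    P_pred = Q + F @ P @ F.T
    # Measurement update (5.8)-(5.9)
    H = H_jac(x_pred)
    S = R + H @ P_pred @ H.T
    K = P_pred @ H.T @ np.linalg.inv(S)
    eps = y - h(x_pred)
    x_new = x_pred + K @ eps
    P_new = P_pred - K @ S @ K.T
    return x_new, P_new
```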

5.2 Implementation

The filter requires the derivatives of the dynamic equations and the measurement equations, which are computed numerically at every sample using a symmetric (central) difference. This is done with one step in each direction according to

f'(x) = \frac{f(x + h) - f(x - h)}{2h} \qquad (5.10)

where h is set to 0.01.
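A sketch of this central-difference Jacobian, perturbing one state at a time with the step h = 0.01 (the function name is ours):

```python
import numpy as np

def numerical_jacobian(func, x, h=0.01):
    """Central-difference Jacobian of func at x according to Equation 5.10."""
    x = np.asarray(x, dtype=float)
    f0 = np.asarray(func(x))
    J = np.zeros((f0.size, x.size))
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        J[:, i] = (np.asarray(func(x + e)) - np.asarray(func(x - e))) / (2.0 * h)
    return J
```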

The update frequency of the filter was set to 100 Hz. This is because the environment in place at Scania updates all models at a maximum of 100 Hz, which is also the frequency of the in-car communication protocol DDS.

Filter performance was tested for different values of the noise matrices Qk and Rk. The final tuning of these parameters was

Q_k = I \qquad (5.11a)

R_k = 10^2 \cdot I \qquad (5.11b)

The filter was implemented along with the data classification in MATLAB/Simulink. The code generation environment built by Scania was then used to generate C code for the algorithm.


6 Results

The data classification proposed in Chapter 4, the filter proposed in Chapter 5 and the controller proposed in [10] were tested both in simulation and on the real platform. This chapter will present and discuss the results.

6.1 Data classification

The data classification was first tested in simulation and then on recorded data. After this it was implemented and tested running live in the truck.

6.1.1 Simulation

To test performance, a vanilla case was set up. The dynamic model was used to simulate the behaviour at different steering inputs and velocities. After this a perfect LiDAR model was constructed, giving 581 perfect measurements evenly spaced between the angles -1.2 and 1.2 rad. This was to simulate the physical sensor, but with perfect measurements.

To achieve this, all vectors from the LiDAR were constructed in MATLAB along with the walls of the trailer for every time sample. The intersection was then calculated for every beam and every vector describing the trailer. The intersections which were valid, meaning a positive distance from the LiDAR, were saved as hits. These were then saved as measurements in the same way as from the real LiDAR and sent to the data classification algorithm.

The result is displayed in Figure 6.1. From this it is clear that both of the virtual measurements lc and φ follow ground truth. Since the LiDAR does not have an infinite number of beams, there will be a small bias in lc. This is because the data classification estimates the trailer's front to be between the two outermost hits,


Figure 6.1: Calculation of lc and φ with perfect LiDAR measurements. Simulation done with constant speed 1 m/s and constant steering angle 0.25 rad.

which will never be exactly on the two corners. However, as can be seen in Figure 6.1, this bias is very small.

It is also visible from Figure 6.1 that even with perfect measurements φ will have a small amount of noise as it approaches 0.35 rad. This is because the LiDAR is hitting one side of the trailer as well as the front. At specific angles it will then be possible to find a straight line which includes all hits on the front as well as one or more on the side. This will then be the optimal solution from RANSAC. Since this only happens at very specific angles, the algorithm will alternate between perfect estimation and slightly tilted estimation, creating what looks like noise in the measurement.

6.1.2 Experiments

The classification method proposed in Chapter 4 was tested on data collected from the platform described in Chapter 2. Several test runs were made on separate occasions collecting data.

During the tests, the truck was reversed in simpler manoeuvres, mostly as straight as possible. The truck was also driven forwards in more complex manoeuvres, including slalom and sharp turns. The idea behind the tests was to find all corner cases. These were determined to be when the angle between truck and dolly β2 and the angle between dolly and trailer β3 reached their limit and when these changed at different rates. The other states do not affect the estimation, since the measurement in this case is independent of the current state.

The data classification is, as described in Chapter 4, used to create the virtual


Figure 6.2: Simulated LiDAR sample after addition of noise. Black lines mark the trailer, blue the measurements from the LiDAR and red the position of the LiDAR sensor.

measurements lc and φ. In Figure 6.3, lc and φ are plotted for one of the tests. Here the truck is reversed manually, as straight as possible.

As there is no easy way of measuring these variables in another way, no ground truth is present for this data. However, it can be used to determine the amount of noise in the measurements.


Figure 6.3: Virtual measurements lc and φ created from one data collection. During the test a test driver reversed as straight as possible manually.

6.2 State estimation

The filter proposed in Chapter 5 was tested both on simulated and collected data. In all tests, the data classification was used to create the virtual measurements. After this the filter was implemented online in the truck, giving live estimation of the angles.

6.2.1 Simulation

For simulation of the state estimation, the same simulation as for the data classification was used. The filter was simulated both with measurements from the data classification and measurements of the truck position and heading, both with and without noise added.

The results from the state estimation can be seen in Figure 6.4. This shows that the filter estimates the angles correctly, which is the critical part of the estimation. This is because the rest of the states, (x3, y3) and θ3, are calculated from the measurements (x1, y1) and θ1 and the estimated angles β2 and β3.

To make the scenario more realistic, noise was then added to the angle and distance of each of the LiDAR measurements. The noise level was decided upon to create a realistic scenario. The noise power was set to -30 dB for the angle and distance and -50 dB for the steering angle and velocity. In Figure 6.2, one sample from the LiDAR is plotted. If this is compared to Figure 4.1, the levels of noise are about equal.


Figure 6.4: Estimation of β2 and β3 with perfect LiDAR measurements. Simulation done with constant speed 1 m/s and constant steering angle 0.25 rad.

6.2.2 Ground truth

To validate the filter, ground truth was also measured during data collection. This was done using wire sensors, which measure distance. These were mounted to measure the hypotenuse of a triangle, giving the possibility to calculate the angle.

However, out on the test track one of the wire sensors broke when it was mounted. This resulted in ground truth only being measurable for one of the angles. Since simulations show that the angle between the truck and trailer is the most affected by bias in the constants, it was measured during the test.

6.2.3 Experiments

For the state estimation, the filter presented in Chapter 5 was simulated by replaying the data. The results from one of the tests can be seen in Figure 6.5.

Figure 6.5 shows that the angle estimation works well and the errors are very small. It also shows that the dynamics of the system are followed well.


Figure 6.5: Estimated angles for one of the tests, along with ground truth for the dolly angle (β2).

6.3 Robustness evaluation

The trailer and dolly both have multiple rear axles, and if one pair of wheels starts to slip, the effective length will change. Since these lengths are considered constant by the filter, it is interesting to test how errors in these lengths affect the angle estimation. In Figures 6.6 and 6.7 the system has been simulated with errors in L2 and L3.

Figure 6.6 shows the angle estimation without noise when an error is present in the dolly length L2. Here it is visible that the error will create a bias in the estimation of β2 and β3. This is because Equations 3.12d and 3.12e depend on L2.

Figure 6.7 shows the angle estimation when an error in the trailer length L3 is introduced. As this figure shows, this will not create a bias in the angle estimation. This is because the measurement Equations 3.12d and 3.12e do not depend on L3.

This shows that for the angle estimation, the length L2 is the most critical. But it is also clear that a 15 % change in L2 only causes about 0.04 rad (2.2 degrees) of bias. Since 15 % of the dolly length in this case is about 56 cm, it is reasonable to assume that the bias will be small.


Figure 6.6: Simulation with constant velocity 1 m/s and constant steering angle 0.25 rad. Estimation of lc, φ, β2 and β3 with 15 % and 50 % errors in L2, no noise added.

Figure 6.7: Simulation with constant velocity 1 m/s and constant steering angle 0.25 rad. Estimation with errors in L3, no noise added. The panels show lc, β2 and β3 for 15 % and 50 % errors in L3, compared with ground truth.


6.4 Controller

For the controller tests, both the data classification and the filter were running live in the truck. This gave a continuous estimation of the angles, which was sent to the controller.

The controller used has been derived in [10]. The only modification is that the penalty matrix for the LQ-controller is

Q =
\begin{bmatrix}
0.375 & 0 & 0 & 0 \\
0 & 10 & 0 & 0 \\
0 & 0 & 8 & 0 \\
0 & 0 & 0 & 2
\end{bmatrix}
\quad (6.1)

This change was made to compensate for the size of the truck, since the allowed tracking error should be larger when a bigger truck is used.
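For illustration, a feedback gain can be computed from the penalty in (6.1) once a linearised error model is available. In the sketch below only Q is taken from the text; the system matrices A, B and the input penalty R are placeholder values, not the model used in the thesis.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Penalty matrix from (6.1).
Q = np.diag([0.375, 10.0, 8.0, 2.0])

# Placeholder linearised model and input penalty (illustrative only):
# a controllable chain of integrators with a single steering input.
A = np.eye(4, k=1)
B = np.zeros((4, 1))
B[3, 0] = 1.0
R = np.array([[1.0]])

# Solve the continuous-time algebraic Riccati equation and form the LQ gain.
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)
print(K)
```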

As a first experiment, a static reference in the form of a straight line was tested. The line was placed on the test track and the truck was manually positioned close to it. After this the controller was switched on, controlling the steering wheel. Throttle input was given by the test driver to get the system rolling; after that the engine was running on tick over, which gave a speed of approximately 0.7 m/s.

The result after reversing along the straight path can be seen in Figure 6.9. Here the truck was placed a short distance from the reference; the controller then had to turn the trailer inward and straighten it up as the reference came closer. It is visible that the reference was not followed exactly and that a bias of around 0.5 meters existed.

In Figure 6.8 the estimated angles during this test are shown. After the first initialisation of the filter, the angles converge close to zero. The figure clearly illustrates that the controller is able to stabilise the system.

The controller and state estimation were then combined with a motion planner. This created a scenario where the truck can reverse fully autonomously to a defined position. The motion planner had a graphical interface in the truck, where a goal point behind the truck was placed. After this, objects acting as obstacles were introduced, and the motion planner planned a route around them.

The resulting reference path and the path taken for one of these tests can be seen in Figure 6.10. Here it is once again visible that a bias exists and that the controller converges with an offset of around 0.5 meters.
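One way to quantify the offset seen in Figures 6.9 and 6.10 is to project the estimated rear-axle position onto the reference path and take the perpendicular distance. The sketch below is one reasonable way to compute such a cross-track error against a piecewise-linear reference; it is not necessarily how the error was evaluated in the thesis.

```python
import numpy as np

def cross_track_error(point, path):
    """Smallest distance from `point` (2,) to a piecewise-linear
    reference `path` given as an (N, 2) array of waypoints."""
    p = np.asarray(point, dtype=float)
    best = np.inf
    for a, b in zip(path[:-1], path[1:]):
        ab = b - a
        denom = float(ab @ ab)
        t = 0.0 if denom == 0.0 else float(np.clip((p - a) @ ab / denom, 0.0, 1.0))
        closest = a + t * ab
        best = min(best, float(np.linalg.norm(p - closest)))
    return best

# Hypothetical use: reference is a straight line, trailer sits 0.5 m off it.
path = np.array([[0.0, 0.0], [50.0, 0.0]])
print(cross_track_error([10.0, 0.5], path))   # -> 0.5
```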


Figure 6.8: Angles β2 and β3 during the test when reversing straight backwards. The panels show β2, β3 and the steering angle α, each plotted against its reference.

Figure 6.9: Tracking of the rear axle centre of the trailer. The truck started at the bottom right, reversing along the red reference.


Figure 6.10: Reference path and path taken in one of the tests with the motion planner; the start point is in the lower right corner. The motion planner sent the reference to the controller, starting from the truck's current position.


7 Conclusion

This chapter contains the conclusions of this thesis, along with a discussion of possible future work.

7.1 Conclusion

The data classification used in this thesis worked well. The virtual measurements had a small bias and variance, and the classification was robust. This was clear as data collection and test runs were made on four separate occasions with small variations between them. For instance, the angle of the LiDAR sensor was not constant, since the sensor was used for other tasks from time to time between tests.

There were some problems with the initialisation of the data classification. Since there is no way of knowing which side of the trailer is being estimated when the algorithm starts, it had to be assumed to be the front. This caused problems if the algorithm had to be restarted while the dolly and trailer were at a large enough angle. In hindsight, an alternative approach would have been to create an algorithm that tracks the positions of the two corners of the trailer, making it possible to tell which corner is being estimated.

This approach would also make it possible to use previous samples in a better way. As the classification is implemented in this thesis, all samples are treated as independent. This is of course not the case in reality, since the trailer can only move a short distance between samples.
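A simple way to realise the suggested corner tracking would be to associate corner detections between consecutive scans, accepting a match only if the corner has moved less than the trailer physically can in one sample period. The sketch below is only an illustration of that association step; the function name and the gate value are hypothetical.

```python
import numpy as np

def associate_corners(prev_corners, new_corners, max_step=0.3):
    """Match each previously tracked corner to the nearest new detection,
    accepting the match only if it moved less than `max_step` metres
    since the last scan. Returns a dict: previous index -> new index."""
    matches = {}
    for i, prev in enumerate(prev_corners):
        dists = [np.linalg.norm(np.asarray(prev) - np.asarray(c))
                 for c in new_corners]
        if dists:
            j = int(np.argmin(dists))
            if dists[j] < max_step:
                matches[i] = j
    return matches
```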

In terms of state estimation, the proposed algorithm worked well. There is a small problem in the initialisation which has yet to be located.

The controller was robust and managed to stabilise the system well. Remarkably, no tuning was done; the parameters were only scaled to compensate for the size of the truck. It was clear that the controller worked best at as low a speed as possible: running on tick over worked well and the ride was smooth. During one test the speed was increased when reversing along a straight line, and it was clear that the controller then had to give much greater steering input. Since this follows from the dynamics of the system, it is not clear whether this would improve with a better tuned controller, but it is something that could be investigated.

7.2 Future work

A more robust data classification could be implemented with some kind of tracking of the trailer corners.

Tuning of the controller could improve the results. Time was short in this thesis, but with proper tuning the performance could be even better.

More investigation into why there is a static error in position needs to be done. One possibility is that tuning of the controller could remove some of it, but more likely the controller needs to be extended with some kind of integral action. This would compensate for errors in the trailer length, which is likely to vary as wheels start to slip.
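One way to add such integral action is to augment the existing state-feedback steering command with an integrated lateral error and an anti-windup clamp. The sketch below only illustrates the structure; the gain and limit values are illustrative and the sign convention would have to match the actual linearised model.

```python
class LateralErrorIntegrator:
    """Adds integral action on the lateral tracking error on top of an
    existing state-feedback steering command (structure only; gains
    are illustrative, not tuned for the thesis vehicle)."""

    def __init__(self, ki=0.02, limit=0.1):
        self.ki = ki          # integral gain [rad per (m*s)]
        self.limit = limit    # anti-windup clamp on the extra steering [rad]
        self.integral = 0.0

    def step(self, lateral_error, dt):
        self.integral += lateral_error * dt
        correction = self.ki * self.integral
        # Clamp and back-calculate the integral to avoid wind-up.
        clamped = max(-self.limit, min(self.limit, correction))
        if correction != clamped and self.ki != 0.0:
            self.integral = clamped / self.ki
        return clamped

# Usage (hypothetical): steering = lq_feedback(x) + integrator.step(e_lat, dt)
```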


Bibliography

[1] Claudio Altafini. The general n-trailer problem: Conversion into chained form. 1998. Cited on pages 3 and 10.

[2] Claudio Altafini. Some properties of the general n-trailer. International Journal of Control, 74(4):409–424, 2001. Cited on pages 3 and 10.

[3] Claudio Altafini. Following a path of varying curvature as an output regulation problem. 2002. Cited on page 3.

[4] Claudio Altafini. Following a path of varying curvature as an output regulation problem. IEEE Transactions on Automatic Control, 47(9):1551–1556, 2002. Cited on page 3.

[5] Claudio Altafini, Alberto Speranzon, and Karl Henrik Johansson. Hybrid control of a truck and trailer vehicle. In Hybrid Systems: Computation and Control, pages 21–34. Springer, 2002. Cited on page 3.

[6] Martin A Fischler and Robert C Bolles. Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography. Communications of the ACM, 24(6):381–395, 1981. Cited on page 18.

[7] Rudolph E Kalman and Richard S Bucy. New results in linear filtering and prediction theory. Journal of Basic Engineering, 83(1):95–108, 1961. Cited on pages 23 and 24.

[8] Rudolph Emil Kalman. A new approach to linear filtering and prediction problems. Journal of Basic Engineering, 82(1):35–45, 1960. Cited on pages 23 and 24.

[9] Oskar Ljungqvist. Motion planning and stabilization for a reversing truck and trailer system. 2015. Cited on page 3.

[10] Oskar Ljungqvist, Daniel Axehill, and Anders Helmersson. Path following control for a reversing general 2-trailer system. arXiv preprint arXiv:1605.04393, 2016. Cited on pages 11, 27, and 34.


[11] Jesús Morales, Jorge L. Martínez, Anthony Mandow, and Alfonso J. García-Cerezo. Steering the last trailer as a virtual tractor for reversing vehicles with passive on- and off-axle hitches. IEEE Transactions on Industrial Electronics, 60(12):5729–5736, 2013. Cited on pages 3 and 10.

[12] Gerald L Smith, Stanley F Schmidt, and Leonard A McGee. Application of statistical filter theory to the optimal estimation of position and velocity on board a circumlunar vehicle. National Aeronautics and Space Administration, 1962. Cited on page 24.

[13] Moritz Werling, Philipp Reinisch, Michael Heidingsfeld, and Klaus Gresser. Reversing the general one-trailer system: Asymptotic curvature stabilization and path tracking. TUGBoat, 14(3):342–351, 2014. Cited on page 3.