

Technical Report
Autonomous Trucks

Version 1.0

Author: Olle Sjöblom
Date: December 14, 2014
Status: Reviewed 141214 (AR), Approved

Course name: Control Project           Course code: TSRT10
Project: Autonomous Trucks             Project group: Autotrucks
E-mail: [email protected]              Author's E-mail: [email protected]
Document responsible: Olle Sjöblom     Document name: TechicalReport.pdf


Project Identity

Group E-mail:        [email protected]
Homepage:            http://www.notyetexisting.se
Orderer:             Erik Frisk, Linköping University
                     Phone: +46 (0)13 - 28 5714, E-mail: [email protected]
Customer:            Per Sahlholm, Scania
                     Phone: +46 (0)8 553 891 29, E-mail: [email protected]
Course Responsible:  Daniel Axehill, Linköping University
                     Phone: +46 13 284042, E-mail: [email protected]
Project Manager:     Scania
Advisors:            Chih Feng Lee, Linköping University
                     Phone: +46 (0)13 - 28 4620, E-mail: [email protected]

Group Members

Bus Group
  Amy Rankka (AR)            Project Leader          0708305289   amyra281
  Adam Lundmark (AL)         Bus Responsible         0706189894   adalu838

Mapping Group
  Olle Sjöblom (OS)          Document Responsible    0722000847   ollsj047
  Mikael Bengtsson (MBe)     -                       0736293663   mikbe683

Vehicle Group
  Mathias Brischetto (MBr)   Test Responsible        0768021446   matbr612
  Peter Karlsson (PK)        -                       0706595260   petka454

Planning Group
  Johan Hogdahl (JH)         Design Responsible      0738498167   johho091
  Rasmus Mehler (RM)         -                       0738518292   rasme879
  Srdjan Jovanovic (SJ)      -                       0737320773   srdjo099


Document History

Version   Date     Changes made    Sign   Reviewer
0.1       141208   First draft.    All    All
1.0       141214   First version   All    AR


Contents

1 Introduction
  1.1 Aims and Goals
  1.2 Use
  1.3 Overview of This Document

2 System Overview

3 Communication
  3.1 Purpose
  3.2 Concept and Realisation
  3.3 Implementation
    3.3.1 Package Types
    3.3.2 Matlab Handler
    3.3.3 Matlab Sender
    3.3.4 C++ Handler
    3.3.5 C++ Sender

4 PreScan

5 Lanelets
  5.1 Basic Concept
  5.2 The Role of Lanelets in This Project
  5.3 Implementation

6 Mission Planner
  6.1 Purpose
  6.2 Communication with Other Subsystems
  6.3 Implementation
    6.3.1 Compatibility with PreScan
    6.3.2 Implementation for Computing the Route

7 Situational Awareness
  7.1 Purpose
  7.2 Communication with Other Subsystems
  7.3 Implementation

8 Local Planner
  8.1 Purpose
  8.2 Communication with Other Subsystems
  8.3 Implementation
    8.3.1 Communication
    8.3.2 Missions
    8.3.3 Obstacles
    8.3.4 Map
    8.3.5 Collision Handling
    8.3.6 Path-Finding
    8.3.7 Replanning
    8.3.8 Calculation of Reference Speed
    8.3.9 Calculation of Reference Heading

9 Vehicle Data Estimator
  9.1 Purpose
  9.2 Communication with Other Subsystems
  9.3 Implementation
    9.3.1 Center of Gravity Estimation
    9.3.2 Mass Estimation
    9.3.3 Extended Kalman Filter Implementation
    9.3.4 Limitation Calculations

10 Low-Level Controller
  10.1 Purpose
  10.2 Communication with Other Subsystems
  10.3 Implementation

11 Real Vehicle Representation
  11.1 Purpose
  11.2 Communication with Other Subsystems
  11.3 Implementation

A Matlab Coded Estimators

References


1 Introduction

This document is a technical report for an automatic control project on developing a simulation environment for autonomous trucks at, for example, an open pit mine. It consists of a system overview explaining the entire system and how its subsystems communicate with each other, followed by a deeper explanation of each subsystem.

1.1 Aims and Goals

This project is part of a bigger project called iQMatic, where the goal is to have trucks operating autonomously at an open pit mine. In this specific part of the project, software has been developed which interacts with a simulation tool called PreScan, where the working space is simulated. The system maps the world so that algorithms can be used to drive from one point to another and complete different types of missions, while avoiding obstacles. The aim of this document is to show what the system looks like and how it is implemented.

1.2 Use

The purpose of the simulation environment developed in this project is to simulate different scenarios at the gravel pit that a fleet of autonomous trucks could encounter. The simulations can then provide information about the possibilities and limitations of the missions given to the trucks.

1.3 Overview of This Document

This document will first provide a descriptive overview of the system, then describe the communication in the system, the PreScan environment [7] and the concept of lanelets [3], and finally cover the implementation of each subsystem. For each subsystem, its purpose, its implementation, what is required of the subsystem, and how data is transferred to and from the subsystem are described.


2 System Overview

The system can be divided into six subsystems: Mission Planner, Situational Awareness, Local Planner, Vehicle Data Estimator, Low-Level Controller and Real Vehicle Representation. The subsystems and their connections are displayed in Figure 1, along with a block representing the additional sensors used for navigation compared to a standard vehicle model. The mission planner shall be situated outside the vehicle, at a mission command central. The real vehicle representation is the part of the simulation environment that shall be replaced with the physical autonomous vehicle, i.e. it contains all the sensors an autonomous vehicle has access to. The rest of the subsystems are part of the vehicle guidance system, i.e. all logic situated on board the vehicle needed to drive the vehicle autonomously, and the sensors used for navigation.


Figure 1: Overview of the system layout.

The functionality and the implementation of the six subsystems will be presented in Sections 6 to 11. A brief description of their purposes is presented here:

• Mission Planner: An operator controls the vehicles by giving them either missions or stop commands (in case of emergency, for example). When a mission is given, the mission planner computes a route that the vehicle should be able to follow. The mission planner is not aware of obstacles, such as other vehicles, humans or stones.

• Situational Awareness: A subsystem that contains all the information that the vehicle has about its surroundings. It takes in and interprets information from sensors and constructs a local grid map of drivable and non-drivable areas.

• Local Planner: The local planner computes a trajectory that the vehicle shall follow and sends reference signals to the low-level controller. It also reports back to the mission planner about the vehicle's status.

• Vehicle Data Estimator: Models of the vehicle that provide information to the rest of the system.


• Low-Level Controller: Low-level controllers for basic vehicle functionality.

• Real Vehicle Representation: A realistic model of the vehicle in PreScan used in simulation.


3 Communication

This section will describe how the intermodule communication was implemented.

3.1 Purpose

Since the functionality of complex systems is often distributed across modules, message passing and information flow are vital. Information packets of varying sizes, containing commands and data, are passed from one subsystem to another. In short, intermodule information exchange is vital in this project.

3.2 Concept and Realisation

The application in question is a tightly connected network on a truck. Commands and updates may be sent to an external interface, but the system as a whole is tightly connected. The method chosen to realise this is Lightweight Communications and Marshalling (LCM) [6]. As the name suggests, the LCM protocol simplifies message passing on a local-area network.

3.3 Implementation

The LCM communication is based on channel-wise bus communication. An information supplier publishes information on a channel bus and an arbitrary number of processes subscribe to the data. Both publisher and subscriber communicate with pre-defined package types. In the following sections, the package types and the implementation of the different handlers and senders are described.

3.3.1 Package Types

package missionPackage;

struct missionData
{
    int16_t wayPointSize;
    int16_t isWayPointOnFreespaceSize;
    double  waypoints[wayPointSize];
    boolean isWayPointOnFreespace[isWayPointOnFreespaceSize];
    boolean isStartFreespace;
    boolean isEndFreespace;
}

struct abort
{
    boolean panic;
}

Listing 1: Mission package type declaration.


package signalPackage;

struct referenceSignal
{
    float   speed;
    float   heading;
    boolean is_vehicle_on_road;
}

Listing 2: Signal package type declaration.

package sensorPackage;

struct sensorData
{
    int16_t size1;
    int16_t size2;
    float   preScanData[size1];
    float   airData[size2];
}

Listing 3: Sensor package type declaration.

package worldPackage;

struct worldMap
{
    int16_t gridResolution;
    int16_t cellSize;
    boolean world[gridResolution][gridResolution];
    float   vehiclePosition[2];
    float   vehicleHeading;
    boolean is_vehicle_on_road;
}

Listing 4: World map package type declaration.

package imuPackage;

struct imuData
{
    float acceleration_y;
    float roll_angle;
    float roll_rate;
    float pitch_angle;
}

Listing 5: IMU package type declaration.


package vehiclePackage;

struct vehicleDimension
{
    double length;
    double width;
    double theta_max;
    double v_max;
}

Listing 6: Vehicle package type declaration.

3.3.2 Matlab Handler

% Initialize the LCM object and packet handler (only done once)
lc = lcm.lcm.LCM('udpm://239.255.76.67:7667?ttl=1');
aggregator = lcm.lcm.MessageAggregator();
lc.subscribe('SENSOR', aggregator);

% Attempt to catch a packet, wait for 1/200 s
while true
    millis_to_wait = 1/200;
    msg = aggregator.getNextMessage(millis_to_wait);
    if ~isempty(msg)
        break;
    end
    disp('Waiting ...')
end

m = sensorPackage.sensorData(msg.data);

Listing 7: Standard Matlab handler.

3.3.3 Matlab Sender

% Initialize the LCM object (only done once)
lc = lcm.lcm.LCM('udpm://239.255.76.67:7667?ttl=1');

% Map is prepared, then publish
if mapIsprepared
    lc.publish('MAP', map);
end

Listing 8: Standard Matlab sender.


3.3.4 C++ Handler

#include <lcm/lcm-cpp.hpp>
#include <sys/select.h>
#include <string>
#include <vector>

#include "missionPackage/missionData.hpp"

class Handler
{
public:

    int size;
    std::vector<double> waypoints;

    ~Handler() {}

    void handleMission(const lcm::ReceiveBuffer* rbuf,
                       const std::string& chan,
                       const missionPackage::missionData* msg)
    {
        size = msg->size;
        waypoints.resize(size);
        waypoints = msg->waypoints;
    }
};

int main()
{
    lcm::LCM lcm("udpm://239.255.76.67:7667?ttl=1");
    if (!lcm.good()) return 1;

    Handler h;
    lcm.subscribe("MISSION", &Handler::handleMission, &h);

    // This is an example of an asynchronous receiver
    bool readMessages = true;
    while (readMessages)
    {
        // Read messages as long as there are messages to read
        int lcm_fd = lcm.getFileno();
        fd_set fds;
        FD_ZERO(&fds);
        FD_SET(lcm_fd, &fds);

        struct timeval selectTimeout =
        {
            0,  // seconds
            0   // microseconds
        };

        int status = select(lcm_fd + 1, &fds, 0, 0, &selectTimeout);
        if (status != 0 && FD_ISSET(lcm_fd, &fds))
        {
            lcm.handle();
        }
        else
        {
            readMessages = false;
        }
    }

    // Sends the waypoints to the parent object (i.e. the local planner)
    parent->indicatePath(h.waypoints);
}

Listing 9: Cut out of package handler in local planner.


3.3.5 C++ Sender

#include <lcm/lcm-cpp.hpp>

#include "signalPackage/referenceSignal.hpp"

int main()
{
    lcm::LCM lcm("udpm://239.255.76.67:7667?ttl=1");
    if (!lcm.good()) return -1;

    signalPackage::referenceSignal my_data;

    // The local planner calculates speed, heading and whether the vehicle is on road.
    my_data.speed = 83.33;
    my_data.heading = -0.4;
    my_data.is_vehicle_on_road = false;

    lcm.publish("REF", &my_data);

    return 0;
}

Listing 10: Cut out of reference signal sender.


4 PreScan

PreScan [7] is a simulation environment that is used to simulate the vehicle's motion during a mission. A representation of the world, including roads and different obstacles, can be set up in PreScan and the vehicle's motion in this world can be visualised. Several vehicle and sensor models are included in PreScan, and in this project the vehicle model Mercedes Actros 1860 (the only one with dynamics for 4 wheels) is used. PreScan has an interface to Simulink that is used for communication with subsystems via direct connections. Otherwise, communication between the subsystems and PreScan is done via LCM.

Figure 2: An example of the PreScan working environment.


5 Lanelets

In this section, lanelets [3] will be described. Lanelets will be used to represent roads on the map used by the mission planner, and a lanelet algorithm will be used by the local planner to compute trajectories when the vehicle is driving on roads.

5.1 Basic Concept

Lanelets are a technique for representing roads and road networks. A single lanelet is represented by a left bound and a right bound. Besides keeping track of the boundaries of the lanelet, the left and right bounds also indicate the driving direction in the lane that the lanelet represents. Each boundary consists of a number of nodes.

A network of lanelets can be put together. When two lanelets are merged, by letting the starting node position of one be the ending node position of the other, the two lanelets are called adjacent. By repeating this with numerous lanelets, a network of lanelets representing a road network is formed.

An illustration of the basic concept of lanelets is shown in Figure 3.

Figure 3: The basic concept of lanelets.
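As a minimal illustration of the data structure (not taken from the project code, and with purely illustrative field names), a single lanelet can be pictured as two ordered lists of boundary nodes:

% Minimal sketch: one lanelet as a left and a right bound of [x y] nodes
leftBound  = [0 0; 10 0; 20 1];
rightBound = [0 3; 10 3; 20 4];
lanelet = struct('id', 1, 'leftBound', leftBound, 'rightBound', rightBound);
% Two lanelets are adjacent when the end nodes of one coincide with the
% start nodes of the other, which is how a road network is chained together.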

5.2 The Role of Lanelets in This Project

Lanelets are used by the mission planner (Section 6).

5.3 Implementation

The C++ library libLanelet [2] is used in the implementation of the lanelet functionality in the mission planner. All roads in the map that the mission planner has access to are represented by lanelets.


6 Mission Planner

An operator uses the mission planner to give missions to the vehicles and to stop a vehicle in case of emergency. The mission planner knows which routes are usually possible to use. It is not aware of vehicles or unexpected obstacles, such as stones or churches, that may block the road or the usual route. When a mission has been entered, the mission planner computes a route that the vehicle should be able to follow as long as there are no unexpected obstacles present.

6.1 Purpose

The mission planner is located outside of the vehicle. Thus, the overall purpose of the mission planner is that an operator can give a mission to the vehicle by typing the start point and the end point of the mission from a central computer. It is the mission planner's duty to provide a route for this mission based on the given default map of the whole area. In cases of emergency, the operator should be able to stop the vehicle, which is another important purpose of the mission planner.

6.2 Communication with Other Subsystems

The mission planner sends data to the local planner, see section 8, via LCM.

The data sent to the local planner is a list of waypoints that represents the computed route from the start point to the destination point. For each waypoint, a bool representing whether the waypoint is on freespace or not is added.

6.3 Implementation

The mission planner is an object-oriented program implemented in C++. The libraries libLanelet, pugixml [1] and boost [8] are used for some functionality.

The road network is represented by a map of lanelets created in JOSM [9], a Java editor environment for OpenStreetMap. When a road network has been created in JOSM, it is saved as a .osm file. OSM is an XML-based file format and can be parsed by a function in the libLanelet library.

Each lanelet consists of a number of nodes along each boundary, see Section 5.1. Each boundary is represented by a way element, the nodes by node elements and the lanelet by a relation element in the .osm file. The distance between the nodes within a lanelet is arbitrary. The same applies to the length of a lanelet. The lanelets and the nodes are assigned specific ID numbers when created. The last node of a lanelet is the first node of the next lanelet, so it is possible to keep track of which lanelets are connected.

The boundaries of the free-space areas are created along with the road network in JOSM. The boundaries are way elements which are assigned an area attribute to distinguish them from the lanelet boundaries. For the program to find those way elements, the .osm file has to be parsed with a different parser than the one included in libLanelet. A parser from the pugixml library is used for this purpose.

6.3.1 Compatibility with PreScan

An .osm file can be imported into PreScan and all way elements with a specific attribute will then be represented by road blocks. PreScan does, however, have a tendency to bend the roads so that they drift away from the original coordinates. When visualising the simulated motion of the vehicle in PreScan, it will therefore look as if the vehicle is driving off the roads a considerable amount of the time. To avoid this problem, another method is used in this project to place the roads in PreScan and the mission planner map on the same coordinates. An OSM map of the world is set as a background layer in JOSM and the lanelets are created on top of the roads in a small area of the background map. A screenshot of the same area of the background map is then taken and imported as an underlay in PreScan. The global coordinates in the PreScan environment are set to match the coordinates in JOSM. A road network is then created on top of the same roads as in JOSM. The distance between the right and left boundaries of the lanelets is set equal to the width of a road segment in PreScan.

6.3.2 Implementation for Computing the Route

The main task of the mission planner is to compute a route. The route is defined as a number of waypoints with latitudinal and longitudinal coordinates. Those waypoints are sent to the local planner.

The operator has to enter the ID numbers of the start and destination lanelets when giving a mission that includes road driving. The libLanelet library is used to compute the shortest route between those two lanelets on the road network. The function below is available in libLanelet and is a member function of the LaneletMap class. The method computes the route based on its input arguments, which are the start lanelet and the destination lanelet. These arguments belong to a class called Lanelet, which holds information about its ID number (among other information).

shortest_path(const lanelet& start, const lanelet& dest) const;

The function returns a vector with all lanelets along the shortest path. The waypoints to be sent to the local planner are calculated as the middle points between all node pairs (nodes from the right and left boundaries of the lanelets) of all those lanelets.
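As a sketch of that midpoint computation (illustrative variable names, not the mission planner code), with leftNodes and rightNodes being N-by-2 matrices of paired [x y] nodes from the left and right bounds of the route lanelets:

% Waypoints as the middle points of each left/right node pair
waypoints = (leftNodes + rightNodes) / 2;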

If driving on free-space is included in the mission, the coordinates for the starting and/or destination point have to be entered by the operator. When the coordinates have been entered, a check is performed to make sure that they are within the free-space boundary and that the free-space area is connected to the start or destination lanelet. To determine whether the coordinates are within the allowed area, a function called within() from the boost library is used. The function takes two geometries, in this case a point of the coordinates and a polygon made of the nodes from the boundary, as arguments and determines if the first geometry is within the second. The free-space boundaries will have common nodes with the connecting lanelets, and thus it can be determined whether the coordinates are within an area that connects to the right lanelet.
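The project performs this test with boost::geometry's within() in C++; as a Matlab analogue (illustrative only, with assumed variable names), the same point-in-polygon check can be sketched with inpolygon, where boundaryX and boundaryY are the node coordinates of the free-space boundary:

% Check that an operator-entered point lies inside the free-space polygon
inside = inpolygon(xPoint, yPoint, boundaryX, boundaryY);
if ~inside
    error('Entered coordinates are outside the free-space area.');
end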

A flow chart of the mission planner can be seen in Figure 4.


Figure 4: Working principle of the mission planner.


7 Situational Awareness

Situational Awareness creates a world representation of the closest surroundings of the vehicle. The world representation is a grid where each cell contains information about driveability. The driveability concerning static objects is determined by a binary Bayes' filter. Dynamic objects such as other vehicles are represented in the grid by predicting a trajectory assuming constant speed and yaw rate in each time step. The static and dynamic worlds are then merged together, resulting in a full representation of the world.

7.1 Purpose

Situational Awareness receives sensor data from the vehicle in PreScan. Using this information, the module constructs a local map of the environment in real-time and sends it to the Local Planner at each time stamp. It detects drivable and non-drivable areas as well as other vehicles and obstacles. An overview of the subsystem is shown in Figure 5.

Figure 5: Overview of situational awareness.

7.2 Communication with Other Subsystems

Sensor data from PreScan is sent via LCM to the Situational Awareness module, which is written in Matlab. The module then creates a grid map that is sent via LCM to the Local Planner. The data types sent to and from Situational Awareness can be seen in Section 3.3.1.


7.3 Implementation

Sensors

The following sensors are used:

• A laser sensor and a GPS are used to provide data for the grid map construction.

• An idealized AIR sensor is used to get information on dynamic objects.

Occupancy Grid Map

The map is represented as an occupancy grid map, meaning that the world is discretized and divided into a number of cells. All cells are of equal size and each cell c has a probability p(c) of being occupied. Unexplored cells have a prior occupancy probability p(c) = 0.5, unoccupied cells have p(c) ≈ 0, and occupied cells have p(c) ≈ 1, where of course 0 ≤ p(c) ≤ 1 ∀c. Since it is not possible to determine the occupancy of a cell with certainty from just one measurement, a binary Bayes' filter (see Thrun et al. [10]) is used to update the occupancy probability of each cell over time based on earlier measurements, according to Equation (1),

p(c | z_{1:t}) = \frac{p(z_t | c)\, p(c | z_{1:t-1})}{p(z_t | z_{1:t-1})} = \frac{p(c | z_t)\, p(z_t)\, p(c | z_{1:t-1})}{p(c)\, p(z_t | z_{1:t-1})}    (1)

where Bayes' rule has been used in the second equality. Similar calculations yield

p(\neg c | z_{1:t}) = \frac{p(z_t | \neg c)\, p(\neg c | z_{1:t-1})}{p(z_t | z_{1:t-1})} = \frac{p(\neg c | z_t)\, p(z_t)\, p(\neg c | z_{1:t-1})}{p(\neg c)\, p(z_t | z_{1:t-1})}    (2)

Taking the quotient of Equations (1) and (2) results in cancellation of some probabilities and finally yields

\frac{p(c | z_{1:t})}{p(\neg c | z_{1:t})} = \frac{p(c | z_t)}{1 - p(c | z_t)} \cdot \frac{p(c | z_{1:t-1})}{1 - p(c | z_{1:t-1})} \cdot \frac{1 - p(c)}{p(c)}    (3)

Defining

l_t(c) = \log \frac{p(c | z_{1:t})}{p(\neg c | z_{1:t})}    (4)

one ends up with

l_t(c) = \log \frac{p(c | z_t)}{1 - p(c | z_t)} + \log \frac{p(c | z_{1:t-1})}{1 - p(c | z_{1:t-1})} + \log \frac{1 - p(c)}{p(c)}    (5)

Identifying the second term \log \frac{p(c | z_{1:t-1})}{1 - p(c | z_{1:t-1})} as l_{t-1}(c) results in

l_t(c) = \underbrace{l_{t-1}(c)}_{\text{recursive term}} + \underbrace{\log \frac{p(c | z_t)}{1 - p(c | z_t)}}_{\text{uses current measurement}} - \underbrace{\log \frac{p(c)}{1 - p(c)}}_{\text{prior}}    (6)

where p(c | z_t) will be defined from Figure 6.



Figure 6: Occupancy probability for a measurement with detection at 70 meters.

Finally, one can retrieve p(c | z_{1:t}) from l_t(c) using the inverse of Equation (4), yielding

p(c | z_{1:t}) = 1 - \frac{1}{1 + e^{l_t(c)}}    (7)

Hence, a recursive algorithm for retrieving the occupancy probability of a cell has been obtained, and the resulting map is constructed by performing these computations for all cells.
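A sketch of that recursion for the cells along a single laser ray is given below. It is illustrative only: the inverse sensor model values are placeholders (the actual p(c|z_t) is taken from the curve in Figure 6), and the variable names are assumptions.

% Log-odds update, Equations (6)-(7), for the cells along one laser ray
pPrior = 0.5;                       % prior occupancy probability
l0     = log(pPrior/(1 - pPrior));  % prior log-odds (= 0)

for k = 1:numel(cellsAlongRay)
    c = cellsAlongRay(k);           % linear index of a grid cell on the ray
    if distances(k) < detectionRange
        pMeas = 0.3;                % placeholder: likely free before the detection
    else
        pMeas = 0.7;                % placeholder: likely occupied at the detection
    end
    % Equation (6): recursive term + measurement term - prior term
    logOdds(c) = logOdds(c) + log(pMeas/(1 - pMeas)) - l0;
end

% Equation (7): back to probabilities when the map is published
pOcc = 1 - 1./(1 + exp(logOdds));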

Figure 7 demonstrates how the vehicle, with the help of a laser sensor, represents the world in an occupancy grid map in an initial state. Green cells represent no occupancy, red cells represent occupancy and white cells represent unexplored areas. As seen in Figure 7, cell c is marked as unoccupied, which is not the actual case. This is the reason why several measurements of a single cell are needed to determine its true state of occupancy, and hence why a binary Bayes' filter is used.



Figure 7: A principle sketch of how the vehicle uses a laser sensor to build an occupancy grid map of its surroundings in the global coordinate system.

The construction of the grid map can be summarised in the flow chart in Figure 8.

Figure 8: High-level work flow for constructing a grid map.


Dynamic Objects

Since a binary Bayes' filter only applies to static objects, a separate way of handling dynamic objects is implemented. By assuming constant movement in each time step, a predicted trajectory of a dynamic object can be calculated. If the object holds a yaw rate ω and velocity v, the trajectory is represented as a segment along a circle with radius

r = \frac{v}{\omega}    (8)

with the circle center positioned perpendicular to the object's heading. If the object does not hold a yaw rate, a straight trajectory in the current heading direction is assumed. The trajectory length is set to a fixed value in both cases.

A trajectory is represented in the grid by stepping along it and assigning occupancy to the corresponding cells in the grid. To decrease the computational time, the step size is set to a reasonable value. By dilating the calculated trajectory waypoints with a circle of the same size as a normal vehicle, one obtains a representation of the full vehicle along the trajectory. The concept of dilating is shown in Figure 9.


Figure 9: Concept of dilating trajectory waypoints.
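A Matlab sketch of the prediction and dilation described above follows. The step size, horizon and all variable names (vehicle pose x, y, psi, the grid parameters and vehicleRadius) are assumptions for illustration, not values from the project.

% Predict the trajectory of a dynamic object with constant speed and yaw rate
if abs(omega) > 1e-3
    r = v/omega;                               % Equation (8): turning radius
    t = 0:0.2:3;                               % placeholder step size and horizon [s]
    xPred = x + r*(sin(psi + omega*t) - sin(psi));
    yPred = y - r*(cos(psi + omega*t) - cos(psi));
else
    t = 0:0.2:3;                               % straight-line prediction
    xPred = x + v*cos(psi)*t;
    yPred = y + v*sin(psi)*t;
end

% Dilate each predicted waypoint with a circle the size of a normal vehicle
% and mark the covered grid cells as occupied
[gx, gy] = meshgrid(1:gridResolution, 1:gridResolution);
for k = 1:numel(t)
    cx = xPred(k)/cellSize; cy = yPred(k)/cellSize;
    covered = (gx - cx).^2 + (gy - cy).^2 <= (vehicleRadius/cellSize)^2;
    world(covered) = true;
end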


8 Local Planner

8.1 Purpose

The purpose of the local planner is to compute the trajectory that the vehicle shall follow in order to get to the next waypoint, and to send reference signals to the low-level controllers.

8.2 Communication with Other Subsystems

The local planner receives the following data:

• Vehicle position from situational awareness.

• A binary map from situational awareness that contains information about where the vehicle is allowed to drive and not allowed to drive.

• A route from the mission planner.

8.3 Implementation

The main loop of the local planner is implemented with two threads: one that listens to the LCM bus and one that does all the other work. The main loop initializes all the objects it needs for running, and then it basically just waits for commands from other subsystems.

Every command sent to the local planner needs handling when it arrives. That is done by a handler of the class myHandler, which contains functionality for every message type.

8.3.1 Communication

The communication over the LCM bus is handled by two objects, LCM and myHandler. These two objects are used both by the main thread and by the listener thread that is receiving all the messages. Because of this, the two resources are protected with semaphores to prevent real-time errors. This needs to be kept in mind when extending the program.

8.3.2 Missions

When the local planner receives a mission, it needs to rebuild it to fit its own representation. This is done with waypoints. Each waypoint has a constraint that needs to be fulfilled before the local planner marks the waypoint as completed. The constraints can be either position constraints alone or heading and position constraints at the same time. For most of the waypoints, ordinary position constraints are used, but on freespace and on the first waypoint of a road section of the route, a heading constraint is also applied. The reason for this is that the vehicle needs to enter the road at a correct angle to be able to stay on the road. This is somewhat harder for the RRT planner to achieve than just a position constraint, which is why it is only used when necessary.
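A sketch of such a constraint check is given below (illustrative only; the actual implementation is in C++ and the tolerances are placeholders):

% A waypoint is completed when the position is close enough and, if the
% waypoint is heading constrained, when the heading error is small as well
posOk = norm([xVehicle yVehicle] - [xWp yWp]) < 2.0;           % [m], placeholder
if hasHeadingConstraint
    headingErr = atan2(sin(thetaVehicle - thetaWp), cos(thetaVehicle - thetaWp));
    reached = posOk && abs(headingErr) < 0.2;                  % [rad], placeholder
else
    reached = posOk;
end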

In the visualizer, the waypoints are colour coded to make it easier for the user to see what is going on. This is summarized in Table 1.


Colour   Waypoint
Green    Not reached, position-constrained waypoint.
Purple   Not reached, heading-constrained waypoint.
Yellow   Current goal point.
Red      Reached waypoint.

Table 1: Waypoint colour coding.

8.3.3 Obstacles

A class named ObstacleMap is used in the local planner to keep record of where the vehicle is allowed to be and where it is not. Seen from outside the local planner, there are several reasons why the vehicle is not allowed in a certain position: for instance, not running into moving or static obstacles, or not driving outside of the road. Inside the local planner these cases are, however, treated equally when it comes to representing them in the ObstacleMap class. Every obstacle is represented as a rotated rectangle.

8.3.4 Map

The basic road map is created at startup by reading an .osm file. The boundaries of the roads are represented as way elements, see Section 6.3. It is therefore easy to construct very thin rectangles between the straight lines that bind together the nodes of the boundaries. This is also done around the freespace area. This means that the vehicle is encapsulated inside a closed area that it should not drive outside. This also gives a neat line-like visualization of the road delimiters, and it is very obvious in the graphical user interface where the vehicle should be able to drive and where it should not.

The other obstacles in the map are created when Situational Awareness tells the local planner that obstacles, moving or static ones, are present. The local planner does not need to know if the obstacles are static or not; that is all handled in Situational Awareness. The obstacles are built entirely from the message from Situational Awareness. The message consists of a binary map representing where the obstacles are. The vehicle is assumed to be located in the middle of the binary map. The size of each obstacle that the local planner creates is decided by the part of the message that sets the grid size.

8.3.5 Collision Handling

The collision handling is done with a brute-force method. For each obstacle within a certain range from the vehicle, a check is made for every line segment in that obstacle whether it intersects with any of the line segments of the vehicle model. The line segment intersection check is done with functionality from the boost library [8].
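A single segment-segment test can be sketched as follows (a Matlab illustration of the standard orientation test; the project itself relies on the boost library in C++). Here p1, p2 and q1, q2 are the [x y] end points of the two segments:

% Proper-crossing test for two line segments using 2D cross products
cross2 = @(a, b) a(1)*b(2) - a(2)*b(1);
d1 = cross2(q2 - q1, p1 - q1);
d2 = cross2(q2 - q1, p2 - q1);
d3 = cross2(p2 - p1, q1 - p1);
d4 = cross2(p2 - p1, q2 - p1);
intersects = (d1*d2 < 0) && (d3*d4 < 0);   % ignores segments that only touch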

8.3.6 Path-Finding

The path-finding is divided into two cases: freespace driving and road driving.

While driving on freespace, the local planner computes the trajectories using an RRT algorithm developed by Niclas Evestedt and described in [4]. The most important files regarding the RRT planner are RRTPlanner.h, simulator.h, model.h and samplingStrategies.h.

When driving on road, the RRT algorithm is still running, but the reference signal sent to the low-level controllers is overridden so that the vehicle follows the waypoints using a pure-pursuit controller. This is done in the last step before sending the reference signals to the low-level controllers.

8.3.7 Replanning

When the RRT planner makes a plan, this is done for the obstacle map present at the time the planning starts. It is almost always the case that the surroundings change as the vehicle advances, so a new plan must be calculated when something new happens. This is implemented with a timer, since the car has a fixed speed on freespace. Because of the fixed speed, it can be calculated approximately how often the route needs to be replanned in order not to run into any unforeseen obstacles. In the current implementation this timer is set to 5 seconds.

8.3.8 Calculation of Reference Speed

Both the road-following algorithm and the RRT algorithm output a sequence of (x, y)^T coordinates as the trajectory for the vehicle. The resulting output that the local planner publishes on the LCM bus consists of speed, v_i, and heading, θ_i, where θ_i = 0 is defined as north. The reference signal is stepwise constant, so that v_i and θ_i are constant during the time interval t_i ≤ t < t_{i+1}.

The speed is set to a constant value when driving on the free-space area, while the speed on road is a function of the road curvature. Below follows a more detailed description of how the speed is computed while driving on road.

By using the fact that the vehicle is following waypoints, one can calculate the average curve angle ahead of the vehicle given the vehicle's position and the positions of the involved waypoints ahead of the vehicle. The number of waypoints ahead, N, that are included when computing the average curve angle depends on the horizon that is set, i.e. N = N(horizon). Usually there are many waypoints close to each other in sharp curves, while they are further away from each other when the road is basically a straight line. N will then increase when the vehicle is approaching a curve and decrease when driving straight.

When passing a new waypoint, a new N is computed and, based on this, an average curve angle φ_average is computed. When computing φ_average, the law of cosines is used in order to get

φ_{average} = \frac{1}{N} \sum_{i=1}^{N} φ_i

Here each φ_i is computed by applying the law of cosines: by taking the positions of the first three waypoints (N ≥ 3) and calculating the angle between vect_{12} and vect_{23}, where vect_{ij} is the vector from waypoint i to waypoint j, the result is φ_1. After looping through all N waypoints inside the look-ahead distance, the average curve angle is computed.

The constant speed v between two waypoints, which is sent out as a reference signal, is then based on φ_average according to the equation below.

v = v_{min} + \frac{v_{max} - v_{min}}{\pi} \cdot φ_{average}    (9)

There are some special cases. For instance, the limit φ_i → ∞ is taken care of by simply setting the speed to v_{min}.
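The computation can be sketched as follows (illustrative Matlab, not the C++ implementation, under the assumption that φ_i is the interior angle at the middle waypoint, so that a straight road gives φ ≈ π and, via Equation (9), a speed close to v_max). Here wp is an N-by-2 matrix with the waypoints inside the look-ahead horizon and vMin, vMax are the speed limits:

% Average curve angle from the law of cosines, then Equation (9)
phi = zeros(size(wp,1) - 2, 1);
for i = 1:numel(phi)
    a = norm(wp(i+1,:) - wp(i,:));      % sides of the triangle formed by
    b = norm(wp(i+2,:) - wp(i+1,:));    % three consecutive waypoints
    c = norm(wp(i+2,:) - wp(i,:));
    % interior angle at the middle waypoint (assumed interpretation)
    phi(i) = acos(max(-1, min(1, (a^2 + b^2 - c^2)/(2*a*b))));
end
phiAverage = mean(phi);
vRef = vMin + (vMax - vMin)/pi * phiAverage;    % Equation (9)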


8.3.9 Calculation of Reference Heading

The reference heading calculations are done in a pure-pursuit fashion. While driving on freespace, the intersection between the reference trajectory, computed by the RRT planner, and a circle centered around the vehicle is used to set the heading towards the intersection. When driving on road the same principle applies, but in this case the intersection is calculated between the straight line connecting two waypoints and a circle centered around the vehicle.

The heading reference is then computed as

θ = -\mathrm{atan2}\left(x_{intersection} - x_{vehicle},\; y_{intersection} - y_{vehicle}\right) + \pi/2    (10)

where atan2 computes the angle on the interval [−π, π], the intersection subscript denotes the intersection point and the vehicle subscript denotes the vehicle position received from situational awareness. The negative sign and the addition of π/2 are due to the fact that the local planner works in a left-handed system that is rotated 90 degrees.
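A sketch of this computation for the on-road case follows (illustrative names and look-ahead value; wp1 and wp2 are the [x y] end points of the current waypoint segment). The intersection of the look-ahead circle with the line through the two waypoints is found by solving a quadratic in the segment parameter s:

% Pure-pursuit heading reference, Equation (10)
d  = wp2 - wp1;                         % segment direction
f  = wp1 - [xVehicle yVehicle];
A  = dot(d, d);
B  = 2*dot(f, d);
C  = dot(f, f) - lookAhead^2;
s  = (-B + sqrt(B^2 - 4*A*C))/(2*A);    % intersection furthest along the segment
pI = wp1 + s*d;                         % intersection point

thetaRef = -atan2(pI(1) - xVehicle, pI(2) - yVehicle) + pi/2;   % Equation (10)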


9 Vehicle Data Estimator

The vehicle data estimator is the part of the system used to provide information about the vehicle itself that is needed to avoid unwanted and possibly dangerous situations. It is based on mathematical models and can handle crucial aspects of a real vehicle.

9.1 Purpose

To plan a safe trajectory for the physical vehicle, a few limiting parameters need to be known in order not to exceed the capacity of the vehicle and thereby possibly cause dangerous situations. This system estimates the mass and the centre of gravity.

The vehicle data estimator provides limitations for the trajectory planning. These are formulated as a maximum speed reference and a maximum reference angle for the trajectory, which are based on a maximum allowed curvature and the wheel angle needed to achieve it. The vehicle also has a top speed limit, which is the Swedish legal speed limit for heavy-duty vehicles (i.e. 90 km/h), and likewise a maximum turning angle, since it cannot turn at any angle due to limitations in the steering shafts. Apart from this, the vehicle has limitations regarding how much it can steer at a given speed or, the other way around, how high the speed can be at a given steering angle. To implement this, calculations based mainly on the roll-over condition have been made.

9.2 Communication with Other Subsystems

The vehicle data estimator receives sensor data from the real vehicle representation via direct connections in Simulink. To send results, the block sends packages via LCM to the local planner.

The data that is sent is the width and length of the vehicle and limitations of speed and heading (v_max and θ_max respectively), where heading is defined as a compass angle: north is 0 degrees and east is 90 degrees. These are sent to the local planner via LCM.

9.3 Implementation

The vehicle data estimator has been implemented in Matlab/Simulink. A flow chart describing the module work flow can be seen in Figure 10.


Figure 10: A rough description of the work flow in the vehicle data estimator.

The vehicle size, i.e. length and width, has been hard-coded into the vehicle data estimator to make the trajectory planning avoid collisions when passing nearby objects.

The mass estimation and the centre of gravity estimation have been implemented as extended Kalman filters. See the following sections or [5] for more detailed information.

9.3.1 Center of Gravity Estimation

The model used to estimate the centre of gravity height is the physical model seen below, describing the roll behaviour of the vehicle. The mass is an input from the mass estimation filter. The parameters used are listed in Table 2.

I_{xx,roll}\,\ddot{\phi} = m h_{cg} a_y \cos(\phi) + m h_{cg} g \sin(\phi) - C_\phi \phi - D_\phi \dot{\phi}    (11)

I_{xx,roll} = J_x + m h_{cg}^2    (12)

C_\phi = \frac{3}{2} w^2 \frac{C_{front} + C_{rear}}{2}    (13)

D_\phi = \frac{6}{2} w^2 \frac{D_{front} + D_{rear}}{2}    (14)


Table 2: Parameters used in the vehicle roll behaviour model.

Notation   Definition
φ          Roll angle
I_xx       Roll inertia
J_x        Roll inertia constant
w          Track width
C          Suspension spring constant
D          Suspension damping constant

The model was validated by comparing the right- and left-hand sides with measurement data. From this, and by examining the vehicle model in Simulink, it was found that some modifications had to be made to the original estimation model.

• Some tuning had to be done to both suspension parameters (C_φ and D_φ) for better model correspondence.

• The rolling moment of inertia was calculated under the assumption that the lateral acceleration in Simulink is measured at the roll center instead of at the height of an actual IMU sensor placed on the truck.

a_y^{IMU} = a_y - h_{IMU}\,\ddot{\phi}\cos(\phi), \quad h_{IMU} = 0 \quad \Rightarrow \quad a_y = a_y^{IMU}

9.3.2 Mass Estimation

The model used to estimate the vehicle mass is a longitudinal model of the vehicle where Newton's second law sums the acting forces. The equations follow below.

m a = F_t - F_r - F_g - F_d    (15)

where F_t is the traction force, F_r is the rolling resistance force, F_g is the gravity force and F_d is the aerodynamic drag force.

F_t = (T_{eng} - J_{eng}\omega_{eng}) \frac{i_g i_f \eta_g \eta_f}{r_\omega} + T_{ret} \frac{i_f \eta_f}{r_\omega} - \frac{J_\omega \omega_\omega}{r_\omega}    (16)

F_r = C_{rr}\, m g \cos(\alpha)    (17)

F_g = m g \sin(\alpha)    (18)

F_d = \frac{1}{2}\rho C_D A v^2    (19)

The parameters used above in the model are defined below in Tables 3, 4 and 5.


Table 3: Parameters used in the mass estimation model.

Notation   Definition
i_f        Final drive gear ratio
r_ω        Wheel radius
η_f        Final drive efficiency
A          Frontal area
C_d        Drag coefficient
C_rr       Rolling coefficient
J_ω        Wheel rotational inertia
J_eng      Engine rotational inertia
ρ          Air density

Table 4: Measurement signals used in the mass estimation model.

Notation   Definition
T_eng      Engine torque
ω_eng      Engine acceleration
α          Road grade
v          Vehicle speed

Table 5: Input values used in the mass estimation model.

Notation    Definition
i_g         Gear ratio
η_g         Gearbox efficiency
T_ret       Retarder torque

The model was validated by comparing the right- and left-hand sides of the equations with measurement data. From this, and from examining the vehicle model in Simulink, it was found that some modifications had to be made to the original estimation model, as sketched after the list below.

• Since the vehicle model does not have a retarder, this term has been neglected.

• The engine inertia has also been neglected.

• Instead of converting the engine torque, the transmission output torque was used due to better model correspondence.
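A minimal sketch of the resulting force balance, with the modifications above applied (retarder and engine inertia neglected, transmission output torque used), is given below. The parameter values follow Listing 11, while the speed, road grade and torque are assumed example inputs, and the wheel-inertia term J_ω/r_ω² included in the filter is omitted here for brevity.

% Sketch: longitudinal force balance (15)-(19) with the modifications above.
% Parameter values follow Listing 11; v, alpha and T_trans are example inputs.
m       = 7000;    % current mass estimate [kg]
iF      = 2.73;    % final drive gear ratio
nf      = 0.98;    % final drive efficiency
rw      = 0.498;   % wheel radius [m]
A       = 7.95;    % frontal area [m^2]
Cd      = 0.7;     % drag coefficient
Crr     = 0.01;    % rolling resistance coefficient
rho     = 1.225;   % air density [kg/m^3]
g       = 9.82;

v       = 20;      % vehicle speed [m/s]
alpha   = 0.02;    % road grade [rad]
T_trans = 900;     % transmission output torque [Nm]

Ft = T_trans*iF*nf/rw;        % traction force at the wheels
Fr = Crr*m*g*cos(alpha);      % rolling resistance, equation (17)
Fg = m*g*sin(alpha);          % gravity force, equation (18)
Fd = 0.5*rho*Cd*A*v^2;        % aerodynamic drag, equation (19)
a  = (Ft - Fr - Fg - Fd)/m;   % longitudinal acceleration from (15)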


9.3.3 Extended Kalman Filter Implementation

The above equations and the Kalman filters have been implemented following [5]. The specific algorithm used can be seen in Figure 11.

Figure 11: The extended Kalman filter algorithm used in the estimators.

The Matlab implementations can be found in Appendix A; the one for mass estimation is shown in Listing 11 and the one for centre of gravity estimation in Listing 12. A condensed sketch of the recursion they both follow is given below.
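The sketch shows the standard EKF measurement and time update on a trivial scalar model, purely to illustrate the structure; all numbers in it are placeholders, and the exact algorithm used in the project is the one in Figure 11 and the listings.

% Sketch: the generic EKF recursion of Listings 11 and 12, shown on a trivial
% scalar random-walk model purely to illustrate the structure.
f = @(x) x;              % state transition function (random walk)
h = @(x) x;              % measurement function
F = 1;  H = 1;           % Jacobians of f and h at the current estimate
Q = 1e-3;  R = 1e-1;     % process and measurement noise covariances

xhat = 0;  P = 1;        % initial state estimate and covariance
y = 0.5;                 % one measurement sample (placeholder)

% Measurement update
K    = P*H'/(H*P*H' + R);          % Kalman gain
xhat = xhat + K*(y - h(xhat));     % correct the state with the innovation
P    = P - K*H*P;                  % correct the covariance
% Time update
xhat = f(xhat);                    % predict the state
P    = F*P*F' + Q;                 % predict the covariance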

Note that these filters were implemented without added process or measurement noise in the simulation, and the noise covariance matrices have been tuned accordingly. However, the estimates have been observed to be highly sensitive to the type of driving scenario that is run, especially since it has not been possible to run the scenarios for the extended periods of time that might be necessary for optimal tuning of the filters.

9.3.4 Limitation Calculations

The above estimations determine the limit of the lateral acceleration according to

a_y = \frac{n g T}{2 h_{cg}}    (20)

where T is the track width and n is a safety factor between 0 and 1 that ensures no roll-over can occur. From this it is possible to extract a maximum heading and speed reference based on a maximum curvature for a given speed, and vice versa. This is done with Ackermann's formula according to

R = \sqrt{a_2^2 + l^2 \cot^2(\delta)}    (21)

where l is the wheelbase, δ is the average of the inner and outer steering angles and a_2 is the distance from the mass point to the rear wheel pair.

Equations 20 and 21 together with the formula

a_y = \frac{V^2}{R}    (22)

give a limitation on both heading and velocity. The local planner does not use these limitations at the moment, but they could easily be incorporated. Listing 13 in Appendix A shows the Matlab implementation of the limitation calculations, and a small numerical illustration is given below.
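In the illustration, the parameter values follow Listing 13, while the centre of gravity height, speed and wheel angle are assumed example values rather than numbers from the project.

% Sketch: the limitation calculations evaluated for one operating point.
% Parameter values follow Listing 13; hcg, v and steer are assumed examples.
g = 9.82;  b = 2.830;  l = 3.6;  w = 2.034;  s = 0.4;
hcg   = 1.2;     % assumed centre of gravity height [m]
v     = 15;      % current speed [m/s]
steer = 0.05;    % current average wheel angle [rad]

a_max = s*g*w/(2*hcg);                 % max lateral acceleration, equation (20)
R     = sqrt(b^2 + (l*cot(steer))^2);  % turning radius from equation (21)
v_max = sqrt(a_max*R);                 % max speed for this curvature, from (22)
R_min = v^2/a_max;                     % smallest admissible radius at this speed
                                       % (named R_max in Listing 13)
% For these numbers: a_max ~ 3.3 m/s^2, R ~ 72 m, v_max ~ 15.5 m/s.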


10 Low-Level Controller

10.1 Purpose

This subsystem calculates and outputs control signals to the real vehicle, and it also takes in measurement signals from the real vehicle. The low-level controller gets its reference signals from the local planner; it only receives momentary references, not a future trajectory.

10.2 Communication with Other Subsystems

The low-level controller gets the reference signals speed and heading from the local planner, see Section 8.2, via LCM, and therefore a wrapper from LCM to Simulink is used. A signal from the local planner tells the controllers which set of parameters to use, see Section 8.2. The measurement signals are received from the real vehicle representation via direct connections in Simulink.

The information sent consists of the control signals throttle position [%], brake position [%] and steering wheel angle [deg]. All these control signals are sent via direct connections in Simulink to the real vehicle representation.

10.3 Implementation

A flow chart of the work of the low-level controller is presented in Figure 12 below.


Figure 12: A rough description of the work flow in the low-level controller.

The controllers are PID controllers and get their reference signals from the local planner. There is logic implemented for a behaviour mode, i.e. different PID parameters depending on whether the vehicle is on road or not, but it is not in use. Measurements from the real vehicle representation together with the reference signals form the error at each time sample, and this error is the input to the controllers. The control is split into two parts, longitudinal and lateral, where the former controls the speed and the latter the heading. The longitudinal controller controls the speed through the throttle and the brake: if the error between desired and actual velocity is negative, the brake is applied (as a brake pressure); if the error is positive, the input goes to the throttle. The lateral controller controls the heading via the steering wheel angle. The Simulink implementations of the controllers can be seen in Figure 13 and Figure 14 below.
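Before the Simulink blocks, a minimal text sketch of the longitudinal split described above is given here; the gains, the sample time and the saturation limits are placeholder values, not the tuned project parameters.

function [throttle, brake] = longitudinal_ctrl(v_ref, v_meas)
% Sketch of the longitudinal PID split into throttle and brake.
% Gains, sample time and limits are placeholders, not the tuned project values.
persistent i_state e_prev
if isempty(i_state), i_state = 0; e_prev = 0; end
Kp = 0.8; Ki = 0.1; Kd = 0.05;    % placeholder PID gains
Ts = 1/200;                       % sample time [s]

e = v_ref - v_meas;               % speed error at this time sample
i_state = i_state + Ts*e;         % integrator state
d = (e - e_prev)/Ts;              % derivative estimate
e_prev = e;
u = Kp*e + Ki*i_state + Kd*d;     % PID control signal

if u >= 0                         % positive error: request throttle
    throttle = min(u, 100);       % throttle position [%]
    brake = 0;
else                              % negative error: request brake
    throttle = 0;
    brake = min(-u, 100);         % brake position [%]
end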

Figure 13: Simulink block for the longitudinal controller. The switches make it possible to have different parameters depending on the behaviour mode (not in use right now).


Figure 14: Simulink block for the heading controller. This controller needs a switch so that it is active only when it receives reference signals, to ensure it works as intended.

To tune these controllers, lambda tuning has been used as a starting point for the longitudinal controller and the Ziegler–Nichols method for the lateral controller. Thereafter, some tests have been carried out to tune them further; the classic Ziegler–Nichols relations are sketched below.
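For reference, the classic closed-loop Ziegler–Nichols settings used as a starting point can be written as follows; the ultimate gain and oscillation period below are placeholder values, not those identified for the lateral controller.

% Sketch: classic closed-loop Ziegler-Nichols PID settings from the ultimate
% gain Ku and oscillation period Tu (placeholder values below).
Ku = 2.0;          % gain at which the closed loop oscillates steadily
Tu = 1.5;          % period of that oscillation [s]

Kp = 0.6*Ku;       % proportional gain
Ti = 0.5*Tu;       % integral time [s]
Td = 0.125*Tu;     % derivative time [s]
Ki = Kp/Ti;        % integral gain, parallel form
Kd = Kp*Td;        % derivative gain, parallel form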


11 Real Vehicle Representation

11.1 Purpose

This subsystem is meant to represent the actual truck that is to be controlled; therefore it is suitable to use a representation that is as realistic as possible. To achieve this, the most advanced model from PreScan is used.

11.2 Communication with Other Subsystems

The real vehicle representation receives control signals from the low-level controller. These are throttle position [%], brake position [%] and steering wheel angle [deg]. All these control signals are sent via direct connections in Simulink.

From the real vehicle representation, the measurement signals speed [m/s] and heading [deg], expressed as a compass angle where north corresponds to 0 degrees and east to 90 degrees, are sent to the low-level controller via direct connections in Simulink.
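To make the compass convention concrete, a heading can be converted to a standard mathematical angle and a wrap-safe heading error can be formed as in the small sketch below; this is not project code, and the numbers are arbitrary examples.

% Sketch: the compass heading convention (north = 0 deg, clockwise positive).
% Not project code; the numbers are arbitrary examples.
heading = 135;                            % compass heading [deg] (south-east)
theta   = mod(90 - heading, 360);         % math angle, counter-clockwise from east
% A wrap-safe heading error between a reference and a measured compass heading:
h_ref  = 350;  h_meas = 10;               % example values on either side of north
e_head = mod(h_ref - h_meas + 180, 360) - 180;   % = -20 deg, shortest signed error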

Several measurement signals from the real vehicle representation are also sent to the vehicle data estimator, via direct connections in Simulink. The measurements and parameters are mainly used in the mass estimation, to calculate the force output through engine torque and transmission parameters. See Section 9.3.2 for examples of the signals that are sent.

11.3 Implementation

PreScan is able to provide both 2D and 3D models. Both types are based on a bicycle model and are able to model the roll dynamics depending on the mass and centre of gravity of a truck. The 3D model can be extended to a four-wheel model. Here, the 3D model with roll dynamics is chosen. Internally the model is equipped with a motor block, a transmission, a gear, a chassis and four wheels.

The model has an almost fully open source implementation in Simulink, which can be controlled by multiple inputs to the motor block, brakes, gearbox etc. While simulating and giving these inputs through Simulink, the graphical representation in PreScan plays out the scenario as controlled. Simultaneously it is possible to extract several measurement signals, either from realistic sensors or from internal measurements, including ones that are usually not accessible in heavy-duty trucks.


A Matlab Coded Estimators

function ut = fcn(in)
%%%%%%% EKF for Mass Estimation %%%%%%%%

% Initializing the persistent variables
persistent xhat Ts Pt Kt
if isempty(Pt)
    xhat = [0; 0; 0; 7000; 1];      % Initial guess of states
    Ts = 1/200;                     % Sample time
    Pt = 10*diag([1 1 1 10 1]);     % Initial covariance matrix
    Kt = 1*eye(5);                  % Initial weighting matrix
end

% Parameters
nf  = 0.98;    % Final drive efficiency (from GUI)
iF  = 2.73;    % Final drive gear ratio (from GUI)
A   = 7.95;    % Frontal area (from GUI)
Cd  = 0.7;     % Drag coefficient (from GUI)
Crr = 0.01;    % Rolling resistance coefficient (from GUI)
Jw  = 166;     % Wheel rotational inertia (from iQDrive report)
rw  = 0.498;   % Wheel radius (from GUI)

g   = 9.82;    % Gravitational constant
rho = 1.225;   % Air density [kg/m^3]

% Measurement signals
% Input data: (torque_transm, v_x, roadgrade, v_x, roadgrade, gearnumber, torque_eng)
yt = in(1:5);

% Input values (logic/unused)
u11 = 1;       % Gearbox efficiency

% Measurement Jacobian
Ht = [1 0       0 0 0;
      0 xhat(5) 0 0 xhat(2);
      0 0       1 0 0;
      0 1       0 0 0;
      0 0       1 0 0];

% Noise covariance matrices
Rt = 0.3e-5*diag([1 1 1e-5 1e9 1]);
Qt = 1e-4*diag([1 1 1 4e6 1]);

% Measurement update
hx = [xhat(1); xhat(2)*xhat(5); 0; xhat(2); 0];
xhat = xhat + Kt*(yt - hx);
Kt = Pt*Ht'*inv(Ht*Pt*Ht' + Rt);
Pt = Pt - Kt*Ht*Pt;

% Time update (meff is the effective mass including the wheel inertia)
meff = xhat(4) + Jw/rw^2;
f2 = xhat(2) + Ts*(xhat(1)*u11*iF*nf/(rw*meff) ...
     - rho*Cd*A*xhat(2)^2/(2*meff) ...
     - xhat(4)*Crr*g*cos(xhat(3))/meff ...
     - xhat(4)*g*sin(xhat(3))/meff);
f = [xhat(1); f2; 0; xhat(4); xhat(5)];

% Derivatives
df1dx1 = Ts*iF*nf/(rw*meff);
df2dx2 = 1 - Ts*rho*Cd*A*xhat(2)/meff;
df2dx3 = Ts*(xhat(4)*Crr*g*sin(xhat(3))/meff - xhat(4)*g*cos(xhat(3))/meff);
df2dx4 = Ts*(-xhat(1)*u11*iF*nf/(rw*meff^2) + rho*Cd*A*xhat(2)^2/(2*meff^2) ...
         - Jw*rw^2*Crr*g*cos(xhat(3))/((rw^2*xhat(4) + Jw)^2) ...
         - Jw*rw^2*g*sin(xhat(3))/((rw^2*xhat(4) + Jw)^2));

% State Jacobian
Ft = [1      0      0      0      0;
      df1dx1 df2dx2 df2dx3 df2dx4 0;
      0      0      1      0      0;
      0      0      0      1      0;
      0      0      0      0      1];
Gt = eye(5);
Pt = Ft*Pt*Ft' + Gt*Qt*Gt';
xhat = f;

ut = xhat;

Listing 11: Implementation of the mass estimation EKF in Matlab

function y = fcn(u, mass)
%%%%%%% EKF for Estimation of Center of Gravity %%%%%%%%

% Initializing the persistent variables
persistent xhat Ts Pt Kt
if isempty(Pt)
    xhat = [0; 0; 1; 0];    % Initial guess of states
    Ts = 1/200;             % Sample time
    Pt = 0.05*eye(4);       % Initial covariance matrix
    Kt = 1*zeros(4,3);      % Initial weighting matrix
end

% Input IMU data: (phi, phi_dot, phi_dot_dot, a_y, pitch)
y1 = [u(1); u(2); u(4)];

K_f = 5.98e5;    % Front suspension spring constant (from GUI)
K_r = 1.196e5;   % Rear suspension spring constant (from GUI)
D_f = 4.436e4;   % Front suspension damping constant (from GUI)
D_r = 2.18e4;    % Rear suspension damping constant (from GUI)
w   = 2.034;     % Track width (from GUI)
Ktheta = (3/2)*w^2*(K_f + K_r)/2;   % Suspension spring constant, equation (13)
Dtheta = (6/2)*w^2*(D_f + D_r)/2;   % Suspension damping constant, equation (14)

Jx   = 5106;     % Roll inertia (from GUI)
M    = mass;     % Input mass from the mass estimator output
himu = 0;        % Height of theoretical IMU

% Measurement Jacobian
Ht = [1 0 0 0;
      0 0 1 0;
      0 0 0 1];

% Noise covariance matrices
Rt = 1e-2*diag([1e-1 100 1.5e-3]);
Qt = 1e-6*diag([1 1 1200 1]);

% Measurement update
hx = [xhat(1); xhat(2); xhat(4)];   % h(x) from the previous step
xhat = xhat + Kt*(y1 - hx);         % including the initial value
Kt = Pt*Ht'*inv(Ht*Pt*Ht' + Rt);
Pt = Pt - Kt*Ht*Pt;

% Time update (den is the effective roll inertia)
den = Jx + M*xhat(3)^2 - M*xhat(3)*himu;
f2  = xhat(2) + Ts/den*(-Dtheta*xhat(2) - Ktheta*xhat(1) + M*xhat(3)*xhat(4));
f   = [xhat(1) + Ts*xhat(2); f2; xhat(3); xhat(4)];

% State Jacobian
df2dx1 = -Ktheta*Ts/den;
df2dx2 = 1 - Dtheta*Ts/den;
df2dx3 = Ts*M*xhat(4)/den - Ts*(2*M*xhat(3) - M*himu) ...
         *(-Ktheta*xhat(1) - Dtheta*xhat(2) + M*xhat(3)*xhat(4))/den^2;
df2dx4 = Ts*M*xhat(3)/den;
Ft = [1      Ts     0      0;
      df2dx1 df2dx2 df2dx3 df2dx4;
      0      0      1      0;
      0      0      0      1];

xhat = f;
Gt = eye(4);
Pt = Ft*Pt*Ft' + Gt*Qt*Gt';

y = xhat;

Listing 12: Implementation of the center of gravity EKF in Matlab

function y = fcn(u)
% %%%%%%%%%%%%% Vehicle Limitations %%%%%%%%%%%%%%%%%

% Hardcoded parameters (some from GUI)
g = 9.82;        % Gravitational constant
b = 2.830;       % [m] length from the rear axle to the CoG
l = 3.6;         % [m] wheelbase (between front and rear wheels)
w = 2.034;       % [m] track width
s = 0.4;         % Safety factor (conventional)
v_lim = 90/3.6;  % Upper speed limit (Swedish legal truck speed limit)

% Input values
steer = u(1);
v     = u(2);
hcg   = u(3);

% Wheel angle -> turning radius
R = sqrt(b^2 + (l*cot(steer))^2);

% Maximum allowed lateral acceleration based on the current CoG
a_max = s*g*w/(2*hcg);
% Maximum allowed speed based on a_max, given the current turning radius
v_max = sqrt(a_max*R);
% Smallest turning radius allowed by a_max, given the current speed
R_max = v^2/a_max;
% Turning radius -> wheel angle
delta_max = acot(sqrt(max(R_max^2 - b^2, 0))/l);

% Set the top speed limit
v_max = min(v_lim, v_max);

heading_max = delta_max*v_max;   % Yaw rate
y = [heading_max*180/pi; v_max];

Listing 13: Matlab code for calculating speed and heading limitations


References

[1] pugixml. https://code.google.com/p/pugixml/.

[2] P. Bender and J. Ziegler. liblanelet. https://github.com/phbender/liblanelet, 2014.

[3] P. Bender, J. Ziegler, and C. Stiller. Lanelets: Efficient map representation for autonomous driving. IEEE Intelligent Vehicles Symposium (IV), 2014.

[4] Niclas Evestedt, Daniel Axehill, and Fredrik Gustafsson. Rapidly expanding random trees: A solution for the iQMatic project?

[5] Frisk and Krysander. iQDrive final report: mass and center of gravity estimation, 2014.

[6] Albert S. Huang, Edwin Olson, and David C. Moore. LCM: Lightweight communications and marshalling. https://code.google.com/p/lcm/wiki/Documentatio, 2013.

[7] Tass International. PreScan. https://www.tassinternational.com/prescan.

[8] Boris Schaling. The Boost C++ libraries. http://en.highscore.de/cpp/boost/, 2011.

[9] Edgewall Software. JOSM. https://josm.openstreetmap.de/, 2014.

[10] Sebastian Thrun, Wolfram Burgard, and Dieter Fox. Probabilistic Robotics, 2000.
