
Page 1: Driving Autonomy - Intelligent Vehicles

Driving Autonomy

Rob Weston

Machine Learning for Intelligent Transportation Systems

OXFORD ROBOTICS INSTITUTE

APPLIED ARTIFICIAL INTELLIGENCE LAB (A2I)

…is mobile autonomy

Page 2: Driving Autonomy - Intelligent Vehicles
Page 3: Driving Autonomy - Intelligent Vehicles

Autonomous driving is fundamentally a systems challenge …

Page 4: Driving Autonomy - Intelligent Vehicles

Why systems?

Challenges
• An uncertain world
• Imperfect sensors and maps
• Ever-changing environments (unseen locations, weather, season)

Solutions must be…
• auditable
• safe
• efficient (computation, energy)

[Pipeline: camera, lidar, radar, etc. → Localisation → Detection → Tracking → Planning → Control → driving commands]
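The pipeline on this slide can be expressed as a chain of stages whose intermediate outputs are inspectable in isolation, which is what makes the system auditable. A minimal Python illustration; the `Frame` container and the stage implementations are placeholders of mine, not the real system:

```python
from dataclasses import dataclass, field

@dataclass
class Frame:
    """Container passed along the pipeline; fields are filled in stage by stage."""
    sensors: dict                        # raw camera / lidar / radar readings
    pose: tuple = None                   # set by localisation
    detections: list = field(default_factory=list)
    tracks: list = field(default_factory=list)
    plan: list = field(default_factory=list)
    command: dict = None                 # final driving command

# Placeholder stages: each consumes the frame and annotates it.
def localise(f): f.pose = (0.0, 0.0, 0.0); return f
def detect(f):   f.detections = ["car@10m"]; return f
def track(f):    f.tracks = list(f.detections); return f
def plan(f):     f.plan = ["keep_lane"]; return f
def control(f):  f.command = {"steer": 0.0, "throttle": 0.2}; return f

PIPELINE = [localise, detect, track, plan, control]

def run(frame):
    # Every stage's output can be logged and audited, unlike a monolithic
    # end-to-end mapping from pixels to driving commands.
    for stage in PIPELINE:
        frame = stage(frame)
    return frame

out = run(Frame(sensors={"camera": [], "lidar": [], "radar": []}))
print(out.command)
```

The point of the sketch is the structure, not the stages: any one module can be replaced (including by a learned component) without losing the ability to inspect the rest.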

Page 5: Driving Autonomy - Intelligent Vehicles

Systems allow us to move from this…

Page 6: Driving Autonomy - Intelligent Vehicles

…is mobile autonomy

To this!

Page 7: Driving Autonomy - Intelligent Vehicles

To this!

Page 8: Driving Autonomy - Intelligent Vehicles

The Deep Learning Revolution

We can now accurately learn some pretty complex mappings…

… and execute these mappings very quickly indeed.

Page 9: Driving Autonomy - Intelligent Vehicles

So, where does machine learning fit in?

Improving the tooling…
• automatic data generation
• versatile simulation, e.g.
  • real → sim
  • third-party behaviour
• …

Improving the platform…
• sensor performance (and fusion)
• situational awareness
  • perception
  • prediction (behaviour)
  • planning (behaviour)
• …

Page 10: Driving Autonomy - Intelligent Vehicles

The Deep Learning Revolution

We can now accurately learn some pretty complex mappings…

… and execute these mappings very quickly indeed.

Provided we have vast labelled datasets!

Page 11: Driving Autonomy - Intelligent Vehicles

Integrating Machine Learning into the system creates some unique challenges

Learning from limited training labels

Adapting to new situations quickly

Managing uncertainty

Model validation

Model Size …

Page 12: Driving Autonomy - Intelligent Vehicles

So where does machine learning fit in?

Page 13: Driving Autonomy - Intelligent Vehicles

Leveraging the system for self-supervision…

Page 14: Driving Autonomy - Intelligent Vehicles

End-to-End Tracking and Semantic Segmentation Using RNNs

Peter Ondruska, Julie Dequaire, Dominic Wang and Ingmar Posner

[Ondruska, Dequaire, Wang and Posner, “End-to-End Tracking and Semantic Segmentation Using RNNs”, RSS Workshop on Deep Learning in Robotics, 2016]

Page 15: Driving Autonomy - Intelligent Vehicles

Input Output

Deep Tracking

[Ondruska, Dequaire, Wang and Posner, “End-to-End Tracking and Semantic Segmentation Using RNNs”, RSS Workshop on Deep Learning in Robotics, 2016]

Self-supervised learning of world dynamics (and then semantics)
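The self-supervision trick can be made concrete: train the network to predict a future sensor frame, and compute the loss only on cells that future frame actually observes. A minimal numpy sketch of the masked loss; the function and variable names are mine, not the paper's:

```python
import numpy as np

def self_supervised_loss(pred_occupancy, future_obs, future_vis):
    """Binary cross-entropy against a future frame, masked to the cells the
    sensor actually observes there, so no human labels are needed.
    pred_occupancy: predicted occupancy probabilities, shape (H, W)
    future_obs:     occupancy measured at the future time step, values in {0, 1}
    future_vis:     1 where the future frame is visible (not occluded)
    """
    eps = 1e-7
    p = np.clip(pred_occupancy, eps, 1 - eps)
    bce = -(future_obs * np.log(p) + (1 - future_obs) * np.log(1 - p))
    # Occluded cells contribute nothing: the network is free to predict
    # objects there, which is how it learns to track through occlusion.
    return (bce * future_vis).sum() / max(future_vis.sum(), 1)

# Toy 2x2 grid: only the left column is visible in the future frame.
pred = np.array([[0.9, 0.5], [0.1, 0.5]])
obs  = np.array([[1.0, 0.0], [0.0, 0.0]])
vis  = np.array([[1.0, 0.0], [1.0, 0.0]])
print(self_supervised_loss(pred, obs, vis))  # ~0.105: both visible cells predicted well
```

Note that the right column, where the prediction is a maximally uncertain 0.5, is never penalised, because the sensor never sees it.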

Page 16: Driving Autonomy - Intelligent Vehicles

Identifying Domain Knowledge What do we have? What do we know?

What do we have? What do we know?

Page 17: Driving Autonomy - Intelligent Vehicles

Path Proposals for Urban Autonomy
Dan Barnes, Will Maddern and Ingmar Posner

[Barnes et al, “Find Your Own Way: Self-Supervised Segmentation of Path Proposals for Urban Autonomy”, ICRA, 2017]

Page 18: Driving Autonomy - Intelligent Vehicles

[Barnes et al, “Find Your Own Way: Self-Supervised Segmentation of Path Proposals for Urban Autonomy”, ICRA, 2017]

Page 19: Driving Autonomy - Intelligent Vehicles

[Barnes et al, “Find Your Own Way: Self-Supervised Segmentation of Path Proposals for Urban Autonomy”, ICRA, 2017]

Page 20: Driving Autonomy - Intelligent Vehicles

Training is entirely self-supervised.

[Barnes et al, “Find Your Own Way: Self-Supervised Segmentation of Path Proposals for Urban Autonomy”, ICRA, 2017]

Page 21: Driving Autonomy - Intelligent Vehicles

Deployed with monocular camera only.

[Barnes et al, “Find Your Own Way: Self-Supervised Segmentation of Path Proposals for Urban Autonomy”, ICRA, 2017]

Page 22: Driving Autonomy - Intelligent Vehicles

Generating Training Data: Path Proposals

[Barnes et al, “Find Your Own Way: Self-Supervised Segmentation of Path Proposals for Urban Autonomy”, ICRA, 2017]
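The label-generation idea is geometric: project the path the vehicle subsequently drove (known from its own odometry) back into the current camera image, and treat those pixels as "drivable". A simplified sketch assuming a pinhole camera model; all parameter values and helper names are illustrative, not the paper's:

```python
import numpy as np

def project_path_labels(future_poses, K, img_shape, vehicle_width=1.8):
    """Project the driven path into the current camera frame to obtain
    'drivable' pixel labels for free, with no human annotation.
    future_poses: (N, 3) path points in camera coordinates
                  (x right, y down, z forward)
    K: 3x3 pinhole intrinsics. Returns a binary label mask (H, W).
    Illustrative geometry only; the real pipeline also handles obstacles.
    """
    H, W = img_shape
    mask = np.zeros((H, W), dtype=np.uint8)
    half = vehicle_width / 2.0
    for x, y, z in future_poses:
        if z <= 0.1:                       # behind or too close to the camera
            continue
        # Project the left and right edges of the vehicle at this pose.
        for xe in (x - half, x + half):
            u, v, w = K @ np.array([xe, y, z])
            u, v = int(u / w), int(v / w)
            if 0 <= u < W and 0 <= v < H:
                mask[v, u] = 1
    return mask

K = np.array([[500., 0., 320.], [0., 500., 240.], [0., 0., 1.]])
poses = np.array([[0.0, 1.5, z] for z in np.linspace(2, 20, 50)])
labels = project_path_labels(poses, K, (480, 640))
print(labels.sum() > 0)   # True: some drivable pixels were labelled
```

A real system would rasterise the region between the two projected wheel tracks rather than individual points, but the source of supervision is the same: the robot's own motion.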

Page 23: Driving Autonomy - Intelligent Vehicles

Generalises across appearance changes…

[Barnes et al, “Find Your Own Way: Self-Supervised Segmentation of Path Proposals for Urban Autonomy”, ICRA, 2017]

Page 24: Driving Autonomy - Intelligent Vehicles

… and to new environments

[Barnes et al, “Find Your Own Way: Self-Supervised Segmentation of Path Proposals for Urban Autonomy”, ICRA, 2017]

Page 25: Driving Autonomy - Intelligent Vehicles

Self-Supervised Distraction Suppression
Dan Barnes, Will Maddern, Geoff Pascoe and Ingmar Posner

[Barnes et al, “Driven to Distraction: Self-Supervised Distractor Learning for Robust Monocular Visual Odometry in Urban Environments”, ICRA 2018]

Page 26: Driving Autonomy - Intelligent Vehicles

Deep Distraction Suppression
Learning where to look…

[Barnes et al, “Driven to Distraction: Self-Supervised Distractor Learning for Robust Monocular Visual Odometry in Urban Environments”, ICRA 2018]

Page 27: Driving Autonomy - Intelligent Vehicles

Deep Distraction Suppression
Learning where to look…

[Barnes et al, “Driven to Distraction: Self-Supervised Distractor Learning for Robust Monocular Visual Odometry in Urban Environments”, ICRA 2018]

Page 28: Driving Autonomy - Intelligent Vehicles

[Barnes et al, “Driven to Distraction: Self-Supervised Distractor Learning for Robust Monocular Visual Odometry in Urban Environments”, ICRA 2018]

Deep Distraction Suppression (& Mono VO): Overview

Training is entirely self-supervised. Deployed with monocular camera only.

Page 29: Driving Autonomy - Intelligent Vehicles

Deep Distraction Suppression: How do we do it?

[Barnes et al, “Driven to Distraction: Self-Supervised Distractor Learning for Robust Monocular Visual Odometry in Urban Environments”, ICRA 2018]

Page 30: Driving Autonomy - Intelligent Vehicles

Deep Distraction Suppression: Generating Training Data

[Barnes et al, “Driven to Distraction: Self-Supervised Distractor Learning for Robust Monocular Visual Odometry in Urban Environments”, ICRA 2018]
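One way to picture the training-data generation: warp a neighbouring frame into the current one using the known ego-motion, and flag pixels whose appearance disagrees with a static world as distractors. This is a simplified stand-in for the paper's pipeline, with an illustrative threshold and names of my own:

```python
import numpy as np

def ephemerality_mask(frame_t, frame_t1_warped, threshold=0.2):
    """Label pixels inconsistent with ego-motion as 'distractors'.
    frame_t1_warped is frame t+1 warped into frame t using known ego-motion
    and scene depth: for a static world the two frames should match, so a
    large photometric residual suggests a moving (distracting) object.
    Returns 1 where the pixel is likely a distractor.
    """
    residual = np.abs(frame_t.astype(float) - frame_t1_warped.astype(float))
    return (residual > threshold).astype(np.uint8)

static = np.full((4, 4), 0.5)
moving = static.copy()
moving[1:3, 1:3] = 0.9            # a moving object changes appearance
print(ephemerality_mask(static, moving))
```

Downstream, visual odometry simply down-weights feature matches falling on the masked pixels, which is what makes it robust to traffic-heavy scenes.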

Page 31: Driving Autonomy - Intelligent Vehicles

Deep Distraction Suppression: Deployment

[Barnes et al, “Driven to Distraction: Self-Supervised Distractor Learning for Robust Monocular Visual Odometry in Urban Environments”, ICRA 2018]

Page 32: Driving Autonomy - Intelligent Vehicles

Deep Distraction Suppression: Deployment

[Barnes et al, “Driven to Distraction: Self-Supervised Distractor Learning for Robust Monocular Visual Odometry in Urban Environments”, ICRA 2018]

Page 33: Driving Autonomy - Intelligent Vehicles

Deep Inverse Sensor Modelling in Radar
Rob Weston, Sarah Cen, Paul Newman, Ingmar Posner

[Weston et al, “Probably Unknown: Deep Inverse Sensor Modelling In Radar”, ICRA 2019]

Page 34: Driving Autonomy - Intelligent Vehicles

Using Radar for Autonomous Vehicles

[Figure: raw radar scan annotated with ghost reflections, saturation, speckle noise, other noise artefacts, and free vs. occluded regions around cars]

Why?
• Resilient to adverse weather conditions
• Long range

Challenges?
• Sensor noise
• Occlusion

[Weston et al, “Probably Unknown: Deep Inverse Sensor Modelling In Radar”, ICRA 2019]

Page 35: Driving Autonomy - Intelligent Vehicles

Project Proposal

Aim: Leverage the power of deep learning to determine, from raw radar, which space in the world is occupied.

Challenges: sensor noise, occlusion, training labels.

Solution: Treat radar as inherently uncertain to account for sensor noise, and train using partial labels generated from lidar.

[Weston et al, “Probably Unknown: Deep Inverse Sensor Modelling In Radar”, ICRA 2019]

Page 36: Driving Autonomy - Intelligent Vehicles

Our Approach
• Use a deep neural network
• Model heteroscedastic aleatoric uncertainty
• Train using partial lidar labels

[Figure: raw radar input → network (?) → desired probability-of-occupancy grid, supervised by lidar occupancy labels]

[Weston et al, “Probably Unknown: Deep Inverse Sensor Modelling In Radar”, ICRA 2019]
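The heteroscedastic loss can be sketched with the sampling trick popularised by Kendall and Gal: the network outputs a logit mean and a log-variance per grid cell, noisy logits are sampled and averaged into an occupancy probability, and cells the lidar never observed are masked out of the loss. A numpy sketch; the function names and toy values are mine:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def heteroscedastic_occupancy_loss(mu, log_var, labels, observed, n_samples=50):
    """Per-cell occupancy loss with learned (heteroscedastic) aleatoric noise.
    mu, log_var: predicted logit mean and log-variance per cell
    labels:      partial lidar occupancy labels in {0, 1}
    observed:    1 where lidar actually observed the cell, 0 elsewhere
    """
    sigma = np.exp(0.5 * log_var)
    noise = rng.standard_normal((n_samples,) + mu.shape)
    p = sigmoid(mu + sigma * noise).mean(axis=0)   # MC estimate of P(occupied)
    p = np.clip(p, 1e-7, 1 - 1e-7)
    bce = -(labels * np.log(p) + (1 - labels) * np.log(1 - p))
    # Unobserved cells carry no supervision signal at all.
    return (bce * observed).sum() / max(observed.sum(), 1)

mu       = np.array([[2.0, 0.0], [-2.0, 0.0]])
log_var  = np.zeros((2, 2))
labels   = np.array([[1.0, 1.0], [0.0, 0.0]])
observed = np.array([[1.0, 0.0], [1.0, 0.0]])    # right column: never seen by lidar
loss = heteroscedastic_occupancy_loss(mu, log_var, labels, observed)
print(loss > 0)   # True
```

Cells where the radar return is ambiguous can lower their loss by predicting a large variance, which is exactly the "probably unknown" behaviour the title refers to.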

Page 37: Driving Autonomy - Intelligent Vehicles

Training: Generate Lidar Labels

[Figure: grid labels derived from lidar, with cells marked free, occupied, or partially observed, and observed vs. unobserved regions]

[Weston et al, “Probably Unknown: Deep Inverse Sensor Modelling In Radar”, ICRA 2019]
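Generating the partial labels amounts to ray-marching each lidar beam through a grid: cells the beam passes through are free, the cell the beam terminates in is occupied, and everything else stays unobserved. A simplified 2D sketch with illustrative parameters:

```python
import numpy as np

def lidar_to_labels(ranges, angles, grid_size=20, cell=1.0, max_range=15.0):
    """Convert a 2D lidar scan into partial occupancy labels on a grid
    centred on the sensor: 0 = free, 1 = occupied, -1 = unobserved.
    Only observed cells later contribute to the training loss.
    Simplified 2D ray-marching; parameter values are illustrative.
    """
    labels = -np.ones((grid_size, grid_size))      # -1 = unobserved
    origin = grid_size // 2
    for r, a in zip(ranges, angles):
        # March along the beam: every cell crossed before the return is free.
        for s in np.arange(0.0, min(r, max_range), cell * 0.5):
            i = origin + int(round(s * np.cos(a) / cell))
            j = origin + int(round(s * np.sin(a) / cell))
            if 0 <= i < grid_size and 0 <= j < grid_size:
                labels[i, j] = 0.0
        if r < max_range:                          # beam hit something
            i = origin + int(round(r * np.cos(a) / cell))
            j = origin + int(round(r * np.sin(a) / cell))
            if 0 <= i < grid_size and 0 <= j < grid_size:
                labels[i, j] = 1.0
    return labels

labels = lidar_to_labels(ranges=[5.0], angles=[0.0])
print(labels[15, 10])   # 1.0: the cell the beam terminated in
```

Everything behind the return stays at -1, which is why the labels are only ever partial: the supervision honestly reflects what the lidar could and could not see.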

Page 38: Driving Autonomy - Intelligent Vehicles

Uncertainty: Knowing what we don’t know…

[Weston et al, “Probably Unknown: Deep Inverse Sensor Modelling In Radar”, ICRA 2019]

Page 39: Driving Autonomy - Intelligent Vehicles

Results
‣ Our model outperforms classical CFAR filtering approaches.

[Figure panels: raw radar, ground truth, our approach, 2D Cartesian CFAR, 1D polar CFAR]

[Weston et al, “Probably Unknown: Deep Inverse Sensor Modelling In Radar”, ICRA 2019]
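For context, the classical baseline works like this: a cell-averaging CFAR (constant false alarm rate) detector declares a cell occupied when its power exceeds a scaled average of neighbouring cells. A 1D sketch along a polar azimuth; guard, training, and scale values are illustrative, hand-tuned choices, which is precisely the weakness the learned model avoids:

```python
import numpy as np

def ca_cfar_1d(power, guard=2, train=8, scale=3.0):
    """1D cell-averaging CFAR: a cell is a detection if its power exceeds
    `scale` times the mean of the surrounding training cells, with a few
    guard cells around the cell under test excluded from that mean.
    """
    n = len(power)
    detections = np.zeros(n, dtype=bool)
    for i in range(n):
        lo = max(0, i - guard - train)
        hi = min(n, i + guard + train + 1)
        cells = np.r_[power[lo:max(0, i - guard)], power[i + guard + 1:hi]]
        if cells.size and power[i] > scale * cells.mean():
            detections[i] = True
    return detections

signal = np.ones(50) * 0.1       # speckle-like background
signal[25] = 2.0                 # one strong return
det = ca_cfar_1d(signal)
print(det[25], det[10])          # True False
```

CFAR has no notion of occlusion or of systematic artefacts like ghost reflections; it only thresholds local power, which is why a model trained on lidar-derived occupancy can beat it.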

Page 40: Driving Autonomy - Intelligent Vehicles

Use case: mapping

[Weston et al, “Probably Unknown: Deep Inverse Sensor Modelling In Radar”, ICRA 2019]

Page 41: Driving Autonomy - Intelligent Vehicles

Takeaways for deployable driving solutions…

• Machine learning does fit in (though maybe not in the ways we first thought).
• Systems engineering remains at the heart of deployable autonomous driving (at least for now).
• Leverage the system (by definition, it is all-knowing).
• Be clever about how you use your data.

Page 42: Driving Autonomy - Intelligent Vehicles

Maximum Entropy Deep IRL

[Wulfmeier et al, “Maximum Entropy Deep Inverse Reinforcement Learning”, arXiv, July 2015]
[Wulfmeier et al, “Watch This: Scalable Cost-Function Learning for Path Planning in Urban Environments”, IROS 2016]

• First to frame Maximum Entropy IRL in a deep learning context
• Efficient large-scale learning from demonstration…
  • backing out the observed reward structure
  • refining a human prior

Learning to act from lots of demonstration: learned where to drive from 25,000 real-world trajectories.
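The core of MaxEnt IRL is to fit a reward so that the expected state-visitation frequencies of the soft-optimal policy match those seen in demonstrations; the deep variant replaces the reward table with a CNN over map features but keeps the same gradient. A tabular toy sketch (a simplification using a stationary soft policy; sizes and names are mine):

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def maxent_irl(P, demo_svf, start, horizon, lr=0.5, iters=200):
    """Tabular MaxEnt IRL: gradient-ascend a per-state reward `r` until the
    expected state-visitation frequencies (SVF) of the soft-optimal policy
    match the demonstrations'. P: (S, A, S) transition tensor.
    """
    S, A, _ = P.shape
    r = np.zeros(S)
    for _ in range(iters):
        # Soft ("MaxEnt") value iteration over the finite horizon.
        V = np.zeros(S)
        for _ in range(horizon):
            Q = r[:, None] + (P * V[None, None, :]).sum(-1)
            V = np.log(np.exp(Q).sum(-1))
        policy = softmax(Q, axis=-1)                    # (S, A)
        # Forward pass: expected state-visitation frequencies.
        d = np.zeros(S); d[start] = 1.0
        svf = d.copy()
        for _ in range(horizon - 1):
            d = np.einsum("s,sa,sat->t", d, policy, P)
            svf += d
        r += lr * (demo_svf - svf)                      # the MaxEnt IRL gradient
    return r

# 3-state chain; action 0 moves left, action 1 moves right (clamped at ends).
P = np.zeros((3, 2, 3))
for s in range(3):
    P[s, 0, max(s - 1, 0)] = 1.0
    P[s, 1, min(s + 1, 2)] = 1.0
demo_svf = np.array([1.0, 1.0, 1.0])   # demos: 0 -> 1 -> 2, one visit each
r = maxent_irl(P, demo_svf, start=0, horizon=3)
print(r[2] > r[0])   # True: reward is pulled towards where the demos go
```

Scaling this to 25,000 real trajectories is exactly where the deep variant earns its keep: the CNN generalises the learned cost across map locations instead of memorising a table.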

Page 43: Driving Autonomy - Intelligent Vehicles

SQAIR: Generative Modelling of Moving Objects

[Kosiorek, Kim, Posner, Teh, “Sequential Attend, Infer, Repeat: Generative Modelling of Moving Objects”, NeurIPS 2018]

• Extends AIR to image sequences.

• Learns to detect and track objects without any supervision.

• Learns object motion models.

• Obtains a richer generative model by leveraging temporal information.

Code available on GitHub

Page 44: Driving Autonomy - Intelligent Vehicles

…is mobile autonomy