
Page 1: MAV Optical  Navigation Software Subsystem

MAV Optical Navigation Software Subsystem

October 28, 2011
Adrian Fletcher (CECS), Jacob Schreiver (ECE), Justin Clark (CECS), & Nathan Armentrout (ECE)
Sponsor: Dr. Adrian Lauf

1

Page 2: MAV Optical  Navigation Software Subsystem

A subset of Unmanned Aerial Vehicles (UAVs)
◦ Predator
◦ Raptor

Very small, maneuverable, and lightweight

MAV Categories
◦ Fixed-wing
◦ Rotary-wing
◦ Flapping-wing

Used for homeland & battlefield applications
◦ Surveillance
◦ Reconnaissance

Background – Micro Air Vehicles (MAVs)

2

Page 3: MAV Optical  Navigation Software Subsystem

Dr. Lauf is a new assistant professor in the CECS department, coming from Wright State University

His research is in embedded system design with applications to UAVs and MAVs
◦ Communications & Networking
◦ Controls
◦ Navigation
◦ Autonomous Flight
◦ Multi-Agent Systems

Background – Dr. Lauf

3
Courtesy of Dr. Lauf

Page 4: MAV Optical  Navigation Software Subsystem

Flapping-Wing MAV

Sensors are limited to
◦ Gyroscopes (MEMS)
◦ 3-Axis Accelerometers (MEMS)
◦ Monocular Camera with Transceiver Unit

Optical Navigation is necessary for autonomous operation

Background - Dr. Lauf’s MAVs

4
Courtesy of Dr. Lauf

Page 5: MAV Optical  Navigation Software Subsystem

Flapping-Wing MAV Example

5
Courtesy of Dr. Lauf

Page 6: MAV Optical  Navigation Software Subsystem

Develop an optical navigation software subsystem
◦ User-selected destination
◦ Semi-autonomous operation
◦ Adaptable for flapping-wing MAVs
◦ Operates in a closed, static environment
   Classroom with tables and chairs
   No moving objects

Purpose

6

Page 7: MAV Optical  Navigation Software Subsystem

Preflight operations
◦ Calibrate the camera
◦ Place the test rig in the room
◦ Start the optical navigation software
◦ Choose a destination

Mid-flight operations
◦ Move camera to simulate flight
◦ Follow suggested navigational output

Operational Concept

7

Page 8: MAV Optical  Navigation Software Subsystem

Requirements
◦ Communicate real-time navigation output
◦ Create 3D model of the environment
◦ Plan a path from current location to a selected destination
◦ Work in any closed, static environment

Restrictions
◦ Non-stereoscopic camera

System Requirements and Restrictions

8

Page 9: MAV Optical  Navigation Software Subsystem

Two major components
◦ Camera transceiver unit
◦ Computer with vision software

Connected via a 1.9 GHz RF channel

Hardware Architecture

Diagram: Camera Transceiver Unit <-> 1.9 GHz RF link <-> Computer

9

Page 10: MAV Optical  Navigation Software Subsystem

OpenCV
JavaCV
Netbeans 7.0.1 Integrated Development Environment (IDE)

Software Tools

10

Page 11: MAV Optical  Navigation Software Subsystem

OpenCV: an open source computer vision library originally developed by Intel Corporation
◦ Image Processing
◦ Object Recognition
◦ Machine Learning
◦ 3D Reconstruction

JavaCV: a wrapper for OpenCV
◦ Allows us to use OpenCV in a Java environment (see the frame-capture sketch after this slide)
◦ Includes added functionality

OpenCV with JavaCV

11
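A minimal sketch of what the JavaCV wrapper looks like in practice: grabbing frames from the camera transceiver's capture device and showing them in a window. Package names follow the current Bytedeco JavaCV release; the 2011-era builds used different package paths, so treat the import lines and class name as illustrative.

```java
import org.bytedeco.javacv.CanvasFrame;
import org.bytedeco.javacv.Frame;
import org.bytedeco.javacv.OpenCVFrameGrabber;

public class VideoFeedDemo {
    public static void main(String[] args) throws Exception {
        OpenCVFrameGrabber grabber = new OpenCVFrameGrabber(0); // default capture device
        grabber.start();
        CanvasFrame canvas = new CanvasFrame("MAV Video Feed");
        while (canvas.isVisible()) {
            Frame frame = grabber.grab();   // pull one video frame
            if (frame == null) break;
            canvas.showImage(frame);        // display it for debugging
        }
        grabber.stop();
        canvas.dispose();
    }
}
```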

Page 12: MAV Optical  Navigation Software Subsystem

Free, open source IDE
Supports multiple languages, including Java
Includes many developer helper functions
◦ GUI & Form Builder
◦ Software Debugger
◦ Unit Testing
◦ Code completion
◦ Integrated Subversion (SVN)

Netbeans 7.0.1

12

Page 13: MAV Optical  Navigation Software Subsystem

Software Algorithm

Block diagram of the software modules: Video Feed, Optical Correction, Object Discovery, Object Tracking & Recognition, Egomotion Estimation, 3D Reconstruction, Path Planning, and GUI, each marked as Working, In Progress, or Future.

13

Page 14: MAV Optical  Navigation Software Subsystem

Goal: Find a prominent object in view
Why: Need to initialize object tracking and learning
How: Use the “Snake” algorithm (a simplified contour-based sketch follows this slide)
◦ Based on active contour detection
◦ “Constricts” around strong contours

Object Discovery

14
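The Snake implementation lives in OpenCV's legacy C API and is not exposed in the stock Java bindings, so the sketch below substitutes a simpler strongest-contour heuristic (Canny edges plus largest contour) to illustrate the discovery step. It is not the team's Snake module; the class and method names are illustrative.

```java
import org.opencv.core.*;
import org.opencv.imgproc.Imgproc;
import java.util.ArrayList;
import java.util.List;

public class ObjectDiscovery {
    /** Returns the bounding box of the most prominent contour in a grayscale frame. */
    public static Rect discover(Mat gray) {
        Mat edges = new Mat();
        Imgproc.Canny(gray, edges, 50.0, 150.0);          // keep only strong contours
        List<MatOfPoint> contours = new ArrayList<>();
        Imgproc.findContours(edges, contours, new Mat(),
                Imgproc.RETR_EXTERNAL, Imgproc.CHAIN_APPROX_SIMPLE);
        Rect best = null;
        double bestArea = 0.0;
        for (MatOfPoint c : contours) {
            double area = Imgproc.contourArea(c);
            if (area > bestArea) {                         // keep the largest region
                bestArea = area;
                best = Imgproc.boundingRect(c);
            }
        }
        return best;                                       // null if nothing was found
    }
}
```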

Page 15: MAV Optical  Navigation Software Subsystem

Snake (Active Contour) Demo

15

Page 16: MAV Optical  Navigation Software Subsystem

Goal: Provide short-term tracking capability and verify that the object seen during the learning phase is the same object
Why: Assist the long-term (learning) tracker
How: (see the tracking sketch after this slide)
◦ Lucas-Kanade optical flow algorithm
   Uses scattered points on the object to track motion
◦ CamShift algorithm
   Reduces the picture's color information and calculates color histograms

Object Tracking

16
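A hedged sketch of the two short-term trackers named above, written against the stock OpenCV Java bindings rather than the team's JavaCV wrapper (the call names differ slightly between the two). The helper names, histogram, and search window arguments are illustrative.

```java
import org.opencv.core.*;
import org.opencv.imgproc.Imgproc;
import org.opencv.video.Video;

public class ShortTermTracking {

    /** Lucas-Kanade: move sparse points from the previous frame into the current one. */
    public static MatOfPoint2f trackLK(Mat prevGray, Mat currGray, MatOfPoint2f prevPts) {
        MatOfPoint2f nextPts = new MatOfPoint2f();
        MatOfByte status = new MatOfByte();   // 1 where the point was found again
        MatOfFloat err = new MatOfFloat();
        Video.calcOpticalFlowPyrLK(prevGray, currGray, prevPts, nextPts, status, err);
        return nextPts;
    }

    /** CamShift: update the object window from a hue-histogram back projection. */
    public static RotatedRect trackCamShift(Mat hsvFrame, Mat hueHistogram, Rect window) {
        Mat backProj = new Mat();
        Imgproc.calcBackProject(java.util.Arrays.asList(hsvFrame),
                new MatOfInt(0), hueHistogram, backProj, new MatOfFloat(0f, 180f), 1.0);
        TermCriteria criteria =
                new TermCriteria(TermCriteria.EPS + TermCriteria.COUNT, 10, 1.0);
        return Video.CamShift(backProj, window, criteria);
    }
}
```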

Page 17: MAV Optical  Navigation Software Subsystem

Lucas-Kanade Tracker Demo

17

Page 18: MAV Optical  Navigation Software Subsystem

CamShift Tracker Demo

18

Page 19: MAV Optical  Navigation Software Subsystem

Goal: Establish a model for an object during the learning phase

Why:
◦ Recover from object occlusion
◦ Provide a basis for egomotion (camera motion)

How: (a feature-matching sketch follows this slide)
◦ SURF algorithm
◦ Haar-like features
◦ Machine learning

Object Recognition

19
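SURF ships in OpenCV's non-free xfeatures2d module, so the recognition sketch below substitutes ORB, which follows the same detect/describe/match pattern and is available in the stock Java bindings. The class and its methods are illustrative, not the team's recognizer.

```java
import org.opencv.core.*;
import org.opencv.features2d.DescriptorMatcher;
import org.opencv.features2d.ORB;

public class ObjectRecognition {
    private final ORB orb = ORB.create();
    private final DescriptorMatcher matcher =
            DescriptorMatcher.create(DescriptorMatcher.BRUTEFORCE_HAMMING);
    private final Mat modelDescriptors = new Mat();

    /** Learn a descriptor model for the tracked object (the "learning phase"). */
    public void learn(Mat objectImage) {
        MatOfKeyPoint keypoints = new MatOfKeyPoint();
        orb.detectAndCompute(objectImage, new Mat(), keypoints, modelDescriptors);
    }

    /** Count how many model features reappear in the current frame. */
    public int matchCount(Mat frame) {
        MatOfKeyPoint keypoints = new MatOfKeyPoint();
        Mat descriptors = new Mat();
        orb.detectAndCompute(frame, new Mat(), keypoints, descriptors);
        MatOfDMatch matches = new MatOfDMatch();
        matcher.match(modelDescriptors, descriptors, matches);
        return matches.toArray().length;   // a higher count means the object was likely re-found
    }
}
```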

Page 20: MAV Optical  Navigation Software Subsystem

SURF Object Recognition Demo

20

Page 21: MAV Optical  Navigation Software Subsystem

Goal: Establish no-fly zones for the current environment

Why:
◦ Collision avoidance
◦ Path planning
◦ Data visualization

How: Egomotion recovery with stereo vision techniques (see the sketch after this slide)

3D Reconstruction

21
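A sketch of monocular egomotion recovery with two-view ("stereo-style") geometry: matched points from two frames yield the camera rotation R and translation direction t. It uses findEssentialMat and recoverPose from the modern OpenCV Java bindings, which were not in the 2011-era release the team used, so treat it as an illustration of the idea rather than the project's code.

```java
import org.opencv.calib3d.Calib3d;
import org.opencv.core.Mat;
import org.opencv.core.MatOfPoint2f;

public class Egomotion {
    /** Recovers relative camera motion between two frames from matched image points. */
    public static Mat[] recover(MatOfPoint2f ptsPrev, MatOfPoint2f ptsCurr, Mat cameraMatrix) {
        Mat essential = Calib3d.findEssentialMat(
                ptsPrev, ptsCurr, cameraMatrix, Calib3d.RANSAC, 0.999, 1.0);
        Mat R = new Mat();   // 3x3 rotation between the two camera poses
        Mat t = new Mat();   // translation direction (scale is unknown for monocular video)
        Calib3d.recoverPose(essential, ptsPrev, ptsCurr, cameraMatrix, R, t);
        return new Mat[] { R, t };
    }
}
```

With R and t for each frame pair, the same matched points can be triangulated into 3D landmarks that mark the no-fly zones.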

Page 22: MAV Optical  Navigation Software Subsystem

Goal: Provide navigational output to the user
Why: Builds a framework for autonomous navigation
How:
◦ Modified navigation algorithms (an illustrative grid-planner sketch follows this slide)

Path Planning

22
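The slides only say "modified navigation algorithms", so the planner below is an assumption: a breadth-first search over an occupancy grid whose blocked cells stand in for the no-fly zones from 3D reconstruction.

```java
import java.util.ArrayDeque;
import java.util.Arrays;

public class PathPlanner {
    /** Returns step counts to the destination over a grid; true cells are no-fly zones. */
    public static int[][] distanceMap(boolean[][] blocked, int destRow, int destCol) {
        int rows = blocked.length, cols = blocked[0].length;
        int[][] dist = new int[rows][cols];
        for (int[] row : dist) Arrays.fill(row, Integer.MAX_VALUE);
        dist[destRow][destCol] = 0;
        ArrayDeque<int[]> queue = new ArrayDeque<>();
        queue.add(new int[] { destRow, destCol });
        int[][] moves = { {1, 0}, {-1, 0}, {0, 1}, {0, -1} };
        while (!queue.isEmpty()) {
            int[] cell = queue.poll();
            for (int[] m : moves) {
                int r = cell[0] + m[0], c = cell[1] + m[1];
                if (r >= 0 && r < rows && c >= 0 && c < cols
                        && !blocked[r][c] && dist[r][c] == Integer.MAX_VALUE) {
                    dist[r][c] = dist[cell[0]][cell[1]] + 1;   // one step farther from the goal
                    queue.add(new int[] { r, c });
                }
            }
        }
        return dist;   // from any cell, the suggested move is toward the smallest neighbor value
    }
}
```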

Page 23: MAV Optical  Navigation Software Subsystem

Goal: Provide data visualization and user input capability

Why:
◦ Destination selection
◦ Navigational output
◦ Internal troubleshooting

How:
◦ Netbeans GUI builder (a minimal Swing sketch follows this slide)

Graphical User Interface (GUI)

23
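A minimal hand-written Swing sketch of the kind of form the Netbeans GUI builder generates: a destination selector and a label for the suggested navigational output. The widget names and the destination list are illustrative, not taken from the team's GUI.

```java
import javax.swing.*;
import java.awt.BorderLayout;

public class NavigationGui {
    public static void main(String[] args) {
        SwingUtilities.invokeLater(() -> {
            JFrame frame = new JFrame("MAV Optical Navigation");
            JComboBox<String> destination =
                    new JComboBox<>(new String[] { "Doorway", "Far corner", "Table 1" });
            JLabel suggestion = new JLabel("Suggested heading: --");
            destination.addActionListener(e ->
                    suggestion.setText("Suggested heading: toward " + destination.getSelectedItem()));
            frame.setLayout(new BorderLayout());
            frame.add(destination, BorderLayout.NORTH);   // user picks the destination here
            frame.add(suggestion, BorderLayout.CENTER);   // navigational output shown here
            frame.setSize(320, 120);
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            frame.setVisible(true);
        });
    }
}
```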

Page 24: MAV Optical  Navigation Software Subsystem

GUI Representation

24

Page 25: MAV Optical  Navigation Software Subsystem

Applications
◦ Camera calibration (see the calibration sketch after this slide)
◦ Verification of egomotion estimation

Camera Calibration & Test Rig

25
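A standard chessboard calibration sketch using the stock OpenCV Java bindings; the 9x6 corner pattern and 25 mm square size are placeholders for whatever the actual test rig uses.

```java
import org.opencv.calib3d.Calib3d;
import org.opencv.core.*;
import java.util.ArrayList;
import java.util.List;

public class CameraCalibration {
    /** Calibrates from grayscale chessboard views; returns the RMS reprojection error. */
    public static double calibrate(List<Mat> grayBoardImages, Size imageSize) {
        Size pattern = new Size(9, 6);                  // inner corners on the board (assumed)
        MatOfPoint3f boardModel = new MatOfPoint3f();   // ideal corner positions, z = 0
        List<Point3> corners3d = new ArrayList<>();
        for (int r = 0; r < 6; r++)
            for (int c = 0; c < 9; c++)
                corners3d.add(new Point3(c * 25.0, r * 25.0, 0));   // 25 mm squares, assumed
        boardModel.fromList(corners3d);

        List<Mat> objectPoints = new ArrayList<>();
        List<Mat> imagePoints = new ArrayList<>();
        for (Mat gray : grayBoardImages) {
            MatOfPoint2f corners = new MatOfPoint2f();
            if (Calib3d.findChessboardCorners(gray, pattern, corners)) {
                objectPoints.add(boardModel);
                imagePoints.add(corners);
            }
        }
        Mat cameraMatrix = new Mat();
        Mat distCoeffs = new Mat();
        List<Mat> rvecs = new ArrayList<>(), tvecs = new ArrayList<>();
        // cameraMatrix and distCoeffs feed the optical-correction step of the pipeline.
        return Calib3d.calibrateCamera(objectPoints, imagePoints, imageSize,
                cameraMatrix, distCoeffs, rvecs, tvecs);
    }
}
```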

Page 26: MAV Optical  Navigation Software Subsystem

Integrated JavaCV & OpenCV with Netbeans 7.0.1 IDE

Interfaced with a variety of cameras
Camera calibration & test rig built

Completed Tasks

26

Status diagram repeated from the Software Algorithm slide, with each module (Video Feed, Optical Correction, Object Discovery, Object Tracking & Recognition, Egomotion Estimation, 3D Reconstruction, Path Planning, GUI) marked as Working, In Progress, or Future.

Page 27: MAV Optical  Navigation Software Subsystem

Module integration
◦ Object recognition
◦ Object tracking
◦ Machine learning

3D Reconstruction
◦ Obtain depth perception
◦ Egomotion & stereo techniques

Destination selection

Path planning

Improved Graphical User Interface (GUI)

Future Work

27

Page 28: MAV Optical  Navigation Software Subsystem

Questions?

28

Page 29: MAV Optical  Navigation Software Subsystem

Adrian P. Lauf, P. George Huang
Wright State University Center for Micro Aerial Vehicle Studies (CMAVS)

Guidance and Control

On-board Hardware
• Each MAV (Micro Aerial Vehicle) equipped with on-board computing module
• Guidance and Inertial Navigation Assistant (GINA)
• Based on schematics developed at UC Berkeley's WarpWing project
• Modified to reduce weight and remove unneeded components
• Onboard processing allows for vehicle stability in flight
• Integrated IEEE 802.15.4 radio protocol permits two-way radio communications
   • Radio telemetry
   • External commands
   • Video image capture and transmission
• Without modification, GINA 2.1 weighs over 2.2 grams
• Development will target a weight of 1.5 grams or less

Local Control Loops
• MEMS-based gyroscopes onboard GINA provide information about the aircraft's stability
• Simple PID control can be used to keep aircraft level and stable (see the sketch below)
• Filtering functions can mitigate hysteresis caused by wing motion and control surface actuators
• Onboard microprocessor is capable of handling these high-rate, low-complexity tasks
• Feedback from PID control can be sent off-board for processing via 802.15.4 radios
• Actuator control can be directly handled by the microprocessor; inputs to the system from external sources do not directly actuate control surfaces
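An illustrative PID controller of the kind described above for keeping the aircraft level from gyroscope feedback; the gains and time-step handling are assumptions, not values from the GINA firmware.

```java
public class LevelingPid {
    private final double kp, ki, kd;            // proportional, integral, derivative gains
    private double integral = 0.0, previousError = 0.0;

    public LevelingPid(double kp, double ki, double kd) {
        this.kp = kp; this.ki = ki; this.kd = kd;
    }

    /** One control step: error is the desired roll minus the gyro-estimated roll (radians). */
    public double update(double error, double dtSeconds) {
        integral += error * dtSeconds;
        double derivative = (error - previousError) / dtSeconds;
        previousError = error;
        return kp * error + ki * integral + kd * derivative;   // actuator command
    }
}
```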

Off-board Control
• Unlike traditional UAVs, MAVs have limited power and computational resources
• Qualify as deeply-embedded systems
• Weight restrictions are primary obstacle for onboard processing systems
• In some cases, aircraft weigh less than 7 grams
• The need for autonomy requires the integration of on-board and off-board processing and guidance capabilities
• This hybrid schema permits computationally-intensive operations to run without weight restrictions
• Various sensor inputs can be used to aid local and global navigation objectives
   • Video camera images
   • MEMS gyroscopes
   • Other heterogeneous mounted sensors
• Off-line image analysis permits identification of navigation objectives and obstacles
• Frame-to-frame analysis allows the system to construct a model of its environment and surroundings
• Information contained in the world-model can be used to make navigation decisions
• Multiple-aircraft implementations can more quickly and accurately build the world-model
   • Permits joint and distributed operation in an unknown scenario
   • Allows distributed agents to augment the accuracy of existing models
• Commands issued as a result of image analysis can be used as inputs into the PID control navigation routines onboard the aircraft

Figure captions: an airframe and drivetrain example of a CMAVS flapping-wing aircraft; existing receivers and actuators; hybrid-mode autonomous navigation for MAV platforms; gyroscope output from a GINA module; a base-station mote used for the off-board computer.