
Page 1: A Distributed Processing Architecture for Vision Based Domestic Robot Navigation

A Distributed Processing Architecture for Vision Based Domestic Robot Navigation

Marcel-Titus Marginean and Chao Lu

Computer & Information Sciences, Towson University

11/1/2013, ICCCS

Page 2: A Distributed Processing Architecture for Vision Based Domestic Robot Navigation

Highlights:

➢Distributed architecture for indoor robot navigation
➢On-board and external computer vision
➢Communication protocol for cooperative localization and mapping
➢Distributed processing and decision making

Page 3: A Distributed Processing Architecture for Vision Based Domestic Robot Navigation

Rationale:

➢The aging population is growing and requires assistance
➢Robots can help with domestic tasks
➢Enable independent living instead of institutionalization
➢Allow the elderly to live in their own homes while their health status is monitored and assistance is provided

Page 4: A Distributed Processing Architecture for Vision Based Domestic Robot Navigation

Rationale (cont.):

➢Houses already have networks and surveillance cameras
➢Vision processing is very CPU- and memory-intensive
➢Energy-efficient embedded computers on robots are still limited in resources
➢Redundancy provides fault/error tolerance

Page 5: A Distributed Processing Architecture for Vision Based Domestic Robot Navigation

Rationale (cont.):

➢Computer vision is the most promising technology for robot navigation
➢We employ helper technologies to ease the load
➢Innate/prior knowledge about the environment should be used to reduce the scope of the problem

Page 6: A Distributed Processing Architecture for Vision Based Domestic Robot Navigation

Previous work:

➢Mehdi et al. aided navigation with ultrasonic sensors and RFID tags
➢Souza and Goncalves used stereo vision for mapping
➢Fernandez et al. placed artificial landmarks on the ceiling and used an upward-looking camera on the robot
➢At Cluj-Napoca, a laser beam was used to detect dynamic obstacles

Page 7: A Distributed Processing Architecture for Vision Based Domestic Robot Navigation

Previous work (cont.):

➢Pizarro et al. used a rig of calibrated and synchronized cameras
➢Chakravarty and Jarvis also assisted mobile robot navigation with external cameras

Page 8: A Distributed Processing Architecture for Vision Based Domestic Robot Navigation

Overview

Base Station – One or more general-purpose computers

Robot – Embedded system + camera + inertial unit

Network – Typical house Wi-Fi + wired network

Fixed Cameras – IP cameras, wired or Wi-Fi

Engineering Console – Laptop used for development and testing

Page 9: A Distributed Processing Architecture for Vision Based Domestic Robot Navigation

Epipolar Geometry:

➢Two or more cameras observe the same scene from different positions and orientations
➢The projections of a point in space onto the two image planes are related by an equation involving the Essential Matrix
➢At least 8 pairs of matching points must be identified in order to compute the Essential Matrix
➢The pose and relative position of the cameras can be recovered from the Essential Matrix using Singular Value Decomposition (SVD), as sketched below
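For illustration, here is a minimal sketch of that recovery step using OpenCV's C++ API (the slides describe the method, not code). The function name, the matched point lists and the intrinsic matrix K are assumptions, and the cheirality check that resolves the fourfold (R, t) ambiguity is only indicated in a comment.

```cpp
// Sketch: relative camera pose from >= 8 matched points (illustrative only).
#include <opencv2/opencv.hpp>
#include <vector>

void recoverRelativePose(const std::vector<cv::Point2f>& ptsRobotCam,
                         const std::vector<cv::Point2f>& ptsWallCam,
                         const cv::Mat& K,   // 3x3 intrinsic matrix (CV_64F), from calibration
                         cv::Mat& R, cv::Mat& t)
{
    // Fundamental matrix via the 8-point algorithm.
    cv::Mat F = cv::findFundamentalMat(ptsRobotCam, ptsWallCam, cv::FM_8POINT);

    // Essential matrix for calibrated cameras: E = K^T * F * K.
    cv::Mat E = K.t() * F * K;

    // SVD of E yields four (R, t) candidates; the valid one is selected by
    // checking that triangulated points lie in front of both cameras.
    cv::SVD svd(E);
    cv::Mat W = (cv::Mat_<double>(3, 3) << 0, -1, 0,
                                           1,  0, 0,
                                           0,  0, 1);
    R = svd.u * W * svd.vt;        // one rotation candidate (check det(R) = +1)
    t = svd.u.col(2).clone();      // translation direction, known only up to scale
}
```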

Page 10: A Distributed Processing Architecture for Vision Based Domestic Robot Navigation

Epipolar Geometry:

➢One camera can be the camera on the robot, and the other a camera mounted on the wall
➢Can be used either to calculate the robot's position with respect to the fixed camera, or to accurately map objects in the environment
➢Susceptible to failure if the difference in pose/position is too large, or if similar patterns located in different places are encountered

Page 11: A Distributed Processing Architecture for Vision Based Domestic Robot Navigation

Object Tracking:

➢Fixed cameras oversee the scene and can map the movement of the robot
➢A Gaussian Mixture Model is used for background subtraction in order to detect moving (or moved) objects with respect to the fixed background
➢Each moving object is described by a status vector containing id, position, velocity and the confidence of the measurement (see the sketch below)
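A minimal sketch of this pipeline, assuming OpenCV 2.4's Gaussian-mixture background subtractor and a hypothetical status-vector layout, is shown below; the field names, thresholds and the data-association step are illustrative, not the paper's actual code.

```cpp
// Sketch: GMM background subtraction on a fixed camera, producing one
// status vector per detected blob (OpenCV 2.4-style API).
#include <opencv2/opencv.hpp>
#include <vector>

struct TrackStatus {       // illustrative status vector sent to SAM
    int   id;
    float x, y;            // image-plane position of the blob centroid
    float vx, vy;          // estimated velocity (pixels/frame)
    float confidence;      // confidence of the measurement, 0..1
};

int main()
{
    cv::VideoCapture cap(0);                  // fixed camera stream
    cv::BackgroundSubtractorMOG2 mog2;        // Gaussian Mixture Model
    cv::Mat frame, fgMask;

    while (cap.read(frame)) {
        mog2(frame, fgMask);                                  // update model, get foreground
        cv::threshold(fgMask, fgMask, 200, 255, CV_THRESH_BINARY); // drop shadow pixels
        cv::erode(fgMask, fgMask, cv::Mat());                 // suppress small noise
        cv::dilate(fgMask, fgMask, cv::Mat());

        std::vector<std::vector<cv::Point> > contours;
        cv::findContours(fgMask, contours, CV_RETR_EXTERNAL, CV_CHAIN_APPROX_SIMPLE);

        for (size_t i = 0; i < contours.size(); ++i) {
            cv::Rect box = cv::boundingRect(contours[i]);
            TrackStatus s = { (int)i,
                              box.x + box.width / 2.0f, box.y + box.height / 2.0f,
                              0.f, 0.f, 1.f };
            // ... associate with the previous frame, estimate velocity, send to SAM
            (void)s;
        }
    }
    return 0;
}
```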

Page 12: A Distributed Processing Architecture for Vision Based Domestic Robot Navigation

Optical Flow Navigation:

➢Used by the on-board (robot) computer to detect potential collisions and for reactive navigation when outside the view of the fixed cameras
➢The optical flow field is a velocity field representing the projection onto the image plane of the motion of objects in 3D space
➢Can be used to estimate the distance from the moving robot to obstacles ahead, or to maintain distance from walls when navigating a hallway (see the sketch below)
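Below is a minimal sketch of a dense optical-flow loop using OpenCV's Farneback method; the central-window expansion heuristic and its threshold are illustrative assumptions, not the reactive controller used on the robot.

```cpp
// Sketch: dense optical flow between consecutive frames (Farneback method)
// with a simple "approaching obstacle" heuristic for illustration.
#include <opencv2/opencv.hpp>
#include <vector>

int main()
{
    cv::VideoCapture cap(0);
    cv::Mat frame, grayPrev, grayCur, flow;

    if (!cap.read(frame)) return -1;
    cv::cvtColor(frame, grayPrev, CV_BGR2GRAY);

    while (cap.read(frame)) {
        cv::cvtColor(frame, grayCur, CV_BGR2GRAY);
        cv::calcOpticalFlowFarneback(grayPrev, grayCur, flow,
                                     0.5, 3, 15, 3, 5, 1.2, 0);

        // Heuristic: large flow magnitudes in the image center suggest an
        // obstacle that the robot is closing in on.
        cv::Rect center(flow.cols / 4, flow.rows / 4, flow.cols / 2, flow.rows / 2);
        std::vector<cv::Mat> xy;
        cv::split(flow(center), xy);
        cv::Mat mag;
        cv::magnitude(xy[0], xy[1], mag);
        if (cv::mean(mag)[0] > 4.0) {       // assumed threshold, pixels/frame
            // ... signal the reactive layer to slow down or steer away
        }
        grayPrev = grayCur.clone();
    }
    return 0;
}
```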

Page 13: A Distributed Processing Architecture for Vision Based Domestic Robot Navigation

Fixed Infrastructure:

➢Wall-mounted DCS-930L and DCS-932L IP cameras located near the ceiling, overlooking the room
➢Typical house network with an 802.11n Wi-Fi router providing 10/100 wired Ethernet
➢A pair of computers running Mageia Linux, connected to the router with wired Ethernet; called the Base Station, they are used for video processing, model/map building, object tracking and mission planning
➢Object recognition is planned for future research and will also take place on the Base Station

Page 14: A Distributed Processing Architecture for Vision Based Domestic Robot Navigation

Mobile Infrastructure:

➢Mobile platform with Ackermann steering
➢BeagleBoard-xM embedded computer running Debian ARM Linux, connected to the network with a USB Wi-Fi dongle
➢LI-5M03 camera board connected directly to the BeagleBoard camera bus
➢Inertial measurement unit with an ADXL345 accelerometer and an L3G4200 MEMS gyroscope
➢Additional circuitry
➢For future research we plan to explore Adapteva's Parallella board to add extra processing power

Page 15: A Distributed Processing Architecture for Vision Based Domestic Robot Navigation

Video Capture Hardware:

Wall-mounted DCS-932L Wi-Fi IP camera (photo); BeagleBoard-xM with LI-5M03 camera on test bench (photo)

Page 16: A Distributed Processing Architecture for Vision Based Domestic Robot Navigation

Software Development Environment:

➢OpenCV 2.4.5 for image processing
➢Qt 4.8.5 and OpenGL libraries for GUI development
➢C++ programming language, gcc 4.7.2
➢Eclipse CDT and Qt Creator as IDEs
➢Mageia Linux desktop

Page 17: A Distributed Processing Architecture for Vision Based Domestic Robot Navigation

Software Architecture:

➢Modular architecture using Active Objects
➢Asynchronous message-passing protocol
➢Message structure designed to minimize network bandwidth usage
➢Most image processing is localized within each module; large data is sent between modules only "as needed", upon request
➢The communication infrastructure API abstracts the location of modules
➢Each module is an Active Object with at least two threads (communication and main processing), as sketched below
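One possible shape of such an Active Object is sketched below, assuming C++11 threads; the Message layout, class name and queueing details are illustrative rather than the paper's implementation.

```cpp
// Sketch of the Active Object pattern: each module owns a message queue and a
// processing thread, so senders (the communication thread) never block on work.
#include <condition_variable>
#include <mutex>
#include <queue>
#include <string>
#include <thread>

struct Message {                 // illustrative message layout
    int         type;            // e.g. TRACKING_VECTOR, IMAGE_REQUEST, ...
    std::string payload;         // kept small; large images only on request
};

class ActiveModule {
public:
    ActiveModule() : running_(false) {}
    virtual ~ActiveModule() { stop(); }   // derived classes should call stop() themselves

    void start() { running_ = true; worker_ = std::thread(&ActiveModule::run, this); }

    void stop() {
        { std::lock_guard<std::mutex> lk(m_); running_ = false; }
        cv_.notify_one();
        if (worker_.joinable()) worker_.join();
    }

    // Called by the communication thread: enqueue and return immediately.
    void post(const Message& msg) {
        { std::lock_guard<std::mutex> lk(m_); queue_.push(msg); }
        cv_.notify_one();
    }

protected:
    virtual void handle(const Message& msg) = 0;   // CM/SAM/RM/ARM specific work

private:
    void run() {                                   // main processing thread
        for (;;) {
            std::unique_lock<std::mutex> lk(m_);
            cv_.wait(lk, [this] { return !queue_.empty() || !running_; });
            if (!running_ && queue_.empty()) return;
            Message msg = queue_.front(); queue_.pop();
            lk.unlock();
            handle(msg);
        }
    }

    std::queue<Message>     queue_;
    std::mutex              m_;
    std::condition_variable cv_;
    bool                    running_;
    std::thread             worker_;
};
```

Each concrete module (CM, SAM, RM, ARM) would override handle() and call stop() from its own destructor before the base class is torn down.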

Page 18: A Distributed Processing Architecture for Vision Based Domestic Robot Navigation

Software Modules

CM – Camera Module

SAM – Situation Awareness Module

RM – Robot Module

ARM – Autonomous Robot Module

Page 19: A Distributed Processing Architecture for Vision Based Domestic Robot Navigation

Camera Module:

➢One CM for each fixed camera: capture, pre-processing, blob tracking
➢Sends periodic Blob Tracking Vectors to SAM
➢Upon request, sends whole images or sub-images for analysis by other modules

Page 20: A Distributed Processing Architecture for Vision Based Domestic Robot Navigation

Situation Awareness Module:

➢Maintains a live map of the environment, keeping track of people, objects and robots
➢Receives periodic tracking vectors from CMs and RMs, matches blobs with robots, and provides robot tracking info
➢Future developments may include object recognition and maintaining a database for the recognition task

Page 21: A Distributed Processing Architecture for Vision Based Domestic Robot Navigation

Robot Module:

➢Robot path planning and mission control
➢Uses epipolar geometry to map objects or estimate the robot pose by requesting images from both ARM and CM
➢Translates tracking information from SAM's global coordinate system into the robot's local coordinate system (see the sketch below)
➢Future research directions may include landmark tracking
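As a small illustration of that coordinate translation, the sketch below converts a point from the global frame into the robot's frame given a 2D pose (x, y, heading); the type names and the planar pose representation are assumptions.

```cpp
// Sketch: translating a point from SAM's global frame into the robot's local
// frame, given the robot pose reported by the tracker.
#include <cmath>

struct Pose2D  { double x, y, theta; };   // robot pose in the global frame
struct Point2D { double x, y; };

Point2D globalToRobot(const Point2D& g, const Pose2D& robot)
{
    // Translate so the robot is at the origin, then rotate by -theta.
    double dx = g.x - robot.x;
    double dy = g.y - robot.y;
    Point2D local;
    local.x =  std::cos(robot.theta) * dx + std::sin(robot.theta) * dy;
    local.y = -std::sin(robot.theta) * dx + std::cos(robot.theta) * dy;
    return local;
}
```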

Page 22: A Distributed Processing Architecture for Vision Based Domestic Robot Navigation

Autonomous Robot Module:

➢Optical Flow (OF) processing and reactive OF navigation
➢PID controller to maintain the required trajectory (see the sketch below)
➢Honors requests from RM for (sub-)images
➢Able to temporarily overrule RM commands if optical flow detects a high potential for collision
➢Future research directions may include more processing power to enable true autonomy, plus more sensors and actuators for "eye-hand coordination"
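A minimal sketch of such a PID controller is given below; the gains and the choice of cross-track error as the input signal are assumptions for illustration.

```cpp
// Sketch: basic PID controller for trajectory keeping on the ARM.
struct PID {
    double kp, ki, kd;       // tuned gains (illustrative)
    double integral;
    double prevError;

    PID(double p, double i, double d)
        : kp(p), ki(i), kd(d), integral(0.0), prevError(0.0) {}

    // error: e.g. lateral deviation from the planned path (from SAM tracking)
    // dt:    time step in seconds (assumed > 0); returns a steering correction
    double update(double error, double dt) {
        integral += error * dt;
        double derivative = (error - prevError) / dt;
        prevError = error;
        return kp * error + ki * integral + kd * derivative;
    }
};
```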

Page 23: A Distributed Processing Architecture for Vision Based Domestic Robot Navigation

Typical navigation scenario:

➢RM queries SAM for a map
➢RM uses Dijkstra's algorithm to find a path (see the sketch below)
➢RM downloads navigation instructions to ARM
➢CMs keep broadcasting blob positions to SAM
➢SAM provides real-time tracking information to RM
➢ARM uses a PID controller to navigate the path, using the tracking info from SAM as feedback
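For reference, here is a sketch of Dijkstra's algorithm over a 4-connected occupancy grid, as RM might run it on the map obtained from SAM; the grid representation and unit step costs are assumptions, and path reconstruction is omitted for brevity.

```cpp
// Sketch: Dijkstra over a 4-connected occupancy grid.
#include <functional>
#include <limits>
#include <queue>
#include <utility>
#include <vector>

// grid[r][c] == true means the cell is blocked; returns the cost of the
// shortest path from start to goal, or -1 if the goal is unreachable.
int dijkstra(const std::vector<std::vector<bool> >& grid,
             std::pair<int,int> start, std::pair<int,int> goal)
{
    const int rows = grid.size(), cols = grid[0].size();
    const int INF = std::numeric_limits<int>::max();
    std::vector<std::vector<int> > dist(rows, std::vector<int>(cols, INF));

    typedef std::pair<int, std::pair<int,int> > Node;   // (cost, (row, col))
    std::priority_queue<Node, std::vector<Node>, std::greater<Node> > pq;

    dist[start.first][start.second] = 0;
    pq.push(Node(0, start));

    const int dr[4] = {-1, 1, 0, 0};
    const int dc[4] = { 0, 0, -1, 1};

    while (!pq.empty()) {
        Node top = pq.top(); pq.pop();
        int cost = top.first, r = top.second.first, c = top.second.second;
        if (cost > dist[r][c]) continue;                 // stale queue entry
        if (r == goal.first && c == goal.second) return cost;
        for (int k = 0; k < 4; ++k) {
            int nr = r + dr[k], nc = c + dc[k];
            if (nr < 0 || nr >= rows || nc < 0 || nc >= cols) continue;
            if (grid[nr][nc]) continue;                  // blocked cell
            if (cost + 1 < dist[nr][nc]) {
                dist[nr][nc] = cost + 1;
                pq.push(Node(cost + 1, std::make_pair(nr, nc)));
            }
        }
    }
    return -1;
}
```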

Page 24: A Distributed Processing Architecture for Vision Based Domestic Robot Navigation

Typical navigation scenario (cont.):

➢The robot encounters an obstacle unknown to SAM
➢Optical flow on the ARM detects it as an obstacle
➢ARM sends the obstacle info to RM
➢RM requests an image from a CM and from ARM
➢RM maps the object using epipolar geometry
➢RM sends the information to SAM to update the occupancy grid (see the sketch below)
➢Navigation restarts with new path planning
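A minimal sketch of such an occupancy-grid update is shown below; the grid layout, resolution handling and function names are assumptions rather than SAM's actual data structures.

```cpp
// Sketch: marking an obstacle mapped by RM (via epipolar geometry) into
// SAM's occupancy grid, so the next planning pass can avoid it.
#include <vector>

struct OccupancyGrid {
    double originX, originY;                  // world coordinates of cell (0,0)
    double resolution;                        // meters per cell
    std::vector<std::vector<bool> > blocked;  // true = occupied

    // Mark the cell containing world point (wx, wy) as occupied.
    void markObstacle(double wx, double wy) {
        int c = static_cast<int>((wx - originX) / resolution);
        int r = static_cast<int>((wy - originY) / resolution);
        if (r >= 0 && r < (int)blocked.size() &&
            c >= 0 && c < (int)blocked[0].size())
            blocked[r][c] = true;
        // ... SAM would then notify RM so that path planning is re-run
    }
};
```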

Page 25: A Distributed Processing Architecture for Vision Based Domestic Robot Navigation

Ideas for future research directions:

➢Explore landmark-based navigation and object recognition
➢Increase the processing power of the mobile unit, for example with Adapteva's Parallella board
➢Build a visual-aspect-indexed database for object recognition covering a large subset of object classes
