
MAPPING IN MOBILE ROBOTICS

ASSIGNMENT 2012 – 2013

Submitted by: Devasena Inupakutika

MSc. Automation and Robotics @00346444


OUTLINE

1. Abstract
2. Introduction
3. Objectives
4. Theory
5. Assignment Description and Development
6. Results
7. Discussion
8. Conclusion


ABSTRACT

Map building with mobile robots in known and unknown environments has attracted attention for many years. In this course project we consider the problem of creating maps of the environment in which the robot is placed. We use the SLAM (simultaneous localization and mapping) GMapping package together with a fuzzy controller (for optimal target reaching) to create maps and store them for later use. The resulting maps are precise to some extent; however, because an RGB-D sensor (Kinect) is used as the range sensor instead of a real laser scanner, the conversion of the large point cloud data to laser scan data is somewhat imprecise.

We implemented this approach on a real TurtleBot (a Roomba base fitted with a Kinect) in indoor scenarios, and we present experiments illustrating the approach through RViz.


INTRODUCTION

There are many ways to implement SLAM, depending on the hardware it is implemented on and used with. The purpose of this project is to perform mapping with a mobile robot, specifically a TurtleBot with a Roomba base and a Kinect mounted on it. ROS and its GMapping stack, together with the controller developed in the "AI in mobile robotics" project, are used to control the robot so that it builds a map of the environment during its localization and navigation.

With SLAM it is possible for a robot to build a map of its environment while simultaneously locating itself within that map. Here, SLAM was implemented in an indoor environment. The robot needs a precise way of sensing the environment to perform SLAM and navigation. In this project, the Openni stack is used as the driver for the Kinect (Xbox 360). The navigation stack and the fuzzy logic controller package are used for path planning and obstacle avoidance, while the GMapping stack is used to implement SLAM.

OBJECTIVES

The main objective of this work is to present a solution for mapping with SLAM. The GMapping ROS stack provides the mechanism for this (GMapping implements a Rao-Blackwellized particle filter rather than an extended Kalman filter (EKF)), so it is a matter of combining this code with the controller code, downloading and compiling it, plugging in the hardware and executing the combined application. The underlying problem is that of building a map of an unknown environment with a mobile robot while simultaneously navigating through it.

THEORY

Map-based navigation comprises the following three processes:

• Map learning → memorizing the data captured by the robot while exploring the environment.
• Localization → determining the robot's correct position within the map.
• Path planning → choosing a proper trajectory to reach a target, given the robot's current position.

Path planning depends on the two processes above, since the map of the environment must be known for the robot to reach its goal. However, the map is needed to estimate the robot's position and, conversely, the position is needed to build the map; issues therefore arise when map learning and localization are handled simultaneously.

The GMapping stack is used for this map building during the robot's navigation. It provides laser-based SLAM as a ROS node called slam_gmapping, which creates a 2-D occupancy map from laser data (in our case a point cloud converted to laser data, as described in the navigation controller report) and from the pose data collected by the robot, provided the necessary transforms are available. To use slam_gmapping, the robot must supply odometry data and be equipped with a laser scanner (here, the Kinect); each scan is transformed into the odom tf frame, and slam_gmapping in turn publishes the map → odom transform.

The received laser scan data is then visualized in RViz, a 3-D visualization environment for robots using ROS. It provides options to view live camera images, the robot footprint, its odometry, basic shapes and trajectories through interactive markers, and the obstacles and the robot's axes through visualization markers. These display types can be viewed as long as the relevant topics are published by the running programs.


visualization_msgs is a set of messages for visualization-specific data (visualization_msgs/Marker). These are used to send visualization markers such as boxes, spheres, arrows and lines to RViz.

SLAM works for both 2-D and 3-D motion; we considered only 2-D motion as part of this work.

Below is a flowchart outlining the SLAM process:

Figure 1. Outline of SLAM process


ASSIGNMENT DESCRIPTION AND DEVELOPMENT

1. The first part of this work deals with the development of launch files using the GMapping package, including entries for all the transforms specific to the robot, as shown below:

Kinect_laser.launch file
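The sketch below shows the typical content of such a file, assuming the openni_launch driver and the old pointcloud_to_laserscan CloudToScan nodelet; the manager and topic names are illustrative, not the report's own:

<launch>
  <!-- Sketch only: start the Kinect driver and convert its depth cloud to a
       fake laser scan. Names below are illustrative assumptions. -->
  <include file="$(find openni_launch)/launch/openni.launch"/>
  <node pkg="nodelet" type="nodelet" name="scan_manager" args="manager"/>
  <node pkg="nodelet" type="nodelet" name="cloud_to_scan"
        args="load pointcloud_to_laserscan/CloudToScan scan_manager">
    <param name="output_frame_id" value="/camera_depth_frame"/>
    <remap from="cloud" to="/camera/depth/points"/>
    <remap from="scan" to="/scan"/>
  </node>
</launch>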

roomba_config.launch file
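A minimal sketch, assuming the turtlebot_node driver for the Roomba base; the transform offsets are placeholders, not the measured values:

<launch>
  <!-- Sketch only: bring up the Roomba base and publish the robot-specific
       transform from the base to the Kinect (offsets are placeholders). -->
  <node pkg="turtlebot_node" type="turtlebot_node.py" name="turtlebot_node" output="screen"/>
  <node pkg="tf" type="static_transform_publisher" name="base_to_camera"
        args="0 0 0.3 0 0 0 /base_link /camera_link 20"/>
</launch>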


slam.launch file
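A minimal sketch of the essential slam_gmapping entry; the parameter values shown are illustrative defaults rather than the tuned settings used in the experiments:

<launch>
  <!-- Sketch only: slam_gmapping builds the 2-D occupancy grid from /scan.
       maxUrange reflects the Kinect's roughly 6 m usable range. -->
  <node pkg="gmapping" type="slam_gmapping" name="slam_gmapping" output="screen">
    <remap from="scan" to="/scan"/>
    <param name="odom_frame" value="odom"/>
    <param name="base_frame" value="base_link"/>
    <param name="map_update_interval" value="5.0"/>
    <param name="maxUrange" value="6.0"/>
  </node>
</launch>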

The final launch file, which is launched to obtain the map of the environment, is shown below. It contains the controller package node, which publishes and subscribes to the visualization_msgs topics that show exactly where the robot and the obstacles are located on the map and the trajectory the robot is following in order to reach the target.

obstacle_avoid_basic.launch file
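A minimal sketch of its structure; "fuzzy_controller" and "controller" are placeholder names standing in for the package and executable developed in the "AI in mobile robotics" project:

<launch>
  <!-- Sketch only: package and executable names are placeholders. -->
  <node pkg="fuzzy_controller" type="controller" name="controller" output="screen"/>
</launch>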


The pseudo code, expanded here into a minimal compilable skeleton (the goal and path message types are assumptions based on common usage; the callback bodies are elided), is as below:

#include <ros/ros.h>
#include <sensor_msgs/LaserScan.h>
#include <geometry_msgs/PoseStamped.h>
#include <geometry_msgs/Twist.h>
#include <nav_msgs/Path.h>
#include <visualization_msgs/Marker.h>

/* Callback stubs; the real bodies implement the fuzzy controller logic. */
void scanCallback(const sensor_msgs::LaserScan::ConstPtr &scan) { /* ... */ }
void goalCallback(const geometry_msgs::PoseStamped::ConstPtr &goal) { /* ... */ }
void pathCallback(const nav_msgs::Path::ConstPtr &path) { /* ... */ }

int main(int argc, char **argv) {
  ros::init(argc, argv, "controller");
  ros::NodeHandle n;

  /* Subscribe to laser scans, RViz goals and global paths. */
  ros::Subscriber scanSub = n.subscribe("scan", 1, scanCallback);
  ros::Subscriber goalSub = n.subscribe("goal", 1, goalCallback);
  ros::Subscriber pathSub = n.subscribe("path", 1, pathCallback);

  /* Publish visualization_msgs::Marker messages for RViz. */
  ros::Publisher markerPub = n.advertise<visualization_msgs::Marker>("controller/trajectory_rollout", 1);
  ros::Publisher trajectoryPub = n.advertise<visualization_msgs::Marker>("controller/best_trajectory", 1);
  ros::Publisher pathPub = n.advertise<visualization_msgs::Marker>("controller/path", 1);
  ros::Publisher localPub = n.advertise<visualization_msgs::Marker>("controller/local_goal", 1);
  ros::Publisher localPathPub = n.advertise<visualization_msgs::Marker>("controller/local_path", 1);
  ros::Publisher goalPub = n.advertise<visualization_msgs::Marker>("controller/goal_pos", 1);

  /* Publish velocity commands for the robot base. */
  ros::Publisher cmdPub = n.advertise<geometry_msgs::Twist>("cmd_vel", 1);

  ros::spin();
  return 0;
}

2. The second part is launching slam.launch and obstacle_avoid_basic.launch, then running RViz, and finally saving the map after the robot has navigated through the environment, using the ROS commands below:

# Launch each file in a separate terminal window.
roslaunch slam.launch
roslaunch obstacle_avoid_basic.launch

rosrun rviz rviz

rosrun map_server map_saver -f /tmp/my_map
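map_saver writes two files: the occupancy grid image my_map.pgm and the metadata file my_map.yaml. The metadata looks roughly like the sketch below; the values shown are common defaults, not the ones actually saved in this run:

image: my_map.pgm
resolution: 0.050000
origin: [-10.000000, -10.000000, 0.000000]
negate: 0
occupied_thresh: 0.65
free_thresh: 0.196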


RESULTS

Screenshots of the commands, the published topics and the created map are shown below:

Connection to Roomba (turtlebot base) established


Keyboard teleoperation

rostopic list (list of topics published)


Process of map creation in RViz

Obstacles are highlighted with black lines; the gray region shows the laser scan


Mapping in progress

Map saved


Autonomous navigation using the map saved above

Autonomous navigation of the robot on the saved map, highlighting the robot footprint, path and obstacles
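A typical minimal configuration for runs like these (assuming amcl for localization, which is a standard choice rather than a detail confirmed by the screenshots) would be:

<launch>
  <!-- Sketch only: serve the saved map and localize against it with amcl. -->
  <node pkg="map_server" type="map_server" name="map_server" args="/tmp/my_map.yaml"/>
  <node pkg="amcl" type="amcl" name="amcl">
    <remap from="scan" to="/scan"/>
  </node>
</launch>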


DISCUSSION

We encountered a few issues during the implementation of this project due to the use of the Kinect as an RGB-D sensor rather than a laser scanner. The Kinect has a field of view of around 60 degrees, whereas Hokuyo/LIDAR sensors cover a much wider 240 to 270 degrees. Hence, when a lot of information must be gathered from the environment, e.g. while estimating the change in robot position, it is advisable to use sensors with a wider field of view.

Also, the Kinect covers a range of roughly 0.45 to 6 meters, whereas laser scanners have much smaller minimum coverage distances. This becomes important when the robot must be stopped from getting too close to obstacles; with the Kinect, the robot in such cases simply hits the wall. A further reason is the use of the pointcloud_to_laserscan package, whose output is less precise than data collected from a real laser scan. Obstacle avoidance is therefore not as good as with a real laser scanner while obstacles have not yet been detected in the map, which increases collisions with obstacles at the beginning of a run.

However, this could be improved by integrating a real laser scanner alongside the Kinect. Also, the dynamic window approach used here depends on an obstacle weighting; when the robot encounters large obstacles that are too near, it simply crashes, due to that weighting and the reasons mentioned above (the precision of the Kinect data).

Also, due to data latency (the workstation communicates with the TurtleBot over a wireless network), the map was updated with a slight delay once the robot had roamed around the environment for a while.

CONCLUSION

The SLAM used in this work is very basic, and there is scope for improvement. There is a problem with the robot recognizing a place it has seen earlier (loop closing), and there are issues with the robot's trajectory and poses with respect to the map and odom frames: when given a 2D nav goal at a position it has already visited, the robot takes a different path.

This process can be improved by combining it with an occupancy grid so that the map is readable by a human. Moreover, occupancy grids can be used for path planning with the A* search algorithm.
