Flexible Mission Execution for Mobile Robots


    Mikkel Viager

Flexible Mission Execution for Mobile Robots

    Individual course report, July 2012


    Flexible Mission Execution for Mobile Robots

    Report written by:

    Mikkel Viager

Advisors: Jens Christian Andersen, Nils Axel Andersen, Anders Billesø Beck

DTU Electrical Engineering, Technical University of Denmark, 2800 Kgs. Lyngby, Denmark

    [email protected]

Project period: 16/01/2012 - 31/05/2012

    ECTS: 10 Points

Education: M.Sc.

    Field: Electrical Engineering

    Class: Public

Remarks: This report is submitted as partial fulfillment of the requirements for graduation in the above education at the Technical University of Denmark.

Copyright: Mikkel Viager, 2012


Table of contents

1. Introduction
1.1 Project goal
1.2 Project limitations
1.3 Documentation overview
2. Physical Equipment
2.1 The Guidebot
2.2 Joypad
2.3 Electromagnet
2.4 Carts
2.5 Kinect sensors
2.6 LCD screen
3. Kinect Sensor calibration
3.1 Kinect tilt calibration rule
4. Guidemark localization rule
4.1 Configuration
4.2 Rule procedure
5. Cart handling rule
5.1 Configuration
5.2 Rule procedure
6. Conclusion
6.1 Further work
Appendix A: Kinect accelerometer experiments
Appendix B: Code


    1. Introduction

Live demonstrations of mobile robots often take a long time to develop, and cannot easily be modified once fully implemented. Since many such demonstrations are set in dynamic environments, or are completely portable for exhibitions, it is desirable to be able to adapt the robot to a new or changed environment without having to create a new implementation from scratch.

The Guidebot mobile robot platform is used for research and demonstration purposes by DTU (the Technical University of Denmark) and DTI (the Danish Technological Institute). Both find it desirable to have the capability of creating and adapting new missions relatively easily.

    1.1 Project goal

The goal of this project is to create and demonstrate a set of tools, with both software and hardware modules, directly usable in a simple flexible mission execution framework.

Documentation focused on the future use of the tools should be made available, encouraging incorporation into new missions and making implementation as straightforward as possible.

    1.2 Project limitations

DTI and DTU want the project to focus on the concept of a tow-type AGV (Automated Guided Vehicle), where one robot transports simple carts between predefined areas. Furthermore, it is requested that the tools are developed and tested with the Guidebot platform, based on the concept of rules for mobotware.

    1.3 Documentation overview

In accordance with the project goals, this report focuses mainly on the details of the final implementation of the toolset, and not much on the development process. Each chapter is meant to be useful on its own, allowing lookup as in a manual.

Chapter 2: Primary physical equipment created and used.

Chapter 3: Kinect sensors and automatic calibration using the accelerometer.

Chapter 4: Guidemark localization rule details.

Chapter 5: Cart handling rule configuration and behavior details.

Chapter 6: Project conclusion.

Appendix A: Kinect accelerometer experiment details.

Appendix B: Code files.



2. Physical Equipment

Extending the existing platform with hardware add-ons has been necessary to meet the project goals. The following sections give a summary of the robot itself and individual descriptions of the major hardware add-ons created and used during this project.

    2.1 The Guidebot

The Guidebot platform has a two-wheel centered differential drive configuration with castor wheels as third and fourth points of contact to prevent tipping. Because of its industrial grade motors and encoders, as well as the firm rubber tires, the odometry precision of this robot type is remarkably high. Running on a dual-core Atom processor, the system is capable of handling the image data from the two Kinects decently.

With fully charged batteries, the robot can run continuously for around 15 hours and recharge overnight, making it very useful for demonstrations at expos or conventions.

    2.2 Joypad

An option for manually maneuvering the robot has proven to be highly desirable, e.g. for moving the robot to its initial start position of a demo or simply for general transportation. To make this possible, a wireless 2.4 GHz joypad has been interfaced into the rhd (robot hardware daemon) of mobotware, allowing manual control of the robot directly after boot-up, activated by a button press on the joypad.

A short instruction manual for remote operation with the joypad can be found in appendix C.


    2.3 Electromagnet

Mounted on the lower backside of the robot, an electromagnet has been firmly fitted using only existing holes in the robot chassis, with a highly modified mounting bracket. The electromagnet itself can easily be removed for repair or replacement by removing the two Allen (unbraco) screws on the top side and detaching the wiring behind the aluminum cover facing outwards.

The electromagnet is powered directly from the profibus with 24 V, drawing 500 mA of current while activated. Originally designed as a secure door lock, the electromagnet has a carrying capacity of 500 kg.

Sets of one electromagnet and one metal anchor are available at danbit.dk [1], where the mounting bracket can be bought separately. A special arrangement was made to purchase additional metal anchors for use with the carts.

    2.4 Carts

In order to interfere as little as possible with the kinematics of the robot, castor wheels were chosen for the mobile carts. Several professional quality carts, rated at a 500 kg carrying capacity, were ordered through 3x34 [2] for use in this project.

The metal anchors have been mounted on the bottom side of the carts, using modified mounting brackets and rubber shock absorbers that allow some flexibility, in order to avoid stressing the connection too much.

An example of a completely rigged cart is shown on the following page.


    2.5 Kinect sensors

Two Kinect sensors, made for use with the Xbox 360, are mounted on top of the robot in a modular mounting system allowing easy reconfiguration of direction and tilt angles. The Kinect sensors are powered by an internal 12 V power supply and run at below ~5 W each, as determined in a previous project [3].

Due to limitations of the USB bus of the onboard computer, it is not possible to have several Kinect sensors running in depth-mode simultaneously. In this case it was chosen to use only the normal cameras, making it possible to receive uninterrupted image data from both Kinects simultaneously.

Even though the depth function is not used in this particular project, the Kinect sensor was still chosen over standalone VGA cameras due to both price and future expansion options, and because of the Kinect's internal accelerometer, which is already interfaced for use with mobotware.

In the image to the left, the two Kinect sensors are configured as both pointing forwards. With the final version of the demonstration it was chosen to have one Kinect sensor pointing forwards and the other backwards, mainly to prevent having the robot turn around completely whenever it is desired to look in a specific direction.

    2.6 LCD screen

An 8-inch LCD display has been fitted to the top of the Guidebot, where it is attached to the aluminum pole similarly to the Kinect sensors. This screen is meant for debugging or reconfiguration in areas where no wireless access is available for an SSH connection. The screen also has resistive touch capability, and can be set up to provide a graphical user interface in the future.


    3. Kinect Sensor calibration

In order to allow robust positioning based on images of guidemarks, it is very important to know the pose of each camera as precisely as possible, as well as any distortion of the image caused by the camera lens.

The problem of lens distortion is handled directly in the mobotware camera server by setting up a push command for rectifying all images to a separate image pool. Calibration is completed only once, as the properties are assumed to be non-changing, and set in the ucamserver.ini file as in this example:

    var campool.device18.distortion="0.24042, -0.72509, -0.00018, -0.00022, 0.72285"

    var campool.device18.intrinsic="525.829, 0.0, 322.205; 0.0, 525.829, 257.529; 0.0, 0.0, 1.0"

    And the push command can be set as:

    poolpush img=18 cmd="rectify device=18 src=18 dst=30 silent"

Along with this, it is necessary to specify the focal length, physical pose in robot coordinates and various settings for the Kinect plug-in. An example ucamserver.ini file with settings for running 2 Kinects simultaneously can be found in appendix B.
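For reference, these are the lines from the ucamserver.ini in appendix B that set the focal length and camera pose for the two Kinect devices; the six camPose values are presumably the x, y, z position in robot coordinates followed by the three rotation angles (note the 3.14159 rad rotation for the backwards-facing Kinect):

camset device=18 focallength=510
camset device=28 focallength=510
var campool.device18.camPose="0.07 -0.015 0.805 0 0.53 0"
var campool.device28.camPose="-0.115 0.015 0.805 0 0.53 3.14159"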

    3.1 Kinect tilt calibration rule

When setting the camera pose in the ucamserver.ini file it is important to have as precise values as possible, since any error in these will cause considerable guidemark positioning errors.

The position of the camera in robot coordinates can be considered relatively static, and any error will directly translate to the position measurements of guidemarks. However, when it comes to the angular direction of the camera, errors can have a high impact on the measurement calculation precision, leading to a critical decrease in reliability of a running demonstration depending on this data for positioning.

To minimize this positioning error, an addition to the mobotware Kinect plug-in has been developed. Along with this is a rule allowing recalibration of a Kinect sensor whenever the robot itself is not accelerating, usable at any time during a mission.

An example of how to use this feature can be seen in the end part of carthandling.rule in appendix B. An instance of the calibration rule itself can be seen in tiltkinectcalibratefront.rule in appendix B.

One instance of the rule should be configured for each Kinect by modifying the device number within the rule file itself.
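The end of carthandling.rule (appendix B) shows how a mission can trigger a recalibration: the run flag is set, and the mission then waits until either successtime is updated or a 3 second timeout expires (condensed here):

startcalibratetime = now()
global.tiltkinectcalibratefront.run = true
wait() : (now() - global.tiltkinectcalibratefront.successtime < 1.0 | now() - startcalibratetime > 3) # wait for calibration
if(now() - global.tiltkinectcalibratefront.successtime < 1.0)
print("front-kinect successfully calibrated!")
else
print("front-kinect calibration failed!")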

The detailed development documentation for the Kinect accelerometer value calculations can be found in appendix A.


4. Guidemark localization rule

With the capability of determining positions of guidemarks in relation to the robot, a rule has been created to reset the localizer plug-in from such relative poses.

This requires all waypoint-guidemarks to have unique IDs, which are each registered with world coordinates in the map.xml file. It is also important to set all waypoint guidemarks as being stationary. An example can be seen in the file DTU326-map.xml in appendix B.

The gmklocalize rule requires that the ulmsserver, ucamserver and mrc are running:

- ulmsserver for localization with the localizer plug-in
- ucamserver for getting guidemark information with the gmk plug-in
- mrc for handling acquired and calculated poses (mainly odometry)

The rule has 3 global variables which can be initialized to default values. The below defaults are recommended:

global.gmklocalize.run = false
global.gmklocalize.silent = true
global.gmklocalize.successtime = -1

The run variable can be set to true to activate the rule and thereby have it keep running until one successful guidemark-localization has been made.

The silent variable can be set to false to get a lot of debug information printed in the console.

The successtime variable holds the timestamp for when the last gmk-localization was successfully completed. This is meant to serve as a way of detecting whether the rule has completed its task, in case this is a requirement for a higher task or program.
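For example, dockdemo.rule (appendix B) activates the rule and then blocks the mission until successtime shows a localization completed within the last second:

global.gmklocalize.run = true
wait() : (now() - global.gmklocalize.successtime < 1.0) # wait for localization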

    4.1 Configuration

All configuration of the rule is done by changing the values of the configuration constants:

blocksize = 0.025
poolImg = 30
timeoutlimit = 4
linemapname = "data/DTU326-lines.xml"
cov[0] = 0.05
cov[1] = 0.05
cov[2] = 0.05

Making it possible to configure:

blocksize = 0.025
The size (in meters) of the black squares in the guidemarks to be detected.


poolImg = 30
The image pool number containing a rectified image ready for detection of guidemarks.

timeoutlimit = 4
Limit on how many attempts should be made to find a guidemark in an image before giving up and requesting a new image instead. This should be set in accordance with how often the rulebase is running.

linemapname = "data/DTU326-lines.xml"
A string with the relative path to an xml file containing line-map localization data of the environment. This is only used for re-initializing the localizer and should not contain the definitions of guidemarks.

cov[0] = 0.05
cov[1] = 0.05
cov[2] = 0.05
The desired covariance values for use in resetting the localizer upon a successful gmk-localization. These should be determined by the expected precision of the gmk-localization.

    4.2 Rule procedure

The rule can be activated at any time, and keeps running until a guidemark has been found and the localizer reset.

This is made possible by taking into account the possibility that the robot is moving while trying to find a guidemark, calculating any position difference from odometry values.

If the robot is moving along a path after activation of the rule at t0, and receives a picture from the camera at time t1, it will take some time to process the image and detect a guidemark. Once the detection is complete and the position is calculated, the robot will have moved to a new position at time t2. By using the timestamp of the image with the guidemark, it is possible to look back into the pose history and determine the difference between the positions at t1 and t2 from odometry data. The localizer is then reset to the position at t1 plus the difference between the positions at t1 and t2, which is still very accurate because of the high odometry precision and since the travelled distance is relatively short.

In cases where the robot is stationary during the entire localization, the position difference from where the image was taken to where the calculations are complete is simply zero.
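How this looks in gmklocalize.rule (appendix B), shown here as a condensed excerpt: (x, y, th) is first the robot pose measured relative to the guidemark, which is transformed into world coordinates using the guidemark's mapped pose (x0, y0, th0); the odometry difference between t1 and t2 is then rotated the same way and added before the result is pushed to the localizer:

resx = cos(th0)*x - sin(th0)*y + x0
resy = sin(th0)*x + cos(th0)*y + y0
resth = th + th0
# (x, y, th) is then reassigned to the odometry pose difference between t1 and t2
resxdelta = cos(th0)*x - sin(th0)*y
resydelta = sin(th0)*x + cos(th0)*y
resthdelta = th
posetolocalize[0] = resx + resxdelta
posetolocalize[1] = resy + resydelta
posetolocalize[2] = limitToPi(resth + resthdelta)
core.send("laser setinitpose x="posetolocalize[0]" y="posetolocalize[1]" th="posetolocalize[2])
core.send("laser setinitcov Cx="cov[0]" Cy="cov[1]" Cth="cov[2])
core.send("laser localize")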


5. Cart handling rule

In order to make it easy to utilize the functionality of picking up carts with the magnet, a rule has been created to take care of this. This makes it possible to have a robust preprogrammed routine take over control of the robot while performing a pick up or drop off, and return to the mission upon completion of the task.

Activation of the rule is done by setting one of the following values to true:

global.carthandling.pickup
global.carthandling.dropoff

and then waiting for the equivalent successtime variable to be updated, indicating that the procedure was completed and that the mission should now continue:

wait() : (now() - global.carthandling.pickupsuccesstime < 1.0)
wait() : (now() - global.carthandling.dropoffsuccesstime < 1.0)

An example implementing this rule can be seen in dockdemo.rule in appendix B.
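Condensed from dockdemo.rule (appendix B), a mission step that drives to a pick-up area and only continues once the cart handling rule reports a successful pickup looks like this:

smr.do("drivew 3.2 0.9 180 @v"speed" : ($targetdist < 0.15)")
smr.do("fwd 0.15 @v"speed)
smr.do("turn -90 @v"speed)
smr.putEvent(15)
wait() : smr.gotEvent(15)
global.carthandling.pickup = true
wait() : (now() - global.carthandling.pickupsuccesstime < 1.0)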

    5.1 Configuration

The rule is configured similarly to the localization rule, by modifying the indicated configuration variables in the top of the rule file. In this case it is also possible to reconfigure some of the values during the mission, allowing individual configuration for specific pick up points if desired, by setting some of the global variables.

Descriptions for all configuration values are available directly in the carthandling.rule file in appendix B. In case of uncertainty about the impact of a configuration variable, a complete run-through of the rule procedure is given in the following section.

    5.2 Rule procedure

Upon setting the variable global.carthandling.pickup=true, the following sequence of actions is completed. (A demonstration video incorporating this procedure can be found on the CD or via the following link: http://youtu.be/ppCNU5yhKz0)

    5.2.1 Find guidemark on cart

The front Kinect is used for acquiring a new image of the environment in front of the robot, which is then searched for any guidemarks. All guidemarks found are noted and compared to known guidemarks from the map file in the order of discovery.

If no guidemarks are recognized or accepted, a new image is requested and the procedure is run again. After a preconfigured number of retries (ruling out simple image noise as the sole problem) the robot initiates a search for the cart in its immediate surroundings (if configured to do so). Turning left and right in preconfigured intervals,


the robot thoroughly searches for the cart in its surroundings, until eventually giving up after completing the predefined sweep angle interval if there is still no acceptable guidemark found.

When a guidemark is matched in the list from the map file, it is verified that the guidemark is defined with the value of gmkstationary being false. This prevents the robot from accidentally trying to pick up guidemarks meant for localization.

Once the guidemark has been verified, the position of the cart in relation to the robot is requested from the guidemark plug-in, as in the fragment below.
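In carthandling.rule (appendix B), this relative pose is read from the gmk plug-in through the camera interface, indexed by the ID of the accepted guidemark (condensed excerpt):

robpose[0] = getvar('ifcam.gmk.gmk' gmkvalue '.robPose[0]')
robpose[1] = getvar('ifcam.gmk.gmk' gmkvalue '.robPose[1]')
robpose[5] = getvar('ifcam.gmk.gmk' gmkvalue '.robPose[5]')
gmktimestamp = getvar('ifcam.gmk.gmk' gmkvalue '.time')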

    5.2.2 Moving in position and picking up cart

A distance is defined in the configuration variables, determining the first position the robot should move to (defined as only one value, indicating how far directly in front of the cart the initial position should be). It is important to make sure that this position is not so close to the cart that the rear Kinect cannot see the guidemark on the cart.

The robot moves to this position and turns to point the electromagnet directly towards the metal anchor of the cart. Switching to the Kinect pointing backwards, the guidemark is re-detected in order to confirm that the heading angle is aligned properly, as well as to calculate the distance. Any angle offset is corrected, and the robot slowly closes in on the cart after turning the electromagnet on.

A configurable value indicates how much further than calculated the robot should move when reversing to dock with the cart. A small value of around 0.02-0.05 m has proven to provide robust pick-up maneuvers.
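In carthandling.rule this margin is the extradockdist constant; the reversing distance is the detected distance to the cart (robpose[0]) minus the distance from the middle of the robot to the magnet (midtomagnet), plus the margin, driven at dockspeed:

dockdisttomove = robpose[0] - midtomagnet + extradockdist
smr.do("fwd "-dockdisttomove" @v"dockspeed)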

At this point the electromagnet is hopefully touching the metal anchor, and a small twist to both left and right is completed to make sure that there is full surface connection. After a very short move forward, the status of the micro switch below the electromagnet is evaluated to verify whether the pick-up attempt was a success.

If the micro switch is not pressed, the robot moves to the initial position (still directing the electromagnet towards the cart) and retries the alignment and docking procedure. After a predefined number of failed tries it moves even further away, turns around to have the front face the cart, and starts searching for the cart as in the beginning of the rule.

When the micro switch is evaluated as pressed after a pick-up attempt, the global successtime value is updated to indicate the successful pickup, and the cart handling rule returns control of the robot back to the mission procedure.


6. Conclusion

With the new hardware and software it is possible to write and easily reconfigure a demonstration, as shown in the video of the robot running the dockdemo.rule.

The Kinect calibration functionality will most likely prove to be incredibly useful in the near future, as most of the robots at Automation and Control at DTU will soon be mounted with Kinect sensors.

Localization by guidemarks has proven to be a very good way of initializing starting positions for any mission in mobotware which utilizes the laser scanner localization plug-in, as it greatly reduces the initial positioning precision required for successful operation.

Automated handling of picking up and dropping off carts has proven to be both robust and efficient. Only a few per cent of the pickup attempts are unsuccessful, and in those cases the software quickly corrects any positioning error and makes further docking attempts automatically.

The simple activation commands allow for future use with automated mission planner software for navigation, and with other future software and functionality.

6.1 Further work

Some obvious approaches to further improve the tool set would be:

- Development of detection software making the robot aware of its surroundings, thereby greatly decreasing the safety risks of moving carts blindly.
- Development of path planning and navigation software capable of navigating with extra care whenever the robot is carrying a cart.
- Increasing the maximum allowed resolution in the guidemark detection plug-in, allowing full Kinect image resolution and robust use of the small guidemarks.

The goal of developing a set of tools allowing simple and flexible mission execution of an AGV demonstration has been achieved.

A video of the demonstration example can be seen here: http://youtu.be/ppCNU5yhKz0

All files used in the demonstration are available from: http://blog.mivia.dk/files/dockdemo.zip


    References and links

[1] http://www.danbit.dk/produkter/2906.phtml (17-07-2012)

[2] http://www.flyttebutik.dk/shop/flyttehund-191p.html (17-07-2012)

[3] M. Viager, Analysis of Kinect for Mobile Robots, 2011. Available at: http://blog.mivia.dk/2011/06/my-kinect-datasheet/


Appendix A: Kinect accelerometer experiments

The following details are from the experiments with the accelerometer of the Kinect.

Accelerometer orientation

image   Acc1   Acc2   Acc3
1        0     10      0
2        0      0     10
3       10      0      0
4        7.0    7.0    0
5       -7.0   -7.0    0
6       -6.6    0      6.6
7        7.0    0      7.0
8        0      7.1   -7.1
9        0      7      7

(Figures 1-9: the Kinect orientation used for each of the nine measurements.)


From these accelerometer measurements it is possible to determine the orientation of the accelerometer, and to set up the corresponding left-hand Cartesian coordinate system as shown below. It is noticed that the orientation of this coordinate system is not ideal in terms of using the Kinect as a sensor for a mobile robot. A robot coordinate system, with the x-axis oriented along the direction of forward movement, is also defined below.

Orientation of coordinate systems for accelerometer and robot:

The accelerometer values (acceleration in m/s² along each axis) are given from the Kinect in the order [acc1, acc2, acc3], which can be converted to the defined robot coordinates, where subscript acc is used for accelerometer coordinates and subscript R for the defined robot coordinates.

Unreliability during robot acceleration

The magnitude of the total resulting acceleration vector can then be calculated as:


\[ \lVert \vec{a} \rVert = \sqrt{a_x^2 + a_y^2 + a_z^2} \]

This must be a downwards vector of approximate magnitude 9.81 m/s² in all cases where the Kinect sensor is stationary or moving at constant speed. Thus, if the Kinect is mounted on a mobile robot, the acceleration vector will be affected by any acceleration or deceleration of the robot.

Kinect orientation in radians

When defining the angular orientation of the Kinect sensor, it is desired to do it on the form (Omega, Phi, Kappa), corresponding to rotation around its own x, y and z axes respectively. In most cases the Kinect will be pointing in the same direction as the robot, making only the up- and downwards tilt defined by Phi interesting.

Defining zero tilt around x and y as being horizontal instead of vertically downwards, as well as choosing positive angles to indicate a downwards tilt of the camera side, the following equations describe the orientation parameters of the Kinect.

The Kinect orientation (Omega, Phi, Kappa) in robot coordinates can then be described from the measured acceleration components as:

\[ \Omega = \tan^{-1}\!\left(\frac{a_y}{a_z}\right) \qquad \Phi = \tan^{-1}\!\left(\frac{-a_x}{a_z}\right) \qquad K = \tan^{-1}\!\left(\frac{-a_y}{a_x}\right) \]

keeping in mind that the calculation of Kappa is not to be trusted.

Whereas these calculations are sufficient for cases where the Kinect sensor is mounted normally, they will cause problems if the Kinect is tilted more than \(\pi/2\) around either \(x_R\) or \(y_R\), e.g. if it is mounted upside down. This is taken care of by using the atan2 function in the implementation, such that:

\[ \Omega = \operatorname{atan2}(-a_y, -a_z) \qquad \Phi = \operatorname{atan2}(a_x, -a_z) \qquad \left(K = \operatorname{atan2}(-a_y, a_x)\right) \]

Using these equations, the following orientations have been calculated for the 9 tilt example figures:


Figure   Omega   Phi     Kappa
1         0       0       0
2         0      -π/2     π/2
3        -π/2     0       π/2
4        -π/4     0       π/4
5         π/4     0       π/4
6         π/4    -π/4     π/2
7        -π/4    -π/4     π/2
8         0      -π/4     π/4
9         0       π/4     π/4

Changes made to ufunckinect.h:

Added around line 140:

UVariable * varOrient;

Changes made to ufunckinect.cpp:

Added around line 323:

varOrient = addVarA("orient", "0 0 0", "3d", "(r) rotation around robot coordinate axes [Omega, Phi, Kappa]");

Added around line 692 (in function run()):

URotation orient;

Added later in function run(), at the end of the accelerometer value update:

// calculate new orientation based on acc values
// (roll) rotation around x, based on values in the y,z-plane
orient.Omega = atan2(-acc.y, -acc.z);
// (tilt) rotation around y
orient.Phi = atan2(acc.x, -acc.z);
// kappa (yaw) can not be calculated safely
orient.Kappa = atan2(-acc.y, acc.x);
varOrient->setRot(&orient, &t);

Result

The calculated values are available in var kinect.orient as [Omega, Phi, Kappa], updated as often as the accelerometer data.


Appendix B: Code

This appendix contains all code from the files used in the final demonstration, and can be considered a viable example of how to use the tools.

All the .ini and .conf files come first, followed by the rule files, in the order listed here:

- ucamserver.ini
- ulmsserver.ini
- aunav.ini
- robot.conf
- tiltkinectcalibratefront.rule
- gmklocalize.rule
- carthandling.rule
- dockdemo.rule
- DTU326-map.xml

All of the code files are available from the CD or the following web address: http://blog.mivia.dk/files/dockdemo.zip

    ucamserver.ini

module load="var"
module load="aukinect.so.0" alias="kinectfront"
module load="aukinect.so.0" alias="kinectback"
module load="gmk"
module load="auvarmrc.so.0"
module load=aucamrectify.so.0

camset device=18 focallength=510
camset device=28 focallength=510
var campool.device18.camPose="0.07 -0.015 0.805 0 0.53 0"
var campool.device28.camPose="-0.115 0.015 0.805 0 0.53 3.14159"
var campool.device18.distortion="0.24042, -0.72509, -0.00018, -0.00022, 0.72285"
var campool.device28.distortion="0.24042, -0.72509, -0.00018, -0.00022, 0.72285"
var campool.device18.intrinsic="525.829, 0.0, 322.205; 0.0, 525.829, 257.529; 0.0, 0.0, 1.0"
var campool.device28.intrinsic="525.829, 0.0, 322.205; 0.0, 525.829, 257.529; 0.0, 0.0, 1.0"

var kinectfront.kinectNumber=0
var kinectback.kinectNumber=1

var kinectfront.imagesC3D="18 19 20 22"
var kinectback.imagesC3D="28 29 26 27"
var kinectfront.camDeviceNum="18 19"
var kinectback.camDeviceNum="28 29"
var kinectfront.useDepth="0 0"
var kinectback.useDepth="0 0"
#var kinectfront.desiredResolution=2
#var kinectback.desiredResolution=2

var gmk.diagonal=0
kinectfront open=true
kinectback open=true

poolpush img=18 cmd="rectify device=18 src=18 dst=30 silent"
poolpush img=28 cmd="rectify device=28 src=28 dst=31 silent"


    ulmsserver.ini

    server datapath="data"

# script to load URG and simulated laser scanner module
module load="laserpool"
module load="odopose"
#module load="auv360.so.0"
module load="ulmsv360.so.0"
module load="var"
module load="mapPose"
scanset devtype=lms100 devname="lms100:2111"
scanset replaySubdir="log"
scanset def=lms100
# set scanner position in robot coordinates
scanset x=0.3 y=0.0 z=0.25
scanset mirror=true
scanset width=240
scanpush cmd="v360 update"

module load="auefline.so.0"
module load="ulmspassable.so.0"
module load="aulobst.so.0"
module load="aurule.so.0"
module load="aulocalize.so.0"
module load="auplan.so.0"
module load="aupoly.so.0"
module load="mapbase.so.0"
module load="aulmsmrcobst.so.0"
module load="auvarmrc.so.0"
module load="aumapobst.so.0"
#
#load mapbase
mapbase graphload="./data/DTU326-map.xml"
mapbase mapload="./data/DTU326-lines.xml"

# load module to utilize mapped obstacles
var mapobst.marginSolidFactor=0.3
var mapobst.marginFluffyFactor = 1.1
mapobst map2localize

#load settings for guidebot
var localize.wheelBase=0.51

#set initial pose
setinitpose x=0.0 y=0.0 th=0.0
setinitcov Cx=0.01 Cy=0.01 Cth=0.01

scanget
scanget


    aunav.ini

# test of passable interval
# open server command log
server imagepath="log"
server datapath="data"
server replayPath="log"
server port=24922
server serverlog
#
module load='var'
module load='imagepool'
module load=odoPose
module load=utmPose
module load=mappose

module load="mapbase.so.0"

module load="aulaserif.so.0"
module load="ausmr.so.0"
module load="auplanner.so.0"
module load="audrivepos.so.0"
module load="auroaddrive.so.0"
module load="aupoly.so.0"
module load="auavoid.so.0"
module load="aurule.so.0"
#module load="aurhdif.so.0"

module load=if alias=laser
# ensure laser interface gets the road and obstacle info
# add a variable handler to client
laser add=var
laserdata add=road
laserdata add=obst
# request a copy af all road variables
laserOnConnect cmd="laser varpush struct=obst flush"
laserOnConnect cmd="laser varpush struct=obst cmd='obst'"
laserOnConnect cmd="laser mapPosePush flush"
laserOnConnect cmd="laser mapPosePush cmd='mappose pose'"
laserOnConnect cmd="laser odoPosePush flush"
laserOnConnect cmd="laser odoPosePush cmd='odopose pose'"
# connect to this testserver, as it is laser server too.
laser connect=localhost:24919

# interface to camera server
module load=if alias=cam
module load=camdata
camdata add=gmk
camdata add=path
camdata add=cam
#request a copy of all gmk variables
camOnConnect cmd="cam var gmk copy"
camOnConnect cmd="cam varpush struct=gmk flush"
camOnConnect cmd="cam varpush struct=gmk cmd='var gmk copy'"
#request a copy of kinectfront.orient variables
camOnConnect cmd="cam var kinectfront copy"
camOnConnect cmd="cam varpush struct=kinectfront flush"
camOnConnect cmd="cam varpush struct=kinectfront cmd='var kinectfront copy'"
#request a copy of kinectback.orient variables
camOnConnect cmd="cam var kinectback copy"


camOnConnect cmd="cam varpush struct=kinectback flush"
camOnConnect cmd="cam varpush struct=kinectback cmd='var kinectback copy'"

cam connect=localhost:24920

#Load SMR Module and connect to
smrConnect cmd="smr do='laser "mapposepush cmd='localize getonly'"'"
smr connect=localhost:31001

#Load mapbase plugin
mapbase graphload="./data/DTU326-map.xml"
mapbase mapload="./data/DTU326-lines.xml"

#Add planner
planner verbose
#Reduce period time for debugging
var planner.period = 0.25
var planner.speed = 0.15
var planner.accel = 0.1

rule load="./gmklocalize.rule"
rule load="./tiltkinectcalibrateback.rule"
rule load="./tiltkinectcalibratefront.rule"
rule load="./carthandling.rule"
rule load="./dockdemo.rule"
rule resume

    robot.conf


maxtick ="10000"
control ="0"
/>


    tiltkinectcalibratefront.rule

# Detects current Kinect tilt angle and sets device values accordingly

# Global variables
global.tiltkinectcalibratefront.run = false
global.tiltkinectcalibratefront.silent = false
global.tiltkinectcalibratefront.successtime = -1

# Configuration constants
device = 18

# Local variables
rotOmega = 0.0
rotPhi = 0.0

# Detect tilt angles and set variables
rotOmega = getvar('ifcam.kinectfront.orient[0]')
rotPhi = getvar('ifcam.kinectfront.orient[1]')

core.send('cam camset device='device' rotOmega='rotOmega' rotPhi='rotPhi)

if(!global.tiltkinectcalibratefront.silent)
print("Tilt values set: rotOmega="rotOmega" rotPhi="rotPhi)

global.tiltkinectcalibratefront.successtime = now()
global.tiltkinectcalibratefront.run = false

wait() : false


    gmklocalize.rule

global.gmklocalize.run = false
global.gmklocalize.silent = false
global.gmklocalize.successtime = -1

blocksize = 0.025
poolImg = 30
timeoutlimit = 4
linemapname = "data/DTU326-lines.xml"
cov[0] = 0.05
cov[1] = 0.05
cov[2] = 0.05

gmkready = false
canrequestnewgmk = false
gmkname = 'guidemark'
gmkcrc = 0.0
testval = 0.0
gmkvalue = 0.0
IDindex = 0.0
gmkinmap = -1
gmkstationary = 0
robpose[0] = 0.0
robpose[1] = 0.0
robpose[2] = 0.0
robpose[3] = 0.0
robpose[4] = 0.0
robpose[5] = 0.0
gmkworldpose[0] = 0.0
gmkworldpose[1] = 0.0
gmkworldpose[2] = 0.0
ododeltastart[0] = 0.0
ododeltastart[1] = 0.0
ododeltastart[2] = 0.0
ododeltaend[0] = 0.0
ododeltaend[1] = 0.0
ododeltaend[2] = 0.0
ododeltaend[3] = 0.0
posetolocalize[0] = 0.0
posetolocalize[1] = 0.0
posetolocalize[2] = 0.0


gmktimestamp = 0.0
gmkimgserial = 0.0
lastgmkimgserial = ifcam.gmk.imageserial[1]
timeoutcount = 0.0

x0 = 0.0
y0 = 0.0
th0 = 0.0
x = 0.0
y = 0.0
th = 0.0
resx = 0.0
resy = 0.0
resth = 0.0
resxdelta = 0.0
resydelta = 0.0
resthdelta = 0.0

gmkvalue = -1
lastgmkimgserial = ifcam.gmk.imageserial[1]
core.send('cam gmk img='poolImg' block='blocksize)

gmkready = false
global.gmklocalize.run = false
canrequestnewgmk = true

gmkimgserial = ifcam.gmk.imageserial[1]
if(gmkimgserial != lastgmkimgserial & ifcam.gmk.count > 0)
lastgmkimgserial = gmkimgserial
if(!global.gmklocalize.silent)
print("Found " ifcam.gmk.count " new guidemark(s) with ID(s): "ifcam.gmk.IDs" in image with serial: " gmkimgserial)

IDindex = 0
gmkcrc = 0
gmkinmap = -1
gmkstationary = 0
while ((gmkcrc < 1 | gmkinmap < 0 | gmkstationary < 1) & IDindex
0 & gmkinmap >=0)


robpose[0] = getvar('ifcam.gmk.gmk' gmkvalue '.robPose[0]')
robpose[1] = getvar('ifcam.gmk.gmk' gmkvalue '.robPose[1]')
robpose[2] = getvar('ifcam.gmk.gmk' gmkvalue '.robPose[2]')
robpose[3] = getvar('ifcam.gmk.gmk' gmkvalue '.robPose[3]')
robpose[4] = getvar('ifcam.gmk.gmk' gmkvalue '.robPose[4]')
robpose[5] = getvar('ifcam.gmk.gmk' gmkvalue '.robPose[5]')
gmktimestamp = getvar('ifcam.gmk.gmk' gmkvalue '.time')

gmkready = true
canrequestnewgmk = false
timeoutcount = 0

else
if(canrequestnewgmk)
if(!global.gmklocalize.silent)
print("timeoutcount = " timeoutcount" / "timeoutlimit)
timeoutcount = timeoutcount + 1
if(timeoutcount > timeoutlimit)
if(!global.gmklocalize.silent)
print("Timed out while waiting for guidemark. New guidemark requested.")
global.gmklocalize.run = true
timeoutcount = 0

gmkworldpose[0] = mapbase.connectorGlobal(gmkname gmkvalue,'mappose.x')
gmkworldpose[1] = mapbase.connectorGlobal(gmkname gmkvalue,'mappose.y')
gmkworldpose[2] = mapbase.connectorGlobal(gmkname gmkvalue,'mappose.th')

x0 = gmkworldpose[0]
y0 = gmkworldpose[1]
th0 = gmkworldpose[2]

x = robpose[0]
y = robpose[1]
th = robpose[5]


resx = cos(th0)*x-sin(th0)*y+x0
resy = sin(th0)*x+cos(th0)*y+y0
resth = th + th0

odopose.poseattime(gmktimestamp)
ododeltastart[0] = odopose.calcpose[0]
ododeltastart[1] = odopose.calcpose[1]
ododeltastart[2] = odopose.calcpose[2]

ododeltaend = odopose.pose

x = ododeltaend[0] - ododeltastart[0]
y = ododeltaend[1] - ododeltastart[1]
th = ododeltaend[2] - ododeltastart[2]

resxdelta = cos(th0)*x-sin(th0)*y
resydelta = sin(th0)*x+cos(th0)*y
resthdelta = th

posetolocalize[0] = resx + resxdelta
posetolocalize[1] = resy + resydelta
posetolocalize[2] = limitToPi(resth + resthdelta)

if(!global.gmklocalize.silent)
print("--- printing poses for debug ---")
print("Odometry pose at time of image taken = "ododeltastart)
print("Odometry pose at current time = "ododeltaend)
print("Robot pose diff as a result (x y th) = "resxdelta" "resydelta" "resthdelta)
print("Guidemark pose in world coordinates = "gmkworldpose)
print("Robot pose relative to guidemark = "robpose)
print("Calculated pose to reset localizer = "posetolocalize)
print("Timestamp for image with guidemark = "gmktimestamp)
print("--- executing the following commands ---")
print("laser resetlocalizer")
print("laser mapobst map2localize")
print("laser setinitpose x="posetolocalize[0]" y="posetolocalize[1]" th="posetolocalize[2])
print("laser setinitcov Cx="cov[0]" Cy="cov[1]" Cth="cov[2])
print("laser localize")
print("")

core.send("laser resetlocalizer")
core.send("laser mapobst map2localize")
core.send("laser setinitpose x="posetolocalize[0]" y="posetolocalize[1]" th="posetolocalize[2])
core.send("laser setinitcov Cx="cov[0]" Cy="cov[1]" Cth="cov[2])
core.send("laser localize")

global.gmklocalize.successtime = now()

gmkready = false

wait() : false


    carthandling.rule

# Procedure for picking up a cart with a guidemark on it

# Cart with guidemark should be in front of the robot when rule is activated
# Guidemark should be defined in mapbase as a mobile guidemark
# Please set global configuration variables to fit needs

# Global variables
global.carthandling.pickup = false
global.carthandling.dropoff = false
global.carthandling.silent = false
global.carthandling.carryingcartnow = 0
global.carthandling.pickupsuccesstime = -1
global.carthandling.dropoffsuccesstime = -1

########## Configuration variables (some are global to allow change on the fly) #########

# Do search of gmk and cart if not found from initial position (true/false)
global.carthandling.searchforgmk = true

# How many attempts to dock should be made before giving up
global.carthandling.dockretries = 10

# How far directly in front of the cart should the robot move to initially (in meters, 1.0m is suggested)
global.carthandling.initialpickupdist = 0.65

# Recalibrate kinects before each pickup using tiltkinectcalibratefront.rule and tiltkinectcalibrateback.rule (true/false)
recalibratekinectsactive = true

# Pool image numbers to look for guidemarks in, and respective device numbers
poolimgfront = 30
gmkdevicefront = 18
poolimgback = 31
gmkdeviceback = 28

# Guidemark rectangle sizes in m
blocksize = 0.025

# How many run throughs of the rule should be completed before requesting new guidemark (if none found)
timeoutlimit = 3

# How many times to get and attempt to verify gmk before moving
gmkgetattemptlimit = 6

# How many degrees should be rotated extra every turn
rotateincrement = 5

# How large a total angle should the robot search in? (120 would mean a search from -60 to 60 degrees)
searchwidthangle = 120


# Movement speed of all docking actions taken (double between 0 and 1)
speedmodifier = 1.0

# The distance from the middle of the robot to the end of the magnet
midtomagnet = 0.39

# The distance to move further when docking to the cart, than what is just precisely needed
extradockdist = 0.02

# The speed at which the robot should approach the cart in the final stage of a docking
dockspeed = 0.08

# How many times should the robot retry docking before doing a 180 turn and starting over?
maxdockattempts = 2

# Seconds to wait after turning off magnet, before cart can be assumed to have disconnected
dropwait = 2

######### Local internal variables (do not modify) ########
doinit = 1
dokinectcalibrations = 1
gmkready = false
dosearchmove = false
searchmove = 0
stage = 1
dockattempts = 0
ask4gmk = true
canrequestnewgmk = false
gmkname = 'guidemark'
gmkvalue = -1
lastgmkimgserial = 0
gmkimgserial = 0
gmktimestamp = 0
poolimg = 0
lastpoolimg = 0
gmkdevice = 0
IDindex = 0
gmkcrc = 0
gmkinmap = -1
gmkstationary = 0
gmkrequests = 0
rotateval = 0
startcalibratetime = 0
maxspeed = 0.3
minspeed = 0.07
turnvel = (minspeed + (maxspeed-minspeed)*speedmodifier)*0.6
movevel = minspeed + (maxspeed-minspeed)*speedmodifier
timeoutcount = 0.0


robpose[0] = 0.0
robpose[1] = 0.0
robpose[2] = 0.0
robpose[3] = 0.0
robpose[4] = 0.0
robpose[5] = 0.0

xpose1 = 0
xrob = 0
yrob = 0
pickupdist1 = 0
pickupturn1 = 0
pickupturn2 = 0
alpha = 0
theta = 0
Pi = 3.1416
atan2angle = 0
atan2angle2 = 0
thetaoffset = 0
dockstatus = 0
dockattempts = 1
dockdisttomove = 0

# Make sure magnet is turned off
smr.do("setoutput \digitalout\ 0")

# Take an image and get gmk information
gmkvalue = -1
lastgmkimgserial = ifcam.gmk.imageserial[1]
lastpoolimg = poolimg
if(stage > 1)
poolimg = poolimgback
gmkdevice = gmkdeviceback
else
poolimg = poolimgfront
gmkdevice = gmkdevicefront

print("Asked for new guidemark from "gmkdevice)

core.send('cam gmk img='poolImg' block='blocksize)

if(lastpoolimg == poolimg)
ask4gmk = false
canrequestnewgmk = true
else
smr.do("wait 0.5")
smr.putEvent(100)
wait() : smr.gotEvent(100)


# Evaluate if there is a new guidemark ready
gmkimgserial = ifcam.gmk.imageserial[1]
if(gmkimgserial != lastgmkimgserial & gmkdevice == ifcam.gmk.imageserial[0] & ifcam.gmk.count > 0)
print("accepted new guidemark from "gmkdevice)
lastgmkimgserial = gmkimgserial
if(!global.carthandling.silent)
print("Found " ifcam.gmk.count " new guidemark(s) with ID(s): "ifcam.gmk.IDs" in image with serial: " gmkimgserial)

# go through all detected guidemarks, or until one is defined in map
IDindex = 0
gmkcrc = 0
gmkinmap = -1
gmkstationary = 1
while ((gmkcrc < 1 | gmkinmap < 0 | gmkstationary > 0) & IDindex
0 & gmkinmap >=0 & gmkstationary < 1)

robpose[0] = getvar('ifcam.gmk.gmk' gmkvalue '.robPose[0]')
robpose[1] = getvar('ifcam.gmk.gmk' gmkvalue '.robPose[1]')
robpose[2] = getvar('ifcam.gmk.gmk' gmkvalue '.robPose[2]')
robpose[3] = getvar('ifcam.gmk.gmk' gmkvalue '.robPose[3]')
robpose[4] = getvar('ifcam.gmk.gmk' gmkvalue '.robPose[4]')
robpose[5] = getvar('ifcam.gmk.gmk' gmkvalue '.robPose[5]')
gmktimestamp = getvar('ifcam.gmk.gmk' gmkvalue '.time')
if(!global.carthandling.silent)
print("chose guidemark with ID: "gmkvalue)

gmkready = true
canrequestnewgmk = false
timeoutcount = 0
rotateval = 0

else
if(canrequestnewgmk)
if(!global.carthandling.silent)
print("timeoutcount = " timeoutcount" / "timeoutlimit)
timeoutcount = timeoutcount + 1
if(timeoutcount > timeoutlimit)
if(!global.carthandling.silent)
print("Timed out while waiting for guidemark. New guidemark requested.")


timeoutcount = 0
ask4gmk = true
gmkrequests = gmkrequests + 1

if(gmkrequests > gmkgetattemptlimit & global.carthandling.searchforgmk)
if(!global.carthandling.silent)
print("Max get-guidemark tries reached. Moving to look in new direction.")
dosearchmove = true
gmkrequests = 0
ask4gmk = false
canrequestnewgmk = false

if(rotateval < 0)
rotateval = (rotateval - rotateincrement ) * -1
else
rotateval = (rotateval + rotateincrement ) * -1

if(rotateval >= searchwidthangle)
print("Could not find any carts to pick up. Moving on...")
dosearchmove = false
global.carthandling.pickup = false
global.carthandling.failedtime = now()
break

if(!global.carthandling.silent)
print("Turning " rotateval " degrees")

smr.do("turn " rotateval " @v" turnvel "")
smr.do("wait 1")
smr.putEvent(101)
wait() : smr.gotEvent(101)
dosearchmove = false
ask4gmk = true

# move in front of cart and point magnet at gmk

# Make sure magnet is turned off
smr.do("setoutput \digitalout\ 0")

xpose1 = global.carthandling.initialpickupdist

theta = robpose[5]
xrob = robpose[0]
yrob = robpose[1]
alpha = atan((xrob-xpose1)/yrob)


pickupdist1 = sqrt((xrob-xpose1)*(xrob-xpose1)+yrob*yrob)

#calculate the first and second turn angle
if(theta >=0)
pickupturn1 = (theta - pi/2 + alpha) * -1
pickupturn2 = (theta + alpha) * -1
else
pickupturn1 = (theta + pi/2 - alpha) * -1
pickupturn2 = (theta - alpha) * -1

atan2angle = atan2(yrob, xrob-xpose1)

if(theta < 0)
atan2angle2 = atan2angle - pi - theta
else
atan2angle2 = atan2angle + pi - theta

if(!global.carthandling.silent)
print("Robot pose in gmk coordinates = "robpose)
print("Distance to initial position = "pickupdist1" m")
print("Angle to turn = "atan2angle2" radians")

if(pickupdist1 > 0.05)
smr.do("turn "atan2angle2" \rad\ @v"turnvel)
smr.do("ignoreobstacles")
smr.do("fwd "pickupdist1" @v"movevel)
smr.do("turn "-theta-atan2angle2" \rad\ @v"turnvel)
smr.putEvent(102)
wait() : smr.gotEvent(102)
gmkready = false
ask4gmk = true
stage = 2

xpose1 = global.carthandling.initialpickupdist

dockdisttomove = robpose[0] - midtomagnet + extradockdist
print("robpose[0] = "robpose[0])
thetaoffset = -robpose[5]
if(!global.carthandling.silent)
print("thetaoffset = "thetaoffset" (max = 0.03)")

if(abs(thetaoffset) > 0.05) # large offset
smr.do("turn "thetaoffset" \rad\ @v"minspeed/3)
smr.putEvent(103)
wait() : smr.gotEvent(103)
gmkready = false
ask4gmk = true
else


if(!global.carthandling.silent)
print("Distance between cart and magnet (including extradockdist) = "dockdisttomove)

smr.do("setoutput \digitalout\ 1")
smr.do("fwd "-dockdisttomove" @v"dockspeed)
smr.do("wait 0.5")
smr.do("turn "-0.03" \rad\ @v"minspeed/2)
smr.do("turn "0.06" \rad\ @v"minspeed/2)
smr.do("turn "-0.03" \rad\ @v"minspeed/2)
smr.do("fwd 0.02")
smr.putEvent(104)
wait() : smr.gotEvent(104)
dockstatus = smr.eval("$digital[1]")

if(dockstatus < 1)
if(!global.carthandling.silent)
print("Docking failed - microswitch not pressed. Trying again")
smr.do("setoutput \digitalout\ 0")
smr.do("wait 1")
smr.do("fwd "dockdisttomove" @v"movevel)
smr.putEvent(105)
wait() : smr.gotEvent(105)
if(dockattempts >= maxdockattempts)
smr.do("turn 180 @v"turnvel)
smr.do("fwd -0.1 @v"dockspeed)
smr.putEvent(106)
wait() : smr.gotEvent(106)
dockattempts = 1
print("SETTING STAGE=1")
stage = 1
else
dockattempts = dockattempts + 1
gmkready = false
ask4gmk = true
print("flags are now: gmkready = "gmkready" , ask4gmk = "ask4gmk)
else
if(!global.carthandling.silent)
print("Docking success! - global variable 'carryingcartnow' updated")
global.carthandling.carryingcartnow = 1
dockattempts = 1
stage = 3

global.carthandling.pickup = false
gmkready = false
ask4gmk = true
dosearchmove = false


stage = 1
global.carthandling.pickupsuccesstime = now()

smr.do("setoutput \digitalout\ 0")
smr.do("wait "dropwait)
smr.putEvent(107)
wait() : smr.gotEvent(107)

if(!global.carthandling.silent)
print("Dropoff successful!")
global.carthandling.carryingcartnow = 0
global.carthandling.dropoffsuccesstime = now()
global.carthandling.dropoff = false

startcalibratetime = now()
global.tiltkinectcalibratefront.run = true
wait() : (now() - global.tiltkinectcalibratefront.successtime < 1.0 | now() - startcalibratetime > 3) # wait for calibration
if(now() - global.tiltkinectcalibratefront.successtime < 1.0)
if(!global.carthandling.silent)
print("front-kinect successfully calibrated!")
else
if(!global.carthandling.silent)
print("front-kinect calibration failed!")

startcalibratetime = now()
global.tiltkinectcalibrateback.run = true
wait() : (now() - global.tiltkinectcalibrateback.successtime < 1.0 | now() - startcalibratetime > 3) # wait for calibration
if(now() - global.tiltkinectcalibrateback.successtime < 1.0)
if(!global.carthandling.silent)
print("back-kinect successfully calibrated!")
else
if(!global.carthandling.silent)
print("back-kinect calibration failed!")

dokinectcalibrations = 0

#Print init data
print("--------------------------carthandling.rule-------------------------")
print("Rule initialized with settings:")
print("")
print(" front kinect img: "poolimgfront)


    print(" back kinect img: "poolimgback)print(" drive velocity: "movevel" m/s")print(" turn velocity: "turnvel" m/s")print(" pickup retries: "global.carthandling.dockretries)print(" initial distance: "global.carthandling.initialpickupdist" m")if(global.carthandling.searchforgmk)

    print(" gmk-search: ON - "gmkgetattemptlimit" attempts to getguidemark before moving")

    print(" - search width: +-"searchwidthangle/2" degrees")print(" - increment size: "rotateincrement" degrees")

    else

    print(" gmk-search: OFF")print("")if(now() - global.tiltkinectcalibratefront.successtime + now() - glob-

    al.tiltkinectcalibrateback.successtime < 10)print(" Front and back kinects successfully calibrated!")print("--------------------------------------------------------------------")doinit = 0

    wait() : false


    dockdemo.rule

global.dockdemo.run = false
global.dockdemo.loops = 0
global.dockdemo.done = true

highspeed = 0.2
lowspeed = 0.1
speed = 0

currx = 0
curry = 0
currth = 0

pi = 3.1416

smr.do("laser \push cmd='localize' t=1\ ")
smr.do("laser \push cmd='mrcobst width=0.8' t=0.5\ ")

if(global.carthandling.carryingcartnow)
speed = lowspeed
else
speed = highspeed

global.dockdemo.done = false
global.dockdemo.loops = global.dockdemo.loops + 1

global.gmklocalize.run=1
wait() : (now() - global.gmklocalize.successtime < 1.0) # wait for localization

smr.do("turn -90 @v"speed)
smr.do("drivew 3.2 0.9 180 @v"speed" : ($targetdist < 0.15)")
smr.do("fwd 0.15 @v"speed)
smr.do("turn -90 @v"speed)
smr.putEvent(15)
wait() : smr.gotEvent(15)
global.carthandling.pickup = true
wait() : (now() - global.carthandling.pickupsuccesstime < 1.0)
#smr.do("drivew 3.2 0.6 -90 @v0.1 : ($targetdist < 0.15)")
currx = mappose.pose[0]
curry = mappose.pose[1]
currth = mappose.pose[2]
smr.do("turn "-pi/2-currth" \rad\ @v"speed)
smr.do("fwd "curry-1.10" @v"speed)
smr.do("turn "-currth" \rad\ @v"speed*0.6)
smr.do("fwd "-(currx-1.30)" @v"speed)
smr.putEvent(16)
wait() : smr.gotEvent(16)
global.carthandling.dropoff = true
wait() : (now() - global.carthandling.dropoffsuccesstime < 1.0)


    smr.do("drive 0.3 @v"speed" : ($drivendist > 0.3)")smr.do("turnr 0.5 -160 @v"speed)smr.do("drive 0.1 @v"speed" : ($drivendist > 0.1)")smr.do("fwd 0.1")smr.putEvent(17)wait() : smr.gotEvent(17)global.gmklocalize.run=1

    wait() : (now() - global.gmklocalize.successtime < 1.0) # wait for lo-calization

    #smr.do("turn 130 @v" -speed)smr.do("drivew 2.2 0.9 90 @v"speed" : ($targetdist < 0.15)")smr.do("fwd 0.15 @v"speed)smr.do("turn 90 @v"speed)smr.putEvent(18)wait() : smr.gotEvent(18)global.carthandling.pickup = truewait() : (now() - global.carthandling.pickupsuccesstime < 1.0)currx = mappose.pose[0]curry = mappose.pose[1]currth = mappose.pose[2]smr.do("turn "-currth" \rad\ @v"speed)smr.do("fwd "3.25-currx" @v"speed)smr.do("turn "-pi/2-currth" \rad\ @v"speed*0.6)smr.do("fwd "-(1.45-curry)" @v"speed)smr.putEvent(19)wait() : smr.gotEvent(19)global.carthandling.dropoff = truewait() : (now() - global.carthandling.dropoffsuccesstime < 1.0)smr.do("fwd 0.2 @v"speed)smr.do("turn 110 @v"speed)

    smr.do("fwd 1.2 @v"speed)smr.do("wait 1")smr.putEvent(20)wait() : smr.gotEvent(20)global.carthandling.dropoff = trueglobal.dockdemo.done = trueprint("Mission done - loop " global.dockdemo.loops)print("---------Dockdemo initialized----------")

    wait() : false


    DTU326-map.xml
