ram_20150601_jun_2015


  • Move, Manipulate, Accelerate Your Research with the Pioneer Manipulator mobile robot

    Adept MobileRobots, LLC. 10 Columbia Drive, Amherst, NH 03031. 603-881-7960. [email protected]

    Pioneer Manipulator includes:

    Vision: Kinect for Windows V2 Pan/Tilt Stage

    Manipulation: Two Kinova Jaco2 Research Manipulators

    Torso: Two different options for manipulator mounting points

    Autonomous Navigation and Mapping Software

    SICK S300 Laser Scanner

    Joystick (used for Mapping, Re-location)

    Front and Rear Sonar, Forward Bumper Panel

    Wireless Ethernet Communication

    Color LED Status Indicator Rings

    Docking Station for Autonomous or Manual Charging

    Speakers and Voice Synthesis Software

    Pioneer Software Development Kit

    The Pioneer Manipulator is a rugged, reliable, sophisticated robot purpose-built for the research community and its needs. Designed to be extensively capable out of the box in a broad array of applications, the Pioneer Manipulator is sure to immediately enhance your new or existing program.

    Visit www.mobilerobots.com for specifications and accessories.


  • IEEE ROBOTICS & AUTOMATION MAGAZINE. Vol. 22, No. 2, June 2015. ISSN 1070-9932. http://www.ieee-ras.org/publications/ram

    FEATURES

    24 Flying Smartphones: Automated Flight Enabled by Consumer Electronics
       By Giuseppe Loianno, Gareth Cross, Chao Qu, Yash Mulgaonkar, Joel A. Hesch, and Vijay Kumar

    33 Automated Vitrification of Embryos: A Robotics Approach
       By Jun Liu, Chaoyang Shi, Jun Wen, Derek Pyne, Haijiao Liu, Changhai Ru, Jun Luo, Shaorong Xie, and Yu Sun

    41 Cloud Automation: Precomputing Roadmaps for Flexible Manipulation
       By Kostas E. Bekris, Rahul Shome, Athanasios Krontiris, and Andrew Dobson

    51 Radiation Queue: Meeting Patient Waiting Time Targets
       By Siqiao Li, Na Geng, and Xiaolan Xie

    64 Short-Term Scheduling of Crude-Oil Operations: Enhancement of Crude-Oil Operations Scheduling Using a Petri Net-Based Control-Theoretic Approach
       By NaiQi Wu, MengChu Zhou, and ZhiWu Li

    77 Coordinating Autonomy: Sequential Resource Allocation Systems for Automation
       By Spyros Reveliotis

    Digital Object Identifier 10.1109/MRA.2014.2381433

    ON THE COVER: Loianno et al. describe how an emerging, low-cost consumer electronics technology can be used to control personal drones for home automation. Background: image licensed by Ingram Publishing. Cover photo: istockphoto.com/tharrison.

    If you like an article, click this icon to record your opinion. This capability is available for online Web browsers and offline PDF reading on a connected device.


  • A Publication of the IEEE ROBOTICS AND AUTOMATION SOCIETY. Vol. 22, No. 2, June 2015. ISSN 1070-9932. http://www.ieee-ras.org/publications/ram

    IEEE Robotics & Automation Magazine (ISSN 1070-9932) (IRAMEB) is published quarterly by the Institute of Electrical and Electronics Engineers, Inc. Headquarters: 3 Park Avenue, 17th Floor, New York, NY 10016-5997 USA. Telephone: +1 212 419 7900. Responsibility for the content rests upon the authors and not upon the IEEE, the Society, or its members. IEEE Service Center (for orders, subscriptions, address changes): 445 Hoes Lane, P.O. Box 1331, Piscataway, NJ 08855 USA. Telephone: +1 732 981 0060. Individual copies: IEEE members US$20.00 (first copy only), nonmembers US$118.00 per copy. Subscription rates: annual subscription rates included in IEEE Robotics and Automation Society member dues; subscription rates available on request. Copyright and reprint permission: abstracting is permitted with credit to the source. Libraries are permitted to photocopy beyond the limits of U.S. copyright law for the private use of patrons: 1) those post-1977 articles that carry a code at the bottom of the first page, provided the per-copy fee indicated in the code is paid through the Copyright Clearance Center, 222 Rosewood Drive, Danvers, MA 01923 USA; 2) pre-1978 articles without a fee. For other copying, reprint, or republication permission, write Copyrights and Permissions Department, IEEE Service Center, 445 Hoes Lane, Piscataway, NJ 08854. Copyright © 2015 by the Institute of Electrical and Electronics Engineers, Inc. All rights reserved. Periodicals postage paid at New York and additional mailing offices. Postmaster: send address changes to IEEE Robotics & Automation Magazine, IEEE, 445 Hoes Lane, Piscataway, NJ 08854 USA. Canadian GST #125634188. Printed in the U.S.A.

    IEEE prohibits discrimination, harassment, and bullying. For more information, visit http://www.ieee.org/web/aboutus/whatis/policies/p9-26.html.

    EDITORIAL BOARD

    Editor-in-Chief: Eugenio Guglielmelli ([email protected]), Campus Bio-Medico University, Roma (Italy)

    Associate Editors:
    Antonio Franchi, Centre National de la Recherche Scientifique, CNRS (France)
    Yi Guo, Stevens Institute of Technology (USA)
    Hesuan Hu, Xidian University (PR China)
    Ferdinando Rodriguez Y Baena, Imperial College (UK)
    Yu Sun, University of South Florida (USA)
    Loredana Zollo, Campus Bio-Medico University, Roma (Italy)

    Past Editor-in-Chief: Peter Corke, Queensland University of Technology (Australia)

    RAM Editorial Assistant and Column Manager: Rachel O. Warnick (USA)

    COLUMNS

    Competitions: Stephen Balakirsky and Dan Popa (USA)
    From the Editor's Desk: Eugenio Guglielmelli (Italy)
    Industry/Humanitarian Technology: Raj Madhavan, University of Maryland College Park (USA)
    ROS Topics: Steve Cousins (USA)
    On the Shelf: Alex Simpkins, RDP Robotics (USA)
    Student Corner: Lauren Miller, Northwestern University (USA)

    IEEE RAS Vice-President of Publication Activities: Antonio Bicchi, University of Pisa (Italy)

    RAM home page: http://www.ieee-ras.org/publications/ram

    Robotics and Automation Society Project Specialists: Kathy Colabaugh, Rachel O. Warnick, [email protected]

    Advertising Sales: Mindy Belfer, Advertising Sales Coordinator. Tel: +1 732 562 3937. Fax: +1 732 981. [email protected]

    IEEE Periodicals Magazines Department: Debby Nowicki, Managing Editor ([email protected]); Janet Dudar, Senior Art Director; Gail A. Schnitzer, Assistant Art Director; Theresa L. Smith, Production Coordinator; Felicia Spagnoli, Advertising Production Manager; Peter M. Tuohy, Production Director; Dawn M. Melley, Editorial Director; Fran Zappulla, Staff Director, Publishing Operations

    IEEE-RAS Membership and Subscription Information: +1 800 678 IEEE (4333). Fax: +1 732 463 3657. http://www.ieee.org/membership_services/membership/societies/ras.html

    Digital Object Identifier 10.1109/MRA.2014.2381434


    COLUMNS & DEPARTMENTS

    4 FROM THE EDITOR'S DESK
    6 PRESIDENT'S MESSAGE
    8 COMPETITIONS
    18 ROS TOPICS
    21 INDUSTRIAL ACTIVITIES
    22 FROM THE GUEST EDITORS
    96 EDUCATION
    100 SOCIETY NEWS
    108 HUMANITARIAN TECHNOLOGY
    112 WOMEN IN ENGINEERING
    116 CALENDAR
    120 TURNING POINT


  • for more information: kuka-robotics.com


    empower you

    In line with the slogan "enabling you to realize your potential and your ideas," KUKA Robotics provides scientists with a product portfolio that meets the highest standards in research and education: powerful tools that have everything researchers need to develop their own applications in the fields of robotics and service robotics and to implement these at a professional level.

    ii am safe: Thanks to its integrated torque sensors, the LBR iiwa detects collisions and reacts compliantly.

    ii am sensitive: The LBR iiwa enables the automation of sensitive assembly tasks.

    LBR iiwa
    Human-robot collaboration
    Sensitive
    Programming ease: C++, Java, or any other code language can be used.
    Soft real-time applications: visual servoing and haptic applications, easy to realize with Connectivity.

    KUKA Education Bundle
    Vocational training
    Qualification
    Best practice in industrial robotics
    Basic operating & programming

    KUKA youBot
    Research in mobile manipulation
    Open-source programming
    Open interfaces
    Teaching basics of robot kinematics and dynamics
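    The collision behavior described in the ad (integrated torque sensing triggering a compliant reaction) can be sketched as a simple threshold monitor on the external joint torque, i.e., the difference between measured torque and the torque predicted by the robot's dynamics model. This is an illustrative sketch only; the function names and threshold are hypothetical, not KUKA's API.

    ```python
    # Hypothetical sketch of torque-threshold collision detection (not KUKA's API).
    # A collision is flagged when the external torque on any joint (measured
    # torque minus the torque predicted by the dynamics model) exceeds a limit.

    def detect_collision(measured, predicted, limit_nm=5.0):
        """Return True if any joint's external torque exceeds limit_nm (N*m)."""
        return any(abs(m - p) > limit_nm for m, p in zip(measured, predicted))

    # Example: joint 3 sees 8 N*m more torque than the model predicts.
    measured_torques = [1.0, 0.5, 10.0, 0.2, 0.1, 0.0, 0.0]
    predicted_torques = [1.2, 0.4, 2.0, 0.3, 0.1, 0.0, 0.0]

    if detect_collision(measured_torques, predicted_torques):
        # React compliantly, e.g., switch to a gravity-compensation mode
        # instead of continuing the commanded motion.
        print("collision detected: switching to compliant mode")
    ```

    In a real controller this runs inside the servo loop, and the reaction (stop, retract, or go limp) is a configurable safety behavior rather than a print statement.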

  • FROM THE EDITOR'S DESK

    Robots Don't Pray
    By Eugenio Guglielmelli

    Historically, the evolution of research in the areas of robotics and artificial intelligence (AI) has always been related, although with different intensity over the years. The concept of AI was originally developed around the basic idea of an artificial mind, mainly capable of formal, logical analysis of structured knowledge, and designed to be embedded in computational systems. Until the 1980s, a benchmark for an artificial mind was beating the world's best chess player, without any need to generate and control movements and forces in the real world. The famous Turing test was about the capability of a computer to give correct, humanlike answers to ordinary questions while being hidden behind a curtain. Just a few years ago, a computer won a popular television show by providing correct answers to a variety of questions directly interpreted from natural spoken language. Actually, Turing himself proposed a variety of visionary scenarios where the body, not just the mind, was part of the process of developing intelligence. Along this line, a milestone for both the robotics and AI research domains was the seminal paper by Rodney A. Brooks, "Elephants Don't Play Chess" (Robotics and Autonomous Systems, 1990), which states that perception can be turned into action without any need for abstract, formal mediation and logical reasoning. It was the genesis of reactive robotics: perception and, progressively, embodiment became the central focus of an increasing portion of robotics research, the basis for the development of machines capable of synthesizing and adapting their behavior in real time to their working environment. In brief, it took a while but, eventually, the artificial mind and the artificial body are being considered of equal importance when dealing with the problem of developing humanlike intelligent systems.

    Of note, in recent decades there has been increasing attention paid by heterogeneous communities to concepts such as the technological singularity (i.e., artificial intelligence overcoming human intelligence), artificial life, transhumanism, and immortality. In principle, this is in line with the visions of cybernetic systems originally proposed by Wiener, Turing, and others back in the middle of the last century. What is new in these old speculations, which were so far confined to science fiction, is that they are now supposed to be systematically pursued by also exploiting the expected advances of AI, robotics, and automation technology for building a new type of agent, featuring different levels of integration between biological and artificial components. For instance, the 2045 Initiative (www.2045.com) has developed a roadmap for implanting a human brain in an artificial body by that year; the Singularity and Humanity+ initiatives are planning to run universities worldwide and sponsor research projects by groups joining the overall philosophy and sharing the vision of immortality enabled by technology, which is in parallel somehow elaborated from a theoretical, transdisciplinary perspective. To some extent, this seems a kind of new religion: you need to be a follower in order to be part of an effort apparently supported not just by some groups of naïve researchers but also by large companies and a variety of sponsors that are providing a significant amount of financial and logistical resources, currently used to build a growing network of partners worldwide.

    To me, all of this effort seems not only quite weak from a scientific and technological viewpoint but also very dangerous for the negative and misleading perception that it can generate in society about the ultimate research aims and ethical background of our community. I believe that it is important to raise awareness in our Society, and in IEEE at large, about these initiatives. This is perfectly feasible and timely, as demonstrated by the Association for the Advancement of Artificial Intelligence (AAAI), which has recently promoted a dialogue with the Future of Life Institute, another private initiative that promotes research to ensure that AI systems are robust and beneficial, doing what humans want them to do (futureoflife.org). Cooperation between the IEEE Robotics and Automation Society and AAAI to foster the convergence of AI and robotics is ongoing (see p. 106 for a report on recent activities). And I am happy to know that IEEE Technology and Society Magazine is already very active and interested in joining IEEE RAM in this effort. For now, let us enjoy the exciting and reassuring features on the emerging advances of automation selected for you by the guest editors of this special issue!

    Digital Object Identifier 10.1109/MRA.2015.2430991
    Date of publication: 18 June 2015



  • PRESIDENT'S MESSAGE

    Women in Robotics and Automation
    By Raja Chatila

    It is common knowledge that women are underrepresented in science and engineering in general, and this is true worldwide. What is the situation in the IEEE, and in the IEEE Robotics and Automation Society (RAS) in particular? The latest statistics from the IEEE, dated December 2014, provide the following figures:

    The total number of IEEE Members is 426,488, while the total number of women in the IEEE is 31,960.

    The total number of RAS members is 12,694, and the total number of women RAS members is 835.

    The percentage of women in the IEEE is 7.5%, and 6.6% in RAS.

    The IEEE is a global Society, which means that these figures should reflect the global situation. However, the average percentages of women in science and engineering are in the 15-20% range for the general areas of physics and computer science in industrialized countries. The situation of women in science and engineering has been analyzed thoroughly by many researchers and institutions worldwide, and this column is not the appropriate place to discuss the issue. The questions on which I would like to focus in this column are as follows: What is specific to the IEEE and RAS that makes our figures so low? How can we remedy the situation?
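    The quoted percentages follow directly from the membership counts above; a quick arithmetic check (Python):

    ```python
    # Verify the reported shares of women members against the raw counts
    # from the December 2014 IEEE statistics quoted above.
    ieee_total, ieee_women = 426_488, 31_960
    ras_total, ras_women = 12_694, 835

    pct_ieee = 100 * ieee_women / ieee_total
    pct_ras = 100 * ras_women / ras_total

    print(f"IEEE: {pct_ieee:.1f}%, RAS: {pct_ras:.1f}%")  # IEEE: 7.5%, RAS: 6.6%
    ```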

    It is possible that the answer to the first question is related to a higher availability of men to participate in events and conferences organized by the RAS that require travel, which, therefore, makes membership more useful to them. This might explain the low membership figure with respect to the percentage of women professionals. However, I am not sure there are actually statistics supporting this, and it is probably something we should investigate. But it is a fact that we have many more men than women attending our major conferences.

    The second question is in our hands to address. RAS should take strong positive action to involve more women in our activities and leadership. We should make it easier for women to attend our events, and we should set an example through the involvement of women in our leadership. Some might argue that, given the percentage of women members, it is normal that the RAS leadership has an unbalanced gender representation. This is true, but it ignores the dynamics of the situation: the more that women are involved in the organization's leadership, the more women will be attracted to become members.

    Within the IEEE, IEEE Women in Engineering (WIE) is the world's largest international professional organization dedicated to promoting women engineers and scientists. Within the RAS Member Activities Board, with Vice President Jing Xiao, our WIE liaison is Laura Margheri. A WIE lunch is organized at our major conferences to provide an opportunity for all female and male professionals who are interested in discussing the subjects of women's engineering education, career path, career/family choices, and other topics. The WIE Committee in RAS is also reflecting on concrete actions to better involve women in our leadership.

    In this respect, the 2015 IEEE International Conference on Robotics and Automation is teaching us a great lesson. With Honorary Chair Ruzena Bajcsy, General Chair Lynne Parker, and Program Chair Nancy Amato, all the other conference committee members are also women. This is a bold decision, unprecedented in any conference in the area of robotics and automation and very probably in other fields. This clearly proves that RAS counts many dynamic women leaders among its members.

    Having women leaders in robotics and automation will certainly have a great impact on how engineering and science are considered by the public. But we still need to do much more to promote membership and leadership within the RAS.

    Our discipline can be an instrument to improve gender equality in science and engineering. Because of the attractiveness of robotics and automation to students and the general public, we can exploit the opportunity of our conferences, which take place yearly in countries worldwide, to be proactive in organizing events targeted to attract young students of both genders. We must create a new dynamic.

    Digital Object Identifier 10.1109/MRA.2015.2427879
    Date of publication: 18 June 2015


    KVH Industries, Inc. World Headquarters: 50 Enterprise Center | Middletown, RI 02842 U.S.A. | [email protected] | +1 401.847.3327. © 2015 KVH Industries, Inc. KVH is a registered trademark of KVH Industries, Inc. Protected by one or more U.S. or international patents.

    kvh.com/imu

                                   1725 IMU        1750 IMU        1775 IMU
    Bias Instability               0.1°/hr         0.05°/hr        0.05°/hr
    Bias vs. Temperature           4°/hr           1°/hr           0.7°/hr
    Angle Random Walk (25°C)       0.017°/√hr      0.012°/√hr      0.012°/√hr
    Bandwidth                      440 Hz          440 Hz          1000 Hz
    Data Rate                      1 to 1000 Hz    1 to 1000 Hz    1 to 5000 Hz
    Bias Magnetic Sensitivity      2°/hr/gauss     2°/hr/gauss     0.5°/hr/gauss
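    To put angle random walk (ARW) and bias figures in context: the heading error from ARW grows with the square root of integration time, while an uncompensated bias drifts linearly. A rough back-of-the-envelope sketch using the standard textbook relations (a generic gyro error model, not KVH's published one):

    ```python
    import math

    def gyro_drift_deg(arw_deg_per_rt_hr, bias_deg_per_hr, hours):
        """Rough 1-sigma heading error after `hours` of unaided integration."""
        random_walk = arw_deg_per_rt_hr * math.sqrt(hours)  # grows as sqrt(t)
        bias_drift = bias_deg_per_hr * hours                # grows linearly
        # Independent error sources combine root-sum-square.
        return math.hypot(random_walk, bias_drift)

    # 1775-class numbers from the table above: ARW 0.012 deg/sqrt(hr),
    # bias instability 0.05 deg/hr. After one hour:
    print(gyro_drift_deg(0.012, 0.05, 1.0))  # about 0.05 deg
    ```

    The bias term dominates after roughly (ARW/bias)² hours, which is why the bias instability row is usually the headline spec for long, unaided runs.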

    [Chart: price vs. performance]
    1725 IMU: FOG performance at MEMS prices (new product)
    1750 IMU: higher performance, great value
    1775 IMU: premium performance for challenging applications (new product)

    KVH Introduces a Fiber Optic Gyro IMU for Every Demanding Application

    Great...Better...Best!

    Need help selecting the right IMU for your application? Get the details and download the white paper.


  • COMPETITIONS

    Toward the Robot Butler: The HUMABOT Challenge

    By Enric Cervera, Juan Carlos García, and Pedro J. Sanz

    The HUMABOT Challenge, the official robot competition of the 2014 IEEE Robotics and Automation Society (RAS) International Conference on Humanoid Robots (Humanoids 2014), was held during the conference in Madrid, Spain, 18-20 November. After the Humanoids conference had run around the world for a long time, this was the first time that a competition was organized jointly with the event. This article highlights some of the logistic challenges and successes of the new competition.

    Inspiration and Aim

    The original idea for the competition was due to Prof. Carlos Balaguer, general chair of Humanoids 2014. Prof. Balaguer was the coordinator of the Spanish Robotics Network from 2006 to 2009. During that period, he was a forefather of a national humanoids competition (CEABOT), which today is a well-known and consolidated undergraduate humanoid robot competition. In the CEABOT competition, the winner is in charge of organizing the next edition. Universitat Jaume I (UJI) has been participating since 2007 and won four years consecutively (2007-2010). Although the CEABOT winner is in charge of organizing the next edition, UJI has been a coorganizer of the competition since 2010. Therefore, due to this long-time experience, Prof. Balaguer chose UJI as the organizer of the first HUMABOT Challenge.

    A year before the challenge, during the preliminary planning stage, we envisioned a competition including three different segments, depending on the humanoid platform size selected: mini, midsize, or large. The mini platform was the easiest, since CEABOT is based on it, and we had experience with this type of platform [1], [2]. However, the capabilities of mini platforms (e.g., Robonova) are much more limited than those of the midsize segment (e.g., NAO and DARwIn-OP), with which we also have previous experience [3]. In addition, there are well-established international competitions for midsize platforms (e.g., RoboCup). The large humanoids segment relies exclusively on a specific platform, and the competition is sponsored by a company. Thus, after considering all of the restrictions (economic, international impact, and so on), it was decided to drastically simplify the competition, focusing exclusively on midsize humanoid platforms around 50 cm tall, making the use of popular platforms like NAO and DARwIn-OP possible.

    Our aim was to design a competition with an affordable entry level suitable for graduate students interested in pursuing a Ph.D. degree in robotics. In our experience, using real robots in teaching undergraduate students has been accompanied by an increase in enrollment rates in robotics studies [4]. We tried not to overlap with existing competitions like RoboCup@Home, although both share some common principles, e.g., the robot must perform manipulation tasks in a domestic environment [5]. We aimed to define three realistic problems that could be easily stated in a single sentence yet provide a significant challenge with the available resources.

    Logistic Issues

    In parallel with designing the scenarios and rules for the competition, an international committee to assist in spreading the event worldwide was selected a year before the challenge. The committee consisted of:

    Pedro Lima, Instituto Superior Técnico, Portugal

    Luca Iocchi, Università di Roma La Sapienza, Italy

    Sven Behnke, Rheinische Friedrich-Wilhelms-Universität Bonn, Germany

    Peter Stone, University of Texas, United States

    Eiichi Yoshida, CNRS-AIST Joint Research Laboratory, Japan

    Alan F.T. Winfield, University of the West of England, Bristol, United Kingdom

    Pedro J. Sanz, UJI, Spain (competition chair).

    Likewise, a local committee was created to design and implement all the logistical details of this competition, consisting of:

    Enric Cervera, UJI, Spain

    Juan C. García, UJI, Spain

    Guillem Alenyà, Institut de Robòtica Industrial, CSIC-Universitat Politècnica de Catalunya, Spain

    Francisco Blanes, Universitat Politècnica de València, Spain

    Fernando Gómez, Universidad de Huelva, Spain

    Sam Pfeiffer, PAL Robotics, Spain.

    The official Web site of the event was made available to the participants from

    Digital Object Identifier 10.1109/MRA.2015.2418511
    Date of publication: 18 June 2015


  • ENGINEERS START HERE

    Access 500,000 in-stock electronics products, custom services, tools, and expertise, all in one place. Plus, count on customer service that goes above & beyond to deliver your needs. Complete engineering solutions start at Newark element14.

    1 800 463 9275 | newark.com


  • the very beginning, and it has been a key way to disseminate and exchange all the necessary information to guarantee the success of the initiative, allowing updates to recommendations and guidelines for competitors. A simulation model of the competition was published in advance on the Web site so that potential participants could exercise their skills in a simulated environment. The Web site contained not only a complete simulation model representing the scenario but also an online connection to an NAO platform in the real kitchen scenario, where, after signing in with authorized credentials, the teams could run their algorithms, thus obtaining valuable feedback.

    Attracting Participants

    One of the most critical points in the organization of a new humanoids competition is offering something different to attract potential teams. In this respect, the challenge was taking advantage of the strong impact associated with a well-known international conference like Humanoids. Unlike many other similar competitions, the focus was not a single platform. In fact, although two specific platforms were proposed as examples (i.e., NAO and DARwIn-OP), other possibilities were admitted as long as they met the size specifications. Other major attractors were the availability of funding to cover travel and equipment expenses as well as the three monetary prizes for the winners of the challenge (i.e., €1,000 for first place, €700 for second place, and €300 for third place).

    Application and Selection Process

    To participate, the teams were required to fill out a short form on the official competition Web site and submit a qualification document of up to five pages with information about the team, the robot, the team's related research interests, and a summary of past relevant work and scientific publications. In addition to this document, each team had to submit a link to a video that demonstrated the current status of their robotics research. Videos of simulation contributions were also accepted in cases where real robots were unavailable. If the candidates already had a robot platform, the video was required to show the current state of their robot's performance. Based on this information, the local committee was able to select the best teams to participate in the competition. In addition, the preliminary process included the selection of teams that would be granted travel funds for the students (up to €2,000, depending on the team location).

    The Setup

    The competition was oriented to real robots, but a simulation setup was also prepared for research groups that could not afford a real platform. Those

    ROBOTIC END-EFFECTORS

    Measure all six components of force and torque in a compact, rugged sensor.

    www.ati-ia.com/mes
    919.772.0115

    Low-noise Electronics: interfaces for Ethernet, PCI, USB, EtherNet/IP, PROFINET, CAN, EtherCAT, Wireless, and more

    Interface Structure: high-strength alloy provides IP60, IP65, and IP68 environmental protection as needed

    Sensing Beams and Flexures: designed for high stiffness and overload protection

    The F/T Sensor outperforms traditional load cells, instantly providing all loading data in every axis. Engineered for high overload protection and low noise, it's the ultimate force/torque sensor. Only from ATI.


  • Zumo 32U4: High Performance in an Accessible Robot

    Powerful and versatile, the Zumo 32U4 is a mobile robot accessible to beginners yet engaging for advanced users. The integrated ATmega32U4 offers Arduino-compatible support for those getting started with electronics and microcontrollers, while advanced features include quadrature encoders, a sophisticated object detection system, and an inertial measurement unit (IMU), enabling substantial educational and research applications.

    Whether you want a competitive Mini-Sumo robot, a robot swarm, or a small friend to roam your desk, the Zumo 32U4 is the robot for you.

    Find out more at www.pololu.com/zumo


12 • IEEE ROBOTICS & AUTOMATION MAGAZINE • JUNE 2015

groups would thus have the opportunity to test their algorithms. Then, in the final competition, a standard platform would be available for them. Onboard computation was not enforced, but the robot was required to operate in a fully autonomous way, without human intervention.

Although the setup was downsized to the scale of the robots, it was realistic and addressed challenging problems involving manipulation, grasping, and object recognition under casual lighting conditions. In the HUMABOT Challenge scenario, the robot is an integral part of the house and helps its occupants to have a better quality of life. The robot was located in a kitchen corner (Figure 1) consisting of a kitchen module and a table. The kitchen module comprised two lower cupboards, a working surface in the middle, and a higher shelf with a microwave oven. Because the original buttons controlling the stove were too small, the stove control unit was modified with bigger switches for easier manipulation by the robots. The location of the stove controls was the same for all the robots. Some landmarks were provided, and the teams were allowed to add extra landmarks on the furniture, but not on the objects.

Vision was a requirement for solving the challenge, but off-the-shelf techniques could have been readily used.


    Figure 1. (a) An overview of the kitchen used in the competition and (b) a close-up of the stove area.



Colored blob segmentation, database image recognition, and landmark recognition are widely available algorithms in many computer-vision frameworks. Besides identifying the objects in the task, landmarks can help to solve the localization problem of the robot. Even in a small environment, the uncertainty added by the walking motion makes it impossible to rely on open-loop motions for the manipulation tasks, thus imposing the need for vision-based closed-loop algorithms.
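Such a closed loop can be sketched abstractly: measure the image-space error between the detected object and the desired view, command a proportional correction, and repeat until the error is small. The following Python sketch is purely illustrative (it is not any team's code); the detector is simulated so the example is self-contained, but the same loop structure applies when `detect` wraps a real blob tracker:

```python
# Minimal proportional visual-servoing loop (illustrative sketch).
# detect() stands in for any blob/landmark detector reporting the pixel
# offset between the target and the end-effector; here it is simulated.

def make_simulated_detector(target, get_pose):
    """Return a detector that reports the target's offset in 'pixels'."""
    def detect():
        x, y = get_pose()
        return (target[0] - x, target[1] - y)  # pixel error (ex, ey)
    return detect

def visual_servo(detect, move, gain=0.5, tol=1.0, max_steps=100):
    """Close the loop on image error: measure, correct, repeat."""
    for step in range(max_steps):
        ex, ey = detect()
        if abs(ex) < tol and abs(ey) < tol:
            return step  # converged: target centered
        move(gain * ex, gain * ey)  # proportional correction
    raise RuntimeError("did not converge")

# Simulated robot pose; a real robot would add odometry noise, which the
# closed loop tolerates as long as each correction reduces the error.
pose = [0.0, 0.0]
def get_pose():
    return tuple(pose)
def move(dx, dy):
    pose[0] += dx
    pose[1] += dy

detect = make_simulated_detector((40.0, -25.0), get_pose)
steps = visual_servo(detect, move)
print(steps, pose)
```

The key property, in contrast to the open-loop motions that failed in the challenge, is that each iteration re-measures the error, so accumulated walking drift is corrected rather than amplified.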

The Participants

Initially, nine teams applied to participate in the competition, originating from four different continents (Africa, South America, Asia, and Europe). However, some of the teams were unable to participate in the competition in Madrid and had to withdraw. In the final challenge, the qualified teams (Figure 2) came from Colombia, Mexico, Spain (including three teams, two from Barcelona and one from Madrid), and Sweden. Two Spanish teams used the DARwIn-OP robot, provided by the RoBotica company. All the other teams used the NAO platform.

The Challenge

The challenge consisted of three independent tests: 1) switching off the kitchen stove, 2) making a shopping list, and 3) preparing a meal. Each test was graded on a 20-point scale. Grading was performed on the day of the finals according to the robot's behavior in the real room, not in a simulation. Each team had two attempts per test, lasting 3 min each. The best grade of the two attempts was retained as the score. In addition, for each of the tests, a bonus was awarded based on the success and speed of the task, and a penalty was issued if the team had to intervene during the robot's attempt.

During the first day of the competition, all the teams had access to the kitchen setup to test their algorithms before the final scenario. Although the kitchen model was well known and accessible to all the participants, all teams needed to adjust their algorithms in the real environment to handle the ambient lighting conditions. Notably, during these trial tests, some teams achieved better results than during the final competition. For instance, as shown in Figure 3, one of the robots was able to identify the tomato toy, grasp it, and put it into the cooking pan (all of the objectives in the third test). Nevertheless, during the competition, this robot was unable to perform the tasks.

The first test consisted of a safety task: one of the burners in the kitchen was lit, and the robot had to turn it off (Figure 4). The lit burner was selected at random, and two switches allowed the robot to control each burner independently. The relationship between the burners and switches was known in advance, as were the relative positions of the elements. Markers were allowed on the furniture to simplify the localization problem. There were two rounds in this test, and only two teams were able to switch off the burner during the competition. The other robots suffered from poor precision in their localization with respect

Figure 2. Participants with their robots at the HUMABOT Challenge, held 18–20 November 2014 in Madrid during the IEEE Humanoids Conference.


to the kitchen, which resulted in the hands of the robots not making contact with the switch.

The second test consisted of the identification of missing objects on the shelves for making a shopping list. A set of known objects was given in advance, but the lighting conditions (with no prespecified conditions) were hard. Nevertheless, all the teams succeeded in identifying at least one of the missing objects, yet false positives were also produced. As a result, the final scores for this test were low due to the penalties for such false positives.

The third test was the most demanding because it required real grasping of an object. As previously mentioned, this test required the robot to grasp a tomato toy, made of soft tissue, and put it into a pan. The approximate relative positions of the objects were known in advance, and landmarks were allowed on the table. Nevertheless, the uncertainty in the walking motion, as well as the limited precision of the vision-based localization, led to failures for all teams in this test during the competition. It is worth noting that several teams succeeded in completing the task during the training phase, just minutes before the real competition. However, during the allotted three rounds of the test, no robot was able to grasp the tomato, let alone put it in the pan.

Based on the results of the three tests, the winner of the challenge was the Swedish team FIA Robotics from Linköping University; the second-place winner was dotMEX from the Centro de Investigación y de Estudios Avanzados del Instituto Politécnico Nacional, Mexico; and the third-place prize went to the Spanish team UC3M ROBOTICA from Madrid. In Figure 5, all participating team members and the jury are shown.

Lessons Learned

The organization of a humanoids competition is always hard work, but when the competition is international and in its first edition, difficulties requiring extra effort arise. Some of the main positive aspects and limitations have been highlighted in the previous sections. Thanks to media coverage (e.g., press, radio, and TV channels), Humanoids had a large social impact. One of the greatest benefits of this kind of competition is, without doubt, the new skills that team members are able to acquire.

It should be noted that some teams suffered a negative impact on their performance after the competition started because unexpected problems arose that were impossible to fix within the allotted time. Aside from technical problems, an important limitation that we observed in several teams was insufficient expertise for undertaking some required tasks, like visually guided grasping. In fact, this skill was necessary for the most critical point of the challenge, making it impossible for any team to succeed in the third test. Bearing in mind that no team was able to obtain any mark in the third test, it is clear that a new strategy for evaluation will be necessary in the future. One suggestion would be to start with an initial step associated with a simpler skill. For

    Figure 5. The participants and jury (first row, from left): Pedro J. Sanz, Francisco Blanes, and Enric Cervera.


    Figure 4. The snapshots of the first test by the Polytechnic University of Catalonia team. (a) The robot approaches the kitchen, (b) detects the lit burner, (c) moves its end-effector toward the button, and (d) switches it off. (Photos courtesy of Gerard Canal and Edgar Riba from the Polytechnic University of Catalonia.)


    Figure 3. The snapshots from a trial done before the competition: (a) the robot approaches the table, (b) it moves both end-effectors toward the tomato, (c) it grasps the tomato, and (d) it drops it into the pan. (Photos courtesy of Gerard Canal and Edgar Riba from the Polytechnic University of Catalonia.)


example, touching the tomato with the hand would demonstrate that a visually guided algorithm is running in a suitable manner.

In the future, this competition could be regarded as a benchmark for middle-scale service humanoids. An online laboratory with the required elements is being set up, and it should soon be available to the research community. The online robot should be able to remotely execute the user's code, and feedback will be provided by video and audio streaming and the output of the program. An experimental version is available at http://www.robotprogramming.net in an initiative sponsored by the IEEE RAS for the creation of educational materials in robotics and automation.

The continuity of the HUMABOT Challenge will depend mostly on the new edition of the Humanoids Conference (Korea, 2015) and its organizers. For more information and details, please visit our Web site, http://www.irs.uji.es/humabot/.

Acknowledgments

The authors would like to thank Gerard Canal and Edgar Riba from the Polytechnic University of Catalonia for their images of the tests on their NAO robot, Francisco Blanes (UPV) for being part of the jury, and all the organizers of IEEE Humanoids 2014, especially General Chair Prof. Carlos Balaguer (UC3M) and Alberto Jardón (UC3M) for their help in setting up the competition.



Integrating ROS and MATLAB
By Peter Corke

    ROS TOPICS

The Robot Operating System (ROS) has gained wide currency for the creation of working robotic systems, initially in the laboratory but now also in industry. Despite ongoing evolution, the fundamental principles of publishing and subscribing on topics, application-specific messages, invoking services, and sharing parameters have remained constant. The primary programming environments for those working with ROS are C++ and Python, though using Java is also possible.

MATLAB is a powerful tool for prototyping and simulating control systems and robotics [1], [2], but, until very recently, it has not been easy to integrate with ROS. The need for such integration is evidenced by the many solutions that have been developed, including the JavaScript Object Notation-based rosbridge (http://wiki.ros.org/rosbridge_suite), the Java-based ROS-MATLAB bridge package (https://code.google.com/p/mplab-ros-pkg/wiki/java_matlab_bridge), and the ROSlab-IPC bridge (https://alliance.seas.upenn.edu/meam620/wiki/index.php?n=Roslab.IpcBridge), among others. However, none of these have caught on in a big way, perhaps due to installation and usability concerns.

With the recent release of MATLAB 2015a, there is a better option available through the newly introduced Robotics System Toolbox (RST). This toolbox has three main areas of functionality: ROS integration, support for pose represented as special Euclidean group SE(3) homogeneous transformations, and probabilistic-roadmap-based path planning. The remainder of this article will introduce the ROS functionality in a tutorial manner. Note that this functionality is an evolution of MathWorks' own ROS input/output package introduced in 2014, and the RST ROS application program interface has some changes with respect to this earlier package.

    Assuming the presence of a running ROS system with an ROS master, we initialize the MATLAB ROS subsystem with the IP address and port number of the ROS master, e.g.,

rosinit('192.168.1.10', 11311)

    If no arguments are provided, then MATLAB will create an ROS master and display its URI so that it can be used by other nodes.

    Next, we want to publish on a topic; so let us take a simple example from the ROS tutorial. We first create a message object of the standard string type

msg = rosmessage('std_msgs/String');

    and then set its value

msg.Data = 'hello world';

The message is an object, and its properties are hierarchical and match the fields of the message. We can read or write the properties directly without having to use setter or getter methods. All that remains now is to publish it.

rospublisher('/MyTopic', msg);

Alternatively, we could create a publisher object and optionally specify the message type

pub = rospublisher('/MyTopic', 'std_msgs/String');

    and then invoke its send method

    pub.send(msg);

Various options can be configured at construction time for the publisher object.

    Receiving a topic is just as easy. We first create a subscriber object for the particular topic and optionally specify the message type

sub = rossubscriber('/MyTopic', 'std_msgs/String');

    The constructor has various options to control buffer size and whether only the most recent message should be returned. We read the next message on the topic by

    msg = sub.receive(),

which blocks until a message is received, but we could also specify a timeout interval in seconds

    msg = sub.receive(5)

An alternative to polling for messages is to establish a callback

sub = rossubscriber('/MyTopic', 'std_msgs/String', @rxcallback)

Digital Object Identifier 10.1109/MRA.2015.2418513
Date of publication: 18 June 2015


    to the function

function rxcallback(src, msg)
disp([char(msg.Data()), sprintf('\n Message received: %s', datestr(now))]);

    which is invoked on every message receipt.

Next, let us look at a more complex message: the velocity twist with time stamp

msg = rosmessage('geometry_msgs/TwistStamped'),

    and we can view its definition

>> definition(msg)
% A Twist with reference coordinate frame and timestamp
std_msgs/Header Header
Twist Twist

    or access one of its fields

    msg.Twist.Linear.X = 0;

Custom messages are also possible but beyond the scope of this article. (See http://www.mathworks.com/matlabcentral/fileexchange/49810 for details.) The ROS parameters can be accessed via a ParameterTree object returned by

    ptree = rosparam,

and we can use it to set, get, create, or delete parameters in the ROS parameter server.

ptree.get('rosversion')
ptree.set('myparameter', 23)

and parameters can have integer, logical, char, double, or cell array types.

We can also access and create services in MATLAB code. Inspired by the TwoInts example given in the ROS tutorial, we can easily create a service to add two integers

sumserver = rossvcserver('/sum', rostype.roscpp_tutorials_TwoInts, @SumCallback),

    and the service function is

function resp = SumCallback(~, req, resp)
resp.Sum = req.A + req.B;

    and this can now be invoked from any ROS node

$ rosservice call /sum 2 1
sum: 3

or from inside MATLAB by first creating a service client

sumclient = rossvcclient('/sum'),

    creating a message with the numbers to be added

sumreq = rosmessage(sumclient);
sumreq.A = 2;
sumreq.B = 1;

    and then invoking the service

sumresp = call(sumclient, sumreq, 'Timeout', 3)
>> sumresp.Sum
ans = 3

    There is also the capability to read and write ROS bag files. First, we open the bag file and list the available topics:

bag = rosbag('quad-2014-06-13.bag')
bag.AvailableTopics

To extract all the images on the topic /preview, we use the select method to choose the particular topic, use readMessages to extract a cell array of 100 messages that happen to be of type sensor_msgs/Image, and then convert this to a cell array of images that can be displayed


preview = bag.select('Topic', '/preview');
subset = preview.readMessages(500:599);
images = cellfun(@readImage, subset, 'UniformOutput', false);

We could also extract inertial measurement unit (IMU) data and place the x-axis translational and z-axis angular velocity, for example, into a timeseries object, which automatically picks up the message time stamps and has various methods for analysis and display.

imu = bag.select('Topic', 'fcu/imu');
ts = imu.timeseries('LinearAcceleration.X', 'AngularVelocity.Z');
ts.plot

A number of interactive commands are available in the MATLAB environment, which mimic the ROS command-line utilities to find messages, nodes, topics, parameters, or services, e.g.,

>> rosmsg list
>> rosmsg show geometry_msgs/TwistStamped
>> rosnode list
>> rostopic list
>> rosparam list
>> rosservice list

This provides all the programmatic tools required to write code in MATLAB that can fully participate in an ROS-based robot control system. A powerful advantage of MATLAB ROS is its platform independence: this code will work on a Mac, Windows, or Linux system. The real-time update rate is, of course, going to depend on the size of the messages, the complexity and efficiency of your MATLAB code, and the performance of your computer, but tens of hertz is feasible.

An alternative to programmatic implementation is to use the Simulink block diagram modeling environment, as shown in Figure 1. The RST provides a palette of blocks that includes Publish and Subscribe. The Msg output of the Subscribe block is a bus type, and we can use a Simulink Bus Selector to pull out the particular message fields in which we are interested. We use a triggered subsystem to ensure that a message is published only after a message is received. With the appropriate Simulink settings, this controller can run in real time (see the aforementioned caveats), and the Scope blocks allow us to conveniently see what is happening. We can, of course, log signals to workspace variables for later analysis and graphical display.

Finally, and most significantly, we can export this diagram as code. Simulink generates a .tgz archive file that contains all the code necessary to build a standalone real-time ROS node on a Linux system.

In this short article, we can only skim the surface of the new MATLAB capability for ROS integration. More details can be found at mathworks.com/ros. Over time, the RST will gain additional functionality that will allow students, engineers, and researchers to create robotic systems more quickly and leverage the large and mature ROS code base.

References
[1] P. I. Corke, Robotics, Vision and Control: Fundamental Algorithms in MATLAB. Berlin, Heidelberg, Germany: Springer-Verlag, 2011.
[2] P. Corke. Robotics toolbox for MATLAB. [Online]. Available: http://www.petercorke.com/robot

[Figure 1 diagram: a ROS Subscribe block receives /odom messages; a Bus Selector extracts position X, position Y, and the quaternion orientation, which a quat2eul conversion turns into a yaw angle; a proportional controller compares these with the desired (x, y) position (here [-6 8]) and publishes linear (v) and angular (w) command velocities via a Command Velocity publisher. Colored blocks indicate tunable parameters. Copyright 2014 The MathWorks, Inc.]

    Figure 1. A robot controller implemented in Simulink with ROS. (Image used with permission from MathWorks.)


    INDUSTRIAL ACTIVITIES


The First-Ever IEEE RAS Standard Published!
By Raj Madhavan


IEEE Standard Ontologies for Robotics and Automation, IEEE Standard P1872, was approved by the IEEE Standards Association (IEEE-SA) Standards Board on 16 February 2015. After starting as a study group in early 2011, a working group (WG) was formed in November 2011. In a relatively short span of three years, the draft standard was submitted in late 2014 to the IEEE-SA before it was approved as a formal standard in February 2015. The standard defines a core ontology that specifies the most general concepts, relations, and axioms of robotics and automation. It is intended as a reference for knowledge representation and reasoning in robots as well as a formal reference vocabulary for communicating knowledge between robots and humans. The WG chairs for P1872 are Craig Schlenoff ([email protected]) and Edson Prestes ([email protected]).

A second WG is about to submit a draft of Robot Map Data Representation for Navigation, IEEE Standard P1873. This standard specifies a map data representation of environments of a mobile robot performing navigation tasks and provides data models and data formats for two-dimensional metric and topological maps. The WG chairs for P1873 are Wonpil Yu ([email protected]) and Francesco Amigoni ([email protected]).

The IEEE Robotics and Automation Society (RAS) Standing Committee for Standards Activities is the sponsor of P1872 and P1873. An article with additional details on the standards will be published in a forthcoming issue of IEEE Robotics and Automation Magazine. For additional information on the Society's standards efforts, please contact the RAS standards chair, Raj Madhavan ([email protected]).

Digital Object Identifier 10.1109/MRA.2015.2418521
Date of publication: 18 June 2015


  • FROM THE GUEST EDITORS


Emerging Advances in Automation
By Ken Goldberg, Youfu Li, Yu Sun, Maria Pia Fanti, and Hesuan Hu

We are grateful for the opportunity to organize this special issue to highlight exciting new directions in automation. The IEEE Robotics and Automation Society (RAS) publishes two transactions, IEEE Transactions on Robotics and IEEE Transactions on Automation Science and Engineering (T-ASE). The latter was established in 2004 to encourage research on the ways to sustain performance for robotic and automation systems, especially on systems that operate autonomously, often in structured environments and over extended periods, and on the explicit structuring of the corresponding environments. Sustained performance requires new theory, analysis, models, and experimental techniques for automation and the ability to provide performance guarantees. Furthermore, the corresponding results must address concepts relating to robustness, stability, productivity, efficiency, completeness, optimality, convergence, time complexity, sensitivity, verification, and reliability.

T-ASE is widely read in China, where more than 100 universities have established automation departments. T-ASE and the associated annual Conference on Automation Science and Engineering (CASE) were both established in 2004 and recently celebrated their tenth anniversary. Automation has expanded beyond its roots in manufacturing to many other application domains, including health care, security, transportation, agriculture, construction, and energy. Recently, there has been a surge of international interest in how the cloud can enhance automation, e.g., the Internet of Things, industrial Internet, industry 4.0, and multiplicity (see the corresponding editorial in the April 2015 issue of T-ASE). There are also many exciting new applications in biology labs, warehouses (e.g., Amazon's purchase of Kiva Systems), and homes (e.g., Google's purchase of Nest). Upcoming special issues of T-ASE will focus on cloud robotics and automation, networked cooperative autonomous systems, the 2014 Workshop on the Algorithmic Foundations of Robotics, the 2014 CASE, home automation, human-centered automation, emerging advances in logistics, and the Internet of Things.

The following articles and tutorial were carefully reviewed and revised to provide a sample of the emerging ideas, developments, and applications outlined in the previous paragraphs.

In "Flying Smartphones," Giuseppe Loianno, Gareth Cross, Chao Qu, Yash Mulgaonkar, Joel A. Hesch, and Vijay Kumar describe how an emerging consumer electronics technology can be used to control personal drones for home automation. When the sensing and computation for a robot are performed on a smartphone, as shown in the article, the robot unit cost is decreased, and the capability of integration and reprogramming for home automation is enhanced. This article examines, in particular, how autonomous quadrotors can reliably navigate an unstructured, indoor environment to create 3-D maps.

In "Automated Vitrification of Embryos," Jun Liu, Chaoyang Shi, Jun Wen, Derek Pyne, Haijiao Liu, Changhai Ru, Jun Luo, Shaorong Xie, and Yu Sun describe an emerging application of laboratory automation: vitrification for the cryopreservation of oocytes and embryos in clinics for in vitro fertilization (IVF). Currently, vitrification is conducted manually in IVF clinics by highly skilled embryologists. Manual vitrification is a laborious and demanding task, and due to poor reproducibility and inconsistency across operators, success rates and cell survival rates vary significantly. An automated embryo vitrification system is embedded with two contact detection methods to determine the relative Z positions of the vitrification micropipette, embryo, and vitrification straw. A three-dimensional (3-D) tracking algorithm is developed for visually servoed embryo transfer and real-time monitoring of embryo volume changes during vitrification. The excess medium is automatically removed from around the vitrified embryo on the

Digital Object Identifier 10.1109/MRA.2015.2418516
Date of publication: 18 June 2015


    (continued on page 98)


Digital Object Identifier 10.1109/MRA.2015.2418523


Consumer-grade technology seen in cameras and phones has led to the price–performance ratio falling dramatically over the last decade. We are seeing a similar trend in robots that leverage this technology. A recent development is the interest of companies such as Google, Apple, and Qualcomm in high-end communication devices equipped with such sensors as cameras and inertial measurement units (IMUs) and with significant computational capability. Google, for instance, is developing a customized phone equipped with conventional as well as depth cameras. This article explores the potential for the rapid integration of inexpensive consumer-grade electronics with off-the-shelf robotics technology for automation in homes and offices. We describe how standard hardware platforms (robots, processors, and smartphones) can be integrated through a simple software architecture to build autonomous quadrotors that can navigate and map unknown, indoor environments. We show how the quadrotor can be stabilized and controlled to achieve autonomous flight and the generation of three-dimensional (3-D) maps for exploring and mapping indoor buildings, with application to smart homes, search and rescue, and architecture. This opens up the possibility for any consumer to take a commercially available robot platform and a smartphone and automate the process of creating a 3-D map of his/her home or office.

    By Giuseppe Loianno, Gareth Cross, Chao Qu, Yash Mulgaonkar, Joel A. Hesch, and Vijay Kumar

    Automated Flight Enabled by Consumer Electronics

1070-9932/15©2015 IEEE

Digital Object Identifier 10.1109/MRA.2014.2382792
Date of publication: 11 May 2015




Automation of Aerial Robotics
The price–performance ratio of processors, sensors, and networking infrastructure, which has dropped significantly over the last decade, has led to new applications founded on the convergence of computation, sensing, and communication. A recent General Electric report [1] calls this convergence the industrial Internet and suggests that the potential macroeconomic benefit from the industrial Internet could be comparable with the economic productivity gain attributable to the Internet revolution of the late 20th century. Others call it the Internet of Things and predict an economic impact in the tens of trillions of dollars [2]. More than 75% of business leaders surveyed predicted a direct impact of this technology on their business [3]. This convergence also holds great promise for automation with robots, which emphasizes efficiency, productivity, quality, and reliability, focusing on systems that operate autonomously, often in structured environments over extended periods [4].

In this article, we address this confluence of technologies enabling automation in the context of aerial robotics, a field that has also seen dramatic advances over the last decade. The same drop in the price–performance ratio of processors and sensors has fueled the development of micro unmanned aerial vehicles (UAVs) that are between 0.1 and 1 m in length and 0.1–2 kg in mass. These low-cost platforms are easy to manufacture, in contrast to the expensive UAVs used for military applications (the number of Predators and Global Hawks is estimated to be around 1,000). Growth in the consumer electronics industry (millions or billions of components at low cost) has resulted in inexpensive hardware for sensing and computation. These advances, coupled with open-source tools for building robots like quadrotors [5], have led to innovative low-cost toys and hobby kits (e.g., diydrones.org). The real opportunity for robotics and automation is in leveraging these tools (and the convergence of computation, sensing, and communication) to develop economical (compared with their military counterparts), functional, and robust aerial robots. These can be used in such tasks as inspection [6], interaction with the environment [7], [8], search and rescue [9], construction [10], [11], and mapping of homes and offices.

Early quadrotor vehicles were primarily experimental systems, but improved design and software tools have led to significant increases in reliability and reductions in cost. Today, quadrotors have reached the maturity of consumer-grade devices. To reiterate, this is in large part due to the decreasing price–performance ratio of sensors for autonomous navigation, including global positioning systems (GPSs), cameras, IMUs, and laser scanners [7], [12]–[16]. In this context, low-cost range sensors offer an attractive alternative to high-end laser scanners and 3-D cameras for applications such as indoor navigation and mapping, surveillance, and autonomous robotics.

Consumer-grade range sensing technology has led to many devices becoming available on the market, like the Microsoft Kinect sensor and the ASUS Xtion sensor (PrimeSense 2010; see Figure 1). The richness of the provided data and the low cost of the sensor have attracted many researchers from the fields of mapping, 3-D modeling, and reconstruction. The ASUS Xtion sensor boasts a lower weight than the first generation of red, green, blue, and depth (RGB-D) cameras (around 70 g without the external casing). (While this specific sensor is no longer available, there are others under development that are likely to be available in the future.) It does not require external power beyond a standard universal serial bus (USB) port, and it is quite compact. Consequently, sensors in this form factor have received significant attention from the scientific community, particularly for environment mapping and monitoring applications with UAVs [16], [17].

This article addresses the use of an off-the-shelf quadrotor platform with a commercial off-the-shelf (COTS) RGB-D camera (see Figure 2). The current algorithms based on RGB-D cameras need a large platform and specific customized hardware processors that are not widely available. These algorithms generally only work without limitation on laptop or desktop computers [18], [19]. Moreover, there is still a gap between the use of complex algorithms in the research field and their use by naive humans for everyday applications. Google's Project Tango has bridged these gaps by creating a prototype [20] smartphone


    Figure 1. The flying platform (a) back view and (b) front view.

Figure 2. The ASUS Xtion Pro Live sensor, with its RGB camera, IR camera, and IR projector labeled.


  • 26 t IEEE ROBOTICS & AUTOMATION MAGAZINE tJUNE 2015

[Google (http://www.google.com) 2014; see Figure 3]. The smartphone incorporates an enhanced RGB-D camera with a fisheye lens that has a field of view of 170° and a depth sensor able to capture a dense set of point clouds. It also incorporates customized hardware and software designed to track full 3-D motion while concurrently creating a map of the environment using visual odometry and structure-from-motion algorithms. These sensors allow the phone to make more than a quarter of a million 3-D measurements every second, updating its position and orientation in real time and combining the data into a single 3-D model of the surrounding space. In comparison, previous works on implementing vision-based algorithms on camera phones are based on marker tracking [21] and localization algorithms [22]. These algorithms are designed for augmented-reality applications; they employ cameras with a limited frame rate and with a small field of view. For these reasons, they are not suitable to deal with the long-term operations and large navigation coverage areas needed in robotic tasks.

In this article, a complete architecture representing a first step toward autonomous aerial robot flight with a camera phone is presented. It represents the first plug-and-play integration of a consumer product with an off-the-shelf aerial robot to enable autonomy with possible onboard localization, mapping, and control. Thus, it is representative of a new class of affordable smart devices that can potentially lower the barrier to automation in homes by providing services for localization, state estimation, control, and mapping. In the future, end users may be able to utilize their smartphone device to autonomously control an aerial platform and to add new functionalities. The first contribution of this work is the development of a quadrotor platform equipped with the Google Project Tango [20] smartphone sensor and a small processor unit. The second contribution is the vehicle's control based on smartphone localization estimation. A nonlinear controller guarantees the exponential stability of the platform, which is able to follow trajectories in 3-D space. Finally, the fusion of the phone's pose with inertial sensor measurements allows for an increased rate of state estimation and, thus, enables fast motions.

System Architecture
Our platform of choice was a quadrotor due to its mechanical simplicity [5] and ease of control. Moreover, its ability to operate in confined spaces, hover at any given point in space, and perch or land on a flat surface makes it a very attractive aerial platform with tremendous potential. A description of the proposed hardware and software architecture is presented here. A schema of the proposed approach is shown in Figure 4.

Hardware Architecture
The experimental platform shown in Figure 5 is made from COTS components and is equipped with an AutoPilot board consisting of an IMU and a user-programmable ARM7 microcontroller. The main computation unit on board is an ODROID-XU (http://www.hardkernel.com) with a 1.7-GHz Samsung Exynos 5 Octa processor with 2 GB of random access memory (RAM), a 16-GB embedded multimedia controller (eMMC) module, and an 802.11n Wi-Fi transceiver. The only other addition to this setup is a forward-pointing Project Tango [20] smartphone from Google and a USB 3.0 cable for communication between the ODROID-XU and the Tango phone. The total mass of the platform is 900 g. The software framework is presented in the Software Architecture section.

It should be noted that our experimental setup is independent of the specifics of the employed embedded board. The processor usage is estimated to be 35% of the total available central processing unit (CPU), which suggests that a smaller and less powerful embedded processor would suffice. However, to guarantee a reliable setup and to reuse the same configuration for other robotic tasks, we chose to use the ODROID-XU

    Figure 3. The Google Project Tango device.

Figure 4. The system architecture for specification, planning, control, and estimation. The base station handles visualization and navigation settings; the robot's ODROID-XU runs trajectory planning, the position controller, the SO(3) controller, and UKF estimation, closing the loop through the plant and the Tango phone. UKF: unscented Kalman filter.



board. The price is similar to other embedded platforms in the same class. Recent developments by ODROID suggest the possibility of using the less powerful ODROID-W board, whose price is four times lower than that of the ODROID-XU, but the ODROID-W has only recently been released on the market.

Software Architecture
The ODROID-XU performs the following tasks:
● sensor fusion between Tango's pose and the vehicle's IMU
● nonlinear position-based control.
While both tasks could conceivably have been performed on the Tango device, we used an independent processor to facilitate prototyping and to ensure a more reliable approach to state estimation and control at a fixed rate of 100 Hz.

A Java application routine is enabled on the phone for pose streaming using the user datagram protocol (UDP). The ODROID-XU runs a robot operating system (ROS)-based architecture (http://www.ros.org). The UDP packets are received by a ROS node and are subsequently converted into ordinary ROS messages. It should be pointed out that the presented strategy allows the vehicle to run all the algorithms on board. The base station is responsible only for visualization and handling user interaction.
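A phone-to-companion link of this kind can be prototyped with a few lines of socket code. The sketch below is only illustrative: the seven-float packet layout (position x, y, z plus an x, y, z, w quaternion), the port number, and the handler hook are hypothetical assumptions, not the actual Tango or ROS message formats.

```python
import socket
import struct

# Hypothetical wire format: seven little-endian 32-bit floats
# (position x, y, z followed by quaternion qx, qy, qz, qw).
POSE_FMT = "<7f"
POSE_SIZE = struct.calcsize(POSE_FMT)

def parse_pose(packet):
    """Unpack one UDP pose packet into (position, quaternion) tuples."""
    if len(packet) != POSE_SIZE:
        raise ValueError("unexpected packet size: %d" % len(packet))
    vals = struct.unpack(POSE_FMT, packet)
    return vals[:3], vals[3:]

def pose_listener(port=5555):
    """Receive poses streamed by the phone and hand each one onward."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    while True:
        packet, _addr = sock.recvfrom(POSE_SIZE)
        position, quaternion = parse_pose(packet)
        # ...republish as a ROS message / feed the state estimator here...
```

On the real system the receiving side would be a ROS node republishing each pose as a standard message, but the datagram-parsing pattern is the same.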

Modeling
A quadrotor is a system made of four identical rotors and propellers located at the vertices of a square. The first and the third propellers rotate clockwise, and the second and the fourth propellers rotate counterclockwise (see Figure 6). The symbols used in this article are listed in Table 1.

Dynamic Model
Let us consider an inertial reference frame denoted by {e⃗₁, e⃗₂, e⃗₃} and a body reference frame, centered at the center of mass (COM) of the vehicle, denoted by {b⃗₁, b⃗₂, b⃗₃}. The dynamic model of the vehicle can be expressed as

\dot{x} = v,
m\dot{v} = -mg\,e_3 + \tau R e_3,
\dot{R} = R\hat{\Omega},
J\dot{\Omega} + \Omega \times J\Omega = M,   (1)

where x ∈ ℝ³ is the Cartesian position of the vehicle expressed in the inertial frame, v ∈ ℝ³ is the velocity of the vehicle in the inertial frame, m ∈ ℝ is the mass, Ω ∈ ℝ³ is the angular velocity in the body-fixed frame, and J ∈ ℝ³ˣ³ is the inertia matrix with respect to the body frame. The hat symbol ^ denotes the skew-symmetry operator according to x̂y = x × y for all x, y ∈ ℝ³, g is the standard gravitational acceleration, and e₃ = [0 0 1]ᵀ.
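To make the rigid-body model in (1) concrete, it can be integrated numerically. The following minimal sketch uses a simple explicit Euler step; the 0.9-kg mass matches the 900-g platform described earlier, while the inertia values and time step are hypothetical placeholders, not the flight parameters.

```python
import numpy as np

def hat(w):
    """Skew-symmetric matrix such that hat(w) @ y equals np.cross(w, y)."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def euler_step(x, v, R, Om, tau, M,
               m=0.9, J=np.diag([5e-3, 5e-3, 9e-3]), g=9.81, dt=1e-3):
    """One explicit Euler step of the quadrotor dynamics (1)."""
    e3 = np.array([0.0, 0.0, 1.0])
    x_n = x + dt * v                                  # x' = v
    v_n = v + dt * (-g * e3 + (tau / m) * (R @ e3))   # m v' = -m g e3 + tau R e3
    R_n = R @ (np.eye(3) + dt * hat(Om))              # first-order update of R' = R hat(Om)
    Om_n = Om + dt * np.linalg.solve(J, M - np.cross(Om, J @ Om))  # J Om' + Om x J Om = M
    return x_n, v_n, R_n, Om_n

# Sanity check: at hover, thrust equal to the weight m*g leaves the state unchanged.
state = (np.zeros(3), np.zeros(3), np.eye(3), np.zeros(3))
hover = euler_step(*state, tau=0.9 * 9.81, M=np.zeros(3))
```

A real simulator would replace the Euler step with a higher-order integrator and reproject R onto SO(3), but the term-by-term structure of (1) is the same.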

    Figure 5. The computer-aided design (CAD) model for the robot platform and the Google Tango device.


    Figure 6. The quadrotor model.

Table 1. A glossary of important symbols.

f_j ∈ ℝ: force produced by the jth propeller
τ ∈ ℝ: sum of forces produced by all four propellers
M ∈ ℝ³: moments generated by propellers around body axes
x ∈ ℝ³: position of robot's COM
R ∈ SO(3): rotation matrix of the vehicle with respect to the inertial frame
R_c ∈ SO(3): commanded rotation matrix
m ∈ ℝ: mass of the vehicle
J ∈ ℝ³ˣ³: rotational inertia of robot about its COM
Ω ∈ ℝ³: angular velocity of robot in the body frame
Ω_c ∈ ℝ³: commanded angular velocity of robot in the body frame
a ∈ ℝ³: linear acceleration of robot in the body frame
g ∈ ℝ: gravitational acceleration
d ∈ ℝ: distance of each rotor from the COM
x_d ∈ ℝ³: desired position
ẋ_d ∈ ℝ³: desired velocity
ẍ_d ∈ ℝ³: desired acceleration
e_x, e_v ∈ ℝ³: translational errors
e_R, e_Ω ∈ ℝ³: attitude errors
x ∈ ℝ¹³: state estimation vector
a_b ∈ ℝ³: accelerometer biases
u ∈ ℝ⁶: estimator input



The total moment M ∈ ℝ³ along all axes of the body-fixed frame and the thrust τ ∈ ℝ are control inputs of the plant. The dynamics of rotors and propellers are neglected, and it is assumed that the force of each propeller is directly controlled. The total thrust, τ = f₁ + f₂ + f₃ + f₄, acts in the direction of the z-axis of the body-fixed frame, which is orthogonal to the plane defined by the centers of the four propellers. The relationship between the single-motor force f_j, the total thrust τ, and the total moment M can be written as

\begin{bmatrix} \tau \\ M_1 \\ M_2 \\ M_3 \end{bmatrix} =
\begin{bmatrix} 1 & 1 & 1 & 1 \\ 0 & d & 0 & -d \\ -d & 0 & d & 0 \\ c & -c & c & -c \end{bmatrix}
\begin{bmatrix} f_1 \\ f_2 \\ f_3 \\ f_4 \end{bmatrix},   (2)

where c is a constant and d is the distance from the COM to the center of each rotor in the b⃗₁–b⃗₂ plane. For nonzero values of d, (2) can be inverted. Our assumption that τ and M are the inputs of the plant is, therefore, valid.
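Because (2) is invertible for nonzero d, a flight controller can map a commanded thrust and moment back to individual rotor forces. A minimal sketch, assuming a standard quadrotor mixing matrix; the numeric values of the arm length d and the thrust-to-moment coefficient c are hypothetical, and sign conventions vary across platforms:

```python
import numpy as np

d = 0.175   # hypothetical rotor arm length [m]
c = 8.0e-3  # hypothetical thrust-to-moment coefficient

# Mixing matrix of (2): maps per-rotor forces [f1..f4] to [tau, M1, M2, M3].
MIX = np.array([[1.0, 1.0, 1.0, 1.0],
                [0.0,   d, 0.0,  -d],
                [ -d, 0.0,   d, 0.0],
                [  c,  -c,   c,  -c]])

def motor_forces(tau, M):
    """Invert (2) to recover the four rotor forces (valid since d != 0)."""
    return np.linalg.solve(MIX, np.concatenate(([tau], M)))

# Hover for a 900-g platform: each rotor carries a quarter of the weight.
f_hover = motor_forces(0.9 * 9.81, np.zeros(3))
```

On hardware, the resulting forces would additionally be clipped to each motor's achievable thrust range before being converted to speed commands.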

Vehicle Control
In most previous works, a back-stepping approach to control is used because the attitude dynamics can be assumed to be faster than the dynamics governing the position, and linearized controllers are used for both loops [5], [13]. In this article, because we need to model large excursions from the hover position for robustness, we use a nonlinear controller based on the work of [23] and [24].

The control inputs τ and M are chosen as

M = -k_R e_R - k_\Omega e_\Omega + \Omega \times J\Omega - J\left(\hat{\Omega} R^\top R_c \Omega_c - R^\top R_c \dot{\Omega}_c\right)
    - -