
Towards Micro Robot Hydrobatics: Vision-based Guidance, Navigation, and Control

for Agile Underwater Vehicles in Confined Environments

Daniel A. Duecker, Nathalie Bauschmann, Tim Hansen, Edwin Kreuzer, and Robert Seifried

Abstract— Despite recent progress, guidance, navigation, and control (GNC) are largely unsolved for agile micro autonomous underwater vehicles (µAUVs). Hereby, robust and accurate self-localization systems which fit µAUVs play a key role, and their absence constitutes a severe bottleneck in micro underwater robotics research. In this work we present, first, a small-size, low-cost, high-performance vision-based self-localization module which solves this bottleneck even for the requirements of highly agile robot platforms. Second, we present its integration into a powerful GNC-framework which allows the deployment of µAUVs in fully autonomous missions. Finally, we critically evaluate the performance of the localization system and the GNC-framework in two experimental scenarios.

I. INTRODUCTION

Hydrobatics refers to agile maneuvering of underactuated underwater vehicles and has become the marine counterpart of acrobatics in aerial robotics [1], [2], see Fig. 1. Agile underwater robots enable – beside the control design challenge – a wide range of applications such as monitoring of confined areas of interest, e.g. aquaculture farms and harbors. These challenging tasks require robust and accurate control and localization strategies. Recent technological advances in robot miniaturization pushed this concept one step further, resulting in the development of micro autonomous underwater vehicles (µAUVs), which are characterized by length scales of less than 50 cm. These small-scale systems allow deployment in monitoring missions within strictly confined volumes such as industry tanks and cluttered disaster sites, e.g. the Fukushima-Daiichi reactor [3], which cannot be done using full-scale robots. Moreover, since µAUV designs usually aim for low-cost vehicles, the deployment of underwater robot fleets becomes feasible and is expected to show the full capabilities of autonomously operating vehicles.

However, various bottlenecks hinder the development and deployment of µAUVs, namely embedded robust self-localization and powerful guidance, navigation, and control (GNC) architectures. While full- and medium-scale underwater robots can be equipped with sophisticated sensing equipment such as ring-laser gyroscopes, their deployment on µAUVs is infeasible due to their size and cost. Thus, a trade-off arises between miniaturization and autonomous capabilities.

This research was supported by the German Research Foundation (DFG) under grant Kr752/36-1.

All authors are with Hamburg University of Technology, Germany. Please direct correspondence to [email protected]

Fig. 1: Fully autonomous hydrobatic maneuvering through hoop gates with the HippoCampus µAUV.

Much work has been done on designing powerful control strategies for aerial robots [4]. Hence, transferring these concepts to the underwater domain is appealing. However, early-stage testing of these control algorithms, even in controlled environments such as research tanks, remains challenging. It is often hindered by the lack of robust and accurate on-board localization schemes which fit agile µAUVs.

A. Prior Work

Current research on µAUVs can be split into two approaches. Vehicles which are temporarily tethered can outsource major shares of the localization and control processing to an external computing unit, which enables them to run complex algorithms. Recent work on these vehicles has shown great steps forward. Robust model predictive control in the presence of currents was presented in [5], while [6] demonstrated trajectory tracking under model uncertainties. However, due to their tether these vehicles can be seen as semi-autonomous, as the tether limits their freedom in path and motion planning, especially in obstacle-rich domains.

In contrast, tether-free vehicles are fully autonomous but have to rely on their on-board hardware. The literature [2], [7]–[9] shows a clear trade-off: a small vehicle size offers a wide range of mission scenarios while it limits the capabilities of on-board sensors and computational power. An example of these restrictions is the non-availability of high-fidelity sensors on µAUVs, such as the ring-laser gyroscopes which are essential for the dead-reckoning-based navigation widely used on medium- and full-size AUVs [10], [11].


In contrast, state estimation systems based on small-size MEMS sensors are widely used on µAUV platforms. Examples include the AVEXIS [7], the AUVx [8], and the HippoCampus [2], [9] underwater robots. However, due to the strong drift of dead reckoning when using MEMS-IMU data, µAUV navigation requires additional robust and continuous information on the vehicle's absolute position.

Accurate and robust on-board self-localization of µAUVs is required for fully autonomous operation. Moreover, modular concepts for on-board localization systems allow a quick replacement depending on the scenario in which the robot is deployed.

External motion capture systems have been used to provide accurate localization and even control commands to the underwater robot [12]–[15]. However, they require an underwater communication link to the robot, which suffers from high latencies and limited bandwidth; thus, their usage is restricted to a limited set of scenarios. Moreover, external systems do not scale with increasing vehicle fleet size. Both aspects result in a bottleneck for research targeting agile control of underwater vehicles. In this sense, robust and accurate underwater localization in confined environments can still be seen as widely unsolved.

Localization systems and their challenges can be grouped by their physical principle into vision, acoustics, and electro-magnetic (EM) waves [16].

Vision-based approaches use cameras to perceive the robot's environment. They aim to estimate the robot pose relative to detected features [17]. With regard to underwater applications, their usage is mostly limited to short-range detection and to clear-water scenarios, as they require good visibility conditions. Artificial landmarks are a widely used concept in localization, as they can be used to enrich featureless environments such as halls and tanks. However, they require the effort of pre-mission deployment. This makes their usage attractive for controlled environments such as research testbeds. Classic approaches are based on illuminated markers and require the detection of a marker pattern to compute the robot's pose. Recently, powerful libraries such as the AprilTag marker system [18], [19] became available and are by now a standard method in robotics for pose estimation. The markers are uniquely distinguishable and come with the advantage of simultaneously providing information on position and orientation relative to the camera [10]. Studies on the robustness of fiducial marker detection in various water conditions were extensively conducted in [20] and demonstrated the general suitability for underwater applications, e.g. for insertion tasks [21]. Moreover, simultaneous localization and mapping (SLAM) techniques based on artificial markers were recently applied in underwater domains and have shown promising results [11], [22]. However, known concepts mainly cover comparatively slow motions, as these are dominant in docking and manipulation tasks. Their extension to hydrobatic maneuvering, which is a common requirement in µAUV missions, remains an open field of research.

Fig. 2: µAUV navigating through a confined volume using AprilTag-based self-localization for feedback control.

Self-localization systems based on acoustics are widely used for AUVs in open-sea scenarios. They typically use the signal's time-of-flight or time-difference-of-arrival to compute the vehicle's position relative to the sound-emitting beacons. Recent work on small-size acoustic modems has made this technology available for µAUVs [23], [24]. While acceptable performance has been demonstrated in harbors and large tanks, missions which require precise positioning, such as docking or trajectory tracking in e.g. small research tanks, remain beyond today's capabilities. This is due to the fact that acoustic methods suffer massively from reflections caused by e.g. tank walls, reverberations, and multi-path effects [25].

The use of attenuation of electro-magnetic (EM) carrier waves for accurate short-range underwater localization was recently proposed in [26], [27]. This concept seems promising for confined volumes. It was miniaturized in [28] to fit the demands of µAUVs with respect to size and cost. Experiments using EM-localization for µAUV control in a small research tank were conducted and analyzed in [29]. The tests demonstrated the general feasibility but also revealed the method's sensitivity to EM-wave reflections from the tank walls, which limits deployment scenarios.

In summary, each concept comes with its specific advantages and disadvantages. However, when studying control strategies for fully autonomous hydrobatic maneuvering, existing approaches reach their limits. This is in particular the case for small-scale robotic systems. Thus, a bottleneck exists even for the supposedly simple scenario of testing a robot's autonomous capabilities in controlled environments such as research tanks.

B. Contribution

The contribution of this work is three-fold. First, we present an embedded, robust, and accurate visual localization system for µAUVs which enables agile trajectory tracking and path following in strictly confined volumes such as industry and research tanks. Moreover, our localization system can be used as a temporary benchmark system for existing localization systems in the absence of external ground truth. Second, we propose a fully integrated GNC-framework and its combination with the developed self-localization concept.


Both the localization system and the GNC-module consist of low-cost off-the-shelf components (<USD 80, plus a flight control unit at USD 100) and are available open-source¹. Their small size and open ROS-interface allow easy integration on basically any micro underwater robot platform. Third, we provide experimental performance statistics on the single localization module. Additionally, we experimentally demonstrate the suitability of the integrated GNC-framework for agile feedback control in a hydrobatic path-following scenario.

To the best of the authors' knowledge, there is no system described in the literature which provides µAUVs with the demonstrated degree of fully autonomous hydrobatic maneuvering capability.

The remainder of this work is structured as follows. Section II introduces the problem of self-localization for feedback control in agile maneuvering. The embedded localization module is described in Section III. Its integration into the GNC-framework is presented in Section IV. Experimental results are presented and critically discussed in Section V. Section VI summarizes the findings of this work and gives an outlook on future work.

II. PROBLEM STATEMENT

Consider the performance examination of the guidance, navigation, and control architecture on board a highly agile µAUV inside confined volumes such as small research tanks, as depicted in Figs. 1 and 2. The efficient development of sophisticated planning and control algorithms for µAUVs which allow the execution of complex fully autonomous missions is still hindered by today's absence of robust, accurate, and easy-to-use embedded localization modules which provide high-rate updates on the vehicle position and its yaw-orientation.

This capability gap is specific to research on µAUVs, since full-scale AUVs and remotely operated vehicles (ROVs) can generally compensate for this lack by using high-fidelity inertial measurement units [10], [11] or by having access to powerful off-board computational capacities [5], [22]. Both approaches cannot be adapted to the domain of µAUVs as they are contrary to the goals of low cost, small size, and full autonomy.

We solve this problem by proposing a powerful visual fiducial-marker-based localization module which fits the requirements of µAUVs with respect to size and energy consumption. Moreover, we embed the localization system into a fully integrated GNC-framework as required for complex missions. Our proposed module features an open ROS-interface which makes it straightforward to adapt to other µAUV platforms. In alignment with the idea of pushing marine robotics research forward through open-source concepts, we make our module with all algorithms available online.

¹ https://github.com/DanielDuecker/microAUV-localization

Fig. 3: Localization module embedded into the architecture of the GNC-framework. Algorithms on the SBC run at ∼10 Hz; algorithms on the FCU run at ∼50 Hz.

III. LOCALIZATION CONCEPT

In this work, we follow the idea of a platform-independent localization system which can easily be adapted to the individual use-case. Thus, we aim to improve on general metrics such as high robustness and accuracy rather than overfitting to scenario-specific requirements.

The developed localization module consists of three on-board components, namely a camera, an inertial measurement unit (IMU), and a single-board computer (SBC), as depicted in Fig. 3. However, in order to deploy the system, the target volume, e.g. the research tank, has to be equipped with fiducial markers at known positions. While various marker systems exist, we select – without loss of generality – the AprilTag system [19].

An on-board wide-angle camera collects images of the vehicle's surrounding environment. A detection algorithm [30] extracts AprilTags from the image and computes their position and orientation relative to the camera. This information is then fed into a sensor fusion algorithm which fuses the measured tags' positions and orientations and provides uncertainty measures on both. In order to provide high-frequency updates on the robot state, e.g. for controllers, we require inter-observation prediction steps. Note that our estimation algorithm focuses on the most challenging robot states in underwater robotics, namely the robot's position and yaw-orientation.
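As a concrete illustration, the following minimal Python sketch shows such a detection front-end (assuming the pupil_apriltags bindings of the AprilTag detector and a calibrated camera; TAG_POSITIONS and the measurement layout are our placeholder choices, not the exact on-board implementation):

import numpy as np
from pupil_apriltags import Detector  # assumed Python binding of the AprilTag detector

# Known world positions of the mounted tags (id -> [x, y, z]); placeholder values.
TAG_POSITIONS = {0: np.array([0.0, 0.0, 0.0]),
                 1: np.array([0.4, 0.0, 0.0])}

detector = Detector(families="tag36h11")

def measure(gray_image, camera_params, tag_size=0.096):
    """Detect tags and return per-tag (id, distance, yaw, q_C) tuples."""
    measurements = []
    for det in detector.detect(gray_image, estimate_tag_pose=True,
                               camera_params=camera_params, tag_size=tag_size):
        if det.tag_id not in TAG_POSITIONS:
            continue
        q_C = det.pose_t.flatten()          # tag position in camera coordinates
        d = np.linalg.norm(q_C)             # camera-to-tag distance d_Ti
        R = det.pose_R                      # tag orientation relative to camera
        yaw = np.arctan2(R[1, 0], R[0, 0])  # yaw component used by the filter
        measurements.append((det.tag_id, d, yaw, q_C))
    return measurements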

Robust measurement of the yaw angle, e.g. using on-board magnetometers, is often infeasible. In practice, strong magnetic disturbances lead to severe local deviations of the magnetic field, which is a common challenge in small (steel) tanks, and render the magnetometer-based yaw-signal intractable for most navigation tasks.

In this work, we propose an extended Kalman filter (EKF) scheme which we extend by a dynamic measurement noise model. Note that the basic version of this filter is commonly used in the robotics community; thus, adaptations to individual scenarios remain straightforward. However, the choice of other sensor fusion algorithms, such as particle filtering, is of course possible.

Let ${}^W x_k$ be the position and orientation of the robot's body frame $B$ in a world-fixed coordinate system $W$ at time $k$:

$${}^W x_k = [x,\, y,\, z,\, \phi,\, \theta,\, \psi]^\top, \qquad (1)$$


where $x, y, z$ denote the vehicle position and the angles roll $\phi$, pitch $\theta$, and yaw $\psi$ represent its orientation $\Theta$.

The robot dynamics are approximated by a simple velocity model $f$ following Fossen [31], which is supplemented by the translational accelerations measured by the on-board IMU. More complex motion models can be incorporated to improve the filter's prediction accuracy. However, they require a model parameter identification which is specific to the individual vehicle on which the localization module is deployed [32].
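A minimal sketch of such a prediction step (our simplified rendering of a constant-velocity model supplemented by the IMU accelerations, with Q as in Eq. (7) below):

def predict(p, v, P, a_world, Q, dt):
    """Constant-velocity prediction, supplemented by the measured
    translational acceleration (already rotated into the world frame)."""
    p_new = p + v * dt + 0.5 * a_world * dt**2
    v_new = v + a_world * dt
    P_new = P + Q   # covariance grows by the process noise, cf. Eq. (7)
    return p_new, v_new, P_new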

The AprilTag detection algorithm provides accurate distance information $d_{T_i}$ between the body frame $B$ and the $i$-th detected AprilTag marker $T_i$. However, the AprilTag-measured orientations in roll and pitch direction show an unreliable, jumpy behavior for various robot orientations, which is not acceptable for localization in agile maneuvering scenarios. Since computing the pitch and roll orientation from IMU data is robust, we restrict our AprilTag-based estimate to position and yaw-orientation $\psi$. Thus, the estimated state at time step $k$ reduces to $\hat{p}(k) = [x,\, y,\, z,\, \psi]^\top$.

The state update is based on the measurement $\mu$, which is modeled by the nonlinear observation function $h(p)$. It consists of the distances $d_{T_i}$ between the robot and the detected AprilTags $T_i$, which are located at known positions and orientations inside the tank. The single tag-distance measurement yields

$$\mu_i = h_i(p) = \sqrt{({}^W p - {}^W O_{T_i})^2}, \qquad (2)$$

where ${}^W O_{T_i}$ is the known position of the $i$-th tag. Thus, the measurement vector $\mu$ capturing $N$ detected markers yields

$$\mu_k = [\mu_{k,T_1} \cdots \mu_{k,T_N}] \qquad (3)$$

with

$$\mu_{k,T_i} = [d_{k,T_i}\ \ \psi_{k,T_i}]^\top. \qquad (4)$$

Note that the dimension of $\mu$ changes dynamically with the number $N$ of markers detected in every camera frame.

Thus, the corresponding Jacobian matrix has the form

$$J_p(k) = \left[\nabla_{p(k)} h_1(p(k))^\top \cdots \nabla_{p(k)} h_N(p(k))^\top\right]^\top. \qquad (5)$$

The predicted state and its covariance read

$$\hat{p}^{(-)}(k) = f\!\left(\hat{p}^{(+)}(k-1)\right) \quad \text{and} \qquad (6)$$

$$\hat{P}^{(-)}(k) = \hat{P}^{(+)}(k-1) + Q, \qquad (7)$$

where $Q$ represents the corresponding process noise matrix in diagonal form. The superscripts $(-)$ and $(+)$ indicate values obtained before and after incorporating the measurement $\mu$, respectively.

In order to update the estimated state based on the measurement $\mu$, the Kalman gain reads

$$K(k) = \hat{P}^{(-)}(k)\, J_p^\top(k) \left( J_p(k)\, \hat{P}^{(-)}(k)\, J_p^\top(k) + R(q_C) \right)^{-1}, \qquad (8)$$

where $R(q_C)$ is the measurement noise. Note that we use dynamic measurement noise in order to incorporate the uncertainties which originate from the camera's fish-eye lens,

$$R(q_C) = \frac{\| q_C \|}{\left( q_C^\top e_{C_3} \right) c_\text{penalty}}, \qquad (9)$$

where $q_C$ is the measured tag-position in x-y-z camera coordinates, $e_{C_3}$ is the $z_C$-axis unit vector of the camera frame, and $c_\text{penalty}$ is a tuning parameter. Equation (9) dynamically increases the measurement noise $R$ for individual tag measurements which appear at the border of the camera's field of view. These measurements are usually affected by distortions in the image rectification process, which we capture by higher measurement noise. This can be seen as a soft outlier penalty and considerably increases localization robustness during dynamic maneuvers when many tags appear at the field-of-view border, e.g. the roll-screw maneuver depicted in Fig. 1 and discussed in Sec. V-B.

Thus, the state update yields

$$\hat{p}^{(+)}(k) = \hat{p}^{(-)}(k) + K(k) \left( \mu(k) - h(\hat{p}^{(-)}(k)) \right), \qquad (10)$$

$$\hat{P}^{(+)}(k) = \left( I - K(k) J_p(k) \right) \hat{P}^{(-)}(k). \qquad (11)$$

Note that we compensate for individual time-stamp shifts. However, this requires running the filter algorithm on a delayed fusion horizon. The compensation of this delay, which is required for agile control methods, is described in the following section.

IV. GNC-FRAMEWORK

A. Architecture

In the following, we embed the localization module into a fully integrated GNC-framework, depicted in Fig. 3. Beside the previously developed localization module, the GNC-framework possesses a high-level planning module which breaks the mission task down into robot attitude setpoints. These are processed by a low-level attitude controller which computes the corresponding control commands. The control signals are fed into a mixing module which maps the signals onto the individual motor actuators. This mapping step is based on the geometric actuator configuration of the robot and thus allows a simple adaptation of the GNC-framework to other robot platforms. With respect to the hardware, we group the software modules into algorithms running on the SBC at a low update rate, e.g. the planner, and high-rate modules such as the low-level attitude controller, which are implemented on the flight computing unit (FCU). Both computing units exchange data via a high-bandwidth serial MAVROS-interface.
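To make the mixing step concrete, a sketch of such a geometry-based allocation (the sign pattern is illustrative for an X-shaped four-thruster arrangement like HippoCampus; the actual coefficients are generated from the robot's actuator geometry inside PX4):

import numpy as np

# Rows: motors 1-4; columns: [thrust u1, roll u2, pitch u3, yaw u4].
MIX = np.array([[1.0,  1.0,  1.0,  1.0],
                [1.0, -1.0, -1.0,  1.0],
                [1.0,  1.0, -1.0, -1.0],
                [1.0, -1.0,  1.0, -1.0]])

def mix(u):
    """Map the generalized control u = [u1, u2, u3, u4] to
    normalized per-motor commands."""
    return np.clip(MIX @ u, -1.0, 1.0)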

In order to preserve the modular concept, we follow a loosely coupled estimation approach. Note that the localization data can be subject to non-negligible latencies, mainly due to image processing. These latencies have to be compensated in order to achieve optimal data quality for agile maneuvering control. Since the low-level control algorithms run on board the PX4-based FCU [33], a time-delay compensation within the SBC-based localization module can hardly capture potential latencies in the SBC-FCU communication.


Thus, we integrate the data stream from the localization module into PX4's estimation framework (EKF* in Fig. 3), which is optimized to handle individual sensor latencies. It consists of an EKF running on a delayed fusion time horizon and a complementary filter which propagates the state to the current time.
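Conceptually, this delayed-horizon fusion can be pictured as a short replay buffer: the measurement is fused at its (older) timestamp and the state is then re-propagated to the present with the buffered IMU samples. A simplified sketch of the idea (not the actual PX4 implementation):

from collections import deque

class DelayedFusion:
    """Fuse a delayed measurement, then replay IMU data to 'now'."""
    def __init__(self, horizon=0.5):
        self.imu_buffer = deque()   # (timestamp, imu_sample) pairs
        self.horizon = horizon      # seconds of history to keep

    def add_imu(self, t, imu):
        self.imu_buffer.append((t, imu))
        while self.imu_buffer and t - self.imu_buffer[0][0] > self.horizon:
            self.imu_buffer.popleft()

    def fuse(self, t_meas, z, state_at, update, propagate):
        state = state_at(t_meas)    # filter state valid at measurement time
        state = update(state, z)    # correct with the delayed measurement
        for t, imu in self.imu_buffer:
            if t > t_meas:          # replay newer IMU samples to the present
                state = propagate(state, imu)
        return state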

B. Planning and Control Design

In order to achieve high modularity, we propose a hierarchical planning and control design consisting of two layers. First, a high-level planning algorithm computes attitude setpoints in order to follow a given trajectory or path. Second, a low-level attitude controller computes the required motor commands based on the robot's current attitude Θ and the given attitude setpoints Θ_des. In order to compute target attitude setpoints for the low-level controller, we plan cubic path segments from the robot's current pose towards the target path; a sketch of this idea is given below. This has proven an efficient strategy and leads to improved results in comparison to a standard pure-pursuit algorithm [31]. However, more sophisticated planning strategies are of course possible.
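One way to realize such a cubic connection segment is a Hermite blend between the robot's current pose and a lookahead point on the target path, with the attitude setpoint taken from the segment tangent a small step ahead. A sketch under these assumptions (the helper names are ours, not the exact on-board planner):

import numpy as np

def cubic_segment(p0, t0, p1, t1):
    """Cubic Hermite segment from (position p0, tangent t0) to (p1, t1)."""
    def point(s):  # s in [0, 1]
        h00 = 2*s**3 - 3*s**2 + 1
        h10 = s**3 - 2*s**2 + s
        h01 = -2*s**3 + 3*s**2
        h11 = s**3 - s**2
        return h00*p0 + h10*t0 + h01*p1 + h11*t1
    return point

def attitude_setpoint(p0, t0, p_look, t_look, ds=0.1):
    """Yaw/pitch setpoint from the segment direction a small step ahead."""
    seg = cubic_segment(p0, t0, p_look, t_look)
    direction = seg(ds) - p0
    yaw_des = np.arctan2(direction[1], direction[0])
    pitch_des = -np.arcsin(direction[2] / np.linalg.norm(direction))
    return yaw_des, pitch_des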

In this work, we exemplarily consider the control problem of the underactuated HippoCampus µAUV platform [2]. While four degrees of freedom can be actuated directly, sway and heave remain unactuated. Note that the modular architecture of the proposed GNC-module allows for a flexible adaptation to a wide variety of other underwater robot platforms. The control output $u$ represents the thrust $u_1$ in the robot's $B_x$-axis direction as well as the moments $u_2$, $u_3$, and $u_4$ around the $B_x$, $B_y$, and $B_z$ axes, respectively:

$$u = [u_1\ \ 0\ \ 0\ \ u_2\ \ u_3\ \ u_4]^\top. \qquad (12)$$

We propose a low-level PD-controller based on the control scheme presented by Mellinger et al. [34], which was originally designed for aggressive aerial drone maneuvering. However, its application in underwater robotics has so far been limited to robust attitude stabilization [2]. For the sake of simplicity, we set the thrust signal $u_1$ to a constant target value $u_{1,\text{des}}$, while the outputs $u_2$ to $u_4$ are computed as follows. Let the vector $e_R$ describe the robot's orientation error with regard to the desired orientation $R_\text{des} = R(\Theta_\text{des})$:

$$e_R = \begin{bmatrix} e_{R,B_x} \\ e_{R,B_y} \\ e_{R,B_z} \end{bmatrix} = \frac{1}{2} \left( R_\text{des}^\top R(\Theta) - R(\Theta)^\top R_\text{des} \right)^\vee, \qquad (13)$$

where $\vee$ is the vee map operator, which maps a skew-symmetric matrix in $\mathfrak{so}(3)$ to a vector in $\mathbb{R}^3$. Note that in the case of large orientation errors, we temporarily reduce $u_1$ to prioritize the minimization of the orientation error. Moreover, we define the error of the corresponding angular velocities as

$$e_\omega = \omega - \omega_\text{des}. \qquad (14)$$

Thus, the resulting control outputs yield

$$[u_2\ \ u_3\ \ u_4]^\top = -K_R e_R - K_\omega e_\omega, \qquad (15)$$

where $K_R$ and $K_\omega$ are diagonal matrices which allow convenient gain tuning of the individual components.
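A direct transcription of Eqs. (13)-(15), with the vee map extracting the vector from the skew-symmetric error matrix (the gain values are placeholders, not the tuned on-board gains):

import numpy as np

def vee(S):
    """Vee map: skew-symmetric 3x3 matrix -> vector in R^3."""
    return np.array([S[2, 1], S[0, 2], S[1, 0]])

def attitude_control(R, R_des, omega, omega_des,
                     K_R=np.diag([2.0, 2.0, 1.0]),
                     K_w=np.diag([0.5, 0.5, 0.3])):
    """Geometric PD attitude controller, Eqs. (13)-(15)."""
    e_R = 0.5 * vee(R_des.T @ R - R.T @ R_des)  # Eq. (13)
    e_w = omega - omega_des                     # Eq. (14)
    return -K_R @ e_R - K_w @ e_w               # Eq. (15): [u2, u3, u4]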

Fig. 4: Modular µAUV platform HippoCampus, length 40 cm, hull diameter 7 cm, with front-mounted GNC-module.

Fig. 5: GNC-system including the localization module mounted on the µAUV platform HippoCampus.

C. Hardware Design

The µAUV localization module mainly consists of three components, depicted in Fig. 5: a wide-angle camera, an SBC, and an FCU. The camera is a low-cost wide-angle Raspberry Pi camera with an opening angle of 140°. Note that the effective opening angle is approximately 120° due to rectification. We reduce the resolution to 640 × 480 px in order to trade off accuracy against processing time, which allows the vision system to run at 10 Hz. Note that other camera mounting orientations are possible to adapt the system to the individual robot design. A Raspberry Pi 4 with 4 GB RAM is chosen as the on-board SBC, running the vision processing and the EKF. The PixRacer platform is used as the FCU, running the PX4 firmware [33] including a dedicated µAUV attitude controller. This modular design allows physically splitting the low-frequency components, e.g. the vision processing, from the high-frequency components such as the attitude controller. Figure 5 shows the complete module in a 3D-printed rack which fits the dimensions of the HippoCampus µAUV, see Fig. 4. All system components fit into a total volume of approximately 90 × 50 × 30 mm. The system is powered with 5 V and its power consumption is 9 W at full load, which is low in comparison to the power consumption of the four thrusters. For more details on the HippoCampus platform we refer the reader to [2], [35].

V. EXPERIMENTS

We evaluate the performance of our system in two experimental settings. First, we analyze the accuracy of the proposed embedded visual localization system. In a second experiment, we analyze the GNC-framework in combination with the localization module. Hereby, the goal is to drive the system to its limits by performing hydrobatic maneuvers fully autonomously.

The experiments are conducted within a 4 × 2 × 2 m fresh-water tank. The tank is equipped with an array of 63 AprilTags with 400 × 250 mm spacing. Note that we use coarser configurations out of the mounted 63-tag array for the individual experiments. We choose the AprilTag family 36h11 with side lengths of 9.6 cm. Early testing shows that this tag size can be detected at 3 m using the low-cost Raspberry Pi camera at 640 × 480 px, which makes the use of AprilTags attractive also for larger tanks. Larger distances are possible when using a higher resolution. For all experiments the localization module is deployed on board the HippoCampus µAUV, see Figs. 4 and 5. The acrylic tube hull has a wall thickness of 3 mm. Prior to the experiments, we calibrate the wide-angle camera underwater using standard checkerboard calibration. We refer the reader to the accompanying video for an intuitive visual understanding.

A. Localization Performance

In the following, we analyze the performance of the proposed localization module in order to demonstrate its suitability for accurate visual underwater localization and benchmarking tasks. Prior works, such as [20], focus their analysis of the AprilTag detection mainly on static or slow-motion settings with high-performance camera equipment. Therefore, our analysis supplements these prior results with a dynamic tracking setup. In order to benchmark our results against ground-truth data, we mount the localization module onto an automated gantry system which can follow desired paths in x-y-z-direction with millimeter accuracy. In order to analyze whether the localization accuracy drops in certain areas of the tank volume, we conduct the experiment at three different depth levels, which represent varying distances d between the localization module and the tag array. Moreover, we examine three tag-array densities, namely a fine array with 31 tags, a coarse setup with only 23 tags, and the full array with all 63 AprilTags as a baseline.

The results for all configurations are summarized in Tab. I. The cross-track error e_cross is defined between the gantry's path and the estimated position and is thus independent of latencies, e.g. due to processing time and time-stamp shifts. Moreover, the results show that for larger distances the tag density has only a small influence on the localization accuracy. However, in the 0.4 m setup, the coarse array leads to an observable accuracy drop, as only few tags appear in the camera's field of view.

TABLE I: Cross-track error e_cross of the position estimate and the average number n_tags of AprilTags detected in each camera frame, for various camera-to-tag-array distances d. The corresponding standard deviations are given in parentheses. The setup highlighted in the original table (d = 0.8 m, fine array) is depicted in detail in Fig. 6.

Dist. d            1.3 m           0.8 m           0.4 m
all     e_cross    3.1 (1.2) cm    2.4 (0.7) cm    3.8 (5.3) cm
        n_tags     23.1 (4.7)      12.3 (3.1)      4.2 (2.2)
fine    e_cross    3.1 (1.4) cm    2.6 (1.3) cm    5.5 (4.4) cm
        n_tags     11.6 (2.3)      6.1 (1.6)       1.8 (0.8)
coarse  e_cross    3.4 (1.8) cm    3.2 (1.6) cm    14.9 (20.5) cm
        n_tags     8.6 (1.7)       4.8 (1.3)       1.7 (1.1)

Figure 6 shows the position estimate, exemplarily for the experimental setup with distance d = 0.8 m and the fine array. It can be seen that the module consistently achieves high accuracy throughout the tank volume (RMSE e_cross = 2.6 cm). The accuracy drops slightly in areas close to the tank walls (see x ≈ 0 m and x ≈ 3.1 m) due to the reduced number of visible AprilTags in the camera's field of view. Moreover, the camera can be subject to motion blur, which can considerably reduce the number of detected tags; we examine this effect in more detail in Sec. V-B. Note that the algorithm only detects markers which lie completely within its field of view. The analysis of the raw camera images shows that often multiple AprilTags lie only partly in the field of view, which can result in reduced accuracy. Thus, a trade-off arises between long and short detection distances, using large and small markers, respectively. Moreover, we observe that even tags visible at acute angles from the camera are detected robustly as long as they lie within the field of view.

As expected, the average number of detected tags has a strong effect on the localization performance. Especially larger tanks allow a coarse tag distribution which still ensures that enough tags lie in the camera's field of view. Moreover, the mounting of the tags can be adjusted depending on the task.

Finally, we examine the localization module's capability of estimating its yaw orientation with respect to the marker array. This is an important aspect for µAUVs, since determining the robot's yaw orientation is usually strongly affected by distortions of the magnetic field, leading to inaccurate magnetometer measurements. Therefore, we move the localization module along straight lines through the tank with its own orientation fixed. This allows determining the deviation of the yaw estimate during the motion. We conduct multiple experimental runs with different fixed orientations of the localization module. The resulting standard deviation remains below 2°, which can be seen as a drastic improvement in comparison to standard magnetic-field-based concepts, which often become infeasible in confined tanks consisting of steel frames.

In summary, the experiment demonstrates the high accuracy of the localization module. Note that the above-reported results are achieved with low-cost components of less than USD 80.

Fig. 6: Accuracy of the localization module in comparison to ground truth.

B. Hydrobatic Path Tracking Control

Our second experiment focuses on the robustness of the localization system and its integration into the GNC-framework. In order to demonstrate the performance of the system, we design a challenging sequence of hydrobatic maneuvers. The sequence consists of an ∞-shaped path at varying depths and a screw maneuver, where the robot turns 360° around its roll axis while following the path, see Fig. 1. The outer dimensions of the path are 2.5 × 0.9 m with a total length of 4.4 m. As no external tracking system is available, we place fixed hoops with 50 cm diameter as gates at reference points on the robot's path to visualize the envelope containing the path, see Figs. 1 and 7.

The path is processed on board the SBC by the high-level planning module with a 35 cm planning horizon. The recorded track is depicted in Fig. 7 for four consecutive rounds at varying depths. Note that we pose an additional challenge to the system at the beginning of the experiment by throwing the robot through the air to an arbitrary starting position. The localization system recovers within the first few time instances and estimates the correct position, see the accompanying video.

When performing hydrobatic maneuvers, accurate and robust information on the current robot position is required. Therefore, we want to point out the tracking performance during the screw maneuver, when the system cannot see and detect any marker for 2.7 s. For this time span the controller relies purely on the EKF prediction. Note that, after regaining visual contact with the tags, no noticeable jumps are induced by the Kalman update. Figure 8 shows the number of detected markers while following the path for one round, including the zero-sight segment in dark red which corresponds to the screw maneuver. The reader is referred to the accompanying video for a more intuitive understanding of the maneuvering sequence. When examining the raw camera images, it can be seen that the camera greatly suffers from motion blur when the vehicle performs agile maneuvers. However, tuning the camera exposure time can reduce this effect. We refer the reader to the rosbag files recorded during the experiments, which are linked in the code repository.

It is worth mentioning that the robot is able to track the path at all time instances with small cross-track errors (RMSE 7.2 cm). Thereby, the µAUV follows the path at significantly higher speed than similar state-of-the-art autonomous underwater robot platforms. During the experiment, the angular velocities reach up to [4, 2.8, 1.5] rad/s for roll, pitch, and yaw rotations, respectively.

As a result, although we pose several challenges, the experiments demonstrate that our proposed GNC-framework is capable of repeatedly and accurately driving the µAUV along the path.

Fig. 7: Tracking performance of the HippoCampus µAUV driving through four hoop gates.

Fig. 8: Number of detected AprilTags during the first ∞-round. The red segment indicates zero tag-detections during the screw maneuver, see also Fig. 1.

VI. CONCLUSION

A. Summary

In this work, we propose a robust, highly accurate marker-based visual self-localization system in combination with a guidance, navigation, and control (GNC) architecture which together allow hydrobatic maneuvering with agile micro underwater robots. The developed concept provides small-scale submerged robots with reliable information on their absolute position and orientation, as well as the corresponding control commands, using only low-cost components.

The developed concept forms an independent localization module which can be used either as a primary source of position information or as a secondary system for benchmarking purposes. We demonstrate in two experiments, first, the strong robustness and accuracy of the self-localization module on its own and, second, the capability of the integrated GNC-system of driving an agile micro underwater robot along a path while performing hydrobatic maneuvers. Herewith, we overcome one of the major bottlenecks in micro underwater robotics research, as experiments on fully autonomous missions were previously challenging even in controlled environments such as research tanks. Our algorithms are available online at https://github.com/DanielDuecker/microAUV-localization. The algorithms and findings of our evaluations are likely to contribute to other researchers' work in micro underwater robotics, as the adaptation of our system to other platforms is straightforward.


B. Future Work

The proposed self-localization system in combination with the GNC-framework opens several directions for future research. Since it provides robust and accurate information on the robot's absolute position, the framework is expected to accelerate the development of more sophisticated trajectory planning and following algorithms. With regard to µAUV navigation, we will investigate the effect of additional cameras and varying AprilTag setups on the localization performance.

REFERENCES

[1] S. Bhat and I. Stenius, "Hydrobatics: A Review of Trends, Challenges and Opportunities for Efficient and Agile Underactuated AUVs," in AUV 2018 - 2018 IEEE/OES Autonomous Underwater Vehicle Workshop, Proceedings. Porto: IEEE, 2018, pp. 1-8.

[2] D. A. Duecker, A. Hackbarth, T. Johannink, E. Kreuzer, and E. Solowjow, "Micro Underwater Vehicle Hydrobatics: A Submerged Furuta Pendulum," in IEEE International Conference on Robotics and Automation (ICRA). Brisbane: IEEE, 2018, pp. 7498-7503.

[3] M. Nancekievill, A. R. Jones, M. J. Joyce, B. Lennox, S. Watson, J. Katakura, K. Okumura, S. Kamada, M. Katoh, and K. Nishimura, "Development of a Radiological Characterization Submersible ROV for Use at Fukushima Daiichi," IEEE Transactions on Nuclear Science, vol. 65, no. 9, pp. 2565-2572, 2018.

[4] F. Kendoul, "Survey of Advances in Guidance, Navigation, and Control of Unmanned Rotorcraft Systems," Journal of Field Robotics, vol. 29, no. 2, pp. 315-378, 2012.

[5] S. Heshmati-Alamdari, G. C. Karras, P. Marantos, and K. J. Kyriakopoulos, "A Robust Model Predictive Control Approach for Autonomous Underwater Vehicles Operating in a Constrained Workspace," in 2018 IEEE International Conference on Robotics and Automation (ICRA). Brisbane: IEEE, 2018, pp. 6183-6188.

[6] C. P. Bechlioulis, G. C. Karras, S. Heshmati-Alamdari, and K. J. Kyriakopoulos, "Trajectory Tracking With Prescribed Performance for Underactuated Underwater Vehicles Under Model Uncertainties and External Disturbances," IEEE Transactions on Control Systems Technology, vol. 25, no. 2, pp. 429-440, 2017.

[7] A. Griffiths, A. Dikarev, P. R. Green, B. Lennox, X. Poteau, and S. Watson, "AVEXIS - Aqua Vehicle Explorer for In-Situ Sensing," IEEE Robotics and Automation Letters, vol. 1, no. 1, pp. 282-287, 2016.

[8] H. Hanff, P. Kloss, B. Wehbe, P. Kampmann, S. Kroffke, A. Sander, M. B. Firvida, M. V. Einem, J. F. Bode, and F. Kirchner, "AUVx - A Novel Miniaturized Autonomous Underwater Vehicle," in Oceans. Aberdeen: IEEE, 2017, pp. 1-10.

[9] A. Hackbarth, E. Kreuzer, and E. Solowjow, "HippoCampus: A Micro Underwater Vehicle for Swarm Applications," in IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). Hamburg: IEEE, 2015, pp. 2258-2263.

[10] J. Jung, J. H. Li, H. T. Choi, and H. Myung, "Localization of AUVs Using Visual Information of Underwater Structures and Artificial Landmarks," Intelligent Service Robotics, vol. 10, no. 1, pp. 67-76, 2017.

[11] J. Jung, Y. Lee, D. Kim, D. Lee, H. Myung, and H.-T. Choi, "AUV SLAM Using Forward/Downward Looking Cameras and Artificial Landmarks," IEEE Underwater Technology (UT), pp. 1-3, 2017.

[12] D. J. Klein, P. K. Bettale, B. Triplett, and K. A. Morgansen, "Autonomous Underwater Multivehicle Control with Limited Communication: Theory and Experiment," Proceedings of the 2008 IFAC Workshop on Navigation, Guidance and Control of Underwater Vehicles, vol. 41, no. 1, pp. 113-118, 2008.

[13] S. Napora and D. A. Paley, "Observer-Based Feedback Control for Stabilization of Collective Motion," IEEE Transactions on Control Systems Technology, vol. 21, no. 5, pp. 1846-1857, 2013.

[14] N. Sydney, S. Napora, S. Beal, P. Mohl, P. Nolan, S. Sherman, A. Leishman, S. Butail, and D. A. Paley, "A Micro-UUV Testbed for Bio-Inspired Motion Coordination," Int. Symp. Unmanned Untethered Submersible Technology, pp. 1-13, 2009.

[15] R. Cui, Y. Li, and W. Yan, "Mutual Information-Based Multi-AUV Path Planning for Scalar Field Sampling Using Multidimensional RRT*," IEEE Transactions on Systems, Man, and Cybernetics: Systems, vol. 46, no. 7, pp. 993-1004, 2016.

[16] A. Dikarev, A. Griffiths, S. A. Watson, B. Lennox, and P. R. Green, "Combined Multiuser Acoustic Communication and Localisation System for microAUVs Operating in Confined Underwater Environments," IFAC Workshop on Navigation, Guidance and Control of Underwater Vehicles, vol. 48, no. 2, pp. 161-166, 2015.

[17] M. Ishida and K. Shimonomura, "Marker Based Camera Pose Estimation for Underwater Robots," in IEEE/SICE International Symposium on System Integration. Fukuoka: IEEE/SICE, 2012, pp. 629-634.

[18] E. Olson, "AprilTag: A Robust and Flexible Visual Fiducial System," in IEEE International Conference on Robotics and Automation (ICRA). Shanghai: IEEE, 2011, pp. 3400-3407.

[19] J. Wang and E. Olson, "AprilTag 2: Efficient and Robust Fiducial Detection," in IEEE/RSJ International Conference on Intelligent Robots and Systems. Daejeon: IEEE/RSJ, 2016, pp. 2-7.

[20] D. B. dos Santos Cesar, C. Gaudig, M. Fritsche, M. A. dos Reis, and F. Kirchner, "An Evaluation of Artificial Fiducial Markers in Underwater Environments," in IEEE OCEANS, Genova, 2015, pp. 1-6.

[21] E. H. Henriksen, I. Schjølberg, and T. B. Gjersvik, "Vision Based Localization for Subsea Intervention," in Proceedings of the ASME 2017 36th International Conference on Offshore Mechanics and Arctic Engineering (OMAE), Trondheim, 2017, pp. 1-9.

[22] E. Westman and M. Kaess, "Underwater AprilTag SLAM and Calibration for High Precision Robot Localization," Carnegie Mellon University, Pittsburgh, PA, Tech. Rep., October 2018.

[23] J. Heitmann, F. Steinmetz, and C. Renner, "Self-Localization of Micro AUVs Using a Low-Power, Low-Cost Acoustic Modem," in OCEANS 2018 MTS/IEEE. Charleston: IEEE, 2018, pp. 72-74.

[24] Q. Tao, Y. Zhou, F. Tong, A. Song, and F. Zhang, "Evaluating Acoustic Communication Performance of Micro Autonomous Underwater Vehicles in Confined Spaces," Frontiers of Information Technology and Electronic Engineering, vol. 19, no. 8, pp. 1013-1023, 2018.

[25] A. R. Geist, A. Hackbarth, E. Kreuzer, V. Rausch, M. Sankur, and E. Solowjow, "Towards a Hyperbolic Acoustic One-Way Localization System for Underwater Swarm Robotics," in Proceedings - IEEE International Conference on Robotics and Automation (ICRA), Stockholm, 2016, pp. 4551-4556.

[26] D. Park, K. Kwak, J. Kim, and W. K. Chung, "3D Underwater Localization Scheme Using EM Wave Attenuation with a Depth Sensor," in IEEE International Conference on Robotics and Automation (ICRA). Stockholm: IEEE, 2016, pp. 2631-2636.

[27] D. Park, J. Jung, K. Kwak, J. Kim, and W. K. Chung, "Received Signal Strength of Electromagnetic Waves Aided Integrated Inertial Navigation System for Underwater Vehicle," in IEEE/RSJ International Conference on Intelligent Robots and Systems, Madrid, 2018, pp. 1870-1876.

[28] D. A. Duecker, A. R. Geist, M. Hengeler, E. Kreuzer, M. A. Pick, V. Rausch, and E. Solowjow, "Embedded Spherical Localization for Micro Underwater Vehicles Based on Attenuation of Electro-Magnetic Carrier Signals," Sensors (Switzerland), vol. 17, no. 5, 2017.

[29] D. A. Duecker, T. Johannink, E. Kreuzer, V. Rausch, and E. Solowjow, "An Integrated Approach to Navigation and Control in Micro Underwater Robotics," in IEEE International Conference on Robotics and Automation (ICRA). Montreal: IEEE, 2019, pp. 1-7.

[30] D. Malyuta, C. Brommer, D. Hentzen, T. Stastny, R. Siegwart, and R. Brockers, "Long-Duration Fully Autonomous Operation of Rotorcraft Unmanned Aerial Systems for Remote-Sensing Data Acquisition," Journal of Field Robotics, pp. 1-21, 2019.

[31] T. I. Fossen, Handbook of Marine Craft Hydrodynamics and Motion Control. Wiley, 2011.

[32] D. A. Duecker, E. Kreuzer, G. Maerker, and E. Solowjow, "Parameter Identification for Micro Underwater Vehicles," in Proceedings in Applied Mathematics and Mechanics (PAMM). Munich: GAMM, 2018, pp. 1-2.

[33] L. Meier, D. Honegger, and M. Pollefeys, "PX4: A Node-Based Multithreaded Open Source Robotics Framework for Deeply Embedded Platforms," in 2015 IEEE International Conference on Robotics and Automation (ICRA). Seattle: IEEE, 2015, pp. 6235-6240.

[34] D. Mellinger and V. Kumar, "Minimum Snap Trajectory Generation and Control for Quadrotors," in IEEE International Conference on Robotics and Automation (ICRA). Shanghai: IEEE, 2011, pp. 2520-2525.

[35] D. Duecker, E. Solowjow, and A. Hackbarth, "The Open-Source HippoCampus Robotics Project," 2020. [Online]. Available: https://hippocampusrobotics.github.io/
