Research Article

Multirobot Collaborative Navigation Algorithms Based on Odometer/Vision Information Fusion

Guohua Liu,1 Juan Guan,1 Haiying Liu,2 and Chenlin Wang3

1School of Mathematics, Southeast University, Nanjing 210096, China
2College of Astronautics, Nanjing University of Aeronautics and Astronautics, Nanjing 210016, China
3China COMAC Shanghai Aircraft Design and Research Institute, Shanghai 201210, China

Correspondence should be addressed to Guohua Liu; [email protected]

Received 24 May 2020; Revised 15 June 2020; Accepted 11 July 2020; Published 27 August 2020

Guest Editor: Xiao Ling Wang

Hindawi, Mathematical Problems in Engineering, Volume 2020, Article ID 5819409, 16 pages. https://doi.org/10.1155/2020/5819409

Copyright © 2020 Guohua Liu et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Collaborative navigation is a key technology for multi-mobile-robot systems. To improve the performance of collaborative navigation systems, collaborative navigation algorithms based on odometer/vision multisource information fusion are presented in this paper. First, the multisource information fusion collaborative navigation system model is established, including the mobile robot model, the odometry measurement model, the lidar relative measurement model, the UWB relative measurement model, and the SLAM model based on lidar measurement. Second, the frameworks of centralized and decentralized collaborative navigation based on odometer/vision fusion are given, and the SLAM algorithms based on vision are presented. Then, the centralized and decentralized odometer/vision collaborative navigation algorithms are derived, including the time update, the single-node measurement update, the relative measurement update between nodes, and the covariance intersection filtering algorithm. Finally, different simulation experiments are designed to verify the effectiveness of the algorithms. Two kinds of multirobot collaborative navigation experimental scenes, relative-measurement-aided odometry and odometer/SLAM fusion, are designed, and the advantages and disadvantages of the centralized versus decentralized collaborative navigation algorithms in the different experimental scenes are analyzed.

1. Introduction

With the development of the age of intelligence, the application scenarios and task requirements of intelligent mobile robots are becoming more and more complex [1, 2]. A single robot has difficulty meeting the needs of humans for highly automated robots. Therefore, the multirobot system, which fully embodies group intelligence, has broad prospects, and the subject has attracted extensive attention from researchers [3–5]. The premise of robot collaborative navigation is that each robot has a certain capability of navigation and localization; this capability is therefore fundamental to research on robot collaborative navigation systems [6].

The navigation accuracy of a single robot depends on its own navigation system, independent of other moving bodies. This navigation method is relatively simple, but the computing power of the onboard processor, the sensor quality, the sensor field of view, and other factors limit the working range of the navigation system to a certain extent during navigation and positioning. These factors also affect the ability of the navigation system to suppress noise, reduce errors, and adapt to complex environments. A robot collaborative navigation system can make up for these deficiencies of single-robot navigation. When there are relative measurements between robots, the relative information can be fully used to correct the navigation results. The robots can also establish links with each other to share navigation resources and obtain better navigation performance [7–9].

In the research of mobile robot navigation and localization technology, the development of Simultaneous Localization



and Mapping (SLAM) technology provides a new idea for the autonomous positioning of mobile robots in complex scenes [10]. At the same time, as significant technical means of realizing relative navigation, radio navigation and visual navigation are receiving much attention and provide important technical support for further research on the collaborative navigation of multirobots [11–13]. However, traditional integrated navigation technology is far from meeting the growing demand for high-performance collaborative navigation.

At present, most data fusion methods for multisource heterogeneous sensors in integrated navigation systems use centralized data fusion algorithms. This kind of algorithm has some defects in the collaborative navigation of mobile robots: high communication cost and poor robustness. However, current research on decentralized data fusion algorithms suitable for the collaborative navigation environment is not yet mature [14, 15].

Therefore, based on the information fusion of multisource heterogeneous sensors and related SLAM data association algorithms, this paper puts forward a new concept, model, and solution for multisource heterogeneous sensor information fusion. The proposed approach can significantly improve the performance of collaborative navigation, address some problems of cooperative navigation technology, explore the key technologies that need to be solved in the collaborative navigation of multirobot platforms, and provide a theoretical basis and technical support for high-performance collaborative navigation applications.

The structure of this paper is outlined as follows. In Section 2, the principles of the modules involved in the multisource information fusion collaborative navigation system model and the sources of error are analyzed. In Section 3, the frameworks of centralized and decentralized collaborative navigation based on odometer/vision fusion are given, and the SLAM algorithms based on vision are presented. In Sections 4 and 5, the centralized and decentralized odometer/vision collaborative navigation algorithms are derived, respectively. In Section 6, different simulation experiments are designed to verify the effectiveness of the algorithms.

2. Multisource Information Fusion Collaborative Navigation System Model

2.1. Mobile Robot Model. Suppose that the mobile robot moves along a circular arc trajectory. Figure 1 shows the motion of a single mobile robot.

2.1.1. Odometry Measurement Model. According to the principle of odometer motion calculation [16], it is assumed that the starting pose of the mobile robot is $(x, y, \theta)$ and that, after moving for a sampling period $\Delta T$, the end pose is $(x + \Delta x, y + \Delta y, \theta + \Delta\theta)$; the trajectory can be regarded as a circular arc $\Delta S$ with radius $R$. In this case, the wheel displacements of the robot (in discrete time) are given by

$$
\begin{cases}
\Delta S_l = \dfrac{\pi D \cdot \Delta N_L}{P} = \left(R - \dfrac{B}{2}\right)\Delta\theta,\\[1ex]
\Delta S_r = \dfrac{\pi D \cdot \Delta N_R}{P} = \left(R + \dfrac{B}{2}\right)\Delta\theta,
\end{cases} \qquad (1)
$$

where $\Delta N_L$ and $\Delta N_R$ are the increments of the pulse counts of the left and right wheel encoders, $\Delta S_l$ and $\Delta S_r$ are the distances traveled by the left and right wheels, $D$ is the wheel diameter, $B$ is the distance between the left and right wheels, and $P$ is the resolution of the wheel encoders. The pose increment over the sampling period $\Delta T$ is obtained by solving the above equations. The discrete motion equation of the mobile robot is further obtained as

$$
\begin{cases}
x_k \approx x_{k-1} + \Delta S_{k-1}\cos\!\left(\theta_{k-1} + \dfrac{\Delta\theta_{k-1}}{2}\right),\\[1ex]
y_k \approx y_{k-1} + \Delta S_{k-1}\sin\!\left(\theta_{k-1} + \dfrac{\Delta\theta_{k-1}}{2}\right),\\[1ex]
\theta_k = \theta_{k-1} + \Delta\theta_{k-1}.
\end{cases} \qquad (2)
$$
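Equations (1) and (2) can be combined into a single dead-reckoning step. The sketch below is a minimal illustration, not the authors' code; the function name and the default wheel diameter `D`, wheelbase `B`, and encoder resolution `P` are hypothetical, and the center arc length `dS` is taken as the mean of the two wheel arcs:

```python
import math

def odometry_update(pose, dNL, dNR, D=0.15, B=0.35, P=2048):
    """One dead-reckoning step from encoder pulse increments, Eqs. (1)-(2).

    pose     -- (x, y, theta) at time k-1
    dNL, dNR -- pulse-count increments of the left/right wheel encoders
    D, B, P  -- wheel diameter, wheelbase, encoder resolution
                (illustrative values, not taken from the paper)
    """
    x, y, theta = pose
    dSl = math.pi * D * dNL / P          # left wheel arc length, Eq. (1)
    dSr = math.pi * D * dNR / P          # right wheel arc length
    dS = (dSl + dSr) / 2.0               # arc length of the robot center
    dtheta = (dSr - dSl) / B             # heading increment implied by Eq. (1)
    x += dS * math.cos(theta + dtheta / 2.0)   # Eq. (2)
    y += dS * math.sin(theta + dtheta / 2.0)
    return (x, y, theta + dtheta)
```

Driving both wheels by the same pulse count moves the robot straight along its current heading, which is a quick sanity check of the model.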

2.1.2. Odometer Error Model. There are systematic errors and random errors in odometer-based positioning [17]. In the simulation experiments, mainly the influence of the random errors is considered. The corresponding covariance matrix $\Sigma_{\text{odom}}$ is given by

$$
\Sigma_{\text{odom}} = \begin{bmatrix} k_r \left|\Delta S_r\right| & 0 \\ 0 & k_l \left|\Delta S_l\right| \end{bmatrix}, \qquad (3)
$$

where $k_r$ and $k_l$ are the error constants of the right and left wheels.

2.2. Lidar Environment Detection Model. As the main module for realizing SLAM, the environment detection module mainly obtains the surrounding environment information through exteroceptive sensors.

2.2.1. Lidar Scanning Model. The two-dimensional lidar outputs point cloud data of the surrounding environment, including angular information. The scanning schematic of the two-dimensional lidar is shown in Figure 2.

Figure 1: Motion model of a single mobile robot.


As shown in Figure 2, the lidar scans in the sensor coordinate system along the scanning direction with a fixed scanning angle resolution. Suppose a measured point is detected in the $i$-th direction. Denote by $\theta_i$ the angle between this direction and the positive direction of the $X_s$-axis; $\theta_i$ can be obtained from the inherent scanning angle resolution. A frame of point cloud data is obtained after one scan, which can be recorded as the set $A = \{(d_i, \theta_i) \mid i = 1, \ldots, n\}$, where $n$ is the number of scanned data points in this frame and $(d_i, \theta_i)$ are the polar coordinates of a measured point. Converting to Cartesian coordinates yields the final point cloud data set $B = \{(x_i, y_i) \mid i = 1, \ldots, n\}$.
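The conversion from the polar set $A$ to the Cartesian set $B$ is a one-line transform; the following sketch (helper names are ours, not the paper's) makes the frame convention explicit:

```python
import math

def polar_to_cartesian(scan):
    """Convert one lidar frame A = [(d_i, theta_i), ...] (meters, radians)
    into the Cartesian point set B = [(x_i, y_i), ...] in the sensor frame."""
    return [(d * math.cos(t), d * math.sin(t)) for d, t in scan]

def scan_angles(n, resolution):
    """Angles theta_i implied by a fixed scan angular resolution (radians)."""
    return [i * resolution for i in range(n)]
```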

2.2.2. Lidar Error Model. In the simulation experiments, the above errors can be simplified to Gaussian white noise along the $x$ and $y$ directions in the carrier coordinate system of the mobile robot [18]. The positioning errors in the two directions are assumed independent, so the corresponding covariance matrix is given by

$$
\Sigma_{\text{rad}} = \begin{bmatrix} \sigma_x^b & 0 \\ 0 & \sigma_y^b \end{bmatrix}, \qquad (4)
$$

where $\sigma_x^b$ and $\sigma_y^b$ represent the variances along the $x$ and $y$ directions of the observation in the carrier coordinate system of the mobile robot, respectively.

2.3. UWB Relative Measurement Model. The relative measurement module is an important part of the cooperative navigation network, in which each node relates to the state information of surrounding nodes. UWB (Ultra-Wide Band), which can measure relative distance, and lidar, which can measure relative position, are selected as the research objects in this paper. Because the scanning model of lidar has been described in the previous section, this section only establishes the corresponding ranging model and error model.

2.3.1. UWB Ranging Model. At present, there are two main ranging methods: the two-way ranging (TWR) method and the symmetric double-sided two-way ranging (SDS-TWR) method. SDS-TWR greatly improves on TWR in terms of the ranging error caused by clock synchronization. Therefore, the SDS-TWR method is often used for distance measurement [19] (see Figure 3).

The distance between transceiver A and transceiver B can be obtained from Figure 3:

$$
d = c\,\frac{T_{A1}T_{B2} - T_{B1}T_{A2}}{T_{A1} + T_{A2} + T_{B1} + T_{B2}}, \qquad (5)
$$

where $d$ is the distance between the two tested transceivers, $c$ is the speed of light, and $T_{A1} = T_{a2} - T_{a1}$ is the time interval between transceiver A sending its first signal and receiving the reply. $T_{A2}$, $T_{B1}$, and $T_{B2}$ are obtained in the same way.
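Equation (5) can be evaluated directly. The sketch below assumes the usual SDS-TWR timing convention (with Figure 3 lost to extraction): $T_{A1}$ and $T_{B2}$ are the round-trip times recorded at A and B, and $T_{B1}$ and $T_{A2}$ are the corresponding reply delays; with that convention the formula reduces exactly to the time of flight:

```python
C = 299_792_458.0  # speed of light, m/s

def sds_twr_distance(TA1, TA2, TB1, TB2):
    """SDS-TWR range estimate of Eq. (5) from the four measured intervals
    (seconds): TA1, TB2 are round-trip times at A and B; TA2, TB1 are the
    corresponding reply delays."""
    return C * (TA1 * TB2 - TB1 * TA2) / (TA1 + TA2 + TB1 + TB2)
```

With a 10 m true range and asymmetric reply delays of 1 ms and 2 ms, the algebra cancels the delays and recovers 10 m, which is the point of the symmetric double-sided scheme.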

2.3.2. UWB Error Model. One of the main sources of error in UWB ranging is caused by the clock synchronization of the two transceivers [20]:

$$
e_{\text{SDS}} = T_f\, c \left(1 - \frac{k_A + k_B}{2}\right), \qquad (6)
$$

where $T_f$ is the signal time of flight and $k_A$ and $k_B$ are the ratios between the actual and expected clock frequencies of transceivers A and B.

2.4. SLAM Model Based on Lidar Measurement. SLAM is widely used in the field of mobile robot navigation and positioning [21]. It is the key technology for solving the mapping problem and also provides a new idea for solving the path planning problem. SLAM technology mainly addresses the situation in which the location of the mobile robot is unknown: an environmental map is built incrementally as the surrounding unfamiliar environment is perceived through external sensors, and, at the same time, the state estimate of the mobile robot itself is obtained by using the built map information.

SLAM localization technology is the key to realizing autonomous positioning of a mobile robot in an unknown environment. It is of great research value for realizing more accurate navigation and localization under conditions that reduce the constraints on the mobile robot (such as the absence of GPS) [22]. By applying this technology to the framework of the collaborative navigation system of mobile robots, the navigation performance of the whole collaborative navigation system can be effectively improved (see Figure 4).

It can be seen from Figure 4 that a mobile robot's localization using a SLAM navigation system is essentially a process of continuous estimation that approximates the true value [23]. The triangles in the graph represent the mobile robot, and the circles represent the observed landmarks, where grey circles represent estimated landmarks. The solid line connecting the triangles represents the real trajectory, and the dotted line is the estimated trajectory.

3. Collaborative Navigation Framework Based on Odometer/Vision

When nodes in a collaborative navigation network participate in collaborative navigation, different data fusion algorithms

Figure 2: The scanning schematic diagram of the two-dimensional lidar.


can be used to fuse the data obtained from the multisource heterogeneous sensors of the different nodes. Two data fusion algorithms are generally used: centralized and decentralized data fusion [24].

3.1. Framework of the Collaborative Navigation Algorithm. In this subsection, a data fusion framework for the centralized collaborative navigation algorithm is designed for the general odometer/vision model, and a data fusion framework for the decentralized collaborative navigation algorithm is designed in the same way.

In the centralized collaborative navigation structure, the measurement data obtained by each node are concentrated in a data fusion center for fusion. In the decentralized collaborative navigation structure, each node shares some information with other nodes while processing its own sensor data (see Figure 5).

According to the odometer/vision collaborative navigation model, we use the most common EKF algorithms to simulate centralized and decentralized cooperative navigation. A centralized localization algorithm (CL) and a decentralized localization algorithm (DCL) are designed.

As shown in Figure 6, in the CL algorithm, corresponding to the centralized cooperative navigation structure, each node sends the information obtained by its own sensors to the central server, which realizes the data fusion through the EKF algorithm. Its state vector is the set of the state vectors of all nodes, which is updated according to the principle of dead reckoning. After that, the measurement information obtained after data association in the SLAM process and the relative measurement information between nodes are selected. The CL algorithm gathers the state and covariance information of all nodes, corrects them uniformly, and sends the corrected estimation results back to each node. Because all nodes participate in the joint measurement update process, the state information of the nodes becomes mutually correlated after the first update (see Figure 6).

Based on the CL algorithm, this correlation is reasonably estimated and the task of data fusion is distributed to each node; the DCL algorithm is then proposed accordingly. Since the positions of the nodes in the algorithm are equivalent, only one node needs to be discussed (see Figure 7). In order to avoid overoptimistic estimation to the greatest extent, the DCL algorithm in this paper introduces the concept of the split covariance intersection filter. The covariance of each node is divided into a correlated term and an uncorrelated term, and the time update process is basically consistent with that of a single node. The measurement update process takes two steps. First, according to the measurement information of the SLAM navigation system, a state estimate is obtained by propagating the state and integrating the related information of the aided navigation system. Then, the state information sent by adjacent nodes and the relative measurement information between the nodes yield another state estimate of the node. Here, the relevant state information sent to neighboring nodes is used to update the local maps (see Figure 7).

3.2. SLAM Algorithms Based on Vision

3.2.1. Landmark State Estimation Algorithm. The key to the SLAM navigation algorithm lies in the process of data association. The positioning process of the SLAM navigation system is essentially a process of continuous estimation that approximates the true value. This kind of probability estimation problem is usually solved by introducing an appropriate filter, most commonly the EKF (see Figure 8).

Because the odometer selected in this paper has a high sampling frequency and the lidar has the advantages of high precision and high reliability, the EKF algorithm, which has good real-time performance, is selected. The state estimation process of landmark information in EKF-based SLAM is described below. The observation equation for the feature information obtained by lidar is as follows:

$$
z_k = h\left(X_{kl}, X_{kr}\right) + n_k =
\begin{bmatrix}
\left(x_{kl} - x_{kr}\right)\cos\theta_{kr} + \left(y_{kl} - y_{kr}\right)\sin\theta_{kr} \\
-\left(x_{kl} - x_{kr}\right)\sin\theta_{kr} + \left(y_{kl} - y_{kr}\right)\cos\theta_{kr}
\end{bmatrix} + n_k, \qquad (7)
$$

where $X_{kl} = \left(x_{kl}, y_{kl}\right)^T$ is the state vector of the landmark at time $k$ and $X_{kr} = \left(x_{kr}, y_{kr}, \theta_{kr}\right)^T$ is the state vector of the mobile robot at

Figure 3: Principle of SDS-TWR.

Figure 4: SLAM model diagram.


time $k$; $n_k$ is the measurement noise, with variance matrix $R_k$, denoted $n_k \sim N(0, R_k)$. Since the landmarks are static, the state estimate of the landmark at time $k-1$ can be regarded as the a priori estimate of the landmark state at time $k$. The EKF-based measurement update process is as follows.

Step 1: calculate the innovation and the filter gain:

$$
\begin{aligned}
v_k &= z_k - h\left(X_{(k-1)l}, X_{kr}^-\right),\\
K_k &= \Sigma_{k-1} H_k^T \left(H_k \Sigma_{k-1} H_k^T + R_k\right)^{-1}.
\end{aligned} \qquad (8)
$$

Step 2: update the state estimate and the corresponding covariance:

$$
\begin{aligned}
X_{kl} &= X_{(k-1)l} + K_k v_k,\\
\Sigma_k &= \left(I - K_k H_k\right)\Sigma_{k-1},
\end{aligned} \qquad (9)
$$

where $\Sigma_k$ is the covariance matrix of the landmark state estimate at time $k$ and $H_k$ is the measurement matrix at time $k$.

Remark 1. Any observed landmark can have its position corrected by the above method; note that such position correction is limited to the landmarks in the local map observed at time $k$.
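For a single static landmark with a known robot pose, the update of Eqs. (7)–(9) can be sketched in a few lines. This is a minimal illustration under that simplifying assumption (the robot pose is treated as exact, so the Jacobian $H_k$ reduces to the rotation in Eq. (7)); all names are ours:

```python
import numpy as np

def ekf_landmark_update(X_l, Sigma, z, X_r, R):
    """EKF position correction of one static landmark, Eqs. (7)-(9).

    X_l   -- (2,) prior landmark position estimate (x_l, y_l)
    Sigma -- (2,2) prior landmark covariance
    z     -- (2,) lidar observation of the landmark in the robot frame
    X_r   -- (3,) robot pose (x_r, y_r, theta_r), assumed known here
    R     -- (2,2) measurement noise covariance
    """
    x_r, y_r, th = X_r
    c, s = np.cos(th), np.sin(th)
    Rot = np.array([[c, s], [-s, c]])      # rotation in Eq. (7)
    h = Rot @ (X_l - np.array([x_r, y_r]))  # predicted observation
    H = Rot                                 # Jacobian w.r.t. the landmark state
    v = z - h                               # innovation, Eq. (8)
    S = H @ Sigma @ H.T + R
    K = Sigma @ H.T @ np.linalg.inv(S)      # filter gain
    X_new = X_l + K @ v                     # Eq. (9)
    Sigma_new = (np.eye(2) - K @ H) @ Sigma
    return X_new, Sigma_new
```

With near-noiseless measurements the corrected landmark converges to the observed position, as expected from Eq. (9).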

3.2.2. Data Association Algorithm. In a SLAM navigation system, the data association process is an important prerequisite for state estimation. Incorrect data association is likely to lead to serious deviation of the estimation results [25, 26].

At present, there are two data association algorithms commonly used in SLAM technology: the nearest neighbor (NN) data association algorithm [27] and the joint compatibility branch and bound (JCBB) data association algorithm [28]. The NN algorithm has low computational cost, but it easily forms wrong associations when the density of feature information is large, which leads to divergence of the SLAM results; it is therefore only suitable for environments where the feature density and the system error are small. JCBB is an improvement of the NN algorithm that extends the association of single features in NN to all observed feature information, which

Figure 5: Collaborative navigation structure. (a) Centralized collaborative navigation. (b) Decentralized collaborative navigation.

Figure 6: Data fusion framework of the centralized collaborative navigation algorithm.

Figure 7: Data fusion framework of the decentralized collaborative navigation algorithm.

Figure 8: SLAM flow chart.


is more binding and more reliable. The JCBB algorithm can obtain more credible association hypotheses than the NN algorithm and exclude some wrong ones. However, the amount of computation is obviously increased, which to some extent affects the real-time performance of the SLAM navigation system.

To ensure the accuracy of data association in the SLAM process, reduce the amount of computation as much as possible, and enhance the real-time performance of the SLAM algorithm, this subsection describes an optimized data association algorithm. The classification method mentioned in [29] is used to divide the related feature information sets; finally, the appropriate feature information set in the local map and the preprocessed observed feature information set are selected to form the association space.

First, the collection of feature information in the local map is divided as follows:

$$
\begin{cases}
D\left[\left(x_m, y_m\right), \left(x_k, y_k\right)\right] \le \Delta d \Rightarrow \left(x_k, y_k\right) \in F_k,\\
D\left[\left(x_m, y_m\right), \left(x_k, y_k\right)\right] > \Delta d \Rightarrow \left(x_k, y_k\right) \notin F_k,
\end{cases} \qquad (10)
$$

where $D\left[\left(x_m, y_m\right), \left(x_k, y_k\right)\right]$ is the relative distance between the feature information $\left(x_k, y_k\right)$ of the local map and other feature information $\left(x_m, y_m\right)$.

Then, the observed feature information set is preprocessed and divided. In the actual navigation process, the observed feature information obtained by lidar contains noise. The purpose of preprocessing is to filter out some of this noise, improving the accuracy of data association and reducing the amount of computation at the same time. The judgment process is as follows:

$$
f(i, j) = \begin{cases} 1, & D\left[\left(x_i, y_i\right), \left(x_j, y_j\right)\right] < \Delta D,\\ 0, & D\left[\left(x_i, y_i\right), \left(x_j, y_j\right)\right] \ge \Delta D, \end{cases} \qquad (11)
$$

where $\Delta D$ is a threshold determined by the performance of the laser sensor. When the relative distance between two pieces of observed feature information is less than the threshold, the observed feature information is considered to be a feature point; otherwise, it is a noise point and does not participate in the subsequent calculation.

When the set is divided, the observed feature information is sorted in order of observation. Following the division process of the local map feature information set above, subsets are divided in turn, and no point participates in the division repeatedly.

Finally, we select the appropriate association sets to execute the data association algorithm. The subsets of feature information of each local map and the subset of observed feature information at the current time undergo a joint compatibility test, and the feature information with the best test results is selected to form a new subset as the data association object.
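The distance-threshold tests of Eqs. (10) and (11) can be sketched as two small filters. This is one plausible reading of the garbled set definitions, with hypothetical names; it is not the authors' implementation:

```python
import math

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def partition_map_features(features, anchor, delta_d):
    """Split local-map features into the candidate set F_k (within delta_d
    of the anchor feature) and the rest, following Eq. (10)."""
    F_k = [p for p in features if dist(p, anchor) <= delta_d]
    rest = [p for p in features if dist(p, anchor) > delta_d]
    return F_k, rest

def filter_observations(obs, delta_D):
    """Keep an observed point only if some other observation lies within
    delta_D of it (Eq. (11)); isolated returns are treated as noise."""
    return [p for i, p in enumerate(obs)
            if any(dist(p, q) < delta_D for j, q in enumerate(obs) if i != j)]
```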

4. Centralized Collaborative Navigation Algorithm

4.1. Time Update. First of all, the state space model should be established. The state vector of a single mobile robot with three degrees of freedom contains position and heading angle information. Suppose the number of nodes is $N$. The state space of the collaborative navigation system in the centralized framework contains the state vectors of all mobile robots in the group; the state vector of mobile robot $i$ is $X_k^i$, and the state of the system is $X_k$. Then the state space equation of the system can be expressed as follows:

$$
X_k = \begin{bmatrix} X_k^1 \\ X_k^2 \\ \vdots \\ X_k^N \end{bmatrix} =
\begin{bmatrix} f_1\left(X_{k-1}^1, u_{k-1}^1\right) \\ f_2\left(X_{k-1}^2, u_{k-1}^2\right) \\ \vdots \\ f_N\left(X_{k-1}^N, u_{k-1}^N\right) \end{bmatrix} +
\begin{bmatrix} w_{k-1}^1 \\ w_{k-1}^2 \\ \vdots \\ w_{k-1}^N \end{bmatrix}
\triangleq \Phi\left(X_{k-1}, u_{k-1}\right) + w_{k-1}, \qquad (12)
$$

where the function $f_i(X, u)$ describes the kinematic characteristics of mobile robot $i$, $u_{k-1}^i = \left[\Delta S_{r,k-1}, \Delta S_{l,k-1}\right]^T$ represents the input required by mobile robot $i$ for dead reckoning at time $k$, and $w_{k-1}^i$ is the system noise, with $w_{k-1}^i \sim N\left(0, Q_{k-1}^i\right)$.

It is assumed that the motion of any node is not affected by any other node and that each node moves independently, without being controlled by other nodes. Therefore, the state transition matrix for centralized collaborative positioning is given by

$$
F_{k-1} = \begin{bmatrix}
J_{X(k-1)}^1 & 0 & \cdots & 0 \\
0 & J_{X(k-1)}^2 & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & J_{X(k-1)}^N
\end{bmatrix}, \qquad (13)
$$

where $J_{X(k-1)}^i$ and $J_{u(k-1)}^i$ are the Jacobian matrices of the function $f_i$ with respect to the state vector and the control input, respectively. The system noise variance matrix of the collaborative navigation system in the centralized framework is as follows:

$$
Q_{k-1} = \begin{bmatrix}
Q_{k-1}^1 & 0 & \cdots & 0 \\
0 & Q_{k-1}^2 & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & Q_{k-1}^N
\end{bmatrix}, \qquad (14)
$$

where $Q_{k-1}^i = J_{u(k-1)}^i \Sigma_u J_{u(k-1)}^{iT}$ and $\Sigma_u$ is the covariance matrix of the control input. Then the time update process of the collaborative navigation system in the centralized framework can be deduced:


$$
\begin{aligned}
X_k^- &= \Phi\left(X_{k-1}^+, u_{k-1}\right),\\
P_k^- &= F_{k-1} P_{k-1} F_{k-1}^T + Q_{k-1} \triangleq
\begin{bmatrix}
P_{11,k}^- & P_{12,k}^- & \cdots & P_{1N,k}^- \\
P_{21,k}^- & P_{22,k}^- & \cdots & P_{2N,k}^- \\
\vdots & \vdots & \ddots & \vdots \\
P_{N1,k}^- & P_{N2,k}^- & \cdots & P_{NN,k}^-
\end{bmatrix},
\end{aligned} \qquad (15)
$$

where

$$
\begin{aligned}
P_{ii,k}^- &= J_{X(k-1)}^i P_{ii,k-1} J_{X(k-1)}^{iT} + Q_{k-1}^i,\\
P_{ij,k}^- &= J_{X(k-1)}^i P_{ij,k-1} J_{X(k-1)}^{jT}.
\end{aligned} \qquad (16)
$$
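The block structure of Eqs. (13)–(15) can be exercised with a short sketch. The function and argument names are hypothetical, and the per-robot kinematics `f` is passed in rather than fixed to Eq. (2):

```python
import numpy as np

def centralized_time_update(X, P, inputs, jacobians, Qs, f):
    """Stacked time update of Eq. (15) for N robots.

    X         -- (3N,) stacked state [x, y, theta] per robot
    P         -- (3N, 3N) joint covariance
    inputs    -- list of N per-robot control inputs u^i
    jacobians -- list of N (3,3) state Jacobians J^i_X
    Qs        -- list of N (3,3) per-robot noise covariances Q^i
    f         -- f(x_i, u_i) -> propagated (3,) state (robot kinematics)
    """
    N = len(inputs)
    F = np.zeros((3 * N, 3 * N))   # block-diagonal F_{k-1} of Eq. (13)
    Q = np.zeros((3 * N, 3 * N))   # block-diagonal Q_{k-1} of Eq. (14)
    X_new = np.empty_like(X)
    for i in range(N):
        s = slice(3 * i, 3 * i + 3)
        X_new[s] = f(X[s], inputs[i])
        F[s, s] = jacobians[i]
        Q[s, s] = Qs[i]
    P_new = F @ P @ F.T + Q        # Eq. (15); off-diagonal blocks as in Eq. (16)
    return X_new, P_new
```

Even though each robot propagates independently, the full-matrix product keeps the cross-covariance blocks $P_{ij,k}^-$ of Eq. (16) up to date, which is the point of the centralized formulation.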

4.2. Single-Node Measurement Update. In this section, the measurement update process involving only one node in the centralized framework is described. The aided navigation system selected is the SLAM navigation system, which integrates the landmark information of the surrounding environment measured by lidar. A measurement model based on this navigation system is built, and the EKF-based measurement update process is described.

4.2.1. Measurement Model Based on SLAM. The measurement model based on SLAM is the measurement model after data association. In this paper, the position information of landmarks obtained by lidar is taken as the observation:

$$
z_k^i = \begin{bmatrix} x_l^b \\ y_l^b \end{bmatrix} =
\begin{bmatrix}
\left(x_l^w - x_k^i\right)\cos\theta_k^i + \left(y_l^w - y_k^i\right)\sin\theta_k^i \\
-\left(x_l^w - x_k^i\right)\sin\theta_k^i + \left(y_l^w - y_k^i\right)\cos\theta_k^i
\end{bmatrix} + n_k^i, \qquad (17)
$$

where $\left(x_l^b, y_l^b\right)$ is the position of the landmark obtained by lidar, $\left(x_l^w, y_l^w\right)$ are the coordinates of the landmark in the world coordinate system, $\left(x_k^i, y_k^i, \theta_k^i\right)$ is the state of mobile robot $i$ at time $k$, and $n_k^i$ is the measurement noise, with variance matrix $R_k^i$, denoted $n_k^i \sim N\left(0, R_k^i\right)$. After linearization and state extension, the observation equation of the whole system can be obtained:

$$
z_k^i = H_k^i X_k + h^i\left(X_k^{i-}\right) - \nabla h^i\, X_k^{i-} + n_k^i, \qquad (18)
$$

where

$$
H_k^i = \begin{bmatrix} 0 & \cdots & \nabla h^i & \cdots & 0 \end{bmatrix}_{2 \times 3N}, \qquad (19)
$$

and $\nabla h^i$ is the Jacobian matrix of the function $h^i\left(X_k^i\right)$.

4.2.2. Measurement Update Based on EKF. Combined with the basic principle of the Kalman filter, the measurement update process of the aided navigation system for a single node is as follows.

Step 1 calculating the innovation and the filter gain

$$
\begin{aligned}
v &= z_k^i - h^i\left(X_k^{i-}\right), \\
S^i &= H_k^i P_k^- \left(H_k^i\right)^T + R_k^i, \\
K^i &= P_k^- \left(H_k^i\right)^T \left(S^i\right)^{-1}.
\end{aligned} \tag{20}
$$

Step 2: update the state estimation and the corresponding covariance:

$$
\begin{aligned}
X_k^+ &= X_k^- + K^i v, \\
P_k &= P_k^- - P_k^- \left(H_k^i\right)^T \left(S^i\right)^{-1} H_k^i P_k^-.
\end{aligned} \tag{21}
$$
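Steps 1 and 2 above are the standard EKF correction; a compact NumPy sketch (illustrative function and variable names, not from the paper):

```python
import numpy as np

def ekf_measurement_update(x_prior, P_prior, z, h_val, H, R):
    """Single-node EKF correction of eqs. (20)-(21).  Note that
    K S K^T = P^- H^T S^-1 H P^-, so the covariance line below is
    exactly the eq. (21) form."""
    v = z - h_val                         # innovation, eq. (20)
    S = H @ P_prior @ H.T + R             # innovation covariance, eq. (20)
    K = P_prior @ H.T @ np.linalg.inv(S)  # filter gain, eq. (20)
    x_post = x_prior + K @ v              # state update, eq. (21)
    P_post = P_prior - K @ S @ K.T        # covariance update, eq. (21)
    return x_post, P_post
```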

4.3. Relative Measurement Update among Nodes. The standard observation model can be divided into two types: the measurement model based on the relative distance and the measurement model based on the relative position.

4.3.1. Measurement Model Based on Relative Distance. The observation of mobile robot i to mobile robot j at time k can be denoted by $z_k^{ij}$; then the observation equation is given by:

$$
z_k^{ij} = h^{ij}\left(X_k^i, X_k^j\right) + n_k^{ij} = \sqrt{\left(x_k^i - x_k^j\right)^2 + \left(y_k^i - y_k^j\right)^2} + n_k^{ij} \tag{22}
$$

where $n_k^{ij}$ is the measurement noise; its variance matrix is $R_k^{ij} = \sigma_{\mathrm{UWB}}$, which can be denoted as $n_k^{ij} \sim N(0, R_k^{ij})$, where $\sigma_{\mathrm{UWB}}$ is the variance of UWB ranging.

After linearization and state extension, the observation equation of the whole system can be obtained:

$$
z_k^{ij} = H_k^{ij} X_k + h^{ij}\left(X_k^{i-}, X_k^{j-}\right) - \nabla h_i^{ij} X_k^{i-} - \nabla h_j^{ij} X_k^{j-} + n_k^{ij} \tag{23}
$$

where
$$
H_k^{ij} = \begin{bmatrix} 0 & \cdots & \nabla h_i^{ij} & \cdots & \nabla h_j^{ij} & \cdots & 0 \end{bmatrix}_{1 \times 3N} \tag{24}
$$
and $\nabla h_i^{ij}$ and $\nabla h_j^{ij}$ are the Jacobian matrices of the function $h^{ij}(X^i, X^j)$ with respect to $X^i$ and $X^j$, respectively. (Since the range measurement is scalar, $H_k^{ij}$ has a single row.)
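For the relative-distance model, the two nonzero Jacobian blocks of equation (24) have a simple closed form; a hypothetical helper illustrating them (the heading entry is zero because range does not depend on either robot's orientation):

```python
import numpy as np

def range_obs_and_jacobians(xi, xj):
    """Relative-distance observation of eq. (22) and the two nonzero
    Jacobian blocks placed into H in eq. (24)."""
    dx, dy = xi[0] - xj[0], xi[1] - xj[1]
    r = np.hypot(dx, dy)
    Hi = np.array([[dx / r, dy / r, 0.0]])    # d h / d (x_i, y_i, theta_i)
    Hj = np.array([[-dx / r, -dy / r, 0.0]])  # d h / d (x_j, y_j, theta_j)
    return r, Hi, Hj
```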

4.3.2. Measurement Model Based on Relative Position. Using lidar as the sensor, the relative observation among nodes can be realized in two ways: the direct method and the indirect method. The direct method measures the relative position between the two nodes directly; the indirect method uses lidar to observe the landmark nearest to each of the two nodes, and the relative position between the two nodes is then obtained by correlation calculation.

Mathematical Problems in Engineering 7

The state of mobile robot i at time k is denoted by $(x_k^i, y_k^i, \theta_k^i)$, and the state of mobile robot j by $(x_k^j, y_k^j, \theta_k^j)$. The coordinates of landmark L1, adjacent to mobile robot i, are $(x_{l1}^w, y_{l1}^w)$ in the world coordinate system and $(x_{l1}^i, y_{l1}^i)$ in the coordinate system of mobile robot i; the coordinates of landmark L2, adjacent to mobile robot j, are $(x_{l2}^w, y_{l2}^w)$ in the world coordinate system and $(x_{l2}^j, y_{l2}^j)$ in the coordinate system of mobile robot j. The specific solution process of the indirect method is as follows (see Figure 9):

$$
\begin{bmatrix} x_k^j - x_k^i \\ y_k^j - y_k^i \end{bmatrix} = \begin{bmatrix} x_{l1}^i \cos\theta_k^i - y_{l1}^i \sin\theta_k^i \\ x_{l1}^i \sin\theta_k^i + y_{l1}^i \cos\theta_k^i \end{bmatrix} - \begin{bmatrix} x_{l2}^j \cos\theta_k^j - y_{l2}^j \sin\theta_k^j \\ x_{l2}^j \sin\theta_k^j + y_{l2}^j \cos\theta_k^j \end{bmatrix} + \begin{bmatrix} x_{l2}^w - x_{l1}^w \\ y_{l2}^w - y_{l1}^w \end{bmatrix} \tag{25}
$$
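The indirect solution of equation (25) amounts to re-expressing each body-frame landmark observation in world axes and differencing; a sketch that round-trips the transform of equation (17) to check itself (function names are illustrative, not from the paper):

```python
import numpy as np

def to_body(robot, pw):
    """World point -> robot body frame, i.e. the transform of eq. (17)."""
    x, y, th = robot
    dx, dy = pw[0] - x, pw[1] - y
    return np.array([dx * np.cos(th) + dy * np.sin(th),
                     -dx * np.sin(th) + dy * np.cos(th)])

def relative_position_indirect(l1_body, th_i, l2_body, th_j, l1_w, l2_w):
    """Indirect method of eq. (25): the world-frame offset between
    robots j and i, from each robot's body-frame view of its nearest
    landmark plus the landmarks' world coordinates."""
    def back_rotate(p, th):  # body-frame offset re-expressed in world axes
        return np.array([p[0] * np.cos(th) - p[1] * np.sin(th),
                         p[0] * np.sin(th) + p[1] * np.cos(th)])
    return back_rotate(l1_body, th_i) - back_rotate(l2_body, th_j) + (l2_w - l1_w)
```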

When mobile robot i observes mobile robot j at time k, the coordinates of mobile robot j in the coordinate system of mobile robot i are taken as the observation. The observation equation is as follows:

$$
z_k^{ij} = \begin{bmatrix} x_k^{ji} \\ y_k^{ji} \end{bmatrix} = \begin{bmatrix} \left(x_k^j - x_k^i\right)\cos\theta_k^i + \left(y_k^j - y_k^i\right)\sin\theta_k^i \\ -\left(x_k^j - x_k^i\right)\sin\theta_k^i + \left(y_k^j - y_k^i\right)\cos\theta_k^i \end{bmatrix} + n_k^{ij} \tag{26}
$$

where $n_k^{ij}$ is the measurement noise, whose variance matrix is $R_k^{ij}$, which can be denoted as $n_k^{ij} \sim N(0, R_k^{ij})$, and $(x_k^{ji}, y_k^{ji})$ are the coordinates of mobile robot j in the coordinate system of mobile robot i at time k.

4.3.3. Measurement Update Based on EKF. Similarly, the measurement update process for the relative observation between nodes can finally be obtained.

5. Decentralized Collaborative Navigation Algorithm

The state and covariance information of each node under the decentralized collaborative navigation algorithm are calculated separately. In order to avoid overoptimal estimation to the maximum extent, the concept of the covariance intersection filter is introduced, and the covariance of each node is divided into a related item and an irrelevant item.

5.1. Covariance Intersection Filter. Given the state estimation vector $\hat{X}$ and the corresponding covariance matrix $P$, and assuming that $P^*$ is the covariance of the error between the state estimate $\hat{X}$ and the true state $X^*$, it can be expressed as follows:
$$
P^* = E\left[\left(\hat{X} - X^*\right)\left(\hat{X} - X^*\right)^T\right] \tag{27}
$$

Consistency is a characteristic of the covariance matrix of the estimation [30]. When the covariance matrix of the state estimation is not less than the real covariance, the estimation is said to satisfy consistency; that is, no overoptimal estimation is produced. Suppose two state estimates $\hat{X}_1$ and $\hat{X}_2$ are independent and satisfy consistency, with corresponding covariances $P_1$ and $P_2$. If there is a correlation between the two estimates, the Kalman filter may produce inconsistent results; in other words, it leads to overoptimal estimation. The covariance intersection filter fuses the two estimates as follows:

$$
P_1 = \frac{P_{1d}}{w} + P_{1i}, \tag{28}
$$
$$
P_2 = \frac{P_{2d}}{1-w} + P_{2i}, \tag{29}
$$
$$
P^{-1} = P_1^{-1} + P_2^{-1}, \tag{30}
$$
$$
\hat{X} = P\left(P_1^{-1}\hat{X}_1 + P_2^{-1}\hat{X}_2\right), \tag{31}
$$
$$
P_i = P\left(P_1^{-1} P_{1i} P_1^{-1} + P_2^{-1} P_{2i} P_2^{-1}\right)P, \tag{32}
$$
$$
P_d = P - P_i. \tag{33}
$$

Figure 9: Indirect observation schematic diagram.


where the covariances corresponding to the two state estimates $\hat{X}_1$ and $\hat{X}_2$ are $P_{1d} + P_{1i}$ and $P_{2d} + P_{2i}$, respectively. $P_{1d}$ and $P_{2d}$ are the correlated covariance components corresponding to the maximum correlation between the two state estimates, and $P_{1i}$ and $P_{2i}$ are the independent covariance components corresponding to absolute independence of the two state estimates. The weight $w$ lies within the interval $[0, 1]$; it is an optimization parameter that minimizes the covariance after fusion, and any $w$ in this interval ensures the consistency of the fusion result.
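A minimal split covariance intersection fusion along the lines of equations (28)-(33) can be sketched as follows. The weight $w$ is found here by a simple grid search minimizing the trace of the fused covariance; the paper does not prescribe a particular optimizer, so this choice and all names are our own:

```python
import numpy as np

def split_covariance_intersection(x1, P1d, P1i, x2, P2d, P2i, steps=999):
    """Split covariance intersection fusion of eqs. (28)-(33)."""
    best = None
    for w in np.linspace(1e-3, 1.0 - 1e-3, steps):
        P1t = P1d / w + P1i            # inflated covariance 1, eq. (28)
        P2t = P2d / (1.0 - w) + P2i    # inflated covariance 2, eq. (29)
        P = np.linalg.inv(np.linalg.inv(P1t) + np.linalg.inv(P2t))  # eq. (30)
        if best is None or np.trace(P) < best[0]:
            best = (np.trace(P), P1t, P2t, P)
    _, P1t, P2t, P = best
    iP1, iP2 = np.linalg.inv(P1t), np.linalg.inv(P2t)
    x = P @ (iP1 @ x1 + iP2 @ x2)                      # fused state, eq. (31)
    Pi = P @ (iP1 @ P1i @ iP1 + iP2 @ P2i @ iP2) @ P   # independent part, eq. (32)
    Pd = P - Pi                                        # dependent part, eq. (33)
    return x, P, Pd, Pi
```

For two symmetric estimates the optimizer settles near $w = 0.5$ and the fused state lands midway between them.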

5.2. Time Update. Before describing the time update process of the DCL algorithm, it is necessary to decompose the state information of the system in the framework of centralized collaborative navigation, which can be expressed as

$$
\begin{aligned}
E_G &= \left\{X_G, P_G\right\} = \left\{\left\{X_1, P_1\right\}, \left\{X_2, P_2\right\}, \ldots, \left\{X_N, P_N\right\}\right\} \\
&= \left\{\left\{X_1, P_{1d} + P_{1i}\right\}, \left\{X_2, P_{2d} + P_{2i}\right\}, \ldots, \left\{X_N, P_{Nd} + P_{Ni}\right\}\right\}
\end{aligned} \tag{34}
$$

where $E_G$ is the set of states under the centralized collaborative navigation framework, and $X_G$ and $P_G$ are the state space and the corresponding covariance matrix under that framework, respectively.

The state propagation process under the decentralized collaborative navigation framework is the state propagation process of a single node, and the propagation of the covariance can be expressed as

$$
\begin{aligned}
P_k^{i-} &= J_{X(k-1)}^i P_{k-1}^i J_{X(k-1)}^{iT} + J_{u(k-1)}^i \Sigma_u J_{u(k-1)}^{iT}, \\
P_{ki}^{i-} &= J_{X(k-1)}^i P_{(k-1)i}^i J_{X(k-1)}^{iT} + J_{u(k-1)}^i \Sigma_u J_{u(k-1)}^{iT},
\end{aligned} \tag{35}
$$

where $P_k^{i-}$ is the one-step prediction covariance matrix of mobile robot i at time k, and $P_{ki}^{i-}$ is its independent covariance component. $J_{X(k-1)}^i$ and $J_{u(k-1)}^i$ are the Jacobian matrices of the function $f^i(X, u)$ with respect to the state vector and the control input, respectively, and $\Sigma_u$ is the error matrix of the control input.
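Equation (35) propagates the full covariance and its independent component through the same Jacobians, both picking up the mapped control noise; sketched as (illustrative names):

```python
import numpy as np

def dcl_time_update(P, P_indep, J_x, J_u, Sigma_u):
    """Covariance propagation of eq. (35) for one node: the full
    covariance and its independent component go through the motion
    model Jacobian J_x, and the control noise enters via J_u."""
    Q = J_u @ Sigma_u @ J_u.T
    return J_x @ P @ J_x.T + Q, J_x @ P_indep @ J_x.T + Q
```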

5.3. Single-Node Measurement Update. The measurement update process of a single node only involves the aided navigation system of that node, so there is no need to estimate the correlation; that is, formulas (28) and (29) can be skipped. Similar to the single-node measurement update in centralized collaborative navigation, the single-node measurement update in decentralized collaborative navigation can be expressed as follows.

Step 1: calculate the innovation and the filtering gain:

$$
\begin{aligned}
v &= z_k^i - \nabla h^i X_k^{i-}, \\
S &= \nabla h^i P_k^{i-} \left(\nabla h^i\right)^T + R_k^i, \\
K &= P_k^{i-} \left(\nabla h^i\right)^T S^{-1}.
\end{aligned} \tag{36}
$$

Step 2: update the state estimation and the corresponding covariance:

$$
\begin{aligned}
X_k^{i+} &= X_k^{i-} + K v, \\
P_k^i &= \left(I - K \nabla h^i\right) P_k^{i-}, \\
P_{ki}^i &= \left(I - K \nabla h^i\right) P_{ki}^{i-} \left(I - K \nabla h^i\right)^T + K R_k^i K^T, \\
P_{kd}^i &= P_k^i - P_{ki}^i.
\end{aligned} \tag{37}
$$
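Steps 1 and 2 of equations (36) and (37) combine into one routine: a standard EKF correction plus a Joseph-form propagation of the independent covariance component, with the dependent part recovered as the remainder (a NumPy sketch with names of our own choosing):

```python
import numpy as np

def dcl_single_node_update(x, P, P_indep, z, h_val, H, R):
    """Decentralized single-node update of eqs. (36)-(37)."""
    v = z - h_val
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x_post = x + K @ v
    I_KH = np.eye(len(x)) - K @ H
    P_post = I_KH @ P                                 # full covariance, eq. (37)
    P_i_post = I_KH @ P_indep @ I_KH.T + K @ R @ K.T  # independent part, eq. (37)
    P_d_post = P_post - P_i_post                      # dependent part, eq. (37)
    return x_post, P_post, P_d_post, P_i_post
```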

5.4. Collaborative Measurement Update among Nodes. In the framework of decentralized collaborative navigation, the state estimation results of a single-node aided navigation system and the state estimation results based on information sharing among nodes are integrated during the internode collaborative measurement update, and the corrected state information is derived.

In the decentralized collaborative navigation framework, any node can explicitly estimate the state of other nodes. In order to save communication cost and reduce the computation on a single mobile robot platform, this paper assumes that information exchange takes place only between two adjacent mobile robot nodes.

Assuming that mobile robot i performs a relative observation of mobile robot j at time k and shares its own state and covariance information with mobile robot j, the state of mobile robot j can be expressed with the received state of mobile robot i and the relative measurement information between the two nodes:

$$
X_k^{co,j} = \begin{bmatrix} x_k^{co,j} \\ y_k^{co,j} \end{bmatrix} = \begin{bmatrix} x_k^i + x_k^{ji}\cos\theta_k^i - y_k^{ji}\sin\theta_k^i \\ y_k^i + x_k^{ji}\sin\theta_k^i + y_k^{ji}\cos\theta_k^i \end{bmatrix} \tag{38}
$$

where $(x_k^{co,j}, y_k^{co,j})$ is the partial state estimation of mobile robot j obtained by the information sharing between mobile robot i and mobile robot j at time k, $(x_k^i, y_k^i, \theta_k^i)$ is the state vector shared by mobile robot i, and $u_{rel} = (x_k^{ji}, y_k^{ji})$ is the relative measurement of the two nodes expressed in the coordinate system of mobile robot i.
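Equation (38) is simply a rotation of the body-frame relative measurement into world axes plus a translation by robot i's shared position; for example (illustrative names):

```python
import numpy as np

def shared_position_estimate(state_i, rel_ji):
    """Eq. (38): robot j's world-frame position reconstructed from
    robot i's shared state (x, y, theta) and the relative measurement
    (x_ji, y_ji) expressed in robot i's body frame."""
    xi, yi, thi = state_i
    xr, yr = rel_ji
    return np.array([xi + xr * np.cos(thi) - yr * np.sin(thi),
                     yi + xr * np.sin(thi) + yr * np.cos(thi)])
```

Feeding it the body-frame measurement of equation (26) recovers robot j's true position exactly, which is a useful consistency check.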

If there is a direct relative observation between the two nodes, the relative measurement can be obtained directly by the sensor performing the relative observation. If the relative observation between the two nodes relies on indirect observation of the surrounding landmarks, the relative measurement must first be solved with formula (25) and then converted into the coordinate system of mobile robot i.

Finally, based on the principle of the covariance intersection filter, the updating process of collaborative measurement among nodes in the framework of decentralized collaborative navigation can be obtained.


6. Simulation Results

6.1. Simulated Experimental Environment. In this section, the mobile robot network involved in collaborative navigation has 3 nodes. The moving 2D environment covers an area of 25 m × 25 m. When the mobile robot group works together, each node in the environment is assigned an initial position, and each node can follow a random trajectory in this area. It is assumed that all nodes follow the same simulated trajectory and differ only in their initial positions. The maximum speed of the mobile robot on the straight line is 0.3125 m/s, and the angular velocity at the bend is 0.1 deg/s. It is assumed that 88 landmarks (environmental feature points) around the simulated rectangular trajectory can be extracted by lidar scanning for the SLAM-aided navigation system (see Figure 10).

During this simulation, the mobile robots can carry different types of sensors, including odometers, UWB, and lidar. Suitable sensors are selected according to the requirements of positioning accuracy, among which Time Domain P410 UWB sensors are used to measure the relative distance, and the lidar is selected from the LMS291 series of 2D lidars produced by a German company. Based on the relevant parameters of these sensors, which are shown in Table 1, a simulation model for mobile robots carrying different types of sensors is built using MATLAB.

6.2. Relative Measurement Aided Odometer Collaborative Navigation. In this experiment, all three mobile robots are equipped with an odometer capable of motion monitoring, UWB capable of measuring relative distance, or lidar capable of measuring relative position.

From Figure 11, it can be seen that the collaborative navigation system which realizes relative information sharing has significant advantages in positioning accuracy over the case of not sharing any information. Moreover, the improvement of the group navigation performance of the mobile robots depends on the type of shared relative information. When the relative position information is shared, the growth of the error can be effectively limited; in comparison, when the relative distance information is shared, the position error still grows slowly, and only the growth rate of the error is reduced (see Figure 11).

The analysis shows that the relative distance information is weakly constrained, so sharing this information cannot effectively realize the navigation and localization of the mobile robots. In contrast, the shared relative position information contributes directly to the mobile robot navigation solution, and the accuracy is significantly improved; at some times it can even be increased by more than 60%. This difference is more obvious in the angle error diagram (see Figure 11).

In this paper, two observation methods, direct relative measurement and indirect relative measurement, are mentioned in the description of the measurement model based on relative position. Based on this experimental scene, in scenario I the three mobile robots observe the relative position information directly through lidar; in scenario II the three mobile robots extract the surrounding landmark information through lidar, and the relative position information is calculated on that basis. In both scenarios, the centralized collaborative navigation algorithm is used to solve the navigation problem. The two relative position measurement methods are compared through the above simulation scenarios, and the comparison results are shown in Figure 12.

Through Figure 12, it is clear that the collaborative navigation and positioning accuracy of the direct relative position measurement is better than that of the indirect method. However, the computation cost cannot be ignored while navigation performance is considered. The direct method requires that the measurement range of the lidar cover the activity range of the whole mobile robot group, while the measurement range required by the indirect method only needs to cover the surrounding landmarks, which greatly reduces the cost. Considering that the collaborative navigation and positioning accuracy of the two relative position measurement methods does not differ much, the indirect method is clearly more suitable for practical application (see Figure 12).

The difference of the decentralized collaborative navigation framework compared with the centralized collaborative navigation framework is that the correlation among

Figure 10: Simulation trajectory diagram.

Table 1: Relevant parameters of sensors.

Type      Measure                              Frequency (Hz)   Error
Odometer  Linear velocity of the two wheels    20               4 cm/s
UWB       Relative distance                    10               3 cm
Lidar     Relative position in X direction     10               2 cm
Lidar     Relative position in Y direction     10               2 cm


Figure 11: Comparative diagram of navigation error under the condition of sharing different relative information. (a) Position error. (b) Angle error.

Figure 12: Comparison of navigation errors under different relative position measuring methods. (a) Position error. (b) Angle error.

Figure 13: Comparison of navigation errors under different collaborative navigation algorithms. (a) Position error. (b) Angle error.


the different node states is accurately calculated in the centralized framework, while this correlation cannot be used in the decentralized framework. In order to better reflect the impact of this correlation, the navigation errors of the two collaborative navigation algorithms in the odometer collaborative navigation system are shown in Figure 13.

To compare the two algorithms, 20 experiments are carried out in this paper, and the root mean square (RMS) error of each collaborative navigation algorithm is calculated as shown in the following formula:

$$
\mathrm{RMS} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(x_i - \hat{x}_i\right)^2} \tag{39}
$$

where $n$ is the total number of samples, $x_i$ is the actual value, and $\hat{x}_i$ is the estimated value. The RMS parameters for the odometer collaborative navigation are shown in Table 2.
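Equation (39) in code (an illustrative one-liner):

```python
import numpy as np

def rms_error(estimates, truths):
    """RMS error of eq. (39) over n samples."""
    e = np.asarray(estimates, dtype=float) - np.asarray(truths, dtype=float)
    return float(np.sqrt(np.mean(e ** 2)))
```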

As can be seen from Figure 13 and Table 2, the error of the centralized collaborative navigation algorithm is smaller than that of the decentralized collaborative navigation algorithm. This is expected, because the correlation among node states can be calculated accurately in the centralized algorithm, whereas it is only estimated in the decentralized algorithm. However, the improved navigation accuracy comes at the expense of high computing power and high-quality data communication. Therefore, although the performance of the centralized collaborative navigation framework is better than that of the decentralized framework, the centralized framework is not applicable in some practical scenarios (see Figure 13).

6.3. Odometer/Vision SLAM Collaborative Navigation. In the odometer/vision collaborative navigation model, scenario I is designed such that all the mobile robots are equipped with an odometer which can monitor the motion, and one of the mobile robots is equipped with a SLAM-aided navigation system that works properly.

Firstly, the mobile robot with the SLAM-aided navigation system is studied; it runs only its own integrated navigation algorithm without sharing the relative position information. Using the centralized collaborative navigation algorithm, the navigation error of the node with the SLAM-aided navigation system is shown in Figure 14.

Figure 14 fully verifies the correctness of the centralized collaborative navigation algorithm based on the odometer/vision collaborative navigation model. The SLAM-aided navigation system is based on relative observation, so both the position estimation of the node itself and the position estimation of the landmarks accumulate error. When the data association algorithm of SLAM is combined with the centralized collaborative navigation algorithm, however, the position estimation of the landmarks moves closer to the real value while the positioning accuracy of the mobile robot is improved, and the data association becomes more reliable, further correcting the state estimation of the mobile robot itself. The improvement of the mobile robot's navigation accuracy by this algorithm is therefore very obvious (see Figure 14).

Then the mobile robots without the SLAM-aided navigation system in the experiment are studied. In order to fully reflect the influence of the SLAM-aided navigation information on the navigation performance of the other nodes, scenario II is designed such that all mobile robots are equipped with an odometer which can monitor the motion, and two of them are equipped with a SLAM-aided navigation system that works properly. The navigation error of the other nodes without the SLAM-aided navigation system is shown in Figure 15.

As shown in Figure 15, the mobile robot with the SLAM-aided navigation system performs loop closure detection at about 320 seconds and associates data with the local map created at the initial location, thus eliminating most of the accumulated error. This uniquely superior behavior of the SLAM-aided navigation system is transmitted to the other nodes in the group through information sharing during collaborative navigation, so that they too can eliminate most of the accumulated error around that time, which is an important advantage of the collaborative navigation system (see Figure 15).

To verify the influence of the NN algorithm, the JCBB algorithm, and the optimized data association algorithm on the navigation performance of nodes without the SLAM-aided navigation system, the experimental scene is designed such that all mobile robots are equipped with an odometer which can carry out motion monitoring, one of the mobile robots is equipped with a SLAM-aided navigation system working normally, and the CL algorithm is run. The navigation error of the nodes without the SLAM-aided navigation system is shown in Figure 16.

The performance of the centralized collaborative navigation algorithm under the three SLAM data association algorithms is shown in Table 3.

From Figure 16 and Table 3, it can be seen that the navigation performance of the nodes without the SLAM-aided navigation system is affected by the SLAM data association algorithm used by the nodes carrying the SLAM-aided navigation system. Running the NN algorithm, the matching accuracy of the feature information is not high, so the navigation accuracy is poor. Running the JCBB algorithm, the correct rate of data association is the highest, but the operation time is the longest. Running the optimized data association algorithm, the navigation accuracy is slightly reduced, but the operation time is shorter, which can meet real-time requirements (see Figure 16).

In this subsection, to compare the performance of collaborative navigation in the odometer/vision collaborative navigation system under centralized and

Table 2: Odometer collaborative navigation RMS parameters.

Algorithm type   Position error (m)   Angle error (deg)
CL               0.1629               0.74625
DCL              0.36342              1.3762


Figure 14: The navigation error map of the node with the SLAM-aided navigation system. (a) Position error. (b) Angle error.

Figure 15: Some nodes are equipped with the SLAM-aided navigation system. (a) Position error. (b) Angle error.

Figure 16: Comparison diagram of navigation error for fusion of single-node SLAM information under different SLAM data association algorithms. (a) Position error. (b) Angle error.


decentralized collaborative navigation algorithms, we run the CL and DCL algorithms separately under experimental scenario I; the navigation errors of the two collaborative navigation algorithms are compared in Figure 17. Under experimental scenario II of this subsection, we run the CL algorithm and the DCL algorithm, respectively; the navigation errors of the two collaborative navigation algorithms are compared in Figure 18 (see Figures 17 and 18).

After 20 experiments, the RMS parameters of collaborative navigation with fused single-node SLAM information are shown in Table 4.

The RMS parameters of collaborative navigation with fused multinode SLAM information are shown in Table 5.

As can be seen from Figures 17 and 18, in conjunction with Tables 4 and 5, in the odometer/vision collaborative navigation system the error of the centralized collaborative

Table 3: Performance comparison of centralized collaborative navigation algorithms under different SLAM data association algorithms.

Algorithm type   Position error (m)   Angle error (deg)   Relative time
NN               2.8323               10.7919             4
JCBB             0.0322               0.1623              12
Optimization     0.5587               2.2476              1

Figure 17: Comparative diagram of navigation error for fusion of single-node SLAM information under different collaborative navigation algorithms. (a) Position error. (b) Angle error.

Figure 18: Comparison diagram of navigation error for fusion of multinode SLAM information under different collaborative navigation algorithms. (a) Position error. (b) Angle error.

Table 4: Collaborative navigation RMS parameters for fusion of single-node SLAM information.

Algorithm type   Position error (m)   Angle error (deg)
CL               0.0322               0.1623
DCL              0.0669               0.2094

Table 5: Collaborative navigation RMS parameters for fusion of multinode SLAM information.

Algorithm type   Position error (m)   Angle error (deg)
CL               0.0243               0.0524
DCL              0.0438               0.1265


navigation algorithm is smaller than that of the decentralized collaborative navigation algorithm; after the landmark information collected by a single node or by multiple nodes is fused, only a small gap remains between the two algorithms. In other words, the decentralized collaborative navigation algorithm based on the odometer/vision collaborative navigation model can estimate the correlation of the internode information well (see Figures 17 and 18).

Considering the high demands of the centralized collaborative navigation algorithm on computing power and communication, the application scenarios of the two algorithms are analyzed in combination with the abovementioned collaborative navigation experiments. The centralized collaborative navigation algorithm is suitable when there are few nodes and the nodes are not equipped with additional aided navigation systems. The decentralized collaborative navigation algorithm is suitable when there are many nodes and a large amount of shared information, with some nodes equipped with additional aided navigation systems, especially a SLAM-aided navigation system.

7. Conclusion

In order to improve the performance of the collaborative navigation system, multirobot collaborative navigation algorithms based on odometer/vision multisource information fusion are studied. On the basis of establishing the multisource information fusion collaborative navigation system model, the centralized and decentralized frameworks for odometer/vision collaborative navigation and the vision-based SLAM algorithms are given, and the centralized and decentralized odometer/vision collaborative navigation algorithms are derived, respectively. The effectiveness of the proposed algorithms is verified by simulation experiments, which has both theoretical and application value for high-performance collaborative navigation.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

References

[1] K N Olivier D E Griffith G Eagle et al ldquoRandomized trialof liposomal amikacin for inhalation in nontuberculousmycobacterial lung diseaserdquo American Journal of Respiratoryand Critical Care Medicine vol 195 no 6 pp 814ndash823 2017

[2] M Schwarz M Beul D Droeschel et al ldquoDRC team nimbrorescue perception and control for centaur-like mobile ma-nipulation robot momarordquo Springer Tracts in Advanced Ro-botics Springer Berlin Germany pp 145ndash190 2018

[3] M Long H Su and B Liu ldquoGroup controllability of two-time-scale discrete-time multi-agent systemsrdquo Journal of theFranklin Institute vol 357 no 6 pp 3524ndash3540 2020

[4] T Fukuda S Nakagawa Y Kawauchi and M BussldquoStructure decision method for self organising robots basedon cell structures-CEBOTrdquo in Proceedings of the 1989 In-ternational Conference on Robotics and Automation Scotts-dale AZ USA May 1989

[5] H Asama A Matsumoto and Y Ishida ldquoDesign of an au-tonomous and distributed robot system actressrdquo in Pro-ceedings of the IEEERSJ International Workshop on IntelligentRobots and Systems ldquo(IROS rsquo89)rdquo the Autonomous MobileRobots and its Applications September 1989

[6] J Zhou Y Lv G Wen X Wu and M Cai ldquoree-dimen-sional cooperative guidance law design for simultaneous at-tack with multiple missiles against a maneuvering targetrdquo inProceedings of the 2018 IEEE CSAA Guidance Navigation andControl Conference (CGNCC) August 2018

[7] H Su J Zhang and Z Zeng ldquoFormation-containmentcontrol of multi-robot systems under a stochastic samplingmechanismrdquo Science China Technological Sciences vol 63no 6 pp 1025ndash1034 2020

[8] H Park and S Hutchinson ldquoA distributed robust conver-gence algorithm for multi-robot systems in the presence offaulty robotsrdquo in Proceedings of the 2015 IEEERSJ Interna-tional Conference on Intelligent Robots and Systems (IROS)pp 2980ndash2985 IEEE Hamburg Germany September-Oc-tober 2015

[9] K Petersen and R Nagpal ldquoComplex design by simple robotsa collective embodied intelligence approach to constructionrdquoArchitectural Design vol 87 no 4 pp 44ndash49 2017

[10] L Chaimowicz T Sugar V Kumar and M F M CamposldquoAn architecture for tightly coupled multi-robot coopera-tionrdquo in Proceedings 2001 ICRA IEEE International Con-ference on Robotics and Automation (Cat no 01CH37164)vol 3 pp 2992ndash2997 IEEE Seoul Korea May 2001

[11] H-X Hu G Chen and G Wen ldquoEvent-triggered control onquasi-average consensus in the cooperation-competition net-workrdquo in Proceedings of the IECON 2018mdash44th Annual Con-ference of the IEEE Industrial Electronics Society October 2018

[12] A Amanatiadis K Charalampous I Kostavelis et al ldquoeavert project autonomous vehicle emergency recovery toolrdquoin Proceedings of the 2013 IEEE International Symposium onSafety Security and Rescue Robotics (SSRR) pp 1ndash5 IEEELinkoping Sweden October 2013

[13] R Kurazume S Hirose T Iwasaki S Nagata and N SashidaldquoStudy on cooperative positioning systemrdquo in Proceedings ofthe IEEE International Conference on Robotics andAutomation Minneapolis MN USA August 1996

[14] Z Fu Y Zhao and G Wen ldquoDistributed continuous-timeoptimization in multi-agent networks with undirected topol-ogyrdquo in Proceedings of the 2019 IEEE 15th International Con-ference on Control and Automation (ICCA) November 2019

[15] Y Zhao Y Liu and G Wen ldquoFinite-time average estimationfor multiple double integrators with unknown bounded in-putsrdquo in Proceedings of the 2018 33rd Youth Academic AnnualConference of Chinese Association of Automation (YAC) May2018

[16] S Mao Mobile robot localization in indoor environmentZhejiang University Hangzhou China PhD dissertation2016

[17] J Yang ldquoAnalysis approach to odometric non-systematicerror uncertainty for mobile robotsrdquo Chinese Journal ofMechanical Engineering vol 44 no 8 pp 7ndash12 2008

[18] J Kang F Zhang and X Qu Angle Measuring Error Analysisof Coordinate Measuring System of Laser Radar vol 40 no 6pp 834ndash839 2016

Mathematical Problems in Engineering 15

[19] J. Zhang, P. Orlik, Z. Sahinoglu, A. Molisch, and P. Kinney, "UWB systems for wireless sensor networks," Proceedings of the IEEE, vol. 97, no. 2, pp. 313–331, 2009.

[20] D. Kaushal and T. Shanmuganantham, "Design of a compact and novel microstrip patch antenna for multiband satellite applications," Materials Today: Proceedings, vol. 5, no. 10, pp. 21175–21182, 2018.

[21] J. Xiucai, Data Association Problem for Simultaneous Localization and Mapping of Mobile Robots, Ph.D. dissertation, National University of Defense Technology, Changsha, China, 2008.

[22] Z. Yuan, "Research of mobile robot's SLAM based on binocular vision," Master's thesis, Tianjin University of Technology, Tianjin, China, 2016.

[23] F. Bellavia, M. Fanfani, F. Pazzaglia, and C. Colombo, "Robust selective stereo SLAM without loop closure and bundle adjustment," in Proceedings of the International Conference on Image Analysis and Processing, pp. 462–471, Springer, Naples, Italy, 2013.

[24] H. Fourati, Multisensor Data Fusion: From Algorithms and Architectural Design to Applications, CRC Press, Boca Raton, FL, USA, 2015.

[25] S. Jia, X. Yin, and X. Li, "Mobile robot parallel PF-SLAM based on OpenMP," in Proceedings of the 2012 IEEE International Conference on Robotics and Biomimetics (ROBIO), pp. 508–513, IEEE, Guangzhou, China, December 2012.

[26] W. Zhou, E. Shiju, Z. Cao, and Y. Dong, "Review of SLAM data association study," in Proceedings of the 2016 International Conference on Sensor Network and Computer Engineering, Atlantis Press, Shanghai, China, 2016.

[27] R. Singer and R. Sea, "A new filter for optimal tracking in dense multitarget environments," in Proceedings of the Annual Allerton Conference on Circuit and System Theory, pp. 201–211, Monticello, IL, USA, 1972.

[28] J. Neira and J. D. Tardos, "Data association in stochastic mapping using the joint compatibility test," IEEE Transactions on Robotics and Automation, vol. 17, no. 6, pp. 890–897, 2001.

[29] L. Yanju, X. Yufeng, G. Song, H. Xi, and G. Zhengping, "Research on data association in SLAM based laser sensor," Microcomputer & Its Applications, vol. 36, no. 2, pp. 78–82, 2017.

[30] O. Hlinka, O. Sluciak, F. Hlawatsch, and M. Rupp, "Distributed data fusion using iterative covariance intersection," in Proceedings of the 2014 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 1861–1865, IEEE, Florence, Italy, May 2014.



and Mapping (SLAM) technology provides a new thought for the autonomous positioning of mobile robots in complex scenes [10]. At the same time, as significant technical means of realizing relative navigation, radio navigation and visual navigation are receiving much attention, providing important technical support for further research on the collaborative navigation of multiple robots [11–13]. However, traditional combined navigation technology is far from meeting the growing demand for high-performance collaborative navigation.

At present, most data fusion methods for the multisource heterogeneous sensors of an integrated navigation system use centralized data fusion algorithms. Such algorithms have some defects in the collaborative navigation of mobile robots: high communication cost and poor robustness. However, current research on decentralized data fusion algorithms suitable for collaborative navigation environments is not yet mature [14, 15].

Therefore, based on the information fusion of multisource heterogeneous sensors and the related SLAM data association algorithms, this paper puts forward a new concept, model, and solution for multisource heterogeneous sensor information fusion that can significantly improve the performance of collaborative navigation. It addresses several problems of cooperative navigation technology, explores the key technologies that need to be solved in the collaborative navigation process of a multirobot platform, and provides a theoretical basis and technical support for high-performance collaborative navigation applications.

The structure of this paper is outlined as follows. In Section 2, the multisource information fusion collaborative navigation system model is established, and the principles of the modules involved and the sources of error are analyzed. In Section 3, the frameworks of centralized and decentralized collaborative navigation based on odometer/vision fusion are given, and the SLAM algorithms based on vision are presented. In Sections 4 and 5, the centralized and decentralized odometer/vision collaborative navigation algorithms are derived, respectively. In Section 6, different simulation experiments are designed to verify the effectiveness of the algorithms.

2. Multisource Information Fusion Collaborative Navigation System Model

2.1. Mobile Robot Model. Suppose that the mobile robot moves along a circular arc trajectory. The motion of a single mobile robot is shown in Figure 1.

2.1.1. Odometry Measurement Model. According to the principle of odometer motion calculation [16], it is assumed that the starting pose of the mobile robot is (x, y, θ); after moving for a period ΔT, the end pose is (x + Δx, y + Δy, θ + Δθ), and the trajectory can be regarded as a circular arc ΔS with radius R. In this case, the pose of the robot (in discrete time) is given by

$$\left\{\begin{aligned} \Delta S_l &= \frac{\pi D \cdot \Delta N_L}{P} = \left(R - \frac{B}{2}\right)\Delta\theta, \\ \Delta S_r &= \frac{\pi D \cdot \Delta N_R}{P} = \left(R + \frac{B}{2}\right)\Delta\theta, \end{aligned}\right. \tag{1}$$

where ΔN_L and ΔN_R are the pulse-count increments of the left and right wheel encoders, ΔS_l and ΔS_r are the distances traveled by the left and right wheels, B is the distance between the right wheel and the left wheel of the mobile robot, D is the wheel diameter, and P is the resolution parameter of the left and right wheel encoders. The pose increment over the sampling period ΔT is obtained by solving the above equations. The discrete motion equation of the mobile robot is further obtained by

$$\left\{\begin{aligned} x_k &\approx x_{k-1} + \Delta S_{k-1}\cos\!\left(\theta_{k-1} + \frac{\Delta\theta_{k-1}}{2}\right), \\ y_k &\approx y_{k-1} + \Delta S_{k-1}\sin\!\left(\theta_{k-1} + \frac{\Delta\theta_{k-1}}{2}\right), \\ \theta_k &= \theta_{k-1} + \Delta\theta_{k-1}. \end{aligned}\right. \tag{2}$$
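Equations (1) and (2) can be sketched as a short dead-reckoning routine. The wheel diameter, wheelbase, and encoder resolution below are illustrative values, not parameters from the paper.

```python
import math

def dead_reckoning(pose, dN_L, dN_R, D=0.15, B=0.35, P=2048):
    """Propagate (x, y, theta) from encoder pulse increments.

    D: wheel diameter [m], B: wheel separation [m],
    P: encoder pulses per wheel revolution (illustrative values).
    """
    x, y, theta = pose
    dS_l = math.pi * D * dN_L / P        # left wheel arc length, eq. (1)
    dS_r = math.pi * D * dN_R / P        # right wheel arc length, eq. (1)
    dS = 0.5 * (dS_r + dS_l)             # arc length of the robot centre
    dtheta = (dS_r - dS_l) / B           # heading increment
    # discrete motion model, eq. (2)
    x += dS * math.cos(theta + 0.5 * dtheta)
    y += dS * math.sin(theta + 0.5 * dtheta)
    theta += dtheta
    return (x, y, theta)

# equal pulse counts on both wheels: the robot drives straight
print(dead_reckoning((0.0, 0.0, 0.0), 1000, 1000))
```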

2.1.2. Odometer Error Model. There are systematic errors and random errors in odometer-based positioning [17]. In the simulation experiment, the influence of random error is mainly considered. The corresponding covariance matrix Σ_odom is given by

$$\Sigma_{odom} = \begin{bmatrix} k_r\left|\Delta S_r\right| & 0 \\ 0 & k_l\left|\Delta S_l\right| \end{bmatrix}, \tag{3}$$

where k_r and k_l are the error coefficients of the right and left wheels.

2.2. Lidar Environmental Detection Model. As the main module for realizing SLAM, the environment detection module mainly obtains information about the surrounding environment through exteroceptive sensors.

2.2.1. Lidar Scanning Model. The two-dimensional lidar outputs point cloud data of the surrounding environment, which contains angular information. The scanning schematic of the two-dimensional lidar is shown in Figure 2.

Figure 1: Motion model of a single mobile robot.


As shown in Figure 2, the lidar scans in the sensor coordinate system along the scanning direction with a fixed scanning-angle resolution. Suppose a point is detected in the i-th direction; denote θ_i as the angle between this direction and the positive direction of the X_s-axis, which can be obtained from the inherent scanning-angle resolution. A frame of point cloud data obtained after one scan can be recorded as a set A = {(d_i, θ_i) | i = 1, …, n}, where n is the number of scanned data points in this frame and (d_i, θ_i) are the polar coordinates of a measured point. The final Cartesian point cloud data set B = {(x_i, y_i) | i = 1, …, n} is then obtained.
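The polar-to-Cartesian conversion of one scan frame can be sketched as follows; the start angle and angular resolution are illustrative assumptions, not values from the paper.

```python
import math

def scan_to_points(ranges, angle_min=0.0, angle_increment=math.radians(1.0)):
    """Convert one lidar frame A = {(d_i, theta_i)} into the Cartesian set B.

    theta_i follows from the fixed scanning-angle resolution
    (angle_increment); both angle parameters are illustrative.
    """
    points = []
    for i, d in enumerate(ranges):
        theta = angle_min + i * angle_increment  # direction of the i-th beam
        points.append((d * math.cos(theta), d * math.sin(theta)))
    return points

B = scan_to_points([1.0, 1.0, 2.0])  # three beams of an example frame
```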

2.2.2. Lidar Error Model. In the simulation experiment, the above error can be simplified to Gaussian white noise along the x and y directions in the carrier coordinate system of the mobile robot [18]. Assuming the positioning errors in the two directions are independent, the corresponding covariance matrix is given by

$$\Sigma_{rad} = \begin{bmatrix} \sigma_x^b & 0 \\ 0 & \sigma_y^b \end{bmatrix}, \tag{4}$$

where σ_x^b and σ_y^b represent the variances of the observation along the x and y directions of the carrier coordinate system of the mobile robot, respectively.

2.3. UWB Relative Measurement Model. The relative measurement module is an important part of the cooperative navigation network, in which nodes relate to the state information of surrounding nodes. UWB (Ultra-Wideband), which can measure relative distance, and lidar, which can measure relative position, are selected as the research objects in this paper. Because the scanning model of lidar has been described in the previous section, this section only establishes the corresponding ranging model and error model.

2.3.1. UWB Ranging Model. At present, there are two main ranging methods: the two-way ranging (TWR) method and the symmetric double-sided two-way ranging (SDSTWR) method. Compared with TWR, SDSTWR greatly reduces the ranging error caused by imperfect clock synchronization. Therefore, the SDSTWR method is often used for distance measurement [19] (see Figure 3).

The distance between transceiver A and transceiver B can be obtained from Figure 3:

$$d = c\,\frac{T_{A1}T_{B2} - T_{B1}T_{A2}}{T_{A1} + T_{A2} + T_{B1} + T_{B2}}, \tag{5}$$

where d is the distance between the two tested transceivers, c is the speed of light, and T_{A1} = T_{a2} − T_{a1} is the time interval between transceiver A sending the first signal and receiving the reply; T_{A2}, T_{B1}, and T_{B2} are obtained in the same way.
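Eq. (5) can be checked numerically. The reply delays below are illustrative, and the mapping of T_A1, T_A2, T_B1, T_B2 to round-trip and reply intervals is an assumption about the SDSTWR timestamp convention.

```python
C = 299_792_458.0  # speed of light [m/s]

def sdstwr_distance(T_A1, T_A2, T_B1, T_B2, c=C):
    """Symmetric double-sided two-way ranging, eq. (5)."""
    return c * (T_A1 * T_B2 - T_B1 * T_A2) / (T_A1 + T_A2 + T_B1 + T_B2)

# assumed convention: T_A1, T_B2 are round-trip times, T_A2, T_B1 reply delays
tof = 10.0 / C                       # time of flight for a 10 m range
t_rA, t_rB = 1e-3, 2e-3              # illustrative reply delays [s]
d = sdstwr_distance(2 * tof + t_rB, t_rA, t_rB, 2 * tof + t_rA)
```

Under this convention the reply delays cancel and d recovers the true 10 m range, which is exactly why SDSTWR is preferred over plain TWR.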

2.3.2. UWB Error Model. One of the main sources of error in UWB ranging is the imperfect clock synchronization of the two transceivers [20]:

$$e_{SDS} = T_f c\left(1 - \frac{k_A + k_B}{2}\right), \tag{6}$$

where T_f is the signal flight time and k_A and k_B are the ratios of the actual frequency to the expected frequency of transceivers A and B, respectively.

2.4. SLAM Model Based on Lidar Measurement. SLAM is widely used in the field of mobile robot navigation and positioning [21]. It is the key technology for solving the mapping problem and also provides a new idea for solving the path planning problem. SLAM mainly addresses the situation in which the location of the mobile robot is unknown: an environmental map is built incrementally as the surrounding unfamiliar environment is perceived by the external sensors, and at the same time the state estimate of the mobile robot itself is obtained using the map information already built.

SLAM localization is the key technology for realizing the autonomous positioning of a mobile robot in an unknown environment. It is of great research value to realize more accurate navigation and localization while reducing the constraints on the mobile robot (such as the absence of GPS) [22]. By applying this technology to the framework of the collaborative navigation system of mobile robots, the navigation performance of the whole collaborative navigation system can be effectively improved (see Figure 4).

As can be seen from Figure 4, the localization of a mobile robot using a SLAM navigation system is essentially a process of continuous estimation that approximates the true values [23]. The triangles in the graph represent the mobile robot, and the circles represent the observed landmarks, where the grey circles represent the estimated landmarks. The solid line connecting the triangles represents the real trajectory, and the dotted line is the estimated trajectory.

3. Collaborative Navigation Framework Based on Odometer/Vision

When the nodes of a collaborative navigation network participate in collaborative navigation, different data fusion algorithms

Figure 2: The scanning schematic diagram of the two-dimensional lidar.


can be used to fuse the data obtained by the multisource heterogeneous sensors of the different nodes. Two data fusion algorithms are generally used: centralized and decentralized data fusion algorithms [24].

3.1. Framework of Collaborative Navigation Algorithm. In this subsection, a data fusion framework for the centralized collaborative navigation algorithm is designed for the general odometer/vision model, and a data fusion framework for the decentralized collaborative navigation algorithm is designed in the same way.

In the centralized collaborative navigation structure, the measured data obtained by each node are concentrated in a data fusion center for fusion. In the decentralized collaborative navigation structure, each node shares some information with other nodes while processing its own sensor data (see Figure 5).

According to the odometer/vision collaborative navigation model, we use the most common EKF algorithms to simulate centralized and decentralized cooperative navigation: a centralized location algorithm (CL) and a decentralized location algorithm (DCL) are designed.

As shown in Figure 6, in the CL algorithm corresponding to the centralized cooperative navigation structure, each node sends the information obtained by its own sensors to the central server, which realizes data fusion through the EKF algorithm. Its state vector is the set of the state vectors of all nodes and is updated by dead reckoning. After that, the measurement information obtained after data association in the SLAM process and the relative measurement information between nodes are selected. The CL algorithm gathers the state and covariance information of all nodes, corrects them uniformly, and sends the corrected estimates back to each node. Because all nodes participate in the joint measurement update, the state information of the nodes becomes correlated after the first update (see Figure 6).

If this correlation is reasonably estimated based on the CL algorithm and the task of data fusion is distributed to each node, the DCL algorithm follows accordingly. Since the positions of the nodes in the algorithm are equivalent, only one node needs to be discussed (see Figure 7). In order to avoid overoptimal estimation to the greatest extent, the DCL algorithm in this paper introduces the concept of the split covariance intersection filter: the covariance of each node is divided into a correlated term and an independent term, and the time update process is basically consistent with that of a single node. The measurement update process takes two steps. First, according to the measurement information of the SLAM navigation system, a state estimate is obtained by propagating the state and integrating the information of the aided navigation system. Then, the state information sent by the adjacent nodes together with the relative measurements between the nodes yields another state estimate of the node. Here, the relevant state information sent to neighboring nodes is used to update the local maps (see Figure 7).

3.2. SLAM Algorithms Based on Vision

3.2.1. Landmark State Estimation Algorithm. The key to the SLAM navigation algorithm lies in the process of data association. The positioning process of the SLAM navigation system is essentially a process of continuous estimation that approximates the true values. This kind of probability estimation problem is usually solved by introducing an appropriate filter, most commonly the EKF (see Figure 8).

Because the odometer selected in this paper has a high sampling frequency and the lidar has the advantages of high precision and high reliability, the EKF algorithm, with its better real-time performance, is selected. The state estimation process of the landmark information in SLAM based on the EKF is described below. The observation equation for the feature information obtained by the lidar is as follows:

$$z_k = h\left(X_k^l, X_k^r\right) + n_k = \begin{bmatrix} \left(x_k^l - x_k^r\right)\cos\theta_k^r + \left(y_k^l - y_k^r\right)\sin\theta_k^r \\ -\left(x_k^l - x_k^r\right)\sin\theta_k^r + \left(y_k^l - y_k^r\right)\cos\theta_k^r \end{bmatrix} + n_k, \tag{7}$$

where X_k^l = (x_k^l, y_k^l)^T is the state vector of the landmark at time k, X_k^r = (x_k^r, y_k^r, θ_k^r)^T is the state vector of the mobile robot at time k, and n_k is the measurement noise with variance matrix R_k, denoted n_k ∼ N(0, R_k). Since the landmarks are static, the state estimate of a landmark at time k − 1 can be regarded as an a priori estimate of its state at time k. The measurement update process based on the EKF is as follows.

Figure 3: Principle of SDSTWR.

Figure 4: SLAM model diagram.

Step 1: calculate the innovation and the filter gain:

$$\begin{aligned} v_k &= z_k - h\left(\hat{X}_{k-1}^l, X_k^{r-}\right), \\ K_k &= \Sigma_{k-1} H_k^T\left(H_k \Sigma_{k-1} H_k^T + R_k\right)^{-1}. \end{aligned} \tag{8}$$

Step 2: update the state estimate and the corresponding covariance:

$$\begin{aligned} \hat{X}_k^l &= \hat{X}_{k-1}^l + K_k v_k, \\ \Sigma_k &= \left(I - K_k H_k\right)\Sigma_{k-1}, \end{aligned} \tag{9}$$

where Σ_k is the covariance matrix of the landmark state estimate at time k and H_k is the measurement matrix at time k.

Remark 1. Any observed landmark can be position-corrected by the above method; note that such position correction is limited to the landmarks of the local map observed at time k.
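The landmark correction of eqs. (7)-(9) can be sketched as a minimal EKF update; the observation model and its Jacobian follow eq. (7), and all numeric values are illustrative.

```python
import numpy as np

def ekf_landmark_update(x_l, Sigma, z, x_r, R):
    """EKF position correction of one landmark, eqs. (7)-(9).

    x_l: landmark estimate (2,), Sigma: its covariance (2x2),
    z: lidar observation in the robot frame (2,), x_r: robot pose (3,),
    R: measurement noise covariance (2x2).
    """
    xr, yr, th = x_r
    c, s = np.cos(th), np.sin(th)
    dx, dy = x_l[0] - xr, x_l[1] - yr
    # predicted observation h(.), eq. (7): world frame -> robot frame
    h = np.array([dx * c + dy * s, -dx * s + dy * c])
    H = np.array([[c, s], [-s, c]])       # Jacobian of h w.r.t. the landmark
    v = z - h                             # innovation, eq. (8)
    S = H @ Sigma @ H.T + R
    K = Sigma @ H.T @ np.linalg.inv(S)    # filter gain, eq. (8)
    x_new = x_l + K @ v                   # state update, eq. (9)
    Sigma_new = (np.eye(2) - K @ H) @ Sigma
    return x_new, Sigma_new
```

A perfect observation leaves the landmark estimate unchanged while still shrinking its covariance, which is the expected EKF behavior.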

3.2.2. Data Association Algorithm. In a SLAM navigation system, data association is an important prerequisite for state estimation. Incorrect data association is likely to lead to serious deviations in the estimation results [25, 26].

At present, two data association algorithms are commonly used in SLAM technology: the nearest neighbor (NN) data association algorithm [27] and the joint compatibility branch and bound (JCBB) data association algorithm [28]. The NN algorithm requires less computation, but it easily forms wrong data associations when the density of feature information is large, which leads to divergence of the SLAM results; it is therefore only suitable for environments where the feature density and the system error are small. JCBB is an improvement of the NN algorithm that extends the association of single features in NN to all observed feature information, which

Figure 5: Collaborative navigation structure. (a) Centralized collaborative navigation. (b) Decentralized collaborative navigation.

Figure 6: Data fusion framework of the centralized collaborative navigation algorithm.

Figure 7: Data fusion framework of the decentralized collaborative navigation algorithm.

Figure 8: SLAM flow chart.


is more binding and more reliable. The JCBB algorithm can obtain more credible association hypotheses than the NN algorithm and exclude some wrong ones. However, the amount of computation increases obviously, which to some extent affects the real-time performance of the SLAM navigation system.

To ensure the accuracy of data association in the SLAM process, reduce the amount of computation as much as possible, and enhance the real-time performance of the SLAM algorithm, this subsection describes an optimized data association algorithm. The classification method mentioned in [29] is used to divide the set of related feature information; finally, an appropriate feature information set of the local map and the preprocessed observed feature information set are selected to form the association space.

First, the collection of feature information in the local map is divided as follows:

$$\begin{cases} D\left[(x_m, y_m), (x_k, y_k)\right] \le \Delta d, & (x_k, y_k) \in F_k, \\ D\left[(x_m, y_m), (x_k, y_k)\right] > \Delta d, & (x_k, y_k) \notin F_k, \end{cases} \tag{10}$$

where D[(x_m, y_m), (x_k, y_k)] is the relative distance between the feature information (x_k, y_k) of the local map and other feature information (x_m, y_m).

Then, the observed feature information set is preprocessed and divided. In the actual navigation process, the observed feature information obtained by lidar contains noise. The purpose of preprocessing is to filter out some of this noise, improving the accuracy of data association and reducing the amount of computation at the same time. The judgment process is as follows:

$$f(i, j) = \begin{cases} 1, & D\left[(x_i, y_i), (x_j, y_j)\right] < \Delta D, \\ 0, & D\left[(x_i, y_i), (x_j, y_j)\right] \ge \Delta D, \end{cases} \tag{11}$$

where ΔD is a threshold determined by the performance of the laser sensor. When the relative distance between two pieces of observed feature information is less than the threshold, the observed feature information is considered a feature point; otherwise, it is a noise point and does not participate in the subsequent calculation.

After this division, the set of observed feature information is sorted in the order of observation. Based on the division of the local map feature information set above, the subsets are divided in turn, and no point participates in the division repeatedly.

Finally, we select the appropriate association sets to execute the data association algorithm. Each subset of local-map feature information and the subset of observed feature information at the current time undergo a joint compatibility test, and the feature information with the best test results is selected to form a new subset as the data association object.
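The set divisions of eqs. (10) and (11) can be sketched as two small helpers; this is one reading of the thresholding rules, with illustrative point sets and thresholds.

```python
import math

def split_map_features(features, anchor, delta_d):
    """Divide local-map features around an anchor feature, eq. (10):
    points within delta_d of the anchor form the subset F_k."""
    near, far = [], []
    for p in features:
        d = math.hypot(p[0] - anchor[0], p[1] - anchor[1])
        (near if d <= delta_d else far).append(p)
    return near, far

def filter_observations(obs, delta_D):
    """Preprocessing in the spirit of eq. (11): keep an observed point
    only if some other observation lies within delta_D of it;
    otherwise treat it as an isolated noise point."""
    kept = []
    for i, p in enumerate(obs):
        for j, q in enumerate(obs):
            if i != j and math.hypot(p[0] - q[0], p[1] - q[1]) < delta_D:
                kept.append(p)
                break
    return kept
```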

4. Centralized Collaborative Navigation Algorithm

4.1. Time Update. First of all, the state space model should be established. The state vector of a single mobile robot with three degrees of freedom contains its position and heading angle. Suppose the number of nodes is N; the state space of the collaborative navigation system in the centralized framework contains the state vectors of all mobile robots in the group. Denoting the state vector of mobile robot i by X_k^i and the state of the system by X_k, the state space equation of the system can be expressed as follows:

$$X_k = \begin{bmatrix} X_k^1 \\ X_k^2 \\ \vdots \\ X_k^N \end{bmatrix} = \begin{bmatrix} f^1\left(X_{k-1}^1, u_{k-1}^1\right) \\ f^2\left(X_{k-1}^2, u_{k-1}^2\right) \\ \vdots \\ f^N\left(X_{k-1}^N, u_{k-1}^N\right) \end{bmatrix} + \begin{bmatrix} w_{k-1}^1 \\ w_{k-1}^2 \\ \vdots \\ w_{k-1}^N \end{bmatrix} \triangleq \Phi\left(X_{k-1}, u_{k-1}\right) + w_{k-1}, \tag{12}$$

where the function f^i(X, u) describes the kinematic characteristics of mobile robot i, u_{k-1}^i = [ΔS_{r,k-1}, ΔS_{l,k-1}]^T represents the input required by mobile robot i for dead reckoning at time k, and w_{k-1}^i is the system noise, with w_{k-1}^i ∼ N(0, Q_{k-1}^i).

It is assumed that the motion of any node is not affected by any other node and that each node moves independently, without being controlled by other nodes. Therefore, the state transfer matrix for centralized collaborative positioning is given by

$$F_{k-1} = \begin{bmatrix} J_{X(k-1)}^1 & 0 & \cdots & 0 \\ 0 & J_{X(k-1)}^2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & J_{X(k-1)}^N \end{bmatrix}, \tag{13}$$

where J_{X(k-1)}^i and J_{u(k-1)}^i are the Jacobian matrices of the function f with respect to the state vector and the control input, respectively. The system noise variance matrix of the collaborative navigation system in the centralized framework is as follows:

$$Q_{k-1} = \begin{bmatrix} Q_{k-1}^1 & 0 & \cdots & 0 \\ 0 & Q_{k-1}^2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & Q_{k-1}^N \end{bmatrix}, \tag{14}$$

where Q_{k-1}^i = J_{u(k-1)}^i Σ_u J_{u(k-1)}^{iT} and Σ_u is the covariance matrix of the control input. Then the time update process of the collaborative navigation system in the centralized framework can be deduced:


$$\begin{aligned} X_k^- &= \Phi\left(X_{k-1}^+, u_{k-1}\right), \\ P_k^- &= F_{k-1} P_{k-1} F_{k-1}^T + Q_{k-1} \triangleq \begin{bmatrix} P_{11,k}^- & P_{12,k}^- & \cdots & P_{1N,k}^- \\ P_{21,k}^- & P_{22,k}^- & \cdots & P_{2N,k}^- \\ \vdots & \vdots & \ddots & \vdots \\ P_{N1,k}^- & P_{N2,k}^- & \cdots & P_{NN,k}^- \end{bmatrix}, \end{aligned} \tag{15}$$

where

$$\begin{aligned} P_{ii,k}^- &= J_{X(k-1)}^i P_{ii,k-1} J_{X(k-1)}^{iT} + Q_{k-1}^i, \\ P_{ij,k}^- &= J_{X(k-1)}^i P_{ij,k-1} J_{X(k-1)}^{jT}. \end{aligned} \tag{16}$$
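The block structure of eqs. (13)-(16) can be sketched as one joint time-update step. The motion model `f` and its Jacobians are passed in as functions (assumptions about the interface, not part of the paper), and the dummy values in the usage are illustrative.

```python
import numpy as np

def centralized_time_update(X, P, inputs, f, jac_x, jac_u, Sigma_u):
    """One time-update step of the centralized filter, eqs. (12)-(16).

    X: list of N per-robot states (each of size 3),
    P: 3N x 3N joint covariance,
    f, jac_x, jac_u: motion model and its Jacobians (assumed given).
    """
    N = len(X)
    F = np.zeros((3 * N, 3 * N))
    Q = np.zeros((3 * N, 3 * N))
    X_pred = []
    for i in range(N):
        Ji_x = jac_x(X[i], inputs[i])
        Ji_u = jac_u(X[i], inputs[i])
        F[3*i:3*i+3, 3*i:3*i+3] = Ji_x                      # eq. (13)
        Q[3*i:3*i+3, 3*i:3*i+3] = Ji_u @ Sigma_u @ Ji_u.T   # eq. (14)
        X_pred.append(f(X[i], inputs[i]))
    P_pred = F @ P @ F.T + Q                                # eq. (15)
    return X_pred, P_pred

# trivial usage with identity dynamics and no input noise (illustrative)
f = lambda x, u: x
jx = lambda x, u: np.eye(3)
ju = lambda x, u: np.zeros((3, 2))
Xp, Pp = centralized_time_update([np.zeros(3)] * 2, 2 * np.eye(6),
                                 [np.zeros(2)] * 2, f, jx, ju, np.eye(2))
```

Because F is block-diagonal, the off-diagonal blocks of P are propagated exactly as in eq. (16) without ever being formed separately.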

4.2. Single-Node Measurement Update. This section describes the measurement update process involving only one node in the centralized framework. The aided navigation system selected is the SLAM navigation system, which integrates the landmark information of the surrounding environment measured by lidar. A measurement model based on this navigation system is built, and the EKF-based measurement update process is described.

4.2.1. Measurement Model Based on SLAM. The measurement model based on SLAM is the measurement model after data association. In this paper, the position information of the landmarks obtained by lidar is taken as the observation:

$$z_k^i = \begin{bmatrix} x_l^b \\ y_l^b \end{bmatrix} = \begin{bmatrix} \left(x_l^w - x_k^i\right)\cos\theta_k^i + \left(y_l^w - y_k^i\right)\sin\theta_k^i \\ -\left(x_l^w - x_k^i\right)\sin\theta_k^i + \left(y_l^w - y_k^i\right)\cos\theta_k^i \end{bmatrix} + n_k^i, \tag{17}$$

where (x_l^b, y_l^b) is the position of the landmark obtained by lidar in the carrier frame, (x_l^w, y_l^w) are the coordinates of the landmark in the world coordinate system, (x_k^i, y_k^i, θ_k^i) is the state of mobile robot i at time k, and n_k^i is the measurement noise with variance matrix R_k^i, denoted n_k^i ∼ N(0, R_k^i). After linearization and state extension, the observation equation of the whole system can be obtained:

$$z_k^i = H_k^i X_k + h^i\left(X_k^{i-}\right) - \nabla h^i X_k^{i-} + n_k^i, \tag{18}$$

where

$$H_k^i = \left[0 \; \cdots \; \nabla h^i \; \cdots \; 0\right]_{2\times 3N}, \tag{19}$$

and ∇h^i is the Jacobian matrix of the function h^i(X_k^i).

4.2.2. Measurement Update Based on EKF. Combined with the basic principle of the Kalman filter, the measurement update process of the aided navigation system for a single node is obtained as follows.

Step 1: calculate the innovation and the filter gain:

$$\begin{aligned} v &= z_k^i - h^i\left(X_k^{i-}\right), \\ S^i &= H_k^i P_k^-\left(H_k^i\right)^T + R_k^i, \\ K^i &= P_k^-\left(H_k^i\right)^T\left(S^i\right)^{-1}. \end{aligned} \tag{20}$$

Step 2: update the state estimate and the corresponding covariance:

$$\begin{aligned} X_k^+ &= X_k^- + K^i v, \\ P_k &= P_k^- - P_k^-\left(H_k^i\right)^T\left(S^i\right)^{-1} H_k^i P_k^-. \end{aligned} \tag{21}$$

4.3. Relative Measurement Update among Nodes. The standard observation model can be divided into two types: the measurement model based on relative distance and the measurement model based on relative position.

4.3.1. Measurement Model Based on Relative Distance. The observation of mobile robot j by mobile robot i at time k can be denoted by z_k^{ij}; the observation equation is then given by

$$z_k^{ij} = h^{ij}\left(X_k^i, X_k^j\right) + n_k^{ij} = \sqrt{\left(x_k^i - x_k^j\right)^2 + \left(y_k^i - y_k^j\right)^2} + n_k^{ij}, \tag{22}$$

where n_k^{ij} is the measurement noise, whose variance matrix is R_k^{ij} = σ_UWB, denoted n_k^{ij} ∼ N(0, R_k^{ij}), and σ_UWB is the variance of the UWB ranging.

After linearization and state extension, the observation equation of the whole system can be obtained:

$$z_k^{ij} = H_k^{ij} X_k + h^{ij}\left(X_k^{i-}, X_k^{j-}\right) - \nabla h_i^{ij} X_k^{i-} - \nabla h_j^{ij} X_k^{j-} + n_k^{ij}, \tag{23}$$

where

$$H_k^{ij} = \left[0 \; \cdots \; \nabla h_i^{ij} \; \cdots \; \nabla h_j^{ij} \; \cdots \; 0\right]_{2\times 3N}, \tag{24}$$

and ∇h_i^{ij} and ∇h_j^{ij} are the Jacobian matrices of the function h^{ij}(X^i, X^j) with respect to X^i and X^j, respectively.
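The range model of eq. (22) and its two Jacobian blocks can be sketched as follows; the pose values in the test are illustrative.

```python
import numpy as np

def range_observation(xi, xj):
    """Relative-distance model of eq. (22) and its Jacobian blocks.

    xi, xj: poses (x, y, theta) of robots i and j.
    Returns the predicted range d and the 1x3 Jacobian blocks
    with respect to the poses of robot i and robot j.
    """
    dx, dy = xi[0] - xj[0], xi[1] - xj[1]
    d = np.hypot(dx, dy)
    # the range does not depend on either heading, hence the trailing zeros
    Hi = np.array([dx / d, dy / d, 0.0])
    Hj = -Hi
    return d, Hi, Hj
```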

4.3.2. Measurement Model Based on Relative Position. Using lidar to realize relative observation among nodes can be done in two ways: the direct method and the indirect method. The direct method measures the relative position between two nodes directly; the indirect method uses lidar to observe the landmarks nearest to the two nodes, and the relative position between the nodes is then obtained by calculation.

Denote the state of mobile robot i at time k by (x_k^i, y_k^i, θ_k^i) and the state of mobile robot j by (x_k^j, y_k^j, θ_k^j). The coordinates of landmark L1, adjacent to mobile robot i, are (x_{l1}^w, y_{l1}^w) in the world coordinate system and (x_{l1}^i, y_{l1}^i) in the coordinate system of mobile robot i; the coordinates of landmark L2, adjacent to mobile robot j, are (x_{l2}^w, y_{l2}^w) in the world coordinate system and (x_{l2}^j, y_{l2}^j) in the coordinate system of mobile robot j. The specific solution process of the indirect method is as follows (see Figure 9):

$$\begin{bmatrix} x_k^j - x_k^i \\ y_k^j - y_k^i \end{bmatrix} = -\begin{bmatrix} x_{l2}^j\cos\theta_k^j - y_{l2}^j\sin\theta_k^j \\ x_{l2}^j\sin\theta_k^j + y_{l2}^j\cos\theta_k^j \end{bmatrix} + \begin{bmatrix} \left(x_{l1}^i\cos\theta_k^i - y_{l1}^i\sin\theta_k^i\right) + \left(x_{l2}^w - x_{l1}^w\right) \\ \left(x_{l1}^i\sin\theta_k^i + y_{l1}^i\cos\theta_k^i\right) + \left(y_{l2}^w - y_{l1}^w\right) \end{bmatrix}. \tag{25}$$

When mobile robot i observes mobile robot j at time k, the coordinates of mobile robot j in the coordinate system of mobile robot i are taken as the observation. The observation equation is as follows:

$$z_k^{ij} = \begin{bmatrix} x_k^{ji} \\ y_k^{ji} \end{bmatrix} = \begin{bmatrix} \left(x_k^j - x_k^i\right)\cos\theta_k^i + \left(y_k^j - y_k^i\right)\sin\theta_k^i \\ -\left(x_k^j - x_k^i\right)\sin\theta_k^i + \left(y_k^j - y_k^i\right)\cos\theta_k^i \end{bmatrix} + n_k^{ij}, \tag{26}$$

where n_k^{ij} is the measurement noise, whose variance matrix is R_k^{ij}, denoted n_k^{ij} ∼ N(0, R_k^{ij}), and (x_k^{ji}, y_k^{ji}) are the coordinates of mobile robot j in the coordinate system of mobile robot i at time k.

4.3.3. Measurement Update Based on EKF. Similarly, the measurement update process for the relative observations between nodes can finally be obtained.

5. Decentralized Collaborative Navigation Algorithm

Under the decentralized collaborative navigation algorithm, the state and covariance information of each node are calculated separately. In order to avoid overoptimal estimation to the maximum extent, the concept of the covariance intersection filter is introduced, and the covariance of each node is divided into a correlated term and an independent term.

5.1. Covariance Intersection Filter. Given the state estimate X̂ and the corresponding covariance matrix P, and assuming that P* is the covariance of the error between the state estimate X̂ and the true state X*, we have

$$P^* = E\left[\left(\hat{X} - X^*\right)\left(\hat{X} - X^*\right)^T\right]. \tag{27}$$

Consistency is a property of the covariance matrix of an estimate [30]. When the covariance matrix of the state estimate is not smaller than the true covariance, the estimate is said to satisfy consistency; that is, no overoptimal estimation is produced. Suppose two state estimates X̂_1 and X̂_2 are independent and consistent, with corresponding covariances P_1 and P_2. If there is in fact a correlation between the two estimates, the Kalman filter may produce inconsistent results; in other words, it leads to overoptimal estimation. The split covariance intersection fusion is given by

$$P_1 = \frac{P_{1d}}{w} + P_{1i}, \tag{28}$$

$$P_2 = \frac{P_{2d}}{1 - w} + P_{2i}, \tag{29}$$

$$P^{-1} = P_1^{-1} + P_2^{-1}, \tag{30}$$

$$\hat{X} = P\left(P_1^{-1}\hat{X}_1 + P_2^{-1}\hat{X}_2\right), \tag{31}$$

$$P_i = P\left(P_1^{-1}P_{1i}P_1^{-1} + P_2^{-1}P_{2i}P_2^{-1}\right)P, \tag{32}$$

$$P_d = P - P_i, \tag{33}$$

Figure 9: Indirect observation schematic diagram.


where the covariances corresponding to the two state estimates X̂_1 and X̂_2 are P_{1d} + P_{1i} and P_{2d} + P_{2i}, respectively; P_{1d} and P_{2d} are the correlated covariance components corresponding to the maximum correlation between the two state estimates; P_{1i} and P_{2i} are the independent covariance components corresponding to the absolute independence of the two state estimates; and w ∈ [0, 1] is an optimization parameter that minimizes the covariance after fusion. Any w in this interval ensures the consistency of the fusion result.
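The split covariance intersection fusion of eqs. (28)-(33) can be sketched as follows. Choosing w by a grid search over the trace of P is an assumption about the optimization step (the paper only states that w minimizes the fused covariance), and the test values are illustrative.

```python
import numpy as np

def split_ci_fuse(x1, P1d, P1i, x2, P2d, P2i, n_grid=99):
    """Split covariance intersection fusion, eqs. (28)-(33).

    Each estimate carries a correlated part P*d and an independent
    part P*i; w is chosen by a grid search minimising trace(P).
    """
    best = None
    for w in np.linspace(0.01, 0.99, n_grid):
        P1 = P1d / w + P1i                     # eq. (28)
        P2 = P2d / (1.0 - w) + P2i             # eq. (29)
        P1inv, P2inv = np.linalg.inv(P1), np.linalg.inv(P2)
        P = np.linalg.inv(P1inv + P2inv)       # eq. (30)
        if best is None or np.trace(P) < best[0]:
            x = P @ (P1inv @ x1 + P2inv @ x2)  # eq. (31)
            Pi = P @ (P1inv @ P1i @ P1inv + P2inv @ P2i @ P2inv) @ P  # eq. (32)
            best = (np.trace(P), x, P, Pi)
    _, x, P, Pi = best
    return x, P, Pi, P - Pi                    # eq. (33): Pd = P - Pi
```

When both estimates are purely independent (P_{1d} = P_{2d} = 0), the result reduces to the ordinary information-filter fusion and the fused correlated part P_d vanishes.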

5.2. Time Update. Before describing the time update process of the DCL algorithm, it is necessary to decompose the state information of the system in the framework of centralized collaborative navigation, which can be expressed as

$$E_G = \left\{X_G, P_G\right\} = \left\{\left\{\hat{X}_1, P_1\right\}, \left\{\hat{X}_2, P_2\right\}, \cdots, \left\{\hat{X}_N, P_N\right\}\right\} = \left\{\left\{\hat{X}_1, P_{1d} + P_{1i}\right\}, \left\{\hat{X}_2, P_{2d} + P_{2i}\right\}, \cdots, \left\{\hat{X}_N, P_{Nd} + P_{Ni}\right\}\right\}, \qquad (34)$$

where $E_G$ is the set of states under the centralized collaborative navigation framework, and $X_G$ and $P_G$ are the state space and the corresponding covariance matrix under the centralized collaborative navigation framework, respectively.

The state propagation process under the decentralized collaborative navigation framework is the state propagation process of a single node, and the propagation of the covariance can be expressed as

$$P_k^{i-} = J_{X(k-1)}^i P_{k-1}^i \left(J_{X(k-1)}^i\right)^T + J_{u(k-1)}^i \Sigma_u \left(J_{u(k-1)}^i\right)^T,$$

$$P_{ki}^{i-} = J_{X(k-1)}^i P_{(k-1)i}^i \left(J_{X(k-1)}^i\right)^T + J_{u(k-1)}^i \Sigma_u \left(J_{u(k-1)}^i\right)^T, \qquad (35)$$

where $P_k^{i-}$ is the one-step prediction covariance matrix of mobile robot $i$ at time $k$, and $P_{ki}^{i-}$ is the independent covariance component of that matrix. $J_{X(k-1)}^i$ and $J_{u(k-1)}^i$ are the Jacobian matrices of the function $f^i(X, u)$ with respect to the state vector and the control input, and $\Sigma_u$ is the error matrix of the control input.

5.3. Single Node Measurement Update. The measurement and update process of a single node only involves the aided navigation system of that node, so there is no need to estimate the correlation; that is, the step of evaluating formulas (28) and (29) can be omitted. Similar to the single node measurement update process in centralized collaborative navigation, the single node measurement update process in decentralized collaborative navigation can be expressed as follows.

Step 1: calculate the innovation and the filtering gain:

$$v = z_k^i - \nabla h^i \hat{X}_k^{i-},$$

$$S = \nabla h^i P_k^{i-} \left(\nabla h^i\right)^T + R_k^i,$$

$$K = P_k^{i-} \left(\nabla h^i\right)^T S^{-1}. \qquad (36)$$

Step 2: update the state estimation and the corresponding covariance:

$$\hat{X}_k^{i+} = \hat{X}_k^{i-} + Kv,$$

$$P_k^i = \left(I - K\nabla h^i\right) P_k^{i-},$$

$$P_{ki}^i = \left(I - K\nabla h^i\right) P_{ki}^{i-} \left(I - K\nabla h^i\right)^T + K R_k^i K^T,$$

$$P_{kd}^i = P_k^i - P_{ki}^i. \qquad (37)$$

5.4. Collaborative Measurement Update among Nodes. In the framework of decentralized collaborative navigation, the state estimation results of a single node's aided navigation system and the state estimation results based on information sharing among nodes are integrated in the process of internode collaborative measurement updating, and the corrected state information is derived.

Under the decentralized collaborative navigation framework, any node can explicitly estimate the state of other nodes. In order to save communication cost and reduce the computational load of a single mobile robot platform, this paper assumes that information exchange among the mobile robots only takes place between two adjacent mobile robot nodes.

Assuming that mobile robot $i$ performs a relative observation of mobile robot $j$ at time $k$ and shares its own state and covariance information with mobile robot $j$, the state of mobile robot $j$ can be explicitly expressed using the received state of mobile robot $i$ and the relative measurement information between the two nodes:

$$\hat{X}_k^{co,j} = \begin{bmatrix} \hat{x}_k^{co,j} \\ \hat{y}_k^{co,j} \end{bmatrix} = \begin{bmatrix} x_k^i + x_k^{ji}\cos\theta_k^i - y_k^{ji}\sin\theta_k^i \\ y_k^i + x_k^{ji}\sin\theta_k^i + y_k^{ji}\cos\theta_k^i \end{bmatrix}, \qquad (38)$$

where $(\hat{x}_k^{co,j}, \hat{y}_k^{co,j})$ is the partial state estimate of mobile robot $j$ obtained through the information sharing between mobile robot $i$ and mobile robot $j$ at time $k$, $(x_k^i, y_k^i, \theta_k^i)$ is the state vector shared with mobile robot $j$ by mobile robot $i$, and $u_{rel} = (x_k^{ji}, y_k^{ji})$ is the relative measurement information of the two nodes expressed in the coordinate system of mobile robot $i$.
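Equation (38) is a plain rigid-body transform of the relative measurement from robot $i$'s frame into the world frame, and can be sketched directly (names are illustrative):

```python
import numpy as np

def shared_state(xi, yi, thi, x_rel, y_rel):
    """Eq. (38): position of robot j implied by robot i's pose (xi, yi, thi)
    and the relative measurement (x_rel, y_rel) in robot i's frame."""
    c, s = np.cos(thi), np.sin(thi)
    return np.array([xi + x_rel * c - y_rel * s,
                     yi + x_rel * s + y_rel * c])
```

With $\theta_k^i = 0$ the transform reduces to a pure translation, which is a quick sanity check on the sign convention.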

If there is a direct relative observation between the two nodes, the relative measurement information can be obtained directly by the sensor performing the observation. If the relative observation between the two nodes is an indirect one made with the help of surrounding landmark information, then the relative measurement information must first be solved for; the concrete solution method combines equation (25) and then converts the result into the coordinate system of mobile robot $i$.

Finally, based on the principle of the covariance intersection filter, the updating process of collaborative measurement among nodes in the framework of decentralized collaborative navigation can be obtained.


6. Simulation Results

6.1. Simulated Experimental Environment. In this section, the mobile robot network involved in collaborative navigation contains 3 nodes. The moving 2D environment is assumed to be a 25 m × 25 m area; when the mobile robot group works together, each node in the environment is assigned an initial position, and each node can follow a random trajectory in this area. All nodes are assumed to follow the same simulated trajectory, differing only in initial position. The maximum speed of a mobile robot on a straight line is 0.3125 m/s, and the angular velocity at a bend is 0.1 deg/s. It is assumed that, from the environment around the simulated rectangular trajectory, lidar scanning can extract 88 landmarks (environmental feature points) for the SLAM-aided navigation system (see Figure 10).

During this simulation, the mobile robots, as carriers, can carry different types of sensors, including odometers, UWB, and lidar. Suitable sensors are selected according to the requirements of positioning accuracy; Time Domain P410 UWB sensors are used to measure the relative distance, and the lidar is selected from the LMS291 series of 2D lidars produced by a German company. Based on the relevant parameters of these sensors, which are shown in Table 1, a simulation model for mobile robots carrying different types of sensors is built using MATLAB.

6.2. Relative Measurement Aided Odometer Collaborative Navigation. In this experiment, all three mobile robots are equipped with an odometer capable of motion monitoring, together with UWB capable of measuring relative distance or lidar capable of measuring relative position.

From Figure 11, it can be seen that the collaborative navigation system that realizes relative information sharing has significant advantages in positioning accuracy over the case of not sharing any information. Besides, the improvement in the group navigation performance of the mobile robots depends on the type of relative information shared. When relative position information is shared, the growth of the error can be effectively limited; by contrast, when relative distance information is shared, the position error still grows slowly, so sharing only reduces the growth rate of the error (see Figure 11).

The analysis shows that relative distance information is a weak constraint, so sharing this information alone cannot effectively realize the navigation and localization of the mobile robots. In contrast, sharing relative position information contributes directly to the solution of mobile robot navigation, and the positioning accuracy is significantly improved; it can even be increased by more than 60% at some times. This difference is even more obvious in the angle error diagram (see Figure 11).

In this paper, two observation methods, direct relative measurement and indirect relative measurement, are mentioned in the description of the measurement model based on relative position. Based on this experimental scene, scenario I has the three mobile robots observe the relative position information directly through lidar; scenario II has the three mobile robots extract the surrounding landmark information through lidar and calculate the relative position information from that solution. In both scenarios, the centralized collaborative navigation algorithm is used to solve the navigation problem. The two relative position measurement methods are compared through these simulation scenarios, and the comparison results are shown in Figure 12.

From Figure 12, it is clear that the collaborative navigation and positioning accuracy of the direct relative position measurement method is better than that of the indirect method. However, computational cost cannot be ignored while navigation performance is considered. The direct method requires that the measurement range of the lidar cover the activity range of the whole mobile robot group, while the measurement range of the lidar required by the indirect method only needs to cover the surrounding landmarks, which greatly reduces the cost. Considering that the collaborative navigation and positioning accuracy of the two relative position measurement methods does not differ much, the indirect method is obviously more suitable for practical application (see Figure 12).

The difference of the decentralized collaborative navigation framework compared with the centralized collaborative navigation framework is that the correlation among

Figure 10: Simulation trajectory diagram, showing the landmarks and the simulated trajectory (X and Y axes in meters).

Table 1: Relevant parameters of sensors.

Type       Measure                             Frequency (Hz)   Error
Odometer   Linear velocity of the two wheels   20               4 cm/s
UWB        Relative distance                   10               3 cm
Lidar      Relative position in X direction    10               2 cm
Lidar      Relative position in Y direction    10               2 cm


Figure 11: Comparative diagram of navigation error under the condition of sharing different relative information (legend: relative distance, relative position, no information). (a) Position error (m). (b) Angle error (deg).

Figure 12: Comparison of navigation errors under different relative position measuring methods (legend: direct, indirect). (a) Position error (m). (b) Angle error (deg).

Figure 13: Comparison of navigation errors under different collaborative navigation algorithms (legend: CL, DCL). (a) Position error (m). (b) Angle error (deg).


the different node states is accurately calculated in the centralized collaborative navigation framework, whereas this correlation cannot be used in the decentralized collaborative navigation framework. In order to better reflect the impact of this correlation, the navigation errors of the two collaborative navigation algorithms in the odometer collaborative navigation system are shown in Figure 13.

To compare the two algorithms, 20 experiments are carried out in this paper, and the root mean square error (RMS) of the two collaborative navigation algorithms is calculated, as shown in the following formula:

$$\mathrm{RMS} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(x_i - \hat{x}_i\right)^2}, \qquad (39)$$

where $n$ is the total number of samples, $x_i$ is the actual value, and $\hat{x}_i$ is the estimated value. The RMS parameters for odometer collaborative navigation are shown in Table 2.
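Equation (39) amounts to a one-line computation; a sketch with hypothetical argument names:

```python
import numpy as np

def rms_error(actual, estimated):
    """Eq. (39): root mean square error over n samples."""
    actual = np.asarray(actual, dtype=float)
    estimated = np.asarray(estimated, dtype=float)
    return np.sqrt(np.mean((actual - estimated) ** 2))
```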

As can be seen from Figure 13 and Table 2, the error of the centralized collaborative navigation algorithm is smaller than that of the decentralized collaborative navigation algorithm. This is to be expected, because the correlation among node states can be calculated accurately in the centralized collaborative navigation algorithm, whereas it can only be estimated in the decentralized collaborative navigation algorithm. However, the improved navigation accuracy comes at the expense of high computing power and high-quality data communication. Therefore, although the performance of the centralized collaborative navigation framework is better than that of the decentralized collaborative navigation framework, the centralized framework is not applicable in some practical scenarios (see Figure 13).

6.3. Odometer/Vision/SLAM Collaborative Navigation. In the odometer/vision collaborative navigation model, scenario I is designed such that all the mobile robots are equipped with an odometer that can monitor the motion, and one of the mobile robots is equipped with a SLAM-aided navigation system in working order.

Firstly, the mobile robot with the SLAM-aided navigation system is studied, running only its own integrated navigation algorithm without sharing relative position information. Using the centralized collaborative navigation algorithm, the navigation error of the node with the SLAM-aided navigation system is shown in Figure 14.

Figure 14 fully verifies the correctness of the centralized collaborative navigation algorithm based on the odometer/vision collaborative navigation model. The SLAM-aided navigation system is based on relative observation, so both the position estimate of the node itself and the position estimates of the landmarks accumulate error. When the data association algorithm of SLAM is combined with the centralized collaborative navigation algorithm, the landmark position estimates move closer to their true values as the positioning accuracy of the mobile robot improves; the data association in turn becomes more reliable, further correcting the state estimate of the mobile robot itself. Therefore, the improvement in the navigation accuracy of the mobile robot brought by the algorithm is very obvious (see Figure 14).

Then, the mobile robots without the SLAM-aided navigation system in the experiment are studied. In order to fully reflect the influence of the SLAM-aided navigation information on the navigation performance of the other nodes, scenario II is designed such that all mobile robots are equipped with an odometer that can monitor the motion, and two of them are equipped with SLAM-aided navigation systems in working order. The navigation error of the remaining node without a SLAM-aided navigation system is shown in Figure 15.

As shown in Figure 15, the mobile robot with the SLAM-aided navigation system performs loop closure detection at about 320 seconds, and its data are associated with the local map created at the initial location, thus eliminating most of the accumulated error. This uniquely superior performance of the SLAM-aided navigation system is transmitted to the other nodes in the group through the information sharing of the collaborative navigation process, so that they too can eliminate most of their accumulated error around that time, which is an important advantage of the collaborative navigation system (see Figure 15).

To verify the influence of the NN algorithm, the JCBB algorithm, and the optimized data association algorithm on the navigation performance of nodes without a SLAM-aided navigation system, the experimental scene is designed such that all mobile robots are equipped with an odometer for motion monitoring, one of the mobile robots is equipped with a SLAM-aided navigation system in working order, and the CL algorithm is run. The navigation error of the nodes without the SLAM-aided navigation system is shown in Figure 16.

The performance of the centralized collaborative navigation algorithm under the three SLAM data association algorithms is shown in Table 3.

From Figure 16 and Table 3, it can be seen that the navigation performance of the nodes without a SLAM-aided navigation system is affected by the SLAM data association algorithm used by the nodes carrying the SLAM-aided navigation system. Running the NN algorithm, the matching accuracy of the feature information is not high, so the navigation accuracy is poor. Running the JCBB algorithm, the correct rate of data association is the highest, but the operation time is the longest. Running the optimized data association algorithm, the navigation accuracy is slightly reduced, but the operation time is much lower, which can meet real-time requirements (see Figure 16).

In this subsection, to compare the performance of the odometer/vision collaborative navigation system under centralized and

Table 2: Odometer collaborative navigation RMS parameters.

Algorithm type   Position error (m)   Angle error (deg)
CL               0.1629               0.74625
DCL              0.36342              1.3762


Figure 14: The navigation error map of the node with the SLAM-aided navigation system (legend: no information, relative position). (a) Position error (m). (b) Angle error (deg).

Figure 15: Some nodes are equipped with a SLAM-aided navigation system (legend: two nodes with SLAM, one node with SLAM). (a) Position error (m). (b) Angle error (deg).

Figure 16: Comparison diagram of navigation error for fusion of single node SLAM information under different SLAM data association algorithms (legend: JCBB, optimization algorithm, NN). (a) Position error (m). (b) Angle error (deg).


decentralized collaborative navigation algorithms, we run the CL and DCL algorithms separately under experimental scenario I; the navigation errors of the two collaborative navigation algorithms are compared as shown in Figure 17. Under experimental scenario II of this subsection, we again run the CL algorithm and the DCL algorithm, respectively; the navigation errors of the two collaborative navigation algorithms are compared as shown in Figure 18 (see Figures 17 and 18).

After 20 experiments, the RMS parameters of collaborative navigation with fused single node SLAM information are shown in Table 4.

The RMS parameters of collaborative navigation with fused multinode SLAM information are shown in Table 5.

As can be seen from Figures 17 and 18, in conjunction with Tables 4 and 5, in the odometer/vision collaborative navigation system, the error of the centralized collaborative

Table 3: Performance comparison of centralized collaborative navigation algorithms under different SLAM data association algorithms.

Algorithm type   Position error (m)   Angle error (deg)   Relative time
NN               2.8323               10.7919             4
JCBB             0.0322               0.1623              12
Optimization     0.5587               2.2476              1

Figure 17: Comparative diagram of navigation error for fusion of single node SLAM information under different collaborative navigation algorithms (legend: CL, DCL). (a) Position error (m). (b) Angle error (deg).

Figure 18: Comparison diagram of navigation error for fusion of multinode SLAM information under different collaborative navigation algorithms (legend: CL, DCL). (a) Position error (m). (b) Angle error (deg).

Table 4: Collaborative navigation RMS parameters for fusion of single node SLAM information.

Algorithm type   Position error (m)   Angle error (deg)
CL               0.0322               0.1623
DCL              0.0669               0.2094

Table 5: Collaborative navigation RMS parameters for fusion of multinode SLAM information.

Algorithm type   Position error (m)   Angle error (deg)
CL               0.0243               0.0524
DCL              0.0438               0.1265


navigation algorithm is smaller than that of the decentralized collaborative navigation algorithm; after the landmark information collected by a single node or by multiple nodes is fused, the gap between the two algorithms is small. In other words, the decentralized collaborative navigation algorithm based on the odometer/vision collaborative navigation model can estimate the correlation of the internode information well (see Figures 17 and 18).

Considering the high demands of the centralized collaborative navigation algorithm on computing power and communication quality, the application scenarios of the two algorithms can be analyzed in combination with the above collaborative navigation experiments. The centralized collaborative navigation algorithm is suitable for cases with few nodes in which the nodes carry no additional aided navigation system. The decentralized collaborative navigation algorithm is suitable for a large number of nodes sharing a large amount of information, with some nodes equipped with additional aided navigation systems, especially a SLAM-aided navigation system.

7. Conclusion

In order to improve the performance of collaborative navigation systems, multirobot collaborative navigation algorithms based on odometer/vision multisource information fusion are studied. On the basis of establishing the multisource information fusion collaborative navigation system model, the centralized and decentralized collaborative navigation frameworks for odometer/vision fusion and the vision-based SLAM are given, and the centralized and decentralized odometer/vision collaborative navigation algorithms are derived, respectively. The effectiveness of the proposed algorithms is verified by simulation experiments, which has both theoretical and application value for high-performance collaborative navigation.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

References

[1] K. N. Olivier, D. E. Griffith, G. Eagle et al., "Randomized trial of liposomal amikacin for inhalation in nontuberculous mycobacterial lung disease," American Journal of Respiratory and Critical Care Medicine, vol. 195, no. 6, pp. 814–823, 2017.

[2] M. Schwarz, M. Beul, D. Droeschel et al., "DRC team nimbro rescue: perception and control for centaur-like mobile manipulation robot momaro," Springer Tracts in Advanced Robotics, Springer, Berlin, Germany, pp. 145–190, 2018.

[3] M. Long, H. Su, and B. Liu, "Group controllability of two-time-scale discrete-time multi-agent systems," Journal of the Franklin Institute, vol. 357, no. 6, pp. 3524–3540, 2020.

[4] T. Fukuda, S. Nakagawa, Y. Kawauchi, and M. Buss, "Structure decision method for self organising robots based on cell structures-CEBOT," in Proceedings of the 1989 International Conference on Robotics and Automation, Scottsdale, AZ, USA, May 1989.

[5] H. Asama, A. Matsumoto, and Y. Ishida, "Design of an autonomous and distributed robot system: ACTRESS," in Proceedings of the IEEE/RSJ International Workshop on Intelligent Robots and Systems (IROS '89), the Autonomous Mobile Robots and Its Applications, September 1989.

[6] J. Zhou, Y. Lv, G. Wen, X. Wu, and M. Cai, "Three-dimensional cooperative guidance law design for simultaneous attack with multiple missiles against a maneuvering target," in Proceedings of the 2018 IEEE CSAA Guidance, Navigation and Control Conference (CGNCC), August 2018.

[7] H. Su, J. Zhang, and Z. Zeng, "Formation-containment control of multi-robot systems under a stochastic sampling mechanism," Science China Technological Sciences, vol. 63, no. 6, pp. 1025–1034, 2020.

[8] H. Park and S. Hutchinson, "A distributed robust convergence algorithm for multi-robot systems in the presence of faulty robots," in Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 2980–2985, IEEE, Hamburg, Germany, September-October 2015.

[9] K. Petersen and R. Nagpal, "Complex design by simple robots: a collective embodied intelligence approach to construction," Architectural Design, vol. 87, no. 4, pp. 44–49, 2017.

[10] L. Chaimowicz, T. Sugar, V. Kumar, and M. F. M. Campos, "An architecture for tightly coupled multi-robot cooperation," in Proceedings 2001 ICRA, IEEE International Conference on Robotics and Automation (Cat. no. 01CH37164), vol. 3, pp. 2992–2997, IEEE, Seoul, Korea, May 2001.

[11] H.-X. Hu, G. Chen, and G. Wen, "Event-triggered control on quasi-average consensus in the cooperation-competition network," in Proceedings of the IECON 2018—44th Annual Conference of the IEEE Industrial Electronics Society, October 2018.

[12] A. Amanatiadis, K. Charalampous, I. Kostavelis et al., "The AVERT project: autonomous vehicle emergency recovery tool," in Proceedings of the 2013 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR), pp. 1–5, IEEE, Linkoping, Sweden, October 2013.

[13] R. Kurazume, S. Hirose, T. Iwasaki, S. Nagata, and N. Sashida, "Study on cooperative positioning system," in Proceedings of the IEEE International Conference on Robotics and Automation, Minneapolis, MN, USA, August 1996.

[14] Z. Fu, Y. Zhao, and G. Wen, "Distributed continuous-time optimization in multi-agent networks with undirected topology," in Proceedings of the 2019 IEEE 15th International Conference on Control and Automation (ICCA), November 2019.

[15] Y. Zhao, Y. Liu, and G. Wen, "Finite-time average estimation for multiple double integrators with unknown bounded inputs," in Proceedings of the 2018 33rd Youth Academic Annual Conference of Chinese Association of Automation (YAC), May 2018.

[16] S. Mao, Mobile Robot Localization in Indoor Environment, Ph.D. dissertation, Zhejiang University, Hangzhou, China, 2016.

[17] J. Yang, "Analysis approach to odometric non-systematic error uncertainty for mobile robots," Chinese Journal of Mechanical Engineering, vol. 44, no. 8, pp. 7–12, 2008.

[18] J. Kang, F. Zhang, and X. Qu, Angle Measuring Error Analysis of Coordinate Measuring System of Laser Radar, vol. 40, no. 6, pp. 834–839, 2016.

[19] J. Zhang, P. Orlik, Z. Sahinoglu, A. Molisch, and P. Kinney, "UWB systems for wireless sensor networks," Proceedings of the IEEE, vol. 97, no. 2, pp. 313–331.

[20] D. Kaushal and T. Shanmuganantham, "Design of a compact and novel microstrip patch antenna for multiband satellite applications," Materials Today: Proceedings, vol. 5, no. 10, pp. 21175–21182, 2018.

[21] J. Xiucai, Data Association Problem for Simultaneous Localization and Mapping of Mobile Robots, Ph.D. dissertation, National University of Defense Technology, Changsha, China, 2008.

[22] Z. Yuan, "Research of mobile robot's SLAM based on binocular vision," Master's thesis, Tianjin University of Technology, Tianjin, China, 2016.

[23] F. Bellavia, M. Fanfani, F. Pazzaglia, and C. Colombo, "Robust selective stereo SLAM without loop closure and bundle adjustment," in Proceedings of the International Conference on Image Analysis and Processing, pp. 462–471, Springer, Naples, Italy, 2013.

[24] H. Fourati, Multisensor Data Fusion: From Algorithms and Architectural Design to Applications, CRC Press, Boca Raton, FL, USA, 2015.

[25] S. Jia, X. Yin, and X. Li, "Mobile robot parallel PF-SLAM based on OpenMP," in Proceedings of the 2012 IEEE International Conference on Robotics and Biomimetics (ROBIO), pp. 508–513, IEEE, Guangzhou, China, December 2012.

[26] W. Zhou, E. Shiju, Z. Cao, and Y. Dong, "Review of SLAM data association study," in Proceedings of the 2016 International Conference on Sensor Network and Computer Engineering, Atlantis Press, Shanghai, China, 2016.

[27] R. Singer and R. Sea, "A new filter for optimal tracking in dense multitarget environments," in Proceedings of the Annual Allerton Conference on Circuit and System Theory, pp. 201–211, Monticello, MN, USA, 1972.

[28] J. Neira and J. D. Tardos, "Data association in stochastic mapping using the joint compatibility test," IEEE Transactions on Robotics and Automation, vol. 17, no. 6, pp. 890–897, 2001.

[29] L. Yanju, X. Yufeng, G. Song, H. Xi, and G. Zhengping, "Research on data association in SLAM based laser sensor," Microcomputer & Its Application, vol. 36, no. 2, pp. 78–82, 2017.

[30] O. Hlinka, O. Sluciak, F. Hlawatsch, and M. Rupp, "Distributed data fusion using iterative covariance intersection," in Proceedings of the 2014 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 1861–1865, IEEE, Florence, Italy, May 2014.



As shown in Figure 2, the lidar scans in the sensor coordinate system along the scanning direction with a fixed scanning angle resolution. Suppose a measured point is detected in the $i$th direction; denote by $\theta_i$ the angle between this direction and the positive direction of the $X_s$-axis, which can be obtained from the inherent scanning angle resolution. A frame of point cloud data is obtained after one scan, which can be recorded as a set $A = \{(d_i, \theta_i) \mid i = 1, \ldots, n\}$, where $n$ is the number of scanned data points in this frame and $(d_i, \theta_i)$ are the coordinates of a measured point in the polar coordinate system. The final point cloud data set $B = \{(x_i, y_i) \mid i = 1, \ldots, n\}$ is then obtained.
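The conversion from the polar set $A$ to the Cartesian set $B$ can be sketched as follows, assuming the scan starts at $\theta = 0$ and advances by the fixed angular resolution; both assumptions, and the function name, are illustrative.

```python
import numpy as np

def scan_to_points(d, angle_resolution_deg):
    """Convert one lidar frame A = {(d_i, theta_i)} into the Cartesian
    point set B = {(x_i, y_i)}; theta_i follows from the beam index and
    the fixed scan angular resolution (assumed to start at 0)."""
    d = np.asarray(d, dtype=float)
    theta = np.deg2rad(angle_resolution_deg) * np.arange(len(d))
    return np.column_stack((d * np.cos(theta), d * np.sin(theta)))
```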

2.2.2. Lidar Error Model. In the simulation experiments, the above errors can be simplified to Gaussian white noise along the $x$ and $y$ directions in the carrier coordinate system of the mobile robot [18]. It is assumed that the positioning errors in the two directions are independent, so the corresponding covariance matrix is given by

$$\Sigma_{rad} = \begin{bmatrix} \sigma_x^b & 0 \\ 0 & \sigma_y^b \end{bmatrix}, \qquad (4)$$

where $\sigma_x^b$ and $\sigma_y^b$ represent the variances of the observation along the $x$ and $y$ directions in the carrier coordinate system of the mobile robot, respectively.

2.3. UWB Relative Measurement Model. The relative measurement module is an important part of the collaborative navigation network, through which nodes relate to the state information of surrounding nodes. UWB (Ultra-Wideband) can measure relative distance, and lidar can measure relative position; both are selected as research objects in this paper. Because the scanning model of the lidar has been described in the previous section, this section only establishes the corresponding ranging model and error model.

2.3.1. UWB Ranging Model. At present, there are two main ranging methods, that is, the two-way ranging (TWR) method and the symmetric double-sided two-way ranging (SDS-TWR) method. The SDS-TWR method greatly improves on the TWR method in terms of the ranging error caused by clock synchronization; therefore, the SDS-TWR method is often used for distance measurement [19] (see Figure 3).

The distance between transceivers A and B can be obtained from Figure 3:

$$d = \frac{c\left(T_{A1}T_{B2} - T_{B1}T_{A2}\right)}{T_{A1} + T_{A2} + T_{B1} + T_{B2}}, \qquad (5)$$

where $d$ is the distance between the two transceivers under test, $c$ is the signal propagation speed, and $T_{A1} = T_{a2} - T_{a1}$ is the time interval between transceiver A sending its first signal and receiving the reply; $T_{A2}$, $T_{B1}$, and $T_{B2}$ are obtained in the same way.

2.3.2. UWB Error Model. One of the main sources of error in UWB ranging is the error caused by the clock synchronization of the two transceivers [20]:

$$e_{SDS} = T_f c\left(1 - \frac{k_A + k_B}{2}\right), \qquad (6)$$

where $k_A$ and $k_B$ are the ratios between the actual and expected clock frequencies of transceivers A and B, respectively, and $T_f$ is the signal flight time.

2.4. SLAM Model Based on Lidar Measurement. SLAM is widely used in the field of mobile robot navigation and positioning [21]. It is the key technology for solving the mapping problem and, at the same time, provides a new idea for solving the path planning problem. SLAM technology mainly addresses the situation in which the location of the mobile robot is unknown: the environmental map is built incrementally as the surrounding unfamiliar environment is perceived through external sensors, and at the same time the state estimate of the mobile robot itself is obtained using the built map information.

The SLAM localization technology is the key to realizing autonomous positioning of a mobile robot in an unknown environment. It is of great research value for achieving more accurate navigation and localization while reducing the constraints on the mobile robot (such as the absence of GPS) [22]. By applying this technology within the collaborative navigation framework of the mobile robots, the navigation performance of the whole collaborative navigation system can be effectively improved (see Figure 4).

It can be seen from Figure 4 that a mobile robot's localization using a SLAM navigation system is essentially a process of continuous estimation approximating the true value [23]. The triangles in the graph represent the mobile robot, and the circles represent the observed landmarks, where the grey circles represent the estimated landmarks. The solid line connecting the triangles represents the real trajectory, and the dotted line is the estimated trajectory (see Figure 4).

3. Collaborative Navigation Framework Based on Odometer/Vision

When the nodes in a collaborative navigation network participate in collaborative navigation, different data fusion algorithms

Figure 2: The scanning schematic diagram of the two-dimensional lidar (scanning direction, scan angular resolution, and measured point (θ_i, d_i) in the sensor frame X_s Y_s).

Mathematical Problems in Engineering 3

can be used to fuse the data obtained from the multisource heterogeneous sensors of the different nodes. Two data fusion architectures are generally used: centralized and decentralized data fusion algorithms [24].

3.1. Framework of Collaborative Navigation Algorithm. In this subsection, a data fusion framework for the centralized collaborative navigation algorithm is designed for the general odometer/vision model, and a data fusion framework for the decentralized collaborative navigation algorithm is designed in the same way.

In the centralized collaborative navigation structure, the measured data obtained by each node are concentrated in a data fusion center for data fusion. In a decentralized collaborative navigation structure, each node shares some information with other nodes while processing its own sensor data (see Figure 5).

According to the odometer/vision collaborative navigation model, we use the most common EKF algorithm to simulate centralized and decentralized collaborative navigation: a centralized localization (CL) algorithm and a decentralized localization (DCL) algorithm are designed.

As shown in Figure 6, in the CL algorithm, which corresponds to the centralized collaborative navigation structure, each node sends the information obtained by its own sensors to the central server, which realizes the data fusion through the EKF algorithm. Its state vector is the set of the state vectors of all nodes, updated by the principle of dead reckoning. After that, the measurement information obtained after data association in the SLAM process and the relative measurement information between nodes are selected. The CL algorithm gathers the state and covariance information of all nodes, corrects them uniformly, and sends the corrected estimation results back to each node. Because the nodes participate in a joint measurement update, their state information becomes correlated after the first update (see Figure 6).

This correlation is reasonably estimated on the basis of the CL algorithm, and the task of data fusion is distributed to each node; the DCL algorithm is proposed accordingly. Since the positions of the nodes in the algorithm are equivalent, only one node needs to be discussed (see Figure 7). In order to avoid overoptimal estimation to the greatest extent, the DCL algorithm in this paper introduces the concept of the split covariance intersection filter. The covariance of each node is divided into a correlated term and an independent term, and the time update process is basically consistent with that of a single node. The measurement update process takes two steps. Firstly, according to the measurement information of the SLAM navigation system, a state estimate is obtained by propagating the state and integrating the related information of the aided navigation system. Then, the state information sent by the adjacent nodes and the relative measurement information between the nodes yield another state estimate of the node. Here, the relevant state information sent to neighboring nodes is used to update the local maps (see Figure 7).

3.2. SLAM Algorithms Based on Vision

3.2.1. Landmark State Estimation Algorithm. The key of the SLAM navigation algorithm lies in the process of data association. The positioning process of this SLAM navigation system is essentially a process of continuous estimation approximating the true value. This kind of probability estimation problem is usually solved by introducing an appropriate filter, most commonly the EKF (see Figure 8).

Because the odometer selected in this paper has a high sampling frequency and the lidar has the advantages of high precision and high reliability, the EKF algorithm, with its better real-time performance, is selected. The state estimation process for landmark information in EKF-based SLAM is described below. The observation equation for the feature information obtained by lidar is as follows:

z_k = h(X_{kl}, X_{kr}) + n_k = \begin{bmatrix} (x_{kl} - x_{kr})\cos\theta_{kr} + (y_{kl} - y_{kr})\sin\theta_{kr} \\ -(x_{kl} - x_{kr})\sin\theta_{kr} + (y_{kl} - y_{kr})\cos\theta_{kr} \end{bmatrix} + n_k, \quad (7)

where X_{kl} = (x_{kl}, y_{kl})^T is the state vector of the landmark at time k and X_{kr} = (x_{kr}, y_{kr}, \theta_{kr})^T is the state vector of the mobile robot at

Figure 3: Principle of SDSTWR (time stamps of transceivers A and B).

Figure 4: SLAM model diagram.


time k; n_k is the measurement noise, whose variance matrix is R_k, denoted n_k \sim N(0, R_k). Since the landmarks are static, the state estimate of a landmark at time k - 1 can be regarded as the a priori estimate of the landmark state at time k. The EKF-based measurement update process is as follows.

Step 1: calculate the innovation and the filter gain:

v_k = z_k - h(\hat{X}_{k-1,l}, \hat{X}^{-}_{kr}),
K_k = \Sigma_{k-1} H_k^T \left( H_k \Sigma_{k-1} H_k^T + R_k \right)^{-1}. \quad (8)

Step 2: update the state estimate and the corresponding covariance:

\hat{X}_{kl} = \hat{X}_{k-1,l} + K_k v_k,
\Sigma_k = (I - K_k H_k) \Sigma_{k-1}, \quad (9)

where \Sigma_k is the covariance matrix of the landmark state estimate at time k and H_k is the measurement matrix at time k.

Remark 1. Any observed landmark's position can be corrected by the above method; note that such position correction is limited to the landmarks of the local map observed at time k.
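As a minimal numpy sketch of the update in (8) and (9) (function and variable names are illustrative, not from the paper):

```python
import numpy as np

def ekf_landmark_update(x_l, Sigma, z, h, H, R):
    """One EKF measurement update for a static landmark, eqs. (8)-(9).

    x_l   : prior landmark state estimate, shape (2,)
    Sigma : prior landmark covariance, shape (2, 2)
    z     : measurement, shape (2,)
    h     : predicted measurement h(x_l, x_r), shape (2,)
    H     : measurement Jacobian w.r.t. the landmark state, shape (2, 2)
    R     : measurement noise covariance, shape (2, 2)
    """
    v = z - h                                  # innovation, eq. (8)
    S = H @ Sigma @ H.T + R                    # innovation covariance
    K = Sigma @ H.T @ np.linalg.inv(S)         # filter gain, eq. (8)
    x_new = x_l + K @ v                        # state update, eq. (9)
    Sigma_new = (np.eye(len(x_l)) - K @ H) @ Sigma
    return x_new, Sigma_new
```

With a near-perfect measurement (R much smaller than Sigma), the updated landmark moves essentially onto the observation, as expected from (9).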

3.2.2. Data Association Algorithm. In a SLAM navigation system, the process of data association is an important prerequisite for state estimation. Incorrect data association is likely to lead to serious deviation of the estimation results [25, 26].

At present, there are two data association algorithms commonly used in SLAM technology: the nearest neighbor (NN) data association algorithm [27] and the joint compatibility branch and bound (JCBB) data association algorithm [28]. The NN algorithm involves less computation, but it easily forms wrong data associations when the density of feature information is large, which leads to divergence of the SLAM results; it is therefore only suitable for environments where the feature density and the system error are small. JCBB is an improvement of the NN algorithm that extends the association of single features in NN to all observed feature information, which

Figure 5: Collaborative navigation structures. (a) Centralized collaborative navigation. (b) Decentralized collaborative navigation.

Figure 6: Data fusion framework of the centralized collaborative navigation algorithm.

Figure 7: Data fusion framework of the decentralized collaborative navigation algorithm.

Figure 8: SLAM flow chart.


is more binding and more reliable. The JCBB algorithm can obtain more credible association hypotheses than the NN algorithm and exclude some wrong ones; however, the amount of computation increases markedly, which to some extent affects the real-time performance of the SLAM navigation system.

To ensure the accuracy of data association in the SLAM process, reduce the amount of computation as much as possible, and enhance the real-time performance of the SLAM algorithm, this subsection describes an optimized data association algorithm. The classification method of [29] is used to divide the set of related feature information; finally, the appropriate feature information set of the local map and the preprocessed observed feature information set are selected to form the association space.

First, the collection of feature information in the local map is divided as follows:

\begin{cases} D[(x_m, y_m), (x_k, y_k)] \le \Delta d, & (x_k, y_k) \in F_k, \\ D[(x_m, y_m), (x_k, y_k)] > \Delta d, & (x_k, y_k) \notin F_k, \end{cases} \quad (10)

where D[(xm ym) (xk yk)] is the relative distance betweenthe feature information (xk yk) of the local map and otherfeature information (xm ym)

Then the observed feature information set is preprocessed and divided. In the actual navigation process, the observed feature information obtained by lidar contains noise; the purpose of preprocessing is to filter out some of this noise, improving the accuracy of data association and reducing the amount of computation at the same time. The judgment process is as follows:

f(i, j) = \begin{cases} 1, & D[(x_i, y_i), (x_j, y_j)] < \Delta D, \\ 0, & D[(x_i, y_i), (x_j, y_j)] \ge \Delta D, \end{cases} \quad (11)

where \Delta D is a threshold determined by the performance of the laser sensor. When the relative distance between two pieces of observed feature information is less than the threshold, the observation is considered a feature point; otherwise, it is a noise point and does not participate in subsequent calculation.

Once the sets have been divided, the set of observed feature information is sorted in the order of observation. Following the division process used for the local map feature set above, the subsets are divided in turn, and no point participates in the division repeatedly.

Finally, we select the appropriate association sets on which to execute the data association algorithm. Each local-map feature subset and the subset of observed features at the current time undergo a joint compatibility test, and the feature information with the best test results is selected to form a new subset as the data association object.
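The preprocessing rule of (11) can be sketched as follows; the helper name and the brute-force pairwise search are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def filter_noise_points(points, delta_D):
    """Preprocessing step of eq. (11): an observed point is kept as a
    feature point only if another observation lies within delta_D of it;
    isolated points are treated as noise. `points` is an (n, 2) array."""
    pts = np.asarray(points, dtype=float)
    keep = []
    for i, p in enumerate(pts):
        d = np.linalg.norm(pts - p, axis=1)
        d[i] = np.inf                      # ignore the point itself
        if d.min() < delta_D:
            keep.append(i)
    return pts[keep]
```

For example, with observations [[0, 0], [0.1, 0], [5, 5]] and a 0.5 m threshold, the isolated point at (5, 5) is discarded as noise.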

4. Centralized Collaborative Navigation Algorithm

4.1. Time Update. First of all, the state space model should be established. The state vector of a single mobile robot with three degrees of freedom contains position and heading angle information. Suppose the number of nodes is N; the state space of the collaborative navigation system in the centralized framework contains the state vectors of all mobile robots in the group. Let the state vector of mobile robot i be X^i_k and the state of the system be X_k. Then the state space equation of the system can be expressed as follows:

X_k = \begin{bmatrix} X^1_k \\ X^2_k \\ \vdots \\ X^N_k \end{bmatrix} = \begin{bmatrix} f_1(X^1_{k-1}, u^1_{k-1}) \\ f_2(X^2_{k-1}, u^2_{k-1}) \\ \vdots \\ f_N(X^N_{k-1}, u^N_{k-1}) \end{bmatrix} + \begin{bmatrix} w^1_{k-1} \\ w^2_{k-1} \\ \vdots \\ w^N_{k-1} \end{bmatrix} \triangleq \Phi(X_{k-1}, u_{k-1}) + w_{k-1}, \quad (12)

where the function f_i(X, u) describes the kinematic characteristics of mobile robot i, u^i_{k-1} = [\Delta S_{r,k-1}, \Delta S_{l,k-1}]^T represents the input required by mobile robot i for dead reckoning at time k, and w^i_{k-1} is the system noise, with w^i_{k-1} \sim N(0, Q^i_{k-1}).

It is assumed that the motion of any node is not affected by any other node; each node moves independently, without being controlled by the other nodes. Therefore, the state transition matrix for centralized collaborative positioning is given by

F_{k-1} = \begin{bmatrix} J^1_{X(k-1)} & 0 & \cdots & 0 \\ 0 & J^2_{X(k-1)} & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & J^N_{X(k-1)} \end{bmatrix}, \quad (13)

where J^i_{X(k-1)} and J^i_{u(k-1)} are the Jacobian matrices of the function f_i with respect to the state vector and the control input, respectively. The system noise variance matrix of the collaborative navigation system in the centralized framework is

Q_{k-1} = \begin{bmatrix} Q^1_{k-1} & 0 & \cdots & 0 \\ 0 & Q^2_{k-1} & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & Q^N_{k-1} \end{bmatrix}, \quad (14)

where Q^i_{k-1} = J^i_{u(k-1)} \Sigma_u J^{iT}_{u(k-1)} and \Sigma_u is the covariance matrix of the control input. Then the time update process of the collaborative navigation system in the centralized framework can be deduced:


\hat{X}^-_k = \Phi(\hat{X}^+_{k-1}, u_{k-1}),
P^-_k = F_{k-1} P_{k-1} F^T_{k-1} + Q_{k-1} \triangleq \begin{bmatrix} P^-_{11,k} & P^-_{12,k} & \cdots & P^-_{1N,k} \\ P^-_{21,k} & P^-_{22,k} & \cdots & P^-_{2N,k} \\ \vdots & \vdots & \ddots & \vdots \\ P^-_{N1,k} & P^-_{N2,k} & \cdots & P^-_{NN,k} \end{bmatrix}, \quad (15)

where

P^-_{ii,k} = J^i_{X(k-1)} P_{ii,k-1} J^{iT}_{X(k-1)} + Q^i_{k-1},
P^-_{ij,k} = J^i_{X(k-1)} P_{ij,k-1} J^{jT}_{X(k-1)}. \quad (16)

4.2. Single-Node Measurement Update. This section describes the measurement update process involving only one node in the centralized framework. The aided navigation system selected is the SLAM navigation system, which integrates the landmark information of the surrounding environment measured by lidar. A measurement model based on this navigation system is built, and the EKF-based measurement update process is described.

4.2.1. Measurement Model Based on SLAM. The measurement model based on SLAM is the measurement model after data association. In this paper, the position information of landmarks obtained by lidar is taken as the observation equation:

z^i_k = \begin{bmatrix} x^b_l \\ y^b_l \end{bmatrix} = \begin{bmatrix} (x^w_l - x^i_k)\cos\theta^i_k + (y^w_l - y^i_k)\sin\theta^i_k \\ -(x^w_l - x^i_k)\sin\theta^i_k + (y^w_l - y^i_k)\cos\theta^i_k \end{bmatrix} + n^i_k, \quad (17)

where (x^b_l, y^b_l) is the landmark position obtained by lidar, (x^w_l, y^w_l) is the coordinate of the landmark in the world coordinate system, (x^i_k, y^i_k, \theta^i_k) is the state of the mobile robot at time k, and n^i_k is the measurement noise, with variance matrix R^i_k, denoted n^i_k \sim N(0, R^i_k). After linearization and state extension, the observation equation of the whole system can be obtained:

z^i_k = H^i_k X_k + h^i(\hat{X}^{i-}_k) - \nabla h^i \hat{X}^{i-}_k + n^i_k, \quad (18)

where

H^i_k = [0 \cdots \nabla h^i \cdots 0]_{2 \times 3N}, \quad (19)

and \nabla h^i is the Jacobian matrix of the function h^i(X^i_k).

4.2.2. Measurement Update Based on EKF. Combining the basic principle of the Kalman filter, the measurement update process of the aided navigation system for a single node is obtained as follows.

Step 1: calculate the innovation and the filter gain:

v = z^i_k - h^i(\hat{X}^{i-}_k),
S^i = H^i_k P^-_k (H^i_k)^T + R^i_k,
K^i = P^-_k (H^i_k)^T (S^i)^{-1}. \quad (20)

Step 2: update the state estimate and the corresponding covariance:

\hat{X}^+_k = \hat{X}^-_k + K^i v,
P_k = P^-_k - P^-_k (H^i_k)^T (S^i)^{-1} H^i_k P^-_k. \quad (21)

4.3. Relative Measurement Update among Nodes. The standard observation model can be divided into two types: the measurement model based on relative distance and the measurement model based on relative position.

4.3.1. Measurement Model Based on Relative Distance. The observation of mobile robot j by mobile robot i at time k can be denoted by z^{ij}_k; the observation equation is given by

z^{ij}_k = h^{ij}(X^i_k, X^j_k) + n^{ij}_k = \sqrt{(x^i_k - x^j_k)^2 + (y^i_k - y^j_k)^2} + n^{ij}_k, \quad (22)

where n^{ij}_k is the measurement noise, with variance matrix R^{ij}_k = \sigma_{UWB}, denoted n^{ij}_k \sim N(0, R^{ij}_k), and \sigma_{UWB} is the variance of the UWB ranging.

After linearization and state extension, the observation equation of the whole system can be obtained:

z^{ij}_k = H^{ij}_k X_k + h^{ij}(\hat{X}^{i-}_k, \hat{X}^{j-}_k) - \nabla h^{ij}_i \hat{X}^{i-}_k - \nabla h^{ij}_j \hat{X}^{j-}_k + n^{ij}_k, \quad (23)

where

H^{ij}_k = [0 \cdots \nabla h^{ij}_i \cdots \nabla h^{ij}_j \cdots 0]_{1 \times 3N}, \quad (24)

and \nabla h^{ij}_i and \nabla h^{ij}_j are the Jacobian matrices of the function h^{ij}(X^i, X^j) with respect to X^i and X^j, respectively.

4.3.2. Measurement Model Based on Relative Position. Using lidar as the sensor, the relative observation among nodes can be realized in two ways: the direct method and the indirect method. The direct method measures the relative position between the two nodes directly; the indirect method uses lidar to observe the landmarks nearest to the two nodes, and the relative position between the two nodes is then obtained by the corresponding calculation.

The state of mobile robot i at time k is denoted by (x^i_k, y^i_k, \theta^i_k), and the state of mobile robot j by (x^j_k, y^j_k, \theta^j_k). The coordinates of landmark L1, adjacent to mobile robot i, are (x^w_{l1}, y^w_{l1}) in the world coordinate system and (x^i_{l1}, y^i_{l1}) in the coordinate system of mobile robot i; the coordinates of landmark L2, adjacent to mobile robot j, are (x^w_{l2}, y^w_{l2}) in the world coordinate system and (x^j_{l2}, y^j_{l2}) in the coordinate system of mobile robot j. The specific solution process of the indirect method is as follows (see Figure 9):

\begin{bmatrix} x^j_k - x^i_k \\ y^j_k - y^i_k \end{bmatrix} = \begin{bmatrix} x^j_{l2}\cos\theta^j_k - y^j_{l2}\sin\theta^j_k \\ x^j_{l2}\sin\theta^j_k + y^j_{l2}\cos\theta^j_k \end{bmatrix} - \begin{bmatrix} x^i_{l1}\cos\theta^i_k - y^i_{l1}\sin\theta^i_k \\ x^i_{l1}\sin\theta^i_k + y^i_{l1}\cos\theta^i_k \end{bmatrix} + \begin{bmatrix} x^w_{l1} - x^w_{l2} \\ y^w_{l1} - y^w_{l2} \end{bmatrix}. \quad (25)

When mobile robot i observes mobile robot j at time k, the coordinates of mobile robot j in the coordinate system of mobile robot i are taken as the observation. The observation equation is as follows:

z^{ij}_k = \begin{bmatrix} x^{ji}_k \\ y^{ji}_k \end{bmatrix} = \begin{bmatrix} (x^j_k - x^i_k)\cos\theta^i_k + (y^j_k - y^i_k)\sin\theta^i_k \\ -(x^j_k - x^i_k)\sin\theta^i_k + (y^j_k - y^i_k)\cos\theta^i_k \end{bmatrix} + n^{ij}_k, \quad (26)

where n^{ij}_k is the measurement noise, with variance matrix R^{ij}_k, denoted n^{ij}_k \sim N(0, R^{ij}_k), and (x^{ji}_k, y^{ji}_k) is the coordinate of mobile robot j in the coordinate system of mobile robot i at time k.

4.3.3. Measurement Update Based on EKF. Similarly, we can finally obtain the measurement update process for the relative observation between nodes.

5. Decentralized Collaborative Navigation Algorithm

The state and covariance information of each node under the decentralized collaborative navigation algorithm are calculated separately. In order to avoid overoptimal estimation to the maximum extent, the concept of the covariance intersection filter is introduced, and the covariance of each node is divided into a correlated item and an independent item.

5.1. Covariance Intersection Filter. Given the state estimate \hat{X} and the corresponding covariance matrix P, let P^* be the covariance of the error between the state estimate \hat{X} and the true state X^*:

P^* = E[(\hat{X} - X^*)(\hat{X} - X^*)^T]. \quad (27)

Consistency is a characteristic of the covariance matrix of the estimate [30]. When the covariance matrix of the state estimate is not less than the true covariance, the estimate is said to satisfy consistency; that is, no overoptimal estimate is produced. Suppose two state estimates \hat{X}_1 and \hat{X}_2 are independent and satisfy consistency, with corresponding covariances P_1 and P_2. If there is in fact a correlation between the two estimates, the Kalman filter may produce inconsistent results; in other words, it leads to overoptimal estimation. The split covariance intersection fusion of the two estimates is

P_1 = \frac{P_{1d}}{w} + P_{1i}, \quad (28)

P_2 = \frac{P_{2d}}{1 - w} + P_{2i}, \quad (29)

P^{-1} = P_1^{-1} + P_2^{-1}, \quad (30)

\hat{X} = P(P_1^{-1}\hat{X}_1 + P_2^{-1}\hat{X}_2), \quad (31)

P_i = P(P_1^{-1}P_{1i}P_1^{-1} + P_2^{-1}P_{2i}P_2^{-1})P, \quad (32)

P_d = P - P_i, \quad (33)

Figure 9: Indirect observation schematic diagram.


where the covariances corresponding to the two state estimates \hat{X}_1 and \hat{X}_2 are P_{1d} + P_{1i} and P_{2d} + P_{2i}, respectively. P_{1d} and P_{2d} are the correlated covariance components, corresponding to maximum correlation between the two state estimates; P_{1i} and P_{2i} are the independent covariance components, corresponding to absolute independence of the two estimates. The weight w lies in the interval [0, 1]; it is an optimization parameter that minimizes the covariance after fusion, and any w in this interval ensures the consistency of the fusion result.
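A minimal numpy sketch of the split covariance intersection fusion (28)-(33) for a fixed weight w; in practice w would be chosen to minimize, e.g., the trace or determinant of the fused covariance (names are illustrative):

```python
import numpy as np

def split_ci_fuse(x1, P1d, P1i, x2, P2d, P2i, w):
    """Split covariance intersection fusion, eqs. (28)-(33),
    for a fixed weight w in (0, 1)."""
    P1 = P1d / w + P1i                      # eq. (28)
    P2 = P2d / (1.0 - w) + P2i              # eq. (29)
    P1inv, P2inv = np.linalg.inv(P1), np.linalg.inv(P2)
    P = np.linalg.inv(P1inv + P2inv)        # eq. (30)
    x = P @ (P1inv @ x1 + P2inv @ x2)       # eq. (31)
    Pi = P @ (P1inv @ P1i @ P1inv + P2inv @ P2i @ P2inv) @ P  # eq. (32)
    Pd = P - Pi                             # eq. (33)
    return x, P, Pd, Pi
```

Inflating the correlated parts by 1/w and 1/(1 - w) before the information-form fusion is what guarantees consistency without knowing the actual cross-correlation.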

5.2. Time Update. Before describing the time update process of the DCL algorithm, it is necessary to decompose the state information of the system in the framework of centralized collaborative navigation, which can be expressed as

E_G = \{X_G, P_G\} = \{\{\hat{X}_1, P_1\}, \{\hat{X}_2, P_2\}, \ldots, \{\hat{X}_N, P_N\}\} = \{\{\hat{X}_1, P_{1d} + P_{1i}\}, \{\hat{X}_2, P_{2d} + P_{2i}\}, \ldots, \{\hat{X}_N, P_{Nd} + P_{Ni}\}\}, \quad (34)

where E_G is the set of states under the centralized collaborative navigation framework, and X_G and P_G are the state space and the corresponding covariance matrix under the centralized collaborative navigation framework, respectively.

The state propagation process under the decentralized collaborative navigation framework is the state propagation of a single node, and the propagation of the covariance can be expressed as

P^{i-}_k = J^i_{X(k-1)} P^i_{k-1} J^{iT}_{X(k-1)} + J^i_{u(k-1)} \Sigma_u J^{iT}_{u(k-1)},
P^{i-}_{ki} = J^i_{X(k-1)} P^i_{k-1,i} J^{iT}_{X(k-1)} + J^i_{u(k-1)} \Sigma_u J^{iT}_{u(k-1)}, \quad (35)

where P^{i-}_k is the one-step prediction covariance matrix of mobile robot i at time k and P^{i-}_{ki} is its independent covariance component. J^i_{X(k-1)} and J^i_{u(k-1)} are the Jacobian matrices of the function f_i(X, u) with respect to the state vector and the control input, and \Sigma_u is the error matrix of the control input.

5.3. Single-Node Measurement Update. The measurement update process of a single node involves only the aided navigation system of that node, so there is no need to estimate the correlation; that is, the step of formulas (28) and (29) is omitted. Similar to the single-node measurement update in centralized collaborative navigation, the single-node measurement update process in decentralized collaborative navigation can be expressed as follows.

Step 1: calculate the innovation and the filter gain:

v = z^i_k - h^i(\hat{X}^{i-}_k),
S = \nabla h^i P^{i-}_k (\nabla h^i)^T + R^i_k,
K = P^{i-}_k (\nabla h^i)^T S^{-1}. \quad (36)

Step 2: update the state estimate and the corresponding covariance:

\hat{X}^{i+}_k = \hat{X}^{i-}_k + K v,
P^i_k = (I - K \nabla h^i) P^{i-}_k,
P^i_{ki} = (I - K \nabla h^i) P^{i-}_{ki} (I - K \nabla h^i)^T + K R^i_k K^T,
P^i_{kd} = P^i_k - P^i_{ki}. \quad (37)

5.4. Collaborative Measurement Update among Nodes. In the framework of decentralized collaborative navigation, the state estimation results of a single node's aided navigation system and the state estimation results based on information sharing among nodes are integrated in the process of internode collaborative measurement updating, and the corrected state information is derived.

Under the decentralized collaborative navigation framework, any node can explicitly estimate the states of the other nodes. In order to save communication cost and reduce the computation of a single mobile robot platform, this paper stipulates that information exchange among the mobile robots only takes place between two adjacent mobile robot nodes.

Assuming that mobile robot i performs a relative observation of mobile robot j at time k and shares its own state and covariance information with mobile robot j, the state of mobile robot j can be explicitly expressed with the received state of mobile robot i and the relative measurement information between the two nodes:

X^{co}_{jk} = \begin{bmatrix} x^{co}_{jk} \\ y^{co}_{jk} \end{bmatrix} = \begin{bmatrix} x^i_k + x^{ji}_k \cos\theta^i_k - y^{ji}_k \sin\theta^i_k \\ y^i_k + x^{ji}_k \sin\theta^i_k + y^{ji}_k \cos\theta^i_k \end{bmatrix}, \quad (38)

where (x^{co}_{jk}, y^{co}_{jk}) is the partial state estimate of mobile robot j obtained through the information sharing between mobile robots i and j at time k, (x^i_k, y^i_k, \theta^i_k) is the state vector shared by mobile robot i with mobile robot j, and u_{rel} = (x^{ji}_k, y^{ji}_k) is the relative measurement of the two nodes in the coordinate system of mobile robot i.

If there is a direct relative observation between the two nodes, the relative measurement information can be obtained directly by the sensor performing the relative observation. If the relative observation between the two nodes relies on indirect relative observation of the surrounding landmark information, the relative measurement information first needs to be solved, for example by combining (25), and then converted into the coordinate system of mobile robot i.

Finally, based on the principle of the covariance intersection filter, the collaborative measurement update process among nodes in the framework of decentralized collaborative navigation can be obtained.


6. Simulation Results

6.1. Simulated Experimental Environment. In this section, three mobile robot nodes participate in collaborative navigation. The moving 2D environment is a 25 m × 25 m area; when the mobile robot group works together, each node is assigned an initial position in the environment and can follow a random trajectory in this area. It is assumed that all nodes follow the same simulated trajectory and only the initial positions differ. The maximum speed of a mobile robot on the straight segments is 0.3125 m/s, and the angular velocity at the bends is 0.1 deg/s. It is assumed that 88 landmarks (environmental feature points) around the simulated rectangular trajectory can be extracted by lidar scanning for the SLAM-aided navigation system (see Figure 10).

During this simulation, the mobile robots as carriers can carry different types of sensors, including odometers, UWB, and lidar. Suitable sensors are selected according to the requirements of positioning accuracy; among them, Time Domain P410 UWB sensors are used to measure the relative distance, and the lidar is an LMS291-series 2D lidar produced by a German company. Based on the relevant parameters of these sensors, shown in Table 1, a simulation model for mobile robots carrying different types of sensors is built in MATLAB.

6.2. Relative Measurement Aided Odometer Collaborative Navigation. In the experiment, all three mobile robots are equipped with odometers capable of motion monitoring, together with UWB capable of measuring relative distance or lidar capable of measuring relative position.

From Figure 11 it can be seen that the collaborative navigation system that realizes relative information sharing has a significant advantage in positioning accuracy over the case of sharing no information. Moreover, the improvement of the group navigation performance of the mobile robots depends on the type of relative information shared. When relative position information is shared, the growth of the error can be effectively limited; in contrast, when relative distance information is shared, the position error still grows slowly, that is, only the growth rate of the error is reduced (see Figure 11).

The analysis shows that the relative distance information is a weak constraint, so sharing this information alone cannot effectively realize the navigation and localization of the mobile robots. In contrast, sharing relative position information contributes directly to the solution of mobile robot navigation, and the accuracy is significantly improved; it can even be increased by more than 60% at some times. This difference is more obvious in the angle error diagram (see Figure 11).

In the description of the measurement model based on relative position, two observation methods are mentioned: direct relative measurement and indirect relative measurement. Based on this experimental scene, in scenario I the three mobile robots observe the relative position information directly through lidar; in scenario II the three mobile robots extract the surrounding landmark information through lidar, and the relative position information is calculated from this solution. In both scenarios the centralized collaborative navigation algorithm is used to solve the navigation problem. The two relative position measurement methods are compared through these simulation scenarios, and the comparison results are shown in Figure 12 (see Figure 12).

Through Figure 12 it is clear that the collaborative navigation and positioning accuracy of relative position measurement using the direct method is better than that of the indirect method. However, the computational cost cannot be ignored while navigation performance is considered. The direct method requires that the measurement range of the lidar cover the activity range of the whole mobile robot group, whereas the measurement range required by the indirect method only needs to cover the surrounding landmarks, which greatly reduces the cost. Considering that the positioning accuracies of the two relative position measurement methods differ little, the indirect method is clearly more suitable for practical application (see Figure 12).

The difference of the decentralized collaborative navigation framework compared with the centralized collaborative navigation framework is that the correlation among

Figure 10: Simulation trajectory diagram (landmarks and trajectory).

Table 1: Relevant parameters of the sensors.

Type       Measure                              Frequency (Hz)   Error
Odometer   Linear velocity of the two wheels    20               4 cm/s
UWB        Relative distance                    10               3 cm
Lidar      Relative position in X direction     10               2 cm
Lidar      Relative position in Y direction     10               2 cm


Figure 11: Comparative diagram of navigation error under the condition of sharing different relative information. (a) Position error. (b) Angle error.

Figure 12: Comparison of navigation errors under different relative position measuring methods. (a) Position error. (b) Angle error.

Figure 13: Comparison of navigation errors under different collaborative navigation algorithms. (a) Position error. (b) Angle error.


the different node states is accurately calculated in the centralized collaborative navigation framework, whereas this correlation cannot be used in the decentralized collaborative navigation framework. In order to better reflect the impact of this correlation, the navigation errors of the two collaborative navigation algorithms in the odometer collaborative navigation system are shown in Figure 13 (see Figure 13).

To compare the two algorithms, 20 experiments are carried out in this paper, and the root mean square error (RMS) of the two collaborative navigation algorithms is calculated as shown in the following formula:

\mathrm{RMS} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(x_i - \hat{x}_i\right)^2}, \qquad (39)

where n is the total number of samples, x_i is the actual value, and \hat{x}_i is the estimated value. RMS parameters for the odometer collaborative navigation are shown in Table 2.
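As an illustration, the RMS computation of (39) can be sketched as follows; the sample values are hypothetical and only show the mechanics of the formula.

```python
import math

def rms_error(actual, estimated):
    """Root mean square error between actual and estimated values, as in (39)."""
    n = len(actual)
    return math.sqrt(sum((a - e) ** 2 for a, e in zip(actual, estimated)) / n)

# Hypothetical position samples (m): actual vs. estimated over 5 epochs.
actual = [0.0, 1.0, 2.0, 3.0, 4.0]
estimated = [0.1, 0.9, 2.2, 2.8, 4.1]
rms = rms_error(actual, estimated)
```

In the experiments reported here the same statistic would be computed per run and averaged over the 20 repetitions.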

As can be seen from Figure 13 and Table 2, the error of the centralized collaborative navigation algorithm is smaller than that of the decentralized collaborative navigation algorithm. This is to be expected, because the correlation among node states can be calculated accurately in the centralized collaborative navigation algorithm, whereas it can only be estimated in the decentralized collaborative navigation algorithm. However, the improved navigation accuracy comes at the expense of high computing power and high-quality data communication. Therefore, although the performance of the centralized collaborative navigation framework is better than that of the distributed collaborative navigation framework, the centralized framework is not applicable in some practical scenarios.

6.3. Odometer/Vision SLAM Collaborative Navigation. In the odometer/vision collaborative navigation model, Scenario I is designed such that all the mobile robots are equipped with an odometer which can monitor the motion, and one of the mobile robots is equipped with a SLAM-aided navigation system and can work properly.

Firstly, the mobile robot with the SLAM-aided navigation system is studied, and it only runs its own integrated navigation algorithm without sharing the relative position information. Using the centralized collaborative navigation algorithm, the navigation error of the node with the SLAM-aided navigation system is shown in Figure 14.

Figure 14 fully verifies the correctness of the centralized collaborative navigation algorithm based on the odometer/vision collaborative navigation model. The SLAM-aided navigation system is based on relative observation: the position estimate of the node itself and the position estimates of the landmarks both accumulate error. When the data association algorithm of SLAM is combined with the centralized collaborative navigation algorithm, the position estimates of the landmarks are brought closer to their true values. As the positioning accuracy of the mobile robot is improved, the data association becomes more reliable, further correcting the state estimate of the mobile robot itself. Therefore, the improvement of the navigation accuracy of the mobile robot by this algorithm is very obvious.

Then the mobile robots without the SLAM-aided navigation system in the experiment are studied. In order to fully reflect the influence of the SLAM-aided navigation information on the navigation performance of other nodes, Scenario II is designed such that all mobile robots are equipped with an odometer which can monitor the motion, and two of them are equipped with a SLAM-aided navigation system and can work properly. The navigation error of the other nodes without the SLAM-aided navigation system is shown in Figure 15.

As shown in Figure 15, the mobile robot with the SLAM-aided navigation system performs loop-closure detection at about 320 seconds and associates its data with the local map created at the initial location, thus eliminating most of the accumulated errors. This uniquely superior performance of the SLAM-aided navigation system is transmitted to the other nodes in the group through information sharing during collaborative navigation, so that they too can eliminate most of their accumulated errors around that time, which is an important advantage of the collaborative navigation system.

To verify the influence of the NN algorithm, the JCBB algorithm, and the optimized data association algorithm on the navigation performance of nodes without a SLAM-aided navigation system, the experimental scene is designed such that all mobile robots are equipped with an odometer which can carry out motion monitoring, one of the mobile robots is equipped with a SLAM-aided navigation system and can work normally, and the CL algorithm is run. The navigation error of the nodes without the SLAM-aided navigation system is shown in Figure 16.

The performance of the centralized collaborative navigation algorithm under the three SLAM data association algorithms is shown in Table 3.

From Figure 16 and Table 3, it can be seen that the navigation performance of nodes without a SLAM-aided navigation system is affected by the SLAM data association algorithm used by the nodes carrying the SLAM-aided navigation system. Running the NN algorithm, the matching accuracy of the feature information is not high, so the navigation accuracy is poor. Running the JCBB algorithm, the correct rate of data association is the highest, but the operation time is the longest. Running the optimized data association algorithm, the navigation accuracy is slightly reduced, but the operation time is shorter, which can meet the real-time requirements.

In this subsection, to compare the performance of the odometer/vision collaborative navigation system under the centralized and

Table 2: Odometer collaborative navigation RMS parameters.

Algorithm type    Position error (m)    Angle error (deg)
CL                0.1629                0.74625
DCL               0.36342               1.3762


Figure 14: The navigation error map of the node with the SLAM-aided navigation system (curves: no information, relative position). (a) Position error. (b) Angle error.

Figure 15: Some nodes are equipped with the SLAM-aided navigation system (curves: two nodes with SLAM, one node with SLAM). (a) Position error. (b) Angle error.

Figure 16: Comparison diagram of navigation error for fusion of single-node SLAM information under different SLAM data association algorithms (JCBB, optimization algorithm, NN). (a) Position error. (b) Angle error.


decentralized collaborative navigation algorithms, we run the CL and DCL algorithms separately under experimental Scenario I. The navigation errors of the two collaborative navigation algorithms are compared as shown in Figure 17. Under experimental Scenario II of this subsection, we run the CL algorithm and the DCL algorithm, respectively; the navigation errors of the two collaborative navigation algorithms are compared as shown in Figure 18.

After 20 experiments, the RMS parameters of collaborative navigation with fused single-node SLAM information are shown in Table 4.

The RMS parameters of collaborative navigation with fused multinode SLAM information are shown in Table 5.

As can be seen from Figures 17 and 18, in conjunction with Tables 4 and 5, in the odometer/vision collaborative navigation system, the error of the centralized collaborative

Table 3: Performance comparison of centralized collaborative navigation algorithms under different SLAM data association algorithms.

Algorithm type    Position error (m)    Angle error (deg)    Relative time
NN                2.8323                10.7919              4
JCBB              0.0322                0.1623               12
Optimization      0.5587                2.2476               1

Figure 17: Comparative diagram of navigation error for fusion of single-node SLAM information under different collaborative navigation algorithms (CL, DCL). (a) Position error. (b) Angle error.

Figure 18: Comparison diagram of navigation error for fusion of multinode SLAM information under different collaborative navigation algorithms (CL, DCL). (a) Position error. (b) Angle error.

Table 4: Collaborative navigation RMS parameters for fusion of single-node SLAM information.

Algorithm type    Position error (m)    Angle error (deg)
CL                0.0322                0.1623
DCL               0.0669                0.2094

Table 5: Collaborative navigation RMS parameters for fusion of multinode SLAM information.

Algorithm type    Position error (m)    Angle error (deg)
CL                0.0243                0.0524
DCL               0.0438                0.1265


navigation algorithm is smaller than that of the distributed collaborative navigation algorithm; after the landmark information collected by a single node or by multiple nodes is fused, there is only a small gap between the two algorithms. In other words, the distributed collaborative navigation algorithm based on the odometer/vision collaborative navigation model can estimate the correlation of the internode information well.

Considering the high demands the centralized collaborative navigation algorithm places on computing power and communication, the application scenarios of the two algorithms are analyzed in combination with the abovementioned collaborative navigation experiments. The centralized collaborative navigation algorithm is suitable for cases with few nodes where the nodes are not equipped with an additional aided navigation system. The decentralized collaborative navigation algorithm is suitable for a large number of nodes sharing a large amount of information, where some nodes are equipped with additional aided navigation systems, especially a SLAM-aided navigation system.

7. Conclusion

In order to improve the performance of the collaborative navigation system, a multirobot collaborative navigation algorithm based on odometer/vision multisource information fusion is studied. On the basis of establishing the multisource information fusion collaborative navigation system model, the centralized and decentralized odometer/vision collaborative navigation frameworks and the vision-based SLAM are given, and the centralized and decentralized odometer/vision collaborative navigation algorithms are derived, respectively. The effectiveness of the proposed algorithms is verified by simulation experiments, which has some theoretical and application value for high-performance collaborative navigation applications.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

References

[1] K. N. Olivier, D. E. Griffith, G. Eagle et al., "Randomized trial of liposomal amikacin for inhalation in nontuberculous mycobacterial lung disease," American Journal of Respiratory and Critical Care Medicine, vol. 195, no. 6, pp. 814–823, 2017.
[2] M. Schwarz, M. Beul, D. Droeschel et al., "DRC team NimbRo Rescue: perception and control for centaur-like mobile manipulation robot Momaro," Springer Tracts in Advanced Robotics, Springer, Berlin, Germany, pp. 145–190, 2018.
[3] M. Long, H. Su, and B. Liu, "Group controllability of two-time-scale discrete-time multi-agent systems," Journal of the Franklin Institute, vol. 357, no. 6, pp. 3524–3540, 2020.
[4] T. Fukuda, S. Nakagawa, Y. Kawauchi, and M. Buss, "Structure decision method for self organising robots based on cell structures—CEBOT," in Proceedings of the 1989 International Conference on Robotics and Automation, Scottsdale, AZ, USA, May 1989.
[5] H. Asama, A. Matsumoto, and Y. Ishida, "Design of an autonomous and distributed robot system: ACTRESS," in Proceedings of the IEEE/RSJ International Workshop on Intelligent Robots and Systems (IROS '89), September 1989.
[6] J. Zhou, Y. Lv, G. Wen, X. Wu, and M. Cai, "Three-dimensional cooperative guidance law design for simultaneous attack with multiple missiles against a maneuvering target," in Proceedings of the 2018 IEEE CSAA Guidance, Navigation and Control Conference (CGNCC), August 2018.
[7] H. Su, J. Zhang, and Z. Zeng, "Formation-containment control of multi-robot systems under a stochastic sampling mechanism," Science China Technological Sciences, vol. 63, no. 6, pp. 1025–1034, 2020.
[8] H. Park and S. Hutchinson, "A distributed robust convergence algorithm for multi-robot systems in the presence of faulty robots," in Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 2980–2985, IEEE, Hamburg, Germany, September–October 2015.
[9] K. Petersen and R. Nagpal, "Complex design by simple robots: a collective embodied intelligence approach to construction," Architectural Design, vol. 87, no. 4, pp. 44–49, 2017.
[10] L. Chaimowicz, T. Sugar, V. Kumar, and M. F. M. Campos, "An architecture for tightly coupled multi-robot cooperation," in Proceedings 2001 ICRA: IEEE International Conference on Robotics and Automation (Cat. no. 01CH37164), vol. 3, pp. 2992–2997, IEEE, Seoul, Korea, May 2001.
[11] H.-X. Hu, G. Chen, and G. Wen, "Event-triggered control on quasi-average consensus in the cooperation-competition network," in Proceedings of the IECON 2018—44th Annual Conference of the IEEE Industrial Electronics Society, October 2018.
[12] A. Amanatiadis, K. Charalampous, I. Kostavelis et al., "The AVERT project: autonomous vehicle emergency recovery tool," in Proceedings of the 2013 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR), pp. 1–5, IEEE, Linkoping, Sweden, October 2013.
[13] R. Kurazume, S. Hirose, T. Iwasaki, S. Nagata, and N. Sashida, "Study on cooperative positioning system," in Proceedings of the IEEE International Conference on Robotics and Automation, Minneapolis, MN, USA, August 1996.
[14] Z. Fu, Y. Zhao, and G. Wen, "Distributed continuous-time optimization in multi-agent networks with undirected topology," in Proceedings of the 2019 IEEE 15th International Conference on Control and Automation (ICCA), November 2019.
[15] Y. Zhao, Y. Liu, and G. Wen, "Finite-time average estimation for multiple double integrators with unknown bounded inputs," in Proceedings of the 2018 33rd Youth Academic Annual Conference of Chinese Association of Automation (YAC), May 2018.
[16] S. Mao, Mobile robot localization in indoor environment, Ph.D. dissertation, Zhejiang University, Hangzhou, China, 2016.
[17] J. Yang, "Analysis approach to odometric non-systematic error uncertainty for mobile robots," Chinese Journal of Mechanical Engineering, vol. 44, no. 8, pp. 7–12, 2008.
[18] J. Kang, F. Zhang, and X. Qu, "Angle measuring error analysis of coordinate measuring system of laser radar," vol. 40, no. 6, pp. 834–839, 2016.
[19] J. Zhang, P. Orlik, Z. Sahinoglu, A. Molisch, and P. Kinney, "UWB systems for wireless sensor networks," Proceedings of the IEEE, vol. 97, no. 2, pp. 313–331.
[20] D. Kaushal and T. Shanmuganantham, "Design of a compact and novel microstrip patch antenna for multiband satellite applications," Materials Today: Proceedings, vol. 5, no. 10, pp. 21175–21182, 2018.
[21] J. Xiucai, Data association problem for simultaneous localization and mapping of mobile robots, Ph.D. dissertation, National University of Defense Technology, Changsha, China, 2008.
[22] Z. Yuan, "Research of mobile robot's SLAM based on binocular vision," Master's thesis, Tianjin University of Technology, Tianjin, China, 2016.
[23] F. Bellavia, M. Fanfani, F. Pazzaglia, and C. Colombo, "Robust selective stereo SLAM without loop closure and bundle adjustment," in Proceedings of the International Conference on Image Analysis and Processing, pp. 462–471, Springer, Naples, Italy, 2013.
[24] H. Fourati, Multisensor Data Fusion: From Algorithms and Architectural Design to Applications, CRC Press, Boca Raton, FL, USA, 2015.
[25] S. Jia, X. Yin, and X. Li, "Mobile robot parallel PF-SLAM based on OpenMP," in Proceedings of the 2012 IEEE International Conference on Robotics and Biomimetics (ROBIO), pp. 508–513, IEEE, Guangzhou, China, December 2012.
[26] W. Zhou, E. Shiju, Z. Cao, and Y. Dong, "Review of SLAM data association study," in Proceedings of the 2016 International Conference on Sensor Network and Computer Engineering, Atlantis Press, Shanghai, China, 2016.
[27] R. Singer and R. Sea, "A new filter for optimal tracking in dense multitarget environments," in Proceedings of the Annual Allerton Conference on Circuit and System Theory, pp. 201–211, Monticello, MN, USA, 1972.
[28] J. Neira and J. D. Tardos, "Data association in stochastic mapping using the joint compatibility test," IEEE Transactions on Robotics and Automation, vol. 17, no. 6, pp. 890–897, 2001.
[29] L. Yanju, X. Yufeng, G. Song, H. Xi, and G. Zhengping, "Research on data association in SLAM based laser sensor," Microcomputer & Its Applications, vol. 36, no. 2, pp. 78–82, 2017.
[30] O. Hlinka, O. Sluciak, F. Hlawatsch, and M. Rupp, "Distributed data fusion using iterative covariance intersection," in Proceedings of the 2014 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 1861–1865, IEEE, Florence, Italy, May 2014.


can be used for data fusion of the information obtained from the multisource heterogeneous sensors of different nodes. There are two kinds of data fusion algorithms in general use: centralized and decentralized data fusion algorithms [24].

3.1. Framework of Collaborative Navigation Algorithm. In this subsection, a data fusion framework for the centralized collaborative navigation algorithm is designed for the general odometer/vision model, and a data fusion framework for the decentralized collaborative navigation algorithm is designed in the same way.

In the centralized collaborative navigation structure, the measured data obtained by each node are concentrated in a data fusion center for fusion. In the decentralized collaborative navigation structure, each node shares some information with other nodes while processing its own sensor data (see Figure 5).

According to the odometer/vision collaborative navigation model, we use the most common EKF algorithms to simulate centralized and decentralized cooperative navigation: a centralized localization algorithm (CL) and a decentralized localization algorithm (DCL) are designed.

As shown in Figure 6, in the CL algorithm corresponding to the centralized cooperative navigation structure, each node sends the information obtained by its own sensors to the central server, which realizes the data fusion through the EKF algorithm. Its state vector is the set of the state vectors of all nodes and is updated by the principle of dead reckoning. After that, the measurement information obtained after data association in the SLAM process and the relative measurement information between nodes are selected. The CL algorithm gathers the state and covariance information of all nodes, corrects them uniformly, and sends the corrected estimation results back to each node. Because all nodes participate in the joint measurement update process, the state information of the nodes is correlated after the first update.

Based on the CL algorithm, this correlation is reasonably estimated and the task of data fusion is distributed to each node; the DCL algorithm is proposed accordingly. Since the positions of the nodes in the algorithm are equivalent, only one node needs to be discussed (see Figure 7). In order to avoid overoptimistic estimation to the greatest extent, the DCL algorithm in this paper introduces the concept of the split covariance intersection filter. The covariance of each node is divided into a correlated term and an uncorrelated term, and the time update process is basically consistent with that of a single node. The measurement update process takes two steps. Firstly, according to the measurement information of the SLAM navigation system, a state estimate is obtained by propagating the state and integrating the related information of the aided navigation system. Then, from the state information sent by the adjacent nodes and the relative measurement information between the nodes, another state estimate of the node is obtained. Here, the relevant state information sent to neighboring nodes is used to update the local maps.
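The split covariance intersection filter used by the DCL algorithm builds on the plain covariance intersection (CI) rule for fusing two estimates whose cross-correlation is unknown. A minimal sketch of plain CI is given below; the numerical values are hypothetical, and the weight is chosen by a simple grid search over the trace of the fused covariance, which is one common criterion rather than the paper's exact procedure.

```python
import numpy as np

def covariance_intersection(x1, P1, x2, P2, w):
    """Fuse two estimates with unknown cross-correlation:
    P^-1 = w*P1^-1 + (1-w)*P2^-1, with w in [0, 1]."""
    P1i, P2i = np.linalg.inv(P1), np.linalg.inv(P2)
    P = np.linalg.inv(w * P1i + (1.0 - w) * P2i)
    x = P @ (w * P1i @ x1 + (1.0 - w) * P2i @ x2)
    return x, P

def fuse_ci(x1, P1, x2, P2, grid=50):
    """Pick the weight minimising the trace of the fused covariance."""
    ws = np.linspace(0.0, 1.0, grid + 1)
    best = min(ws, key=lambda w: np.trace(
        np.linalg.inv(w * np.linalg.inv(P1) + (1 - w) * np.linalg.inv(P2))))
    return covariance_intersection(x1, P1, x2, P2, best)

# Hypothetical 2D position estimates of the same node from two sources.
x1, P1 = np.array([1.0, 2.0]), np.diag([0.5, 2.0])
x2, P2 = np.array([1.2, 1.8]), np.diag([2.0, 0.4])
x, P = fuse_ci(x1, P1, x2, P2)
```

The fused covariance is guaranteed consistent for any weight, which is why CI-type rules are a natural fit when internode correlations cannot be tracked exactly.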

3.2. SLAM Algorithms Based on Vision

3.2.1. Landmark State Estimation Algorithm. The key of the SLAM navigation algorithm lies in the process of data association. The positioning process of this SLAM navigation system is essentially a process of continuous estimation approximating the true value. This kind of probability estimation problem is usually solved by introducing an appropriate filter, most commonly the EKF algorithm (see Figure 8).

Because the odometer selected in this paper has a high sampling frequency, and the lidar has the advantages of high precision and high reliability, the EKF algorithm with better real-time performance is selected. The state estimation process of landmark information in SLAM based on the EKF is described below. The observation equation of the feature information obtained by lidar is as follows:

z_k = h\left(X_{kl}, X_{kr}\right) + n_k = \begin{bmatrix} \left(x_{kl} - x_{kr}\right)\cos\theta_{kr} + \left(y_{kl} - y_{kr}\right)\sin\theta_{kr} \\ -\left(x_{kl} - x_{kr}\right)\sin\theta_{kr} + \left(y_{kl} - y_{kr}\right)\cos\theta_{kr} \end{bmatrix} + n_k, \qquad (7)

where X_{kl} = \left(x_{kl}, y_{kl}\right)^T is the state vector of the landmark at time k and X_{kr} = \left(x_{kr}, y_{kr}, \theta_{kr}\right)^T is the state vector of the mobile robot at

Figure 3: Principle of SDSTWR (timestamps at transceivers A and B).

Figure 4: SLAM model diagram.


time k; n_k is the measurement noise, whose variance matrix is R_k, which can be denoted as n_k \sim N\left(0, R_k\right). Since the landmarks are static, the state estimate of the landmark at time k-1 can be regarded as the a priori estimate of the landmark state at time k. The measurement update process based on the EKF is as follows.

Step 1: calculate the innovation and the filter gain:

v_k = z_k - h\left(X_{k-1,l}, X_{kr}^{-}\right),
K_k = \Sigma_{k-1} H_k^T \left(H_k \Sigma_{k-1} H_k^T + R_k\right)^{-1}. \qquad (8)

Step 2: update the state estimate and the corresponding covariance:

X_{kl} = X_{k-1,l} + K_k v_k,
\Sigma_k = \left(I - K_k H_k\right)\Sigma_{k-1}, \qquad (9)

where \Sigma_k is the covariance matrix of the landmark state estimate at time k and H_k is the measurement matrix at time k.

Remark 1. Any observed landmark can have its position corrected by the above method; note that such position correction is limited to the landmarks in the local map observed at time k.
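The landmark update of (7)-(9) can be sketched as follows; the robot pose is treated as known for this step, as in the model, and the numbers are hypothetical.

```python
import numpy as np

def landmark_update(X_l, Sigma, X_r, z, R):
    """One EKF measurement update of a static landmark, following (7)-(9).
    X_l: prior landmark estimate (x_l, y_l); X_r: robot pose (x_r, y_r, theta);
    z: lidar observation of the landmark in the robot frame."""
    x_l, y_l = X_l
    x_r, y_r, th = X_r
    c, s = np.cos(th), np.sin(th)
    # h(X_l, X_r): landmark position expressed in the robot frame, as in (7).
    h = np.array([(x_l - x_r) * c + (y_l - y_r) * s,
                  -(x_l - x_r) * s + (y_l - y_r) * c])
    # Jacobian of h with respect to the landmark state (x_l, y_l).
    H = np.array([[c, s], [-s, c]])
    v = z - h                                   # innovation, cf. (8)
    S = H @ Sigma @ H.T + R
    K = Sigma @ H.T @ np.linalg.inv(S)          # filter gain, cf. (8)
    X_new = X_l + K @ v                         # state update, cf. (9)
    Sigma_new = (np.eye(2) - K @ H) @ Sigma
    return X_new, Sigma_new

# Hypothetical numbers: robot at the origin facing +x, landmark believed at (2, 1).
X_l, Sigma = np.array([2.0, 1.0]), np.eye(2) * 0.1
z = np.array([2.1, 0.9])                        # observed in the robot frame
X_new, Sigma_new = landmark_update(X_l, Sigma, np.array([0.0, 0.0, 0.0]), z,
                                   np.eye(2) * 0.01)
```

Each landmark in the local map observed at time k would be corrected independently in this way.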

3.2.2. Data Association Algorithm. In a SLAM navigation system, the process of data association is an important prerequisite for state estimation. Incorrect data association is likely to lead to serious deviation of the estimation results [25, 26].

At present, there are two data association algorithms commonly used in SLAM technology: the nearest neighbor (NN) data association algorithm [27] and the joint compatibility branch and bound (JCBB) data association algorithm [28]. The NN algorithm requires less computation, but it easily forms wrong data associations when the density of feature information is large, which leads to divergence of the SLAM results; it is therefore only suitable for environments where the density of feature information and the system error are small. JCBB is an improvement of the NN algorithm, which extends the association of single features in NN to all observed feature information, which

Figure 5: Collaborative navigation structure. (a) Centralized collaborative navigation. (b) Decentralized collaborative navigation.

Figure 6: Data fusion framework of the centralized collaborative navigation algorithm.

Figure 7: Data fusion framework of the decentralized collaborative navigation algorithm.

Figure 8: SLAM flow chart (odometer dead reckoning, perceptual prediction, feature extraction from the external sensing sensor, data association, state estimation, and map update).


is more binding and more reliable. The JCBB algorithm can obtain more credible association hypotheses than the NN algorithm and exclude some wrong ones; however, the amount of computation increases obviously, which to some extent affects the real-time performance of the SLAM navigation system.

To ensure the accuracy of data association in the SLAM process, reduce the amount of computation as much as possible, and enhance the real-time performance of the SLAM algorithm, this subsection describes an optimized data association algorithm. The classification method mentioned in [29] is used to divide the related feature information sets; finally, the appropriate feature information set in the local map and the preprocessed observation feature information set are selected to form the association space.

First, the collection of feature information in the local map is divided as follows:

\begin{cases} D\left[\left(x_m, y_m\right), \left(x_k, y_k\right)\right] \le \Delta d, & \left(x_k, y_k\right) \in F_k, \\ D\left[\left(x_m, y_m\right), \left(x_k, y_k\right)\right] > \Delta d, & \left(x_k, y_k\right) \notin F_k, \end{cases} \qquad (10)

where D\left[\left(x_m, y_m\right), \left(x_k, y_k\right)\right] is the relative distance between the feature information \left(x_k, y_k\right) of the local map and other feature information \left(x_m, y_m\right).

Then the observation feature information set is preprocessed and divided. In the actual navigation process, the observation feature information obtained by lidar contains noise. The purpose of preprocessing is to filter out some of the noise, improve the accuracy of data association, and reduce the amount of computation at the same time. The judgment process is as follows:

f(i, j) = \begin{cases} 1, & D\left[\left(x_i, y_i\right), \left(x_j, y_j\right)\right] < \Delta D, \\ 0, & D\left[\left(x_i, y_i\right), \left(x_j, y_j\right)\right] \ge \Delta D, \end{cases} \qquad (11)

where \Delta D is a threshold determined by the performance of the laser sensor. When the relative distance between two pieces of observation feature information is less than the threshold, the observation feature information is considered a feature point; otherwise it is treated as a noise point and does not participate in the subsequent calculation.

When the set is divided, the set of observed feature information is sorted according to the order of observation. Following the division process of the local map feature information set above, the subsets are divided in turn, and no point participates in the division repeatedly.

Finally, we select the appropriate association sets to execute the data association algorithm. Each subset of local map feature information and the subset of observed feature information at the current time undergo a joint compatibility test, and the feature information with the best test results is selected to form a new subset as the data association object.
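The distance-based division of (10) and the noise test of (11) can be sketched as follows. The grouping strategy (seeding each subset from the first unassigned point) and the thresholds are illustrative assumptions; the paper only specifies the distance tests themselves.

```python
import math

def filter_noise(observations, delta_D):
    """Keep only observations that have a neighbour closer than delta_D,
    mirroring the 0/1 test in (11); isolated points are treated as noise."""
    return [p for p in observations
            if any(0 < math.dist(p, q) < delta_D for q in observations)]

def split_sets(features, delta_d):
    """Divide feature points into subsets: points within delta_d of the current
    seed point join its subset, as in (10); each point is assigned only once."""
    remaining = list(features)
    subsets = []
    while remaining:
        seed = remaining.pop(0)
        group, keep = [seed], []
        for p in remaining:
            (group if math.dist(seed, p) <= delta_d else keep).append(p)
        remaining = keep
        subsets.append(group)
    return subsets

# Hypothetical lidar returns: a cluster near the origin plus two isolated points.
obs = [(0.0, 0.0), (0.2, 0.1), (5.0, 5.0), (0.3, -0.1), (9.0, 9.0)]
clean = filter_noise(obs, 0.5)     # drops the two isolated noise points
groups = split_sets(clean, 1.0)    # one compact subset remains
```

Each resulting subset would then be tested jointly against the corresponding local-map subset, keeping the JCBB-style test small.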

4. Centralized Collaborative Navigation Algorithm

4.1. Time Update. First of all, the state space model should be established. The state vector of a single mobile robot with three degrees of freedom contains position and heading angle information. Suppose the number of nodes is N. The state space of the collaborative navigation system in the centralized framework contains the state vectors of all mobile robots in the group; the state vector of mobile robot i is X_k^i, and the state of the system is X_k. Then the state space equation of the system can be expressed as follows:

X_k = \begin{bmatrix} X_k^1 \\ X_k^2 \\ \vdots \\ X_k^N \end{bmatrix} = \begin{bmatrix} f_1\left(X_{k-1}^1, u_{k-1}^1\right) \\ f_2\left(X_{k-1}^2, u_{k-1}^2\right) \\ \vdots \\ f_N\left(X_{k-1}^N, u_{k-1}^N\right) \end{bmatrix} + \begin{bmatrix} w_{k-1}^1 \\ w_{k-1}^2 \\ \vdots \\ w_{k-1}^N \end{bmatrix} \triangleq \Phi\left(X_{k-1}, u_{k-1}\right) + w_{k-1}, \qquad (12)

where the function f_i(X, u) describes the kinematic characteristics of the mobile robot, u_{k-1}^i = \left[\Delta S_{r,k-1}, \Delta S_{l,k-1}\right]^T represents the input required by mobile robot i for dead reckoning at time k, and w_{k-1}^i is the system noise, with w_{k-1}^i \sim N\left(0, Q_{k-1}^i\right).

It is assumed that the motion of any node is not affected by any other node and that each node moves independently without being controlled by other nodes. Therefore, the state transition matrix for centralized collaborative positioning is given by

F_{k-1} = \begin{bmatrix} J_{X(k-1)}^1 & 0 & \cdots & 0 \\ 0 & J_{X(k-1)}^2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & J_{X(k-1)}^N \end{bmatrix}, \qquad (13)

where J_{u(k-1)}^i and J_{X(k-1)}^i are the Jacobian matrices of function f with respect to the control inputs and the state vectors, respectively. The system noise variance matrix of the collaborative navigation system in the centralized framework is as follows:

Q_{k-1} = \begin{bmatrix} Q_{k-1}^1 & 0 & \cdots & 0 \\ 0 & Q_{k-1}^2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & Q_{k-1}^N \end{bmatrix}, \qquad (14)

where Q_{k-1}^i = J_{u(k-1)}^i \Sigma_u J_{u(k-1)}^{iT} and \Sigma_u is the covariance matrix of the control input. Then the time update process of the collaborative navigation system in the centralized framework can be deduced:


\hat{X}_k^- = \Phi\left(\hat{X}_{k-1}^+, u_{k-1}\right),
P_k^- = F_{k-1} P_{k-1} F_{k-1}^T + Q_{k-1} \triangleq \begin{bmatrix} P_{11,k}^- & P_{12,k}^- & \cdots & P_{1N,k}^- \\ P_{21,k}^- & P_{22,k}^- & \cdots & P_{2N,k}^- \\ \vdots & \vdots & \ddots & \vdots \\ P_{N1,k}^- & P_{N2,k}^- & \cdots & P_{NN,k}^- \end{bmatrix}, \qquad (15)

where

P_{ii,k}^- = J_{X(k-1)}^i P_{ii,k-1} J_{X(k-1)}^{iT} + Q_{k-1}^i,
P_{ij,k}^- = J_{X(k-1)}^i P_{ij,k-1} J_{X(k-1)}^{jT}. \qquad (16)
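The block structure of (13)-(16) can be sketched as follows. The differential-drive model f and its Jacobians below are a common wheel-increment approximation, not taken verbatim from the paper, and the wheel base b is an assumed parameter.

```python
import numpy as np

def time_update(X, P, inputs, f, jac_x, jac_u, Sigma_u):
    """Centralized time update for N independent nodes, following (12)-(16).
    X: list of per-node 3D states; P: full 3N x 3N covariance with cross blocks."""
    N = len(X)
    F = np.zeros_like(P)
    Q = np.zeros_like(P)
    X_pred = []
    for i in range(N):
        sl = slice(3 * i, 3 * i + 3)
        X_pred.append(f(X[i], inputs[i]))
        F[sl, sl] = jac_x(X[i], inputs[i])       # J^i_X on the diagonal, cf. (13)
        Ju = jac_u(X[i], inputs[i])
        Q[sl, sl] = Ju @ Sigma_u @ Ju.T          # Q^i = J_u Sigma_u J_u^T, cf. (14)
    return X_pred, F @ P @ F.T + Q               # (15); cross blocks follow (16)

b = 0.5  # assumed wheel base (m)

def f(x, u):                                     # dead reckoning with (dSr, dSl)
    ds, dth = (u[0] + u[1]) / 2.0, (u[0] - u[1]) / b
    return np.array([x[0] + ds * np.cos(x[2]), x[1] + ds * np.sin(x[2]), x[2] + dth])

def jac_x(x, u):
    ds = (u[0] + u[1]) / 2.0
    return np.array([[1, 0, -ds * np.sin(x[2])], [0, 1, ds * np.cos(x[2])], [0, 0, 1]])

def jac_u(x, u):
    c, s = np.cos(x[2]), np.sin(x[2])
    return np.array([[c / 2, c / 2], [s / 2, s / 2], [1 / b, -1 / b]])

X = [np.zeros(3), np.array([1.0, 0.0, 0.0])]     # two nodes
P = np.eye(6) * 0.01
X_pred, P_pred = time_update(X, P, [np.array([0.1, 0.1])] * 2,
                             f, jac_x, jac_u, np.eye(2) * 1e-4)
```

Because F and Q are block diagonal, only the cross-covariance blocks couple the nodes, exactly as (16) states.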

4.2. Single-Node Measurement Update. In this section, the measurement update process involving only one node in the centralized framework is described. The aided navigation system selected is the SLAM navigation system, which integrates the landmark information of the surrounding environment measured by lidar. In this paper, a measurement model based on this navigation system is built, and the process of measurement updating based on the EKF is described.

4.2.1. Measurement Model Based on SLAM. The measurement model based on SLAM is the measurement model after data association. In this paper, the position information of landmarks obtained by lidar is taken as the observation equation:

z_k^i = \begin{bmatrix} x_l^b \\ y_l^b \end{bmatrix} = \begin{bmatrix} \left(x_l^w - x_k^i\right)\cos\theta_k^i + \left(y_l^w - y_k^i\right)\sin\theta_k^i \\ -\left(x_l^w - x_k^i\right)\sin\theta_k^i + \left(y_l^w - y_k^i\right)\cos\theta_k^i \end{bmatrix} + n_k^i, \qquad (17)

where \left(x_l^b, y_l^b\right) is the position information of the landmark obtained by lidar, \left(x_l^w, y_l^w\right) are the coordinates of the landmark in the world coordinate system, \left(x_k^i, y_k^i, \theta_k^i\right) is the state of the mobile robot at time k, and n_k^i is the measurement noise, whose variance matrix is R_k^i, which can be denoted as n_k^i \sim N\left(0, R_k^i\right). After linearization and state extension, the observation equation of the whole system can be obtained:

z_k^i = H_k^i X_k + h^i\left(X_k^{i-}\right) - \nabla h^i X_k^{i-} + n_k^i, \qquad (18)

where

H_k^i = \left[0 \ \cdots \ \nabla h^i \ \cdots \ 0\right]_{2 \times 3N}, \qquad (19)

and \nabla h^i is the Jacobian matrix of function h^i\left(X_k^i\right).

4.2.2. Measurement Update Based on EKF. Combined with the basic principle of the Kalman filter, the measurement update process of the aided navigation system for a single node can be obtained as follows.

Step 1: calculate the innovation and the filter gain:

\[
\begin{aligned}
v &= z_k^i - h^i\big(X_k^{i-}\big), \\
S^i &= H_k^i P_k^- \big(H_k^i\big)^T + R_k^i, \\
K^i &= P_k^- \big(H_k^i\big)^T \big(S^i\big)^{-1}.
\end{aligned}
\tag{20}
\]

Step 2: update the state estimation and the corresponding covariance:

\[
\begin{aligned}
X_k^+ &= X_k^- + K^i v, \\
P_k &= P_k^- - P_k^- \big(H_k^i\big)^T \big(S^i\big)^{-1} H_k^i P_k^-.
\end{aligned}
\tag{21}
\]
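The two steps in (20) and (21) can be sketched for a scalar state and a scalar measurement. This is a deliberate simplification for illustration, not the paper's full 3N-dimensional filter, and the function name is an assumption:

```python
def ekf_update(x_prior, P_prior, z, h, H, R):
    """Scalar EKF measurement update implementing equations (20)-(21)."""
    v = z - h(x_prior)                 # innovation
    S = H * P_prior * H + R            # innovation covariance
    K = P_prior * H / S                # filter gain
    x_post = x_prior + K * v           # state update
    P_post = P_prior - P_prior * H / S * H * P_prior  # covariance update
    return x_post, P_post
```

With prior x = 0, P = 4, H = 1, R = 1 and measurement z = 1, the innovation is 1, the gain is 0.8, and the update yields x = 0.8 and P = 0.8, showing how the measurement shrinks the covariance.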

4.3. Relative Measurement Update among Nodes. The standard observation model can be divided into two types: the measurement model based on relative distance and the measurement model based on relative position.

4.3.1. Measurement Model Based on Relative Distance. The observation of mobile robot \(i\) to mobile robot \(j\) at time \(k\) can be denoted by \(z_k^{ij}\); then the observation equation is given by

\[
z_k^{ij} = h^{ij}\big(X_k^i, X_k^j\big) + n_k^{ij}
= \sqrt{\big(x_k^i - x_k^j\big)^2 + \big(y_k^i - y_k^j\big)^2} + n_k^{ij}
\tag{22}
\]

where \(n_k^{ij}\) is the measurement noise, its variance matrix is \(R_k^{ij} = \sigma_{\mathrm{UWB}}\), denoted as \(n_k^{ij} \sim N(0, R_k^{ij})\), and \(\sigma_{\mathrm{UWB}}\) is the variance of the UWB ranging.

After linearization and state extension, the observation equation of the whole system can be obtained:

\[
z_k^{ij} = H_k^{ij} X_k + h^{ij}\big(X_k^{i-}, X_k^{j-}\big) - \nabla h_i^{ij} X_k^{i-} - \nabla h_j^{ij} X_k^{j-} + n_k^{ij}
\tag{23}
\]

where

\[
H_k^{ij} = \big[\, 0 \;\cdots\; \nabla h_i^{ij} \;\cdots\; \nabla h_j^{ij} \;\cdots\; 0 \,\big]_{2\times 3N}
\tag{24}
\]

and \(\nabla h_i^{ij}\) and \(\nabla h_j^{ij}\) are the Jacobian matrices of the function \(h^{ij}(X^i, X^j)\) with respect to \(X^i\) and \(X^j\), respectively.

4.3.2. Measurement Model Based on Relative Position. Using lidar as the sensor, the relative observation among nodes can be realized in two ways: the direct method and the indirect method. The direct method measures the relative position between the two nodes directly; the indirect method uses lidar to observe the landmarks nearest to the two nodes, and the relative position between the two nodes is obtained by the corresponding calculation.

The state of mobile robot \(i\) at time \(k\) is denoted by \((x_k^i, y_k^i, \theta_k^i)\), and the state of mobile robot \(j\) by \((x_k^j, y_k^j, \theta_k^j)\). The coordinates of landmark \(L_1\) adjacent to

Mathematical Problems in Engineering 7

mobile robot \(i\) in the world coordinate system are \((x_{l1}^w, y_{l1}^w)\), and its coordinates in the mobile robot \(i\) coordinate system are \((x_{l1}^i, y_{l1}^i)\). The coordinates of landmark \(L_2\) adjacent to mobile robot \(j\) in the world coordinate system are \((x_{l2}^w, y_{l2}^w)\), and its coordinates in the mobile robot \(j\) coordinate system are \((x_{l2}^j, y_{l2}^j)\). The specific solution process of the indirect method is as follows (see Figure 9):

\[
\begin{bmatrix} x_k^j - x_k^i \\[2pt] y_k^j - y_k^i \end{bmatrix}
=
\begin{bmatrix} x_{l1}^i\cos\theta_k^i - y_{l1}^i\sin\theta_k^i \\[2pt] x_{l1}^i\sin\theta_k^i + y_{l1}^i\cos\theta_k^i \end{bmatrix}
-
\begin{bmatrix} x_{l2}^j\cos\theta_k^j - y_{l2}^j\sin\theta_k^j \\[2pt] x_{l2}^j\sin\theta_k^j + y_{l2}^j\cos\theta_k^j \end{bmatrix}
+
\begin{bmatrix} x_{l2}^w - x_{l1}^w \\[2pt] y_{l2}^w - y_{l1}^w \end{bmatrix}
\tag{25}
\]

When mobile robot \(i\) observes mobile robot \(j\) at time \(k\), the coordinates of mobile robot \(j\) in the mobile robot \(i\) coordinate system are taken as the observation. The observation equation is as follows:

\[
z_k^{ij} =
\begin{bmatrix} x_k^{ji} \\[2pt] y_k^{ji} \end{bmatrix}
=
\begin{bmatrix}
\big(x_k^j - x_k^i\big)\cos\theta_k^i + \big(y_k^j - y_k^i\big)\sin\theta_k^i \\[2pt]
-\big(x_k^j - x_k^i\big)\sin\theta_k^i + \big(y_k^j - y_k^i\big)\cos\theta_k^i
\end{bmatrix}
+ n_k^{ij}
\tag{26}
\]

where \(n_k^{ij}\) is the measurement noise, its variance matrix is \(R_k^{ij}\), denoted as \(n_k^{ij} \sim N(0, R_k^{ij})\), and \((x_k^{ji}, y_k^{ji})\) is the coordinate of mobile robot \(j\) in the coordinate system of mobile robot \(i\) at time \(k\).

4.3.3. Measurement Update Based on EKF. Similarly, the measurement update process for the relative observation between nodes can finally be obtained.

5. Decentralized Collaborative Navigation Algorithm

The state and covariance information of each node under the decentralized collaborative navigation algorithm is calculated separately. In order to avoid overoptimal estimation to the maximum extent, the concept of the covariance intersection filter is introduced, and the covariance of each node is divided into correlated and independent terms.

5.1. Covariance Intersection Filter. Given the state estimation vector \(X\) and the corresponding covariance matrix \(P\), and assuming that \(P^*\) is the covariance of the error between the state estimate \(X\) and the true state \(X^*\), it can be expressed as follows:

\[
P^* = E\big[(X - X^*)(X - X^*)^T\big]
\tag{27}
\]

Consistency is a characteristic of the covariance matrix of the estimation [30]. When the covariance matrix of the state estimation is not less than the real covariance, the estimation is said to satisfy consistency; that is, no overoptimal estimation is produced. Suppose two state estimates X1 and X2 are independent and satisfy consistency, with corresponding covariances P1 and P2. If there is a correlation between the two estimates, the Kalman filter may produce inconsistent results; in other words, it leads to overoptimal estimation. The covariance intersection filter avoids this by splitting each covariance into dependent and independent parts and fusing them as follows:

\[
P_1 = \frac{P_{1d}}{w} + P_{1i}
\tag{28}
\]

\[
P_2 = \frac{P_{2d}}{1 - w} + P_{2i}
\tag{29}
\]

\[
P^{-1} = P_1^{-1} + P_2^{-1}
\tag{30}
\]

\[
X = P\big(P_1^{-1} X_1 + P_2^{-1} X_2\big)
\tag{31}
\]

\[
P_i = P\big(P_1^{-1} P_{1i} P_1^{-1} + P_2^{-1} P_{2i} P_2^{-1}\big)P
\tag{32}
\]

\[
P_d = P - P_i
\tag{33}
\]

Figure 9: Indirect observation schematic diagram.


where the covariances corresponding to the two state estimates X1 and X2 are P1d + P1i and P2d + P2i, respectively. P1d and P2d are the correlated covariance components corresponding to the maximum correlation between the two state estimates; P1i and P2i are the independent covariance components corresponding to absolute independence of the two state estimates. The weight w lies in the interval [0, 1]; it is an optimization parameter that minimizes the covariance after fusion, and any w in this interval ensures the consistency of the fusion result.
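The fusion rule (28)–(33) can be illustrated with scalar covariances (a deliberate one-dimensional reduction of the paper's matrix formulas; function names are assumptions), including a simple grid search for the weight w:

```python
def split_ci_fuse(x1, p1d, p1i, x2, p2d, p2i, w):
    """Scalar split covariance intersection, equations (28)-(33)."""
    P1 = p1d / w + p1i                        # (28) inflate dependent part
    P2 = p2d / (1.0 - w) + p2i                # (29)
    P = 1.0 / (1.0 / P1 + 1.0 / P2)           # (30) fused covariance
    x = P * (x1 / P1 + x2 / P2)               # (31) fused state
    Pi = P * (p1i / P1**2 + p2i / P2**2) * P  # (32) independent component
    return x, P, Pi, P - Pi                   # (33) dependent component

def best_w(x1, p1d, p1i, x2, p2d, p2i, steps=99):
    """Grid search for the w in (0, 1) minimizing the fused covariance."""
    candidates = [(k + 1) / (steps + 1) for k in range(steps)]
    return min(candidates,
               key=lambda w: split_ci_fuse(x1, p1d, p1i, x2, p2d, p2i, w)[1])
```

For two symmetric estimates (p1d = p2d = p1i = p2i = 1) the optimal weight is w = 0.5, and the fused covariance 1.5 splits into an independent part 0.5 and a dependent part 1.0, so Pd + Pi = P as (33) requires.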

5.2. Time Update. Before describing the time update process of the DCL algorithm, it is necessary to decompose the state information of the system under the centralized collaborative navigation framework, which can be expressed as

\[
E_G = \{X_G, P_G\}
= \big\{\{X_1, P_1\}, \{X_2, P_2\}, \ldots, \{X_N, P_N\}\big\}
= \big\{\{X_1, P_{1d} + P_{1i}\}, \{X_2, P_{2d} + P_{2i}\}, \ldots, \{X_N, P_{Nd} + P_{Ni}\}\big\}
\tag{34}
\]

where \(E_G\) is the set of states under the centralized collaborative navigation framework, and \(X_G\) and \(P_G\) are the state space and the corresponding covariance matrix under the centralized collaborative navigation framework, respectively.

The state propagation process under the decentralized collaborative navigation framework is the state propagation process of a single node, and the propagation of the covariance can be expressed as

\[
\begin{aligned}
P_k^{i-} &= J_{X(k-1)}^i P_{k-1}^i \big(J_{X(k-1)}^i\big)^T + J_{u(k-1)}^i \Sigma_u \big(J_{u(k-1)}^i\big)^T, \\
P_{ki}^{i-} &= J_{X(k-1)}^i P_{(k-1)i}^i \big(J_{X(k-1)}^i\big)^T + J_{u(k-1)}^i \Sigma_u \big(J_{u(k-1)}^i\big)^T,
\end{aligned}
\tag{35}
\]

where \(P_k^{i-}\) is the one-step prediction covariance matrix of mobile robot \(i\) at time \(k\), and \(P_{ki}^{i-}\) is its independent covariance component. \(J_{X(k-1)}^i\) and \(J_{u(k-1)}^i\) are the Jacobian matrices of the function \(f^i(X, u)\) with respect to the state vector and the control input, and \(\Sigma_u\) is the error matrix of the control input.

5.3. Single-Node Measurement Update. The measurement updating process of a single node only involves the aided navigation system of that node, so there is no need to estimate the correlation; that is, the decomposition of formulas (28) and (29) is not required. Similar to the single-node measurement update in centralized collaborative navigation, the single-node measurement update in decentralized collaborative navigation can be expressed as follows.

Step 1: calculate the innovation and the filter gain:

\[
\begin{aligned}
v &= z_k^i - \nabla h^i X_k^{i-}, \\
S &= \nabla h^i P_k^{i-} \big(\nabla h^i\big)^T + R_k^i, \\
K &= P_k^{i-} \big(\nabla h^i\big)^T S^{-1}.
\end{aligned}
\tag{36}
\]

Step 2: update the state estimation and the corresponding covariance:

\[
\begin{aligned}
X_k^{i+} &= X_k^{i-} + K v, \\
P_k^i &= \big(I - K\nabla h^i\big) P_k^{i-}, \\
P_{ki}^i &= \big(I - K\nabla h^i\big) P_{ki}^{i-} \big(I - K\nabla h^i\big)^T + K R_k^i K^T, \\
P_{kd}^i &= P_k^i - P_{ki}^i.
\end{aligned}
\tag{37}
\]
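Extending the earlier scalar EKF sketch to the decentralized case of (36) and (37), the update also propagates the independent covariance component; the measurement noise, being local to the node, is added to the independent part. This is an illustrative scalar reduction with an assumed function name:

```python
def dcl_single_node_update(x, P, P_indep, z, H, R):
    """Scalar version of equations (36)-(37) with split covariance."""
    v = z - H * x                      # innovation (linearized observation)
    S = H * P * H + R                  # innovation covariance
    K = P * H / S                      # filter gain
    x_post = x + K * v
    P_post = (1.0 - K * H) * P
    P_indep_post = (1.0 - K * H) ** 2 * P_indep + K * R * K  # independent part
    return x_post, P_post, P_indep_post, P_post - P_indep_post
```

With P = 4, P_indep = 2, H = 1, R = 1, z = 1, and x = 0, the gain is 0.8, giving P = 0.8 split into an independent part 0.72 and a dependent part 0.08.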

5.4. Collaborative Measurement Update among Nodes. In the framework of decentralized collaborative navigation, the state estimation results of a single-node aided navigation system and the state estimation results based on information sharing among nodes are integrated in the process of internode collaborative measurement updating, and the corrected state information is derived.

Under the decentralized collaborative navigation framework, any node can explicitly estimate the state of other nodes. In order to save communication cost and reduce the computation on a single mobile robot platform, this paper assumes that information exchange takes place only between two adjacent mobile robot nodes.

Assuming that mobile robot \(i\) performs a relative observation of mobile robot \(j\) at time \(k\) and shares its own state and covariance information with mobile robot \(j\), the state of mobile robot \(j\) can be expressed from the received state of mobile robot \(i\) and the relative measurement between the two nodes:

\[
X_{jk}^{co} =
\begin{bmatrix} x_{jk}^{co} \\[2pt] y_{jk}^{co} \end{bmatrix}
=
\begin{bmatrix}
x_k^i + x_k^{ji}\cos\theta_k^i - y_k^{ji}\sin\theta_k^i \\[2pt]
y_k^i + x_k^{ji}\sin\theta_k^i + y_k^{ji}\cos\theta_k^i
\end{bmatrix}
\tag{38}
\]

where \((x_{jk}^{co}, y_{jk}^{co})\) is the partial state estimate of mobile robot \(j\) obtained through the information sharing between mobile robot \(i\) and mobile robot \(j\) at time \(k\), \((x_k^i, y_k^i, \theta_k^i)\) is the state vector shared by mobile robot \(i\), and \((x_k^{ji}, y_k^{ji})\) is the relative measurement between the two nodes expressed in the coordinate system of mobile robot \(i\).
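The shared-state reconstruction in (38) is simply a frame change of the relative measurement; a short sketch with an illustrative function name:

```python
import math

def shared_position(pose_i, rel_meas_i):
    """Equation (38): position of robot j implied by robot i's state and the
    relative measurement (x_k^ji, y_k^ji) expressed in robot i's frame."""
    x, y, th = pose_i
    rx, ry = rel_meas_i
    return (x + rx * math.cos(th) - ry * math.sin(th),
            y + rx * math.sin(th) + ry * math.cos(th))
```

For example, robot i at (1, 1) heading pi/2 that observes robot j two meters straight ahead implies that robot j is at (1, 3) in the world frame.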

If there is a direct relative observation between the two nodes, the relative measurement can be obtained directly by the sensor performing the observation. If the relative observation between the two nodes relies on indirect observation of the surrounding landmarks, the relative measurement must first be solved; the concrete solution can be obtained by combining (25) and then converting into the coordinate system of mobile robot \(i\).

Finally, based on the principle of the covariance intersection filter, the updating process of collaborative measurement among nodes in the framework of decentralized collaborative navigation can be obtained.


6. Simulation Results

6.1. Simulated Experimental Environment. In this section, the mobile robot network involved in collaborative navigation has 3 nodes. The 2D environment in which the robots move is 25 m × 25 m. When the mobile robot group works together, each node is assigned an initial position, and each node can follow a random trajectory in this area. It is assumed that all nodes follow the same simulation trajectory, differing only in the initial position. The maximum speed of a mobile robot on the straight segments is 0.3125 m/s, and the angular velocity at the bends is 0.1 deg/s. It is assumed that 88 landmarks (environmental feature points) around the simulated rectangular trajectory can be extracted by lidar scanning for the SLAM-aided navigation system (see Figure 10).

During this simulation, the mobile robots can carry different types of sensors, including odometers, UWB, and lidar. Suitable sensors are selected according to the requirements of positioning accuracy; among them, Time Domain P410 UWB sensors are used to measure the relative distance, and the lidar is selected from the LMS291 series of 2D lidars produced by a German company. Based on the relevant parameters of these sensors, which are shown in Table 1, a simulation model of mobile robots carrying the different sensor types is built using MATLAB.

6.2. Relative Measurement Aided Odometer Collaborative Navigation. In the experiment, all three mobile robots are equipped with an odometer capable of motion monitoring, UWB capable of measuring relative distance, or lidar capable of measuring relative position.

From Figure 11, it can be seen that the collaborative navigation system that realizes relative information sharing has significant advantages in positioning accuracy over the case of not sharing any information. Besides, the improvement of the group navigation performance depends on the type of relative information shared. When relative position information is shared, the growth of the error can be effectively limited; in contrast, when relative distance information is shared, the position error still grows slowly, and only the growth rate of the error is reduced (see Figure 11).

The analysis shows that relative distance information is a weak constraint, so sharing this information cannot effectively realize the navigation and localization of the mobile robots. In contrast, sharing relative position information contributes directly to the mobile robot navigation solution, and the accuracy is significantly improved; at times it can even be increased by more than 60%. This difference is more obvious in the angle error diagram (see Figure 11).

In this paper, two observation methods, direct relative measurement and indirect relative measurement, are mentioned in the description of the measurement model based on relative position. Based on this experimental scene, in scenario I the three mobile robots observe the relative position information directly through lidar; in scenario II the three mobile robots extract the surrounding landmark information through lidar, and the relative position information is calculated from it. In both scenarios, the centralized collaborative navigation algorithm is used to solve the navigation problem. The two relative position measurement methods are compared through these simulation scenarios, and the comparison results are shown in Figure 12 (see Figure 12).

Through Figure 12, it is clear that the collaborative navigation and positioning accuracy of the direct relative position measurement is better than that of the indirect method. However, the computational cost cannot be ignored while navigation performance is considered. The direct method requires the measurement range of the lidar to cover the activity range of the whole mobile robot group, while the measurement range required by the indirect method only needs to include the surrounding landmarks, which greatly reduces the cost. Considering that the accuracy of the two relative position measurement methods is not very different, the indirect method is clearly more suitable for practical application (see Figure 12).

The difference between the decentralized and the centralized collaborative navigation framework is that the correlation among

Figure 10: Simulation trajectory diagram.

Table 1: Relevant parameters of sensors

Type     | Measure                           | Frequency (Hz) | Error
Odometer | Linear velocity of the two wheels | 20             | 4 cm/s
UWB      | Relative distance                 | 10             | 3 cm
Lidar    | Relative position in X direction  | 10             | 2 cm
Lidar    | Relative position in Y direction  | 10             | 2 cm


Figure 11: Comparative diagram of navigation error under the condition of sharing different relative information. (a) Position error. (b) Angle error.

Figure 12: Comparison of navigation errors under different relative position measuring methods. (a) Position error. (b) Angle error.

Figure 13: Comparison of navigation errors under different collaborative navigation algorithms. (a) Position error. (b) Angle error.


the different node states is accurately calculated in the centralized collaborative navigation framework, while this correlation cannot be used in the decentralized collaborative navigation framework. In order to better reflect the impact of this correlation, the navigation errors of the two collaborative navigation algorithms in the odometer collaborative navigation system are shown in Figure 13 (see Figure 13).

To compare the two algorithms, 20 experiments are carried out in this paper, and the root mean square error (RMS) of the two collaborative navigation algorithms is calculated as shown in the following formula:

\[
\mathrm{RMS} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\big(x_i - \hat{x}_i\big)^2}
\tag{39}
\]

where \(n\) is the total number of samples, \(x_i\) is the actual value, and \(\hat{x}_i\) is the estimated value. The RMS parameters for the odometer collaborative navigation are shown in Table 2.
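Equation (39) is the usual root mean square error; a short sketch of how the values in the tables would be computed from logged actual and estimated trajectories (the function name is illustrative):

```python
import math

def rms(actual, estimated):
    """Root mean square error of equation (39)."""
    n = len(actual)
    return math.sqrt(sum((a - e) ** 2 for a, e in zip(actual, estimated)) / n)
```

For instance, a constant estimation error of 1 m at every sample gives an RMS of exactly 1.0.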

As can be seen from Figure 13 and Table 2, the error of the centralized collaborative navigation algorithm is smaller than that of the decentralized collaborative navigation algorithm. This is expected, because the correlation among node states can be calculated accurately in the centralized algorithm, whereas it is only estimated in the decentralized algorithm. However, the improved navigation accuracy comes at the expense of high computing power and high-quality data communication. Therefore, although the performance of the centralized collaborative navigation framework is better than that of the decentralized framework, the centralized framework is not applicable in some practical scenarios (see Figure 13).

6.3. Odometer/Vision SLAM Collaborative Navigation. In the odometer/vision collaborative navigation model, scenario I is designed such that all the mobile robots are equipped with an odometer capable of motion monitoring, and one of the mobile robots is equipped with a properly working SLAM-aided navigation system.

Firstly, the mobile robot with the SLAM-aided navigation system is studied; it only runs its own integrated navigation algorithm, without sharing relative position information. Using the centralized collaborative navigation algorithm, the navigation error of the node with the SLAM-aided navigation system is shown in Figure 14 (see Figure 14).

Figure 14 fully verifies the correctness of the centralized collaborative navigation algorithm based on the odometer/vision collaborative navigation model. The SLAM-aided navigation system is based on relative observation, so both the position estimate of the node itself and the position estimates of the landmarks accumulate error. When the data association algorithm of SLAM is combined with the centralized collaborative navigation algorithm, however, the landmark position estimates move closer to their real values as the positioning accuracy of the mobile robot improves, the data association becomes more reliable, and the state estimate of the mobile robot itself is further corrected. Therefore, the algorithm improves the navigation accuracy of the mobile robot very markedly (see Figure 14).

Then, the mobile robots without the SLAM-aided navigation system are studied. In order to fully reflect the influence of the SLAM-aided navigation information on the navigation performance of the other nodes, scenario II is designed such that all mobile robots are equipped with an odometer for motion monitoring and two of them are equipped with a properly working SLAM-aided navigation system. The navigation error of the other nodes without a SLAM-aided navigation system is shown in Figure 15 (see Figure 15).

As shown in Figure 15, the mobile robot with the SLAM-aided navigation system performs loop closure detection at about 320 seconds and associates its data with the local map created at the initial location, thus eliminating most of the accumulated error. This unique superior performance of the SLAM-aided navigation system is transmitted to the other nodes in the group through information sharing during collaborative navigation, so that they can also eliminate most of their accumulated error around that time, which is an important advantage of the collaborative navigation system (see Figure 15).

To verify the influence of the NN algorithm, the JCBB algorithm, and the optimized data association algorithm on the navigation performance of nodes without a SLAM-aided navigation system, the experimental scene is designed such that all mobile robots are equipped with an odometer for motion monitoring, one of the mobile robots is equipped with a properly working SLAM-aided navigation system, and the CL algorithm is run. The navigation error of the nodes without a SLAM-aided navigation system is shown in Figure 16 (see Figure 16).

The performance of the centralized collaborative navigation algorithm under the three SLAM data association algorithms is shown in Table 3.

From Figure 16 and Table 3, it can be seen that the navigation performance of the nodes without a SLAM-aided navigation system is affected by the SLAM data association algorithm used by the nodes carrying the SLAM-aided navigation system. With the NN algorithm, the matching accuracy of the feature information is not high, so the navigation accuracy is poor. With the JCBB algorithm, the correct rate of data association is the highest, but the operation time is the longest. With the optimized data association algorithm, the navigation accuracy is slightly reduced, but the operation time is shorter, which can meet real-time requirements (see Figure 16).

In this subsection, to compare the performance of odometer/vision collaborative navigation systems under centralized and

Table 2: Odometer collaborative navigation RMS parameters

Algorithm type | Position error (m) | Angle error (deg)
CL             | 0.1629             | 0.74625
DCL            | 0.36342            | 1.3762


Figure 14: The navigation error map of the node with the SLAM-aided navigation system. (a) Position error. (b) Angle error.

Figure 15: Some nodes are equipped with the SLAM-aided navigation system. (a) Position error. (b) Angle error.

Figure 16: Comparison diagram of navigation error for fusion of single-node SLAM information under different SLAM data association algorithms. (a) Position error. (b) Angle error.


decentralized collaborative navigation algorithms, we run the CL and DCL algorithms separately under experimental scenario I. The navigation errors of the two collaborative navigation algorithms are compared as shown in Figure 17. Under experimental scenario II of this subsection, we run the CL algorithm and the DCL algorithm, respectively. The navigation errors of the two collaborative navigation algorithms are compared as shown in Figure 18 (see Figures 17 and 18).

After 20 experiments, the RMS parameters of collaborative navigation with fused single-node SLAM information are shown in Table 4.

The RMS parameters of collaborative navigation with fused multinode SLAM information are shown in Table 5.

As can be seen from Figures 17 and 18, in conjunction with Tables 4 and 5, in the odometer/vision collaborative navigation system, the error of the centralized collaborative

Table 3: Performance comparison of centralized collaborative navigation algorithms under different SLAM data association algorithms

Algorithm type | Position error (m) | Angle error (deg) | Relative time
NN             | 2.8323             | 10.7919           | 4
JCBB           | 0.0322             | 0.1623            | 12
Optimization   | 0.5587             | 2.2476            | 1

Figure 17: Comparative diagram of navigation error for fusion of single-node SLAM information under different collaborative navigation algorithms. (a) Position error. (b) Angle error.

Figure 18: Comparison diagram of navigation error for fusion of multinode SLAM information under different collaborative navigation algorithms. (a) Position error. (b) Angle error.

Table 4: Collaborative navigation RMS parameters for fusion of single-node SLAM information

Algorithm type | Position error (m) | Angle error (deg)
CL             | 0.0322             | 0.1623
DCL            | 0.0669             | 0.2094

Table 5: Collaborative navigation RMS parameters for fusion of multinode SLAM information

Algorithm type | Position error (m) | Angle error (deg)
CL             | 0.0243             | 0.0524
DCL            | 0.0438             | 0.1265


navigation algorithm is smaller than that of the decentralized collaborative navigation algorithm; after the landmark information collected by a single node or by multiple nodes is fused, there is only a small gap between the two algorithms. In other words, the decentralized collaborative navigation algorithm based on the odometer/vision collaborative navigation model can estimate the correlation of the internode information well (see Figures 17 and 18).

Considering the high demands of the centralized collaborative navigation algorithm on computing power and communication, the application scenarios of the two algorithms are analyzed in combination with the abovementioned collaborative navigation experiments. The centralized collaborative navigation algorithm is suitable for cases with few nodes where the nodes carry no additional aided navigation system; the decentralized collaborative navigation algorithm is suitable for a large number of nodes and a large amount of shared information, with some nodes equipped with additional aided navigation systems, especially a SLAM-aided navigation system.

7. Conclusion

In order to improve the performance of the collaborative navigation system, multirobot collaborative navigation algorithms based on odometer/vision multisource information fusion are studied. On the basis of establishing the multisource information fusion collaborative navigation system model, the centralized and decentralized collaborative navigation frameworks of odometer/vision fusion and the vision-based SLAM are given, and the centralized and decentralized odometer/vision collaborative navigation algorithms are derived, respectively. The effectiveness of the proposed algorithms is verified by simulation experiments, which has both theoretical and practical value for high-performance collaborative navigation applications.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

References

[1] K. N. Olivier, D. E. Griffith, G. Eagle et al., "Randomized trial of liposomal amikacin for inhalation in nontuberculous mycobacterial lung disease," American Journal of Respiratory and Critical Care Medicine, vol. 195, no. 6, pp. 814–823, 2017.

[2] M. Schwarz, M. Beul, D. Droeschel et al., "DRC team NimbRo Rescue: perception and control for centaur-like mobile manipulation robot Momaro," Springer Tracts in Advanced Robotics, Springer, Berlin, Germany, pp. 145–190, 2018.

[3] M. Long, H. Su, and B. Liu, "Group controllability of two-time-scale discrete-time multi-agent systems," Journal of the Franklin Institute, vol. 357, no. 6, pp. 3524–3540, 2020.

[4] T. Fukuda, S. Nakagawa, Y. Kawauchi, and M. Buss, "Structure decision method for self organising robots based on cell structures-CEBOT," in Proceedings of the 1989 International Conference on Robotics and Automation, Scottsdale, AZ, USA, May 1989.

[5] H. Asama, A. Matsumoto, and Y. Ishida, "Design of an autonomous and distributed robot system: ACTRESS," in Proceedings of the IEEE/RSJ International Workshop on Intelligent Robots and Systems (IROS '89), the Autonomous Mobile Robots and its Applications, September 1989.

[6] J. Zhou, Y. Lv, G. Wen, X. Wu, and M. Cai, "Three-dimensional cooperative guidance law design for simultaneous attack with multiple missiles against a maneuvering target," in Proceedings of the 2018 IEEE CSAA Guidance, Navigation and Control Conference (CGNCC), August 2018.

[7] H. Su, J. Zhang, and Z. Zeng, "Formation-containment control of multi-robot systems under a stochastic sampling mechanism," Science China Technological Sciences, vol. 63, no. 6, pp. 1025–1034, 2020.

[8] H. Park and S. Hutchinson, "A distributed robust convergence algorithm for multi-robot systems in the presence of faulty robots," in Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 2980–2985, IEEE, Hamburg, Germany, September–October 2015.

[9] K. Petersen and R. Nagpal, "Complex design by simple robots: a collective embodied intelligence approach to construction," Architectural Design, vol. 87, no. 4, pp. 44–49, 2017.

[10] L. Chaimowicz, T. Sugar, V. Kumar, and M. F. M. Campos, "An architecture for tightly coupled multi-robot cooperation," in Proceedings 2001 ICRA, IEEE International Conference on Robotics and Automation (Cat. no. 01CH37164), vol. 3, pp. 2992–2997, IEEE, Seoul, Korea, May 2001.

[11] H.-X. Hu, G. Chen, and G. Wen, "Event-triggered control on quasi-average consensus in the cooperation-competition network," in Proceedings of the IECON 2018—44th Annual Conference of the IEEE Industrial Electronics Society, October 2018.

[12] A. Amanatiadis, K. Charalampous, I. Kostavelis et al., "The AVERT project: autonomous vehicle emergency recovery tool," in Proceedings of the 2013 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR), pp. 1–5, IEEE, Linkoping, Sweden, October 2013.

[13] R. Kurazume, S. Hirose, T. Iwasaki, S. Nagata, and N. Sashida, "Study on cooperative positioning system," in Proceedings of the IEEE International Conference on Robotics and Automation, Minneapolis, MN, USA, August 1996.

[14] Z. Fu, Y. Zhao, and G. Wen, "Distributed continuous-time optimization in multi-agent networks with undirected topology," in Proceedings of the 2019 IEEE 15th International Conference on Control and Automation (ICCA), November 2019.

[15] Y. Zhao, Y. Liu, and G. Wen, "Finite-time average estimation for multiple double integrators with unknown bounded inputs," in Proceedings of the 2018 33rd Youth Academic Annual Conference of Chinese Association of Automation (YAC), May 2018.

[16] S. Mao, Mobile robot localization in indoor environment, Ph.D. dissertation, Zhejiang University, Hangzhou, China, 2016.

[17] J. Yang, "Analysis approach to odometric non-systematic error uncertainty for mobile robots," Chinese Journal of Mechanical Engineering, vol. 44, no. 8, pp. 7–12, 2008.

[18] J. Kang, F. Zhang, and X. Qu, Angle Measuring Error Analysis of Coordinate Measuring System of Laser Radar, vol. 40, no. 6, pp. 834–839, 2016.

Mathematical Problems in Engineering 15

[19] J. Zhang, P. Orlik, Z. Sahinoglu, A. Molisch, and P. Kinney, "UWB systems for wireless sensor networks," Proceedings of the IEEE, vol. 97, no. 2, pp. 313–331, 2009.

[20] D. Kaushal and T. Shanmuganantham, "Design of a compact and novel microstrip patch antenna for multiband satellite applications," Materials Today: Proceedings, vol. 5, no. 10, pp. 21175–21182, 2018.

[21] J. Xiucai, Data Association Problem for Simultaneous Localization and Mapping of Mobile Robots, Ph.D. dissertation, National University of Defense Technology, Changsha, China, 2008.

[22] Z. Yuan, "Research of mobile robot's SLAM based on binocular vision," Master's thesis, Tianjin University of Technology, Tianjin, China, 2016.

[23] F. Bellavia, M. Fanfani, F. Pazzaglia, and C. Colombo, "Robust selective stereo SLAM without loop closure and bundle adjustment," in Proceedings of the International Conference on Image Analysis and Processing, pp. 462–471, Springer, Naples, Italy, 2013.

[24] H. Fourati, Multisensor Data Fusion: From Algorithms and Architectural Design to Applications, CRC Press, Boca Raton, FL, USA, 2015.

[25] S. Jia, X. Yin, and X. Li, "Mobile robot parallel PF-SLAM based on OpenMP," in Proceedings of the 2012 IEEE International Conference on Robotics and Biomimetics (ROBIO), pp. 508–513, IEEE, Guangzhou, China, December 2012.

[26] W. Zhou, E. Shiju, Z. Cao, and Y. Dong, "Review of SLAM data association study," in Proceedings of the 2016 International Conference on Sensor Network and Computer Engineering, Atlantis Press, Shanghai, China, 2016.

[27] R. Singer and R. Sea, "A new filter for optimal tracking in dense multitarget environments," in Proceedings of the Annual Allerton Conference on Circuit and System Theory, pp. 201–211, Monticello, MN, USA, 1972.

[28] J. Neira and J. D. Tardos, "Data association in stochastic mapping using the joint compatibility test," IEEE Transactions on Robotics and Automation, vol. 17, no. 6, pp. 890–897, 2001.

[29] L. Yanju, X. Yufeng, G. Song, H. Xi, and G. Zhengping, "Research on data association in SLAM based laser sensor," Microcomputer & Its Application, vol. 36, no. 2, pp. 78–82, 2017.

[30] O. Hlinka, O. Sluciak, F. Hlawatsch, and M. Rupp, "Distributed data fusion using iterative covariance intersection," in Proceedings of the 2014 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 1861–1865, IEEE, Florence, Italy, May 2014.


According to the odometer/vision collaborative navigation model, we use the most common EKF algorithms, where $n_k$ is the measurement noise at time $k$; its variance matrix is $R_k$, denoted $n_k \sim N(0, R_k)$. Since the landmarks are static, the state estimate of the landmark at time $k-1$ can be regarded as the a priori estimate of the landmark state at time $k$. The measurement update process based on the EKF is as follows.

Step 1: calculate the innovation and the filter gain:

$$v_k = z_k - h\left(X_{k-1}^{l}, X_k^{r-}\right), \qquad K_k = \Sigma_{k-1} H_k^{T}\left(H_k \Sigma_{k-1} H_k^{T} + R_k\right)^{-1}. \quad (8)$$

Step 2: update the state estimate and the corresponding covariance:

$$X_k^{l} = X_{k-1}^{l} + K_k v_k, \qquad \Sigma_k = \left(I - K_k H_k\right)\Sigma_{k-1}, \quad (9)$$

where $\Sigma_k$ is the covariance matrix of the landmark state estimate at time $k$ and $H_k$ is the measurement matrix at time $k$.

Remark 1. Any observed landmark can be position-corrected by the above method; note that such correction is limited to the landmarks in the local map observed at time $k$.
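As a concrete illustration, the landmark correction in (8)-(9) can be sketched as a standard EKF update. This is a minimal sketch under assumed array shapes; the function name `ekf_landmark_update` and its argument layout are ours, not the paper's.

```python
import numpy as np

def ekf_landmark_update(x_l, Sigma, z, h_pred, H, R):
    """One EKF measurement update for a static landmark (eqs. (8)-(9)).

    x_l    : prior landmark estimate (carried over from time k-1, since
             the landmark is static)
    Sigma  : prior covariance of the landmark estimate
    z      : measurement at time k
    h_pred : predicted measurement h(x_l, robot pose)
    H      : measurement Jacobian at time k
    R      : measurement noise covariance
    """
    v = z - h_pred                              # innovation, eq. (8)
    S = H @ Sigma @ H.T + R
    K = Sigma @ H.T @ np.linalg.inv(S)          # filter gain, eq. (8)
    x_new = x_l + K @ v                         # state update, eq. (9)
    Sigma_new = (np.eye(len(x_l)) - K @ H) @ Sigma
    return x_new, Sigma_new
```

The same routine is applied to every landmark of the local map that is observed at time $k$, in line with Remark 1.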

3.2.2. Data Association Algorithm. In a SLAM navigation system, data association is an important prerequisite for state estimation; incorrect data association is likely to cause serious deviation of the estimation results [25, 26].

At present, two data association algorithms are commonly used in SLAM technology: the nearest neighbor (NN) data association algorithm [27] and the joint compatibility branch and bound (JCBB) data association algorithm [28]. The NN algorithm requires less computation, but it easily produces wrong associations when the density of feature information is high, which leads to divergence of the SLAM results; it is therefore only suitable for environments where the feature density and the system error are small. JCBB is an improvement of the NN algorithm that extends the association of single features in NN to all observed feature information, which

Figure 5: Collaborative navigation structure. (a) Centralized collaborative navigation: nodes 1-3 communicate with a data fusion centre. (b) Decentralized collaborative navigation: each node (1-3) carries its own processing unit.

Figure 6: Data fusion framework of the centralized collaborative navigation algorithm: dead reckoning feeds the time update, while relative measurement information and SLAM measurement information feed the measurement update for nodes 1 to N.

Figure 7: Data fusion framework of the decentralized collaborative navigation algorithm: dead reckoning feeds the time update, followed by the SLAM measurement update and map update; relative measurement information and status information sent by neighbors feed the cooperative measurement update among nodes, after which status information is sent to neighbors.

Figure 8: SLAM flow chart: the odometer drives dead reckoning and perceptual prediction, while the external sensing sensor provides feature extraction; data association combines both for state estimation and the map.


is more binding and more reliable. The JCBB algorithm can obtain more credible association hypotheses than the NN algorithm and exclude some wrong ones; however, the amount of computation increases markedly, which to some extent affects the real-time performance of the SLAM navigation system.

To ensure the accuracy of data association in the SLAM process, reduce the amount of computation as much as possible, and enhance the real-time performance of the SLAM algorithm, this subsection describes an optimized data association algorithm. The classification method mentioned in [29] is used to divide the related feature information into sets; finally, the appropriate feature information set in the local map and the preprocessed observed feature information set are selected to form the association space.

First, the collection of feature information in the local map is divided as follows:

$$\begin{cases} D\left[(x_m, y_m), (x_k, y_k)\right] \le \Delta d, & (x_k, y_k) \in F_k, \\ D\left[(x_m, y_m), (x_k, y_k)\right] > \Delta d, & (x_k, y_k) \notin F_k, \end{cases} \quad (10)$$

where $D[(x_m, y_m), (x_k, y_k)]$ is the relative distance between the feature information $(x_k, y_k)$ of the local map and other feature information $(x_m, y_m)$.

Then the observed feature information set is preprocessed and divided. In the actual navigation process, the observed feature information obtained by lidar contains noise. The purpose of preprocessing is to filter out some of this noise, improving the accuracy of data association and reducing the amount of computation at the same time. The judgment process is as follows:

$$f(i, j) = \begin{cases} 1, & D\left[(x_i, y_i), (x_j, y_j)\right] < \Delta D, \\ 0, & D\left[(x_i, y_i), (x_j, y_j)\right] \ge \Delta D, \end{cases} \quad (11)$$

where $\Delta D$ is a threshold determined by the performance of the laser sensor. When the relative distance between two pieces of observed feature information is less than the threshold, the observed feature is considered a feature point; otherwise it is a noise point and does not participate in the subsequent calculation.

After the set is divided, the set of observed feature information is sorted in observation order. Following the division of the local map feature information set above, the subsets are divided in turn, and no point participates in the division repeatedly.

Finally, we select the appropriate association set to execute the data association algorithm. Each subset of local map feature information and the subset of observed feature information at the current time undergo a joint compatibility test, and the feature information with the best test results is selected to form a new subset as the data association object.
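The set partitioning of (10) and the noise gating of (11) can be sketched as follows. The function names `partition_features` and `filter_observations`, and the greedy grouping in observation order, are our assumptions; the paper only specifies the distance thresholds $\Delta d$ and $\Delta D$.

```python
import numpy as np

def partition_features(features, delta_d):
    """Split map feature points into neighbourhood subsets (eq. (10)):
    a feature joins the subset of a seed point if their distance is
    within delta_d. Greedy grouping in observation order, so no point
    is assigned twice."""
    pts = np.asarray(features, dtype=float)
    remaining = list(range(len(pts)))
    subsets = []
    while remaining:
        seed = remaining[0]
        group = [i for i in remaining
                 if np.linalg.norm(pts[i] - pts[seed]) <= delta_d]
        subsets.append(group)
        remaining = [i for i in remaining if i not in group]
    return subsets

def filter_observations(obs, delta_D):
    """Keep only observed points that have at least one other observed
    point within delta_D (eq. (11)); isolated points are treated as
    noise and discarded."""
    pts = np.asarray(obs, dtype=float)
    keep = []
    for i in range(len(pts)):
        d = np.linalg.norm(pts - pts[i], axis=1)
        d[i] = np.inf                    # ignore distance to itself
        if d.min() < delta_D:
            keep.append(i)
    return keep
```

The surviving observation subsets and the local-map subsets would then be paired for the joint compatibility test described above.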

4. Centralized Collaborative Navigation Algorithm

4.1. Time Update. First of all, the state space model should be established. The state vector of a single mobile robot with three degrees of freedom contains position and heading angle information. Suppose the number of nodes is $N$; the state space of the collaborative navigation system in the centralized framework contains the state vectors of all mobile robots in the group. Let $X_k^i$ denote the state vector of mobile robot $i$ and $X_k$ the state of the system. Then the state space equation of the system can be expressed as

$$X_k = \begin{bmatrix} X_k^1 \\ X_k^2 \\ \vdots \\ X_k^N \end{bmatrix} = \begin{bmatrix} f_1(X_{k-1}^1, u_{k-1}^1) \\ f_2(X_{k-1}^2, u_{k-1}^2) \\ \vdots \\ f_N(X_{k-1}^N, u_{k-1}^N) \end{bmatrix} + \begin{bmatrix} w_{k-1}^1 \\ w_{k-1}^2 \\ \vdots \\ w_{k-1}^N \end{bmatrix} \triangleq \Phi(X_{k-1}, u_{k-1}) + w_{k-1}, \quad (12)$$

where the function $f_i(X, u)$ describes the kinematic characteristics of the mobile robot, $u_{k-1}^i = [\Delta S_{r,k-1}, \Delta S_{l,k-1}]^T$ is the input required by mobile robot $i$ for dead reckoning at time $k$, and $w_{k-1}^i$ is the system noise, $w_{k-1}^i \sim N(0, Q_{k-1}^i)$.

It is assumed that the motion of any node is not affected by any other node and that each node moves independently, without being controlled by other nodes. Therefore, the state transition matrix for centralized collaborative positioning is given by

$$F_{k-1} = \begin{bmatrix} J_{X(k-1)}^1 & 0 & \cdots & 0 \\ 0 & J_{X(k-1)}^2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & J_{X(k-1)}^N \end{bmatrix}, \quad (13)$$

where $J_{X(k-1)}^i$ and $J_{u(k-1)}^i$ are the Jacobian matrices of the function $f_i$ with respect to the state vector and the control input, respectively. The system noise variance matrix of the collaborative navigation system in the centralized framework is

$$Q_{k-1} = \begin{bmatrix} Q_{k-1}^1 & 0 & \cdots & 0 \\ 0 & Q_{k-1}^2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & Q_{k-1}^N \end{bmatrix}, \quad (14)$$

where $Q_{k-1}^i = J_{u(k-1)}^i \Sigma_u J_{u(k-1)}^{iT}$ and $\Sigma_u$ is the covariance matrix of the control input. Then the time update process of the collaborative navigation system in the centralized framework can be deduced:

$$X_k^- = \Phi\left(X_{k-1}^+, u_{k-1}\right),$$
$$P_k^- = F_{k-1} P_{k-1} F_{k-1}^T + Q_{k-1} \triangleq \begin{bmatrix} P_{11,k}^- & P_{12,k}^- & \cdots & P_{1N,k}^- \\ P_{21,k}^- & P_{22,k}^- & \cdots & P_{2N,k}^- \\ \vdots & \vdots & \ddots & \vdots \\ P_{N1,k}^- & P_{N2,k}^- & \cdots & P_{NN,k}^- \end{bmatrix}, \quad (15)$$

where

$$P_{ii,k}^- = J_{X(k-1)}^i P_{ii,k-1} J_{X(k-1)}^{iT} + Q_{k-1}^i, \qquad P_{ij,k}^- = J_{X(k-1)}^i P_{ij,k-1} J_{X(k-1)}^{jT}. \quad (16)$$

4.2. Single-Node Measurement Update. In this section, the measurement update process involving only one node in the centralized framework is described. The aided navigation system selected is the SLAM navigation system, which integrates the landmark information of the surrounding environment measured by lidar. In this paper, a measurement model based on this navigation system is built, and the measurement update process based on the EKF is described.

4.2.1. Measurement Model Based on SLAM. The measurement model based on SLAM is the measurement model after data association. In this paper, the landmark position information obtained by lidar is taken as the observation equation:

$$z_k^i = \begin{bmatrix} x_l^b \\ y_l^b \end{bmatrix} = \begin{bmatrix} \left(x_l^w - x_k^i\right)\cos\theta_k^i + \left(y_l^w - y_k^i\right)\sin\theta_k^i \\ -\left(x_l^w - x_k^i\right)\sin\theta_k^i + \left(y_l^w - y_k^i\right)\cos\theta_k^i \end{bmatrix} + n_k^i, \quad (17)$$

where $(x_l^b, y_l^b)$ is the landmark position obtained by lidar (in the robot body frame), $(x_l^w, y_l^w)$ are the coordinates of the landmark in the world coordinate system, $(x_k^i, y_k^i, \theta_k^i)$ is the state of mobile robot $i$ at time $k$, and $n_k^i$ is the measurement noise; its variance matrix is $R_k^i$, denoted $n_k^i \sim N(0, R_k^i)$. After linearization and state extension, the observation equation of the whole system can be obtained:

$$z_k^i = H_k^i X_k + h^i\left(X_k^{i-}\right) - \nabla h^i X_k^{i-} + n_k^i, \quad (18)$$

where

$$H_k^i = \left[\, 0 \ \cdots \ \nabla h^i \ \cdots \ 0 \,\right]_{2 \times 3N}, \quad (19)$$

and $\nabla h^i$ is the Jacobian matrix of the function $h^i(X_k^i)$.

4.2.2. Measurement Update Based on EKF. Combined with the basic principle of the Kalman filter, the measurement update process of the aided navigation system for a single node can be obtained as follows.

Step 1: calculate the innovation and the filter gain:

$$v = z_k^i - h^i\left(X_k^{i-}\right),$$
$$S^i = H_k^i P_k^- \left(H_k^i\right)^T + R_k^i,$$
$$K^i = P_k^- \left(H_k^i\right)^T \left(S^i\right)^{-1}. \quad (20)$$

Step 2: update the state estimate and the corresponding covariance:

$$X_k^+ = X_k^- + K^i v,$$
$$P_k = P_k^- - P_k^- \left(H_k^i\right)^T \left(S^i\right)^{-1} H_k^i P_k^-. \quad (21)$$

4.3. Relative Measurement Update among Nodes. The standard observation model can be divided into two types: the measurement model based on the relative distance and the measurement model based on the relative position.

4.3.1. Measurement Model Based on Relative Distance. The observation of mobile robot $i$ to mobile robot $j$ at time $k$ is denoted by $z_k^{ij}$; the observation equation is given by

$$z_k^{ij} = h_{ij}\left(X_k^i, X_k^j\right) + n_k^{ij} = \sqrt{\left(x_k^i - x_k^j\right)^2 + \left(y_k^i - y_k^j\right)^2} + n_k^{ij}, \quad (22)$$

where $n_k^{ij}$ is the measurement noise; its variance matrix is $R_k^{ij} = \sigma_{UWB}$, denoted $n_k^{ij} \sim N(0, R_k^{ij})$, and $\sigma_{UWB}$ is the variance of the UWB ranging.

After linearization and state extension, the observation equation of the whole system can be obtained:

$$z_k^{ij} = H_k^{ij} X_k + h_{ij}\left(X_k^{i-}, X_k^{j-}\right) - \nabla h_i^{ij} X_k^{i-} - \nabla h_j^{ij} X_k^{j-} + n_k^{ij}, \quad (23)$$

where

$$H_k^{ij} = \left[\, 0 \ \cdots \ \nabla h_i^{ij} \ \cdots \ \nabla h_j^{ij} \ \cdots \ 0 \,\right]_{1 \times 3N}, \quad (24)$$

and $\nabla h_i^{ij}$ and $\nabla h_j^{ij}$ are the Jacobian matrices of the function $h_{ij}(X^i, X^j)$ with respect to $X^i$ and $X^j$, respectively.

4.3.2. Measurement Model Based on Relative Position. Using lidar as the sensor, the relative observation among nodes can be realized in two ways: the direct method and the indirect method. The direct method measures the relative position between the two nodes directly; the indirect method uses lidar to observe the landmarks nearest the two nodes, and the relative position between the two nodes is then obtained by calculation.

The state of mobile robot $i$ at time $k$ is denoted by $(x_k^i, y_k^i, \theta_k^i)$ and the state of mobile robot $j$ by $(x_k^j, y_k^j, \theta_k^j)$. The coordinates of landmark $L_1$, adjacent to mobile robot $i$, are $(x_{l1}^w, y_{l1}^w)$ in the world coordinate system and $(x_{l1}^i, y_{l1}^i)$ in the coordinate system of mobile robot $i$; the coordinates of landmark $L_2$, adjacent to mobile robot $j$, are $(x_{l2}^w, y_{l2}^w)$ in the world coordinate system and $(x_{l2}^j, y_{l2}^j)$ in the coordinate system of mobile robot $j$. The specific solution process of the indirect method is as follows (see Figure 9):

$$\begin{bmatrix} x_k^j - x_k^i \\ y_k^j - y_k^i \end{bmatrix} = \begin{bmatrix} x_{l2}^j \cos\theta_k^j - y_{l2}^j \sin\theta_k^j \\ x_{l2}^j \sin\theta_k^j + y_{l2}^j \cos\theta_k^j \end{bmatrix} + \begin{bmatrix} -\left(x_{l1}^i \cos\theta_k^i - y_{l1}^i \sin\theta_k^i\right) + \left(x_{l1}^w - x_{l2}^w\right) \\ -\left(x_{l1}^i \sin\theta_k^i + y_{l1}^i \cos\theta_k^i\right) + \left(y_{l1}^w - y_{l2}^w\right) \end{bmatrix}. \quad (25)$$

When mobile robot $i$ observes mobile robot $j$ at time $k$, the coordinates of mobile robot $j$ in the coordinate system of mobile robot $i$ are taken as the observation. The observation equation is as follows:

$$z_k^{ij} = \begin{bmatrix} x_k^{ji} \\ y_k^{ji} \end{bmatrix} = \begin{bmatrix} \left(x_k^j - x_k^i\right)\cos\theta_k^i + \left(y_k^j - y_k^i\right)\sin\theta_k^i \\ -\left(x_k^j - x_k^i\right)\sin\theta_k^i + \left(y_k^j - y_k^i\right)\cos\theta_k^i \end{bmatrix} + n_k^{ij}, \quad (26)$$

where $n_k^{ij}$ is the measurement noise; its variance matrix is $R_k^{ij}$, denoted $n_k^{ij} \sim N(0, R_k^{ij})$, and $(x_k^{ji}, y_k^{ji})$ are the coordinates of mobile robot $j$ in the coordinate system of mobile robot $i$ at time $k$.

4.3.3. Measurement Update Based on EKF. Similarly, the measurement update process for the relative observation between nodes can finally be obtained.

5. Decentralized Collaborative Navigation Algorithm

The state and covariance information of each node under the decentralized collaborative navigation algorithm is calculated separately. In order to avoid overconfident estimation to the maximum extent, the concept of the covariance intersection filter is introduced, and the covariance of each node is divided into correlated and independent terms.

5.1. Covariance Intersection Filter. Given the state estimate $X$ and the corresponding covariance matrix $P$, let $P^*$ be the covariance of the error between the state estimate $X$ and the true state $X^*$:

$$P^* = E\left[\left(X - X^*\right)\left(X - X^*\right)^T\right]. \quad (27)$$

Consistency is a characteristic of the covariance matrix of the estimate [30]. When the covariance matrix of the state estimate is not less than the real covariance, the estimate is said to satisfy consistency; that is, no overconfident estimate is produced. Suppose two state estimates $X_1$ and $X_2$ are independent and satisfy consistency, with corresponding covariances $P_1$ and $P_2$. If there is in fact a correlation between the two estimates, the Kalman filter may produce inconsistent results; in other words, it leads to overconfident estimation.

$$P_1 = \frac{P_{1d}}{w} + P_{1i}, \quad (28)$$

$$P_2 = \frac{P_{2d}}{1 - w} + P_{2i}, \quad (29)$$

$$P^{-1} = P_1^{-1} + P_2^{-1}, \quad (30)$$

$$X = P\left(P_1^{-1} X_1 + P_2^{-1} X_2\right), \quad (31)$$

$$P_i = P\left(P_1^{-1} P_{1i} P_1^{-1} + P_2^{-1} P_{2i} P_2^{-1}\right) P, \quad (32)$$

$$P_d = P - P_i, \quad (33)$$

Figure 9: Indirect observation schematic diagram.


where the covariances corresponding to the two state estimates $X_1$ and $X_2$ are $P_{1d} + P_{1i}$ and $P_{2d} + P_{2i}$, respectively. $P_{1d}$ and $P_{2d}$ are the correlated covariance components, corresponding to the maximum correlation between the two state estimates; $P_{1i}$ and $P_{2i}$ are the independent covariance components, corresponding to absolute independence of the two state estimates. The parameter $w$ lies in the interval $[0, 1]$; it is an optimization parameter that minimizes the covariance after fusion, and any $w$ in this interval ensures the consistency of the fusion result.
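The split covariance intersection fusion of (28)-(33) can be sketched as follows. This is a minimal sketch: the function name is ours, and the weight $w$ is chosen by a simple grid search minimising the trace of the fused covariance, which is only one possible choice of the optimization criterion.

```python
import numpy as np

def split_covariance_intersection(x1, P1d, P1i, x2, P2d, P2i, n_grid=99):
    """Split covariance intersection fusion (eqs. (28)-(33)).
    The correlated parts are inflated by 1/w and 1/(1-w); w in (0,1)
    is chosen by grid search to minimise trace(P)."""
    best = None
    for w in np.linspace(0.01, 0.99, n_grid):
        P1w = P1d / w + P1i                               # eq. (28)
        P2w = P2d / (1.0 - w) + P2i                       # eq. (29)
        P = np.linalg.inv(np.linalg.inv(P1w) + np.linalg.inv(P2w))  # eq. (30)
        if best is None or np.trace(P) < best[0]:
            best = (np.trace(P), P, P1w, P2w)
    _, P, P1w, P2w = best
    x = P @ (np.linalg.inv(P1w) @ x1 + np.linalg.inv(P2w) @ x2)     # eq. (31)
    Pi = P @ (np.linalg.inv(P1w) @ P1i @ np.linalg.inv(P1w)
              + np.linalg.inv(P2w) @ P2i @ np.linalg.inv(P2w)) @ P  # eq. (32)
    return x, P, Pi, P - Pi                                         # eq. (33)
```

In the fully independent case ($P_{1d} = P_{2d} = 0$) the result reduces to the ordinary information-form fusion, and the dependent component $P_d$ of the fused estimate is zero.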

5.2. Time Update. Before describing the time update process of the DCL algorithm, it is necessary to decompose the state information of the system in the framework of centralized collaborative navigation, which can be expressed as

$$E_G = \left\{X_G, P_G\right\} = \left\{\left\{X_1, P_1\right\}, \left\{X_2, P_2\right\}, \ldots, \left\{X_N, P_N\right\}\right\} = \left\{\left\{X_1, P_{1d} + P_{1i}\right\}, \left\{X_2, P_{2d} + P_{2i}\right\}, \ldots, \left\{X_N, P_{Nd} + P_{Ni}\right\}\right\}, \quad (34)$$

where $E_G$ is the set of states under the centralized collaborative navigation framework, and $X_G$ and $P_G$ are the state space and the corresponding covariance matrix under the centralized collaborative navigation framework, respectively.

The state propagation process under the decentralized collaborative navigation framework is the state propagation process of a single node, and the propagation of the covariance can be expressed as

$$P_k^{i-} = J_{X(k-1)}^i P_{k-1}^i J_{X(k-1)}^{iT} + J_{u(k-1)}^i \Sigma_u J_{u(k-1)}^{iT},$$
$$P_{ki}^{i-} = J_{X(k-1)}^i P_{(k-1)i}^i J_{X(k-1)}^{iT} + J_{u(k-1)}^i \Sigma_u J_{u(k-1)}^{iT}, \quad (35)$$

where $P_k^{i-}$ is the one-step prediction covariance matrix of mobile robot $i$ at time $k$ and $P_{ki}^{i-}$ is its independent covariance component; $J_{X(k-1)}^i$ and $J_{u(k-1)}^i$ are the Jacobian matrices of the function $f_i(X, u)$ with respect to the state vector and the control input, and $\Sigma_u$ is the error matrix of the control input.

5.3. Single-Node Measurement Update. The measurement update of a single node involves only that node's aided navigation system, so there is no need to estimate the correlation; that is, formulas (28) and (29) are skipped. Similar to the single-node measurement update in centralized collaborative navigation, the single-node measurement update in decentralized collaborative navigation can be expressed as follows.

Step 1: calculate the innovation and the filter gain:

$$v = z_k^i - \nabla h^i X_k^{i-},$$
$$S = \nabla h^i P_k^{i-} \left(\nabla h^i\right)^T + R_k^i,$$
$$K = P_k^{i-} \left(\nabla h^i\right)^T S^{-1}. \quad (36)$$

Step 2: update the state estimate and the corresponding covariance:

$$X_k^{i+} = X_k^{i-} + K v,$$
$$P_k^i = \left(I - K \nabla h^i\right) P_k^{i-},$$
$$P_{ki}^i = \left(I - K \nabla h^i\right) P_{ki}^{i-} \left(I - K \nabla h^i\right)^T + K R_k^i K^T,$$
$$P_{kd}^i = P_k^i - P_{ki}^i. \quad (37)$$

5.4. Collaborative Measurement Update among Nodes. In the framework of decentralized collaborative navigation, the state estimation results of a single node's aided navigation system and the state estimation results based on information sharing among nodes are integrated in the internode collaborative measurement update, and the corrected state information is derived.

In the decentralized collaborative navigation framework, any node can explicitly estimate the states of the other nodes. In order to save communication cost and reduce the computation of a single mobile robot platform, this paper assumes that information exchange among the mobile robots takes place only between two adjacent mobile robot nodes.

Assuming that mobile robot $i$ performs a relative observation of mobile robot $j$ at time $k$ and shares its own state and covariance information with mobile robot $j$, the state of mobile robot $j$ can be explicitly expressed with the received state of mobile robot $i$ and the relative measurement information between the two nodes:

$$X_k^{co,j} = \begin{bmatrix} x_k^{co,j} \\ y_k^{co,j} \end{bmatrix} = \begin{bmatrix} x_k^i + x_k^{ji}\cos\theta_k^i - y_k^{ji}\sin\theta_k^i \\ y_k^i + x_k^{ji}\sin\theta_k^i + y_k^{ji}\cos\theta_k^i \end{bmatrix}, \quad (38)$$

where $(x_k^{co,j}, y_k^{co,j})$ is the partial state estimate of mobile robot $j$ obtained through the information sharing between mobile robot $i$ and mobile robot $j$ at time $k$, $(x_k^i, y_k^i, \theta_k^i)$ is the state vector shared by mobile robot $i$ with mobile robot $j$, and $u_{rel} = (x_k^{ji}, y_k^{ji})$ is the relative measurement information of the two nodes in the coordinate system of mobile robot $i$.

If there is a direct relative observation between the two nodes, the relative measurement information can be obtained directly from the sensor that carries out the relative observation. If the relative observation between the two nodes is indirect, via the surrounding landmark information, then the relative measurement information first needs to be solved, and the concrete solution combines (25) and a conversion into the coordinate system of mobile robot $i$.

Finally, based on the principle of the covariance intersection filter, the updating process of the collaborative measurement among nodes in the framework of decentralized collaborative navigation can be obtained.


6. Simulation Results

6.1. Simulated Experimental Environment. In this section, the number of mobile robot nodes involved in collaborative navigation is 3. The area of the 2D environment is 25 m × 25 m; when the mobile robot group works collaboratively, each node is assigned an initial position in the environment, and each node can follow a random trajectory in this area. It is assumed that all nodes follow the same simulated trajectory and differ only in initial position. The maximum speed of a mobile robot on a straight line is 0.3125 m/s, and the angular velocity at a bend is 0.1 deg/s. It is assumed that 88 landmarks (environmental feature points) can be extracted by lidar scanning from the environment around the simulated rectangular trajectory for the SLAM-aided navigation system (see Figure 10).

During this simulation, the mobile robots, as carriers, can carry different types of sensors, including odometers, UWB, and lidar. Suitable sensors are selected according to the positioning accuracy requirements: Time Domain P410 UWB sensors are used to measure the relative distance, and the lidar is an LMS291-series 2D lidar produced by a German company. Based on the relevant parameters of these sensors, shown in Table 1, a simulation model of mobile robots carrying different types of sensors is built in MATLAB.

6.2. Relative Measurement Aided Odometer Collaborative Navigation. In this experiment, all three mobile robots are equipped with odometers capable of motion monitoring, together with UWB capable of measuring the relative distance or lidar capable of measuring the relative position.

From Figure 11, it can be seen that the collaborative navigation system that realizes relative information sharing has significant advantages in positioning accuracy over the case of not sharing any information. Besides, the improvement of the group navigation performance of the mobile robots depends on the type of shared relative information. When the relative position information is shared, the growth of the error can be effectively limited; relatively speaking, when the relative distance information is shared, the position error still grows slowly, that is, only the growth rate of the error is reduced (see Figure 11).

The analysis shows that the relative distance information is only a weak constraint, so sharing this information cannot fully realize the navigation and localization of the mobile robots. In contrast, sharing the relative position information contributes directly to the mobile robot navigation solution, and the accuracy is significantly improved; at some moments it can even be increased by more than 60%. This difference is more obvious in the angle error diagram (see Figure 11).

In the description of the measurement model based on relative position, two observation methods are mentioned: direct relative measurement and indirect relative measurement. Based on this experimental scene, in scenario I the three mobile robots observe the relative position information directly through lidar; in scenario II the three mobile robots extract the surrounding landmark information through lidar and calculate the relative position information from it. In both scenarios, the centralized collaborative navigation algorithm is used to solve the navigation problem. The two relative position measurement methods are compared through the above simulation scenarios, and the comparison results are shown in Figure 12 (see Figure 12).

From Figure 12, it is clear that the collaborative navigation and positioning accuracy of the direct relative position measurement is better than that of the indirect method. However, cost cannot be ignored while navigation performance is considered: the direct method requires the measurement range of the lidar to cover the activity range of the whole mobile robot group, while the measurement range required by the indirect method only needs to cover the surrounding landmarks, which greatly reduces the cost. Considering that the collaborative navigation and positioning accuracies of the two relative position measurement methods do not differ much, the indirect method is clearly more suitable for practical application (see Figure 12).

The difference between the decentralized collaborative navigation framework and the centralized one is that the correlation among

Figure 10: Simulation trajectory diagram (trajectory and landmarks; X and Y in meters).

Table 1: Relevant parameters of the sensors.

Type     | Measure                           | Frequency (Hz) | Error
Odometer | Linear velocity of the two wheels | 20             | 4 cm/s
UWB      | Relative distance                 | 10             | 3 cm
Lidar    | Relative position in X direction  | 10             | 2 cm
Lidar    | Relative position in Y direction  | 10             | 2 cm


Figure 11: Comparison of navigation errors when sharing different relative information (relative distance, relative position, no information): (a) position error (m); (b) angle error (deg).

Figure 12: Comparison of navigation errors under different relative position measuring methods (direct vs. indirect): (a) position error (m); (b) angle error (deg).

Figure 13: Comparison of navigation errors under different collaborative navigation algorithms (CL vs. DCL): (a) position error (m); (b) angle error (deg).


the different node states is accurately calculated in the centralized collaborative navigation framework, while this correlation cannot be used in the decentralized collaborative navigation framework. In order to better reflect the impact of this correlation, the navigation errors of the two collaborative navigation algorithms in the odometer collaborative navigation system are shown in Figure 13 (see Figure 13).

To compare the two algorithms, 20 experiments are carried out in this paper, and the root mean square error (RMS) of the two collaborative navigation algorithms is calculated as shown in the following formula:

$$\mathrm{RMS} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(x_i - \hat{x}_i\right)^2}, \quad (39)$$

where $n$ is the total number of samples, $x_i$ is the actual value, and $\hat{x}_i$ is the estimated value. The RMS parameters for the odometer collaborative navigation are shown in Table 2.
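Formula (39) amounts to one line of code; a minimal sketch (the function name is ours) used here to make the error metric concrete:

```python
import numpy as np

def rms_error(actual, estimated):
    """Root mean square error over n samples (eq. (39))."""
    actual = np.asarray(actual, dtype=float)
    estimated = np.asarray(estimated, dtype=float)
    return np.sqrt(np.mean((actual - estimated) ** 2))
```

In the experiments above, this would be evaluated per run over the trajectory samples and then averaged over the 20 Monte Carlo repetitions.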

As can be seen from Figure 13 and Table 2, the error of the centralized collaborative navigation algorithm is smaller than that of the decentralized collaborative navigation algorithm. This is to be expected, because the correlation among node states can be calculated accurately in the centralized collaborative navigation algorithm, whereas it is only estimated in the decentralized one. However, the improved navigation accuracy comes at the expense of high computing power and high-quality data communication. Therefore, although the performance of the centralized collaborative navigation framework is better than that of the decentralized framework, the centralized framework is not applicable in some practical scenarios (see Figure 13).

6.3. Odometer/Vision SLAM Collaborative Navigation. In the odometer/vision collaborative navigation model, scenario I is designed such that all the mobile robots are equipped with an odometer that can monitor the motion, and one of the mobile robots is equipped with a properly working SLAM-aided navigation system.

Firstly, the mobile robot with the SLAM-aided navigation system is studied; it runs only its own integrated navigation algorithm, without sharing the relative position information. Using the centralized collaborative navigation algorithm, the navigation error of the node with the SLAM-aided navigation system is shown in Figure 14 (see Figure 14).

Figure 14 fully verifies the correctness of the centralized collaborative navigation algorithm based on the odometer/vision collaborative navigation model. The SLAM-aided navigation system is based on relative observation, so the position estimate of the node itself and the position estimates of the landmarks both accumulate error. When the SLAM data association algorithm is combined, however, the centralized collaborative navigation algorithm brings the landmark position estimates closer to their true values; as the positioning accuracy of the mobile robot improves, the data association becomes more reliable, which further corrects the state estimate of the mobile robot itself. Therefore, the improvement in the navigation accuracy of the mobile robot by this algorithm is very obvious (see Figure 14).

Then, the mobile robots without the SLAM-aided navigation system are studied. In order to fully reflect the influence of SLAM-aided navigation information on the navigation performance of the other nodes, Scenario II is designed such that all mobile robots are equipped with an odometer for motion monitoring and two of them are equipped with properly working SLAM-aided navigation systems. The navigation error of the other nodes without the SLAM-aided navigation system is shown in Figure 15.

As shown in Figure 15, the mobile robot with the SLAM-aided navigation system performs loop-closure detection at about 320 seconds, and the data are associated with the local map created at the initial location, thus eliminating most of the accumulated error. This uniquely superior behavior of the SLAM-aided navigation system is transmitted to the other nodes in the group through information sharing during collaborative navigation, so that they can also eliminate most of their accumulated error around that time, which is an important advantage of the collaborative navigation system (see Figure 15).

To verify the effect of the NN algorithm, the JCBB algorithm, and the optimized data association algorithm on the navigation performance of nodes without a SLAM-aided navigation system, the experimental scene is designed such that all mobile robots are equipped with an odometer for motion monitoring, one mobile robot is equipped with a properly working SLAM-aided navigation system, and the CL algorithm is run. The navigation error of the nodes without the SLAM-aided navigation system is shown in Figure 16.

The performance of the centralized collaborative navigation algorithm under the three SLAM data association algorithms is shown in Table 3.

From Figure 16 and Table 3, it can be seen that the navigation performance of the nodes without a SLAM-aided navigation system is affected by the SLAM data association algorithm used by the node carrying the SLAM-aided navigation system. Running the NN algorithm, the matching accuracy of the feature information is not high, so the navigation accuracy is poor. Running the JCBB algorithm, the correctness rate of data association is the highest, but the computation time is the longest. Running the optimized data association algorithm, the navigation accuracy is slightly reduced, but the computation time is shorter, which can meet real-time requirements (see Figure 16).

In this subsection, to compare the performance of the odometer/vision collaborative navigation system under the centralized and

Table 2: Odometer collaborative navigation RMS parameters.

Algorithm type    Position error (m)    Angle error (deg)
CL                0.1629                0.74625
DCL               0.36342               1.3762

12 Mathematical Problems in Engineering

Figure 14: The navigation error map of the node with the SLAM-aided navigation system: (a) position error (m); (b) angle error (deg). Curves compare fusion of relative position information with no information sharing over 0–360 s.

Figure 15: Some nodes are equipped with the SLAM-aided navigation system: (a) position error (m); (b) angle error (deg). Curves compare the cases of two nodes with SLAM and one node with SLAM over 0–360 s.

Figure 16: Comparison diagram of navigation error for fusion of single node SLAM information under different SLAM data association algorithms: (a) position error (m); (b) angle error (deg). Curves: JCBB, optimization algorithm, and NN over 0–360 s.


decentralized frameworks, we run the CL and DCL algorithms separately under experimental Scenario I. The navigation errors of the two collaborative navigation algorithms are compared in Figure 17. Under experimental Scenario II of this subsection, we run the CL algorithm and the DCL algorithm, respectively; the navigation errors of the two algorithms are compared in Figure 18 (see Figures 17 and 18).

After 20 experiments, the RMS parameters of collaborative navigation with fused single node SLAM information are shown in Table 4.

The RMS parameters of collaborative navigation with fused multinode SLAM information are shown in Table 5.

As can be seen from Figures 17 and 18, in conjunction with Tables 4 and 5, in the odometer/vision collaborative navigation system the error of the centralized collaborative

Table 3: Performance comparison of centralized collaborative navigation algorithms under different SLAM data association algorithms.

Algorithm type    Position error (m)    Angle error (deg)    Relative time
NN                2.8323                10.7919              4
JCBB              0.0322                0.1623               12
Optimization      0.5587                2.2476               1

Figure 17: Comparative diagram of navigation error for fusion of single node SLAM information under different collaborative navigation algorithms: (a) position error (m); (b) angle error (deg). Curves: CL and DCL over 0–360 s.

Figure 18: Comparison diagram of navigation error for fusion of multinode SLAM information under different collaborative navigation algorithms: (a) position error (m); (b) angle error (deg). Curves: CL and DCL over 0–360 s.

Table 4: Collaborative navigation RMS parameters for fusion of single node SLAM information.

Algorithm type    Position error (m)    Angle error (deg)
CL                0.0322                0.1623
DCL               0.0669                0.2094

Table 5: Collaborative navigation RMS parameters for fusion of multinode SLAM information.

Algorithm type    Position error (m)    Angle error (deg)
CL                0.0243                0.0524
DCL               0.0438                0.1265


navigation algorithm is smaller than that of the distributed collaborative navigation algorithm. After the landmark information collected by a single node or by multiple nodes is fused, there is only a small gap between the two algorithms. In other words, the distributed collaborative navigation algorithm based on the odometer/vision collaborative navigation model can estimate the correlation of the internode information well (see Figures 17 and 18).

Considering the high demands that the centralized collaborative navigation algorithm places on computing power and communication, the application scenarios of the two algorithms can be analyzed in combination with the abovementioned experiments: the centralized collaborative navigation algorithm is suitable when there are few nodes and the nodes carry no additional aided navigation systems; the decentralized collaborative navigation algorithm is suitable when there are many nodes, a large amount of shared information, and some nodes equipped with additional aided navigation systems, especially SLAM-aided navigation systems.

7. Conclusion

In order to improve the performance of the collaborative navigation system, multirobot collaborative navigation algorithms based on odometer/vision multisource information fusion are studied. On the basis of establishing the multisource information fusion collaborative navigation system model, the centralized and decentralized odometer/vision collaborative navigation frameworks and the vision-based SLAM are given, and the centralized and decentralized odometer/vision collaborative navigation algorithms are derived, respectively. The effectiveness of the proposed algorithms is verified by simulation experiments, which has theoretical and practical value for high-performance collaborative navigation applications.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

References

[1] K. N. Olivier, D. E. Griffith, G. Eagle et al., "Randomized trial of liposomal amikacin for inhalation in nontuberculous mycobacterial lung disease," American Journal of Respiratory and Critical Care Medicine, vol. 195, no. 6, pp. 814–823, 2017.

[2] M. Schwarz, M. Beul, D. Droeschel et al., "DRC team NimbRo Rescue: perception and control for centaur-like mobile manipulation robot Momaro," Springer Tracts in Advanced Robotics, Springer, Berlin, Germany, pp. 145–190, 2018.

[3] M. Long, H. Su, and B. Liu, "Group controllability of two-time-scale discrete-time multi-agent systems," Journal of the Franklin Institute, vol. 357, no. 6, pp. 3524–3540, 2020.

[4] T. Fukuda, S. Nakagawa, Y. Kawauchi, and M. Buss, "Structure decision method for self organising robots based on cell structures-CEBOT," in Proceedings of the 1989 International Conference on Robotics and Automation, Scottsdale, AZ, USA, May 1989.

[5] H. Asama, A. Matsumoto, and Y. Ishida, "Design of an autonomous and distributed robot system: ACTRESS," in Proceedings of the IEEE/RSJ International Workshop on Intelligent Robots and Systems (IROS '89), the Autonomous Mobile Robots and Its Applications, September 1989.

[6] J. Zhou, Y. Lv, G. Wen, X. Wu, and M. Cai, "Three-dimensional cooperative guidance law design for simultaneous attack with multiple missiles against a maneuvering target," in Proceedings of the 2018 IEEE CSAA Guidance, Navigation and Control Conference (CGNCC), August 2018.

[7] H. Su, J. Zhang, and Z. Zeng, "Formation-containment control of multi-robot systems under a stochastic sampling mechanism," Science China Technological Sciences, vol. 63, no. 6, pp. 1025–1034, 2020.

[8] H. Park and S. Hutchinson, "A distributed robust convergence algorithm for multi-robot systems in the presence of faulty robots," in Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 2980–2985, IEEE, Hamburg, Germany, September–October 2015.

[9] K. Petersen and R. Nagpal, "Complex design by simple robots: a collective embodied intelligence approach to construction," Architectural Design, vol. 87, no. 4, pp. 44–49, 2017.

[10] L. Chaimowicz, T. Sugar, V. Kumar, and M. F. M. Campos, "An architecture for tightly coupled multi-robot cooperation," in Proceedings 2001 ICRA, IEEE International Conference on Robotics and Automation (Cat. no. 01CH37164), vol. 3, pp. 2992–2997, IEEE, Seoul, Korea, May 2001.

[11] H.-X. Hu, G. Chen, and G. Wen, "Event-triggered control on quasi-average consensus in the cooperation-competition network," in Proceedings of the IECON 2018—44th Annual Conference of the IEEE Industrial Electronics Society, October 2018.

[12] A. Amanatiadis, K. Charalampous, I. Kostavelis et al., "The AVERT project: autonomous vehicle emergency recovery tool," in Proceedings of the 2013 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR), pp. 1–5, IEEE, Linkoping, Sweden, October 2013.

[13] R. Kurazume, S. Hirose, T. Iwasaki, S. Nagata, and N. Sashida, "Study on cooperative positioning system," in Proceedings of the IEEE International Conference on Robotics and Automation, Minneapolis, MN, USA, August 1996.

[14] Z. Fu, Y. Zhao, and G. Wen, "Distributed continuous-time optimization in multi-agent networks with undirected topology," in Proceedings of the 2019 IEEE 15th International Conference on Control and Automation (ICCA), November 2019.

[15] Y. Zhao, Y. Liu, and G. Wen, "Finite-time average estimation for multiple double integrators with unknown bounded inputs," in Proceedings of the 2018 33rd Youth Academic Annual Conference of Chinese Association of Automation (YAC), May 2018.

[16] S. Mao, Mobile Robot Localization in Indoor Environment, Ph.D. dissertation, Zhejiang University, Hangzhou, China, 2016.

[17] J. Yang, "Analysis approach to odometric non-systematic error uncertainty for mobile robots," Chinese Journal of Mechanical Engineering, vol. 44, no. 8, pp. 7–12, 2008.

[18] J. Kang, F. Zhang, and X. Qu, Angle Measuring Error Analysis of Coordinate Measuring System of Laser Radar, vol. 40, no. 6, pp. 834–839, 2016.


[19] J. Zhang, P. Orlik, Z. Sahinoglu, A. Molisch, and P. Kinney, "UWB systems for wireless sensor networks," Proceedings of the IEEE, vol. 97, no. 2, pp. 313–331.

[20] D. Kaushal and T. Shanmuganantham, "Design of a compact and novel microstrip patch antenna for multiband satellite applications," Materials Today: Proceedings, vol. 5, no. 10, pp. 21175–21182, 2018.

[21] J. Xiucai, Data Association Problem for Simultaneous Localization and Mapping of Mobile Robots, Ph.D. dissertation, National University of Defense Technology, Changsha, China, 2008.

[22] Z. Yuan, "Research of mobile robot's SLAM based on binocular vision," Master's thesis, Tianjin University of Technology, Tianjin, China, 2016.

[23] F. Bellavia, M. Fanfani, F. Pazzaglia, and C. Colombo, "Robust selective stereo SLAM without loop closure and bundle adjustment," in Proceedings of the International Conference on Image Analysis and Processing, pp. 462–471, Springer, Naples, Italy, 2013.

[24] H. Fourati, Multisensor Data Fusion: From Algorithms and Architectural Design to Applications, CRC Press, Boca Raton, FL, USA, 2015.

[25] S. Jia, X. Yin, and X. Li, "Mobile robot parallel PF-SLAM based on OpenMP," in Proceedings of the 2012 IEEE International Conference on Robotics and Biomimetics (ROBIO), pp. 508–513, IEEE, Guangzhou, China, December 2012.

[26] W. Zhou, E. Shiju, Z. Cao, and Y. Dong, "Review of SLAM data association study," in Proceedings of the 2016 International Conference on Sensor Network and Computer Engineering, Atlantis Press, Shanghai, China, 2016.

[27] R. Singer and R. Sea, "A new filter for optimal tracking in dense multitarget environments," in Proceedings of the Annual Allerton Conference on Circuit and System Theory, pp. 201–211, Monticello, MN, USA, 1972.

[28] J. Neira and J. D. Tardos, "Data association in stochastic mapping using the joint compatibility test," IEEE Transactions on Robotics and Automation, vol. 17, no. 6, pp. 890–897, 2001.

[29] L. Yanju, X. Yufeng, G. Song, H. Xi, and G. Zhengping, "Research on data association in SLAM based laser sensor," Microcomputer & Its Application, vol. 36, no. 2, pp. 78–82, 2017.

[30] O. Hlinka, O. Sluciak, F. Hlawatsch, and M. Rupp, "Distributed data fusion using iterative covariance intersection," in Proceedings of the 2014 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 1861–1865, IEEE, Florence, Italy, May 2014.



is more binding and more reliable. The JCBB algorithm can obtain more credible association hypotheses than the NN algorithm and exclude some incorrect association hypotheses. However, the amount of computation increases markedly, which to some extent affects the real-time performance of the SLAM navigation system.

To ensure the accuracy of data association in the SLAM process, reduce the amount of computation as much as possible, and enhance the real-time performance of the SLAM algorithm, this subsection describes an optimized data association algorithm. The classification method mentioned in [29] is used to divide the related feature information sets; finally, the appropriate feature information set in the local map and the preprocessed observed feature information set are selected to form the association space.

First, the collection of feature information in the local map is divided as follows:

\[
\begin{cases}
D[(x_m, y_m), (x_k, y_k)] \le \Delta d, & (x_k, y_k) \in F_k, \\
D[(x_m, y_m), (x_k, y_k)] > \Delta d, & (x_k, y_k) \notin F_k,
\end{cases} \tag{10}
\]

where $D[(x_m, y_m), (x_k, y_k)]$ is the relative distance between the feature information $(x_k, y_k)$ of the local map and other feature information $(x_m, y_m)$.

Then, the observed feature information set is preprocessed and divided. In the actual navigation process, the observed feature information obtained by lidar contains noise. The purpose of preprocessing is to filter out some of the noise, improving the accuracy of data association while reducing the amount of computation. The judgment process is as follows:

\[
f(i, j) = \begin{cases}
1, & D[(x_i, y_i), (x_j, y_j)] < \Delta D, \\
0, & D[(x_i, y_i), (x_j, y_j)] \ge \Delta D,
\end{cases} \tag{11}
\]

where $\Delta D$ is a threshold determined by the performance of the laser sensor. When the relative distance between two pieces of observed feature information is less than the threshold, the observed feature information is considered a feature point; otherwise, it is a noise point and does not participate in the subsequent calculation.

When the set is divided, the observed feature information is sorted in order of observation. Following the division of the local map feature information set above, the subsets are divided in turn, and no point participates in the division repeatedly.

Finally, we select the appropriate association sets to execute the data association algorithm. Each subset of local map feature information and the subset of observed feature information at the current time undergo a joint compatibility test, and the feature information with the best test results is selected to form a new subset as the data association object.
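The threshold test in (11) can be sketched as follows: an observed point is kept as a feature point only if another observation lies within the sensor-dependent threshold $\Delta D$ (a simplified Python illustration; the function name and the pairwise rule are assumptions, not the paper's exact implementation):

```python
import numpy as np

def preprocess_observations(points, delta_d):
    """Discard isolated lidar returns as noise (cf. eq. (11)): a point is kept
    only if at least one other observed point lies within delta_d of it."""
    pts = np.asarray(points, dtype=float)
    kept = []
    for i in range(len(pts)):
        dists = np.linalg.norm(pts - pts[i], axis=1)
        dists[i] = np.inf  # exclude the point's distance to itself
        if dists.min() < delta_d:
            kept.append(i)
    return pts[kept]
```

In this sketch, the two clustered returns survive while a lone distant return is rejected as noise before the joint compatibility test is run.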

4. Centralized Collaborative Navigation Algorithm

4.1. Time Update. First of all, the state space model should be established. The state vector of a single mobile robot with three degrees of freedom contains position and heading angle information. Suppose the number of nodes is $N$; the state space of the collaborative navigation system in the centralized framework then contains the state vectors of all mobile robots in the group. Let the state vector of mobile robot $i$ be $X_k^i$ and the state of the system be $X_k$. Then the state space equation of the system can be expressed as follows:

\[
X_k = \begin{bmatrix} X_k^1 \\ X_k^2 \\ \vdots \\ X_k^N \end{bmatrix}
= \begin{bmatrix} f_1(X_{k-1}^1, u_{k-1}^1) \\ f_2(X_{k-1}^2, u_{k-1}^2) \\ \vdots \\ f_N(X_{k-1}^N, u_{k-1}^N) \end{bmatrix}
+ \begin{bmatrix} w_{k-1}^1 \\ w_{k-1}^2 \\ \vdots \\ w_{k-1}^N \end{bmatrix}
\triangleq \Phi(X_{k-1}, u_{k-1}) + w_{k-1}, \tag{12}
\]

where the function $f_i(X, u)$ describes the kinematic characteristics of the mobile robot, $u_{k-1}^i = [\Delta S_{r,k-1}, \Delta S_{l,k-1}]^T$ represents the input required by mobile robot $i$ to calculate the track at time $k$, and $w_{k-1}^i$ is the system noise, with $w_{k-1}^i \sim N(0, Q_{k-1}^i)$.

It is assumed that the motion of any node is not affected by any other node and that each node moves independently, without being controlled by other nodes. Therefore, the state transition matrix for centralized collaborative positioning is given by

\[
F_{k-1} = \begin{bmatrix}
J_{X(k-1)}^1 & 0 & \cdots & 0 \\
0 & J_{X(k-1)}^2 & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & J_{X(k-1)}^N
\end{bmatrix}, \tag{13}
\]

where $J_{X(k-1)}^i$ and $J_{u(k-1)}^i$ are the Jacobian matrices of function $f$ with respect to the state vector and the control input, respectively. The system noise variance matrix of the collaborative navigation system in the centralized framework is as follows:

\[
Q_{k-1} = \begin{bmatrix}
Q_{k-1}^1 & 0 & \cdots & 0 \\
0 & Q_{k-1}^2 & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & Q_{k-1}^N
\end{bmatrix}, \tag{14}
\]

where $Q_{k-1}^i = J_{u(k-1)}^i \Sigma_u J_{u(k-1)}^{iT}$ and $\Sigma_u$ is the covariance matrix of the control input. Then the time update process of the collaborative navigation system in the centralized framework can be deduced:


\[
\begin{aligned}
X_k^- &= \Phi(X_{k-1}^+, u_{k-1}), \\
P_k^- &= F_{k-1} P_{k-1} F_{k-1}^T + Q_{k-1}
\triangleq \begin{bmatrix}
P_{11,k}^- & P_{12,k}^- & \cdots & P_{1N,k}^- \\
P_{21,k}^- & P_{22,k}^- & \cdots & P_{2N,k}^- \\
\vdots & \vdots & \ddots & \vdots \\
P_{N1,k}^- & P_{N2,k}^- & \cdots & P_{NN,k}^-
\end{bmatrix},
\end{aligned} \tag{15}
\]

where

\[
\begin{aligned}
P_{ii,k}^- &= J_{X(k-1)}^i P_{ii,k-1} J_{X(k-1)}^{iT} + Q_{k-1}^i, \\
P_{ij,k}^- &= J_{X(k-1)}^i P_{ij,k-1} J_{X(k-1)}^{jT}.
\end{aligned} \tag{16}
\]
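The time update of eqs. (12)–(16) can be sketched with a block-diagonal Jacobian. The following fragment (assumed 3-DOF states stacked into one vector; function names are illustrative) propagates the joint covariance exactly as in eq. (15):

```python
import numpy as np

def centralized_time_update(X, P, f_list, Jx_list, Q_list):
    """Prediction step of eq. (15): each robot's state is propagated by its
    own motion model f_i, and P is propagated with F = diag(J_X^1,...,J_X^N)."""
    N = len(f_list)
    X_pred = np.concatenate([f_list[i](X[3 * i:3 * i + 3]) for i in range(N)])
    F = np.zeros_like(P)
    Q = np.zeros_like(P)
    for i in range(N):
        F[3 * i:3 * i + 3, 3 * i:3 * i + 3] = Jx_list[i]  # state Jacobian blocks
        Q[3 * i:3 * i + 3, 3 * i:3 * i + 3] = Q_list[i]   # per-robot noise blocks
    P_pred = F @ P @ F.T + Q
    return X_pred, P_pred
```

Note that the cross-covariance blocks $P_{ij,k}^-$ of eq. (16) are produced automatically by the block-diagonal $F$ in the product $F P F^T$.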

4.2. Single-Node Measurement Update. In this section, the measurement update process involving only one node in the centralized framework is described. The aided navigation system selected is the SLAM navigation system, which integrates the landmark information of the surrounding environment measured by lidar. In this paper, a measurement model for this navigation system is built, and the EKF-based measurement update process is described.

4.2.1. Measurement Model Based on SLAM. The measurement model based on SLAM is the measurement model after data association. In this paper, the position information of landmarks obtained by lidar is taken as the observation equation:

\[
z_k^i = \begin{bmatrix} x_l^b \\ y_l^b \end{bmatrix}
= \begin{bmatrix}
(x_l^w - x_k^i)\cos\theta_k^i + (y_l^w - y_k^i)\sin\theta_k^i \\
-(x_l^w - x_k^i)\sin\theta_k^i + (y_l^w - y_k^i)\cos\theta_k^i
\end{bmatrix} + n_k^i, \tag{17}
\]

where $(x_l^b, y_l^b)$ is the position information for the landmark obtained by lidar, $(x_l^w, y_l^w)$ are the coordinates of the landmark in the world coordinate system, $(x_k^i, y_k^i, \theta_k^i)$ is the state of the mobile robot at time $k$, and $n_k^i$ is the measurement noise with variance matrix $R_k^i$, denoted as $n_k^i \sim N(0, R_k^i)$. After linearization and state extension, the observation equation of the whole system can be obtained:

\[
z_k^i = H_k^i X_k + h^i(X_k^{i-}) - \nabla h^i X_k^{i-} + n_k^i, \tag{18}
\]

where

\[
H_k^i = [\, 0 \; \cdots \; \nabla h^i \; \cdots \; 0 \,]_{2 \times 3N}, \tag{19}
\]

and $\nabla h^i$ is the Jacobian matrix of function $h^i(X_k^i)$.

4.2.2. Measurement Update Based on EKF. Combined with the basic principle of the Kalman filter, the measurement update process of the aided navigation system for a single node can be obtained as follows.

Step 1: calculate the innovation and the filter gain:

\[
\begin{aligned}
v &= z_k^i - h^i(X_k^{i-}), \\
S^i &= H_k^i P_k^- (H_k^i)^T + R_k^i, \\
K^i &= P_k^- (H_k^i)^T (S^i)^{-1}.
\end{aligned} \tag{20}
\]

Step 2: update the state estimate and the corresponding covariance:

\[
\begin{aligned}
X_k^+ &= X_k^- + K^i v, \\
P_k &= P_k^- - P_k^- (H_k^i)^T (S^i)^{-1} H_k^i P_k^-.
\end{aligned} \tag{21}
\]
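Steps 1 and 2 above map directly onto a few lines of linear algebra; a minimal sketch (generic $h$ and $H$ supplied by the caller; names are illustrative):

```python
import numpy as np

def ekf_measurement_update(X_pred, P_pred, z, h, H, R):
    """EKF correction of eqs. (20)-(21): innovation, gain, then state and
    covariance update."""
    v = z - h(X_pred)                      # innovation
    S = H @ P_pred @ H.T + R               # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)    # filter gain
    X_upd = X_pred + K @ v
    P_upd = P_pred - K @ S @ K.T           # equals P - P H^T S^{-1} H P
    return X_upd, P_upd
```

The covariance form `P - K S K^T` is algebraically identical to eq. (21) but reuses quantities already computed in Step 1.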

4.3. Relative Measurement Update among Nodes. The standard observation model can be divided into two types: the measurement model based on relative distance and the measurement model based on relative position.

4.3.1. Measurement Model Based on Relative Distance. The observation of mobile robot $j$ by mobile robot $i$ at time $k$ can be denoted by $z_k^{ij}$; the observation equation is then given by

\[
z_k^{ij} = h^{ij}(X_k^i, X_k^j) + n_k^{ij}
= \sqrt{(x_k^i - x_k^j)^2 + (y_k^i - y_k^j)^2} + n_k^{ij}, \tag{22}
\]

where $n_k^{ij}$ is the measurement noise with variance matrix $R_k^{ij} = \sigma_{UWB}$, denoted as $n_k^{ij} \sim N(0, R_k^{ij})$, and $\sigma_{UWB}$ is the variance of the UWB ranging.

After linearization and state extension, the observation equation of the whole system can be obtained:

\[
z_k^{ij} = H_k^{ij} X_k + h^{ij}(X_k^{i-}, X_k^{j-}) - \nabla h_i^{ij} X_k^{i-} - \nabla h_j^{ij} X_k^{j-} + n_k^{ij}, \tag{23}
\]

where

\[
H_k^{ij} = [\, 0 \; \cdots \; \nabla h_i^{ij} \; \cdots \; \nabla h_j^{ij} \; \cdots \; 0 \,]_{2 \times 3N}, \tag{24}
\]

and $\nabla h_i^{ij}$ and $\nabla h_j^{ij}$ are the Jacobian matrices of function $h^{ij}(X^i, X^j)$ with respect to $X^i$ and $X^j$, respectively.
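For the range model of eq. (22), the nonzero Jacobian blocks follow from differentiating the Euclidean distance; a small sketch (planar 3-DOF poses; names are illustrative):

```python
import numpy as np

def range_measurement(pose_i, pose_j):
    """Range of eq. (22) and its Jacobian blocks w.r.t. (x, y, theta) of
    robots i and j; the heading does not affect the range."""
    dx = pose_i[0] - pose_j[0]
    dy = pose_i[1] - pose_j[1]
    r = np.hypot(dx, dy)
    Hi = np.array([dx / r, dy / r, 0.0])   # d r / d pose_i
    Hj = -Hi                               # d r / d pose_j
    return r, Hi, Hj
```

These two 1×3 blocks occupy the $i$ and $j$ slots of the sparse system-level measurement matrix, all other entries being zero.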

4.3.2. Measurement Model Based on Relative Position. Using lidar as the sensor, the relative observation among nodes can be realized in two ways: the direct method and the indirect method. The direct method measures the relative position between the two nodes directly; the indirect method uses lidar to observe the landmarks nearest to the two nodes, and the relative position between the two nodes is then obtained by the associated calculation.

The state of mobile robot $i$ at time $k$ is denoted by $(x_k^i, y_k^i, \theta_k^i)$, and the state of mobile robot $j$ by $(x_k^j, y_k^j, \theta_k^j)$. The coordinates of landmark $L_1$ adjacent to mobile robot $i$ are $(x_{l1}^w, y_{l1}^w)$ in the world coordinate system and $(x_{l1}^i, y_{l1}^i)$ in the coordinate system of mobile robot $i$; the coordinates of landmark $L_2$ adjacent to mobile robot $j$ are $(x_{l2}^w, y_{l2}^w)$ in the world coordinate system and $(x_{l2}^j, y_{l2}^j)$ in the coordinate system of mobile robot $j$. The specific solution process of the indirect method is as follows (see Figure 9):

\[
\begin{bmatrix} x_k^j - x_k^i \\ y_k^j - y_k^i \end{bmatrix}
= \begin{bmatrix}
x_{l2}^j \cos\theta_k^j - y_{l2}^j \sin\theta_k^j \\
x_{l2}^j \sin\theta_k^j + y_{l2}^j \cos\theta_k^j
\end{bmatrix}
- \begin{bmatrix}
x_{l1}^i \cos\theta_k^i - y_{l1}^i \sin\theta_k^i \\
x_{l1}^i \sin\theta_k^i + y_{l1}^i \cos\theta_k^i
\end{bmatrix}
+ \begin{bmatrix}
x_{l1}^w - x_{l2}^w \\
y_{l1}^w - y_{l2}^w
\end{bmatrix}. \tag{25}
\]

When mobile robot $i$ observes mobile robot $j$ at time $k$, the coordinates of mobile robot $j$ in the coordinate system of mobile robot $i$ are taken as the observation. The observation equation is as follows:

\[
z_k^{ij} = \begin{bmatrix} x_k^{ji} \\ y_k^{ji} \end{bmatrix}
= \begin{bmatrix}
(x_k^j - x_k^i)\cos\theta_k^i + (y_k^j - y_k^i)\sin\theta_k^i \\
-(x_k^j - x_k^i)\sin\theta_k^i + (y_k^j - y_k^i)\cos\theta_k^i
\end{bmatrix} + n_k^{ij}, \tag{26}
\]

where $n_k^{ij}$ is the measurement noise with variance matrix $R_k^{ij}$, denoted as $n_k^{ij} \sim N(0, R_k^{ij})$, and $(x_k^{ji}, y_k^{ji})$ are the coordinates of mobile robot $j$ in the coordinate system of mobile robot $i$ at time $k$.
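The direct relative-position observation of eq. (26) is simply the world-frame offset between the two robots rotated into robot i's body frame; a minimal sketch (illustrative names):

```python
import numpy as np

def relative_position_in_i(pose_i, pose_j):
    """Eq. (26) without noise: coordinates of robot j expressed in the body
    frame of robot i, given planar poses (x, y, theta)."""
    xi, yi, thi = pose_i
    xj, yj, _ = pose_j
    c, s = np.cos(thi), np.sin(thi)
    dx, dy = xj - xi, yj - yi
    return np.array([c * dx + s * dy,     # x_k^{ji}
                     -s * dx + c * dy])   # y_k^{ji}
```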

4.3.3. Measurement Update Based on EKF. Similarly, we can finally obtain the measurement update process for the relative observation between nodes.

5. Decentralized Collaborative Navigation Algorithm

The state and covariance information of each node under the decentralized collaborative navigation algorithm is calculated separately. In order to avoid overoptimistic estimation to the greatest extent, the concept of the covariance intersection filter is introduced, and the covariance of each node is divided into correlated and independent components.

5.1. Covariance Intersection Filter. Given the state estimate $\hat{X}$ and the corresponding covariance matrix $P$, and assuming that $P^*$ is the covariance of the error between the state estimate $\hat{X}$ and the true state $X^*$, we have

\[
P^* = E[(\hat{X} - X^*)(\hat{X} - X^*)^T]. \tag{27}
\]

Consistency is a property of the covariance matrix of an estimate [30]. When the covariance matrix of the state estimate is not smaller than the true covariance, the estimate is said to satisfy consistency; that is, no overoptimistic estimate is produced. Suppose two state estimates $\hat{X}_1$ and $\hat{X}_2$ are independent and satisfy consistency, with corresponding covariances $P_1$ and $P_2$. If there is in fact a correlation between the two estimates, the Kalman filter may produce inconsistent results; in other words, it leads to overoptimistic estimation. To avoid this, the covariance of each estimate is split into a dependent part and an independent part, and the two estimates are fused as follows:

\[
P_1 = \frac{P_{1d}}{w} + P_{1i}, \tag{28}
\]

\[
P_2 = \frac{P_{2d}}{1 - w} + P_{2i}, \tag{29}
\]

\[
P^{-1} = P_1^{-1} + P_2^{-1}, \tag{30}
\]

\[
\hat{X} = P\left(P_1^{-1}\hat{X}_1 + P_2^{-1}\hat{X}_2\right), \tag{31}
\]

\[
P_i = P\left(P_1^{-1}P_{1i}P_1^{-1} + P_2^{-1}P_{2i}P_2^{-1}\right)P, \tag{32}
\]

\[
P_d = P - P_i, \tag{33}
\]

Figure 9: Indirect observation schematic diagram.


where the covariances corresponding to the two state estimates $\hat{X}_1$ and $\hat{X}_2$ are $P_{1d} + P_{1i}$ and $P_{2d} + P_{2i}$, respectively. $P_{1d}$ and $P_{2d}$ are the correlated covariance components, corresponding to the case of maximum correlation between the two state estimates; $P_{1i}$ and $P_{2i}$ are the independent covariance components, corresponding to absolute independence of the two state estimates. $w \in [0, 1]$ is an optimization parameter that minimizes the covariance after fusion, and any $w$ in this interval ensures the consistency of the fusion result.
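The following sketch shows the basic covariance intersection rule that underlies eqs. (28)–(33), with $w$ chosen by a simple grid search minimizing the fused trace; the split into dependent and independent components is omitted here for brevity (an illustrative simplification, not the paper's full split-CI update):

```python
import numpy as np

def covariance_intersection(x1, P1, x2, P2, n_grid=101):
    """Fuse two possibly correlated estimates consistently:
    P^{-1} = w P1^{-1} + (1 - w) P2^{-1}; any w in [0, 1] keeps consistency,
    and w is picked on a grid to minimize trace(P)."""
    P1_inv, P2_inv = np.linalg.inv(P1), np.linalg.inv(P2)
    best = None
    for w in np.linspace(0.0, 1.0, n_grid):
        info = w * P1_inv + (1.0 - w) * P2_inv
        P = np.linalg.inv(info)
        if best is None or np.trace(P) < best[0]:
            x = P @ (w * P1_inv @ x1 + (1.0 - w) * P2_inv @ x2)
            best = (np.trace(P), x, P)
    return best[1], best[2]
```

Unlike a naive Kalman fusion, this rule never claims less uncertainty than is justified, regardless of the unknown cross-correlation between the two inputs.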

5.2. Time Update. Before describing the time update process of the DCL algorithm, it is necessary to decompose the state information of the system in the centralized collaborative navigation framework, which can be expressed as

\[
\begin{aligned}
E_G &= \{X_G, P_G\} \\
&= \{\{X_1, P_1\}, \{X_2, P_2\}, \ldots, \{X_N, P_N\}\} \\
&= \{\{X_1, P_{1d} + P_{1i}\}, \{X_2, P_{2d} + P_{2i}\}, \ldots, \{X_N, P_{Nd} + P_{Ni}\}\},
\end{aligned} \tag{34}
\]

where $E_G$ is the set of states under the centralized collaborative navigation framework, and $X_G$ and $P_G$ are the state space and the corresponding covariance matrix under the centralized collaborative navigation framework, respectively.

The state propagation process under the decentralized collaborative navigation framework is the state propagation process of a single node, and the propagation of the covariance can be expressed as

\[
\begin{aligned}
P_k^{i-} &= J_{X(k-1)}^i P_{k-1}^i J_{X(k-1)}^{iT} + J_{u(k-1)}^i \Sigma_u J_{u(k-1)}^{iT}, \\
P_{ki}^{i-} &= J_{X(k-1)}^i P_{(k-1)i}^i J_{X(k-1)}^{iT} + J_{u(k-1)}^i \Sigma_u J_{u(k-1)}^{iT},
\end{aligned} \tag{35}
\]

where $P_k^{i-}$ is the one-step prediction covariance matrix of mobile robot $i$ at time $k$ and $P_{ki}^{i-}$ is its independent covariance component. $J_{X(k-1)}^i$ and $J_{u(k-1)}^i$ are the Jacobian matrices of function $f_i(X, u)$ with respect to the state vector and the control input, and $\Sigma_u$ is the error matrix of the control input.

5.3. Single Node Measurement Update. The measurement update process of a single node only involves the aided navigation system of that node, so there is no need to estimate the correlation; that is, formulas (28) and (29) can be skipped. Similar to the single node measurement update in centralized collaborative navigation, the single node measurement update in decentralized collaborative navigation can be expressed as follows.

Step 1: calculate the innovation and the filter gain:

\[
\begin{aligned}
v &= z_k^i - \nabla h^i X_k^{i-}, \\
S &= \nabla h^i P_k^{i-} (\nabla h^i)^T + R_k^i, \\
K &= P_k^{i-} (\nabla h^i)^T S^{-1}.
\end{aligned} \tag{36}
\]

Step 2: update the state estimate and the corresponding covariance:

\[
\begin{aligned}
X_k^{i+} &= X_k^{i-} + Kv, \\
P_k^i &= (I - K\nabla h^i) P_k^{i-}, \\
P_{ki}^i &= (I - K\nabla h^i) P_{ki}^{i-} (I - K\nabla h^i)^T + K R_k^i K^T, \\
P_{kd}^i &= P_k^i - P_{ki}^i.
\end{aligned} \tag{37}
\]
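The two steps above extend the single-node EKF update with separate bookkeeping for the independent covariance component; a minimal sketch (illustrative names, generic measurement function supplied by the caller):

```python
import numpy as np

def dcl_single_node_update(x, P, P_i, z, h, H, R):
    """Eqs. (36)-(37): standard EKF correction for the total covariance P plus
    propagation of the independent part P_i; the dependent part is P - P_i."""
    v = z - h(x)
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    I_KH = np.eye(len(x)) - K @ H
    x_new = x + K @ v
    P_new = I_KH @ P
    P_i_new = I_KH @ P_i @ I_KH.T + K @ R @ K.T
    return x_new, P_new, P_i_new, P_new - P_i_new
```

Keeping `P_i` in Joseph form guarantees it stays symmetric and positive semidefinite, which the later covariance-intersection fusion relies on.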

5.4. Collaborative Measurement Update among Nodes. In the decentralized collaborative navigation framework, the state estimation results of a single node's aided navigation system and the state estimation results based on information sharing among nodes are integrated in the internode collaborative measurement update, and the corrected state information is derived.

In the decentralized collaborative navigation framework, any node can explicitly estimate the states of other nodes. In order to save communication cost and reduce the computation load of a single mobile robot platform, this paper stipulates that information exchange takes place only between two adjacent mobile robot nodes.

Assuming that mobile robot $i$ performs a relative observation of mobile robot $j$ at time $k$ and shares its own state and covariance information with mobile robot $j$, the state of mobile robot $j$ can be explicitly expressed using the received state of mobile robot $i$ and the relative measurement information between the two nodes:

\[
X_{jk}^{co} = \begin{bmatrix} x_{jk}^{co} \\ y_{jk}^{co} \end{bmatrix}
= \begin{bmatrix}
x_k^i + x_k^{ji}\cos\theta_k^i - y_k^{ji}\sin\theta_k^i \\
y_k^i + x_k^{ji}\sin\theta_k^i + y_k^{ji}\cos\theta_k^i
\end{bmatrix}, \tag{38}
\]

where $(x_{jk}^{co}, y_{jk}^{co})$ is the partial state estimate of mobile robot $j$ obtained through the information sharing between mobile robot $i$ and mobile robot $j$ at time $k$, $(x_k^i, y_k^i, \theta_k^i)$ is the state vector shared by mobile robot $i$ with mobile robot $j$ at time $k$, and $u_{rel} = (x_k^{ji}, y_k^{ji})$ is the relative measurement information of the two nodes in the coordinate system of mobile robot $i$.

If there is a direct relative observation between the two nodes, the relative measurement information can be obtained directly by the sensor that carries out the relative observation. If the relative observation between the two nodes relies on the indirect relative observation of the surrounding landmark information, then the relative measurement information needs to be solved first; the concrete solution can be obtained by combining (25) and then converting into the coordinate system of the mobile robot i.
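As a concrete illustration of (38), the shared-position computation is just a rotation of the body-frame relative measurement into the world frame followed by a translation. A small sketch (the numeric values in the usage below are hypothetical):

```python
import numpy as np

def shared_position(x_i, y_i, theta_i, x_ji, y_ji):
    """Position of robot j implied by robot i's state (x_i, y_i, theta_i)
    and the relative measurement (x_ji, y_ji) expressed in robot i's
    body frame -- eq. (38)."""
    c, s = np.cos(theta_i), np.sin(theta_i)
    x_co = x_i + x_ji * c - y_ji * s
    y_co = y_i + x_ji * s + y_ji * c
    return x_co, y_co
```

For example, with theta_i = 0 the body frame is aligned with the world frame and the relative measurement is simply added to robot i's position.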

Finally, based on the principle of the covariance intersection filter, the updating process of collaborative measurement among nodes in the framework of decentralized collaborative navigation can be obtained.

Mathematical Problems in Engineering 9

6. Simulation Results

6.1. Simulated Experimental Environment. In this section, the number of mobile robot nodes involved in collaborative navigation is 3. Assuming that the area of the 2D moving environment is 25 m × 25 m, when the mobile robot group works together, each node in the environment is assigned an initial position, and each node can follow a random trajectory in this area. It is assumed that all nodes follow the same simulation trajectory and only the initial positions differ. The maximum speed of the mobile robot on the straight line is 0.3125 m/s, and the angular velocity at the bend is 0.1 deg/s. It is assumed that 88 landmarks (environmental feature points) around the simulated rectangular trajectory can be extracted by lidar scanning for the SLAM-aided navigation system (see Figure 10).

During this simulation, the mobile robots as carriers can carry different types of sensors, including odometers, UWB, and lidar. Suitable sensors are selected according to the requirements of positioning accuracy, among which Time Domain P410 UWB sensors are used to measure the relative distance, and the lidar is selected from the LMS291 series of 2D lidars produced by a German company. Based on the relevant parameters of these sensors, which are shown in Table 1, a simulation model for mobile robots carrying different types of sensors is built using MATLAB.

6.2. Relative Measurement Aided Odometer Collaborative Navigation. In the experiment, all three mobile robots are equipped with an odometer capable of motion monitoring, UWB capable of measuring relative distance, or lidar capable of measuring relative position.

From Figure 11, it can be seen that the collaborative navigation system which realizes relative information sharing has significant advantages in positioning accuracy over the case of not sharing any information. Besides, the improvement of the group navigation performance of the mobile robots is affected by the type of shared relative information. When the relative position information is shared, the growth of the error can be effectively limited; relatively speaking, when the relative distance information is shared, the position error still grows slowly, and only the growth rate of the error is reduced (see Figure 11).

The analysis shows that the relative distance information is weakly constrained, so sharing this information cannot effectively realize the navigation and localization of the mobile robots. In contrast, the sharing of relative position information contains the solution to mobile robot navigation and localization, and the information accuracy is significantly improved; it can even be increased by more than 60% at some times. This difference is more obvious in the angle error diagram (see Figure 11).

In this paper, two observation methods, direct relative measurement and indirect relative measurement, are mentioned in the description of the measurement model based on relative position. Based on this experimental scene, in scenario I the three mobile robots observe the relative position information directly through lidar; in scenario II the three mobile robots extract the surrounding landmark information through lidar, and the relative position information is calculated on this basis. In both experimental scenarios, the centralized collaborative navigation algorithm is used to solve the navigation problem. The two relative position measurement methods are compared through the above simulation scenarios, and the comparison results are shown in Figure 12.

Through Figure 12, it is clear that the collaborative navigation and positioning accuracy of relative position measurement using the direct method is better than that of the indirect method. However, computational cost cannot be ignored while navigation performance is considered. The direct method requires that the measurement range of the lidar include the activity range of the whole mobile robot group, while the measurement range of the lidar required by the indirect method only needs to include the surrounding landmarks, which greatly reduces the cost. Considering that the accuracies of collaborative navigation and positioning using the two relative position measurement methods are not much different, it is obvious that the indirect method is more suitable for practical application (see Figure 12).

The difference of the decentralized collaborative navigation framework compared with the centralized collaborative navigation framework is that the correlation among

Figure 10: Simulation trajectory diagram (landmarks and trajectory in the 25 m × 25 m area; X and Y in meters).

Table 1: Relevant parameters of sensors.

Type       Measure                             Frequency (Hz)   Error
Odometer   Linear velocity of the two wheels   20               4 cm/s
UWB        Relative distance                   10               3 cm
Lidar      Relative position of X direction    10               2 cm
Lidar      Relative position of Y direction    10               2 cm


Figure 11: Comparative diagram of navigation error under the condition of sharing different relative information (relative distance, relative position, no information). (a) Position error. (b) Angle error.

Figure 12: Comparison of navigation errors under different relative position measuring methods (direct and indirect). (a) Position error. (b) Angle error.

Figure 13: Comparison of navigation errors under different collaborative navigation algorithms (CL and DCL). (a) Position error. (b) Angle error.


the different node states is accurately calculated in the centralized collaborative navigation framework, while this correlation cannot be used in the decentralized collaborative navigation framework. In order to better reflect the impact of this correlation, the navigation errors of the two collaborative navigation algorithms in the odometer collaborative navigation system are shown in Figure 13.

To compare the two algorithms, 20 experiments are carried out in this paper, and the root mean square error (RMS) of the two collaborative navigation algorithms is calculated as shown in the following formula:

\mathrm{RMS} = \sqrt{\frac{1}{n}\sum_{i=1}^{n} (\hat{x}_i - x_i)^2},   (39)

where n is the total number of samples, x_i is the actual value, and \hat{x}_i is the estimated value. The RMS parameters for the odometer collaborative navigation are shown in Table 2.
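Formula (39) amounts to the following one-liner; a sketch with made-up sample values in the usage:

```python
import numpy as np

def rms_error(actual, estimated):
    """Root mean square error over n samples -- eq. (39)."""
    actual = np.asarray(actual, dtype=float)
    estimated = np.asarray(estimated, dtype=float)
    return np.sqrt(np.mean((estimated - actual) ** 2))
```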

As can be seen from Figure 13 and Table 2, the error of the centralized collaborative navigation algorithm is smaller than that of the decentralized collaborative navigation algorithm. This can be predicted, because the correlation among node states in the centralized collaborative navigation algorithm can be calculated accurately, whereas it is only estimated in the decentralized collaborative navigation algorithm. However, the improved accuracy of the navigation comes at the expense of high computing power and high-quality data communication. Therefore, although the performance of the centralized collaborative navigation framework is better than that of the distributed collaborative navigation framework, the centralized framework is not applicable in some practical scenarios (see Figure 13).

6.3. Odometer/Vision SLAM Collaborative Navigation. In the odometer/vision collaborative navigation model, scenario I is designed so that all the mobile robots are equipped with an odometer which can monitor the motion, and one of the mobile robots is equipped with a SLAM-aided navigation system and can work properly.

Firstly, the mobile robot with the SLAM-aided navigation system is studied, and it only runs its own integrated navigation algorithm without sharing the relative position information. Using the centralized collaborative navigation algorithm, the navigation error of the node with the SLAM-aided navigation system is shown in Figure 14.

Figure 14 fully verifies the correctness of the centralized collaborative navigation algorithm based on the odometer/vision collaborative navigation model. The SLAM-aided navigation system is based on relative observation; the position estimates of the node itself and of the landmarks both accumulate error, but when the data association algorithm of SLAM is combined with the centralized collaborative navigation algorithm, the position estimates of the landmarks move closer to the real values while the positioning accuracy of the mobile robot is improved, the data association becomes more reliable, and the state estimation of the mobile robot itself is further corrected. Therefore, the improvement the algorithm brings to the navigation accuracy of the mobile robot is very obvious (see Figure 14).

Then, the mobile robots without the SLAM-aided navigation system in the experiment are studied. In order to fully reflect the influence of the SLAM-aided navigation information on the navigation performance of other nodes, scenario II is designed so that all mobile robots are equipped with an odometer which can monitor the motion, and two of them are equipped with a SLAM-aided navigation system and can work properly. The navigation error of the other nodes without the SLAM-aided navigation system is shown in Figure 15.

As shown in Figure 15, the mobile robot with the SLAM-aided navigation system performs loop-closure detection at about 320 seconds, and the data are associated with the local map created at the initial location, thus eliminating most of the accumulated errors. The uniquely superior performance of the SLAM-aided navigation system is transmitted to the other nodes in the group through the process of information sharing during collaborative navigation, so that they can also eliminate most of the accumulated errors around that time, which is an important advantage of the collaborative navigation system (see Figure 15).

To verify the effect of the NN algorithm, the JCBB algorithm, and the optimized data association algorithm on the navigation performance of nodes without the SLAM-aided navigation system, the experimental scene is designed so that all mobile robots are equipped with an odometer which can carry out motion monitoring, one of the mobile robots is equipped with a SLAM-aided navigation system and can work normally, and the CL algorithm is run. The navigation error of the nodes without the SLAM-aided navigation system is shown in Figure 16.

The performance of the centralized collaborative navigation algorithm under the three SLAM data association algorithms is shown in Table 3.

From Figure 16 and Table 3, it can be seen that the navigation performance of the nodes without the SLAM-aided navigation system is affected by the SLAM data association algorithm used by the nodes carrying the SLAM-aided navigation system. Running the NN algorithm, the matching accuracy of the feature information is not high, so the navigation accuracy is poor. Running the JCBB algorithm, the correct rate of data association is the highest, but the operation time is the longest. Running the optimized data association algorithm, the navigation accuracy is slightly reduced, but the operation time is shorter, which can meet the real-time requirements (see Figure 16).

In this subsection, to compare the performance of collaborative navigation in the odometer/vision collaborative navigation system under the centralized and

Table 2: Odometer collaborative navigation RMS parameters.

Algorithm type   Position error (m)   Angle error (deg)
CL               0.1629               0.74625
DCL              0.36342              1.3762


Figure 14: The navigation error map of the node with the SLAM-aided navigation system (no information vs. relative position). (a) Position error. (b) Angle error.

Figure 15: Some nodes are equipped with the SLAM-aided navigation system (one vs. two nodes with SLAM). (a) Position error. (b) Angle error.

Figure 16: Comparison diagram of navigation error for fusion of single node SLAM information under different SLAM data association algorithms (NN, JCBB, optimization algorithm). (a) Position error. (b) Angle error.


decentralized collaborative navigation algorithms, we run the CL and DCL algorithms separately under experimental scenario I. The navigation errors of the two collaborative navigation algorithms are compared as shown in Figure 17. Under experimental scenario II of this subsection, we run the CL algorithm and the DCL algorithm, respectively. The navigation errors of the two collaborative navigation algorithms are compared as shown in Figure 18.

After 20 experiments, the RMS parameters of collaborative navigation with fused single node SLAM information are shown in Table 4.

The RMS parameters of collaborative navigation with fused multinode SLAM information are shown in Table 5.

As can be seen from Figures 17 and 18, in conjunction with Tables 4 and 5, in the odometer/vision collaborative navigation system the error of the centralized collaborative

Table 3: Performance comparison of centralized collaborative navigation algorithms under different SLAM data association algorithms.

Algorithm type   Position error (m)   Angle error (deg)   Relative time
NN               2.8323               10.7919             4
JCBB             0.0322               0.1623              12
Optimization     0.5587               2.2476              1

Figure 17: Comparative diagram of navigation error for fusion of single node SLAM information under different collaborative navigation algorithms (CL and DCL). (a) Position error. (b) Angle error.

Figure 18: Comparison diagram of navigation error for fusion of multinode SLAM information under different collaborative navigation algorithms (CL and DCL). (a) Position error. (b) Angle error.

Table 4: Collaborative navigation RMS parameters for fusion of single node SLAM information.

Algorithm type   Position error (m)   Angle error (deg)
CL               0.0322               0.1623
DCL              0.0669               0.2094

Table 5: Collaborative navigation RMS parameters for fusion of multinode SLAM information.

Algorithm type   Position error (m)   Angle error (deg)
CL               0.0243               0.0524
DCL              0.0438               0.1265


navigation algorithm is smaller than that of the distributed collaborative navigation algorithm; after the landmark information collected by the single node or the multinode is fused, there is only a small gap between the two algorithms. In other words, the distributed collaborative navigation algorithm based on the odometer/vision collaborative navigation model can well estimate the correlation of the internode information (see Figures 17 and 18).

Considering the high requirements of the centralized collaborative navigation algorithm for computing power and communication quality, the application scenarios of the two algorithms are analyzed in combination with the abovementioned collaborative navigation experiments: the centralized collaborative navigation algorithm is suitable for the case where there are few nodes and the nodes are not equipped with an additional aided navigation system; the decentralized collaborative navigation algorithm is suitable for a large number of nodes and a large amount of shared information, with some nodes equipped with additional aided navigation systems, especially a SLAM-aided navigation system.

7. Conclusion

In order to improve the performance of the collaborative navigation system, multirobot collaborative navigation algorithms based on odometer/vision multisource information fusion are studied. On the basis of establishing the multisource information fusion collaborative navigation system model, the centralized collaborative navigation framework of odometer/vision fusion, the decentralized collaborative navigation framework, and the vision-based SLAM are given, and the centralized and decentralized odometer/vision collaborative navigation algorithms are derived, respectively. The effectiveness of the proposed algorithms is verified by simulation experiments, which has some theoretical and application value for high performance collaborative navigation applications.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

References

[1] K. N. Olivier, D. E. Griffith, G. Eagle et al., "Randomized trial of liposomal amikacin for inhalation in nontuberculous mycobacterial lung disease," American Journal of Respiratory and Critical Care Medicine, vol. 195, no. 6, pp. 814–823, 2017.

[2] M. Schwarz, M. Beul, D. Droeschel et al., "DRC team NimbRo Rescue: perception and control for centaur-like mobile manipulation robot Momaro," Springer Tracts in Advanced Robotics, Springer, Berlin, Germany, pp. 145–190, 2018.

[3] M. Long, H. Su, and B. Liu, "Group controllability of two-time-scale discrete-time multi-agent systems," Journal of the Franklin Institute, vol. 357, no. 6, pp. 3524–3540, 2020.

[4] T. Fukuda, S. Nakagawa, Y. Kawauchi, and M. Buss, "Structure decision method for self organising robots based on cell structures-CEBOT," in Proceedings of the 1989 International Conference on Robotics and Automation, Scottsdale, AZ, USA, May 1989.

[5] H. Asama, A. Matsumoto, and Y. Ishida, "Design of an autonomous and distributed robot system ACTRESS," in Proceedings of the IEEE/RSJ International Workshop on Intelligent Robots and Systems (IROS '89), the Autonomous Mobile Robots and its Applications, September 1989.

[6] J. Zhou, Y. Lv, G. Wen, X. Wu, and M. Cai, "Three-dimensional cooperative guidance law design for simultaneous attack with multiple missiles against a maneuvering target," in Proceedings of the 2018 IEEE CSAA Guidance, Navigation and Control Conference (CGNCC), August 2018.

[7] H. Su, J. Zhang, and Z. Zeng, "Formation-containment control of multi-robot systems under a stochastic sampling mechanism," Science China Technological Sciences, vol. 63, no. 6, pp. 1025–1034, 2020.

[8] H. Park and S. Hutchinson, "A distributed robust convergence algorithm for multi-robot systems in the presence of faulty robots," in Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 2980–2985, IEEE, Hamburg, Germany, September-October 2015.

[9] K. Petersen and R. Nagpal, "Complex design by simple robots: a collective embodied intelligence approach to construction," Architectural Design, vol. 87, no. 4, pp. 44–49, 2017.

[10] L. Chaimowicz, T. Sugar, V. Kumar, and M. F. M. Campos, "An architecture for tightly coupled multi-robot cooperation," in Proceedings 2001 ICRA, IEEE International Conference on Robotics and Automation (Cat. no. 01CH37164), vol. 3, pp. 2992–2997, IEEE, Seoul, Korea, May 2001.

[11] H.-X. Hu, G. Chen, and G. Wen, "Event-triggered control on quasi-average consensus in the cooperation-competition network," in Proceedings of the IECON 2018, 44th Annual Conference of the IEEE Industrial Electronics Society, October 2018.

[12] A. Amanatiadis, K. Charalampous, I. Kostavelis et al., "The AVERT project: autonomous vehicle emergency recovery tool," in Proceedings of the 2013 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR), pp. 1–5, IEEE, Linkoping, Sweden, October 2013.

[13] R. Kurazume, S. Hirose, T. Iwasaki, S. Nagata, and N. Sashida, "Study on cooperative positioning system," in Proceedings of the IEEE International Conference on Robotics and Automation, Minneapolis, MN, USA, August 1996.

[14] Z. Fu, Y. Zhao, and G. Wen, "Distributed continuous-time optimization in multi-agent networks with undirected topology," in Proceedings of the 2019 IEEE 15th International Conference on Control and Automation (ICCA), November 2019.

[15] Y. Zhao, Y. Liu, and G. Wen, "Finite-time average estimation for multiple double integrators with unknown bounded inputs," in Proceedings of the 2018 33rd Youth Academic Annual Conference of Chinese Association of Automation (YAC), May 2018.

[16] S. Mao, Mobile robot localization in indoor environment, Ph.D. dissertation, Zhejiang University, Hangzhou, China, 2016.

[17] J. Yang, "Analysis approach to odometric non-systematic error uncertainty for mobile robots," Chinese Journal of Mechanical Engineering, vol. 44, no. 8, pp. 7–12, 2008.

[18] J. Kang, F. Zhang, and X. Qu, "Angle measuring error analysis of coordinate measuring system of laser radar," vol. 40, no. 6, pp. 834–839, 2016.


[19] J. Zhang, P. Orlik, Z. Sahinoglu, A. Molisch, and P. Kinney, "UWB systems for wireless sensor networks," Proceedings of the IEEE, vol. 97, no. 2, pp. 313–331.

[20] D. Kaushal and T. Shanmuganantham, "Design of a compact and novel microstrip patch antenna for multiband satellite applications," Materials Today: Proceedings, vol. 5, no. 10, pp. 21175–21182, 2018.

[21] J. Xiucai, Data association problem for simultaneous localization and mapping of mobile robots, Ph.D. dissertation, National University of Defense Technology, Changsha, China, 2008.

[22] Z. Yuan, "Research of mobile robot's SLAM based on binocular vision," Master's thesis, Tianjin University of Technology, Tianjin, China, 2016.

[23] F. Bellavia, M. Fanfani, F. Pazzaglia, and C. Colombo, "Robust selective stereo SLAM without loop closure and bundle adjustment," in Proceedings of the International Conference on Image Analysis and Processing, pp. 462–471, Springer, Naples, Italy, 2013.

[24] H. Fourati, Multisensor Data Fusion: From Algorithms and Architectural Design to Applications, CRC Press, Boca Raton, FL, USA, 2015.

[25] S. Jia, X. Yin, and X. Li, "Mobile robot parallel PF-SLAM based on OpenMP," in Proceedings of the 2012 IEEE International Conference on Robotics and Biomimetics (ROBIO), pp. 508–513, IEEE, Guangzhou, China, December 2012.

[26] W. Zhou, E. Shiju, Z. Cao, and Y. Dong, "Review of SLAM data association study," in Proceedings of the 2016 International Conference on Sensor Network and Computer Engineering, Atlantis Press, Shanghai, China, 2016.

[27] R. Singer and R. Sea, "A new filter for optimal tracking in dense multitarget environments," in Proceedings of the Annual Allerton Conference on Circuit and System Theory, pp. 201–211, Monticello, MN, USA, 1972.

[28] J. Neira and J. D. Tardos, "Data association in stochastic mapping using the joint compatibility test," IEEE Transactions on Robotics and Automation, vol. 17, no. 6, pp. 890–897, 2001.

[29] L. Yanju, X. Yufeng, G. Song, H. Xi, and G. Zhengping, "Research on data association in SLAM based laser sensor," Microcomputer & Its Application, vol. 36, no. 2, pp. 78–82, 2017.

[30] O. Hlinka, O. Sluciak, F. Hlawatsch, and M. Rupp, "Distributed data fusion using iterative covariance intersection," in Proceedings of the 2014 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 1861–1865, IEEE, Florence, Italy, May 2014.


According to the odometer/vision collaborative navigation model, we use the most common EKF algorithm. The time update can be expressed as

\hat{X}_k^- = \Phi(\hat{X}_{k-1}^+, u_{k-1}),
P_k^- = F_{k-1} P_{k-1} F_{k-1}^T + Q_{k-1} \triangleq
\begin{bmatrix}
P_{11,k}^- & P_{12,k}^- & \cdots & P_{1N,k}^- \\
P_{21,k}^- & P_{22,k}^- & \cdots & P_{2N,k}^- \\
\vdots & \vdots & \ddots & \vdots \\
P_{N1,k}^- & P_{N2,k}^- & \cdots & P_{NN,k}^-
\end{bmatrix},   (15)

where

P_{ii,k}^- = J_{X(k-1)}^i P_{ii,k-1} J_{X(k-1)}^{iT} + Q_{k-1}^i,
P_{ij,k}^- = J_{X(k-1)}^i P_{ij,k-1} J_{X(k-1)}^{jT}.   (16)
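A block-wise sketch of the covariance propagation in (15)-(16), assuming 3-state nodes (x, y, θ); the identity Jacobians and the noise matrices in the usage below are placeholders for the paper's J^i_X and Q^i, not values from the paper:

```python
import numpy as np

def centralized_time_update(P, J_list, Q_list):
    """Propagate the stacked N-node covariance block by block, eqs. (15)-(16).

    P      : 3N x 3N covariance made of 3x3 blocks P_ij
    J_list : per-node state Jacobians J^i_X(k-1), each 3x3
    Q_list : per-node process noise Q^i_{k-1}, each 3x3
    """
    N = len(J_list)
    P_new = np.zeros_like(P)
    for i in range(N):
        for j in range(N):
            blk = J_list[i] @ P[3*i:3*i+3, 3*j:3*j+3] @ J_list[j].T
            if i == j:
                blk = blk + Q_list[i]  # process noise enters diagonal blocks only
            P_new[3*i:3*i+3, 3*j:3*j+3] = blk
    return P_new
```

The cross blocks P_ij carry the internode correlation that the centralized framework maintains exactly.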

4.2. Single-Node Measurement Update. In this section, the measurement updating process involving only one node in the centralized framework is described. The selected aided navigation system is the SLAM navigation system, which integrates the landmark information of the surrounding environment measured by lidar. In this paper, a measurement model based on this navigation system is built, and the process of measurement updating based on the EKF is described.

4.2.1. Measurement Model Based on SLAM. The measurement model based on SLAM is the measurement model after data association. In this paper, the position information of landmarks obtained by lidar is taken as the observation equation:

z_k^i = \begin{bmatrix} x_l^b \\ y_l^b \end{bmatrix} =
\begin{bmatrix}
(x_l^w - x_k^i)\cos\theta_k^i + (y_l^w - y_k^i)\sin\theta_k^i \\
-(x_l^w - x_k^i)\sin\theta_k^i + (y_l^w - y_k^i)\cos\theta_k^i
\end{bmatrix} + n_k^i,   (17)

where (x_l^b, y_l^b) is the position information of the landmark obtained by lidar, (x_l^w, y_l^w) are the coordinates of the landmark in the world coordinate system, (x_k^i, y_k^i, \theta_k^i) is the state of the mobile robot at time k, and n_k^i is the measurement noise whose variance matrix is R_k^i, which can be denoted as n_k^i \sim N(0, R_k^i). After linearization and state extension, the observation equation of the whole system can be obtained:

z_k^i = H_k^i X_k + h^i(\hat{X}_k^{i-}) - \nabla h^i \hat{X}_k^{i-} + n_k^i,   (18)

where

H_k^i = [0 \; \cdots \; \nabla h^i \; \cdots \; 0]_{2 \times 3N},   (19)

and \nabla h^i is the Jacobian matrix of the function h^i(X_k^i).

4.2.2. Measurement and Update Based on EKF. Combined with the basic principle of the Kalman filter, the measurement and update process of the aided navigation system for a single node can be obtained as follows.

Step 1: calculate the innovation and the filter gain:

v = z_k^i - h^i(\hat{X}_k^{i-}),
S^i = H_k^i P_k^- (H_k^i)^T + R_k^i,
K^i = P_k^- (H_k^i)^T (S^i)^{-1}.   (20)

Step 2: update the state estimation and the corresponding covariance:

\hat{X}_k^+ = \hat{X}_k^- + K^i v,
P_k = P_k^- - P_k^- (H_k^i)^T (S^i)^{-1} H_k^i P_k^-.   (21)
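In code, the centralized single-node update of (20)-(21) looks as follows; a hedged sketch with a stacked two-robot state and a hypothetical 2×6 Jacobian in the usage (not the paper's sensor values):

```python
import numpy as np

def centralized_measurement_update(X, P, z, h_pred, H, R):
    """Centralized EKF update for one node's measurement, eqs. (20)-(21).
    H is the 2 x 3N Jacobian, nonzero only in the observing node's block."""
    v = z - h_pred                        # innovation
    S = H @ P @ H.T + R                   # innovation covariance, eq. (20)
    Sinv = np.linalg.inv(S)
    K = P @ H.T @ Sinv                    # filter gain
    X_new = X + K @ v                     # state update, eq. (21)
    P_new = P - P @ H.T @ Sinv @ H @ P    # covariance update, eq. (21)
    return X_new, P_new
```

Because H touches only one node's block, the cross-covariance blocks still get corrected through the full-matrix products, which is how one node's measurement benefits the whole group.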

4.3. Relative Measurement Update among Nodes. The standard observation model can be divided into two types: the measurement model based on the relative distance and the measurement model based on the relative position.

4.3.1. Measurement Model Based on Relative Distance. The observation of mobile robot i to mobile robot j at time k can be denoted by z_k^{ij}; then the observation equation is given by

z_k^{ij} = h^{ij}(X_k^i, X_k^j) + n_k^{ij} = \sqrt{(x_k^i - x_k^j)^2 + (y_k^i - y_k^j)^2} + n_k^{ij},   (22)

where n_k^{ij} is the measurement noise, its variance matrix is R_k^{ij} = \sigma_{UWB}, which can be denoted as n_k^{ij} \sim N(0, R_k^{ij}), and \sigma_{UWB} is the variance of the UWB ranging.

After linearization and state extension, the observation equation of the whole system can be obtained:

z_k^{ij} = H_k^{ij} X_k + h^{ij}(\hat{X}_k^{i-}, \hat{X}_k^{j-}) - \nabla h_i^{ij} \hat{X}_k^{i-} - \nabla h_j^{ij} \hat{X}_k^{j-} + n_k^{ij},   (23)

where

H_k^{ij} = [0 \; \cdots \; \nabla h_i^{ij} \; \cdots \; \nabla h_j^{ij} \; \cdots \; 0]_{2 \times 3N},   (24)

and \nabla h_i^{ij} and \nabla h_j^{ij} are the Jacobian matrices of the function h^{ij}(X^i, X^j), respectively.

4.3.2. Measurement Model Based on Relative Position. Using lidar as the sensor, the relative observation among nodes can be realized in two ways: the direct method and the indirect method. The direct method measures the relative position between the two nodes directly; the indirect method uses lidar to observe the landmarks nearest to the two nodes, and the relative position between the two nodes is obtained by correlation calculation.

The state of mobile robot i at time k is denoted by (x_k^i, y_k^i, \theta_k^i), and the state of mobile robot j is denoted by (x_k^j, y_k^j, \theta_k^j). The coordinates of landmark L1 adjacent to mobile robot i are (x_{l1}^w, y_{l1}^w) in the world coordinate system and (x_{l1}^i, y_{l1}^i) in the mobile robot i coordinate system; the coordinates of landmark L2 adjacent to mobile robot j are (x_{l2}^w, y_{l2}^w) in the world coordinate system and (x_{l2}^j, y_{l2}^j) in the mobile robot j coordinate system. The specific solution process of the indirect method is as follows (see Figure 9):

\begin{bmatrix} x_k^j - x_k^i \\ y_k^j - y_k^i \end{bmatrix} =
\begin{bmatrix}
x_{l2}^j \cos\theta_k^j - y_{l2}^j \sin\theta_k^j \\
x_{l2}^j \sin\theta_k^j + y_{l2}^j \cos\theta_k^j
\end{bmatrix} -
\begin{bmatrix}
x_{l1}^i \cos\theta_k^i - y_{l1}^i \sin\theta_k^i \\
x_{l1}^i \sin\theta_k^i + y_{l1}^i \cos\theta_k^i
\end{bmatrix} +
\begin{bmatrix}
x_{l1}^w - x_{l2}^w \\
y_{l1}^w - y_{l2}^w
\end{bmatrix}.   (25)

When mobile robot i observes mobile robot j at time k, the coordinates of mobile robot j in the mobile robot i coordinate system serve as the observation. The observation equation is as follows:

z_k^{ij} = \begin{bmatrix} x_k^{ji} \\ y_k^{ji} \end{bmatrix} =
\begin{bmatrix}
(x_k^j - x_k^i)\cos\theta_k^i + (y_k^j - y_k^i)\sin\theta_k^i \\
-(x_k^j - x_k^i)\sin\theta_k^i + (y_k^j - y_k^i)\cos\theta_k^i
\end{bmatrix} + n_k^{ij},   (26)

where n_k^{ij} is the measurement noise, its variance matrix is R_k^{ij}, which can be denoted as n_k^{ij} \sim N(0, R_k^{ij}), and (x_k^{ji}, y_k^{ji}) are the coordinates of mobile robot j in the coordinate system of mobile robot i at time k.

4.3.3. Measurement Update Based on EKF. Similarly, we can finally get the measurement update process for the relative observation between nodes.

5. Decentralized Collaborative Navigation Algorithm

The state and covariance information of each node under the decentralized collaborative navigation algorithm is calculated separately. In order to avoid overoptimal estimation to the maximum extent, the concept of the covariance intersection filter is introduced, and the covariance of each node is divided into dependent and independent components.

5.1. Covariance Intersection Filter. Given the state estimation vector \hat{X} and the corresponding covariance matrix P, and assuming that P^* is the covariance of the error between the state estimate \hat{X} and the true state X^*, it can be expressed as follows:

P^* = E[(\hat{X} - X^*)(\hat{X} - X^*)^T].   (27)

Consistency is a characteristic of the covariance matrix of the estimation [30]. When the covariance matrix of the state estimation is not less than the real covariance, the estimation is said to satisfy consistency; that is, no overoptimal estimation is produced. Suppose two state estimates \hat{X}_1 and \hat{X}_2 are independent and satisfy consistency, with corresponding covariances P_1 and P_2. If there is a correlation between the two estimates, the Kalman filter may produce inconsistent results; in other words, it leads to overoptimal estimation. The covariance intersection filter therefore splits each covariance and fuses the estimates as follows:

\[
P_1 = \frac{P_{1d}}{w} + P_{1i}, \tag{28}
\]
\[
P_2 = \frac{P_{2d}}{1 - w} + P_{2i}, \tag{29}
\]
\[
P^{-1} = P_1^{-1} + P_2^{-1}, \tag{30}
\]
\[
X = P\left(P_1^{-1}X_1 + P_2^{-1}X_2\right), \tag{31}
\]
\[
P_i = P\left(P_1^{-1}P_{1i}P_1^{-1} + P_2^{-1}P_{2i}P_2^{-1}\right)P, \tag{32}
\]
\[
P_d = P - P_i, \tag{33}
\]

Figure 9: Indirect observation schematic diagram.

8 Mathematical Problems in Engineering

where the covariances corresponding to the two state estimates X1 and X2 are P1d + P1i and P2d + P2i, respectively. P1d and P2d are the correlated covariance components corresponding to the maximum correlation between the two state estimates, and P1i and P2i are the independent covariance components corresponding to absolute independence of the two state estimates. The weight w lies within the interval [0, 1]; it is an optimization parameter that minimizes the covariance after fusion, and any w in this interval ensures the consistency of the fusion result.
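A minimal sketch of the split covariance intersection fusion of (28)–(33) follows; the helper name `split_ci_fuse` and the simple grid search over w (rather than a dedicated optimizer) are assumptions for illustration.

```python
import numpy as np

def split_ci_fuse(X1, P1d, P1i, X2, P2d, P2i, n_grid=99):
    """Split covariance intersection fusion, equations (28)-(33).

    P*d: correlated covariance component, P*i: independent component.
    The weight w in (0, 1) is chosen by a grid search minimizing the
    trace of the fused covariance P.
    """
    best = None
    for w in np.linspace(0.01, 0.99, n_grid):
        P1w = P1d / w + P1i                                          # (28)
        P2w = P2d / (1.0 - w) + P2i                                  # (29)
        P = np.linalg.inv(np.linalg.inv(P1w) + np.linalg.inv(P2w))   # (30)
        if best is None or np.trace(P) < best[0]:
            best = (np.trace(P), P1w, P2w, P)
    _, P1w, P2w, P = best
    X = P @ (np.linalg.solve(P1w, X1) + np.linalg.solve(P2w, X2))    # (31)
    inv1, inv2 = np.linalg.inv(P1w), np.linalg.inv(P2w)
    Pi = P @ (inv1 @ P1i @ inv1 + inv2 @ P2i @ inv2) @ P             # (32)
    Pd = P - Pi                                                      # (33)
    return X, P, Pd, Pi
```

When both correlated components are zero, the update reduces to the ordinary information-form fusion of two independent estimates, which makes the consistency claim easy to check on a small example.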

5.2. Time Update. Before describing the time update process of the DCL algorithm, it is necessary to decompose the state information of the system in the framework of centralized collaborative navigation, which can be expressed as

\[
E_G = \{X_G, P_G\}
= \left\{\{X_1, P_1\}, \{X_2, P_2\}, \ldots, \{X_N, P_N\}\right\}
= \left\{\{X_1, P_{1d} + P_{1i}\}, \{X_2, P_{2d} + P_{2i}\}, \ldots, \{X_N, P_{Nd} + P_{Ni}\}\right\}, \tag{34}
\]

where \(E_G\) is the set of states under the centralized collaborative navigation framework, and \(X_G\) and \(P_G\) are the state space and the corresponding covariance matrix under that framework, respectively.

The state propagation process under the decentralized collaborative navigation framework is the state propagation process of a single node, and the propagation of the covariance can be expressed as

\[
P_k^{i-} = J_{X(k-1)}^i P_{k-1}^i \left(J_{X(k-1)}^i\right)^T + J_{u(k-1)}^i \Sigma_u \left(J_{u(k-1)}^i\right)^T,
\]
\[
P_{ki}^{i-} = J_{X(k-1)}^i P_{(k-1)i}^i \left(J_{X(k-1)}^i\right)^T + J_{u(k-1)}^i \Sigma_u \left(J_{u(k-1)}^i\right)^T, \tag{35}
\]

where \(P_k^{i-}\) is the one-step prediction covariance matrix of mobile robot i at time k and \(P_{ki}^{i-}\) is its independent covariance component; \(J_{X(k-1)}^i\) and \(J_{u(k-1)}^i\) are the Jacobian matrices of the function \(f^i(X, u)\) with respect to the state vector and the control input, and \(\Sigma_u\) is the error matrix of the control input.
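The per-node time update of (35) can be sketched as follows; the function names `dcl_time_update`, `f`, `Jx`, and `Ju` are placeholders for the node's motion model and its Jacobians, not identifiers from the paper.

```python
import numpy as np

def dcl_time_update(X, P, Pi, u, Sigma_u, f, Jx, Ju):
    """One-step prediction of equation (35) for a single node.

    f(X, u) is the node's motion model; Jx(X, u) and Ju(X, u) return its
    Jacobians with respect to the state and the control input. Both the
    full covariance P and its independent component Pi are propagated
    with the same Jacobians.
    """
    Fx, Fu = Jx(X, u), Ju(X, u)
    X_pred = f(X, u)
    P_pred = Fx @ P @ Fx.T + Fu @ Sigma_u @ Fu.T
    Pi_pred = Fx @ Pi @ Fx.T + Fu @ Sigma_u @ Fu.T
    return X_pred, P_pred, Pi_pred
```

For a linear motion model with identity Jacobians, both covariances simply grow by the control-input error matrix, which matches the structure of (35).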

5.3. Single Node Measurement Update. The measurement and updating process of a single node only involves the aided navigation system of that node, so there is no need to estimate the correlation; that is, the decomposition in formulas (28) and (29) can be omitted. Similar to the single node measurement update in centralized collaborative navigation, the single node measurement update in decentralized collaborative navigation can be expressed as follows.

Step 1: calculate the innovation and the filtering gain:

\[
v = z_k^i - h^i\left(X_k^{i-}\right),
\]
\[
S = \nabla h^i P_k^{i-} \left(\nabla h^i\right)^T + R_k^i,
\]
\[
K = P_k^{i-} \left(\nabla h^i\right)^T S^{-1}. \tag{36}
\]

Step 2: update the state estimation and the corresponding covariance:

\[
X_k^{i+} = X_k^{i-} + Kv,
\]
\[
P_k^i = \left(I - K\nabla h^i\right) P_k^{i-},
\]
\[
P_{ki}^i = \left(I - K\nabla h^i\right) P_{ki}^{i-} \left(I - K\nabla h^i\right)^T + K R_k^i K^T,
\]
\[
P_{kd}^i = P_k^i - P_{ki}^i. \tag{37}
\]
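The two steps above can be sketched as a single routine; `dcl_measurement_update` is an illustrative name, and `h`/`H` stand in for the node's measurement model and its Jacobian \(\nabla h^i\).

```python
import numpy as np

def dcl_measurement_update(X, P, Pi, z, h, H, R):
    """Single node EKF update of equations (36)-(37).

    Both the full covariance P and its independent component Pi are
    updated; the correlated component Pd follows as their difference.
    """
    v = z - h(X)                                  # innovation, (36)
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    X_new = X + K @ v                             # state update, (37)
    IKH = np.eye(len(X)) - K @ H
    P_new = IKH @ P
    Pi_new = IKH @ Pi @ IKH.T + K @ R @ K.T       # independent component
    Pd_new = P_new - Pi_new                       # correlated component
    return X_new, P_new, Pi_new, Pd_new
```

In the scalar case with a fully independent prior and unit noise, the gain is 0.5 and the whole posterior covariance remains in the independent component, as the split in (37) predicts.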

5.4. Collaborative Measurement Update among Nodes. In the framework of decentralized collaborative navigation, the state estimation results of a single node aided navigation system and the state estimation results based on information sharing among nodes are integrated in the process of internode collaborative measurement updating, and the corrected state information is derived.

In the decentralized collaborative navigation framework, any node can explicitly estimate the states of other nodes. In order to save communication cost and reduce the computation load of a single mobile robot platform, this paper assumes that information exchange among the mobile robots takes place only between two adjacent mobile robot nodes.

Assuming that mobile robot i performs a relative observation of mobile robot j at time k and shares its own state and covariance information with mobile robot j, the state of mobile robot j can be explicitly expressed from the received state of mobile robot i and the relative measurement between the two nodes:

\[
X_k^{co,j} =
\begin{bmatrix} x_k^{co,j} \\ y_k^{co,j} \end{bmatrix} =
\begin{bmatrix}
x_k^i + x_k^{ji}\cos\theta_k^i - y_k^{ji}\sin\theta_k^i \\
y_k^i + x_k^{ji}\sin\theta_k^i + y_k^{ji}\cos\theta_k^i
\end{bmatrix}, \tag{38}
\]

where \((x_k^{co,j}, y_k^{co,j})\) is the partial state estimate of mobile robot j obtained through the information sharing between mobile robot i and mobile robot j at time k, \((x_k^i, y_k^i, \theta_k^i)\) is the state vector shared by mobile robot i at time k, and \(u_{rel} = (x_k^{ji}, y_k^{ji})\) is the relative measurement of the two nodes in the coordinate system of mobile robot i.
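Equation (38) is the inverse of the relative observation model (26): it maps robot i's shared pose and the body-frame measurement back into a world-frame position for robot j. A minimal sketch, with the illustrative name `shared_position_estimate`:

```python
import numpy as np

def shared_position_estimate(pose_i, u_rel):
    """Partial state of robot j implied by robot i's pose and the relative
    measurement u_rel = (x_ji, y_ji) in robot i's frame, as in (38)."""
    xi, yi, thi = pose_i
    xr, yr = u_rel
    return np.array([xi + xr * np.cos(thi) - yr * np.sin(thi),
                     yi + xr * np.sin(thi) + yr * np.cos(thi)])
```

For instance, a robot at the origin facing along the y-axis that measures robot j one meter ahead in its own frame places robot j at world position (0, 1).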

If there is a direct relative observation between the two nodes, the relative measurement can be obtained directly by the sensor that performs the relative observation. If the relative observation between the two nodes is obtained indirectly, with the help of the surrounding landmark information, then the relative measurement needs to be solved for; the concrete solution can be obtained by combining (25) and then converting the result into the coordinate system of mobile robot i.

Finally, based on the principle of the covariance intersection filter, the updating process of collaborative measurement among nodes in the framework of decentralized collaborative navigation can be obtained.


6. Simulation Results

6.1. Simulated Experimental Environment. In this section, the mobile robot network involved in collaborative navigation has 3 nodes. The moving 2D environment is assumed to be 25 m × 25 m; when the mobile robot group works together, each node in the environment is assigned an initial position, and each node can follow a random trajectory in this area. It is assumed that all nodes follow the same simulation trajectory and only the initial positions differ. The maximum speed of a mobile robot on a straight line is 0.3125 m/s, and the angular velocity at a bend is 0.1 deg/s. It is assumed that, around the simulated rectangular trajectory, lidar scanning can extract 88 landmarks (environmental feature points) for the SLAM-aided navigation system (see Figure 10).

During this simulation, the mobile robots, as carriers, can carry different types of sensors, including odometers, UWB, and lidar. Suitable sensors are selected according to the requirements of positioning accuracy; among them, Time Domain P410 UWB sensors are used to measure the relative distance, and the lidar is selected from the LMS291 series of 2D lidar produced by a German company. Based on the relevant parameters of these sensors, which are shown in Table 1, a simulation model for mobile robots carrying different types of sensors is built using MATLAB.

6.2. Relative Measurement Aided Odometer Collaborative Navigation. In this experiment, all three mobile robots are equipped with an odometer capable of motion monitoring, together with UWB capable of measuring relative distance or lidar capable of measuring relative position.

From Figure 11, it can be seen that the collaborative navigation system which realizes relative information sharing has significant advantages in positioning accuracy over the case of not sharing any information. Besides, the improvement of the group navigation performance of the mobile robots is affected by the type of shared relative information. When the relative position information is shared, the growth of the error can be effectively limited; relatively speaking, when the relative distance information is shared, the position error still grows slowly, and only the growth rate of the error is reduced (see Figure 11).

The analysis shows that the relative distance information is a weak constraint, so sharing this information cannot effectively support the navigation and localization of the mobile robots. In contrast, sharing relative position information directly contributes to the mobile robot navigation solution, and the accuracy is significantly improved; it can even be increased by more than 60% at some times. This difference is more obvious in the angle error diagram (see Figure 11).

In this paper, two observation methods, direct relative measurement and indirect relative measurement, are mentioned in the description of the measurement model based on relative position. Based on this experimental scene, in scenario I the three mobile robots observe the relative position information directly through lidar; in scenario II the three mobile robots extract the surrounding landmark information through lidar and, based on this solution, calculate the relative position information. In both experimental scenarios, the centralized collaborative navigation algorithm is used to solve the navigation problem. The two relative position measurement methods are compared through the above simulation scenarios, and the comparison results are shown in Figure 12.

From Figure 12, it is clear that the collaborative navigation and positioning accuracy of the direct relative position measurement is better than that of the indirect method. However, the computational cost cannot be ignored while navigation performance is considered. The direct method requires that the measurement range of the lidar covers the activity range of the whole mobile robot group, while the measurement range of the lidar required by the indirect method only needs to cover the surrounding landmarks, which greatly reduces the cost. Considering that the accuracies of collaborative navigation and positioning of the two relative position measurement methods are not very different, the indirect method is obviously more suitable for practical application (see Figure 12).

The difference of the decentralized collaborative navigation framework compared with the centralized collaborative navigation framework is that the correlation among

Figure 10: Simulation trajectory diagram.

Table 1: Relevant parameters of sensors.

Type       Measure                              Frequency (Hz)   Error
Odometer   Linear velocity of the two wheels    20               4 cm/s
UWB        Relative distance                    10               3 cm
Lidar      Relative position in X direction     10               2 cm
Lidar      Relative position in Y direction     10               2 cm


Figure 11: Comparative diagram of navigation error under the condition of sharing different relative information. (a) Position error. (b) Angle error.

Figure 12: Comparison of navigation errors under different relative position measuring methods. (a) Position error. (b) Angle error.

Figure 13: Comparison of navigation errors under different collaborative navigation algorithms. (a) Position error. (b) Angle error.


the states of different nodes is accurately calculated in the centralized collaborative navigation framework, while this correlation cannot be used in the decentralized collaborative navigation framework. In order to better reflect the impact of this correlation, the navigation errors of the two collaborative navigation algorithms in the odometer collaborative navigation system are shown in Figure 13.

To compare the two algorithms, 20 experiments are carried out in this paper, and the root mean square error (RMS) of the two collaborative navigation algorithms is calculated as shown in the following formula:

\[
\mathrm{RMS} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(x_i - \hat{x}_i\right)^2}, \tag{39}
\]

where n is the total number of samples, \(x_i\) is the actual value, and \(\hat{x}_i\) is the estimated value. The RMS parameters for the odometer collaborative navigation are shown in Table 2.
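The error metric of (39) is straightforward to compute; the helper name `rms_error` below is an illustrative choice.

```python
import numpy as np

def rms_error(actual, estimated):
    """Root mean square error of equation (39)."""
    actual, estimated = np.asarray(actual), np.asarray(estimated)
    return np.sqrt(np.mean((actual - estimated) ** 2))
```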

As can be seen from Figure 13 and Table 2, the error of the centralized collaborative navigation algorithm is smaller than that of the decentralized collaborative navigation algorithm. This can be expected, because the correlation among node states in the centralized collaborative navigation algorithm can be calculated accurately, whereas it is only estimated in the decentralized collaborative navigation algorithm. However, the improved navigation accuracy comes at the expense of high computing power and high-quality data communication. Therefore, although the performance of the centralized collaborative navigation framework is better than that of the decentralized collaborative navigation framework, the centralized framework is not applicable in some practical scenarios (see Figure 13).

6.3. Odometer/Vision/SLAM Collaborative Navigation. In the odometer/vision collaborative navigation model, scenario I is designed such that all the mobile robots are equipped with an odometer which can monitor the motion, and one of the mobile robots is equipped with a SLAM-aided navigation system that works properly.

Firstly, the mobile robot with the SLAM-aided navigation system is studied, and it only runs its own integrated navigation algorithm without sharing the relative position information. Using the centralized collaborative navigation algorithm, the navigation error of the node with the SLAM-aided navigation system is shown in Figure 14.

Figure 14 fully verifies the correctness of the centralized collaborative navigation algorithm based on the odometer/vision collaborative navigation model. The SLAM-aided navigation system is based on relative observation; the position estimate of the node itself and the position estimates of the landmarks both accumulate error, but when the data association algorithm of SLAM is combined with the centralized collaborative navigation algorithm, the position estimates of the landmarks become closer to the real values. While the positioning accuracy of the mobile robot is improved, the data association becomes more reliable, further correcting the state estimate of the mobile robot itself. Therefore, the improvement of the navigation accuracy of the mobile robot by this algorithm is very obvious (see Figure 14).

Then, the mobile robots without the SLAM-aided navigation system in the experiment are studied. In order to fully reflect the influence of the SLAM-aided navigation information on the navigation performance of the other nodes, scenario II is designed such that all mobile robots are equipped with an odometer which can monitor the motion, and two of them are equipped with a SLAM-aided navigation system that works properly. The navigation error of the other nodes without the SLAM-aided navigation system is shown in Figure 15.

As shown in Figure 15, the mobile robot with the SLAM-aided navigation system performs loop-closure detection at about 320 seconds and associates data with the local map created at the initial location, thus eliminating most of the accumulated error. This uniquely superior performance of the SLAM-aided navigation system is transmitted to the other nodes in the group through the information sharing of the collaborative navigation process, so that they can also eliminate most of their accumulated errors around that time, which is an important advantage of the collaborative navigation system (see Figure 15).

To verify the influence of the NN algorithm, the JCBB algorithm, and the optimized data association algorithm on the navigation performance of the nodes without a SLAM-aided navigation system, the experimental scene is designed such that all mobile robots are equipped with an odometer for motion monitoring, one of the mobile robots is equipped with a SLAM-aided navigation system working normally, and the CL algorithm is run. The navigation error of the nodes without the SLAM-aided navigation system is shown in Figure 16.

The performance of the centralized collaborative navigation algorithm under the three SLAM data association algorithms is shown in Table 3.

From Figure 16 and Table 3, it can be seen that the navigation performance of the nodes without a SLAM-aided navigation system is affected by the SLAM data association algorithm used by the nodes carrying the SLAM-aided navigation system. Running the NN algorithm, the matching accuracy of the feature information is not high, so the navigation accuracy is poor. Running the JCBB algorithm, the correct rate of data association is the highest, but the operation time is the longest. Running the optimized data association algorithm, the navigation accuracy is slightly reduced, but the operation time is shorter, which can meet the real-time requirements (see Figure 16).

In this subsection, to compare the performance of the odometer/vision collaborative navigation system under the centralized and

Table 2: Odometer collaborative navigation RMS parameters.

Algorithm type   Position error (m)   Angle error (deg)
CL               0.1629               0.74625
DCL              0.36342              1.3762


Figure 14: The navigation error map of the node with the SLAM-aided navigation system. (a) Position error. (b) Angle error.

Figure 15: Some nodes are equipped with the SLAM-aided navigation system. (a) Position error. (b) Angle error.

Figure 16: Comparison diagram of navigation error for fusion of single node SLAM information under different SLAM data association algorithms. (a) Position error. (b) Angle error.


decentralized collaborative navigation algorithms, we run the CL and DCL algorithms separately under experimental scenario I; the navigation errors of the two collaborative navigation algorithms are compared as shown in Figure 17. Under experimental scenario II of this subsection, we likewise run the CL algorithm and the DCL algorithm, respectively; the navigation errors of the two collaborative navigation algorithms are compared as shown in Figure 18 (see Figures 17 and 18).

After 20 experiments, the RMS parameters of collaborative navigation with single node SLAM information are shown in Table 4.

The RMS parameters of the collaborative navigation with fused multinode SLAM information are shown in Table 5.

As can be seen from Figures 17 and 18, in conjunction with Tables 4 and 5, in the odometer/vision collaborative navigation system the error of the centralized collaborative

Table 3: Performance comparison of centralized collaborative navigation algorithms under different SLAM data association algorithms.

Algorithm type   Position error (m)   Angle error (deg)   Relative time
NN               2.8323               10.7919             4
JCBB             0.0322               0.1623              12
Optimization     0.5587               2.2476              1

Figure 17: Comparative diagram of navigation error for fusion of single node SLAM information under different collaborative navigation algorithms. (a) Position error. (b) Angle error.

Figure 18: Comparison diagram of navigation error for fusion of multinode SLAM information under different collaborative navigation algorithms. (a) Position error. (b) Angle error.

Table 4: Collaborative navigation RMS parameters for fusion of single node SLAM information.

Algorithm type   Position error (m)   Angle error (deg)
CL               0.0322               0.1623
DCL              0.0669               0.2094

Table 5: Collaborative navigation RMS parameters for fusion of multinode SLAM information.

Algorithm type   Position error (m)   Angle error (deg)
CL               0.0243               0.0524
DCL              0.0438               0.1265


navigation algorithm is smaller than that of the decentralized collaborative navigation algorithm; after the landmark information collected by a single node or by multiple nodes is fused, there is only a small gap between the two algorithms. In other words, the decentralized collaborative navigation algorithm based on the odometer/vision collaborative navigation model can estimate the correlation of the internode information well (see Figures 17 and 18).

Considering the high requirements of the centralized collaborative navigation algorithm on computing power and communication, the application scenarios of the two algorithms are analyzed in combination with the abovementioned collaborative navigation experiments: the centralized collaborative navigation algorithm is suitable for cases with few nodes where the nodes carry no additional aided navigation systems; the decentralized collaborative navigation algorithm is suitable for a large number of nodes sharing a large amount of information, with some nodes equipped with additional aided navigation systems, especially SLAM-aided navigation systems.

7. Conclusion

In order to improve the performance of the cooperative navigation system, a multirobot collaborative navigation algorithm based on odometer/vision multisource information fusion is studied. On the basis of establishing the multisource information fusion collaborative navigation system model, the centralized collaborative navigation framework of odometer/vision fusion, the decentralized collaborative navigation framework, and the vision-based SLAM are given, and the centralized and decentralized odometer/vision collaborative navigation algorithms are derived, respectively. The effectiveness of the proposed algorithms is verified by simulation experiments, which has theoretical and application value for high-performance collaborative navigation applications.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

References

[1] K. N. Olivier, D. E. Griffith, G. Eagle et al., "Randomized trial of liposomal amikacin for inhalation in nontuberculous mycobacterial lung disease," American Journal of Respiratory and Critical Care Medicine, vol. 195, no. 6, pp. 814–823, 2017.

[2] M. Schwarz, M. Beul, D. Droeschel et al., "DRC team nimbro rescue: perception and control for centaur-like mobile manipulation robot momaro," Springer Tracts in Advanced Robotics, Springer, Berlin, Germany, pp. 145–190, 2018.

[3] M. Long, H. Su, and B. Liu, "Group controllability of two-time-scale discrete-time multi-agent systems," Journal of the Franklin Institute, vol. 357, no. 6, pp. 3524–3540, 2020.

[4] T. Fukuda, S. Nakagawa, Y. Kawauchi, and M. Buss, "Structure decision method for self organising robots based on cell structures-CEBOT," in Proceedings of the 1989 International Conference on Robotics and Automation, Scottsdale, AZ, USA, May 1989.

[5] H. Asama, A. Matsumoto, and Y. Ishida, "Design of an autonomous and distributed robot system: ACTRESS," in Proceedings of the IEEE/RSJ International Workshop on Intelligent Robots and Systems (IROS '89), the Autonomous Mobile Robots and its Applications, September 1989.

[6] J. Zhou, Y. Lv, G. Wen, X. Wu, and M. Cai, "Three-dimensional cooperative guidance law design for simultaneous attack with multiple missiles against a maneuvering target," in Proceedings of the 2018 IEEE CSAA Guidance, Navigation and Control Conference (CGNCC), August 2018.

[7] H. Su, J. Zhang, and Z. Zeng, "Formation-containment control of multi-robot systems under a stochastic sampling mechanism," Science China Technological Sciences, vol. 63, no. 6, pp. 1025–1034, 2020.

[8] H. Park and S. Hutchinson, "A distributed robust convergence algorithm for multi-robot systems in the presence of faulty robots," in Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 2980–2985, IEEE, Hamburg, Germany, September-October 2015.

[9] K. Petersen and R. Nagpal, "Complex design by simple robots: a collective embodied intelligence approach to construction," Architectural Design, vol. 87, no. 4, pp. 44–49, 2017.

[10] L. Chaimowicz, T. Sugar, V. Kumar, and M. F. M. Campos, "An architecture for tightly coupled multi-robot cooperation," in Proceedings 2001 ICRA. IEEE International Conference on Robotics and Automation (Cat. no. 01CH37164), vol. 3, pp. 2992–2997, IEEE, Seoul, Korea, May 2001.

[11] H.-X. Hu, G. Chen, and G. Wen, "Event-triggered control on quasi-average consensus in the cooperation-competition network," in Proceedings of the IECON 2018—44th Annual Conference of the IEEE Industrial Electronics Society, October 2018.

[12] A. Amanatiadis, K. Charalampous, I. Kostavelis et al., "The AVERT project: autonomous vehicle emergency recovery tool," in Proceedings of the 2013 IEEE International Symposium on Safety, Security and Rescue Robotics (SSRR), pp. 1–5, IEEE, Linkoping, Sweden, October 2013.

[13] R. Kurazume, S. Hirose, T. Iwasaki, S. Nagata, and N. Sashida, "Study on cooperative positioning system," in Proceedings of the IEEE International Conference on Robotics and Automation, Minneapolis, MN, USA, August 1996.

[14] Z. Fu, Y. Zhao, and G. Wen, "Distributed continuous-time optimization in multi-agent networks with undirected topology," in Proceedings of the 2019 IEEE 15th International Conference on Control and Automation (ICCA), November 2019.

[15] Y. Zhao, Y. Liu, and G. Wen, "Finite-time average estimation for multiple double integrators with unknown bounded inputs," in Proceedings of the 2018 33rd Youth Academic Annual Conference of Chinese Association of Automation (YAC), May 2018.

[16] S. Mao, Mobile robot localization in indoor environment, Ph.D. dissertation, Zhejiang University, Hangzhou, China, 2016.

[17] J. Yang, "Analysis approach to odometric non-systematic error uncertainty for mobile robots," Chinese Journal of Mechanical Engineering, vol. 44, no. 8, pp. 7–12, 2008.

[18] J. Kang, F. Zhang, and X. Qu, "Angle measuring error analysis of coordinate measuring system of laser radar," vol. 40, no. 6, pp. 834–839, 2016.

[19] J. Zhang, P. Orlik, Z. Sahinoglu, A. Molisch, and P. Kinney, "UWB systems for wireless sensor networks," Proceedings of the IEEE, vol. 97, no. 2, pp. 313–331.

[20] D. Kaushal and T. Shanmuganantham, "Design of a compact and novel microstrip patch antenna for multiband satellite applications," Materials Today: Proceedings, vol. 5, no. 10, pp. 21175–21182, 2018.

[21] J. Xiucai, Data association problem for simultaneous localization and mapping of mobile robots, Ph.D. dissertation, National University of Defense Technology, Changsha, China, 2008.

[22] Z. Yuan, "Research of mobile robot's SLAM based on binocular vision," Master's thesis, Tianjin University of Technology, Tianjin, China, 2016.

[23] F. Bellavia, M. Fanfani, F. Pazzaglia, and C. Colombo, "Robust selective stereo SLAM without loop closure and bundle adjustment," in Proceedings of the International Conference on Image Analysis and Processing, pp. 462–471, Springer, Naples, Italy, 2013.

[24] H. Fourati, Multisensor Data Fusion: From Algorithms and Architectural Design to Applications, CRC Press, Boca Raton, FL, USA, 2015.

[25] S. Jia, X. Yin, and X. Li, "Mobile robot parallel PF-SLAM based on OpenMP," in Proceedings of the 2012 IEEE International Conference on Robotics and Biomimetics (ROBIO), pp. 508–513, IEEE, Guangzhou, China, December 2012.

[26] W. Zhou, E. Shiju, Z. Cao, and Y. Dong, "Review of SLAM data association study," in Proceedings of the 2016 International Conference on Sensor Network and Computer Engineering, Atlantis Press, Shanghai, China, 2016.

[27] R. Singer and R. Sea, "A new filter for optimal tracking in dense multitarget environments," in Proceedings of the Annual Allerton Conference on Circuit and System Theory, pp. 201–211, Monticello, MN, USA, 1972.

[28] J. Neira and J. D. Tardos, "Data association in stochastic mapping using the joint compatibility test," IEEE Transactions on Robotics and Automation, vol. 17, no. 6, pp. 890–897, 2001.

[29] L. Yanju, X. Yufeng, G. Song, H. Xi, and G. Zhengping, "Research on data association in SLAM based laser sensor," Microcomputer & Its Application, vol. 36, no. 2, pp. 78–82, 2017.

[30] O. Hlinka, O. Sluciak, F. Hlawatsch, and M. Rupp, "Distributed data fusion using iterative covariance intersection," in Proceedings of the 2014 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 1861–1865, IEEE, Florence, Italy, May 2014.

Page 8: MultirobotCollaborativeNavigationAlgorithmsBasedon … · 2020. 5. 24. · According to the odometer/vision collaborative navi-gation model, we use the most common EKF algorithms

mobile robot i in the world coordinate system are (xwl1 yw

l1)the coordinates in the mobile robot i coordinate system are(xi

l1 yil1) the coordinates of landmark L2 adjacent to mobile

robot j in the world coordinate system are (xwl2 yw

l2) and thecoordinates in the coordinate system of mobile robot j are(xi

l2 ywi ) e specific solution process of the indirect

method is as follows (see Figure 9)

xj

k minus xik

yj

k minus yik

⎡⎢⎢⎣ ⎤⎥⎥⎦ x

j

l2 cos θj

k minus yj

l2 sin θj

k

xj

l2 sin θj

k + yj

l2 cos θj

k

⎡⎢⎢⎣ ⎤⎥⎥⎦ minusxi

l1 cos θik minus yi

l1 sin θik1113872 1113873 + xw

l1 minus xwl2( 1113857

xil1 sin θ

ik + yi

l1 cos θik1113872 1113873 + yw

l1 minus ywl2( 1113857

⎡⎢⎢⎢⎢⎣ ⎤⎥⎥⎥⎥⎦ (25)

when mobile robot i observe mobile robot j at k time thecoordinates of mobile robot j in the mobile robot i coor-dinate system as the observation e observation equationsare as follows

zij

k x

ji

k

yji

k

⎡⎢⎢⎣ ⎤⎥⎥⎦ x

j

k minus xik1113872 1113873cos θi

k + yj

k minus yik1113872 1113873sin θi

k

minus xj

k minus xik1113872 1113873sin θi

k + yj

k minus yik1113872 1113873cos θi

k

⎡⎢⎢⎢⎢⎣ ⎤⎥⎥⎥⎥⎦ + nij

k

(26)

where nij

k is the measurement noise its variancematrix is Rij

k which can be denoted as n

ij

k sim N(0 Rij

k ) and (xji

k yji

k ) is thecoordinate of mobile robot j in the coordinate system ofmobile robot i at k time

433 Measurement Update Based on EKF Similarly we canfinally get the measurement update process for the relativeobservation between nodes

5 Decentralized CollaborativeNavigation Algorithm

e state and covariance information of each node under thedecentralized collaborative navigation algorithm is re-spectively calculated In order to avoid overoptimal esti-mation to the maximum extent the concept of thecovariance intersection filter is introduced and the covari-ance of each node is divided into related and irrelevantitems

51Covariance IntersectionFilter Given the state estimationvector X and corresponding covariance matrix P assuming

that Plowast is the covariance of the error between the stateestimate X and the state real value Xlowast it can be expressed asfollows

Plowast

E X minus Xlowast

( 1113857 X minus Xlowast

( 1113857T

1113960 1113961 (27)

Consistency is a characteristic of the covariancematrix of the estimation [30] When the covariancematrix of the state estimation is not less than the realcovariance it is said that the estimation satisfies theconsistency that is no overoptimal estimation is pro-duced Suppose two state estimates X1 and X2 are in-dependent and satisfy the consistency the correspondingcovariances are P1 and P2 If there is a correlation be-tween the two estimates the Kalman filter may produceinconsistent results in other words it leads to over-optimal estimation

$$
P_1 = \frac{P_{1d}}{w} + P_{1i}, \qquad (28)
$$

$$
P_2 = \frac{P_{2d}}{1-w} + P_{2i}, \qquad (29)
$$

$$
P^{-1} = P_1^{-1} + P_2^{-1}, \qquad (30)
$$

$$
X = P\left(P_1^{-1}X_1 + P_2^{-1}X_2\right), \qquad (31)
$$

$$
P_i = P\left(P_1^{-1}P_{1i}P_1^{-1} + P_2^{-1}P_{2i}P_2^{-1}\right)P, \qquad (32)
$$

$$
P_d = P - P_i, \qquad (33)
$$

Figure 9: Indirect observation schematic diagram (world frame $O_w X_w Y_w$, robot body frames $O_b X_b Y_b$ with heading angle $\theta$, landmarks L1 and L2).

8 Mathematical Problems in Engineering

where the covariances corresponding to the two state estimates $X_1$ and $X_2$ are $P_{1d}+P_{1i}$ and $P_{2d}+P_{2i}$, respectively. $P_{1d}$ and $P_{2d}$ are the correlated covariance components, corresponding to maximum correlation between the two state estimates; $P_{1i}$ and $P_{2i}$ are the independent covariance components, corresponding to absolute independence of the two state estimates. $w$, within the interval $[0,1]$, is an optimization parameter that minimizes the covariance after fusion, and any $w$ in this interval ensures the consistency of the fusion result.
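The split covariance intersection fusion (28)-(33) can be sketched in NumPy as follows; the function names and the coarse grid search for $w$ are our own illustration under the stated assumptions, not the paper's implementation:

```python
import numpy as np

def split_ci_fuse(x1, P1d, P1i, x2, P2d, P2i, w):
    """Split covariance intersection fusion, following eqs. (28)-(33)."""
    P1 = P1d / w + P1i            # (28): inflate the correlated part of estimate 1
    P2 = P2d / (1.0 - w) + P2i    # (29): inflate the correlated part of estimate 2
    P1inv, P2inv = np.linalg.inv(P1), np.linalg.inv(P2)
    P = np.linalg.inv(P1inv + P2inv)                           # (30)
    x = P @ (P1inv @ x1 + P2inv @ x2)                          # (31)
    Pi = P @ (P1inv @ P1i @ P1inv + P2inv @ P2i @ P2inv) @ P   # (32)
    Pd = P - Pi                                                # (33)
    return x, P, Pd, Pi

def best_w(x1, P1d, P1i, x2, P2d, P2i, grid=np.linspace(0.01, 0.99, 99)):
    """Pick w by a coarse search that minimizes the fused covariance trace."""
    return min(grid, key=lambda w: np.trace(
        split_ci_fuse(x1, P1d, P1i, x2, P2d, P2i, w)[1]))
```

Because any $w \in [0,1]$ yields a consistent result, the grid search only trades off fused uncertainty, not correctness.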

5.2. Time Update. Before describing the time update process of the DCL algorithm, it is necessary to decompose the state information of the system in the framework of centralized collaborative navigation, which can be expressed as

$$
E_G = \{X_G, P_G\}
= \left\{\{X_1,P_1\},\{X_2,P_2\},\ldots,\{X_N,P_N\}\right\}
= \left\{\{X_1,P_{1d}+P_{1i}\},\{X_2,P_{2d}+P_{2i}\},\ldots,\{X_N,P_{Nd}+P_{Ni}\}\right\}, \qquad (34)
$$

where $E_G$ is the set of states under the centralized collaborative navigation framework, and $X_G$ and $P_G$ are the state space and the corresponding covariance matrix under the centralized collaborative navigation framework, respectively.

The state propagation process under the decentralized collaborative navigation framework is the state propagation process of a single node, and the propagation process of the covariance can be expressed as

$$
P_k^{i-} = J_{X(k-1)}^{i}\, P_{k-1}^{i}\, J_{X(k-1)}^{iT} + J_{u(k-1)}^{i}\, \Sigma_u\, J_{u(k-1)}^{iT},
$$
$$
P_{ki}^{i-} = J_{X(k-1)}^{i}\, P_{(k-1)i}^{i}\, J_{X(k-1)}^{iT} + J_{u(k-1)}^{i}\, \Sigma_u\, J_{u(k-1)}^{iT}, \qquad (35)
$$

where $P_k^{i-}$ is the one-step prediction covariance matrix of mobile robot i at time k, and $P_{ki}^{i-}$ is its independent covariance component. $J_{X(k-1)}^{i}$ and $J_{u(k-1)}^{i}$ are the Jacobian matrices of the function $f^{i}(X,u)$ with respect to the state vector and the control input, and $\Sigma_u$ is the error matrix of the control input.
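Under these definitions, the propagation (35) of both covariance components can be sketched as follows (a hedged NumPy illustration; the Jacobians would come from the specific motion model $f^i$, and the function name is ours):

```python
import numpy as np

def time_update(P, Pi, Jx, Ju, Sigma_u):
    """One-step covariance propagation, eq. (35). The control-input noise
    enters both components because odometry noise is independent per robot."""
    Q = Ju @ Sigma_u @ Ju.T
    P_pred = Jx @ P @ Jx.T + Q    # total covariance
    Pi_pred = Jx @ Pi @ Jx.T + Q  # independent component
    return P_pred, Pi_pred
```

Note that both lines of (35) share the same process-noise term, which is why the correlated component $P_{kd}^{i-} = P_k^{i-} - P_{ki}^{i-}$ is only reshaped by the state Jacobian.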

5.3. Single Node Measurement Update. The measurement update process of a single node involves only the aided navigation system of that node, so there is no need to estimate the correlation; that is, the steps of formulas (28) and (29) are saved. Similar to the single node measurement update process in centralized collaborative navigation, the single node measurement update process in decentralized collaborative navigation can be expressed as follows.

Step 1: calculate the innovation and the filtering gain:

$$
v = z_k^{i} - h^{i}\left(\hat{X}_k^{i-}\right),
$$
$$
S = \nabla h^{i}\, P_k^{i-}\, \left(\nabla h^{i}\right)^{T} + R_k^{i},
$$
$$
K = P_k^{i-}\left(\nabla h^{i}\right)^{T} S^{-1}. \qquad (36)
$$

Step 2: update the state estimation and the corresponding covariance:

$$
\hat{X}_k^{i+} = \hat{X}_k^{i-} + Kv,
$$
$$
P_k^{i} = \left(I - K\nabla h^{i}\right) P_k^{i-},
$$
$$
P_{ki}^{i} = \left(I - K\nabla h^{i}\right) P_{ki}^{i-} \left(I - K\nabla h^{i}\right)^{T} + K R_k^{i} K^{T},
$$
$$
P_{kd}^{i} = P_k^{i} - P_{ki}^{i}. \qquad (37)
$$
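Steps 1 and 2 can be sketched together in NumPy as follows (a hedged illustration: a generic measurement function `h` and Jacobian `H` stand in for $h^i$ and $\nabla h^i$, and the function name is ours):

```python
import numpy as np

def single_node_update(x, P, Pi, z, h, H, R):
    """EKF update keeping the split covariance, eqs. (36)-(37)."""
    v = z - h(x)                       # innovation
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # filter gain
    x_new = x + K @ v
    I_KH = np.eye(len(x)) - K @ H
    P_new = I_KH @ P                              # total covariance
    Pi_new = I_KH @ Pi @ I_KH.T + K @ R @ K.T     # independent part (Joseph form)
    Pd_new = P_new - Pi_new                       # correlated part
    return x_new, P_new, Pd_new, Pi_new
```

The Joseph-style form for the independent component mirrors the third line of (37), so the split $P = P_d + P_i$ is preserved by construction after each update.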

5.4. Collaborative Measurement Update among Nodes. In the framework of decentralized collaborative navigation, the state estimation results of a single node's aided navigation system and the state estimation results based on information sharing among nodes are integrated in the process of internode collaborative measurement updating, and the corrected state information is derived.

For the decentralized collaborative navigation framework, any node can clearly estimate the state of other nodes. In order to save the communication cost and reduce the computation on a single mobile robot platform, this paper stipulates that information exchange among the mobile robots takes place only between two adjacent mobile robot nodes.

Assuming that mobile robot i performs a relative observation of mobile robot j at time k and shares its own state and covariance information with mobile robot j, the state of mobile robot j can be explicitly expressed with the received state of mobile robot i and the relative measurement information between the two nodes:

$$
X_{jk}^{co} =
\begin{bmatrix} x_{jk}^{co} \\ y_{jk}^{co} \end{bmatrix} =
\begin{bmatrix}
x_i^k + x_{ji}^k \cos\theta_i^k - y_{ji}^k \sin\theta_i^k \\
y_i^k + x_{ji}^k \sin\theta_i^k + y_{ji}^k \cos\theta_i^k
\end{bmatrix}, \qquad (38)
$$

where $(x_{jk}^{co}, y_{jk}^{co})$ is the partial state estimation of mobile robot j obtained by the information sharing between mobile robot i and mobile robot j at time k, $(x_i^k, y_i^k, \theta_i^k)$ is the state vector shared by mobile robot i with mobile robot j at time k, and $u_{rel} = (x_{ji}^k, y_{ji}^k)$ is the relative measurement information of the two nodes in the coordinate system of mobile robot i.

If there is a direct relative observation between the two nodes, the relative measurement information can be obtained directly by the sensor performing the relative observation. If the relative observation between the two nodes is obtained indirectly, with the help of the surrounding landmark information, then the relative measurement information must first be solved; the concrete solution method can combine (25) and then convert the result into the coordinate system of the mobile robot.
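In the direct case, the shared-state solution (38) is a simple rigid-body transform from robot i's frame back to the world frame; a minimal sketch (function name ours):

```python
import numpy as np

def shared_position(xi, yi, theta_i, x_ji, y_ji):
    """Position of robot j from robot i's shared state (x_i, y_i, theta_i)
    and the relative measurement (x_ji, y_ji) in robot i's frame, eq. (38)."""
    x_co = xi + x_ji * np.cos(theta_i) - y_ji * np.sin(theta_i)
    y_co = yi + x_ji * np.sin(theta_i) + y_ji * np.cos(theta_i)
    return x_co, y_co
```

This is the inverse of the observation model (26): rotating the body-frame measurement by $+\theta_i^k$ and translating by robot i's position recovers robot j's world-frame coordinates.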

Finally, based on the principle of the covariance intersection filter, the updating process of collaborative measurement among nodes in the framework of decentralized collaborative navigation can be obtained.


6. Simulation Results

6.1. Simulated Experimental Environment. In this section, the number of mobile robot nodes involved in collaborative navigation is 3. The area of the 2D environment is assumed to be 25 m × 25 m; when the mobile robot group works together, each node in the environment is assigned an initial position, and each node can follow a random trajectory in this area. It is assumed that all nodes follow the same simulation trajectory and only the initial positions differ. The maximum speed of a mobile robot on the straight line is 0.3125 m/s, and the angular velocity at a bend is 0.1 deg/s. It is assumed that, around the simulated rectangular trajectory, lidar scanning can extract 88 landmarks (environmental feature points) for the SLAM-aided navigation system (see Figure 10).

During this simulation, the mobile robots as carriers can carry different types of sensors, including odometers, UWB, and lidar. Suitable sensors are selected according to the requirements of positioning accuracy; among them, Time Domain P410 UWB sensors are used to measure the relative distance, and the lidar is selected from the LMS291 series of 2D lidars produced by a German company. Based on the relevant parameters of these sensors, which are shown in Table 1, a simulation model for mobile robots carrying different types of sensors is built using MATLAB.

6.2. Relative Measurement Aided Odometer Collaborative Navigation. In the experiment, all three mobile robots are equipped with an odometer capable of motion monitoring, UWB capable of measuring relative distance, or lidar capable of measuring relative position.

From Figure 11, it can be seen that the collaborative navigation system that realizes relative information sharing has significant advantages in positioning accuracy over the case of not sharing any information. Besides, the improvement of the group navigation performance of the mobile robots is affected by the type of shared relative information. When the relative position information is shared, the growth of the error can be effectively limited; relatively speaking, when the relative distance information is shared, the position error still grows slowly, which only reduces the growth rate of the error (see Figure 11).

The analysis shows that the relative distance information is a weak constraint, so sharing this information cannot effectively realize the navigation and localization of the mobile robots. In contrast, sharing the relative position information contributes directly to the solution of mobile robot navigation, and the positioning accuracy is significantly improved; it can even be increased by more than 60% at some times. This difference is more obvious in the angle error diagram (see Figure 11).

In this paper, two observation methods, direct relative measurement and indirect relative measurement, are mentioned in the description of the measurement model based on relative position. Based on this experimental scene, in scenario I the three mobile robots observe the relative position information directly through lidar; in scenario II the three mobile robots extract the surrounding landmark information through lidar, and the relative position information is calculated on this basis. In both experimental scenarios, the centralized collaborative navigation algorithm is used to solve the navigation problem. The two relative position measurement methods are compared through the above simulation scenarios, and the comparison results are shown in Figure 12.

From Figure 12, it is clear that the collaborative navigation and positioning accuracy of relative position measurement using the direct method is better than that of the indirect method. However, the computational cost cannot be ignored while navigation performance is considered. The direct method requires that the measurement range of the lidar cover the activity range of the whole mobile robot group, while the measurement range of the lidar required by the indirect method only needs to cover the surrounding landmarks, which greatly reduces the cost. Considering that the accuracies of collaborative navigation and positioning using the two relative position measurement methods do not differ much, the indirect method is obviously more suitable for practical application (see Figure 12).

The difference of the decentralized collaborative navigation framework compared with the centralized collaborative navigation framework is that the correlation among

Figure 10: Simulation trajectory diagram. X (m) and Y (m) axes from −5 to 30; markers: landmark, trajectory.

Table 1: Relevant parameters of sensors.

Type       Measure                             Frequency (Hz)   Error
Odometer   Linear velocity of the two wheels   20               4 cm/s
UWB        Relative distance                   10               3 cm
Lidar      Relative position in X direction    10               2 cm
Lidar      Relative position in Y direction    10               2 cm


Figure 11: Comparative diagram of navigation error under the condition of sharing different relative information. (a) Position error (m). (b) Angle error (deg). Curves: relative distance, relative position, no information; time axis 0–360 s.

Figure 12: Comparison of navigation errors under different relative position measuring methods. (a) Position error (m). (b) Angle error (deg). Curves: direct, indirect; time axis 0–360 s.

Figure 13: Comparison of navigation errors under different collaborative navigation algorithms. (a) Position error (m). (b) Angle error (deg). Curves: CL, DCL; time axis 0–360 s.


the different node states is accurately calculated in the centralized collaborative navigation framework, whereas this correlation cannot be used in the decentralized collaborative navigation framework. In order to better reflect the impact of this correlation, the navigation errors of the two collaborative navigation algorithms in the odometer collaborative navigation system are shown in Figure 13.

To compare the two algorithms, 20 experiments are carried out in this paper, and the root mean square error (RMS) of the two collaborative navigation algorithms is calculated as shown in the following formula:

$$
\mathrm{RMS} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(x_i - \hat{x}_i\right)^{2}}, \qquad (39)
$$

where n is the total number of samples, $x_i$ is the actual value, and $\hat{x}_i$ is the estimated value. The RMS parameters for the odometer collaborative navigation are shown in Table 2.
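Formula (39) can be computed directly; a small sketch (function name ours):

```python
import numpy as np

def rms_error(actual, estimated):
    """Root mean square error over n samples, eq. (39)."""
    actual = np.asarray(actual, dtype=float)
    estimated = np.asarray(estimated, dtype=float)
    return np.sqrt(np.mean((actual - estimated) ** 2))
```

In the experiments, this would be evaluated separately for the position and angle errors of each algorithm over the 20 runs to produce the entries of Table 2.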

As can be seen from Figure 13 and Table 2, the error of the centralized collaborative navigation algorithm is smaller than that of the decentralized collaborative navigation algorithm. This is to be expected, because the correlation among node states can be calculated accurately in the centralized collaborative navigation algorithm, while it is only estimated in the decentralized collaborative navigation algorithm. However, the improved navigation accuracy comes at the expense of high computing power and high-quality data communication. Therefore, although the performance of the centralized collaborative navigation framework is better than that of the decentralized collaborative navigation framework, the centralized framework is not applicable in some practical scenarios (see Figure 13).

6.3. Odometer/Vision SLAM Collaborative Navigation. In the odometer/vision collaborative navigation model, scenario I is designed such that all the mobile robots are equipped with an odometer that can monitor the motion, and one of the mobile robots is equipped with the SLAM-aided navigation system and can work properly.

Firstly, the mobile robot with the SLAM-aided navigation system is studied, and it runs only its own integrated navigation algorithm without sharing the relative position information. Using the centralized collaborative navigation algorithm, the navigation error of the node with the SLAM-aided navigation system is shown in Figure 14.

Figure 14 fully verifies the correctness of the centralized collaborative navigation algorithm based on the odometer/vision collaborative navigation model. The SLAM-aided navigation system is based on relative observation: the position estimation of the node itself and the position estimation of the landmarks both accumulate error, but when the data association algorithm of SLAM is combined with the centralized collaborative navigation algorithm, the position estimation of the landmarks becomes closer to the real value while the positioning accuracy of the mobile robot is improved; the data association becomes more reliable, further correcting the state estimation of the mobile robot itself. Therefore, the improvement of mobile robot navigation accuracy by this algorithm is very obvious (see Figure 14).

Then, the mobile robots without the SLAM-aided navigation system in the experiment are studied. In order to fully reflect the influence of the SLAM-aided navigation information on the navigation performance of the other nodes, scenario II is designed such that all mobile robots are equipped with an odometer that can monitor the motion, and two of them are equipped with the SLAM-aided navigation system and can work properly. The navigation error of the other nodes without the SLAM-aided navigation system is shown in Figure 15.

As shown in Figure 15, the mobile robot with the SLAM-aided navigation system performs loop closure detection at about 320 seconds, and the data are associated with the local map created at the initial location, thus eliminating most of the accumulated error. This uniquely superior performance of the SLAM-aided navigation system is transmitted to the other nodes in the group through the information sharing process of collaborative navigation, so that they can also eliminate most of their accumulated errors near that time, which is an important advantage of the collaborative navigation system (see Figure 15).

To verify the influence of the NN algorithm, the JCBB algorithm, and the optimized data association algorithm on the navigation performance of nodes without the SLAM-aided navigation system, the experimental scene is designed such that all mobile robots are equipped with an odometer for motion monitoring, one of the mobile robots is equipped with the SLAM-aided navigation system and works normally, and the CL algorithm is run. The navigation error of the nodes without the SLAM-aided navigation system is shown in Figure 16.

The performance of the centralized collaborative navigation algorithm under the three SLAM data association algorithms is shown in Table 3.

From Figure 16 and Table 3, it can be seen that the navigation performance of the nodes without the SLAM-aided navigation system is affected by the SLAM data association algorithm used by the nodes carrying the SLAM-aided navigation system. Running the NN algorithm, the matching accuracy of the feature information is not high, so the navigation accuracy is poor. Running the JCBB algorithm, the correct rate of data association is the highest, but the operation time is the longest. Running the optimized data association algorithm, the navigation accuracy is slightly reduced, but the operation time is shorter, which can meet the real-time requirements (see Figure 16).

In this subsection, to compare the performance of collaborative navigation systems in odometer/vision collaborative navigation systems under centralized and

Table 2: Odometer collaborative navigation RMS parameters.

Algorithm type   Position error (m)   Angle error (deg)
CL               0.1629               0.74625
DCL              0.36342              1.3762


Figure 14: The navigation error of the node with the SLAM-aided navigation system. (a) Position error (m). (b) Angle error (deg). Curves: no information, relative position; time axis 0–360 s.

Figure 15: Some nodes are equipped with the SLAM-aided navigation system. (a) Position error (m). (b) Angle error (deg). Curves: two nodes with SLAM, one node with SLAM; time axis 0–360 s.

Figure 16: Comparison diagram of navigation error for fusion of single node SLAM information under different SLAM data association algorithms. (a) Position error (m). (b) Angle error (deg). Curves: JCBB, optimization algorithm, NN; time axis 0–360 s.


decentralized collaborative navigation algorithms, we run the CL and DCL algorithms separately under experimental scenario I. The navigation errors of the two collaborative navigation algorithms are compared, as shown in Figure 17. Under experimental scenario II of this subsection, we likewise run the CL algorithm and the DCL algorithm, respectively; the navigation errors of the two collaborative navigation algorithms are compared as shown in Figure 18.

After 20 experiments, the RMS parameters of the collaborative navigation with fused single node SLAM information are shown in Table 4.

The RMS parameters of the collaborative navigation with fused multinode SLAM information are shown in Table 5.

As can be seen from Figures 17 and 18, in conjunction with Tables 4 and 5, in the odometer/vision collaborative navigation system, the error of the centralized collaborative

Table 3: Performance comparison of centralized collaborative navigation algorithms under different SLAM data association algorithms.

Algorithm type   Position error (m)   Angle error (deg)   Relative time
NN               2.8323               10.7919             4
JCBB             0.0322               0.1623              12
Optimization     0.5587               2.2476              1

Figure 17: Comparative diagram of navigation error for fusion of single node SLAM information under different collaborative navigation algorithms. (a) Position error (m). (b) Angle error (deg). Curves: CL, DCL; time axis 0–360 s.

Figure 18: Comparison diagram of navigation error for fusion of multinode SLAM information under different collaborative navigation algorithms. (a) Position error (m). (b) Angle error (deg). Curves: CL, DCL; time axis 0–360 s.

Table 4: Collaborative navigation RMS parameters for fusion of single node SLAM information.

Algorithm type   Position error (m)   Angle error (deg)
CL               0.0322               0.1623
DCL              0.0669               0.2094

Table 5: Collaborative navigation RMS parameters for fusion of multinode SLAM information.

Algorithm type   Position error (m)   Angle error (deg)
CL               0.0243               0.0524
DCL              0.0438               0.1265


navigation algorithm is smaller than that of the decentralized collaborative navigation algorithm; after the landmark information collected by a single node or by multiple nodes is fused, there is only a small gap between the two algorithms. In other words, the decentralized collaborative navigation algorithm based on the odometer/vision collaborative navigation model can well estimate the correlation of the internode information (see Figures 17 and 18).

Considering the high requirements of the centralized collaborative navigation algorithm on computing power and communication, the application scenarios of the two algorithms are analyzed in combination with the abovementioned collaborative navigation experiments: the centralized collaborative navigation algorithm is suitable for cases with few nodes where the nodes are not equipped with additional aided navigation systems; the decentralized collaborative navigation algorithm is suitable for a large number of nodes sharing a large amount of information, with some nodes equipped with additional aided navigation systems, especially the SLAM-aided navigation system.

7. Conclusion

In order to improve the performance of collaborative navigation systems, a multirobot collaborative navigation algorithm based on odometer/vision multisource information fusion is studied. On the basis of establishing the multisource information fusion collaborative navigation system model, the centralized collaborative navigation of odometer/vision fusion, the decentralized collaborative navigation framework, and the vision-based SLAM are given, and the centralized and decentralized odometer/vision collaborative navigation algorithms are derived, respectively. The effectiveness of the proposed algorithms is verified by simulation experiments, which has both theoretical and application value for high-performance collaborative navigation applications.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

References

[1] K. N. Olivier, D. E. Griffith, G. Eagle et al., "Randomized trial of liposomal amikacin for inhalation in nontuberculous mycobacterial lung disease," American Journal of Respiratory and Critical Care Medicine, vol. 195, no. 6, pp. 814–823, 2017.

[2] M. Schwarz, M. Beul, D. Droeschel et al., "DRC team NimbRo Rescue: perception and control for centaur-like mobile manipulation robot Momaro," Springer Tracts in Advanced Robotics, Springer, Berlin, Germany, pp. 145–190, 2018.

[3] M. Long, H. Su, and B. Liu, "Group controllability of two-time-scale discrete-time multi-agent systems," Journal of the Franklin Institute, vol. 357, no. 6, pp. 3524–3540, 2020.

[4] T. Fukuda, S. Nakagawa, Y. Kawauchi, and M. Buss, "Structure decision method for self organising robots based on cell structures—CEBOT," in Proceedings of the 1989 International Conference on Robotics and Automation, Scottsdale, AZ, USA, May 1989.

[5] H. Asama, A. Matsumoto, and Y. Ishida, "Design of an autonomous and distributed robot system: ACTRESS," in Proceedings of the IEEE/RSJ International Workshop on Intelligent Robots and Systems (IROS '89), the Autonomous Mobile Robots and Its Applications, September 1989.

[6] J. Zhou, Y. Lv, G. Wen, X. Wu, and M. Cai, "Three-dimensional cooperative guidance law design for simultaneous attack with multiple missiles against a maneuvering target," in Proceedings of the 2018 IEEE CSAA Guidance, Navigation and Control Conference (CGNCC), August 2018.

[7] H. Su, J. Zhang, and Z. Zeng, "Formation-containment control of multi-robot systems under a stochastic sampling mechanism," Science China Technological Sciences, vol. 63, no. 6, pp. 1025–1034, 2020.

[8] H. Park and S. Hutchinson, "A distributed robust convergence algorithm for multi-robot systems in the presence of faulty robots," in Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 2980–2985, IEEE, Hamburg, Germany, September–October 2015.

[9] K. Petersen and R. Nagpal, "Complex design by simple robots: a collective embodied intelligence approach to construction," Architectural Design, vol. 87, no. 4, pp. 44–49, 2017.

[10] L. Chaimowicz, T. Sugar, V. Kumar, and M. F. M. Campos, "An architecture for tightly coupled multi-robot cooperation," in Proceedings 2001 ICRA, IEEE International Conference on Robotics and Automation (Cat. no. 01CH37164), vol. 3, pp. 2992–2997, IEEE, Seoul, Korea, May 2001.

[11] H.-X. Hu, G. Chen, and G. Wen, "Event-triggered control on quasi-average consensus in the cooperation-competition network," in Proceedings of the IECON 2018—44th Annual Conference of the IEEE Industrial Electronics Society, October 2018.

[12] A. Amanatiadis, K. Charalampous, I. Kostavelis et al., "The AVERT project: autonomous vehicle emergency recovery tool," in Proceedings of the 2013 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR), pp. 1–5, IEEE, Linkoping, Sweden, October 2013.

[13] R. Kurazume, S. Hirose, T. Iwasaki, S. Nagata, and N. Sashida, "Study on cooperative positioning system," in Proceedings of the IEEE International Conference on Robotics and Automation, Minneapolis, MN, USA, August 1996.

[14] Z. Fu, Y. Zhao, and G. Wen, "Distributed continuous-time optimization in multi-agent networks with undirected topology," in Proceedings of the 2019 IEEE 15th International Conference on Control and Automation (ICCA), November 2019.

[15] Y. Zhao, Y. Liu, and G. Wen, "Finite-time average estimation for multiple double integrators with unknown bounded inputs," in Proceedings of the 2018 33rd Youth Academic Annual Conference of Chinese Association of Automation (YAC), May 2018.

[16] S. Mao, Mobile Robot Localization in Indoor Environment, Ph.D. dissertation, Zhejiang University, Hangzhou, China, 2016.

[17] J. Yang, "Analysis approach to odometric non-systematic error uncertainty for mobile robots," Chinese Journal of Mechanical Engineering, vol. 44, no. 8, pp. 7–12, 2008.

[18] J. Kang, F. Zhang, and X. Qu, "Angle measuring error analysis of coordinate measuring system of laser radar," vol. 40, no. 6, pp. 834–839, 2016.

[19] J. Zhang, P. Orlik, Z. Sahinoglu, A. Molisch, and P. Kinney, "UWB systems for wireless sensor networks," Proceedings of the IEEE, vol. 97, no. 2, pp. 313–331.

[20] D. Kaushal and T. Shanmuganantham, "Design of a compact and novel microstrip patch antenna for multiband satellite applications," Materials Today: Proceedings, vol. 5, no. 10, pp. 21175–21182, 2018.

[21] J. Xiucai, Data Association Problem for Simultaneous Localization and Mapping of Mobile Robots, Ph.D. dissertation, National University of Defense Technology, Changsha, China, 2008.

[22] Z. Yuan, "Research of mobile robot's SLAM based on binocular vision," Master's thesis, Tianjin University of Technology, Tianjin, China, 2016.

[23] F. Bellavia, M. Fanfani, F. Pazzaglia, and C. Colombo, "Robust selective stereo SLAM without loop closure and bundle adjustment," in Proceedings of the International Conference on Image Analysis and Processing, pp. 462–471, Springer, Naples, Italy, 2013.

[24] H. Fourati, Multisensor Data Fusion: From Algorithms and Architectural Design to Applications, CRC Press, Boca Raton, FL, USA, 2015.

[25] S. Jia, X. Yin, and X. Li, "Mobile robot parallel PF-SLAM based on OpenMP," in Proceedings of the 2012 IEEE International Conference on Robotics and Biomimetics (ROBIO), pp. 508–513, IEEE, Guangzhou, China, December 2012.

[26] W. Zhou, E. Shiju, Z. Cao, and Y. Dong, "Review of SLAM data association study," in Proceedings of the 2016 International Conference on Sensor Network and Computer Engineering, Atlantis Press, Shanghai, China, 2016.

[27] R. Singer and R. Sea, "A new filter for optimal tracking in dense multitarget environments," in Proceedings of the Annual Allerton Conference on Circuit and System Theory, pp. 201–211, Monticello, MN, USA, 1972.

[28] J. Neira and J. D. Tardos, "Data association in stochastic mapping using the joint compatibility test," IEEE Transactions on Robotics and Automation, vol. 17, no. 6, pp. 890–897, 2001.

[29] L. Yanju, X. Yufeng, G. Song, H. Xi, and G. Zhengping, "Research on data association in SLAM based laser sensor," Microcomputer & Its Application, vol. 36, no. 2, pp. 78–82, 2017.

[30] O. Hlinka, O. Sluciak, F. Hlawatsch, and M. Rupp, "Distributed data fusion using iterative covariance intersection," in Proceedings of the 2014 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 1861–1865, IEEE, Florence, Italy, May 2014.

16 Mathematical Problems in Engineering

Page 9: MultirobotCollaborativeNavigationAlgorithmsBasedon … · 2020. 5. 24. · According to the odometer/vision collaborative navi-gation model, we use the most common EKF algorithms

where the covariance corresponding to two state estimatesX1 and X2 is P1d + P1i and P2d + P2i respectively P1d andP2 d are the correlation covariance components corre-sponding to the maximum correlation between the two stateestimates P1i and P2i are independent covariance compo-nents corresponding to absolute independence of two stateestimates W within interval [0 1] It is an optimizationparameter that minimizes the covariance after fusion andthe w in this interval can ensure the consistency of the fusionresults

52 TimeUpdate Before describing the time update processof DCL algorithm it is necessary to decompose the stateinformation of the system in the framework of centralizedcollaborative navigation which can be expressed as

EG XG PG1113864 1113865

X1 P11113864 1113865 X2 P21113864 1113865 middot middot middot XN PN1113864 11138651113864 1113865

X1 P1 d + P1i1113864 1113865 X2 P2 d + P2i1113864 1113865 middot middot middot XN PN d + PNi1113864 11138651113864 1113865

(34)

where EG is the set of states under the centralized collab-orative navigation framework and XG and PG are the statespace and the corresponding covariance matrix under thecentralized collaborative navigation frameworkrespectively

e state propagation process under the decentralizedcollaborative navigation framework is the state propagationprocess of a single node and the propagation process ofcovariance can be expressed as

Piminusk J

iX(kminus1)P

ikminus1J

i TX(kminus1) + J

iu(kminus1)ΣuJ

i Tu(kminus1)

Piminuski J

iX(kminus1)P

ikminus1iJ

i TX(kminus1) + J

iu(kminus1)ΣuJ

i Tu(kminus1)

(35)

wherePiminusk is the one-step prediction covariancematrix of the

mobile robot i at time k and Piminuski is the covariance e

independent covariance components of the matrix JiX(kminus1)

and Jiu(kminus1) are Jacobian matrices of function fi(X u) for

state vector and control input and Σu is the error matrix ofthe control input

53 Single Node Measurement Update e measurementand updating process of a single node only involves the aidednavigation system of a single node so there is no need toestimate the correlation that is the process of saving for-mulas (28) and (29) Similar to the single node measurementupdate process in centralized collaborative navigation thesingle node measurement update process in distributedcollaborative navigation can be expressed as

Step 1: calculate the innovation and the filtering gain:

$$v = z_k^i - \nabla h^i \hat{X}_k^{i-},$$
$$S = \nabla h^i P_k^{i-} \big(\nabla h^i\big)^T + R_k^i,$$
$$K = P_k^{i-} \big(\nabla h^i\big)^T S^{-1}. \quad (36)$$

Step 2: update the state estimate and the corresponding covariance:

$$\hat{X}_k^{i+} = \hat{X}_k^{i-} + K v,$$
$$P_k^i = \big(I - K \nabla h^i\big) P_k^{i-},$$
$$P_{ki}^i = \big(I - K \nabla h^i\big) P_{ki}^{i-} \big(I - K \nabla h^i\big)^T + K R_k^i K^T,$$
$$P_{kd}^i = P_k^i - P_{ki}^i. \quad (37)$$
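The two steps (36)–(37) can be sketched as follows, assuming a linearized measurement matrix H in the role of $\nabla h^i$; the function name and the toy numbers are illustrative assumptions, not the paper's code.

```python
import numpy as np

def single_node_update(x_pred, P_pred, Pi_pred, z, H, R):
    """Single-node EKF measurement update that also tracks the
    independent covariance component Pi (Joseph form) and recovers the
    correlated part Pd = P - Pi, mirroring equations (36)-(37)."""
    v = z - H @ x_pred                      # innovation
    S = H @ P_pred @ H.T + R                # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)     # filtering gain
    x = x_pred + K @ v
    I_KH = np.eye(len(x_pred)) - K @ H
    P = I_KH @ P_pred
    Pi = I_KH @ Pi_pred @ I_KH.T + K @ R @ K.T
    Pd = P - Pi
    return x, P, Pi, Pd

# Toy example: position-only measurement of a 3D pose state.
H = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])
R = 0.01 * np.eye(2)
x_u, P_u, Pi_u, Pd_u = single_node_update(
    np.zeros(3), np.eye(3), 0.5 * np.eye(3), np.array([1.0, 1.0]), H, R)
```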

5.4. Collaborative Measurement Update among Nodes. In the framework of decentralized collaborative navigation, the state estimates of a single node's aided navigation system and the state estimates based on information sharing among nodes are integrated during the internode collaborative measurement update, and the corrected state information is derived.

Under the decentralized collaborative navigation framework, any node can estimate the state of the other nodes. In order to save communication cost and reduce the computational load of a single mobile robot platform, this paper assumes that information exchange takes place only between two adjacent mobile robot nodes.

Assuming that mobile robot $i$ performs a relative observation of mobile robot $j$ at time $k$ and shares its own state and covariance information with robot $j$, the state of robot $j$ can be expressed from the received state of robot $i$ and the relative measurement between the two nodes:

$$X_k^{co,j} = \begin{bmatrix} x_k^{co,j} \\ y_k^{co,j} \end{bmatrix} = \begin{bmatrix} x_k^i + x_k^{ji}\cos\theta_k^i - y_k^{ji}\sin\theta_k^i \\ y_k^i + x_k^{ji}\sin\theta_k^i + y_k^{ji}\cos\theta_k^i \end{bmatrix}, \quad (38)$$

where $(x_k^{co,j}, y_k^{co,j})$ is the partial state estimate of mobile robot $j$ obtained through the information sharing between mobile robots $i$ and $j$ at time $k$, $(x_k^i, y_k^i, \theta_k^i)$ is the state vector shared by mobile robot $i$ with mobile robot $j$ at time $k$, and $u_{rel} = (x_k^{ji}, y_k^{ji})$ is the relative measurement of the two nodes expressed in the body coordinate system of mobile robot $i$.

If there is a direct relative observation between the two nodes, the relative measurement can be obtained directly from the sensor performing the observation. If the relative observation between the two nodes is made indirectly, through observations of the surrounding landmarks, then the relative measurement must first be solved for; the concrete solution can be obtained by combining (25) and then transforming the result into the coordinate system of the mobile robot.
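The body-to-world transform of equation (38) can be written directly; the function name below is an assumption for illustration.

```python
import numpy as np

def shared_state(xi, yi, thi, x_ji, y_ji):
    """Position of robot j computed from robot i's shared pose
    (xi, yi, thi) and the relative measurement (x_ji, y_ji) expressed
    in robot i's body frame, i.e. the rigid transform of eq. (38)."""
    c, s = np.cos(thi), np.sin(thi)
    return np.array([xi + x_ji * c - y_ji * s,
                     yi + x_ji * s + y_ji * c])

# Robot i at (1, 2) heading 90 deg; robot j seen 1 m ahead of robot i.
p = shared_state(1.0, 2.0, np.pi / 2, 1.0, 0.0)   # -> approximately (1, 3)
```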

Finally, based on the principle of the covariance intersection filter, the collaborative measurement update process among nodes under the decentralized collaborative navigation framework can be obtained.

Mathematical Problems in Engineering 9

6. Simulation Results

6.1. Simulated Experimental Environment. In this section, the mobile robot network involved in collaborative navigation has 3 nodes. The 2D environment in which the group moves covers an area of 25 m × 25 m; when the mobile robot group works together, each node is assigned an initial position in the environment, and each node can follow a random trajectory within this area. It is assumed that all nodes follow the same simulated trajectory and differ only in their initial positions. The maximum speed of a mobile robot on the straight segments is 0.3125 m/s, and the angular velocity at the bends is 0.1 deg/s. It is assumed that 88 landmarks (environmental feature points) around the simulated rectangular trajectory can be extracted by lidar scanning for the SLAM-assisted navigation system (see Figure 10).

During this simulation, the mobile robots serve as carriers for different types of sensors, including odometers, UWB, and lidar. Suitable sensors are selected according to the positioning accuracy requirements: Time Domain P410 UWB sensors are used to measure relative distance, and the lidar is an LMS291-series 2D lidar produced by a German company. Based on the relevant sensor parameters shown in Table 1, a simulation model of mobile robots carrying the different sensor types is built in MATLAB.
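A minimal sketch of how such sensors can be simulated from the Table 1 error levels, assuming zero-mean Gaussian noise (an assumption; the paper does not state the noise distribution) and world-frame lidar measurements for simplicity:

```python
import numpy as np

rng = np.random.default_rng(0)

# 1-sigma sensor errors taken from Table 1.
ODOM_STD = 0.04    # wheel linear velocity, m/s
UWB_STD = 0.03     # relative distance, m
LIDAR_STD = 0.02   # relative position per axis, m

def noisy_uwb(p_i, p_j):
    """Simulated UWB range between robots i and j."""
    return np.linalg.norm(p_j - p_i) + rng.normal(0.0, UWB_STD)

def noisy_lidar(p_i, p_j):
    """Simulated lidar relative-position measurement (world frame here
    for simplicity; the paper's model uses robot i's body frame)."""
    return (p_j - p_i) + rng.normal(0.0, LIDAR_STD, size=2)
```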

6.2. Relative Measurement Aided Odometer Collaborative Navigation. In this experiment, all three mobile robots are equipped with an odometer for motion monitoring, together with either UWB for measuring relative distance or lidar for measuring relative position.

From Figure 11, it can be seen that a collaborative navigation system that realizes relative information sharing has a significant advantage in positioning accuracy over the case of not sharing any information. Moreover, the improvement in group navigation performance depends on the type of relative information shared: when relative position information is shared, the growth of the error can be effectively limited, whereas when relative distance information is shared, the position error still grows slowly; sharing distance only reduces the growth rate of the error (see Figure 11).

The analysis shows that relative distance information is a weak constraint, so sharing it alone cannot effectively realize mobile robot navigation and localization. In contrast, shared relative position information contributes directly to the navigation solution, and the accuracy is significantly improved; at some times it can even be increased by more than 60%. This difference is more obvious in the angle error diagram (see Figure 11).

Two observation methods, direct relative measurement and indirect relative measurement, were introduced in the description of the relative-position measurement model. Based on this experimental scene, in Scenario I the three mobile robots observe the relative position information directly through lidar; in Scenario II the three mobile robots extract the surrounding landmark information through lidar, and the relative position information is calculated from that solution. In both scenarios, the centralized collaborative navigation algorithm is used to solve the navigation problem. The two relative position measurement methods are compared through these simulation scenarios, and the comparison results are shown in Figure 12.

From Figure 12, it is clear that the collaborative navigation and positioning accuracy obtained with the direct relative position measurement method is better than that of the indirect method. However, cost cannot be ignored when navigation performance is considered. The direct method requires the measurement range of the lidar to cover the activity range of the whole mobile robot group, while the indirect method only requires it to cover the surrounding landmarks, which greatly reduces the cost. Considering that the positioning accuracy of the two methods is not very different, the indirect method is clearly more suitable for practical applications (see Figure 12).

The difference between the decentralized collaborative navigation framework and the centralized collaborative navigation framework is that the correlation among

Figure 10: Simulation trajectory diagram (trajectory and landmark positions in the X–Y plane; both axes in meters).

Table 1: Relevant parameters of sensors.

Type      | Measurement                       | Frequency (Hz) | Error
Odometer  | Linear velocity of the two wheels | 20             | 4 cm/s
UWB       | Relative distance                 | 10             | 3 cm
Lidar     | Relative position in X direction  | 10             | 2 cm
Lidar     | Relative position in Y direction  | 10             | 2 cm


Figure 11: Comparative diagram of navigation error under the condition of sharing different relative information (relative distance, relative position, no information; 0–360 s). (a) Position error (m). (b) Angle error (deg).

Figure 12: Comparison of navigation errors under different relative position measuring methods (direct vs. indirect). (a) Position error (m). (b) Angle error (deg).

Figure 13: Comparison of navigation errors under different collaborative navigation algorithms (CL vs. DCL). (a) Position error (m). (b) Angle error (deg).


the different node states is accurately calculated in the centralized collaborative navigation framework, whereas this correlation cannot be used in the decentralized framework. In order to better reflect the impact of this correlation, the navigation errors of the two collaborative navigation algorithms in the odometer collaborative navigation system are shown in Figure 13.

To compare the two algorithms, 20 experiments are carried out, and the root mean square (RMS) error of each collaborative navigation algorithm is calculated as shown in the following formula:

$$\mathrm{RMS} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\big(x_i - \hat{x}_i\big)^2}, \quad (39)$$

where $n$ is the total number of samples, $x_i$ is the actual value, and $\hat{x}_i$ is the estimated value. The RMS parameters for odometer collaborative navigation are shown in Table 2.
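Equation (39) amounts to a one-line computation; the function name below is an illustrative assumption.

```python
import numpy as np

def rms_error(actual, estimated):
    """Root mean square error over n samples, as in equation (39)."""
    actual = np.asarray(actual, dtype=float)
    estimated = np.asarray(estimated, dtype=float)
    return np.sqrt(np.mean((actual - estimated) ** 2))

rms_error([1.0, 2.0, 3.0], [1.1, 1.9, 3.2])  # -> about 0.1414
```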

As can be seen from Figure 13 and Table 2, the error of the centralized collaborative navigation algorithm is smaller than that of the decentralized algorithm. This is to be expected, because the correlation among node states can be calculated accurately in the centralized algorithm, whereas it can only be estimated in the decentralized one. However, the improved navigation accuracy comes at the expense of high computing power and high-quality data communication. Therefore, although the centralized collaborative navigation framework performs better than the distributed one, it is not applicable in some practical scenarios (see Figure 13).

6.3. Odometer/Vision SLAM Collaborative Navigation. In the odometer/vision collaborative navigation model, Scenario I is designed so that all the mobile robots are equipped with an odometer for motion monitoring, and one of the mobile robots is additionally equipped with a properly working SLAM-aided navigation system.

Firstly, the mobile robot with the SLAM-aided navigation system is studied, first running only its own integrated navigation algorithm without sharing relative position information and then running the centralized collaborative navigation algorithm; the navigation error of the node with the SLAM-aided navigation system is shown in Figure 14.

Figure 14 verifies the correctness of the centralized collaborative navigation algorithm based on the odometer/vision collaborative navigation model. The SLAM-aided navigation system relies on relative observations, so both the node's own position estimate and the landmark position estimates accumulate error; when the SLAM data association algorithm is combined with the centralized collaborative navigation algorithm, the landmark position estimates move closer to their true values. As the positioning accuracy of the mobile robot improves, the data association becomes more reliable, further correcting the state estimate of the mobile robot itself. Therefore, the algorithm clearly improves the navigation accuracy of the mobile robot (see Figure 14).

Then, the mobile robots without the SLAM-aided navigation system are studied. In order to fully reflect the influence of SLAM-aided navigation information on the navigation performance of the other nodes, Scenario II is designed so that all mobile robots are equipped with an odometer for motion monitoring and two of them are equipped with a properly working SLAM-aided navigation system. The navigation error of the other nodes without the SLAM-aided navigation system is shown in Figure 15.

As shown in Figure 15, the mobile robot with the SLAM-aided navigation system performs loop closure detection at about 320 seconds and associates the data with the local map created at the initial location, thus eliminating most of the accumulated error. This superior performance of the SLAM-aided navigation system is transmitted to the other nodes in the group through information sharing during collaborative navigation, so that they can also eliminate most of their accumulated error around that time, which is an important advantage of the collaborative navigation system (see Figure 15).

To examine the influence of the NN algorithm, the JCBB algorithm, and the optimized data association algorithm on the navigation performance of nodes without a SLAM-aided navigation system, the experimental scene is designed so that all mobile robots are equipped with an odometer for motion monitoring, one of the mobile robots is equipped with a properly working SLAM-aided navigation system, and the CL algorithm is run. The navigation error of the nodes without the SLAM-aided navigation system is shown in Figure 16.

The performance of the centralized collaborative navigation algorithm under the three SLAM data association algorithms is shown in Table 3.

From Figure 16 and Table 3, it can be seen that the navigation performance of nodes without a SLAM-aided navigation system is affected by the SLAM data association algorithm used by the nodes carrying one. With the NN algorithm, the matching accuracy of the feature information is low, so the navigation accuracy is poor. With the JCBB algorithm, the data association success rate is the highest, but the running time is the longest. With the optimized data association algorithm, the navigation accuracy is slightly reduced, but the running time is short enough to meet real-time requirements (see Figure 16).

In this subsection, to compare the performance of odometer/vision collaborative navigation systems under the centralized and

Table 2: Odometer collaborative navigation RMS parameters.

Algorithm type | Position error (m) | Angle error (deg)
CL             | 0.1629             | 0.74625
DCL            | 0.36342            | 1.3762


Figure 14: Navigation error of the node with the SLAM-aided navigation system (no information vs. relative position sharing). (a) Position error (m). (b) Angle error (deg).

Figure 15: Navigation error when some nodes are equipped with a SLAM-aided navigation system (one vs. two nodes with SLAM). (a) Position error (m). (b) Angle error (deg).

Figure 16: Comparison of navigation error when fusing single node SLAM information under different SLAM data association algorithms (NN, JCBB, optimization). (a) Position error (m). (b) Angle error (deg).


decentralized collaborative navigation algorithms, we run the CL and DCL algorithms separately under experimental Scenario I; the navigation errors of the two collaborative navigation algorithms are compared in Figure 17. Under experimental Scenario II of this subsection, we again run the CL and DCL algorithms; the navigation errors of the two algorithms are compared in Figure 18 (see Figures 17 and 18).

After 20 experiments, the RMS parameters of collaborative navigation fusing single node SLAM information are shown in Table 4.

The RMS parameters of collaborative navigation fusing multinode SLAM information are shown in Table 5.

As can be seen from Figures 17 and 18, in conjunction with Tables 4 and 5, in the odometer/vision collaborative navigation system the error of the centralized collaborative

Table 3: Performance comparison of centralized collaborative navigation algorithms under different SLAM data association algorithms.

Algorithm type | Position error (m) | Angle error (deg) | Relative time
NN             | 2.8323             | 10.7919           | 4
JCBB           | 0.0322             | 0.1623            | 12
Optimization   | 0.5587             | 2.2476            | 1

Figure 17: Comparison of navigation error when fusing single node SLAM information under different collaborative navigation algorithms (CL vs. DCL). (a) Position error (m). (b) Angle error (deg).

Figure 18: Comparison of navigation error when fusing multinode SLAM information under different collaborative navigation algorithms (CL vs. DCL). (a) Position error (m). (b) Angle error (deg).

Table 4: Collaborative navigation RMS parameters for fusion of single node SLAM information.

Algorithm type | Position error (m) | Angle error (deg)
CL             | 0.0322             | 0.1623
DCL            | 0.0669             | 0.2094

Table 5: Collaborative navigation RMS parameters for fusion of multinode SLAM information.

Algorithm type | Position error (m) | Angle error (deg)
CL             | 0.0243             | 0.0524
DCL            | 0.0438             | 0.1265


navigation algorithm is smaller than that of the distributed collaborative navigation algorithm; after the landmark information collected by a single node or by multiple nodes is fused, the gap between the two algorithms is small. In other words, the distributed collaborative navigation algorithm based on the odometer/vision collaborative navigation model estimates the correlation of the internode information well (see Figures 17 and 18).

Considering the centralized collaborative navigation algorithm's high demands on computing power and communication, the application scenarios of the two algorithms can be analyzed in light of the above experiments: the centralized algorithm is suitable when there are few nodes and the nodes carry no additional aided navigation system, while the decentralized algorithm is suitable when there are many nodes sharing a large amount of information and some nodes carry additional aided navigation systems, especially SLAM-aided navigation systems.

7. Conclusion

In order to improve the performance of collaborative navigation systems, multirobot collaborative navigation algorithms based on odometer/vision multisource information fusion are studied. On the basis of the multisource information fusion collaborative navigation system model, the centralized and decentralized odometer/vision collaborative navigation frameworks and the vision-based SLAM algorithm are given, and the centralized and decentralized odometer/vision collaborative navigation algorithms are derived. The effectiveness of the proposed algorithms is verified by simulation experiments, which gives them theoretical and practical value for high-performance collaborative navigation applications.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that they have no conflicts of interest.


16 Mathematical Problems in Engineering

Page 10: MultirobotCollaborativeNavigationAlgorithmsBasedon … · 2020. 5. 24. · According to the odometer/vision collaborative navi-gation model, we use the most common EKF algorithms

6 Simulation Results

61 Simulated Experimental Environment In this sectionthe nodes of the mobile robot network involved in collab-orative navigation are 3 Assuming that the area of themoving 2D environment is 25m times 25m when the mobilerobot population group works together each node in theenvironment is assigned to an initial position and each nodecan follow random trajectory motion in this area Assumingthat all nodes follow the same simulation trajectory only theinitial position is different e maximum speed of themobile robot in the straight line is 03125ms and the an-gular velocity at the bend is 01degs It is assumed that theenvironment around the simulated trajectory of this rect-angle can be extracted by lidar scanning to 88 landmarks(environmental feature points) for the SLAM-assistednavigation system (see Figure 10)

During this simulation mobile robots as carriers cancarry different types of sensors including odometers UWBand lidar Suitable sensors are selected according to therequirements of positioning accuracy among which TimeDomain P410 UWB sensors are used to measure the relativedistance and lidar is selected from LMS291 series 2D lidarproduced by a German company Based on the relevantparameters of these sensors which are shown in Table 1 asimulation model for mobile robots carrying different typesof sensors is built using MATLAB

62 Relative Measurement Aided Odometer CollaborativeNavigation In the experiment all three mobile robots areequipped with odometer capable of moving monitoringUWB capable of measuring relative distance or lidar capableof measuring relative position

From Figure 11 it can be seen that the collaborativenavigation system which realizes relative informationsharing has significant advantages over the case of not

sharing any information in positioning accuracy Besidesthe improvement of group navigation performance ofmobile robots is affected by the type of shared relative in-formation When the relative position information is sharedthe growth of the error can be effectively limited relativelyspeaking when the relative distance information is sharedthe position error is still growing slowly which only reducesthe growth rate of the error (see Figure 11)

e analysis shows that the relative distance informationis weakly constrained so sharing this information cannoteffectively realize the navigation and localization of mobilerobots In contrast the sharing of relative position infor-mation includes the solution to mobile robot navigation andinformation Information accuracy is significantly im-proved It can even be increased by more than 60 at sometime is difference is more obvious in the angle errordiagram (see Figure 11)

In this paper two observation methods direct relativemeasurement and indirect relativemeasurement arementionedin the description of the measurement model based on relativeposition Based on this experimental scene scenario I is threemobile robots to observe the relative position information di-rectly through lidar Scenario II is three mobile robots to extractthe surrounding landmark information through lidar and basedon this solution the relative position information is calculatedIn the above experimental scenario the centralized collaborativenavigation algorithm is used to solve the navigation probleme two relative position measurement methods are comparedthrough the above simulation experimental scenarios ecomparison results are shown in Figure 12 (see Figure 12)

rough Figure 12 it is clear that the collaborative navi-gation and positioning accuracy of relative position measure-ment using the direct method are better than those of indirectmethod However cost calculation cannot be ignored whilenavigation performance is considered e direct method re-quires that the measurement range of lidar includes the activityrange of the whole mobile robot population group while themeasurement range of lidar required by indirect method onlyneeds to include the surrounding landmarks is greatly re-duces the cost calculation Considering that the accuracy ofcollaborative navigation and positioning using the two relativeposition measurement methods is not much different it isobvious that the indirect method is more suitable for practicalapplication (see Figure 12)

e difference of the decentralized collaborative navi-gation framework compared with the centralized collabo-rative navigation framework is that the correlation among

ndash5 0 5 10 15 20 25 30X (m)

ndash5

0

5

10

15

20

25

30Y

(m)

LandmarkTrajectory

Figure 10 Simulation trajectory diagram

Table 1 Relevant parameters of sensors

Type Measure Frequency(Hz) Error

Odometer Linear velocity of the twowheels 20 4 cms

UWB Relative distance 10 3 cm

Lidar

Relative position of Xdirection 10 2 cm

Relative position of Ydirection 10 2 cm

10 Mathematical Problems in Engineering

Figure 11: Comparative diagram of navigation error under the condition of sharing different relative information (curves: relative distance, relative position, no information; time axis 0–360 s). (a) Position error (m). (b) Angle error (deg).

Figure 12: Comparison of navigation errors under different relative position measuring methods (curves: direct, indirect; time axis 0–360 s). (a) Position error (m). (b) Angle error (deg).

Figure 13: Comparison of navigation errors under different collaborative navigation algorithms (curves: CL, DCL; time axis 0–360 s). (a) Position error (m). (b) Angle error (deg).


the different node states is calculated accurately in the centralized collaborative navigation framework, whereas this correlation cannot be used in the decentralized collaborative navigation framework. In order to better reflect the impact of this correlation, the navigation errors of the two collaborative navigation algorithms in the odometer collaborative navigation system are shown in Figure 13.

To compare the two algorithms, 20 experiments are carried out in this paper, and the root mean square error (RMS) of each collaborative navigation algorithm is calculated as shown in the following formula:

\mathrm{RMS} = \sqrt{\frac{1}{n} \sum_{i=1}^{n} \left(\hat{x}_i - x_i\right)^2},   (39)

where n is the total number of samples, x_i is the actual value, and \hat{x}_i is the estimated value. The RMS parameters for the odometer collaborative navigation are shown in Table 2.
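Equation (39) translates directly into code; a minimal sketch of the per-run RMS used in Tables 2, 4, and 5:

```python
import numpy as np

def rms_error(estimates, truth):
    """RMS of equation (39): sqrt((1/n) * sum_i (x_hat_i - x_i)^2)."""
    e = np.asarray(estimates, dtype=float) - np.asarray(truth, dtype=float)
    return float(np.sqrt(np.mean(e ** 2)))

print(rms_error([1.1, 1.9, 3.2], [1.0, 2.0, 3.0]))  # -> 0.1414...
```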

As can be seen from Figure 13 and Table 2, the error of the centralized collaborative navigation algorithm is smaller than that of the decentralized collaborative navigation algorithm. This is to be expected, because the correlation among node states can be calculated accurately in the centralized collaborative navigation algorithm, whereas it can only be estimated in the decentralized algorithm. However, the improved navigation accuracy comes at the expense of high computing power and high-quality data communication. Therefore, although the performance of the centralized collaborative navigation framework is better than that of the decentralized framework, the centralized framework is not applicable in some practical scenarios.
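Because the decentralized filter cannot track the cross-correlation between nodes exactly, it must fuse shared estimates with a consistency-preserving rule; covariance intersection is the standard choice for this (the paper's covariance-cross filtering plays this role). A sketch under that assumption, with the weight chosen by grid search:

```python
import numpy as np

def covariance_intersection(x1, P1, x2, P2, n_grid=101):
    """Consistent fusion of two estimates whose cross-correlation is
    unknown (the situation the DCL filter faces):
        P^-1 = w * P1^-1 + (1 - w) * P2^-1
        x    = P (w * P1^-1 x1 + (1 - w) * P2^-1 x2)
    The weight w minimises the trace of the fused covariance."""
    I1, I2 = np.linalg.inv(P1), np.linalg.inv(P2)
    best_x, best_P = None, None
    for w in np.linspace(0.0, 1.0, n_grid):
        P = np.linalg.inv(w * I1 + (1.0 - w) * I2)
        if best_P is None or np.trace(P) < np.trace(best_P):
            best_x = P @ (w * I1 @ x1 + (1.0 - w) * I2 @ x2)
            best_P = P
    return best_x, best_P

# Two pose estimates of the same robot, each accurate along a different axis.
x, P = covariance_intersection(np.array([0.0, 0.0]), np.diag([1.0, 4.0]),
                               np.array([1.0, 1.0]), np.diag([4.0, 1.0]))
```

Unlike the centralized update, the fused covariance is never optimistic, which is exactly the trade-off seen in Table 2: consistency is kept, but some accuracy is given up.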

6.3. Odometer/Vision/SLAM Collaborative Navigation. In the odometer/vision collaborative navigation model, scenario I is designed such that all the mobile robots are equipped with an odometer for motion monitoring, and one of the mobile robots is equipped with a properly working SLAM-aided navigation system.

Firstly, the mobile robot with the SLAM-aided navigation system is studied, and it only runs its own integrated navigation algorithm without sharing the relative position information. Using the centralized collaborative navigation algorithm, the navigation error of the node with the SLAM-aided navigation system is shown in Figure 14.

Figure 14 fully verifies the correctness of the centralized collaborative navigation algorithm based on the odometer/vision collaborative navigation model. The SLAM-aided navigation system is based on relative observation, so the position estimates of the node itself and of the landmarks both accumulate error. When the data association algorithm of SLAM is combined with the centralized collaborative navigation algorithm, the landmark position estimates move closer to their true values while the positioning accuracy of the mobile robot is improved; the data association in turn becomes more reliable, further correcting the state estimate of the mobile robot itself. Therefore, the improvement this algorithm brings to the navigation accuracy of the mobile robot is very evident.
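The correction that pulls both the robot pose and the landmark estimate toward consistency is the standard EKF measurement update (the paper notes it uses the common EKF). A minimal sketch with a hypothetical 2-D robot/landmark state and a linear relative-position measurement:

```python
import numpy as np

def ekf_update(x, P, z, H, R):
    """Standard EKF measurement update: the correction is shared between
    robot pose and landmark through the cross terms of P."""
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x_new = x + K @ (z - H @ x)            # state correction
    P_new = (np.eye(len(x)) - K @ H) @ P   # covariance correction
    return x_new, P_new

# State: [robot x, robot y, landmark x, landmark y]; the lidar measures the
# landmark position relative to the robot, so H is linear in this sketch.
x = np.array([0.0, 0.0, 5.0, 5.0])
P = np.diag([0.5, 0.5, 1.0, 1.0])
H = np.array([[-1.0, 0.0, 1.0, 0.0],
              [0.0, -1.0, 0.0, 1.0]])
R = 0.02 ** 2 * np.eye(2)                  # 2 cm lidar noise, per Table 1
x, P = ekf_update(x, P, np.array([5.1, 4.9]), H, R)
```

After the update both the pose block and the landmark block of P shrink, which is the mechanism behind the mutual refinement described above.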

Then the mobile robots without the SLAM-aided navigation system are studied. In order to fully reflect the influence of the SLAM-aided navigation information on the navigation performance of the other nodes, scenario II is designed such that all mobile robots are equipped with an odometer for motion monitoring and two of them are equipped with a properly working SLAM-aided navigation system. The navigation error of the other nodes, without a SLAM-aided navigation system, is shown in Figure 15.

As shown in Figure 15, the mobile robot with the SLAM-aided navigation system performs loop-closure detection at about 320 seconds and associates the data with the local map created at the initial location, thus eliminating most of the accumulated error. This uniquely superior performance of the SLAM-aided navigation system is transmitted to the other nodes in the group through information sharing during collaborative navigation, so that they too eliminate most of their accumulated errors around the same time, which is an important advantage of the collaborative navigation system.

To verify the effect of the NN algorithm, the JCBB algorithm, and the optimized data association algorithm on the navigation performance of nodes without a SLAM-aided navigation system, the experimental scene is designed such that all mobile robots are equipped with an odometer for motion monitoring, one of the mobile robots is equipped with a properly working SLAM-aided navigation system, and the CL algorithm is run. The navigation error of the nodes without a SLAM-aided navigation system is shown in Figure 16.

The performance of the centralized collaborative navigation algorithm under the three SLAM data association algorithms is shown in Table 3.

From Figure 16 and Table 3, it can be seen that the navigation performance of nodes without a SLAM-aided navigation system is affected by the SLAM data association algorithm used by the nodes carrying the SLAM-aided navigation system. Running the NN algorithm, the matching accuracy of the feature information is not high, so the navigation accuracy is poor. Running the JCBB algorithm, the correct rate of data association is the highest, but the operation time is the longest. Running the optimized data association algorithm, the navigation accuracy is slightly reduced but the operation time is much shorter, which can meet real-time requirements.
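The NN approach compared here can be sketched as independent nearest-neighbour matching under a chi-square gate on the Mahalanobis distance; its weakness, unlike JCBB, is that each pairing is decided in isolation, so joint consistency is never checked. A sketch with a hypothetical gate value:

```python
import numpy as np

def nn_associate(predictions, covariances, measurements, gate=9.21):
    """Nearest-neighbour data association with a chi-square gate
    (9.21 ~ 99% for 2 DOF). Returns one measurement index per predicted
    landmark, or -1 when nothing falls inside the gate."""
    matches = []
    for z_hat, S in zip(predictions, covariances):
        S_inv = np.linalg.inv(S)
        best_idx, best_d2 = -1, gate
        for j, z in enumerate(measurements):
            nu = z - z_hat                 # innovation
            d2 = float(nu @ S_inv @ nu)    # squared Mahalanobis distance
            if d2 < best_d2:
                best_idx, best_d2 = j, d2
        matches.append(best_idx)
    return matches

preds = [np.array([0.0, 0.0]), np.array([5.0, 5.0])]
covs = [np.eye(2) * 0.04, np.eye(2) * 0.04]
meas = [np.array([0.1, 0.0]), np.array([5.2, 4.9]), np.array([20.0, 20.0])]
print(nn_associate(preds, covs, meas))  # -> [0, 1]
```

JCBB instead searches over joint hypotheses, which explains both its higher association accuracy and its longer running time in Table 3.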

In this subsection, to compare the performance of odometer/vision collaborative navigation systems under centralized and

Table 2: Odometer collaborative navigation RMS parameters.

Algorithm type    Position error (m)    Angle error (deg)
CL                0.1629                0.74625
DCL               0.36342               1.3762


Figure 14: The navigation error of the node with the SLAM-aided navigation system (curves: relative position, no information; time axis 0–360 s). (a) Position error (m). (b) Angle error (deg).

Figure 15: Navigation error when some nodes are equipped with a SLAM-aided navigation system (curves: one node with SLAM, two nodes with SLAM; time axis 0–360 s). (a) Position error (m). (b) Angle error (deg).

Figure 16: Comparison of navigation error for fusion of single-node SLAM information under different SLAM data association algorithms (curves: NN, JCBB, optimization algorithm; time axis 0–360 s). (a) Position error (m). (b) Angle error (deg).


decentralized collaborative navigation algorithms, we run the CL and DCL algorithms separately under experimental scenario I; the navigation errors of the two collaborative navigation algorithms are compared as shown in Figure 17. Under experimental scenario II of this subsection, we again run the CL algorithm and the DCL algorithm, respectively; the navigation errors of the two collaborative navigation algorithms are compared as shown in Figure 18.

After 20 experiments, the RMS parameters of collaborative navigation with fused single-node SLAM information are shown in Table 4.

The RMS parameters of collaborative navigation with fused multinode SLAM information are shown in Table 5.

As can be seen from Figures 17 and 18, in conjunction with Tables 4 and 5, in the odometer/vision collaborative navigation system the error of the centralized collaborative

Table 3: Performance comparison of the centralized collaborative navigation algorithm under different SLAM data association algorithms.

Algorithm type    Position error (m)    Angle error (deg)    Relative time
NN                2.8323                10.7919              4
JCBB              0.0322                0.1623               12
Optimization      0.5587                2.2476               1

Figure 17: Comparison of navigation error for fusion of single-node SLAM information under different collaborative navigation algorithms (curves: CL, DCL; time axis 0–360 s). (a) Position error (m). (b) Angle error (deg).

Figure 18: Comparison of navigation error for fusion of multinode SLAM information under different collaborative navigation algorithms (curves: CL, DCL; time axis 0–360 s). (a) Position error (m). (b) Angle error (deg).

Table 4: Collaborative navigation RMS parameters for fusion of single-node SLAM information.

Algorithm type    Position error (m)    Angle error (deg)
CL                0.0322                0.1623
DCL               0.0669                0.2094

Table 5: Collaborative navigation RMS parameters for fusion of multinode SLAM information.

Algorithm type    Position error (m)    Angle error (deg)
CL                0.0243                0.0524
DCL               0.0438                0.1265


navigation algorithm is smaller than that of the decentralized collaborative navigation algorithm; after the landmark information collected by a single node or by multiple nodes is fused, only a small gap remains between the two algorithms. In other words, the decentralized collaborative navigation algorithm based on the odometer/vision collaborative navigation model can estimate the correlation of the internode information well.

Considering the high demands that the centralized collaborative navigation algorithm places on computing power and communication, the application scenarios of the two algorithms are analyzed in combination with the above collaborative navigation experiments. The centralized collaborative navigation algorithm is suitable when there are few nodes and the nodes carry no additional aided navigation system; the decentralized collaborative navigation algorithm is suitable when the number of nodes is large, the amount of shared information is large, and some nodes are equipped with additional aided navigation systems, especially a SLAM-aided navigation system.

7. Conclusion

In order to improve the performance of the collaborative navigation system, multirobot collaborative navigation algorithms based on odometer/vision multisource information fusion are studied. On the basis of establishing the multisource information fusion collaborative navigation system model, the centralized and decentralized odometer/vision collaborative navigation frameworks and the vision-based SLAM are given, and the centralized and decentralized odometer/vision collaborative navigation algorithms are derived, respectively. The effectiveness of the proposed algorithms is verified by simulation experiments, which has theoretical and application value for high-performance collaborative navigation.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that they have no conflicts of interest.


Page 11: MultirobotCollaborativeNavigationAlgorithmsBasedon … · 2020. 5. 24. · According to the odometer/vision collaborative navi-gation model, we use the most common EKF algorithms

60 120 180 240 300 3600Time (s)

ndash1

ndash05

0

05

1Po

sitio

n er

ror (

m)

Relative distanceRelative positionNo information

(a)

60 120 180 240 300 3600Time (s)

Relative distanceRelative positionNo information

ndash15

ndash10

ndash5

0

5

10

15

20

Ang

le er

ror (

deg)

(b)

Figure 11 Comparative diagram of navigation error under the condition of sharing different relative information (a) Position error(b) Angle error

60 120 180 240 300 3600Time (s)

ndash02

0

02

04

06

Posit

ion

erro

r (m

)

DirectIndirect

(a)

60 120 180 240 300 3600Time (s)

DirectIndirect

ndash05

0

05

1

15

2

Ang

le er

ror (

deg)

(b)

Figure 12 Comparison of navigation errors under different relative position measuring methods (a) Position error (b) Angle error

Posit

ion

erro

r (m

)

1

05

0

ndash0560 120 180 240 300 3600

Time (s)

CLDCL

(a)

Ang

le er

ror (

deg)

252

15

050

ndash0560 120 180 240 300 3600

Time (s)

CLDCL

1

(b)

Figure 13 Comparison of navigation errors under different collaborative navigation algorithms (a) Position error (b) Angle error

Mathematical Problems in Engineering 11

the different node states is accurately calculated in thecentralized collaborative navigation framework and thiscorrelation cannot be used in the decentralized collaborativenavigation framework In order to better reflect the impactof this correlation the navigation errors of the two col-laborative navigation algorithms in the odometer collabo-rative navigation system are shown in Figure 13 (seeFigure 13)

To compare the two algorithms 20 experiments arecarried out in this paper and the root mean square error(RMS) and formulas of the two collaborative navigationalgorithms are calculated as shown in the following formula

RMS

1n

1113944

n

i1xi minus xi( 1113857

2

11139741113972

(39)

where n is the total number of samples xi is the actual valueand xi is the estimated value RMS parameters for theodometer collaborative navigation are shown in Table 2

As can be seen from Figure 13 and Table 2 the error ofthe centralized collaborative navigation algorithm is smallerthan that of the decentralized collaborative navigation al-gorithm is can be predicted because the correlationamong node states in the centralized collaborative naviga-tion algorithm can be calculated accurately which is esti-mated in the decentralized collaborative navigationalgorithm However the improved accuracy of the navi-gation is at the expense of high computing power and highquality data communication erefore although the per-formance of centralized collaborative navigation frameworkis better than that of distributed collaborative navigationframework the centralized collaborative navigationframework is not applicable in some practical scenarios (seeFigure 13)

63OdometerVisionSLAMCollaborativeNavigation In theodometervision collaborative navigation model scenario Iis designed that all the mobile robots are equipped with anodometer which can monitor the motion One of the mobilerobots is equipped with SLAM-aided navigation system andcan work properly

Firstly the mobile robot with SLAM-aided navigationsystem is studied and it only runs its own integratednavigation algorithm without sharing the relative positioninformation Using the centralized collaborative navigationalgorithm the navigation error of nodes with SLAM-aidednavigation system is shown in Figure 14 (see Figure 14)

Figure 14 fully verifies the correctness of a centralizedcollaborative navigation algorithm based on the odometervision collaborative navigation model e SLAM-assistednavigation system is based on the relative observation eposition estimation of the node itself and the position es-timation of the landmark have the error accumulation butthe association algorithm of the SLAM is combined thecentralized collaborative navigation algorithm makes theposition estimation of the landmark closer to the real valuewhile the positioning accuracy of the mobile robot is

improved the data association is more reliable furthercorrecting the state estimation of the mobile robot itselferefore the algorithm is very obvious to improve thenavigation accuracy of mobile robot (see Figure 14)

en the mobile robots without the SLAM-aided nav-igation system in the experiment are studied In order tofully reflect the influence of the SLAM-aided navigationinformation on the navigation performance of other nodesScenario II is designed that all mobile robots are equippedwith odometer which can monitor the motion and two ofthem are equipped with SLAM-aided navigation system andcan work properly e navigation error of other nodeswithout SLAM-aided navigation system is shown in Fig-ure 15 (see Figure 15)

As shown in Figure 15 the mobile robot with SLAM-aided navigation system performs loop-back detection inabout 320 seconds and data associated with the local mapcreated at the initial location thus eliminating most of theaccumulated errorse unique superior performance of theSLAM-aided navigation system is transmitted to other nodesin the group through the process of information sharing inthe process of collaborative navigation so that it can alsoeliminate most of the accumulated errors in the vicinity ofthe time which is an important advantage of the collabo-rative navigation system (see Figure 15)

To verify the NN algorithm JBCC algorithm and theoptimized data association algorithm on the navigationperformance of nodes without SLAM-aided navigationsystem the experimental scene is designed that all mobilerobots are equipped with odometer which can carry outmotion monitoring One of the mobile robots is equippedwith SLAM-aided navigation system and can work normallyand the CL algorithm is run e navigation error of nodeswithout SLAM-aided navigation system is shown in Fig-ure 16 (see Figure 16)

e performance of the centralized collaborative navi-gation algorithm under the three SLAM data associationalgorithms is shown in Table 3

From Figure 16 and Table 3 it can be seen that thenavigation performance of nodes without SLAM-aidednavigation system is affected by the SLAM data associationalgorithm used by nodes carrying SLAM-aided navigationsystem Running the NN algorithm the matching accuracyof feature information is not high so that the navigationaccuracy is poor Running the JCBB algorithm the correctrate of data association is the highest but the operation timeis the longest Running optimization data association al-gorithm the navigation accuracy is slightly reduced but theoperation time is less which can meet the real-time re-quirements (see Figure 16)

In this subsection to compare the performance of col-laborative navigation systems in odometervision collabo-rative navigation systems under centralized and

Table 2 Odometer collaborative navigation RMS parameters

Algorithm type Position error (m) Angle error (deg)CL 01629 074625DCL 036342 13762

12 Mathematical Problems in Engineering

60 120 180 240 300 3600Time (s)

No informationRelative position

ndash005

0

005

01

015

02

025

03Po

sitio

n er

ror (

m)

(a)

60 120 180 240 300 3600Time (s)

No informationRelative position

ndash4

ndash3

ndash2

ndash1

0

1

2

3

Ang

le er

ror (

deg)

(b)

Figure 14 e navigation error map of the node with the SLAM-aided navigation system (a) Position error (b) Angle error

60 120 180 240 300 3600Time (s)

Two nodes with SLAMOne node with SLAM

ndash01

ndash005

0

005

01

015

Posit

ion

erro

r (m

)

(a)

60 120 180 240 300 3600Time (s)

Two nodes with SLAMOne node with SLAM

ndash02

ndash01

0

01

02

03

04

05

Ang

le er

ror (

deg)

(b)

Figure 15 Some nodes are equipped with SLAM-aided navigation system (a) Position error (b) Angle error

6

4

2

0

ndash2

Posit

ion

erro

r (m

)

60 120 180 240 300 3600Time (s)

JCBBOptimization algorithmNN

(a)

Ang

le er

ror (

deg)

5

0

ndash5

ndash10

ndash15

ndash2060 120 180 240 300 3600

Time (s)

JCBBOptimization algorithmNN

(b)

Figure 16 Comparison diagram of navigation error for fusion of single node SLAM information under different SLAM data associationalgorithms (a) Position error (b) Angle error

Mathematical Problems in Engineering 13

decentralized collaborative navigation algorithms we runthe CL and DCL algorithms separately under the experi-mental scenario I e navigation errors of the two col-laborative navigation algorithms are compared as shown inFigure 17 Under the experimental scenario II of this sub-section we run the CL algorithm and the DCL algorithmrespectively e navigation errors of the two collaborativenavigation algorithms are compared as shown in Figure 18(see Figures 17 and 18)

After 20 experiments the RMS parameters of collabo-rative navigation with single node SLAM information areshown in Table 4

e RMS parameters of the coordinated navigation withthe fused multinode SLAM information are shown inTable 5

As can be seen from Figures 17 and 18 in conjunctionwith Tables 4 and 5 in the odometervision collaborativenavigation system the error of the centralized collaborative

Table 3 Performance comparison of centralized collaborative navigation algorithms under different SLAM data association algorithms

Algorithm type Position error (m) Angle error (deg) Relative timeNN 28323 107919 4JCBB 00322 01623 12Optimization 05587 22476 1

03

02

01

0

ndash01

Posit

ion

erro

r (m

)

60 120 180 240 300 3600Time (s)

CLDCL

(a)A

ngle

erro

r (de

g)

06

04

02

0

ndash02

ndash0460 120 180 240 300 3600

Time (s)

CLDCL

(b)

Figure 17 Comparative diagram of navigation error for fusion of single node SLAM information under different collaborative navigationalgorithms (a) Position error (b) Angle error

01

005

0

ndash005

ndash01

ndash015

Posit

ion

erro

r (m

)

60 120 180 240 300 3600Time (s)

CLDCL

(a)

04

02

0

ndash02

ndash04

Ang

le er

ror (

deg)

60 120 180 240 300 3600Time (s)

CLDCL

(b)

Figure 18 Comparison diagram of navigation error for fusion of multinode SLAM information under different collaborative navigationalgorithms (a) Position error (b) Angle error

Table 4 Collaborative navigation RMS parameters for fusion ofsingle node SLAM information

Algorithm type Position error (m) Angle error (deg)CL 00322 01623DCL 00669 02094

Table 5 Collaborative navigation RMS parameters for fusion ofmultinode SLAM information

Algorithm type Position error (m) Angle error (deg)CL 00243 00524DCL 00438 01265

14 Mathematical Problems in Engineering

navigation algorithm is less than the distributed collabo-rative navigation algorithm after the landmark informationcollected by the single node or the multinode is fused thereis a small gap between the two algorithms In other wordsthe distributed collaborative navigation algorithm based onthe odometervision collaborative navigation model can wellestimate the correlation of the internode information (seeFigures 17 and 18)

Considering the high requirement of the centralizedcollaborative navigation algorithm to the computing powerand the communication level the application scenarios ofthe two algorithms are analyzed in combination with theabovementioned collaborative navigation experiment thecentralized collaborative navigation algorithm is suitable forthe case that there are few nodes and the nodes are notequipped with additional aided navigation system thedecentralized collaborative navigation algorithm is suitablefor the large number of nodes and the large amount ofinformation shared and some nodes are equipped withadditional aided navigation systems especially in the case ofSLAM-aided navigation system

7 Conclusion

In order to improve the performance of cooperative navi-gation system a multirobot collaborative navigation algo-rithm based on odometervision multisource informationfusion is studied On the basis of establishing the multi-source information fusion collaborative navigation systemmodel the centralized collaborative navigation of odometervision fusion the decentralized collaborative navigationframework and the vision-based SLAM are given and thecentralized and decentralized odometervision collaborativenavigation algorithms are derived respectively e effec-tiveness of the proposed algorithm is verified by the sim-ulation experiments which has some theoretical value andapplication value in high performance collaborative navi-gation applications

Data Availability

e data used to support the findings of this study areavailable from the corresponding author upon request

Conflicts of Interest

e authors declare that they have no conflicts of interest

References

[1] K. N. Olivier, D. E. Griffith, G. Eagle et al., "Randomized trial of liposomal amikacin for inhalation in nontuberculous mycobacterial lung disease," American Journal of Respiratory and Critical Care Medicine, vol. 195, no. 6, pp. 814–823, 2017.

[2] M. Schwarz, M. Beul, D. Droeschel et al., "DRC team NimbRo Rescue: perception and control for centaur-like mobile manipulation robot Momaro," Springer Tracts in Advanced Robotics, Springer, Berlin, Germany, pp. 145–190, 2018.

[3] M. Long, H. Su, and B. Liu, "Group controllability of two-time-scale discrete-time multi-agent systems," Journal of the Franklin Institute, vol. 357, no. 6, pp. 3524–3540, 2020.

[4] T. Fukuda, S. Nakagawa, Y. Kawauchi, and M. Buss, "Structure decision method for self organising robots based on cell structures-CEBOT," in Proceedings of the 1989 International Conference on Robotics and Automation, Scottsdale, AZ, USA, May 1989.

[5] H. Asama, A. Matsumoto, and Y. Ishida, "Design of an autonomous and distributed robot system: ACTRESS," in Proceedings of the IEEE/RSJ International Workshop on Intelligent Robots and Systems (IROS '89), September 1989.

[6] J. Zhou, Y. Lv, G. Wen, X. Wu, and M. Cai, "Three-dimensional cooperative guidance law design for simultaneous attack with multiple missiles against a maneuvering target," in Proceedings of the 2018 IEEE CSAA Guidance, Navigation and Control Conference (CGNCC), August 2018.

[7] H. Su, J. Zhang, and Z. Zeng, "Formation-containment control of multi-robot systems under a stochastic sampling mechanism," Science China Technological Sciences, vol. 63, no. 6, pp. 1025–1034, 2020.

[8] H. Park and S. Hutchinson, "A distributed robust convergence algorithm for multi-robot systems in the presence of faulty robots," in Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 2980–2985, IEEE, Hamburg, Germany, September–October 2015.

[9] K. Petersen and R. Nagpal, "Complex design by simple robots: a collective embodied intelligence approach to construction," Architectural Design, vol. 87, no. 4, pp. 44–49, 2017.

[10] L. Chaimowicz, T. Sugar, V. Kumar, and M. F. M. Campos, "An architecture for tightly coupled multi-robot cooperation," in Proceedings of the 2001 IEEE International Conference on Robotics and Automation (ICRA), vol. 3, pp. 2992–2997, IEEE, Seoul, Korea, May 2001.

[11] H.-X. Hu, G. Chen, and G. Wen, "Event-triggered control on quasi-average consensus in the cooperation-competition network," in Proceedings of IECON 2018—44th Annual Conference of the IEEE Industrial Electronics Society, October 2018.

[12] A. Amanatiadis, K. Charalampous, I. Kostavelis et al., "The AVERT project: autonomous vehicle emergency recovery tool," in Proceedings of the 2013 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR), pp. 1–5, IEEE, Linkoping, Sweden, October 2013.

[13] R. Kurazume, S. Hirose, T. Iwasaki, S. Nagata, and N. Sashida, "Study on cooperative positioning system," in Proceedings of the IEEE International Conference on Robotics and Automation, Minneapolis, MN, USA, August 1996.

[14] Z. Fu, Y. Zhao, and G. Wen, "Distributed continuous-time optimization in multi-agent networks with undirected topology," in Proceedings of the 2019 IEEE 15th International Conference on Control and Automation (ICCA), November 2019.

[15] Y. Zhao, Y. Liu, and G. Wen, "Finite-time average estimation for multiple double integrators with unknown bounded inputs," in Proceedings of the 2018 33rd Youth Academic Annual Conference of Chinese Association of Automation (YAC), May 2018.

[16] S. Mao, Mobile Robot Localization in Indoor Environment, Ph.D. dissertation, Zhejiang University, Hangzhou, China, 2016.

[17] J. Yang, "Analysis approach to odometric non-systematic error uncertainty for mobile robots," Chinese Journal of Mechanical Engineering, vol. 44, no. 8, pp. 7–12, 2008.

[18] J. Kang, F. Zhang, and X. Qu, "Angle measuring error analysis of coordinate measuring system of laser radar," vol. 40, no. 6, pp. 834–839, 2016.

Mathematical Problems in Engineering 15

[19] J. Zhang, P. Orlik, Z. Sahinoglu, A. Molisch, and P. Kinney, "UWB systems for wireless sensor networks," Proceedings of the IEEE, vol. 97, no. 2, pp. 313–331, 2009.

[20] D. Kaushal and T. Shanmuganantham, "Design of a compact and novel microstrip patch antenna for multiband satellite applications," Materials Today: Proceedings, vol. 5, no. 10, pp. 21175–21182, 2018.

[21] J. Xiucai, Data Association Problem for Simultaneous Localization and Mapping of Mobile Robots, Ph.D. dissertation, National University of Defense Technology, Changsha, China, 2008.

[22] Z. Yuan, "Research of mobile robot's SLAM based on binocular vision," Master's thesis, Tianjin University of Technology, Tianjin, China, 2016.

[23] F. Bellavia, M. Fanfani, F. Pazzaglia, and C. Colombo, "Robust selective stereo SLAM without loop closure and bundle adjustment," in Proceedings of the International Conference on Image Analysis and Processing, pp. 462–471, Springer, Naples, Italy, 2013.

[24] H. Fourati, Multisensor Data Fusion: From Algorithms and Architectural Design to Applications, CRC Press, Boca Raton, FL, USA, 2015.

[25] S. Jia, X. Yin, and X. Li, "Mobile robot parallel PF-SLAM based on OpenMP," in Proceedings of the 2012 IEEE International Conference on Robotics and Biomimetics (ROBIO), pp. 508–513, IEEE, Guangzhou, China, December 2012.

[26] W. Zhou, E. Shiju, Z. Cao, and Y. Dong, "Review of SLAM data association study," in Proceedings of the 2016 International Conference on Sensor Network and Computer Engineering, Atlantis Press, Shanghai, China, 2016.

[27] R. Singer and R. Sea, "A new filter for optimal tracking in dense multitarget environments," in Proceedings of the Annual Allerton Conference on Circuit and System Theory, pp. 201–211, Monticello, IL, USA, 1972.

[28] J. Neira and J. D. Tardos, "Data association in stochastic mapping using the joint compatibility test," IEEE Transactions on Robotics and Automation, vol. 17, no. 6, pp. 890–897, 2001.

[29] L. Yanju, X. Yufeng, G. Song, H. Xi, and G. Zhengping, "Research on data association in SLAM based laser sensor," Microcomputer & Its Application, vol. 36, no. 2, pp. 78–82, 2017.

[30] O. Hlinka, O. Sluciak, F. Hlawatsch, and M. Rupp, "Distributed data fusion using iterative covariance intersection," in Proceedings of the 2014 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 1861–1865, IEEE, Florence, Italy, May 2014.




[20] D Kaushal and T Shanmuganantham ldquoDesign of a compactand novel microstrip patch antenna for multiband satelliteapplicationsrdquo Materials Today Proceedings vol 5 no 10pp 21 175ndash21 182 2018

[21] J Xiucai Data association problem for simultaneous locali-zation and mapping of mobile robots National University ofDefense Technology Changsha China PhD dissertation2008

[22] Z Yuan ldquoResearch of mobile robotrsquos slam based on binocularvisionrdquo Masterrsquos thesis Tianjin University of TechnologyTianjin China 2016

[23] F Bellavia M Fanfani F Pazzaglia and C Colombo ldquoRobustselective stereo slam without loop closure and bundle ad-justmentrdquo in Proceedings of the International Conference onImage Analysis and Processing pp 462ndash471 Springer NaplesItaly 2013

[24] H Fourati Multisensor Data Fusion From Algorithms andArchitectural Design to Applications CRC Press Boca RatonFL USA 2015

[25] S Jia X Yin and X Li ldquoMobile robot parallel pf-slam basedon openmprdquo in Proceedings of the 2012 IEEE InternationalConference on Robotics and Biomimetics (ROBIO) pp 508ndash513 IEEE Guangzhou China December 2012

[26] W Zhou E Shiju Z Cao and Y Dong ldquoReview of slam dataassociation studyrdquo in Proceedings of the 2016 InternationalConference on Sensor Network and Computer EngineeringAtlantis Press Shanghai China 2016

[27] R Singer and R Sea ldquoA new filter for optimal tracking indense multitarget environmentsrdquo in Proceedings of the An-nual Allerton Conference on Circuit and System Jeorypp 201ndash211 Monticello MN USA 1972

[28] J Neira and J D Tardos ldquoData association in stochasticmapping using the joint compatibility testrdquo IEEE Transactionson Robotics and Automation vol 17 no 6 pp 890ndash897 2001

[29] L Yanju X Yufeng G Song H Xi and G ZhengpingldquoResearch on data association in slam based laser sensorrdquoMicrocomputer amp Its Application vol 36 no 2 pp 78ndash822017

[30] O Hlinka O Sluciak F Hlawatsch and M Rupp ldquoDis-tributed data fusion using iterative covariance intersectionrdquoin Proceedings of the 2014 IEEE International Conference onAcoustics Speech and Signal Processing (ICASSP) pp 1861ndash1865 IEEE Florence Italy May 2014

16 Mathematical Problems in Engineering

Page 14: MultirobotCollaborativeNavigationAlgorithmsBasedon … · 2020. 5. 24. · According to the odometer/vision collaborative navi-gation model, we use the most common EKF algorithms

decentralized collaborative navigation algorithms, we run the CL and DCL algorithms separately under experimental scenario I. The navigation errors of the two collaborative navigation algorithms are compared, as shown in Figure 17. Under experimental scenario II of this subsection, we likewise run the CL algorithm and the DCL algorithm, respectively. The navigation errors of the two collaborative navigation algorithms are compared, as shown in Figure 18.

After 20 experiments, the RMS parameters of collaborative navigation with single-node SLAM information are shown in Table 4.

The RMS parameters of collaborative navigation with fused multinode SLAM information are shown in Table 5.
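The RMS figures reported in Tables 4 and 5 pool the navigation errors over the 20 repeated runs. A minimal sketch of that computation (the error samples below are hypothetical, not the paper's data):

```python
import numpy as np

def rms_error(errors):
    """Root-mean-square of an error sequence pooled over Monte Carlo runs."""
    e = np.asarray(errors, dtype=float)
    return float(np.sqrt(np.mean(e ** 2)))

# Hypothetical per-run position-error samples (m); in the experiments these
# would be the errors collected across all 20 runs of one algorithm.
position_errors = [0.03, -0.02, 0.04, -0.035, 0.025]
print(rms_error(position_errors))
```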

Table 3: Performance comparison of centralized collaborative navigation algorithms under different SLAM data association algorithms.

Algorithm type    Position error (m)    Angle error (deg)    Relative time
NN                2.8323                10.7919              4
JCBB              0.0322                0.1623               12
Optimization      0.5587                2.2476               1

[Figure 17: Comparative diagram of navigation error for fusion of single-node SLAM information under different collaborative navigation algorithms (CL vs. DCL over 0-360 s). (a) Position error (m). (b) Angle error (deg).]

[Figure 18: Comparative diagram of navigation error for fusion of multinode SLAM information under different collaborative navigation algorithms (CL vs. DCL over 0-360 s). (a) Position error (m). (b) Angle error (deg).]

Table 4: Collaborative navigation RMS parameters for fusion of single-node SLAM information.

Algorithm type    Position error (m)    Angle error (deg)
CL                0.0322                0.1623
DCL               0.0669                0.2094

Table 5: Collaborative navigation RMS parameters for fusion of multinode SLAM information.

Algorithm type    Position error (m)    Angle error (deg)
CL                0.0243                0.0524
DCL               0.0438                0.1265

As can be seen from Figures 17 and 18, in conjunction with Tables 4 and 5, in the odometer/vision collaborative navigation system the error of the centralized collaborative navigation algorithm is smaller than that of the distributed collaborative navigation algorithm; after the landmark information collected by a single node or by multiple nodes is fused, there is only a small gap between the two algorithms. In other words, the distributed collaborative navigation algorithm based on the odometer/vision collaborative navigation model can estimate the correlation of the internode information well.
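The covariance cross (intersection) filtering that lets the DCL algorithm fuse estimates whose internode correlation is unknown can be sketched as follows; this is a generic covariance intersection rule (grid-searched weight, hypothetical 2-D state), not the paper's exact filter:

```python
import numpy as np

def covariance_intersection(x1, P1, x2, P2, n_grid=101):
    """Fuse two estimates (x1, P1), (x2, P2) with unknown cross-correlation.

    CI rule: P^-1 = w*P1^-1 + (1-w)*P2^-1, with the weight w chosen on a
    grid to minimize trace(P). Guarantees a consistent fused covariance
    for any actual correlation between the two estimates.
    """
    P1i, P2i = np.linalg.inv(P1), np.linalg.inv(P2)
    best = None
    for w in np.linspace(0.0, 1.0, n_grid):
        info = w * P1i + (1.0 - w) * P2i
        P = np.linalg.inv(info)
        if best is None or np.trace(P) < best[0]:
            x = P @ (w * P1i @ x1 + (1.0 - w) * P2i @ x2)
            best = (np.trace(P), x, P)
    return best[1], best[2]

# Two node estimates with complementary uncertainty (hypothetical values).
xf, Pf = covariance_intersection(np.array([0.0, 0.0]), np.diag([1.0, 4.0]),
                                 np.array([1.0, 1.0]), np.diag([4.0, 1.0]))
```

Because the fused covariance never claims more information than either input alone justifies, CI avoids the overconfidence that double-counting shared odometry history would cause in a naive decentralized fusion.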

Considering the high demands that the centralized collaborative navigation algorithm places on computing power and communication, the application scenarios of the two algorithms can be analyzed in combination with the abovementioned collaborative navigation experiments: the centralized collaborative navigation algorithm is suitable when there are few nodes and the nodes are not equipped with additional aided navigation systems; the decentralized collaborative navigation algorithm is suitable when the number of nodes is large, the amount of shared information is large, and some nodes are equipped with additional aided navigation systems, especially a SLAM-aided navigation system.
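The per-node filtering workload underlying these trade-offs is the EKF time update on the odometry model plus a relative measurement update. A minimal sketch for one node with pose state [x, y, theta] and a UWB-style relative range measurement (noise values are hypothetical, and the neighbor's position is treated as known here for simplicity, whereas the centralized filter stacks all node states jointly):

```python
import numpy as np

def ekf_time_update(x, P, v, w, dt, Q):
    """Propagate a unicycle-model pose [x, y, theta] with odometry (v, w)."""
    px, py, th = x
    x_pred = np.array([px + v * dt * np.cos(th),
                       py + v * dt * np.sin(th),
                       th + w * dt])
    # Jacobian of the motion model with respect to the state.
    F = np.array([[1.0, 0.0, -v * dt * np.sin(th)],
                  [0.0, 1.0,  v * dt * np.cos(th)],
                  [0.0, 0.0, 1.0]])
    return x_pred, F @ P @ F.T + Q

def ekf_relative_range_update(x, P, p_other, z, R):
    """Update with a measured range z to a neighbor at position p_other."""
    dx, dy = x[0] - p_other[0], x[1] - p_other[1]
    r = np.hypot(dx, dy)
    H = np.array([[dx / r, dy / r, 0.0]])   # range Jacobian (1 x 3)
    S = H @ P @ H.T + R                     # innovation covariance (1 x 1)
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain (3 x 1)
    x_new = x + (K @ np.array([z - r]))
    P_new = (np.eye(3) - K @ H) @ P
    return x_new, P_new
```

In the centralized case every such update touches the full joint covariance of all nodes, which is what drives the computing and communication cost discussed above.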

7. Conclusion

In order to improve the performance of cooperative navigation systems, multirobot collaborative navigation algorithms based on odometer/vision multisource information fusion are studied. On the basis of establishing the multisource information fusion collaborative navigation system model, the centralized collaborative navigation framework for odometer/vision fusion, the decentralized collaborative navigation framework, and the vision-based SLAM algorithm are given, and the centralized and decentralized odometer/vision collaborative navigation algorithms are derived, respectively. The effectiveness of the proposed algorithms is verified by simulation experiments, which gives them both theoretical and application value in high-performance collaborative navigation applications.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

References

[1] K. N. Olivier, D. E. Griffith, G. Eagle et al., "Randomized trial of liposomal amikacin for inhalation in nontuberculous mycobacterial lung disease," American Journal of Respiratory and Critical Care Medicine, vol. 195, no. 6, pp. 814-823, 2017.

[2] M. Schwarz, M. Beul, D. Droeschel et al., "DRC team NimbRo Rescue: perception and control for centaur-like mobile manipulation robot Momaro," Springer Tracts in Advanced Robotics, Springer, Berlin, Germany, pp. 145-190, 2018.

[3] M. Long, H. Su, and B. Liu, "Group controllability of two-time-scale discrete-time multi-agent systems," Journal of the Franklin Institute, vol. 357, no. 6, pp. 3524-3540, 2020.

[4] T. Fukuda, S. Nakagawa, Y. Kawauchi, and M. Buss, "Structure decision method for self organising robots based on cell structures-CEBOT," in Proceedings of the 1989 International Conference on Robotics and Automation, Scottsdale, AZ, USA, May 1989.

[5] H. Asama, A. Matsumoto, and Y. Ishida, "Design of an autonomous and distributed robot system: ACTRESS," in Proceedings of the IEEE/RSJ International Workshop on Intelligent Robots and Systems (IROS '89), the Autonomous Mobile Robots and Its Applications, September 1989.

[6] J. Zhou, Y. Lv, G. Wen, X. Wu, and M. Cai, "Three-dimensional cooperative guidance law design for simultaneous attack with multiple missiles against a maneuvering target," in Proceedings of the 2018 IEEE CSAA Guidance, Navigation and Control Conference (CGNCC), August 2018.

[7] H. Su, J. Zhang, and Z. Zeng, "Formation-containment control of multi-robot systems under a stochastic sampling mechanism," Science China Technological Sciences, vol. 63, no. 6, pp. 1025-1034, 2020.

[8] H. Park and S. Hutchinson, "A distributed robust convergence algorithm for multi-robot systems in the presence of faulty robots," in Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 2980-2985, IEEE, Hamburg, Germany, September-October 2015.

[9] K. Petersen and R. Nagpal, "Complex design by simple robots: a collective embodied intelligence approach to construction," Architectural Design, vol. 87, no. 4, pp. 44-49, 2017.

[10] L. Chaimowicz, T. Sugar, V. Kumar, and M. F. M. Campos, "An architecture for tightly coupled multi-robot cooperation," in Proceedings 2001 ICRA, IEEE International Conference on Robotics and Automation (Cat. no. 01CH37164), vol. 3, pp. 2992-2997, IEEE, Seoul, Korea, May 2001.

[11] H.-X. Hu, G. Chen, and G. Wen, "Event-triggered control on quasi-average consensus in the cooperation-competition network," in Proceedings of IECON 2018, 44th Annual Conference of the IEEE Industrial Electronics Society, October 2018.

[12] A. Amanatiadis, K. Charalampous, I. Kostavelis et al., "The AVERT project: autonomous vehicle emergency recovery tool," in Proceedings of the 2013 IEEE International Symposium on Safety, Security and Rescue Robotics (SSRR), pp. 1-5, IEEE, Linkoping, Sweden, October 2013.

[13] R. Kurazume, S. Hirose, T. Iwasaki, S. Nagata, and N. Sashida, "Study on cooperative positioning system," in Proceedings of the IEEE International Conference on Robotics and Automation, Minneapolis, MN, USA, 1996.

[14] Z. Fu, Y. Zhao, and G. Wen, "Distributed continuous-time optimization in multi-agent networks with undirected topology," in Proceedings of the 2019 IEEE 15th International Conference on Control and Automation (ICCA), 2019.

[15] Y. Zhao, Y. Liu, and G. Wen, "Finite-time average estimation for multiple double integrators with unknown bounded inputs," in Proceedings of the 2018 33rd Youth Academic Annual Conference of Chinese Association of Automation (YAC), May 2018.

[16] S. Mao, Mobile Robot Localization in Indoor Environment, Ph.D. dissertation, Zhejiang University, Hangzhou, China, 2016.

[17] J. Yang, "Analysis approach to odometric non-systematic error uncertainty for mobile robots," Chinese Journal of Mechanical Engineering, vol. 44, no. 8, pp. 7-12, 2008.

[18] J. Kang, F. Zhang, and X. Qu, "Angle measuring error analysis of coordinate measuring system of laser radar," vol. 40, no. 6, pp. 834-839, 2016.

[19] J. Zhang, P. Orlik, Z. Sahinoglu, A. Molisch, and P. Kinney, "UWB systems for wireless sensor networks," Proceedings of the IEEE, vol. 97, no. 2, pp. 313-331.

[20] D. Kaushal and T. Shanmuganantham, "Design of a compact and novel microstrip patch antenna for multiband satellite applications," Materials Today: Proceedings, vol. 5, no. 10, pp. 21175-21182, 2018.

[21] J. Xiucai, Data Association Problem for Simultaneous Localization and Mapping of Mobile Robots, Ph.D. dissertation, National University of Defense Technology, Changsha, China, 2008.

[22] Z. Yuan, "Research of mobile robot's SLAM based on binocular vision," Master's thesis, Tianjin University of Technology, Tianjin, China, 2016.

[23] F. Bellavia, M. Fanfani, F. Pazzaglia, and C. Colombo, "Robust selective stereo SLAM without loop closure and bundle adjustment," in Proceedings of the International Conference on Image Analysis and Processing, pp. 462-471, Springer, Naples, Italy, 2013.

[24] H. Fourati, Multisensor Data Fusion: From Algorithms and Architectural Design to Applications, CRC Press, Boca Raton, FL, USA, 2015.

[25] S. Jia, X. Yin, and X. Li, "Mobile robot parallel PF-SLAM based on OpenMP," in Proceedings of the 2012 IEEE International Conference on Robotics and Biomimetics (ROBIO), pp. 508-513, IEEE, Guangzhou, China, December 2012.

[26] W. Zhou, E. Shiju, Z. Cao, and Y. Dong, "Review of SLAM data association study," in Proceedings of the 2016 International Conference on Sensor Network and Computer Engineering, Atlantis Press, Shanghai, China, 2016.

[27] R. Singer and R. Sea, "A new filter for optimal tracking in dense multitarget environments," in Proceedings of the Annual Allerton Conference on Circuit and System Theory, pp. 201-211, Monticello, IL, USA, 1972.

[28] J. Neira and J. D. Tardos, "Data association in stochastic mapping using the joint compatibility test," IEEE Transactions on Robotics and Automation, vol. 17, no. 6, pp. 890-897, 2001.

[29] L. Yanju, X. Yufeng, G. Song, H. Xi, and G. Zhengping, "Research on data association in SLAM based laser sensor," Microcomputer & Its Application, vol. 36, no. 2, pp. 78-82, 2017.

[30] O. Hlinka, O. Sluciak, F. Hlawatsch, and M. Rupp, "Distributed data fusion using iterative covariance intersection," in Proceedings of the 2014 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 1861-1865, IEEE, Florence, Italy, May 2014.

16 Mathematical Problems in Engineering
