

Research Article
Unmanned Aerial Vehicle Navigation Using Wide-Field Optical Flow and Inertial Sensors

Matthew B. Rhudy,1 Yu Gu,2 Haiyang Chao,3 and Jason N. Gross4

1 Division of Engineering, Pennsylvania State University, Reading, PA 19610, USA
2 Department of Mechanical and Aerospace Engineering and Lane Department of Computer Science and Electrical Engineering at WVU, Morgantown, WV 26506, USA
3 Aerospace Engineering Department, University of Kansas, Lawrence, KS 66045, USA
4 Department of Mechanical and Aerospace Engineering at West Virginia University, Morgantown, WV 26506, USA

Correspondence should be addressed to Matthew B. Rhudy; matthewrhudy@gmail.com

Received 16 January 2015; Revised 12 March 2015; Accepted 30 March 2015

Academic Editor: Liwei Shi

Copyright © 2015 Matthew B. Rhudy et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Journal of Robotics, Volume 2015, Article ID 251379, 12 pages. http://dx.doi.org/10.1155/2015/251379

This paper offers a set of novel navigation techniques that rely on the use of inertial sensors and wide-field optical flow information. The aircraft ground velocity and attitude states are estimated with an Unscented Information Filter (UIF) and are evaluated with respect to two sets of experimental flight data collected from an Unmanned Aerial Vehicle (UAV). Two different formulations are proposed: a full state formulation including velocity and attitude, and a simplified formulation which assumes that the lateral and vertical velocity of the aircraft are negligible. An additional state is also considered within each formulation to recover the image distance, which can be measured using a laser rangefinder. The results demonstrate that the full state formulation is able to estimate the aircraft ground velocity to within 1.3 m/s of a GPS receiver solution used as reference "truth" and regulate attitude angles within 1.4 degrees standard deviation of error for both sets of flight data.

1. Introduction

Information about the velocity and attitude of an aircraft is important for purposes such as remote sensing [1], navigation, and control [2]. Traditional low-cost aircraft navigation relies on the use of both inertial sensors and the Global Positioning System (GPS) [3–5]. While GPS can provide useful information to an aircraft system, this information is not always available or reliable in certain situations, such as flying in urban environments or other GPS-denied areas (e.g., under radio-frequency jamming or strong solar storms). GPS is not self-contained within the aircraft system; rather, the information comes from external satellites. Insects, such as the honeybee, have demonstrated impressive capabilities in flight navigation without receiving external communications [6]. One significant information source that is used by insects as well as birds is vision [6–8]. This information can also be made available to an aircraft through the use of onboard video cameras. The challenge with this information-rich data is correctly processing and integrating the vision data with the other onboard sensor measurements [9].

Vision data can be processed using feature detection algorithms such as the Scale-Invariant Feature Transform (SIFT) [10] to obtain optical flow vectors, as well as other techniques. Optical flow is useful for aircraft systems because it is rich in navigation information, simple to represent, and easy to compute [11]. One of the benefits of this information is that it can be used to extract velocity information about the aircraft, which in turn can be used for aircraft positioning. This optical flow information has been used for autonomous navigation applications such as relative heading and lateral position estimation of a quadrotor helicopter [12, 13]. Other work has considered the use of optical flow for UAV take-off and landing [14] and landmark navigation [15]. Another potential benefit of optical flow is that it implicitly contains information about the aircraft attitude angles. This implicit information has been used in related work for UAV attitude estimation using horizon detection and optical flow along the horizon line [16, 17], and pose estimation for a hexacopter [18], a lunar rover [19], and spacecraft [20]. While this work is useful, these vehicles have significantly different dynamic characteristics from a typical airplane. Due to this, more analysis of the application of optical flow for airplane applications is necessary.

This work presents a combined velocity and attitude estimation algorithm using wide-field optical flow for airplanes that does not require horizon detection, which is useful because the horizon does not need to be visible in the image frame in order to obtain attitude information. The algorithm relies on the optical flow computed using a downward facing video camera, measurements from a laser range finder and an Inertial Measurement Unit (IMU) that are mounted in parallel to the camera axis, and a flat ground assumption to determine information about the aircraft velocity and attitude. Many of the existing experiments for optical flow and inertial sensor fusion are done using helicopter platforms and focus on position and velocity estimation [21, 22]. This work considers an airplane system rather than a helicopter, which has a significantly different flight envelope and dynamics. Additionally, the regulation of attitude information through the use of optical flow is considered, which is not typically done in existing applications. This work takes advantage of all detected optical flow points in the image plane, including wide-field optical flow points, which were often omitted in previous works [23–25]. These wide-field optical flow points are of significant importance for attitude estimation, since they contain roll and pitch information that is not observable from the image center. Although this work considers the use of a laser range finder to recover the distance between the image scene and the camera, it is possible to determine this information using other techniques [26]. In fact, it has been demonstrated that the scale is an observable mode for the vision and IMU data fusion problem [27]. The presented formulation was originally offered in its early stages of development in [28]. Since this original publication, the implementation and tuning of the formulation have been refined and additional results have been generated. In particular, a simplified formulation is offered which reduces the filter states, and the inclusion of a range state is considered. The main contribution of this paper is the analysis of a stable vision-aided solution for the velocity and attitude determination without the use of GPS. This solution is verified with respect to two sets of actual UAV flight testing data.

The rest of this paper is organized as follows. Section 2 presents the different considered formulations and framework for this problem. Section 3 describes the experimental setup which was used to collect data for this study. The results are offered in Section 4, followed by a conclusion in Section 5.

2. Problem Formulation

2.1. Optical Flow Equations. Optical flow is the projection of 3D relative motion into a 2D image plane. Using the pinhole camera model, the 3D position $(\eta_x, \eta_y, \eta_z)$ in the 3D camera body frame can be mapped into the 2D image plane with coordinates $(\mu, \nu)$ using

$$\mu = f\,\frac{\eta_x}{\eta_z}, \qquad \nu = f\,\frac{\eta_y}{\eta_z}, \quad (1)$$

where $\mu$, $\nu$, and $f$ are given in pixels and $f$ is the focal length. For a downward looking camera that is parallel to the aircraft $z$-axis, and with a level and flat ground assumption, the optical flow equations have been derived [29]:

$$\begin{bmatrix} \dot{\mu} \\ \dot{\nu} \end{bmatrix} = \frac{f + \nu \tan\phi - \mu \tan\theta \cos\phi}{\eta_z} \begin{bmatrix} -u + \dfrac{\mu w}{f} \\[4pt] -v + \dfrac{\nu w}{f} \end{bmatrix} - \begin{bmatrix} fq - r\nu - \dfrac{p\mu\nu}{f} + \dfrac{q\mu^2}{f} \\[4pt] -fp + r\mu - \dfrac{p\nu^2}{f} + \dfrac{q\mu\nu}{f} \end{bmatrix}, \quad (2)$$

where $\phi$, $\theta$ are the roll and pitch angles; $p$, $q$, $r$ are the roll, pitch, and yaw body-axis angular rates; $u$, $v$, $w$ are the body-axis ground velocity components of the aircraft; and $\dot{\mu}$, $\dot{\nu}$ are the components of optical flow in the 2D image plane, given in pixels/sec. This equation captures the relationship between optical flow at various parts of the image plane and other pieces of navigation information. By considering only the area close to the image center ($\mu \approx 0$, $\nu \approx 0$), the narrow-field optical flow model can be simplified [23–25]; however, this removes the roll and pitch dependence of the equation and is therefore not desirable for attitude estimation purposes.

2.2. State Space Formulation and Stochastic Modeling. This work considers the simultaneous estimation of body-axis ground velocity components $(u, v, w)$ and Euler attitude angles $(\phi, \theta, \psi)$. This estimation is performed through the fusion of Inertial Measurement Unit (IMU) measurements of body-axis accelerations $(a_x, a_y, a_z)$ and angular rates $(p, q, r)$, laser rangefinder range measurements ($L$), and $n$ sets of optical flow measurements $(\dot{\mu}, \dot{\nu})_i$, where $i = 1, 2, \ldots, n$. The value of $n$ varies with each time step based on how many features in the frame can be used for optical flow calculation. Using these values, the state space model of the system is formulated with the following state vector $\mathbf{x}$, bias state vector $\mathbf{b}$, input vector $\mathbf{u}$, optical flow input vectors $\mathbf{d}_i$, and output vectors $\mathbf{z}_i$:

$$\begin{aligned}
\mathbf{x} &= \left[u\;\; v\;\; w\;\; \phi\;\; \theta\;\; \psi\;\; \mathbf{b}^T\right]^T, \\
\mathbf{b} &= \left[b_{a_x}\;\; b_{a_y}\;\; b_{a_z}\;\; b_p\;\; b_q\;\; b_r\;\; b_L\right]^T, \\
\mathbf{u} &= \left[a_x\;\; a_y\;\; a_z\;\; p\;\; q\;\; r\;\; L\right]^T, \\
\mathbf{d}_i &= \left[\mu\;\; \nu\right]^T_i, \\
\mathbf{z}_i &= \left[\dot{\mu}\;\; \dot{\nu}\right]^T_i.
\end{aligned} \quad (3)$$


Figure 1: Diagram of the range coordinate, showing the body coordinates $(x', y', z')$, the NED coordinates, and the range coordinate $L$ relative to the down axis $z$.

A diagram describing the definition of the range coordinate $L$ is provided in Figure 1. Note that the range coordinate $L$ is equivalent to the camera $z$ coordinate $\eta_z$.

In order to determine the dynamics of the velocity states, the time derivative of the velocity vector observed from the fixed navigation frame is equal to the time rate of change as observed from the moving body axis frame plus the change caused by rotation of the frame [30]:

$$\frac{d}{dt}\begin{bmatrix} u \\ v \\ w \end{bmatrix} = \begin{bmatrix} \dot{u} \\ \dot{v} \\ \dot{w} \end{bmatrix} + \begin{bmatrix} p \\ q \\ r \end{bmatrix} \times \begin{bmatrix} u \\ v \\ w \end{bmatrix}. \quad (4)$$

The IMU measures the acceleration with respect to the fixed gravity vector, as in

$$\begin{bmatrix} a_x \\ a_y \\ a_z \end{bmatrix} = \frac{d}{dt}\begin{bmatrix} u \\ v \\ w \end{bmatrix} + C_n^b \begin{bmatrix} 0 \\ 0 \\ -g \end{bmatrix}, \quad (5)$$

where $C_n^b$ is the rotation matrix from the navigation frame to the body frame:

$$C_n^b = \begin{bmatrix} \cos\theta\cos\psi & \cos\theta\sin\psi & -\sin\theta \\ -\cos\phi\sin\psi + \sin\phi\sin\theta\cos\psi & \cos\phi\cos\psi + \sin\phi\sin\theta\sin\psi & \sin\phi\cos\theta \\ \sin\phi\sin\psi + \cos\phi\sin\theta\cos\psi & -\sin\phi\cos\psi + \cos\phi\sin\theta\sin\psi & \cos\phi\cos\theta \end{bmatrix}. \quad (6)$$
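As a small illustrative sketch (not from the paper; the function name is our own), (6) can be evaluated directly:

```python
import numpy as np

def C_n_to_b(phi, theta, psi):
    """Navigation-to-body rotation matrix (6) for the 3-2-1 Euler sequence."""
    cph, sph = np.cos(phi), np.sin(phi)
    cth, sth = np.cos(theta), np.sin(theta)
    cps, sps = np.cos(psi), np.sin(psi)
    return np.array([
        [cth * cps,                    cth * sps,                    -sth],
        [-cph * sps + sph * sth * cps, cph * cps + sph * sth * sps,  sph * cth],
        [sph * sps + cph * sth * cps,  -sph * cps + cph * sth * sps, cph * cth],
    ])
```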

Combining these results gives the dynamics for the velocity states [31]:

$$\begin{bmatrix} \dot{u} \\ \dot{v} \\ \dot{w} \end{bmatrix} = \begin{bmatrix} rv - qw + a_x - g\sin\theta \\ pw - ru + a_y + g\sin\phi\cos\theta \\ qu - pv + a_z + g\cos\phi\cos\theta \end{bmatrix}. \quad (7)$$

The dynamics of the attitude states are defined using [32]:

$$\begin{bmatrix} \dot{\phi} \\ \dot{\theta} \\ \dot{\psi} \end{bmatrix} = \begin{bmatrix} 1 & \sin\phi\tan\theta & \cos\phi\tan\theta \\ 0 & \cos\phi & -\sin\phi \\ 0 & \sin\phi\sec\theta & \cos\phi\sec\theta \end{bmatrix} \begin{bmatrix} p \\ q \\ r \end{bmatrix}. \quad (8)$$

To define the dynamics for the bias parameters, a first-order Gauss-Markov noise model was used. In a related work [33], the Allan deviation [34] approach presented in [35, 36] was used to determine the parameters of the first-order Gauss-Markov noise model for the dynamics of the bias on each IMU channel. The Gauss-Markov noise model for each sensor measurement involves two parameters: a time constant and a variance of the wide-band sensor noise. Using this model, the dynamics for the bias parameters are given by

$$\mathbf{b}_k = \mathbf{b}_{k-1} e^{-T_s/\tau} + \mathbf{n}_{k-1}, \quad (9)$$

where $\tau$ is a vector of time constants and $\mathbf{n}$ is a zero-mean noise vector with variance given by a diagonal matrix of the variance terms for each sensor. The time constant and variance terms were calculated in [33] for each channel of the same IMU that was considered for this study.
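For illustration, a minimal sketch of propagating (9) for a single bias channel is given below; the time constant and noise standard deviation are placeholders, not the values identified in [33].

```python
import numpy as np

def propagate_bias(b_prev, tau, sigma_n, Ts, rng):
    """One discrete step of the first-order Gauss-Markov bias model (9)."""
    return b_prev * np.exp(-Ts / tau) + rng.normal(0.0, sigma_n)

rng = np.random.default_rng(0)
Ts = 1.0 / 50.0                 # 50 Hz IMU rate, as in the experimental setup
tau, sigma_n = 100.0, 1e-3      # placeholder parameters (not those from [33])
b, history = 0.0, []
for _ in range(5000):
    b = propagate_bias(b, tau, sigma_n, Ts, rng)
    history.append(b)
```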

The state dynamic equations have been defined in continuous-time using the following format:

$$\dot{\mathbf{x}} = \mathbf{f}_c(\mathbf{x}, \mathbf{u}), \quad (10)$$

where $\mathbf{f}_c$ is the nonlinear continuous-time state transition function. In order to implement these equations in a discrete-time filter, a first-order discretization is used [37]:

$$\mathbf{x}_k = \mathbf{x}_{k-1} + T_s\,\mathbf{f}_c(\mathbf{x}_{k-1}, \mathbf{u}_{k-1}) \triangleq \mathbf{f}(\mathbf{x}_{k-1}, \mathbf{u}_{k-1}), \quad (11)$$

where $k$ is the discrete time index, $\mathbf{f}$ is the nonlinear discrete-time state transition function, and $T_s$ is the sampling time of the system.
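The propagation step can be illustrated with a short sketch of (7), (8), and (11) for the velocity and attitude states (a simplification under our own assumptions: bias compensation and the range input are omitted, and the state ordering is our own).

```python
import numpy as np

G = 9.81  # assumed gravitational acceleration (m/s^2)

def f_c(x, u_in):
    """Continuous-time dynamics (7)-(8) with x = [u, v, w, phi, theta, psi]
    and u_in = [ax, ay, az, p, q, r]."""
    u, v, w, phi, theta, psi = x
    ax, ay, az, p, q, r = u_in
    return np.array([
        r * v - q * w + ax - G * np.sin(theta),
        p * w - r * u + ay + G * np.sin(phi) * np.cos(theta),
        q * u - p * v + az + G * np.cos(phi) * np.cos(theta),
        p + q * np.sin(phi) * np.tan(theta) + r * np.cos(phi) * np.tan(theta),
        q * np.cos(phi) - r * np.sin(phi),
        q * np.sin(phi) / np.cos(theta) + r * np.cos(phi) / np.cos(theta),
    ])

def f_discrete(x, u_in, Ts):
    """First-order (Euler) discretization of the dynamics, as in (11)."""
    return x + Ts * f_c(x, u_in)
```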

To formulate the observation equations, optical flow information is utilized. In particular, each optical flow point identified from vision data consists of four values: $\mu$, $\nu$, $\dot{\mu}$, $\dot{\nu}$. These values are obtained using a point matching method [38] and the Scale-Invariant Feature Transform (SIFT) algorithm [10]. Note that the method for optical flow generation is not the emphasis of this research [38]; therefore, any other optical flow algorithm can be used similarly within the proposed estimator without any loss of generality.
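As one possible realization of this front end (a sketch under our own assumptions, not the implementation of [38]), SIFT features can be matched between consecutive frames with OpenCV and differenced to produce optical flow measurements:

```python
import cv2

def sift_optical_flow(frame_prev, frame_curr, fps, ratio=0.7):
    """Match SIFT keypoints between consecutive 8-bit grayscale frames and
    return (mu, nu, mu_dot, nu_dot) per match, with image coordinates taken
    relative to the image center; flow rates are in pixels/sec."""
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(frame_prev, None)
    kp2, des2 = sift.detectAndCompute(frame_curr, None)
    matcher = cv2.BFMatcher()
    h, w = frame_prev.shape[:2]
    cx, cy = w / 2.0, h / 2.0
    flows = []
    for m, n in matcher.knnMatch(des1, des2, k=2):
        if m.distance < ratio * n.distance:  # Lowe's ratio test
            x1, y1 = kp1[m.queryIdx].pt
            x2, y2 = kp2[m.trainIdx].pt
            flows.append((x1 - cx, y1 - cy, (x2 - x1) * fps, (y2 - y1) * fps))
    return flows
```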

During the state estimation process, the image plane coordinates $(\mu, \nu)$ are taken as inputs to the observation equation, allowing the optical flow $(\dot{\mu}, \dot{\nu})$ to be predicted at that point in the image plane using (2), where $\eta_z$ is provided by the laser rangefinder measurement $L$. These computed observables are then compared with the optical flow measurements of $(\dot{\mu}, \dot{\nu})$ from the video in order to determine how to update the states. Since multiple optical flow points can be identified within a single time step, this creates a set of $n_k$ observation equations, where $n_k$ is the number of optical flow points at time step $k$.

Since (7) and (8) are derived from kinematics, the only uncertainty that must be modeled is due to the input measurements. Therefore, the input vector is given by

$$\mathbf{u}_{k-1} = \tilde{\mathbf{u}}_{k-1} + \mathbf{b}_{k-1}, \quad (12)$$

where $\tilde{\mathbf{u}}$ is the measured input vector and $\mathbf{b}$ is the vector of sensor biases, which follow a first-order Gauss-Markov noise model as determined in [33].

The uncertainty in the measurements is due to the errors in the optical flow estimation from the video. It is assumed that each optical flow measurement $\mathbf{y}_i$ has an additive measurement noise vector $\mathbf{v}_i$, with corresponding covariance matrix $\mathbf{R}_i$. For this study, it is also assumed that each optical flow measurement carries equal uncertainty and that errors along the two component directions of the image plane also have equal uncertainty and are uncorrelated; that is,

$$\mathbf{R}_i = \mathbf{R} = R\,\mathbf{I}, \quad (13)$$

where $R$ is the scalar uncertainty of the optical flow measurements and $\mathbf{I}$ is a $2 \times 2$ identity matrix.

2.3. Simplified Formulation. The motion of a typical airplane is mostly in the forward direction; that is, the speed of the aircraft is primarily contained in the component $u$, while $v$ and $w$ are small. With this idea, assuming that $v$ and $w$ are zero, the formulation is simplified to the following state vector $\mathbf{x}$, bias state vector $\mathbf{b}$, input vector $\mathbf{u}$, optical flow input vectors $\mathbf{d}_i$, and output vectors $\mathbf{z}_i$:

$$\begin{aligned}
\mathbf{x} &= \left[u\;\; \phi\;\; \theta\;\; \mathbf{b}^T\right]^T, \\
\mathbf{b} &= \left[b_{a_x}\;\; b_p\;\; b_q\;\; b_r\;\; b_L\right]^T, \\
\mathbf{u} &= \left[a_x\;\; p\;\; q\;\; r\;\; L\right]^T, \\
\mathbf{d}_i &= \left[\mu\;\; \nu\right]^T_i, \\
\mathbf{z}_i &= \dot{\mu}_i.
\end{aligned} \quad (14)$$

Note that this simplified formulation removes the $v$ and $w$ states, which removes the need for $y$-axis and $z$-axis acceleration measurements. Since the yaw state is not contained in any of the state or observation equations, it has also been removed. Due to the assumption that $v$ and $w$ are zero, only the $x$-direction of optical flow is relevant. With these simplifications, the state dynamics become

$$\begin{aligned}
\dot{u} &= a_x - g\sin\theta, \\
\dot{\phi} &= p + q\sin\phi\tan\theta + r\cos\phi\tan\theta, \\
\dot{\theta} &= q\cos\phi - r\sin\phi.
\end{aligned} \quad (15)$$

The dynamics of the bias states remain the same as in the full formulation, except that the corresponding bias states for $a_y$ and $a_z$ have been removed. The observation equations from (2) are simplified to be

$$\dot{\mu} = \frac{-u\left(f + \nu\tan\phi - \mu\tan\theta\cos\phi\right)}{\eta_z} - fq + r\nu + \frac{p\mu\nu}{f} - \frac{q\mu^2}{f}. \quad (16)$$
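A one-function sketch of (16), consistent with the earlier sketch of (2):

```python
import numpy as np

def predict_mu_dot_simplified(mu, nu, f, eta_z, u, phi, theta, p, q, r):
    """Simplified observation (16): forward flow only, assuming v = w = 0."""
    scale = (f + nu * np.tan(phi) - mu * np.tan(theta) * np.cos(phi)) / eta_z
    return -u * scale - f * q + r * nu + p * mu * nu / f - q * mu**2 / f
```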

The advantage of considering this simplified formulation is primarily to reduce the computational complexity of the system. The processing of vision data, which leads to a relatively large number of measurement updates, can significantly drive up the computation time of the system, particularly at higher sampling rates. This simplified formulation not only reduces the computation time through a reduction of states, but also significantly reduces the processing and update time for optical flow measurements, since only the forward component of flow is used. This formulation could be more practical than the full state formulation for real-time implementation, especially on systems which are limited in onboard computational power due, for example, to cost or size constraints.

2.4. Inclusion of a Range State. It is possible to include a state to estimate the range in order to recover the scale of the optical flow images. To determine the dynamics of the range state, the flat ground assumption is used. With this assumption, consider the projection of the range vector onto the Earth-fixed $z$-axis, that is, "down" as shown in Figure 1, by taking the projection through both the roll and pitch angles of the aircraft:

$$z = -L\cos\phi\cos\theta. \quad (17)$$

Here, the negative sign is used because the $L$ coordinate is always positive, while the $z$ coordinate will be negative when the aircraft is above the ground (due to the "down" convention). Taking the derivative with respect to time yields

$$\dot{z} = -\dot{L}\cos\phi\cos\theta + \dot{\phi}L\sin\phi\cos\theta + \dot{\theta}L\cos\phi\sin\theta. \quad (18)$$

Compare this $z$-velocity equation with that obtained from rotating the aircraft body velocity components into the Earth-fixed frame:

$$\dot{z} = -u\sin\theta + v\sin\phi\cos\theta + w\cos\phi\cos\theta. \quad (19)$$

Equating these two expressions for $z$-velocity gives

$$-\dot{L}\cos\phi\cos\theta + \dot{\phi}L\sin\phi\cos\theta + \dot{\theta}L\cos\phi\sin\theta = -u\sin\theta + v\sin\phi\cos\theta + w\cos\phi\cos\theta. \quad (20)$$


Simplifying this relationship leads to

$$\dot{L} = \dot{\phi}L\tan\phi + \dot{\theta}L\tan\theta + u\sec\phi\tan\theta - v\tan\phi - w. \quad (21)$$

Substituting in the dynamics for the roll and pitch angles and simplifying leads to the following expression for the range state dynamics:

$$\dot{L} = u\sec\phi\tan\theta - v\tan\phi - w + L\left[p\tan\phi + q\sec\phi\tan\theta\right]. \quad (22)$$

Note that for level conditions, that is, when the roll and pitch angles are zero, the equation reduces to

$$\dot{L} = -w, \quad (23)$$

which agrees with physical intuition. In order to implement the range state in the simplified formulation, the following expression can be used:

$$\dot{L} = u\sec\phi\tan\theta + L\left[p\tan\phi + q\sec\phi\tan\theta\right]. \quad (24)$$
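A one-function sketch of the range dynamics, covering both (22) and, with $v = w = 0$, the simplified version (24):

```python
import numpy as np

def range_rate(L, u, v, w, phi, theta, p, q):
    """Range-state dynamics (22); pass v = w = 0 for the simplified form (24)."""
    sec_phi = 1.0 / np.cos(phi)
    return (u * sec_phi * np.tan(theta) - v * np.tan(phi) - w
            + L * (p * np.tan(phi) + q * sec_phi * np.tan(theta)))
```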

2.5. Information Fusion Algorithm. Due to the nonlinearity, the nonadditive noise, and the number of optical flow measurements (ranging from 0 to 300 per frame, with a mean of 250), the Unscented Information Filter (UIF) [39–41] was selected for the implementation of this algorithm [42]. The advantage of the information filtering framework over Kalman filtering is that redundant information vectors are additive [39–41]; therefore, the time-varying number of outputs obtained from optical flow can easily be handled with relatively low computation, since the coupling between the errors in different optical flow measurements is neglected. The UIF algorithm is summarized as follows [41].

Consider a discrete time nonlinear dynamic system of the form

$$\mathbf{x}_k = \mathbf{f}(\mathbf{x}_{k-1}, \mathbf{u}_{k-1}) + \mathbf{w}_{k-1}, \quad (25)$$

with measurement equations of the form

$$\mathbf{z}_k = \mathbf{h}(\mathbf{x}_k, \mathbf{d}_k) + \mathbf{v}_k, \quad (26)$$

where $\mathbf{h}$ is the observation function and $\mathbf{w}$ and $\mathbf{v}$ are the zero-mean Gaussian process and measurement noise vectors. At each time step, sigma-points are generated from the prior distribution using

$$\boldsymbol{\chi}_{k-1} = \left[\hat{\mathbf{x}}_{k-1},\;\; \hat{\mathbf{x}}_{k-1} + \sqrt{N + \lambda}\sqrt{\mathbf{P}_{k-1}},\;\; \hat{\mathbf{x}}_{k-1} - \sqrt{N + \lambda}\sqrt{\mathbf{P}_{k-1}}\right], \quad (27)$$

where $N$ is the total number of states and $\lambda$ is a scaling parameter [42]. Now the sigma-points are predicted using

$$\boldsymbol{\chi}^{(i)}_{k|k-1} = \mathbf{f}\left(\boldsymbol{\chi}^{(i)}_{k-1}, \mathbf{u}_{k-1}\right), \quad i = 0, 1, \ldots, 2N, \quad (28)$$

where $(i)$ denotes the $i$th column of a matrix. The a priori statistics are then recovered:

$$\begin{aligned}
\hat{\mathbf{x}}_{k|k-1} &= \sum_{i=0}^{2N} \eta^m_i \boldsymbol{\chi}^{(i)}_{k|k-1}, \\
\mathbf{P}_{k|k-1} &= \mathbf{Q}_{k-1} + \sum_{i=0}^{2N} \eta^c_i \left(\boldsymbol{\chi}^{(i)}_{k|k-1} - \hat{\mathbf{x}}_{k|k-1}\right)\left(\boldsymbol{\chi}^{(i)}_{k|k-1} - \hat{\mathbf{x}}_{k|k-1}\right)^T,
\end{aligned} \quad (29)$$

where $\mathbf{Q}$ is the process noise covariance matrix and $\eta^m_i$ and $\eta^c_i$ are weight vectors [42]. Using these predicted values, the information vector $\mathbf{y}$ and matrix $\mathbf{Y}$ are determined:

$$\mathbf{y}_{k|k-1} = \mathbf{P}^{-1}_{k|k-1}\hat{\mathbf{x}}_{k|k-1}, \qquad \mathbf{Y}_{k|k-1} = \mathbf{P}^{-1}_{k|k-1}. \quad (30)$$

For each measurement, that is, each optical flow pair, the output equations are evaluated for each sigma-point, as in

$$\boldsymbol{\psi}^{(i,j)}_{k|k-1} = \mathbf{h}\left(\boldsymbol{\chi}^{(i)}_{k|k-1}, \mathbf{d}^{(j)}_k\right), \quad i = 0, 1, \ldots, 2N;\; j = 1, \ldots, n_k, \quad (31)$$

where $\boldsymbol{\psi}$ denotes an output sigma-point and the superscript $(i, j)$ denotes the $i$th sigma-point and the $j$th measurement. The computed observation is then recovered using

$$\hat{\mathbf{z}}^{(j)}_{k|k-1} = \sum_{i=0}^{2N} \eta^m_i \boldsymbol{\psi}^{(i,j)}_{k|k-1}, \quad j = 1, \ldots, n_k. \quad (32)$$

Using the computed observation, the cross-covariance is calculated:

$$\mathbf{P}^{xy(j)}_{k|k-1} = \sum_{i=0}^{2N} \eta^c_i \left(\boldsymbol{\chi}^{(i)}_{k|k-1} - \hat{\mathbf{x}}_{k|k-1}\right)\left(\boldsymbol{\psi}^{(i,j)}_{k|k-1} - \hat{\mathbf{z}}^{(j)}_{k|k-1}\right)^T. \quad (33)$$

Then, the observation sensitivity matrix $\mathbf{H}$ is determined:

$$\mathbf{H}^{(j)}_k = \left[\mathbf{P}^{-1}_{k|k-1}\mathbf{P}^{xy(j)}_{k|k-1}\right]^T, \quad j = 1, \ldots, n_k. \quad (34)$$

The information contributions can then be calculated:

$$\begin{aligned}
\hat{\mathbf{y}}_k &= \mathbf{y}_{k|k-1} + \sum_{j=1}^{n_k} \left(\mathbf{H}^{(j)}_k\right)^T \mathbf{R}^{-1}_k \left[\mathbf{z}^{(j)}_k - \hat{\mathbf{z}}^{(j)}_{k|k-1} + \mathbf{H}^{(j)}_k \hat{\mathbf{x}}_{k|k-1}\right], \\
\hat{\mathbf{Y}}_k &= \mathbf{Y}_{k|k-1} + \sum_{j=1}^{n_k} \left(\mathbf{H}^{(j)}_k\right)^T \mathbf{R}^{-1}_k \mathbf{H}^{(j)}_k.
\end{aligned} \quad (35)$$
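To make the flow of (27)–(35) concrete, the following is a compact sketch of one UIF predict/update cycle. It is illustrative only: the sigma-point weights use a basic unscented scheme with a single scaling parameter (see [42] for the weights actually used), the matrix square root is taken as a Cholesky factor, and `f_discrete` and `h` are the state transition and observation functions (for example, built from the earlier sketches of (11) and (2)).

```python
import numpy as np

def uif_step(x, P, u_in, flows, f_discrete, h, Q, R_scalar, Ts, lam=0.0):
    """One UIF predict/update cycle following (27)-(35).

    flows is a list of (d_j, z_j) pairs: the image coordinates and the
    measured optical flow for each of the n_k points at this time step.
    """
    N = x.size
    # Sigma points, (27)
    S = np.linalg.cholesky((N + lam) * P)
    chi = np.column_stack([x, x[:, None] + S, x[:, None] - S])
    wm = np.full(2 * N + 1, 1.0 / (2.0 * (N + lam)))
    wm[0] = lam / (N + lam)
    wc = wm.copy()
    # Prediction, (28)-(29)
    chi_pred = np.column_stack([f_discrete(chi[:, i], u_in, Ts)
                                for i in range(2 * N + 1)])
    x_pred = chi_pred @ wm
    dX = chi_pred - x_pred[:, None]
    P_pred = Q + (wc * dX) @ dX.T
    # Information form, (30)
    P_inv = np.linalg.inv(P_pred)
    y_info = P_inv @ x_pred
    Y_info = P_inv.copy()
    # Additive information contributions, one per optical flow point, (31)-(35)
    R_inv = np.linalg.inv(R_scalar * np.eye(2))
    for d_j, z_j in flows:
        psi = np.column_stack([h(chi_pred[:, i], d_j)
                               for i in range(2 * N + 1)])      # (31)
        z_pred = psi @ wm                                       # (32)
        Pxy = (wc * dX) @ (psi - z_pred[:, None]).T             # (33)
        H = (P_inv @ Pxy).T                                     # (34)
        y_info += H.T @ R_inv @ (z_j - z_pred + H @ x_pred)     # (35)
        Y_info += H.T @ R_inv @ H                               # (35)
    # Recover the posterior state estimate and covariance
    P_post = np.linalg.inv(Y_info)
    return P_post @ y_info, P_post
```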

3. Experimental Setup

The research platform used for this study is the West Virginia University (WVU) "Red Phastball" UAV, shown in Figure 2, with a custom GPS/INS data logger mounted inside the aircraft [28, 43]. Some details for this aircraft are provided in Table 1.

Figure 2: Picture of the WVU "Red Phastball" aircraft.

Table 1: Details for the WVU Phastball aircraft.

    Property             Value
    Length               2.2 m
    Wingspan             2.4 m
    Takeoff mass         11 kg
    Payload mass         3 kg
    Propulsion system    Dual 90 mm ducted fan motors
    Static thrust        60 N
    Fuel capacity        Two 5 Ah LiPo batteries
    Cruise speed         30 m/s
    Mission duration     6 min

The IMU used in this study is an Analog Devices ADIS-16405 MEMS-based IMU, which includes triaxial accelerometers and rate gyroscopes. Each suite of sensors on the IMU is acquired at 18-bit resolution at 50 Hz, over ranges of ±18 g's and ±150 deg/s, respectively. The GPS receiver used in the data logger is a Novatel OEM-V1, which was configured to provide Cartesian position and velocity measurements and solution standard deviations at a rate of 50 Hz, with 1.5 m RMS horizontal position accuracy and 0.03 m/s RMS velocity accuracy. An Optic-Logic RS400 laser range finder was used for range measurement, with an approximate accuracy of 1 m and a range of 366 m, pointing downward. In addition, a high-quality Goodrich mechanical vertical gyroscope is mounted onboard the UAV to provide pitch and roll measurements to be used as sensor fusion "truth" data, with reported accuracy of within 0.25° of true vertical. The vertical gyroscope measurements were acquired at 16-bit resolution with measurement ranges of ±80 deg for roll and ±60 deg for pitch.

A GoPro Hero video camera is mounted at the center of gravity of the UAV for flight video collection, pointing downwards. The camera was previously calibrated to a focal length of 1141 pixels [29]. Two different sets of flight data were used for this study, each using different camera settings. The first flight used a pixel size of 1920 × 1080 and a sampling rate of 29.97 Hz. The second flight used a pixel size of 1280 × 720 and a sampling rate of 59.94 Hz. All the other sensor data were collected at 50 Hz and resampled to the camera time for postflight validation after manual synchronization.

Figure 3: Flight trajectories for Flight 1 (a) and Flight 2 (b). © 2014 Google.

4. Experimental Results

4.1. Flight Data. Two sets of flight data from the WVU "Red Phastball" aircraft were used in this study. Each flight consists of approximately 5 minutes of flight. The top-down flight trajectories from these two data sets are overlaid on a Google Earth image of the flight test location in Figure 3. Six unique markers have been placed in Figure 3 in order to identify specific points along the trajectory. These markers will be used in later figures in order to synchronize the presentation of data.

4.2. Selection of Noise Assumptions for Optical Flow Measurements. Since the noise properties of the IMU have been established from previous work [33], only the characteristics of the uncertainty in the laser range and optical flow measurements need to be determined. The uncertainty in the laser range finder measurement is modeled as 1 m zero-mean Gaussian noise, based on the manufacturer's reported accuracy of the sensor. The optical flow errors are more difficult to model. Due to this difficulty, different assumptions of the optical flow uncertainty were considered. Using both sets of the flight data, the full state UIF was executed for each assumption of optical flow uncertainty. To evaluate the performance of the filter, the speed measurements were compared with reference measurements from GPS, which have been mapped into the aircraft frame using roll and pitch measurements from the vertical gyroscope and approximating the yaw from the heading as determined by GPS. The roll and pitch estimates were compared with the measurements from the vertical gyroscope. Due to the possibility of alignment errors, only the standard deviation of error was considered. Each of these errors was calculated for each set of flight data, and the results are offered in Figure 4.

Figure 4 shows how changing the assumption on the optical flow uncertainty affects the estimation performance of the total ground speed, roll angle, and pitch angle. The relatively flat region in Figure 4, for assumed optical flow standard deviations from approximately 3 to 9 pixels, indicates that this formulation is relatively insensitive to tuning of these optical flow errors. It is also interesting to note in Figure 4 that Flight 1 and Flight 2 have optimum performance at different values of $R$. This, however, makes sense, as Flight 2 has twice the frame rate of Flight 1; therefore, the assumed noise characteristics should be one half those of Flight 1. From Figure 4, the optical flow uncertainties were selected to be $R = 5^2$ pixels² for Flight 1 and $R = 2.5^2$ pixels² for Flight 2.

Figure 4: Comparison of errors for different assumed optical flow uncertainties (standard deviation of error in $V$, $\phi$, and $\theta$ for Flights 1 and 2, plotted against the assumed $\sigma_{OF}$ in pixels).

4.3. Full State Formulation Estimation Results. Using each set of flight data, the full state formulation using the UIF was executed. The estimated components of velocity are shown for Flight 1 in Figure 5 and for Flight 2 in Figure 6. These estimates from the UIF are offered with respect to comparable reference values from GPS, which were mapped into the aircraft frame using roll and pitch measurements from the vertical gyroscope and approximating the yaw angle with the heading angle obtained from GPS. From each of these figures, the following observations can be made. The forward velocity $u$ is reasonably captured by the estimation. The lateral velocity $v$ and vertical velocity $w$, however, demonstrate somewhat poor results. This does, however, make sense, as the primary direction of flight is forward, thus resulting in good observability characteristics in the optical flow in the forward direction, while the signal-to-noise ratio (SNR) for the lateral and vertical directions remains small for most typical flight conditions. However, since these lateral and vertical components are only a small portion of the total velocity, the total speed can be reasonably approximated by this technique. The total speed estimates are shown in Figure 7 for Flight 1 with the GPS reference.

Figure 5: Estimated velocity components $u$, $v$, and $w$ for Flight 1 (UIF versus GPS reference).

Figure 6: Estimated velocity components $u$, $v$, and $w$ for Flight 2 (UIF versus GPS reference).

Figure 7: Estimated total ground speed for Flight 1 (UIF versus GPS reference).

Figure 8: Roll and pitch estimation results for Flight 2 (UIF versus vertical gyroscope, VG).

The attitude estimates for the roll and pitch angles are compared with the vertical gyroscope measurements as a reference, as shown in Figure 8. In order to demonstrate the effectiveness of this method in regulating the drift in attitude estimates that occurs with dead reckoning, the estimation errors from the UIF are compared with the errors obtained from dead reckoning attitude estimation. These roll and pitch errors are offered in Figure 9 for Flight 2. Figure 9 demonstrates the effectiveness of the UIF in regulating the attitude errors from dead reckoning.

In order to quantify the estimation results, the mean absolute error and standard deviation of error of the estimates are calculated for the velocity components with respect to the GPS reference, and also for the roll and pitch angles with respect to the vertical gyroscope reference. These statistical results are provided in Table 2 for Flight 1 and Table 3 for Flight 2, where $V$ is the total speed as determined by

$$V = \sqrt{u^2 + v^2 + w^2}. \quad (36)$$

Figure 9: Roll and pitch estimation errors as compared to dead reckoning (DR) for Flight 2.

Table 2: Flight 1 error statistics for estimated states.

    Estimated state    Mean abs.    Standard deviation    Units
    u                  1.0644       1.2667                m/s
    v                  2.4699       2.7719                m/s
    w                  2.1554       1.6554                m/s
    V                  1.2466       1.2858                m/s
    φ                  1.1553       1.3668                deg
    θ                  2.1752       1.3339                deg

Table 3: Flight 2 error statistics for estimated states.

    Estimated state    Mean abs.    Standard deviation    Units
    u                  1.1299       1.2663                m/s
    v                  1.5184       1.8649                m/s
    w                  1.1324       1.3453                m/s
    V                  1.2087       1.2535                m/s
    φ                  1.8818       1.2782                deg
    θ                  1.6646       1.1049                deg
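These statistics can be computed from time-aligned estimate and reference arrays with a small sketch (the array handling is our own):

```python
import numpy as np

def error_stats(estimate, reference):
    """Mean absolute error and standard deviation of error, as in Tables 2-9."""
    err = np.asarray(estimate) - np.asarray(reference)
    return np.mean(np.abs(err)), np.std(err)

def total_speed(u, v, w):
    """Total speed from body-axis velocity components, per (36)."""
    return np.sqrt(u**2 + v**2 + w**2)
```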

It is shown in Tables 2 and 3 that reasonable errors are obtained in both sets of flight data for the velocity and attitude of the aircraft. Larger errors are noted in particular for the lateral velocity state $v$, which is due to observability issues in the optical flow. Note that mean errors in the roll and pitch estimation could be due to misalignment between the vertical gyroscope, IMU, and video camera. The attitude estimation accuracy reported in Tables 2 and 3 is similar to the reported accuracy of loosely coupled GPS/INS attitude estimation using similar flight data [43].

4.4. Simplified Formulation Estimation Results. Since it was observed in the full state formulation results that the lateral and vertical estimates were small, the simplified formulation was implemented in order to investigate the feasibility of a simplified version of the filter that estimates only the forward velocity component and assumes the lateral and vertical components are zero. The forward velocity $u$ for Flight 1 is offered in Figure 10, while the roll and pitch errors with respect to the vertical gyroscope measurement are offered in Figure 11 for the UIF and dead reckoning (DR). Additionally, the mean absolute error and standard deviation of error for these terms are provided in Table 4 for Flight 1 and Table 5 for Flight 2.

Figure 10: Simplified formulation speed estimation results for Flight 1 (UIF versus GPS reference).

Figure 11: Simplified formulation attitude estimates with dead reckoning (DR) for Flight 2.

Figure 12: Comparison of attitude estimation errors $\Delta\phi$ and $\Delta\theta$ with the lateral and vertical velocity magnitude $\sqrt{v^2 + w^2}$.

Table 4: Simplified formulation error statistics for Flight 1.

    Estimated state    Mean abs.    Standard deviation    Units
    u                  1.2283       1.5730                m/s
    φ                  1.9901       2.2922                deg
    θ                  1.9002       2.1881                deg

Table 5: Simplified formulation error statistics for Flight 2.

    Estimated state    Mean abs.    Standard deviation    Units
    u                  1.3073       1.5775                m/s
    φ                  2.8402       3.5426                deg
    θ                  2.0286       2.4691                deg

It is shown in Tables 4 and 5 that the simplified formulation results in significantly higher attitude estimation errors with respect to the full state formulation. These increased attitude errors are likely due to the assumption that the lateral and vertical velocity components are zero. To investigate this possible correlation, the roll and pitch errors are shown in Figure 12 alongside the magnitude of the lateral and vertical velocity as determined from GPS, for a 50-second segment of flight data which includes takeoff. Figure 12 shows that there is some correlation between the attitude estimation errors and the lateral and vertical velocity, though it is not the only source of error for these estimates.

4.5. Results Using Range State. The results for each flight, for both the full state formulation and the simplified formulation, were recalculated with the addition of the range state. The statistical results for these tests are offered in Tables 6–9.


Table 6: Flight 1 error statistics for estimated states, with range state.

    Estimated state    Mean abs.    Standard deviation    Units
    u                  1.1330       1.3699                m/s
    v                  2.4387       2.7369                m/s
    w                  2.1210       1.5998                m/s
    V                  1.3041       1.3852                m/s
    φ                  1.1599       1.4084                deg
    θ                  2.1767       1.4064                deg

Table 7: Flight 2 error statistics for estimated states, with range state.

    Estimated state    Mean abs.    Standard deviation    Units
    u                  1.1444       1.2937                m/s
    v                  1.5122       1.8529                m/s
    w                  1.1154       1.3268                m/s
    V                  1.2112       1.2761                m/s
    φ                  1.8897       1.2845                deg
    θ                  1.6609       1.1152                deg

Table 8: Simplified formulation error statistics for Flight 1, with range state.

    Estimated state    Mean abs.    Standard deviation    Units
    u                  1.2919       1.6256                m/s
    φ                  2.0621       2.3746                deg
    θ                  1.9277       2.2713                deg

Table 9: Simplified formulation error statistics for Flight 2, with range state.

    Estimated state    Mean abs.    Standard deviation    Units
    u                  1.3356       1.6016                m/s
    φ                  2.8748       3.5462                deg
    θ                  2.0303       2.4876                deg

In order to compare the results from the different cases, the standard deviation of error is shown graphically for Flight 1 in Figure 13 and for Flight 2 in Figure 14. It is shown in Figures 13 and 14 that the simplified formulation offers poorer estimation performance, as expected, particularly for the attitude estimates. The addition of the range state does not affect the performance significantly.

5. Conclusions

This paper presented vision-aided inertial navigation techniques which do not rely upon GPS, using UAV flight data. Two different formulations were presented: a full state estimation formulation which captures the aircraft ground velocity vector and attitude, and a simplified formulation which assumes all of the aircraft velocity is in the forward direction. Both formulations were shown to be effective in regulating the INS drift. Additionally, a state was included in each formulation in order to estimate the distance between the image center and the aircraft. The full state formulation was shown to be effective in estimating the aircraft ground velocity to within 1.3 m/s and regulating attitude angles within 1.4 degrees standard deviation of error for both sets of flight data.

Figure 13: Graphical comparison of standard deviation of error for Flight 1 (full, full with range, simplified, and simplified with range formulations; σ in m/s for $u$ and in deg for roll and pitch).

Figure 14: Graphical comparison of standard deviation of error for Flight 2 (same cases as Figure 13).

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.


Acknowledgments

This work was supported in part by NASA Grant no. NNX12AM56A, a Kansas NASA EPSCoR PDG grant, and an SRI grant.

References

[1] C. Li, L. Shen, H.-B. Wang, and T. Lei, "The research on unmanned aerial vehicle remote sensing and its applications," in Proceedings of the IEEE International Conference on Advanced Computer Control (ICACC '10), pp. 644–647, Shenyang, China, March 2010.
[2] A. J. Calise and R. T. Rysdyk, "Nonlinear adaptive flight control using neural networks," IEEE Control Systems Magazine, vol. 18, no. 6, pp. 14–24, 1998.
[3] M. S. Grewal, L. R. Weill, and A. P. Andrews, Global Positioning, Inertial Navigation & Integration, Wiley, New York, NY, USA, 2nd edition, 2007.
[4] J. L. Crassidis, "Sigma-point Kalman filtering for integrated GPS and inertial navigation," in Proceedings of the AIAA Guidance, Navigation, and Control Conference, pp. 1981–2004, San Francisco, Calif, USA, August 2005.
[5] J. N. Gross, Y. Gu, M. B. Rhudy, S. Gururajan, and M. R. Napolitano, "Flight-test evaluation of sensor fusion algorithms for attitude estimation," IEEE Transactions on Aerospace and Electronic Systems, vol. 48, no. 3, pp. 2128–2139, 2012.
[6] M. V. Srinivasan, "Honeybees as a model for the study of visually guided flight, navigation, and biologically inspired robotics," Physiological Reviews, vol. 91, no. 2, pp. 413–460, 2011.
[7] P. S. Bhagavatula, C. Claudianos, M. R. Ibbotson, and M. V. Srinivasan, "Optic flow cues guide flight in birds," Current Biology, vol. 21, no. 21, pp. 1794–1799, 2011.
[8] N. Franceschini, "Visual guidance based on optic flow: a biorobotic approach," Journal of Physiology-Paris, vol. 98, no. 1–3, pp. 281–292, 2004.
[9] P. Corke, J. Lobo, and J. Dias, "An introduction to inertial and visual sensing," The International Journal of Robotics Research, vol. 26, no. 6, pp. 519–535, 2007.
[10] D. G. Lowe, "Distinctive image features from scale-invariant keypoints," International Journal of Computer Vision, vol. 60, no. 2, pp. 91–110, 2004.
[11] M. V. Srinivasan, S. Thurrowgood, and D. Soccol, "Competent vision and navigation systems," IEEE Robotics & Automation Magazine, vol. 16, no. 3, pp. 59–71, 2009.
[12] J. Conroy, G. Gremillion, B. Ranganathan, and J. S. Humbert, "Implementation of wide-field integration of optic flow for autonomous quadrotor navigation," Autonomous Robots, vol. 27, no. 3, pp. 189–198, 2009.
[13] F. Kendoul, I. Fantoni, and K. Nonami, "Optic flow-based vision system for autonomous 3D localization and control of small aerial vehicles," Robotics and Autonomous Systems, vol. 57, no. 6-7, pp. 591–602, 2009.
[14] A. Beyeler, J.-C. Zufferey, and D. Floreano, "optiPilot: control of take-off and landing using optic flow," in Proceedings of the European Micro Air Vehicle Conference and Competition (EMAV '09), Delft, The Netherlands, September 2009.
[15] M. M. Veth and J. Raquet, "Fusion of low-cost imaging and inertial sensors for navigation," in Proceedings of the 19th International Technical Meeting of the Satellite Division, Institute of Navigation (ION GNSS '06), pp. 1093–1103, Fort Worth, Tex, USA, September 2006.
[16] D. Dusha, W. Boles, and R. Walker, "Fixed-wing attitude estimation using computer vision based horizon detection," in Proceedings of the International Unmanned Air Vehicle Systems Conference, April 2007.
[17] D. Dusha, W. Boles, and R. Walker, "Attitude estimation for a fixed-wing aircraft using horizon detection and optical flow," in Proceedings of the 9th Biennial Conference of the Australian Pattern Recognition Society on Digital Image Computing Techniques and Applications (DICTA '07), pp. 485–492, IEEE, Glenelg, Australia, December 2007.
[18] S. Weiss, M. W. Achtelik, S. Lynen, M. Chli, and R. Siegwart, "Real-time onboard visual-inertial state estimation and self-calibration of MAVs in unknown environments," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '12), May 2012.
[19] M. Dille, B. Grocholsky, and S. Singh, "Outdoor downward-facing optical flow odometry with commodity sensors," in Field and Service Robotics: Results of the 7th International Conference, vol. 62 of Springer Tracts in Advanced Robotics, pp. 183–193, 2010.
[20] S. I. Roumeliotis, A. E. Johnson, and J. F. Montgomery, "Augmenting inertial navigation with image-based motion estimation," in Proceedings of the IEEE International Conference on Robotics & Automation, pp. 4326–4333, Washington, DC, USA, May 2002.
[21] M. Achtelik, M. Achtelik, S. Weiss, and R. Siegwart, "Onboard IMU and monocular vision based control for MAVs in unknown in- and outdoor environments," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '11), pp. 3056–3063, Shanghai, China, May 2011.
[22] P.-J. Bristeau, F. Callou, D. Vissiere, and N. Petit, "The navigation and control technology inside the AR.Drone micro UAV," in Proceedings of the 18th IFAC World Congress, pp. 1477–1484, Milano, Italy, September 2011.
[23] H. Chao, Y. Gu, and M. Napolitano, "A survey of optical flow techniques for robotics navigation applications," Journal of Intelligent and Robotic Systems: Theory and Applications, vol. 73, no. 1–4, pp. 361–372, 2014.
[24] W. Ding, J. Wang, S. Han et al., "Adding optical flow into the GPS/INS integration for UAV navigation," in Proceedings of the International Global Navigation Satellite Systems Society IGNSS Symposium, Surfers Paradise, Australia, 2009.
[25] H. Romero, S. Salazar, and R. Lozano, "Real-time stabilization of an eight-rotor UAV using optical flow," IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, vol. 42, no. 6, pp. 1752–1762, 2012.
[26] V. Grabe, H. H. Bülthoff, and P. R. Giordano, "A comparison of scale estimation schemes for a quadrotor UAV based on optical flow and IMU measurements," in Proceedings of the 26th IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '13), pp. 5193–5200, IEEE, Tokyo, Japan, November 2013.
[27] A. Martinelli, "Vision and IMU data fusion: closed-form solutions for attitude, speed, absolute scale, and bias determination," IEEE Transactions on Robotics, vol. 28, no. 1, pp. 44–60, 2012.
[28] M. B. Rhudy, H. Chao, and Y. Gu, "Wide-field optical flow aided inertial navigation for unmanned aerial vehicles," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '14), pp. 674–679, Chicago, Ill, USA, September 2014.
[29] H. Chao, Y. Gu, J. Gross, G. Guo, M. L. Fravolini, and M. R. Napolitano, "A comparative study of optical flow and traditional sensors in UAV navigation," in Proceedings of the 1st American Control Conference (ACC '13), pp. 3858–3863, Washington, DC, USA, June 2013.
[30] R. C. Hibbeler, Engineering Mechanics: Dynamics, Prentice Hall, 10th edition, 2004.
[31] V. Klein and E. A. Morelli, Aircraft System Identification: Theory and Practice, American Institute of Aeronautics and Astronautics, Reston, Va, USA, 2006.
[32] J. Roskam, Airplane Flight Dynamics and Automatic Flight Controls, DARcorporation, Lawrence, Kan, USA, 2003.
[33] J. N. Gross, Y. Gu, M. Rhudy, F. J. Barchesky, and M. R. Napolitano, "On-line modeling and calibration of low-cost navigation sensors," in Proceedings of the AIAA Modeling and Simulation Technologies Conference, pp. 298–311, Portland, Ore, USA, August 2011.
[34] D. W. Allan, "Statistics of atomic frequency standards," Proceedings of the IEEE, vol. 54, no. 2, pp. 221–230, 1966.
[35] X. Zhiqiang and D. Gebre-Egziabher, "Modeling and bounding low cost inertial sensor errors," in Proceedings of the IEEE/ION Position, Location and Navigation Symposium (PLANS '08), pp. 1122–1132, IEEE, Monterey, Calif, USA, May 2008.
[36] Z. Xing, Over-bounding integrated INS/GNSS output errors [Ph.D. dissertation], The University of Minnesota, Minneapolis, Minn, USA, 2010.
[37] F. L. Lewis and V. L. Syrmos, Optimal Control, Wiley, New York, NY, USA, 2nd edition, 1995.
[38] M. Mammarella, G. Campa, M. L. Fravolini, Y. Gu, B. Seanor, and M. R. Napolitano, "A comparison of optical flow algorithms for real time aircraft guidance and navigation," in Proceedings of the AIAA Guidance, Navigation and Control Conference and Exhibit, Honolulu, Hawaii, USA, August 2008.
[39] A. G. O. Mutambara, Decentralized Estimation and Control for Multisensor Systems, CRC Press, Washington, DC, USA, 1998.
[40] T. Vercauteren and X. Wang, "Decentralized sigma-point information filters for target tracking in collaborative sensor networks," IEEE Transactions on Signal Processing, vol. 53, no. 8, part 2, pp. 2997–3009, 2005.
[41] D.-J. Lee, "Unscented information filtering for distributed estimation and multiple sensor fusion," in Proceedings of the AIAA Guidance, Navigation and Control Conference and Exhibit, Honolulu, Hawaii, USA, 2008.
[42] M. Rhudy and Y. Gu, "Understanding nonlinear Kalman filters," in Interactive Robotics Letters, West Virginia University, 2013, http://www2.statler.wvu.edu/~irl/page13.html.
[43] M. Rhudy, J. Gross, Y. Gu, and M. R. Napolitano, "Fusion of GPS and redundant IMU data for attitude estimation," in Proceedings of the AIAA Guidance, Navigation, and Control Conference, Minneapolis, Minn, USA, August 2012.


Page 2: Research Article Unmanned Aerial Vehicle Navigation Using Wide …downloads.hindawi.com/journals/jr/2015/251379.pdf · 2019-07-31 · Research Article Unmanned Aerial Vehicle Navigation

2 Journal of Robotics

along the horizon line [16 17] and pose estimation fora hexacopter [18] a lunar rover [19] and spacecraft [20]While this work is useful these vehicles contain significantlydifferent dynamic characteristics than a typical airplane Dueto this more analysis of the application of optical flow forairplane applications is necessary

This work presents a combined velocity and attitude esti-mation algorithm using wide-field optical flow for airplanesthat does not require horizon detection which is usefulbecause the horizon does not need to be visible in the imageframe in order to obtain attitude information The algorithmrelies on the optical flow computed using a downward facingvideo camera measurements from a laser range finder andan Inertial Measurement Unit (IMU) that are mounted inparallel to the camera axis and a flat ground assumptionto determine information about the aircraft velocity andattitude Many of the existing experiments for optical flowand inertial sensor fusion are done using helicopter platformsand focus on position and velocity estimation [21 22] Thiswork considers an airplane system rather than a helicopterwhich contains a significantly different flight envelope anddynamics Additionally the regulation of attitude informa-tion through the use of optical flow is considered which isnot typically done in existing applications This work takesadvantage of all detected optical flow points in the imageplane including wide-field optical flow points which wereoften omitted in previous works [23ndash25] These wide-fieldoptical flow points are of significant importance for attitudeestimation since they contain roll and pitch information thatis not observable from the image center Although this workconsiders the use of a laser range finder to recover the distancebetween the image scene and the camera it is possible todetermine this information using other techniques [26] Infact it has been demonstrated that the scale is an observablemode for the vision and IMU data fusion problem [27]The presented formulation was originally offered in its earlystages of development in [28] Since this original publicationthe implementation and tuning of the formulation havebeen refined and additional results have been generatedIn particular a simplified formulation is offered whichreduces the filter states and the inclusion of a range stateis considered The main contribution of this paper is theanalysis of a stable vision-aided solution for the velocity andattitude determination without the use of GPS This solutionis verified with respect to two sets of actual UAV flight testingdata

The rest of this paper is organized as follows. Section 2 presents the different considered formulations and framework for this problem. Section 3 describes the experimental setup which was used to collect data for this study. The results are offered in Section 4, followed by a conclusion in Section 5.

2. Problem Formulation

2.1. Optical Flow Equations. Optical flow is the projection of 3D relative motion into a 2D image plane. Using the pinhole camera model, the 3D position $(\eta_x, \eta_y, \eta_z)$ in the 3D camera body frame can be mapped into the 2D image plane with coordinates $(\mu, \nu)$ using

$$\mu = f\,\frac{\eta_x}{\eta_z}, \qquad \nu = f\,\frac{\eta_y}{\eta_z}, \tag{1}$$

where $\mu$, $\nu$, and $f$ are given in pixels, and $f$ is the focal length. For a downward-looking camera that is parallel to the aircraft $z$-axis, and with a level and flat ground assumption, the optical flow equations have been derived [29]:

$$\begin{bmatrix} \dot{\mu} \\ \dot{\nu} \end{bmatrix} = \frac{f + \nu\tan\phi - \mu\tan\theta\cos\phi}{\eta_z} \begin{bmatrix} -u + \dfrac{\mu w}{f} \\[4pt] -v + \dfrac{\nu w}{f} \end{bmatrix} - \begin{bmatrix} fq - r\nu - \dfrac{p\mu\nu}{f} + \dfrac{q\mu^2}{f} \\[4pt] -fp + r\mu - \dfrac{p\nu^2}{f} + \dfrac{q\mu\nu}{f} \end{bmatrix}, \tag{2}$$

where $\phi$, $\theta$ are the roll and pitch angles; $p$, $q$, $r$ are the roll, pitch, and yaw body-axis angular rates; $u$, $v$, $w$ are the body-axis ground velocity components of the aircraft; and $\dot{\mu}$, $\dot{\nu}$ are the components of optical flow in the 2D image plane, given in pixels/sec. This equation captures the relationship between optical flow at various parts of the image plane and other pieces of navigation information. By considering only the area close to the image center ($\mu \approx 0$, $\nu \approx 0$), the narrow-field optical flow model can be simplified [23–25]; however, this removes the roll and pitch dependence of the equation and is therefore not desirable for attitude estimation purposes.
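To make the structure of (2) concrete, the following is a minimal Python sketch of the flow prediction at a single image point; the function name and argument ordering are illustrative and not part of the original implementation.

```python
import numpy as np

def predict_optical_flow(mu, nu, u, v, w, phi, theta, p, q, r, eta_z, f):
    """Predict the optical flow (mu_dot, nu_dot) at image location (mu, nu)
    from the aircraft states, per the wide-field model in (2)."""
    # Translational term: scaled by the range eta_z along the camera axis
    scale = (f + nu * np.tan(phi) - mu * np.tan(theta) * np.cos(phi)) / eta_z
    translational = scale * np.array([-u + mu * w / f,
                                      -v + nu * w / f])
    # Rotational term: independent of range
    rotational = np.array([f * q - r * nu - p * mu * nu / f + q * mu**2 / f,
                           -f * p + r * mu - p * nu**2 / f + q * mu * nu / f])
    return translational - rotational
```

Note that the attitude angles $\phi$ and $\theta$ only enter through terms multiplied by $\mu$ and $\nu$, which is why points far from the image center carry the attitude information.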

2.2. State Space Formulation and Stochastic Modeling. This work considers the simultaneous estimation of body-axis ground velocity components $(u, v, w)$ and Euler attitude angles $(\phi, \theta, \psi)$. This estimation is performed through the fusion of Inertial Measurement Unit (IMU) measurements of body-axis accelerations $(a_x, a_y, a_z)$ and angular rates $(p, q, r)$, laser rangefinder range measurements ($L$), and $n$ sets of optical flow measurements $(\dot{\mu}, \dot{\nu})_i$, where $i = 1, 2, \ldots, n$. The value of $n$ varies with each time step based on how many features in the frame can be used for optical flow calculation. Using these values, the state space model of the system is formulated with the following state vector $\mathbf{x}$, bias state vector $\mathbf{b}$, input vector $\mathbf{u}$, optical flow input vectors $\mathbf{d}_i$, and output vectors $\mathbf{z}_i$:

$$\begin{aligned} \mathbf{x} &= \left[u\ v\ w\ \phi\ \theta\ \psi\ \mathbf{b}^T\right]^T, \\ \mathbf{b} &= \left[b_{a_x}\ b_{a_y}\ b_{a_z}\ b_p\ b_q\ b_r\ b_L\right]^T, \\ \mathbf{u} &= \left[a_x\ a_y\ a_z\ p\ q\ r\ L\right]^T, \\ \mathbf{d}_i &= \left[\mu\ \nu\right]_i^T, \\ \mathbf{z}_i &= \left[\dot{\mu}\ \dot{\nu}\right]_i^T. \end{aligned} \tag{3}$$


[Figure 1: Diagram of the range coordinate, showing the body coordinates ($x'$, $y'$, $z'$), the NED coordinates, and the range coordinate $L$ relative to the down axis $z$.]

A diagram describing the definition of the range coordinate $L$ is provided in Figure 1. Note that the range coordinate $L$ is equivalent to the camera $z$ coordinate $\eta_z$.

In order to determine the dynamics of the velocity states, the time derivative of the velocity vector observed from the fixed navigation frame is equal to the time rate of change as observed from the moving body-axis frame plus the change caused by rotation of the frame [30]:

$$\frac{d}{dt}\!\left(\begin{bmatrix} u \\ v \\ w \end{bmatrix}\right) = \begin{bmatrix} \dot{u} \\ \dot{v} \\ \dot{w} \end{bmatrix} + \begin{bmatrix} p \\ q \\ r \end{bmatrix} \times \begin{bmatrix} u \\ v \\ w \end{bmatrix}. \tag{4}$$

The IMU measures the acceleration with respect to the fixed gravity vector, as in

$$\begin{bmatrix} a_x \\ a_y \\ a_z \end{bmatrix} = \frac{d}{dt}\!\left(\begin{bmatrix} u \\ v \\ w \end{bmatrix}\right) + C_n^b \begin{bmatrix} 0 \\ 0 \\ -g \end{bmatrix}, \tag{5}$$

where $C_n^b$ is the rotation matrix from the navigation frame to the body frame:

$$C_n^b = \begin{bmatrix} \cos\theta\cos\psi & \cos\theta\sin\psi & -\sin\theta \\ -\cos\phi\sin\psi + \sin\phi\sin\theta\cos\psi & \cos\phi\cos\psi + \sin\phi\sin\theta\sin\psi & \sin\phi\cos\theta \\ \sin\phi\sin\psi + \cos\phi\sin\theta\cos\psi & -\sin\phi\cos\psi + \cos\phi\sin\theta\sin\psi & \cos\phi\cos\theta \end{bmatrix}. \tag{6}$$
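For quick reference, (6) can be transcribed directly; this sketch is illustrative only.

```python
import numpy as np

def C_n_b(phi, theta, psi):
    """Rotation matrix from the navigation (NED) frame to the body frame, per (6)."""
    cph, sph = np.cos(phi), np.sin(phi)
    cth, sth = np.cos(theta), np.sin(theta)
    cps, sps = np.cos(psi), np.sin(psi)
    return np.array([
        [cth * cps,                    cth * sps,                    -sth],
        [-cph * sps + sph * sth * cps, cph * cps + sph * sth * sps,  sph * cth],
        [sph * sps + cph * sth * cps,  -sph * cps + cph * sth * sps, cph * cth],
    ])
```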

Combining these results gives the dynamics for the velocity states [31]:

$$\begin{bmatrix} \dot{u} \\ \dot{v} \\ \dot{w} \end{bmatrix} = \begin{bmatrix} rv - qw + a_x - g\sin\theta \\ pw - ru + a_y + g\sin\phi\cos\theta \\ qu - pv + a_z + g\cos\phi\cos\theta \end{bmatrix}. \tag{7}$$

The dynamics of the attitude states are defined using [32]:

$$\begin{bmatrix} \dot{\phi} \\ \dot{\theta} \\ \dot{\psi} \end{bmatrix} = \begin{bmatrix} 1 & \sin\phi\tan\theta & \cos\phi\tan\theta \\ 0 & \cos\phi & -\sin\phi \\ 0 & \sin\phi\sec\theta & \cos\phi\sec\theta \end{bmatrix} \begin{bmatrix} p \\ q \\ r \end{bmatrix}. \tag{8}$$

To define the dynamics for the bias parameters, a first-order Gauss-Markov noise model was used. In a related work [33], the Allan deviation [34] approach presented in [35, 36] was used to determine the parameters of the first-order Gauss-Markov noise model for the dynamics of the bias on each IMU channel. The Gauss-Markov noise model for each sensor measurement involves two parameters: a time constant and a variance of the wide-band sensor noise. Using this model, the dynamics for the bias parameters are given by

$$\mathbf{b}_k = \mathbf{b}_{k-1} e^{-T_s/\boldsymbol{\tau}} + \mathbf{n}_{k-1}, \tag{9}$$

where $\boldsymbol{\tau}$ is a vector of time constants and $\mathbf{n}$ is a zero-mean noise vector with variance given by a diagonal matrix of the variance terms for each sensor. The time constant and variance terms were calculated in [33] for each channel of the same IMU that was considered for this study.
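As an illustration, a single propagation step of (9) might be sketched as follows; the parameter names are placeholders for the per-channel values identified in [33].

```python
import numpy as np

def propagate_bias(b, tau, sigma_n, Ts, rng=None):
    """First-order Gauss-Markov propagation of the bias vector, per (9).
    tau holds one time constant per channel; sigma_n holds the wide-band
    noise standard deviation per channel (placeholder values; see [33])."""
    rng = np.random.default_rng() if rng is None else rng
    n = rng.normal(0.0, sigma_n)          # zero-mean driving noise n_{k-1}
    return b * np.exp(-Ts / tau) + n      # b_k = b_{k-1} e^{-T_s/tau} + n_{k-1}
```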

The state dynamic equations have been defined in continuous time using the following format:

$$\dot{\mathbf{x}} = \mathbf{f}_c(\mathbf{x}, \mathbf{u}), \tag{10}$$

where $\mathbf{f}_c$ is the nonlinear continuous-time state transition function. In order to implement these equations in a discrete-time filter, a first-order discretization is used [37]:

$$\mathbf{x}_k = \mathbf{x}_{k-1} + T_s \mathbf{f}_c(\mathbf{x}_{k-1}, \mathbf{u}_{k-1}) \triangleq \mathbf{f}(\mathbf{x}_{k-1}, \mathbf{u}_{k-1}), \tag{11}$$

where $k$ is the discrete time index, $\mathbf{f}$ is the nonlinear discrete-time state transition function, and $T_s$ is the sampling time of the system.

To formulate the observation equations, optical flow information is utilized. In particular, each optical flow point identified from vision data consists of four values: $\mu$, $\nu$, $\dot{\mu}$, $\dot{\nu}$. These values are obtained using a point matching method [38] and the Scale-Invariant Feature Transform (SIFT) algorithm [10]. Note that the method for optical flow generation is not the emphasis of this research [38]; therefore, any other optical flow algorithm can be used similarly within the proposed estimator without any loss of generality.
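Putting (7), (8), and (11) together, the state propagation can be sketched as below; for brevity the bias states are omitted, and the names are illustrative rather than the authors' implementation.

```python
import numpy as np

def f_c(x, u_in, g=9.81):
    """Continuous-time dynamics f_c for [u, v, w, phi, theta, psi] given
    bias-corrected inputs [ax, ay, az, p, q, r], per (7) and (8)."""
    u, v, w, phi, theta, psi = x
    ax, ay, az, p, q, r = u_in
    return np.array([
        r * v - q * w + ax - g * np.sin(theta),                                  # (7)
        p * w - r * u + ay + g * np.sin(phi) * np.cos(theta),
        q * u - p * v + az + g * np.cos(phi) * np.cos(theta),
        p + q * np.sin(phi) * np.tan(theta) + r * np.cos(phi) * np.tan(theta),   # (8)
        q * np.cos(phi) - r * np.sin(phi),
        (q * np.sin(phi) + r * np.cos(phi)) / np.cos(theta),
    ])

def f_discrete(x, u_in, Ts):
    """First-order (Euler) discretization, per (11)."""
    return x + Ts * f_c(x, u_in)
```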

During the state estimation process, the image plane coordinates $(\mu, \nu)$ are taken as inputs to the observation equation, allowing the optical flow $(\dot{\mu}, \dot{\nu})$ to be predicted at that point in the image plane using (2), where $\eta_z$ is provided by the laser rangefinder measurement $L$. These computed observables are then compared with the optical flow measurements of $(\dot{\mu}, \dot{\nu})$ from the video in order to determine how to update the states. Since multiple optical flow points can be identified within a single time step, this creates a set of $n_k$ observation equations, where $n_k$ is the number of optical flow points at time step $k$.

Since (7) and (8) are derived from kinematics, the only uncertainty that must be modeled is due to the input measurements. Therefore, the input vector is given by

$$\mathbf{u}_{k-1} = \tilde{\mathbf{u}}_{k-1} + \mathbf{b}_{k-1}, \tag{12}$$

where $\tilde{\mathbf{u}}$ is the measured input vector and $\mathbf{b}$ is the vector of sensor biases, which follow a first-order Gauss-Markov noise model as determined in [33].

The uncertainty in the measurements is due to the errors in the optical flow estimation from the video. It is assumed that each optical flow measurement $\mathbf{y}_i$ has an additive measurement noise vector $\mathbf{v}_i$, with corresponding covariance matrix $\mathbf{R}_i$. For this study, it is also assumed that each optical flow measurement carries equal uncertainty and that errors along the two component directions of the image plane also have equal uncertainty and are uncorrelated; that is,

$$\mathbf{R}_i = \mathbf{R} = R\,\mathbf{I}, \tag{13}$$

where $R$ is the scalar uncertainty of the optical flow measurements and $\mathbf{I}$ is a $2 \times 2$ identity matrix.

2.3. Simplified Formulation. The motion of a typical airplane is mostly in the forward direction; that is, the speed of the aircraft is primarily contained in the component $u$, while $v$ and $w$ are small. With this idea, assuming that $v$ and $w$ are zero, the formulation is simplified to the following state vector $\mathbf{x}$, bias state vector $\mathbf{b}$, input vector $\mathbf{u}$, optical flow input vectors $\mathbf{d}_i$, and output vectors $\mathbf{z}_i$:

$$\begin{aligned} \mathbf{x} &= \left[u\ \phi\ \theta\ \mathbf{b}^T\right]^T, \\ \mathbf{b} &= \left[b_{a_x}\ b_p\ b_q\ b_r\ b_L\right]^T, \\ \mathbf{u} &= \left[a_x\ p\ q\ r\ L\right]^T, \\ \mathbf{d}_i &= \left[\mu\ \nu\right]_i^T, \\ \mathbf{z}_i &= \dot{\mu}_i. \end{aligned} \tag{14}$$

Note that this simplified formulation removes the $v$ and $w$ states, which removes the need for $y$-axis and $z$-axis acceleration measurements. Since the yaw state is not contained in any of the state or observation equations, it has also been removed. Due to the assumption that $v$ and $w$ are zero, only the $x$-direction of optical flow is relevant. With these simplifications, the state dynamics become

$$\begin{aligned} \dot{u} &= a_x - g\sin\theta, \\ \dot{\phi} &= p + q\sin\phi\tan\theta + r\cos\phi\tan\theta, \\ \dot{\theta} &= q\cos\phi - r\sin\phi. \end{aligned} \tag{15}$$

The dynamics of the bias states remain the same as in the full formulation, except the corresponding bias states for $a_y$ and $a_z$ have been removed. The observation equations from (2) are simplified to be

$$\dot{\mu} = \frac{-u\left(f + \nu\tan\phi - \mu\tan\theta\cos\phi\right)}{\eta_z} - fq + r\nu + \frac{p\mu\nu}{f} - \frac{q\mu^2}{f}. \tag{16}$$

The advantage of considering this simplified formulation is primarily to reduce the computational complexity of the system. The processing of vision data, leading to a relatively large number of measurement updates, can significantly drive up the computation time of the system, particularly for higher sampling rates. This simplified formulation not only reduces the computation time through a reduction of states but also significantly reduces the processing and update time for optical flow measurements, since only the forward component of flow is used. This formulation could be more practical than the full state formulation for real-time implementation, especially on systems which are limited in onboard computational power due, for example, to cost or size constraints.
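For reference, the simplified observation (16) reduces to a one-line prediction per flow point; this sketch is illustrative only.

```python
import numpy as np

def predict_mu_dot_simplified(mu, nu, u, phi, theta, p, q, r, eta_z, f):
    """Simplified observation (16): predicted forward flow component
    under the assumption v = w = 0."""
    return (-u * (f + nu * np.tan(phi) - mu * np.tan(theta) * np.cos(phi)) / eta_z
            - f * q + r * nu + p * mu * nu / f - q * mu**2 / f)
```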

2.4. Inclusion of a Range State. It is possible to include a state to estimate the range in order to recover the scale of the optical flow images. To determine the dynamics of the range state, the flat ground assumption is used. With this assumption, consider the projection of the range vector onto the Earth-fixed $z$-axis, that is, "down" as shown in Figure 1, by taking the projection through both the roll and pitch angles of the aircraft:

$$z = -L\cos\phi\cos\theta. \tag{17}$$

Here, the negative sign is used because the $L$ coordinate is always positive, while the $z$ coordinate will be negative when the aircraft is above the ground (due to the "down" convention). Taking the derivative with respect to time yields

$$\dot{z} = -\dot{L}\cos\phi\cos\theta + \dot{\phi}L\sin\phi\cos\theta + \dot{\theta}L\cos\phi\sin\theta. \tag{18}$$

Compare this $z$-velocity equation with that obtained from rotating aircraft body velocity components into the Earth-fixed frame:

$$\dot{z} = -u\sin\theta + v\sin\phi\cos\theta + w\cos\phi\cos\theta. \tag{19}$$

Equating these two expressions for $z$-velocity gives

$$-\dot{L}\cos\phi\cos\theta + \dot{\phi}L\sin\phi\cos\theta + \dot{\theta}L\cos\phi\sin\theta = -u\sin\theta + v\sin\phi\cos\theta + w\cos\phi\cos\theta. \tag{20}$$


Simplifying this relationship leads to

$$\dot{L} = \dot{\phi}L\tan\phi + \dot{\theta}L\tan\theta + u\sec\phi\tan\theta - v\tan\phi - w. \tag{21}$$

Substituting in the dynamics for the roll and pitch angles and simplifying leads to the following expression for the range state dynamics:

$$\dot{L} = u\sec\phi\tan\theta - v\tan\phi - w + L\left[p\tan\phi + q\sec\phi\tan\theta\right]. \tag{22}$$

Note that for level conditions, that is, roll and pitch angles of zero, the equation reduces to

$$\dot{L} = -w, \tag{23}$$

which agrees with physical intuition. In order to implement the range state in the simplified formulation, the following expression can be used:

$$\dot{L} = u\sec\phi\tan\theta + L\left[p\tan\phi + q\sec\phi\tan\theta\right]. \tag{24}$$
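A sketch of the range state dynamics (22) and (24), with $\sec$ expressed as $1/\cos$; the function names are illustrative.

```python
import numpy as np

def L_dot_full(u, v, w, phi, theta, p, q, L):
    """Range state dynamics (22) under the flat-ground assumption."""
    sec_phi = 1.0 / np.cos(phi)
    return (u * sec_phi * np.tan(theta) - v * np.tan(phi) - w
            + L * (p * np.tan(phi) + q * sec_phi * np.tan(theta)))

def L_dot_simplified(u, phi, theta, p, q, L):
    """Range state dynamics (24) for the simplified formulation (v = w = 0)."""
    sec_phi = 1.0 / np.cos(phi)
    return (u * sec_phi * np.tan(theta)
            + L * (p * np.tan(phi) + q * sec_phi * np.tan(theta)))
```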

2.5. Information Fusion Algorithm. Due to the nonlinearity, the nonadditive noise, and the multiple optical flow measurements (ranging from 0 to 300 per frame with a mean of 250), the Unscented Information Filter (UIF) [39–41] was selected for the implementation of this algorithm [42]. The advantage of the information filtering framework over Kalman filtering is that redundant information vectors are additive [39–41]; therefore, the time-varying number of outputs obtained from optical flow can easily be handled with relatively low computation, since the coupling between the errors in different optical flow measurements is neglected. The UIF algorithm is summarized as follows [41].

Consider a discrete-time nonlinear dynamic system of the form

$$\mathbf{x}_k = \mathbf{f}(\mathbf{x}_{k-1}, \mathbf{u}_{k-1}) + \mathbf{w}_{k-1}, \tag{25}$$

with measurement equations of the form

$$\mathbf{z}_k = \mathbf{h}(\mathbf{x}_k, \mathbf{d}_k) + \mathbf{v}_k, \tag{26}$$

where $\mathbf{h}$ is the observation function and $\mathbf{w}$ and $\mathbf{v}$ are the zero-mean Gaussian process and measurement noise vectors. At each time step, sigma-points are generated from the prior distribution using

$$\boldsymbol{\chi}_{k-1} = \left[\mathbf{x}_{k-1},\ \mathbf{x}_{k-1} + \sqrt{N+\lambda}\sqrt{\mathbf{P}_{k-1}},\ \mathbf{x}_{k-1} - \sqrt{N+\lambda}\sqrt{\mathbf{P}_{k-1}}\right], \tag{27}$$

where $N$ is the total number of states and $\lambda$ is a scaling parameter [42]. Now the sigma-points are predicted using

$$\boldsymbol{\chi}^{(i)}_{k|k-1} = \mathbf{f}\left(\boldsymbol{\chi}^{(i)}_{k-1}, \mathbf{u}_{k-1}\right), \quad i = 0, 1, \ldots, 2N, \tag{28}$$

where $(i)$ denotes the $i$th column of a matrix. The a priori statistics are then recovered:

$$\begin{aligned} \mathbf{x}_{k|k-1} &= \sum_{i=0}^{2N} \eta^m_i \boldsymbol{\chi}^{(i)}_{k|k-1}, \\ \mathbf{P}_{k|k-1} &= \mathbf{Q}_{k-1} + \sum_{i=0}^{2N} \eta^c_i \left(\boldsymbol{\chi}^{(i)}_{k|k-1} - \mathbf{x}_{k|k-1}\right)\left(\boldsymbol{\chi}^{(i)}_{k|k-1} - \mathbf{x}_{k|k-1}\right)^T, \end{aligned} \tag{29}$$

where $\mathbf{Q}$ is the process noise covariance matrix and $\eta^m_i$ and $\eta^c_i$ are weight vectors [42]. Using these predicted values, the information vector $\mathbf{y}$ and matrix $\mathbf{Y}$ are determined:

$$\mathbf{y}_{k|k-1} = \mathbf{P}^{-1}_{k|k-1}\mathbf{x}_{k|k-1}, \qquad \mathbf{Y}_{k|k-1} = \mathbf{P}^{-1}_{k|k-1}. \tag{30}$$
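The prediction steps (27)–(29) can be sketched as follows. The weight values are an assumption here (the standard unscented-transform convention, with details deferred to [42] in the paper), f_discrete is a state transition like the sketch in Section 2.2, and all names are illustrative; the conversion to information form (30) is folded into the update sketch after (35).

```python
import numpy as np

def uif_predict(x, P, u_in, Q, f_discrete, Ts, lam=1.0):
    """UIF prediction: generate sigma-points (27), propagate them (28),
    and recover the a priori mean and covariance (29)."""
    N = x.size
    S = np.linalg.cholesky((N + lam) * P)        # matrix square root of (N+lam)P
    chi = np.column_stack([x]
                          + [x + S[:, i] for i in range(N)]
                          + [x - S[:, i] for i in range(N)])
    # Assumed standard unscented-transform weights (eta^m, eta^c in (29))
    wm = np.full(2 * N + 1, 0.5 / (N + lam))
    wm[0] = lam / (N + lam)
    wc = wm.copy()
    # Propagate each sigma-point through the dynamics (28)
    chi_pred = np.column_stack([f_discrete(chi[:, i], u_in, Ts)
                                for i in range(2 * N + 1)])
    x_pred = chi_pred @ wm                       # a priori mean
    dX = chi_pred - x_pred[:, None]
    P_pred = Q + (wc * dX) @ dX.T                # a priori covariance
    return x_pred, P_pred, chi_pred, wm, wc
```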

For each measurement, that is, each optical flow pair, the output equations are evaluated for each sigma-point, as in

$$\boldsymbol{\psi}^{(i,j)}_{k|k-1} = \mathbf{h}\left(\boldsymbol{\chi}^{(i,j)}_{k|k-1}, \mathbf{d}_k\right), \quad i = 0, 1, \ldots, 2N, \quad j = 1, \ldots, n_k, \tag{31}$$

where $\boldsymbol{\psi}$ denotes an output sigma-point and the superscript $(i,j)$ denotes the $i$th sigma-point and the $j$th measurement. The computed observation is then recovered using

$$\mathbf{z}^{(j)}_{k|k-1} = \sum_{i=0}^{2N} \eta^m_i \boldsymbol{\psi}^{(i,j)}_{k|k-1}, \quad j = 1, \ldots, n_k. \tag{32}$$

Using the computed observation, the cross-covariance is calculated:

$$\mathbf{P}^{xy(j)}_{k|k-1} = \sum_{i=0}^{2N} \eta^c_i \left(\boldsymbol{\chi}^{(i)}_{k|k-1} - \mathbf{x}_{k|k-1}\right)\left(\boldsymbol{\psi}^{(i,j)}_{k|k-1} - \mathbf{z}^{(j)}_{k|k-1}\right)^T. \tag{33}$$

Then, the observation sensitivity matrix $\mathbf{H}$ is determined:

$$\mathbf{H}^{(j)}_k = \left[\mathbf{P}^{-1}_{k|k-1}\mathbf{P}^{xy(j)}_{k|k-1}\right]^T, \quad j = 1, \ldots, n_k. \tag{34}$$

The information contributions can then be calculated:

$$\begin{aligned} \mathbf{y}_k &= \mathbf{y}_{k|k-1} + \sum_{j=1}^{n_k} \left(\mathbf{H}^{(j)}_k\right)^T \mathbf{R}^{-1}_k \left[\mathbf{z}^{(j)}_k - \mathbf{z}^{(j)}_{k|k-1} + \mathbf{H}^{(j)}_k \mathbf{x}_{k|k-1}\right], \\ \mathbf{Y}_k &= \mathbf{Y}_{k|k-1} + \sum_{j=1}^{n_k} \left(\mathbf{H}^{(j)}_k\right)^T \mathbf{R}^{-1}_k \mathbf{H}^{(j)}_k. \end{aligned} \tag{35}$$
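The measurement update (30)–(35) then accumulates one additive information contribution per optical flow point. A minimal sketch, chaining from the prediction sketch above, with h an observation function implementing (2); all names are illustrative.

```python
import numpy as np

def uif_update(x_pred, P_pred, chi_pred, wm, wc, h, d_list, z_list, R):
    """UIF update (30)-(35): each of the n_k optical flow measurements adds
    its contribution to the information vector y and information matrix Y."""
    P_inv = np.linalg.inv(P_pred)
    y = P_inv @ x_pred                           # information vector (30)
    Y = P_inv.copy()                             # information matrix (30)
    R_inv = np.linalg.inv(R)
    dX = chi_pred - x_pred[:, None]
    for d_j, z_j in zip(d_list, z_list):         # loop over the n_k flow points
        # Evaluate the observation for every sigma-point (31)
        psi = np.column_stack([h(chi_pred[:, i], d_j)
                               for i in range(chi_pred.shape[1])])
        z_hat = psi @ wm                         # computed observation (32)
        Pxy = (wc * dX) @ (psi - z_hat[:, None]).T   # cross-covariance (33)
        H = (P_inv @ Pxy).T                      # observation sensitivity (34)
        # Information contributions (35)
        y = y + H.T @ R_inv @ (z_j - z_hat + H @ x_pred)
        Y = Y + H.T @ R_inv @ H
    x_post = np.linalg.solve(Y, y)               # recover the posterior state
    P_post = np.linalg.inv(Y)
    return x_post, P_post
```

The design benefit noted in the text is visible here: each flow point's contribution is simply added to y and Y, so a frame with 0 or 300 detected points requires no structural change to the filter.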

3. Experimental Setup

The research platform used for this study is the West Virginia University (WVU) "Red Phastball" UAV, shown in Figure 2, with a custom GPS/INS data logger mounted inside the aircraft [28, 43]. Some details for this aircraft are provided in Table 1.

[Figure 2: Picture of the WVU "Red Phastball" aircraft.]

Table 1: Details for the WVU Phastball aircraft.

Property              Value
Length                2.2 m
Wingspan              2.4 m
Takeoff mass          11 kg
Payload mass          3 kg
Propulsion system     Dual 90 mm ducted fan motors
Static thrust         60 N
Fuel capacity         Two 5 Ah LiPo batteries
Cruise speed          30 m/s
Mission duration      6 min

The IMU used in this study is an Analog Devices ADIS-16405 MEMS-based IMU, which includes triaxial accelerometers and rate gyroscopes. Each suite of sensors on the IMU is acquired at 18-bit resolution at 50 Hz over ranges of ±18 g's and ±150 deg/s, respectively. The GPS receiver used in the data logger is a Novatel OEM-V1, which was configured to provide Cartesian position and velocity measurements and solution standard deviations at a rate of 50 Hz, with 1.5 m RMS horizontal position accuracy and 0.03 m/s RMS velocity accuracy. An Optic-Logic RS400 laser rangefinder, pointing downward, was used for range measurement, with an approximate accuracy of 1 m and a range of 366 m. In addition, a high-quality Goodrich mechanical vertical gyroscope is mounted onboard the UAV to provide pitch and roll measurements to be used as sensor fusion "truth" data, with reported accuracy of within 0.25° of true vertical. The vertical gyroscope measurements were acquired at 16-bit resolution with measurement ranges of ±80 deg for roll and ±60 deg for pitch.

A GoPro Hero video camera is mounted at the center of gravity of the UAV for flight video collection, pointing downward. The camera was previously calibrated to a focal length of 1141 pixels [29]. Two different sets of flight data were used for this study, each using different camera settings. The first flight used an image size of 1920 × 1080 pixels and a sampling rate of 29.97 Hz. The second flight used an image size of 1280 × 720 pixels and a sampling rate of 59.94 Hz. All the other sensor data were collected at 50 Hz and resampled to the camera time for postflight validation after manual synchronization.

[Figure 3: Flight trajectories for Flight 1 (a) and Flight 2 (b). © 2014 Google.]

4. Experimental Results

4.1. Flight Data. Two sets of flight data from the WVU "Red Phastball" aircraft were used in this study. Each flight consists of approximately 5 minutes of flight. The top-down flight trajectories from these two data sets are overlaid on a Google Earth image of the flight test location in Figure 3. Six unique markers have been placed in Figure 3 in order to identify specific points along the trajectory. These markers will be used in later figures in order to synchronize the presentation of data.

4.2. Selection of Noise Assumptions for Optical Flow Measurements. Since the noise properties of the IMU have been established from previous work [33], only the characteristics of the uncertainty in the laser range and optical flow measurements need to be determined. The uncertainty in the laser rangefinder measurement is modeled as 1 m zero-mean Gaussian noise, based on the manufacturer's reported accuracy of the sensor. The optical flow errors are more difficult to model. Due to this difficulty, different assumptions of the optical flow uncertainty were considered. Using both sets of the flight data, the full state UIF was executed for each assumption of optical flow uncertainty. To evaluate the performance of the filter, the speed estimates were compared with reference measurements from GPS, which have been mapped into the aircraft frame using roll and pitch measurements from the vertical gyroscope and approximating the yaw from the heading as determined by GPS. The roll and pitch estimates were compared with the measurements from the vertical gyroscope. Due to the possibility of alignment errors, only the standard deviation of error was considered. Each of these errors was calculated for each set of flight data, and the results are offered in Figure 4.

Figure 4 shows how changing the assumption on the optical flow uncertainty affects the estimation performance of the total ground speed, roll angle, and pitch angle.

[Figure 4: Comparison of errors for different assumed optical flow uncertainties, plotting the standard deviation of error in V (m/s), φ (deg), and θ (deg) for Flights 1 and 2 against the assumed σ_OF (pixels).]

The relatively flat region in Figure 4, for assumed optical flow standard deviations from approximately 3 to 9 pixels, indicates that this formulation is relatively insensitive to the tuning of these optical flow errors. It is also interesting to note in Figure 4 that Flight 1 and Flight 2 have optimum performance at different values of $R$. This, however, makes sense, as Flight 2 has twice the frame rate of Flight 1; therefore, the assumed noise characteristics should be one half those of Flight 1. From Figure 4, the optical flow uncertainties were selected to be $R = 5^2$ pixels² for Flight 1 and $R = 2.5^2$ pixels² for Flight 2.

4.3. Full State Formulation Estimation Results. Using each set of flight data, the full state formulation using the UIF was executed. The estimated components of velocity are shown for Flight 1 in Figure 5 and for Flight 2 in Figure 6. These estimates from the UIF are offered with respect to comparable reference values from GPS, which were mapped into the aircraft frame using roll and pitch measurements from the vertical gyroscope and approximating the yaw angle with the heading angle obtained from GPS. From each of these figures, the following observations can be made. The forward velocity $u$ is reasonably captured by the estimation. The lateral velocity $v$ and vertical velocity $w$, however, demonstrate somewhat poor results. This does, however, make sense, as the primary direction of flight is forward, thus resulting in good observability characteristics in the optical flow in the forward direction, while the signal-to-noise ratio (SNR) for the lateral and vertical directions remains small for most typical flight conditions. However, since these lateral and vertical components are only a small portion of the total velocity, the total speed can be reasonably approximated by this technique. The total speed estimates are shown in Figure 7 for Flight 1 with the GPS reference.

[Figure 5: Estimated velocity components (u, v, w in m/s) for Flight 1, UIF versus GPS.]

[Figure 6: Estimated velocity components (u, v, w in m/s) for Flight 2, UIF versus GPS.]


[Figure 7: Estimated total ground speed (m/s) for Flight 1, UIF versus GPS.]

[Figure 8: Roll and pitch (φ, θ in deg) estimation results for Flight 2, UIF versus vertical gyroscope (VG).]

The attitude estimates for the roll and pitch angles are compared with the vertical gyroscope measurements as a reference, as shown in Figure 8. In order to demonstrate the effectiveness of this method in regulating the drift in attitude estimates that occurs with dead reckoning, the estimation errors from the UIF are compared with the errors obtained from dead reckoning attitude estimation. These roll and pitch errors are offered in Figure 9 for Flight 2. Figure 9 demonstrates the effectiveness of the UIF in regulating the attitude errors from dead reckoning.

In order to quantify the estimation results, the mean absolute error and standard deviation of error of the estimates are calculated for the velocity components with respect to the GPS reference, and also for the roll and pitch angles with respect to the vertical gyroscope reference.

[Figure 9: Roll and pitch estimation errors (Δφ, Δθ in deg) as compared to dead reckoning (DR) for Flight 2.]

Table 2: Flight 1 error statistics for estimated states.

Estimated state   Mean abs.   Standard deviation   Units
u                 1.0644      1.2667               m/s
v                 2.4699      2.7719               m/s
w                 2.1554      1.6554               m/s
V                 1.2466      1.2858               m/s
φ                 1.1553      1.3668               deg
θ                 2.1752      1.3339               deg

These statistical results are provided in Table 2 for Flight 1 and Table 3 for Flight 2, where $V$ is the total speed, as determined by

$$V = \sqrt{u^2 + v^2 + w^2}. \tag{36}$$

It is shown in Tables 2 and 3 that reasonable errors are obtained in both sets of flight data for the velocity and attitude of the aircraft. Larger errors are noted in particular for the lateral velocity state $v$, which is due to observability issues in the optical flow. Note that mean errors in the roll and pitch estimation could be due to misalignment between the vertical gyroscope, IMU, and video camera. The attitude estimation accuracy reported in Tables 2 and 3 is similar to the reported accuracy of loosely coupled GPS/INS attitude estimation using similar flight data [43].

4.4. Simplified Formulation Estimation Results. Since it was observed in the full state formulation results that the lateral and vertical velocity estimates were small, the simplified formulation was implemented in order to investigate the feasibility of a simplified version of the filter that estimates only the forward velocity component and assumes the lateral and vertical components are zero. The forward velocity $u$ for Flight 1 is offered in Figure 10, while the roll and pitch errors with respect to the vertical gyroscope measurement are offered in Figure 11 for the UIF and dead reckoning (DR).


[Figure 10: Simplified formulation speed estimation results (u in m/s) for Flight 1, UIF versus GPS.]

[Figure 11: Simplified formulation attitude estimation errors (Δφ, Δθ in deg) with dead reckoning (DR) for Flight 2.]

Table 3: Flight 2 error statistics for estimated states.

Estimated state   Mean abs.   Standard deviation   Units
u                 1.1299      1.2663               m/s
v                 1.5184      1.8649               m/s
w                 1.1324      1.3453               m/s
V                 1.2087      1.2535               m/s
φ                 1.8818      1.2782               deg
θ                 1.6646      1.1049               deg


[Figure 12: Comparison of attitude estimation errors (Δφ, Δθ in deg) with respect to the lateral and vertical velocity magnitude sqrt(v² + w²) (m/s).]

Table 4: Simplified formulation error statistics for Flight 1.

Estimated state   Mean abs.   Standard deviation   Units
u                 1.2283      1.5730               m/s
φ                 1.9901      2.2922               deg
θ                 1.9002      2.1881               deg

Table 5: Simplified formulation error statistics for Flight 2.

Estimated state   Mean abs.   Standard deviation   Units
u                 1.3073      1.5775               m/s
φ                 2.8402      3.5426               deg
θ                 2.0286      2.4691               deg

Additionally, the mean absolute error and standard deviation of error for these terms are provided in Table 4 for Flight 1 and Table 5 for Flight 2.

It is shown in Tables 4 and 5 that the simplified formulation results in significantly higher attitude estimation errors with respect to the full state formulation. These increased attitude errors are likely due to the assumption that the lateral and vertical velocity components are zero. To investigate this possible correlation, the roll and pitch errors are shown in Figure 12 with the magnitude of the lateral and vertical velocity as determined from GPS for a 50-second segment of flight data which includes takeoff. Figure 12 shows that there is some correlation between the attitude estimation errors and the lateral and vertical velocity, though it is not the only source of error for these estimates.

4.5. Results Using Range State. The results for each flight for both the full state formulation and the simplified formulation were recalculated with the addition of the range state. The statistical results for these tests are offered in Tables 6–9.


Table 6: Flight 1 error statistics for estimated states with range state.

Estimated state   Mean abs.   Standard deviation   Units
u                 1.1330      1.3699               m/s
v                 2.4387      2.7369               m/s
w                 2.1210      1.5998               m/s
V                 1.3041      1.3852               m/s
φ                 1.1599      1.4084               deg
θ                 2.1767      1.4064               deg

Table 7: Flight 2 error statistics for estimated states with range state.

Estimated state   Mean abs.   Standard deviation   Units
u                 1.1444      1.2937               m/s
v                 1.5122      1.8529               m/s
w                 1.1154      1.3268               m/s
V                 1.2112      1.2761               m/s
φ                 1.8897      1.2845               deg
θ                 1.6609      1.1152               deg

Table 8: Simplified formulation error statistics for Flight 1 with range state.

Estimated state   Mean abs.   Standard deviation   Units
u                 1.2919      1.6256               m/s
φ                 2.0621      2.3746               deg
θ                 1.9277      2.2713               deg

Table 9: Simplified formulation error statistics for Flight 2 with range state.

Estimated state   Mean abs.   Standard deviation   Units
u                 1.3356      1.6016               m/s
φ                 2.8748      3.5462               deg
θ                 2.0303      2.4876               deg

In order to compare the results from the different cases, the standard deviation of error is shown graphically for Flight 1 in Figure 13 and for Flight 2 in Figure 14. It is shown in Figures 13 and 14 that the simplified formulation offers poorer estimation performance, as expected, particularly for the attitude estimates. The addition of the range state does not affect the performance significantly.

[Figure 13: Graphical comparison of the standard deviation of error (σ in m/s or deg) in u, roll, and pitch for Flight 1: full, full with range, simplified, and simplified with range formulations.]

[Figure 14: Graphical comparison of the standard deviation of error (σ in m/s or deg) in u, roll, and pitch for Flight 2: full, full with range, simplified, and simplified with range formulations.]

5. Conclusions

This paper presented vision-aided inertial navigation techniques, which do not rely upon GPS, using UAV flight data. Two different formulations were presented: a full state estimation formulation, which captures the aircraft ground velocity vector and attitude, and a simplified formulation, which assumes all of the aircraft velocity is in the forward direction. Both formulations were shown to be effective in regulating the INS drift. Additionally, a state was included in each formulation in order to estimate the distance between the image center and the aircraft. The full state formulation was shown to be effective in estimating the aircraft ground velocity to within 1.3 m/s and regulating attitude angles within 1.4 degrees standard deviation of error for both sets of flight data.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.


Acknowledgments

This work was supported in part by NASA Grant no. NNX12AM56A, a Kansas NASA EPSCoR PDG grant, and an SRI grant.

References

[1] C. Li, L. Shen, H.-B. Wang, and T. Lei, "The research on unmanned aerial vehicle remote sensing and its applications," in Proceedings of the IEEE International Conference on Advanced Computer Control (ICACC '10), pp. 644–647, Shenyang, China, March 2010.
[2] A. J. Calise and R. T. Rysdyk, "Nonlinear adaptive flight control using neural networks," IEEE Control Systems Magazine, vol. 18, no. 6, pp. 14–24, 1998.
[3] M. S. Grewal, L. R. Weill, and A. P. Andrews, Global Positioning, Inertial Navigation & Integration, Wiley, New York, NY, USA, 2nd edition, 2007.
[4] J. L. Crassidis, "Sigma-point Kalman filtering for integrated GPS and inertial navigation," in Proceedings of the AIAA Guidance, Navigation, and Control Conference, pp. 1981–2004, San Francisco, Calif, USA, August 2005.
[5] J. N. Gross, Y. Gu, M. B. Rhudy, S. Gururajan, and M. R. Napolitano, "Flight-test evaluation of sensor fusion algorithms for attitude estimation," IEEE Transactions on Aerospace and Electronic Systems, vol. 48, no. 3, pp. 2128–2139, 2012.
[6] M. V. Srinivasan, "Honeybees as a model for the study of visually guided flight, navigation, and biologically inspired robotics," Physiological Reviews, vol. 91, no. 2, pp. 413–460, 2011.
[7] P. S. Bhagavatula, C. Claudianos, M. R. Ibbotson, and M. V. Srinivasan, "Optic flow cues guide flight in birds," Current Biology, vol. 21, no. 21, pp. 1794–1799, 2011.
[8] N. Franceschini, "Visual guidance based on optic flow: a biorobotic approach," Journal of Physiology Paris, vol. 98, no. 1–3, pp. 281–292, 2004.
[9] P. Corke, J. Lobo, and J. Dias, "An introduction to inertial and visual sensing," The International Journal of Robotics Research, vol. 26, no. 6, pp. 519–535, 2007.
[10] D. G. Lowe, "Distinctive image features from scale-invariant keypoints," International Journal of Computer Vision, vol. 60, no. 2, pp. 91–110, 2004.
[11] M. V. Srinivasan, S. Thurrowgood, and D. Soccol, "Competent vision and navigation systems," IEEE Robotics & Automation Magazine, vol. 16, no. 3, pp. 59–71, 2009.
[12] J. Conroy, G. Gremillion, B. Ranganathan, and J. S. Humbert, "Implementation of wide-field integration of optic flow for autonomous quadrotor navigation," Autonomous Robots, vol. 27, no. 3, pp. 189–198, 2009.
[13] F. Kendoul, I. Fantoni, and K. Nonami, "Optic flow-based vision system for autonomous 3D localization and control of small aerial vehicles," Robotics and Autonomous Systems, vol. 57, no. 6-7, pp. 591–602, 2009.
[14] A. Beyeler, J.-C. Zufferey, and D. Floreano, "optiPilot: control of take-off and landing using optic flow," in Proceedings of the European Micro Air Vehicle Conference and Competition (EMAV '09), Delft, The Netherlands, September 2009.
[15] M. M. Veth and J. Raquet, "Fusion of low-cost imaging and inertial sensors for navigation," in Proceedings of the 19th International Technical Meeting of the Satellite Division, Institute of Navigation (ION GNSS '06), pp. 1093–1103, Fort Worth, Tex, USA, September 2006.
[16] D. Dusha, W. Boles, and R. Walker, "Fixed-wing attitude estimation using computer vision based horizon detection," in Proceedings of the International Unmanned Air Vehicle Systems Conference, April 2007.
[17] D. Dusha, W. Boles, and R. Walker, "Attitude estimation for a fixed-wing aircraft using horizon detection and optical flow," in Proceedings of the 9th Biennial Conference of the Australian Pattern Recognition Society on Digital Image Computing Techniques and Applications (DICTA '07), pp. 485–492, IEEE, Glenelg, Australia, December 2007.
[18] S. Weiss, M. W. Achtelik, S. Lynen, M. Chli, and R. Siegwart, "Real-time onboard visual-inertial state estimation and self-calibration of MAVs in unknown environments," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '12), May 2012.
[19] M. Dille, B. Grocholsky, and S. Singh, "Outdoor downward-facing optical flow odometry with commodity sensors," in Field and Service Robotics: Results of the 7th International Conference, vol. 62 of Springer Tracts in Advanced Robotics, pp. 183–193, 2010.
[20] S. I. Roumeliotis, A. E. Johnson, and J. F. Montgomery, "Augmenting inertial navigation with image-based motion estimation," in Proceedings of the IEEE International Conference on Robotics & Automation, pp. 4326–4333, Washington, DC, USA, May 2002.
[21] M. Achtelik, M. Achtelik, S. Weiss, and R. Siegwart, "Onboard IMU and monocular vision based control for MAVs in unknown in- and outdoor environments," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '11), pp. 3056–3063, Shanghai, China, May 2011.
[22] P.-J. Bristeau, F. Callou, D. Vissiere, and N. Petit, "The navigation and control technology inside the AR.Drone micro UAV," in Proceedings of the 18th IFAC World Congress, pp. 1477–1484, Milano, Italy, September 2011.
[23] H. Chao, Y. Gu, and M. Napolitano, "A survey of optical flow techniques for robotics navigation applications," Journal of Intelligent and Robotic Systems: Theory and Applications, vol. 73, no. 1–4, pp. 361–372, 2014.
[24] W. Ding, J. Wang, S. Han, et al., "Adding optical flow into the GPS/INS integration for UAV navigation," in Proceedings of the International Global Navigation Satellite Systems Society (IGNSS) Symposium, Surfers Paradise, Australia, 2009.
[25] H. Romero, S. Salazar, and R. Lozano, "Real-time stabilization of an eight-rotor UAV using optical flow," IEEE Transactions on Systems, Man, and Cybernetics—Part C: Applications and Reviews, vol. 42, no. 6, pp. 1752–1762, 2012.
[26] V. Grabe, H. H. Bulthoff, and P. R. Giordano, "A comparison of scale estimation schemes for a quadrotor UAV based on optical flow and IMU measurements," in Proceedings of the 26th IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '13), pp. 5193–5200, IEEE, Tokyo, Japan, November 2013.
[27] A. Martinelli, "Vision and IMU data fusion: closed-form solutions for attitude, speed, absolute scale, and bias determination," IEEE Transactions on Robotics, vol. 28, no. 1, pp. 44–60, 2012.
[28] M. B. Rhudy, H. Chao, and Y. Gu, "Wide-field optical flow aided inertial navigation for unmanned aerial vehicles," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '14), pp. 674–679, Chicago, Ill, USA, September 2014.
[29] H. Chao, Y. Gu, J. Gross, G. Guo, M. L. Fravolini, and M. R. Napolitano, "A comparative study of optical flow and traditional sensors in UAV navigation," in Proceedings of the 1st American Control Conference (ACC '13), pp. 3858–3863, Washington, DC, USA, June 2013.
[30] R. C. Hibbeler, Engineering Mechanics: Dynamics, Prentice Hall, 10th edition, 2004.
[31] V. Klein and E. A. Morelli, Aircraft System Identification: Theory and Practice, American Institute of Aeronautics and Astronautics, Reston, Va, USA, 2006.
[32] J. Roskam, Airplane Flight Dynamics and Automatic Flight Controls, DARcorporation, Lawrence, Kan, USA, 2003.
[33] J. N. Gross, Y. Gu, M. Rhudy, F. J. Barchesky, and M. R. Napolitano, "On-line modeling and calibration of low-cost navigation sensors," in Proceedings of the AIAA Modeling and Simulation Technologies Conference, pp. 298–311, Portland, Ore, USA, August 2011.
[34] D. W. Allan, "Statistics of atomic frequency standards," Proceedings of the IEEE, vol. 54, no. 2, pp. 221–230, 1966.
[35] X. Zhiqiang and D. Gebre-Egziabher, "Modeling and bounding low cost inertial sensor errors," in Proceedings of the IEEE/ION Position, Location and Navigation Symposium (PLANS '08), pp. 1122–1132, IEEE, Monterey, Calif, USA, May 2008.
[36] Z. Xing, Over-bounding integrated INS/GNSS output errors [Ph.D. dissertation], The University of Minnesota, Minneapolis, Minn, USA, 2010.
[37] F. L. Lewis and V. L. Syrmos, Optimal Control, Wiley, New York, NY, USA, 2nd edition, 1995.
[38] M. Mammarella, G. Campa, M. L. Fravolini, Y. Gu, B. Seanor, and M. R. Napolitano, "A comparison of optical flow algorithms for real time aircraft guidance and navigation," in Proceedings of the AIAA Guidance, Navigation and Control Conference and Exhibit, Honolulu, Hawaii, USA, August 2008.
[39] A. G. O. Mutambara, Decentralized Estimation and Control for Multisensor Systems, CRC Press, Washington, DC, USA, 1998.
[40] T. Vercauteren and X. Wang, "Decentralized sigma-point information filters for target tracking in collaborative sensor networks," IEEE Transactions on Signal Processing, vol. 53, no. 8, part 2, pp. 2997–3009, 2005.
[41] D.-J. Lee, "Unscented information filtering for distributed estimation and multiple sensor fusion," in Proceedings of the AIAA Guidance, Navigation and Control Conference and Exhibit, Honolulu, Hawaii, USA, 2008.
[42] M. Rhudy and Y. Gu, "Understanding nonlinear Kalman filters," in Interactive Robotics Letters, West Virginia University, 2013, http://www2.statler.wvu.edu/~irl/page13.html.
[43] M. Rhudy, J. Gross, Y. Gu, and M. R. Napolitano, "Fusion of GPS and redundant IMU data for attitude estimation," in Proceedings of the AIAA Guidance, Navigation, and Control Conference, Minneapolis, Minn, USA, August 2012.


Page 3: Research Article Unmanned Aerial Vehicle Navigation Using Wide …downloads.hindawi.com/journals/jr/2015/251379.pdf · 2019-07-31 · Research Article Unmanned Aerial Vehicle Navigation

Journal of Robotics 3

Body coordinatesNED coordinatesRange coordinate

x998400

y998400

z998400

L z (down)

Figure 1 Diagram of the range coordinate

A diagram describing the definition of the range coordinate119871 is provided in Figure 1 Note that the range coordinate 119871is equivalent to the camera 119911 coordinate 120578

119911

In order to determine the dynamics of the velocity statesthe time derivative of the velocity vector observed from thefixed navigation frame is equal to the time rate of change asobserved from the moving body axis frame plus the changecaused by rotation of the frame [30]

119889

119889119905(

[[

[

119906

V

119908

]]

]

) =[[

[

V

]]

]

+[[

[

119901

119902

119903

]]

]

times[[

[

119906

V

119908

]]

]

(4)

The IMU measures the acceleration with respect to the fixedgravity vector as in

[[

[

119886119909

119886119910

119886119911

]]

]

=119889

119889119905(

[[

[

119906

V

119908

]]

]

) + 119862119887

119899

[[

[

0

0

minus119892

]]

]

(5)

where 119862119887119899is the rotation matrix from the navigation frame to

the body frame

119862119887

119899=

[[

[

cos 120579 cos120595 cos 120579 sin120595 minus sin 120579

minus cos120601 sin120595 + sin120601 sin 120579 cos120595 cos120601 cos120595 + sin120601 sin 120579 sin120595 sin120601 cos 120579sin120601 sin120595 + cos120601 sin 120579 cos120595 minus sin120601 cos120595 + cos120601 sin 120579 sin120595 cos120601 cos 120579

]]

]

(6)

Combining these results gives the dynamics for the velocitystates [31]

[[

[

V

]]

]

=[[

[

119903V minus 119902119908 + 119886119909minus 119892 sin 120579

119901119908 minus 119903119906 + 119886119910+ 119892 sin120601 cos 120579

119902119906 minus 119901V + 119886119911+ 119892 cos120601 cos 120579

]]

]

(7)

The dynamics of the attitude states are defined using [32]

[[[

[

120601

120579

]]]

]

=[[

[

1 sin120601 tan 120579 cos120601 tan 120579

0 cos120601 minus sin120601

0 sin120601 sec 120579 cos120601 sec 120579

]]

]

[[

[

119901

119902

119903

]]

]

(8)

To define the dynamics for the bias parameters a first-order Gauss-Markov noise model was used In a relatedwork [33] the Allan deviation [34] approach presented in[35 36] was used to determine the parameters of the first-order Gauss-Markov noisemodel for the dynamics of the biason each IMU channel The Gauss-Markov noise model foreach sensor measurement involves two parameters a timeconstant and a variance of the wide-band sensor noise Usingthis model the dynamics for the bias parameters are given by

b119896= b119896minus1

119890minus119879119904120591

+ n119896minus1

(9)

where 120591 is a vector of time constants and n is a zero-meannoise vector with variance given by a diagonal matrix of

the variance terms for each sensor The time constant andvariance terms were calculated in [33] for each channel of thesame IMU that was considered for this study

The state dynamic equations have been defined incontinuous-time using the following format

x = f119888(x u) (10)

where f119888is the nonlinear continuous-time state transition

function In order to implement these equations in a discrete-time filter a first-order discretization is used [37]

x119896= x119896minus1

+ 119879119904f119888(x119896minus1

u119896minus1

) ≜ f (x119896minus1

u119896minus1

) (11)

where 119896 is the discrete time index f is the nonlinear discrete-time state transition function and 119879

119904is the sampling time of

the systemTo formulate the observation equations optical flow

information is utilized In particular each optical flow pointidentified from vision data consists of four values 120583 ] ]These values are obtained using a point matching method[38] and the Scale-Invariant Feature Transform (SIFT) algo-rithm [10] Note that the method for optical flow generationis not the emphasis of this research [38] therefore anyother optical flow algorithm can be used similarly within theproposed estimator without any loss of generality

During the state estimation process the image planecoordinates (120583 ]) are taken as inputs to the observation

4 Journal of Robotics

equation allowing the optical flow ( ]) to be predictedat that point in the image plane using (2) where 120578

119911is

provided by the laser rangefinder measurement 119871 Thesecomputed observables are then compared with the opticalflow measurements of ( ]) from the video in order todetermine how to update the states Since multiple opticalflow points can be identified within a single time step thiscreates a set of 119899

119896observation equations where 119899

119896is the

number of optical flow points at time step 119896Since (7) and (8) are derived from kinematics the only

uncertainty that must be modeled is due to the inputmeasurements Therefore the input vector is given by

u119896minus1

= u119896minus1

+ b119896minus1

(12)

where u is the measured input vector and b is the vector ofsensor biases which follow a first order Gauss-Markov noisemodel as determined in [33]

The uncertainty in the measurements is due to the errorsin the optical flow estimation from the video It is assumedthat each optical flow measurement y

119894has an additive

measurement noise vector v119894 with corresponding covariance

matrix R119894 For this study it is also assumed that each optical

flow measurement carries equal uncertainty and that errorsalong the two component directions of the image plane alsohave equal uncertainty and are uncorrelated that is

R119894= R = 119877I (13)

where 119877 is the scalar uncertainty of the optical flowmeasure-ments and I is a 2 times 2 identity matrix

23 Simplified Formulation Themotion of a typical airplaneis mostly in the forward direction that is the speed of theaircraft is primarily contained in the component 119906 whileV and 119908 are small With this idea assuming that V and 119908

are zero the formulation is simplified to the following statevector x bias state vectorb input vectoru optical flow inputvectors d

119894 and output vectors z

119894

x = [119906 120601 120579 b119879]119879

b = [119887119886119909

119887119901

119887119902

119887119903

119887119871]119879

u = [119886119909

119901 119902 119903 119871]119879

d119894= [120583 ]]119879

119894

z119894= 119894

(14)

Note that this simplified formulation removes the V and 119908

states which removes the need for 119910-axis and 119911-axis accel-eration measurements Since the yaw state is not containedin any of the state or observation equations it has also beenremoved Due to the assumption that V and 119908 are zero

only the 119909-direction of optical flow is relevant With thesesimplifications the state dynamics become

= 119886119909minus 119892 sin 120579

120601 = 119901 + 119902 sin120601 tan 120579 + 119903 cos120601 tan 120579

120579 = 119902 cos120601 minus 119903 sin120601

(15)

The dynamics of the bias states remain the same as in the fullformulation except the corresponding bias states for 119886

119910and

119886119911have been removed The observation equations from (2)

are simplified to be

=minus119906 (119891 + ] tan120601 minus 120583 tan 120579 cos120601)

120578119911

minus 119891119902 + 119903]

+119901120583]119891

minus1199021205832

119891

(16)

The advantage of considering this simplified formulationis primarily to reduce the computational complexity of thesystem The processing of vision data leading to a relativelylarge number of measurement updates can significantlydrive up the computation time of the system particularlyfor higher sampling rates This simplified formulation notonly reduces the computation time through a reductionof states but also significantly reduces the processing andupdate time for optical flow measurements since only theforward component of flow is used This formulation couldbe more practical than the full state formulation for real-timeimplementation especially on systems which are limited inonboard computational power due for example to cost orsize constraints

24 Inclusion of a Range State It is possible to include astate to estimate the range in order to recover the scale ofthe optical flow images To determine the dynamics of therange state the flat ground assumption is used With thisassumption consider the projection of the range vector ontothe Earth-fixed 119911-axis that is ldquodownrdquo as shown in Figure 1 bytaking the projection through both the roll and pitch anglesof the aircraft

119911 = minus119871 cos120601 cos 120579 (17)

Here the negative sign is used because the 119871 coordinateis always positive while the 119911 coordinate will be negativewhen the aircraft is above the ground (due to the ldquodownrdquoconvention) Taking the derivative with respect to time yields

= minus cos120601 cos 120579 + 120601119871 sin120601 cos 120579 + 120579119871 cos120601 sin 120579 (18)

Compare this 119911-velocity equation with that obtained fromrotating aircraft body velocity components into the Earth-fixed frame

= minus119906 sin 120579 + V sin120601 cos 120579 + 119908 cos120601 cos 120579 (19)

Equating these two expressions for 119911-velocity gives

minus cos120601 cos 120579 + 120601119871 sin120601 cos 120579 + 120579119871 cos120601 sin 120579

= minus119906 sin 120579 + V sin120601 cos 120579 + 119908 cos120601 cos 120579(20)

Journal of Robotics 5

Simplifying this relationship leads to

= 120601119871 tan120601 + 120579119871 tan 120579 + 119906 sec120601 tan 120579 minus V tan120601 minus 119908 (21)

Substituting in the dynamics for the roll and pitch angles andsimplifying leads to the following expression for the rangestate dynamics

= 119906 sec120601 tan 120579 minus V tan120601 minus 119908

+ 119871 [119901 tan120601 + 119902 sec120601 tan 120579] (22)

Note that for level conditions that is roll and pitch angles arezero the equation reduces to

= minus119908 (23)

which agrees with physical intuition In order to implementthe range state in the simplified formulation the followingexpression can be used

= 119906 sec120601 tan 120579 + 119871 [119901 tan120601 + 119902 sec120601 tan 120579] (24)

25 Information Fusion Algorithm Due to the nonlinearitynonadditive noise and numbers of multiple optical flowmeasurements ranging from 0 to 300 per frame with amean of 250 the Unscented Information Filter (UIF) [39ndash41] was selected for the implementation of this algorithm[42] The advantage of the information filtering frameworkover Kalman filtering is that redundant information vectorsare additive [39ndash41] therefore the time-varying number ofoutputs obtained from optical flow can easily be handled withrelatively low computation since the coupling between theerrors in different optical flow measurements is neglectedThe UIF algorithm is summarized as follows [41]

Consider a discrete time nonlinear dynamic system of theform

x119896= f (x

119896minus1 u119896minus1

) + w119896minus1

(25)

with measurement equations of the form

z119896= h (x

119896 d119896) + k119896 (26)

where h is the observation function andw and v are the zero-mean Gaussian process and measurement noise vectors Ateach time step sigma-points are generated from the priordistribution using

120594119896minus1

= [x119896minus1

x119896minus1

+ radic119873 + 120582radicP119896minus1

x119896minus1

minus radic119873 + 120582radicP119896minus1

] (27)

where 119873 is the total number of states and 120582 is a scalingparameter [42] Now the sigma-points are predicted using

120594(119894)

119896|119896minus1= f (120594(119894)

119896minus1 u119896minus1

) 119894 = 0 1 2119873 (28)

where (119894) denotes the 119894th column of a matrix The a prioristatistics are then recovered

x119896|119896minus1

=

2119873

sum119894=0

120578119898

119894120594(119894)

119896|119896minus1

P119896|119896minus1

= Q119896minus1

+

2119873

sum119894=0

120578119888

119894(120594(119894)

119896|119896minus1minus x119896|119896minus1

) (120594(119894)

119896|119896minus1minus x119896|119896minus1

)119879

(29)

where Q is the process noise covariance matrix and 120578119898119894and

120578119888

119894are weight vectors [42] Using these predicted values the

information vector y and matrix Y are determined

y119896|119896minus1

= Pminus1119896|119896minus1

x119896|119896minus1

Y119896|119896minus1

= Pminus1119896|119896minus1

(30)

For each measurement that is each optical flow pair theoutput equations are evaluated for each sigma-point as in

120595(119894119895)

119896|119896minus1= h (120594

(119894119895)

119896|119896minus1 d119896)

119894 = 0 1 2119873 119895 = 1 119899119896

(31)

where 120595 denotes an output sigma-point and the superscript(119894 119895) denotes the 119894th sigma-point and the 119895th measurementThe computed observation is then recovered using

z(119895)119896|119896minus1

=

2119873

sum119894=0

120578119898

119894120595(119894)

119896|119896minus1 119895 = 1 119899

119896 (32)

Using the computed observation the cross-covariance iscalculated

P119909119910(119895)119896|119896minus1

=

2119873

sum119894=0

120578119888

119894(120594(119894)

119896|119896minus1minus x119896|119896minus1

) (120595(119894)

119896|119896minus1minus z119896|119896minus1

)119879

(33)

Then the observation sensitivity matrixH is determined

H(119895)119896

= [Pminus1119896|119896minus1

P119909119910(119895)119896|119896minus1

]119879

119895 = 1 119899119896 (34)

The information contributions can then be calculated

$$\hat{\mathbf{y}}_{k} = \hat{\mathbf{y}}_{k|k-1} + \sum_{j=1}^{n_{k}} \left(\mathbf{H}^{(j)}_{k}\right)^{T}\mathbf{R}^{-1}_{k}\left[\mathbf{z}^{(j)}_{k} - \hat{\mathbf{z}}^{(j)}_{k|k-1} + \mathbf{H}^{(j)}_{k}\hat{\mathbf{x}}_{k|k-1}\right],$$
$$\mathbf{Y}_{k} = \mathbf{Y}_{k|k-1} + \sum_{j=1}^{n_{k}} \left(\mathbf{H}^{(j)}_{k}\right)^{T}\mathbf{R}^{-1}_{k}\mathbf{H}^{(j)}_{k}. \quad (35)$$
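The additive structure of (35) is what makes the variable number of optical flow measurements cheap to handle: each flow pair simply contributes one term to each sum. A hedged NumPy sketch of this update, continuing the prediction sketch above (h, R, and the list of flow measurements are placeholder assumptions, not the authors' code), could look as follows.

```python
import numpy as np

def uif_update(x_pred, P_pred, y_pred, Y_pred, chi_pred, wm, wc, h, R, flows):
    """Sketch of the UIF measurement update, eqs. (31)-(35).

    `flows` is a list of (z_j, d_j) optical flow pairs; its length n_k
    may change from frame to frame without altering the algorithm."""
    P_inv = np.linalg.inv(P_pred)
    R_inv = np.linalg.inv(R)
    dX = chi_pred - x_pred[:, None]

    y, Y = y_pred.copy(), Y_pred.copy()
    for z_j, d_j in flows:  # j = 1, ..., n_k
        # output sigma-points and computed observation, eqs. (31)-(32)
        psi = np.column_stack([h(chi_pred[:, i], d_j)
                               for i in range(chi_pred.shape[1])])
        z_hat = psi @ wm
        # cross-covariance and observation sensitivity, eqs. (33)-(34)
        Pxy = (wc * dX) @ (psi - z_hat[:, None]).T
        H = (P_inv @ Pxy).T
        # additive information contributions, eq. (35)
        y = y + H.T @ R_inv @ (z_j - z_hat + H @ x_pred)
        Y = Y + H.T @ R_inv @ H

    P_post = np.linalg.inv(Y)       # posterior covariance
    return P_post @ y, P_post       # posterior state estimate and covariance
```

Because each flow pair enters only through the two sums, frames with few or many detected features require no structural change to the filter.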

3. Experimental Setup

The research platform used for this study is the West Virginia University (WVU) "Red Phastball" UAV, shown in Figure 2, with a custom GPS/INS data logger mounted inside the aircraft [28, 43]. Some details for this aircraft are provided in Table 1.

Figure 2: Picture of the WVU "Red Phastball" aircraft.

Table 1: Details for the WVU Phastball aircraft.
Property | Value
Length | 2.2 m
Wingspan | 2.4 m
Takeoff mass | 11 kg
Payload mass | 3 kg
Propulsion system | dual 90 mm ducted fan motors
Static thrust | 60 N
Fuel capacity | two 5 Ah LiPo batteries
Cruise speed | 30 m/s
Mission duration | 6 min

The IMU used in this study is an Analog Devices ADIS-16405 MEMS-based IMU, which includes triaxial accelerometers and rate gyroscopes. Each suite of sensors on the IMU is acquired at 18-bit resolution at 50 Hz, over ranges of ±18 g and ±150 deg/s, respectively. The GPS receiver used in the data logger is a Novatel OEM-V1, which was configured to provide Cartesian position and velocity measurements and solution standard deviations at a rate of 50 Hz, with 1.5 m RMS horizontal position accuracy and 0.03 m/s RMS velocity accuracy. An Optic-Logic RS400 laser range finder, pointing downward, was used for range measurement, with an approximate accuracy of 1 m and a range of 366 m. In addition, a high-quality Goodrich mechanical vertical gyroscope is mounted onboard the UAV to provide pitch and roll measurements to be used as sensor fusion "truth" data, with a reported accuracy of within 0.25° of true vertical. The vertical gyroscope measurements were acquired at 16-bit resolution with measurement ranges of ±80 deg for roll and ±60 deg for pitch.

A GoPro Hero video camera is mounted at the center of gravity of the UAV, pointing downward, for flight video collection. The camera was previously calibrated to a focal length of 1141 pixels [29]. Two different sets of flight data were used for this study, each using different camera settings. The first flight used an image size of 1920 × 1080 pixels and a sampling rate of 29.97 Hz; the second flight used an image size of 1280 × 720 pixels and a sampling rate of 59.94 Hz. All the other sensor data were collected at 50 Hz and resampled to the camera time for postflight validation after manual synchronization.
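The resampling step can be illustrated with a short sketch; the channel name, time span, and manual offset below are placeholders rather than values from the flight logs, and linear interpolation is one reasonable choice rather than the documented one.

```python
import numpy as np

# 50 Hz sensor time base and a stand-in logged channel
t_sensor = np.arange(0.0, 300.0, 1.0 / 50.0)
gyro_p = np.sin(0.1 * t_sensor)               # placeholder roll-rate signal

# camera timestamps for Flight 1 (29.97 Hz), shifted by a manually
# identified synchronization offset (placeholder value)
offset = 0.42
t_camera = np.arange(0.0, 299.0, 1.0 / 29.97) + offset

# sensor samples interpolated onto the camera time base
gyro_p_at_camera = np.interp(t_camera, t_sensor, gyro_p)
```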

Figure 3: Flight trajectories for Flight 1 (a) and Flight 2 (b). © 2014 Google.

4. Experimental Results

4.1. Flight Data. Two sets of flight data from the WVU "Red Phastball" aircraft were used in this study; each flight consists of approximately 5 minutes of flight. The top-down flight trajectories from these two data sets are overlaid on a Google Earth image of the flight test location in Figure 3. Six unique markers have been placed in Figure 3 in order to identify specific points along the trajectory; these markers are used in later figures in order to synchronize the presentation of data.

4.2. Selection of Noise Assumptions for Optical Flow Measurements. Since the noise properties of the IMU have been established in previous work [33], only the characteristics of the uncertainty in the laser range and optical flow measurements need to be determined. The uncertainty in the laser range finder measurement is modeled as 1 m zero-mean Gaussian noise, based on the manufacturer's reported accuracy of the sensor. The optical flow errors are more difficult to model; due to this difficulty, different assumptions of the optical flow uncertainty were considered. Using both sets of flight data, the full state UIF was executed for each assumption of optical flow uncertainty. To evaluate the performance of the filter, the speed estimates were compared with reference measurements from GPS, which were mapped into the aircraft frame using roll and pitch measurements from the vertical gyroscope and approximating the yaw with the heading as determined by GPS, as sketched below. The roll and pitch estimates were compared with the measurements from the vertical gyroscope. Due to the possibility of alignment errors, only the standard deviation of error was considered. Each of these errors was calculated for each set of flight data, and the results are offered in Figure 4.
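As a point of reference, mapping a GPS velocity from the local NED frame into the aircraft body frame with roll, pitch, and a heading-based yaw can be sketched as follows; the function and example values are illustrative assumptions, not the authors' code.

```python
import numpy as np

def ned_to_body(v_ned, roll, pitch, yaw):
    """Rotate a NED-frame velocity into body axes using a standard
    3-2-1 (yaw-pitch-roll) Euler rotation; angles in radians."""
    cph, sph = np.cos(roll), np.sin(roll)
    cth, sth = np.cos(pitch), np.sin(pitch)
    cps, sps = np.cos(yaw), np.sin(yaw)
    # body-from-NED direction cosine matrix
    C = np.array([
        [cth * cps,                   cth * sps,                   -sth],
        [sph * sth * cps - cph * sps, sph * sth * sps + cph * cps, sph * cth],
        [cph * sth * cps + sph * sps, cph * sth * sps - sph * cps, cph * cth],
    ])
    return C @ v_ned

# example: yaw approximated by the GPS ground-track heading
v_ned = np.array([28.0, 3.0, -1.5])            # placeholder GPS velocity (m/s)
yaw = np.arctan2(v_ned[1], v_ned[0])           # heading-based yaw approximation
u_ref, v_ref, w_ref = ned_to_body(v_ned, roll=0.05, pitch=0.08, yaw=yaw)
```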

Figure 4 shows how changing the assumption on the optical flow uncertainty affects the estimation performance for the total ground speed, roll angle, and pitch angle. The relatively flat region in Figure 4, for assumed optical flow standard deviations from approximately 3 to 9 pixels, indicates that this formulation is relatively insensitive to the tuning of these optical flow errors. It is also interesting to note in Figure 4 that Flight 1 and Flight 2 reach optimum performance at different values of R. This makes sense, as Flight 2 has twice the frame rate of Flight 1; therefore, its assumed noise characteristics should be one half those of Flight 1. From Figure 4, the optical flow uncertainties were selected to be R = 5² pixels² for Flight 1 and R = 2.5² pixels² for Flight 2.

Figure 4: Comparison of errors for different assumed optical flow uncertainties: standard deviation of error (σ_V in m/s, σ_φ and σ_θ in deg) versus assumed σ_OF (pixels) for Flights 1 and 2.
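In code form, the selected per-measurement covariances follow directly from the assumed pixel standard deviations; the helper below is a sketch of this bookkeeping (the isotropic 2 × 2 form mirrors the equal, uncorrelated component-error assumption stated earlier in the paper).

```python
import numpy as np

def flow_noise_cov(sigma_of_pixels):
    """Covariance R = sigma^2 * I for one optical flow (x, y) pair."""
    return (sigma_of_pixels ** 2) * np.eye(2)

# Flight 2 runs at twice the frame rate of Flight 1, so per-frame flow
# (and its assumed noise) is roughly halved.
R_flight1 = flow_noise_cov(5.0)    # 25 pixels^2 per component
R_flight2 = flow_noise_cov(2.5)    # 6.25 pixels^2 per component
```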

4.3. Full State Formulation Estimation Results. Using each set of flight data, the full state formulation of the UIF was executed. The estimated components of velocity are shown for Flight 1 in Figure 5 and for Flight 2 in Figure 6. These estimates from the UIF are offered with respect to comparable reference values from GPS, which were mapped into the aircraft frame using roll and pitch measurements from the vertical gyroscope and approximating the yaw angle with the heading angle obtained from GPS. From each of these figures, the following observations can be made. The forward velocity u is reasonably captured by the estimation. The lateral velocity v and vertical velocity w, however, demonstrate somewhat poor results. This makes sense, as the primary direction of flight is forward, resulting in good observability characteristics in the optical flow in the forward direction, while the signal-to-noise ratio (SNR) for the lateral and vertical directions remains small for most typical flight conditions. However, since these lateral and vertical components are only a small portion of the total velocity, the total speed can be reasonably approximated by this technique. The total speed estimates are shown in Figure 7 for Flight 1 with the GPS reference.

Figure 5: Estimated velocity components (u, v, w in m/s) versus time for Flight 1, UIF compared with the GPS reference.

Figure 6: Estimated velocity components (u, v, w in m/s) versus time for Flight 2, UIF compared with the GPS reference.

Figure 7: Estimated total ground speed (m/s) versus time for Flight 1, UIF compared with the GPS reference.

Figure 8: Roll (φ) and pitch (θ) estimation results (deg) versus time for Flight 2, UIF compared with the vertical gyroscope (VG).

The attitude estimates for the roll and pitch angles are compared with the vertical gyroscope measurements as a reference, as shown in Figure 8. In order to demonstrate the effectiveness of this method in regulating the drift in attitude estimates that occurs with dead reckoning, the estimation errors from the UIF are compared with the errors obtained from dead reckoning attitude estimation (sketched below). These roll and pitch errors are offered in Figure 9 for Flight 2, which demonstrates the effectiveness of the UIF in regulating the attitude errors from dead reckoning.
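The dead reckoning baseline here amounts to integrating the measured angular rates through the roll and pitch kinematic equations with no vision correction, so its errors grow with time. A minimal sketch, assuming forward-Euler integration and placeholder rate inputs:

```python
import numpy as np

def dead_reckon_attitude(p, q, r, phi0, theta0, dt):
    """Propagate roll/pitch by integrating the Euler kinematics
    (phi_dot = p + q sin(phi) tan(theta) + r cos(phi) tan(theta),
     theta_dot = q cos(phi) - r sin(phi)) with no corrections."""
    phi, theta = phi0, theta0
    history = []
    for pk, qk, rk in zip(p, q, r):
        phi_dot = pk + qk * np.sin(phi) * np.tan(theta) + rk * np.cos(phi) * np.tan(theta)
        theta_dot = qk * np.cos(phi) - rk * np.sin(phi)
        phi += phi_dot * dt
        theta += theta_dot * dt
        history.append((phi, theta))
    return np.array(history)

# tiny synthetic example: small constant rate biases drift the solution
att = dead_reckon_attitude(p=np.full(500, 1e-3), q=np.full(500, 2e-3),
                           r=np.zeros(500), phi0=0.0, theta0=0.0, dt=0.02)
```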

In order to quantify the estimation results, the mean absolute error and standard deviation of error of the estimates are calculated for the velocity components with respect to the GPS reference, and for the roll and pitch angles with respect to the vertical gyroscope reference. These statistical results are provided in Table 2 for Flight 1 and Table 3 for Flight 2, where V is the total speed as determined by
$$V = \sqrt{u^{2} + v^{2} + w^{2}}. \quad (36)$$

Figure 9: Roll and pitch estimation errors (Δφ, Δθ in deg) as compared to dead reckoning (DR) for Flight 2.

Table 2: Flight 1 error statistics for estimated states.
Estimated state | Mean abs. | Standard deviation | Units
u | 1.0644 | 1.2667 | m/s
v | 2.4699 | 2.7719 | m/s
w | 2.1554 | 1.6554 | m/s
V | 1.2466 | 1.2858 | m/s
φ | 1.1553 | 1.3668 | deg
θ | 2.1752 | 1.3339 | deg
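The table entries are plain sample statistics of the estimate-minus-reference time histories; a minimal sketch with placeholder arrays (not flight data) is given below.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
# placeholder body-frame velocity estimates and a reference (m/s)
u_est = rng.normal(28.0, 1.0, n)
v_est = rng.normal(0.0, 2.0, n)
w_est = rng.normal(0.0, 1.5, n)
u_ref = np.full(n, 28.0)

V_est = np.sqrt(u_est**2 + v_est**2 + w_est**2)  # total speed, eq. (36)

err = u_est - u_ref
mean_abs = np.mean(np.abs(err))   # "Mean abs." column
std_dev = np.std(err)             # "Standard deviation" column
```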

It is shown in Tables 2 and 3 that reasonable errors are obtained in both sets of flight data for the velocity and attitude of the aircraft. Larger errors are noted in particular for the lateral velocity state v, which is due to observability issues in the optical flow. Note that mean errors in the roll and pitch estimation could be due to misalignment between the vertical gyroscope, IMU, and video camera. The attitude estimation accuracy reported in Tables 2 and 3 is similar to the reported accuracy of loosely coupled GPS/INS attitude estimation using similar flight data [43].

4.4. Simplified Formulation Estimation Results. Since it was observed in the full state formulation results that the lateral and vertical velocity estimates were small, the simplified formulation was implemented in order to investigate the feasibility of a version of the filter that estimates only the forward velocity component and assumes that the lateral and vertical components are zero. The forward velocity u for Flight 1 is offered in Figure 10, while the roll and pitch errors with respect to the vertical gyroscope measurement are offered in Figure 11 for the UIF and dead reckoning (DR). Additionally, the mean absolute error and standard deviation of error for these terms are provided in Table 4 for Flight 1 and Table 5 for Flight 2.

Figure 10: Simplified formulation speed estimation results (u in m/s) for Flight 1, UIF compared with the GPS reference.

Figure 11: Simplified formulation attitude estimation errors (Δφ, Δθ in deg) with dead reckoning (DR) for Flight 2.

Table 3: Flight 2 error statistics for estimated states.
Estimated state | Mean abs. | Standard deviation | Units
u | 1.1299 | 1.2663 | m/s
v | 1.5184 | 1.8649 | m/s
w | 1.1324 | 1.3453 | m/s
V | 1.2087 | 1.2535 | m/s
φ | 1.8818 | 1.2782 | deg
θ | 1.6646 | 1.1049 | deg

Table 4: Simplified formulation error statistics for Flight 1.
Estimated state | Mean abs. | Standard deviation | Units
u | 1.2283 | 1.5730 | m/s
φ | 1.9901 | 2.2922 | deg
θ | 1.9002 | 2.1881 | deg

Table 5: Simplified formulation error statistics for Flight 2.
Estimated state | Mean abs. | Standard deviation | Units
u | 1.3073 | 1.5775 | m/s
φ | 2.8402 | 3.5426 | deg
θ | 2.0286 | 2.4691 | deg

It is shown in Tables 4 and 5 that the simplified formulation results in significantly higher attitude estimation errors with respect to the full state formulation. These increased attitude errors are likely due to the assumption that the lateral and vertical velocity components are zero. To investigate this possible correlation, the roll and pitch errors are shown in Figure 12 together with the magnitude of the lateral and vertical velocity, √(v² + w²), as determined from GPS, for a 50-second segment of flight data which includes takeoff. Figure 12 shows that there is some correlation between the attitude estimation errors and the lateral and vertical velocity, though this is not the only source of error for these estimates.

Figure 12: Comparison of attitude estimation errors (Δφ, Δθ in deg) with respect to the lateral and vertical velocity magnitude √(v² + w²) (m/s).

4.5. Results Using Range State. The results for each flight, for both the full state formulation and the simplified formulation, were recalculated with the addition of the range state. The statistical results for these tests are offered in Tables 6–9.

Table 6: Flight 1 error statistics for estimated states with range state.
Estimated state | Mean abs. | Standard deviation | Units
u | 1.1330 | 1.3699 | m/s
v | 2.4387 | 2.7369 | m/s
w | 2.1210 | 1.5998 | m/s
V | 1.3041 | 1.3852 | m/s
φ | 1.1599 | 1.4084 | deg
θ | 2.1767 | 1.4064 | deg

Table 7: Flight 2 error statistics for estimated states with range state.
Estimated state | Mean abs. | Standard deviation | Units
u | 1.1444 | 1.2937 | m/s
v | 1.5122 | 1.8529 | m/s
w | 1.1154 | 1.3268 | m/s
V | 1.2112 | 1.2761 | m/s
φ | 1.8897 | 1.2845 | deg
θ | 1.6609 | 1.1152 | deg

Table 8: Simplified formulation error statistics for Flight 1 with range state.
Estimated state | Mean abs. | Standard deviation | Units
u | 1.2919 | 1.6256 | m/s
φ | 2.0621 | 2.3746 | deg
θ | 1.9277 | 2.2713 | deg

Table 9: Simplified formulation error statistics for Flight 2 with range state.
Estimated state | Mean abs. | Standard deviation | Units
u | 1.3356 | 1.6016 | m/s
φ | 2.8748 | 3.5462 | deg
θ | 2.0303 | 2.4876 | deg

In order to compare the results from the different cases, the standard deviation of error is shown graphically for Flight 1 in Figure 13 and for Flight 2 in Figure 14. It is shown in Figures 13 and 14 that the simplified formulation offers poorer estimation performance, as expected, particularly for the attitude estimates. The addition of the range state does not affect the performance significantly.

5. Conclusions

This paper presented vision-aided inertial navigation techniques, which do not rely upon GPS, using UAV flight data. Two different formulations were presented: a full state estimation formulation, which captures the aircraft ground velocity vector and attitude, and a simplified formulation, which assumes all of the aircraft velocity is in the forward direction. Both formulations were shown to be effective in regulating the INS drift. Additionally, a state was included in each formulation in order to estimate the distance between the image center and the aircraft. The full state formulation was shown to be effective in estimating aircraft ground velocity to within 1.3 m/s and regulating attitude angles to within 1.4 degrees standard deviation of error for both sets of flight data.

Figure 13: Graphical comparison of standard deviation of error (m/s or deg) for u, roll, and pitch for Flight 1 (full, full with range, simplified, simplified with range).

Figure 14: Graphical comparison of standard deviation of error (m/s or deg) for u, roll, and pitch for Flight 2 (full, full with range, simplified, simplified with range).

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.


Acknowledgments

This work was supported in part by NASA Grant no. NNX12AM56A, a Kansas NASA EPSCoR PDG grant, and an SRI grant.

References

[1] C. Li, L. Shen, H.-B. Wang, and T. Lei, "The research on unmanned aerial vehicle remote sensing and its applications," in Proceedings of the IEEE International Conference on Advanced Computer Control (ICACC '10), pp. 644–647, Shenyang, China, March 2010.
[2] A. J. Calise and R. T. Rysdyk, "Nonlinear adaptive flight control using neural networks," IEEE Control Systems Magazine, vol. 18, no. 6, pp. 14–24, 1998.
[3] M. S. Grewal, L. R. Weill, and A. P. Andrew, Global Positioning, Inertial Navigation & Integration, Wiley, New York, NY, USA, 2nd edition, 2007.
[4] J. L. Crassidis, "Sigma-point Kalman filtering for integrated GPS and inertial navigation," in Proceedings of the AIAA Guidance, Navigation, and Control Conference, pp. 1981–2004, San Francisco, Calif, USA, August 2005.
[5] J. N. Gross, Y. Gu, M. B. Rhudy, S. Gururajan, and M. R. Napolitano, "Flight-test evaluation of sensor fusion algorithms for attitude estimation," IEEE Transactions on Aerospace and Electronic Systems, vol. 48, no. 3, pp. 2128–2139, 2012.
[6] M. V. Srinivasan, "Honeybees as a model for the study of visually guided flight, navigation, and biologically inspired robotics," Physiological Reviews, vol. 91, no. 2, pp. 413–460, 2011.
[7] P. S. Bhagavatula, C. Claudianos, M. R. Ibbotson, and M. V. Srinivasan, "Optic flow cues guide flight in birds," Current Biology, vol. 21, no. 21, pp. 1794–1799, 2011.
[8] N. Franceschini, "Visual guidance based on optic flow: a biorobotic approach," Journal of Physiology Paris, vol. 98, no. 1–3, pp. 281–292, 2004.
[9] P. Corke, J. Lobo, and J. Dias, "An introduction to inertial and visual sensing," The International Journal of Robotics Research, vol. 26, no. 6, pp. 519–535, 2007.
[10] D. G. Lowe, "Distinctive image features from scale-invariant keypoints," International Journal of Computer Vision, vol. 60, no. 2, pp. 91–110, 2004.
[11] M. V. Srinivasan, S. Thurrowgood, and D. Soccol, "Competent vision and navigation systems," IEEE Robotics & Automation Magazine, vol. 16, no. 3, pp. 59–71, 2009.
[12] J. Conroy, G. Gremillion, B. Ranganathan, and J. S. Humbert, "Implementation of wide-field integration of optic flow for autonomous quadrotor navigation," Autonomous Robots, vol. 27, no. 3, pp. 189–198, 2009.
[13] F. Kendoul, I. Fantoni, and K. Nonami, "Optic flow-based vision system for autonomous 3D localization and control of small aerial vehicles," Robotics and Autonomous Systems, vol. 57, no. 6-7, pp. 591–602, 2009.
[14] A. Beyeler, J.-C. Zufferey, and D. Floreano, "optiPilot: control of take-off and landing using optic flow," in Proceedings of the European Micro Air Vehicle Conference and Competition (EMAV '09), Delft, The Netherlands, September 2009.
[15] M. M. Veth and J. Raquet, "Fusion of low-cost imaging and inertial sensors for navigation," in Proceedings of the 19th International Technical Meeting of the Satellite Division, Institute of Navigation (ION GNSS '06), pp. 1093–1103, Fort Worth, Tex, USA, September 2006.
[16] D. Dusha, W. Boles, and R. Walker, "Fixed-wing attitude estimation using computer vision based horizon detection," in Proceedings of the International Unmanned Air Vehicle Systems Conference, April 2007.
[17] D. Dusha, W. Boles, and R. Walker, "Attitude estimation for a fixed-wing aircraft using horizon detection and optical flow," in Proceedings of the 9th Biennial Conference of the Australian Pattern Recognition Society on Digital Image Computing Techniques and Applications (DICTA '07), pp. 485–492, IEEE, Glenelg, Australia, December 2007.
[18] S. Weiss, M. W. Achtelik, S. Lynen, M. Chli, and R. Siegwart, "Real-time onboard visual-inertial state estimation and self-calibration of MAVs in unknown environments," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '12), May 2012.
[19] M. Dille, B. Grocholsky, and S. Singh, "Outdoor downward-facing optical flow odometry with commodity sensors," in Field and Service Robotics: Results of the 7th International Conference, vol. 62 of Springer Tracts in Advanced Robotics, pp. 183–193, 2010.
[20] S. I. Roumeliotis, A. E. Johnson, and J. F. Montgomery, "Augmenting inertial navigation with image-based motion estimation," in Proceedings of the IEEE International Conference on Robotics & Automation, pp. 4326–4333, Washington, DC, USA, May 2002.
[21] M. Achtelik, M. Achtelik, S. Weiss, and R. Siegwart, "Onboard IMU and monocular vision based control for MAVs in unknown in- and outdoor environments," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '11), pp. 3056–3063, Shanghai, China, May 2011.
[22] P.-J. Bristeau, F. Callou, D. Vissiere, and N. Petit, "The navigation and control technology inside the AR.Drone micro UAV," in Proceedings of the 18th IFAC World Congress, pp. 1477–1484, Milano, Italy, September 2011.
[23] H. Chao, Y. Gu, and M. Napolitano, "A survey of optical flow techniques for robotics navigation applications," Journal of Intelligent and Robotic Systems: Theory and Applications, vol. 73, no. 1–4, pp. 361–372, 2014.
[24] W. Ding, J. Wang, S. Han et al., "Adding optical flow into the GPS/INS integration for UAV navigation," in Proceedings of the International Global Navigation Satellite Systems Society IGNSS Symposium, Surfers Paradise, Australia, 2009.
[25] H. Romero, S. Salazar, and R. Lozano, "Real-time stabilization of an eight-rotor UAV using optical flow," IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, vol. 42, no. 6, pp. 1752–1762, 2012.
[26] V. Grabe, H. H. Bulthoff, and P. R. Giordano, "A comparison of scale estimation schemes for a quadrotor UAV based on optical flow and IMU measurements," in Proceedings of the 26th IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '13), pp. 5193–5200, IEEE, Tokyo, Japan, November 2013.
[27] A. Martinelli, "Vision and IMU data fusion: closed-form solutions for attitude, speed, absolute scale, and bias determination," IEEE Transactions on Robotics, vol. 28, no. 1, pp. 44–60, 2012.
[28] M. B. Rhudy, H. Chao, and Y. Gu, "Wide-field optical flow aided inertial navigation for unmanned aerial vehicles," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '14), pp. 674–679, Chicago, Ill, USA, September 2014.
[29] H. Chao, Y. Gu, J. Gross, G. Guo, M. L. Fravolini, and M. R. Napolitano, "A comparative study of optical flow and traditional sensors in UAV navigation," in Proceedings of the 1st American Control Conference (ACC '13), pp. 3858–3863, Washington, DC, USA, June 2013.
[30] R. C. Hibbeler, Engineering Mechanics: Dynamics, Prentice Hall, 10th edition, 2004.
[31] V. Klein and E. A. Morelli, Aircraft System Identification: Theory and Practice, American Institute of Aeronautics and Astronautics, Reston, Va, USA, 2006.
[32] J. Roskam, Airplane Flight Dynamics and Automatic Flight Controls, DARcorporation, Lawrence, Kan, USA, 2003.
[33] J. N. Gross, Y. Gu, M. Rhudy, F. J. Barchesky, and M. R. Napolitano, "On-line modeling and calibration of low-cost navigation sensors," in Proceedings of the AIAA Modeling and Simulation Technologies Conference, pp. 298–311, Portland, Ore, USA, August 2011.
[34] D. W. Allan, "Statistics of atomic frequency standards," Proceedings of the IEEE, vol. 54, no. 2, pp. 221–230, 1966.
[35] X. Zhiqiang and D. Gebre-Egziabher, "Modeling and bounding low cost inertial sensor errors," in Proceedings of the IEEE/ION Position, Location and Navigation Symposium (PLANS '08), pp. 1122–1132, IEEE, Monterey, Calif, USA, May 2008.
[36] Z. Xing, Over-bounding integrated INS/GNSS output errors [Ph.D. dissertation], The University of Minnesota, Minneapolis, Minn, USA, 2010.
[37] F. L. Lewis and V. L. Syrmos, Optimal Control, Wiley, New York, NY, USA, 2nd edition, 1995.
[38] M. Mammarella, G. Campa, M. L. Fravolini, Y. Gu, B. Seanor, and M. R. Napolitano, "A comparison of optical flow algorithms for real time aircraft guidance and navigation," in Proceedings of the AIAA Guidance, Navigation and Control Conference and Exhibit, Honolulu, Hawaii, USA, August 2008.
[39] A. G. O. Mutambara, Decentralized Estimation and Control for Multisensor Systems, CRC Press, Washington, DC, USA, 1998.
[40] T. Vercauteren and X. Wang, "Decentralized sigma-point information filters for target tracking in collaborative sensor networks," IEEE Transactions on Signal Processing, vol. 53, no. 8, part 2, pp. 2997–3009, 2005.
[41] D.-J. Lee, "Unscented information filtering for distributed estimation and multiple sensor fusion," in Proceedings of the AIAA Guidance, Navigation and Control Conference and Exhibit, Honolulu, Hawaii, USA, 2008.
[42] M. Rhudy and Y. Gu, "Understanding nonlinear Kalman filters," in Interactive Robotics Letters, West Virginia University, 2013, http://www2.statler.wvu.edu/~irl/page13.html.
[43] M. Rhudy, J. Gross, Y. Gu, and M. R. Napolitano, "Fusion of GPS and redundant IMU data for attitude estimation," in Proceedings of the AIAA Guidance, Navigation, and Control Conference, Minneapolis, Minn, USA, August 2012.


Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporation httpwwwhindawicom

Journal ofEngineeringVolume 2014

Submit your manuscripts athttpwwwhindawicom

VLSI Design

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Shock and Vibration

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Civil EngineeringAdvances in

Acoustics and VibrationAdvances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Electrical and Computer Engineering

Journal of

Advances inOptoElectronics

Hindawi Publishing Corporation httpwwwhindawicom

Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

SensorsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Chemical EngineeringInternational Journal of Antennas and

Propagation

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Navigation and Observation

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

DistributedSensor Networks

International Journal of

Page 5: Research Article Unmanned Aerial Vehicle Navigation Using Wide …downloads.hindawi.com/journals/jr/2015/251379.pdf · 2019-07-31 · Research Article Unmanned Aerial Vehicle Navigation


Simplifying this relationship leads to

$$\dot{L} = \dot{\phi} L \tan\phi + \dot{\theta} L \tan\theta + u \sec\phi \tan\theta - v \tan\phi - w. \tag{21}$$

Substituting in the dynamics for the roll and pitch angles and simplifying leads to the following expression for the range state dynamics:

$$\dot{L} = u \sec\phi \tan\theta - v \tan\phi - w + L \left[ p \tan\phi + q \sec\phi \tan\theta \right]. \tag{22}$$

Note that for level conditions, that is, roll and pitch angles of zero, the equation reduces to

$$\dot{L} = -w, \tag{23}$$

which agrees with physical intuition. In order to implement the range state in the simplified formulation, the following expression can be used:

$$\dot{L} = u \sec\phi \tan\theta + L \left[ p \tan\phi + q \sec\phi \tan\theta \right]. \tag{24}$$
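As a quick numerical illustration, a minimal Python sketch of (22)-(24) follows; the function names are illustrative assumptions, and the assertion simply checks the level-flight limit (23).

```python
import numpy as np

def range_rate_full(u, v, w, p, q, phi, theta, L):
    """Range-state dynamics of (22), full formulation."""
    sec_phi = 1.0 / np.cos(phi)
    return (u * sec_phi * np.tan(theta) - v * np.tan(phi) - w
            + L * (p * np.tan(phi) + q * sec_phi * np.tan(theta)))

def range_rate_simplified(u, p, q, phi, theta, L):
    """Range-state dynamics of (24), lateral/vertical velocity assumed zero."""
    sec_phi = 1.0 / np.cos(phi)
    return (u * sec_phi * np.tan(theta)
            + L * (p * np.tan(phi) + q * sec_phi * np.tan(theta)))

# Sanity check of (23): in level flight (phi = theta = 0, p = q = 0),
# the range rate reduces to -w.
assert np.isclose(range_rate_full(u=30.0, v=0.5, w=1.2, p=0.0, q=0.0,
                                  phi=0.0, theta=0.0, L=100.0), -1.2)
```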

2.5. Information Fusion Algorithm. Due to the nonlinearity, the nonadditive noise, and the number of optical flow measurements (ranging from 0 to 300 per frame, with a mean of 250), the Unscented Information Filter (UIF) [39-41] was selected for the implementation of this algorithm [42]. The advantage of the information filtering framework over Kalman filtering is that redundant information vectors are additive [39-41]; therefore, the time-varying number of outputs obtained from optical flow can easily be handled with relatively low computation, since the coupling between the errors in different optical flow measurements is neglected. The UIF algorithm is summarized as follows [41].

Consider a discrete-time nonlinear dynamic system of the form

$$\mathbf{x}_k = \mathbf{f}\left(\mathbf{x}_{k-1}, \mathbf{u}_{k-1}\right) + \mathbf{w}_{k-1} \tag{25}$$

with measurement equations of the form

$$\mathbf{z}_k = \mathbf{h}\left(\mathbf{x}_k, \mathbf{d}_k\right) + \mathbf{v}_k, \tag{26}$$

where $\mathbf{h}$ is the observation function and $\mathbf{w}$ and $\mathbf{v}$ are the zero-mean Gaussian process and measurement noise vectors. At each time step, sigma-points are generated from the prior distribution using

$$\boldsymbol{\chi}_{k-1} = \left[ \mathbf{x}_{k-1} \quad \mathbf{x}_{k-1} + \sqrt{N+\lambda}\,\sqrt{\mathbf{P}_{k-1}} \quad \mathbf{x}_{k-1} - \sqrt{N+\lambda}\,\sqrt{\mathbf{P}_{k-1}} \right], \tag{27}$$

where $N$ is the total number of states and $\lambda$ is a scaling parameter [42]. Now the sigma-points are predicted using

$$\boldsymbol{\chi}^{(i)}_{k|k-1} = \mathbf{f}\left(\boldsymbol{\chi}^{(i)}_{k-1}, \mathbf{u}_{k-1}\right), \quad i = 0, 1, \ldots, 2N, \tag{28}$$

where $(i)$ denotes the $i$th column of a matrix. The a priori statistics are then recovered:

$$\mathbf{x}_{k|k-1} = \sum_{i=0}^{2N} \eta^m_i \boldsymbol{\chi}^{(i)}_{k|k-1},$$
$$\mathbf{P}_{k|k-1} = \mathbf{Q}_{k-1} + \sum_{i=0}^{2N} \eta^c_i \left(\boldsymbol{\chi}^{(i)}_{k|k-1} - \mathbf{x}_{k|k-1}\right)\left(\boldsymbol{\chi}^{(i)}_{k|k-1} - \mathbf{x}_{k|k-1}\right)^T, \tag{29}$$

where $\mathbf{Q}$ is the process noise covariance matrix and $\eta^m_i$ and $\eta^c_i$ are weight vectors [42]. Using these predicted values, the information vector $\mathbf{y}$ and matrix $\mathbf{Y}$ are determined:

$$\mathbf{y}_{k|k-1} = \mathbf{P}^{-1}_{k|k-1}\mathbf{x}_{k|k-1}, \qquad \mathbf{Y}_{k|k-1} = \mathbf{P}^{-1}_{k|k-1}. \tag{30}$$

For each measurement, that is, each optical flow pair, the output equations are evaluated for each sigma-point, as in

$$\boldsymbol{\psi}^{(i,j)}_{k|k-1} = \mathbf{h}\left(\boldsymbol{\chi}^{(i,j)}_{k|k-1}, \mathbf{d}_k\right), \quad i = 0, 1, \ldots, 2N, \; j = 1, \ldots, n_k, \tag{31}$$

where $\boldsymbol{\psi}$ denotes an output sigma-point and the superscript $(i, j)$ denotes the $i$th sigma-point and the $j$th measurement. The computed observation is then recovered using

$$\mathbf{z}^{(j)}_{k|k-1} = \sum_{i=0}^{2N} \eta^m_i \boldsymbol{\psi}^{(i,j)}_{k|k-1}, \quad j = 1, \ldots, n_k. \tag{32}$$

Using the computed observation, the cross-covariance is calculated:

$$\mathbf{P}^{xy(j)}_{k|k-1} = \sum_{i=0}^{2N} \eta^c_i \left(\boldsymbol{\chi}^{(i)}_{k|k-1} - \mathbf{x}_{k|k-1}\right)\left(\boldsymbol{\psi}^{(i,j)}_{k|k-1} - \mathbf{z}^{(j)}_{k|k-1}\right)^T. \tag{33}$$

Then the observation sensitivity matrix $\mathbf{H}$ is determined:

$$\mathbf{H}^{(j)}_k = \left[\mathbf{P}^{-1}_{k|k-1}\mathbf{P}^{xy(j)}_{k|k-1}\right]^T, \quad j = 1, \ldots, n_k. \tag{34}$$

The information contributions can then be calculated:

$$\mathbf{y}_k = \mathbf{y}_{k|k-1} + \sum_{j=1}^{n_k} \left(\mathbf{H}^{(j)}_k\right)^T \mathbf{R}^{-1}_k \left[\mathbf{z}^{(j)}_k - \mathbf{z}^{(j)}_{k|k-1} + \mathbf{H}^{(j)}_k \mathbf{x}_{k|k-1}\right],$$
$$\mathbf{Y}_k = \mathbf{Y}_{k|k-1} + \sum_{j=1}^{n_k} \left(\mathbf{H}^{(j)}_k\right)^T \mathbf{R}^{-1}_k \mathbf{H}^{(j)}_k. \tag{35}$$
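To make the filter mechanics concrete, the following is a minimal Python/NumPy sketch of one UIF predict/update cycle implementing (25)-(35). The simple symmetric sigma-point weights, the Cholesky matrix square root, the scaling parameter `lam`, and the placeholder model functions `f` and `h` are illustrative assumptions; the weights and models actually used in this work follow [41, 42].

```python
import numpy as np

def uif_step(f, h, x, P, u_in, d_list, z_list, Q, R, lam=1.0):
    """One predict/update cycle of an Unscented Information Filter,
    following (25)-(35): f is the state transition function, h the
    per-measurement output function, and d_list/z_list hold the n_k
    optical-flow measurement parameters and measured flow vectors."""
    N = x.size
    # Simple symmetric sigma-point weights (an assumed choice).
    wm = np.full(2 * N + 1, 1.0 / (2.0 * (N + lam)))
    wm[0] = lam / (N + lam)
    Wc = np.diag(wm)

    # Sigma points from the prior, (27), via a Cholesky square root.
    S = np.linalg.cholesky((N + lam) * P)
    chi = np.column_stack([x, x[:, None] + S, x[:, None] - S])

    # Prediction of sigma points and a priori statistics, (28)-(29).
    chi_pred = np.column_stack([f(chi[:, i], u_in) for i in range(2 * N + 1)])
    x_pred = chi_pred @ wm
    dX = chi_pred - x_pred[:, None]
    P_pred = Q + dX @ Wc @ dX.T

    # A priori information vector and matrix, (30).
    Y = np.linalg.inv(P_pred)
    y = Y @ x_pred

    # Additive information contribution of each optical flow pair, (31)-(35).
    Ri = np.linalg.inv(R)
    for d, z in zip(d_list, z_list):
        psi = np.column_stack([h(chi_pred[:, i], d) for i in range(2 * N + 1)])
        z_pred = psi @ wm                              # (32)
        Pxy = dX @ Wc @ (psi - z_pred[:, None]).T      # (33)
        H = (Y @ Pxy).T                                # (34)
        y = y + H.T @ Ri @ (z - z_pred + H @ x_pred)   # (35)
        Y = Y + H.T @ Ri @ H

    # Recover the posterior state estimate and covariance.
    P_post = np.linalg.inv(Y)
    return P_post @ y, P_post
```

Because the measurement information enters as a sum in (35), the loop simply accumulates one term per optical flow pair, which is how the time-varying number of flow vectors (0 to 300 per frame) is handled.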

3. Experimental Setup

The research platform used for this study is the West Virginia University (WVU) "Red Phastball" UAV, shown in Figure 2, with a custom GPS/INS data logger mounted inside the aircraft [28, 43]. Some details for this aircraft are provided in Table 1.

Figure 2: Picture of the WVU "Red Phastball" aircraft.

Table 1: Details for the WVU Phastball aircraft.
Property | Value
Length | 2.2 m
Wingspan | 2.4 m
Takeoff mass | 11 kg
Payload mass | 3 kg
Propulsion system | Dual 90 mm ducted fan motors
Static thrust | 60 N
Fuel capacity | Two 5 Ah LiPo batteries
Cruise speed | 30 m/s
Mission duration | 6 min

The IMU used in this study is an Analog Devices ADIS-16405 MEMS-based IMU, which includes triaxial accelerometers and rate gyroscopes. Each suite of sensors on the IMU is acquired at 18-bit resolution at 50 Hz, over ranges of ±18 g's and ±150 deg/s, respectively. The GPS receiver used in the data logger is a Novatel OEM-V1, which was configured to provide Cartesian position and velocity measurements and solution standard deviations at a rate of 50 Hz, with 1.5 m RMS horizontal position accuracy and 0.03 m/s RMS velocity accuracy. An Opti-Logic RS400 laser rangefinder, pointing downward, was used for range measurement, with an approximate accuracy of 1 m and a range of 366 m. In addition, a high-quality Goodrich mechanical vertical gyroscope is mounted onboard the UAV to provide pitch and roll measurements to be used as sensor fusion "truth" data, with reported accuracy of within 0.25° of true vertical. The vertical gyroscope measurements were acquired at 16-bit resolution, with measurement ranges of ±80 deg for roll and ±60 deg for pitch.

A GoPro Hero video camera is mounted at the center of gravity of the UAV, pointing downward, for flight video collection. The camera was previously calibrated to a focal length of 1141 pixels [29]. Two different sets of flight data were used for this study, each using different camera settings. The first flight used an image size of 1920 × 1080 pixels and a sampling rate of 29.97 Hz. The second flight used an image size of 1280 × 720 pixels and a sampling rate of 59.94 Hz. All the other sensor data were collected at 50 Hz and resampled to the camera time for postflight validation after manual synchronization.

Figure 3: Flight trajectories for Flight 1 (a) and Flight 2 (b). © 2014 Google.

4. Experimental Results

4.1. Flight Data. Two sets of flight data from the WVU "Red Phastball" aircraft were used in this study. Each flight consists of approximately 5 minutes of flight. The top-down flight trajectories from these two data sets are overlaid on a Google Earth image of the flight test location in Figure 3. Six unique markers have been placed in Figure 3 in order to identify specific points along the trajectory. These markers will be used in subsequent figures in order to synchronize the presentation of data.

4.2. Selection of Noise Assumptions for Optical Flow Measurements. Since the noise properties of the IMU have been established in previous work [33], only the characteristics of the uncertainty in the laser range and optical flow measurements need to be determined. The uncertainty in the laser rangefinder measurement is modeled as 1 m zero-mean Gaussian noise, based on the manufacturer's reported accuracy of the sensor. The optical flow errors are more difficult to model. Due to this difficulty, different assumptions of the optical flow uncertainty were considered. Using both sets of flight data, the full state UIF was executed for each assumption of optical flow uncertainty. To evaluate the performance of the filter, the speed estimates were compared with reference measurements from GPS, which were mapped into the aircraft frame using roll and pitch measurements from the vertical gyroscope and approximating the yaw with the heading as determined by GPS. The roll and pitch estimates were compared with the measurements from the vertical gyroscope. Due to the possibility of alignment errors, only the standard deviation of error was considered. Each of these errors was calculated for each set of flight data, and the results are offered in Figure 4.
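For reference, the mapping of the GPS velocity into the aircraft frame can be sketched as below, assuming the standard 3-2-1 (yaw-pitch-roll) Euler rotation from the local NED frame to body axes [30-32]; the function name and interface are illustrative, not the paper's code.

```python
import numpy as np

def gps_velocity_to_body(v_ned, phi, theta, psi):
    """Map a GPS NED velocity vector into the aircraft body frame using
    the 3-2-1 (yaw-pitch-roll) Euler sequence; phi/theta come from the
    vertical gyroscope and psi is approximated by the GPS heading."""
    cphi, sphi = np.cos(phi), np.sin(phi)
    cth, sth = np.cos(theta), np.sin(theta)
    cpsi, spsi = np.cos(psi), np.sin(psi)
    # Direction cosine matrix from NED axes to body axes.
    C = np.array([
        [cth * cpsi,                      cth * spsi,                      -sth],
        [sphi * sth * cpsi - cphi * spsi, sphi * sth * spsi + cphi * cpsi, sphi * cth],
        [cphi * sth * cpsi + sphi * spsi, cphi * sth * spsi - sphi * cpsi, cphi * cth],
    ])
    return C @ np.asarray(v_ned)  # body-frame reference [u, v, w]
```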

Figure 4 shows how changing the assumption on the optical flow uncertainty affects the estimation performance for the total ground speed, roll angle, and pitch angle.

Figure 4: Comparison of errors for different assumed optical flow uncertainties (standard deviation of error in V (m/s), ϕ (deg), and θ (deg) for Flights 1 and 2, versus the assumed σ_OF in pixels).

The relatively flat region in Figure 4, for assumed optical flow standard deviations from approximately 3 to 9 pixels, indicates that this formulation is relatively insensitive to the tuning of these optical flow errors. It is also interesting to note in Figure 4 that Flight 1 and Flight 2 have optimum performance at different values of R. This, however, makes sense, as Flight 2 has twice the frame rate of Flight 1; therefore, the assumed noise characteristics should be one half those of Flight 1. From Figure 4, the optical flow uncertainties were selected to be R = 5² pixels² for Flight 1 and R = 2.5² pixels² for Flight 2.
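A minimal illustration of this tuning rule follows, under the assumption stated above that the per-frame pixel noise scales inversely with frame rate; the helper name and baseline values simply encode the selections made here.

```python
def optical_flow_noise_std(base_std_px=5.0, base_rate_hz=29.97, rate_hz=29.97):
    """Scale the assumed optical-flow noise standard deviation inversely
    with camera frame rate, as motivated by Figure 4."""
    return base_std_px * base_rate_hz / rate_hz

R_flight1 = optical_flow_noise_std(rate_hz=29.97) ** 2  # 5^2 pixels^2
R_flight2 = optical_flow_noise_std(rate_hz=59.94) ** 2  # 2.5^2 pixels^2
```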

4.3. Full State Formulation Estimation Results. Using each set of flight data, the full state formulation using the UIF was executed. The estimated components of velocity are shown for Flight 1 in Figure 5 and for Flight 2 in Figure 6. These estimates from the UIF are offered with respect to comparable reference values from GPS, which were mapped into the aircraft frame using roll and pitch measurements from the vertical gyroscope and approximating the yaw angle with the heading angle obtained from GPS. From each of these figures, the following observations can be made. The forward velocity u is reasonably captured by the estimation. The lateral velocity v and vertical velocity w, however, demonstrate somewhat poor results. This does, however, make sense: the primary direction of flight is forward, resulting in good observability characteristics in the optical flow in the forward direction, while the signal-to-noise ratio (SNR) for the lateral and vertical directions remains small for most typical flight conditions. However, since these lateral and vertical components are only a small portion of the total velocity, the total speed can be reasonably approximated by this technique. The total speed estimates are shown in Figure 7 for Flight 1 with the GPS reference.

Figure 5: Estimated velocity components (u, v, w) for Flight 1, UIF versus GPS reference.

Figure 6: Estimated velocity components (u, v, w) for Flight 2, UIF versus GPS reference.


Figure 7: Estimated total ground speed for Flight 1, UIF versus GPS reference.

Figure 8: Roll and pitch estimation results for Flight 2, UIF versus vertical gyroscope (VG) reference.

The attitude estimates for the roll and pitch angles are compared with the vertical gyroscope measurements as a reference, as shown in Figure 8. In order to demonstrate the effectiveness of this method in regulating the drift in attitude estimates that occurs with dead reckoning, the estimation errors from the UIF are compared with the errors obtained from dead reckoning attitude estimation. These roll and pitch errors are offered in Figure 9 for Flight 2, which demonstrates the effectiveness of the UIF in regulating the attitude errors from dead reckoning.

In order to quantify the estimation results, the mean absolute error and standard deviation of error of the estimates are calculated for the velocity components with respect to the GPS reference, and for the roll and pitch angles with respect to the vertical gyroscope reference. These statistical results are provided in Table 2 for Flight 1 and in Table 3 for Flight 2, where V is the total speed as determined by

$$V = \sqrt{u^2 + v^2 + w^2}. \tag{36}$$

Figure 9: Roll and pitch estimation errors as compared to dead reckoning (DR) for Flight 2.

Table 2: Flight 1 error statistics for estimated states.
Estimated state | Mean abs. | Standard deviation | Units
u | 1.0644 | 1.2667 | m/s
v | 2.4699 | 2.7719 | m/s
w | 2.1554 | 1.6554 | m/s
V | 1.2466 | 1.2858 | m/s
ϕ | 1.1553 | 1.3668 | deg
θ | 2.1752 | 1.3339 | deg

It is shown in Tables 2 and 3 that reasonable errors are obtained in both sets of flight data for the velocity and attitude of the aircraft. Larger errors are noted in particular for the lateral velocity state v, which is due to observability issues in the optical flow. Note that mean errors in the roll and pitch estimation could be due to misalignment between the vertical gyroscope, IMU, and video camera. The attitude estimation accuracy reported in Tables 2 and 3 is similar to the reported accuracy of loosely coupled GPS/INS attitude estimation using similar flight data [43].
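The statistics reported in Tables 2-9 can be computed from time-aligned estimate and reference histories as the mean absolute error and the standard deviation of the error, with the total speed formed as in (36); a minimal sketch (array names illustrative) is:

```python
import numpy as np

def error_stats(estimate, reference):
    """Mean absolute error and standard deviation of error,
    as reported in Tables 2-9."""
    err = np.asarray(estimate) - np.asarray(reference)
    return np.mean(np.abs(err)), np.std(err)

# Total speed per (36) from estimated body-frame velocity histories,
# where u_est, v_est, w_est would be the UIF state time series:
# V_est = np.sqrt(u_est**2 + v_est**2 + w_est**2)
```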

4.4. Simplified Formulation Estimation Results. Since it was observed in the full state formulation results that the lateral and vertical velocity estimates were small, the simplified formulation was implemented in order to investigate the feasibility of a version of the filter that estimates only the forward velocity component and assumes the lateral and vertical components are zero. The forward velocity u for Flight 1 is offered in Figure 10, while the roll and pitch errors with respect to the vertical gyroscope measurement are offered in Figure 11 for the UIF and dead reckoning (DR).

Figure 10: Simplified formulation speed estimation results for Flight 1, UIF versus GPS reference.

Figure 11: Simplified formulation attitude estimation errors with dead reckoning (DR) for Flight 2.

Table 3: Flight 2 error statistics for estimated states.
Estimated state | Mean abs. | Standard deviation | Units
u | 1.1299 | 1.2663 | m/s
v | 1.5184 | 1.8649 | m/s
w | 1.1324 | 1.3453 | m/s
V | 1.2087 | 1.2535 | m/s
ϕ | 1.8818 | 1.2782 | deg
θ | 1.6646 | 1.1049 | deg

Additionally, the mean absolute error and standard deviation of error for these terms are provided in Table 4 for Flight 1 and in Table 5 for Flight 2.

Figure 12: Comparison of attitude estimation errors (Δϕ and Δθ, in deg) with the magnitude of the lateral and vertical velocity, √(v² + w²) (m/s).

Table 4: Simplified formulation error statistics for Flight 1.
Estimated state | Mean abs. | Standard deviation | Units
u | 1.2283 | 1.5730 | m/s
ϕ | 1.9901 | 2.2922 | deg
θ | 1.9002 | 2.1881 | deg

Table 5: Simplified formulation error statistics for Flight 2.
Estimated state | Mean abs. | Standard deviation | Units
u | 1.3073 | 1.5775 | m/s
ϕ | 2.8402 | 3.5426 | deg
θ | 2.0286 | 2.4691 | deg

It is shown in Tables 4 and 5 that the simplified formulation results in significantly higher attitude estimation errors with respect to the full state formulation. These increased attitude errors are likely due to the assumption that the lateral and vertical velocity components are zero. To investigate this possible correlation, the roll and pitch errors are shown in Figure 12 together with the magnitude of the lateral and vertical velocity as determined from GPS, for a 50-second segment of flight data which includes takeoff. Figure 12 shows that there is some correlation between the attitude estimation errors and the lateral and vertical velocity, though this is not the only source of error for these estimates.

4.5. Results Using Range State. The results for each flight, for both the full state formulation and the simplified formulation, were recalculated with the addition of the range state. The statistical results for these tests are offered in Tables 6-9.


Table 6: Flight 1 error statistics for estimated states, with range state.
Estimated state | Mean abs. | Standard deviation | Units
u | 1.1330 | 1.3699 | m/s
v | 2.4387 | 2.7369 | m/s
w | 2.1210 | 1.5998 | m/s
V | 1.3041 | 1.3852 | m/s
ϕ | 1.1599 | 1.4084 | deg
θ | 2.1767 | 1.4064 | deg

Table 7: Flight 2 error statistics for estimated states, with range state.
Estimated state | Mean abs. | Standard deviation | Units
u | 1.1444 | 1.2937 | m/s
v | 1.5122 | 1.8529 | m/s
w | 1.1154 | 1.3268 | m/s
V | 1.2112 | 1.2761 | m/s
ϕ | 1.8897 | 1.2845 | deg
θ | 1.6609 | 1.1152 | deg

Table 8: Simplified formulation error statistics for Flight 1, with range state.
Estimated state | Mean abs. | Standard deviation | Units
u | 1.2919 | 1.6256 | m/s
ϕ | 2.0621 | 2.3746 | deg
θ | 1.9277 | 2.2713 | deg

Table 9: Simplified formulation error statistics for Flight 2, with range state.
Estimated state | Mean abs. | Standard deviation | Units
u | 1.3356 | 1.6016 | m/s
ϕ | 2.8748 | 3.5462 | deg
θ | 2.0303 | 2.4876 | deg

In order to compare the results from the different cases, the standard deviation of error is shown graphically for Flight 1 in Figure 13 and for Flight 2 in Figure 14. It is shown in Figures 13 and 14 that the simplified formulation offers poorer estimation performance, as expected, particularly for the attitude estimates. The addition of the range state does not affect the performance significantly.

Figure 13: Graphical comparison of standard deviation of error (m/s or deg) in u, roll, and pitch for Flight 1: full and simplified formulations, with and without the range state.

Figure 14: Graphical comparison of standard deviation of error (m/s or deg) in u, roll, and pitch for Flight 2: full and simplified formulations, with and without the range state.

5. Conclusions

This paper presented vision-aided inertial navigation techniques, which do not rely upon GPS, using UAV flight data. Two different formulations were presented: a full state estimation formulation, which captures the aircraft ground velocity vector and attitude, and a simplified formulation, which assumes all of the aircraft velocity is in the forward direction. Both formulations were shown to be effective in regulating the INS drift. Additionally, a state was included in each formulation in order to estimate the distance between the image center and the aircraft. The full state formulation was shown to be effective in estimating the aircraft ground velocity to within 1.3 m/s and regulating attitude angles within 1.4 degrees standard deviation of error for both sets of flight data.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.


Acknowledgments

This work was supported in part by NASA Grant no. NNX12AM56A, a Kansas NASA EPSCoR PDG grant, and an SRI grant.

References

[1] C. Li, L. Shen, H.-B. Wang, and T. Lei, "The research on unmanned aerial vehicle remote sensing and its applications," in Proceedings of the IEEE International Conference on Advanced Computer Control (ICACC '10), pp. 644-647, Shenyang, China, March 2010.

[2] A. J. Calise and R. T. Rysdyk, "Nonlinear adaptive flight control using neural networks," IEEE Control Systems Magazine, vol. 18, no. 6, pp. 14-24, 1998.

[3] M. S. Grewal, L. R. Weill, and A. P. Andrews, Global Positioning Systems, Inertial Navigation, and Integration, Wiley, New York, NY, USA, 2nd edition, 2007.

[4] J. L. Crassidis, "Sigma-point Kalman filtering for integrated GPS and inertial navigation," in Proceedings of the AIAA Guidance, Navigation, and Control Conference, pp. 1981-2004, San Francisco, Calif, USA, August 2005.

[5] J. N. Gross, Y. Gu, M. B. Rhudy, S. Gururajan, and M. R. Napolitano, "Flight-test evaluation of sensor fusion algorithms for attitude estimation," IEEE Transactions on Aerospace and Electronic Systems, vol. 48, no. 3, pp. 2128-2139, 2012.

[6] M. V. Srinivasan, "Honeybees as a model for the study of visually guided flight, navigation, and biologically inspired robotics," Physiological Reviews, vol. 91, no. 2, pp. 413-460, 2011.

[7] P. S. Bhagavatula, C. Claudianos, M. R. Ibbotson, and M. V. Srinivasan, "Optic flow cues guide flight in birds," Current Biology, vol. 21, no. 21, pp. 1794-1799, 2011.

[8] N. Franceschini, "Visual guidance based on optic flow: a biorobotic approach," Journal of Physiology-Paris, vol. 98, no. 1-3, pp. 281-292, 2004.

[9] P. Corke, J. Lobo, and J. Dias, "An introduction to inertial and visual sensing," The International Journal of Robotics Research, vol. 26, no. 6, pp. 519-535, 2007.

[10] D. G. Lowe, "Distinctive image features from scale-invariant keypoints," International Journal of Computer Vision, vol. 60, no. 2, pp. 91-110, 2004.

[11] M. V. Srinivasan, S. Thurrowgood, and D. Soccol, "Competent vision and navigation systems," IEEE Robotics & Automation Magazine, vol. 16, no. 3, pp. 59-71, 2009.

[12] J. Conroy, G. Gremillion, B. Ranganathan, and J. S. Humbert, "Implementation of wide-field integration of optic flow for autonomous quadrotor navigation," Autonomous Robots, vol. 27, no. 3, pp. 189-198, 2009.

[13] F. Kendoul, I. Fantoni, and K. Nonami, "Optic flow-based vision system for autonomous 3D localization and control of small aerial vehicles," Robotics and Autonomous Systems, vol. 57, no. 6-7, pp. 591-602, 2009.

[14] A. Beyeler, J.-C. Zufferey, and D. Floreano, "optiPilot: control of take-off and landing using optic flow," in Proceedings of the European Micro Air Vehicle Conference and Competition (EMAV '09), Delft, The Netherlands, September 2009.

[15] M. M. Veth and J. Raquet, "Fusion of low-cost imaging and inertial sensors for navigation," in Proceedings of the 19th International Technical Meeting of the Satellite Division, Institute of Navigation (ION GNSS '06), pp. 1093-1103, Fort Worth, Tex, USA, September 2006.

[16] D. Dusha, W. Boles, and R. Walker, "Fixed-wing attitude estimation using computer vision based horizon detection," in Proceedings of the International Unmanned Air Vehicle Systems Conference, April 2007.

[17] D. Dusha, W. Boles, and R. Walker, "Attitude estimation for a fixed-wing aircraft using horizon detection and optical flow," in Proceedings of the 9th Biennial Conference of the Australian Pattern Recognition Society on Digital Image Computing Techniques and Applications (DICTA '07), pp. 485-492, IEEE, Glenelg, Australia, December 2007.

[18] S. Weiss, M. W. Achtelik, S. Lynen, M. Chli, and R. Siegwart, "Real-time onboard visual-inertial state estimation and self-calibration of MAVs in unknown environments," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '12), May 2012.

[19] M. Dille, B. Grocholsky, and S. Singh, "Outdoor downward-facing optical flow odometry with commodity sensors," in Field and Service Robotics: Results of the 7th International Conference, vol. 62 of Springer Tracts in Advanced Robotics, pp. 183-193, 2010.

[20] S. I. Roumeliotis, A. E. Johnson, and J. F. Montgomery, "Augmenting inertial navigation with image-based motion estimation," in Proceedings of the IEEE International Conference on Robotics & Automation, pp. 4326-4333, Washington, DC, USA, May 2002.

[21] M. Achtelik, M. Achtelik, S. Weiss, and R. Siegwart, "Onboard IMU and monocular vision based control for MAVs in unknown in- and outdoor environments," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '11), pp. 3056-3063, Shanghai, China, May 2011.

[22] P.-J. Bristeau, F. Callou, D. Vissiere, and N. Petit, "The navigation and control technology inside the AR.Drone micro UAV," in Proceedings of the 18th IFAC World Congress, pp. 1477-1484, Milano, Italy, September 2011.

[23] H. Chao, Y. Gu, and M. Napolitano, "A survey of optical flow techniques for robotics navigation applications," Journal of Intelligent and Robotic Systems: Theory and Applications, vol. 73, no. 1-4, pp. 361-372, 2014.

[24] W. Ding, J. Wang, S. Han, et al., "Adding optical flow into the GPS/INS integration for UAV navigation," in Proceedings of the International Global Navigation Satellite Systems Society (IGNSS) Symposium, Surfers Paradise, Australia, 2009.

[25] H. Romero, S. Salazar, and R. Lozano, "Real-time stabilization of an eight-rotor UAV using optical flow," IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, vol. 42, no. 6, pp. 1752-1762, 2012.

[26] V. Grabe, H. H. Bülthoff, and P. R. Giordano, "A comparison of scale estimation schemes for a quadrotor UAV based on optical flow and IMU measurements," in Proceedings of the 26th IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '13), pp. 5193-5200, IEEE, Tokyo, Japan, November 2013.

[27] A. Martinelli, "Vision and IMU data fusion: closed-form solutions for attitude, speed, absolute scale, and bias determination," IEEE Transactions on Robotics, vol. 28, no. 1, pp. 44-60, 2012.

[28] M. B. Rhudy, H. Chao, and Y. Gu, "Wide-field optical flow aided inertial navigation for unmanned aerial vehicles," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '14), pp. 674-679, Chicago, Ill, USA, September 2014.

[29] H. Chao, Y. Gu, J. Gross, G. Guo, M. L. Fravolini, and M. R. Napolitano, "A comparative study of optical flow and traditional sensors in UAV navigation," in Proceedings of the American Control Conference (ACC '13), pp. 3858-3863, Washington, DC, USA, June 2013.

[30] R. C. Hibbeler, Engineering Mechanics: Dynamics, Prentice Hall, 10th edition, 2004.

[31] V. Klein and E. A. Morelli, Aircraft System Identification: Theory and Practice, American Institute of Aeronautics and Astronautics, Reston, Va, USA, 2006.

[32] J. Roskam, Airplane Flight Dynamics and Automatic Flight Controls, DARcorporation, Lawrence, Kan, USA, 2003.

[33] J. N. Gross, Y. Gu, M. Rhudy, F. J. Barchesky, and M. R. Napolitano, "On-line modeling and calibration of low-cost navigation sensors," in Proceedings of the AIAA Modeling and Simulation Technologies Conference, pp. 298-311, Portland, Ore, USA, August 2011.

[34] D. W. Allan, "Statistics of atomic frequency standards," Proceedings of the IEEE, vol. 54, no. 2, pp. 221-230, 1966.

[35] X. Zhiqiang and D. Gebre-Egziabher, "Modeling and bounding low cost inertial sensor errors," in Proceedings of the IEEE/ION Position, Location and Navigation Symposium (PLANS '08), pp. 1122-1132, IEEE, Monterey, Calif, USA, May 2008.

[36] Z. Xing, Over-bounding integrated INS/GNSS output errors [Ph.D. dissertation], The University of Minnesota, Minneapolis, Minn, USA, 2010.

[37] F. L. Lewis and V. L. Syrmos, Optimal Control, Wiley, New York, NY, USA, 2nd edition, 1995.

[38] M. Mammarella, G. Campa, M. L. Fravolini, Y. Gu, B. Seanor, and M. R. Napolitano, "A comparison of optical flow algorithms for real time aircraft guidance and navigation," in Proceedings of the AIAA Guidance, Navigation and Control Conference and Exhibit, Honolulu, Hawaii, USA, August 2008.

[39] A. G. O. Mutambara, Decentralized Estimation and Control for Multisensor Systems, CRC Press, Washington, DC, USA, 1998.

[40] T. Vercauteren and X. Wang, "Decentralized sigma-point information filters for target tracking in collaborative sensor networks," IEEE Transactions on Signal Processing, vol. 53, no. 8, part 2, pp. 2997-3009, 2005.

[41] D.-J. Lee, "Unscented information filtering for distributed estimation and multiple sensor fusion," in Proceedings of the AIAA Guidance, Navigation and Control Conference and Exhibit, Honolulu, Hawaii, USA, 2008.

[42] M. Rhudy and Y. Gu, "Understanding nonlinear Kalman filters," in Interactive Robotics Letters, West Virginia University, 2013, http://www2.statler.wvu.edu/~irl/page13.html.

[43] M. Rhudy, J. Gross, Y. Gu, and M. R. Napolitano, "Fusion of GPS and redundant IMU data for attitude estimation," in Proceedings of the AIAA Guidance, Navigation, and Control Conference, Minneapolis, Minn, USA, August 2012.


Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Electrical and Computer Engineering

Journal of

Advances inOptoElectronics

Hindawi Publishing Corporation httpwwwhindawicom

Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

SensorsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Chemical EngineeringInternational Journal of Antennas and

Propagation

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Navigation and Observation

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

DistributedSensor Networks

International Journal of

Page 7: Research Article Unmanned Aerial Vehicle Navigation Using Wide …downloads.hindawi.com/journals/jr/2015/251379.pdf · 2019-07-31 · Research Article Unmanned Aerial Vehicle Navigation

Journal of Robotics 7

Figure 4: Comparison of errors for different assumed optical flow uncertainties (assumed σ_OF in pixels versus standard deviation of error: σ_V in m/s, σ_φ and σ_θ in deg, for Flights 1 and 2).

The flat region in Figure 4 for assumed optical flow standard deviations from approximately 3 to 9 pixels indicates that this formulation is relatively insensitive to the tuning of these optical flow errors. It is also interesting to note in Figure 4 that Flight 1 and Flight 2 have optimum performance at different values of R. This, however, makes sense, as Flight 2 has twice the frame rate of Flight 1; therefore the assumed noise characteristics should be one half those of Flight 1. From Figure 4, the optical flow uncertainties were selected to be R = 5² pixels² for Flight 1 and R = 2.5² pixels² for Flight 2.
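Since the assumed optical flow noise enters the filter only through the measurement noise covariance, this frame-rate scaling can be applied mechanically when retuning for a new camera rate. The following is a minimal sketch (our illustration, not part of the paper) under the assumption stated above that the pixel-noise standard deviation scales inversely with frame rate:

```python
# Sketch: scale the assumed optical-flow measurement noise with frame rate.
# Assumption (for illustration only): the pixel-noise standard deviation is
# inversely proportional to frame rate, consistent with Flight 2 (twice the
# frame rate of Flight 1) using half of Flight 1's sigma (5 px -> 2.5 px).

def optical_flow_variance(sigma_ref_px, rate_ref_hz, rate_hz):
    """Return the measurement variance R (pixels^2) for a camera running
    at rate_hz, given a reference tuning sigma_ref_px at rate_ref_hz."""
    sigma_px = sigma_ref_px * rate_ref_hz / rate_hz
    return sigma_px ** 2

R_flight1 = optical_flow_variance(5.0, rate_ref_hz=1.0, rate_hz=1.0)  # 25.0 px^2
R_flight2 = optical_flow_variance(5.0, rate_ref_hz=1.0, rate_hz=2.0)  # 6.25 px^2
```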

4.3. Full State Formulation Estimation Results. Using each set of flight data, the full state formulation using the UIF was executed. The estimated components of velocity are shown for Flight 1 in Figure 5 and for Flight 2 in Figure 6. These estimates from the UIF are offered with respect to comparable reference values from GPS, which were mapped into the aircraft frame using roll and pitch measurements from the vertical gyroscope and approximating the yaw angle with the heading angle obtained from GPS. From each of these figures, the following observations can be made. The forward velocity u is reasonably captured by the estimation. The lateral velocity v and vertical velocity w, however, demonstrate somewhat poor results. This, however, makes sense, as the primary direction of flight is forward, thus resulting in good observability characteristics in the optical flow in the forward direction, while the signal-to-noise ratio (SNR) for the lateral and vertical directions remains small for most typical flight conditions. However, since these lateral and vertical components are only a small portion of the total velocity, the total speed can be reasonably approximated by this technique. The total speed estimates are shown in Figure 7 for Flight 1 with the GPS reference.
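For the comparison above, the GPS velocity solution (expressed in the local navigation frame) must be rotated into the aircraft body frame before it can serve as a reference for (u, v, w). A minimal NumPy sketch of this mapping is given below; the standard aerospace Z-Y-X (yaw-pitch-roll) rotation sequence is our assumption, as the paper does not spell out the convention here:

```python
import numpy as np

def ned_to_body(v_ned, roll, pitch, yaw):
    """Rotate a NED-frame velocity vector into body-frame (u, v, w).
    Angles in radians; Z-Y-X (yaw-pitch-roll) Euler sequence. Per the
    text, roll and pitch come from the vertical gyroscope and yaw is
    approximated by the GPS heading angle."""
    cph, sph = np.cos(roll), np.sin(roll)
    cth, sth = np.cos(pitch), np.sin(pitch)
    cps, sps = np.cos(yaw), np.sin(yaw)
    dcm = np.array([
        [cth * cps,                   cth * sps,                   -sth],
        [sph * sth * cps - cph * sps, sph * sth * sps + cph * cps, sph * cth],
        [cph * sth * cps + sph * sps, cph * sth * sps - sph * cps, cph * cth],
    ])
    return dcm @ np.asarray(v_ned)

# Example with made-up values: 20 m/s north, 2 m/s east, 0.5 m/s climb.
u, v, w = ned_to_body([20.0, 2.0, -0.5],
                      roll=np.deg2rad(5.0), pitch=np.deg2rad(2.0),
                      yaw=np.deg2rad(30.0))
```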

Figure 5: Estimated velocity components u, v, and w (m/s) versus time for Flight 1, UIF versus GPS reference.

Figure 6: Estimated velocity components u, v, and w (m/s) versus time for Flight 2, UIF versus GPS reference.

Figure 7: Estimated total ground speed (m/s) versus time for Flight 1, UIF versus GPS reference.

Figure 8: Roll (φ) and pitch (θ) estimation results for Flight 2, UIF versus vertical gyroscope (VG).

The attitude estimates for the roll and pitch angles are compared with the vertical gyroscope measurements as a reference, as shown in Figure 8. In order to demonstrate the effectiveness of this method in regulating the drift in attitude estimates that occurs with dead reckoning, the estimation errors from the UIF are compared with the errors obtained from dead reckoning attitude estimation. These roll and pitch errors are offered in Figure 9 for Flight 2. Figure 9 demonstrates the effectiveness of the UIF in regulating the attitude errors from dead reckoning.
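Here, dead reckoning means propagating the attitude from the rate-gyro measurements alone, so bias and noise are integrated without any aiding and the errors grow unbounded. A minimal sketch of such a baseline using the standard Euler-angle kinematic equations is shown below (a generic illustration on our part; the paper's exact dead reckoning implementation may differ):

```python
import numpy as np

def dead_reckon_step(phi, theta, psi, p, q, r, dt):
    """One explicit-Euler step of the attitude kinematics driven by body
    angular rates p, q, r (rad/s). With no aiding measurement, gyro bias
    and noise accumulate directly in the integrated angles."""
    phi_dot = p + np.tan(theta) * (q * np.sin(phi) + r * np.cos(phi))
    theta_dot = q * np.cos(phi) - r * np.sin(phi)
    psi_dot = (q * np.sin(phi) + r * np.cos(phi)) / np.cos(theta)
    return (phi + phi_dot * dt,
            theta + theta_dot * dt,
            psi + psi_dot * dt)
```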

In order to quantify the estimation results, the mean absolute error and standard deviation of error of the estimates are calculated for the velocity components with respect to the GPS reference, and also for the roll and pitch angles with respect to the vertical gyroscope reference.

Figure 9: Roll and pitch estimation errors (Δφ, Δθ in deg) as compared to dead reckoning (DR) for Flight 2.

Table 2: Flight 1 error statistics for estimated states.

Estimated state   Mean abs.   Standard deviation   Units
u                 1.0644      1.2667               m/s
v                 2.4699      2.7719               m/s
w                 2.1554      1.6554               m/s
V                 1.2466      1.2858               m/s
φ                 1.1553      1.3668               deg
θ                 2.1752      1.3339               deg

These statistical results are provided in Table 2 for Flight 1 and Table 3 for Flight 2, where V is the total speed as determined by

V = √(u² + v² + w²).  (36)

It is shown in Tables 2 and 3 that reasonable errors are obtained in both sets of flight data for the velocity and attitude of the aircraft. Larger errors are noted in particular for the lateral velocity state v, which is due to observability issues in the optical flow. Note that mean errors in the roll and pitch estimation could be due to misalignment between the vertical gyroscope, IMU, and video camera. The attitude estimation accuracy reported in Tables 2 and 3 is similar to the reported accuracy of loosely coupled GPS/INS attitude estimation using similar flight data [43].
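The statistics in Tables 2 and 3 follow directly from the error time histories. Below is a short sketch of how they, and the total speed of (36), can be computed (hypothetical array names; estimates and references assumed time-aligned):

```python
import numpy as np

def error_stats(estimate, reference):
    """Mean absolute error and standard deviation of error, the two
    statistics reported in Tables 2-9."""
    err = np.asarray(estimate) - np.asarray(reference)
    return np.mean(np.abs(err)), np.std(err)

def total_speed(u, v, w):
    """Total speed V from body-frame velocity components, per (36)."""
    return np.sqrt(np.asarray(u)**2 + np.asarray(v)**2 + np.asarray(w)**2)
```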

4.4. Simplified Formulation Estimation Results. Since it was observed in the full state formulation results that the lateral and vertical estimates were small, the simplified formulation was implemented in order to investigate the feasibility of a simplified version of the filter that estimates only the forward velocity component and assumes the lateral and vertical components are zero. The forward velocity u for Flight 1 is offered in Figure 10, while the roll and pitch errors with respect to the vertical gyroscope measurement are offered in Figure 11 for the UIF and dead reckoning (DR). Additionally, the mean absolute error and standard deviation of error for these terms are provided in Table 4 for Flight 1 and Table 5 for Flight 2.

Figure 10: Simplified formulation speed estimation results (u in m/s versus time) for Flight 1, UIF versus GPS reference.

Figure 11: Simplified formulation attitude estimates with dead reckoning (DR) for Flight 2.

Table 3: Flight 2 error statistics for estimated states.

Estimated state   Mean abs.   Standard deviation   Units
u                 1.1299      1.2663               m/s
v                 1.5184      1.8649               m/s
w                 1.1324      1.3453               m/s
V                 1.2087      1.2535               m/s
φ                 1.8818      1.2782               deg
θ                 1.6646      1.1049               deg


Figure 12: Comparison of attitude estimation errors (Δφ, Δθ in deg) with the magnitude of the lateral and vertical velocity, √(v² + w²) (m/s).

Table 4: Simplified formulation error statistics for Flight 1.

Estimated state   Mean abs.   Standard deviation   Units
u                 1.2283      1.5730               m/s
φ                 1.9901      2.2922               deg
θ                 1.9002      2.1881               deg

Table 5: Simplified formulation error statistics for Flight 2.

Estimated state   Mean abs.   Standard deviation   Units
u                 1.3073      1.5775               m/s
φ                 2.8402      3.5426               deg
θ                 2.0286      2.4691               deg


It is shown in Tables 4 and 5 that the simplified formulation results in significantly higher attitude estimation errors with respect to the full state formulation. These increased attitude errors are likely due to the assumption that the lateral and vertical velocity components are zero. To investigate this possible correlation, the roll and pitch errors are shown in Figure 12 with the magnitude of the lateral and vertical velocity as determined from GPS for a 50-second segment of flight data which includes takeoff. Figure 12 shows that there is some correlation between the attitude estimation errors and the lateral and vertical velocity, though it is not the only source of error for these estimates.
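The qualitative comparison in Figure 12 could be backed by a simple correlation coefficient between the attitude error magnitude and √(v² + w²). A minimal sketch follows (hypothetical, time-aligned arrays; this is not a statistic reported in the paper):

```python
import numpy as np

def attitude_error_velocity_correlation(d_phi, d_theta, v, w):
    """Pearson correlation between the attitude error magnitude (deg)
    and the GPS-derived lateral/vertical speed sqrt(v^2 + w^2) (m/s)."""
    err_mag = np.hypot(d_phi, d_theta)
    vel_mag = np.hypot(v, w)
    return np.corrcoef(err_mag, vel_mag)[0, 1]
```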

4.5. Results Using Range State. The results for each flight, for both the full state formulation and the simplified formulation, were recalculated with the addition of the range state. The statistical results for these tests are offered in Tables 6–9.

Table 6: Flight 1 error statistics for estimated states with range state.

Estimated state   Mean abs.   Standard deviation   Units
u                 1.1330      1.3699               m/s
v                 2.4387      2.7369               m/s
w                 2.1210      1.5998               m/s
V                 1.3041      1.3852               m/s
φ                 1.1599      1.4084               deg
θ                 2.1767      1.4064               deg

Table 7: Flight 2 error statistics for estimated states with range state.

Estimated state   Mean abs.   Standard deviation   Units
u                 1.1444      1.2937               m/s
v                 1.5122      1.8529               m/s
w                 1.1154      1.3268               m/s
V                 1.2112      1.2761               m/s
φ                 1.8897      1.2845               deg
θ                 1.6609      1.1152               deg

Table 8: Simplified formulation error statistics for Flight 1 with range state.

Estimated state   Mean abs.   Standard deviation   Units
u                 1.2919      1.6256               m/s
φ                 2.0621      2.3746               deg
θ                 1.9277      2.2713               deg

Table 9: Simplified formulation error statistics for Flight 2 with range state.

Estimated state   Mean abs.   Standard deviation   Units
u                 1.3356      1.6016               m/s
φ                 2.8748      3.5462               deg
θ                 2.0303      2.4876               deg

In order to compare the results from the different cases, the standard deviation of error is shown graphically for Flight 1 in Figure 13 and for Flight 2 in Figure 14. It is shown in Figures 13 and 14 that the simplified formulation offers poorer estimation performance, as expected, particularly for the attitude estimates. The addition of the range state does not affect the performance significantly.

Figure 13: Graphical comparison of standard deviation of error (σ in m/s or deg) in u, roll, and pitch for Flight 1: full, full with range, simplified, and simplified with range formulations.

Figure 14: Graphical comparison of standard deviation of error (σ in m/s or deg) in u, roll, and pitch for Flight 2: full, full with range, simplified, and simplified with range formulations.

5. Conclusions

This paper presented vision-aided inertial navigation techniques, which do not rely upon GPS, using UAV flight data. Two different formulations were presented: a full state estimation formulation which captures the aircraft ground velocity vector and attitude, and a simplified formulation which assumes all of the aircraft velocity is in the forward direction. Both formulations were shown to be effective in regulating the INS drift. Additionally, a state was included in each formulation in order to estimate the distance between the image center and the aircraft. The full state formulation

was shown to be effective in estimating the aircraft ground velocity to within 1.3 m/s and regulating attitude angles to within 1.4 degrees standard deviation of error for both sets of flight data.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.


Acknowledgments

This work was supported in part by NASA Grant no. NNX12AM56A, a Kansas NASA EPSCoR PDG grant, and an SRI grant.

References

[1] C. Li, L. Shen, H.-B. Wang, and T. Lei, "The research on unmanned aerial vehicle remote sensing and its applications," in Proceedings of the IEEE International Conference on Advanced Computer Control (ICACC '10), pp. 644–647, Shenyang, China, March 2010.

[2] A. J. Calise and R. T. Rysdyk, "Nonlinear adaptive flight control using neural networks," IEEE Control Systems Magazine, vol. 18, no. 6, pp. 14–24, 1998.

[3] M. S. Grewal, L. R. Weill, and A. P. Andrews, Global Positioning, Inertial Navigation & Integration, Wiley, New York, NY, USA, 2nd edition, 2007.

[4] J. L. Crassidis, "Sigma-point Kalman filtering for integrated GPS and inertial navigation," in Proceedings of the AIAA Guidance, Navigation, and Control Conference, pp. 1981–2004, San Francisco, Calif, USA, August 2005.

[5] J. N. Gross, Y. Gu, M. B. Rhudy, S. Gururajan, and M. R. Napolitano, "Flight-test evaluation of sensor fusion algorithms for attitude estimation," IEEE Transactions on Aerospace and Electronic Systems, vol. 48, no. 3, pp. 2128–2139, 2012.

[6] M. V. Srinivasan, "Honeybees as a model for the study of visually guided flight, navigation, and biologically inspired robotics," Physiological Reviews, vol. 91, no. 2, pp. 413–460, 2011.

[7] P. S. Bhagavatula, C. Claudianos, M. R. Ibbotson, and M. V. Srinivasan, "Optic flow cues guide flight in birds," Current Biology, vol. 21, no. 21, pp. 1794–1799, 2011.

[8] N. Franceschini, "Visual guidance based on optic flow: a biorobotic approach," Journal of Physiology Paris, vol. 98, no. 1–3, pp. 281–292, 2004.

[9] P. Corke, J. Lobo, and J. Dias, "An introduction to inertial and visual sensing," The International Journal of Robotics Research, vol. 26, no. 6, pp. 519–535, 2007.

[10] D. G. Lowe, "Distinctive image features from scale-invariant keypoints," International Journal of Computer Vision, vol. 60, no. 2, pp. 91–110, 2004.

[11] M. V. Srinivasan, S. Thurrowgood, and D. Soccol, "Competent vision and navigation systems," IEEE Robotics & Automation Magazine, vol. 16, no. 3, pp. 59–71, 2009.

[12] J. Conroy, G. Gremillion, B. Ranganathan, and J. S. Humbert, "Implementation of wide-field integration of optic flow for autonomous quadrotor navigation," Autonomous Robots, vol. 27, no. 3, pp. 189–198, 2009.

[13] F. Kendoul, I. Fantoni, and K. Nonami, "Optic flow-based vision system for autonomous 3D localization and control of small aerial vehicles," Robotics and Autonomous Systems, vol. 57, no. 6-7, pp. 591–602, 2009.

[14] A. Beyeler, J.-C. Zufferey, and D. Floreano, "optiPilot: control of take-off and landing using optic flow," in Proceedings of the European Micro Air Vehicle Conference and Competition (EMAV '09), Delft, The Netherlands, September 2009.

[15] M. M. Veth and J. Raquet, "Fusion of low-cost imaging and inertial sensors for navigation," in Proceedings of the 19th International Technical Meeting of the Satellite Division, Institute of Navigation (ION GNSS '06), pp. 1093–1103, Fort Worth, Tex, USA, September 2006.

[16] D. Dusha, W. Boles, and R. Walker, "Fixed-wing attitude estimation using computer vision based horizon detection," in Proceedings of the International Unmanned Air Vehicle Systems Conference, April 2007.

[17] D. Dusha, W. Boles, and R. Walker, "Attitude estimation for a fixed-wing aircraft using horizon detection and optical flow," in Proceedings of the 9th Biennial Conference of the Australian Pattern Recognition Society on Digital Image Computing Techniques and Applications (DICTA '07), pp. 485–492, IEEE, Glenelg, Australia, December 2007.

[18] S. Weiss, M. W. Achtelik, S. Lynen, M. Chli, and R. Siegwart, "Real-time onboard visual-inertial state estimation and self-calibration of MAVs in unknown environments," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '12), May 2012.

[19] M. Dille, B. Grocholsky, and S. Singh, "Outdoor downward-facing optical flow odometry with commodity sensors," in Field and Service Robotics: Results of the 7th International Conference, vol. 62 of Springer Tracts in Advanced Robotics, pp. 183–193, 2010.

[20] S. I. Roumeliotis, A. E. Johnson, and J. F. Montgomery, "Augmenting inertial navigation with image-based motion estimation," in Proceedings of the IEEE International Conference on Robotics & Automation, pp. 4326–4333, Washington, DC, USA, May 2002.

[21] M. Achtelik, M. Achtelik, S. Weiss, and R. Siegwart, "Onboard IMU and monocular vision based control for MAVs in unknown in- and outdoor environments," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '11), pp. 3056–3063, Shanghai, China, May 2011.

[22] P.-J. Bristeau, F. Callou, D. Vissiere, and N. Petit, "The navigation and control technology inside the AR.Drone micro UAV," in Proceedings of the 18th IFAC World Congress, pp. 1477–1484, Milano, Italy, September 2011.

[23] H. Chao, Y. Gu, and M. Napolitano, "A survey of optical flow techniques for robotics navigation applications," Journal of Intelligent and Robotic Systems: Theory and Applications, vol. 73, no. 1–4, pp. 361–372, 2014.

[24] W. Ding, J. Wang, S. Han, et al., "Adding optical flow into the GPS/INS integration for UAV navigation," in Proceedings of the International Global Navigation Satellite Systems Society IGNSS Symposium, Surfers Paradise, Australia, 2009.

[25] H. Romero, S. Salazar, and R. Lozano, "Real-time stabilization of an eight-rotor UAV using optical flow," IEEE Transactions on Systems, Man, and Cybernetics—Part C: Applications and Reviews, vol. 42, no. 6, pp. 1752–1762, 2012.

[26] V. Grabe, H. H. Bulthoff, and P. R. Giordano, "A comparison of scale estimation schemes for a quadrotor UAV based on optical flow and IMU measurements," in Proceedings of the 26th IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '13), pp. 5193–5200, IEEE, Tokyo, Japan, November 2013.

[27] A. Martinelli, "Vision and IMU data fusion: closed-form solutions for attitude, speed, absolute scale, and bias determination," IEEE Transactions on Robotics, vol. 28, no. 1, pp. 44–60, 2012.

[28] M. B. Rhudy, H. Chao, and Y. Gu, "Wide-field optical flow aided inertial navigation for unmanned aerial vehicles," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '14), pp. 674–679, Chicago, Ill, USA, September 2014.

[29] H. Chao, Y. Gu, J. Gross, G. Guo, M. L. Fravolini, and M. R. Napolitano, "A comparative study of optical flow and traditional sensors in UAV navigation," in Proceedings of the 1st American Control Conference (ACC '13), pp. 3858–3863, Washington, DC, USA, June 2013.

[30] R. C. Hibbeler, Engineering Mechanics: Dynamics, Prentice Hall, 10th edition, 2004.

[31] V. Klein and E. A. Morelli, Aircraft System Identification: Theory and Practice, American Institute of Aeronautics and Astronautics, Reston, Va, USA, 2006.

[32] J. Roskam, Airplane Flight Dynamics and Automatic Flight Controls, DARcorporation, Lawrence, Kan, USA, 2003.

[33] J. N. Gross, Y. Gu, M. Rhudy, F. J. Barchesky, and M. R. Napolitano, "On-line modeling and calibration of low-cost navigation sensors," in Proceedings of the AIAA Modeling and Simulation Technologies Conference, pp. 298–311, Portland, Ore, USA, August 2011.

[34] D. W. Allan, "Statistics of atomic frequency standards," Proceedings of the IEEE, vol. 54, no. 2, pp. 221–230, 1966.

[35] X. Zhiqiang and D. Gebre-Egziabher, "Modeling and bounding low cost inertial sensor errors," in Proceedings of the IEEE/ION Position, Location and Navigation Symposium (PLANS '08), pp. 1122–1132, IEEE, Monterey, Calif, USA, May 2008.

[36] Z. Xing, Over-bounding integrated INS/GNSS output errors [Ph.D. dissertation], The University of Minnesota, Minneapolis, Minn, USA, 2010.

[37] F. L. Lewis and V. L. Syrmos, Optimal Control, Wiley, New York, NY, USA, 2nd edition, 1995.

[38] M. Mammarella, G. Campa, M. L. Fravolini, Y. Gu, B. Seanor, and M. R. Napolitano, "A comparison of optical flow algorithms for real time aircraft guidance and navigation," in Proceedings of the AIAA Guidance, Navigation and Control Conference and Exhibit, Honolulu, Hawaii, USA, August 2008.

[39] A. G. O. Mutambara, Decentralized Estimation and Control for Multisensor Systems, CRC Press, Washington, DC, USA, 1998.

[40] T. Vercauteren and X. Wang, "Decentralized sigma-point information filters for target tracking in collaborative sensor networks," IEEE Transactions on Signal Processing, vol. 53, no. 8, part 2, pp. 2997–3009, 2005.

[41] D.-J. Lee, "Unscented information filtering for distributed estimation and multiple sensor fusion," in Proceedings of the AIAA Guidance, Navigation and Control Conference and Exhibit, Honolulu, Hawaii, USA, 2008.

[42] M. Rhudy and Y. Gu, "Understanding nonlinear Kalman filters," in Interactive Robotics Letters, West Virginia University, 2013, http://www2.statler.wvu.edu/~irl/page13.html.

[43] M. Rhudy, J. Gross, Y. Gu, and M. R. Napolitano, "Fusion of GPS and redundant IMU data for attitude estimation," in Proceedings of the AIAA Guidance, Navigation, and Control Conference, Minneapolis, Minn, USA, August 2012.


Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Electrical and Computer Engineering

Journal of

Advances inOptoElectronics

Hindawi Publishing Corporation httpwwwhindawicom

Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

SensorsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Chemical EngineeringInternational Journal of Antennas and

Propagation

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Navigation and Observation

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

DistributedSensor Networks

International Journal of

Page 10: Research Article Unmanned Aerial Vehicle Navigation Using Wide …downloads.hindawi.com/journals/jr/2015/251379.pdf · 2019-07-31 · Research Article Unmanned Aerial Vehicle Navigation

10 Journal of Robotics

Table 6: Flight 1 error statistics for estimated states, with range state.

Estimated state    Mean abs.    Standard deviation    Units
u                  1.1330       1.3699                m/s
v                  2.4387       2.7369                m/s
w                  2.1210       1.5998                m/s
V                  1.3041       1.3852                m/s
φ                  1.1599       1.4084                deg
θ                  2.1767       1.4064                deg

Table 7: Flight 2 error statistics for estimated states, with range state.

Estimated state    Mean abs.    Standard deviation    Units
u                  1.1444       1.2937                m/s
v                  1.5122       1.8529                m/s
w                  1.1154       1.3268                m/s
V                  1.2112       1.2761                m/s
φ                  1.8897       1.2845                deg
θ                  1.6609       1.1152                deg

Table 8: Simplified formulation error statistics for Flight 1, with range state.

Estimated state    Mean abs.    Standard deviation    Units
u                  1.2919       1.6256                m/s
φ                  2.0621       2.3746                deg
θ                  1.9277       2.2713                deg

Table 9: Simplified formulation error statistics for Flight 2, with range state.

Estimated state    Mean abs.    Standard deviation    Units
u                  1.3356       1.6016                m/s
φ                  2.8748       3.5462                deg
θ                  2.0303       2.4876                deg

In order to compare the results from the different cases, the standard deviation of error is shown graphically for Flight 1 in Figure 13 and for Flight 2 in Figure 14. Figures 13 and 14 show that the simplified formulation offers poorer estimation performance, as expected, particularly for the attitude estimates. The addition of the range state does not affect the performance significantly.

[Figure 13: Graphical comparison of standard deviation of error for Flight 1. Bars give σ in m/s (for u) or deg (for roll and pitch) for four cases: full formulation, full with range state, simplified formulation, and simplified with range state.]

[Figure 14: Graphical comparison of standard deviation of error for Flight 2, with the same four cases and axes as Figure 13.]
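Tables 6-9 report, for each estimated state, the mean absolute error and the standard deviation of the error relative to the GPS-derived reference. The following is a minimal sketch of how such statistics can be computed; the function and variable names are illustrative, the two series are assumed to be time-aligned samples, and this is not the authors' code.

import numpy as np

def error_statistics(estimate, reference):
    # Sample-by-sample error between the filter estimate and the
    # GPS reference; both inputs are 1-D arrays of equal length.
    error = np.asarray(estimate) - np.asarray(reference)
    mean_abs = np.mean(np.abs(error))   # "Mean abs." column
    sigma = np.std(error)               # "Standard deviation" column
    return mean_abs, sigma

# Illustrative values only: forward velocity u (m/s) from the filter
# versus the GPS solution used as reference "truth".
u_est = np.array([20.1, 20.5, 21.0, 20.8])
u_gps = np.array([19.8, 20.9, 20.2, 21.1])
mean_abs, sigma = error_statistics(u_est, u_gps)
print("mean abs = %.4f m/s, sigma = %.4f m/s" % (mean_abs, sigma))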

5. Conclusions

This paper presented vision-aided inertial navigation techniques, applied to UAV flight data, which do not rely upon GPS. Two different formulations were presented: a full state estimation formulation, which captures the aircraft ground velocity vector and attitude, and a simplified formulation, which assumes that all of the aircraft velocity is in the forward direction. Both formulations were shown to be effective in regulating the INS drift. Additionally, a state was included in each formulation in order to estimate the distance between the image center and the aircraft. The full state formulation was shown to be effective in estimating aircraft ground velocity to within 1.3 m/s and regulating attitude angles within 1.4 degrees standard deviation of error for both sets of flight data.
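Stated compactly, the simplified formulation's assumption on the body-axis velocity components (u forward, v lateral, w vertical) is

\[
v \approx 0, \qquad w \approx 0, \qquad \text{so that} \quad V = \sqrt{u^{2} + v^{2} + w^{2}} \approx u,
\]

which is why Tables 8 and 9 report error statistics only for u, φ, and θ.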

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.


Acknowledgments

This work was supported in part by NASA Grant no. NNX12AM56A, a Kansas NASA EPSCoR PDG grant, and an SRI grant.

References

[1] C. Li, L. Shen, H.-B. Wang, and T. Lei, "The research on unmanned aerial vehicle remote sensing and its applications," in Proceedings of the IEEE International Conference on Advanced Computer Control (ICACC '10), pp. 644-647, Shenyang, China, March 2010.

[2] A. J. Calise and R. T. Rysdyk, "Nonlinear adaptive flight control using neural networks," IEEE Control Systems Magazine, vol. 18, no. 6, pp. 14-24, 1998.

[3] M. S. Grewal, L. R. Weill, and A. P. Andrews, Global Positioning Systems, Inertial Navigation, and Integration, Wiley, New York, NY, USA, 2nd edition, 2007.

[4] J. L. Crassidis, "Sigma-point Kalman filtering for integrated GPS and inertial navigation," in Proceedings of the AIAA Guidance, Navigation, and Control Conference, pp. 1981-2004, San Francisco, Calif, USA, August 2005.

[5] J. N. Gross, Y. Gu, M. B. Rhudy, S. Gururajan, and M. R. Napolitano, "Flight-test evaluation of sensor fusion algorithms for attitude estimation," IEEE Transactions on Aerospace and Electronic Systems, vol. 48, no. 3, pp. 2128-2139, 2012.

[6] M. V. Srinivasan, "Honeybees as a model for the study of visually guided flight, navigation, and biologically inspired robotics," Physiological Reviews, vol. 91, no. 2, pp. 413-460, 2011.

[7] P. S. Bhagavatula, C. Claudianos, M. R. Ibbotson, and M. V. Srinivasan, "Optic flow cues guide flight in birds," Current Biology, vol. 21, no. 21, pp. 1794-1799, 2011.

[8] N. Franceschini, "Visual guidance based on optic flow: a biorobotic approach," Journal of Physiology Paris, vol. 98, no. 1-3, pp. 281-292, 2004.

[9] P. Corke, J. Lobo, and J. Dias, "An introduction to inertial and visual sensing," The International Journal of Robotics Research, vol. 26, no. 6, pp. 519-535, 2007.

[10] D. G. Lowe, "Distinctive image features from scale-invariant keypoints," International Journal of Computer Vision, vol. 60, no. 2, pp. 91-110, 2004.

[11] M. V. Srinivasan, S. Thurrowgood, and D. Soccol, "Competent vision and navigation systems," IEEE Robotics & Automation Magazine, vol. 16, no. 3, pp. 59-71, 2009.

[12] J. Conroy, G. Gremillion, B. Ranganathan, and J. S. Humbert, "Implementation of wide-field integration of optic flow for autonomous quadrotor navigation," Autonomous Robots, vol. 27, no. 3, pp. 189-198, 2009.

[13] F. Kendoul, I. Fantoni, and K. Nonami, "Optic flow-based vision system for autonomous 3D localization and control of small aerial vehicles," Robotics and Autonomous Systems, vol. 57, no. 6-7, pp. 591-602, 2009.

[14] A. Beyeler, J.-C. Zufferey, and D. Floreano, "optiPilot: control of take-off and landing using optic flow," in Proceedings of the European Micro Air Vehicle Conference and Competition (EMAV '09), Delft, The Netherlands, September 2009.

[15] M. M. Veth and J. Raquet, "Fusion of low-cost imaging and inertial sensors for navigation," in Proceedings of the 19th International Technical Meeting of the Satellite Division, Institute of Navigation (ION GNSS '06), pp. 1093-1103, Fort Worth, Tex, USA, September 2006.

[16] D. Dusha, W. Boles, and R. Walker, "Fixed-wing attitude estimation using computer vision based horizon detection," in Proceedings of the International Unmanned Air Vehicle Systems Conference, April 2007.

[17] D. Dusha, W. Boles, and R. Walker, "Attitude estimation for a fixed-wing aircraft using horizon detection and optical flow," in Proceedings of the 9th Biennial Conference of the Australian Pattern Recognition Society on Digital Image Computing Techniques and Applications (DICTA '07), pp. 485-492, IEEE, Glenelg, Australia, December 2007.

[18] S. Weiss, M. W. Achtelik, S. Lynen, M. Chli, and R. Siegwart, "Real-time onboard visual-inertial state estimation and self-calibration of MAVs in unknown environments," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '12), May 2012.

[19] M. Dille, B. Grocholsky, and S. Singh, "Outdoor downward-facing optical flow odometry with commodity sensors," in Field and Service Robotics: Results of the 7th International Conference, vol. 62 of Springer Tracts in Advanced Robotics, pp. 183-193, 2010.

[20] S. I. Roumeliotis, A. E. Johnson, and J. F. Montgomery, "Augmenting inertial navigation with image-based motion estimation," in Proceedings of the IEEE International Conference on Robotics & Automation, pp. 4326-4333, Washington, DC, USA, May 2002.

[21] M. Achtelik, M. Achtelik, S. Weiss, and R. Siegwart, "Onboard IMU and monocular vision based control for MAVs in unknown in- and outdoor environments," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA '11), pp. 3056-3063, Shanghai, China, May 2011.

[22] P.-J. Bristeau, F. Callou, D. Vissiere, and N. Petit, "The navigation and control technology inside the AR.Drone micro UAV," in Proceedings of the 18th IFAC World Congress, pp. 1477-1484, Milano, Italy, September 2011.

[23] H. Chao, Y. Gu, and M. Napolitano, "A survey of optical flow techniques for robotics navigation applications," Journal of Intelligent and Robotic Systems: Theory and Applications, vol. 73, no. 1-4, pp. 361-372, 2014.

[24] W. Ding, J. Wang, S. Han, et al., "Adding optical flow into the GPS/INS integration for UAV navigation," in Proceedings of the International Global Navigation Satellite Systems Society IGNSS Symposium, Surfers Paradise, Australia, 2009.

[25] H. Romero, S. Salazar, and R. Lozano, "Real-time stabilization of an eight-rotor UAV using optical flow," IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, vol. 42, no. 6, pp. 1752-1762, 2012.

[26] V. Grabe, H. H. Bulthoff, and P. R. Giordano, "A comparison of scale estimation schemes for a quadrotor UAV based on optical flow and IMU measurements," in Proceedings of the 26th IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '13), pp. 5193-5200, IEEE, Tokyo, Japan, November 2013.

[27] A. Martinelli, "Vision and IMU data fusion: closed-form solutions for attitude, speed, absolute scale, and bias determination," IEEE Transactions on Robotics, vol. 28, no. 1, pp. 44-60, 2012.

[28] M. B. Rhudy, H. Chao, and Y. Gu, "Wide-field optical flow aided inertial navigation for unmanned aerial vehicles," in Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS '14), pp. 674-679, Chicago, Ill, USA, September 2014.

[29] H. Chao, Y. Gu, J. Gross, G. Guo, M. L. Fravolini, and M. R. Napolitano, "A comparative study of optical flow and traditional sensors in UAV navigation," in Proceedings of the 1st American Control Conference (ACC '13), pp. 3858-3863, Washington, DC, USA, June 2013.

[30] R. C. Hibbeler, Engineering Mechanics: Dynamics, Prentice Hall, 10th edition, 2004.

[31] V. Klein and E. A. Morelli, Aircraft System Identification: Theory and Practice, American Institute of Aeronautics and Astronautics, Reston, Va, USA, 2006.

[32] J. Roskam, Airplane Flight Dynamics and Automatic Flight Controls, DARcorporation, Lawrence, Kan, USA, 2003.

[33] J. N. Gross, Y. Gu, M. Rhudy, F. J. Barchesky, and M. R. Napolitano, "On-line modeling and calibration of low-cost navigation sensors," in Proceedings of the AIAA Modeling and Simulation Technologies Conference, pp. 298-311, Portland, Ore, USA, August 2011.

[34] D. W. Allan, "Statistics of atomic frequency standards," Proceedings of the IEEE, vol. 54, no. 2, pp. 221-230, 1966.

[35] X. Zhiqiang and D. Gebre-Egziabher, "Modeling and bounding low cost inertial sensor errors," in Proceedings of the IEEE/ION Position, Location and Navigation Symposium (PLANS '08), pp. 1122-1132, IEEE, Monterey, Calif, USA, May 2008.

[36] Z. Xing, Over-bounding integrated INS/GNSS output errors [Ph.D. dissertation], The University of Minnesota, Minneapolis, Minn, USA, 2010.

[37] F. L. Lewis and V. L. Syrmos, Optimal Control, Wiley, New York, NY, USA, 2nd edition, 1995.

[38] M. Mammarella, G. Campa, M. L. Fravolini, Y. Gu, B. Seanor, and M. R. Napolitano, "A comparison of optical flow algorithms for real time aircraft guidance and navigation," in Proceedings of the AIAA Guidance, Navigation and Control Conference and Exhibit, Honolulu, Hawaii, USA, August 2008.

[39] A. G. O. Mutambara, Decentralized Estimation and Control for Multisensor Systems, CRC Press, Washington, DC, USA, 1998.

[40] T. Vercauteren and X. Wang, "Decentralized sigma-point information filters for target tracking in collaborative sensor networks," IEEE Transactions on Signal Processing, vol. 53, no. 8, part 2, pp. 2997-3009, 2005.

[41] D.-J. Lee, "Unscented information filtering for distributed estimation and multiple sensor fusion," in Proceedings of the AIAA Guidance, Navigation and Control Conference and Exhibit, Honolulu, Hawaii, USA, 2008.

[42] M. Rhudy and Y. Gu, "Understanding nonlinear Kalman filters," in Interactive Robotics Letters, West Virginia University, 2013, http://www2.statler.wvu.edu/~irl/page13.html.

[43] M. Rhudy, J. Gross, Y. Gu, and M. R. Napolitano, "Fusion of GPS and redundant IMU data for attitude estimation," in Proceedings of the AIAA Guidance, Navigation, and Control Conference, Minneapolis, Minn, USA, August 2012.
