HyperUAS—Imaging Spectroscopy from a Multirotor Unmanned Aircraft System

Arko Lucieer
University of Tasmania, School of Land and Food, Hobart, Tasmania, Australia
e-mail: [email protected]

Zbyněk Malenovský, Tony Veness, and Luke Wallace
University of Tasmania, School of Land and Food, Hobart, Tasmania, Australia
e-mail: [email protected], [email protected], [email protected]

Received 17 October 2013; accepted 8 December 2013

One of the key advantages of a low-flying unmanned aircraft system (UAS) is its ability to acquire digital images at an ultrahigh spatial resolution of a few centimeters. Remote sensing of quantitative biochemical and biophysical characteristics of small-sized, spatially fragmented vegetation canopies requires, however, not only high spatial, but also high spectral (i.e., hyperspectral) resolution. In this paper, we describe the design, development, airborne operations, calibration, processing, and interpretation of image data collected with a new hyperspectral unmanned aircraft system (HyperUAS). HyperUAS is a remotely controlled multirotor prototype carrying onboard a lightweight pushbroom spectroradiometer coupled with a dual-frequency GPS and an inertial movement unit. The prototype was built to remotely acquire imaging spectroscopy data of 324 spectral bands (162 bands in a spectrally binned mode) with bandwidths between 4 and 5 nm at an ultrahigh spatial resolution of 2–5 cm. Three field airborne experiments, conducted over agricultural crops and over natural ecosystems of Antarctic mosses, proved operability of the system in standard field conditions, but also in a remote and harsh, low-temperature environment of East Antarctica. Experimental results demonstrate that HyperUAS is capable of delivering georeferenced maps of quantitative biochemical and biophysical variables of vegetation and of actual vegetation health state at an unprecedented spatial resolution of 5 cm. © 2014 Wiley Periodicals, Inc.

Journal of Field Robotics 00(0), 1–20 (2014) © 2014 Wiley Periodicals, Inc. View this article online at wileyonlinelibrary.com DOI: 10.1002/rob.21508

1. INTRODUCTION

The recent growth and availability of semiautonomous multirotor helicopters, low-cost Open Source autopilots, and miniaturization of sensors have resulted in the adoption of unmanned aircraft system (UAS) technology in a wide range of science disciplines, especially in earth observation and aerial surveying. In the field of robotics, Cole, Sukkarieh, and Göktoğan (2006), Hrabar (2012), and Kendoul (2012) showed significant advances in guidance, navigation, control, obstacle avoidance, and information gathering. Some of these advances in unmanned aerial vehicle (UAV) technology have resulted in the adoption of UASs in more applied fields. Zhang and Kovacs (2012) provided a review of the application of UASs for precision agriculture, and they predicted an increased adoption in this area provided there is a relaxation of UAS aviation regulations. While Zarco-Tejada (2008) demonstrated that real-time irrigation scheduling and crop monitoring based on UAS remote sensing will soon become a reality, Bryson, Reid, Ramos, and Sukkarieh (2010) proposed a vision-based image classification that can be used to identify weeds in Red-Green-Blue (RGB) imagery acquired by a UAS over farmland. Anderson and Gaston (2013) highlighted the advantages of lightweight UASs in spatial ecology, and they identified four key advantageous attributes: i) their survey-revisit period controlled by the user allows for a high temporal resolution; ii) their low-altitude flights permit data collection at appropriate spatial resolutions; iii) their operating costs are low compared to manned airborne surveys or satellite observations; and iv) they can carry various imaging and non-imaging payloads based on the demands of the application and end-user.

Direct correspondence to: Arko Lucieer, e-mail: Arko.Lucieer@utas.edu.au

Publications of several recent special issues on UASs for remote sensing applications (IEEE TGRS: March 2009; GIScience and Remote Sensing: March 2011; Geocarto International: March 2011; Remote Sensing: June 2012) and strong interest in dedicated conferences such as UAV-g indicate an increasing popularity of UASs for remote sensing and photogrammetry applications. The principal advantage of UAS-based remote sensing is the capacity to bridge the scale gap between field-based observations and high-altitude airborne and satellite observations. From a scientific




perspective, UASs allow optimization of the sampling technique, e.g., selection of optimal spatial resolution and sensor type, according to the nature of objects of interest and application requirements (d'Oleire-Oltmanns, Marzolff, Peter, & Ries, 2012).

Numerous UAS studies have employed structure-from-motion (SfM) computer vision techniques to derive three-dimensional (3D) models of objects on the earth's surface from multiview RGB photography. For example, Niethammer, James, Rothmund, Travelletti, and Joswig (2012) combined multirotor UAS and SfM image processing techniques to acquire accurate 3D models of a landslide. Similar techniques were applied by d'Oleire-Oltmanns et al. (2012) and Harwin and Lucieer (2012) to quantify the impact of erosion in Moroccan badland landscapes and Tasmanian coastal landscapes, or by Turner, Lucieer, and Watson (2012) and Lucieer, Turner, King, and Robinson (2014) to generate accurate 3D models of Antarctic moss beds. These studies demonstrated the capability of UASs to produce 3D terrain models with absolute spatial accuracies in the order of centimeters.

Besides digital RGB cameras, more advanced optical sensors, such as multi- or hyperspectral spectrometers, thermal cameras, and laser scanners, have been integrated within UASs. Nagai, Chen, Shibasaki, Kumagai, and Ahmed (2009), Lin, Hyyppä, and Jaakkola (2011), and Wallace, Lucieer, Watson, and Turner (2012) successfully built a UAS with a mini-LiDAR instrument designed for 3D scanning of the terrain. Laliberte, Goforth, Steele, and Rango (2011) and Kelcey and Lucieer (2012) collected multispectral images from a UAS for rangeland and saltmarsh vegetation mapping. Berni, Zarco-Tejada, Suárez, and Fereres (2009) combined thermal and multispectral UAS images for stress detection in Mediterranean orchards. Finally, several publications reported UAS experiments with pushbroom imaging spectroradiometers, e.g., Zarco-Tejada et al. (2009); Zarco-Tejada, González-Dugo, and Berni (2012); Zarco-Tejada, Morales, Testi, & Villalobos (2013); and Hruska, Mitchell, Anderson, and Glenn (2012). In spite of this progress, further development of advanced optical UASs equipped with high-resolution imaging spectrometers is needed to facilitate their broad operational use.

Imaging spectroradiometers have a long-standing tradition in optical remote sensing of the earth. For example, the thematic mapper (TM) spectrometers onboard the Land Remote Sensing Satellite (LANDSAT) missions started acquiring images of seven distinct spectral bands in visible and infrared wavelengths in 1983 (Markham and Helder, 2012). These bands are, however, spectrally broad, integrating from tens to hundreds of nanometers. They provide general reflectance of distinct spectral regions, but they do not allow reconstruction of a detailed continuous reflectance function of observed surfaces. This is feasible only with high spectral resolution imaging spectrometers (also referred to as hyperspectral), recording tens to hundreds

of narrow spectral bands with their full width at half maximum (FWHM) often below 10 nm. An extensive overview of the current operational satellite and airborne imaging spectroradiometers provided in Malenovský, Mishra, Zemek, Rascher, and Nedbal (2009) shows that their high spectral resolution and spectral sampling allow us to retrieve not only qualitative but also quantitative information, such as, for instance, the content of green foliar pigments in agricultural crops or forest canopies expressed in μg of chlorophyll per cm² of a plant leaf area (Haboudane, Miller, Tremblay, Zarco-Tejada, & Dextraze, 2002; Malenovský et al., 2013). Still, the high-altitude flying airborne and spaceborne platforms offer only a limited spatial detail, i.e., spatial resolution in the order of meters.

Despite the limited number of suitable lightweight hyperspectral spectroradiometers, several research teams are endeavoring to develop hyperspectral-enabled UAS platforms (Hruska et al., 2012). Only a few teams, however, are flying a hyperspectral UAS operationally and using the acquired data in real scientific applications. One such example is the Laboratory for Research Methods in Quantitative Remote Sensing (QuantaLab, IAS-CSIC, Córdoba, Spain), which accommodated a Micro-Hyperspec sensor (Headwall Inc., USA) in a fixed-wing UAS. They employed the collected imaging spectroscopy data in precision agriculture applications (Zarco-Tejada, 2008), particularly in the estimation of drought stress in citrus orchards (Zarco-Tejada et al., 2012) and more recently in assessing actual photosynthetic activity of production vineyards using a retrieved steady-state chlorophyll fluorescence signal (Zarco-Tejada et al., 2012, 2013). Although it is fully operational, this hyperspectral UAS is optimized to acquire images with a spatial resolution of about 40 cm, which may still be insufficient for observing sparse crops of small individual plants (e.g., lettuce or cabbage fields) or for fine-scale monitoring of endangered plant species growing in highly fragmented spatial formations due to extremely harsh environmental conditions (e.g., alpine pioneering grasses or Antarctic mosses). To achieve an ultrahigh spatial resolution, slow flights (


an in-depth description of the other components of the scientific payload for position and orientation determination, time synchronization, image acquisition, and power supply, followed by the integration of the scientific payload with the multirotor UAS. Practical aspects of field operations and data acquisition are discussed in Section 2.3. The image processing workflow consisting of spectral, radiometric, and atmospheric correction is presented in Section 2.4. The derivation of spectral variables quantifying biophysical and biochemical vegetation characteristics is discussed in Section 2.5. The application of HyperUAS is demonstrated in three field experiments (Section 3). Experiment I demonstrates proof of concept of ultrahigh resolution hyperspectral pushbroom scanning over an agricultural crop with a focus on the spectral processing workflow. In Experiment II, HyperUAS is tested in the remote and harsh environment of the Antarctic continent for hyperspectral scanning of Antarctic moss beds. Finally, Experiment III is designed to demonstrate the full integration of the hyperspectral sensor with accurate positioning and orientation sensors for direct georeferencing.

    2. MATERIAL AND METHODS

    2.1. Hyperspectral Sensor

The hyperspectral sensor used in this study is the Headwall Photonics Micro-Hyperspec VNIR scanner (Headwall Inc., USA). The dimensions of the sensor unit are 110 by 93 by 50 mm and it weighs 960 g. The incoming light is received through a lens objective with an aperture of f/2.8 and through a slit of 25 μm. The spectral wavelengths are separated by a concentric spectrograph with an aberration-corrected convex holographic diffraction grating and projected onto a charge-coupled device (CCD) matrix. Each column of the matrix records the spatial dimension and each row records separated wavelengths. For each capture of the CCD matrix, one row of spatial pixels is recorded simultaneously, hence the sensor has to move to sweep up a full spatial image. This design is referred to as a pushbroom scanner. The CCD chip has a 12-bit dynamic range and a Base Camera Link as the data interface.
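The frame layout described above (columns = across-track spatial pixels, rows = spectral bands, one line per capture) can be sketched as a simple cube-assembly step. The sizes follow the figures in the text (1,004 spatial pixels, 162 bands in binned mode); the frames themselves and the `assemble_cube` helper are illustrative, not the vendor's API.

```python
import numpy as np

# Each pushbroom capture is one CCD readout: rows = spectral bands,
# columns = across-track spatial pixels. Sweeping the sensor forward
# and stacking successive captures yields a (lines, samples, bands)
# image cube. 12-bit values are held in uint16.

SAMPLES, BANDS = 1004, 162  # binned mode, per the text

def assemble_cube(frames):
    """Stack per-line CCD frames (bands x samples) into a hypercube."""
    return np.stack([f.T for f in frames], axis=0)  # (lines, samples, bands)

frames = [np.zeros((BANDS, SAMPLES), dtype=np.uint16) for _ in range(10)]
cube = assemble_cube(frames)
print(cube.shape)  # (10, 1004, 162)
```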

The size and spatial distribution of the object of interest define the optimal spatial resolution required to observe this object. The Micro-Hyperspec has a lens objective with a focal length of 8 mm and a field of view (FOV) of 49.8°, a slit entrance of 25 μm, and a CCD matrix with 1,004 pixels across the spatial dimension. Flying at 11 m above ground level (AGL), the scanner acquires imagery with a pixel size of 1.0 cm in the across-track direction and an image swath of about 10.2 m. The finest physically feasible spectral sampling of the Micro-Hyperspec spectrometer is 324 spectral bands between 361 and 961 nm with a FWHM between 4.12 and 4.67 nm, allowing a maximum frame rate of about 93 frames s⁻¹. The spectral sampling can be reduced to half, i.e., 162 spectral bands with a FWHM between 4.75 and 5.25 nm,

by binning two neighboring spectral elements of the sensor matrix as one physical unit. In this configuration, the maximum frame rate increases to 115 frames s⁻¹. The general aim is to use a longer frame period and integration time, resulting in a higher signal-to-noise ratio (SNR) of recorded spectra, but at the same time to prevent oversaturation of the sensor dynamic range. In the binned mode, the frame period and integration time of about 20 ms combined with the flight speed of 2.5 m s⁻¹

at 11 m above the ground produces a pixel size of 5.0 cm in the along-track direction. The across-track dimension has to be consequently reduced by a factor of 0.2 to obtain regular squared pixels of 5.0 cm. This spatial resolution was found to be adequate for capturing the objects of interest of this study (i.e., grassland, barley crop, and also Antarctic moss beds) in sufficient detail. Therefore, we used these settings for all three airborne experiments presented in this paper. To fulfill the experimental purpose of the orthorectification experiment, we additionally collected a binned hyperspectral image of the highest attainable spatial resolution of 2.0 cm.
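The ground-sampling arithmetic above can be reproduced directly from the quoted sensor and flight figures (FOV 49.8°, 1,004 spatial pixels, 11 m AGL, 2.5 m s⁻¹, 20 ms frame period); the helper names are my own.

```python
from math import tan, radians

# Swath = 2 * AGL * tan(FOV/2); across-track pixel = swath / pixels;
# along-track pixel = ground speed * frame period. Figures from the text.

def gsd(agl_m, fov_deg=49.8, pixels=1004, speed_ms=2.5, frame_period_s=0.020):
    swath = 2 * agl_m * tan(radians(fov_deg / 2))
    across = swath / pixels              # across-track pixel size (m)
    along = speed_ms * frame_period_s    # along-track pixel size (m)
    return swath, across, along

swath, across, along = gsd(11.0)
print(f"swath {swath:.1f} m, across {across*100:.1f} cm, along {along*100:.1f} cm")
# swath 10.2 m, across 1.0 cm, along 5.0 cm
```

This matches the ~10.2 m swath and 1.0 cm across-track pixel quoted in the text; averaging five native across-track pixels (the factor of 0.2) yields the square 5.0 cm pixel.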

2.2. Design and Integration of Scientific Sensor Payload

To successfully acquire directly georeferenced hyperspectral imagery using the Micro-Hyperspec spectrometer, a number of electronic and mechanical design criteria have to be met. Although similar design criteria are applicable for image acquisition undertaken from a low- and slow-flying UAS as from a manned fixed-wing aircraft, the desire to utilize a small multirotor UAS presented a unique challenge in terms of size and weight constraints. Figure 1 provides an overview of the three integrated electronic subsystems of the scientific payload: 1) the position and orientation system (POS); 2) the digital image acquisition (DIA); and 3) the power regulation subsystem, described in detail below.

2.2.1. Position and Orientation System (POS) Subsystem

The position and attitude of the spectrometer must be precisely recorded at the time of every frame capture for postflight georeferencing. The spectrometer has six degrees of freedom (6 DOF) when rigidly attached to the UAS. The three degrees of freedom of the UAS trajectory (x, y, and z) and the three degrees of freedom of the UAS attitude along its trajectory (pitch, roll, and yaw) all need to be accurately measured and recorded. It should be noted that stable flight of a multirotor UAS is facilitated by the closed-loop algorithms implemented in the firmware of all commercial multirotor flight controller avionics, which require accurate continuous measurement of dynamic attitude (and optionally position) in combination with the closed-loop control of motor speed. Input from the pilot is also taken into account within the control loops, allowing the UAS to translate away from stable hover. The measurement of multirotor UAS airframe attitude and position is typically undertaken using



    Figure 1. Schematic of the electronic subsystems of the scientific payload.

cost-effective two- or three-axis micro-electro-mechanical systems (MEMS) based rate gyroscopes and accelerometers, along with barometers, magnetometers, and a navigation-grade single-frequency global positioning system (GPS) receiver (Lim, Park, Daewon, & Kim, 2012).

The consumer-grade components of the POS system of a multirotor UAS are typically low in cost (less than $100). The noise, drift, nonlinearity, and temperature-dependent characteristics of the low-cost POS components are typically not a limitation for achieving stable flight in a well-designed and properly configured multirotor UAS. For direct georeferencing of ultrahigh spatial resolution pushbroom imagery captured from a UAS, however, the accuracy requirements of the POS are much greater. Hruska et al. (2012) relied on the GPS/inertial measurement unit (IMU) solution provided by a high-quality autopilot for direct georeferencing hyperspectral pushbroom imagery acquired from a fixed-wing UAS, achieving a spatial root-mean-squared error (RMSE) of 4.63 m. They concluded that the poor results were partially due to the uncertainty in the GPS/INS solution provided by the autopilot. Berni et al. (2009) likewise outlined the reliance on the internal IMU of the autopilot of a small UAS helicopter to provide position and attitude information for direct georeferencing of both thermal and multispectral

imagery. While position and attitude measurements were available at 20 Hz postflight from the autopilot, the lack of synchronization of POS and image exposure prevented the use of the POS data in a direct georeferencing process. Given our desire to acquire ultrahigh spatial resolution imagery with pixel sizes less than 5 cm, the use of any single-frequency GPS receiver as part of our scientific payload was unrealistic, forcing us to look for a POS solution with a positional accuracy of less than 5 cm.

The components of our POS solution consist of a MEMS IMU (MicroStrain 3DM-GX3-35), a dual-frequency (L1/L2) GPS receiver (Novatel OEMV-1DF), and a lightweight helical L1/L2 antenna (Maxtena M1227HCT-A). A secondary onboard miniaturized computer (Gumstix Verdex Pro) is used to log data from the onboard POS sensors. The onboard GPS observations are postprocessed against a second L1/L2 GPS receiver (Novatel SPAN CPT) logging over a known location using the standard processing algorithms in Novatel GrafNav 8.2 (Novatel, 2012) to produce estimates of position and velocity at a rate of 5 Hz. To derive accurate position and orientation estimates for the sensor at the instance of each frame, these estimates were then fused with the IMU observations of rotational rate (deg/s) and acceleration (m s⁻²) made at 100 Hz, using a loosely coupled



    sigma point Kalman smoother as outlined in Wallace et al.(2012).
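One step this fusion must perform is resampling the 5 Hz GPS solution to the timestamp of every captured frame. The sketch below only linearly interpolates positions to frame times; it is a deliberately simplified stand-in for the loosely coupled sigma point Kalman smoother cited above (which additionally fuses the 100 Hz IMU rates and accelerations to recover attitude), and all numbers are illustrative.

```python
import numpy as np

# Interpolate 5 Hz postprocessed GPS fixes to 50 fps frame timestamps.
# A stand-in for the real GPS/IMU smoother: linear interpolation only,
# no attitude, synthetic trajectory (2.5 m/s along-track at 11 m AGL).

def positions_at_frames(gps_t, gps_xyz, frame_t):
    """Interpolate GPS fixes (N,), (N, 3) to frame timestamps (M,)."""
    return np.column_stack(
        [np.interp(frame_t, gps_t, gps_xyz[:, k]) for k in range(3)]
    )

gps_t = np.arange(0.0, 2.01, 0.2)                       # 5 Hz fixes
gps_xyz = np.column_stack([2.5 * gps_t,                 # easting (m)
                           np.zeros_like(gps_t),        # northing (m)
                           np.full_like(gps_t, 11.0)])  # height (m)
frame_t = np.arange(0.0, 2.0, 0.02)                     # 50 fps frame times
xyz = positions_at_frames(gps_t, gps_xyz, frame_t)
```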

2.2.2. Digital Image Acquisition Subsystem

The following technical criteria had to be taken into account when designing the digital image acquisition system for use on the UAS:

• Since the small Adimec monochrome machine vision camera (part of the Micro-Hyperspec) uses the CameraLink protocol (Base) for image transfer and control, the image acquisition system had to be CameraLink capable (255 MB s⁻¹ at maximum clock rate).

• The image acquisition system must be capable of the sustained recording of megapixel images for tens of minutes at a frame rate between 50 and 100 fps (and optionally recording attitude and position records). Capturing 1,004 × 1,004 12-bit pixel frames at 50 fps requires a sustained data transfer of 96 MB s⁻¹.

• The image acquisition system must be capable of very accurate time-stamping of each image frame (and optionally also each attitude and position record).

• The image acquisition system must be lightweight and consume minimal electrical power to maximize UAS flight time.
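The sustained-transfer figure in the criteria above can be reproduced under one assumption not stated in the text: that each 12-bit pixel is stored padded to 2 bytes, with "MB" read as MiB.

```python
# Sanity check on the sustained write-rate requirement.
# Assumption: 12-bit pixels padded to 16 bits; MB interpreted as MiB.

WIDTH = HEIGHT = 1004
BYTES_PER_PIXEL = 2   # 12-bit value stored in 16 bits
FPS = 50

rate_mib_s = WIDTH * HEIGHT * BYTES_PER_PIXEL * FPS / 2**20
print(f"{rate_mib_s:.1f} MiB/s")  # 96.1 MiB/s
```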

The requirement for highly accurate time stamping of images and position and attitude measurements cannot be overstated. While the accuracy of position and attitude measurements is constrained by UAS payload limitations and the desire for a cost-effective POS, the geometric accuracy of the acquired imagery is also greatly constrained by errors in time-stamping. For example, assuming a speed over ground (SOG) of 2 m s⁻¹, a timing error of 10 ms between image acquisition and position measurement will cause an along-track positioning error of 2 cm. Assuming a rate of change of 10° s⁻¹ in camera roll, a timing error of 10 ms between image acquisition and attitude measurement will cause an approximate across-track error of 2.2 cm (at a flying height of 10 m and FOV of 50°). These calculations show the isolated effect of timing error; however, the effect of a 10 ms timing error on a UAS frame exhibiting real-world dynamics will be more than the 2 cm described above. To maximize the accuracy of the direct image georeferencing process, an uncertainty in time-stamping of less than 100 ns between image capture and POS measurements was desired.
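The two back-of-envelope timing-error figures above can be reworked as follows. For the across-track case I assume the worst case at the swath edge (half the 50° FOV); exact rounding may differ slightly from the ~2 cm figures quoted in the text.

```python
from math import tan, radians

# Geometric effect of a 10 ms timing error, per the worked examples
# in the text. Across-track error is evaluated at the swath edge.

dt = 0.010          # timing error (s)
sog = 2.0           # speed over ground (m/s)
roll_rate = 10.0    # roll rate (deg/s)
agl = 10.0          # flying height (m)
half_fov = 25.0     # half of the 50 deg FOV (deg)

along_err = sog * dt                               # along-track error (m)
droll = roll_rate * dt                             # attitude error (deg)
across_err = agl * (tan(radians(half_fov + droll)) - tan(radians(half_fov)))
print(f"along {along_err*100:.1f} cm, across {across_err*100:.1f} cm")
# along 2.0 cm, across 2.1 cm
```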

    2.2.3. Selection Process of DIA System

Although a variety of image acquisition systems were considered for use on the UAS, many were suited only for laboratory or industrial use, where size and mass are not a constraint. Third-party applications based on National Instruments LabVIEW hardware and software and also frame

grabber printed circuit boards (PCBs) for use with a PC-104-sized single-board computer (SBC) were rejected, as they do not allow for sufficiently accurate image time-stamping. Dehaan, Weston, and Rumbachs (2012) document a hyperspectral data logging system using a mini-ITX motherboard and frame grabber for capturing imagery from a CameraLink-based hyperspectral sensor onboard a fixed-wing UAS. While the ITX format is small compared to the variety of industry standard desktop PC motherboards, the mini-ITX format is still a sizeable 170 mm × 170 mm for use on the multirotor UAS described here. The KINO-QM670 SBC specified in Dehaan et al. (2012) weighs 900 g, while the airborne acquisition unit accompanying an imaging spectrometer onboard a small fixed-wing UAS described in Hruska et al. (2012) was reported to weigh 1.16 kg and measures 102 × 165 × 82 mm.

The IO Industries DVR Express Core was selected as the digital video recorder (DVR) at the core of the image acquisition system. The field-programmable gate array (FPGA) based DVR captured high-speed digital imagery directly from a CameraLink capable camera to a solid-state drive (SSD) without relying on a traditional computer operating system. The system was designed to be portable and thus it utilized a pico-ITX format SBC to command and control the FPGA-based DVR. Design features of relevance for use with this unit on the UAS include the following:

• A variety of interfaces available, including CameraLink (Base).

• The ability to accept a 1 pulse per second (PPS) signal and National Marine Electronics Association (NMEA) telegrams from a GPS receiver allowed microsecond-accurate time-stamping of each image frame.

• The ability to sustain a write speed to a Serial ATA (SATA) SSD allowing image capture at 100 fps.

• The availability of technical documentation and support allowing the trouble-free miniaturization of the electronics.

• The inclusion of a Microsoft (MS) Windows 7 based application (CoreView) for the basic control of DVR electronics (acquisition start and stop, image preview and export, etc.). Importantly, this application allows the export of imagery in a variety of formats suitable for further processing.

The adaptability of the DVR Express Core, the level of manufacturer-provided technical support, and the overall design philosophy ensured that time-stamping and sustained write speed requirements would be robustly fulfilled.

    2.2.4. Implementation of DIA

The IO Industries DVR Express Core unit is comprised of a PC-104-sized DVR PCB, a pico-ITX-sized SBC, and one to four 2.5″ SSDs in an aluminum case with associated power



supplies and cabling. Total weight with a single SSD was over 1.5 kg. After initial bench testing, the manufacturer-supplied DVR system was dismantled to allow a mass- and volume-saving redesign. The suitability of existing SSDs was bench-tested to ensure that their capacity and write speed were sufficient for the intended flight times and frame rates, respectively. The relatively small camera pixel count, moderate frame rate, and relatively short flight times allowed the use of a single SATA 2.5″ SSD connected via one of the DVR's four storage device SATA ports. Custom fibreglass PCBs were designed to physically mount the DVR and SBC along with associated SSD, flash memory modules, isolation, and power regulation circuitry. The 1.8-mm-thick fibreglass PCBs sandwiched the fragile multilayer DVR PCB to achieve lightweight mechanical protection. PCB connectors were suitably placed on the PCB to make wiring as easy as possible in the limited available space. Lightweight cable looms were fabricated to connect system components together. Isolated high-frequency switching direct current (dc-dc) convertors were used to power the major components from a single 18.5 VDC (5S) 5,000 mAh Lithium-Polymer (LiPo) battery.

The use of the 1 PPS signal and NMEA date and time ($ZDA) telegram from the Novatel OEMV-1DF dual-frequency GPS receiver board allowed very accurate timekeeping, as previously outlined. Novatel states the timing accuracy of the 1 PPS signal to be 20 ns (RMS) between subsequent pulses. The transistor-transistor logic (TTL) level 1 PPS signal was connected directly between the GPS and the DVR PCB via optical isolation. NMEA $ZDA telegrams were transmitted at 9,600 bps at 1 Hz via a RS232-USB converter connected to the SBC running the IO Industries control application (CoreView) under the multitasking MS Windows 7 operating system. The combination of the 1 PPS and $ZDA telegram allowed real-time time-stamping of each frame at 50–100 fps with nanosecond accuracy. The direct connection of the 1 PPS signal to the FPGA-based DVR removed image time-stamping problems caused by latency of the operating system. The custom PCBs, mounting the DVR and SBC (and associated electronics) along with a custom power supply PCB, were mounted on a lightweight aluminum equipment tray via molded rubber vibration dampeners.
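The PPS + $ZDA scheme described above can be illustrated in miniature: the $ZDA telegram labels the absolute UTC second, the hardware PPS edge marks exactly where that second begins, and a frame captured some fraction later inherits a sub-second-accurate timestamp. The telegram below is fabricated for illustration (its checksum is not validated) and the helper names are my own; only the standard NMEA ZDA field layout is assumed.

```python
from datetime import datetime, timezone

def parse_zda(sentence):
    """Return the UTC datetime encoded in an NMEA $--ZDA telegram.

    ZDA layout: $--ZDA,hhmmss.ss,day,month,year,local_zone_h,local_zone_m*hh
    (checksum not validated here).
    """
    fields = sentence.split("*")[0].split(",")
    hhmmss, day, month, year = fields[1], fields[2], fields[3], fields[4]
    return datetime(int(year), int(month), int(day),
                    int(hhmmss[0:2]), int(hhmmss[2:4]), int(float(hhmmss[4:])),
                    tzinfo=timezone.utc)

def frame_timestamp(zda_utc, seconds_since_pps):
    """UTC epoch time of a frame captured seconds_since_pps after the pulse."""
    return zda_utc.timestamp() + seconds_since_pps

t0 = parse_zda("$GPZDA,043612.00,17,10,2013,00,00*6A")  # illustrative telegram
print(frame_timestamp(t0, 0.0123456))
```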

    2.2.5. Power Supply Subsystem

All electrical power for the scientific payload is supplied from two off-the-shelf LiPo battery packs. These batteries are used in addition to the battery packs supplying unregulated dc power to the UAS avionics and motors. Electrical isolation between the UAS avionics and scientific payload electronics is an important design feature to permit testing of either system without reliance on the other. The power supply subsystem consists of a number of isolated dc-dc switching regulators and associated components mounted

on a custom-designed PCB. All regulators are wide-range input devices capable of supplying a regulated 5 or 12 VDC under full load with an unregulated input range between 9 and 36 VDC. This allows LiPo batteries with a cell count between 3 (11.1 V) and 9 (33.3 V) to be used. Isolated regulators are used throughout the design to enable the future electrical isolation of attached subsystems if required. The POS and DIA subsystems are supplied from separate LiPo batteries and power regulation electronics. This allows the POS and power regulation electronics to be removed or replaced as a stand-alone assembly.

    2.2.6. UAS Airframe and Sensor Integration

For the UAS airframe, we chose the Droidworx SkyJib 8 multirotor heavy-lift UAS (Droidworx Limited, New Zealand). The use of carbon fiber sheet and thick-wall rod held together with alloy machine screws enabled the fabrication of a very light but rigid assembly. The undercarriage supported the combined mass of the UAS and the scientific payload (4.5 kg) when standing on the ground. To mount the hyperspectral payload on the UAS, a number of issues had to be considered:

The scientific payload was expected to be easily installed and removed without tools, i.e., avoiding the use of small fasteners. This is especially relevant for work in cold and physically demanding field conditions, such as, for instance, in Antarctica.

UAS flight characteristics must not be compromised by the addition of the scientific payload. While a reduction in flight time and maneuverability is to be expected, the UAS with the scientific payload must be able to fly outdoors in real-world conditions.

The mass of the mechanical components of the scientific payload had to be as low as feasibly possible. The unavoidable mass of the scientific payload greatly limits the flight time of the UAS compared to the modest payload of a simple two-axis gimbal with a standard digital camera. The desired minimal reduction of flight time is, therefore, achievable only by designing lightweight mechanics for the scientific payload.

The scientific payload had to be sufficiently mechanically isolated from the UAS airframe to prevent vibration from causing image motion blur and damage to electronics. The major source of in-flight vibration was the eight electric motors and propellers operating at 4,000–6,000 RPM.

The scientific payload had to be a stand-alone instrument, capable of operation when not mounted on the UAS. This enabled on-ground testing and calibration without the requirement for a UAS or a pilot.

The arrangement of the imaging spectrometer and the IMU within the scientific payload had to permit the rigid mounting of both units. Accurate measurement of

    Journal of Field Robotics DOI 10.1002/rob

Lucieer et al.: HyperUAS—Imaging Spectroscopy from a Multirotor Unmanned Aircraft System

spectrometer attitude would not be possible without hard co-mounting of both instruments.

The L1/L2 antenna for the dual-frequency Novatel GPS receiver and the L1 antenna for the GPS receiver built into the IMU both needed clear views of the sky for optimum performance. The L1/L2 antenna needed to be rigidly mounted to allow measurement of accurate and repeatable lever arm distances from the antenna phase center to the spectrometer focal plane.

Locating the IMU close to the Center of Gravity (CoG)/Center of Rotation of the UAS is advantageous. Attitude measurement using a lightweight MEMS-based IMU is limited in part by the accuracy of the IMU's rate gyroscopes. By locating the IMU at the CoG of the UAS, the IMU is able to measure changes in UAS pitch and roll (using accelerometers) without having to account for large simultaneous changes in the acceleration of the IMU (measured by a rate gyroscope). Changes in IMU acceleration result from natural UAS pitching and rolling when the IMU center is not precisely located at the CoG of the UAS. The smaller the distance between the IMU center and the UAS CoG, the smaller the changes in acceleration, and hence the smaller the adverse effect on the accuracy of dynamic attitude measurements.

Design of the scientific payload was undertaken with these mechanical specifications in mind. A computer-aided design (CAD) environment was used, enabling the on-screen testing of a variety of equipment layouts and construction techniques prior to fabrication. Certain components were fabricated using computer-aided manufacturing (CAM) techniques. Lightweight off-the-shelf components and composite materials were used where possible. SolidWorks (Dassault Systèmes) was employed to create models of components, subassemblies, and assemblies representing the majority of the mechanical components of the UAS and the scientific payload. Measured masses were assigned to all components in SolidWorks to enable accurate total mass comparisons of a variety of physical designs. Accurate mass assignments also allowed the calculation of the CoG of the combined UAS and scientific payload. The comparison of the CoG of the UAS alone with the CoG of the UAS with the scientific payload attached was useful for assuring that UAS hover performance would not be drastically affected.

The Droidworx-supplied undercarriage was replaced with a custom-designed undercarriage. The new undercarriage was fabricated from four sections of 10 mm nylon rod and turned acetal plastic rod mounts. The custom-designed eight rod mounts replaced the eight Droidworx-supplied bottom motor mounts and held the four nylon hoops in semicircular loops between adjacent motor mounts. The new undercarriage provided a larger on-ground footprint, which improved stability during take-offs and landings in windy conditions. The use of flexible nylon rods provided shock absorption to both the UAS and the scientific package during the inevitable occasional hard landing.

The scientific payload was designed to hang on four silicone mounts that were in turn attached to four of the motor booms. The mechanical design of the silicone mounts prevented excessive movement of the scientific package, which would tear the supple silicone mounts. Deviations from a neutral position were limited to approximately 10 mm in all directions, which was deemed sufficient to allow for unhindered movement of the scientific package in general use. The final design of the HyperUAS is illustrated in Figure 2.

    2.2.7. Limitations of Design

The central hub and radial boom configuration of the Droidworx SkyJib airframe prevents the colocation of the CoG of the UAS and the CoG of the scientific payload. This limitation is common to all commercially available heavy-lift multirotor UASs. The thrust-vectoring design of current multirotor UASs means the UAS is constantly pitching, rolling, and yawing to some degree when flying to a set waypoint. Multirotor UASs do not have control surfaces; they rely on deviation from the vertical direction, achieved by varying propeller thrust, to translate from one position to another. As the IMU and the L1/L2 antenna cannot be physically mounted at the CoG of the UAS, the natural rotations will cause small translations of both. Also, as the mass of the scientific payload hangs below the CoG of the UAS, small unwanted pendulous motions are introduced into the POS measurements. Rotation of the UAS around the three axes will cause the rotation of any hard-mounted equipment, such as the scientific payload. These rotations are unavoidable in a hard-mounted sensor system, but the availability of an accurate onboard POS, measuring UAS attitude in conjunction with image capture, enables postprocessing of the geometrical image distortions caused by natural rotations and direct georeferencing of the acquired pushbroom imagery.

    2.3. Description of the Field Operation Procedure

Hyperspectral images should be collected under prevailing direct solar irradiation and during solar noon to ensure a high signal quality. This was the case for the first proof-of-concept experiment. For the second experiment, the images of Antarctic mosses were acquired during the local solar noon, but due to time and logistical constraints, they were acquired only under fully overcast skies. This provided us with an unintentional opportunity to assess the feasibility of HyperUAS deployment under prevailing diffuse irradiation conditions. Finally, the third experiment was conducted under a partially cloudy sky, as it was not focusing on spectral interpretation but only on geometrical corrections and rectification of the acquired images.


Journal of Field Robotics, 2014

Figure 2. HyperUAS: (a) CAD model of the multirotor UAS with the custom-designed payload package; (b) photo of the HyperUAS during a test flight.


Five artificial directionally isotropic (near-Lambertian) spectral calibration panels (each 25 × 25 cm in size), providing a flat reflectance response ranging from 5% to 70% in visible and near-infrared (NIR) wavelengths, were placed on flat ground within the sensor FOV to remove the spectral influence of atmospheric scattering and absorption processes from the acquired images. Their reflectance functions are used during the processing phase as a reference to eliminate confounding atmospheric effects. Finally, bright metallic disks of two sizes (10 and 30 cm in diameter) were systematically deployed within the area of image acquisition during the last two experiments. Precise geographic positions of their center points were measured with the geodetic Differential Global Positioning System (DGPS) Leica 1200 (Leica Geosystems, USA). These ground control points (GCPs) were used in the second experiment for geometric image correction and coregistration, and in the third experiment as the reference points for the GPS/IMU direct georeferencing accuracy assessment.

As the operational flight time of the HyperUAS is currently limited to 2–3 min, and all autopiloted test flights in Antarctica failed due to the unstable magnetic field and the extreme magnetic declination of 98°, the UAS operator had to execute all experimental flights manually. The lens objective of the sensor was focused before each flight at the distance of the flight altitude. A hyperspectral dark current image (i.e., a measure of sensor noise) was then recorded under the actual environmental conditions. After uncovering the sensor lens, a new image acquisition was started and the system was flown along a predefined, approximately 30-m-long transect. The actual UAS location, flight speed, and altitude were monitored in real time via a telemetry link. After landing, the image acquisition was stopped and supportive ancillary ground measurements (e.g., spectral signatures of verification targets and positions of GCPs) were conducted.

2.4. Spectral and Radiometric Calibration, and Atmospheric Correction of Hyperspectral Images

Raw images of the Micro-Hyperspec spectroradiometer are recorded in digital counts of the 12-bit sensor range [i.e., the maximal signal strength is equal to 4,096 digital numbers (DNs)]. Spectral and radiometric calibration of the Micro-Hyperspec sensor had to be carried out prior to its operation to reveal sensor-specific spectral characteristics (i.e., spectral sampling and resolution) and to facilitate the transfer of the DN counts into physically meaningful radiance units (mW cm⁻² sr⁻¹ μm⁻¹). Both calibrations were performed at the optical calibration facility of the Australian national science agency, the Commonwealth Scientific and Industrial Research Organisation (CSIRO), in Canberra. The spectral calibration was conducted in accordance with standard methodology, using a pen-ray line source producing sharp gas emission features of well-known wavelengths (Chen, 1997). The Micro-Hyperspec sensor was exposed to a mercury-argon (Hg-Ar) calibration source HG-1 (Ocean Optics, USA) and set up to collect 100 frames in unbinned as well as binned spectral mode, with the frame period and integration time set to 40 ms. The spectral bands of well-defined features with a single peak were identified and attributed to the 13 known Hg-Ar emission lines at 404.656, 435.833, 546.074, 696.543, 706.722, 727.294, 738.393, 763.511, 772.4, 811.531, 826.452, 842.465, and 912.297 nm (see Figure 3). A linear function established between the recognized wavelengths of the Hg-Ar emissions and the corresponding spectral bands (i.e., spectral elements of the sensor matrix) was used to derive the missing wavelengths of the remaining bands (R² = 99.99%, p < 0.001). Spectral-response-specific Gaussian functions were fitted afterward to each of the 13 Hg-Ar emission features (see Figure 3), and the width at half of the function maximum identified the sensor spectral resolution (i.e., FWHM). The FWHM values of the in-between spectral bands were linearly interpolated. The resulting mean spectral sampling (i.e., spacing between two consecutive spectral bands) of 1.857 and 3.714 nm, with mean FWHM of 4.12 and 5.05 nm, was found for the unbinned and binned spectral modes of the Micro-Hyperspec sensor, respectively.

Figure 3. Spectral signature of the mercury-argon (Hg-Ar) calibration source HG-1 (Ocean Optics, USA) as recorded by the Micro-Hyperspec spectroradiometer during its spectral calibration (324 spectral bands, frame period and integration time set to 40 ms). Numbers above spectral peaks indicate the 13 wavelengths (nm) of the gas emissions used in the Micro-Hyperspec spectral calibration.
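The wavelength assignment and FWHM interpolation described above can be sketched as follows. The band indices and FWHM values below are placeholders; in practice they come from peak detection on the averaged calibration frames and from the Gaussian fits.

```python
import numpy as np

# The 13 Hg-Ar emission wavelengths (nm) used for the spectral calibration.
lines_nm = np.array([404.656, 435.833, 546.074, 696.543, 706.722, 727.294,
                     738.393, 763.511, 772.4, 811.531, 826.452, 842.465,
                     912.297])

# Illustrative band indices of the detected single-peak features (here
# synthesized from an assumed ~1.857 nm sampling starting near 398 nm).
band_idx = np.round((lines_nm - 398.0) / 1.857).astype(int)

# Linear wavelength model lambda(b) = a*b + c fitted to the 13 line/band
# pairs, then evaluated for all 324 unbinned bands.
a, c = np.polyfit(band_idx, lines_nm, deg=1)
wavelengths = a * np.arange(324) + c

# FWHM from the Gaussian fits at the 13 lines (placeholder values), linearly
# interpolated for the in-between bands.
fwhm_lines = np.linspace(4.0, 4.3, lines_nm.size)
fwhm = np.interp(np.arange(324), band_idx, fwhm_lines)
```

The fitted slope `a` recovers the mean spectral sampling (about 1.857 nm for the unbinned mode), and `wavelengths` assigns a center wavelength to every sensor band.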

An integrating sphere is an optical component designed as a hollow ball, with relatively small openings as light entrance and exit ports, and with an interior coated for high diffuse reflectivity, producing a uniform light reflectance at each position. Sensor radiometric calibration was performed with the Micro-Hyperspec facing the opening of the certified optical integrating sphere USR-SR 2000S (Labsphere Inc., USA), equipped with four electric-current-stabilized light sources (labeled as SI and INGAAS lamps), which produce calibrated irradiance levels of traceable uncertainties (Figure 4). A tungsten ribbon filament lamp is the most widely used standard of spectral radiance for visible and NIR wavelengths (Kostkowsky, 1997). The quartz tungsten halogen (QTH) lamps were used to illuminate the Micro-Hyperspec CCD sensor, which recorded images of 100 frames with the integration time varying from 10 to 40 ms to prevent sensor saturation. Images were acquired in unbinned and binned spectral mode with the objective focused on infinity (the case of airborne acquisitions) and at near distance (the case of ground-based acquisitions). Corresponding dark current images (DCIs) were also acquired with the sensor facing a dark integrating sphere port. To compute the radiometric calibration coefficients for each spectral band, the hyperspectral DCI was subtracted from the spectral image containing DN values of predefined irradiance levels, thereby eliminating the sensor white noise. Subsequently, it was divided by the integration time to make the calibration independent of the frame exposure time. The image frames were averaged per band and divided by the appropriate calibration values corresponding to the irradiation intensity used (Figure 4). The radiometric calibration of an airborne and/or ground-measured Micro-Hyperspec image starts analogously, by diminishing the sensor noise by subtracting the DCI recorded before flight from the acquired image and by normalizing it by the applied frame integration time. Each spectral band is multiplied afterward by the corresponding radiometric coefficient to translate the DN counts into physical units of radiance. An example of this conversion, in the case of a grass canopy recorded by the Micro-Hyperspec during the first study experiment, is provided in Figure 5.

Figure 4. Radiometric calibration of the Micro-Hyperspec sensor using the Labsphere optical integrating sphere USR-SR 2000S (a), equipped with four calibrated light sources (lamps) producing predefined irradiances in visible and near-infrared wavelengths (b).
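The DN-to-radiance conversion applied to each acquired frame (dark current subtraction, exposure normalization, per-band scaling) can be sketched as a small function; the function name and array layout are assumptions for illustration.

```python
import numpy as np

def dn_to_radiance(frame_dn, dark_dn, t_int_ms, coeff):
    """Convert raw DN counts of one hyperspectral frame to radiance.

    frame_dn : (pixels, bands) raw 12-bit digital numbers
    dark_dn  : dark current image with the same layout, recorded pre-flight
    t_int_ms : frame integration time in milliseconds
    coeff    : per-band radiometric coefficients from the integrating-sphere
               calibration (radiance per DN per ms); values are assumed here
    """
    # Subtract sensor noise, normalize by exposure, scale to radiance units.
    return (frame_dn.astype(float) - dark_dn) / t_int_ms * coeff
```

Because the calibration coefficients were themselves normalized by integration time, the same coefficients apply regardless of the exposure chosen in flight.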

The empirical line correction (Smith and Milton, 1999) is a standard remote sensing calibration technique that uses spectrally uniform ground targets to remove the optical influences of atmospheric gases and aerosols between the sensor and the observed surfaces, converting at-sensor radiances to surface hemispherical-directional reflectance functions (HDRFs) (Schaepman-Strub, Schaepman, Painter, Dangel, & Martonchik, 2006). The atmospheric correction of Micro-Hyperspec images was carried out in the ENVI/IDL image processing software (Exelis Visual Information Solutions, USA). Five directionally nearly isotropic (i.e., near-Lambertian) gray-scale calibration panels [cream, pearl, light gray, stone gray, and anthracite, Figure 6(a)], which were manufactured to produce a flat reflectance function ranging from 5% up to 70%, were located on the hyperspectral image. Their actual per-band radiances were extracted from the image and empirically related to their HDRF measured beforehand with a spectroradiometer in the laboratory [Figure 6(b)]. The atmospheric correction coefficients (including the maximum solar irradiance and the atmospheric path radiance) computed from this coupling were applied across the whole image to convert the at-sensor radiance to the surface HDRF. An example of the atmospherically corrected grass canopy HDRF, extracted from the Micro-Hyperspec image displayed in Figure 7, is shown in Figure 5 together with the original at-sensor radiance.

Figure 5. Spectral profile of a grass canopy (a) in digital numbers (DN) as acquired by the Micro-Hyperspec sensor from an altitude of 11 m, (b) in radiance units after the radiometric calibration, and (c) in unitless reflectance after the atmospheric correction.
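The empirical line correction amounts to a per-band linear regression over the panels, applied to every pixel. A minimal sketch, with assumed array shapes (the real workflow ran in ENVI/IDL):

```python
import numpy as np

def empirical_line(image_rad, panel_rad, panel_hdrf):
    """Empirical line correction: per-band linear mapping from at-sensor
    radiance to surface HDRF, fitted over the calibration panels.

    image_rad  : (rows, cols, bands) radiometrically calibrated image
    panel_rad  : (n_panels, bands) mean image radiance over each panel
    panel_hdrf : (n_panels, bands) laboratory HDRF of the same panels
    """
    hdrf = np.empty(image_rad.shape, dtype=float)
    for b in range(image_rad.shape[-1]):
        # Fit HDRF = gain * radiance + offset for this band.
        gain, offset = np.polyfit(panel_rad[:, b], panel_hdrf[:, b], deg=1)
        hdrf[..., b] = gain * image_rad[..., b] + offset
    return hdrf
```

The fitted offset absorbs the atmospheric path radiance and the gain absorbs the solar irradiance and transmittance, which is why panels spanning 5–70% reflectance anchor the line well.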

2.5. Quantitative Estimations of Vegetation Biophysical and Biochemical Variables

As demonstrated by several studies published in a special issue of Remote Sensing of Environment on Imaging Spectroscopy (Ustin and Schaepman, 2009), modern hyperspectral remote sensing offers, besides traditional qualitative products (e.g., land cover or land use maps), also the possibility to map the quantitative characteristics of natural and rural ecosystems (e.g., plant functional traits such as leaf area index or leaf chlorophyll content). The quantitative vegetation traits can, for instance, help to predict potential crop yield (Clevers, 1997) or indicate a level of environmental stress impact in vulnerable natural areas (Kokaly, Asner, Ollinger, Martin, & Wessman, 2009). In this study, we demonstrate the capability of our HyperUAS data to generate these products at an unprecedentedly high spatial resolution of 5 cm.

An efficient way to quantify vegetation traits from hyperspectral images is the transformation of acquired reflectance spectra into so-called optical vegetation indices (VIs). VIs are mathematical combinations of purposely selected spectral bands that can maximize sensitivity toward biophysical or biochemical variables and simultaneously minimize the effects of confounding environmental factors (e.g., a negative spectral influence of bare soil underneath the vegetation canopy) (Myneni, Hall, Sellers, & Marshak, 1995). In our first and second experiments, we applied the following three specific VIs: i) the most frequently used normalized difference vegetation index (NDVI), proposed to separate green vegetation from abiotic surfaces based on their red/NIR spectral contrast, ii) the ratio of the transformed chlorophyll absorption in reflectance index (TCARI) and the optimized soil-adjusted vegetation index (OSAVI), designed to assess total leaf chlorophyll a + b content (Cab) as a measure of photosynthetically active foliar pigments (Haboudane et al., 2002), and iii) the modified triangular vegetation index 2 (MTVI2), introduced by Haboudane (2004) to estimate green biomass density via the leaf area index (LAI), defined as the one-sided green leaf area per unit ground area. NDVI was developed based on LANDSAT observations in the 1970s as a simple difference ratio of red and NIR bands (Tucker, 1979). In our experiments, we computed NDVI from the Micro-Hyperspec spectral bands located at 680 nm (ρ680) and 800 nm (ρ800) as

NDVI = (ρ800 − ρ680) / (ρ800 + ρ680). (1)

    TCARI/OSAVI was calculated as

TCARI = 3[(ρ700 − ρ670) − 0.2(ρ700 − ρ550)(ρ700/ρ670)], (2)


Figure 6. Spectral calibration panels: (a) five directionally nearly isotropic (i.e., near-Lambertian) spectral panels of a gray scale (cream, pearl, light gray, stone gray, and anthracite); (b) manufactured using a special paint to provide a flat reflectance function from 5% to 70% in visible and near-infrared wavelengths.

    normalized by

OSAVI = (1 + 0.16)(ρ800 − ρ670) / (ρ800 + ρ670 + 0.16), (3)

where ρ550, ρ670, ρ700, and ρ800 are the reflectance values at 550, 670, 700, and 800 nm. Haboudane et al. (2002) proved that TCARI/OSAVI is highly sensitive to Cab variations while remaining resistant to variations in canopy LAI and solar zenith angle. Using computer models designed to simulate radiative transfer through virtual vegetation canopies, they deduced that the Cab of remotely sensed wheat or grass canopies can be empirically estimated via a nonlinear logarithmic function:

Cab = −30.94 ln(TCARI/OSAVI) − 18.363, (4)

where TCARI/OSAVI is obtained from airborne imaging spectroscopy data. Finally, MTVI2 was computed as

MTVI2 = 1.5[1.2(ρ800 − ρ550) − 2.5(ρ670 − ρ550)] / √[(2ρ800 + 1)² − (6ρ800 − 5√ρ670) − 0.5]. (5)

MTVI2 is a successor of the TVI (triangular vegetation index) (Broge and Leblanc, 2001) that can successfully suppress the negative confounding influence of varying Cab and bare soil beneath the plant canopy. According to Haboudane (2004), it can be used to estimate green LAI from an exponential equation:

LAI = 0.2227 exp(3.6566 · MTVI2), (6)

where MTVI2 is expected to originate from a hyperspectral image of structurally homogeneous plant canopies.
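The per-pixel index computations can be combined into one routine. The sketch below follows the published formulations of Haboudane et al. (2002) and Haboudane (2004); the input values in the usage example are illustrative reflectances for a green canopy, not measured HyperUAS data.

```python
import numpy as np

def vegetation_products(r550, r670, r680, r700, r800):
    """NDVI, TCARI/OSAVI, MTVI2, and the derived Cab (ug cm^-2) and
    LAI (m^2 m^-2) from HDRF values at the named wavelengths (nm)."""
    ndvi = (r800 - r680) / (r800 + r680)
    tcari = 3.0 * ((r700 - r670) - 0.2 * (r700 - r550) * (r700 / r670))
    osavi = (1.0 + 0.16) * (r800 - r670) / (r800 + r670 + 0.16)
    mtvi2 = (1.5 * (1.2 * (r800 - r550) - 2.5 * (r670 - r550))
             / np.sqrt((2.0 * r800 + 1.0) ** 2
                       - (6.0 * r800 - 5.0 * np.sqrt(r670)) - 0.5))
    cab = -30.94 * np.log(tcari / osavi) - 18.363   # Haboudane et al. (2002)
    lai = 0.2227 * np.exp(3.6566 * mtvi2)           # Haboudane (2004)
    return ndvi, tcari / osavi, mtvi2, cab, lai
```

Applied to the atmospherically corrected HDRF cube, the same code runs elementwise over whole image bands, since the inputs may be NumPy arrays rather than scalars.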

    3. EXPERIMENTAL RESULTS

3.1. Experiment I: Assessing Chlorophyll Content and Green Biomass of Pasture and Barley Crop

The first proof-of-concept experiment was designed to test the general field operation of the HyperUAS. This first test was carried out without GPS/IMU georeferencing capability, to investigate the steadiness of the radiometric and spectral calibration of the Micro-Hyperspec mounted on the multirotor SkyJib airframe. The main intention was to prove that HyperUAS images can provide spatially meaningful patterns of biochemical and biophysical vegetation variables, particularly Cab and LAI, at a spatial resolution of 5 cm. The experimental flight was conducted over a canopy of grass pasture and green barley crop at the research farm of the University of Tasmania (UTAS, Hobart, Australia) on a sunny day, 8 November 2012. The image data were acquired in binned spectral mode (i.e., 162 spectral bands), which, after radiometric calibration and atmospheric correction, resulted in the expected standard HDRF signatures of the sensed agricultural canopies (Fig. 5). TCARI/OSAVI and MTVI2 were subsequently computed [Eqs. (2), (3), and (5)] for a geometrically consistent image subset of approximately 35 m in length and converted into the Cab and LAI products [Eqs. (4) and (6)].

As illustrated in Fig. 7, the spatial patterns of both products are highly detailed, yet systematic, meaningful, and consistent. Although we had no possibility to collect ground truth measurements for direct validation of the retrieved characteristics, the quantitative ranges of both variables were found to be plausible, estimating most of the Cab values between 10 and 60 μg cm⁻² and LAI between 1 and 10 m² m⁻².


Figure 7. The radiometrically and atmospherically corrected Micro-Hyperspec image in a near-infrared color representation [bands 120 (800 nm), 84 (670 nm), and 57 (570 nm) mapped to RGB], acquired over a grass and barley canopy on 8 November 2012 (a), and the subsequently generated biochemical and biophysical products of actual leaf chlorophyll content (b) and true leaf area index (c).

They also correspond to previously published results for similar canopies (Haboudane, 2004; Haboudane et al., 2002). Grassland exhibits, in general, lower Cab and LAI than barley, which might be caused by stress from animal trampling and grazing. More interestingly, the 5 cm pixel size allows identification of very small canopy openings (only a few centimeters wide) in the barley field, with LAI gradually increasing from their center points toward the edges. These openings are, however, not identifiable in the Cab pattern, suggesting that they do not limit the actual photosynthetic capability of the surrounding plants. Such a detailed observation would not be achievable without combining the narrow spectral bands of


the Micro-Hyperspec with the ultrahigh spatial resolution obtainable by the HyperUAS.

3.2. Experiment II: Mapping Actual Vigor of Antarctic Moss Beds

The second experiment tested the performance of the HyperUAS in the remote and harsh, low-temperature environment of Antarctica, during our scientific activities mapping the current health vigor of Antarctic moss beds. The mosses Schistidium antarctici, Ceratodon purpureus, and Bryum pseudotriquetrum are dominant photosynthetically active plant species growing along the rocky coast of Antarctica. Recent changes in temperature, wind speed, and stratospheric ozone are, however, negatively influencing their growth rate, health state, and spatial abundance. A need for mapping the actual state of Antarctic moss beds arose within the scientific community in the past decade. Because the vegetation season in Antarctica is very short, lasting only about three months, and weather conditions change fast, the moss-monitoring approach must be time-efficient, easily deployable, and still able to capture the highly fragmented distribution of moss patches only a few centimeters in size in rocky terrain. This practical mapping problem was one of the motivations for developing the HyperUAS, which fulfills all these requirements. After rigorous preparations, the HyperUAS was successfully flown over several moss beds surrounding the Australian Antarctic station Casey (Windmill Islands, East Antarctica) in January and February of 2013. For more information about the study site, see Lucieer et al. (2014). The spectrally binned nadir-looking hyperspectral images acquired from 11 m AGL were first radiometrically calibrated and corrected for atmospheric influence. Geometric correction required an orthophoto to be acquired as a reference image.

Directly after the HyperUAS flight, we flew a smaller OktoKopter with a digital single-lens reflex (DSLR) camera to collect ultrahigh resolution RGB imagery. These photos were collected to derive a 5 mm resolution orthophoto mosaic and a 1 cm resolution digital surface model (DSM) based on SfM techniques [for more details on this workflow, see Turner et al. (2012) and Lucieer et al. (2014)]. Twenty large GCP disks were laid out along the HyperUAS flight transect, with a 30 m measuring tape in the middle of the strip for scale and georectification purposes. Thirty smaller disks were spread out to identify areas for spectral field sampling. An ASD HandHeld2 field spectrometer was used to collect reference spectra for the calibration panels described in Section 2.4, and to collect validation spectra for ten rocky areas and ten moss-covered areas. All disks were geolocated with a dual-frequency geodetic GPS receiver (DGPS) with an absolute geometric accuracy of 2–4 cm. Accurate coordinates for the large GCPs were used in the bundle adjustment of the SfM workflow, resulting in a 3D model of the moss bed topography at 2–4 cm accuracy. The orthophoto mosaic was

exported to a GeoTIFF in the Universal Transverse Mercator map projection (UTM Zone 49 South, datum: WGS 1984).

For this experiment, we had not been able to fully integrate and test the POS subsystem on the HyperUAS. Therefore, we had to rely on slow and steady flights for reasonable image geometry. The HyperUAS image was coregistered to the orthophoto mosaic by matching all 50 GCPs. The GCP disks were clearly visible in both images, and the GCPs were manually identified in each image. A triangulation (i.e., rubber sheeting) transformation was applied to the hyperspectral image to 1) geometrically transform the image into the WGS84 UTM 49S coordinate system, 2) correct for geometric distortions resulting from flight dynamics, and 3) coregister the hyperspectral image to the orthophoto and DSM (Toutin, 2004). The coregistration was carried out with the widely used image-to-image registration tools in the IDL/ENVI software. We then compared the spectral signatures of the geometrically corrected HyperUAS image with spectral signatures observed in the field. An example of this comparison for a healthy moss patch is shown in Figure 8. The signatures match very closely, with an RMSE of 0.6% reflectance. After all correction steps, the biophysical and biochemical properties of the moss beds were derived.
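The image-to-field comparison (a 3 by 3 kernel average of image pixels against the ASD spectrum) reduces to a short computation. A sketch, assuming both spectra have already been resampled to common bands:

```python
import numpy as np

def kernel_mean_spectrum(cube, row, col, k=3):
    """Mean spectrum of a k-by-k pixel neighborhood centered on (row, col)."""
    h = k // 2
    return cube[row - h:row + h + 1, col - h:col + h + 1, :].mean(axis=(0, 1))

def spectral_rmse_percent(image_spec, field_spec):
    """RMSE between two reflectance spectra, expressed in % reflectance."""
    return float(np.sqrt(np.mean((image_spec - field_spec) ** 2))) * 100.0
```

Averaging over a small kernel suppresses single-pixel noise and residual misregistration before the two signatures are compared band by band.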

The separation of patchy moss from its abiotic rocky surroundings was achieved by selecting pixels with NDVI > 0.6 (Tucker, 1979). Since mosses are photosynthetically active plants growing in a water-filled turf, their reflectance signature is similar to that of the evolutionarily more advanced higher plants, i.e., to some extent reflecting green, strongly absorbing red, and highly reflecting NIR radiance. When stressed by an insufficient water supply and elevated ultraviolet irradiation, a moss turf can change its compactness and color. It can turn from a healthy airy green turf into a stress-resisting denser yellow-brown pack and finally, during the dormant stage, into a compact black biomass (called here, for the sake of simplicity, dead moss). Since these physiological stress reactions change the moss reflectance function, we could apply MTVI2 to distinguish the following four classes of moss vigor: i) high (MTVI2 > 0.75), ii) medium (0.50 < MTVI2 ≤ 0.75), iii) low (0.25 < MTVI2 ≤ 0.50), and iv) dead, i.e., suffocated dormant moss (MTVI2 ≤ 0.25).


Figure 8. Comparison of spectral signatures for a patch of healthy moss derived from the HyperUAS image (3 by 3 kernel average) and the matching location measured in the field with an ASD field spectrometer.

3.3. Experiment III: Incorporating GPS and IMU for Orthorectification

The last experiment was carried out to test the integration of the POS subsystem and to use position and orientation data synchronized with the hyperspectral image lines for direct georeferencing and orthorectification. The orthorectification workflow is based on the airborne Parametric Geocoding (PARGE) approach (PARGE Manual Version 3.1, 2011). Although it was originally designed for airborne data acquired from high altitudes with a pushbroom scanner mounted on highly stable gyroscopic platforms, PARGE was found to perform effectively also for our small-sized UAS platform flying at lower altitudes of around 10 m. PARGE considers the parameters of the aerial platform and the terrain geometry and creates orthorectified imaging spectrometry cubes using a forward transformation algorithm (Schlapfer and Richter, 2002). Based on the sensor geometry, the GPS flightpath, and the sensor orientation, an image geometry map storing the coordinates of every pixel is first calculated by PARGE. An appropriate resampling method is then selected to obtain a final georectified image cube. Optionally, a DSM can be provided for rigorous orthorectification.
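A minimal flat-terrain version of such a forward transform can be sketched as follows. The axis and rotation conventions here are illustrative assumptions, not PARGE's actual implementation, and the sensor model is reduced to a set of across-track view angles.

```python
import numpy as np

def line_footprint(pos, roll, pitch, yaw, view_angles, z_ground=0.0):
    """Project the across-track pixels of one pushbroom line onto a flat
    surface: a simplified forward transform in the spirit of PARGE.

    pos         : sensor position (easting, northing, height) in a local frame
    roll, pitch, yaw : platform attitude in radians
    view_angles : per-pixel across-track view angles (radians) from the
                  sensor geometric model
    Returns an (n_pixels, 2) array of easting/northing footprints.
    """
    pos = np.asarray(pos, dtype=float)
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    # Body-to-local rotation, applied as yaw * pitch * roll.
    R = np.array([[cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
                  [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
                  [-sp,     cp * sr,                cp * cr]])
    # Nadir-looking view vectors tilted across-track in the body frame.
    v_body = np.stack([np.zeros_like(view_angles),
                       np.sin(view_angles),
                       -np.cos(view_angles)], axis=1)
    v = v_body @ R.T
    t = (z_ground - pos[2]) / v[:, 2]   # scale to the ray-plane intersection
    return pos[:2] + t[:, None] * v[:, :2]
```

Repeating this per scan line with the synchronized POS solution yields the image geometry map from which the georectified cube is resampled; a DSM would replace the flat-plane intersection with a ray-surface intersection.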

During field trials, electrical interference was observed between the dual-frequency GPS receiver and other electronic components of the scientific payload. Using the Novatel Connect software on an attached laptop, the SNR of the L1 and L2 GPS signals and various quality flags were observed in real time. The operation of the DVR and the SBC had large detrimental effects on the L1/L2 GPS reception. When either was switched on, a large reduction in both the L1 and L2 SNR was repeatedly observed. The GPS receiver would often lose lock completely when the DVR or SBC was switched on. Investigations confirmed that the interference was direct electromagnetic radiation from the DVR and SBC into the GPS receiver and GPS antenna. Grounded lightweight brass and aluminum shields were installed around the DVR and GPS, and an aluminum ground plane was fitted to the GPS helical antenna to reduce the effects of interference. The shielded GPS receiver was located as far from the DVR and SBC as was practical. While the electromagnetic interference was not completely eliminated, the shielding reduced it to an acceptable level.

    The field trial was carried out on the (flat) university sports oval. The HyperUAS was manually flown at approximately 10 m AGL along a 20 m transect at 2–3 m/s. GPS and IMU data were processed against a nearby base station, and a position and orientation solution was generated for each image line using a sigma-point Kalman filter. A flat surface was assumed for orthorectification in PARGE. The pitch, roll, and yaw angles of the flight are presented in Figure 10, and summary statistics of these sensor attitude data are provided in Table I. Nine large GCP disks and 36 smaller disks were spread out on either side of a straight measuring tape to check the geometry of the hyperspectral image strip and to assess the geometric accuracy of the direct georeferencing technique. The resulting image has a spatial resolution of

    Journal of Field Robotics DOI 10.1002/rob


    Figure 9. Radiometrically and atmospherically corrected hyperspectral image of an Antarctic moss bed at Robinson Ridge (East Antarctica, 5 February 2013) displayed in false near-infrared colors (a), and a derived map of actual moss vigor superimposed over the 3D terrain model with an orthophoto mosaic in natural colors, with the graph displaying the relative abundance of moss vigor classes in the image (b).

    Figure 10. Pitch, roll, and yaw plotted against each image line for the sports oval flight in Experiment III. Maximum pitch and roll angles were close to 5°. In this plot, the yaw was normalized by subtracting the average heading (38°) for presentation purposes.



    Figure 11. Orthorectification of the HyperUAS image strip: (a) raw uncorrected data with nonuniform pixels, (b) resampled image with uniform 4 cm resolution pixels, (c) orthorectified hyperspectral image based on synchronized GPS and IMU observations. The observed (white) and true (yellow) locations of the measuring tape and of the ground control disks (GCPs) are indicated.

    1.0 cm across-track and approximately 4.0 cm along-track. Figure 11 shows the resulting image from this trial flight, including the observed (white) and true (yellow) locations of the measuring tape and of the GCPs. The geometric accuracy was calculated for the 45 GCPs and is presented in Table II. The overall root-mean-square error (RMSE) is 25 cm. The maximum error for a pitch or roll angle of 5° is on the order of 87 cm at this flying height, which indicates that the PARGE orthorectification workflow yields a considerably higher absolute spatial accuracy than the uncorrected geometry. Nevertheless, for such a low flight and high spatial resolution, a higher accuracy would be expected. These results present only the outcome of the first direct georeferencing trials; more research needs to be done on



    Table I. Summary statistics for the pitch, roll, and yaw of the HyperUAS. All values are in degrees.

             Min.     Max.    Mean    Abs. Mean   St. Dev.
    Pitch   −4.93     4.13    0.31       2.12       2.48
    Roll    −1.09     4.81    1.48       1.69       1.56
    Yaw     33.78    41.34   38.16      38.16       2.58

    Table II. Geometric accuracy of the direct georeferenced image calculated from 45 ground control points.

                Mean (m)   St. Dev. (m)   RMSE (m)
    Easting       0.19        0.09          0.22
    Northing      0.07        0.08          0.13
    Total         0.20        0.09          0.25

    the accuracy of the POS and the accuracy of time synchronization. In addition, lever-arm distances and boresight angles need to be measured in detail and taken into account in the filtering process. Future research will focus on a detailed analysis of the orthorectification workflow. This field trial shows that direct georeferencing of HyperUAS data is feasible.
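The accuracy figures above can be cross-checked with a small sketch: the tilt-induced ground displacement is the flying height times the tangent of the tilt angle (roughly 0.87 m for 5° at 10 m AGL), and the residual statistics of Table II follow from observed versus true GCP coordinates. The function names and array layout below are assumptions for illustration, not part of the described workflow.

```python
import numpy as np

def tilt_displacement(flying_height_m, tilt_deg):
    """Ground offset of a nadir-viewing pixel caused by an uncompensated tilt."""
    return flying_height_m * np.tan(np.radians(tilt_deg))

def georeferencing_accuracy(observed, true):
    """Per-axis mean, standard deviation, and RMSE of GCP residuals,
    plus the overall (total) RMSE, in the style of Table II.

    observed, true : (n, 2) arrays of easting/northing coordinates (m).
    """
    d = observed - true  # (n, 2) residuals
    return {
        "mean": d.mean(axis=0),
        "std": d.std(axis=0, ddof=1),
        "rmse": np.sqrt((d ** 2).mean(axis=0)),
        "total_rmse": float(np.sqrt((d ** 2).sum(axis=1).mean())),
    }

# At 10 m AGL, a 5 degree uncompensated tilt shifts a nadir pixel by ~0.87 m,
# consistent with the worst-case error quoted in the text.
print(round(tilt_displacement(10.0, 5.0), 2))  # 0.87
```

Comparing the ~87 cm worst-case tilt displacement against the 25 cm overall RMSE of Table II is what supports the conclusion that the orthorectification workflow recovers much, though not all, of the attitude-induced error.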

    4. CONCLUSIONS

    This study has demonstrated the design, implementation, and field trials of a hyperspectral unmanned aircraft system (UAS) developed for ultrahigh-resolution image acquisition with an imaging pushbroom spectroradiometer. A scientific payload consisting of the hyperspectral scanner, a high-accuracy position and orientation system (POS), and a time-synchronized digital image acquisition (DIA) system was integrated with a multirotor UAS. An image processing workflow was implemented for spectral, radiometric, atmospheric, and geometric correction of the hyperspectral imagery. Quantitative estimates of biophysical and biochemical vegetation variables, such as leaf area index (LAI) and chlorophyll a + b (Cab) content, were derived from the corrected hyperspectral imagery (with 162 or 324 spectral bands depending on binning). Three real-world field experiments were carried out to demonstrate the effectiveness of the HyperUAS prototype. The first experiment showed that LAI and Cab can be retrieved from 5 cm resolution imagery of grass and a barley crop in southern Tasmania. The second experiment demonstrated that the HyperUAS can be operated in harsh environmental conditions: the system was flown over moss beds in East Antarctica. The hyperspectral image was geometrically corrected via coregistration to an orthophoto mosaic, which was generated from a UAS survey and structure-from-motion (SfM) processing of the RGB photographs. Moss vigor was assessed from the hyperspectral image at unprecedented spatial detail using the modified triangular vegetation index 2 (MTVI2). The third field experiment demonstrated the integration of the POS and DIA subsystems and their use in direct georeferencing of the raw hyperspectral image strip. An absolute accuracy of 25 cm was achieved on the 2 cm resolution imagery.

    The flight time with the described scientific payload in ideal conditions at sea level was limited to 3 min. While the current flight time allowed the work described here to be undertaken successfully, there was very little scope for extending the flight duration with the current hardware. While there is potential to reduce the weight of the scientific payload at the cost of mechanical rigidity, reductions will be modest at best given the unavoidable weight of the key components. Likewise, adding extra LiPo batteries to extend flight time is partially negated by the weight of the extra batteries and the associated cabling and mounts. The stability and flight performance of the UAS, both loaded and unloaded, were acceptable for the work undertaken. The use of a large, well-designed UAV airframe and commercial flight controller electronics meant minimal time was spent building and tuning the UAS. The stability achieved was typical of that achievable during outdoor flight using a commercial MEMS-based flight controller. While the addition of the scientific payload below the unloaded CoG of the UAS may have increased flight stability to some extent, the natural pitch, roll, and yaw movements of the UAS during transects were directly transmitted to the spectrometer affixed in the scientific payload. The results of the direct georeferencing process presented here clearly show the challenge of acquiring ultrahigh spatial resolution imagery from a low- and slow-flying multirotor UAS based around a cost-effective MEMS-based IMU.

    The major limitations to be overcome in future work are the limited flight time and the limited stability of the scientific payload. Future work will explore combining the design features of multirotor and conventional helicopter UASs in an attempt to meld the advantages of both into a platform specifically suited to scientific applications that require longer flight times. In addition, techniques allowing the scientific package to be isolated from changes in UAS attitude will also be explored. Tiimus and Tamre (2012) describe a mechanical, gyroscopically stabilized approach to isolating a camera from changes in UAS attitude. Such an approach, in combination with a considered location of a downward-looking camera and IMU, may greatly improve the results of the direct georeferencing workflow outlined here.

    We have demonstrated that hyperspectral pushbroom scanning from a low- and slow-flying multirotor UAS is feasible, resulting in high-quality hyperspectral imagery and image products at unprecedented levels of detail. The HyperUAS system presented here opens up new opportunities for real-world remote sensing studies



    requiring on-demand ultrahigh-resolution hyperspectral imagery. Future work will investigate the value of the HyperUAS system for precision agriculture applications.

    ACKNOWLEDGMENTS

    This project was funded by Australian Research Council (ARC) project DP110101714 and the Australian Antarctic Division (AAD; Australian Antarctic Science Grants 3130, 1313, and 4046). Logistic support was also provided by these AAD grants. We thank the ANARE expeditioners in 2013 for field support. We acknowledge the support of Jan Hanus from the Global Change Research Centre ASCR, v.v.i. in the Czech Republic, and Tim Malthus and Jose A. Jimenez Berni from the Commonwealth Scientific and Industrial Research Organisation (CSIRO) in Canberra during spectral and radiometric calibration of the Micro-Hyperspec sensor. Darren Turner is acknowledged for his technical support and for piloting the HyperUAS in Tasmania. We thank Aadit Shrestha for his assistance with PARGE processing and data collection for Experiment III.

    REFERENCES

    Anderson, K., & Gaston, K. J. (2013). Lightweight unmanned aerial vehicles will revolutionize spatial ecology. Frontiers in Ecology and the Environment, 11(3), 138–146.

    Berni, J. A. J., Zarco-Tejada, P. J., Suarez, L., & Fereres, E. (2009). Thermal and narrowband multispectral remote sensing for vegetation monitoring from an unmanned aerial vehicle. IEEE Transactions on Geoscience and Remote Sensing, 47(3), 722–738.

    Broge, N. H., & Leblanc, E. (2001). Comparing prediction power and stability of broadband and hyperspectral vegetation indices for estimation of green leaf area index and canopy chlorophyll density. Remote Sensing of Environment, 76(2), 156–172.

    Bryson, M., Reid, A., Ramos, F., & Sukkarieh, S. (2010). Airborne vision-based mapping and classification of large farmland environments. Journal of Field Robotics, 27(5), 632–655.

    Chen, H. S. (1997). Remote sensing calibration systems: An introduction. Hampton, VA: Deepak Publishing.

    Clevers, J. (1997). A simplified approach for yield prediction of sugar beet based on optical remote sensing data. Remote Sensing of Environment, 61(2), 221–228.

    Cole, D. T., Sukkarieh, S., & Göktoğan, A. H. (2006). System development and demonstration of a UAV control architecture for information gathering missions. Journal of Field Robotics, 23(6–7), 417–440.

    Dehaan, R. L., Weston, L. A., & Rumbachs, R. (2012). The design and the development of a hyperspectral unmanned aircraft mapping system for the detection of invasive plants. In Eighteenth Australasian Weeds Conference (pp. 103–107).

    d'Oleire-Oltmanns, S., Marzolff, I., Peter, K., & Ries, J. (2012). Unmanned aerial vehicle (UAV) for monitoring soil erosion in Morocco. Remote Sensing, 4(11), 3390–3416.

    Haboudane, D. (2004). Hyperspectral vegetation indices and novel algorithms for predicting green LAI of crop canopies: Modeling and validation in the context of precision agriculture. Remote Sensing of Environment, 90(3), 337–352.

    Haboudane, D., Miller, J. R., Tremblay, N., Zarco-Tejada, P. J., & Dextraze, L. (2002). Integrated narrow-band vegetation indices for prediction of crop chlorophyll content for application to precision agriculture. Remote Sensing of Environment, 81(2–3), 416–426.

    Harwin, S., & Lucieer, A. (2012). Assessing the accuracy of georeferenced point clouds produced via multi-view stereopsis from unmanned aerial vehicle (UAV) imagery. Remote Sensing, 4(6), 1573–1599.

    Hrabar, S. (2012). An evaluation of stereo and laser-based range sensing for rotorcraft unmanned aerial vehicle obstacle avoidance. Journal of Field Robotics, 29(2), 215–239.

    Hruska, R., Mitchell, J., Anderson, M., & Glenn, N. F. (2012). Radiometric and geometric analysis of hyperspectral imagery acquired from an unmanned aerial vehicle. Remote Sensing, 4(12), 2736–2752.

    Kelcey, J., & Lucieer, A. (2012). Sensor correction of a 6-band multispectral imaging sensor for UAV remote sensing. Remote Sensing, 4(5), 1462–1493.

    Kendoul, F. (2012). Survey of advances in guidance, navigation, and control of unmanned rotorcraft systems. Journal of Field Robotics, 29(2), 315–378.

    Kokaly, R. F., Asner, G. P., Ollinger, S. V., Martin, M. E., & Wessman, C. A. (2009). Characterizing canopy biochemistry from imaging spectroscopy and its application to ecosystem studies. Remote Sensing of Environment, 113, S78–S91.

    Kostkowski, H. J. (1997). Reliable spectroradiometry. La Plata, MD: Spectroradiometry Consulting.

    Laliberte, A. S., Goforth, M. A., Steele, C. M., & Rango, A. (2011). Multispectral remote sensing from unmanned aircraft: Image processing workflows and applications for rangeland environments. Remote Sensing, 3(11), 2529–2551.

    Lim, H., Park, J., Lee, D., & Kim, H. J. (2012). Build your own quadrotor (open-source projects on unmanned aerial vehicles). IEEE Robotics & Automation Magazine, 19(3), 33–45.

    Lin, Y., Hyyppa, J., & Jaakkola, A. (2011). Mini-UAV-borne LIDAR for fine-scale mapping. IEEE Geoscience and Remote Sensing Letters, 8(3), 426–430.

    Lucieer, A., Turner, D., King, D. H., & Robinson, S. A. (2014). Using an unmanned aerial vehicle (UAV) to capture micro-topography of Antarctic moss beds. International Journal of Applied Earth Observation and Geoinformation, 27(Part A), 53–62, doi:10.1016/j.jag.2013.05.011.

    Malenovsky, Z., Homolova, L., Zurita-Milla, R., Lukes, P., Kaplan, V., Hanus, J., Gastellu-Etchegorry, J.-P., & Schaepman, M. E. (2013). Retrieval of spruce leaf chlorophyll content from airborne image data using continuum removal and radiative transfer. Remote Sensing of Environment, 131, 85–102.

    Malenovsky, Z., Mishra, K. B., Zemek, F., Rascher, U., & Nedbal, L. (2009). Scientific and technical challenges in remote sensing of plant canopy reflectance and fluorescence. Journal of Experimental Botany, 60(11), 2987–3004.

    Markham, B. L., & Helder, D. L. (2012). Forty-year calibrated record of earth-reflected radiance from Landsat: A review. Remote Sensing of Environment, 122, 30–40.

    Myneni, R. B., Hall, F. G., Sellers, P. J., & Marshak, A. L. (1995). The interpretation of spectral vegetation indexes. IEEE Transactions on Geoscience and Remote Sensing, 33(2), 481–486.

    Nagai, M., Chen, T., Shibasaki, R., Kumagai, H., & Ahmed, A. (2009). UAV-borne 3-D mapping system by multisensor integration. IEEE Transactions on Geoscience and Remote Sensing, 47(3), 701–708.

    Niethammer, U., James, M. R., Rothmund, S., Travelletti, J., & Joswig, M. (2012). UAV-based remote sensing of the Super-Sauze landslide: Evaluation and results. Engineering Geology, 128, 2–11.

    Schaepman-Strub, G., Schaepman, M., Painter, T., Dangel, S., & Martonchik, J. (2006). Reflectance quantities in optical remote sensing: Definitions and case studies. Remote Sensing of Environment, 103(1), 27–42.

    Schlapfer, D., & Richter, R. (2002). Geo-atmospheric processing of airborne imaging spectrometry data. Part 1: Parametric orthorectification. International Journal of Remote Sensing, 23(13), 2609–2630.

    Smith, G. M., & Milton, E. J. (1999). The use of the empirical line method to calibrate remotely sensed data to reflectance. International Journal of Remote Sensing, 20(13), 2653–2662.

    Tiimus, K., & Tamre, M. (2012). Camera gimbal performance improvement with spinning-mass mechanical gyroscopes. In 8th International DAAAM Baltic Conference on Industrial Engineering, April.

    Toutin, T. (2004). Review article: Geometric processing of remote sensing images: Models, algorithms and methods. International Journal of Remote Sensing, 25(10), 1893–1924.

    Tucker, C. J. (1979). Red and photographic infrared linear combinations for monitoring vegetation. Remote Sensing of Environment, 8, 127–150.

    Turner, D., Lucieer, A., & Watson, C. (2012). An automated technique for generating georectified mosaics from ultra-high resolution unmanned aerial vehicle (UAV) imagery, based on structure from motion (SfM) point clouds. Remote Sensing, 4(5), 1392–1410.

    Ustin, S. L., & Schaepman, M. E. (2009). Imaging spectroscopy special issue. Remote Sensing of Environment, 113, S1.

    Wallace, L., Lucieer, A., Watson, C., & Turner, D. (2012). Development of a UAV-LiDAR system with application to forest inventory. Remote Sensing, 4(6), 1519–1543.

    Zarco-Tejada, P. J. (2008). A new era in remote sensing of crops with unmanned robots. SPIE Newsroom.

    Zarco-Tejada, P. J., Berni, J. A. J., Suarez, L., Sepulcre-Canto, G., Morales, F., & Miller, J. R. (2009). Imaging chlorophyll fluorescence with an airborne narrow-band multispectral camera for vegetation stress detection. Remote Sensing of Environment, 113(6), 1262–1275.

    Zarco-Tejada, P. J., Gonzalez-Dugo, V., & Berni, J. A. J. (2012). Fluorescence, temperature and narrow-band indices acquired from a UAV platform for water stress detection using a micro-hyperspectral imager and a thermal camera. Remote Sensing of Environment, 117, 322–337.

    Zarco-Tejada, P. J., Morales, A., Testi, L., & Villalobos, F. J. (2013). Spatio-temporal patterns of chlorophyll fluorescence and physiological and structural indices acquired from hyperspectral imagery as compared with carbon fluxes measured with eddy covariance. Remote Sensing of Environment, 133, 102–115.

    Zhang, C., & Kovacs, J. M. (2012). The application of small unmanned aerial systems for precision agriculture: A review. Precision Agriculture, 13(6), 693–712.