
Quantitative inspection of wind turbine blades using UAV deployed photogrammetry

Pierce, S. G. 1, Burnham, K. C. 2, Zhang, D. 1, McDonald, L. 1, MacLeod, C. N. 1, Dobie, G. 1, Summan, R. 1, McMahon, D. 2

1 Centre for Ultrasonic Engineering (CUE), Department of Electronic & Electrical Engineering, University of Strathclyde, Royal College Building, 204 George Street, Glasgow G1 1XW
2 Advanced Forming Research Centre (AFRC), 85 Inchinnan Dr, Inchinnan, Renfrew PA4 9LJ

Abstract

Assessment of on- and off-shore wind turbine structures is currently conducted via ground inspection or manually through remote visual capture. As the renewable energy sector endeavours to reduce the levelised cost of energy (LCOE) and an increasing number of end-users take responsibility for operational and maintenance costs, early detection of compromised integrity gains increasing importance. A remote inspection solution employing an unmanned aerial vehicle (UAV) with an algorithmically controlled flight path for 360° trajectories has been combined with a photogrammetry payload. The algorithm ensured that the distance of the UAV from the blade surface was maintained constant in the flap-wise, edge-wise and blade-length directions during image capture to help maintain focus. An on-board camera captured high-resolution images which were subsequently stitched together to output a visual surface reconstruction, providing an overview of the wind turbine blade condition. Comparison of the mesh with a baseline reference model provided sub-millimetre agreement under optimised lighting conditions and excellent matching of defects when operating on a commercial wind turbine blade complete with damage accumulated during industrial use.

1. Introduction

As UAV technology matures and develops, there is increasing interest in using such platforms for asset inspection, particularly of structures with limited or difficult access. The principal inspection technology employed is visual inspection, which is attractive because it is non-contact and allows significant stand-off distances from the object under inspection. UAV delivered visual inspection has been employed for civil structures (1-3), power lines (4), pipelines (5), flare stacks (6), as well as wind turbines (7, 8). The latter sector has been commercially explored by a number of companies (9-13). Typically, two critical techniques are employed for such inspections: detecting defects from offline pictures (8) and path planning for the inspection (7, 14). These techniques bring significant challenges in registering the measurement data to the structure under inspection, and this fundamental issue has long been recognised for all automated approaches to non-destructive evaluation and health monitoring (15, 16). A possible solution to the data registration problem lies in 3D photogrammetric reconstruction, a technique whereby 2D images are used to generate a textured 3D model, providing an intuitive overview of the test object. The reconstructed model contains the approximate image locations, which allows the inspector to easily assess the condition of a test object with complex geometry. Such reconstruction has been used in many fields, such as the digitisation of museum artefacts (17), UAV indoor navigation (18) and nuclear tank inspection (19). The reconstruction can be accomplished with commercial software such as Autodesk ReCap and Agisoft PhotoScan. Researchers have also developed new algorithms to achieve such reconstructions, including real-scale reconstruction (20) and 3D builds based on a single image (21).

Our approach details a quantitative analysis of a wind turbine blade performed by a UAV under an algorithmically controlled flight path. Section 2 details the hardware and control approach, Section 3 the photogrammetry reconstruction and its evaluation against a ground truth model, Section 4 considerations for future work, and Section 5 the conclusions.

2. Hardware, Sample and Control Strategy

2.1 Hardware

The UAV utilised was an AscTec Firefly (22), shown in Figure 1, with a payload capacity of 600 g and a twenty-minute battery life. It had six rotors and could generate 36 N of thrust in total. The UAV carried an on-board Core 2 Duo computer (with a pre-installed Linux operating system) to run the applications for serial devices such as the camera. The camera installed on the UAV was a Point Grey machine vision camera, CM3-U3-50S5C-CS (23), with an 8 mm, F2.4, 57.8° field of view (FOV) lens, capturing 4 megapixel (MP) raw images at 2 Hz. To reduce motion blur, the shutter time was changed from 60 ms (the factory setting) to 30 ms. In addition, six external 135 W, 5500 K colour temperature lights were set up at locations around the blade to increase the ambient light intensity.

Figure 1. AscTec UAV platform

2.2 Test Object - Wind Turbine Blade

The wind turbine blade tested was part of a Gaia Wind blade of 6 m length. To allow installation in the laboratory, a 3.1 m section of the blade preserving the aerodynamic section was used and mounted vertically on a fixture (Figure 2). The sample dimensions were 3100 mm in length, 386 mm wide at the top and 619 mm wide at the bottom.

Figure 2. Gaia Wind turbine blade in its fixture in the laboratory

Figure 3. Mounted wind turbine blade (left); reconstructed 3D reference model (right)

To ascertain the accuracy of the subsequent visual reconstruction of the blade, a 3D reference model was measured using a GOM ATOS Triple Scan system. High resolution cameras were positioned 0.5 m from the blade surface and manoeuvred around the blade using a heavy-duty tripod until a full scan had been completed. The GOM Inspect software allows shape and dimension analysis, 3D inspection and mesh processing for 3D point clouds and CAD data sets. Two cameras were used for each data capture for the 3D reconstruction (Figure 3). A high accuracy reconstruction was achieved (estimated accuracy σ = 20 µm) and this 3D model was used as the baseline reference against which all subsequent models were assessed.

2.3 Path planning and control

To ensure that the UAV-mounted camera (with its limited field of view) covered the majority of the blade area, the UAV followed a pre-programmed scan trajectory. Both linear and nonlinear UAV controllers have been implemented in the literature. State-of-the-art nonlinear controllers have demonstrated stability and robustness for fast-moving vehicles with low uncertainties (24-26). However, for low speed control, Proportional-Integral-Derivative (PID) control is usually satisfactory (27, 28) and was the control method selected for implementation in this work.

To achieve autonomous UAV control, an inspection path was generated to provide the reference guidance trajectory fed to the UAV controller for appropriate control actions. The path chosen was a smooth circular trajectory maintaining the UAV at a controlled distance from the centre of the object. Since the UAV stabilisation time for an ascending path was greater than for a descending path, the latter was chosen for the inspection. The UAV therefore started its circular manoeuvring around the blade at a height of 3.1 m and finished at a height of 0.7 m. The flight path contained nine circular orbits at different altitudes, each with a controlled distance to the blade centre. Typically, each circular orbit took 40 seconds to complete. Because of the geometry of the blade, the radius of the top circle was 1100 mm and the radius of subsequent circles was increased progressively by 50 mm.

Considering the performance of the controller, the circular trajectory was digitised into a set of waypoints which were followed sequentially. Accounting for the UAV speed, camera overlap and the pose of the blade, the step between waypoints was 150 mm in translation and 45 degrees in yaw angle, giving eight points per circle.
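As an illustration of the trajectory described above, the following sketch digitises the descending set of circular orbits into waypoints. The nine nominal radii and altitudes are taken from the text and Table 2; the coordinate origin at the blade centre and the (x, y, z, yaw) tuple layout are assumptions for illustration only.

```python
import math

def generate_waypoints(radii_mm, heights_mm, points_per_circle=8):
    """Digitise descending circular orbits around the blade centre into
    (x, y, z, yaw) waypoints. Yaw points the camera at the blade axis."""
    waypoints = []
    for radius, height in zip(radii_mm, heights_mm):
        for i in range(points_per_circle):
            angle = 2.0 * math.pi * i / points_per_circle   # 45 degree steps
            x = radius * math.cos(angle)
            y = radius * math.sin(angle)
            yaw = angle + math.pi          # face inwards towards the blade
            waypoints.append((x, y, height, yaw))
    return waypoints

# Nominal orbit geometry (mm) from Table 2: top orbit at 3.1 m, last at 0.7 m.
radii   = [1100, 1150, 1200, 1250, 1250, 1250, 1250, 1250, 1250]
heights = [3100, 2800, 2500, 2200, 1900, 1600, 1300, 1000, 700]
path = generate_waypoints(radii, heights)
print(len(path), "waypoints")   # 9 orbits x 8 points = 72 waypoints
```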

Figure 4. High-level architecture of the UAV system

The control system (Figure 4) contained three layers: VICON, laptop and UAV. The VICON MX motion capture system (29) was an optical 6 degree-of-freedom (DOF) pose tracking system, acting as an indoor navigation replacement for IMU (Inertial Measurement Unit) and GPS (Global Positioning System) measurements. It measured the UAV position and attitude and the blade centre position, and transferred these to a laptop over TCP/IP at a rate of 100 Hz. The system utilised twelve cameras and had a coverage volume of approximately 8 x 7 x 7 m. An asymmetric combination of seven retro-reflectors on the top of the UAV was defined as a unique object in the VICON system. The centre position of the blade was determined by retro-reflectors positioned at the base of the blade fixture. The (x, y) positions from VICON were converted from the global frame to the UAV coordinate frame on the laptop. The converted (x, y) position, altitude and yaw information were passed to the controllers on the laptop, which calculated control commands for the UAV on-board controller. Communication between the laptop and the UAV was established by XBee modules, as per the manufacturer defaults. The on-board controller then translated the incoming commands into rotor speeds to accomplish the required UAV actions. The average control rate was around 20 Hz, limited by the wireless communication throughput and the UAV on-board processing speed.
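The global-to-body frame conversion mentioned above is, in essence, a planar rotation by the measured yaw angle. A minimal sketch is given below; the function name and sign convention are assumptions for illustration, not the authors' implementation.

```python
import math

def global_to_body(x_err_g, y_err_g, yaw):
    """Rotate a position error expressed in the global (VICON) frame into
    the UAV body frame, given the UAV yaw angle in radians."""
    x_err_b =  math.cos(yaw) * x_err_g + math.sin(yaw) * y_err_g
    y_err_b = -math.sin(yaw) * x_err_g + math.cos(yaw) * y_err_g
    return x_err_b, y_err_b

# Example: a 0.1 m error along global x, with the UAV yawed 90 degrees,
# appears as a purely lateral (body-y) error.
print(global_to_body(0.1, 0.0, math.pi / 2))   # ~(0.0, -0.1)
```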

For a typical UAV, the combination of torques generated by the rotors produces the attitude angles and vertical lift. Pitch and roll angles are coupled with the forward and lateral accelerations, while the lift from the propellers produces the acceleration in the vertical direction. When the UAV is hovering in a fixed pose, the accelerations and velocities are approximately zero. When the UAV is assigned a new pose, the appropriate accelerations and velocities are calculated and executed. The control system, shown in Figure 5, was implemented with PID (Proportional-Integral-Derivative) closed-loop controllers.

Figure 5. UAV position and attitude control system

Each PID controller used feedback to continuously calculate the error between the measured value and the setpoint. The output of the controller was the sum of terms proportional to the error (P term), its integral (I term) and its rate of change (D term). The system response was shaped by these three terms, which could be tuned by altering the gains kp, ki and kd. The gains of each PID controller used in this project are listed in Table 1.

                          kp     ki      kd
Altitude PID              7      0       1.1
x position PID            1.5    0       1.2
y position PID            1.5    0       1
Yaw angle PID             35     0.001   0
x velocity PID            1      0.005   0.1
y velocity PID            1      0.002   0.1
Yaw rotation rate PID     1      0       0

Table 1. Parameters of the PID terms in the controllers.
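For illustration, a minimal discrete PID controller of the form described above is sketched here, using the altitude gains from Table 1. The class structure and sample-time handling are assumptions for illustration, not the authors' flight code.

```python
class PID:
    """Minimal discrete PID: output = kp*e + ki*integral(e) + kd*de/dt."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Altitude controller with the Table 1 gains, running at the ~20 Hz control rate.
altitude_pid = PID(kp=7.0, ki=0.0, kd=1.1, dt=0.05)
climb_cmd = altitude_pid.update(setpoint=3.1, measurement=2.9)   # demand in metres
```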

The control system comprised two subsystems: x, y and ψ (yaw) were handled by three cascaded multi-loop controllers (System 1), and the vertical position by a parallel single PID controller (System 2). System 1 contained three independent controllers, one for each of the x, y and ψ (yaw) variables, implemented in a multi-loop structure: the PID controller in the first loop calculated the desired velocity from the error between the measured and target poses and passed it to the second-layer controller. The velocity error was the input to the second loop, whose output was the attitude command for the UAV on-board controller. A yaw-rate controller was already provided by the AscTec on-board firmware, so the second loop of the yaw controller included only a proportional term and acted as a rate limiter. System 2 was a single closed-loop PID controller, because a vertical velocity controller was already included in the UAV's on-board controller. The altitude error was the input to System 2, which generated vertical velocity updates for the on-board controller.
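The cascaded structure of System 1 can be sketched as an outer position loop feeding an inner velocity loop, reusing the PID class sketched above with the gains from Table 1. The loop wiring and the use of the inner-loop output directly as an attitude command are assumptions for illustration.

```python
# Cascaded x-axis controller (System 1): position loop -> velocity loop -> attitude command.
# Reuses the PID class defined above; gains are taken from Table 1.
x_position_pid = PID(kp=1.5, ki=0.0,   kd=1.2, dt=0.05)
x_velocity_pid = PID(kp=1.0, ki=0.005, kd=0.1, dt=0.05)

def x_axis_command(x_target, x_measured, vx_measured):
    """Outer loop turns position error into a velocity setpoint; the inner loop
    turns velocity error into a pitch command for the on-board controller."""
    vx_setpoint = x_position_pid.update(x_target, x_measured)
    pitch_cmd = x_velocity_pid.update(vx_setpoint, vx_measured)
    return pitch_cmd
```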

2.4 Performance of control system

Trajectory errors were used to estimate the stability and performance of the UAV. The main part of the UAV flight path comprised nine circular orbits centred on the blade, each at a different altitude and with a different radius. Table 2 shows the mean errors and standard deviations of these measurements.

Nominal       Nominal       Mean Radius   Mean Height   Std. Dev.     Std. Dev.
Radius (mm)   Height (mm)   Error (mm)    Error (mm)    Radius (mm)   Height (mm)
1100          3100          5.02          4.51          11.73         40.05
1150          2800          17.83         7.97          8.80          29.40
1200          2500          17.07         5.99          10.00         37.21
1250          2200          17.37         3.46          12.44         38.30
1250          1900          13.46         8.81          10.09         41.89
1250          1600          10.14         3.97          14.11         38.82
1250          1300          17.97         19.65         10.91         42.87
1250          1000          13.15         14.26         13.02         42.24
1250          700           12.94         24.94         12.88         37.97

Table 2. Mean error and standard deviation of radius and height at different altitudes

The results show that the mean height error increased as the altitude decreased; similarly, the mean radius error increased slightly as the altitude descended.
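Error statistics of the kind reported in Table 2 can be recovered from the logged VICON poses for each orbit, for example as sketched below. The log format (lists of x, y, z samples per orbit, in mm, relative to the blade centre) is an assumption for illustration.

```python
import math
import statistics

def orbit_errors(xs, ys, zs, nominal_radius, nominal_height):
    """Mean error and standard deviation of radius and height for one orbit.
    xs, ys, zs are logged UAV positions (mm) relative to the blade centre."""
    radii = [math.hypot(x, y) for x, y in zip(xs, ys)]
    mean_radius_error = statistics.mean(r - nominal_radius for r in radii)
    mean_height_error = statistics.mean(z - nominal_height for z in zs)
    return (mean_radius_error, mean_height_error,
            statistics.stdev(radii), statistics.stdev(zs))
```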

3. Visual Inspection and Photogrammetry Reconstruction

3.1 Photogrammetry approach

The photogrammetry reconstruction was performed using Agisoft PhotoScan (30), a commercially available stand-alone software product which performs photogrammetric processing of digital images and generates 3D spatial data. Around 600 images were captured during the UAV inspection of the blade, including during take-off and landing. The raw images were saved on the UAV on-board computer and then copied to the laptop for further processing. Although surface dust and cracks on the blade were already visible, the blade surface was prepared prior to scanning to better analyse the performance of the 3D reconstruction and to adjust the camera focus: ten 6.5 mm dots and a 20 mm textured yellow tape were applied to the surface, and the camera lens was manually adjusted for optimum focus. The software assumes that the images are captured from a series of cameras at various positions, so the background environment provides reference points for the reconstruction. First, the software aligns the photos to estimate the image positions, followed by optimisation of the camera positions. The point cloud, mesh and texture are then built to create a more detailed 3D object. The final textured 3D model produced by the software is shown in Figure 6.
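PhotoScan exposes a Python scripting interface, and a processing chain of the kind described above can be driven roughly as sketched below, run from PhotoScan's built-in Python console. This is a hedged sketch only: exact method arguments vary between PhotoScan versions (some require an explicit UV-mapping step before texturing), and the folder paths are placeholders.

```python
# Sketch of the PhotoScan processing chain described above
# (Agisoft PhotoScan Python API; defaults and arguments may differ by version).
import os
import PhotoScan

doc = PhotoScan.app.document
chunk = doc.addChunk()

image_dir = "/path/to/uav_images"                      # placeholder path
photos = [os.path.join(image_dir, f) for f in os.listdir(image_dir)]
chunk.addPhotos(photos)

chunk.matchPhotos()       # detect and match features across images
chunk.alignCameras()      # estimate camera poses (sparse point cloud)
chunk.optimizeCameras()   # refine camera calibration and poses
chunk.buildDenseCloud()   # dense point cloud
chunk.buildModel()        # triangulated mesh
chunk.buildTexture()      # textured 3D model

chunk.exportModel("/path/to/blade_mesh.obj")           # mesh for GOM Inspect comparison
```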

To ascertain the accuracy of the visual reconstruction of the blade, the model was compared to the baseline reference model measured by the GOM ATOS Triple Scan. The mesh data from the Agisoft software was imported into the GOM Inspect software for comparison with the 3D reference model. Because photogrammetry is a dimensionless technique and the scale factor is unknown during the reconstruction, the output from the software does not contain the actual size of the model. Therefore, prior to comparing the models, a scaling factor for the mesh and an allocation of coordinates were required. These were achieved by identifying two distinctive points in the mesh and were fine-tuned by minimising the surface difference. The GOM Inspect software returned a pre-alignment error factor and a deviation map for the model comparison.
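The scale recovery described above amounts to dividing a known physical separation between two identifiable points by the separation of the same points in the dimensionless mesh, then scaling every vertex. A minimal sketch follows; the example coordinates are hypothetical.

```python
import math

def scale_factor(p_mesh_a, p_mesh_b, known_distance_mm):
    """Scale factor mapping the dimensionless mesh onto millimetres, from two
    distinctive points whose physical separation is known."""
    mesh_distance = math.dist(p_mesh_a, p_mesh_b)
    return known_distance_mm / mesh_distance

def apply_scale(vertices, s):
    """Scale every mesh vertex (x, y, z) by the recovered factor."""
    return [(s * x, s * y, s * z) for x, y, z in vertices]

# Hypothetical example: two surface markers 3100 mm apart along the blade length.
s = scale_factor((0.0, 0.0, 0.0), (0.0, 0.0, 2.04), known_distance_mm=3100.0)
```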

Figure 6. Agisoft reconstructed model (a), showing a defect on the blade (b)

3.2 Experimental comparison

The reconstructed mesh, built from the AscTec UAV images by the Agisoft software, is shown in Figure 6, with Figure 6(b) showing a surface defect clearly visible in the mesh reconstruction. Figure 7 shows the comparison between textures from the original images and the reconstructed mesh. The images indicate that the dots and texture applied to the blade surface were clearly identifiable in the reconstructed model.

Figure 7. Comparison between the original photographs and the reconstructed model: (a)(c) reference camera captures, (b)(d) reconstructed mesh

The mesh was imported into the GOM Inspect software for quantitative comparison with the reference model of the blade. To understand the influence of light intensity, the experiment was undertaken in three lighting conditions: 30 ms shutter with the external light supply, 30 ms shutter without external light, and 60 ms shutter without external light. These results are summarised in Table 3, showing, as expected, that a faster shutter speed reduced motion blur and improved model accuracy, and that increased ambient lighting flux also improved model accuracy.

                          30 ms shutter   30 ms shutter    60 ms shutter
                          with light      without light    without light
Standard Deviation (mm)   1.56            2.46             1.97
Mean Error (mm)           0.3853          0.6493           0.4571

Table 3. Standard deviation and mean error of the reconstructed model in different light conditions
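Deviation statistics of the kind reported in Table 3 can be approximated by nearest-neighbour distances between the scaled reconstruction and the reference scan, as sketched below. This is a simplification for illustration (GOM Inspect reports signed surface-normal deviations rather than simple point distances), and the array names are placeholders.

```python
import numpy as np
from scipy.spatial import cKDTree

def deviation_stats(recon_vertices, reference_points):
    """Approximate deviation of a reconstructed mesh from a reference scan by
    nearest-neighbour point distances (both inputs are N x 3 arrays in mm)."""
    tree = cKDTree(reference_points)            # reference point cloud
    distances, _ = tree.query(recon_vertices)   # closest reference point per vertex
    return float(np.mean(distances)), float(np.std(distances))

# Example usage: mean_error_mm, std_mm = deviation_stats(mesh_vertices, gom_points)
```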

Figure 8 shows the deviation maps from the GOM software comparison report, showing that the 30 ms shutter with external light gave the best reconstruction result (with sub-millimetric agreement over most of the blade surface) and the 30 ms shutter without light gave the worst result (with errors of several millimetres across the blade).

Figure 8. Deviation maps of the reconstructed model captured at different light intensities: (a) 30 ms shutter with light, (b) 30 ms shutter without light, (c) 60 ms shutter without light

The experimental results illustrate that light intensity and motion blur affected the reconstruction performance. The images captured under low light intensity were not bright enough to allow the software to distinguish the blade, while motion blur corrupted the image information and degraded the reconstruction. Decreasing the shutter time reduces the blur but sacrifices brightness at the same time.
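The shutter/blur trade-off can be put into rough numbers. The sketch below estimates motion blur at the blade surface from the orbit speed implied by the trajectory (roughly 40 s per orbit of about 1.2 m radius) and expresses it in pixels via the ground sample distance; the camera-to-surface stand-off and sensor pixel pitch used here are assumptions for illustration, not measured values.

```python
import math

def motion_blur_pixels(speed_m_s, shutter_s, standoff_m, focal_m, pixel_pitch_m):
    """Approximate motion blur (in pixels) for a camera translating past a surface."""
    gsd = standoff_m * pixel_pitch_m / focal_m        # ground sample distance (m/pixel)
    blur_m = speed_m_s * shutter_s                    # blur at the object surface (m)
    return blur_m / gsd

# Orbit speed implied by the text: ~40 s per orbit of ~1.2 m radius.
speed = 2.0 * math.pi * 1.2 / 40.0                    # ~0.19 m/s

# Assumed for illustration: ~0.9 m stand-off, 8 mm lens (Section 2.1), ~3.45 um pixel pitch.
print(motion_blur_pixels(speed, 0.030, 0.9, 8e-3, 3.45e-6))   # ~15 px at 30 ms
print(motion_blur_pixels(speed, 0.060, 0.9, 8e-3, 3.45e-6))   # ~29 px at 60 ms
```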

4. Discussion and Future Work

The deviation maps of Figure 8 show that using a fast shutter speed with sufficient ambient illumination allowed sub-millimetric measurement accuracy to be achieved over almost all of the blade area from the UAV inspection. Clearly the trade-offs between shutter speed and illumination warrant further investigation. Additionally, the reconstruction models in this paper were built from the raw images; post-processing the images, for example by adjusting brightness and contrast, might provide a better reconstruction result.

The UAV flight path and flight stability directly affect the model quality. Because a circular trajectory was flown around a non-circular geometry, the distance between the camera and the blade surface was not constant and the camera focus was not maintained accurately on the blade surface. Improved trajectory planning or a distance measurement sensor may improve the reconstruction result.

Future work will investigate the effects of environmental influences on the UAV platform and reconstruction accuracy. Working with our partners at the University of Sheffield, the new Laboratory for Verification and Validation (LVV) (31) will be available, offering controlled wind speeds of up to 50 mph together with variable temperature, humidity and rainfall.

5. Conclusions

The authors have presented an implementation of remote photogrammetric wind turbine blade testing deployed through a UAV. A 3.1 m long section of a Gaia Wind turbine blade was mounted vertically, and a series of circular inspection paths was programmed around the blade to provide full 360 degree coverage. A PID control algorithm was implemented to control the UAV flight, and position was monitored indoors in the laboratory using a VICON motion tracking system.

Images from an on-board 5 MP camera were transferred wirelessly and processed on a laptop using Agisoft photogrammetry software to produce a 3D mesh of the blade. Surface structure and defects on the blade were clearly identifiable in the mesh. Comparison with an accurate 3D model measured using a state-of-the-art GOM ATOS Triple Scan showed that the photogrammetry reconstruction mesh achieved sub-millimetre error over nearly the full area of the blade. This error increased undesirably with increasing motion blur (through longer shutter times) or with reduced ambient illumination.

Acknowledgements

The authors acknowledge the support of EDF Energy and Gaia Wind in this project. Funding support was provided through EPSRC grants EP/N018427/1, Autonomous Inspection in Manufacturing & Remanufacturing (AIMaReM), and EP/R51178X/1, Impact Acceleration Account - University of Strathclyde.

References

1. Kestrel-Cam, Aerial building inspection and survey. http://kestrel-cam.co.uk/aerial-drone-photography/ (accessed 18th April 2018)
2. C Eschmann, CM Kuo, CH Kuo and C Boller, "Unmanned Aircraft Systems for Remote Building Inspection and Monitoring", 6th European Workshop on Structural Health Monitoring.
3. N Hallermann, G Morgenthal and V Rodehorst, "Unmanned Aerial Systems (UAS) - Case Studies of Vision Based Monitoring of Ageing Structures", International Symposium Non-Destructive Testing in Civil Engineering (NDT-CE), September 15-17, 2015, Berlin, Germany.
4. J Zhang et al, "High Speed Automatic Power Line Detection and Tracking for a UAV-Based Inspection", 2012 International Conference on Industrial Control and Electronics Engineering, 2012.
5. JV Palomba, "Unmanned Aerial Vehicle Inspections and Environmental Benefits", 15th Asia Pacific Conference for Non-Destructive Testing (APCNDT 2017), Singapore.
6. CA Marinho, C de Souza, T Motomura and A Gonçalves da Silva, "In-service flares inspection by unmanned aerial vehicles (UAVs)", 18th World Conference on Nondestructive Testing, 16-20 April 2012, Durban, South Africa.
7. M Stokkeland, K Klausen and TA Johansen, "Autonomous visual navigation of unmanned aerial vehicle for wind turbine inspection", International Conference on Unmanned Aircraft Systems (ICUAS), Denver, Colorado, USA, June 9-12, 2015.
8. L Wang and Z Zhang, "Automatic Detection of Wind Turbine Blade Surface Cracks Based on UAV-taken Images", IEEE Transactions on Industrial Electronics, 2017. doi: 10.1109/tie.2017.2682037
9. http://www.thecyberhawk.com/cyberhawk-launches-commercial-roav-wind-turbine-inspection-service/ (accessed 18th April 2018)
10. http://www.droneflight.co.uk/wind-turbine-inspection/ (accessed 18th April 2018)
11. http://www.skeyebv.com/projects/drone-wind-turbine-inspection/ (accessed 18th April 2018)
12. http://visualworking.com/uav-wind-turbine-inspection/ (accessed 18th April 2018)
13. https://www.skyspecs.com/ (accessed 18th April 2018)
14. S Jung et al, "Mechanism and system design of MAV (Micro Aerial Vehicle)-type wall-climbing robot for inspection of wind blades and non-flat surfaces", 15th International Conference on Control, Automation and Systems (ICCAS), 2015. doi: 10.1109/iccas.2015.7364634
15. SG Pierce, G Dobie, R Summan, L Mackenzie, J Hensman, K Worden and G Hayward, "Positioning Challenges in Reconfigurable Semi-autonomous Robotic NDE Inspection", SPIE Smart Structures and Materials + Nondestructive Evaluation and Health Monitoring 2010, San Diego, CA, 7-11 March 2010.
16. SG Pierce, CN MacLeod, G Dobie and R Summan, "Path Planning & Measurement Registration for Robotic Structural Asset Monitoring", 7th European Workshop on Structural Health Monitoring, July 8-11, 2014, La Cité, Nantes, France.
17. M Nabil and F Saleh, "3D reconstruction from images for museum artefacts: A comparative study", 2014 International Conference on Virtual Systems & Multimedia (VSMM), 2014. doi: 10.1109/vsmm.2014.7136681
18. S Grzonka, G Grisetti and W Burgard, "A Fully Autonomous Indoor Quadrotor", IEEE Transactions on Robotics 28(1), pp 90-100, 2012. doi: 10.1109/tro.2011.2162999
19. RA Clark et al, "Autonomous and scalable control for remote inspection with multiple aerial vehicles", Robotics and Autonomous Systems 87, pp 258-268, 2017. doi: 10.1016/j.robot.2016.10.012
20. H Navarro et al, "Fuzzy Integral Imaging Camera Calibration for Real Scale 3D Reconstructions", Journal of Display Technology 10(7), pp 601-608, 2014. doi: 10.1109/jdt.2014.2312236
21. A Saxena, M Sun and AY Ng, "Make3D: Learning 3D Scene Structure from a Single Still Image", IEEE Transactions on Pattern Analysis and Machine Intelligence 31(5), pp 824-840, 2009. doi: 10.1109/tpami.2008.132
22. AscTec, AscTec Firefly. http://wiki.asctec.de/display/AR/AscTec+Firefly (accessed 18th April 2018)
23. FLIR, Chameleon3 5.0 MP Color USB3 Vision (Sony IMX264). https://www.ptgrey.com/chameleon3-usb3-vision-cameras
24. Y Tang and RJ Patton, "Phase modulation of robust variable structure control for nonlinear aircraft", Proceedings of 2012 UKACC International Conference on Control, 2012. doi: 10.1109/control.2012.6334625
25. F Muñoz et al, "Second order sliding mode controllers for altitude control of a quadrotor UAS: Real-time implementation in outdoor environments", Neurocomputing 233, pp 61-71, 2017. doi: 10.1016/j.neucom.2016.08.111
26. Z Jia et al, "Integral backstepping sliding mode control for quadrotor helicopter under external uncertain disturbances", Aerospace Science and Technology 68, pp 299-307, 2017. doi: 10.1016/j.ast.2017.05.022
27. S Bouabdallah, P Murrieri and R Siegwart, "Design and control of an indoor micro quadrotor", IEEE International Conference on Robotics and Automation (ICRA 2004), 2004. doi: 10.1109/robot.2004.1302409
28. E Altug, JP Ostrowski and CJ Taylor, "Quadrotor control using dual camera visual feedback", 2003 IEEE International Conference on Robotics and Automation, 2003. doi: 10.1109/robot.2003.1242264
29. VICON, VICON MX System. https://www.vicon.com/ (accessed 18th April 2018)
30. Agisoft, Agisoft PhotoScan. http://www.agisoft.com/ (accessed 18th April 2018)
31. The Structural Dynamics Laboratory for Verification and Validation (LVV). https://www.lvv.ac.uk/ (accessed 18th April 2018)