

© Institution of Engineers Australia, 2011

* Reviewed and revised version of a paper originally presented at the 2009 Society for Engineering in Agriculture (SEAg) National Conference, Brisbane, Queensland, 13-16 September 2009.

† Corresponding author Dr Troy Jensen can be contacted at [email protected].

The use of an unmanned aerial vehicle as

a remote sensing platform in agriculture*

TA Jensen†

National Centre for Engineering in Agriculture, Faculty of Engineering and Surveying, University of Southern Queensland, Toowoomba, Queensland

LC Zeller

Department of Employment, Economic Development and Innovation, Toowoomba, Queensland

AA Apan

Australian Centre for Sustainable Catchments, Faculty of Engineering and Surveying, University of Southern Queensland, Toowoomba, Queensland

ABSTRACT: One of the limitations of using hobbyist remotely controlled aircraft with an attached digital camera is that a great number of images look alike; unless many natural features or artificial targets are present at the location, it is hard to identify and orientate the images. This paper investigates the use of an unmanned aerial vehicle (UAV) in agricultural applications. Trials were conducted, in collaboration with researchers from the Australian Research Centre for Aerospace Automation and Queensland University of Technology, on the ability of a UAV autopilot to accurately trigger a two-camera sensor at a desired location. The study area was located at Watts Bridge Memorial Airfield, near Toogoolawah (152.460°, –27.098°) in southeast Queensland, Australia. The airfield has dedicated areas for the use of remotely controlled aircraft, and the mission was undertaken on 5 March 2008. The target and waypoints were arranged so that the UAV flew in an anticlockwise flight pattern. Three separate missions were flown, with images acquired over the target on each of the nine passes. Although capturing the target in the image was achieved on every flight, the accuracy of capturing the target in the middle of the image was variable. The offset from the centre of the image to the target (zero in the perfect system) ranged from just under 15% to just over 60% of the image extent. The misalignment was due to a combination of the predetermined offset value, cross-wind, global positioning system (GPS)/autopilot error, the UAV not being level when the image was acquired, and/or inaccuracies in positioning the sensors in the hinged pod. The capacity to accurately acquire images over pre-determined points is essential to ensure coverage and to expedite mosaicing of the images. It will also expand the application of these technologies into broader-scale applications, such as imaging in broadacre cereal cropping or imaging along transects.

1 INTRODUCTION

The use of unmanned aerial vehicles (UAVs) as remote sensing tools is not new, with a number of recent studies detailing various applications: off-the-shelf componentry was used to construct a UAV for rangeland photography (Hardin & Jackson, 2005); a camera-equipped UAV was used for wilderness search and rescue (Goodrich et al, 2008); a very expensive and sophisticated UAV (the NASA-developed Pathfinder) was used for coffee ripeness monitoring (Johnson et al, 2004); and UAVs were used to monitor wheat in small plots, with colour and near-infrared (NIR) images acquired to develop relationships between vegetation indices and field-measured parameters (Lelong et al, 2008).

The above remote sensing methods used either real-time monitoring of the acquired imagery, or required a large number of images to be taken with the most appropriate being selected for future analysis. No previous studies have investigated the capacity of a UAV to acquire an image only when at a particular location (set of coordinates). The ability of a UAV to navigate to a particular set of coordinates was, however, detailed in a paper investigating spore collection from the air (Schmale III et al, 2008). These investigators utilised the global positioning system (GPS) track log to determine the flight path of the UAV.

Australian Journal of Multi-disciplinary Engineering, Vol 8 No 2

The purpose of this investigation was to evaluate a fully autonomous image acquisition system. To achieve this objective, the ability of the autopilot to trigger a remote sensing camera system was tested, and the three-dimensional accuracy of the autopilot (x, y, z) was also evaluated. The procedures to perform this testing and evaluation are detailed in this paper.

2 STUDY AREA

The study area was located at Watts Bridge Memorial Airfield (152.460°, –27.098°), near Toogoolawah in southeast Queensland, Australia (figure 1). The mission was undertaken between 1300 and 1600 hours on 5 March 2008. A slight breeze became evident only in the last hour of testing.

3 PLATFORM

To undertake this evaluation, a specially modified version of a remotely-controlled "Phoenix Boomerang" 60-size fixed-wing trainer aircraft, fitted with an autonomous flight control system, was utilised (Model Sports, n.d.). The platform consisted of two 60-size Boomerangs merged together to give a wingspan of just under 3.0 m. The platform (figure 2) was powered by an "OS Engines" 91FX (16 cc) methanol-glow motor (O.S. Engine, 2011). The baseline avionics included the "MicroPilot MP2028g" autopilot (MicroPilot, 2011) and a "Microhard Systems Inc. Spectra 910A" 900 MHz spread-spectrum modem (Microhard Systems Inc., 2011) for communications with the ground control station. The speed range of the platform was 45-120 km/h, with a cruise speed of 70 km/h. The platform carried a 3 kg payload and had a flight time of 25 minutes.

Figure 1: Location of the Watts Bridge Memorial Airfield.
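From the stated specifications alone, a rough coverage estimate can be made. The sketch below (a back-of-envelope helper, not part of the authors' system) computes the track length flyable in one sortie from the quoted cruise speed and flight time:

```python
def track_length_km(cruise_kmh=70.0, endurance_min=25.0):
    """Distance flyable in one sortie at the quoted cruise speed (70 km/h)
    and flight time (25 minutes): roughly 29 km of track."""
    return cruise_kmh * endurance_min / 60.0
```

At the platform's 45 km/h minimum and 120 km/h maximum speeds, the same 25-minute sortie would cover roughly 19 km and 50 km of track respectively.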

4 THE REMOTE SENSING SYSTEM

The remote sensing system used to acquire images was based on the system developed and detailed in Jensen et al (2007). This investigation utilised two 5.0 megapixel Kodak Easyshare CX7525 digital zoom cameras (Eastman Kodak Company, Rochester NY; Kodak, n.d.). As described in the previous work, the two-camera system (one camera capturing the colour portion of the spectrum and the other, with its NIR cut-out filter removed, the near-infrared portion) was remotely triggered.
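The usual motivation for pairing a colour camera with an NIR-sensitive one is to compute vegetation indices, such as those related to field-measured parameters in the wheat study cited in the introduction. A minimal sketch of the standard NDVI calculation (illustrative only, not part of the authors' processing chain):

```python
def ndvi(nir, red):
    """Normalised difference vegetation index from co-registered NIR and
    red reflectance values; ranges from -1 to 1, higher over green
    vegetation."""
    return (nir - red) / (nir + red)
```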

The system was housed in a streamlined pod attached to the underside of the fuselage, directly beneath the wing. The pod was hinged for easy access to the cameras and download of their images (figure 3). As the sensor had previously been triggered using a spare output channel of the radio-control equipment, it was easily adapted to suit the autopilot system. When the UAV was within a predetermined distance of the designated location (set prior to take-off at 20 m, to allow for cross-winds, GPS error and camera misalignment), the autopilot set a spare servo channel to the maximum output for 600 ms. A microcontroller (PICAXE-08; Picaxe, n.d.) was programmed to detect this change of state on the designated channel and trigger both cameras. The time delay between the trigger signal and the shot being taken was 0.5 s, with a further 1-2 s required for the image to be stored on the memory card. The microcontroller also gave both cameras a pulse every 10 s to ensure that they did not power down.

Figure 2: The Queensland University of Technology UAV ready for take-off.

Figure 3: The pod opened to remove the secure digital (SD) cards from the sensors.
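The triggering rule can be sketched as follows. This is an illustrative reconstruction in Python, not the MicroPilot firmware, and the helper names are hypothetical; it also shows why firing exactly over the target is hard: at cruise speed the 0.5 s shutter lag alone moves the aircraft nearly 10 m.

```python
import math

# Approximate metres per degree of latitude; adequate at the test site (~27° S).
M_PER_DEG_LAT = 111_320.0

def distance_m(lat1, lon1, lat2, lon2):
    """Equirectangular approximation, adequate over a few hundred metres."""
    dy = (lat2 - lat1) * M_PER_DEG_LAT
    dx = (lon2 - lon1) * M_PER_DEG_LAT * math.cos(math.radians(lat1))
    return math.hypot(dx, dy)

def should_trigger(uav_pos, target, radius_m=20.0):
    """Mimic the autopilot rule: raise the spare servo channel when the UAV
    is within radius_m of the target (20 m in the trials, chosen to allow
    for cross-wind, GPS error and camera misalignment)."""
    return distance_m(uav_pos[0], uav_pos[1], target[0], target[1]) <= radius_m

def shutter_lag_distance_m(ground_speed_kmh=70.0, shutter_lag_s=0.5):
    """Distance covered during the camera's 0.5 s shutter lag; about 9.7 m
    at the 70 km/h cruise speed, motivating an early trigger."""
    return ground_speed_kmh / 3.6 * shutter_lag_s
```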

5 DEPLOYMENT

The UAV was programmed with a flight plan to fly a number of left-hand circuits over a series of pre-determined waypoints (figure 4). One of the dots is brighter, indicating the next waypoint the UAV is heading towards. When passing above the origin point (the target of the image acquisition, and where the UAV was initialised), the autopilot triggered the cameras.

The take-off of the UAV was performed manually, under the visual control of a radio-control pilot. Upon reaching a safe altitude (30 m), the UAV was switched into autonomous mode and the autopilot guided the aircraft along the set track, with flight height targeted at 120 m above ground level (AGL). When the UAV approached the imaging target (the initialisation point), it was instructed to change altitude to 90 m AGL. The change in altitude was performed so that most of the flight was at a higher (hence perceived safer) altitude, and likewise to simulate flying over obstructions and descending to image-acquisition height. Once past the target, the UAV resumed normal flying height. After 15-20 minutes of autonomous flying, the UAV was manually landed and the flight log was downloaded from the autopilot.

The flight log contained 52 columns of information, recorded at 5 Hz. The log detailed the state of the aircraft, including attributes such as attitude, position, speed, heading and servo values. Four flights were undertaken on the day of testing, with images successfully captured on three. The second flight had to be aborted and the UAV landed immediately, as conventional aircraft came into its proximity. The acquired imagery was analysed to provide flight-path accuracies.
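Post-flight analysis of such a log can be sketched as below. The field names are hypothetical: the paper states only that the log held 52 columns recorded at 5 Hz, not the actual MicroPilot schema.

```python
import csv

# Hypothetical subset of the 52 logged columns (attitude, position, speed,
# heading and servo values are mentioned; exact names are assumed here).
NUMERIC_FIELDS = ["time_s", "lat_deg", "lon_deg", "alt_m", "heading_deg"]

def parse_log(lines):
    """Parse CSV flight-log lines into a list of per-sample dicts."""
    return [
        {k: float(v) for k, v in row.items() if k in NUMERIC_FIELDS}
        for row in csv.DictReader(lines)
    ]

def sample_interval_s(samples):
    """Mean spacing between samples; a 5 Hz log should give 0.2 s."""
    t = [s["time_s"] for s in samples]
    return (t[-1] - t[0]) / (len(t) - 1)
```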

6 RESULTS AND DISCUSSION

A flight path of the three successful missions is shown in figure 5. Two circuits were completed on flights one and three, with three circuits made on flight four. Each dot in the circuit represents the latitude and longitude of the path taken by the UAV, updated by the GPS every second and recorded in the flight log. The activity around the target area, and the reduced distance between consecutive dots in this area, indicates that this was the take-off and landing zone. The flight path is superimposed over a SPOT 5 satellite image showing the infrastructure of the Watts Bridge Memorial Airfield and other natural features in close proximity. Also displayed are the waypoints used in determining the flight path, and the location of the target over which the images were captured.

Figure 4: The Horizon Flight Schedule ground control station software showing the path and the flight details of the UAV being monitored in the autopilot flight software.

Two of the images captured on flight four are shown in figure 6. These images were taken on the last circuits made by the UAV on the day. Even though the images were acquired a little over 3 minutes apart, there is good consistency in the coverage and positioning of the target within both images. Ideally, if the autopilot were guiding the UAV perfectly, the target would be in the centre of the image. As can be seen from the images (figure 6), this was not quite the case.

The target and waypoints were arranged so that the UAV should, in theory, fly directly down the centre of the mowed grass runway that runs northeast-southwest in figure 5. This should have resulted in the runway being positioned vertically in the centre of each image acquired; it was not. The misalignment was possibly due to a combination of cross-wind, GPS/autopilot error, the UAV not being level when the image was acquired, and/or inaccuracies in positioning the sensors in the hinged pod. Defining and refining these errors was not within the scope of this proof-of-concept research.

Flight details and inaccuracies in the image acquisition were quantified in table 1. The scale of the images was determined using GPS coordinates of known features in the images. The direction of flight of the UAV was from the top of the image to the bottom. In the image offset columns of table 1, the X distance is the cross-track distance, with a positive value indicating an offset to the left of the centre of the image and a negative value to the right. The offset in the direction of flight (undershoot or overshoot) is given in the Y column, with a positive value indicating that the image was captured before the target reached the image centre, and a negative value after. The absolute offset is the direct distance from the centre of the image to the centre of the target.

Figure 5: The flight paths and target positioning, Watts Bridge on 5 March 2008.

Figure 6: (a) Image 100_4814 taken at 3:27:44 pm. (b) Image 100_4815 taken just over 3 minutes after image 100_4814.
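The offset quantities in table 1 follow directly from the cross-track (X) and along-track (Y) components. A small sketch of this convention (the helper name is hypothetical; the input values below are taken from table 1):

```python
import math

def image_offsets(cross_track_m, along_track_m):
    """Offsets under the sign convention of table 1: X > 0 means the target
    sat left of the image centre, X < 0 right of it; Y > 0 means the image
    was captured early (undershoot), Y < 0 late (overshoot). The absolute
    offset is the straight-line distance from image centre to target."""
    return {
        "X": cross_track_m,
        "Y": along_track_m,
        "absolute": math.hypot(cross_track_m, along_track_m),
    }
```

For example, the first image of flight one (X = 50.4 m, Y = 8.3 m) gives an absolute offset of about 51.1 m, matching the table.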

Capturing the target in the image was achieved on every flight. However, capturing the target in the middle of the image was not as repeatable, with the error ranging from just under 15% of the image width (10.8 m in 73.9 m; the final image of flight four) to just over 60% (85.6 m in 133.4 m; the second image of flight one).
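The quoted percentages are simply the absolute offset divided by the image width; using the table 1 values (helper name hypothetical):

```python
def offset_fraction(absolute_offset_m, image_width_m):
    """Target offset as a fraction of the image width (zero in a perfect
    system)."""
    return absolute_offset_m / image_width_m

best = offset_fraction(10.8, 73.9)    # final image, flight four
worst = offset_fraction(85.6, 133.4)  # second image, flight one
```

These evaluate to about 0.146 and 0.642, matching the "just under 15%" and "just over 60%" figures quoted above.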

The capacity to accurately acquire images over pre-determined points is essential to ensure coverage and to expedite mosaicing of the images. It will also expand the application of these technologies into broader-scale applications, such as imaging in broadacre cereal cropping or imaging along transects (such as river systems).

Differing altitudes were programmed for each flight (table 1). The first flight was undertaken at 250 m above sea level (ASL), with the third at 204 m. The final flight was slightly different: the first image was acquired at the set altitude of 214 m, and the three circuits that followed were flown at this set height; however, the images were acquired at lower altitudes (194 m for images two and three, and 169 m for the final image). These image-acquisition heights were changed in-flight with the intention of observing the response of the UAV to changes in the flight schedule.

An altitude plot of flight four is shown in figure 7, showing the relatively steep climb of the UAV after take-off. Also evident is the loss of altitude, and then correction, due to the banking of the aircraft when manoeuvring to align with the next waypoint. The saw-toothed nature of the plot, due to the banking, indicates that the feedback loops through which the autopilot controls the flight surfaces are not finely tuned enough to optimise performance and ensure stable flight. No attempt was made to refine the triggering accuracy in this preliminary study. However, modifications such as extending the turn area, to ensure the aircraft was in straight and level flight when the camera was over the target and triggered, or a more stable airframe, such as an electric glider style, may have improved on the results obtained.

7 CONCLUSIONS

This study provides proof-of-concept that a low-cost auto-piloted UAV can fly a pre-determined path and acquire images at pre-determined locations. On every attempt, the target was successfully captured in the images. The proximity of the target to the centre of the image varied due to a number of factors, such as wind speed and direction, aircraft attitude, and GPS/autopilot/camera lags. Improving the accuracy of the image acquisition was beyond the scope of this initial evaluation; however, some simple measures, such as ensuring straight and level flight at image acquisition, a more stable platform, careful orientation of the camera and a higher update-rate GPS, would have a beneficial effect on the accuracies obtained.

Table 1: Details of the errors for the images acquired over the target.

Image #     Time      Altitude   Heading     --- Image offset ---      - Image extent -    Area
                      ASL (m)    (degrees)   X (m)   Y (m)   Abs (m)   X (m)    Y (m)      (ha)

Flight 1
100_4801    1:32:06   261        150         50.4    8.3     51.1      140.4    104.0      1.46
100_4803    1:35:26                          83.7    17.8    85.6      133.4    98.8       1.32
100_4805    1:38:32                          59.6    10.2    60.5      137.4    101.8      1.40

Flight 2
100_4806    2:36:10   aborted mission, no data collected

Flight 3
100_4809    2:57:24   216        146         17.2    –1.9    17.3      103.0    76.3       0.79
100_4810    3:00:42   198        132         19.1    0.3     19.1      92.2     68.3       0.63

Flight 4
100_4812    3:21:48   211        113         39.5    6.0     40.0      108.7    80.5       0.88
100_4813    3:24:46                          38.2    26.6    46.5      92.0     68.2       0.63
100_4814    3:27:44   192        114         –19.3   14.1    23.9      100.4    74.4       0.75
100_4815    3:30:50   166        125         –9.6    4.9     10.8      73.9     54.7       0.40
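The area column of table 1 is consistent with the product of the two image extent columns, which serves as a quick check when reconstructing such tables (helper name hypothetical):

```python
def footprint_area_ha(extent_x_m, extent_y_m):
    """Ground area covered by one image, from the extent columns of
    table 1 (1 ha = 10,000 m^2)."""
    return extent_x_m * extent_y_m / 10_000.0
```

For example, footprint_area_ha(140.4, 104.0) gives 1.46 ha, as listed for image 100_4801, and footprint_area_ha(73.9, 54.7) gives 0.40 ha, as listed for image 100_4815.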



This autonomous system has the potential to be a highly suitable platform for "real world" applications, but needs further development to overcome the accuracy issues. As the capacity to perform automatic registering and mosaicing of the acquired images filters down from conventional aerial imagery, this low-cost remote sensing system will have great potential to be utilised in broader agricultural applications.

REFERENCES

Goodrich, M. A., Morse, B. S., Gerhardt, D., Cooper, J. L., Quigley, M., Adams, J. A. & Humphrey, C. 2008, “Supporting wilderness search and rescue using a camera-equipped mini UAV”, Journal of Field Robotics, Vol. 25, No. 1-2, pp. 89-110.

Hardin, P. J. & Jackson, M. W. 2005, “An unmanned aerial vehicle for rangeland photography”, Rangeland Ecology and Management, Vol. 58, No. 4, pp. 439-442.

Jensen, T., Apan, A., Young, F. & Zeller, L. 2007, “Detecting the attributes of a wheat crop using digital imagery acquired from a low-altitude platform”, Computers and Electronics in Agriculture, Vol. 59, No. 1-2, pp. 66-77.

Johnson, L. F., Herwitz, S. R., Lobitz, B. M. & Dunagan, S. E. 2004, "Feasibility of monitoring coffee field ripeness with airborne multispectral imagery",

Applied Engineering in Agriculture, Vol. 20, No. 6, pp. 845-849.

Kodak, n.d., “KODAK Digital Cameras, Printers, Digital Video Cameras & more”, www.kodak.com.

Lelong, C. C. D., Burger, P., Jubelin, G., Roux, B., Labbe, S. & Baret, F. 2008, “Assessment of unmanned aerial vehicles imagery for quantitative monitoring of wheat crop in small plots”, Sensors, Vol. 8, No. 5, pp. 3557-3585.

Microhard Systems Inc., 2011, “Leaders in Wireless Communication”, http://microhardcorp.com.

MicroPilot, 2011, “World Leader in Miniature UAV Autopilots since 1994”, www.micropilot.com.

Model Sports, n.d., “Welcome”, www.modelsports.com.au.

O. S. Engine, 2011, “O.S. Engines Homepage”, www.osengines.com.

Picaxe, n.d., “Home”, www.picaxe.com.

Schmale III, D. G., Dingus, B. R. & Reinholtz, C. 2008, "Development and application of an autonomous unmanned aerial vehicle for precise aerobiological sampling above agricultural fields", Journal of Field Robotics, Vol. 25, No. 3, pp. 133-147.

Figure 7: Altitude details for flight 4 (note the UAV reduced altitude to acquire images).



TROY JENSEN

Dr Troy Jensen is a Senior Research Fellow/Senior Lecturer with the National Centre for Engineering in Agriculture and the Faculty of Engineering and Surveying, University of Southern Queensland (USQ), Toowoomba. He received his PhD degree in Engineering from USQ. Applying engineering technologies to agriculture is something that Troy has been doing since he commenced work in 1987. Since then, he has gained extensive applied research experience in such diverse areas as agricultural machinery, animal and plant biosecurity, precision agriculture, remote sensing, controlled traffic farming, native grass seed harvesting and management, grain storage, horticultural mechanisation, and biomass reuse. His current research focuses on the use of precision agriculture technologies; he is working on a project funded by the Sugar Research and Development Corporation titled "A co-ordinated approach to Precision Agriculture RDE for the Australian Sugar Industry".

LES ZELLER

Les Zeller is a Senior Research Engineer with the Queensland Department of Employment, Economic Development and Innovation, based in Toowoomba. He received his Associate Diploma and Masters degree in Engineering from the University of Southern Queensland, and his Bachelor in Applied Physics from Central Queensland University. He has worked for over 30 years in the development and application of electronic technologies for agricultural research. A turf traction measurement device he developed has been patented by the Queensland State Government.

ARMANDO APAN

Dr Armando A. Apan is an Associate Professor with the Australian Centre for Sustainable Catchments and the Faculty of Engineering and Surveying, University of Southern Queensland, Toowoomba. He received his PhD degree in Geography from Monash University in Melbourne. His current research area focuses on the use of remote sensing and geographic information systems in environmental management, agriculture, forestry and ecology. He was awarded the Queensland Spatial Excellence Award (Education and Professional Development) in 2006 by the Spatial Sciences Institute, Australia. Currently, he is the Associate Dean (Research) of the Faculty of Engineering and Surveying, University of Southern Queensland.


