“The use of an unmanned aerial vehicle as a remote sensing platform ...” – Jensen, Zeller & Apan
UAV to acquire an image only when at a particular location (set of coordinates). The ability of a UAV to navigate to a particular set of coordinates was, however, detailed in a paper investigating spore collection from the air (Schmale III et al, 2008). These investigators utilised the global positioning system (GPS) track log to determine the flight path of the UAV.

The purpose of this investigation was to evaluate a fully autonomous image acquisition system. To achieve this objective, the ability of the autopilot to trigger a remote sensing camera system was tested, and the three-dimensional (x, y, z) accuracy of the autopilot was also evaluated. The procedures to perform this testing and evaluation are detailed in this paper.

2 STUDY AREA

The study area was located at Watts Bridge Memorial Airfield, near Toogoolawah in southeast Queensland (152.460°, –27.098°), Australia (figure 1). The mission was undertaken between 1300-1600 hours on 5 March 2008. A slight breeze became evident only in the last hour of testing.

Figure 1: Location of the Watts Bridge Memorial Airfield.
Australian Journal of Multi-disciplinary Engineering Vol 8 No 2
3 PLATFORM

To undertake this evaluation, a specially modified version of a remotely-controlled “Phoenix Boomerang” 60-size fixed-wing trainer aircraft fitted with an autonomous flight control system was utilised (Model Sports, n.d.). The platform consisted of two 60-size Boomerangs merged together to give a wingspan of just under 3.0 m. The platform (figure 2) was powered by an “OS Engines” 91FX (16 cc) (O. S. Engine, 2011) methanol-glow motor. The baseline avionics on the platform included the “MicroPilot MP2028g” autopilot (MicroPilot, 2011) and a “Microhard Systems Inc. Spectra 910A” 900 MHz spread spectrum modem (Microhard Systems Inc., 2011) for communications with the ground control station. The speed range of the platform was 45-120 km/h, with a cruise speed of 70 km/h. The platform carried a payload of 3 kg and had a flight time of 25 minutes.

4 THE REMOTE SENSING SYSTEM

The remote sensing system used to acquire images was based on the system developed and detailed in Jensen et al (2007). This investigation utilised 5.0 megapixel Kodak Easyshare CX7525 (Kodak, n.d.) digital zoom cameras (Eastman Kodak Company, Rochester NY). As described in the previous work, the two-camera system (one camera to capture the colour and the other the near-infrared (NIR) portion of the spectrum) was remotely triggered, and was sensitive to NIR light (once the NIR cut-out filter had been removed).

The system was housed in a streamlined pod attached to the underside of the fuselage directly beneath the wing. The pod was hinged for easy access and download of the cameras (figure 3). As the sensor had previously been triggered using a spare output channel of the radio control equipment, this was easily adapted to suit the autopilot system.
Figure 2: The Queensland University of Technology UAV ready for take-off.
Figure 3: The pod opened to remove the secure digital (SD) cards from the sensors.
When the UAV was within a predetermined distance of the designated location (set prior to take-off at 20 m to allow for cross-winds, GPS error and camera misalignment), the autopilot set a spare servo channel to the maximum output for 600 ms. A microcontroller (PICAXE-08; Picaxe, n.d.) was programmed to detect this change of state on the designated channel and trigger both cameras. The time delay between the trigger signal and the shot being taken was 0.5 s, with a further 1-2 s required for the image to be stored on the memory card. The microcontroller also gave both cameras a pulse every 10 s to ensure that they did not power down.
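The trigger chain just described (the autopilot drives the spare servo channel to maximum for 600 ms, and the microcontroller fires both shutters on the change of state) can be illustrated with a small simulation. The threshold value and the sample format below are illustrative assumptions, not the actual PICAXE firmware:

```python
THRESHOLD_US = 1800  # pulse widths above this are treated as "channel at maximum" (assumed value)

def process(samples):
    """Return the timestamps at which the cameras would be triggered, given
    (timestamp_s, pulse_width_us) samples of the spare servo channel.
    A trigger fires only on the low-to-high change of state, mirroring the
    microcontroller detecting the autopilot's 600 ms high pulse."""
    triggers = []
    was_high = False
    for t, width in samples:
        is_high = width >= THRESHOLD_US
        if is_high and not was_high:  # change of state: fire both shutters
            triggers.append(t)
        was_high = is_high
    return triggers

# The channel idles near 1500 us and is driven to ~2000 us for 600 ms around
# t = 5 s; the sustained high level produces exactly one trigger.
print(process([(0.0, 1500), (5.0, 2000), (5.3, 2000), (5.6, 1500), (9.0, 1500)]))  # -> [5.0]
```

Triggering on the edge rather than the level is what prevents the 600 ms pulse (many samples long at any realistic polling rate) from firing the shutters repeatedly.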
5 DEPLOYMENT

The UAV was programmed with a flight plan to do a number of left circuits over a series of pre-determined waypoints (figure 4). One of the dots is brighter, indicating the next waypoint that the UAV is heading towards. When passing above the origin point (the target of the image acquisition and where the UAV was initialised), the autopilot triggered the cameras.
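The 20 m proximity test that initiates this trigger can be sketched with a great-circle distance check. The target coordinates below reuse the study-area location from section 2; `should_trigger` is a hypothetical helper for illustration, not the MP2028g's internal logic:

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance in metres between two
    latitude/longitude points given in decimal degrees."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

TARGET = (-27.098, 152.460)   # initialisation point, from section 2
TRIGGER_RADIUS_M = 20.0       # pre-flight setting for wind/GPS/camera error

def should_trigger(lat, lon):
    """True when the UAV is within the trigger radius of the target."""
    return distance_m(lat, lon, *TARGET) <= TRIGGER_RADIUS_M

print(should_trigger(-27.09805, 152.46010))  # a few metres away -> True
print(should_trigger(-27.10000, 152.46000))  # roughly 220 m away -> False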
Figure 4: The Horizon Flight Schedule ground control station software showing the path and the flight details of the UAV being monitored in the autopilot flight software.

The take-off of the UAV was performed manually, under the visual control of a radio-control pilot. Upon reaching a safe altitude (30 m), the UAV was switched into autonomous mode and the autopilot began guiding the aircraft along the set track, with the flight height targeted at 120 m above ground level (AGL). When the UAV approached the imaging target (the initialisation point), it was instructed to change altitude to 90 m AGL. The change in altitude was performed so that most of the flight was at a higher (and hence perceived safer) altitude, and likewise to simulate flying over obstructions and descending to the image acquisition height. Once past the target, the UAV resumed its normal flying height. After 15-20 minutes of autonomous flying, the UAV was manually landed and the flight log was downloaded from the autopilot.

The flight log contained 52 columns of information, recorded at 5 Hz, detailing the state of the aircraft and including attributes such as attitude, position, speed, heading and servo values. Four flights were undertaken on the day of testing, with images successfully captured on three of these. The second flight had to be aborted and the UAV landed immediately, as conventional aircraft came into the proximity of the UAV. The imagery acquired was analysed to provide flight path accuracies.

6 RESULTS AND DISCUSSION

The flight paths of the three successful missions are shown in figure 5. Two circuits were completed on flights one and three, with three circuits made on flight four. Each dot in the circuit represents the latitude and longitude of the path taken by the UAV, updated by the GPS every second and recorded in the flight log.
Figure 5: The flight paths and target positioning, Watts Bridge on 5 March 2008.
Figure 6: (a) Image 100_4814 taken at 3:27:44 pm. (b) Image 100_4815 taken just over 3 minutes after image 100_4814.
The activity around the target area, and the reduced distance between consecutive dots in this area, indicates that this was the take-off and landing zone. The flight path is superimposed over a SPOT 5 satellite image showing the infrastructure of the Watts Bridge Memorial Airfield and other natural features in close proximity. Also displayed are the waypoints used in determining the flight path and the location of the target, over which the images were captured.

Two of the images captured on flight four are shown in figure 6. These images were taken on the last circuits made by the UAV on the day. Even though the images were acquired a little over 3 minutes apart, there is good consistency in the coverage and positioning of the target within both images. Ideally, if the autopilot was guiding the UAV perfectly, the target would be in the centre of the image. As can be seen from the images (figure 6), this was not quite the case.

The target and waypoints were arranged so that the UAV should, in theory, fly directly down the centre of the mowed grass runway that ran northeast-southwest in figure 5. This should have resulted in the runway being positioned vertically in the centre of each image acquired. This was not the case. The misalignment was possibly due to a combination of cross-wind, GPS/autopilot error, the UAV not being level when the image was acquired, and/or inaccuracies in positioning the sensors in the hinged pod. Defining and reducing these errors was not within the scope of this proof-of-concept research.

Flight details and inaccuracies in the image acquisition were quantified and are detailed in table 1. The scale of the images was determined using GPS coordinates of known features on the images.
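That scale determination is a simple ratio: the known ground separation of two features, divided by their separation in pixels, scaled up to the full frame width. The numbers below are invented for illustration (the paper does not give pixel counts), chosen only to show how an extent like that of image 100_4803 in table 1 could arise:

```python
def image_extent_m(ground_dist_m, pixel_dist_px, image_width_px):
    """Ground extent of the full frame, from two features of known
    separation: metres-per-pixel times the image width in pixels."""
    return ground_dist_m / pixel_dist_px * image_width_px

# Hypothetical example: two features 100 m apart spanning 1,930 px of an
# assumed 2,576 px wide frame give roughly a 133 m X extent.
print(f"{image_extent_m(100.0, 1930, 2576):.0f} m")  # prints 133 m
```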
Table 1: Details of the errors for the images acquired over the target.

Image #    Time     Altitude   Heading     Image offset                  Image extent       Area
                    ASL (m)    (degrees)   X (m)   Y (m)   Absolute (m)  X (m)    Y (m)    (ha)

Flight 1
100_4801   1:32:06  261        150          50.4     8.3    51.1         140.4    104.0    1.46
100_4803   1:35:26                          83.7    17.8    85.6         133.4     98.8    1.32
100_4805   1:38:32                          59.6    10.2    60.5         137.4    101.8    1.40

Flight 2
100_4806   2:36:10  aborted mission, no data collected

Flight 3
100_4809   2:57:24  216        146          17.2    –1.9    17.3         103.0     76.3    0.79
100_4810   3:00:42  198        132          19.1     0.3    19.1          92.2     68.3    0.63

Flight 4
100_4812   3:21:48  211        113          39.5     6.0    40.0         108.7     80.5    0.88
100_4813   3:24:46                          38.2    26.6    46.5          92.0     68.2    0.63
100_4814   3:27:44  192        114         –19.3    14.1    23.9         100.4     74.4    0.75
100_4815   3:30:50  166        125          –9.6     4.9    10.8          73.9     54.7    0.40
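As a quick sanity check, the Area column of table 1 follows directly from the image extents (hectares = X extent × Y extent / 10 000 m²). A minimal sketch using two rows from the table:

```python
def footprint_ha(x_extent_m, y_extent_m):
    """Ground footprint of one image in hectares (1 ha = 10,000 m^2)."""
    return x_extent_m * y_extent_m / 10_000

# Largest (100_4801) and smallest (100_4815) footprints from table 1.
for name, x, y, reported in [("100_4801", 140.4, 104.0, 1.46),
                             ("100_4815", 73.9, 54.7, 0.40)]:
    print(f"{name}: {footprint_ha(x, y):.2f} ha (table 1 reports {reported} ha)")
```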
The direction of flight of the UAV was from the top of the image to the bottom. In the image offset column in table 1, the X distance is the cross-track distance, with a positive value indicating an offset to the left and a negative value to the right of the centre of the image. The offset in the direction of flight (undershoot or overshoot) is indicated by the Y column, with a positive value indicating that the image was captured before reaching the target and a negative value indicating that it was captured after. The absolute is the direct distance from the centre of the image to the centre of the target.
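The absolute offset in table 1 is thus the Euclidean distance from the image centre to the target, computed from the X and Y offsets; dividing it by the X extent gives the error as a fraction of the image width. A short illustration using two rows from table 1:

```python
import math

def absolute_offset(dx, dy):
    """Straight-line distance (m) from image centre to target, from the
    cross-track (X) and along-track (Y) offsets of table 1."""
    return math.hypot(dx, dy)

# (image, X offset m, Y offset m, X extent m) -- rows from table 1
rows = [
    ("100_4803", 83.7, 17.8, 133.4),   # largest offset, flight 1
    ("100_4815", -9.6, 4.9, 73.9),     # smallest offset, flight 4
]
for name, dx, dy, width in rows:
    a = absolute_offset(dx, dy)
    print(f"{name}: absolute {a:.1f} m ({100 * a / width:.0f}% of image width)")
```

The two results reproduce the tabulated absolute values (85.6 m and 10.8 m) and the roughly 15% and 60%-plus width errors discussed in the text.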
Capturing the target in the image was achieved on every flight. However, capturing the target in the middle of the image was not as repeatable, with the error ranging from just under 15% of the image width (10.8 m in 73.9 m; the final image of flight four) to just over 60% of the image width (85.6 m in 133.4 m; the second image of flight one).

The capacity to accurately acquire images over pre-determined points is essential to ensure coverage and to expedite mosaicing of the images. It will also expand the application of these technologies into broader-scale applications, such as imaging in broadacre cereal cropping or imaging along transects (such as river systems).

Differing altitudes were programmed for each flight (table 1). The first flight was undertaken at 250 m above sea level (ASL), with the third at 204 m. The final flight was slightly different: the first image was acquired at the set altitude of 214 m, and the three circuits that followed were flown at this set height; however, the images were acquired at lower altitudes (194 m for images two and three, and 169 m for the final image). These image acquisition heights were changed in-flight with the intention of observing the response of the UAV to changes of the flight schedule.

An altitude plot of flight 4 is shown in figure 7, showing the relatively steep climb of the UAV after take-off. Also evident is the loss of altitude, and then correction, due to the banking of the aircraft when manoeuvring to align to the next waypoint. The saw-toothed nature of the plot, due to the banking, indicates that the feedback loops from the autopilot to the flight control surfaces are not tuned finely enough to optimise performance and ensure stable flight. No attempt was made to refine the triggering accuracy in this preliminary study. However, modifications such as extending the turn area to ensure the aircraft was in straight and level flight when the camera was over the target and triggered, or a more stable airframe such as an electric glider style, may have improved on the results obtained.

7 CONCLUSIONS

This study provides proof-of-concept that a low-cost auto-piloted UAV can fly a pre-determined path and acquire images at pre-determined locations. On every attempt, the target was successfully captured in the images. The proximity of the target to the centre of the image varied due to a number of factors, such as wind speed and direction, aircraft attitude, and GPS/autopilot/camera lags. Improving on the accuracy of the image acquisition was beyond the scope of this initial evaluation; however, some simple measures such as ensuring straight and level flight at image acquisition, a more stable platform, careful orientation of the camera and a higher update-rate GPS would have a beneficial effect on the accuracies obtained.
Figure 7: Altitude details for flight 4 (note the UAV reduced altitude to acquire images).
This autonomous system has the potential to be a highly suitable platform for “real world” applications, but needs further development to overcome the accuracy issues. As the capacity to perform automatic registering and mosaicing of the acquired images filters down from conventional aerial imagery, this low-cost remote sensing system will have great potential to be utilised in broader agricultural applications.

REFERENCES

Goodrich, M. A., Morse, B. S., Gerhardt, D., Cooper, J. L., Quigley, M., Adams, J. A. & Humphrey, C. 2008, “Supporting wilderness search and rescue using a camera-equipped mini UAV”, Journal of Field Robotics, Vol. 25, No. 1-2, pp. 89-110.

Hardin, P. J. & Jackson, M. W. 2005, “An unmanned aerial vehicle for rangeland photography”, Rangeland Ecology and Management, Vol. 58, No. 4, pp. 439-442.

Jensen, T., Apan, A., Young, F. & Zeller, L. 2007, “Detecting the attributes of a wheat crop using digital imagery acquired from a low-altitude platform”, Computers and Electronics in Agriculture, Vol. 59, No. 1-2, pp. 66-77.

Johnson, L. F., Herwitz, S. R., Lobitz, B. M. & Dunagan, S. E. 2004, “Feasibility of monitoring coffee field ripeness with airborne multispectral imagery”, Applied Engineering in Agriculture, Vol. 20, No. 6, pp. 845-849.

Kodak, n.d., “KODAK Digital Cameras, Printers, Digital Video Cameras & more”, www.kodak.com.

Lelong, C. C. D., Burger, P., Jubelin, G., Roux, B., Labbe, S. & Baret, F. 2008, “Assessment of unmanned aerial vehicles imagery for quantitative monitoring of wheat crop in small plots”, Sensors, Vol. 8, No. 5, pp. 3557-3585.

Microhard Systems Inc., 2011, “Leaders in Wireless Communication”, http://microhardcorp.com.

MicroPilot, 2011, “World Leader in Miniature UAV Autopilots since 1994”, www.micropilot.com.

Model Sports, n.d., “Welcome”, www.modelsports.com.au.

O. S. Engine, 2011, “O.S. Engines Homepage”, www.osengines.com.

Picaxe, n.d., “Home”, www.picaxe.com.

Schmale III, D. G., Dingus, B. R. & Reinholtz, C. 2008, “Development and application of an autonomous unmanned aerial vehicle for precise aerobiological sampling above agricultural fields”, Journal of Field Robotics, Vol. 25, No. 3, pp. 133-147.
TROY JENSEN
Dr Troy Jensen is a Senior Research Fellow/Senior Lecturer with the National
Centre for Engineering in Agriculture and the Faculty of Engineering and
Surveying, University of Southern Queensland (USQ), Toowoomba. He received
his PhD degree in Engineering from USQ. Applying engineering technologies
to agriculture is something that Troy has been doing since he commenced work
in 1987. Since this time, he has gained extensive applied research experience
in such diverse areas as agricultural machinery, animal and plant biosecurity,
precision agriculture, remote sensing, controlled traffic farming, native grass
seed harvesting and management, grain storage, horticultural mechanisation,
and biomass reuse. His current research area focuses on the use of precision
agriculture technologies, and he is working on a project funded by the Sugar
Research and Development Corporation titled “A co-ordinated approach to
Precision Agriculture RDE for the Australian Sugar Industry”.
LES ZELLER
Les Zeller is a Senior Research Engineer with the Queensland Department of
Employment, Economic Development and Innovation based in Toowoomba.
He received his Associate Diploma and Masters Degree in Engineering from
the University of Southern Queensland and his Bachelor in Applied Physics
from Central Queensland University. He has worked in agricultural research
for over 30 years in the development and application of electronic technologies
for agricultural research. A turf traction measurement device he developed has
been patented by the Queensland State Government.
ARMANDO APAN
Dr Armando A. Apan is an Associate Professor with the Australian Centre
for Sustainable Catchments and the Faculty of Engineering and Surveying,
University of Southern Queensland, Toowoomba. He received his PhD degree
in Geography from Monash University in Melbourne. His current research
area focuses on the use of remote sensing and geographic information systems
in environmental management, agriculture, forestry and ecology. He was
awarded the Queensland Spatial Excellence Award (Education and Professional
Development) in 2006 by the Spatial Sciences Institute, Australia. Currently, he
is the Associate Dean (Research) of the Faculty of Engineering and Surveying,
University of Southern Queensland.