The use of an unmanned aerial vehicle as a remote sensing platform in agriculture*

TA Jensen†
National Centre for Engineering in Agriculture, Faculty of Engineering and Surveying, University of Southern Queensland, Toowoomba, Queensland

LC Zeller
Department of Employment, Economic Development and Innovation, Toowoomba, Queensland

AA Apan
Australian Centre for Sustainable Catchments, Faculty of Engineering and Surveying, University of Southern Queensland, Toowoomba, Queensland



ABSTRACT: One of the limitations of using hobbyist remotely controlled aircraft with an attached digital camera is that a great number of the images look alike; unless a large number of natural features or artificial targets are present at the location, it is hard to identify and orientate the images. This paper investigates the use of an unmanned aerial vehicle (UAV) in agricultural applications. Trials were conducted, in collaboration with researchers from the Australian Research Centre for Aerospace Automation and Queensland University of Technology, on the ability of a UAV autopilot to accurately trigger a two-camera sensor when at a desired location. The study area was located at Watts Bridge Memorial Airfield, near Toogoolawah (152.460°, –27.098°) in southeast Queensland, Australia. The airfield has dedicated areas for the use of remotely controlled aircraft, with the mission being undertaken on 5 March 2008. The target and waypoints were arranged so that the UAV flew an anticlockwise flight pattern. Three separate missions were flown, with images being acquired over the target on each of the nine passes. Although capturing the target in the image was achieved on every flight, the accuracy of capturing the target in the middle of the image was variable. The offset from the centre of the image to the target (zero in a perfect system) ranged from just under 15% to just over 60% of the image extent. The misalignment was due to a combination of the predetermined offset value, cross-wind, global position system (GPS)/autopilot error, the UAV not being level when the image was acquired, and/or inaccuracies in positioning the sensors in the hinged pod. The capacity to accurately acquire images over pre-determined points is essential to ensure coverage and to expedite mosaicing of the images. It will also expand the application of these technologies into broader-scale applications, such as imaging in broadacre cereal cropping or imaging along transects.




*	Reviewed and revised version of a paper originally presented at the 2009 Society for Engineering in Agriculture (SEAg) National Conference, Brisbane, Queensland, 13-16 September 2009.
†	Corresponding author Dr Troy Jensen can be contacted at troy.jensen@usq.edu.au.

© Institution of Engineers Australia, 2011
Australian Journal of Multi-disciplinary Engineering, Vol 8 No 2

1	INTRODUCTION

The use of unmanned aerial vehicles (UAVs) as remote sensing tools is not new, with a number of recent studies detailing various applications: off-the-shelf componentry was used to construct a UAV for rangeland photography (Hardin & Jackson, 2005); a camera-equipped UAV was used for wilderness search and rescue (Goodrich et al, 2008); a very expensive and sophisticated UAV (the NASA-developed Pathfinder) was used for coffee ripeness monitoring (Johnson et al, 2004); and UAVs were used to monitor wheat in small plots with colour and near-infrared (NIR) images acquired to develop relationships between vegetation indices and field measured parameters (Lelong et al, 2008).

The above remote sensing methods used either real-time monitoring of the acquired imagery, or required a large number of images to be taken with the most appropriate being selected for future analysis. No previous studies have investigated the capacity of a


UAV to acquire an image only when at a particular location (set of coordinates). The ability of a UAV to navigate to a particular set of coordinates was, however, detailed in a paper investigating spore collection from the air (Schmale III et al, 2008). These investigators utilised the global position system (GPS) track log to determine the flight path of the UAV.

The purpose of this investigation was to evaluate a fully autonomous image acquisition system. To achieve this objective, the ability of the autopilot to trigger a remote sensing camera system was tested, and the three-dimensional (x, y, z) accuracy of the autopilot was also evaluated. The procedures to perform this testing and evaluation are detailed in this paper.

2	STUDY AREA

The study area was located at Watts Bridge Memorial Airfield, near Toogoolawah (152.460°, –27.098°) in southeast Queensland, Australia (figure 1). The mission was undertaken between 1300-1600 hours on 5 March 2008. A slight breeze became evident only in the last hour of testing.

3	PLATFORM

To undertake this evaluation, a specially modified version of a remotely-controlled "Phoenix Boomerang" 60-size fixed-wing trainer aircraft fitted with an autonomous flight control system was




              Figure 1:      Location of the Watts Bridge Memorial Airfield.



utilised (Model Sports, n.d.). The platform consisted of two 60-size Boomerangs merged together to give a wingspan of just under 3.0 m. The platform (figure 2) was powered by an "OS Engines" 91FX (16 cc) (O. S. Engine, 2011) methanol-glow motor. The baseline avionics on the platform included the "MicroPilot MP2028g" autopilot (MicroPilot, 2011) and a "Microhard Systems Inc. Spectra 910A" 900 MHz spread spectrum modem (Microhard Systems Inc., 2011) for communications with the ground control station. The speed range of the platform was 45-120 km/h, with a cruise speed of 70 km/h. The platform had a payload of 3 kg and a flight time of 25 minutes.

4	THE REMOTE SENSING SYSTEM

The remote sensing system used to acquire images was based on the system developed and detailed in Jensen et al (2007). This investigation utilised two 5.0 megapixel Kodak Easyshare CX7525 (Kodak, n.d.) digital zoom cameras (Eastman Kodak Company, Rochester NY). As described in the previous work, the two-camera system (one camera to capture the colour and the other the NIR portion of the spectrum) was remotely triggered, and was sensitive to NIR light (once the NIR cut-out filter had been removed).

The system was housed in a streamlined pod attached to the underside of the fuselage directly beneath the wing. The pod was hinged for easy access and download of the cameras (figure 3). As the sensor had been previously triggered using a spare output channel of the radio control equipment, this was easily adapted to suit the autopilot system. When the UAV was within a predetermined distance of the designated location (set prior to take-off at 20 m to allow for cross-winds, GPS error and camera misalignment), the autopilot set a spare servo channel




                   Figure 2:      The Queensland University of Technology UAV ready for take-off.
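For context, the platform figures quoted above imply the ground coverage available from a single sortie. The following back-of-envelope estimate is illustrative only: the cruise speed and flight time are from the text, but the 100 m swath width is a hypothetical figure, not taken from the paper.

```python
CRUISE_KMH = 70.0        # cruise speed of the platform (from the text)
FLIGHT_TIME_MIN = 25.0   # stated flight time

def track_length_km(speed_kmh: float, minutes: float) -> float:
    """Ground track flown at constant speed over the given time."""
    return speed_kmh * minutes / 60.0

def strip_area_ha(track_km: float, swath_m: float) -> float:
    """Area of the imaged strip for a given swath width."""
    return track_km * 1000.0 * swath_m / 10_000.0

track = track_length_km(CRUISE_KMH, FLIGHT_TIME_MIN)
print(f"track flown:  {track:.1f} km")                        # ~29.2 km
print(f"strip imaged: {strip_area_ha(track, 100.0):.0f} ha")  # ~292 ha
```

Even with generous allowances for turns, take-off and landing, coverage of this order is what makes such platforms attractive for broadacre applications.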




                   Figure 3:      The pod opened to remove the secure digital (SD) cards from the sensors.
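The trigger condition described above — fire when the UAV comes within a pre-set distance (here 20 m) of the target coordinates — reduces to a simple radius test once positions are projected onto a local plane. A minimal sketch, assuming planar coordinates in metres; the function and variable names are hypothetical, not from the MicroPilot firmware.

```python
import math

TRIGGER_RADIUS_M = 20.0  # set before take-off to absorb cross-wind,
                         # GPS error and camera misalignment

def within_trigger_radius(uav_xy, target_xy, radius_m=TRIGGER_RADIUS_M):
    """True when the UAV is inside the trigger circle around the target."""
    dx = uav_xy[0] - target_xy[0]
    dy = uav_xy[1] - target_xy[1]
    return math.hypot(dx, dy) <= radius_m

# 15 m cross-track and 10 m along-track is ~18 m out: inside the circle.
print(within_trigger_radius((15.0, 10.0), (0.0, 0.0)))  # True
```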



to the maximum output for 600 ms. A microcontroller (PICAXE-08; Picaxe, n.d.) was programmed to detect this change of state on the designated channel and trigger both cameras. The time delay between the trigger signal and the shot being taken was 0.5 s, with a further 1-2 s being required for the image to be stored on the memory card. The microcontroller also gave both cameras a pulse every 10 s to ensure that they did not power down.

5	DEPLOYMENT

The UAV was programmed with a flight plan to fly a number of left circuits over a series of pre-determined waypoints (figure 4). One of the dots is brighter, indicating the next waypoint that the UAV is heading towards. When passing above the origin point (the target of the image acquisition and where the UAV was initialised), the autopilot triggered the cameras.

The take-off of the UAV was performed manually, under the visual control of a radio-control pilot. Upon reaching a safe altitude (30 m), the UAV was switched into autonomous mode and the autopilot started guiding the aircraft along the set track, with flight height targeted at 120 m above ground level (AGL). When the UAV approached the imaging target (the initialisation point), it was instructed to change altitude to 90 m AGL. The change in altitude was performed so that most of the flight was at a higher (hence perceived safer) altitude, and likewise to simulate flying over obstructions and coming down to image acquisition height. Once past the target, the UAV resumed normal flying height. After 15-20 minutes of autonomous flying, the UAV was manually landed and the flight log was downloaded from the autopilot.

The flight log contained 52 columns of information, recorded at 5 Hz. The log detailed the state of the aircraft and included attributes such as attitude, position, speed, heading and servo values. Four flights were undertaken on the day of testing, with images successfully captured on three of these. The second flight had to be aborted and the UAV landed immediately, as conventional aircraft came into the proximity of the UAV. The imagery acquired was analysed to provide flight path accuracies.

6	RESULTS AND DISCUSSION

The flight paths of the three successful missions are shown in figure 5. Two circuits were completed on flights one and three, with three circuits being made on flight four. Each dot in the circuit represents the latitude and longitude of the path taken by the UAV, updated by the GPS every second and recorded in the flight log. The activity around the target area and the reduced distance between consecutive dots in this area indicate that this was the take-off and




              Figure 4:      The Horizon Flight Schedule ground control station software showing the path and the
                             flight details of the UAV being monitored in the autopilot flight software.
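The 0.5 s delay between the trigger signal and the shot is significant at cruise speed: the aircraft covers nearly 10 m of ground before the image is actually taken, which helps explain both the 20 m trigger radius set before take-off and the along-track offsets observed. A quick check using the paper's own figures:

```python
CRUISE_KMH = 70.0     # cruise speed of the platform
SHUTTER_LAG_S = 0.5   # delay between trigger signal and the shot

def along_track_lag_m(speed_kmh: float, lag_s: float) -> float:
    """Ground distance covered during the camera's shutter lag."""
    return speed_kmh / 3.6 * lag_s

print(f"{along_track_lag_m(CRUISE_KMH, SHUTTER_LAG_S):.1f} m")  # 9.7 m
```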





                   Figure 5:      The flight paths and target positioning, Watts Bridge on 5 March 2008.




                    (a)                                                         (b)

                   Figure 6:      (a) Image 100_4814 taken at 3:27:44 pm. (b) Image 100_4815 taken just over 3 minutes after
                                  image 100_4814.
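The offsets reported in this section follow from simple image geometry: once the ground extent of an image is known (from GPS coordinates of features visible in the frame), a target's pixel position converts to a metre offset from the image centre. A sketch of that conversion; the pixel dimensions and values used here are hypothetical, not taken from the study.

```python
def ground_offset_m(target_px, image_size_px, extent_m):
    """Metre offset of a target from the image centre.

    target_px:     (col, row) pixel position of the target
    image_size_px: (width, height) of the image in pixels
    extent_m:      (width, height) of the ground footprint in metres
    """
    (w, h), (ext_x, ext_y) = image_size_px, extent_m
    dx = (target_px[0] - w / 2.0) * ext_x / w   # cross-track offset
    dy = (target_px[1] - h / 2.0) * ext_y / h   # along-track offset
    return dx, dy

# Target 100 px right of centre in a 1000 px wide, 100 m wide image:
print(ground_offset_m((600, 500), (1000, 1000), (100.0, 100.0)))  # (10.0, 0.0)
```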

landing zone. The flight path is superimposed over a SPOT 5 satellite image showing the infrastructure of the Watts Bridge Memorial Airfield and other natural features in close proximity. Also displayed are the waypoints used in determining the flight path and the location of the target, over which the images were captured.

Two examples of the images captured on flight four are shown in figure 6. These images were taken on the last circuits made by the UAV on the day. Even though the images were acquired a little over 3 minutes apart, there is good consistency in the coverage and positioning of the target within both images. Ideally, if the autopilot was doing a perfect job guiding the UAV, the target would be in the centre of the image. As can be seen from the images (figure 6), this was not quite the case.

The target and waypoints were arranged so that the UAV should in theory fly directly down the centre of the mowed grass runway that ran northeast-southwest in figure 5. This should have resulted in the runway being positioned vertically in the centre of each image acquired. This was not the case. The misalignment was possibly due to a combination of cross-wind, GPS/autopilot error, the UAV not being level when the image was acquired, and/or inaccuracies in positioning the sensors in the hinged pod. Defining and refining these errors was not within the scope of this proof-of-concept research.

Flight details and inaccuracies in the image acquisition were quantified and are detailed in table 1. The scale of the images was determined using GPS coordinates of known features on the images. The direction of flight of the UAV was from the top of the image to



Table 1:	Details of the errors for the images acquired over the target.

            	        	Altitude	Heading  	Image offset            	Image extent	Area
Image #     	Time    	ASL (m) 	(degrees)	X (m) 	Y (m)	Absolute	X (m)	Y (m) 	(ha)
Flight 1
100_4801    	1:32:06 	261     	150      	50.4  	8.3  	51.1    	140.4	104.0 	1.46
100_4803    	1:35:26 	        	         	83.7  	17.8 	85.6    	133.4	98.8  	1.32
100_4805    	1:38:32 	        	         	59.6  	10.2 	60.5    	137.4	101.8 	1.40
Flight 2
100_4806    	2:36:10 	aborted mission, no data collected
Flight 3
100_4809    	2:57:24 	216     	146      	17.2  	–1.9 	17.3    	103.0	76.3  	0.79
100_4810    	3:00:42 	198     	132      	19.1  	0.3  	19.1    	92.2 	68.3  	0.63
Flight 4
100_4812    	3:21:48 	211     	113      	39.5  	6.0  	40.0    	108.7	80.5  	0.88
100_4813    	3:24:46 	        	         	38.2  	26.6 	46.5    	92.0 	68.2  	0.63
100_4814    	3:27:44 	192     	114      	–19.3 	14.1 	23.9    	100.4	74.4  	0.75
100_4815    	3:30:50 	166     	125      	–9.6  	4.9  	10.8    	73.9 	54.7  	0.40
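The derived columns of table 1 can be reproduced from the measured offsets and extents: the absolute offset is the Euclidean distance from image centre to target, the area is the product of the footprint dimensions, and the percentages quoted in the text are the absolute offset as a fraction of the image width. A short check against two rows of the table:

```python
import math

def absolute_offset_m(x_m: float, y_m: float) -> float:
    """Direct distance from image centre to target ('Absolute' column)."""
    return math.hypot(x_m, y_m)

def footprint_ha(x_extent_m: float, y_extent_m: float) -> float:
    """Ground area of one image in hectares ('Area' column)."""
    return x_extent_m * y_extent_m / 10_000.0

# Image 100_4801: offset 51.1 m, area 1.46 ha
print(round(absolute_offset_m(50.4, 8.3), 1))   # 51.1
print(round(footprint_ha(140.4, 104.0), 2))     # 1.46
# Error as a fraction of image width:
print(round(10.8 / 73.9 * 100, 1))   # 14.6 (just under 15%, flight 4)
print(round(85.6 / 133.4 * 100, 1))  # 64.2 (just over 60%, flight 1)
```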

the bottom. In the image offset column in table 1, the X distance is the cross-track distance, with a positive value indicating that the target is to the left and a negative value to the right of the centre of the image. The offset in the direction of flight (undershoot or overshoot) is indicated by the Y column, with a positive value indicating that the image was captured before reaching the target and a negative value after passing it. The absolute is the direct distance from the centre of the image to the centre of the target.

Capturing the target in the image was achieved on every flight. However, capturing the target in the middle of the image was not as repeatable, with the error ranging from just under 15% of the image width (10.8 in 73.9 m; the final image of flight four) to just over 60% of the image width (85.6 in 133.4 m; the second image of flight one).

The capacity to accurately acquire images over pre-determined points is essential to ensure coverage and to expedite mosaicing of the images. It will also expand the application of these technologies into broader-scale applications, such as imaging in broadacre cereal cropping or imaging along transects (such as river systems).

Differing altitudes were programmed for each flight (table 1). The first flight was undertaken at 250 m above sea level (ASL), with the third at 204 m. The final flight was slightly different. The first image was acquired at the set altitude of 214 m. The three circuits that followed were flown at this set height; however, the images were acquired at lower altitudes (194 m for images two and three, and 169 m for the final image). These image acquisition heights were changed in-flight with the intention of observing the response of the UAV to changes of the flight schedule.

An altitude plot of flight 4 is shown in figure 7, showing the relatively steep climb of the UAV after take-off. Also evident is the loss of altitude, and then correction, due to the banking of the aircraft when manoeuvring to align to the next waypoint. The saw-toothed nature of the plot, due to the banking, indicates that the feedback loops to the autopilot controlling the flight surfaces are not finely tuned enough to optimise performance and ensure stable flight. No attempt was made to refine the triggering accuracy in this preliminary study. However, modifications such as extending the turn area to ensure the aircraft was in straight and level flight when the camera was over target and triggered, or a more stable airframe such as an electric glider-style, may have improved the results obtained.

7	CONCLUSIONS

This study provides proof-of-concept that a low-cost auto-piloted UAV can fly a pre-determined path and acquire images at pre-determined locations. On every attempt, the target was successfully captured in the images. The proximity of the target to the centre of the image varied due to a number of factors, such as wind speed and direction, aircraft attitude, and GPS/autopilot/camera lags. Improving the accuracy of the image acquisition was beyond the scope of this initial evaluation; however, some simple measures such as ensuring straight and level flight at image acquisition, a more stable platform, careful orientation of the camera and a higher update rate GPS would have a beneficial effect on the accuracies obtained.





                   Figure 7:      Altitude details for flight 4 (note the UAV reduced altitude to acquire images).

This autonomous system has the potential to be a highly suitable platform for "real world" applications, but needs further development to overcome the accuracy issues. As the capacity to perform automatic registering and mosaicing of the acquired images filters down from conventional aerial imagery, this low-cost remote sensing system will have great potential to be utilised in broader agricultural applications.

REFERENCES

Goodrich, M. A., Morse, B. S., Gerhardt, D., Cooper, J. L., Quigley, M., Adams, J. A. & Humphrey, C. 2008, "Supporting wilderness search and rescue using a camera-equipped mini UAV", Journal of Field Robotics, Vol. 25, No. 1-2, pp. 89-110.

Hardin, P. J. & Jackson, M. W. 2005, "An unmanned aerial vehicle for rangeland photography", Rangeland Ecology and Management, Vol. 58, No. 4, pp. 439-442.

Jensen, T., Apan, A., Young, F. & Zeller, L. 2007, "Detecting the attributes of a wheat crop using digital imagery acquired from a low-altitude platform", Computers and Electronics in Agriculture, Vol. 59, No. 1-2, pp. 66-77.

Johnson, L. F., Herwitz, S. R., Lobitz, B. M. & Dunagan, S. E. 2004, "Feasibility of monitoring coffee field ripeness with airborne multispectral imagery", Applied Engineering in Agriculture, Vol. 20, No. 6, pp. 845-849.

Kodak, n.d., "KODAK Digital Cameras, Printers, Digital Video Cameras & more", www.kodak.com.

Lelong, C. C. D., Burger, P., Jubelin, G., Roux, B., Labbe, S. & Baret, F. 2008, "Assessment of unmanned aerial vehicles imagery for quantitative monitoring of wheat crop in small plots", Sensors, Vol. 8, No. 5, pp. 3557-3585.

Microhard Systems Inc., 2011, "Leaders in Wireless Communication", http://microhardcorp.com.

MicroPilot, 2011, "World Leader in Miniature UAV Autopilots since 1994", www.micropilot.com.

Model Sports, n.d., "Welcome", www.modelsports.com.au.

O. S. Engine, 2011, "O.S. Engines Homepage", www.osengines.com.

Picaxe, n.d., "Home", www.picaxe.com.

Schmale III, D. G., Dingus, B. R. & Reinholtz, C. 2008, "Development and application of an autonomous unmanned aerial vehicle for precise aerobiological sampling above agricultural fields", Journal of Field Robotics, Vol. 25, No. 3, pp. 133-147.






                                               TROY JENSEN

Dr Troy Jensen is a Senior Research Fellow/Senior Lecturer with the National Centre for Engineering in Agriculture and the Faculty of Engineering and Surveying, University of Southern Queensland (USQ), Toowoomba. He received his PhD degree in Engineering from USQ. Applying engineering technologies to agriculture is something that Troy has been doing since he commenced work in 1987. Since this time, he has gained extensive applied research experience in such diverse areas as agricultural machinery, animal and plant biosecurity, precision agriculture, remote sensing, controlled traffic farming, native grass seed harvesting and management, grain storage, horticultural mechanisation, and biomass reuse. His current research focuses on the use of precision agriculture technologies, and he is working on a project funded by the Sugar Research and Development Corporation titled "A co-ordinated approach to Precision Agriculture RDE for the Australian Sugar Industry".




                                               LES ZELLER

                                               Les Zeller is a Senior Research Engineer with the Queensland Department of
                                               Employment, Economic Development and Innovation based in Toowoomba.
                                               He received his Associate Diploma and Masters Degree in Engineering from
                                               the University of Southern Queensland and his Bachelor in Applied Physics
from Central Queensland University. He has worked for over 30 years in the
development and application of electronic technologies for agricultural
research. A turf traction measurement device he developed has been patented
by the Queensland State Government.




                                               ARMANDO APAN

                                               Dr Armando A. Apan is an Associate Professor with the Australian Centre
                                               for Sustainable Catchments and the Faculty of Engineering and Surveying,
                                               University of Southern Queensland, Toowoomba. He received his PhD degree
                                               in Geography from Monash University in Melbourne. His current research
                                               area focuses on the use of remote sensing and geographic information systems
                                               in environmental management, agriculture, forestry and ecology. He was
                                               awarded the Queensland Spatial Excellence Award (Education and Professional
                                               Development) in 2006 by the Spatial Sciences Institute, Australia. Currently, he
                                               is the Associate Dean (Research) of the Faculty of Engineering and Surveying,
                                               University of Southern Queensland.




              Australian Journal of Multi-disciplinary Engineering                                                        Vol 8 No 2




Copyright of Australian Journal of Multi-Disciplinary Engineering is the property of Institution of Engineers
Australia, trading as Engineers Australia and its content may not be copied or emailed to multiple sites or
posted to a listserv without the copyright holder's express written permission. However, users may print,
download, or email articles for individual use.

Queensland, Australia. The airfield has dedicated areas for the use of remotely controlled aircraft, with the mission being undertaken on 5 March 2008. The target and waypoints were arranged so that the UAV flew in an anticlockwise flight pattern. Three separate missions were flown, with images being acquired when over the target on each of the nine passes. Although capturing the target in the image was achieved on every flight, the accuracy of capturing the target in the middle of the image was variable. The offset from the centre of the image to the target (zero in a perfect system) ranged from just under 15% to just over 60% of the image extent.
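The centre-offset metric reported here can be illustrated with a short sketch. This is not the authors' code, and the paper does not define "image extent" precisely; the sketch assumes the target's pixel location is known and normalises the centre-to-target distance by the half-diagonal of the frame (the farthest an in-frame target can lie from the centre):

```python
import math

def target_offset_percent(img_w, img_h, target_px):
    """Offset of a target from the image centre, as a percentage of the
    image extent (0% = target perfectly centred, 100% = target in a corner).

    target_px: (x, y) pixel location of the target in the image.
    Assumes half-diagonal normalisation, which is one plausible reading
    of "percentage of the image extent".
    """
    cx, cy = img_w / 2.0, img_h / 2.0
    dx, dy = target_px[0] - cx, target_px[1] - cy
    offset = math.hypot(dx, dy)        # distance from frame centre
    max_offset = math.hypot(cx, cy)    # half-diagonal of the frame
    return 100.0 * offset / max_offset

# A centred target in a 1024 x 768 frame gives 0%:
print(target_offset_percent(1024, 768, (512, 384)))  # 0.0
# A target in a corner gives 100%:
print(target_offset_percent(1024, 768, (0, 0)))  # 100.0
```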
The misalignment was due to a combination of the predetermined offset value, cross-wind, global position system (GPS)/autopilot error, the UAV not being level when the image was acquired, and/or inaccuracies in positioning the sensors in the hinged pod. The capacity to accurately acquire images over pre-determined points is essential to ensure coverage and to expedite mosaicing of the images. It will also expand the application of these technologies into broader-scale applications, such as imaging in broadacre cereal cropping or imaging along transects.

1	INTRODUCTION

The use of unmanned aerial vehicles (UAVs) as remote sensing tools is not new, with a number of recent studies detailing various applications: off-the-shelf componentry was used to construct a UAV for rangeland photography (Hardin & Jackson, 2005); a camera-equipped UAV was used for wilderness search and rescue (Goodrich et al, 2008); a very expensive and sophisticated UAV (the NASA-developed Pathfinder) was used for coffee ripeness monitoring (Johnson et al, 2004); and UAVs were used to monitor wheat in small plots, with colour and near-infrared (NIR) images acquired to develop relationships between vegetation indices and field-measured parameters (Lelong et al, 2008). The above remote sensing methods used either real-time monitoring of the acquired imagery, or required a large number of images to be taken with the most appropriate being selected for later analysis. No previous studies have investigated the capacity of a

* Reviewed and revised version of a paper originally presented at the 2009 Society for Engineering in Agriculture (SEAg) National Conference, Brisbane, Queensland, 13-16 September 2009.
† Corresponding author Dr Troy Jensen can be contacted at troy.jensen@usq.edu.au.
© Institution of Engineers Australia, 2011
Australian Journal of Multi-disciplinary Engineering, Vol 8 No 2
UAV to acquire an image only when at a particular location (a set of coordinates). The ability of a UAV to navigate to a particular set of coordinates was, however, detailed in a paper investigating spore collection from the air (Schmale III et al, 2008). Those investigators utilised the GPS track log to determine the flight path of the UAV.

The purpose of this investigation was to evaluate a fully autonomous image acquisition system. To achieve this objective, the ability of the autopilot to trigger a remote sensing camera system was tested, and the three-dimensional accuracy (x, y, z) of the autopilot was also evaluated. The procedures to perform this testing and evaluation are detailed in this paper.

2	STUDY AREA

The study area was located at Watts Bridge Memorial Airfield, near Toogoolawah (152.460°, –27.098°) in southeast Queensland, Australia (figure 1). The mission was undertaken between 1300 and 1600 hours on 5 March 2008. A slight breeze became evident only in the last hour of testing.

Figure 1: Location of the Watts Bridge Memorial Airfield.

3	PLATFORM

To undertake this evaluation, a specially modified version of a remotely-controlled "Phoenix Boomerang" 60-size fixed-wing trainer aircraft fitted with an autonomous flight control system was
utilised (Model Sports, n.d.). The platform consisted of two 60-size Boomerangs merged together to give a wingspan of just under 3.0 m. The platform (figure 2) was powered by an "OS Engines" 91FX (16 cc) (O. S. Engine, 2011) methanol-glow motor. The baseline avionics on the platform included the "MicroPilot MP2028g" autopilot (MicroPilot, 2011) and a "Microhard Systems Inc. Spectra 910A" 900 MHz spread spectrum modem (Microhard Systems Inc., 2011) for communications with the ground control station. The speed range of the platform was 45-120 km/h, with a cruise speed of 70 km/h. The platform had a payload of 3 kg and a flight time of 25 minutes.

Figure 2: The Queensland University of Technology UAV ready for take-off.

4	THE REMOTE SENSING SYSTEM

The remote sensing system used to acquire images was based on the system developed and detailed in Jensen et al (2007). This investigation utilised 5.0 megapixel Kodak Easyshare CX7525 (Kodak, n.d.) digital zoom cameras (Eastman Kodak Company, Rochester NY). As described in the previous work, the two-camera system (one camera to capture the colour and the other the NIR portion of the spectrum) was remotely triggered, and was sensitive to NIR light (once the NIR cut-out filter had been removed). The system was housed in a streamlined pod attached to the underside of the fuselage directly beneath the wing. The pod was hinged for easy access and download of the cameras (figure 3).

Figure 3: The pod opened to remove the secure digital (SD) cards from the sensors.

As the sensor had previously been triggered using a spare output channel of the radio control equipment, this was easily adapted to suit the autopilot system. When the UAV was within a predetermined distance of the designated location (set prior to take-off at 20 m to allow for cross-winds, GPS error and camera misalignment), the autopilot set a spare servo channel to the maximum output for 600 ms.
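The paper does not state how the autopilot tests "within a predetermined distance", so the following is a minimal sketch of one plausible implementation of that check: an equirectangular ground-distance approximation (adequate over the few hundred metres of a model-airfield circuit) compared against the 20 m trigger radius. The function names and the choice of distance formula are ours, not the MicroPilot firmware's.

```python
import math

EARTH_RADIUS_M = 6_371_000.0

def distance_m(lat1, lon1, lat2, lon2):
    """Approximate ground distance (m) between two WGS84 points,
    using an equirectangular projection about the mean latitude."""
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    return EARTH_RADIUS_M * math.hypot(dlat, dlon)

def should_trigger(uav_lat, uav_lon, target_lat, target_lon, radius_m=20.0):
    """True when the UAV is within the trigger radius of the target.
    The 20 m default is the value used in the trials, chosen to allow
    for cross-winds, GPS error and camera misalignment."""
    return distance_m(uav_lat, uav_lon, target_lat, target_lon) <= radius_m

# Target near the Watts Bridge initialisation point (coordinates from the paper).
TARGET = (-27.098, 152.460)

print(should_trigger(-27.09791, 152.460, *TARGET))  # True (fix ~10 m from target)
print(should_trigger(-27.0971, 152.460, *TARGET))   # False (fix ~100 m away)
```

At the platform's 70 km/h cruise (about 19 m/s), the 0.5 s camera lag described below alone consumes roughly 10 m of the 20 m radius, which illustrates why the trigger threshold cannot be made arbitrarily small.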
A microcontroller (PICAXE-08; Picaxe, n.d.) was programmed to detect this change of state on the designated channel and trigger both cameras. The time delay between the trigger signal and the shot being taken was 0.5 s, with a further 1-2 s required for the image to be stored on the memory card. The microcontroller also gave both cameras a pulse every 10 s to ensure that they did not power down.

5	DEPLOYMENT

The UAV was programmed with a flight plan to fly a number of left circuits over a series of pre-determined waypoints (figure 4). One of the dots is brighter, indicating the next waypoint that the UAV is heading towards. When passing above the origin point (the target of the image acquisition, and where the UAV was initialised), the autopilot triggered the cameras.

Figure 4: The Horizon Flight Schedule ground control station software showing the path and the flight details of the UAV.

The take-off of the UAV was performed manually, under the visual control of a radio-control pilot. Upon reaching a safe altitude (30 m), the UAV was switched into autonomous mode and the autopilot began guiding the aircraft along the set track, with flight height targeted at 120 m above ground level (AGL). When the UAV approached the imaging target (the initialisation point), it was instructed to change altitude to 90 m AGL. The change in altitude was performed so that most of the flight was at a higher (hence perceived safer) altitude, and likewise to simulate flying over obstructions and descending to image acquisition height. Once past the target, the UAV resumed normal flying height. After 15-20 minutes of autonomous flying, the UAV was manually landed and the flight log was downloaded from the autopilot.

The flight log contained 52 columns of information, recorded at 5 Hz, detailing the state of the aircraft, including attributes such as attitude, position, speed, heading and servo values. Four flights were undertaken on the day of testing, with images successfully captured on three of these. The second flight had to be aborted and the UAV landed immediately, as conventional aircraft came into the proximity of the UAV. The imagery acquired was analysed to provide flight path accuracies.

6	RESULTS AND DISCUSSION

A flight path of the three successful missions is shown in figure 5. Two circuits were completed on flights one and three, with three circuits being made on flight four. Each dot in the circuit represents the latitude and longitude of the path taken by the UAV, updated by the GPS every second and recorded in the flight log. The activity around the target area, and the reduced distance between consecutive dots in this area, indicate that this was the take-off and landing zone.
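The trigger chain described in sections 4 and 5 (autopilot raises a spare servo channel for 600 ms; the microcontroller fires both cameras on the rising edge and sends a 10 s keep-alive pulse) can be sketched as a simple polling loop. This is an illustrative Python simulation, not the PICAXE-08 firmware; the event names and the 20 ms polling period are assumptions, while the 600 ms pulse width and 10 s keep-alive interval come from the paper.

```python
def run_trigger_logic(channel_samples, sample_period_s=0.02):
    """Simulate the camera-trigger microcontroller.

    channel_samples: sequence of booleans (servo channel high/low),
    polled every sample_period_s. Returns the list of events emitted.
    """
    events = []
    last_state = False
    keepalive_every = int(round(10.0 / sample_period_s))  # samples per 10 s
    samples_since = 0
    for sample in channel_samples:
        if sample and not last_state:          # rising edge on the servo channel
            events.append("trigger_both_cameras")
            samples_since = 0                  # a real shot restarts the timer
        last_state = sample
        samples_since += 1
        if samples_since >= keepalive_every:   # stop the cameras powering down
            events.append("keepalive_pulse")
            samples_since = 0
    return events

# One 600 ms pulse (30 high samples at 20 ms polling) inside 12 s of quiet:
samples = [False] * 100 + [True] * 30 + [False] * 470
print(run_trigger_logic(samples))  # ['trigger_both_cameras', 'keepalive_pulse']
```

Edge detection (rather than level detection) matters here: the 600 ms pulse spans many polling intervals, and triggering on level would fire the cameras repeatedly during a single pass over the target.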
The flight path is superimposed over a Spot 5 satellite image showing the infrastructure of the Watts Bridge Memorial Airfield and other natural features in close proximity. Also displayed are the waypoints used in determining the flight path, and the location of the target over which the images were captured.

Figure 5: The flight paths and target positioning, Watts Bridge, 5 March 2008.

An example of two of the images captured on flight four is shown in figure 6. These images were taken on the last circuits made by the UAV on the day. Even though the images were acquired a little over 3 minutes apart, there is good consistency in the coverage and positioning of the target within both images. Ideally, if the autopilot was guiding the UAV perfectly, the target would be in the centre of the image. As can be seen from figure 6, this was not quite the case.

Figure 6: (a) Image 100_4814, taken at 3:27:44 pm. (b) Image 100_4815, taken just over 3 minutes after image 100_4814.

The target and waypoints were arranged so that the UAV should, in theory, fly directly down the centre of the mowed grass runway that runs northeast-southwest in figure 5. This should have resulted in the runway being positioned vertically in the centre of each image acquired. This was not the case. The misalignment was possibly due to a combination of cross-wind, GPS/autopilot error, the UAV not being level when the image was acquired, and/or inaccuracies in positioning the sensors in the hinged pod. Defining and refining these errors was not within the scope of this proof-of-concept research.

Flight details and inaccuracies in the image acquisition are quantified in table 1. The scale of the images was determined using GPS coordinates of known features in the images. The direction of flight of the UAV was from the top of the image to the bottom.

Table 1: Details of the errors for the images acquired over the target.

Image #    Time     Altitude  Heading    Image offset (m)        Image extent (m)   Area
                    ASL (m)   (deg)      X      Y     Absolute   X       Y          (ha)
Flight 1
100_4801   1:32:06  261       150        50.4    8.3  51.1       140.4   104.0      1.46
100_4803   1:35:26  –         –          83.7   17.8  85.6       133.4    98.8      1.32
100_4805   1:38:32  –         –          59.6   10.2  60.5       137.4   101.8      1.40
Flight 2
100_4806   2:36:10  aborted mission, no data collected
Flight 3
100_4809   2:57:24  216       146        17.2   –1.9  17.3       103.0    76.3      0.79
100_4810   3:00:42  198       132        19.1    0.3  19.1        92.2    68.3      0.63
Flight 4
100_4812   3:21:48  211       113        39.5    6.0  40.0       108.7    80.5      0.88
100_4813   3:24:46  –         –          38.2   26.6  46.5        92.0    68.2      0.63
100_4814   3:27:44  192       114       –19.3   14.1  23.9       100.4    74.4      0.75
100_4815   3:30:50  166       125        –9.6    4.9  10.8        73.9    54.7      0.40

In the image offset columns of table 1, the X distance is the cross-track distance, with a positive value indicating an offset to the left of the centre of the image and a negative value to the right. The offset in the direction of flight (undershoot or overshoot) is indicated by the Y column, with a positive value indicating that the image was captured before the centre of the image reached the target and a negative value indicating after. The absolute value is the direct distance from the centre of the image to the centre of the target.

Capturing the target in the image was achieved on every flight. However, capturing the target in the middle of the image was not as repeatable, with the error ranging from just under 15% of the image width (10.8 m in 73.9 m; the final image of flight four) to just over 60% of the image width (85.6 m in 133.4 m; the second image of flight one).
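The derived columns of table 1 follow directly from the measured offsets and extents. The sketch below reproduces them for the best and worst images; the function names are ours, and the input numbers are the paper's.

```python
import math

def absolute_offset_m(x_offset_m, y_offset_m):
    """Straight-line distance (m) from image centre to target:
    the 'Absolute' column of table 1."""
    return math.hypot(x_offset_m, y_offset_m)

def area_ha(extent_x_m, extent_y_m):
    """Ground footprint of one image in hectares (1 ha = 10,000 m^2)."""
    return extent_x_m * extent_y_m / 10_000.0

def offset_percent(absolute_m, extent_x_m):
    """Offset expressed as a percentage of the image width."""
    return 100.0 * absolute_m / extent_x_m

# Image 100_4815 (final image, flight 4): offsets -9.6 m, 4.9 m; extent 73.9 x 54.7 m.
best = absolute_offset_m(-9.6, 4.9)
print(round(best, 1), round(offset_percent(best, 73.9), 1))    # ~10.8 m, ~14.6 %

# Image 100_4803 (second image, flight 1): offsets 83.7 m, 17.8 m; extent 133.4 x 98.8 m.
worst = absolute_offset_m(83.7, 17.8)
print(round(worst, 1), round(offset_percent(worst, 133.4), 1)) # ~85.6 m, ~64.1 %

print(round(area_ha(73.9, 54.7), 2))  # ~0.40 ha
```

These two cases bracket the "just under 15%" to "just over 60%" error range reported in the text.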
The capacity to accurately acquire images over pre-determined points is essential to ensure coverage and to expedite mosaicing of the images. It will also expand the application of these technologies into broader-scale applications, such as imaging in broadacre cereal cropping or imaging along transects (such as river systems).

Differing altitudes were programmed for each flight (table 1). The first flight was undertaken at 250 m above sea level (ASL), with the third at 204 m. The final flight was slightly different: the first image was acquired at the set altitude of 214 m. The three circuits that followed were flown at this set height; however, the images were acquired at lower altitudes (194 m for images two and three, and 169 m for the final image). These image acquisition heights were changed in-flight with the intention of observing the response of the UAV to changes in the flight schedule.

An altitude plot of flight 4 is shown in figure 7, showing the relatively steep climb of the UAV after take-off. Also evident is the loss of altitude, and subsequent correction, due to the banking of the aircraft when manoeuvring to align with the next waypoint. The saw-toothed nature of the plot, caused by the banking, indicates that the feedback loops used by the autopilot to control the flight surfaces are not finely tuned enough to optimise performance and ensure stable flight. No attempt was made to refine the triggering accuracy in this preliminary study. However, modifications such as extending the turn area to ensure the aircraft was in straight and level flight when the camera was over the target and triggered, or a more stable airframe such as an electric glider style, may have improved on the results obtained.

7	CONCLUSIONS

This study provides proof-of-concept that a low-cost auto-piloted UAV can fly a pre-determined path and acquire images at pre-determined locations. On every attempt, the target was successfully captured in the images. The proximity of the target to the centre of the image varied due to a number of factors such as wind speed and direction, aircraft attitude, and GPS/autopilot/camera lags. Improving the accuracy of the image acquisition was beyond the scope of this initial evaluation; however, some simple measures such as ensuring straight and level flight at image acquisition, a more stable platform, careful orientation of the camera, and a higher update-rate GPS would have a beneficial effect on the accuracies obtained.
Figure 7: Altitude details for flight 4 (note the UAV reduced altitude to acquire images).

This autonomous system has the potential to be a highly suitable platform for "real world" applications, but needs further development to overcome the accuracy issues. As the capacity to perform automatic registration and mosaicing of the acquired images filters down from conventional aerial imagery, this low-cost remote sensing system will have great potential to be utilised in broader agricultural applications.

REFERENCES

Goodrich, M. A., Morse, B. S., Gerhardt, D., Cooper, J. L., Quigley, M., Adams, J. A. & Humphrey, C. 2008, "Supporting wilderness search and rescue using a camera-equipped mini UAV", Journal of Field Robotics, Vol. 25, No. 1-2, pp. 89-110.

Hardin, P. J. & Jackson, M. W. 2005, "An unmanned aerial vehicle for rangeland photography", Rangeland Ecology and Management, Vol. 58, No. 4, pp. 439-442.

Jensen, T., Apan, A., Young, F. & Zeller, L. 2007, "Detecting the attributes of a wheat crop using digital imagery acquired from a low-altitude platform", Computers and Electronics in Agriculture, Vol. 59, No. 1-2, pp. 66-77.

Johnson, L. F., Herwitz, S. R., Lobitz, B. M. & Dunagan, S. E. 2004, "Feasibility of monitoring coffee field ripeness with airborne multispectral imagery", Applied Engineering in Agriculture, Vol. 20, No. 6, pp. 845-849.

Kodak, n.d., "KODAK Digital Cameras, Printers, Digital Video Cameras & more", www.kodak.com.

Lelong, C. C. D., Burger, P., Jubelin, G., Roux, B., Labbe, S. & Baret, F. 2008, "Assessment of unmanned aerial vehicles imagery for quantitative monitoring of wheat crop in small plots", Sensors, Vol. 8, No. 5, pp. 3557-3585.

Microhard Systems Inc., 2011, "Leaders in Wireless Communication", http://microhardcorp.com.

MicroPilot, 2011, "World Leader in Miniature UAV Autopilots since 1994", www.micropilot.com.

Model Sports, n.d., "Welcome", www.modelsports.com.au.

O. S. Engine, 2011, "O.S. Engines Homepage", www.osengines.com.

Picaxe, n.d., "Home", www.picaxe.com.

Schmale III, D. G., Dingus, B. R. & Reinholtz, C. 2008, "Development and application of an autonomous unmanned aerial vehicle for precise aerobiological sampling above agricultural fields", Journal of Field Robotics, Vol. 25, No. 3, pp. 133-147.
TROY JENSEN

Dr Troy Jensen is a Senior Research Fellow/Senior Lecturer with the National Centre for Engineering in Agriculture and the Faculty of Engineering and Surveying, University of Southern Queensland (USQ), Toowoomba. He received his PhD degree in Engineering from USQ. Applying engineering technologies to agriculture is something that Troy has been doing since he commenced work in 1987. Since then, he has gained extensive applied research experience in such diverse areas as agricultural machinery, animal and plant biosecurity, precision agriculture, remote sensing, controlled traffic farming, native grass seed harvesting and management, grain storage, horticultural mechanisation, and biomass reuse. His current research focuses on the use of precision agriculture technologies, and he is working on a project funded by the Sugar Research and Development Corporation titled "A co-ordinated approach to Precision Agriculture RDE for the Australian Sugar Industry".

LES ZELLER

Les Zeller is a Senior Research Engineer with the Queensland Department of Employment, Economic Development and Innovation, based in Toowoomba. He received his Associate Diploma and Masters degree in Engineering from the University of Southern Queensland, and his Bachelor of Applied Physics from Central Queensland University. He has worked for over 30 years in the development and application of electronic technologies for agricultural research. A turf traction measurement device he developed has been patented by the Queensland State Government.

ARMANDO APAN

Dr Armando A. Apan is an Associate Professor with the Australian Centre for Sustainable Catchments and the Faculty of Engineering and Surveying, University of Southern Queensland, Toowoomba. He received his PhD degree in Geography from Monash University in Melbourne.
His current research area focuses on the use of remote sensing and geographic information systems in environmental management, agriculture, forestry and ecology. He was awarded the Queensland Spatial Excellence Award (Education and Professional Development) in 2006 by the Spatial Sciences Institute, Australia. Currently, he is the Associate Dean (Research) of the Faculty of Engineering and Surveying, University of Southern Queensland.