
Automating Crop Damage Assessments with Unmanned Aerial Systems and Machine Learning

Puig, E., Gonzalez, F., Hamilton, G. and Grundy, P. R. (2016) Automating Crop Damage Assessments with Unmanned Aerial Systems and Machine Learning. In: Unmanned Aircraft Systems for Remote Sensing Applications 2016, 17 & 18 February 2016, University of Queensland, Brisbane, Queensland, Australia.

Full text not currently attached. Access may be available via the Publisher's website or OpenAccess link.

Article Link: http://conf2016.uas4rs.org.au/wp-content/uploads/2...

Abstract

Agricultural pests are responsible for millions of dollars in crop losses and management costs every year. In order to implement optimal site-specific treatments and reduce control costs, new methods to accurately monitor and assess pest damage need to be investigated. In this paper we explore the combination of unmanned aerial vehicles (UAVs), remote sensing and machine learning techniques as a promising methodology to address this challenge. The deployment of UAVs as a sensor platform is a rapidly growing field of study for biosecurity and precision agriculture applications. In this experiment, a data collection campaign is performed over a sorghum crop severely damaged by white grubs (Coleoptera: Scarabaeidae). The larvae of these scarab beetles feed on the roots of plants, which in turn impairs root exploration of the soil profile. In the field, crop health status could be classified according to three levels: bare soil where plants were decimated, transition zones of reduced plant density and healthy canopy areas. In this study, we describe the UAV platform deployed to collect high-resolution RGB imagery as well as the image processing pipeline implemented to create an orthoimage. An unsupervised machine learning approach is formulated in order to create a meaningful partition of the image into each of the crop levels. The aim of this approach is to simplify the image analysis step by minimizing user input requirements and avoiding the manual data labelling necessary in supervised learning approaches. The implemented algorithm is based on K-means clustering. In order to control high-frequency components present in the feature space, a neighbourhood-oriented parameter is introduced by applying Gaussian convolution kernels prior to K-means clustering. The results show the algorithm delivers consistent decision boundaries that classify the field into three clusters, one for each crop health level, as shown in Figure 1. The methodology presented in this paper represents an avenue for further research towards automated crop damage assessments and biosecurity surveillance.
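
As a rough illustration of the clustering step described in the abstract, the following Python sketch smooths each RGB band of an orthoimage with a Gaussian kernel (the neighbourhood-oriented parameter) and then partitions the pixels into three clusters with K-means. It is not the authors' implementation: the file name, the sigma value and the choice of libraries (Pillow, SciPy, scikit-learn) are illustrative assumptions.

# Minimal sketch (assumed details, not the published pipeline) of
# neighbourhood-aware K-means on an RGB orthoimage.
import numpy as np
from PIL import Image
from scipy.ndimage import gaussian_filter
from sklearn.cluster import KMeans

# Load the RGB orthoimage (hypothetical file name).
rgb = np.asarray(Image.open("sorghum_orthoimage.png").convert("RGB"), dtype=float)

# Suppress high-frequency components: smooth each colour band spatially.
# sigma plays the role of the neighbourhood-oriented parameter.
smoothed = np.stack(
    [gaussian_filter(rgb[..., band], sigma=5) for band in range(3)], axis=-1
)

# Cluster the per-pixel colour features into three groups.
h, w, _ = smoothed.shape
features = smoothed.reshape(-1, 3)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)

# Reshape back to the image grid; each cluster corresponds to one crop-health
# level (bare soil, transition zone, healthy canopy).
damage_map = labels.reshape(h, w)

Because the smoothing is applied before clustering, neighbouring pixels share similar feature values, which tends to yield the spatially coherent decision boundaries the abstract reports.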

Item Type: Conference or Workshop Item (Paper)
Business groups: Crop and Food Science
Keywords: Unmanned Aerial Systems (UAS), machine learning, remote sensing, AgTech
Subjects: Agriculture > Agriculture (General) > Methods and systems of culture. Cropping systems
Agriculture > Agriculture (General) > Farm machinery and farm engineering
Live Archive: 06 Apr 2016 00:03
Last Modified: 20 Jun 2024 03:39
