<p>Current methods of broadcast herbicide application have negative environmental and economic impacts. Computer vision methods, specifically those related to object detection, have been reported to aid site-specific weed management by targeting herbicide application on a per-weed basis within a field. However, a major challenge in developing a weed detection system is the requirement for properly annotated training data to differentiate between weeds and crops under field conditions. This research involved creating an annotated database of weeds using UAS-acquired imagery from corn and soybean research plots located in north-central Indiana. A total of 27,828 RGB, 108,398 multispectral, and 23,628 thermal images were acquired using a FLIR Duo Pro R sensor attached to a DJI Matrice 600 Pro UAS. An annotated database of 306 RGB images, organized into monocot and dicot weed classes, was used for network training. Two deep learning networks, DetectNet and You Only Look Once version 3 (YOLOv3), were subjected to five training stages using four annotated image sets. Precision ranged from 3.63% to 65.37% for monocot weed detection and from 4.22% to 45.13% for dicot weed detection. This research demonstrated the need for a large annotated weed database to improve the precision of deep learning algorithms through better network training.</p>
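The precision figures reported above follow the standard object-detection definition: the fraction of predicted weed bounding boxes that match a ground-truth weed. A minimal sketch of that computation is below; the counts in the usage comment are illustrative only and are not taken from the thesis.

```python
def precision(true_positives: int, false_positives: int) -> float:
    """Precision = TP / (TP + FP): the fraction of predicted
    weed boxes that actually correspond to a ground-truth weed.
    Returns 0.0 when the detector made no predictions."""
    predicted = true_positives + false_positives
    return true_positives / predicted if predicted else 0.0

# Illustrative example (hypothetical counts): if a detector proposes
# 100 monocot weed boxes and 65 of them match ground-truth annotations,
# its precision is 65 / (65 + 35) = 0.65, i.e. 65%.
print(precision(65, 35))  # 0.65
```

Note that precision alone says nothing about missed weeds; recall (TP / (TP + FN)) would capture those, which is one reason a larger annotated database matters for training.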
Identifier | oai:union.ndltd.org:purdue.edu/oai:figshare.com:article/9108371 |
Date | 13 August 2019 |
Creators | Aaron Etienne (6570041) |
Source Sets | Purdue University |
Detected Language | English |
Type | Text, Thesis |
Rights | CC BY 4.0 |
Relation | https://figshare.com/articles/AUTOMATED_WEED_DETECTION_USING_MACHINE_LEARNING_TECHNIQUES_ON_UAS-ACQUIRED_IMAGERY/9108371 |