
DEEP NEURAL NETWORKS AND TRANSFER LEARNING FOR CROP PHENOTYPING USING MULTI-MODALITY REMOTE SENSING AND ENVIRONMENTAL DATA

<p>High-throughput phenotyping has emerged as a powerful approach to expedite crop breeding programs. Modern remote sensing systems, including manned aircraft, unmanned aerial vehicles (UAVs), and terrestrial platforms, are equipped with multiple sensors, such as RGB cameras, multispectral, hyperspectral, and infrared thermal sensors, as well as light detection and ranging (LiDAR) scanners, and are now widely used to advance high-throughput phenotyping. These systems can collect data on various phenotypic traits, such as plant height, canopy cover, and leaf area, at high spatial, spectral, and temporal resolution. Enhancing the capability to use such remote sensing data for automated phenotyping is crucial to advancing crop breeding. This dissertation focuses on developing deep learning and transfer learning methodologies for crop phenotyping using multi-modality remote sensing and environmental data. The techniques address two main areas: multi-temporal, across-field biomass prediction and multi-scale remote sensing data fusion.</p>
<p>Biomass is a plant characteristic that correlates strongly with biofuel production, but it is also influenced by genetic and environmental factors. Previous studies have shown that deep learning-based models are effective in predicting end-of-season biomass for a single year and field. This dissertation develops transfer learning methodologies for multiyear, across-field biomass prediction. Feature importance analysis was performed to identify and remove redundant features. The proposed model can incorporate high-dimensional genetic marker data along with other features representing phenotypic information, environmental conditions, or management practices. It can also predict end-of-season biomass from mid-season remote sensing and environmental data to provide early rankings. The framework was evaluated on experimental trials conducted from 2017 to 2021 at the Agronomy Center for Research and Education (ACRE) at Purdue University. The proposed transfer learning techniques effectively selected the most informative training samples in the target domain, yielding significant improvements in end-of-season yield prediction and ranking. Furthermore, the importance of input remote sensing features was assessed at different growth stages.</p>
<p>Remote sensing technology enables multi-scale, multi-temporal data acquisition. However, fully exploiting the acquired data requires fusion techniques that leverage the strengths of different sensors and platforms. In this dissertation, a generative adversarial network (GAN) based multi-scale RGB-guided model and a domain adaptation framework were developed to enhance the spatial resolution of multispectral images. The model was trained on a limited set of high spatial resolution images from a wheel-based platform and then applied to low spatial resolution images acquired by UAV and airborne platforms. The strategy was evaluated in two distinct scenarios, sorghum plant breeding and urban areas, to assess its effectiveness.</p>
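<p>The learned GAN generator itself is not reproduced here. As a rough, non-learned illustration of the underlying RGB-guided idea — upsampling each low-resolution multispectral band and injecting the high-frequency spatial detail of the co-registered high-resolution RGB guide (a classic pansharpening-style step; all function names and array shapes below are assumptions for the sketch):</p>

```python
import numpy as np

def upsample_nearest(img, scale):
    # nearest-neighbour upsampling of a single (H, W) band
    return np.repeat(np.repeat(img, scale, axis=0), scale, axis=1)

def rgb_guided_enhance(ms_lowres, rgb_highres, scale):
    """Upsample each low-res multispectral band and add the
    high-frequency component of the high-res RGB intensity
    (a simple stand-in for a learned RGB-guided generator)."""
    intensity = rgb_highres.mean(axis=2)               # (H, W) guide image
    # low-pass version of the guide at the multispectral resolution
    h, w = intensity.shape
    low = intensity.reshape(h // scale, scale, w // scale, scale).mean(axis=(1, 3))
    detail = intensity - upsample_nearest(low, scale)  # high-frequency part
    return np.stack([upsample_nearest(band, scale) + detail
                     for band in ms_lowres])

ms = np.random.rand(4, 8, 8)        # 4 bands at low resolution
rgb = np.random.rand(16, 16, 3)     # co-registered high-res RGB guide
sharp = rgb_guided_enhance(ms, rgb, scale=2)
print(sharp.shape)                  # → (4, 16, 16)
```

<p>A useful property of this detail-injection form is spectral consistency: block-averaging the enhanced output back to the original resolution recovers the input multispectral values exactly, since the injected detail has zero mean within each block.</p>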

DOI: 10.25394/pgs.22701433.v1
Identifier: oai:union.ndltd.org:purdue.edu/oai:figshare.com:article/22701433
Date: 27 April 2023
Creator: Taojun Wang (15360640)
Source Sets: Purdue University
Language: English
Type: Text, Thesis
Rights: CC BY 4.0
Relation: https://figshare.com/articles/thesis/DEEP_NEURAL_NETWORKS_AND_TRANSFER_LEARNINGFOR_CROP_PHENOTYPING_USING_MULTI-MODALITYREMOTE_SENSING_AND_ENVIRONMENTAL_DATA/22701433
