
Autonomous Multi-Sensor and Web-Based Decision Support for Crop Diagnostics in Greenhouse

An autonomous machine-vision-guided plant sensing and monitoring system was designed and constructed to continuously monitor plant-related features: color (red-green-blue, hue-saturation-luminance, and color brightness), morphology (top projected canopy area), texture (entropy, energy, contrast, and homogeneity), Normalized Difference Vegetation Index (NDVI) (along with similar indices derived from the color and NIR channels), and thermal (plant and canopy temperature). Several experiments with repeated water stress cycles were conducted using the machine vision system to evaluate its performance in the timely detection of induced plant water stress. The study aimed to identify significant features separating the control and treatment groups in an induced water stress experiment, and to locate, within the plant canopy, the emerging water stress using those significant features. Plant cell severity was ranked based on each cell's accumulated feature count and converted to a color-coded graphical canopy image for the remote operator to evaluate. The overall feature analysis showed that the morphological feature, top projected canopy area, was a good marker for the initial growth period, while the vegetation indices (ENDVI, NDVIBlue, and NDVIRed) were more capable of capturing the repeated stress occurrences during the various stages of the lettuce crop. Furthermore, the crop's canopy temperature proved to be a significant and dominant marker for the timely detection of water stress occurrences. The graphical display for the remote user showed that the severity of the summed features matched detection by human vision. Capabilities and limitations of the developed system and stress detection methodology were documented, with recommendations for future improvements to the crop monitoring/production system.
An example web-based decision support platform was created for the collection, storage, analysis, and display of the data and imagery gathered for a remote operator.
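The two core steps described above — computing normalized-difference indices from color/NIR channels, and ranking canopy cells by their accumulated count of stress-flagged features — can be sketched as follows. This is a minimal illustration, not the dissertation's actual implementation; the function names, thresholds, and example arrays are hypothetical, and the generic normalized-difference form (NIR − band) / (NIR + band) is assumed for NDVIRed and NDVIBlue.

```python
import numpy as np

def norm_diff(nir, band, eps=1e-9):
    """Generic normalized-difference index: (NIR - band) / (NIR + band).

    With band = red this is the classic NDVI; with band = blue it is
    an NDVIBlue-style index. eps guards against division by zero.
    """
    nir = nir.astype(float)
    band = band.astype(float)
    return (nir - band) / (nir + band + eps)

def severity_rank(feature_flags):
    """Accumulate per-cell stress flags across features.

    feature_flags: boolean array of shape (n_features, rows, cols),
    True where a feature exceeds its stress threshold in that canopy
    cell. The per-cell sum (higher = more severe) can then be mapped
    to a color scale for a graphical canopy display.
    """
    return feature_flags.sum(axis=0)

# Hypothetical 2x2 canopy grid with three thresholded features
flags = np.array([
    [[1, 0], [0, 0]],   # e.g. vegetation index below threshold
    [[1, 1], [0, 0]],   # e.g. canopy temperature above threshold
    [[1, 0], [1, 0]],   # e.g. canopy-area shrinkage detected
], dtype=bool)
print(severity_rank(flags))  # cell (0,0) accumulates all three flags
```

Mapping the resulting counts onto a color ramp (e.g. green through red) yields the kind of color-coded canopy image a remote operator could scan at a glance.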

Identifier oai:union.ndltd.org:arizona.edu/oai:arizona.openrepository.com:10150/306925
Date January 2013
Creators Story, David Lee, Jr.
Contributors Kacira, Murat, An, Lingling, Yoon, Jeong-Yeol, Akoglu, Ali
Publisher The University of Arizona.
Source Sets University of Arizona
Language en_US
Detected Language English
Type text, Electronic Dissertation
Rights Copyright © is held by the author. Digital access to this material is made possible by the University Libraries, University of Arizona. Further transmission, reproduction or presentation (such as public display or performance) of protected items is prohibited except with permission of the author.
