This thesis presents an agricultural automation technique that addresses the perception challenges posed by occlusion in environments such as farms and greenhouses. Automated systems tasked with duties such as pruning face considerable difficulty because occlusion complicates the accurate identification of plant features. To tackle these challenges, this work introduces an approach that mounts a LiDAR camera on a robot arm, enabling the system to scan plants and dynamically adjust the arm's trajectory based on machine-learning-derived segmentation. This adjustment substantially enlarges the detectable area of plant features, improving identification accuracy and efficiency.
Building on foreground isolation and instance segmentation, the thesis then presents an automated method for identifying optimal pruning points from best-pose-view images of indeterminate tomato plants. By integrating advanced image-processing techniques, the proposed method streamlines the pruning process by targeting the branches with the highest leaf load.
Experimental validation of the proposed method was conducted in a simulated environment, where it demonstrated substantially enhanced performance. Foreground isolation achieved 94% precision, 90% recall, and a 92% F1 score. Furthermore, segmentation of isolated images significantly outperformed that of non-isolated ones, with improvements exceeding 30%, 27%, and 30% in precision, recall, and F1, respectively. The validation also confirmed the method's effectiveness in identifying pruning points, achieving 67% accuracy when compared against manually identified pruning points. These results underscore the robustness and reliability of the approach for automating pruning in agricultural settings.

/ Master of Science /

This thesis explores new methods for improving automated farming systems, focusing on tasks like pruning, where visibility of plant features can be significantly obstructed by overlapping leaves and branches. Central to this study is an approach that uses a special camera mounted on a robotic arm to scan plants and determine the best vantage points for precise interaction. This setup not only identifies the optimal viewing positions but also adjusts the robot's movements in real time so that it can accurately perform pruning tasks.
The approach leverages advanced sensing to dynamically adjust the trajectory of the robotic arm based on real-time imaging. This enables the robot to better detect essential plant features, which is crucial for making informed decisions about where to prune. By improving the robot's ability to clearly see and interact with plants, the system enables more precise and efficient operations.
Tests conducted in simulated environments demonstrated that this method significantly enhances the robot's ability to isolate and accurately identify plant features. These improvements allow the robot to subsequently identify pruning points, potentially reducing the time and labor typically required in traditional manual operations. Overall, this research indicates that integrating advanced sensing and machine learning into agricultural automation can transform farming practices, making them more efficient and less dependent on human labor, especially in environments where traditional methods are less effective.
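The precision, recall, and F1 figures reported in the abstract follow the standard definitions over true-positive (TP), false-positive (FP), and false-negative (FN) detection counts; a minimal sketch (the counts below are illustrative only, not drawn from the thesis):

```python
def precision_recall_f1(tp: int, fp: int, fn: int) -> tuple[float, float, float]:
    """Compute precision, recall, and F1 from raw detection counts."""
    precision = tp / (tp + fp)          # fraction of detections that are correct
    recall = tp / (tp + fn)             # fraction of true features that were found
    f1 = 2 * precision * recall / (precision + recall)  # harmonic mean
    return precision, recall, f1

# Hypothetical counts for illustration:
p, r, f1 = precision_recall_f1(tp=45, fp=5, fn=5)
print(f"precision={p:.2f} recall={r:.2f} f1={f1:.2f}")
```

With these example counts, all three metrics come out to 0.90; in the thesis, such scores are computed by comparing detected plant features and pruning points against manually labeled ground truth.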
Identifier | oai:union.ndltd.org:VTETD/oai:vtechworks.lib.vt.edu:10919/120574 |
Date | 01 July 2024 |
Creators | Srivastava, Chhayank |
Contributors | Mechanical Engineering, Seyyedhasani, Hasan, Li, Song, Akbari Hamed, Kaveh, Mueller, Rolf |
Publisher | Virginia Tech |
Source Sets | Virginia Tech Theses and Dissertation |
Language | English |
Detected Language | English |
Type | Thesis |
Format | ETD, application/pdf |
Rights | Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International, http://creativecommons.org/licenses/by-nc-sa/4.0/ |