Herbicide applications could be reduced if they were targeted, but targeting requires prior identification and quantification of the weed population, a task that could be performed by a weed-scout robot. The ability to position a camera over the inter-row space of densely seeded crops would simplify the task of automatically quantifying weed infestations. As part of the development of an autonomous weed scout, a vision-based local positioning system for weed detection was developed and tested in a laboratory setting. Four line-detection algorithms were tested, and a robotic positioning device, or XYZtheta-table, was developed and evaluated.

The line-detection algorithms were based on stripe analysis, blob analysis, linear regression and the Hough Transform, respectively; the last two also included an edge-detection step. Images of parallel line patterns representing crop rows were collected at different angles, with and without weed-simulating noise, and were processed by the four programs. The ability of each program to determine the angle of the rows and the location of an inter-row centreline was evaluated. All four algorithms performed similarly when determining the row angle in noise-free images, with a mean error of 0.5°; under the same conditions, all could locate the centreline of an inter-row space within 2.7 mm. Mean errors generally increased when noise was added, up to 1.1° and 8.5 mm for the linear-regression algorithm. Specific spatial distributions of the simulated weeds were identified as likely causes of the increased error in noisy images. Because of its insensitivity to noise, the stripe-analysis algorithm was considered the best overall; the fastest was the blob-analysis algorithm, with a mean processing time of 0.35 s per image. Future work involves evaluating the line-detection algorithms on field images.
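The regression-based approach described above can be sketched as follows. This is a minimal illustration under stated assumptions, not the thesis's actual implementation: it assumes a binary image in which the pixels of a single crop-row segment have already been isolated (e.g. by the edge-detection step), and fits a line through them to recover the row angle.

```python
import numpy as np

def estimate_row_angle(binary):
    """Fit x = a*y + b through the 'on' pixels of one row segment.
    Returns (angle_deg, x_intercept), where angle_deg is the row's
    angle from the vertical image axis."""
    ys, xs = np.nonzero(binary)
    a, b = np.polyfit(ys, xs, 1)   # least-squares slope and intercept
    return np.degrees(np.arctan(a)), b

# Synthetic 100x100 image containing one row tilted ~5 deg from vertical
img = np.zeros((100, 100), dtype=np.uint8)
for y in range(100):
    x = int(round(50 + np.tan(np.radians(5.0)) * y))
    img[y, x] = 1

angle, x0 = estimate_row_angle(img)
```

With two adjacent rows fitted this way, the inter-row centreline could be taken as the midpoint between the two fitted lines; the hypothetical `estimate_row_angle` helper is introduced here only for illustration.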
The XYZtheta-table consisted of rails allowing movement of a camera in the three orthogonal directions and a rotational table that could rotate the camera about a vertical axis. The ability of the XYZtheta-table to accurately move the camera within the XY-space and to rotate it to a desired angle was evaluated in a laboratory setting. The XYZtheta-table was able to move the camera to within 7 mm of a target and to rotate it with a mean error of 0.07°. The positioning accuracy could be improved by simple mechanical modifications to the XYZtheta-table.
Identifier | oai:union.ndltd.org:USASK/oai:usask.ca:etd-05182004-051557 |
Date | 18 May 2004 |
Creators | Fontaine, Veronique |
Contributors | Guo, Huiqing, Crowe, Trever G., Bolton, Ronald J., Roberge, Martin |
Publisher | University of Saskatchewan |
Source Sets | University of Saskatchewan Library |
Language | English |
Detected Language | English |
Type | text |
Format | application/pdf, application/zip |
Source | http://library.usask.ca/theses/available/etd-05182004-051557/ |
Rights | unrestricted, I hereby certify that, if appropriate, I have obtained and attached hereto a written permission statement from the owner(s) of each third party copyrighted matter to be included in my thesis, dissertation, or project report, allowing distribution as specified below. I certify that the version I submitted is the same as that approved by my advisory committee. I hereby grant to University of Saskatchewan or its agents the non-exclusive license to archive and make accessible, under the conditions specified below, my thesis, dissertation, or project report in whole or in part in all forms of media, now or hereafter known. I retain all other ownership rights to the copyright of the thesis, dissertation or project report. I also retain the right to use in future works (such as articles or books) all or part of this thesis, dissertation, or project report. |