<p>Maintaining a healthy sewer infrastructure is a major challenge because roots from nearby plants grow through pipe cracks or loose joints, which may lead to serious pipe blockages and collapses. Traditional inspections, which rely on video surveillance to identify and localize root damages within such complex sewer networks, are inefficient, laborious, and error-prone. This study therefore aims to develop a robust and efficient approach that automatically detects root damages and localizes their circumferential and longitudinal positions in CCTV inspection videos using deep neural networks and computer vision techniques. From twenty inspection videos collected from various sources, keyframes were extracted from each video according to frame differences in the LUV color space, with keyframes chosen at selected local maxima of that difference signal. To recognize distance information from video subtitles, OCR models such as Tesseract and CRNN-CTC were implemented and achieved a recognition accuracy of 90%. In addition, a pre-trained segmentation model was applied to detect root damages, but it also produced many false positive predictions. By applying a well-tuned YoloV3 model to pipe-joint detection and leveraging the Convex Hull Overlap (<em>CHO</em>) feature, we achieved a 20% improvement in the reliability and accuracy of damage identification. Moreover, an end-to-end deep learning pipeline incorporating the Triangle Similarity Theorem (<em>TST</em>) was designed to predict the longitudinal position of each identified root damage, with a prediction error of less than 1.0 foot. </p>
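The triangle-similarity idea behind the <em>TST</em>-based distance prediction can be sketched as follows. This is a minimal illustration of the general technique, not the thesis implementation; the function names, the reference pipe-joint width, and the calibration numbers are all hypothetical assumptions.

```python
def focal_length_px(known_width_ft: float, known_distance_ft: float,
                    pixel_width: float) -> float:
    """Calibrate an apparent focal length (in pixels) from a reference
    object of known real-world width observed at a known distance:
    F = (P * D) / W."""
    return (pixel_width * known_distance_ft) / known_width_ft


def distance_ft(known_width_ft: float, focal_px: float,
                pixel_width: float) -> float:
    """Triangle similarity: D = (W * F) / P, where W is the object's
    real-world width, F the calibrated focal length in pixels, and P
    the object's apparent width in pixels in the current frame."""
    return (known_width_ft * focal_px) / pixel_width


# Hypothetical calibration: a 1.0 ft wide pipe joint spans 200 px
# when the camera is 10.0 ft away.
f = focal_length_px(1.0, 10.0, 200.0)

# The same joint appearing 400 px wide is now half as far away.
print(distance_ft(1.0, f, 400.0))  # → 5.0
```

In a pipeline like the one described, such per-frame distance estimates relative to detected pipe joints would be combined with the OCR-recognized subtitle footage counter to place each damage longitudinally along the pipe.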
Identifier | oai:union.ndltd.org:purdue.edu/oai:figshare.com:article/21680771 |
Date | 03 February 2023 |
Creators | Muzi Zheng (14226701) |
Source Sets | Purdue University |
Detected Language | English |
Type | Text, Thesis |
Rights | In Copyright |
Relation | https://figshare.com/articles/thesis/Detection_and_Localization_of_Root_Damages_in_Underground_Sewer_Systems_using_Deep_Neural_Networks_and_Computer_Vision_Techniques/21680771 |