Object region detection plays a vital role in many domains, from self-driving cars to lane detection, all of which rely heavily on object detection. Improving the performance of object region detection approaches is therefore of great importance and remains an active area of research in Computer Vision. The traditional sliding-window paradigm identifies hundreds of thousands of windows (covering different scales, angles, and aspect ratios of objects) before the classification step. However, it is not only computationally expensive but also yields relatively low classifier accuracy because it supplies many negative samples. Object detection proposals, discussed in detail in [19, 20], tackle these issues by filtering the windows using different image features before passing them to the classifier; this filtering controls both the quality and the quantity of the windows. EdgeBox is one of the most effective proposal detection approaches, relying on the presence of dense edges in an image to identify high-quality proposal windows.
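To make the proposal idea concrete, the following is a minimal sketch of scoring a candidate window by the density of edge pixels it encloses. This is only an illustrative simplification: the original EdgeBox method groups edge pixels into contours and scores a window by the edge groups wholly contained in it, and the function name, thresholds, and example file below are assumptions, not part of the thesis.

```python
# Illustrative sketch: score candidate windows by Canny edge density.
# This simplification is NOT the actual EdgeBox scoring function.
import cv2
import numpy as np

def edge_density_score(image_bgr, window):
    """Score a candidate window (x, y, w, h) by the fraction of edge pixels inside it."""
    x, y, w, h = window
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 100, 200)            # binary edge map (0 or 255)
    patch = edges[y:y + h, x:x + w]
    if patch.size == 0:
        return 0.0
    return float(np.count_nonzero(patch)) / patch.size

# Usage: rank a handful of hypothetical candidate windows by edge density.
img = cv2.imread("example.jpg")                  # placeholder image path
candidates = [(10, 10, 120, 80), (50, 40, 200, 150), (0, 0, 64, 64)]
ranked = sorted(candidates, key=lambda win: edge_density_score(img, win), reverse=True)
```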
This thesis proposes an innovative approach that improves the accuracy of the EdgeBox approach. The improved approach uses the color properties and the corner information of an image, along with the edge information, to evaluate candidate windows. We also describe two variations of the proposed approach. Extensive experimental results on the PASCAL Visual Object Classes (VOC) [29, 30] dataset clearly demonstrate the effectiveness of the proposed approach and its two variants in improving the accuracy of the EdgeBox approach.
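As a rough illustration of how edge, corner, and color cues might be combined into a single window score, consider the sketch below. The specific cues (Canny edges, Harris corners, hue-histogram concentration), the weights, and the normalization are all hypothetical choices made for this example; they do not reproduce the thesis's actual formulation or its two variants.

```python
# Hypothetical sketch: combine edge, corner, and color cues into one window score.
# Cues, weights, and thresholds are illustrative assumptions only.
import cv2
import numpy as np

def combined_window_score(image_bgr, window, w_edge=0.5, w_corner=0.3, w_color=0.2):
    x, y, w, h = window
    patch = image_bgr[y:y + h, x:x + w]
    if patch.size == 0:
        return 0.0
    gray = cv2.cvtColor(patch, cv2.COLOR_BGR2GRAY)

    # Edge cue: fraction of Canny edge pixels inside the window.
    edges = cv2.Canny(gray, 100, 200)
    edge_score = np.count_nonzero(edges) / edges.size

    # Corner cue: fraction of strong Harris-corner responses inside the window.
    harris = cv2.cornerHarris(np.float32(gray), blockSize=2, ksize=3, k=0.04)
    corner_score = np.count_nonzero(harris > 0.01 * harris.max()) / harris.size

    # Color cue: hue-histogram concentration, a crude proxy for color coherence.
    hsv = cv2.cvtColor(patch, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0], None, [16], [0, 180]).ravel()
    color_score = hist.max() / (hist.sum() + 1e-6)

    return w_edge * edge_score + w_corner * corner_score + w_color * color_score
```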
Identifier | oai:union.ndltd.org:UTAHS/oai:digitalcommons.usu.edu:etd-8438 |
Date | 01 December 2018 |
Creators | Yadav, Kamna |
Publisher | DigitalCommons@USU |
Source Sets | Utah State University |
Detected Language | English |
Type | text |
Format | application/pdf |
Source | All Graduate Theses and Dissertations |
Rights | Copyright for this work is held by the author. Transmission or reproduction of materials protected by copyright beyond that allowed by fair use requires the written permission of the copyright owners. Works not in the public domain cannot be commercially exploited without permission of the copyright owner. Responsibility for any use rests exclusively with the user. For more information contact digitalcommons@usu.edu. |