
Stereo Matching Based on Edge-Aware T-MST

Dense stereo matching is one of the most extensively studied problems in computer vision, playing an important role in many applications such as 3D scene reconstruction.

In this thesis, a novel dense stereo matching method is proposed based on an edge-aware truncated minimum spanning tree (T-MST). Instead of performing non-local cost aggregation on a traditional MST, which is generated only from the color differences of neighbouring pixels, a new tree structure, the "Edge-Aware T-MST", is proposed to aggregate matching costs according to image texture. Specifically, cost aggregation is strongly enforced in large planar textureless regions because of the truncated edge weights. Meanwhile, the "edge-fattening" effect is suppressed by a novel hybrid edge-prior, which combines an edge-prior and a superpixel-prior to locate potential disparity edges. The widely used Winner-Takes-All (WTA) strategy is then applied to establish the initial disparity map, and an adaptive non-local refinement is performed based on the stability of the initial disparity estimates.
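To make the pipeline concrete, the following is a minimal sketch (not the thesis implementation) of non-local cost aggregation on an MST built from truncated edge weights, followed by WTA disparity selection. The truncation rule (clamping small intensity differences toward zero), the parameters tau and sigma, the toy images, and the O(n^2) pairwise tree-distance aggregation are all illustrative assumptions; an efficient implementation would use the standard linear-time two-pass tree traversal instead.

```python
import numpy as np
from scipy.sparse import coo_matrix
from scipy.sparse.csgraph import minimum_spanning_tree, shortest_path

def grid_edges(gray, tau=4.0, eps=1e-6):
    # 4-connected grid edges weighted by absolute intensity difference.
    # Assumed truncation rule: differences below tau are clamped to eps
    # (a stand-in for zero, since SciPy's sparse MST drops explicit
    # zeros), so large textureless regions collapse into near-zero-cost
    # subtrees and aggregate strongly.
    h, w = gray.shape
    idx = np.arange(h * w).reshape(h, w)
    r = np.concatenate([idx[:, :-1].ravel(), idx[:-1, :].ravel()])
    c = np.concatenate([idx[:, 1:].ravel(), idx[1:, :].ravel()])
    d = np.concatenate([np.abs(gray[:, 1:] - gray[:, :-1]).ravel(),
                        np.abs(gray[1:, :] - gray[:-1, :]).ravel()])
    d = np.where(d < tau, eps, d)
    return coo_matrix((d, (r, c)), shape=(h * w, h * w))

def aggregate_and_wta(left, right, max_disp=4, sigma=10.0):
    h, w = left.shape
    # Raw matching cost: absolute difference per disparity hypothesis;
    # pixels without a valid match keep a large default cost.
    cost = np.full((max_disp + 1, h, w), 255.0)
    for d in range(max_disp + 1):
        cost[d, :, d:] = np.abs(left[:, d:] - right[:, :w - d])
    # MST over truncated weights, then tree-path distances between all
    # pixel pairs (O(n^2) -- acceptable for a toy image; real systems
    # use the leaf-to-root / root-to-leaf two-pass traversal).
    mst = minimum_spanning_tree(grid_edges(left))
    dist = shortest_path(mst, directed=False)
    similarity = np.exp(-dist / sigma)
    flat = cost.reshape(max_disp + 1, -1)          # (D, n)
    agg = (similarity @ flat.T).T.reshape(cost.shape)
    return np.argmin(agg, axis=0)                  # Winner-Takes-All

# Toy example: the right view is the left view shifted by 2 pixels,
# so the recovered disparity should be ~2 away from the borders.
rng = np.random.default_rng(0)
left = rng.integers(0, 255, (12, 16)).astype(float)
right = np.roll(left, -2, axis=1)
print(aggregate_and_wta(left, right))
```

In this sketch, truncating small edge weights makes the tree-path distance between pixels inside a textureless region near zero, so their similarity approaches one and their costs are effectively averaged over the whole region, which is the behaviour the abstract attributes to the T-MST.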

Given stereo images from the Middlebury benchmark, we estimate disparity maps using the proposed method and five other state-of-the-art tree-based non-local matching methods. The experimental results show that the proposed method produces reliable disparity values within large planar textureless regions and around object disparity boundaries. Performance comparisons demonstrate that the proposed non-local stereo matching method based on the edge-aware T-MST outperforms the current state-of-the-art tree-based methods in most cases, especially in large textureless planar regions and around disparity boundaries.

Identifier: oai:union.ndltd.org:uottawa.ca/oai:ruor.uottawa.ca:10393/35538
Date: January 2016
Creators: Zhou, Dan
Contributors: Zhao, Jiying
Publisher: Université d'Ottawa / University of Ottawa
Source Sets: Université d'Ottawa
Language: English
Detected Language: English
Type: Thesis