
The Detection of Crosswalks Based on Image Processing Technique

The main purpose of this thesis is to detect pedestrian crossings using static and dynamic image processing. This technique can help blind and disabled people find a pedestrian crossing and walk through it safely.
To date there has been no research on detecting pedestrian crossings in Taiwan. This thesis therefore applies a bipolarity-feature algorithm in an image-based technique, with bipolarity serving as the main feature for detecting pedestrian crossings. First, the method exploits the defining appearance of a pedestrian crossing: a dark road surface painted with constant-width, periodic white stripes. The intensity distribution of the image is computed and analyzed, and the background is eliminated. Second, connected component labeling is used to extract the most crossing-like region, which is marked in the image. Finally, the thesis determines whether a crossing exists in the marked region and measures the length of the crossing.
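The sketch below illustrates one plausible reading of this pipeline: a per-window bipolarity test followed by connected component labeling to isolate the candidate crossing region. The definition of bipolarity as the between-class share of total intensity variance, the Otsu split, the window size, the 0.8 acceptance threshold, and the OpenCV helpers are all illustrative assumptions, not the thesis's exact formulation or parameters.

```python
import cv2
import numpy as np

def bipolarity(gray_patch):
    """Bipolarity of an intensity patch: close to 1.0 for a strongly
    bimodal (black/white) distribution, near 0 for a flat one."""
    total_var = gray_patch.var()
    if total_var == 0:
        return 0.0
    # Otsu's threshold splits the patch into its two dominant classes.
    t, _ = cv2.threshold(gray_patch, 0, 255,
                         cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    low, high = gray_patch[gray_patch <= t], gray_patch[gray_patch > t]
    if low.size == 0 or high.size == 0:
        return 0.0
    w1, w2 = low.size / gray_patch.size, high.size / gray_patch.size
    within_var = w1 * low.var() + w2 * high.var()
    # Between-class variance as a fraction of total variance.
    return 1.0 - within_var / total_var

def candidate_region(gray, win=32, beta_min=0.8):
    """Mark windows whose bipolarity suggests white stripes on dark
    asphalt, then keep the largest connected group of such windows."""
    h, w = gray.shape
    mask = np.zeros((h, w), np.uint8)
    for y in range(0, h - win, win):
        for x in range(0, w - win, win):
            if bipolarity(gray[y:y + win, x:x + win]) > beta_min:
                mask[y:y + win, x:x + win] = 255
    n, labels, stats, _ = cv2.connectedComponentsWithStats(mask)
    if n < 2:  # only background found
        return None
    largest = 1 + np.argmax(stats[1:, cv2.CC_STAT_AREA])
    return (labels == largest).astype(np.uint8) * 255
```

A detected region's pixel extent along the stripe direction could then be converted to a physical crossing length given the camera calibration.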
In the dynamic model, a real-time image processing technique is combined with a wheelchair robot so that it can traverse a pedestrian crossing automatically: the image processing provides the real-time lateral offset and heading angle that the wheelchair robot uses to steer toward its destination. In this thesis the image processing runs on a PC, which receives the live scene of the pedestrian crossing from a digital camera.
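A minimal sketch of how the per-frame offset and angle fed to the wheelchair controller could be derived from the detected crossing region. Fitting a straight line through the region and intersecting it with the bottom image row are assumptions made for illustration; the thesis's actual geometry and control signals may differ.

```python
import cv2
import numpy as np

def offset_and_angle(mask):
    """Fit a line through the crosswalk region; report its lateral
    offset (pixels) from the image centre at the bottom row and its
    angle (degrees) from the vertical travel direction."""
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    pts = np.column_stack((xs, ys)).astype(np.float32)
    vx, vy, x0, y0 = cv2.fitLine(pts, cv2.DIST_L2, 0, 0.01, 0.01).ravel()
    if abs(vy) < 1e-6:
        return None  # degenerate horizontal fit
    h, w = mask.shape
    # Where the fitted line crosses the bottom image row.
    x_bottom = x0 + (h - 1 - y0) * vx / vy
    offset = x_bottom - w / 2.0
    # Heading error relative to the vertical image axis.
    angle = np.degrees(np.arctan2(vx, vy))
    return offset, angle
```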
Keywords: pedestrian crossing, image processing, wheelchair robot

Identifier: oai:union.ndltd.org:NSYSU/oai:NSYSU:etd-0903110-150306
Date: 03 September 2010
Creators: Fan, Ho-Hsiang
Contributors: Chi-Cheng Cheng, Jau-Woei Perng, Inn-Chyn Her
Publisher: NSYSU
Source Sets: NSYSU Electronic Thesis and Dissertation Archive
Language: Cholon
Detected Language: English
Type: text
Format: application/pdf
Source: http://etd.lib.nsysu.edu.tw/ETD-db/ETD-search/view_etd?URN=etd-0903110-150306
Rights: not_available, Copyright information available at source archive
