With about 3.6 million adults in the United States living with visual impairment or blindness, assistive technology is essential to giving these individuals independence in grocery shopping. This thesis presents a new method for detecting and localizing nutrition facts tables (NFTs) on mobile devices more quickly, and from less-than-ideal inputs, than previous approaches. The method is a drop-in replacement for an existing NFT analysis pipeline and combines multiple image analysis techniques that exploit various properties of standard NFTs. In testing, the method performs very well, with no false positives and 42% total recall. These results suit real-world applications where inputs must be analyzed as quickly as possible. Additionally, this new method opens many avenues for future improvement.
Identifier | oai:union.ndltd.org:UTAHS/oai:digitalcommons.usu.edu:etd-3012 |
Date | 01 May 2013 |
Creators | Blay, Christopher |
Publisher | DigitalCommons@USU |
Source Sets | Utah State University |
Detected Language | English |
Type | text |
Format | application/pdf |
Source | All Graduate Theses and Dissertations |
Rights | Copyright for this work is held by the author. Transmission or reproduction of materials protected by copyright beyond that allowed by fair use requires the written permission of the copyright owners. Works not in the public domain cannot be commercially exploited without permission of the copyright owner. Responsibility for any use rests exclusively with the user. For more information contact Andrew Wesolek (andrew.wesolek@usu.edu). |