Web developers are confronted with the task of evaluating the usability of Web interfaces.
Automatic Web usability evaluation tools are available, but they are limited in the types
of problems they can handle. Tool support for manual usability evaluation is needed.
Accordingly, this research focuses on developing a tool for supporting manual processes
in Heuristic Evaluation inspection.
The research was carried out in three phases. First, an observational study was
conducted in order to characterize the inspection process in Heuristic Evaluation. The
videos of evaluators applying a Heuristic Evaluation on a non-interactive, paper-based
Web interface were analyzed to dissect the inspection process. Second, based on the
study, a tool for annotating Web interfaces when applying Heuristic Evaluations was
developed. Finally, a survey was conducted to evaluate the tool and examine the role of
annotations in inspection. Recommendations for improving the use of annotations in
problem reporting are outlined. Overall, users were satisfied with the tool.
The goal of this research, designing and developing an inspection tool, was achieved.
Identifier | oai:union.ndltd.org:tamu.edu/oai:repository.tamu.edu:1969.1/ETD-TAMU-2009-08-7210 |
Date | 14 January 2010 |
Creators | Flores Mendoza, Ana |
Contributors | Lively, William |
Source Sets | Texas A&M University |
Language | en_US |
Detected Language | English |
Type | Book, Thesis, Electronic Dissertation |
Format | application/pdf |