Visual programming languages employ visual representations to make programming easier and to make programs more reliable and accessible. As more visual programming languages and environments come into real use, testing visual programs becomes increasingly important. In this work, we
focus on one important class of visual programming languages: form-based visual
programming languages. This class of languages includes electronic spreadsheets
and a variety of research systems that have had a substantial impact on end-user
computing.
Research shows that form-based visual programs often contain faults, yet their creators frequently place unwarranted confidence in the reliability of their programs.
Despite this evidence, we find no discussion in the research literature of techniques
for testing or assessing the reliability of form-based visual programs. This gap
hinders the real-world use of visual programming languages.
Our work addresses the lack of testing methodologies for form-based visual programs.
In this document, we first examine differences between the form-based and
imperative programming paradigms, discuss the effects these differences have on methodologies for testing form-based programs, and analyze the challenges and opportunities
for form-based program testing.
We then present several criteria for measuring test adequacy for form-based programs,
and illustrate their application. We show that an analogue to the traditional
"all-uses" dataflow test adequacy criterion is well suited for testing form-based visual
programs: it provides important error-detection ability, and can be applied more
easily to form-based programs than to imperative programs.
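As a rough illustration of the all-uses idea in a form-based setting (this is a simplified sketch, not the dissertation's implementation; the cell names, formulas, and coverage rule below are made up for exposition), each cell's formula can be viewed as a definition and each reference to that cell in another formula as a use, giving a set of definition-use associations that user validations exercise:

    # Minimal sketch of an "all-uses"-style adequacy measure for a
    # spreadsheet-like program.  Cells, formulas, and the coverage rule
    # are illustrative assumptions only.
    formulas = {
        "A1": "10",        # constant definition
        "A2": "20",        # constant definition
        "B1": "A1 + A2",   # uses A1 and A2
        "C1": "B1 * 2",    # uses B1
    }

    def references(formula, cells):
        """Return the set of cells referenced (used) by a formula string."""
        tokens = formula.replace("(", " ").replace(")", " ").split()
        return {tok for tok in tokens if tok in cells}

    # Definition-use associations: (defining cell, cell whose formula uses it).
    du_pairs = {
        (used, cell)
        for cell, formula in formulas.items()
        for used in references(formula, formulas)
        if used != cell
    }

    def validate(cell, covered):
        """Validating a cell's displayed value exercises the associations
        between that cell and the cells its formula uses."""
        for used in references(formulas[cell], formulas):
            covered.add((used, cell))

    covered = set()
    validate("B1", covered)   # user judges B1's value to be correct
    validate("C1", covered)   # user judges C1's value to be correct
    print(f"covered {len(covered)} of {len(du_pairs)} du-associations")

In this toy example the two validations cover all three definition-use associations, so the simplified criterion is satisfied; the dissertation develops the full criterion and its incremental, validation-driven application.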
Finally, we present a testing methodology that we have developed for form-based
visual programs. To accommodate the evaluation model used with these programs,
and the interactive process by which they are created, our methodology is validation-driven
and incremental. To accommodate the user base of these languages, we provide
an interface to the methodology that does not require an understanding of
testing theory. We discuss our implementation of this methodology, its time costs,
the mapping from our approach to the user interface, and empirical results achieved
in its use.

Graduation date: 1998

Identifier | oai:union.ndltd.org:ORGSU/oai:ir.library.oregonstate.edu:1957/33861
Date | 06 November 1997
Creators | Li, Lixin, 1966-
Contributors | Rothermel, Gregg
Source Sets | Oregon State University
Language | en_US
Detected Language | English
Type | Thesis/Dissertation