
Supporting Web-based and Crowdsourced Evaluations of Data Visualizations

User studies play a vital role in data visualization research: they quantitatively measure the strengths and weaknesses of different visualization techniques, provide insight into what makes one technique more effective than another, and validate research contributions in the field of information visualization. For example, a new algorithm, visual encoding, or interaction technique is generally not considered a contribution unless it has been shown to outperform the state of the art and competing alternatives, or to be useful to its intended users. However, conducting user studies is challenging, time consuming, and expensive.
User studies generally require careful experimental design, iterative refinement, recruitment of participants, careful management of participants while the study runs, accurate collection of user responses, and expertise in the statistical analysis of results. Many variables must be taken into account, each of which can affect study outcomes if not carefully controlled. As a result, successfully conducting a user study can take several weeks to months.
In this dissertation, we investigated how to design an online framework that reduces the overhead of conducting controlled user studies of web-based visualizations. Our main goal was to lower the cost of evaluating data visualizations quantitatively through user studies. To this end, we leveraged current research opportunities to design and implement an open-source framework and online service, VisUnit, that allows visualization designers to easily configure user studies for their web-based data visualizations, deploy those studies online, collect user responses, and analyze incoming results automatically. This allows evaluations to be performed more easily, cheaply, and frequently, so that hypotheses about visualization designs can be tested rapidly.
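The abstract does not show VisUnit's actual configuration interface, so purely as a hedged illustration, a declarative study specification for a framework of this kind might resemble the TypeScript sketch below. All type names, field names, and the deployStudy call are hypothetical assumptions, not VisUnit's real API.

    // Hypothetical sketch of a VisUnit-style study configuration.
    // Every identifier here is illustrative, not VisUnit's actual API.

    interface Stimulus {
      url: string;        // web-based visualization under test
      condition: string;  // experimental condition label
    }

    interface Trial {
      stimulus: Stimulus;
      question: string;                          // task shown to the participant
      responseType: "choice" | "text" | "number";
      choices?: string[];                        // options for choice tasks
    }

    interface StudyConfig {
      title: string;
      design: "within-subjects" | "between-subjects";
      participantsPerCondition: number;
      trials: Trial[];
    }

    // Example: a two-condition comparison of bar vs. pie charts
    // on a simple value-comparison task.
    const study: StudyConfig = {
      title: "Bar vs. pie: value comparison",
      design: "between-subjects",
      participantsPerCondition: 30,
      trials: [
        {
          stimulus: { url: "https://example.org/vis/bar.html", condition: "bar" },
          question: "Which category has the largest value?",
          responseType: "choice",
          choices: ["A", "B", "C", "D"],
        },
        {
          stimulus: { url: "https://example.org/vis/pie.html", condition: "pie" },
          question: "Which category has the largest value?",
          responseType: "choice",
          choices: ["A", "B", "C", "D"],
        },
      ],
    };

    // In a framework of this kind, deployment and response collection
    // would then reduce to a single call (hypothetical function):
    // deployStudy(study).then(results => analyze(results));

Under these assumptions, the key design idea is that the evaluator supplies only a declarative description of conditions, stimuli, and tasks, while the service handles participant routing, response logging, and analysis.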
We evaluated the effectiveness of VisUnit by showing that it can be used to replicate 84% of 101 controlled user studies published at the IEEE Information Visualization (InfoVis) conference between 1995 and 2015. We evaluated its efficiency by showing that graduate students can use it to design sample user studies in less than an hour.
Our contributions are two-fold: first, a flexible design and implementation that facilitates the creation of a wide range of user studies with limited effort; second, an evaluation of that design showing that it can replicate a wide range of published user studies, reduce the time evaluators spend on user studies, and support new research.

Identifier: oai:union.ndltd.org:fiu.edu/oai:digitalcommons.fiu.edu:etd-3789
Date: 24 June 2016
Creators: Okoe, Mershack B
Publisher: FIU Digital Commons
Source Sets: Florida International University
Detected Language: English
Type: text
Format: application/pdf
Source: FIU Electronic Theses and Dissertations
