Semantic Interaction for Symmetrical Analysis and Automated Foraging of Documents and Terms
Dowling, Michelle Veronica
23 April 2020
Sensemaking tasks, such as reading many news articles to determine the truthfulness of a given claim, are difficult. These tasks require a series of iterative steps to first forage for relevant information and then synthesize that information into a final hypothesis. To assist with such tasks, visual analytics systems provide interactive visualizations of data that enable faster, more accurate, or more thorough analyses. For example, semantic interaction techniques leverage natural, intuitive interactions, like highlighting text, to automatically update visualization parameters using machine learning. However, this process of driving machine learning from user interaction is not yet well defined. We began our research by developing a computational pipeline that models and captures how a system processes semantic interactions. We then expanded this model to denote specifically how each component of the pipeline supports steps of the Sensemaking Process. Additionally, we recognized a cognitive symmetry in how analysts consider data items (like news articles) and their attributes (such as terms that appear within the articles). To support this symmetry, we also modeled how to visualize and interact with data items and their attributes simultaneously. We built a testbed system and conducted a user study to determine which analytic tasks are best supported by such symmetry. We then augmented the testbed system to scale to large datasets using semantic interaction foraging, a method for automated foraging based on user interaction. This experience enabled our development of design challenges and a corresponding future research agenda centered on semantic interaction foraging. We began investigating this agenda by conducting a second user study on when to apply semantic interaction foraging to better match the analyst's Sensemaking Process.
/ Doctor of Philosophy / Sensemaking tasks, such as determining the truthfulness of a claim using news articles, are complex, requiring a series of steps in which the relevance of each piece of information within the articles is first determined. Relevant pieces of information are then combined until a conclusion can be reached regarding the truthfulness of the claim. To help with these tasks, interactive visualizations of data can make it easier or faster to find or combine information. In this research, we focus on leveraging natural, intuitive interactions, such as organizing documents in a 2-D space, which the system uses to perform machine learning and automatically adjust the visualization to better support the given task. We first model how systems perform such interaction-driven machine learning, as well as how each component of the system supports the user's sensemaking task. Additionally, we developed a model and an accompanying testbed system for simultaneously evaluating both data items (like news articles) and their attributes (such as terms within the articles) through symmetrical visualization and interaction methods. With this testbed system, we devised and conducted a user study to determine which types of tasks are supported or hindered by such symmetry. We then combined these models to build an additional testbed system that implements a searching technique to automatically add previously unseen, relevant pieces of information to the visualization. Drawing on our experience implementing this automated searching technique, we defined design challenges to guide future implementations, along with a research agenda for refining the technique. We also devised and conducted a second user study to determine when such automated searching should be triggered to best support the user's sensemaking task.
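The interaction-driven machine learning loop described above can be sketched in miniature. The following is an illustrative toy, not the dissertation's actual pipeline: it assumes documents are represented as term-frequency vectors and that dragging two documents close together signals that their shared terms are relevant, so the system upweights those terms. The function name, learning rate, and update rule are invented for this sketch.

```python
def update_term_weights(doc_vectors, weights, pinned_pair, lr=0.5):
    """Toy semantic interaction update: when the user drags two
    documents close together, raise the weights of the terms the
    two documents share, then renormalize so weights sum to 1."""
    a, b = pinned_pair
    # Shared term mass: how much each term co-occurs in both documents.
    shared = [min(x, y) for x, y in zip(doc_vectors[a], doc_vectors[b])]
    # Nudge weights toward the shared terms.
    raised = [w + lr * s for w, s in zip(weights, shared)]
    total = sum(raised)
    return [w / total for w in raised]

# Three documents over four terms; the user drags docs 0 and 1 together.
docs = [[2, 0, 1, 0],
        [1, 0, 0, 3],
        [0, 4, 0, 0]]
new_w = update_term_weights(docs, [0.25, 0.25, 0.25, 0.25], (0, 1))
# Term 0 (shared by docs 0 and 1) now carries the largest weight.
```

In a full semantic interaction pipeline, the updated weights would then feed a layout step (for example, a weighted distance function driving dimensionality reduction) to re-project all documents, closing the loop between user interaction and visualization.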