
User experience, performance, and social acceptability: usable multimodal mobile interaction

This thesis explores the social acceptability of multimodal interaction in public places with respect to acceptance, adoption, and appropriation. Previous work in multimodal interaction has mainly focused on recognition and detection issues without thoroughly considering users’ willingness to adopt these kinds of interactions in their everyday lives. This thesis presents a novel approach to user experience that is theoretically motivated by phenomenology, practiced with mixed methods, and analysed through dramaturgical metaphors. In order to explore the acceptance of multimodal interfaces, this thesis presents three studies that look at users’ initial reactions to multimodal interaction techniques: a survey study focusing on gestures, an on-the-street user study, and a follow-up survey study looking at gesture- and voice-based interaction. The adoption of multimodal interaction is explored through two studies: an in situ user study of a performative interface and a focus group study using experience prototypes. The appropriation of multimodal interaction is explored by demonstrating the complete design process of a multimodal interface using the performative approach to user experience presented in this thesis.

Chapter 3 looks at users’ initial reactions to and acceptance of multimodal interactions. The results of the first survey identified location and audience as factors that influence how individuals behave in public places. Participants in the on-the-street study described the visible aspects of the gestures as playful, cool, or embarrassing, and explained how gestures could be hidden as everyday actions. These results begin to explain why users accepted or rejected the gestures from the first survey. The second survey demonstrated that the presence of familiar spectators made interaction significantly more acceptable. This result indicates that performative interaction could be made more acceptable by interfaces that support collaborative or social interaction.

Chapter 4 explores how users place interactions into a usability context for use in real-world settings. In the first user study, participants took advantage of the wide variety of possible performances, creating input that ranged from highly performative to hidden actions based on location. The ability of this interface to support flexible interactions allowed users to demonstrate the purpose of their actions differently depending on the immediately co-located spectators. Participants in the focus group study discussed how they would place multimodal interactions into real-world contexts using three approaches: relationship to the device, personal meaning, and relationship to functionality. These results demonstrate how users view interaction within a usability context and how that might affect social acceptability.

Chapter 5 examines the appropriation of multimodal interaction through the completion of an entire design process. The results of an initial survey were used as a baseline from which to design the following focus group study. Participants in the focus groups had similar motives for accepting multimodal interactions, although the ways in which these were expressed resulted in very different preferences. The desire to use technology in a comfortable and satisfying way meant different things in these different settings.
During the ‘in the wild’ user study, participants adapted their performances in order to make interaction acceptable in different contexts. In some cases, performances were hidden or shared with familiar spectators in order to successfully incorporate interaction into public places.

Identifier: oai:union.ndltd.org:bl.uk/oai:ethos.bl.uk:547891
Date: January 2012
Creators: Williamson, Julie R.
Publisher: University of Glasgow
Source Sets: Ethos UK
Detected Language: English
Type: Electronic Thesis or Dissertation
Source: http://theses.gla.ac.uk/3260/
