Attentive gestural user interface for touch screens

Gestural interfaces are user interfaces controlled by users' gestures, such as taps, flicks and swipes, without the aid of a conventional pointing device such as a mouse or a touchpad. The development of touch screen technology has resulted in an increasing number of inventive gestural interfaces. However, recent studies have shown that well-established interaction design principles are generally not followed, or are even violated, by gestural interfaces. As a result, severe usability issues have started to surface: the absence of signifiers for operative gestures, the weakening of visual feedback, the inability to discover every possible action in the interface, and the lack of consistency. All of these undermine the user experience of such interfaces and thus need to be addressed.

Further analysis of existing gestural interfaces suggests that the sole dependence on gestural input makes interface design unnecessarily complicated, which in turn makes it challenging to establish a standard. Therefore, an approach that supplements gestures with user attention is proposed. By incorporating eye gaze as a new input modality into gestural interactions, this novel type of interface can interact with users in a more intelligent and natural way by collecting input that reflects the users' interest and intention, which makes the interface attentive.

To demonstrate the viability of this approach, a system was built that utilises eye-tracking techniques to detect the visual attention of users and deliver the resulting input data to applications on a mobile device. A paradigm for attentive gestural interfaces was introduced to provide insights into how such interfaces can be designed, and a software prototype with attentive gestural interfaces was created according to the paradigm.
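The idea of combining a gaze stream with touch gestures can be sketched as follows. This is an illustrative reconstruction only, not the thesis's actual system: the names `GazeSample`, `Widget`, and `AttentiveRouter`, the 0.5-second fixation window, and the fallback-to-touch policy are all assumptions made for the sake of the example.

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Widget:
    """A rectangular on-screen target (illustrative stand-in for a UI element)."""
    name: str
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h


@dataclass
class GazeSample:
    """One eye-tracker reading: screen coordinates plus a timestamp in seconds."""
    x: float
    y: float
    t: float


class AttentiveRouter:
    """Routes a tap to the widget the user is looking at, when a fresh
    gaze fixation is available; otherwise falls back to plain touch
    hit-testing. The window length is a hypothetical parameter."""

    GAZE_WINDOW = 0.5  # seconds: how recent a gaze sample must be to count

    def __init__(self, widgets: List[Widget]):
        self.widgets = widgets
        self.last_gaze: Optional[GazeSample] = None

    def on_gaze(self, sample: GazeSample) -> None:
        self.last_gaze = sample

    def on_tap(self, tx: float, ty: float, t: float) -> Optional[Widget]:
        # Prefer the gaze target when the last fixation is recent enough.
        if self.last_gaze is not None and t - self.last_gaze.t <= self.GAZE_WINDOW:
            for w in self.widgets:
                if w.contains(self.last_gaze.x, self.last_gaze.y):
                    return w
        # Fall back to resolving the tap by its own coordinates.
        for w in self.widgets:
            if w.contains(tx, ty):
                return w
        return None
```

Under this sketch, a tap that lands on one widget while the user is fixating on another is resolved in favour of the gazed-at widget, which is one plausible way an interface could become "attentive".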

An experiment found that the new type of interface helped users learn a new application faster and modestly increased their accuracy when completing tasks. This provides evidence that attentive gestural interfaces can improve usability in terms of learnability and effectiveness.

This study focuses on interfaces of mobile devices whose major input mechanism is a touch screen, which are commonly seen and widely adopted. Although eye-tracking capability is not generally available on these devices, this study demonstrates that it has great potential to facilitate interfaces that are both gestural and attentive, and that it can enable new possibilities for future user interfaces.

Version published_or_final_version / Computer Science / Master / Master of Philosophy

Identifier oai:union.ndltd.org:HKU/oai:hub.hku.hk:10722/192863
Date January 2013
Creators Li, Sirui., 李思锐.
Contributors Lau, FCM
Publisher The University of Hong Kong (Pokfulam, Hong Kong)
Source Sets Hong Kong University Theses
Language English
Detected Language English
Type PG_Thesis
Source http://hub.hku.hk/bib/B50900080
Rights The author retains all proprietary rights (such as patent rights) and the right to use in future works., Creative Commons: Attribution 3.0 Hong Kong License
Relation HKU Theses Online (HKUTO)