Virtual reality and gesture-based input devices are growing fields that are becoming increasingly popular. Combining the two technologies requires an understanding of how gestures relate to virtual objects and of what users expect those gestures to do. This thesis focuses specifically on arm gestures for rotating virtual objects. Participants in the study were first asked to freely perform an arm gesture they felt should execute a given task. Next, participants were asked to perform specific rotation tasks with pre-configured arm gestures on four objects. There were two types of objects: those that could be rotated on only one axis and those that could be rotated on two axes. Each object type was represented by a familiar small object and a familiar large object: a combination lock, a water wheel, a baseball, and a beach ball. Data on how quickly participants could complete the rotation tasks were collected. After performing the tasks on each of the four objects, participants were asked to rate the intuitiveness of each gesture and to name their preferred gesture for each task.

The captured data showed that when users were presented with virtual representations of familiar physical objects, most of them expected to rotate the objects with the same gestures they would use on the actual physical objects. For 1-axis objects, an arm-based twist gesture outperformed the other arm-based gestures in both intuitiveness and efficiency; for 2-axis objects, an arm-based horizontal/vertical gesture did the same. Interestingly, these gestures were the most efficient for each object type regardless of the size of the object being rotated.

This suggests that users are able to mentally separate the physical and virtual experiences: larger objects require different rotation gestures than smaller objects in the physical world, but that requirement does not exist in a virtual world. However, while the mind can separate the physical and virtual worlds, there is still an expected connection between them, since the gestures participants most preferred for the rotation tasks were the same gestures they would use for the corresponding physical tasks.
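To make the two gesture-to-rotation mappings concrete, here is a minimal sketch of how a forearm twist might drive a 1-axis object (such as the combination lock) and how horizontal/vertical hand motion might drive a 2-axis object (such as the beach ball). The function names, linear gains, and units are illustrative assumptions; the abstract does not specify an implementation.

```python
import math

def one_axis_rotation(twist_angle_rad, step=1.0):
    """Map an arm twist (forearm roll) onto a 1-axis object:
    the object's rotation about its single free axis tracks the
    measured twist angle, scaled by an assumed gain `step`."""
    return twist_angle_rad * step  # radians about the object's free axis

def two_axis_rotation(dx_mm, dy_mm, gain=0.01):
    """Map horizontal/vertical hand displacement onto a 2-axis object:
    horizontal motion spins the object about its vertical (yaw) axis,
    vertical motion about its horizontal (pitch) axis."""
    yaw = dx_mm * gain    # radians about the vertical axis
    pitch = dy_mm * gain  # radians about the horizontal axis
    return yaw, pitch

# Example: a 90-degree forearm twist turns the lock dial 90 degrees,
# while moving the hand 200 mm right and 100 mm up spins the beach
# ball about both of its free axes.
print(one_axis_rotation(math.radians(90)))
print(two_axis_rotation(200, 100))
```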
Identifier: oai:union.ndltd.org:BGMYU2/oai:scholarsarchive.byu.edu:etd-7196
Date: 01 December 2016
Creators: Garner, Brandon Michael
Publisher: BYU ScholarsArchive
Source Sets: Brigham Young University
Detected Language: English
Type: text
Format: application/pdf
Source: All Theses and Dissertations
Rights: http://lib.byu.edu/about/copyright/