In Virtual Reality (VR), users often need to explore a large virtual space within a limited physical space. Teleportation, one of the most popular and commonly used methods for such room-scale locomotion, typically relies on hand-held controllers. In applications that require continuous hand interaction, controller-based teleportation can conflict with the user's hand operations, causing discomfort and degrading the overall experience.

To alleviate these limitations, this research designs and implements ManiLoco, a new interactive-object-based VR locomotion method that is driven by the eyes and feet and requires no additional hardware. ManiLoco is evaluated against the state-of-the-art Point & Teleport and Gaze Teleport methods in a within-subject experiment with 14 participants.

The results confirm the viability of ManiLoco and its applicability to such applications. Users reported that their hands felt much more comfortable and that they could focus more on hand interaction within the application, while efficiency and presence were maintained. Furthermore, the users' trajectory maps indicate that ManiLoco, despite introducing walking, remains applicable to a room-scale tracking space. Finally, because ManiLoco relies only on the VR head-mounted display (HMD) and software-based detection, it can be added to any VR application as a plugin.
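The abstract does not spell out the detection logic, but its description (gaze selection of an interactive object combined with a foot action inferred purely from HMD data) suggests a plugin of roughly the following shape. The sketch below is a hypothetical, framework-agnostic Python illustration, not the thesis implementation: the class names (`StepDetector`, `Interactable`), the head-dip step heuristic, and all thresholds are assumptions made for the example.

```python
# Hypothetical sketch of a gaze-plus-step locomotion trigger.
# The actual ManiLoco detection logic is defined in the thesis itself.
from dataclasses import dataclass
import math


@dataclass
class Vec3:
    x: float
    y: float
    z: float


@dataclass
class Interactable:
    name: str
    position: Vec3


class StepDetector:
    """Infers a step-in-place from the HMD's vertical motion (assumed heuristic)."""

    def __init__(self, dip_threshold: float = 0.05):
        self.baseline_height = None
        self.dip_threshold = dip_threshold  # metres of head dip that counts as a step

    def update(self, hmd_height: float) -> bool:
        if self.baseline_height is None:
            self.baseline_height = hmd_height
            return False
        # Slowly track the resting head height so leaning is not misread as a step.
        self.baseline_height = 0.99 * self.baseline_height + 0.01 * hmd_height
        return (self.baseline_height - hmd_height) > self.dip_threshold


def gazed_object(gaze_origin: Vec3, gaze_dir: Vec3, objects, max_angle_deg: float = 5.0):
    """Return the interactable nearest the (unit-length) gaze ray, within a small cone."""
    best, best_angle = None, max_angle_deg
    for obj in objects:
        to_obj = Vec3(obj.position.x - gaze_origin.x,
                      obj.position.y - gaze_origin.y,
                      obj.position.z - gaze_origin.z)
        dot = gaze_dir.x * to_obj.x + gaze_dir.y * to_obj.y + gaze_dir.z * to_obj.z
        norm = math.sqrt(to_obj.x ** 2 + to_obj.y ** 2 + to_obj.z ** 2) or 1e-9
        angle = math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
        if angle < best_angle:
            best, best_angle = obj, angle
    return best


def locomotion_tick(gaze_origin, gaze_dir, hmd_height, objects, step_detector):
    """Per-frame update: gazing at an object while stepping returns a new standing position near it."""
    stepped = step_detector.update(hmd_height)
    target = gazed_object(gaze_origin, gaze_dir, objects)
    if target is not None and stepped:
        # Relocate the user to a point just in front of the gazed-at object.
        return Vec3(target.position.x, 0.0, target.position.z - 0.5)
    return None
```

In a VR engine, `locomotion_tick` would run once per frame with the HMD pose and eye-gaze ray supplied by the headset SDK, and a non-`None` return value would be applied to the player rig, keeping both hands free for object manipulation.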
Identifier | oai:union.ndltd.org:purdue.edu/oai:figshare.com:article/20321649 |
Date | 15 July 2022 |
Creators | Dayu Wan (13104111) |
Source Sets | Purdue University |
Detected Language | English |
Type | Text, Thesis |
Rights | CC BY 4.0 |
Relation | https://figshare.com/articles/thesis/ManiLoco_A_Locomotion_Method_to_Aid_Concurrent_Object_Manipulation_in_Virtual_Reality/20321649 |