1. Hand Gesture based Telemedicine enabled by Mobile VR
Vulgari, Sofia Kiriaki, January 2019
Virtual Reality (VR) is a highly evolving domain and is used in an increasing number of areas in today's society. Among the technologies associated with VR, and especially mobile VR, are hand tracking and hand gesture recognition. Telemedicine is one of the fields where VR is starting to thrive, and so the concept of adding hand gestures arose in order to explore the possibilities it can offer. This research is conducted through the development of a prototype application that uses several emerging technologies. Manomotion's hand tracking and hand gesture recognition algorithms, together with Photon's servers and developer kit, which make multi-user applications achievable, allowed the conceptual idea of the prototype to become reality. To test its usability and how potential users perceive it, a user study with 24 participants was conducted, 8 of whom were studying or working in the medical field. Additional expert meetings and observations from the user study also contributed to findings that helped show how hand gestures can affect a doctor consultation in Telemedicine. Findings showed that the participants thought of the proposed system as a less costly and time-saving solution, and that they felt immersed in the VR. The hand gestures were accepted and understood. The participants did not have difficulties learning or executing them, and had control of the prototype environment. In addition, the data showed that participants considered it usable in the medical field in the future.
2. Implementation and Analysis of Co-Located Virtual Reality for Scientific Data Visualization
Jordan M McGraw (8803076), 07 May 2020
Advancements in virtual reality (VR) technologies have led to overwhelming critique and acclaim in recent years. Academic researchers have already begun to take advantage of these immersive technologies across all manner of settings. Using immersive technologies, educators are able to more easily interpret complex information with students and colleagues. Despite the advantages these technologies bring, some drawbacks still remain. One particular drawback is the difficulty of engaging in immersive environments with others in a shared physical space (i.e., with a shared virtual environment). A common strategy for improving collaborative data exploration has been to use technological substitutions to make distant users feel they are collaborating in the same space. This research, however, is focused on how virtual reality can be used to build upon real-world interactions which take place in the same physical space (i.e., collaborative, co-located, multi-user virtual reality).

In this study we address two primary dimensions of collaborative data visualization and analysis as follows: (1) we detail the implementation of a novel co-located VR hardware and software system, and (2) we conduct a formal user experience study of the novel system using the NASA Task Load Index (Hart, 1986) and introduce the Modified User Experience Inventory, a new user study inventory based upon the Unified User Experience Inventory (Tcha-Tokey, Christmann, Loup-Escande, & Richir, 2016), to empirically observe the dependent measures of Workload, Presence, Engagement, Consequence, and Immersion. A total of 77 participants volunteered to join a demonstration of this technology at Purdue University. In groups ranging from two to four, participants shared a co-located virtual environment built to visualize point cloud measurements of exploded supernovae. This study is observational rather than experimental. We found moderately high levels of user experience and moderate levels of workload demand in our results. We describe the implementation of the software platform and present user reactions to the technology that was created. These are described in detail within this manuscript.