
Merging three spaces : exploring user interface framework for spatial design in virtual reality

Thesis: S.M., Massachusetts Institute of Technology, Department of Architecture, 2016.
This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections.
Page 58 blank. Cataloged from the student-submitted PDF version of the thesis.
Includes bibliographical references (page 57).

This thesis proposes a framework for deploying a tool that gives designers an alternative spatial environment in Virtual Reality (VR). Physical, projected, and immersive spaces are examined as the three kinds of space available to designers. To compare these spaces, a series of subject experiments is conducted in architectural space. A framework for a new tool is then prepared in light of the experimental results, and a prototype is created to demonstrate its unique user interface for the virtual reality environment.

Three experiments are designed to probe the differences and similarities in human perception among the three spaces and to test the following hypotheses:

* Hypothesis 1: VR technology can simulate the perception of scale and proportion of physical space with minimal error.
* Hypothesis 2A: A 3D model with realistic textures does not enhance the degree of perception of scale and proportion of physical space.
* Hypothesis 2B: A 3D model with realistic textures enhances spatial perception with greater confidence and shorter recognition time.
* Hypothesis 3: Compared to a first-person view in VR, a bird's-eye view on a 2D screen offers better perception of the orientation and location of different objects.
* Hypothesis 4: Compared to a bird's-eye view on a 2D screen, a first-person view in VR offers better perception of the scale of objects.

The results of these experiments lead to a framework for creating a user interface for VR. The experiment on Hypothesis 1 supports the claim that virtual space can replace physical space for spatial design purposes. The experiments on Hypotheses 3 and 4 suggest that a virtual UI should simultaneously include dual perspectives: a bird's-eye view and a first-person view. The experiments on Hypotheses 2 and 3 support providing two different rendering modes: for dynamic interactions, such as those between and among moving objects, the space should be rendered without textures for computational efficiency; for visual interactions, such as navigation, the space can be rendered with photo-realistic textures without losing efficiency. A prototype UI that implements this framework in a VR environment is built and demonstrates how the design process can be enhanced.

In summary, a framework for a unique Virtual Reality User Interface (VRUI) is explored for spatial design. It is derived from the way people perceive physical, projected, and immersive virtual environments. Designers can use this novel multidisciplinary design tool whether they design physical architecture or 3D environments for digital video games.

by Joshua Choi.

S.M.
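
The abstract describes a framework that pairs two simultaneous perspectives (bird's-eye and first-person) with two rendering modes (textured and untextured) selected by interaction type. The TypeScript sketch below is not from the thesis; it is a minimal illustration of that pairing under stated assumptions, and all names in it (SpatialDesignUI, SceneObject, renderModeFor) are hypothetical.

// Minimal sketch of the dual-view, dual-render-mode idea from the abstract.
// All identifiers here are illustrative assumptions, not the thesis prototype's API.

type ViewMode = "first-person" | "birds-eye";
type RenderMode = "textured" | "untextured";

interface SceneObject {
  id: string;
  moving: boolean; // dynamic (moving) objects participate in dynamic interactions
}

class SpatialDesignUI {
  // Both perspectives stay active at once, per the framework's suggestion
  // that the UI present dual perspectives simultaneously.
  private views: ViewMode[] = ["first-person", "birds-eye"];

  // Choose a render mode per interaction type: untextured for dynamic
  // interactions (computational efficiency), textured for visual navigation.
  renderModeFor(obj: SceneObject, interaction: "dynamic" | "visual"): RenderMode {
    if (interaction === "dynamic" || obj.moving) {
      return "untextured";
    }
    return "textured";
  }

  activeViews(): ViewMode[] {
    return this.views;
  }
}

// Usage: a moving object during a dynamic interaction is drawn untextured,
// while a static object during navigation is drawn with textures.
const ui = new SpatialDesignUI();
console.log(ui.renderModeFor({ id: "chair", moving: true }, "dynamic")); // "untextured"
console.log(ui.renderModeFor({ id: "wall", moving: false }, "visual"));  // "textured"
console.log(ui.activeViews()); // ["first-person", "birds-eye"]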

Identifier oai:union.ndltd.org:MIT/oai:dspace.mit.edu:1721.1/106369
Date January 2016
Creators Choi, Joshua, S.M. Massachusetts Institute of Technology
Contributors Takehiko Nagakura., Massachusetts Institute of Technology. Department of Architecture.
Publisher Massachusetts Institute of Technology
Source Sets M.I.T. Theses and Dissertation
Language English
Detected Language English
Type Thesis
Format 58 pages, application/pdf
Rights M.I.T. theses are protected by copyright. They may be viewed from this source for any purpose, but reproduction or distribution in any format is prohibited without written permission. See provided URL for inquiries about permission., http://dspace.mit.edu/handle/1721.1/7582