Advanced Multi-modal User Interfaces in 3D Computer Graphics and Virtual Reality

Chen, Yenan (January 2012)
Computers are continuously developed to satisfy human demands and have become typical tools used everywhere, from daily life to all kinds of research. Virtual Reality (VR), a simulated environment that conveys physical presence in the real world or in imaginary worlds, is widely applied for such simulations. When only computers are used, human perception is limited to the visual channel, since computers mainly display visualizations of data, whereas human senses include sight, smell, hearing, taste, touch and more. Other devices, such as haptic devices for the sense of touch, can be added to enhance human perception in a virtual environment. A good way to deploy VR applications is to place them in a virtual display system, a system that combines multiple tools to present a virtual environment through several human senses and thereby strengthens the feeling of immersion. Such systems include the VR dome, the CAVE (a recursive acronym for CAVE Automatic Virtual Environment), the VR workbench, the VR workstation and so on.

Menus, which offer many advantages for manipulating applications, are common in conventional computer systems such as operating systems; normally a system is hardly usable without them. Although VR applications are more natural and intuitive, they are much less usable, or not usable at all, without menus. Yet very few studies have focused on user interfaces in VR, and this situation motivates us to work further in this area. We create two menu models for different purposes: one inspired by menus in conventional systems and by the sense of touch, the other designed around the spatial presence of VR.

The first model is a two-dimensional pop-up pie menu with spring force feedback. The root menu has a pie shape with eight options, and each option owns a pop-up hierarchical submenu. When the haptic device comes near an option on the root menu, a spring force pulls the device towards the center of that option, the option is selected, and its submenu with nine options pops up. The pie shape together with the spring force effect is expected both to increase the speed of selection and to decrease the selection error rate.

The other model is a semiautomatic three-dimensional cube menu, designed with the aim of providing a simple, elegant, efficient and accurate user interface. It uses four faces of the cube (front, back, left and right); each face represents a category and holds nine widgets, so users can make selections in different categories. An efficient way to switch between categories is to rotate the cube automatically: a navigable rotation animation system rotates the cube horizontally by ninety degrees at a time, so that one face always faces the user.

Both models are built on H3DAPI, an open-source haptics software development platform, together with UI toolkit, a user interface toolkit. After the implementation, we conducted a pilot study, a formative study, to evaluate the feasibility of both menus. The pilot study consisted of a list of tasks for each menu, a questionnaire on menu performance for each subject and a discussion with each subject. Six students participated as test subjects.
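To make the spring-force selection concrete, the following is a minimal, hypothetical sketch of how a snap-to-center force could be computed for the root pie menu. It assumes a simple Hooke's-law pull towards the nearest option center within a capture radius; the names (spring_force, STIFFNESS, CAPTURE_RADIUS) and the numeric values are illustrative and are not taken from the thesis or from the H3DAPI force-effect API.

import math

# Illustrative parameters; hypothetical values, not from the thesis.
STIFFNESS = 120.0      # spring constant k (N/m)
CAPTURE_RADIUS = 0.02  # distance (m) at which an option starts attracting the proxy

def spring_force(proxy_pos, option_centers):
    """Return a Hooke's-law force pulling the haptic proxy towards the
    nearest pie-menu option center, or zero if none is close enough."""
    nearest, best_dist = None, float("inf")
    for center in option_centers:
        d = math.dist(proxy_pos, center)
        if d < best_dist:
            nearest, best_dist = center, d
    if nearest is None or best_dist > CAPTURE_RADIUS:
        return (0.0, 0.0, 0.0)          # outside every option: no force
    # F = k * (x_center - x): pull towards the option center
    return tuple(STIFFNESS * (c - p) for p, c in zip(proxy_pos, nearest))

# Example: proxy slightly off the center of one of eight root options
centers = [(math.cos(i * math.pi / 4) * 0.05,
            math.sin(i * math.pi / 4) * 0.05, 0.0) for i in range(8)]
print(spring_force((0.045, 0.01, 0.0), centers))

In an actual H3DAPI application this rule would be rendered to the device as a force effect; the sketch only shows the geometric logic of snapping to an option.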
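Similarly, the ninety-degree cube rotation can be sketched as a small animation controller. The class below is a hypothetical illustration rather than the thesis implementation: it steps a yaw angle from its current value to the next multiple of ninety degrees with smoothstep easing, ignoring new input while a turn is in progress, so one of the four faces always ends up facing the user.

import time

class CubeMenuRotator:
    """Minimal sketch of a semiautomatic cube rotation: each navigation
    step turns the cube 90 degrees about the vertical axis over a short
    animation, so one of the four faces always ends up facing the user."""

    def __init__(self, duration=0.5):
        self.yaw = 0.0            # current yaw in degrees
        self.duration = duration  # seconds per 90-degree turn (hypothetical)
        self._start_yaw = 0.0
        self._target_yaw = 0.0
        self._start_time = None

    def rotate(self, direction):
        """Start a turn; direction is +1 (next category) or -1 (previous)."""
        if self._start_time is None:               # ignore input mid-turn
            self._start_yaw = self.yaw
            self._target_yaw = self.yaw + 90.0 * direction
            self._start_time = time.monotonic()

    def update(self):
        """Call once per frame; returns the yaw angle to apply to the cube."""
        if self._start_time is not None:
            t = min((time.monotonic() - self._start_time) / self.duration, 1.0)
            ease = t * t * (3.0 - 2.0 * t)          # smoothstep easing
            self.yaw = self._start_yaw + (self._target_yaw - self._start_yaw) * ease
            if t >= 1.0:
                self._start_time = None             # animation finished
        return self.yaw

# Example: turn to the next category and step the animation a few times
rotator = CubeMenuRotator(duration=0.2)
rotator.rotate(+1)
for _ in range(5):
    time.sleep(0.05)
    print(round(rotator.update(), 1))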
In the pie menu, most subjects felt that the spring force guided them to the target option and that they could control the haptic device comfortably under that force. In the cube menu, the navigation rotation system worked well and the cube rotated accurately and efficiently. The results of the pilot study show that the models work as we initially expected. The recorded task completion times show that, with the same number of tasks of similar difficulty, subjects spent more time on the cube menu than on the pie menu. This may indicate that the pie menu is a faster approach than the cube menu, and we further consider that both the pie shape and the force feedback may help reduce selection time. The option selection error rate measured on the cube menu may indicate that option selection without any force feedback can also achieve a considerably good result. According to the questionnaire answers from each subject, both menus are comfortable to use and easy to keep under control.
