Intelligent Augmented Reality (iAR): Context-aware Inference and Adaptation in AR

Davari-Najafabadi, Shakiba 12 September 2024
Augmented Reality (AR) transforms the entire 3D space around the user into a dynamic screen, surpassing the limitations of traditional displays and enabling efficient access to multiple pieces of information simultaneously, all day, every day. Recent developments in AR eyeglasses promise that AR could become the next generation of personal computing devices. To realize this vision of pervasive AR, the AR interface must address the challenges posed by constant and omnipresent virtual content. As the user's context changes, the virtual content in AR head-worn displays can occasionally become obtrusive, hindering the user's perception and awareness of their surroundings and their interaction with both the virtual and physical worlds. An intelligent interface is needed to adapt the presentation and interaction of AR content. This dissertation outlines a roadmap towards effective, efficient, and unobtrusive AR through intelligent AR (iAR) systems that automatically learn and adapt the interface to the user's context. To achieve this goal, we: (1) identify multiple AR design principles and guidelines that maintain efficiency while addressing challenges such as occlusion, social interaction, and content placement in AR; (2) demonstrate the impact of context on AR effectiveness, validating the advantages of context-awareness and highlighting the complexities of implementing a context-aware approach in pervasive AR, particularly in scenarios involving context-switching; (3) propose a design space for XR interfaces; and (4) develop a taxonomy of quantifiable contextual components and a framework for designing iAR interfaces.

Doctor of Philosophy

Augmented Reality (AR) integrates digital information with the real world in real time, transforming the surrounding physical space into a dynamic, interactive screen. This technology can simultaneously provide hands-free access to virtually unlimited applications and information, facilitating fast and easy access. With recent advancements in AR eyeglasses, AR is anticipated to become the next generation of personal computing, potentially replacing mobile phones and computers. However, to be seamlessly integrated into daily life, AR must overcome challenges such as occluding important real-world objects, distraction, visual clutter, and information overload. This dissertation presents a roadmap for developing intelligent AR (iAR) systems that automatically adapt to the user's context. To achieve this goal, we identify the design space for adaptable AR elements and design and test various context-aware AR interfaces. We identify key AR design principles that ensure efficiency while addressing challenges like occlusion, social interaction, and content placement. We also highlight the impact of context on AR effectiveness and the complexities of implementing a context-aware approach, especially in context-switching scenarios. Additionally, we inform the design of iAR interfaces by identifying the contextual components that influence their effectiveness and by providing a framework and architecture that use this information for automatic adaptation. These efforts aim to enhance AR effectiveness and efficiency while ensuring it remains unobtrusive in everyday use.
