71
OPIRA: The Optical-flow Perspective Invariant Registration Augmentation and other improvements for Natural Feature Registration / Clark, Adrian James, January 2009
In the domain of computer vision, registration is the process of calculating the transformation between a known object, called a marker, and a camera which is viewing it. Registration is the foundation for a number of applications across a range of disciplines such as augmented reality, medical imaging and robotic navigation.
Two-dimensional planar markers fall into two classes: (1) fiducial markers, which are designed to be easily recognisable by computers but have little to no semantic meaning to people, and (2) natural feature markers, which have meaning to people but can still be registered by a computer. As computers become more powerful, natural feature markers are increasingly the more popular choice; however, this class of markers still has a number of inherent problems.
This thesis examines the most common shortcomings of natural feature markers, and proposes and evaluates solutions to these weaknesses. The work starts with a review of the existing planar registration approaches, both fiducial and natural features, with a focus on the strengths and weaknesses of each. From this review, the theory behind planar registration is discussed, from the different coordinate systems and transformations, to the computation of the registration transformation.
With a foundation of planar registration, natural feature registration is decomposed into its main stages, and each stage is described in detail. This leads into a discussion of the complete natural feature registration pipeline, highlighting common issues encountered at each step, and discussing the possible solutions for each issue.
A new implementation of natural feature registration called the Optical-flow Perspective Invariant Registration Augmentation (OPIRA) is proposed, which provides substantial improvements in robustness to perspective, rotation and scale changes for popular registration algorithms such as SIFT, SURF and the Ferns classifier. OPIRA is shown to improve perspective invariance on average by 15% for SIFT, 25% for SURF and 20% for the Ferns classifier, as well as providing complete rotation invariance for the rotation-dependent implementations of these algorithms.
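As a rough illustration of the kind of planar natural feature registration these algorithms perform (a sketch only, not the OPIRA implementation), the following Python/OpenCV snippet matches SIFT keypoints between a marker image and a camera frame and estimates the marker-to-camera homography with RANSAC; the function name, ratio-test threshold and reprojection tolerance are illustrative assumptions.

```python
# Hedged sketch of planar natural feature registration: SIFT matching plus a
# RANSAC homography. Not the OPIRA algorithm; thresholds are assumed values.
import cv2
import numpy as np

def register_marker(marker_gray, frame_gray, min_matches=10):
    sift = cv2.SIFT_create()
    kp_m, des_m = sift.detectAndCompute(marker_gray, None)
    kp_f, des_f = sift.detectAndCompute(frame_gray, None)
    if des_m is None or des_f is None:
        return None  # Not enough texture to register.

    # Lowe's ratio test discards ambiguous correspondences.
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    good = []
    for pair in matcher.knnMatch(des_m, des_f, k=2):
        if len(pair) == 2 and pair[0].distance < 0.7 * pair[1].distance:
            good.append(pair[0])
    if len(good) < min_matches:
        return None

    src = np.float32([kp_m[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp_f[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    # Homography mapping marker-image coordinates into the camera frame.
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return H
```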
From the investigation into problems and potential resolutions at each stage of registration, each proposed solution is evaluated empirically against an external ground truth. The results are discussed, and conclusions are drawn about the improvement gained by each proposed solution and its feasibility for use in a real natural feature registration application.
Finally, some applications which use the research contained within this thesis are described, as well as some future directions for the research.
72
Real-Time Hybrid Tracking for Outdoor Augmented Reality / Williams, Samuel Grant Dawson, January 2014
Outdoor tracking and registration are important enabling technologies for mobile augmented reality. Sensor fusion and image processing can be used to improve global tracking and registration for low-cost mobile devices with limited computational power and sensor accuracy. Prior research has confirmed the benefits of this approach with high-end hardware; however, the methods previously used are not ideal for current consumer mobile devices. We discuss the development of a hybrid tracking and registration algorithm that combines multiple sensors and image processing to improve on existing work in both performance and accuracy. As part of this, we developed the Transform Flow toolkit, one of the first open source systems for developing and quantifiably evaluating mobile AR tracking algorithms. We used this system to compare our proposed hybrid tracking algorithm with a purely sensor-based approach, and to perform a user study analysing the effects of improved precision on real-world tracking tasks. Our results show that our implementation improves on a purely sensor-fusion-based approach: accuracy is improved up to 25x in some cases with only 2-4 ms of additional processing per frame, compared with other algorithms that can take over 300 ms.
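As a minimal, hedged sketch of the general idea behind hybrid tracking (not the algorithm from this thesis), the following Python function blends fast but drifting gyroscope integration with an occasional drift-free vision estimate using a complementary filter; the single-axis yaw model and the blending factor are simplifying assumptions.

```python
import math

def fuse_yaw(prev_yaw, gyro_rate, dt, vision_yaw=None, alpha=0.98):
    """Complementary-filter fusion of gyroscope and vision heading (radians).

    Illustrative only: a real hybrid tracker fuses full 3D orientation and
    handles angle wrapping across the blend, latency and outliers.
    """
    # Dead-reckon from the gyroscope: responsive, but drifts over time.
    yaw = prev_yaw + gyro_rate * dt
    if vision_yaw is not None:
        # Pull the estimate toward the slower, drift-free vision measurement.
        yaw = alpha * yaw + (1.0 - alpha) * vision_yaw
    # Normalize to [-pi, pi) for readability.
    return (yaw + math.pi) % (2.0 * math.pi) - math.pi
```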
73
The Universal Media Book / Gupta, Shilpi, 01 January 2006
We explore the integration of projected imagery with a physical book that acts as a tangible interface to multimedia data. Using a camera and projector pair, a tracking framework is presented wherein the 3D positions of planar pages are monitored as they are turned back and forth by a user, and data is correctly warped and projected onto each page at interactive rates to provide the user with an intuitive mixed-reality experience. The book pages are blank, so traditional camera-based approaches to tracking physical features on the display surface do not apply. Instead, in each frame, feature points are independently extracted from the camera and projector images and matched to recover the geometry of the pages in motion. The book can be loaded with multimedia content, including images and videos. In addition, volumetric datasets can be explored by removing a page from the book and using it as a tool to navigate through a virtual 3D volume.
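A hedged sketch (not the thesis implementation) of the final step such a system needs: once a page's pose has been recovered as a homography, flat page content is warped into projector pixel coordinates so that it lands on the physical page. The variable names and the composition with a content-to-page scaling are assumptions for illustration.

```python
import cv2
import numpy as np

def render_page(content_img, H_page_to_projector, page_size,
                projector_size=(1280, 800)):
    """Warp flat page content into the projector image.

    page_size: (width, height) of the page in the units used when the
    homography was estimated; all names here are illustrative assumptions.
    """
    content_h, content_w = content_img.shape[:2]
    page_w, page_h = page_size
    # Scale content pixels into the page's coordinate frame ...
    S = np.array([[page_w / content_w, 0.0, 0.0],
                  [0.0, page_h / content_h, 0.0],
                  [0.0, 0.0, 1.0]])
    # ... then map the page into projector pixels with the recovered homography.
    H = H_page_to_projector @ S
    return cv2.warpPerspective(content_img, H, projector_size)
```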
74
Mobile solutions and the museum experience / Koskiola, Annina, January 2014
This thesis presents four case studies from the Finnish museum sector that use mobile technologies in diverse ways to enhance the museum experience. At the National Museum of Finland, the mobile solution functions as an aid for providing translations in different languages, thus improving the aesthetic appearance of the exhibition. At Tampere Art Museum, the outdoor mobile tour extends the museum visit beyond the physical walls of the building. At Helsinki City Museum, the mobile phone is perceived as a communication tool. At Luostarinmäki Handicrafts Museum, the Augmented Reality game combines digital narrative with real-world events, creating a solution that is both entertaining and informative. These solutions are analysed in terms of the Contextual Model developed by Falk and Dierking, which divides the museum visit into three overlapping and interacting spheres: personal, social and physical. This thesis looks at how mobile solutions may enhance or hinder the museum experience with regard to each of these three spheres. Additionally, the model is compared with the results of visitor research conducted at the National Museum of Finland in October 2013. The aim of the thesis is to identify the most successful features of these solutions and to explore how the field could be developed in the future.
75
Realization of a Spatial Augmented Reality System: A Digital Whiteboard Using a Kinect Sensor and a PC Projector / Kolomenski, Andrei A., 02 October 2013
Recent rapid development of cost-effective, accurate digital imaging sensors, high-speed computational hardware, and tractable design software has given rise to the growing field of augmented reality in the computer vision realm. The system design of a 'Digital Whiteboard' system is presented with the intention of realizing a practical, cost-effective and publicly available spatial augmented reality system.
A Microsoft Kinect sensor and a PC projector coupled with a desktop computer form a type of spatial augmented reality system that creates a projection-based graphical user interface capable of turning any wall or planar surface into a 'Digital Whiteboard'. The system supports two kinds of user input: depth and infra-red information. An infra-red collimated light source, like that of a laser pointer pen, serves as a stylus for user input. The user points the infra-red stylus at the selected planar region, and the reflection of the infra-red light source is registered by the system using the infra-red camera of the Kinect. Using the geometric transformation between the Kinect and the projector, obtained through system calibration, the projector displays contours corresponding to the movement of the stylus on the 'Digital Whiteboard' region, according to a smooth curve-fitting algorithm. The described projector-based spatial augmented reality system provides new and unique possibilities for user interaction with digital content.
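The core mapping can be illustrated with a short hedged sketch (assumed names and threshold, not code from the thesis): the stylus reflection is detected as a bright blob in the Kinect's infra-red image, and its centroid is transformed into projector pixels through a homography obtained during calibration, which suffices because the drawing surface is planar.

```python
import cv2
import numpy as np

def stylus_to_projector(ir_image_8bit, H_ir_to_projector, threshold=200):
    """Map the infra-red stylus reflection to projector pixel coordinates.

    Assumes the Kinect IR frame has already been converted to 8-bit and that
    H_ir_to_projector was estimated in a one-off calibration step.
    """
    # The collimated IR source appears as a small, very bright blob.
    _, mask = cv2.threshold(ir_image_8bit, threshold, 255, cv2.THRESH_BINARY)
    moments = cv2.moments(mask)
    if moments["m00"] == 0:
        return None  # Stylus not visible in this frame.
    cx = moments["m10"] / moments["m00"]
    cy = moments["m01"] / moments["m00"]
    # Transfer the centroid from IR-camera pixels to projector pixels.
    point = np.array([[[cx, cy]]], dtype=np.float32)
    px, py = cv2.perspectiveTransform(point, H_ir_to_projector)[0, 0]
    return float(px), float(py)
```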
76
Smartphone-Mediated Tourist Experiences: Understanding the Influence of Augmented Reality (AR) Applications in Tourism / Anuar, Faiz Izwan, 03 October 2013
The synergy of smartphones, mobile applications (apps) and Augmented Reality (AR) technology has the potential to mediate tourism experiences to a great extent. The advent of AR apps on smartphones provides a dynamic solution for tourists, helping convey destinations' meanings and create positive experiences via interactive tourist information and services almost anywhere, anytime. As a result, tourists are increasingly using AR travel apps at destinations to create more memorable travel experiences.
Despite a vast literature on tourists' experiences, there is limited research focusing on the use of smartphones and AR apps for tourism. A critical review of the literature indicates a need to develop a richer theoretical framework describing the use of smartphones and AR apps for travel, as well as a need to understand tourists' experiences with smartphone-mediated technology. In particular, the literature on the use of smartphones and apps for travel is largely established from a quantitative perspective, and it is argued that this perspective cannot provide an in-depth understanding of the mechanisms that affect the use of smartphones and travel apps, which in turn shapes the travel experience.
The present qualitative study was designed to understand the current use and possible benefits of smartphone-mediated tourism experiences with AR apps. Specifically, the purpose of this study was to examine the influence of AR apps on tourists' experiences. The study sought to understand how tourists used AR apps, which specific interactions with the mobile devices were afforded, what emotions were evoked through interaction with the AR technology, and how the technology mediated tourists' experiences. On this basis, the study attempted to generate an inductive middle-range theory of smartphone-mediated tourism experiences using the grounded theory method.
An iPhone AR app was developed for the Texas A&M University campus to better understand how tourists used the AR app and how this use influenced their travel experiences. Forty-four participants, including students, prospective students and visitors to Texas A&M University, were recruited for the study. To aid theory building and strengthen the resulting theory of the smartphone-mediated travel experience, the study included a control group, which involved individual, group and guided tours that used only a brochure/campus booklet or a human tour guide. The AR app was tested with 10 individuals and 10 groups. For the control group, 6 individuals and 6 groups used a brochure/campus booklet while touring the sites, and 6 individuals and 6 groups listened to the tour guide. This comparison provided a detailed understanding of the travel experience in the absence of technology and of what AR technology adds. Data were collected through face-to-face in-depth interviews with the participants, then transcribed and imported into ATLAS.TI 7.0 software for analysis.
A grounded theory approach was used to analyze the data. The interview data were coded and presented in five major sections representing the research questions. The results of the study provided theoretical contributions in understanding the smartphone-mediated tourism experiences and offered practical implications for app design and interpretative services for tourist sites.
77
Robust dynamic orientation sensing using accelerometers: model-based methods for head tracking in AR: a thesis presented for the degree of Doctor of Philosophy in Mechanical Engineering at the University of Canterbury, Christchurch, New Zealand / Keir, Matthew Stuart, 2008
Thesis (Ph. D.)--University of Canterbury, 2008. / Typescript (photocopy). "24 September 2008." Includes bibliographical references (p. [137]-143). Also available via the World Wide Web.
78
Understanding user engagement in immersive and interactive stories / Dow, Steven P., January 2008
Thesis (Ph.D)--Computing, Georgia Institute of Technology, 2009. / Committee Chair: MacIntyre, Blair; Committee Member: Bolter, Jay; Committee Member: Guzdial, Mark; Committee Member: Mateas, Michael; Committee Member: Mynatt, Elizabeth. Part of the SMARTech Electronic Thesis and Dissertation Collection.
79
ARSTUDIO 2.0: a virtual studio system for the generation of media content based on the Unity3D game engine / Aguilar, Ivan Abdo, January 2017
Advisor: Antonio Carlos Sementille / Committee: Silvio Ricardo Rodrigues Sanches / Committee: Letícia Passos Affini / Abstract: Film, television and internet productions can benefit from techniques for combining virtual elements (2D and 3D) with real elements provided by Augmented Reality, and the application of these techniques in virtual studio systems has proved to be a flexible and innovative approach to the generation of media content. In traditional production pipelines, virtual elements are usually inserted only during the post-production phase and, consequently, the virtual content is only visible after the editing process is finished. While filming in the real studio, the film crew and actors are guided by simple visual cues indicating where the virtual characters and objects should appear. With Augmented Reality techniques, virtual studios can allow the insertion, interaction and visualization of virtual objects in real time during all stages of the production pipeline, minimizing recording errors and, consequently, reducing costs. A major problem in the implementation of virtual studios is the generation of, and interaction with, high-definition 3D virtual elements in real time. Traditionally this problem is handled using graphics libraries; another possibility, however, is the use of game engines. In this context, the main objective of this work was to design, implement and test a low-cost virtual studio capable of combining functions from Vuforia's augmented reality library and the computer graphics capabilities of the Unity 3D game engine, as well as the motion capture capabilities of Microsoft's Kinect v2. This virtual studio was named ARSTUDIO version 2. / Master's
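As a minimal hedged sketch of the per-frame compositing step a real-time virtual studio performs (the actual system relies on Unity3D's renderer; function and parameter names here are assumptions), a rendered virtual layer with an alpha channel can be blended over the live camera frame as follows.

```python
import numpy as np

def composite(camera_frame_bgr, virtual_rgba):
    """Alpha-composite a rendered virtual layer over a live camera frame.

    camera_frame_bgr: HxWx3 uint8 camera image; virtual_rgba: HxWx4 uint8
    render of the virtual elements with transparency. Illustrative only.
    """
    alpha = virtual_rgba[..., 3:4].astype(np.float32) / 255.0
    virtual_rgb = virtual_rgba[..., :3].astype(np.float32)
    blended = alpha * virtual_rgb + (1.0 - alpha) * camera_frame_bgr.astype(np.float32)
    return blended.astype(np.uint8)
```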
80
Design and Evaluation of a 3D Map View using Augmented Reality in Flight Training Simulators / Montalvo, Philip; Pihl, Tobias, January 2018
The ability to visualize and manipulate an airspace is an important tool for an instructor controlling a flight simulator mission. Usually, airspaces are observed and manipulated through 2D and 3D views on a computer screen. A problem with using a computer screen is spatial perception, which is significantly limited when observing a 3D space. Technologies such as AR and VR provide new possibilities for presenting and manipulating objects and spaces in 3D, which could be used to improve spatial perception. Microsoft's HoloLens is a see-through head-mounted display that can project 3D holograms into the real world to create an AR experience. This thesis presents a prototype for displaying a 3D map view using the HoloLens, designed to improve spatial perception while maintaining high usability. The prototype has been evaluated through heuristic and formative evaluations and was well received by its potential users. The prototype has also been used to present suggestions for improving spatial ability in similar applications.
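One building block such a 3D map view needs, sketched here under assumptions not stated in the thesis, is placing airspace entities into the miniature map: a geodetic position (latitude, longitude, altitude) is converted to local coordinates relative to the map's origin with a flat-earth approximation and then scaled down to the hologram.

```python
import math

EARTH_RADIUS_M = 6378137.0  # WGS-84 equatorial radius.

def to_map_coords(lat_deg, lon_deg, alt_m, origin_lat_deg, origin_lon_deg,
                  map_scale=1.0 / 50000.0):
    """Convert a geodetic position to scaled local map coordinates.

    Flat-earth (equirectangular) approximation, adequate for the small areas a
    tabletop map covers; the 1:50000 scale is an arbitrary assumed value.
    Returns (x_east, y_up, z_north) in metres of the miniature map.
    """
    lat0 = math.radians(origin_lat_deg)
    east = math.radians(lon_deg - origin_lon_deg) * EARTH_RADIUS_M * math.cos(lat0)
    north = math.radians(lat_deg - origin_lat_deg) * EARTH_RADIUS_M
    return east * map_scale, alt_m * map_scale, north * map_scale
```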