1

Road Stakeout In Wearable Outdoor Augmented Reality

Buchmann, Volkert January 2008 (has links)
This thesis advances wearable outdoor augmented reality (WOAR) research by proposing novel visualisations, by consolidating previous work, and through several formal user studies. Wearable outdoor augmented reality combines augmented reality (AR) and wearable computing to enable novel applications. AR allows the user to perceive virtual objects as part of their real environment. Using wearable computers as a platform for AR allows users to see the real and the virtual world combined in a mobile environment. This combination enables new and exciting applications that bring with them new challenges for interface and usability research. The research described in this thesis advances the field of WOAR research by developing a WOAR version of a commercial road stakeout application. This case study makes possible the first formal direct comparison of the performance of a WOAR application and its conventional counterpart. Road stakeout is the process of locating points in the real world and marking them with stakes. This process is relevant not only for road construction, but also for construction and surveying in general. A WOAR stakeout application can visualise stakeout targets at their locations in the real world, while conventional stakeout systems can only guide users to these locations using indirect displays. The formal comparison found significant differences in performance, and showed that the WOAR system performed twice as fast at the same accuracy level as the conventional system. The study also identified a number of usability issues and technical problems related to WOAR systems that still need to be overcome. The thesis examines usability problems of the WOAR road stakeout application in detail, proposes solutions, and compares their efficiency in formal user studies. The basic stakeout tasks are navigating to a target location and then placing a stakeout pole on that location. Original research in the fields of directional interfaces and depth cues determined solutions for efficient navigation and pole placement in the WOAR stakeout application. Further, the presented work includes explorative implementations of obscured-information visualisations. The thesis proposes interaction with artificially transparent stakeout poles and hands, and examines their feasibility with respect to perceptual and technical issues. A visualisation of a road model explored how to preserve context while automatically providing detail when needed. The thesis presents working WOAR implementations of navigation and depth cue support, a road model visualisation, and an artificially transparent stakeout pole. In conclusion, the thesis consolidates WOAR interface research and extends the field with empirical research. The presented research is the first that allows a WOAR application to compete directly with a commercial conventional system, demonstrating the strong potential that WOAR systems already have.
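At its core, any stakeout guidance display, whether the WOAR overlay described above or a conventional indirect display, is driven by the distance and bearing from the user's current position to the target point. The sketch below shows that underlying computation; it is an illustrative reconstruction under a spherical-Earth assumption, not code from the thesis, and the coordinates and function name are invented for the example (real stakeout systems work in projected survey coordinates at centimetre precision).

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius; adequate over short stakeout distances

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance (m) and initial bearing (degrees clockwise from
    north) from the user's position (lat1, lon1) to a target (lat2, lon2)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)

    # Haversine formula for the distance
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    dist = 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

    # Initial bearing toward the target
    y = math.sin(dlmb) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlmb)
    bearing = (math.degrees(math.atan2(y, x)) + 360) % 360
    return dist, bearing

# Example: a target roughly 11 m to the north-north-east of the user
d, b = distance_and_bearing(-43.52250, 172.58100, -43.52241, 172.58105)
print(f"target: {d:.2f} m away, bearing {b:.1f} deg")
```

In the WOAR system this bearing determines where the target is drawn in the user's head-worn view; in the conventional system it drives the indirect display that the user must mentally map onto the environment.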
2

Prototype development of low-cost, augmented reality trainer for crew service weapons

Conger, Nathan W. January 2008 (has links) (PDF)
Thesis (M.S. in Modeling Virtual Environments and Simulations (MOVES))--Naval Postgraduate School, September 2008. / Thesis Advisor(s): Kölsch, Mathias; Sullivan, Joseph. "September 2008." Description based on title screen as viewed on November 5, 2008. Includes bibliographical references (p. 65-66). Also available in print.
3

Improving situational awareness on submarines using augmented reality

Hatt, Ronald V. January 2008 (has links) (PDF)
Thesis (M.S. in Modeling Virtual Environments and Simulation (MOVES))--Naval Postgraduate School, September 2008. / Thesis Advisor(s): Sullivan, Joseph. "September 2008." Description based on title screen as viewed on November 5, 2008. Includes bibliographical references (p. 85-86). Also available in print.
4

Exploring augmented reality

Scheinerman, Matt. January 2009 (has links)
Thesis (B.A.)--Haverford College, Dept. of Computer Science, 2009. / Includes bibliographical references.
5

Augmented Reality Interfaces for Enabling Fast and Accurate Task Localization

Sukan, Mengu January 2017 (has links)
Changing viewpoints is a common technique for gaining additional visual information about the spatial relations among the objects contained within an environment. In many cases, all of the necessary visual information is not available from a single vantage point, due to factors such as occlusion, level of detail, and limited field of view. In certain instances, strategic viewpoints may need to be visited multiple times (e.g., after each step of an iterative process), which makes being able to transition between viewpoints precisely and with minimum effort advantageous for improved task performance (e.g., faster completion time, fewer errors, less dependence on memory). Many augmented reality (AR) applications are designed to make tasks easier to perform by supplementing a user's first-person view with virtual instructions. For those tasks that benefit from being seen from more than a single viewpoint, AR users typically have to physically relocalize (i.e., move a see-through display, and typically themselves, since those displays are often head-worn or hand-held) to those additional viewpoints. However, this physical motion may be costly or difficult, due to increased distances or obstacles in the environment. We have developed a set of interaction techniques that enable fast and accurate task localization in AR. Our first technique, SnapAR, allows users to take snapshots of augmented scenes that can be virtually revisited at later times. The system stores still images of scenes along with camera poses, so that augmentations remain dynamic and interactive. Our prototype implementation features a set of interaction techniques specifically designed to enable quick viewpoint switching. A formal evaluation of the capability to manipulate virtual objects within snapshot mode showed significant savings in time spent and gains in accuracy when compared to physically traveling between viewpoints. For cases when a user has to physically travel to a strategic viewpoint (e.g., to perform maintenance and repair on a large physical piece of equipment), we present ParaFrustum, a geometric construct that represents a set of strategic viewpoints and viewing directions and establishes constraints on a range of acceptable locations for the user's eyes and a range of acceptable angles in which the user's head can be oriented. Providing tolerance in the allowable viewing positions and directions avoids burdening the user with the need to assume a tightly constrained 6DOF pose when it is not required by the task. We describe two visualization techniques, ParaFrustum-InSitu and ParaFrustum-HUD, that guide a user to assume one of the poses defined by a ParaFrustum. A formal user study corroborated that speed improvements increase with larger tolerances and revealed interesting differences in participant trajectories based on the visualization technique. When the object to be operated on is smaller and can be handheld, instead of being large and stationary, it can be manually rotated instead of the user moving to a strategic viewpoint. Examples of such situations include tasks in which one object must be oriented relative to a second prior to assembly, and tasks in which objects must be held in specific ways to inspect them. Researchers have investigated guidance mechanisms for some 6DOF tasks, using wide field-of-view (FOV), stereoscopic virtual and augmented reality head-worn displays (HWDs).
However, there has been relatively little work directed toward smaller-FOV, lightweight, monoscopic HWDs, such as Google Glass, which may remain more comfortable and less intrusive than stereoscopic HWDs in the near future. In our Orientation Assistance work, we have designed and implemented a novel visualization approach and three additional visualizations representing different paradigms for guiding unconstrained manual 3DOF rotation, targeting these monoscopic HWDs. The thesis includes our exploration of these paradigms and the results of a user study evaluating the relative performance of the visualizations and showing the advantages of our new approach. In summary, we investigated ways of enabling an AR user to obtain visual information from multiple viewpoints, both physically and virtually. In the virtual case, we showed how one can change viewpoints precisely and with less effort. In the physical case, we explored how we can interactively guide users to obtain strategic viewpoints, either by moving their heads or by re-orienting handheld objects. In both cases, we showed that our techniques help users accomplish certain types of tasks more quickly and with fewer errors, compared to when they have to change viewpoints following alternative, previously suggested methods.
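The key idea behind SnapAR as summarised above, storing the camera pose alongside each still image so that augmentations stay live within a frozen view, is compact enough to sketch. The following is a minimal illustrative reconstruction, not the thesis's implementation; the class layout, NumPy types, and pinhole intrinsics are assumptions.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class Snapshot:
    """A frozen view of the scene: the still image plus the camera pose and
    intrinsics captured with it, so augmentations can be re-rendered live."""
    image: np.ndarray            # H x W x 3 still frame
    cam_from_world: np.ndarray   # 4 x 4 rigid transform at capture time
    K: np.ndarray                # 3 x 3 pinhole intrinsics

def project_augmentation(snap: Snapshot, p_world: np.ndarray):
    """Project a (possibly moving) virtual 3D point into the snapshot. Because
    the *stored* pose is used, the augmentation stays registered with the
    frozen image even after the user's real camera has moved elsewhere."""
    p_cam = snap.cam_from_world @ np.append(p_world, 1.0)  # world -> camera
    if p_cam[2] <= 0:
        return None                                        # behind the snapshot camera
    uvw = snap.K @ p_cam[:3]
    return uvw[:2] / uvw[2]                                # pixel coordinates

# Usage: each frame, re-draw a virtual marker at its *current* world position
# onto the stored still image, keeping the snapshot interactive.
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
snap = Snapshot(image=np.zeros((480, 640, 3), np.uint8),
                cam_from_world=np.eye(4), K=K)
print(project_augmentation(snap, np.array([0.1, 0.0, 2.0])))  # -> [360. 240.]
```

Rendering augmentations this way, rather than baking them into the saved image, is what lets a snapshot be "virtually revisited": virtual objects can still be selected and manipulated from the stored viewpoint.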
6

The Seven Ages of Susanna: Immersing in the Narrative Through Augmented Reality

Unknown Date (has links)
A story is a sequence of events. Since nomadic times we have been drawn to the process of storytelling and to the underlying themes hidden within its plots. Now, as technological advancements in new media lead us to this point, there is a need to reconcile the narrative with new media. Many theorists, such as Manovich, believe the narrative is slowly dying as new media continues to evolve, while others, such as Bolter and Grusin, think the story and traditional media are merely reinserting themselves into new media. In the augmented reality story The Seven Ages of Susanna, I seek to create a marriage of conventional media narrative and illustration techniques. By using the new media tools Vuforia and Unity, I aim to create an immersive experience that reconciles this issue. / Includes bibliography. / Thesis (M.F.A.)--Florida Atlantic University, 2019. / FAU Electronic Theses and Dissertations Collection
7

Handelsbranschers användning av Augmented reality : En studie om dess möjligheter, utveckling och användning [The retail sector's use of augmented reality: A study of its possibilities, development, and use]

Hamrén, Oskar January 2010 (has links)
Augmented reality is an interesting technology which has increased in popularity over the last couple of years. Even though we see more and more of this technology, its use doesn't match up to its true potential. This paper aims to investigate the possibilities of this technology and illustrate how companies can use it in their business to reach out to customers in a new and exciting way. The study consists of three parts. The first part explores what companies have done with augmented reality to date, which companies are of interest for this technology, and how they can use it. The second part investigates who the user of augmented reality is today and who tomorrow's user will be. The third part goes through the technology behind augmented reality, investigating what is possible to do today and which techniques are the most common. These three steps are then combined to form the foundation of a prototype that illustrates how a business can adopt augmented reality. Interviews with potential users reveal that these kinds of artifacts are wanted by the public, for example to visualize products from the internet in the customer's own home environment. Conclusions from the literature study and from the development of the prototype show that augmented reality has a number of problems that need to be solved before it will be accepted by a wider public audience. These problem areas range from camera quality to the standardization of markers, but, like all new technology, these problems have solutions, and eventually augmented reality will be a natural part of our lives.
8

Augmented reality visualisation for mobile robot developers

Collett, Toby H. J. January 2007 (has links)
Developer interactions with robots during the testing and debugging phases of robot development are more complex than, and distinct from, general software application development. One of the primary differences is the need to understand the robot's view of the environment and the inconsistencies between this and the actual environment. Augmented reality (AR) provides an ideal way to achieve this, allowing robot program data to be displayed in context with the real world. This allows for easy comparison by the developer, highlighting the cause of any bugs in the robot's behaviour. An AR debugging space is created in this work that allows the developer to have this enhanced understanding of the robot's world-view, thus improving developer efficiency. Over the past decade robots have begun to move out of industrial assembly lines and into environments that must be shared with human users. Many of the tasks that we wish robots to perform in these environments require close interaction and collaboration with human users. The move away from the constrained environment of a production line means that the tasks required of robots are more varied, their operating environment is far more complex and unpredictable, and safety can no longer be achieved through isolation of the robot. The result of these influences has been to change robot programming from the simple task of instructing the robot to perform a sequence of steps to an open-ended challenge of specifying dynamic interactions, a challenge that robot developers are still coming to terms with. Robot development is more than just design and code entry, and a broader approach to improving robot development is needed. One of the founding principles of this thesis is that robot development should be approached as a human-robot interaction issue; this applies particularly to the testing and debugging phases of development. The nature of the robot platform, the tasks the robot is required to perform, and the environments that robots work within are significantly different from those of the desktop application. Hence robot developers need a tailored tool chain that focuses on this unique combination of issues. Current robot programming research is dominated by robot APIs and frameworks, leaving support tools to be developed in an ad hoc manner by developers as features are required. This leads to disjointed tools that have minimal feature sets and generally poor portability when applied to other robot developments. This work examines the needs of the developer in terms of a general-purpose robot visualisation tool. One of the fundamental requirements of such a tool is that a set of stock visualisations must be available to the developer. A prerequisite to providing these is to have a set of standard interfaces to provide the visualisations for. The open-source robot framework Player/Stage was used throughout this work to provide standardised access to robot hardware. As part of this research the author contributed heavily to the Player/Stage project, particularly as one of the key developers of the 2.0 release of Player. This new release simplifies Player development and increases the ease of maintenance of Player drivers and the efficiency of the server core. To evaluate the benefits of AR visualisation, an intelligent debugging space was developed, which runs as a permanent installation in a robotic development lab, providing an enhanced view of the robot's behaviour to the developer.
The space is capable of automatically detecting the presence of robots and displaying visualisations of the robot's standard interfaces, such as its sensors and effectors. The debugging space also allows the developer to create custom renderings, leveraging the developer's ability to determine the most salient items of their code and display these. A set of representative case studies was carried out using the debugging space for testing and debugging. These studies showed that AR provides an opportunity to understand the types of errors that are encountered during debugging. Debugging is essentially a process of elimination, and by understanding the type of error, developers can quickly eliminate large sets of potential bug sources, focusing on the sections of code that are causing the bug and therefore substantially reducing debugging time. The implemented system also shows that AR provides an important stepping stone between simulation environments and the real world. This thesis contributes the novel approach of applying AR to developer interactions with robots. The use of AR has been shown to have significant benefits for the robot developer, enhancing their understanding of the robot's world-view and hence reducing debugging time. As part of the work, a flexible AR visualisation tool was developed with close integration with the Player/Stage project. This tool creates an intelligent debugging space where developers can exploit the benefits of AR visualisation with minimal overhead. / New Zealand Tertiary Education Commission through the Top Achiever Doctoral scholarship
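The central operation in such a debugging space is projecting the robot's own sensor data, as published over the standard interfaces, into the developer's view of the lab so that the robot's beliefs can be compared against reality in place. The sketch below illustrates that pipeline for a 2D laser scan; it is an illustrative reconstruction rather than code from the thesis, and the function names, fixed scanner height, and identity camera pose are placeholder assumptions standing in for real calibration data.

```python
import numpy as np

def laser_to_world(robot_pose, ranges, angles):
    """Turn a 2D laser scan (ranges in metres, beam angles in radians, both in
    the robot frame) into world-frame points, given the robot's reported
    planar pose (x, y, heading theta)."""
    x, y, theta = robot_pose
    beam = angles + theta
    return np.stack([x + ranges * np.cos(beam),
                     y + ranges * np.sin(beam),
                     np.full_like(ranges, 0.3)],  # assumed 0.3 m scanner height
                    axis=1)

def project_to_camera(points_w, cam_from_world, K):
    """Project world points into the lab camera that renders the AR overlay."""
    ph = np.hstack([points_w, np.ones((len(points_w), 1))])
    pc = (cam_from_world @ ph.T).T[:, :3]
    pc = pc[pc[:, 2] > 0]             # keep only points in front of the camera
    uv = (K @ pc.T).T
    return uv[:, :2] / uv[:, 2:3]

# Usage: overlay what the robot *thinks* it senses onto the real scene; a gap
# between the drawn points and the real obstacles localises the bug at a glance.
angles = np.linspace(-np.pi / 2, np.pi / 2, 181)
ranges = np.full(181, 2.0)            # stand-in readings: everything at 2 m
K = np.array([[700.0, 0, 320], [0, 700.0, 240], [0, 0, 1]])
# Identity extrinsics used only to keep the example self-contained; a real
# installation would use the calibrated pose of the lab's overlay camera.
pixels = project_to_camera(laser_to_world((1.0, 0.0, 0.0), ranges, angles),
                           np.eye(4), K)
print(pixels.shape)                   # (181, 2)
```

Because the overlay consumes only interface-level data (a pose and a list of ranges), a stock visualisation like this works for any robot exposing the corresponding standard interface, which is what makes a reusable set of stock visualisations feasible.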
