  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
11

Augmented reality visualisation for mobile robot developers

Collett, Toby H. J. January 2007 (has links)
Developer interactions with robots during the testing and debugging phases of robot development are more complex than, and distinct from, those of general software application development. One of the primary differences is the need to understand the robot's view of the environment and the inconsistencies between this and the actual environment. Augmented reality (AR) provides an ideal way to achieve this, allowing robot program data to be displayed in context with the real world. This allows easy comparison by the developer, highlighting the cause of any bugs in the robot's behaviour. An AR debugging space is created in this work that gives the developer this enhanced understanding of the robot's world-view, thus improving developer efficiency. Over the past decade robots have begun to move off industrial assembly lines and into environments that must be shared with human users. Many of the tasks that we wish robots to perform in these environments require close interaction and collaboration with human users. The move away from the constrained environment of a production line means that the tasks required of robots are more varied, their operating environment is far more complex and unpredictable, and safety can no longer be achieved through isolation of the robot. These influences have changed robot programming from the simple task of instructing the robot to perform a sequence of steps into an open-ended challenge of specifying dynamic interactions, one that robot developers are still coming to terms with. Robot development is more than just design and code entry, and a broader approach to improving it is needed. One of the founding principles of this thesis is that robot development should be approached as a human-robot interaction issue; this applies particularly to the testing and debugging phases of development. 
The nature of the robot platform, the tasks the robot is required to perform and the environments that robots work within are significantly different from those of the desktop application. Hence robot developers need a tailored tool chain that focuses on this unique combination of issues. Current robot programming research is dominated by robot APIs and frameworks, leaving support tools to be developed in an ad hoc manner by developers as features are required. This leads to disjointed tools with minimal feature sets, tools that generally port poorly to other robot developments. This work examines the needs of the developer in terms of a general-purpose robot visualisation tool. One of the fundamental requirements of such a tool is that a set of stock visualisations be available to the developer, and a prerequisite for these is a set of standard interfaces against which the visualisations can be built. The open source robot framework Player/Stage was used throughout this work to provide standardised access to robot hardware. As part of this research the author contributed heavily to the Player/Stage project, particularly as one of the key developers of the 2.0 release of Player. This release simplifies Player development, makes Player drivers easier to maintain and improves the efficiency of the server core. To evaluate the benefits of AR visualisation, an intelligent debugging space was developed, running as a permanent installation in a robot development lab and providing the developer with an enhanced view of the robot's behaviour. The space automatically detects the presence of robots and displays visualisations of their standard interfaces, such as sensors and effectors. It also allows the developer to create custom renderings, leveraging the developer's ability to determine the most salient items of their code and display those. 
A set of representative case studies was carried out using the debugging space for testing and debugging. These studies showed that AR helps developers understand the type of error encountered during debugging. Debugging is essentially a process of elimination, and by understanding the type of error developers can quickly rule out large sets of potential bug sources, focusing on the sections of code actually causing the bug and therefore substantially reducing debugging time. The implemented system also shows that AR provides an important stepping stone between simulation environments and the real world. This thesis contributes the novel approach of applying AR to developer interactions with robots. The use of AR has been shown to have significant benefits for the robot developer, enhancing their understanding of the robot's world-view and hence reducing debugging time. As part of the work a flexible AR visualisation tool was developed, closely integrated with the Player/Stage project. This tool creates an intelligent debugging space in which developers can exploit the benefits of AR visualisation with minimal overhead. / New Zealand Tertiary Education Commission through the Top Achiever Doctoral scholarship
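The abstract above does not give implementation detail, but its core idea — projecting robot-frame sensor data into the developer's camera view so it can be compared against reality — can be sketched with simple frame transforms. The sketch below assumes an overhead lab camera and invents its own function names, pixel scale and origin; none of this is Collett's actual API.

```python
import math

def laser_hit_robot_frame(rng, bearing):
    """Convert one laser return (range in m, bearing in rad) to a point
    in the robot's own coordinate frame."""
    return rng * math.cos(bearing), rng * math.sin(bearing)

def robot_to_world(x, y, rx, ry, rtheta):
    """Rotate/translate a robot-frame point into the world frame using
    the robot pose (rx, ry, rtheta) reported by localisation."""
    return (rx + x * math.cos(rtheta) - y * math.sin(rtheta),
            ry + x * math.sin(rtheta) + y * math.cos(rtheta))

def world_to_pixel(wx, wy, px_per_m=100.0, origin=(0.0, 500.0)):
    """Map a world point to overhead-camera pixels (image y axis flipped)."""
    return origin[0] + wx * px_per_m, origin[1] - wy * px_per_m

def overlay_scan(scan, pose):
    """Project a whole scan [(range, bearing), ...] into pixel space so it
    can be drawn over the camera image of the real lab."""
    pts = []
    for rng, bearing in scan:
        x, y = laser_hit_robot_frame(rng, bearing)
        wx, wy = robot_to_world(x, y, *pose)
        pts.append(world_to_pixel(wx, wy))
    return pts

# Robot at (1 m, 0 m) facing +x; one laser return straight ahead at 2 m.
print(overlay_scan([(2.0, 0.0)], (1.0, 0.0, 0.0)))  # [(300.0, 500.0)]
```

A mismatch between these projected points and the real obstacle in the camera image is exactly the robot/world inconsistency the debugging space is meant to expose.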
14

Affine region tracking and augmentation using MSER and adaptive SIFT model generation : a thesis /

Marano, Matthew James. Slivovsky, Lynne A. January 1900 (has links)
Thesis (M.S.)--California Polytechnic State University, 2009. / Title from PDF title page; viewed on June 30, 2009. "June 2009." "In partial fulfillment of the requirements for the degree [of] Master of Science in Electrical Engineering." "Presented to the faculty of California Polytechnic State University, San Luis Obispo." Major professor: Lynne Slivovsky Ph.D. Includes bibliographical references (p. 123).
15

A patient position guidance system in radiotherapy using augmented reality : a thesis submitted in partial fulfilment of the requirements for the degree of Master of Science in the University of Canterbury /

Talbot, James January 1900 (has links)
Thesis (M. Sc.)--University of Canterbury, 2009. / Typescript (photocopy). Includes bibliographical references (p. 76-82). Also available via the World Wide Web.
16

Finding an adequate escape pod to real time augmented reality applications

Marcelo Xavier Natário Teixeira, João 31 January 2009 (has links)
Marcelo Xavier Natário Teixeira, João; Kelner, Judith. Finding an adequate escape pod to real time augmented reality applications. 2009. Master's dissertation, Programa de Pós-Graduação em Ciência da Computação, Universidade Federal de Pernambuco, Recife, 2009. / Conselho Nacional de Desenvolvimento Científico e Tecnológico
17

The Effect of an Occluder on the Accuracy of Depth Perception in Optical See-Through Augmented Reality

Hua, Chunya 15 August 2014 (has links)
Three experiments were conducted to study the effect of an occluder on the accuracy of near-field depth perception in optical see-through augmented reality (AR). The first experiment replicated the experiment of Edwards et al. [2004]. We found more accurate results than Edwards et al.'s work, and found no main effect of the occluder, nor a two-way interaction between occluder and distance, on the accuracy of observers' depth matching. The second experiment was an updated version of the first, using a within-subject design and a more accurate calibration method. Errors ranged from –5 to 3 mm when the occluder was present and from –3 to 2 mm when it was absent, and observers judged the virtual object to be closer after the presentation of the occluder. The third experiment was conducted on three subjects who were depth perception researchers; its results showed significant individual effects.
18

Preferred Amounts of Virtual Image Sharpening in Augmented Reality Applications using the Sharpview Algorithm

Cook, Henry Ford 11 August 2017 (has links)
This thesis attempts to quantify generally preferred amounts of virtual image sharpening in augmented reality applications. A preferred amount of sharpening is sought in an effort to alleviate the eye fatigue, and other negative symptoms, caused by accommodation switching between virtual images and real objects in augmented reality (AR) systems. This is an important area of research within the AR world given the many AR applications that supplement the real world with virtual information, often in the form of virtual text for users to read. An experiment in which human subjects chose between higher and lower sharpening amounts was run to expose preferred amounts of sharpening, or patterns in the chosen amounts, in relation to three variables: virtual text accommodative distance, real text accommodative distance, and the object of focus (real or virtual). The results of this experimentation may benefit future AR research and implementations, specifically in how they handle users switching focus.
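The abstract does not spell out the SharpView algorithm itself, but the family it belongs to — pre-sharpening a virtual image so that legible detail survives accommodative defocus blur — is conventionally built on unsharp masking. A minimal 1D sketch under that assumption follows; the `gain` parameter is an illustrative stand-in for the preferred sharpening amount the experiment asks subjects to choose, not a parameter taken from the thesis.

```python
def box_blur(signal, radius=1):
    """Simple moving-average blur, standing in for accommodative defocus."""
    n = len(signal)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

def unsharp_mask(signal, gain=1.0, radius=1):
    """Pre-sharpen: add back `gain` times the detail the blur removes,
    so the image still reads as sharp after the eye's blur is applied."""
    blurred = box_blur(signal, radius)
    return [s + gain * (s - b) for s, b in zip(signal, blurred)]

edge = [0.0, 0.0, 1.0, 1.0]          # a step edge, e.g. a virtual text stroke
print(unsharp_mask(edge, gain=2.0))  # overshoot/undershoot around the edge
```

Choosing between "higher and lower sharpening amounts", as the subjects do, corresponds to comparing two values of `gain` on the same stimulus.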
19

A Pilot Study for Identifying Tasks and Degrees of Visual Fidelity for Applications of Head Mounted Display Systems for Construction

Soto, Cecilia Irene 14 September 2017 (has links)
The rise of the technology and its reduced costs have led to more research on the use of Augmented Reality (AR). However, applications for AR Head Mounted Display (HMD) systems are still being defined. AR HMD systems have the potential to help users interact with and experience information in a way that could improve their performance. In the construction sector, workers use black-and-white drawings at construction level of detail for assembly and inspection tasks. For this thesis, the Microsoft HoloLens was used in an experiment to examine the effects of AR models on user performance and comprehension. There were three conditions in this study: two used AR model displays and the third used a traditional paper drawing of the model. The study measured participants' accuracy and comprehension of the model presented to them. The conclusion of this thesis is that using 3D AR models may improve participants' comprehension of construction drawings. / Master of Science
20

MARCS: Mobile Augmented Reality for Cybersecurity

Mattina, Brendan Casey 19 June 2017 (has links)
Network analysts have long used two-dimensional security visualizations to make sense of network data. As networks grow larger and more complex, two-dimensional visualizations become more convoluted, potentially compromising user situational awareness of cyber threats. To combat this problem, augmented reality (AR) can be employed to visualize data within a cyber-physical context to restore user perception and improve comprehension; thereby, enhancing cyber situational awareness. Multiple generations of prototypes, known collectively as Mobile Augmented Reality for Cyber Security, or MARCS, were developed to study the impact of AR on cyber situational awareness. First generation prototypes were subjected to a formative pilot study of 44 participants, to generate user-centric performance data and feedback, which motivated the design and development of second generation prototypes and provided initial insight into the potentially beneficial impact of AR on cyber situational awareness. Second generation prototypes were subjected to a summative secondary study by 50 participants, to compare the impact of AR and non-AR visualizations on cyber situational awareness. Results of the secondary study suggest that employing AR to visualize cyber threats in a cyber-physical context collectively improves user threat perception and comprehension, indicating that, in some cases, AR security visualizations improve user cyber situational awareness over non-AR security visualizations. / Master of Science
