11 |
Augmented reality visualisation for mobile robot developers / Collett, Toby H. J. January 2007 (has links)
Developer interactions with robots during the testing and debugging phases of robot development are more complex than, and distinct from, general software application development. One of the primary differences is the need to understand the robot's view of the environment and the inconsistencies between this and the actual environment. Augmented reality (AR) provides an ideal way to achieve this, allowing robot program data to be displayed in context with the real world. This allows for easy comparison by the developer, highlighting the cause of any bugs in the robot's behaviour. An AR debugging space is created in this work that gives the developer this enhanced understanding of the robot's world-view, thus improving developer efficiency.

Over the past decade robots have begun to move out of industrial assembly lines and into environments that must be shared with human users. Many of the tasks that we wish robots to perform in these environments require close interaction and collaboration with human users. The move away from the constrained environment of a production line means that the tasks required of robots are more varied, their operating environment is far more complex and unpredictable, and safety can no longer be achieved through isolation of the robot. The result of these influences has been to change robot programming from the simple task of instructing the robot to perform a sequence of steps into an open-ended challenge of specifying dynamic interactions, one that robot developers are still coming to terms with. Robot development is more than just design and code entry, and a broader approach to improving it is needed.

One of the founding principles of this thesis is that robot development should be approached as a human-robot interaction issue; this applies particularly to the testing and debugging phases of development. The nature of the robot platform, the tasks the robot is required to perform and the environments that robots work within are significantly different from those of the desktop application, so robot developers need a tailored tool chain that focuses on this unique combination of issues. Current robot programming research is dominated by robot APIs and frameworks, leaving support tools to be developed in an ad hoc manner by developers as features are required. This leads to disjointed tools with minimal feature sets; tools that generally have poor portability when applied to other robot developments.

This work examines the needs of the developer in terms of a general-purpose robot visualisation tool. One of the fundamental requirements of such a tool is that a set of stock visualisations must be available to the developer, and a prerequisite for providing these is a set of standard interfaces to build the visualisations against. The open source robot framework Player/Stage was used throughout this work to provide standardised access to robot hardware. As part of this research the author contributed heavily to the Player/Stage project, particularly as one of the key developers of the 2.0 release of Player. This release simplifies Player development, eases the maintenance of Player drivers and improves the efficiency of the server core.

To evaluate the benefits of AR visualisation, an intelligent debugging space was developed that runs as a permanent installation in a robotics development lab, providing an enhanced view of the robot's behaviour to the developer. The space automatically detects the presence of robots and displays visualisations of the robot's standard interfaces, such as its sensors and effectors. The debugging space also allows the developer to create custom renderings, leveraging the developer's ability to determine the most salient items of their code and display these.

A set of representative case studies was carried out using the debugging space for testing and debugging. These studies showed that AR provides an opportunity to understand the types of errors encountered during debugging. Debugging is essentially a process of elimination; by understanding the type of error, developers can quickly eliminate large sets of potential bug sources and focus on the sections of code causing the bug, substantially reducing debugging time. The implemented system also shows that AR provides an important stepping stone between simulation environments and the real world.

This thesis contributes the novel approach of applying AR to developer interactions with robots. The use of AR has been shown to have significant benefits for the robot developer, enhancing their understanding of the robot's world-view and hence reducing debugging time. As part of the work a flexible AR visualisation tool was developed with close integration with the Player/Stage project. This tool creates an intelligent debugging space where developers can exploit the benefits of AR visualisation with minimal overhead. / New Zealand Tertiary Education Commission through the Top Achiever Doctoral scholarship
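To make the idea of a stock visualisation for a standard interface concrete, the following sketch (hypothetical, not the thesis's actual tool) uses the Player 2.x C++ client library to subscribe to a robot's standard laser interface and convert each range reading into a lab-frame point that an AR renderer could overlay on the camera view. The `DrawOverlayPoint` function and the fixed robot pose are assumptions for illustration only.

```cpp
// Hypothetical sketch: read a Player "laser" interface and emit points that an
// AR overlay could draw on top of the camera image.
#include <libplayerc++/playerc++.h>
#include <cmath>
#include <cstdio>

// Placeholder for the AR renderer; a real system would project (x, y) through
// the calibrated camera and draw into the video overlay.
void DrawOverlayPoint(double x, double y) {
  std::printf("overlay point at (%.2f, %.2f) m\n", x, y);
}

int main() {
  using namespace PlayerCc;
  PlayerClient robot("localhost", 6665);  // default Player server port
  LaserProxy laser(&robot, 0);            // standard laser interface, index 0

  // Assumed robot pose in the lab frame (in practice this comes from a tracker).
  const double rx = 0.0, ry = 0.0, rtheta = 0.0;

  for (int frame = 0; frame < 100; ++frame) {
    robot.Read();  // block until fresh data arrives from the server
    for (unsigned i = 0; i < laser.GetCount(); ++i) {
      double r = laser.GetRange(i);    // range in metres
      double b = laser.GetBearing(i);  // bearing in radians
      // Transform the scan point into the lab frame for overlay.
      double wx = rx + r * std::cos(rtheta + b);
      double wy = ry + r * std::sin(rtheta + b);
      DrawOverlayPoint(wx, wy);
    }
  }
  return 0;
}
```

Because every Player laser driver exposes the same interface, an overlay routine like this would work unchanged across robots, which is the point of building stock visualisations against standard interfaces.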
|
16 |
Affine region tracking and augmentation using MSER and adaptive SIFT model generation : a thesis / Marano, Matthew James. Slivovsky, Lynne A. January 1900 (has links)
Thesis (M.S.)--California Polytechnic State University, 2009. / Title from PDF title page; viewed on June 30, 2009. "June 2009." "In partial fulfillment of the requirements for the degree [of] Master of Science in Electrical Engineering." "Presented to the faculty of California Polytechnic State University, San Luis Obispo." Major professor: Lynne Slivovsky Ph.D. Includes bibliographical references (p. 123).
|
17 |
A patient position guidance system in radiotherapy using augmented reality : a thesis submitted in partial fulfilment of the requirements for the degree of Master of Science in the University of Canterbury / Talbot, James January 1900 (has links)
Thesis (M. Sc.)--University of Canterbury, 2009. / Typescript (photocopy). Includes bibliographical references (p. 76-82). Also available via the World Wide Web.
|
18 |
Finding an adequate escape pod to real time augmented reality applications / Marcelo Xavier Natário Teixeira, João 31 January 2009 (has links)
Previous issue date: 2009 / Conselho Nacional de Desenvolvimento Científico e Tecnológico / Marcelo Xavier Natário Teixeira, João; Kelner, Judith. Finding an adequate escape pod to real time augmented reality applications. 2009. Master's dissertation, Programa de Pós-Graduação em Ciência da Computação, Universidade Federal de Pernambuco, Recife, 2009.
|
19 |
The Effect of an Occluder on the Accuracy of Depth Perception in Optical See-Through Augmented Reality / Hua, Chunya 15 August 2014 (has links)
Three experiments were conducted to study the effect of an occluder on the accuracy of near-field depth perception in optical see-through augmented reality (AR). The first experiment replicated the experiment in Edwards et al. [2004]. We found more accurate results than Edwards et al. and found neither a main effect of the occluder nor a two-way interaction between occluder and distance on the accuracy of observers' depth matching. The second experiment was an updated version of the first, using a within-subjects design and a more accurate calibration method. Errors ranged from –5 to 3 mm when the occluder was present and from –3 to 2 mm when it was absent, and observers judged the virtual object to be closer after the occluder was presented. The third experiment was conducted with three subjects who were depth perception researchers; the results showed significant individual effects.
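As a minimal sketch of how the depth-matching errors quoted above might be tabulated, the following code computes the mean signed error per occluder condition from per-trial data; the trial values and field names are assumed for illustration, and a negative error means the virtual object was placed closer than the real reference.

```cpp
// Minimal sketch with assumed data: signed depth-matching error per trial,
// split by whether the occluding surface was present.
#include <cstdio>
#include <vector>

struct Trial {
  double true_mm;     // actual distance of the real reference object
  double matched_mm;  // distance at which the observer placed the virtual object
  bool occluder;      // whether the occluder was present
};

int main() {
  std::vector<Trial> trials = {
      {500.0, 497.0, true}, {500.0, 502.0, false}, {600.0, 595.0, true}};
  double sum[2] = {0.0, 0.0};
  int count[2] = {0, 0};
  for (const Trial& t : trials) {
    double err = t.matched_mm - t.true_mm;  // signed error in millimetres
    sum[t.occluder] += err;
    ++count[t.occluder];
  }
  std::printf("mean error, occluder absent:  %.1f mm\n", sum[0] / count[0]);
  std::printf("mean error, occluder present: %.1f mm\n", sum[1] / count[1]);
  return 0;
}
```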
|
20 |
Preferred Amounts of Virtual Image Sharpening in Augmented Reality Applications using the Sharpview Algorithm / Cook, Henry Ford 11 August 2017 (has links)
This thesis attempts to quantify generally preferred amounts of virtual image sharpening in augmented reality applications. This preferred amount of sharpening is sought in an effort to alleviate eye fatigue and other negative symptoms caused by accommodation switching between virtual images and real objects in augmented reality (AR) systems. This is an important area of research because many AR applications supplement the real world with virtual information, often in the form of virtual text for users to read. An experiment in which human subjects chose between higher and lower sharpening amounts was run to identify preferred amounts of sharpening, or patterns in the chosen amounts, in relation to three variables: virtual text accommodative distance, real text accommodative distance, and the object of focus (real or virtual). The results of this experimentation may benefit future AR research and implementations, specifically in how they handle users switching focus.
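The abstract does not describe the SharpView filter itself; purely as an illustration of what letting a subject choose between higher and lower sharpening amounts could mean in practice, the sketch below applies a plain unsharp mask with an adjustable gain to a grayscale image. This is a hypothetical stand-in, not the SharpView algorithm, which is based on a model of the eye's defocus blur.

```cpp
// Illustrative stand-in (NOT the SharpView algorithm): unsharp masking on a
// grayscale image, where `amount` is the gain a subject might raise or lower
// between trials.
#include <algorithm>
#include <cstddef>
#include <vector>

// 3x3 box blur used to build the unsharp mask; borders are clamped.
static std::vector<float> BoxBlur(const std::vector<float>& img, int w, int h) {
  std::vector<float> out(img.size());
  for (int y = 0; y < h; ++y) {
    for (int x = 0; x < w; ++x) {
      float sum = 0.0f;
      for (int dy = -1; dy <= 1; ++dy) {
        for (int dx = -1; dx <= 1; ++dx) {
          int sx = std::clamp(x + dx, 0, w - 1);
          int sy = std::clamp(y + dy, 0, h - 1);
          sum += img[sy * w + sx];
        }
      }
      out[y * w + x] = sum / 9.0f;
    }
  }
  return out;
}

// sharpened = original + amount * (original - blurred), clamped to [0, 1].
std::vector<float> Sharpen(const std::vector<float>& img, int w, int h,
                           float amount) {
  std::vector<float> blurred = BoxBlur(img, w, h);
  std::vector<float> out(img.size());
  for (std::size_t i = 0; i < img.size(); ++i) {
    out[i] = std::clamp(img[i] + amount * (img[i] - blurred[i]), 0.0f, 1.0f);
  }
  return out;
}
```

An experiment like the one described above would, in effect, present renderings produced with different `amount` values and record which level of sharpening subjects prefer under each combination of accommodative distances.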
|