1. Near-Field Depth Perception in See-Through Augmented Reality
Singh, Gurjot (07 August 2010)
This research studied egocentric depth perception in an augmented reality (AR) environment. Specifically, it measured depth perception in the near visual field, using quantitative methods to measure the depth relationships between real and virtual objects. The research had two goals: first, engineering a depth perception measurement apparatus and the related calibration and measuring techniques for collecting depth judgments, and second, testing its effectiveness by conducting an experiment. The experiment compared two complementary depth judgment protocols: perceptual matching (a closed-loop task) and blind reaching (an open-loop task). It also studied the effect of a highly salient occluding surface, which appeared behind, coincident with, and in front of the virtual objects. Finally, the experiment studied the relationship between dark vergence and depth perception.
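As a rough illustration of how judgments from the two protocols might be scored, the following sketch computes accuracy (mean signed error) and precision (error variability) per protocol; the trial data and metric choices are illustrative assumptions, not the dissertation's apparatus or analysis code.

```python
# Minimal sketch of scoring depth judgments from the two protocols.
# Assumes each trial records the true target depth and the depth the
# participant indicated (matched pointer depth or blind-reach endpoint).
# The data below are hypothetical.

from statistics import mean, stdev

def score_judgments(true_depths_cm, judged_depths_cm):
    """Return accuracy (mean signed error) and precision (SD of error)."""
    errors = [j - t for j, t in zip(judged_depths_cm, true_depths_cm)]
    return mean(errors), stdev(errors)

# Hypothetical near-field trials (depths in cm):
true_depths = [34.0, 38.0, 42.0, 46.0, 50.0]
matching = [33.1, 37.5, 42.8, 45.2, 49.4]   # closed-loop: perceptual matching
reaching = [31.0, 35.8, 39.9, 43.1, 46.7]   # open-loop: blind reaching

for name, judged in [("matching", matching), ("reaching", reaching)]:
    bias, spread = score_judgments(true_depths, judged)
    print(f"{name}: mean signed error = {bias:+.2f} cm, SD = {spread:.2f} cm")
```

A negative mean signed error would indicate underestimation, the pattern typically of interest when comparing closed-loop and open-loop protocols.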
2. X-ray vision at action space distances: depth perception in context
Phillips, Nate (09 August 2022)
Accurate and usable x-ray vision has long been a goal in augmented reality (AR) research and development. X-ray vision, the ability to comprehend location and object information when it is viewed through an opaque barrier, would be eminently useful in a variety of contexts, including industrial, disaster reconnaissance, and tactical applications. For x-ray vision to be a useful tool in many of these applications, it would need to extend operators' perceptual awareness of the task or environment. The effectiveness with which x-ray vision can do this is of significant research interest and is a determinant of its usefulness in an application context.
It is therefore crucial to evaluate the effectiveness of x-ray vision: how does information presented through x-ray vision compare to real-world information? This question requires narrowing, as x-ray vision suffers from inherent limitations analogous to viewing an object through a window. In both cases, information is presented beyond the local context, exists past an apparently solid object, and is limited by certain conditions. Further, in both cases, the naturally suggestive use cases occur over action space distances. These distances range from 1.5 to 30 meters and represent the area in which observers might contemplate immediate visually directed actions. Such actions, simple tasks with a visual antecedent, represent action potentials for x-ray vision; in effect, x-ray vision extends an operator's awareness and ability to visualize these actions into a new context.
Thus, this work seeks to answer the question "Can a real window be replaced with an AR window?" The evaluation focuses on perceived object location, investigated through a series of experiments using visually directed actions as experimental measures. This approach leverages established methodology, experimentally analyzing each of several distinct variables on a continuum between real-world depth perception and fully realized x-ray vision. It was found that a real window could not be replaced with an AR window without some loss of depth perception acuity and accuracy. However, no significant difference was found between a target viewed through an opaque wall and a target viewed through a real window.
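One plausible way to frame the window-replacement comparison statistically is sketched below; the data, condition means, and the choice of a Welch t-test are illustrative assumptions, not the dissertation's actual measurements or analysis.

```python
# Sketch: comparing judged target distances across viewing conditions
# (real window vs. AR window vs. target behind an opaque wall).
# All data are simulated; the effect sizes are invented for illustration.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
actual_m = 6.0  # hypothetical target distance within action space (1.5-30 m)

# Simulated visually directed action judgments (meters) per condition:
real_window = rng.normal(loc=5.9, scale=0.30, size=20)
ar_window   = rng.normal(loc=5.5, scale=0.45, size=20)
opaque_wall = rng.normal(loc=5.5, scale=0.50, size=20)

res = stats.ttest_ind(real_window, ar_window, equal_var=False)
print(f"real vs AR window: t = {res.statistic:.2f}, p = {res.pvalue:.4f}")

res = stats.ttest_ind(ar_window, opaque_wall, equal_var=False)
print(f"AR window vs opaque wall: t = {res.statistic:.2f}, p = {res.pvalue:.4f}")
```

Under this framing, a significant real-vs-AR difference alongside a non-significant AR-window-vs-opaque-wall difference would mirror the pattern of results the abstract reports.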
3. Design and Development of Mobile Image Overlay System for Image-Guided Interventions
Anand, Manjunath (26 June 2014)
Numerous studies have demonstrated the potential efficacy of percutaneous image-guided interventions over open surgical interventions. Conventional image-guided procedures are limited by the freehand technique, which requires mental 3D registration and hand-eye coordination for needle placement; the procedures consequently take longer and involve increased patient discomfort and high radiation exposure. Previously, a static image overlay system was proposed for aiding needle interventions, but certain drawbacks associated with the static system limited its clinical translation.
To overcome the ergonomic issues and long calibration time associated with the static system, an adjustable image overlay system was proposed. The system consisted of a monitor and a semi-transparent mirror attached together to an articulated mobile arm. A 90-degree mirror-monitor configuration was proposed to improve physician access around the patient, and a MicronTracker was integrated for dynamic tracking of the patient and the device. A novel method for auto-direct calibration of the virtual image overlay plane was proposed. However, the large mechanical structure limited precise movement and consumed useful space in the procedure room. A mobile image overlay system with reduced weight and smaller dimensions was therefore proposed to eliminate the need for the mechanical structure: a tablet computer and a beamsplitter were used as the display device and mirror, respectively. An image overlay visualization module for 3D Slicer was developed to project the correct image slice onto the tablet device.
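Conceptually, such a visualization module must resample the volume slice that coincides with the tracked virtual image plane. The sketch below shows that resampling step in plain numpy/scipy under assumed conventions; the pose matrix, helper name, and dimensions are hypothetical, and the actual module was implemented in 3D Slicer.

```python
# Sketch: resample the volume slice coincident with the tracked overlay plane.
# `plane_to_volume` maps overlay-plane coordinates (mm) to volume voxel
# indices; in the real system this transform would come from the
# tracker-based calibration. Names and conventions here are assumptions.

import numpy as np
from scipy.ndimage import map_coordinates

def overlay_slice(volume, plane_to_volume, width_mm=200, height_mm=150, px=1.0):
    """Resample the oblique slice of `volume` lying in the overlay plane."""
    us = np.arange(0, width_mm, px)
    vs = np.arange(0, height_mm, px)
    uu, vv = np.meshgrid(us, vs)
    # Homogeneous plane coordinates (u, v, 0, 1) for every display pixel:
    pts = np.stack([uu, vv, np.zeros_like(uu), np.ones_like(uu)], axis=0)
    vox = np.tensordot(plane_to_volume, pts.reshape(4, -1), axes=1)[:3]
    # Trilinear sampling of the volume at those (i, j, k) voxel coordinates:
    return map_coordinates(volume, vox, order=1, mode="constant").reshape(uu.shape)

# Hypothetical use: identity pose, synthetic volume.
vol = np.random.rand(256, 256, 100).astype(np.float32)
pose = np.eye(4)  # would come from the auto-direct calibration in practice
img = overlay_slice(vol, pose)
print(img.shape)
```

Re-running this resampling whenever the tracker reports a new pose is what lets the displayed slice follow the tablet as it is moved over the patient.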
The system weight was reduced to 1 kg, and the image overlay plane tracking precision (0.11 mm, SD = 0.05 mm) was similar to that of printed physical markers. Auto-calibration of the image overlay plane can be done in two simple steps, away from the patient table and without an additional phantom. Building on the successful pre-clinical testing of the earlier static system, the mobile image overlay system, with its reduced weight, increased tracking precision, and easier maneuverability, could be hand-held by the physician to explore the image volume over the patient and could be used for a wide range of procedures. The mobile image overlay system would be classified as a Class II device under FDA regulations, which does not require extensive verification and validation efforts and further improves the commercialization opportunities.
Thesis (Master, Mechanical and Materials Engineering), Queen's University, 2014.