11
Intelligent support for knowledge capture and construction. Maguitman, Ana Gabriela. January 2004 (has links)
Thesis (Ph.D.)--Indiana University, Dept. of Computer Science, 2004. / Source: Dissertation Abstracts International, Volume: 66-01, Section: A, page: 0012. Chairman: David B. Leake. Title from dissertation home page (viewed Oct. 12, 2006).
12
Mind-craft: Exploring the relation between "digital" visual experience and orientation in visual contour perception. Hipp, Daniel. 26 January 2016 (has links)
Visual perception depends fundamentally on statistical regularities in the environment to make sense of the world. One such regularity is the orientation anisotropy typical of natural scenes; most natural scenes contain slightly more horizontal and vertical information than oblique information. This property is likely a primary cause of the “oblique effect” in visual perception, in which subjects experience greater perceptual fluency with horizontally and vertically oriented content than with oblique content. However, recent changes in the visual environment, including the “carpentered” content of urban scenes and the framed, caricatured content of digital screen media presentations, may have altered the level of orientation anisotropy typical of natural scenes. Over a series of three experiments, the current work aims to evaluate whether “digital” visual experience, or visual experience with framed digital content, has the potential to alter the magnitude of the oblique effect in visual perception. Experiment 1 established a novel eye-tracking method developed to index the visual oblique effect quickly and reliably using no overt responding other than eye movements. Results indicated that canonical (horizontal and vertical) contours embedded in visual noise were detected more accurately and more quickly than oblique contours. For Experiment 2, the orientation anisotropy of natural, urban, and digital scenes was analyzed in order to compare the magnitude of this anisotropic pattern across image types. Results indicate that urban scenes contain exaggerated orientation anisotropy relative to natural scenes, and that digital scenes possess this pattern to an even greater extent. Building on these two results, Experiment 3 adopted the eye-tracking method of Experiment 1 as a pre- and post-test measure of the oblique effect. Participants were eye tracked in the contour detection task several times before and after either a “training” session, in which they played Minecraft (Mojang, 2011) for four hours uninterrupted in a darkened room, or a “control” session, in which they simply did not interact with screens for four hours. It was predicted, based on the results of Experiment 2, that several hours of exposure to the caricatured orientation statistics of the digital stimulus would suffice to alter the magnitude of participants’ oblique effect, as indexed by the difference between the post-test and the pre-test. While no accuracy differences were observed in this primary manipulation, detection speed for canonical contours did change significantly in the Minecraft participants relative to controls. These results indicate that the oblique effect is quite robust at the level of visual contours and is measurable using eye tracking, that digital scenes contain caricatured orientation anisotropy relative to other types of scenes, and that exposure to naturalistic but caricatured scene statistics for only a few hours can alter certain aspects of visual perception.
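Methodological note (illustrative, not drawn from the dissertation): the orientation anisotropy analyzed in Experiment 2 is commonly estimated by comparing image energy near the canonical orientations (horizontal and vertical) with energy near the obliques. The sketch below shows one such gradient-based estimate; the function name, tolerance parameter, and libraries are assumptions chosen for illustration and are not claimed to be the analysis pipeline used in the thesis.

```python
# Illustrative sketch (assumed approach, not the dissertation's code): estimate
# orientation anisotropy as the ratio of gradient energy near the canonical
# orientations (0 deg / 90 deg) to gradient energy near the obliques (45 / 135 deg).
import numpy as np
from scipy import ndimage
from imageio.v3 import imread

def orientation_anisotropy(path, tol_deg=10.0):
    """Return the canonical-to-oblique gradient-energy ratio for one image."""
    img = imread(path).astype(float)
    if img.ndim == 3:                    # collapse color channels to luminance
        img = img.mean(axis=2)
    gy = ndimage.sobel(img, axis=0)      # derivative along rows (vertical change)
    gx = ndimage.sobel(img, axis=1)      # derivative along columns (horizontal change)
    mag = np.hypot(gx, gy)               # gradient magnitude
    # Gradient direction is perpendicular to contour orientation, which leaves the
    # canonical-vs-oblique grouping unchanged. Fold angles into the 0-180 deg range.
    ang = np.degrees(np.arctan2(gy, gx)) % 180.0

    def near(a, center):
        # circular distance on the 180-degree orientation circle
        d = np.abs(a - center)
        return np.minimum(d, 180.0 - d) <= tol_deg

    canonical = near(ang, 0.0) | near(ang, 90.0)    # horizontal + vertical structure
    oblique = near(ang, 45.0) | near(ang, 135.0)    # oblique structure
    return mag[canonical].sum() / (mag[oblique].sum() + 1e-12)

# Values greater than 1 indicate more horizontal/vertical than oblique content,
# the pattern the abstract describes as exaggerated in urban and digital scenes.
# print(orientation_anisotropy("scene.png"))
```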
13
Human Systems Integration of an Extravehicular Activity Space Suit Augmented Reality Display System. Mitra, Paromita. 28 August 2018 (has links)
During an extravehicular activity (EVA), the role of an astronaut involves a multitude of complex tasks. Whether that task is a science experiment aboard the International Space Station or traversing extraterrestrial terrain, attention, communication, and instruction are essential. As an aid, augmented reality (AR) can portray suit informatics and procedures within the line of sight while minimizing attentional loss. Currently, there exists little research highlighting the human systems considerations needed to qualify AR systems for space suit applications. This study quantifies user interface (UI) and human performance measures for an AR prototype on the Mark III space suit. For user testing, 21 military pilots and personnel (11 men, 10 women) evaluated UI search tasks and completed a series of AR-instructed EVA dexterity tasks under elevated luminosity, background clutter, and workload conditions. UI results suggest correlations between readability and usability, whereas human performance results provide situational awareness, workload, and task performance data.