About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Relation between Hazard Perception and Visual Behaviour among Older Drivers / Förhållandet mellan riskuppfattning och visuellt beteende bland äldre förare

Eriksson Thörnell, Emelie January 2010
The hazard perception test developed by Sagberg and Bjornskau (2006), which measures reaction times to different hazardous situations in traffic, was used in the present study to analyze older drivers' visual behaviour while taking and responding to the test.

The overall objective of this study was to investigate the relation between hazard perception in traffic and visual behaviour among older drivers in comparison with a younger age group. The purpose was to provide knowledge on which traffic situations are more difficult for older drivers to interpret or perceive as hazardous. The elderly were expected to have more problems in situations that included objects classified as context hazards. Context hazards consist of objects moving slowly at the side of the road, a situation in which the driver should be prepared for the potential behaviour of that object.

The study comprised two groups of drivers, one group of middle-aged drivers, 35-55 years old, and one group of older drivers, 65 years old and above, who performed the hazard perception test while wearing an eye tracker. Hazard interpretation level within age group and situation was investigated, and eye movement data were analyzed in terms of fixation duration.

Overall, the results showed that the older participants had more problems interpreting situations classified as context hazards as risky, especially context hazards consisting of pedestrians or cyclists, although these differences were not statistically significant. When total fixation time on the hazard objects was investigated, the difference between age groups was significant for one of the situations consisting of pedestrians, classified as a context/hidden hazard. No significant differences between age groups were found in any of the other situations.

The conclusion is that the elderly tentatively should be exposed to context hazards composed of pedestrians or cyclists in future training schemes. Since there were no significant differences between age groups in hazard interpretation, however, more research is needed in the area. Also, since the class of context/hidden hazards, which showed significant differences in fixation time between age groups, was composed of only one situation, similar situations should be investigated in order to verify these differences.
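The analysis itself is not given in the abstract; purely as a hypothetical sketch of the kind of comparison it describes (total fixation time on a hazard object, compared between age groups for each situation), something like the following could be used. The input file, column names and the choice of Welch's t-test are assumptions for illustration, not the study's actual method.

```python
# A minimal sketch (not the study's actual analysis pipeline): comparing total
# fixation time on a hazard object between a middle-aged and an older group,
# assuming each row is one fixation with hypothetical columns
# 'participant', 'group', 'situation', 'on_hazard', and 'duration_ms'.
import pandas as pd
from scipy import stats

fixations = pd.read_csv("fixations.csv")  # hypothetical export from the eye tracker

# Total fixation time on the hazard object, per participant and situation.
on_hazard = fixations[fixations["on_hazard"]]
totals = (
    on_hazard
    .groupby(["situation", "group", "participant"])["duration_ms"]
    .sum()
    .reset_index(name="total_fixation_ms")
)

# Independent-samples t-test per situation (Welch's, unequal variances assumed).
for situation, data in totals.groupby("situation"):
    older = data.loc[data["group"] == "older", "total_fixation_ms"]
    middle = data.loc[data["group"] == "middle-aged", "total_fixation_ms"]
    t, p = stats.ttest_ind(older, middle, equal_var=False)
    print(f"{situation}: t = {t:.2f}, p = {p:.3f}")
```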
2

Examining expertise through eye movements : a study of clinicians interpreting electrocardiograms

Davies, Alan January 2018
The electrocardiogram (ECG) is a graphical representation of the electrical activity of the heart. The 12-lead ECG shows this activity in 12 "views" called "leads", relative to the location of sensors attached to the body surface. The ECG is a routinely applied, cost-effective diagnostic medical test, utilised in healthcare settings around the world. Although more than three hundred million ECGs are recorded each year, correctly interpreting them is considered a complex task. Failure to make correct interpretations can lead to injury or death, and costs vast sums in litigation payments. Many automated attempts at interpreting ECGs have been implemented and continue to be developed and improved. Despite this, automated methods are still considered less reliable than expert human interpretation.

As ECG interpretation is both a cognitive and a visual task, eye-tracking holds great potential as an investigative methodology. This thesis aims to identify any cues in visual behaviour that may indicate differences in subsequent ECG interpretation accuracy. It is the first work that uses eye-tracking to analyse how practitioners interpret ECGs as a function of accuracy. In order to investigate these phenomena, several experiments were carried out using eye movements captured from clinical practitioners who interpret ECGs as part of their usual clinical role.

The findings presented in this thesis have advanced the understanding of ECG interpretation. Specifically: clinical history makes a difference to how people look at ECGs, and different gaze patterns are often found in accurate and inaccurate interpretation groups. Grouping data to account for within-lead behaviour (eye-movement patterns within a lead) is more revealing than analysis at the level of the lead (eye movements between leads), and the findings suggest that analysing visual behaviour at this level is crucial in order to detect such behaviour in ECG interpretation.

Further to this, the thesis presents eye-tracking techniques that can be applied to wider areas of task performance. These methods work over complex stimuli, deal post hoc with differently sized groups, and generate appropriate areas of interest on a stimulus. They detect important differences in eye-movement behaviour between groups that are missed when applying standard inferential statistical techniques.
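The thesis's own AOI-generation methods are not described in the abstract; as a rough, hypothetical illustration of the per-lead analysis it refers to, the sketch below assigns fixations to lead-shaped areas of interest and sums dwell time per lead. The AOI coordinates, data format and function names are all assumptions.

```python
# A minimal sketch (not the thesis's actual method): assigning fixations to
# areas of interest (AOIs) corresponding to ECG leads and summarising dwell
# time per lead. The AOI rectangles and the fixation tuple format are hypothetical.
from dataclasses import dataclass

@dataclass
class AOI:
    name: str
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, x: float, y: float) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

# Hypothetical AOIs: in practice these would be derived from the stimulus layout.
aois = [
    AOI("I", 0, 0, 250, 150), AOI("aVR", 250, 0, 500, 150),
    AOI("V1", 500, 0, 750, 150), AOI("V4", 750, 0, 1000, 150),
    # ... remaining leads omitted for brevity
]

def dwell_time_per_lead(fixations):
    """fixations: iterable of (x, y, duration_ms) tuples."""
    totals = {aoi.name: 0.0 for aoi in aois}
    for x, y, duration in fixations:
        for aoi in aois:
            if aoi.contains(x, y):
                totals[aoi.name] += duration
                break  # each fixation is counted in at most one AOI
    return totals

print(dwell_time_per_lead([(30, 40, 180.0), (600, 90, 220.0)]))
```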
3

Visualising the Visual Behaviour of Vehicle Drivers / Visualisering av visuellt beteende hos fordonsförare

Blissing, Björn January 2002
Most traffic accidents are caused by human factors. The design of the driver environment has proven essential to facilitate safe driving. With the advent of new devices such as mobile telephones, GPS navigation and similar systems, the workload on the driver has become even more complex. There is an obvious need for tools supporting objective evaluation of such systems, in order to design more effective and simpler driver environments.

At the moment, video is the most commonly used technique for capturing the driver's visual behaviour, but the analysis of these recordings is very time consuming and only gives an estimate of where the visual attention is. An automated tool for analysing visual behaviour would reduce the post-processing drastically and leave more time for understanding the data.

This thesis describes the development of a tool for visualising where the driver's attention is while driving the vehicle. This includes methods for playing back data stored on a hard drive, but also methods for joining data from multiple different sources.
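The abstract mentions joining data from multiple sources for playback; a minimal sketch of one common way to do this (aligning two time-stamped logs with a nearest-timestamp join in pandas) is given below. The file names, columns and 50 ms tolerance are assumptions, not details from the thesis.

```python
# A minimal sketch (not the thesis's actual implementation): joining
# time-stamped gaze data with vehicle data from a separate logger so that both
# can be played back together. File names and columns are hypothetical;
# timestamps are assumed to be integer milliseconds.
import pandas as pd

gaze = pd.read_csv("gaze_log.csv")        # e.g. columns: timestamp, gaze_x, gaze_y
vehicle = pd.read_csv("vehicle_log.csv")  # e.g. columns: timestamp, speed, steering_angle

# Both sources must be sorted by time before an as-of join.
gaze = gaze.sort_values("timestamp")
vehicle = vehicle.sort_values("timestamp")

# For each gaze sample, take the most recent vehicle sample no more than 50 ms older.
merged = pd.merge_asof(
    gaze, vehicle,
    on="timestamp",
    direction="backward",
    tolerance=50,  # same unit as 'timestamp', here assumed to be milliseconds
)

# The merged stream can then be replayed sample by sample for visualisation.
for row in merged.itertuples():
    print(row.timestamp, row.gaze_x, row.gaze_y, row.speed)
```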
4

Recreating Believability In NPCs: The Effects Of Visual And Logical Behaviour

Ohrberg, Simon January 2019
In many games, NPCs are the foundation on which the game operates. NPCs create the illusion of inhabitants or give purpose to the player; whatever they do, they represent characters. Introducing a character carries a certain margin of error, as a poorly portrayed character may cause more damage than the NPC would otherwise add, and without a believable environment, which includes the NPCs, immersion can be difficult to achieve. This thesis therefore investigates the effect of different behaviours in NPCs, with a focus on how visual and logical behaviours affect the players' perception of their believability.

The experiment was conducted in a game of the RTS survival genre. The visual behaviours were selected from the games Banished [1] and Frostpunk [2], while the logical behaviours were inspired by F.E.A.R [3]. In the experiment, testers evaluated three versions of the artifact: the first acted as a default and was used as the starting point for the second version, which introduced enhanced visual behaviours; the third continued from this and added predictive functionality as a logical behaviour. The tests concluded that the visual behaviours had a positive effect on perceived believability, but gave no conclusive evidence to suggest that the logical behaviour had the same effect.
5

Automatic Visual Behavior Analysis

Larsson, Petter January 2002
This work explores the possibilities of robust, noise-adaptive and automatic segmentation of driver eye movements into comparable quantities as defined in the ISO 15007 and SAE J2396 standards for in-vehicle visual demand measurements. Driver eye movements have many potential applications, from the detection of driver distraction, drowsiness and mental workload, to the optimization of in-vehicle HMIs. This work focuses on SeeingMachines' head and eye-tracking system SleepyHead (or FaceLAB), but is applicable to data from other similar eye-tracking systems. A robust and noise-adaptive hybrid algorithm, based on two different change detection protocols and facts about eye physiology, has been developed. The algorithm has been validated against data video-transcribed according to the ISO/SAE standards. This approach was highly successful, revealing correlations in the region of 0.999 between the analysis types, i.e. video transcription and the analysis developed in this work. In addition, a real-time segmentation algorithm with a unique initialization feature has been developed and validated based on the same approach.

This work enables real-time in-vehicle systems based on driver eye movements to be developed and tested in real driving conditions. Furthermore, it has augmented FaceLAB by providing a tool that can easily be used when analysis of eye movements is of interest, e.g. in HMI and ergonomics studies, analysis of warnings, driver workload estimation, etc.
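The abstract names a noise-adaptive hybrid segmentation algorithm but does not describe it; purely as an illustration of what segmenting gaze data into fixations and saccades involves, the sketch below implements a simple velocity-threshold (I-VT) rule. It is not the hybrid algorithm developed in the thesis, and the sample rate and threshold are assumed values.

```python
# A rough illustration only: a simple velocity-threshold (I-VT) segmentation of
# gaze samples into fixations and saccades. This is NOT the noise-adaptive
# hybrid algorithm developed in the thesis; the 60 Hz sample rate and the
# velocity threshold are assumptions for the sketch.
import math

SAMPLE_RATE_HZ = 60.0
SACCADE_VELOCITY_DEG_S = 100.0  # assumed threshold, deg/s

def segment(gaze_angles_deg):
    """gaze_angles_deg: list of (yaw_deg, pitch_deg) gaze directions per sample.
    Returns one label per sample: 'fixation' or 'saccade'."""
    labels = ["fixation"]  # the first sample has no velocity estimate
    for (y0, p0), (y1, p1) in zip(gaze_angles_deg, gaze_angles_deg[1:]):
        # Angular distance between consecutive samples, converted to deg/s.
        dist = math.hypot(y1 - y0, p1 - p0)
        velocity = dist * SAMPLE_RATE_HZ
        labels.append("saccade" if velocity > SACCADE_VELOCITY_DEG_S else "fixation")
    return labels

# Example: a small synthetic trace with one rapid gaze shift in the middle.
trace = [(0.0, 0.0)] * 5 + [(5.0, 0.0), (10.0, 0.0)] + [(10.0, 0.0)] * 5
print(segment(trace))
```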
6

Molecular Evolution and Functional Characterization of the Visual Pigment Proteins of the Great Bowerbird (Chlamydera nuchalis) and Other Vertebrates

van Hazel, Ilke 16 December 2013
Visual pigments are light-sensitive receptors in the eye that form the basis of sensory visual transduction. This thesis presents three studies that explore visual pigment proteins in vertebrates using a number of computational and experimental methods in an evolutionary framework. The objective is not only to identify, but also to experimentally investigate, the functional consequences of genetic variation in vertebrate visual pigments. The focus is on great bowerbirds (Chlamydera nuchalis), which are a model system in visual ecology due to their spectacular behaviour of building and decorating courtship bowers.

There are four chapters. Chapter 1 introduces background information on visual pigments and vision in birds. Among visual pigment types, the short-wavelength-sensitive (SWS1) pigments have garnered particular interest due to their broad spectral range among vertebrates and the importance of UV signals in communication. Chapter 2 investigates the evolutionary history of SWS1 in vertebrates with a view toward its utility as a phylogenetic marker. Chapter 3 investigates SWS1 evolution and short-wavelength vision in birds, with particular focus on C. nuchalis and its SWS1. The evolution of the spectral tuning mechanisms mediating UV/violet vision in passerines and parrots is elucidated in this chapter using site-directed mutagenesis, protein expression, and phylogenetic reconstruction of ancestral opsins.

While cone opsins mediate colour vision in bright light, the rhodopsin visual pigment contained in rod photoreceptors is critical for dim-light vision. Detailed characterization of rhodopsin function has only been conducted on a few model systems. Chapter 4 examines C. nuchalis RH1 using a number of functional assays in addition to absorbance spectra, including hydroxylamine sensitivity and the rate of retinal release. This chapter includes an investigation into the role of amino acid mutations typical of dim-light-adapted vertebrates, D83N and A292S, in regulating the functional properties of bovine and avian RH1s using site-directed mutagenesis.

Together, these chapters describe naturally occurring mutations in visual pigments and explore the way they can influence visual perception. They represent one of the few investigations of visual pigments from a species that is not a model lab organism, and form a significant contribution to the field of visual pigment biochemistry and evolution.
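The abstract lists the rate of retinal release among the functional assays; such rates are commonly summarised by fitting a single exponential to a fluorescence time course and reporting a half-life. The sketch below shows that kind of fit on synthetic data; it illustrates the general approach only, not the thesis's protocol, and all values are assumed.

```python
# A minimal sketch (assumptions throughout, not the thesis's actual protocol):
# retinal release from a visual pigment is often followed as a fluorescence
# increase over time, and its rate summarised by fitting a single exponential
# and reporting the half-life. The data here are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def single_exponential(t, amplitude, rate, offset):
    """Monoexponential rise: offset + amplitude * (1 - exp(-rate * t))."""
    return offset + amplitude * (1.0 - np.exp(-rate * t))

# Synthetic "fluorescence" time course (minutes vs. arbitrary units).
t = np.linspace(0, 60, 61)
rng = np.random.default_rng(0)
signal = single_exponential(t, amplitude=1.0, rate=0.07, offset=0.1)
signal += rng.normal(scale=0.02, size=t.size)

params, _ = curve_fit(single_exponential, t, signal, p0=(1.0, 0.05, 0.0))
amplitude, rate, offset = params
half_life = np.log(2) / rate
print(f"fitted rate = {rate:.3f} /min, half-life = {half_life:.1f} min")
```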
