31 |
Changes in chromatic pattern-onset VEP with full-body inversion
Highsmith, Jennifer Rea. January 2007 (has links)
Thesis (M.A.)--University of Nevada, Reno, 2007. / "August, 2007." Includes bibliographical references (leaves 10-11). Online version available on the World Wide Web.
|
32 |
Changing the shape of circadian rhythms with light no brighter than moonlight
Evans, Jennifer Anne. January 2007 (has links)
Thesis (Ph. D.)--University of California, San Diego, 2007. / Title from first page of PDF file (viewed June 8, 2007). Available via ProQuest Digital Dissertations. Vita. Includes bibliographical references (p. 169-188).
|
33 |
The use of visually evoked cortical potentials to evaluate changes in the rate of recovery from phenobarbital anesthesia in the rat
Heggeness, Steven Theodore 01 January 1981 (has links)
Signal averaging of visually evoked cortical potentials was performed on Wistar-strain rats during recovery from Nembutal (sodium pentobarbital) anesthesia. Several studies (Dafny, 1978; Gines et al., 1963; Roig et al., 1961) have shown significant differences between recordings from unanesthetized rats and from rats anesthetized with various agents. The purpose of this study was to evaluate changes in the cortical response throughout an eight-hour Nembutal recovery period in order to determine the feasibility of using a signal-averaging technique for classification of anesthetic depth.
The results of this study show that recovery from Nembutal anesthesia is characterized by three major changes in the cortical response: the presence or absence of the secondary response component, the appearance of desynchronized cortical firing, and a change in the latency of the individual component peaks. Using these neurophysiological signs, the rate of recovery from Nembutal anesthesia can be described and quantified. The characterization of these changes provides future researchers with a tool to electrophysiologically evaluate the usefulness of various treatments in altering the rate of recovery from anesthesia.
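The signal-averaging technique described in this abstract can be sketched as follows. This is an illustrative toy example, not the thesis's actual recording parameters: the sweep length, noise level, and waveform shape are all invented for demonstration.

```python
import numpy as np

def average_evoked_response(trials):
    """Average stimulus-locked sweeps so the time-locked evoked
    response survives while uncorrelated background EEG averages
    toward zero.

    trials: 2-D array, shape (n_trials, n_samples), each row one
    post-stimulus sweep aligned to stimulus onset.
    """
    trials = np.asarray(trials, dtype=float)
    return trials.mean(axis=0)

# Toy demonstration: a fixed evoked waveform buried in heavy noise.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 0.3, 300)                # 300 ms sweep
evoked = 5.0 * np.exp(-(t - 0.1)**2 / 1e-3)   # component peaking near 100 ms
sweeps = evoked + rng.normal(0.0, 10.0, size=(200, t.size))
avg = average_evoked_response(sweeps)
print(t[int(np.argmax(avg))] * 1000)  # recovered peak latency in ms, near 100
```

Averaging 200 sweeps reduces the noise standard deviation by a factor of about sqrt(200), which is what makes component latencies (the third recovery sign noted above) measurable in single subjects.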
|
34 |
Relations interhémisphériques dans le traitement de la forme et de la position visuelles [Interhemispheric relations in the processing of visual form and position]
Achim, André. January 1980 (has links)
No description available.
|
35 |
Crossmodal interactions in stimulus-driven spatial attention and inhibition of return: evidence from behavioural and electrophysiological measures
MacDonald, John J. 05 1900 (has links)
Ten experiments examined the interactions between vision and audition in stimulus-driven spatial attention orienting and inhibition of return (IOR). IOR is the finding that subjects are slower to respond to stimuli presented at a previously stimulated location. In each experiment, subjects made go/no-go responses to peripheral targets but not to central targets. On every trial, a target was preceded by a sensory event, called a "cue," either in the same modality (intramodal conditions) or in a different modality (crossmodal conditions). The cue did not predict the location of the target stimulus in any experiment. In some experiments, the cue and target modalities were fixed and different. Under these conditions, response times to a visual target were shorter when it appeared at the same location as an auditory cue than when it appeared on the opposite side of fixation, particularly at short (100 ms) cue-target stimulus onset asynchronies (Experiments 1A and 1B). Similarly, response times to an auditory target were shorter when it appeared at the same location as a visual cue than when it appeared at a location on the opposite side of fixation (Experiments 2A and 2B). These crossmodal effects indicate that stimulus-driven spatial attention orienting might arise from a single supramodal brain mechanism. IOR was not observed in either crossmodal experiment, indicating that it might arise from modality-specific mechanisms. However, for many subjects, IOR did occur between auditory cues and visual targets (Experiments 3A and 3B) and between visual cues and auditory targets (Experiments 4A and 4B) when the target could appear in the same modality as the cue on half of the trials. Finally, the crossmodal effects of stimulus-driven spatial attention orienting on auditory and visual event-related brain potentials (ERPs) were examined in the final two experiments. Auditory cues modulated the ERPs to visual targets, and visual cues modulated the ERPs to auditory targets, demonstrating that the mechanisms for spatial attention orienting cannot be completely modality-specific. However, these crossmodal ERP effects were very different from each other, indicating that the mechanisms for spatial attention orienting cannot be completely shared. / Arts, Faculty of / Psychology, Department of / Graduate
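The behavioural measures in this abstract reduce to a simple difference score. The sketch below shows the convention, with entirely hypothetical response-time values; the actual data are in the thesis.

```python
def cueing_effect(rt_cued, rt_uncued):
    """Difference score for a spatial-cueing paradigm (ms).

    Positive values indicate facilitation: faster responses at the
    cued location. Negative values indicate inhibition of return
    (IOR): slower responses at the previously stimulated location.
    """
    return rt_uncued - rt_cued

# Hypothetical means. At a short (100 ms) cue-target SOA, a
# crossmodal cue speeds responses on the cued side (facilitation):
assert cueing_effect(rt_cued=310, rt_uncued=335) == 25

# At a longer SOA in an intramodal condition, responses on the cued
# side are slower, the signature of IOR:
assert cueing_effect(rt_cued=360, rt_uncued=340) == -20
```

The sign flip between short and long cue-target intervals is what distinguishes attentional facilitation from IOR in designs like the one described above.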
|
37 |
Integrated scenic modeling of environmentally induced color changes in a coniferous forest canopy.
Clay, Gary Robert. January 1995 (has links)
The focus of this research was the relationship between changes in the color values of scenic landscapes and corresponding shifts in viewers' preferences for those changed environments. Color modifications, whether natural or the result of human intervention, provide visual cues that an environment has undergone some transformation. These color changes can occur at both the micro and macro scale, can have temporal dimensions, and can result from combinations of physical landscape change and shifts in an observer's perspective with respect to that landscape. The research reviewed two existing models and related them in an integrated program of scenic-change analysis. The first, a bio-physical remote sensing model, identified relationships between existing bio-physical environmental conditions and measured color signatures of selected landscape features. The second, a psychophysical perception model, established relationships between the landscape's bio-physical attributes and measured perceptual responses to those environments. By merging aspects of each model, the research related changing scenic color patterns to observers' responses to those changed environments. The methodology presented a program of scenic-change analysis incorporating several technologies: (1) ground-based biological inventories, (2) remote sensing, (3) geographic information systems (GIS), and (4) computer visualization. A series of investigations focused on landscape scenes selected from a high-elevation coniferous forest in southern Utah. Three initial scenic investigations compared (1) the impact of changing view angles on scenic color values, (2) color shifts due to changing sun-illumination angles within a day, and (3) color shifts due to changing biological conditions over a 12-month period.
A fourth investigation measured the color changes caused by a spruce bark beetle outbreak, and developed a series of color signatures to simulate the color shifts indicative of an outbreak at different stages of development. These signatures were applied to digitized site photographs to produce a series of visualizations displaying different levels of beetle damage. The visualizations were then applied in a series of perceptual experiments to test the precision and reliability of the visual simulations.
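The signature-driven simulation described above can be sketched as a per-pixel color shift. The signature values, severity scale, and function name below are all illustrative assumptions, not the thesis's calibrated signatures.

```python
import numpy as np

def apply_color_signature(image, healthy, damaged, severity):
    """Shift image pixels toward a 'damaged' canopy color signature.

    image:            H x W x 3 float array, RGB in [0, 1]
    healthy, damaged: mean RGB signatures (illustrative values here)
    severity:         0.0 (no outbreak) .. 1.0 (fully developed outbreak)
    """
    shift = (np.asarray(damaged, dtype=float)
             - np.asarray(healthy, dtype=float)) * severity
    return np.clip(np.asarray(image, dtype=float) + shift, 0.0, 1.0)

# Hypothetical signatures: green canopy fading toward red-brown.
healthy = [0.20, 0.45, 0.20]
damaged = [0.45, 0.30, 0.15]
canopy = np.full((4, 4, 3), healthy)          # stand-in for a photograph
simulated = apply_color_signature(canopy, healthy, damaged, severity=0.5)
```

Rendering the same scene at several `severity` levels yields the graded series of outbreak visualizations used in the perceptual experiments.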
|
38 |
A gaze-addressing communication system using artificial neural networks
Baud-Bovy, Gabriel 01 January 1992 (has links)
Severe motor disabilities can render a person almost completely incapable of communication. Nevertheless, in many cases the sensory systems are intact and eye movements remain under good control. In these cases, one can use a device such as the Brain Response Interface (BRI) to operate remote controls (e.g., room temperature, bed position), a word processor, a speech synthesizer, and so on.
|