
Dynamic Textures Synthesis for Probing Vision in Psychophysics and Electrophysiology

Vacher, Jonathan. 18 January 2017.
The goal of this thesis is to propose a mathematical model of visual stimulation in order to finely analyze experimental data in psychophysics and electrophysiology. More precisely, exploiting data-analysis techniques from Bayesian statistics and machine learning requires a set of stimulations that are dynamic, stochastic, and of parameterized complexity.
This problem is important for understanding the visual system's capacity to integrate and discriminate between stimuli. In particular, measurements performed at multiple scales (single neurons, neural populations, cognition) allow us to study the particular sensitivities of neurons, their functional organization, and their impact on decision making. To this purpose, we propose a set of theoretical, numerical, and experimental contributions organized around three principal axes: (1) a Gaussian dynamic texture synthesis model specially crafted to probe vision; (2) a Bayesian observer model that accounts for the positive bias that spatial frequency induces on speed perception; (3) the use of machine learning techniques to analyze voltage-sensitive dye optical imaging data and extracellular recordings. This work, at the crossroads of neuroscience, psychophysics, and mathematics, is the fruit of several interdisciplinary collaborations.
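The first axis, Gaussian dynamic texture synthesis, belongs broadly to the family of band-pass filtered spatiotemporal noise. A minimal sketch in that spirit follows; it is not the thesis model, and the function name, parameters (f0, sf_bw, v, v_bw), and their default values are all illustrative assumptions:

```python
import numpy as np

def gaussian_dynamic_texture(n=64, t=32, f0=0.15, sf_bw=0.05,
                             v=(0.5, 0.0), v_bw=0.1, seed=0):
    """Filtered-noise dynamic texture sketch (illustrative parameters).

    Gaussian white noise is shaped in the 3D Fourier domain by
    (1) a Gaussian envelope around a mean spatial frequency f0, and
    (2) a Gaussian envelope around the plane fx*vx + fy*vy + ft = 0,
    which concentrates energy at a mean drift speed v.
    """
    rng = np.random.default_rng(seed)
    fx = np.fft.fftfreq(n)[:, None, None]
    fy = np.fft.fftfreq(n)[None, :, None]
    ft = np.fft.fftfreq(t)[None, None, :]
    radius = np.sqrt(fx ** 2 + fy ** 2)
    env_sf = np.exp(-0.5 * ((radius - f0) / sf_bw) ** 2)               # frequency band
    env_v = np.exp(-0.5 * ((fx * v[0] + fy * v[1] + ft) / v_bw) ** 2)  # speed plane
    shaped = np.fft.fftn(rng.standard_normal((n, n, t))) * env_sf * env_v
    movie = np.real(np.fft.ifftn(shaped))
    return movie / np.abs(movie).max()  # frames normalized to [-1, 1]

movie = gaussian_dynamic_texture()
print(movie.shape)  # (64, 64, 32)
```

Because every draw is a sample from one Gaussian random field, such stimuli are stochastic yet fully specified by a handful of interpretable parameters, which is what makes them convenient for psychophysics.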

Context Effects in Early Visual Processing and Eye Movement Control

Nortmann, Nora. 29 April 2015.
There is a difference between the raw sensory input to the brain and our stable perception of entities in the environment. A first approach to investigating perception is to study relationships between properties of currently presented stimuli and biological correlates of perceptual processes. However, such processes do not depend on the current stimulus alone. Sampling of information and the concurrent neuronal processing of stimulus content rely on contextual relationships in the environment and between the environment and the body. Perceptual processes dynamically adjust to relevant context, such as the organism's current task and its immediate history. To understand perception, we have to study how processing of current stimulus content is influenced by such contextual factors. This thesis investigates the influence of such factors on visual processing: in particular, effects of temporal context in early visual processing and effects of task context in eye movement control.

To investigate effects of contextual factors on early visual processing of current stimulus content, we study neuronal processing of visual information in the primary visual cortex. We use real-time optical imaging with voltage-sensitive dyes to capture neuronal population activity at millisecond resolution across several millimeters of cortical area. Before further investigations, we characterize the cortical layout of the orientation map using smoothly moving grating stimuli. Systematically investigating responses to this stimulus type, we find independent encoding of local contrast and orientation, and a direct mapping of current stimulus content onto cortical activity (Study 1). To investigate the influence of the previous stimulus as context on processing of current stimulus content, we use abrupt visual changes in sequences of modified natural images.
In earlier studies, investigating relatively fast timescales, the primary visual cortex was found to continuously represent current input (ongoing encoding), with little interference from past stimuli. We investigate whether this coding scheme generalizes to cases in which stimuli change more slowly, as frequently encountered in natural visual input. We use sequences of natural scene contours, comprising vertically and horizontally filtered natural images, their superpositions, and a blank stimulus, presented at 10 or 33 Hz. We show that at the low temporal frequency, cortical activity patterns do not encode the present orientations but instead reflect their relative changes in time. For example, when a stimulus with horizontal orientation is followed by the superposition of both orientations, the pattern of cortical activity represents the newly added vertical orientations instead of the full sum of orientations. Correspondingly, contour removal from the superposition leads to the representation of orientations that have disappeared rather than those that remain. This is in sharp contrast to more rapid sequences, for which we find an ongoing representation of present input, consistent with earlier studies. In summary, we find that for slow stimulus sequences, populations of neurons in the primary visual cortex are no longer tuned to orientations within individual stimuli but instead represent the difference between consecutive stimuli. Our results emphasize the influence of temporal context on early visual processing and, consequently, on information transmission to higher cortical areas (Study 2).

To study effects of contextual factors on the sampling of visual information, we focus on human eye movement control. The eyes are actively moved to sample visual information from the environment. Some traditional approaches predict eye movements solely from simple stimulus properties, such as local contrasts (stimulus-driven factors).
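The difference coding reported in Study 2 for slow sequences can be caricatured in a toy model; the vector representation and the encoding rule below are illustrative assumptions, not the recorded data:

```python
import numpy as np

# Toy illustration of the Study 2 finding (not the recorded data): stimuli
# are orientation-content vectors [horizontal, vertical]. At the slow rate
# the population response tracks the rectified change between consecutive
# stimuli; at the fast rate it tracks the current stimulus itself.
H = np.array([1, 0])      # horizontally filtered image
HV = np.array([1, 1])     # superposition of both orientations
BLANK = np.array([0, 0])  # blank stimulus

def response(prev, curr, rate_hz):
    if rate_hz <= 10:                             # slow: difference coding
        return np.abs(curr - prev).astype(float)
    return curr.astype(float)                     # fast: ongoing encoding

print(response(H, HV, 10))      # newly added vertical is represented
print(response(HV, H, 10))      # removed vertical is represented
print(response(HV, BLANK, 10))  # everything that disappeared is represented
print(response(H, HV, 33))      # full current stimulus is represented
```

The rectification mirrors the observation that both added and removed contours evoke a response at the slow rate.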
Recent arguments, however, emphasize the influence of tasks (task context) and bodily factors (spatial bias). To investigate how contextual factors affect eye movement control, we quantify the relative influences of task context, spatial biases, and stimulus-driven factors. Participants view and classify natural scenery and faces while their eye movements are recorded. The stimuli are composed of small image patches. For each of these patches we derive a measure that quantifies stimulus-driven factors, based on the image content of the patch, and spatial viewing biases, based on the location of the patch. Utilizing the participants' classification responses, we additionally derive a measure that reflects the information content of a patch in the context of a given task. We show that the effect of spatial biases is highest, that task context is a close runner-up, and that stimulus-driven factors have, on average, a smaller influence. Remarkably, all three factors make independent and significant contributions to the selection of viewed locations. Hence, in addition to stimulus-driven factors and spatial biases, the task context contributes to visual sampling behavior and has to be considered in a model of human eye movements.

Visual processing of current stimulus content, in particular visual sampling behavior and early processing, is inherently dependent on context. We show that, already at the first cortical stage, temporal context strongly affects the processing of new visual information, and that visual sampling by eye movements is significantly influenced by task context, independently of spatial factors and stimulus-driven factors. The empirical results presented provide foundations for an improved theoretical understanding of the role of context in perceptual processes.
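The three-factor quantification described above can be sketched as a logistic model on synthetic patch data; the predictor names, effect sizes, and fitting scheme below are illustrative assumptions, not the study's actual analysis:

```python
import numpy as np

# Synthetic sketch of a three-factor analysis: each image patch carries
# three z-scored predictors, and a logistic model quantifies each factor's
# independent contribution to whether the patch is fixated. All data and
# effect sizes here are made up for illustration.
rng = np.random.default_rng(1)
n = 2000
X = rng.standard_normal((n, 3))        # [spatial bias, task relevance, saliency]
true_w = np.array([1.2, 1.0, 0.4])     # illustrative effect sizes only
p = 1.0 / (1.0 + np.exp(-(X @ true_w)))
y = (rng.random(n) < p).astype(float)  # 1 = patch was fixated

# Fit by plain gradient ascent on the mean log-likelihood.
w = np.zeros(3)
for _ in range(2000):
    pred = 1.0 / (1.0 + np.exp(-(X @ w)))
    w += 0.01 * X.T @ (y - pred) / n

print(np.round(w, 2))  # recovered weights; their ordering mirrors true_w
```

Because the predictors are generated independently here, the fitted weights separate each factor's contribution, which is the spirit of showing that all three factors contribute independently to fixation selection.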
