21 |
An Information Theoretic Analysis of Neural Multiplexing - Williams, Ezekiel (21 April 2020)
How the brain encodes information in sequences of voltage spikes is an open question. Past literature points to bursts, brief high-frequency spike events, as a key part of the answer. In particular, it was recently shown that neurons could use bursts to communicate two streams of information simultaneously, resulting in higher information rates than seen with other neural code theories. However, it is unknown how a neuron's spiking statistics might affect communication via this new code. To investigate the influence of spike statistics, we study a bursting neuron model with the goal of estimating its information rate as a function of its spike statistics. To this end we extend a recently proposed method for estimating information rate. We find that the information rate in our burst-multiplexing model is robust to changes in spike-train statistics, providing evidence for the utility of a burst-multiplexing code across diverse brain networks.
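As a rough illustration of the kind of quantity being estimated (this is not the estimator extended in the thesis), the following Python sketch builds a toy multiplexed output, in which one hypothetical binary stream gates whether an event occurs and another sets whether the event is a single spike or a burst, and computes plug-in mutual-information estimates between the output symbols and each stream. The encoding scheme and all parameters are assumptions made for the example.

```python
import numpy as np

def plugin_mutual_information(x, y):
    """Plug-in (histogram) estimate of mutual information, in bits,
    between two discrete sequences of equal length."""
    x, y = np.asarray(x), np.asarray(y)
    xs, ys = np.unique(x), np.unique(y)
    joint = np.zeros((xs.size, ys.size))
    for i, xv in enumerate(xs):
        for j, yv in enumerate(ys):
            joint[i, j] = np.mean((x == xv) & (y == yv))
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return np.sum(joint[nz] * np.log2(joint[nz] / (px @ py)[nz]))

rng = np.random.default_rng(0)
T = 50_000                      # number of time bins (assumed)
slow = rng.integers(0, 2, T)    # stream carried by event type (burst vs. single)
fast = rng.integers(0, 2, T)    # stream carried by event timing (event vs. silence)

# Toy multiplexed output: 0 = no event, 1 = single spike, 2 = burst.
event = rng.random(T) < (0.1 + 0.3 * fast)    # fast stream gates event occurrence
output = np.where(event, 1 + slow, 0)         # slow stream sets event type

print("I(output; fast) =", plugin_mutual_information(output, fast), "bits/bin")
print("I(output; slow) =", plugin_mutual_information(output, slow), "bits/bin")
```

Both estimates come out positive, reflecting that the single output sequence carries information about both streams at once.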
|
22 |
Linear Discriminant Analysis and Noise Correlations in Neuronal Activity - Calderini, Matias (17 December 2019)
The effects of noise correlations on neuronal stimulus discrimination have been the subject of sustained debate. Both experimental and computational work suggest beneficial and detrimental contributions of noise correlations. The aim of this study is to develop an analytically tractable model of stimulus discrimination that reveals the conditions leading to improved or impaired performance from model parameters and levels of noise correlation.
We begin with a mean firing rate integrator model as an approximation of underlying spiking activity in neuronal circuits. We consider two independent units receiving constant input and time-fluctuating noise whose correlation across units can be tuned independently of firing rate. We implement a perceptron-like readout with Fisher Linear Discriminant Analysis (LDA). We exploit its closed-form solution to find explicit expressions for discrimination error as a function of network parameters (leak, shared inputs, and noise gain) as well as the strength of noise correlation.
First, we derive equations for discrimination error as a function of noise correlation. We find that four qualitatively different regimes exist, determined by the ratios of the differences in means to the variances of the neural activity distributions. From network parameters, we find the conditions under which an increase in noise correlation leads to a monotonic decrease or a monotonic increase of error, as well as conditions under which error evolves non-monotonically as a function of correlations. These results provide a potential explanation for previously reported contradictory effects of noise correlation.
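As a compact illustration of how such regimes can arise (a simplified stand-in for the thesis's derivation, with means and variances assumed directly rather than derived from network parameters), the sketch below computes the two-class LDA/Bayes error Phi(-d/2), where d is the Mahalanobis distance under a correlated two-unit covariance, and sweeps the noise correlation for three assumed mean-difference configurations.

```python
import numpy as np
from scipy.stats import norm

def lda_error(dmu, sigma, rho):
    """Bayes error of Fisher LDA for two Gaussian classes with equal
    covariance: Phi(-d/2), with d = sqrt(dmu^T Sigma^{-1} dmu)."""
    s1, s2 = sigma
    cov = np.array([[s1**2, rho * s1 * s2],
                    [rho * s1 * s2, s2**2]])
    d = np.sqrt(dmu @ np.linalg.solve(cov, dmu))
    return norm.cdf(-d / 2)

rhos = np.linspace(-0.9, 0.9, 7)
for dmu in ([1.0, 1.0], [1.0, -1.0], [1.0, 0.0]):   # assumed mean differences
    errs = [lda_error(np.array(dmu), (1.0, 1.0), r) for r in rhos]
    print(dmu, np.round(errs, 3))
```

With equal variances, a mean difference along (1, 1) gives error that increases monotonically with the correlation, a difference along (1, -1) gives error that decreases monotonically, and a difference along (1, 0) gives a non-monotonic curve, mirroring the qualitative regimes described above.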
Second, we expand on how the quantitative behaviour of the error curve depends on the tuning of specific subsets of network parameters. In particular, when the noise gain of a pair of units is increased, the error rate as a function of noise correlation increases multiplicatively. However, when the noise gain of a single unit is increased, under certain conditions the effect of noise can be beneficial to stimulus discrimination.
In sum, we present a framework of analysis that explains a series of non-trivial properties of neuronal discrimination via a simple linear classifier. We show explicitly how different configurations of parameters can lead to drastically different conclusions on the impact of noise correlations. These effects shed light on abundant experimental and computational results reporting conflicting effects of noise correlations. The derived analyses rely on few assumptions and may therefore be applicable to a broad class of neural models whose activity can be approximated by a multivariate distribution.
|
23 |
Time Frequency Analysis of Neural Oscillations in Multi-Attribute Decision-Making - Lieuw, Iris (01 January 2015)
In our daily lives, we often make decisions that require the use of self-control, weighing trade-offs between various attributes: for example, selecting a food based on its health rather than its taste. Previous research suggests that re-weighting attributes may rely on selective attention, associated with decreased neural oscillations over posterior brain regions in the alpha (8-12 Hz) frequency range. Here, we utilized the high temporal resolution and whole-brain coverage of electroencephalography (EEG) to test this hypothesis in data collected from hungry human subjects exercising dietary self-control. Prior analysis of this data has found time-locked neural activity associated with each food’s perceived taste and health properties from approximately 400 to 650 ms after stimulus onset (Harris et al., 2013). We conducted time-frequency analyses to examine the role of alpha-band oscillations in this attribute weighting. Specifically, we predicted that there would be decreased alpha power in posterior electrodes beginning approximately 400 ms after stimulus onset for the presentation of healthy food relative to unhealthy food, reflecting shifts in selective attention. Consistent with this hypothesis, we found a significant decrease in alpha power for presentations of healthy relative to unhealthy foods. As predicted, this effect was most pronounced at posterior occipital and parietal electrodes and was significant from approximately 450 to 700 ms post-stimulus onset. Additionally, we found significant alpha-band decreases in right temporal electrodes during these times. These results extend previous attention research to multi-attribute choice, suggesting that the re-weighting of attributes can be measured neuro-computationally.
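For readers unfamiliar with this kind of analysis, the sketch below shows one common way to quantify alpha-band power over time: band-pass filtering followed by the Hilbert envelope, applied to synthetic single-electrode trials. It is a hypothetical stand-in, not the study's actual time-frequency pipeline, and the sampling rate, trial structure, and amplitudes are invented for illustration.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def alpha_power(eeg, fs, band=(8.0, 12.0)):
    """Instantaneous alpha-band power via band-pass filtering and the
    Hilbert envelope. eeg: (n_trials, n_samples) array."""
    b, a = butter(4, band, btype="bandpass", fs=fs)
    filtered = filtfilt(b, a, eeg, axis=-1)
    return np.abs(hilbert(filtered, axis=-1)) ** 2

fs = 250.0                              # assumed sampling rate (Hz)
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(1)

# Hypothetical trials: alpha amplitude is reduced ("desynchronized") after
# 0.4 s in the healthy-food condition relative to the unhealthy condition.
def make_trials(post_amp, n=40):
    amp = np.where(t < 0.4, 1.0, post_amp)
    return amp * np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal((n, t.size))

healthy, unhealthy = make_trials(0.4), make_trials(1.0)
diff = alpha_power(healthy, fs).mean(0) - alpha_power(unhealthy, fs).mean(0)
print("mean alpha-power difference, 0.45-0.70 s:",
      diff[(t >= 0.45) & (t <= 0.70)].mean())
```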
|
24 |
Cortical Plasticity and Tinnitus - Chrostowski, Michal (10 1900)
Tinnitus is an auditory disorder characterized by the perception of a ringing, hissing or buzzing sound with no external stimulus. Because the most common cause of chronic tinnitus is hearing loss, this neurological disorder is becoming increasingly prevalent in our noise-exposed and ageing society. With no cure and a lack of effective treatments, there is a need for a comprehensive understanding of the neural underpinnings of tinnitus. This dissertation outlines the development and validation of a comprehensive theoretical model of cortical correlates of tinnitus that is used to shed light on the development of tinnitus and to propose improvements to tinnitus treatment strategies.
The first study involved the development of a computational model that predicts how homeostatic plasticity acting in the auditory cortex responds to hearing loss. A subsequent empirical study validated a more biologically plausible version of this computational model. The goal of these studies was to determine whether and how a form of plasticity that maintains balance in neural circuits can lead to aberrant activity in the auditory cortex. The final study extends the validated computational model to develop a comprehensive theoretical framework characterizing the potential role of homeostatic and Hebbian plasticity in the development of most major cortical correlates of tinnitus.
These theoretical and empirical studies provide a novel and complete description of how neural plasticity in adult auditory cortex can respond to hearing loss and result in the development of tinnitus correlates.
Doctor of Philosophy (PhD)
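The following toy sketch illustrates the general principle of homeostatic plasticity invoked here, not the dissertation's cortical model: a single unit slowly adjusts its gain so that its average firing rate tracks a target, so reduced afferent drive (a crude stand-in for hearing loss) leads to an elevated gain and amplified spontaneous activity. All parameter values are assumptions.

```python
import numpy as np

def homeostatic_gain(mean_drive, spont=2.0, target=10.0, steps=5000, eta=2e-3):
    """Minimal homeostatic scaling sketch: the unit's rate is g*(drive + spont);
    the gain g is slowly adjusted so the time-averaged rate tracks the target."""
    g = 1.0
    for _ in range(steps):
        r = g * (mean_drive + spont)
        g += eta * (target - r) / target      # slow adjustment toward target rate
    return g

for label, drive in [("normal hearing", 8.0), ("hearing loss", 2.0)]:
    g = homeostatic_gain(drive)
    print(f"{label}: gain = {g:.2f}, stimulus-free (spontaneous) rate = {g * 2.0:.1f} Hz")
```

After the simulated "hearing loss", the gain roughly doubles and the stimulus-free rate rises, which is the kind of aberrant spontaneous activity discussed above as a cortical correlate of tinnitus.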
|
25 |
Quantifying dynamics and variability in neural systems - Norman, Sharon Elizabeth (07 January 2016)
Synchronized neural activity, in which the firing of neurons is coordinated in time, is an observed phenomenon in many neural functions. The conditions that promote synchrony and the dynamics of synchronized activity are active areas of investigation because they are incompletely understood. In addition, variability is intrinsic to biological systems, but the effect of neuron spike time variability on synchronization dynamics is a question that merits more attention.
Previous experiments using a hybrid circuit of one biological neuron coupled to one computational neuron revealed that irregularity in biological neuron spike timing could change synchronization in the circuit, transitioning the activity between phase-locked and phase-slipping states. Simulations of this circuit could not replicate the transitions in network activity if the neuron period was represented as a Gaussian process, but could if a process with history and a stochastic component was used. The phase resetting curve (PRC), which describes how a neuron's cycle length changes in response to input, can be used to construct a map that predicts whether synchronization will occur in hybrid circuits. Without modification, these maps did not always capture the observed network activity.
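As an illustration of the kind of PRC-based map referred to here (a generic textbook construction with a toy sinusoidal PRC, not the specific maps used in the hybrid-circuit experiments), the sketch below iterates a one-dimensional phase map and shows locking for a small period mismatch and phase slipping for a large one.

```python
import numpy as np

def prc(phi, a=0.2):
    """Toy phase resetting curve: phase advance as a function of input phase."""
    return a * np.sin(2 * np.pi * phi)

def iterate_phase_map(period_ratio, n_iter=200, phi0=0.3):
    """1-D phase map for an oscillator receiving one input per forcing cycle:
    phi_{n+1} = (phi_n + period_ratio - PRC(phi_n)) mod 1.
    A converging orbit indicates phase locking; a drifting orbit, phase slipping."""
    phi, traj = phi0, []
    for _ in range(n_iter):
        phi = (phi + period_ratio - prc(phi)) % 1.0
        traj.append(phi)
    return np.array(traj)

for ratio in (1.02, 1.3):          # forcing period / intrinsic period (assumed)
    traj = iterate_phase_map(ratio)
    print(f"period ratio {ratio}: last phases", np.round(traj[-5:], 3))
```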
I conducted long-term recordings of invertebrate neurons and show that the interspike interval (ISI) sequence can be represented as an autoregressive integrated moving average (ARIMA) process, in which each ISI depends both on past intervals and on a stochastic component with its own history. Using integrate-and-fire model simulations, I suggest that stochastic activity in adaptation channels could be responsible for the history dependence and correlational structure observed in these neurons. This evidence for stochastic, history-dependent noise in neural systems indicates that our understanding of network dynamics could be enhanced by including more complex, but relevant, forms of noise.
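The sketch below shows the general shape of such an analysis using statsmodels: it generates a surrogate ISI sequence with ARMA-like history dependence (invented for the example, not recorded data) and fits an ARIMA model to recover its coefficients.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(2)

# Surrogate ISI sequence (ms) with history dependence: an ARMA(1,1)-like
# process around a 100 ms mean interval, purely for illustration.
n, phi, theta = 2000, 0.8, 0.3
eps = 5.0 * rng.standard_normal(n)
isi = np.empty(n)
isi[0] = 100.0
for t in range(1, n):
    isi[t] = 100.0 + phi * (isi[t - 1] - 100.0) + eps[t] + theta * eps[t - 1]

# Fit an ARIMA(p, d, q) model to the ISI series and inspect the coefficients.
fit = ARIMA(isi, order=(1, 0, 1)).fit()
print(fit.params)     # estimated AR, MA, and noise parameters
```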
I show that cycle-by-cycle dynamics of the coupled system can be used to infer features of the dynamic map, even if it cannot be measured or is changing over time. Using this method, stable fixed points can be distinguished from ghost attractors in the presence of noise, networks with similar phase but different underlying dynamics can be resolved, and the movement of stable fixed points can be observed. The time-series vector method is a valuable tool for distinguishing dynamics and describing robustness. It can be adapted for use in larger populations and non-reciprocal circuits.
Finally, some larger implications of neuroscience research, specifically the use of neural interfaces for national security, are discussed. Neural interfaces for human enhancement in a national security context raise a number of unique ethical and policy concerns not common to dual use research of concern or traditional human subjects research. Guidelines about which technologies should be developed are lacking. We discuss a two-step framework with 1) an initial screen to prioritize technologies that should be reviewed immediately, and 2) a comprehensive ethical review regarding concerns for the enhanced individual, operational norms, and multi-use applications in the case of transfer to civilian contexts.
|
26 |
Self organisation and hierarchical concept representation in networks of spiking neurons - Rumbell, Timothy (January 2013)
The aim of this work is to introduce modular processing mechanisms for cortical functions implemented in networks of spiking neurons. Neural maps are a feature of cortical processing found to be generic throughout sensory cortical areas, and self-organisation to the fundamental properties of input spike trains has been shown to be an important property of cortical organisation. Additionally, oscillatory behaviour, temporal coding of information, and learning through spike timing dependent plasticity are all frequently observed in the cortex. The traditional self-organising map (SOM) algorithm attempts to capture the computational properties of this cortical self-organisation in a neural network. As such, a cognitive module for a spiking SOM using oscillations, phasic coding and STDP has been implemented. This model is capable of mapping to distributions of input data in a manner consistent with the traditional SOM algorithm, and of categorising generic input data sets. Higher-level cortical processing areas appear to feature a hierarchical category structure that is founded on a feature-based object representation. The spiking SOM model is therefore extended to facilitate input patterns in the form of sets of binary feature-object relations, such as those seen in the field of formal concept analysis. It is demonstrated that this extended model is capable of learning to represent the hierarchical conceptual structure of an input data set using the existing learning scheme. Furthermore, manipulations of network parameters allow the level of hierarchy used for either learning or recall to be adjusted, and the network is capable of learning comparable representations when trained with incomplete input patterns. Together these two modules provide related approaches to the generation of both topographic mapping and hierarchical representation of input spaces that can be potentially combined and used as the basis for advanced spiking neuron models of the learning of complex representations.
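For context, the sketch below implements the traditional (rate-based) SOM algorithm that the spiking model is designed to emulate: a best-matching-unit search followed by a Gaussian neighbourhood update with decaying learning rate and radius. It is not the spiking, oscillation- and STDP-based implementation described in the thesis; the grid size, data, and schedules are assumptions.

```python
import numpy as np

def train_som(data, grid=(8, 8), epochs=20, lr0=0.5, sigma0=3.0, seed=0):
    """Classic SOM training loop: find the best-matching unit for each input,
    then pull nearby units toward the input with a Gaussian neighbourhood."""
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.random((h, w, data.shape[1]))
    coords = np.stack(np.meshgrid(np.arange(h), np.arange(w), indexing="ij"), axis=-1)
    for epoch in range(epochs):
        lr = lr0 * np.exp(-epoch / epochs)
        sigma = sigma0 * np.exp(-epoch / epochs)
        for x in rng.permutation(data):
            dist = np.linalg.norm(weights - x, axis=-1)
            bmu = np.unravel_index(np.argmin(dist), (h, w))
            grid_dist = np.linalg.norm(coords - np.array(bmu), axis=-1)
            neighbourhood = np.exp(-(grid_dist ** 2) / (2 * sigma ** 2))
            weights += lr * neighbourhood[..., None] * (x - weights)
    return weights

data = np.random.default_rng(1).random((500, 3))   # toy 3-D input distribution
weights = train_som(data)
print(weights.shape)       # (8, 8, 3): one prototype vector per map unit
```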
|
27 |
Models of primate supraretinal visual representations - Mender, Bedeho M. W. (January 2014)
This thesis investigates a set of non-classical visual receptive field properties observed in the primate brain. Two main phenomena were explored. The first phenomenon was neurons with head-centered visual receptive fields, in which a neuron responds maximally to a visual stimulus in the same head-centered location across all eye positions. The second phenomenon was perisaccadic receptive field dynamics, which involves a range of experimentally observed response behaviours of an eye-centered neuron associated with the advent of a saccade that relocates the neuron's receptive field. For each of these two phenomena, a hypothesis was proposed for how a neural circuit with a suitable initial architecture and synaptic learning rules could, when subjected to visually-guided training, develop the receptive field properties in question. Corresponding neural network models were first trained as hypothesized, and subsequently tested in conditions similar to experimental tasks used to interrogate the physiology of the relevant primate neural circuits. The behaviour of the models was compared to neurophysiological observations as a metric for their explanatory power. In both cases the neural network models were in broad agreement with experimental observations, and the operation of these models was studied to shed light on the neural processing behind these neural phenomena in the brain.
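As a minimal illustration of the defining property being modelled (not the trained network itself), the sketch below contrasts an idealized eye-centered cell, whose preferred location shifts with eye position when plotted in head coordinates, with an idealized head-centered cell, whose preferred location does not. Tuning widths and positions are assumed values.

```python
import numpy as np

def gaussian(x, mu, sigma=5.0):
    """Gaussian tuning curve with preferred location mu (degrees)."""
    return np.exp(-(x - mu) ** 2 / (2 * sigma ** 2))

head_positions = np.linspace(-20, 20, 9)   # stimulus locations in head coordinates
for eye_pos in (-10.0, 0.0, 10.0):
    retinal = head_positions - eye_pos                        # eye-centered location
    eye_centered_cell = gaussian(retinal, mu=0.0)             # prefers retinal 0 deg
    head_centered_cell = gaussian(retinal + eye_pos, mu=0.0)  # prefers head 0 deg
    print(f"eye {eye_pos:+.0f} deg: eye-centered peak at head pos "
          f"{head_positions[np.argmax(eye_centered_cell)]:+.0f}, "
          f"head-centered peak at {head_positions[np.argmax(head_centered_cell)]:+.0f}")
```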
|
28 |
A BCU scalable sensory acquisition system for EEG embedded applications - Fathalla, Sherif S. (2010)
Electroencephalogram (EEG) recording has undergone many changes and modifications since it was first introduced in 1929, driven by advances in technology and signal processing. The data acquisition stage is the first and most valuable component of any EEG recording system: it gathers and conditions the input signals and outputs reliable data that can be analyzed and studied by digital signal processors running sophisticated algorithms, supporting numerous medical and consumer applications. We have designed a low-noise, low-power EEG data acquisition system that can be configured to act as a standalone mobile EEG data processing unit with built-in preprocessing functions, or as a reliable high-speed data acquisition interface to an external EEG processing unit.
Thesis (M.S.C.S.), Florida Atlantic University, 2010. Includes bibliography.
|
29 |
Point process modeling and estimation: advances in the analysis of dynamic neural spiking data - Deng, Xinyi (12 August 2016)
A common interest of scientists in many fields is to understand the relationship between the dynamics of a physical system and the occurrences of discrete events within that system. Seismologists study the connection between mechanical vibrations of the Earth and the occurrences of earthquakes so that future earthquakes can be better predicted. Astrophysicists study the association between the oscillating energy of celestial regions and the emission of photons to learn about the Universe's various objects and their interactions. Neuroscientists study the link between behavior and the millisecond-timescale spike patterns of neurons to understand higher brain functions.
Such relationships can often be formulated within the framework of state-space models with point process observations. The basic idea is that the dynamics of the physical systems are driven by the dynamics of some stochastic state variables and the discrete events we observe in an interval are noisy observations with distributions determined by the state variables. This thesis proposes several new methodological developments that advance the framework of state-space models with point process observations at the intersection of statistics and neuroscience. In particular, we develop new methods 1) to characterize the rhythmic spiking activity using history-dependent structure, 2) to model population spike activity using marked point process models, 3) to allow for real-time decision making, and 4) to take into account the need for dimensionality reduction for high-dimensional state and observation processes.
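To make the framework concrete, the sketch below implements a minimal stochastic-state point process filter of the general kind used in this literature: a one-dimensional random-walk state drives a log-linear Poisson spiking intensity, and a Gaussian-approximation filter recursively decodes the state from binned spike counts. It is a generic textbook-style example with assumed parameters, not one of the thesis's specific methods.

```python
import numpy as np

rng = np.random.default_rng(3)

# --- Simulate a latent random-walk state driving Poisson spiking ------------
dt, T = 0.001, 20.0                            # 1 ms bins, 20 s of data (assumed)
n = int(T / dt)
sigma_w, mu, beta = 0.01, np.log(20.0), 1.0    # state noise, baseline log-rate, gain
x = np.cumsum(sigma_w * rng.standard_normal(n))
lam = np.exp(mu + beta * x)                    # conditional intensity (spikes/s)
spikes = rng.poisson(lam * dt)                 # spike counts per bin

# --- Point process filter (Gaussian approximation) --------------------------
x_post, v_post = np.zeros(n), np.zeros(n)
x_est, v_est = 0.0, 1.0
for t in range(n):
    x_pred, v_pred = x_est, v_est + sigma_w ** 2          # one-step prediction
    lam_pred = np.exp(mu + beta * x_pred)
    v_est = 1.0 / (1.0 / v_pred + beta ** 2 * lam_pred * dt)   # posterior variance
    x_est = x_pred + v_est * beta * (spikes[t] - lam_pred * dt)  # posterior mean
    x_post[t], v_post[t] = x_est, v_est

print("RMSE of decoded state:", np.sqrt(np.mean((x_post - x) ** 2)))
```

Because the update uses only the current bin's spike count, this style of recursion can run causally, which is what makes real-time decision making of the kind described below feasible.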
We applied these methods to a novel problem of tracking rhythmic dynamics in the spiking of neurons in the subthalamic nucleus of Parkinson's patients, with the goal of optimizing the placement of deep brain stimulation electrodes. We developed a decoding algorithm that can make decisions in real time (for example, whether or not to stimulate the neurons) based on various sources of information present in population spiking data. Lastly, we proposed a general three-step paradigm that allows us to relate behavioral outcomes of various tasks to simultaneously recorded neural activity across multiple brain areas, which is a step towards closed-loop therapies for psychological diseases using real-time neural stimulation. These methods are suitable for real-time implementation in content-based feedback experiments.
|
30 |
Neural Synchrony in the Zebra Finch Brain - Goings, Sydney Pia (01 April 2012)
I am interested in discovering the role of field potential oscillations in producing synchrony within the song system of the male zebra finch brain. An important function attributed to neural synchrony is sensorimotor integration. In the production of birdsong, sensorimotor integration is crucial, as auditory feedback is necessary for the maintenance of the song. A cortical-thalamic-cortical feedback loop is thought to play a role in integrating auditory and motor information for the purpose of producing song. Synchronous activity has been observed between at least two nuclei in this feedback loop, MMAN and HVC. Since low-frequency field potential oscillations have been shown to play a role in synchronizing nuclei in the brains of other model animals, I hypothesized that the same may be true in the zebra finch song system. To investigate whether oscillatory activity is a mechanism behind the synchronous activity observed between HVC and MMAN, I performed dual extracellular recordings of neural activity within the zebra finch song system. The results suggest that oscillations are likely not involved in the synchrony observed in these nuclei. Future study may reveal that the structure of the feedback loop is necessary, and possibly even sufficient, for the synchronous activity in the zebra finch song system.
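One standard way to test for oscillation-mediated coupling between two simultaneously recorded signals is spectral coherence; the sketch below applies it to hypothetical surrogate field potentials (not the recorded zebra finch data), where a shared low-frequency component produces high coherence in that band while an uncoupled band does not.

```python
import numpy as np
from scipy.signal import coherence

fs = 1000.0                        # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(4)

# Hypothetical field potentials from two nuclei (e.g., HVC and MMAN):
# a shared 5 Hz component plus independent noise in each signal.
shared = np.sin(2 * np.pi * 5 * t)
lfp_a = shared + rng.standard_normal(t.size)
lfp_b = shared + rng.standard_normal(t.size)

f, Cxy = coherence(lfp_a, lfp_b, fs=fs, nperseg=1024)
print("mean coherence 3-8 Hz:", Cxy[(f >= 3) & (f <= 8)].mean())
print("mean coherence 50-100 Hz:", Cxy[(f >= 50) & (f <= 100)].mean())
```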
|