1 |
Encoding of Sensory Signals Through Balanced Ionotropic Receptor Dynamics and Voltage Dependent Membrane Noise. Marcoux, Curtis. January 2016.
Encoding behaviorally relevant stimuli in a noisy background is critical for animals to survive in their natural environment. We identify core biophysical and synaptic mechanisms that permit the encoding of low frequency signals in pyramidal neurons of the weakly electric fish Apteronotus leptorhynchus, an animal that can accurately encode minuscule (0.1%) amplitude modulations of its self-generated electric field. We demonstrate that slow NMDA-R mediated EPSPs are able to summate over many interspike intervals of the primary electrosensory afferents (EAs), effectively eliminating the EA spike train serial correlations from the pyramidal cell input. This permits stimulus-evoked changes in EA spiking to be transmitted efficiently to downstream ELL pyramidal cells, where a dynamic balance of NMDA-R and GABA-A-R currents is critical for encoding low frequency signals. Interestingly, AMPA-R activity is depressed and plays a negligible role in the generation of action potentials; instead, cell-intrinsic membrane noise implements voltage-dependent stochastic resonance to amplify weak sensory input and appears to drive a significant proportion of pyramidal cell spikes. Together, these mechanisms may be sufficient for the ELL to encode signals near the threshold of behavioral detection.
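The stochastic resonance invoked here has a simple generic illustration: a subthreshold periodic input alone never reaches threshold, but with an intermediate amount of noise, threshold crossings occur preferentially at a fixed phase of the input. The sketch below is only that textbook illustration, not the pyramidal cell model of the thesis; all parameter values are arbitrary assumptions.

```python
import numpy as np

# Toy stochastic resonance: a weak, subthreshold sinusoid only drives
# threshold crossings when noise is added; moderate noise gives phase-locked crossings.
rng = np.random.default_rng(0)
dt = 1e-3                                     # time step (s), arbitrary
t = np.arange(0.0, 10.0, dt)                  # 10 s of "membrane potential"
signal = 0.3 * np.sin(2 * np.pi * 5.0 * t)    # weak 5 Hz input, below threshold
threshold = 1.0

for noise_sd in (0.0, 0.4, 2.0):
    v = signal + noise_sd * rng.standard_normal(t.size)
    crossings = np.flatnonzero((v[:-1] < threshold) & (v[1:] >= threshold))
    if crossings.size:
        # Phase of the 5 Hz input at each crossing; clustering near one phase
        # means the weak signal is being transmitted by the noisy threshold.
        phase = (2 * np.pi * 5.0 * t[crossings]) % (2 * np.pi)
        locking = np.abs(np.mean(np.exp(1j * phase)))   # 1 = perfect phase locking
    else:
        locking = float("nan")
    print(f"noise_sd={noise_sd:.1f}  crossings={crossings.size:4d}  phase locking={locking:.2f}")
```

With no noise there are no crossings, with moderate noise the crossings are strongly locked to the 5 Hz input, and with excessive noise the locking washes out.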
|
2 |
A Protocol for Isolating Neural Activity of Neurons and Analyzing Their Behavior in a Pattern Separation Task. Moradi Salavat, Faraz. 06 October 2023.
Understanding how the human brain works can lead to new discoveries and improved treatments for brain-related diseases and disabilities such as Alzheimer's and autism. One method for studying brain activity is electrophysiological recording, particularly the use of in vivo recording techniques. While these techniques have advanced significantly over the years, data analysis tools have not kept pace, making it difficult to isolate the activity of individual neurons from the recordings. In this thesis, we propose a unified protocol for isolating the spike activity of a neuron from an electrophysiology recording. Additionally, we conducted customized spike train analysis on the recorded cells in a pattern separation task. Preliminary results suggest that changes in the neural activity of mossy cells were not significant. However, for granule cells and interneurons, responses to punishment and reward were observed.
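As a rough sketch of the kind of spike-isolation step such a protocol must address, a common first pass is to band-pass filter the extracellular trace and detect threshold crossings scaled to a robust noise estimate. This is a generic illustration only, not the protocol proposed in the thesis; the sampling rate, filter band, and threshold factor are assumed values.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def detect_spikes(trace, fs, thresh_factor=4.5, refractory_ms=1.0):
    """Very rough spike detection: band-pass filter, then negative threshold
    crossings with a refractory lockout. Not a full spike-sorting pipeline."""
    b, a = butter(3, [300.0, 3000.0], btype="bandpass", fs=fs)
    filt = filtfilt(b, a, trace)
    noise_sd = np.median(np.abs(filt)) / 0.6745          # robust noise estimate
    thresh = -thresh_factor * noise_sd
    below = np.flatnonzero((filt[1:] < thresh) & (filt[:-1] >= thresh)) + 1
    refractory = int(refractory_ms * 1e-3 * fs)          # lockout in samples
    spikes, last = [], -refractory
    for idx in below:
        if idx - last >= refractory:
            spikes.append(idx)
            last = idx
    return np.asarray(spikes) / fs                       # spike times in seconds

# Synthetic example: noise plus a few injected negative deflections.
rng = np.random.default_rng(1)
fs = 30_000.0
trace = rng.standard_normal(int(fs))                     # 1 s of noise
for t0 in (0.2, 0.5, 0.8):
    trace[int(t0 * fs):int(t0 * fs) + 10] -= 8.0         # crude "spikes"
print(detect_spikes(trace, fs))
```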
|
3 |
Sparse coding for speech recognition. Smit, Willem Jacobus. 11 November 2008.
The brain is a complex organ that is computationally strong. Recent research in the field of neurobiology helps scientists to better understand the working of the brain, especially how the brain represents or codes external signals. The research shows that the neural code is sparse. A sparse code is a code in which few neurons participate in the representation of a signal. Neurons communicate with each other by sending pulses or spikes at certain times. The spikes sent between neurons over time are called a spike train. A spike train contains all the important information about the signal that it codes. This thesis shows how sparse coding can be used to do speech recognition. The recognition process consists of three parts. First, the speech signal is transformed into a spectrogram. Thereafter, a sparse code to represent the spectrogram is found. The spectrogram serves as the input to a linear generative model; the output of the model is a sparse code that can be interpreted as a spike train. Lastly, a spike train model recognises the words that are encoded in the spike train. The algorithms that search for sparse codes to represent signals require many computations. We therefore propose an algorithm that is more efficient than current algorithms. The algorithm makes it possible to find sparse codes in reasonable time if the spectrogram is fairly coarse. The system achieves a word error rate of 19% with a coarse spectrogram, while a system based on Hidden Markov Models achieves a word error rate of 15% on the same spectrograms. / Thesis (PhD)--University of Pretoria, 2008. / Electrical, Electronic and Computer Engineering / unrestricted
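A minimal sketch of the sparse-coding step described in the abstract: each spectrogram frame is approximated as a sparse combination of dictionary atoms under a linear generative model. Greedy matching pursuit is used here purely for illustration; it is not the more efficient algorithm proposed in the thesis, and the random dictionary stands in for a learned one.

```python
import numpy as np

def matching_pursuit(x, D, n_nonzero=5):
    """Greedy sparse code of x (one spectrogram frame) under x ~ D @ a.
    D has unit-norm columns (atoms); returns the sparse coefficient vector a."""
    residual = x.astype(float).copy()
    a = np.zeros(D.shape[1])
    for _ in range(n_nonzero):
        corr = D.T @ residual
        k = np.argmax(np.abs(corr))          # best-matching atom
        a[k] += corr[k]
        residual -= corr[k] * D[:, k]
    return a

rng = np.random.default_rng(0)
n_freq, n_atoms = 64, 256                    # spectrogram bins x dictionary size (assumed)
D = rng.standard_normal((n_freq, n_atoms))
D /= np.linalg.norm(D, axis=0)               # unit-norm atoms

frame = rng.standard_normal(n_freq)          # stand-in for one spectrogram column
code = matching_pursuit(frame, D, n_nonzero=8)
active = np.flatnonzero(code)
print(f"{active.size} active atoms out of {n_atoms}")
print("reconstruction error:", np.linalg.norm(frame - D @ code))
```

The indices and magnitudes of the active atoms across successive frames are what can then be read as a spike-train-like code.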
|
4 |
Modeling inhibition-mediated neural dynamics in the rodent spatial navigation system. Lyttle, David Nolan. January 2013.
The work presented in this dissertation focuses on the use of computational and mathematical models to investigate how mammalian brains construct and maintain stable representations of space and location. Recordings of the activities of cells in the hippocampus and entorhinal cortex have provided strong, direct evidence that these cells and brain areas are involved in generating internal representations of the location of an animal in space. The emphasis of the first two portions of the dissertation is on understanding the factors that influence the scale and stability of these representations, both of which are important for accurate spatial navigation. In addition, it is argued in both cases that many of the computations observed in these systems emerge at least in part as a consequence of a particular type of network structure, where excitatory neurons are driven by external sources and then mutually inhibit each other via interactions mediated by inhibitory cells. The first contribution of this thesis, which is described in chapter 2, is an investigation into the origin of the change in the scale of spatial representations across the dorsoventral axis of the hippocampus. Here it will be argued that this change in scale is due to increased processing of nonspatial information, rather than a dorsoventral change in the scale of the spatially-modulated inputs to this structure. Chapter 3 explores the factors influencing the dynamical stability of a class of pattern-forming networks known as continuous attractor networks, which have been used to model various components of the spatial navigation system, including head direction cells, place cells, and grid cells. Here it will be shown that network architecture, the amount of input drive, and the timescales at which cells interact all influence the stability of the patterns formed by these networks. Finally, in chapter 4, a new technique for analyzing neural data is introduced. This technique is a spike train similarity measure designed to compare spike trains on the basis of shared inhibition and bursts.
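For readers unfamiliar with the models referred to in chapter 3, the sketch below simulates a very small rate-based ring attractor in which units receive external drive and interact through local excitation and global inhibition, so that a bump of activity forms and persists after a transient cue. It is a generic textbook-style model with arbitrary parameters, not the specific networks analyzed in the dissertation.

```python
import numpy as np

# Minimal rate-based ring attractor: cosine-shaped local excitation plus global
# inhibition lets a bump of activity persist after the cue that created it is removed.
n = 128
theta = np.linspace(0.0, 2 * np.pi, n, endpoint=False)
W = (3.0 * np.cos(theta[:, None] - theta[None, :]) - 1.0) / n   # assumed strengths

dt, tau = 1.0, 10.0            # Euler step and time constant (ms), assumed
r = np.zeros(n)
for step in range(3000):
    cue = 2.0 * np.exp(-((theta - np.pi) ** 2) / 0.5) if step < 500 else 0.0
    drive = 0.5 + cue                                            # uniform background + transient cue
    r += dt / tau * (-r + np.maximum(W @ r + drive, 0.0))        # rectified-linear rate dynamics

print("bump peak (rad):", theta[np.argmax(r)])                   # stays near pi after the cue is gone
print("peak / mean rate:", r.max() / r.mean())                   # clearly modulated, not uniform
```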
|
5 |
Spike train propagation in the axon of a visual interneuron, the descending contralateral movement detector of Locusta migratoria. Sproule, Michael. 07 October 2011.
Neurons perform complex computations, communications and precise transmissions of information in the form of action potentials (APs). The high level of heterogeneity and complexity at all levels of organization within a neuron and the functional requirement of highly permeable cell membranes leave neurons exposed to damage when energy levels are insufficient for the active maintenance of ionic gradients. When energy is limiting, the ionic gradient across a neuron's cell membrane risks being dissipated, which can have dire consequences. Other researchers have advocated "generalized channel arrest" and/or "spike arrest" as a means of reducing neuronal permeability, allowing neurons to adjust the demands placed on their electrogenic pumps to lower levels of energy supply. I investigated the consequences of hypoxia on the propagation of a train of APs down the length of a fast-conducting axon capable of transmitting APs at very high frequencies. Under normoxic conditions I found that APs show conduction velocities and instantaneous frequencies nearly double those of neurons experiencing energy-limiting hypoxic conditions. I show that hypoxia affects AP conduction differently for different lengths of axon and for APs of different instantaneous frequencies. Action potentials of high instantaneous frequency in branching lengths of axon within ganglia were delayed more significantly than those in non-branching lengths contained within the connective, and fail preferentially in branching axon. I found that octopamine attenuates the effects of hypoxia on AP propagation for the branching length of axon but has no effect on the non-branching length of axon. Additionally, for energetically stable cells, application of the anti-diabetic medication metformin or the hyperpolarization-activated cyclic nucleotide-gated (HCN) channel blocker ZD7288 resulted in a reduced performance similar to that seen in neurons experiencing energetic stress. Furthermore, both metformin and ZD7288 affect the shape of individual APs within an AP train as well as the original temporal sequence of the AP train, which encodes behaviourally relevant information. I propose that the reduced performance observed in an energetically compromised cell represents an adaptive mechanism employed by neurons in order to maintain the integrity of their highly heterogeneous and complex organization during periods of reduced energy supply. / Thesis (Master, Biology) -- Queen's University, 2011.
|
6 |
Mining Statistically Significant Temporal Associations In Multiple Event Sequences. Liang, Han. Unknown Date.
No description available.
|
7 |
Considerations in the practical implementation of a travelling wave cochlear implant processor. Du Preez, Christiaan Cronje. 10 August 2012.
Speech processing in the human cochlea introduces travelling waves on the basilar membrane. These travelling waves have largely been ignored in most processing strategies. This study implements a hydrodynamical model in a speech processing strategy in order to investigate the neural spike train patterns for a travelling wave processing strategy. In cochlear implants a trade-off remains between the stimulation rate and the number of electrode channels. This trade-off was investigated in the proposed travelling wave strategy. Taking into consideration existing current spread and electrical stimulation models, predicted neural spike train responses have shown that stimulating fewer channels (six and four) at stimulation rates of 2 400 pps and 3 600 pps gives better approximations of predicted normal hearing responses for input frequencies of 200 Hz, 600 Hz and 1 kHz, compared to stimulating more channels at lower channel stimulation rates. The predicted neural spike train patterns suggest that these resulting neural patterns might contain both spatial and temporal information that could be extracted by the auditory system. For a frequency of 4 kHz the predicted neural patterns for a channel-number stimulation-rate configuration of 2 - 7 200 pps suggested that although there is no travelling wave delay information, the predicted neural patterns still contain temporal information. The predicted ISI histograms show peaks at the input tone period and multiples thereof, with clusters of spikes evident around the tone period in the predicted spatio-temporal neural spike train patterns. Similar peaks at the tone period were observed in calculated ISI histograms for predicted normal hearing neural patterns and measured neural responses. The predicted spatio-temporal neural patterns for the input frequency of 200 Hz show the travelling wave delay with clusters of spikes at the tone period. This travelling wave delay can also be seen in predicted normal hearing neural responses. The current spread, however, shows a significant distortion effect around the characteristic frequency place where the travelling wave delay increases rapidly. Spacing electrodes more closely results in an increase in this distortion, with the nerve fibre threshold decreasing in adjacent populations of nerve fibres, increasing the probability of firing. The current spread showed a more limited distortion effect on travelling wave delays when electrodes were spaced across the cochlea, at an electrode spacing of 6.08 mm. ISI histogram results also showed increased peaks around the tone period and multiples thereof. These predicted neural spike train patterns suggest that travelling waves in processing strategies, although mostly ignored, might provide the auditory system with both the spatial and temporal information needed for better pitch perception. / Dissertation (MEng)--University of Pretoria, 2012. / Electrical, Electronic and Computer Engineering / MEng / Unrestricted
|
8 |
Causal pattern inference from neural spike train data. Echtermeyer, Christoph. January 2009.
Electrophysiological recordings are a valuable tool for neuroscience in order to monitor the activity of multiple or even single neurons. Significant insights into the nervous system have been gained by analyses of the resulting data; in particular, many findings were derived from spike trains, whose correlations can give valuable indications about neural interplay. But detecting, specifying, and representing neural interactions is mathematically challenging. Further, recent advances in recording techniques have led to an increase in the volume of collected data, which often poses additional computational problems. These developments call for new, improved methods in order to extract crucial information. The contribution of this thesis is twofold: it presents a novel method for the analysis of neural spike train data, as well as a generic framework with which to assess the new and related techniques. The new computational method, the Snap Shot Score, can be used to inspect spike trains with respect to temporal dependencies, which are visualised as an information flow network. These networks can specify the relationships in the data, indicate changes in dependencies, and point to causal interactions. The Snap Shot Score is demonstrated to reveal plausible networks both in a variety of simulations and for real data, indicating its value for understanding neural dynamics. In addition to the Snap Shot Score, a neural simulation framework is suggested, which facilitates the assessment of neural network inference techniques in a highly automated fashion. Due to a new formal concept to rate learned networks, the framework can be used to test techniques under partial observability conditions. In the presence of hidden units, quantification of results has been a tedious task that had to be done by hand, but it can now be automated. Thereby high-throughput assessments become possible, which facilitate a comprehensive simulation-based characterisation of new methods.
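The Snap Shot Score itself is specific to this thesis and is not reproduced here, but a common baseline for the same question (does spiking in one train predict spiking in another at a short lag?) is the pairwise cross-correlogram. The sketch below shows only that baseline, with arbitrary bin settings and synthetic data.

```python
import numpy as np

def cross_correlogram(spikes_a, spikes_b, max_lag=0.05, bin_width=0.005):
    """Histogram of lags (t_b - t_a) between spike pairs within +/- max_lag seconds."""
    lags = []
    for t in spikes_a:
        nearby = spikes_b[(spikes_b > t - max_lag) & (spikes_b < t + max_lag)]
        lags.extend(nearby - t)
    edges = np.arange(-max_lag, max_lag + bin_width, bin_width)
    counts, _ = np.histogram(lags, bins=edges)
    return counts, edges

# Synthetic pair: neuron B tends to fire ~12 ms after neuron A.
rng = np.random.default_rng(2)
a = np.sort(rng.uniform(0.0, 100.0, 500))                          # ~5 Hz train (s)
mask = rng.random(a.size) < 0.4                                    # 40% of A spikes are followed
follow = a[mask] + 0.012 + 0.001 * rng.standard_normal(mask.sum())
b = np.sort(np.concatenate([follow, rng.uniform(0.0, 100.0, 300)]))

counts, edges = cross_correlogram(a, b)
k = int(np.argmax(counts))
print("peak lag (s):", 0.5 * (edges[k] + edges[k + 1]))            # near +0.012: A leads B
```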
|
9 |
Compressed wavefield extrapolation with curvelets. Lin, Tim T. Y.; Herrmann, Felix J. January 2007.
An explicit algorithm for the extrapolation of one-way wavefields is proposed which combines recent developments in information theory and theoretical signal processing with the physics of wave propagation. Because of excessive memory requirements, explicit formulations for wave propagation have proven to be a challenge in 3-D. By using ideas from "compressed sensing", we are able to formulate the (inverse) wavefield extrapolation problem on small subsets of the data volume, thereby reducing the size of the operators. According to compressed sensing theory, signals can successfully be recovered from an incomplete set of measurements when the measurement basis is incoherent with the representation in which the wavefield is sparse. In this new approach, the eigenfunctions of the Helmholtz operator are recognized as a basis that is incoherent with curvelets, which are known to compress seismic wavefields. By casting the wavefield extrapolation problem in this framework, wavefields can successfully be extrapolated in the modal domain via a computationally cheaper operation. A proof of principle for the "compressed sensing" method is given for wavefield extrapolation in 2-D. The results show that our method is stable and produces results identical to the direct application of the full extrapolation operator.
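As a toy illustration of the compressed sensing principle the abstract relies on, namely that a sparse signal can be recovered from far fewer incoherent measurements than its length, the sketch below solves a small L1-regularised recovery problem with iterative soft thresholding (ISTA). It does not touch the curvelet or Helmholtz machinery of the paper; problem sizes and the regularisation weight are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
n, m, k = 256, 80, 8                           # signal length, measurements, sparsity (assumed)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

A = rng.standard_normal((m, n)) / np.sqrt(m)   # incoherent (random) measurement matrix
y = A @ x_true                                 # m << n measurements

# ISTA for min 0.5*||A x - y||^2 + lam*||x||_1
lam = 0.01
L = np.linalg.norm(A, 2) ** 2                  # Lipschitz constant of the gradient
x = np.zeros(n)
for _ in range(2000):
    grad = A.T @ (A @ x - y)
    z = x - grad / L
    x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)   # soft threshold

print("relative recovery error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```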
|
10 |
Multivariate Multiscale Analysis of Neural Spike Trains. Ramezan, Reza. 10 December 2013.
This dissertation introduces new methodologies for the analysis of neural spike trains. Biological properties of the nervous system, and how they are reflected in neural data, can motivate specific analytic tools. Some of these biological aspects motivate multiscale frameworks, which allow for simultaneous modelling of the local and global behaviour of neurons. Chapter 1 provides the preliminary background on the biology of the nervous system and details the concepts of information and randomness in the analysis of neural spike trains. It also provides the reader with a thorough literature review on current statistical models for the analysis of neural spike trains. The material presented in the next six chapters (2-7) has been the focus of three papers, which have either already been published or are being prepared for publication.
It is demonstrated in Chapters 2 and 3 that the multiscale complexity penalized likelihood method, introduced in Kolaczyk and Nowak (2004), is a powerful model for the simultaneous modelling of spike trains with biological properties from different time scales. To detect the periodic spiking activities of neurons, two periodic models from the literature, Bickel et al. (2007, 2008) and Shao and Li (2011), were combined and modified in a multiscale penalized likelihood model. The contributions of these chapters are (1) employing a powerful visualization tool, the inter-spike interval (ISI) plot, (2) combining the multiscale method of Kolaczyk and Nowak (2004) with the periodic models of Bickel et al. (2007, 2008) and Shao and Li (2011) to introduce the so-called additive and multiplicative models for the intensity function of neural spike trains, and introducing a cross-validation scheme to estimate their tuning parameters, (3) providing numerical bootstrap confidence bands for the multiscale estimate of the intensity function, and (4) studying the effect of time scale on the statistical properties of spike counts.
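For readers unfamiliar with the inter-spike interval (ISI) plot mentioned in contribution (1), the intervals are simply the first differences of the spike times; histogramming them (or plotting consecutive intervals against each other) exposes refractoriness, bursting, and periodic firing. The sketch below, on a synthetic train with a 10 Hz periodic component, is illustrative only.

```python
import numpy as np

rng = np.random.default_rng(4)
# Synthetic spike train: irregular (Poisson-like) background plus a 10 Hz periodic component.
background = rng.exponential(scale=0.2, size=200).cumsum()     # ~5 Hz irregular spikes (s)
periodic = np.arange(0.0, background[-1], 0.1)                 # 10 Hz regular spikes (s)
spike_times = np.sort(np.concatenate([background, periodic]))

isi = np.diff(spike_times)                                     # inter-spike intervals (s)
hist, edges = np.histogram(isi, bins=np.arange(0.0, 0.3, 0.01))
print("median ISI (s):", np.median(isi))
print("mode of ISI histogram (s):", edges[np.argmax(hist)])    # near 0.1 s: the 10 Hz rhythm
```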
Motivated by neural integration phenomena, as well as the adjustments for the neural refractory period, Chapters 4 and 5 study the Skellam process and introduce the Skellam Process with Resetting (SPR). Introducing SPR and its application in the analysis of neural spike trains is one of the major contributions of this dissertation. This stochastic process is biologically plausible, and unlike the Poisson process, it does not suffer from limited dependency structure. It also has multivariate generalizations for the simultaneous analysis of multiple spike trains. A computationally efficient recursive algorithm for the estimation of the parameters of SPR is introduced in Chapter 5. Except for the literature review at the beginning of Chapter 4, the rest of the material within these two chapters is original. The specific contributions of Chapters 4 and 5 are (1) introducing the Skellam Process with Resetting as a statistical tool to analyze neural spike trains and studying its properties, including all theorems and lemmas provided in Chapter 4, (2) the two fairly standard definitions of the Skellam process (homogeneous and inhomogeneous) and the proof of their equivalency, (3) deriving the likelihood function based on the observable data (spike trains) and developing a computationally efficient recursive algorithm for parameter estimation, and (4) studying the effect of time scales on the SPR model.
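To convey the intuition behind using a Skellam-type process for neural integration, the sketch below drives a counter up with one Poisson stream (excitation) and down with another (inhibition), so the increments are Skellam-distributed, and adds a simple threshold-and-reset rule to emit spikes. The reset rule and all rates are illustrative assumptions; the precise definition of SPR and its recursive estimation algorithm are those developed in the thesis, not this sketch.

```python
import numpy as np

rng = np.random.default_rng(5)
dt, T = 1e-3, 20.0                        # time step (s) and duration (s), assumed
lam_exc, lam_inh = 120.0, 80.0            # excitatory / inhibitory input rates (Hz), assumed
threshold = 15                            # counts needed to fire, assumed

n_steps = int(T / dt)
exc = rng.poisson(lam_exc * dt, n_steps)  # excitatory counts per bin
inh = rng.poisson(lam_inh * dt, n_steps)  # inhibitory counts per bin

v = 0
spike_times = []
for i in range(n_steps):
    v += exc[i] - inh[i]                  # Skellam-distributed increments
    if v >= threshold:
        spike_times.append(i * dt)
        v = 0                             # reset after a spike (illustrative rule)

spike_times = np.asarray(spike_times)
print("number of spikes:", spike_times.size)
print("mean firing rate (Hz):", spike_times.size / T)
```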
The challenging problem of multivariate analysis of neural spike trains is addressed in Chapter 6. As far as we know, the multivariate models which are available in the literature suffer from limited dependency structures. In particular, modelling negative correlation among spike trains is a challenging problem. To address this issue, the multivariate Skellam distribution, as well as the multivariate Skellam process, both of which have flexible dependency structures, are developed. This chapter also introduces a multivariate version of the Skellam Process with Resetting (MSPR) and a so-called profile-moment likelihood estimation of its parameters. This chapter generalizes the results of Chapters 4 and 5, and therefore, except for the brief literature review provided at the beginning of the chapter, the remainder of the material is original work. In particular, the contributions of this chapter are (1) introducing the multivariate Skellam distribution, (2) introducing two definitions of the multivariate Skellam process in both homogeneous and inhomogeneous cases and proving their equivalence, (3) introducing the Multivariate Skellam Process with Resetting (MSPR) to simultaneously model spike trains from an ensemble of neurons, and (4) utilizing the so-called profile-moment likelihood method to compute estimates of the parameters of MSPR.
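One elementary way to see that Skellam-distributed counts can carry negative correlation, the property highlighted above as difficult for standard multivariate models, is to let two count differences share a Poisson component with opposite signs. The construction below only demonstrates that the sign of the correlation is flexible; it is not claimed to be the multivariate Skellam distribution defined in the thesis.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 100_000
a = rng.poisson(5.0, n)
b = rng.poisson(7.0, n)       # shared Poisson component
c = rng.poisson(4.0, n)

x = a - b                     # Skellam(5, 7)
y = b - c                     # Skellam(7, 4); shares b with opposite sign

# Cov(x, y) = -Var(b) = -7, so corr = -7 / sqrt((5+7)*(7+4)) ~ -0.61
print("corr(x, y):", np.corrcoef(x, y)[0, 1])
```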
A discussion of the developed methodologies, as well as the "next steps", is outlined in Chapter 7.
|