11

STDP Implementation Using CBRAM Devices in CMOS

January 2015
Alternative computing architectures based on nanoscale neural devices are of increasing interest because of the massive parallelism and scalability they provide; neural computation systems also offer defect-tolerance and self-healing capabilities. Traditional von Neumann architectures, which separate the memory and computation units, inherently suffer from the von Neumann bottleneck, whereby the processor is limited by the rate at which it can fetch instructions. The clock-driven von Neumann computer survived because of technology scaling; however, as transistor scaling slowly comes to an end, with channel lengths of only a few nanometers, processor speeds are beginning to saturate. This led to the development of multi-core systems that process data in parallel, with each core still based on the von Neumann architecture. The human brain has long been a mystery to scientists: modern supercomputers are outperformed by the brain on certain computations, such as pattern recognition, while the brain occupies far less space and consumes a fraction of the power. Neuromorphic computing aims to mimic biological neural systems in silicon to exploit the massive parallelism that neural systems offer. Neuromorphic systems are event-driven rather than clock-driven. One of the issues faced by neuromorphic computing has been the area occupied by these circuits. With recent developments in nanotechnology, nanoscale memristive devices have been developed and offer a promising solution: memristor-based synapses can be up to three times smaller than Complementary Metal Oxide Semiconductor (CMOS) based synapses. In this thesis, the Programmable Metallization Cell (a memristive device) is used to demonstrate a learning algorithm known as Spike-Timing-Dependent Plasticity (STDP). This learning algorithm extends Hebb's learning rule: a synapse's weight is altered by the relative timing of the spikes across it. The synaptic weight is represented by the memristor's conductance, and CMOS oscillator-based circuits produce spikes that modulate that conductance by firing with different phase differences. / Masters Thesis, Electrical Engineering, 2015
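For readers unfamiliar with the rule being implemented, below is a minimal software sketch of the classic pair-based STDP weight update. The constants and exponential window shape are illustrative textbook assumptions, not values from this thesis, whose implementation realizes the timing dependence in analog hardware via overlapping oscillator pulses across the memristor.

```python
import numpy as np

def stdp_dw(dt_ms, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Weight change for a single pre/post spike pair.

    dt_ms = t_post - t_pre. A presynaptic spike shortly before a
    postsynaptic spike (dt_ms > 0) potentiates the synapse; the reverse
    order depresses it. All constants here are illustrative.
    """
    if dt_ms >= 0:
        return a_plus * np.exp(-dt_ms / tau_plus)
    return -a_minus * np.exp(dt_ms / tau_minus)

# Pre fires 5 ms before post: potentiation; 5 ms after: depression.
print(stdp_dw(5.0))   # ~ +0.0078
print(stdp_dw(-5.0))  # ~ -0.0093
```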
12

Mathematical Description of Differential Hebbian Plasticity and its Relation to Reinforcement Learning

Kolodziejski, Christoph Markus 13 February 2009
No description available.
13

Learning in silicon: a floating-gate based, biophysically inspired, neuromorphic hardware system with synaptic plasticity

Brink, Stephen Isaac 24 August 2012
The goal of neuromorphic engineering is to create electronic systems that model the behavior of biological neural systems. Neuromorphic systems can leverage a combination of analog and digital circuit design techniques to enable computational modeling with orders-of-magnitude reductions in size, weight, and power consumption compared to the traditional modeling approach based upon numerical integration. These benefits have the potential to facilitate neural modeling in resource-constrained research environments and to make neural computation practical in the design of intelligent machines, including portable, battery-powered, and energy-harvesting applications. Floating-gate transistor technology is a powerful tool for neuromorphic engineering because it allows dense implementation of synapses with nonvolatile storage of synaptic weights, cancellation of process mismatch, and reconfigurable system design. A novel neuromorphic hardware system, featuring compact and efficient channel-based model neurons and floating-gate transistor synapses, was developed. This system was used to model a variety of network topologies with up to 100 neurons. The networks were shown to possess computational capabilities such as spatio-temporal pattern generation and recognition, winner-take-all competition, bistable activity implementing a "volatile memory", and wavefront-based robotic path planning. Some canonical features of synaptic plasticity, such as potentiation of high-frequency inputs and potentiation of correlated inputs in the presence of uncorrelated noise, were demonstrated, and preliminary results regarding the formation of receptive fields were obtained. Several advances in enabling technologies were made, including methods for programming floating-gate transistor arrays and a reconfigurable system for studying adaptation in floating-gate transistor circuits.
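As one concrete illustration of the computational primitives listed above, here is a minimal rate-based sketch of winner-take-all competition through shared lateral inhibition. This is the standard textbook formulation; the dynamics and parameter values are illustrative assumptions, not the analog circuit used in the thesis.

```python
import numpy as np

def wta(drive, beta=1.2, dt=0.05, steps=400):
    """Rate-based winner-take-all via mutual lateral inhibition.

    Units follow leaky dynamics  da/dt = -a + [drive - beta*(sum(a) - a)]_+ ,
    relaxed here by Euler integration. With inhibition gain beta > 1,
    only the most strongly driven unit remains active at the fixed point.
    """
    drive = np.asarray(drive, dtype=float)
    a = np.zeros_like(drive)
    for _ in range(steps):
        recurrent = np.maximum(drive - beta * (a.sum() - a), 0.0)
        a += dt * (recurrent - a)
    return a

print(wta([1.0, 0.8, 0.3]))  # ~ [1, 0, 0]: the strongest input wins
```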
14

Deep learning in event-based neuromorphic systems

Thiele, Johannes C. 22 November 2019
Inference and training in deep neural networks require large amounts of computation, which in many cases prevents the integration of deep networks in resource-constrained environments. Event-based spiking neural networks are an alternative to standard artificial neural networks that holds the promise of more energy-efficient processing. However, training spiking neural networks to achieve high inference performance is still challenging, in particular when learning must also be compatible with neuromorphic hardware constraints. This thesis studies training algorithms and information encoding in such deep networks of spiking neurons. Starting from a biologically inspired learning rule, we analyze which properties of learning rules are necessary in deep spiking neural networks to enable embedded learning in a continuous-learning scenario. We show that a time-scale-invariant learning rule based on spike-timing-dependent plasticity is able to perform hierarchical feature extraction and classification of simple objects from the MNIST and N-MNIST datasets. To overcome certain limitations of this approach, we design a novel framework for spike-based learning, SpikeGrad, which is a fully event-based implementation of the gradient backpropagation algorithm. We show how this algorithm can be used to train a spiking network that performs inference of relations between numbers and MNIST images. Additionally, we demonstrate that the framework is able to train large-scale convolutional spiking networks to competitive recognition rates on the MNIST and CIFAR10 datasets. In addition to being an effective and precise learning mechanism, SpikeGrad allows the response of the spiking neural network to be described in terms of a standard artificial neural network, which permits a faster simulation of spiking neural network training. Our work therefore introduces several powerful training concepts for on-chip learning in neuromorphic devices that could help to scale spiking neural networks to real-world problems.
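The ANN-equivalent view mentioned in the abstract can be sketched for non-leaky integrate-and-fire neurons: if inputs arrive as spike counts and neurons reset by subtracting the threshold, the output spike count is approximately a floored rectified-linear function of the weighted input counts. This is a hypothetical simplification for illustration; the exact formulation used by SpikeGrad may differ.

```python
import numpy as np

def if_layer_counts(in_counts, w, theta=1.0):
    """Closed-form spike counts of a non-leaky integrate-and-fire layer.

    With reset-by-subtraction and predominantly positive drive, the
    number of threshold crossings is approximately
    floor(total accumulated input / theta), so the layer behaves like
    an ANN layer with a quantized ReLU activation.
    """
    potential = w @ in_counts                     # accumulated drive
    return np.maximum(np.floor(potential / theta), 0.0)

# 3 spikes on input 0 and 1 spike on input 1 give 1.75 units of drive,
# hence one output spike: the count-level view matches an event-driven
# simulation without stepping through individual spike times.
print(if_layer_counts(np.array([3.0, 1.0]), np.array([[0.5, 0.25]])))  # [1.]
```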
15

Pattern formation in neural circuits by the interaction of travelling waves with spike-timing dependent plasticity

Bennett, James Edward Matthew January 2014
Spontaneous travelling waves of neuronal activity are a prominent feature throughout the developing brain and have been shown to be essential for achieving normal function, but the mechanism of their action on post-synaptic connections remains unknown. A well-known and widespread mechanism for altering synaptic strengths is spike-timing dependent plasticity (STDP), whereby the temporal relationship between the pre- and post-synaptic spikes determines whether a synapse is strengthened or weakened. Here, I answer the theoretical question of how these two phenomena interact: what types of connectivity patterns can emerge when travelling waves drive a downstream area that implements STDP, and what are the critical features of the waves and the plasticity rules that shape these patterns? I then demonstrate how the theory can be applied to the development of the visual system, where retinal waves are hypothesised to play a role in the refinement of downstream connections. My major findings are as follows. (1) Mathematically, STDP translates the correlated activity of travelling waves into coherent patterns of synaptic connectivity; it maps the spatiotemporal structure of waves into a spatial pattern of synaptic strengths, building periodic structures into feedforward circuits, analogous to pattern formation in reaction-diffusion systems. The theory reveals a role for the wave speed and the time scale of the STDP rule in determining the spatial frequency of the connectivity pattern. (2) Simulations verify the theory and extend it from one-dimensional to two-dimensional cases, and from simplified linear wavefronts to more complex, realistic, and noisy wave patterns. (3) With appropriate constraints, these pattern-forming abilities can be harnessed to explain a wide range of developmental phenomena, including how receptive fields (RFs) in the visual system are refined in size and topography and how simple-cell and direction-selective RFs can develop. The theory is applied to the visual system here but generalises across brain areas and STDP rules, and it makes several predictions that are testable using existing experimental paradigms.
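A back-of-the-envelope version of the wave-speed/time-scale relationship in finding (1), stated here as a dimensional-analysis estimate rather than the thesis's exact result:

```latex
% Presynaptic cells a distance d apart along the wave's path fire
% \Delta t = d / v apart in time, so the STDP temporal window
% \tau_{\mathrm{STDP}} converts into a characteristic spatial scale
% of the connectivity pattern:
\[
  \lambda \;\sim\; v \, \tau_{\mathrm{STDP}},
\]
% i.e. faster waves or longer STDP time constants yield coarser
% (lower spatial frequency) patterns of synaptic strengths.
```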
16

Redundant Input Cancellation by a Bursting Neural Network

Bol, Kieran G. 20 June 2011
One of the most powerful and important computations the brain performs is solving the sensory "cocktail party problem": adaptively suppressing extraneous signals in an environment. Theoretical studies suggest that the solution involves an adaptive filter that learns to remove the redundant noise. However, the study of neural learning is still in its infancy, and there are many open questions about the stability and application of synaptic learning rules for neural computation. In this thesis, the implementation of an adaptive filter in the brain of a weakly electric fish, A. leptorhynchus, was studied. It was found to require a cerebellar architecture that could supply independent frequency channels of delayed feedback, together with multiple burst learning rules to shape this feedback. This unifies two previously separate ideas about the function of the cerebellum: the cerebellum as an adaptive filter and as a generator of precise temporal inputs.
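The adaptive-filter picture invoked here is classically formalized as least-mean-squares (LMS) noise cancellation. Below is a minimal sketch of that textbook algorithm, not the burst-learning model developed in the thesis: a filter fed a reference copy of the redundant input learns to predict it and subtract it from the sensory stream.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sensory stream = signal of interest + a filtered copy of a redundant
# reference input. The LMS filter learns weights that predict the
# redundant component from the reference and subtracts it.
n, taps, mu = 5000, 8, 0.01
ref = rng.standard_normal(n)                       # redundant reference input
signal = np.sin(2 * np.pi * 0.01 * np.arange(n))   # signal to preserve
redundant = 0.6 * ref + 0.3 * np.roll(ref, 1)      # unknown filtering of ref
sensory = signal + redundant

w = np.zeros(taps)
out = np.zeros(n)
for t in range(taps, n):
    x = ref[t - taps + 1:t + 1][::-1]   # most recent reference samples
    out[t] = sensory[t] - w @ x         # subtract predicted redundancy
    w += mu * out[t] * x                # LMS (Widrow-Hoff) weight update

# Residual contamination drops sharply once the weights converge.
print(np.std(redundant[-500:]))            # noise power before cancellation
print(np.std(out[-500:] - signal[-500:]))  # residual after learning
```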
19

Spike-Timing-Dependent Plasticity at Excitatory Synapses on the Rat Subicular Pyramidal Neurons

Pandey, Anurag January 2014 (PDF)
The subiculum is a structure that forms a bridge between the hippocampus and the entorhinal cortex (EC) in the brain, and plays a major role in the memory consolidation process. It consists of different types of pyramidal neurons, which are classified by their firing behavior into strong burst firing (SBF), weak burst firing (WBF) and regular firing (RF) neurons. In the first part of the work, morphological differences between the neuronal subtypes were explored by biocytin staining after classifying the neurons by their electrophysiological properties. Detailed morphological properties of the three subtypes were analyzed using the Neurolucida neuron reconstruction method. Unlike their electrophysiological properties, no differences were found in their morphometric properties. In the second part of the thesis, experimental results on spike-timing-dependent plasticity (STDP) at the proximal excitatory inputs to subicular pyramidal neurons of the juvenile (P15-P19) rat are described. STDP was studied in the WBF and RF neurons. Causal pairing of a single EPSP with a single back-propagating action potential (bAP) at a time interval of 10 ms failed to induce plasticity. However, increasing the number of bAPs in the EPSP-bAP pair to three at 50 Hz (a bAP burst) induced LTD in both the RF and the WBF neurons, as did increasing the frequency of action potentials in the bAP burst to 150 Hz during causal pairing; all other STDP experiments were performed with bAP bursts of 3 bAPs evoked at 50 Hz. The amplitude of the causal-pairing-induced LTD decreased with increasing time interval between the EPSP and the bAP burst. Reversing the order of the EPSP and the bAP burst induced LTP only at a short time interval of 10 ms. This finding contrasts with most reports on excitatory synapses, wherein pre-before-post (causal) pairing induces LTP and vice versa. The results of causal and anti-causal pairing were used to plot the STDP curve for the WBF neurons: LTD was observed up to a causal time interval of 30 ms, while LTP was limited to a 10 ms interval, so the curve was biased towards LTD. These results reaffirm earlier observations that the relative timing of pre- and postsynaptic activity can lead to multiple types of STDP curves. Next, the mechanism of this non-Hebbian LTD was studied in both the RF and WBF neurons. The involvement of postsynaptic calcium in plasticity induction was examined by chelating intracellular calcium with BAPTA: LTD induction in the WBF neurons required postsynaptic calcium, while LTD induction in the RF neurons did not. Paired-pulse ratio (PPR) experiments suggested a presynaptic mechanism for LTD induction in the RF neurons but not in the WBF neurons, since the PPR was unaffected by the induction protocol only in the WBF neurons. LTD induction in the WBF neurons required NMDA receptor activity, since their LTD was not observed in the presence of an NMDA receptor blocker, while LTD in the RF neurons was unaffected. The RF neurons, in contrast, required L-type calcium channel activity, since their LTD was affected by L-type calcium channel blockers, whereas LTD in the WBF neurons was not. Hence, in addition to a non-Hebbian STDP curve, a novel mechanism of LTD induction is reported, in which L-type calcium channels drive a synaptic plasticity that is expressed via a change in release probability. Given the central role of the subiculum and of LTD within it, these findings may have strong implications for the memory consolidation process.
