Artificial Grammar Recognition Using Spiking Neural Networks

Cavaco, Philip. January 2009.
This thesis explores the feasibility of Artificial Grammar (AG) recognition using spiking neural networks. A biologically inspired minicolumn model is designed as the base computational unit. Two network topologies are defined, following different design philosophies. Both networks consist of minicolumn models, referred to as nodes, connected with excitatory and inhibitory connections. The first network contains nodes for every bigram and trigram producible by the grammar's finite state machine (FSM). The second network has only the nodes required to identify unique internal states of the FSM. The networks produce predictable activity for tested input strings. Future work to improve the performance of the networks is discussed. The modeling framework developed can be used in neurophysiological research to implement network layouts and compare simulated performance characteristics to actual subject performance.
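The abstract's distinction between the two networks comes down to what each one must enumerate: the first needs a node for every bigram and trigram the grammar's FSM can produce, while the second needs only one node per internal FSM state. The sketch below illustrates that difference on a small Reber-style grammar. The specific states and transition symbols are hypothetical, since the thesis's grammar is not given here; only the enumeration logic is the point.

```python
import random

# Hypothetical FSM for a Reber-style artificial grammar: each state maps
# to a list of (emitted symbol, next state) transitions. An empty list
# marks the accepting state.
FSM = {
    "S0": [("T", "S1"), ("P", "S2")],
    "S1": [("X", "S2"), ("S", "S1")],
    "S2": [("V", "S3"), ("X", "S1")],
    "S3": [],  # accepting state
}

def generate(rng):
    """Walk the FSM from S0 to the accepting state, emitting one symbol
    per transition, to produce a grammatical string."""
    state, out = "S0", []
    while FSM[state]:
        symbol, state = rng.choice(FSM[state])
        out.append(symbol)
    return "".join(out)

def ngrams(string, n):
    """All length-n substrings of a string. The first network described in
    the abstract would allocate one node per producible bigram (n=2) and
    trigram (n=3); the second needs only len(FSM) state nodes."""
    return {string[i:i + n] for i in range(len(string) - n + 1)}

rng = random.Random(0)
sample = generate(rng)
print(sample, sorted(ngrams(sample, 2)), sorted(ngrams(sample, 3)))
```

Because the self-loop on S1 can stretch strings arbitrarily, the bigram/trigram inventory of the first network grows with the richness of the grammar, whereas the state-based second network stays fixed at one node per FSM state, which matches the abstract's contrast between the two designs.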
