  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
91

"Neural network" based process monitoring

Guerin, Olivier Cedrick 05 1900 (has links)
No description available.
92

On the performance issues of the bidirectional associative memory

Bragansa, John 05 1900 (has links)
No description available.
93

Design of an online trained neural network to control an unknown plant

Roesslinger, Lionel 05 1900 (has links)
No description available.
94

Backcalculation of flexible pavement moduli from falling weight deflectometer data using artificial neural networks

Meier, Roger William 05 1900 (has links)
No description available.
95

Approximating discrete-time optimal control using a neural network

Barth, Eric J. 12 1900 (has links)
No description available.
96

Online training of a neural network controller by improved reinforcement back-propagation

Rose, Stephen Matthew 05 1900 (has links)
No description available.
97

Multi-feature signature analysis for bearing condition monitoring using neural network methodology

Cease, Barry T. 12 1900 (has links)
No description available.
98

The application of artificial neural networks to the detection of bovine mastitis

Yang, Xing Zhu. January 1998 (has links)
The overall objective of this research was to investigate the feasibility of using artificial neural networks to detect the incidence of clinical bovine mastitis and to determine the major factors influencing it. The first part of the research was a general examination of the learning ability of artificial neural networks, trained on relatively small data sets. These data sets (a total of 460,474 records) contained suspected indicators of mastitis such as milk production, stage of lactation and somatic cell count, and it was hoped that the networks would detect relationships that statistical modelling had already established elsewhere in the literature. The second part extended this examination to additional sources of information, such as conformation traits and their genetic values, factors that have not been studied extensively with either conventional approaches or emerging technologies like artificial neural networks. (Abstract shortened by UMI.)
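The kind of detector this abstract describes can be sketched as a single sigmoid unit trained by on-line gradient descent on suspected indicators. The features, the synthetic labelling rule, and all numbers below are illustrative assumptions, not the thesis's actual data or model:

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def make_record():
    # Synthetic cow record with normalised suspected indicators.
    scc = random.random()     # somatic cell count
    milk = random.random()    # daily milk yield
    dim = random.random()     # days in milk (stage of lactation)
    # Toy labelling rule (an assumption): high cell count implies mastitis.
    return [scc, milk, dim], 1 if scc > 0.6 else 0

data = [make_record() for _ in range(1000)]

# Single sigmoid unit trained by on-line gradient descent.
w, b, lr = [0.0, 0.0, 0.0], 0.0, 0.5
for _ in range(30):                        # epochs
    for x, t in data:
        y = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
        grad = y - t                       # dLoss/dz for cross-entropy loss
        w = [wi - lr * grad * xi for wi, xi in zip(w, x)]
        b -= lr * grad

correct = sum(
    (sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b) > 0.5) == (t == 1)
    for x, t in data
)
accuracy = correct / len(data)
```

On this separable toy data the unit learns a large weight on the cell-count feature, which mirrors the role somatic cell count plays as a mastitis indicator in the abstract.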
99

Using neural nets to generate and improve computer graphic procedures

Wenzel, Brent C. January 1992 (has links)
Image compression using neural networks has in the past focused only on reducing the number of bytes to be stored, even though those bytes carried no meaning. This study looks at a new process that reduces the number of bytes stored while also giving the bytes meaning: the bytes of the compressed image correspond to parameters of an existing graphic algorithm. After a brief review of common neural networks and graphic algorithms, the back-propagation neural network was chosen to test this new process. Both three-layer and four-layer networks were tested; the four-layer network was used in further tests because of its improved response compared to the three-layer network. Two different training sets were used: a small normal training set and an extended version that included extreme-value sets. These two training sets were presented to the neural network in two forms. The first was the raw format with no preprocessing; the second used a Fast Fourier Transform to preprocess the data in an effort to distribute the image data throughout the image plane. The neural network responded well to images it was trained on but poorly to new images that were not in the training sets. / Department of Computer Science
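The core idea, storing the parameters of a graphic procedure rather than the pixels themselves, can be sketched without a network. The circle renderer and its parameters below are illustrative assumptions, not the thesis's actual graphic algorithms:

```python
def render_circle(cx, cy, r, size=16):
    # Procedural "graphic algorithm": rasterise a filled circle as a
    # size x size binary image.
    return [[1 if (x - cx) ** 2 + (y - cy) ** 2 <= r * r else 0
             for x in range(size)]
            for y in range(size)]

# Instead of storing 16 * 16 = 256 pixel values, store only the three
# parameters a trained network would be asked to recover from the image;
# re-running the graphic algorithm regenerates the image exactly.
params = (8, 8, 5)
image = render_circle(*params)

raw_size = 16 * 16             # values stored if pixels are kept directly
compressed_size = len(params)  # values stored under this scheme
reconstructed = render_circle(*params)
```

The network's job in the thesis is the inverse mapping, from the rendered image back to `params`; the abstract's finding that this generalised poorly to unseen images reflects how hard that inverse problem is for a plain back-propagation network.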
100

Statistical mechanics of neural networks

Whyte, William John January 1995 (has links)
We investigate five different problems in the field of the statistical mechanics of neural networks. The first three problems involve attractor neural networks that optimise particular cost functions for the storage of static memories as attractors of the neural dynamics. We study the effects of replica symmetry breaking (RSB) and attempt to find algorithms that will produce the optimal network when error-free storage is impossible. For the Gardner-Derrida network we show that full RSB is necessary for an exact solution everywhere above saturation. We also show that, no matter what cost function is optimised, if the distribution of stabilities has a gap then the Parisi replica ansatz that has been made is unstable. For the noise-optimal network we find a continuous transition to replica symmetry breaking at the AT line, in line with previous studies of RSB for different networks. The change to one-step RSB (RSB1) significantly improves the agreement between "experimental" and theoretical calculations of the local stability distribution ρ(λ); the effect on observables is smaller. We show that if the network is presented with a training set which has been generated from a set of prototypes by some noisy rule, but neither the noise level nor the prototypes are known, then the perceptron algorithm is the best initial choice to produce a network that will generalise well. If additional information is available, more sophisticated algorithms will be faster and give a smaller generalisation error. The remaining problems deal with attractor neural networks with separable interaction matrices, which can be used (under parallel dynamics) to store sequences of patterns without the need for time delays. We look at the effects of correlations on a single-sequence network, and numerically investigate the storage capacity of a network storing an extensive number of patterns in such sequences.
When correlations are implemented along with a term in the interaction matrix designed to suppress some of their effects, the competition between the two produces a rich range of behaviour. Contrary to expectations, increasing the correlations and the operating temperature proves capable of improving the sequence-processing behaviour of the network. Finally, we demonstrate that a network storing a large number of sequences of patterns using a Hebb-like rule can store approximately twice as many patterns as a network trained with the Hebb rule to store individual patterns.
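The Hebb-like sequence rule this abstract refers to can be sketched as follows: with asymmetric couplings J_ij = (1/N) Σ_μ ξ_i^{μ+1} ξ_j^μ, parallel dynamics drives the state from each pattern to the next without time delays. This is a toy demonstration at low loading; the network size, pattern count, and sign convention at zero field are assumptions, not the thesis's exact model:

```python
import random

random.seed(1)
N, P = 200, 5   # neurons and sequence length (loading far below capacity)

# Random +/-1 patterns forming one cyclic sequence xi[0] -> xi[1] -> ...
xi = [[random.choice((-1, 1)) for _ in range(N)] for _ in range(P)]

# Hebb-like asymmetric rule for sequence storage:
#   J_ij = (1/N) * sum_mu xi[(mu+1) % P][i] * xi[mu][j]
J = [[sum(xi[(m + 1) % P][i] * xi[m][j] for m in range(P)) / N
      for j in range(N)]
     for i in range(N)]

def step(S):
    # Parallel (synchronous) dynamics: S_i(t+1) = sign(sum_j J_ij * S_j(t)).
    return [1 if sum(J[i][j] * S[j] for j in range(N)) >= 0 else -1
            for i in range(N)]

# Start on the first pattern; after P parallel updates the network
# should have cycled through the whole sequence and returned to it.
S = list(xi[0])
for _ in range(P):
    S = step(S)

overlap = sum(a * b for a, b in zip(S, xi[0])) / N
```

At this loading the crosstalk noise in each local field is of order sqrt(P/N), so the recalled sequence stays essentially noise-free and the overlap with the starting pattern returns to roughly one after a full cycle.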
