Historically, artificial neural networks have been loosely defined as biologically inspired computational models. When deciding what sort of network to use for a given task, two things need to be considered. The first is the representational power of the network: what class of problems can it solve? Given a set of problems to be solved by neural networks, a network that can solve any problem in the set is called a universal approximator. The second is the ability to find a desired network from an initial network via a learning rule. Here we are interested in the question of universal approximation. A general definition of artificial neural networks is provided, along with definitions of several kinds of universal approximation. We then prove that the recurrent temporal restricted Boltzmann machine (RTRBM) satisfies a general type of universal approximation for stochastic processes, extending previous results for the simple RBM. We conclude by examining the potential role of such temporal artificial neural networks in the biological process of perception.
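For context, the restricted Boltzmann machine referred to in the abstract is standardly defined by an energy function over binary visible and hidden units. The sketch below uses this standard textbook notation, which is an assumption here rather than the thesis's own formulation.

```latex
% Standard RBM definition (assumed notation, not taken from the thesis):
% visible units v \in \{0,1\}^m, hidden units h \in \{0,1\}^n,
% weights W and biases b, c.
\[
  E(v,h) = -b^{\top} v - c^{\top} h - v^{\top} W h,
  \qquad
  P(v,h) = \frac{e^{-E(v,h)}}{\sum_{v',h'} e^{-E(v',h')}} .
\]
% The known universal-approximation result for the simple RBM
% (Le Roux & Bengio, 2008) states that any distribution over \{0,1\}^m
% can be approximated arbitrarily well by the marginal P(v) of an RBM
% with sufficiently many hidden units. The RTRBM adds connections between
% hidden units at successive time steps; the thesis extends this kind of
% result to distributions over sequences, i.e. stochastic processes.
```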
Identifier | oai:union.ndltd.org:uvic.ca/oai:dspace.library.uvic.ca:1828/7040 |
Date | 15 January 2016 |
Creators | Odense, Simon |
Contributors | Edwards, Roderick |
Source Sets | University of Victoria |
Language | English |
Detected Language | English |
Type | Thesis |
Format | application/pdf |
Rights | Available to the World Wide Web |