
Neural Networks and the Natural Gradient

Neural network training algorithms have long suffered from the problem of local minima. The advent of natural gradient algorithms promised to overcome this shortcoming by finding better local minima, but at the cost of additional training parameters and computational overhead. Using a new formulation of the natural gradient, this work describes an algorithm that requires less memory and processing time than previous natural-gradient algorithms while achieving comparable performance.
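
For orientation, the generic natural-gradient update (not the specific formulation developed in this thesis, whose details are not given here) preconditions the ordinary gradient with the inverse Fisher information matrix: theta <- theta - lr * F^{-1} * grad. A minimal Python sketch, assuming per-sample gradients are available and using a damped empirical Fisher estimate:

    import numpy as np

    def natural_gradient_step(theta, grads, lr=0.1, damping=1e-4):
        # theta: parameter vector, shape (d,)
        # grads: per-sample gradients, shape (n, d) -- hypothetical inputs
        # Empirical Fisher: average outer product of per-sample gradients.
        fisher = grads.T @ grads / grads.shape[0]
        # Damping keeps the Fisher matrix invertible (a common safeguard).
        fisher += damping * np.eye(theta.size)
        mean_grad = grads.mean(axis=0)
        # Solve F x = g rather than forming F^{-1} explicitly.
        step = np.linalg.solve(fisher, mean_grad)
        return theta - lr * step

Solving the linear system instead of materializing F^{-1} is a standard cost-saving choice; the memory and processing time spent on this Fisher machinery is exactly the overhead the thesis's new formulation aims to reduce.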

Identifier: oai:union.ndltd.org:UTAHS/oai:digitalcommons.usu.edu:etd-1535
Date: 01 May 2010
Creators: Bastian, Michael R.
Publisher: DigitalCommons@USU
Source Sets: Utah State University
Detected Language: English
Type: text
Format: application/pdf
Source: All Graduate Theses and Dissertations
Rights: Copyright for this work is held by the author. Transmission or reproduction of materials protected by copyright beyond that allowed by fair use requires the written permission of the copyright owners. Works not in the public domain cannot be commercially exploited without permission of the copyright owner. Responsibility for any use rests exclusively with the user. For more information contact Andrew Wesolek (andrew.wesolek@usu.edu).
