
A Neural Network Classifier for Spectral Pattern Recognition. On-Line versus Off-Line Backpropagation Training

In this contribution we evaluate on-line and off-line techniques to train a single
hidden layer neural network classifier with logistic hidden and softmax output transfer
functions on a multispectral pixel-by-pixel classification problem. In contrast to
current practice, a multiple-class cross-entropy error function has been chosen as the
function to be minimized. The non-linear differential equations cannot be solved in
closed form. To solve for a set of locally minimizing parameters, we use the gradient
descent technique for parameter updating, based upon the backpropagation technique
for evaluating the partial derivatives of the error function with respect to the
parameter weights. Empirical evidence shows that on-line and epoch-based gradient
descent backpropagation fail to converge within 100,000 iterations, due to the fixed
step size. Batch gradient descent backpropagation training is superior in terms of
learning speed and convergence behaviour. Stochastic epoch-based training tends to
be slightly more effective than on-line and batch training in terms of generalization
performance, especially when the number of training examples is large. Moreover, it
is less prone to falling into local minima than on-line and batch modes of operation. (authors' abstract)
Series: Discussion Papers of the Institute for Economic Geography and GIScience

Identifier: oai:union.ndltd.org:VIENNA/oai:epub.wu-wien.ac.at:4152
Date: 12 1900
Creators: Staufer-Steinnocher, Petra; Fischer, Manfred M.
Publisher: WU Vienna University of Economics and Business
Source Sets: Wirtschaftsuniversität Wien
Language: English
Detected Language: English
Type: Paper, NonPeerReviewed
Format: application/pdf
Relation: http://epub.wu.ac.at/4152/
