In this contribution we evaluate on-line and off-line techniques to train a single
hidden layer neural network classifier with logistic hidden and softmax output transfer
functions on a multispectral pixel-by-pixel classification problem. In contrast to
current practice, a multiple-class cross-entropy error function has been chosen as the
function to be minimized. The non-linear differential equations cannot be solved in
closed form. To solve for a set of locally minimizing parameters we use the gradient
descent technique for parameter updating based upon the backpropagation technique
for evaluating the partial derivatives of the error function with respect to the
parameter weights. Empirical evidence shows that on-line and epoch-based gradient
descent backpropagation fail to converge within 100,000 iterations, due to the fixed
step size. Batch gradient descent backpropagation training is superior in terms of
learning speed and convergence behaviour. Stochastic epoch-based training tends to
be slightly more effective than on-line and batch training in terms of generalization
performance, especially when the number of training examples is large. Moreover, it
is less prone to falling into local minima than the on-line and batch modes of operation. (authors' abstract)

Series | Discussion Papers of the Institute for Economic Geography and GIScience
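The setup the abstract describes — a single hidden layer with logistic units, a softmax output layer, multiple-class cross-entropy as the error function, and batch gradient descent with backpropagated partial derivatives — can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the toy Gaussian-blob data (standing in for multispectral pixels), the network size, and the learning rate are all our own assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))  # shift for numerical stability
    return e / e.sum(axis=1, keepdims=True)

# Toy data: three Gaussian blobs standing in for multispectral pixel classes
# (assumption for illustration; the paper uses real multispectral imagery).
n_per, n_feat, n_class, n_hidden = 50, 4, 3, 8
X = np.vstack([rng.normal(loc=c, scale=0.5, size=(n_per, n_feat))
               for c in range(n_class)])
y = np.repeat(np.arange(n_class), n_per)
T = np.eye(n_class)[y]                       # one-hot target matrix

# Parameters of the single-hidden-layer network
W1 = rng.normal(scale=0.1, size=(n_feat, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.1, size=(n_hidden, n_class)); b2 = np.zeros(n_class)

lr = 0.5                                     # fixed step size (assumed value)
for epoch in range(500):                     # batch mode: all examples per update
    H = sigmoid(X @ W1 + b1)                 # logistic hidden activations
    P = softmax(H @ W2 + b2)                 # softmax class probabilities
    # Multiple-class cross-entropy error
    E = -np.mean(np.sum(T * np.log(P + 1e-12), axis=1))
    # Backpropagation of the partial derivatives of E w.r.t. the weights
    d2 = (P - T) / len(X)                    # combined softmax/cross-entropy gradient
    dW2 = H.T @ d2; db2 = d2.sum(axis=0)
    d1 = (d2 @ W2.T) * H * (1.0 - H)         # logistic derivative at the hidden layer
    dW1 = X.T @ d1; db1 = d1.sum(axis=0)
    W1 -= lr * dW1; b1 -= lr * db1           # gradient-descent parameter update
    W2 -= lr * dW2; b2 -= lr * db2

acc = (P.argmax(axis=1) == y).mean()         # training-set accuracy
```

On-line training would instead update the weights after each example, and stochastic epoch-based training after each randomly drawn subset per epoch; only the update loop changes, not the backpropagated gradients.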
Identifier | oai:union.ndltd.org:VIENNA/oai:epub.wu-wien.ac.at:4152
Date | 12 1900
Creators | Staufer-Steinnocher, Petra, Fischer, Manfred M. |
Publisher | WU Vienna University of Economics and Business |
Source Sets | Wirtschaftsuniversität Wien |
Language | English |
Detected Language | English |
Type | Paper, NonPeerReviewed |
Format | application/pdf |
Relation | http://epub.wu.ac.at/4152/ |