Comparing generalized additive neural networks with multilayer perceptrons / Goosen, Johannes Christiaan (January 2011)
In this dissertation, generalized additive neural networks (GANNs) and multilayer perceptrons (MLPs) are studied
and compared as prediction techniques. MLPs are the most widely used type of artificial neural network
(ANN), but are considered black boxes with regard to interpretability. There is currently no simple a priori
method to determine the number of hidden neurons in each of the hidden layers of ANNs. Existing guidelines are either heuristic or based on simulations derived from limited experiments. A modified version of
the neural network construction with cross-validation samples (N2C2S) algorithm is therefore implemented and
utilized to construct good MLP models. This algorithm enables the comparison with GANN models. GANNs
are a relatively new type of ANN, based on the generalized additive model. The architecture of a GANN is less complex than that of an MLP, and its results can be interpreted with a graphical method called the partial residual plot. A GANN consists of an input layer where each input node has its own MLP with one hidden layer.
Originally, GANNs were constructed by interpreting partial residual plots. This method is time-consuming and subjective, and may lead to suboptimal models. Consequently, an automated construction
algorithm for GANNs was created and implemented in the SAS® statistical language. This system is called AutoGANN and is used to create good GANN models.
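The additive structure described above can be illustrated with a short sketch. The following Python/NumPy code is not from the thesis: the class names, tanh activations, hidden-layer size, and random initialization are illustrative assumptions. It shows one one-hidden-layer MLP per input, with the subnetwork outputs summed (plus a bias) to form the prediction.

import numpy as np

class UnivariateMLP:
    # One-input MLP with a single hidden layer: x -> f_j(x).
    def __init__(self, hidden_units, rng):
        self.w1 = rng.normal(scale=0.1, size=(1, hidden_units))
        self.b1 = np.zeros(hidden_units)
        self.w2 = rng.normal(scale=0.1, size=(hidden_units, 1))

    def forward(self, x_col):
        # x_col has shape (n, 1); tanh hidden layer, linear output.
        return np.tanh(x_col @ self.w1 + self.b1) @ self.w2

class GANN:
    # Additive model: y_hat = bias + sum_j f_j(x_j),
    # where each f_j is its own one-hidden-layer MLP.
    def __init__(self, n_inputs, hidden_units=3, seed=0):
        rng = np.random.default_rng(seed)
        self.subnets = [UnivariateMLP(hidden_units, rng) for _ in range(n_inputs)]
        self.bias = 0.0

    def forward(self, X):
        # X has shape (n, n_inputs); each column feeds its own subnetwork.
        contributions = [net.forward(X[:, [j]]) for j, net in enumerate(self.subnets)]
        return self.bias + np.sum(contributions, axis=0).ravel()

Because each f_j depends on a single input, its fitted values can be plotted against that input together with the partial residuals, which is what makes the partial residual plot a natural interpretation tool for this architecture.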
A number of experiments are conducted on five publicly available data sets to gain insight into the similarities
and differences between GANN and MLP models. The data sets include regression and classification tasks.
In-sample model selection with the SBC criterion and out-of-sample model selection with the average validation error as the criterion are performed. The models created are compared in terms
of predictive accuracy, model complexity, comprehensibility, ease of construction and utility.
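As a hedged sketch of the two selection strategies mentioned above (in Python, with illustrative dictionary keys that are not from the thesis): in-sample selection scores each candidate with the standard least-squares form of the Schwarz Bayesian criterion, SBC = n*ln(SSE/n) + k*ln(n) for n observations and k parameters, while out-of-sample selection picks the candidate with the lowest average validation error; lower is better in both cases.

import numpy as np

def sbc(sse, n_obs, n_params):
    # Schwarz Bayesian criterion for a least-squares model:
    # trades in-sample fit (SSE) against model complexity (number of parameters).
    return n_obs * np.log(sse / n_obs) + n_params * np.log(n_obs)

def select_in_sample(candidates):
    # candidates: list of dicts with 'sse', 'n_obs', 'n_params' (illustrative keys).
    return min(candidates, key=lambda m: sbc(m["sse"], m["n_obs"], m["n_params"]))

def select_out_of_sample(candidates):
    # candidates: list of dicts with 'val_errors', one error per validation sample.
    return min(candidates, key=lambda m: np.mean(m["val_errors"]))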
The results show that the choice of model is highly dependent on the problem, as no single model always
outperforms the other in terms of predictive accuracy. GANNs may be recommended for problems where interpretability of the results is important. The time taken to construct good MLP models with the modified N2C2S algorithm may be shorter than the time taken to build good GANN models with the automated construction algorithm. / Thesis (M.Sc. (Computer Science))--North-West University, Potchefstroom Campus, 2011.