
Rule extraction using destructive learning in artificial neural networks

The use of inductive learning to extract general rules from examples is a promising way to overcome the knowledge acquisition bottleneck. Over the last decade, many such techniques have been proposed, but none has proved to be an efficient, general rule extractor for complex real-world applications. Recent research indicates that hybrid-learning techniques, which integrate two or more learning strategies, outperform single learning techniques. In designing such a hybrid method, neural network learning is an attractive partner because it is tolerant of noisy data and flexible with approximate data.

This dissertation proposes another such method: a rule extraction method using an artificial neural network (ANN) trained by destructive learning. Unlike other published methods, the method proposed here takes advantage of the smart (pruned) network, which contains more exact knowledge of the problem domain (environment). The method consists of three phases: training, pruning, and rule extraction. The training phase performs ANN learning with a standard backpropagation (BP) algorithm. In the pruning phase, redundant hidden units and links are deleted from the trained network, and the link weights remaining in the network are then retrained to obtain near-saturated outputs from the hidden units. The rule extraction algorithm uses the pruned network to extract rules.

The proposed method is evaluated empirically on three application domains--the MONK's problems, the IRIS classification data set, and the thyroid-disease diagnosis data set--and its performance is compared with that of other classification and machine learning methods. For discrete samples the proposed method outperforms the others, while for continuous samples it beats most of the methods with which it is compared. The classification accuracy of the proposed method is higher than that of either backpropagation learning alone or the pruned network on which it is based.

Source: Dissertation Abstracts International, Volume: 55-04, Section: B, page: 1526.
Major Professor: R. C. Lacher.
Thesis (Ph.D.)--The Florida State University, 1994.
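The train-prune-extract pipeline described above can be illustrated with a minimal sketch. This is not the author's algorithm: the weights, the magnitude-based pruning threshold, and the hard-threshold activation (standing in for a near-saturated sigmoid) are all hypothetical, chosen only to show how deleting redundant links makes rule enumeration over the surviving inputs tractable.

```python
# Hedged sketch: prune near-zero links from a trained hidden unit, then
# enumerate binary inputs over the surviving links to read off a rule.
from itertools import product

def step(x, threshold=0.0):
    """Hard-threshold activation standing in for a near-saturated sigmoid."""
    return 1 if x > threshold else 0

def prune(weights, eps=0.1):
    """Zero out links whose magnitude is below eps (the destructive step)."""
    return [w if abs(w) >= eps else 0.0 for w in weights]

def extract_rule(weights, bias):
    """Enumerate inputs over surviving links; collect activating patterns."""
    active = [i for i, w in enumerate(weights) if w != 0.0]
    patterns = []
    for bits in product([0, 1], repeat=len(active)):
        total = bias + sum(weights[i] * b for i, b in zip(active, bits))
        if step(total):
            patterns.append(dict(zip(active, bits)))
    return patterns

# Assumed weights from a (hypothetical) trained unit; small weights are
# redundant links that pruning removes.
weights = [4.1, 0.02, 3.9, -0.05]
bias = -6.0

pruned = prune(weights)            # -> [4.1, 0.0, 3.9, 0.0]
rules = extract_rule(pruned, bias)
# Only x0=1, x2=1 activates the unit, i.e. the rule "IF x0 AND x2 THEN fire".
```

After pruning, only two of the four links survive, so the extractor checks 4 input patterns instead of 16; on realistic networks this reduction is what makes exhaustive rule enumeration feasible.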

Identifier: oai:union.ndltd.org:fsu.edu/oai:fsu.digital.flvc.org:fsu_77177
Contributors: Yoon, Byungjoo; Florida State University
Source Sets: Florida State University
Language: English
Detected Language: English
Type: Text
Format: 152 p.
Rights: On campus use only.
Relation: Dissertation Abstracts International
