  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
131

A learning framework for zero-knowledge game playing agents

Duminy, Willem Harklaas 17 October 2007 (has links)
The subjects of perfect information games, machine learning and computational intelligence combine in an experiment that investigates a method to build the skill of a game-playing agent from zero game knowledge. The skill of a playing agent is determined by two aspects: the first is the quantity and quality of the knowledge it uses, and the second is its search capacity. This thesis introduces a novel representation language that combines symbols and numeric elements to capture game knowledge. As far as search is concerned, an extension to an existing knowledge-based search method is developed. Empirical tests show an improvement over alpha-beta, especially in learning conditions where the knowledge may be weak. Current machine learning techniques as applied to game agents are reviewed. From these techniques a learning framework is established. The data-mining algorithm ID3 and the computational intelligence technique Particle Swarm Optimisation (PSO) form the key learning components of this framework. The classification trees produced by ID3 are subjected to new post-pruning processes specifically defined for the mentioned representation language. Different combinations of these pruning processes are tested and a dominant combination is chosen for use in the learning framework. As an extension to PSO, tournaments are introduced as a relative fitness function. A variety of alternative tournament methods are described and some experiments are conducted to evaluate these. The final design decisions are incorporated into the learning framework configuration, and learning experiments are conducted on Checkers and some variations of Checkers. These experiments show that learning has occurred, but also highlight the need for further development and experimentation. Some ideas in this regard conclude the thesis. / Dissertation (MSc)--University of Pretoria, 2007. / Computer Science / MSc / Unrestricted
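The tournament-based relative fitness idea described in this abstract can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the thesis's implementation: the `beats` predicate stands in for playing an actual game between the agents encoded by two particle positions, and the PSO update shown is the standard one.

```python
import random

def round_robin_fitness(particles, beats):
    """Relative fitness: a particle's score is its number of wins in a
    round-robin tournament against every other particle in the swarm."""
    wins = [0] * len(particles)
    for i in range(len(particles)):
        for j in range(i + 1, len(particles)):
            if beats(particles[i], particles[j]):
                wins[i] += 1
            else:
                wins[j] += 1
    return wins

def pso_step(particles, velocities, pbest, gbest, w=0.7, c1=1.5, c2=1.5):
    """One standard PSO velocity/position update, applied in place."""
    for k, (x, v) in enumerate(zip(particles, velocities)):
        for d in range(len(x)):
            r1, r2 = random.random(), random.random()
            v[d] = (w * v[d]
                    + c1 * r1 * (pbest[k][d] - x[d])
                    + c2 * r2 * (gbest[d] - x[d]))
            x[d] += v[d]
```

In the thesis's setting, `beats` would run one or more games between two agents; here any boolean comparison of positions will do.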
132

Prestructuring Multilayer Perceptrons based on Information-Theoretic Modeling of a Partido-Alto-based Grammar for Afro-Brazilian Music: Enhanced Generalization and Principles of Parsimony, including an Investigation of Statistical Paradigms

Vurkaç, Mehmet 01 January 2011 (has links)
The present study shows that prestructuring based on domain knowledge leads to statistically significant generalization-performance improvement in artificial neural networks (NNs) of the multilayer perceptron (MLP) type, specifically in the case of a noisy real-world problem with numerous interacting variables. The prestructuring of MLPs based on knowledge of the structure of a problem domain has previously been shown to improve generalization performance. However, the problem domains for those demonstrations suffered from significant shortcomings: 1) They were purely logical problems, and 2) they contained small numbers of variables in comparison to most data-mining applications today. Two implications of the former were a) the underlying structure of the problem was completely known to the network designer by virtue of having been conceived for the problem at hand, and b) noise was not a significant concern in contrast with real-world conditions. As for the size of the problem, neither computational resources nor mathematical modeling techniques were advanced enough to handle complex relationships among more than a few variables until recently, so such problems were left out of the mainstream of prestructuring investigations. In the present work, domain knowledge is built into the solution through Reconstructability Analysis, a form of information-theoretic modeling, which is used to identify mathematical models that can be transformed into a graphic representation of the problem domain's underlying structure. Employing the latter as a pattern allows the researcher to prestructure the MLP, for instance, by disallowing certain connections in the network. Prestructuring reduces the set of all possible maps (SAPM) that are realizable by the NN. 
The reduced SAPM--according to the Lendaris-Stanley conjecture, conditional probability, and Occam's razor--enables better generalization performance than with a fully connected MLP that has learned the same I/O mapping to the same extent. In addition to showing statistically significant improvement over the generalization performance of fully connected networks, the prestructured networks in the present study also compared favorably to both the performance of qualified human agents and the generalization rates in classification through Reconstructability Analysis alone, which serves as the alternative algorithm for comparison.
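Prestructuring by disallowing connections, as described above, can be illustrated with a binary mask applied to the input-to-hidden weights of an MLP. This is a toy sketch, not the study's actual network: the mask plays the role of the structure extracted through Reconstructability Analysis.

```python
import math

def masked_forward(x, w_ih, mask, w_ho):
    """Forward pass of a one-hidden-layer MLP in which mask[j][i] = 0
    removes the connection from input i to hidden unit j, shrinking the
    set of all possible maps (SAPM) the network can realise."""
    hidden = []
    for j in range(len(w_ih)):
        s = sum(w_ih[j][i] * mask[j][i] * x[i] for i in range(len(x)))
        hidden.append(math.tanh(s))
    return sum(w_ho[j] * hidden[j] for j in range(len(hidden)))
```

With an all-ones mask this reduces to an ordinary fully connected MLP; zeroing entries enforces the prestructured topology during both training and inference.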
133

ECG Classification with an Adaptive Neuro-Fuzzy Inference System

Funsten, Brad Thomas 01 June 2015 (has links) (PDF)
Heart signals allow for a comprehensive analysis of the heart. Electrocardiography (ECG or EKG) uses electrodes to measure the electrical activity of the heart. Extracting ECG signals is a non-invasive process that opens the door to new possibilities for the application of advanced signal processing and data analysis techniques in the diagnosis of heart diseases. With the help of today’s large database of ECG signals, a computationally intelligent system can learn and take the place of a cardiologist. Detection of various abnormalities in the patient’s heart to identify various heart diseases can be made through an Adaptive Neuro-Fuzzy Inference System (ANFIS) preprocessed by subtractive clustering. Six types of heartbeats are classified: normal sinus rhythm, premature ventricular contraction (PVC), atrial premature contraction (APC), left bundle branch block (LBBB), right bundle branch block (RBBB), and paced beats. The goal is to detect important characteristics of an ECG signal to determine if the patient’s heartbeat is normal or irregular. The results from three trials indicate an average accuracy of 98.10%, average sensitivity of 94.99%, and average specificity of 98.87%. These results are comparable to two artificial neural network (ANN) algorithms: gradient descent and Levenberg-Marquardt, as well as the ANFIS preprocessed by grid partitioning.
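The subtractive-clustering preprocessing mentioned in this abstract can be sketched as follows. This is a simplified version of Chiu's algorithm with assumed constants (the influence radius `ra` and stopping `threshold`); the real preprocessing step would go on to derive the ANFIS rule base from the centres it finds.

```python
import math

def subtractive_clustering(points, ra=1.0, threshold=0.5):
    """Simplified subtractive clustering: every data point is a candidate
    cluster centre; its potential is a density estimate, and the potential
    near each accepted centre is subtracted before the next pick."""
    alpha = 4.0 / ra ** 2
    beta = 4.0 / (1.5 * ra) ** 2  # squash radius 1.5 * ra, a common default

    def d2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))

    pot = [sum(math.exp(-alpha * d2(p, q)) for q in points) for p in points]
    first = max(pot)
    centres = []
    while True:
        k = max(range(len(points)), key=lambda i: pot[i])
        if pot[k] < threshold * first:
            break
        pk = pot[k]
        centres.append(points[k])
        # Subtract the accepted centre's influence from every potential.
        for i in range(len(points)):
            pot[i] -= pk * math.exp(-beta * d2(points[i], points[k]))
    return centres
```

Each centre found this way would seed one fuzzy rule, which is how subtractive clustering keeps the rule count far below that of grid partitioning.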
134

Computational Intelligence and Data Mining Techniques Using the Fire Data Set

Storer, Jeremy J. 04 May 2016 (has links)
No description available.
135

Applications of Soft Computing

Tiwari, A., Knowles, J., Avineri, E., Dahal, Keshav P., Roy, R. January 2006 (has links)
No description available.
136

A decision support system for multi-objective programming problems

Rangoaga, Moeti Joseph 11 1900 (has links)
Many concrete problems may be cast in a multi-objective optimisation framework. The redundancy of existing methods for solving multi-objective programming problems, which are susceptible to inconsistencies, coupled with the necessity of making inherent assumptions before using a given method, makes it hard for a nonspecialist to choose a method that fits the situation at hand well. Moreover, using a method blindly, as suggested by the hammer principle (when all you have is a hammer, everything looks like a nail), is an awkward approach at best and a caricatural one at worst. This brings challenges to the design, development, implementation and deployment of a Decision Support System able to choose a method that is appropriate for a given problem and to apply the chosen method to solve the problem under consideration. The choice of method should be made according to the structure of the problem and the decision maker's opinion. The aim here is to embed a sample of methods representing the main multi-objective programming techniques and to help the decision maker find the most appropriate method for his problem. / Decision Sciences / M. Sc. (Operations Research)
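Whatever method such a Decision Support System selects, multi-objective techniques share the underlying notion of Pareto dominance, which can be stated compactly. A minimal sketch, assuming every objective is minimised:

```python
def dominates(a, b):
    """a dominates b (minimisation): a is no worse in every objective
    and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the non-dominated subset of a list of objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points)]
```

A DSS could, for instance, use properties of this front as one structural feature when matching a problem to a solution method.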
137

Learning lost temporal fuzzy association rules

Matthews, Stephen January 2012 (has links)
Fuzzy association rule mining discovers patterns in transactions, such as shopping baskets in a supermarket, or Web page accesses by a visitor to a Web site. Temporal patterns can be present in fuzzy association rules because the underlying process generating the data can be dynamic. However, existing solutions may not discover all interesting patterns because of a previously unrecognised problem that is revealed in this thesis. The contextual meaning of fuzzy association rules changes because of the dynamic nature of the data; the static fuzzy representation and traditional search method are inadequate. The Genetic Iterative Temporal Fuzzy Association Rule Mining (GITFARM) framework solves the problem by utilising flexible fuzzy representations from a fuzzy rule-based system (FRBS). The combination of temporal, fuzzy and itemset space is searched simultaneously with a genetic algorithm (GA) to overcome the problem. The framework transforms the dataset into a graph for efficient searching. A choice of model in fuzzy representation provides a trade-off in usage between an approximate and a descriptive model. A method for verifying the solution to the hypothesised problem is presented. The proposed GA-based solution is compared with a traditional approach that uses an exhaustive search method, and it is shown how the GA-based solution discovered rules that the traditional approach did not. This shows that simultaneously searching for rules and membership functions with a GA is a suitable solution for mining temporal fuzzy association rules. In practice, more knowledge can therefore be discovered for making well-informed decisions that would otherwise be lost with a traditional approach.
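The idea of a rule whose fuzzy and temporal parts are both searchable can be illustrated with a triangular membership function and a windowed support measure. This is a hedged sketch, not GITFARM itself: the transaction layout and field names (`time`, per-item quantities) are illustrative, and in the framework a GA would tune the `(a, b, c)` parameters and the window endpoints together.

```python
def tri_mf(x, a, b, c):
    """Triangular membership function; (a, b, c) are the tunable
    left foot, peak and right foot."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def temporal_fuzzy_support(transactions, item, mf, t_start, t_end):
    """Fuzzy support of `item`, restricted to transactions whose
    timestamp lies in [t_start, t_end]; the window is the rule's
    temporal part."""
    window = [t for t in transactions if t_start <= t["time"] <= t_end]
    if not window:
        return 0.0
    return sum(mf(t.get(item, 0.0)) for t in window) / len(window)
```

A rule whose support is high only in some window would be exactly the kind of "lost" temporal pattern a static, whole-dataset search overlooks.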
138

A new approach to automatic saliency identification in images based on irregularity of regions

Al-Azawi, Mohammad Ali Naji Said January 2015 (has links)
This research introduces an image retrieval system which is, in different ways, inspired by the human vision system. The main problems with existing machine vision systems and image understanding are studied and identified, in order to design a system that relies on human image understanding. The main improvement of the developed system is that it uses human attention principles in the process of image content identification. Human attention is represented by saliency extraction algorithms, which extract the salient regions, or in other words the regions of interest. This work presents a new approach to saliency identification which relies on the irregularity of a region. Irregularity is clearly defined and measuring tools are developed; these measures are derived from the formality and variation of the region with respect to the surrounding regions. Both local and global saliency have been studied, and appropriate algorithms were developed based on the local and global irregularity defined in this work. The need for suitable automatic clustering techniques motivated a study of the available clustering techniques and the development of a technique that is suitable for salient-point clustering. Based on the fact that humans usually look at the region surrounding the gaze point, an agglomerative clustering technique is developed utilising the principles of blob extraction and intersection. Automatic thresholding was needed at different stages of the system development; therefore, a fuzzy thresholding technique was developed. Evaluation methods for salient-region extraction have been studied and analysed; subsequently, evaluation techniques were developed based on the extracted regions (or points) and compared with the ground-truth data. The proposed algorithms were tested against standard datasets and compared with existing state-of-the-art algorithms.
Both quantitative and qualitative benchmarking are presented in this thesis, and a detailed discussion of the results is included. The benchmarking showed promising results for the different algorithms. The developed algorithms have been utilised in designing an integrated saliency-based image retrieval system which uses the salient regions to give a description of the scene. The system auto-labels the objects in the image by identifying the salient objects and assigns labels based on the knowledge-database contents. In addition, the system identifies the unimportant part of the image (the background) to give a full description of the scene.
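As a loose illustration of region-based irregularity (a stand-in, not the measure defined in the thesis), a region can be scored by how far its mean intensity deviates from that of its neighbouring regions; the most deviant region is then the saliency candidate.

```python
def region_irregularity(regions, idx, neighbours):
    """Toy local-irregularity score: absolute deviation of a region's
    mean intensity from the mean of its neighbours' means. `regions`
    maps index -> list of pixel intensities; `neighbours` maps index ->
    adjacent region indices."""
    mean = lambda vals: sum(vals) / len(vals)
    own = mean(regions[idx])
    nb = [mean(regions[n]) for n in neighbours[idx]]
    return abs(own - mean(nb)) if nb else 0.0
```

Under this toy score, a bright region surrounded by dark ones scores high, matching the intuition that saliency is deviation from the surroundings.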
139

Diagnostic monitoring of dynamic systems using artificial immune systems

Maree, Charl 12 1900 (has links)
Thesis (MScEng (Process Engineering))--University of Stellenbosch, 2006. / The natural immune system is an exceptional pattern recognition system based on memory and learning that is capable of detecting both known and unknown pathogens. Artificial immune systems (AIS) employ some of the functionalities of the natural immune system in detecting change in dynamic process systems. The emerging field of artificial immune systems has enormous potential in the application of fault detection systems in process engineering. This thesis aims firstly to familiarise the reader with the various current methods in the field of fault detection and identification. Secondly, the notion of artificial immune systems is introduced and explained. Finally, this thesis investigates the performance of AIS on data gathered from simulated case studies, both with and without noise. Three different methods of generating detectors are used to monitor various processes for anomalous events: (1) random generation of detectors, (2) convex hulls, and (3) the hypercube vertex approach. It is found that random generation provides a reasonable rate of detection, while convex hulls fail to achieve the required objectives. The hypercube vertex method achieved the highest detection rate and lowest false alarm rate in all case studies. The hypercube vertex method originates from this project and is the recommended method for use with all real-valued systems, at least those with a small number of variables. It is found that, in some cases, AIS are capable of perfect classification, where 100% of anomalous events are identified and no false alarms are generated. As expected, noise has some effect on the detection capability in all case studies. The computational cost of the various methods is compared; it is concluded that the hypercube vertex method has a higher cost than the other methods researched.
This increased computational cost does not, however, exceed reasonable confines, and the hypercube vertex method therefore remains the chosen method. The thesis concludes by considering the performance of AIS against the comparative criteria for diagnostic methods. It is found that AIS compare well to current methods and that some of their limitations are indeed solved, and their abilities surpassed, in certain cases. Recommendations are made for future study in the field of AIS. Further, the use of the hypercube vertex method is highly recommended in real-valued scenarios such as process engineering.
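The first detector-generation method, random generation, can be sketched as negative-selection-style censoring of random candidates: a candidate detector survives only if it does not match any "self" (normal operation) sample. A minimal sketch with assumed parameters (`radius`, rectangular `bounds`); note the loop would not terminate if the self samples covered the whole search space.

```python
import random

def generate_detectors(self_samples, n, radius, bounds, seed=None):
    """Randomly generate n detectors in the box `bounds`, rejecting any
    candidate within `radius` of a self (normal) sample."""
    rng = random.Random(seed)
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    detectors = []
    while len(detectors) < n:
        cand = [rng.uniform(lo, hi) for lo, hi in bounds]
        if all(dist(cand, s) > radius for s in self_samples):
            detectors.append(cand)
    return detectors

def is_anomalous(x, detectors, radius):
    """Flag a sample as anomalous when any detector matches it."""
    dist = lambda a, b: sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
    return any(dist(x, d) <= radius for d in detectors)
```

Detection quality then hinges on how well the detector set covers the non-self space, which is exactly what the convex-hull and hypercube-vertex methods in the thesis try to improve on.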
140

Exploração de relações entre as técnicas nebulosas e evolutivas da inteligência computacional. / Exploration of relations between the fuzzy and the evolutionary techniques of computational intelligence.

Fialho, Álvaro Roberto Silvestre 12 April 2007 (has links)
Neste trabalho foi realizada uma busca por relações, regras e transformações entre duas metodologias constituintes da Inteligência Computacional - a Computação Nebulosa e a Computação Evolutiva. Com a organização e sistematização da existência de tais transformações, obtém-se uma mudança na modelagem de soluções que as utilizam de forma conjunta, possibilitando que teorias e modelos bem estabelecidos em uma das metodologias possam ser aproveitados pela outra de uma forma mais robusta, correta por construção, intrínseca e transparente. Um modelo foi proposto para direcionar o trabalho de pesquisa. Através da análise desse modelo e de uma revisão bibliográfica realizada, transformações pontuais entre as metodologias foram elencadas, e posteriormente consolidadas por meio de experimentos práticos: uma Base de Conhecimento (BC) de um Controlador Lógico Nebuloso foi criada e modificada, conforme a necessidade, através de um Algoritmo Genético (AG). Com a abordagem desenvolvida, além da criação de BCs a partir de pouquíssimo conhecimento sobre o domínio do problema, tornou-se possível a inserção de novos "comportamentos desejados" em BCs já existentes, automaticamente, através de AGs. Os resultados desses experimentos, realizados sobre uma plataforma computacional especificada e implementada para este fim, foram apresentados e analisados. / This work addressed a search for relations, rules and transformations between two constituent methodologies of Computational Intelligence - Fuzzy Computing and Evolutionary Computing. The existence of these transformations changes the current way of modeling solutions that use the two methodologies together, allowing well-established theories and models of one methodology to be exploited by the other in a more robust, correct-by-construction, intrinsic and transparent way. Besides the research and systematization of points that indicate the existence of relations between the two methodologies, a model to guide this exploration was proposed.
Through analysis of this model and of the bibliographic review carried out, specific transformations between the methodologies were identified and later consolidated through practical experiments: a Knowledge Base (KB) of a Fuzzy Logic Controller was created and modified, as needed, by a Genetic Algorithm (GA). With the developed approach, besides the creation of KBs from very little knowledge of the problem domain, it became possible to automatically insert new "desired behaviors" into existing KBs through GAs. The results of these experiments, carried out on a computational platform specified and implemented for this purpose, were presented and analyzed.
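The experiment of tuning a fuzzy Knowledge Base with a Genetic Algorithm can be caricatured in a few lines. This sketch substitutes a mutation-only, elitist loop for the full GA of the thesis and tunes a single triangular membership function against labelled examples; all names and constants are illustrative.

```python
import random

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def evolve_mf(samples, generations=200, seed=0):
    """Evolve the (a, b, c) parameters of one membership function so its
    output matches labelled (x, desired_membership) samples. Elitist,
    mutation-only: a child replaces the parent only if it scores better."""
    rng = random.Random(seed)

    def fitness(p):
        a, b, c = p
        if not (a < b < c):          # reject malformed triangles
            return float("inf")
        return sum((tri(x, a, b, c) - y) ** 2 for x, y in samples)

    best = [0.0, 5.0, 10.0]          # deliberately poor starting KB entry
    for _ in range(generations):
        child = [g + rng.gauss(0.0, 0.3) for g in best]
        if fitness(child) < fitness(best):
            best = child
    return best
```

In the thesis's setting, a chromosome would encode a whole Knowledge Base rather than one function, and fitness would come from the controller's behaviour, but the create-mutate-select loop is the same in spirit.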
