741 |
Risk-based proactive availability management - attaining high performance and resilience with dynamic self-management in Enterprise Distributed Systems. Cai, Zhongtang, 10 January 2008 (has links)
Complex distributed systems have come to play an increasingly critical role in industry and society. Examples include distributed information-flow systems, which continuously acquire, manipulate, and disseminate information across an enterprise's distributed sites and machines, and distributed server applications co-deployed in one or more shared data centers, each with different performance and availability requirements that vary over time and that compete with one another for shared resources.
Consequently, it is increasingly important for enterprise-scale IT infrastructures to provide timely, sustained, and reliable delivery and processing of service requests. Despite more than thirty years of progress in distributed computer connectivity, availability, and reliability, this has not become easier, and may in fact have become harder~\cite{ReliableDistributedSys}, for several reasons: the increasing complexity of enterprise-scale computing infrastructure; the distributed nature of these systems, which makes them prone to failures, for example because of the inevitable Heisenbugs in complex distributed software; the need to consider diverse and complex business objectives and policies, including risk preferences and attitudes, in enterprise computing; conflicts between performance and availability, and the varying importance of the sub-systems in an enterprise's distributed infrastructure that compete for resources in today's typically shared environments; and the best-effort nature of resources such as network resources, which makes resource availability itself an issue.
This thesis proposes a novel business-policy-driven, risk-based automated availability management approach that uses an automated decision engine to make availability decisions and meet business policies while optimizing overall system utility, uses utility theory to capture users' risk attitudes, and addresses the potentially conflicting business goals and resource demands in enterprise-scale distributed systems.
For critical and complex enterprise applications, a key contributor to application utility is the time taken to recover from failures. We therefore develop a novel proactive fault-tolerance approach that uses online failure prediction to dynamically determine the acceptable amounts of additional processing and communication resources (i.e., costs) to be used to attain given levels of utility and acceptable delays in failure recovery.
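The cost/utility trade-off behind this proactive approach can be sketched numerically. The model below is a hypothetical illustration (the utility, penalty, and cost figures are invented, and the thesis's actual decision engine is far richer): given an online estimate of the failure probability, choose the number of hot standbys that maximizes expected utility minus resource cost.

```python
# Hypothetical sketch of the cost/utility trade-off behind proactive fault
# tolerance. All numbers are illustrative, not the thesis's actual model.

def expected_utility(p_fail, replicas, base_utility=100.0,
                     recovery_penalty=80.0, cost_per_replica=10.0):
    """Expected utility of running `replicas` hot standbys.

    With at least one standby, recovery is fast and the failure penalty is
    avoided; with none, a failure costs `recovery_penalty` in lost utility.
    """
    penalty = recovery_penalty * p_fail if replicas == 0 else 0.0
    return base_utility - penalty - cost_per_replica * replicas

def choose_replicas(p_fail, max_replicas=3):
    """Pick the replica count that maximizes expected utility."""
    return max(range(max_replicas + 1),
               key=lambda r: expected_utility(p_fail, r))

# A low predicted failure probability does not justify a standby; a high
# one makes the extra resource cost worthwhile.
print(choose_replicas(0.05))
print(choose_replicas(0.50))
```

Under these made-up parameters, the decision flips from zero standbys to one as the predicted failure probability rises.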
Since resource availability itself is often not guaranteed in typical shared enterprise IT environments, this thesis also provides IQ-Paths, which offers probabilistic service guarantees to address the dynamic network behavior of realistic enterprise computing environments. The risk-based formulation serves as an effective way to link the operational guarantees expressed by utility and enforced by the PGOS algorithm with the higher-level business objectives sought by end users.
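The idea of a probabilistic service guarantee over best-effort paths can be sketched as follows. This is a simplified stand-in, not the actual PGOS algorithm: admit only paths whose empirically observed bandwidth samples meet the demand with the requested confidence, then take the cheapest admissible path.

```python
# Illustrative sketch (not the actual PGOS algorithm) of a probabilistic
# service guarantee: from recent bandwidth samples, keep only paths whose
# empirical probability of meeting the demand reaches the requested
# confidence, then pick the cheapest such path. Data are invented.

def meets_guarantee(samples, demand, confidence):
    """Is the empirical P(bandwidth >= demand) at least `confidence`?"""
    hits = sum(1 for s in samples if s >= demand)
    return hits / len(samples) >= confidence

def pick_path(paths, demand, confidence):
    """paths: {name: (cost, samples)}; return cheapest admissible path."""
    ok = [(cost, name) for name, (cost, samples) in paths.items()
          if meets_guarantee(samples, demand, confidence)]
    return min(ok)[1] if ok else None

paths = {
    "A": (5, [12, 11, 13, 12, 4, 12, 12, 11, 13, 12]),   # 9 of 10 >= 10
    "B": (3, [9, 8, 10, 9, 9, 8, 9, 10, 9, 8]),          # rarely >= 10
}
print(pick_path(paths, demand=10, confidence=0.9))
```

Path B is cheaper but cannot meet the 90% guarantee, so path A is chosen; if no path qualifies, the request is rejected rather than given a guarantee that cannot be honored.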
Together, this thesis proposes novel availability management framework and methods for
large-scale enterprise applications and systems, with the goal to provide different
levels of performance/availability guarantees for multiple applications and
sub-systems in a complex shared distributed computing infrastructure. More specifically,
this thesis addresses the following problems. For data center environments,
(1) how to provide availability management for applications and systems that
vary in both resource requirements and in their importance to the enterprise,
based both on operational level quantities and on business level objectives;
(2) how to deal with managerial policies such as risk attitude; and
(3) how to deal with the tradeoff between performance and availability,
given limited resources in a typical data center.
Since realistic business settings extend beyond single data centers, a second
set of problems addressed in this thesis concerns predictable and reliable
operation in wide area settings. For such systems, we explore (4) how to
provide high availability in widely distributed operational systems with
low cost fault tolerance mechanisms, and (5) how to provide probabilistic
service guarantees given best effort network resources.
|
742 |
A computational framework for unsupervised analysis of everyday human activities. Hamid, Muhammad Raffay, 07 July 2008 (has links)
In order to make computers proactive and assistive, we must enable them to perceive, learn, and predict what is happening in their surroundings. This presents us with the challenge of formalizing computational models of everyday human activities. For a majority of environments, the structure of the in situ activities is generally not known a priori. This thesis therefore investigates knowledge representations and manipulation techniques that can facilitate learning of such everyday human activities in a minimally supervised manner.
A key step towards this end is finding appropriate representations for human activities. We posit that if we choose to describe activities as finite sequences of an appropriate set of events, then the global structure of these activities can be uniquely encoded using their local event sub-sequences. With this perspective at hand, we particularly investigate representations that characterize activities in terms of their fixed and variable length event subsequences. We comparatively analyze these representations in terms of their representational scope, feature cardinality and noise sensitivity.
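The fixed-length event-subsequence representation can be sketched as an n-gram histogram over an activity's event sequence; the event names below are invented for illustration.

```python
# Minimal sketch of the fixed-length event-subsequence representation: an
# activity, given as a sequence of discrete events, is summarized by a
# histogram of its n-grams (local event subsequences).
from collections import Counter

def ngram_histogram(events, n=2):
    """Count all length-n contiguous event subsequences."""
    return Counter(tuple(events[i:i + n]) for i in range(len(events) - n + 1))

activity = ["enter", "open_fridge", "close_fridge", "open_fridge",
            "close_fridge", "leave"]
hist = ngram_histogram(activity, n=2)
print(hist[("open_fridge", "close_fridge")])  # 2
```

Varying n trades representational scope against feature cardinality, the tension the abstract analyzes.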
Exploiting such representations, we propose a computational framework to discover the various activity-classes taking place in an environment. We model these activity-classes as maximally similar activity-cliques in a completely connected graph of activities, and describe how to discover them efficiently. Moreover, we propose methods for finding concise characterizations of these discovered activity-classes, from both a holistic and a by-parts perspective. Using such characterizations, we present an incremental method to classify a new activity instance into one of the discovered activity-classes, and to automatically detect whether it is anomalous with respect to the general characteristics of its membership class. Our results show the efficacy of our framework in a variety of everyday environments.
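The classification and anomaly-detection step can be sketched as follows; this is a simplified stand-in for the thesis's method, using cosine similarity against per-class mean histograms and an illustrative threshold.

```python
# Hedged sketch of classification plus anomaly detection: a new activity's
# n-gram histogram is compared (cosine similarity) with a profile histogram
# of each discovered class; it joins the closest class, or is flagged
# anomalous if even the best match falls below a threshold. The profiles,
# features and threshold are invented for illustration.
import math

def cosine(a, b):
    keys = set(a) | set(b)
    dot = sum(a.get(k, 0) * b.get(k, 0) for k in keys)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def classify(hist, class_profiles, threshold=0.5):
    """Return (best_class, anomalous?) for a new activity histogram."""
    best = max(class_profiles, key=lambda c: cosine(hist, class_profiles[c]))
    return best, cosine(hist, class_profiles[best]) < threshold

profiles = {"cooking": {("stove", "pan"): 4, ("pan", "sink"): 2},
            "cleaning": {("mop", "sink"): 3, ("sink", "mop"): 3}}
print(classify({("stove", "pan"): 3, ("pan", "sink"): 1}, profiles))
```

An instance close to "cooking" is classified without an anomaly flag; an instance sharing no subsequences with any class is flagged anomalous even after assignment.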
|
743 |
Multi-agent decision support system in avionics : improving maintenance and reliability predictions in an intelligent environment. Haider, Kamal, January 2009 (has links)
Safety of airborne platforms rests heavily on the way they are maintained. This maintenance includes repairs and testing, to reduce platform down time. Maintenance is performed using generic and specific test equipment within the existing maintenance management system (MMS). This thesis reports the work undertaken to improve the maintainability and availability of avionics systems using an intelligent decision support system (IDSS). In order to understand the shortcomings of the existing system, the prevalent practices and methodologies are researched. The thesis reports the development and implementation of an IDSS and the significant improvements it achieves by integrating autonomous and independent information sources through a multi-agent system (MAS). Data mining techniques and intelligent agents (IA) are employed to create an expert system. The developed IDSS successfully demonstrates its ability to integrate and collate the available information and convert it into valuable knowledge. Using this knowledge, the IDSS is able to generate interpreted alerts, warnings and recommendations, thereby appreciably improving platform maintainability and availability. All facets of integrated logistics support (ILS) are considered to create a holistic picture. As the system ages, the IDSS also matures to assist managers and maintainers in making informed decisions about the platform, the unit under test (UUT) and even the environment that supports the platform.
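One way to picture the alert-generation idea is to merge per-unit maintenance records from independent sources and warn on units whose failure rate stands out against the fleet average. The record formats, numbers, and flagging rule below are all invented for illustration and are not the thesis's actual IDSS logic.

```python
# Illustrative sketch (details invented) of integrating independent
# maintenance data sources and generating warnings: records are merged per
# unit under test (UUT), and a warning is raised for any unit whose failure
# rate exceeds a multiple of the fleet-wide rate.
from collections import defaultdict

def merge_records(*sources):
    """Each source maps UUT id -> (failures, tests); sum them per UUT."""
    merged = defaultdict(lambda: [0, 0])
    for src in sources:
        for uut, (fail, tests) in src.items():
            merged[uut][0] += fail
            merged[uut][1] += tests
    return merged

def warnings(merged, factor=1.5):
    """UUTs whose failure rate exceeds `factor` times the fleet rate."""
    total_f = sum(f for f, _ in merged.values())
    total_t = sum(t for _, t in merged.values())
    fleet = total_f / total_t
    return sorted(u for u, (f, t) in merged.items() if f / t > factor * fleet)

depot = {"radar": (1, 50), "ins": (4, 40)}        # made-up source 1
flightline = {"radar": (1, 30), "ins": (6, 20)}   # made-up source 2
m = merge_records(depot, flightline)
print(warnings(m))
```

Under these made-up numbers the inertial navigation unit is flagged while the radar is not, mirroring the abstract's point that value comes from collating sources no single agent sees in full.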
|
744 |
Ανάπτυξη ενός "συστήματος τεχνητής νοημοσύνης" ενεργού ελέγχου δονήσεων και θορύβου με τη χρήση ενός τεχνητού νευρωνικού δικτύου και ενός γενετικού αλγορίθμου / Development of an "expert system" for active vibration and noise control by means of an artificial neural network and a genetic algorithm. Ευθήμερος, Γεώργιος, 11 August 2011 (has links)
Είναι ευρύτατα γνωστό ότι ο θόρυβος δημιουργείται από δονούμενες επιφάνειες. Για την αντιμετώπιση του θορύβου στην πηγή του, δηλαδή τη δονούμενη επιφάνεια, δύο κυρίως τρόποι έχουν αναπτυχθεί. Ο πρώτος τρόπος αφορά τη χρησιμοποίηση παθητικών μέσων, δηλαδή ηχομονωτικών υλικών που αποσβένουν συγκεκριμένες συχνότητες. Ο δεύτερος τρόπος αφορά τη χρήση ενεργητικών μέσων.
Τα ενεργητικά μέσα είναι διατάξεις που αποτελούνται από ένα σύστημα ελέγχου και ένα σύνολο αισθητήρων και ενεργοποιητών. Η λειτουργία ενός τέτοιου Συστήματος Ενεργού Ελέγχου Δονήσεων (ΣΕΕΔ) βασίζεται στην καταγραφή μέσω των αισθητήρων του τρόπου δόνησης της επιφάνειας (πρωτεύον πεδίο δόνησης), την δημιουργία σημάτων ελέγχου από τον ελεγκτή (ίδιου πλάτους αλλά με διαφορά φάσης 180o) και την αποστολή τους στους ενεργοποιητές που θα δημιουργήσουν ένα δευτερεύον πεδίο δόνησης. Η υπέρθεση των δύο πεδίων έχει σαν αποτέλεσμα την δημιουργία ενός εναπομείναντος πεδίου με πλάτη δόνησης αισθητά χαμηλότερα από αυτά του πρωτεύοντος.
Το αντικείμενο της παρούσας διατριβής είναι η ανάπτυξη ενός γενικευμένου ΣΕΕΔ, ο έλεγχος του οποίου βασίζεται σε εργαλεία Τεχνητής Νοημοσύνης όπως τα Τεχνητά Νευρωνικά Δίκτυα και οι Γενετικοί Αλγόριθμοι για την αναγνώριση του τρόπου δόνησης οποιασδήποτε επιφάνειας και το βέλτιστο έλεγχο της δόνησής της, χωρίς να απαιτείται καμία πρότερη γνώση της δυναμικής συμπεριφοράς της επιφάνειας. Επιπλέον, το υπό μελέτη ΣΕΕΔ είναι ικανό να ελέγχει τέσσερις συχνότητες αντί μιας που απαντάται συνήθως στην πλειονότητα των εφαρμογών.
Ο σκοπός της διατριβής αυτής είναι η απόδειξη της αρχής λειτουργίας ενός τέτοιου συστήματος. Η προσέγγιση για την επίτευξη αυτού του στόχου περιλαμβάνει πειραματικές μετρήσεις ενός πρωτότυπου ΣΕΕΔ σε μία απλοποιημένη πειραματική διάταξη.
Τα αποτελέσματα από την εφαρμογή του εν λόγω ΣΕΕΔ δείχνουν ότι παρά τους περιορισμούς που υπεισέρχονται λόγω των δυνατοτήτων του υλικού (hardware) του χρησιμοποιούμενου εξοπλισμού, το υπό μελέτη ΣΕΕΔ λειτουργεί επιτυχώς στη βασική αρχή του, ενώ έχει τις προϋποθέσεις και τη δυναμική για περαιτέρω βελτιστοποίηση και εξέλιξη σε ένα ευρύ φάσμα εφαρμογών. / It is generally accepted that noise is created by vibrating surfaces. In order to tackle this phenomenon at its source, mainly two approaches have been followed. The first involves passive means, that is, sound-insulating materials that dampen certain frequencies. The second involves the use of active means.
The active means are arrangements that consist of a control system and a set of sensors and actuators. The application of such an arrangement for vibration control is called Active Vibration Control (AVC) and is based on sampling (by means of the sensors) the primary field of vibration of the surface, creating control signals with the controller (the secondary field, of the same amplitude but with a phase difference of 180 degrees), and finally applying these control signals to the vibrating surface by means of the actuators. Superimposing the two vibration signals (primary and secondary) results in a residual field whose vibration amplitudes are significantly lower than those of the primary.
The objective of the thesis at hand is to develop a generic AVC system whose controller is built on Artificial Intelligence tools such as Artificial Neural Networks (ANNs) and Genetic Algorithms (GAs), in order to identify the vibration patterns of any surface and optimally control its vibration, without any prior knowledge of the dynamic behavior of the surface. Moreover, the developed AVC system is able to identify and control four dominant frequencies instead of the single frequency that is usually the choice in the majority of similar applications.
The scope of this work is the ‘Proof of Concept’ of the successful operation of such a generic AVC system. The approach to this end includes experimental testing of a prototype AVC system on a simplified experimental set-up.
The results of the application of the developed AVC system, obtained also by independent parties in the framework of an EC-funded basic research project, prove its successful operation, even though the contemporary data acquisition platform (hardware and software) used imposes limitations on the efficiency of the AVCS, and provide the basis for its further development and application to a multitude of problems.
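The superposition principle described above is easy to verify numerically. The sketch below generates a primary sinusoidal vibration and an anti-phase secondary signal with a deliberately imperfect actuator gain (an assumption introduced here to show why the residual is small but nonzero), then measures the residual amplitude.

```python
# Toy numerical illustration of the AVC principle: the secondary field is
# the primary vibration shifted by 180 degrees (equal amplitude, opposite
# sign), so superposition leaves a much smaller residual field.
import math

N, f = 1000, 50.0                      # samples, vibration frequency (Hz)
t = [i / N * 0.1 for i in range(N)]    # 0.1 s window
primary = [math.sin(2 * math.pi * f * x) for x in t]
gain = 0.95                            # imperfect actuator gain (assumed)
secondary = [-gain * p for p in primary]          # 180-degree phase shift
residual = [p + s for p, s in zip(primary, secondary)]

amp = lambda sig: max(abs(v) for v in sig)
print(round(amp(primary), 3), round(amp(residual), 3))
```

With a 5% gain error the residual amplitude is 5% of the primary; a real controller must also track phase, which this toy model assumes perfect.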
|
745 |
Algoritmo rastreador web especialista nuclear / Nuclear expert web crawler algorithm. Thiago Reis, 12 November 2013 (has links)
Nos últimos anos a Web obteve um crescimento exponencial, se tornando o maior repositório de informações já criado pelo homem e representando uma fonte nova e relevante de informações potencialmente úteis para diversas áreas, inclusive a área nuclear. Entretanto, devido às suas características e, principalmente, devido ao seu grande volume de dados, emerge um problema desafiador relacionado à utilização das suas informações: a busca e recuperação de informações relevantes e úteis. Este problema é tratado por algoritmos de busca e recuperação de informação que trabalham na Web, denominados rastreadores web. Neste trabalho é apresentada a pesquisa e desenvolvimento de um algoritmo rastreador que efetua buscas e recupera páginas na Web com conteúdo textual relacionado ao domínio nuclear e seus temas, de forma autônoma e massiva. Este algoritmo foi projetado sob o modelo de um sistema especialista, possuindo, desta forma, uma base de conhecimento que contém tópicos nucleares e palavras-chave que os definem, e um mecanismo de inferência constituído por uma rede neural artificial perceptron multicamadas que efetua a estimação da relevância das páginas na Web para um determinado tópico nuclear, no decorrer do processo de busca, utilizando a base de conhecimento. Deste modo, o algoritmo é capaz de, autonomamente, buscar páginas na Web seguindo os hiperlinks que as interconectam e recuperar aquelas que são mais relevantes para o tópico nuclear selecionado, emulando a habilidade que um especialista nuclear tem de navegar na Web e verificar informações nucleares. Resultados experimentais preliminares apresentam uma precisão de recuperação de 80% para o tópico da área nuclear em geral e de 72% para o tópico de energia nuclear, indicando que o algoritmo proposto é efetivo e eficiente na busca e recuperação de informações relevantes para o domínio nuclear.
/ Over the last years the Web has experienced exponential growth, becoming the largest information repository ever created and representing a new and valuable source of potentially useful information for several topics, including nuclear-related themes. However, due to the Web's characteristics and, mainly, because of its huge data volume, finding and retrieving relevant and useful information are non-trivial tasks. This challenge is addressed by web search and retrieval algorithms called web crawlers. This work presents the research and development of a crawler algorithm able to search for and retrieve webpages with nuclear-related textual content, in an autonomous and massive fashion. This algorithm was designed under the expert systems model, having, this way, a knowledge base that contains a list of nuclear topics and the keywords that define them, and an inference engine composed of a multi-layer perceptron artificial neural network that estimates the relevance of webpages to a given knowledge-base nuclear topic while searching the Web. Thus, the algorithm is able to autonomously search the Web by following the hyperlinks that interconnect webpages and to retrieve those that are most relevant to some predefined nuclear topic, emulating the ability a nuclear expert has to browse the Web and evaluate nuclear information. Preliminary experimental results show a retrieval precision of 80% for the general nuclear domain topic and 72% for the nuclear power topic, indicating that the proposed algorithm is effective and efficient in searching the Web and retrieving nuclear-related information.
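The focused-crawling loop described in the abstract can be sketched as a best-first frontier ordered by an estimated topical relevance. In this illustration a simple keyword-overlap score stands in for the thesis's multi-layer perceptron, and a hard-coded page/link table stands in for real fetching; all page contents and the threshold are invented.

```python
# Minimal focused-crawler skeleton in the spirit of the abstract: expand
# only pages judged relevant, visiting the most promising frontier entry
# first. The relevance function here is a keyword-overlap stand-in for the
# thesis's neural estimator.
import heapq

KEYWORDS = {"nuclear", "reactor", "uranium", "radiation"}

def relevance(text):
    """Fraction of topic keywords present in the page text."""
    words = set(text.lower().split())
    return len(words & KEYWORDS) / len(KEYWORDS)

def crawl(seed, pages, links, threshold=0.25, limit=10):
    """pages: url -> text; links: url -> [urls]. Return relevant urls."""
    frontier = [(-relevance(pages[seed]), seed)]   # max-heap via negation
    seen, relevant = {seed}, []
    while frontier and len(relevant) < limit:
        score, url = heapq.heappop(frontier)
        if -score >= threshold:
            relevant.append(url)
            for nxt in links.get(url, []):         # follow outgoing links
                if nxt not in seen:
                    seen.add(nxt)
                    heapq.heappush(frontier, (-relevance(pages[nxt]), nxt))
    return relevant

pages = {"a": "nuclear reactor safety", "b": "football scores",
         "c": "uranium fuel and radiation shielding"}
links = {"a": ["b", "c"]}
print(crawl("a", pages, links))  # ["a", "c"]
```

The off-topic page "b" enters the frontier but is never expanded, which is the behavior that keeps a focused crawler from drifting away from its domain.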
|
746 |
A combined case-based reasoning and process execution approach for knowledge-intensive work. Martin, Andreas, 11 1900 (has links)
Knowledge and knowledge work are key factors of today's successful companies. This study devises an approach for increasing the performance of knowledge work by shifting it towards a process orientation. Business process management and workflow management are methods for structured and predefined work but are not flexible enough to support knowledge work in a comprehensive way. Case-based reasoning (CBR) uses the knowledge of previously experienced cases in order to propose a solution to a problem. CBR can be used to retrieve, reuse, revise, retain and store functional and process knowledge. The aim of the research was to develop an approach that combines CBR and process execution to improve knowledge work. The research goals are: a case description for knowledge work that can be integrated into a process execution system and that contains both functional and process knowledge; a similarity algorithm for the retrieval of functional and procedural knowledge; and an adaptation mechanism that deals with the different granularities of solution parts. This thesis rests on a thorough literature framework and follows a design science research (DSR) strategy. During the awareness phase of the design science research process, an application scenario was acquired using the case study research method: the admission process for a study programme at a university. This application scenario is used to introduce and showcase the combined CBR and process execution approach called ICEBERG-PE, which consists of a case model and CBR services. The approach is implemented as a prototype and can be instantiated using the ICEBERG-PE procedure model, a specific procedure model for ontology-based CBR projects. The ICEBERG-PE prototype has been evaluated using triangulated evaluation data and different evaluation settings to confirm that the approach is transferable to other contexts. Finally, this thesis concludes with potential recommendations for future research. / Computing / D. Phil. (Information Systems)
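The retrieval step of such a CBR system can be sketched as weighted attribute matching over stored cases. The attributes, weights, and cases below are invented around the abstract's admission-process scenario; the thesis's actual similarity algorithm additionally covers process knowledge, which this sketch omits.

```python
# Hedged sketch of CBR retrieval: cases are attribute dictionaries,
# similarity is a weighted average of exact per-attribute matches, and the
# most similar stored case supplies the proposed solution.

def similarity(query, case, weights):
    """Weighted fraction of attributes on which query and case agree."""
    total = sum(weights.values())
    score = sum(w for attr, w in weights.items()
                if query.get(attr) == case["features"].get(attr))
    return score / total

def retrieve(query, casebase, weights):
    """Return the stored case most similar to the query."""
    return max(casebase, key=lambda c: similarity(query, c, weights))

weights = {"degree": 2.0, "experience": 1.0, "language_test": 1.0}
casebase = [
    {"features": {"degree": "BSc", "experience": "none",
                  "language_test": "passed"}, "solution": "admit"},
    {"features": {"degree": "none", "experience": "none",
                  "language_test": "failed"}, "solution": "reject"},
]
query = {"degree": "BSc", "experience": "2y", "language_test": "passed"}
print(retrieve(query, casebase, weights)["solution"])  # "admit"
```

In the full CBR cycle the retrieved solution would then be reused, revised for the new situation, and retained as a new case.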
|
747 |
Knowledge management and its effectiveness for organisational transformation through knowledge sharing and transfer. Mazorodze, Alfred Hove, 06 1900 (has links)
Knowledge Management aims to improve organisational performance, and it marks the beginning of organisational transformation. The two types of knowledge managed are categorised as “tacit” and “explicit”. This research investigated the effectiveness of Knowledge Management for organisational transformation in Namibia. It was necessitated by the lack of knowledge sharing among employees and the lack of appropriate tools for effective Knowledge Management. Moreover, some organisations engage in Knowledge Management practices without a full understanding of the processes involved. This was determined by a thorough literature review, which indicated that very few studies had been conducted on Knowledge Management in Namibia, as shown in Table 1.1 on page 6. The study therefore provides a nuanced understanding of Knowledge Management. The study additionally established that the use of appropriate tools and technologies to better manage knowledge ultimately improves organisational performance.
The research objectives sought to explore the initiatives deployed to enable knowledge sharing, identify barriers to effective Knowledge Management, analyse the role of social media in knowledge sharing, and measure the effectiveness of knowledge transfer activities. A mixed-methods research methodology was used to conduct this investigation. Participants were selected through purposive sampling. Of the 130 questionnaires distributed, 112 were fully completed and returned, an 86.1% response rate. The results of the study revealed that organisational transformation is dependent on effective Knowledge Management. In addition, the study found a correlation of 0.6 between Information Technology and Knowledge Management. The study further revealed that initiatives to enable knowledge sharing start with executive support, and that employees should be motivated to share knowledge. It was also found that lack of funds for Knowledge Management projects is the greatest barrier in organisations, and that effective Knowledge Management is facilitated by social media. Finally, the most effective knowledge transfer activity was found to be a collaborative virtual workspace, followed by Communities of Practice. / School of Computing / M.Sc. (Computing)
|
748 |
Modelagem computacional de dados: um sistema de tomada de decisão para gestão de recursos agrometeorológicos - SIAGRO / Computer modeling of data: a decision-making system for the management of agrometeorological resources - SIAGRO. Diego Roman, 27 August 2007 (has links)
A maioria das aplicações envolvendo a influência do clima na agricultura requer um grande volume de dados que, geralmente, não estão disponíveis. Desta forma, há necessidade de um aplicativo computacional para facilitar a organização dos dados necessários. O sistema computacional SIAGRO foi desenvolvido para dar suporte a uma plataforma de coleta de dados termo-pluviométricos e para atender à demanda dos usuários da informação agrometeorológica para agricultura. O sistema proposto permite, a partir de dados coletados a intervalos de 15 minutos, cadastrar outras estações, importar dados, calcular a evapotranspiração por diferentes modelos (Thornthwaite; Camargo; Thornthwaite modificado por Camargo; e Hargreaves e Samani), utilizar a classificação climática de Thornthwaite e determinar médias para os parâmetros coletados em períodos distintos de tempo. Os resultados são apresentados em forma de gráficos e tabelas num computador pessoal ou via Internet, e podem ser exportados para uso em outros aplicativos computacionais ou comparados com os resultados de outras estações cadastradas no sistema. Dotar o SIAGRO de informação que permita gerir de forma eficiente programas de irrigação para atender às carências de água nos cultivos permitiu que se avaliasse o desempenho de três métodos de referência para estimar a evapotranspiração com dados obtidos em lisímetros de lençol freático constante. Os dados foram coletados diariamente e processados em escala mensal. O desempenho dos métodos foi analisado a partir do coeficiente de correlação r e do índice de concordância de Willmott d. Os resultados mostraram que a melhor estimativa foi obtida com o modelo de Thornthwaite modificado por Camargo, devido ao seu melhor ajuste aos dados lisimétricos, apresentando uma concordância ótima, com índice d de 0,91.
/ Since most of the applications involving the influence of climate on agriculture require a great amount of data that usually are unavailable, a computational tool is needed to help organize the necessary data. The computational system SIAGRO was developed to support such a demand from users of climate information in agriculture. The system makes it possible to register other stations, import climatic data, calculate evapotranspiration by means of different methods (Thornthwaite; Camargo; Thornthwaite modified by Camargo; and Hargreaves and Samani), apply a climatic classification, and determine averages for different periods of time from daily data. The system presents its results in graphics and tables, which can be copied for use in other computer applications or compared with the results of other weather stations registered in the system. Supplying SIAGRO with information for efficient irrigation scheduling, to meet the water needs of crops, allowed the evaluation of three reference methods for estimating evapotranspiration against data obtained in a constant-water-table lysimeter. The data were collected daily and processed on a monthly basis. The performance evaluations of the methods were based on the correlation coefficient r and the Willmott agreement index d. The results showed that the best estimate was obtained with the Thornthwaite model modified by Camargo, which showed the best adjustment to the lysimeter data, with an index d equal to 0.91.
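The two agreement statistics named in the abstract have standard definitions that can be computed directly: Pearson's correlation r, and Willmott's index of agreement d = 1 - Σ(P_i - O_i)² / Σ(|P_i - Ō| + |O_i - Ō|)², where P are the estimates, O the observations, and Ō the observed mean. The sample values below are invented, not the thesis's data.

```python
# Standard-definition sketch of the correlation coefficient r and Willmott's
# index of agreement d used to compare evapotranspiration estimates with
# lysimeter observations. Sample numbers are illustrative only.
import math

def pearson_r(p, o):
    mp, mo = sum(p) / len(p), sum(o) / len(o)
    cov = sum((a - mp) * (b - mo) for a, b in zip(p, o))
    sp = math.sqrt(sum((a - mp) ** 2 for a in p))
    so = math.sqrt(sum((b - mo) ** 2 for b in o))
    return cov / (sp * so)

def willmott_d(p, o):
    mo = sum(o) / len(o)
    num = sum((a - b) ** 2 for a, b in zip(p, o))
    den = sum((abs(a - mo) + abs(b - mo)) ** 2 for a, b in zip(p, o))
    return 1 - num / den

estimated = [3.1, 4.0, 5.2, 4.4, 3.6]   # hypothetical ET estimates (mm/day)
observed = [3.0, 4.2, 5.0, 4.5, 3.5]    # hypothetical lysimeter values
print(round(pearson_r(estimated, observed), 2))
print(round(willmott_d(estimated, observed), 2))
```

Both statistics approach 1 for a good model; d additionally penalizes systematic bias, which is why the abstract reports it alongside r.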
|
749 |
Hyper Friburgo: um sistema hipertexto baseado em agentes inteligentes para informações turísticas / Hyper Friburgo: a hypertext system based on intelligent agents for tourist information. Geraldo Luiz Kern Martins, 04 March 1999 (has links)
Nova Friburgo é uma cidade turística brasileira, a qual atrai diversos tipos de visitantes. Esta dissertação descreve um sistema inteligente que ficará instalado em um centro de informações turísticas para auxílio daqueles que necessitam localizar as principais atrações da cidade. Atualmente, existe uma aplicação multimídia comum de suporte ao turismo, mas esta está sendo expandida para incluir uma base de conhecimento com perfis de turistas. Dependendo dos caminhos percorridos pelo usuário durante sua consulta, o sistema seleciona o perfil que melhor reflete as opções escolhidas. De acordo com o perfil selecionado, o sistema poderá sugerir vários roteiros e destinos baseados nos possíveis interesses do turista, fornecendo, inclusive, informações sobre hotéis e restaurantes do seu interesse. A base de conhecimento, que fornece este tipo de inferência, está desenvolvida em Visual C++. / Nova Friburgo is a Brazilian tourist city which attracts different types of visitors. This dissertation describes an intelligent system to be installed at a tourist information center to assist those who need directions to the city's major attractions. Currently, there is an ordinary multimedia application for tourism support, but it is being expanded to include a knowledge base of tourist profiles. Based on the tracks or paths a user has followed during his/her consultation, the system selects the profile which most closely reflects the options taken. According to this selected profile, the system is expected to suggest various routes and destinations based on the tourist's possible interests, in addition to providing information on hotels and restaurants he/she will likely be interested in. The knowledge base, which provides this type of inference, is developed in Visual C++.
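The profile-selection mechanism can be sketched as keyword matching between the options a visitor has clicked and keyword sets describing tourist profiles; the profiles, attractions, and tags below are invented for illustration and are not the dissertation's actual knowledge base.

```python
# Illustrative sketch (invented profiles and tags) of profile-based
# suggestion: the options a visitor has clicked are matched against profile
# keyword sets, and the best-overlapping profile drives the ranking of
# attractions.

PROFILES = {
    "ecotourism": {"waterfall", "trail", "park", "birdwatching"},
    "gastronomy": {"restaurant", "fondue", "brewery", "cafe"},
    "adventure": {"rafting", "climbing", "trail", "paragliding"},
}

def select_profile(visited):
    """Return the profile whose keyword set best overlaps the clicks."""
    return max(PROFILES, key=lambda p: len(PROFILES[p] & set(visited)))

def suggest(visited, attractions):
    """attractions: name -> tags; rank by overlap with selected profile."""
    prof = PROFILES[select_profile(visited)]
    return sorted(attractions, key=lambda a: -len(attractions[a] & prof))

visited = ["trail", "waterfall", "restaurant"]
spots = {"Pedra do Cão Sentado": {"trail", "park"},
         "Queijaria Suíça": {"fondue", "restaurant"}}
print(select_profile(visited))  # "ecotourism"
```

A visitor who mostly clicked nature-related options is assigned the ecotourism profile, and nature attractions are ranked ahead of restaurants in the suggestions.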
|
750 |
Aprendizagem em sistemas híbridos / Learning in hybrid systems. Guazzelli, Alex, January 1994 (has links)
O presente trabalho apresenta dois novos modelos conexionistas, baseados na teoria da adaptação ressonante (ART): Simplified Fuzzy ARTMAP e Semantic ART (SMART). Descreve-se a modelagem, adaptação, implementação e validação destes, enquanto incorporados ao sistema híbrido HYCONES, para resolução de problemas de diagnóstico médico em cardiopatias congênitas e nefrologia. HYCONES é uma ferramenta para a construção de sistemas especialistas híbridos que integra redes neurais com frames, assimilando as qualidades inerentes aos dois paradigmas. O mecanismo de frames fornece tipos construtores flexíveis para a modelagem do conhecimento do domínio, enquanto as redes neurais, representadas na versão original de HYCONES pelo modelo neural combinatório (MNC), possibilitam tanto a automação da aquisição de conhecimento, a partir de uma base de casos, quanto a implementação de aprendizado indutivo e dedutivo. A teoria da adaptação ressonante é caracterizada, principalmente, pela manutenção do equilíbrio entre as propriedades de plasticidade e estabilidade durante o processo de aprendizagem. ART inclui vários modelos conexionistas, tais como: Fuzzy ARTMAP, Fuzzy ART, ART 1, ART 2 e ART 3. Dentre estes, a rede neural Fuzzy ARTMAP destaca-se por possibilitar o tratamento de padrões analógicos a partir de dois módulos ART básicos. O modelo Simplified Fuzzy ARTMAP, como o próprio nome o diz, é uma simplificação da rede neural Fuzzy ARTMAP. Ao contrário desta, o novo modelo possibilita o tratamento de padrões analógicos a partir de apenas um módulo ART, responsável pelo tratamento dos padrões de entrada, adicionado de uma camada responsável pelos padrões alvo. Mesmo com apenas um módulo ART, o modelo Simplified Fuzzy ARTMAP é capaz de reter o mesmo nível de desempenho obtido com a rede neural Fuzzy ARTMAP, pois continua a garantir, conjuntamente, a maximização da generalização e a minimização do erro preditivo, através da execução da estratégia match-tracking.
Para a construção da base de casos de cardiopatias congênitas, 66 prontuários médicos, das três cardiopatias congênitas mais freqüentes, foram extraídos do banco de dados de pacientes submetidos a cirurgia cardíaca no Instituto de Cardiologia RS (ICFUC-RS). Tais prontuários abrangem o período de janeiro de 1986 a dezembro de 1990 e reportam 22 casos de Comunicação Interatrial (CIA), 29 de Comunicação Interventricular (CIV) e 15 de Defeito Septal Atrioventricular (DSAV). Para a análise de desempenho do sistema, 33 casos adicionais, do referido período, foram extraídos aleatoriamente do banco de dados do ICFUC-RS. Destes 33 casos, 13 apresentam CIA, 10 CIV e 10 DSAV. Para a construção da base de casos de síndromes renais, 381 prontuários do banco de dados de síndromes renais da Escola Paulista de Medicina foram analisados e 58 evidências, correspondentes a dados de história clínica e exame físico dos pacientes, foram extraídas semi-automaticamente. Do total de casos selecionados, 136 apresentam Uremia, 85 Nefrite, 100 Hipertensão e 60 Litíase. Dos 381 casos analisados, 254 foram escolhidos aleatoriamente para a composição do conjunto de treinamento, enquanto que os demais foram utilizados para a elaboração do conjunto de testes. Para que HYCONES II fosse validado, foram construídas 46 versões da base de conhecimento híbrida (BCH) para o domínio de cardiopatias congênitas e 46 versões da BCH para o de nefrologia. Em ambos os domínios médicos as respectivas bases de conhecimento foram construídas, automaticamente, a partir das respectivas bases de casos de treinamento. Das 46 versões geradas para cada grupo, uma representa o modelo MNC e 45 os modelos ART. As versões ART dividem-se em grupos de 3: 15 versões foram formadas a partir do modelo Simplified Fuzzy ARTMAP; 15 a partir deste mesmo modelo, sem que os padrões de entrada fossem normalizados; e, finalmente, 15 para o modelo Semantic ART.
Na base de testes CHD, o desempenho da versão HYCONES II - Simplified Fuzzy ARTMAP foi semelhante ao da versão MNC. A primeira acertou 29 dos 33 diagnósticos (87,9%), enquanto a segunda apontou corretamente 31 dos 33 diagnósticos apresentados (93,9%). Na base de testes de síndromes renais, o desempenho de HYCONES II - Simplified Fuzzy ARTMAP foi superior ao da versão MNC (p < 0,05). Ambas acertaram, respectivamente, 108 (85%) e 95 (74,8%) diagnósticos, em 127 casos submetidos. Ainda que o desempenho da versão HYCONES II - Simplified Fuzzy ARTMAP se revelasse promissor, ao se examinar o conteúdo das redes geradas por este modelo, pôde-se observar que estas divergiam completamente daquelas obtidas pelo MNC. As redes que levaram à conclusão diagnóstica, na versão HYCONES - MNC, possuíam conteúdo praticamente igual ao dos grafos de conhecimento elicitados de especialistas em cardiopatias congênitas. Já as redes ativadas na versão HYCONES II - Simplified Fuzzy ARTMAP, além de representarem número bem maior de evidências que as redes MNC, na sua grande maioria representam a negação do padrão de entrada. Este fato deve-se a um processo de normalização, inerente ao modelo Simplified Fuzzy ARTMAP, no qual cada padrão de entrada é duplicado. Nesta duplicação, são representadas as evidências presentes em cada caso e, ao mesmo tempo, complementarmente, as evidências ausentes, em relação ao total geral das mesmas na base de casos. Esta codificação inviabiliza o mecanismo de explanação do sistema HYCONES, pois, na área médica, os diagnósticos costumam ser feitos a partir de um conjunto de evidências presentes e não pela ausência delas. Tentou-se, então, melhorar o conteúdo semântico das redes Simplified Fuzzy ARTMAP. Para tal, o processo de normalização ou codificação complementar da implementação do modelo foi retirado, validando-o novamente contra a mesma base de testes.
On the CHD testing set, the performance of HYCONES II - Simplified Fuzzy ARTMAP without complement coding was inferior to that of the CNM version (p < 0.05): the former correctly identified 25 of the 33 diagnoses (75.8%), while the latter correctly indicated 31 of them (93.9%). On the renal testing set, the performance of the HYCONES II - Simplified Fuzzy ARTMAP version without complement coding was similar to that of the CNM version: of the 127 cases presented, the former correctly identified 98 diagnoses (77.2%), against 95 (74.8%) for the latter. It was also observed that the recognition categories formed by the Simplified Fuzzy ARTMAP model still showed marked differences in content when compared to the CNM networks or to the knowledge graphs elicited from experts. The Semantic ART model was then proposed, in an attempt to improve the semantic content of the ART networks. The learning algorithm of the Simplified Fuzzy ARTMAP model was modified by introducing the inductive learning mechanism of the CNM model, i.e., the punishments-and-rewards algorithm, associated with pruning and normalization. A new validation with the same testing set was performed. On the CHD testing set, the performance of HYCONES II - SMART was similar to that of the Simplified Fuzzy ARTMAP version and of the CNM version: the first and second correctly identified 29 of the 33 diagnoses (87.9%), while the CNM version correctly indicated 31 of the 33 diagnoses presented (93.9%). On the renal syndromes testing set, the performance of HYCONES II - SMART was superior to that of the CNM version (p < 0.05) and equal to that of the Simplified Fuzzy ARTMAP version: the first and the last correctly identified 108 of the 127 diagnoses (85%), while the CNM version correctly indicated 95 of them (74.8%). It was then observed that the neural networks generated by HYCONES II - SMART were similar in content to the CNM networks and to the knowledge graphs elicited from multiple experts.
The main contributions of this dissertation are the design, implementation, and validation of the Simplified Fuzzy ARTMAP and SMART models. The SMART model stands out for presenting greater semantic value in its recognition categories than that observed in conventional ART models, thanks to the incorporation of the concepts of specificity and relevance. This dissertation, however, represents not only the modeling and validation of two new neural models, but also the enrichment of the HYCONES system, continuing a previously defended MSc dissertation. The present work thus gives the knowledge engineer the choice of one among three neural models: CNM, Semantic ART, and Simplified Fuzzy ARTMAP, all of which perform well; the first two stand out for supporting the context semantically. / This dissertation presents two new connectionist models based on the adaptive resonance theory (ART): Simplified Fuzzy ARTMAP and Semantic ART (SMART). The modeling, adaptation, implementation and validation of these models are described, in their association with HYCONES, a hybrid connectionist expert system for solving classification problems. HYCONES integrates the knowledge representation mechanism of frames with neural networks, incorporating the inherent qualities of the two paradigms. While the frames mechanism provides flexible constructs for modeling the domain knowledge, neural networks, implemented in HYCONES' first version by the combinatorial neuron model (CNM), provide the means for automatic knowledge acquisition from a case database, enabling, as well, the implementation of deductive and inductive learning. The Adaptive Resonance Theory (ART) deals with systems that self-stabilize input patterns into recognition categories while maintaining a balance between the properties of plasticity and stability.
ART includes a series of different connectionist models: Fuzzy ARTMAP, Fuzzy ART, ART 1, ART 2, and ART 3. Among them, the Fuzzy ARTMAP model stands out for being capable of learning analogical patterns, using two basic ART modules. The Simplified Fuzzy ARTMAP model is a simplification of the Fuzzy ARTMAP neural network. In contrast to the first model, the new one is capable of learning analogical patterns using only one ART module, which is responsible for the categorization of the input patterns; it has, however, one additional layer, responsible for receiving and propagating the target patterns through the network. The presence of a single ART module does not hamper the Simplified Fuzzy ARTMAP model: the same performance levels are attained when it runs without the second ART module. This is ensured by the match-tracking strategy, which conjointly maximizes generalization and minimizes predictive error. Two medical domains were chosen to validate HYCONES performance: congenital heart diseases (CHD) and renal syndromes. To build up the CHD case base, 66 medical records were extracted from the cardiac surgery database of the Institute of Cardiology RS (ICFUC-RS). These records cover the period from January 1986 to December 1990 and describe 22 cases of Atrial Septal Defect (ASD), 29 of Ventricular Septal Defect (VSD), and 15 of Atrioventricular Septal Defect (AVSD), the three most frequent congenital heart diseases. For validation purposes, 33 additional cases from the same database and period were also extracted; of these, 13 report ASD, 10 VSD, and 10 AVSD. To build the renal syndromes case base, 381 medical records from the database of the Escola Paulista de Medicina were analyzed and 58 evidences, covering the patients' clinical history and physical examination data, were semi-automatically extracted. From the total number of selected cases, 136 exhibit Uremia, 85 Nephritis, 100 Hypertension, and 60 Calculosis.
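The single-module categorization and vigilance search described above follow the standard Fuzzy ART equations; the sketch below illustrates those generic equations only, with made-up prototypes and function names of our own choosing, and is not code from the dissertation.

```python
import numpy as np

def category_choice(I, weights, alpha=0.001):
    # Fuzzy ART choice function: T_j = |I ^ w_j| / (alpha + |w_j|),
    # where ^ is the fuzzy AND (component-wise minimum) and |.| the L1 norm.
    I = np.asarray(I, dtype=float)
    scores = [np.minimum(I, w).sum() / (alpha + w.sum()) for w in weights]
    return int(np.argmax(scores))

def resonates(I, w, rho):
    # Vigilance test: the winning category resonates when |I ^ w| / |I| >= rho;
    # otherwise it is reset and the search continues. Match tracking raises
    # rho just enough after a predictive error to force a new search.
    I = np.asarray(I, dtype=float)
    return np.minimum(I, w).sum() / I.sum() >= rho

# Two hypothetical category prototypes over four binary evidences.
prototypes = [np.array([1.0, 0.0, 1.0, 0.0]), np.array([0.0, 1.0, 0.0, 1.0])]
winner = category_choice([1, 0, 1, 0], prototypes)
print(winner, resonates([1, 0, 1, 0], prototypes[winner], rho=0.9))  # 0 True
```

A high vigilance rho yields many narrow categories; a low one yields fewer, broader categories, which is the plasticity/stability trade-off ART is designed to balance.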
From the 381 cases analyzed, 254 were randomly chosen to build the training set, while the remaining ones were used to build the testing set. To validate HYCONES II, 46 versions of the hybrid knowledge base (HKB) were built for the congenital heart disease domain; for the renal domain, another set of 46 HKB versions was constructed. For both medical domains, the HKBs were automatically generated from the training databases. Of these 46 versions, one operates with the CNM model and the other 45 with the two ART models. The ART versions are divided into three groups: 15 versions were built using the Simplified Fuzzy ARTMAP model; 15 used the Simplified Fuzzy ARTMAP model without normalization of the input patterns; and 15 used the Semantic ART model. HYCONES II - Simplified Fuzzy ARTMAP and HYCONES - CNM performed similarly in the CHD domain: the first correctly identified 29 of the 33 testing cases (87.9%), while the second correctly indicated 31 of the same cases (93.9%). In the renal syndromes domain, however, the performance of HYCONES II - Simplified Fuzzy ARTMAP was superior to that exhibited by CNM (p < 0.05): the two versions correctly identified, respectively, 108 (85%) and 95 (74.8%) diagnoses of the 127 testing cases presented to the system. HYCONES II - Simplified Fuzzy ARTMAP therefore displayed a satisfactory performance. However, the semantic contents of the neural nets it generated were completely different from those stemming from the CNM version. The networks that pointed out the final diagnosis in HYCONES - CNM were very similar to the knowledge graphs elicited from experts in congenital heart diseases. On the other hand, the networks activated in HYCONES II - Simplified Fuzzy ARTMAP operated with far more evidences than the CNM version. Besides this quantitative difference, there was a striking qualitative discrepancy between the two models.
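The renal-domain difference is reported as significant at p < 0.05 without naming the statistic used. A standard pooled two-proportion z-test on the reported counts (108/127 vs. 95/127) does reproduce a sub-0.05 p-value; the sketch below is offered only as an illustration of that check, not as the dissertation's actual test.

```python
from math import sqrt, erf

def two_proportion_z(k1, n1, k2, n2):
    # Pooled two-sided z-test for the difference of two proportions.
    p1, p2 = k1 / n1, k2 / n2
    pooled = (k1 + k2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF, Phi(z) = (1 + erf(z/sqrt(2))) / 2.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Renal domain: Simplified Fuzzy ARTMAP 108/127 correct vs. CNM 95/127 correct.
z, p = two_proportion_z(108, 127, 95, 127)
print(round(z, 2), p < 0.05)  # 2.04 True
```

The same check on the CHD counts (29/33 vs. 31/33) gives a p-value well above 0.05, consistent with the "similar performance" finding reported for that domain.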
The Simplified Fuzzy ARTMAP version, even though pointing to the correct diagnoses, used evidences that represented the complement coding of the input pattern. This coding, inherent to the Simplified Fuzzy ARTMAP model, duplicates the input pattern, generating a new one that depicts the evidence observed (on-cells) and, at the same time, the absent evidence, relative to the total evidence employed to represent the input cases (off-cells). This coding shuts out the HYCONES explanation mechanism, since medical doctors usually reach a diagnostic conclusion from a set of observed evidences rather than from their absence. The next step taken was to improve the semantic contents of the Simplified Fuzzy ARTMAP model. To achieve this, the complement coding process was removed and the modified model was then revalidated with the same testing sets described above. In the CHD domain, the performance of HYCONES II - Simplified Fuzzy ARTMAP without complement coding proved to be inferior to that presented by CNM (p < 0.05): the first model correctly singled out 25 of the 33 testing cases (75.8%), while the second correctly singled out 31 of the same 33 cases (93.9%). In the renal syndromes domain, the performances of HYCONES II - Simplified Fuzzy ARTMAP without complement coding and HYCONES - CNM were similar: the first correctly identified 98 of the 127 testing cases (77.2%), the second 95 of the same cases (74.8%). However, the recognition categories formed by this modified Simplified Fuzzy ARTMAP still presented quantitative and qualitative differences in their contents when compared to the networks activated by CNM and to the knowledge graphs elicited from experts. This discrepancy, although smaller than the one observed in the original Fuzzy ARTMAP model, still restrained the HYCONES explanation mechanism. The Semantic ART model (SMART) was then proposed.
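The on-cell/off-cell duplication described above can be sketched in a few lines; this is a generic illustration of Fuzzy ARTMAP complement coding over a made-up four-evidence case, not code from HYCONES II.

```python
import numpy as np

def complement_code(pattern):
    # Complement coding doubles the input a (values in [0, 1]) into [a, 1 - a]:
    # the first half marks the evidences present (on-cells), the second half
    # the evidences absent (off-cells).
    a = np.asarray(pattern, dtype=float)
    return np.concatenate([a, 1.0 - a])

# Hypothetical case over four evidences, with evidences 0 and 2 observed.
case = [1, 0, 1, 0]
print(complement_code(case))  # [1. 0. 1. 0. 0. 1. 0. 1.]
```

Note that the coded vector's L1 norm always equals the total number of evidences, so every learned category necessarily encodes absent evidences as well; removing this coding step is precisely the modification whose validation is reported above.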
Its goal was to improve the semantic contents of ART recognition categories. To build this new model, the Simplified Fuzzy ARTMAP architecture was preserved, while its learning algorithm was replaced by the CNM inductive learning mechanism (the punishments-and-rewards algorithm, associated with the pruning and normalization mechanisms). A new validation phase was then performed over the same testing sets. For the CHD domain, the performance comparison among the SMART, Simplified Fuzzy ARTMAP, and CNM versions showed similar results: the first and second versions correctly identified 29 of the 33 testing cases (87.9%), while the third correctly singled out 31 of the same testing cases (93.9%). For the renal syndromes domain, the performance of HYCONES II - SMART was superior to that presented by the CNM version (p < 0.05) and equal to that of the Simplified Fuzzy ARTMAP version: SMART and Simplified Fuzzy ARTMAP correctly singled out 108 of the 127 testing cases (85%), while the CNM version correctly identified 95 of the same 127 testing cases (74.8%). Finally, it was observed that the neural networks generated by HYCONES II - SMART had contents similar to the networks generated by CNM and to the knowledge graphs elicited from multiple experts. The main contributions of this dissertation are the design, implementation, and validation of the Simplified Fuzzy ARTMAP and SMART models. The latter stands out for its learning mechanism, which provides a higher semantic value to the recognition categories when compared to the categories formed by conventional ART models. This important enhancement is obtained by incorporating the concepts of specificity and relevance into ART's dynamics. This dissertation, however, represents not only the design and validation of two new connectionist models, but also the enrichment of HYCONES.
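The abstract names the CNM inductive mechanism (punishments and rewards, then pruning and normalization) without spelling out its update rule, so the following is only a hedged sketch of the general idea: per-link accumulators rewarded on correct diagnoses and punished on incorrect ones, with weak links pruned and the survivors normalized. The function names, the unit increments, and the threshold are our assumptions, not CNM's actual rule.

```python
def update(accumulators, evidences, correct):
    # Reward each active evidence-diagnosis link on a correct diagnosis,
    # punish it on an incorrect one (illustrative +/-1 rule).
    for ev in evidences:
        accumulators[ev] = accumulators.get(ev, 0) + (1 if correct else -1)
    return accumulators

def prune_and_normalize(accumulators, threshold=0):
    # Pruning drops links whose net support is not above the threshold;
    # normalization rescales the survivors into (0, 1].
    kept = {e: c for e, c in accumulators.items() if c > threshold}
    top = max(kept.values(), default=0)
    return {e: c / top for e, c in kept.items()} if top else {}

acc = update({}, ["cyanosis", "murmur"], correct=True)
acc = update(acc, ["murmur"], correct=False)
print(prune_and_normalize(acc))  # {'cyanosis': 1.0}
```

Pruning is what keeps only evidences that consistently support a diagnosis, which is why categories learned this way stay close to the expert-elicited knowledge graphs.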
This is obtained through the continuation of a previous MSc dissertation, under the same supervision. The present work thus gives the knowledge engineer the choice among three different neural networks: CNM, Semantic ART, and Simplified Fuzzy ARTMAP, all of which display good performance. Indeed, the first and second models, in contrast to the third, support the context in a semantic way.