  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
271

Análise de similaridades de modelagem no emprego de técnicas conexionistas e evolutivas da inteligência computacional visando à resolução de problemas de otimização combinatorial: estudo de caso - problema do caixeiro viajante. / Similarity analysis of connectionist and evolutionary techniques from the computational intelligence field, focused on the resolution of combinatorial optimization problems: case study - the traveling salesman problem.

Fernandes, David Saraiva Farias 08 June 2009 (has links)
Este trabalho realiza uma análise dos modelos pertencentes à Computação Neural e à Computação Evolutiva visando identificar semelhanças entre as áreas e sustentar mapeamentos entre as semelhanças identificadas. Neste contexto, a identificação de similaridades visando à resolução de problemas de otimização combinatorial resulta em uma comparação entre a Máquina de Boltzmann e os Algoritmos Evolutivos binários com população composta por um único indivíduo pai e um único indivíduo descendente. Como forma de auxiliar nas análises, o trabalho utiliza o Problema do Caixeiro Viajante como plataforma de ensaios, propondo mapeamentos entre as equações da Máquina de Boltzmann e os operadores evolutivos da Estratégia Evolutiva (1+1)-ES. / An analysis of the Evolutionary Computation and Neural Computation fields is presented in order to identify similarities and mappings between the theories. In this analysis, the identification of similarities between models designed for combinatorial optimization problems results in a comparison between the Boltzmann Machine and two-membered evolutionary algorithms, i.e. binary evolutionary algorithms whose population consists of a single parent and a single offspring. To analyze this class of problems, the work uses the Traveling Salesman Problem as a case study, in which the Boltzmann Machine equations are mapped onto the evolutionary operators of a (1+1)-ES Evolution Strategy.
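To make the two-membered algorithm this abstract compares against concrete, a minimal (1+1)-ES for the Traveling Salesman Problem might look like the sketch below. This is an illustrative toy, not the thesis's implementation: the segment-reversal (2-opt style) mutation, the accept-if-not-worse rule and all function names are assumptions.

```python
import random

def tour_length(tour, dist):
    # Total length of a closed tour over a symmetric distance matrix.
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def one_plus_one_es_tsp(dist, iterations=5000, seed=0):
    # (1+1)-ES: one parent, one offspring per generation; the offspring
    # replaces the parent only if it is at least as good.
    rng = random.Random(seed)
    n = len(dist)
    parent = list(range(n))
    rng.shuffle(parent)
    best = tour_length(parent, dist)
    for _ in range(iterations):
        child = parent[:]
        i, j = sorted(rng.sample(range(n), 2))
        child[i:j + 1] = reversed(child[i:j + 1])  # 2-opt style segment reversal
        length = tour_length(child, dist)
        if length <= best:
            parent, best = child, length
    return parent, best
```

On a toy instance such as four cities at the corners of a unit square, the sketch quickly settles on the perimeter tour of length 4.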
272

An ontologies and agents based approach for undersea feature characterisation and generalisation / Une approche fondée sur les ontologies et les agents pour la caractérisation et la généralisation de formes de relief sous-marines

Yan, Jingya 10 December 2014 (has links)
Une carte marine est un type de carte utilisé pour décrire la morphologie du fond marin et du littoral adjacent. Un de ses principaux objectifs est de garantir la sécurité de la navigation maritime. En conséquence, la construction d'une carte marine est contrainte par des règles très précises. Le cartographe doit choisir et mettre en évidence les formes du relief sous-marin en fonction de leur intérêt pour la navigation. Au sein d'un processus automatisé, le système doit être en mesure d'identifier et de classifier ces formes de relief à partir d’un modèle de terrain. Un relief sous-marin est une individuation subjective d'une partie du fond océanique. La reconnaissance de la morphologie du fond sous-marin est une tâche difficile, car les définitions des formes de relief reposent généralement sur une description qualitative et floue. Obtenir la reconnaissance automatique des formes de relief nécessite donc une définition formelle des propriétés des reliefs et de leur modélisation. Dans le domaine maritime, l'Organisation Hydrographique Internationale a publié une terminologie standard des noms des formes de relief sous-marines qui formalise un ensemble de définitions principalement pour des objectifs de communication. Cette terminologie a été utilisée ici comme point de départ pour la classification automatique des formes de relief sous-marines d'un modèle numérique de terrain. Afin d'intégrer les connaissances sur le relief sous-marin et sa représentation sur une carte nautique, cette recherche vise à définir des ontologies du relief sous-marin et des cartes marines. Les ontologies sont ensuite utilisées à des fins de généralisation de carte marine. Nos travaux de recherche sont structurés en deux parties principales. Dans la première partie de la recherche, une ontologie est définie afin d'organiser la connaissance géographique et cartographique pour la représentation du relief sous-marin et la généralisation des cartes marines.
Tout d'abord, une ontologie de domaine du relief sous-marin présente les différents concepts de formes de relief sous-marines avec leurs propriétés géométriques et topologiques. Cette ontologie est requise pour la classification des formes de relief. Deuxièmement, une ontologie de représentation est présentée, qui décrit la façon dont les entités bathymétriques sont représentées sur la carte. Troisièmement, une ontologie du processus de généralisation définit les contraintes et les opérations usitées pour la généralisation de carte marine. Dans la deuxième partie de la recherche, un processus de généralisation fondé sur l'ontologie est conçu en s'appuyant sur un système multi-agents (SMA). Quatre types d'agents (isobathe, sonde, forme de relief et groupe de formes de relief) sont définis pour gérer les objets cartographiques sur la carte. Un modèle de base de données a été généré à partir de l'ontologie. Les données bathymétriques et l'ontologie sont stockées dans une base de données de type « triple store », et sont connectées à un système d'information implémenté en Java et C++. Le système proposé classe automatiquement les formes de relief sous-marines extraites à partir de la bathymétrie, et évalue les contraintes cartographiques. Dans un premier temps, les propriétés géométriques décrivant une forme de relief sont calculées à partir des sondes et des isobathes et sont utilisées pour la classification des formes de relief. Ensuite, les conflits de distance et de superficie sont évalués dans le SMA et des plans de généralisation sont proposés au cartographe. Des tests ont été réalisés avec des données bathymétriques du monde réel montrant ainsi l'intérêt de la recherche dans le domaine de la cartographie nautique. / A nautical chart is a kind of map used to describe the seafloor morphology and the shoreline of adjacent lands. One of its main purposes is to guarantee the safety of maritime navigation.
As a consequence, construction of a nautical chart follows very specific rules. The cartographer has to select and highlight undersea features according to their relevance to navigation. In an automated process, the system must be able to identify and classify these features from the terrain model. An undersea feature is a subjective individuation of a part of the seafloor. Landform recognition is a difficult task because landform definitions usually rely on qualitative and fuzzy descriptions. Achieving automatic recognition of landforms requires a formal definition of the landforms' properties and their modelling. In the maritime domain, the International Hydrographic Organisation published a standard terminology of undersea feature names which formalises a set of definitions, mainly for naming features and communication purposes. This terminology is used here as a starting point for the automatic classification of the features from a terrain model. In order to integrate knowledge about the submarine relief and its representation on the chart, this research aims to define ontologies of the submarine relief and of nautical charts. The ontologies are then applied to the generalisation of nautical charts. The research includes two main parts. In the first part, an ontology is defined to organize geographical and cartographic knowledge for undersea feature representation and nautical chart generalisation. First, a domain ontology of the submarine relief introduces the different concepts of undersea features with their geometric and topological properties. This ontology is required for the classification of features. Second, a representation ontology is presented, which describes how bathymetric entities are portrayed on the map. Third, a generalisation process ontology defines constraints and operations in nautical chart generalisation. In the second part, a generalisation process based on the ontology is designed, relying on a multi-agent system (MAS).
Four kinds of agents (isobath, sounding, feature and group of features) are defined to manage cartographic objects on the chart. A database model was generated from the ontology. The bathymetric data and the ontology are stored in a triplestore database and are connected to an interface in Java and C++ to automatically classify the undersea features extracted from the bathymetry and evaluate the cartographic constraints. First, geometrical properties describing the feature shape are computed from soundings and isobaths and are used for feature classification. Then, distance and area conflicts are evaluated in the MAS and generalisation plans are provided to the cartographer.
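As a rough illustration of the kind of cartographic constraint the agents evaluate, the toy below flags pairs of isobaths that come closer than a legibility threshold. The brute-force point-wise distance, the function names and the data format are illustrative assumptions, not the thesis's agent model.

```python
def min_distance(line_a, line_b):
    # Brute-force minimum vertex-to-vertex distance between two polylines,
    # each given as a list of (x, y) vertices.
    return min(((xa - xb) ** 2 + (ya - yb) ** 2) ** 0.5
               for (xa, ya) in line_a for (xb, yb) in line_b)

def distance_conflicts(isobaths, threshold):
    # Flag pairs of isobaths closer than the legibility threshold --
    # the kind of distance constraint an isobath agent would try to resolve.
    conflicts = []
    names = sorted(isobaths)
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            if min_distance(isobaths[a], isobaths[b]) < threshold:
                conflicts.append((a, b))
    return conflicts
```

A real system would use true segment-to-segment distances; vertex distances keep the sketch short.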
273

Construction d’abaques numériques dédiés aux études paramétriques du procédé de soudage par des méthodes de réduction de modèles espace-temps / Construction of computational vademecum dedicated to parametric studies of welding processes by space-time model order reduction techniques

Lu, Ye 03 November 2017 (has links)
Le recours à des simulations numériques pour l’étude de l’influence des paramètres d’entrée (matériaux, chargements, conditions aux limites, géométrie, etc.) sur les différentes quantités d’intérêt en soudage (contraintes résiduelles, distorsion, etc.) s’avère trop long et coûteux vu l’aspect multi-paramétrique de ces simulations. Pour explorer des espaces paramétriques de grandes dimensions, avec des calculs moins coûteux, il paraît opportun d’utiliser des approches de réduction de modèle. Dans ce travail, d’une façon a posteriori, une stratégie non-intrusive est développée pour construire les abaques dédiés aux études paramétriques du soudage. Dans une phase offline, une base de données (‘snapshots’) a été pré-calculée avec un choix optimal des paramètres d'entrée donnés par une approche multi-grille (dans l’espace des paramètres). Pour explorer d’autres valeurs de paramètres, une méthode d’interpolation basée sur la variété Grassmannienne est alors proposée pour adapter les bases réduites espace-temps issues de la méthode SVD. Cette méthode a été constatée plus performante que les méthodes d’interpolation standards, notamment en non-linéaire. Afin d’explorer des espaces paramétriques de grandes dimensions, une méthode de type décomposition tensorielle (i.e. HOPGD) a été également étudiée. Pour l’aspect d’optimalité de l’abaque, nous proposons une technique d’accélération de convergence pour la HOPGD et une approche ‘sparse grids’ qui permet d’échantillonner efficacement l’espace des paramètres. Finalement, les abaques optimaux de dimension jusqu’à 10 à précision contrôlée ont été construits pour différents types de paramètres (matériaux, chargements, géométrie) du procédé de soudage. / The use of standard numerical simulations for studies of the influence of input parameters (materials, loading, boundary conditions, geometry, etc.) on the quantities of interest in welding (residual stresses, distortion, etc.)
proves to be too long and costly due to the multiparametric aspect of welding. In order to explore high-dimensional parametric spaces with cheaper calculations, it seems appropriate to use model reduction approaches. In this work, in an a posteriori way, a non-intrusive strategy is developed to construct computational vademecums dedicated to parametric studies of welding. In an offline phase, a snapshots database is pre-computed with an optimal choice of input parameters given by a “multi-grids” approach (in parameter space). To explore other parameter values, an interpolation method based on Grassmann manifolds is proposed to adapt both the space and time reduced bases derived from the SVD. This method was found to be more efficient than standard interpolation methods, especially in non-linear cases. In order to explore high-dimensional parametric spaces, a tensor decomposition method (i.e. HOPGD) has also been studied. For the optimality aspect of the computational vademecum, we propose a convergence acceleration technique for HOPGD and a “sparse grids” approach which allows efficient sampling of the parameter space. Finally, computational vademecums of dimension up to 10 with controlled accuracy have been constructed for different types of welding parameters (materials, loading, geometry).
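The offline/online split behind a computational vademecum can be sketched very crudely: precompute expensive solutions at sampled parameter values, then answer online queries by interpolation. The sketch below uses scalar snapshots and plain linear interpolation as a stand-in for the reduced-basis and Grassmann-manifold machinery of the abstract; all names and the one-dimensional parameter space are assumptions.

```python
from bisect import bisect_left

def build_vademecum(solver, params):
    # Offline phase: precompute expensive solutions ("snapshots") at a
    # fixed set of parameter values.
    return {p: solver(p) for p in sorted(params)}

def query(vademecum, p):
    # Online phase: answer for an unseen parameter by linear interpolation
    # between the two bracketing snapshots, clamping outside the sampled
    # range -- a crude stand-in for reduced-basis interpolation.
    keys = sorted(vademecum)
    if p <= keys[0]:
        return vademecum[keys[0]]
    if p >= keys[-1]:
        return vademecum[keys[-1]]
    hi = bisect_left(keys, p)
    lo = hi - 1
    t = (p - keys[lo]) / (keys[hi] - keys[lo])
    return (1 - t) * vademecum[keys[lo]] + t * vademecum[keys[hi]]
```

The point of the design is that `solver` is called only offline; `query` is cheap enough to evaluate interactively over the parameter space.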
274

Os quadros parietais nas escolas do Sudeste brasileiro (1890-1970)

Faria, Joana Borges de 31 July 2017 (has links)
Conselho Nacional de Pesquisa e Desenvolvimento Científico e Tecnológico - CNPq / Coordenação de Aperfeiçoamento de Pessoal de Nível Superior - CAPES / The wall chart is a type of didactic material used to transmit school knowledge. It is the graphic representation, on a flat surface, of a determined subject matter. They are called wall charts because they are hung on the wall or on other apparatuses so that they can be viewed by all students simultaneously. Taking into consideration that these wall charts were widely utilized in schools across Brazil and in other occidental countries from the second half of the 19th century until the middle of the 20th century, this research seeks to uncover the history of this kind of didactic material in Brazil, focusing on how they circulated, the individuals and companies involved in their production and distribution, their materiality, and their pedagogical functions, taking into consideration the school subjects of reference, the subject matter broached, and the visual language employed. For this purpose, the research used as source material the wall charts found in the following archives: CRE Mario Covas - Acervo da E.E. Caetano de Campos; Memorial do Colégio São Luís; Colégio Marista Glória; and Centro de Memória da Educação Brasileira do Instituto Superior de Educação do Rio de Janeiro.
Additional sources included the catalogues of companies that sold wall charts, mainly the French company Maison Deyrolle; manuals explaining the use of wall charts; work done by students of the Instituto de Educação do Rio de Janeiro from the 1950s and 1960s; photographs of classrooms, natural history, physics and chemistry classrooms, educational museums in schools, libraries, and dental clinics from schools in São Paulo; official documents like legislation and teaching programs for the primary and secondary schools of São Paulo and Rio de Janeiro, inventory logs as well as annual reports from directors and school inspectors; guides and visitors logs to universal expositions that occurred in the second half of the 19th century; educational periodicals like Revista Pedagógica, Revista de Ensino, Revista Escolar, and Revista do Ensino do Rio Grande do Sul; and online databases related to scientific and educational heritage. The wall chart is an artefact and a visual image at the same time. For this reason, the study takes as its principal theoretical references studies in material culture, in the material culture of schools, and in visual culture. / O quadro parietal é um material didático usado para a transmissão de conhecimentos escolares. É uma representação gráfica de determinados conteúdos numa superfície plana. São chamados de parietais, pois são pendurados nas paredes ou em outros dispositivos para serem observados por todos os alunos simultaneamente.
Ao levar em consideração que os parietais foram amplamente utilizados nas escolas brasileiras e dos países ocidentais em geral a partir da metade do século XIX até meados do século XX, esta pesquisa tem como objetivo conhecer a história deste material didático no Brasil, enfocando como se deu a sua circulação, os sujeitos e empresas envolvidos em sua produção e distribuição, a sua materialidade, suas funções pedagógicas, levando em consideração as disciplinas escolares de referência, os conteúdos abordados e as linguagens visuais empregadas. Para tanto, a pesquisa utilizou como fontes os próprios quadros parietais localizados em quatro acervos escolares: CRE Mario Covas - Acervo da E.E. Caetano de Campos, Memorial do Colégio São Luís, Colégio Marista Glória e Centro de Memória da Educação Brasileira do Instituto Superior de Educação do Rio de Janeiro; catálogos de empresas que vendiam quadros parietais, principalmente da francesa Maison Deyrolle; manuais explicativos dos usos dos parietais; trabalhos de alunos do Instituto de Educação do Rio de Janeiro das décadas de 1950 e 1960; fotografias de salas de aula, gabinetes ou salas de história natural, física e química, Museu Pedagógico, Biblioteca e gabinetes dentários de escolas paulistas; documentos oficiais, como legislação e programas de ensino das escolas primárias e secundárias paulistas e cariocas, livros de inventário e relatórios anuais de diretores e de inspetores de ensino; guias e relatórios de visitas às exposições universais que ocorreram na segunda metade do século XIX; periódicos educacionais, como: Revista Pedagógica, Revista do Ensino, Revista Escolar, e Revista do Ensino do Rio Grande do Sul; e bancos de dados online relacionados ao patrimônio científico e educativo. O quadro parietal é ao mesmo tempo um artefato e uma imagem visual, por isso, o estudo tem como principais referenciais teóricos os estudos da cultura material, cultura material escolar e da cultura visual.
276

Aplicação de cartas de controle como ferramenta de melhoria frente às dificuldades operacionais de laboratórios acreditados na ABNT NBR ISO/IEC 17025 / Application of control charts as an improvement tool considering the operational difficulties of laboratories accredited to ABNT NBR ISO/IEC 17025

Tatiana Barbosa Turuta 05 March 2015 (has links)
É crescente o número de laboratórios que têm implantado sistemas de gestão da qualidade conforme a norma ABNT NBR ISO/IEC 17025, entretanto, para continuidade no atendimento aos requisitos da norma e às necessidades dos clientes, os laboratórios podem apresentar dificuldades que precisam ser identificadas e analisadas para melhoria contínua da qualidade dos resultados. Desta maneira, uma avaliação preliminar das dificuldades operacionais após implantação do sistema de gestão da qualidade ABNT NBR ISO/IEC 17025 aponta como um dos requisitos de maior dificuldade a garantia da qualidade de resultados (requisito 5.9). Este requisito abrange a importância do monitoramento contínuo para validade dos ensaios realizados e acompanhamento de possíveis tendências das análises realizadas pelos laboratórios. Para realização desse monitoramento são aplicadas as cartas de controle que requerem maior atenção na sua aplicabilidade e interpretação em função do método de análise, da amostra de controle e da periodicidade do acompanhamento. As cartas de controle são ferramentas que possibilitam o monitoramento contínuo dos processos analíticos e permitem a avaliação de possíveis otimizações das atividades técnicas como, por exemplo, com a redução de custo analítico decorrente das curvas de calibração em análises cromatográficas. Desta maneira, neste trabalho, após realizar uma pesquisa de mercado e confirmar que o requisito 5.9 está entre os requisitos de maior dificuldade para laboratórios de rotina, foi realizado um estudo prático e orientativo de aplicação dos diferentes tipos de cartas de controle que podem ser utilizados pelos laboratórios de ensaio na manutenção contínua da qualidade dos resultados. Este estudo demonstra que a escolha de uma carta de controle está relacionada com a criticidade do método analítico e que pode gerar controles adequados dentro de uma relação custo/benefício que seja satisfatória.
A interpretação de todos estes resultados também gerou a proposta de um guia orientativo para aplicação de cartas de controle aos laboratórios analíticos. / A growing number of laboratories have implemented quality management systems according to the ABNT NBR ISO/IEC 17025 standard; however, to keep meeting the requirements of the standard and the needs of customers, laboratories may face difficulties that need to be identified and analyzed for continuous improvement of the quality of results. A preliminary assessment of operational difficulties following the implementation of ISO/IEC 17025 points to assurance of the quality of results (requirement 5.9) as one of the most difficult requirements. This requirement covers the importance of continuous monitoring of the validity of testing and of possible trends in the analyses performed by laboratories. Control charts are applied to conduct this monitoring and require particular attention in their applicability and interpretation, depending on the method of analysis, the control sample and the monitoring frequency. Control charts are tools that enable continuous monitoring of analytical processes and allow the assessment of possible optimizations of the technical activities, for example, by reducing analytical costs arising from calibration curves in chromatographic analysis. Thus, in this study, after conducting market research and confirming that requirement 5.9 is among the most difficult requirements for routine laboratories, a practical, guidance-oriented study was carried out on the application of the different types of control charts that can be used by testing laboratories in the ongoing maintenance of the quality of results. This study demonstrates that the choice of a control chart is related to the criticality of the analytical method and can generate adequate controls within a satisfactory cost/benefit relation.
The interpretation of all these results also led to the proposal of a reference guide for the application of control charts in analytical laboratories.
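As an illustration of the kind of control chart the abstract discusses, the sketch below builds a Shewhart chart for individual measurements: 3-sigma limits are estimated from the mean moving range using the standard d2 = 1.128 constant for subgroups of size two, and points beyond the limits are flagged as special-cause signals. Function names and the example data are hypothetical.

```python
def individuals_chart_limits(values):
    # Shewhart individuals chart: sigma is estimated from the mean moving
    # range (MR-bar) of successive points, divided by d2 = 1.128.
    n = len(values)
    mean = sum(values) / n
    moving_ranges = [abs(values[i] - values[i - 1]) for i in range(1, n)]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    sigma_hat = mr_bar / 1.128
    return mean - 3 * sigma_hat, mean, mean + 3 * sigma_hat

def out_of_control(values):
    # Indices of points beyond the 3-sigma limits ("special causes"
    # in Shewhart's sense).
    lcl, _, ucl = individuals_chart_limits(values)
    return [i for i, v in enumerate(values) if v < lcl or v > ucl]
```

In routine laboratory use, `values` would be repeated measurements of a control sample, and a flagged index would trigger investigation before results are released.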
277

Utilização da AHP e controle estatístico do processo na avaliação de módulos de irrigação por gotejamento / Use of AHP and statistical process control in the evaluation of drip irrigation modules

Reisdörfer, Marcelo 07 November 2013 (has links)
This paper presents the results of a uniformity evaluation of 23 gravity-fed localized irrigation modules on family farms in the municipality of Salto do Lontra/PR from 2009 to 2013. Uniformity coefficients (UC, UD, UE and ) and statistical process control, together with the AHP decision-support method, were applied in this trial in order to evaluate the drip irrigation project as a whole. The results obtained with the uniformity coefficients indicated that the irrigation modules remained stable during the studied period, with no significant variation from one period to another: values of 85 and 86% for UC, 78 and 80% for UD, 57 and 59% for UE, and 0.19 and 0.17 for . Regarding statistical process control, the Shewhart charts for individual measurements were able to identify special causes point by point, while the and indices indicated that the irrigation modules met the quality control specifications and were considered "acceptable" and centralized according to the requirements of Montgomery (2004). The multicriteria decision-support method AHP (Analytic Hierarchy Process) proved effective in the evaluation of irrigation projects, weighting different criteria to classify a project as "Excellent", "Good", "Fair" or "Poor".
Thus, in this trial, the AHP method rated the irrigation project implemented in the municipality of Salto do Lontra/PR as "Excellent" for both analyzed periods, with consistency ratio (CR) values of 0 and 0.02 for 2009-2010 and 2011-2013, respectively. / Este trabalho apresenta os resultados da avaliação da uniformidade em 23 módulos de irrigação localizados por gravidade em propriedades agrícolas de base familiar no Município de Salto do Lontra/PR, no período de 2009 a 2013. Neste trabalho, foram aplicados os coeficientes de uniformidade (UC, UD, UE e ) e CEP, juntamente com o método AHP de apoio à decisão para avaliar o projeto de irrigação por gotejamento na sua totalidade. Os resultados obtidos pelos coeficientes de uniformidade apontaram que os módulos de irrigação se mantiveram estáveis durante o período avaliado, não havendo variação significativa de um período para outro, com valores de UC de 85 e 86%, UD de 78 e 80%, UE de 57 e 59%, e de 0,19 e 0,17. Quanto ao controle estatístico de processo, os gráficos de Shewhart para medidas individuais foram capazes de mostrar pontualmente as causas especiais, e os índices e indicaram que os módulos de irrigação atenderam as especificações do controle de qualidade, sendo considerados "aceitáveis" e centralizados pelos requisitos de Montgomery (2004). Quanto ao uso do método multicritério de apoio à decisão AHP (Análise Hierárquica de Processo), este se mostrou eficaz na avaliação de projetos de irrigação, ponderando diferentes critérios sob a ótica da classificação do projeto em "Excelente", "Bom", "Regular" e "Ruim". Nesta pesquisa, o método AHP classificou como "Excelente" o projeto de irrigação implantado no Município de Salto do Lontra/PR para os dois períodos analisados, com valores de razão de consistência (RC) de 0 e 0,02 para os períodos de 2009 a 2010 e de 2011 a 2013.
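The AHP classification above rests on priority weights and a consistency ratio (CR) derived from a pairwise comparison matrix. A minimal sketch of that computation, using the common approximate column-normalisation method and Saaty's random consistency indices, could look like the following; the function name and example matrix are assumptions, not the thesis's code.

```python
def ahp_priorities(matrix):
    # Approximate the AHP priority vector by normalising each column and
    # averaging the rows, then compute the consistency ratio CR = CI / RI.
    n = len(matrix)
    col_sums = [sum(matrix[i][j] for i in range(n)) for j in range(n)]
    norm = [[matrix[i][j] / col_sums[j] for j in range(n)] for i in range(n)]
    weights = [sum(row) / n for row in norm]
    # lambda_max estimated from (A w) / w, averaged over components
    aw = [sum(matrix[i][j] * weights[j] for j in range(n)) for i in range(n)]
    lambda_max = sum(aw[i] / weights[i] for i in range(n)) / n
    ci = (lambda_max - n) / (n - 1)
    # Saaty's random consistency indices for matrix sizes 1..10
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12,
          6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45, 10: 1.49}[n]
    cr = ci / ri if ri else 0.0
    return weights, cr
```

A CR below 0.10 is conventionally taken as acceptable; the values of 0 and 0.02 reported in the abstract would pass comfortably.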
278

Computer-aided applications in process plant safety

An, Hong January 2010 (has links)
Process plants that produce chemical products through pre-designed processes are fundamental to the chemical engineering industry. The safety of hazardous processing plants is of paramount importance, as an accident could cause major damage to property and/or injury to people. HAZID is a computer system that helps designers and operators of process plants to identify potential design and operation problems given a process plant design. However, there are issues that need to be addressed before such a system will be accepted for common use. Because HAZID is a model-based system with a library of equipment models, this research project considers how to improve the usability and acceptability of such a system by developing tools to test the developed models, so that users can gain confidence in HAZID's output. The research also investigates the development of computer-aided safety applications and how they can be integrated together to extend HAZID to support different kinds of safety-related reasoning tasks. Three computer-aided tools and one reasoning system have been developed in this project. The first is the Model Test Bed, which tests the correctness of models that have been built. The second is the Safe Isolation Tool, which defines an isolation boundary and identifies potential hazards for isolation work. The third is an Instrument Checker, which lists all the instruments and their connections with process items in a process plant, so that engineers can consider whether each instrument and its loop provide safeguards to the equipment during the hazard identification procedure. The fourth is a cause-effect analysis system that automatically generates cause-effect tables, showing process events and the corresponding process responses designed by the control engineer, for control engineers to assess the safety design of a plant's control system.
The thesis provides a full description of the above four tools and how they are integrated into the HAZID system to perform control safety analysis and hazard identification in process plants.
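As a rough sketch of the model-based reasoning such a system performs, the toy below propagates a process deviation through a small library of equipment models to collect downstream effects, in the spirit of a cause-effect table. Every model, deviation name and the plant topology format here are invented for illustration and are not HAZID's actual representations.

```python
# Toy equipment model library: each model maps an incoming deviation to the
# deviations it produces (hypothetical names, for illustration only).
MODELS = {
    "pump": {"no-flow-in": ["no-flow-out", "pump-overheats"]},
    "pipe": {"no-flow-in": ["no-flow-out"]},
    "tank": {"no-flow-in": ["level-falls"]},
}

def propagate(plant, fault_unit, deviation):
    # Breadth-first walk downstream of the faulty unit, collecting the
    # consequences each equipment model predicts. `plant` maps a unit name
    # to (model kind, list of downstream unit names).
    effects = []
    queue = [(fault_unit, deviation)]
    seen = set()
    while queue:
        unit, dev = queue.pop(0)
        if (unit, dev) in seen:
            continue
        seen.add((unit, dev))
        kind, downstream = plant[unit]
        for out in MODELS.get(kind, {}).get(dev, []):
            effects.append((unit, out))
            if out.endswith("-out"):  # deviation leaves this unit
                for nxt in downstream:
                    queue.append((nxt, out.replace("-out", "-in")))
    return effects
```

A cause-effect table is then just this list rendered with the initiating event as the row header and the collected effects as columns.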
279

INTERACTIVE CLINICAL EVENT PATTERN MINING AND VISUALIZATION USING INSURANCE CLAIMS DATA

Piao, Zhenhui 01 January 2018 (has links)
Growing exponentially on a daily basis, complex electronic medical records (EMR) systems hide potentially valuable information. In this thesis, several efficient data mining algorithms were explored to discover hidden knowledge in insurance claims data. The first aim was to cluster chronic rheumatic disease (CRD) patients into three levels of information overload (IO) groups based on clinical events extracted from insurance claims data. The second aim was to discover hidden patterns using three renowned pattern mining algorithms: Apriori, frequent pattern growth (FP-Growth), and sequential pattern discovery using equivalence classes (SPADE). The SPADE algorithm was found to be the most efficient method for the dataset used. Finally, a prototype system named myDietPHIL was developed to manage clinical events for CRD patients and to visualize the relationships among frequent clinical events. The system has been tested, and the visualization of relationships could facilitate patient education.
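Of the three pattern mining algorithms mentioned, Apriori is the simplest to sketch. The toy implementation below performs the classic level-wise candidate generation with subset pruning; the transaction data and minimum-support threshold in the test are illustrative, not the thesis's claims dataset.

```python
from itertools import combinations

def apriori(transactions, min_support):
    # Classic Apriori: frequent itemsets are grown level by level, pruning
    # any candidate that has an infrequent subset (the Apriori property).
    tsets = [set(t) for t in transactions]
    items = sorted({i for t in tsets for i in t})

    def support(itemset):
        return sum(1 for t in tsets if itemset <= t) / len(tsets)

    frequent = {}
    level = [frozenset([i]) for i in items if support(frozenset([i])) >= min_support]
    k = 1
    while level:
        for s in level:
            frequent[s] = support(s)
        # Join step: candidates of size k+1 from unions of frequent k-sets.
        candidates = {a | b for a in level for b in level if len(a | b) == k + 1}
        # Prune step: every k-subset of a candidate must already be frequent.
        level = [c for c in candidates
                 if all(frozenset(sub) in frequent for sub in combinations(c, k))
                 and support(c) >= min_support]
        k += 1
    return frequent
```

In a claims setting, each transaction would be the set of clinical event codes observed for one patient visit, and the returned dictionary maps each frequent event combination to its support.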
280

敦煌寫本書儀中應用文書研究 / An Analysis of the Practical Writing in the Written Shu-i in Tun-huang

葉淑珍, Yeh, Shu Chen Unknown Date (has links)
書儀是研究唐、五代社會應用文書的重要資料也是敦煌文學中重要的一環。本文在大陸學者已有相關資料匯集之後進一步分析書儀中應用文書的特性。文中先探討「書儀」及「應用文書」的定義並探討二者的共同特性及相異之處,以此作為分析敦煌寫本書儀應用文書的準則並簡單介紹敦煌寫本書儀的作者與年代;其次為敦煌寫本書儀進行分類,依使用狀況的不同,分為一般應用文書、吉儀、凶儀、書、啟、狀再依其使用者的不同細分成類闡明各類書儀的形式與內容勾勒出敦煌所存唐、五代寫本書儀的一個模型。最後則探討敦煌寫本書儀在應用文上的特性,並與唐、五代的應用文書相比較檢視其異同之處。就應用文書的特性來看敦煌寫本書儀保存了當時書信形式及語言也反映當時人對於書信範本的需求及使用應用文書時的普遍性規則。其中表、啟、狀等書牘的基本架構與全唐所載的相關文獻是相符的。雖寫本年代前後不一其用語、形式上卻有相當的一致性。此外在書儀可以見到當時人的人際關係、風俗禮儀、語言文學等。 / Shu-i, manuals of letter-writing etiquette, are important sources for studying the practical documents of Tang and Five Dynasties society and an important part of Dunhuang literature. Building on the source compilations already produced by mainland scholars, this thesis further analyses the characteristics of the practical documents contained in the shu-i. It first examines the definitions of "shu-i" and "practical documents", together with their shared features and differences, using these as criteria for analysing the practical documents in the Dunhuang manuscript shu-i, and briefly introduces the authors and dates of the manuscripts. It then classifies the Dunhuang manuscript shu-i: by circumstances of use, into general practical documents, auspicious rites, mourning rites, letters (shu), communications (qi) and reports (zhuang), and further by user, clarifying the form and content of each category so as to sketch a model of the Tang and Five Dynasties manuscript shu-i preserved at Dunhuang. Finally, it discusses the characteristics of the Dunhuang manuscript shu-i as practical writing and compares them with the practical documents of the Tang and Five Dynasties to examine their similarities and differences. In terms of practical writing, the Dunhuang manuscript shu-i preserve the letter forms and language of the time and reflect the contemporary demand for letter templates and the general rules governing the use of practical documents. The basic structure of the memorials (biao), qi and zhuang matches the related documents transmitted from the Tang; although the manuscripts date from different periods, their wording and forms are remarkably consistent. In addition, the shu-i reveal the interpersonal relationships, customs and etiquette, and language and literature of the period.
