  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Relational evidence theory and spatial interpretation procedures.

Pearce, Adrian January 1996 (has links)
Spatial interpretation involves the intelligent processing of images for learning, planning and visualisation. This involves building systems which learn to recognise patterns from the content of unconstrained data such as handwritten schematic symbols, photographic images, and video images. The efficiency of spatial interpretation systems is measured not only in terms of their ability to learn to classify patterns, but also in their computational complexity and capacity to accommodate different patterns. This is reflected in the ease of the human factors involved in the interactive process of inputting and manipulating training patterns, particularly if large numbers of patterns are used. This thesis focuses on the theoretical and procedural issues involved in applying machine learning to computer vision for efficient spatial interpretation. Two different approaches to evidential learning are consolidated in how they apply to generalising relational data structures. Relational Evidence Theory integrates information-theoretic methods from decision trees with graph matching methods from constraint interpretation. It offers an evidence-based framework for evaluating and updating relational representations suitable for spatial applications. A new algorithm, Rulegraphs, is developed, which combines graph matching with rule-based approaches from machine learning. This algorithm reduces the cardinality of the graph matching problem by replacing pattern parts with rules. Rulegraphs not only reduce the search space but also improve the uniqueness of the matching process. The system is demonstrated on difficult two-dimensional pattern recognition and three-dimensional object recognition problems. An empirical comparison with an evidence-based neural network system is conducted. A consolidated learning algorithm based on relational evidence theory (CLARET) is presented which integrates Rulegraph matching with rule generation techniques from inductive logic programming. The approach utilises the relational constraints in spatial data to optimise the representational hierarchies and search strategies used during learning. An on-line schematic and symbol recognition application is demonstrated for learning to recognise symbols and patterns invariant to rotation, scale, and shift. The classification performance, computational efficiency, and human factors involved in incrementally training the system are empirically compared with other inductive logic programming techniques. The significance of this work is twofold. Firstly, it extends the applicability of machine learning theories and algorithms into new domains. The techniques complement the image query and retrieval tools currently available in computer vision by offering additional ways of recognising and manipulating spatial information. Secondly, the development of a working schematic system allows the efficiency of spatial interpretation techniques to be evaluated, and places emphasis on the dialogue between the user and the technology.
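The cardinality reduction that the abstract attributes to Rulegraphs can be illustrated with a toy sketch (all part names and labels here are invented; this is not the thesis's algorithm): constraining candidate correspondences by a shared rule shrinks the space of part-to-part mappings that graph matching must search.

```python
# Hypothetical illustration of rule-constrained matching: each scene part
# may only be mapped to model parts that satisfy the same rule (here, a
# simple attribute label stands in for a learned rule).

scene_parts = {"s1": "line", "s2": "line", "s3": "arc", "s4": "arc"}
model_parts = {"m1": "line", "m2": "arc"}

# Naive matching: every mapping from scene parts to model parts is a candidate.
naive = 1
for _ in scene_parts:
    naive *= len(model_parts)

# Rule-constrained matching: only label-compatible model parts are candidates.
constrained = 1
for label in scene_parts.values():
    constrained *= sum(1 for m in model_parts.values() if m == label)

print(naive, constrained)  # 16 candidate mappings reduced to 1
```

Even in this tiny example the candidate space collapses from 16 mappings to 1; on realistic scenes the reduction grows exponentially with the number of parts, which is the point of replacing pattern parts by rules.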
2

Time and evidence in databases: a model and its theoretic foundations

Dai, Bingning January 1998 (has links)
No description available.
3

Multilevel Design Optimization and the Effect of Epistemic Uncertainty

Nesbit, Benjamin Edward 13 December 2014 (has links)
This work presents the state of the art in hierarchically decomposed multilevel optimization, and extends it by incorporating evidence theory into the multilevel framework for the quantification of epistemic uncertainty. The resulting method, Evidence-Based Multilevel Design Optimization, is used to solve two analytical optimization problems and to explore the effect of the belief structure on the final solution. A methodology is presented to reduce the cost of evidence-based optimization through manipulation of the belief structure. In addition, a transport aircraft wing is solved with multilevel optimization without uncertainty. This complex, real-world optimization problem shows the capability of the decomposed multilevel framework to reduce the cost of solving computationally expensive problems with black-box analyses.
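How a belief structure quantifies epistemic uncertainty can be sketched in a few lines (the focal intervals, masses, and event below are invented for illustration, not taken from the thesis): belief and plausibility bound the probability of an event from below and above.

```python
# A belief structure assigns basic probability assignments (BPAs) to focal
# intervals; belief sums the mass of intervals contained in the event,
# plausibility sums the mass of intervals that merely intersect it.

focal_elements = [((0.0, 2.0), 0.5), ((1.0, 3.0), 0.3), ((2.0, 4.0), 0.2)]
event = (0.0, 2.5)  # e.g. "the response stays below an allowable limit"

def belief(event, focals):
    lo, hi = event
    # Mass of focal intervals entirely inside the event.
    return sum(m for (a, b), m in focals if lo <= a and b <= hi)

def plausibility(event, focals):
    lo, hi = event
    # Mass of focal intervals that intersect the event at all.
    return sum(m for (a, b), m in focals if a <= hi and b >= lo)

print(belief(event, focal_elements), plausibility(event, focal_elements))
```

In an evidence-based design loop the optimizer would typically constrain belief (the conservative lower bound) rather than a single probability, which is why the shape of the belief structure affects both the solution and the cost.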
4

Resource allocation and Uncertainties: An application case study of portfolio decision analysis and a numerical analysis on evidence theory

Gasparini, Gaia 09 October 2023 (has links)
The thesis is divided into two parts concerning different topics. The first is solving a multi-period portfolio decision problem; the second, more theoretical, is a numerical comparison of uncertainty measures within evidence theory. Nowadays, portfolio problems are very common and present in several fields of study. The problem is inspired by a real-world infrastructure management case in the energy distribution sector and consists of the optimal selection of a set of activities and their scheduling over time. In scheduling, various constraints and limits that the company has to meet must be considered, and the selection must prioritise the activities with a higher priority value. The problem is addressed by Portfolio Decision Analysis: the priority value of activities is assigned using the Multi-Attribute Value Theory method, which is then integrated with a multi-period optimization problem with activity durations and constraints. Compared to other problems in the literature, in this case the activities have different durations that must be taken into account for proper planning. The planning obtained is suitable for the user's requirements both in terms of speed in providing results and in terms of simplicity and comprehensibility. In recent years, measures of uncertainty or entropy within evidence theory have again become a topic of interest in the literature. However, this has led to an increase in the already numerous measures of total uncertainty, that is, measures that consider both conflict and nonspecificity. The research aims at a unique measure, but none of those proposed so far meets all the required properties. The measures are often complex, and especially in applications it is difficult to understand which one to choose and how to interpret the numerical results obtained. Therefore, a numerical approach that compares a wide range of measures in pairs is proposed alongside comparisons based on mathematical properties. Rank correlation, hierarchical clustering, and eigenvector centrality are used for comparison. The results obtained are discussed and commented on to gain a broader understanding of the behavior of the measures and the similarities and dissimilarities between them.
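The pairwise comparison idea can be sketched briefly (the two measures below are common textbook choices used as stand-ins, not necessarily the thesis's selection, and all names are invented): score randomly generated mass functions with two uncertainty measures, then rank-correlate the scores.

```python
import math
import random

# Compare two uncertainty measures on random mass functions over a small
# frame of discernment, using Spearman rank correlation of their scores.

random.seed(0)
SUBSETS = [("a",), ("b",), ("c",), ("a", "b"), ("a", "c"), ("b", "c"), ("a", "b", "c")]

def random_mass():
    w = [random.random() for _ in SUBSETS]
    s = sum(w)
    return {A: x / s for A, x in zip(SUBSETS, w)}

def nonspecificity(m):
    # Generalized Hartley measure: sum of m(A) * log2|A|.
    return sum(v * math.log2(len(A)) for A, v in m.items())

def mass_entropy(m):
    # A simple conflict-style measure: Shannon entropy of the masses.
    return -sum(v * math.log2(v) for v in m.values() if v > 0)

def spearman(x, y):
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        for rank, i in enumerate(order):
            r[i] = float(rank)
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = math.sqrt(sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry))
    return num / den

samples = [random_mass() for _ in range(200)]
rho = spearman([nonspecificity(m) for m in samples],
               [mass_entropy(m) for m in samples])
print(round(rho, 3))  # how similarly the two measures rank the same evidence
```

A correlation near 1 means the two measures rank evidence almost identically (one is arguably redundant); values near 0 mean they capture genuinely different aspects of uncertainty, which is the kind of distinction the clustering and centrality analyses then organise across many measures at once.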
5

DS-ARM: An Association Rule Based Predictor that Can Learn from Imperfect Data

Sooriyaarachchi Wickramaratna, Kasun Jayamal 13 January 2010 (has links)
Over the past decades, many industries have spent heavily on computerizing their work environments with the intention of simplifying and expediting access to information and its processing. Typical of real-world data are various types of imperfection, uncertainty, and ambiguity that have complicated attempts at automated knowledge discovery. Indeed, it soon became obvious that adequate methods to deal with these problems were critically needed. Since simple methods, such as "interpolating" over or just ignoring data imperfections, were often found to lead to inferences of dubious practical value, the search for appropriate modifications of knowledge-induction techniques began. Sometimes, rather non-standard approaches turned out to be necessary. For instance, the probabilistic approaches of earlier works are not sufficiently capable of handling the wider range of data imperfections that appear in many new applications (e.g., medical data). Dempster-Shafer theory provides a much stronger framework, and this is why it has been chosen as the fundamental paradigm exploited in this dissertation. The task of association rule mining is to detect frequently co-occurring groups of items in transactional databases. The majority of the papers in this field concentrate on how to expedite the search. Less attention has been devoted to how to employ the identified frequent itemsets for prediction purposes; worse still, methods to tailor association-mining techniques so that they can handle data imperfections are virtually nonexistent. This dissertation proposes a technique referred to by the acronym DS-ARM (Dempster-Shafer based Association Rule Mining) in which the DS-theoretic framework is used to enhance a more traditional association-mining mechanism. Of particular interest here is a method to employ the knowledge of the partial contents of a "shopping cart" to predict what else the customer is likely to add to it. This formalized problem has many applications in the analysis of medical databases. A recently proposed data structure, the itemset tree (IT-tree), is used to extract association rules in a computationally efficient manner, thus addressing the scalability problem that has disqualified more traditional techniques from real-world applications. The proposed algorithm is based on the Dempster-Shafer theory of evidence combination. Extensive experiments explore the algorithm's behavior; some of them use synthetically generated data, others rely on data obtained from a machine-learning repository, and yet others use a movie-ratings dataset or an HIV/AIDS patient dataset.
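The evidence-combination mechanism that DS-ARM builds on, Dempster's rule, can be sketched as follows (the frame and masses are invented shopping-cart examples, not data or code from the dissertation):

```python
# Dempster's rule of combination over mass functions whose focal elements
# are frozensets: intersect focal elements pairwise, accumulate products,
# and renormalize by the mass not lost to conflict (empty intersections).

def combine(m1, m2):
    raw = {}
    conflict = 0.0
    for A, v1 in m1.items():
        for B, v2 in m2.items():
            C = A & B
            if C:
                raw[C] = raw.get(C, 0.0) + v1 * v2
            else:
                conflict += v1 * v2  # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence")
    k = 1.0 - conflict
    return {A: v / k for A, v in raw.items()}

# Two items of evidence about what a customer will add to the cart.
m1 = {frozenset({"milk"}): 0.6, frozenset({"milk", "bread"}): 0.4}
m2 = {frozenset({"bread"}): 0.3, frozenset({"milk", "bread"}): 0.7}

m = combine(m1, m2)
print(m)
```

Note how the rule lets one body of evidence commit mass to a set of items ("milk or bread") rather than forcing a choice; that flexibility is what makes the DS framework suitable for the imperfect data the abstract describes.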
6

An ontology-driven evidence theory method for activity recognition

Rey, Vítor Fortes January 2016 (has links)
Activity recognition is a vital need in the field of ambient intelligence. It is essential for many internet of things applications, including energy management, healthcare systems and home automation. Even with the many cheap mobile sensors envisioned by the internet of things, however, activity recognition remains a hard problem, owing to uncertainty in sensor readings and the complexity of the activities themselves. Evidence theory models provide activity recognition even in the presence of uncertain sensor readings, but cannot yet model complex activities or dynamic changes in sensor and environment configurations. This work proposes combining knowledge-based approaches with evidence theory, improving the construction of evidence theory models for activity recognition by bringing the reusability, flexibility and rich semantics of the former.
9

Uncertainty quantification applied to reservoir geomechanics (Quantificação de incertezas aplicada à geomecânica de reservatórios)

PEREIRA, Leonardo Cabral 08 July 2015 (has links)
Reservoir geomechanics encompasses aspects related to rock mechanics, structural geology and petroleum engineering, and must be understood in order to better explain critical aspects of the exploration and production phases of petroleum reservoirs, such as pore pressure prediction, geological fault seal potential, well design, fracture propagation pressure, fault reactivation, reservoir compaction and CO2 injection, among others. An adequate representation of uncertainty is an essential part of any project. Specifically, an analysis that is intended to provide information about the behavior of a system should provide an assessment of the uncertainty associated with the results. Without such an estimate, perspectives drawn from the analysis and decisions made based on the results are questionable. The process of uncertainty quantification for large-scale multiphysics models, such as reservoir geomechanics models, requires special attention, because scenarios in which data availability is scarce or nil are commonly encountered. This thesis aimed to evaluate and integrate these two themes: uncertainty quantification and reservoir geomechanics. To this end, an extensive literature review was carried out on key issues related to reservoir geomechanics, such as injection above the fracture pressure, fault reactivation, reservoir compaction and CO2 injection. This review included the derivation and implementation of analytical solutions available in the literature for the phenomena described above; thus, the first contribution of this thesis was to gather different analytical solutions related to reservoir geomechanics into a single document. The process of uncertainty quantification was widely discussed, from the definition of the types of uncertainty (aleatory or epistemic) to the presentation of different methods for uncertainty quantification. Evidence theory, also known as Dempster-Shafer theory, was detailed and presented as a generalization of probability theory. Although widely used in other fields of engineering, evidence theory was here applied to reservoir engineering for the first time, which constitutes a fundamental contribution of this thesis. The concept of decisions under uncertainty was introduced and drove the integration of these two extremely important topics in reservoir engineering. Different decision-making scenarios were described and discussed, among them: the absence of available input data, the situation in which the input parameters are known, the inference of new data over the course of the project, and, finally, a hybrid modeling approach. As a result of this integration, three articles were submitted to peer-reviewed journals. Finally, the flow equation in deformable porous media was derived and an explicit methodology was proposed to incorporate geomechanical effects into traditional reservoir simulation. This methodology proved quite effective when compared with the fully coupled and iterative methods in the literature.
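The sense in which evidence theory generalizes probability theory can be sketched briefly (the frame and masses below are invented for illustration): when every focal element is a singleton, belief and plausibility coincide with an ordinary probability; set-valued focal elements open a gap between them that represents ignorance.

```python
# Belief sums mass over focal elements contained in the event; plausibility
# sums mass over focal elements that intersect it.

def bel(event, m):
    return sum(v for A, v in m.items() if A <= event)

def pl(event, m):
    return sum(v for A, v in m.items() if A & event)

event = frozenset({"fault_reactivates"})

# Bayesian case: singleton focal elements only (plain probability).
m_prob = {frozenset({"fault_reactivates"}): 0.2,
          frozenset({"stable"}): 0.8}

# Epistemic case: part of the mass stays on the whole frame (ignorance).
m_evid = {frozenset({"fault_reactivates"}): 0.2,
          frozenset({"stable"}): 0.5,
          frozenset({"fault_reactivates", "stable"}): 0.3}

print(bel(event, m_prob), pl(event, m_prob))  # equal: an ordinary probability
print(bel(event, m_evid), pl(event, m_evid))  # the interval [0.2, 0.5]
```

The interval [bel, pl] is exactly what makes the theory attractive when input data are scarce: rather than inventing a single probability, the analyst reports honest lower and upper bounds on the chance of, say, fault reactivation.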
10

Evaluation and Implementation of Traceable Uncertainty for Threat Evaluation

Haglind, Carl January 2014 (has links)
Threat evaluation is used in various applications to find threatening objects or situations and neutralize them before they cause any damage. To make threat evaluation as user-friendly as possible, it is important to know where the uncertainties are. The Traceable Uncertainty method can make the threat evaluation process more transparent and, hopefully, easier to rely on. Traceable Uncertainty is used when different sources of information are combined to find support for the decision-making process. The uncertainty of the current information is measured before and after the combination. If the magnitude of the uncertainty has changed by more than a threshold, a new branch is created which excludes the new information from the combination of evidence. Traceable Uncertainty had never been tested on a realistic scenario to investigate whether the method can be implemented in a large-scale system. The hypothesis of this thesis is that Traceable Uncertainty can be used in large-scale systems if its threshold parameter is tuned in the right way. Different threshold values were tested when recorded radar data were analyzed for threatening targets. Experiments combining randomly generated evidence were also analyzed for different threshold values. The results showed that a threshold value in the range [0.15, 0.25] generated a satisfactory number of interpretations that were not too similar to each other. The results could also be filtered to remove unnecessary interpretations. This shows that, in this respect and for this data set, Traceable Uncertainty can be used in large-scale systems.
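The branching scheme described above can be sketched as follows (the uncertainty measure, threshold, frame and masses are illustrative assumptions; the thesis's actual measure and data may differ): combine each branch with the new evidence, and if the uncertainty changes by more than the threshold, also keep a branch that excludes the new evidence.

```python
import math

def entropy(m):
    # Shannon entropy of the mass values, used here as the uncertainty measure.
    return -sum(v * math.log2(v) for v in m.values() if v > 0)

def combine(m1, m2):
    # Dempster's rule over frozenset focal elements.
    raw, conflict = {}, 0.0
    for A, v1 in m1.items():
        for B, v2 in m2.items():
            C = A & B
            if C:
                raw[C] = raw.get(C, 0.0) + v1 * v2
            else:
                conflict += v1 * v2
    return {A: v / (1.0 - conflict) for A, v in raw.items()}

def update(branches, new_evidence, threshold=0.2):
    out = []
    for m in branches:
        fused = combine(m, new_evidence)
        if abs(entropy(fused) - entropy(m)) > threshold:
            out.append(m)  # keep an interpretation excluding the new evidence
        out.append(fused)
    return out

hostile, friendly = frozenset({"hostile"}), frozenset({"friendly"})
branches = [{hostile: 0.5, friendly: 0.5}]
# A strong new report shifts the uncertainty sharply, so a branch is spawned.
branches = update(branches, {hostile: 0.9, friendly: 0.1})
print(len(branches))
```

The threshold plays the role studied in the thesis: a low value spawns a branch on almost every combination (many, highly similar interpretations), while a high value suppresses branching and hides where the uncertainty entered, which is why an intermediate range such as [0.15, 0.25] works best.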
