31

Imprecise probability analysis for integrated assessment of climate change

Kriegler, Elmar January 2005
We present an application of imprecise probability theory to the quantification of uncertainty in the integrated assessment of climate change. Our work is motivated by the fact that uncertainty about climate change is pervasive and therefore requires thorough treatment in the integrated assessment process. Classical probability theory faces severe difficulties in this respect, since it cannot capture very poor states of information in a satisfactory manner. A more general framework is provided by imprecise probability theory, which offers a similarly firm evidential and behavioural foundation while allowing us to capture more diverse states of information. An imprecise probability describes the information in terms of lower and upper bounds on probability.

For the purpose of our imprecise probability analysis, we construct a diffusion ocean energy balance climate model that parameterises the global mean temperature response to secular trends in the radiative forcing in terms of climate sensitivity and effective vertical ocean heat diffusivity. We compare the model behaviour to the 20th century temperature record in order to derive a likelihood function for these two parameters and the forcing strength of anthropogenic sulphate aerosols. Results show a strong positive correlation between climate sensitivity and ocean heat diffusivity, and between climate sensitivity and the absolute strength of the sulphate forcing.

We identify two suitable imprecise probability classes for an efficient representation of the uncertainty about the climate model parameters, and provide an algorithm to construct a belief function for the prior parameter uncertainty from a set of probability constraints that can be deduced from the literature or observational data. For the purpose of updating the prior with the likelihood function, we establish a methodological framework that allows us to perform the updating procedure efficiently for two different updating rules: Dempster's rule of conditioning and the Generalised Bayes' rule. Dempster's rule yields a posterior belief function in good qualitative agreement with previous studies that tried to constrain climate sensitivity and sulphate aerosol cooling. In contrast, we are not able to produce meaningful imprecise posterior probability bounds from the application of the Generalised Bayes' rule. We attribute this result mainly to our choice of representing the prior uncertainty by a belief function.

We project the Dempster-updated belief function for the climate model parameters onto estimates of future global mean temperature change under several emissions scenarios for the 21st century, and several long-term stabilisation policies. Within the limitations of our analysis, we find that a stringent stabilisation level of around 450 ppm carbon dioxide equivalent concentration is required to obtain a non-negligible lower probability of limiting the warming to 2 degrees Celsius. We discuss several frameworks of decision-making under ambiguity and show that they can lead to a variety of, possibly imprecise, climate policy recommendations. We find, however, that poor states of information do not necessarily preclude useful policy advice.

We conclude that imprecise probabilities indeed constitute a promising candidate for the adequate treatment of uncertainty in the integrated assessment of climate change. We have constructed prior belief functions that rest on much weaker assumptions about the prior state of information than a prior probability would require and that can nevertheless be propagated through the entire assessment process. As a caveat, the updating issue needs further investigation. Belief functions constitute a sensible choice for the prior uncertainty representation only if more restrictive updating rules than the Generalised Bayes' rule are available.
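
To make the lower/upper-bound reading concrete, here is a minimal Python sketch (not the thesis's climate-model computation; the three-bin frame and the mass values are invented for illustration) that computes the belief and plausibility of an event from a mass function and updates it with Dempster's rule of conditioning:

```python
def belief(mass, event):
    """Bel(A): total mass of focal sets contained in A (lower probability)."""
    return sum(m for focal, m in mass.items() if focal <= event)

def plausibility(mass, event):
    """Pl(A): total mass of focal sets intersecting A (upper probability)."""
    return sum(m for focal, m in mass.items() if focal & event)

def dempster_condition(mass, evidence):
    """Dempster's rule of conditioning: intersect every focal set with the
    evidence, drop empty intersections, renormalise the surviving mass."""
    out = {}
    for focal, m in mass.items():
        inter = focal & evidence
        if inter:
            out[inter] = out.get(inter, 0.0) + m
    total = sum(out.values())  # equals 1 minus the conflict
    return {f: m / total for f, m in out.items()}

# Hypothetical frame: climate sensitivity binned as low / mid / high,
# with 40% of the mass uncommitted (assigned to the whole frame).
frame = frozenset({"low", "mid", "high"})
mass = {frozenset({"low"}): 0.2, frozenset({"mid"}): 0.4, frame: 0.4}
event = frozenset({"mid", "high"})
print(belief(mass, event), plausibility(mass, event))  # 0.4 <= P(event) <= 0.8
print(dempster_condition(mass, event))                 # posterior belief function
```

The uncommitted mass on the whole frame is what separates this representation from a single probability distribution: it widens the [belief, plausibility] interval instead of forcing a point value.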
32

Management de l'incertitude pour les systèmes booléens complexes - Application à la maintenance préventive des avions / Uncertainty Management for Complex Boolean Systems - Application to Preventive Maintenance of Aircraft

Jacob, Christelle 25 February 2014
Standard approaches to reliability analysis rely on a probabilistic analysis of critical events based on fault tree representations, which describe them as logical combinations of more basic events (complex Boolean formulas). In practice, and especially for preventive maintenance tasks, the probabilities ruling the occurrence of these basic events are seldom precisely known. The aim of this thesis is to study the impact of epistemic uncertainty on the probabilities of elementary events such as failures, and the propagation of this uncertainty to higher-level events. The fundamental problem addressed is thus to compute the probability interval for a Boolean proposition representing a failure condition, given the probability intervals of its atomic propositions. When stochastic independence is assumed, we face an interval analysis problem that is NP-hard in general. We provide an original algorithm that computes the output probability interval exactly, exploiting the monotonicity of the obtained function with respect to some variables so as to reduce the uncertainty. The algorithm is also extended to the case where the probabilities of the basic events evolve over time, the parameters of the reliability function being imprecisely known. Besides, taking advantage of the fact that a probability interval on a binary space can be modelled by a belief function, we solve the same problem under a different assumption, namely independence of the information sources. While the belief and plausibility of a Boolean proposition are even harder to compute, we show that in practical situations such as usual fault trees, the additivity condition of probability theory remains valid, which simplifies the calculation. A prototype has been developed to compute the probability interval for a complex Boolean proposition.
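
Under the independence assumption, the probability of a Boolean formula is multilinear in the atomic probabilities, so its extrema over a box of probability intervals are attained at interval endpoints. The following sketch illustrates this with brute-force vertex enumeration (exponential in the number of atoms, consistent with the NP-hardness above; it is not the monotonicity-exploiting algorithm of the thesis, and the toy fault tree TOP = A AND (B OR C) is an invented example):

```python
from itertools import product

def formula_prob(p):
    """P(TOP) for the toy fault tree TOP = A AND (B OR C), atoms independent."""
    pa, pb, pc = p
    return pa * (pb + pc - pb * pc)

def probability_interval(bounds):
    """Exact [lower, upper] for a multilinear P(TOP): under independence the
    extrema lie at vertices of the box of atomic probability intervals, so
    enumerating the 2^n endpoint combinations suffices (exponential cost)."""
    values = [formula_prob(vertex) for vertex in product(*bounds)]
    return min(values), max(values)

# Imprecise atomic failure probabilities [lower, upper] for A, B, C.
bounds = [(0.01, 0.05), (0.10, 0.20), (0.02, 0.08)]
print(probability_interval(bounds))
```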
33

Contribution à l’évaluation et à l’amélioration multicritères en contexte incertain : application à la conception préliminaire / Contribution to multi-criteria assessment and improvement in an uncertain context: application to preliminary design

Sow, Diadié 18 December 2017
In a highly competitive and unstable environment, manufacturers must constantly improve their products to remain competitive and satisfy their customers, while minimizing costs and risk-taking at the design stage. At the early stages of design or (re-)engineering, forecasting the performance of a new product is difficult: the impact of any change to the product's characteristics on its performance can only be estimated imprecisely. Decision makers and designers must nevertheless identify the performances to be improved while limiting the engineering effort spent on innovative upgrades. Although several worth indexes have been proposed in the multiple-criteria decision-making literature to estimate a priori the improvement that a configuration change confers on a product, they generally rely on unrealistic assumptions about the achievability of the expected performance gains that do not hold in manufacturing contexts.

Based on multi-criteria decision analysis techniques and possibility theory, this thesis proposes an extension of the worth index concept to the case where the likelihood of the expected improvements cannot be assessed precisely, as happens at the preliminary design stage. This poor knowledge of the relationships between improvement actions and expected performances makes the question "how to set ambitious targets when improving or designing a product while keeping these targets within the manufacturer's reach" all the thornier. The improvements to be planned must therefore both have a significant positive impact on product performance and correspond to competences mastered by the manufacturer. Several approaches in the literature have addressed one of these two aspects of the improvement problem, but few deal with both at the same time. We propose several qualitative and possibilistic approaches that reconcile the two points of view through multi-attribute optimization problems; the notion of interaction between two performance dimensions is central to the framework. An illustrative example related to the design of an autonomous robot accompanies each proposal. The case study comes from the Robafis challenge, organized annually by the Association Française d'Ingénierie Système (AFIS) to promote systems engineering in engineering schools.
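
As a hint of the possibilistic machinery involved, the sketch below (all numbers and the triangular shape are assumptions for illustration, not the thesis's worth indexes) computes the possibility and the necessity that an improvement gain reaches a target; the gap between the two degrees is exactly the kind of imprecision a preliminary design study must live with:

```python
def triangular_possibility(x, a, m, b):
    """Triangular possibility distribution: 0 outside [a, b], 1 at the mode m."""
    if x <= a or x >= b:
        return 0.0
    return (x - a) / (m - a) if x < m else (b - x) / (b - m)

def possibility_at_least(target, a, m, b, grid=10_000):
    """Pi(gain >= target): sup of the distribution over [target, +inf)."""
    xs = [a + (b - a) * i / grid for i in range(grid + 1)]
    return max((triangular_possibility(x, a, m, b) for x in xs if x >= target),
               default=0.0)

def necessity_at_least(target, a, m, b, grid=10_000):
    """N(gain >= target) = 1 - Pi(gain < target): the guaranteed degree."""
    xs = [a + (b - a) * i / grid for i in range(grid + 1)]
    below = max((triangular_possibility(x, a, m, b) for x in xs if x < target),
                default=0.0)
    return 1.0 - below

# Expected improvement of one performance criterion, in arbitrary units:
# support [0, 1], most plausible gain 0.3, ambitious target 0.5.
a, m, b = 0.0, 0.3, 1.0
print(possibility_at_least(0.5, a, m, b))  # ~0.71: the target is plausible...
print(necessity_at_least(0.5, a, m, b))    # 0.0: ...but in no way guaranteed
```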
34

Processos de decisão Markovianos com probabilidades imprecisas e representações relacionais: algoritmos e fundamentos. / Markov decision processes with imprecise probabilities and relational representations: foundations and algorithms.

Ricardo Shirota Filho 03 May 2012
This work is devoted to the theoretical and algorithmic development of Markov decision processes with imprecise probabilities and relational representations. In the literature, this configuration has been important within artificial intelligence planning, where relational representations allow compact descriptions and imprecise probabilities express a more general form of uncertainty. There are three main contributions. First, we discuss the foundations of sequential decision making with imprecise probabilities, pointing out key questions that remain open; these results directly affect, but are not restricted to, the model of interest in this work, Markov decision processes with imprecise probabilities. Second, we propose three algorithms for Markov decision processes with imprecise probabilities based on mathematical programming (optimization). And third, we develop ideas proposed by Trevizan, Cozman and de Barros (2008) on the use of variants of the Real-Time Dynamic Programming algorithm to solve probabilistic planning problems described by extended versions of the Probabilistic Planning Domain Definition Language (PPDDL).
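
A generic numeric backbone for such models is robust value iteration over interval transition probabilities, where the inner minimisation has a greedy closed-form solution; the sketch below is an illustration under invented rewards and intervals, not one of the three mathematical-programming algorithms proposed in the thesis:

```python
def worst_case_expectation(values, lower, upper):
    """Minimise sum p_i * values_i subject to lower_i <= p_i <= upper_i and
    sum p_i = 1: start from the lower bounds and greedily pour the remaining
    mass onto the successors with the smallest values."""
    p = list(lower)
    mass = 1.0 - sum(lower)
    for i in sorted(range(len(values)), key=lambda i: values[i]):
        take = min(upper[i] - lower[i], mass)
        p[i] += take
        mass -= take
    return sum(pi * v for pi, v in zip(p, values))

def robust_value_iteration(n_states, actions, reward, lower, upper,
                           gamma=0.9, iters=200):
    """Pessimistic Bellman update:
    V(s) = max_a [ r(s,a) + gamma * min_{P in interval set} E_P[V] ]."""
    V = [0.0] * n_states
    for _ in range(iters):
        V = [max(reward[s][a] + gamma * worst_case_expectation(
                     V, lower[s][a], upper[s][a])
                 for a in actions)
             for s in range(n_states)]
    return V

# Two states, two actions; transition probabilities known only up to intervals.
actions = [0, 1]
reward = [[1.0, 0.5], [0.0, 0.8]]               # reward[s][a]
lower = [[[0.6, 0.1], [0.2, 0.5]],              # lower[s][a][s']
         [[0.3, 0.4], [0.1, 0.6]]]
upper = [[[0.9, 0.4], [0.5, 0.8]],              # upper[s][a][s']
         [[0.6, 0.7], [0.4, 0.9]]]
print(robust_value_iteration(2, actions, reward, lower, upper))
```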
35

Operações espaciais robustas à imprecisão nas coordenadas geográficas / Spatial operations robust to imprecision in geographic coordinates

Oliveira, Welder Batista de 21 August 2017
Funded by the Fundação de Amparo à Pesquisa do Estado de Goiás (FAPEG). / Geographic Information Systems have revolutionized geographic research over the past three decades. These systems commonly provide a number of features for processing and analyzing spatial data, such as the spatial join and the skyline query. Although relevant, the effectiveness of such functionalities is affected by the imprecision of the geographic coordinates obtained by the georeferencing method employed. Moreover, the error contained in the coordinates may follow several distributional patterns, which demands solutions that are general with respect to the error pattern they can handle properly. Finally, spatial operations are already computationally expensive in their deterministic version, which is aggravated by the introduction of the stochastic component. The present work presents a general structure for spatial operation solutions robust to imprecise coordinates, composed of the combination of probabilistic adaptations of heuristics from the deterministic versions of the spatial operations with Monte Carlo simulations. To deal with the problems mentioned, the proposed structure is designed to meet the requirements of generality, accuracy and efficiency at levels that enable its practical application. From this structure, specific solutions are developed, as case studies, for the spatial join and the skyline. Theoretical and experimental results demonstrate the potential of the developed solutions to meet the three requirements established in this work.
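
The simulation half of such a structure can be illustrated with a minimal Monte Carlo estimate of a distance predicate under coordinate error (the Gaussian error model and all numbers are assumptions of this sketch, since the dissertation explicitly targets arbitrary error patterns):

```python
import math
import random

def within_distance_probability(p, q, sigma_p, sigma_q, d, n=100_000):
    """Estimate P(dist(P, Q) <= d) when the recorded coordinates p and q carry
    independent Gaussian positional errors, by resampling both points n times."""
    hits = 0
    for _ in range(n):
        px = random.gauss(p[0], sigma_p); py = random.gauss(p[1], sigma_p)
        qx = random.gauss(q[0], sigma_q); qy = random.gauss(q[1], sigma_q)
        if math.hypot(px - qx, py - qy) <= d:
            hits += 1
    return hits / n

# Two georeferenced points ~150 m apart with 50 m / 80 m positional accuracy:
# should this pair enter a distance-based spatial join with a 200 m radius?
print(within_distance_probability((0.0, 0.0), (150.0, 0.0), 50.0, 80.0, 200.0))
```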
36

Transformação de redes de Petri coloridas em processos de decisão markovianos com probabilidades imprecisas. / Conversion from colored Petri nets into Markov decision processes with imprecise probabilities.

Eboli, Mônica Goes 01 July 2010
This work was motivated by the need to consider stochastic behavior when planning the production mix of a manufacturing system, that is, what to produce and in which order. Such systems exhibit stochastic behavior that is usually not considered during production planning. The main goal of this work was to obtain a method to model manufacturing systems and to represent their stochastic behavior when planning production for these systems. Because the methods suitable for planning were not adequate for modeling the systems, and vice versa, two methods were combined to achieve this goal: the systems are modeled as Petri nets and converted into Markov decision processes, and the planning is done with the latter. In order to represent the probabilities involved in the processes, a special type of Petri net, named the factored Petri net, was proposed. Using this kind of Petri net, a conversion method into Markov decision processes was developed. The conversion was successful, as tests showed that plans can be produced within seconds using state-of-the-art algorithms for Markov decision processes.
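
The state-space side of such a conversion can be sketched generically: markings of a Petri net become MDP states and enabled transitions become the available actions. The sketch below builds that skeleton for an ordinary (not factored) net; the probabilistic annotations of the proposed factored nets are beyond it, and the toy manufacturing net is invented:

```python
from collections import deque

def enabled(marking, pre):
    """A transition is enabled when every input place holds enough tokens."""
    return all(marking[p] >= n for p, n in pre.items())

def fire(marking, pre, post):
    """Firing consumes the input tokens and produces the output tokens."""
    m = list(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] += n
    return tuple(m)

def reachability_graph(m0, transitions):
    """BFS over markings: the result is the state/action skeleton of the MDP."""
    graph, frontier = {}, deque([m0])
    while frontier:
        m = frontier.popleft()
        if m in graph:
            continue
        graph[m] = {}
        for name, (pre, post) in transitions.items():
            if enabled(m, pre):
                m2 = fire(m, pre, post)
                graph[m][name] = m2
                if m2 not in graph:
                    frontier.append(m2)
    return graph

# Toy manufacturing net: place 0 = raw parts, 1 = machine free, 2 = finished.
transitions = {
    "process": ({0: 1, 1: 1}, {1: 1, 2: 1}),  # consume a part, keep the machine
}
print(reachability_graph((2, 1, 0), transitions))
```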
38

Sound source localization with data and model uncertainties using the EM and Evidential EM algorithms / Estimation of acoustic sources taking propagation uncertainty into account

Wang, Xun 09 December 2014
This work addresses the problem of multiple sound source localization, for both deterministic and random signals measured by an array of microphones. The problem is solved in a statistical framework via maximum likelihood estimation. The pressure measured by a microphone is interpreted as a mixture of latent signals emitted by the sources; both the source locations and strengths can then be estimated using an expectation-maximization (EM) algorithm. Two kinds of uncertainty are also considered: on the microphone locations and on the wave number. These uncertainties are transposed to the data in the belief functions framework, and the source locations and strengths are estimated using a variant of the EM algorithm known as the Evidential EM (E2M) algorithm.

The first part of this work begins with the deterministic signal model without consideration of uncertainty. The EM algorithm is used to estimate the source locations and strengths, and the update equations for the model parameters are provided. Experimental results are presented and compared with beamforming and statistically optimized near-field holography (SONAH), demonstrating the advantage of the EM algorithm. The second part raises the issue of model uncertainty and shows how the uncertainties on microphone locations and wave number can be taken into account at the data level. In this case, the notion of likelihood is extended to uncertain data, and the E2M algorithm is used to solve the source estimation problem. In both simulations and a real experiment, the E2M algorithm proves more robust in the presence of model and data uncertainty. The third part considers the case of random signals, in which the amplitude is modeled as a Gaussian random variable. Both the certain and uncertain cases are investigated: in the former, the EM algorithm is employed to estimate the sound sources; in the latter, microphone location and wave number uncertainties are quantified as in the second part, and the source locations and the variance of the random amplitudes are estimated using the E2M algorithm. The results again show the advantage of a statistical model for source estimation and the value of accounting for uncertainty in the model parameters.
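
The E/M alternation underlying both algorithms can be illustrated on the simplest latent-mixture analogue, a one-dimensional two-component Gaussian mixture with known variance and equal weights (an invented stand-in, not the acoustic forward model or its evidential extension):

```python
import math
import random

def em_two_gaussians(data, mu=(-1.0, 1.0), sigma=1.0, iters=50):
    """EM for a two-component Gaussian mixture with equal weights and known
    variance: the E-step computes responsibilities (posterior membership
    probabilities), the M-step re-estimates the component means."""
    mu = list(mu)
    for _ in range(iters):
        # E-step: posterior probability that each point came from component 0
        # (equal priors, so the shared normalising constants cancel).
        resp = []
        for x in data:
            w = [math.exp(-(x - m) ** 2 / (2 * sigma ** 2)) for m in mu]
            resp.append(w[0] / (w[0] + w[1]))
        # M-step: responsibility-weighted means.
        r0 = sum(resp)
        mu[0] = sum(r * x for r, x in zip(resp, data)) / r0
        mu[1] = sum((1 - r) * x for r, x in zip(resp, data)) / (len(data) - r0)
    return mu

random.seed(0)
data = [random.gauss(-2.0, 1.0) for _ in range(300)] + \
       [random.gauss(3.0, 1.0) for _ in range(300)]
print(em_two_gaussians(data))  # converges near the true means -2 and 3
```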
39

Bayesian Methods Under Unknown Prior Distributions with Applications to The Analysis of Gene Expression Data

Rahal, Abbas 14 July 2021
The local false discovery rate (LFDR) is one of many existing statistical methods for multiple hypothesis testing. As a Bayesian quantity, the LFDR is based on the prior probability of the null hypothesis and a mixture distribution of the null and non-null hypotheses. In practice, the LFDR is unknown and needs to be estimated. The empirical Bayes approach can be used to estimate that mixture distribution. Empirical Bayes does not require complete information about the prior and hyperprior distributions, as hierarchical Bayes does: when there is not enough information at the prior level, instead of placing a distribution at the hyperprior level of a hierarchical Bayes model, empirical Bayes estimates the prior parameters from the data, often via the marginal distribution. In this research, we developed new Bayesian methods under unknown prior distributions. A set of adequate prior distributions may be defined using Bayesian model checking, by setting a threshold on the posterior predictive p-value, prior predictive p-value, calibrated p-value, Bayes factor, or integrated likelihood; we derive a set of adequate posterior distributions from that set. In order to obtain a single posterior distribution instead of a set of adequate posterior distributions, we used a blended distribution, which minimizes the relative entropy of a set of adequate prior (or posterior) distributions to a "benchmark" prior (or posterior) distribution. We present two approaches to generating a blended posterior distribution, namely updating-before-blending and blending-before-updating. The blended posterior distribution can be used to estimate the LFDR by considering the nonlocal false discovery rate as a benchmark and the different LFDR estimators as an adequate set. The likelihood ratio can often be misleading in multiple testing unless it is supplemented by adjusted p-values or posterior probabilities based on sufficiently strong prior distributions; in the case of unknown prior distributions, these can be estimated by empirical Bayes methods or blended distributions. We propose a general framework for applying the laws of likelihood to problems involving multiple hypotheses by bringing together multiple statistical models, and we have applied it to data sets from genomics, COVID-19, and other domains.
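
The two-groups form of the LFDR can be sketched directly: with a theoretical N(0,1) null density f0 and marginal density f, lfdr(z) = pi0 * f0(z) / f(z). In the sketch below the marginal is estimated with a kernel density and pi0 is assumed known, whereas the thesis estimates such quantities by empirical Bayes or blending; the data are simulated:

```python
import numpy as np
from scipy.stats import norm, gaussian_kde

rng = np.random.default_rng(1)
# z-scores for 2000 tests: 90% true nulls ~ N(0,1), 10% non-nulls ~ N(3,1).
z = np.concatenate([rng.normal(0, 1, 1800), rng.normal(3, 1, 200)])

pi0 = 0.9                 # proportion of true nulls (assumed known here;
                          # in practice it must itself be estimated)
f = gaussian_kde(z)       # kernel estimate of the marginal density f(z)
lfdr = np.clip(pi0 * norm.pdf(z) / f(z), 0, 1)  # lfdr = pi0 * f0 / f

# A small LFDR is strong evidence against the null for that individual test.
print((lfdr < 0.2).sum(), "tests flagged at lfdr < 0.2")
```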
40

Dialogue graphique intelligent, fondé sur une ontologie, pour une prothèse de mémoire / Smart graphical dialogue, based on an ontology, for a memory prosthesis

Ghorbel, Fatma 10 July 2018
In this thesis, we propose a "smart" memory prosthesis, called CAPTAIN MEMO, to help Alzheimer's disease patients palliate their memory problems. It is based on a temporal, fuzzy and multilingual ontology named MemoFuzzyOnto, and it provides user interfaces accessible to this particular class of users. To design these interfaces, we propose a methodology named InterfaceToAlz, which serves as an information base for guiding and evaluating the design of user interfaces for Alzheimer's disease patients and identifies 146 design guidelines. Besides, we propose an ontology visualization tool called Memo Graph, which offers a visualization accessible to and understandable by Alzheimer's disease patients. This proposal is motivated by CAPTAIN MEMO's need to generate and edit a graph of the patient's family and entourage from personal data structured according to MemoFuzzyOnto. Memo Graph is based on our design guidelines and on our approach, named Incremental Key-Instances Extraction and Visualisation, which incrementally extracts and visualizes a summary of the ABox assertions of a given ontology and generates "summary instance graphs" from the most important data. It also supports Linked Data visualization and scales to large ontologies. Furthermore, we propose a typology of the imperfection of the data entered (mainly due to the memory discordance caused by the disease) and a methodology that makes CAPTAIN MEMO tolerant of the entry of false data. We propose a believability model and an approach, named Data Believability Estimation for Applications to Alzheimer Patients, to estimate qualitatively and quantitatively the believability of each entered datum. Finally, so that CAPTAIN MEMO is tolerant of the entry of imprecise time intervals, we propose two approaches: one based on a crisp environment, modeled in OWL 2 using only crisp standards and tools, and one based on fuzzy set theory and fuzzy tools, modeled in Fuzzy-OWL 2. In both approaches, we extend the 4D-fluents model to represent imprecise time intervals and qualitative temporal relations, and we extend Allen's interval algebra to compare imprecise time intervals in the context of MemoFuzzyOnto. Our contributions are implemented and evaluated. We evaluated the accessibility of the user interfaces, the CAPTAIN MEMO service that aims to stimulate the patient's memory, our approach for quantitatively estimating the believability of entered data, and the visualization of the graph generated with Memo Graph. We also evaluated the performance of Memo Graph and its usability by domain-expert users.
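
The crisp-environment treatment of imprecise time intervals can be hinted at with endpoints known only up to bounds: an Allen-style relation then splits into a necessary and a possible variant (a simplification for illustration; the thesis's actual modelling uses the extended 4D-fluents representation in OWL 2 / Fuzzy-OWL 2, and the example dates are invented):

```python
from dataclasses import dataclass

@dataclass
class ImpreciseInterval:
    """A time interval whose endpoints are only bounded: the event starts
    somewhere in [start_min, start_max] and ends in [end_min, end_max]."""
    start_min: float
    start_max: float
    end_min: float
    end_max: float

def necessarily_before(a: ImpreciseInterval, b: ImpreciseInterval) -> bool:
    """True if a ends before b starts under every admissible instantiation."""
    return a.end_max < b.start_min

def possibly_before(a: ImpreciseInterval, b: ImpreciseInterval) -> bool:
    """True if some admissible instantiation puts a strictly before b."""
    return a.end_min < b.start_max

# "The wedding was in spring 1970" vs "the move happened sometime in 1972-1975".
wedding = ImpreciseInterval(1970.2, 1970.4, 1970.2, 1970.4)
move = ImpreciseInterval(1972.0, 1974.9, 1972.0, 1975.0)
print(necessarily_before(wedding, move))  # True
print(possibly_before(move, wedding))     # False
```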
