1

Using Poisson processes for rare event simulation

Walter, Clément 21 October 2016 (has links)
This thesis addresses the problem of rare event simulation. Starting from a study of Splitting methods, a new theoretical framework is developed, independent of any particular algorithm. This framework, based on a point process associated with any real-valued random variable, allows probability, quantile and moment estimators to be defined without any hypothesis on this random variable. The artificial threshold selection inherent to Splitting vanishes, and the estimator of the probability of exceeding a threshold is in fact an estimator of the whole cumulative distribution function up to the given threshold. Moreover, these estimators are based on independent and identically distributed replicas of the point process, so they allow the use of massively parallel computing clusters. Suitable practical algorithms are also proposed. Finally, it can happen that these statistics still require too many samples because the simulation cost remains too high. In this context the computer code is considered as a random process with known distribution. The point process framework makes it possible to handle this additional source of uncertainty and to estimate easily the conditional expectation and variance of the resulting random variable; it also defines new SUR (stepwise uncertainty reduction) enrichment criteria designed for rare event probability estimation.
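As a toy illustration of the point-process estimator described in this abstract, here is a minimal Python sketch (my own illustration, not code from the thesis). It uses X ~ Exp(1), for which sampling conditionally above a level is exact by memorylessness; for a general model the conditional moves would require a Markov kernel such as Metropolis–Hastings.

```python
import numpy as np

rng = np.random.default_rng(0)

def count_points(q):
    # One realisation of the increasing point process associated with X ~ Exp(1):
    # X_0 ~ Exp(1), then X_{k+1} ~ X | X > X_k, which by memorylessness is
    # X_k + Exp(1). The number of points at or below q is Poisson(-log p),
    # where p = P(X > q) = exp(-q).
    x, m = rng.exponential(), 0
    while x <= q:
        m += 1
        x += rng.exponential()  # exact conditional resampling above the current point
    return m

n, q = 200, 7.0                       # n i.i.d. point processes, target P(X > 7)
M = sum(count_points(q) for _ in range(n))
p_hat = (1.0 - 1.0 / n) ** M          # unbiased: E[(1 - 1/n)^M] = p
print(p_hat, np.exp(-q))              # estimate vs. exact value, about 9.1e-4
```

Because the n point processes are independent, the counts can be generated on separate machines and only the total M communicated, which is where the massive parallelism comes from.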
2

Kriging-based Approaches for the Probabilistic Analysis of Strip Footings Resting on Spatially Varying Soils

Thajeel, Jawad 08 December 2017 (has links)
The probabilistic analysis of geotechnical structures involving spatially varying soil properties is generally performed using Monte Carlo simulation. This method is not suitable for computing the small failure probabilities encountered in practice, because it becomes very expensive in such cases due to the large number of simulations required to obtain accurate values of the failure probability. In this thesis, three probabilistic approaches (named AK-MCS, AK-IS and AK-SS) were developed, each based on active learning and combining Kriging with one of three simulation techniques: Monte Carlo Simulation (MCS), Importance Sampling (IS) or Subset Simulation (SS). Within AK-MCS, a Monte Carlo simulation is performed without evaluating the whole population: the population is predicted using a Kriging metamodel defined from only a few points of the population, which significantly reduces the computation time with respect to crude MCS. In AK-IS, the more efficient sampling technique IS is used instead of MCS; within this approach, the small failure probability is estimated with an accuracy similar to AK-MCS but using a much smaller initial population, further reducing the computation time. Finally, in AK-SS, the more efficient sampling technique SS is proposed. This technique avoids the search for design points and can therefore deal with limit state surfaces of arbitrary shape. All three methods were applied to the case of a vertically loaded strip footing resting on a spatially varying soil, and the obtained results are presented and discussed.
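The AK-MCS loop described above can be sketched in a few lines of Python. The limit-state function g below is hypothetical (the thesis applies the method to a strip footing on spatially varying soil, which is not reproduced here); the metamodel is scikit-learn's Gaussian process and the learning function is the classical U criterion:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(1)

def g(x):
    # Hypothetical limit-state function: failure when g(x) <= 0.
    return 5.0 - x[:, 0] ** 2 - x[:, 1]

X_mc = rng.normal(size=(10_000, 2))              # Monte Carlo population
idx = rng.choice(len(X_mc), size=12, replace=False)
X_doe, y_doe = X_mc[idx], g(X_mc[idx])           # small initial design of experiments

for _ in range(40):                              # active-learning loop
    gp = GaussianProcessRegressor(kernel=RBF(), normalize_y=True).fit(X_doe, y_doe)
    mu, sigma = gp.predict(X_mc, return_std=True)
    U = np.abs(mu) / np.maximum(sigma, 1e-12)    # U criterion: risk of misclassification
    best = int(np.argmin(U))
    if U[best] >= 2.0:                           # common stopping rule for AK-MCS
        break
    X_doe = np.vstack([X_doe, X_mc[[best]]])     # evaluate g only at the best point
    y_doe = np.append(y_doe, g(X_mc[[best]]))

p_f = float(np.mean(mu <= 0))                    # failure probability from the metamodel
print(p_f)
```

AK-IS and AK-SS keep the same active-learning loop but replace the plain Monte Carlo population with an importance-sampling or subset-simulation population.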
3

Sequential Design of Experiments to Estimate a Probability of Failure.

Li, Ling 16 May 2012 (has links)
This thesis deals with the problem of estimating the probability of failure of a system from computer simulations. When only an expensive-to-simulate model of the system is available, the budget for simulations is usually severely limited, which is incompatible with the use of classical Monte Carlo methods. In fact, estimating a small probability of failure with very few simulations, as required in some complex industrial problems, is a particularly difficult task. A classical approach consists in replacing the expensive-to-simulate model with a surrogate model that requires few computing resources. Using such a surrogate model, two operations can be achieved. The first is choosing as small a number of simulations as possible to learn the regions of the system's parameter space that lead to failure. The second is constructing good estimators of the probability of failure. The contributions of this thesis consist of two parts. First, we derive SUR (stepwise uncertainty reduction) strategies from a Bayesian decision-theoretic formulation of the problem of estimating a probability of failure. Second, we propose a new algorithm, called Bayesian Subset Simulation, that combines the strengths of the Subset Simulation algorithm and of sequential Bayesian methods based on Gaussian process modeling. The new strategies are supported by numerical results on several benchmark examples in reliability analysis, and the proposed methods show good performance compared to existing methods in the literature.
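For context, plain Subset Simulation — the non-Bayesian baseline that Bayesian Subset Simulation builds on — estimates a small probability as a product of larger conditional probabilities. The following is a minimal sketch on a standard-normal toy problem (my own illustration; the Gaussian-process layer and SUR criteria contributed by the thesis are not shown):

```python
import numpy as np

rng = np.random.default_rng(2)

def subset_simulation(phi, d, q, n=2_000, p0=0.1):
    # Estimate P(phi(X) > q), X ~ N(0, I_d), as a product of conditional
    # probabilities, each intermediate level being met with probability ~ p0.
    X = rng.normal(size=(n, d))
    Y = phi(X)
    p = 1.0
    while True:
        level = np.quantile(Y, 1.0 - p0)        # next adaptive threshold
        if level >= q:                          # final level reached
            return p * np.mean(Y > q)
        p *= p0
        seeds = X[Y > level][: int(n * p0)]     # survivors seed the next level
        X = np.repeat(seeds, int(1 / p0), axis=0)
        for _ in range(5):                      # a few Metropolis moves per chain,
            prop = X + 0.5 * rng.normal(size=X.shape)  # restricted to {phi > level}
            log_ratio = 0.5 * (np.sum(X**2, axis=1) - np.sum(prop**2, axis=1))
            accept = (rng.random(len(X)) < np.exp(np.minimum(log_ratio, 0.0))) \
                     & (phi(prop) > level)
            X[accept] = prop[accept]
        Y = phi(X)

d, q = 2, 4.0
phi = lambda x: x.sum(axis=1) / np.sqrt(d)      # toy score, distributed N(0, 1)
print(subset_simulation(phi, d, q))             # exact value: Phi(-4) ~ 3.17e-5
```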
4

An Efficient Method to Assess Reliability under Dynamic Stochastic Loads

Norouzi, Mahdi January 2012 (has links)
No description available.
5

Minimal model reasoning for modal logic

Papacchini, Fabio January 2015 (has links)
Model generation and minimal model generation are useful for tasks such as model checking, query answering and the debugging of logical specifications. Due to this variety of applications, several minimality criteria and model generation methods for classical logics have been studied. Minimal model generation for modal logics, however, has not received the same attention from the research community. This thesis aims to fill this gap by investigating minimality criteria and designing minimal model generation procedures for all the sublogics of the multi-modal logic S5(m) and their extensions with universal modalities. All the procedures are minimal model sound and complete, in the sense that they generate all and only minimal models. The starting point of the investigation is the definition of a Herbrand semantics for modal logics, on which a syntactic minimality criterion is devised. The syntactic nature of this criterion allows for an efficient minimal model generation procedure, but the resulting minimal models can be redundant or semantically non-minimal with respect to each other. To overcome these syntactic limitations, the thesis moves from minimal modal Herbrand models to semantic minimality criteria based on subset-simulation. First, theoretical procedures for the generation of models minimal modulo subset-simulation are presented; these are minimal model sound and complete, but they might not terminate. The minimality criterion and the procedures are then refined in such a way that termination can be ensured while preserving minimal model soundness and completeness.
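The abstract does not spell out the definition of subset-simulation, so the following Python sketch implements one natural reading — a simulation between pointed Kripke models in which the atoms true at a world must be a subset of the atoms true at the world simulating it — purely as an illustration of how such a criterion can be checked:

```python
from itertools import product

def subset_simulates(M1, M2):
    # Greatest-fixpoint computation of a subset-simulation between two pointed
    # Kripke models M = (root, edges, val), where edges maps a world to its
    # successors and val maps a world to its set of true atoms. Reading assumed
    # here (not spelled out in the abstract): (u, v) may be related only if
    # val1(u) is a subset of val2(v), and every successor of u must be matched
    # by some related successor of v.
    (r1, R1, V1), (r2, R2, V2) = M1, M2
    rel = {(u, v) for u, v in product(V1, V2) if V1[u] <= V2[v]}
    changed = True
    while changed:
        changed = False
        for (u, v) in list(rel):
            ok = all(any((u2, v2) in rel for v2 in R2.get(v, ()))
                     for u2 in R1.get(u, ()))
            if not ok:
                rel.discard((u, v))
                changed = True
    return (r1, r2) in rel

# Toy check: a one-world model with {p} is subset-simulated by one with {p, q}.
M1 = ("w", {}, {"w": {"p"}})
M2 = ("v", {}, {"v": {"p", "q"}})
print(subset_simulates(M1, M2))   # True
```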
