  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Modelling porosity and permeability in early cemented carbonates

Hosa, Aleksandra Maria January 2016 (has links)
Carbonate-hosted hydrocarbon reservoirs will play an increasingly important role in the energy supply, as 60% of the world's remaining hydrocarbon resources are trapped within carbonate rocks. The properties of carbonates are controlled by deposition and diagenesis, which includes calcite cementation that begins immediately after deposition and may have a strong impact on subsequent diagenetic pathways. This thesis aims to understand the impact of early calcite cementation on reservoir properties through object-based modelling and Lattice Boltzmann flow simulation to obtain permeability. A Bayesian inference framework is also developed to quantify the ability of the Lattice Boltzmann method to predict the permeability of porous media. Modelling focuses on the impact of carbonate grain type on the properties of early cemented grainstones and on the examination of theoretical changes to the morphology of the pore space. For that purpose, process-based models of early cementation are developed in both 2D (Calcite2D) and 3D (Calcite3D, which also includes modelling of deposition). Both models assume the existence of two grain types, polycrystalline and monocrystalline, and two early calcite cement types specific to these grain types: isopachous and syntaxial, respectively. Of the many possible crystal forms that syntaxial cement can take, this thesis focuses on two common rhombohedral forms: a blocky form {011̄2} and an elongated form {404̄1}. The results of the 2D and 3D modelling demonstrate the effect of competition between growing grains for the available pore space: the more monocrystalline grains present in the sample, the stronger this competition becomes and the smaller the impact of each individual grain on the resulting early calcite cement volume and porosity. 
Synthetic samples with syntaxial cements grown in the more elongated crystal form {404̄1} have lower porosity, for the same monocrystalline grain content, than synthetic samples grown following the more blocky crystal form {011̄2}. Moreover, permeability at a constant porosity is reduced for synthetic samples with the form {404̄1}. Additionally, synthetic samples with form {404̄1} exhibit greater variability in the results, as this rhombohedral form is more elongated and has the potential to produce a greater volume of cement. The results of the 2D study suggest that for samples at constant porosity, the higher the proportion of monocrystalline grains in the sample, the higher the permeability. The 3D study suggests that for samples with crystal form {011̄2} at constant porosity, permeability becomes lower as the proportion of monocrystalline grains increases, but this impact is relatively minor. In the case of samples with crystal form {404̄1} the results are inconclusive. This dependence of permeability on monocrystalline grains is weaker than in the 2D study, most probably a result of the bias of flow simulation in 2D as well as of the treatment of the porous medium before the cement growth model is applied. The range of the permeability results in the 2D modelling may be artificially wide, which could exaggerate the dependence of permeability on sediment type. Poroperm results of the 2D modelling (10–8,000 mD) are in reasonable agreement with the data reported for grainstones in the literature (0.1–5,000 mD) as well as with the plug data of the samples used in modelling (porosity 22–27%, permeability 200–3,000 mD); however, permeability results at any given porosity have a wide range due to the bias inherent in 2D flow modelling. 
Poroperm results in the 3D modelling (10–30,000 mD) exhibit permeabilities above the range reported in the literature or in the plug data, but this is because the initial synthetic sediment deposit has very high permeability (58,900 mD). However, the trend in poroperm closely resembles those reported in carbonate rocks. As the modelling depends heavily on the Lattice Boltzmann method (flow simulation to obtain permeability results), a Bayesian inference framework is presented to quantify the predictive power of Lattice Boltzmann models. This calibration methodology is demonstrated on the example of Fontainebleau sandstone. The framework enables systematic estimation of Lattice Boltzmann model parameters (in the scope of this work, the relaxation parameter τ) for the currently used calibrations of Lattice Boltzmann based on the Hagen-Poiseuille law. Our prediction of permeability using the Hagen-Poiseuille calibration suggests that this calibration method is not optimal and in fact leads to substantial discrepancies with experimental measurements, especially for highly porous complex media such as carbonates. We proceed to recalibrate the Lattice Boltzmann model using permeability data from porous media, which results in a substantially different value of the optimal τ parameter than that used previously (0.654 here compared to 0.9). We then augment our model by introducing porosity dependence, finding that the optimal value of τ decreases for samples of higher porosity. In this new semi-empirical model one first identifies the porosity of the given medium and on that basis chooses an appropriate Lattice Boltzmann relaxation parameter. These two approaches result in permeability predictions much closer to the experimental permeability data, with the porosity-dependent case being the better of the two. 
Validation of this calibration method with independent samples of the same rock type yields permeability predictions that fall close to the experimental data, and again the porosity-dependent model provides better results. We thus conclude that our calibration model is a powerful tool for accurate prediction of complex porous media permeability.
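The Bayesian update of a Lattice Boltzmann relaxation parameter against permeability data can be sketched with a simple grid-based posterior. This is an illustrative toy, not the thesis code: the linear forward model, the measurement value and the noise level are all assumptions.

```python
import math

# Toy "forward model": predicted permeability (mD) as a function of the
# Lattice Boltzmann relaxation parameter tau. The linear form is a
# hypothetical stand-in for an actual LB simulation.
def predicted_permeability(tau):
    return 5000.0 * (tau - 0.5)

measured = 800.0   # hypothetical experimental permeability (mD)
sigma = 100.0      # assumed measurement noise (mD)

# Grid-based Bayesian update: uniform prior on tau in (0.51, 1.5),
# Gaussian likelihood centred on the measurement.
grid = [0.51 + i * (1.5 - 0.51) / 999 for i in range(1000)]
log_post = [-0.5 * ((predicted_permeability(t) - measured) / sigma) ** 2
            for t in grid]
m = max(log_post)
weights = [math.exp(lp - m) for lp in log_post]
total = sum(weights)
post = [w / total for w in weights]

# Posterior mean of the relaxation parameter
tau_mean = sum(t * p for t, p in zip(grid, post))
print(round(tau_mean, 3))
```

With these toy numbers the posterior concentrates around the τ that reproduces the measured permeability; a real calibration would replace the linear stand-in with LB simulations on the actual pore geometry.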
2

A Bayesian approach to financial model calibration, uncertainty measures and optimal hedging

Gupta, Alok January 2010 (has links)
In this thesis we address problems associated with financial modelling from a Bayesian point of view. Specifically, we look at the problems of calibrating financial models, measuring the model uncertainty of a claim and choosing an optimal hedging strategy. Throughout the study, the local volatility model is used as a working example to clarify the proposed methods. This thesis assumes a prior probability density for the unknown parameter of the model we try to calibrate. The prior probability density regularises the ill-posedness of the calibration problem. Further observations of market prices are used to update this prior, using Bayes' law, and give a posterior probability density for the unknown model parameter. The resulting Bayes estimators are shown to be consistent for finite-dimensional model parameters. The posterior density is then used to compute the Bayesian model average price. In tests on local volatility models it is shown that this price is closer than the prices of comparable calibration methods to the price given by the true model. The second part of the thesis focuses on quantifying model uncertainty. Building on the framework of market risk measures, we propose axioms for new classes of model uncertainty measures. As in the market risk case, we prove representation theorems for coherent and convex model uncertainty measures. Example measures from the latter class are provided using the Bayesian posterior. These are used to value the model uncertainty for a range of financial contracts priced in the local volatility model. In the final part of the thesis we propose a method for selecting the model, from a set of candidate models, that optimises the hedging of a specified financial contract. In particular, we choose the model whose corresponding price and hedge optimise some hedging performance indicator. 
The selection problem is solved using Bayesian loss functions to encapsulate the loss from using one model to price and hedge when the true model is a different model. Linkages are made with convex model uncertainty measures and traditional utility functions. Numerical experiments on a stochastic volatility model and the local volatility model show that the Bayesian strategy can outperform traditional strategies, especially for exotic options.
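The Bayesian model average price described above is simply a posterior-weighted mean over candidate model prices. A minimal sketch, with hypothetical posterior weights and prices:

```python
# Hypothetical posterior probabilities over three candidate volatility
# parameters (as would come from a Bayesian update on market prices),
# and the price of a claim under each candidate model.
posterior = [0.2, 0.5, 0.3]        # assumed posterior weights (sum to 1)
model_prices = [9.8, 10.4, 11.0]   # claim price under each parameter

# Bayesian model average price: expectation of the price under the posterior
bma_price = sum(w * p for w, p in zip(posterior, model_prices))
print(round(bma_price, 2))
```

In the thesis the weights come from a posterior over a functional (local volatility) parameter; the averaging step itself is exactly this weighted sum.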
3

Bayesian calibration of building energy models for energy retrofit decision-making under uncertainty

Heo, Yeonsook 10 November 2011 (has links)
Retrofitting of existing buildings is essential to reach reduction targets in energy consumption and greenhouse gas emission. In the current practice of a retrofit decision process, professionals perform energy audits, and construct dynamic simulation models to benchmark the performance of existing buildings and predict the effect of retrofit interventions. In order to enhance the reliability of simulation models, they typically calibrate simulation models based on monitored energy use data. The calibration techniques used for this purpose are manual and expert-driven. The current practice has major drawbacks: (1) the modeling and calibration methods do not scale to large portfolio of buildings due to their high costs and heavy reliance on expertise, and (2) the resulting deterministic models do not provide insight into underperforming risks associated with each retrofit intervention. This thesis has developed a new retrofit analysis framework that is suitable for large-scale analysis and risk-conscious decision-making. The framework is based on the use of normative models and Bayesian calibration techniques. Normative models are light-weight quasi-steady state energy models that can scale up to large sets of buildings, i.e. to city and regional scale. In addition, they do not require modeling expertise since they follow a set of modeling rules that produce a standard measure for energy performance. The normative models are calibrated under a Bayesian approach such that the resulting calibrated models quantify uncertainties in the energy outcomes of a building. Bayesian calibration models can also incorporate additional uncertainties associated with retrofit interventions to generate probability distributions of retrofit performance. 
Probabilistic outputs can be straightforwardly translated into a measure that quantifies underperforming risks of retrofit interventions and thus enable decision making relative to the decision-makers' rational objectives and risk attitude. This thesis demonstrates the feasibility of the new framework on retrofit applications by verifying the following two hypotheses: (1) normative models supported by Bayesian calibration have sufficient model fidelity to adequately support retrofit decisions, and (2) they can support risk-conscious decision-making by explicitly quantifying risks associated with retrofit options. The first and second hypotheses are examined through case studies that compare outcomes from the calibrated normative model with those from a similarly calibrated transient simulation model and compare decisions derived by the proposed framework with those derived by standard practices respectively. The new framework will enable cost-effective retrofit analysis at urban scale with explicit management of uncertainties.
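Translating probabilistic model outputs into an underperformance risk measure can be sketched as follows; the posterior-predictive samples and the savings target are synthetic assumptions, not results from the thesis.

```python
import random
import statistics

random.seed(0)

# Hypothetical posterior-predictive samples of annual energy savings (kWh)
# for one retrofit option, as produced by a calibrated probabilistic model.
savings = [random.gauss(12000, 3000) for _ in range(10000)]

target = 10000.0
# Underperformance risk: probability the retrofit misses the savings target.
risk = sum(s < target for s in savings) / len(savings)

# A risk-conscious decision-maker might also inspect a pessimistic quantile.
p10 = statistics.quantiles(savings, n=10)[0]   # 10th percentile of savings
print(round(risk, 3), round(p10))
```

Competing retrofit options can then be ranked by this risk (or by the pessimistic quantile) rather than by a single deterministic prediction.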
4

Toward a predictive model of tumor growth

Hawkins-Daarud, Andrea Jeanine 16 June 2011 (has links)
In this work, an attempt is made to lay out a framework in which models of tumor growth can be built, calibrated, validated, and differentiated in their level of goodness in such a manner that all the uncertainties associated with each step of the modeling process can be accounted for in the final model prediction. The study can be divided into four basic parts. The first involves the development of a general family of mathematical models of interacting species representing the various constituents of living tissue, which generalizes those previously available in the literature. In this theory, surface effects are introduced by incorporating in the Helmholtz free energy gradients of the volume fractions of the interacting species, thus providing a generalization of the Cahn-Hilliard theory of phase change in binary media and leading to fourth-order, coupled systems of nonlinear evolution equations. A subset of these governing equations is selected as the primary class of models of tumor growth considered in this work. The second component of this study focuses on the emerging and fundamentally important issue of predictive modeling: the study of model calibration, validation, and quantification of uncertainty in predictions of target outputs of models. The Bayesian framework suggested by Babuska, Nobile, and Tempone is employed to embed the calibration and validation processes within the framework of statistical inverse theory. Extensions of the theory, regarded as necessary for applying these methods to models of tumor growth in certain scenarios, are developed. The third part of the study focuses on the numerical approximation of the diffuse-interface models of tumor growth and on the numerical implementation of the statistical inverse methods at the core of the validation process. A class of mixed finite element models is developed for the considered mass-conservation models of tumor growth. 
A family of time marching schemes is developed and applied to representative problems of tumor evolution. Finally, in the fourth component of this investigation, a collection of synthetic examples, mostly in two dimensions, is considered to provide a proof-of-concept of the theory and methods developed in this work.
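The Cahn-Hilliard-type structure referred to above, in which gradients of the volume fractions enter the free energy and yield fourth-order evolution equations, takes the familiar form (written here for a single order parameter φ; M is a mobility, ε the interface parameter, and f a double-well bulk energy — a generic sketch, not the specific multi-species system of the thesis):

```latex
E(\varphi) = \int_\Omega \Big( f(\varphi) + \tfrac{\varepsilon^2}{2}\,|\nabla \varphi|^2 \Big)\, dx,
\qquad
\frac{\partial \varphi}{\partial t} = \nabla \cdot \big( M(\varphi)\, \nabla \mu \big),
\qquad
\mu = f'(\varphi) - \varepsilon^2 \Delta \varphi .
```

Substituting the chemical potential μ into the conservation law gives the fourth-order equation; the multi-species models of the thesis couple several such equations, one per constituent.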
5

Application of Bayesian Inference Techniques for Calibrating Eutrophication Models

Zhang, Weitao 26 February 2009 (has links)
This research aims to integrate mathematical water quality models with Bayesian inference techniques for obtaining effective model calibration and rigorous assessment of the uncertainty underlying model predictions. The first part of my work combines a Bayesian calibration framework with a complex biogeochemical model to reproduce oligo-, meso- and eutrophic lake conditions. The model accurately describes the observed patterns and also provides realistic estimates of predictive uncertainty for water quality variables. The Bayesian estimations are also used for appraising the exceedance frequency and confidence of compliance of different water quality criteria. The second part introduces a Bayesian hierarchical framework (BHF) for calibrating eutrophication models at multiple systems (or multiple sites of the same system). The models calibrated under the BHF provided accurate system representations for all the scenarios examined. The BHF overcomes the problem of insufficient local data by “borrowing strength” from well-studied sites. Both frameworks can facilitate environmental management decisions.
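The “borrowing strength” idea can be illustrated with the conjugate normal-normal shrinkage formula: each site's estimate is pulled toward the cross-site mean, most strongly where local data are sparse. All numbers are hypothetical, and a full hierarchical framework would also infer the cross-site mean and variance rather than fixing them.

```python
# Local estimates of a water-quality model parameter at three sites,
# with their sampling variances (the middle site is data-rich).
site_means = [2.0, 4.0, 5.0]
site_vars = [1.0, 0.04, 1.0]
tau2 = 0.25                              # assumed between-site variance
mu = sum(site_means) / len(site_means)   # cross-site mean (fixed here)

# Precision-weighted posterior mean per site: data-poor sites shrink
# toward mu, the data-rich site keeps close to its own estimate.
shrunk = [
    (m / v + mu / tau2) / (1 / v + 1 / tau2)
    for m, v in zip(site_means, site_vars)
]
print([round(s, 3) for s in shrunk])
```

The data-rich site barely moves while the data-poor sites are pulled noticeably toward the group mean, which is exactly the mechanism that lets poorly observed systems benefit from well-studied ones.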
6

Bayesian calibration of a grassland ecosystem model: tools and applications at the European scale

Ben Touhami, Haythem 07 March 2014 (has links)
Grasslands cover 45% of the agricultural area in France and 40% in Europe. Grassland ecosystems play a central role in the climate change context, not only because they are impacted by climate change but also because they contribute to greenhouse gas emissions. The aim of this thesis was to contribute to the assessment of uncertainties in the outputs of grassland simulation models, which are used in impact studies, with a focus on model parameterisation. In particular, we used Bayesian statistical methods, based on Bayes' theorem, to calibrate the parameters of a reference model and thus improve its performance by reducing the uncertainty in the parameters and, consequently, in the model outputs. Our approach is essentially based on the grassland ecosystem model PaSim (Pasture Simulation model), already applied in a variety of international projects to simulate the impact of climate change on grassland systems. The originality of this thesis was to adapt the Bayesian method to a complex ecosystem model such as PaSim (applied in the context of an altered climate and across the European territory) and to show its potential benefits in reducing uncertainty and improving the quality of model outputs. This was achieved by combining statistical methods (Bayesian techniques and sensitivity analysis with the method of Morris) and computing tools (coupling of R code with PaSim and use of cluster computing resources). We first produced a new parameterisation for grassland sites under drought conditions, and then a common parameterisation for European grasslands. We also provide a generic calibration tool that can be reused with other models and sites. Finally, we evaluated the performance of the calibrated model against data from validation sites. The results confirmed the efficiency of this technique for reducing uncertainty and improving the reliability of simulation outputs.
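The Morris screening step mentioned above ranks parameters by the mean absolute value of their elementary effects. A minimal sketch on a toy three-parameter model (not PaSim itself; the model form and sampling choices are illustrative assumptions):

```python
import random

random.seed(1)

# Toy model standing in for a grassland simulator: output as a function
# of three parameters. The functional form is hypothetical.
def model(x):
    return 4.0 * x[0] + 0.1 * x[1] + 2.0 * x[0] * x[2]

def morris_mu_star(f, dim, n_points=50, delta=0.1):
    """Mean absolute elementary effect per parameter (Morris screening)."""
    sums = [0.0] * dim
    for _ in range(n_points):
        x = [random.random() for _ in range(dim)]
        for i in range(dim):
            xp = list(x)
            xp[i] += delta            # perturb one parameter at a time
            sums[i] += abs(f(xp) - f(x)) / delta
    return [s / n_points for s in sums]

mu_star = morris_mu_star(model, 3)
print([round(m, 2) for m in mu_star])
```

Parameters with small μ* can be fixed at nominal values, so the subsequent Bayesian calibration only has to explore the influential ones.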
7

Assessment of carbon sequestration and timber production of Scots pine across Scotland using the process-based model 3-PGN

Xenakis, Georgios January 2007 (has links)
Forests are a valuable resource for humans, providing a range of products and services such as construction timber, paper and fuel wood, recreation, as well as living quarters for indigenous populations and habitats for many animal and bird species. Recent international political agreements such as the Kyoto Protocol emphasise the role of forests as a major sink for mitigating atmospheric carbon dioxide. However, forest areas are rapidly decreasing worldwide. Thus, it is vital that efficient strategies and tools are developed to encourage sustainable ecosystem management. These tools must be based on known ecological principles (such as tree physiological and soil nutrient cycle processes) and capable of supplying fast and accurate temporal and spatial predictions of the effects of management on both timber production and carbon sequestration. This thesis had two main objectives. The first was to investigate the environmental factors affecting growth and carbon sequestration of Scots pine (Pinus sylvestris L.) across Scotland, by developing a knowledge base through a statistical analysis of old and novel field datasets. Furthermore, the process-based ecosystem model 3-PGN was developed by coupling the existing models 3-PG and ICBM. 3-PGN was calibrated using a Bayesian approach based on Markov Chain Monte Carlo simulations and validated for plantation stands. Sensitivity and uncertainty analyses provided an understanding of the internal feedbacks of the model. Further simulations gave a detailed eco-physiological interpretation of the environmental factors affecting Scots pine growth and provided an assessment of carbon sequestration under the scenario of sustainable, normal production and of the environmental effects upon it. Finally, the study investigated the spatial and temporal patterns of timber production and carbon sequestration by using the spatial version of the model and applying advanced spatial analysis techniques. 
The second objective was to help close the gap between environmental research and forest management by setting a strategic framework for a process-based tool for sustainable ecosystem management. The thesis demonstrated procedures for a site classification scheme based on modelling results and for a yield table validation procedure, which can provide a way forward in supporting policies for forest management and in ensuring the continued existence of forests in the face of present and future challenges.
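Bayesian calibration by Markov Chain Monte Carlo, as used for 3-PGN, can be sketched with a random-walk Metropolis sampler on a toy growth model. This is not 3-PGN: the linear model, prior bounds, noise level and tuning constants are all assumptions made for illustration.

```python
import math
import random

random.seed(2)

# Toy growth model: biomass as a linear function of time with one
# unknown rate parameter alpha (a hypothetical stand-in for 3-PGN).
def biomass(alpha, t):
    return alpha * t

times = [1, 2, 3, 4, 5]
true_alpha, sigma = 2.0, 0.5
# Synthetic noisy observations generated from the "true" parameter.
obs = [biomass(true_alpha, t) + random.gauss(0, sigma) for t in times]

def log_post(alpha):
    if not (0.0 < alpha < 10.0):          # uniform prior bounds
        return -math.inf
    return -0.5 * sum((biomass(alpha, t) - y) ** 2
                      for t, y in zip(times, obs)) / sigma ** 2

# Random-walk Metropolis sampler
alpha, lp = 5.0, log_post(5.0)
samples = []
for _ in range(20000):
    prop = alpha + random.gauss(0, 0.2)   # symmetric proposal
    lp_prop = log_post(prop)
    if math.log(random.random()) < lp_prop - lp:
        alpha, lp = prop, lp_prop         # accept
    samples.append(alpha)

# Posterior mean after discarding burn-in
post_mean = sum(samples[5000:]) / len(samples[5000:])
print(round(post_mean, 2))
```

The retained samples approximate the posterior of the rate parameter; in the thesis the same machinery runs over the full 3-PGN parameter vector, with the sensitivity analysis deciding which parameters are worth sampling.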
9

Study of real-time control strategies for energy-efficient buildings

Robillart, Maxime 28 September 2015 (has links)
To reach the objectives of reducing the energy consumption of buildings and decreasing their impact on the electrical grid, it is necessary to elaborate real-time control strategies; this is a key challenge in view of smart grids and demand-response programs. In this context, this thesis aims at developing real-time control strategies for electric load shifting in energy-efficient buildings. First, these strategies require appropriate models for weather forecasting, occupant behaviour and building energy simulation. Then, in order to improve the reliability of building energy simulation and to ensure optimal control of facilities, a calibration process of the model based on on-site measurements is recommended. A new methodology was therefore developed, based on a screening technique and a Bayesian inference method (approximate Bayesian computation). Finally, two optimisation techniques were studied to develop real-time control strategies. The first was based on offline optimisation: the principle is to approximate optimisation results (more specifically, model predictive control results) and to extract simplified control laws. The second consisted in using model predictive control and, more precisely, in solving in real time a state- and input-constrained optimal control problem by interior penalty methods. An experimental passive house from the INCAS platform built by the French National Solar Energy Institute (INES) was used as a case study to evaluate the different strategies by numerical simulation.
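The approximate Bayesian computation step can be sketched as rejection ABC on a toy one-parameter building model: draw parameters from the prior, simulate, and keep draws whose output lands close to the observation. All values here are hypothetical.

```python
import random
import statistics

random.seed(3)

# Toy "building model": annual heating demand (kWh) as a function of an
# envelope U-value, with simulation noise. A hypothetical stand-in for a
# dynamic building energy model.
def simulate(u_value):
    return 5000.0 * u_value + random.gauss(0, 100)

observed = 1500.0   # hypothetical measured heating demand (kWh)
tol = 150.0         # acceptance tolerance on the summary distance

accepted = []
for _ in range(20000):
    u = random.uniform(0.1, 1.0)          # prior on the U-value
    if abs(simulate(u) - observed) < tol: # rejection step
        accepted.append(u)

print(len(accepted), round(statistics.mean(accepted), 2))
```

The accepted draws approximate the posterior on the U-value without ever evaluating a likelihood, which is the point of ABC when the simulator is the only model available.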
10

Statistical inverse problem in nonlinear high-speed train dynamics

Lebel, David 30 November 2018 (has links)
This work deals with the development of a health-state monitoring method for high-speed train suspensions using in-service measurements of the train's dynamical response by embedded acceleration sensors. A rolling train is a dynamical system excited by track-geometry irregularities. The suspension elements play a key role in ride safety and comfort. Because the train's dynamical response depends on the mechanical characteristics of the suspensions, information about the state of the suspensions can be inferred from acceleration measurements made by sensors embedded in the train. This information about the actual state of the suspensions would allow for more efficient train maintenance. Mathematically, the proposed monitoring solution consists in solving a statistical inverse problem. It is based on a train-dynamics computational model and takes into account model uncertainty and measurement errors. A Bayesian calibration approach is adopted to identify the probability distribution of the mechanical parameters of the suspension elements from joint measurements of the system input (the track-geometry irregularities) and output (the train's dynamical response). Classical Bayesian calibration implies computing the likelihood function from the stochastic model of the system output and experimental data. To cope with the fact that each run of the computational model is numerically expensive, and because of the functional nature of the system input and output, a novel Bayesian calibration method using a Gaussian-process surrogate model of the likelihood function is proposed. This thesis presents how such a random surrogate model can be used to estimate the probability distribution of the model parameters. The proposed method allows the new type of uncertainty induced by the use of a surrogate model to be taken into account, which is necessary to correctly assess the calibration accuracy. The novel Bayesian calibration method has been tested on the railway application and has achieved conclusive results; validation was carried out through numerical experiments. Finally, the long-term evolution of the suspension mechanical parameters has been studied using actual measurements of the train's dynamical response.
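A Gaussian-process surrogate of an expensive log-likelihood can be sketched as RBF-kernel interpolation on a few evaluated points. The kernel, its length-scale, and the toy target are illustrative assumptions; a full treatment would also propagate the surrogate's predictive variance, which is the extra uncertainty the thesis accounts for.

```python
import math

# Squared-exponential (RBF) kernel; length-scale is an assumed hyperparameter.
def k(a, b, ell=0.5):
    return math.exp(-0.5 * ((a - b) / ell) ** 2)

# Training points: parameter values where the "expensive" log-likelihood
# was actually evaluated (here a cheap toy target peaked at x = 1.2).
X = [0.0, 0.5, 1.0, 1.5, 2.0]
def true_loglik(x):
    return -(x - 1.2) ** 2
y = [true_loglik(x) for x in X]

# Solve (K + nugget*I) w = y with Gauss-Jordan elimination (tiny SPD system).
nugget = 1e-8
n = len(X)
K = [[k(a, b) + (nugget if i == j else 0.0) for j, b in enumerate(X)]
     for i, a in enumerate(X)]
A = [row[:] + [y[i]] for i, row in enumerate(K)]   # augmented matrix
for c in range(n):
    piv = A[c][c]
    for j in range(c, n + 1):
        A[c][j] /= piv
    for r in range(n):
        if r != c:
            f = A[r][c]
            for j in range(c, n + 1):
                A[r][j] -= f * A[c][j]
w = [A[i][n] for i in range(n)]

# GP posterior mean: cheap stand-in for the expensive log-likelihood.
def surrogate(x):
    return sum(wi * k(x, xi) for wi, xi in zip(w, X))

print(round(surrogate(1.2), 3))
```

Once fitted, the surrogate can be queried thousands of times inside an MCMC sampler at negligible cost, with the expensive simulator called only to add training points.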
