81

Linear Parameter Uncertainty Quantification using Surrogate Gaussian Processes

Macatula, Romcholo Yulo 21 July 2020 (has links)
We consider uncertainty quantification using surrogate Gaussian processes. We take a previous sampling algorithm and provide a closed-form expression of the resulting posterior distribution. We extend the method to weighted least squares and to a Bayesian approach, both with closed-form expressions of the resulting posterior distributions. We test the methods on 1D deconvolution and 2D tomography problems. Our new methods improve on the previous algorithm, but fall short of a typical Bayesian inference method in some respects. / Master of Science / Parameter uncertainty quantification seeks to determine both estimates of model parameters and the uncertainty in those estimates. Examples of model parameters include physical properties such as density, growth rates, or even deblurred images. Previous work has shown that replacing data with a surrogate model can provide promising estimates with low uncertainty. We extend the previous methods to the specific setting of linear models. Theoretical results are tested on simulated computed tomography problems.
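For orientation, the closed-form posteriors referenced above are of the standard conjugate-Gaussian type. Below is a minimal sketch of that textbook result for a generic linear forward model, not the thesis's surrogate-based algorithm itself; the 1D deconvolution setup (operator `A`, noise level, prior) is purely illustrative.

```python
import numpy as np

def gaussian_linear_posterior(A, y, m0, C0, noise_var):
    """Closed-form posterior for y = A x + e with e ~ N(0, noise_var * I)
    and Gaussian prior x ~ N(m0, C0): the standard conjugate result."""
    C0_inv = np.linalg.inv(C0)
    precision = C0_inv + A.T @ A / noise_var               # posterior precision
    C_post = np.linalg.inv(precision)                      # posterior covariance
    m_post = C_post @ (C0_inv @ m0 + A.T @ y / noise_var)  # posterior mean
    return m_post, C_post

# Illustrative 1D deconvolution: a crude smoothing operator and noisy data.
rng = np.random.default_rng(0)
n = 50
A = np.tril(np.ones((n, n))) / n
x_true = np.sin(np.linspace(0.0, np.pi, n))
y = A @ x_true + 0.01 * rng.standard_normal(n)
m_post, C_post = gaussian_linear_posterior(A, y, np.zeros(n), np.eye(n), 0.01**2)
```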
82

Subgrid models for heat transfer in multiphase flows with immersed geometry

Lane, William 21 June 2016 (has links)
Multiphase flows are ubiquitous across engineering disciplines: water-sediment river flows in civil engineering; oil-water-sand transportation flows in petroleum engineering; and sorbent-flue gas reactor flows in chemical engineering. These multiphase flows can include a combination of momentum, heat, and mass transfer. Studying and understanding the behavior of multiphase, multiphysics flow configurations can be crucial for safe and efficient engineering design. In this work, a framework for the development and verification, validation, and uncertainty quantification (VVUQ) of subgrid models for heat transfer in multiphase flows is presented. The framework is developed for a carbon capture reactor; however, the concepts and methods described in this dissertation can be generalized and applied broadly to multiphase/multiphysics problems. When combined with VVUQ methods, these tools can provide accurate results at many length scales, enabling large upscaling problems to be simulated accurately and with calculable errors. The system of interest is a post-combustion solid-sorbent carbon capture reactor featuring a solid-sorbent bed that is fluidized with post-combustion flue gas. As the flue gas passes through the bed, the carbon dioxide is exothermically adsorbed onto the sorbent particles' surfaces, and the clean gas is passed on to further processes. To prevent overheating and degradation of the sorbent material, cooling cylinders are immersed in the flow to regulate temperatures. Simulating a full-scale gas-particle reactor using traditional methods is computationally intractable due to the long time scale and the variation in length scales: reactor, O(10 m); cylinders, O(1 cm); and sorbent particles, O(100 µm). This research developed an efficient subgrid method for simulating such a system. A constitutive model was derived to predict the effective suspension-cylinder Nusselt number based on the local flow and material properties and the cylinder geometry, analogous to single-phase Nusselt number correlations. This model was implemented in an open-source computational fluid dynamics code, MFIX, and has undergone VVUQ. Verification and validation showed close agreement with comparable highly resolved simulations, while running up to 10,000 times faster. Our model is currently being used to simulate a 1 MW solid-sorbent carbon capture unit and is outperforming previous methods in both speed and physical accuracy. / 2017-06-21T00:00:00Z
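To illustrate how a Nusselt-number constitutive model closes the heat transfer term in a coarse simulation, here is a minimal sketch. The correlation form and coefficients below are classic single-phase crossflow (Hilpert-type) values standing in for the analogy the abstract draws, not the thesis's fitted suspension-cylinder model; all numbers in the usage example are made up.

```python
import numpy as np

def cylinder_nusselt(re, pr, C=0.683, m=0.466):
    """Single-phase crossflow correlation, Nu = C * Re^m * Pr^(1/3)
    (Hilpert form; coefficients for a mid-range Reynolds number). A
    subgrid suspension-cylinder model plays the same role, with
    coefficients fitted to resolved multiphase simulations."""
    return C * re**m * pr ** (1.0 / 3.0)

def wall_heat_flux(re, pr, k_gas, d_cyl, t_susp, t_wall):
    """Convert the Nusselt number into a per-area heat flux [W/m^2]."""
    h = cylinder_nusselt(re, pr) * k_gas / d_cyl  # heat transfer coefficient
    return h * (t_susp - t_wall)

# Hypothetical example: flue gas past a 1 cm cooling cylinder.
q = wall_heat_flux(re=500.0, pr=0.7, k_gas=0.03, d_cyl=0.01,
                   t_susp=400.0, t_wall=320.0)
```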
83

Uncertainty Quantification of Photothermal Radiometry Measurements Using Monte Carlo Simulation and Experimental Repeatability

Fleming, Austin 01 May 2014 (has links)
Photothermal radiometry is a common technique for measuring the thermal properties of layered materials. It uses a modulated laser to heat a sample, and the sample's thermal response can be used to determine the thermal properties of its layers. The motivation for this work is to provide a better understanding of the accuracy and repeatability of the photothermal radiometry measurement technique. Through this work, the sensitivity of results to input uncertainties is determined. Additionally, numerical simulations are used to determine the overall uncertainty of a theoretical measurement. The repeatability of photothermal radiometry measurements is tested using a proton-irradiated zirconium carbide sample. Due to the proton irradiation, this sample contains two layers with a thermal resistance between them. This sample has been independently measured by three different researchers in three different countries, and the results are compared to determine the repeatability of photothermal radiometry measurements. Finally, based on the sensitivity and uncertainty analyses, experimental procedures and suggestions are provided to reduce the uncertainty in experimentally measured results.
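As a rough illustration of the Monte Carlo side of such an analysis, the sketch below perturbs model inputs within assumed uncertainties and reads the output spread off the resulting samples. The `thermal_resistance` stand-in is hypothetical; it is not the layered photothermal model used in the thesis.

```python
import numpy as np

def propagate_mc(model, nominal, rel_sigma, n_samples=10_000, seed=0):
    """Monte Carlo uncertainty propagation: sample inputs around their
    nominal values and collect the distribution of model outputs."""
    rng = np.random.default_rng(seed)
    nominal = np.asarray(nominal, dtype=float)
    outs = np.empty(n_samples)
    for i in range(n_samples):
        sample = nominal * (1.0 + rel_sigma * rng.standard_normal(nominal.size))
        outs[i] = model(sample)
    return outs.mean(), outs.std()

# Hypothetical stand-in for a fitted quantity: thermal resistance of a
# layer from (conductivity, thickness), each with 5% input uncertainty.
thermal_resistance = lambda p: p[1] / p[0]
mean, std = propagate_mc(thermal_resistance, nominal=[20.0, 1e-4], rel_sigma=0.05)
```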
84

Advanced Sampling Methods for Solving Large-Scale Inverse Problems

Attia, Ahmed Mohamed Mohamed 19 September 2016 (has links)
Ensemble and variational techniques have gained wide popularity as the two main approaches for solving data assimilation and inverse problems. The majority of the methods in these two approaches are derived (at least implicitly) under the assumption that the underlying probability distributions are Gaussian. It is well accepted, however, that the Gaussianity assumption is too restrictive when applied to large nonlinear models, nonlinear observation operators, and large levels of uncertainty. This work develops a family of fully non-Gaussian data assimilation algorithms that work by directly sampling the posterior distribution. The sampling strategy is based on a Hybrid/Hamiltonian Monte Carlo (HMC) approach that can handle non-normal probability distributions. The first algorithm proposed in this work is the "HMC sampling filter", an ensemble-based data assimilation algorithm for solving the sequential filtering problem. Unlike traditional ensemble-based filters, such as the ensemble Kalman filter and the maximum likelihood ensemble filter, the proposed sampling filter naturally accommodates non-Gaussian errors and nonlinear model dynamics, as well as nonlinear observations. To test the capabilities of the HMC sampling filter, numerical experiments are carried out using the Lorenz-96 model and observation operators with different levels of nonlinearity and differentiability. The filter is also tested with a shallow water model on the sphere with a linear observation operator. Numerical results show that the sampling filter performs well even in highly nonlinear situations where the traditional filters diverge. Next, the HMC sampling approach is extended to the four-dimensional case, where several observations are assimilated simultaneously, resulting in the second member of the proposed family of algorithms. The new algorithm, named the "HMC sampling smoother", is an ensemble-based smoother for four-dimensional data assimilation that works by sampling from the posterior probability density of the solution at the initial time. The sampling smoother naturally accommodates non-Gaussian errors and nonlinear model dynamics and observation operators, and provides a full description of the posterior distribution. Numerical experiments for this algorithm are carried out using a shallow water model on the sphere with observation operators of different levels of nonlinearity. The numerical results demonstrate the advantages of the proposed method compared to traditional variational and ensemble-based smoothing methods. The HMC sampling smoother, in its original formulation, is computationally expensive due to the innate requirement of running the forward and adjoint models repeatedly. This work therefore develops computationally efficient versions of the HMC sampling smoother based on reduced-order approximations of the underlying model dynamics. The reduced-order HMC sampling smoothers, developed as extensions of the original HMC smoother, are tested numerically using the shallow-water equations model in Cartesian coordinates. The results reveal that the reduced-order versions of the smoother are capable of accurately capturing the posterior probability density, while being significantly faster than the original full-order formulation. In the presence of nonlinear model dynamics, a nonlinear observation operator, or non-Gaussian errors, the prior distribution in the sequential data assimilation framework is not analytically tractable. In the original formulation of the HMC sampling filter, the prior distribution is approximated by a Gaussian distribution whose parameters are inferred from the ensemble of forecasts. Here, the Gaussian prior assumption of the original HMC filter is relaxed. Specifically, a clustering step is introduced after the forecast phase of the filter, and the prior density function is estimated by fitting a Gaussian mixture model (GMM) to the prior ensemble. The filter developed following this strategy is named the cluster HMC sampling filter (ClHMC). A multi-chain version of the ClHMC filter, namely MC-ClHMC, is also proposed to guarantee that samples are taken from the vicinities of all probability modes of the formulated posterior. These methodologies are tested using a quasi-geostrophic (QG) model with double-gyre wind forcing and bi-harmonic friction. Numerical results demonstrate the usefulness of GMMs in relaxing the Gaussian prior assumption in the HMC filtering paradigm. To provide a unified platform for data assimilation research, a flexible and highly extensible testing suite named DATeS is developed and described in this work. The core of DATeS is implemented in Python to leverage object-oriented capabilities. The main components, such as the models, the data assimilation algorithms, the linear algebra solvers, and the time discretization routines, are independent of each other, so as to offer maximum flexibility in configuring data assimilation studies. / Ph. D.
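At the core of all of these algorithms is the HMC transition: simulate Hamiltonian dynamics with a leapfrog integrator, then accept or reject with a Metropolis test. The sketch below is the generic textbook step, not the DATeS implementation; the standard-Gaussian `log_post` is a placeholder for the data assimilation posterior.

```python
import numpy as np

def hmc_step(x, log_post, grad_log_post, eps=0.1, n_leap=20, rng=None):
    """One Hamiltonian Monte Carlo transition (leapfrog + Metropolis)."""
    rng = rng or np.random.default_rng()
    p = rng.standard_normal(x.size)                 # resample momentum
    x_new, p_new = x.copy(), p.copy()
    p_new += 0.5 * eps * grad_log_post(x_new)       # initial half step
    for _ in range(n_leap - 1):
        x_new += eps * p_new                        # full position step
        p_new += eps * grad_log_post(x_new)         # full momentum step
    x_new += eps * p_new
    p_new += 0.5 * eps * grad_log_post(x_new)       # final half step
    # Hamiltonian = -log posterior + kinetic energy; Metropolis test.
    h_old = -log_post(x) + 0.5 * p @ p
    h_new = -log_post(x_new) + 0.5 * p_new @ p_new
    return x_new if np.log(rng.random()) < h_old - h_new else x

# Placeholder posterior: a standard Gaussian in 3 dimensions.
log_post = lambda x: -0.5 * x @ x
grad_log_post = lambda x: -x
x = np.zeros(3)
for _ in range(100):
    x = hmc_step(x, log_post, grad_log_post)
```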
85

Statistical analysis of river discharge change in the Indochinese Peninsula using large ensemble future climate projections

Hanittinan, Patinya 25 September 2017 (has links)
Kyoto University / 0048 / New-system doctoral program / Doctor of Engineering / Kou No. 20677 / Eng. Doc. No. 4374 / Shinsei||Ko||1680 (University Library) / Department of Civil and Earth Resources Engineering, Graduate School of Engineering, Kyoto University / (Chief examiner) Professor Yasuto Tachikawa, Professor Eiichi Nakakita, Associate Professor Nobuhito Mori / Fulfills Article 4, Paragraph 1 of the Degree Regulations / Doctor of Philosophy (Engineering) / Kyoto University / DFAM
86

Effective Field Theory Truncation Errors and Why They Matter

Melendez, Jordan Andrew 09 July 2020 (has links)
No description available.
87

Evidence-Based Uncertainty Modeling of Constitutive Models with Application in Design Optimization

Salehghaffari, Shahabedin 12 May 2012 (has links)
Phenomenological material models such as Johnson-Cook plasticity are often used in finite element simulations of large-deformation processes at different strain rates and temperatures. Since the material constants that appear in such models depend on the material, the experimental data, the fitting method, and the mathematical representation of strain rate and temperature effects, the predicted material behavior is subject to uncertainty. In this dissertation, evidence theory is used to model the uncertainty in the material constants, which is represented by separate belief structures that are combined into a joint belief structure and propagated through impact loading simulations of structures. Yager's rule is used for combining evidence obtained from more than one source. Uncertainty is quantified using belief, plausibility, and plausibility-decision functions. An evidence-based design optimization (EBDO) approach is presented in which the nondeterministic response functions are expressed using evidential reasoning. The EBDO approach accommodates field material uncertainty in addition to the uncertainty embedded in the material constants. This approach is applied to EBDO of an externally stiffened circular tube under axial impact load, with and without consideration of field uncertainty caused by spatial variation of material properties due to manufacturing effects. Surrogate models are developed to approximate structural response functions and propagate uncertainty. The EBDO example problem is solved using genetic algorithms. The uncertainty modeling and EBDO results are presented and discussed.
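For readers unfamiliar with evidence theory, the sketch below shows Yager's rule on a small finite frame: masses from two sources are multiplied over intersecting focal elements, and conflicting mass is assigned to the whole frame rather than renormalized away (the point on which Yager's rule differs from Dempster's). The frame and mass values are made-up toy numbers, not data from the dissertation.

```python
from itertools import product

def yager_combine(m1, m2, frame):
    """Yager's rule of combination for two basic probability assignments,
    given as dicts mapping frozenset focal elements to mass."""
    frame = frozenset(frame)
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # mass on empty intersections
    combined[frame] = combined.get(frame, 0.0) + conflict  # park conflict on the frame
    return combined

def belief(m, hypothesis):
    """Belief: total mass of focal elements contained in the hypothesis."""
    return sum(w for a, w in m.items() if a <= hypothesis)

# Toy example: two evidence sources about a discretized material constant.
lo, mid, hi = frozenset({"lo"}), frozenset({"mid"}), frozenset({"hi"})
src1 = {lo | mid: 0.6, hi: 0.4}
src2 = {mid: 0.7, lo | mid | hi: 0.3}
m = yager_combine(src1, src2, {"lo", "mid", "hi"})
bel_mid = belief(m, mid)  # evidence committed specifically to "mid"
```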
88

Bayesian Additive Regression Trees: Sensitivity Analysis and Multiobjective Optimization

Horiguchi, Akira January 2020 (has links)
No description available.
89

Simulation and Calibration of Uncertain Space Fractional Diffusion Equations

Alzahrani, Hasnaa H. 10 January 2023 (has links)
Fractional diffusion equations have played an increasingly important role in explaining long-range interactions, nonlocal dynamics, and anomalous diffusion, providing effective means of describing the memory and hereditary properties of such processes. This dissertation explores uncertainty propagation in space fractional diffusion equations in one and multiple dimensions with variable diffusivity and order parameters. This is achieved by: (i) deploying accurate numerical schemes for the forward problem, and (ii) employing uncertainty quantification tools that accelerate the inverse problem. We begin by focusing on parameter calibration of a variable-diffusivity fractional diffusion model. A random, spatially-varying diffusivity field is considered together with an uncertain but spatially homogeneous fractional operator order. Polynomial chaos (PC) techniques are used to express the dependence of the stochastic solution on these random variables. A non-intrusive methodology is used, and a deterministic finite-difference solver of the fractional diffusion model is utilized for this purpose. The surrogates are first used to assess the sensitivity of quantities of interest (QoIs) to uncertain inputs and to examine their statistics. In particular, the analysis indicates that the fractional order has a dominant effect on the variance of the QoIs considered. The PC surrogates are further exploited to calibrate the uncertain parameters using a Bayesian methodology. In the broad range of parameters addressed, the analysis shows that the uncertain parameters having a significant impact on the variance of the solution can be reliably inferred, even from limited observations. Next, we address the numerical challenges that arise when multidimensional space-fractional diffusion equations have spatially varying diffusivity and fractional order. Significant computational challenges arise due to the kernel singularity in the fractional integral operator as well as the resulting dense discretized operators. Hence, we present a singularity-aware discretization scheme that regularizes the singular integrals through a singularity subtraction technique adapted to the spatial variability of diffusivity and fractional order. This regularization strategy is conveniently formulated as a sparse matrix correction that is added to the dense operator, and is applicable to different formulations of fractional diffusion equations. Numerical results show that the singularity treatment is robust, substantially reduces discretization errors, and attains the first-order convergence rate allowed by the regularity of the solutions. In the last part, we explore the application of a Bayesian formalism to detect an anomaly in a fractional medium. Specifically, a computational method is presented for inferring the location and properties of an inclusion inside a two-dimensional domain. The anomaly is assumed to have known shape, but unknown diffusivity and fractional order parameters, and is assumed to be embedded in a fractional medium of known fractional properties. To detect the presence of the anomaly, the medium is forced using a collection of localized sources, and its response is measured at the source locations. To this end, the singularity-aware finite-difference scheme is applied. A non-intrusive regression approach is used to explore the dependence of the computed signals on the properties of the anomaly, and the resulting surrogates are first exploited to characterize the variability of the response, and then used to accelerate the Bayesian inference of the anomaly. In the regime of parameters considered, the computational results indicate that robust estimates of the location and fractional properties of the anomaly can be obtained, and that these estimates become sharper when high contrast ratios prevail between the anomaly and the surrounding matrix.
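The non-intrusive regression step described above can be sketched in one dimension: evaluate a black-box solver at sampled values of the uncertain input and regress the outputs onto an orthogonal polynomial basis (Legendre, for a uniform input). The `solver` below is a smooth placeholder, not the fractional diffusion code.

```python
import numpy as np
from numpy.polynomial import legendre

def fit_pc_surrogate(solver, n_samples=200, degree=6, seed=0):
    """Non-intrusive polynomial chaos in 1D: least-squares regression of
    solver outputs onto a Legendre basis over a uniform input on [-1, 1]."""
    rng = np.random.default_rng(seed)
    xi = rng.uniform(-1.0, 1.0, n_samples)   # samples of the uncertain input
    q = np.array([solver(x) for x in xi])    # forward model evaluations
    V = legendre.legvander(xi, degree)       # basis matrix P_0..P_degree
    coeffs, *_ = np.linalg.lstsq(V, q, rcond=None)
    return coeffs, lambda x: legendre.legval(x, coeffs)

# Placeholder quantity of interest: a smooth nonlinear map from the
# uncertain parameter to the observed signal.
solver = lambda x: np.exp(0.5 * x) * np.cos(2.0 * x)
coeffs, surrogate = fit_pc_surrogate(solver)
# coeffs[0] approximates the mean of the QoI under the uniform measure.
```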
90

Dry Static Friction in Metals: Experiments and Micro-Asperity Based Modeling

Sista, Sri Narasimha Bhargava January 2014 (has links)
No description available.
