  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
81

Bayesian Additive Regression Trees: Sensitivity Analysis and Multiobjective Optimization

Horiguchi, Akira January 2020 (has links)
No description available.
82

Simulation and Calibration of Uncertain Space Fractional Diffusion Equations

Alzahrani, Hasnaa H. 10 January 2023 (has links)
Fractional diffusion equations have played an increasingly important role in explaining long-range interactions, nonlocal dynamics and anomalous diffusion, providing effective means of describing the memory and hereditary properties of such processes. This dissertation explores the uncertainty propagation in space fractional diffusion equations in one and multiple dimensions with variable diffusivity and order parameters. This is achieved by: (i) deploying accurate numerical schemes of the forward problem, and (ii) employing uncertainty quantification tools that accelerate the inverse problem. We begin by focusing on parameter calibration of a variable-diffusivity fractional diffusion model. A random, spatially-varying diffusivity field is considered together with an uncertain but spatially homogeneous fractional operator order. Polynomial chaos (PC) techniques are used to express the dependence of the stochastic solution on these random variables. A non-intrusive methodology is used, and a deterministic finite-difference solver of the fractional diffusion model is utilized for this purpose. The surrogates are first used to assess the sensitivity of quantities of interest (QoIs) to uncertain inputs and to examine their statistics. In particular, the analysis indicates that the fractional order has a dominant effect on the variance of the QoIs considered. The PC surrogates are further exploited to calibrate the uncertain parameters using a Bayesian methodology. In the broad range of parameters addressed, the analysis shows that the uncertain parameters having a significant impact on the variance of the solution can be reliably inferred, even from limited observations. Next, we address the numerical challenges posed by multidimensional space-fractional diffusion equations with spatially varying diffusivity and fractional order. Significant computational challenges arise due to the kernel singularity in the fractional integral operator as well as the resulting dense discretized operators. Hence, we present a singularity-aware discretization scheme that regularizes the singular integrals through a singularity subtraction technique adapted to the spatial variability of diffusivity and fractional order. This regularization strategy is conveniently formulated as a sparse matrix correction that is added to the dense operator, and is applicable to different formulations of fractional diffusion equations. Numerical results show that the singularity treatment is robust, substantially reduces discretization errors, and attains the first-order convergence rate allowed by the regularity of the solutions. In the last part, we explore the application of a Bayesian formalism to detect an anomaly in a fractional medium. Specifically, a computational method is presented for inferring the location and properties of an inclusion inside a two-dimensional domain. The anomaly is assumed to have known shape, but unknown diffusivity and fractional order parameters, and is assumed to be embedded in a fractional medium of known fractional properties. To detect the presence of the anomaly, the medium is forced using a collection of localized sources, and its response is measured at the source locations. To this end, the singularity-aware finite-difference scheme is applied.
A non-intrusive regression approach is used to explore the dependence of the computed signals on the properties of the anomaly, and the resulting surrogates are first exploited to characterize the variability of the response, and then used to accelerate the Bayesian inference of the anomaly. In the regime of parameters considered, the computational results indicate that robust estimates of the location and fractional properties of the anomaly can be obtained, and that these estimates become sharper when high contrast ratios prevail between the anomaly and the surrounding matrix.
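As a hedged illustration of the non-intrusive PC-plus-Bayesian-calibration workflow described in this abstract (not the dissertation's solver or data): the sketch below replaces the fractional-diffusion solver with a toy exponential forward model in two uncertain inputs, builds a total-degree-3 Legendre chaos surrogate by least-squares regression, and then calibrates the inputs with a random-walk Metropolis sampler. All model forms, parameter ranges, and noise levels are invented for illustration.

```python
import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(0)

# --- Toy stand-in for the fractional-diffusion forward solver ---------------
# Returns the "solution" at three sensor locations for a diffusivity scale d
# and fractional order alpha.  Purely illustrative; the dissertation uses a
# finite-difference solver of the space-fractional diffusion equation.
x_sens = np.array([0.25, 0.5, 0.75])
def forward(d, alpha):
    return np.exp(-d * x_sens**alpha)

# --- Non-intrusive PC surrogate: total-degree-3 Legendre basis in 2 inputs ---
# Uncertain inputs: d ~ U(0.5, 1.5), alpha ~ U(1.2, 1.9), mapped to [-1, 1].
lo = np.array([0.5, 1.2]); hi = np.array([1.5, 1.9])
def to_ref(theta):                       # physical -> [-1, 1]^2
    return 2.0 * (theta - lo) / (hi - lo) - 1.0

deg = 3
pairs = [(i, j) for i in range(deg + 1) for j in range(deg + 1) if i + j <= deg]
def basis(xi):                           # evaluate tensorized Legendre terms
    cols = []
    for i, j in pairs:
        ci = np.zeros(i + 1); ci[i] = 1.0
        cj = np.zeros(j + 1); cj[j] = 1.0
        cols.append(legendre.legval(xi[:, 0], ci) * legendre.legval(xi[:, 1], cj))
    return np.column_stack(cols)

# Training design: random input samples, one "solver" run per sample.
theta_train = rng.uniform(lo, hi, size=(80, 2))
Y_train = np.array([forward(*t) for t in theta_train])
A = basis(to_ref(theta_train))
coeffs, *_ = np.linalg.lstsq(A, Y_train, rcond=None)        # PC coefficients

def surrogate(theta):                    # fast replacement of the solver
    return basis(to_ref(np.atleast_2d(theta))) @ coeffs

# --- Bayesian calibration using the surrogate (random-walk Metropolis) ------
theta_true = np.array([1.1, 1.6])
data = forward(*theta_true) + rng.normal(0, 0.01, size=3)    # synthetic obs
def log_post(theta):
    if np.any(theta < lo) or np.any(theta > hi):
        return -np.inf                                       # uniform prior
    resid = data - surrogate(theta).ravel()
    return -0.5 * np.sum(resid**2) / 0.01**2

chain, cur = [], np.array([1.0, 1.5])
lp = log_post(cur)
for _ in range(20000):
    prop = cur + rng.normal(0, [0.05, 0.03])
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        cur, lp = prop, lp_prop
    chain.append(cur.copy())
post = np.array(chain)[5000:]
print("posterior mean:", post.mean(axis=0), " true:", theta_true)
```

Because every posterior evaluation hits the cheap surrogate instead of the solver, the sampler can afford tens of thousands of iterations after only a handful of forward runs, which is the acceleration the abstract refers to.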
83

Robust Design Framework for Automating Multi-component DNA Origami Structures with Experimental and MD coarse-grained Model Validation

Huang, Chao-Min January 2020 (has links)
No description available.
84

Development and Use of a Spatially Accurate Polynomial Chaos Method for Aerospace Applications

Schaefer, John Anthony 24 January 2023 (has links)
Uncertainty is prevalent throughout the design, analysis, and optimization of aerospace products. When scientific computing is used to support these tasks, sources of uncertainty may include the freestream flight conditions of a vehicle, physical modeling parameters, geometric fidelity, numerical error, and model-form uncertainty, among others. Moreover, while some uncertainties may be treated as probabilistic, aleatory sources, other uncertainties are non-probabilistic and epistemic due to a lack of knowledge, and cannot be rigorously treated using classical statistics or Bayesian approaches. An additional complication for propagating uncertainty is that many aerospace scientific computing tools may be computationally expensive; for example, a single high-fidelity computational fluid dynamics solution may require several days or even weeks to complete. It is therefore necessary to employ uncertainty propagation strategies that require as few solutions as possible. The Non-Intrusive Polynomial Chaos (NIPC) method has grown in popularity in recent decades due to its ability to propagate both aleatory and epistemic parametric sources of uncertainty in a computationally efficient manner. While traditional Monte Carlo methods might require thousands to millions of function evaluations to achieve statistical convergence, NIPC typically requires tens to hundreds for problems with similar numbers of uncertain dimensions. Despite this efficiency, NIPC is limited in one important aspect: it can only propagate uncertainty at a particular point in a design space or flight envelope. For optimization or aerodynamic database problems that require uncertainty estimates at many points, the use of NIPC quickly becomes computationally intractable. This dissertation introduces a new method entitled Spatially Accurate Polynomial Chaos (SAPC) that extends the original NIPC approach for the spatial regression of aleatory and epistemic parametric sources of uncertainty. Throughout the dissertation, the SAPC method is applied to various aerospace problems of interest. These include the regression of aerodynamic force and moment uncertainties throughout the flight envelope of a commercial aircraft, the design under uncertainty of a two-stream propulsive mixer device, and the robust design of a low-boom supersonic demonstrator aircraft. Collectively the results suggest that SAPC may be useful for a large variety of engineering applications. / Doctor of Philosophy / Uncertainty is prevalent throughout the design, analysis, and optimization of aerospace products. When scientific computer simulations are used to support these tasks, sources of uncertainty may include the speed of an aerospace vehicle, the direction of the wind, physical modeling constants or assumptions, and the vehicle shape, among others. As a result of these sources of uncertainty, assessments of vehicle performance are also uncertain. For example, if the speed of a vehicle is not known precisely, then computer simulations will predict a lift force which is also imprecisely known. A challenge when assessing the uncertainty in aerospace vehicle performance is that the computer simulations which predict performance may take a long time to run, even on state-of-the-art supercomputers. Traditional statistical methods may require thousands or millions of simulations for the prediction of uncertainty, which does not fit within the computational budget of most aerospace analyses.
A newer method called Non-Intrusive Polynomial Chaos (NIPC) is more efficient, typically requiring only tens to hundreds of simulations; however, NIPC only provides uncertainty estimates at a single point in an aircraft flight envelope or design condition. In this dissertation, a new method called Spatially Accurate Polynomial Chaos (SAPC) is developed. The SAPC method combines desirable features of NIPC with regression methods for an efficient estimation of uncertainty throughout a vehicle flight envelope or design space. Throughout the dissertation, the SAPC method is applied to various aerospace problems of interest. These include the regression of aerodynamic force and moment uncertainties throughout the flight envelope of a commercial aircraft, the design under uncertainty of a two-stream propulsive mixer device, and the robust design of a low-boom supersonic demonstrator aircraft. Collectively the results suggest that SAPC may be useful for a large variety of engineering applications.
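The following sketch is a conceptual illustration of the general idea of regressing polynomial-chaos information across a flight-envelope variable, not the SAPC algorithm from the dissertation: ordinary NIPC is performed at a few anchor Mach numbers, and each chaos coefficient is then regressed over Mach so that mean and variance of the quantity of interest are available anywhere in the envelope. The toy lift model, anchor conditions, and polynomial degrees are all invented.

```python
import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(1)

# Toy aerodynamic QoI: lift coefficient as a function of Mach number M and an
# uncertain modeling parameter xi ~ U(-1, 1).  Stand-in for a CFD solver.
def lift(M, xi):
    return 0.4 + 0.3 * M + 0.05 * (1 + M) * xi + 0.02 * xi**2

# Step 1: ordinary NIPC (degree-2 Legendre in xi) at a few anchor Mach numbers.
deg, n_train = 2, 30
mach_anchors = np.array([0.3, 0.5, 0.7, 0.85])
coef_table = []                                   # PC coefficients per anchor
for M in mach_anchors:
    xi = rng.uniform(-1, 1, n_train)
    A = np.column_stack([legendre.legval(xi, np.eye(deg + 1)[k]) for k in range(deg + 1)])
    c, *_ = np.linalg.lstsq(A, lift(M, xi), rcond=None)
    coef_table.append(c)
coef_table = np.array(coef_table)

# Step 2: regress each PC coefficient over Mach (quadratic fit), so the
# expansion -- and hence mean/variance of the QoI -- is available anywhere in
# the envelope, not just at the anchor points.
fits = [np.polyfit(mach_anchors, coef_table[:, k], 2) for k in range(deg + 1)]

def mean_and_var(M):
    c = np.array([np.polyval(f, M) for f in fits])
    # For Legendre polynomials with xi ~ U(-1, 1): E[P_k^2] = 1 / (2k + 1).
    norms = np.array([1.0 / (2 * k + 1) for k in range(deg + 1)])
    return c[0], np.sum(c[1:]**2 * norms[1:])

for M in (0.4, 0.6, 0.8):                         # query off-anchor conditions
    mu, var = mean_and_var(M)
    print(f"M={M:.2f}  mean CL={mu:.4f}  std={np.sqrt(var):.4f}")
```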
85

UNCERTAINTY QUANTIFICATION OF LASER POWDER BED FUSION COMPUTATIONAL MODELS

Scott M Wells (14228129) 09 December 2022 (has links)
Laser powder bed fusion (L-PBF) is a relatively young metallurgical processing method which has many advantages over traditional casting and wrought based methods. Alloy systems suitable for this additive manufacturing (AM) process include Ti-6Al-4V, 316 stainless steel, and Inconel 718 and 625, making it attractive for automotive, aerospace, and biomedical applications. Despite the potential, L-PBF is plagued by defects and inconsistent build qualities which make certification of critical components onerous. Additionally, experimental studies are difficult due to the cost of laser systems and feedstock material. Many researchers have turned to computational modeling as this allows for rigorous examination and isolation of the underlying physics to better understand where problems may arise, and where improvements can be made. However, models often fail to consider the role of systematic and statistical uncertainty while also relying heavily on assumptions and simplifications for computational efficiency. As such, there is no quantifiable metric for how reliable these models are. This work applies an uncertainty quantification (UQ) framework to computational models for L-PBF to understand the role of uncertainty and assumptions on model reliability, as this provides insight into their limitations and potential areas of improvement.

First, the UQ framework is applied to a finite volume melt pool transport model to evaluate the role of uncertainty and model assumptions on melt pool shapes and solidification dynamics. This includes the role of simulating the powder bed thermophysical properties, surface tension driven Marangoni convection, and the thermodynamic relation dictating latent heat release. The transport model is then weakly coupled to a cellular automata (CA) grain evolution model to propagate and quantify the uncertainty in the as-built microstructure including crystallographic texture formation. Further propagation of melt pool and microstructure uncertainty to the resulting mechanical properties to close the process-microstructure-property relations is discussed. Lastly, recommendations for future model development and research are presented.
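As a small, hedged illustration of forward uncertainty propagation through a melt-pool model: the sketch below uses the classical Rosenthal moving point-source solution as a cheap analytical stand-in for the finite-volume transport model studied in the thesis, and pushes uniform uncertainty in laser absorptivity and thermal conductivity through to a melt-pool width distribution by Monte Carlo. The material constants and uncertainty ranges are rough, Ti-6Al-4V-like values chosen purely for illustration, not inputs from the thesis.

```python
import numpy as np

rng = np.random.default_rng(2)

# Rosenthal moving point-source temperature field on a semi-infinite solid.
# T(x, y) on the top surface; x along the scan direction from the beam, y
# transverse.  Classical analytical model, used only as a cheap stand-in.
def rosenthal_T(x, y, P_abs, k, a, v, T0):
    r = np.sqrt(x**2 + y**2) + 1e-12
    return T0 + P_abs / (2.0 * np.pi * k * r) * np.exp(-v * (r + x) / (2.0 * a))

def melt_pool_width(P_abs, k, a, v=1.0, T0=300.0, T_melt=1923.0):
    # Evaluate on a surface grid and take the widest transverse extent of the
    # region above the melting temperature.
    x = np.linspace(-400e-6, 100e-6, 400)
    y = np.linspace(0.0, 150e-6, 200)
    X, Y = np.meshgrid(x, y, indexing="ij")
    melted = rosenthal_T(X, Y, P_abs, k, a, v, T0) >= T_melt
    if not melted.any():
        return 0.0
    return 2.0 * Y[melted].max()          # symmetric about the centerline

# Uncertain inputs (illustrative values): absorptivity and conductivity are
# treated as uniform intervals; everything else is held fixed.
n_mc = 500
P_laser = 200.0                           # W
eta = rng.uniform(0.30, 0.45, n_mc)       # laser absorptivity
k = rng.uniform(6.0, 9.0, n_mc)           # thermal conductivity, W/(m K)
a = 3.0e-6                                # thermal diffusivity, m^2/s (fixed)

widths = np.array([melt_pool_width(e * P_laser, kk, a) for e, kk in zip(eta, k)])
print(f"melt-pool width: mean = {widths.mean()*1e6:.1f} um, "
      f"std = {widths.std()*1e6:.1f} um, "
      f"95% interval = [{np.percentile(widths, 2.5)*1e6:.1f}, "
      f"{np.percentile(widths, 97.5)*1e6:.1f}] um")
```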
86

Bayesian Uncertainty Quantification while Leveraging Multiple Computer Model Runs

Walsh, Stephen A. 22 June 2023 (has links)
In the face of spatially correlated data, Gaussian process regression is a very common modeling approach. Given observational data, kriging equations will provide the best linear unbiased predictor for the mean at unobserved locations. However, when a computer model provides a complete grid of forecasted values, kriging will not apply. To develop an approach to quantify uncertainty of computer model output in this setting, we leverage information from a collection of computer model runs (e.g., historical forecast and observation pairs for tropical cyclone precipitation totals) through a Bayesian hierarchical framework. This framework allows us to combine information and account for the spatial correlation within and across computer model output. The maximum likelihood estimates and corresponding Hessian matrices for the Gaussian process parameters are input to a Gibbs sampler, which provides posterior distributions for parameters of interest. These samples are used to generate predictions which provide uncertainty quantification for a given computer model run (e.g., tropical cyclone precipitation forecast). We then extend this framework using deep Gaussian processes to allow for nonstationary covariance structure, applied to multiple computer model runs from a cosmology application. We also perform sensitivity analyses to understand which parameter inputs most greatly impact cosmological computer model output. / Doctor of Philosophy / A crucial theme when analyzing spatial data is that locations that are closer together are more likely to have similar output values (for example, daily precipitation totals). For a particular event, a common modeling approach for spatial data is to observe data at numerous locations and make predictions for locations that were unobserved. In this work, we extend this within-event modeling approach by additionally learning about the uncertainty across different events. Through this extension, we are able to quantify uncertainty for a particular computer model (which may be modeling tropical cyclone precipitation, for example) that does not provide any uncertainty on its own. This framework can be utilized to quantify uncertainty across a vast array of computer model outputs where more than one event or model run has been obtained. We also study how inputting different values into a computer model can influence the values it produces.
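For readers unfamiliar with the Gaussian-process/kriging building block this abstract refers to, here is a minimal sketch (not code from the dissertation): a zero-mean GP with a squared-exponential covariance is conditioned on scattered noisy observations, and the standard kriging equations give the posterior mean and variance at unobserved locations. The synthetic field, kernel hyperparameters, and noise level are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Squared-exponential covariance between two sets of 2-D locations.
def sqexp_cov(X1, X2, sigma2=1.0, ell=0.3):
    d2 = ((X1[:, None, :] - X2[None, :, :])**2).sum(-1)
    return sigma2 * np.exp(-0.5 * d2 / ell**2)

# Synthetic spatial field observed at 40 scattered locations (e.g., rainfall
# totals); in the dissertation the information comes from forecast/observation
# pairs across many computer-model runs.
X_obs = rng.uniform(0, 1, size=(40, 2))
def truth(X):
    return np.sin(3 * X[:, 0]) * np.cos(2 * X[:, 1])
tau2 = 0.05**2                                   # observation noise variance
y = truth(X_obs) + rng.normal(0, np.sqrt(tau2), len(X_obs))

# GP posterior (zero prior mean) at new locations: standard kriging equations.
X_new = rng.uniform(0, 1, size=(5, 2))
K = sqexp_cov(X_obs, X_obs) + tau2 * np.eye(len(X_obs))
k_star = sqexp_cov(X_obs, X_new)
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
mean = k_star.T @ alpha                          # posterior predictive mean
v = np.linalg.solve(L, k_star)
var = np.diag(sqexp_cov(X_new, X_new)) - np.sum(v**2, axis=0)

for m, s, t in zip(mean, np.sqrt(var), truth(X_new)):
    print(f"pred {m:+.3f} +/- {1.96*s:.3f}   truth {t:+.3f}")
```

The Bayesian hierarchical framework in the dissertation sits on top of this building block, treating the kernel parameters themselves as uncertain and pooling information across runs.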
87

Advancing computational materials design and model development using data-driven approaches

Sose, Abhishek Tejrao 02 February 2024 (has links)
Molecular dynamics (MD) simulations are used to build a fundamental understanding of the molecular-level mechanisms of physical processes, which assists in tuning the key features that govern the development of novel hybrid materials. An application that demands a particular function can be served by hybrid materials that blend new properties obtained by combining pure materials. However, running MD simulations requires an accurate representation of the interatomic potentials, i.e., force-field (FF) models. This thesis explores the fusion of MD simulations, uncertainty quantification, and data-driven methodologies to accelerate the computational design of innovative materials and models across the following interconnected chapters. Beginning with the development of force fields for atomic-level systems and coarse-grained models for FCC metals, the study progresses to exploring the interfacial interactions between water and 2D materials such as graphene and MoS2. Current state-of-the-art model development faces the challenges of high-dimensional input parameter spaces and the unknown robustness of the resulting models. The utilization of advanced optimization techniques such as particle swarm optimization (PSO) integrated with MD enhances the accuracy and precision of FF models, while Bayesian uncertainty quantification (BUQ) assists force-field developers in estimating the robustness of the models. Furthermore, the complex structure and dynamics of water confined between and around the sheets were unraveled using 3D Convolutional Neural Networks (3D-CNN). Specifically, through classification and regression models, water molecule ordering/disordering and atomic density profiles were accurately predicted, thereby elucidating nuanced interplays between sheet compositions and confined water molecules. To further the computational design of hybrid materials, this thesis delves into designing and investigating polymer composites with functionalized MOFs, shedding light on crucial factors governing their compatibility and performance; accordingly, the thesis studies the structure and dynamics of functionalized MOFs in the polymer matrix. Additionally, it investigates the biomedical potential of porous MOFs as drug delivery vehicles (DDVs). Often overlooked is the pivotal role of solvents (used in MOF synthesis or found in relevant body fluids) in the drug adsorption and release process. The thesis underscores the solvent's impact on drug adsorption within MOFs by comparing results in its presence and absence. Building on these findings, the study delves into the effects of MOF functionalization on tuning the drug adsorption and release process, and further explores how different physical and chemical properties influence drug adsorption within MOFs. Furthermore, the research explores the potential of functionalized MOFs for improved carbon capture, considering their application in energy-related contexts. By harnessing machine learning and deep learning, the thesis introduces innovative pathways for material property prediction and design, emphasizing the pivotal fusion of computational methodologies with data-driven approaches to advance molecular-level understanding and propel future material design endeavors. / Doctor of Philosophy / Envision a world where scientific exploration reaches the microscopic scale, powered by advanced computational tools.
In this frontier of materials science, researchers employ sophisticated computer simulations to delve into the intricate properties of materials, particularly focusing on Metal-Organic Frameworks (MOFs). These MOFs, akin to microscopic molecular sponges, exhibit remarkable abilities to capture gases or hold medicinal drug compounds. This thesis studies MOFs alongside materials such as graphene, boron nitride, and molybdenum disulfide, investigating their interactions with water with unprecedented precision. Through these detailed explorations and the fusion of cutting-edge technologies, we aim to unlock a future featuring enhanced drug delivery systems, improved energy storage solutions, and innovative energy applications.
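The abstract mentions particle swarm optimization (PSO) integrated with MD for force-field development. The sketch below is a minimal, hypothetical illustration of that idea only: a plain PSO fits Lennard-Jones parameters to synthetic dimer-energy data, standing in for the MD-driven objective used in the thesis. The parameter bounds, "reference" data, and swarm settings are all invented.

```python
import numpy as np

rng = np.random.default_rng(4)

# Lennard-Jones pair energy; the "force field" to calibrate has two
# parameters, epsilon (well depth) and sigma (size).
def lj_energy(r, eps, sigma):
    sr6 = (sigma / r)**6
    return 4.0 * eps * (sr6**2 - sr6)

# Synthetic reference data (e.g., from ab initio dimer scans): energies on a
# grid of separations, generated here from "true" parameters plus noise.
r_grid = np.linspace(3.0, 8.0, 25)                      # Angstrom
eps_true, sigma_true = 0.010, 3.40                      # eV, Angstrom (illustrative)
E_ref = lj_energy(r_grid, eps_true, sigma_true) + rng.normal(0, 2e-4, r_grid.size)

def loss(theta):                                        # sum-of-squares misfit
    eps, sigma = theta
    return np.sum((lj_energy(r_grid, eps, sigma) - E_ref)**2)

# Plain particle swarm optimization over (epsilon, sigma).
lo, hi = np.array([0.001, 2.5]), np.array([0.05, 4.5])
n_p, n_iter, w, c1, c2 = 30, 200, 0.7, 1.5, 1.5
pos = rng.uniform(lo, hi, size=(n_p, 2))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_f = np.array([loss(p) for p in pos])
gbest = pbest[np.argmin(pbest_f)].copy()

for _ in range(n_iter):
    r1, r2 = rng.uniform(size=(2, n_p, 1))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    f = np.array([loss(p) for p in pos])
    better = f < pbest_f
    pbest[better], pbest_f[better] = pos[better], f[better]
    gbest = pbest[np.argmin(pbest_f)].copy()

print("fitted (eps, sigma):", gbest, " true:", (eps_true, sigma_true))
```

In practice the misfit would be evaluated against MD-derived observables rather than a closed-form energy curve, which is where the thesis couples PSO with the simulation engine.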
88

Bayesian, Frequentist, and Information Geometry Approaches to Parametric Uncertainty Quantification of Classical Empirical Interatomic Potentials

Kurniawan, Yonatan 20 December 2021 (has links)
Uncertainty quantification (UQ) is an increasingly important part of materials modeling. In this paper, we consider the problem of quantifying parametric uncertainty in classical empirical interatomic potentials (IPs). Previous work based on local sensitivity analysis using the Fisher Information has shown that IPs are sloppy, i.e., are insensitive to coordinated changes of many parameter combinations. We confirm these results and further explore the non-local statistics in the context of sloppy model analysis using both Bayesian (MCMC) and Frequentist (profile likelihood) methods. We interface these tools with the Knowledgebase of Interatomic Models (OpenKIM) and study three models based on the Lennard-Jones, Morse, and Stillinger-Weber potentials, respectively. We confirm that IPs have global properties similar to those of sloppy models from fields such as systems biology, power systems, and critical phenomena. These models exhibit a low effective dimensionality in which many of the parameters are unidentifiable, i.e., do not encode any information when fit to data. Because the inverse problem in such models is ill-conditioned, unidentifiable parameters present challenges for traditional statistical methods. In the Bayesian approach, Monte Carlo samples can depend on the choice of prior in subtle ways. In particular, they often "evaporate" parameters into high-entropy, sub-optimal regions of the parameter space. For profile likelihoods, confidence regions are extremely sensitive to the choice of confidence level. To get a better picture of the relationship between data and parametric uncertainty, we sample the Bayesian posterior at several sampling temperatures and compare the results with those of Frequentist analyses. In analogy to statistical mechanics, we classify samples as either energy-dominated, i.e., characterized by identifiable parameters in constrained (ground state) regions of parameter space, or entropy-dominated, i.e., characterized by unidentifiable (evaporated) parameters. We complement these two pictures with information geometry to illuminate the underlying cause of this phenomenon. In this approach, a parameterized model is interpreted as a manifold embedded in the space of possible data with parameters as coordinates. We calculate geodesics on the model manifold and find that IPs, like other sloppy models, have bounded manifolds with a hierarchy of widths, leading to low effective dimensionality in the model. We show how information geometry can motivate new, natural parameterizations that improve the stability and interpretation of UQ analysis and further suggest simplified, less-sloppy models.
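As a small illustration of the local sensitivity analysis this abstract takes as its starting point (not the OpenKIM-based workflow from the thesis), the sketch below builds the Fisher information matrix for a Morse potential from a finite-difference Jacobian of its predicted energies and inspects the eigenvalue spectrum; a wide, roughly geometric spread of eigenvalues is the signature of a sloppy model. The separation grid, nominal parameters, and noise level are assumptions.

```python
import numpy as np

# Morse pair potential: three parameters (D, a, r0); predictions are energies
# at a set of separations -- a toy stand-in for the fitted quantities.
r_grid = np.linspace(1.8, 5.0, 20)
def predictions(log_theta):
    D, a, r0 = np.exp(log_theta)          # work in log-parameters (positivity)
    return D * (1.0 - np.exp(-a * (r_grid - r0)))**2 - D

theta0 = np.log(np.array([0.5, 1.4, 2.2]))            # nominal values (illustrative)

# Finite-difference Jacobian of predictions with respect to log-parameters.
def jacobian(log_theta, h=1e-6):
    J = np.zeros((r_grid.size, log_theta.size))
    for k in range(log_theta.size):
        dp = np.zeros_like(log_theta); dp[k] = h
        J[:, k] = (predictions(log_theta + dp) - predictions(log_theta - dp)) / (2 * h)
    return J

# Fisher information for iid Gaussian noise of variance s2: I = J^T J / s2.
s2 = 0.01**2
J = jacobian(theta0)
fim = J.T @ J / s2
eigvals = np.sort(np.linalg.eigvalsh(fim))[::-1]

# Many decades between the largest and smallest eigenvalue would indicate that
# only a few stiff parameter directions are constrained by the data.
print("FIM eigenvalues:", eigvals)
print("log10 spread   :", np.log10(eigvals[0] / eigvals[-1]))
```

The Bayesian (MCMC), profile-likelihood, and information-geometric analyses in the thesis all probe the non-local structure that this purely local picture cannot capture.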
89

Using Machine Learning Techniques to Model the Process-Structure-Property Relationship in Additive Manufacturing

Shishavan, Seyyed Hadi Seifi 06 August 2021 (has links)
Additive manufacturing (AM) is a novel fabrication technique capable of producing highly complex parts. Nevertheless, a major challenge is improving the quality of the fabricated parts. While there are several ways of approaching this problem, data-driven methods that use AM process signatures to identify part anomalies can be rapidly applied to improve the overall part quality during the build. The objective of this dissertation is to model multiple processes within AM to quantify the quality of the parts and reduce the uncertainty due to variation in input process parameters. The objective of the first study is to build a new layer-wise process signature model to characterize the thermal-defect relationship. Based on melt pool images, we propose novel layer-wise key process signatures, which are calculated using multilinear principal component analysis (MPCA) and are directly correlated with the layer-wise quality of the part. The second study broadens the scope of the dissertation to include mechanical properties, where a novel two-phase modeling methodology is proposed for fatigue life prediction based on in-situ monitoring of thermal history. In the final study, our objective is to pave the way toward a better understanding of the uncertainty in the process-defect-structure relationship using an inverse robust design exploration method. The method involves two steps. In the first step, mathematical models are developed to characterize and model the forward flow of information in the intended additive manufacturing process. In the second step, inverse robust design exploration is carried out to investigate satisfying design solutions that meet multiple AM goals.
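As a simplified, hypothetical illustration of extracting layer-wise process signatures from melt-pool images: the sketch below applies ordinary PCA to vectorized synthetic images (the dissertation uses multilinear PCA, which operates on the image tensor directly) and correlates the leading layer-wise score with a synthetic quality indicator. All data, image sizes, and the "porosity" relation are invented.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic stack of melt-pool images: n_layers layers, m images per layer,
# each 16x16 pixels.  A hidden per-layer "energy" factor changes the melt-pool
# size and also drives a layer-wise quality indicator (e.g., porosity).
n_layers, m, px = 60, 25, 16
yy, xx = np.mgrid[0:px, 0:px]
r2 = (xx - px / 2)**2 + (yy - px / 2)**2
energy = rng.uniform(0.8, 1.2, n_layers)                  # hidden process drift
images = np.empty((n_layers, m, px * px))
for L in range(n_layers):
    for i in range(m):
        spot = np.exp(-r2 / (2 * (2.5 * energy[L])**2))    # bigger pool = hotter
        images[L, i] = (spot + rng.normal(0, 0.05, spot.shape)).ravel()
porosity = 0.05 / energy + rng.normal(0, 0.002, n_layers)  # quality indicator

# Simplified layer-wise signature: ordinary PCA on the vectorized images, then
# average the leading score over each layer to get one signature per layer.
flat = images.reshape(-1, px * px)
flat_c = flat - flat.mean(axis=0)
_, _, Vt = np.linalg.svd(flat_c, full_matrices=False)
scores = flat_c @ Vt[0]                                    # leading PC score
signature = scores.reshape(n_layers, m).mean(axis=1)

# Correlate the layer-wise signature with the layer quality indicator.
corr = np.corrcoef(signature, porosity)[0, 1]
print(f"correlation between layer-wise PC signature and porosity: {corr:+.2f}")
```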
90

Dry Static Friction in Metals: Experiments and Micro-Asperity Based Modeling

Sista, Sri Narasimha Bhargava January 2014 (has links)
No description available.
