711

High-performance Monte Carlo Computations for Adhesive Bands Formation

Shah, Karim Ali January 2022 (has links)
We propose a lattice model for three stochastically interacting components that mimics the formation of the internal structure of adhesive bands via evaporation of one component (the solvent) driven by a thermal gradient. We use high-performance computing resources to investigate the formation of rubber-acrylate morphologies. We pay special attention to the role of varying temperature, as well as of changing the solvent interaction, in connection with the shape of the obtained rubber morphologies. In the lattice model, we start with microscopic spins of three particle types on the lattice with short-range interactions between them. This microscopic model is approximated numerically via a Monte Carlo Metropolis-based algorithm. High-performance computing resources and Python-based implementations have been used for the numerical simulation of the lattice model. The numerical implementation highlights the effect of the model parameters (volatility of the solvent, temperature, and interaction between the particles) on the structure of the obtained morphologies. We demonstrate that one can use a reasonably simple model to explain the impact of these parameters on the creation of morphology in ternary systems when one component evaporates.
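A minimal sketch of the kind of Metropolis-based lattice dynamics described above, written in Python since the abstract mentions Python-based implementations. The interaction matrix, lattice size, and temperature are illustrative placeholders, and solvent evaporation under a thermal gradient is omitted; the sketch only shows Kawasaki-style exchange moves that drive phase separation in a three-component lattice.

```python
import numpy as np

rng = np.random.default_rng(0)

L = 64                           # lattice side length (illustrative)
J = np.array([[0.0, 1.0, 0.5],   # pairwise interaction energies between species
              [1.0, 0.0, 0.3],   # 0 (rubber), 1 (acrylate), 2 (solvent);
              [0.5, 0.3, 0.0]])  # placeholder values, not from the thesis
T = 1.2                          # temperature in units of k_B (illustrative)

lattice = rng.integers(0, 3, size=(L, L))   # random initial ternary mixture
neigh = ((1, 0), (-1, 0), (0, 1), (0, -1))

def local_energy(lat, i, j):
    """Interaction energy of site (i, j) with its four nearest neighbours."""
    s = lat[i, j]
    return sum(J[s, lat[(i + di) % L, (j + dj) % L]] for di, dj in neigh)

def metropolis_sweep(lat, T):
    """One sweep of Kawasaki-type exchanges (conserves composition)."""
    for _ in range(L * L):
        i, j = rng.integers(0, L, size=2)
        di, dj = neigh[rng.integers(4)]
        k, l = (i + di) % L, (j + dj) % L
        if lat[i, j] == lat[k, l]:
            continue
        e_old = local_energy(lat, i, j) + local_energy(lat, k, l)
        lat[i, j], lat[k, l] = lat[k, l], lat[i, j]
        e_new = local_energy(lat, i, j) + local_energy(lat, k, l)
        dE = e_new - e_old
        if dE > 0 and rng.random() >= np.exp(-dE / T):
            lat[i, j], lat[k, l] = lat[k, l], lat[i, j]   # reject: swap back

for sweep in range(100):
    metropolis_sweep(lattice, T)
```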
712

Estimating Structural Models with Bayesian Econometrics

Sacher, Szymon Konrad January 2023 (has links)
With the ever-increasing availability of large, high-dimensional datasets, there is a growing need for econometric methods that can handle such data. The last decade has seen the development of many such methods in computer science, but their applications to economic models have been limited. In this thesis, I investigate whether modern tools in (exact and approximate) Bayesian inference can be useful in economics. In the three chapters, my coauthors and I develop and estimate a variety of models applied to problems in organizational economics, health, and labor. In chapter one, joint with Andrew Olenski, we estimate a mortality-based Bayesian model of nursing home quality accounting for selection. We then conduct three exercises. First, we examine the correlates of quality, and find that public report cards have near-zero correlation. Second, we show that higher quality nursing homes fared better during the pandemic: a one standard deviation increase in quality corresponds to 2.5% fewer Covid-19 cases. Finally, we show that a 10% increase in the Medicaid reimbursement rate raises quality, leading to a 1.85 percentage point increase in 90-day survival. Such a reform would be cost-effective under conservative estimates of the quality-adjusted statistical value of life. In chapter two, joint with Laura Battaglia and Stephen Hansen, we demonstrate the effectiveness of Hamiltonian Monte Carlo (HMC) in analyzing high-dimensional data in a computationally efficient and methodologically sound manner. We propose a new model, called Supervised Topic Model with Covariates, that shows how modeling this type of data carefully can have significant implications on conclusions compared to a simpler yet methodologically problematic two-step approach. By conducting a simulation study and revisiting the study of executive time use by Bandiera, Prat, Hansen, and Sadun (2020), we demonstrate these results. This approach can accommodate thousands of parameters and doesn’t require custom algorithms specific to each model, making it more accessible for applied researchers. In chapter three, I propose a new way to estimate a two-way fixed effects model such as Abowd, Kramarz, and Margolis (1999) (AKM) that relaxes the stringent assumptions concerning the matching process. Through simulations, I demonstrate that this model performs well and provide an application to matched employer-employee data from Brazil. The results indicate that disregarding selection may result in a significant bias in the estimates of location fixed effects, and thus, can contribute to explaining recent discoveries about the relevance of locations in US labor markets. The three chapters demonstrate the usefulness of modern Bayesian methods for estimating models that would be otherwise infeasible, while remaining accessible enough for applied researchers. The importance of carefully modeling the data of interest instead of relying on ad-hoc solutions is also highlighted, as it has been shown to significantly impact the conclusions drawn across a variety of problems.
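As a rough illustration of the Hamiltonian Monte Carlo machinery referred to in chapter two (not the Supervised Topic Model itself), the following sketch implements a basic leapfrog HMC transition for a toy log-posterior; the target, step size, and trajectory length are placeholders.

```python
import numpy as np

rng = np.random.default_rng(1)

def log_post(theta):
    """Toy log-posterior: standard normal in d dimensions (placeholder)."""
    return -0.5 * np.sum(theta ** 2)

def grad_log_post(theta):
    return -theta

def hmc_step(theta, step_size=0.1, n_leapfrog=20):
    """One HMC transition using a leapfrog integrator plus a Metropolis test."""
    p = rng.standard_normal(theta.shape)                   # sample momentum
    theta_new, p_new = theta.copy(), p.copy()
    p_new += 0.5 * step_size * grad_log_post(theta_new)    # initial half step
    for _ in range(n_leapfrog - 1):
        theta_new += step_size * p_new
        p_new += step_size * grad_log_post(theta_new)
    theta_new += step_size * p_new
    p_new += 0.5 * step_size * grad_log_post(theta_new)    # final half step
    # Hamiltonian = negative log-posterior + kinetic energy
    h_old = -log_post(theta) + 0.5 * np.sum(p ** 2)
    h_new = -log_post(theta_new) + 0.5 * np.sum(p_new ** 2)
    return theta_new if rng.random() < np.exp(h_old - h_new) else theta

theta = np.zeros(50)                 # 50-dimensional toy example
samples = []
for _ in range(1000):
    theta = hmc_step(theta)
    samples.append(theta.copy())
```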
713

Rare-Event Estimation and Calibration for Large-Scale Stochastic Simulation Models

Bai, Yuanlu January 2023 (has links)
Stochastic simulation has been widely applied in many domains. More recently, however, the rapid surge of sophisticated problems such as safety evaluation of intelligent systems has posed various challenges to conventional statistical methods. Motivated by these challenges, in this thesis we develop novel methodologies with theoretical guarantees and numerical applications to tackle them from different perspectives. In particular, our work can be categorized into two areas: (1) rare-event estimation (Chapters 2 to 5), where we develop approaches to estimating the probabilities of rare events via simulation; and (2) model calibration (Chapters 6 and 7), where we aim at calibrating the simulation model so that it is close to reality. In Chapter 2, we study rare-event simulation for a class of problems where the target hitting sets of interest are defined via modern machine learning tools such as neural networks and random forests. We investigate an importance sampling scheme that integrates the dominating point machinery in large deviations and sequential mixed integer programming to locate the underlying dominating points. We provide efficiency guarantees and numerical demonstration of our approach. In Chapter 3, we propose a new efficiency criterion for importance sampling, which we call probabilistic efficiency. Conventionally, an estimator is regarded as efficient if its relative error is sufficiently controlled. It is widely known that when a rare-event set contains multiple "important regions" encoded by the dominating points, importance sampling needs to account for all of them via mixing to achieve efficiency. We argue that the traditional analysis recipe could suffer from intrinsic looseness by using relative error as an efficiency criterion. Thus, we propose the new efficiency notion to tighten this gap. In particular, we show that under the standard Gärtner-Ellis large deviations regime, an importance sampling scheme that uses only the most significant dominating points is sufficient to attain this efficiency notion. In Chapter 4, we consider the estimation of rare-event probabilities using sample proportions output by crude Monte Carlo. Due to the recent surge of sophisticated rare-event problems, efficiency-guaranteed variance reduction may face implementation challenges, which motivates one to look at naive estimators. In this chapter we construct confidence intervals for the target probability using this naive estimator via various techniques, and then analyze their validity as well as their tightness, quantified respectively by the coverage probability and relative half-width. In Chapter 5, we propose the use of extreme value analysis, in particular the peak-over-threshold method, which is popularly employed for extremal estimation of real datasets, in the simulation setting. More specifically, we view crude Monte Carlo samples as data to which we fit a generalized Pareto distribution. We test this idea on several numerical examples. The results show that in the absence of efficient variance reduction schemes, it appears to offer potential benefits in enhancing crude Monte Carlo estimates. In Chapter 6, we investigate a framework to develop calibration schemes in parametric settings, which satisfies rigorous frequentist statistical guarantees via a basic notion that we call the eligibility set, designed to bypass non-identifiability via set-based estimation. We investigate a feature extraction-then-aggregation approach to construct these sets, targeting multivariate outputs.
We demonstrate our methodology on several numerical examples, including an application to the calibration of a limit order book market simulator. In Chapter 7, we study a methodology to tackle the NASA Langley Uncertainty Quantification Challenge, a model calibration problem under both aleatory and epistemic uncertainties. Our methodology is based on an integration of distributionally robust optimization and importance sampling. The main computational machinery in this integrated methodology amounts to solving sampled linear programs. We present theoretical statistical guarantees of our approach via connections to nonparametric hypothesis testing, and numerical performance on tasks including parameter calibration and downstream decision and risk evaluation.
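A small illustration of the dominating-point idea behind the importance sampling schemes discussed in Chapters 2 and 3, using a one-dimensional Gaussian tail probability. The shift of the sampling distribution to the threshold is the textbook exponential-tilting construction, not the thesis's mixed-integer-programming procedure; the threshold and sample size are arbitrary.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
a = 4.0          # rare-event threshold: estimate P(X > a) for X ~ N(0, 1)
n = 100_000

# Crude Monte Carlo: almost no samples land in the rare-event set.
x = rng.standard_normal(n)
p_crude = np.mean(x > a)

# Importance sampling: shift the sampling density to N(a, 1); the point a is
# the dominating point of the event {X > a} under the Gaussian measure.
y = rng.normal(loc=a, scale=1.0, size=n)
weights = stats.norm.pdf(y) / stats.norm.pdf(y, loc=a)   # likelihood ratio
p_is = np.mean(weights * (y > a))

print(f"exact      {stats.norm.sf(a):.3e}")
print(f"crude MC   {p_crude:.3e}")
print(f"importance {p_is:.3e}")
```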
714

Uncertainty Quantification for Micro-Scale Simulations of Flow in Plant Canopies

Giacomini, Beatrice January 2023 (has links)
Recent decades have seen a remarkable increase in the fidelity of computational fluid dynamics (CFD) models for the simulation of exchange processes between plant canopies and the atmosphere. However, no matter how accurate the selected CFD solver is, model results are found to be affected by an irreducible level of uncertainty that originates from the inability to exactly measure vegetation (leaf orientation, foliage density, plant reconfiguration) and flow features (incoming wind direction, solar radiation, stratification effects). Motivated by this consideration, the present PhD thesis proposes a Bayesian uncertainty quantification (UQ) framework for evaluating uncertainty on model parameters and its impact on model results, in the context of CFD for idealized and realistic plant canopy flow. Two problems are considered. First, for the one-dimensional flow within and above the Duke forest near Durham, NC, a one-dimensional Reynolds-averaged Navier-Stokes model is employed. In-situ measurements of turbulence statistics are used to inform the UQ framework in order to evaluate uncertainty on plant geometry and its impact on turbulence statistics and aerodynamic coefficients. The second problem is characterized by a more realistic setup, with three-dimensional simulations aiming at replicating the flow over a walnut block in Dixon, CA. Due to the substantial computational cost associated with large-eddy simulation (LES), a surrogate model is used for flow simulations. The surrogate is built on top of an exiguous number of LESs over a realistic plant canopy, with plant area density derived from LiDAR measurements. Here, the goal is to investigate uncertainty on the incoming wind direction and potential repercussions on turbulence statistics. Synthetic data are used to inform the framework. In both cases, uncertainty on model parameters is characterized via a Markov chain Monte Carlo procedure (inverse problem) and propagated to model results through Monte Carlo sampling (forward problem). In the validation phase, profiles of turbulence statistics with associated uncertainty are compared with the measurements used to inform the framework. By providing an enriched solution for simulation of flow over idealized and realistic plant canopies, this PhD thesis highlights the potential of UQ to enhance prediction of micro-scale exchange processes between vegetation and the atmosphere.
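A schematic version of the inverse/forward UQ loop described above, with a toy one-parameter forward model standing in for the CFD solver. The model, prior, noise level, and proposal scale are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy forward model standing in for the CFD solver: maps a single canopy
# parameter (e.g. a drag coefficient) to a profile of turbulence statistics.
z = np.linspace(0.1, 2.0, 20)                       # normalized heights
def forward(theta):
    return np.exp(-theta * z)                       # placeholder model

theta_true = 1.3
sigma = 0.02
obs = forward(theta_true) + sigma * rng.standard_normal(z.size)   # synthetic data

def log_post(theta):
    if theta <= 0.0:                                # flat prior on theta > 0
        return -np.inf
    resid = obs - forward(theta)
    return -0.5 * np.sum((resid / sigma) ** 2)

# Inverse problem: random-walk Metropolis sampling of the parameter posterior.
chain, theta = [], 1.0
for _ in range(20_000):
    prop = theta + 0.05 * rng.standard_normal()
    if np.log(rng.random()) < log_post(prop) - log_post(theta):
        theta = prop
    chain.append(theta)
posterior = np.array(chain[5_000:])                 # discard burn-in

# Forward problem: propagate posterior samples through the model to obtain
# predictive uncertainty bands on the profile.
pred = np.array([forward(t) for t in rng.choice(posterior, size=500)])
lo, hi = np.percentile(pred, [2.5, 97.5], axis=0)
```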
715

Computational Psychometrics for Item-based Computerized Adaptive Learning

Chen, Yi January 2023 (has links)
With advances in computer technology and expanded access to educational data, psychometrics faces new opportunities and challenges for enhancing pattern discovery and decision-making in testing and learning. In this dissertation, I introduced three computational psychometrics studies for solving technical problems in item-based computerized adaptive learning (CAL) systems related to dynamic measurement, diagnosis, and recommendation, based on Bayesian item response theory (IRT). For the first study, I introduced a new knowledge tracing (KT) model, dynamic IRT (DIRT), which can iteratively update the posterior distribution of latent ability based on moment-match approximation and capture the uncertainty of ability change during the learning process. For dynamic measurement, DIRT has advantages in interpretation, flexibility, computational cost, and implementability. For the second study, a new measurement model, named multilevel and multidimensional item response theory with a Q matrix (MMIRT-Q), was proposed to provide fine-grained diagnostic feedback, and I introduced sequential Monte Carlo (SMC) for online estimation of latent abilities. For the third study, I proposed the maximum expected ratio of posterior variance reduction (MERPV) criterion for testing purposes and the maximum expected improvement in posterior mean (MEIPM) criterion for learning purposes under the unified framework of IRT. With these computational psychometrics solutions, we can improve the students’ learning and testing experience with accurate psychometric measurement, timely diagnostic feedback, and efficient item selection.
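For readers unfamiliar with the underlying machinery, the following sketch shows a sequential Bayesian update of a latent ability under a standard two-parameter logistic (2PL) IRT model using a simple grid approximation. It is a generic illustration, not the DIRT or MMIRT-Q models, and the item parameters and true ability are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(4)

# Grid approximation of the posterior over latent ability theta, updated
# after each observed item response.
grid = np.linspace(-4, 4, 401)
post = np.exp(-0.5 * grid ** 2)          # N(0, 1) prior on ability
post /= post.sum()

def p_correct(theta, a, b):
    """2PL item response function with discrimination a and difficulty b."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

# Hypothetical item bank and simulated responses from a true ability of 0.8.
items = [(1.2, -0.5), (0.9, 0.0), (1.5, 0.7), (1.1, 1.2)]   # (a, b) pairs
theta_true = 0.8
for a, b in items:
    y = rng.random() < p_correct(theta_true, a, b)          # simulated answer
    like = p_correct(grid, a, b) if y else 1.0 - p_correct(grid, a, b)
    post *= like                                            # Bayesian update
    post /= post.sum()
    est = np.sum(grid * post)                               # posterior mean
    sd = np.sqrt(np.sum((grid - est) ** 2 * post))
    print(f"item (a={a}, b={b}): ability estimate {est:+.2f} +/- {sd:.2f}")
```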
716

Photo-biomodulation of human skin fibroblast sub-populations: a systematic approach for the optimization of optical treatment parameters

Mignon, Charles January 2017 (has links)
The thesis presents a rational path for the optimization of the selection of optical treatment parameters in photobiomodulation of human skin fibroblasts. The project began with an extensive analysis of 90 bibliographic reports on photobiomodulation published between 1985 and 2015, which revealed major inconsistencies in the optical parameters selected for clinical applications. Seeking greater clarity for optimal parameter choice, a systematic approach to disentangle the multiple factors underpinning the response of human dermal fibroblasts in vitro to visible and near-infrared (NIR) light was employed. Light-based devices were constructed to specifically and systematically screen the optical parameter window (i.e. wavelength, irradiance and dose) observed in the literature. Additionally, critical culture and treatment conditions that have a dramatic impact on the outcome of specific light treatment of these human skin dermal cells were identified. In particular, environmental oxygen concentration, cell confluency and serum concentration were all found to have a great effect on the response of dermal fibroblasts to light. In parallel, the induction of reactive oxygen species (ROS) by short visible wavelengths in two dermal fibroblast sub-populations, or lineages, reticular and papillary, was monitored by live-cell imaging. These ROS were found to be created in or close to the mitochondria. Lastly, gene expression studies revealed a strong impact of short visible wavelengths, as compared to long visible and NIR wavelengths, on both sub-populations of human dermal fibroblasts. In particular, blue light (450 nm) specifically down-regulated proliferation, metabolism and protein synthesis molecular pathways. At the protein level, 450-nm light inhibited the production of procollagen I in human reticular and papillary fibroblasts in a dose-dependent manner. Gene expression results were in agreement, i.e., the same light parameter down-regulated collagen fiber genes and integrins and up-regulated the collagenase MMP1. This thesis concludes with a chapter presenting a characterization of the accuracy of a potential translation tool for the prediction of optical photon density inside human skin. / Marie Skłodowska-Curie Actions.
717

A Monte Carlo evaluation of Jan and Prof: two techniques for capturing and clustering rater policies

Dudycha, Arthur Lynn January 1967 (has links)
No description available.
718

Probabilistic Assessment of Pile Drivability in Swedish Soils

ALINEJAD KORDMAHALLEH, MAEDEH January 2020 (has links)
Site investigations are often performed prior to the design of pile foundations with the aim of collecting data on soil properties, including boulder content. The obtained data are typically limited due to the non-homogeneous character of the soil. The geological conditions of the Mälardalen region are characterized by glacial and post-glacial clay overlying a layer of moraine containing boulders on bedrock. Thus, pile refusal as a result of encountering boulders is a common issue during pile driving in this region. The current methods to assess the risk of pile refusal are mostly qualitative and rely on expertise gained from experience. This thesis aims to develop a numerical model to quantify the boulder content in a soil stratum, with a confidence interval, based on the site investigation results. Furthermore, this study estimates the probability and the consequences of hitting boulders while installing piles. The model simulations show that an increase in the boulder content raises the probability of hitting a boulder, but the probability is not proportional to the pile diameter. This means that even at a low boulder content, small piles have a high probability of hitting a boulder. Results from simulations using a simplified quantitative method show that slender piles suffer substantial consequences if they encounter boulders and are therefore more prone to pile refusal or breakage. It is suggested that in projects where small piles are preferred due to the superstructure design, enough tests be performed to estimate the boulder content within a tighter interval and allow a more precise design. On the other hand, in projects with low pile counts, it is recommended to adopt a conservative design with large-diameter piles instead of performing numerous in-situ tests, because the cost of piling then becomes relatively low compared with the cost of site investigations.
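A hedged sketch of the kind of Monte Carlo reasoning described above, treating boulder centres as a Poisson field with uncertain intensity. All numerical values are illustrative placeholders and the model is far simpler than the one developed in the thesis.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical inputs, not taken from the thesis: boulder centres form a
# two-dimensional Poisson field; a pile hits a boulder if a centre falls
# within the pile radius plus a mean boulder radius.
n_sim = 100_000
pile_diam = 0.3            # pile diameter [m] (illustrative)
boulder_radius = 0.25      # mean boulder radius [m] (illustrative)

# Uncertain boulder intensity [boulders per m^2 of plan area], represented by
# samples, e.g. from a posterior informed by the site investigation.
intensity = rng.gamma(shape=4.0, scale=0.02, size=n_sim)

# Plan area swept by one pile, enlarged by the boulder radius.
area = np.pi * (pile_diam / 2 + boulder_radius) ** 2

# For each intensity sample, the number of boulders encountered is Poisson.
hits = rng.poisson(intensity * area)
p_hit = np.mean(hits > 0)
print(f"probability of encountering at least one boulder: {p_hit:.3f}")
```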
719

Developments of Two Methodologies in Uncertainty Quantification in Civil Engineering and Engineering Mechanics

Shen, Mengyao January 2022 (has links)
Uncertainty is ubiquitous in civil infrastructure systems and has a major impact on decision-making for structural safety and reliability, and for assessing and managing risk. This dissertation explores the field both from the theoretical-mathematical modeling point of view and from the practical applications perspective. Two problems are addressed in the dissertation: 1) how to quantify the randomness in material properties, and 2) what the impact of uncertainties in material properties is on the response/safety of civil structures. There are two distinct parts in the dissertation. Part I focuses on the strength evaluation of main cables for long-span suspension bridges. A novel methodology is proposed to evaluate the ultimate strength of the main cables of suspension bridges using information obtained from site inspections and from tensile strength tests on selected wire samples extracted from the bridge’s main cables. A new model is proposed accounting for the spatial variation of individual wires’ strength along their length, an important physical attribute of corroded wires considered here for the first time. This model includes: (1) mapping the corrosion stage variation along one-panel-long wires that are visible during an inspection, (2) establishing probability distribution functions for the ultimate tensile strength of 18″ long wire segments in each corrosion stage group, (3) generating random realizations of the ultimate strength of all the wires in the cable’s cross section, accounting for their strength variation along the entire panel length, and (4) accounting for the effect of broken wires in the evaluation panel as well as in adjacent panels. A Monte Carlo simulation approach is finally proposed to generate random realizations of the ultimate overall strength of the cable, using an incremental loading procedure. The final outcome is the probability distribution of the ultimate strength of the entire cable. The methodology is demonstrated through the cable strength evaluation of the FDR Mid-Hudson Bridge and Bear Mountain Bridge in New York State, and compared with corresponding results obtained using the current guidelines of NCHRP Report 534. Part II is more theoretical in nature and focuses on estimating the stochastic response variability of structures with uncertain material properties modeled by stochastic fields. The concept of the Variability Response Function (VRF) is applied in the dissertation to quantify the response variability (e.g., mean and variance) of statically indeterminate structures. Two types of response for statically indeterminate beams at a specific location x are studied: bending moment M(x) and displacement/deflection w(x). By solving the governing equations of the statically indeterminate structure, the responses along the length of the beam, M(x) and w(x), are expressed as a function of its (random) zero-moment location denoted by h. For bending moment M(x), combined with a second-order Taylor series expansion of the random zero-moment location h, novel Variability Response Function-based integral expressions for the variance of the response bending moment, Var[M(x)], are established. Extensive numerical examples are provided where the accuracy of the results obtained using the proposed formulation is validated using Monte Carlo simulations involving stochastic fields that follow truncated Gaussian and shifted lognormal probability distribution functions.
These Monte Carlo simulation results indicate that the proposed Variability Response Functions are probability-distribution-independent. For deflection w(x), by introducing hinges at the zero-moment location h, a statically indeterminate structure can be transformed into its equivalent statically determinate structure. An approximate closed-form analytical expression of the VRF is therefore built based on the transformed statically determinate structure with all (probabilistically) possible hinge locations. An ensemble average is taken to get the overall variability response function of the system, which can be replaced by the VRF with the hinge located at the same zero-moment point as the deterministic system without any randomness. This Variability Response Function can provide approximate estimates of the stochastic response variability with reasonable accuracy. Moreover, to get more accurate estimates for the statically indeterminate system, the results from the approximate variability response function can be further refined by introducing a correction term Dw. Finally, the corrected response variability of the original statically indeterminate structure can be obtained with almost perfect accuracy (compared to brute-force Monte Carlo simulations). To sum up, the proposed VRFs of statically indeterminate beams, for both bending moment and deflection, have numerous desirable attributes, including the capability to perform a full sensitivity analysis of the response variability with respect to the spectral characteristics of the random field modeling the uncertain material/system properties and to establish realizable upper bounds of the response variability.
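A schematic Monte Carlo for the cable-strength part of the methodology (Part I): it draws stage-dependent wire strengths and sums them, whereas the thesis uses an incremental loading procedure and accounts for broken wires in adjacent panels. The number of wires, stage fractions, and strength statistics are placeholders, not values from the thesis.

```python
import numpy as np

rng = np.random.default_rng(6)

# Schematic Monte Carlo for the ultimate strength of a parallel-wire cable.
# Each wire is assigned a corrosion stage; the stage-dependent statistics
# below are illustrative placeholders.
n_wires = 9000
n_real = 2000
stage_probs = [0.55, 0.25, 0.15, 0.05]          # fraction of wires per stage
stage_mean  = [7.1, 6.6, 6.0, 5.1]              # mean wire strength [kips]
stage_cov   = [0.03, 0.05, 0.08, 0.12]          # coefficient of variation

cable_strength = np.empty(n_real)
for r in range(n_real):
    stages = rng.choice(4, size=n_wires, p=stage_probs)
    mu = np.take(stage_mean, stages)
    cov = np.take(stage_cov, stages)
    # Crude lognormal wire strengths (sigma approximately equals cov for small cov).
    wire_strength = rng.lognormal(mean=np.log(mu), sigma=cov)
    # Simplification: treat the cable as a perfectly ductile bundle so its
    # capacity is the sum of wire strengths (the thesis uses incremental loading).
    cable_strength[r] = wire_strength.sum()

print(f"mean cable strength   {cable_strength.mean():.0f} kips")
print(f"5th-percentile value  {np.percentile(cable_strength, 5):.0f} kips")
```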
720

Statistics on 10 GHz attenuation due to precipitation in Montreal and Alberta.

Inkster, Don Robert January 1973 (has links)
No description available.
