1 |
Sensitivity Analysis of Models with Input Codependencies. Dougherty, Sean. 05 December 2013.
Assuming that a set of variates is independent and normally distributed is commonplace in statistics. In this thesis, we consider the consequences of these assumptions as they pertain to global sensitivity analysis. We begin by illustrating how the notion of sensitivity becomes distorted in the presence of codependent model inputs. This observation motivates us to develop a new methodology which accommodates input codependencies. Our methodology can be summarized in three points: First, a new form of sensitivity is presented which performs as well as the classical form but can be obtained at a fraction of the computational cost. Second, we define a measure which quantifies the extent of distortion caused by codependent inputs. The third point concerns the modelling of the codependencies themselves. The multivariate normal distribution is a natural choice for modelling codependent inputs; however, our methodology uses a copula-based approach instead. Copulas are a contemporary strategy for constructing multivariate distributions whereby the marginal and joint behaviours are treated separately. As a result, a practitioner has more flexibility when modelling inputs. / Thesis (Master, Chemical Engineering) -- Queen's University, 2013.
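The abstract names a copula-based approach but not a specific family; a minimal sketch of the idea, assuming a Gaussian copula and scipy-style marginals (both assumptions, not details from the thesis), shows how the marginal and joint behaviours are treated separately:

```python
import numpy as np
from scipy import stats

def gaussian_copula_sample(n, corr, marginals, seed=0):
    """Draw n dependent samples: joint behaviour from a Gaussian copula,
    marginal behaviour from arbitrary scipy.stats distributions."""
    rng = np.random.default_rng(seed)
    d = corr.shape[0]
    # Correlated standard normals carry the dependence structure.
    z = rng.multivariate_normal(np.zeros(d), corr, size=n)
    # Probability integral transform: each column becomes Uniform(0, 1).
    u = stats.norm.cdf(z)
    # Each marginal's inverse CDF then imposes the desired marginal.
    return np.column_stack([m.ppf(u[:, j]) for j, m in enumerate(marginals)])

# Two codependent inputs with non-normal marginals: lognormal and gamma.
corr = np.array([[1.0, 0.7],
                 [0.7, 1.0]])
x = gaussian_copula_sample(10_000, corr,
                           [stats.lognorm(s=0.5), stats.gamma(a=2.0)])
print(np.corrcoef(x, rowvar=False))  # dependence survives the transforms
```

The correlation matrix controls only the joint behaviour; swapping the marginal distributions changes neither the copula nor the sampling code, which is the flexibility the abstract points to.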
|
2 |
A Practical and Fast Numerical Method for Calculating Global Sensitivity with Examples from Supply Chain and Measurement Applications. Groves, William Alan. 02 August 2023.
No description available.
|
3 |
Barrier Island Morphodynamic Insights from Applied Global Sensitivity Analysis and Decadal Exploratory Modeling. Hoagland, Steven William Harvey. 02 October 2024.
Barrier islands serve as valuable resources for coastal communities by reducing backbarrier flooding, providing wildlife habitat, and creating local economic activity through opportunities for recreation and tourism. Because the benefits of these islands are linked to their morphology, coastal resource planners must consider which management alternatives will maximize these benefits, considering both short- and long-term goals. Recent advances in long-term computational modeling of barrier island, marsh, and lagoon systems have created opportunities for gaining additional insights into the morphodynamics of these systems, which may help planners make better-informed coastal management decisions. In this series of studies, a recently developed long-term barrier-marsh-lagoon model is evaluated to better understand system morphodynamics and applied to a real barrier island system in the mid-Atlantic to understand its vulnerabilities and the potential impacts of management alternatives. In the first study, a comprehensive review of advances in barrier island morphodynamic modeling was presented. In the second study, a global sensitivity analysis method, the Sobol method, was used to explore the parameter space of the barrier-marsh-lagoon model. The morphodynamic insights drawn from this study were the significant influence of initial barrier geometry, the combination of parameters required for short-term drowning to occur, and the significant role of tidal dispersion in backbarrier sediment dynamics. In the third study, five global sensitivity analysis methods were evaluated based on their ability to rank parameters, their convergence to stable results, and their reliability. Groups of the most significant parameters were generally identified by all methods; however, the Morris method exceeded all others in terms of performance, especially its ability to converge and its reliability. VARS performed second best, on average, with better convergence and reliability results than the Sobol method, and with lower simulation counts. In the fourth study, the long-term model was applied to a mid-Atlantic barrier island and used to assess the island's vulnerabilities to sea level rise and overwash, and the impact of coastal management alternatives. Thin-layer placement and beach nourishment were found to be effective at sustaining the marsh and minimizing island retreat, respectively.

/ Doctor of Philosophy / Barrier islands help coastal communities by reducing flooding, providing wildlife habitat, and creating local economic activity through opportunities for recreation and tourism. Because the benefits of these islands are linked to their form, decision-makers must think about how to manage these islands to help the community both now and in the future. Recent advances in computer modeling of barrier islands and the adjacent marshes and lagoons, over decades to hundreds of years, have created opportunities to learn more about how these systems behave over time, which may help planners make better-informed coastal management decisions. In this series of studies, a recently developed computer model of the barrier island, marsh, and lagoon is evaluated to learn how the system changes over time and applied to a real barrier island system in the mid-Atlantic to understand its vulnerabilities and the potential impacts of management alternatives. In the first study, a comprehensive review of advances in computer modeling of barrier island changes over time was presented.
In the second study, the impact of the model parameters and their combinations with one another was explored using the Sobol global sensitivity analysis method, which is widely considered the standard method in practice. Significant insights into the system behavior drawn from this study were the strong influence of initial barrier geometry, the combination of parameters required for the barrier to be overcome by sea level in the short term, and the significant role of sediment delivered behind the island through tidal inlets. In the third study, five global sensitivity analysis methods were evaluated based on their ability to rank parameters, the number of computer simulations they required, their ability to arrive at a conclusive answer, and their consistency in providing an answer. Groups of the most significant parameters were generally identified by all methods; however, the Morris method exceeded all others in its ability to find conclusive and consistent answers, owing to its ability to identify unimportant parameters. VARS performed second best, on average, with a better ability to find conclusive and consistent answers with fewer computer simulations than Sobol. In the fourth study, the long-term computer model was applied to a mid-Atlantic barrier island and used to assess the island's vulnerabilities to sea level rise and overwash (when water flows over the dunes), and the impact of coastal management alternatives. Placing thin layers of additional sediment on top of the marsh platforms, and extending the shoreline toward the ocean by placing additional sediment on the beach, were found to be effective at sustaining the marsh and minimizing landward movement of the barrier island, respectively.
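The Sobol method named above is not spelled out in the abstract; a minimal numpy sketch of its standard pick-freeze estimators (Saltelli sampling, with Jansen's formula for total effects), applied to the Ishigami test function as a stand-in for the barrier-marsh-lagoon model, looks like this:

```python
import numpy as np

def ishigami(x, a=7.0, b=0.1):
    # Standard GSA test function, standing in for the barrier-marsh-lagoon model.
    return (np.sin(x[:, 0]) + a * np.sin(x[:, 1]) ** 2
            + b * x[:, 2] ** 4 * np.sin(x[:, 0]))

def sobol_indices(model, d, n=2**14, seed=0):
    """First-order and total Sobol indices from two independent sample blocks
    (Saltelli's pick-freeze scheme; Jansen's estimator for total effects)."""
    rng = np.random.default_rng(seed)
    A = rng.uniform(-np.pi, np.pi, size=(n, d))   # inputs here: U(-pi, pi)
    B = rng.uniform(-np.pi, np.pi, size=(n, d))
    fA, fB = model(A), model(B)
    var = np.var(np.concatenate([fA, fB]))
    S1, ST = np.empty(d), np.empty(d)
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                        # resample only coordinate i
        fABi = model(ABi)
        S1[i] = np.mean(fB * (fABi - fA)) / var        # first-order effect
        ST[i] = 0.5 * np.mean((fA - fABi) ** 2) / var  # total effect
    return S1, ST

S1, ST = sobol_indices(ishigami, d=3)
print("first-order:", S1.round(3), "total:", ST.round(3))
```

The cost is n(d + 2) model evaluations, which is why the comparison in the third study rewards methods such as Morris and VARS that converge with fewer simulations.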
|
4 |
Efficient Computational Methods for Structural Reliability and Global Sensitivity Analyses. Zhang, Xufang. 25 April 2013.
Uncertainty analysis of a system response is an important part of engineering probabilistic analysis. It includes: (a) evaluating the moments of the response; (b) evaluating the reliability of the system; (c) assessing the complete probability distribution of the response; and (d) conducting parametric sensitivity analysis of the output. The actual model of the system response is usually a high-dimensional function of the input variables. Although Monte Carlo simulation is a quite general approach for this purpose, it may require an inordinate amount of resources to achieve an acceptable level of accuracy. The development of a computationally efficient method is therefore of great importance.
First, the study proposed a moment method for uncertainty quantification of structural systems. A key departure is the use of fractional moments of the response function, as opposed to the integer moments used so far in the literature. The advantage of fractional moments over integer moments was illustrated by the relation of a single fractional moment to a number of integer moments. Using a small number of samples to compute the fractional moments, the system output distribution was estimated with the principle of maximum entropy (MaxEnt) in conjunction with constraints specified in terms of fractional moments. Compared to classical MaxEnt, a novel feature of the proposed method is that the fractional exponents of the MaxEnt distribution are determined through the entropy maximization process, instead of being assigned by the analyst a priori.
To further reduce the computational cost of the simulation-based entropy method, a multiplicative dimensional reduction method (M-DRM) was proposed to compute the fractional (or integer) moments of a generic function of multiple input variables. The M-DRM can accurately approximate a high-dimensional function as the product of a series of low-dimensional functions. Together with the principle of maximum entropy, a novel computational approach was proposed to assess the complete probability distribution of a system output. The accuracy and efficiency of the proposed method for structural reliability analysis were verified against crude Monte Carlo simulation in several examples.
The application of the M-DRM was further extended to variance-based global sensitivity analysis of a system. Compared to local sensitivity analysis, a variance-based sensitivity index provides information about the significance of an input random variable. Since each component variance is defined as a conditional expectation with respect to the system model function, the separable nature of the M-DRM approximation simplifies the high-dimensional integrations in sensitivity analysis. Several examples were presented to illustrate the numerical accuracy and efficiency of the proposed method in comparison to the Monte Carlo simulation method.
The last contribution of the study is the development of a computationally efficient method for the polynomial chaos expansion (PCE) of a system's response. This PCE model can later be used for uncertainty analysis. However, evaluating the coefficients of a PCE meta-model is a computationally demanding task due to the high-dimensional integrations involved. With the proposed M-DRM, this computational cost can be remarkably reduced compared to the classical methods in the literature (simulation or tensor-product Gauss quadrature). The accuracy and efficiency of the proposed method for polynomial chaos expansion were verified on several practical examples.
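The central M-DRM step, replacing one d-dimensional integral with d one-dimensional ones, can be sketched in a few lines. The sketch below assumes independent standard-normal inputs, a cut point at the mean, and an invented positive response function; none of these details come from the thesis:

```python
import numpy as np

def h(x):
    # Invented positive response, standing in for an expensive structural model.
    x = np.atleast_2d(x)
    return np.exp(0.4 * x[:, 0]) * (2.0 + np.sin(x[:, 1])) + 0.1 * x[:, 2] ** 2

def mdrm_fractional_moment(model, d, alpha, n_gauss=16):
    """E[h(X)^alpha] for independent standard-normal X via M-DRM:
    h(x) ~ h(c)^(1 - d) * prod_i h(c_1, ..., x_i, ..., c_d), cut point c = 0."""
    c = np.zeros(d)
    t, w = np.polynomial.hermite.hermgauss(n_gauss)
    nodes, weights = np.sqrt(2.0) * t, w / np.sqrt(np.pi)  # rescaled for N(0, 1)
    h0 = model(c).item()
    moment = h0 ** (alpha * (1 - d))
    for i in range(d):
        x = np.tile(c, (n_gauss, 1))
        x[:, i] = nodes                 # vary coordinate i only: a 1-D integral
        moment *= np.dot(weights, model(x) ** alpha)
    return moment

rng = np.random.default_rng(0)
mc = np.mean(h(rng.standard_normal((200_000, 3))) ** 0.5)   # crude MC reference
print("M-DRM:", mdrm_fractional_moment(h, d=3, alpha=0.5), "  MC:", mc)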
|
5 |
Global sensitivity analysis of reactor parameters. Adetula, Bolade Adewale. January 2011.
Calculations of reactor parameters of interest (such as neutron multiplication factors, decay heat, reaction rates, etc.) are often based on models which depend on groupwise neutron cross sections. The uncertainties associated with these neutron cross sections are propagated to the final result of the calculated reactor parameters. There is a need to characterize this uncertainty and to be able to apportion the uncertainty in a calculated reactor parameter to the different sources of uncertainty in the groupwise neutron cross sections; this procedure is known as sensitivity analysis.

The focus of this study is the application of a modified global sensitivity analysis technique to calculations of reactor parameters that depend on groupwise neutron cross sections. Sensitivity analysis can help in identifying the important neutron cross sections for a particular model, and also helps in establishing best-estimate optimized nuclear reactor physics models with reduced uncertainties.

In this study, our approach to sensitivity analysis is similar to the variance-based global sensitivity analysis technique, which is robust, has a wide range of applicability, and provides accurate sensitivity information for most models. However, this technique requires the input variables to be mutually independent. A modification to this technique, which allows one to deal with input variables that are block-wise correlated and normally distributed, is presented.

The implementation of the modified technique involves the calculation of multidimensional integrals, which can be prohibitively expensive to compute. Numerical techniques specifically suited to the evaluation of multidimensional integrals, namely Monte Carlo, quasi-Monte Carlo, and sparse grid methods, are used, and their efficiency is compared. The modified technique is illustrated and tested on a two-group cross-section-dependent problem. In all the cases considered, the results obtained with sparse grids achieved much better accuracy while using a significantly smaller number of samples. / Thesis (M.Sc. Engineering Sciences (Nuclear Engineering))--North-West University, Potchefstroom Campus, 2011.
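The Monte Carlo versus quasi-Monte Carlo comparison can be reproduced in miniature on a standard test integrand; the sketch below uses the Sobol' g-function (whose integral over the unit hypercube is exactly 1) as a stand-in for the thesis' two-group problem, and omits the sparse grid method:

```python
import numpy as np
from scipy.stats import qmc

def g_function(u, a):
    # Sobol' g-function: its integral over the unit hypercube is exactly 1.
    return np.prod((np.abs(4.0 * u - 2.0) + a) / (1.0 + a), axis=1)

d, n = 6, 2**12
a = np.array([0.0, 1.0, 4.5, 9.0, 99.0, 99.0])

mc_pts = np.random.default_rng(0).random((n, d))            # plain Monte Carlo
qmc_pts = qmc.Sobol(d=d, scramble=True, seed=0).random(n)   # quasi-Monte Carlo
print(f"MC error  {abs(g_function(mc_pts, a).mean() - 1.0):.2e}")
print(f"QMC error {abs(g_function(qmc_pts, a).mean() - 1.0):.2e}")
```

For smooth integrands, QMC error typically decays close to O(1/n) versus O(1/√n) for plain Monte Carlo, which is the kind of efficiency gap the thesis quantifies.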
|
6 |
Sensitivity analysis of biochemical systems using high-throughput computing. Kent, Edward Lander. January 2013.
Mathematical modelling is playing an increasingly important role in helping us to understand biological systems. The construction of biological models typically requires the use of experimentally measured parameter values. However, varying degrees of uncertainty surround virtually all parameters in these models. Sensitivity analysis is one of the most important tools for the analysis of models, and shows how the outputs of a model, such as concentrations and reaction fluxes, depend on the parameters which make up the input. Unfortunately, small changes in parameter values can lead to the results of a sensitivity analysis changing significantly. The results of such analyses must therefore be interpreted with caution, particularly if a high degree of uncertainty surrounds the parameter values. Global sensitivity analysis methods can help in such situations by allowing sensitivities to be calculated over a range of possible parameter values. However, these techniques are computationally expensive, particularly for larger, more detailed models. Software was developed to enable a number of computationally intensive modelling tasks, including two global sensitivity analysis methods, to be run in parallel in a high-throughput computing environment. The use of high-throughput computing drastically reduced the run time of these analyses, allowing models to be analysed to a degree that would otherwise be impractical or impossible. Global sensitivity analysis using high-throughput computing was performed on a selection of both theoretical and physiologically based models, under varying degrees of parameter uncertainty. These analyses revealed instances in which the results of a sensitivity analysis were valid even under large degrees of parameter variation, and other cases for which only a slight change in parameter values could completely change the results of the analysis. Parameter uncertainties are a real problem in biological systems modelling. This work shows how, with the help of high-throughput computing, global sensitivity analysis can become a practical part of the modelling process.
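The abstract does not name the software stack, but the property it exploits, that each sampled parameter set is an independent simulation, is easy to demonstrate. A minimal sketch with Python's standard library follows; the three-parameter model function and the tenfold uncertainty ranges are invented for illustration:

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def model(params):
    # Placeholder for one expensive biochemical simulation run; returns a
    # scalar output such as a steady-state reaction flux.
    k1, k2, k3 = params
    return k1 * k2 / (k3 + k2)

def scatter_evaluations(samples, workers=8):
    """Each parameter set is an independent job: the embarrassingly parallel
    structure that makes GSA a natural fit for high-throughput computing."""
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return np.fromiter(pool.map(model, samples), dtype=float)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    base = np.array([1.0, 0.5, 2.0])
    # Vary each rate constant over a tenfold uncertainty range (log-uniform).
    samples = list(base * 10.0 ** rng.uniform(-1, 1, size=(1_000, 3)))
    outputs = scatter_evaluations(samples)
    print(outputs.mean(), outputs.std())
```

On an actual high-throughput system the pool of local worker processes would be replaced by jobs scattered across a cluster, but the scatter-then-analyse structure is the same.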
|
7 |
Polynomial Chaos Expansion in Bio- and Structural Mechanics / Mise en oeuvre du chaos polynomial en biomécanique et en mécanique des structures. Szepietowska, Katarzyna. 12 October 2018.
This thesis presents a probabilistic approach to modelling the mechanics of materials and structures in which the modelled performance is influenced by uncertainty in the input parameters. The work is interdisciplinary, and the methods described are applied to medical and civil engineering problems. The motivation for this work was the need for mechanics-based approaches in the modelling and simulation of implants used in the repair of ventral hernias. Many uncertainties appear in the modelling of the implant-abdominal wall system. The probabilistic approach proposed in this thesis enables these uncertainties to be propagated to the output of the model and their respective influences to be investigated. The regression-based polynomial chaos expansion method is used here. However, the accuracy of such non-intrusive methods depends on the number and location of the sampling points. Finding a universal method that achieves a good balance between accuracy and computational cost is still an open question, so different approaches are investigated in this thesis in order to choose an efficient method suited to the case under study. Global sensitivity analysis is used to investigate the respective influences of the input uncertainties on the variation of the outputs of different models. The uncertainties are propagated to the implant-abdominal wall models, allowing conclusions important for surgical practice and further research to be drawn. Using the expertise acquired from these biomechanical models, the developed methodology is applied to the modelling of historic timber joints and the simulation of their mechanical behaviour. Such an investigation is important owing to the need for efficient planning of repairs and renovation of buildings of historical value.
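The regression-based polynomial chaos expansion named above can be sketched compactly. The sketch assumes independent standard-normal inputs with a Hermite basis and an invented two-parameter response; the thesis' actual question, where to place the sampling points, is sidestepped here by plain random sampling:

```python
import numpy as np
from numpy.polynomial.hermite_e import hermeval
from itertools import product
from math import factorial

def model(x):
    # Invented response, standing in for the implant / abdominal-wall model.
    return np.exp(0.3 * x[:, 0]) + 0.5 * x[:, 0] * x[:, 1]

def herm(n, x):
    # Probabilists' Hermite polynomial He_n, orthogonal under N(0, 1).
    return hermeval(x, np.eye(n + 1)[n])

rng = np.random.default_rng(0)
d, degree, n = 2, 4, 400
X = rng.standard_normal((n, d))                  # sampling (regression) points
y = model(X)

# Total-degree multi-index set and the corresponding regression matrix.
idx = [m for m in product(range(degree + 1), repeat=d) if sum(m) <= degree]
Psi = np.column_stack(
    [np.prod([herm(m[j], X[:, j]) for j in range(d)], axis=0) for m in idx])
coef, *_ = np.linalg.lstsq(Psi, y, rcond=None)   # the regression-based PCE fit

# Sobol indices follow from the coefficients; E[He_m^2] = m! per dimension.
norms = np.array([np.prod([factorial(mj) for mj in m]) for m in idx])
var_terms = coef ** 2 * norms
total_var = var_terms[1:].sum()                  # index 0 is the mean term
for j in range(d):
    first = sum(v for m, v in zip(idx, var_terms)
                if m[j] > 0 and sum(m) == m[j])  # terms in x_j alone
    print(f"S_{j + 1} ~ {first / total_var:.3f}")
```

Once the coefficients are fitted, the global sensitivity indices used later in the thesis follow algebraically from them, with no further model evaluations.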
|
8 |
Inverse Modeling of Cloud-Aerosol Interactions. Partridge, Daniel. January 2011.
The role of aerosols and clouds is one of the largest sources of uncertainty in understanding climate change. The primary scientific goal of this thesis is to improve the understanding of cloud-aerosol interactions by applying inverse modeling using Markov chain Monte Carlo (MCMC) simulation. Through a set of synthetic tests using a pseudo-adiabatic cloud parcel model, it is shown that a self-adaptive MCMC algorithm can efficiently find the correct optimal values of meteorological and aerosol physiochemical parameters for a specified droplet size distribution and determine the global sensitivity of these parameters. For an updraft velocity of 0.3 m s⁻¹, a shift towards an increased relative importance of chemistry compared to the accumulation mode number concentration is shown to exist somewhere between marine (~75 cm⁻³) and rural continental (~450 cm⁻³) aerosol regimes. Examination of in-situ measurements from the Marine Stratus/Stratocumulus Experiment (MASE II) shows that for air masses with higher number concentrations of accumulation mode (Dp = 60-120 nm) particles (~450 cm⁻³), an accurate simulation of the measured droplet size distribution requires an accurate representation of the particle chemistry. The chemistry is relatively more important than the accumulation mode particle number concentration, and similar in importance to the particle mean radius. This result is somewhat at odds with current theory, which suggests that chemistry can be ignored in all but the most polluted environments. Under anthropogenic influence, we must consider particle chemistry also in marine environments that may be deemed relatively clean. The MCMC algorithm can successfully reproduce the observed marine stratocumulus droplet size distributions. However, optimising towards the broadness of the measured droplet size distribution resulted in a discrepancy between the updraft velocity and the mean radius/geometric standard deviation of the accumulation mode. This suggests that a dynamical process is missing from the pseudo-adiabatic cloud parcel model. / At the time of the doctoral defense, the following papers were unpublished: Paper 3 (submitted); Paper 4 (manuscript).
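Neither the cloud parcel model nor the exact self-adaptive sampler is reproducible from the abstract, but the mechanics of a self-adapting random-walk Metropolis algorithm can be sketched on an invented two-parameter forward model (every function, value, and noise level below is illustrative only):

```python
import numpy as np

def forward(theta):
    # Invented stand-in for the cloud parcel model: maps (updraft velocity,
    # accumulation-mode mean radius) to two droplet-spectrum summary numbers.
    w, r = theta
    return np.array([np.log1p(w) * r, r / (1.0 + w)])

truth = np.array([0.3, 0.05])
obs = forward(truth)          # noise-free synthetic observation
sigma = 0.01                  # assumed observational uncertainty

def log_post(theta):
    if np.any(theta <= 0.0):  # flat prior restricted to positive parameters
        return -np.inf
    resid = (forward(theta) - obs) / sigma
    return -0.5 * resid @ resid

def adaptive_metropolis(theta0, n=20_000, target=0.234, seed=0):
    """Random-walk Metropolis whose proposal scale self-adapts (with a
    diminishing adaptation rate) toward the canonical ~23% acceptance."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    lp, scale = log_post(theta), 0.05
    chain = np.empty((n, theta.size))
    for k in range(n):
        prop = theta + scale * rng.standard_normal(theta.size)
        lp_prop = log_post(prop)
        accept = np.log(rng.random()) < lp_prop - lp
        if accept:
            theta, lp = prop, lp_prop
        scale *= np.exp((accept - target) / (k + 1) ** 0.6)  # Robbins-Monro step
        chain[k] = theta
    return chain

chain = adaptive_metropolis([1.0, 0.1])
print(chain[5_000:].mean(axis=0))  # should sit near the synthetic truth (0.3, 0.05)
```

The diminishing adaptation rate (k + 1)^-0.6 tunes the proposal scale early on and then effectively freezes it, preserving the correct stationary distribution.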
|
9 |
Statistical Methods for Functional Genomics Studies Using Observational Data. Lu, Rong. 15 December 2016.
No description available.
|