About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Sensitivity Analysis of Models with Input Codependencies

Dougherty, Sean 05 December 2013 (has links)
It is commonplace in statistics to assume that a set of variates is independent and normally distributed. In this thesis, we consider the consequences of these assumptions as they pertain to global sensitivity analysis. We begin by illustrating how the notion of sensitivity becomes distorted in the presence of codependent model inputs. This observation motivates us to develop a new methodology that accommodates input codependencies. Our methodology can be summarized in three points: first, a new form of sensitivity is presented which performs as well as the classical form but can be obtained at a fraction of the computational cost. Second, we define a measure which quantifies the extent of distortion caused by codependent inputs. The third point concerns the modelling of these codependencies. The multivariate normal distribution is a natural choice for modelling codependent inputs; however, our methodology uses a copula-based approach instead. Copulas are a contemporary strategy for constructing multivariate distributions whereby the marginal and joint behaviours are treated separately. As a result, a practitioner has more flexibility when modelling inputs. / Thesis (Master, Chemical Engineering) -- Queen's University, 2013-12-05 10:16:26.81
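As a rough illustration of the copula-based approach described in this abstract (not code from the thesis), the sketch below samples two codependent inputs with a Gaussian copula so that the dependence structure and the marginal distributions are specified separately. The correlation value and the gamma and lognormal marginals (and their parameters) are arbitrary choices for illustration.

```python
# Minimal sketch: codependent inputs via a Gaussian copula (illustrative only).
import numpy as np
from scipy.stats import norm, gamma, lognorm

rng = np.random.default_rng(0)
n = 10_000

# 1) Dependence structure: correlated standard normals from a target correlation matrix.
corr = np.array([[1.0, 0.6],
                 [0.6, 1.0]])
z = rng.multivariate_normal(mean=[0.0, 0.0], cov=corr, size=n)

# 2) Map to uniforms through the normal CDF (this is the Gaussian copula).
u = norm.cdf(z)

# 3) Apply arbitrary marginals via their inverse CDFs (illustrative choices).
x1 = gamma.ppf(u[:, 0], a=2.0, scale=1.5)   # gamma-distributed input
x2 = lognorm.ppf(u[:, 1], s=0.5)            # lognormal input

# sample (Pearson) correlation of the transformed inputs
print("sample correlation:", np.corrcoef(x1, x2)[0, 1])
```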
2

A Practical and Fast Numerical Method for Calculating Global Sensitivity with Examples from Supply Chain and Measurement Applications

Groves, William Alan 02 August 2023 (has links)
No description available.
3

Barrier Island Morphodynamic Insights from Applied Global Sensitivity Analysis and Decadal Exploratory Modeling

Hoagland, Steven William Harvey 02 October 2024 (has links)
Barrier islands serve as valuable resources for coastal communities by reducing backbarrier flooding, providing wildlife habitat, and creating local economic activity through opportunities for recreation and tourism. Because the benefits of these islands are linked to their morphology, coastal resource planners must consider what management alternatives will maximize these benefits, considering both short- and long-term goals. Recent advances in long-term computational modeling of barrier island, marsh, and lagoon systems have created opportunities for gaining additional insights into the morphodynamics of these systems, which may help planners make better-informed coastal management decisions. In this series of studies, a recently developed long-term barrier-marsh-lagoon model is evaluated to better understand system morphodynamics and applied to a real barrier island system in the mid-Atlantic to understand its vulnerabilities and the potential impacts of management alternatives. In the first study, a comprehensive review of advances in barrier island morphodynamic modeling was presented. In the second study, a global sensitivity analysis method, the Sobol method, was used to explore the parameter space of the barrier-marsh-lagoon model. Morphodynamic insights drawn from this study include the significant influence of initial barrier geometry, the combination of parameters required for short-term drowning to occur, and the significant role of tidal dispersion in backbarrier sediment dynamics. In the third study, five global sensitivity analysis methods were evaluated based on their ability to rank parameters, their convergence to stable results, and their reliability. Groups of the most significant parameters were generally identified by all methods; however, the Morris method outperformed all others, especially in its ability to converge and in its reliability. VARS performed second best, on average, with better convergence and reliability results than the Sobol method and lower simulation counts. In the fourth study, the long-term model was applied to a mid-Atlantic barrier island and used to assess the island's vulnerabilities to sea level rise, overwash, and the impact of coastal management alternatives. Thin-layer placement and beach nourishment were found to be effective at sustaining the marsh and minimizing island retreat, respectively. / Doctor of Philosophy / Barrier islands help coastal communities by reducing flooding, providing wildlife habitat, and creating local economic activity through opportunities for recreation and tourism. Because the benefits of these islands are linked to their form, decision-makers must think about how to manage these islands to help the community both now and in the future. Recent advances in computer modeling of barrier islands, and the adjacent marshes and lagoons, over decades to hundreds of years have created opportunities for us to learn more about how these systems behave over time, which may help planners make better-informed coastal management decisions. In this series of studies, a recently developed computer model of the barrier island, marsh, and lagoon is evaluated to learn how the system changes over time and applied to a real barrier island system in the mid-Atlantic to understand its vulnerabilities and the potential impacts of management alternatives. In the first study, a comprehensive review of advances in computer modeling of barrier island changes over time was presented.
In the second study, the impact of the model parameters and their combinations with one another was explored using the Sobol global sensitivity analysis method, which is widely considered to be the standard method in practice. Significant insights into the system behavior drawn from this study include the influence of initial barrier geometry, the combination of parameters required for the barrier to be overcome by sea level in the short term, and the role of sediment delivered behind the island through tidal inlets. In the third study, five global sensitivity analysis methods were evaluated based on their ability to rank parameters, the number of computer simulations that were required, the ability of a method to arrive at a conclusive answer, and the consistency of a method in providing an answer. Groups of the most significant parameters were generally identified by all methods; however, the Morris method exceeded all others in its ability to find conclusive and consistent answers, owing to its ability to identify unimportant parameters. VARS performed second best, on average, with a better ability to find conclusive and consistent answers with fewer computer simulations than the Sobol method. In the fourth study, the long-term computer model was applied to a mid-Atlantic barrier island and used to assess the island's vulnerabilities to sea level rise, overwash (when water flows over the dunes), and the impact of coastal management alternatives. Placing thin layers of additional sediment on top of the marsh platforms and extending the shoreline toward the ocean by placing additional sediment on the beach were found to be effective at sustaining the marsh and minimizing movement of the barrier island landward, respectively.
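For readers unfamiliar with the Sobol method referenced in the second and third studies, the following minimal sketch estimates first-order and total-effect Sobol indices with a plain Monte Carlo pick-freeze (Saltelli-style) estimator. It uses the standard Ishigami test function as a stand-in integrand rather than the barrier-marsh-lagoon model, and the sample size and seed are arbitrary.

```python
# Sketch: first-order (S1) and total-effect (ST) Sobol indices on a toy function.
import numpy as np

def ishigami(x, a=7.0, b=0.1):
    # classic sensitivity-analysis test function with three uniform inputs
    return np.sin(x[:, 0]) + a * np.sin(x[:, 1]) ** 2 + b * x[:, 2] ** 4 * np.sin(x[:, 0])

rng = np.random.default_rng(1)
n, d = 50_000, 3
A = rng.uniform(-np.pi, np.pi, size=(n, d))   # two independent sample matrices
B = rng.uniform(-np.pi, np.pi, size=(n, d))

fA, fB = ishigami(A), ishigami(B)
var_y = np.var(np.concatenate([fA, fB]))      # total output variance

for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                       # "freeze" every input except x_i
    fABi = ishigami(ABi)
    S1 = np.mean(fB * (fABi - fA)) / var_y            # first-order index
    ST = 0.5 * np.mean((fA - fABi) ** 2) / var_y      # total-effect index
    print(f"x{i + 1}: S1 ~ {S1:.2f}, ST ~ {ST:.2f}")
```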
4

Efficient Computational Methods for Structural Reliability and Global Sensitivity Analyses

Zhang, Xufang 25 April 2013 (has links)
Uncertainty analysis of a system response is an important part of engineering probabilistic analysis. It includes: (a) evaluating moments of the response; (b) evaluating the reliability of the system; (c) assessing the complete probability distribution of the response; and (d) conducting parametric sensitivity analysis of the output. The actual model of the system response is usually a high-dimensional function of the input variables. Although Monte Carlo simulation is a very general approach for this purpose, it may require an inordinate amount of resources to achieve an acceptable level of accuracy. Development of a computationally efficient method is therefore of great importance. First, the study proposed a moment method for uncertainty quantification of structural systems. A key departure is the use of fractional moments of the response function, as opposed to the integer moments used in the literature so far. The advantage of using fractional moments over integer moments was illustrated through the relation of one fractional moment to several integer moments. With a small number of samples to compute the fractional moments, the system output distribution was estimated with the principle of maximum entropy (MaxEnt) in conjunction with constraints specified in terms of fractional moments. Compared to classical MaxEnt, a novel feature of the proposed method is that the fractional exponent of the MaxEnt distribution is determined through the entropy maximization process, instead of being assigned a priori by the analyst. To further minimize the computational cost of the simulation-based entropy method, a multiplicative dimensional reduction method (M-DRM) was proposed to compute the fractional (integer) moments of a generic function of multiple input variables. The M-DRM can accurately approximate a high-dimensional function as the product of a series of low-dimensional functions. Together with the principle of maximum entropy, a novel computational approach was proposed to assess the complete probability distribution of a system output. The accuracy and efficiency of the proposed method for structural reliability analysis were verified by crude Monte Carlo simulation of several examples. Application of the M-DRM was further extended to variance-based global sensitivity analysis of a system. Compared to local sensitivity analysis, the variance-based sensitivity index provides information about the significance of an input random variable. Since each component variance is defined as a conditional expectation with respect to the system model function, the separable nature of the M-DRM approximation can simplify the high-dimensional integrations in sensitivity analysis. Several examples were presented to illustrate the numerical accuracy and efficiency of the proposed method in comparison to the Monte Carlo simulation method. The last contribution of the study is the development of a computationally efficient method for polynomial chaos expansion (PCE) of a system's response. This PCE model can later be used for uncertainty analysis. However, evaluating the coefficients of a PCE meta-model is a computationally demanding task due to the high-dimensional integrations involved. With the proposed M-DRM, this computational cost can be remarkably reduced compared to the classical methods in the literature (simulation or tensor Gauss quadrature).
Accuracy and efficiency of the proposed method for polynomial chaos expansion were verified by considering several practical examples.
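A minimal sketch of the multiplicative dimensional reduction idea described above, under simplifying assumptions: independent standard normal inputs, an arbitrary positive test function (not from the thesis), and Gauss-Hermite quadrature for the one-dimensional expectations. The response is approximated as a product of univariate "cuts" through a reference point, so a fractional moment factors into one-dimensional integrals.

```python
# Sketch: fractional moments E[Y**alpha] via an M-DRM-style product approximation.
import numpy as np

def model(x1, x2, x3):
    # illustrative, strictly positive response function (fractional moments need Y > 0)
    return np.exp(0.3 * x1) + 0.5 * x2 ** 2 + np.exp(0.2 * x3)

n_inputs = 3
c = np.zeros(n_inputs)            # reference ("cut") point: the input means
y0 = model(*c)

# Gauss-Hermite nodes/weights, rescaled for expectations under N(0, 1)
nodes, weights = np.polynomial.hermite.hermgauss(15)
z = np.sqrt(2.0) * nodes
w = weights / np.sqrt(np.pi)      # weights now sum to 1

def frac_moment(alpha):
    """M-DRM estimate of E[Y**alpha] for independent N(0,1) inputs."""
    est = y0 ** (alpha * (1 - n_inputs))
    for i in range(n_inputs):
        pts = np.tile(c, (len(z), 1))
        pts[:, i] = z                             # vary only input i along the cut
        yi = np.array([model(*p) for p in pts])
        est *= np.sum(w * yi ** alpha)            # one-dimensional expectation
    return est

for a in (0.5, 1.0, 1.5):
    print(f"E[Y^{a}] ~ {frac_moment(a):.4f}")
```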
5

Global sensitivity analysis of reactor parameters / Bolade Adewale Adetula

Adetula, Bolade Adewale January 2011 (has links)
Calculations of reactor parameters of interest (such as neutron multiplication factors, decay heat, and reaction rates) are often based on models which depend on groupwise neutron cross sections. The uncertainties associated with these neutron cross sections are propagated to the final result of the calculated reactor parameters. There is a need to characterize this uncertainty and to be able to apportion the uncertainty in a calculated reactor parameter to the different sources of uncertainty in the groupwise neutron cross sections; this procedure is known as sensitivity analysis. The focus of this study is the application of a modified global sensitivity analysis technique to calculations of reactor parameters that depend on groupwise neutron cross sections. Sensitivity analysis can help in identifying the important neutron cross sections for a particular model, and also helps in establishing best-estimate, optimized nuclear reactor physics models with reduced uncertainties. In this study, the approach to sensitivity analysis is similar to the variance-based global sensitivity analysis technique, which is robust, has a wide range of applicability and provides accurate sensitivity information for most models. However, this technique requires the input variables to be mutually independent. A modification to this technique, which allows one to deal with input variables that are block-wise correlated and normally distributed, is presented. The implementation of the modified technique involves the calculation of multi-dimensional integrals, which can be prohibitively expensive to compute. Numerical techniques specifically suited to the evaluation of multidimensional integrals, namely Monte Carlo, quasi-Monte Carlo and sparse grid methods, are used, and their efficiency is compared. The modified technique is illustrated and tested on a two-group cross-section-dependent problem. In all the cases considered, the results obtained with sparse grids achieved much better accuracy while using a significantly smaller number of samples. / Thesis (M.Sc. Engineering Sciences (Nuclear Engineering))--North-West University, Potchefstroom Campus, 2011.
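The comparison of integration techniques mentioned above can be illustrated with a toy example. The sketch below (not from the thesis) estimates a six-dimensional integral with plain Monte Carlo and with a scrambled Sobol' sequence from scipy.stats.qmc; the Sobol' g-function is used as a stand-in integrand because its exact integral is 1, and the dimension, coefficients and sample size are arbitrary choices.

```python
# Sketch: Monte Carlo vs quasi-Monte Carlo estimation of a multidimensional integral.
import numpy as np
from scipy.stats import qmc

def g_function(u, a):
    # separable test integrand on [0, 1]^d whose exact integral is 1
    return np.prod((np.abs(4.0 * u - 2.0) + a) / (1.0 + a), axis=1)

d = 6
a = np.array([0.0, 0.5, 3.0, 9.0, 99.0, 99.0])   # controls how important each dimension is

rng = np.random.default_rng(2)
u_mc = rng.random((2 ** 12, d))                                    # 4096 pseudo-random points
u_qmc = qmc.Sobol(d=d, scramble=True, seed=2).random_base2(m=12)   # 4096 low-discrepancy points

print("exact integral      : 1.0")
print("Monte Carlo estimate:", g_function(u_mc, a).mean())
print("quasi-MC estimate   :", g_function(u_qmc, a).mean())
```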
6

Sensitivity analysis of biochemical systems using high-throughput computing

Kent, Edward Lander January 2013 (has links)
Mathematical modelling is playing an increasingly important role in helping us to understand biological systems. The construction of biological models typically requires the use of experimentally-measured parameter values. However, varying degrees of uncertainty surround virtually all parameters in these models. Sensitivity analysis is one of the most important tools for the analysis of models, and shows how the outputs of a model, such as concentrations and reaction fluxes, are dependent on the parameters which make up the input. Unfortunately, small changes in parameter values can lead to the results of a sensitivity analysis changing significantly. The results of such analyses must therefore be interpreted with caution, particularly if a high degree of uncertainty surrounds the parameter values. Global sensitivity analysis methods can help in such situations by allowing sensitivities to be calculated over a range of possible parameter values. However, these techniques are computationally expensive, particularly for larger, more detailed models. Software was developed to enable a number of computationally-intensive modelling tasks, including two global sensitivity analysis methods, to be run in parallel in a high-throughput computing environment. The use of high-throughput computing enabled the run time of these analyses to be drastically reduced, allowing models to be analysed to a degree that would otherwise be impractical or impossible. Global sensitivity analysis using high-throughput computing was performed on a selection of both theoretical and physiologically-based models. Varying degrees of parameter uncertainty were considered. These analyses revealed instances in which the results of a sensitivity analysis were valid, even under large degrees of parameter variation. Other cases were found for which only a slight change in parameter values could completely change the results of the analysis. Parameter uncertainties are a real problem in biological systems modelling. This work shows how, with the help of high-throughput computing, global sensitivity analysis can become a practical part of the modelling process.
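The high-throughput pattern described above is essentially embarrassingly parallel: the same model is evaluated independently at many sampled parameter sets. The sketch below shows the idea on a single multi-core machine with Python's multiprocessing; the placeholder run_model function, the parameter ranges and the sample count are illustrative assumptions, not the thesis software or its HTC environment.

```python
# Sketch: distributing many independent model evaluations across worker processes.
import numpy as np
from multiprocessing import Pool

def run_model(params):
    # placeholder for one expensive simulation run (e.g. a kinetic model)
    k1, k2, k3 = params
    return k1 * np.exp(-k2) / (1.0 + k3)

def main():
    rng = np.random.default_rng(3)
    # sample each uncertain parameter uniformly over an assumed range
    samples = rng.uniform([0.5, 0.5, 0.5], [1.5, 1.5, 1.5], size=(10_000, 3))
    with Pool() as pool:                            # one worker per available core
        outputs = pool.map(run_model, samples, chunksize=100)
    outputs = np.asarray(outputs)
    print("mean output:", outputs.mean(), "std:", outputs.std())

if __name__ == "__main__":                          # guard required for multiprocessing
    main()
```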
7

Polynomial Chaos Expansion in Bio- and Structural Mechanics

Szepietowska, Katarzyna 12 October 2018 (has links)
This thesis presents a probabilistic approach to modelling the mechanics of materials and structures in which the modelled performance is influenced by uncertainty in the input parameters. The work is interdisciplinary and the methods described are applied to medical and civil engineering problems. The motivation for this work was the need for mechanics-based approaches in the modelling and simulation of implants used in the repair of ventral hernias. Many uncertainties appear in the modelling of the implant-abdominal wall system. The probabilistic approach proposed in this thesis enables these uncertainties to be propagated to the output of the model and their respective influences to be investigated. The regression-based polynomial chaos expansion method is used here. However, the accuracy of such non-intrusive methods depends on the number and location of sampling points. Finding a universal method to achieve a good balance between accuracy and computational cost is still an open question, so different approaches are investigated in this thesis in order to choose an efficient method. Global sensitivity analysis is used to investigate the respective influences of input uncertainties on the variation of the outputs of different models. The uncertainties are propagated to the implant-abdominal wall models in order to draw some conclusions important for further research. Using the expertise acquired from the biomechanical models, the methodology is then applied to the modelling of historic timber joints and the simulation of their mechanical behaviour. Such an investigation is important owing to the need for efficient planning of repairs and renovation of buildings of historical value.
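A minimal sketch of regression-based (non-intrusive) polynomial chaos expansion, under simplifying assumptions: two independent standard normal inputs, a total-degree-2 Hermite basis, and an arbitrary stand-in response function rather than the implant-abdominal wall model. Coefficients are fitted by ordinary least squares at randomly chosen sampling points, and the PCE mean and variance are then read directly from the coefficients.

```python
# Sketch: least-squares fitting of a small polynomial chaos expansion.
import numpy as np

def model(x):
    return np.exp(0.2 * x[:, 0]) * (1.0 + 0.5 * x[:, 1])   # stand-in response

def hermite_design(x):
    # probabilists' Hermite polynomials up to total degree 2 in two variables
    He1 = lambda t: t
    He2 = lambda t: t ** 2 - 1.0
    x1, x2 = x[:, 0], x[:, 1]
    return np.column_stack([np.ones(len(x)),
                            He1(x1), He1(x2),
                            He2(x1), He2(x2),
                            He1(x1) * He1(x2)])

rng = np.random.default_rng(4)
x_train = rng.standard_normal((200, 2))        # experimental design (sampling points)
y_train = model(x_train)

coeffs, *_ = np.linalg.lstsq(hermite_design(x_train), y_train, rcond=None)

# PCE mean and variance follow from the coefficients and the basis norms
# (for unnormalised He_n, E[He_n(Z)^2] = n!, so He2 has squared norm 2)
norms_sq = np.array([1.0, 1.0, 1.0, 2.0, 2.0, 1.0])
print("PCE mean    :", coeffs[0])
print("PCE variance:", np.sum(coeffs[1:] ** 2 * norms_sq[1:]))

x_test = rng.standard_normal((5, 2))
print("model:", model(x_test))
print("PCE  :", hermite_design(x_test) @ coeffs)
```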
8

Statistical Methods for Functional Genomics Studies Using Observational Data

Lu, Rong 15 December 2016 (has links)
No description available.
9

Sensitivity analysis and evolutionary optimization for building design

Wang, Mengchao January 2014 (has links)
In order to achieve global carbon reduction targets, buildings must be designed to be energy efficient. Building performance simulation methods, together with sensitivity analysis and evolutionary optimization methods, can be used to generate design solutions and performance information that can be used in identifying energy- and cost-efficient design solutions. Sensitivity analysis is used to identify the design variables that have the greatest impacts on the design objectives and constraints. Multi-objective evolutionary optimization is used to find a Pareto set of design solutions that optimize the conflicting design objectives while satisfying the design constraints, building design being an inherently multi-objective process. For instance, there is commonly a desire to minimise both the building energy demand and capital cost while maintaining thermal comfort. Sensitivity analysis has previously been coupled with model-based optimization in order to reduce the computational effort of running a robust optimization and to provide insight into the solution sensitivities in the neighbourhood of each optimum solution. However, there has been little research conducted to explore the extent to which the solutions found from a building design optimization can be used for a global or local sensitivity analysis, or the extent to which the local sensitivities differ from the global sensitivities. It has also been common for the sensitivity analysis to be conducted using continuous variables, whereas building optimization problems are more typically formulated using a mixture of discretised-continuous variables (with physical meaning) and categorical variables (without physical meaning). This thesis investigates three main questions: the form of global sensitivity analysis most appropriate for use with problems having mixed discretised-continuous and categorical variables; the extent to which samples taken from an optimization run can be used in a global sensitivity analysis, given that the optimization process causes these solutions to be biased; and the extent to which global and local sensitivities are different. The experiments conducted in this research are based on the mid-floor of a commercial office building with five zones, located in Birmingham, UK. The optimization and sensitivity analysis problems are formulated with 16 design variables, including orientation, heating and cooling setpoints, window-to-wall ratios, start and stop times, and construction types. The design objectives are the minimisation of both energy demand and capital cost, with solution infeasibility being a function of occupant thermal comfort. It is concluded that a robust global sensitivity analysis can be achieved using stepwise regression with bidirectional elimination, rank transformation of the variables and the BIC (Bayesian information criterion). It is concluded that, when the optimization is based on a genetic algorithm, solutions taken from the start of the optimization process can be reliably used in a global sensitivity analysis, and therefore there is no need to generate a separate set of random samples for use in the sensitivity analysis. The extent to which the convergence of the variables during the optimization can be used as a proxy for the variable sensitivities has also been investigated.
It is concluded that it is not possible to identify the relative importance of variables through the optimization, even though the most important variable exhibited fast and stable convergence. Finally, it is concluded that differences exist in the variable rankings resulting from the global and local sensitivity methods, although the top-ranked solutions from each approach tend to be the same. It is also concluded that the sensitivity of the objectives and constraints to all variables is obtainable through a local sensitivity analysis, but that a global sensitivity analysis is only likely to identify the most important variables. The repeatability of these conclusions has been investigated and confirmed by applying the methods to the example design problem with the building located in four different climates (Birmingham, UK; San Francisco, US; and Chicago, US).
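A rough sketch of the kind of regression-based global sensitivity analysis concluded to be robust above, with two simplifications: a synthetic design problem stands in for the building model, and forward selection is used instead of full bidirectional elimination. Inputs and output are rank-transformed, and variables are added one at a time for as long as the BIC keeps improving.

```python
# Sketch: rank-transformed stepwise regression screened by BIC (forward selection).
import numpy as np
from scipy.stats import rankdata

def bic(X, y):
    # BIC of an ordinary least-squares fit under a Gaussian error assumption
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    n, k = X.shape
    return n * np.log(rss / n) + k * np.log(n)

rng = np.random.default_rng(5)
n, d = 300, 8
X = rng.random((n, d))                                # 8 candidate design variables
y = 3 * X[:, 0] + np.sin(6 * X[:, 2]) + 0.5 * X[:, 5] ** 3 + 0.1 * rng.standard_normal(n)

# rank transformation captures monotonic but non-linear effects
Xr = np.column_stack([rankdata(X[:, j]) for j in range(d)])
yr = rankdata(y)

selected, remaining = [], list(range(d))
current_bic = bic(np.ones((n, 1)), yr)                # intercept-only model
improved = True
while improved and remaining:
    improved = False
    scores = [(bic(np.column_stack([np.ones(n), Xr[:, selected + [j]]]), yr), j)
              for j in remaining]
    best_bic, best_j = min(scores)
    if best_bic < current_bic:                        # add the variable that lowers BIC most
        selected.append(best_j)
        remaining.remove(best_j)
        current_bic = best_bic
        improved = True

print("variables in order of entry (proxy importance ranking):", selected)
```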
