21

An exploration of building design and optimisation methods using Kriging meta-modelling

Wood, Michael James January 2016 (has links)
This thesis investigates the application of Kriging meta-modelling techniques in the field of building design and optimisation. Two key factors motivated this research. The first is the need for building designers to have tools that allow low-energy buildings to be designed quickly and efficiently. The second is the need for optimisation tools that account, or help account, for the wide variety of uses that a building might have; so-called Robust Optimisation (RO). This thesis therefore includes an analysis of Kriging meta-modelling, first applied to simple building problems. I then use this simple building model to determine the effect of the updated UK Test Reference Years (TRYs) on energy consumption. Second, I examine Kriging-based optimisation techniques for a single objective. I then revisit the single-building meta-model to examine the effect of uncertainty on a neighbourhood of buildings, comparing the results to the output of a brute-force analysis with a full building simulator. The results show that Kriging emulation is an effective tool for creating a meta-model of a building. The subsequent analysis of the effect of TRYs on buildings shows that UK buildings are likely to use less heating in the future but are also likely to overheat more. In the final two chapters I use the techniques developed to create a robust building optimisation algorithm, and I use Kriging to improve the optimisation efficiency of the well-known NSGA-II algorithm. I show that the Kriging-based robust optimiser finds more robust solutions than traditional global optimisation. I also show that Kriging techniques can be used to augment NSGA-II so that it finds more diverse solutions to some types of multi-objective optimisation problems. The results show that Kriging has significant potential in this field, and I identify many potential areas of future research.
The thesis also shows how a Kriging-enhanced variant of NSGA-II can improve on the standard algorithm, speeding up convergence significantly on some multi-objective optimisation problems. Although further work is required to verify the results for a wider variety of building applications, the initial results are promising.
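The core of the approach above, a Kriging (Gaussian-process) meta-model that interpolates expensive simulator runs and reports its own predictive uncertainty, can be sketched in a few lines. All values here (the correlation parameter `theta`, the nugget, the toy sine response standing in for a building simulator) are illustrative assumptions, not taken from the thesis:

```python
import numpy as np

def kriging_fit(X, y, theta=10.0, nugget=1e-6):
    # Correlation matrix under a squared-exponential (Gaussian) model,
    # with a small nugget for numerical stability.
    R = np.exp(-theta * (X[:, None] - X[None, :]) ** 2) + nugget * np.eye(len(X))
    L = np.linalg.cholesky(R)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))  # R^{-1} y
    return X, alpha, L, theta

def kriging_predict(model, x_new):
    X, alpha, L, theta = model
    r = np.exp(-theta * (x_new[:, None] - X[None, :]) ** 2)
    mean = r @ alpha                                     # predictive mean
    v = np.linalg.solve(L, r.T)
    var = np.maximum(1.0 - np.sum(v * v, axis=0), 0.0)   # predictive variance
    return mean, var

# Toy stand-in for a building simulator: energy vs. one design variable.
X = np.linspace(0.0, 1.0, 8)
y = np.sin(2.0 * np.pi * X)
model = kriging_fit(X, y)
mean, var = kriging_predict(model, X)   # interpolates the training runs
```

In a Kriging-based optimiser, the predictive variance is what drives infill sampling: new simulator runs are placed where an acquisition criterion such as expected improvement (a function of `mean` and `var`) is largest.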
22

QUALITATIVE AND QUANTITATIVE PROCEDURE FOR UNCERTAINTY ANALYSIS IN LIFE CYCLE ASSESSMENT OF WASTEWATER SOLIDS TREATMENT PROCESSES

Alyaseri, Isam 01 May 2014 (has links)
To perform environmental analysis and identify the best management option for wastewater treatment processes using the life cycle assessment (LCA) method, uncertainty in LCA has to be evaluated. A qualitative and quantitative procedure was constructed to deal with uncertainty in wastewater treatment LCA studies during the inventory and analysis stages. The qualitative steps in the procedure include setting rules for the inclusion of inputs and outputs in the life cycle inventory (LCI), setting rules for the proper collection of data, identifying and conducting data collection analysis for the significant contributors in the model, evaluating data quality indicators, selecting the proper life cycle impact assessment (LCIA) method, evaluating the uncertainty in the model through different cultural perspectives, and comparing with other LCIA methods. The quantitative steps include assigning the best-guess value and the proper distribution for each input or output in the model, calculating the uncertainty for those inputs or outputs based on data characteristics and the data quality indicators, and finally using probabilistic analysis (Monte Carlo simulation) to estimate uncertainty in the outcomes. Environmental burdens from the solids handling unit at Bissell Point Wastewater Treatment Plant (BPWWTP) in Saint Louis, Missouri were analyzed. Plant-specific data plus literature data were used to build an input-output model. The environmental performance of an existing treatment scenario (dewatering-multiple hearth incineration-ash to landfill) was analyzed. To improve environmental performance, two alternative scenarios (fluid bed incineration and anaerobic digestion) were proposed, constructed, and evaluated. System boundaries were set to include the construction, operation and dismantling phases.
The impact assessment method chosen was Eco-indicator 99, and the impact categories were: carcinogenicity, respiratory organics and inorganics, climate change, radiation, ozone depletion, ecotoxicity, acidification-eutrophication, and minerals and fossil fuels depletion. Analysis of the existing scenario shows that most of the impacts came from the operation phase, in the categories related to fossil fuels depletion, respiratory inorganics, and carcinogens, due to energy consumed and emissions from incineration. The proposed alternatives showed better performance than the existing treatment, and fluid bed incineration performed better than anaerobic digestion. Uncertainty analysis showed a 57.6% probability that fluid bed incineration has less environmental impact than anaerobic digestion. Based on single-score ranking in the Eco-indicator 99 method, the environmental impact order is: multiple hearth incineration > anaerobic digestion > fluid bed incineration. This order was the same for the three model perspectives in the Eco-indicator 99 method and when using other LCIA methods (Eco-point 97 and CML 2000). The study showed that incorporating qualitative/quantitative uncertainty analysis into LCA provides more information than deterministic LCA and can strengthen the LCA study. The procedure tested here showed that Monte Carlo simulation can be used to quantify uncertainty in wastewater treatment studies, and it can be applied to analyze the performance of other treatment options. Although analysis under different perspectives and different LCIA methods did not change the order of the scenarios, it showed a possibility of variation in the final outcomes of some categories. The study demonstrated the importance of providing decision makers with the best and worst possible outcomes of any LCA study, and of informing them about the perspectives and assumptions used in the assessment.
Monte Carlo simulation can perform uncertainty analysis in comparative LCA only between two products or scenarios at a time, using the (A-B) approach, because the probability distributions of the outcomes overlap. Modifying the approach to handle more than two scenarios is recommended.
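The (A-B) approach mentioned above compares two scenarios by Monte Carlo sampling their paired difference rather than their overlapping marginal distributions. A minimal sketch follows; the impact models, the shared upstream energy term, and all distribution parameters are hypothetical, not values from this study:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical single-score impacts sharing a common upstream energy term,
# as two scenarios would in a real inventory (all numbers are made up).
energy = rng.lognormal(mean=0.0, sigma=0.2, size=n)
impact_a = 0.9 * energy + rng.normal(0.0, 0.05, n)   # e.g. fluid bed incineration
impact_b = 1.0 * energy + rng.normal(0.0, 0.05, n)   # e.g. anaerobic digestion

# Judging the paired difference A - B avoids misreading the overlap of the
# two marginal distributions as indecision between the scenarios.
diff = impact_a - impact_b
p_a_better = float(np.mean(diff < 0.0))   # probability scenario A has less impact
```

Extending this to more than two scenarios, as recommended above, amounts to ranking all pairwise differences (or taking the per-sample argmin across scenarios) within the same Monte Carlo loop.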
23

Method for the Interpretation of RMR Variability Using Gaussian Simulation to Reduce the Uncertainty in Estimations of Geomechanical Models of Underground Mines

Rodriguez-Vilca, Juliet, Paucar-Vilcañaupa, Jose, Pehovaz-Alvarez, Humberto, Raymundo, Carlos, Mamani-Macedo, Nestor, Moguerza, Javier M. 01 January 2020 (has links)
The full text of this work is not available in the UPC Academic Repository due to restrictions imposed by the publisher. / The application of conventional techniques, such as kriging, to model rock mass is limited because such techniques do not consider rock mass spatial variability and heterogeneity. In this context, as an alternative, the Gaussian simulation technique is proposed to simulate rock mass spatial heterogeneity based on the rock mass rating (RMR) classification. This research proposes a methodology that includes a variographic analysis of the RMR in different directions to determine its anisotropic behavior. In a case study of an underground deposit in Peru, geomechanical record data compiled in the field were used. A total of 10 simulations were conducted, each with approximately 6 million simulated values. From these, an absolute mean error of only 3.82% was estimated, which is acceptable when compared with the 22.15% obtained with kriging.
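The contrast that motivates the method above, namely that simulation preserves spatial variability where kriging smooths it away, can be illustrated with an unconditional Gaussian simulation along a single drillhole. The spherical variogram parameters, grid, and realization count below are assumed for illustration only:

```python
import numpy as np

rng = np.random.default_rng(7)
x = np.arange(0.0, 50.0, 1.0)   # 1 m composites along a drillhole (assumed)
a, sill = 15.0, 1.0             # assumed variogram range and sill

# Covariance implied by a spherical variogram: C(h) = sill - gamma(h).
h = np.abs(x[:, None] - x[None, :])
C = np.where(h < a, sill * (1.0 - 1.5 * h / a + 0.5 * (h / a) ** 3), 0.0)

# Draw correlated realizations of the (normal-score) RMR field.
L = np.linalg.cholesky(C + 1e-8 * np.eye(len(x)))
sims = L @ rng.standard_normal((len(x), 200))   # 200 realizations

# Each realization honours the target variability; averaging realizations
# (a kriging-like smoother) destroys it, which is why simulation is preferred
# for reproducing rock-mass heterogeneity.
per_realization_var = float(sims.var(axis=0).mean())
smoothed_var = float(sims.mean(axis=1).var())
```

In practice the simulation is conditioned on the drillhole data (e.g. sequential Gaussian simulation after a normal-score transform of RMR), but the smoothing contrast shown here is the same.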
24

Modeling and Uncertainty Analysis of CCHP systems

Smith, Joshua Aaron 15 December 2012 (has links)
Combined Cooling Heating and Power (CCHP) systems have been recognized as a viable alternative to conventional electrical and thermal energy generation in buildings because of their high efficiency, low environmental impact, and power grid independence. Many researchers have presented models for comparing CCHP systems to conventional systems and for optimizing CCHP systems. However, many of the errors and uncertainties that affect these modeling efforts have not been adequately addressed in the literature. This dissertation will focus on the following key issues related to errors and uncertainty in CCHP system modeling: (a) detailed uncertainty analysis of a CCHP system model with novel characterization of weather patterns, fuel prices and component efficiencies; (b) sensitivity analysis of a method for estimating the hourly energy demands of a building using Department of Energy (DOE) reference building models in combination with monthly utility bills; (c) development of a practical technique for selecting the optimal Power Generation Unit (PGU) for a given building that is robust with respect to fuel cost and weather uncertainty; (d) development of a systematic method for integrated calibration and parameter estimation of thermal system models. The results from the detailed uncertainty analysis show that CCHP operational strategies can effectively be assessed using steady state models with typical year weather data. The results of the sensitivity analysis reveal that the DOE reference buildings can be adjusted using monthly utility bills to represent the hourly energy demands of actual buildings. The optimal PGU sizing study illustrates that the PGU can be selected for a given building in consideration of weather and fuel cost uncertainty. The results of the integrated parameter estimation study reveal that using the integrated approach can reduce the effect of measurement error on the accuracy of predictive thermal system models.
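As one concrete reading of item (c) above, PGU selection under fuel-price uncertainty can be framed as choosing the candidate size with the best expected (or high-quantile) operating cost over sampled prices. The candidate sizes, load, prices, and efficiency below are invented for illustration and are not from the dissertation:

```python
import numpy as np

rng = np.random.default_rng(0)
sizes_kw = np.array([50.0, 100.0, 150.0])   # candidate PGU capacities (assumed)
demand_kw = 90.0                            # average electric load (assumed)
grid_price = 0.15                           # $/kWh grid electricity (assumed)
eta = 0.35                                  # assumed PGU electric efficiency

# Sample uncertain fuel prices ($/kWh fuel), truncated away from zero.
fuel_price = rng.normal(0.04, 0.01, 5000).clip(0.01)

def hourly_cost(size_kw, fuel):
    gen = min(size_kw, demand_kw)           # PGU covers load up to capacity
    return (gen / eta) * fuel + (demand_kw - gen) * grid_price

costs = np.array([[hourly_cost(s, f) for f in fuel_price] for s in sizes_kw])
best_size = float(sizes_kw[np.argmin(costs.mean(axis=1))])  # expected-cost choice
```

A robust variant replaces the mean with a high quantile, e.g. `np.quantile(costs, 0.95, axis=1)`, so that the chosen size also hedges against expensive-fuel scenarios.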
25

A System Dynamics Approach for the Development of a Patient-Specific Protocol for Radioiodine Treatment of Graves' Disease

Merrill, Steven J 01 January 2009 (has links) (PDF)
The thyroid gland secretes hormones that help to govern metabolism and energy expenditure within the body [1]; these hormones also affect growth and development. As a result, the regulation of thyroid hormones is vital for maintaining an individual's well-being. Graves' disease is an autoimmune disorder and a major cause of hyperthyroidism, an overproduction of thyroid hormones. Radioactive iodine (RAI) therapy has become the preferred treatment, with typical RAI protocols based on the Marinelli-Quimby equation to compute the dose; however, up to 90% of subjects become hypothyroid within the first year after therapy. In this thesis we focus on the development of a new computational protocol for the calculation of RAI in the treatment of Graves' hyperthyroidism. The new protocol implements a two-compartment model to describe RAI kinetics in the body, which accounts for the conversion between the different RAI isotopes used in diagnostic and therapeutic applications. Thus, by using the measured response of the subject's thyroid to a test dose of 123I, the model predicts the amount of RAI (131I) needed to reduce, through ablation, the functional thyroid volume/mass to an amount that would result in a normal metabolic balance. A detailed uncertainty analysis was performed using both a standard propagation-of-error method and a simulation method; the simulation method consisted of both parametric and nonparametric bootstrapping techniques. Using clinical data consisting of activity kinetics and mass dynamics for 17 subjects, and measured final mass values for 7 of the 17, we were able to validate the protocol and quantify its uncertainty. This protocol is the basis of an ongoing pilot study in conjunction with Cooley Dickinson Hospital, Northampton, MA.
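A two-compartment RAI model of the kind described can be sketched as a pair of coupled rate equations (blood and thyroid compartments) with radioactive decay. The transfer rate constants `k12` and `k21` below are placeholder values, not the fitted clinical parameters; only the 131I half-life is a physical constant:

```python
import numpy as np

def simulate_rai(k12=0.03, k21=0.005, half_life_h=8.02 * 24.0,
                 t_end_h=240.0, dt=0.01):
    """Euler integration of blood (q1) and thyroid (q2) activity fractions."""
    lam = np.log(2.0) / half_life_h     # physical decay constant of 131I (per hour)
    q1, q2, peak = 1.0, 0.0, 0.0        # unit administered activity starts in blood
    for _ in range(int(t_end_h / dt)):
        dq1 = -(k12 + lam) * q1 + k21 * q2   # blood: loss to thyroid + decay
        dq2 = k12 * q1 - (k21 + lam) * q2    # thyroid: uptake - release - decay
        q1 += dq1 * dt
        q2 += dq2 * dt
        peak = max(peak, q2)
    return q1, q2, peak

q1_end, q2_end, peak_uptake = simulate_rai()
```

One natural way the isotope conversion could enter a sketch like this: the biological rate constants fitted from a 123I test dose carry over, while the physical decay term changes with the isotope's half-life.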
26

STRUCTURAL UNCERTAINTY IN HYDROLOGICAL MODELS

Abhinav Gupta (11185086) 28 July 2021 (has links)
All hydrological models incur various uncertainties that can be broadly classified into three categories: measurement, structural, and parametric uncertainties. Measurement uncertainty exists due to error in measurements of properties and variables (e.g. streamflows, which are typically an output, and rainfall, which serves as an input to hydrological models). Structural uncertainty exists due to errors in the mathematical representation of real-world hydrological processes. Parametric uncertainty exists due to structural and measurement uncertainty and the limited amount of data available for calibration.

Several studies have addressed measurement and parametric uncertainties, but studies on structural uncertainty are lacking. Specifically, no existing model can be used to quantify structural uncertainty at an ungauged location. This was the first objective of the study: to develop a model of structural uncertainty that can be used to quantify total uncertainty (including structural uncertainty) in streamflow estimates at ungauged locations in a watershed. The proposed model is based on the idea that, since the effect of structural uncertainty is to introduce a bias into parameter estimation, one way to accommodate structural uncertainty is to compensate for this bias. The developed model was applied to two watersheds: Upper Wabash Busseron Watershed (UWBW) and Lower Des Plaines Watershed (LDPW). For UWBW, mean daily streamflow data were used, while for LDPW mean hourly streamflow data were used. The proposed model worked well for mean daily data but failed to capture the total uncertainty for hourly data, likely because measurement uncertainties in hourly streamflow data are higher than was assumed in the study.

Once a hydrological and error model is specified, the next step is to estimate the model and error parameters. Parameter estimation in hydrological modeling may be carried out using either formal or informal Bayesian methodology. In the formal Bayesian methodology, a likelihood function, motivated by probability theory, is specified over a space of models (or residuals), and a prior probability distribution is assigned over the space of models. There has been significant debate on whether the likelihood functions used in Bayesian theory are justified in hydrological modeling, but relatively little attention has been given to the justification of prior probabilities. In most hydrological modeling studies, a uniform prior over hydrological model parameters is used to reflect a modeler's complete lack of knowledge about the parameters before calibration; such a prior is known as a non-informative prior. The second objective of this study was to scrutinize the assumption that a uniform prior is non-informative, using the principle of maximum information gain. This principle was used to derive non-informative priors for several hydrological models, and the obtained priors were significantly different from uniform. Further, the posterior distributions obtained with these priors differed significantly from those obtained with uniform priors.

The information about uncertainty in a modeling exercise is typically obtained from the residual time series (the difference between observed and simulated streamflows), which, for a fixed model parameter set, aggregates structural and measurement uncertainties. Using this residual time series, an estimate of total uncertainty may be obtained, but it is impossible to separate structural from measurement uncertainty. This separation is, however, required to facilitate the rejection of deficient model structures and to identify whether the model structure or the measurements need to be improved to reduce the total uncertainty. The only way to achieve this is to obtain an estimate of measurement uncertainty before model calibration. An estimate of measurement uncertainty in streamflow can be obtained by rating-curve analysis, but it is difficult to estimate measurement uncertainty in rainfall. In this study, the classic idea of repeated sampling is used to estimate measurement uncertainty in both rainfall and streamflow. In a repeated sampling scheme, an experiment is performed several times under identical conditions to estimate measurement uncertainty. Strictly repeated sampling is not possible for environmental observations, so it was approximated using a machine learning algorithm, random forest (RF). The main idea is to identify rainfall-runoff events across several different watersheds that are similar enough to be treated as different realizations of the same experiment performed under identical conditions. The uncertainty bounds obtained by RF were compared against those obtained by rating-curve analysis and the runoff-coefficient method. Overall, the results are encouraging for using RF as a pseudo repeated sampler.

In the fourth objective, the importance of uncertainty in estimated streamflows at ungauged locations, and in measured streamflows at gauged locations, is illustrated for water quality modeling. The results showed that it is not enough for an uncertainty bound to envelop the true streamflows: the individual realizations obtained from the uncertainty model should also emulate the shape of the true streamflow time series for water quality modeling.
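The second objective's finding, that a maximum-information (Jeffreys-type) prior differs sharply from a uniform one, is easy to reproduce on a toy model. For a one-parameter recession model q(t) = exp(-k t) with Gaussian errors, such a prior is proportional to the square root of the Fisher information. Everything below (the model, observation times, and error level) is a made-up example, not one of the thesis's hydrological models:

```python
import numpy as np

t = np.linspace(0.5, 10.0, 20)   # assumed observation times (days)
sigma = 0.1                      # assumed measurement standard deviation

def sqrt_fisher(k):
    # Gaussian errors: I(k) = sum_i (d f_i / d k)^2 / sigma^2,
    # with f_i(k) = exp(-k t_i), so df/dk = -t_i exp(-k t_i).
    dfdk = -t * np.exp(-k * t)
    return np.sqrt(np.sum(dfdk ** 2)) / sigma

ks = np.linspace(0.01, 2.0, 200)
prior = np.array([sqrt_fisher(k) for k in ks])
prior /= prior.sum() * (ks[1] - ks[0])   # normalize numerically on the grid

# Far from flat: the data are most informative about small recession
# constants, so the "non-informative" prior concentrates there.
flatness_ratio = float(prior.max() / prior.min())
```

A uniform prior over `ks` would make `flatness_ratio` equal to 1; here it is large, which mirrors the study's conclusion that a uniform prior is not actually non-informative for nonlinear hydrological models.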
27

DEVELOPMENT OF HYBRID APPROACHES FOR UNCERTAINTY QUANTIFICATION IN HYDROLOGICAL MODELING

Ghaith, Maysara January 2020 (has links)
Water is a scarce resource, especially as water demand increases significantly with rapid population growth. Hydrological modelling has gained a lot of attention, as it is the key to predicting water availability, optimizing the use of water resources, and developing risk mitigation schemes. There are still many challenges in hydrological modelling that researchers and designers are trying to solve. These challenges include, but are not limited to, the following: i) there is no single robust model that can perform well in all watersheds; ii) model parameters are often associated with uncertainty, which makes the results inconclusive; iii) the computational power required for uncertainty quantification increases with model complexity; iv) some modelling assumptions made to simplify computational complexity, such as parameter independence, are often not realistic. These challenges make it difficult to provide robust hydrological predictions and/or to quantify the uncertainties within hydrological models in an efficient and accurate way. This study aims to provide more robust hydrological predictions by developing a set of hybrid approaches. Firstly, a hybrid hydrological data-driven (HHDD) model is developed, based on the integration of a physically-based hydrological model (HYMOD) and a data-driven model (an artificial neural network, ANN). The HHDD model improves prediction accuracy and generates interval flow predictions. Secondly, a hybrid probabilistic forecasting approach is developed by linking the polynomial chaos expansion (PCE) method with an ANN. The results indicate that PCE-ANN can be as reliable as, but much more efficient than, the traditional Monte Carlo (MC) method for probabilistic flow forecasting. Finally, a hybrid uncertainty quantification approach that can address parameter dependence is developed through the integration of principal component analysis (PCA) with PCE.
The results from this dissertation research can provide valuable technical and decision support for hydrological modeling and water resources management under uncertainty. / Thesis / Doctor of Engineering (DEng) / There is a water scarcity problem in the world, so it is vital to have reliable decision support tools for effective water resources management. Researchers and decision-makers rely on hydrological modelling to predict water availability. Hydrological model results are then used for water resources allocation and risk mitigation. Hydrological modelling is not a simple process, as there are different sources of uncertainty associated with it, such as model structure, model parameters, and data. In this study, data-driven techniques are used with process-driven models to develop hybrid uncertainty quantification approaches for hydrological modelling. The overall objectives are: i) to generate more robust probabilistic forecasts; ii) to improve the computational efficiency for uncertainty quantification without compromising accuracy; and, iii) to overcome the limitations of current uncertainty quantification methods, such as parameter interdependency. The developed hybrid approaches can be used by decision-makers in water resources management, as well as risk assessment and mitigation.
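The PCE idea above, replacing expensive Monte Carlo sampling with a cheap polynomial surrogate in the random input, can be sketched for one standard-normal input using probabilists' Hermite polynomials. The response function below is a toy stand-in, not the HYMOD model:

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander

rng = np.random.default_rng(1)

def response(x):
    return np.exp(0.3 * x)   # toy stand-in for a hydrological response

# Regression-based PCE: fit degree-5 Hermite coefficients from 2000 samples.
xi = rng.standard_normal(2000)
Psi = hermevander(xi, 5)     # columns He_0(xi), ..., He_5(xi)
coef, *_ = np.linalg.lstsq(Psi, response(xi), rcond=None)

# He_k are orthogonal under N(0,1) with E[He_0] = 1 and E[He_k] = 0 for k > 0,
# so the surrogate's mean is simply the leading coefficient.
pce_mean = float(coef[0])
mc_mean = float(response(rng.standard_normal(200_000)).mean())
```

Once `coef` is known, moments and cheap resampling come almost for free, which is where the reported efficiency gain over plain Monte Carlo originates; the dissertation couples PCE with an ANN for forecasting and with PCA to handle dependent parameters.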
28

Uncertainty and Confidence Intervals of the Monte Carlo Ray-Trace Method in Radiation Heat Transfer

Sanchez, Maria Cristina 13 December 2002 (has links)
The primary objective of the work reported here is to develop a methodology to predict the uncertainty associated with radiation heat transfer problems solved using the Monte Carlo ray-trace method (MCRT). Four equations are developed to predict the uncertainty of the distribution factor from one surface to another, the global uncertainty of all the distribution factors in an enclosure, the uncertainty of the net heat flux from a surface, and the global uncertainty of the net heat flux from all the surfaces in an enclosure, respectively. Numerical experiments are performed to successfully validate these equations and to study the impact of various parameters such as the number of surfaces in an enclosure, the number of energy bundles traced in the MCRT model, the fractional uncertainty of emissivity and temperature, and the temperature distribution in the enclosure. Finally, the methodology is successfully applied to a detailed MCRT model of a CERES-like radiometer. / Ph. D.
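The first of the four uncertainty equations above has a simple statistical backbone: each energy bundle either arrives at surface j or it does not, so the estimated distribution factor is a binomial proportion. A sketch follows, with an assumed arrival probability `p_true` standing in for the actual enclosure geometry:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 100_000       # energy bundles emitted from surface i
p_true = 0.3      # assumed probability a bundle is absorbed at surface j

# Stand-in for the geometric/optical ray-trace test applied to each bundle.
hits = rng.random(N) < p_true
D_ij = float(hits.mean())   # distribution factor estimate N_ij / N

# Binomial counting statistics give the factor's standard uncertainty,
# which shrinks as 1/sqrt(N) with the number of bundles traced.
sigma_D = float(np.sqrt(D_ij * (1.0 - D_ij) / N))
```

This is why the number of bundles traced appears as a key parameter in the numerical experiments: halving `sigma_D` costs four times as many rays, before the emissivity and temperature uncertainties are propagated on top.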
29

Uncertainty analysis for runoff, crop yield, sediment, and nutrient loads in the Mississippi Delta region using APEX

Méndez Monroy, Javier Fernando 10 May 2024 (has links) (PDF)
Understanding the dynamics of agricultural basins has been difficult for decision-makers when developing cost-effective plans. An uncertainty analysis evaluates the impact of information gaps on a hydrologic model's output and performance. The Agricultural Policy/Environmental Extender (APEX v1501) was used to predict runoff, crop yield, sediment load, total phosphorus, and total nitrogen from agricultural fields in the Mississippi Delta, to investigate the impact of using different input variables (climate, soils, and management practices) on mechanistic models. Results indicated that surrogate information, such as weather data from nearby weather stations, a predominant soil series, and simulated irrigation schedules, can be considered when in situ information is restricted. Overall, the results provided information on model setup and output interpretation that may be useful to Mississippi Delta decision-makers.
30

BENCH-TOP VALIDATION OF INTELLIGENT MOUTH GUARD

Aksu, Alper 08 August 2013 (has links)
No description available.
