21 |
Analysis of the uncertainties in the IAEA/WHO TLD postal dose audit programme. Hultqvist, Martha. January 2006.
The International Atomic Energy Agency (IAEA) and the World Health Organisation (WHO) operate the IAEA/WHO TLD postal dose audit programme. The purpose of the programme is to verify the beam calibration in radiotherapy centres in developing countries and to check the Secondary Standards Dosimetry Laboratories (SSDLs). Thermoluminescence dosimeters (TLDs) are used as transfer dosimeters and their evaluation is done at the IAEA Dosimetry Laboratory. In the present work the uncertainties in the process of dose determination from TLD readings have been evaluated. The analysis comprises the TLD reading reproducibility, uncertainties in the calibration coefficient, and uncertainties in the factors correcting for fading of the TL signal, the influence of the TLD holder, energy response, and dose response non-linearity. The individual uncertainties were combined to estimate the total uncertainty in the dose evaluated from TLD readings. Experimental data from 2001-2005 were used in the analysis. The total uncertainty has been estimated to be 1.2 % for irradiations with 60Co gamma rays and 1.6 % for irradiations with high-energy X-rays. Results from irradiations by the Bureau International des Poids et Mesures (BIPM), Primary Standard Dosimetry Laboratories (PSDLs), SSDLs, and reference centres compare favourably with the estimated uncertainties. The largest uncertainty components are in the energy correction factor (for high-energy X-rays), with a value of 1.1 %, and in the dose response non-linearity correction factor, with a value of 0.9 %. It has been shown that the acceptance limits of 5 % for TLD results for hospitals and 3.5 % for SSDLs are justified when related to the uncertainties in the dose calculations and the uncertainty in the determination of absorbed dose to water at the centre, as discussed in IAEA TRS-398 (IAEA, 2000), provided that the code of practice is followed.
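As a rough illustration of how such individual components combine into a total uncertainty, the following sketch adds relative standard uncertainties in quadrature, as is standard practice; only the 1.1 % energy-correction and 0.9 % non-linearity values come from the abstract above, while the remaining component values are assumed placeholders.

```python
import math

# Relative standard uncertainties (%) of the components entering the dose
# evaluated from a TLD reading.  Only the energy-correction and non-linearity
# values are those quoted above for high-energy X-rays; the rest are
# illustrative placeholders, not values from the thesis.
components = {
    "reading reproducibility": 0.5,        # assumed
    "calibration coefficient": 0.5,        # assumed
    "fading correction": 0.3,              # assumed
    "holder correction": 0.2,              # assumed
    "energy correction (X-rays)": 1.1,     # from the abstract
    "dose-response non-linearity": 0.9,    # from the abstract
}

# Independent components combine in quadrature (root sum of squares).
combined = math.sqrt(sum(u ** 2 for u in components.values()))
print(f"combined relative standard uncertainty: {combined:.1f} %")  # about 1.6 %
```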
|
22 |
Uncertainty Analysis and the Identification of the Contaminant Transport and Source Parameters for a Computationally Intensive Groundwater Simulation. Yin, Yong. January 2009.
Transport parameter estimation and contaminant source identification are critical steps in the development of a physically based groundwater contaminant transport model. Due to the irreversibility of the dispersion process, the calibration of a transport model of interest is inherently ill-posed, and very sensitive to the simplification employed in the development of the lumped models. In this research, a methodology for the calibration of physically based computationally intensive transport models was developed and applied to a case study, the Reich Farm Superfund site in Toms River, New Jersey.
Using HydroGeoSphere, a physically based, transient, three-dimensional, computationally intensive groundwater flow model with spatially and temporally varying recharge was developed. Because of convergence problems when implementing the saturation-permeability curve (van Genuchten equation) in large-scale models with coarse discretization, a novel flux-based method was developed to determine unsaturated-zone solutions for the soil-water-retention models. The parameters for the flow system were determined separately from the parameters for the contaminant transport model. The contaminant transport and source parameters were estimated using both approximately 15 years of TCE concentration data from continuous well records and approximately 30 years of data from traditional monitoring wells, and the estimates were compared using optimization with two heuristic search algorithms (DDS and MicroGA) and a gradient-based multi-start PEST.
The contaminant transport model calibration results indicate that, overall, multi-start PEST performs best in terms of the final best objective function values for an equal number of function evaluations. Multi-start PEST was also employed to identify contaminant transport and source parameters under different scenarios, including spatially and temporally varying recharge and averaged recharge. For the detailed, transient flow model with spatially and temporally varying recharge, the estimated transverse dispersivity coefficients were significantly smaller than those reported in the literature for the more traditional approach that uses steady-state flow with averaged, less physically based recharge values. Finally, based on Latin Hypercube sampling, a methodology for comprehensive uncertainty analysis, which accounts for multiple parameter sets and the associated correlations, was developed and applied to the case study.
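A minimal sketch of the Latin Hypercube sampling step mentioned above, using SciPy; the parameter names, bounds, and model response are hypothetical stand-ins rather than values from the Reich Farm study.

```python
import numpy as np
from scipy.stats import qmc

# Hypothetical transport parameters and plausible ranges; the names and bounds
# are illustrative, not the calibrated values from the case study.
names = ["longitudinal_dispersivity", "transverse_dispersivity", "source_mass_rate"]
lower = np.array([1.0, 0.01, 0.1])
upper = np.array([50.0, 1.0, 10.0])

sampler = qmc.LatinHypercube(d=len(names), seed=42)
unit_samples = sampler.random(n=200)                 # stratified samples in [0, 1)^d
param_sets = qmc.scale(unit_samples, lower, upper)   # rescale to parameter bounds

def run_transport_model(params):
    """Placeholder for a call to the expensive transport simulator."""
    # In practice this would write input files, run the simulator, and return
    # a predicted concentration; here a dummy response is used.
    return params[0] * 0.1 + params[2]

predictions = np.array([run_transport_model(p) for p in param_sets])
print("5th-95th percentile prediction band:", np.percentile(predictions, [5, 95]))
```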
|
24 |
Evaluating and developing parameter optimization and uncertainty analysis methods for a computationally intensive distributed hydrological model. Zhang, Xuesong. 15 May 2009.
This study focuses on developing and evaluating efficient and effective parameter calibration and uncertainty analysis methods for hydrologic modeling. Five single-objective optimization algorithms and six multi-objective optimization algorithms were tested for automatic parameter calibration of the SWAT model. A new multi-objective optimization method (Multi-objective Particle Swarm Optimization and Genetic Algorithms) that combines the strengths of different optimization algorithms was proposed. Based on the evaluation of the performance of the different algorithms on three test cases, the new method consistently performed better than, or close to, the other algorithms.
In order to save the effort of running the computationally intensive SWAT model, a support vector machine (SVM) was used as a surrogate to approximate the behavior of SWAT. It was shown that combining SVM with Particle Swarm Optimization can reduce the effort of parameter calibration of SWAT. Further, SVM was used as a surrogate to implement parameter uncertainty analysis of SWAT. The results show that SVM helped save more than 50% of the runs of the computationally intensive SWAT model.
The effect of model structure on the uncertainty estimation of streamflow simulation was examined by applying SWAT and Neural Network models. The 95% uncertainty intervals estimated by SWAT include only 20% of the observed data, while those estimated by Neural Networks include more than 70%. This indicates that model structure is an important source of uncertainty in hydrologic modeling and needs to be evaluated carefully. The effect of different treatments of model-structure uncertainty on hydrologic modeling was explored further by applying four types of Bayesian Neural Networks. By considering the uncertainty associated with model structure, the Bayesian Neural Networks can provide a more reasonable quantification of the uncertainty of streamflow simulation. This study stresses the need for improved understanding and quantification of the different uncertainty sources for effective estimation of the uncertainty of hydrologic simulation.
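A minimal sketch of the surrogate idea described above: an SVM regressor trained on a handful of expensive model runs is used to screen candidate parameter sets before committing to full simulations. The parameter dimensions, objective surface, and scikit-learn settings are illustrative assumptions, not the thesis configuration.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Suppose the expensive SWAT model has already been run for a modest number of
# parameter sets (rows of X), recording an objective value such as the
# Nash-Sutcliffe efficiency (y).  The arrays below are synthetic stand-ins.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(100, 6))     # 6 calibration parameters, scaled to [0, 1]
y = 1.0 - np.sum((X - 0.5) ** 2, axis=1)     # dummy objective surface

surrogate = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.01))
surrogate.fit(X, y)

# Screen a large pool of candidate parameter sets with the cheap surrogate and
# forward only the most promising ones to full SWAT runs.
candidates = rng.uniform(0.0, 1.0, size=(10_000, 6))
predicted = surrogate.predict(candidates)
best = candidates[np.argsort(predicted)[-20:]]   # top 20 candidates to simulate
print("candidates forwarded to the full model:", best.shape)
```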
|
25 |
Reliability Analysis of Special Protection Systems. Hsieh, Chen-An. 28 July 2005.
Because of economic and regulatory constraints, serious accidents cannot be tolerated in the power systems that support modern society. In order to enhance system reliability, many types of special protection systems (SPS) have been implemented by utilities around the world. One of the main concerns in the design of an SPS is whether the designed system can achieve the reliability requirement. Currently, the literature that discusses SPS reliability is scarce. In this thesis, a comparison of several techniques suitable for performing reliability assessment of SPS is presented. The reliability models discussed include reliability block diagrams, fault tree analysis, Markov modeling, and Monte Carlo simulation. In order to understand the effect of input-data uncertainty on the calculated system reliability, Monte Carlo sampling is used in this study to take the uncertainty of the input parameters into account in the system modeling. To deal with the problem of not being able to reach the reliability requirement after uncertainty analysis, a sensitivity analysis is proposed to analyze the importance of the components involved in the system. Sensitivity analysis can be used to identify the component that is most effective in enhancing SPS reliability. A Taipower SPS is used in this thesis to demonstrate the proposed reliability assessment methods.
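A minimal sketch of the Monte Carlo approach described above, propagating input-parameter uncertainty through a hypothetical series/parallel SPS block structure and adding a crude component-importance measure; all failure probabilities and the system layout are assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n_trials = 100_000

# Hypothetical SPS structure: a sensing unit and a logic unit in series,
# followed by two redundant tripping channels.  The nominal failure
# probabilities and their spreads are assumed, not Taipower data.
p_sense = rng.normal(0.010, 0.002, n_trials).clip(0.0, 1.0)
p_logic = rng.normal(0.005, 0.001, n_trials).clip(0.0, 1.0)
p_trip  = rng.normal(0.020, 0.005, n_trials).clip(0.0, 1.0)

def system_unavailability(ps, pl, pt):
    """Series sensing/logic stage followed by two redundant tripping channels."""
    return 1.0 - (1.0 - ps) * (1.0 - pl) * (1.0 - pt ** 2)

p_fail = system_unavailability(p_sense, p_logic, p_trip)
print("mean system unavailability:", p_fail.mean())
print("90% interval from input uncertainty:", np.percentile(p_fail, [5, 95]))

# Crude importance measure: reduction in mean unavailability when one
# component is made perfectly reliable.
zero = np.zeros(n_trials)
for name, args in [("sensing",  (zero, p_logic, p_trip)),
                   ("logic",    (p_sense, zero, p_trip)),
                   ("tripping", (p_sense, p_logic, zero))]:
    gain = p_fail.mean() - system_unavailability(*args).mean()
    print(f"improvement if {name} never fails: {gain:.5f}")
```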
|
26 |
Development of an ArcGIS interface and design of a geodatabase for the soil and water assessment tool. Valenzuela Zapata, Milver Alfredo. 30 September 2004.
This project presents the development and design of a comprehensive interface coupled with a geodatabase (ArcGISwat 2003) for the Soil and Water Assessment Tool (SWAT). SWAT is a hydrologically distributed, lumped-parameter model that runs on a continuous time step. The quantity and detail of the spatial and hydrologic data involved in its input and output make SWAT highly complex. A new interface that manages the input/output (I/O) process was developed using the Geodatabase object model and concepts from hydrological data models such as ArcHydro. It also incorporates uncertainty analysis into the modeling process. The interface aims to facilitate direct communication and integration with other hydrologic models, consequently increasing efficiency and reducing modeling time. A case study is presented to demonstrate a common watershed-modeling task that utilizes SWAT and ArcGIS-SWAT2003.
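A minimal sketch of the keyed, related-table idea behind such a geodatabase design, written here as plain Python dataclasses; the actual ArcGISwat 2003 schema uses ESRI feature classes and relationship classes, and the fields shown are hypothetical.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical, simplified schema: subbasins and hydrologic response units
# (HRUs) linked by a key, mirroring the related tables an I/O interface
# resolves when writing SWAT input files.
@dataclass
class Subbasin:
    subbasin_id: int
    area_ha: float
    outlet_reach_id: int

@dataclass
class HRU:
    hru_id: int
    subbasin_id: int       # foreign key into Subbasin
    landuse: str
    soil_type: str
    slope_class: str

def hrus_for_subbasin(hrus: List[HRU], subbasin_id: int) -> List[HRU]:
    """The kind of lookup the interface performs when assembling SWAT inputs."""
    return [h for h in hrus if h.subbasin_id == subbasin_id]

hrus = [HRU(1, 10, "AGRL", "silt_loam", "0-5%"), HRU(2, 10, "URBN", "clay", "0-5%")]
print(hrus_for_subbasin(hrus, 10))
```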
|
27 |
Financial and risk assessment and selection of health monitoring system design options for legacy aircraft. Esperon Miguez, Manuel. 10 1900.
Aircraft operators demand ever increasing availability of their fleets together with constant reduction of their operational costs. With the age of many fleets measured in decades, the options for facing these challenges are limited. Integrated Vehicle Health Management (IVHM) uses data gathered through sensors in the aircraft to assess the condition of components in order to detect and isolate faults or even estimate their Remaining Useful Life (RUL). This information can then be used to improve the planning of maintenance operations, and even logistics and operational planning, resulting in shorter maintenance stops and lower cost. Retrofitting health monitoring technology onto legacy aircraft has the capability to deliver what operators and maintainers demand, but working on aging platforms presents numerous challenges. This thesis presents a novel methodology to select the combination of diagnostic and prognostic tools for legacy aircraft that best suits the stakeholders' needs based on economic return and financial risk. The methodology comprises different steps in which a series of quantitative analyses are carried out to reach an objective solution. Beginning with the identification of which components could bring the greatest reduction of maintenance cost and time if monitored, the methodology also provides a method to define the requirements for diagnostic and prognostic tools capable of monitoring these components. It then continues by analysing how combining these tools affects the economic return and financial risk. Each possible combination is analysed to identify which of them should be retrofitted. Whilst computer models of maintenance operations can be used to analyse the effect of retrofitting IVHM technology on a legacy fleet, the number of possible combinations of diagnostic and prognostic tools is too large for this approach to be practicable. Nevertheless, computer models can go beyond the economic analysis performed thus far, and simulations are used as part of the methodology to gain an insight into other effects of retrofitting the chosen toolset.
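A minimal sketch of ranking diagnostic/prognostic tool combinations by economic return and downside risk, in the spirit of the methodology described above; the tools, costs, and savings distributions are invented for illustration, and discounting is omitted.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(7)

# Hypothetical candidate health-monitoring tools:
# (name, retrofit cost, mean annual saving, saving standard deviation).
# Figures are invented purely to illustrate the ranking idea.
tools = [
    ("fuel-pump diagnostic",      80_000, 30_000, 12_000),
    ("actuator prognostic",      120_000, 45_000, 25_000),
    ("avionics BIT enhancement",  50_000, 15_000,  5_000),
]

def evaluate(toolset, years=10, n=20_000):
    """Monte Carlo net benefit of a tool combination over a service period."""
    cost = sum(t[1] for t in toolset)
    savings = sum(rng.normal(t[2], t[3], n) for t in toolset) * years
    net = savings - cost
    return net.mean(), (net < 0).mean()   # expected return and downside risk

for r in range(1, len(tools) + 1):
    for combo in combinations(tools, r):
        mean_net, p_loss = evaluate(combo)
        names = " + ".join(t[0] for t in combo)
        print(f"{names}: expected net benefit {mean_net:,.0f}, P(loss) {p_loss:.2%}")
```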
|
28 |
An exploration of building design and optimisation methods using Kriging meta-modelling. Wood, Michael James. January 2016.
This thesis investigates the application of Kriging meta-modelling techniques in the field of building design and optimisation. In conducting this research, there were two key motivating factors. The first is the need for building designers to have tools that allow low-energy buildings to be designed in a fast and efficient manner. The second is the need for optimisation tools that account, or help account, for the wide variety of uses that a building might have; so-called Robust Optimisation (RO). This thesis therefore includes an analysis of Kriging meta-modelling, first applied to simple building problems. I then use this simple building model to determine the effect of the updated UK Test Reference Years (TRYs) on energy consumption. Second, I examine Kriging-based optimisation techniques for a single objective. I then revisit the single-building meta-model to examine the effect of uncertainty on a neighbourhood of buildings and compare the results to the output of a brute-force analysis using a full building simulator. The results show that Kriging emulation is an effective tool for creating a meta-model of a building. The subsequent analysis of the effect of TRYs shows that UK buildings are likely to use less heating in the future but are likely to overheat more. In the final two chapters I use the techniques developed to create a robust building optimisation algorithm, as well as using Kriging to improve the optimisation efficiency of the well-known NSGA-II algorithm. I show that the Kriging-based robust optimiser finds more robust solutions than traditional global optimisation, and that Kriging techniques can be used to augment NSGA-II so that it finds more diverse solutions to some types of multi-objective optimisation problems. The Kriging-enhanced NSGA-II algorithm has been shown to speed up the convergence of some multi-objective optimisation problems significantly. Although further work is required to verify the results for a wider variety of building applications, the initial results are promising; Kriging has significant potential in this field, and I reveal many potential areas of future research.
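A minimal sketch of the Kriging (Gaussian process) emulation idea using scikit-learn; the design variables, the synthetic response standing in for a building simulator, and the kernel settings are illustrative assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern, WhiteKernel

rng = np.random.default_rng(3)

# Training data: design variables for a simple building (insulation thickness
# in m, window-to-wall ratio) and a simulated annual heating demand.  The
# response here is a synthetic stand-in for a full building simulator.
X = rng.uniform([0.05, 0.1], [0.40, 0.6], size=(40, 2))
y = 120.0 - 150.0 * X[:, 0] + 60.0 * X[:, 1] + rng.normal(0.0, 1.0, 40)

kernel = Matern(nu=2.5, length_scale=[0.1, 0.1]) + WhiteKernel(noise_level=1.0)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

# The Kriging emulator returns both a prediction and its uncertainty, which is
# what makes it useful for robust optimisation and infill-point selection.
candidate = np.array([[0.30, 0.25]])
mean, std = gp.predict(candidate, return_std=True)
print(f"predicted heating demand: {mean[0]:.1f} kWh/m2 (+/- {std[0]:.1f})")
```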
|
29 |
Qualitative and Quantitative Procedure for Uncertainty Analysis in Life Cycle Assessment of Wastewater Solids Treatment Processes. Alyaseri, Isam. 01 May 2014.
In order to perform environmental analysis and identify the best management options for wastewater treatment processes using the life cycle assessment (LCA) method, uncertainty in LCA has to be evaluated. A qualitative and quantitative procedure was constructed to deal with uncertainty in wastewater treatment LCA studies during the inventory and analysis stages. The qualitative steps in the procedure include setting rules for the inclusion of inputs and outputs in the life cycle inventory (LCI), setting rules for the proper collection of data, identifying and conducting data collection analysis for the significant contributors in the model, evaluating data quality indicators, selecting the proper life cycle impact assessment (LCIA) method, evaluating the uncertainty in the model through different cultural perspectives, and comparing with other LCIA methods. The quantitative steps in the procedure include assigning the best-guess value and the proper distribution for each input or output in the model, calculating the uncertainty for those inputs or outputs based on data characteristics and the data quality indicators, and finally using probabilistic analysis (Monte Carlo simulation) to estimate uncertainty in the outcomes. Environmental burdens from the solids handling unit at Bissell Point Wastewater Treatment Plant (BPWWTP) in Saint Louis, Missouri were analyzed. Plant-specific data plus literature data were used to build an input-output model. The environmental performance of the existing treatment scenario (dewatering-multiple hearth incineration-ash to landfill) was analyzed. To improve the environmental performance, two alternative scenarios (fluid bed incineration and anaerobic digestion) were proposed, constructed, and evaluated. System boundaries were set to include the construction, operation and dismantling phases. The impact assessment method chosen was Eco-indicator 99 and the impact categories were: carcinogenicity, respiratory organics and inorganics, climate change, radiation, ozone depletion, ecotoxicity, acidification-eutrophication, and minerals and fossil fuels depletion. Analysis of the existing scenario shows that most of the impacts came from the operation phase, in the categories related to fossil fuel depletion, respiratory inorganics, and carcinogens, owing to the energy consumed and the emissions from incineration. The proposed alternatives showed better performance than the existing treatment, with fluid bed incineration performing better than anaerobic digestion. Uncertainty analysis showed a 57.6% probability that fluid bed incineration has less environmental impact than anaerobic digestion. Based on single-score ranking in the Eco-indicator 99 method, the environmental impact order is: multiple hearth incineration > anaerobic digestion > fluid bed incineration. This order was the same for the three model perspectives in the Eco-indicator 99 method and when using other LCIA methods (Eco-point 97 and CML 2000). The study showed that incorporating qualitative/quantitative uncertainty analysis into LCA gives more information than deterministic LCA and can strengthen the LCA study. The procedure tested in this study showed that Monte Carlo simulation can be used to quantify uncertainty in wastewater treatment studies, and the procedure can be used to analyze the performance of other treatment options.
Although the analyses using different perspectives and different LCIA methods did not change the ranking of the scenarios, they showed a possibility of variation in the final outcomes for some categories. The study showed the importance of providing decision makers with the best and worst possible outcomes of any LCA study and informing them about the perspectives and assumptions used in the assessment. Because the probability distributions of the outcomes overlap, Monte Carlo simulation can perform uncertainty analysis in comparative LCA only between two products or scenarios, based on the (A-B) approach; it is recommended that the approach be extended to include more than two scenarios.
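A minimal sketch of the (A-B) comparative approach mentioned above: the difference between two scenarios is sampled directly so that shared uncertainty cancels, and the probability that one scenario outperforms the other is reported; all distributions are illustrative placeholders, not data from the BPWWTP study.

```python
import numpy as np

rng = np.random.default_rng(11)
n = 50_000

# Single-score environmental impacts (arbitrary eco-points) for two treatment
# scenarios that share a common electricity burden; all distributions are
# illustrative placeholders.
shared_electricity = rng.lognormal(mean=np.log(40.0), sigma=0.20, size=n)
fluid_bed = shared_electricity + rng.normal(15.0, 6.0, n)   # scenario A
digestion = shared_electricity + rng.normal(18.0, 7.0, n)   # scenario B

# (A - B) approach: sample the *difference*, so the shared uncertainty cancels,
# then report the probability that scenario A performs better.
diff = fluid_bed - digestion
p_a_better = (diff < 0).mean()
print(f"P(fluid bed incineration has lower impact): {p_a_better:.1%}")
```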
|
30 |
Sensitivity of Stormwater Management Solutions to Spatial Scale. Barich, Jeffrey Michael. 01 June 2014.
Urbanization has considerably altered the natural hydrology of urban watersheds by increasing runoff volume, producing higher and faster peak flows, and reducing water quality. Efforts to minimize or avoid these impacts, for example by implementing low impact development (LID) practices, are gaining momentum. Designing effective and economical stormwater management practices at a watershed scale is challenging; LIDs are commonly designed at site scales, considering local hydrologic conditions (i.e., one LID at a time). A number of empirical studies have documented hydrologic and water quality improvements achieved by LIDs. However, the watershed-scale effectiveness of LIDs has not been well studied. Considering cost, effort, and practicality, computer modeling is the only viable approach to assessing LID performance at a watershed scale. As such, the United States Environmental Protection Agency's Stormwater Management Model (SWMM) was selected for this study. It is well recognized that model predictions are plagued by uncertainties that arise from the lack of quality data and the inadequacy of the model to accurately simulate the watershed. To scrutinize the sensitivity of prediction accuracy to spatial resolution, four SWMM models of different spatial detail were developed for the Ballona Creek watershed, a highly urbanized watershed in the Los Angeles Basin, as a case study. Detailed uncertainty analyses were carried out for each model to quantify their prediction uncertainties and to examine whether a detailed model improves prediction accuracy. Results show that there is a limit to the prediction accuracy achieved by using detailed models. Three of the four models (i.e., all but the least detailed model) produced comparable prediction accuracy. This implies that devoting substantial resources to collecting very detailed data and building fine-resolution watershed models may not be necessary, as models of moderate detail could suffice. If confirmed using other urban watersheds, this result could benefit stormwater managers and modelers. All four SWMM models were then used to evaluate the hydrologic effectiveness of implementing bioretention cells at a watershed scale. For event-based analyses, 24-hour storms with 1-year, 2-year, 5-year, and 10-year return periods were considered, and data from October 2005 to March 2010 were used for a continuous simulation. The runoff volume reductions achieved by implementing bioretention cells were not substantial for the event storms. For the continuous simulation analysis, however, reductions in runoff volume of about twenty percent were predicted. These results are in line with previous studies that have reported the ineffectiveness of LIDs in reducing runoff volume and peak flow for less frequent but high-intensity storm events.
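A minimal sketch of how prediction uncertainty might be compared across models of different spatial detail, using coverage and width of the 95% prediction band; the observed series and ensembles are synthetic stand-ins, not SWMM output from the Ballona Creek models.

```python
import numpy as np

rng = np.random.default_rng(5)
observed = rng.gamma(shape=2.0, scale=3.0, size=200)   # stand-in observed flows

# Stand-in ensembles of predicted flows for four models of increasing spatial
# detail (e.g., from behavioural parameter sets); all values are synthetic.
ensembles = {
    "coarse":   observed[None, :] * rng.normal(1.0, 0.40, (500, 200)),
    "medium":   observed[None, :] * rng.normal(1.0, 0.25, (500, 200)),
    "detailed": observed[None, :] * rng.normal(1.0, 0.22, (500, 200)),
    "finest":   observed[None, :] * rng.normal(1.0, 0.21, (500, 200)),
}

# Two common uncertainty metrics: coverage of the 95% prediction band and its
# average width.  Comparable numbers for the finer models would indicate a
# limit to the accuracy gained from additional spatial detail.
for name, sims in ensembles.items():
    lower, upper = np.percentile(sims, [2.5, 97.5], axis=0)
    coverage = np.mean((observed >= lower) & (observed <= upper))
    width = np.mean(upper - lower)
    print(f"{name:9s} coverage={coverage:.2f}  mean band width={width:.2f}")
```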
|