261

Conceptual Model Uncertainty in the Management of the Chi River Basin, Thailand

Nettasana, Tussanee 30 April 2012 (has links)
With increasing demand and pressures on groundwater resources, accurate and reliable groundwater prediction models are essential for sustainable groundwater management. Groundwater models are merely approximations of reality, and we are unable to either fully characterize or mathematically describe the true complexity of the hydrologic system; therefore, inherent in all models are varying degrees of uncertainty. A robust management policy should consider uncertainty in both the imprecise nature of conceptual/numerical models and their parameters. This study addresses the critical question of whether using multiple conceptual models to explicitly account for conceptual model uncertainty improves the ability of the models to assist in management decisions. Twelve unique conceptual models, characterized by three alternative geological interpretations, two recharge estimations, and two boundary condition implementations, were formulated to estimate sustainable extraction rates for Thailand's Thaphra Area, where increasing groundwater withdrawals may result in water-level decline and saline-water upconing. The models were developed with MODFLOW and calibrated using PEST against the same set of observed hydraulic head data. All of the models were found to reproduce the available head data reasonably well. To select the best among the alternative models, multiple criteria were defined and applied to evaluate the quality of the individual models. Models were found to perform differently with respect to different evaluation criteria, and it is unlikely that any single inter-model comparison criterion will ever be sufficient for general use. The chosen alternative models were applied both individually and jointly to quantify uncertainty in the groundwater management context. Different model-averaging methods were assessed in terms of their ability to assist in quantifying uncertainty in sustainable yield estimation. The twelve groundwater simulation models were additionally linked with optimization techniques to determine appropriate groundwater abstraction rates in the TPA Phu Thok aquifer. The management models aim to obtain maximal yields while protecting against water-level decline. Despite similar performance among the calibrated models, total sustainable yield estimates vary substantially with the conceptual model used, by a factor of 0.6 in total and by as much as a factor of 4 in individual management areas. The comparison results demonstrate that simple averaging performs better than formal, sophisticated averaging methods such as Maximum Likelihood Bayesian Model Averaging, and performs similarly to GLUE and combined multiple-criteria averaging methods for both validation testing and management applications, while being much simpler to implement and use and computationally far less demanding. The joint assessment of parameter and conceptual model uncertainty was performed by generating multiple realizations of random parameters from the feasible space of each calibrated model using a simple Monte Carlo approach. The multi-model averaging methods produce a higher percentage of predictive coverage than any individual model does. Using model-averaged predictions, lower optimal abstraction rates were obtained to minimize head-constraint violations, a result not achieved when a single best model is used with parameter uncertainty analysis alone.
Although accounting for all sources of uncertainty is very important in predicting environmental and management problems, the techniques available in the literature may be too computationally demanding and, in some cases, unnecessarily complex, particularly in data-poor systems. The methods presented here to account for the main sources of uncertainty provide a practical and comprehensive uncertainty analysis and can be applied to other case studies to provide reliable and accurate predictions for groundwater management applications.
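As a rough sketch of the equal-weight model averaging this abstract favours, the Python snippet below combines sustainable-yield predictions from an ensemble of calibrated models and derives a multi-model predictive interval from simple Monte Carlo parameter perturbations. All numbers, the perturbation scale, and the fit scores are invented for illustration; they are not the thesis's twelve Chi Basin models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sustainable-yield predictions (arbitrary units) from twelve
# calibrated conceptual models; values are illustrative only.
model_yields = np.array([52, 48, 60, 55, 47, 63, 50, 58, 45, 61, 49, 57.0])

# Simple (equal-weight) model averaging: every calibrated model counts equally.
simple_avg = model_yields.mean()

# A likelihood-weighted average (in the spirit of GLUE or MLBMA) would weight
# each model by a measure of calibration fit; these fit scores are invented.
fit_scores = rng.uniform(0.5, 1.0, size=model_yields.size)
weights = fit_scores / fit_scores.sum()
weighted_avg = weights @ model_yields

# Joint parameter + model uncertainty: random perturbations of each model's
# prediction stand in for sampling its feasible parameter space.
samples = model_yields[:, None] + rng.normal(0.0, 3.0, size=(model_yields.size, 1000))
lo, hi = np.percentile(samples, [2.5, 97.5])  # pooled multi-model interval

print(f"simple average: {simple_avg:.1f}  weighted average: {weighted_avg:.1f}")
print(f"95% multi-model predictive interval: [{lo:.1f}, {hi:.1f}]")
```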
262

Methodology for the conceptual design of a robust and opportunistic system-of-systems

Talley, Diana Noonan 18 November 2008 (has links)
Systems are becoming more complicated, complex, and interrelated. Designers have recognized the need to develop systems from a holistic perspective and design them as Systems-of-Systems (SoS). The design of an SoS, especially in the conceptual design phase, is generally characterized by significant uncertainty. As a result, it is possible for all three types of uncertainty (aleatory, epistemic, and error) and the associated factors of uncertainty (randomness, sampling, confusion, conflict, inaccuracy, ambiguity, vagueness, coarseness, and simplification) to affect the design process. While there are a number of existing SoS design methods, several gaps have been identified: the ability to model all of the factors of uncertainty at varying levels of knowledge; the ability to consider both the pernicious and propitious aspects of uncertainty; and the ability to determine the value of reducing the uncertainty in the design process. While there are numerous uncertainty modeling theories, no single theory can effectively model every kind of uncertainty. This research presents a Hybrid Uncertainty Modeling Method (HUMM) that integrates techniques from the following theories: Probability Theory, Evidence Theory, Fuzzy Set Theory, and Info-Gap Theory. The HUMM is capable of modeling all of the different factors of uncertainty and can model the uncertainty at multiple levels of knowledge. In the design process, there are both pernicious and propitious characteristics associated with the uncertainty. Existing design methods typically focus on developing robust designs that are insensitive to the associated uncertainty; they do not capitalize on the possibility of maximizing the potential benefit associated with the uncertainty. This research demonstrates how these deficiencies can be overcome by identifying the most robust and opportunistic design. In a design process, it is possible that the most robust and opportunistic design will not be selected from the set of potential design alternatives because of the related uncertainty. This research presents a process called the Value of Reducing Uncertainty Method (VRUM) that can determine the value associated with reducing the uncertainty in the design problem before a final decision is made, utilizing two concepts: the Expected Value of Reducing Uncertainty (EVRU) and the Expected Cost of Reducing Uncertainty (ECRU).
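The abstract defines EVRU and ECRU without giving their computation; the sketch below reads EVRU in the spirit of an expected value of perfect information, comparing the design chosen under uncertainty with per-scenario best choices. The payoff distributions and the ECRU figure are assumptions for illustration, not Talley's formulation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical payoffs for three design alternatives across Monte Carlo
# scenarios (rows = scenarios, columns = designs); all numbers are invented.
payoffs = rng.normal(loc=[100.0, 105.0, 95.0],
                     scale=[5.0, 25.0, 2.0], size=(10_000, 3))

# Deciding under uncertainty: commit now to the design with the best mean payoff.
value_now = payoffs.mean(axis=0).max()

# If the uncertainty were resolved before deciding, the best design could be
# picked scenario by scenario.
value_informed = payoffs.max(axis=1).mean()

evru = value_informed - value_now   # Expected Value of Reducing Uncertainty
ecru = 6.0                          # assumed Expected Cost of Reducing Uncertainty

print(f"EVRU = {evru:.2f}, ECRU = {ecru:.2f}")
print("gather more information" if evru > ecru else "decide with current knowledge")
```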
263

Prediction of ungauged basins - uncertain criteria conditioning, regionalization and multimodel methods

Wyatt, Adam January 2009 (has links)
Research Doctorate - Doctor of Philosophy (PhD) / The purpose of rainfall-runoff modelling, like all environmental modelling, is to generate simulations that accurately mimic the responses encountered in the system being modelled. Once this is achieved, the model may be used to study the catchment response under conditions that have not previously been observed, such as the determination of extreme flood levels. The complex behaviour of the processes involved in the generation of streamflow means that, to achieve a usable model, simplifications must be made. This inevitably introduces model error into the simulations, as these simplifications cannot reproduce the level of response variation encountered in a natural system. As a consequence, a model that performs well at some times may be inappropriate at others. The MultiModel approach is an alternative method of rainfall-runoff modelling that uses numerous alternative process descriptions to generate a suite of unique rainfall-runoff models. These models are calibrated and applied so that the simulated responses incorporate not only parameter variability but also model structure variability. It is shown that applying the MultiModel method to four test catchments produced simulated confidence limits that are much more likely than those of a single model to contain flood peaks beyond the range encountered during calibration. This is due to the wider confidence limits generated by the greater structural variability available to the MultiModel; the wider confidence limits are therefore a better reflection of our true understanding of the system being modelled. The prediction of ungauged basins presents an additional challenge to rainfall-runoff modelling. Most methods involve some form of regionalization of model parameters, and these approaches are limited in that they are restricted by model selection and application range. Two unique methods for the prediction of ungauged basins are presented that overcome these restrictions. The first conditions a rainfall-runoff model using uncertain criteria normally used as a supplement to more common calibration procedures. These criteria include estimates of flood peaks, baseflow, recession and saturated area. It is shown that combinations of these criteria provide a powerful means of constraining the parameter space and reducing the simulation uncertainty. The second approach to model conditioning for ungauged basins uses an alternative method of regionalization that focuses on estimating flow characteristics rather than model parameter values. Strong relationships between flow characteristics (such as runoff coefficients, flow duration curves and coefficients of variation) and catchment conditions (such as area, mean annual rainfall and evaporation) are identified for catchments across Australia. Using the estimated ranges of these flow characteristics as assessment criteria, a rainfall-runoff model is successfully conditioned to adequately reproduce the streamflow response of the four test catchments. In particular, it is shown that using numerous characteristics in tandem further improves the conditioning for the test catchments.
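A minimal sketch of the first method, conditioning on uncertain criteria: parameter sets for a stand-in rainfall-runoff model are retained only when the simulated flood peak and runoff coefficient fall inside regionalized acceptance ranges. The leaky-store model, the criterion ranges and the synthetic rainfall are all assumptions, not Wyatt's models or data.

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate(params, rain):
    """Stand-in rainfall-runoff model: a single leaky linear store."""
    k, loss = params
    store, flow = 0.0, []
    for r in rain:
        store += max(r - loss, 0.0)   # rainfall minus a fixed loss enters storage
        q = k * store                 # outflow proportional to storage
        store -= q
        flow.append(q)
    return np.asarray(flow)

rain = rng.gamma(0.5, 8.0, size=365)  # synthetic daily rainfall (mm)

# Uncertain criteria for an ungauged catchment: acceptance ranges rather than
# exact calibration targets. Ranges are illustrative.
peak_range = (20.0, 60.0)             # flood peak (mm/day)
runoff_coeff_range = (0.2, 0.5)       # total runoff / total rainfall

accepted = []
for _ in range(5_000):
    params = (rng.uniform(0.05, 0.9), rng.uniform(0.0, 4.0))  # (k, loss)
    q = simulate(params, rain)
    peak, rc = q.max(), q.sum() / rain.sum()
    # Retain the parameter set only if every uncertain criterion is met.
    if peak_range[0] <= peak <= peak_range[1] and \
       runoff_coeff_range[0] <= rc <= runoff_coeff_range[1]:
        accepted.append(params)

print(f"{len(accepted)} of 5000 parameter sets satisfy all criteria")
```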
264

Vibrational characteristics of structures with uncertainty

Lucas, Geoffrey Iain, Mechanical & Manufacturing Engineering, Faculty of Engineering, UNSW January 2008 (has links)
This thesis is concerned with the prediction of the vibro-acoustic response of structures with uncertain properties in the mid-frequency region. The motivation for this research is the growing need of engineers to understand the responses of groups of similar structures, ranging from vehicles, aircraft and aerospace structures to household whitegoods. These structures are complex in geometry and may possess variability in their material or geometric properties, as well as variation arising from assembly and manufacturing processes. Small variations can have a significant effect on the dynamic response of a structure, and the effect of structural uncertainties increases with frequency. Deterministic modelling techniques such as finite element analysis (FEA) are only suitable for modelling complex structures at low frequencies; furthermore, FEA cannot easily account for uncertainty or randomness in structural parameters. High-frequency predictive techniques such as Statistical Energy Analysis can account for structural uncertainty but are limited to structures with high modal density. Between the two methods lies a frequency range in which neither technique can be applied with great confidence. The objective of this thesis is to investigate predictive techniques for mid-frequency vibration analysis of dynamic systems with structural uncertainties. The first part of this work numerically characterises the effect of a range of uncertainties on the modal statistics of structures. The degree of uncertainty required to achieve universality of the statistical properties is investigated. This is achieved by examining the modal statistics of dynamic systems with a range of uncertainties, corresponding to uncertainty due to mass and stiffness perturbations, uncertainty at the boundaries of a structure, uncertainty in the coupling between structures, uncertainty in the material properties of a structure and uncertainty in the geometry of a structure. Several structures are examined: a plate with masses and/or linear springs added at random locations, a plate with torsional springs attached at random locations along its boundary edges, two plates coupled by linear springs at random locations, a mass-loaded coupled L-shaped plate, a mass-loaded frame-plate structure, and a plate with varying Young's modulus, density and thickness. The natural frequencies of these structures have been derived using the Lagrange-Rayleigh-Ritz technique, finite element analysis, or interval analysis in conjunction with FEA. The natural frequency statistics of structures with uncertain properties are observed using two statistical measures: the statistical overlap factor and the probability density function of the spacing between successive natural frequencies. The statistical overlap factor is defined by the variation in a natural frequency from its mean value measured across an ensemble of nominally identical structures with uncertainty. For a single ensemble member, the probability density function of the spacing between successive natural frequencies is compared to a Rayleigh distribution with the mean frequency spacing. A Rayleigh distribution of modal spacings is a feature of the universality exhibited by structures with uncertainty.
To further investigate the effect of structural uncertainty on the vibrational characteristics of structures, interval analysis is applied to finite element models of a plate with uncertainty in its material properties and dimensions. Using this method, the Young's modulus, density and thickness of a rectangular plate were allowed to vary by a small amount within predefined bounds. Using the finite element equations, the natural frequencies and mode shapes of the structure were then determined in terms of the Young's modulus, density and plate thickness. For the mass- and spring-loaded plates, the springs were shown to affect the lower-order modes while the masses had a significant effect on the higher-order modes. As the frequency increased, only a small amount of perturbation was sufficient to affect the natural frequencies of a structure. Using the interval analysis method, the variation of the natural frequencies from their deterministic values increased with frequency. An ergodic hypothesis was used to examine the response statistics of structures with uncertainty. Three structures were studied computationally: two plates coupled by springs, an L-shaped plate and a frame-plate structure. For the two coupled plates, uncertainty was generated by locating the springs randomly across the surfaces of the plates; for the L-shaped plate and the frame-plate structure, uncertainty was generated by randomly positioning small masses across the plates. Under the ergodic hypothesis, the frequency-averaged response of one member of an ensemble is compared with the ensemble-averaged response. It was found that the ensemble-averaged response was well predicted by the frequency-averaged response of a single ensemble member. The width of the frequency-averaging band was shown to have a large influence on the quality of the match between the frequency- and ensemble-averaged responses; results were significantly improved using a frequency-averaging bandwidth that varies proportionally with frequency. Finally, experiments were conducted on an L-shaped plate, a frame-plate structure and a vehicle to validate the computational results for the natural frequency and response statistics.
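To make the two statistical measures concrete, the sketch below computes a statistical overlap factor and normalized natural-frequency spacings for a deliberately simple surrogate ensemble, a spring-mass chain with randomly perturbed masses, and compares the spacing moments against the Rayleigh reference. The chain is an assumption standing in for the thesis's plate models, and whether its statistics actually reach the universal Rayleigh form is exactly the kind of question the thesis probes.

```python
import numpy as np

rng = np.random.default_rng(3)

# Surrogate ensemble: a fixed-fixed spring-mass chain with unit stiffnesses
# and ~20% random mass perturbations (not the thesis's Rayleigh-Ritz plates).
n, members = 60, 200
K = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)   # chain stiffness matrix

freqs = np.empty((members, n))
for i in range(members):
    m = np.clip(1.0 + 0.2 * rng.standard_normal(n), 0.1, None)
    Minv = np.diag(1.0 / np.sqrt(m))
    freqs[i] = np.sqrt(np.linalg.eigvalsh(Minv @ K @ Minv))  # natural frequencies

# One common form of the statistical overlap factor: the ensemble spread of a
# natural frequency relative to the mean spacing between neighbouring modes.
mode = n // 2
sof = freqs[:, mode].std() / np.diff(freqs.mean(axis=0))[mode]

# Normalized spacings between successive natural frequencies, pooled across
# the ensemble; universality would place them near a Rayleigh distribution.
s = np.diff(freqs, axis=1)
s = (s / s.mean(axis=1, keepdims=True)).ravel()

print(f"statistical overlap factor (mode {mode}): {sof:.2f}")
print(f"spacing mean {s.mean():.3f}, std {s.std():.3f} "
      f"(Rayleigh reference: 1.000, {np.sqrt(4 / np.pi - 1):.3f})")
```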
265

Spatial data quality management

He, Ying, Surveying & Spatial Information Systems, Faculty of Engineering, UNSW January 2008 (has links)
The application of geographic information systems (GIS) in various areas has highlighted the importance of data quality. Data quality research has been given priority by GIS academics for three decades; however, its outcomes have not been sufficiently translated into practical applications, and users still need a GIS capable of storing, managing and manipulating data quality information. To fill this gap, this research investigates how to develop a tool that effectively and efficiently manages data quality information to help data users better understand and assess the quality of their GIS outputs. Specifically, this thesis aims: 1. to develop a framework for establishing a systematic linkage between data quality indicators and appropriate uncertainty models; 2. to propose an object-oriented data quality model for organising and documenting data quality information; 3. to create data quality schemas for defining and storing the contents of metadata databases; 4. to develop a new conceptual model of data quality management; 5. to develop and implement a prototype system for enhancing the capability of data quality management in commercial GIS. Based on reviews of error and uncertainty modelling in the literature, a conceptual framework has been developed to establish the systematic linkage between data quality elements and appropriate error and uncertainty models. To overcome the limitations identified in the review and satisfy a series of requirements for representing data quality, a new object-oriented data quality model has been proposed. It enables data quality information to be documented and stored in a multi-level structure and to be integrally linked with spatial data for access, processing and graphic visualisation. A conceptual model for data quality management is proposed, with a data quality storage model, uncertainty models and visualisation methods as its three basic components. This model establishes the processes involved in managing data quality, emphasising the integration of uncertainty modelling and visualisation techniques. These studies lay the theoretical foundations for the development of a prototype system with the ability to manage data quality. An object-oriented approach, database technology and programming techniques were integrated to design and implement the prototype system within the ESRI ArcGIS software. The object-oriented approach allows the prototype to be developed in a flexible and easily maintained manner. The prototype allows users to browse and access data quality information at different levels, and a set of error and uncertainty models is embedded within the system. With the prototype, data quality elements can be extracted from the database and automatically linked with the appropriate error and uncertainty models, with their implications shown in the form of simple maps. This function offers users a set of uncertainty models from which to choose when assessing how uncertainty inherent in the data affects their specific application, significantly increasing users' confidence in using the data for a particular situation. To demonstrate the enhanced capability of the prototype, the system has been tested against real data. The implementation has shown that the prototype can efficiently assist data users, especially non-expert users, to better understand data quality and utilise it in a more practical way.
The methodologies and approaches for managing quality information presented in this thesis should serve as an impetus for supporting further research.
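As a sketch of the linkage idea at the heart of this thesis, the Python fragment below routes a data quality element to whichever uncertainty models declare themselves applicable. All class and function names are hypothetical; the prototype described in the abstract is built inside ESRI ArcGIS and its actual API is not reproduced here.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class QualityElement:
    """One data quality element, reported at some level of the data model."""
    name: str      # e.g. "positional accuracy"
    level: str     # e.g. "dataset", "feature class", "feature"
    value: float   # reported quality measure, e.g. RMSE in metres

@dataclass
class UncertaintyModel:
    name: str
    applies_to: set[str]                      # quality elements it can interpret
    assess: Callable[[QualityElement], str]   # turns the element into an implication

def circular_normal(e: QualityElement) -> str:
    # Treat positional RMSE as the sigma of a circular normal error model.
    return f"~95% of positions within {1.96 * e.value:.1f} m of their true location"

MODELS = [
    UncertaintyModel("circular normal", {"positional accuracy"}, circular_normal),
]

def link(element: QualityElement) -> list[str]:
    """Route a quality element to every uncertainty model that can interpret it."""
    return [m.assess(element) for m in MODELS if element.name in m.applies_to]

print(link(QualityElement("positional accuracy", "dataset", value=2.5)))
```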
268

The impact of incentives, uncertainty and transaction costs on the efficiency of public sector outsourcing contracts

Jensen, Paul H. January 2004 (has links)
Thesis (Ph. D.)--University of New South Wales, 2004. / Also available online.
269

Essays in inventory decisions under uncertainty

Manikas, Andrew Steven. January 2008 (has links)
Thesis (Ph. D.)--Management, Georgia Institute of Technology, 2008. / Committee Chair: Yih-Long Chang; Committee Member: Paul Griffin; Committee Member: Ravi Subramanian; Committee Member: Soumen Ghosh; Committee Member: Srinagesh Gavirneni.
270

Managing uncertainty using probabilistic databases

Dalvi, Nilesh. January 2007 (has links)
Thesis (Ph. D.)--University of Washington, 2007. / Vita. Includes bibliographical references (p. 124-130).
