1 
Conceptual Model Uncertainty in the Management of the Chi River Basin, Thailand. Nettasana, Tussanee. 30 April 2012.
With increasing demand and pressures on groundwater resources, accurate and reliable groundwater prediction models are essential for sustainable groundwater management. Groundwater models are merely approximations of reality, and we are unable to either fully characterize or mathematically describe the true complexity of the hydrologic system; therefore, varying degrees of uncertainty are inherent in all models. A robust management policy should consider uncertainties in both the imprecise nature of conceptual/numerical models and their parameters. This study addresses the critical question of whether the use of multiple conceptual models to explicitly account for conceptual model uncertainty improves the ability of the models to assist in management decisions.
Twelve unique conceptual models, characterized by three alternative geological interpretations, two recharge estimates, and two boundary condition implementations, were formulated to estimate sustainable extraction rates from Thailand’s Thaphra Area, where increasing groundwater withdrawals may result in water level decline and saline water upconing. The models were developed with MODFLOW and calibrated using PEST with the same set of observed hydraulic head data. All of the models were found to reasonably reproduce the available head data. To select the best among the alternative models, multiple criteria were defined and applied to evaluate the quality of individual models. It was found that models perform differently with respect to different evaluation criteria, and that it is unlikely that a single inter-model comparison criterion will ever be sufficient for general use. The chosen alternative models were applied both individually and jointly to quantify uncertainty in the groundwater management context. Different model-averaging methods were assessed in terms of their ability to assist in quantifying uncertainty in sustainable yield estimation.
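The multi-criteria comparison described above can be sketched as follows. The criteria (RMSE of heads, AIC, BIC) and the numbers are illustrative assumptions, not values from the thesis; the point is that different criteria can rank the same candidate models differently.

```python
import numpy as np

# Hypothetical calibration statistics for four alternative conceptual
# models; the criteria and values are invented for illustration.
criteria = {
    "rmse": np.array([1.2, 0.9, 1.5, 1.1]),       # lower is better
    "aic":  np.array([310.0, 305.0, 322.0, 300.0]),
    "bic":  np.array([330.0, 328.0, 340.0, 318.0]),
}

# Rank the models (0 = best) under each criterion independently.
ranks = {name: np.argsort(np.argsort(vals)) for name, vals in criteria.items()}

for name in criteria:
    print(name, "picks model", int(np.argmin(criteria[name])))
```

Here RMSE favors model 1 while AIC and BIC favor model 3, illustrating why no single inter-model comparison criterion is sufficient on its own.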
The twelve groundwater simulation models were additionally linked with optimization techniques to determine appropriate groundwater abstraction rates in the TPA Phu Thok aquifer. The management models aim to obtain maximal yields while protecting against water level decline. Despite similar performances among the calibrated models, total sustainable yield estimates vary substantially depending on the conceptual model used and range widely, by a factor of 0.6 in total, and by as much as a factor of 4 in each management area. The comparison results demonstrate that simple averaging achieves better performance than formal and sophisticated averaging methods such as Maximum Likelihood Bayesian Model Averaging, and produces performance similar to GLUE and combined multiple-criteria averaging methods for both validation testing and management applications, while being much simpler to implement and use and computationally much less demanding.
The joint assessment of parameter and conceptual model uncertainty was performed by generating multiple realizations of random parameters from the feasible space for each calibrated model using a simple Monte Carlo approach. The multi-model averaging methods produce a higher percentage of predictive coverage than any individual model does. Using model-averaging predictions, lower optimal rates were obtained that minimize head constraint violations, which is not the case when a single best model is used with parameter uncertainty analysis.
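The Monte Carlo step above, combined with simple (equal-weight) model averaging, can be sketched as follows. The head function, model biases, and parameter range are invented stand-ins for MODFLOW runs over the calibrated feasible space:

```python
import numpy as np

rng = np.random.default_rng(0)

def predict_head(model_bias, hk):
    # Stand-in for a MODFLOW run: head at a well as a toy function of
    # hydraulic conductivity hk plus a model-specific structural bias.
    return 100.0 - 5.0 * np.log(hk) + model_bias

model_biases = [0.5, -0.3, 1.0]          # one entry per conceptual model
hk_ranges = [(1.0, 10.0)] * 3            # assumed feasible parameter space

# Simple Monte Carlo: sample parameters for each calibrated model and
# collect the resulting head predictions.
ensembles = []
for bias, (lo, hi) in zip(model_biases, hk_ranges):
    hk = rng.uniform(lo, hi, size=1000)
    ensembles.append(predict_head(bias, hk))

# Equal-weight model averaging: pool all realizations and summarize.
pooled = np.concatenate(ensembles)
lo95, hi95 = np.percentile(pooled, [2.5, 97.5])
print(f"95% multi-model interval: [{lo95:.1f}, {hi95:.1f}]")
```

The pooled interval reflects both parameter spread (within each ensemble) and structural spread (between ensembles), which is why multi-model intervals tend to achieve higher predictive coverage than any single model's interval.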
Although accounting for all sources of uncertainty is very important in predicting environmental and management problems, the techniques used in the literature may be too computationally demanding and, in some cases, unnecessarily complex, particularly in data-poor systems. The methods presented here to account for the main sources of uncertainty provide the required practical and comprehensive uncertainty analysis and can be applied to other case studies to provide reliable and accurate predictions for groundwater management applications.

2 
The Utility of Using Multiple Conceptual Models for the Design of Groundwater Remediation Systems. Sheffield, Philip. January 2014.
The design of pump and treat systems for groundwater remediation is often aided by numerical groundwater modelling. Model predictions are uncertain, with this uncertainty resulting from unknown parameter values, model structure, and future system forcings. Researchers have begun to suggest that uncertainty in groundwater model predictions is largely dominated by structural/conceptual model uncertainty and that multiple conceptual models be developed in order to characterize this uncertainty. As regulatory bodies begin to endorse the more expensive multiple conceptual model approach, it is useful to assess whether a multiple model approach provides a significant improvement over a conventional single model approach to pump and treat system design supplemented with a factor of safety. To investigate this question, a case study located in Tacoma, Washington, provided by Conestoga-Rovers & Associates (CRA), was used.
Twelve conceptual models were developed to represent conceptual model uncertainty at the Tacoma, Washington site, and a pump and treat system was optimally designed for each conceptual model. Each design was tested across all 12 conceptual models with no factor of safety and with factors of safety of 1.5 and 2 applied. Adding a factor of safety of 1.5 decreased the risk of containment failure to 15 percent, compared to 21 percent with no factor of safety. Increasing the factor of safety from 1.5 to 2 further reduced the risk of containment failure to 9 percent, indicating that the application of a factor of safety reduces the risk of design failure at a cost directly proportional to the value of the factor of safety.
To provide a relatively independent assessment of the factor of safety approach, a single "best" model developed by CRA was compared against the multiple model approach. With a factor of safety of 1.5 or greater, adequate capture was demonstrated across all 12 conceptual models. This demonstrated that, in this case, using the single "best" model developed by CRA with a factor of safety would have been a reasonable surrogate for a multiple model approach. This is of practical importance to engineers, as it demonstrates that a conventional single model approach may be sufficient. However, it is essential that the model used is a good model. Furthermore, a multiple model approach will likely be an excessive burden in cases such as pump and treat system design, where the cost of failure is low because the system can be adjusted during operation to respond to new data. This may not be the case for remedial systems with high capital costs, such as permeable reactive barriers, which cannot be easily adjusted.

4 
Regression Model Stochastic Search via Local Orthogonalization. Xu, Ruoxi. 16 December 2011.
No description available.

5 
Topics in portfolio choice: qualitative properties, time consistency and investment under model uncertainty. Kallblad, Sigrid Linnea. January 2014.
The study of expected utility maximization in continuous-time stochastic market models dates back to the seminal work of Merton (1969) and has since been central to the area of Mathematical Finance. The associated stochastic optimization problems have been extensively studied. The problem formulation relies on two strong underlying assumptions: the ability to specify the underpinning market model and knowledge of the investor's risk preferences. However, neither of these inputs is easily available, if at all. The resulting issues have attracted continuous attention and prompted very active and diverse lines of research. This thesis seeks to contribute to this literature, and questions related to both of the above issues are studied. Specifically, we study the implications of certain qualitative properties of the utility function; we introduce, and study various aspects of, the notion of robust forward investment criteria; and we study the investment problem associated with risk- and ambiguity-averse preference criteria defined in terms of quasi-concave utility functionals.
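For context on the Merton (1969) benchmark the abstract starts from: with constant investment opportunities and CRRA (power) utility, the optimal fraction of wealth held in the risky asset has the well-known closed form (mu - r) / (gamma * sigma^2). A minimal sketch, with illustrative parameter values:

```python
# Merton (1969) closed-form solution: optimal constant fraction of
# wealth in the risky asset under CRRA utility with risk aversion gamma,
# drift mu, risk-free rate r, and volatility sigma. Parameter values in
# the example call are illustrative, not from the thesis.
def merton_fraction(mu, r, sigma, gamma):
    """Optimal risky-asset weight: excess return over gamma times variance."""
    return (mu - r) / (gamma * sigma ** 2)

print(merton_fraction(mu=0.08, r=0.02, sigma=0.2, gamma=3.0))  # ~0.5
```

The thesis's concern is precisely that mu, sigma, and gamma are hard to specify in practice, which motivates robust and forward-criteria formulations.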

6 
Model Validation in Fire Protection Engineering. Lantz, Renee Vaillancourt. 24 August 2001.
In the prediction of phenomenon behavior there is a presupposition that a similarity exists between model and phenomenon. Success of application is derived from that similarity. An example of this approach is the use of similarity conditions such as the Reynolds number in flow problems or the Fourier number in heat transfer problems. The advent of performance-based codes has opened up opportunities for many diverse avenues of fire model implementation. The reliability of models depends upon model correspondence uncertainty. Model correspondence uncertainty is incomplete and distorted information introduced into a simulation by a modeling scheme. It manifests itself as 1) the uncertainty associated with the mathematical relationships hypothesized for a particular model, and 2) the uncertainty of the predictions obtained from the model. Improving model implementation by providing a method for rank-ordering models is the goal of the Model Validity Criterion (MVC) method. MVC values can be useful as a tool to objectively and quantitatively choose a model for an application or as part of a model improvement program. The MVC method calculates the amount of model correspondence uncertainty introduced by a modeling scheme. Model choice is based upon the strategy of minimizing correspondence uncertainty and therefore yields the model that best corresponds to the phenomenon. The MVC value for a model is quantified as the sum of the lengths of two files. These files are individual measures of model structure correspondence uncertainty and model behavior correspondence uncertainty. The combination of the two uncertainty components gives an objective and structured evaluation of the relative validity of each model in a set of likely candidate models. The model with the smallest uncertainty files has the lowest MVC value and is the model with the most validity. Ultimately, the value of such a method is only realized through its utility.
Example applications of the MVC method are demonstrated. The examples evaluate the rank-ordering of plume physics options used within the computer zone model WPI-Fire when validated against upper layer temperature data from compartment-fire test scenarios. The results show how candidate models in a set may be discriminated on the basis of validity. These results are powerful in that they allow the user to establish a quantitative measure of model performance and/or choose the most valid model for an application.
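The rank-ordering logic described above can be sketched as follows. The candidate names and the two uncertainty measures are invented surrogates for the structure/behavior "file lengths" the abstract describes:

```python
# Sketch of MVC-style rank-ordering: each candidate model's score is the
# sum of two uncertainty measures (structure + behavior), and the model
# with the smallest total is deemed most valid. Values are invented.
candidates = {
    "plume_option_A": {"structure": 120.0, "behavior": 340.0},
    "plume_option_B": {"structure": 150.0, "behavior": 280.0},
    "plume_option_C": {"structure": 100.0, "behavior": 500.0},
}

mvc = {name: u["structure"] + u["behavior"] for name, u in candidates.items()}
ranking = sorted(mvc, key=mvc.get)  # ascending: lowest MVC first
print("most valid model:", ranking[0])
```

Note that option C has the lowest structure uncertainty yet ranks last overall, which is the point of combining both components into one score.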

7 
Robust design of control charts for autocorrelated processes with model uncertainty. Lee, Hyun Cheol. 1 November 2005.
Statistical process control (SPC) procedures suitable for autocorrelated processes have been extensively investigated in recent years. The most popular method is the residual-based control chart. To implement this method, a time series model of the process, usually an autoregressive moving average (ARMA) model, is required. However, the model must be estimated from data in practice, and the resulting ARMA modeling errors are unavoidable. Residual-based control charts are known to be sensitive to ARMA modeling errors and often suffer from inflated false alarm rates. As an alternative, control charts can be applied directly to the autocorrelated data with widened control limits, where the amount of widening is determined by the autocorrelation function of the process. This alternative method, however, is also not free from the effects of modeling errors, because it too relies on an accurate process model to be effective.
To compare the robustness of these two kinds of control charting methods to ARMA modeling errors, this dissertation investigates their sensitivity analytically. Two robust design procedures for residual-based control charts are then developed from the results of the sensitivity analysis. The first approach uses the worst-case (maximum) variance of a chart statistic to guarantee the initial specification of the control chart. The second uses the expected variance of the chart statistic. The resulting control limits are widened by an amount that depends on the variance of the chart statistic, maximum or expected, as a function of (among other things) the parameter estimation error covariances.
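The worst-case widening idea can be illustrated with the simplest autocorrelated process, an AR(1). This is a sketch under assumptions, not the dissertation's actual procedure: it widens 3-sigma limits using the largest process variance over an assumed interval of estimation error around the fitted AR coefficient.

```python
import math

# For an AR(1) process x_t = phi * x_{t-1} + e_t with noise std sigma_e,
# the stationary process std is sigma_e / sqrt(1 - phi^2).
def ar1_sigma(phi, sigma_e):
    return sigma_e / math.sqrt(1.0 - phi ** 2)

def control_limits(phi, sigma_e, mean=0.0, width=3.0, phi_error=0.0):
    # Worst case over the assumed estimation-error interval: the largest
    # |phi| gives the largest variance, hence the widest (most
    # conservative) limits. phi_error is an illustrative error bound.
    phi_worst = min(abs(phi) + phi_error, 0.999)
    s = ar1_sigma(phi_worst, sigma_e)
    return mean - width * s, mean + width * s

print(control_limits(phi=0.5, sigma_e=1.0))                 # nominal
print(control_limits(phi=0.5, sigma_e=1.0, phi_error=0.2))  # robust, wider
```

Guarding against underestimated autocorrelation trades a wider in-control region (slower detection) for protection against the inflated false alarm rates the abstract describes.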

8 
Dissipativity, optimality and robustness of model predictive control policies. Løvaas, Christian. January 2008.
Research Doctorate, Doctor of Philosophy (PhD). This thesis addresses the problem of robustness in model predictive control (MPC) of discrete-time systems. In contrast with most previous work on robust MPC, our main focus is on robustness in the face of both imperfect state information and dynamic model uncertainty. For linear discrete-time systems with model uncertainty described by sum quadratic constraints, we propose output-feedback MPC policies that: (i) treat soft constraints using quadratic penalty functions; (ii) respect hard constraints using 'tighter' constraints; and (iii) achieve robust closed-loop stability and non-zero setpoint tracking. Our two main tools are: (1) a new linear matrix inequality condition which parameterizes a class of quadratic MPC cost functions that all lead to robust closed-loop stability; and (2) a new parameterization of soft constraints which has the advantage of leading to optimization problems of prescribable size. The stability test we use for MPC design builds on well-known results from dissipativity theory, which we tailor to the case of constrained discrete-time systems. The proposed robust MPC designs are shown to converge to well-known nominal MPC designs as the model uncertainty (description) goes to zero. Furthermore, the present approach to cost function selection is independently motivated by a novel result linking MPC and minimax optimal control theory. Specifically, we show that the considered class of MPC policies are the closed-loop optimal solutions of a particular class of minimax optimal control problems. In addition, for a class of nonlinear discrete-time systems with constraints and bounded disturbance inputs, we propose state-feedback MPC policies that input-to-state stabilize the system.
Our two main tools in this last part of the thesis are: (1) a class of N-step affine state-feedback policies; and (2) a result that establishes equivalence between the latter class and an associated class of N-step affine disturbance-feedback policies. Our equivalence result generalizes a recent result in the literature for linear systems to the case when N is chosen to be less than the nonlinear system's 'input-state linear horizon'.

9 
Qualitative and Quantitative Procedure for Uncertainty Analysis in Life Cycle Assessment of Wastewater Solids Treatment Processes. Alyaseri, Isam. 1 May 2014.
In order to perform environmental analysis and find the best management strategy for wastewater treatment processes using the life cycle assessment (LCA) method, uncertainty in the LCA has to be evaluated. A qualitative and quantitative procedure was constructed to deal with uncertainty in wastewater treatment LCA studies during the inventory and analysis stages. The qualitative steps in the procedure include setting rules for the inclusion of inputs and outputs in the life cycle inventory (LCI), setting rules for the proper collection of data, identifying and conducting data collection analysis for the significant contributors in the model, evaluating data quality indicators, selecting the proper life cycle impact assessment (LCIA) method, evaluating the uncertainty in the model through different cultural perspectives, and comparing with other LCIA methods. The quantitative steps include assigning the best-guess value and the proper distribution for each input or output in the model, calculating the uncertainty for those inputs or outputs based on data characteristics and the data quality indicators, and finally using probabilistic analysis (Monte Carlo simulation) to estimate uncertainty in the outcomes. Environmental burdens from the solids handling unit at the Bissell Point Wastewater Treatment Plant (BPWWTP) in Saint Louis, Missouri were analyzed. Plant-specific data plus literature data were used to build an input-output model. The environmental performance of the existing treatment scenario (dewatering, multiple hearth incineration, and ash to landfill) was analyzed. To improve the environmental performance, two alternative scenarios (fluid bed incineration and anaerobic digestion) were proposed, constructed, and evaluated. System boundaries were set to include the construction, operation, and dismantling phases.
The impact assessment method chosen was Eco-indicator 99, and the impact categories were: carcinogenicity, respiratory organics and inorganics, climate change, radiation, ozone depletion, ecotoxicity, acidification/eutrophication, and minerals and fossil fuels depletion. Analysis of the existing scenario shows that most of the impacts came from the operation phase, in the categories related to fossil fuels depletion, respiratory inorganics, and carcinogens, due to the energy consumed and emissions from incineration. The proposed alternatives showed better performance than the existing treatment, and fluid bed incineration performed better than anaerobic digestion. Uncertainty analysis showed a 57.6% probability of lower environmental impact when using fluid bed incineration rather than anaerobic digestion. Based on single-score ranking in the Eco-indicator 99 method, the environmental impact order is: multiple hearth incineration > anaerobic digestion > fluid bed incineration. This order was the same for the three model perspectives in the Eco-indicator 99 method and when using other LCIA methods (Ecopoint 97 and CML 2000). The study showed that incorporating qualitative/quantitative uncertainty analysis into LCA gives more information than deterministic LCA and can strengthen the LCA study. The procedure tested in this study showed that Monte Carlo simulation can be used to quantify uncertainty in wastewater treatment studies, and the procedure can be applied to analyze the performance of other treatment options. Although the analysis under different perspectives and different LCIA methods did not change the order of the scenarios, it showed a possibility of variation in the final outcomes for some categories. The study demonstrated the importance of providing decision makers with the best and worst possible outcomes of any LCA study and of informing them about the perspectives and assumptions used in the assessment.
In comparative LCA, Monte Carlo simulation can perform uncertainty analysis only between two products or scenarios at a time, using the (A-B) approach, because the probability distributions of the outcomes overlap. It is recommended that the approach be extended to handle more than two scenarios.
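The (A-B) comparison can be sketched as follows. The impact distributions below are illustrative lognormals, not BPWWTP data; the output is the probability that scenario A has the lower impact, analogous to the 57.6% figure reported above:

```python
import numpy as np

rng = np.random.default_rng(42)

# Sketch of the comparative (A-B) approach: draw paired Monte Carlo
# samples of each scenario's single-score impact, difference them, and
# report the probability that A scores lower than B. Distributions are
# invented lognormals for illustration.
n = 10_000
impact_fluid_bed = rng.lognormal(mean=1.0, sigma=0.4, size=n)  # scenario A
impact_digestion = rng.lognormal(mean=1.1, sigma=0.4, size=n)  # scenario B

diff = impact_fluid_bed - impact_digestion   # paired A - B differences
p_a_better = (diff < 0).mean()
print(f"P(fluid bed < anaerobic digestion) = {p_a_better:.1%}")
```

Because each pairwise comparison reduces to the sign of one difference distribution, extending the method to three or more scenarios requires either multiple pairwise runs or a joint ranking over the same draws, which is the extension the abstract recommends.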

10 
Three essays on macro-finance: robustness and portfolio theory. Guimarães, Pedro Henrique Engel. 28 July 2017.
This doctoral thesis is composed of three chapters related to portfolio theory and model uncertainty. The first paper investigates how ambiguity-averse agents explain the equity premium puzzle for a large group of countries, including both Advanced Economies (AE) and Emerging Markets (EM). In the second article, we develop a general robust allocation framework that is capable of dealing with parametric and non-parametric asset allocation models. In the final paper, I investigate portfolio selection criteria and analyze a set of portfolios' out-of-sample performance in terms of Sharpe ratio (SR) and Certainty Equivalent (CEQ).
