1

Multi-Model Bayesian Analysis of Data Worth and Optimization of Sampling Scheme Design

Xue, Liang, January 2011
Groundwater is a major source of water supply, and aquifers form major storage reservoirs as well as water conveyance systems worldwide. The viability of groundwater as a source of water for the world's population is threatened by overexploitation and contamination. The rational management of water resource systems requires an understanding of their response to existing and planned schemes of exploitation, pollution prevention and/or remediation. Such understanding requires the collection of data to help characterize the system and monitor its response to existing and future stresses. It also requires incorporating such data in models of system makeup, water flow and contaminant transport. As the collection of subsurface characterization and monitoring data is costly, it is imperative that the design of corresponding data collection schemes be cost-effective. A major benefit of new data is its potential to help improve one's understanding of the system, in large part through a reduction in model predictive uncertainty and the corresponding risk of failure. Traditionally, value-of-information or data-worth analyses have relied on a single conceptual-mathematical model of site hydrology with prescribed parameters. Yet there is a growing recognition that ignoring model and parameter uncertainties renders model predictions prone to statistical bias and underestimation of uncertainty. This has led to a recent emphasis on conducting hydrologic analyses and rendering corresponding predictions by means of multiple models. We develop a theoretical framework of data worth analysis that accounts for model uncertainty, parameter uncertainty and potential sample value uncertainty. The framework entails Bayesian Model Averaging (BMA), with emphasis on its Maximum Likelihood version (MLBMA). An efficient stochastic optimization method, the Differential Evolution Method (DEM), is explored to aid in the design of optimal sampling schemes aimed at maximizing data worth.
A synthetic case entailing generated log hydraulic conductivity random fields is used to illustrate the procedure. The proposed data worth analysis framework is applied to field pneumatic permeability data collected from unsaturated fractured tuff at the Apache Leap Research Site (ALRS) near Superior, Arizona.
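The model-averaging step described in this abstract can be sketched in a few lines: given an information-criterion value per candidate model (e.g. KIC, as used in MLBMA) and prior model probabilities, posterior model weights follow from the usual exp(-ΔIC/2) rule, and the averaged prediction combines within-model variance with between-model spread. The snippet below is an illustrative sketch only, not code from the dissertation; the function names are our own.

```python
import numpy as np

def bma_weights(ic_values, prior_probs=None):
    """Posterior model probabilities from information-criterion values
    (e.g. KIC in MLBMA); equal priors are assumed unless given."""
    ic = np.asarray(ic_values, dtype=float)
    if prior_probs is None:
        prior_probs = np.full(ic.shape, 1.0 / ic.size)
    delta = ic - ic.min()                      # shift for numerical stability
    w = np.exp(-0.5 * delta) * np.asarray(prior_probs)
    return w / w.sum()

def bma_prediction(means, variances, weights):
    """BMA mean and total variance: within-model variance plus
    between-model spread around the averaged mean."""
    m, v, w = (np.asarray(a, dtype=float) for a in (means, variances, weights))
    mean = np.sum(w * m)
    var = np.sum(w * (v + (m - mean) ** 2))
    return mean, var
```

A model with a lower IC value receives a higher posterior weight, and the between-model term keeps the averaged variance from understating uncertainty when the models disagree.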
2

A Least-Cost Strategy for Evaluating a Brownfields Redevelopment Project Subject to Indoor Air Exposure Regulations

Wang, Xiaomin, 20 August 2012
Over the course of the past several decades, the benefits of redeveloping brownfields have been widely recognized, and governments, policy makers and stakeholders across the world have taken action to foster their sustainable redevelopment. However, redevelopment faces substantial challenges and risks, both environmental and non-environmental. In this work, we build a comprehensive and practical framework to evaluate the hydrogeological and financial risks involved during redevelopment and to ensure that developers reserve sufficient capital to cover unexpected future costs within the guarantee period. Punitive damages, which contribute to these costs, are in this thesis solely associated with the cost of repossessing a house within a development should the indoor air concentration of TCE exceed the regulatory limit at a later time. The uncertainties associated with brownfield remediation, caused mainly by the lack of knowledge about a site's environmental condition, have been among the barriers to redevelopment. To alleviate these uncertainties and to better understand subsurface contaminant transport, numerical simulations are conducted to investigate the role of controlling parameters in determining the fate and transport of volatile organic compounds originating from a NAPL source zone located below the water table. In the first part of this thesis, the numerical model CompFlow Bio is applied to a hypothetical three-dimensional problem geometry on which multiple residential dwellings are built. The simulations indicate that uncertainty in the simulated indoor air concentration, defined as the probability of exceeding a regulatory limit, is sensitive to heterogeneity in the permeability structure of a stratigraphically continuous aquifer.
Houses that are laterally offset from the groundwater plume are less affected by vapour intrusion, owing to the limited transverse horizontal flux of TCE within the plume, in agreement with the ASTM (2008) guidance. Within this uncertainty framework, we show that the Johnson and Ettinger (1991) model generates overly conservative results and places the exclusion zone much further from the groundwater plume than either CompFlow Bio or ASTM (2008). The probability of failure (the probability of exceedance of the regulatory limit) is defined and calculated for further study. Because of uncertainties arising from parameter estimation and model prediction, a methodology is introduced to incorporate field measurements into the initial estimates from the numerical model in order to improve prediction accuracy. The principal idea is to combine the geostatistical tool kriging with the Kalman filter, a statistical data assimilation method, to evaluate the worth and effectiveness of data quantitatively and thereby select an optimal sampling scenario. The methodology is also used to infer whether a house adjacent to affected houses has indoor air problems, given that the affected houses are monitored and have known problems and that developers are liable if a problem occurs. In this part of the study, different sampling scenarios are set up in terms of permeability (1-80 boreholes) and soil gas concentration (2, 4 and 7 samples), and three metrics are defined and computed as criteria for comparison. Financing brownfield redevelopment is often viewed as a major barrier to the development process, mainly because of the risks and liabilities associated with brownfields. The common way of managing this risk is to transfer it to insurers by purchasing insurance coverage. This work provides two different strategies for pricing the risk, which is equivalent to an insurance premium.
The intent is to give instructive insight into project planning and feasibility studies during the decision-making process of a brownfield project. The two strategies of risk capital valuation are an actuarial premium calculation principle and a martingale premium calculation principle, each accounting for the hydrogeological and financial uncertainties faced in a project. The data used for valuation are the posterior estimates obtained by data assimilation under the different sampling scenarios. Cost-benefit-risk analysis is employed to construct the objective function used to find the least-cost sampling scenario for the project. The results show that drilling seven boreholes to obtain permeability data, combined with soil gas samples at either four or seven locations, yields the minimum total cost. A sensitivity analysis of influential parameters (the safety loading factors and the possible methods of calculating the probability of failure) is performed to determine their relative importance in the risk capital valuation. The framework can be applied to provide guidance for other risk-based environmental projects.
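The kriging-plus-Kalman-filter assimilation step described above can be illustrated with a standard Kalman measurement update, in which the reduction of the posterior covariance trace serves as a simple data-worth metric for ranking sampling scenarios. This is a generic sketch under our own naming, not the thesis implementation.

```python
import numpy as np

def kalman_update(x_prior, P_prior, H, R, y):
    """Kalman filter measurement update: blend the prior estimate
    with observations y made through operator H with noise covariance R."""
    S = H @ P_prior @ H.T + R                    # innovation covariance
    K = P_prior @ H.T @ np.linalg.inv(S)         # Kalman gain
    x_post = x_prior + K @ (y - H @ x_prior)
    P_post = (np.eye(len(x_prior)) - K @ H) @ P_prior
    return x_post, P_post

def data_worth(P_prior, P_post):
    """Uncertainty reduction achieved by the measurements: the drop in
    total posterior variance, usable to compare sampling scenarios."""
    return np.trace(P_prior) - np.trace(P_post)
```

Running candidate borehole or soil-gas layouts through such an update and comparing the resulting variance reductions is one way to pick a least-cost scenario before drilling.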
4

Development of a contaminant source localisation strategy in aquifers: innovative measurements and inverse modeling

Essouayed, Elyess, 8 March 2019
The management and remediation of contaminated sites can be complex and require significant investment to locate contaminant sources, the zones emitting the largest pollutant mass fluxes. This work proposes a strategy for locating pollution sources based on in situ mass flux measurements and inverse modelling. First, an innovative tool called the DVT (Direct Velocity Tool) was developed to measure groundwater velocity directly in a well; it addresses the constraints of existing tools and can resolve very slow flows. Laboratory experiments and field trials were carried out and compared against other velocity measurement tools. Combined with a local concentration measurement, the DVT also makes it possible to calculate the mass flux passing through a well and thereby to identify the portion of the source producing the maximum pollutant flux. The thesis then presents the use of inverse modelling to locate a contaminant source and to estimate the parameters characterising the domain, demonstrated on two synthetic cases using non-linear optimisation. To adapt the method to the practical management of polluted sites, an iterative strategy is developed that adds only a limited number of new observations at each modelling phase, based on the Data Worth approach. The source localisation results on the two synthetic cases allowed the method to be evaluated and its applicability to a real problem to be judged. The strategy is then tested on a real site using (i) in situ mass flux measurements combining DVT velocities with concentrations and (ii) inverse modelling. These trials identified the boreholes to install on site to help locate the source. Nevertheless, closer analysis showed that the hydraulic conductivity field estimated by the optimisation did not correspond to reality. In addition, the contaminant mass fluxes and the pollutant ratios at the site revealed two distinct plumes. A final modelling phase was therefore launched to estimate (i) the potential presence of two sources and (ii) the chemistry of the studied zone. The results of the strategy were compared with a Geoprobe campaign, which confirmed the location of one of the two identified sources.
5

Model-based data worth analysis for groundwater systems with the use of surrogate models

Gosses, Moritz, 30 November 2020
The aim of this work is the improvement of model-based data worth analysis for groundwater systems with the use of surrogate models. Physically-based groundwater models are widespread tools used to make diverse predictions for research and management problems. They allow incorporation of system knowledge and a multitude of data. Often, this complexity is accompanied by high model run times, which is especially problematic for applications such as uncertainty analysis or data worth analysis that necessitate many model runs. Surrogate models address these challenges by reducing run time through simplification of the model, usually with techniques from three major categories: projection-based methods, data-driven methods and structural reduction methods. The run time reduction comes with potential impairments regarding applicability, accuracy of system representation, predictive uncertainty quantification and integration of system knowledge. In light of these potential limitations, this thesis compares the ability of three different surrogate models to reproduce diverse model predictions and data worth estimates of a complex, real-world benchmark model. The surrogates used are a spatially and parametrically simplified physically-based model, a set of artificial neural networks (ANNs) and a projection-based 'proper orthogonal decomposition' (POD) model. In the first part of this dissertation, the potentials and shortcomings of the popular POD method with regard to boundary representation are detailed, and an extension of the method for accurate boundary depiction is proposed. The explicit treatment of boundary conditions is shown to eliminate reduction-induced errors at Dirichlet and Neumann boundaries (and to reduce errors at Cauchy boundaries) for a small trade-off in general groundwater head accuracy.
Ease of implementation and the potential for purposeful application of the extension allow modelers a target-oriented refinement of POD models. The second part of this dissertation addresses the challenge of quantifying the model simplification error of a surrogate model in light of the errors in its model predictions. An existing method for predictive uncertainty quantification is extended to estimate the simplification error and bias of different model predictions of all three surrogate models relative to the predictions of the complex benchmark model. Results show that the magnitude and structure of the model simplification error depend strongly on both the type of model prediction and the surrogate model. In the final part of the thesis, two of the surrogate models are compared with the complex model in their application to analysing the worth of different data types. First-order second-moment data worth analysis methods are extended into a robust method that accounts for the non-uniqueness of calibrated model parameters. This robust method is then used in combination with the surrogate models to analyze the worth of existing, 'future' and 'parametric' data for varying model predictions. The comparison of changes in predictive uncertainty variance between the complex model and the surrogates shows that the simplified, physically-based model is only able to identify data worth in the existing calibration data set.
The POD model, in contrast, is a suitable surrogate for data worth analysis with regard to all of the differing predictions and the worth of existing, 'future' and 'parametric' data when combined with the strengths of the proposed robust data worth analysis method.

Contents:
1 Introduction
2 State of the art
  2.1 Surrogate modeling for groundwater systems
    2.1.1 Introduction
    2.1.2 Projection-based methods
    2.1.3 Data-driven methods
    2.1.4 Structural simplification methods
    2.1.5 Open research questions
  2.2 Uncertainty and data worth analysis
    2.2.1 Introduction
    2.2.2 Sources of uncertainty in groundwater modeling
    2.2.3 Types of uncertainty analysis
    2.2.4 Data worth analysis
    2.2.5 Open research questions
3 Objectives and contributions
  3.1 Explicit boundary treatment in POD
  3.2 Analysis of model simplification error
  3.3 Robust data worth analysis using surrogate models
  3.4 Expected impact
4 Methods
  4.1 The Wairau Plain aquifer, the complex model and its surrogates
    4.1.1 The Wairau Plain aquifer
    4.1.2 Complex MODFLOW model of the Wairau Plain aquifer (CM)
    4.1.3 Surrogate 1: simplified MODFLOW model (SM1,sMm)
    4.1.4 Surrogate 2: linearized POD model (SM2,POD)
    4.1.5 Surrogate 3: artificial neural networks (SM3,ANN)
  4.2 POD extension for explicit boundary treatment
    4.2.1 Groundwater models and basic POD
    4.2.2 Theory of explicit treatment of boundary conditions in POD
    4.2.3 Different boundary conditions in eb-POD
    4.2.4 Cost of eb-POD compared to basic POD
  4.3 Model simplification error analysis - theory
    4.3.1 A linear model, solution space and null-space
    4.3.2 Surrogate model: definition and calibration
    4.3.3 Parameter simplification - relationship between complex model and surrogate model parameters
    4.3.4 Simplification error of surrogate model predictions
  4.4 Model simplification error analysis - scatter plot analysis
    4.4.1 Methodology
    4.4.2 General features of the scatter plots
    4.4.3 Contributions of error terms
    4.4.4 Prediction pairs
    4.4.5 Summary
  4.5 Robust data worth analysis
    4.5.1 First-order second-moment uncertainty estimation
    4.5.2 Worth of data
    4.5.3 Generating calibrated parameter sets - null-space parameter perturbation
    4.5.4 Robust data worth analysis
5 Results and discussion
  5.1 Explicit treatment of boundary conditions in POD
    5.1.1 (Variable) Dirichlet boundaries
    5.1.2 Neumann boundaries
    5.1.3 Cauchy boundaries
    5.1.4 Applying eb-POD: summary
  5.2 Quantifying model simplification error
    5.2.1 Simplified MODFLOW model: SM1,sMm
    5.2.2 POD surrogate model: SM2,POD
    5.2.3 ANN surrogate model: SM3,ANN
    5.2.4 Surrogate comparison: simplification errors in model predictions
  5.3 Robust data worth analysis using surrogate models
    5.3.1 Worth of existing data
    5.3.2 Worth of 'future' data
    5.3.3 Worth of 'parametric' data
    5.3.4 Data worth with surrogate models: summary
  5.4 Discussion
6 Conclusions and outlook
A Appendix: Publications
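The POD reduction at the heart of this thesis can be sketched in its textbook form: collect head snapshots from the full model, extract an orthonormal basis via SVD truncated at a chosen energy fraction, and Galerkin-project the system onto that basis so the reduced model solves for far fewer unknowns. This is a generic sketch, not the eb-POD implementation from the dissertation; the names and the energy threshold are our assumptions.

```python
import numpy as np

def pod_basis(snapshots, energy=0.999):
    """Orthonormal POD basis from a snapshot matrix (n_nodes x n_snaps),
    truncated to capture the requested fraction of snapshot energy."""
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    cum = np.cumsum(s**2) / np.sum(s**2)
    r = int(np.searchsorted(cum, energy)) + 1   # smallest rank reaching the target
    return U[:, :r]

def galerkin_reduce(A, b, Phi):
    """Galerkin projection of the full system A h = b onto the POD subspace;
    the reduced system has dimension r instead of n_nodes."""
    return Phi.T @ A @ Phi, Phi.T @ b
```

Solving the reduced system and lifting back with `Phi @ h_reduced` recovers the full-space solution whenever it lies (approximately) in the snapshot span, which is what makes many-run applications such as data worth analysis affordable.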
