61 |
Osäkerhet vid översvämningskartering av vattendrag : En kunskapsöversikt och tillämpning på MIKE 11 / Uncertainty in flood inundation modeling of watercourses : A research overview and application to MIKE 11. Björkman, Elin, January 2014.
Due to uncertainty in data, parameters and model structure, there may be large uncertainties in flood inundation models. Despite this, uncertainty analysis is still rarely used by practitioners when creating flood maps, which means that decision makers and other users may perceive the results as more accurate than they actually are. One reason why uncertainty analysis has not yet become customary in flood inundation modeling may be a lack of knowledge among modelers at consulting firms and government agencies; limited data availability can also make an uncertainty analysis more difficult. Moreover, no examples exist of how uncertainties can be analyzed in MIKE 11, which is one of the most common models used in flood mapping at consulting firms. The aim of this study was twofold: firstly, to provide a general overview of current research on uncertainty and uncertainty analysis for flood inundation modeling, in order to increase knowledge among consultants and decision makers; secondly, to give an example of how uncertainties can be estimated in a flood inundation model created in MIKE 11 when there is limited access to data. The research overview showed that there is often considerable uncertainty in the discharge calculations and the geometrical description in hydraulic models, and that there are many different ways to analyze these uncertainties. Some commonly used methods are Monte Carlo simulation, fuzzy sets, scenario analysis, Bayesian calibration and Generalized Likelihood Uncertainty Estimation (GLUE). A case study was performed in which a hydraulic model of the River Kungsbackaån was built in MIKE 11, and a scenario analysis was carried out to estimate the uncertainties in the model. In total, 36 model runs were made in which the calibration discharge, Manning's number and design flow were varied. Scenario analysis cannot provide a precise estimate of the uncertainty, only a subjective one. The results showed that when the sea level in Kungsbackafjorden was 0.92 m, the simulated water levels differed by at most 1.3 m for the 100-year discharge and by 0.41 m for the calculated maximum flow (BHF). The flood extents of the two discharges were also investigated; the uncertainty in extent was greatest in flat areas, even though the uncertainty in water levels was smaller there.
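A minimal sketch of how such a scenario analysis can be organized around an external hydraulic model follows. The stand-in model function, parameter levels and design flows are invented for illustration (the thesis itself ran 36 MIKE 11 simulations):

```python
# Illustrative scenario-analysis bookkeeping: enumerate all combinations
# of calibration discharge, Manning's number and design flow, run the
# model for each, and report the spread in simulated water levels per
# design flow. The stand-in function below is a synthetic placeholder
# for a MIKE 11 simulation run.
from itertools import product

calibration_flows = [140.0, 160.0, 180.0]         # m3/s (assumed levels)
manning_numbers = [20.0, 25.0, 30.0]              # Manning's M (assumed levels)
design_flows = {"100-year": 300.0, "BHF": 520.0}  # m3/s (assumed)

def run_hydraulic_model(q_cal, manning_m, q_design):
    """Synthetic stand-in: peak water level rises with discharge and
    falls with conveyance; a real study would launch a MIKE 11 run."""
    return 2.0 + 0.004 * q_design + 0.002 * q_cal - 0.02 * manning_m

levels = {name: [] for name in design_flows}
for q_cal, m, (name, q_d) in product(calibration_flows, manning_numbers,
                                     design_flows.items()):
    levels[name].append(run_hydraulic_model(q_cal, m, q_d))

for name, wl in levels.items():
    print(f"{name}: water-level spread {max(wl) - min(wl):.2f} m "
          f"over {len(wl)} runs")
```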
|
62 |
Exploração de espaços de parâmetros de modelos biológicos sob diferentes paradigmas estatísticos / Parameter space exploration of biological models under different statistical paradigms. Andre Chalom Machado de Oliveira, 02 September 2015.
There is a growing trend in the use of mathematical modeling tools in many areas of the biological sciences. The use of computer models in science is increasing, especially in fields where laboratory experiments are too complex or too costly, like ecology. Questions of efficient, systematic and error-proof exploration of the parameter spaces of these models are of great importance to better understand, estimate confidence in, and make use of their output. We present a survey of the methods proposed to answer these questions, with emphasis on Latin Hypercube Sampling (LHS) and a focus on quantitative analysis of the results. We also compare analytical results for sensitivity and uncertainty, where relevant, to LHS results. Finally, we examine the proposal of a methodology based on the likelihood statistical paradigm. Chapter 1 presents a brief investigation of historical views on the nature of probability, in order to situate the concept of likelihood as a central component of statistical inference. Chapter 2 (written in English) reviews the state of the art in uncertainty and sensitivity analyses, with a practical example applying the described techniques to two models of structured population growth. Chapter 3 examines the proposal of a likelihood-based approach as an integrative procedure between parameter estimation and uncertainty analysis, with preliminary results. During the course of this work, a package of R functions was developed to facilitate the practical use of the above theoretical tools. The appendices bring a tutorial and examples of using this package, designed to be both convenient and easily extensible, and freely available on the Internet at http://cran.r-project.org/web/packages/pse.
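The pse package itself is written in R; as a language-neutral sketch of the core idea, a hand-rolled Latin Hypercube sample for a hypothetical two-parameter model might look like this (parameter names and ranges are invented for the example):

```python
# Minimal Latin Hypercube Sampling sketch: stratify each parameter's
# range into n equal-probability bins, draw one value per bin, then
# shuffle the bins independently per parameter so strata pair randomly.
import numpy as np

rng = np.random.default_rng(42)

def latin_hypercube(n_samples, bounds):
    """bounds: list of (low, high) tuples, one per parameter."""
    d = len(bounds)
    # One uniform draw inside each of the n strata, per dimension.
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, d))) / n_samples
    for j in range(d):              # decouple strata across dimensions
        rng.shuffle(u[:, j])
    lows = np.array([b[0] for b in bounds])
    highs = np.array([b[1] for b in bounds])
    return lows + u * (highs - lows)

# Example: two hypothetical parameters of a population-growth model.
samples = latin_hypercube(100, [(0.1, 1.5),      # intrinsic growth rate
                                (50.0, 500.0)])  # carrying capacity
print(samples.shape)  # (100, 2)
```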
|
63 |
Integração de análise de incertezas e ajuste de histórico de produção / Integration of uncertainty analysis and history matching process. Moura Filho, Marcos Antonio Bezerra de, 12 August 2018.
Advisors: Denis Jose Schiozer, Celio Maschio. Master's dissertation (mestrado), Universidade Estadual de Campinas, Faculdade de Engenharia Mecanica e Instituto de Geociencias.
Previous issue date: 2006. Abstract: The traditional history matching process usually results in a single deterministic model that is used to represent the reservoir, which may not be enough to guarantee reliable production forecasts, especially for fields in early production stages. This work presents a quantitative uncertainty analysis of reservoir attributes integrated with the history matching process. Instead of a deterministic approach, a probabilistic analysis of the reservoir models is used, resulting in uncertainty ranges for the production forecast and allowing a better prediction of reservoir performance. In the methodology used in this work, simulation data are compared to observed production data and, according to the difference between them, the probabilities of the scenarios are changed. In some of the proposed procedures, the values of the uncertain attributes also change, diminishing their uncertainty ranges. The main challenges of this work are: (1) determining a consistent and reliable procedure to integrate the uncertainty analysis and the history matching process, increasing the reliability of reservoir performance forecasts; and (2) developing an automatic procedure, making the work easier and speeding up the process. Several criteria were tested until the proposed methodology was validated, and based on the results a sequence for applying the proposed uncertainty reduction methods is suggested. The main contribution of this work is to increase the reliability of production predictions through reservoir simulation models and to show the necessity of incorporating uncertainties in the history matching process. Another contribution is to open this line of research by proposing and validating methods to integrate the history matching process and the uncertainty analysis. / Mestrado (Master's degree), Ciências e Engenharia de Petróleo.
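A toy sketch of the scenario-reweighting idea described above, with an exponential misfit-to-weight mapping and invented production data; the actual thesis procedures differ in detail:

```python
# Illustrative reweighting of scenario probabilities by history-match
# quality: scenarios whose simulated production deviates more from the
# observed history receive lower posterior probability. The misfit norm,
# tolerance s2 and all data values are assumptions for the example.
import numpy as np

observed = np.array([980.0, 1040.0, 1100.0, 1075.0])  # observed rates
simulated = {                                         # one series per scenario
    "low_perm":  np.array([900.0, 950.0, 1000.0, 980.0]),
    "base":      np.array([975.0, 1030.0, 1090.0, 1070.0]),
    "high_perm": np.array([1100.0, 1180.0, 1250.0, 1210.0]),
}
prior = {name: 1.0 / len(simulated) for name in simulated}

def misfit(sim, obs):
    return np.sum((sim - obs) ** 2) / len(obs)

# Likelihood-like weight: exp(-misfit / s2); s2 sets tolerance to error.
s2 = 2000.0
weights = {n: prior[n] * np.exp(-misfit(s, observed) / s2)
           for n, s in simulated.items()}
total = sum(weights.values())
posterior = {n: w / total for n, w in weights.items()}
print(posterior)   # updated scenario probabilities
```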
|
64 |
Improving Seasonal Rainfall and Streamflow Forecasting in the Sahel Region via Better Predictor Selection, Uncertainty Quantification and Forecast Economic Value Assessment. Sittichok, Ketvara, January 2016.
The Sahel region of West Africa is well known for its high rainfall variability. Severe and recurring droughts plagued the region during the last three decades of the 20th century, while heavy precipitation events (with return periods of up to 1,200 years) were reported between 2007 and 2014. Vulnerability to extreme events is partly due to the fact that people are not prepared to cope with them. It would be of great benefit to farmers if information about the magnitudes of precipitation and streamflow in the upcoming rainy season were available a few months in advance; they could then switch to better-adapted crops and farm management systems if required. Such information would also be useful for other sectors of the economy, such as hydropower production, domestic and industrial water consumption, fishing and navigation.
A logical solution to the above problem is seasonal rainfall and streamflow forecasting, which generates knowledge about the upcoming rainy season from information available before its beginning. The research in this thesis sought to improve seasonal rainfall and streamflow forecasting in the Sahel by developing statistical seasonal forecasting models for both variables. Sea surface temperatures (SSTs) were used as the pool of predictors. The developed method allowed a systematic search for the best period over which to calculate each predictor before it was used to predict average rainfall or streamflow over the upcoming rainy season.
Eight statistical models, built from various statistical methods including linear and polynomial regression, were developed in this study. Two main approaches to seasonal streamflow forecasting were developed: 1) A two-step approach (the indirect method), which first linked the average SST over a period prior to the forecast date to the average rainfall amount in the upcoming rainy season using the eight statistical models, and then linked the rainfall amount to streamflow using a rainfall-runoff model, the Soil and Water Assessment Tool (SWAT). In this approach, the forecasted rainfall was disaggregated to a daily time step using a simple approach (the fragment method) before being fed into SWAT.
2) A one-step approach (the direct method), which linked the average SST over a period prior to the forecast date directly to the average streamflow in the upcoming rainy season using the eight statistical models.
To decrease the uncertainty due to model selection, Bayesian Model Averaging (BMA) was also applied. This method can combine all available potential predictors instead of selecting one based on an arbitrary criterion. BMA is also capable of producing the probability density of the forecast, which allows end-users to visualize the density of the expected value and assess the level of uncertainty of the generated forecast. Finally, the economic value of the forecast system was estimated using a simple economic approach (the cost/loss ratio method).
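A rough sketch of the averaging step, assuming Gaussian member forecasts and likelihood-proportional weights (both simplifications; the thesis's BMA implementation is not reproduced here):

```python
# Sketch of Bayesian Model Averaging for a forecast: the combined
# predictive density is a weighted mixture of the member models'
# densities. Gaussian members and likelihood-proportional weights are
# simplifying assumptions; all numbers are invented.
import numpy as np
from scipy.stats import norm

# Each member model: (mean forecast, predictive std, model likelihood).
members = [(320.0, 40.0, 0.9), (295.0, 55.0, 0.6), (350.0, 35.0, 0.3)]

likelihoods = np.array([m[2] for m in members])
weights = likelihoods / likelihoods.sum()        # BMA weights

x = np.linspace(150, 500, 701)                   # seasonal rainfall, mm
mixture = sum(w * norm.pdf(x, mu, sd)
              for w, (mu, sd, _) in zip(weights, members))

mean_forecast = sum(w * mu for w, (mu, _, _) in zip(weights, members))
print(round(mean_forecast, 1), "mm expected seasonal rainfall")
```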
Each developed method was evaluated using three well-known model efficiency criteria: the Nash-Sutcliffe coefficient (Ef), the coefficient of determination (R2) and the hit score (H). The proposed models showed rainfall forecasting skill equivalent to or better than most research conducted in the Sahel region. The linear model driven by the Pacific SST produced the best rainfall forecasts (Ef = 0.82, R2 = 0.83, and H = 82%) at a lead time of up to 12 months. The rainfall forecasting model based on polynomial regression and forced by the Atlantic Ocean SST can be used with a lead time of up to 5 months and had a slightly lower performance (Ef = 0.80, R2 = 0.81, and H = 82%). Even though the natural relationship between rainfall and SST is nonlinear, this study found that good results can be achieved using linear models.
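For reference, the two continuous criteria can be computed directly from paired observed and forecast series; this is a generic illustration, not code from the thesis:

```python
# Nash-Sutcliffe efficiency and coefficient of determination for a
# forecast series versus observations (toy data).
import numpy as np

def nash_sutcliffe(obs, sim):
    obs, sim = np.asarray(obs), np.asarray(sim)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def r_squared(obs, sim):
    r = np.corrcoef(obs, sim)[0, 1]
    return r ** 2

obs = [210.0, 340.0, 280.0, 400.0, 310.0]   # e.g. seasonal totals, mm
sim = [230.0, 320.0, 300.0, 380.0, 295.0]
print(nash_sutcliffe(obs, sim), r_squared(obs, sim))
```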
For streamflow forecasting, the direct method using polynomial regression performed slightly better than the indirect method (Ef = 0.74, R2 = 0.76, and H = 84% for the direct method; Ef = 0.70, R2 = 0.69, and H = 77% for the indirect method). The direct method was driven by the Pacific SST and had a five-month lead time; the indirect method was driven by the Atlantic SST and had a six-month lead time. No significant difference in performance was found between BMA and the single-predictor linear regression models for streamflow forecasting. However, BMA was able to provide a probabilistic forecast that accounts for model selection uncertainty, while the linear regression model had a longer lead time.
The economic values of the forecasts developed using the direct and indirect methods were estimated using the cost/loss ratio method. The direct method was found to have a better value than the indirect method, and the value of the forecasts declined with higher return periods for all methods. Results also showed that, for the particular watershed under investigation, the direct method provided better information for flood protection.
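A minimal sketch of the cost/loss idea: a user pays a protection cost C whenever an event is forecast, or suffers a loss L when an unprotected event occurs, and the forecast's value is measured against climatology and perfect information. The contingency counts below are invented:

```python
# Cost/loss ratio valuation of a binary (event / no event) forecast.
# hits, misses, false_alarms, correct_negatives come from a verification
# contingency table; the values here are placeholders.
hits, misses, false_alarms, correct_negatives = 18, 4, 7, 71
n = hits + misses + false_alarms + correct_negatives
climatology = (hits + misses) / n        # base rate of the event

def relative_value(cost, loss):
    """Value of the forecast relative to climatology (1 = perfect)."""
    r = cost / loss
    # Expected expense using the forecast, climatology, perfect info:
    e_forecast = (hits + false_alarms) / n * cost + misses / n * loss
    e_climo = min(r, climatology) * loss  # always/never protect, cheaper one
    e_perfect = climatology * cost
    return (e_climo - e_forecast) / (e_climo - e_perfect)

print(relative_value(cost=1.0, loss=10.0))
```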
This research demonstrated the possibility of skillful seasonal streamflow forecasting in the Sirba watershed, using the tropical Pacific and Atlantic SSTs as predictors. The findings can be used to improve the performance of seasonal streamflow forecasting in the Sahel. A package implementing the statistical models developed in this study was created so that end users can apply them to seasonal rainfall or streamflow forecasting in any region of interest, using any predictor they may want to try.
|
65 |
An efficient analysis of Pareto optimal solutions in multidisciplinary design. Erfani, Tohid, January 2011.
Optimisation is one of the most important and challenging parts of any engineering design. Real-world design problems involve multiobjective optimisation under constraints. The optimal solution in these cases is not unique because the objectives can contradict each other; instead, a set of optimal solutions forming a Pareto frontier in the objective space is considered. There are many algorithms to generate the Pareto frontier, but only a few of them are potentially capable of providing an evenly distributed set of solutions. Such a property is especially important in real-life design because a decision maker is usually able to analyse only a very limited number of solutions. This thesis consists of two main parts. First, it develops and gives a detailed description of two different algorithms able to generate an evenly distributed Pareto set in a general formulation: one is a classical approach called Directed Search Domain (DSD); the other, the Cylindrical Constraint Evolutionary Algorithm (CCEA), is a hybrid population-based method. The efficiency of the algorithms is demonstrated on a number of challenging test cases and through comparisons with other existing methods; the proposed methods succeed in generating Pareto solutions even where some existing methods fail. In real-world design problems, deterministic approaches cannot provide a reliable solution, since under uncertainty a deterministic optimal solution would be infeasible in many instances. A solution less sensitive to problem perturbation is therefore desirable. This leads to the robust solution, which is the focus of the second part of the thesis. The literature offers some techniques tailored for robust optimisation, but most of them are either computationally expensive or do not systematically articulate the designer's preferences into a robust solution. In this thesis, by introducing a measure of robustness in a multiobjective context, a tunable robust function (TRF) is presented. It is demonstrated that including the TRF in the problem formulation yields the desired robust solution based on designer preferences. This not only provides the robust solution but also gives control over the robustness level. The method is efficient, as it increases the dimension of the problem by only one, irrespective of the dimension of the original problem.
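As background to what such algorithms must produce, a generic non-dominated filter (not the DSD or CCEA algorithms themselves) extracts the Pareto set from a sample of candidate designs:

```python
# Generic Pareto filter for minimization: a point is kept unless some
# other point is at least as good in every objective and strictly
# better in at least one.
import numpy as np

def pareto_front(points):
    pts = np.asarray(points)
    keep = np.ones(len(pts), dtype=bool)
    for i, p in enumerate(pts):
        dominated = np.all(pts <= p, axis=1) & np.any(pts < p, axis=1)
        dominated[i] = False          # a point never dominates itself
        if dominated.any():
            keep[i] = False
    return pts[keep]

# Two objectives to minimize, e.g. weight vs. drag of candidate designs.
candidates = [(1.0, 5.0), (2.0, 3.0), (3.0, 4.0), (4.0, 1.0), (2.5, 2.5)]
print(pareto_front(candidates))   # (3.0, 4.0) is dominated and dropped
```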
|
66 |
Parameterschätzung und Modellevaluation für komplexe Systeme (Parameter estimation and model evaluation for complex systems). Schumann-Bischoff, Jan, 06 April 2016.
No description available.
|
67 |
Water and Carbon Balance Modeling: Methods of Uncertainty Analysis. Juston, John, January 2010.
How do additional data of the same and/or different type contribute to reducing model parameter and predictive uncertainties? This question was addressed with two models, the HBV hydrological water balance model and the ICBM soil carbon balance model, which were used to investigate the usefulness of the Generalized Likelihood Uncertainty Estimation (GLUE) method for calibration and uncertainty analysis. The GLUE method is based on threshold screening of Monte Carlo simulations using so-called informal likelihood measures and a subjective acceptance criterion. This method is highly appropriate for model calibration when errors are dominated by epistemic rather than stochastic uncertainties. The informative value of data for model calibration was investigated through numerous calibrations aimed at conditioning posterior parameter distributions and boundaries on model predictions. The key results demonstrated examples of: 1) redundant information in daily time series of hydrological data; 2) diminishing returns in the value of continued time series data collection of the same type; 3) the potential value of additional data of a different type; 4) a means to effectively incorporate fuzzy information in model calibrations; and 5) the robustness of estimated parameter uncertainty for portability of a soil carbon model between temperate and tropical climate zones. The key to obtaining these insights lay in the methods of uncertainty analysis used to produce them. A paradigm for selecting between formal and informal likelihood measures in uncertainty analysis is presented and discussed for future use within a context of climate-related environmental modeling.
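A compact sketch of the GLUE procedure described above; the toy simulator, prior ranges and the 0.6 behavioural threshold are illustrative choices, not taken from the thesis:

```python
# GLUE in miniature: Monte Carlo sampling of parameters, an informal
# likelihood measure (Nash-Sutcliffe here), threshold screening into a
# "behavioural" set, and prediction bounds from the retained runs.
import numpy as np

rng = np.random.default_rng(0)
obs = np.array([2.1, 2.9, 3.4, 4.2, 4.8])      # observed series (toy data)
t = np.arange(len(obs))

def toy_model(a, b):
    """Stand-in simulator; a real study would run HBV or ICBM here."""
    return a * t + b

n = 5000
a = rng.uniform(0.0, 2.0, n)                   # uniform priors over
b = rng.uniform(0.0, 4.0, n)                   # assumed parameter ranges

def nse(sim):
    return 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

likelihood = np.array([nse(toy_model(a[i], b[i])) for i in range(n)])
behavioural = likelihood > 0.6                 # subjective acceptance threshold

sims = np.array([toy_model(a[i], b[i]) for i in np.flatnonzero(behavioural)])
# Simple 5-95% prediction bounds over behavioural runs (GLUE proper
# weights these by rescaled likelihood; omitted for brevity):
lower, upper = np.percentile(sims, [5, 95], axis=0)
print(behavioural.sum(), "behavioural runs of", n)
print("5-95% bounds:", lower.round(2), upper.round(2))
```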
|
68 |
A Computational Framework for Dam Safety Risk Assessment with Uncertainty Analysis. Srivastava, Anurag, 01 May 2013.
The growing application of risk analysis in dam safety, especially by owners of large numbers of dams (e.g., the U.S. Army Corps of Engineers), has motivated the development of a new tool (DAMRAE) for event-tree-based dam safety risk analysis. Various theoretical challenges were overcome in formulating the computational framework of DAMRAE, and several new computational concepts were introduced. The concepts of connectivity and pedigree matrices are proposed to quantify user-drawn event tree structures with proper accounting of interdependencies among the event tree branches. A generic calculation of common-cause adjustment for non-mutually exclusive failure modes is implemented, along with the new concepts of system response probability and consequence freezing. New output presentation formats, such as plots of cumulative risk estimate versus initiating variable, are introduced to analyze the increase of an incremental (annualized) risk estimate as a function of the initiating variable. Additional consideration is given to non-breach risk estimates in the risk modeling, and new output formats such as non-breach F-N and F-$ charts are included as risk analysis outputs.
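One simple way to picture a common-cause adjustment for non-mutually exclusive failure modes is shown below: the union probability is computed assuming independence and apportioned back to the individual modes. This scheme is illustrative only and is not necessarily the adjustment implemented in DAMRAE:

```python
# Illustrative common-cause adjustment: naive per-mode probabilities
# can sum to more than the probability that any failure occurs, so the
# union probability (under an independence assumption) is apportioned
# to the modes in proportion to their naive values.
from math import prod

def common_cause_adjust(p_modes):
    p_union = 1.0 - prod(1.0 - p for p in p_modes)  # P(at least one failure)
    naive_sum = sum(p_modes)
    return [p * p_union / naive_sum for p in p_modes]

p_modes = [0.02, 0.05, 0.01]       # per-event failure-mode probabilities
adjusted = common_cause_adjust(p_modes)
print(sum(adjusted), adjusted)     # adjusted values sum to the union
```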
DAMRAE, a Visual Basic.NET based framework, provides a convenient platform to structure the risk assessment of a dam in its existing state and for alternatives or various stages of implementing a risk reduction plan. The second chapter of the dissertation presents the architectural framework of DAMRAE and describes the underlying theoretical and computational logic employed in the software. An example risk assessment is presented in the third chapter to demonstrate the DAMRAE functionalities.
In the fourth chapter, the DAMRAE framework is extended into DAMRAE-U to incorporate uncertainty analysis functionality. This chapter reviews the various aspects of and requirements for uncertainty analysis in the context of dam safety risk assessment, and describes the theoretical challenges overcome in developing the computational framework for DAMRAE-U. The capabilities of DAMRAE-U are illustrated in the fifth chapter, which contains an example dam safety risk assessment with uncertainty analysis. The dissertation concludes with a summary of DAMRAE features and recommendations for further work in the sixth chapter.
|
69 |
Mapping and Uncertainty Analysis of Urban Vegetation Carbon Density by Combining Spatial Modeling, De-shadow & Spectral Unmixing Analysis. Qie, Guangping, 01 May 2019.
|
70 |
Modeling the Dissolution of Immiscible Contaminants in Groundwater for Decision Support. Prieto Estrada, Andres Eduardo, 27 June 2023.
Predicting the dissolution rates of immiscible contaminants in groundwater is crucial for developing environmental remediation strategies, but quantitative modeling efforts are inherently subject to multiple uncertainties. These include unknown residual amounts of non-aqueous phase liquids (NAPL) and source zone dimensions, inconsistent historical monitoring of contaminant mass discharge, and the mathematical simulation of field-scale mass transfer processes. Effective methods for simulating NAPL dissolution must therefore be able to assimilate a variety of data through physical and scalable mass transfer parameters to quantify and reduce site-specific uncertainties. This investigation coupled upscaled and numerical mass transfer modeling with uncertainty analyses to understand and develop data-assimilation and parameter-scaling methods for characterizing NAPL source zones and predicting depletion timeframes.
Parameters of key interest regulating kinetic NAPL persistence and contaminant fluxes are residual mass and saturation, but neither can be measured directly at field sites. However, monitoring and characterization measurements can constrain source zone dimensions, where NAPL mass is distributed. This work evaluated the worth of source zone delineation and dissolution monitoring for estimating NAPL mass and mass transfer coefficients at multiple scales of spatial resolution. Mass transfer processes in controlled laboratory and field experiments were analyzed by simulating monitored dissolved-phase concentrations through the parameterization of explicit and lumped system properties in volume-averaged (VA) and numerical models of NAPL dissolution, respectively. Both methods were coupled with uncertainty analysis tools to investigate the relationship between data availability and model design for accurately constraining system parameters and predictions. The modeling approaches were also combined for reproducing experimental bulk effluent rates in discretized domains, explicitly parameterizing mass transfer coefficients at multiple grid scales.
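As a sketch of the kind of kinetic dissolution law such models upscale, the following uses a first-order driving-force expression with a lumped mass transfer coefficient that declines as the source depletes; the specific depletion form and every parameter value are assumptions for illustration, not the dissertation's calibrated models:

```python
# Upscaled NAPL dissolution sketch: first-order driving-force law,
# J = k_la * (Cs - C), with the lumped coefficient declining as the
# source depletes via an empirical power-law exponent.
import numpy as np

k_la0 = 0.5      # initial lumped mass transfer coefficient, 1/day
beta = 0.8       # depletion exponent (empirical, assumed)
cs = 1100.0      # effective solubility, g/m^3 (= mg/L)
q = 0.05         # Darcy flux through the source zone, m/day
length = 2.0     # source-zone length along flow, m
theta = 0.35     # porosity
m0 = 5.0e3       # initial NAPL mass per unit cross-section, g/m^2

dt, t, mass = 1.0, 0.0, m0
while mass > 0.01 * m0 and t < 1e4:
    k_la = k_la0 * (mass / m0) ** beta
    # Effluent concentration for steady first-order transfer in the zone
    # (residence time = theta * length / q):
    c_out = cs * (1.0 - np.exp(-k_la * theta * length / q))
    mass -= q * c_out * dt          # g/m^2 removed this step
    t += dt
print(f"99% depletion after roughly {t:.0f} days")
```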
Research findings linked dissolved-phase monitoring signatures to model estimates of NAPL persistence, supported by source zone delineation data. The accurate characterization of source zone properties and kinetic dissolution rates, governing NAPL longevity, was achieved by adjusting model parameterization complexity to data availability. While multistage effluent rates accurately constrained explicit-process parameters in VA models, spatially-varying lumped-process parameters estimated from late dissolution stages also constrained unbiased predictions of NAPL depletion. Advantages of the numerical method included the simultaneous assimilation of bulk and high-resolution monitoring data for characterizing the distribution of residual NAPL mass and dissolution rates, whereas the VA method predicted source dissipation timeframes from delineation data alone. Additionally, comparative modeling analyses resulted in a methodology for scaling VA mass transfer coefficients to simulate NAPL dissolution and longevity at multiple grid resolutions. This research suggests feasibility in empirical constraining of lumped-process parameters by applying VA concepts to numerical mass transfer and transport models, enabling the assimilation of monitoring and source delineation data to reduce site-specific uncertainties. / Doctor of Philosophy / Predicting the dissolution rates of immiscible contaminants in groundwater is crucial for developing environmental restoration strategies, but quantitative modeling efforts are inherently subject to multiple uncertainties. These include unknown mass and dimensions of contaminant source zones, inconsistent groundwater monitoring, and the mathematical simulation of physical processes controlling dissolution rates at field scales. Effective simulation methods must therefore be able to leverage a variety of data through rate-limiting parameters suitable for quantifying and reducing uncertainties at contaminated sites. This investigation integrated mathematical modeling with uncertainty analyses to understand and develop data-driven approaches for characterizing contaminant source zones and predicting dissolution rates at multiple measurement scales.
Parameters of key interest regulating the lifespan of source zones are the distribution and amount of residual contaminant mass, which cannot be measured directly at field sites. However, monitoring and site characterization measurements can constrain source zone dimensions, where contaminant mass is distributed. This work evaluated the worth of source zone delineation and groundwater monitoring for estimating contaminant mass and dissolution rates at multiple measurement scales. Rate-limiting processes in controlled laboratory and field experiments were analyzed by simulating monitored groundwater concentrations through the explicit and lumped representation of system properties in volume-averaged (VA) and numerical models of contaminant dissolution, respectively. Both methods were coupled with uncertainty analysis tools to investigate the relationship between data availability and model design for accurately constraining system parameters and predictions. The approaches were also combined for predicting average contaminant concentrations at multiple scales of spatial resolution.
Research findings linked groundwater monitoring profiles to model estimates of contaminant persistence, supported by source zone delineation data. The accurate characterization of source zone properties and contaminant dissolution rates was achieved by adjusting model complexity to data availability. While monitoring profiles indicating multi-rate contaminant dissolution accurately constrained explicit-process parameters in VA models, spatially-varying lumped parameters estimated from late dissolution stages also constrained unbiased predictions of source mass depletion. Advantages of the numerical method included the simultaneous utilization of average and spatially-detailed monitoring data for characterizing the distribution of contaminant mass and dissolution rates, whereas the VA method predicted source longevity timeframes from delineation data alone. Additionally, comparative modeling analyses resulted in a methodology for scaling estimable VA parameters to predict contaminant dissolution rates at multiple scales of spatial resolution. This research suggests feasibility in empirical constraining of lumped parameters by applying VA concepts to numerical models, enabling a comprehensive data-driven methodology to quantify environmental risk and support groundwater cleanup designs.
|