121

Procédés de traitement biologiques in situ : la modélisation numérique comme outil d'aide à la décision / Biological treatment processes in situ : Numerical modelling as decision support tool

Verardo, Elicia 22 March 2016 (has links)
In-situ bioremediation is a commonly used remediation technology for treating soils and groundwater contaminated with petroleum hydrocarbons. Although demonstrating the relevance of this type of treatment is an essential prerequisite for each site where it is implemented, its effectiveness depends on the site-specific conditions of implementation. Monitoring and controlling the various processes that govern biodegradation is complex, and their optimization is a key element of successful treatment, both technically and economically. The general approach of the thesis is the development of a methodology for using numerical modelling within a management approach (in the sense of the French regulatory texts of 2007) for a petroleum-contaminated site treated by in-situ biodegradation. The originality of the work lies in using modelling as a tool for understanding mechanisms and for decision support at every stage of treatment: (i) system design: defining the best possible option; (ii) monitoring treatment effectiveness: optimizing the process; and (iii) predicting and justifying the end of treatment: securing the guaranteed result. Data from a study site support the definition of the modelling methodology. Each stage of an in-situ bioremediation project can be associated with a modelling stage of varying sophistication, depending on accuracy requirements. The first tool developed concerns the estimation of predictive uncertainty in the models used, which is fundamental whenever modelling is used for decision support. In-situ bioremediation involves complex and uncertain relationships among biomass, contaminants, and the appropriate control actions. Forecasting remedial performance (in terms of flux and/or mass reduction) is challenging because of uncertainties in the properties of the medium and the source, and in the bioremediation mechanisms themselves. The contribution of parametric uncertainty to forecasts of treatment performance is studied with the Null Space Monte Carlo (NSMC) method implemented in the PEST tool. The second tool concerns the optimization of the design and/or monitoring of an in-situ bio-treatment process. In this context, two objectives can be considered: reducing the contaminant flux, and removing mass from the source zone. The tool used is the mathematical optimization algorithm known as Particle Swarm Optimization (PSO). The choice of the objective function to optimize is particularly important and turns out to depend on the site-specific hydrogeochemical behaviour. This study shows that NSMC and PSO are appropriate tools for using reactive transport models in environmental management. The computation time of these highly parameterized, nonlinear models still limits the use of modelling as a decision-support tool. Despite these limitations, the proposed approach for managing in-situ bioremediation of groundwater at a real site can effectively support the management of pollution treatment, extending the field of application of numerical modelling. The approach also highlights difficulties arising from site-specific conditions or from the chosen treatment technique itself, allowing managers to be alerted in time.
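To make the optimization step concrete, below is a minimal sketch of Particle Swarm Optimization, the algorithm named in the abstract. It is a generic implementation, not the thesis's setup: the toy objective stands in for a reactive-transport model that would return, say, the contaminant flux for a candidate treatment design, and the swarm size, inertia, and acceleration coefficients are illustrative defaults.

```python
import numpy as np

def pso(objective, bounds, n_particles=30, n_iters=200,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal Particle Swarm Optimization (sketch, illustrative defaults)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    x = rng.uniform(lo, hi, (n_particles, lo.size))   # particle positions
    v = np.zeros_like(x)                              # particle velocities
    pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()            # best position so far
    for _ in range(n_iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        # inertia + pull toward personal best + pull toward global best
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([objective(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

# Toy objective standing in for a flux-reduction cost function
best, fbest = pso(lambda p: (p[0] - 1)**2 + (p[1] + 2)**2,
                  bounds=[(-5, 5), (-5, 5)])
print(best, fbest)
```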
122

Physics-Guided Machine Learning in Ocean Acoustics Using Fisher Information

Mortenson, Michael Craig 14 April 2022 (has links)
Waterborne acoustic signals carry information about the ocean environment. Ocean geoacoustic inversion is the task of estimating environmental parameters from received acoustic signals by matching the measured sound with the predictions of a physics-based model. A lower bound on the uncertainty associated with environmental parameter estimates, the Cramér-Rao bound, can be calculated from the Fisher information, which depends on derivatives of a physics-based model. Physics-based preconditioners circumvent the need for variable step sizes when computing these numerical derivatives. This work explores the feasibility of using a neural network to perform geoacoustic inversion for environmental parameters and their associated uncertainties from ship-noise spectrogram data. To train the neural networks, a synthetic dataset is generated; the trained networks are then tested for generalizability against 31 measurements taken during the SBCEX2017 study of the New England Mud Patch.
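As a rough illustration of the chain the abstract describes — physics-based model, numerical derivatives, Fisher information, Cramér-Rao bound — here is a minimal sketch assuming a generic forward model with additive Gaussian noise. The model, step sizes, and noise level are illustrative assumptions, not the thesis's acoustic model.

```python
import numpy as np

def crb_from_fisher(model, theta, sigma_noise, h=1e-4):
    """Cramér-Rao bound from numerically estimated Fisher information.

    Assumes additive Gaussian noise with std sigma_noise, so
    F = J^T J / sigma^2, where J is the Jacobian of the model output
    with respect to the parameters (central differences).
    """
    theta = np.asarray(theta, dtype=float)
    y0 = model(theta)
    J = np.empty((y0.size, theta.size))
    for i in range(theta.size):
        dt = np.zeros_like(theta)
        dt[i] = h * max(1.0, abs(theta[i]))  # scale step to parameter size
        J[:, i] = (model(theta + dt) - model(theta - dt)) / (2 * dt[i])
    fisher = J.T @ J / sigma_noise**2
    return np.linalg.inv(fisher)  # CRB: lower bound on estimator covariance

# Toy forward model standing in for a physics-based acoustic model
model = lambda th: th[1] * np.sin(th[0] * np.linspace(0, 1, 200))
crb = crb_from_fisher(model, theta=[3.0, 1.5], sigma_noise=0.1)
print(np.sqrt(np.diag(crb)))  # lower bounds on parameter standard deviations
```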
123

Performance Evaluation and Field Validation of Building Thermal Load Prediction Model

Sarwar, Riasat Azim 14 August 2015 (has links)
This thesis presents a performance evaluation and a field-validation study of a time- and temperature-indexed autoregressive model with exogenous inputs (4-3-5 ARX) for building thermal load prediction, with the aim of integrating the model with actual predictive control systems. The 4-3-5 ARX model is simple and computationally efficient, with relatively high prediction accuracy compared to existing sophisticated prediction models such as artificial neural networks. However, performance evaluation and field validation of the model are essential steps before implementing it in actual practice. The performance of the model was evaluated under different climate conditions as well as under modeling uncertainty. A field-validation study was carried out for three buildings at Mississippi State University. The results demonstrate that the 4-3-5 ARX model can predict building thermal loads accurately most of the time, indicating that it can be readily implemented in predictive control systems.
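For readers unfamiliar with ARX models, the sketch below fits a generic autoregressive model with exogenous inputs by ordinary least squares and uses it for one-step-ahead prediction. The orders and the synthetic data are illustrative; the thesis's specific 4-3-5 time- and temperature-indexed structure is not reproduced here.

```python
import numpy as np

def fit_arx(y, u, na=4, nb=3):
    """Fit y[k] = sum_i a_i*y[k-i] + sum_j b_j*u[k-j] by least squares.

    y: past thermal loads; u: exogenous input (e.g. outdoor temperature).
    Orders na, nb are illustrative, not the thesis's 4-3-5 structure.
    """
    k0 = max(na, nb)
    rows = [np.concatenate([y[k - na:k][::-1], u[k - nb:k][::-1]])
            for k in range(k0, len(y))]
    X, target = np.array(rows), y[k0:]
    coef, *_ = np.linalg.lstsq(X, target, rcond=None)
    return coef

def predict_next(coef, y_hist, u_hist, na=4, nb=3):
    """One-step-ahead prediction from the most recent samples."""
    phi = np.concatenate([y_hist[-na:][::-1], u_hist[-nb:][::-1]])
    return phi @ coef

# Synthetic demo data with a known ARX-like structure
rng = np.random.default_rng(0)
u = rng.normal(size=500)
y = np.zeros(500)
for k in range(2, 500):
    y[k] = 0.6 * y[k-1] + 0.2 * y[k-2] + 0.5 * u[k-1] + 0.05 * rng.normal()
coef = fit_arx(y, u)
print(predict_next(coef, y, u))  # predicted next load
```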
124

Traumatic brain injury: modeling and simulation of the brain at large deformation

Prabhu, Raj 06 August 2011 (has links)
The brain is a complex organ, and its response to mechanical loads at all strain rates is nonlinear and inelastic in nature. Split-Hopkinson Pressure Bar (SHPB) high-strain-rate compressive tests conducted on porcine brain samples showed strain-rate-dependent inelastic mechanical behavior. Finite Element (FE) modeling of the SHPB setup in ABAQUS/Explicit, using a specific constitutive model (MSU TP Ver. 1.1) for the brain, showed a non-uniform stress state during tissue deformation. Song et al.'s assertion that annular samples negate inertial effects was also tested. FE simulation results showed that neither cylindrical nor annular samples mitigated the initial hardening; further, a uniaxial stress state was not maintained in either case. Experimental studies of hydration effects on the mechanical response of the porcine brain revealed two different phenomenological trends. The wet brain (~80% water wt./wt.) showed strain-rate dependency along with two distinct mechanical behavior patterns at quasi-static and high strain rates. The dry brain's (~0% water wt./wt.) response was akin to that of metals, and its elastic modulus and yield stress were observed to be insensitive to strain rate. Uncertainty analysis of the wet-brain high-strain-rate data revealed large uncertainty bands for sample-to-sample random variations. This large uncertainty in the brain material should be taken into account in the FE modeling and design stages. FE simulations of blast loads to the human head showed that pressure played a dominant role in causing blast-related Traumatic Brain Injury (bTBI). Further, analysis of the shock waves exposed the deleterious effect of the three-dimensional geometry of the skull in pinning the location of bTBI. The effects of peak negative pressure at injury sites have been attributed to bTBI pathologies such as Diffuse Axonal Injury (DAI), subdural hemorrhage, and cerebral contusion.
125

Improving hydrological post-processing for assessing the conditional predictive uncertainty of monthly streamflows

Romero Cuellar, Jonathan 07 January 2020 (has links)
The quantification of predictive uncertainty in monthly streamflows is crucial for making reliable hydrological predictions that support decision-making in water resources management. Hydrological post-processing methods are suitable tools for estimating the predictive uncertainty of deterministic streamflow predictions (hydrological model outputs). This thesis focuses on improving hydrological post-processing methods for assessing the conditional predictive uncertainty of monthly streamflows. It addresses two issues of the hydrological post-processing scheme: (i) the heteroscedasticity problem and (ii) the intractable-likelihood problem. The thesis has three specific aims. First, regarding heteroscedasticity, we develop and evaluate a new post-processing approach, called the GMM post-processor, which combines the Bayesian joint probability modelling approach with Gaussian mixture models. We also compare the performance of the proposed post-processor with well-accepted existing post-processors for monthly streamflows across the twelve MOPEX catchments. From this aim (chapter 2), we find that the GMM post-processor is the best suited for estimating the conditional predictive uncertainty of monthly streamflows, especially for dry catchments. Second, we introduce a method to quantify the conditional predictive uncertainty in hydrological post-processing contexts where the likelihood is cumbersome to calculate (intractable likelihood), as can happen with complex models or in data-scarce settings such as ungauged catchments. We propose the ABC post-processor, which replaces the calculation of the likelihood function with the use of sufficient summary statistics and simulated datasets. With this aim in mind (chapter 3), we show that the conditional predictive distributions produced by the exact method (MCMC post-processor) and the approximate method (ABC post-processor) are qualitatively similar. This finding is significant because dealing with scarce information is a common condition in hydrological studies. Finally, we apply the ABC post-processing method to estimate the uncertainty of streamflow statistics obtained from climate change projections, a particular case of the intractable-likelihood problem. From this specific objective (chapter 4), we find that the ABC post-processor (1) offers more reliable projections than the 14 climate models without post-processing, and (2) produces more realistic uncertainty bands for streamflow statistics than the classical multi-model ensemble approach. / I would like to thank the Gobernación del Huila Scholarship Program No. 677 (Colombia) for providing the financial support for my PhD research. / Romero Cuellar, J. (2019). Improving hydrological post-processing for assessing the conditional predictive uncertainty of monthly streamflows [Tesis doctoral]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/133999
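Below is a minimal sketch of ABC by rejection, the idea behind the ABC post-processor described above: likelihood evaluation is replaced by a distance between summary statistics of simulated and observed flows. The toy gamma "streamflow" model, the choice of summary statistics, and the acceptance quantile are all illustrative assumptions, not the thesis's formulation.

```python
import numpy as np

def abc_rejection(simulate, summarize, obs_summary, prior_sample,
                  n_draws=10000, quantile=0.01):
    """Approximate Bayesian Computation by rejection (sketch).

    Keeps the prior draws whose simulated summary statistics fall
    closest to the observed ones, avoiding any likelihood evaluation.
    """
    thetas, dists = [], []
    for _ in range(n_draws):
        theta = prior_sample()
        sim = simulate(theta)
        dists.append(np.linalg.norm(summarize(sim) - obs_summary))
        thetas.append(theta)
    eps = np.quantile(dists, quantile)  # keep the closest 1% of draws
    return [t for t, d in zip(thetas, dists) if d <= eps]

# Illustrative pieces (not the thesis's model):
summarize = lambda q: np.array([q.mean(), q.std()])
obs = np.random.default_rng(1).gamma(2.0, 5.0, size=120)  # "observed" flows
simulate = lambda th: np.random.default_rng().gamma(th[0], th[1], size=120)
prior_sample = lambda: np.array([np.random.uniform(0.5, 5),
                                 np.random.uniform(1, 10)])
posterior = abc_rejection(simulate, summarize, summarize(obs), prior_sample)
print(len(posterior), np.mean(posterior, axis=0))
```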
126

Efficient, Accurate, and Non-Gaussian Error Propagation Through Nonlinear, Closed-Form, Analytical System Models

Anderson, Travis V. 29 July 2011 (has links) (PDF)
Uncertainty analysis is an important part of system design. The formula for error propagation through a system model most often cited in the literature is based on a first-order Taylor series. This formula makes several important assumptions and has several limitations that are often ignored. This thesis explores those assumptions and addresses two of the major limitations. First, the results obtained from propagating error through nonlinear systems can be wrong by one or more orders of magnitude, due to the linearization inherent in a first-order Taylor series. This thesis presents a method for overcoming that inaccuracy that is capable of achieving fourth-order accuracy without significant additional computational cost. Second, system designers using a Taylor series to propagate error typically propagate only a mean and variance and ignore all higher-order statistics. Consequently, a Gaussian output distribution must be assumed, which often does not reflect reality. This thesis presents a proof that nonlinear systems do not produce Gaussian output distributions, even when inputs are Gaussian. A second-order Taylor series is then used to propagate both skewness and kurtosis through a system model, giving the system designer a fully described non-Gaussian output distribution. The benefits of having a fully described output distribution are demonstrated using the examples of a flat-rolling metalworking process and the propeller component of a solar-powered unmanned aerial vehicle.
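For reference, the first-order propagation formula the abstract refers to is the standard one for independent inputs; a second-order expansion already shifts the mean, which hints at why nonlinear systems cannot produce Gaussian outputs even for Gaussian inputs:

```latex
% First-order (linearized) propagation for y = f(x_1,\dots,x_n),
% with independent inputs:
\sigma_y^2 \;\approx\; \sum_{i=1}^{n}
  \left(\frac{\partial f}{\partial x_i}\right)^{\!2} \sigma_{x_i}^2

% Second-order mean correction: the output mean moves away from f(\mu),
% so the output distribution is generally non-Gaussian:
E[y] \;\approx\; f(\mu) + \frac{1}{2}\sum_{i=1}^{n}
  \frac{\partial^2 f}{\partial x_i^2}\,\sigma_{x_i}^2
```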
127

Development and Evaluation of Polaris CANDU Geometry Modelling and of TRACE_Mac/PARCS_Mac Coupling with RRS for CANDU Analysis / Polaris and TRACE/PARCS Code Development for CANDU Analysis

Younan, Simon January 2022 (has links)
McMaster University, Doctor of Philosophy (2022), Hamilton, Ontario (Engineering Physics). Supervisor: Dr. David Novog. Number of pages: xiv, 163. / In the field of nuclear safety analysis, as computers have become more powerful, there has been a trend away from low-fidelity models using conservative assumptions toward high-fidelity best-estimate models combined with uncertainty analysis. A number of these tools have been developed in the United States, owing to the popularity of light water reactors. These include the SCALE analysis suite developed by ORNL, as well as the PARCS and TRACE tools backed by the USNRC. This work explores adapting the capabilities of these tools to the analysis of CANDU reactors. The Polaris sequence, introduced in SCALE 6.2, was extended in this work to support CANDU geometries and compared to existing SCALE sequences such as TRITON. Emphasis was placed on the Embedded Self-Shielding Method (ESSM), introduced with Polaris. Both Polaris and ESSM were evaluated and found to perform adequately for CANDU geometries. The accuracy of ESSM was found to improve when the precomputed self-shielding factors were updated using a CANDU representation. The PARCS diffusion code and the TRACE system thermal-hydraulics code were coupled using the built-in coupling capability between the two codes. In addition, the Exterior Communications Interface (ECI), used for coupling with TRACE, was utilized: a Python interface to the ECI library was developed in this work and used to couple an RRS model written in Python to the coupled PARCS/TRACE model. A number of code modifications were made to accommodate the required coupling and to correct code deficiencies, with the modified versions named PARCS_Mac and TRACE_Mac. The coupled codes were able to simulate multiple transients based on prior studies as well as operational events. The code updates performed in this work may be used for many future studies, particularly for uncertainty propagation through a full set of calculations, from the lattice model to a full coupled system model. / Thesis / Doctor of Philosophy (PhD) / Modern nuclear safety analysis tools offer more accurate predictions of the safety and operation of nuclear reactors, including CANDU reactors. These codes take advantage of modern computer hardware and of a shift in philosophy from conservative analysis to best-estimate-plus-uncertainty analysis. The goal of this thesis was to adapt a number of modern tools to support CANDU analysis and uncertainty propagation, with particular emphasis on the coupling of multiple interacting models. These tools were then demonstrated and the results analyzed. The simulations performed in this work successfully produced results comparable to prior studies along with experimental and operational data, including the simulation of four weeks of reactor operation with "shim mode" operation. Sensitivity and uncertainty analyses were performed over the course of the work to quantify the precision and significance of the results and to identify areas of interest for future research.
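To give a feel for the kind of lockstep exchange the RRS coupling implies, here is a schematic Python sketch of an explicit coupling loop. Every name here (EciChannel, RegulatingSystem) is a hypothetical stand-in with toy dynamics: the thesis used a custom Python wrapper around TRACE's Exterior Communications Interface, whose actual API is not reproduced.

```python
# All classes below are illustrative stubs, not the thesis's ECI wrapper.

class EciChannel:
    """Stand-in for the code-coupling channel to TRACE/PARCS (stub)."""
    def __init__(self):
        self.power = 0.9  # normalized reactor power (toy state)
    def receive(self):
        return {"power": self.power}
    def send_and_advance(self, actions, dt):
        # A real channel would hand actions to TRACE and step both codes;
        # here a toy update stands in for the coupled plant response.
        self.power += actions["reactivity"] * dt

class RegulatingSystem:
    """Toy proportional controller standing in for the RRS logic."""
    def __init__(self, setpoint=1.0, gain=0.5):
        self.setpoint, self.gain = setpoint, gain
    def step(self, state, dt):
        error = self.setpoint - state["power"]
        return {"reactivity": self.gain * error}

eci, rrs = EciChannel(), RegulatingSystem()
t, dt = 0.0, 0.1
while t < 10.0:                        # lockstep explicit coupling loop
    actions = rrs.step(eci.receive(), dt)
    eci.send_and_advance(actions, dt)
    t += dt
print(round(eci.receive()["power"], 3))  # power approaches the setpoint
```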
128

Monte Carlo analysis of BWR transients : A study on the Best Estimate Plus Uncertainty methodology within safety analyses at Forsmark 3

Eriksson, Jonathan January 2023 (has links)
Transient analyses at the Forsmark nuclear power plant are currently performed using realistic computer models in conjunction with conservative estimates of initial and boundary conditions. This is known as the combined methodology. Because of the conservative estimates, the methodology risks over-estimating certain safety criteria, which negatively affects the optimization of reactor operation. The Best Estimate Plus Uncertainty (BEPU) methodology can provide higher safety margins by using probabilities instead of conservatisms when estimating initial and boundary conditions. The BEPU methodology applies a Monte Carlo method to assess the distribution of one or several key outputs. This study focuses on the lowest dryout margin achieved during each Monte Carlo simulation. The tolerance limits of the output are set with the help of Wilks' formula, using a one-sided 95% tolerance limit with 95% confidence. A total of 36 unique parameters describing initial and boundary conditions were sampled for each Monte Carlo simulation, using either Gaussian or uniform distribution functions. The random nature of the Monte Carlo simulations has uncovered alternative event sequences and end states that are not seen in the combined methodology. Assessing the choice of order statistic in Wilks' formula also shows diminishing returns as the order statistic increases; when choosing it, one should weigh the increased accuracy of the estimated outputs against the increased computational time required. The conservative methodology uses a mix of conservative and nominal estimates of key parameters, so the difference in dryout margin between the conservative and the Monte Carlo results should not be used to conclude which methodology out-performs the other. While the Monte Carlo simulations do not result in improved core optimization, they can complement the combined methodology by providing a more detailed analysis of possible event pathways for a postulated transient.
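The sample sizes behind the 95%/95% criterion, and the diminishing returns of higher order statistics, can be reproduced with a few lines. This is the standard Wilks calculation, not code from the study: the m-th largest of n runs bounds the p-quantile exactly when at least m samples exceed it, a Binomial(n, 1-p) event.

```python
from math import comb

def wilks_confidence(n, p=0.95, m=1):
    """Confidence that the m-th largest of n runs bounds the p-quantile."""
    return 1.0 - sum(comb(n, i) * (1 - p)**i * p**(n - i) for i in range(m))

def min_runs(p=0.95, beta=0.95, m=1):
    """Smallest number of Monte Carlo runs meeting a p/beta criterion."""
    n = m
    while wilks_confidence(n, p, m) < beta:
        n += 1
    return n

# Diminishing returns with higher order statistics (95%/95%, one-sided):
for m in (1, 2, 3, 4):
    print(m, min_runs(m=m))   # -> 59, 93, 124, 153 runs required
```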
129

Development of Data-Driven Models for Membrane Fouling Prediction at Wastewater Treatment Plants

Kovacs, David January 2022 (has links)
Membrane bioreactors (MBRs) have proven to be an extremely effective wastewater treatment process, combining ultrafiltration with biological processes to produce high-quality effluent. However, one of the major drawbacks of this technology is membrane fouling – an inevitable process that reduces permeate production and increases operating costs. Predicting membrane fouling in MBRs is important because it can provide decision support to wastewater treatment plant (WWTP) operators. Currently, mechanistic models are often used to estimate transmembrane pressure (TMP), an indicator of membrane fouling, but their performance is not always satisfactory. In this research, existing mechanistic and data-driven models of membrane fouling are investigated. Data-driven machine learning techniques – random forests (RF), artificial neural networks (ANN), and long short-term memory (LSTM) networks – are used to build models that predict TMP at various stages of the MBR production cycle. The models are built with 4 years of high-resolution data from a confidential full-scale municipal WWTP. Model performance is examined using statistical measures such as the coefficient of determination (R²), root mean squared error, mean absolute percentage error, and mean squared error. The results show that all models provide reliable predictions, with the RF models having the best predictive accuracy compared to the ANN and LSTM models: the R² values for RF when predicting TMP before, during, and after a back pulse are 0.996, 0.927, and 0.996, respectively. Model uncertainty (including hyperparameter and algorithm uncertainty) is quantified to determine the impact of hyperparameter tuning and the variance of extreme predictions caused by algorithm choice. The ANN models are the most affected by hyperparameter tuning and have the highest variability when predicting extreme values within each model's respective hyperparameter range. The proposed models can be useful tools for providing decision support to WWTP operators employing fouling-mitigation strategies, potentially leading to better operation of WWTPs and reduced costs. / Thesis / Master of Applied Science (MASc)
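A minimal sketch of the random-forest TMP predictor described above, using scikit-learn. The feature names and units are hypothetical examples of MBR operating variables, and the target is synthetic; the thesis's actual inputs come from a confidential full-scale WWTP dataset.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 5000
X = pd.DataFrame({
    "permeate_flux": rng.uniform(15, 35, n),       # L/m2/h (assumed feature)
    "mlss": rng.uniform(6, 12, n),                 # g/L (assumed feature)
    "temperature": rng.uniform(8, 25, n),          # deg C (assumed feature)
    "minutes_since_backpulse": rng.uniform(0, 30, n),
})
# Synthetic TMP target standing in for the plant measurements
y = (0.4 * X["permeate_flux"] + 0.8 * X["mlss"]
     + 0.05 * X["minutes_since_backpulse"] + rng.normal(0, 1, n))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_tr, y_tr)
print(f"R2 = {r2_score(y_te, model.predict(X_te)):.3f}")
```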
130

Evaluation of Uncertainty in Hydrodynamic Modeling

Camacho Rincon, Rene Alexander 17 August 2013 (has links)
Uncertainty analysis in hydrodynamic modeling is useful for identifying and reporting the limitations of a model caused by different sources of error. In practice, the main sources of error are divided into model-structure errors; errors in the input data, due to measurement imprecision among other causes; and parametric errors, resulting from the difficulty of identifying physically representative parameter values valid at the temporal and spatial scale of the models. This investigation identifies, implements, evaluates, and recommends a set of methods for the evaluation of model-structure uncertainty, parametric uncertainty, and input-data uncertainty in hydrodynamic modeling studies. A comprehensive review of uncertainty analysis methods is provided, and a set of widely applied methods is selected and implemented in real case studies, identifying the main limitations and benefits of their use in hydrodynamic studies. In particular, the following methods are investigated: First Order Variance Analysis (FOVA), Monte Carlo Uncertainty Analysis (MCUA), Bayesian Monte Carlo (BMC), Markov Chain Monte Carlo (MCMC), and Generalized Likelihood Uncertainty Estimation (GLUE). The results of this investigation indicate that the uncertainty estimates computed with FOVA are consistent with the results obtained by MCUA. In addition, the comparison of BMC, MCMC, and GLUE indicates that BMC and MCMC provide similar estimates of the posterior parameter probability distributions, single-point parameter values, and uncertainty bounds, mainly because they use the same likelihood function and because few parameters are involved in the inference process. However, the implementation of MCMC is substantially more complex than that of BMC, given that its sampling algorithm requires a careful definition of auxiliary proposal probability distributions, along with their variances, to obtain parameter samples that effectively belong to the posterior parameter distribution. The analysis also suggests that the results of GLUE are inconsistent with those of BMC and MCMC. It is concluded that BMC is a powerful and parsimonious strategy for evaluating all sources of uncertainty in hydrodynamic modeling. Despite its computational requirements, the method can easily be implemented in most practical applications.
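To give a flavour of the BMC scheme the abstract favours, here is a sketch that weights prior parameter draws by a Gaussian likelihood and extracts weighted predictive bounds. The exponential-decay "model", error standard deviation, and priors are illustrative stand-ins for a hydrodynamic model and its parameters.

```python
import numpy as np

def bayesian_monte_carlo(simulate, prior_sample, obs, sigma, n=5000):
    """Bayesian Monte Carlo: weight prior draws by a Gaussian likelihood."""
    thetas = np.array([prior_sample() for _ in range(n)])
    sims = np.array([simulate(th) for th in thetas])
    resid = sims - obs                       # shape (n, n_obs)
    loglik = -0.5 * np.sum((resid / sigma)**2, axis=1)
    w = np.exp(loglik - loglik.max())
    w /= w.sum()                             # normalized posterior weights
    return thetas, sims, w

def weighted_bounds(sims, w, lo=0.05, hi=0.95):
    """Weighted predictive quantiles at each observation point."""
    bounds = []
    for j in range(sims.shape[1]):
        order = np.argsort(sims[:, j])
        cdf = np.cumsum(w[order])
        bounds.append((sims[order][np.searchsorted(cdf, lo), j],
                       sims[order][np.searchsorted(cdf, hi), j]))
    return np.array(bounds)

# Toy "hydrodynamic" model: exponential decay with unknown scale and rate
t = np.linspace(0, 5, 20)
obs = 3.0 * np.exp(-0.7 * t) + np.random.default_rng(3).normal(0, 0.1, t.size)
simulate = lambda th: th[0] * np.exp(-th[1] * t)
prior_sample = lambda: np.array([np.random.uniform(1, 5),
                                 np.random.uniform(0.1, 2)])
thetas, sims, w = bayesian_monte_carlo(simulate, prior_sample, obs, sigma=0.1)
print(weighted_bounds(sims, w)[:3])  # 90% predictive bands, first 3 points
```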
