121

[en] SEMI-QUANTITATIVE METHODOLOGY FOR ASSESSING THE RISK OF CO2 INJECTION FOR STORAGE IN GEOLOGICAL RESERVOIRS / [pt] METODOLOGIA SEMI-QUANTITATIVA PARA AVALIAÇÃO DO RISCO DA INJEÇÃO DE CO2 PARA ARMAZENAMENTO EM RESERVATÓRIOS GEOLÓGICOS

FERNANDA LINS GONCALVES PEREIRA 03 October 2016 (has links)
The last stage of carbon capture and sequestration (CCS) can be performed by injecting CO2 into geological reservoirs. CCS projects belong to a set of techniques for mitigating greenhouse gases. In this work, a semi-quantitative methodology to assess the risk of CO2 injection into geological reservoirs is developed. The methodology is based on the construction and application of a risk matrix, which holds severity categories, set qualitatively, in one direction and probability categories, set from probabilistic analyses, in the other. The risk values of a hazard source are calculated as the product of its severities and their associated probabilities. Hazard sources are problems related to CO2 injection that are selected for the analysis of a specific scenario. The severity categories are defined by ranges of operating levels of a hazard source. Several probabilistic analysis methods are investigated, and the family of mean-value methods shows characteristics that favor its use with complex limit-state functions. The methodology is applied to an illustrative case study. With the resulting risk values, the main hazard source and the most influential random variables are identified. Assessment of the methodology indicates that it is a powerful tool for analysts and decision makers, with the potential to assist in the planning phase of CCS projects.
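As a rough illustration of the severity-times-probability calculation described above, the following Python sketch builds a small risk matrix for hypothetical hazard sources with made-up severity and probability values (none of the numbers or hazard names come from the thesis, and summing the per-category products into one score is only one possible aggregation rule):

```python
import numpy as np

# Hypothetical hazard sources and illustrative category values (not from the thesis).
hazard_sources = ["caprock fracturing", "well leakage", "fault reactivation"]

# Severity categories on a qualitative 1-5 scale, and probabilities estimated
# from a probabilistic analysis, one per severity level and hazard source.
severity_levels = np.array([
    [1, 2, 3, 4, 5],   # caprock fracturing
    [1, 2, 3, 4, 5],   # well leakage
    [1, 2, 3, 4, 5],   # fault reactivation
])
probabilities = np.array([
    [0.40, 0.30, 0.20, 0.08, 0.02],
    [0.55, 0.25, 0.12, 0.06, 0.02],
    [0.70, 0.20, 0.06, 0.03, 0.01],
])

# Risk of each hazard source = sum over categories of severity * probability.
risk = (severity_levels * probabilities).sum(axis=1)

for name, r in sorted(zip(hazard_sources, risk), key=lambda t: -t[1]):
    print(f"{name:20s} risk = {r:.2f}")
```

Ranking the hazard sources by this score mirrors the abstract's identification of the main hazard source from the resulting risk values.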
122

Procédés de traitement biologiques in situ : la modélisation numérique comme outil d'aide à la décision / Biological treatment processes in situ : Numerical modelling as decision support tool

Verardo, Elicia 22 March 2016 (has links)
In-situ bioremediation is a commonly used remediation technique for cleaning up the subsurface of petroleum-contaminated sites. Although demonstrating the relevance of this type of treatment is an essential prerequisite for each site where it is implemented, the effectiveness of the treatment depends on the conditions of its implementation in the site-specific context. Monitoring and controlling the different processes that govern biodegradation is complex, and their optimization is a key element of successful treatment, both technically and economically. The general approach of the thesis is the development of a methodology for using modelling within a management approach (as defined in the French regulatory texts) for a petroleum-contaminated site treated by in-situ biodegradation. The work focuses on the use of modelling as a tool for understanding mechanisms and for decision support at every stage of treatment: (i) system design, to define the best possible option; (ii) monitoring of treatment effectiveness, to optimize the process; and (iii) prediction and justification of stopping the treatment, through analysis of the uncertainty on the treatment result. Data from a study site are used to define the modelling methodology. Each stage of an in-situ bioremediation project (design, monitoring, optimization) can be associated with a modelling step of varying sophistication, depending on the accuracy requirements. The first tool developed concerns the estimation of predictive uncertainty, which is crucial whenever modelling is used for decision support, and can be applied at the design step or when forecasting treatment effectiveness. In-situ bioremediation involves complex and uncertain relationships among biomass, contaminants and the appropriate control actions. Forecasting remedial performance (in terms of flux and mass reduction) is a challenge because of uncertainties associated with (i) the medium and source properties and (ii) the efficiency of the concentration-reducing mechanisms. The contribution of parametric uncertainty to forecasts of treatment performance is studied with the Null-Space Monte Carlo (NSMC) method implemented in the PEST tool. The second tool concerns the optimization of the design and/or monitoring of the in-situ bio-treatment process. In this context, two objectives can be considered: reducing the contaminant flux, or removing mass from the source zone. The tool used is a mathematical optimization algorithm called Particle Swarm Optimisation (PSO). The choice of the objective function to be optimized is particularly important and turns out to be related to the site-specific hydrogeochemical behaviour. This study shows that the NSMC and PSO methods are suitable tools for the efficient use of reactive transport models in environmental management, although the computation time of these highly parameterized, nonlinear models still limits the use of modelling as a decision support tool. Despite these limitations, the proposed approach to managing in-situ bioremediation of groundwater at a real site can effectively support the management of a pollution treatment, thereby extending the field of application of numerical modelling. The approach also highlights difficulties arising from site-specific conditions or from the chosen treatment technique, so that site managers can be alerted in time.
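Particle Swarm Optimisation itself is a generic algorithm, so a minimal sketch can illustrate the kind of optimizer the abstract refers to. The objective below is a placeholder quadratic standing in for a site-specific contaminant-flux or source-mass metric; none of the parameter names, bounds, or tuning constants come from the thesis:

```python
import numpy as np

def pso(objective, bounds, n_particles=30, n_iter=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimisation over box constraints `bounds` (shape (dim, 2))."""
    rng = np.random.default_rng(seed)
    dim = len(bounds)
    lo, hi = bounds[:, 0], bounds[:, 1]
    x = rng.uniform(lo, hi, size=(n_particles, dim))     # particle positions
    v = np.zeros_like(x)                                  # particle velocities
    pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([objective(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

# Placeholder objective: e.g. residual contaminant flux as a function of two design
# variables (injection rates, well placement, ...). Purely illustrative.
objective = lambda p: (p[0] - 1.0) ** 2 + (p[1] + 0.5) ** 2
best_x, best_f = pso(objective, np.array([[-5.0, 5.0], [-5.0, 5.0]]))
print("best design:", best_x, "objective:", best_f)
```

In practice each objective evaluation would be a run of the reactive transport model, which is why the abstract flags computation time as the main limitation.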
123

Physics-Guided Machine Learning in Ocean Acoustics Using Fisher Information

Mortenson, Michael Craig 14 April 2022 (has links)
Waterborne acoustic signals carry information about the ocean environment. Ocean geoacoustic inversion is the task of estimating environmental parameters from received acoustic signals by matching the measured sound with the predictions of a physics-based model. A lower bound on the uncertainty associated with environmental parameter estimates, the Cramér-Rao bound, can be calculated from the Fisher information, which is dependent on derivatives of a physics-based model. Physics-based preconditioners circumvent the need for variable step sizes when computing numerical derivatives. This work explores the feasibility of using a neural network to perform geoacoustic inversion for environmental parameters and their associated uncertainties from ship noise spectrogram data. To train neural networks, a synthetic dataset is generated and tested for generalizability against 31 measurements taken during the SBCEX2017 study of the New England Mud Patch.
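For the Fisher-information step mentioned above, a common formulation for additive Gaussian noise is J = GᵀG/σ², with G the Jacobian of the forward model, and the Cramér-Rao bound given by the diagonal of J⁻¹. The sketch below assumes that formulation and uses a toy forward model in place of the physics-based acoustic propagation model, so it only illustrates the mechanics, not the thesis' actual setup:

```python
import numpy as np

def forward_model(theta):
    """Placeholder forward model mapping environmental parameters to predicted data.
    In the thesis this would be a physics-based ocean acoustic propagation model."""
    freq = np.linspace(0.1, 1.0, 50)
    return theta[0] * np.exp(-theta[1] * freq)

def fisher_information(theta, sigma, h=1e-6):
    """Fisher information for additive Gaussian noise: J = G^T G / sigma^2,
    with G the model Jacobian approximated by central finite differences."""
    theta = np.asarray(theta, dtype=float)
    d0 = forward_model(theta)
    G = np.zeros((d0.size, theta.size))
    for i in range(theta.size):
        dp, dm = theta.copy(), theta.copy()
        dp[i] += h
        dm[i] -= h
        G[:, i] = (forward_model(dp) - forward_model(dm)) / (2 * h)
    return G.T @ G / sigma**2

theta = np.array([1.0, 2.0])        # hypothetical environmental parameters
J = fisher_information(theta, sigma=0.05)
crb = np.linalg.inv(J)              # Cramér-Rao lower bound on parameter covariance
print("CRB standard deviations:", np.sqrt(np.diag(crb)))
```

The square roots of the CRB diagonal are the lower bounds on the parameter uncertainties that the neural network is trained to reproduce alongside the parameter estimates.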
124

Performance Evaluation and Field Validation of Building Thermal Load Prediction Model

Sarwar, Riasat Azim 14 August 2015 (has links)
This thesis presents a performance evaluation and a field validation study of a time- and temperature-indexed autoregressive model with exogenous inputs (4-3-5 ARX) for building thermal load prediction, with the aim of integrating the model into actual predictive control systems. The 4-3-5 ARX model is simple and computationally efficient, with relatively high prediction accuracy compared to existing sophisticated prediction models such as artificial neural networks. However, performance evaluation and field validation of the model are essential steps before implementing it in practice. The performance of the model was evaluated under different climate conditions as well as under modeling uncertainty. A field validation study was carried out for three buildings at Mississippi State University. The results demonstrate that the 4-3-5 ARX model can predict building thermal loads accurately most of the time, indicating that the model can be readily implemented in predictive control systems.
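A generic ARX fit illustrates the class of model the abstract describes. The sketch below uses an ordinary least-squares ARX on synthetic data and does not reproduce the thesis' specific time- and temperature-indexed 4-3-5 structure; the orders, coefficients, and signals are invented:

```python
import numpy as np

def fit_arx(y, u, na=4, nb=3):
    """Least-squares fit of a generic ARX model
       y[k] = sum_i a_i * y[k-i] + sum_j b_j * u[k-j] + e[k]."""
    n0 = max(na, nb)
    rows, targets = [], []
    for k in range(n0, len(y)):
        rows.append(np.concatenate([y[k - na:k][::-1], u[k - nb:k][::-1]]))
        targets.append(y[k])
    theta, *_ = np.linalg.lstsq(np.array(rows), np.array(targets), rcond=None)
    return theta  # [a_1..a_na, b_1..b_nb]

# Synthetic example: thermal load y driven by an outdoor-temperature-like input u.
rng = np.random.default_rng(1)
u = 20 + 5 * np.sin(np.linspace(0, 20, 500)) + rng.normal(0, 0.2, 500)
y = np.zeros(500)
for k in range(2, 500):
    y[k] = 0.6 * y[k - 1] + 0.2 * y[k - 2] + 0.8 * u[k - 1] + rng.normal(0, 0.1)

print("estimated ARX coefficients:", fit_arx(y, u))
```

The appeal of this model class for predictive control, as the abstract notes, is that fitting and prediction reduce to small linear-algebra operations.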
125

Traumatic brain injury: modeling and simulation of the brain at large deformation

Prabhu, Raj 06 August 2011 (has links)
The brain is a complex organ, and its response to mechanical loads is nonlinear and inelastic at all strain rates. Split-Hopkinson Pressure Bar (SHPB) high-strain-rate compressive tests conducted on porcine brain samples showed strain-rate-dependent inelastic mechanical behavior. Finite Element (FE) modeling of the SHPB setup in ABAQUS/Explicit, using a specific constitutive model for the brain (MSU TP Ver. 1.1), showed a non-uniform stress state during tissue deformation. Song et al.'s assertion that annular samples negate inertial effects was also tested: FE simulation results showed that neither cylindrical nor annular samples mitigated the initial hardening, and a uniaxial stress state was not maintained in either case. Experimental studies of the effect of hydration on the mechanical response of the porcine brain revealed two distinct phenomenological trends. The wet brain (~80% water wt./wt.) showed strain-rate dependency along with two unique mechanical behavior patterns at quasi-static and high strain rates. The dry brain's (~0% water wt./wt.) response was akin to that of metals, with its elastic modulus and yield stress insensitive to strain rate. Uncertainty analysis of the wet-brain high-strain-rate data revealed large uncertainty bands for the sample-to-sample random variations; this large uncertainty in the brain material should be taken into account in the FE modeling and design stages. FE simulations of blast loads on the human head showed that pressure plays a dominant role in causing blast-related Traumatic Brain Injury (bTBI). Further, analysis of the shock waves exposed the deleterious effect of the three-dimensional geometry of the skull in pinning the location of bTBI. The effects of peak negative pressure at injury sites have been linked to bTBI pathologies such as Diffuse Axonal Injury (DAI), subdural hemorrhage and cerebral contusion.
126

Efficient, Accurate, and Non-Gaussian Error Propagation Through Nonlinear, Closed-Form, Analytical System Models

Anderson, Travis V. 29 July 2011 (has links) (PDF)
Uncertainty analysis is an important part of system design. The formula for error propagation through a system model that is most often cited in the literature is based on a first-order Taylor series. This formula makes several important assumptions and has several important limitations that are often ignored. This thesis explores these assumptions and addresses two of the major limitations. First, the results obtained from propagating error through nonlinear systems can be wrong by one or more orders of magnitude, due to the linearization inherent in a first-order Taylor series. This thesis presents a method for overcoming that inaccuracy that is capable of achieving fourth-order accuracy without significant additional computational cost. Second, system designers using a Taylor series to propagate error typically propagate only a mean and variance and ignore all higher-order statistics. Consequently, a Gaussian output distribution must be assumed, which often does not reflect reality. This thesis presents a proof that nonlinear systems do not produce Gaussian output distributions, even when inputs are Gaussian. A second-order Taylor series is then used to propagate both skewness and kurtosis through a system model. This allows the system designer to obtain a fully described non-Gaussian output distribution. The benefits of having a fully described output distribution are demonstrated using the examples of a flat-rolling metalworking process and the propeller component of a solar-powered unmanned aerial vehicle.
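The first-order Taylor-series propagation formula discussed above can be written as var(y) ≈ Σᵢ (∂f/∂xᵢ)² var(xᵢ). The sketch below, a minimal illustration on an invented nonlinear model, compares that linearized estimate against a Monte Carlo reference to show the kind of error the thesis addresses; it does not implement the thesis' higher-order correction:

```python
import numpy as np

def first_order_variance(f, mu, sigma, h=1e-6):
    """First-order Taylor-series error propagation:
       var(y) ~= sum_i (df/dx_i)^2 * var(x_i), with derivatives taken at the input means."""
    mu = np.asarray(mu, dtype=float)
    grad = np.zeros_like(mu)
    for i in range(mu.size):
        xp, xm = mu.copy(), mu.copy()
        xp[i] += h
        xm[i] -= h
        grad[i] = (f(xp) - f(xm)) / (2 * h)
    return np.sum(grad**2 * np.asarray(sigma) ** 2)

# Illustrative nonlinear system model (not from the thesis).
f = lambda x: x[0] ** 2 * np.sin(x[1])

mu, sigma = [2.0, 0.8], [0.1, 0.05]
var_lin = first_order_variance(f, mu, sigma)

# Monte Carlo reference, exposing the error introduced by linearization.
rng = np.random.default_rng(0)
samples = rng.normal(mu, sigma, size=(100_000, 2))
var_mc = np.array([f(s) for s in samples]).var()

print(f"first-order variance: {var_lin:.5f}   Monte Carlo variance: {var_mc:.5f}")
```

For strongly nonlinear models the two numbers diverge, and the Monte Carlo output distribution is visibly non-Gaussian, which is the motivation for propagating skewness and kurtosis with a second-order expansion.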
127

Development and Evaluation of Polaris CANDU Geometry Modelling and of TRACE_Mac/PARCS_Mac Coupling with RRS for CANDU Analysis / Polaris and TRACE/PARCS Code Development for CANDU Analysis

Younan, Simon January 2022 (has links)
McMaster University DOCTOR OF PHILOSOPHY (2022) Hamilton, Ontario (Engineering Physics) TITLE: Development and Evaluation of Polaris CANDU Geometry Modelling and of TRACE_Mac/PARCS_Mac Coupling with RRS for CANDU Analysis AUTHOR: Simon Younan, M.A.Sc. (McMaster University), B.Eng. (McMaster University) SUPERVISOR: Dr. David Novog NUMBER OF PAGES: xiv, 163 / In the field of nuclear safety analysis, as computers have become more powerful, there has been a trend away from low-fidelity models using conservative assumptions toward high-fidelity, best-estimate models combined with uncertainty analysis. A number of these tools have been developed in the United States, owing to the popularity of light water reactors. These include the SCALE analysis suite developed by ORNL, as well as the PARCS and TRACE tools backed by the USNRC. This work explores adapting the capabilities of these tools to the analysis of CANDU reactors. The Polaris sequence, introduced in SCALE 6.2, was extended in this work to support CANDU geometries and compared to existing SCALE sequences such as TRITON. Emphasis was placed on the Embedded Self-Shielding Method (ESSM), introduced with Polaris. Both Polaris and ESSM were evaluated and found to perform adequately for CANDU geometries. The accuracy of ESSM was found to improve when the precomputed self-shielding factors were updated using a CANDU representation. The PARCS diffusion code and the TRACE system thermal-hydraulics code were coupled using the built-in coupling capability between the two codes. In addition, the Exterior Communications Interface (ECI), used for coupling with TRACE, was utilized: a Python interface to the ECI library was developed in this work and used to couple an RRS model written in Python to the coupled PARCS/TRACE model. A number of code modifications were made to accommodate the required coupling and to correct code deficiencies, with the modified versions named PARCS_Mac and TRACE_Mac. The coupled codes were able to simulate multiple transients based on prior studies as well as operational events. The code updates performed in this work may be used for many future studies, particularly for uncertainty propagation through a full set of calculations, from the lattice model to a full coupled system model. / Thesis / Doctor of Philosophy (PhD) / Modern nuclear safety analysis tools offer more accurate predictions for the safety and operation of nuclear reactors, including CANDU reactors. These codes take advantage of modern computer hardware, and also of a shift in philosophy from conservative analysis to best-estimate-plus-uncertainty analysis. The goal of this thesis was to adapt a number of modern tools to support CANDU analysis and uncertainty propagation, with a particular emphasis on the coupling of multiple interacting models. These tools were then demonstrated and the results analyzed. The simulations performed in this work were successful in producing results comparable to prior studies as well as to experimental and operational data, including the simulation of four weeks of reactor operation with "shim mode" operation. Sensitivity and uncertainty analyses were performed over the course of the work to quantify the precision and significance of the results, as well as to identify areas of interest for future research.
128

Monte Carlo analysis of BWR transients : A study on the Best Estimate Plus Uncertainty methodology within safety analyses at Forsmark 3

Eriksson, Jonathan January 2023 (has links)
Transient analyses at the Forsmark nuclear power plant are currently performed using realistic computer models in conjunction with conservative estimates of initial and boundary conditions, an approach known as the combined methodology. Because of the conservative estimates, the methodology sometimes risks over-estimating certain safety criteria, which negatively affects the optimization of reactor operation. The Best Estimate Plus Uncertainty (BEPU) methodology can provide higher safety margins by using probability distributions instead of conservatisms when estimating initial and boundary conditions. The BEPU methodology applies a Monte Carlo method to assess the distribution of one or several key outputs; this study focuses on the lowest dryout margin achieved during each Monte Carlo simulation. The tolerance limits of the output are set with the help of Wilks' formula, using a one-sided 95% tolerance limit with 95% confidence. A total of 36 unique parameters describing initial and boundary conditions were sampled for each Monte Carlo simulation, using either Gaussian or uniform distributions. The random nature of the Monte Carlo simulations uncovered alternative event sequences and end states that are not seen in the combined methodology. Assessing the choice of order statistic in Wilks' formula also shows diminishing returns as the order statistic increases: one should weigh the increased accuracy of the estimated outputs against the increased computational time required. The conservative methodology uses a mix of conservative and nominal estimates of key parameters, so the difference in dryout margin between the conservative and the Monte Carlo results should not be used to conclude which methodology outperforms the other. While the Monte Carlo simulations do not result in improved core optimization, they can complement the combined methodology by providing a more detailed analysis of possible event pathways for a postulated transient.
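Wilks' formula for a one-sided non-parametric tolerance limit gives the minimum number of runs for a given coverage, confidence, and order statistic. The sketch below computes those sample sizes; the resulting values (59, 93, 124, 153 for a one-sided 95%/95% limit at orders 1-4) are the standard ones from the BEPU literature, not results taken from this thesis:

```python
from math import comb

def wilks_sample_size(gamma=0.95, beta=0.95, order=1, n_max=10_000):
    """Smallest number of Monte Carlo runs N such that the `order`-th most extreme
    output bounds at least a fraction `gamma` of the population with confidence
    `beta` (one-sided non-parametric tolerance limit, Wilks' formula)."""
    for n in range(order, n_max):
        # Confidence = P(at least `order` samples fall beyond the gamma-quantile),
        # where the count of such samples is Binomial(n, 1 - gamma).
        conf = 1.0 - sum(
            comb(n, k) * (1 - gamma) ** k * gamma ** (n - k) for k in range(order)
        )
        if conf >= beta:
            return n
    raise ValueError("n_max too small")

for m in range(1, 5):
    print(f"order statistic {m}: N = {wilks_sample_size(order=m)} runs")
```

The rapid growth of N with the order statistic, for a comparatively modest tightening of the estimated limit, is the diminishing return the abstract refers to.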
129

Development of Data-Driven Models for Membrane Fouling Prediction at Wastewater Treatment Plants

Kovacs, David January 2022 (has links)
Membrane bioreactors (MBRs) have proven to be an extremely effective wastewater treatment process, combining ultrafiltration with biological processes to produce high-quality effluent. However, one of the major drawbacks of this technology is membrane fouling, an inevitable process that reduces permeate production and increases operating costs. The prediction of membrane fouling in MBRs is important because it can provide decision support to wastewater treatment plant (WWTP) operators. Currently, mechanistic models are often used to estimate transmembrane pressure (TMP), which is an indicator of membrane fouling, but their performance is not always satisfactory. In this research, existing mechanistic and data-driven models of membrane fouling are investigated. Data-driven machine learning techniques consisting of random forest (RF), artificial neural network (ANN), and long short-term memory (LSTM) models are used to predict TMP at various stages of the MBR production cycle. The models are built with 4 years of high-resolution data from a confidential full-scale municipal WWTP. Model performance is examined using statistical measures such as the coefficient of determination (R2), root mean squared error, mean absolute percentage error, and mean squared error. The results show that all models provide reliable predictions, with the RF models having the best predictive accuracy compared to the ANN and LSTM models. The corresponding R2 values for RF when predicting before, during, and after back-pulse TMP are 0.996, 0.927, and 0.996, respectively. Model uncertainty (including hyperparameter and algorithm uncertainty) is quantified to determine the impact of hyperparameter tuning and the variance of extreme predictions caused by algorithm choice. The ANN models are the most affected by hyperparameter tuning and have the highest variability when predicting extreme values within each model's respective hyperparameter range. The proposed models can be useful tools for providing decision support to WWTP operators employing fouling mitigation strategies, potentially leading to better operation of WWTPs and reduced costs. / Thesis / Master of Applied Science (MASc)
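A random-forest regressor of the kind compared above can be sketched in a few lines with scikit-learn. The features and the synthetic TMP signal below are purely hypothetical stand-ins for the confidential plant data, so the printed scores bear no relation to the R2 values reported in the abstract:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

# Hypothetical operational features (permeate flux, temperature, aeration, cycle time, ...).
rng = np.random.default_rng(42)
X = rng.normal(size=(2000, 4))
# Synthetic TMP target in arbitrary units, with a nonlinear interaction and noise.
tmp = 5 + 1.5 * X[:, 0] - 0.8 * X[:, 1] * X[:, 2] + rng.normal(0, 0.3, 2000)

X_train, X_test, y_train, y_test = train_test_split(X, tmp, test_size=0.25, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

pred = model.predict(X_test)
print(f"R2 = {r2_score(y_test, pred):.3f}, MSE = {mean_squared_error(y_test, pred):.3f}")
```

The same train/test split and metrics can be reused for ANN and LSTM baselines, which is essentially the model-comparison protocol the abstract describes.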
130

Evaluation of Uncertainty in Hydrodynamic Modeling

Camacho Rincon, Rene Alexander 17 August 2013 (has links)
Uncertainty analysis in hydrodynamic modeling is useful for identifying and reporting the limitations of a model caused by different sources of error. In practice, the main sources of error are divided into model structure errors, errors in the input data due to measurement imprecision among other causes, and parametric errors resulting from the difficulty of identifying physically representative parameter values valid at the temporal and spatial scale of the models. This investigation identifies, implements, evaluates, and recommends a set of methods for the evaluation of model structure uncertainty, parametric uncertainty, and input data uncertainty in hydrodynamic modeling studies. A comprehensive review of uncertainty analysis methods is provided, and a set of widely applied methods is selected and implemented in real case studies, identifying the main limitations and benefits of their use in hydrodynamic studies. In particular, the following methods are investigated: the First Order Variance Analysis (FOVA) method, the Monte Carlo Uncertainty Analysis (MCUA) method, the Bayesian Monte Carlo (BMC) method, the Markov Chain Monte Carlo (MCMC) method, and the Generalized Likelihood Uncertainty Estimation (GLUE) method. The results of this investigation indicate that the uncertainty estimates computed with FOVA are consistent with the results obtained by MCUA. In addition, the comparison of BMC, MCMC and GLUE indicates that BMC and MCMC provide similar estimates of the posterior parameter probability distributions, single-point parameter values, and uncertainty bounds, mainly because they use the same likelihood function and because of the low number of parameters involved in the inference process. However, the implementation of MCMC is substantially more complex than that of BMC, given that its sampling algorithm requires a careful definition of auxiliary proposal probability distributions, along with their variances, to obtain parameter samples that effectively belong to the posterior parameter distribution. The analysis also suggests that the results of GLUE are inconsistent with those of BMC and MCMC. It is concluded that BMC is a powerful and parsimonious strategy for evaluating all the sources of uncertainty in hydrodynamic modeling. Despite its computational requirements, the method can be easily implemented in most practical applications.
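The Bayesian Monte Carlo approach favoured in the abstract can be sketched as prior sampling followed by likelihood weighting. The rating-curve stand-in for the model, the parameter bounds, and the noise level below are invented placeholders rather than the thesis' hydrodynamic models:

```python
import numpy as np

def bayesian_monte_carlo(model, obs, prior_sampler, sigma_obs, n=20_000, seed=0):
    """Bayesian Monte Carlo: sample parameters from the prior, weight each sample by a
    Gaussian likelihood of the observations, and return weighted posterior statistics."""
    rng = np.random.default_rng(seed)
    theta = prior_sampler(rng, n)                  # (n, dim) prior samples
    sims = np.array([model(t) for t in theta])     # (n, n_obs) model outputs
    loglik = -0.5 * np.sum(((sims - obs) / sigma_obs) ** 2, axis=1)
    w = np.exp(loglik - loglik.max())
    w /= w.sum()
    post_mean = w @ theta
    post_sd = np.sqrt(w @ (theta - post_mean) ** 2)
    return post_mean, post_sd, w

# Illustrative stand-in for a hydrodynamic model: stage = a * flow ** b (a rating curve).
flow = np.linspace(1.0, 10.0, 15)
obs = 2.0 * flow ** 0.6 + np.random.default_rng(1).normal(0, 0.1, flow.size)

model = lambda t: t[0] * flow ** t[1]
prior = lambda rng, n: np.column_stack([rng.uniform(0.5, 5.0, n), rng.uniform(0.1, 1.5, n)])

mean, sd, _ = bayesian_monte_carlo(model, obs, prior, sigma_obs=0.1)
print("posterior mean:", mean, "posterior sd:", sd)
```

Because the weighting is done after plain prior sampling, no proposal distribution has to be tuned, which is the simplicity advantage over MCMC noted in the abstract.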
