21 |
History Matching of 4D Seismic Data Attributes using the Ensemble Kalman Filter. Ravanelli, Fabio M., 05 1900.
One of the most challenging tasks in the oil industry is the production of reliable reservoir forecast models. Because of different sources of uncertainty, the numerical models employed are often only crude approximations of reality. This problem is tackled by conditioning the model on production data through data assimilation, a process known in the oil industry as history matching. Several recent advances are being used to improve the reliability of history matching, notably the use of time-lapse seismic data and automated history matching software tools. One of the most promising data assimilation techniques employed in the oil industry is the ensemble Kalman filter (EnKF), because of its ability to deal with highly non-linear models, its low computational cost, and its easy implementation compared with other methods.
A synthetic reservoir model was used in a history matching study designed to predict the peak production, allowing decision makers to properly plan field development actions. If only production data are assimilated, a total of 12 years of historical data is required to properly characterize the production uncertainty and hence the correct moment to take action and decommission the field. However, if time-lapse seismic data are available, this conclusion can be reached 4 years earlier, thanks to the additional fluid-displacement information obtained from the seismic data. Production data are geographically sparse, whereas seismic data are sparse in time.
Several types of seismic attributes were tested in this study. Poisson's ratio proved to be the attribute most sensitive to fluid displacement. In practical applications, however, this attribute is usually avoided because of the poor quality of the data; seismic impedance tends to be more reliable.
Finally, a new concept was proposed for obtaining time-lapse information in a history matching study. The use of crosswell time-lapse seismic tomography to map velocities in the interwell region was demonstrated as a potential tool to ensure survey reproducibility and low acquisition cost compared with full-scale surface surveys. This approach relies on the higher velocity sensitivity to fluid displacement at higher frequencies. The velocity effects were modeled using the Biot velocity model. The method provided promising results, yielding RRMS error reductions similar to those obtained with conventional history-matched surface seismic data.
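The EnKF analysis step that drives such history-matching studies can be sketched in a few lines. This is a generic stochastic-EnKF update in NumPy, not the thesis's implementation; the state size, observation operator, and error levels below are illustrative assumptions:

```python
import numpy as np

def enkf_analysis(X, y, H, R, rng):
    """Stochastic EnKF analysis step.

    X : (n, N) forecast ensemble (n state variables, N members)
    y : (m,) observation vector (e.g. production or seismic attribute data)
    H : (m, n) linear observation operator
    R : (m, m) observation-error covariance
    """
    n, N = X.shape
    A = X - X.mean(axis=1, keepdims=True)       # ensemble anomalies
    Pf_Ht = A @ (H @ A).T / (N - 1)             # P^f H^T without forming P^f
    S = H @ Pf_Ht + R                           # innovation covariance
    K = Pf_Ht @ np.linalg.inv(S)                # Kalman gain
    # Perturb observations so the analysis ensemble keeps the right spread.
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, size=N).T
    return X + K @ (Y - H @ X)

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 50))        # toy 5-variable state, 50 members
H = np.eye(2, 5)                    # observe the first two variables
y = np.array([1.0, -0.5])
R = 0.1 * np.eye(2)
Xa = enkf_analysis(X, y, H, R, rng)
```

With accurate observations, the analysis mean of the observed variables moves toward the data while the unobserved variables are corrected through their sample correlations, which is what lets production and seismic data constrain reservoir properties.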
|
22 |
Efficient Ensemble Data Assimilation and Forecasting of the Red Sea Circulation. Toye, Habib, 23 November 2020.
This thesis presents our efforts to build an operational ensemble forecasting system for the Red Sea, based on the Data Assimilation Research Testbed (DART) package for ensemble data assimilation and the Massachusetts Institute of Technology general circulation model (MITgcm) for forecasting. The Red Sea DART-MITgcm system efficiently integrates all the ensemble members in parallel, while accommodating different ensemble assimilation schemes. The promising ensemble adjustment Kalman filter (EAKF), designed to avoid manipulating the gigantic covariance matrices involved in the ensemble assimilation process, possesses the features required for an operational setting. The need for more efficient filtering schemes, both to implement a high-resolution assimilation system for the Red Sea and to handle the large ensembles needed to properly describe the assimilation statistics, prompted the design and implementation of new filtering approaches. Making the most of our world-class supercomputer, Shaheen, we first pushed the system's limits by designing a fault-tolerant scheduler extension that allowed us to test, for the first time, a fully realistic, high-resolution ocean ensemble assimilation system with 1000 members. In an operational setting, however, timely forecasts are of the essence, and running large ensembles, albeit preferable and desirable, is not sustainable. New schemes aiming at lowering the computational burden while preserving reliable assimilation results were therefore developed. The ensemble optimal interpolation (EnOI) algorithm requires only a single model integration in the forecast step, using a static ensemble of preselected members for the assimilation, and is therefore significantly cheaper computationally than the EAKF. To account for the strong seasonal variability of the Red Sea circulation, an EnOI with seasonally varying ensembles (SEnOI) was first implemented.
To better handle intra-seasonal variability and enhance the seasonal EnOI system, an automatic procedure to adaptively select the ensemble members through the assimilation cycles was then introduced. Finally, an efficient hybrid scheme combining the dynamical, flow-dependent covariance of the EAKF with the static covariance of the EnOI was proposed and successfully tested in the Red Sea. The developed hybrid ensemble data assimilation system will form the basis of the first operational Red Sea forecasting system, currently being implemented to support Saudi Aramco operations in this basin.
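The central idea of such a hybrid scheme, blending a flow-dependent (EAKF-like) covariance with a static (EnOI-like) one, can be sketched as a weighted covariance inside the Kalman gain. The blending form, the weight alpha, and all dimensions below are illustrative assumptions, not taken from the thesis:

```python
import numpy as np

def hybrid_gain(A_dyn, A_stat, H, R, alpha):
    """Kalman gain from a hybrid covariance
    P = alpha * P_dyn + (1 - alpha) * P_stat,
    with each covariance formed from centered ensemble anomalies A (n x N)."""
    def cov(A):
        return A @ A.T / (A.shape[1] - 1)
    P = alpha * cov(A_dyn) + (1.0 - alpha) * cov(A_stat)
    S = H @ P @ H.T + R
    return P @ H.T @ np.linalg.inv(S)

rng = np.random.default_rng(1)
n = 4
A_dyn = rng.normal(size=(n, 20))     # anomalies of a small dynamic ensemble
A_stat = rng.normal(size=(n, 200))   # anomalies of a large static (historical) ensemble
A_dyn -= A_dyn.mean(axis=1, keepdims=True)
A_stat -= A_stat.mean(axis=1, keepdims=True)
H = np.eye(2, n)                     # observe the first two variables
R = 0.2 * np.eye(2)
K = hybrid_gain(A_dyn, A_stat, H, R, alpha=0.5)
```

Setting alpha=1 recovers a purely flow-dependent gain and alpha=0 a purely static (EnOI-style) gain, so a single parameter trades forecast cost against covariance realism.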
|
23 |
Data Assimilation for Systems with Multiple Timescales. Vicente Ihanus, Dan, January 2023.
This text provides an overview of problems in the field of data assimilation. We explore the possibility of recreating unknown data by continuously inserting known data into certain dynamical systems, under certain regularity assumptions. Additionally, we discuss an alternative statistical approach to data assimilation and investigate the use of the ensemble Kalman filter for assimilating data into dynamical models. A key challenge in numerical weather prediction is incorporating convective precipitation into an idealized setting for numerical computations. To address this challenge, we examine the modified rotating shallow water equations, a nonlinear coupled system of partial differential equations, and assess whether this primitive model accurately mimics phenomena observed in operational numerical weather prediction models. Numerical experiments conducted with a deterministic ensemble Kalman filter algorithm support its applicability to convective-scale data assimilation. Furthermore, we analyze the frequency spectrum of numerical forecasts using the wavelet transform. Our frequency analysis suggests that, under certain experimental settings, there are similarities in the initialization of operational models, which can aid in understanding the problem of initialization of numerical weather prediction models.
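One common deterministic variant, the DEnKF of Sakov and Oke, updates the ensemble mean with the full Kalman gain and the anomalies with half of it, which avoids perturbing the observations. Whether this is exactly the variant used in the thesis is not stated here; the sketch below is a generic NumPy illustration with made-up dimensions:

```python
import numpy as np

def denkf_analysis(X, y, H, R):
    """Deterministic EnKF (DEnKF) analysis step: full Kalman update for the
    ensemble mean, half update for the anomalies, no observation noise."""
    n, N = X.shape
    xm = X.mean(axis=1, keepdims=True)
    A = X - xm
    PHt = A @ (H @ A).T / (N - 1)
    K = PHt @ np.linalg.inv(H @ PHt + R)
    xm_a = xm + K @ (y[:, None] - H @ xm)   # full update for the mean
    A_a = A - 0.5 * K @ (H @ A)             # half update for the anomalies
    return xm_a + A_a

rng = np.random.default_rng(2)
X = rng.normal(size=(3, 40))    # toy 3-variable state, 40 members
H = np.eye(1, 3)                # observe the first variable only
y = np.array([0.8])
R = np.array([[0.05]])
Xa = denkf_analysis(X, y, H, R)
```

Because no random observation perturbations are drawn, repeated runs with the same forecast ensemble give identical analyses, a property that simplifies controlled convective-scale experiments.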
|
24 |
Use of social media data in flood monitoring / Uso de dados das mídias sociais no monitoramento de enchentes. Restrepo Estrada, Camilo Ernesto, 05 November 2018.
Floods are one of the most devastating types of disasters worldwide in terms of human, economic, and social losses. When authoritative data are scarce, or unavailable for some periods, other sources of information are required to improve streamflow estimation and early flood warnings. Georeferenced social media messages are increasingly being regarded as an alternative source of information for coping with flood risks. However, existing studies have mostly concentrated on the links between geo-social media activity and flooded areas. This thesis presents a novel methodology to close the research gap regarding the use of social networks as a proxy for precipitation-runoff and flood forecast estimates. To address this, it is proposed to use a transformation function that creates a proxy variable for rainfall by analysing messages from geo-social media together with precipitation measurements from authoritative sources; these are then incorporated into a hydrological model for flow estimation. The proxy and authoritative rainfall data are then merged for use in a data assimilation scheme based on the ensemble Kalman filter (EnKF). It is found that the combined use of authoritative rainfall values and the social media proxy variable as input to the Probability Distributed Model (PDM) improves flow simulations for flood monitoring. In addition, it is found that when these models are run under a fusion-assimilation scheme, the results improve even further, yielding a tool that can help in the monitoring of "ungauged" or "poorly gauged" catchments. The main contribution of this thesis is the creation of a completely original source of rain monitoring, which had not previously been explored quantitatively in the literature. It also shows how the joint use of this source and data assimilation methodologies helps detect flood events.
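The proxy-construction and merging steps of the methodology can be illustrated schematically. Both the log-form transformation with its coefficients and the inverse-variance merge below are hypothetical stand-ins for the thesis's calibrated transformation function, not its actual form:

```python
import numpy as np

def proxy_rainfall(msg_counts, a, b):
    """Hypothetical transformation: map hourly flood-related message counts
    to a rainfall proxy (mm/h). The log1p form and the coefficients a, b
    are illustrative, standing in for the calibrated function."""
    return np.maximum(a * np.log1p(msg_counts) + b, 0.0)

def merge(rain_auth, rain_proxy, var_auth, var_proxy):
    """Inverse-variance weighted merge of authoritative and proxy rainfall:
    the less uncertain source gets the larger weight."""
    w = var_proxy / (var_auth + var_proxy)
    return w * rain_auth + (1.0 - w) * rain_proxy

counts = np.array([0, 3, 40, 120, 15])          # messages per hour (made up)
proxy = proxy_rainfall(counts, a=4.0, b=-1.0)
auth = np.array([0.0, 1.0, 12.0, 25.0, 4.0])    # gauge rainfall, mm/h (made up)
merged = merge(auth, proxy, var_auth=1.0, var_proxy=9.0)
```

The merged series would then force the rainfall-runoff model (PDM in the thesis), with the EnKF correcting the simulated flow against any available streamflow observations.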
|
26 |
Filtro de Kalman Ensemble: uma análise da estimação conjunta dos estados e dos parâmetros / Ensemble Kalman filter: an analysis of the joint estimation of states and parameters. Silva, Rafael Oliveira, 08 April 2019.
The ensemble Kalman filter (EnKF) is a sequential Monte Carlo algorithm for inference in linear and nonlinear state-space models. Combined with some other methods, this filter propagates the joint posterior distribution of states and parameters over time. Few works consider the problem of simultaneous state-parameter estimation, and the existing methods have limitations. This dissertation analyzes the efficiency of these methods by means of simulation studies in linear and nonlinear state-space models. The nonlinear estimation problem addressed here refers to the logistic surplus-production model, for which the EnKF can be considered a possible alternative to MCMC algorithms. The simulation results reveal that the accuracy of the estimates increases as the time series grows, but some parameters remain difficult to estimate.
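Joint state-parameter estimation of this kind is commonly implemented by state augmentation: stack the parameter ensemble under the state ensemble and apply a single EnKF analysis, so the parameters are corrected through their sample correlation with the observed states. The toy model and numbers below are illustrative assumptions, not the dissertation's setup:

```python
import numpy as np

def augmented_analysis(X, theta, y, H, R, rng):
    """Joint state-parameter stochastic EnKF update via state augmentation.

    X     : (n, N) state ensemble
    theta : (p, N) parameter ensemble
    """
    n = X.shape[0]
    Z = np.vstack([X, theta])                   # (n + p, N) augmented ensemble
    Ha = np.hstack([H, np.zeros((H.shape[0], theta.shape[0]))])
    N = Z.shape[1]
    A = Z - Z.mean(axis=1, keepdims=True)
    PHt = A @ (Ha @ A).T / (N - 1)
    K = PHt @ np.linalg.inv(Ha @ PHt + R)
    Y = y[:, None] + rng.normal(0.0, np.sqrt(R[0, 0]), size=(len(y), N))
    Za = Z + K @ (Y - Ha @ Z)
    return Za[:n], Za[n:]

rng = np.random.default_rng(3)
theta = rng.normal(1.0, 0.3, size=(1, 60))      # uncertain growth-rate-like parameter
X = theta + rng.normal(0.0, 0.1, size=(1, 60))  # state strongly driven by the parameter
H = np.eye(1)
y = np.array([1.5])
R = np.array([[0.05]])
Xa, theta_a = augmented_analysis(X, theta, y, H, R, rng)
```

Because the parameter is never observed directly, its update relies entirely on the state-parameter cross-covariance, which is one reason some parameters remain poorly identified when that correlation is weak.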
|
27 |
A Comparative Study of the Particle Filter and the Ensemble Kalman Filter. Datta Gupta, Syamantak, January 2009.
Non-linear Bayesian estimation, or estimation of the state of a non-linear stochastic system from a set of indirect noisy measurements, is a problem encountered in several fields of science. The particle filter and the ensemble Kalman filter are both used to obtain sub-optimal solutions of Bayesian inference problems, particularly for high-dimensional non-Gaussian and non-linear models. Both are essentially Monte Carlo techniques that compute their results using a set of estimated trajectories of the variable to be monitored. It has been shown that in a linear and Gaussian environment, solutions obtained from both these filters converge to the optimal solution obtained by the Kalman filter. However, it is of interest to explore how the two filters compare to each other in basic methodology and construction, especially due to the similarity between them. In this work, we take up a specific problem of Bayesian inference in a restricted framework and compare analytically the results obtained from the particle filter and the ensemble Kalman filter. We show that for the chosen model, under certain assumptions, the two filters become methodologically analogous as the sample size goes to infinity.
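The convergence claim in the linear-Gaussian case is easy to check numerically on a scalar toy problem: reweighting prior samples by the likelihood (particle filter) and shifting them with the Kalman gain (stochastic EnKF) both recover the exact Kalman posterior mean as the sample size grows. The numbers below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)
N = 100_000
prior = rng.normal(0.0, 1.0, size=N)    # shared prior samples, x ~ N(0, 1)
y, r = 1.0, 0.5                         # observation y = x + noise, noise variance r

# Particle filter: reweight the prior samples by the Gaussian likelihood.
w = np.exp(-0.5 * (y - prior) ** 2 / r)
w /= w.sum()
pf_mean = np.sum(w * prior)

# Stochastic EnKF: shift each sample with the Kalman gain and a perturbed observation.
var = prior.var()
K = var / (var + r)
enkf = prior + K * (y + rng.normal(0.0, np.sqrt(r), size=N) - prior)
enkf_mean = enkf.mean()

# Exact Kalman posterior mean for this conjugate problem: y / (1 + r).
exact = y / (1.0 + r)
```

Both Monte Carlo estimates approach the exact value of 2/3 as N grows, though the two filters reach it by very different mechanisms, which is precisely the contrast the thesis analyzes.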
|
29 |
Tsunami Prediction and Earthquake Parameters Estimation in the Red Sea. Sawlan, Zaid A, 12 1900.
Concerns about tsunamis have grown worldwide after the 2004 Indian Ocean tsunami and the 2011 Tohoku tsunami. Consequently, tsunami models have developed rapidly in the last few years. One of the advanced tsunami models is the GeoClaw model introduced by LeVeque (2011), which is adaptive and consistent.
Because of different sources of uncertainty in the model, observations are needed to improve model predictions through a data assimilation framework. The model inputs are earthquake parameters and topography. This thesis introduces a real-time tsunami forecasting method that combines the tsunami model with observations using a hybrid ensemble Kalman filter and ensemble Kalman smoother: the filter is used for state prediction, while the smoother estimates the earthquake parameters. This method reduces the error produced by uncertain inputs. In addition, a state-parameter EnKF is implemented to estimate the earthquake parameters. Although the number of observations is small, the estimated parameters generate a better tsunami prediction than the model alone. Methods and results of prediction experiments in the Red Sea are presented, and the prospect of developing an operational tsunami prediction system in the Red Sea is discussed.
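An ensemble-smoother-style parameter update of the kind described can be sketched as correcting the parameter ensemble through its sample cross-covariance with model-predicted observations. The toy "model" mapping a slip-like parameter to wave height, and all numbers below, are illustrative assumptions, not the thesis's configuration:

```python
import numpy as np

def enks_parameter_update(theta, Yhat, y, R, rng):
    """Smoother-style parameter update: correct the parameter ensemble
    theta (p x N) using its cross-covariance with the ensemble of
    model-predicted observations Yhat (m x N)."""
    N = theta.shape[1]
    At = theta - theta.mean(axis=1, keepdims=True)
    Ay = Yhat - Yhat.mean(axis=1, keepdims=True)
    C_ty = At @ Ay.T / (N - 1)          # parameter-observation cross-covariance
    C_yy = Ay @ Ay.T / (N - 1)          # predicted-observation covariance
    K = C_ty @ np.linalg.inv(C_yy + R)
    Y = y[:, None] + rng.normal(0.0, np.sqrt(R[0, 0]), size=(len(y), N))
    return theta + K @ (Y - Yhat)

# Toy "model": predicted wave height is twice the (unknown) slip parameter.
rng = np.random.default_rng(5)
theta = rng.normal(2.0, 0.5, size=(1, 80))
Yhat = 2.0 * theta + rng.normal(0.0, 0.05, size=(1, 80))
y = np.array([5.0])                     # observed height, consistent with slip = 2.5
R = np.array([[0.1]])
theta_a = enks_parameter_update(theta, Yhat, y, R, rng)
```

Because the update works entirely through ensemble statistics, no adjoint or derivative of the tsunami model is needed, which is what makes this kind of smoother attractive when observations are few.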
|
30 |
Constraining 3D Petroleum Reservoir Models to Petrophysical Data, Local Temperature Observations, and Gridded Seismic Attributes with the Ensemble Kalman Filter (EnKF). Zagayevskiy, Yevgeniy, Unknown Date.
No description available.
|