1 |
An investigation of the multi-scale mixed finite element streamline simulator and its coupling with the ensemble Kalman filter
Mukerjee, Rahul 15 May 2009 (has links)
The multi-scale mixed finite element method (MsMFEM) discussed in this work uses a
two-scale approach, where the solutions to independent local flow problems on the fine
grid capture the fine-scale variations of the reservoir model, while the coarse grid
equations appropriately assimilate this information in the global solution. Temporal
changes in porous media flow are relatively moderate when compared to the spatial
variations in the reservoir. Hence, approximate global solutions can be obtained by adaptively solving these local flow problems, with significant savings in computational
time. The ensemble Kalman filter, used for real-time updating of reservoir models, can
thus be coupled with the MsMFEM-streamline simulator to speed up the history-matching
process considerably.
|
2 |
Reduction of Dimensionality in Spatiotemporal Models
Sætrom, Jon January 2010 (has links)
No description available.
|
3 |
Non-Adjoint Surfactant Flood Optimization of Net Present Value and Incorporation of Optimal Solution Under Geological and Economic Uncertainty
Odi, Uchenna O. 2009 December 1900 (has links)
The advent of smart well technology, the use of downhole sensors to adjust well controls (e.g. injection rate, bottomhole pressure), has made it possible to control a field in all stages of production. This possibility holds great promise for better managing enhanced oil recovery (EOR) processes, especially in terms of applying optimization techniques. However, some procedures for optimizing EOR processes are not based on the physics of the process, which may lead to erroneous results. In addition, optimization of EOR processes can be difficult, and limited, if there is no access to the simulator code for computation of the adjoints used for optimization.
This research describes the development of a general procedure for designing an initial starting point for a surfactant flood optimization. The method does not rely on a simulator's adjoint computation or on external computing of adjoints for optimization. The reservoir simulator used for this research was Schlumberger's Eclipse 100, and optimization was accomplished through a program written in Matlab. Utility of the approach is demonstrated by using it to optimize the net present value (NPV) of a 5-spot surfactant flood (320 acres) and incorporating the optimization solution into a probabilistic geological and economic setting. This thesis includes a general procedure for optimizing a surfactant flood and provides groundwork for optimizing other EOR techniques.
This research is useful because it takes the optimal solution and calculates a probability of success for possible NPVs. This is very important when assessing risk in a business scenario, because projects with an unknown probability of success are likely to be abandoned as uneconomic. This thesis also illustrates the range of possible NPVs if the optimal solution were used.
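As a hedged illustration of the NPV objective described above, a minimal discounted-cash-flow calculation might look like the sketch below; the oil price, discount rate, rates, and costs are hypothetical placeholders, not values from the thesis.

```python
import numpy as np

def npv(oil_rate_stb_d, surfactant_cost_usd, oil_price_usd_stb=70.0,
        discount_rate=0.10, days_per_period=365.0):
    """Net present value of a flood from per-period average oil rates and
    surfactant injection costs (all inputs are illustrative)."""
    oil_rate = np.asarray(oil_rate_stb_d, dtype=float)
    cost = np.asarray(surfactant_cost_usd, dtype=float)
    revenue = oil_rate * days_per_period * oil_price_usd_stb
    cash_flow = revenue - cost
    periods = np.arange(1, cash_flow.size + 1)
    return np.sum(cash_flow / (1.0 + discount_rate) ** periods)

# Hypothetical 5-year forecast: average rates (STB/day) and yearly surfactant spend (USD)
print(npv([450, 380, 300, 240, 190], [2.0e6, 1.5e6, 1.0e6, 0.5e6, 0.0]))
```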
|
4 |
Initial Member Selection and Covariance Localization Study of Ensemble Kalman Filter based Data Assimilation
Yip, Yeung 2011 May 1900 (has links)
Petroleum engineers generate reservoir simulation models to optimize production and maximize recovery. History matching is one of the methods used to calibrate these reservoir models. During traditional history matching, individual model parameters (permeability, relative permeability, initial water saturation, etc.) are adjusted until the production history is matched by the updated reservoir model. However, this method of utilizing only one model does not capture the full range of system uncertainty. Another drawback is that the entire model has to be matched again from the initial time whenever new observation data are assimilated.
Ensemble Kalman Filter (EnKF) is a data assimilation technique that has gained increasing interest for petroleum history matching in recent years. The basic methodology of the EnKF consists of a forecast step and an update step. This data assimilation method utilizes a collection of state vectors, known as an ensemble, which are simulated forward in time. In other words, each ensemble member represents a reservoir model (realization). Subsequently, during the update step, the sample covariance is computed from the forecast ensemble, and the collection of state vectors is updated using formulations that involve this sample covariance.
When a small ensemble size is used for a large, field-scale model, a poor estimate of the covariance matrix can result (Anderson and Anderson 1999; Devegowda and Arroyo 2006). To mitigate this problem, various covariance conditioning schemes have been proposed to improve the performance of the EnKF without resorting to large ensemble sizes that require enormous computational resources.
In this study, we implemented the EnKF coupled with several covariance localization schemes: distance-based, streamline trajectory-based, and streamline sensitivity-based localization, as well as the hierarchical EnKF, on a synthetic reservoir field case study. We describe the methodology of each covariance localization scheme along with its characteristics and limitations.
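A minimal sketch of the forecast/update cycle described above, with a crude distance-based taper applied to the cross-covariance; the linear taper, grid geometry, and observation operator here are illustrative assumptions, not the localization schemes implemented in the thesis.

```python
import numpy as np

def enkf_update(X_f, y_obs, H, obs_err_var, loc_taper=None, rng=None):
    """One stochastic (perturbed-observation) EnKF analysis step.
    X_f: (n_state, n_ens) forecast ensemble; y_obs: (n_obs,) observations;
    H: (n_obs, n_state) observation operator; loc_taper: optional
    (n_state, n_obs) localization weights on the cross-covariance."""
    rng = np.random.default_rng() if rng is None else rng
    n_state, n_ens = X_f.shape
    A = X_f - X_f.mean(axis=1, keepdims=True)      # state anomalies
    Y = H @ X_f
    B = Y - Y.mean(axis=1, keepdims=True)          # predicted-observation anomalies
    P_xy = A @ B.T / (n_ens - 1)                   # sample cross-covariance
    P_yy = B @ B.T / (n_ens - 1) + obs_err_var * np.eye(len(y_obs))
    if loc_taper is not None:
        P_xy = loc_taper * P_xy                    # distance-based localization
    K = P_xy @ np.linalg.inv(P_yy)                 # Kalman gain
    obs_pert = rng.normal(0.0, np.sqrt(obs_err_var), size=(len(y_obs), n_ens))
    return X_f + K @ (y_obs[:, None] + obs_pert - Y)

# Toy example: 100 grid cells, 5 observed cells, 40 ensemble members
rng = np.random.default_rng(0)
n, m, N = 100, 5, 40
obs_cells = np.array([5, 25, 45, 65, 85])
X_f = rng.normal(size=(n, N))
H = np.zeros((m, n)); H[np.arange(m), obs_cells] = 1.0
dist = np.abs(np.arange(n)[:, None] - obs_cells[None, :])
taper = np.clip(1.0 - dist / 20.0, 0.0, None)      # linear taper, cutoff at 20 cells
X_a = enkf_update(X_f, rng.normal(size=m), H, obs_err_var=0.1, loc_taper=taper, rng=rng)
```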
|
5 |
Application of the Ensemble Kalman Filter to Estimate Fracture Parameters in Unconventional Horizontal Wells by Downhole Temperature Measurements
Gonzales, Sergio Eduardo 16 December 2013 (has links)
The increase in energy demand throughout the world has forced the oil industry to develop and expand current technologies to optimize well productivity. Distributed temperature sensing has become a practical and fairly inexpensive way to monitor performance in hydraulically fractured wells in real time with the aid of fiber optics. However, no applications have yet attempted to describe or estimate fracture parameters using distributed temperature sensing as the observed quantity. The Ensemble Kalman Filter, a recursive filter, has proved to be an effective tool for inverse problems that determine the parameters of non-linear models. Even though large amounts of data are acquired as input to the estimation, the Ensemble Kalman Filter keeps the computation manageable by using only "snapshots" of the ensemble collected from multiple simulations, with the estimate updated continuously and calibrated against a reference model.
A reservoir model is constructed in ECLIPSE that measures temperature throughout the wellbore. This model is a hybrid representation of what distributed temperature sensing measures in real time along the wellbore. Reservoir and fracture parameters in this model are assigned properties and values similar to an unconventional well. However, certain parameters such as fracture width are manipulated to significantly reduce the computation time.
A sensitivity study is performed for all the reservoir and fracture parameters in order to understand which parameters require more or less data for the Ensemble Kalman Filter to arrive at an acceptable estimate. Two fracture parameters are selected, based on their low sensitivity and their importance in fracture design, for Ensemble Kalman Filter estimation in various simulations.
Fracture permeability has very low sensitivity; nevertheless, the Ensemble Kalman Filter arrives at an acceptable estimate. Similarly, fracture half-length, with medium sensitivity, arrives at an acceptable estimate after roughly the same number of assimilation steps. The true effectiveness of the Ensemble Kalman Filter is shown when both parameters are estimated jointly and an acceptable estimate is reached without the procedure becoming computationally expensive. The effectiveness of the Ensemble Kalman Filter is directly connected to the quantity of data acquired: the more data available to run simulations, the better and faster the filter performs.
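As a hedged illustration of jointly estimating two fracture parameters, the sketch below augments the state with fracture permeability and half-length and updates them from temperature data with a stochastic EnKF step; the forward model is a made-up stand-in, not the ECLIPSE wellbore-temperature model used in the thesis.

```python
import numpy as np

rng = np.random.default_rng(5)
n_ens, n_obs = 50, 10
depth = np.linspace(0.0, 1.0, n_obs)                 # normalized sensor positions

def toy_temperature(log_perm, half_length):
    """Stand-in forward model: fracture parameters -> temperature profile (arbitrary units)."""
    return 80.0 - 5.0 * log_perm * np.exp(-depth) - 0.02 * half_length * depth

# Prior ensemble of the two parameters (log10 fracture permeability, half-length in ft)
params = np.stack([rng.normal(2.0, 0.5, n_ens), rng.normal(300.0, 80.0, n_ens)])
truth = np.array([2.5, 400.0])
obs = toy_temperature(*truth) + rng.normal(0.0, 0.2, n_obs)

# Predicted observations for each member, then a perturbed-observation EnKF update
Y = np.stack([toy_temperature(*params[:, j]) for j in range(n_ens)], axis=1)
A = params - params.mean(axis=1, keepdims=True)
B = Y - Y.mean(axis=1, keepdims=True)
K = (A @ B.T / (n_ens - 1)) @ np.linalg.inv(B @ B.T / (n_ens - 1) + 0.2**2 * np.eye(n_obs))
params_updated = params + K @ (obs[:, None] + rng.normal(0, 0.2, (n_obs, n_ens)) - Y)
print(params_updated.mean(axis=1))   # posterior mean; should move toward [2.5, 400.0]
```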
|
6 |
Testing a Coupled Global-limited-area Data Assimilation System Using Observations from the 2004 Pacific Typhoon Season
Holt, Christina 2011 August 1900 (has links)
Tropical cyclone (TC) track and intensity forecasts have improved in recent years due to increased model resolution, improved data assimilation, and the rapid increase in the number of routinely assimilated observations over oceans. The data assimilation approach that has received the most attention in recent years is Ensemble Kalman Filtering (EnKF). The most attractive feature of the EnKF is that it uses a fully flow-dependent estimate of the error statistics, which can have important benefits for the analysis of rapidly developing TCs.
We implement the Local Ensemble Transform Kalman Filter algorithm, a variation of the EnKF, on a reduced-resolution version of the National Centers for Environmental Prediction (NCEP) Global Forecast System (GFS) model and the NCEP Regional Spectral Model (RSM) to build a coupled global-limited area analysis/forecast system. This is the first time, to our knowledge, that such a system is used for the analysis and forecast of tropical cyclones. We use data from summer 2004 to study eight tropical cyclones in the Northwest Pacific.
The benchmark data sets that we use to assess the performance of our system are the NCEP Reanalysis and the NCEP operational GFS analyses from 2004. These benchmark analyses were both obtained with the Spectral Statistical Interpolation, which was the operational data assimilation system of NCEP in 2004. The operational GFS analysis assimilated a large number of satellite radiance observations in addition to the observations assimilated in our system. All analyses are verified against the Joint Typhoon Warning Center Best Track data set. Errors are calculated for the position and intensity of the TCs.
The global component of the ensemble-based system shows improvement in position analysis over the NCEP Reanalysis, but shows no significant difference from the NCEP operational analysis for most of the storm tracks. The regional component of our system improves the position analysis over all of the global analyses. The intensity analyses, measured by the minimum sea level pressure, are of similar quality in all of the analyses. Regional deterministic forecasts started from our analyses are generally not significantly different from those started from the operational GFS analysis. On average, the regional experiments performed better for sea level pressure forecasts beyond 48 h, while the global forecasts performed better for position forecasts beyond 48 h.
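For context, position errors like those verified against the Best Track data set are typically computed as great-circle distances between analyzed and best-track storm centers; a minimal sketch follows, with hypothetical sample coordinates.

```python
import numpy as np

def great_circle_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points (degrees) via the haversine formula."""
    p1, p2 = np.radians(lat1), np.radians(lat2)
    dphi = p2 - p1
    dlmb = np.radians(lon2 - lon1)
    a = np.sin(dphi / 2.0) ** 2 + np.cos(p1) * np.cos(p2) * np.sin(dlmb / 2.0) ** 2
    return 2.0 * radius_km * np.arcsin(np.sqrt(a))

# Hypothetical analyzed vs. best-track positions for one storm at successive times
analyzed = [(18.2, 132.5), (19.0, 131.1), (20.1, 129.8)]
best_track = [(18.0, 132.9), (19.2, 131.4), (20.0, 129.5)]
errors = [great_circle_km(a[0], a[1], b[0], b[1]) for a, b in zip(analyzed, best_track)]
print([round(e, 1) for e in errors])   # position errors in km
```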
|
7 |
A Hybrid Ensemble Kalman Filter for Nonlinear Dynamics
Watanabe, Shingo 2009 December 1900 (has links)
In this thesis, we propose two novel approaches to a hybrid Ensemble Kalman Filter (EnKF) that overcome limitations of the traditional EnKF. The first approach is to swap the ensemble mean for the ensemble mode estimate to improve the covariance calculation in the EnKF. The second approach applies a coarse-scale permeability constraint during the EnKF update. Both hybrid EnKF approaches are coupled with the streamline-based Generalized Travel Time Inversion (GTTI) algorithm, which periodically updates the mean of the ensemble while the ensemble itself is updated sequentially in a hybrid fashion.
Through the development of the hybrid EnKF algorithm, the characteristics of the EnKF are also investigated. We found that the limits placed on the updated values constrain the assimilation results significantly, and that it is important to assess the measurement error variance in order to balance preservation of the prior information against the observation data misfit. Overshooting problems can be mitigated with streamline-based covariance localization and a normal-score transformation of the parameters, which supports the Gaussian error statistics.
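A minimal sketch of a rank-based normal-score transformation of the kind mentioned above; this is a generic Gaussian-anamorphosis construction under simple assumptions, not the thesis's exact implementation.

```python
import numpy as np
from scipy.stats import norm

def normal_score_transform(x):
    """Map samples to standard-normal scores via their empirical ranks,
    returning the sorted originals needed for the back-transform."""
    x = np.asarray(x, dtype=float)
    ranks = np.argsort(np.argsort(x))                  # 0 .. n-1
    quantiles = (ranks + 0.5) / x.size                 # avoid exactly 0 and 1
    return norm.ppf(quantiles), np.sort(x)

def back_transform(z, sorted_x):
    """Map normal scores back to the original marginal by interpolating quantiles."""
    quantiles = norm.cdf(z)
    grid = (np.arange(sorted_x.size) + 0.5) / sorted_x.size
    return np.interp(quantiles, grid, sorted_x)

# Example: a skewed (log-normal) permeability-like sample
rng = np.random.default_rng(1)
perm = rng.lognormal(mean=3.0, sigma=1.0, size=500)
scores, table = normal_score_transform(perm)
recovered = back_transform(scores, table)
print(np.allclose(recovered, perm))   # True up to floating-point round-off
```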
The mean-for-mode swapping approach can give a better match to the data as long as the mode solution of the inversion process is satisfactory in terms of matching the observation trajectory.
The coarse-scale permeability-constrained hybrid approach gives better parameter estimates in terms of capturing the main trend of the permeability field, and each ensemble member is driven toward the posterior mode solution from the inversion process. However, the well water cut (WWCT) and pressure responses need to be captured by the inversion process to generate physically plausible coarse-scale permeability data for constraining the hybrid EnKF update.
Uncertainty quantification methods for the EnKF were developed to verify the performance of the proposed hybrid EnKF against the traditional EnKF. The results show better assimilation quality over the sequence of updates and demonstrate a stable solution.
The proposed hybrid approaches show promise in both the synthetic examples and a field-scale application.
|
8 |
What the collapse of the ensemble Kalman filter tells us about particle filters
Morzfeld, Matthias, Hodyss, Daniel, Snyder, Chris January 2017 (has links)
The ensemble Kalman filter (EnKF) is a reliable data assimilation tool for high-dimensional meteorological problems. On the other hand, the EnKF can be interpreted as a particle filter, and particle filters (PFs) collapse in high-dimensional problems. We explain that these seemingly contradictory statements offer insight into how PFs function in certain high-dimensional problems, and in particular support recent efforts in meteorology to 'localize' particle filters, i.e. to restrict the influence of an observation to its neighbourhood.
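To illustrate the collapse the abstract refers to, here is a small sketch computing the effective sample size of a plain particle filter as the observation dimension grows; the Gaussian toy problem is chosen purely for illustration.

```python
import numpy as np

def effective_sample_size(log_weights):
    """ESS = 1 / sum(w_i^2) for normalized weights, computed stably in log space."""
    lw = log_weights - np.max(log_weights)
    w = np.exp(lw)
    w /= w.sum()
    return 1.0 / np.sum(w ** 2)

rng = np.random.default_rng(42)
n_particles = 1000
for dim in (1, 10, 100, 1000):
    particles = rng.normal(size=(n_particles, dim))   # prior samples
    y = rng.normal(size=dim)                          # one observation per component
    # log-likelihood of y given each particle, unit observation-error variance
    log_w = -0.5 * np.sum((y - particles) ** 2, axis=1)
    print(dim, round(effective_sample_size(log_w), 1))
# The ESS drops toward 1 as the dimension grows: the weights collapse onto a single particle.
```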
|
9 |
ADVANCING SEQUENTIAL DATA ASSIMILATION METHODS FOR ENHANCED HYDROLOGIC FORECASTING IN SEMI-URBAN WATERSHEDS
Leach, James January 2019 (has links)
Accurate hydrologic forecasting is vital for proper water resource management. Practices that depend on these forecasts include power generation, reservoir management, agricultural water use, and flood early warning systems. Despite these needs, the models in common use are simplifications of the real world and are therefore imperfect. Forecasters face other challenges in addition to model uncertainty, including imperfect observations used for model calibration and validation, imperfect meteorological forecasts, and the difficulty of effectively communicating forecast results to decision-makers. Bayesian methods are commonly used to address some of these issues, and this thesis focuses on improving methods related to recursive Bayesian estimation, more commonly known as data assimilation.
Data assimilation is a means to optimally account for the uncertainties in observations, models, and forcing data. In the literature, data assimilation for urban hydrologic and flood forecasting is rare; therefore, the main areas of study in this thesis are urban and semi-urban watersheds. By improving data assimilation methods, both hydrologic and flood forecasting can be enhanced in these areas. This work explored the use of alternative data products as a type of observation that can be assimilated to improve hydrologic forecasting in an urban watershed. The impact of impervious surfaces in urban and semi-urban watersheds was also evaluated with regard to remotely sensed soil moisture assimilation. Lack of observations is another issue for data assimilation, particularly in semi- or fully-distributed models; because of this, an improved method was developed for updating locations that have no observations, which utilizes information theory's mutual information. Finally, we explored extending data assimilation into the short-term forecast by using prior knowledge of how a model will respond to forecasted forcing data.
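As a rough sketch of the mutual-information idea mentioned above, the example below uses a simple histogram estimator on synthetic flow series; the thesis's actual estimator and data are not reproduced here, and the gamma-distributed flows are an assumption made for illustration.

```python
import numpy as np

def mutual_information(x, y, bins=20):
    """Histogram-based estimate of mutual information (in nats) between two series."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    p_xy = joint / joint.sum()
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)
    nonzero = p_xy > 0
    return np.sum(p_xy[nonzero] * np.log(p_xy[nonzero] / (p_x @ p_y)[nonzero]))

# Synthetic example: flow at ungauged locations, one related to a gauged site, one not
rng = np.random.default_rng(7)
gauged = rng.gamma(shape=2.0, scale=10.0, size=5000)          # gauged streamflow
ungauged_near = gauged * 0.8 + rng.normal(0, 2.0, size=5000)  # strongly related
ungauged_far = rng.gamma(shape=2.0, scale=10.0, size=5000)    # unrelated
print(mutual_information(gauged, ungauged_near), mutual_information(gauged, ungauged_far))
# The higher MI for the related site suggests gauged updates carry more information for it.
```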
Results from this work show that using alternative data products, such as those from the Snow Data Assimilation System or the Soil Moisture and Ocean Salinity mission, can be effective at improving hydrologic forecasting in urban watersheds. They were also effective at identifying a limiting imperviousness threshold for soil moisture assimilation in urban and semi-urban watersheds. Additionally, the inclusion of mutual information between gauged and ungauged locations in a semi-distributed hydrologic model provided better state updates. Finally, by extending data assimilation into the short-term forecast, the reliability of the forecasts could be improved substantially. / Dissertation / Doctor of Philosophy (PhD) / The ability to accurately model hydrological systems is essential, as it allows for better planning and decision making in water resources management. The better we can forecast the hydrologic response to rain and snowmelt events, the better we can plan and manage our water resources. This includes better planning and usage of water for agricultural purposes, better planning and management of reservoirs for power generation, and better preparation for flood events. Unfortunately, the hydrologic models primarily used are simplifications of the real world and are therefore imperfect. Additionally, our measurements of the physical system responses to atmospheric forcing can be prone to both systematic and random errors that need to be accounted for. To address these limitations, data assimilation can be used to improve hydrologic forecasts by optimally accounting for both model and observation uncertainties. The work in this thesis helps to further advance and improve data assimilation, with a focus on enhancing hydrologic forecasting in urban and semi-urban watersheds. The research presented herein can be used to provide better forecasts, which allow for better planning and decision making.
|
10 |
Vers une assimilation des données de déformation en volcanologie / Towards assimilation of deformation measurements in volcanology
Bato, Mary Grace 02 July 2018 (links)
Tracking magma emplacement at shallow depth, as well as its migration towards the Earth's surface, is crucial to forecast volcanic eruptions. With the recent advances in Interferometric Synthetic Aperture Radar (InSAR) imaging and the increasing number of continuous Global Navigation Satellite System (GNSS) networks recorded on volcanoes, it is now possible to provide a continuous and spatially extensive picture of the evolution of surface displacements during inter-eruptive periods. For basaltic volcanoes, these measurements combined with simple dynamical models can be exploited to characterise and constrain magma pressure building within one or several magma reservoirs, allowing better predictive information on the emplacement of magma at shallow depths. Data assimilation, a sequential time-forward process that optimally combines models and observations, sometimes using a priori information based on error statistics, to predict the state of a dynamical system, has recently gained popularity in various fields of geoscience (e.g. ocean and weather forecasting, geomagnetism and natural resources exploration). In this dissertation, I present the very first application of data assimilation in volcanology, from synthetic tests to the analysis of real geodetic data.

The first part of this work focuses on developing strategies to test the applicability and assess the potential of data assimilation, in particular the Ensemble Kalman Filter (EnKF), using a simple two-chamber dynamical model (Reverso et al. 2014) and artificial geodetic data. Synthetic tests are performed in order to: 1) track the evolution of magma pressure at depth and reconstruct the synthetic ground surface displacements, as well as estimate non-evolving uncertain model parameters; 2) properly assimilate GNSS and InSAR data; and 3) highlight the strengths and weaknesses of the EnKF in comparison with a Bayesian inversion technique (e.g. Markov Chain Monte Carlo). Results show that the EnKF works well in the synthetic cases and that data assimilation holds great promise for real-time monitoring of volcanic unrest.

The second part applies the strategy developed through the synthetic tests in order to forecast the rupture of a magma chamber in real time, using the 2004-2011 inter-eruptive dataset at Grímsvötn volcano in Iceland. Here, we introduce the concept of "eruption zones" based on the probability of eruption at each time step, estimated as the percentage of model ensemble members that exceed their failure overpressure values, initially assigned following a given distribution. Our results show that when 25 +/- 1% of the ensemble members exceeded the failure overpressure, an eruption was imminent. Furthermore, in this chapter, we also extend the previous synthetic tests by further enhancing the EnKF strategy for assimilating geodetic data in order to adapt to real-world problems, such as the limited amount of geodetic data available to monitor ice-covered active volcanoes. Common diagnostic tools in data assimilation are presented.

Finally, I demonstrate that, in addition to its interest for predicting volcanic eruptions, sequential assimilation of geodetic data based on the EnKF shows a unique potential to give insight into the deep feeding of the volcanic system. Using the two-reservoir dynamical model for Grímsvötn's plumbing system and assuming a fixed geometry and constant magma properties, we retrieve the temporal evolution of the basal magma inflow beneath Grímsvötn, which drops by up to 85% during the 10 months preceding the onset of the Bárdarbunga rifting event. The loss of at least 0.016 km³ in the magma supply of Grímsvötn is interpreted as a consequence of magma accumulation beneath Bárdarbunga and the subsequent feeding of the Holuhraun eruption 41 km away.
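A minimal sketch of the eruption-probability criterion described above, computed as the fraction of ensemble members whose modelled chamber overpressure exceeds their assigned failure threshold; the distributions and numbers below are illustrative assumptions, not the values used in the thesis.

```python
import numpy as np

rng = np.random.default_rng(3)
n_ens = 200

# Failure overpressure threshold assigned to each member from an assumed distribution (MPa)
failure_overpressure = rng.normal(loc=10.0, scale=1.5, size=n_ens)

def eruption_probability(overpressure_ensemble, thresholds):
    """Fraction of ensemble members whose current overpressure exceeds their threshold."""
    return np.mean(overpressure_ensemble > thresholds)

# Hypothetical overpressure ensembles at three assimilation times (MPa)
for t, mean_op in enumerate([6.0, 8.5, 9.5]):
    overpressure = rng.normal(loc=mean_op, scale=1.0, size=n_ens)
    p = eruption_probability(overpressure, failure_overpressure)
    flag = "eruption imminent" if p >= 0.25 else "below critical level"
    print(f"t={t}: P(eruption) = {p:.2f} -> {flag}")
```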
|