About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
31

Simulation Of Depleted Gas Reservoir For Underground Gas Storage

Ozturk, Bulent 01 December 2004 (has links) (PDF)
For a natural gas importing country, the "take or pay" approach creates problems, since the demand for natural gas varies during the year and the excess gas must be stored. In this study, an underground gas storage project is evaluated in the depleted gas Field M. After gathering all necessary reservoir, fluid, production, and pressure data, the data were prepared as input for a commercial simulator, IMEX, CMG's (Computer Modelling Group) adaptive implicit simulator, in order to reach a history match. The history match, covering four years of production from the gas reservoir, is the first step of this study; the simulator achieved a good match with the given reservoir parameters. Using the history match as a base, five scenarios were created to forecast the injection and withdrawal performance of the reservoir. These scenarios include five newly drilled horizontal wells used in combination with the existing wells. A predetermined injection rate of 13 MMcf/D was set for all wells, and among the five scenarios, the configuration with 5 horizontal and 6 vertical injectors and 5 horizontal and 6 vertical producers handled the gas inventory and the duration of the injection and production periods best. After the well configuration was determined, the optimum injection rate for the entire field was found to be 130 MMcf/D by running different injection rates for all wells; then, with the vertical wells held at a constant 130 MMcf/D, different injection rates were applied to the horizontal wells alone. It was found that the best option is the fifth scenario (5 horizontal and 6 vertical injectors and 5 horizontal and 6 vertical producers) with an injection rate of 130 MMcf/D for both horizontal and vertical wells. Since, within the fifth scenario, raising the injection rate to 1.3 Bcf/D or 13 Bcf/D did not change the average reservoir pressure significantly, it is best to carry out the project at the optimum injection rate of 130 MMcf/D. The total gas produced until 2012 is 394 Bcf and the gas injected is 340 Bcf, with the maximum average reservoir pressure recovered to a new value of 1881 psi by injection and the cushion gas pressure at 1371 psi after withdrawal. Compared with the other scenarios, the fifth scenario improves injection and production performance by about 90%.
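As background for the inventory and pressure figures quoted here, a depleted-gas storage cycle is often screened with the straight-line p/Z material balance before full simulation. The sketch below uses that textbook relation with purely hypothetical values, not data from the thesis.

```python
# Illustrative p/Z material-balance screen for a depleted-gas storage field.
# All numbers are hypothetical examples, not data from the thesis.

def p_over_z(gas_in_place, giip, pi, zi):
    """Straight-line gas material balance: p/Z = (pi/zi) * (G_remaining / GIIP)."""
    return (pi / zi) * (gas_in_place / giip)

GIIP = 500.0        # Bcf, original gas in place (hypothetical)
pi, zi = 3000.0, 0.9
cushion = 230.0     # Bcf remaining at end of withdrawal (cushion gas)
working = 170.0     # Bcf of working gas injected each season

for label, g in [("end of withdrawal", cushion),
                 ("end of injection", cushion + working)]:
    print(f"{label}: p/Z = {p_over_z(g, GIIP, pi, zi):.0f} psia")
```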
32

Applications of Level Set and Fast Marching Methods in Reservoir Characterization

Xie, Jiang August 2012 (has links)
Reservoir characterization is one of the most important problems in petroleum engineering. It involves forward reservoir modeling, which predicts fluid behavior in the reservoir, and the inverse problem, which calibrates reservoir models to the given data. In this dissertation, we focus on two problems in the field of reservoir characterization: depth of investigation in heterogeneous reservoirs, and history matching and uncertainty quantification of channelized reservoirs. The concept of depth of investigation is fundamental to well test analysis. Much of current well test analysis relies on analytical solutions based on homogeneous or layered reservoirs. However, such analytic solutions are severely limited for heterogeneous and fractured reservoirs, particularly for unconventional reservoirs with multistage hydraulic fractures. We first generalize the concept to heterogeneous reservoirs and provide an efficient tool that calculates drainage volume using fast marching methods and estimates pressure depletion based on a geometric pressure approximation. The applicability of the proposed method is illustrated with two applications in unconventional reservoirs: flow-regime visualization and stimulated-reservoir-volume estimation. Due to the high permeability contrast and non-Gaussianity of channelized permeability fields, it is difficult to history match and quantify the uncertainty of channelized reservoirs using traditional approaches. We treat facies boundaries as level set functions and solve the moving boundary problem (history matching) with the level set equation. In addition to level set methods, we also approach the problem with a pixel-based method, in which a reversible jump Markov chain Monte Carlo algorithm searches a parameter space of flexible dimension. Both proposed approaches are demonstrated with two- and three-dimensional examples.
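As a rough illustration of the fast-marching idea behind the drainage-volume calculation, the sketch below computes a diffusive time of flight, which solves the Eikonal equation |∇τ|·√α = 1 with diffusivity α = k/(φμc_t), using a simplified Dijkstra-style update on a 2D grid. The diffusivity field and source location are hypothetical, and a production implementation would use the proper upwind finite-difference stencil rather than graph distances.

```python
# Simplified Dijkstra-style approximation of fast marching for the
# diffusive time of flight tau. For illustration only.
import heapq
import numpy as np

def diffusive_tof(alpha, source, h=1.0):
    """alpha: 2D array of diffusivity; source: (i, j) well cell; h: cell size."""
    nx, ny = alpha.shape
    tau = np.full((nx, ny), np.inf)
    tau[source] = 0.0
    heap = [(0.0, source)]
    done = np.zeros((nx, ny), dtype=bool)
    while heap:
        t, (i, j) = heapq.heappop(heap)
        if done[i, j]:
            continue
        done[i, j] = True
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < nx and 0 <= nj < ny and not done[ni, nj]:
                # local slowness 1/sqrt(alpha), averaged between the two cells
                slowness = 0.5 * (1 / np.sqrt(alpha[i, j]) + 1 / np.sqrt(alpha[ni, nj]))
                cand = t + h * slowness
                if cand < tau[ni, nj]:
                    tau[ni, nj] = cand
                    heapq.heappush(heap, (cand, (ni, nj)))
    return tau

# Drainage volume vs. time: cells with tau below a threshold are "investigated".
alpha = np.ones((50, 50))
alpha[20:30, 20:30] = 10.0            # hypothetical high-diffusivity patch
tau = diffusive_tof(alpha, (25, 25))  # well at the grid center
print("cells reached by tau < 5:", int((tau < 5.0).sum()))
```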
33

A Hierarchical Multiscale Approach to History Matching and Optimization for Reservoir Management in Mature Fields

Park, Han-Young August 2012 (has links)
Reservoir management typically focuses on maximizing oil and gas recovery from a reservoir, based on facts and information, while minimizing capital and operating investments. Modern reservoir management uses history-matched simulation models to predict the range of recovery or to provide the economic assessment of different field development strategies. Geological models are becoming increasingly complex and detailed, with several hundred thousand to millions of cells and large sets of subsurface uncertainties. Current issues associated with history matching therefore involve extensive computation (flow simulation) time, preserving geologic realism, and the non-uniqueness problem. Many recent rate-optimization methods rely on constrained optimization techniques, often making them inaccessible for field reservoir management. Field-scale rate optimization problems involve highly complex reservoir models, production and facilities constraints, and a large number of unknowns. We present a hierarchical multiscale calibration approach that combines global and local updates on coarse and fine grids. In the global update we calibrate large-scale parameters to match global field-level energy (pressure); this is followed by a local update in which we match well-by-well performance by calibrating local cell properties. The multiscale calibration, integrating production data on a coarse grid and then on successively finer grids, is critical for history matching high-resolution geologic models because it significantly reduces simulation time. For rate optimization, we develop a hierarchical analytical method using streamline-assisted flood-efficiency maps. The proposed approach avoids complex optimization tools; rather, we emphasize the visual and intuitive appeal of the streamline method and use analytic solutions derived from the relationship between streamline time of flight and flow rates. The approach is analytic, easy to implement, and well suited to large-scale field applications. Finally, we present a hierarchical Pareto-based approach to history matching under conflicting information. Here we focus on the multiobjective optimization problem, particularly conflicting objectives during history matching of reservoir performance. We combine a Pareto-based multiobjective evolutionary algorithm with Grid Connectivity-based Transformation (GCT) to account for history matching with conflicting information. The power and effectiveness of our approaches are demonstrated using both synthetic and real field cases.
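As an aside, the global-then-local idea can be caricatured in a few lines. The "simulators" below are stand-in functions and the parameterization (one regional multiplier, three per-well multipliers) is hypothetical; the thesis uses full coarse- and fine-grid flow simulation at each stage.

```python
# Toy illustration of a global-then-local calibration hierarchy.
import numpy as np
from scipy.optimize import minimize, minimize_scalar

true_global = 2.0                       # e.g., a regional permeability multiplier
true_local = np.array([0.8, 1.3, 1.1])  # per-well near-wellbore multipliers

def field_response(g):                  # stand-in coarse-grid, field-level "pressure"
    return 100.0 / g

def well_response(g, local):            # stand-in fine-grid, well-by-well "rates"
    return g * local * np.array([10.0, 20.0, 15.0])

obs_field = field_response(true_global)
obs_wells = well_response(true_global, true_local)

# Stage 1 (global): match field-level energy with one large-scale parameter.
res = minimize_scalar(lambda g: (field_response(g) - obs_field) ** 2,
                      bounds=(0.1, 10.0), method="bounded")
g_hat = res.x

# Stage 2 (local): with the global multiplier frozen, match each well.
res2 = minimize(lambda loc: np.sum((well_response(g_hat, loc) - obs_wells) ** 2),
                x0=np.ones(3))
print("global:", round(g_hat, 3), "local:", np.round(res2.x, 3))
```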
34

Fast History Matching of Time-Lapse Seismic and Production-Data for High Resolution Models

Rey Amaya, Alvaro August 2011 (has links)
Seismic data have been established as a valuable source of information for the construction of reservoir simulation models, most commonly for determining the modeled geologic structure and for populating static petrophysical properties (e.g., porosity, permeability). More recently, the availability of repeated seismic surveys over time scales of years (i.e., 4D seismic) has shown promising results for the qualitative determination of changes in fluid-phase distributions and pressure, which is required to identify areas of bypassed oil, swept volumes, and pressure-maintenance mechanisms. Quantitatively, and currently the state of the art in reservoir model characterization, 4D seismic data have proven distinctly useful for calibrating geologic spatial variability, which ultimately improves reservoir development and management strategies. Among the limited variety of techniques for integrating dynamic seismic data into reservoir models, streamline-based techniques have been shown to be among the most efficient, owing to their analytical sensitivity formulations. Although streamline techniques have been used in the past to integrate time-lapse seismic attributes, those applications were limited to simplified modeling scenarios with two-phase fluid flow and streamline geometry held invariant throughout the production schedule. This research builds upon and advances existing approaches to streamline-based seismic data integration to include both production and seismic data under varying field conditions. The proposed approach integrates data from reservoirs under active reservoir management, and the corresponding simulation models can be constrained using highly detailed, realistic schedules. Fundamentally, a new derivation of seismic sensitivities is proposed that can represent complex reservoir evolution between consecutive seismic surveys. The approach is further extended to handle compositional reservoir simulation with dissolution effects and gravity- and convection-driven flows, which are typical of CO2 transport behavior following injection into deep saline aquifers. As a final component of this research, the benefits of dynamic data integration for determining the volumes swept by injection and drained by production are investigated. Several synthetic and field reservoir modeling scenarios are used for an extensive demonstration of the efficacy and practical feasibility of the proposed developments.
35

Integração de análise de incertezas e ajuste de histórico de produção / Integration of uncertainty analysis and history matching process

Moura Filho, Marcos Antonio Bezerra de 12 August 2018 (has links)
Advisors: Denis Jose Schiozer, Celio Maschio / Master's dissertation - Universidade Estadual de Campinas, Faculdade de Engenharia Mecanica e Instituto de Geociencias / Previous issue date: 2006
Abstract: The traditional history matching process usually results in a single deterministic model that is used to represent the reservoir, which may not be enough to guarantee reliable production forecasts, especially for fields in early production stages. This work presents a quantitative uncertainty analysis of reservoir attributes integrated with the history matching process. Instead of a deterministic approach, a probabilistic analysis of the reservoir models is used, resulting in uncertainty ranges for the production forecast and allowing a better view of future reservoir behavior. In the methodology used in this work, simulation data are compared with observed production data and, according to the deviations from the production history, the probabilities of the scenarios are changed. In some of the proposed procedures, the values of the uncertain attributes are also changed, diminishing their uncertainty range. The main challenges of this work are: (1) determining a consistent and reliable procedure to integrate the uncertainty analysis and the history matching process, increasing the reliability of reservoir performance forecasts; and (2) making the procedure automatable, easing the work and speeding up the process. Several criteria were tested until the proposed methodology was validated, and from the analysis of the results a sequence for applying the proposed uncertainty-reduction methods is suggested. The main contribution of this methodology is to increase the reliability of production forecasts obtained through numerical reservoir simulation and to show the need to incorporate uncertainties into the history matching process. Another contribution is to open this line of research by proposing and validating methods to integrate the history matching and uncertainty analysis processes / Master's in Petroleum Science and Engineering
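The scenario-probability update described above can be sketched with a simple Gaussian-likelihood reweighting; the weighting scheme and all values below are illustrative assumptions, not the thesis's specific procedure.

```python
# Sketch: update scenario probabilities from their misfit to observed history.
import numpy as np

observed = np.array([100.0, 95.0, 88.0, 80.0])          # hypothetical rate history

scenarios = {                                            # hypothetical simulated responses
    "pessimistic": np.array([90.0, 82.0, 74.0, 66.0]),
    "base":        np.array([99.0, 94.0, 87.0, 81.0]),
    "optimistic":  np.array([110.0, 108.0, 104.0, 99.0]),
}
prior = {name: 1 / 3 for name in scenarios}
sigma = 5.0                                              # assumed data-error std dev

posterior = {}
for name, sim in scenarios.items():
    misfit = np.sum((sim - observed) ** 2) / (2 * sigma ** 2)
    posterior[name] = prior[name] * np.exp(-misfit)      # Gaussian likelihood
total = sum(posterior.values())
posterior = {k: v / total for k, v in posterior.items()}
print(posterior)  # probability mass shifts toward the best-matching scenario
```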
36

History Matching of 4D Seismic Data Attributes using the Ensemble Kalman Filter

Ravanelli, Fabio M. (has links)
One of the most challenging tasks in the oil industry is the production of reliable reservoir forecast models. Because of different sources of uncertainty, the numerical models employed are often only crude approximations of reality. This problem is tackled by conditioning the model to production data through data assimilation, a process known in the oil industry as history matching. Several recent advances are improving history matching reliability, notably the use of time-lapse seismic data and automated history matching software tools. One of the most promising data assimilation techniques employed in the oil industry is the ensemble Kalman filter (EnKF), because of its ability to deal with highly nonlinear models, its low computational cost, and its easy implementation compared with other methods. A synthetic reservoir model was used in a history matching study designed to predict the production peak, allowing decision makers to plan field development actions properly. If only production data are assimilated, a total of 12 years of historical data is required to properly characterize the production uncertainty and, consequently, the correct moment to take action and decommission the field. If time-lapse seismic data are available, however, this conclusion can be reached 4 years earlier, thanks to the additional fluid-displacement information the seismic data provide: production data are geographically sparse, whereas seismic data are sparse in time. Several types of seismic attributes were tested in this study. Poisson's ratio proved to be the attribute most sensitive to fluid displacement; in practical applications, however, its use is usually avoided because of the poor quality of the data, and seismic impedance tends to be more reliable. Finally, a new conceptual idea was proposed for obtaining time-lapse information for a history matching study: crosswell time-lapse seismic tomography, mapping velocities in the interwell region, was demonstrated as a potential tool to ensure survey reproducibility at low acquisition cost compared with full-scale surface surveys. This approach relies on the higher sensitivity of velocity to fluid displacement at higher frequencies; the velocity effects were modeled using the Biot velocity model. The method provided promising results, leading to RRMS error reductions similar to those obtained with conventionally history-matched surface seismic data.
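For readers unfamiliar with the filter, a minimal stochastic-EnKF analysis step looks roughly like the sketch below. The ensemble sizes, the linear stand-in forward operator, and all numbers are hypothetical, not the thesis's reservoir model.

```python
# Minimal stochastic EnKF analysis step for history matching (illustrative).
import numpy as np

def enkf_update(ensemble, hx, obs, obs_err_std, rng):
    """ensemble: (n_params, n_ens) states; hx: (n_obs, n_ens) predicted data;
    obs: (n_obs,) observed data (production and/or seismic attributes)."""
    n_obs, n_ens = hx.shape
    # Perturb observations once per member (stochastic EnKF).
    d = obs[:, None] + obs_err_std * rng.standard_normal((n_obs, n_ens))
    A = ensemble - ensemble.mean(axis=1, keepdims=True)
    Y = hx - hx.mean(axis=1, keepdims=True)
    C_xy = A @ Y.T / (n_ens - 1)                      # state-data cross-covariance
    C_yy = Y @ Y.T / (n_ens - 1) + obs_err_std**2 * np.eye(n_obs)
    K = C_xy @ np.linalg.inv(C_yy)                    # Kalman gain
    return ensemble + K @ (d - hx)

rng = np.random.default_rng(1)
ens = rng.standard_normal((500, 50))                  # 500 gridblock log-perms, 50 members
H = 0.05 * rng.standard_normal((8, 500))              # stand-in linear forward operator
truth = rng.standard_normal(500)
updated = enkf_update(ens, H @ ens, H @ truth, 0.1, rng)
print(updated.shape)                                  # ensemble updated toward the data
```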
37

Robust Method for Reservoir Simulation History Matching Using Bayesian Inversion and Long-Short Term Memory Network (LSTM) Based Proxy

Zhang, Zhen (has links)
History matching is a critical process used for calibrating simulation models and assessing subsurface uncertainties. This common technique aims to align the reservoir models with the observed data. Achieving this goal is often challenging, however, because of the non-uniqueness of the solution, the underlying subsurface uncertainties, and the usually high computational cost of simulations. The traditional approach is often based on trial and error, which is exhausting and labor-intensive. Analytical and numerical proxies combined with Monte Carlo simulations are used to reduce the computational time, but these approaches suffer from low accuracy and may not fully capture subsurface uncertainties. This study proposes a new robust method utilizing Bayesian Markov chain Monte Carlo (MCMC) to perform assisted history matching under uncertainty. We propose a novel three-step workflow that includes: (1) multi-resolution low-fidelity models to guarantee high-quality matching; (2) a Long Short-Term Memory (LSTM) network as a low-fidelity model that reproduces the continuous time response of the simulation model, combined with Bayesian optimization to obtain the optimum low-fidelity model; and (3) Bayesian MCMC runs to obtain the Bayesian inversion of the uncertainty parameters. We perform sensitivity analysis on the LSTM's architecture, hyperparameters, training set, number of chains, and chain length to obtain the optimum setup for Bayesian-LSTM history matching. We also compare the performance of predicting the recovery factor using different surrogate methods, including polynomial chaos expansions (PCE), kriging, and support vector machines for regression (SVR). We demonstrate the proposed method on a waterflooding problem for the upper Tarbert formation of the tenth SPE comparative model, a case representing a highly heterogeneous nearshore environment. Results showed that the Bayesian-optimized LSTM successfully captured the physics of the high-fidelity model, and the Bayesian-LSTM MCMC produces accurate predictions with narrow uncertainty ranges. Posterior prediction through the high-fidelity model ensures the robustness and accuracy of the workflow. This approach provides an efficient and practical history matching method for reservoir simulation and subsurface flow modeling with significant uncertainties.
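The MCMC step of such a workflow can be sketched with a random-walk Metropolis sampler driven by a proxy. Here a trivial stand-in callable replaces the trained LSTM, and the data, error level, and flat prior are assumptions for illustration only.

```python
# Sketch: random-walk Metropolis MCMC with a cheap proxy in the likelihood.
import numpy as np

rng = np.random.default_rng(2)
obs = np.array([0.35, 0.52, 0.61])                 # hypothetical response history
sigma = 0.05                                       # assumed observation error

def proxy(theta):
    """Stand-in for the trained low-fidelity model: maps uncertainty
    parameters to a time series of responses."""
    return theta[0] * np.array([0.5, 0.8, 1.0]) + theta[1]

def log_post(theta):
    resid = proxy(theta) - obs
    return -0.5 * np.sum(resid**2) / sigma**2      # Gaussian likelihood, flat prior

theta = np.array([0.5, 0.1])
chain = []
for _ in range(5000):
    prop = theta + 0.05 * rng.standard_normal(2)   # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop                               # Metropolis accept
    chain.append(theta)
chain = np.array(chain)
print("posterior mean:", chain[2000:].mean(axis=0))  # discard burn-in
```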
38

A New Method for History Matching and Forecasting Shale Gas/Oil Reservoir Production Performance with Dual and Triple Porosity Models

Samandarli, Orkhan August 2011 (has links)
Different methods have been proposed for history matching the production of shale gas/oil wells, which are drilled horizontally and usually hydraulically fractured in multiple stages: simulation, analytical models, and empirical equations. Among these, analytical models are the most favorable in application to field data, for two reasons: analytical solutions are faster than simulation, and they are more rigorous than empirical equations. The production behavior of horizontally drilled shale gas/oil wells had never previously been completely matched with models such as those described in this thesis. For shale gas wells, the correction due to adsorption is explained with derived equations. The algorithm used for history matching and forecasting is explained in detail, together with an implementation written in Excel's VBA. As the objective of this research, a robust method is presented along with a computer program applied to field data. The method is used to analyze the production performance of gas wells in the Barnett, Woodford, and Fayetteville shales, and it is shown to work well for understanding the reservoir description and predicting the future performance of shale gas wells. A synthetic shale oil well was also used to validate the application of the method to oil wells. Given the huge unconventional resource potential and the increasing energy demand in the world, the method described in this thesis will be a "game changing" technology for understanding reservoir properties and making future predictions in a short period of time.
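The adsorption correction mentioned above is conventionally built on the Langmuir isotherm; the sketch below shows that standard form with made-up, Barnett-like constants, not the thesis's exact derived equations.

```python
# Langmuir isotherm, the standard basis for shale-gas adsorption corrections.
def langmuir_adsorbed_volume(p, v_l, p_l):
    """Adsorbed gas content (scf/ton) at pressure p (psia):
    V(p) = V_L * p / (p_L + p)."""
    return v_l * p / (p_l + p)

v_l, p_l = 96.0, 650.0   # hypothetical Langmuir volume and pressure
p_init = 3000.0          # hypothetical initial reservoir pressure, psia
for p in (500.0, 1500.0, 3000.0):
    released = (langmuir_adsorbed_volume(p_init, v_l, p_l)
                - langmuir_adsorbed_volume(p, v_l, p_l))
    print(f"drawdown to {p:.0f} psia releases {released:.1f} scf/ton")
```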
39

History matching and uncertainty quantification using sampling method

Ma, Xianlin 15 May 2009 (has links)
Uncertainty quantification involves sampling the reservoir parameters correctly from a posterior probability function that is conditioned to both static and dynamic data. Rigorous sampling methods like Markov chain Monte Carlo (MCMC) are known to sample from the correct distribution but can be computationally prohibitive for high-resolution reservoir models; approximate sampling methods are more efficient but less rigorous for nonlinear inverse problems. There is thus a need for an approach to uncertainty quantification for nonlinear inverse problems that is both efficient and rigorous. First, we propose a two-stage MCMC approach using sensitivities for quantifying uncertainty in history matching geological models. In the first stage, we compute the acceptance probability for a proposed change in reservoir parameters based on a linearized approximation to flow simulation in a small neighborhood of the previously computed dynamic data. In the second stage, proposals that pass the first-stage criterion are assessed by running full flow simulations to ensure rigor. Second, we propose a two-stage MCMC approach using response-surface models for quantifying uncertainty. The formulation allows us to history match three-phase flow simultaneously; the constructed response surface exists independently of the expensive flow simulation and provides efficient samples for the reservoir simulation and MCMC in the second stage. Third, we propose a two-stage MCMC approach using upscaling and non-parametric regression for quantifying uncertainty. A coarse-grid model acts as a surrogate for the fine-grid model via flow-based upscaling, and the response of the coarse-scale model is corrected by error modeling via non-parametric regression to approximate the response of the computationally expensive fine-scale model. Our proposed two-stage sampling approaches are computationally efficient and rigorous, with a significantly higher acceptance rate than traditional MCMC algorithms. Finally, we developed a coarsening algorithm to determine an optimal reservoir simulation grid by grouping fine-scale layers in such a way that the heterogeneity measure of a defined static property is minimized within the layers; the optimal number of layers is then selected based on a statistical analysis. The power and utility of our approaches have been demonstrated using both synthetic and field examples.
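The two-stage screening logic common to these approaches can be sketched as follows. Both "models" are stand-in functions and all numbers are hypothetical; the point is that the cheap first stage filters proposals so the expensive simulation runs only on promising ones, with a corrected second-stage acceptance ratio preserving the target distribution.

```python
# Sketch of two-stage MCMC: cheap screen, then full-model acceptance.
import numpy as np

rng = np.random.default_rng(3)
obs, sigma = 1.0, 0.2

def fine_model(m):      # stand-in for the expensive flow simulation
    return m**2

def coarse_model(m):    # stand-in for the linearized / upscaled approximation
    return 0.9 * m**2 + 0.05

def log_like(pred):
    return -0.5 * (pred - obs) ** 2 / sigma**2

m = 0.5
ll_coarse, ll_fine = log_like(coarse_model(m)), log_like(fine_model(m))
accepted = fine_runs = 0
for _ in range(10000):
    prop = m + 0.3 * rng.standard_normal()
    # Stage 1: screen with the cheap model.
    llc = log_like(coarse_model(prop))
    if np.log(rng.uniform()) >= llc - ll_coarse:
        continue                       # rejected cheaply; no fine simulation run
    # Stage 2: promote to the full model with a corrected acceptance ratio.
    fine_runs += 1
    llf = log_like(fine_model(prop))
    if np.log(rng.uniform()) < (llf - ll_fine) + (ll_coarse - llc):
        m, ll_coarse, ll_fine = prop, llc, llf
        accepted += 1
print(f"fine simulations: {fine_runs}, accepted: {accepted}")
```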
40

Continuous reservoir model updating using an ensemble Kalman filter with a streamline-based covariance localization

Arroyo Negrete, Elkin Rafael 25 April 2007 (has links)
This work presents a new approach that combines the comprehensive capabilities of the ensemble Kalman filter (EnKF) with the flow-path information from streamlines to eliminate and/or reduce some of the problems and limitations of using the EnKF for history matching reservoir models. The recent use of the EnKF for data assimilation and for assessing uncertainties in future forecasts in reservoir engineering seems promising: the EnKF provides ways of incorporating any type of production data or time-lapse seismic information efficiently. However, its use in history matching comes with its share of challenges and concerns. The overshooting of parameters, leading to loss of geologic realism; a possible increase in the material-balance errors of the updated phase(s); and limitations associated with non-Gaussian permeability distributions are some of the most critical problems of the EnKF. A larger ensemble size may mitigate some of these problems but is prohibitively expensive in practice. We present a streamline-based conditioning technique that can be implemented with the EnKF to eliminate or reduce the magnitude of these problems, allowing the use of a reduced ensemble size and thereby leading to significant time savings during field-scale implementation. Our approach involves no extra computational cost and is easy to implement; additionally, the final history-matched model tends to preserve most of the geological features of the initial geologic model. An outline of the procedure is provided that enables this approach to be incorporated into current EnKF implementations. Our procedure uses the streamline-path information to condition the covariance matrix in the Kalman update. We demonstrate the power and utility of our approach with synthetic examples and a field case. Our results show that, using the conditioning technique presented in this thesis, the overshooting/undershooting problems disappear and the limitations of working with non-Gaussian distributions are reduced. Finally, an analysis of the scalability of a parallel implementation of our computer code is given.
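Mechanically, streamline-based localization amounts to damping the state-observation cross-covariance elementwise before forming the Kalman gain. In the sketch below a random 0/1 mask stands in for the streamline-derived sensitivity regions, and all dimensions are hypothetical.

```python
# Sketch of covariance localization in the Kalman update via a Schur product.
import numpy as np

n_cells, n_ens, n_obs = 200, 40, 4
rng = np.random.default_rng(4)
ens = rng.standard_normal((n_cells, n_ens))    # hypothetical state ensemble
pred = rng.standard_normal((n_obs, n_ens))     # hypothetical predicted well data

A = ens - ens.mean(axis=1, keepdims=True)
Y = pred - pred.mean(axis=1, keepdims=True)
C_xy = A @ Y.T / (n_ens - 1)                   # raw cross-covariance

# rho[i, j] = 1 only where cell i lies along streamlines feeding well j
# (a random mask here, standing in for the streamline-derived one).
rho = np.zeros((n_cells, n_obs))
for j in range(n_obs):
    rho[rng.choice(n_cells, size=30, replace=False), j] = 1.0

C_xy_loc = rho * C_xy                          # Schur (elementwise) product
C_yy = Y @ Y.T / (n_ens - 1) + 0.01 * np.eye(n_obs)
K = C_xy_loc @ np.linalg.inv(C_yy)             # localized Kalman gain
print("cells with nonzero gain:", int((np.abs(K).sum(axis=1) > 0).sum()))
```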
