11

Understanding reservoir mechanisms using phase and component streamline tracing

Kumar, Sarwesh 15 May 2009 (has links)
Conventionally, streamlines are traced using the total flux across grid-cell faces. Visualizing total-flux streamlines shows the movement of the flood front, injector-producer relationships, the swept area, and tracer movement. However, they fail to capture some important signatures of reservoir dynamics, such as the dominant flowing phase, the appearance and disappearance of phases (e.g., gas), and the flow of components such as CO2. In the work presented here, we demonstrate the benefits of visualizing phase and component streamlines, which are traced using phase and component fluxes, respectively. Although phase and component streamlines are not appropriate for simulation, because they may be discontinuous, they carry a great deal of useful information about reservoir processes and recovery mechanisms. In this research, phase and component streamline tracing has been successfully implemented in three-phase and compositional simulation, and the additional information obtained from these streamlines has been explored. The power and utility of phase and component streamlines have been demonstrated using synthetic examples and two field cases. The new streamline-tracing formulation provides additional information about reservoir drive mechanisms. Phase streamlines capture the dominant flowing phase in different parts of the reservoir, so the area swept by each phase can be identified, as can the appearance and disappearance of phases. These streamlines can also be used to optimize field recovery processes such as water injection and the placement of infill wells. Component streamlines allow the movement of components such as CO2 to be traced, so they can be used to optimize tertiary recovery mechanisms and to track tracers. They can likewise be used to trace CO2 in sequestration projects where CO2 is injected for long-term storage in aquifers or reservoirs. Other potential uses include the study of reservoir processes and behavior, such as mapping drainage areas for different phases and allocating phase rates to reservoir layers.
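A minimal sketch of the central idea, not the author's implementation: scale each face's total flux by an upstream-weighted fractional flow to obtain a phase flux, then apply a Pollock-type transit-time calculation to the resulting phase velocities. All function names and numbers below are illustrative assumptions.

```python
import math

def pollock_transit_time(v_in, v_out, dx):
    """Pollock-type transit time across one grid cell in one direction,
    assuming the interstitial velocity varies linearly from the entry-face
    value v_in to the exit-face value v_out (both assumed positive here)."""
    g = (v_out - v_in) / dx            # velocity gradient inside the cell
    if abs(g) < 1e-15:                 # uniform velocity: simple ratio
        return dx / v_in
    return math.log(v_out / v_in) / g

def phase_face_velocity(q_total, fw_upstream, area, porosity):
    """Water-phase interstitial velocity on a cell face: the total volumetric
    flux is scaled by the upstream fractional flow before tracing, which is
    the change that turns total-flux streamlines into phase streamlines."""
    return fw_upstream * q_total / (area * porosity)

# Illustrative (hypothetical) numbers for a single cell
q_in, q_out = 120.0, 110.0             # total volumetric fluxes through the two faces
fw_in, fw_out = 0.8, 0.6               # upstream water fractional flows
area, phi, dx = 100.0, 0.2, 50.0       # face area, porosity, cell length

v_in = phase_face_velocity(q_in, fw_in, area, phi)
v_out = phase_face_velocity(q_out, fw_out, area, phi)
print("water-phase transit time across the cell:",
      pollock_transit_time(v_in, v_out, dx))
```

Because a phase flux can vanish where the phase disappears, such streamlines may terminate inside the reservoir, which is exactly the discontinuity the abstract notes makes them unsuitable for simulation yet informative for visualization.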
12

The impact of grid geometry on displacement calculations

Jimenez Arismendi, Eduardo A. 15 November 2004 (has links)
Reservoir simulation models are becoming increasingly sophisticated in tandem with the rapid development of geological modeling methods. Widely used commercial simulators usually model flow through heavily faulted and structurally complex geometries with the flexibility provided by corner-point geometry. However, the nonorthogonality present within these frameworks may compromise the solution accuracy of the model and the operational decisions that follow from it. We propose a systematic methodology to evaluate the impact of complex gridding by introducing a new streamline formulation for corner-point geometry. Based on a new time-like variable, the new formulation provides a significantly simpler and more robust way to handle the complexity of structurally demanding and faulted systems. It retains the simplicity and speed of streamline-based flow models and provides an efficient way to visualize nonorthogonal effects. When applied to various geometries with challenging geological and flow features, the displacement fronts obtained from streamline-derived analytic calculations identified characteristic discrepancies between known solutions and the results of two widely used commercial simulators.
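The abstract does not define the new time-like variable; for orientation only, streamline methods conventionally parameterize transport by the time of flight τ, the travel time of a neutral particle along a streamline:

```latex
\tau(s) = \int_{0}^{s} \frac{\phi(\xi)}{\lvert \mathbf{u}(\xi) \rvert}\, d\xi
```

with porosity φ, Darcy velocity u, and arc length s along the streamline. In corner-point grids the practical difficulty lies in evaluating this velocity field consistently inside non-orthogonal cells, which is the step the proposed formulation is said to simplify.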
13

An efficient Bayesian formulation for production data integration into reservoir models

Leonardo, Vega Velasquez 17 February 2005 (has links)
Current techniques for production data integration into reservoir models can be broadly grouped into two categories: deterministic and Bayesian. The deterministic approach relies on imposing parameter smoothness constraints using spatial derivatives to ensure large-scale changes consistent with the low resolution of the production data. The Bayesian approach is based on prior estimates of model statistics such as parameter covariance and data errors and attempts to generate posterior models consistent with the static and dynamic data. Both approaches have been successful for field-scale applications although the computational costs associated with the two methods can vary widely. This is particularly the case for the Bayesian approach that utilizes a prior covariance matrix that can be large and full. To date, no systematic study has been carried out to examine the scaling properties and relative merits of the methods. The main purpose of this work is twofold. First, we systematically investigate the scaling of the computational costs for the deterministic and the Bayesian approaches for realistic field-scale applications. Our results indicate that the deterministic approach exhibits a linear increase in the CPU time with model size compared to a quadratic increase for the Bayesian approach. Second, we propose a fast and robust adaptation of the Bayesian formulation that preserves the statistical foundation of the Bayesian method and at the same time has a scaling property similar to that of the deterministic approach. This can lead to orders of magnitude savings in computation time for model sizes greater than 100,000 grid blocks. We demonstrate the power and utility of our proposed method using synthetic examples and a field example from the Goldsmith field, a carbonate reservoir in west Texas. The use of the new efficient Bayesian formulation along with the Randomized Maximum Likelihood method allows straightforward assessment of uncertainty. The former provides computational efficiency and the latter avoids rejection of expensive conditioned realizations.
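For reference, and not quoted from the thesis, the standard Bayesian history-matching objective that gives rise to the large, full prior covariance matrix mentioned above is

```latex
J(m) = \tfrac{1}{2}\bigl(g(m)-d_{\mathrm{obs}}\bigr)^{\top} C_D^{-1}\bigl(g(m)-d_{\mathrm{obs}}\bigr)
     + \tfrac{1}{2}\bigl(m-m_{\mathrm{prior}}\bigr)^{\top} C_M^{-1}\bigl(m-m_{\mathrm{prior}}\bigr)
```

where g(m) is the simulated production response, C_D the data-error covariance, and C_M the prior model covariance. In the Randomized Maximum Likelihood method, each conditioned realization minimizes J with d_obs and m_prior replaced by samples drawn from N(d_obs, C_D) and N(m_prior, C_M), which is why uncertainty assessment becomes straightforward once each minimization is cheap.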
14

Model Calibration, Drainage Volume Calculation and Optimization in Heterogeneous Fractured Reservoirs

Kang, Suk Sang 1975- 14 March 2013 (has links)
We propose a rigorous approach for well drainage volume calculations in gas reservoirs based on the flux field derived from dual-porosity finite-difference simulation and demonstrate its application to optimizing well placement. Our approach relies on a high-frequency asymptotic solution of the diffusivity equation and emulates the propagation of a 'pressure front' in the reservoir along gas streamlines. The proposed approach is a generalization of the radius of drainage concept in well test analysis (Lee 1982), which allows us not only to compute the well drainage volumes rigorously as a function of time but also to examine the potential impact of infill wells on the drainage volumes of existing producers. Using these results, we present a systematic approach to optimizing well placement to maximize the Estimated Ultimate Recovery. A history matching algorithm is proposed that sequentially calibrates reservoir parameters from global to local scales, considering parameter uncertainty and the resolution of the data. Parameter updates are constrained to the prior geologic heterogeneity and performed parsimoniously, down to the smallest spatial scales at which they can be resolved by the available data. In the first step of the workflow, a Genetic Algorithm is used to assess the uncertainty in global parameters that influence field-scale flow behavior, specifically reservoir energy. To identify the reservoir volume over which each regional multiplier is applied, we have developed a novel approach to heterogeneity segmentation based on spectral clustering theory. The proposed clustering captures the main features of the prior model using the second eigenvector of the graph affinity matrix. In the second stage of the workflow, we parameterize the high-resolution heterogeneity in the spectral domain using the Grid Connectivity-based Transform (GCT) to sharply reduce the dimension of the calibration parameter set. The GCT implicitly imposes geological continuity and promotes minimal changes to each prior model in the ensemble during the calibration process. The field-scale utility of the workflow is then demonstrated with the calibration of a model characterizing a structurally complex and highly fractured reservoir.
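A sketch of the asymptotic result the abstract alludes to, in its standard form and under the assumption of single-phase flow with viscosity μ and total compressibility c_t: the high-frequency limit of the diffusivity equation reduces to an Eikonal equation for a diffusive time of flight τ(x),

```latex
\sqrt{\frac{k(\mathbf{x})}{\phi(\mathbf{x})\,\mu\, c_t}}\;\lvert \nabla \tau(\mathbf{x}) \rvert = 1
```

The pressure front is taken to propagate along the gas streamlines with arrival governed by τ, and the drainage volume at a given time is the pore volume enclosed by the front that has arrived by then; the exact conversion from τ to physical time depends on the flow regime and the convention adopted.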
16

A Study on the Relationship among Organizational Change, Quality of Work Life, and Organizational Commitment: A Case of a Navy Logistics Department

Lan, Yi-chien 16 August 2008 (has links)
Directed by the Military Streamline Program, the missions and organization of Navy shipyards have undergone great change in recent years. They were transformed from ship-maintenance facilities into district-like Logistics Support Commands, which not only repair ships but also handle the security and utility-management tasks once operated by the Naval Districts. During this transformation, the shipyards' employees and servicemen play an important role in transferring skills and assignments, yet the inevitable impact of the change has also affected them in every respect. Few academic studies have focused on Navy logistics personnel, particularly on the relationship between quality of work life and organizational commitment; because these two factors are useful criteria for evaluating the performance of the streamline program, further study can help ease its impact on both the organization and its people. The core missions of the Logistics Support Commands are ship maintenance and repair, inherited from the shipyards, and the workforce consists mainly of technical sergeants and civilian employees. Technical sergeants can be replaced through the recruiting system, but civilian employees are no longer being hired; the aging and decline of this workforce increases the workload and stress on the remaining personnel and significantly affects the quality and competence of the Navy's ship-maintenance capability. This thesis examines the impact of the streamline process on the civilian employees of the Logistics Support Command, focusing on the relationship between quality of work life and organizational commitment during this period, and can serve as a reference for planning the next stage of the streamline program. The purposes of this research are to sustain employees' quality of work life while organizational commitment is declining, to implement the streamline program properly, to promote work effectiveness, and to ensure missions are fulfilled adequately. The subjects of this research are the civilian employees of the Navy Logistics Support Command. A total of 182 questionnaires were delivered and 177 valid responses were received. The analytical methods are factor analysis, descriptive statistical analysis, correlation analysis, and regression analysis. The findings of this research are: 1. Every factor of the streamline process is positively related to quality of work life and organizational commitment, with the exception of "group relationship". 2. Employees who are more conscious of quality of work life, such as the content, conditions, and lifestyle of work, are more likely to remain in the organization and to be committed to it. 3. "Organizational streamlining" and "organizational commitment" are positively correlated. The suggestions of this research are: 1. Design the streamline program appropriately and emphasize communication repeatedly. 2. Simplify operational processes and focus on quality of work life. 3. Encourage employees to pursue life-long learning and to diversify their skills. 4. Persuade senior employees to stay longer, pass their techniques on to new hands, and strengthen organizational commitment as well.
17

Streamline-based simulation of water injection in naturally fractured reservoirs

Al-Huthali, Ahmed 30 September 2004 (has links)
The current streamline formulation is limited to single-porosity systems and is therefore not suitable for application to naturally fractured reservoirs. Describing fluid transport in naturally fractured reservoirs has been recognized as a main challenge for simulation engineers due to the complicated physics involved. In this work, we generalized streamline-based simulation to describe fluid transport in naturally fractured reservoirs. We implemented three types of transfer function: the conventional transfer function (CTF), the diffusion transfer function (DTF), and the empirical transfer function (ETF). We showed that these transfer functions can be implemented easily in current single-porosity streamline codes. They were added as a source term to the transport equation that describes the saturation evolution along the streamlines. We solved this equation numerically for all three transfer functions; the numerical solution of the continuity equation with the DTF and ETF requires discretizing a convolution term. We derived an analytical solution to the saturation equation with the ETF in terms of streamline time of flight (TOF) to validate the numerical solution, and obtained an excellent match between the numerical and analytical solutions. The final stage of our study was to validate our work by comparing our dual-porosity streamline simulator (DPSS) to the commercial dual-porosity simulator ECLIPSE, which uses the CTF to describe the interaction between the matrix blocks and the fracture system. The dual-porosity streamline simulator with the CTF showed an excellent match with the dual-porosity ECLIPSE. On the other hand, dual-porosity streamline simulation with the DTF and ETF showed a lower recovery than that obtained from the dual-porosity ECLIPSE and the DPSS with the CTF. This difference in oil recovery is not due to our formulation but is related to the theoretical basis on which the CTF, DTF, and ETF were derived in the literature; investigating the relative accuracy of each transfer function was beyond the scope of this study. We demonstrated that the DPSS is computationally efficient and well suited to large-scale field applications, and showed that it minimizes numerical smearing and grid-orientation effects compared to the dual-porosity ECLIPSE.
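Schematically, and with the specific transfer-function forms left unspecified (they are not given in this abstract), the fracture saturation equation solved along each streamline can be written as

```latex
\frac{\partial S_{wf}}{\partial t} + \frac{\partial f_w(S_{wf})}{\partial \tau} = -\,\Gamma_w
```

where τ is the streamline time of flight, f_w the water fractional flow in the fracture system, and Γ_w the matrix-fracture water transfer term. Choosing the CTF, DTF, or ETF form of Γ_w (the latter two introducing the convolution term mentioned above) is essentially the only change relative to a single-porosity streamline solver, which is why the abstract notes that these functions are easy to add to existing codes.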
18

Design rolby na úpravu ledové plochy / Design of ice resurfacer

Machálek, Róbert January 2010 (has links)
The theme of this diploma thesis is the design of an ice resurfacer. It deals with the design of a special automated vehicle able to carry out the ice-resurfacing process without human control. With regard to protecting the environment and the health of people inside the rink arena, the propulsion system is based on an electric motor with batteries as the power source. The final design reflects the innovation of automation in this area. The appearance supports the uniqueness and timelessness of the ice resurfacer and is built on the principles of streamline design, with emphasis on purity of form.
19

A feature-based approach to visualizing and mining simulation data

Jiang, Ming 13 July 2005 (has links)
No description available.
20

Advanced flow visualization

LI, Liya 11 December 2007 (has links)
No description available.
