1. Simulation study of preformed particle gel for conformance control. Taksaudom, Pongpak (10 October 2014)
Conformance control has long been a compelling subject in improving waterflood oil recovery. By blocking the areas previously swept by water, subsequently injected water is allowed to sweep the remaining unswept portions of the reservoir and thereby increase the ultimate oil recovery. One technique that has recently received a great deal of attention for achieving this in-depth water shut-off is crosslinked gel injection. However, processing and predicting the performance of these gels in complex petroleum reservoirs is known to be extremely challenging. A model that accurately represents the reservoir features, chemical properties, and displacement mechanisms is therefore required. In this study, we further developed the UT in-house numerical reservoir simulator, branded as UTGEL. Our first focus was to enable UTGEL to simulate a new type of temperature-resistant and salt-tolerant pre-crosslinked swellable particle gel, known as Preformed Particle Gel or PPG. A series of numerical simulations was conducted to match experimental data and generate parameters for full-field-scale simulation. Five laboratory experimental matching attempts were successfully performed using the UTGEL simulator in this study. The matched experiments included a fracture model, two sandpack models, a sandstone coreflood experiment, and a parallel sandpack model. The second focus of this study was to investigate the applications of PPG in blocking high-permeability layers, fractures, and conduits. A number of synthetic and actual field cases were generated to study the performance of PPG in (1) reservoirs with layered permeability contrasts ranging from extremely low to extremely high, (2) reservoirs containing extensive conduits or channels, and (3) real field cases where heterogeneity had been identified as unfavorable to waterflood efficiency. The simulation outcomes indicated significant incremental oil recovery from PPG treatment, ranging from less than 5% to almost 30%. A number of sensitivity analyses were also conducted to provide insights into optimal PPG treatment design. Lastly, to enhance the capability of UTGEL in simulating gel transport in diverse scenarios, a novel Embedded Discrete Fracture Modeling (EDFM) concept was implemented in UTGEL, allowing multiple sets of fracture planes and conduits with dip angles and orientations to be modeled and simulated with gel treatments for the first time using a relatively computationally inexpensive method. Although the developed simulator requires further improvement and validation against wider reservoir and fluid conditions, the representative results from the models generated in this study suggest another step forward towards realistic reservoir modeling and advanced gel transport simulation.
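The diversion mechanism this abstract describes (injected water preferentially following the highest-permeability layer until a gel raises that layer's flow resistance) can be illustrated with a simple Darcy-law flow split between parallel layers. The sketch below is not taken from the thesis or from UTGEL; the layer properties and the residual resistance factor are invented for illustration.

```python
# Illustrative two-layer Darcy flow split; layer properties and the
# residual resistance factor are invented numbers, not thesis data.

def flow_fractions(perms_md, thicknesses_m):
    """Fraction of injected water entering each parallel layer, assuming the
    same pressure drop, length and viscosity (Darcy: q_i ~ k_i * h_i)."""
    kh = [k * h for k, h in zip(perms_md, thicknesses_m)]
    total = sum(kh)
    return [x / total for x in kh]

layers = {"thief zone": (2000.0, 3.0), "matrix": (50.0, 10.0)}  # (mD, m)
perms = [k for k, _ in layers.values()]
thick = [h for _, h in layers.values()]

print("before gel:", dict(zip(layers, flow_fractions(perms, thick))))

# Suppose the gel reduces thief-zone permeability by a (hypothetical)
# residual resistance factor of 50.
frr = 50.0
print("after gel: ", dict(zip(layers, flow_fractions([perms[0] / frr, perms[1]], thick))))
```

Reducing the thief-zone permeability by the assumed factor shifts most of the injected water into the lower-permeability layer, which is the qualitative effect the PPG treatment simulations quantify.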
2. Quality in vocational education for higher technicians in Hong Kong. Law, Kwok-Sang (January 2001)
No description available.
3. Application of X-ray CT for investigating fluid flow and conformance control during CO2 injection in highly heterogeneous media. Chakravarthy, Deepak (29 August 2005)
Fractured reservoirs have always been considered poor candidates for enhanced oil recovery. This can be attributed to the complexities involved in understanding and predicting performance in these reservoirs. In a fractured system, the high-permeability fracture forms the preferred pathway for the injected fluids, and a large amount of the oil stored in the matrix is bypassed. Hence, a good understanding of multiphase fluid flow in fractures is required to reduce oil bypass and increase recovery from these reservoirs. This research investigates the effect of heterogeneity and injection rates on oil bypass, as well as the various techniques used to improve sweep efficiency in heterogeneous systems. Several coreflood experiments were performed using homogeneous and heterogeneous cores, and a fourth-generation X-ray CT scanner was used to visualize heterogeneity and fluid flow in the core. Porosity and saturation measurements were made during the course of the experiments.

The experimental results indicate that injection rates play a very important role in the recovery process, particularly in the presence of fractures. At high injection rates, faster breakthrough of CO2 and higher oil bypass were observed than at low injection rates. However, very low injection rates are not attractive from an economic point of view. Hence, water viscosified with a polymer was injected directly into the fracture to divert CO2 flow into the matrix and delay breakthrough, similar to the WAG process. Although breakthrough was delayed considerably, water "leak-off" into the matrix was very high. To counter this problem, a cross-linked gel was used in the fracture for conformance control. The gel was found to overcome the "leak-off" problem and effectively divert CO2 flow into the matrix. This experimental research will serve to increase the understanding of fluid flow and conformance control methods in fractured reservoirs.
4. Simulation study of polymer microgel conformance treatments. Abdulbaki, Mazen Ramzi (06 November 2012)
Significant quantities of hydrocarbon are bypassed during conventional waterfloods. This is the direct result of fluid channeling through high-permeability zones within the reservoir. Conformance control offers a means of increasing vertical and areal sweep efficiency, thus decreasing the amount of hydrocarbon bypassed. This, in turn, results in increased hydrocarbon production, decreased water cut, and field life extension. This thesis focuses on the use of polymer microgels as a relatively novel conformance control agent. Polymer-microgel-enhanced waterflooding tackles fluid channeling by "plugging" high-permeability channels, or thief zones, and diverting trailing flooding fluid to adjacent poorly swept areas of the reservoir.
The first major objective of this thesis was to provide an extensive literature survey on polymer microgel technology, which can serve as the go-to reference on this topic. Colloidal Dispersion Gels (CDGs), Preformed Particle Gels (PPGs), temperature-sensitive polymer microgels (Bright Water), and pH-sensitive polymer microgels are all discussed in detail, and an attempt is made to highlight the potential mechanisms by which they plug thief zones and improve oil recovery.
This thesis then outlines the results of simulating numerous polymer microgel floods, ranging from experimental cases to field cases. Specifically, Colloidal Dispersion Gels (CDGs) were chosen for the simulations undertaken. All simulations were run using UTGEL, a newly developed in-house simulator designed exclusively for the simulation of polymer, gel, and microgel floods. The simulations performed provide insight into the polymer microgel flooding process and also serve as a means of validating UTGEL's polymer microgel (CDG) models. The development of the UTGEL simulator is important because it enables the optimization of polymer microgel floods for maximized hydrocarbon recovery efficiency.
The results of a simulation study, using a synthetic field case, are also outlined. This sensitivity study provides additional insight into optimal operational conditions for polymer microgel technology. More specifically, it investigates the effectiveness of microgel flooding treatments in layered reservoirs with varying permeability contrasts and vertical-to-horizontal permeability ratios, under a variety of injection concentrations.
5. Device fingerprinting: Conformance test of HTML5. Bolin, Tobias (January 2015)
No description available.
6. Symboleo: Specification and Verification of Legal Contracts. Parvizimosaed, Alireza (21 October 2022)
Contracts are legally binding and enforceable agreements among two or more parties that govern social interactions. They have been used for millennia, including in commercial transactions, employment relationships and intellectual property generation. Each contract determines the obligations and powers of the contracting parties. The execution of a contract needs to be continuously monitored to ensure compliance with its terms and conditions. Smart contracts are software systems that monitor and control the execution of contracts to ensure compliance. But for such software systems to become possible, contracts need to be specified precisely to eliminate ambiguities, contradictions, and missing clauses. This thesis proposes a formal specification language for contracts named Symboleo. The ontology of Symboleo is founded on the legal concepts of obligation (a kind of duty) and power (a kind of right), complemented with the concepts of event and situation, which are suitable for conceptualizing monitoring tasks. The formal semantics of the legal concepts is defined in terms of state machines that describe the lifetimes of contracts, obligations, and powers, together with axioms that precisely describe state transitions. The language supports execution-time operations that enable subcontracting, assignment of rights, and substitution of performance by a third party during the execution of a contract. Symboleo has been applied to the formalization of contracts from three different domains as a preliminary evaluation of its expressiveness. Formal specifications can be algorithmically analyzed to ensure that they satisfy desired properties. Towards this end, the thesis presents two implemented analysis tools. One is a conformance checking tool that ensures that a specification is consistent with the expectations of contracting parties; expectations are defined for this tool in terms of scenarios (sequences of events) and the expected final outcome (i.e., successful/unsuccessful execution). The other tool (SymboleoPC), which builds on top of an existing model checker (nuXmv), can prove or disprove desired properties of a contract, expressed in temporal logic. These tools have been used for assessing different business contracts. SymboleoPC is also assessed in terms of performance and scalability, with positive results. Symboleo, together with its associated tools, is envisioned as an enabler for the formal verification of contracts to address requirements-level issues at design time.
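As a rough illustration of the state-machine semantics sketched in this abstract, the snippet below models a single obligation's lifecycle as a small state machine driven by monitored events. It is a minimal sketch only: the state names (CREATED, IN_EFFECT, FULFILLED, VIOLATED) and event names are simplified assumptions and do not reproduce Symboleo's actual axioms or full lifecycle.

```python
# Minimal sketch of an obligation lifecycle as a state machine; state and
# event names are simplified assumptions, not Symboleo's formal semantics.
from enum import Enum, auto

class ObligationState(Enum):
    CREATED = auto()
    IN_EFFECT = auto()
    FULFILLED = auto()
    VIOLATED = auto()

# Allowed transitions: (current state, monitored event) -> next state.
TRANSITIONS = {
    (ObligationState.CREATED, "trigger"): ObligationState.IN_EFFECT,
    (ObligationState.IN_EFFECT, "fulfill"): ObligationState.FULFILLED,
    (ObligationState.IN_EFFECT, "violate"): ObligationState.VIOLATED,
}

class Obligation:
    def __init__(self, debtor, creditor, consequent):
        self.debtor, self.creditor, self.consequent = debtor, creditor, consequent
        self.state = ObligationState.CREATED

    def handle(self, event):
        """Advance the lifecycle on a monitored event; events that are not
        enabled in the current state are ignored."""
        self.state = TRANSITIONS.get((self.state, event), self.state)

# Example: a (hypothetical) delivery obligation monitored against events.
o = Obligation("seller", "buyer", "deliver goods within 10 days")
for event in ("trigger", "fulfill"):
    o.handle(event)
print(o.state)  # ObligationState.FULFILLED
```

The thesis defines analogous machines for contracts and powers, connected through axioms; that coupling is omitted here.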
7. Evaluation of Automated Test Generation for Simulink: A Case Study in the Context of Propulsion Control Software. Roslund, Anton (January 2020)
Automated Test Generation (ATG) has been successfully applied in many domains. For the modeling and simulation language Simulink, there has been research on developing tools for ATG, with promising results. However, most tools are developed as part of academic research and are either not publicly available or severely limited in their ability to be integrated into an industrial workflow. There are commercial ATG tools for Simulink, with Simulink Design Verifier (SLDV) as the de facto standard tool. For this thesis, we perform an empirical comparison of manual tests to those generated by SLDV. For the comparison, we used 180 components from the propulsion control software developed by our industry partner. All except two components are compatible with test generation to some extent. The majority of components are partially compatible, requiring block replacement or stubbing. Approximation of floating-point numbers is the primary reason for block replacement, which can be performed automatically by SLDV. Two components were incompatible, and 14 required full stubbing of blocks. Using a pre-processing step, the generated tests achieve similar coverage to the manual tests. We performed a Mann–Whitney U test with the hypothesis that the generated tests achieve higher coverage than the manual tests. There are no statistically significant differences for either decision coverage (p = 0.0719) or condition coverage (p = 0.8357). However, for Modified Condition/Decision Coverage (MC/DC), the generated tests achieve higher coverage, and the difference is significant (p = 0.0027). The limitations of ATG were explored by looking at the cases where the generated tests achieved lower coverage than the manual tests. We found that the use of floating-point arithmetic and temporal logic increases the time required for test generation and causes the analysis to hit the time limit. The test generation does not support all custom S-functions and performs stubbing of these blocks, which made the tool unable to reason about persistent storage. Configuration constants have limited support, which was the reason for the coverage difference in three cases. We have concluded that while much effort is required for custom tooling and initial setup, ATG can prove useful for early fault detection in an industrial workflow. ATG would prove especially useful in an automated continuous integration workflow for integration-level conformance testing.
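As a hedged illustration of the statistical comparison reported above, the sketch below runs a one-sided Mann–Whitney U test of whether generated tests reach higher coverage than manual tests. The per-component coverage values are invented placeholders, not the thesis data (which covered 180 components).

```python
# One-sided Mann-Whitney U test comparing generated vs. manual coverage.
# The per-component MC/DC values below are invented placeholders.
from scipy.stats import mannwhitneyu

generated_mcdc = [0.92, 0.88, 1.00, 0.75, 0.97, 0.85]
manual_mcdc    = [0.80, 0.82, 0.95, 0.70, 0.90, 0.78]

stat, p_value = mannwhitneyu(generated_mcdc, manual_mcdc, alternative="greater")
print(f"U = {stat:.1f}, p = {p_value:.4f}")
# The one-sided alternative asks whether generated coverage tends to be
# higher; the null is rejected only if p < 0.05, as for MC/DC in the thesis.
```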
8. Towards controlling software architecture erosion through runtime conformance monitoring. de Silva, Lakshitha R. (January 2014)
The software architecture of a system is often used to guide and constrain its implementation. While the code structure of an initial implementation is likely to conform to its intended architecture, its dynamic properties cannot always be fully checked until deployment. Routine maintenance and changing requirements can also lead to a deployed system deviating from this architecture over time. Dynamic architecture conformance checking plays an important part in ensuring that software architectures and corresponding implementations stay consistent with one another throughout the software lifecycle. However, runtime conformance checking strategies often force changes to the software, demand tight coupling between the monitoring framework and application, impact performance, require manual intervention, and lack flexibility and extensibility, affecting their viability in practice. This thesis presents a dynamic conformance checking framework called PANDArch, which aims to address these issues. PANDArch is designed to be automated, pluggable, non-intrusive, performance-centric, extensible and tolerant of incomplete specifications. The thesis describes the concept and design principles behind PANDArch, and its current implementation, which uses an architecture description language to specify architectures and Java as the target language. The framework is evaluated using three open source software products of different types. The results suggest that dynamic architectural conformance checking with the proposed features may be a viable option in practice.
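The core idea of dynamic conformance checking (comparing behaviour observed at runtime with the dependencies the intended architecture allows) can be reduced to a few lines. The sketch below is a simplified illustration only, not PANDArch itself, which instruments Java applications and reads architectures from an architecture description language; the module names and rules are hypothetical.

```python
# Simplified runtime conformance check: observed module-to-module calls are
# compared against the dependencies the intended architecture allows.
# Module names and rules are hypothetical; this is not PANDArch's design.
ALLOWED = {
    "ui": {"service"},
    "service": {"repository"},
    "repository": {"database"},
}

def conformance_violations(observed_calls):
    """Return observed (caller, callee) pairs not permitted by ALLOWED."""
    return [(src, dst) for src, dst in observed_calls
            if dst not in ALLOWED.get(src, set())]

# Calls as they might be reported by a runtime monitor.
trace = [("ui", "service"), ("service", "repository"), ("ui", "database")]
print(conformance_violations(trace))  # [('ui', 'database')]
```

In a real monitor the observed calls would be emitted by instrumentation rather than listed by hand, and violations would be reported as they occur rather than collected after the fact.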
9. A contingency framework of enterprise governance in the UK: a value-based management approach. Elghrabawy, Adel (January 2012)
Corporate governance (CG) has recently received much attention because of the wave of financial scandals in the early 2000s and the more recent global financial crisis. CG reforms, including laws, codes and listing rules have been established to protect shareholders’ rights and restore investors’ confidence in the capital market. These reforms have largely contributed to the evolution of internal and external governance mechanisms that are aimed at mitigating agency conflicts between managers and shareholders. However, overemphasis has been placed on the monitoring and control dimensions of governance, which may hinder entrepreneurial activities, obscure business prosperity and contribute to a narrow perspective on CG. It has been argued that there is a need to broaden CG beyond compliance (conformance) to a set of rules and laws, to include the performance aspects of governance that focus on strategy and value creation. In other words, governance should not only focus on monitoring managerial performance to ensure accountability to shareholders, but also on mechanisms that motivate management to optimise shareholders’ wealth. Enterprise governance (EG) framework has been introduced to keep the balance between the conformance and performance dimensions of governance. However, few studies address the possible tension between conformance and performance. Moreover, there is no agreement among these studies on the relationship between conformance and performance in the governance context. Arguably, Value-based Management (VBM) is an appropriate approach to address the issue of EG. VBM adopts value creation as an overall objective, develops a strategy that contributes to value creation and integrates it into decision-making. In this way, VBM can act as an effective mechanism for motivating management to maximise shareholder wealth, which works in parallel with other CG mechanisms, to mitigate agency conflicts resulting from the separation between ownership and management. This study aims to develop a contingency framework of EG through operationalising the conformance using CG and performance using corporate entrepreneurship (CE). This framework examines the inter-relationships between VBM, compliance with the Combined Code on Corporate Governance (CCCG), CE and the ultimate effect on organisational performance. More specifically, the study empirically examines the effect of compliance with the CCCG on CE, and whether VBM can achieve a balance between compliance with the CCCG and CE, should a conflict exist. The study also examines whether a fit between contingency variables (company size, agency conflicts, uncertainty, strategy and decentralisation), VBM, compliance with the CCCG codes and CE is associated with organisational performance. To achieve the aim of this study a cross-sectional survey, based on a questionnaire, is conducted to identify the level of VBM implementation, contextual and organisational factors in the large and medium quoted companies in the UK. The questionnaire targets the Chief Financial Officers (CFOs) in these companies as key informants. In addition, a content analysis of the annual reports of the sampled companies is undertaken to measure the level of compliance with the CCCG. Financial data (e.g. organisational performance) have been obtained from the DataStream, Fame and Thomson One Banker databases. Partial Least Squares Structural Equation Modelling (PLS-SEM) is adopted for data analysis and hypotheses testing. 
The results suggest that VBM implementation is positively associated with agency conflicts, low-cost strategies and decentralisation. Compliance with the CCCG is positively associated with agency conflicts and company size. CE is positively associated with company size, uncertainty and differentiation strategies. In addition, the fit between compliance with the CCCG and the contingency factors significantly predicts market-based performance. The fit between CE and the contingency factors significantly predicts perceived performance. However, the results regarding the effect of VBM on organisational performance are mixed. While VBM has no significant direct effect on market-based performance, VBM has an indirect positive effect on market-based performance acting through compliance with the CCCG as an intervening variable. VBM is significantly associated with compliance with the CCCG but not with CE. No evidence is found of a negative association between compliance with the CCCG and CE. The results support a large number of the proposed relationships between the contingency factors, VBM, compliance with the CCCG and CE. The results also suggest that using both compliance with the CCCG and CE as intervening variables in the relationship between VBM and organisational performance contributes to explaining the mixed results in the VBM literature. In terms of the EG framework, VBM does not keep a balance between conformance and performance; it emphasises compliance with the CCCG (conformance) at the expense of CE (performance). The results did not provide significant evidence of a conflict between compliance with the CCCG and CE, an area that lacks empirical evidence. This study contributes to the literature at different levels. At the theoretical level, it develops a theoretical model that links a performance management system (PMS), i.e. VBM, to CG practices and CE. This model attempts to bridge the gap between different disciplines, including management accounting, CG and entrepreneurship. Furthermore, combining the contingency theory and agency theory lenses contributes to the development of a comprehensive model of EG. At the methodological level, unlike previous studies, this study measures VBM practices on a continuum rather than in categories. Multiple data collection methods are used, and a powerful statistical technique (PLS-SEM) is adopted for data analysis. At the empirical level, the study is conducted in the UK, which differs from the US in many respects, yet very few studies have been conducted in this context in research areas such as VBM, CG and CE.
10. Modeling and methodologies for the test of IMS services. Lalanne, Felipe (03 February 2012)
Conformance testing is the process of checking that a system possesses a set of desired properties and behaves in accordance with some predefined requirements. In this context, passive testing techniques are used when the system under test cannot be interrupted or access to the system's interfaces is unavailable. Passive testing relies on the observation of the implementation during runtime, and the comparison of the observation with the expected behavior, defined through conformance properties. The objective of this thesis is to define a novel methodology to validate communicating protocols by passive testing.
Existing approaches are derived from works with finite-state and labelled transition specifications and, as such, they presume there exists a causality relation between the events observed in the implementation (the trace). When dealing with message-based protocols, such as the Session Initiation Protocol (SIP, fundamental for IMS services), such causality does not necessarily exist and, furthermore, it may only be determined through data parts. Since existing techniques are optimized for dealing with control parts, they present limitations for testing based on data parts: reduced expressibility and succinctness of conformance properties, as well as problems dealing with the satisfaction of properties that include future conditions. In this work we present a message-based/data-centric approach for dealing with these issues. Observations in a trace are in the form of messages. Expected behavior is defined in a bottom-up fashion, starting from criteria that must be fulfilled by one or more messages, defined as constraints between the message data fields. Temporal relations are defined by quantification over the criteria; for example, a property may require that certain criteria must hold for all messages in the trace. Our approach allows us to express formulas about the future and past of the trace, allowing more general criteria to be defined than through control parts alone. Issues related to the satisfaction of properties and the declaration of conformance verdicts are also discussed. Although observation of a behavior defined as a property is an indication of conformance, lack of observation is not necessarily indicative of a fault. Several solutions to this issue have been proposed and implemented in this work. Finally, our work presents interesting perspectives in terms of extensibility for online detection and improved expressiveness, but also because a message-based approach provides an alternative view to traditional testing techniques.
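The bottom-up, data-centric style of property described in this abstract can be sketched as follows: messages are observed as records of data fields, a criterion is a constraint over those fields, and a property quantifies the criterion over the trace. The message fields follow common SIP conventions, but the trace and the property itself are illustrative assumptions, not the thesis's notation.

```python
# Sketch of a data-centric passive-testing property: for every INVITE in the
# trace there must be a later response carrying the same Call-ID. Message
# fields follow SIP conventions, but the trace and property are illustrative.

def every_invite_answered(trace):
    """Universal property over the trace; returns 'fail' on a counterexample,
    'pass' if no violation is observed in this (finite) trace."""
    for i, msg in enumerate(trace):
        if msg.get("method") == "INVITE":
            answered = any(
                later.get("type") == "response"
                and later.get("call_id") == msg.get("call_id")
                for later in trace[i + 1:]
            )
            if not answered:
                return "fail"
    return "pass"

trace = [
    {"type": "request", "method": "INVITE", "call_id": "a84b4c"},
    {"type": "response", "status": 200, "call_id": "a84b4c"},
]
print(every_invite_answered(trace))  # pass
```

Note that a "pass" on a finite trace only means no violation was observed, which reflects the verdict problem discussed above: lack of observation is not necessarily a fault.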