11 |
Architecture-Based Verification of Software-Intensive Systems. Johnsen, Andreas. January 2010 (has links)
Development of software-intensive systems, such as embedded systems for telecommunications, avionics and the automotive industry, occurs under severe quality, schedule and budget constraints. As the size and complexity of software-intensive systems increase dramatically, problems originating from the design and specification of the system architecture become increasingly significant. Architecture-based development approaches promise to improve the efficiency of software-intensive system development processes by reducing costs and time while increasing quality. This seeming paradox is partially explained by the fact that the system architecture abstracts away unnecessary details, so that developers can concentrate both on the system as a whole and on its individual pieces, whether these are the components, the components' interfaces, or the connections among components. The use of architecture description languages (ADLs) provides an important basis for verification, since an ADL specification describes how the system should behave, in a high-level view and in a form from which automated tests can be generated. Analysis and testing based on architecture specifications allow problems and faults to be detected early in the development process, even before the implementation phase, thereby significantly reducing costs and time. Furthermore, tests derived from the architecture specification can later be applied to the implementation to check its conformance with respect to the specification. This thesis extends the knowledge base in the area of architecture-based verification. In this thesis report, an airplane control system is specified using the Architecture Analysis and Design Language (AADL). This specification serves as the starting point of a system development process in which the developed architecture-based verification algorithms are applied.
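As an illustration of the idea of deriving verification obligations from an architecture specification, the sketch below represents an AADL-like set of components and connections and emits one abstract test per declared connection. The component and port names are invented for the example and do not come from the thesis.

```python
# Minimal sketch: each architectural connection yields a test obligation that
# data sent on the source port must be observed at the destination port.

COMPONENTS = {
    "autopilot": {"out": ["cmd"]},     # hypothetical component with an out port
    "elevator":  {"in": ["cmd_in"]},   # hypothetical component with an in port
}
CONNECTIONS = [("autopilot.cmd", "elevator.cmd_in")]

def derive_tests(connections):
    """One abstract test case per architectural connection."""
    return [
        {"stimulus": f"send on {src}", "expected": f"observe on {dst}"}
        for src, dst in connections
    ]

tests = derive_tests(CONNECTIONS)
print(tests[0]["expected"])  # observe on elevator.cmd_in
```

In a full toolchain, such abstract obligations would be refined into executable test cases against the implementation.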
|
12 |
Verification of Web Services in Support of Choreography. Hsieh, Wen-Fan. 02 June 2011 (has links)
In recent years, Web services have been widely used on the Internet. Thanks to convenient and inexpensive communication technologies, communication between organizations is much easier, and Web services have become a de facto standard for organizations to provide information and services. There are two different perspectives for describing Web service composition: orchestration and choreography. Works that verify a choreography model so as to alleviate correctness problems such as deadlock have also been proposed. However, the verification of implementations against a choreography model has not been addressed. In this thesis, we propose an approach to verify the conformance of a set of Web services to a given choreography model and to prune candidate Web services that do not comply with the model, avoiding discordance with the choreography and run-time errors. The proposed approach is evaluated by simulating 10,000 execution sequences of composite Web services. The experimental results show that our method improves success rate and space usage by pruning unsuitable candidate Web services.
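The pruning idea can be sketched in a few lines: model the choreography as a finite-state machine over message labels and keep only candidate services whose observable message sequence the machine accepts. The states, labels and candidate sequences below are illustrative assumptions, not the thesis's actual model.

```python
# Choreography as a finite-state machine: state -> {message: next_state}
CHOREOGRAPHY = {
    "start":   {"order": "ordered"},
    "ordered": {"invoice": "billed"},
    "billed":  {"payment": "done"},
}

def conforms(sequence, fsm, start="start", final="done"):
    """Accept a candidate's message sequence only if the FSM allows every
    step and the run ends in the final state."""
    state = start
    for msg in sequence:
        if msg not in fsm.get(state, {}):
            return False  # message not allowed in this state
        state = fsm[state][msg]
    return state == final

candidates = {
    "svc_a": ["order", "invoice", "payment"],
    "svc_b": ["order", "payment", "invoice"],  # pays before being billed
}
kept = [name for name, seq in candidates.items() if conforms(seq, CHOREOGRAPHY)]
print(kept)  # ['svc_a']
```

Non-conforming candidates are discarded before composition, which is the source of the reported success-rate and space-usage gains.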
|
13 |
A Statistical Approach To Lean Construction Implementations Of Construction Companies In Turkey. Tezel, Bulent Algan. 01 August 2007 (links) (PDF)
One of the major change efforts in the construction industry is lean
construction. This thesis analyzes the practices of construction
companies in Turkey from the lean construction perspective. Prior to
this analysis, background information about change in the construction
industry, lean thinking and lean construction is presented.
A questionnaire, based on a lean construction model, is used to survey
the practices and gather the data for the analysis. Various statistical
analysis methods are applied to the gathered data to draw
inferences. Based on these analyses, the lean construction
characteristics of the construction companies are discussed and
recommendations for improving the lean conformance of the construction
companies are presented.
|
14 |
TestBATN - A Scenario Based Test Platform For Conformance And Interoperability Testing. Namli, Tuncay. 01 June 2011 (links) (PDF)
Today, interoperability is the major challenge for the e-Business and e-Government domains. The
fundamental solution is standardization at different levels of business-to-business interactions.
However, publishing standards alone is not enough to assure interoperability between
products of different vendors. In this respect, testing and certification activities are very important
to promote standard adoption, validate conformance and interoperability of products,
and maintain correct information exchange. In e-Business collaborations, standards need
to address different layers of the interoperability stack: the communication layer, the business document
layer and the business process layer. Although there have been conformance and interoperability
testing tools and initiatives for each of these categories, there is currently no support
for testing an integration of the above within a test scenario similar to real-life use
cases. Together with the integration of different layers of testing, the testing process should be
automated so that test case execution can be done at low cost and repeated if required. In
this thesis, a highly adaptable and flexible Test Execution Model and a complementary XML-based
Test Description Language, consisting of high-level test constructs which can handle or
simulate different parts or layers of the interoperability stack, are designed. The computer-interpretable test description language allows dynamic setup of test cases and provides the flexibility
to design, modify, maintain and extend test functionality, in contrast to a priori designed
and hard-coded test cases. The work presented in this thesis is part of the TestBATN system
supported by TUBITAK, TEYDEB Project No: 7070191.
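To make the idea of a computer-interpretable test description concrete, here is a toy XML test case and a runtime interpreter for it. The tag names and attributes are invented for illustration and are not TestBATN's actual language.

```python
# Toy sketch: each <send> step exchanges a payload over a simulated
# transport; each <verify> step checks the last received value.
import xml.etree.ElementTree as ET

TEST_CASE = """
<testcase name="echo-check">
  <send channel="http" payload="ping"/>
  <verify channel="http" expect="ping"/>
</testcase>
"""

def run(xml_text, transport):
    last = None
    for step in ET.fromstring(xml_text):
        if step.tag == "send":
            last = transport(step.get("payload"))  # simulate the exchange
        elif step.tag == "verify":
            if last != step.get("expect"):
                return "fail"
    return "pass"

# a stub transport that simply echoes the payload back
print(run(TEST_CASE, transport=lambda p: p))  # pass
```

Because the test case is data rather than code, new scenarios can be added or modified without recompiling the test harness, which is the flexibility argued for in the abstract.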
|
15 |
Nanoparticle-stabilized CO₂ foams for potential mobility control applications. Hariz, Tarek Rafic. 21 November 2013 (links)
Carbon dioxide (CO₂) flooding is the second most common tertiary recovery technique implemented in the United States. Yet, there is huge potential to advance the process by improving the volumetric sweep efficiency of the injected CO₂. Delivering CO₂ into the reservoir as a foam is one way to do this. Surfactants have traditionally been used to generate CO₂ foams for mobility control; however, the use of nanoparticles as a foam stabilizing agent provides several advantages. Surfactant-stabilized foams require constant regeneration to be effective, and the surfactant adsorbs onto reservoir rock and is prone to chemical degradation under harsh reservoir conditions. Nanoparticle-stabilized foams have been found to be tolerant of high temperature and high salinity environments. Their nanoscale size also allows them to be transported through reservoir rock without blocking pore throats. Stable CO₂-in-water foams were generated using 5 nm silica nanoparticles with a short-chain polyethylene glycol surface coating. These foams were generated by the co-injection of CO₂ and a nanoparticle dispersion through both rock matrix and fractures. A threshold shear rate was found to exist for foam generation in both fractured and non-fractured Boise sandstone cores. The ability of nanoparticles to generate foams only above a threshold shear rate is advantageous; in field applications, high shear rates are associated with high permeability zones, where the presence of foam is desired. Reducing CO₂ mobility in these high permeability zones diverts CO₂ into lower permeability regions containing oil that has not yet been swept. Nanoparticles were also found to be able to stabilize CO₂ foams by co-injection through rough-walled fractures in cement cores, demonstrating their ability to stabilize foams without matrix flow. Experiments were conducted on the ability of fly ash, a waste product from burning coal in power plants, to stabilize oil-in-water emulsions and CO₂ foams.
The use of fly ash particles as a foam stabilizing agent would significantly reduce material costs for potential tertiary oil recovery and CO₂ sequestration applications. Nano-milled fly ash particles without surface treatment were able to generate stable oil-in-water emulsions when high-frequency, high-energy vibrations were applied to a mixture of fly ash dispersion and dodecane. Oil-in-water emulsions were also generated by co-injecting fly ash and dodecane, a low-pressure analog of CO₂, through a beadpack. Emulsions generated by co-injection, however, were unstable and coalesced within an hour. A threshold shear rate was required for emulsion generation. Fly ash particles were found to be able to stabilize CO₂ foam in a high pressure batch mixing cell, but not by co-injection through a beadpack. Dispersions of fly ash particles were found to be stable only at low salinities (<1 wt% NaCl).
|
16 |
Novel solvent injection and conformance control technologies for fractured viscous oil reservoirs. Rankin, Kelli Margaret. 24 June 2014 (links)
Fractured viscous oil resources hold great potential for continued oil production growth globally. However, many of these resources are not accessible with current commercial technologies based on steam injection, which limits operations to high temperatures. Several steam-solvent processes have been proposed to decrease steam usage, but they still require operating temperatures too high for many projects. There is a need for a low-temperature alternative injection strategy for viscous oil production. This dissertation discusses scoping experimental work on a low-temperature solvent injection strategy targeting fractured systems. The strategy combines three production mechanisms: gas-oil gravity drainage, liquid extraction, and film gravity drainage. During the initial heating period, when the injected solvent is in the liquid phase, liquid extraction occurs. When the solvent is in the vapor phase, solvent-enhanced film gravity drainage occurs. A preliminary simulation of the experiments was developed to study the impact of parameter uncertainty on model performance. Additional work on reducing uncertainty for key parameters controlling the two solvent production mechanisms will be necessary.
In a natural fracture network, the solvent would not be injected uniformly throughout the reservoir. Preferential injection into the higher conductivity fracture areas would result in early breakthrough, leaving unswept areas of high oil saturation. Conformance control would be necessary to divert subsequent solvent injection into the unswept zones. A variety of techniques, including polymer and silica gel treatments, have been designed to block flow through the swept zones, but all involve initiating gelation prior to injection. This dissertation also examines a strategy that uses the salinity gradient between the injected silica nanoparticle dispersion and the in-situ formation water to trigger gelation. First, the equilibrium phase behavior of silica dispersions as a function of sodium chloride concentration, nanoparticle concentration and temperature was determined. The dispersions exhibited three phases: a clear, stable dispersion; a gel; and a viscous, unstable dispersion. The gelation time was found to decrease exponentially as a function of silica concentration, salinity, and temperature. During core flood tests under matrix and fracture injection, the in-situ formed gels were shown to provide sufficient conductivity reduction even at low nanoparticle concentrations.
|
17 |
Modeling and methodologies for the test of IMS services. Lalanne, Felipe. 03 February 2012 (links) (PDF)
Conformance testing is the process of checking that a system possesses a set of desired properties and behaves in accordance with predefined requirements. In this context, passive testing techniques are used when the system under test cannot be interrupted or access to its interfaces is unavailable. Passive testing relies on observation of the implementation during runtime, and on comparison of the observation with the expected behavior, defined through conformance properties. The objective of this thesis is to define a novel methodology to validate communicating protocols by passive testing. Existing approaches derive from work with finite-state and labelled transition specifications and, as such, presume that a causality relation exists between the events observed in the implementation (the trace). When dealing with message-based protocols such as the Session Initiation Protocol (fundamental to IMS services), such causality does not necessarily exist and, furthermore, may only be determined through data parts. Since existing techniques are optimized for dealing with control parts, they present limitations for testing based on data parts: reduced expressiveness and succinctness of conformance properties, as well as problems dealing with the satisfaction of properties that include future conditions. In this work we present a message-based, data-centric approach for dealing with these issues. Observations in a trace take the form of messages. Expected behavior is defined in a bottom-up fashion, starting from criteria that must be fulfilled by one or more messages, defined as constraints between the message data fields. Temporal relations are then expressed by quantification over the criteria; for example, a property may require that certain criteria "must hold for all messages in the trace". Our approach allows formulas to be expressed about the future and past of the trace, allowing more general criteria to be defined than through control parts alone.
Issues related to the satisfaction of properties and the declaration of conformance verdicts are also discussed. Although observation of a behavior defined as a property is an indication of conformance, lack of observation is not necessarily indicative of a fault. Several solutions to this issue have been proposed and implemented in this work. Finally, our work presents interesting perspectives in terms of extensibility towards online detection and improved expressiveness, but also because a message-based approach provides an alternative view to traditional testing techniques.
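The message-based, data-centric style can be sketched directly: a trace is a list of messages (maps of data fields), criteria are predicates on those fields, and a property quantifies criteria over the trace, including future conditions. The SIP-flavored field names below are illustrative, not the thesis's formal syntax.

```python
# Criteria are predicates over message data fields.
def is_invite(msg):
    return msg.get("method") == "INVITE"

def is_ok_response(msg, call_id):
    return msg.get("status") == 200 and msg.get("call_id") == call_id

def check_invite_answered(trace):
    """Property with a future condition: for every INVITE in the trace,
    a 200 OK carrying the same Call-ID must appear later."""
    for i, msg in enumerate(trace):
        if is_invite(msg):
            if not any(is_ok_response(m, msg["call_id"]) for m in trace[i + 1:]):
                return False  # no matching response observed after the INVITE
    return True

trace = [
    {"method": "INVITE", "call_id": "a1"},
    {"method": "INVITE", "call_id": "b2"},
    {"status": 200, "call_id": "a1"},
]
# The "b2" INVITE has no 200 OK, so the property does not hold on this trace
print(check_invite_answered(trace))  # False
```

Note that a False result on a finite trace may mean the response simply had not arrived yet, which is exactly the verdict-declaration issue the thesis discusses.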
|
18 |
Runtime Conformance Checking of Mobile Agent Systems Using Executable Models. Saifan, Ahmad. 27 April 2010 (links)
Mobility occurs naturally in many distributed system applications such as telecommunications and electronic commerce. Mobility may reduce bandwidth consumption
and coupling and increase flexibility. However, it seems that relatively little work has
been done to support quality assurance techniques such as testing and verification of
mobile systems.
This thesis describes an approach for checking the conformance of a mobile, distributed application with respect to an executable model at runtime. The approach
is based on kiltera -- a novel, high-level language supporting the description and execution of models of concurrent, mobile, distributed, and timed computation. The
approach allows distributed, rather than centralized, monitoring. However, it makes
very few assumptions about the platform that the mobile agent system is implemented
in.
We have implemented our approach and validated it using four case studies. Two
of them are examples of mobile agent systems; the other two are implementations
of distributed algorithms. Our approach was able to detect seeded faults in the
implementations. To check the effectiveness and efficiency of our approach more
comprehensively, a mutation-based evaluation framework has been implemented. In
this framework, a set of new mutation operators for mobile agent systems has been
identified in order to automatically generate and run a number of mutant programs
and then evaluate the ability of our approach to detect these mutants. We found that
our approach is very effective and efficient in killing the non-equivalent mutants. / Thesis (Ph.D, Computing) -- Queen's University, 2010-04-27 12:35:47.996
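A mutation operator for mobile agent systems can be illustrated with a "migration target" operator that rewrites the destination of a move() call, producing mutants the conformance checker should flag. The agent snippet and operator are invented for the sketch and are not the thesis's operator set.

```python
# Sketch of a migration-target mutation operator: one mutant per
# alternative destination; mutants identical to the original are discarded.
import re

AGENT = "agent.move('host_a'); agent.report()"

def mutate_migration_targets(code, hosts):
    """Generate one mutant per alternative migration destination."""
    mutants = []
    for host in hosts:
        mutant = re.sub(r"move\('[^']*'\)", f"move('{host}')", code)
        if mutant != code:
            mutants.append(mutant)
    return mutants

mutants = mutate_migration_targets(AGENT, ["host_a", "host_b", "host_c"])
print(len(mutants))  # 2: 'host_a' reproduces the original and is discarded
```

Running the checker on each mutant and counting how many are "killed" (detected) gives the effectiveness measure reported in the abstract.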
|
19 |
Test-Based Falsification and Conformance Testing for Cyber-Physical Systems. January 2015 (links)
abstract: In this dissertation, two problems are addressed in the verification and control of Cyber-Physical Systems (CPS):
1) Falsification: given a CPS, and a property of interest that the CPS must satisfy under all allowed operating conditions, does the CPS violate, i.e. falsify, the property?
2) Conformance testing: given a model of a CPS, and an implementation of that CPS on an embedded platform, how can we characterize the properties satisfied by the implementation, given the properties satisfied by the model?
Both problems arise in the context of Model-Based Design (MBD) of CPS: in MBD, the designers start from a set of formal requirements that the system-to-be-designed must satisfy.
A first model of the system is created.
Because it may not be possible to formally verify the CPS model against the requirements, falsification instead searches for behaviors of the model that violate them.
In the first part of this dissertation, I present improved methods for finding falsifying behaviors of CPS when properties are expressed in Metric Temporal Logic (MTL).
These methods leverage the notion of robust semantics of MTL formulae: if a falsifier exists, it is in the neighborhood of local minimizers of the robustness function.
The proposed algorithms compute descent directions of the robustness function in the space of initial conditions and input signals, and provably converge to local minima of the robustness function.
The initial model of the CPS is then iteratively refined by modeling previously ignored phenomena, adding more functionality, etc., with each refinement resulting in a new model.
Many of the refinements in the MBD process described above do not provide an a priori guaranteed relation between the successive models.
Thus, the second problem above arises: how to quantify the distance between two successive models M_n and M_{n+1}?
If M_n has been verified to satisfy the specification, can it be guaranteed that M_{n+1} also satisfies the same, or some closely related, specification?
This dissertation answers both questions for a general class of CPS, and properties expressed in MTL. / Dissertation/Thesis / Doctoral Dissertation Electrical Engineering 2015
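The robust semantics can be made concrete for a simple property: for "always (x < c)", the robustness of a trace is the minimum over time of (c - x(t)), and a negative value means the property is falsified. The toy dynamics, constants, and greedy descent below are illustrative assumptions, not the dissertation's provably convergent algorithms.

```python
# Sketch of robustness-guided falsification for "always (x < c)":
# search over the initial condition for a trace with negative robustness.

def simulate(x0, steps=20):
    """Toy unstable dynamics: the state grows from its initial condition."""
    xs, x = [], x0
    for _ in range(steps):
        xs.append(x)
        x = 1.1 * x
    return xs

def robustness(trace, c=10.0):
    """Robust semantics of "always (x < c)": min over time of (c - x(t))."""
    return min(c - x for x in trace)

def falsify(x0, step=0.5, iters=50):
    """Greedy descent on the robustness function over the initial condition."""
    best = robustness(simulate(x0))
    for _ in range(iters):
        for cand in (x0 - step, x0 + step):
            r = robustness(simulate(cand))
            if r < best:
                best, x0 = r, cand
    return x0, best

x0, rob = falsify(1.0)
print(rob < 0)  # a falsifying initial condition was found
```

The dissertation's methods replace this naive neighbor search with computed descent directions over both initial conditions and input signals, with convergence guarantees to local minima.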
|
20 |
Towards Pattern Based Architectural Conformance Checking. Olsson, Tobias. January 2016 (links)
Patterns are a source of knowledge when architecting software systems. They provide abstract and time-tested solutions that show how a system should be structured to achieve needed qualities. However, when developing software there is a chance that small mistakes are introduced in the source code. Over time, these mistakes can accumulate, break the structure of the pattern, and its qualities are lost. There are methods that can help find such errors, but none of these provide a pattern abstraction. In this work, we describe a method that raises the level of abstraction from checking individual dependencies to checking key dependencies in the pattern. We implement our method and apply it to check the Model-View-Controller pattern. We show that the method can find architectural problems in real source code and examine how removal of detected erosions affects the source code. We conducted an experiment in a software project setting to determine if using the method affects the number of architectural problems. Some project teams were randomly assigned to use a software service that automated our method. It checked how well their implementation conformed to Model-View-Controller every time they updated the source code. The experiment showed that developers who used the tool had significantly fewer detected architectural problems during the course of the project. Our method makes conformance checking easier to use. This might help increase the adoption of conformance checking in industry.
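The core of pattern-level conformance checking can be sketched as follows: map source modules to pattern roles, state the role-level dependencies the pattern permits, and flag any extracted source dependency that violates them. The module names and dependency list are invented for the example.

```python
# Map modules to MVC roles (illustrative names).
ROLE_OF = {"order_view": "View", "order_ctrl": "Controller", "order_model": "Model"}

# Role-level dependencies permitted by the pattern (within-role access is
# always allowed); one common MVC reading is used here.
ALLOWED = {("View", "Controller"), ("Controller", "Model"), ("View", "Model")}

def violations(deps):
    """Return extracted dependencies that break the pattern's key dependencies."""
    out = []
    for src, dst in deps:
        pair = (ROLE_OF[src], ROLE_OF[dst])
        if pair[0] != pair[1] and pair not in ALLOWED:
            out.append((src, dst))
    return out

deps = [
    ("order_view", "order_ctrl"),
    ("order_ctrl", "order_model"),
    ("order_model", "order_view"),  # the Model must not depend on the View
]
print(violations(deps))  # [('order_model', 'order_view')]
```

Checking against the role-level rule set rather than individual module pairs is what raises the abstraction level: one rule covers every current and future module assigned to a role.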
|