91 |
Verification of Genetic Fuzzy Systems. Arnett, Timothy J., 06 June 2016.
No description available.
|
92 |
In pursuit of a hidden evader. Bohn, Christopher A., 29 September 2004.
No description available.
|
93 |
Exploring Hybrid Dynamic and Static Techniques for Software Verification. Cheng, Xueqi, 10 March 2010.
With the growing importance of software on which human lives increasingly depend, the correctness of the underlying software has become especially critical. However, the increasing complexity and size of modern software systems pose serious challenges to both the effectiveness and the efficiency of software verification. Two major obstacles are the quality of test generation, in terms of error detection, in software testing and the state space explosion problem in software formal verification (model checking).
In this dissertation, we investigate several hybrid techniques that combine dynamic (with program execution) and static (without program execution) analyses, as well as the synergies between multiple approaches, in software verification from the perspectives of testing and model checking. For software testing, a new simulation-based internal variable range coverage metric is proposed with the goal of enhancing the error detection capability of the test data generated when it is applied as the target metric. For software model checking, we utilize various dynamic analysis methods, such as data mining and swarm intelligence (ant colony optimization), to extract useful high-level information from program execution data. Despite being incomplete, dynamic program execution can still help to uncover important program structure features and variable correlations. The extracted knowledge, such as invariants in different forms and promising control flows, is then used to facilitate code-level program abstraction (under-approximation/over-approximation) and/or state space partitioning, which in turn improves the performance of property verification.
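As a loose illustration of the dynamic-analysis step described above, the following sketch mines likely invariants from execution traces; the variable names and traces are hypothetical, and the dissertation's actual techniques (data mining, ant colony optimization) go well beyond this range-and-ordering check:

```python
# Minimal sketch: mine candidate invariants from (hypothetical) execution traces.
# Surviving invariants can seed code-level abstraction in a model checker.
from itertools import combinations

def mine_range_and_order_invariants(traces):
    """traces: list of dicts mapping variable name -> observed value at a program point."""
    names = sorted(traces[0])
    candidates = []
    # Candidate 1: per-variable ranges observed during execution.
    for v in names:
        values = [t[v] for t in traces]
        candidates.append((f"{min(values)} <= {v} <= {max(values)}",
                           lambda t, v=v, lo=min(values), hi=max(values): lo <= t[v] <= hi))
    # Candidate 2: pairwise ordering relations between variables.
    for a, b in combinations(names, 2):
        if all(t[a] <= t[b] for t in traces):
            candidates.append((f"{a} <= {b}", lambda t, a=a, b=b: t[a] <= t[b]))
    # Keep only candidates that hold on every observed execution (still only *likely* invariants).
    return [text for text, check in candidates if all(check(t) for t in traces)]

# Hypothetical traces of two variables at a loop head.
traces = [{"x": 0, "y": 3}, {"x": 1, "y": 4}, {"x": 2, "y": 7}]
print(mine_range_and_order_invariants(traces))
```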
To validate the effectiveness of the proposed hybrid approaches, a wide range of experiments on academic and real-world programs were designed and conducted, with results compared against the original as well as the relevant verification methods. The experimental results demonstrate the effectiveness of our methods in improving both the quality and the performance of software verification. For software testing, the newly proposed coverage metric, constructed from dynamic program execution data, improves the quality of the generated test cases in terms of mutation killing, a widely applied measurement of error detection. For software model checking, the proposed hybrid techniques take full advantage of the complementary strengths of dynamic and static approaches: the lightweight dynamic techniques provide flexibility in extracting valuable high-level information that can be used to guide the scope and direction of the static reasoning process, which results in significant performance improvements in software model checking. The static techniques, in turn, guarantee the completeness of the verification results, compensating for the weakness of the dynamic methods. / Ph. D.
|
94 |
Design Verification for Sequential Systems at Various Abstraction Levels. Zhang, Liang, 31 January 2005.
With the ever increasing complexity of digital systems, functional verification has become a daunting task for circuit designers. Functional verification alone often accounts for more than 70% of the total development cost, and the situation is projected to continue to worsen. The most critical limitations of existing techniques are the capacity issue and the run-time issue. This dissertation addresses the functional verification problem using a unified approach that applies different core algorithms at various abstraction levels. At the logic level, we focus on incorporating a set of novel ideas into existing formal verification approaches. First, we present a number of powerful optimizations that improve the performance and capacity of a typical SAT-based bounded model checking framework. Second, we present a novel method for performing dynamic abstraction within a framework for abstraction-refinement based model checking. Experiments on a wide range of industrial designs show that the proposed optimizations consistently provide one to two orders of magnitude of speedup and can be extremely useful in enhancing the efficacy of existing formal verification algorithms. At the register transfer level, where formal verification is less likely to succeed, we developed an efficient ATPG-based validation framework that leverages high-level circuit information and an improved observability-enhanced coverage metric to generate high-quality validation sequences. Experiments show that our approach generates high-quality validation vectors that achieve both high tag coverage and high bug coverage at extremely low computational cost. / Ph. D.
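For readers unfamiliar with the SAT-based bounded model checking framework these optimizations target, here is a minimal sketch using the Z3 solver on an invented toy transition system (a 4-bit counter); it is not the dissertation's implementation, only an illustration of BMC unrolling:

```python
# Minimal BMC sketch with Z3: unroll a toy transition relation k steps and
# ask the solver whether the bad state is reachable within the bound.
from z3 import BitVec, BitVecVal, Solver, sat

def bmc_counter(bound, width=4):
    s = Solver()
    x = [BitVec(f"x_{i}", width) for i in range(bound + 1)]
    s.add(x[0] == BitVecVal(0, width))            # initial state: counter = 0
    for i in range(bound):
        s.add(x[i + 1] == x[i] + 1)               # transition: counter increments
    for k in range(1, bound + 1):
        s.push()
        s.add(x[k] == BitVecVal(10, width))       # "bad" state: counter reaches 10
        if s.check() == sat:
            return k                              # counterexample of length k found
        s.pop()
    return None

print(bmc_counter(12))   # expected: 10
```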
|
95 |
Exploring Abstraction Techniques for Scalable Bit-Precise Verification of Embedded Software. He, Nannan, 01 June 2009.
Conventional testing has become inadequate to satisfy the rigorous reliability requirements of embedded software, which plays an increasingly important role in many safety-critical applications. Automatic formal verification is a viable avenue for ensuring the reliability of such software. Recently, more and more formal verification techniques have begun modeling a non-Boolean data variable as a bit-vector of bounded width (i.e., a vector of multiple bits, such as 32 or 64 bits) to implement bit-precise verification. One major challenge in the scalable application of such bit-precise verification to real-world embedded software is that the state space for verification can be intractably large.
In this dissertation, several abstraction techniques are explored to deal with this scalability challenge in the bit-precise verification of embedded software. First, we propose a tight integration of program slicing, an important static program analysis technique, with bounded model checking (BMC). While many software verification tools apply program slicing as a separate preprocessing step, we integrate the slicing operations into our model construction and reduction process and enhance them with compiler optimization techniques to compute accurate program slices. We also apply a proof-based abstraction-refinement framework to further remove program segments irrelevant to the property being verified.
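The following is a minimal, self-contained sketch of the backward-slicing idea on an invented four-statement program; the dissertation integrates slicing into model construction rather than running it as a standalone pass like this:

```python
# Minimal backward-slicing sketch: starting from the property's variables,
# keep only statements whose definitions can reach them through data dependences.

def backward_slice(defs, uses, criterion_vars):
    """defs/uses: statement id -> set of variables defined/used by that statement."""
    relevant_vars = set(criterion_vars)
    sliced = set()
    changed = True
    while changed:                       # fixed point over the dependence relation
        changed = False
        for stmt in defs:
            if stmt not in sliced and defs[stmt] & relevant_vars:
                sliced.add(stmt)
                relevant_vars |= uses[stmt]
                changed = True
    return sliced

# Hypothetical 4-statement program; the property mentions only variable "c".
defs = {1: {"a"}, 2: {"b"}, 3: {"c"}, 4: {"d"}}
uses = {1: set(), 2: {"a"}, 3: {"b"}, 4: set()}
print(sorted(backward_slice(defs, uses, {"c"})))   # statement 4 is sliced away -> [1, 2, 3]
```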
Next, we present a method that uses symbolic simulation for scalable formal verification. The simulation uses the symbolic value X to abstract the concrete values of variables. The method embeds this symbolic simulation in a counterexample-guided abstraction-refinement framework to automatically construct and verify an abstract model, which has a smaller state space than that of the original concrete program.
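A minimal sketch of the underlying three-valued (0/1/X) simulation idea follows; the gates and signal names are invented, and the dissertation's symbolic simulation and refinement loop are much more involved:

```python
# Minimal sketch of three-valued (0/1/X) simulation: the unknown value X abstracts
# concrete values, so one simulation run covers many concrete runs.
X = "X"

def and3(p, q):
    if p == 0 or q == 0:
        return 0                      # a controlling 0 decides the output even if the other input is X
    if p == X or q == X:
        return X
    return 1

def not3(p):
    return X if p == X else 1 - p

# Abstract the variable 'mode' to X and simulate a tiny combinational block.
mode, enable = X, 1
out = and3(enable, not3(mode))
print(out)                            # X: the output depends on the abstracted variable
```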
This dissertation also presents our efforts to use two common testability metrics, the controllability metric (CM) and the observability metric (OM), as high-level structural guidance for scalable bit-precise verification. A new abstraction approach is proposed, based on the concept of under- and over-approximation, to efficiently solve bit-vector formulas generated from embedded software verification instances. These instances include both complicated arithmetic computations and intensive control structures. Our approach applies CM and OM to assist the abstraction-refinement procedure in two ways: (1) it uses CM and OM to guide the construction of a simple under-approximate model, which includes only a subset of the execution paths in a verification instance, so that a counterexample that refutes the instance can be obtained with reduced effort, and (2) to reduce the cost of using proof-based refinement alone, it uses OM heuristics to guide the restoration of additional verification-relevant formula constraints at low computational cost during refinement. Experiments show a significant reduction in solving time compared to state-of-the-art solvers for bit-vector arithmetic.
Finally, this dissertation proposes an efficient algorithm to discover non-uniform encoding widths for individual variables in the verification model, which may be smaller than their original modeling widths but sufficient for the verification. Our algorithm distinguishes itself from existing approaches in that it is path-oriented: it takes advantage of CM and OM values to guide the computation of the initial, non-uniform encoding widths and the effective adjustment of these widths along different paths until the property is verified. It can also exclude from the search those paths that are deemed less favorable or have already been searched in previous steps, thus simplifying the problem. Experiments demonstrate that our algorithm can significantly speed up the verification, especially when searching for a counterexample that violates the property under verification. / Ph. D.
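As an illustration of the reduced-width idea (though not of the path-oriented, CM/OM-guided algorithm itself), the following sketch tries to find a witness for an invented bit-vector constraint at small encoding widths first, widening only when necessary:

```python
# Minimal sketch of width-reduced solving with Z3: search for a counterexample
# using a small encoding width first, and widen only if none is found.
from z3 import BitVec, Extract, Solver, sat

FULL_WIDTH = 32

def solve_with_growing_width(make_formula, widths=(4, 8, 16, FULL_WIDTH)):
    for w in widths:
        x = BitVec("x", FULL_WIDTH)
        s = Solver()
        s.add(make_formula(x))
        if w < FULL_WIDTH:
            # Under-approximation: force the upper bits to zero, i.e. x fits in w bits.
            s.add(Extract(FULL_WIDTH - 1, w, x) == 0)
        if s.check() == sat:
            return w, s.model()[x]     # a witness found at a reduced width is a real witness
    return None

# Hypothetical verification condition: does some x satisfy x * 3 == 21?
print(solve_with_growing_width(lambda x: x * 3 == 21))   # found already at width 4 (x = 7)
```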
|
96 |
Search-space Aware Learning Techniques for Unbounded Model Checking and Path Delay Testing. Chandrasekar, Kameshwar, 24 April 2006.
The increasing complexity of VLSI designs in recent years poses serious challenges in ensuring the functional and timing correctness of large designs. In this dissertation, we target two related problems in Design Verification and Testing, Unbounded Model Checking and Path Delay Fault Testing, both of which commonly suffer from extremely large memory requirements. We propose efficient representations and intelligent learning techniques that reason on the problem structure and take advantage of the repeated search space, thereby reducing the memory and time needed to solve these problems.
In this dissertation, we exploit Automatic Test Pattern Generation (ATPG) for Unbounded Model Checking (UMC). In order to perform unbounded model checking, we need core image/preimage computation engines that perform forward/backward reachability analysis. First, we develop an ATPG engine, with search-space aware learning, that computes "all solutions" for a given target objective and stores them as a decision diagram. We propose efficient decision selection heuristics and derive a suitable cut-set metric to quickly obtain a compact solution set. The solution set obtained with the initial state set as the objective represents the one-cycle preimage. In order to use the preimage state set as the objective in subsequent iterations, we propose efficient techniques to convert a decision diagram into clauses or a circuit. We propose a node-based conversion scheme that derives the functionality of each node in the decision diagram. The proposed scheme keeps the size of the state set under control and allows the preimage to be computed iteratively over many cycles until a fixed point or a desired state is reached.
Further, we gear the ATPG engine to directly compute circuit cofactors rather than individual solutions. The circuit cofactors contain a large number of solutions and hence capture a larger solution space. We also propose efficient learning techniques to prune the cofactor space and accelerate preimage computation. Then, we develop an exclusive image computation procedure that branches on the combinational inputs of the circuit and projects the values onto the next-state flip-flops as the image. We perform learning on the input solution space and incrementally store the image obtained as a decision diagram. Our experimental results consistently show that our techniques outperform existing techniques in terms of both performance and capacity.
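A minimal sketch of the one-cycle preimage computation by solution enumeration is shown below, using Z3 in place of the all-solutions ATPG engine; the two-bit transition relation and target states are invented for illustration:

```python
# Minimal sketch of one-cycle preimage computation by "all-solutions" enumeration.
from z3 import Bools, Solver, And, Not, Xor, sat

a, b, a_next, b_next = Bools("a b a_next b_next")
# Toy 2-bit transition relation: a' = a XOR b, b' = NOT b.
T = And(a_next == Xor(a, b), b_next == Not(b))
target = And(a_next, Not(b_next))       # states we want to reach in one step: a'=1, b'=0

s = Solver()
s.add(T, target)
preimage = []
while s.check() == sat:
    m = s.model()
    cube = And(a == m.eval(a, model_completion=True),
               b == m.eval(b, model_completion=True))
    preimage.append(cube)
    s.add(Not(cube))                     # block this present-state solution and continue
print(preimage)                          # all present states from which the target is reached in one cycle
```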
For delay testing, we consider test generation for the path delay fault (PDF) model, which is the most accurate model for characterizing the cumulative effect of distributed delays along each path in a circuit. The main bottleneck in ATPG for PDFs is the exponential number of paths in a circuit. In this work, we use circuit information to analyze the common segments shared by different paths in a circuit. Based on the common sensitization constraints, we propose to identify the "untestable core of segments" that cannot be sensitized together. We use these segments to identify, a priori, the conflict search space for a huge number of untestable path delay faults and prune them on the fly during test generation. Experimental results show that a huge number of untestable path delay faults are identified, which helps accelerate test generation. / Ph. D.
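The following sketch illustrates, on invented segments and side-input requirements, the idea of finding segment pairs whose sensitization constraints contradict each other, so that every path through both segments is an untestable path delay fault:

```python
# Minimal sketch of identifying segment pairs that cannot be sensitized together
# (segment names, signals, and required values are invented for illustration).
segments = {
    "s1": {"g7": 1, "g9": 0},     # sensitizing s1 requires g7=1 and g9=0
    "s2": {"g7": 0},              # sensitizing s2 requires g7=0
    "s3": {"g9": 0, "g4": 1},
}

def conflicting_pairs(segs):
    pairs = []
    names = sorted(segs)
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            if any(sig in segs[b] and segs[b][sig] != val for sig, val in segs[a].items()):
                pairs.append((a, b))   # contradictory side-input requirement found
    return pairs

print(conflicting_pairs(segments))     # [('s1', 's2')]: paths through both s1 and s2 are untestable
```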
|
97 |
Strategies for SAT-Based Formal Verification. Vimjam, Vishnu Chaithanya, 13 February 2007.
Verification of digital hardware designs is becoming an increasingly complex task as designs incorporate more functionality and grow larger in size. Today, verification remains a bottleneck in meeting time-to-market requirements and consumes more than 70% of the overall design costs. Traditionally, verification has been done using simulation-based approaches, in which a set of appropriate test stimuli is used by the designer. As designs become more complex, however, simulation-based techniques often fail to capture corner-case errors. Furthermore, unless a design is exhaustively tested, these approaches do not guarantee the correctness of a system with respect to its specifications. As a consequence, formal methods for design verification have been sought. In formal verification, the conformance of a design to a given set of specifications is proven mathematically, thereby leaving no room for unexplored search spaces. Despite the exponential time and memory complexities often involved in formal approaches, they have shown promise in capturing subtle bugs that would otherwise be missed.
In this dissertation, we focus on Boolean Satisfiability (SAT) based formal verification, which has gained tremendous importance in the recent past. Importantly, SAT-based approaches often alleviate the memory explosion problem that had been a bottleneck of traditional symbolic (Binary Decision Diagram based) approaches. In SAT-based techniques, the verification tasks are converted into a set of Boolean formulae, which are checked for satisfiability using a SAT solver. These problems are often NP-complete and are prone to an explosion in the required run-time. To overcome this, we propose novel strategies that utilize both the structural and the logical information of a sequential circuit. In particular, we devise techniques to extract non-trivial invariants of a design, strengthen properties such that they can be proven faster, and interleave bounded reachability analysis with bounded model checking. We provide the necessary algorithms and implementation details to automate the proposed techniques. Experiments conducted on a variety of benchmark circuits show that orders-of-magnitude improvements in overall run-times can be achieved with our techniques compared to existing state-of-the-art SAT-based approaches. / Ph. D.
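As a toy illustration of proving a property with an extracted strengthening invariant (not the dissertation's actual algorithms), the following sketch performs a simple induction check with Z3 on an invented counter system:

```python
# Minimal sketch of strengthening a property with an inductive invariant and
# checking it by induction (the transition system and invariant are invented).
from z3 import Ints, Solver, And, Implies, Not, substitute, unsat

x, x_next = Ints("x x_next")
init = x == 0
trans = x_next == x + 2
prop = x != 5                       # property: the counter never equals 5
inv = x % 2 == 0                    # candidate strengthening invariant: x stays even

def valid(claim):
    s = Solver()
    s.add(Not(claim))               # claim is valid iff its negation is unsatisfiable
    return s.check() == unsat

inv_next = substitute(inv, (x, x_next))
print("base:", valid(Implies(init, inv)))                    # Init => Inv
print("step:", valid(Implies(And(inv, trans), inv_next)))    # Inv and T => Inv'
print("prop:", valid(Implies(inv, prop)))                    # Inv => P, so P holds in all reachable states
```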
|
98 |
Constraint Solving for Diagnosing Concurrency Bugs. Khoshnood, Sepideh, 28 May 2015.
Programmers often have to spend a significant amount of time inspecting software code and execution traces to identify the root cause of a software bug. For a multithreaded program, debugging is even more challenging due to the subtle interactions between concurrent threads and the often astronomical number of possible interleavings. In this work, we propose a logical constraint-based symbolic analysis method to aid the diagnosis of concurrency bugs and find their root causes, which can later be used to recommend repairs. In our method, the diagnosis process is formulated as a set of constraint solving problems. By leveraging the power of satisfiability (SAT) solvers and a bounded model checker, we perform a semantic analysis of the sequential computation as well as the thread interactions. The analysis is ideally suited to software with small to medium code size but complex concurrency control, such as device drivers, synchronization protocols, and concurrent data structures. We have implemented our method in a software tool and demonstrated its effectiveness in diagnosing subtle concurrency bugs in multithreaded C programs. / Master of Science
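A minimal sketch of the constraint-based encoding of thread interleavings is shown below; the events and the lost-update scenario are invented, and Z3 stands in for the solvers used by the actual tool:

```python
# Minimal sketch of encoding thread interleavings as ordering constraints:
# an integer "position" per event, program order and the bug scenario as constraints,
# and any satisfying model is a concrete buggy interleaving.
from z3 import Int, Solver, Distinct, sat

events = {name: Int(name) for name in ("t1_read", "t1_write", "t2_read", "t2_write")}
e = events

s = Solver()
s.add(Distinct(*e.values()))                    # events occupy distinct positions
s.add(e["t1_read"] < e["t1_write"])             # program order, thread 1
s.add(e["t2_read"] < e["t2_write"])             # program order, thread 2
# Lost-update scenario: both threads read the shared counter before either writes it back.
s.add(e["t1_read"] < e["t2_write"], e["t2_read"] < e["t1_write"])

if s.check() == sat:
    m = s.model()
    schedule = sorted(events, key=lambda name: m.eval(events[name]).as_long())
    print("buggy interleaving:", schedule)      # root cause: the two read-write pairs overlap
```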
|
99 |
Identification and Analysis of Illegal States in the Apoptotic Discrete Transition System Model using ATPG and SAT-based Techniques. Shrivastava, Anupam, 14 November 2008.
Programmed Cell Death, or Apoptosis, plays a critical role in human embryonic development and in adult tissue homeostasis. Recent research efforts in Bioinformatics and Computational Biology focus on gaining deep insight into the apoptosis process, which allows researchers to study the relation between the dysregulation of apoptosis and the development of cancer. Research in this highly interdisciplinary field has become much more quantitative, using tools from the computational sciences to understand the behavior of biological systems.
Previously, an abstracted model was developed to study the Apoptosis process as a finite-state discrete transition model. This model facilitates the reuse of the digital design verification and testing techniques developed in the Electronic Design Automation domain, which have become robust over the past few decades. Simulation is usually the cornerstone of the design verification industry, and the bulk of the states are covered by simulation; formal verification techniques are then used to analyze the remaining corner-case states. Techniques such as Genetic Algorithm guided Logic Simulation (GALS) and SAT-based induction have already been applied to the Apoptosis discrete transition model. However, the Apoptosis model presents some unique problems. The simulation techniques have been shown to be unable to cover most of the states of the model. When SAT-based induction is applied to the model, in particular to find illegal states, very few illegal states are identified; the approach particularly suffers from the fact that the Apoptosis model is rather complex and the formulation for testing and verification is hard to tackle at bounds greater than 20 or so. Consequently, the state space of the Apoptosis model largely lies in an unknown region, meaning that we are unable either to reach those states or to prove that they are illegal. Unless we know whether these states are reachable or illegal, it is not feasible to infer information about the model, such as which protein concentrations can be reached under which input stimuli. Questions such as whether certain protein concentrations can be reached in this model can only be answered if we have a clear picture of the reachability of the state space.
In this thesis, we propose techniques based on ATPG- and SAT-based image computation for the Apoptosis finite transition model. Our method leverages the results obtained in previous work by using the reachable states obtained from earlier simulation traces as the initial states for our technique. This enables us to identify more illegal states in fewer iterations; in other words, we reach the fixed point of the image computation faster. Our experimental analysis shows that the proposed techniques can prove most of the formerly unknown states to be illegal. We are also able to extend our analysis to obtain a clearer picture of the interaction of any two proteins in the system considered together. / Master of Science
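The following sketch illustrates the image-computation fixed point on an invented three-variable Boolean transition function: starting from simulation-reached states, the image is applied repeatedly until no new states appear, and everything outside the fixed point is provably unreachable (illegal). It is only a schematic of the idea, not the ATPG/SAT-based procedure itself:

```python
# Minimal sketch of the forward image fixed point over an explicit state set.
from itertools import product

def next_state(state):
    # Toy 3-variable transition function standing in for the model's update rules.
    a, b, c = state
    return (b, a and c, not c)

def reachable_fixed_point(initial_states):
    reached = set(initial_states)
    frontier = set(initial_states)
    while frontier:                                   # iterate image computation to a fixed point
        frontier = {next_state(s) for s in frontier} - reached
        reached |= frontier
    return reached

simulated = {(False, False, False)}                   # states covered by earlier simulation runs
reached = reachable_fixed_point(simulated)
all_states = set(product([False, True], repeat=3))
print("illegal states:", sorted(all_states - reached))
```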
|
100 |
Automatic Selection of Verification Tools for Efficient Analysis of Biochemical Models. Bakir, M.E., Konur, Savas, Gheorghe, Marian, Krasnogor, N., Stannett, M., 24 April 2018.
Motivation: Formal verification is a computational approach that checks system correctness (in relation to a desired functionality). It has been widely used in engineering applications to verify that systems work correctly. Model checking, an algorithmic approach to verification, checks whether a system model satisfies its requirements specification. This approach has been applied to a large number of models in systems and synthetic biology as well as in systems medicine. Model checking is, however, computationally very expensive and does not scale to large models and systems. Consequently, statistical model checking (SMC), which relaxes some of the constraints of model checking, has been introduced to address this drawback. Several SMC tools have been developed; however, the performance of each tool varies significantly according to the system model in question and the type of requirements being verified. This makes it hard to know, a priori, which one to use for a given model and requirement, as choosing the most efficient tool for a given biological application requires a significant degree of computational expertise that is not usually available in biology labs. The objective of this paper is to introduce a method, and provide a tool, leading to the automatic selection of the most appropriate model checker for the system of interest.
Results: We provide a system that can automatically predict the fastest model checking tool for a given biological model. Our results show that one can make high-confidence predictions, with over 90% accuracy. This implies a significant performance gain in verification time and substantially reduces the “usability barrier”, enabling biologists to access this powerful computational technology. / EPSRC, Innovate UK
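A minimal sketch of the tool-selection idea is shown below, using a scikit-learn classifier on invented model features and tool labels; the actual system's feature set, training data, and tool list are those of the paper, not these placeholders:

```python
# Minimal sketch: train a classifier on model features to predict which
# verification tool will be fastest (features, labels, and tool names invented).
from sklearn.ensemble import RandomForestClassifier

# Hypothetical features per biochemical model: [num_species, num_reactions, max_rate_constant]
X_train = [
    [12,   30,  0.5],
    [200,  800, 2.0],
    [35,   90,  0.1],
    [500, 2100, 5.0],
]
y_train = ["ToolA", "ToolB", "ToolA", "ToolC"]   # fastest tool observed on each training model

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

new_model_features = [[150, 600, 1.5]]
print("predicted fastest tool:", clf.predict(new_model_features)[0])
```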
|