281

On Reducing the Trusted Computing Base in Binary Verification

An, Xiaoxin 15 June 2022 (has links)
The translation of binary code to higher-level models has wide applications, including decompilation, binary analysis, and binary rewriting. This calls for high reliability of the underlying trusted computing base (TCB) of the translation methodology. A key challenge is to reduce the TCB by validating its soundness. Both the definition of soundness and the validation method depend heavily on the context: what is in the TCB and how to prove it. This dissertation presents three research contributions: the first two reduce the TCB in binary verification, and the third presents a binary verification process that leverages a reduced TCB. The first contribution targets the validation of OCaml-to-PVS translation -- commonly used to translate instruction-set-architecture (ISA) specifications to PVS -- where the destination language is non-executable. We present a methodology called OPEV to validate the translation between OCaml and PVS, supporting non-executable semantics. The validation includes generating large-scale tests for OCaml implementations, generating test lemmas for PVS, and generating proofs that automatically discharge these lemmas. OPEV incorporates an intermediate type system that captures a large subset of OCaml types, employing a variety of rules to generate test cases for each type. To prove the PVS lemmas, we develop automatic proof strategies and discharge the test lemmas using PVS Proof-Lite, a powerful proof scripting utility of the PVS verification system. We demonstrate our approach in two case studies that include 259 functions selected from the Sail and Lem libraries. For each function, we generate thousands of test lemmas, all of which are automatically discharged. The dissertation's second contribution targets the soundness validation of a disassembly process where the source language does not have well-defined semantics. Disassembly is a crucial step in binary security, reverse engineering, and binary verification. Various studies in these fields use disassembly tools and hypothesize that the reconstructed disassembly is correct. However, disassembly is an undecidable problem. State-of-the-art disassemblers suffer from issues ranging from incorrectly recovered instructions to incorrectly assessing which addresses belong to instructions and which to data. We present DSV, a systematic and automated approach to validate whether the output of a disassembler is sound with respect to the input binary. No source code, debugging information, or annotations are required. DSV defines soundness using a transition relation defined over concrete machine states: a binary is sound if, for all addresses in the binary that can be reached from the binary's entry point, the bytes of the (disassembled) instruction located at an address are the same as the actual bytes read from the binary. Since computing this transition relation is undecidable, DSV uses over-approximation, preventing false positives (i.e., an incorrectly disassembled instruction that is actually reachable but deemed unreachable) and allowing, but minimizing, false negatives. We apply DSV to 102 binaries of GNU Coreutils with eight different state-of-the-art disassemblers from academia and industry. DSV is able to find soundness issues in the output of all of them. The dissertation's third contribution is WinCheck: a concolic model checker that detects memory-related properties of closed-source binaries.
Bugs related to memory accesses remain a major source of security vulnerabilities. Even a single buffer overflow or use-after-free in a large program may cause a software crash, a data leak, or a hijacking of the control flow. Typical static formal verification tools aim to detect these issues at the source-code level. WinCheck is a model checker that is directly applicable to closed-source and stripped Windows executables. A key characteristic of WinCheck is that it keeps execution as symbolic as possible while leaving any information related to pointers concrete. This produces a model checker tailored to pointer-related properties, such as buffer overflows, use-after-free, null-pointer dereferences, and reads from uninitialized memory. The technique thus provides a novel trade-off between ease of use, accuracy, applicability, and scalability. We apply WinCheck to ten closed-source binaries available in a Windows 10 distribution, as well as the Windows version of the entire Coreutils library. We conclude that the approach taken is precise -- it produces only a few false negatives -- but may not explore the entire state space due to unresolved indirect jumps. / Doctor of Philosophy / Binary verification is a process that verifies a class of properties, usually security-related, on binary files without access to source code. Since a binary file is composed of byte sequences and is not human-readable, the binary verification process usually makes a number of assumptions. These assumptions often concern the error-free nature of a set of subsystems used in the verification process and constitute the process's trusted computing base (TCB). The reliability of the verification process therefore depends on how reliable the TCB is. The dissertation presents three research contributions in this regard: the first two reduce the TCB in binary verification, and the third presents a binary verification process that leverages a reduced TCB. The dissertation's first contribution presents a validation of OCaml-to-PVS translations -- commonly used to translate a computer architecture's instruction specifications to PVS, a language that allows mathematical specifications. To build a reliable semantic model of assembly instructions, which is assumed to be in the TCB, it is necessary to validate the translation. The dissertation's second contribution validates the soundness of the disassembly process, which translates a binary file to corresponding assembly instructions. Since the disassembly process is generally assumed to be trustworthy in many binary verification works, the TCB of binary verification can be reduced by validating its soundness. With the reduced TCB, the dissertation introduces its third and final contribution, WinCheck: a concolic model checker that validates pointer-related properties of closed-source Windows binaries. These properties include the absence of buffer overflows, use-after-free errors, and null-pointer dereferences.
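DSV's soundness criterion lends itself to a compact illustration. The following Python sketch is a hedged approximation, not DSV's implementation: the flat `binary` image, the `disasm` map of claimed instruction bytes, and the over-approximating `successors` function are hypothetical stand-ins for the real machinery.

```python
def check_soundness(binary: bytes, base: int, disasm: dict, successors, entry: int):
    """Flag reachable addresses whose disassembled bytes disagree with
    the bytes actually stored in the binary image."""
    worklist, seen, issues = [entry], set(), []
    while worklist:
        addr = worklist.pop()
        if addr in seen:
            continue
        seen.add(addr)
        claimed = disasm.get(addr)          # bytes the disassembler reports here
        if claimed is None:
            issues.append((addr, "reachable address was not disassembled"))
            continue
        actual = binary[addr - base : addr - base + len(claimed)]
        if actual != claimed:               # soundness violation at addr
            issues.append((addr, f"claimed {claimed.hex()}, actual {actual.hex()}"))
        worklist.extend(successors(addr, claimed))
    return issues
```

Because `successors` over-approximates control flow, every genuinely reachable instruction gets checked; the cost is visiting some addresses that can never execute, mirroring the trade-off DSV makes.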
282

Enhancing SAT-based Formal Verification Methods using Global Learning

Arora, Rajat 25 May 2004 (has links)
With the advances in VLSI and System-On-Chip (SOC) technology, the complexity of hardware systems has increased manifold. Today, 70% of the design cost is spent in verifying these intricate systems. The two most widely used formal methods for design verification are Equivalence Checking and Model Checking. Equivalence Checking requires that the implementation circuit be exactly equivalent to the specification circuit (golden model). In other words, for each possible input pattern, the implementation circuit should yield the same outputs as the specification circuit. Model Checking, on the other hand, checks whether the design holds certain properties, which in turn are indispensable for the proper functionality of the design. The complexity of both Equivalence Checking and Model Checking is exponential in the circuit size. In this thesis, we first propose a novel technique to improve SAT-based Combinational Equivalence Checking (CEC) and Bounded Model Checking (BMC). The idea is to perform a low-cost preprocessing that statically induces global signal relationships into the original CNF formula of the circuit under verification and hence reduces the complexity of the SAT instance. This efficient and effective preprocessing quickly builds up the implication graph for the circuit under verification, yielding a large set of logic implications composed of direct, indirect, and extended backward implications. These two-node implications (spanning time-frame boundaries) are converted into two-literal clauses and added to the original CNF database. The added clauses constrain the search space of the SAT-solver engine and provide correlation among the different variables, which enhances Boolean Constraint Propagation (BCP). Experimental results on large and difficult ISCAS'85, ISCAS'89 (full scan), and ITC'99 (full scan) CEC instances and ISCAS'89 BMC instances show that our approach is independent of the state-of-the-art SAT solver used, and that the added clauses help achieve more than an order of magnitude speedup over the conventional approach. Also, comparison with Hyper-Resolution [Bacchus 03] suggests that our technique is much more powerful, yielding non-trivial clauses that significantly simplify the SAT instance. Secondly, we propose a novel global learning technique that helps identify highly non-trivial relationships among signals in the circuit netlist, thereby boosting the power of the existing implication engine. We call this new class of implications 'extended forward implications' and show their effectiveness through the additional untestable faults they help identify. Thirdly, we propose a suite of lemmas and theorems to formalize global learning. We show through implementation that these theorems help significantly simplify a generic CNF formula (from formal verification, artificial intelligence, etc.) by identifying necessary assignments, equivalent signals, complementary signals, and other non-trivial implication relationships among its variables. We further illustrate through experimental results that the CNF formula simplification obtained using our tool outperforms that obtained using other preprocessors. / Master of Science
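The heart of the preprocessing, converting learned two-node implications into two-literal clauses appended to the CNF, can be sketched in a few lines of Python. This is an illustrative reconstruction, not the thesis tool; the DIMACS-style variable numbering and the `implications` list are assumed inputs.

```python
def lit(var: int, value: bool) -> int:
    """DIMACS-style literal: positive when the signal is 1, negative when 0."""
    return var if value else -var

def add_implication_clauses(cnf, implications):
    """cnf: list of clauses (lists of ints). implications: tuples (a, va, b, vb)
    meaning 'signal a at value va implies signal b at value vb'."""
    for a, va, b, vb in implications:
        # (a = va) -> (b = vb) becomes the two-literal clause (~a_va | b_vb)
        cnf.append([-lit(a, va), lit(b, vb)])
    return cnf

# Example: the implication (x1 = 1) -> (x3 = 0) adds the clause [-1, -3].
print(add_implication_clauses([[1, 2]], [(1, True, 3, False)]))  # [[1, 2], [-1, -3]]
```

The added clauses are logically redundant, so they never change satisfiability, but they give the solver's Boolean Constraint Propagation extra unit-propagation opportunities, which is exactly the speedup mechanism described above.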
283

Explicit-State Model Checking of Concurrent x86-64 Assembly

Bharadwaj, Abhijith Ananth 10 July 2020 (has links)
The thesis presents xavier, a novel tool-set for model checking of concurrent x86-64 assembly programs via Partial Order Reduction (POR). xavier presents a realistic platform for systematically exploring and analyzing the state-space of concurrent x86 assembly programs, with the aim of detecting bugs via assertion failures in mainstream programs. Recently, a number of state-of-the-art model checking solutions have been introduced to efficiently explore the state-space of concurrent programs using POR algorithms. However, such solutions are inefficient when analyzing stateful programming languages, such as the x86 assembly language, due to their low level of abstraction. To this end, xavier makes two contributions: i) a novel order-sensitivity-based POR algorithm that is applicable to concurrent x86 assembly, ii) an x86 machine model that can accurately perform relaxed-consistency emulation of concurrent x86 assembly, without the need for any translations. We demonstrate the applicability of xavier through an evaluation on several classical mutual-exclusion benchmarks and mainstream benchmarks from the Userspace Read-Copy-Update (URCU) concurrency library, where the benchmarks range from 250 to 3700 lines of x86 assembly. The framework is the first that supports systematic model checking of concurrent x86 assembly programs, and the effectiveness of xavier is demonstrated by reproducing a concurrency issue of threads accessing intermediate states in the URCU library, which stems from an assumption violation. / Master of Science / Sound verification of multi-threaded programs necessitates a systematic analysis of the program state-spaces that result from thread interactions. Consequently, model checking [Godefroid 1997; Clarke 2018] has been one of the prominent methods used to tackle the verification of multi-threaded programs. However, existing model-checking solutions are inefficient when analyzing stateful programming languages, such as the x86 assembly language, due to the solutions' higher level of abstraction. Therefore, the thesis presents xavier, a novel tool-set and a realistic platform for systematically exploring and analyzing the state-space of mainstream concurrent x86 assembly programs, with the aim of detecting bugs via assertion failures. To this end, xavier makes two contributions: i) a novel order-sensitivity-based Partial Order Reduction algorithm, which efficiently explores the state space of concurrent x86 assembly, ii) an x86 machine model that can accurately emulate the execution of concurrent x86 assembly, without the need for any translations. We demonstrate the applicability of xavier through an evaluation on several classical mutual-exclusion and mainstream benchmarks from the Userspace Read-Copy-Update (URCU) concurrency library, where the benchmarks range from 250 to 3700 lines of x86 assembly. Moreover, we demonstrate the effectiveness of xavier by reproducing a concurrency issue in the URCU library, which manifests as a result of an assumption violation.
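To see what a POR algorithm must tame, consider a toy explicit-state explorer, written here in Python purely for illustration (it is unrelated to xavier's code): it enumerates every interleaving of two non-atomic counter increments and checks an assertion on each final state. A POR algorithm would prune interleavings that merely reorder independent steps.

```python
def explore(state, pcs, threads, check, violations):
    """Depth-first search over every interleaving of the remaining steps."""
    done = True
    for tid, prog in enumerate(threads):
        if pcs[tid] < len(prog):
            done = False
            next_state = dict(state)
            prog[pcs[tid]](next_state)      # take one step of thread tid
            next_pcs = list(pcs)
            next_pcs[tid] += 1
            explore(next_state, next_pcs, threads, check, violations)
    if done and not check(state):
        violations.append(state)

# Two threads each increment a shared counter non-atomically (read, then write).
t0 = [lambda s: s.update(r0=s["x"]), lambda s: s.update(x=s["r0"] + 1)]
t1 = [lambda s: s.update(r1=s["x"]), lambda s: s.update(x=s["r1"] + 1)]
violations = []
explore({"x": 0, "r0": 0, "r1": 0}, [0, 0], [t0, t1],
        lambda s: s["x"] == 2, violations)
print(len(violations), "of 6 interleavings lose an update")   # prints 4
```

Even this four-step program has six interleavings, four of them buggy; realistic x86 assembly multiplies both numbers enormously, which is why xavier's order-sensitivity-based pruning matters.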
284

Reachability Analysis of RTL Circuits Using k-Induction Bounded Model Checking and Test Vector Compaction

Roy, Tonmoy 05 September 2017 (has links)
In the first half of this thesis, a novel approach for k-induction bounded model checking using signal domain constraints and property partitioning, for proving unreachability of branches in Verilog RTL code, is presented. To do this, the approach uses program slicing with respect to the variables of the property under test to generate small-sized SMT formulas that describe the change of variable values between consecutive cycles. Variable substitution is then used on these variables to generate the formula for the subsequent cycles without traversing the abstract syntax tree of the entire design. To reduce the over-approximation in the induction step, the addition of signal domain constraints is proposed. Moreover, we present a technique for splitting up the property in question to obtain a better model of the system. The latter half of the thesis presents a technique for sequential vector compaction on test sets generated during simulation-based ATPG. Starting with a compaction framework that stores metadata about the test vectors during generation, this work presents two methods for solving the compaction problem. The first generates the optimum solution by converting the problem into a form suitable for an optimization solver. The second uses a heuristic approach that generates a comparable but sub-optimal solution with orders-of-magnitude better time and computational efficiency. / Master of Science / Electronic circuits can be described with languages known as hardware description languages, such as Verilog. The first part of this thesis is concerned with automatically proving whether parts of this code are actually useful or reachable when implemented on an actual circuit. The thesis builds upon a method known as bounded model checking, which can automatically prove whether a property holds for a given system. The key insight is that various memory elements in a circuit are allowed to take only a certain range of values during the design process. The latter half of this thesis is geared towards generating the minimum-sized input values required for testing a circuit. This work takes the large input sets generated by a previously published tool and proposes a way to make them smaller. This can greatly reduce the cost of testing circuits in industry, where even a small increase in testing time raises development costs. Two such approaches are presented: one gives the optimum result but takes a long time to run for larger circuits, while the other gives a comparable but sub-optimal result in a much more time-efficient manner.
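The k-induction scheme itself can be sketched with an SMT solver. The toy transition system below (a saturating counter) and the k = 1 instance are illustrative assumptions, not the thesis's models; only Z3's standard Python API is used.

```python
from z3 import Int, Solver, Or, And, Not, unsat

c0, c1 = Int("c0"), Int("c1")
init = c0 == 0                                     # initial state
trans = Or(And(c0 < 10, c1 == c0 + 1),             # count up ...
           And(c0 == 10, c1 == 0))                 # ... then wrap to 0
prop = lambda c: c <= 10                           # property to prove

base = Solver()
base.add(init, Not(prop(c0)))                      # Init /\ ~P : must be unsat
step = Solver()
step.add(prop(c0), trans, Not(prop(c1)))           # P /\ T /\ ~P' : must be unsat

print("base case holds:", base.check() == unsat)          # True
print("induction step holds:", step.check() == unsat)     # True
```

When the induction step fails for a real design, the thesis's remedy applies: constrain the state variables to their legal signal domains (here, for example, asserting `0 <= c0`) so that spurious unreachable states cannot serve as counterexamples.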
285

Enhancing Input/Output Correctness, Protection, Performance, and Scalability for Process Control Platforms

Burrow, Ryan David 07 June 2019 (has links)
Most modern control systems use digital controllers to ensure safe operation. We modify the traditional digital control system architecture to integrate a new component known as a trusted input/output processor (TIOP). TIOP interface to the inputs (sensors) and outputs (actuators) of the system through existing communication protocols. The TIOP also interface to the application processor (AP) through a simple message-passing protocol. This removes all direct input/output (I/O) interaction from the AP. By isolating this interaction from the AP, system resilience against malware is increased, since run-time monitors can be inserted to ensure correct operation within provided safe limits. These run-time monitors can be located either in the TIOP(s) or in independent hardware. Furthermore, monitors can override commands from the AP should those commands violate the safety requirements of the system. By isolating I/O interaction, formal methods can be applied to verify TIOP functionality, ensuring correct adherence to the rules of operation. Additionally, removing sequential I/O interaction from the AP allows multiple I/O operations to run concurrently. This reduces I/O latency, which is desirable in many control systems with large numbers of sensors and actuators. Finally, by utilizing a hierarchical arrangement of TIOP, scalable growth is efficiently supported. We demonstrate this on a Xilinx Zynq-7000 programmable system-on-chip device. / Master of Science / Complex modern systems, from unmanned aircraft systems to industrial plants, are almost always controlled digitally. These digital control systems (DCSes) need to be verified for correctness, since failures can have disastrous consequences. However, proving that a DCS will always act correctly can be infeasible if the system is too complex. In addition, with the growth of inter-connectivity through the internet, malicious actors have more access than ever and may attempt to cause these systems to deviate from proper operation. This thesis seeks to solve these problems by introducing a new architecture for DCSes that uses isolated components that can be verified for correctness. In addition, safety monitors are implemented as part of the architecture to prevent unsafe operation.
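The monitor-override idea admits a very small sketch. Everything below, the limits, the clamping policy, and the function name, is a hypothetical illustration rather than the thesis implementation.

```python
SAFE_MIN, SAFE_MAX = 0.0, 100.0          # assumed actuator safety envelope

def monitor(ap_command: float) -> float:
    """Forward safe commands from the AP unchanged; override unsafe ones
    by clamping to the safe envelope before they reach the actuator."""
    if SAFE_MIN <= ap_command <= SAFE_MAX:
        return ap_command
    return min(max(ap_command, SAFE_MIN), SAFE_MAX)

print(monitor(42.0))     # 42.0  -- within limits, forwarded unchanged
print(monitor(250.0))    # 100.0 -- overridden to the nearest safe value
```

Because the monitor's logic is this small and has no other responsibilities, it is exactly the kind of component to which the formal verification mentioned above can be applied exhaustively.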
286

Discrete Transition System Model and Verification for Mitochondrially Mediated Apoptotic Signaling Pathways

Lam, Huy Hong 13 July 2007 (has links)
Computational biology and bioinformatics for apoptosis have been gaining much momentum due to advances in the computational sciences. Both fields use extensive computational techniques and modeling to mimic real-world behaviors. One problem of particular interest is the study of reachability, in which the goal is to determine whether a target state or protein concentration in the model is realizable for a signaling pathway. Another interesting problem is to examine faulty pathways and how a fault can make a previously unrealizable state possible, or vice versa. Such analysis can be extremely valuable to the understanding of apoptosis. However, these analyses can be costly or even impractical for some approaches, since they must simulate every aspect of the model. Our approach introduces an abstracted model that represents a portion of the apoptosis signaling pathways as a finite state machine. This abstraction allows us to apply hardware testing and verification techniques and to study the behaviors of the system without full simulation. We propose a framework that is tailor-built to implement these verification techniques for the discrete model. Through solving Boolean constraint satisfaction problems (SAT-based) and through guided stimulation (via a genetic algorithm), we can further extract the properties and behaviors of the system. Furthermore, our model allows us to conduct cause-effect analysis of the apoptosis signaling pathways. By constructing single- and double-fault models, we are able to study which fault(s) can cause the model to malfunction and the reasons behind it. Unlike simulation, our abstraction approach allows us to study system properties and system manipulations from a different perspective without fully relying on simulation. Using these observations as hypotheses, we aim to conduct laboratory experiments and further refine our model. / Master of Science
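The abstraction can be pictured as a Boolean finite state machine over protein activity levels. The update rules, protein names, and stuck-at fault encoding below are invented for illustration and are not the thesis model.

```python
from collections import deque

def step(state, fault=None):
    """One synchronous update of a three-protein toy pathway
    (caspase-8, Bax, caspase-3); the rules are hypothetical."""
    casp8, bax, casp3 = state
    nxt = (casp8, casp8 or bax, bax)      # Bax follows caspase-8; caspase-3 follows Bax
    if fault is not None:                 # stuck-at fault on one protein
        i, v = fault
        nxt = nxt[:i] + (v,) + nxt[i + 1:]
    return nxt

def reachable(start, target, fault=None):
    """BFS over the finite state space: is `target` realizable from `start`?"""
    seen, frontier = {start}, deque([start])
    while frontier:
        s = frontier.popleft()
        if s == target:
            return True
        t = step(s, fault)
        if t not in seen:
            seen.add(t)
            frontier.append(t)
    return False

# Fault-free, full activation is reachable; with Bax stuck at 0 it is not.
print(reachable((True, False, False), (True, True, True)))                    # True
print(reachable((True, False, False), (True, True, True), fault=(1, False))) # False
```

The single-fault query on the last line is the discrete analogue of the cause-effect analysis described above: clamp one component and re-ask the reachability question.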
287

Abstraction Guided Semi-formal Verification

Parikh, Ankur 28 June 2007 (has links)
Abstraction-guided simulation is a promising semi-formal framework for design validation in which an abstract model of the design is used to guide a logic simulator towards a target property. However, key issues still need to be addressed before this framework can truly deliver on its promise. Concretizing, or finding a real trace from an abstract trace, remains a hard problem. Abstract traces are often spurious, meaning no corresponding real trace exists. This is a direct consequence of the abstraction being an over-approximation of the real design. Further, the way in which the abstract model is constructed is an open-ended problem that has a great impact on the performance of the simulator. In this work, we propose novel approaches to address these issues. First, we present a genetic algorithm to select sets of state variables directly from the gate-level netlist of the design that are highly correlated with the target property. The sets of selected variables are used to build the Partition Navigation Tracks (PNTs). PNTs capture the behavior of expanded portions of the state space as they relate to the target property. Moreover, the computation and storage costs of the PNTs are small, making them scale well to large designs. Our experiments show that we are able to reach many more hard-to-reach states using our proposed techniques, compared to state-of-the-art methods. Next, we propose a novel abstraction-strengthening technique, in which the abstract design is constrained to make it more closely resemble the concrete design. Abstraction strengthening greatly reduces the need to refine the abstract model for hard-to-reach properties. To achieve this, we efficiently identify sequentially unreachable partial states in the concrete design via intelligent partitioning, resolution, and cube enlargement. Then, these partial states are added as constraints in the abstract model. Our experiments show that the cost to compute these constraints is low, and the abstract traces obtained from the strengthened abstract model are far easier to concretize. / Master of Science
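The variable-selection step can be sketched as a conventional genetic algorithm over fixed-size variable subsets. The fitness interface, population parameters, and operators below are illustrative assumptions, not the thesis implementation.

```python
import random

def evolve(num_vars, fitness, k=8, pop_size=40, generations=100):
    """Evolve k-element variable selections (assumes num_vars >= k); `fitness`
    scores a frozenset of indices by its correlation with the target property."""
    pop = [random.sample(range(num_vars), k) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: fitness(frozenset(ind)), reverse=True)
        survivors = pop[: pop_size // 2]                 # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            child = set(a[: k // 2] + b[k // 2 :])       # one-point crossover
            while len(child) < k:                        # fill back to size k (acts as mutation)
                child.add(random.randrange(num_vars))
            children.append(list(child))
        pop = survivors + children
    return max(pop, key=lambda ind: fitness(frozenset(ind)))
```

The interesting engineering lives in the fitness function: estimating how strongly a candidate set of state variables correlates with the target property, which is precisely what the PNT construction above consumes.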
288

Extending Model Checking Using Inductive Proofs in Distributed Digital Currency Protocols

Storey, Kyle R. 26 June 2023 (has links) (PDF)
Model checking is an effective method to verify both safety and liveness properties in distributed systems. However, the complexity of model checking grows exponentially with the number of entities, which makes it suitable only for small systems. Interactive theorem provers allow for machine-checked proofs. These proofs can include inductive reasoning, which allows them to reason about an arbitrarily large number of entities. However, proving safety and liveness properties in these proofs can be difficult. This work explores how combining model checking and inductive proofs can be an effective method for formally verifying complex distributed protocols. This is demonstrated on a part of MyCHIPs, a novel digital currency based on the value of personal credit. It has been selected as a case study because it requires certain properties to hold for a non-trivial distributed algorithm.
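A toy explicit-state checker makes the trade-off concrete. The token-ring protocol and safety property below are hypothetical stand-ins, not MyCHIPs: model checking verifies any fixed number of entities exhaustively, while the inductive proof is what generalizes the result to arbitrarily many.

```python
from collections import deque

def check_token_ring(n: int) -> bool:
    """Exhaustively verify 'exactly one token' for a ring of n nodes."""
    init = tuple(i == 0 for i in range(n))      # token starts at node 0
    def successors(state):
        for i in range(n):
            if state[i]:                        # holder passes the token rightwards
                nxt = list(state)
                nxt[i], nxt[(i + 1) % n] = False, True
                yield tuple(nxt)
    seen, frontier = {init}, deque([init])
    while frontier:
        s = frontier.popleft()
        if sum(s) != 1:                         # safety: exactly one token
            return False
        for t in successors(s):
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return True

print(all(check_token_ring(n) for n in range(2, 8)))   # True for each fixed n
```

The checker says nothing about n = 1000 except by re-running; an inductive argument ("passing the token preserves the token count") covers all n in one machine-checked proof, which is the combination this work advocates.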
289

Towards an Integrated Approach to Verification and Model-Based Testing in System Engineering

Lefticaru, Raluca, Konur, Savas, Yildirim, Unal, Uddin, Amad, Campean, Felician, Gheorghe, Marian 22 April 2017 (has links)
Engineering design in general, and system design of embedded software in particular, have a direct impact on the final engineering product and the software implementation, respectively. Guaranteeing that the models utilised meet the specified requirements is beneficial in detecting misbehaviour and software flaws. This requires an integrated approach, combining verification and model-based testing methodology with notations and methods from system engineering and software engineering. In this paper, we propose a model-based approach integrating various notations utilised in the functional design of complex systems with formal verification and testing. We illustrate our approach on the cruise control system of an e-Bike case study.
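One ingredient of the approach, deriving test sequences that cover every transition of a functional model, can be sketched briefly. The cruise-control state machine below is a hypothetical stand-in for the paper's e-Bike case study, and the greedy tour is only the idea in miniature.

```python
TRANSITIONS = {                      # (state, event) -> next state
    ("off", "engage"): "on",
    ("on", "set_speed"): "cruising",
    ("cruising", "brake"): "on",
    ("on", "disengage"): "off",
}

def transition_tour(start="off"):
    """Greedy walk trying to cover each transition at least once."""
    uncovered, state, tour = set(TRANSITIONS), start, []
    while uncovered:
        for (s, e), t in TRANSITIONS.items():
            if s == state and (s, e) in uncovered:
                uncovered.discard((s, e))
                tour.append(e)
                state = t
                break
        else:    # stuck: would need a detour over covered edges (omitted here)
            break
    return tour

print(transition_tour())   # ['engage', 'set_speed', 'brake', 'disengage']
```

Each event sequence produced this way becomes a concrete test to run against the implementation, with the model predicting the expected state after every step.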
290

P colonies and kernel P systems

Csuhaj-Varju, E., Gheorghe, Marian, Lefticaru, Raluca 18 July 2018 (has links)
P colonies, tissue-like P systems with very simple components, have received constant attention from the membrane computing community, and in recent years several new variants of the model have been considered. Another P system model, namely the kernel P system, which integrates the most successfully used features of membrane systems, has recently attracted interest, and some important developments have been reported. In this paper we study connections among several classes of P colonies and kernel P systems, showing how the behaviour of these P colony systems can be represented as kernel P systems. An example illustrates how a system is modelled using P colonies and kernel P systems, and some of its properties are formally proved in the latter approach. / Grant No. K 120558 of the NKFIH—National Research, Development, and Innovation Office, Hungary; Romanian National Authority for Scientific Research, CNCS-UEFISCDI (Project No. PN-III-P4-ID-PCE-2016-0210).
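A P colony's two basic rule types, internal evolution and communication with the environment, can be illustrated with a toy multiset-rewriting step. The objects and rules below are invented for illustration and are not taken from the paper.

```python
from collections import Counter

def apply_rule(agent: Counter, env: Counter, rule) -> bool:
    """rule = ('evolve', a, b): rewrite a -> b inside the agent;
       rule = ('comm', a, b):   send a to the environment, bring b in."""
    kind, a, b = rule
    if agent[a] == 0:
        return False
    if kind == "evolve":
        agent[a] -= 1; agent[b] += 1
        return True
    if kind == "comm" and env[b] > 0:
        agent[a] -= 1; env[a] += 1
        env[b] -= 1; agent[b] += 1
        return True
    return False

env = Counter({"e": 5})                       # environment multiset
agent = Counter({"a": 1})                     # one very simple agent
apply_rule(agent, env, ("evolve", "a", "b"))  # a -> b
apply_rule(agent, env, ("comm", "b", "e"))    # b <-> e with the environment
print(+agent, +env)   # Counter({'e': 1}) Counter({'e': 4, 'b': 1})
```

Representing such a colony as a kernel P system amounts to encoding the agents' multisets and rule choices in the richer rule-and-guard formalism of kP systems, which is the translation studied in the paper.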
