About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations (NDLTD). Our metadata is collected from universities around the world. If you manage a university, consortium, or country archive and want to be added, details can be found on the NDLTD website.
171

Direct methods for deductive verification of temporal properties in continuous dynamical systems

Sogokon, Andrew January 2016
This thesis is concerned with the problem of formal verification of correctness specifications for continuous and hybrid dynamical systems. Our main focus will be on developing and automating general proof principles for temporal properties of systems described by non-linear ordinary differential equations (ODEs) under evolution constraints. The proof methods we consider will work directly with the differential equations and will not rely on the explicit knowledge of solutions, which are in practice rarely available. Our ultimate goal is to increase the scope of formal deductive verification tools for hybrid system designs. We give a comprehensive survey and comparison of available methods for checking set invariance in continuous systems, which provides a foundation for safety verification using inductive invariants. Building on this, we present a technique for constructing discrete abstractions of continuous systems in which spurious transitions between discrete states are entirely eliminated, thereby extending previous work. We develop a method for automatically generating inductive invariants for continuous systems by efficiently extracting reachable sets from their discrete abstractions. To reason about liveness properties in ODEs, we introduce a new proof principle that extends and generalizes methods that have been reported previously and is highly amenable to use as a rule of inference in a deductive verification calculus for hybrid systems. We will conclude with a summary of our contributions and directions for future work.
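The inductive-invariance idea at the heart of this abstract can be illustrated with a small sketch (a hypothetical toy system, not one of the thesis's methods): a set {V = 0} is invariant for an ODE if the Lie derivative of V along the vector field vanishes on the set, and this can be checked directly from the differential equations, with no solution in sight.

```python
import math, random

# Hypothetical example system: dx1/dt = -x2, dx2/dt = x1 (pure rotation), with
# candidate invariant set {V = 0}, where V(x1, x2) = x1^2 + x2^2 - 1 (unit circle).
def f(x1, x2):
    return (-x2, x1)

def V(x1, x2):
    return x1**2 + x2**2 - 1.0

def lie_derivative(V, f, x1, x2, h=1e-7):
    """Directional derivative of V along the vector field f at (x1, x2)."""
    f1, f2 = f(x1, x2)
    return (V(x1 + h * f1, x2 + h * f2) - V(x1, x2)) / h

# Sample points on the circle: the Lie derivative vanishes there, which is the
# inductive condition showing trajectories never leave the set.
random.seed(0)
for _ in range(100):
    t = random.uniform(0, 2 * math.pi)
    assert abs(lie_derivative(V, f, math.cos(t), math.sin(t))) < 1e-5
print("L_f V = 0 on the candidate invariant set")
```

Deductive tools mechanise exactly this kind of check symbolically, for far richer classes of invariants than this numerical sketch suggests.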
172

Rabbit: A novel approach to find data-races during state-space exploration

Oliveira, João Paulo dos Santos 30 August 2012
Data-races are an important class of error in concurrent shared-memory programs, and software model checking is a popular approach to finding them. This research proposes Rabbit, a novel approach that complements model checking by efficiently reporting precise race warnings during state-space exploration (SSE). It uses information obtained across the different paths explored during SSE to predict likely racy memory accesses. We evaluated Rabbit on 33 different race scenarios, involving a total of 21 distinct application subjects of various sources and sizes. Results indicate that Rabbit reports race warnings very early compared to the time at which the model checker detects the race (in 84.8% of the cases it reports a true race warning in under 5 s), and that the warnings it reports include very few false alarms. We also observed that the model checker finds the actual race quickly when it uses a guided search that builds on Rabbit's output (in 74.2% of the cases it reports the race in under 20 s).
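To make the notion of a "race warning" concrete, here is a minimal Eraser-style lockset sketch (an illustration of race-warning heuristics in general, not Rabbit's actual cross-path algorithm): a warning is raised when two threads access the same location, at least one access is a write, and the locksets held at the two accesses are disjoint.

```python
from collections import defaultdict

def find_race_warnings(trace):
    """trace: list of (thread, op, var, locks_held) events, op in {'r', 'w'}.
    Returns warnings for conflicting accesses with disjoint locksets."""
    seen = defaultdict(list)           # var -> prior accesses
    warnings = []
    for thread, op, var, locks in trace:
        locks = frozenset(locks)
        for t2, op2, locks2 in seen[var]:
            # different threads, at least one write, no common lock
            if t2 != thread and 'w' in (op, op2) and not (locks & locks2):
                warnings.append((var, (t2, op2), (thread, op)))
        seen[var].append((thread, op, locks))
    return warnings

trace = [
    ('T1', 'w', 'x', {'L'}),    # write under lock L
    ('T2', 'r', 'x', set()),    # unsynchronised read: potential race on x
    ('T1', 'w', 'y', {'L'}),
    ('T2', 'w', 'y', {'L'}),    # same lock held by both: no warning
]
print(find_race_warnings(trace))   # one warning, on 'x'
```

Rabbit's contribution is to make this kind of prediction cheap and precise *during* state-space exploration, so the model checker can then guide its search toward confirming the warning.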
173

On the automated verification of symmetric-key cryptographic algorithms: an approach based on SAT-solvers

Lafitte, Frédéric 19 September 2017
A cryptographic protocol is a structured exchange of messages protected by means of cryptographic algorithms. Computer security in general relies heavily on these protocols and algorithms; in turn, these rely absolutely on smaller components called primitives. As technology advances, computers have reached a cost and a degree of miniaturisation conducive to their proliferation throughout society in the form of software-controlled, network-enabled things. As these things find their way into environments where security is critical, their protection ultimately relies on primitives: if a primitive fails, all security solutions (protocols, policies, etc.) built on top of it are likely to offer no security at all. Lightweight symmetric-key primitives, in particular, will play a critical role. The security of protocols is frequently verified using formal and automated methods. For algorithms and public-key primitives, formal proofs are often used, although they are somewhat error-prone, and current efforts aim to automate them. Symmetric-key primitives, on the other hand, are still analysed in a rather ad hoc manner. Since their security is guaranteed only by the test of time, they traditionally have a built-in security margin. Despite being paramount to the security of embedded devices, lightweight primitives appear to have a smaller security margin, and researchers would greatly benefit from automated tools that strengthen these tests of time. In their seminal work in 2000, Massacci and Marraro proposed formulating primitives in propositional logic and using SAT solvers to verify their properties automatically. At that time, SAT solvers were quite different from what they have become today; the continuous improvement in their performance makes them an even better choice as a verification back-end. Indeed, their performance improved so much that, starting around 2006, some cryptanalysts began to use them, though mostly to speed up attacks. This thesis introduces the framework CryptoSAT and shows its advantages for the purpose of verification: it allows a cryptologist to verify the properties of a symmetric-key primitive formally and automatically with SAT solvers while abstracting away the propositional logic, and its usefulness is demonstrated by obtaining formal answers to questions posed in the cryptanalysis literature concerning potential flaws in both the design and the implementation of certain primitives.
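The Massacci–Marraro idea the thesis builds on can be shown in miniature (a purely hypothetical 4-bit toy "primitive", nothing like a real cipher, and a 15-line DPLL solver standing in for a modern SAT engine): encode the primitive's equations as CNF clauses, then ask the solver a question, here key recovery from one plaintext/ciphertext pair.

```python
def solve(clauses, assignment=None):
    """A tiny DPLL-style SAT solver; clauses are lists of nonzero ints (DIMACS-style)."""
    assignment = assignment or {}
    if not clauses:
        return assignment                # all clauses satisfied
    if any(not c for c in clauses):
        return None                      # empty clause: conflict
    lit = clauses[0][0]
    for choice in (lit, -lit):           # branch on both polarities
        reduced = [[l for l in c if l != -choice] for c in clauses if choice not in c]
        model = solve(reduced, {**assignment, abs(choice): choice > 0})
        if model is not None:
            return model
    return None

# Toy "primitive": c_i = p_i XOR k_i XOR k_{(i+1) mod 4}. Real primitives
# yield millions of clauses; the encoding principle is the same.
def encrypt(p_bits, k_bits):
    n = len(k_bits)
    return [p_bits[i] ^ k_bits[i] ^ k_bits[(i + 1) % n] for i in range(n)]

def xor_clauses(i, j, b):
    """CNF clauses for var_i XOR var_j = b (variables are 1-based)."""
    return [[i, j], [-i, -j]] if b else [[-i, j], [i, -j]]

secret = [1, 0, 1, 1]
p = [0, 1, 1, 0]
c = encrypt(p, secret)

clauses = []
for i in range(4):                       # one parity constraint per output bit
    clauses += xor_clauses(i + 1, (i + 1) % 4 + 1, p[i] ^ c[i])

model = solve(clauses)
key = [int(model.get(i + 1, False)) for i in range(4)]
assert encrypt(p, key) == c              # recovered key is functionally correct
print("recovered key:", key)
```

CryptoSAT's point, as the abstract describes, is to let the cryptologist pose such questions without ever writing the propositional encoding by hand.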
174

Improving product development process through verification and validation

Härkönen, J. (Janne) 17 June 2009
Abstract: The workload of verification and validation (V&V) has increased constantly in the high-technology industries. Changes in the business environment, with fast time-to-market and pressure to decrease research and development costs, have increased the importance of an efficient product creation process, including V&V. The significance of V&V-related know-how and testing is growing in the high-tech business environment. As a consequence, companies in the ICT sector are under pressure to improve their product development processes and their verification and validation activities. The main motive for this research is that research on verification and validation from the product development process perspective has been scarce. This study approaches the above goal from four perspectives: current challenges and success factors, V&V maturity in different new product development (NPD) phases, benchmarking the automotive sector, and shifting the emphasis of NPD efforts. This dissertation is qualitative in nature and is based on interviews with experienced industrial managers, whose views are reflected against the scientific literature; the researcher analysed the material obtained and drew conclusions. The main implication of this doctoral dissertation is a visible need to shift the emphasis of V&V activities to early NPD. These activities should be viewed and managed over the entire NPD process. Companies need to understand V&V maturity in the different NPD phases and develop their activities based on this understanding. Verification and validation activities must be seen as integral to successful NPD. Benchmarking other sectors may help identify development potential for the NPD process. The automotive sector, being mature, has developed practices for successfully handling requirements during NPD. The role of V&V differs between NPD phases: set-based V&V can provide the required understanding during early product development, and developing parallel technological alternatives and platforms during early NPD also supports shifting the emphasis towards earlier development phases.
175

Improving verification and validation activities in ICT companies—product development management approach

Belt, P. (Pekka) 05 June 2009
Abstract: The main motive for this research is that research on verification and validation (V&V) activities from the management viewpoint has been scarce, even though V&V has been covered from the technical viewpoint. There was a clear need to study the management aspects, owing to the development of the information and communications technology (ICT) sector and the increased significance of V&V activities. ICT has developed into a turbulent, high clock-speed sector, and the importance of V&V activities has increased significantly. As a consequence, companies in the ICT sector require ideas for improving their verification and validation activities from the product development management viewpoint. This study approaches the above goal from four perspectives: current V&V management challenges, organisational and V&V maturities, benchmarking another sector, and uncertainty during new product development (NPD). This dissertation is qualitative in nature and is based on interviews with experienced industrial managers, whose views are reflected against the scientific literature; the researcher analysed the material obtained and drew conclusions. The main implications of this doctoral dissertation are the need to overcome the current tendency to organise through functional silos, and the low maturity of V&V activities. Verification and validation activities should be viewed and managed over the entire NPD process, which requires new means of cross-functional integration. The maturity of the overall management system needs to be adequate to enable higher efficiency and effectiveness of V&V activities. There are pressures to shift the emphasis of V&V to early NPD and simultaneously to delay decision-making in NPD projects to a stage where enough information is available. Understanding-enhancing V&V methods are a potential way to advance towards these goals.
176

A Certified Core Policy Language

Sistany, Bahman January 2016
We present the design and implementation of A Certified Core Policy Language (ACCPL) that can be used to express access-control rules and policies. Full-blown access-control policy languages such as the eXtensible Access Control Markup Language (XACML) [OAS13] already exist, but because access rules in such languages are often expressed declaratively, using fragments of a natural language like English, it isn't always clear what the intended behaviour of the system encoded in these rules should be. To remedy this ambiguity, a formal specification of how an access-control mechanism should behave is typically given in some sort of logic, often a subset of first-order logic. Showing that an access-control system actually behaves correctly with respect to its specification requires proofs; however, the proofs presented in the literature are often hard or impossible to verify formally. The difficulty is partly due to the fact that the language used in the proofs, while mathematical in nature, relies on intuitive justifications, which means the proofs may be incomplete or contain subtle errors. ACCPL is small by design. By small we refer to the size of the language: the syntax, auxiliary definitions, and semantics of ACCPL take only a few pages to describe. This compactness allows us to concentrate on the main goal of this thesis, which is the ability to reason about policies written in ACCPL with respect to specific questions. To keep the language compact, we have traded away completeness and expressive power in several directions; for example, ACCPL uses only a single policy combinator, the conjunction policy combinator. The design of ACCPL is therefore a trade-off between ease of formal proof of correctness and expressive power. We also consider ACCPL a core access-control policy language, since it retains the core features of many access-control policy languages. For instance, ACCPL employs a single condition type, called a "prerequisite", where other languages may have very expressive and rich sets of conditions.
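The two design choices named in the abstract — a single "prerequisite" condition type and a single conjunction combinator — can be sketched as a tiny interpreter (a hypothetical Python rendering purely for illustration; the real ACCPL is a Coq development with machine-checked semantics):

```python
from dataclasses import dataclass

@dataclass
class Prerequisite:
    """The single condition type: a named boolean fact about the request."""
    name: str
    def holds(self, request):
        return request.get(self.name, False)

@dataclass
class Rule:
    """Permit `action` when the prerequisite holds."""
    prereq: Prerequisite
    action: str
    def permits(self, request, action):
        return action == self.action and self.prereq.holds(request)

@dataclass
class AndPolicy:
    """The single policy combinator: conjunction of two sub-policies."""
    left: object
    right: object
    def permits(self, request, action):
        return self.left.permits(request, action) and self.right.permits(request, action)

policy = AndPolicy(
    Rule(Prerequisite('is_subscriber'), 'play'),
    Rule(Prerequisite('region_ok'), 'play'),
)
print(policy.permits({'is_subscriber': True, 'region_ok': True}, 'play'))   # True
print(policy.permits({'is_subscriber': True}, 'play'))                      # False
```

Because the language is this small, every question one can ask of a policy (e.g. "is an action ever permitted?") stays within reach of a complete formal proof, which is exactly the trade-off the abstract describes.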
177

Speaker Verification Systems Under Various Noise and SNR Conditions

Wan, Qianhui January 2017
In speaker verification, mismatches between the training speech and the testing speech can greatly affect the robustness of classification algorithms; these mismatches are mainly caused by changes in noise type and signal-to-noise ratio. This thesis aims at finding the most robust classification methods under multi-noise and multiple signal-to-noise-ratio conditions. Several well-known state-of-the-art classification algorithms and features used in speaker verification are compared by examining the performance of a small-set speaker verification system (e.g. a voice lock for a family). The effect of the testing-speech length is also examined. The i-vector/Probabilistic Linear Discriminant Analysis (PLDA) method with compensation strategies is shown to provide stable performance for both previously seen and previously unseen noise scenarios, and a C++ implementation with online processing and multi-threading is developed for this approach.
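A verification trial in such a system reduces to scoring a test utterance against an enrolled model and thresholding the score. The sketch below uses plain cosine scoring on fixed-length speaker embeddings as a stand-in for the i-vector/PLDA pipeline the thesis compares (the vectors and threshold are made up for illustration):

```python
import math

def cosine_score(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def verify(enrolled, trial, threshold=0.7):
    """Accept the claimed identity when the score clears the threshold."""
    return cosine_score(enrolled, trial) >= threshold

enrolled = [0.9, 0.1, 0.4]       # hypothetical embedding of the enrolled speaker
genuine  = [0.8, 0.2, 0.5]       # same speaker, different utterance
impostor = [0.1, 0.9, -0.2]

print(verify(enrolled, genuine))    # True
print(verify(enrolled, impostor))   # False
```

The thesis's point is precisely that this decision boundary degrades under noise and SNR mismatch, which is why compensation strategies on top of the scoring back-end matter.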
178

Verification of liveness properties on hybrid dynamical systems

Carter, Rebekah January 2013
A hybrid dynamical system is a mathematical model for a part of the real world in which discrete and continuous parts interact. Such systems are typically complex, and it is difficult to know how they will behave for general parameters and initial conditions. However, formal verification gives us the ability to prove automatically that certain behaviour does or does not occur for a range of parameters in a system. The challenge is then to define suitable methods for proving properties of hybrid systems. This thesis looks at using formal verification to prove liveness properties of hybrid systems: a liveness property says that something good eventually happens in the system. This work presents the theoretical background and practical application of various methods for proving and disproving inevitability properties (a type of liveness property) in different classes of hybrid systems. The methods combine knowledge of the dynamical behaviour of a system with the brute-force approach of model checking, in order to draw on the benefits of both. The work on proving liveness properties is based on abstracting dynamical systems to timed automata. This thesis explores the limits of a pre-defined abstraction method, adds dynamical knowledge to it, and shows that this improvement makes liveness properties provable in certain continuous dynamical systems. The limits are then pushed further to see how the method can be used for piecewise-continuous dynamical systems. The resulting algorithms are implemented for both classes of systems. To disprove liveness properties of hybrid systems, a novel framework is proposed, using a new property called deadness. Deadness is a dynamically-aware property of the hybrid system which, if true, disproves the liveness property by means of a finite execution, whereas an infinite execution is usually required to disprove a liveness property. An algorithm is proposed that uses dynamical properties of hybrid systems to derive deadness properties automatically; the implementation of this algorithm is discussed and applied to a simplified model of an oilwell drillstring.
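The discrete shadow of the disproof problem can be sketched as follows (a finite toy transition system, not the hybrid setting itself): "every run eventually reaches the goal" is refuted by a reachable cycle that avoids the goal, and finding such a cycle is a finite witness in the same spirit as a deadness property.

```python
def find_goal_avoiding_lasso(transitions, init, goals):
    """DFS for a cycle reachable from init through non-goal states only.
    Returns the cycle (a list of states) or None if no such lasso exists."""
    def dfs(state, path):
        if state in goals:
            return None                              # this branch reaches the goal
        if state in path:
            return path[path.index(state):]          # found a goal-avoiding cycle
        for nxt in transitions.get(state, []):
            cycle = dfs(nxt, path + [state])
            if cycle:
                return cycle
        return None
    return dfs(init, [])

# Hypothetical system: from s0 a run may reach the goal, but s2 <-> s3 can trap it.
transitions = {'s0': ['s1', 's2'], 's1': ['goal'], 's2': ['s3'], 's3': ['s2']}
print(find_goal_avoiding_lasso(transitions, 's0', {'goal'}))   # ['s2', 's3']
```

In the hybrid setting no such finite cycle enumeration is available, which is why the thesis derives deadness properties from the dynamics instead.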
179

Verification of Textual Use-Cases

Vinárek, Jiří January 2013
The aim of this thesis is to create a tool for the formal verification of systems specified using textual use-cases. The tool allows automated verification of temporal invariants specified in temporal logic (CTL and LTL formulae). The textual specification is transformed into a formal model that is verified using the NuSMV symbolic model checker. Potential errors are shown to the user in the form of an HTML report. Using this feedback, the user can iteratively develop valid textual use-case specifications. The tool's architecture focuses on reusability of its components and on extensibility.
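What the NuSMV back-end does for a CTL invariant can be shown in miniature (a toy Kripke structure in Python, not the thesis's actual use-case translation): AG p holds in exactly those states from which no "bad" (not-p) state is reachable, computed below as a backward-reachability fixpoint.

```python
def check_AG(transitions, labels, p):
    """Return the set of states satisfying the CTL invariant AG p."""
    states = set(transitions)
    bad = {s for s in states if p not in labels.get(s, set())}
    changed = True
    while changed:                       # least fixpoint: "can reach a bad state"
        changed = False
        for s in states - bad:
            if any(t in bad for t in transitions[s]):
                bad.add(s)
                changed = True
    return states - bad

# Use-case-flavoured toy model: an 'error' state violates the invariant 'ok'.
transitions = {'start': ['step1'], 'step1': ['done', 'error'],
               'done': ['done'], 'error': ['error']}
labels = {'start': {'ok'}, 'step1': {'ok'}, 'done': {'ok'}, 'error': set()}
print(sorted(check_AG(transitions, labels, 'ok')))   # ['done']
```

A symbolic model checker performs the same fixpoint computation over BDD- or SAT-encoded state sets, which is what makes realistically sized use-case models tractable.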
180

Formally Verified Code Obfuscation in the Coq Proof Assistant

Lu, Weiyun 20 December 2019
Code obfuscation is a software security technique in which transformations are applied to source and/or machine code to make it more difficult to analyze and understand, in order to deter reverse-engineering and tampering. However, with many commercial tools, such as Irdeto's Cloakware product, it is not clear why the end user should believe that the programs that come out the other end are still "the same program". In this thesis, we apply techniques of formal specification and verification, using the Coq Proof Assistant and IMP (a simple imperative language within it), to formulate what it means for a program's semantics to be preserved by an obfuscating transformation, and we give formal machine-checked proofs that these properties hold. We describe our work on opaque predicate and control-flow flattening transformations. Along the way, we also employ Hoare logic as an alternative to state equivalence, and augment the IMP language with switch statements. We also define a lower-level flowchart language that wraps around IMP for modelling certain flattening transformations, treating blocks of code as objects in their own right. We then discuss related work in the literature on the formal verification of data-obfuscation and layout-obfuscation transformations in IMP, and conclude by discussing CompCert, a formally verified C compiler written in Coq, together with work that has been done on obfuscation there, and we muse on the possibility of implementing formal methods in the next generation of real-world obfuscation tools.
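Control-flow flattening, one of the transformations whose semantics-preservation the thesis proves for IMP in Coq, has a shape that is easy to convey executably (this Python rendering is only an illustration of the transform, not the thesis's formalisation): structured control flow is replaced by a dispatcher loop over numbered blocks, and the proof obligation is that the two versions are observationally equivalent.

```python
def gcd_plain(a, b):
    """Ordinary structured control flow."""
    while b:
        a, b = b, a % b
    return a

def gcd_flattened(a, b):
    """The same computation with flattened control flow."""
    state = 0                      # dispatcher variable selects the next block
    while True:
        if state == 0:             # block 0: loop test
            state = 1 if b else 2
        elif state == 1:           # block 1: loop body
            a, b = b, a % b
            state = 0
        else:                      # block 2: exit
            return a

for x, y in [(48, 18), (7, 13), (0, 5)]:
    assert gcd_plain(x, y) == gcd_flattened(x, y)
print("flattened version is observationally equivalent")
```

Testing on a few inputs, as above, only suggests equivalence; the thesis's contribution is a machine-checked proof that the transformation preserves semantics for *all* inputs.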
