91

Quality assessment of a large real world industry project

Glazunov, Vladimir January 2013 (has links)
Quality Monitor is an application that automatically analyzes software projects for quality and produces quality assessment reports. This thesis project aims to instantiate Quality Monitor for a large real-world .Net project and to extend Quality Monitor by considering data sources other than source code. The extended analysis scope includes bug reports, features, and time reports besides .Net assemblies (code) as artifacts. Different tools were investigated for the analysis of code, bug reports, features, and time reports. A dedicated analysis of .Net assemblies was implemented, as none of the existing tools under evaluation met all requirements. It was successfully completed and allows the extraction of the data necessary for creating call and control-flow graphs. These graphs are used for calculating additional metrics that allow an improved assessment of the quality of the project. The implementation of the .Net assembly reader was tested on a large real-world industrial project. Other data sources were analyzed theoretically but excluded from further implementation. Altogether, the thesis includes an analysis of possible Quality Monitor extensions, including their requirements, design, and (partially) their implementation and evaluation.
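To illustrate the kind of metric such call and control-flow graphs enable (an illustration only, not code from the thesis; the edge-list encoding of the graphs is an assumption), a small Python sketch could compute fan-in/fan-out and McCabe complexity:

    # Minimal sketch: metrics over graphs assumed to be already extracted
    # from .Net assemblies as edge lists (hypothetical format).
    from collections import defaultdict

    def fan_in_out(call_edges):
        # call_edges: iterable of (caller, callee) pairs from the call graph.
        fan_in, fan_out = defaultdict(int), defaultdict(int)
        for caller, callee in call_edges:
            fan_out[caller] += 1
            fan_in[callee] += 1
        return fan_in, fan_out

    def cyclomatic_complexity(cfg_nodes, cfg_edges):
        # McCabe complexity E - N + 2 for one method's control-flow graph.
        return len(cfg_edges) - len(cfg_nodes) + 2

    print(cyclomatic_complexity([0, 1, 2, 3], [(0, 1), (1, 2), (1, 3), (2, 3)]))  # 2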
92

Identification and Analysis of Combined Quality Assurance Approaches

Nha, Vi Tran Ngoc January 2010 (has links)
Context: Due to the increasing size and complexity of software today, the amount of effort for software quality assurance (QA) is growing and becoming more and more expensive. Many techniques lead to improvements in software QA. Static analysis can obtain very good coverage while analyzing a program without executing it, but it suffers from imprecision in the form of false errors. In contrast, dynamic analysis can obtain only partial coverage due to the large number of possible test cases, but the reported errors are more precise. Static and dynamic analyses can complement each other by providing valuable information that would be missed when using either analysis technique in isolation. Although many studies investigate QA approaches that combine static and dynamic QA techniques, it is unclear what we have learned from these studies, because no systematic synthesis exists to date. Method: This thesis is intended to provide the basic key concepts for combined QA approaches. A major part of this thesis presents a systematic review that provides a detailed discussion of the state of the art on approaches that combine static and dynamic QA techniques. The systematic review aims to identify the existing combined QA approaches, how to classify them, their purposes and inputs, and which combinations are available. Result: The results show that there are two relations in the combination of static and dynamic techniques, namely integration and separation. In addition, the objectives of combined QA approaches were categorized according to QA process quality and product quality. The most common inputs for combined approaches were also discussed. Moreover, we identified which combinations of static and dynamic techniques should or should not be used, as well as potential combinations for further research.
93

Analýza řetězců v Code Contracts / String Analysis for Code Contracts

Dort, Vlastimil January 2016 (has links)
Using contracts such as method preconditions, postconditions, and class invariants in code is an approach to preventing bugs in object-oriented programs. In the .NET Framework, this is possible thanks to the Code Contracts framework, which includes Clousot, a tool for static program analysis based on abstract interpretation. Although string is a fundamental type in .NET programs, Clousot does not have usable support for the analysis of string values. In this thesis, we explore the specifics of string manipulation in the C# language and in the .NET Framework, and show how they can be covered by static analysis. Our approach is to use the methods of the String class and a subset of regular expressions to specify string properties in code, and to use abstract interpretation with non-relational abstract domains to reason about those properties. We chose a small number of already published abstract domains for strings, which vary in their complexity and ability to represent different properties. We adapted those domains to our setting, including definitions of abstract semantics for the supported string methods. We implemented the abstract domains in Clousot in a way that cooperates with numerical analysis and allows adding more string abstract domains in the future.
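As a rough illustration of a non-relational string abstract domain of the kind discussed above (a simplified sketch under assumed semantics, not the Clousot implementation), a prefix domain with a join and an abstract concatenation could look like this in Python:

    # Sketch of a prefix abstract domain for strings (illustrative assumption).
    from os.path import commonprefix

    class Prefix:
        # Abstracts a string by a known prefix; "" is the top element.
        def __init__(self, prefix):
            self.prefix = prefix

        def join(self, other):
            # The longest common prefix over-approximates both values.
            return Prefix(commonprefix([self.prefix, other.prefix]))

        def concat(self, other):
            # A prefix of the left operand is also a prefix of the concatenation.
            return Prefix(self.prefix)

    print(Prefix("https://").join(Prefix("http://")).prefix)  # "http"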
94

EVA, an Evolved Value Analysis for Frama-C : structuring an abstract interpreter through value and state abstractions / Structurer un interpréteur abstrait autour d'abstractions d'états et de valeurs : EVA, une analyse de valeurs évoluée pour Frama-C

Bühler, David 15 March 2017 (has links)
This thesis proposes a new framework for the combination of multiple domains in abstract interpretation. Its core concept is the structuring of the abstract semantics following the usual distinction between expressions and statements. This can be achieved by a convenient architecture where abstractions are separated into two layers: value abstractions, in charge of the expression semantics, and state abstractions (or abstract domains), in charge of the statement semantics. This design leads naturally to an elegant communication system where the abstract domains, when interpreting a statement, interact and exchange information through value abstractions, which express properties about expressions. While the values form the communication interface between domains, they are also standard elements of the abstract interpretation framework. The communication system is thus embedded in the abstract semantics, and the usual tools of abstract interpretation apply naturally to value abstractions. For instance, different kinds of value abstractions can be composed through the existing methods of combination of abstractions, enabling even further interaction between the components of the abstract semantics. This thesis explores the possibilities offered by this framework. We discuss efficient strategies to compute precise value abstractions from the information inferred by abstract domains, and illustrate the means of communication between different state abstractions. Our architecture also features a direct collaboration of the abstractions for the emission of the alarms that report the possible errors of a program. We also propose a mechanism for interacting with the components of a modular combination of OCaml types. We use GADTs to encode the inner shape of a combination, and automatically build injection and projection functions between a product of datatypes and its components. This allows direct communication between the abstractions of an abstract interpreter. Finally, a last contribution of this thesis is the automatic extension of abstract domains to track sets of disjunctive abstract states, each one qualified with a predicate under which the state holds. This enhances the precision of an abstract semantics at join points, where several possible paths of a program execution meet. At these points, the predicates preserve the information usually lost by the merge of abstract states. Unlike other approaches, the analysis does not maintain a strict partition of the abstract states, as the predicates we use are not mutually exclusive. This design enables optimizations that are crucial for scalability, as confirmed by our experimental results on an industrial, generated program. The whole system of abstraction combination has been implemented within EVA, the new version of the abstract interpreter provided by the Frama-C platform. EVA enjoys a modular and extensible architecture designed to facilitate the introduction of new abstractions and to enable rich interactions between them. Thanks to this work, five new domains from the literature have been implemented in less than a year, enhancing the scope and the precision of the analyzer.
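To convey the value/state layering described above in a few lines (a deliberately small Python sketch and an assumption on our part; EVA itself is written in OCaml), a state abstraction can interpret assignments by asking a value abstraction to evaluate expressions:

    # Illustrative sketch of the value/state split, not EVA's actual code.
    class Interval:
        # Value abstraction: the possible values of one expression.
        def __init__(self, lo, hi):
            self.lo, self.hi = lo, hi
        def add(self, other):
            return Interval(self.lo + other.lo, self.hi + other.hi)

    class State:
        # State abstraction (abstract domain): maps variables to value abstractions.
        def __init__(self, env):
            self.env = dict(env)
        def eval_expr(self, expr):
            # expr is a variable name, an integer constant, or ("+", e1, e2).
            if isinstance(expr, str):
                return self.env[expr]
            if isinstance(expr, int):
                return Interval(expr, expr)
            _, e1, e2 = expr
            return self.eval_expr(e1).add(self.eval_expr(e2))
        def assign(self, var, expr):
            # Statement semantics: the state consumes the value abstraction.
            new = State(self.env)
            new.env[var] = self.eval_expr(expr)
            return new

    s = State({"x": Interval(0, 10)}).assign("y", ("+", "x", 1))
    print(s.env["y"].lo, s.env["y"].hi)  # 1 11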
95

Static/Dynamic Analyses for Validation and Improvements of Multi-Model HPC Applications. / Analyse statique/dynamique pour la validation et l'amélioration des applications parallèles multi-modèles

Saillard, Emmanuelle 24 September 2015 (has links)
Supercomputing plays an important role in several innovative fields, speeding up prototyping or validating scientific theories. However, supercomputers are evolving rapidly, now featuring millions of processing units, which raises the question of their programmability. Despite the emergence of more widespread and functional parallel programming models, developing correct and effective parallel applications remains a complex task. Although debugging solutions have emerged to address this issue, they often come with restrictions. Programming model evolutions stress the need for a convenient validation tool able to handle hybrid applications. Indeed, as current scientific applications mainly rely on the Message Passing Interface (MPI) parallel programming model, new hardware designed for Exascale with higher node-level parallelism clearly advocates for MPI+X solutions, with X a thread-based model such as OpenMP. But integrating two different programming models inside the same application can be error-prone, leading to complex bugs that are unfortunately mostly detected at runtime. In an MPI+X program, not only must the correctness of MPI be ensured but also its interactions with the multi-threaded model; for example, identical MPI collective operations cannot be performed by multiple non-synchronized threads. This thesis aims at developing a combination of static and dynamic analyses to enable an early verification of hybrid HPC applications. The first pass statically verifies the thread level required by an MPI+OpenMP application and outlines execution paths leading to potential deadlocks. Thanks to this analysis, the code is selectively instrumented, displaying an error and synchronously interrupting all processes if the actual scheduling leads to a deadlock situation.
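As a small illustration of one of the checks mentioned above, the MPI thread-support level an MPI+OpenMP code needs can be compared against what the library actually provides; the following Python/mpi4py sketch is an illustration only and is not the instrumentation produced by the thesis:

    # Sketch (assumption): check that the MPI library provides the thread level
    # a hybrid application needs before any threaded MPI calls are made.
    from mpi4py import MPI

    required = MPI.THREAD_MULTIPLE      # needed if several threads issue MPI calls
    provided = MPI.Query_thread()
    if provided < required:
        if MPI.COMM_WORLD.Get_rank() == 0:
            print("error: MPI thread support too low for this MPI+OpenMP code")
        MPI.COMM_WORLD.Abort(1)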
96

Pointer analysis and separation logic

Sims, Elodie-Jane January 1900 (has links)
Doctor of Philosophy / Department of Computing and Information Sciences / David A. Schmidt / We are interested in modular static analysis to analyze software automatically. We focus on programs with data structures, and in particular, programs with pointers. The final goal is to find errors in a program (problems of dereferencing, aliasing, etc.) or to prove that a program is correct (with respect to those problems) in an automatic way. Ishtiaq, Pym, O'Hearn and Reynolds have recently developed separation logics, which are Hoare logics with an assertion and predicate language that makes it possible to prove the correctness of programs that manipulate pointers. The semantics of the logic's triples ({P}C{P'}) is defined by predicate transformers in the style of weakest preconditions. We expressed and proved the correctness of those weakest preconditions (wlp) and strongest postconditions (sp), in particular in the case of while loops. The advance over existing work is that wlp and sp are defined for any formula, while previously existing rules had syntactic restrictions. We added fixpoints to the logic as well as a postponed substitution, which together allow recursive formulae to be expressed. We expressed wlp and sp in the extended logic and proved their correctness. The postponed substitution is directly useful for expressing recursive formulae. For example, [equations removed, still appears in abstract] describes the set of memory states in which x points to a list of integers. Next, the goal was to use separation logic with fixpoints as an interface language for pointer analyses. That is, translating the domains of those analyses into formulae of the logic (and conversely) and proving the correctness of the translations. One might also use the translations to prove the correctness of the pointer analysis itself. We illustrate this approach with a simple pointer-partitioning analysis. We translate the logic formulae into an abstract language we designed, which allows us to describe the type of values stored in memory (nil, integers, booleans, pointers to pairs of some types, etc.) as well as the aliasing and non-aliasing relations between variables and locations in memory. The main contribution is the definition of the abstract language and its semantics in a concrete domain which is the same as the one used for the semantics of formulae. In particular, the semantics of the auxiliary variables, which is usually a question of implementation, is explicit in our language and its semantics. The abstract language is a partially reduced product of several subdomains and can be parametrised with existing numerical domains. We created a subdomain which is a tabular data structure to cope with the imprecision resulting from not having sets of graphs. We expressed and proved the correctness of the translations of formulae into this abstract language.
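For readers unfamiliar with such recursive formulae, a textbook-style separation-logic definition of "x points to a list of integers", written with a least fixpoint, reads as follows (given purely as an illustration; it is not the exact formula removed from the abstract):

    \mathit{list} \;\triangleq\; \mu X.\ \lambda x.\;
      (x = \mathrm{nil} \wedge \mathsf{emp})
      \;\vee\;
      (\exists v\, y.\; x \mapsto (v, y) \ast X(y))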
97

Security vulnerability verification through contract-based assertion monitoring at runtime

Hoole, Alexander M. 08 January 2018 (has links)
In this dissertation we seek to identify ways in which the systems development life cycle (SDLC) can be augmented with improved software engineering practices to measurably address security concerns that have arisen relating to security vulnerability defects in software. We propose a general model for identifying potential vulnerabilities (weaknesses) and use runtime monitoring to verify their reachability and exploitability during development and testing, thereby reducing security risk in delivered products. We propose a form of contract for our monitoring framework that is used to specify the environmental and system security conditions necessary for the generation of probes that monitor security assertions during runtime to verify suspected vulnerabilities. Our assertion-based security monitoring framework, based on contracts and probes and known as the Contract-Based Security Assertion Monitoring Framework (CB_SAMF), can be employed for verifying and reacting to suspected vulnerabilities in the application and kernel layers of the Linux operating system. Our methodology for integrating CB_SAMF into the SDLC during development and testing to verify suspected vulnerabilities reduces human effort by allowing developers to focus on fixing verified vulnerabilities. Metrics for weighting and prioritizing potential vulnerability categories, for establishing confidence in them, and for assessing their detectability are also introduced. These metrics and weighting approaches identify deficiencies in security assurance programs/products and also help focus resources towards a class of suspected vulnerabilities, or a detection method, that may presently be outside the requirements and priorities of the system. Our empirical evaluation demonstrates the effectiveness of using contracts to verify the exploitability of suspected vulnerabilities across five input-validation-related vulnerability types, combining our contracts with existing static analysis detection mechanisms, and measurably improving security assurance processes/products used in an enhanced SDLC. As a result of this evaluation we introduced two new security assurance test suites, through collaboration with the National Institute of Standards and Technology (NIST), replacing existing test suites. The new and revised test cases provide numerous improvements to consistency, accuracy, and preciseness, along with enhanced test case metadata to aid researchers using the Software Assurance Reference Dataset (SARD). / Graduate
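As a loose illustration of contract-driven runtime probes (the names and structure below are assumptions and do not reproduce CB_SAMF), a security assertion can be attached to a function and checked while tests exercise the suspected vulnerability:

    # Illustrative sketch only: a contract-style probe checking an input-validation
    # assertion at runtime; a violation would mark the suspected vulnerability as verified.
    import functools

    def probe(assertion, message):
        def decorate(func):
            @functools.wraps(func)
            def wrapper(*args, **kwargs):
                if not assertion(*args, **kwargs):
                    raise AssertionError("security assertion violated: " + message)
                return func(*args, **kwargs)
            return wrapper
        return decorate

    @probe(lambda path: ".." not in path, "possible path traversal in read_file")
    def read_file(path):
        with open(path) as f:
            return f.read()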
98

Calcul par analyse intervalle de certificats de barrière pour les systèmes dynamiques hybrides / Computation of barrier certificates for dynamical hybrids systems using interval analysis

Djaballah, Adel 03 July 2017 (has links)
This thesis addresses the problem of proving the safety of systems described by non-linear dynamical models and hybrid dynamical models. A system is said to be safe if no trajectory of its state reaches an unsafe region. Proving the safety of a system by explicitly computing all its trajectories when its dynamics are non-linear, or when its behavior is described by a hybrid model with non-linear dynamics, remains a challenging task. This thesis considers the barrier function approach to prove the safety of a system. A barrier function, when it exists, partitions the state space and isolates the trajectories of the system starting from any possible initial value of the state from the unsafe part of the state space. The set of constraints that have to be satisfied by a barrier function is usually non-convex, which makes the search for satisfying barrier functions hard. Previously, only polynomial barrier functions were considered, and only for systems with polynomial dynamics. This thesis considers relatively general dynamical systems with generic non-linear barrier functions. The solutions presented are based on template barrier functions, constraint satisfaction problems, and interval analysis. The first part of the thesis focuses on non-linear dynamical systems. The barrier function design problem is formulated as a constraint satisfaction problem that can be solved using tools from interval analysis. This formulation allows one to prove the safety of a non-linear dynamical system by finding the parameters of a template barrier function such that all constraints are satisfied, using the FPS-CSC algorithm, which has been adapted and supplemented with contractors to improve its efficiency. The second part of the thesis is dedicated to the design of barrier functions for systems described by hybrid dynamical models. Safety properties have to be proven during the continuous-time evolution of the system, but also during transitions. This leads to additional constraints that have to be satisfied by candidate barrier functions. Solving all the constraints simultaneously to find all the barrier functions is usually computationally intractable. In the proposed approach, the algorithm explores all the locations sequentially. Transition constraints are introduced progressively between the already explored locations. Backtracking to a previous location is considered when transition constraints are not satisfied. The efficiency of the proposed approaches has been compared with state-of-the-art solutions.
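For concreteness, one standard formulation of the constraints a barrier function B must satisfy for dynamics \dot{x} = f(x), an initial set X_0 and an unsafe set X_u (a common textbook form; the thesis works with template barrier functions and checks such constraints by interval analysis) is:

    B(x) \le 0 \quad \forall x \in \mathcal{X}_0, \qquad
    B(x) > 0 \quad \forall x \in \mathcal{X}_u, \qquad
    \langle \nabla B(x), f(x) \rangle \le 0 \quad \text{whenever } B(x) = 0.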
99

A Linter for Static Analysis of MiniZinc Models

Rimskog, Erik January 2021 (has links)
MiniZinc is a modelling language for constraint satisfaction and optimisation problems. It can be used to solve difficult problems by modelling them declaratively and handing them to a generic solver. A linter, a tool for static analysis, is implemented for MiniZinc to provide analyses for improving models. Suggesting rewrites that will speed up solving, removing unnecessary constructs, and pointing out potential problems are examples of the analyses this tool provides. A method for finding points of interest in abstract syntax trees (parsed models) is designed and implemented. The linter is tested and evaluated against models in the MiniZinc Benchmarks, a collection of models used to benchmark solvers. The output from running the linter on one of these models is inspected and evaluated more closely. The suggestions were correct and made the model simpler, but, unfortunately, there was no noticeable impact on solving speed.
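As a toy illustration of the kind of pattern search performed on an abstract syntax tree (a hypothetical Python sketch, unrelated to the linter's actual implementation), a lint rule can walk the parsed model and flag a rewritable expression:

    # Toy sketch: flag "not (x != y)", which can be rewritten as "x = y".
    def walk(node):
        yield node
        for child in node.get("args", []):
            yield from walk(child)

    def lint(model_ast):
        for node in walk(model_ast):
            if node.get("op") == "not" and node["args"][0].get("op") == "!=":
                yield "simplify: 'not (x != y)' can be written as 'x = y'"

    ast = {"op": "not", "args": [{"op": "!=", "args": [{"id": "a"}, {"id": "b"}]}]}
    print(list(lint(ast)))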
100

Combating Fault Tolerance Bugs in Cloud Systems

Chen, Haicheng 04 October 2021 (has links)
No description available.
