561

Résolution de contraintes sur les flottants dédiée à la vérification de programmes / Constraint solver over floating-point numbers designed for program verification

Belaid, Mohammed 04 December 2013 (has links)
The verification of programs involving floating-point computations is an important step in the development of critical software. Computations over floating-point numbers are generally imprecise and may, in some cases, diverge significantly from the result expected over the real numbers. The aim of this thesis is to design a constraint solver over floating-point numbers dedicated to program verification. We introduce a new method for solving constraints over the floats, based on over-approximating floating-point constraints with constraints over the real numbers. This over-approximation is conservative: it loses no floating-point solutions. The resulting constraints are then solved with a constraint solver over the reals. We propose a filtering algorithm for floating-point domains that exploits this over-approximation using linear programming techniques, together with heuristic search methods for finding floating-point solutions. The approach also makes it possible to compare the behaviour of a program over the floats with its specification over the reals. These methods have been implemented and evaluated on a set of programs with floating-point computations, producing counter-examples in which the floating-point execution differs from the real-number specification.
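To illustrate the kind of divergence the thesis targets, and the idea of conservatively relaxing a floating-point constraint into a real-number constraint, here is a minimal Python sketch (not taken from the thesis; the unit-roundoff model and the example constraint are assumptions, and subnormal/overflow cases are ignored):

```python
U = 2.0 ** -53  # unit roundoff of IEEE-754 binary64, round-to-nearest

# Over the reals, (0.1 + 0.2) - 0.3 is exactly 0; over the floats it is not.
print((0.1 + 0.2) - 0.3)  # ~5.55e-17: a spurious non-zero residue

def relaxed(x: float, y: float, c: float) -> bool:
    """Real-number relaxation of the float constraint  fl(x + y) <= c.

    Using the standard model fl(x + y) = (x + y) * (1 + e) with |e| <= U
    (subnormals and overflow ignored), fl(x + y) >= (x + y) - U * |x + y|,
    so every float solution of fl(x + y) <= c also satisfies this relaxation.
    A real solver handles the |x + y| term by splitting into two linear cases.
    """
    s = x + y  # evaluated in floats here only for demonstration purposes
    return s - U * abs(s) <= c

# Example: fl(0.5 + 0.25) = 0.75 exactly, so both the float constraint
# and its real relaxation hold for this point.
print(relaxed(0.5, 0.25, 0.75))
```

Because the relaxation keeps every float solution, a real-number (e.g. linear programming) solver that proves the relaxation unsatisfiable over a domain also proves the original floating-point constraint unsatisfiable there.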
562

Prescriptive Safety-Checks through Automated Proofs for Control-Flow Integrity

Tan, Jiaqi 01 November 2016 (has links)
Embedded software today is pervasive: it can be found everywhere, from coffee makers and medical devices to cars and aircraft. Embedded software today is also open and connected to the Internet, exposing it to external attacks that can violate its Control-Flow Integrity (CFI). Control-Flow Integrity is an important safety property of software, which ensures that the behavior of the software is not inadvertently changed. A violation of CFI can cause unintended behaviors and can even lead to catastrophic incidents in safety-critical systems. This dissertation develops a two-part approach to CFI: (i) prescribing source-code safety-checks, which prevent the root causes of CFI violations and which programmers can insert themselves, and (ii) formally proving CFI for the machine code of programs with source-code safety-checks. First, our prescribed safety-checks, when applied, prevent the root causes of CFI violations, thereby enabling software to recover from attempted violations in a customizable way. In addition, our prescribed safety-checks are visible to programmers, empowering them to ensure that the behavior of their software is not inadvertently changed by the prescribed safety-checks. However, programmer-inserted safety-checks may be incomplete, so current techniques for proving CFI, which assume that safety-checks are complete, may not work. Second, this dissertation develops a logic approach that automates formal proofs of CFI for the machine code of software containing both source-code CFI safety-checks and system calls. We extend an existing trustworthy Hoare logic with new proof rules, proof tactics, and a novel proof-search algorithm, which exploit the principle of local reasoning for safety properties to automatically generate CFI proofs for the machine code of programs compiled with our prescribed source-code safety-checks. To the best of our knowledge, our approach to CFI is the first to combine programmer-visible source-code enforcement mechanisms for CFI (enabling programmers to customize them and to check that their software is not inadvertently changed) with machine-code proofs of CFI that can be automated, and it does not require a trusted or verified compiler to ensure that the proven properties hold in machine code. We evaluate our CFI approach on realistic embedded software: the MiBench and WCET benchmarks, implementations of common file utilities, and programs interfacing with hardware inputs and outputs on the Raspberry Pi single-board computer. The variety of our target programs, and our ability to support useful features such as file and hardware inputs and outputs, demonstrate the wide applicability of our approach.
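As a rough illustration of a programmer-visible, prescribed safety-check with a customizable recovery action, consider the following Python sketch of a command dispatcher (all names are hypothetical; the dissertation itself targets C-like embedded code and machine-code proofs, not Python):

```python
# Hypothetical handler table for an embedded-style command dispatcher.
def handle_read(arg):
    print("read", arg)

def handle_write(arg):
    print("write", arg)

HANDLERS = [handle_read, handle_write]   # the only intended indirect-call targets

def recover(index, arg):
    """Customizable recovery action taken when the safety-check fires,
    instead of letting a corrupted index redirect control flow."""
    print(f"rejected out-of-range handler index {index!r}")

def dispatch(index, arg):
    # Prescribed, programmer-visible safety-check: the indirect call is taken
    # only if the target is one of the intended handlers, so the root cause
    # of the would-be CFI violation (a corrupted index) is neutralised.
    if isinstance(index, int) and 0 <= index < len(HANDLERS):
        HANDLERS[index](arg)
    else:
        recover(index, arg)

dispatch(1, "sensor.log")   # legal handler
dispatch(7, "sensor.log")   # out-of-range index is caught and recovered from
```

The point of making the check explicit in the source is that the programmer can both customize the recovery behaviour and confirm that the intended handlers are unchanged.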
563

Dependence analysis for inferring information flow properties in Spark ADA programs

Thiagarajan, Hariharan January 1900 (has links)
Master of Science / Department of Computing and Information Sciences / John Hatcliff / With the increase in the development of safety- and security-critical systems, it is important to have more sophisticated methods for engineering such systems. It can be difficult to understand and verify critical properties of these systems because of their ever-growing size and complexity; even a small error in a complex system may result in casualties or significant monetary loss. Consequently, there is a rising demand for scalable and accurate techniques that enable faster development and verification of high-assurance systems. This thesis focuses on discovering dependencies between various parts of a system and leveraging that knowledge to infer information flow properties and to verify security policies specified for the system. The primary contribution of this thesis is a technique to build dependence graphs for languages that feature abstraction and refinement. Inter-procedural slicing and inter-procedural chopping are the techniques used to analyze the properties of the system statically. The approach outlined in this thesis provides a domain-specific language to query the information flow properties and to specify security policies for a critical system. The specified policies can then be verified using existing static analysis techniques. All of the above contributions are integrated with the development environment used to develop the critical system. The resulting software development tool helps programmers develop safety- and security-critical systems, and infer and verify their properties, in a single unified environment.
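To make the slicing and chopping idea concrete, here is a minimal, intra-procedural Python sketch of an information-flow query over a toy dependence graph (node names are hypothetical; the thesis handles inter-procedural analysis of Spark Ada programs with abstraction and refinement):

```python
from collections import defaultdict

# Toy program-dependence graph: an edge u -> v means "v depends on u"
# (data or control dependence). Node names are hypothetical.
deps = defaultdict(set)
for u, v in [("input_x", "calc"), ("calc", "output_y"), ("key", "calc")]:
    deps[u].add(v)

def reachable(graph, start):
    """All nodes reachable from `start` in the given adjacency map."""
    seen, stack = set(), [start]
    while stack:
        n = stack.pop()
        if n not in seen:
            seen.add(n)
            stack.extend(graph[n])
    return seen

def chop(source, sink):
    """Nodes on some dependence path from source to sink: the part of the
    program through which information can flow between the two points."""
    forward = reachable(deps, source)
    backward_graph = defaultdict(set)
    for u, vs in deps.items():
        for v in vs:
            backward_graph[v].add(u)
    backward = reachable(backward_graph, sink)
    return forward & backward

# A simple information-flow query: does the secret 'key' influence 'output_y'?
print(chop("key", "output_y"))   # non-empty => a potential flow to verify or justify
```

A backward slice is simply the backward-reachable set from a program point; a chop intersects a forward slice from a source with a backward slice from a sink, which is the shape of query a security policy over flows typically needs.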
564

Enhancing the Verification-Driven Learning Model for Data Structures with Visualization

Kondeti, Yashwanth Reddy 04 August 2011 (has links)
The thesis aims at teaching various data structure algorithms using the Visualization Learning tool. The main objective of the work is to provide a learning opportunity for novice computer science students to gain broader exposure to data structure programming. The visualization learning tool is based on the Verification-Driven Learning model developed for software engineering, and it serves as a platform for demonstrating visualizations of various data structure algorithms. All the visualizations are designed to emphasize the important operational features of the data structures. The learning tool draws students into learning data structures through Learning Cases, which have been carefully designed to systematically implant bugs in a properly functioning visualization. Students are assigned the task of analyzing the code and identifying the bugs through quizzes. This provides students with a challenging hands-on learning experience that complements their textbook knowledge, and it serves as a significant foundation for pursuing future courses in data structures.
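A Learning Case of the kind described above might look like the following Python sketch, in which a correct stack operation is paired with a version containing a planted bug for students to identify (illustrative only, not taken from the tool):

```python
class Stack:
    def __init__(self):
        self.items = []

    def push(self, x):
        self.items.append(x)          # correct reference implementation

    def pop(self):
        return self.items.pop()

class StackLearningCase(Stack):
    """Hypothetical Learning Case: the same operation with a planted bug
    for students to spot (insertion at the bottom instead of the top)."""
    def push(self, x):
        self.items.insert(0, x)       # BUG: breaks LIFO ordering

# Quiz-style check: the buggy version violates the LIFO property.
good, buggy = Stack(), StackLearningCase()
for s in (good, buggy):
    s.push(1)
    s.push(2)
print(good.pop(), buggy.pop())        # 2 vs 1 -> the planted bug is observable
```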
565

Fundamentální analýza měnového kurzu EUR/USD / Fundamental analysis of the exchange rate of EUR/USD

Ševčík, Václav January 2009 (has links)
The aim of this thesis is the empirical verification of the fundamental theory of exchange rate determination for the currency pair EUR/USD. The theoretical part is devoted to exchange rate theory, with emphasis on the importance of the EUR/USD currency pair, and to the major characteristics of the fundamental theory of exchange rate determination. Attention is also paid to methods of time-series analysis, which are used in the analytical part. The analytical part is devoted to an empirical verification of the underlying theories: econometric models are developed on the basis of these theories and then tested using linear regression and cointegration methods. The results of the models and their relevance are discussed in the conclusion.
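As an illustration of the kind of cointegration test used in such an analysis, the following Python sketch applies an Engle-Granger-style two-step procedure to synthetic data (the series and the driver variable are assumptions, not the thesis's actual models or data):

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.tsa.stattools import adfuller

# Hypothetical monthly series: log EUR/USD rate and a fundamental driver
# (e.g. an interest-rate or money-supply differential); replace with real data.
rng = np.random.default_rng(0)
fundamental = np.cumsum(rng.normal(size=200))             # I(1) regressor
eurusd = 0.5 * fundamental + rng.normal(scale=0.3, size=200)

# Step 1 (Engle-Granger): estimate the long-run relation by OLS.
ols = sm.OLS(eurusd, sm.add_constant(fundamental)).fit()
residuals = ols.resid

# Step 2: test the residuals for a unit root; stationary residuals indicate
# cointegration, i.e. the fundamental relation holds in the long run.
# (Strictly, Engle-Granger critical values should be used, so the standard
# ADF p-value below is indicative only.)
adf_stat, p_value, *_ = adfuller(residuals)
print(f"beta = {ols.params[1]:.3f}, ADF p-value = {p_value:.3f}")
```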
566

Safety verification of model based reinforcement learning controllers using reachability analysis

Akshita Gupta (7047728) 13 August 2019 (has links)
Reinforcement Learning (RL) is a data-driven technique which is finding increasing application in the development of controllers for sequential decision-making problems. Its wide adoption can be attributed to the fact that the development of these controllers is independent of knowledge of the system, so they can be used even when the environment dynamics are unknown. Model-based RL controllers explicitly model the system dynamics from the observed (training) data using a function approximator, and then use a path-planning algorithm to obtain the optimal control sequence. While these controllers have proven successful in simulations, the lack of strong safety guarantees in the presence of noise makes them ill-suited for deployment on hardware, especially in safety-critical systems. The proposed work aims at bridging this gap by providing a verification framework to evaluate the safety guarantees of a model-based RL controller. Our method builds upon reachability analysis to determine whether there is any action which can drive the system into a constrained (unsafe) region. Consequently, our method can provide a binary yes-or-no answer to whether the entire initial set of states is safe to propagate trajectories from in the presence of bounded noise.
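A minimal sketch of the reachability idea, assuming a linear one-step model with interval (box) over-approximation and bounded additive noise, might look like the following Python code (the matrices, bounds, and unsafe region are hypothetical; the thesis's actual method, which works on a learned model, is not reproduced here):

```python
import numpy as np

# Hypothetical learned one-step dynamics model x' = A x + B u + w, |w| <= w_max.
A = np.array([[1.0, 0.1], [0.0, 0.9]])
B = np.array([[0.0], [0.1]])
w_max = np.array([0.01, 0.01])

def step_box(lo, hi, u_lo, u_hi):
    """Propagate an axis-aligned box of states one step through the model,
    over all admissible actions and bounded noise (interval arithmetic)."""
    center, radius = (lo + hi) / 2, (hi - lo) / 2
    u_c, u_r = (u_lo + u_hi) / 2, (u_hi - u_lo) / 2
    new_c = A @ center + B @ u_c
    new_r = np.abs(A) @ radius + np.abs(B) @ u_r + w_max
    return new_c - new_r, new_c + new_r

def is_safe(lo, hi, unsafe_lo, unsafe_hi, horizon, u_lo, u_hi):
    """Binary verdict: True iff no reachable box intersects the unsafe region."""
    for _ in range(horizon):
        lo, hi = step_box(lo, hi, u_lo, u_hi)
        if np.all(hi >= unsafe_lo) and np.all(lo <= unsafe_hi):
            return False      # some trajectory may enter the unsafe set
    return True

print(is_safe(np.array([0.0, 0.0]), np.array([0.1, 0.1]),
              np.array([0.9, -1.0]), np.array([1.5, 1.0]),
              horizon=20, u_lo=np.array([-0.5]), u_hi=np.array([0.5])))
```

Because the boxes over-approximate the true reachable sets, a "safe" verdict is sound, while an intersection only signals a potential violation.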
567

Ensuring Network Designs Meet Performance Requirements under Failures

Yiyang Chang (6872636) 13 August 2019 (has links)
With the prevalence of web and cloud-based services, there is an ever-growing requirement on the underlying network infrastructure to ensure that business-critical traffic is continually serviced with acceptable performance. Networks must meet their performance requirements under failures. The global scale of cloud provider networks and the rapid evolution of these networks imply that failures are the norm in production networks today. Unplanned downtime can cost billions of dollars and cause catastrophic consequences. The thesis is motivated by these challenges and aims to provide a principled solution to certifying network performance under failures. Network performance certification is complicated, due to both the variety of ways a network can fail and the rich ways a network can respond to failures. The key contributions of this thesis are: (i) a general framework for robustly certifying the worst-case performance of a network across a given set of uncertain scenarios, whose key novelty is that it models the flexible network response enabled by recent emerging trends such as Software-Defined Networking; (ii) a toolkit which automates the key steps needed in robust certification, making it suitable for use by a network architect and enabling experimentation on a wide range of robust certification problems of practical interest; (iii) Slice, a general framework which efficiently classifies failure scenarios based on whether network performance is acceptable for those scenarios, and which allows reasoning about performance requirements that must be met over a given percentage of scenarios. We also show applications of our frameworks in synthesizing designs that are guaranteed to meet a performance goal over all, or a desired percentage, of a given set of scenarios. The thesis focuses on wide-area networks, but the approaches apply to data-center networks as well.
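The following Python sketch illustrates brute-force certification of worst-case link utilization over all single-link-failure scenarios with static shortest-path rerouting (topology and demand are hypothetical; the thesis's framework instead models flexible, SDN-style response and avoids naive enumeration):

```python
import itertools
import networkx as nx

# Hypothetical topology with link capacities and a single demand (in Gbps).
G = nx.Graph()
for u, v, cap in [("a", "b", 10), ("b", "c", 10), ("a", "c", 10),
                  ("a", "d", 10), ("d", "c", 10)]:
    G.add_edge(u, v, capacity=cap)
demands = [("a", "c", 8)]

def worst_case_utilization(graph, demands, max_failures=1):
    """Enumerate all scenarios with up to `max_failures` failed links,
    reroute each demand on a shortest surviving path, and report the
    worst link utilization across scenarios (inf if a demand is cut off)."""
    worst = 0.0
    for k in range(max_failures + 1):
        for failed in itertools.combinations(graph.edges, k):
            H = graph.copy()
            H.remove_edges_from(failed)
            load = {tuple(sorted(e)): 0.0 for e in H.edges}
            try:
                for s, t, vol in demands:
                    path = nx.shortest_path(H, s, t)
                    for e in zip(path, path[1:]):
                        load[tuple(sorted(e))] += vol
            except nx.NetworkXNoPath:
                return float("inf")
            util = max(load[tuple(sorted(e))] / H.edges[e]["capacity"]
                       for e in H.edges)
            worst = max(worst, util)
    return worst

# Certify the design: acceptable iff worst-case utilization stays below 1.0.
print(worst_case_utilization(G, demands, max_failures=1))
```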
568

Approaching real time dynamic signature verification from a systems and control perspective.

Gu, Yi 31 October 2006 (has links)
Student Number: 9901877H / MSc Dissertation / School of Electrical and Information Engineering / Faculty of Engineering and the Built Environment / algorithm. The origins of handwriting idiosyncrasies and habituation are explained using systems theory, and it is shown that the 2/3 power law governing biomechanical motion also applies to handwriting. This leads to the conclusion that it is possible to derive handwriting velocity profiles from a static image, and that a successful forgery of a signature is only possible if the forger is able to generate the signature using natural ballistic motion. It is also shown that a significant portion of the underlying dynamic system governing the generation of handwritten signatures can be inferred by deriving time-segmented transfer-function models of the x and y coordinate velocity profiles of a signature. The prototype algorithm consequently developed uses the x and y components of pen-tip velocity profiles (vx[n] and vy[n]) to create signature representations based on autoregression-with-exogenous-input (ARX) models. Verification is accomplished using a similarity measure based on the results of a k-step-ahead predictor and 5 complementary metrics. Using 350 signatures collected from 21 signers, the system's false acceptance rate (FAR) and false rejection rate (FRR) were 2.19% and 27.05% respectively. This high FRR is a result of measurement inadequacies, and it is believed that the algorithm's FRR is approximately 18%.
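A minimal Python sketch of the ARX idea, fitting a model to one velocity component driven by the other and scoring a signature by its prediction error, might look as follows (synthetic profiles and a one-step-ahead error are used here; the dissertation employs time-segmented transfer-function models, a k-step-ahead predictor, and five complementary metrics):

```python
import numpy as np

def fit_arx(y, u, na=2, nb=2):
    """Least-squares fit of an ARX model
    y[n] = a1*y[n-1] + ... + a_na*y[n-na] + b1*u[n-1] + ... + b_nb*u[n-nb]."""
    start = max(na, nb)
    rows = [np.concatenate((y[n - na:n][::-1], u[n - nb:n][::-1]))
            for n in range(start, len(y))]
    theta, *_ = np.linalg.lstsq(np.array(rows), y[start:], rcond=None)
    return theta

def one_step_error(theta, y, u, na=2, nb=2):
    """Normalised one-step-ahead prediction error of a model on a signature:
    a small error suggests the test signature shares the reference dynamics."""
    start = max(na, nb)
    preds = [np.concatenate((y[n - na:n][::-1], u[n - nb:n][::-1])) @ theta
             for n in range(start, len(y))]
    return np.linalg.norm(y[start:] - preds) / np.linalg.norm(y[start:])

# Hypothetical pen-tip velocity profiles: vx[n] (output) driven by vy[n] (input);
# a real system would use the enrolled signer's profiles and k-step prediction.
t = np.linspace(0, 2 * np.pi, 200)
vy, vx = np.sin(3 * t), 0.8 * np.sin(3 * t - 0.2)
theta = fit_arx(vx, vy)
print(one_step_error(theta, vx, vy))
```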
569

Système d'identification de personnes basé sur la rétine / Personal identification system based on retina

Chihaoui, Takwa 06 December 2018 (has links)
Our work lies in the field of retinal biometrics. The retina is the sensory layer of the eye; it exhibits a rich texture that is unique even between twins. These properties have made retinal biometrics an active research area, and numerous methods have been proposed for the various stages of a biometric pipeline, from preprocessing of the retinal image through its characterization and analysis, in order to identify and authenticate an individual. In this thesis we are interested in the study, design, development, and evaluation of a new biometric method based on the retina. Our first contribution is a method for analyzing healthy and pathological retinal images, followed by the selection of a region of interest around the optic disc. This method improves the quality of the retinal image and extracts the most stable region of interest of the retina, so as to maintain a satisfactory information density and ensure better recognition quality. Our second contribution is a new method for extracting local features based on the standard SIFT descriptor: it removes the non-informative interest points extracted by the standard SIFT descriptor, reducing the number of redundant interest points while maintaining the quality of the description. We validated the proposed biometric method on different databases comprising healthy and pathological images, with encouraging results. These results indicate that the proposed method correctly locates the retinal region of interest. In identification mode, a correct identification rate of approximately 99.8% is reached. In verification mode, we obtained an FRR of 0.12%, with FAR and EER both at 0%. A comparative study showed that our method is more discriminative than other state-of-the-art methods, especially those based on segmentation and extraction of the vascular tree.
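A simplified Python/OpenCV sketch of filtering out weak or redundant SIFT keypoints within the retinal region of interest might look as follows (the response threshold and spatial-separation criterion are assumptions, not the selection rule proposed in the thesis):

```python
import cv2
import numpy as np

def informative_keypoints(gray, response_thresh=0.02, min_dist=5.0):
    """Detect SIFT keypoints in a region of interest and drop weak or
    near-duplicate ones: a simplified stand-in for removing non-informative
    points (the criteria used here are assumptions)."""
    sift = cv2.SIFT_create()
    keypoints, descriptors = sift.detectAndCompute(gray, None)
    kept, kept_pts = [], []
    # Strongest responses first, then greedily enforce spatial separation.
    order = sorted(range(len(keypoints)), key=lambda i: -keypoints[i].response)
    for i in order:
        kp = keypoints[i]
        if kp.response < response_thresh:
            continue
        if all(np.hypot(kp.pt[0] - x, kp.pt[1] - y) >= min_dist for x, y in kept_pts):
            kept.append((kp, descriptors[i]))
            kept_pts.append(kp.pt)
    return kept

# Usage sketch: gray = cv2.imread("roi_around_optic_disc.png", cv2.IMREAD_GRAYSCALE)
# followed by descriptor matching between the enrolled and test retinas.
```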
570

Des spécifications en langage naturel aux spécifications formelles via une ontologie comme modèle pivot / From natural language specifications to formal specifications via an ontology as a pivot model

Sadoun, Driss 17 June 2014 (has links)
The main objective of system development is to address requirements; as such, success in its realization depends largely on the requirements specification phase, which aims to describe precisely and unambiguously all the characteristics of the system to be developed. Requirements specifications are the result of a needs analysis involving different parties (stakeholders). They are generally written in natural language (NL) to guarantee a wider understanding; however, since NL texts can contain semantic ambiguities, implicit information, or other inconsistencies, this can lead to diverse interpretations. It is therefore not easy to specify a complete and consistent set of requirements, and the specified requirements must be formally checked. Specifications written in NL are not considered formal and do not allow a direct application of formal verification methods, hence the need to transform NL requirements into formal specifications; this thesis is set in that context. The main difficulty of such a transformation lies in the size of the gap between NL requirements and formal specifications. The objective of this work is to propose an approach for the automatic verification of user requirements that are written in natural language and describe a system's expected behaviour. Our approach exploits the potential offered by a representation model based on a logical formalism. Our contribution has three main aspects: (1) an OWL-DL ontology based on description logics, used as a pivot representation model that links NL requirements to formal specifications; (2) an approach for instantiating the pivot ontology, based on an analysis driven by the ontology's semantics, which automatically transforms NL requirements into their conceptual representations; and (3) an approach exploiting the logical formalism of the ontology to automatically translate the pivot representation model into a formal specification language called Maude.
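As a toy illustration of using an ontology-style pivot representation to bridge NL requirements and a formal, Maude-like specification, consider the following Python sketch using rdflib (the vocabulary, the example requirement, and the emitted syntax are all hypothetical; the thesis uses a richer OWL-DL ontology and a semantics-driven NL analysis):

```python
from rdflib import Graph, Namespace, RDF

# Hypothetical mini-vocabulary for behavioural requirements.
REQ = Namespace("http://example.org/req#")
g = Graph()

# Conceptual representation of the NL requirement
# "When the alarm is raised, the controller stops the pump."
g.add((REQ.r1, RDF.type, REQ.Rule))
g.add((REQ.r1, REQ.trigger, REQ.alarm_raised))
g.add((REQ.r1, REQ.agent, REQ.controller))
g.add((REQ.r1, REQ.action, REQ.stop_pump))

def to_maude_like(graph):
    """Emit a Maude-style rewrite rule for each instantiated Rule individual
    (the concrete syntax below is illustrative, not the thesis's translation)."""
    rules = []
    for r in graph.subjects(RDF.type, REQ.Rule):
        trig = graph.value(r, REQ.trigger).split("#")[-1]
        act = graph.value(r, REQ.action).split("#")[-1]
        rules.append(f"rl [{r.split('#')[-1]}] : {trig} => {act} .")
    return "\n".join(rules)

print(to_maude_like(g))
```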
