  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
31

Multioperator Weighted Monadic Datalog

Stüber, Torsten 06 May 2011
In this thesis we will introduce multioperator weighted monadic datalog (mwmd), a formal model for specifying tree series, tree transformations, and tree languages. This model combines aspects of multioperator weighted tree automata (wmta), weighted monadic datalog (wmd), and monadic datalog tree transducers (mdtt). In order to develop a rich theory we will define multiple versions of semantics for mwmd and compare their expressiveness. We will study normal forms and decidability results of mwmd and show (by employing particular semantic domains) that the theory of mwmd subsumes the theory of both wmd and mdtt. We conclude this thesis by showing that mwmd even contain wmta as a syntactic subclass and present results concerning this subclass.
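The semiring-weighted view of tree automata that mwmd builds on can be illustrated with a small sketch. Everything below is an invented toy (the tree encoding, the (max, +) semiring, the rule format), not the mwmd formalism of the thesis:

```python
from itertools import product

# A tree is a tuple: ("symbol", child1, child2, ...)

def run_weights(tree, rules):
    """Return a map {state: best weight} over all runs on `tree`,
    computed bottom-up in the (max, +) semiring.

    `rules` maps (symbol, child_state_tuple) to a list of
    (target_state, weight) pairs.
    """
    symbol, children = tree[0], tree[1:]
    child_maps = [run_weights(c, rules) for c in children]
    result = {}
    for combo in product(*(m.items() for m in child_maps)):
        states = tuple(s for s, _ in combo)
        below = sum(w for _, w in combo)       # semiring "product" is +
        for target, weight in rules.get((symbol, states), []):
            cand = below + weight
            if target not in result or cand > result[target]:  # "sum" is max
                result[target] = cand
    return result

# Toy automaton: leaves "a" enter state "q" with weight 1,
# binary "f" over (q, q) stays in "q" with weight 2.
rules = {("a", ()): [("q", 1)], ("f", ("q", "q")): [("q", 2)]}
tree = ("f", ("a",), ("a",))
```

Running `run_weights(tree, rules)` combines the two leaf weights with the root rule, giving weight 4 in state "q".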
32

A formal language theory approach to music generation

Schulze, Walter 03 1900
Thesis (MSc (Mathematical Sciences)) -- University of Stellenbosch, 2010. / ENGLISH ABSTRACT: We investigate the suitability of applying some of the probabilistic and automata-theoretic ideas that have been extremely successful in the areas of speech and natural language processing to the area of musical style imitation. Using music written in a certain style as training data, parameters are calculated for (visible and hidden) Markov models (of mixed, higher, or first order) in order to capture the musical style of the training data in terms of mathematical models. These models are then used to imitate two-instrument music in the trained style.
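The training step the abstract describes can be sketched for the simplest (visible, first-order) case: count note-to-note transitions in the training data, then sample from the resulting distribution. The corpus, note names, and function names below are invented for illustration:

```python
import random
from collections import defaultdict

def train(sequences):
    """Estimate first-order transition probabilities from note sequences."""
    counts = defaultdict(lambda: defaultdict(int))
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            counts[a][b] += 1
    return {a: (list(nxt), [c / sum(nxt.values()) for c in nxt.values()])
            for a, nxt in counts.items()}

def generate(model, start, length, seed=0):
    """Sample a melody of up to `length` notes from the trained model."""
    rng = random.Random(seed)
    out = [start]
    while len(out) < length and out[-1] in model:
        notes, probs = model[out[-1]]
        out.append(rng.choices(notes, probs)[0])
    return out

# Invented two-phrase "corpus"; real training data would be actual scores.
corpus = [["C", "E", "G", "E", "C"], ["C", "E", "C", "G", "C"]]
model = train(corpus)
melody = generate(model, "C", 8)
```

Every transition in the generated melody is one observed in the corpus, which is the sense in which the model "imitates" the training style.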
33

Extensions of Presburger arithmetic and model checking one-counter automata

Lechner, Antonia January 2016
This thesis concerns decision procedures for fragments of linear arithmetic and their application to model-checking one-counter automata. The first part of this thesis covers the complexity of decision problems for different types of linear arithmetic, namely the existential subset of the first-order linear theory over the p-adic numbers and the existential subset of Presburger arithmetic with divisibility, with all integer constants and coefficients represented in binary. The most important result of this part is a new upper complexity bound of NEXPTIME for existential Presburger arithmetic with divisibility. The best bound that was known previously was 2NEXPTIME, which followed directly from the original proof of decidability of this theory by Lipshitz in 1976. Lipshitz also gave a proof of NP-hardness of the problem in 1981. Our result is the first improvement of the bound since this original description of a decision procedure. Another new result, which is both an important building block in establishing our new upper complexity bound for existential Presburger arithmetic with divisibility and an interesting result in its own right, is that the decision problem for the existential linear theory of the p-adic numbers is in the Counting Hierarchy CH, and thus within PSPACE. The precise complexity of this problem was stated as open by Weispfenning in 1988, who showed that it is in EXPTIME and NP-hard. The second part of this thesis covers two problems concerning one-counter automata. The first problem is an LTL synthesis problem on one-counter automata with integer-valued and parameterised updates and with equality tests. The decidability of this problem was stated as open by Göller et al. in 2010. We give a reduction of this problem to the decision problem of a subset of Presburger arithmetic with divisibility with one quantifier alternation and a restriction on existentially quantified variables.
A proof of decidability of this theory is currently under review. The final result of this thesis concerns a type of one-counter automata that differs from the previous one in that it allows parameters only on tests, not actions, and it includes both equality and disequality tests on counter values. The decidability of the basic reachability problem on such one-counter automata was stated as an open problem by Demri and Sangnier in 2010. We show that this problem is decidable by a reduction to the decision problem for Presburger arithmetic.
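The reductions to Presburger arithmetic are beyond a short snippet, but the underlying object, a one-counter automaton with integer updates and (dis)equality tests, is easy to state. The sketch below decides reachability by bounded breadth-first search over configurations, a heuristic semi-test under an assumed counter bound, not the decision procedure of the thesis:

```python
from collections import deque

def reachable(transitions, start, target, bound=10_000):
    """Bounded BFS over (state, counter) configurations.

    `transitions` maps a state to a list of (test, update, next_state),
    where test is None, ('=', k), or ('!=', k) on the current counter.
    The counter must stay within [0, bound], an assumed cutoff."""
    def ok(test, c):
        if test is None:
            return True
        op, k = test
        return c == k if op == '=' else c != k

    seen = {(start, 0)}
    queue = deque(seen)
    while queue:
        state, c = queue.popleft()
        if state == target:
            return True
        for test, update, nxt in transitions.get(state, []):
            c2 = c + update
            if ok(test, c) and 0 <= c2 <= bound and (nxt, c2) not in seen:
                seen.add((nxt, c2))
                queue.append((nxt, c2))
    return False

# Count up to 3, then take an equality-guarded edge to the target.
trans = {"s0": [(None, 1, "s0"), (("=", 3), 0, "s1")]}
```

Unlike the Presburger-based procedures above, this search only refutes reachability up to the chosen bound, which is exactly why a genuine decision procedure is needed.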
34

Algorithmic verification problems in automata-theoretic settings

Bundala, Daniel January 2014
Problems in formal verification are often stated in terms of finite automata and extensions thereof. In this thesis we investigate several such algorithmic problems. In the first part of the thesis we develop a theory of completeness thresholds in Bounded Model Checking. A completeness threshold for a given model M and a specification φ is a bound k such that, if no counterexample to φ of length k or less can be found in M, then M in fact satisfies φ. We settle a problem of Kroening et al. [KOS+11] in the affirmative, by showing that the linearity problem for both regular and ω-regular specifications (provided as finite automata and Büchi automata respectively) is PSPACE-complete. Moreover, we establish the following dichotomies: for regular specifications, completeness thresholds are either linear or exponential, whereas for ω-regular specifications, completeness thresholds are either linear or at least quadratic in the recurrence diameter of the model under consideration. Given a formula in a temporal logic such as LTL or MTL, a fundamental problem underpinning automata-based model checking is the complexity of evaluating the formula on a given finite word. For LTL, the complexity of this task was recently shown to be in NC [KF09]. In the second part of the thesis we present an NC algorithm for MTL, a quantitative (or metric) extension of LTL, and give an AC¹ algorithm for UTL, the unary fragment of LTL. We then establish a connection between LTL path checking and planar circuits which, among other things, implies that the complexity of LTL path checking depends on the Boolean connectives allowed: adding Boolean exclusive or yields a temporal logic with a P-complete path-checking problem. In the third part of the thesis we study the decidability of the reachability problem for parametric timed automata. The problem was introduced over 20 years ago by Alur, Henzinger, and Vardi [AHV93].
It is known that for three or more parametric clocks the problem is undecidable. We translate the problem to reachability questions in certain extensions of parametric one-counter machines. By further reducing to satisfiability in Presburger arithmetic with divisibility, we obtain decidability results for several classes of parametric one-counter machines. As a corollary, we show that, in the case of a single parametric clock (with arbitrarily many nonparametric clocks), the reachability problem is NEXP-complete, improving on the nonelementary decision procedure of Alur et al. The case of two parametric clocks is open; here, we show that reachability is decidable for automata with a single parameter.
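The path-checking problem discussed above is easy to state concretely: evaluate an LTL formula on one finite word. The naive recursive evaluator below runs in polynomial time; the thesis's NC algorithms are far more involved, and the formula encoding here is an assumption:

```python
def holds(f, word, i=0):
    """Evaluate LTL formula f at position i of a finite word.

    A word is a list of sets of atomic propositions; formulas are
    tuples: ('ap', p), ('not', f), ('and', f, g), ('X', f), ('U', f, g).
    Positions past the end of the word satisfy nothing (a common
    finite-word convention, assumed here)."""
    if i >= len(word):
        return False
    kind = f[0]
    if kind == 'ap':
        return f[1] in word[i]
    if kind == 'not':
        return not holds(f[1], word, i)
    if kind == 'and':
        return holds(f[1], word, i) and holds(f[2], word, i)
    if kind == 'X':
        return holds(f[1], word, i + 1)
    if kind == 'U':
        # f1 U f2: f2 holds at some j >= i, with f1 holding before it
        return any(holds(f[2], word, j) and
                   all(holds(f[1], word, k) for k in range(i, j))
                   for j in range(i, len(word)))
    raise ValueError(f"unknown connective {kind}")

word = [{'p'}, {'p'}, {'q'}]            # finite word over sets of atoms
phi = ('U', ('ap', 'p'), ('ap', 'q'))   # "p until q"
```

The 'U' case is the source of the naive algorithm's quadratic blow-up, and it is precisely such cases that the NC constructions parallelize.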
35

Sistemas dinâmicos de eventos discretos com aplicação ao fluxo geodésico em superfícies hiperbólicas / Discrete event dynamical systems with application to the geodesic flow on hyperbolic surfaces

Chaves, Daniel Pedro Bezerra 12 May 2011
Advisor: Reginaldo Palazzo Júnior / Doctoral thesis - Universidade Estadual de Campinas, Faculdade de Engenharia Elétrica e de Computação / Abstract: In this work we present methods for a combinatorial description of the geodesic flow on a compact hyperbolic surface, with the intent of identifying how the topological parameters of the surface may be associated with discrete sequences. This approach allows us to conjugate the topological and combinatorial properties of a surface with concepts of information theory and coding. We determine the intrinsic topological property of complete and time-invariant discrete dynamical systems whose trajectories are bi-infinite sequences over a finite alphabet; the same structure is generated by the geodesic flow coding methods, and we call it a shift space. We show how a shift space can be characterized globally by its irreducible forbidden set and locally by constraint sets, and how both can be obtained through partial order relations. As a consequence of these results, some constructions to represent shift spaces are proposed. Methods for coding source sequences on hyperbolic surfaces are also proposed, based on the T-piecewise and common-sets relations that exist between the arithmetic and geometric coding methods. We conclude by specifying a construction procedure for presentations of arithmetic codes that is related to the topological and combinatorial properties of the hyperbolic surface. / Doctorate in Electrical Engineering (Telecommunications and Telematics)
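The "forbidden set" characterization of shift spaces has a simple finite analogue: a word is admissible iff it avoids every forbidden factor. The alphabet and forbidden set below are standard textbook examples, not taken from the geodesic-flow coding of the thesis:

```python
def admissible(word, forbidden):
    """A word is admissible in the shift space defined by `forbidden`
    iff it contains none of the forbidden factors as a substring."""
    return not any(f in word for f in forbidden)

# Golden-mean shift: binary sequences with no two consecutive 1s.
forbidden = {"11"}
```

For a shift of finite type such as this one, the forbidden set is finite, which is what makes automaton-based representations of the admissible sequences possible.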
36

Raisonnement automatisé pour la logique de séparation avec des définitions inductives / Automated reasoning in separation logic with inductive definitions

Serban, Cristina 31 May 2018
The main contribution of this thesis is a sound and complete proof system for entailments between inductive predicates, which are frequently encountered when verifying programs that work with dynamically allocated recursive data structures. We introduce a generalized proof system for first-order logic and then adapt it to separation logic, a framework that addresses many of the difficulties posed by reasoning about dynamically allocated heaps. Soundness and completeness are ensured through four semantic restrictions, and we also propose a proof-search semi-algorithm that becomes a decision procedure for the entailment problem when the semantic restrictions hold. This higher-order reasoning about entailments requires first-order decision procedures for the underlying logic when applying inference rules and during proof search. Thus, we provide two decision procedures for separation logic, covering the quantifier-free and the Exists*Forall*-quantified fragments, which were integrated into the open-source, DPLL(T)-based SMT solver CVC4. Finally, we also give an implementation of our proof system for separation logic, which uses these decision procedures. Given some inductive predicate definitions and an entailment query as input, a warning is issued when one or more semantic restrictions are violated. If the entailment is found to be valid, the output is a proof. Otherwise, one or more counterexamples are provided.
37

Advancements in Dependability Analysis of Safety-Critical Systems : Addressing Specification Formulation and Verification Challenges / Framsteg inom tillförlitlighetsanalys av säkerhetskritiska system : Utmaningar inom specifikationsformulering och verifiering

Yu, Zelin January 2023
Safety-critical systems have garnered increasing attention, particularly regarding their dependability analysis. Modern systems of this kind comprise numerous components, so it is crucial to verify that the lower-level components adhere to their specifications in order to ensure the overall system's compliance with its top-level specification. However, two issues arise in this verification process. Firstly, many industrial applications lack lower-level natural-language specifications for their components, relying solely on top-level specifications. Secondly, many current verification algorithms need to explore the continuous-time evolution of the behavioral combinations of these components, and the number of combinations to explore rises exponentially with the number of components. To address these challenges, this thesis makes two contributions. Firstly, it introduces a novel method that leverages the structure of redundancy systems to create natural-language specifications for components, derived from a top-level specification. This approach facilitates a more efficient decomposition of the top-level specification, allowing for greater ease in handling component behaviors. Secondly, the proposed method is successfully applied to Scania's brake system, leading to the decomposition of its top-level specification, which is then checked with an existing verification algorithm. The proposed method effectively addresses the previously mentioned issue of exponential growth in component behavior combinations: in the case of the Scania brake system, the number of combinations is reduced from 27 to 13.
38

La coordination dans les grammaires d'interaction / Coordination in interaction grammars

Le Roux, Joseph 17 October 2007
This thesis presents a model of the main syntactic aspects of coordination, using Guy Perrier's Interaction Grammars as the target formalism. Interaction Grammars make it possible to explicitly define the valencies of conjuncts, and this is precisely the notion on which our model is based. We also present the work around this model that enabled us to provide a realistic implementation: lexicalized grammar development (using our tool XMG), lexical disambiguation based on automata intersection, and parsing.
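Lexical disambiguation by automata intersection rests on the classical product construction, sketched here for deterministic automata. The encoding is generic and the example automata are invented, not XMG's actual representation:

```python
def intersect(d1, d2):
    """Product construction: accept exactly the words both DFAs accept.

    Each DFA is (start, accepting_states, delta) with delta a dict
    mapping (state, symbol) -> state; missing entries reject."""
    (s1, a1, t1), (s2, a2, t2) = d1, d2

    def accepts(word):
        p, q = s1, s2
        for ch in word:
            if (p, ch) not in t1 or (q, ch) not in t2:
                return False
            # Advance both automata in lockstep: the product state is (p, q).
            p, q = t1[(p, ch)], t2[(q, ch)]
        return p in a1 and q in a2

    return accepts

# D1: words over {a, b} ending in 'a'.  D2: words of even length.
d1 = (0, {1}, {(0, 'a'): 1, (0, 'b'): 0, (1, 'a'): 1, (1, 'b'): 0})
d2 = (0, {0}, {(0, 'a'): 1, (0, 'b'): 1, (1, 'a'): 0, (1, 'b'): 0})
both = intersect(d1, d2)
```

Intersecting the automaton of lexically possible analyses with the automaton of grammatically compatible ones keeps only readings allowed by both, which is the filtering idea the abstract refers to.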
39

Ambiguity Detection for Programming Language Grammars

Basten, Bas 15 December 2011
Context-free grammars are the most suitable and most widely used method for describing the syntax of programming languages. They can be used to generate parsers, which transform a piece of source code into a tree-shaped representation of the code's syntactic structure. These parse trees can then be used for further processing or analysis of the source text. In this sense, grammars form the basis of many engineering and reverse engineering applications, like compilers, interpreters and tools for software analysis and transformation. Unfortunately, context-free grammars have the undesirable property that they can be ambiguous, which can seriously hamper their applicability. A grammar is ambiguous if at least one sentence in its language has more than one valid parse tree. Since the parse tree of a sentence is often used to infer its semantics, an ambiguous sentence can have multiple meanings. For programming languages this is almost always unintended. Ambiguity can therefore be seen as a grammar bug. A specific category of context-free grammars that is particularly sensitive to ambiguity is that of character-level grammars, which are used to generate scannerless parsers. Unlike traditional token-based grammars, character-level grammars include the full lexical definition of their language. This has the advantage that a language can be specified in a single formalism, and that no separate lexer or scanner phase is necessary in the parser. However, the absence of a scanner does require some additional lexical disambiguation. Character-level grammars can therefore be annotated with special disambiguation declarations to specify which parse trees to discard in case of ambiguity. Unfortunately, it is very hard to determine whether all ambiguities have been covered. The task of searching for ambiguities in a grammar is very complex and time consuming, and is therefore best automated.
Since the invention of context-free grammars, several ambiguity detection methods have been developed to this end. However, the ambiguity problem for context-free grammars is undecidable in general, so the perfect detection method cannot exist. This implies a trade-off between accuracy and termination. Methods that apply exhaustive searching are able to correctly find all ambiguities, but they might never terminate. On the other hand, approximative search techniques do always produce an ambiguity report, but these might contain false positives or false negatives. Nevertheless, the fact that every method has flaws does not mean that ambiguity detection cannot be useful in practice. This thesis investigates ambiguity detection with the aim of checking grammars for programming languages. The challenge is to improve upon the state-of-the-art, by finding accurate enough methods that scale to realistic grammars. First we evaluate existing methods with a set of criteria for practical usability. Then we present various improvements to ambiguity detection in the areas of accuracy, performance and report quality. The main contributions of this thesis are two novel techniques. The first is an ambiguity detection method that applies both exhaustive and approximative searching, called AMBIDEXTER. The key ingredient of AMBIDEXTER is a grammar filtering technique that can remove harmless production rules from a grammar. A production rule is harmless if it does not contribute to any ambiguity in the grammar. Any found harmless rules can therefore safely be removed. This results in a smaller grammar that still contains the same ambiguities as the original one. However, it can now be searched with exhaustive techniques in less time. The grammar filtering technique is formally proven correct, and experimentally validated.
A prototype implementation is applied to a series of programming language grammars, and the performance of exhaustive detection methods is measured before and after filtering. The results show that a small investment in filtering time can substantially reduce the run-time of exhaustive searching, sometimes by several orders of magnitude. After this evaluation on token-based grammars, the grammar filtering technique is extended for use with character-level grammars. The extensions deal with the increased complexity of these grammars, as well as their disambiguation declarations. This enables the detection of productions that are harmless due to disambiguation. The extensions are experimentally validated on another set of programming language grammars from practice, with similar results as before. Measurements show that, even though character-level grammars are more expensive to filter, the investment is still very worthwhile. Exhaustive search times were again reduced substantially. The second main contribution of this thesis is DR. AMBIGUITY, an expert system to help grammar developers understand and solve found ambiguities. If applied to an ambiguous sentence, DR. AMBIGUITY analyzes the causes of the ambiguity and proposes a number of applicable solutions. A prototype implementation is presented and evaluated with a mature Java grammar. After removing disambiguation declarations from the grammar, we analyze sentences that have become ambiguous by this removal. The results show that in all cases the removed filter is proposed by DR. AMBIGUITY as a possible cure for the ambiguity. Concluding, this thesis improves ambiguity detection with two novel methods. The first is the ambiguity detection method AMBIDEXTER, which applies grammar filtering to substantially speed up exhaustive searching. The second is the expert system DR. AMBIGUITY, which automatically analyzes found ambiguities and proposes applicable cures.
The results obtained with both methods show that automatic ambiguity detection is now ready for realistic programming language grammars.
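The exhaustive side of ambiguity detection can be illustrated by counting parse trees of short sentences for a toy grammar; any sentence with more than one tree witnesses ambiguity. The encoding below is an invented sketch for epsilon-free grammars, and its brute-force recursion is exactly the kind of search that grammar filtering aims to make affordable:

```python
def count_trees(grammar, sym, s):
    """Number of parse trees deriving string s from symbol `sym`.

    Assumes an epsilon-free grammar; terminals are symbols that do
    not appear as keys of the grammar dict."""
    if sym not in grammar:
        return 1 if s == sym else 0
    return sum(count_seq(grammar, rhs, s) for rhs in grammar[sym])

def count_seq(grammar, syms, s):
    """Number of ways a sequence of symbols derives string s."""
    if not syms:
        return 1 if s == "" else 0
    head, tail = syms[0], syms[1:]
    hi = len(s) - len(tail)   # every later symbol needs at least one char
    return sum(count_trees(grammar, head, s[:i]) *
               count_seq(grammar, tail, s[i:])
               for i in range(1, hi + 1))

# Classic ambiguous grammar: E -> E + E | a
grammar = {"E": [["E", "+", "E"], ["a"]]}
```

Here "a+a+a" has two trees (left- and right-nested sums), so an exhaustive detector enumerating sentences by length would flag this grammar after a three-token sentence.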
40

Vérification semi-automatique de primitives cryptographiques / Semi-automatic verification of cryptographic primitives

Heraud, Sylvain 12 March 2012
CertiCrypt is a library for verifying the exact security of cryptographic primitives in the Coq proof assistant. CertiCrypt implements the game-based proof approach and draws on many domains, including probability, complexity, algebra, programming language semantics, and program optimization. In this thesis we present two examples of using EasyCrypt: the Hashed ElGamal encryption scheme and zero-knowledge protocols. These examples, together with earlier work on CertiCrypt, show that complex proofs can be formalized; however, using CertiCrypt requires solid Coq expertise and remains laborious. To ease the adoption of formal proofs by the cryptographic community, we developed EasyCrypt, a semi-automatic tool that can reconstruct formal security proofs from a formal proof sketch. EasyCrypt uses automated proof tools to check the proof sketches and then compiles them into proofs verifiable with CertiCrypt. We validate EasyCrypt by re-proving Hashed ElGamal and comparing the new proof with the CertiCrypt one. We also prove the Cramer-Shoup encryption scheme. Finally, we explain how to extend CertiCrypt's language with implicit complexity classes, making it possible to model the notion of polynomial-time functions.
