171

Elliptic curve cryptography, zero-knowledge proof, and Lamport's hash chain in a distributed authentication system

Chang, Simon Yi-Fan January 2013 (has links)
Thesis (M.S.C.S.). Boston University Libraries did not receive an Authorization To Manage form for this thesis; it is therefore not openly accessible, though it may be available by request (contact open-help@bu.edu). / This paper proposes a novel distributed authentication system that uses robust alternatives among cryptographic algorithms to grant a third party access to personal data without compromising a user's credentials. The paper briefly examines the concept of distributed authentication systems, and discusses how elliptic curve cryptography and Lamport's hash chain can operate in a zero-knowledge proof to establish and manage trust. The paper also discusses how this design avoids some of the most common flaws in distributed authentication systems. Finally, based on results from tests conducted with the included source code, the paper argues that increasing the number of rounds of the zero-knowledge proof yields substantially faster performance than increasing the modulus for elliptic curve calculations, while maintaining comparable levels of security.
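For illustration, a minimal sketch of one of the primitives the abstract names, Lamport's hash chain, appears below; the hash choice (SHA-256), names, and parameters are assumptions for the example, not taken from the thesis's included source code.

```python
import hashlib

def h(data: bytes) -> bytes:
    """One step of the chain: a single SHA-256 application."""
    return hashlib.sha256(data).digest()

def make_chain(seed: bytes, n: int) -> list[bytes]:
    """Return [seed, h(seed), h(h(seed)), ..., h^n(seed)]."""
    chain = [seed]
    for _ in range(n):
        chain.append(h(chain[-1]))
    return chain

chain = make_chain(b"client-secret-seed", 1000)
stored = chain[-1]          # the verifier registers only the tip h^1000(seed)

# One authentication round: the client reveals the next preimage,
# and the verifier checks that it hashes to the stored value.
token = chain[-2]
assert h(token) == stored
stored = token              # the verifier advances; each token is used once
```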
172

Relating Understanding of Inverse and Identity to Engagement in Proof in Abstract Algebra

Plaxco, David Bryant 05 September 2015 (has links)
In this research, I set out to elucidate the relationships between the conceptual understanding on which students draw and their proof activity. I explore these relationships using data from individual interviews with three students from a junior-level Modern Algebra course. Each phase of analysis consisted of iterative coding drawing on grounded theory methodology (Charmaz, 2000, 2006; Glaser and Strauss, 1967). In the first phase, I analyzed the participants' interview responses to model their conceptual understanding, drawing on the form/function framework (Saxe, et al., 1998). I then analyzed the participants' proof activity using Aberdein's (2006a, 2006b) extension of Toulmin's (1969) model of argumentation. Finally, I compared across participants' proofs to identify emerging patterns in the relationships between the models of participants' understanding of identity and inverse and their proof activity. These analyses contributed to the development of three emergent constructs: form shifts in service of sense-making, re-claiming, and lemma generation. These three constructs provide insight into how conceptual understanding relates to proof activity. / Ph. D.
173

Normalisation & equivalence in proof theory & type theory

Lengrand, Stéphane J. E. January 2006 (has links)
At the heart of the connections between Proof Theory and Type Theory, the Curry-Howard correspondence provides proof-terms with computational features and equational theories, i.e. notions of normalisation and equivalence. This dissertation contributes to extending its framework in the directions of proof-theoretic formalisms (such as sequent calculus) that are appealing for logical purposes like proof-search, powerful systems beyond propositional logic such as type theories, and classical (rather than intuitionistic) reasoning. Part I is entitled Proof-terms for Intuitionistic Implicational Logic. Its contributions use rewriting techniques on proof-terms for natural deduction (Lambda-calculus) and sequent calculus, and investigate normalisation and cut-elimination, with call-by-name and call-by-value semantics. In particular, it introduces proof-term calculi for multiplicative natural deduction and for the depth-bounded sequent calculus G4. The former gives rise to the calculus Lambdalxr with explicit substitutions, weakenings and contractions that refines the Lambda-calculus and Beta-reduction, and preserves strong normalisation with a full notion of composition of substitutions. The latter gives new insight into cut-elimination in G4. Part II, entitled Type Theory in Sequent Calculus, develops a theory of Pure Type Sequent Calculi (PTSC), which are sequent calculi that are equivalent (with respect to provability and normalisation) to Pure Type Systems but better suited for proof-search, in connection with proof-assistant tactics and proof-term enumeration algorithms. Part III, entitled Towards Classical Logic, presents some approaches to classical type theory. In particular it develops a sequent calculus for a classical version of System F_omega. Beyond such a type theory, the notion of equivalence of classical proofs becomes crucial; with such a notion based on parallel rewriting in the Calculus of Structures, we compute canonical representatives of equivalent proofs.
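The correspondence the abstract builds on can be made concrete in a few lines; the sketch below is in Lean 4, chosen for illustration only (the dissertation works with its own calculi on paper, not in Lean).

```lean
-- Curry-Howard in miniature: one lambda term is simultaneously a program
-- (function application) and a proof (modus ponens).
def modusPonens {A B : Prop} (a : A) (f : A → B) : B := f a

-- Read logically: from A and A → B, conclude B.
example (A B : Prop) : A → (A → B) → B := fun a f => f a

-- Beta-reducing such proof-terms corresponds to normalisation /
-- cut-elimination, the central notions studied in the dissertation.
```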
174

Of Proofs, Mathematicians, and Computers

Yepremyan, Astrik 01 January 2015 (has links)
As computers become more prevalent in mathematical research and mathematical proof, the question of whether a computer-assisted proof can be considered a mathematical proof has become an ongoing topic of discussion in the mathematics community. The use of the computer in mathematical research has several implications for mathematics in the present day, including the notion that mathematical proof can be based on empirical evidence, and that some mathematical conclusions can be reached a posteriori instead of a priori, as most mathematicians have done before. While some mathematicians are open to the idea of a computer-assisted proof, others are skeptical and would feel more comfortable with a more traditional proof, as it is more surveyable. A surveyable proof enables mathematicians to see the validity of a proof and to offer critique, which is paramount for mathematical growth. In my thesis, I present the role that mathematical proof plays within the mathematical community, and conclude that, because of the dynamics of the mathematical community and the constant activity of proving, the risks associated with mistakes stemming from computer-assisted proofs can be caught by the scrupulous activity of peer review in the mathematics community. Eventually, as the following generations of mathematicians become better trained in using computers and in computer programming, they will be able to make better use of computers in producing evidence, and in turn other mathematicians will be able to both understand and trust the resulting proofs. It therefore remains that the validity of a proof is determined by the correct logic behind it, as well as by its ability to convince the members of the mathematical community, not by whether the result was reached a priori with a traditional proof or a posteriori with a computer-assisted proof.
175

Graphical representation of canonical proof : two case studies

Heijltjes, Willem Bernard January 2012 (has links)
An interesting problem in proof theory is to find representations of proof that do not distinguish between proofs that are ‘morally’ the same. For many logics, the presentation of proofs in a traditional formalism, such as Gentzen’s sequent calculus, introduces artificial syntactic structure called ‘bureaucracy’; e.g., an arbitrary ordering of freely permutable inferences. A proof system that is free of bureaucracy is called canonical for a logic. In this dissertation two canonical proof systems are presented, for two logics: a notion of proof nets for additive linear logic with units, and ‘classical proof forests’, a graphical formalism for first-order classical logic. Additive linear logic (or sum–product logic) is the fragment of linear logic consisting of linear implication between formulae constructed only from atomic formulae and the additive connectives and units. Up to an equational theory over proofs, the logic describes categories in which finite products and coproducts occur freely. A notion of proof nets for additive linear logic is presented, providing canonical graphical representations of the categorical morphisms and constituting a tractable decision procedure for this equational theory. From existing proof nets for additive linear logic without units by Hughes and Van Glabbeek (modified to include the units naively), canonical proof nets are obtained by a simple graph rewriting algorithm called saturation. The main technical contributions are the substantial correctness proof of the saturation algorithm, and a correctness criterion for saturated nets. Classical proof forests are a canonical, graphical proof formalism for first-order classical logic. Related to Herbrand’s Theorem and backtracking games in the style of Coquand, the forests assign witnessing information to quantifiers in a structurally minimal way, reducing a first-order sentence to a decidable propositional one. A similar formalism, ‘expansion tree proofs’, was presented by Miller, but not given a method of composition. The present treatment adds a notion of cut, and investigates the possibility of composing forests via cut-elimination. Cut-reduction steps take the form of a rewrite relation that arises from the structure of the forests in a natural way. Yet reductions are intricate, and initially not well-behaved: from perfectly ordinary cuts, reduction may reach unnaturally configured cuts that may not be reduced. Cut-elimination is shown using a modified version of the rewrite relation, inspired by the game-theoretic interpretation of the forests, for which weak normalisation is shown, and strong normalisation is conjectured. In addition, by a more intricate argument, weak normalisation is also shown for the original reduction relation.
176

Certification of a Tool Chain for Deductive Program Verification

Herms, Paolo 14 January 2013 (has links) (PDF)
This thesis belongs to the domain of software verification. The goal of verifying software is to ensure that an implementation, a program, satisfies the requirements, the specification. This is especially important for critical computer programs, such as control systems for air planes, trains and power plants. Here a malfunction occurring during operation would have catastrophic consequences. Software requirements can concern safety or functioning. Safety requirements, such as not accessing memory locations outside valid bounds, are often implicit, in the sense that any implementation is expected to be safe. On the other hand, functional requirements specify what the program is supposed to do. The specification of a program is often expressed informally by describing in English or some other natural language the mission of a part of the program code. Usually program verification is then done by manual code review, simulation and extensive testing. But this does not guarantee that all possible execution cases are captured. Deductive program proving is a complete way to ensure soundness of the program. Here a program along with its specification is a mathematical object and its desired properties are logical theorems to be formally proved. This way, if the underlying logic system is consistent, we can be absolutely sure that the proven property holds for the program in any case. Generation of verification conditions is a technique helping the programmer to prove the properties he wants about his programs. Here a VCG tool analyses a program and its formal specification and produces a mathematical formula, whose validity implies the soundness of the program with respect to its specification. This is particularly interesting when the generated formulas can be proved automatically by external SMT solvers. This approach is based on works of Hoare and Dijkstra and is well understood and shown correct in theory. Deductive verification tools have nowadays reached a maturity allowing them to be used in an industrial context where a very high level of assurance is required. But implementations of this approach must deal with all kinds of language features and can therefore become quite complex and contain errors -- in the worst case stating that a program is correct even if it is not. This raises the question of the level of confidence granted to these tools themselves. The aim of this thesis is to address this question. We develop, in the Coq system, a certified verification-condition generator (VCG) for ACSL-annotated C programs. Our first contribution is the formalisation of an executable VCG for the Whycert intermediate language, an imperative language with loops, exceptions and recursive functions, and its soundness proof with respect to the blocking big-step operational semantics of the language. A second contribution is the formalisation of the ACSL logical language and the semantics of ACSL annotations of Compcert's Clight. From the compilation of ACSL-annotated Clight programs to Whycert programs and its semantics preservation proof, combined with a Whycert axiomatisation of the Compcert memory model, results our main contribution: an integrated certified tool chain for verification of C programs on top of Compcert. By combining our soundness result with the soundness of the Compcert compiler we obtain a Coq theorem relating the validity of the generated proof obligations with the safety of the compiled assembly code.
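As a hedged illustration of the verification-condition idea described above (and not of the thesis's Coq/Whycert development), here is a toy weakest-precondition generator in Python; the statement encoding and the naive textual substitution are simplifications for the example.

```python
def wp(stmt, post):
    """Weakest precondition of a toy statement w.r.t. postcondition `post`.

    Formulas are plain strings; substitution is naive string replacement,
    which suffices for this single-variable example but not in general.
    """
    kind = stmt[0]
    if kind == "skip":
        return post
    if kind == "assign":                      # ("assign", x, e): wp = post[e/x]
        _, x, e = stmt
        return post.replace(x, f"({e})")
    if kind == "seq":                         # wp(s1; s2, Q) = wp(s1, wp(s2, Q))
        _, s1, s2 = stmt
        return wp(s1, wp(s2, post))
    if kind == "if":                          # guard each branch by the test
        _, b, s1, s2 = stmt
        return (f"(({b}) -> {wp(s1, post)}) and "
                f"(not ({b}) -> {wp(s2, post)})")
    raise ValueError(f"unknown statement: {kind}")

# wp of "y := x; y := y + 1" w.r.t. "y > 0" is "((x) + 1) > 0" -- a formula
# whose validity could then be discharged automatically by an SMT solver.
prog = ("seq", ("assign", "y", "x"), ("assign", "y", "y + 1"))
print(wp(prog, "y > 0"))
```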
177

La charge de la preuve en droit civil / Burden of proof in civil law

Hoffschir, Nicolas 11 December 2014 (has links)
The burden of proof is an original concept, bearing the marks of the passage of time and of the founding principles of law. Historically, the notion designated an individual task: that of the litigant who, through his own effort, had to convince the judge of the soundness of his cause. Today, owing to the rise of the idea of truth and the desire to strengthen the bonds of solidarity between individuals, it is understood as a general requirement of conduct obliging every litigant to contribute to the emergence of the truth. Yet it is inappropriate to treat every evidentiary duty as a burden: strictly speaking, only those duties that a litigant must spontaneously carry out in order to prevail in his cause deserve the name. On this view, the burden of proof imposes duties not only during the trial but also before any case is brought before the judge. Bound to gather evidence and to produce it in court, the party who bears the burden of proof is not always able to discharge it; the legislator or the judge can then provide remedies by facilitating the performance of these duties or by exempting the party from them. Restored to its coherence, the burden of proof thus makes it possible to understand the utility of certain technical mechanisms and to offer a new reading of positive law.
178

Repenser la bibliothèque réelle de Coq : vers une formalisation de l'analyse classique mieux adaptée / Reinventing Coq's Reals library : toward a more suitable formalization of classical analysis

Lelay, Catherine 15 June 2015 (has links)
Real analysis is pervasive in many applications, if only because it is a suitable tool for modeling physical or socio-economical systems. As such, its support is warranted in proof assistants, so that users have a way to formally verify mathematical theorems and the correctness of critical systems. The Coq system comes with an axiomatization of standard real numbers and a library of theorems on real analysis. Unfortunately, this standard library lacks some widely used results. For instance, the definitions of integrals and derivatives are based on dependent types, which makes them cumbersome to use in practice. This thesis first describes the various state-of-the-art libraries available in proof assistants. To palliate the inadequacies of the Coq standard library, we have designed a user-friendly formalization of real analysis: Coquelicot. An easier way of writing formulas and theorem statements is achieved by relying on total functions in place of dependent types for limits, derivatives, integrals, power series, and so on. To help with the proof process, the library comes with a comprehensive set of theorems that cover not only these notions, but also some extensions such as parametric integrals and asymptotic behaviors. Moreover, an algebraic hierarchy makes it possible to apply some of the theorems in a more generic setting, such as complex numbers or matrices. Coquelicot is a conservative extension of the classical analysis of Coq's standard library, and we provide correspondence theorems between the two formalizations. We have exercised the library on several use cases: in an exam at university entry level, for the definitions and properties of Bessel functions, and for the solution of the one-dimensional wave equation.
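The design point about total functions can be illustrated outside Coq as well; the sketch below uses Lean 4 with Mathlib, whose `deriv` is, like Coquelicot's `Derive`, a total function that can be stated without first supplying a differentiability proof. This is an analogy under stated assumptions (a recent Mathlib), not the thesis's own code.

```lean
import Mathlib

-- `deriv` is total: the statement below carries no differentiability side
-- condition, and the derivative rule for powers discharges it.
example (x : ℝ) : deriv (fun y : ℝ => y ^ 2) x = 2 * x := by
  simp [deriv_pow]
```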
179

Vo svetle intuicionizmu: dve štúdie v teórii dôkazov / In the Light of Intuitionism: Two Investigations in Proof Theory

Akbartabatabai, Seyedamirhossein January 2018 (has links)
This dissertation focuses on two specific interconnections between classical and intuitionistic proof theory. In the first part, we will propose a formalization of Gödel's informal reading of the BHK interpretation, using the usual classical arithmetical proofs. His provability interpretation of propositional intuitionistic logic first appeared in [1], in which he introduced the modal system S4 as a formalization of the intuitive concept of provability and then translated IPC to S4 in a sound and complete manner. His work suggested the search for a concrete provability interpretation of the modal logic S4, which itself leads to a concrete provability interpretation of intuitionistic logic. In the first chapter of this work, we will try to solve this problem. For this purpose, we will generalize Solovay's provability interpretation of the modal logic GL to capture other modal logics such as K4, KD4 and S4. Then, using the mentioned Gödel translation, we will propose a formalization of the BHK interpretation via classical proofs. As a consequence, it will be shown that the BHK interpretation is powerful enough to admit many different formalizations that surprisingly capture different propositional logics, including...
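For reference, one standard presentation of the Gödel translation mentioned above is sketched here in LaTeX; clause-level details vary across sources, so this is illustrative rather than the dissertation's exact formulation.

```latex
\begin{align*}
  p^{\Box} &= \Box p && \text{($p$ atomic)}\\
  (A \wedge B)^{\Box} &= A^{\Box} \wedge B^{\Box}\\
  (A \vee B)^{\Box} &= A^{\Box} \vee B^{\Box}\\
  (A \rightarrow B)^{\Box} &= \Box\,\bigl(A^{\Box} \rightarrow B^{\Box}\bigr)
\end{align*}
% Soundness and completeness of the translation:
% IPC proves A  iff  S4 proves A^Box.
```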
180

Consensus algorithms for adding blocks to private blockchains in IIoT networks : Comparison of two Proof-of-Authentication implementations on IIoT hardware / Konsensusalgoritmer för att lägga till block till privata blockkedjor i IIoT-nätverk : Jämförelse av två Proof-of-Authentication-implementeringar på IIoT-hårdvara

Juvencius, Kamile, Ankarberg, Therése January 2021 (has links)
The Internet of Things (IoT) market is growing by the day and shows no signs of stagnation. As the market grows, it becomes all the more important to address security concerns. A major security issue in IoT is that devices usually send their collected data to a centralized entity, creating a single point of failure. A solution to this is decentralization. Blockchain technology offers not only decentralization, but also immutability and data integrity. In a blockchain, the consensus algorithm is what coordinates the devices and achieves unanimity within the network. Consensus algorithms are generally computationally expensive and typically not compatible with IoT devices, due to the devices' limited resources and the need to send data in real time. This study implemented and compared two versions of the Proof-of-Authentication consensus algorithm, which is designed specifically for IoT devices. The results show that one algorithm is considerably faster than the other; however, a fair comparison could not be made due to unforeseen difficulties with the Industrial IoT (IIoT) device used in this study. The study concluded that algorithm 1 is most likely a viable choice as a consensus algorithm for IIoT networks. No conclusion could be drawn for algorithm 2, because its implementation was unsatisfactory given the limitations of the devices used in this study.
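As a hedged sketch of the Proof-of-Authentication idea the thesis evaluates (a block is accepted once an authentication tag verifies against a pre-shared key, rather than after proof-of-work), here is a minimal Python example; the key distribution, field layout, and function names are assumptions for illustration, not the thesis's implementation.

```python
import hashlib, hmac, json, time

SHARED_KEY = b"pre-distributed-device-key"   # assumed set up out of band

def make_block(prev_hash: str, payload: dict) -> dict:
    """Build a block and tag it with an HMAC over its canonical encoding."""
    body = {"prev": prev_hash, "data": payload, "ts": time.time()}
    encoded = json.dumps(body, sort_keys=True).encode()
    body["mac"] = hmac.new(SHARED_KEY, encoded, hashlib.sha256).hexdigest()
    return body

def verify_block(block: dict) -> bool:
    """A trusted node accepts the block iff the tag verifies; no mining."""
    body = {k: v for k, v in block.items() if k != "mac"}
    encoded = json.dumps(body, sort_keys=True).encode()
    expected = hmac.new(SHARED_KEY, encoded, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, block["mac"])

genesis = make_block("0" * 64, {"sensor": "temp", "value": 21.4})
assert verify_block(genesis)   # verification costs one hash, cheap for IIoT
```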
