31

Effects of Error Messages on a Student’s Ability to Understand and Fix Programming Errors

January 2017 (has links)
abstract: Assemblers and compilers provide feedback to a programmer in the form of error messages. These error messages become input to the programmer's debugging model. To fix an error, the programmer must first locate the error in the program, understand what is causing it, and finally resolve it. Error messages play an important role in all three stages of fixing errors. This thesis studies the effects of error messages in the context of teaching programming. Given an error message, this work investigates how it affects the student's way of 1) understanding the error, and 2) fixing the error. As part of the study, three error message types were developed (Default, Link, and Example) to better understand the effects of error messages. The Default type provides an assembler-centric single-line error message, the Link type provides a program-centric detailed error description with a hyperlink for more information, and the Example type provides a program-centric detailed error description with a relevant example. All these error message types were developed for assembly language programming. A think-aloud programming exercise was conducted as part of the study to capture the student programmer's knowledge model. Different codes were developed to analyze the data collected during the think-aloud exercise. After transcribing, coding, and analyzing the data, it was found that the Link type of error message helped to fix the error in less time and with fewer steps. Among the three types, the Link type of error message also resulted in a significantly higher ratio of correct to incorrect steps taken by the programmer to fix the error. / Dissertation/Thesis / Masters Thesis Software Engineering 2017
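To make the contrast between the three message styles concrete, here is a minimal Python sketch of how the same assembler error might be rendered in each style; the wording, error code, and URL are invented for illustration and are not the messages used in the study.

```python
# Hypothetical sketch of the three error-message styles compared in the study.
# The wording, error code, and URL below are illustrative, not the study's actual text.

def render_error(style: str, line_no: int) -> str:
    """Render an 'undefined symbol' assembler error in one of three styles."""
    if style == "default":
        # Assembler-centric, single line
        return f"line {line_no}: error A2006: undefined symbol: COUNT"
    if style == "link":
        # Program-centric description plus a hyperlink for more information
        return (f"Line {line_no}: the label COUNT is used but never defined in this program.\n"
                f"Declare it in the data segment before referring to it.\n"
                f"More information: https://example.edu/asm-errors/undefined-symbol")
    if style == "example":
        # Program-centric description plus a relevant example
        return (f"Line {line_no}: the label COUNT is used but never defined in this program.\n"
                f"Example fix:\n"
                f"    .data\n"
                f"    COUNT DWORD 0      ; define the symbol before using it\n"
                f"    .code\n"
                f"    mov eax, COUNT")
    raise ValueError(f"unknown style: {style}")

for s in ("default", "link", "example"):
    print(f"--- {s} ---")
    print(render_error(s, line_no=12))
```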
32

Análise de protocolos TCP para redes de alta velocidade / Analysis of TCP protocols for high speed networks

Michel, Neila Fernanda 12 October 2008 (has links)
Advisor: Nelson Luis Saldanha da Fonseca / Master's dissertation - Universidade Estadual de Campinas, Instituto de Computação / Abstract: The Reno TCP protocol started operating on January 1st, 1983. Since then, there have been significant advances in data transmission technologies, such as optical fibers and satellite channels, which allow high transmission rates over long distances. Reno TCP remains the Internet's standard protocol. However, its operation presents several difficulties in networks with a high bandwidth-delay product, producing poor performance due to its congestion window update mechanism. In order to overcome these deficiencies, several variants of Reno TCP have been proposed with the aim of improving resource utilization in high-speed networks; these variants are generally called "High-Speed Network Protocols". For such proposals to yield protocols that operate efficiently in networks with a high bandwidth-delay product, it is necessary to evaluate which variations in the dynamics of congestion window growth are actually effective. In this dissertation, a set of TCP variant protocols for high-speed networks is evaluated through simulation, and several properties important for TCP-like protocols are investigated. / Master's degree / Computer Networks / Master in Computer Science
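The congestion-window argument above can be illustrated with a small simulation sketch. The following Python toy compares how many round-trip times Reno's one-segment-per-RTT additive increase needs to recover from a single loss against a deliberately aggressive, window-dependent increase rule; the aggressive rule is only a caricature of high-speed proposals, not any specific variant evaluated in the dissertation, and slow start, timeouts, and queueing are ignored.

```python
# Simplified sketch of congestion-window dynamics (in segments), ignoring slow start,
# timeouts, and queueing. It only illustrates why Reno's one-segment-per-RTT growth
# recovers slowly on paths with a large bandwidth-delay product; the "aggressive"
# rule is a caricature of high-speed variants, not any specific protocol.

def recovery_rtts(target_window: float, increase_rule) -> int:
    """RTTs needed to grow back to the target window after a single loss halves it."""
    cwnd = target_window / 2.0          # multiplicative decrease on loss
    rtts = 0
    while cwnd < target_window:
        cwnd += increase_rule(cwnd)     # additive (or window-dependent) increase per RTT
        rtts += 1
    return rtts

reno = lambda cwnd: 1.0                              # Reno: +1 segment per RTT
aggressive = lambda cwnd: max(1.0, 0.05 * cwnd)      # toy rule: grow faster at large windows

# A 10 Gb/s path with 100 ms RTT and 1500-byte segments corresponds to a window of
# roughly 83,000 segments: the regime where Reno's recovery takes tens of thousands of RTTs.
for target in (100, 10_000, 83_000):
    print(f"target cwnd {target:>6}: Reno {recovery_rtts(target, reno):>6} RTTs, "
          f"aggressive variant {recovery_rtts(target, aggressive):>4} RTTs")
```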
33

Databáze specifikací bezpečnostních protokolů / Specifications Database of Security Protocols

Ondráček, David January 2008 (has links)
The original protocols, which were created during the early development of computer networks, no longer provide sufficient security. This is the reason why new protocols are developed and implemented. An important component of this process is formal verification, which is used to analyze the developed protocols and check whether a successful attack is possible or not. This thesis presents selected security protocols and tools for their formal verification. Further, the selected protocols are specified in the LySa calculus, and the results of their analysis using the LySatool are presented and discussed.
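As a rough illustration of the kind of symbolic attacker reasoning that such verification tools automate (this is plain Python, not LySa syntax, and the observed messages are invented), the sketch below closes an intruder's knowledge under pairing projection and decryption with known keys and asks whether a secret becomes derivable.

```python
# Toy illustration of symbolic (Dolev-Yao) attacker deduction, not LySa syntax.
# Messages are tuples: ("enc", payload, key) for symmetric encryption,
# ("pair", a, b) for concatenation, or atomic strings.

def saturate(knowledge: set) -> set:
    """Close the attacker's knowledge under projection and decryption."""
    known = set(knowledge)
    changed = True
    while changed:
        changed = False
        for m in list(known):
            if isinstance(m, tuple) and m[0] == "pair":
                new = {m[1], m[2]}
            elif isinstance(m, tuple) and m[0] == "enc" and m[2] in known:
                new = {m[1]}            # can decrypt only with a known key
            else:
                continue
            if not new <= known:
                known |= new
                changed = True
    return known

# Observed traffic: a session key encrypted under a long-term key, and that long-term
# key accidentally sent in the clear alongside some public data.
observed = {("enc", "session_key", "k_long_term"), ("pair", "k_long_term", "public_data")}
print("secret leaked:", "session_key" in saturate(observed))   # True: an attack exists
```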
34

The Work Process of Research Librarians: Implementation of the Abstraction-Decomposition Space

Simons, Kevin J. 19 April 2005 (has links)
No description available.
35

Advanced Features in Protocol Verification: Theory, Properties, and Efficiency in Maude-NPA

Santiago Pinazo, Sonia 31 March 2015 (has links)
The area of formal analysis of cryptographic protocols has been an active one since the mid-1980s. The idea is to verify communication protocols that use encryption to guarantee secrecy and that use authentication of data to ensure security. Formal methods are used in protocol analysis to provide formal proofs of security, and to uncover bugs and security flaws that in some cases had remained unknown long after the original protocol publication, as in the case of the well-known Needham-Schroeder Public Key (NSPK) protocol. In this thesis we tackle problems regarding the three main pillars of protocol verification: modelling capabilities, verifiable properties, and efficiency. This thesis is devoted to investigating advanced features in the analysis of cryptographic protocols tailored to the Maude-NPA tool. This tool is a model checker for cryptographic protocol analysis that allows for the incorporation of different equational theories and operates in the unbounded session model without the use of data or control abstraction. An important contribution of this thesis concerns theoretical aspects of protocol verification in Maude-NPA. First, we define a forwards operational semantics, using rewriting logic as the theoretical framework and the Maude programming language as tool support. This is the first time that a forwards rewriting-based semantics is given for Maude-NPA. Second, we also study the problem that arises in cryptographic protocol analysis when it is necessary to guarantee that certain terms generated during a state exploration are in normal form with respect to the protocol equational theory. We also study techniques to extend Maude-NPA capabilities to support the verification of a wider class of protocols and security properties. First, we present a framework to specify and verify sequential protocol compositions in which one or more child protocols make use of information obtained from running a parent protocol. Second, we present a theoretical framework to specify and verify protocol indistinguishability in Maude-NPA. This kind of property aims to verify that an attacker cannot distinguish between two versions of a protocol: for example, one using one secret and one using another, as happens in electronic voting protocols. Finally, this thesis contributes to improving the efficiency of protocol verification in Maude-NPA. We define several techniques which drastically reduce the state space, and can often yield a finite state space, so that whether the desired security property holds or not can in fact be decided automatically, in spite of the general undecidability of such problems. / Santiago Pinazo, S. (2015). Advanced Features in Protocol Verification: Theory, Properties, and Efficiency in Maude-NPA [Tesis doctoral no publicada]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/48527 / TESIS
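The backwards search mentioned above can be pictured with a toy example. The Python sketch below runs a backwards breadth-first search from an attack state over an explicit, finite transition relation; Maude-NPA instead works symbolically, by narrowing modulo equational theories over an unbounded state space, so this only conveys the direction of the search. All state and transition names are invented.

```python
# Toy backwards-reachability search over an explicit, finite transition system.
# It starts from the insecure ("attack") state pattern and searches toward an
# initial state; a returned path means the attack is reachable.

from collections import deque

# States and transitions are made up for illustration.
transitions = {
    ("init", "send_nonce"),
    ("send_nonce", "reply"),
    ("reply", "key_established"),
    ("send_nonce", "intruder_replay"),
    ("intruder_replay", "secret_leaked"),
}
initial_states = {"init"}
attack_state = "secret_leaked"

def backwards_reachable(attack, initial, transitions):
    """Return a path from an initial state to the attack state, or None."""
    predecessors = {}
    for src, dst in transitions:
        predecessors.setdefault(dst, []).append(src)
    frontier, parent = deque([attack]), {attack: None}
    while frontier:
        state = frontier.popleft()
        if state in initial:                       # reached an initial state: attack found
            path = []
            while state is not None:
                path.append(state)
                state = parent[state]
            return path                            # path runs initial -> ... -> attack
        for pred in predecessors.get(state, []):
            if pred not in parent:
                parent[pred] = state
                frontier.append(pred)
    return None                                    # attack state unreachable

print(backwards_reachable(attack_state, initial_states, transitions))
```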
36

Modeling and Analysis of Advanced Cryptographic Primitives and Security Protocols in Maude-NPA

Aparicio Sánchez, Damián 23 December 2022 (has links)
Thesis by compendium of publications / The Maude-NPA crypto tool is a specialized model checker for cryptographic security protocols that takes into account the algebraic properties of the cryptosystem. In the literature, additional cryptographic properties have uncovered weaknesses in security protocols and, in other cases, they are part of the security assumptions a protocol relies on in order to function properly. Maude-NPA has a theoretical basis in rewriting logic, equational unification, and narrowing, and performs a backwards search from an insecure state pattern to determine whether or not it is reachable.
Maude-NPA can be used to reason about a wide range of cryptographic properties, including cancellation of encryption and decryption, Diffie-Hellman exponentiation, exclusive-or, and some approximations of homomorphic encryption. In this thesis, we consider new cryptographic properties, either as part of security protocols or to discover new attacks. We have also modeled different families of security protocols, including Distance Bounding Protocols and Multi-party key agreement protocols. And we have developed new protocol modeling techniques to reduce the time and space analysis effort. This thesis contributes in several ways to the area of cryptographic protocol analysis, and many of its contributions can be useful for other crypto analysis tools. / This thesis would not have been possible without the funding of a set of research projects. The main contributions and derivative works of this thesis have been made in the context of the following projects: - Ministry of Economy and Business of Spain: Project LoBaSS, Effective Solutions Based on Logic, Scientific Research under award number TIN2015-69175-C4-1-R; this project focused on using powerful logic-based technologies to analyze safety-critical systems. - Air Force Office of Scientific Research of the United States of America: Project Advanced symbolic methods for the cryptographic protocol analyzer Maude-NPA, Scientific Research under award number FA9550-17-1-0286. - State Investigation Agency of Spain: Project FREETech: Formal Reasoning for Enabling and Emerging Technologies, Scientific I+D-i Research under award number RTI2018-094403-B-C32 / Aparicio Sánchez, D. (2022). Modeling and Analysis of Advanced Cryptographic Primitives and Security Protocols in Maude-NPA [Tesis doctoral]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/190915 / Compendio
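One way to see why algebraic properties such as exclusive-or matter for these analyses: XOR satisfies the cancellation law x ^ k ^ k = x, which a free-algebra ("perfect black box") model of encryption does not capture. The tiny Python sketch below shows a contrived leak that only becomes visible modulo that law; the "protocol" is invented purely for illustration.

```python
# Why algebraic properties matter: exclusive-or satisfies x ^ k ^ k == x, a cancellation
# law that a free-algebra model of encryption does not capture. The one-message
# "protocol" below is invented purely for illustration.

import secrets

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

secret = b"launch code 042!"
pad = secrets.token_bytes(len(secret))

# Suppose a protocol sends both values on channels the intruder can observe:
msg1 = xor_bytes(secret, pad)   # intended as a "masked" secret
msg2 = pad                      # leaked elsewhere, e.g. reused as a nonce

# In a free-algebra model msg1 is an opaque term and the secret looks safe.
# Modulo the XOR theory, the intruder simply cancels the pad:
recovered = xor_bytes(msg1, msg2)
assert recovered == secret
print("intruder recovers:", recovered)
```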
37

The work process of research librarians: implementation of the abstraction-decomposition space

Simons, Kevin J. January 2005 (has links)
Thesis (Ph.D.)--Miami University, Dept. of Psychology, 2005. / Title from second page of PDF document. Document formatted into pages; contains [2], vi, 72, [22] p. : ill. Includes bibliographical references (p. 35-37).
38

Processus ascendants et descendants en compréhension de l'oral en langue étrangère - Problèmes et retombées didactiques pour la compréhension de l'anglais / Foreign-language listening : bottom-up and top-down processes – issues for EFL teaching and research

Zoghlami, Naouel 27 November 2015 (has links)
This thesis focuses on the complex relationship between bottom-up and top-down processes in L2 speech comprehension; that is, between the use of the signal and the linguistic input on the one hand, and the integration of various types of knowledge (linguistic, discourse, pragmatic, general) on the other hand. Despite a large body of research on the cognitive processes underlying listening in psycholinguistics, foreign language (L2) acquisition and teaching (e.g., Cutler & Clifton, 1999; Field, 2008a; Rost, 2002; Brown, 1990), there are still gaps in our understanding of these processes and the impact certain factors have on listening comprehension. Assuming that L1 and L2 listening follow the same cognitive architecture, we first review recent psycholinguistic models of L1 listening. We also examine the main factors constraining L2 listening comprehension. As our summary of the few SLA studies that have investigated the role of bottom-up information and the strategic behavior of L2 listeners points to the important contribution of metacognition, we clarify the terminological fuzziness characterizing this concept, and propose a model of metacognition in real-world unidirectional L2 listening. We then present the results of a study that we conducted to investigate the exact contribution of these different factors to L2 listening. The participants in this study were French and Tunisian EFL teachers (n=23) and learners (n=226). Using mixed quantitative (different tests and questionnaires) and qualitative (protocol analysis and gating experiments - Ericsson & Simon, 1993; Grosjean, 1980) methods, our aim was to investigate: 1) the factors perceived by learners and teachers as influencing L2 listening; 2) the relative contribution of linguistic knowledge, auditory discrimination, spoken word recognition (SWR), and meta-comprehension knowledge to successful L2 listening; 3) on-line listening problems and strategy use. For all of these parameters, we looked more closely at different levels of listening proficiency (various analyses of the performance of skilled and unskilled L2 listeners), as well as the possible influence of the two L1s (French and Tunisian Arabic) involved in the study. Our analyses show that: 1) there is a general discrepancy between what is perceived as making L2 listening difficult and what really renders it problematic; 2) SWR and vocabulary knowledge contribute significantly to the variance in L2 listening, with SWR being the stronger predictor; 3) listening problems encountered on-line are mainly lower-level (segmentation) and, although strategies contribute to speech comprehension, they are not discriminatory. What characterizes a proficient L2 listener seems to be accurate formal processing, not strategic processing, of oral input. The findings are discussed from a theoretical and pedagogical perspective. Keywords: listening comprehension, French and Tunisian learners of L2 English, bottom-up and top-down processes, formal processing, integration and situation model, attentional resources, gating, protocol analysis, comparative analysis
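For readers unfamiliar with the gating paradigm cited above (Grosjean, 1980), listeners hear successively longer onsets ("gates") of a recorded word until they can identify it. The sketch below shows one plausible way to cut such gated stimuli from a WAV file using only the Python standard library; the file names and the 50 ms increment are assumptions, not the materials used in this study.

```python
# Minimal sketch of preparing stimuli for a gating task: the listener hears successively
# longer onsets of a recorded word. File names and the 50 ms increment are assumptions.

import wave

def write_gates(word_wav: str, gate_ms: int = 50) -> list:
    """Cut word_wav into cumulative gates of gate_ms, 2*gate_ms, ... and save each one."""
    out_files = []
    with wave.open(word_wav, "rb") as src:
        params = src.getparams()
        frames_per_gate = int(src.getframerate() * gate_ms / 1000)
        audio = src.readframes(src.getnframes())
        bytes_per_frame = params.sampwidth * params.nchannels
    n_gates = (len(audio) // bytes_per_frame + frames_per_gate - 1) // frames_per_gate
    for i in range(1, n_gates + 1):
        cut = i * frames_per_gate * bytes_per_frame
        name = f"{word_wav.removesuffix('.wav')}_gate{i:02d}.wav"
        with wave.open(name, "wb") as dst:
            dst.setparams(params)          # header frame count is patched on close
            dst.writeframes(audio[:cut])
        out_files.append(name)
    return out_files

# e.g. write_gates("stimuli/cupboard.wav") -> cupboard_gate01.wav, cupboard_gate02.wav, ...
```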
39

On the security of authentication protocols on the web / La sécurité des protocoles d’authentification sur le Web

Delignat-Lavaud, Antoine 14 March 2016 (has links)
As ever more private user data gets stored on the Web, ensuring proper protection of this data (in particular when it transits through untrusted networks, or when it is accessed by the user from her browser) becomes increasingly critical. However, in order to formally prove that, for instance, email from GMail can only be accessed by knowing the user's password, under some reasonable set of assumptions about what an attacker cannot do (e.g. he cannot break AES encryption), one must precisely understand the security properties of many complex protocols and standards (including DNS, TLS, X.509, HTTP, HTML, JavaScript), and more importantly, the composite security goals of the complete Web stack. In addition to this compositional security challenge, one must account for the powerful additional attacker capabilities that are specific to the Web, besides the usual tampering of network messages. For instance, a user may browse a malicious page while keeping an active GMail session in a tab; this page is allowed to trigger arbitrary, implicitly authenticated requests to GMail using JavaScript (even though the isolation policy of the browser may prevent it from reading the response). An attacker may also inject himself into an honest page (for instance, as a malicious advertising script, or by exploiting a data sanitization flaw), get the user to click bad links, or try to impersonate other pages. Besides the attacker, the protocols and applications are themselves a lot more complex than typical examples from the protocol analysis literature. Logging into GMail already requires multiple TLS sessions and HTTP requests between (at least) three principals, representing dozens of atomic messages. Hence, ad hoc models and hand-written proofs do not scale to the complexity of Web protocols, mandating the use of advanced verification automation and modeling tools. Lastly, even assuming that the design of GMail is indeed secure against such an attacker, any single programming bug may completely undermine the security of the whole system. Therefore, in addition to modeling protocols based on their specification, it is necessary to evaluate implementations in order to achieve practical security. The goal of this thesis is to develop new tools and methods that can serve as the foundation of an extensive compositional Web security analysis framework that could be used to implement and formally verify applications against a reasonably extensive model of attacker capabilities on the Web. To this end, we investigate the design of Web protocols at various levels (TLS, HTTP, HTML, JavaScript) and evaluate their composition using a broad range of formal methods, including symbolic protocol models, type systems, model extraction, and type-based program verification. We also analyze current implementations and develop some new verified versions to run tests against. We uncover a broad range of vulnerabilities in protocols and their implementations, and propose countermeasures that we formally verify, some of which have been implemented in browsers and by various websites. For instance, the Triple Handshake attack we discovered required a protocol fix (RFC 7627), and influenced the design of the new version 1.3 of the TLS protocol.
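The "implicitly authenticated request" threat described above can be made concrete with a schematic server-side check. In the Python sketch below, a session cookie alone is not enough to authorize a state-changing request; the request must also carry a matching Origin header and a per-session CSRF token. The names and values are invented, and this is not one of the verified implementations developed in the thesis.

```python
# Schematic illustration of the "implicitly authenticated request" threat: a session
# cookie authenticates the user, and the browser attaches it to requests made by any
# page. The two checks below (Origin header and a per-session CSRF token) are the
# standard countermeasures; names and values here are invented for illustration.

TRUSTED_ORIGIN = "https://mail.example.com"
sessions = {"cookie-1234": {"user": "alice", "csrf_token": "tok-9f2c"}}

def authorize(request: dict) -> bool:
    """Accept a state-changing request only if it provably originated from our own page."""
    session = sessions.get(request.get("cookie"))
    if session is None:
        return False                                    # not even implicitly authenticated
    if request.get("origin") != TRUSTED_ORIGIN:
        return False                                    # cross-site request: reject
    return request.get("csrf_token") == session["csrf_token"]

# Request forged by a script in a malicious tab: the browser attaches the cookie
# automatically, but the Origin differs and the token is unknown to the attacker.
forged = {"cookie": "cookie-1234", "origin": "https://evil.example.net"}
genuine = {"cookie": "cookie-1234", "origin": TRUSTED_ORIGIN, "csrf_token": "tok-9f2c"}
print(authorize(forged), authorize(genuine))            # False True
```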
