  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
221

ARQUITETURAS DE CRIPTOGRAFIA DE CHAVE PÚBLICA: ANÁLISE DE DESEMPENHO E ROBUSTEZ / PUBLIC-KEY CRYPTOGRAPHY ARCHITECTURES: PERFORMANCE AND ROBUSTNESS EVALUATION

Perin, Guilherme 15 April 2011 (has links)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior / With the evolution of the data communication field and the resulting increase in information flow, network security has become a major concern. Modern cryptographic methods are mathematically reliable; however, their hardware implementations leak confidential information through side channels such as power consumption and electromagnetic emissions. Although performance is crucial for a hardware design, robustness against attacks based on side-channel information has gained much attention in recent years. This work focuses on hardware architectures for the RSA public-key algorithm, originally proposed in 1977 by Rivest, Shamir and Adleman. Its main operation is modular exponentiation, which is performed through successive modular multiplications. Because RSA involves integers of 1024 bits or more, the division inherent in modular multiplication becomes the main concern. The Montgomery algorithm, proposed in 1985, is a widely used method for hardware implementations of modular multiplication because it avoids divisions and performs all operations in a multiple-precision context, with terms represented in a numerical base that is generally a power of two. This dissertation proposes a systolic architecture able to perform the Montgomery modular multiplication with multiple-precision arithmetic. Next, an improvement to the systolic architecture is presented: an architecture that computes the Montgomery multiplication by multiplexing the multiple-precision arithmetic processes. The multiplexed architecture is employed in the left-to-right square-and-multiply and square-and-multiply-always modular exponentiation methods, is subjected to SPA (Simple Power Analysis) and SEMA (Simple Electromagnetic Analysis) side-channel attacks, and its robustness is analysed. Different word sizes (numerical bases) are applied, as well as different input operands. To improve the SPA and SEMA attacks, the power-consumption and electromagnetic traces are amplitude-demodulated to eliminate the influence of clock harmonics on the acquired traces. Finally, interpretations, conclusions, and countermeasure proposals for the multiplexed architecture against the implemented side-channel attacks are presented.
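To make the operations named in this abstract concrete, the sketch below models, in software, how a Montgomery product avoids the trial division of ordinary modular reduction and how a left-to-right square-and-multiply exponentiation chains such products. It is an illustrative Python model using big integers, not the thesis's systolic or multiplexed hardware; the word-level multiple-precision handling is abstracted away, and parameter names are placeholders.

```python
def mont_mul(a, b, n, r_bits, n_prime):
    """Montgomery product a*b*R^(-1) mod n, with R = 2**r_bits; no division by n needed."""
    t = a * b
    m = (t * n_prime) & ((1 << r_bits) - 1)   # t * n' mod R (R is a power of two)
    u = (t + m * n) >> r_bits                 # the sum is an exact multiple of R
    return u - n if u >= n else u

def mont_exp(base, exp, n, r_bits):
    """Left-to-right square-and-multiply exponentiation built on Montgomery products.

    Assumes n is odd and 2**r_bits > n (e.g. r_bits = 1024 for a 1024-bit modulus).
    """
    r = 1 << r_bits
    n_prime = -pow(n, -1, r) % r              # n * n_prime = -1 (mod R)
    base_m = (base * r) % n                   # operand converted to Montgomery form
    acc = r % n                               # the value 1 in Montgomery form
    for bit in bin(exp)[2:]:
        acc = mont_mul(acc, acc, n, r_bits, n_prime)            # square on every bit
        if bit == '1':                                          # multiply only on 1-bits:
            acc = mont_mul(acc, base_m, n, r_bits, n_prime)     # the step SPA/SEMA can spot
    return mont_mul(acc, 1, n, r_bits, n_prime)                 # convert back from Montgomery form
```

The data-dependent multiply on 1-bits is the behaviour that simple power and electromagnetic analysis exploit; the square-and-multiply-always variant mentioned above inserts a dummy multiplication on 0-bits so that the sequence of operations no longer depends on the exponent.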
222

CONTRA-MEDIDA POR RANDOMIZAÇÃO DE ACESSO À MEMÓRIA EM ARQUITETURA DE CRIPTOGRAFIA DE CHAVE PÚBLICA / MEMORY RANDOM ACCESS COUNTERMEASURE ON A PUBLIC KEY CRYPTOGRAPHY ARCHITECTURE

Henes, Felipe Moraes 18 November 2013 (has links)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior / The expansion of data communication, due to the large flow of information passing through these systems, has made security a constant concern. Even though today's encryption systems offer strong mathematical protection, some hardware implementations of these systems leak confidential information through side channels such as power consumption and electromagnetic radiation. Performance is of fundamental importance in the design of a physical system; however, aspects that make the system robust against side-channel attacks have received more attention in recent years. This work focuses on hardware architectures for the RSA public-key algorithm, proposed by Rivest, Shamir and Adleman in 1977, whose main operation is modular exponentiation, computed from successive modular multiplications. The RSA algorithm involves integers on the order of 1024 or 2048 bits, so the division inherent in modular multiplication can become a major problem. To avoid these divisions, the Montgomery algorithm, proposed in 1985, is an efficient alternative, since it works in a multiple-precision context with numbers represented in a base that is a power of two. In this context, this dissertation first presents a multiplexed architecture based on the properties of Montgomery's algorithm. Next, an improvement to this architecture is presented, implemented through randomization of the accesses to internal memories, in order to increase the system's robustness against specialized side-channel attacks. The implemented architecture is then subjected to SPA (Simple Power Analysis) and SEMA (Simple Electromagnetic Analysis) side-channel attacks, and the security and robustness of the implemented system are evaluated and presented.
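The randomization of internal memory accesses described above is realized in hardware in the dissertation; the short Python sketch below only illustrates the underlying idea, namely that the words of an operand can be fetched in a fresh random order on every access so that the power/EM footprint of the loads decorrelates from the data layout. The memory model, word count, and RNG choice here are assumptions for illustration only.

```python
import random

_rng = random.SystemRandom()   # OS-backed randomness standing in for a hardware TRNG

def read_operand_words(memory, base_addr, num_words):
    """Load the words of a multiple-precision operand in a fresh random order.

    The data returned is identical on every call; only the sequence of addresses
    (and hence the access pattern visible in a power or EM trace) changes.
    """
    order = list(range(num_words))
    _rng.shuffle(order)                     # new access permutation per operand load
    words = [0] * num_words
    for i in order:
        words[i] = memory[base_addr + i]
    return words

# Hypothetical usage: a 1024-bit operand stored as 32 words of 32 bits.
ram = {addr: 0 for addr in range(1024)}
operand = read_operand_words(ram, base_addr=0, num_words=32)
```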
223

Gaussian sampling in lattice-based cryptography / Le Gaussian sampling dans la cryptographie sur les réseaux euclidiens

Prest, Thomas 08 December 2015 (has links)
Although rather recent, lattice-based cryptography has stood out on numerous points, be it by the variety of constructions that it allows, by its expected resistance to quantum computers, or by its efficiency when instantiated on some classes of lattices. One of the most powerful tools of lattice-based cryptography is Gaussian sampling. At a high level, it allows one to prove knowledge of a particular lattice basis without disclosing any information about that basis, and it enables a wide array of cryptosystems. Somewhat surprisingly, few practical instantiations of such schemes exist, and the algorithms that perform Gaussian sampling are seldom studied. The goal of this thesis is to fill the gap between the theory and practice of Gaussian sampling. First, we study and improve the existing algorithms, by both a statistical analysis and a geometric approach. We then exploit the structures underlying many classes of lattices and apply the ideas of the fast Fourier transform to a Gaussian sampler, allowing us to reach a quasilinear complexity instead of a quadratic one. Finally, we use Gaussian sampling in practice to instantiate a signature scheme and an identity-based encryption scheme. The first yields signatures that are the most compact currently obtained in lattice-based cryptography, and the second allows encryption and decryption that are about one thousand times faster than those obtained with a pairing-based counterpart on elliptic curves.
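As a point of reference for the samplers discussed above, the following Python sketch implements a basic rejection sampler for the discrete Gaussian distribution over the integers. It is a didactic baseline only (neither constant-time nor one of the improved or FFT-accelerated samplers studied in the thesis), and the tail-cut parameter is an assumption.

```python
import math
import random

def sample_discrete_gaussian(center, sigma, tail=12, rng=random.SystemRandom()):
    """Rejection sampler for the discrete Gaussian over Z with the given center and sigma."""
    lo = math.floor(center - tail * sigma)
    hi = math.ceil(center + tail * sigma)
    while True:
        z = rng.randrange(lo, hi + 1)                       # uniform candidate in the tail-cut window
        accept_prob = math.exp(-((z - center) ** 2) / (2 * sigma ** 2))
        if rng.random() < accept_prob:                      # accept proportionally to the Gaussian weight
            return z
```

Roughly speaking, lattice samplers in the Klein/GPV family call an integer sampler of this kind once per coordinate, adjusting the center and standard deviation at each step according to the (Gram-Schmidt) basis.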
224

Advances in public-key cryptology and computer exploitation / Avancées en cryptologie à clé publique et exploitation informatique

Géraud, Rémi 05 September 2017 (has links)
Information security relies on the correct interaction of several abstraction layers: hardware, operating systems, algorithms, and networks. However, protecting each component of the technological stack has a cost; for this reason, many devices are left unprotected or under-protected. This thesis addresses several of these aspects from a security and cryptography viewpoint. To that effect, we introduce new cryptographic algorithms (such as extensions of the Naccache–Stern encryption scheme), new protocols (including a distributed zero-knowledge identification protocol), and improved algorithms (including a new error-correcting code and an efficient integer multiplication algorithm), as well as several contributions relevant to information security and network intrusion. Furthermore, several of these contributions address the performance of existing and newly introduced constructions.
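As one illustration of the kind of algorithmic speed-up mentioned above, the sketch below shows Karatsuba's divide-and-conquer integer multiplication, which replaces four half-size products with three. It is a standard textbook algorithm given here only as context, not the multiplication algorithm contributed by the thesis.

```python
def karatsuba(x: int, y: int) -> int:
    """Multiply two non-negative integers using three recursive half-size products."""
    if x < 1 << 64 or y < 1 << 64:
        return x * y                              # small operands: fall back to built-in multiply
    m = max(x.bit_length(), y.bit_length()) // 2
    hi_x, lo_x = x >> m, x & ((1 << m) - 1)       # x = hi_x * 2^m + lo_x
    hi_y, lo_y = y >> m, y & ((1 << m) - 1)       # y = hi_y * 2^m + lo_y
    z0 = karatsuba(lo_x, lo_y)
    z2 = karatsuba(hi_x, hi_y)
    z1 = karatsuba(lo_x + hi_x, lo_y + hi_y) - z0 - z2   # the cross terms from one extra product
    return (z2 << (2 * m)) + (z1 << m) + z0
```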
225

Infrastruktura veřejných klíčů / Infrastructure of public keys

Bědajánek, Ondřej January 2008 (has links)
This thesis describes the function and principles of a public key infrastructure and of a certificate authority. A self-signed certificate authority was created under the Linux operating system, and a web interface was developed in PHP for generating, distributing, and revoking certificates. Configuration files for OpenVPN are included in the thesis, and wireless security is achieved through OpenVPN.
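The thesis builds its self-signed certificate authority with OpenSSL under Linux and drives it from a PHP web interface. Purely as an illustration of the same X.509 operations, the sketch below creates a self-signed CA certificate with Python's `cryptography` package; the name and validity period are placeholder assumptions.

```python
from datetime import datetime, timedelta
from cryptography import x509
from cryptography.x509.oid import NameOID
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa

# Generate the CA key pair and a self-signed CA certificate (issuer == subject).
ca_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
ca_name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, u"Example Lab CA")])

ca_cert = (
    x509.CertificateBuilder()
    .subject_name(ca_name)
    .issuer_name(ca_name)                                     # self-signed
    .public_key(ca_key.public_key())
    .serial_number(x509.random_serial_number())
    .not_valid_before(datetime.utcnow())
    .not_valid_after(datetime.utcnow() + timedelta(days=3650))
    .add_extension(x509.BasicConstraints(ca=True, path_length=None), critical=True)
    .sign(ca_key, hashes.SHA256())
)
```

End-entity certificates (for users or OpenVPN endpoints) would then be built the same way but signed with `ca_key`, and revocations distributed through a certificate revocation list.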
226

Laboratorní úloha infrastruktury veřejných klíčů / Lab of public key infrastructure

Slavík, Petr January 2009 (has links)
The aim of this thesis is to study and describe Public Key Infrastructure (PKI). The detailed characterization of PKI gradually presents its structural elements, above all the cryptographic operations (asymmetric and symmetric cryptography, hash functions, and digital signatures), and then the individual PKI subjects, e.g. certification authorities, certificates, security protocols, and secure storage. Finally, several complete PKI implementations are described (OpenSSL, Microsoft CA). The practical part of the thesis, a lab exercise, teaches students how to install an OpenSSL-based certification authority. A further task teaches students how to secure a web server with a certificate signed by their own CA, and how to control users' access to the web server through certificates signed by the previously installed CA.
227

Cryptography and number theory in the classroom -- Contribution of cryptography to mathematics teaching

Klembalski, Katharina 02 May 2012 (has links)
Cryptography fascinates people of all generations and is increasingly presented as an example of the relevance and application of the mathematical sciences. Indeed, many principles of modern cryptography can be described at a secondary school level. In this context, the mathematical background is often only sparingly shown. In the worst case, giving mathematics this character of a tool reduces the application of mathematical insights to the message "cryptography contains math". This paper examines what else cryptography can offer to mathematics education. Using the RSA cryptosystem and related content, specific mathematical competencies are highlighted that complement standard teaching, can be taught with cryptography as an example, and extend and deepen key mathematical concepts.
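For classroom use, the core of RSA can be carried out by hand or in a few lines of code with small primes. The sketch below uses the familiar textbook parameters (p = 61, q = 53, e = 17), which are obviously far too small to be secure.

```python
from math import gcd

# Toy RSA with classroom-sized primes (illustrative only, not secure).
p, q = 61, 53
n = p * q                    # modulus: 3233
phi = (p - 1) * (q - 1)      # Euler's totient: 3120
e = 17                       # public exponent, must be coprime to phi
assert gcd(e, phi) == 1
d = pow(e, -1, phi)          # private exponent: 2753

m = 65                       # "message" encoded as a number smaller than n
c = pow(m, e, n)             # encryption: 65^17 mod 3233 = 2790
assert pow(c, d, n) == m     # decryption recovers the original message
```

Working through key generation, encryption, and decryption like this exercises modular arithmetic and the computation of the modular inverse d, showing how the number theory enters the construction.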
228

A Side-Channel Attack on Masked and Shuffled Implementations of M-LWE and M-LWR Cryptography : A case study of Kyber and Saber / En sidokanalsattack på implementationer av M-LWE- och M-LWR-kryptografi skyddade med maskering och slumpad operationsordning : En studie av Kyber och Saber

Backlund, Linus January 2023 (has links)
In response to the threat of a future large-scale quantum computer, the American National Institute of Standards and Technology (NIST) initiated a competition for designs of quantum-resistant cryptographic primitives. In 2022, the lattice-based Module-Learning With Errors (M-LWE) scheme Kyber emerged as the winner to be standardized. The standardization procedure and the development of secure implementations call for thorough evaluation and research. One of the main threats to implementations of cryptographic algorithms today is Side-Channel Analysis (SCA), which is the topic of this thesis. Previous work has presented successful power-based attacks on implementations of lattice cryptography protected by masking, and even by masking combined with shuffling. Shuffling makes SCA harder because the order of independent instructions is randomized, reducing the correlation between operations and power consumption. This randomization is commonly implemented by shuffling the order of the indexes used to iterate over a loop, using the modern Fisher-Yates algorithm. This work describes a new attack that defeats the shuffling countermeasure by first attacking the generation of the index permutation itself. The attack first recovers the positions of the first and last indexes, 0 and 255, and then rotates the encrypted messages, using a ciphertext malleability applicable to many ring-based LWE schemes, to shift two bits into the known positions from which they can be recovered. This procedure is repeated to recover full messages in 128 rotations. The attack is tested and evaluated on masked and shuffled implementations of Kyber as well as Saber, another similar finalist of the NIST competition, which is based on the Module-Learning With Rounding (M-LWR) problem. Compared to the previous attack on masked and shuffled Saber, which required 61,680 traces, the 4,608 traces needed for this attack demonstrate a 13-fold improvement.
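For context, the shuffling countermeasure attacked above permutes the loop indexes with Fisher-Yates. A minimal Python sketch of that permutation generation, and of the two positions the attack recovers first, is given below; the 256-entry index set matches the message-bit count mentioned in the abstract, and everything else is illustrative.

```python
import secrets

def fisher_yates_permutation(n=256):
    """Generate the index permutation a shuffled decoder would iterate over."""
    idx = list(range(n))
    for i in range(n - 1, 0, -1):
        j = secrets.randbelow(i + 1)       # each swap is a secret-dependent memory access
        idx[i], idx[j] = idx[j], idx[i]
    return idx

# The attack first recovers where indexes 0 and 255 end up in the permutation,
# then rotates message bits into those two known positions to read them out.
perm = fisher_yates_permutation()
pos_first, pos_last = perm.index(0), perm.index(255)
```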
229

New authentication mechanism using certificates for big data analytic tools

Velthuis, Paul January 2017 (has links)
Companies analyse large amounts of sensitive data on clusters of machines, using a framework such as Apache Hadoop to handle inter-process communication, and big data analytic tools such as Apache Spark and Apache Flink to analyse the growing amounts of data. Big data analytic tools are mainly tested for performance and reliability; security and authentication have received less consideration and lag behind. The goal of this research is to improve authentication and security for data analytic tools. Currently, the aforementioned big data analytic tools use Kerberos for authentication. Kerberos has difficulty providing multi-factor authentication, and attacks on Kerberos can abuse the authentication. To improve authentication, an analysis of authentication in Hadoop and the data analytic tools is performed. The research describes their characteristics to give an overview of the security of Hadoop and the data analytic tools. One characteristic is the use of Transport Layer Security (TLS) to protect data in transit. TLS connections are usually established with certificates, and certificates with a short time to live can now be issued automatically. This thesis develops a new authentication mechanism using certificates for data analytic tools on clusters of machines, providing advantages over Kerberos. To evaluate the possibility of replacing Kerberos, the mechanism is implemented in Spark. The new implementation provides several improvements: the certificates used for authentication are valid only for a short time and are thus less vulnerable to abuse, and the authentication mechanism addresses new business requirements such as multi-factor authentication and scalability. In this research a new authentication mechanism is developed, implemented, and evaluated, giving better data protection through improved authentication.
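A minimal sketch of certificate-based mutual authentication of the kind described above, using Python's standard `ssl` module: the server only accepts peers presenting a certificate signed by the cluster CA, and workers present their own (short-lived) certificates. The file names and CA layout are assumptions, and the thesis's actual Spark integration is not reproduced here.

```python
import ssl

# Server side: require a client certificate signed by the cluster CA.
server_ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
server_ctx.load_cert_chain(certfile="server.crt", keyfile="server.key")
server_ctx.load_verify_locations(cafile="cluster-ca.crt")
server_ctx.verify_mode = ssl.CERT_REQUIRED      # reject peers without a valid certificate

# Client side (e.g. a worker joining the cluster) presents its short-lived certificate
# and verifies the server against the same CA.
client_ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
client_ctx.load_cert_chain(certfile="worker.crt", keyfile="worker.key")
client_ctx.load_verify_locations(cafile="cluster-ca.crt")
```

Because the worker certificates are short-lived, a stolen credential is only useful until its `notAfter` time, which is one of the advantages over long-lived Kerberos keytabs argued in the abstract.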
230

Efficient, Scalable and Secure Vehicular Communication System : An Experimental Study

Singh, Shubhanker January 2020 (has links)
Awareness of vehicles' surrounding conditions is important in today's intelligent transportation systems. A wide range of efforts has been put into deploying Vehicular Communication (VC) systems to make driving safer and more efficient. Vehicles become aware of their surroundings with the help of authenticated safety beacons in VC systems. Since vehicles act according to the information conveyed by such beacons, verification of beacons plays an important role in becoming aware of and predicting the status of the sender vehicle. Implementing secure mechanisms that handle a high rate of incoming beacons and process them efficiently is therefore a very important part of the whole VC network. The goal of this work was to implement a scheme that deals with a high rate of incoming beacons, preserves non-repudiation of the accepted messages, which contain information about the current and near-future status of the sender vehicle, and at the same time keeps the computation overhead as low as possible; along with this, the work maintains user privacy from a legal as well as a technical perspective by implementing privacy-enhancing technologies. These objectives were achieved through the introduction of Timed Efficient Stream Loss-Tolerant Authentication (TESLA), periodic signature verification, and cooperative verification, respectively. Four different scenarios were implemented and evaluated, starting from and building upon the baseline approach. Each approach addressed the problems targeted by this work, and the results show improved scalability and efficiency with the introduction of TESLA, periodic signature verification, and cooperative verification.
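To illustrate the TESLA building block mentioned above, the sketch below constructs a one-way key chain, MACs a beacon with a not-yet-disclosed key, and later verifies a disclosed key against the public commitment. It is a simplified model: real TESLA additionally relies on loose time synchronization and a separate key-derivation step, and the interval handling here is an assumption.

```python
import hashlib
import hmac

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def build_key_chain(seed: bytes, length: int):
    """Return [K_0, K_1, ..., K_length] where K_i = H(K_{i+1}); K_0 is the public commitment."""
    chain = [seed]
    for _ in range(length):
        chain.append(sha256(chain[-1]))
    chain.reverse()                      # chain[0] = K_0 (commitment), chain[i] disclosed at interval i
    return chain

def mac_beacon(key: bytes, beacon: bytes) -> bytes:
    """MAC a safety beacon with a chain key that will only be disclosed after a delay."""
    return hmac.new(key, beacon, hashlib.sha256).digest()

def verify_disclosed_key(commitment: bytes, key: bytes, interval: int) -> bool:
    """Check that hashing the disclosed key `interval` times reaches the commitment K_0."""
    k = key
    for _ in range(interval):
        k = sha256(k)
    return hmac.compare_digest(k, commitment)
```

Once a disclosed key passes `verify_disclosed_key`, receivers can check the MACs of the beacons sent in the corresponding interval, which is what lets TESLA authenticate high-rate beacon streams with only symmetric operations.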
