331

Denial of service attacks on 802.1X security protocol

Ozan, Orhan 03 1900 (has links)
Approved for public release, distribution unlimited / Wireless Local Area Networks (WLANs) are quickly becoming popular in daily life. Users are adopting the latest technology to save time and cost, and WLANs provide them with high-speed network access. There are, however, security concerns that must be considered when deploying WLANs over critical infrastructure, such as military and administrative government LANs. The IEEE 802.11 wireless standard specifies both an authentication service and an encryption protocol, but research has demonstrated that these protocols are severely flawed. The IEEE established a new workgroup, IEEE 802.11i, to address the security vulnerabilities of the 802.11 security protocol. The workgroup proposed using the IEEE 802.1X Port-Based Network Access Control standard as an interim measure to meet the security requirements of WLANs and to maintain the confidentiality, authenticity, and availability of data until the new specifications are finished. Using an open-source test-bed for evaluating DoS attacks on WLANs, this research demonstrates four different DoS attacks that confirm the weaknesses of the IEEE 802.1X protocol. Solutions are provided to mitigate the effects of such attacks. / Lieutenant Junior Grade, Turkish Navy
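As a concrete illustration of why such attacks are possible, the sketch below (not taken from the thesis) parses the 4-byte EAPOL header defined by IEEE 802.1X. The frame-type constants follow the public standard; the point is that management frames such as EAPOL-Start and EAPOL-Logoff carry no cryptographic protection, which is what spoofing-based DoS attacks exploit.

```python
import struct

# EAPOL packet types defined by IEEE 802.1X (illustrative subset).
EAPOL_TYPES = {0: "EAP-Packet", 1: "EAPOL-Start", 2: "EAPOL-Logoff", 3: "EAPOL-Key"}

def parse_eapol(frame: bytes) -> dict:
    """Parse the 4-byte EAPOL header that follows the Ethernet header.

    The header carries no integrity protection, which is why spoofed
    EAPOL-Start/Logoff frames can disrupt an authenticated session.
    """
    version, pkt_type, body_len = struct.unpack("!BBH", frame[:4])
    return {
        "version": version,
        "type": EAPOL_TYPES.get(pkt_type, f"unknown ({pkt_type})"),
        "body_length": body_len,
        "body": frame[4:4 + body_len],
    }

# Example: a bare EAPOL-Logoff frame (version 1, type 2, empty body).
print(parse_eapol(bytes([0x01, 0x02, 0x00, 0x00])))
```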
332

Security-driven Design Optimization of Mixed Cryptographic Implementations in Distributed, Reconfigurable, and Heterogeneous Embedded Systems

Nam, HyunSuk January 2017 (has links)
Distributed heterogeneous embedded systems are increasingly prevalent in numerous applications, including automotive, avionics, smart and connected cities, and the Internet of Things. With pervasive network access within these systems, security is a critical design concern. This dissertation presents a modeling and optimization framework for distributed, reconfigurable, and heterogeneous embedded systems. Distributed embedded systems consist of numerous interconnected embedded devices, each composed of different computing resources, such as single-core processors, asymmetric multicore processors, field-programmable gate arrays (FPGAs), and various combinations thereof. A dataflow-based modeling framework for streaming applications integrates models for computational latency, mixed cryptographic implementations for inter-task and intra-task communication, security levels, communication latency, and power consumption. For the security model, we present a level-based modeling of cryptographic algorithms using mixed cryptographic implementations, including both symmetric and asymmetric implementations. We utilize a multi-objective genetic optimization algorithm to optimize security and energy consumption subject to latency and minimum-security-level constraints. The presented methodology is evaluated using a video-based object detection and tracking application and several synthetic benchmarks representing various application types. Experimental results for these design and optimization frameworks demonstrate the benefits of the mixed cryptographic security model compared to single-cryptographic-algorithm alternatives. We further consider several distributed heterogeneous embedded system architectures.
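As a rough illustration of the kind of multi-objective selection such a framework performs (not the dissertation's actual genetic algorithm or models), the sketch below filters hypothetical task-mapping candidates by a latency deadline and a minimum security level, then keeps the Pareto-optimal ones with respect to security and energy. All names and numbers are invented.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Candidate:
    """A hypothetical task-to-resource mapping with its evaluated metrics."""
    security_level: float   # higher is better
    energy_mj: float        # lower is better
    latency_ms: float       # must respect the deadline constraint

def dominates(a: Candidate, b: Candidate) -> bool:
    """True if 'a' is at least as good as 'b' in both objectives and strictly better in one."""
    return (a.security_level >= b.security_level and a.energy_mj <= b.energy_mj
            and (a.security_level > b.security_level or a.energy_mj < b.energy_mj))

def pareto_front(pop: List[Candidate], deadline_ms: float, min_security: float) -> List[Candidate]:
    # Enforce the hard constraints first, then keep non-dominated candidates.
    feasible = [c for c in pop if c.latency_ms <= deadline_ms and c.security_level >= min_security]
    return [c for c in feasible if not any(dominates(o, c) for o in feasible if o is not c)]

candidates = [Candidate(3, 120, 40), Candidate(5, 200, 45), Candidate(5, 150, 60), Candidate(2, 90, 30)]
print(pareto_front(candidates, deadline_ms=50, min_security=3))
```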
333

Promestra Security compared with other random number generators

Korsbakke, Andreas, Ringsell, Robin January 2019 (has links)
Background. Being able to trust cryptographic algorithms is a crucial part of society today, because of all the information that is gathered by companies all over the world. With this thesis, we want to help both Promestra AB and potential future customers to evaluate whether they can trust its random number generator. Objectives. The main objective of the study is to evaluate the random number generator in Promestra Security with the help of the test suite made by the National Institute of Standards and Technology. The comparison is made against other random number generators such as Mersenne Twister, Blum Blum Shub, and more. Methods. The selected method in this study was to gather a total of 100 million bits from each random number generator and use these in the National Institute of Standards and Technology test suite for 100 tests, to get a fair evaluation of the algorithms. The test suite provides a statistical summary, which was then analyzed. Results. The results show how many iterations out of 100 passed, as well as the distribution of the results. Some of the tested random number generators clearly struggle in many of the tests, while half of the tested generators passed all of them. Conclusions. Promestra Security and Blum Blum Shub come close to passing all the tests, but in the end they cannot be considered the preferable random number generators. The five that passed and seem to have no clear limitations are: Random.org, Micali-Schnorr, Linear Congruential, CryptGenRandom, and Mersenne Twister.
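For readers unfamiliar with the NIST test suite, the sketch below implements its simplest component, the frequency (monobit) test from NIST SP 800-22. It is only meant to show what "passing a test" means and is not the thesis's evaluation code.

```python
import math

def monobit_test(bits: str) -> float:
    """NIST SP 800-22 frequency (monobit) test: returns the p-value.

    The test checks whether the proportion of ones and zeros is close
    to 1/2, as expected for a truly random sequence.
    """
    n = len(bits)
    s_n = sum(1 if b == "1" else -1 for b in bits)
    s_obs = abs(s_n) / math.sqrt(n)
    return math.erfc(s_obs / math.sqrt(2))

# This deliberately biased example (60% ones) fails; the suite's default
# significance level requires a p-value >= 0.01 to pass.
print(monobit_test("1011010101" * 1000))
```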
334

Frequency and encryption usage, investigation of the wireless landscape: A study of access points in Karlskrona

Karlsson, Emelia, Lidmark, Joel January 2019 (has links)
Background. Wireless connectivity is simple and convenient for the user, which is why it is predominantly used today for local networks at home. However, the potential drawbacks of this technology are unknown to many of its users. This study is aimed at examining some of these issues in the context of what is used today. Objectives. This study intends to research what types of security features and frequency settings are being used today. It also aims to evaluate what this means for the security and usability affecting the user. Methods. The approach of this study is to gather networks in different geographical areas. To do this, a Raspberry Pi with an external antenna is used. When the data collection is completed, the networks are broken down into categories. Results. The results show significant frequency overlap on the most commonly used channels. There is vastly more overlap in areas with apartment buildings compared to other residential areas. The results also show that most networks are using secure encryption settings. Conclusions. Careful selection of channels is required to minimise interference, but the methods for doing so are specific to each environment. Security-wise there are no big concerns, except when it comes to password selection.
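A minimal sketch of the channel-overlap bookkeeping such a survey involves is shown below. It assumes roughly 20 MHz-wide transmissions in the 2.4 GHz band (channels whose numbers differ by fewer than five overlap) and uses invented survey data, so it is illustrative only.

```python
from collections import Counter

def overlap_counts(observed_channels, width_channels=5):
    """For each 2.4 GHz channel (1-13), count how many observed access points
    occupy a channel close enough to interfere.

    Assumes ~20 MHz-wide transmissions: channels whose numbers differ by
    fewer than `width_channels` (5 x 5 MHz spacing) overlap.
    """
    usage = Counter(observed_channels)
    return {
        ch: sum(n for other, n in usage.items() if abs(other - ch) < width_channels)
        for ch in range(1, 14)
    }

# Hypothetical survey result: most networks crowd the non-overlapping set 1/6/11.
survey = [1, 1, 6, 6, 6, 11, 3, 9, 11, 11, 1]
print(overlap_counts(survey))
```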
335

Elliptic Curves and their Applications to Cryptography

Bathgate, Jonathan January 2007 (has links)
Thesis advisor: Benjamin Howard / In the last twenty years, Elliptic Curve Cryptography has become a standard for the transmission of secure data. The purpose of my thesis is to develop the necessary theory for the implementation of elliptic curve cryptosystems, using elementary number theory, abstract algebra, and geometry. This theory is based on developing formulas for adding rational points on an elliptic curve. The set of rational points on an elliptic curve forms a group under this addition law. Using the group law, my study continues into computing the torsion subgroup of an elliptic curve and considering elliptic curves over finite fields. With a brief introduction to cryptography and the theory developed in the early chapters, my thesis culminates in the explanation and implementation of three elliptic curve cryptosystems in the Java programming language. / Thesis (BA) — Boston College, 2007. / Submitted to: Boston College. College of Arts and Sciences. / Discipline: Mathematics. / Discipline: College Honors Program.
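The group law referred to above can be stated very compactly in code. The sketch below is a generic textbook formulation, not the thesis's Java implementation: it adds points in affine coordinates over a small prime field, and the curve and point are toy values rather than a cryptographically secure curve.

```python
def ec_add(P, Q, a, p):
    """Add two points on y^2 = x^3 + a*x + b over F_p (affine coordinates).

    None represents the point at infinity, the identity of the group.
    """
    if P is None:
        return Q
    if Q is None:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None                                          # P + (-P) = O
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p      # tangent slope
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p             # chord slope
    x3 = (lam * lam - x1 - x2) % p
    y3 = (lam * (x1 - x3) - y1) % p
    return (x3, y3)

# Toy example on y^2 = x^3 + 2x + 2 over F_17 (not a cryptographic curve).
P = (5, 1)
print(ec_add(P, P, a=2, p=17))   # doubling P gives (6, 3)
```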
336

The GNU Taler system : practical and provably secure electronic payments / Le système GNU Taler : Paiements électroniques pratiques et sécurisés

Dold, Florian 25 February 2019 (has links)
Les nouveaux protocoles de réseautage et cryptographiques peuvent considérablement améliorer les systèmes de paiement électroniques en ligne. Le présent mémoire porte sur la conception, la mise en œuvre et l’analyse sécuritaire du GNU Taler, un système de paiement respectueux de la vie privée conçu pour être pratique pour l’utilisation en ligne comme méthode de (micro-)paiement, et en même temps socialement et moralement responsable. La base technique du GNU Taler peut être dû à l’e-cash de David Chaum. Notre travail va au-delà de l’e-cash de Chaum avec un changement efficace, et la nouvelle notion de transparence des revenus garantissant que les marchands ne peuvent recevoir de manière fiable un paiement d’un payeur non fiable que lorsque leurs revenus du paiement est visible aux autorités fiscales. La transparence des revenus est obtenue grâce à l’introduction d’un protocole d’actualisation donnant lieu à un changement anonyme pour un jeton partiellement dépensé sans avoir besoin de l’introduction d’une évasion fiscale échappatoire. De plus, nous démontrons la sécurité prouvable de la transparence anonyme de nos revenus e-cash, qui concerne en plus l’anonymat habituel et les propriétés infalsifiables de l’e-cash, ainsi que la conservation formelle des fonds et la transparence des revenus. Notre mise en œuvre du GNU Taler est utilisable par des utilisateurs non-experts et s’intègre à l’architecture du web moderne. Notre plateforme de paiement aborde une série de questions pratiques telles que la prodigue des conseils aux clients, le mode de remboursement, l’intégration avec les banques et les chèques “know-your-customer (KYC)”, ainsi que les exigences de sécurité et de fiabilité de la plateforme web. Sur une seule machine, nous réalisons des taux d’opérations qui rivalisent avec ceux des processeurs de cartes de crédit commerciaux globaux. Pendant que les crypto-monnaies basées sur la preuve de travail à l’instar de Bitcoin doivent encore être mises à l’échelle pour servir de substituant aux systèmes de paiement établis, d’autres systèmes plus efficaces basés sur les Blockchains avec des algorithmes de consensus plus classiques pourraient avoir des applications prometteurs dans le secteur financier. Nous faisons dans la conception, la mise en œuvre et l’analyse de la Byzantine Set Union Consensus, un protocole de Byzantine consensus qui s’accorde sur un (Super-)ensemble d’éléments à la fois, au lieu d’accepter en séquence les éléments individuels sur un ensemble. Byzantine Set consensus peut être utilisé comme composante de base pour des chaînes de blocs de permissions, où (à l’instar du style Nakamoto consensus) des blocs entiers d’opérations sont convenus à la fois d’augmenter le taux d’opération. / We describe the design and implementation of GNU Taler, an electronic payment system based on an extension of Chaumian online e-cash with efficient change. In addition to anonymity for customers, it provides the novel notion of income transparency, which guarantees that merchants can reliably receive a payment from an untrusted payer only when their income from the payment is visible to tax authorities. Income transparency is achieved by the introduction of a refresh protocol, which gives anonymous change for a partially spent coin without introducing a tax evasion loophole. 
In addition to income transparency, the refresh protocol can be used to implement Camenisch-style atomic swaps, and to preserve anonymity in the presence of protocol aborts and crash faults with data loss by participants. Furthermore, we show the provable security of our income-transparent anonymous e-cash, which, in addition to the usual anonymity and unforgeability properties of e-cash, also formally models conservation of funds and income transparency. Our implementation of GNU Taler is usable by non-expert users and integrates with the modern Web architecture. Our payment platform addresses a range of practical issues, such as tipping customers, providing refunds, integrating with banks and know-your-customer (KYC) checks, as well as Web platform security and reliability requirements. On a single machine, we achieve transaction rates that rival those of global, commercial credit card processors. We increase the robustness of the exchange—the component that keeps bank money in escrow in exchange for e-cash—by adding an auditor component, which verifies the correct operation of the system and allows a compromise or misbehavior of the exchange to be detected early. Just as bank accounts have a reason to exist besides bank notes, e-cash only serves as part of a whole payment system stack. Distributed ledgers have recently gained immense popularity as a potential replacement for parts of the traditional financial industry. While cryptocurrencies based on proof-of-work such as Bitcoin have yet to scale well enough to serve as a replacement for established payment systems, other more efficient systems based on blockchains with more classical consensus algorithms might still have promising applications in the financial industry. We design, implement and analyze the performance of Byzantine Set Union Consensus (BSC), a Byzantine consensus protocol that agrees on a (super-)set of elements at once, instead of sequentially agreeing on the individual elements of a set. While BSC is interesting in itself, it can also be used as a building block for permissioned blockchains, where—just like in Nakamoto-style consensus—whole blocks of transactions are agreed upon at once, increasing the transaction rate.
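For readers unfamiliar with Chaumian e-cash, the sketch below shows the underlying RSA blind-signature idea that Taler's withdrawal step builds on: the signer signs a blinded value without ever learning the coin. The parameters are toy textbook values, and this is not Taler's actual protocol, which adds the refresh protocol and much more on top.

```python
import hashlib
import math
import secrets

# Textbook RSA toy parameters (the classic p=61, q=53 example; real keys are >= 2048 bits).
p, q = 61, 53
n, e = p * q, 17
d = pow(e, -1, (p - 1) * (q - 1))

def h(msg: bytes) -> int:
    """Hash the coin's serial number into Z_n (toy full-domain hash)."""
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % n

# Customer: blind the coin with a random factor r before sending it to the signer.
coin = b"coin-serial-number"
m = h(coin)
r = secrets.randbelow(n - 2) + 2
while math.gcd(r, n) != 1:
    r = secrets.randbelow(n - 2) + 2
blinded = (m * pow(r, e, n)) % n

# Signer (the exchange): signs the blinded value without learning m.
blind_sig = pow(blinded, d, n)

# Customer: unblind to obtain an ordinary RSA signature on m, verifiable by anyone.
sig = (blind_sig * pow(r, -1, n)) % n
print("valid coin signature:", pow(sig, e, n) == m)
```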
337

Lattice-based digital signature and discrete Gaussian sampling

Ricosset, Thomas 12 November 2018 (has links) (PDF)
Lattice-based cryptography has generated considerable interest in the last two decades due to attractive features, including conjectured security against quantum attacks, strong security guarantees from worst-case hardness assumptions and constructions of fully homomorphic encryption schemes. On the other hand, even though it is a crucial part of many lattice-based schemes, Gaussian sampling is still lagging and continues to limit the effectiveness of this new cryptography. The first goal of this thesis is to improve the efficiency of Gaussian sampling for lattice-based hash-and-sign signature schemes. We propose a non-centered algorithm, with a flexible time-memory tradeoff, as fast as its centered variant for practicable size of precomputed tables. We also use the Rényi divergence to bound the precision requirement to the standard double precision. Our second objective is to construct Falcon, a new hash-and-sign signature scheme, based on the theoretical framework of Gentry, Peikert and Vaikuntanathan for lattice-based signatures. We instantiate that framework over NTRU lattices with a new trapdoor sampler.
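As background for the sampling problem discussed above, the sketch below shows the simple rejection-sampling baseline for a discrete Gaussian over the integers. It is not the thesis's non-centered algorithm, only the naive method that such work improves upon.

```python
import math
import random

def sample_discrete_gaussian(sigma: float, center: float = 0.0, tail_cut: float = 12.0) -> int:
    """Sample from the discrete Gaussian D_{Z, sigma, c} by rejection sampling.

    Candidates are drawn uniformly from a tail-cut interval around the center
    and accepted with probability proportional to exp(-(x - c)^2 / (2 sigma^2)).
    This is the simple baseline; hash-and-sign schemes such as Falcon need
    much faster (and constant-time) samplers.
    """
    lower = math.floor(center - tail_cut * sigma)
    upper = math.ceil(center + tail_cut * sigma)
    while True:
        x = random.randint(lower, upper)
        rho = math.exp(-((x - center) ** 2) / (2 * sigma ** 2))
        if random.random() < rho:
            return x

samples = [sample_discrete_gaussian(sigma=4.0, center=0.37) for _ in range(5)]
print(samples)
```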
338

Sortir de Babel : une République des Langues en quête d’une « langue universelle » à la Renaissance et à l’Age classique ? / Escaping from Babel: a Republic of Languages in search of a “Universal Language” in the Early Modern Age?

Simon, Fabien Dimitri 02 December 2011 (has links)
L’Europe de la Renaissance et de l’Âge classique a été le terrain d’une quête protéiforme de la langue universelle (recherches sur la langue d’Adam, encyclopédies de tous les idiomes de la terre, langues créées ex nihilo…). Afin de percevoir les conditions sociales de production de ce savoir linguistique, cette étude se propose d’élaborer une histoire, moins de la langue universelle elle-même que de ses concepteurs ; une histoire sociale et culturelle de ces pratiques intellectuelles, dans une perspective pluridisciplinaire et à l’échelle européenne. Les acteurs sociaux impliqués dans cette quête s’inscrivent dans des réseaux particuliers, liés à des institutions qui participent pleinement de la transformation du monde moderne (Royal Society, ordre jésuite…). Ils sont souvent des figures de la République des Lettres et en forment, par leurs travaux linguistiques et les correspondances fournies qu’ils suscitent, une province particulière : la « République des Langues ». S’y joue rien moins que le choix, non pas de la langue du bon usage – celle des grammairiens – mais de la langue de la science et de la vérité, la langue de la République des Lettres elle-même. Comment des savants européens contribuent-ils par cet espace social virtuel à faire exister leurs utopies linguistiques ? Discutés dans le cadre de ces réseaux européens transnationaux, les projets apparaissent comme des technologies littéraires et sociales, maîtrisées seulement par un petit nombre d’individus ; ces langues pour tous sont donc indissociablement des langues à l’usage du « moins grand nombre », des langues de distinction / During the Early Modern Age, Europe was the field of a protean quest for the universal language (researches on Adam’s language were carried out, encyclopedias of all the idioms spoken on earth were written, languages were created ex nihilo…). In order to understand the social conditions presiding over the production of that linguistic knowledge, the aim of this study is to retrace the history of the universal language planners rather than that of the language itself. It means to elaborate a social and cultural history of these intellectual practices on a European scale, in a multidisciplinary perspective. The social actors involved in that quest for the universal language were members of specific networks and connected with institutions which actively participated in the transformation of the modern world (the Royal Society, the Jesuits…). They were often prominent figures of the Republic of Letters within which, through their linguistic works and the numerous correspondences these gave rise to, they formed a specific province – the “Republic of Languages”. What was at stake was nothing less than choosing, not the language defining correct usage – that of the grammarians – but the language of sciences and truth, that of the Republic of Letters itself. How did European scholars give life to their linguistic utopias through that virtual social space? Discussed within the framework of these transnational European networks, the linguistic projects appeared as literary and social technologies, only mastered by a small group of individuals. Therefore these languages intended to be “for all” paradoxically turned out to be languages for “the happy few”, languages of distinction.
339

Quantum Circuits for Symmetric Cryptanalysis

Unknown Date (has links)
Quantum computers and quantum computing are a reality of the near future. Companies such as Google and IBM have already declared they have built a quantum computer and intend to increase its size and capacity moving forward. Quantum computers have the potential to be exponentially more powerful than today's classical computers. With this power, modeling the behavior of atoms or chemical reactions under unusual conditions, and improving weather forecasts and traffic conditions, become possible. Also, their ability to exponentially speed up some computations makes the security of today's data a major concern and interest. In the area of cryptography, some encryption schemes (such as RSA) are already deemed broken by the onset of quantum computing. Some encryption algorithms have already been created to be quantum secure, and still more are being created each day. While the algorithms in use today are considered quantum-safe, not much is known about what a quantum attack on them would look like. Specifically, this dissertation discusses how many quantum bits and quantum gates, and even what depth of gates, would be needed for such an attack. The research below was completed to shed light on these areas and offer some concrete numbers for such an attack. / Includes bibliography. / Dissertation (Ph.D.)--Florida Atlantic University, 2018. / FAU Electronic Theses and Dissertations Collection
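The generic starting point for such resource estimates is Grover's algorithm, whose optimal iteration count is about (π/4)·√(2^k) for a k-bit key. The sketch below turns that bound into rough totals; the oracle cost figures are placeholders for one reversible cipher evaluation, not the dissertation's circuit-level results.

```python
import math

def grover_iterations(key_bits: int) -> int:
    """Optimal number of Grover iterations to search a 2^key_bits key space."""
    return math.floor((math.pi / 4) * math.sqrt(2 ** key_bits))

def rough_resources(key_bits: int, oracle_depth: int, oracle_qubits: int) -> dict:
    """Very rough totals: circuit depth scales with iterations times per-oracle depth.

    `oracle_depth` and `oracle_qubits` stand for the cost of one reversible
    cipher evaluation; pinning down such figures precisely is exactly what
    circuit-level analyses like this dissertation do.
    """
    iters = grover_iterations(key_bits)
    return {
        "iterations": iters,
        "total_depth": iters * oracle_depth,
        "logical_qubits": oracle_qubits,
    }

# Hypothetical oracle costs, for illustration only.
print(rough_resources(key_bits=128, oracle_depth=10_000, oracle_qubits=3_000))
```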
340

Hardware design and performance analysis for cryptographic sponge BlaMka. / Projeto de hardware e análise de desempenho para a esponja criptográfica BlaMka.

Rossetti, Jônatas Faria 19 May 2017 (has links)
To evaluate the performance of a hardware design, it is necessary to select the metrics of interest. Several metrics can be chosen, but in general three of them are considered basic: area, latency, and power. From these, other metrics of practical interest, such as throughput and energy consumption, can be obtained. These metrics relate to one another, creating trade-offs that designers need to understand to make the best design decisions. Some works address hardware design optimized to improve one of these metrics. In other works, optimizations are made for two of them. Others analyze the trade-off between two of these metrics. However, the literature lacks works that analyze the behavior of three metrics together. In this work, we intend to contribute to bridging this gap by proposing a method that allows analyzing trade-offs among area, power, and throughput. To verify the proposed method, the permutation function of the cryptographic sponge BlaMka was chosen as a case study. No hardware implementation has been found for this algorithm yet. Therefore, an additional contribution is to provide its first hardware design. Combinational and sequential circuits were designed and synthesized for ASIC and FPGA. With the synthesis results, a detailed performance analysis was performed for each platform, starting from a one-dimensional analysis, going through a two-dimensional analysis, and culminating in a three-dimensional analysis. Two techniques are presented for this three-dimensional analysis, namely the projections approach and the planes approach. Although there is room for improvement, the proposed method is an initial step showing that, in fact, a trade-off between three metrics can be analyzed, and that it is also possible to find balanced performance points. From the two approaches presented, it was possible to derive a criterion to select optimizations when there are restrictions, such as a desired throughput range or a maximum physical size, and when there are none, in which case the optimization with the most balanced performance can be chosen. / Para avaliar o desempenho de um projeto de hardware, é necessário selecionar as métricas de interesse. Várias métricas podem ser escolhidas, mas em geral três delas são consideradas básicas: área, latência e potência. A partir delas, podem ser obtidas outras métricas de interesse prático, tais como vazão e consumo de energia. Essas métricas relacionam-se entre si, criando trade-offs que os projetistas precisam conhecer para executar as melhores decisões de projeto. Alguns trabalhos abordam o projeto de hardware otimizado para melhorar uma dessas métricas. Em outros trabalhos, as otimizações são feitas para duas delas, mas sem analisar como uma terceira métrica se relaciona com as demais. Outros analisam o trade-off entre duas dessas métricas. Entretanto, a literatura carece de trabalhos que analisem o comportamento de três métricas em conjunto. Neste trabalho, pretendemos contribuir para preencher essa lacuna, propondo um método que permita a análise de trade-offs entre área, potência e vazão. Para verificar o método proposto, foi escolhida a função de permutação da esponja criptográfica BlaMka como estudo de caso. Até o momento, nenhuma implementação em hardware foi encontrada para esse algoritmo. Dessa forma, uma contribuição adicional é apresentar seu primeiro projeto de hardware. Circuitos combinacionais e sequenciais foram projetados e sintetizados para ASIC e FPGA.
Com os resultados de síntese, foi realizada uma análise de desempenho detalhada para cada plataforma, a partir de uma análise unidimensional, passando por uma análise bidimensional e culminando em uma análise tridimensional. Duas técnicas foram apresentadas para tal análise tridimensional, chamadas abordagem das projeções e abordagem dos planos. Embora passível de melhorias, o método apresentado é um passo inicial mostrando que, de fato, um trade-off entre três métricas pode ser analisado, e que também é possível encontrar pontos de desempenho balanceado. A partir das duas abordagens, foi possível derivar um critério para selecionar otimizações quando há restrições, como um faixa de vazão desejada ou um tamanho físico máximo, e quando não há restrições, caso em que é possível escolher a otimização com o desempenho mais balanceado.
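As a toy illustration of selecting a "balanced" design point across three metrics (not the thesis's projections or planes approaches), the sketch below normalizes area, power, and throughput against the best observed value of each and picks the variant whose three scores are closest together. All synthesis numbers are invented.

```python
def most_balanced(designs):
    """Pick the synthesis result whose normalized metrics are closest together.

    Each design is (name, area_um2, power_mw, throughput_mbps). Area and power
    are normalized so that lower is better, throughput so that higher is better,
    and the design with the smallest spread across the three scores wins.
    """
    areas = [d[1] for d in designs]
    powers = [d[2] for d in designs]
    thrs = [d[3] for d in designs]

    def norm_low(v, vals):   # 1.0 = best (smallest) value observed
        return min(vals) / v

    def norm_high(v, vals):  # 1.0 = best (largest) value observed
        return v / max(vals)

    def spread(d):
        scores = (norm_low(d[1], areas), norm_low(d[2], powers), norm_high(d[3], thrs))
        return max(scores) - min(scores)

    return min(designs, key=spread)

# Hypothetical synthesis results for three circuit variants.
variants = [("combinational", 120_000, 35.0, 900.0),
            ("pipelined", 150_000, 42.0, 1800.0),
            ("serialized", 60_000, 18.0, 400.0)]
print(most_balanced(variants))
```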
