  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
721

Fast and flexible hardware support for elliptic curve cryptography over multiple standard prime finite fields

Alrimeih, Hamad 29 March 2012 (has links)
Exchange of private information over a public medium must incorporate a method for data protection against unauthorized access. Elliptic curve cryptography (ECC) has become widely accepted as an efficient mechanism to secure private data using public-key protocols. Scalar multiplication (which translates into a sequence of point operations, each involving several modular arithmetic operations) is the main ECC computation, where the scalar value is secret and must be secured. In this dissertation, we consider ECC over five standard prime finite fields recommended by the National Institute of Standards and Technology (NIST), with the corresponding prime sizes of 192, 224, 256, 384, and 521 bits. This dissertation presents our general hardware-software approach and the technical details of our novel hardware processor design, aimed at accelerating scalar multiplications with flexible security-performance trade-offs. To enhance performance, our processor exploits parallelism by pipelining modular arithmetic computations and the associated input/output data transfers. To enhance security, modular arithmetic computations and the associated data transfers are grouped into atomically executed computational blocks, in order to make curve point operations indistinguishable and thus mask the scalar value. The flexibility of our processor is achieved through software-controlled hardware programmability, which allows for different scenarios of computing atomic block sequences. Each scenario is characterized by a certain trade-off between the processor’s security and performance. As the best trade-off scenario is specific to the user and/or application requirements, our approach allows such a scenario to be chosen dynamically by the system software, thus facilitating system adaptation to dynamically changing requirements. 
Since modular multiplication is the most performance-critical low-level operation in ECC computations, we also propose a novel modular multiplier specifically optimized to take full advantage of the fast reduction algorithms associated with the five NIST primes. The proposed architecture has been prototyped on a Xilinx Virtex-6 FPGA and takes between 0.30 ms and 3.91 ms to perform a typical scalar multiplication. Such performance figures demonstrate both the flexibility and the efficiency of our proposed design and compare favourably against other systems reported in the literature. / Graduate
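The scalar multiplication at the heart of this thesis can be illustrated with a minimal sketch: a Montgomery ladder over a toy short Weierstrass curve. The parameters below are purely illustrative, not the NIST fields the processor targets; the point of the ladder is that every key bit triggers one addition and one doubling regardless of its value, which is the spirit of the atomic-block countermeasure described above.

```python
# Illustrative sketch: Montgomery-ladder scalar multiplication on a toy
# short Weierstrass curve y^2 = x^3 + ax + b over F_p. Toy parameters only,
# not the NIST primes used in the thesis.
p, a, b = 97, 2, 3          # (3, 6) lies on this curve
O = None                    # point at infinity

def add(P, Q):
    """Affine point addition, handling the identity and inverses."""
    if P is None:
        return Q
    if Q is None:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None                      # P + (-P) = O
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def ladder(k, P):
    """Montgomery ladder: one add and one double per bit of k, whatever
    the bit's value -- the same fixed operation pattern that atomic
    blocks enforce to mask the secret scalar."""
    R0, R1 = None, P
    for bit in format(k, "b"):
        if bit == "1":
            R0, R1 = add(R0, R1), add(R1, R1)
        else:
            R1, R0 = add(R0, R1), add(R0, R0)
    return R0
```

Because the ladder's sequence of field operations is independent of the scalar bits, a power or timing trace of this loop leaks no obvious bit pattern, at the cost of roughly one extra point operation per bit.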
722

On studying Whitenoise stream-cipher against Power Analysis Attacks

Zakeri, Babak 17 December 2012 (has links)
This report describes the work carried out from May 2010 to December 2012 on breaking the Whitenoise encryption algorithm. It is divided into two main parts: a study of the stream cipher developed by the Whitenoise lab, and of its FPGA implementation, against the group of indirect attacks known as power analysis attacks; and a review of the development process and experimental results of a power-sampling board built during this project. In the first part, the algorithm and its implementation are reverse engineered and reviewed, and the various blocks of the implementation are examined one by one against several indirect attacks. Those attacks are shown to be useless, or at least very weak, against Whitenoise. A new attack scenario is then proposed, along with an improvement that completely breaks the implementation. However, the complete break is also shown to require very accurate equipment, a large number of computations, and many tests, so Whitenoise appears fairly strong against this specific group of attacks. The second part discusses the requirements of a power-consumption measurement setup and the motivations and goals for building such a board. Important concepts and design considerations are presented, including the schematic of the amplifier, multilayer design, embedding a BGA component, star grounding, and inductance reduction. The results of the tests applied to the produced board are then discussed: the precision of the measurements and some pattern-recognition results are illustrated, and important characteristics such as the linearity of the measurements are investigated and shown to hold. Finally, possible future work is suggested, such as further pattern recognition or observing the effect of masks on the power consumption. 
/ Graduate
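The power-analysis setting studied in this report can be sketched with a toy correlation attack on simulated traces. This is a generic illustration of the attack class, not the report's actual scenario against Whitenoise; the leakage model (Hamming weight of plaintext XOR key, plus Gaussian noise) and all parameters are assumptions for demonstration.

```python
# Illustrative correlation power analysis (CPA) on simulated traces: the
# attacker ranks key guesses by how well a Hamming-weight leakage model
# predicts the measured power samples. Toy model, not the Whitenoise attack.
import random

def hw(x):
    return bin(x).count("1")

random.seed(1)
SECRET = 0x5A
plaintexts = [random.randrange(256) for _ in range(200)]
# Simulated power sample: Hamming weight of (plaintext XOR key) plus noise.
traces = [hw(p ^ SECRET) + random.gauss(0, 0.5) for p in plaintexts]

def corr(xs, ys):
    """Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# The correct guess predicts the leakage best and wins the ranking.
best = max(range(256),
           key=lambda k: corr([hw(p ^ k) for p in plaintexts], traces))
```

With only 200 noisy traces the correct key byte already stands out clearly, which is why countermeasures such as masking (mentioned as future work above) aim to destroy exactly this correlation.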
723

Analysis and design of operational steganography techniques

Bodin, Nicolas 28 August 2013 (has links) (PDF)
Steganography is the science of hidden writing: an individual attempts to communicate with another party without raising suspicion that any communication is taking place at all. It complements cryptography (securing the content of the communication, COMSEC) whenever the very existence of the communication must be concealed (securing the transmission, TRANSEC). This thesis, carried out under the supervision of, and for the benefit of, the French Joint Defence Staff (État Major des Armées), therefore studies the techniques needed to build a technically operational and reasonably robust steganographic scheme, designed to embed a message of about ten kilobytes in a JPEG image of reasonable dimensions and to withstand the state-of-the-art attacks. To make the scheme as secure as possible, the most common data formats (JPEG, BMP, MP3) are studied before a first embedding algorithm is defined. That algorithm, based on Hopper's work, remains conceptual but lays the foundations of our algorithm, named IMEI, which maximizes the embedding efficiency and therefore minimizes the number of modifications made to the cover medium. An analysis of the HUGO algorithm presented in the context of the BOSS challenge allows us to define a steganalysis protocol, as well as a second important building block of IMEI. The last part of this manuscript is devoted to steganalysis: it evaluates IMEI and presents a genuinely operational steganalysis, including a study of the practical use of steganography and of its evaluation.
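The embedding efficiency that IMEI maximizes can be illustrated with classic matrix embedding via syndrome coding: two message bits are hidden in three cover bits while changing at most one of them, instead of the two changes plain LSB replacement may need. This sketch shows the general technique only, not the IMEI algorithm itself.

```python
# Illustrative matrix embedding (syndrome coding with a binary Hamming
# code): hide 2 message bits in 3 cover bits with at most 1 modification.
# This demonstrates the embedding-efficiency idea, not IMEI itself.
def embed(cover, msg):
    c = list(cover)
    s1, s2 = c[0] ^ c[2], c[1] ^ c[2]    # syndrome of the cover bits
    if (s1, s2) != tuple(msg):
        if s1 != msg[0] and s2 != msg[1]:
            c[2] ^= 1                    # flipping c[2] fixes both bits
        elif s1 != msg[0]:
            c[0] ^= 1                    # flipping c[0] fixes only bit 1
        else:
            c[1] ^= 1                    # flipping c[1] fixes only bit 2
    return c

def extract(c):
    # The receiver recomputes the syndrome; no key or cover image needed.
    return [c[0] ^ c[2], c[1] ^ c[2]]
```

On average only 3/4 of the 3-bit groups need any change at all, so the expected number of modifications per embedded bit drops well below the 1/2 of naive LSB embedding, which is precisely what makes the stego image harder for steganalysis to detect.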
724

The Self Power Map and its Image Modulo a Prime

Anghel, Catalina Voichita 02 August 2013 (has links)
The self-power map is the function from the set of natural numbers to itself which sends the number $n$ to $n^n$. Motivated by applications to cryptography, we consider the image of this map modulo a prime $p$. We study the question of how large $x$ must be so that $n^n \equiv a \bmod p$ has a solution with $1 \le n \le x$, for every residue class $a$ modulo $p$. While $n^n \bmod p$ is not uniformly distributed, it does appear to behave in certain ways as a random function. We give a heuristic argument to show that the expected $x$ is approximately ${p^2\log \phi(p-1)/\phi(p-1)}$, using the coupon collector problem as a model. Rigorously, we prove the bound $x < p^{2-\alpha}$ for sufficiently large $p$ and a fixed constant $\alpha > 0$ independent of $p$, using a counting argument and exponential sum bounds. Additionally, we prove nontrivial bounds on the number of solutions of $n^n \equiv a \bmod p$ for a fixed residue class $a$ when $1 \le n \le x$, extending the known bounds when $1 \le n \le p-1$.
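The coverage question can be explored numerically for small primes. The sketch below (illustrative only, not the thesis's proofs) finds the smallest $x$ for which $n^n \bmod p$ hits every residue class; termination is guaranteed because, by the Chinese Remainder Theorem, choosing $n \equiv a \bmod p$ and $n \equiv 1 \bmod (p-1)$ gives $n^n \equiv a \bmod p$ with $n \le p(p-1)$.

```python
# Numeric exploration of the self-power map n -> n^n mod p for a small
# prime, alongside the coupon-collector heuristic from the abstract.
from math import gcd, log

def euler_phi(m):
    # Euler's totient by brute force; fine for tiny arguments.
    return sum(1 for k in range(1, m + 1) if gcd(k, m) == 1)

def coverage_x(p):
    # Smallest x such that {n^n mod p : 1 <= n <= x} is all of Z/pZ.
    # A solution for every residue exists with n <= p(p-1) via the CRT.
    seen, n = set(), 0
    while len(seen) < p:
        n += 1
        seen.add(pow(n, n, p))
    return n

p = 11
x = coverage_x(p)
# Heuristic expectation from the abstract: p^2 * log(phi(p-1)) / phi(p-1).
expected = p * p * log(euler_phi(p - 1)) / euler_phi(p - 1)
```

For such tiny primes the heuristic is only indicative, but the brute-force search makes the object of study concrete: the map is far from injective, yet it does eventually cover every residue class.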
726

Towards Template Security for Iris-based Biometric Systems

Fouad, Marwa 18 April 2012 (has links)
Personal identity refers to a set of attributes (e.g., name, social insurance number, etc.) that are associated with a person. Identity management is the process of creating, maintaining and destroying identities of individuals in a population. Biometric technologies use statistical analysis of an individual’s biological or behavioral traits to determine his or her identity. Biometric-based authentication systems offer a reliable solution for identity management because of, among other reasons, their uniqueness, relative stability over time and security. Public acceptance of biometric systems will depend on their ability to ensure robustness, accuracy and security. Although the robustness and accuracy of such systems are rapidly improving, there remain issues of security and of balancing it with privacy. While the uniqueness of biometric traits offers a convenient and reliable means of identification, it also poses the risk of unauthorized cross-referencing among databases that use the same biometric trait. There is also a high risk if a biometric database is compromised, since it is not possible to revoke the biometric trait and re-issue a new one, as is the case with passwords and smart keys. This unique attribute of biometric-based authentication systems poses a challenge that might slow down public acceptance and the use of biometrics for authentication purposes in large-scale applications. In this research we investigate the vulnerabilities of biometric systems, focusing on template security in iris-based biometric recognition systems. The iris has been well studied for authentication purposes and has proven accurate in large-scale applications at several airports and border crossings around the world. The most widely accepted iris recognition systems are based on Daugman’s model, which creates a binary iris template. 
In this research we develop different systems using watermarking, bio-cryptography as well as feature transformation to achieve revocability and security of binary templates in iris based biometric authentication systems, while maintaining the performance that enables widespread application of these systems. All algorithms developed in this research are applicable on already existing biometric authentication systems and do not require redesign of these existing, well established iris-based authentication systems that use binary templates.
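One simple feature-transformation idea for revocable binary templates can be sketched as a keyed bit permutation plus XOR mask. This is a hypothetical illustration of the revocability concept, not one of the thesis's algorithms: Hamming distance between templates is preserved under the same key, so matching still works in the transformed domain, while a compromised template is revoked by issuing a new key.

```python
# Hypothetical cancelable-template sketch: a key-derived permutation and
# XOR mask over a binary template. Illustrates revocability only; it is
# not one of the watermarking/bio-cryptographic schemes of the thesis.
import random

def transform(template_bits, key):
    rng = random.Random(key)             # key-derived, repeatable randomness
    perm = list(range(len(template_bits)))
    rng.shuffle(perm)
    mask = [rng.randrange(2) for _ in perm]
    # Permute the bits, then mask them; XOR of two transformed templates
    # under the same key equals the permuted XOR of the originals, so
    # Hamming distance (and hence iris matching) is unchanged.
    return [template_bits[i] ^ m for i, m in zip(perm, mask)]
```

Note that a toy construction like this offers only limited protection if the key leaks; the thesis's point is precisely that stronger mechanisms (watermarking, bio-cryptography, feature transformation) are needed to secure binary templates without hurting recognition performance.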
727

Experimental quantum communication in demanding regimes

Meyer-Scott, Evan January 2011 (has links)
Quantum communication promises to outperform its classical counterparts and enable protocols previously impossible. Specifically, quantum key distribution (QKD) allows a cryptographic key to be shared between distant parties with provable security. Much work has been performed on theoretical and experimental aspects of QKD, and the push is on to make it commercially viable and integrable with existing technologies. To this end I have performed simulations and experiments on QKD and other quantum protocols in regimes previously unexplored. The first experiment involves QKD via distributed entanglement through the standard telecommunications optical fibre network. I show that entanglement is preserved, even when the photons used are a shorter wavelength than the design of the optical fibre calls for. This surprising result is then used to demonstrate QKD over installed optical fibre, even with co-propagating classical traffic. Because the quantum and classical signals are sufficiently separated in wavelength, little cross-talk is observed, leading to high compatibility between this type of QKD and existing telecommunications infrastructure. Secondly, I demonstrate the key components of fully-modulated decoy-state QKD over the highest-loss channel to date, using a novel photon source based on weak coherent (laser) pulses. This system has application in a satellite uplink of QKD, which would enable worldwide secure communication. The uplink allows the complex quantum source to be kept on the ground while only simple receivers are in space, but suffers from high link loss due to atmospheric turbulence, necessitating the use of specific photon detectors and highly tailored photon pulses. My results could be applied in a near term satellite mission.
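The protocol layer underlying these experiments can be sketched as a toy BB84-style sifting simulation over an ideal, noise- and eavesdropper-free channel. This is an illustration of the QKD key-agreement logic only, not the fibre or satellite systems described above.

```python
# Toy BB84 sifting: Alice sends random bits in random bases, Bob measures
# in random bases, and both keep only the positions where bases agreed.
# Ideal channel -- no noise, loss, or eavesdropper.
import random

random.seed(7)
n = 1000
a_bits  = [random.randrange(2) for _ in range(n)]   # Alice's raw key bits
a_basis = [random.randrange(2) for _ in range(n)]   # Alice's basis choices
b_basis = [random.randrange(2) for _ in range(n)]   # Bob's basis choices
# A matching basis reproduces Alice's bit; a mismatch gives a coin flip.
b_bits = [a if ab == bb else random.randrange(2)
          for a, ab, bb in zip(a_bits, a_basis, b_basis)]
# Sifting: publicly compare bases (not bits) and discard mismatches,
# keeping roughly half of the raw transmissions.
sifted = [(a, b) for a, ab, b, bb in zip(a_bits, a_basis, b_bits, b_basis)
          if ab == bb]
key = [a for a, _ in sifted]
```

In a real system the sifted key would still undergo error estimation, error correction, and privacy amplification; the high channel loss of the satellite uplink discussed above shrinks the raw count `n` that survives to this stage, which is why decoy states and tailored pulses matter.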
728

On Error Detection and Recovery in Elliptic Curve Cryptosystems

Alkhoraidly, Abdulaziz Mohammad January 2011 (has links)
Fault analysis attacks represent a serious threat to a wide range of cryptosystems including those based on elliptic curves. With the variety and demonstrated practicality of these attacks, it is essential for cryptographic implementations to handle different types of errors properly and securely. In this work, we address some aspects of error detection and recovery in elliptic curve cryptosystems. In particular, we discuss the problem of wasteful computations performed between the occurrence of an error and its detection and propose solutions based on frequent validation to reduce that waste. We begin by presenting ways to select the validation frequency in order to minimize various performance criteria including the average and worst-case costs and the reliability threshold. We also provide solutions to reduce the sensitivity of the validation frequency to variations in the statistical error model and its parameters. Then, we present and discuss adaptive error recovery and illustrate its advantages in terms of low sensitivity to the error model and reduced variance of the resulting overhead especially in the presence of burst errors. Moreover, we use statistical inference to evaluate and fine-tune the selection of the adaptive policy. We also address the issue of validation testing cost and present a collection of coherency-based, cost-effective tests. We evaluate variations of these tests in terms of cost and error detection effectiveness and provide infective and reduced-cost, repeated-validation variants. Moreover, we use coherency-based tests to construct a combined-curve countermeasure that avoids the weaknesses of earlier related proposals and provides a flexible trade-off between cost and effectiveness.
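The trade-off behind choosing a validation frequency can be shown with a toy cost model. The numbers and the model itself are illustrative assumptions, not the thesis's criteria: frequent checks cost more checking time, but an error caught late wastes more work since the last validated checkpoint.

```python
# Toy expected-overhead model for frequent validation: a computation of N
# steps is validated every k steps at cost v per check; an error strikes
# each step with probability q and wastes, on average, k/2 steps of work.
def expected_overhead(k, N=1000, v=5.0, q=0.001):
    checking = (N / k) * v           # cost of the validations themselves
    rollback = (q * N) * k / 2.0     # expected recomputation after errors
    return checking + rollback

# The two terms pull in opposite directions; in this model the optimum is
# k* = sqrt(2 * v / q), found here by brute force.
best_k = min(range(1, 1001), key=expected_overhead)
```

With v = 5 and q = 0.001 the analytic optimum is k* = sqrt(2·5/0.001) = 100, matching the brute-force search; shifting the error rate q moves the optimum, which is the sensitivity problem the thesis's adaptive recovery policies address.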
729

Optimal Pairings on BN Curves

Yu, Kewei 17 August 2011 (has links)
Bilinear pairings are being used in ingenious ways to solve various protocol problems. Much research has been done on improving the efficiency of pairing computations. This thesis gives an introduction to the Tate pairing and some variants, including the ate pairing, Vercauteren's pairing, and the R-ate pairing. We describe the Barreto-Naehrig (BN) family of pairing-friendly curves, and analyze three different coordinate systems (affine, projective, and Jacobian) for implementing the R-ate pairing. Finally, we examine some recent work on speeding up the pairing computation and provide improved estimates of the pairing costs on a particular BN curve.
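The BN family mentioned above is defined by polynomial parameterizations of the field prime p, the group order r, and the Frobenius trace t in an integer parameter u. The sketch below states those polynomials; u = 1 is a toy choice that happens to make both p and r prime, whereas real implementations pick u so that p has the desired bit length (e.g. 254 or 256 bits).

```python
# Barreto-Naehrig parameterization: p, r, t are fixed polynomials in an
# integer parameter u. An implementation searches for u making p(u) and
# r(u) prime at the target size; u = 1 is only a toy example.
def bn_params(u):
    p = 36 * u**4 + 36 * u**3 + 24 * u**2 + 6 * u + 1   # field prime
    r = 36 * u**4 + 36 * u**3 + 18 * u**2 + 6 * u + 1   # group order
    t = 6 * u**2 + 1                                    # Frobenius trace
    return p, r, t

p, r, t = bn_params(1)       # toy instance: p = 103, r = 97, t = 7
```

The identity r = p + 1 - t holds for every u, i.e. the curve order equals the group order, so BN curves have prime order and embedding degree 12, which is what makes them attractive for the R-ate pairing analyzed in the thesis.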
730

Counting And Constructing Boolean Functions With Particular Difference Distribution Vectors

Yildirim, Elif 01 June 2004 (has links) (PDF)
In this thesis we deal with Boolean functions that have particular difference distribution vectors. Besides their main properties, we focus especially on the strict avalanche criterion because of its cryptographic significance. We not only treat known methods but also demonstrate some new methods for counting and constructing such functions. Furthermore, by performing statistical tests, we observe a number of interesting properties.
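The objects of this thesis can be made concrete with a short sketch: the difference distribution vector of a Boolean function, and a check of the strict avalanche criterion. The example function x1·x2 ⊕ x3·x4 is the standard 4-variable bent function, used here purely for illustration, not a function from the thesis.

```python
# Difference distribution vector (DDV) and strict avalanche criterion
# (SAC) for a Boolean function on n bits, given as f: int -> {0, 1}.
def ddv(f, n):
    """For each nonzero direction a, count inputs x with f(x) != f(x ^ a)."""
    return [sum(f(x) ^ f(x ^ a) for x in range(2 ** n))
            for a in range(1, 2 ** n)]

def satisfies_sac(f, n):
    """SAC: flipping any single input bit changes the output on exactly
    half of all 2^n inputs."""
    half = 2 ** (n - 1)
    return all(sum(f(x) ^ f(x ^ (1 << i)) for x in range(2 ** n)) == half
               for i in range(n))

# Example: the bent function f(x1, x2, x3, x4) = x1*x2 XOR x3*x4,
# with x1 the most significant bit of the integer input.
f = lambda x: ((x >> 3) & (x >> 2) & 1) ^ ((x >> 1) & x & 1)
```

A bent function's derivative is balanced in every nonzero direction, so its DDV is the constant vector (8, 8, …, 8) for n = 4, and SAC (the unit directions) holds as a special case; the thesis counts and constructs functions with prescribed vectors of exactly this kind.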
