211 |
Development of an application for the encryption and decryption of written text. Kakavas, Gerasimos. 05 May 2009 (has links)
The main goal of this diploma thesis is the creation of a library of functions for encrypting written text, which can subsequently be extended and developed further by adding new functions or by improving the existing ones. Within its framework, four encryption-decryption functions were implemented; these are presented in detail in the text of the thesis. The functions are compared for speed against the DES (Data Encryption Standard) algorithm, and the comparison shows that one of them achieves shorter encryption-decryption times, while the times of the other three are slightly higher. Finally, an assessment is made of the security provided by the use of these functions.
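The abstract does not specify the four functions, but the round-trip contract any such library function must satisfy can be sketched with a minimal keyed XOR stream cipher. This is an illustrative stand-in, not one of the thesis's functions; the names `encrypt`/`decrypt` and the SHA-256-based keystream are assumptions.

```python
import hashlib
from itertools import count

def keystream(key: bytes):
    # Derive a pseudo-random byte stream by hashing the key with a counter
    # (illustrative construction only, not the thesis's actual design).
    for i in count():
        yield from hashlib.sha256(key + i.to_bytes(8, "big")).digest()

def encrypt(key: bytes, plaintext: str) -> bytes:
    data = plaintext.encode("utf-8")
    return bytes(b ^ k for b, k in zip(data, keystream(key)))

def decrypt(key: bytes, ciphertext: bytes) -> str:
    # XOR is its own inverse, so decryption reuses the same keystream.
    return bytes(b ^ k for b, k in zip(ciphertext, keystream(key))).decode("utf-8")
```

Timing such a routine against a DES implementation over the same input sizes is how the speed comparison described above would typically be carried out.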
|
212 |
Transformations for linguistic steganography. Chang, Ching-Yun. January 2013 (has links)
No description available.
|
213 |
Evaluating Large Degree Isogenies between Elliptic Curves. Soukharev, Vladimir. 12 1900 (has links)
An isogeny between elliptic curves is an algebraic morphism which is a group homomorphism. Many applications in cryptography require evaluating large degree isogenies between elliptic curves efficiently. For ordinary curves with the same endomorphism ring, the fastest previously known algorithm has a worst-case running time which is exponential in the length of the input. In this thesis we solve this problem in subexponential time under reasonable heuristics. We give two versions of our algorithm: a slower version assuming GRH and a faster version assuming stronger heuristics. Our approach is based on factoring the ideal corresponding to the kernel of the isogeny, modulo principal ideals, into a product of smaller prime ideals for which the isogenies can be computed directly. Combined with previous work of Bostan et al., our algorithm yields equations for large degree isogenies in quasi-optimal time given only the starting curve and the kernel.
|
214 |
Classical and quantum strategies for bit commitment schemes in the two-prover model. Simard, Jean-Raymond. January 2007 (has links)
We show that the long-standing assumption of "no communication" between the provers of the two-prover model is not sufficiently precise to guarantee the security of a bit commitment scheme against malicious adversaries. Indeed, we show how a simple correlated random variable, which does not allow the provers to communicate, can be used to cheat a simplified version (sBGKW) of the bit commitment scheme of Ben-Or, Goldwasser, Kilian, and Wigderson [BGKW88]. Instead we propose a stronger notion of separation between the two provers which takes into account correlated computations. To emphasize the risk that entanglement still represents for the security of a commitment scheme despite the stronger notion of separation, we present two variations of the sBGKW scheme that can be cheated by quantum provers with probability (almost) one. A complete proof of security against quantum adversaries is then given for the sBGKW scheme. By reduction we also obtain the security of the original BGKW scheme against quantum provers. For the unfamiliar reader, basic notions of quantum information processing are provided to facilitate the understanding of the proofs presented.
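For readers unfamiliar with the primitive itself, the hiding/binding interface of a bit commitment can be sketched with a generic hash-based construction. This is purely illustrative and is not the two-prover BGKW scheme analyzed above, which relies on prover separation rather than computational assumptions.

```python
import hashlib
import secrets

def commit(bit: int) -> tuple:
    # Commit to a bit by hashing it with fresh randomness r.
    # Hiding: the digest reveals nothing practical about the bit.
    # Binding: finding a second (bit, r) pair requires a hash collision.
    r = secrets.token_bytes(16)
    c = hashlib.sha256(bytes([bit]) + r).digest()
    return c, r

def reveal(c: bytes, bit: int, r: bytes) -> bool:
    # The receiver checks the opened value against the commitment.
    return c == hashlib.sha256(bytes([bit]) + r).digest()
```

In the two-prover setting, by contrast, security is meant to follow from the physical separation of the provers, which is exactly the assumption the thesis shows must be formulated with care.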
|
215 |
On the Security of Some Variants of RSA. Hinek, M. Jason. January 2007 (has links)
The RSA cryptosystem, named after its inventors, Rivest, Shamir and Adleman, is the most widely known and widely used public-key cryptosystem in the world today. Compared to other public-key cryptosystems, such as elliptic curve cryptography, RSA requires longer key lengths and is computationally more expensive. In order to address these shortcomings, many variants of RSA have been proposed over the years. While the security of RSA has been well studied since it was proposed in 1977, many of these variants have not. In this thesis, we investigate the security of five of these variants of RSA. In particular, we provide detailed analyses of the best known algebraic attacks (including some new attacks) on instances of RSA with certain special private exponents, multiple instances of RSA sharing a common small private exponent, Multi-prime RSA, Common Prime RSA, and Dual RSA.
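Of the variants listed, Multi-prime RSA is the simplest to state: the modulus is a product of three or more distinct primes rather than two, which speeds up decryption via the CRT. A toy sketch follows; the tiny primes and the function name are ours, purely for illustration of the key relations.

```python
from math import gcd

def multiprime_rsa_keygen(primes, e=65537):
    # Multi-prime RSA: n = p1 * p2 * ... * pk for k > 2 distinct primes.
    # (Toy parameters only; real keys use primes of hundreds of bits.)
    n = 1
    phi = 1
    for p in primes:
        n *= p
        phi *= (p - 1)
    assert gcd(e, phi) == 1, "e must be invertible modulo phi(n)"
    d = pow(e, -1, phi)  # private exponent (Python 3.8+ modular inverse)
    return (n, e), d

# Encryption/decryption work exactly as in standard RSA:
(n, e), d = multiprime_rsa_keygen([1009, 1013, 1019])
m = 123456
c = pow(m, e, n)
```

The attacks analyzed in the thesis exploit additional structure (e.g. unusually small private exponents); the sketch above only fixes the textbook key relations those attacks target.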
|
216 |
Revisiting the security model for aggregate signature schemes. Lacharité, Marie-Sarah. January 2014 (has links)
Aggregate signature schemes combine the digital signatures of multiple users on different messages into one single signature. The Boneh-Gentry-Lynn-Shacham (BGLS) aggregate signature scheme is one such scheme, based on pairings, where anyone can aggregate the signatures in any order. We suggest improvements to its current chosen-key security model. In particular, we argue that the scheme should be resistant to attackers that can adaptively choose their target users, and either replace other users' public keys or expose other users' private keys. We compare these new types of forgers to the original targeted-user forger, building up to the stronger replacement-and-exposure forger. Finally, we present a security reduction for a variant of the BGLS aggregate signature scheme with respect to this new notion of forgery. Recent attacks by Joux and others on the discrete logarithm problem in small-characteristic finite fields dramatically reduced the security of many type I pairings. Therefore, we explore security reductions for BGLS with type III rather than type I pairings. Although our reductions are specific to BGLS, we believe that other aggregate signature schemes could benefit from similar changes to their security models.
|
217 |
Cryptographic Credentials with Privacy-preserving Biometric Bindings. Bissessar, David. 22 January 2013 (has links)
Cryptographic credentials allow user authorizations to be granted and verified, and have applications such as e-Passports, e-Commerce, and electronic cash. This thesis proposes a privacy-protecting approach of binding biometrically derived keys to cryptographic credentials to prevent unauthorized lending. Our approach builds on the 2011 work of Adams, offering the additional benefits of privacy protection of biometric information, generality across biometric modalities, and performance. Our protocol integrates into Brands' Digital Credential scheme and the Anonymous Credentials scheme of Camenisch and Lysyanskaya. We describe a detailed integration with the Digital Credential scheme and sketch the integration into the Anonymous Credentials scheme. Security proofs for non-transferability, correctness of ownership, and unlinkability are provided for the protocol's instantiation into Digital Credentials.
Our approach uses specialized biometric devices in both the issue and show protocols. These devices are configured with our proposed primitive, the fuzzy extractor indistinguishability adaptor, which uses a traditional fuzzy extractor to create and regenerate cryptographic keys from biometric data, and IND-CCA2 secure encryption to protect the generated public data against multiplicity attacks. Pedersen commitments are used to hold the key at issue and show time, and a zero-knowledge proof of knowledge is used to ensure correspondence between the key created at issue time and the key regenerated at show time. The above is done in a manner which preserves biometric privacy and delivers non-transferability of digital credentials.
The biometric itself is not stored or divulged to any of the parties involved in the protocol. Privacy protection in multiple-enrollment scenarios is achieved by the fuzzy extractor indistinguishability adaptor. The zero-knowledge proof of knowledge is used in the showing protocol to prove knowledge of values without divulging them.
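The Pedersen commitment used to hold the key is a standard primitive and can be sketched directly. The toy group parameters below are ours for illustration; real deployments use cryptographically large groups, and this is the commitment building block only, not the thesis's full protocol.

```python
import secrets

# Toy parameters: safe prime p = 2q + 1 with q prime (83 and 167 here).
q, p = 83, 167
g = pow(5, 2, p)  # squaring lands in the order-q subgroup
h = pow(7, 2, p)  # second generator; its discrete log w.r.t. g is assumed unknown

def commit(m: int, r: int) -> int:
    # Pedersen commitment C = g^m * h^r mod p.
    # Hiding: r masks m information-theoretically.
    # Binding: opening to a different m would reveal log_g(h).
    return (pow(g, m % q, p) * pow(h, r % q, p)) % p

def open_commitment(c: int, m: int, r: int) -> bool:
    # Verifier recomputes the commitment from the opened values.
    return c == commit(m, r)
```

A zero-knowledge proof of knowledge of (m, r) for such a commitment, as the protocol above requires, can then be run without ever opening it.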
|
218 |
Democracy Enhancing Technologies: Toward deployable and incoercible E2E elections. Clark, Jeremy. January 2011 (has links)
End-to-end verifiable election systems (E2E systems) provide a provably correct tally while maintaining the secrecy of each voter's ballot, even if the voter is complicit in demonstrating how they voted. Providing voter incoercibility is one of the main challenges of designing E2E systems, particularly in the case of internet voting. A second challenge is building deployable, human-votable E2E systems that conform to election laws and conventions. This dissertation examines deployability, coercion-resistance, and their intersection in election systems. In the course of this study, we introduce three new election systems (Scantegrity, Eperio, and Selections), report on two real-world elections using E2E systems (Punchscan and Scantegrity), and study incoercibility issues in one deployed system (Punchscan). In addition, we propose and study new practical primitives for random beacons, secret printing, and panic passwords. These are tools that can be used in an election to, respectively, generate publicly verifiable random numbers, distribute the printing of secrets between non-colluding printers, and covertly signal duress during authentication. While developed to solve specific problems in deployable and incoercible E2E systems, these techniques may be of independent interest.
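The random beacon primitive mentioned above has a simple general form: hash together public, hard-to-influence inputs so that anyone can recompute the same "random" value. The sketch below illustrates that general idea only and is not the construction proposed in the dissertation; the input strings are invented examples.

```python
import hashlib

def beacon(public_inputs) -> bytes:
    # Derive a publicly verifiable random value from agreed public inputs
    # (e.g., closing stock prices or lottery draws). Anyone holding the same
    # inputs recomputes the same value, so the output is auditable.
    h = hashlib.sha256()
    for x in public_inputs:
        # Hash each input individually first to fix its length before chaining.
        h.update(hashlib.sha256(x).digest())
    return h.digest()

inputs = [b"stock:XYZ:2011-06-01:12.34", b"lottery:2011-06-01:07-19-23"]
```

In an election audit, such a value could seed which ballots or tables are opened for verification, with no single party able to bias the choice.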
|
219 |
On Experimental Quantum Communication and Cryptography. Erven, Christopher. January 2012 (has links)
One of the most fascinating recent developments in research has been how different disciplines have become more and more interconnected. So much so that fields as disparate as information theory and fundamental physics have combined to produce ideas for the next generation of computing and secure information technologies, both of which have far-reaching consequences. For more than fifty years Moore's law, which describes the trend of the transistor's size shrinking by half every two years, has proven to be uncannily accurate. However, the computing industry is now approaching a fundamental barrier as the size of a transistor approaches that of an individual atom and the laws of physics and quantum mechanics take over. Rather than look at this as the end, quantum information science has emerged to ask what additional power and functionality might be realized by harnessing some of these quantum effects. This thesis presents work in the sub-field of quantum cryptography, which seeks to use quantum means to assure the security of one's communications. The beauty of quantum cryptographic methods is that they can be proven secure, now and indefinitely into the future, relying solely on the validity of the laws of physics for their proofs of security. This is something which is impossible for nearly all current classical cryptographic methods to claim.
The thesis begins by examining the first implementation of an entangled quantum key distribution system over two free-space optical links. This system represents the first test-bed of its kind in the world, and while its practical importance in terrestrial applications is limited to a smaller university or corporate campus, the system mimics the setup for an entangled satellite system, aiding in the study of distributing entangled photons from an orbiting satellite to two earthbound receivers. Having completed the construction of a second free-space link and the automation of the alignment system, I securely distribute keys to Alice and Bob in two distant locations separated by 1,575 m with no direct line-of-sight between them. I examine all of the assumptions necessary for my claims of security, something which is particularly important for moving these systems out of the lab and into commercial industry. I then go on to describe the free-space channel over which the photons are sent and the implementation of each of the major system components. I close with a discussion of the experiment, which saw raw detected entangled photon rates of 565 s^{-1} and a quantum bit error rate (QBER) of 4.92%, resulting in a final secure key rate of 85 bits/s. Over the six-hour nighttime experiment I was able to generate 1,612,239 bits of secure key.
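For context on how a QBER of 4.92% relates to the final key rate, the standard asymptotic bound for BB84-style protocols, r = 1 - 2*h2(Q) with h2 the binary entropy, gives the fraction of sifted key surviving error correction and privacy amplification. This is an idealized textbook estimate, not the finite-key analysis used in the experiment.

```python
from math import log2

def h2(q: float) -> float:
    # Binary entropy function.
    return -q * log2(q) - (1 - q) * log2(1 - q)

def bb84_key_fraction(qber: float) -> float:
    # Asymptotic secret-key fraction: error correction leaks ~h2(Q) and
    # privacy amplification removes another ~h2(Q), hence 1 - 2*h2(Q).
    # Clamped at zero: above ~11% QBER no secret key can be distilled.
    return max(0.0, 1 - 2 * h2(qber))
```

At Q = 4.92% this bound allows roughly 43% of the sifted key to be retained, illustrating why the secure key rate (85 bits/s) is well below the raw detection rate.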
With a successful QKD experiment completed, this thesis then turns to the problem of improving the technology to make it more practical by increasing the key rate of the system and thus the speed at which it can securely encrypt information. It does so in three different ways, involving each of the major disciplines comprising the system: measurement hardware, source technology, and software post-processing. First, I experimentally investigate a theoretical proposal for biasing the measurement bases in the QKD system, showing a 79% improvement in the secret key generated from the same raw key rates. Next, I construct a second-generation entangled photon source with rates two orders of magnitude higher than the previous source, using the idea of a Sagnac interferometer. More importantly, the new source has a QBER as low as 0.93%, which is not only important for the security of the QKD system but will be required for the implementation of a new cryptographic primitive later. Lastly, I study the free-space link transmission statistics and the use of a signal-to-noise ratio (SNR) filter to improve the key rate by 25.2% from the same amount of raw key. The link statistics have particular relevance for a current project with the Canadian Space Agency to exchange a quantum key with an orbiting satellite, a project for which I have participated in two feasibility studies.
Wanting to study the usefulness of more recent ideas in quantum cryptography, this thesis then looks at the first experimental implementation of a new cryptographic primitive called oblivious transfer (OT) in the noisy storage model. This primitive has obvious important applications, as it can be used to implement a secure identification scheme provably secure in a quantum scenario. Such a scheme could one day be used, for example, to authenticate a user over short distances, such as at ATMs, which have proven to be particularly vulnerable to hacking and fraud. Over a four-hour experiment, Alice and Bob measure 405,642,088 entangled photon pairs with an average QBER of 0.93%, allowing them to create a secure OT key of 8,939,150 bits. As a first implementer, I examine many of the pressing issues currently preventing the scheme from being more widely adopted, such as the need to relax the dependence of the OT rate on the loss of the system and the need to extend the security proof to cover a wider range of quantum communication channels and memories. It is important to note that OT is fundamentally different from QKD with respect to security, as the information is never physically exchanged over the communication line but rather the joint equality function f(x) = f(y) is evaluated. Thus, security in QKD does not imply security for OT.
Finally, this thesis concludes with the construction and initial alignment of a second-generation free-space quantum receiver, useful for increasing QKD key rates, but designed for a fundamental test of quantum theory, namely a Svetlichny inequality violation. Svetlichny's inequality is a generalization of Bell's inequality to three particles, where any two of the three particles may be non-locally correlated. Even so, a violation of Svetlichny's inequality shows that certain quantum mechanical states are incompatible with this restricted class of non-local yet realistic theories. Svetlichny's inequality is particularly important because, while an overwhelming number of Bell experiments have been performed testing two-body correlations, experiments on many-body systems have been few and far between. Experiments of this type are particularly valuable to explore since we live in a many-body world. The new receiver incorporates an active polarization analyzer capable of switching between measurement bases on a microsecond time-scale through the use of a Pockels cell while maintaining high-fidelity measurements. Some of the initial alignment and analysis results are detailed, including the final measured contrasts of 1:25.2 and 1:22.6 in the rectilinear and diagonal bases respectively.
|
220 |
Quantum Key Distribution Data Post-Processing with Limited Resources: Towards Satellite-Based Quantum Communication. Gigov, Nikolay. 15 January 2013 (has links)
Quantum key distribution (QKD), a novel cryptographic technique for the secure distribution of secret keys between two parties, is the first successful quantum technology to emerge from quantum information science. The security of QKD is guaranteed by fundamental properties of quantum mechanical systems, unlike public-key cryptography, whose security depends on mathematical problems believed to be difficult to solve, such as factoring. Current terrestrial quantum links are limited to about 250 km. However, QKD could soon be deployed on a global scale over free-space links to an orbiting satellite used as a trusted node.
Envisioning a photonic uplink to a quantum receiver positioned on a low Earth orbit satellite, the Canadian Quantum Encryption and Science Satellite (QEYSSat) is a collaborative project involving Canadian universities, the Canadian Space Agency (CSA) and industry partners. This thesis presents some of the research conducted towards feasibility studies of the QEYSSat mission.
One of the main goals of this research is to develop technologies for data acquisition and processing required for a satellite-based QKD system. A working testbed system helps to establish firmly grounded estimates of the overall complexity, the computing resources necessary, and the bandwidth requirements of the classical communication channel. It can also serve as a good foundation for the design and development of a future payload computer onboard QEYSSat.
This thesis describes the design and implementation of a QKD post-processing system which aims to minimize the computing requirements at one side of the link, unlike most traditional implementations which assume symmetric computing resources at each end. The post-processing software features precise coincidence analysis, error correction based on low-density parity-check codes, privacy amplification employing Toeplitz hash functions, and a procedure for automated polarization alignment.
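Of the post-processing stages listed, privacy amplification with Toeplitz hashing is compact enough to sketch: a seed of n + m - 1 random bits defines an m-by-n Toeplitz matrix that compresses the n-bit corrected key into m nearly uniform bits. The bit-level implementation below is ours for illustration; practical systems (including resource-constrained ones like the payload described here) use FFT-based multiplication rather than this naive O(nm) loop.

```python
def toeplitz_hash(bits, seed, out_len):
    # Toeplitz-matrix universal hashing for privacy amplification.
    # The matrix entry T[i][j] = seed[i - j + n - 1], so constant diagonals
    # are fully determined by a seed of length n + out_len - 1.
    n = len(bits)
    assert len(seed) == n + out_len - 1
    out = []
    for i in range(out_len):
        acc = 0
        for j in range(n):
            # GF(2) dot product of matrix row i with the input bits.
            acc ^= seed[i - j + n - 1] & bits[j]
        out.append(acc)
    return out
```

Because the map is linear over GF(2) and the seed can be public, only the seed plus the compressed output need to cross the classical channel, which suits the asymmetric-resource design described above.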
The system's hardware and software components integrate fully with a quantum optical apparatus used to demonstrate the feasibility of QKD with a satellite uplink. Detailed computing resource requirements and QKD results from the operation of the entire system at high-loss regimes are presented here.
|