81

Anonymity and Linkability

January 2018 (has links)
This thesis considers systems for anonymous communication between users of a cybersystem. Specifically, we consider the scenario where communications generated by the same user repeatedly over time can or must be linked. Linked user behavior can leak information, which adversaries can use to de-anonymize users. Analyzing linked behavior can also generate information about the use of anonymity protocols that is valuable for research, leading to more effective protocols. But techniques to collect such data must include assurances that the methods and outputs do not compromise user privacy. A main result of this thesis is an anonymity protocol called Private Set-Union Cardinality, designed to aggregate linked private user data safely. We prove that Private Set-Union Cardinality securely calculates the noisy cardinality of the union of a collection of distributed private data sets. This protocol is intended to take measurements in real-world anonymity systems like Tor, and we prove it is secure even when a majority of the participants are dishonest, as well as under general concurrent composition. The remaining results analyze path selection in anonymous routing systems. To obtain our results, we develop a mathematical framework to measure information leakage during repeated linkable path selection and propose new metrics: a radius that measures worst-case behavior, and a neighborhood graph that visualizes the degradation of the system as a whole over time. We use these metrics to derive theoretical upper bounds on an adversary's accuracy in de-anonymization. Finally, we investigate an attack in which users can be de-anonymized by the information an adversary learns when failing to observe some event. We call these occurrences non-observations, and we develop a theory of non-observations in anonymous routing systems, deriving theoretical bounds on the information leakage due to this behavior both in the general case and for Tor. / Ellis Fenske
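The quantity the protocol estimates can be illustrated with a toy, centralized sketch. This is emphatically not the secure protocol itself: here a trusted aggregator sees every set, which is exactly what Private Set-Union Cardinality avoids, and the Laplace noise scale is an illustrative assumption.

```python
import random

def noisy_union_cardinality(private_sets, epsilon=0.5):
    """Toy sketch: the noisy count of |S1 ∪ ... ∪ Sn| the protocol is
    designed to estimate. A trusted aggregator model, for illustration only."""
    union = set()
    for s in private_sets:
        union |= s
    # Laplace(0, 1/epsilon) noise, sampled as the difference of two
    # independent exponentials (a standard identity).
    noise = (random.expovariate(1.0) - random.expovariate(1.0)) / epsilon
    return len(union) + noise

# Three hypothetical relays, each observing overlapping sets of identifiers.
observations = [{1, 2, 3}, {3, 4}, {4, 5, 6}]
estimate = noisy_union_cardinality(observations)
```

The true union here has 6 elements; the estimate fluctuates around that value with noise calibrated by `epsilon`.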
82

Quantum Encryption with Certified Deletion

Islam, Rabib 20 January 2020 (has links)
In the context of classical information, every message is composed of 0s and 1s; these messages can generally be copied at will. However, when quantum phenomena are used to model information, this guarantee no longer exists. This difference gives rise to a range of cryptographic possibilities when one considers encoding certain messages as quantum information. In our case, we analyze a potential benefit of encoding part of an encryption scheme’s ciphertext as quantum information. We call this type of ciphertext a quantum ciphertext. In particular, quantum ciphertexts are useful when one wants to be able to prove the deletion of the plaintext underlying a ciphertext. Since classical ciphertexts can be copied, clearly such a feat is impossible using classical information alone. However, we show that quantum encodings allow such certified deletion. More precisely, we show that it is possible to encrypt classical data into a quantum ciphertext such that the recipient of the ciphertext can produce a classical string which proves to the originator that the recipient has relinquished any chance of recovering the plaintext, should the decryption key be revealed. Our scheme is feasible with current quantum technology: the honest parties only require quantum devices for single-qubit preparation and measurements, and the scheme is robust against noise in these devices. Furthermore, we provide a proof of security which requires only a finite amount of communication, and which therefore avoids the common technique of relying on the analysis of an asymptotic case.
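The conjugate-coding intuition behind such schemes can be shown with a toy classical simulation. This is a sketch under strong assumptions, not the actual scheme: a qubit is modeled as a (basis, bit) pair, and measuring in the wrong basis yields a uniformly random outcome, which is the single quantum property the illustration relies on.

```python
import random

# Model a BB84-style qubit as (basis, bit): basis 0 = computational, 1 = Hadamard.
def prepare(basis, bit):
    return (basis, bit)

def measure(qubit, meas_basis):
    basis, bit = qubit
    # Matching basis recovers the bit; a mismatched basis gives a coin flip.
    return bit if basis == meas_basis else random.randint(0, 1)

# Sender encodes random bits, some in the Hadamard basis.
n = 16
bases = [random.randint(0, 1) for _ in range(n)]
bits = [random.randint(0, 1) for _ in range(n)]
qubits = [prepare(b, v) for b, v in zip(bases, bits)]

# To certify deletion, the recipient measures everything in the Hadamard
# basis and reports the outcomes; this destroys the computational-basis
# information that decryption would have needed.
certificate = [measure(q, 1) for q in qubits]

# The sender accepts if the certificate matches on the Hadamard positions.
accepted = all(certificate[i] == bits[i] for i in range(n) if bases[i] == 1)
```

An honest recipient always passes this check, while a recipient who first measured in the computational basis (to keep decryption ability) would fail on roughly half of the Hadamard positions.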
83

René Schoof's Algorithm for Determining the Order of the Group of Points on an Elliptic Curve over a Finite Field

McGee, John J. 08 June 2006 (has links)
Elliptic curves have a rich mathematical history dating back to Diophantus (c. 250 C.E.), who used a form of these cubic equations to find right triangles of integer area with rational sides. In more recent times the deep mathematics of elliptic curves was used by Andrew Wiles et al. to construct a proof of Fermat's Last Theorem, a problem which challenged mathematicians for more than 300 years. In addition, elliptic curves over finite fields find practical application in the areas of cryptography and coding theory. For such problems, knowing the order of the group of points satisfying the elliptic curve equation is important to the security of these applications. In 1985 René Schoof published a paper [5] describing a polynomial time algorithm for solving this problem. In this thesis we explain some of the key mathematical principles that provide the basis for Schoof's method. We also present an implementation of Schoof's algorithm as a collection of Mathematica functions. The operation of each algorithm is illustrated by way of numerical examples. / Master of Science
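For small fields, the group order that Schoof's algorithm computes in polynomial time can be checked by brute-force enumeration; a naive sketch for illustration only, since Schoof's method exists precisely to avoid this exponential scan:

```python
def naive_curve_order(a, b, p):
    """Count points on y^2 = x^3 + a*x + b over F_p, plus the point
    at infinity, by direct enumeration (feasible only for tiny p)."""
    square_roots = {}
    for y in range(p):
        square_roots.setdefault((y * y) % p, []).append(y)
    count = 1  # the point at infinity
    for x in range(p):
        rhs = (x * x * x + a * x + b) % p
        count += len(square_roots.get(rhs, []))
    return count

# y^2 = x^3 + x + 1 over F_5 has 9 points, within the Hasse
# bound |N - (p + 1)| <= 2*sqrt(p).
order = naive_curve_order(1, 1, 5)
```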
84

Parallel Windowed Method for Scalar Multiplication in Elliptic Curve Cryptography

Bouman, Tanya January 2021 (has links)
Commercial applications, including Blockchain, require large numbers of cryptographic signing and verification operations, increasingly using Elliptic Curve Cryptography. This uses a group operation (called point addition) in the set of points on an elliptic curve over a prime field. Scalar multiplication is the repeated addition of a fixed point, P, on the curve. Along with the infinity point, which serves as the identity of addition and the zero of scalar multiplication, this forms a vector space over the prime field. Scalar multiplication can be accelerated by decomposing the scalar into nibbles or other digits and using a pre-computed table of values P, 2P, 3P, ... This is called a windowed method. To avoid side-channel attacks, implementations must ensure that the time and power used do not depend on the scalar. Avoiding conditional execution ensures constant-time and constant-power execution. This thesis presents a theoretical reduction in latency for the windowed method by introducing parallelism. Using three cores achieves a 42% improvement in latency over a single-threaded computation. / Thesis / Master of Science (MSc)
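The windowed method can be sketched over any additive group; in the sketch below, integer addition modulo n stands in for elliptic-curve point addition. This is a hedged illustration of the table-based idea only, not the thesis's parallel, constant-time implementation (in particular, it uses data-dependent control flow that a side-channel-resistant version must avoid).

```python
def windowed_scalar_mult(k, P, add, zero, window=4):
    """Compute k*P using a table of small multiples 0, P, 2P, ..., (2^w - 1)P,
    consuming the scalar one w-bit digit at a time, most-significant first."""
    table = [zero]
    for _ in range((1 << window) - 1):
        table.append(add(table[-1], P))      # table[d] = d*P
    digits = []
    while k:
        digits.append(k & ((1 << window) - 1))
        k >>= window
    acc = zero
    for d in reversed(digits):
        for _ in range(window):              # acc = 2^w * acc via doubling
            acc = add(acc, acc)
        acc = add(acc, table[d])             # add the precomputed multiple
    return acc

# Stand-in group: integers modulo a prime, with modular addition.
n = 10007
add_mod = lambda x, y: (x + y) % n
result = windowed_scalar_mult(1234, 5, add_mod, 0)   # 1234 * 5 mod 10007
```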
85

The Elliptic Curve Group Over Finite Fields: Applications in Cryptography

Lester, Jeremy W. 28 September 2012 (has links)
No description available.
86

The application of cryptography to data base security

Gudes, Ehud January 1976 (has links)
No description available.
87

Efficient Implementations of Post-quantum Isogeny-based Cryptography

Unknown Date (has links)
Quantum computers are envisioned to be able to solve mathematical problems which are currently unsolvable for conventional computers, because of their exceptional computational power from quantum mechanics. Therefore, if quantum computers are ever built at large scale, they will certainly be able to solve many classical exponential-complexity problems, such as the hard problems upon which current public-key cryptography is constructed. To counteract this problem, the design of post-quantum cryptography (PQC) protocols is necessary to preserve security in the presence of quantum adversaries. Regardless of whether we can estimate the exact time of the advent of the quantum computing era, security protocols are required to be resistant against the potentially malicious power of quantum computing. In this thesis, the main focus is on the performance improvement of one of the potential PQC candidates, isogeny-based cryptography. Several optimized implementations of cryptography applications based on this primitive are presented. From a general viewpoint, the proposed methods, implementation techniques, and libraries have a practical impact on the performance evaluation of post-quantum cryptography schemes in a wide range of applications. In particular, the provided benchmarks and optimizations on ARM-powered processors provide a reference for comparison and evaluation of isogeny-based cryptography with other post-quantum candidates during the first round of NIST's PQC standardization process. / Includes bibliography. / Dissertation (Ph.D.)--Florida Atlantic University, 2018. / FAU Electronic Theses and Dissertations Collection
88

Efficient Algorithms for Elliptic Curve Cryptosystems

Guajardo, Jorge 28 March 2000 (has links)
Elliptic curves are the basis for a relatively new class of public-key schemes. It is predicted that elliptic curves will replace many existing schemes in the near future. It is thus of great interest to develop algorithms which allow efficient implementations of elliptic curve cryptosystems. This thesis deals with such algorithms. Efficient algorithms for elliptic curves can be classified into low-level algorithms, which deal with arithmetic in the underlying finite field, and high-level algorithms, which operate with the group operation. This thesis describes three new algorithms for efficient implementations of elliptic curve cryptosystems. The first algorithm describes the application of the Karatsuba-Ofman Algorithm to multiplication in composite fields GF((2^n)^m). The second algorithm deals with efficient inversion in composite Galois fields of the form GF((2^n)^m). The third algorithm is an entirely new approach which accelerates the multiplication of points, the core operation in elliptic curve public-key systems. The algorithm explores computational advantages by computing repeated point doublings directly through closed formulae rather than from individual point doublings. Finally, we apply all three algorithms to an implementation of an elliptic curve system over GF((2^16)^11). We provide absolute performance measures for the field operations and for an entire point multiplication. We also show the improvements gained by the new point multiplication algorithm in conjunction with the k-ary and improved k-ary methods for exponentiation.
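The Karatsuba-Ofman idea, three half-size multiplications in place of four, applies to plain integers as well as to the composite-field polynomials treated in the thesis. A minimal integer sketch for illustration, not the thesis's GF((2^n)^m) implementation:

```python
def karatsuba(x, y):
    """Multiply x*y using three recursive half-size products:
    x*y = p2*2^(2m) + (p3 - p2 - p0)*2^m + p0, where m is the split point."""
    if x < 16 or y < 16:                      # small base case: multiply directly
        return x * y
    m = max(x.bit_length(), y.bit_length()) // 2
    hi_x, lo_x = x >> m, x & ((1 << m) - 1)
    hi_y, lo_y = y >> m, y & ((1 << m) - 1)
    p0 = karatsuba(lo_x, lo_y)                # low halves
    p2 = karatsuba(hi_x, hi_y)                # high halves
    p3 = karatsuba(lo_x + hi_x, lo_y + hi_y)  # combined halves
    return (p2 << (2 * m)) + ((p3 - p2 - p0) << m) + p0

product = karatsuba(123456789, 987654321)
```

The same recursion over polynomial halves, with XOR replacing addition, is what makes the algorithm attractive for GF(2^n) arithmetic.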
89

Lightweight Cryptography Meets Threshold Implementation: A Case Study for SIMON

Shahverdi, Aria 26 August 2015 (has links)
Securing data transmission has always been a challenge. While many cryptographic algorithms are available to solve the problem, many applications have tough area constraints while requiring high-level security. Lightweight cryptography aims at achieving high-level security with the benefit of being low cost. Since the late nineties and the discovery of side-channel attacks, the approach towards cryptography has changed quite significantly. An attacker who can get close to a device can extract sensitive data by monitoring side channels such as power consumption, sound, or electromagnetic emanation. This means that embedded implementations of cryptographic schemes require protection against such attacks to achieve the desired level of security. In this work we combine a low-cost embedded cipher, Simon, with a state-of-the-art side-channel countermeasure called Threshold Implementation (TI). We show that TI is a great match for lightweight cryptographic ciphers, especially for hardware implementation. Our implementation is the smallest TI of a block cipher on an FPGA. This implementation utilizes 96 slices of a low-cost Spartan-3 FPGA and 55 slices of a modern Kintex-7 FPGA. Moreover, we present a higher-order TI which is resistant against second-order attacks. This implementation utilizes 163 slices of a Spartan-3 FPGA and 95 slices of a Kintex-7 FPGA. We also present a state-of-the-art leakage analysis and, by applying it to the designs, show that the implementations achieve the expected security. The implementations even feature significant robustness against higher-order attacks, where several million observations are needed to detect leakage.
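Simon's round is a simple Feistel step built from AND, rotation, and XOR, which is part of what makes it amenable to masked hardware sharings like TI. A software sketch of one round and its inverse follows; the key schedule is omitted, and the 16-bit word size is that of the Simon32/64 variant (an assumption for illustration; the thesis's FPGA design may target other parameters).

```python
WORD = 16
MASK = (1 << WORD) - 1

def rotl(x, r):
    return ((x << r) | (x >> (WORD - r))) & MASK

def f(x):
    # Simon's nonlinear function: AND of two rotations, XORed with a third.
    return (rotl(x, 1) & rotl(x, 8)) ^ rotl(x, 2)

def round_enc(x, y, k):
    """One Feistel round of Simon: (x, y) -> (y ^ f(x) ^ k, x)."""
    return (y ^ f(x) ^ k) & MASK, x

def round_dec(x, y, k):
    """Inverse round, recovering the previous state."""
    return y, (x ^ f(y) ^ k) & MASK

state = round_enc(0x1234, 0xABCD, 0x00FF)
```

In a TI design, the AND in `f` is the only nonlinear gate, so it alone must be split into multiple shares; the rotations and XORs are linear and can be applied share-wise.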
90

NTRU over the Eisenstein Integers

Jarvis, Katherine 29 March 2011 (has links)
NTRU is a fast public-key cryptosystem that is constructed using polynomial rings with integer coefficients. We present ETRU, an NTRU-like cryptosystem based on the Eisenstein integers. We discuss parameter selection and develop a model for the probability of decryption failure. We also provide an implementation of ETRU. We use theoretical and experimental data to compare the security and efficiency of ETRU to NTRU with comparable parameter sets, and show that ETRU is an improvement over NTRU in terms of security.
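The Eisenstein integers are numbers a + bω with ω = e^(2πi/3), so ω² = -1 - ω; arithmetic on coefficient pairs follows directly from that relation. A minimal sketch of the ring ETRU is built over, not of the cryptosystem itself:

```python
def eis_mul(u, v):
    """Multiply Eisenstein integers (a + b*w)(c + d*w) using w^2 = -1 - w:
    the result is (a*c - b*d) + (a*d + b*c - b*d)*w."""
    a, b = u
    c, d = v
    return (a * c - b * d, a * d + b * c - b * d)

def eis_norm(u):
    """Multiplicative norm N(a + b*w) = a^2 - a*b + b^2."""
    a, b = u
    return a * a - a * b + b * b

x, y = (2, 1), (1, 2)
product = eis_mul(x, y)
```

The multiplicativity of the norm, N(uv) = N(u)N(v), is what lets size and invertibility arguments from the NTRU setting carry over to this ring.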
