141 |
Secure and Trusted Partial White-box Verification Based on Garbled Circuits
Zhong, Hongsheng. January 2016
Verification is a process that checks whether a program G, implemented by a developer, correctly complies with the corresponding requirement specifications. A verifier, whose interests may differ from the developer's, conducts this verification on G. However, as the developer and the verifier probably distrust each other, either of them may exhibit harmful behavior and take advantage of the verification. Generally, the developer hopes to protect the content privacy of the program, while the verifier wants to conduct effective verification to detect possible errors. A question therefore inevitably arises: how can an effective and efficient kind of verification be conducted without breaking the security requirements of the two parties?
We treat verification as a process akin to testing, i.e., verifying the design with test cases and checking the results. To make the verification more effective, we shed the limitations of traditional testing approaches, such as black-box and white-box testing, and propose “partial white-box verification”.
Taking circuits as the means of description, we regard the program as a circuit graph. By making the structure of the graph public, we render the verification process over such a graph partially white-box. Via garbled circuits, commitment schemes and other techniques, the security requirements of such verification are guaranteed. / Thesis / Master of Science (MSc)
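To make one of the named building blocks concrete, below is a minimal sketch of a hash-based commitment scheme, the primitive that lets one party fix a value (e.g., a garbled-circuit wire label) without revealing it. The SHA-256 instantiation and the wire-label example are illustrative assumptions, not the thesis's exact construction.

```python
# Minimal hash-based commitment: commit(value) hides the value (hiding) and
# cannot later be opened to a different value (binding, assuming a
# collision-resistant hash).
import hashlib
import secrets

def commit(value: bytes) -> tuple[bytes, bytes]:
    nonce = secrets.token_bytes(32)            # fresh randomness for hiding
    commitment = hashlib.sha256(nonce + value).digest()
    return commitment, nonce

def verify(commitment: bytes, value: bytes, nonce: bytes) -> bool:
    return hashlib.sha256(nonce + value).digest() == commitment

# Hypothetical use: the developer commits to a wire label up front; the
# verifier checks the opening later during verification.
c, nonce = commit(b"wire-label-0")
assert verify(c, b"wire-label-0", nonce)       # honest opening passes
assert not verify(c, b"wire-label-1", nonce)   # altered value is rejected
```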
|
142 |
Real World Secret Leaking: The Design and Analysis of a Protocol Created for the Purpose of Leaking Documents Under Surveillance
Knopf, Karl. January 2019
In scenarios where an individual wishes to leak confidential information to an unauthorized party, he may do so publicly or anonymously. When acting publicly, a leaker exposes his identity; when acting anonymously, he invites doubts about the information’s authenticity. Current solutions either assume anonymity from everyone except a trusted third party or rely on the leaker possessing prior cryptographic keys; both are inadequate assumptions in real-world secret leaking scenarios.
In this research we present a system called the attested drop protocol, which provides confidentiality for the leaker while still allowing leaked documents to have their origins verified. The protocol relies on identities associated with common communication mediums, and seeks to avoid having the leaker carry out sophisticated cryptographic operations. We also present two constructions of the general protocol, each designed to protect against a different form of adversarial surveillance. We use ceremony analysis and other techniques from the provable security paradigm to formally describe and evaluate the security goals of both constructions. / Thesis / Master of Science (MSc) / Whistleblowing is an activity where an individual leaks some secrets about an organization to an unauthorized entity, often for moral or regulatory reasons. When doing so, the whistleblower is faced with the choice of acting publicly and risking retribution, or acting anonymously and risking not being believed. We have designed a protocol called the attested drop protocol, which protects the identity of the whistleblower while allowing the unauthorized entity a means of verifying that the leak came from the organization. This protocol makes use of preexisting identities associated with a communication medium, such as email, to avoid using cryptographic primitives that are impractical.
|
143 |
Two mathematical security aspects of the RSA cryptosystem: signature padding schemes and key generation with a backdoor
Arboit, Geneviève. January 2008
No description available.
|
144 |
Fault Attacks on Cryptosystems: Novel Threat Models, Countermeasures and Evaluation Metrics
Farhady Ghalaty, Nahid. 19 August 2016
Recent research has demonstrated that there is no sharp distinction between passive attacks based on side-channel leakage and active attacks based on fault injection. Fault behavior can be processed as side-channel information, offering all the benefits of Differential Power Analysis including noise averaging and hypothesis testing by correlation. In fault attacks, the adversary induces faults into a device while it is executing a known program and observes the reaction. The abnormal reactions of the device are later analyzed to obtain the secrets of the program under execution.
Fault attacks are a powerful threat. They are used to break cryptosystems, pay-TV systems, smart cards and other embedded applications. In fault attack resistant design, the fault is assumed to be induced by a smart, malicious, determined attacker who has detailed knowledge of the design under attack. Moreover, the purpose of fault attack resistant design is for the system to work correctly under intentional fault injection without leaking any secret information.
Towards building a fault attack resistant design, the problem can be categorized into three main subjects:
• Investigating novel and more powerful threat models and attack procedures.
• Proposing countermeasures to build systems secure against fault attacks.
• Building evaluation metrics to measure the security of designs.
In this regard, my thesis has covered the first bullet by proposing Differential Fault Intensity Analysis (DFIA), based on the biased fault model. The biased fault model captures the gradual behavior of faults as the intensity of fault injection increases. The DFIA attack has been successfully launched on the AES, PRESENT and LED block ciphers. Our group has also recently applied this attack to the AES algorithm running on a LEON3 processor.
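The toy simulation below illustrates what the biased fault model means in practice: at low injection intensity most faults flip only one or two bits, and corruption grows gradually with intensity rather than jumping to uniform noise. This only illustrates the bias assumption; it does not implement the DFIA key-recovery procedure, and the per-bit flip probability as a stand-in for intensity is our assumption.

```python
# Simulate biased faults on an 8-bit intermediate value: each bit flips
# independently with probability `intensity` (a proxy for glitch severity).
import random

def inject_fault(value: int, intensity: float) -> int:
    for bit in range(8):
        if random.random() < intensity:
            value ^= 1 << bit
    return value

random.seed(1)
correct = 0x3A  # e.g., an AES S-box output
for intensity in (0.05, 0.15, 0.30, 0.60):
    trials = 10_000
    faulted_bits = sum(
        bin(inject_fault(correct, intensity) ^ correct).count("1")
        for _ in range(trials))
    print(f"intensity {intensity:.2f}: avg faulted bits {faulted_bits / trials:.2f}")
# Low intensities yield mostly 0- or 1-bit faults; DFIA exploits exactly this
# gradual, biased behavior when ranking key hypotheses.
```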
In our work, we also propose a countermeasure against one of the most powerful types of fault attacks, namely Fault Sensitivity Analysis (FSA). This countermeasure balances the delay of the circuit to destroy the correlation between secret data and the circuit's timing delay.
Additionally, we propose a framework for assessing the vulnerability of designs against fault attacks. An example of this framework is the Timing Violation Vulnerability Factor (TVVF), a metric for measuring the vulnerability of hardware to timing violation attacks. We compute TVVF for two implementations of the AES algorithm and measure the vulnerability of these designs against two types of fault attacks.
For future work, we plan to propose an attack that combines power measurements and fault injections. Such an attack is more powerful in the sense that it has fewer fault injection restrictions and requires less information from the block cipher's data. We also plan to design evaluation metrics that are more efficient and generic than TVVF.
As shown in this thesis, fault attacks are a more serious threat than the cryptography community has assumed. This thesis provides a deep understanding of fault behavior in circuits and, therefore, better knowledge of powerful fault attacks. The techniques developed in this dissertation address different aspects of fault attacks on hardware architectures and microprocessors. Given the fault models, attacks, and evaluation metrics proposed in this thesis, there is hope of developing robust, fault attack resistant microprocessors. We conclude this thesis by identifying future areas and opportunities for research. / Ph. D.
|
145 |
Privacy Preservation for Cloud-Based Data Sharing and Data Analytics
Zheng, Yao. 21 December 2016
Data privacy is a globally recognized human right for individuals to control the access to their personal information and bar the negative consequences from the use of this information. As communication technologies progress, the means to protect data privacy must also evolve to address new challenges as they come into view. Our research goal in this dissertation is to develop privacy protection frameworks and techniques suitable for emerging cloud-based data services, in particular privacy-preserving algorithms and protocols for cloud-based data sharing and data analytics services.
Cloud computing has enabled users to store, process, and communicate their personal information through third-party services. It has also raised privacy issues regarding loss of control over data, mass harvesting of information, and unconsented disclosure of personal content. Above all, the main concern is the lack of understanding of data privacy in cloud environments. Currently, cloud service providers either advocate the third-party doctrine and deny users' rights to protect their data stored in the cloud, or rely on the notice-and-choice framework and present users with ambiguous, incomprehensible privacy statements without any meaningful privacy guarantee.
In this regard, our research has three main contributions. First, to capture users' privacy expectations in cloud environments, we conceptually divide personal data into two categories, i.e., visible data and invisible data. The visible data refer to information users intentionally create, upload to, and share through the cloud; the invisible data refer to users' information retained in the cloud that is aggregated, analyzed, and repurposed without their knowledge or understanding.
Second, to address users' privacy concerns raised by cloud computing, we propose two privacy protection frameworks, namely individual control and use limitation. The individual control framework emphasizes users' capability to govern the access to the visible data stored in the cloud. The use limitation framework emphasizes users' expectation to remain anonymous when the invisible data are aggregated and analyzed by cloud-based data services.
Finally, we investigate various techniques to accommodate the new privacy protection frameworks, in the context of four cloud-based data services: personal health record sharing, location-based proximity test, link recommendation for social networks, and face tagging in photo management applications. For the first case, we develop a key-based protection technique to enforce fine-grained access control to users' digital health records. For the second case, we develop a key-less protection technique to achieve location-specific user selection. For the latter two cases, we develop distributed learning algorithms to prevent large-scale data harvesting. We further combine these algorithms with query regulation techniques to achieve user anonymity.
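As a hedged sketch of the key-based protection idea in the first case, the snippet below derives an independent key per health record from a master key, so the owner can share one record's key without exposing any other record. The record identifiers and the use of HMAC-SHA256 as the key derivation function are illustrative assumptions, not the dissertation's exact scheme.

```python
# Per-record key derivation: fine-grained access control via key management.
import hashlib
import hmac
import secrets

master_key = secrets.token_bytes(32)  # held only by the data owner

def record_key(master: bytes, record_id: str) -> bytes:
    # HMAC-SHA256 used as a pseudorandom function over record identifiers.
    return hmac.new(master, record_id.encode(), hashlib.sha256).digest()

# Grant a physician access to a single record by handing over only its key.
k_lab = record_key(master_key, "record/2016/lab-results")
k_rx = record_key(master_key, "record/2016/prescriptions")
assert k_lab != k_rx  # under the PRF assumption, one key reveals nothing
                      # about the keys of other records
```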
The picture that emerges from the above work is a bleak one. With regard to personal data, the reality is that we can no longer control it all. As communication technologies evolve, the scope of personal data has expanded beyond local, discrete silos and become integrated into the Internet. The traditional understanding of privacy must be updated to reflect these changes. In addition, because privacy is a particularly nuanced problem governed by context, there is no one-size-fits-all solution. While some cases can be salvaged either by cryptography or by other means, in others a rethinking of the trade-offs between utility and privacy appears to be necessary. / Ph. D.
|
146 |
A Secure Key Encapsulation Mechanism in Quantum Hybrid Settings / Hybrid Key Encapsulation Mechanisms
Goncalves, Brian. January 2018
Quantum computers pose a long-term threat to many currently used cryptographic schemes, as they are able to efficiently solve the computational problems those schemes are based on. This threat has led to research into quantum-resistant cryptographic schemes to eventually replace those currently used, as well as research into how to ease the transition from classical schemes to quantum-resistant ones. One approach to addressing these issues is to use a combiner that creates hybrid schemes, that is, schemes which are both classically secure and quantum-resistant, to protect against quantum attacks while maintaining current security guarantees. Such combiners derive trust from several schemes with differing computational hardness assumptions, rather than from a single scheme, which may later become vulnerable. An important type of scheme that must be secure against both classical and quantum attacks is the key encapsulation mechanism (KEM), as KEMs are commonly used for constructing public-key encryption and key exchange protocols. We first define new security notions for KEMs modeling attackers with various levels of quantum power, ranging from fully classical to fully quantum. We then construct a combiner that creates hybrid key encapsulation mechanisms which are secure against adversaries with varying levels of quantum power over time and can be implemented efficiently. Our construction provides an efficient method to combine KEMs using an additional scheme. It is also general enough to be deployed in settings such as key exchange protocols, like those used in the Transport Layer Security (TLS) protocol for web browsers, without meaningfully affecting existing structure. / Thesis / Master of Science (MSc) / Quantum computers present a threat to current cryptography, as they would be able to break many widely used public-key encryption schemes. In order to maintain the security of communication infrastructure, it is important that quantum-resistant algorithms become more common in use. However, adoption of quantum-resistant algorithms has been relatively slow, in part due to reluctance to abandon schemes that are currently secure. In this thesis we focus on a specific type of scheme called a key encapsulation mechanism (KEM), used to fix a session key for communicating. We construct a secure way to combine currently secure KEMs with quantum-resistant KEMs, yielding schemes that are secure both now and against quantum computers. Our construction is simple enough to be implemented efficiently, thus encouraging adoption of quantum-resistant algorithms.
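The following sketch captures the hybrid-KEM intuition under stated assumptions: run a classical KEM and a quantum-resistant KEM side by side and derive the session key by hashing both shared secrets together with both ciphertexts, so the result stays secure as long as at least one ingredient is unbroken. The ToyKEM stand-ins and the SHA-256 key derivation are our illustrative assumptions; the thesis's combiner and security notions are more refined.

```python
# Hybrid KEM combiner sketch: the session key depends on both KEMs' outputs.
import hashlib
import secrets

class ToyKEM:
    """Placeholder KEM whose encapsulation returns a random shared secret and
    a dummy ciphertext. Real instantiations would be, e.g., an RSA- or
    ECDH-based KEM (classical) and a lattice-based KEM (quantum-resistant)."""
    def encaps(self) -> tuple[bytes, bytes]:
        ciphertext = secrets.token_bytes(32)
        shared_secret = secrets.token_bytes(32)
        return ciphertext, shared_secret

def combine(ct1: bytes, ss1: bytes, ct2: bytes, ss2: bytes) -> bytes:
    # Binding both ciphertexts into the hash ties the key to this session.
    return hashlib.sha256(b"hybrid-kem" + ss1 + ss2 + ct1 + ct2).digest()

classical, post_quantum = ToyKEM(), ToyKEM()
ct1, ss1 = classical.encaps()
ct2, ss2 = post_quantum.encaps()
session_key = combine(ct1, ss1, ct2, ss2)  # secure if either KEM holds up
```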
|
147 |
Efficient and Tamper-Resilient Architectures for Pairing Based Cryptography
Ozturk, Erdinc. 04 January 2009
Identity based cryptography was first proposed by Shamir in 1984. Rather than deriving a public key from private information, as in traditional public key encryption schemes, in identity based schemes a user's identity plays the role of the public key. This reduces the amount of computation required for authentication and simplifies key management. Efficient and strong implementations of identity based schemes are built around easily computable bilinear mappings of two points on an elliptic curve onto a multiplicative subgroup of a field, also called pairings. Utilizing the identity of the user simplifies the public key infrastructure; however, since pairing computations are expensive in both area and timing, the proposed identity based cryptosystems are hard to implement. To utilize identity based cryptography efficiently, there is a strong need for efficient pairing implementations.

Pairing computations can be realized in multiple fields. Since the main building block and the bottleneck of the algorithm is multiplication, we focused our research on building a fast and small arithmetic core that can work over multiple fields. This allows a single piece of hardware to realize a wide spectrum of cryptographic algorithms, including pairings, with a minimal amount of software coding. We present a novel unified core design which is extended to realize Montgomery multiplication in the fields GF(2^n), GF(3^m), and GF(p). Our unified design supports RSA and elliptic curve schemes, as well as identity based encryption, which requires a pairing computation on an elliptic curve. The architecture is pipelined and highly scalable. The unified core utilizes a redundant signed digit representation to reduce the critical path delay. While the carry-save representation used in classical unified architectures suits only addition and multiplication, the redundant signed digit representation also facilitates efficient comparison and subtraction. Thus, there is no need to transform between redundant and non-redundant representations of field elements, as would be required in classical unified architectures to realize subtraction and comparison. We also quantify the benefits of unified architectures in terms of area and critical path delay, and provide detailed implementation results. The metric shows that the new unified architecture provides an improvement of at least 24.88% over a hypothetical non-unified architecture, and of at least 32.07% over a classical unified architecture.

Until recently there has been no work covering the security of pairing based cryptographic hardware in the presence of side-channel attacks, despite its apparent suitability for identity-aware personal security devices, such as smart cards. We present a novel non-linear error coding framework which incorporates strong adversarial fault detection capabilities into identity based encryption schemes built using Tate pairing computations. The presented algorithms provide quantifiable resilience in a well defined strong attacker model. Given the emergence of fault attacks as a serious threat to pairing based cryptography, the proposed technique solves a key problem when incorporated into software and hardware implementations.
In this dissertation, we also present an efficient accelerator for computing the Tate pairing in characteristic 3, based on the modified Duursma-Lee algorithm.
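To make the core operation concrete, here is a plain-integer sketch of Montgomery multiplication over GF(p), the multiplication the unified core realizes. It models only the arithmetic: the hardware's redundant signed digit representation, pipelining, and the GF(2^n)/GF(3^m) datapaths are not captured, and the 64-bit prime is an arbitrary example.

```python
# Montgomery multiplication (REDC): multiply mod p without dividing by p.
def montgomery_setup(p: int, bits: int):
    R = 1 << bits                  # R = 2^bits with R > p and gcd(R, p) = 1
    p_inv = (-pow(p, -1, R)) % R   # p' such that p * p' == -1 (mod R)
    return R, p_inv

def mont_mul(a: int, b: int, p: int, R: int, p_inv: int) -> int:
    """Return a * b * R^(-1) mod p for inputs in Montgomery form."""
    t = a * b
    m = (t * p_inv) % R            # chosen so that t + m*p is divisible by R
    u = (t + m * p) // R           # exact division: a cheap shift in hardware
    return u - p if u >= p else u  # single conditional subtraction

p = 0xFFFFFFFFFFFFFFC5             # illustrative 64-bit prime (2^64 - 59)
R, p_inv = montgomery_setup(p, 64)
a, b = 123456789, 987654321
aR, bR = (a * R) % p, (b * R) % p  # convert operands into Montgomery form
abR = mont_mul(aR, bR, p, R, p_inv)
assert mont_mul(abR, 1, p, R, p_inv) == (a * b) % p  # convert back and check
```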
|
148 |
Performance analysis of lattice based post-quantum secure cryptography with Java
Johansson, Alexander. January 2019
Efficient quantum computers will break most of today’s public-key cryptosystems. Therefore, the National Institute of Standards and Technology (NIST) has called for proposals to standardise one or more quantum-secure cryptographic schemes. Eventually, banks must adopt the standardised schemes, but little is known about how efficient such an implementation would be in Java, one of the standard programming languages for banks. In this thesis, we test and evaluate a post-quantum secure encryption scheme known as FrodoKEM, which is based on a hard lattice problem known as Learning With Errors (LWE). We found that a post-quantum secure encryption version of FrodoKEM provides strong theoretical security with regard to the criteria given by NIST, and is also sufficiently fast for key generation, encryption and decryption. These results imply that it is possible to implement these types of post-quantum secure algorithms in high-level programming languages such as Java, demonstrating that we are no longer limited to low-level languages such as C. Consequently, post-quantum secure cryptography can be implemented more easily and cheaply.
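To make the underlying hardness assumption concrete, here is a toy Regev-style encryption scheme built directly on the LWE problem. The tiny parameters and single-bit messages are deliberately insecure, illustrative choices; FrodoKEM itself uses matrix secrets, far larger dimensions, and carefully shaped error distributions.

```python
# Toy LWE (Regev) encryption of a single bit. Security rests on the fact
# that recovering s from (A, b = A*s + e) is hard when e is small noise.
import random

random.seed(0)
q, n, m = 4093, 16, 32                  # modulus, secret length, #samples

s = [random.randrange(q) for _ in range(n)]            # secret key
A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]
e = [random.choice((-1, 0, 1)) for _ in range(m)]      # small noise
b = [(sum(A[i][j] * s[j] for j in range(n)) + e[i]) % q for i in range(m)]
# public key: (A, b)

def encrypt(bit: int):
    r = [random.randrange(2) for _ in range(m)]        # random 0/1 combiner
    u = [sum(r[i] * A[i][j] for i in range(m)) % q for j in range(n)]
    v = (sum(r[i] * b[i] for i in range(m)) + bit * (q // 2)) % q
    return u, v

def decrypt(u, v) -> int:
    d = (v - sum(u[j] * s[j] for j in range(n))) % q   # = r.e + bit*(q//2)
    return 1 if q // 4 < d < 3 * q // 4 else 0         # round to 0 or q/2

for bit in (0, 1, 1, 0):
    u, v = encrypt(bit)
    assert decrypt(u, v) == bit  # noise |r.e| <= m stays well below q/4
```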
|
149 |
Anonymity With Authenticity
Swaroop, D. 12 1900 (PDF)
Cryptography is the science of secure message transmission, while cryptanalysis is concerned with breaking such encrypted messages. Together, cryptography and cryptanalysis constitute cryptology.
Anonymity means namelessness, i.e., the quality or state of being unknown, while authenticity is the quality or condition of being authentic or genuine. Anonymity and authenticity are two different embodiments of personal secrecy. Modern power has increased in its capacity to designate individuals, which makes it inconvenient for them to continue communicating while remaining anonymous.
In this thesis we describe an anonymous system consisting of a number of entities which communicate with each other without revealing their identities while maintaining their authenticity, such that an anonymous entity (say E1) is able to verify that the messages it receives from another anonymous entity (say E2), subsequent to an initial message from E2, are in fact from E2 itself. Later, when E2 tries to recommend a similar communication between E1 and another anonymous entity E3 in the system, E1 must be able to verify that recommendation, without E2 losing to E3 the authenticity of its communication with E1.
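A minimal sketch of the first property, assuming the third-party `cryptography` package: an entity's pseudonym is a fresh signing key pair with no link to its real identity, sent with its first message, and every later message is signed under it. This illustrates only sender continuity; the thesis's six protocols, including the recommendation step among E1, E2 and E3, go well beyond this.

```python
# Pseudonymous continuity: E1 verifies that later messages come from the
# same anonymous sender E2, without ever learning who E2 is.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# E2 creates a pseudonym: a signing key carrying no identity information.
e2_signing_key = Ed25519PrivateKey.generate()
e2_pseudonym = e2_signing_key.public_key()   # shipped in E2's first message

# Each subsequent message from E2 is signed under the pseudonym.
msg = b"second message from the same anonymous sender"
sig = e2_signing_key.sign(msg)

# E1 checks continuity against the pseudonym received earlier.
try:
    e2_pseudonym.verify(sig, msg)
    print("same anonymous sender")
except InvalidSignature:
    print("rejected")
```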
This thesis is divided into four chapters. The first chapter is an introduction to cryptography, symmetric key cryptography and public key cryptography. It also summarizes the contribution of this thesis.
The second chapter gives various protocols for the above problem, ’Anonymity with Authenticity’, along with its extension. In total, six protocols are proposed for the problem.
In the third chapter, all six protocols are realized using four different schemes, each of which has its own pros and cons.
The fourth and final chapter concludes with a note on the factors by which one of these four realization schemes should be chosen, and on other possible realization schemes.
|
150 |
Encrypt/Decrypt COMSEC Unit for Space-based Command and Telemetry Applications
Merz, Doug; Maples, Bruce. 10 1900
International Telemetering Conference Proceedings / October 20-23, 2003 / Riviera Hotel and Convention Center, Las Vegas, Nevada / This paper describes the system-level architecture and design concept of a communications security (COMSEC) equipment intended for space-based low data rate (< 1 Mbps) command and telemetry applications. The COMSEC Unit is a stand-alone piece of equipment which provides decryption of uplink command and control information and encryption of downlink telemetry data. The system-level architecture is described, followed by an overview of the digital design concepts and a discussion of applications. Finally, although specifically targeted at narrowband command and telemetry applications, this design approach is flexible enough to accommodate other algorithms of choice as well as operation at higher data rates.
|