141

Secure and Trusted Partial White-box Verification Based on Garbled Circuits

Zhong, Hongsheng January 2016 (has links)
Verification is a process that checks whether a program G, implemented by a developer, correctly complies with the corresponding requirement specifications. A verifier, whose interests may differ from the developer's, conducts such verification on G. However, since the developer and the verifier may well distrust each other, either of them may behave maliciously and take advantage of the verification. Generally, the developer wants to protect the content privacy of the program, while the verifier wants to conduct effective verification that detects possible errors. A question therefore inevitably arises: how can one conduct effective and efficient verification without breaking the security requirements of the two parties? We treat verification as a process akin to testing, i.e. verifying the design with test cases and checking the results. To make the verification more effective, we avoid the limitations of traditional testing approaches, such as black-box and white-box testing, and propose "partial white-box verification". Using circuits as the description formalism, we regard the program as a circuit graph. By making the structure of the graph public, we make the verification process in such a graph partially white-box. Via garbled circuits, commitment schemes and other techniques, the security requirements of such verification are guaranteed. / Thesis / Master of Science (MSc)
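
The abstract names commitment schemes among its building blocks without spelling out a construction. A minimal hash-based commitment sketch in Python (illustrative names only, not the thesis's code) could look like this:

import hashlib
import secrets

def commit(value: bytes) -> tuple[bytes, bytes]:
    # Commit to `value`; returns (commitment, opening nonce).
    nonce = secrets.token_bytes(32)                      # hiding: random blinding factor
    commitment = hashlib.sha256(nonce + value).digest()  # binding: collision resistance
    return commitment, nonce

def verify(commitment: bytes, value: bytes, nonce: bytes) -> bool:
    # Check that (value, nonce) opens `commitment`.
    return hashlib.sha256(nonce + value).digest() == commitment

# Hypothetical use: the developer commits to garbled wire labels before
# verification begins, then opens the commitment so the verifier can check
# that nothing was swapped mid-protocol.
c, r = commit(b"garbled wire label 0x3f")
assert verify(c, b"garbled wire label 0x3f", r)

The hiding property keeps the committed content private until opening; the binding property prevents the committer from changing answers after the fact.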
142

Real World Secret Leaking: The Design and Analysis of a Protocol Created for the Purpose of Leaking Documents Under Surveillance

Knopf, Karl January 2019 (has links)
In scenarios where an individual wishes to leak confidential information to an unauthorized party, he may do so either publicly or anonymously. Acting publicly, a leaker exposes his identity; acting anonymously, a leaker invites doubt about the information's authenticity. Current solutions assume anonymity from everyone except a trusted third party, or rely on the leaker possessing prior cryptographic keys; both are inadequate assumptions in real-world secret-leaking scenarios. In this research we present a system called the attested drop protocol, which provides confidentiality for the leaker while still allowing leaked documents to have their origins verified. The protocol relies on identities associated with common communication mediums and seeks to avoid having the leaker carry out sophisticated cryptographic operations. We also present two constructions of the general protocol, each designed to protect against a different form of adversarial surveillance. We use ceremony analysis and other techniques from the provable security paradigm to formally describe and evaluate security goals for both constructions. / Thesis / Master of Science (MSc) / Whistleblowing is an activity where an individual leaks secrets about an organization to an unauthorized entity, often for moral or regulatory reasons. When doing so, the whistleblower faces a choice between acting publicly, risking retribution, and acting anonymously, risking not being believed. We have designed a protocol called the attested drop protocol, which protects the identity of the whistleblower while giving the unauthorized entity a means of verifying that the leak came from the organization. The protocol makes use of preexisting identities associated with a communication medium, such as email, to avoid using cryptographic primitives that are impractical.
143

Two mathematical security aspects of the RSA cryptosystem: signature padding schemes and key generation with a backdoor

Arboit, Geneviève January 2008 (has links)
No description available.
144

Fault Attacks on Cryptosystems: Novel Threat Models, Countermeasures and Evaluation Metrics

Farhady Ghalaty, Nahid 19 August 2016 (has links)
Recent research has demonstrated that there is no sharp distinction between passive attacks based on side-channel leakage and active attacks based on fault injection. Fault behavior can be processed as side-channel information, offering all the benefits of Differential Power Analysis, including noise averaging and hypothesis testing by correlation. In fault attacks, the adversary induces faults into a device while it is executing a known program and observes its reaction. The abnormal reactions of the device are later analyzed to obtain the secrets of the program under execution. Fault attacks are a powerful threat: they are used to break cryptosystems, pay TVs, smart cards and other embedded applications. In fault attack resistant design, the fault is assumed to be induced by a smart, malicious, determined attacker with deep knowledge of the design under attack, and the purpose is for the system to work correctly under intentional fault injection without leaking any secret data. Building a fault attack resistant design can be divided into three main subjects:

• Investigating novel and more powerful threat models and attack procedures.
• Proposing countermeasures to build systems secure against fault attacks.
• Building evaluation metrics to measure the security of designs.

This thesis covers the first subject by proposing Differential Fault Intensity Analysis (DFIA), based on the biased fault model. The biased fault model refers to the gradual change in fault behavior as the fault-injection intensity increases. The DFIA attack has been successfully launched on the AES, PRESENT and LED block ciphers, and our group has recently mounted it on the AES algorithm running on a LEON3 processor. In our work, we also propose a countermeasure against one of the most powerful fault attacks, Fault Sensitivity Analysis (FSA), based on balancing the delay of the circuit to destroy the correlation between secret data and the circuit's timing. Additionally, we propose a framework for assessing the vulnerability of designs to fault attacks. An instance of this framework is the Timing Violation Vulnerability Factor (TVVF), a metric for measuring the vulnerability of hardware to timing violation attacks. We compute TVVF for two implementations of the AES algorithm and measure their vulnerability to two types of fault attacks. For future work, we plan to propose an attack that combines power measurements with fault injection; it is more powerful in the sense that it imposes fewer fault-injection restrictions and requires less information from the block cipher's data. We also plan to design evaluation metrics more efficient and generic than TVVF. As shown in this thesis, fault attacks are a more serious threat than the cryptography community has assumed. This thesis provides a deep understanding of fault behavior in circuits and therefore better knowledge of powerful fault attacks. The techniques developed in this dissertation address different aspects of fault attacks on hardware architectures and microprocessors. Given the fault models, attacks, and evaluation metrics proposed here, there is hope of developing robust, fault attack resistant microprocessors. We conclude by observing future areas and opportunities for research. / Ph. D.
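
As a rough illustration of the biased fault model described above (a hedged sketch, not the authors' experimental setup), fault injection can be simulated so that each bit flips with a probability tied to the injection intensity:

import random

def inject_fault(byte: int, intensity: float) -> int:
    # Flip each bit of `byte` independently with probability `intensity`.
    # At low intensity most injections leave the value intact or flip one bit;
    # as intensity grows, multi-bit faults dominate -- the gradual, biased
    # behavior that DFIA correlates against key hypotheses.
    for bit in range(8):
        if random.random() < intensity:
            byte ^= 1 << bit
    return byte

# Sweep the intensity and observe the average number of faulted bits.
for intensity in (0.05, 0.2, 0.5):
    flips = sum(bin(inject_fault(0xA5, intensity) ^ 0xA5).count("1")
                for _ in range(10000))
    print(f"intensity={intensity}: avg faulted bits = {flips / 10000:.2f}")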
145

Privacy Preservation for Cloud-Based Data Sharing and Data Analytics

Zheng, Yao 21 December 2016 (has links)
Data privacy is a globally recognized human right: individuals' right to control access to their personal information and to bar the negative consequences of its use. As communication technologies progress, the means to protect data privacy must also evolve to address new challenges as they come into view. Our goal in this dissertation is to develop privacy protection frameworks and techniques suitable for emerging cloud-based data services, in particular privacy-preserving algorithms and protocols for cloud-based data sharing and data analytics services.

Cloud computing has enabled users to store, process, and communicate their personal information through third-party services. It has also raised privacy issues regarding loss of control over data, mass harvesting of information, and unconsented disclosure of personal content. Above all, the main concern is the lack of understanding about data privacy in cloud environments. Currently, cloud service providers either advocate the third-party doctrine and deny users' rights to protect their data stored in the cloud, or rely on the notice-and-choice framework and present users with ambiguous, incomprehensible privacy statements without any meaningful privacy guarantee.

In this regard, our research makes three main contributions. First, to capture users' privacy expectations in cloud environments, we conceptually divide personal data into two categories: visible data and invisible data. Visible data are information users intentionally create, upload to, and share through the cloud; invisible data are users' information retained in the cloud that is aggregated, analyzed, and repurposed without their knowledge or understanding. Second, to address the privacy concerns raised by cloud computing, we propose two privacy protection frameworks, namely individual control and use limitation. The individual control framework emphasizes users' capability to govern access to their visible data stored in the cloud. The use limitation framework emphasizes users' expectation to remain anonymous when invisible data are aggregated and analyzed by cloud-based data services. Finally, we investigate techniques to realize the new frameworks in the context of four cloud-based data services: personal health record sharing, location-based proximity testing, link recommendation for social networks, and face tagging in photo management applications. For the first case, we develop a key-based protection technique to enforce fine-grained access control over users' digital health records. For the second, we develop a key-less protection technique to achieve location-specific user selection. For the latter two, we develop distributed learning algorithms to prevent large-scale data harvesting, and combine them with query regulation techniques to achieve user anonymity.

The picture that emerges from this work is a bleak one. The reality is that we can no longer control all of our personal data. As communication technologies evolve, the scope of personal data has expanded beyond local, discrete silos and become integrated into the Internet. The traditional understanding of privacy must be updated to reflect these changes. In addition, because privacy is a particularly nuanced problem governed by context, there is no one-size-fits-all solution. While some cases can be salvaged by cryptography or other means, in others a rethinking of the trade-off between utility and privacy appears to be necessary. / Ph. D.
146

Efficient and Tamper-Resilient Architectures for Pairing Based Cryptography

Ozturk, Erdinc 04 January 2009 (has links)
Identity based cryptography was first proposed by Shamir in 1984. Rather than deriving a public key from private information, as in traditional public key encryption schemes, in identity based schemes a user's identity plays the role of the public key. This reduces the amount of computation required for authentication and simplifies key management. Efficient and strong implementations of identity based schemes are built around easily computable bilinear mappings of two points on an elliptic curve onto a multiplicative subgroup of a field, also called a pairing. Utilizing the identity of the user simplifies the public key infrastructure; however, because pairing computations are expensive in both area and time, the proposed identity based cryptosystems are hard to implement. To utilize identity based cryptography efficiently, there is a strong need for efficient pairing implementations. Pairing computations can be realized over multiple fields. Since the main building block and the bottleneck of the algorithm is multiplication, we focused our research on building a fast and small arithmetic core that can work over multiple fields. This allows a single piece of hardware to realize a wide spectrum of cryptographic algorithms, including pairings, with a minimal amount of software coding. We present a novel unified core design which realizes Montgomery multiplication in the fields GF(2^n), GF(3^m), and GF(p). Our unified design supports RSA and elliptic curve schemes, as well as identity based encryption, which requires a pairing computation on an elliptic curve. The architecture is pipelined and highly scalable. The unified core utilizes the redundant signed digit representation to reduce the critical path delay. While the carry-save representation used in classical unified architectures is only good for addition and multiplication, the redundant signed digit representation also facilitates efficient comparison and subtraction. Thus, there is no need to transform between the redundant and non-redundant representations of field elements, as classical unified architectures require for subtraction and comparison. We also quantify the benefits of unified architectures in terms of area and critical path delay, and provide detailed implementation results. The metric shows that the new unified architecture improves over a hypothetical non-unified architecture by at least 24.88%, and over a classical unified architecture by at least 32.07%. Until recently there had been no work covering the security of pairing based cryptographic hardware in the presence of side-channel attacks, despite its apparent suitability for identity-aware personal security devices, such as smart cards. We present a novel non-linear error coding framework which incorporates strong adversarial fault detection capabilities into identity based encryption schemes built using Tate pairing computations. The presented algorithms provide quantifiable resilience in a well defined strong attacker model. Given the emergence of fault attacks as a serious threat to pairing based cryptography, the proposed technique solves a key problem when incorporated into software and hardware implementations.
In this dissertation, we also present an efficient accelerator for computing the Tate Pairing in characteristic 3, based on the Modified Duursma Lee algorithm.
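
The unified core is hardware, but the GF(p) Montgomery multiplication it realizes can be sketched in software. This high-level Python version is illustrative only; it ignores the pipelining, redundant signed digit representation and multi-field support of the actual design:

def montgomery_multiply(a: int, b: int, p: int, r_bits: int) -> int:
    # Compute a*b*R^(-1) mod p, where R = 2^r_bits.
    R = 1 << r_bits
    p_inv = -pow(p, -1, R) % R   # p' = -p^(-1) mod R, precomputed per modulus
    t = a * b
    m = (t * p_inv) % R          # choose m so that t + m*p is divisible by R
    u = (t + m * p) >> r_bits    # exact division by R -- no division by p needed
    return u - p if u >= p else u

# Usage: map operands into the Montgomery domain, multiply, map back.
p, r_bits = 2**255 - 19, 256
R2 = pow(1 << r_bits, 2, p)                             # R^2 mod p
a, b = 1234567, 7654321
a_mont = montgomery_multiply(a, R2, p, r_bits)          # a*R mod p
b_mont = montgomery_multiply(b, R2, p, r_bits)          # b*R mod p
prod = montgomery_multiply(a_mont, b_mont, p, r_bits)   # a*b*R mod p
assert montgomery_multiply(prod, 1, p, r_bits) == (a * b) % p

Replacing division by the modulus with shifts by R is what makes the operation hardware-friendly, and the same reduction pattern applies across GF(2^n), GF(3^m) and GF(p), which is why a single unified core can serve all three.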
147

Performance analysis of lattice based post-quantum secure cryptography with Java

Johansson, Alexander January 2019 (has links)
Efficient quantum computers will break most of today's public-key cryptosystems. Therefore, the National Institute of Standards and Technology (NIST) has called for proposals to standardise one or more quantum-secure cryptographic schemes. Eventually, banks must adopt the standardised schemes, but little is known about how efficient such an implementation would be in Java, one of the standard programming languages in banking. In this thesis, we test and evaluate a post-quantum secure encryption scheme known as FrodoKEM, which is based on a hard lattice problem known as Learning With Errors (LWE). We found that a post-quantum secure encryption version of FrodoKEM provides strong theoretical security with regard to the criteria given by NIST, and is also sufficiently fast for key generation, encryption and decryption. These results imply that it is possible to implement these post-quantum secure algorithms in high-level programming languages such as Java, demonstrating that we are no longer limited to low-level languages such as C. Consequently, post-quantum secure cryptography can be implemented more easily and cheaply.
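
FrodoKEM itself is considerably more involved, but the LWE encryption idea it builds on can be sketched with toy, insecure parameters (everything here is illustrative; a real scheme uses far larger dimensions and carefully shaped noise):

import random

q, n = 3329, 16            # toy parameters, far too small to be secure

def sample_error():
    return random.randint(-2, 2)   # small noise; Frodo uses a near-Gaussian table

def keygen():
    s = [random.randrange(q) for _ in range(n)]                      # secret vector
    A = [[random.randrange(q) for _ in range(n)] for _ in range(n)]  # public matrix
    b = [(sum(A[i][j] * s[j] for j in range(n)) + sample_error()) % q
         for i in range(n)]                                          # b = A*s + e
    return (A, b), s

def encrypt(pub, bit):
    A, b = pub
    r = [random.choice((0, 1)) for _ in range(n)]                    # random binary vector
    u = [sum(A[i][j] * r[i] for i in range(n)) % q for j in range(n)]   # r^T * A
    v = (sum(b[i] * r[i] for i in range(n)) + bit * (q // 2)) % q       # r^T * b + encode(bit)
    return u, v

def decrypt(s, ct):
    u, v = ct
    d = (v - sum(u[j] * s[j] for j in range(n))) % q   # = bit*(q/2) + small noise
    return 1 if q // 4 < d < 3 * q // 4 else 0

pub, sec = keygen()
assert all(decrypt(sec, encrypt(pub, bit)) == bit for bit in (0, 1) for _ in range(20))

Decryption works because v - <u, s> collapses to bit*(q/2) plus the accumulated noise r^T e, which is small enough to round away.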
148

Anonymity With Authenticity

Swaroop, D 12 1900 (has links) (PDF)
Cryptography is the science of secure message transmission; cryptanalysis is concerned with breaking encrypted messages. Together, cryptography and cryptanalysis constitute cryptology. Anonymity means namelessness, i.e., the quality or state of being unknown, while authenticity is the quality or condition of being authentic or genuine. Anonymity and authenticity are two different embodiments of personal secrecy. Modern power has an increased capacity to designate individuals, which makes it inconvenient for them to keep communicating while remaining anonymous. In this thesis we describe an anonymous system consisting of a number of entities which communicate with each other without revealing their identities while maintaining their authenticity, such that an anonymous entity (say E1) is able to verify that the messages it receives from another anonymous entity (say E2), subsequent to an initial message from E2, are in fact from E2 itself. Later, when E2 tries to recommend a similar communication between E1 and another anonymous entity E3 in the system, E1 must be able to verify that recommendation without E2 losing, to E3, the authenticity of its communication with E1. This thesis is divided into four chapters. The first chapter is an introduction to cryptography, symmetric key cryptography and public key cryptography; it also summarizes the contribution of this thesis. The second chapter gives various protocols for the problem of anonymity with authenticity, along with its extension; six protocols in total are proposed. In the third chapter all six protocols are realized using four different schemes, each with its own pros and cons. The fourth and final chapter concludes with a note on the factors by which these four realization schemes should be chosen, and on other possible realization schemes.
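
The abstract does not disclose the six protocols, but the core requirement, a pseudonymous identity that stays authentic across messages, can be sketched with an ephemeral signing key pinned to the anonymous session. This assumes the third-party `cryptography` package and covers only per-entity authenticity, not the recommendation step involving E3:

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519

# E2 generates a fresh keypair for this anonymous conversation; the public
# key acts as its pseudonym and travels with the first message.
e2_key = ed25519.Ed25519PrivateKey.generate()
pseudonym = e2_key.public_key()

msg1 = b"initial message from an anonymous entity"
sig1 = e2_key.sign(msg1)

# E1 pins the pseudonym from the first message; every later message is
# checked against it, so E1 knows the follow-ups come from the same sender
# without ever learning who that sender is.
msg2 = b"follow-up from the same entity"
sig2 = e2_key.sign(msg2)
try:
    pseudonym.verify(sig2, msg2)
    print("same anonymous sender")
except InvalidSignature:
    print("not the entity behind the pseudonym")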
149

Encrypt/Decrypt COMSEC Unit for Space-based Command and Telemetry Applications

Merz, Doug, Maples, Bruce 10 1900 (has links)
International Telemetering Conference Proceedings / October 20-23, 2003 / Riviera Hotel and Convention Center, Las Vegas, Nevada / This paper describes the system-level architecture and design concept of communications security (COMSEC) equipment intended for space-based, low data rate (< 1 Mbps) command and telemetry applications. The COMSEC Unit is a stand-alone piece of equipment which provides decryption of uplink command and control information and encryption of downlink telemetry data. The system-level architecture is described, followed by an overview of the digital design concepts and a discussion of applications. Finally, although specifically targeted at narrowband command and telemetry applications, the design approach is flexible enough to accommodate other algorithms of choice and to operate in higher data rate applications.
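
The paper does not name its cipher, so as an assumption-laden sketch only, an encrypt/decrypt unit of this shape can be modeled with authenticated encryption (AES-GCM from the Python `cryptography` package; the frame fields are hypothetical):

import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # in practice loaded via key fill
unit = AESGCM(key)

# Downlink: encrypt a telemetry frame, binding it to cleartext header metadata.
header = b"SCID=42,VCID=1"                  # hypothetical frame header, sent in the clear
frame = b"\x01\x02\x03 telemetry payload"
nonce = os.urandom(12)                      # must never repeat under the same key
downlink = nonce + unit.encrypt(nonce, frame, header)

# The uplink direction is symmetric: decrypt and authenticate a received frame.
recv_nonce, recv_ct = downlink[:12], downlink[12:]
assert unit.decrypt(recv_nonce, recv_ct, header) == frame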
150

A network-based asynchronous architecture for cryptographic devices

Spadavecchia, Ljiljana January 2006 (has links)
The traditional model of cryptography examines the security of the cipher as a mathematical function. However, ciphers that are secure when specified as mathematical functions are not necessarily secure in real-world implementations. The physical implementations of ciphers can be extremely difficult to control and often leak so-called side-channel information. Side-channel cryptanalysis has proven especially effective as a practical means of attacking implementations of cryptographic algorithms on simple hardware platforms, such as smart cards. Adversaries can obtain sensitive information from side channels such as the timing of operations, power consumption and electromagnetic emissions. Some attack techniques require surprisingly little side-channel information to break some of the best known ciphers, and in constrained devices such as smart cards, straightforward implementations of cryptographic algorithms can be broken with minimal work. Preventing these attacks has become an active and challenging area of research. Power analysis is a successful cryptanalytic technique that extracts secret information from cryptographic devices by analysing the power consumed during their operation. A particularly dangerous class of power analysis, differential power analysis (DPA), relies on correlating power consumption measurements. It has been proposed that adding non-determinism to the execution of a cryptographic device would reduce the danger of these attacks, and it has been demonstrated that asynchronous logic has advantages for security-sensitive applications. This thesis investigates the security and performance advantages of a network-based asynchronous architecture, in which the functional units of the datapath form a network. Non-deterministic execution is achieved by exploiting concurrent execution of instructions both with and without data dependencies, and by forwarding register values between data-dependent instructions using randomised routing over the network. Executions of cryptographic algorithms on different architectural configurations are simulated, and the obtained power traces are subjected to DPA attacks. The results show that the proposed architecture introduces a level of non-determinism that significantly raises the threshold for DPA attacks to succeed. In addition, the performance analysis shows that the improved security does not degrade performance.
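
To make the DPA threat concrete, a generic textbook-style correlation attack on simulated power traces (not the thesis's simulation framework; the 4-bit S-box and Hamming weight leakage model are illustrative) fits in a few lines:

import numpy as np

rng = np.random.default_rng(1)
SBOX = np.array([0xC, 5, 6, 0xB, 9, 0, 0xA, 0xD,
                 3, 0xE, 0xF, 8, 4, 7, 1, 2])            # PRESENT S-box (4-bit)
HW = np.array([bin(x).count("1") for x in range(16)])    # Hamming weight table
SECRET_KEY = 0xA

# Simulated traces: leakage is the Hamming weight of the S-box output plus noise.
plaintexts = rng.integers(0, 16, size=2000)
traces = HW[SBOX[plaintexts ^ SECRET_KEY]] + rng.normal(0.0, 1.0, size=2000)

# DPA: for every key guess, correlate predicted leakage with measured traces;
# only the correct guess predicts the right intermediate values.
correlations = [np.corrcoef(HW[SBOX[plaintexts ^ guess]], traces)[0, 1]
                for guess in range(16)]
recovered = int(np.argmax(correlations))
assert recovered == SECRET_KEY

Randomising when and where intermediates appear, as the proposed architecture does through concurrent execution and randomised routing, decorrelates the predicted leakage from any fixed trace point and pushes the number of traces an attacker needs far higher.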
