1 |
Toward practical argument systems for verifiable computation
Setty, Srinath T.V., 09 February 2015
How can a client extract useful work from a server without trusting it to compute correctly? A modern motivation for this classic question is third-party computing models in which customers outsource their computations to service providers (as in cloud computing). In principle, deep results in complexity theory and cryptography imply that it is possible to verify that an untrusted entity executed a computation correctly. For instance, the server can employ probabilistically checkable proofs (PCPs) in conjunction with cryptographic commitments to generate a succinct proof of correct execution, which the client can efficiently check. However, these theoretical solutions are impractical: they require thousands of CPU years to verifiably execute even simple computations.

This dissertation describes the design, implementation, and experimental evaluation of a system, called Pepper, that brings this theory into the realm of plausibility. Pepper incorporates a series of algorithmic improvements and systems engineering techniques to improve performance by over 20 orders of magnitude, relative to an implementation of the theory without our refinements. These include a new probabilistically checkable proof encoding with nearly optimal asymptotics, a concise representation for computations, a more efficient cryptographic commitment primitive, and a distributed implementation of the server with GPU acceleration to reduce latency. Additionally, Pepper extends the verification machinery to handle realistic applications of third-party computing: those that interact with remote storage or state (e.g., MapReduce jobs, database queries). To do so, Pepper composes techniques from untrusted storage with the aforementioned technical machinery to verifiably offload both computations and state. Furthermore, to make this technology easy to use, Pepper includes a compiler that automatically transforms programs written in a subset of C into executables that run verifiably.

One of the chief limitations of Pepper is that verifiable execution is still orders of magnitude slower than unverifiable native execution. Nonetheless, Pepper takes powerful results from complexity theory and verifiable computation a few steps closer to practicality.
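To illustrate the underlying idea of verifying a result far more cheaply than recomputing it, the sketch below implements Freivalds' classic probabilistic check of an outsourced matrix product. It is only a minimal illustration of probabilistic verification, not Pepper's PCP-and-commitment machinery, and the function and parameter names are our own.

```python
import random

def freivalds_verify(A, B, C, rounds=20, seed=None):
    """Probabilistically check that C equals the matrix product A * B.

    The verifier never multiplies A and B itself: each round costs three
    matrix-vector products (O(n^2) work instead of O(n^3)), and an incorrect
    C is accepted with probability at most 2**-rounds.
    """
    rng = random.Random(seed)
    n = len(A)
    for _ in range(rounds):
        x = [rng.randint(0, 1) for _ in range(n)]   # random 0/1 challenge vector
        Bx = [sum(B[i][j] * x[j] for j in range(n)) for i in range(n)]
        ABx = [sum(A[i][j] * Bx[j] for j in range(n)) for i in range(n)]
        Cx = [sum(C[i][j] * x[j] for j in range(n)) for i in range(n)]
        if ABx != Cx:
            return False   # caught the server returning a wrong product
    return True            # accept: C is correct with high probability

# A weak client outsources the product A * B and checks the claimed result.
A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
C_honest = [[19, 22], [43, 50]]
C_cheating = [[19, 22], [43, 51]]
print(freivalds_verify(A, B, C_honest))    # True
print(freivalds_verify(A, B, C_cheating))  # False (with overwhelming probability)
```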
|
2 |
Function-specific schemes for verifiable computation
Papadopoulos, Dimitrios, 07 December 2016
An integral component of modern computing is the ability to outsource data and computation to powerful remote servers, for instance, in the context of cloud computing or remote file storage. While participants can benefit from this interaction, a fundamental security issue that arises is that of integrity of computation: How can the end-user be certain that the result of a computation over the outsourced data has not been tampered with (not even by a compromised or adversarial server)?
Cryptographic schemes for verifiable computation address this problem by accompanying each result with a proof that can be used to check the correctness of the performed computation. Recent advances in the field have led to the first implementations of schemes that can verify arbitrary computations. However, in practice the overhead of these general-purpose constructions remains prohibitive for most applications, with proof-computation times (at the server) on the order of minutes or even hours for real-world problem instances. A different approach to designing such schemes targets specific types of computation and builds custom-made protocols, sacrificing generality for efficiency. An important representative of this function-specific approach is the authenticated data structure (ADS), in which a specialized protocol is designed to support the query types associated with a particular outsourced dataset.
This thesis presents three novel ADS constructions for the important query types of set operations, multi-dimensional range search, and pattern matching, and proves their security under cryptographic assumptions over bilinear groups. The scheme for set operations can support nested queries (e.g., two unions followed by an intersection of the results), extending previous works that accommodate only a single operation. The range-search ADS provides an asymptotic improvement in storage and computation costs over previous schemes that is exponential in the number of attributes in the dataset. Finally, the pattern-matching ADS supports text-pattern and XML path queries with minimal cost; for example, across all tested settings the overhead at the server is less than 4% of the cost of simply computing the result. The experimental evaluation of all three constructions shows significant improvements in proof-computation time over general-purpose schemes.
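For readers unfamiliar with authenticated data structures, the sketch below shows the simplest textbook instance, a Merkle hash tree: the client keeps only a short digest, and the untrusted server answers membership queries together with logarithmic-size proofs. This is a hedged illustration of the ADS concept only; the constructions in this thesis rely on bilinear-group assumptions and support much richer queries than membership.

```python
import hashlib

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def build_tree(leaves):
    """Return all levels of a Merkle tree, leaves first, root last."""
    level = [H(leaf) for leaf in leaves]
    levels = [level]
    while len(level) > 1:
        if len(level) % 2:                  # duplicate the last node on odd levels
            level = level + [level[-1]]
        level = [H(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        levels.append(level)
    return levels

def prove(levels, index):
    """Server side: collect the sibling hashes on the path to the root."""
    proof = []
    for level in levels[:-1]:
        if len(level) % 2:
            level = level + [level[-1]]
        sibling = index ^ 1
        proof.append((level[sibling], index % 2))   # (hash, am-I-the-right-child?)
        index //= 2
    return proof

def verify(root, leaf, proof):
    """Client side: recompute the root from the leaf and O(log n) siblings."""
    h = H(leaf)
    for sibling, is_right in proof:
        h = H(sibling + h) if is_right else H(h + sibling)
    return h == root

leaves = [b"alice", b"bob", b"carol", b"dave", b"erin"]
levels = build_tree(leaves)
root = levels[-1][0]                   # the client keeps only this digest
proof = prove(levels, 2)               # server proves that "carol" is in the set
print(verify(root, b"carol", proof))   # True
print(verify(root, b"mallory", proof)) # False
```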
|
3 |
Practical Verified Computation with Streaming Interactive Proofs
Thaler, Justin R., 14 October 2013
As the cloud computing paradigm has gained prominence, the need for verifiable computation has grown urgent. Protocols for verifiable computation enable a weak client to outsource difficult computations to a powerful, but untrusted, server. These protocols provide the client with a (probabilistic) guarantee that the server performed the requested computations correctly, without requiring the client to perform the computations herself.
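A canonical building block behind interactive proofs of this kind is the sum-check protocol, in which the prover convinces the verifier of the sum of a low-degree polynomial over the Boolean hypercube while the verifier's work reduces to a handful of field operations plus one evaluation of the polynomial at a random point. The sketch below is a minimal, unoptimized Python rendering of the textbook protocol for polynomials of individual degree at most two; it is an illustrative sketch under those assumptions, not the streaming implementation evaluated in this thesis.

```python
import random

P = 2**61 - 1  # prime modulus for the field (an illustrative choice)

def lagrange_eval(e0, e1, e2, r):
    """Evaluate the degree-<=2 polynomial through (0,e0), (1,e1), (2,e2) at r mod P."""
    inv2 = pow(2, P - 2, P)
    l0 = (r - 1) % P * ((r - 2) % P) % P * inv2 % P
    l1 = (P - 1) * r % P * ((r - 2) % P) % P          # equals -r(r-2) mod P
    l2 = r * ((r - 1) % P) % P * inv2 % P
    return (e0 * l0 + e1 * l1 + e2 * l2) % P

def partial_sum(g, prefix, t, n):
    """Prover: fix the first variables to `prefix`, the next one to t, and sum
    g over all Boolean settings of the remaining variables."""
    rest = n - len(prefix) - 1
    total = 0
    for mask in range(1 << rest):
        tail = [(mask >> i) & 1 for i in range(rest)]
        total = (total + g(prefix + [t] + tail)) % P
    return total

def sumcheck(g, n, seed=None):
    """Run the sum-check protocol for the claim H = sum of g over {0,1}^n.

    g takes a list of n field elements and has degree at most 2 in each
    variable. The verifier does O(n) field work plus a single evaluation of g,
    instead of summing 2^n terms itself. Returns (accepted, claimed_sum).
    """
    rng = random.Random(seed)
    # Prover computes the claimed sum (here honestly, by brute force).
    claimed = sum(g([(m >> i) & 1 for i in range(n)]) for m in range(1 << n)) % P

    current, fixed = claimed, []
    for _ in range(n):
        # Prover's message: the round polynomial s(t), sent as s(0), s(1), s(2).
        e0, e1, e2 = (partial_sum(g, fixed, t, n) for t in (0, 1, 2))
        # Verifier's consistency check: s(0) + s(1) must equal the running claim.
        if (e0 + e1) % P != current:
            return False, claimed
        r = rng.randrange(P)                  # verifier's random challenge
        current = lagrange_eval(e0, e1, e2, r)
        fixed.append(r)
    # Final check: one evaluation of g at the random point.
    return current == g(fixed) % P, claimed

# Example: g(x) = x1*x2 + 2*x3 + x1, summed over {0,1}^3.
g = lambda x: (x[0] * x[1] + 2 * x[2] + x[0]) % P
ok, total = sumcheck(g, 3, seed=0)
print(ok, total)   # True 14
```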
|
4 |
Trust and verifiable computation for smart contracts in permissionless blockchains
Harz, Dominik, January 2017
Blockchains address trust through cryptography and consensus. Bitcoin is the first digital currency without trusted agents, and Ethereum extends this technology by enabling agents on a blockchain via smart contracts. However, a systemic trust model for smart contracts in blockchains is missing. This thesis describes the ecosystem of smart contracts as an open multi-agent system. A trust model introduces social control through deposits and review agents. Trust-related attributes are quantified in 2,561 smart contracts from GitHub: the analyzed contracts use a mean of three variables and functions, and one in ten has a security-related issue. Moreover, blockchains restrict the computation tasks that can be executed on-chain. Resolving these restrictions while maintaining trust requires verifiable computation. An algorithm for verifiable computation is therefore developed and implemented in Solidity. It uses an arbiter that enforces the algorithm, computation services that provide and verify solutions, and a judge that assesses solutions. Experiments are performed with 1,000 iterations for one to six verifiers and cheater prior probabilities of 30%, 50%, and 70%. The algorithm shows linear complexity for integer multiplication. Verification performance depends on the cheater prior probability and the number of verifiers; in the experiments, six verifiers are sufficient to detect all cheaters for all three prior probabilities.
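To convey the intuition behind these experiments, the following Python sketch simulates a simplified verification game: each of k verifiers independently turns out to be a cheater with the stated prior probability, and a wrong solution is caught whenever at least one honest verifier disputes it. The model and numbers below are our illustrative assumptions, not the Solidity implementation or the measured results of the thesis.

```python
import random

def detection_rate(num_verifiers, cheater_prob, iterations=1000, seed=0):
    """Fraction of simulated runs in which a cheating solver is caught.

    Simplified model: each verifier independently cheats (rubber-stamps the
    wrong result) with probability `cheater_prob`; a single honest verifier
    suffices to dispute the result and escalate it to the judge.
    """
    rng = random.Random(seed)
    caught = 0
    for _ in range(iterations):
        verifiers_cheat = [rng.random() < cheater_prob for _ in range(num_verifiers)]
        if not all(verifiers_cheat):       # at least one honest verifier disputes
            caught += 1
    return caught / iterations

# Sweep one to six verifiers for the three cheater prior probabilities.
for p in (0.3, 0.5, 0.7):
    rates = [round(detection_rate(k, p), 3) for k in range(1, 7)]
    print(f"cheater prior {p:.0%}: detection rate for 1..6 verifiers = {rates}")
```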
|
5 |
Towards Secure Outsourced Data Services in the Public Cloud
Sun, Wenhai, 25 July 2018
The past few years have witnessed a dramatic shift of IT infrastructures from a self-sustained model to a centralized, multi-tenant elastic computing paradigm, cloud computing, which significantly reshapes the landscape of existing data utilization services. Public cloud service providers (CSPs), e.g., Google and Amazon, offer unprecedented benefits such as ubiquitous and flexible access, considerable capital expenditure savings, and on-demand resource allocation. The cloud has also become the virtual "brain" that supports and propels many important applications and system designs, for example artificial intelligence and the Internet of Things. On the flip side, security and privacy are among the primary concerns with the adoption of cloud-based data services, because the user loses control of her/his outsourced data. Encrypting sensitive user information certainly ensures confidentiality. However, encryption makes the outsourced data opaque to the server, and its direct use may be at odds with practical requirements and defeat the purpose of cloud computing technology. We believe that security should not, by nature, be in contravention of the cloud outsourcing model. Rather, it is expected to complement current achievements and further fuel the wide adoption of public cloud services; this, in turn, requires that security not be decoupled from the system design from the very beginning. Drawing on successes and failures from both academia and industry, we attempt to answer the challenges of realizing efficient and useful secure data services in the public cloud. In particular, we focus on security and privacy in two essential functions of the cloud "brain": data storage and processing. Our first work centers on secure chunk-based deduplication of encrypted data for cloud backup; it achieves performance comparable to plaintext cloud storage deduplication while effectively mitigating the information leakage from low-entropy chunks. In addition, we comprehensively study the promising yet challenging problem of search over encrypted data in the cloud environment, which allows a user to delegate her/his search task to a CSP server that hosts a collection of encrypted files while still guaranteeing some measure of query privacy. To accomplish this vision, we explore both software-based secure computation, which often relies on cryptography and concentrates on algorithmic design and theoretical proof, and trusted execution solutions, which depend on hardware-based isolation and trusted computing. Hopefully, through the lens of our efforts, insights can be furnished into future research in the related areas.
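As background for the deduplication result, the sketch below shows message-locked (convergent) encryption, the classic building block for deduplicating encrypted data: each chunk is encrypted under a key derived from its own content, so identical plaintext chunks yield identical ciphertexts that the server can deduplicate without reading them. This is only a minimal sketch; it exhibits exactly the low-entropy-chunk leakage that the scheme in this dissertation mitigates, and the use of the Python cryptography package with deterministic nonces is our assumption for illustration.

```python
import hashlib
from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # pip install cryptography

def convergent_encrypt(chunk: bytes):
    """Message-locked encryption: key and nonce are derived from the chunk
    itself, so equal chunks always yield equal ciphertexts."""
    key = hashlib.sha256(chunk).digest()                      # 32-byte content-derived key
    nonce = hashlib.sha256(b"nonce" + chunk).digest()[:12]    # deterministic on purpose
    ciphertext = AESGCM(key).encrypt(nonce, chunk, None)
    tag = hashlib.sha256(ciphertext).hexdigest()              # dedup handle the server indexes on
    return key, nonce, tag, ciphertext

class DedupStore:
    """Toy server-side store that keeps one copy per distinct ciphertext."""
    def __init__(self):
        self.blobs = {}

    def upload(self, tag, nonce, ciphertext):
        is_duplicate = tag in self.blobs
        if not is_duplicate:
            self.blobs[tag] = (nonce, ciphertext)             # store only new chunks
        return is_duplicate

store = DedupStore()
for chunk in (b"project plan v1", b"meeting notes", b"project plan v1"):
    key, nonce, tag, ct = convergent_encrypt(chunk)
    duplicate = store.upload(tag, nonce, ct)
    print(f"chunk of {len(chunk)} bytes -> duplicate on server: {duplicate}")
    # The client keeps `key` (or re-derives it from the chunk) to decrypt later.
    assert AESGCM(key).decrypt(nonce, ct, None) == chunk
```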
|