611

Exploring the Composition of Coding Theory and Cryptography through Secure Computation, Succinct Arguments, and Local Codes

Alexander R Block (13154601) 26 July 2022 (has links)
We examine new ways in which coding theory and cryptography continue to be composed together, and show that the composition of these two fields yields new constructions in the areas of Secure Computation Protocols, Succinct Interactive Arguments, and Locally Decodable Codes. This dissertation is a continuation of several decades of research in composing coding theory and cryptography; examples include secret sharing, encryption schemes, randomness extraction, pseudo-random number generation, and the PCP theorem, to name a few.

In Part I of this dissertation, we examine the composition of coding theory with cryptography, explicitly and implicitly. On the explicit side, we construct a new family of linear error-correcting codes, based on algebraic geometric codes, and use this family to construct new correlation extractors (Ishai et al., FOCS 2009). Correlation extractors are two-party secure computation protocols for distilling samples of a leaky correlation (e.g., pre-processed secret shares that have been exposed to side-channel attacks) into secure and fresh shares of another correlation (e.g., shares of oblivious transfer). Our correlation extractors are (nearly) optimal in all parameters. On the implicit side, we use coding-theoretic arguments to show the security of succinct interactive arguments (Micali, FOCS 1994). Succinct interactive arguments are a restriction of interactive proofs (Goldwasser, Micali, Rackoff, STOC 1985) for which security only holds against computationally bounded provers (i.e., probabilistic polynomial time), and where the proofs are sub-linear in the size of the statement being proven. Our new succinct interactive arguments are the first public-coin, zero-knowledge arguments with time- and space-efficient provers: we give two protocols where any NP statement that is verifiable by a time-T space-S RAM program is provable in time O~(T) and space S * polylog(T).

In Part II of this dissertation, we examine the composition of cryptography with coding theory, again explicitly and implicitly, focusing specifically on locally decodable codes (Katz and Trevisan, STOC 2000). Locally decodable codes, or LDCs, are error-correcting codes with super-efficient probabilistic decoding procedures that allow for decoding individual symbols of the encoded message without decoding the entire codeword. On the implicit side, we utilize cryptographic analysis tools to give a conceptually simpler proof of the so-called "Hamming-to-InsDel" compiler (Ostrovsky and Paskin-Cherniavsky, ITS 2015). This compiler transforms any Hamming LDC (i.e., a code that is resilient to bit-flip errors) into another LDC that is resilient to the broad class of insertion-deletion errors, approximately preserving the rate and error-tolerance of the code at the cost of a poly-logarithmic increase in the query complexity. We further extend this compiler to both the private LDC (Ostrovsky, Pandey, and Sahai, ICALP 2007) setting, where the encoder and decoder are assumed to share a secret key unknown to the adversarial channel, and the resource-bounded LDC (Blocki, Kulkarni, and Zhou, ITC 2020) setting, where the adversarial channel is assumed to be resource constrained. On the explicit side, we utilize two cryptographic primitives to give new constructions of alternative notions of LDCs. First, we use cryptographic puzzles (Bitansky et al., ITCS 2016) to construct resource-bounded Hamming LDCs in the standard model without random oracles, answering an open question of Blocki, Kulkarni, and Zhou (ITC 2020); we then naturally extend these LDCs to the InsDel setting via our previously mentioned compiler. Second, we use digital signature schemes to directly construct computationally relaxed LDCs (Blocki et al., ITIT 2021) that are resilient to insertion-deletion errors. Computationally relaxed LDCs allow the decoder to output an extra symbol signifying it does not know the correct output, and are only secure against probabilistic polynomial time adversarial channels. To the best of our knowledge, this is the first such LDC (of any type) resilient against insertion-deletion errors that does not rely on the aforementioned compiler.
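
As a concrete point of reference for the local-decoding property described in this abstract, the sketch below (not taken from the dissertation) shows the classical Hadamard-code local decoder, which recovers any single message bit with only two queries to a possibly corrupted codeword; all names and parameters are illustrative.

```python
import random

def hadamard_encode(msg_bits):
    """Encode a k-bit message as all 2^k inner products <msg, a> over GF(2)."""
    k = len(msg_bits)
    code = []
    for a in range(2 ** k):
        bit = 0
        for i in range(k):
            if (a >> i) & 1:
                bit ^= msg_bits[i]
        code.append(bit)
    return code

def local_decode_bit(codeword, k, i, trials=31):
    """Recover message bit i with 2 queries per trial plus a majority vote.
    Each trial is correct with probability at least 1 - 2*delta when a
    delta fraction of codeword positions has been corrupted."""
    votes = 0
    for _ in range(trials):
        a = random.randrange(2 ** k)
        b = a ^ (1 << i)                 # flip coordinate i of the query point
        votes += codeword[a] ^ codeword[b]
    return int(votes > trials // 2)

# Corrupt roughly 5% of positions, then locally decode every message bit.
msg = [1, 0, 1, 1, 0, 1, 0, 0]
cw = hadamard_encode(msg)
for pos in random.sample(range(len(cw)), len(cw) // 20):
    cw[pos] ^= 1
assert [local_decode_bit(cw, len(msg), i) for i in range(len(msg))] == msg
```

The dissertation's constructions are far more involved (and extend to insertion-deletion errors), but this is the query-efficiency property that "locally decodable" refers to.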
612

Volumes of Balls in Grassmann Manifolds with Applications to Coding Theory

Keenan, Patrick Jordan 19 April 2008 (has links)
This thesis develops the Riemannian Geometry of the real and complex Grassmann Manifolds in a notationally accessible way. The canonical volume form is related to explicit Jacobi Field calculations. The implementation of a packing algorithm based on repulsive forces is proposed. Standard packing bounds and bounds on the volumes of geodesic balls are used to test the performance of the algorithm. / Thesis / Master of Science (MSc)
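
As a rough illustration of the repulsive-force packing idea mentioned above, and not the thesis's implementation, the sketch below packs points on the unit sphere (rather than a Grassmann manifold) by alternating pairwise repulsion with re-projection; the step size, step count, and inverse-square force law are assumed choices.

```python
import numpy as np

def repulsive_pack_sphere(n_points, dim=3, steps=500, step_size=0.05, seed=0):
    """Spread n_points over the unit sphere by simulating inverse-square
    pairwise repulsion, projecting back onto the sphere after each step."""
    rng = np.random.default_rng(seed)
    pts = rng.normal(size=(n_points, dim))
    pts /= np.linalg.norm(pts, axis=1, keepdims=True)
    for _ in range(steps):
        diff = pts[:, None, :] - pts[None, :, :]                  # pairwise differences
        dist = np.linalg.norm(diff, axis=2) + np.eye(n_points)    # ones on the diagonal avoid /0
        force = (diff / dist[:, :, None] ** 3).sum(axis=1)        # inverse-square repulsion
        pts += step_size * force
        pts /= np.linalg.norm(pts, axis=1, keepdims=True)         # project back onto the sphere
    return pts

pts = repulsive_pack_sphere(12)
d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
min_dist = d[d > 0].min()          # packing quality: smallest pairwise distance
```

A packing on a Grassmann manifold would replace the Euclidean distance with a chordal or geodesic distance between subspaces and re-project by orthonormalization, which is closer to the setting the thesis considers.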
613

Vector quantization in residual-encoded linear prediction of speech

Abramson, Mark January 1983 (has links)
No description available.
614

An experimental test of dual coding theory using various media and visual momentum in a multimedia environment

Neale, Wayne Carl 06 June 2008 (has links)
Various media (text, audio, graphics, and animation) were examined in the context of a multimedia environment. These media were used to test predictions based on a model of dual-coding theory (DCT) and on an engineering design principle named visual momentum. DCT is a general cognitive information processing theory. Under DCT, external stimuli are represented in either the verbal or nonverbal symbolic systems. The verbal system is specialized to handle language or abstract information. The nonverbal system is specialized to handle more concrete information such as images, environmental sounds, and writing patterns. Visual momentum is a general design principle used to demonstrate relationships between successively viewed computer screens. This study applied visual momentum through the use of animation to explicitly demonstrate the relationships between information represented in one format and that represented in another format, both within and between computer screens. Subjects were required to complete a multimedia program explaining material about total quality management. A 3 x 3 between-subjects design was modeled after DCT and visual momentum. Ninety subjects were exposed to various media conditions and were subsequently tested for retention and problem solving as well as several other measures. Generally, the results do not support DCT, although some findings do. Dual-coded groups spent less time answering retention questions while performing better than single-coded groups. However, subjects spent more time on the material in these conditions. Subjects receiving only a single medium reviewed the material more often than those receiving dual media, except when presented with audio. This difference between text and audio does not support DCT. Subjects in dual-coded graphic conditions reported more referential and associative processing than those receiving dual-coded animation conditions. Generally, the results support visual momentum. Visual momentum reduced the time needed to answer retention questions and improved test performance beyond results predicted by DCT. Visual momentum also reduced the amount of cognitive processing needed to correctly answer the retention and problem solving questions. / Ph. D.
615

Digital image segmentation using periodic codings

Celik, Mehmet Kemal January 1988 (has links)
Digital image segmentation using periodic codings is explored with reference to two applications. First, the application of uniform periodic codings, to the problem of segmenting the in-focus regions in an image from the blurred parts, is discussed. The work presented in this part extends a previous investigation on this subject by considering the leakage effects. The method proposed consists of two stages. In each stage, filtering is done in the spatial frequency domain after uniform grating functions are applied to the images in the spatial domain. Then, algorithms for finding the period and phase of a physical grating are explored for a hybrid optical-digital application of the method. Second, a model for textures as the linear superposition of periodic narrowband components, defined as tones, is proposed. A priori information about the number of the tones, their spatial frequencies, and coefficients is necessary to generate tone and texture indicators. Tone indicators are obtained by filtering the image with complex analytical functions defined by the spatial frequencies of the tones present in the image. A criterion for choosing the dimensions of the filter is also provided. Texture indicators are then generated for each texture in the image by applying the a priori information of the tonal coefficients to the filtered images. Several methods for texture segmentation which employ texture indicators are proposed. Finally, examples which illustrate the characteristics of the method are presented. / Master of Science
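
For orientation only (this is not the thesis's algorithm), the toy sketch below shows the basic mechanics the abstract builds on: modulating an image with a uniform sinusoidal grating and isolating the resulting first-order sideband by filtering in the spatial-frequency domain. The grating period and filter bandwidth are illustrative.

```python
import numpy as np

def grating_sideband(image, period=8, bandwidth=3):
    """Multiply an image by a uniform sinusoidal grating, then isolate the
    first-order sideband with a box filter in the spatial-frequency domain."""
    h, w = image.shape
    x = np.arange(w)
    grating = 0.5 * (1 + np.cos(2 * np.pi * x / period))   # uniform periodic coding
    coded = image * grating[None, :]
    spectrum = np.fft.fftshift(np.fft.fft2(coded))
    cy, cx = h // 2, w // 2
    sx = cx + w // period                                  # sideband centre along x
    mask = np.zeros_like(spectrum)
    mask[cy - bandwidth:cy + bandwidth + 1, sx - bandwidth:sx + bandwidth + 1] = 1
    return np.abs(np.fft.ifft2(np.fft.ifftshift(spectrum * mask)))

demo = grating_sideband(np.random.default_rng(0).random((64, 64)))
```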
616

Performance analysis of CDMA systems in multipath channels

Cameron, Rick 31 October 2009 (has links)
This thesis provides a performance analysis of Code Division Multiple Access (CDMA) systems in non-ideal conditions. Two benefits of CDMA include resistance to multipath fading and graceful performance degradation in the presence of multiple access interference. These benefits have made CDMA especially attractive for cellular telephone systems and other wireless networks. This thesis project is divided into four tasks. The first objective is to implement a channel model incorporating multiple access interference and to verify the results. A method is used that obtains arbitrarily tight upper and lower bounds on the average probability of error without using a Gaussian approximation. The second objective is to model imperfect power control. The third objective is to incorporate multipath effects into this model using a discrete number of multipath components with some known probability distribution. This modeling is based on measurements taken in indoor and outdoor channels. The fourth objective is a performance analysis for a RAKE receiver applied to outdoor channels. / Master of Science
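
As background for the multiple-access-interference setting analysed here, the following is a minimal synchronous DS-CDMA sketch, not the thesis's channel model (which adds imperfect power control, measured multipath profiles, and a RAKE receiver); the spreading codes, noise level, and parameters are assumed values.

```python
import numpy as np

def cdma_ber_user0(n_users=4, n_bits=1000, spread_factor=31, noise_std=0.5, seed=1):
    """Toy synchronous DS-CDMA link: spread each user's bits with a random
    +/-1 code, sum all users' signals plus noise, and recover user 0 by
    correlating against that user's spreading code."""
    rng = np.random.default_rng(seed)
    codes = rng.choice([-1, 1], size=(n_users, spread_factor))
    bits = rng.choice([-1, 1], size=(n_users, n_bits))
    # Chip sequence per user: each bit repeated across its spreading code.
    chips = (bits[:, :, None] * codes[:, None, :]).reshape(n_users, -1)
    received = chips.sum(axis=0) + noise_std * rng.normal(size=chips.shape[1])
    # Despread user 0: correlate each symbol interval with code 0 and slice.
    decisions = np.sign(received.reshape(n_bits, spread_factor) @ codes[0])
    return np.mean(decisions != bits[0])      # bit error rate for user 0

print(cdma_ber_user0())
```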
617

Low bit rate speech coding

Kritzinger, Carl 03 1900 (has links)
Thesis (MScIng (Electrical and Electronic Engineering))--University of Stellenbosch, 2006. / Despite enormous advances in digital communication, the voice is still the primary tool with which people exchange ideas. However, uncompressed digital speech tends to require prohibitively high data rates (upward of 64kbps), making it impractical for many applications. Speech coding is the process of reducing the data rate of digital voice to manageable levels. Parametric speech coders or vocoders utilise a-priori information about the mechanism by which speech is produced in order to achieve extremely efficient compression of speech signals (as low as 1 kbps). The greater part of this thesis comprises an investigation into parametric speech coding. This consisted of a review of the mathematical and heuristic tools used in parametric speech coding, as well as the implementation of an accepted standard algorithm for parametric voice coding. In order to examine avenues of improvement for the existing vocoders, we examined some of the mathematical structure underlying parametric speech coding. Following on from this, we developed a novel approach to parametric speech coding which obtained promising results under both objective and subjective evaluation. An additional contribution by this thesis was the comparative subjective evaluation of the effect of parametric speech coding on English and Xhosa speech. We investigated the performance of two different encoding algorithms on the two languages.
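
The parametric coders studied here are built around linear prediction; as a small illustration of that building block (not the thesis's coder), the sketch below estimates LPC coefficients for one frame with the autocorrelation method and the Levinson-Durbin recursion. The frame length, prediction order, and Hamming window are assumed choices.

```python
import numpy as np

def lpc_coefficients(frame, order=10):
    """Estimate linear-prediction coefficients a[0..order] (with a[0] = 1) for
    one speech frame via the autocorrelation method and Levinson-Durbin."""
    frame = frame * np.hamming(len(frame))
    full = np.correlate(frame, frame, mode="full")
    r = full[len(frame) - 1:len(frame) + order]     # autocorrelation lags 0..order
    a = np.zeros(order + 1)
    a[0] = 1.0
    err = r[0]
    for i in range(1, order + 1):
        acc = r[i] + np.dot(a[1:i], r[i - 1:0:-1])
        k = -acc / err                              # reflection coefficient
        prev = a.copy()
        for j in range(1, i):
            a[j] = prev[j] + k * prev[i - j]
        a[i] = k
        err *= (1.0 - k * k)                        # prediction error energy shrinks
    return a, err

# Toy frame: two sinusoids plus a little noise, 240 samples (30 ms at 8 kHz).
t = np.arange(240)
frame = np.sin(0.3 * t) + 0.5 * np.sin(0.7 * t)
frame += 0.01 * np.random.default_rng(0).normal(size=t.size)
a, residual_energy = lpc_coefficients(frame)
```

A vocoder then transmits quantized versions of such coefficients (plus pitch and gain parameters) rather than the waveform itself, which is how rates near 1 kbps become feasible.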
618

Number theoretic methods and their significance in computer science, information theory, combinatorics, and geometry

Bibak, Khodakhast 13 April 2017 (has links)
In this dissertation, I introduce some number theoretic methods and discuss their intriguing applications to a variety of problems in computer science, information theory, combinatorics, and geometry. First, using properties of Ramanujan sums and of the discrete Fourier transform of arithmetic functions, we give an explicit formula for the number of solutions of restricted linear congruences in their 'most general case'. As a consequence, we derive necessary and sufficient conditions under which these congruences have no solutions. The number of solutions of this kind of congruence was first considered by Rademacher in 1925 and Brauer in 1926, in a special case. Since then, this problem has been studied, in several other special cases, in many papers. The problem is very well-motivated and has found intriguing applications in several areas of mathematics, computer science, and physics, and there is promise for more applications/implications in these or other directions. Universal hash functions, discovered by Carter and Wegman in 1979, have many important applications in computer science. Applying our results we construct an almost-universal hash function family which is used to give a generalization of a recent authentication code with secrecy scheme. As another application of our results, we prove an explicit and practical formula for the number of surface-kernel epimorphisms from a co-compact Fuchsian group to a cyclic group. This problem has important applications in combinatorics, geometry, string theory, and quantum field theory (QFT). As a consequence, we obtain an 'equivalent' form of Harvey's famous theorem on the cyclic groups of automorphisms of compact Riemann surfaces. We also consider the number of solutions of linear congruences with distinct coordinates, and using a graph theoretic method, generalize a result of Schönemann from 1839. Also, we give explicit formulas for the number of solutions of unweighted linear congruences with distinct coordinates. Our main tools are properties of Ramanujan sums and of the discrete Fourier transform of arithmetic functions. Then, as an application, we derive an explicit formula for the number of codewords in the Varshamov-Tenengolts code VT_b(n) with Hamming weight k, that is, with exactly k 1's. The Varshamov-Tenengolts codes are an important class of codes that are capable of correcting asymmetric errors on a Z-channel. As another application, we derive Ginzburg's formula for the number of codewords in VT_b(n), that is, |VT_b(n)|. We even go further and discuss applications to several other combinatorial problems, some of which have appeared in seemingly unrelated contexts. This provides a general framework and gives new insight into these problems which might lead to further work. Finally, we bring a very deep result of Pierre Deligne into the area of coding theory: we connect Lee codes to Ramanujan graphs by showing that the Cayley graphs associated with some quasi-perfect Lee codes are Ramanujan graphs (this solves a recent conjecture). Our main tools are Deligne's bound from 1977 for estimating a particular kind of trigonometric sum and a result of Lovász from 1975 (or of Babai from 1979) which gives the eigenvalues of Cayley graphs of finite Abelian groups. Our proof techniques may motivate more work in the interactions between spectral graph theory, character theory, and coding theory, and may provide new ideas towards the long-standing Golomb-Welch conjecture. / Graduate / 0984
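
To make the Varshamov-Tenengolts codes mentioned above concrete, here is a small brute-force sketch (illustrative only; the dissertation derives the corresponding counts in closed form) that enumerates VT_b(n) and tallies its codewords by Hamming weight.

```python
from itertools import product

def vt_codewords(n, b):
    """All binary words x_1..x_n of the Varshamov-Tenengolts code VT_b(n),
    i.e. those with sum(i * x_i) congruent to b modulo n + 1."""
    return [x for x in product((0, 1), repeat=n)
            if sum(i * xi for i, xi in enumerate(x, start=1)) % (n + 1) == b]

def weight_distribution(n, b):
    """Number of VT_b(n) codewords with exactly k ones, for each weight k."""
    counts = {}
    for x in vt_codewords(n, b):
        counts[sum(x)] = counts.get(sum(x), 0) + 1
    return counts

# Example: the size and weight distribution of VT_0(8).
print(len(vt_codewords(8, 0)), weight_distribution(8, 0))
```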
619

Homomorphic encryption and coding theory

Půlpánová, Veronika January 2012 (has links)
Title: Homomorphic encryption and coding theory Author: Veronika Půlpánová Department: Department of algebra Supervisor: RNDr. Michal Hojsík, Ph.D., Department of algebra Abstract: The current mainstream in fully homomorphic encryption is the approach that uses the theory of lattices. The thesis explores alternative approaches to homomorphic encryption. First we present a code-based homomorphic encryption scheme by Armknecht et al. and study its properties. Then we describe the family of cryptosystems commonly known as Polly Cracker and identify its problematic aspects. The main contribution of this thesis is the design of a new fully homomorphic symmetric encryption scheme based on Polly Cracker. It proposes a new approach to overcoming the complexity of the simple Polly Cracker-based cryptosystems. It uses Gröbner bases to generate zero-dimensional ideals of polynomial rings over finite fields whose factor rings are then used as the rings of ciphertexts. Gröbner bases equip these rings with a multiplicative structure that is easily algorithmized, thus providing an environment for a fully homomorphic cryptosystem. Keywords: Fully homomorphic encryption, Polly Cracker, coding theory, zero-dimensional ideals
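
For readers unfamiliar with the Polly Cracker family, below is a deliberately naive toy sketch of the basic idea, not the scheme proposed in this thesis, and insecure in this simple form (which is part of what the thesis addresses): the secret key is a common zero of the public polynomials, a ciphertext masks the message with a random combination of those polynomials, and decryption evaluates at the secret point. The field size and polynomials are illustrative.

```python
import random
from sympy import symbols, expand

x, y = symbols("x y")
p = 101                      # toy prime modulus; real parameters are far larger

# Secret key: a point at which every public polynomial vanishes.
sx, sy = 7, 13

# Public key (toy choice): polynomials vanishing at the secret point.
f1 = expand((x - sx) * (y + 2))
f2 = expand((y - sy) * (x**2 + 1))

def encrypt(m):
    """Mask the message with a random polynomial combination of f1 and f2."""
    h1 = random.randrange(p) + random.randrange(p) * x
    h2 = random.randrange(p) * y
    return expand(m + h1 * f1 + h2 * f2)

def decrypt(c):
    """Evaluate at the secret point; the masking terms vanish there."""
    return int(c.subs({x: sx, y: sy})) % p

# Ciphertext sums and products decrypt to sums and products of messages mod p.
c1, c2 = encrypt(3), encrypt(5)
assert decrypt(c1 + c2) == 8 and decrypt(c1 * c2) == 15
```

The thesis replaces this naive setup with factor rings of zero-dimensional ideals, where a Gröbner basis gives the ciphertext ring an easily algorithmized multiplicative structure by reducing products to normal form.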
620

Analysis of bounded distance decoding for Reed Solomon codes

Babalola, Oluwaseyi Paul January 2017 (has links)
Masters Report A report submitted in fulfillment of the requirements for the degree of Master of Science (50/50) in the Centre for Telecommunication Access and Services (CeTAS) School of Electrical and Information Engineering Faculty of Engineering and the Built Environment February 2017 / Bounded distance decoding of Reed Solomon (RS) codes involves finding a unique codeword if there is at least one codeword within the given distance. A corrupted message having errors that are less than or equal to half the minimum distance corresponds to a unique codeword, and therefore will decode errors correctly using the minimum distance decoder. However, increasing the decoding radius to be slightly higher than half of the minimum distance may result in multiple codewords within the Hamming sphere. The list decoding and syndrome extension methods provide a maximum error correcting capability whereby the radius of the Hamming ball can be extended for low rate RS codes. In this research, we study the probability of having unique codewords for (7, k) RS codes when the decoding radius is increased from the error correcting capability t to t + 1. Simulation results show a significant effect of the code rates on the probability of having unique codewords. It also shows that the probability of having a unique codeword for low rate codes is close to one. / MT2017
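
The experiment described above can be pictured with a small Monte Carlo sketch (illustrative only; the report's simulation setup and parameters may differ): enumerate every codeword of a (7, k) RS code over GF(8), inject t + 1 symbol errors, and check how often the received word still has a unique codeword within that radius.

```python
import itertools
import random

# GF(8) exp/log tables from the primitive polynomial x^3 + x + 1 (assumed choice).
EXP, LOG = [0] * 14, [0] * 8
v = 1
for i in range(7):
    EXP[i], LOG[v] = v, i
    v <<= 1
    if v & 0b1000:
        v ^= 0b1011
for i in range(7, 14):
    EXP[i] = EXP[i - 7]

def gf_mul(a, b):
    return 0 if a == 0 or b == 0 else EXP[LOG[a] + LOG[b]]

def rs_codewords(k):
    """All codewords of the (7, k) RS code: evaluations of every polynomial of
    degree < k at the 7 nonzero elements of GF(8)."""
    points = [EXP[i] for i in range(7)]
    words = []
    for coeffs in itertools.product(range(8), repeat=k):
        word = []
        for pt in points:
            acc = 0
            for c in coeffs:                 # Horner evaluation over GF(8)
                acc = gf_mul(acc, pt) ^ c
            word.append(acc)
        words.append(tuple(word))
    return words

def prob_unique(k, radius, trials=200, seed=0):
    """Estimate the probability that a word at Hamming distance `radius` from
    a random codeword still has a unique codeword within that radius."""
    rng = random.Random(seed)
    code = rs_codewords(k)
    unique = 0
    for _ in range(trials):
        received = list(rng.choice(code))
        for pos in rng.sample(range(7), radius):
            received[pos] ^= rng.randrange(1, 8)        # nonzero symbol error
        within = sum(1 for w in code
                     if sum(a != b for a, b in zip(w, received)) <= radius)
        unique += (within == 1)
    return unique / trials

# (7, 3) RS code: d = 5, t = 2, so probe the enlarged radius t + 1 = 3.
print(prob_unique(k=3, radius=3))
```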
