1. Data compression for digital elevation models
Lewis, Michael (January 1996)
No description available.
2. Data compression strategies for RDAT/DDS media in hostile environments
Thomas, Owen David John (January 1996)
This thesis investigates the prevention of error propagation in magnetically recorded compressed data when severe environmental conditions result in uncorrected channel errors. The DDS tape format is examined and a computer simulation of its error-correction procedures is described. This software implementation uses explicit parity-byte equations, which are presented for all three Reed-Solomon codes. The simulation allows the calculation of the uncorrected error patterns when the recording is compromised, and uncorrected byte errors are determined for given initial random and burst errors.

Some of the more familiar data compression algorithms are reviewed before the little-known adaptive Rice algorithm is described in detail. An analytic example is developed which demonstrates the coding mechanism. A synchronized piecewise compression strategy is adopted in which synchronizing sequences are placed periodically into the compressed data stream. The synchronizing sequences are independent of the compression algorithm and may occur naturally in the compressed data stream. A cyclic count is added to the compressed data stream to number the groups of data between synchronizing sequences and prevent slippage in the data. The Rice algorithm is employed in the strategy to compress correlated physical data. A novel compressor is developed to compress mixed correlated physical data and text within the synchronization strategy. This compressor uses the Rice algorithm to compress the correlated data and a sliding-window algorithm to compress the text, switching between the two algorithms as the data type varies. The sliding-window compressor T.ZR is adopted when the same principles are applied to the robust compression of English text alone, and T.ZR is modified to improve compression of relatively small pieces of English text. The synchronization strategy incorporating these algorithms has been simulated computationally. This simulation is linked to that of DDS in each test performed, with both random and burst errors. The decompressed data is compared to the original, and the strategy is demonstrated to be effective in preventing error propagation beyond the data immediately affected by errors, without significant damage to the compression ratio.
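The adaptive Rice coder this abstract refers to tunes its coding parameter to the data; the core Rice (Golomb-Rice) codeword itself is simple enough to sketch. The following is an illustrative aside, not the thesis's implementation: each non-negative value is split into a unary-coded quotient and a k-bit binary remainder.

```python
def rice_encode(n: int, k: int) -> str:
    """Rice-encode n with parameter k: unary quotient, then k-bit remainder."""
    q, r = n >> k, n & ((1 << k) - 1)
    # q ones, a terminating zero, then the remainder in exactly k bits
    return "1" * q + "0" + (format(r, f"0{k}b") if k else "")

def rice_decode(bits: str, k: int) -> int:
    """Invert rice_encode for a single codeword at the start of `bits`."""
    q = 0
    while bits[q] == "1":   # count the unary run
        q += 1
    r = int(bits[q + 1 : q + 1 + k], 2) if k else 0
    return (q << k) | r
```

Small values cost few bits when k roughly matches the data's magnitude, which is why the adaptive variant re-estimates k as the statistics of the correlated physical data drift.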
3. Gene Trap Identification of mbrg-1: A Novel Mammalian Gene Encoding a Bromodomain-Containing Protein
Jing, Wuhua (January 1995)
4. Lambda encodings in type theory
Fu, Peng (1 July 2014)
Lambda encodings (such as the Church, Scott, and Parigot encodings) are methods of representing data in the lambda calculus. The Curry-Howard correspondence relates formulas and proofs in intuitionistic logic to types and programs in typed functional programming languages. Roughly speaking, type theory (intuitionistic type theory) formulates intuitionistic logic in the style of a typed functional programming language. This dissertation investigates mechanisms to support lambda encodings in type theory. A type theory such as the Calculus of Constructions (CC) does not directly support inductive data, because the induction principle for lambda-encoded data is provably not derivable. Thus inductive data, together with its induction principle, is added to CC as a primitive, leading to several nontrivial extensions, e.g. the Calculus of Inductive Constructions. In this dissertation, we explore alternatives for incorporating inductive data in type theory. We propose adding an abstraction construct to intuitionistic type theory to support lambda-encoded data while still being able to derive the corresponding induction principle. The main benefit of this approach is that we obtain relatively simple systems, which are easier to analyze and implement.
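For readers unfamiliar with lambda encodings, the Church encoding of the natural numbers can be sketched in any language with first-class functions. The snippet below is an illustrative aside in Python, not material from the dissertation: a Church numeral n is the function that applies its argument f exactly n times.

```python
# Church numerals: data represented purely as functions.
zero = lambda f: lambda x: x                     # apply f zero times
succ = lambda n: lambda f: lambda x: f(n(f)(x))  # one more application of f
add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    """Interpret a Church numeral by counting applications of f."""
    return n(lambda k: k + 1)(0)

two = succ(succ(zero))
three = succ(two)
# to_int(add(two)(three)) evaluates to 5
```

Typing such terms is exactly where the dissertation's problem begins: in CC these encodings type-check, but the induction principle over them is not derivable without further machinery.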
5. Reevaluating Encoding-Retrieval Match and Cue Overload
Shafer, Erica S. (6 December 2024)
The encoding specificity principle, initially proposed by Thomson and Tulving in 1973, asserts that the congruence between encoding and retrieval conditions plays a crucial role in successful memory retrieval. Although this principle has largely been supported, Nairne (2002) has challenged memory theorists to reconsider its direct causality, proposing that the diagnostic value of retrieval cues with respect to a specific memory is the primary determinant of successful retrieval. This study sought to investigate that claim, building upon the work of Goh and Lu (2012) by adapting the manipulations of encoding-retrieval match and cue overload in their original task design. The current study replaced the implicit category cue in the high-overload, high-match conditions with an explicit cue in an attempt to strengthen the manipulation. We hypothesized that the addition of an explicit high-overload cue to our experimental design would lead to a significant effect of encoding-retrieval match in the high-overload condition, in contrast with Goh and Lu's (2012) findings. Our findings provide mixed support for this hypothesis. We observed weak evidence for a main effect of encoding-retrieval match, with better performance in the high-match condition than in the low-match condition, without evidence for a significant interaction between encoding-retrieval match and cue overload. However, planned t-tests somewhat conflicted with this finding, in that encoding-retrieval match had a significant effect only when the cues were low-overload, not when match was increased with a high-overload cue. Further investigation is needed before conclusions can be drawn from these data.

/ Master of Science /

Memory is often more successful when the conditions during learning match those during recall, a concept known as the encoding specificity principle, first proposed by Thomson and Tulving in 1973. This principle suggests that memory performance improves when the cues present during learning are also available during recall. While widely accepted, Nairne (2002) suggested that the effectiveness of memory retrieval depends more on how well a retrieval cue uniquely identifies the specific memory being recalled than on matching conditions. To explore this, we modified a task originally designed by Goh and Lu (2012) that tested the effects of matching learning and recall conditions and the presence of competing memory cues. Their original study design used semantic categories as cues, but participants were not explicitly told to pay attention to the semantic category of the words. We replaced this implicit category label with a more explicit word cue to strengthen the experimental manipulations. We hypothesized that these changes would reveal a stronger effect of matching learning and recall conditions, particularly when multiple competing cues were present, contrary to Goh and Lu's findings. Results partially supported this hypothesis: memory performance was better when learning and recall conditions matched, and explicit cues improved recall in some cases but not others. These findings suggest that both encoding-retrieval matching and the clarity of retrieval cues contribute to memory performance, highlighting areas for future research.
6. Type I restriction and modification systems
Fuller-Pace, Frances Victoria (January 1985)
No description available.
7. Information leakage in encrypted IP video traffic
Wampler, Christopher (7 January 2016)
We show that information leakage occurs in video-over-IP traffic, including for encrypted payloads. It is possible to detect events occurring in the field of view of a camera streaming live video through analysis of network traffic metadata, including inter-packet arrival times, packet sizes, and video stream bandwidth. Event detection through metadata analysis is possible even when common encryption techniques such as SSL or AES are applied to the video stream. We have observed information leakage across multiple codecs and cameras. Through timestamps added to the x264 codec, we establish a basis for the detectability of events via packet timing. Laboratory experiments confirm that this event detection is possible in practice and repeatable. By collecting network traffic captures from over 100 Skype video calls, we are able to see the impact of this information leakage under a variety of conditions.
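As an illustration of the kind of metadata analysis described (the function, window size, and threshold below are hypothetical, not the thesis's method), one can flag scene activity by looking for time windows whose byte rate deviates sharply from the stream's baseline, since motion inflates encoded frame sizes even when payloads are encrypted:

```python
from statistics import mean, stdev

def detect_events(timestamps, sizes, window=1.0, z=2.5):
    """Return start times of windows whose byte rate is a z-score outlier.
    `timestamps` are packet arrival times (s), `sizes` packet sizes (bytes)."""
    t0 = timestamps[0]
    nbins = int((timestamps[-1] - t0) // window) + 1
    rates = [0] * nbins
    for t, s in zip(timestamps, sizes):
        rates[int((t - t0) // window)] += s   # bytes per window
    mu, sd = mean(rates), stdev(rates)
    return [i * window for i, r in enumerate(rates) if sd and abs(r - mu) > z * sd]
```

On a steady stream with a one-second burst of larger packets, only the burst window is reported; real traffic would of course need a more careful baseline model.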
8. Identification and characterization of cMG1, a primary response gene
Gomperts, Miranda (January 1991)
No description available.
9. An investigation of 2μ plasmid protein function
Trebilcock, Anna E. (January 1993)
No description available.
10. A study of binary codes to improve LIDAR performance
Keen, Tristan (January 1996)
No description available.