  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

DATA COMPRESSION STATISTICS AND IMPLICATIONS

Horan, Sheila. October 1999.
International Telemetering Conference Proceedings / October 25-28, 1999 / Riviera Hotel and Convention Center, Las Vegas, Nevada / Bandwidth is a precious commodity. To make the best use of what is available, either better modulation schemes must be developed or less data must be sent. This paper investigates the option of sending less data via data compression. The structure and entropy of a data set determine how much lossless compression it admits. The paper presents the structure and entropy of several actual telemetry data sets and the lossless compression obtainable on them using data compression techniques.
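The entropy bound the abstract refers to can be estimated in a few lines. The sketch below is not from the paper: it computes the zeroth-order Shannon entropy of a byte stream, which bounds what a memoryless lossless coder can achieve; the file name is a placeholder, not one of the paper's data sets.

```python
import math
from collections import Counter

def zeroth_order_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte, ignoring inter-symbol structure."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Estimate the best-case memoryless compression ratio for a telemetry dump.
# "telemetry.bin" is a placeholder file name, not one of the paper's data sets.
with open("telemetry.bin", "rb") as f:
    payload = f.read()

h = zeroth_order_entropy(payload)
ratio = 8.0 / h if h > 0 else float("inf")
print(f"entropy: {h:.3f} bits/byte, ideal zeroth-order ratio about {ratio:.2f}:1")
```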
2

CURRENT STATUS OF DATA COMPRESSION IN TELEMETRY

Horan, Sheila B. October 2004.
International Telemetering Conference Proceedings / October 18-21, 2004 / Town & Country Resort, San Diego, California / Reduction of bandwidth for signal transmission is of paramount concern to many in the telemetry and wireless industry. One way to reduce bandwidth is to reduce the amount of data being sent, and several techniques are available to do so. This paper reviews the types of data compression currently in use for telemetry data and how much compression each achieves.
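A rough companion to such a survey is simply to run a few general-purpose codecs on one's own capture and compare ratios. The sketch below uses Python standard-library codecs as stand-ins for the telemetry-specific schemes the paper reviews; the file name is a placeholder.

```python
import bz2
import lzma
import zlib

def compression_ratios(data: bytes) -> dict:
    """Ratio of original size to compressed size for a few standard-library codecs."""
    sizes = {
        "zlib (DEFLATE)": len(zlib.compress(data, 9)),
        "bz2 (BWT-based)": len(bz2.compress(data, 9)),
        "lzma": len(lzma.compress(data)),
    }
    return {name: len(data) / size for name, size in sizes.items()}

# "telemetry.bin" is a placeholder capture, not one of the paper's data sets.
with open("telemetry.bin", "rb") as f:
    data = f.read()

for name, ratio in compression_ratios(data).items():
    print(f"{name:16s} {ratio:5.2f}:1")
```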
3

Evaluation of ANSI compression in a bulk data file transfer system

Chaulklin, Douglas Gary. January 1991.
Project report (M. Eng.), Virginia Polytechnic Institute and State University, 1991. Includes bibliographical references (leaves 63-64). Also available via the Internet.
4

Spatio-temporal compression and rendering of time-varying volume data

Anagnostou, Kostantinos. January 2001.
No description available.
5

Image and video compression using the wavelet transform

Lewis, A. S. January 1995.
No description available.
6

Vylepšení víceproudé komprese / Improvements of multistream compression

Unger, Lukáš. January 2010.
Multistream compression is based on a transformation significantly different from those commonly used for data compression. This Master's thesis concerns the use of this method for compressing text files written in natural language. The main goal of the thesis is to find suitable preprocessing methods for text transformation that enable Multistream compression to achieve better compression ratios, together with finding the best methods for coding the individual streams. The practical part of the thesis deals with the implementation of several transformation algorithms in the XBW project.
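The XBW/Multistream coder itself is not publicly packaged, so the sketch below only illustrates the general idea of reversible text preprocessing that the thesis evaluates. The capitalization-marking transform is a hypothetical example of my own, and bzip2 stands in for the actual coder when checking whether preprocessing helps the ratio; the corpus file name is a placeholder.

```python
import bz2
import re

def cap_transform(text: str) -> str:
    """Reversible preprocessing: lowercase capitalized words and mark them with '^',
    so repeated words share one spelling. Assumes '^' does not occur in the input."""
    return re.sub(r"\b[A-Z][a-z]+", lambda m: "^" + m.group(0).lower(), text)

def cap_restore(text: str) -> str:
    return re.sub(r"\^([a-z])", lambda m: m.group(1).upper(), text)

# "corpus.txt" is a placeholder natural-language text file.
with open("corpus.txt", encoding="utf-8") as f:
    raw = f.read()

pre = cap_transform(raw)
assert cap_restore(pre) == raw  # preprocessing must be lossless

for label, text in (("plain", raw), ("preprocessed", pre)):
    size = len(bz2.compress(text.encode("utf-8"), 9))
    print(f"{label:12s} compresses to {size} bytes")
```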
7

Lattice vector quantization for image coding

Sampson, Demetrios G. January 1995.
No description available.
8

APPLICATION OF DATA COMPRESSION TO FRAME AND PACKET TELEMETRY

Horan, Stephen; Horan, Sheila B. October 2003.
International Telemetering Conference Proceedings / October 20-23, 2003 / Riviera Hotel and Convention Center, Las Vegas, Nevada / Reduction of signal transmission bandwidth is of paramount concern to many in the telemetry and wireless industry. One available technique is compression of the data before transmission, and with telemetry-type data there are many approaches that can achieve it. Data compression of the Advanced Range Telemetry (ARTM) PCM data sets is considered and compared in frame mode, in packet mode, and over the entire data file. The technique of differencing is also applied to the data files by subtracting the previous major frame from each major frame before compression. It is demonstrated that telemetry compression is a viable option for reducing the amount of data to be transmitted, and hence the bandwidth; however, the compression produces variable-length data segments with implications for real-time data synchronization.
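A minimal sketch of the major-frame differencing idea, assuming fixed-length frames and using zlib as a stand-in for the compressors evaluated in the paper; the frame length and file name are placeholders, not the ARTM parameters.

```python
import zlib

FRAME_LEN = 1024  # placeholder major-frame length in bytes

def split_frames(data: bytes, size: int):
    return [data[i:i + size] for i in range(0, len(data), size)]

def difference_frames(frame_list):
    """Keep the first major frame; replace each later frame with its byte-wise
    difference (mod 256) from the previous frame, which is easily reversible."""
    out = [frame_list[0]]
    for prev, cur in zip(frame_list, frame_list[1:]):
        out.append(bytes((c - p) % 256 for p, c in zip(prev, cur)))
    return out

# "artm_pcm.bin" is a placeholder file name, not one of the ARTM data sets.
with open("artm_pcm.bin", "rb") as f:
    data = f.read()

direct = zlib.compress(data, 9)
diffed = zlib.compress(b"".join(difference_frames(split_frames(data, FRAME_LEN))), 9)
print(f"direct:      {len(data) / len(direct):.2f}:1")
print(f"differenced: {len(data) / len(diffed):.2f}:1")
```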
9

Higher Compression from the Burrows-Wheeler Transform with New Algorithms for the List Update Problem

Chapin, Brenton.
Burrows-Wheeler compression is a three-stage process: the data is transformed with the Burrows-Wheeler Transform, then transformed with Move-To-Front, and finally encoded with an entropy coder. Move-To-Front, Transpose, and Frequency Count are among the many algorithms used for the List Update problem. In 1985, competitive analysis first showed the superiority of Move-To-Front over Transpose and Frequency Count for the List Update problem with arbitrary data. Earlier studies due to Bitner assumed independent, identically distributed data and showed that while Move-To-Front adapts to a distribution faster, incurring less overwork, the asymptotic costs of Frequency Count and Transpose are lower. The improvements to Burrows-Wheeler compression covered in this work increase the amount, not the speed, of compression. Best x of 2x-1 is a new family of algorithms created to improve on Move-To-Front's processing of the output of the Burrows-Wheeler Transform, which resembles piecewise independent, identically distributed data. Several variations of Move One From Front and part of the randomized algorithm Timestamp are also analyzed, for both the middle stage of Burrows-Wheeler compression and the List Update problem, in terms of overwork, asymptotic cost, and competitive ratio. The Best x of 2x-1 family includes Move-To-Front, the part of Timestamp of interest, and Frequency Count. Lastly, a greedy choosing scheme, Snake, switches back and forth between two List Update algorithms as the amount of compression each achieves fluctuates, to increase overall compression. The Burrows-Wheeler Transform is based on sorting of contexts, and the remaining improvements are better sorting orders (such as “aeioubcdf...” instead of the standard alphabetical “abcdefghi...” for English text, with an algorithm for computing such orders for any data) and Gray-code sorting instead of standard sorting. Both techniques lessen the overwork incurred by whatever List Update algorithms are used by reducing the difference between adjacent sorted contexts.
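For reference, the middle stage discussed above is Move-To-Front coding. The sketch below is the textbook version, not the Best x of 2x-1 variants introduced in the dissertation; it shows how the run-heavy output of a BWT turns into many small indices that an entropy coder handles well.

```python
def mtf_encode(data: bytes) -> list:
    """Classic Move-To-Front: emit each byte's position in a recency list,
    then move that byte to the front. BWT output turns into many small indices."""
    table = list(range(256))
    out = []
    for b in data:
        i = table.index(b)
        out.append(i)
        table.pop(i)
        table.insert(0, b)
    return out

def mtf_decode(indices) -> bytes:
    table = list(range(256))
    out = bytearray()
    for i in indices:
        b = table.pop(i)
        out.append(b)
        table.insert(0, b)
    return bytes(out)

sample = b"aaaabbbbccccaaaa"  # stands in for the run-heavy output of a BWT
codes = mtf_encode(sample)
assert mtf_decode(codes) == sample
print(codes)  # runs of identical bytes become runs of zeros
```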
10

Video data compression for telescience

Walker, Wendy Tolle. January 1988.
This paper recommends data compression techniques for two kinds of video data to be transmitted from the proposed U.S. Space Station to Earth: video used to point a telescope and video from a camera observing a robot. The mathematical basis of data compression is presented, followed by a general review of data compression techniques. A technique in widespread use for compressing videoconferencing images is recommended for the robot-observation data; bit rates of 60 to 400 kbits/sec can be achieved. Several techniques are modelled to find the best one for the telescope data, using actual starfield images for the evaluation. The best technique is chosen on the basis of which model provides the most compression while preserving the important information in the images. Compression from 8 bits per pel to 0.015 bits per pel is achieved.
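The videoconferencing-style methods referred to above rely on transform coding. The sketch below illustrates the core step, an 8x8 block DCT with coarse uniform quantization, as a generic example rather than the paper's exact coder or parameters; the block data and quantization step are invented for illustration.

```python
import numpy as np

N = 8
k = np.arange(N).reshape(-1, 1)
n = np.arange(N).reshape(1, -1)
C = np.sqrt(2.0 / N) * np.cos(np.pi * (2 * n + 1) * k / (2 * N))
C[0, :] = np.sqrt(1.0 / N)  # orthonormal DCT-II basis, rows = frequencies

def encode_block(block: np.ndarray, q: float = 16.0) -> np.ndarray:
    """Forward 8x8 DCT followed by coarse uniform quantization."""
    return np.round((C @ block @ C.T) / q).astype(np.int16)

def decode_block(coeffs: np.ndarray, q: float = 16.0) -> np.ndarray:
    return C.T @ (coeffs.astype(float) * q) @ C

# A smooth synthetic 8x8 block stands in for real camera data; most of its
# energy lands in a few low-frequency coefficients.
block = np.add.outer(np.arange(N), np.arange(N)) * 8.0 + 64.0
coeffs = encode_block(block)
rec = decode_block(coeffs)
print("nonzero coefficients:", np.count_nonzero(coeffs), "of 64")
print("max reconstruction error:", np.abs(rec - block).max())
```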
