21. RADIX 95n: Binary-to-Text Data Conversion. Jones, Greg, 1963-2017.
This paper presents Radix 95n, a binary-to-text data conversion algorithm. Radix 95n (base 95) is a variable-length encoding scheme that offers slightly better efficiency than conventional fixed-length encoding procedures. Radix 95n advances previous techniques by allowing a greater pool of 7-bit combinations to be made available for 8-bit data translation. Because 8-bit data (i.e., binary files) can be difficult to transfer over 7-bit networks, the Radix 95n conversion technique provides a way to convert data such as compiled programs or graphic images to printable ASCII characters so that they can be transferred over 7-bit networks.
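As rough context for the efficiency claim, a base-95 alphabet carries about log2(95) ≈ 6.57 bits per character, so it needs roughly 1.22 output characters per input byte versus 1.33 for fixed-length Base64. The sketch below is a simple big-integer base-95 encoder over the 95 printable ASCII characters (0x20-0x7E); it illustrates only the general binary-to-text idea, does not reproduce the paper's variable-length grouping scheme, and uses hypothetical function names.

    # Minimal base-95 sketch (not the paper's Radix 95n variable-length scheme).
    ALPHABET = [chr(c) for c in range(0x20, 0x7F)]   # the 95 printable ASCII characters

    def encode_base95(data: bytes) -> str:
        n = int.from_bytes(data, "big")
        if n == 0:
            return ALPHABET[0] * max(len(data), 1)
        digits = []
        while n:
            n, rem = divmod(n, 95)
            digits.append(ALPHABET[rem])
        # Preserve leading zero bytes, which the integer conversion drops.
        leading = len(data) - len(data.lstrip(b"\x00"))
        return ALPHABET[0] * leading + "".join(reversed(digits))

    def decode_base95(text: str, length: int) -> bytes:
        n = 0
        for ch in text:
            n = n * 95 + ALPHABET.index(ch)
        return n.to_bytes(length, "big")

    payload = bytes([0, 200, 17, 255, 3])
    encoded = encode_base95(payload)            # printable, 7-bit-safe text
    assert decode_base95(encoded, len(payload)) == payload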
22. AN ONBOARD PROCESSOR FOR FLIGHT TEST DATA ACQUISITION SYSTEMS. Wegener, John A.; Blase, Gordon A.
International Telemetering Conference Proceedings / October 20-23, 2003 / Riviera Hotel and Convention Center, Las Vegas, Nevada / Today's flight test programs are experiencing increasing demands for a greater number of high-rate digital parameters, competition for spectrum space, and a need for operational flexibility in flight test instrumentation. These demands must be met within schedule and budget constraints. To address these needs, the Boeing Integrated Defense Systems (IDS) Flight Test Instrumentation group in St. Louis has developed an onboard processing capability for use with airborne instrumentation data collection systems. This includes a first-generation Onboard Processor (OBP), which has been used successfully on the F/A-18E/F Super Hornet flight test program for four years and provides a throughput of 5 Mbytes/s and a processing capability of 480 Mflops (millions of floating-point operations per second). Boeing IDS Flight Test is also developing a second-generation OBP, which features greatly enhanced input and output flexibility and algorithm programmability and is targeted to provide a throughput of 160 Mbytes/s with a processing capability of 16 Gflops. This paper describes these onboard processing capabilities and their benefits.
23. CALCULATING POWER SPECTRAL DENSITY IN A NETWORK-BASED TELEMETRY SYSTEM. Brierley, Scott
International Telemetering Conference Proceedings / October 20-23, 2003 / Riviera Hotel and Convention Center, Las Vegas, Nevada / Calculating the power spectral density (PSD) at the transducer or data acquisition system offers advantages in a network-based telemetry system. The PSD is provided to users in real time. The conversion to PSD can be either lossless (allowing complete reconstruction of the transducer signal) or lossy (providing data compression). Post-processing can convert the PSD back to time histories if desired. A complete reconstruction of the signal is possible, including knowledge of the signal level between sample periods. Properly implemented, this method of data collection provides a sharp anti-aliasing filter with minimal added cost. Currently, no standards exist for generating PSDs on the vehicle. New standards could help telemetry system designers understand the benefits and limitations of calculating the power spectral density in a network-based telemetry system.
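As an illustration of the lossy (data-compressing) direction only, the sketch below estimates a PSD from a simulated transducer record using Welch's method in SciPy; the paper's lossless variant would retain phase information so the time history could be reconstructed. The sample rate, tone frequency, and noise level here are invented for the example and are not taken from the paper.

    import numpy as np
    from scipy.signal import welch

    # Hypothetical transducer record: a 120 Hz tone plus noise, sampled at 1 kHz.
    fs = 1000.0
    t = np.arange(0, 2.0, 1.0 / fs)
    rng = np.random.default_rng(1)
    x = np.sin(2 * np.pi * 120 * t) + 0.5 * rng.standard_normal(t.size)

    # Welch's method averages windowed periodograms; a larger nperseg gives finer
    # frequency resolution but less averaging. Phase is discarded, so this form
    # compresses the data rather than allowing exact signal reconstruction.
    f, pxx = welch(x, fs=fs, nperseg=256)
    print(f"spectral peak near {f[np.argmax(pxx)]:.1f} Hz")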
24. Advanced Range Telemetry (ARTM): Preparing for a New Generation of Telemetry. Chalfant, Timothy A.; Straehley, Erwin H.; Switzer, Earl R.
International Telemetering Conference Proceedings / October 28-31, 1996 / Town and Country Hotel and Convention Center, San Diego, California / At open air test and training ranges, telemetry is beset by two opposing forces. One is the inexorable demand to deliver more information to users who must make decisions in ever shorter time frames. The other is the reduced availability of radio frequency spectrum, driven by its increased economic value to society as a whole. ARTM is planned to assure that test and training programs of the next several decades can meet their data quantity and quality objectives in the face of these challenges. ARTM expects to improve the efficiency of spectrum usage by changing historical methods of acquiring telemetry data and transmitting it from systems under test to range customers. The program is initiating advances in coding, compression, data channel assignment, and modulation. Because these four dimensions interact strongly, the effort is integrated into a single focused program. Because these problems are common throughout the test and training community, ARTM is a tri-service program that embodies the DoD's Common Test and Training Range Architecture and Reliance principles in its management and organization. This paper discusses the driving forces, the initial study areas, the organizational structure, and the program goals.
25. Image and video coding for noisy channels. Redmill, David Wallace. January 1994
No description available.
26. Compression of integral three-dimensional television pictures. Forman, Matthew Charles. January 2000
No description available.
27. MULTISPECTRAL DATA COMPRESSION USING STAGGERED DETECTOR ARRAYS (LANDSAT, REMOTE SENSING). Gray, Robert Terry. January 1983
A multispectral image data compression scheme has been investigated in which a scene is imaged onto a detector array whose elements vary in spectral sensitivity. The elements are staggered such that the scene is undersampled within any single spectral band, but is sufficiently sampled by the total array. Compression thus results from transmitting only one spectral component of a scene at any given array coordinate. The pixels of the mosaic array may then be directly transmitted via PCM or undergo further compression (e.g. DPCM). The scheme has the advantages of attaining moderate compression without compression hardware at the transmitter, high compression with low-order DPCM processing, and a choice of reconstruction algorithms suitable to the application at hand. Efficient spatial interpolators such as parametric cubic convolution may be employed to fill in the missing pixels in each spectral band in cases where high resolution is not a requirement. However, high-resolution reconstructions are achieved by a space-variant minimum-mean-square spectral regression estimation of the missing pixels of each band from the adjacent samples of other bands. In this case, reconstruction accuracy is determined by the local spectral correlations between bands, the estimates of which include the effects of interband contrast reversal. Digital simulations have been performed on three-band aerial and four-band Landsat multispectral images. Spectral regressions of mosaic array data can provide reconstruction errors comparable to second-order DPCM processing and lower than common intraband interpolators at data rates of approximately 2 bits per pixel. When the mosaic data is itself DPCM-coded, the radiometric accuracy of spectral regression is superior to direct DPCM for equivalent bit rates.
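The sketch below illustrates only the mosaic-sampling and spatial-interpolation part of the scheme on synthetic data: each pixel transmits a single band, and the missing samples of each band are filled by spatial interpolation. The dissertation's spectral-regression reconstruction, which estimates missing pixels from the adjacent samples of the other bands, is not reproduced here, and the image size and staggering pattern are assumptions made for the example.

    import numpy as np
    from scipy.interpolate import griddata

    # Hypothetical 3-band image (rows x cols x bands); real data would come from a sensor.
    rng = np.random.default_rng(0)
    img = rng.random((64, 64, 3))

    rows, cols, bands = img.shape
    r, c = np.mgrid[0:rows, 0:cols]

    # Staggered mosaic: each pixel retains exactly one band, so only 1/3 of the
    # samples are transmitted (3:1 compression before any PCM/DPCM coding).
    band_at_pixel = (r + c) % bands

    recon = np.empty_like(img)
    for b in range(bands):
        mask = band_at_pixel == b                   # pixels where band b was kept
        pts = np.column_stack((r[mask], c[mask]))   # coordinates of transmitted samples
        vals = img[..., b][mask]                    # transmitted sample values
        # Spatial interpolation fills the missing pixels of band b; the dissertation's
        # spectral-regression estimator would instead draw on the other bands' samples.
        recon[..., b] = griddata(pts, vals, (r, c), method="linear", fill_value=vals.mean())

    rmse = np.sqrt(np.mean((recon - img) ** 2))
    print(f"reconstruction RMSE: {rmse:.4f}")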
28. Improving the performance of wide area networks. Holt, Alan. January 1999
No description available.
29. Multi-variable block transforms for motion compensated digital video compression. Milburn, Paul Spencer. January 1995
No description available.
30. Best Linear Unbiased Estimation Fusion with Constraints. Zhang, Keshu. 19 December 2003
Estimation fusion, or data fusion for estimation, is the problem of how best to utilize the useful information contained in multiple data sets for the purpose of estimating an unknown quantity: a parameter or a process. Given observations from multiple geometrically dispersed sensors, estimation fusion with constraints gives rise to challenging theoretical problems: Under dimensionality constraints, how to preprocess data at each local sensor to achieve the best estimation accuracy at the fusion center? Under communication bandwidth constraints, how to quantize local sensor data to minimize the estimation error at the fusion center? Under constraints on storage, how to optimally update state estimates at the fusion center with out-of-sequence measurements (OOSM)? Under constraints on storage, how to apply the OOSM update algorithm to multisensor multitarget tracking in clutter? The present work addresses these topics by applying best linear unbiased estimation (BLUE) fusion. We propose optimal data compression that reduces sensor data from a higher dimension to a lower dimension with minimal or no performance loss at the fusion center. For single-sensor and some particular multiple-sensor systems, we obtain the explicit optimal compression rule. For a multisensor system with a general dimensionality requirement, we propose a Gauss-Seidel iterative algorithm to search for the optimal compression rule. Another way to accomplish sensor data compression is to find an optimal sensor quantizer. Using BLUE fusion rules, we develop optimal sensor data quantization schemes according to the bit-rate constraints on communication between each sensor and the fusion center. For a dynamic system, we also establish how to perform state estimation and sensor quantization updates simultaneously, along with a closed-form recursion for a linear system with additive white Gaussian noise. A globally optimal OOSM update algorithm and a constrained optimal update algorithm are derived to solve one-lag as well as multi-lag OOSM update problems. To extend the OOSM update algorithms to multisensor multitarget tracking in clutter, we also study the performance of the OOSM update combined with the Probabilistic Data Association (PDA) algorithm.
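As a minimal sketch of the underlying BLUE fusion rule, the code below combines unbiased local estimates using information (inverse-covariance) weighting, which is the special case of uncorrelated local estimation errors. The dissertation's constrained settings (dimensionality reduction, quantization, OOSM updates) require the full formulation with cross-covariances; the function name and the numbers here are invented for illustration.

    import numpy as np

    def blue_fuse(estimates, covariances):
        """Fuse unbiased local estimates with known error covariances.

        Sketch of BLUE fusion for uncorrelated estimation errors: the fused
        estimate is the information-weighted combination of the local ones.
        """
        info = np.zeros_like(covariances[0])
        info_state = np.zeros_like(estimates[0])
        for x, P in zip(estimates, covariances):
            P_inv = np.linalg.inv(P)
            info += P_inv                 # accumulate information matrices
            info_state += P_inv @ x       # accumulate information-weighted states
        P_fused = np.linalg.inv(info)
        return P_fused @ info_state, P_fused

    # Two hypothetical sensor estimates of a 2-D state.
    x1, P1 = np.array([1.0, 2.0]), np.diag([0.5, 1.0])
    x2, P2 = np.array([1.2, 1.8]), np.diag([1.0, 0.5])
    x_f, P_f = blue_fuse([x1, x2], [P1, P2])
    print(x_f, np.diag(P_f))  # fused variances are smaller than either input's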