  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Lossless Triangle Mesh Compression

Zhu, Junjie 29 July 2013 (has links)
Triangle meshes have been widely used in many fields such as digital archives, computer-aided design and the game industry. The topic we are particularly interested in is triangle mesh compression for fossil models. This thesis explores the research area of triangle mesh compression and implements an effective compressor for our data set. The compressor is mainly based on a previously proposed algorithm, Edgebreaker, with modifications and improvements. The implementation and our improvements are described in detail in this thesis. / Thesis (Master, Computing) -- Queen's University, 2013-07-27 18:49:44.051
2

Hardware Implementation of a Novel Image Compression Algorithm

Sanikomm, Vikas Kumar Reddy 20 January 2006 (has links)
Image-related communications are forming an increasingly large part of modern communications, bringing the need for efficient and effective compression. Image compression is important for effective storage and transmission of images. Many techniques have been developed in the past, including transform coding, vector quantization and neural networks. In this thesis, a novel adaptive compression technique is introduced, based on adaptive rather than fixed transforms for image compression. The proposed technique is similar to Neural Network (NN)-based image compression, and its superiority over other techniques is presented. It is shown that the proposed algorithm results in higher image quality for a given compression ratio than existing Neural Network algorithms, and that its training is significantly faster than that of the NN-based algorithms. The proposed technique is also compared to JPEG in terms of Peak Signal-to-Noise Ratio (PSNR) for a given compression ratio and in terms of computational complexity. Advantages of this approach over JPEG are also presented in this thesis.
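Peak Signal-to-Noise Ratio, the metric used in the comparison above, has a standard definition. As an illustration only (not code from the thesis), a minimal sketch for 8-bit images might look like this:

```python
import numpy as np

def psnr(original: np.ndarray, reconstructed: np.ndarray, peak: float = 255.0) -> float:
    """Peak Signal-to-Noise Ratio in dB (standard definition for 8-bit images)."""
    mse = np.mean((original.astype(np.float64) - reconstructed.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(peak ** 2 / mse)

# Hypothetical usage: compare reconstructions from two codecs at the same compression ratio.
rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
degraded = np.clip(img.astype(int) + rng.integers(-3, 4, size=img.shape), 0, 255).astype(np.uint8)
print(f"PSNR: {psnr(img, degraded):.2f} dB")
```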
3

Lossless medical image compression using integer transforms and predictive coding technique

Neela, Divya January 1900 (has links)
Master of Science / Department of Electrical and Computer Engineering / D. V. Satish Chandra / The future of healthcare delivery systems and telemedical applications will undergo a radical change due to developments in wearable technologies, medical sensors, mobile computing and communication techniques. E-health emerged from the integration of networks and telecommunications for applications that collect, sort and transfer medical data from distant locations to support remote medical collaboration and diagnosis. Healthcare systems in recent years rely on images acquired in the two-dimensional (2D) domain in the case of still images, or in the three-dimensional (3D) domain for volumetric images or video sequences. Images are acquired with many modalities, including X-ray, positron emission tomography (PET), magnetic resonance imaging (MRI), computed axial tomography (CAT) and ultrasound. Because medical information is in multidimensional or multi-resolution form, it creates an enormous amount of data, and efficient storage, retrieval, management and transmission of this voluminous data are extremely complex. One solution to this problem is to compress the medical data losslessly so that diagnostic capabilities are not compromised. This report proposes techniques that combine integer transforms and predictive coding to enhance the performance of lossless compression. The performance of the proposed techniques is evaluated using compression measures such as entropy and scaled entropy.
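As an illustration of the predictive-coding half of such a scheme (a minimal sketch under our own assumptions, not the report's actual transforms or predictors), a simple previous-pixel predictor is exactly invertible and typically leaves residuals with much lower entropy than the raw pixels:

```python
import numpy as np

def entropy_bits(values: np.ndarray) -> float:
    """Shannon entropy in bits per sample of a discrete-valued array."""
    _, counts = np.unique(values, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def horizontal_residuals(img: np.ndarray) -> np.ndarray:
    """Previous-pixel (horizontal) predictor; the residuals are exactly invertible,
    so no diagnostic information is lost."""
    img = img.astype(np.int16)
    res = img.copy()
    res[:, 1:] = img[:, 1:] - img[:, :-1]  # predict each pixel from its left neighbour
    return res

# Hypothetical smooth test image: residual entropy should drop well below raw entropy.
x = np.linspace(0, 255, 128)
img = np.round(np.add.outer(x, x) / 2).astype(np.uint8)
print("raw entropy     :", round(entropy_bits(img), 3), "bits/pixel")
print("residual entropy:", round(entropy_bits(horizontal_residuals(img)), 3), "bits/pixel")
```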
4

DATA COMPRESSION STATISTICS AND IMPLICATIONS

Horan, Sheila 10 1900 (has links)
International Telemetering Conference Proceedings / October 25-28, 1999 / Riviera Hotel and Convention Center, Las Vegas, Nevada / Bandwidth is a precious commodity. In order to make the best use of what is available, better modulation schemes need to be developed, or less data needs to be sent. This paper will investigate the option of sending less data via data compression. The structure and the entropy of the data determine how much lossless compression can be obtained for a given set of data. This paper shows the data structure and entropy for several actual telemetry data sets and the resulting lossless compression obtainable using data compression techniques.
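For a memoryless source, the zeroth-order entropy is a lower bound on the average bits per symbol of any lossless code. A small sketch of that relationship (our own toy data and a general-purpose coder, not the paper's telemetry sets or tools); note that a dictionary coder such as zlib can beat the zeroth-order bound when the data has repetitive structure, which is exactly the dependence on data structure the paper examines:

```python
import zlib
import numpy as np

def entropy_bits_per_byte(data: bytes) -> float:
    """Zeroth-order (byte-frequency) entropy in bits per byte."""
    counts = np.bincount(np.frombuffer(data, dtype=np.uint8), minlength=256)
    p = counts[counts > 0] / len(data)
    return float(-np.sum(p * np.log2(p)))

# Hypothetical telemetry-like record: a repeated header plus a slowly varying channel.
samples = (128 + 20 * np.sin(np.linspace(0, 50, 4096))).astype(np.uint8).tobytes()
frame = b"HDR0" * 16 + samples

h = entropy_bits_per_byte(frame)
bound = len(frame) * h / 8                 # zeroth-order entropy bound, in bytes
actual = len(zlib.compress(frame, 9))      # what a general-purpose lossless coder achieves
print(f"entropy {h:.2f} bits/byte, bound ~{bound:.0f} B, zlib {actual} B of {len(frame)} B")
```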
5

CURRENT STATUS OF DATA COMPRESSION IN TELEMETRY

Horan, Sheila B. 10 1900 (has links)
International Telemetering Conference Proceedings / October 18-21, 2004 / Town & Country Resort, San Diego, California / Reduction of bandwidth for signal transmission is of paramount concern to many in the telemetry and wireless industry. One way to reduce bandwidth is to reduce the amount of data being sent. There are several techniques available to reduce the amount of data. This paper will review the various types of data compression currently in use for telemetry data and how much compression is achieved.
6

Lossless convexification of optimal control problems

Harris, Matthew Wade 30 June 2014 (has links)
This dissertation begins with an introduction to finite-dimensional optimization and optimal control theory. It then proves lossless convexification for three problems: 1) a minimum time rendezvous using differential drag, 2) a maximum divert and landing, and 3) a general optimal control problem with linear state constraints and mixed convex and non-convex control constraints. Each is a unique contribution to the theory of lossless convexification. The first proves lossless convexification in the presence of singular controls and specifies a procedure for converting singular controls to the bang-bang type. The second is the first example of lossless convexification with state constraints. The third is the most general result to date. It says that lossless convexification holds when the state space is a strongly controllable subspace. This extends the controllability concepts used previously, and it recovers earlier results as a special case. Lastly, a few of the remaining research challenges are discussed. / text
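For context, the canonical example from the lossless convexification literature (not necessarily the exact formulations studied in this dissertation) relaxes a non-convex bound on the control magnitude with a slack variable; the theory then shows the relaxation is exact at the optimum:

```latex
% Non-convex control constraint (an annulus):  0 < \rho_1 \le \|u(t)\| \le \rho_2.
% Convex relaxation with a slack variable \Gamma(t):
\|u(t)\| \le \Gamma(t), \qquad \rho_1 \le \Gamma(t) \le \rho_2 .
% Lossless convexification: under controllability-type conditions, the relaxed
% convex problem has an optimal solution with \|u^*(t)\| = \Gamma^*(t) almost
% everywhere, so it also solves the original non-convex problem.
```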
7

Lossless and nearly-lossless image compression based on combinatorial transforms

Syahrul, Elfitrin 29 June 2011 (has links) (PDF)
Common image compression standards are usually based on frequency transforms such as the Discrete Cosine Transform or wavelets. We present a different approach for lossless image compression based on combinatorial transforms. The main transform is the Burrows-Wheeler Transform (BWT), which tends to reorder symbols according to their following context, making it a promising compression approach based on context modelling. BWT was initially applied in text compression software such as BZIP2; nevertheless, it has recently been applied to the image compression field. Compression schemes based on the Burrows-Wheeler Transform are usually lossless; therefore we implement this algorithm in medical imaging in order to reconstruct every bit. Many variants of the three stages that form the original BWT-based compression scheme can be found in the literature. We propose an analysis of the more recent methods and the impact of their association. Then, we present several compression schemes based on this transform which significantly improve on current standards such as JPEG2000 and JPEG-LS. In the final part, we present some open problems, which are also further research directions.
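As a sketch of the core transform only (a naive rotation sort for clarity, not the thesis's optimized pipeline or its later move-to-front and entropy-coding stages), the BWT groups together symbols that share a following context, and the round trip is exactly invertible:

```python
def bwt(s: str, sentinel: str = "\x00") -> str:
    """Naive Burrows-Wheeler Transform: sort all rotations, keep the last column.
    The sentinel (assumed absent from s and lexicographically smallest) makes the
    transform invertible without storing an index."""
    s = s + sentinel
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(rot[-1] for rot in rotations)

def inverse_bwt(last_column: str, sentinel: str = "\x00") -> str:
    """Invert the BWT by repeatedly prepending the last column and re-sorting."""
    table = [""] * len(last_column)
    for _ in range(len(last_column)):
        table = sorted(last_column[i] + table[i] for i in range(len(last_column)))
    original = next(row for row in table if row.endswith(sentinel))
    return original[:-1]  # drop the sentinel

text = "banana_bandana"
transformed = bwt(text)
assert inverse_bwt(transformed) == text  # lossless round trip
print(repr(transformed))  # runs of identical symbols appear, which later stages exploit
```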
8

Visually Lossless Compression Based on JPEG2000 for Efficient Transmission of High Resolution Color Aerial Images

Oh, Han 10 1900 (has links)
ITC/USA 2010 Conference Proceedings / The Forty-Sixth Annual International Telemetering Conference and Technical Exhibition / October 25-28, 2010 / Town and Country Resort & Convention Center, San Diego, California / Aerial image collections have experienced exponential growth in size in recent years. These high resolution images are often viewed at a variety of scales. When an image is displayed at reduced scale, maximum quantization step sizes for visually lossless quality become larger. However, previous visually lossless coding algorithms quantize the image with a single set of quantization step sizes, optimized for display at the full resolution level. This implies that if the image is rendered at reduced resolution, there are significant amounts of extraneous information in the codestream. Thus, in this paper, we propose a method which effectively incorporates multiple quantization step sizes, for various display resolutions, into the JPEG2000 framework. If images are browsed from a remote location, this method can significantly reduce bandwidth usage by only transmitting the portion of the codestream required for visually lossless reconstruction at the desired resolution. Experimental results for high resolution color aerial images are presented.
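The underlying observation is that viewing at a reduced scale tolerates a larger quantization step size. A minimal sketch of that idea using PyWavelets (a generic wavelet decomposition with a hypothetical step-size schedule, not the JPEG2000 codestream machinery or the step sizes derived in the paper):

```python
import numpy as np
import pywt

def deadzone(c: np.ndarray, step: float) -> np.ndarray:
    """Dead-zone scalar quantizer followed by dequantization."""
    return np.sign(c) * np.floor(np.abs(c) / step) * step

def quantize_for_display(img: np.ndarray, display_downscale: int,
                         levels: int = 3, full_res_step: float = 2.0,
                         step_growth: float = 2.0) -> np.ndarray:
    """Quantize wavelet subbands with a step size chosen for the intended display
    resolution: viewing at 1/2**d scale is assumed to tolerate a step roughly
    step_growth**d larger (hypothetical schedule for illustration only)."""
    d = int(np.log2(display_downscale))
    step = full_res_step * step_growth ** d
    coeffs = pywt.wavedec2(img.astype(np.float64), "bior4.4", level=levels)
    coeffs[0] = deadzone(coeffs[0], step)
    coeffs[1:] = [tuple(deadzone(band, step) for band in detail) for detail in coeffs[1:]]
    return pywt.waverec2(coeffs, "bior4.4")

# Hypothetical usage: larger steps (fewer bits) when the image will be shown at 1/4 scale.
img = np.random.default_rng(0).integers(0, 256, size=(256, 256)).astype(np.float64)
full_res_version = quantize_for_display(img, display_downscale=1)
quarter_res_version = quantize_for_display(img, display_downscale=4)
```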
9

Visually Lossless Coding for Color Aerial Images Using JPEG2000

Oh, Han, Kim, Yookyung 10 1900 (has links)
ITC/USA 2009 Conference Proceedings / The Forty-Fifth Annual International Telemetering Conference and Technical Exhibition / October 26-29, 2009 / Riviera Hotel & Convention Center, Las Vegas, Nevada / This paper describes a psychophysical experiment to measure visibility thresholds (VTs) for quantization distortion in JPEG2000 and an associated quantization algorithm for visually lossless coding of color aerial images. The visibility thresholds are obtained from a quantization distortion model based on the statistical characteristics of wavelet coefficients and the deadzone quantizer of JPEG2000, and the resulting thresholds are presented for the luminance component (Y) and the two chrominance components (Cb and Cr). Using these thresholds, we have achieved visually lossless coding for 24-bit color aerial images at an average bitrate of 4.17 bits/pixel, which is approximately 30% of the bitrate required for numerically lossless coding.
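To put those figures in perspective, simple arithmetic on the numbers quoted above gives:

```latex
\text{visually lossless compression ratio} \approx \frac{24~\text{bits/pixel}}{4.17~\text{bits/pixel}} \approx 5.8{:}1,
\qquad
\text{implied numerically lossless rate} \approx \frac{4.17}{0.30} \approx 13.9~\text{bits/pixel}.
```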
10

APPLICATION OF DATA COMPRESSION TO FRAME AND PACKET TELEMETRY

Horan, Stephen, Horan, Sheila B. 10 1900 (has links)
International Telemetering Conference Proceedings / October 20-23, 2003 / Riviera Hotel and Convention Center, Las Vegas, Nevada / Reduction of signal transmission bandwidth is of paramount concern to many in the telemetry and wireless industry. One technique that is available is the compression of the data before transmission. With telemetry-type data, there are many approaches that can be used to achieve compression. Data compression of the Advanced Range Telemetry (ARTM) PCM data sets in the frame and packet modes, and for the entire data file, will be considered and compared. The technique of differencing data will also be applied to the data files by subtracting the previous major frame and then applying compression techniques. It will be demonstrated that telemetry compression is a viable option to reduce the amount of data to be transmitted, and hence the bandwidth. However, this compression produces variable-length data segments with implications for real-time data synchronization.
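The frame-differencing step described above can be sketched as follows (toy major frames of our own and a general-purpose coder standing in for the paper's compression techniques): subtracting the previous major frame modulo 256 is exactly invertible and leaves mostly small residuals, which compress far better.

```python
import zlib
import numpy as np

def compressed_size(frames: np.ndarray) -> int:
    return len(zlib.compress(frames.tobytes(), 9))

def difference_frames(frames: np.ndarray) -> np.ndarray:
    """Subtract the previous major frame from each frame modulo 256 (first frame
    kept as-is); the wraparound keeps the transform exactly invertible."""
    diffed = frames.copy()
    diffed[1:] = ((frames[1:].astype(np.int16) - frames[:-1].astype(np.int16)) % 256).astype(np.uint8)
    return diffed

# Hypothetical PCM-like major frames: a slowly drifting baseline plus small noise.
rng = np.random.default_rng(1)
base = (120 + 10 * np.sin(np.linspace(0, 3, 256))).astype(np.uint8)
frames = np.array([base + rng.integers(0, 3, 256, dtype=np.uint8) + k // 8
                   for k in range(64)], dtype=np.uint8)

print("compressed raw frames        :", compressed_size(frames), "bytes")
print("compressed differenced frames:", compressed_size(difference_frames(frames)), "bytes")
```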
