11

ECG compression for Holter monitoring

Ottley, Adam Carl 11 April 2007
Cardiologists can gain useful insight into a patient's condition when they are able to correlate the patient's symptoms and activities. For this purpose, a Holter Monitor is often used: a portable electrocardiogram (ECG) recorder worn by the patient for a period of 24-72 hours. Preferably, the monitor is not cumbersome to the patient, so it should be designed to be as small and light as possible; however, the storage requirements for such a long signal are very large and can significantly increase the recorder's size and cost, so signal compression is often employed. At the same time, the decompressed signal must retain enough detail for the cardiologist to identify irregularities. "Lossy" compressors may obscure such details, whereas a "lossless" compressor preserves the signal exactly as captured. The purpose of this thesis is to develop a platform upon which a Holter Monitor can be built, including a hardware-assisted lossless compression method that avoids the signal-quality penalties of a lossy algorithm. The objective of this thesis is to develop and implement a low-complexity lossless ECG encoding algorithm capable of at least a 2:1 compression ratio in an embedded system for use in a Holter Monitor. Different lossless compression techniques were evaluated in terms of coding efficiency, suitability for ECG waveforms, random access within the signal, and complexity of the decoding operation. To reduce the physical circuit size, a System on a Programmable Chip (SOPC) design was utilized. A coder based on a library of linear predictors and Rice coding was chosen; on the real-world signals tested it gave a compression ratio of at least 2:1 and as high as 3:1, with low decoder complexity and fast random access to arbitrary parts of the signal. In the hardware-assisted implementation, encoding was four to five times faster than a software encoder running on the same CPU, while freeing the CPU to perform other tasks during the encoding process.
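As an illustration of the predictor-plus-Rice-coder design this abstract describes, here is a minimal sketch. The fixed second-order predictor and the Rice parameter k=4 are illustrative assumptions; the thesis itself selects from a library of linear predictors.

```python
def rice_encode(value, k):
    """Rice-code a non-negative integer: unary quotient, then k-bit remainder."""
    q, r = value >> k, value & ((1 << k) - 1)
    return "1" * q + "0" + format(r, f"0{k}b")

def zigzag(n):
    """Map signed residuals to non-negative integers: 0, -1, 1, -2, ... -> 0, 1, 2, 3, ..."""
    return (n << 1) if n >= 0 else (-n << 1) - 1

def compress(samples, k=4):
    """Predict each sample from the previous two, then Rice-code the residual."""
    bits, prev2, prev1 = [], 0, 0
    for x in samples:
        pred = 2 * prev1 - prev2          # second-order linear predictor
        bits.append(rice_encode(zigzag(x - pred), k))
        prev2, prev1 = prev1, x
    return "".join(bits)

# Smooth ECG-like samples give small residuals and hence short codes.
print(compress([512, 515, 519, 524, 530]))
```

Because each residual's code is self-delimiting, a decoder restarted at stored block boundaries can resynchronize immediately, which is one way the fast random access mentioned above can be achieved.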
13

Validation for Visually Lossless Compression of Stereo Images

Feng, Hsin-Chang 10 1900 (has links)
ITC/USA 2013 Conference Proceedings / The Forty-Ninth Annual International Telemetering Conference and Technical Exhibition / October 21-24, 2013 / Bally's Hotel & Convention Center, Las Vegas, NV / This paper describes the details of subjective validation for visually lossless compression of stereoscopic three-dimensional (3D) images. The subjective testing method employed in this work is adapted from methods used previously for visually lossless compression of two-dimensional (2D) images. Confidence intervals on the correct response rate obtained from the subjective validation of compressed stereo pairs provide reliable evidence that the compressed stereo pairs are visually lossless.
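The confidence-interval criterion mentioned here can be sketched as follows; the forced-choice setup and the specific counts are illustrative assumptions, not figures from the paper.

```python
import math

def proportion_ci(correct, trials, z=1.96):
    """Normal-approximation 95% confidence interval on a correct-response rate."""
    p = correct / trials
    half = z * math.sqrt(p * (1 - p) / trials)
    return p - half, p + half

# In a forced-choice test, chance performance is 0.5. If the interval on the
# observers' correct rate stays near chance, observers cannot reliably tell
# compressed from original stereo pairs, supporting "visually lossless".
lo, hi = proportion_ci(correct=205, trials=400)
print(f"95% CI: [{lo:.3f}, {hi:.3f}]")
```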
14

Measurement of Visibility Thresholds for Compression of Stereo Images

Feng, Hsin-Chang 10 1900 (has links)
ITC/USA 2012 Conference Proceedings / The Forty-Eighth Annual International Telemetering Conference and Technical Exhibition / October 22-25, 2012 / Town and Country Resort & Convention Center, San Diego, California / This paper proposes a method of measuring visibility thresholds for quantization distortion in JPEG2000 compression of stereoscopic 3D images. The crosstalk effect is carefully considered to ensure that quantization errors in each channel of a stereoscopic image are imperceptible to both eyes. A model for visibility thresholds is developed to reduce the daunting number of measurements required by the subjective experiments.
15

A novel fully progressive lossy-to-lossless coder for arbitrarily-connected triangle-mesh models of images and other bivariate functions

Guo, Jiacheng 16 August 2018 (has links)
A new progressive lossy-to-lossless coding method for arbitrarily-connected triangle-mesh models of bivariate functions is proposed. The algorithm employs a novel representation of a mesh dataset called a bivariate-function description (BFD) tree and codes the tree in an efficient manner. The proposed coder yields a particularly compact description of the mesh connectivity by coding only the constrained edges that are not locally preferred Delaunay (locally PD). Experimental results show our method to be vastly superior to previously proposed coding frameworks in both lossless and progressive coding performance. In lossless coding, the proposed method produces coded bitstreams that are 27.3% and 68.1% smaller than those generated by the Edgebreaker and Wavemesh methods, respectively. Progressive coding performance is measured in terms of the PSNR of function reconstructions generated from the meshes decoded at intermediate stages. The experimental results show that the function approximations obtained with the proposed approach are vastly superior to those yielded by the image tree (IT), scattered data coding (SDC), average-difference image tree (ADIT), and Wavemesh methods, with average improvements of 4.70 dB, 10.06 dB, 2.92 dB, and 10.19 dB in PSNR, respectively. The proposed coding approach can also be combined with a mesh generator to form a highly effective mesh-based image coding system, which is evaluated against the popular JPEG2000 codec on images that are nearly piecewise smooth. The images are compressed with the mesh-based image coder and the JPEG2000 codec at fixed compression rates, and the quality of the resulting reconstructions is measured in terms of PSNR. The images obtained with our method are shown to have better quality than those produced by the JPEG2000 codec, with an average improvement of 3.46 dB. / Graduate
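For reference, the PSNR figures quoted in this abstract follow the standard definition sketched below; the 8-bit peak value of 255 is an assumption for illustration.

```python
import numpy as np

def psnr(original, reconstruction, peak=255.0):
    """Peak signal-to-noise ratio in dB between two arrays of equal shape."""
    diff = original.astype(np.float64) - reconstruction.astype(np.float64)
    mse = np.mean(diff ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)
```

Since PSNR is logarithmic in the mean squared error, the reported 3.46 dB average gain over JPEG2000 corresponds to cutting the mean squared error by more than half.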
16

Lossless medical image compression through lightweight binary arithmetic coding

Bartrina-Rapesta, Joan, Sanchez, Victor, Serra-Sagristà, Joan, Marcellin, Michael W., Aulí-Llinàs, Francesc, Blanes, Ian 19 September 2017 (has links)
A contextual lightweight arithmetic coder is proposed for lossless compression of medical imagery. Context definition uses causal data from previously coded symbols, an inexpensive yet efficient approach. To further reduce the computational cost, a binary arithmetic coder with fixed-length codewords is adopted, avoiding the normalization procedure common in most implementations, and the probability of each context is estimated through bitwise operations. Experimental results are provided for several medical images and compared against state-of-the-art coding techniques, yielding average improvements between nearly 0.1 and 0.2 bits per sample (bps).
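A shift-based context-probability estimator in the spirit of this description might look like the following sketch; the register width, update rate, and two-bit context are illustrative assumptions, not the paper's exact scheme.

```python
class ContextModel:
    """Per-context probability estimate kept as a fixed-point integer and
    updated with shifts only, avoiding multiplications and divisions."""
    def __init__(self, bits=12):
        self.bits = bits
        self.p = 1 << (bits - 1)          # start at probability 0.5

    def prob_one(self):
        return self.p / (1 << self.bits)  # current estimate of P(bit == 1)

    def update(self, bit):
        # Exponential-decay update toward 0 or full scale: p += (target - p) >> 4
        target = (1 << self.bits) if bit else 0
        self.p += (target - self.p) >> 4
        self.p = min(max(self.p, 1), (1 << self.bits) - 1)  # keep estimate non-degenerate

# The causal context (here, the previous two bits) selects which model to
# update, mirroring the context-definition idea in the abstract.
models = {ctx: ContextModel() for ctx in range(4)}
ctx = 0
for bit in [1, 1, 0, 1, 1, 1, 0, 1]:
    models[ctx].update(bit)
    ctx = ((ctx << 1) | bit) & 0b11
```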
17

Compression of Medical Images Using Local Neighbor Difference

Patterson, Erin Leigh 24 August 2017 (has links)
No description available.
18

A Turbo Approach to Distributed Acoustic Detection and Estimation

Egger, Sean Robert 18 December 2009 (has links)
Networked multi-sensor array systems have proven advantageous in many sensing applications, and a large amount of research has been conducted on them, with data fusion a main interest. Intelligently processing the large amounts of data collected by these systems is required in order to fully realize their benefits, and a robust but flexible simulation environment would provide a platform for accurately comparing current and future data-fusion theories. This thesis proposes a simulator model for testing fusion theories for acoustic multi-sensor networks. An iterative, lossless data-fusion algorithm is presented as the model for simulation development. The arrangement and orientation of objects in the simulation environment, as well as most other system parameters, are defined by the user before the simulation runs. The sensor data, including noise, is generated with the appropriate time delay and propagation loss before being processed by a delay-and-sum beamformer and a matched filter. The resulting range-Doppler maps are converted to probability density functions and translated to a single point of reference, and the data is then combined into a single world model. An iterative process filters out false targets and amplifies true target detections. Data is fused from each multi-sensor array and from each simulation run: target amplitudes are amplified if the targets are present in all combined world models, and are otherwise reduced. This thesis presents the results of the fusion algorithm, including multiple iterations, to demonstrate the algorithm's effectiveness. / Master of Science
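The delay-and-sum beamformer in the processing chain above can be sketched as follows; the plane-wave model, array geometry, and sound speed are illustrative assumptions, and the circular shift is a simplification (a real implementation would zero-pad).

```python
import numpy as np

def delay_and_sum(signals, positions, direction, fs, c=343.0):
    """Align each sensor's signal for a plane wave arriving from `direction`
    (a unit vector) and average. signals: (n_sensors, n_samples);
    positions: (n_sensors, 3) in metres; fs: sample rate in Hz."""
    out = np.zeros(signals.shape[1])
    for pos, sig in zip(positions, signals):
        delay = np.dot(pos, direction) / c        # propagation delay in seconds
        shift = int(round(delay * fs))            # nearest whole sample
        out += np.roll(sig, -shift)               # circular shift: sketch only
    return out / len(signals)
```

Summing the aligned channels reinforces a source in the steered direction while averaging down uncorrelated noise, which is what makes the subsequent matched filtering and range-Doppler processing effective.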
19

Lossless and nearly-lossless image compression based on combinatorial transforms

Syahrul, Elfitrin 29 June 2011 (has links)
Common image compression standards are usually based on frequency transforms such as the Discrete Cosine Transform (DCT) or the discrete wavelet transform. We present a different approach for lossless image compression, based on a combinatorial transform: the Burrows-Wheeler Transform (BWT), which tends to reorder symbols according to their following context and is therefore a promising compression approach based on context modelling. After applying the BWT to the original image, the probability that identical characters initially far apart end up side by side is increased. This technique is used in text compression, notably in the BZIP2 format, currently one of the formats offering among the best compression ratios. The original BWT-based compression chain consists of three stages. The first is the BWT itself, which reorganizes the data so that samples with identical values are grouped together. Burrows and Wheeler recommend following it with Move-To-Front (MTF) coding, which maximizes runs of identical characters and thus enables effective entropy coding (EC), mainly Huffman or arithmetic coding; these two codings form the last two stages of the chain. We survey the state of the art and report empirical studies of BWT-based compression chains for lossless image compression, with detailed data and analyses covering several variants of the MTF and EC stages. Moreover, unlike in text compression, the 2D nature of an image makes the scan order of the data important, so a preprocessing step applied while reading the data improves the compression ratio. Because BWT-based compression schemes are usually lossless, we also apply the algorithm to medical imaging, where every bit must be reconstructed. We compare our results with standard compression methods, in particular JPEG 2000 and JPEG-LS; on average, the compression ratio obtained with the proposed method is higher than that obtained with the JPEG 2000 or JPEG-LS standards. Finally, we present some open problems that are also directions for further research.
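A toy sketch of the first two stages of this chain follows (a naive BWT for clarity; production coders use suffix arrays, and the sentinel byte is assumed absent from the input).

```python
def bwt(data: bytes, sentinel: int = 0) -> bytes:
    """Burrows-Wheeler Transform: last column of the sorted rotations."""
    s = data + bytes([sentinel])
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return bytes(rot[-1] for rot in rotations)

def mtf(data: bytes) -> list:
    """Move-To-Front: recently seen symbols map to small indices."""
    alphabet = list(range(256))
    out = []
    for b in data:
        i = alphabet.index(b)
        out.append(i)
        alphabet.insert(0, alphabet.pop(i))
    return out

# BWT clusters identical symbols, so the MTF output is dominated by small
# values and runs of zeros, which the entropy-coding stage compresses well.
print(mtf(bwt(b"banana")))
```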
20

Use of Multi-Threading, Modern Programming Language, and Lossless Compression in a Dynamic Commutation/Decommutation System

Wigent, Mark A., Mazzario, Andrea M., Matsumura, Scott M. 10 1900 (has links)
ITC/USA 2011 Conference Proceedings / The Forty-Seventh Annual International Telemetering Conference and Technical Exhibition / October 24-27, 2011 / Bally's Las Vegas, Las Vegas, Nevada / The Spectrum Efficient Technology Science and Technology (SET S&T) Program is sponsoring the development of the Dynamic Commutation and Decommutation System (DCDS), which optimizes telemetry data transmission in real time. The goal of DCDS is to improve spectrum efficiency - not through improved RF techniques, but through changing and optimizing the contents of the telemetry stream during system test. By allowing new parameters to be added to the telemetered stream at any point during system test, DCDS removes the need to transmit measured data unless it is actually needed on the ground. Compared to serial streaming telemetry, real-time reformatting of the telemetry stream does require additional processing onboard the test article; DCDS leverages advances in microprocessor technology to perform this processing while meeting the size, weight, and power constraints of the test environment. Performance gains have been achieved by significant multi-threading of the application, allowing it to run on modern multi-core processors. Two other enhancing technologies incorporated into DCDS are the Java programming language and lossless compression.
