11

JPEG 2000 and parity bit replenishment for remote video browsing

Devaux, François-Olivier 19 September 2008 (has links)
This thesis is devoted to the study of a compression and transmission framework for video. It exploits the JPEG 2000 standard and the principles of coding with side information to enable efficient interactive browsing of video sequences. During the last decade, we have witnessed an explosion of digital visual information as well as a significant diversification of visualization devices. In terms of viewing experience, many applications now enable users to interact with content stored on a distant server. Pausing video sequences to observe details by zooming and panning or, conversely, browsing low resolutions of high-quality HD videos are becoming common tasks. The video distribution framework envisioned in this thesis targets such devices and applications. Based on the conditional replenishment framework, the proposed system combines two complementary coding methods. The first one is JPEG 2000, a scalable and very efficient compression algorithm. The second method is based on the coding-with-side-information paradigm. This technique is relatively novel in a video context, and has been adapted to the particular scalable image representation adopted in this work. Interestingly, it has been improved by integrating an image source model and by exploiting the temporal correlation inherent to the sequence. A particularity of this work is its emphasis on system scalability as well as on server complexity. The proposed browsing architecture can scale to handle large volumes of content and serve a possibly very large number of heterogeneous users. This is achieved by defining a scheduler that adapts its decisions to the channel conditions and to user requirements expressed in terms of computational capabilities and spatio-temporal interest. This scheduling is carried out in real time at low computational cost and in a post-compression way, without re-encoding the sequences.
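As an illustration of the conditional-replenishment idea mentioned above (a sketch, not the scheduler developed in the thesis), the Python snippet below greedily picks, for each precinct, whichever candidate update (skip, a JPEG 2000 refresh, or a parity-based correction) offers the largest expected distortion reduction per transmitted bit, until a rate budget is exhausted. All mode names, bit costs and gains are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    precinct: int     # spatial block identifier
    mode: str         # e.g. "skip", "jpeg2000", "parity" (illustrative labels only)
    bits: int         # transmission cost in bits
    gain: float       # expected distortion reduction at the client

def schedule(candidates, budget_bits):
    """Greedy rate-distortion scheduling: serve the most valuable updates first."""
    chosen, spent = {}, 0
    # sort by distortion reduction per bit, best first
    for c in sorted(candidates, key=lambda c: c.gain / max(c.bits, 1), reverse=True):
        if c.precinct in chosen or spent + c.bits > budget_bits:
            continue
        chosen[c.precinct] = c
        spent += c.bits
    return list(chosen.values()), spent

# hypothetical example: two precincts, each with two candidate coding modes
candidates = [
    Candidate(0, "jpeg2000", bits=4000, gain=120.0),
    Candidate(0, "parity",   bits=1500, gain=80.0),
    Candidate(1, "jpeg2000", bits=5000, gain=60.0),
    Candidate(1, "parity",   bits=1800, gain=40.0),
]
selected, used = schedule(candidates, budget_bits=6000)
```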
12

Perceptually Lossless Coding of Medical Images - From Abstraction to Reality

Wu, David, dwu8@optusnet.com.au January 2007 (has links)
This work explores a novel vision-model-based coding approach to encode medical images at a perceptually lossless quality, within the framework of the JPEG 2000 coding engine. Perceptually lossless encoding offers the best of both worlds, delivering images free of visual distortions while providing significantly greater compression ratios than its information-lossless counterparts. This is achieved through a visual pruning function, embedded with an advanced model of the human visual system, to accurately identify and efficiently remove visually irrelevant or insignificant information. In addition, it maintains bit-stream compliance with the JPEG 2000 coding framework and is consequently compliant with the Digital Imaging and Communications in Medicine (DICOM) standard. Equally, the pruning function is applicable to other Discrete Wavelet Transform based image coders, e.g., Set Partitioning in Hierarchical Trees (SPIHT). Further significant coding gains are exploited through an artificial edge segmentation algorithm and a novel arithmetic pruning algorithm. The coding effectiveness and qualitative consistency of the algorithm are evaluated through a double-blind subjective assessment with 31 medical experts, performed using a novel two-stage forced-choice assessment devised for medical experts, offering the benefits of greater robustness and accuracy in measuring subjective responses. The assessment showed that no statistically significant differences were perceivable between the original images and the images encoded by the proposed coder.
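A minimal sketch of the pruning idea, assuming per-subband visibility thresholds supplied by some vision model (the thresholds and subband labels below are placeholders, not the thesis's model): coefficients whose magnitude falls below the threshold of their subband are zeroed, so the entropy coder spends no bits on visually insignificant detail.

```python
import numpy as np

def visually_prune(subbands, thresholds):
    """Zero wavelet coefficients deemed visually insignificant.

    subbands   -- dict mapping a subband label (e.g. "HL1") to a 2-D coefficient array
    thresholds -- dict mapping the same labels to a visibility threshold; in a real
                  coder these would come from a human-visual-system model
    """
    pruned = {}
    for label, coeffs in subbands.items():
        t = thresholds[label]
        pruned[label] = np.where(np.abs(coeffs) < t, 0, coeffs)
    return pruned

# hypothetical one-level decomposition with placeholder thresholds
rng = np.random.default_rng(0)
subbands = {s: rng.normal(0, 10, (64, 64)) for s in ("LL1", "HL1", "LH1", "HH1")}
thresholds = {"LL1": 0.0, "HL1": 4.0, "LH1": 4.0, "HH1": 6.0}
pruned = visually_prune(subbands, thresholds)
```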
13

A study of CABAC hardware acceleration with configurability in multi-standard media processing / En studie i konfigurerbar hårdvaruaccelerering för CABAC i flerstandards mediabearbetning

Flordal, Oskar January 2005 (has links)
To achieve greater compression ratios, new video and image codecs like H.264 and JPEG 2000 take advantage of context-adaptive binary arithmetic coding (CABAC). As it contains computationally heavy algorithms, fast implementations have to be made when it is applied to large amounts of data, such as when compressing high-resolution formats like HDTV. This document describes how entropy coding works in general, with a focus on arithmetic coding and CABAC. Furthermore, the document discusses the demands of the different CABACs and proposes different options for hardware and instruction-level optimisation. Testing and benchmarking of these implementations are done to ease evaluation. The main contribution of the thesis is the parallelisation and unification of the CABACs, which is discussed and partly implemented. The result of the ILA is improved program flow through specialised branching operations. The result of the DHA is a two-bit parallel accelerator with hardware sharing between the JPEG 2000 and H.264 encoders, with limited decoding support.
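For readers unfamiliar with the principle, the following is a simplified context-adaptive binary arithmetic encoder in Python, using per-context frequency counts and interval renormalisation. The actual CABAC engine in H.264 (and the MQ-coder in JPEG 2000) is table-driven and multiplication-free, so this is an illustration of the coding idea rather than either standard's algorithm.

```python
class AdaptiveBinaryArithmeticEncoder:
    """Textbook-style adaptive binary arithmetic coder (not the H.264 CABAC engine)."""

    TOP, HALF, QUARTER = (1 << 32) - 1, 1 << 31, 1 << 30

    def __init__(self):
        self.low, self.high = 0, self.TOP
        self.pending = 0          # bits deferred while the interval straddles the midpoint
        self.bits = []            # output bit stream
        self.contexts = {}        # context id -> (count of 0s, count of 1s)

    def _emit(self, bit):
        self.bits.append(bit)
        self.bits.extend([1 - bit] * self.pending)
        self.pending = 0

    def encode(self, ctx, bit):
        c0, c1 = self.contexts.get(ctx, (1, 1))
        total = c0 + c1
        span = self.high - self.low + 1
        # split the interval in proportion to the estimated probability of a 0
        split = self.low + span * c0 // total - 1
        if bit == 0:
            self.high, c0 = split, c0 + 1
        else:
            self.low, c1 = split + 1, c1 + 1
        if c0 + c1 > 1024:        # keep counts bounded so the model stays adaptive
            c0, c1 = max(c0 // 2, 1), max(c1 // 2, 1)
        self.contexts[ctx] = (c0, c1)
        # renormalise: push out bits once the interval is confined to one half
        while True:
            if self.high < self.HALF:
                self._emit(0)
            elif self.low >= self.HALF:
                self._emit(1)
                self.low -= self.HALF
                self.high -= self.HALF
            elif self.low >= self.QUARTER and self.high < 3 * self.QUARTER:
                self.pending += 1
                self.low -= self.QUARTER
                self.high -= self.QUARTER
            else:
                return
            self.low, self.high = self.low * 2, self.high * 2 + 1

    def finish(self):
        # two more bits are enough to pin down a value inside the final interval
        self.pending += 1
        self._emit(0 if self.low < self.QUARTER else 1)
        return self.bits

# usage with a single illustrative context
enc = AdaptiveBinaryArithmeticEncoder()
for bit in [1, 1, 0, 1, 1, 1, 0, 1]:
    enc.encode(ctx="bin0", bit=bit)
bitstream = enc.finish()
```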
14

Importance Prioritised Image Coding in JPEG 2000

Nguyen, Anthony Ngoc January 2005 (has links)
Importance prioritised coding is a principle aimed at improving the interpretability (or image content recognition) versus bit-rate performance of image coding systems. This can be achieved by (1) detecting and tracking image content or regions of interest (ROIs) that are crucial to the interpretation of an image, and (2) compressing them in such a manner that ROIs are encoded with higher fidelity and prioritised for dissemination or transmission. Traditional image coding systems prioritise image data according to an objective measure of distortion, and this measure does not correlate well with image quality or interpretability. Importance prioritised coding, on the other hand, aims to prioritise image contents according to an 'importance map', which provides a means for modelling and quantifying the relative importance of parts of an image. In such a coding scheme, the importance of parts of an image containing ROIs would be higher than that of other parts of the image. The encoding and prioritisation of ROIs means that the interpretability in these regions is improved at low bit-rates. An importance prioritised image coder incorporated within the JPEG 2000 international standard for image coding, called IMP-J2K, is proposed to encode and prioritise ROIs according to an 'importance map'. The map can be automatically generated using image processing algorithms that result in a limited number of ROIs, or manually constructed by hand-marking ROIs using a priori knowledge. The proposed importance prioritised coder provides a user of the encoder with great flexibility in defining single or multiple ROIs with arbitrary degrees of importance and prioritising them using IMP-J2K. Furthermore, IMP-J2K codestreams can be reconstructed by generic JPEG 2000 decoders, which is important for interoperability between imaging systems and processes. The interpretability performance of IMP-J2K was quantitatively assessed using the subjective National Imagery Interpretability Rating Scale (NIIRS). The effect of importance prioritisation on image interpretability was investigated, and a methodology to relate the NIIRS ratings, ROI importance scores and bit-rates was proposed to facilitate NIIRS specifications for importance prioritised coding. In addition, a technique is proposed to construct an importance map by allowing a user of the encoder to use gaze patterns to automatically determine and assign importance to fixated regions (or ROIs) in an image. The importance map can be used by IMP-J2K to bias the encoding of the image towards these ROIs, and subsequently to allow a user at the receiver to reconstruct the image as desired by the user of the encoder. Ultimately, with the advancement of automated importance mapping techniques that can reliably predict regions of visual attention, IMP-J2K may play a significant role in matching an image coding scheme to the human visual system.
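The sketch below illustrates the prioritisation principle only (it is not IMP-J2K): each code block's rate-distortion contribution is weighted by the mean of a normalised importance map over the block's spatial support, and contributions are then ordered so that data inside ROIs is emitted first. The block structure and numbers are hypothetical.

```python
import numpy as np

def prioritise_blocks(blocks, importance_map):
    """Order code-block contributions so that important regions come first.

    blocks         -- list of dicts with 'rows'/'cols' slices into the image,
                      plus 'distortion_gain' and 'bits' for the contribution
    importance_map -- 2-D array in [0, 1], higher meaning more important
    """
    def weighted_value(b):
        weight = float(importance_map[b["rows"], b["cols"]].mean())
        # bias the usual distortion-per-bit criterion by the region's importance
        return weight * b["distortion_gain"] / max(b["bits"], 1)

    return sorted(blocks, key=weighted_value, reverse=True)

# hypothetical 128x128 image with one highly important region
imp = np.full((128, 128), 0.1)
imp[32:96, 32:96] = 1.0
blocks = [
    {"rows": slice(0, 64),  "cols": slice(0, 64),  "distortion_gain": 50.0, "bits": 2000},
    {"rows": slice(32, 96), "cols": slice(32, 96), "distortion_gain": 40.0, "bits": 2500},
]
ordered = prioritise_blocks(blocks, imp)
```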
16

Komprese obrazu pomocí vlnkové transformace / Image Compression Using the Wavelet Transform

Kaše, David January 2015 (has links)
This thesis deals with image compression using the wavelet, contourlet and shearlet transforms. It starts with a brief look at the image compression problem and at quality measurement. Next, the basic concepts of wavelets, multiresolution analysis and scaling functions are presented, followed by a detailed look at each transform. Representative coefficient-coding algorithms are EZW, SPIHT and, marginally, EBCOT. The second part describes the design and implementation of the constructed library. The last part compares the results of the transforms with the JPEG 2000 format. The comparison identified the types of images for which the implemented contourlet and shearlet transforms were more effective than the wavelet transform. The JPEG 2000 format was not surpassed.
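As a small, self-contained example of the kind of experiment the abstract describes (transform, discard small coefficients, measure quality), the sketch below applies a one-level 2-D Haar transform, hard-thresholds the detail subbands and reports PSNR. It is a stand-in illustration, not the thesis's contourlet or shearlet code.

```python
import numpy as np

def haar2d(img):
    """One level of a 2-D Haar transform on an even-sized float image."""
    a = (img[0::2] + img[1::2]) / 2.0   # row averages
    d = (img[0::2] - img[1::2]) / 2.0   # row differences
    ll, hl = (a[:, 0::2] + a[:, 1::2]) / 2.0, (a[:, 0::2] - a[:, 1::2]) / 2.0
    lh, hh = (d[:, 0::2] + d[:, 1::2]) / 2.0, (d[:, 0::2] - d[:, 1::2]) / 2.0
    return ll, hl, lh, hh

def ihaar2d(ll, hl, lh, hh):
    a = np.empty((ll.shape[0], ll.shape[1] * 2))
    d = np.empty_like(a)
    a[:, 0::2], a[:, 1::2] = ll + hl, ll - hl
    d[:, 0::2], d[:, 1::2] = lh + hh, lh - hh
    out = np.empty((a.shape[0] * 2, a.shape[1]))
    out[0::2], out[1::2] = a + d, a - d
    return out

def psnr(ref, test, peak=255.0):
    mse = np.mean((ref.astype(float) - test.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

# hypothetical test image: smooth gradient plus noise
rng = np.random.default_rng(1)
img = np.add.outer(np.arange(256), np.arange(256)) / 2.0 + rng.normal(0, 4, (256, 256))
ll, hl, lh, hh = haar2d(img)
t = 3.0                                                  # hard threshold on detail subbands
hl, lh, hh = [np.where(np.abs(s) < t, 0, s) for s in (hl, lh, hh)]
print(f"PSNR after thresholding: {psnr(img, ihaar2d(ll, hl, lh, hh)):.2f} dB")
```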
17

Jádra schématu lifting pro vlnkovou transformaci / Lifting Scheme Cores for Wavelet Transform

Bařina, David Unknown Date (has links)
This work focuses on the efficient computation of the two-dimensional discrete wavelet transform. Current methods are extended in several directions so that they compute the transform in a single pass, possibly over multiple levels, using a compact kernel. This kernel can further be suitably reorganised in order to minimise the use of certain resources. The presented approach fits naturally with commonly used SIMD extensions, exploits the cache hierarchy of modern processors and is well suited to parallel computation. Finally, the presented approach is integrated into the JPEG 2000 compression chain, where it proved to be substantially faster than widely used implementations.
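To make the lifting scheme concrete, here is a single-level, one-dimensional version of the reversible CDF 5/3 transform used by JPEG 2000, written as its two lifting steps (predict, then update) with symmetric boundary handling. It assumes an even-length integer signal and serves only as a scalar reference point for the single-pass, SIMD-friendly kernels the thesis develops.

```python
import numpy as np

def cdf53_forward(x):
    """Reversible CDF 5/3 lifting (one level, 1-D, even-length integer signal)."""
    x = np.asarray(x, dtype=np.int64)
    even, odd = x[0::2].copy(), x[1::2].copy()
    # predict step: detail[n] = odd[n] - floor((even[n] + even[n+1]) / 2)
    right = np.append(even[1:], even[-1])            # symmetric extension at the end
    detail = odd - ((even + right) >> 1)
    # update step: approx[n] = even[n] + floor((detail[n-1] + detail[n] + 2) / 4)
    left = np.insert(detail[:-1], 0, detail[0])      # symmetric extension at the start
    approx = even + ((left + detail + 2) >> 2)
    return approx, detail

def cdf53_inverse(approx, detail):
    left = np.insert(detail[:-1], 0, detail[0])
    even = approx - ((left + detail + 2) >> 2)       # undo the update step
    right = np.append(even[1:], even[-1])
    odd = detail + ((even + right) >> 1)             # undo the predict step
    out = np.empty(even.size * 2, dtype=np.int64)
    out[0::2], out[1::2] = even, odd
    return out

x = np.random.default_rng(2).integers(0, 256, 64)
a, d = cdf53_forward(x)
assert np.array_equal(cdf53_inverse(a, d), x)        # the integer lifting is reversible
```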
18

Komprese obrazu pomocí vlnkové transformace / Image Compression Using the Wavelet Transform

Urbánek, Pavel January 2013 (has links)
This thesis is focused on image compression using the wavelet transform. The first part of this document provides the reader with background on image compression, presents well-known contemporary algorithms and looks into the details of wavelet compression and the subsequent encoding schemes. Both the JPEG and JPEG 2000 standards are introduced. The second part analyzes and describes the implementation of an image compression tool, including innovations and optimizations. The third part is dedicated to the comparison and evaluation of the results achieved.
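A brief sketch of the sort of round-trip evaluation such a comparison implies: compression ratio and PSNR for one encode/decode cycle. The encode and decode callables are placeholders for whichever coder (JPEG, JPEG 2000, or the implemented tool) is being measured; the identity codec at the end only demonstrates the call shape.

```python
import numpy as np

def evaluate_codec(image, encode, decode, peak=255.0):
    """Report compression ratio and PSNR for one encode/decode round trip.

    encode -- callable: image -> bytes   (placeholder for the codec under test)
    decode -- callable: bytes -> image
    """
    payload = encode(image)
    restored = decode(payload)
    mse = np.mean((image.astype(float) - restored.astype(float)) ** 2)
    psnr = float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)
    ratio = image.size * image.itemsize / max(len(payload), 1)
    return {"compression_ratio": ratio, "psnr_db": psnr}

# trivial stand-in codec (raw bytes, no compression) just to show the call shape
img = np.random.default_rng(3).integers(0, 256, (64, 64), dtype=np.uint8)
report = evaluate_codec(img,
                        encode=lambda a: a.tobytes(),
                        decode=lambda b: np.frombuffer(b, dtype=np.uint8).reshape(64, 64))
```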
19

Produkce digitálních obrazových dat a jejich kontrola / Digital Images Production and Quality Control

Vychodil, Bedřich January 2013 (has links)
This dissertation provides a broad understanding of fundamental terminology and an inside view of the digitization workflow and quality control processes. The main foci are quality assurance during and outside the digitization processes and identification of the most suitable format for access, long-term preservation, rendering and reformatting. An analysis of selected digitization centers is also included. An application called DIFFER (Determinator of Image File Format propERties), and subsequently The Image Data Validator - DIFFER, has been developed using results from previously conducted research. The application utilizes new methods and technologies to help accelerate processing and quality control. The goal was to develop a quality control application for a select group of relevant still image file formats, capable of performing identification, characterization, validation and visual/mathematical comparison, integrated into an operational digital preservation framework. The application comes with a well-structured graphical user interface, which helps the end user to understand the relationships between various file format properties, detect visual and non-visual errors, and simplify decision-making. Additional comprehensive annexes, describing the most crucial still image...
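A minimal sketch of the identification step such a tool performs, matching the leading magic bytes of a few still-image formats. It is not the DIFFER implementation, and real characterization and validation go far beyond signature checks.

```python
SIGNATURES = [
    (b"\x00\x00\x00\x0cjP  \r\n\x87\n", "JPEG 2000 (JP2)"),
    (b"\xff\xd8\xff",                   "JPEG"),
    (b"\x89PNG\r\n\x1a\n",              "PNG"),
    (b"II*\x00",                        "TIFF (little-endian)"),
    (b"MM\x00*",                        "TIFF (big-endian)"),
]

def identify(path):
    """Return a coarse format label based on the file's leading magic bytes."""
    with open(path, "rb") as fh:
        head = fh.read(16)
    for magic, label in SIGNATURES:
        if head.startswith(magic):
            return label
    return "unknown"
```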
