101

Digital rights management (DRM) - watermark encoding scheme for JPEG images

Samuel, Sindhu 12 September 2008 (has links)
The aim of this dissertation is to develop a new algorithm for embedding a watermark in JPEG compressed images using encoding methods. This encompasses the embedding of proprietary information, such as identity and authentication bitstrings, into the compressed material. The watermark encoding scheme combines entropy coding with homophonic coding in order to embed a watermark in a JPEG image; arithmetic coding was used as the entropy encoder. It is often desirable to obtain a robust digital watermarking method that does not distort the digital image, even if this implies that the image is slightly expanded in size before final compression. In this dissertation an algorithm that combines homophonic and arithmetic coding for JPEG images was developed and implemented in software. A detailed analysis of the algorithm is given, together with the compression (in number of bits) obtained when using it. This research shows that homophonic coding can be used to embed a watermark in a JPEG image by using the watermark information to select the homophones. The proposed algorithm can thus be viewed as a ‘key-less’ encryption technique, in the sense that the external bitstring acting as the ‘key’ is embedded intrinsically into the message stream. The algorithm creates JPEG images with minimal distortion, with Peak Signal to Noise Ratios (PSNR) above 35 dB, and the resulting increase in the entropy of the file is within the expected 2 bits per symbol. This research consequently provides a unique watermarking technique for images compressed using the JPEG standard. / Dissertation (MEng)--University of Pretoria, 2008. / Electrical, Electronic and Computer Engineering / unrestricted
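
As an aside for readers new to homophonic coding, the Python sketch below illustrates the selection mechanism the abstract describes: each source symbol maps to several equivalent prefix-free codewords (homophones), and watermark bits, rather than random bits, choose among them, so the watermark is carried at a bounded entropy cost. The alphabet and code table are invented toy values; the dissertation's actual scheme drives homophone choices inside an arithmetic coder operating on JPEG data.

```python
import math

# Toy homophone table: each symbol has equally likely, prefix-free
# codewords; choosing among 2^k homophones hides k watermark bits.
HOMOPHONES = {
    "a": ["00", "01"],
    "b": ["100", "101"],
    "c": ["110", "111"],
}
# Inverse map: codeword -> (symbol, index of the homophone chosen).
DECODE = {cw: (s, i) for s, cws in HOMOPHONES.items() for i, cw in enumerate(cws)}

def embed(symbols, watermark_bits):
    """Encode `symbols`, steering each homophone choice with watermark bits."""
    out, w = [], list(watermark_bits)
    for s in symbols:
        cws = HOMOPHONES[s]
        k = int(math.log2(len(cws)))                 # bits embeddable here
        idx = 0
        for _ in range(k):
            idx = (idx << 1) | (w.pop(0) if w else 0)  # pad with zeros at the end
        out.append(cws[idx])
    return "".join(out)

def extract(bitstream):
    """Parse the prefix-free stream; recover the symbols and the watermark."""
    symbols, bits, buf = [], [], ""
    for b in bitstream:
        buf += b
        if buf in DECODE:
            s, idx = DECODE[buf]
            k = int(math.log2(len(HOMOPHONES[s])))
            if k:
                bits += [int(c) for c in format(idx, f"0{k}b")]
            symbols.append(s)
            buf = ""
    return "".join(symbols), bits

stream = embed("abcab", [1, 0, 1, 1, 0])
print(stream)            # 0110011101100
print(extract(stream))   # ('abcab', [1, 0, 1, 1, 0])
```

Because every homophone decodes back to the same symbol, the cover message survives intact while the bit choices remain recoverable, which mirrors the ‘intrinsically embedded key’ reading given in the abstract.
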
102

Die Bilder sind weg! / The pictures are gone!

Sontag, Ralph 07 June 2007 (has links)
After two storage media landed on my desk in quick succession, each of which had until recently held many nice pictures that could no longer be found, I undertook my first cautious forays into the field of data recovery. The paths and tools that ultimately led to success are presented in a short talk.
103

JPEG 2000 and parity bit replenishment for remote video browsing

Devaux, François-Olivier 19 September 2008 (has links)
This thesis is devoted to the study of a compression and transmission framework for video. It exploits the JPEG 2000 standard and coding-with-side-information principles to enable efficient interactive browsing of video sequences. During the last decade, we have witnessed an explosion of digital visual information as well as a significant diversification of visualization devices. In terms of viewing experience, many applications now enable users to interact with the content stored on a distant server. Pausing video sequences to observe details by zooming and panning or, conversely, browsing low resolutions of high-quality HD videos are becoming common tasks. The video distribution framework envisioned in this thesis targets such devices and applications. Based on the conditional replenishment framework, the proposed system combines two complementary coding methods. The first is JPEG 2000, a scalable and very efficient compression algorithm. The second is based on the coding-with-side-information paradigm. This technique is relatively novel in a video context and has been adapted to the particular scalable image representation adopted in this work; it has also been improved by integrating an image source model and by exploiting the temporal correlation inherent to the sequence. A particularity of this work is the emphasis on system scalability as well as on server complexity. The proposed browsing architecture can scale to handle large volumes of content and serve a possibly very large number of heterogeneous users. This is achieved by defining a scheduler that adapts its decisions to the channel conditions and to user requirements expressed in terms of computational capabilities and spatio-temporal interest. This scheduling is carried out in real time at low computational cost and in a post-compression way, without re-encoding the sequences.
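
A minimal sketch of the conditional replenishment decision, assuming a simple per-block mean-squared-error test against the decoder's reference frame (the block size and threshold below are hypothetical; the thesis applies the idea to JPEG 2000 subband data and refines retained blocks with parity bits):

```python
import numpy as np

def replenishment_mask(current, reference, block=16, thresh=8.0):
    """Return a boolean grid: True where a block must be re-sent."""
    h, w = current.shape
    mask = np.zeros((h // block, w // block), dtype=bool)
    for by in range(h // block):
        for bx in range(w // block):
            sl = np.s_[by*block:(by+1)*block, bx*block:(bx+1)*block]
            mse = np.mean((current[sl].astype(float)
                           - reference[sl].astype(float)) ** 2)
            mask[by, bx] = mse > thresh   # drifted block -> replenish
    return mask

rng = np.random.default_rng(0)
ref = rng.integers(0, 256, (64, 64))
cur = ref.copy()
cur[0:16, 0:16] += 50                     # simulate motion in one block
print(replenishment_mask(cur, ref).sum(), "block(s) replenished")
```

Blocks that pass the test are simply kept from the decoder's reference, which is what makes the scheme cheap at the server: no re-encoding is needed for unchanged content.
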
104

THE NILES-MERTON SONGS: A PERFORMANCE GUIDE OF SELECTED SONGS

Benningfield, Brittany C. 01 January 2018 (has links)
The Niles-Merton Songs is a two-opus collection of twenty-two songs by John Jacob Niles setting the poetry of Thomas Merton. The songs are beautiful but contrast with his more popular works in both poetry and composition. This project explores reasons why these songs are rarely performed and gives an overview and analysis of ten selected pieces. The document includes a brief introduction, biographies of Niles and Merton, information detailing Niles’s compositional process, and the technical and artistic requirements for performing the songs, including appropriate age and voice type.
105

Perceptually Lossless Coding of Medical Images - From Abstraction to Reality

Wu, David, dwu8@optusnet.com.au January 2007 (has links)
This work explores a novel vision-model-based coding approach to encode medical images at a perceptually lossless quality, within the framework of the JPEG 2000 coding engine. Perceptually lossless encoding offers the best of both worlds, delivering images free of visual distortions while providing significantly greater compression ratio gains over its information-lossless counterparts. This is achieved through a visual pruning function, embedded with an advanced model of the human visual system, that accurately identifies and efficiently removes visually irrelevant or insignificant information. In addition, it maintains bit-stream compliance with the JPEG 2000 coding framework and is consequently compliant with the Digital Imaging and Communications in Medicine (DICOM) standard. Equally, the pruning function is applicable to other Discrete Wavelet Transform based image coders, e.g., Set Partitioning in Hierarchical Trees (SPIHT). Further significant coding gains are exploited through an artificial edge segmentation algorithm and a novel arithmetic pruning algorithm. The coding effectiveness and qualitative consistency of the algorithm were evaluated through a double-blind subjective assessment with 31 medical experts, performed using a novel two-stage forced-choice assessment devised for medical experts, offering the benefits of greater robustness and accuracy in measuring subjective responses. The assessment showed that no differences of statistical significance were perceivable between the original images and the images encoded by the proposed coder.
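
To illustrate the pruning step in miniature, the sketch below applies a one-level Haar transform and zeroes detail coefficients whose magnitude falls under a per-subband visibility threshold. The thresholds here are invented stand-ins; the thesis derives them from a human visual system model and applies them inside the JPEG 2000 wavelet pipeline.

```python
import numpy as np

def haar_level(x):
    """One-level separable Haar transform: returns LL, LH, HL, HH."""
    lo, hi = (x[:, 0::2] + x[:, 1::2]) / 2, (x[:, 0::2] - x[:, 1::2]) / 2
    return ((lo[0::2] + lo[1::2]) / 2, (lo[0::2] - lo[1::2]) / 2,
            (hi[0::2] + hi[1::2]) / 2, (hi[0::2] - hi[1::2]) / 2)

def inverse_haar(ll, lh, hl, hh):
    """Exact inverse of haar_level."""
    lo = np.empty((ll.shape[0] * 2, ll.shape[1]))
    lo[0::2], lo[1::2] = ll + lh, ll - lh
    hi = np.empty_like(lo)
    hi[0::2], hi[1::2] = hl + hh, hl - hh
    x = np.empty((lo.shape[0], lo.shape[1] * 2))
    x[:, 0::2], x[:, 1::2] = lo + hi, lo - hi
    return x

def visually_prune(img, t_lh=2.0, t_hl=2.0, t_hh=4.0):
    """Zero detail coefficients below (hypothetical) visibility thresholds."""
    ll, lh, hl, hh = haar_level(img.astype(float))
    for band, t in ((lh, t_lh), (hl, t_hl), (hh, t_hh)):
        band[np.abs(band) < t] = 0.0   # invisible detail -> cheaper to code
    return inverse_haar(ll, lh, hl, hh)

rng = np.random.default_rng(7)
img = rng.integers(0, 256, (32, 32)).astype(float)
out = visually_prune(img)
mse = max(np.mean((img - out) ** 2), 1e-12)
print(f"PSNR after pruning: {10 * np.log10(255**2 / mse):.1f} dB")
```

Zeroed coefficients cost almost nothing downstream in the entropy coder, which is where the compression gain over information-lossless coding comes from.
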
106

A study of CABAC hardware acceleration with configurability in multi-standard media processing / En studie i konfigurerbar hårdvaruaccelerering för CABAC i flerstandards mediabearbetning

Flordal, Oskar January 2005 (has links)
To achieve greater compression ratios, new video and image CODECs like H.264 and JPEG 2000 take advantage of context adaptive binary arithmetic coding (CABAC). As it contains computationally heavy algorithms, fast implementations have to be made when they operate on large amounts of data, such as when compressing high-resolution formats like HDTV. This document describes how entropy coding works in general, with a focus on arithmetic coding and CABAC. Furthermore, the document discusses the demands of the different CABACs and proposes options for hardware and instruction-level optimisation. Testing and benchmarking of these implementations are done to ease evaluation. The main contribution of the thesis is parallelising and unifying the CABACs, which is discussed and partly implemented. The result of the ILA is improved program flow through specialised branching operations. The result of the DHA is a two-bit-parallel accelerator with hardware sharing between the JPEG 2000 and H.264 encoders, with limited decoding support.
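
For context, the sketch below isolates the "context adaptive" ingredient of CABAC: each context keeps its own running probability estimate, and every bit is charged its ideal code length under its context's estimate before the estimate adapts. The arithmetic coding engine itself (interval subdivision and renormalisation), which is what the thesis accelerates, is deliberately omitted, and the context rule and test stream are toy choices.

```python
import math
from collections import defaultdict

class Context:
    """Adaptive probability estimate for one context (Laplace-smoothed)."""
    def __init__(self):
        self.ones, self.total = 1, 2

    def cost(self, bit):
        p1 = self.ones / self.total          # current estimate of P(bit = 1)
        return -math.log2(p1 if bit else 1.0 - p1)

    def update(self, bit):
        self.ones += bit
        self.total += 1

def code_length(bits, ctx_of):
    """Ideal total code length when each bit is coded in its own context."""
    ctxs = defaultdict(Context)
    total, prev = 0.0, 0
    for b in bits:
        c = ctxs[ctx_of(prev)]               # context chosen from neighbourhood
        total += c.cost(b)                   # charge -log2 p under the estimate
        c.update(b)                          # then adapt the estimate
        prev = b
    return total

bits = [1] * 40 + [0] * 40 + [1, 0] * 20    # highly structured toy stream
print(f"previous-bit contexts: {code_length(bits, lambda p: p):6.1f} bits")
print(f"single context:        {code_length(bits, lambda p: 0):6.1f} bits")
```

The serial feedback loop visible here (each bit's cost depends on all earlier bits in its context) is precisely what makes CABAC hard to parallelise and motivates the hardware work in the thesis.
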
107

Compression d'images dans les réseaux de capteurs sans fil / Image compression in wireless sensor networks

Makkaoui, Leila 26 November 2012 (has links) (PDF)
This thesis contributes to the problem of energy conservation in the particular case of image sensor networks, where some or all of the network nodes are equipped with a small CMOS camera. Images involve data volumes far larger than classical scalar measurements such as temperature, and therefore higher energy expenditure. Since the radio transmitter is one of the most energy-hungry components, compressing the image at the source can significantly reduce the energy spent on its transmission, both at the camera node and at the nodes forming the path to the collection point. However, well-known compression methods (JPEG, JPEG2000, SPIHT) are poorly suited to the limited computation and memory resources characteristic of sensor nodes. On some hardware platforms, these algorithms even have an energy cost greater than the savings they bring to transmission; in other words, the camera node drains its battery faster by sending compressed images than uncompressed ones. The complexity of the compression algorithm is therefore a performance criterion as important as the rate-distortion trade-off. The contributions of this thesis are threefold. First, we proposed a reduced-complexity compression algorithm based on the 8-point discrete cosine transform (DCT), combining the most efficient fast DCT in the literature (the Cordic-Loeffler DCT) with a computation restricted to the coefficients inside a square zone of size k<8, those that matter most for visual reconstruction. With this zonal approach, the number of coefficients to compute, but also to quantise and encode, per 8x8 pixel block is reduced to k^2 instead of 64, which mechanically lowers the cost of compression. Second, we studied the impact of k, and hence of the number of selected coefficients, on the quality of the final image. The study was carried out on a set of about sixty reference images, with image quality evaluated using several metrics: PSNR, PSNR-HVS and MSSIM. The results were used to identify, for a given bit-rate, the limit value of k that can be chosen (statistically) without perceptible quality degradation, and consequently the bounds on the energy savings achievable at constant rate and quality. Finally, we report performance results obtained from experiments on a real platform composed of a Mica2 node and a Cyclops camera, to demonstrate the validity of our proposals. In a scenario with 128x128-pixel images encoded at 0.5 bpp, for example, the energy expenditure of the camera node (including compression and transmission) is divided by 6 compared with the uncompressed case, and by 2 compared with the standard JPEG algorithm.
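
A sketch of the zonal DCT idea in Python, keeping only the k x k low-frequency corner of each 8x8 block's transform: this version computes the full DCT for clarity, whereas the thesis's pruned Cordic-Loeffler transform never computes the discarded coefficients in the first place.

```python
import numpy as np
from scipy.fftpack import dct, idct

def zonal_dct_block(block, k=4):
    """2-D DCT of an 8x8 block, keeping only the k x k low-frequency zone."""
    coeffs = dct(dct(block, axis=0, norm="ortho"), axis=1, norm="ortho")
    zone = np.zeros_like(coeffs)
    zone[:k, :k] = coeffs[:k, :k]     # k^2 coefficients survive, not 64
    return zone

def reconstruct_block(zone):
    """Inverse 2-D DCT."""
    return idct(idct(zone, axis=0, norm="ortho"), axis=1, norm="ortho")

rng = np.random.default_rng(1)
block = rng.integers(0, 256, (8, 8)).astype(float)
rec = reconstruct_block(zonal_dct_block(block, k=4))
mse = np.mean((block - rec) ** 2)
print(f"MSE with k=4 (16 of 64 coefficients kept): {mse:.1f}")
```

The energy saving follows directly from the count: computation, quantisation and entropy coding all scale with the k^2 retained coefficients rather than the full 64.
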
108

Importance Prioritised Image Coding in JPEG 2000

Nguyen, Anthony Ngoc January 2005 (has links)
Importance prioritised coding is a principle aimed at improving the interpretability (or image content recognition) versus bit-rate performance of image coding systems. This can be achieved by (1) detecting and tracking image content or regions of interest (ROIs) that are crucial to the interpretation of an image, and (2) compressing them in such a manner that ROIs are encoded with higher fidelity and prioritised for dissemination or transmission. Traditional image coding systems prioritise image data according to an objective measure of distortion, and this measure does not correlate well with image quality or interpretability. Importance prioritised coding, on the other hand, aims to prioritise image contents according to an ‘importance map’, which provides a means for modelling and quantifying the relative importance of parts of an image. In such a coding scheme the importance of parts of an image containing ROIs would be higher than that of other parts, and the encoding and prioritisation of ROIs means that interpretability in these regions is improved at low bit-rates. An importance prioritised image coder incorporated within the JPEG 2000 international standard for image coding, called IMP-J2K, is proposed to encode and prioritise ROIs according to an ‘importance map’. The map can be generated automatically using image processing algorithms that produce a limited number of ROIs, or constructed manually by hand-marking ROIs using a priori knowledge. The proposed importance prioritised coder gives the user of the encoder great flexibility in defining single or multiple ROIs with arbitrary degrees of importance and prioritising them using IMP-J2K. Furthermore, IMP-J2K codestreams can be reconstructed by generic JPEG 2000 decoders, which is important for interoperability between imaging systems and processes. The interpretability performance of IMP-J2K was quantitatively assessed using the subjective National Imagery Interpretability Rating Scale (NIIRS). The effect of importance prioritisation on image interpretability was investigated, and a methodology relating NIIRS ratings, ROI importance scores and bit-rates was proposed to facilitate NIIRS specifications for importance prioritised coding. In addition, a technique is proposed to construct an importance map from the gaze patterns of a user of the encoder, automatically determining and assigning importance to fixated regions (ROIs) in an image. The importance map can be used by IMP-J2K to bias the encoding of the image towards these ROIs, subsequently allowing a user at the receiver to reconstruct the image as intended by the user of the encoder. Ultimately, with the advancement of automated importance mapping techniques that can reliably predict regions of visual attention, IMP-J2K may play a significant role in matching an image coding scheme to the human visual system.
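
Stripped to its core, the prioritisation principle is a change of sort key, as the toy comparison below shows: coding units are ordered by importance-weighted distortion reduction rather than raw distortion reduction, so ROI data moves to the front of the codestream. The unit names and numbers are hypothetical; in IMP-J2K the units are JPEG 2000 codeblock coding passes.

```python
# Each unit: (identifier, distortion reduction if included, region importance).
# All values below are invented for illustration.
units = [
    ("background-1", 90.0, 0.2),
    ("roi-face",     60.0, 1.0),
    ("background-2", 40.0, 0.2),
    ("roi-plate",    30.0, 0.9),
]

# Traditional rate-distortion ordering: raw distortion reduction only.
by_distortion = sorted(units, key=lambda u: -u[1])
# Importance prioritised ordering: distortion reduction weighted by the map.
by_importance = sorted(units, key=lambda u: -(u[1] * u[2]))

print("plain ordering:     ", [u[0] for u in by_distortion])
print("importance-weighted:", [u[0] for u in by_importance])
```

Truncating the codestream at any point then keeps the most important ROI data first, which is why interpretability improves at low bit-rates while generic JPEG 2000 decoders can still parse the result.
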
109

Securing digital images

Kailasanathan, Chandrapal. January 2003 (has links)
Thesis (Ph.D.)--University of Wollongong, 2003. / Typescript. Includes bibliographical references: leaves 191-198.
110

A study of image compression techniques, with specific focus on weighted finite automata

Muller, Rikus 12 1900 (has links)
Thesis (MSc (Mathematical Sciences))--University of Stellenbosch, 2005. / Image compression using weighted finite automata (WFA) is studied and implemented in Matlab. Other more prominent image compression techniques, namely JPEG, vector quantization, EZW wavelet image compression and fractal image compression, are also presented. The performance of WFA image compression is then compared to that of some of the abovementioned techniques.
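
For a flavour of the WFA representation itself: a quadrant address string w = q1...qn selects a subsquare of the image, and its average intensity is the matrix product I · A_q1 ··· A_qn · F over a small set of states. The hand-built two-state automaton below reproduces the horizontal ramp f(x, y) = x exactly; it is a textbook-style construction for illustration, not an automaton inferred by the coder studied in the thesis.

```python
import numpy as np

I = np.array([[1.0, 0.0]])        # initial weights (row vector)
F = np.array([[0.5], [1.0]])      # per-state average over the whole square
# Quadrants are numbered q = 2*y_bit + x_bit. State 0 computes f(x,y) = x,
# state 1 the constant 1; moving into a half maps x to x/2 (+ 1/2 on the right).
A_LEFT  = np.array([[0.5, 0.0], [0.0, 1.0]])   # x_bit = 0
A_RIGHT = np.array([[0.5, 0.5], [0.0, 1.0]])   # x_bit = 1
A = [A_LEFT, A_RIGHT, A_LEFT, A_RIGHT]

def wfa_value(address):
    """Average intensity of the subsquare addressed by quadrant digits."""
    v = I
    for q in address:
        v = v @ A[q]
    return (v @ F).item()

def render(depth=3):
    """Rasterise the automaton at 2^depth x 2^depth resolution."""
    n = 2 ** depth
    img = np.zeros((n, n))
    for y in range(n):
        for x in range(n):
            addr = [(((y >> (depth-1-k)) & 1) << 1) | ((x >> (depth-1-k)) & 1)
                    for k in range(depth)]
            img[y, x] = wfa_value(addr)
    return img

print(render(depth=3)[0])  # first row: pixel-centre ramp 1/16, 3/16, ..., 15/16
```

A WFA coder works in the opposite direction: it infers states and transition weights from the image so that smooth, self-similar regions are captured by very few matrix entries.
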
