  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Image Information Distance Analysis and Applications

Nikvand, Nima January 2014 (has links)
Image similarity or distortion assessment is fundamental to a broad range of applications throughout the field of image processing and machine vision. These include image restoration, denoising, coding, communication, interpolation, registration, fusion, classification and retrieval, as well as object detection, recognition, and tracking. Many existing image similarity measures have been proposed to work with specific types of image distortions (e.g., JPEG compression). There are also methods such as the structural similarity (SSIM) index that are applicable to a wider range of applications. However, even these "general-purpose" methods offer limited scope in their applications. For example, SSIM does not apply or work properly when significant geometric changes exist between the two images being compared. The theory of Kolmogorov complexity provides solid groundwork for a generic information distance metric between any two objects that minorizes all metrics in the class. The Normalized Information Distance (NID) metric provides a more useful framework. While appealing, the challenge lies in the implementation, mainly due to the non-computable nature of Kolmogorov complexity. To overcome this, the Normalized Compression Distance (NCD) measure was proposed as an effective approximation of NID, and it has found successful applications in the fields of bioinformatics, pattern recognition, and natural language processing. Nevertheless, the application of NID to image similarity and distortion analysis is still at an early stage. Several authors have applied the NID framework and the NCD algorithm to image clustering, image distinguishability, content-based image retrieval and video classification problems, but most report only moderate success. Moreover, due to their focus on specific applications, the generic property of NID was not fully exploited.
In this work, we aim to develop practical solutions for image distortion analysis based on the information distance framework. In particular, we propose two practical approaches to approximate NID for image similarity and distortion analysis. In the first approach, the shortest program that converts one image to another is found from a list of available transformations, and a generic image similarity measure is built by computing the length of this shortest program as an approximation of the conditional Kolmogorov complexity in NID. In the second method, the complexity of the objects is approximated using Shannon entropy. Specifically, we transform the reference and distorted images into the wavelet domain and assume local independence among image subbands. Inspired by the Visual Information Fidelity (VIF) approach, the Gaussian Scale Mixture (GSM) model is adopted for the Natural Scene Statistics (NSS) of the images to simplify the entropy computation. When applying the image information distance framework in real-world applications, we find that information distance measures often lead to useful features in many image processing applications. In particular, we develop a photo retouching distortion measure by training a Gaussian kernel Support Vector Regression (SVR) model on information-theoretic features extracted from a database of original and edited images. It is shown that the proposed measure is well correlated with subjective ranking of the images. Moreover, we propose a tone mapping operator parameter selection scheme for High Dynamic Range (HDR) images. The scheme attempts to find tone mapping parameters that minimize the NID between the HDR image and the resulting Low Dynamic Range (LDR) image, and thereby minimize the information loss in HDR-to-LDR tone mapping. The resulting images created by minimizing NID exhibit enhanced image quality.
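The NCD approximation mentioned above replaces the non-computable Kolmogorov complexity C(x) with the length of a compressed representation. A minimal sketch follows, using Python's zlib as the compressor; the choice of compressor and the test strings are illustrative assumptions, not taken from the thesis, which builds its own approximations for images.

```python
import zlib

def c(data: bytes) -> int:
    """Approximate Kolmogorov complexity by compressed length."""
    return len(zlib.compress(data, 9))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized Compression Distance:
    NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y))."""
    cx, cy = c(x), c(y)
    return (c(x + y) - min(cx, cy)) / max(cx, cy)

# Illustrative strings: s1 and s2 differ by one word; s3 is unrelated.
s1 = b"the quick brown fox jumps over the lazy dog " * 10
s2 = b"the quick brown fox jumps over the lazy cat " * 10
s3 = b"0123456789abcdefghijklmnopqrstuvwxyz!@#$%^&* " * 10
```

In practice the quality of the approximation depends on how well the compressor captures the redundancy between the two objects, which is exactly why generic compressors fall short on geometrically transformed images.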
2

Analysis of integrated transcriptomics and metabolomics data : a systems biology approach

Daub, Carsten Oliver January 2004 (has links)
Recent high-throughput technologies enable the acquisition of a variety of complementary data and imply regulatory networks on the systems biology level. A common approach to the reconstruction of such networks is cluster analysis, which is based on a similarity measure. We use the information-theoretic concept of mutual information, originally defined for discrete data, as a measure of similarity and propose an extension to a commonly applied algorithm for its calculation from continuous biological data. We compare our approach to previously existing algorithms. We develop a performance-optimised software package for the application of mutual information to large-scale datasets. Furthermore, we design and implement a web-based service for the analysis of integrated data measured with different technologies. Application to biological data reveals biologically relevant groupings, and reconstructed signalling networks show agreement with physiological findings.
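Mutual information for continuous measurements is commonly estimated by discretizing the samples and applying the discrete definition. A minimal sketch with plain equal-width binning is shown below; this is only the baseline estimator, not the extended algorithm the thesis proposes for continuous biological data.

```python
import math
from collections import Counter

def mutual_information(xs, ys, bins=8):
    """Histogram estimate of I(X;Y) in bits for paired continuous samples.

    Plain equal-width binning; a baseline only, subject to binning bias.
    """
    def discretize(vs):
        lo, hi = min(vs), max(vs)
        width = (hi - lo) / bins or 1.0  # guard against constant input
        return [min(int((v - lo) / width), bins - 1) for v in vs]

    bx, by = discretize(xs), discretize(ys)
    n = len(xs)
    px, py, pxy = Counter(bx), Counter(by), Counter(zip(bx, by))
    # I(X;Y) = sum over joint bins of p(x,y) * log2(p(x,y) / (p(x) p(y)))
    return sum(
        (c / n) * math.log2((c / n) / ((px[i] / n) * (py[j] / n)))
        for (i, j), c in pxy.items()
    )

# A strongly dependent pair carries more mutual information than a scrambled one.
xs = [i / 200 for i in range(200)]
dependent = mutual_information(xs, xs)
scrambled = mutual_information(xs, [((i * 73) % 200) / 200 for i in range(200)])
```

The histogram estimator is sensitive to the number of bins and biased upward for small samples, which motivates the smoother estimation scheme developed in the thesis.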
3

Motor interference and behaviour adaptation in human-humanoid interactions

Shen, Qiming January 2013 (has links)
This thesis proposes and experimentally demonstrates an approach enabling a humanoid robot to adapt its behaviour to match a human’s behaviour in real-time human-humanoid interaction. The approach uses the information distance synchrony detection method, a novel method for measuring the behaviour synchrony between two agents, as the core of the behaviour adaptation mechanism that guides the humanoid robot in changing its behaviour during the interaction. Feedback from the participants indicated that applying this behaviour adaptation mechanism could facilitate human-humanoid interaction. The investigation of motor interference, which may be adopted as a possible metric to quantify the social competence of a robot, is also presented in this thesis. The results from two experiments indicated that both human participants’ beliefs about the engagement of the robot and the use of rhythmic music might affect the elicitation of motor interference effects. Based on these findings and on recent research supporting the importance of other features in eliciting the interference effects, it can be hypothesized that the overall perception of a humanoid robot as a social entity, rather than any individual feature of the robot, is critical to eliciting motor interference in a human observer’s behaviour. In this thesis, the term ‘overall perception’ refers to the human observer’s overall perception of the robot in terms of appearance, behaviour, the observer’s belief, and environmental features that may affect the perception. Moreover, it was found in the motor coordination investigation that humans tended to synchronize themselves with a humanoid robot without being instructed to do so. This finding, together with the behaviour adaptation mechanism, may support the feasibility of bi-directional motor coordination in human-humanoid interaction.
4

Normalized social distance

Šlerka, Josef January 2019 (has links)
This dissertation thesis applies the concept of information distance to social network data analysis. We consider these data as recorded acts of social action; as such, they express certain attitudes, values and intentions. We introduce a formula for calculating the Normalized Social Distance and, through a series of case studies, demonstrate the usefulness and validity of this approach. The application of formal mathematical and computer science techniques to massive data records of human action in social network environments is enabled by the change brought by new media and the associated technological advancement. This change is accompanied by a gradual transition of research methods in the humanities, referred to as the onset of the digital humanities. This approach is characterized by the application of quantitative methods in the humanities and the discovery of new data areas useful for analysis. In the case of social media data, the differentiation between quantitative and qualitative methods is no longer valid. This thesis is itself a good example: information theory is used to combine the methods of traditional social network analysis with Goffman's frame analysis of human action. Keywords: Information distance, Normalized Social Distance, Kolmogorov...
5

Counting prime polynomials and measuring complexity and similarity of information

Rebenich, Niko 02 May 2016 (has links)
This dissertation explores an analogue of the prime number theorem for polynomials over finite fields, as well as its connection to the necklace factorization algorithm (the T-transform) and the string complexity measure T-complexity. Specifically, a precise asymptotic expansion for the prime polynomial counting function is derived. The approximation given is more accurate than previous results in the literature while requiring very little computational effort. In this context, asymptotic series expansions for the Lerch transcendent, Eulerian polynomials, the truncated polylogarithm, and polylogarithms of negative integer order are also provided. The expansion formulas developed are general and have applications in numerous areas beyond the enumeration of prime polynomials. A bijection between the equivalence classes of aperiodic necklaces and monic prime polynomials is used to derive an asymptotic bound on the maximal T-complexity value of a string. Furthermore, the statistical behaviour of uniform random sequences factored via the T-transform is investigated, and an accurate probabilistic model for short necklace factors is presented. Finally, a T-complexity-based conditional string complexity measure is proposed and used to define the normalized T-complexity distance, which measures similarity between strings. The T-complexity distance is proven not to be a metric; however, it can be computed in linear time and space, making it a suitable choice for large data sets.
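The bijection between aperiodic necklaces and monic prime polynomials underlies the exact counting formula that the thesis's asymptotic expansion refines: the number of monic irreducible polynomials of degree n over F_q is N_q(n) = (1/n) Σ_{d|n} μ(d) q^(n/d), where μ is the Möbius function. A short sketch of the exact count (the asymptotic expansion itself is not reproduced here):

```python
def mobius(n: int) -> int:
    """Mobius function: 0 if n has a squared prime factor,
    otherwise (-1)^(number of distinct prime factors)."""
    if n == 1:
        return 1
    result, p = 1, 2
    while p * p <= n:
        if n % p == 0:
            n //= p
            if n % p == 0:
                return 0  # squared prime factor
            result = -result
        p += 1
    if n > 1:          # one remaining prime factor
        result = -result
    return result

def prime_polys(q: int, n: int) -> int:
    """Exact count of monic irreducible polynomials of degree n over F_q:
    N_q(n) = (1/n) * sum over d | n of mobius(d) * q^(n/d).
    Equals the number of aperiodic necklaces of length n over q symbols."""
    total = sum(mobius(d) * q ** (n // d) for d in range(1, n + 1) if n % d == 0)
    return total // n  # the sum is always divisible by n
```

For example, over F_2 there are exactly 3 monic irreducible polynomials of degree 4, since (2^4 - 2^2)/4 = 3; the divisor-sum formula grows like q^n / n, which is the leading term the asymptotic expansion sharpens.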
6

Fast Low Memory T-Transform: string complexity in linear time and space with applications to Android app store security.

Rebenich, Niko 27 April 2012 (has links)
This thesis presents flott, the Fast Low Memory T-Transform, currently the fastest and most memory-efficient linear time and space algorithm available to compute the string complexity measure T-complexity. The flott algorithm uses 64.3% less memory and, in our experiments, runs asymptotically 20% faster than its predecessor. A full C implementation is provided and published under the Apache License 2.0. From the flott algorithm, two deterministic information measures are derived and applied to Android app store security: the normalized T-complexity distance and the instantaneous T-complexity rate, which are used to detect, locate, and visualize unusual information changes in Android applications. The information measures introduced present a novel, scalable approach to assist with the detection of malware in app stores.
