  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
71

Holographic studies of thermalization and dissipation in strongly coupled theories

Tangarife García, Walter Orlando 18 September 2014 (has links)
This thesis presents a series of studies of thermalization and dissipation in a variety of strongly coupled systems. The main tool for these investigations is the gauge/gravity duality, which establishes a correspondence between a (d+1)-dimensional quantum theory of gravity and a d-dimensional quantum field theory. We study the decay rates of fluctuations around thermal equilibrium in theories on non-commutative geometry. Rapid thermalization of such fluctuations is found, motivating the conjecture that the phenomena at the black hole horizon are described by non-local physics. In the same type of environment, we analyze the Langevin dynamics of a heavy quark, which undergoes Brownian motion. We find that the late-time behavior of the displacement squared is unaffected by the non-commutativity of the geometry. In a different scenario, we study correlation functions in theories with quantum critical points. We compute the response of these quantum critical points to a disturbance caused by a massive charged particle and analyze its late-time behavior. Finally, we analyze systems far from equilibrium as they evolve towards a thermal state. We characterize this evolution for systems with chemical potential by focusing on the "strong subadditivity" property of their entanglement entropy. This is achieved on the gravity side by using time-dependent functions for mass and charge in an AdS-Vaidya metric.
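The Brownian motion of a heavy quark mentioned in this abstract can be illustrated with a minimal overdamped Langevin simulation. This is an illustrative sketch, not code from the thesis: it uses the standard flat-space diffusion constant D = T/γ and shows the late-time diffusive growth of the squared displacement, ⟨x²⟩ ≈ 2Dt.

```python
import math
import random

random.seed(0)

def langevin_msd(n_steps=5000, n_walkers=100, gamma=1.0, temperature=1.0, dt=0.01):
    """Overdamped Langevin dynamics with no external force:
    x(t + dt) = x(t) + sqrt(2*(T/gamma)*dt) * gaussian_noise.
    Returns the mean squared displacement at the final time,
    averaged over n_walkers independent trajectories."""
    sigma = math.sqrt(2.0 * (temperature / gamma) * dt)
    msd = 0.0
    for _ in range(n_walkers):
        x = 0.0
        for _ in range(n_steps):
            x += sigma * random.gauss(0.0, 1.0)
        msd += x * x
    return msd / n_walkers

# Late-time diffusive behavior: <x^2> ~ 2*D*t with D = T/gamma = 1,
# so at t = 5000 * 0.01 = 50 the result fluctuates around 100.
print(langevin_msd())
```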
72

Image and video coding for noisy channels

Redmill, David Wallace January 1994 (has links)
No description available.
73

Some general properties of entropy for homogeneous systems

Kay, Amanda R. January 2000 (has links)
No description available.
74

Development and applications of high performance computing

Cox, Simon J. January 1998 (has links)
No description available.
75

Computer aided process design : the design of a distillation train and its control system

Hagemann, Johannes Franz January 1998 (has links)
No description available.
76

Randomness from space

Justamante, David 03 1900 (has links)
Approved for public release; distribution is unlimited / Includes supplementary material / Reissued 30 May 2017 with correction to degree on title page. / Randomness is at the heart of today's computing. There are two categorical methods to generate random numbers: pseudorandom number generation (PRNG) methods and true random number generation (TRNG) methods. While PRNGs operate orders of magnitude faster than TRNGs, the strength of PRNGs lies in their initial seed; TRNGs can function to generate such a seed. This thesis focuses on studying the feasibility of using the next-generation Naval Postgraduate School Femto Satellite (NPSFS) as a TRNG. The hardware for the next generation will come from the Intel Quark D2000 along with its onboard BMC150 6-axis eCompass. We simulated 3-dimensional motion to see if any raw data from the BMC150 could be used as an entropy source for random number generation. We studied various "schemes" for selecting and outputting specific data bits to determine if more entropy and an increased bitrate could be reached. Data collected in this thesis suggest that the BMC150 contains certain bits that could be considered good sources of entropy. Various schemes further utilized these bits to yield a strong entropy source with a higher bitrate. We propose that the NPSFS be studied further to find other sources of entropy, and that a prototype be sent into space for experimental verification of these results. / Lieutenant, United States Navy
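The thesis's specific bit-selection schemes are not reproduced in this abstract. As one classical example of the kind of post-processing such schemes involve, the von Neumann extractor debiases a raw sensor bit stream at the cost of bitrate (an illustrative sketch, not the scheme studied in the thesis):

```python
import random

def von_neumann_extract(bits):
    """Debias a bit stream: read non-overlapping pairs, map
    (0,1) -> 0 and (1,0) -> 1, and discard (0,0) and (1,1).
    Output bits are unbiased if input bits are independent."""
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(a)
    return out

# Simulated biased raw sensor bits: about 75% ones.
random.seed(1)
raw = [1 if random.random() < 0.75 else 0 for _ in range(10000)]
clean = von_neumann_extract(raw)
print(sum(raw) / len(raw))      # close to 0.75 (biased input)
print(sum(clean) / len(clean))  # close to 0.5 (debiased output)
```

The price of the debiasing is throughput: with bias p, only 2p(1-p) of the input pairs yield an output bit, which mirrors the entropy-versus-bitrate trade-off the abstract describes.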
77

Surrender to the Spinning: poems

Miner, Lauren A 01 January 2015 (has links)
This collection of poems explores themes of time and space, energy, entropy and decay, and the frames we use to resist the inevitable trend toward disorder that defines a human experience of the observable universe.
78

Generování náhodných dat z biometrických vzorků / Generating random data from biometric samples

Sachová, Romana January 2011 (has links)
Title: Generating random data from biometric samples Author: Bc. Romana Sachová Department: Department of Algebra Supervisor: Ing. Mgr. Zdeněk Říha, Ph.D. Supervisor's e-mail address: zriha@fi.muni.cz Abstract: This thesis aims to generate random data from biometric samples. It studies biometric characteristics, randomness, and the generation of random data suitable for cryptography, as well as the variability of fingerprint, iris, face, and human voice. The practical part tests the variability of 200 prints of the same finger using three factors: 1) The coordinates of fingerprint cores. Due to the repeatability of the coordinates, the obtained entropy was low. 2) Fingerprint area approximation. The diversity of all areas could be verified; the maximum available entropy remains around 15 bits. 3) Ridge-line distortion. Boxes containing part of a ridge line were taken from the core to the top of the fingerprint. For each box, the average phase angle of the gradient, which represents the change of intensity in the box, was calculated. The vector of phase angles describes the ridge-line distortion. The maximum entropy of these vectors was estimated at 71.586 bits. Keywords: biometry, randomness, entropy
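Entropy figures like those quoted above are commonly obtained with a plug-in estimator over quantized samples. The sketch below is illustrative and assumes uniformly distributed phase-angle bins; it is not the thesis's exact estimation procedure:

```python
import math
from collections import Counter

def shannon_entropy(samples):
    """Plug-in estimate of Shannon entropy in bits:
    H = -sum_k p_k * log2(p_k), with p_k the empirical
    frequency of symbol k among the discrete samples."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Phase angles quantized into 8 bins; a uniform 8-symbol source
# carries exactly log2(8) = 3 bits per sample.
angles = [i % 8 for i in range(800)]
print(shannon_entropy(angles))  # 3.0
```

A constant source, by contrast, yields 0 bits, which is why the repeatable core coordinates in factor 1) gave low entropy.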
79

Entanglement and quantum communication complexity.

07 December 2007 (has links)
Keywords: entanglement, complexity, entropy, measurement In chapter 1 the basic principles of communication complexity are introduced. Two-party communication is described explicitly, and multi-party communication complexity is described in terms of the two-party communication complexity model. The relation to entropy is described for the classical communication model. Important concepts from quantum mechanics are introduced. More advanced concepts, for example the generalized measurement, are then presented in detail. In chapter 2 the different measures of entanglement are described in detail, and concrete examples are provided. Measures for both pure states and mixed states are described in detail. Some results for the Schmidt decomposition are derived for applications in communication complexity. The Schmidt decomposition is fundamental in quantum communication and computation, and thus is presented in considerable detail. Important concepts such as positive maps and entanglement witnesses are discussed with examples. Finally, in chapter 3, the communication complexity model for quantum communication is described. A number of examples are presented to illustrate the advantages of quantum communication in the communication complexity scenario. This includes communication by teleportation, and dense coding using entanglement. A few problems, such as the Deutsch-Jozsa problem, are worked out in detail to illustrate the advantages of quantum communication. The communication complexity of sampling establishes some relationships between communication complexity, the Schmidt rank and entropy. The last topic is coherent communication complexity, which places communication complexity completely in the domain of quantum computation. An important lower bound for the coherent communication complexity in terms of the Schmidt rank is derived. This result is the quantum analogue to the log rank lower bound in classical communication complexity. / Prof. W.H. Steeb
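The Schmidt decomposition central to this abstract can be illustrated for a two-qubit pure state, where the Schmidt coefficients are the singular values of the 2x2 amplitude matrix and the entropy of entanglement follows from their squares. This is an illustrative sketch (function names are ours) using a closed-form 2x2 singular value computation:

```python
import math

def schmidt_coefficients_2x2(amp):
    """Schmidt coefficients of a two-qubit pure state with real
    amplitudes: amp[i][j] is the amplitude of |i>|j>. The Schmidt
    coefficients are the singular values of this 2x2 matrix,
    computed here in closed form from trace(M M^T) and det(M)."""
    a, b = amp[0]
    c, d = amp[1]
    t = a * a + b * b + c * c + d * d  # trace of M M^T = s1^2 + s2^2
    det = a * d - b * c                # det M = +/- s1 * s2
    disc = math.sqrt(max(t * t - 4.0 * det * det, 0.0))
    s1 = math.sqrt((t + disc) / 2.0)
    s2 = math.sqrt(max((t - disc) / 2.0, 0.0))
    return s1, s2

def entanglement_entropy(s1, s2):
    """Entropy of entanglement in bits: -sum p*log2(p), p = s^2."""
    h = 0.0
    for s in (s1, s2):
        p = s * s
        if p > 1e-12:
            h -= p * math.log2(p)
    return h

# Bell state (|00> + |11>)/sqrt(2): Schmidt rank 2, entropy 1 bit.
r = 1.0 / math.sqrt(2.0)
s1, s2 = schmidt_coefficients_2x2([[r, 0.0], [0.0, r]])
print(entanglement_entropy(s1, s2))
```

A product state such as |0⟩|0⟩ has a single nonzero Schmidt coefficient and zero entropy, which is the Schmidt-rank distinction behind the lower bounds the abstract mentions.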
80

Utilisation de la notion de copule en tomographie / Using the notion of copula in tomography

Pougaza, Doriano-Boris 16 December 2011 (has links)
This thesis studies the relationship between computed tomography (CT) and the notion of copula. In X-ray tomography the objective is to (re)construct an image representing the distribution of a physical quantity (a density of matter) inside an object from the radiographs obtained all around it, called projections. The link between these images and the object is described by the X-ray transform or the Radon transform. In 2D, when only two projections at the angles 0 and pi/2 (horizontal and vertical) are available, the problem can be identified with another important problem in mathematics: determining a joint density from its marginals, hence the notion of copula via Sklar's theorem. Both problems are ill-posed in the sense of Hadamard and require prior information or additional criteria or constraints. The main contribution of this thesis is the use of several entropy criteria (Rényi, Tsallis, Burg, Shannon) to obtain a regularized solution to this ill-posed inverse problem. The work thus covers several areas: the mathematical aspects of tomography through its fundamental element, the Radon transform, and, in probability theory, the search for a joint distribution with known marginals. With only two projections the problem is extremely difficult, but if the two (normalized) projections are equated with marginal densities and the image to be reconstructed with a probability density, the two problems become equivalent and the reconstruction translates into the statistical framework, where the copula characterizes all possible reconstructed images. Choosing among the copulas, i.e. among the images, then requires a criterion of a priori information; we have chosen entropy, a quantity used in various areas, originally in thermodynamics and later in information theory. Using the Rényi entropy, for example, we have discovered new classes of copulas. This thesis provides new contributions to CT imaging through the interaction between tomography and probability theory and statistics.
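The two-projection reconstruction described above can be sketched with the simplest possible choice, the independence copula C(u, v) = uv, whose joint density is just the product of the marginals. This is an illustrative sketch of the problem setup, not the thesis's entropy-regularized method:

```python
def product_reconstruction(p_row, p_col):
    """Reconstruct a joint density from its two marginals using the
    independence copula C(u, v) = u*v: the reconstructed image is
    the outer product of the normalized projections. Among all joint
    densities with these marginals, this is the one with maximum
    Shannon entropy."""
    sr, sc = sum(p_row), sum(p_col)
    return [[(r / sr) * (c / sc) for c in p_col] for r in p_row]

# Horizontal and vertical projections of an unknown 2D image.
horizontal = [2.0, 1.0, 1.0]
vertical = [1.0, 3.0]
image = product_reconstruction(horizontal, vertical)

# By construction, the row sums of the reconstruction reproduce
# the normalized horizontal projection.
print([sum(row) for row in image])  # [0.5, 0.25, 0.25]
```

Any other copula with the same marginals is an equally admissible reconstruction, which is exactly the ambiguity the entropy criteria in the thesis are used to resolve.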
