About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
101

A novel approach to detecting covert DNS tunnels using throughput estimation

Himbeault, Michael 22 April 2014 (has links)
In a world that relies heavily on data, protection of that data and of the motion of that data is of the utmost importance. Covert communication channels attempt to circumvent established methods of control, such as firewalls and proxies, by utilizing non-standard means of getting messages between two endpoints. The Domain Name System (DNS), the system that translates text-based resource names into machine-readable resource records, is a very common and effective platform upon which covert channels can be built. This work proposes, and demonstrates the effectiveness of, a novel technique that estimates data transmission throughput over DNS in order to identify the existence of a DNS tunnel against the background noise of legitimate network traffic. The proposed technique is robust in the face of the obfuscation techniques that are able to hide tunnels from existing detection methods.
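The abstract does not give the estimator itself, so the following is only a minimal sketch of the general idea: treat each DNS query name as an upstream payload carrier, sum the encodable bytes per registered domain over a time window, and flag domains whose estimated throughput is implausibly high for ordinary lookups. The window size, threshold, and input format are illustrative assumptions, not the thesis's actual parameters.

```python
from collections import defaultdict

# Hypothetical input: a time-sorted list of (timestamp_seconds, query_name)
# tuples taken from DNS traffic. Assumption: a tunnel encodes upstream data
# in subdomain labels, so query-name length approximates bytes carried.

WINDOW_SECONDS = 60.0        # illustrative window size, not from the thesis
BYTES_PER_SEC_ALERT = 200.0  # illustrative alert threshold, not from the thesis

def registered_domain(name: str) -> str:
    """Crude registered-domain extraction (last two labels)."""
    labels = name.rstrip(".").split(".")
    return ".".join(labels[-2:])

def estimated_throughput(queries, window=WINDOW_SECONDS):
    """Estimate upstream bytes/sec per registered domain within one window."""
    payload = defaultdict(int)
    if not queries:
        return {}
    t0 = queries[0][0]
    for ts, name in queries:
        if ts - t0 > window:
            break
        domain = registered_domain(name)
        # Bytes plausibly encoded in the subdomain portion of the name.
        payload[domain] += max(len(name) - len(domain) - 1, 0)
    return {d: b / window for d, b in payload.items()}

def suspected_tunnels(queries):
    """Domains whose estimated rate is implausible for normal lookups."""
    return [d for d, rate in estimated_throughput(queries).items()
            if rate > BYTES_PER_SEC_ALERT]
```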
102

Entropy measures in dynamical systems and their viability in characterizing bipedal walking gait dynamics

Leverick, Graham 11 September 2013 (has links)
Entropy measures have been widely used to quantify the complexity of theoretical and experimental dynamical systems. In this thesis, two novel entropy measures are developed that use coarse quantization to classify and compare dynamical features within a time series: quantized dynamical entropy (QDE) and a quantized approximation of sample entropy (QASE). Following this, comprehensive guidelines for the quantification of complexity are presented, based on a detailed investigation of the performance characteristics of the two developed measures and three existing measures: permutation entropy, sample entropy and fuzzy entropy. The sensitivity of the considered entropy measures to changes in dynamics was assessed using the case study of characterizing bipedal walking gait dynamics. Based on the analysis conducted, it was found that sample entropy and fuzzy entropy, while computationally inefficient, provide the best overall performance. In instances where computational efficiency is vital, QDE and QASE serve as viable alternatives to existing methods.
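The abstract does not define QDE or QASE, but sample entropy, one of the baselines it compares against, has a standard definition; a minimal sketch, assuming the usual parameters m (embedding dimension) and r (tolerance):

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Standard sample entropy: -ln(A/B), where B counts template pairs
    of length m within Chebyshev distance r, and A the same for m+1.
    Self-matches are excluded. The O(n^2) pair search is the kind of
    computational cost the abstract refers to."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * np.std(x)  # common default tolerance
    n = len(x)

    def count_pairs(length):
        templates = np.array([x[i:i + length] for i in range(n - length)])
        total = 0
        for i in range(len(templates) - 1):
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            total += int(np.sum(dist <= r))
        return total

    b = count_pairs(m)
    a = count_pairs(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")
```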
103

Evaluation of E-Participation Efficiency with Biodiversity Measures - the Case of the Digital Agenda Vienna

May, John, Leo, Hannes, Taudes, Alfred January 2015 (has links) (PDF)
We introduce the Effective Number of Issues (ENI) measure for e-participation efficiency. This novel index is based on the Shannon entropy measure of biodiversity and summarizes the amount of information gained through an e-participation project in one number. This makes the comparison between different e-participation projects straightforward and lays the foundation for the rigorous analysis of success factors of e-participation projects in a data-driven way. After providing the formula and rationale for the new measure, we use the ENI index to benchmark the idea generation process for the Digital Agenda Vienna against other projects. It turns out that the efficiency of this project is significantly higher than that observed for other cases. We conjecture that this can be attributed to the user-friendly design of the software platform and the effective communication strategy of the process management. Finally, suggestions for further research are given. (authors' abstract)
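The paper's exact formula is not reproduced in the abstract, but in biodiversity the "effective number" of order one is the exponential of Shannon entropy (a Hill number); a sketch under the assumption that ENI follows the same construction, with issue shares in place of species shares:

```python
import math

def effective_number_of_issues(issue_counts):
    """Exponential of Shannon entropy over issue shares (Hill number of
    order 1). Equals k when contributions spread evenly over k issues,
    and approaches 1 when a single issue dominates."""
    total = sum(issue_counts)
    shares = [c / total for c in issue_counts if c > 0]
    entropy = -sum(p * math.log(p) for p in shares)
    return math.exp(entropy)

# Example: 100 contributions spread over four issues.
print(effective_number_of_issues([25, 25, 25, 25]))  # 4.0 (even spread)
print(effective_number_of_issues([97, 1, 1, 1]))     # ~1.18 (one issue dominates)
```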
104

Holographic Entanglement Entropy: RG Flows and Singular Surfaces

Singh, Ajay 07 August 2012 (has links)
Over the past decade, the AdS/CFT correspondence has proven to be a remarkable tool to study various properties of strongly coupled field theories. In the context of holography, Ryu and Takayanagi have proposed an elegant method to calculate entanglement entropy for these field theories. In this thesis, we use this holographic entanglement entropy to study a candidate c-theorem and entanglement entropy for singular surfaces. We use holographic entanglement entropy for strip geometry and construct a candidate c-function in arbitrary dimensions. For holographic theories dual to Einstein gravity, this c-function is shown to decrease monotonically along RG flows. A sufficient condition required for this monotonic flow is that the stress tensor of the matter fields driving the holographic RG flow must satisfy the null energy condition over the holographic surface used to calculate the entanglement entropy. In the case where the bulk theory is described by Gauss-Bonnet gravity, the latter condition alone is not sufficient to establish the monotonic flow of the c-function. We also observe that for certain holographic RG flows, the entanglement entropy undergoes a ‘phase transition’ as the size of the system grows and, as a result, evolution of the c-function may exhibit a discontinuous drop. Then, we turn towards studying the holographic entanglement entropy for regions with a singular boundary in higher dimensions. Here, we find that various singularities make new universal contributions. When the boundary CFT has an even spacetime dimension, we find that the entanglement entropy of a conical surface contains a term quadratic in the logarithm of the UV cut-off. In four dimensions, the coefficient of this contribution is proportional to the central charge c. A conical singularity in an odd number of spacetime dimensions contributes a term proportional to the logarithm of the UV cut-off. We also study the entanglement entropy for various boundary surfaces with extended singularities. In these cases, extended singularities contribute through new linear or quadratic terms in the logarithm only when the locus of the singularity is even dimensional and curved.
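For reference, the Ryu-Takayanagi prescription the abstract builds on computes the entanglement entropy of a boundary region A from the area of the minimal bulk surface anchored on that region's boundary:

```latex
S_A = \frac{\mathrm{Area}(\gamma_A)}{4 G_N}
```

Here gamma_A is the minimal surface in the bulk homologous to A, and G_N is the bulk Newton constant; the c-function and conical-singularity results above are obtained by evaluating this area functional on strip and cone geometries.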
105

The Importance of the Entropy Inequality on Numerical Simulations Using Reduced Methane-air Reaction Mechanisms

Jones, Nathan 2012 August 1900 (has links)
Many reaction mechanisms have been developed over the past few decades to predict flame characteristics. A detailed reaction mechanism can predict flame characteristics well, but at a high computational cost; mechanisms are reduced in order to cut the computational time needed to simulate a problem. The focus of this work is on the validity of reduced methane-air combustion mechanisms, particularly pertaining to satisfying the entropy inequality. While much of this work involves a two-step reaction mechanism developed by Dr. Charles Westbrook and Dr. Frederick Dryer, some consideration is given to the four-step and three-step mechanisms of Dr. Norbert Peters. These mechanisms are used to simulate the Flame A experiment from Sandia National Laboratories. The two-step mechanism of Westbrook and Dryer is found to generate results that violate the entropy inequality. Modifications are made to the two-step mechanism simulation in an effort to reduce these violations. Two new mechanisms, Mech 1 and Mech 2, are developed from the original two-step reaction mechanism by modifying the empirical constants in the Arrhenius reaction form: the reaction exponents are set to the stoichiometric coefficients of the reaction, and the Arrhenius parameters are adjusted until the concentrations computed from a one-dimensional flame simulation are matched. The new mechanisms match experimental data more closely than the original two-step mechanism and result in a significant reduction in entropy inequality violations. The solution from Mech 1 had only 9 cells that violated the entropy inequality, while the original two-step mechanism of Westbrook and Dryer had 22,016 such cells; the solution from Mech 2 had no entropy inequality violations. The method used herein for developing the new mechanisms can be applied to more complex reaction mechanisms.
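The abstract describes tuning the empirical constants in the Arrhenius form; a minimal sketch of how such a two-step rate evaluation looks, with placeholder values for the pre-exponential factors, activation energies, and concentrations (the thesis's fitted values are not given in the abstract):

```python
import math

R_UNIVERSAL = 8.314  # J/(mol K)

def arrhenius_rate(A, Ea, T, concentrations, exponents):
    """Rate of progress q = A * exp(-Ea / (R T)) * prod([X_i]^a_i).
    A, Ea, and the exponents a_i are the empirical constants the
    abstract describes adjusting."""
    k = A * math.exp(-Ea / (R_UNIVERSAL * T))
    q = k
    for species, a in exponents.items():
        q *= concentrations[species] ** a
    return q

# Two-step methane skeleton in the Westbrook-Dryer form:
#   CH4 + 3/2 O2 -> CO + 2 H2O
#   CO  + 1/2 O2 -> CO2
# All numbers below are illustrative placeholders, not fitted constants.
conc = {"CH4": 0.01, "O2": 0.05, "CO": 0.001}  # mol/m^3
q1 = arrhenius_rate(A=5.0e11, Ea=2.0e5, T=1800.0, concentrations=conc,
                    exponents={"CH4": 1.0, "O2": 1.5})  # stoichiometric exponents
q2 = arrhenius_rate(A=2.0e10, Ea=1.7e5, T=1800.0, concentrations=conc,
                    exponents={"CO": 1.0, "O2": 0.5})
```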
106

Potential utility of changes in entropy as an adjunct to the electrocardiography diagnosis of reversible myocardial ischaemia.

Zhao, Jinlin January 2008 (has links)
Background: The 12-lead electrocardiogram (ECG) is a pivotal clinical investigation for evaluation of disorders of myocardial electrophysiology and function. Myocardial ischemia is generally diagnosed on the basis of clinical history, combined with ST segment shifts and T wave changes on resting 12-lead ECG. The ECG is also used as a monitoring tool for assessment of resolution of transmural ischemia following emergency treatment. Because this technology is easy, noninvasive, and inexpensive, it represents a convenient central investigative modality. On the other hand, the 12-lead ECG exhibits very low predictive accuracy for the diagnosis of ischemia in the absence of concurrent symptoms. Even if ECG monitoring is combined with treadmill exercise, the sensitivity and positive predictive accuracy for detection of myocardial ischemia are only around 50%-75%. Therefore, information from the ECG, combined with exercise testing, does not usually have a large influence on clinical decision-making. A number of imaging techniques may be combined with pacing-induced tachycardia or pharmacological stress in order to improve the diagnostic accuracy of such provocative tests for ischemia beyond the level provided by continuous ECG monitoring alone. These include echocardiography, nuclear imaging with single photon emission computed tomography (SPECT) or positron emission tomography (PET), and magnetic resonance imaging. All add to the diagnostic accuracy of the provocative tests performed, but involve considerable incremental costs. The question therefore arises: is it possible to refine continuous ECG analysis during provocative testing in such a way that the diagnostic accuracy of the procedure can be improved? The majority of clinical studies have examined the accuracy of the diagnosis of myocardial ischemia utilizing fluctuation of the ST segments, during either "spontaneous" ischemia or provocative manoeuvres (e.g. exercise). As previously stated, the diagnostic accuracy of such analyses tends to be mediocre; when subjected to utility evaluation under Bayesian considerations, they often add little to history and physical examination. However, a number of potential refinements of 12-lead ECG analysis have been proposed in order to improve detection as well as localization and quantitation of ischemia. These include evaluation of a variety of the component waveforms of both the QRS complex and the ST segment of the ECG.
Current experiments: The series of investigations described here arose from preliminary findings that myocardial ischemia in a canine model was associated with transient fluctuations in QRS entropy. Both evaluations performed related to the hypothesis that reversible myocardial ischemia causes transient increases in entropy within QRS complexes and ST segments of the human 12-lead ECG. A series of preliminary experiments suggested that such changes did indeed occur, mainly within the ST segment. The first series of experiments compared conventional continuous ST segment analysis within the 12-lead ECG against continuous evaluation of entropy-derived parameters for the localization of ischemia induced by balloon inflation during non-emergency coronary angioplasty. In a series of 103 patients, localization of ischemia was similarly accurate for the entropy-based method and the ST segment assessment method: ischemic zones were correctly localized by these approaches in 88% and 80% of cases, respectively (p not significant). There was poor concordance between the extent of ST elevation and changes in ST segment entropy. In a small subset of patients with complete bundle branch block and/or ST depression on resting ECG (n=22), entropy-based localization of ischemia was possible in 55% of cases compared with 41% via ST segment assessment (difference not significant). Post hoc analysis revealed that entropy fluctuations arose throughout the ST segment rather than predominantly at the J-point. The second series of experiments was carried out on patients undergoing pacing-induced provocation of possible myocardial ischemia, with scanning via myocardial perfusion imaging (SPECT). As with the first series, 12-lead ECG recording and ST trend monitoring were performed during the pacing procedure, and both ST segment deviation and the entropy-based analyses were used for localization of possible ischemia. Data analyses were correlated with myocardial perfusion imaging results. A total of 43 patients were studied. Categorization of ischemia via ST segment assessment had only 30% concordance with myocardial perfusion imaging results, while entropy-based analyses had 58% concordance. Therefore neither "conventional" (i.e. ECG-based ST segment analysis) nor novel entropy-based analyses are currently of clinical utility for detection of tachycardia-induced ischemia. / Thesis (M.Med.Sc.) - University of Adelaide, School of Medicine, 2008
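The abstract does not specify how entropy was computed over the ST segment; a minimal sketch of one plausible approach, histogram-based Shannon entropy over a sliding window of ST-segment voltage samples, where the window size and bin count are assumptions rather than the thesis's parameters:

```python
import numpy as np

def shannon_entropy(samples, bins=16):
    """Histogram-based Shannon entropy (in bits) of a voltage segment."""
    counts, _ = np.histogram(samples, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

def st_entropy_trend(beats, window=8):
    """Entropy of pooled ST-segment samples over a sliding window of
    beats; under the thesis's hypothesis, a transient rise would flag
    possible ischemia. `beats` is a list of 1-D arrays, one per beat,
    each holding that beat's ST-segment voltage samples."""
    trend = []
    for i in range(len(beats) - window + 1):
        pooled = np.concatenate(beats[i:i + window])
        trend.append(shannon_entropy(pooled))
    return np.array(trend)
```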
107

Image Thresholding Technique Based On Fuzzy Partition And Entropy Maximization

Zhao, Mansuo January 2005 (has links)
Thresholding is a commonly used technique in image segmentation because of its fast and easy application, which makes threshold selection an important issue. There are two general approaches to threshold selection: one is based on the histogram of the image, while the other is based on the gray-scale information in small local areas. The histogram of an image contains statistical data on its grayscale or color ingredients. In this thesis, an adaptive logical thresholding method is first proposed for the binarization of blueprint images. The new method exploits the geometric features of blueprint images. This is implemented by utilizing a robust windows operation, which is based on the assumption that the objects have a "C" shape in a small area. We make use of multiple window sizes in the windows operation. This not only reduces computation time but also effectively separates thin lines from wide lines. Our method can automatically determine the threshold of images. Experiments show that our method is effective for blueprint images and achieves good results over a wide range of images. Second, fuzzy set theory, along with probability partition and maximum entropy theory, is explored to compute the threshold based on the histogram of the image. Since it was proposed by Zadeh in 1965, fuzzy set theory has been widely used in many fields where ambiguous phenomena exist, and many thresholding methods have been developed using this theory. The concept used here is called fuzzy partition: because our method is based on the histogram of the image, the histogram is partitioned into several groups by fuzzy sets which represent the fuzzy membership of each group. Probability partition is associated with fuzzy partition; the probability distribution of each group is derived from the fuzzy partition. Entropy, which originates from thermodynamic theory, was introduced into communications theory as a common criterion for measuring the information transmitted through a channel, and has been adopted in image processing as a measure of the information contained in processed images. It is thus applied in our method as the criterion for selecting the optimal fuzzy sets which partition the histogram. To find the threshold, the histogram of the image is partitioned by fuzzy sets which satisfy a certain entropy restriction, so the search for the best possible fuzzy sets becomes an important issue. There is no efficient method for this search; therefore, expansion to multi-level thresholding with fuzzy partition becomes extremely time consuming or even impossible. In this thesis, the relationship between a probability partition (PP) and a fuzzy C-partition (FP) is studied. This relationship and the entropy approach are used to derive a thresholding technique that selects the optimal fuzzy C-partition, with selection quality measured by the entropy function defined by the PP and FP. A necessary condition for the entropy function to reach a maximum is derived. Based on this condition, an efficient search procedure for two-level thresholding is derived, which makes the search so efficient that extension to multilevel thresholding becomes possible. A novel fuzzy membership function is proposed in three-level thresholding which produces a better result because a new relationship among the fuzzy membership functions is presented.
This new relationship gives more flexibility in the search for the optimal fuzzy sets, although it also complicates the search in multi-level thresholding. This complication is solved by a new method called the "Onion-Peeling" method. Because the relationship between the fuzzy membership functions is so complicated, it is impossible to obtain all the membership functions at once. The search procedure is therefore decomposed into several layers of three-level partitions, except for the last layer, which may be a two-level one. The larger problem is thus simplified to three-level partitions, so that the two outermost membership functions can be obtained without worrying too much about the complicated intersections among the membership functions. The method is further revised for images with a dominant area of background, or with an object which affects the appearance of the histogram of the image. The histogram is the basis of our method, as of many others, and a "bad" shape of the histogram will result in a badly thresholded image. A quadtree scheme is adopted to decompose the image into homogeneous and heterogeneous areas, and a multi-resolution thresholding method based on the quadtree and fuzzy partition is then devised to deal with these images. Extension of fuzzy partition methods to color images is also examined: an adaptive thresholding method for color images based on fuzzy partition is proposed which can determine the number of thresholding levels automatically. This thesis concludes that the "C" shape assumption and varying window sizes in the windows operation contribute to a better segmentation of blueprint images. The efficient search procedure for the optimal fuzzy sets in the fuzzy-2 partition of the histogram accelerates the process so much that it enables extension to multilevel thresholding. In the three-level fuzzy partition, the new relationship among the three fuzzy membership functions makes more sense than the conventional assumption and, as a result, performs better. The "Onion-Peeling" method deals with the complexity at the intersections among the multiple membership functions in the multilevel fuzzy partition: it decomposes the multilevel partition into fuzzy-3 and fuzzy-2 partitions by transposing the partition space in the histogram, and is thus efficient in multilevel thresholding. A multi-resolution method which applies the quadtree scheme to distinguish heterogeneous areas from homogeneous areas is designed for images with large homogeneous areas, which usually distort the histogram; the new histogram, based only on the heterogeneous areas, is adopted for partition and outperforms the old one, while validity checks filter out fragmented points, which are only a small portion of the whole image. This gives good thresholded images for human face images.
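For the entropy-maximization idea at the core of the method, a minimal sketch of the classical crisp (non-fuzzy) two-level case, Kapur-style maximum-entropy threshold selection over a histogram, is shown below; the thesis's fuzzy-partition formulation generalizes this by replacing the hard split with fuzzy membership functions.

```python
import numpy as np

def max_entropy_threshold(hist):
    """Two-level maximum-entropy thresholding (Kapur et al. style):
    choose t maximizing the summed Shannon entropies of the background
    distribution hist[:t] and the object distribution hist[t:]."""
    p = hist.astype(float) / hist.sum()
    best_t, best_h = 0, -np.inf
    for t in range(1, len(p) - 1):
        pb, po = p[:t].sum(), p[t:].sum()
        if pb == 0 or po == 0:
            continue
        qb = p[:t][p[:t] > 0] / pb  # normalized background distribution
        qo = p[t:][p[t:] > 0] / po  # normalized object distribution
        h = -np.sum(qb * np.log(qb)) - np.sum(qo * np.log(qo))
        if h > best_h:
            best_t, best_h = t, h
    return best_t

# Usage: hist, _ = np.histogram(image.ravel(), bins=256, range=(0, 256))
#        binary = image > max_entropy_threshold(hist)
```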
108

Topological entropy of linear systems and its application to optimal control

Sun, Hui. January 2008 (has links)
Thesis (M.Phil.)--Hong Kong University of Science and Technology, 2008. / Includes bibliographical references (leaves 71-73). Also available in electronic version.
109

An entropy-based measurement framework for component-based hierarchical systems

Aktunc, Ozgur. January 2007 (has links) (PDF)
Thesis (Ph. D.)--University of Alabama at Birmingham, 2007. / Additional advisors: Gary J. Grimes, Chittoor V. Ramamoorthy, Murat N. Tanju, Gregg L. Vaughn, B. Earl Wells. Description based on contents viewed Feb. 12, 2009; title from PDF t.p. Includes bibliographical references (p. 150-158).
110

Optimal concentration for SU(1,1) coherent state transforms and an analogue of the Lieb-Wehrl conjecture for SU(1,1)

Bandyopadhyay, Jogia. January 2008 (has links)
Thesis (Ph.D.)--Physics, Georgia Institute of Technology, 2008. / Committee Chair: Eric A. Carlen; Committee Member: Jean Bellissard; Committee Member: Michael Loss; Committee Member: Predrag Cvitanovic.
