141

Entropy Measurements and Ball Cover Construction for Biological Sequences

Robertson, Jeffrey Alan 01 August 2018 (has links)
As improving technology makes it easier to select or engineer DNA sequences that produce dangerous proteins, it is important to be able to predict whether a novel DNA sequence is potentially dangerous by determining its taxonomic identity and functional characteristics. These tasks can be facilitated by the ever-increasing amounts of available biological data. Unfortunately, these growing databases can be difficult to take full advantage of due to the corresponding increase in computational and storage costs. Entropy scaling algorithms and data structures present an approach that can expedite this type of analysis by scaling with the amount of entropy contained in the database rather than with its size. Because sets of DNA and protein sequences are biologically meaningful rather than random, they exhibit a significant amount of structure. As biological databases grow, taking advantage of this structure can be extremely beneficial. The entropy scaling sequence similarity search algorithm introduced here demonstrates this by accelerating the biological sequence search tools BLAST and DIAMOND. Tests of the implementation of this algorithm show that while this approach can lead to improved query times, constructing the required entropy scaling indices is difficult and expensive. To improve performance and remove this bottleneck, I investigate several ideas for accelerating the construction of indices that support entropy scaling searches. The results of these tests identify key tradeoffs and demonstrate that there is potential in using these techniques for sequence similarity searches. / Master of Science / As biological organisms are created and discovered, it is important to compare their genetic information to known organisms in order to detect possibly harmful or dangerous properties. However, the collection of published genetic information from known organisms is huge and growing rapidly, making it difficult to search. This thesis shows that it might be possible to use the non-random properties of biological information to increase the speed and efficiency of searches; that is, because genetic sequences are not random but share common structures, an increase in known data does not imply a proportional increase in complexity, known as entropy. Specifically, when comparing a new sequence to a set of previously known sequences, it is important to choose the correct algorithms for comparing the similarity of two sequences, also known as the distance between them. This thesis explores the performance of an entropy scaling algorithm compared to several conventional tools.
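The "ball cover" of the title can be built greedily: each sequence joins the first existing cluster whose center lies within a fixed radius, or else becomes a new center, and queries then compare against centers before descending into the few relevant clusters. A minimal sketch under simplifying assumptions (equal-length sequences, Hamming distance; the names and parameters are illustrative, not taken from the thesis):

```python
def distance(a: str, b: str) -> int:
    """Hamming distance between two equal-length sequences."""
    return sum(x != y for x, y in zip(a, b))

def build_ball_cover(sequences: list[str], radius: int):
    """Greedy ball cover: assign each sequence to the first center
    within `radius`, or promote it to a new center."""
    centers: list[str] = []
    clusters: dict[int, list[str]] = {}
    for seq in sequences:
        for i, c in enumerate(centers):
            if distance(seq, c) <= radius:
                clusters[i].append(seq)
                break
        else:
            centers.append(seq)
            clusters[len(centers) - 1] = [seq]
    return centers, clusters

# A query is first compared against the centers only (coarse search),
# then expanded into clusters whose center lies within
# query_radius + radius (fine search); the triangle inequality
# guarantees no true hit is missed.
```

Because the number of centers grows with the diversity (entropy) of the database rather than its raw size, search time scales with entropy, which is the effect the abstract describes.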
142

Fundamental properties of Synthetic O-D Generation Formulations and Solutions

Paramahamsan, Harinarayan 17 February 1999 (has links)
Origin-Destination (O-D) matrices are required in order to model traffic routing behavior in networks. Direct techniques for collecting O-D information from home and roadside interviews have historically been utilized to estimate O-D tables. However, these techniques are not only very costly, labor intensive, and disruptive to trip makers, but traditionally also fail to capture traffic peaking behavior, which is often required for traffic operational purposes. Consequently, more cost-effective indirect or synthetic O-D estimation techniques have been developed, and continue to be developed. They utilize readily available traffic volume counts to estimate the most likely O-D tables that may have generated the observed link counts. This thesis describes the basic formulations that have been proposed to solve the static O-D problem synthetically from link flow observations using Maximum Entropy techniques. As is the case with many mathematical solutions to engineering problems, a number of simplifying assumptions have been made in order to solve the synthetic O-D problem. Unfortunately, these simplifying assumptions are often not fully described in the literature, and in some cases they are not mentioned at all. Furthermore, the literature fails to systematically demonstrate what impact these assumptions have on the final O-D table estimate. Therefore, this thesis utilizes simple hypothetical networks to: (1) demonstrate and compare the two main types of synthetic O-D formulations, namely the trip- and volume-based formulations; (2) track the O-D estimation procedure from its initial formulation to its final formulation, demonstrating all significant assumptions that have been made and the implications of these assumptions for the final solution; (3) demonstrate to what extent the final O-D estimation formulation remains valid when these assumptions are invoked; and (4) test the applicability of some available packages that implement the various formulations and solution techniques. / Master of Science
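The trip-based maximum entropy formulation at the heart of these methods can be stated, under the usual simplifying assumptions (error-free link counts and fixed route proportions), roughly as follows; the notation here is generic rather than the thesis's own:

```latex
\max_{T}\; -\sum_{i,j} T_{ij}\left(\ln T_{ij} - 1\right)
\quad \text{subject to} \quad
\sum_{i,j} p_{ij}^{a}\, T_{ij} = V_a \;\; \forall a,
\qquad T_{ij} \ge 0,
```

where T_{ij} is the number of trips from origin i to destination j, V_a is the observed volume on link a, and p_{ij}^{a} is the proportion of i-to-j trips using link a. The first-order conditions give the multiplicative form T_{ij} = \prod_{a} X_a^{p_{ij}^{a}}, which the iterative balancing solution methods exploit.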
143

Monte Carlo Experiments on Maximum Entropy Constructive Ensembles for Time Series Analysis and Inference

Ames, Allison Jennifer 29 June 2005 (has links)
In econometric analysis, the traditional bootstrap and related methods often require the assumption of stationarity. This assumption says that the distribution function of the process remains unchanged when shifted in time by an arbitrary value, imposing perfect time-homogeneity. In terms of the joint distribution, stationarity implies that the date of the first time index is not relevant. For time series data, however, this assumption is problematic in many ways. With time series, the order in which random realizations occur is crucial. This is why theorists work with stochastic processes having two implicit arguments, ω and t, where ω represents the sample space and t represents the order. The question becomes: is there a bootstrap procedure that can preserve the ordering without assuming stationarity? The new method for maximum entropy ensembles proposed by Dr. H. D. Vinod might satisfy the Ergodic and Kolmogorov theorems without assuming stationarity. / Master of Science
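Vinod's maximum entropy bootstrap avoids the stationarity assumption by resampling from a maximum entropy density anchored at the order statistics of the observed series and then restoring the original time ordering through the ranks. A simplified sketch of one replicate (it omits the mean-preserving adjustments of the full algorithm and is not the thesis's code):

```python
import numpy as np

def me_bootstrap_replicate(x, rng):
    """One simplified maximum-entropy bootstrap replicate of the
    series x: draw sorted quantiles from a piecewise-uniform ME
    density built around the order statistics, then restore the
    original time ordering via the ranks."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    order = np.argsort(x)
    xs = x[order]                          # order statistics
    trim = np.mean(np.abs(np.diff(xs)))    # expand the two tails
    z = np.concatenate(([xs[0] - trim],
                        (xs[:-1] + xs[1:]) / 2,
                        [xs[-1] + trim]))  # n+1 interval boundaries
    u = np.sort(rng.uniform(size=n))       # sorted uniform draws
    idx = np.minimum((u * n).astype(int), n - 1)
    frac = u * n - idx
    q = z[idx] + frac * (z[idx + 1] - z[idx])  # ME quantiles
    replicate = np.empty(n)
    replicate[order] = q    # i-th smallest draw goes where the
    return replicate        # i-th smallest original value was
```

Because each replicate inherits the rank structure of the original series, its temporal dependence pattern is preserved, which is the ordering-preservation property the abstract asks for.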
144

The dynameomics entropy dictionary: a large-scale assessment of conformational entropy across protein fold space

Towse, Clare-Louise, Akke, M., Daggett, V. 04 April 2017 (has links)
Molecular dynamics (MD) simulations contain considerable information with regard to the motions and fluctuations of a protein, the magnitude of which can be used to estimate conformational entropy. Here we survey conformational entropy across protein fold space using the Dynameomics database, which represents the largest existing dataset of protein MD simulations for representatives of essentially all known protein folds. We provide an overview of MD-derived entropies accounting for all possible degrees of dihedral freedom on an unprecedented scale. Although different side chains might be expected to impose varying restrictions on the conformational space that the backbone can sample, we found that the backbone entropy and side chain size are not strictly coupled. An outcome of these analyses is the Dynameomics Entropy Dictionary, the contents of which have been compared with entropies derived by other theoretical approaches and by experiment. As might be expected, the conformational entropies scale linearly with the number of residues, demonstrating that conformational entropy is an extensive property of proteins. The calculated conformational entropies of folding agree well with previous estimates. Detailed analysis of specific cases identifies deviations in conformational entropy from the average values, highlighting how conformational entropy varies with sequence, secondary structure, and tertiary fold. Notably, alpha-helices have lower entropy on average than beta-sheets, and both are lower than coil regions. / National Institutes of Health, US Department of Energy Office of Biological Research, National Energy Research Scientific Computing Center, Swedish Research Council, Knut and Alice Wallenberg Foundation
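The dihedral entropies collected in such a dictionary are commonly estimated from the sampled angle distributions via S = -R Σᵢ pᵢ ln pᵢ; a minimal sketch of that estimate (the bin count and units are illustrative choices, not those of the paper):

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

def dihedral_entropy(angles_deg, n_bins=36):
    """Histogram estimate of the conformational entropy of one
    dihedral angle from MD samples, in J/(mol K)."""
    counts, _ = np.histogram(angles_deg, bins=n_bins,
                             range=(-180.0, 180.0))
    p = counts / counts.sum()
    p = p[p > 0]                 # convention: 0 * ln 0 = 0
    return -R * np.sum(p * np.log(p))

# Summing such terms over every phi/psi/chi dihedral of a protein
# is one way conformational entropy ends up scaling linearly with
# chain length, as the survey observes.
```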
145

Entropy and Stability in Graphs

Joret, Gwenaël 14 December 2007 (has links)
A stable set (or independent set) is a set of vertices that are pairwise non-adjacent. Many classical results in combinatorial optimization concern the stability number (defined as the largest size of a stable set), and stable sets certainly rank among the simplest and most fundamental structures in graph theory. The thesis is divided into two parts, both related to the notion of stable sets in a graph. In the first part, we study a graph coloring problem, that is, a partition into stable sets, where the goal is to minimize the entropy of the partition. This is a variant of the classical problem of minimizing the number of colors used. We also consider a generalization of the problem to set covers. These two problems are called minimum entropy coloring and minimum entropy set cover, respectively, and are motivated by various applications in information theory and bioinformatics. Among other results, we obtain a precise characterization of the complexity of minimum entropy set cover: the problem can be approximated to within a constant lg e (about 1.44), and it is NP-hard to do strictly better. Analogous results are proved concerning the complexity of minimum entropy coloring. In the second part of the thesis, we consider graphs whose stability number increases as soon as any edge is removed. These graphs are said to be "alpha-critical" and play an important role in many areas, such as extremal graph theory and polyhedral combinatorics. On the one hand, we revisit the theory of alpha-critical graphs, giving new and simpler proofs of some central theorems. On the other hand, we study certain facets of the linear ordering polytope that can be seen as a generalization of the notion of alpha-critical graphs. We extend many results from the theory of alpha-critical graphs to this family of facets.
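The positive side of that characterization is achieved by the natural greedy algorithm: repeatedly take the set covering the most uncovered elements. A minimal sketch of the greedy cover and the entropy of the partition it induces (names illustrative):

```python
import math

def greedy_cover_parts(universe, sets):
    """Greedy set cover; returns the blocks of newly covered
    elements, i.e. the partition the cover induces. For minimum
    entropy set cover this partition is known to exceed the
    optimal entropy by at most lg e ~ 1.443 bits."""
    uncovered = set(universe)
    parts = []
    while uncovered:
        best = max(sets, key=lambda s: len(s & uncovered))
        if not best & uncovered:
            raise ValueError("sets do not cover the universe")
        parts.append(best & uncovered)
        uncovered -= best
    return parts

def partition_entropy(parts, n):
    """Entropy (in bits) of a partition of n elements."""
    return -sum(len(p) / n * math.log2(len(p) / n) for p in parts)
```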
146

An entropic approach to the analysis of time series.

Scafetta, Nicola 12 1900 (has links)
This dissertation concerns the statistical analysis of time series. With compelling arguments we show that the Diffusion Entropy Analysis (DEA) is the only method in the literature of the Science of Complexity that correctly determines the scaling hidden within a time series reflecting a Complex Process. The time series is thought of as a source of fluctuations, and the DEA is based on the Shannon entropy of the diffusion process generated by these fluctuations. All traditional methods of scaling analysis, instead, are based on the variance of this diffusion process. The variance methods detect the real scaling only if the Gaussian assumption holds true. We call H the scaling exponent detected by the variance methods and d the real scaling exponent. If the time series is characterized by Fractional Brownian Motion, we have H = d, and the scaling can in this case be safely determined by using the variance methods. If, on the contrary, the time series is characterized, for example, by Lévy statistics, then H ≠ d and the variance methods cannot be used to detect the true scaling. The Lévy walk yields the relation d = 1/(3-2H). In the case of Lévy flights, the variance diverges and the exponent H cannot be determined, whereas the scaling d exists and can be established by using the DEA. Therefore, only the joint use of two different scaling analysis methods, the variance scaling analysis and the DEA, can assess the real nature, Gaussian or Lévy or something else, of a time series. Moreover, the DEA determines the information content, in the form of Shannon entropy or of any other convenient entropic indicator, at each time step of the process, which, given a sufficiently large number of data, is expected to become a diffusion process with scaling. This makes it possible to study the regime of transition from dynamics to thermodynamics, non-stationary regimes, and the saturation regime as well. First, the efficiency of the DEA is proved with theoretical arguments and with numerical work on artificial sequences. Then we apply the DEA to three different sets of real data: genome sequences, hard x-ray solar flare waiting times, and sequences of sociological interest. In all these cases the DEA reveals new properties overlooked by the standard methods of analysis.
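In outline, the DEA builds diffusion trajectories from partial sums of the fluctuations over windows of length t, estimates the Shannon entropy S(t) of the resulting displacement distribution, and reads the scaling d off the linear fit S(t) ≈ A + d ln t. A rough illustration (window range and binning are arbitrary choices, not prescriptions from the thesis):

```python
import numpy as np

def diffusion_entropy(xi, windows):
    """Shannon entropy S(t) of the diffusion process generated by
    the fluctuations xi, for each window length t in `windows`."""
    S = []
    for t in windows:
        # overlapping partial sums: displacements after time t
        x = np.convolve(xi, np.ones(t), mode="valid")
        counts, edges = np.histogram(x, bins="auto")
        p = counts[counts > 0] / counts.sum()
        width = edges[1] - edges[0]
        # differential entropy estimate of the displacement pdf
        S.append(-np.sum(p * np.log(p)) + np.log(width))
    return np.array(S)

# The scaling exponent d is the slope of S(t) against ln t:
#   d = np.polyfit(np.log(windows), diffusion_entropy(xi, windows), 1)[0]
```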
147

Entropy analysis in a channel flow with temperature dependent viscosity

Ndaba, Cynthia Reitumetse January 2007 (has links)
Thesis (M.Sc. (Applied Mathematics)) --University of Limpopo, 2007 / The thermodynamic irreversibility in any fluid flow process can be quantified through entropy analysis. The first law of thermodynamics is simply an expression of the conservation of energy principle. The second law of thermodynamics states that all real processes are irreversible, and entropy generation is a measure of the irreversibility associated with real processes. As entropy generation takes place, the quality of energy (i.e., exergy) decreases. In order to preserve the quality of energy in a fluid flow process, or at least to reduce the entropy generation, it is important to study the distribution of entropy generation within the fluid volume. In this dissertation, the inherent irreversibility in the flow of a variable viscosity fluid in both a closed channel and an open channel is investigated. The channel is assumed to be narrow, so that the lubrication approximation may be applied, and the fluid viscosity is assumed to vary linearly with temperature. The lower and upper surfaces of the channel are maintained at different temperatures. The simplified form of the governing equations is obtained and solved analytically using a perturbation technique. Expressions for fluid velocity and temperature are derived, which in turn make it possible to obtain expressions for the volumetric entropy generation numbers, the irreversibility distribution ratio, and the Bejan number in the flow field. Chapter 1 presents the historical background of the study. The closed and open channel problems are investigated in chapters 2 and 3, respectively. Chapter 4 gives a general discussion of the overall results of the investigation together with possible areas of future research.
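The quantities listed at the end of the abstract have standard forms: for convective flow of a fluid with temperature-dependent viscosity μ(T), the local volumetric entropy generation rate splits into heat transfer and fluid friction parts, and the Bejan number measures their relative weight (generic expressions, not necessarily the dissertation's exact notation):

```latex
S_{gen} = \frac{k}{T^{2}}\left(\nabla T\right)^{2}
        + \frac{\mu(T)}{T}\,\Phi ,
\qquad
Be = \frac{S_{heat}}{S_{heat} + S_{friction}},
\qquad
\phi = \frac{S_{friction}}{S_{heat}},
```

where Φ is the viscous dissipation function and φ is the irreversibility distribution ratio; Be → 1 where heat transfer irreversibility dominates the flow field and Be → 0 where fluid friction dominates.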
148

Entropy and Architecture: entropic phenomena actuating dynamic space

Bernier, Jobe Paul. January 2008 (has links) (PDF)
Thesis (M Arch)--Montana State University--Bozeman, 2008. / Typescript. Chairperson, Graduate Committee: John Brittingham. Includes bibliographical references (leaves 90-92).
149

Ensemble Filtering Methods for Nonlinear Dynamics

Kim, Sangil January 2005 (has links)
Standard ensemble filtering schemes such as the Ensemble Kalman Filter (EnKF) and Sequential Monte Carlo (SMC) do not properly represent states of low prior probability when the number of samples is too small and the dynamical system is high-dimensional with highly non-Gaussian statistics. For example, when the standard ensemble methods are applied to two well-known simple but highly nonlinear systems, a one-dimensional stochastic diffusion process in a double-well potential and the well-known three-dimensional chaotic dynamical system of Lorenz, they produce erroneous results when tracking transitions of the systems from one state to the other. In this dissertation, a set of new parametric resampling methods is introduced to overcome this problem. The new filtering methods are motivated by a general H-theorem for the relative entropy of Markov stochastic processes. The entropy-based filters first approximate the prior distribution of a given system by a mixture of Gaussians, whose components represent different regions of the system. The parameters of each Gaussian, i.e., weight, mean, and covariance, are then determined sequentially as new measurements become available. These alternative filters yield a natural generalization of the EnKF method to systems with highly non-Gaussian statistics, recovering it when the mixture model consists of one single Gaussian and measurements are taken on full states. In addition, the new filtering methods give the relative entropy and log-likelihood as by-products at no extra cost. We examine the potential usage and qualitative behaviors of the relative entropy and log-likelihood for the new filters; the corresponding results for EnKF and SMC are also included. We present results of the new methods applied to the above two ordinary differential equations and one partial differential equation, with comparisons to the standard filters, EnKF and SMC. These results show that the entropy-based filters correctly track the transitions between likely states in both highly nonlinear systems even with the small sample size N = 100.
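For reference, the EnKF baseline that these filters generalize updates each ensemble member with a Kalman gain assembled from sample statistics; a minimal perturbed-observation sketch (standard textbook form, not the dissertation's code):

```python
import numpy as np

def enkf_analysis(X, H, y, R, rng):
    """One EnKF analysis step with perturbed observations.
    X: (n, N) forecast ensemble; H: (m, n) observation operator;
    y: (m,) observation; R: (m, m) observation error covariance."""
    n, N = X.shape
    A = X - X.mean(axis=1, keepdims=True)            # anomalies
    Pf = A @ A.T / (N - 1)                           # sample covariance
    K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)   # Kalman gain
    # perturb the observation so the analysis spread stays consistent
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, N).T
    return X + K @ (Y - H @ X)
```

The entropy-based filters replace the single Gaussian implicit in Pf with a mixture of Gaussians whose weights, means, and covariances are updated sequentially, which is what lets them follow transitions between the two wells of the potential and between Lorenz regimes.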
150

Two-phase Eulerian averaged formulation of entropy production for cavitation flow

Sun, Joseph 05 September 2014 (has links)
This research is focused on formulating a new model of entropy production for two-phase flow, including cavitating turbulent flow. In particular, it focuses on the following aspects of the fluid dynamics and the potential contribution of the model to fluid device design: (i) developing a new turbulent entropy model; (ii) deriving a new formula for the entropy production rate in two-phase flow, including cavitating turbulent flow, based on the second law; (iii) applying the technique to study a NACA hydrofoil; and (iv) conducting an associated performance analysis of a propeller by post-processing the CFD results and demonstrating that the entropy production of two-phase cavitating flow around the propeller can be correlated with the loss of power output. The first stage consists of formulating the entropy production for laminar channel flow using the Gibbs free energy. This model is validated against the analytically solved Navier-Stokes equations. Subsequently, the single-phase turbulent flow is formulated in a similar manner, but the validations are carried out by comparing the predictions of the model with DNS results. Then, the model of entropy production for two-phase turbulent flow is derived from the Gibbs equation and a version of the Reynolds-averaged Navier-Stokes (RANS) equations. The k-ε model is employed to represent the turbulent properties of single-phase and two-phase flows. An inter-phase slip algorithm mixture model is applied to handle the coupling of the phases. The Rayleigh-Plesset equation is used to model the rate of mass generation of vapour at the phase interface. The standard k-ε turbulence equations are used to describe turbulence in the cavitating flow. The validations of the CFD predictions include exploring the force and cavitation characteristics of the NACA 4412 hydrofoil section. The application of this entropy production model in engineering design is presented via comparisons between CFD results and the experimental data for the velocity distributions behind propeller P5168.
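In RANS form, the volumetric entropy production rate that such a model post-processes typically combines mean and turbulent dissipation with heat conduction; a generic single-phase version reads as follows (the two-phase formulation extends this with mixture properties and the vapour mass transfer, and is not reproduced here):

```latex
\dot{S}_{gen}''' =
\frac{\mu\,\overline{\Phi}}{\overline{T}}
+ \frac{\rho\,\varepsilon}{\overline{T}}
+ \frac{k}{\overline{T}^{2}}\left(\nabla\overline{T}\right)^{2},
```

where Φ̄ is the dissipation function of the mean flow and ε is the turbulent dissipation rate supplied by the k-ε model; integrating this over the flow domain gives the loss that the abstract correlates with the propeller's power output.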
