About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university, consortium, or country archive and want to be added, details can be found on the NDLTD website.
1

Local maximum entropy approximation-based modelling of the canine heart

Rama, Ritesh Rao. January 2012
The Local Maximum Entropy (LME) method is an approximation technique known for its good approximation characteristics. These stem from its non-negative shape functions and its weak Kronecker-delta property, which allow the solutions to be continuous and smooth, in contrast to the Moving Least Squares (MLS) method used in the Element-Free Galerkin (EFG) method. The method is based on a convex optimisation scheme in which a non-linear equation is solved with a Newton algorithm, implemented in an in-house code called SESKA. The aim of this study is to compare LME and MLS and highlight their differences. Preliminary benchmark tests of LME prove very conclusive: the method approximates the deformation of a cantilever beam with higher accuracy than MLS, and its rapid convergence rate on Cook's membrane problem demonstrates that it requires a relatively coarser mesh to reach the exact solution. With these encouraging results, LME is then applied to a larger non-linear cardiac mechanics problem: simulating a healthy and a myocardially infarcted canine left ventricle (LV) during one heartbeat. The LV is idealised as a prolate spheroidal ellipsoid. It undergoes expansion during the diastolic phase, addressed by a non-linear passive stress model which incorporates the transversely isotropic properties of the material. The contraction during the systolic phase is simulated by Guccione's active stress model. The infarct region is considered non-contractile and twice as stiff as the healthy tissue. The material loss, especially during the necrotic phase, is incorporated through a homogenisation approach. Firstly, the loss of contractility in the infarct region counteracts the overall contraction through a bulging deformation, where the occurrence of high stresses is noted. Secondly, with regard to the behaviour of LME, it features a high convergence rate and a reduced computation time relative to MLS. However, LME is also quite sensitive to the nodal spacing, in particular for an unstructured nodal distribution, where it produces completely unreliable results.
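The LME construction lends itself to a compact illustration. The following is a minimal sketch, not the SESKA implementation, of evaluating LME shape functions at a point by solving the dual convex problem with a Newton iteration; the node set, the locality parameter beta, and the tolerances are illustrative assumptions.

```python
import numpy as np

def lme_shape_functions(x, nodes, beta=4.0, tol=1e-12, max_iter=50):
    """Local max-entropy shape functions at point x (a sketch).

    Solves the dual convex problem for the Lagrange multiplier lam so
    that the linear-consistency constraint sum_a p_a (x_a - x) = 0
    holds, using Newton's method on log of the partition function.
    """
    dx = nodes - x                      # (n, dim) offsets x_a - x
    lam = np.zeros(x.shape[0])
    for _ in range(max_iter):
        # Weights exp(-beta|x_a - x|^2 + lam.(x_a - x)), normalized
        f = -beta * np.sum(dx**2, axis=1) + dx @ lam
        p = np.exp(f - f.max())
        p /= p.sum()                    # non-negative, partition of unity
        r = p @ dx                      # gradient: first-moment residual
        if np.linalg.norm(r) < tol:
            break
        # Hessian of log Z: sum_a p_a dx_a dx_a^T - r r^T
        J = dx.T @ (p[:, None] * dx) - np.outer(r, r)
        lam -= np.linalg.solve(J, r)
    return p

# 1D example: five nodes on [0, 1], evaluated at x = 0.3
nodes = np.linspace(0.0, 1.0, 5)[:, None]
p = lme_shape_functions(np.array([0.3]), nodes, beta=8.0)
print(p, p.sum())   # non-negative weights summing to 1
```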
2

Development and applications of high performance computing

Cox, Simon J. January 1998
No description available.
3

Caractérisation stochastique des sprays ultrasoniques : le formalisme de l'entropie maximale (Stochastic characterization of ultrasonic sprays: the maximum entropy formalism)

Dobre, Miruna. 09 May 2003
The central aim of this research is to develop a complete theoretical characterisation of a spray, based on knowledge of the droplet-formation mechanism, that can be applied in the same way to any type of spray. Since the main difficulty lies in knowing the physics of the break-up of the liquid film into droplets, the study focuses on the ultrasonic spray, which has the advantage of involving a widely studied surface-wave formation mechanism (Faraday waves). Two means are combined to find the theoretical droplet-size distribution that best describes ultrasonic atomization: first, an analysis of the surface-wave instability, which determines the average characteristics of the spray; and second, a stochastic method, the maximum entropy formalism, which provides the most probable distribution consistent with those average characteristics and with the elementary conservation laws applicable to any type of atomization (conservation of mass and energy). The experimental validation of this new theoretical approach has, moreover, made it possible to develop new designs of high-performance ultrasonic atomizers.
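As a rough illustration of the maximum entropy formalism in this setting, the sketch below recovers a droplet-size distribution of exponential-family form from a single mass-conservation constraint on a discretized diameter grid. The grid, the assumed mass-mean diameter D30, and the use of a single moment constraint (the thesis also feeds in the average spray characteristics from the wave-instability analysis) are simplifying assumptions.

```python
import numpy as np
from scipy.optimize import brentq

# Discretized droplet diameters (microns) -- illustrative grid
D = np.linspace(1.0, 100.0, 400)

def maxent_dist(lam, moment=3):
    """Maximum-entropy pmf p(D) ~ exp(-lam * D^moment) on the grid."""
    a = -lam * D**moment
    w = np.exp(a - a.max())             # subtract max for stability
    return w / w.sum()

def moment_residual(lam, target, moment=3):
    p = maxent_dist(lam, moment)
    return p @ D**moment - target

# Mass conservation: pin the mean cubed diameter (proportional to mass)
D30 = 40.0                              # assumed mass-mean diameter
lam = brentq(moment_residual, 1e-9, 1e-2, args=(D30**3,))
p = maxent_dist(lam)
print(f"lambda = {lam:.3e}, <D^3> = {p @ D**3:.1f} vs target {D30**3:.1f}")
```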
4

Image Restoration for Multiplicative Noise with Unknown Parameters

Chen, Ren-Chi. 28 July 2006
First, we study a Poisson model of a polluted random screen. In this model, the defects on the random screen are assumed to be Poisson-distributed and may overlap, and the transmittance effects of overlapping defects are multiplicative. The autocorrelation function of the screen can be computed from the defects' density, radius, and transmittance. Using this autocorrelation function, we then restore telescope astronomy images, whose signals are generally degraded by propagation through random scattering in the atmosphere. To restore the images, we estimate the three key parameters by three methods: the expectation-maximization (EM) method and two maximum-entropy (ME) methods based on two different definitions. The restorations are successful and are demonstrated in this thesis.
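The random-screen model described above can be simulated directly, which offers a sanity check on the analytical autocorrelation even though it does not reproduce the thesis's EM/ME estimators. A minimal sketch, with illustrative values assumed for the defect density, radius, and transmittance:

```python
import numpy as np

rng = np.random.default_rng(0)

# --- simulate a multiplicative Poisson random screen ------------------
L, density, radius, t = 256, 0.002, 6.0, 0.7    # assumed parameters
n_defects = rng.poisson(density * L * L)
centers = rng.uniform(0, L, size=(n_defects, 2))

yy, xx = np.mgrid[0:L, 0:L]
screen = np.ones((L, L))
for cx, cy in centers:
    mask = (xx - cx)**2 + (yy - cy)**2 <= radius**2
    screen[mask] *= t                   # overlapping defects multiply

# --- empirical autocorrelation via FFT (wraparound ignored) -----------
s = screen - screen.mean()
S = np.fft.fft2(s)
acf = np.real(np.fft.ifft2(S * np.conj(S))) / s.size
acf /= acf[0, 0]
print("correlation at lags 0, r, 2r:",
      acf[0, 0], acf[0, int(radius)], acf[0, int(2 * radius)])
```

The empirical correlation should decay over roughly twice the defect radius, which is the structure the restoration step exploits.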
5

Topics in the foundations of statistical inference and statistical mechanics

Guszcza, James. January 2000
Thesis (Ph.D.), University of Chicago, Dept. of Philosophy. Includes bibliographical references. Also available on the Internet.
6

Entropy based techniques with applications in data mining

Okafor, Anthony. January 2005
Thesis (Ph.D.), University of Florida, 2005. Title from title page of source document. Document formatted into pages; contains 97 pages. Includes vita and bibliographical references.
7

The Genetic Algorithm and Maximum Entropy Dice

Fellman, Laura Suzanne. 29 January 1996
The Brandeis dice problem, originally introduced in 1962 by Jaynes as an illustration of the principle of maximum entropy, was solved using the genetic algorithm, and the resulting solution was compared with that obtained analytically. The effect of varying the genetic algorithm parameters was observed, and the optimum values for population size, mutation rate, and mutation interval were determined for this problem. The optimum genetic algorithm program was then compared to a completely random method of search and optimization. Finally, the genetic algorithm approach was extended to several variations of the original problem for which an analytical approach would be impractical.
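The Brandeis dice problem is small enough to show both solution routes side by side. The sketch below computes the analytical maximum entropy distribution for a die constrained to mean 4.5 and compares it with a toy genetic algorithm that maximizes entropy under a quadratic penalty on the mean constraint; the population size, mutation scale, and penalty weight are illustrative assumptions, not the tuned values from the thesis.

```python
import numpy as np
from scipy.optimize import brentq

faces = np.arange(1, 7)
target_mean = 4.5

# Analytic max-ent solution: p_i ~ exp(mu * i), mu chosen to pin the mean
def mean_residual(mu):
    w = np.exp(mu * faces)
    return (w / w.sum()) @ faces - target_mean
mu = brentq(mean_residual, -5, 5)
p_exact = np.exp(mu * faces); p_exact /= p_exact.sum()

# Tiny GA: maximize entropy with a penalty enforcing the mean constraint
rng = np.random.default_rng(1)
def fitness(p):
    entropy = -np.sum(p * np.log(p + 1e-12))
    return entropy - 50.0 * (p @ faces - target_mean)**2

pop = rng.dirichlet(np.ones(6), size=200)
for _ in range(300):
    fit = np.array([fitness(p) for p in pop])
    parents = pop[np.argsort(fit)[-100:]]            # truncation selection
    children = parents[rng.integers(0, 100, 100)].copy()
    children += rng.normal(0, 0.01, children.shape)  # mutation
    children = np.abs(children)
    children /= children.sum(axis=1, keepdims=True)  # stay on the simplex
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(p) for p in pop])]
print("GA:    ", np.round(best, 4))
print("Exact: ", np.round(p_exact, 4))
```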
8

Fundamental properties of Synthetic O-D Generation Formulations and Solutions

Paramahamsan, Harinarayan. 17 February 1999
Origin-Destination (O-D) matrices are required in order to model traffic routing behaviour in networks. Direct techniques for collecting O-D information from home and roadside interviews have historically been used to estimate O-D tables. However, these techniques are not only very costly, labour-intensive, and disruptive to trip makers, but traditionally also fail to capture traffic peaking behaviour, which is often required for traffic operational purposes. Consequently, more cost-effective indirect or synthetic O-D estimation techniques have been developed, and continue to be developed. They use readily available traffic volume counts to estimate the most likely O-D table that may have generated the observed link counts. This thesis describes the basic formulations that have been proposed to formulate and solve the static O-D problem synthetically from link flow observations using Maximum Entropy techniques. As is the case with many mathematical solutions to engineering problems, a number of simplifying assumptions have been made in order to solve the synthetic O-D problem. Unfortunately, these assumptions are often not fully described in the literature, and in some cases are not mentioned at all. Furthermore, the literature fails to demonstrate systematically what impact these assumptions have on the final O-D estimate. This thesis therefore uses simple hypothetical networks to:
1. demonstrate and compare the two main types of synthetic O-D formulations, namely the trip-based and volume-based formulations;
2. track the O-D estimation procedure from its initial formulation to its final formulation, demonstrating all significant assumptions made along the way and their implications for the final solution;
3. demonstrate to what extent the final O-D estimation formulation remains valid when these assumptions are invoked;
4. test the applicability of some available packages that implement the various formulations and solution techniques.
Master of Science
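A minimal sketch of the trip-based maximum entropy formulation on a toy network follows. The link-route incidence matrix, the observed volumes, and the fixed routing are assumptions made for illustration; the objective is the usual Stirling-approximated entropy of the trip table.

```python
import numpy as np
from scipy.optimize import minimize

# Toy network: 4 O-D pairs, 3 links with observed counts.
# A[a, k] = 1 if O-D pair k's (fixed) route uses link a.
A = np.array([[1, 1, 0, 0],     # link 1 carries pairs 1 and 2
              [0, 1, 1, 0],     # link 2 carries pairs 2 and 3
              [0, 0, 1, 1]])    # link 3 carries pairs 3 and 4
v = np.array([300.0, 400.0, 350.0])   # observed link volumes

def neg_entropy(T):
    # Stirling form of -entropy for a trip table: sum T ln T - T
    return np.sum(T * np.log(np.maximum(T, 1e-9)) - T)

res = minimize(neg_entropy, x0=np.full(4, 100.0),
               constraints={'type': 'eq', 'fun': lambda T: A @ T - v},
               bounds=[(1e-6, None)] * 4, method='SLSQP')
T = res.x
print("Estimated O-D trips:   ", np.round(T, 1))
print("Reproduced link counts:", np.round(A @ T, 1), "vs observed", v)
```

With three counts and four unknown O-D flows the system is underdetermined, which is exactly the situation in which the maximum entropy principle selects the least-committal table consistent with the observations.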
9

Testing for Structural Change: Evaluation of the Current Methodologies, a Misspecification Testing Perspective and Applications

Koutris, Andreas. 26 April 2006
The unit root revolution in time series modeling has created substantial interest in non-stationarity and its implications for empirical modeling. Beyond the original interest in trend versus difference non-stationarity, there has been renewed interest in testing for and modeling structural breaks. The focus of my dissertation is on testing for departures from stationarity in a broader framework in which unit root, mean-trend, and structural-break non-stationarity constitute only a small subset of the possible forms of non-stationarity. The first chapter evaluates the most popular testing procedures for the stationarity assumption. In view of the fact that general forms of non-stationarity render each observation unique, I develop a testing procedure using a resampling scheme based on a Maximum Entropy replication algorithm. The proposed misspecification testing procedure relies on resampling techniques to enhance the informational content of the observed data, attempting to capture heterogeneity 'locally' using rolling window estimators of the primary moments of the stochastic process. This provides an effective way to enhance the sample information in order to assess the presence of departures from stationarity. Depending on the sample size, the method uses overlapping or non-overlapping window estimates; non-overlapping rolling windows improve both the size and the power of the test. The effectiveness of the testing procedure is assessed using extensive Monte Carlo simulations. In particular, the new test has empirical size very close to the nominal size and very high power against a variety of departures from stationarity. The proposed procedure is then applied to seven macroeconomic series in the fourth chapter. Finally, the optimal choice of orthogonal polynomials for hypothesis testing is investigated in the last chapter.
Ph.D.
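The rolling-window idea behind the procedure can be illustrated in a few lines. The sketch below compares the dispersion of non-overlapping window means for a stationary series and a mean-trending one; the statistic and the window count are illustrative assumptions, not the author's resampling-based test.

```python
import numpy as np

rng = np.random.default_rng(2)

def window_moments(x, n_windows):
    """First two moments over non-overlapping windows of a series."""
    w = np.array_split(x, n_windows)
    return (np.array([wi.mean() for wi in w]),
            np.array([wi.var(ddof=1) for wi in w]))

T = 600
stationary = rng.normal(0, 1, T)
trending = rng.normal(0, 1, T) + np.linspace(0, 3, T)  # t-heterogeneous mean

for name, x in [("stationary", stationary), ("trending", trending)]:
    m, v = window_moments(x, n_windows=10)
    # Dispersion of window means, scaled by within-window variability:
    # large values hint at a departure from stationarity
    stat = m.var(ddof=1) / (v.mean() / (T // 10))
    print(f"{name:10s} window-mean dispersion statistic = {stat:6.2f}")
```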
10

Monte Carlo Experiments on Maximum Entropy Constructive Ensembles for Time Series Analysis and Inference

Ames, Allison Jennifer. 29 June 2005
In econometric analysis, the traditional bootstrap and related methods often require the assumption of stationarity. This assumption says that the distribution function of the process remains unchanged when shifted in time by an arbitrary value, imposing perfect time-homogeneity; in terms of the joint distribution, stationarity implies that the date of the first time index is not relevant. For time series data, however, this assumption is problematic: with time series, the order in which random realizations occur is crucial. This is why theorists work with stochastic processes carrying two implicit arguments, ω and t, where ω represents the sample space and t represents the order. The question becomes: is there a bootstrap procedure that can preserve the ordering without assuming stationarity? The new method for maximum entropy ensembles proposed by Dr. H. D. Vinod might satisfy the Ergodic and Kolmogorov theorems without assuming stationarity.
Master of Science
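For concreteness, here is a simplified sketch of the core of the maximum entropy bootstrap: draw from a piecewise-uniform maximum entropy density built around the order statistics, then reassign the sorted draws to the original time positions so the replicate preserves the rank ordering of the series. The tail construction and the omission of the mean-preserving adjustment are simplifications of Vinod's full procedure.

```python
import numpy as np

def meboot_replicate(x, rng):
    """One maximum-entropy bootstrap replicate (simplified sketch).

    Assumes symmetric tails of width equal to the mean absolute gap
    between order statistics and skips the mean-preserving adjustment
    of the full meboot algorithm.
    """
    T = len(x)
    order = np.argsort(x)                # remember each value's time rank
    xs = x[order]
    trim = np.mean(np.abs(np.diff(xs)))  # tail half-width (assumption)
    # Interval limits z_0..z_T around the order statistics
    z = np.concatenate([[xs[0] - trim],
                        (xs[:-1] + xs[1:]) / 2,
                        [xs[-1] + trim]])
    # Draw uniforms and invert the piecewise-linear max-ent CDF
    u = np.sort(rng.uniform(0, 1, T))
    grid = np.linspace(0, 1, T + 1)
    draws = np.interp(u, grid, z)
    # Reassign sorted draws to the original time positions: the
    # replicate keeps the rank ordering (the shape) of the series
    out = np.empty(T)
    out[order] = draws
    return out

rng = np.random.default_rng(3)
x = np.cumsum(rng.normal(size=100))      # a non-stationary random walk
print(meboot_replicate(x, rng)[:5])
```

Because the replicate inherits the time ordering of the original series, no stationarity assumption is needed, which is precisely the property the thesis investigates by Monte Carlo experiment.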
