401 |
Modelling and analysis of geophysical turbulence : use of optimal transforms and basis sets / Gamage, Nimal K. K. 06 August 1990 (has links)
The use of efficient basis functions to model and represent flows with
internal sharp velocity gradients, such as shocks or eddy microfronts, is
investigated. This is achieved by analysing artificial data and observed
atmospheric turbulence data, and by using a spectral model based on Burgers'
equation. The concept of an efficient decomposition of a function into a basis
set is presented and alternative analysis methods are investigated. The
development of a spectral model using a generalized basis for the Burgers'
equation is presented and simulations are performed using a modified Walsh
basis and compared with the Fourier (trigonometric) basis and finite difference
techniques.
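For illustration, a minimal pseudo-spectral integration of Burgers' equation in the Fourier (trigonometric) basis might look as follows; this is a generic sketch with assumed parameter values, not the thesis's generalized-basis or modified Walsh model.

```python
import numpy as np

# Burgers' equation u_t + u u_x = nu * u_xx on a 2*pi-periodic domain,
# advanced with a Fourier pseudo-spectral discretization and a
# semi-implicit Euler step (explicit advection, implicit diffusion).
N, nu, dt, steps = 256, 0.05, 1e-3, 2000
x = np.linspace(0.0, 2 * np.pi, N, endpoint=False)
k = np.fft.fftfreq(N, d=1.0 / N)          # integer wavenumbers

u_hat = np.fft.fft(np.sin(x))             # initial condition steepens into a sharp front
for _ in range(steps):
    u = np.real(np.fft.ifft(u_hat))
    u_x = np.real(np.fft.ifft(1j * k * u_hat))
    u_hat = (u_hat - dt * np.fft.fft(u * u_x)) / (1.0 + dt * nu * k**2)

u = np.real(np.fft.ifft(u_hat))           # velocity field with an internal sharp gradient
```

A modified Walsh basis would replace the Fourier transform pair above with the corresponding fast Walsh transforms.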
The wavelet transform is shown to be superior to the Fourier transform
or the windowed Fourier transform in terms of defining the predominant
scales in time series of turbulent shear flows and in 'zooming in' on local
coherent structures associated with sharp edges. A disadvantage is found
to be its inability to provide clear information on the scale of periodicity of
events. Artificial time series of varying amounts of noise added to structures
of different scales are analyzed using different wavelets to show that the
technique is robust and capable of detecting sharp edged coherent structures
such as those found in shear driven turbulence.
The Haar function is used as a wavelet to detect ubiquitous zones of
concentrated shear in turbulent flows sometimes referred to as microfronts.
The location and organization of these shear zones suggest that they may be
edges of larger scale eddies. A wavelet variance of the wavelet phase plane is
defined to detect and highlight events and obtain measures of predominant
scales of coherent structures. Wavelet skewness is computed as an indicator
of the systematic sign preference of the gradient of the transition zone. Inverse
wavelet transforms at the dilation corresponding to the peak wavelet
variance are computed and shown to contain a significant fraction of
the total energy contained in the record. The analysis of data and the numerical
simulation results are combined to propose that the sharp gradients
normally found in shear induced turbulence significantly affect the nature of
the turbulence and hence the choice of the basis set used for the simulation
of turbulence. / Graduation date: 1991
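As an illustration of the Haar-based diagnostics described above, the following sketch computes Haar wavelet coefficients, wavelet variance, and wavelet skewness over a set of dilations for a synthetic record; the record, scales, and function names are assumptions for illustration, not the thesis's data or code.

```python
import numpy as np

def haar_coefficients(u, scale):
    """Unnormalized Haar wavelet coefficients of record u at dilation `scale`
    (in samples): mean of the trailing half-window minus mean of the leading one."""
    half = scale // 2
    cs = np.concatenate([[0.0], np.cumsum(u)])
    lead = (cs[half:-half] - cs[:-2 * half]) / half
    trail = (cs[2 * half:] - cs[half:-half]) / half
    return trail - lead

# synthetic record: ramp-cliff structures (slow rise, sharp drop) every 512 samples, plus noise
rng = np.random.default_rng(0)
n = 4096
u = (np.arange(n) % 512) / 512.0 + 0.3 * rng.standard_normal(n)

scales = [8, 16, 32, 64, 128, 256, 512, 1024]
variance, skewness = [], []
for s in scales:
    c = haar_coefficients(u, s)
    variance.append(np.mean(c**2))                        # wavelet variance at this dilation
    skewness.append(np.mean(c**3) / np.mean(c**2)**1.5)   # sign preference of the gradients

peak_scale = scales[int(np.argmax(variance))]             # predominant scale of the structures
```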
|
402 |
Anisotropic Superelasticity of Textured Ti-Ni Sheet / Thamburaja, P., Gao, S., Yi, S., Anand, Lallit 01 1900 (has links)
A recently developed crystal-mechanics-based constitutive model for polycrystalline shape-memory alloys (Thamburaja and Anand [1]) is shown to quantitatively predict the in-plane anisotropy of superelastic Ti-Ni sheet with reasonable accuracy. / Singapore-MIT Alliance (SMA)
|
403 |
Parsing and Generating English Using Commutative Transformations / Katz, Boris, Winston, Patrick H. 01 May 1982 (has links)
This paper is about an implemented natural language interface that translates from English into semantic net relations and from semantic net relations back into English. The parser and companion generator were implemented for two reasons: (a) to enable experimental work in support of a theory of learning by analogy; (b) to demonstrate the viability of a theory of parsing and generation built on commutative transformations. The learning theory was shaped to a great degree by experiments that would have been extraordinarily tedious to perform without the English interface with which the experimental data base was prepared, revised, and revised again. Inasmuch as current work on the learning theory is moving toward a tenfold increase in data-base size, the English interface is moving from a facilitating role to an enabling one. The parsing and generation theory has two particularly important features: (a) the same grammar is used for both parsing and generation; (b) the transformations of the grammar are commutative. The language generation procedure converts a semantic network fragment into kernel frames, chooses the set of transformations that should be performed upon each frame, executes the specified transformations, combines the altered kernels into a sentence, performs a pronominalization process, and finally produces the appropriate English word string. Parsing is essentially the reverse of generation. The first step in the parsing process is splitting a given sentence into a set of kernel clauses along with a description of how those clauses are hierarchically related to each other. The clauses are used to produce a matrix of embedded kernel frames, which in turn supply arguments to relation-creating functions. The evaluation of the relation-creating functions results in the construction of the semantic net fragments.
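The following toy Python sketch is not the Katz-Winston grammar; it only illustrates, under heavily simplified assumptions, the idea that commutative, invertible transformations on kernel frames let the same rules serve generation (apply, then realize) and parsing (undo).

```python
from dataclasses import dataclass, field, replace

@dataclass(frozen=True)
class Kernel:                      # a toy kernel frame
    subject: str
    verb: str
    obj: str
    features: frozenset = field(default_factory=frozenset)

def passivize(k):   return replace(k, features=k.features | {"passive"})
def negate(k):      return replace(k, features=k.features | {"negative"})
def unpassivize(k): return replace(k, features=k.features - {"passive"})
def unnegate(k):    return replace(k, features=k.features - {"negative"})

def realize(k):
    """Tiny surface realizer for the toy kernel."""
    if "passive" in k.features:
        s = f"{k.obj} {'is not' if 'negative' in k.features else 'is'} {k.verb}ed by {k.subject}"
    else:
        s = f"{k.subject} {('does not ' + k.verb) if 'negative' in k.features else k.verb + 's'} {k.obj}"
    return s.capitalize() + "."

k = Kernel("robot", "lift", "block")
# The transformations commute: either order yields the same frame ...
assert passivize(negate(k)) == negate(passivize(k))
print(realize(negate(passivize(k))))     # Block is not lifted by robot.
# ... and each has an inverse, so "parsing" can undo "generation".
assert unnegate(unpassivize(negate(passivize(k)))) == k
```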
|
404 |
Case studies in omniparametric simulation /Lundin, Fredrik, January 2006 (has links)
Thesis (Ph. D.)--Chalmers tekniska högskola and Göteborgs universitet, 2006. / Includes bibliographical references (p. 219-224) and index.
|
405 |
Designing hydrogel microspheres from liquid-liquid phase transitions of aqueous polymer solutions /Yin, Xiangchun. Stöver, Harald D. H. January 1900 (has links)
Thesis (Ph.D.)--McMaster University, 2004. / Supervisor: Harald D. H. Stöver.
|
406 |
Nonlinear free boundary problems arising from soil freezing in a bounded region /Mohamed, Fouad Abd El-Aal. January 1983 (has links)
Thesis (Ph. D.)--Oregon State University, 1983. / Typescript (photocopy). Includes bibliographical references (leaves 130-132). Also available on the World Wide Web.
|
407 |
A study based on event configuration loop to convert causal loop diagram into stock flow diagram for system dynamics / Chou, Yi-hung 28 August 2010 (has links)
Today, threats to human survival such as economic and financial crises, global warming, ecological extinction, and the greenhouse effect are growing in both detail and dynamic complexity. Many of the problems humanity currently faces arise because we cannot handle the steadily growing complexity of the systems in our environment.
The main purpose of this research is to explore the translation of causal feedback (causal loop) diagram models into stock flow diagram models, and to identify key transfer principles from the fundamental components of system dynamics, in order to improve the accuracy and validity of the resulting dynamic models. Based on this model transformation design, the research provides a model-based architecture that simulates actual causal feedback diagram modules using the functions provided by Maria 2 Plus, the first Chinese-language interface for system dynamics simulation software. Through the software's model transfer tools and natural-language operation interface, users can rapidly create causal feedback diagrams and stock flow diagram models, which shortens the learning cycle and increases the speed and validity of model creation.
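The thesis's event-configuration-loop rules are not reproduced here; as a heavily simplified sketch of one decision any CLD-to-SFD conversion must make, the following Python fragment classifies the variables of a small causal loop diagram into stocks, flows, and auxiliaries using an assumed heuristic (accumulations become stocks, their direct causes become flows).

```python
# toy causal loop diagram: (cause, effect, polarity) links
causal_links = [
    ("birth rate", "population", "+"),
    ("death rate", "population", "-"),
    ("population", "birth rate", "+"),
    ("population", "death rate", "+"),
]
accumulations = {"population"}      # assumed to be marked by the modeller

stocks = set(accumulations)
flows = {cause for cause, effect, _ in causal_links if effect in stocks}
auxiliaries = ({v for link in causal_links for v in link[:2]}
               - stocks - flows)

print("stocks:", stocks)            # {'population'}
print("flows:", flows)              # e.g. {'birth rate', 'death rate'}
print("auxiliaries:", auxiliaries)  # set()
```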
|
408 |
Generating Implicit Functions Model from Triangles Mesh Model by Using Genetic Algorithm / Chen, Ya-yun 09 October 2005 (has links)
Implicit function models are now widely applied in fields that need 3D content, such as computer games, animation, and special-effects film. However, most graphics hardware still supports the polygon-mesh model rather than the implicit function model, so the polygon mesh remains the mainstream representation in computer graphics. Translation between the two representations has therefore become a new research topic.
This paper presents a new method to translate a triangle mesh model into an implicit function model. The main idea is to use a binary space-partitioning (BSP) tree to divide the points and patches of the triangle mesh into a hierarchical structure. For each leaf node of this hierarchy, a corresponding implicit function is generated by a genetic algorithm, and the internal nodes are combined with blending operators that make the surface smooth and continuous. The proposed method greatly reduces the amount of data, because only the coefficients of the implicit surfaces are stored, and the genetic algorithm avoids high computational complexity.
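As a rough illustration only (the basis, fitness terms, and genetic operators are assumptions, not the thesis's implementation), the following sketch uses a simple genetic algorithm to fit the coefficients of one quadric implicit function to the sample points of a single leaf node.

```python
import numpy as np

rng = np.random.default_rng(1)

def quadric_basis(p):
    """Monomial basis of a quadric f(x,y,z) = c . [x^2, y^2, z^2, xy, xz, yz, x, y, z, 1]."""
    x, y, z = p[:, 0], p[:, 1], p[:, 2]
    return np.stack([x*x, y*y, z*z, x*y, x*z, y*z, x, y, z, np.ones_like(x)], axis=1)

def fitness(c, B_on, B_off, d_off):
    # on-surface points should give f ~ 0; points offset along the normals
    # should give f ~ d_off, which pins the scale of the coefficients
    return -(np.sum((B_on @ c) ** 2) + np.sum((B_off @ c - d_off) ** 2))

# toy "leaf node": samples on a patch of the unit sphere, plus offset points
theta, phi = rng.uniform(0, np.pi / 3, 200), rng.uniform(0, np.pi / 3, 200)
pts = np.stack([np.sin(theta) * np.cos(phi),
                np.sin(theta) * np.sin(phi),
                np.cos(theta)], axis=1)
offset_pts = pts + 0.1 * pts                      # sphere normals are radial
B_on, B_off, d_off = quadric_basis(pts), quadric_basis(offset_pts), 0.1

pop = rng.normal(size=(60, 10))                   # population of coefficient vectors
for _ in range(300):
    scores = np.array([fitness(c, B_on, B_off, d_off) for c in pop])
    parents = pop[np.argsort(scores)[-20:]]                           # elitist selection
    mates = parents[rng.integers(0, 20, size=(40, 2))]
    pop = np.vstack([parents,
                     mates.mean(axis=1) + 0.05 * rng.normal(size=(40, 10))])  # blend + mutate

best = pop[np.argmax([fitness(c, B_on, B_off, d_off) for c in pop])]
# `best` should approach a positive multiple of x^2 + y^2 + z^2 - 1 on this patch
```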
|
409 |
Verifying transformations between timed automata specifications and ECA rules / Ericsson, Ann-Marie January 2003 (has links)
Event-triggered real-time systems are desirable in environments where the arrival of events is hard to predict. The semantics of an event-triggered system maps well to the behaviour of an active database management system (ADBMS), specified using event-condition-action (ECA) rules. The benefits of using an active database, such as persistent data storage, concurrency control and timely response to event occurrences, highlight the need for a development method for event-triggered real-time systems using active databases.

However, there are problems left to be solved before an ADBMS can be used with confidence in real-time environments. The behaviour of a real-time system must be predictable, which implies a thoroughly analysed specification with, e.g., specified worst-case execution times. The predictability requirement is an obstacle to specifying real-time systems as ECA rules, since the rules may affect each other in many intricate ways, which makes them hard to analyse. The interaction between the rules implies that it is not enough to verify the correctness of single rules; an analysis must consider the behaviour of the entire rule set.

In this dissertation, an approach for developing active applications is presented. A method is examined which starts with an analysed high-level timed automaton specification and transforms the specified behaviour into an implicitly analysed rule set. For this method to be useful, the transformation from timed automata to rules must preserve the exact behaviour of the high-level specification. Hence, the aim of this dissertation is to verify transformations between timed automaton specifications and ECA rules.

The contribution of this project is a structured set of general transformations between timed automata specifications and ECA rules. The transformations include both transformations of small timed automata constructs for deterministic environments and formally verified timed automata patterns specifying the behaviour of composite events in recent and chronicle context.
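As a schematic sketch only (not one of the dissertation's verified transformation patterns), the basic correspondence between a single timed-automaton edge and an ECA rule can be written out in Python: the edge's synchronization label becomes the Event, its source location and clock guard the Condition, and the location switch plus clock resets the Action.

```python
from dataclasses import dataclass

@dataclass
class Edge:                # one edge of a timed automaton
    source: str
    target: str
    event: str             # synchronization label that triggers the edge
    guard: str             # clock constraint, e.g. "x >= 2"
    resets: tuple          # clocks reset when the edge is taken

@dataclass
class EcaRule:
    event: str
    condition: str
    action: str

def edge_to_rule(e: Edge) -> EcaRule:
    """Map an edge to an ECA rule: label -> Event, location + guard -> Condition,
    location switch + clock resets -> Action."""
    condition = f"location == '{e.source}' and {e.guard}"
    resets = "; ".join(f"{clock} = now()" for clock in e.resets)
    action = f"location = '{e.target}'" + (f"; {resets}" if resets else "")
    return EcaRule(event=e.event, condition=condition, action=action)

rule = edge_to_rule(Edge("Idle", "Active", "start", "x >= 2", ("x",)))
# rule.event == 'start'
# rule.condition == "location == 'Idle' and x >= 2"
# rule.action == "location = 'Active'; x = now()"
```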
|
410 |
Case studies in omniparametric simulation /Lundin, Fredrik. January 2006 (has links)
Chalmers Univ. of Technology and Göteborg Univ., Diss.--Göteborg, 2006.
|