  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

3-dimensional Heisenberg antiferromagnet in cubic lattice under time periodic magnetic field /

Chan, Chi Hung. January 2009 (has links)
Includes bibliographical references (p. 81).
2

Generalized uncertainty relations /

Akten, Burcu Elif, January 1999 (has links)
Thesis (Ph. D.)--University of Texas at Austin, 1999. / Vita. Includes bibliographical references (leaves 93-94). Available also in a digital version from Dissertation Abstracts.
3

Static critical properties of the pure and diluted Heisenberg or Ising models

Davies, Mathew Raymond January 1982 (has links)
Real-space renormalisation-group scaling techniques are used to investigate the static critical behaviour of the pure and dilute, classical, anisotropic Heisenberg model. Transfer-matrix methods are employed to obtain asymptotically exact expressions for the correlation lengths and susceptibilities of the one-dimensional system. The resulting scaling relationships are combined with an approximate bond-moving scheme to treat pure and dilute models in higher dimensionalities. Detailed discussions are given for the dependence of correlation lengths and susceptibilities on temperature, anisotropy and concentration, and for the dependence of the critical temperature on anisotropy and concentration. Particular emphasis is given to the weakly anisotropic system near the percolation threshold, and comparisons are made between the results of the present analysis and those of neutron-scattering experiments on dilute quasi-two- and three-dimensional systems.
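The transfer-matrix route mentioned in this abstract can be illustrated on the simplest related case, the zero-field one-dimensional Ising chain, whose 2×2 transfer matrix has eigenvalues 2 cosh K and 2 sinh K (with K = J/k_BT) and whose correlation length follows from their ratio. This is a minimal orientation sketch, not the thesis's anisotropic Heisenberg calculation:

```python
import math

def ising_correlation_length(K):
    # Transfer-matrix eigenvalues for the zero-field 1D Ising chain,
    # with K = J / (k_B T): lambda_+ = 2 cosh K, lambda_- = 2 sinh K.
    lam_plus = 2.0 * math.cosh(K)
    lam_minus = 2.0 * math.sinh(K)
    # Correlation length xi = 1 / ln(lambda_+ / lambda_-) = 1 / ln(coth K);
    # it diverges as exp(2K)/2 in the low-temperature (large-K) limit.
    return 1.0 / math.log(lam_plus / lam_minus)
```

The asymptotically exact low-temperature divergence of this expression is the kind of result the thesis combines with bond-moving approximations in higher dimensions.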
4

Some properties of Heisenberg systems containing substitutional impurities

Lovesey, Stephen W. January 1967 (has links)
No description available.
5

Eliminating Design Alternatives under Interval-Based Uncertainty

Rekuc, Steven Joseph 19 July 2005 (has links)
Typically, design is approached as a sequence of decisions in which designers select what they believe to be the best alternative in each decision. While this approach can be used to arrive at a final solution quickly, it is unlikely to result in the most-preferred solution. The reason for this is that all the decisions in the design process are coupled. To determine the most preferred alternative in the current decision, the designer would need to know the outcomes of all future decisions, information that is currently unavailable or indeterminate. Since the designer cannot select a single alternative because of this indeterminate (interval-based) uncertainty, a set-based design approach is introduced. The approach is motivated by the engineering practices at Toyota and is based on the structure of the Branch and Bound Algorithm. Instead of selecting a single design alternative that is perceived as being the most preferred at the time of the decision, the proposed set-based design approach eliminates dominated design alternatives: rather than selecting the best, eliminate the worst. Starting from a large initial design space, the approach sequentially reduces the set of non-dominated design alternatives until no further reduction is possible: the remaining set cannot be rationally differentiated based on the available information. A single alternative is then selected from the remaining set of non-dominated designs. In this thesis, the focus is on the elimination step of the set-based design method: a criterion for rational elimination under interval-based uncertainty is derived. To be efficient, the criterion takes into account shared uncertainty, that is, uncertainty shared between design alternatives. In taking this uncertainty into account, one is able to eliminate significantly more design alternatives, improving the efficiency of the set-based design approach.
Additionally, the criterion uses a detailed reference design to enable the elimination of inferior design sets without evaluating each alternative in those sets. The effectiveness of this elimination is demonstrated in two examples: a beam design and a gearbox design.
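The elimination step described above can be sketched as plain interval dominance: an alternative is discarded when its best-case utility falls below some other alternative's worst case. This is a hypothetical minimal version for illustration only; it ignores the shared-uncertainty refinement and reference-design criterion the thesis actually derives, and all names and intervals are made up:

```python
def eliminate_dominated(alternatives):
    # alternatives: name -> (lo, hi) interval bounds on a utility
    # to be maximized under interval-based uncertainty.
    survivors = {}
    for name, (lo, hi) in alternatives.items():
        # An alternative is dominated if some other alternative's
        # worst case (lo) already beats this one's best case (hi).
        dominated = any(
            other_lo > hi
            for other, (other_lo, _) in alternatives.items()
            if other != name
        )
        if not dominated:
            survivors[name] = (lo, hi)
    return survivors
```

Alternatives whose intervals overlap survive together, matching the abstract's point that the remaining set cannot be rationally differentiated on the available information.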
6

Finite Quantum Theory of the Harmonic Oscillator

Shiri-Garakani, Mohsen 12 July 2004 (has links)
We apply the Segal process of group simplification to the linear harmonic oscillator. The result is a finite quantum theory with three quantum constants instead of the usual one. We compare the classical (CLHO), quantum (QLHO), and finite (FLHO) linear harmonic oscillators and their canonical or unitary groups. The FLHO is isomorphic to a dipole rotator with N = l(l+1) states, where l is very large for the physically interesting cases. The position and momentum variables are quantized with uniform finite spectra. For fixed quantum constants and large N there are three broad classes of FLHO: soft, medium, and hard, corresponding respectively to cases where the ratio of potential energy to kinetic energy in the Hamiltonian is very small, almost equal to one, or very large. The field oscillators responsible for infrared and ultraviolet divergences are soft and hard respectively. Medium oscillators approximate the QLHO. Their low-lying states have nearly the same zero-point energy and level spacing as the QLHO, and nearly obey the Heisenberg uncertainty principle and the equipartition principle. The corresponding rotators are nearly polarized along the z-axis. The soft and hard FLHOs have infinitesimal zero-point energy and grossly violate equipartition and the Heisenberg uncertainty principle. They do not resemble the QLHO at all. Their low-lying energy states correspond to rotators polarized along the x-axis or y-axis respectively. Soft oscillators have frozen momentum, because their maximum potential energy is too small to produce one quantum of momentum. Hard oscillators have frozen position, because their maximum kinetic energy is too small to excite one quantum of position.
7

Sir Arthur Eddington and the foundations of modern physics

Durham, Ian T. January 2005 (has links)
In this dissertation I analyze Sir Arthur Eddington's statistical theory as developed in the first six chapters of his posthumously published Fundamental Theory. In particular I examine its mathematical structure, philosophical implications, and relevance to modern physics. This analysis is the only one of Fundamental Theory to compare it with modern quantum field theory, and it is the most comprehensive examination of his statistical theory in four decades. Several major insights emerge from this analysis, including the fact that Eddington was able to derive Pauli's Exclusion Principle in part from Heisenberg's Uncertainty Principle. The most profound general conclusion of this research is that Fundamental Theory is, in fact, an early quantum field theory, something that has never before been suggested. Contrary to the majority of historical reports and some comments by his contemporaries, this analysis shows that Eddington's later work is neither mystical nor was it far from the mainstream when it was published. My research reveals numerous profoundly deep ideas that were ahead of their time when Fundamental Theory was developed, but that have significant applicability at present. As such, this analysis presents several important questions to be considered by modern philosophers of science, physicists, mathematicians, and historians. In addition it sheds new light on Eddington as a scientist and mathematician, indicating in part that his marginalization has been largely unwarranted.
8

Contribution to wavelet theory: application to edge plasma turbulence in tokamaks and to dimensional measurement of targets

Scipioni, Angel 19 November 2010 (has links)
The necessary scale-based representation of the world leads us to explain why wavelet theory is the best-suited formalism. Its performance is compared to that of other tools: the rescaled-range (R/S) method and the empirical mode decomposition (EMD) method. The great diversity of analyzing bases in wavelet theory leads us to propose a morphological approach to the analysis. The study is organized into three parts. The first chapter is dedicated to the constituent elements of wavelet theory. A surprising link is established between the notion of recurrence and scale analysis (Daubechies polynomials) via Pascal's triangle. A general analytical expression for the Daubechies filter coefficients is then proposed in terms of the polynomial roots. The second chapter covers the first application domain: the edge plasmas of tokamak fusion reactors. We describe how, for the first time on experimental signals, the Hurst coefficient was measured with a wavelet-based least-squares estimator. We then detail, starting from fractional-Brownian-motion (fBm) processes, how we established an original synthesis model that faithfully reproduces the mixed fBm and fGn statistics characteristic of an edge plasma. Finally, we explain the reasons that led us to find no link between high values of the Hurst coefficient and supposed long-range correlations. The third chapter concerns the second application domain: the analysis of the echo backscattered from an immersed target insonified by an ultrasonic plane wave. We show how a morphological approach coupled with scale analysis allowed us to extract the target's size information from the echo.
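The wavelet machinery behind the scale analysis above can be hinted at with the simplest Daubechies filter, the Haar wavelet, which splits a signal into approximation and detail coefficients; a wavelet-based Hurst estimator would then fit the scale-by-scale log-variance of such detail coefficients. This is a minimal sketch for orientation, not the thesis's estimator:

```python
import math

def haar_step(signal):
    # One level of the Haar (db1) wavelet transform: pairwise
    # approximation (low-pass) and detail (high-pass) coefficients,
    # normalized so the transform is orthonormal.
    assert len(signal) % 2 == 0, "signal length must be even"
    evens, odds = signal[0::2], signal[1::2]
    approx = [(a + b) / math.sqrt(2.0) for a, b in zip(evens, odds)]
    detail = [(a - b) / math.sqrt(2.0) for a, b in zip(evens, odds)]
    return approx, detail
```

Iterating this step on the approximation coefficients yields the dyadic scales over which a least-squares fit of the detail-coefficient variances would estimate the Hurst coefficient.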
