191. The relation between classical and quantum mechanics. Taylor, Peter. January 1984.
This thesis examines the relation between classical and quantum mechanics from philosophical, mathematical and physical standpoints. It first presents arguments in support of "conjectural realism" in scientific theories, distinguished by explicit contextual structure and empirical testability, and it analyses intertheoretic reduction in terms of weakly equivalent theories over a domain of applicability. Familiar formulations of classical and quantum mechanics are shown to follow from a general theory of mechanics based on pure states with an intrinsic probability structure. This theory is developed to the stage where theorems from quantum logic enable expression of the state geometry in Hilbert space. Quantum and classical mechanics are then elaborated and applied to subsystems and the measurement process. Consideration is also given to spacetime geometry and the constraints it places on the dynamics. Physics and mathematics, it is argued, are growing apart; the inadequate treatment of approximations in general, and of localization in quantum mechanics in particular, is seen as a contributing factor. In the description of systems, the link between localization and lack of knowledge shows that quantum mechanics should reflect the domain of applicability. Restricting the class of states provides a means of achieving this goal. Localization is then shown to have a mathematical expression in terms of compactness, which in turn is applied to yield a topological theory of bound and scattering states. Finally, the thesis questions the validity of "classical limits" and "quantisations" in intertheoretic reduction, and demonstrates that a widely accepted classical limit does not constitute a proof of reduction. It proposes a procedure for determining whether classical and quantum mechanics are weakly equivalent over a domain of applicability, and concludes that, in this restricted sense, classical mechanics reduces to quantum mechanics.
192. Topics in arithmetic combinatorics. Sanders, Tom. January 2007.
This thesis is chiefly concerned with a classical conjecture of Littlewood's regarding the L¹-norm of the Fourier transform, and the closely related idempotent theorem. The vast majority of the results regarding these problems are, in some sense, qualitative or at the very least infinitary, and it has become increasingly apparent that a quantitative state of affairs is desirable. Broadly speaking, the first part of the thesis develops three new tools for tackling the problems above: we prove a new structural theorem for the spectrum of functions in A(G); we extend the notion of local Fourier analysis, pioneered by Bourgain, to a much more general structure, and localize Chang's classic structure theorem as well as our own spectral structure theorem; and we refine some aspects of Freiman's celebrated theorem regarding the structure of sets with small doubling. These tools lead to improvements in a number of existing additive results, which we indicate, but our main purpose is their application to the analytic problems mentioned above. The second part of the thesis discusses a natural version of Littlewood's problem for finite abelian groups. Here the situation varies wildly with the underlying group, and we pay special attention first to the finite field case (where we use Chang's Theorem) and then to the case of residues modulo a prime, where we require our new local structure theorem for A(G). We complete the consideration of Littlewood's problem for finite abelian groups by using the local version of Chang's Theorem we have developed. Finally we deploy the Freiman tools along with the extended Fourier analytic techniques to yield a fully quantitative version of the idempotent theorem.
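For orientation, Littlewood's conjecture (later proved, independently, by Konyagin and by McGehee, Pigno and Smith) asserts a logarithmic lower bound on the L¹-norm of exponential sums; in its classical form:

```latex
% Littlewood's conjecture: for any N distinct integers n_1 < n_2 < ... < n_N,
\[
  \int_0^1 \Bigl|\, \sum_{k=1}^{N} e^{2\pi i n_k \theta} \Bigr|\, d\theta
  \;\ge\; C \log N,
\]
% where C > 0 is an absolute constant.
```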
193. Vector Bundles Over Hypersurfaces Of Projective Varieties. Tripathi, Amit.
In this thesis we study some questions related to vector bundles over hypersurfaces. More precisely, for hypersurfaces of dimension ≥ 2, we study the extension problem for vector bundles. We find some cohomological conditions under which a vector bundle over an ample divisor of a non-singular projective variety extends as a vector bundle to an open set containing that ample divisor.
Our method is to follow the general Grothendieck-Lefschetz theory by showing that a vector bundle extension exists over various thickenings of the ample divisor.
For vector bundles of rank > 1, we find two separate cohomological conditions, each of which guarantees extension to an open set containing the ample divisor. For the case of line bundles, our method unifies and recovers the generalized Noether-Lefschetz theorems of Joshi and of Ravindra-Srinivas.
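For context, the classical Grothendieck-Lefschetz theorem which this line of work refines can be recalled as follows (this is the standard statement, included for orientation, not a result of the thesis):

```latex
% Grothendieck-Lefschetz theorem for Picard groups: for X a non-singular
% projective variety and Y a smooth ample divisor in X,
\[
  \dim X \ge 4
  \;\Longrightarrow\;
  \operatorname{Pic}(X) \xrightarrow{\ \sim\ } \operatorname{Pic}(Y),
\]
% where the isomorphism is given by restriction of line bundles.
```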
In the last part of the thesis, we make a specific study of vector bundles over elliptic curves.
194. Tychonoff's theorem and its equivalence with the axiom of choice. Törnkvist, Robin. January 2015.
In this essay we give an elementary introduction to topology so that we can prove Tychonoff's theorem, and also its equivalence with the axiom of choice.
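For reference, the two statements proved equivalent in the essay are the following (standard formulations):

```latex
% Tychonoff's theorem:
\[
  \{X_i\}_{i \in I} \text{ a family of compact spaces}
  \;\Longrightarrow\;
  \prod_{i \in I} X_i \text{ is compact in the product topology.}
\]
% Axiom of choice: every family \{A_i\}_{i \in I} of non-empty sets has a
% choice function, i.e. \prod_{i \in I} A_i \neq \emptyset. Kelley (1950)
% showed that Tychonoff's theorem in turn implies the axiom of choice.
```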
195. Schema theory for gene expression programming. Huang, Zhengwen. January 2014.
This thesis studies a new variant of evolutionary algorithms called Gene Expression Programming. The evolution process of Gene Expression Programming is investigated from practice to theory. At the practical level, the original version of Gene Expression Programming was applied to a classification problem, and an enhanced version of the algorithm was consequently developed. This allowed a general understanding to be developed of each component of the genotype/phenotype separated representation of solutions employed by the algorithm. Based on this understanding, a version of schema theory was developed for Gene Expression Programming. The genetic modifications produced by each genetic operator employed by the algorithm were analysed, and a set of theorems predicting the propagation of schemata from one generation to the next was developed. A set of experiments was also performed to test the validity of the developed schema theory, and good agreement was obtained between the experimental results and the theoretical predictions.
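To illustrate the genotype/phenotype separation mentioned above, here is a minimal sketch in Python (with a made-up function set and gene string; not the thesis's implementation) of how a fixed-length Karva-encoded gene decodes breadth-first into an expression tree and is then evaluated:

```python
# Minimal sketch of Gene Expression Programming's genotype -> phenotype map:
# a gene is a fixed-length string read breadth-first (Karva notation), and
# only a prefix of it (the open reading frame) becomes the expression tree.

import operator
from collections import deque

FUNCTIONS = {'+': (2, operator.add), '-': (2, operator.sub),
             '*': (2, operator.mul)}

def decode(gene):
    """Build an expression tree [symbol, children] from a Karva string."""
    root = [gene[0], []]
    queue, pos = deque([root]), 1
    while queue:
        node = queue.popleft()
        arity = FUNCTIONS.get(node[0], (0, None))[0]
        for _ in range(arity):
            child = [gene[pos], []]
            pos += 1
            node[1].append(child)
            queue.append(child)
    return root

def evaluate(node, env):
    symbol, children = node
    if symbol in FUNCTIONS:
        return FUNCTIONS[symbol][1](*(evaluate(c, env) for c in children))
    return env[symbol]  # terminal symbol: look up its value

# The prefix '+*aab' decodes to (a*b) + a; the tail is unused genetic material.
tree = decode('+*aababbab')
print(evaluate(tree, {'a': 3, 'b': 4}))  # -> (3*4) + 3 = 15
```

Because mutation and crossover act on the flat string while fitness is computed on the decoded tree, any genetic change still yields a syntactically valid phenotype; this is the property a schema theory for the representation has to track.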
196. Automated discovery of inductive lemmas. Johansson, Moa. January 2009.
The discovery of unknown lemmas, case-splits and other so-called eureka steps is a challenging problem for automated theorem proving and has generally been assumed to require user intervention. This thesis is mainly concerned with the automated discovery of inductive lemmas. We have explored two approaches, based on failure recovery and on theory formation, with the aim of improving automation of first- and higher-order inductive proofs in the IsaPlanner system. We have implemented a lemma speculation critic which attempts to find a missing lemma using information from a failed proof-attempt. However, we found few proofs for which this critic was applicable and successful. We have also developed a program for inductive theory formation, which we call IsaCoSy. IsaCoSy was evaluated on different inductive theories about natural numbers, lists and binary trees, and found to successfully produce many relevant theorems and lemmas. Using a background theory produced by IsaCoSy, it was possible for IsaPlanner to automatically prove more new theorems than with lemma speculation. In addition to the lemma discovery techniques, we also implemented an automated technique for case-analysis. This allows IsaPlanner to deal with proofs involving conditionals, expressed as if- or case-statements.
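The theory-formation idea can be conveyed by a toy sketch (Python; a made-up generate-and-test loop, not IsaCoSy's synthesis algorithm, which works inside Isabelle and uses constraints to prune redundant terms): enumerate small equational conjectures and keep those that survive random testing as lemma candidates for a prover.

```python
# Toy theory-exploration loop: enumerate small equational conjectures about
# arithmetic operators and filter them by random testing. Survivors are
# candidate lemmas only; a real system (IsaCoSy + IsaPlanner) would then
# attempt genuine inductive proofs.

import random

OPS = {'add': lambda x, y: x + y, 'mul': lambda x, y: x * y,
       'sub': lambda x, y: x - y, 'max': max}

def conjectures():
    """Yield (description, test) pairs: commutativity and associativity."""
    for name, f in OPS.items():
        yield (f'{name}(x,y) = {name}(y,x)',
               lambda x, y, z, f=f: f(x, y) == f(y, x))
        yield (f'{name}({name}(x,y),z) = {name}(x,{name}(y,z))',
               lambda x, y, z, f=f: f(f(x, y), z) == f(x, f(y, z)))

def survives_testing(law, trials=200):
    return all(law(random.randrange(50), random.randrange(50),
                   random.randrange(50)) for _ in range(trials))

for text, law in conjectures():
    if survives_testing(law):
        print('candidate lemma:', text)  # the sub(...) variants get filtered out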
197. Bayesian Networks with Expert Elicitation as Applicable to Student Retention in Institutional Research. Dunn, Jessamine Corey. 13 May 2016.
The application of Bayesian networks within the field of institutional research is explored through the development of a Bayesian network used to predict first- to second-year retention of undergraduates. A hybrid approach to model development is employed, in which formal elicitation of subject-matter expertise is combined with machine learning in designing model structure and specifying model parameters. Subject-matter experts include two academic advisors at a small, private liberal arts college in the southeast, and the data used in machine learning include six years of historical student-related information (i.e., demographic, admissions, academic, and financial) on 1,438 first-year students. Netica 5.12, a software package designed for constructing Bayesian networks, is used for building and validating the model. The resulting model's predictive capabilities are evaluated, together with analyses of sensitivity, internal validity, and model complexity. Additionally, the utility of using Bayesian networks within institutional research and higher education is discussed.
The importance of comprehensive evaluation is highlighted, due to the study’s inclusion of an unbalanced data set. Best practices and experiences with expert elicitation are also noted, including recommendations for use of formal elicitation frameworks and careful consideration of operating definitions. Academic preparation and financial need risk profile are identified as key variables related to retention, and the need for enhanced data collection surrounding such variables is also revealed. For example, the experts emphasize study skills as an important predictor of retention while noting the absence of collection of quantitative data related to measuring students’ study skills. Finally, the importance and value of the model development process is stressed, as stakeholders are required to articulate, define, discuss, and evaluate model components, assumptions, and results.
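As a minimal illustration of the kind of inference such a network supports, here is a toy sketch in plain Python; the two-parent structure and all probabilities below are made up for illustration and are not the study's Netica model or its elicited parameters:

```python
# Toy Bayesian network for retention R with two binary parents mirroring the
# key variables named above: academic preparation A and financial risk F.
# P(A), P(F), and P(R | A, F) are illustrative numbers only.

P_A = {True: 0.7, False: 0.3}   # P(well prepared academically)
P_F = {True: 0.4, False: 0.6}   # P(high financial-need risk)
P_R = {  # P(retained | A, F)
    (True, True): 0.75, (True, False): 0.92,
    (False, True): 0.45, (False, False): 0.70,
}

def posterior_retained(evidence):
    """P(R=True | evidence) by brute-force enumeration over A and F."""
    num = den = 0.0
    for a in (True, False):
        for f in (True, False):
            if evidence.get('A', a) != a or evidence.get('F', f) != f:
                continue  # inconsistent with the observed evidence
            joint = P_A[a] * P_F[f]
            num += joint * P_R[(a, f)]
            den += joint
    return num / den

print(posterior_retained({}))             # marginal retention rate
print(posterior_retained({'A': False}))   # given weak academic preparation
```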
198. Fourier restriction phenomenon in thin sets. Papadimitropoulos, Christos. January 2010.
We study the Fourier restriction phenomenon in settings where there is no underlying proper smooth subvariety. We prove an (Lᵖ, L²) restriction theorem in general locally compact abelian groups and apply it in groups such as (Z/pᴸZ)ⁿ, R and locally compact ultrametric fields K. The problem of the existence of Salem sets in a locally compact ultrametric field (K, |·|) is also considered. We prove that for every 0 < α < 1 and ε > 0 there exist a set E ⊂ K and a measure μ supported on E such that the Hausdorff dimension of E equals α and |μ̂(x)| ≤ C|x|^(−α/2+ε). We also establish the optimal extension of the Hausdorff-Young inequality in the compact ring of integers R of a locally compact ultrametric field K. We shall prove the following: for every 1 ≤ p ≤ 2 there is a Banach function space Fᵖ(R) with σ-order continuous norm such that
(i) Lᵖ(R) ⊊ Fᵖ(R) ⊊ L¹(R) for every 1 < p < 2;
(ii) the Fourier transform F maps Fᵖ(R) to ℓᵖ′ continuously;
(iii) Lᵖ(R) is continuously included in Fᵖ(R) and Fᵖ(R) is continuously included in L¹(R);
(iv) if Z is a Banach function space with the same properties as Fᵖ(R) above, then Z is continuously included in Fᵖ(R);
(v) F¹(R) = L¹(R) and F²(R) = L²(R).
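For orientation, an (Lᵖ, L²) restriction estimate takes the following generic shape (this records the standard form of such inequalities, not the thesis's precise hypotheses): given a locally compact abelian group G with dual group Ĝ and a suitable measure μ on Ĝ,

```latex
% Generic (L^p, L^2) restriction inequality:
\[
  \Bigl( \int_{\widehat{G}} |\widehat{f}(\xi)|^{2} \, d\mu(\xi) \Bigr)^{1/2}
  \;\le\; C\, \| f \|_{L^{p}(G)}
  \qquad \text{for all } f \in L^{p}(G),
\]
% i.e. the Fourier transform of an L^p function has a well-defined
% restriction to the support of \mu as an element of L^2(\mu).
```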
199. Scheme-based theorem discovery and concept invention. Montano-Rivas, Omar. January 2012.
In this thesis we describe an approach to automatically invent/explore new mathematical theories, with the goal of producing results comparable to those produced by humans, as represented, for example, in the libraries of the Isabelle proof assistant. Our approach is based on ‘schemes’, which are formulae in higher-order logic. We show that it is possible to automate the instantiation process of schemes to generate conjectures and definitions. We also show how the new definitions and the lemmata discovered during the exploration of a theory can be used, not only to help with the proof obligations during the exploration, but also to reduce redundancies inherent in most theory-formation systems. We exploit associative-commutative (AC) operators using ordered rewriting to avoid AC variations of the same instantiation. We implemented our ideas in an automated tool, called IsaScheme, which employs Knuth-Bendix completion and recent automatic inductive proof tools. We have evaluated our system in a theory of natural numbers and a theory of lists.
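The scheme-instantiation idea can be pictured with a toy sketch (Python; a stand-in for the higher-order logic schemes IsaScheme actually manipulates, with a made-up operator pool): a scheme is a formula with operator holes, and conjectures arise by filling the holes with concrete operators, here screened by random testing before any proof attempt.

```python
# Toy scheme instantiation: a 'scheme' is a predicate with two operator holes
# f and g; instantiating the holes with concrete operators yields conjectures,
# which are screened by random testing.

import random

def distributivity_scheme(f, g):
    """Scheme: f(x, g(y, z)) = g(f(x, y), f(x, z))."""
    return lambda x, y, z: f(x, g(y, z)) == g(f(x, y), f(x, z))

OPERATORS = {
    'add': lambda x, y: x + y,
    'mul': lambda x, y: x * y,
    'max': max,
}

def plausible(law, trials=300):
    return all(law(random.randrange(30), random.randrange(30),
                   random.randrange(30)) for _ in range(trials))

for fn, f in OPERATORS.items():
    for gn, g in OPERATORS.items():
        if plausible(distributivity_scheme(f, g)):
            print(f'conjecture: {fn}(x,{gn}(y,z)) = {gn}({fn}(x,y),{fn}(x,z))')
```

On the naturals this retains, for example, mul over add and add over max, while discarding instantiations such as add over add; a redundancy-avoidance mechanism like IsaScheme's ordered rewriting would additionally collapse AC-equivalent variants of the surviving conjectures.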
200. Incremental modelling for verified communication architectures. Boehm, Peter. January 2011.
Modern computer systems are advancing from multi-core to many-core designs, and systems-on-chip (SoCs) are becoming increasingly complex while integrating a great variety of components, thus constituting complex distributed systems. Such architectures rely on extremely complex communication protocols to exchange data with the required performance. Arguing formally about the correctness of communication is an acknowledged verification challenge. This thesis presents a generic framework that formalises the idea of incremental modelling and step-wise verification to tackle this challenge: to control the overall complexity, features are added incrementally to a simple initial model and the complexity of each feature is encapsulated into an independent modelling step. Two main strategies reduce the verification effort. First, models are constructed with verification support in mind and the verification process is spread over the modelling process. Second, generic correctness results for framework components allow the verification to be reduced to discharging local assumptions when a component is instantiated. Models in the framework are based on abstract state machines formalised in higher-order logic using the Isabelle theorem prover. Two case studies show the utility and breadth of the approach: the ARM AMBA Advanced High-performance Bus protocol, an arbiter-based master-slave bus protocol, represents the family of SoC protocols; the PCI Express protocol, an off-chip point-to-point protocol, illustrates the application of the framework to sophisticated, performance-related features of current and future on-chip protocols. The presented methodology provides an alternative to the traditional monolithic and post-hoc verification approach.
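To convey the flavour of the state-machine models involved, here is a toy sketch in Python of a fixed-priority bus arbiter, the simplest relative of the arbiter-based protocols mentioned above; it is an illustrative stand-in, not the thesis's Isabelle/HOL formalisation of AMBA AHB:

```python
# Toy abstract state machine for a fixed-priority bus arbiter: one step
# consumes the current request vector and produces the next grant. The
# mutual-exclusion property (at most one grant per cycle) holds by
# construction here; in an incremental model, each added feature
# (pipelining, bursts, ...) would have to re-establish it.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Arbiter:
    n_masters: int
    granted: Optional[int] = None  # index of the granted master, if any

    def step(self, requests: List[bool]) -> None:
        """One cycle: grant the lowest-indexed requesting master."""
        assert len(requests) == self.n_masters
        self.granted = next((i for i, r in enumerate(requests) if r), None)

arb = Arbiter(n_masters=3)
for reqs in ([False, True, True], [True, True, False], [False, False, False]):
    arb.step(reqs)
    print('requests', reqs, '-> granted', arb.granted)
```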