1

An Examination of Quantum Foundations

Markes, Sonia January 2011
Quantum foundations is a field of diverse goals and methods. In this thesis, I will present three different approaches to quantum foundations, each emphasizing a different goal or perspective. The causaloid framework has the goal of using insight from quantum foundations to study quantum gravity. Ontic models are a tool used to study realist theories of quantum mechanics from an operational quantum information perspective. Nelson's mechanics is a derivation of the Schrödinger equation using the machinery of stochastic mechanics. As each of these approaches has a different set of goals, they are suited to different purposes and have different limitations. From the causaloid framework, I construct the concept of causally unbiased entropy and, at the same time, find an emergent idea of causality in the form of a measure of causal connectedness, termed the Q factor. In the ontic models framework, I reproduce the generalization of the concept of contextuality. For Nelson's mechanics, I examine its relationship to Bohmian mechanics, a realist formulation of quantum mechanics. I will then examine the relationship of these different approaches to one another. From this examination I will introduce the concept of physical contextuality in order to ask whether contextuality could be more than just a mathematical artifact. I also include a discussion of the property of deficiency in ontic models and its relation to contextuality given certain constraints.
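For readers unfamiliar with the ontic models framework referenced above, its basic reproduction requirement can be stated compactly. The notation below (ontic state space Λ, epistemic state μ_P, response functions ξ_M) is the standard one from the literature and is assumed here rather than taken from the thesis:

\[
\Pr(k \mid P, M) \;=\; \int_{\Lambda} \xi_M(k \mid \lambda)\, \mu_P(\lambda)\, \mathrm{d}\lambda \;=\; \operatorname{Tr}\!\left(E_k\, \rho_P\right),
\]

that is, averaging the measurement's response functions over the distribution of ontic states produced by a preparation must reproduce the Born-rule statistics for the corresponding quantum state ρ_P and POVM element E_k. Generalized contextuality then asks whether μ_P and ξ_M can depend only on ρ_P and E_k, or must also depend on the context in which those operators are realized.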
2

Quantum information, Bell inequalities and the no-signalling principle

Pitalúa-García, Damián January 2014
This PhD thesis contains a general introduction and three main chapters. Chapter 2 investigates Bell inequalities that generalize the CHSH and Braunstein-Caves inequalities. Chapter 3 shows a derivation of an upper bound on the success probability of a class of quantum teleportation protocols, denoted as port-based teleportation, from the no-cloning theorem and the no-signalling principle. Chapter 4 introduces the principle of quantum information causality. Chapter 2 considers the predictions of quantum theory and local hidden variable theories (LHVT) for the correlations obtained by measuring a pair of qubits by projections defined by randomly chosen axes separated by a given angle θ. The predictions of LHVT correspond to binary colourings of the Bloch sphere with antipodal points oppositely coloured. We show a Bell inequality for all θ, which generalizes the CHSH and the Braunstein-Caves inequalities in the sense that the measurement choices are not restricted to be in a finite set, but are constrained only by the angle θ. We motivate and explore the hypothesis that for a continuous range of θ > 0, the maximum correlation (anticorrelation) is obtained by assigning to one qubit the colouring with one hemisphere black and the other white, and assigning the same (reverse) colouring to the other qubit. We describe numerical tests that are consistent with this hypothesis and bound the range of θ. Chapter 3 shows a derivation of an upper bound on the success probability of port-based teleportation from the no-cloning theorem and the no-signalling principle. Chapter 4 introduces the principle of quantum information causality, a quantum version of the information causality principle. The quantum information causality principle states the maximum amount of quantum information that a transmitted quantum system can communicate as a function of its dimension, independently of any quantum physical resources previously shared by the communicating parties. These principles reduce to the no-signalling principle if no systems are transmitted. We present a new quantum information task, the quantum information causality game, whose success probability is upper bounded by the new principle, and show that an optimal strategy to perform it combines the quantum teleportation and superdense coding protocols with a task that has classical inputs.
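For reference, the two inequalities that Chapter 2 generalizes can be written in the usual correlator notation (assumed here, not the thesis's own). With two measurement settings per party, the CHSH inequality reads

\[
\bigl|E(a_1, b_1) + E(a_1, b_2) + E(a_2, b_1) - E(a_2, b_2)\bigr| \;\le\; 2,
\]

and the chained Braunstein-Caves inequality with N settings per party reads

\[
\bigl|E(a_1, b_1) + E(b_1, a_2) + E(a_2, b_2) + \cdots + E(a_N, b_N) - E(b_N, a_1)\bigr| \;\le\; 2N - 2.
\]

Both bounds hold for every local hidden variable theory but can be violated by measurements on entangled qubit pairs, which is why a generalization to measurement choices constrained only by the angle θ, rather than drawn from a finite set, is of interest.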
3

Relativistic quantum tasks

Adlam, Emily Christine January 2017
Quantum mechanics, which describes the behaviour of matter and energy on very small scales, is one of the most successful theories in the history of science. Einstein's theory of special relativity, which describes the relationship between space and time, is likewise a highly successful and widely accepted theory. And yet there is a well-documented tension between the two theories, to the extent that it is still not clear that the two can ever be reconciled. This thesis is concerned with furthering the current understanding of the relationship between quantum mechanics and special relativity. In the first part of the thesis we study the behaviour of quantum information in relativistic spacetime. The field of quantum information arose from the realisation that quantum information has a number of crucial properties that distinguish it from classical information, such as the no-cloning property, quantum contextuality, and quantum discord. More recently, it has been realised that placing quantum information under relativistic constraints leads to the emergence of further unique features which are not exhibited by either non-relativistic quantum information or relativistic classical information; as part of this ongoing research programme we develop a new relativistic quantum `paradox' which puts pressure on conventional views about the spatiotemporal persistence of quantum states over time. We then study a new set of relativistic quantum protocols which involve the distribution of entangled states over spacetime, defining one task involving the distribution of the two halves of a known entangled state, and another task involving the distribution of the two halves of an unknown entangled state. The second part of the thesis deals with relativistic quantum cryptography, a field which first began attracting serious attention when it was realised that a cryptographic task known as `bit commitment' can be implemented with perfect security under relativistic constraints. This result was highly significant, since it is provably impossible to implement bit commitment with perfect security in a purely classical or purely quantum context, and hence bit commitment is an ideal starting point for probing the power of relativistic quantum cryptography. In this thesis we propose several new relativistic quantum bit commitment protocols which have notable advantages over previously known protocols. We then move to a related task, a generalization of zero-knowledge proving which we refer to as knowledge-concealing evidencing of knowledge of a quantum state; we prove no-go theorems concerning the possibility of implementing this task with perfect security, and then set out a relativistic protocol for the task which is asymptotically secure as the dimension of the state in question becomes large. These results have interesting foundational significance above and beyond their applications in the field of cryptography, providing a new perspective on the connections between knowledge, realism and quantum states.
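As a reminder of the no-cloning property invoked above (stated here in textbook form rather than in the thesis's own notation): no single unitary U can copy an arbitrary unknown qubit state, because if U clones the basis states |0⟩ and |1⟩, linearity forces

\[
U\big((\alpha|0\rangle + \beta|1\rangle)\otimes|0\rangle\big) \;=\; \alpha\,|0\rangle|0\rangle + \beta\,|1\rangle|1\rangle \;\neq\; (\alpha|0\rangle + \beta|1\rangle)\otimes(\alpha|0\rangle + \beta|1\rangle)
\]

whenever both α and β are non-zero. It is this impossibility of copying unknown states, combined with relativistic signalling constraints, that relativistic quantum cryptographic protocols exploit.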
4

Topics in Quantum Foundations: Ontological Models, and Distinguishability as a Resource

Morris, Ryan January 2009
This thesis covers research in two disjoint areas. The ontological model program (formerly the hidden-variables program) for quantum theory has a long and noble tradition in the quantum foundations literature. By postulating a physical reality beyond the quantum state, we gain intuition on quantum phenomena and also come to understand constraints on realist interpretations of quantum theory. Bell's theorem tells us that such an underlying reality must be non-local, while the Kochen-Specker contextuality theorem undermines the classical notion that measurement should simply reveal pre-existing properties of reality. Recent research programs suggest that it is beneficial to view the quantum state as representing purely information. We show that the only current model which does this in a satisfactory manner is unable to reproduce all the statistics of quantum measurements. A recent generalization of the notion of contextuality has allowed for proofs of contextuality which differ from the original Kochen-Specker notion. We add a new result which shows that measurements in a model where the quantum state represents information must be contextual. Additionally, we refine the generalized notion of contextuality into strong and weak forms in order to parse the relationship between new and old results.

Entanglement resource theory is a highly successful investigation of the usefulness of entanglement for information processing tasks. In this thesis we apply the ideas from entanglement resource theory to another resource: state distinguishability. We show analogies between distinguishability resource theory and entanglement resource theory. In particular, the analogy includes: measures which are monotonic under a class of transformations; units of a resource; and bounds on measures in terms of the amount of the unit resource needed to form states and the amount of unit resource that can be extracted from states. We show that the pairs of states which can be reversibly converted into classical states are exactly the pairs of simultaneously diagonalizable states. Lastly, we characterize the trace-distance distinguishability of formation on a qubit system.
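The trace-distance quantity characterized in the final result above is, in the standard notation assumed here (not necessarily the thesis's own),

\[
D(\rho, \sigma) \;=\; \tfrac{1}{2}\,\lVert \rho - \sigma \rVert_1 \;=\; \tfrac{1}{2}\operatorname{Tr}\sqrt{(\rho-\sigma)^{\dagger}(\rho-\sigma)},
\]

which is operationally a natural currency for a distinguishability resource theory: if one of the two states ρ, σ is prepared with equal probability, the optimal single-shot guessing probability is (1 + D(ρ, σ))/2, so D quantifies exactly how distinguishable the pair is.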
5

Negative Quasi-Probability in the Context of Quantum Computation

Veitch, Victor January 2013
This thesis deals with the question of what resources are necessary and sufficient for quantum computational speedup. In particular, we study what resources are required to promote fault tolerant stabilizer computation to universal quantum computation. In this context we discover a remarkable connection between the possibility of quantum computational speedup and negativity in the discrete Wigner function, a particular, distinguished quasi-probability representation of quantum theory. This connection allows us to establish a number of important results related to magic state computation, an important model for fault tolerant quantum computation using stabilizer operations supplemented by the ability to prepare noisy non-stabilizer ancilla states. In particular, we resolve in the negative the open problem of whether every non-stabilizer resource suffices to promote computation with stabilizer operations to universal quantum computation. Moreover, by casting magic state computation as a resource theory we are able to quantify how useful ancilla resource states are for quantum computation, which allows us to give bounds on the required resources. In this context we discover that the sum of the negative entries of the discrete Wigner representation of a state is a measure of its usefulness for quantum computation. This gives a precise, quantitative meaning to the negativity of a quasi-probability representation, thereby resolving the 80-year debate as to whether this quantity is a meaningful indicator of quantum behaviour. We believe that the techniques we develop here will be widely applicable in quantum theory, particularly in the context of resource theories.
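To make the "sum of the negative entries" concrete, that quantity can be written as follows in the standard discrete-Wigner notation of the literature (assumed here; the thesis's own conventions may differ cosmetically):

\[
W_\rho(u) \;=\; \frac{1}{d}\operatorname{Tr}\!\left(A_u\,\rho\right), \qquad
\operatorname{sn}(\rho) \;=\; \sum_{u\,:\,W_\rho(u) < 0} \bigl|W_\rho(u)\bigr| \;=\; \frac{1}{2}\Bigl(\sum_{u} \bigl|W_\rho(u)\bigr| - 1\Bigr),
\]

where {A_u} are the phase-space point operators of the discrete Wigner function in odd prime power dimension d. Since the values W_ρ(u) sum to one, sn(ρ) vanishes exactly when the representation is a genuine probability distribution, and it is this deficit that serves as the measure of a state's usefulness for promoting stabilizer computation.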
6

Reality, causality, and quantum theory

Allen, John-Mark January 2017
Quantum theory describes our universe incredibly successfully. To our classically-inclined brains, however, it is a bizarre description that requires a reimagining of what fundamental reality, or 'ontology', could look like. This thesis examines different ontological features in light of the success of quantum theory, what it requires, and what it rules out. While these investigations are primarily foundational, they also have relevance to quantum information, quantum communication, and experiments on quantum systems. The way that quantum theory describes the state of a system is one of its most unintuitive features. It is natural, therefore, to ask whether a similarly strange description of states is required on an ontological level. This thesis proves that almost all quantum superposition states for d > 3 dimensions must be real - that is, present in the ontology in a well-defined sense. This is a strong requirement which prevents intuitive explanations of the many quantum phenomena which are based on superpositions. A new theorem is also presented showing that quantum theory is incompatible with macro-realist ontologies, where certain physical quantities must always have definite values. This improves on the Leggett-Garg argument, which also aims to prove incompatibility with macro-realism but contains loopholes. Variations on both of these results that are error-tolerant (and therefore amenable to experimentation) are presented, as well as numerous related theorems showing that the ontology of quantum states must be somewhat similar to the quantum states themselves in various specific ways. Extending these same methods to quantum communication, a simple proof is found showing that an exponential number of classical bits are required to communicate a linear number of qubits. That is, classical systems are exponentially bad at storing quantum data. Causal influences are another part of ontology where quantum theory demands a revision of our classical notions. This follows from the outcomes of Bell experiments, as rigorously shown in recent analyses. Here, the task of constructing a native quantum framework for reasoning about causal influences is tackled. This is done by first analysing the simple example of a common cause, from which a quantum version of Reichenbach's principle is identified. This quantum principle relies on an identification of quantum conditional independence which can be defined in four ways, each naturally generalising a corresponding definition for classical conditional independence. Not only does this allow one to reason about common causes in quantum experiments, but it can also be generalised to a full framework of quantum causal models (mirroring how classical causal models generalise Reichenbach's principle). This new definition of quantum causal models is illustrated by examples and strengthened by its foundation on a robust quantum Reichenbach's principle. An unusual, but surprisingly fruitful, setting for considering quantum ontology is found by considering time travel to the past. This provides a testbed for different ontological concepts in quantum theory and new ways to compare classical and quantum frameworks. It is especially useful for comparing computational properties. In particular, time travel introduces non-linearity to quantum theory, which brings (sometimes implicit) ontological assumptions to the fore while introducing strange new abilities. Here, a model for quantum time travel is presented which arguably has fewer objectionable features than previous attempts, while remaining similarly well-motivated. This model is discussed and compared with previous quantum models, as well as with the classical case. Together, these threads of investigation develop a better understanding of how quantum theory affects possible ontologies and how ontological prejudices influence quantum theory.
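For orientation, the classical screening-off condition that a quantum Reichenbach's principle must generalise is the requirement that a complete common cause C renders its effects Y and Z conditionally independent (standard notation, assumed here rather than taken from the thesis):

\[
P(Y, Z \mid C) \;=\; P(Y \mid C)\,P(Z \mid C).
\]

One natural quantum analogue, used in the related causal-models literature, replaces this factorisation with a condition on the channel from C to the joint system YZ, for example that the conditional mutual information I(Y : Z | C) of the channel's Choi-Jamiołkowski state vanishes; conditions of this kind are the sort of quantum conditional independence referred to in the abstract.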
7

Shape space in terms of Wasserstein geometry and application to quantum physics

Lessel, Bernadette 28 June 2018
No description available.
