1. Efficient verification of universal and intermediate quantum computing. Kapourniotis, Theodoros (January 2016).
The promise of scalable quantum technology appears more realistic after recent advances in both theory and experiment. Assuming a quantum computer is developed, the task of verifying the correctness of its outcome becomes crucial. Unfortunately, for a system that involves many particles, predicting its evolution via classical simulation becomes intractable. Moreover, verification of the outcome by computational methods, i.e. involving a classical witness, is believed to be inefficient for the hardest problems solvable by a quantum computer. A feasible alternative is to verify quantum computation via cryptographic methods, where an untrusted prover has to convince a weak verifier of the correctness of the outcome. This is the approach we take in this thesis. In the most standard configuration the prover is capable of computing all polynomial-time quantum circuits and the verifier is restricted to classical computation with very modest quantum power. The goal of existing verification protocols is to reduce the quantum requirements for the verifier, ideally making it purely classical, and to reduce the communication complexity.

In Part II we propose a composition of two existing verification protocols [Fitzsimons and Kashefi, 2012], [Aharonov et al., 2010] that achieves a quadratic improvement in communication complexity while keeping the quantum requirements for the verifier modest. Alongside this result, several new techniques are proposed, including the generalization of [Fitzsimons and Kashefi, 2012] to prime dimensions.

In Part III we discuss the idea of model-specific quantum verification, where the prover is restricted to intermediate quantum power, i.e. between full-fledged quantum and purely classical, and is thus more feasible experimentally. As a proof of principle we propose a verification protocol for the One-Pure-Qubit computer [Knill and Laflamme, 1998], which tolerates noise and is capable of computing hard problems such as large matrix trace estimation. The verification protocol is an adaptation of [Fitzsimons and Kashefi, 2012] running on Measurement-Based Quantum Computing, with newly proved properties of the underlying resources.

Connections of quantum verification to other security primitives are considered in Part IV. Authenticated quantum communication has already been proved to relate to quantum verification. We expand on this by proposing a quantum authentication protocol derived from [Fitzsimons and Kashefi, 2012] and discuss the implications for verification with a purely classical verifier.

Connections between quantum security primitives, namely blindness (the prover does not learn the computation), and classical security are considered in Part V. We introduce a protocol where a client with restricted classical resources blindly computes a universal classical gate with the help of an untrusted server, by adding modest quantum capabilities to both client and server. We prove that this example of quantum-enhanced classical security is a task that is classically impossible.
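For orientation, a standard ingredient behind blind and verifiable delegated computation in the measurement-based model (textbook UBQC notation; this is background, not the exact construction of the thesis): the client hides the true measurement angle of each qubit behind a random rotation $\theta_j$ and a random bit $r_j$, asking the server to measure at

\[
\delta_j = \phi'_j + \theta_j + r_j\pi \pmod{2\pi}, \qquad \theta_j \in \{0, \tfrac{\pi}{4}, \ldots, \tfrac{7\pi}{4}\}, \quad r_j \in \{0, 1\},
\]

where $\phi'_j$ already accounts for corrections from earlier measurement outcomes. Since the server only ever sees the rotated qubit $|+_{\theta_j}\rangle$ and the masked angle $\delta_j$, the computation stays hidden; verification protocols in the [Fitzsimons and Kashefi, 2012] line additionally insert isolated trap qubits whose measurement outcomes the verifier can predict and check.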
2. Insights from quantum information into fundamental physics. Farrelly, Terence Christopher (January 2015).
No description available.
3. Coherent behaviour of trapped electrons in a Coulomb glass. Fleming, Stephen (January 2015).
No description available.
4. Information and entropy in quantum theory. Maroney, Owen Jack Ernest (January 2002).
Recent developments in quantum computing have revived interest in the notion of information as a foundational principle in physics. It has been suggested that information provides a means of interpreting quantum theory and a means of understanding the role of entropy in thermodynamics. The thesis presents a critical examination of these ideas, and contrasts the use of Shannon information with the concept of 'active information' introduced by Bohm and Hiley. We look at certain thought experiments based upon the 'delayed choice' and 'quantum eraser' interference experiments, which present a complementarity between information gathered from a quantum measurement and interference effects. It has been argued that these experiments show the Bohm interpretation of quantum theory is untenable. We demonstrate that these experiments depend critically upon the assumption that a quantum optics device can operate as a measuring device, and show that, in the context of these experiments, it cannot be consistently understood in this way. By contrast, we then show how the notion of 'active information' in the Bohm interpretation provides a coherent explanation of the phenomena shown in these experiments.

We then examine the relationship between information and entropy. The thought experiment connecting these two quantities is the Szilard Engine version of Maxwell's Demon, and it has been suggested that quantum measurement plays a key role in this. We provide the first complete description of the operation of the Szilard Engine as a quantum system. This enables us to demonstrate that the role of quantum measurement suggested is incorrect, and further, that the use of information theory to resolve Szilard's paradox is both unnecessary and insufficient. Finally we show that, if the concept of 'active information' is extended to cover thermal density matrices, then many of the conceptual problems raised by this paradox appear to be resolved.
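For background, the standard quantities involved (not the thesis's own resolution of the paradox): in one cycle the Szilard engine uses one bit of which-side information about the molecule to extract at most

\[
W_{\max} = k_B T \ln 2
\]

of work from a single heat bath at temperature $T$, and that one bit is exactly the Shannon entropy of the molecule's position,

\[
H(X) = -\sum_x p(x) \log_2 p(x) = 1 \ \text{bit for } p = \left(\tfrac{1}{2}, \tfrac{1}{2}\right).
\]

Landauer's principle assigns the same $k_B T \ln 2$ as the minimum cost of erasing that bit, which is the usual information-theoretic route to saving the second law; the thesis argues this route is neither necessary nor sufficient.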
5. Optimal Annealing Paths for Adiabatic Quantum Computation. Yousefabadi, Navid (09 December 2011).
Shor's algorithm shows that circuit-model quantum computers can factorize integers in polynomial time, exponentially more efficiently than classical computers. There is currently no analogous algorithm for Adiabatic Quantum Computers (AQCs). We illustrate through a number of factorization problems that a naive AQC implementation fails to reveal an exponential speed-up. An exponential speed-up does become evident with the optimization of the AQC evolution path utilizing existing optimization approaches. We reduce the computation time even further by optimization over heuristically derived parametrised functions. Finally, we improve our own results by exploring two-dimensional paths, and give arguments that using more dimensions in the search space can enhance the computational power to an even greater extent.
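As background on the setting (standard AQC notation, not the thesis's specific parametrisation): the computation interpolates between an easy-to-prepare Hamiltonian $H_B$ and a problem Hamiltonian $H_P$,

\[
H(s) = (1 - s)\,H_B + s\,H_P, \qquad s = s(t/T) \in [0, 1],
\]

and the adiabatic theorem requires, roughly, a total evolution time

\[
T \gg \max_{s \in [0,1]} \frac{\bigl|\langle e_1(s)|\,\partial_s H(s)\,|e_0(s)\rangle\bigr|}{g(s)^2}, \qquad g(s) = E_1(s) - E_0(s),
\]

so optimizing the annealing path $s(t)$, or a higher-dimensional path through a larger family of Hamiltonians, amounts to spending evolution time where the spectral gap $g(s)$ is smallest.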
6. Parallel quantum computing: from theory to practice. Pius, Einar (January 2015).
The term quantum parallelism commonly refers to a property of quantum computations whereby an algorithm can act simultaneously on a superposition of states. However, this is not the only aspect of parallelism in quantum computing. Analogously to the classical computing model, every algorithm consists of elementary quantum operations, and the application of those operations can itself be parallelised. This kind of parallelism is explored in this thesis in the one-way quantum computing (1WQC) model and the quantum circuit model.

In the quantum circuit model we explore arithmetic circuits and circuit complexity theory. Two new arithmetic circuits for quantum computers are introduced in this work: an adder and a multiply-adder. The latter is especially interesting because its depth (i.e. the number of parallel steps required to finish the computation) is smaller than that of any known classical circuit when applied sequentially. From the complexity-theoretic perspective we concentrate on the classes QAC0 and QAC0[2], the quantum counterparts of AC0 and AC0[2]. The class AC0 consists of constant-depth circuits with unbounded fan-in AND and OR gates, and AC0[2] is obtained when unbounded fan-in parity gates are added to AC0 circuits. We prove that QAC0 circuits with two layers of multi-qubit gates cannot compute parity exactly. This is a step towards proving QAC0 ≠ QAC0[2], a separation known to hold for AC0 and AC0[2].

In 1WQC, computation is done through measurements on an entangled state called the resource state. Two well-known parallelisation methods exist in this model: signal shifting and finding the maximally delayed general flow. The first uses the measurement calculus formalism to rewrite the dependencies of an existing computation, whereas the second exploits the geometry of the resource state to find the optimal ordering of measurements. We prove that these two methods result in computations of the same depth when the input and output sizes are equal. In showing this equivalence we reveal new properties of 1WQC computations and design a new algorithm for the above-mentioned parallelisations.
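To make the gate-level notion of parallelism concrete, here is a minimal sketch (illustration only, not code or terminology from the thesis; the function name and gate representation are invented): gates acting on disjoint qubits can share a time step, so the depth of a circuit can be much smaller than its gate count.

def circuit_depth(gates):
    """Greedy layering: a gate starts one step after the last qubit it
    touches becomes free; the result is the number of parallel layers."""
    free_at = {}  # qubit -> first time step at which it is free again
    depth = 0
    for qubits in gates:  # each gate is given as the tuple of qubits it acts on
        start = max((free_at.get(q, 0) for q in qubits), default=0)
        for q in qubits:
            free_at[q] = start + 1
        depth = max(depth, start + 1)
    return depth

# Three two-qubit gates in a staircase pattern are forced to run sequentially
# (depth 3), whereas a layout whose first two gates act on disjoint qubits
# fits into two parallel layers (depth 2).
print(circuit_depth([(0, 1), (1, 2), (2, 3)]))  # -> 3
print(circuit_depth([(0, 1), (2, 3), (1, 2)]))  # -> 2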
7. Shadow and Light: Casting a Shadow with Light and the Design of Geometric Phase Quantum Logic Gates in Waveguide Systems. Morin, Henri (21 November 2022).
Shadows are a common and familiar optical effect. We can all easily cast, observe, and characterize them. In our daily experience, they occur with the presence of an illumination source and an opaque object that casts the shadow. However, shadows cast by an opaque object are not the only possibility. In the first project of this thesis, we demonstrate what we call the "laser shadow" effect, where beams of light serve as both illumination source and object to create a new type of shadow. We then characterize the temperature and power dependence of the effect. Finally, we demonstrate how this laser shadow is indeed a shadow by comparing it to a genuine shadow.
Creating a quantum computer is a goal at the intersection of both quantum mechanics and computer science. It is of interest because quantum computers have been shown to have an advantage over classical computers in solving certain types of problems. In the second project of this thesis, we design a possible building block for such a quantum computer. The design relies on the "geometric phase", which theory suggests can provide enhanced robustness against errors. We then test our design in simulations of waveguide systems using two material platforms in different coupling regimes. Finally, we propose a practical experiment to take the design out of the computer and test the waveguide system using a physical apparatus.
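For context, the textbook definition of the quantity involved (not specific to the waveguide designs studied here): an eigenstate $|n(\mathbf{R})\rangle$ carried adiabatically around a closed loop $C$ in parameter space picks up, in addition to the dynamical phase, the geometric (Berry) phase

\[
\gamma_n(C) = i \oint_C \langle n(\mathbf{R}) \,|\, \nabla_{\mathbf{R}}\, n(\mathbf{R}) \rangle \cdot d\mathbf{R},
\]

which depends only on the geometry of the loop traced out, not on how quickly it is traversed. This dependence on path geometry alone is the property usually invoked when arguing that geometric-phase gates are robust against certain control errors.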
8. Multi-partite entanglement in quantum information processing. Loukopoulos, Klearchos (January 2011).
Quantum theories have had unprecedented success in providing a framework for studying physical systems. A fundamental implication of these theories is the existence of so-called entangled states, that is, states whose description cannot be reduced to their constituents. These states are purely quantum and have no analogue in classical physics, where knowing the state of every particle is sufficient to infer the state of the system they compose. Entanglement is a core element of many quantum algorithms, quantum teleportation, quantum communications and quantum cryptographic scenarios. Furthermore, entanglement is present in nearly all solid-state systems when they are at, or close to, their state of lowest energy. It is therefore both a technological resource and a property which needs to be investigated in order to understand real-world materials at a fundamental level.

The most concise demonstration of entanglement is perhaps the case of maximal entanglement between two spin-1/2 particles. These maximally entangled two-particle states are called Bell states, and they have been used to demonstrate experimentally that quantum mechanics is inequivalent to classical mechanics. A generalization of this setting comes from studying entanglement between two physical systems, which can be either pure or mixed (e.g. in contact with a thermal bath). Entanglement between two systems, also known as bipartite entanglement, has been studied in depth and quantified through various measures. However, bipartite entanglement is, by definition, not the only quantity of interest. In some cases entanglement is global and its properties cannot be reduced to studying bipartitions. This type of entanglement, so-called multipartite entanglement, is harder to quantify and to study in general. Its presence is pronounced in physical systems at the point of undergoing a quantum phase transition, and it is also a core ingredient for quantum error-correcting codes, for performing classical computation with quantum resources, and for some cryptographic scenarios.

In this thesis we study properties of systems with multipartite entanglement in the context of renormalization and quantum phase transitions, we show that multipartite entanglement can be used to perform cryptographic tasks, and we investigate which classes of Hamiltonians generate multipartite entanglement while their action can nevertheless be simulated efficiently by a classical computer.
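For concreteness, the standard examples referred to above (textbook states, not constructions specific to the thesis): the maximally entangled two-qubit Bell states are

\[
|\Phi^{\pm}\rangle = \tfrac{1}{\sqrt{2}}\bigl(|00\rangle \pm |11\rangle\bigr), \qquad |\Psi^{\pm}\rangle = \tfrac{1}{\sqrt{2}}\bigl(|01\rangle \pm |10\rangle\bigr),
\]

while a simple example of a genuinely multipartite entangled state is the three-qubit GHZ state

\[
|\mathrm{GHZ}\rangle = \tfrac{1}{\sqrt{2}}\bigl(|000\rangle + |111\rangle\bigr).
\]

Tracing out any single qubit of the GHZ state leaves a merely classically correlated two-qubit mixture, which illustrates why multipartite entanglement cannot be reduced to the study of bipartitions.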
9. Some remarks on consequences of Shor's Factoring Algorithm. Karl-Georg Schlesinger, kgschles@esi.ac.at (26 February 2001).
No description available.
10. A Plausibility Argument for \#P$. Karl-Georg Schlesinger, kgschles@esi.ac.at (26 February 2001).
No description available.