201 |
A Quantum Light Source for Light-matter Interaction. Xing, Xingxing. 13 August 2013.
I present in this thesis the design, implementation and measurement results of a narrowband quantum light source based on cavity-enhanced Parametric Down-Conversion (PDC). Spontaneous Parametric Down-Conversion (SPDC) is the workhorse of optical quantum information and quantum computation, yet its low spectral brightness makes it unsuitable for applications that require deterministic nonlinearities. By placing the nonlinear crystal inside a cavity, the down-conversion spectrum is actively modified: all non-resonant modes experience destructive interference while the resonant mode experiences constructive interference, resulting in a large enhancement of the spectral brightness. I design and construct such a cavity-enhanced down-conversion source with record-high spectral brightness, making it possible to use cold atoms as the interaction medium to achieve a large nonlinearity between photons. The frequency of the photons is tunable, and their coherence time is measured to be on the order of 10 nanoseconds, matching the excited-state lifetime of typical alkali atoms. I characterize the output of the source extensively by measuring the second-order correlation function, quantifying two-photon indistinguishability, performing quantum state tomography of entangled states, and demonstrating the different photon statistics of the source.
The unprecedentedly long coherence time of the photon pairs has also made it possible to encode quantum information in the time domain of the photons. I present a theoretical proposal for multi-dimensional quantum information processing with such long-coherence-time photons and analyze its performance under realistic parameter settings. I implement this proposal with the quantum light source I have built and show, for the first time, that a qutrit can be encoded in the time domain of single photons. I demonstrate that coherence is preserved for the qutrit state, ruling out any classical probabilistic explanation of the experimental data. Such an encoding scheme provides easy access to multi-dimensional systems and can serve as a versatile platform for many quantum information and quantum computation tasks.
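The second-order correlation measurement mentioned in the abstract can be illustrated with a short sketch. This is the textbook estimator for the heralded g²(0) built from raw count rates, not necessarily the exact analysis used in the thesis, and all the count numbers below are hypothetical:

```python
def heralded_g2(n_herald, n_12, n_13, n_123):
    # Textbook estimator for the heralded second-order correlation at
    # zero delay: one detector heralds, two detectors split the signal.
    #   g2(0) = N_123 * N_herald / (N_12 * N_13)
    # Values well below 1 indicate single-photon (antibunched) statistics.
    return n_123 * n_herald / (n_12 * n_13)

# hypothetical counts for one integration window
g2 = heralded_g2(n_herald=1_000_000, n_12=10_000, n_13=10_000, n_123=10)
```

For these illustrative numbers the estimator gives g²(0) = 0.1, far below the classical bound of 1.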
|
202 |
Characterization of Quantum States of Light. Adamson, Robert B. A. 9 April 2010.
I present a series of experimental and theoretical advances in the field of quantum state estimation. Techniques for measuring the quantum state of light that were originally developed for distinguishable
photons fail when the particles are indistinguishable. I develop new methods for handling indistinguishability in quantum state estimation. The technique I present provides the first complete description of states of experimentally indistinguishable photons. It allows me to derive the number of parameters needed to describe an arbitrary state and to quantify distinguishability. I demonstrate its use by applying it to the measurement of the
quantum polarization state of two- and three-photon systems.
State characterization is optimal when no redundant information is collected about the state of the system. I present the results of the first optimal characterization of the polarization state of a two-photon system and show improved estimation power over the previous state of the art. I also show how the optimal measurements lead to a new description of the quantum state in terms
of a discrete Wigner function.
It is often desirable to describe the quantum state of a system in terms of
properties that are not themselves quantum-mechanical observables. This usually requires a full characterization of the state followed by a calculation of the properties from the parameters characterizing the state.
I apply a technique that allows such properties to be determined directly, without a full characterization of the state.
This allows one such property, the purity, to be determined in a single measurement, regardless of the size of the system, while
the conventional method of determining purity requires a number of measurements that scales exponentially with the system size.
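The scaling contrast in the last two sentences can be made concrete. A minimal sketch (the function names and example states are mine, not the thesis's): the purity Tr(ρ²) is a single number, yet reconstructing ρ first requires a number of measurement settings that grows exponentially with the number of qubits.

```python
def purity(rho):
    # Tr(rho^2) for a density matrix given as a nested list
    d = len(rho)
    return sum(rho[i][j] * rho[j][i] for i in range(d) for j in range(d)).real

def tomography_settings(n_qubits):
    # An n-qubit density matrix has 4**n - 1 independent real parameters,
    # so conventional full tomography needs exponentially many settings.
    return 4 ** n_qubits - 1

# single-qubit example states
maximally_mixed = [[0.5, 0.0], [0.0, 0.5]]   # purity 1/2
pure_state = [[1.0, 0.0], [0.0, 0.0]]        # purity 1
```

A direct purity measurement sidesteps `tomography_settings` entirely, which is the advantage the abstract describes.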
|
204 |
Studies in Applied and Fundamental Quantum Mechanics: Duality, Tomography, Cryptography and Holography. Bolduc, Eliot. 5 November 2013.
This thesis encompasses four pieces of work on wave-particle duality, weak-value-assisted tomography, high-dimensional quantum key distribution, and phase-only holograms. In the work on duality, we derive a novel duality relation and sketch a thought experiment that leads to an apparent violation of the duality principle. In the tomography project, we perform a state-determination procedure with weak values and study the accuracy of the method. In the quantum cryptography project, we optimize an experimental implementation of a quantum cryptography system in which two parties share information using the orbital angular momentum degree of freedom of entangled photon pairs. Finally, in the work on holography, we establish the exact solution to the encryption of a phase-only hologram and experimentally demonstrate its application to spatial light modulators. The four projects provide improvements to measurement procedures in applied and fundamental quantum mechanics.
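For context, the canonical interferometric duality relation, which work of this kind builds on, bounds the fringe visibility $V$ against the which-path distinguishability $D$. This is the standard Englert-Greenberger-Yasin inequality, not the novel relation derived in the thesis:

```latex
% Standard duality bound: visibility V and which-path distinguishability D
% cannot both be maximal; equality holds for pure states.
V^2 + D^2 \le 1
```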
|
205 |
The measurement, creation and manipulation of quantum optical states via photodetection. Webb, James. Engineering & Information Technology, Australian Defence Force Academy, UNSW. January 2009.
In this thesis, we demonstrate an array of photodetection theory and techniques bridging the traditional discrete- and continuous-variable experimental domains. In quantum optics, the creation and measurement of states of light are intertwined, and we present experimental architectures that consider both aspects. We describe the measurement of mean photon numbers at optical sideband frequencies using homodyne detection. We use our technique to provide a direct comparison with photon-counting measurements and observe that it exhibits superior speed, dynamic range and mode selectivity. Our analysis also rejects a semiclassical description of the vacuum state, with our observations supporting the quantum mechanical model. We develop a new means of describing the detection "signatures" of multi-port networks of non-photon-number-discriminating detectors. Our model includes the practical effects of loss and dark counts. We use this model to analyse the performance of the loop- and balanced-time-division-multiplexed (TDM) detector architectures in a projective measurement role, leading to a prescriptive recipe for optimising each architecture. In light of contemporary technology, we conclude that the balanced TDM detector is the better architecture. Our analysis is then extended to the tomographic reconstruction of an unknown optical state using multi-port photon-counting networks. Our new approach is successfully applied to the reconstruction of the photon statistics of weak coherent states, and it demonstrates reduced error and reduced sensitivity to experimental parameter variations compared with established techniques. We report the development of a source of quadrature-squeezed vacuum at 1550 nm and characterise the squeezing observed at the first three free spectral ranges of the down-conversion cavity. This squeezed vacuum is then used as a source of frequency-entangled photons for a projective photon-subtraction operation described by our earlier theory. We propose a new hybrid time/frequency-domain approach to homodyne detection and illustrate its application in characterising the prepared state. Our output state has a statistically significant single-photon contribution and permits future experimentation in frequency-basis quantum information.
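The multiplexed click-detector idea can be illustrated with an idealized sketch. This model is deliberately simpler than the one in the thesis: it assumes no loss and no dark counts, and the function name is mine.

```python
from math import factorial

def p_fully_resolved(n_photons, n_bins):
    # In a multiplexed click detector, n photons are routed to n_bins
    # output modes, each equally likely. The photon number is resolved
    # only when every photon lands in a distinct bin; this returns the
    # probability of that event (ideal case: no loss, no dark counts).
    if n_photons > n_bins:
        return 0.0
    return factorial(n_bins) / (factorial(n_bins - n_photons) * n_bins ** n_photons)
```

Increasing the number of bins drives this probability toward 1, which is why larger multiplexed networks of click detectors approximate true photon-number-resolving detection.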
|
206 |
Stepping stones towards linear optical quantum computing. Weinhold, Till. Date unknown.
The experiments described in this thesis investigate the path towards establishing the requirements of quantum computing in a linear optical system. Our qubits are polarisation-encoded photons, for which the basic operations of quantum computing, single-qubit rotations, are a well-understood problem. The difficulty lies in making photons interact; to achieve this we use measurement-induced non-linearities. The first experiment in this thesis describes the thorough characterisation of a controlled-sign gate based on such non-linearities. The photons are provided as pairs generated through parametric down-conversion, and as such share correlations that are unlikely to carry over into future large-scale implementations. En route to such larger circuits, the action of the controlled-sign gate is characterised when the input qubits are generated independently of each other, revealing a large drop in process fidelity. To explore the cause of this degradation, a thorough and highly accurate model of the gate is derived, including a realistic description of faulty circuitry, photon loss and multi-photon emission by the source. By simulating the effects of the various noise sources individually, the heretofore largely ignored multi-photon emission is identified as the prime cause of the degraded gate performance, causing a drop in fidelity nearly three times as large as any other error source. I further draw the first comparison between the performance of an experimental gate and the error probabilities per gate derived as thresholds for fault-tolerant quantum computing. In the absence of a single rigorous threshold value, I compare the gate performance to the models that have yielded the highest threshold to date, as an upper bound, and to the threshold of the Gremlin model, which allows for the most general errors.
Unsurprisingly, this comparison reveals that the implemented gate is clearly insufficient; however, remedying the multi-photon emission error alone would move this architecture to within striking distance of the boundary for fault-tolerant quantum computing. The methodology can be applied to any gate in any architecture and, combined with a suitable model of the noise sources, can become an important guide for the developments required to achieve fault-tolerant quantum computing. The final experiment on the path towards linear optical quantum computing is the demonstration of a pair of basic versions of Shor's algorithm that display the entanglement essential to the algorithm. The results again highlight the need for extensive measurements to reveal the fundamental quality of the implemented algorithm, which is not accessible with limited indicative measurements. In the second part of the thesis, I describe two experiments on other forms of entanglement, extending the action of a Fock-state filter, a filter capable of attenuating single-photon states more strongly than multi-photon states, to produce entangled states. Furthermore, this device can be used in conjunction with standard wave-plates to extend the range of operations possible on the bi-photonic qutrit space: this setup suffices to produce any desired qutrit state, giving access to new measurement capabilities and, in the process, creating and verifying the first entanglement between a qubit and a qutrit.
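The comparison between gate performance and fault-tolerance thresholds rests on converting a measured fidelity into an error probability per gate. A hedged sketch using the standard relation between process fidelity and average gate fidelity; the numbers are illustrative, not the thesis's measured values:

```python
def average_gate_fidelity(f_process, dim):
    # standard relation F_avg = (dim * F_process + 1) / (dim + 1)
    # for a quantum channel acting on a dim-dimensional system
    return (dim * f_process + 1) / (dim + 1)

def error_per_gate(f_process, dim):
    # one common (not unique) definition of the error probability per gate
    return 1.0 - average_gate_fidelity(f_process, dim)

# illustrative: a two-qubit gate (dim = 4) with 80% process fidelity
err = error_per_gate(0.80, 4)
```

The resulting error per gate (0.16 here) is then compared against the threshold error rates derived for fault-tolerant schemes, typically orders of magnitude smaller.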
|
209 |
Micromanipulation of Biological Particles with Optical Tweezers. Bayoudh, Sonia. Date unknown.
Following the first demonstration of trapping biological objects with infrared laser light by Arthur Ashkin in 1987, optical tweezers have become an increasingly useful and versatile tool for non-contact micromanipulation experiments in biology. In this thesis we demonstrated applications of optical tweezers in the botanical, chemical engineering and anatomical sciences. We investigated the three-dimensional shape of spinach chloroplasts using a steerable and a stationary trap system. A trapped rotating calcite crystal positioned close to a chloroplast provided a means of inducing the rotation and orientation of the chloroplast; this is the first demonstration of the utility of rotating birefringent particles in a biological application. This stirrer method is a versatile way of orienting any biological object to study its shape and structure. We also demonstrated the ability of optical tweezers to fix and displace chloroplasts inside a living spinach plant cell. In the second part of the work described in this thesis, the steerable trap was used to study the viscoelastic properties of a polymeric filament that connects a single bacterium to an activated-sludge floc. Using optical tweezers as a force transducer, we estimated the minimum bonding force underlying the weak interaction between the bacterium surface and the filament to be at least 10 pN. These measurements are of value in improving activated-sludge flocculation and, ultimately, the wastewater treatment process. In addition, the steerable trap was used to move small organelles inside large bacterial cells; repositioning the organelles created new internal cell structure. In the final part of the thesis, experiments are described in which the laser tweezers system was combined with a cw argon-ion laser microbeam to investigate the fusion of smooth muscle cells and macrophages.
To minimise optical damage to the cells, a special arrangement was established to create short pulses for cutting the contact between the cell membranes of the two fusion partners. The effectiveness of the cutting function of the pulsed system at 488 nm varied from cell to cell. The laser parameters, such as power, pulse duration and repetition rate, were varied to find the best operating conditions for the setup. Overall, however, the results indicate that the relatively long (millisecond) pulses available may not be well suited to such applications.
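The piconewton force scale quoted above follows from photon-momentum transfer. A rough order-of-magnitude sketch; the trapping efficiency Q and the other numbers are illustrative assumptions, not the calibration used in the thesis:

```python
def optical_force_pN(q_efficiency, n_medium, power_W):
    # F = Q * n * P / c : a fraction Q of the photon momentum flux n*P/c
    # is transferred to the trapped particle; result in piconewtons
    c = 2.998e8  # speed of light in vacuum, m/s
    return q_efficiency * n_medium * power_W / c * 1e12

# illustrative: Q = 0.1, aqueous medium (n = 1.33), 10 mW at the focus
force = optical_force_pN(0.1, 1.33, 0.010)
```

With these assumed values the estimate comes out at a few piconewtons, consistent with the scale of the bonding forces probed in the thesis.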
|
210 |
Open Quantum Dynamics of Mesoscopic Bose-Einstein Condensates. Corney, Joel Frederick. Date unknown.
The properties of an atomic Bose-Einstein condensate in a double-well potential are investigated through a two-mode analysis. An analytic solution for the semiclassical tunnelling and self-trapping dynamics is compared with numerical simulations of the quantum dynamics, which exhibit collapses and revivals for a closed system. A continuous non-destructive measurement technique to monitor the Josephson tunnelling oscillations is presented, in which the condensate in one well dispersively shifts the phase of a coherent probe beam in proportion to the atom number. The evolution of the resulting homodyne photocurrent and Bloch Q distributions shows that oscillations develop even when the initial state possesses phase symmetry. The conditional dynamics of the condensate that result from measurement back-action also appear in certain semiclassical formulations. The homodyne measurement technique is incorporated into a proposed weak-force detector, in which a maximally entangled initial state, the ground state of a double condensate with strong attractive atomic interactions, enables a high-precision measurement. The dynamics of quantum many-body multimode systems of interacting bosons are simulated using phase-space methods. The Wigner technique predicts novel noise effects in fibre solitons. The positive-P representation is used to simulate the formation of mesoscopic Bose-Einstein condensates via evaporative cooling in three-dimensional atom traps. The results indicate highly non-classical behaviour near the critical point and provide evidence for the spontaneous formation of vortices. Comparisons with corresponding mean-field calculations reveal large differences between the semiclassical and fully quantum results. Finally, the possibility of future progress with alternative phase-space methods is considered.
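The semiclassical tunnelling and self-trapping dynamics referred to above are governed by two coupled equations for the population imbalance z and the relative phase φ between the wells. A minimal forward-Euler sketch; the parameter values are illustrative, and the thesis treats these dynamics analytically rather than with this crude integrator:

```python
from math import sqrt, sin, cos

def step(z, phi, lam, dt):
    # semiclassical two-mode equations (rescaled time):
    #   dz/dt   = -sqrt(1 - z^2) * sin(phi)
    #   dphi/dt = lam * z + z * cos(phi) / sqrt(1 - z^2)
    # lam is the ratio of interaction energy to tunnelling energy
    root = sqrt(max(0.0, 1.0 - z * z))
    return z + dt * (-root * sin(phi)), phi + dt * (lam * z + z * cos(phi) / root)

def min_imbalance(z0, lam, steps=20000, dt=0.0005):
    # track the minimum population imbalance reached during the evolution
    z, phi, z_min = z0, 0.0, z0
    for _ in range(steps):
        z, phi = step(z, phi, lam, dt)
        z_min = min(z_min, z)
    return z_min

# weak interactions: full Josephson oscillations, z crosses zero;
# strong interactions: macroscopic self-trapping, z stays positive
```

The transition between the two regimes as lam grows is the self-trapping effect whose analytic solution the thesis compares against the full quantum dynamics.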
|