About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

On Experimental Quantum Communication and Cryptography

Erven, Christopher January 2012 (has links)
One of the most fascinating recent developments in research has been the growing interconnection between disciplines, so much so that fields as disparate as information theory and fundamental physics have combined to produce ideas for the next generation of computing and secure information technologies, both of which have far-reaching consequences. For more than fifty years Moore's law, which describes the trend of the transistor's size shrinking by half every two years, has proven uncannily accurate. However, the computing industry is now approaching a fundamental barrier as the size of a transistor approaches that of an individual atom and the laws of quantum mechanics take over. Rather than view this as the end, quantum information science has emerged to ask what additional power and functionality might be realized by harnessing these quantum effects. This thesis presents work in the sub-field of quantum cryptography, which seeks to use quantum means to assure the security of one's communications. The beauty of quantum cryptographic methods is that they can be proven secure, now and indefinitely into the future, relying solely on the validity of the laws of physics for their proofs of security. This is something nearly all current classical cryptographic methods cannot claim. The thesis begins by examining the first implementation of an entangled quantum key distribution (QKD) system over two free-space optical links. This system represents the first test-bed of its kind in the world and, while its practical importance in terrestrial applications is limited to a smaller university or corporate campus, it mimics the setup for an entangled satellite system, aiding the study of distributing entangled photons from an orbiting satellite to two earthbound receivers.
Having completed the construction of a second free-space link and the automation of the alignment system, I securely distribute keys to Alice and Bob at two distant locations separated by 1,575 m with no direct line of sight between them. I examine all of the assumptions necessary for my claims of security, something particularly important for moving these systems out of the lab and into commercial industry. I then describe the free-space channel over which the photons are sent and the implementation of each of the major system components. I close with a discussion of the experiment, which saw raw detected entangled photon rates of 565 s^{-1} and a quantum bit error rate (QBER) of 4.92%, resulting in a final secure key rate of 85 bits/s. Over the six-hour nighttime experiment I was able to generate 1,612,239 bits of secure key. With a successful QKD experiment completed, the thesis then turns to the problem of making the technology more practical by increasing the key rate of the system and thus the speed at which it can securely encrypt information. It does so in three ways, involving each of the major disciplines comprising the system: measurement hardware, source technology, and software post-processing. First, I experimentally investigate a theoretical proposal for biasing the measurement bases in the QKD system, showing a 79% improvement in the secret key generated from the same raw key rates. Next, I construct a second-generation entangled photon source, based on a Sagnac interferometer, with rates two orders of magnitude higher than the previous source. More importantly, the new source has a QBER as low as 0.93%, which is not only important for the security of the QKD system but will be required for the implementation of a new cryptographic primitive later.
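As a rough illustration of how the QBER eats into the secure key rate, the asymptotic secret key fraction for entanglement-based BB84-style protocols is often lower-bounded by r = 1 - 2h(Q), where h is the binary entropy. The sifting factor and error-correction efficiency below are illustrative assumptions, not the thesis's actual finite-key analysis (which is why its reported 85 bits/s is lower than this idealized bound).

```python
from math import log2

def h(p):
    """Binary entropy h(p) = -p*log2(p) - (1-p)*log2(1-p)."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def secure_key_rate(raw_rate, qber, sift=0.5, f_ec=1.0):
    """Asymptotic BB84-style lower bound: sifted rate times (1 - h(Q) - f_ec*h(Q)).

    sift: fraction of raw coincidences surviving basis reconciliation.
    f_ec: error-correction inefficiency (1.0 = Shannon limit).
    """
    r = 1.0 - (1.0 + f_ec) * h(qber)
    return max(0.0, raw_rate * sift * r)

# Numbers from the first experiment: 565 detected pairs/s at 4.92% QBER,
# and the same raw rate at the second-generation source's 0.93% QBER.
print(secure_key_rate(565, 0.0492))
print(secure_key_rate(565, 0.0093))
```

Even this idealized bound shows why lowering the QBER matters: the secret key fraction vanishes entirely near Q ≈ 11%, the well-known BB84 threshold.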
Lastly, I study the free-space link transmission statistics and the use of a signal-to-noise ratio (SNR) filter to improve the key rate by 25.2% from the same amount of raw key. The link statistics have particular relevance for a current project with the Canadian Space Agency to exchange a quantum key with an orbiting satellite, a project for which I have participated in two feasibility studies. Wanting to study the usefulness of more recent ideas in quantum cryptography, this thesis then looks at the first experimental implementation of a new cryptographic primitive called oblivious transfer (OT) in the noisy storage model. This primitive has important applications as it can be used to implement an identification scheme provably secure in a quantum scenario. Such a scheme could one day be used, for example, to authenticate a user over short distances, such as at ATMs, which have proven to be particularly vulnerable to hacking and fraud. Over a four-hour experiment, Alice and Bob measure 405,642,088 entangled photon pairs with an average QBER of 0.93%, allowing them to create a secure OT key of 8,939,150 bits. As a first implementer, I examine many of the pressing issues currently preventing the scheme from being more widely adopted, such as the need to relax the dependence of the OT rate on the loss of the system and the need to extend the security proof to cover a wider range of quantum communication channels and memories. It is important to note that OT is fundamentally different from QKD with respect to security, as the information is never physically exchanged over the communication line; rather, the joint equality function f(x) = f(y) is evaluated. Thus, security in QKD does not imply security for OT.
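The SNR-filtering idea can be sketched in a few lines: atmospheric turbulence makes the free-space transmission fluctuate on short time scales while background (accidental) counts stay roughly constant, so discarding low-transmission time bins removes a disproportionate share of the errors. The bin length, thresholds, and count statistics below are illustrative assumptions, not the thesis's measured link data.

```python
import random

random.seed(1)

# Simulate 1 s of link data in 10 ms bins: signal counts fluctuate with
# turbulence-induced fading, background counts are roughly constant.
bins = []
for _ in range(100):
    transmission = random.uniform(0.05, 1.0)   # fading channel
    signal = transmission * 50.0               # detected pairs in the bin
    noise = 5.0                                # accidental coincidences
    bins.append((signal, noise))

def mean_qber(selected):
    """Accidentals are uncorrelated, so roughly half of them cause errors."""
    s = sum(sig for sig, _ in selected)
    n = sum(noi for _, noi in selected)
    return 0.5 * n / (s + n)

threshold = 5.0  # keep bins whose SNR = signal/noise exceeds this
kept = [(s, n) for s, n in bins if s / n > threshold]

print(mean_qber(bins), mean_qber(kept))  # the filtered QBER is lower
```

The trade-off is that filtering throws away raw key, so the threshold must be tuned so the QBER reduction outweighs the lost counts, which is what the 25.2% net improvement quantifies.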
Finally, this thesis concludes with the construction and initial alignment of a second-generation free-space quantum receiver, useful for increasing the QKD key rates, but designed for a fundamental test of quantum theory, namely a violation of Svetlichny's inequality. Svetlichny's inequality is a generalization of Bell's inequality to three particles in which any two of the three particles may be non-locally correlated. Even so, a violation of Svetlichny's inequality shows that certain quantum mechanical states are incompatible with this restricted class of non-local yet realistic theories. Svetlichny's inequality is particularly important because, while an overwhelming number of Bell experiments have been performed to test two-body correlations, experiments on many-body systems have been few and far between. Experiments of this type are particularly valuable to explore since we live in a many-body world. The new receiver incorporates an active polarization analyzer capable of switching between measurement bases on a microsecond time-scale through the use of a Pockels cell while maintaining high-fidelity measurements. Some of the initial alignment and analysis results are detailed, including final measured contrasts of 1:25.2 and 1:22.6 in the rectilinear and diagonal bases, respectively.
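For concreteness, the three-particle correlations behind Svetlichny's inequality can be checked numerically. The sketch below builds a GHZ state and equatorial (X-Y plane) spin measurements, for which the GHZ correlation is E(φa, φb, φc) = cos(φa + φb + φc), and evaluates one common form of the Svetlichny combination (Mermin polynomial minus its primed twin). The angle choices are illustrative, not those used in the thesis.

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)

def A(phi):
    """Spin measurement along angle phi in the equatorial (X-Y) plane."""
    return np.cos(phi) * X + np.sin(phi) * Y

# GHZ state (|000> + |111>)/sqrt(2)
ghz = np.zeros(8, dtype=complex)
ghz[0] = ghz[7] = 1 / np.sqrt(2)

def E(pa, pb, pc):
    """Three-party correlation <A(pa) x A(pb) x A(pc)> on the GHZ state."""
    op = np.kron(np.kron(A(pa), A(pb)), A(pc))
    return np.real(ghz.conj() @ op @ ghz)

# Illustrative measurement angles for the three parties.
a, ap = 0.0, np.pi / 2
b, bp = 0.0, np.pi / 2
c, cp = -np.pi / 4, np.pi / 4

# Svetlichny combination S = M - M', bounded by 4 for any hybrid model in
# which at most two particles share non-local correlations.
M  = E(ap, b, c) + E(a, bp, c) + E(a, b, cp) - E(ap, bp, cp)
Mp = E(a, bp, cp) + E(ap, b, cp) + E(ap, bp, c) - E(a, b, c)
S = M - Mp

print(S)  # 4*sqrt(2) ~ 5.657, above the hybrid-nonlocal bound of 4
```

The GHZ state saturates the quantum maximum of 4√2, which is what a Svetlichny experiment with the new receiver would aim to approach.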
3

Ultrasensitive Magnetometry and Imaging with NV Diamond

Kim, Changdong May 2010 (has links)
NV centers in diamond are proving to be good building blocks for quantum information, electron spin resonance (ESR) imaging, and sensor applications. The key feature of the NV center is an electron spin that can be polarized and read out at room temperature. The readout is optical, so magnetic field imaging can also be done easily. Magnetic field variations with feature sizes below 0.3 microns cannot be directly resolved, so in this regime magnetic resonance imaging must be employed. To realize the full sensitivity of NV diamond, the spin transition linewidth must be as narrow as possible. Additionally, in the case of NV ensembles for micron-sized magnetometers, there must be a high concentration of NV centers. To this end three techniques are explored: (1) electron paramagnetic resonance (EPR) imaging with microwave field gradients, (2) magic angle rotation of the magnetic field, and (3) TEM irradiation to optimize the yield of NV centers in diamond. For the EPR imaging demonstration, a resonant microwave field gradient is used in place of the usual DC magnetic gradient to obtain enough spatial resolution to resolve two very close "double NV" centers in a type Ib bulk diamond. Microfabrication technology enabled a micron-size wire structure to sit directly on the surface of a millimeter-scale diamond plate. In contrast to conventional magnetic resonance imaging, pulsed ESR was used to measure Rabi oscillations. From the beating of the Rabi oscillations of a "double NV", the pair was resolved using one-dimensional EPR imaging (EPRI) and the spatial separation was obtained. To achieve high sensitivity in nitrogen-doped diamond, the dipole-dipole coupling between the electron spin of the NV center and the electron of the substitutional nitrogen (14N) must be suppressed because it causes linewidth broadening. Magic angle spinning is an accepted technique to push T2 and T2* up toward the T1 limit.
An experiment was performed using an HPHT diamond with a high concentration of nitrogen, and a rotating field was applied with a microfabricated wire structure to reduce line broadening. In this experiment, ~50% suppression of the linewidth was observed and the effective time constant T2* improved from 114 ns to 227 ns. To achieve the highest possible sensitivity for micro-scale magnetic sensors, the concentration of NV centers should be large. Since the unconverted nitrogen atoms are magnetic impurities, they shorten T2 and T2*, giving a tradeoff between NV (and therefore N) concentration and sensitivity. To construct a damage monitor, a type Ib HPHT sample was irradiated with electrons from a transmission electron microscope (TEM); the effects on the ESR transition were seen well before physical damage appeared on the diamond, making this a sensitive metric for irradiation damage.
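The microwave-gradient imaging idea can be sketched numerically: two NV centers at different positions under a microwave field gradient acquire different Rabi frequencies, so their summed fluorescence trace beats, and a Fourier transform of the Rabi signal separates the two frequencies, which map back to positions through the known gradient. All numbers below (gradient strength, positions, trace length) are illustrative assumptions, not the experiment's parameters.

```python
import numpy as np

# Microwave field gradient: local Rabi frequency proportional to local B1.
# Assume Omega(x) = g * x, with g in MHz per micron (illustrative).
g = 2.0                       # MHz / um
x1, x2 = 3.0, 3.5             # positions of the "double NV" pair, um
f1, f2 = g * x1, g * x2       # their Rabi frequencies, MHz

# Summed Rabi trace of both centers (equal contrast, decay ignored).
dt = 0.01                     # us per sample
t = np.arange(0, 40, dt)      # 40 us of Rabi evolution
signal = 0.5 * np.cos(2 * np.pi * f1 * t) + 0.5 * np.cos(2 * np.pi * f2 * t)

# The beating trace separates into two lines in the Fourier domain.
spec = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(t.size, d=dt)

# Two strongest lines -> Rabi frequencies -> positions via the gradient.
peaks = sorted(freqs[np.argsort(spec)[-2:]])  # crude: assumes separated lines
x_est = [pk / g for pk in peaks]
print(x_est)  # recovers the two NV positions
```

The resolution of this scheme is set by how long the Rabi oscillations stay coherent, which is why the linewidth-narrowing techniques in the abstract matter for imaging as well as for magnetometry.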
4

Practical issues in theoretical descriptions of experimental quantum state and entanglement estimation

Yin, Jun 06 1900 (has links)
xii, 133 p. : ill. (some col.) / We study entanglement estimation and verification in realistic situations, taking into account experimental imperfections and the statistical fluctuations due to finite data. We consider both photonic and spin-1/2 systems. We study how entanglement created with mixed photon wave packets is degraded. We apply statistical analysis to, and propose criteria for, reliable entanglement verification and estimation. Finally, we devote some effort to making quantum state estimation efficient by applying information criteria. This dissertation includes previously published co-authored material. / Committee in charge: Michael G. Raymer, Chair; Steven J. van Enk, Advisor; Stephen Hsu, Member; Jens U. Noeckel, Member; Jeffrey A. Cina, Outside Member
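One simple instance of the kind of statistical criterion this abstract refers to: estimate the fidelity of a two-qubit state with the maximally entangled |Φ+⟩ from a finite number of runs, and only claim entanglement when the estimate exceeds the separable bound of 1/2 by several standard errors. The state, sample size, and measurement model below are generic textbook choices, not the dissertation's actual constructions.

```python
import random

random.seed(7)

# True state: Werner state rho = p*|Phi+><Phi+| + (1-p)*I/4.
# Its fidelity with |Phi+> is F = p + (1-p)/4; F > 1/2 certifies entanglement.
p = 0.6
F_true = p + (1 - p) / 4          # = 0.7

# Pretend each run yields a 0/1 outcome with mean F_true (a projection onto
# |Phi+>, in practice estimated from several local measurement settings).
N = 2000
outcomes = [1 if random.random() < F_true else 0 for _ in range(N)]

F_hat = sum(outcomes) / N
se = (F_hat * (1 - F_hat) / N) ** 0.5   # binomial standard error

k = 3  # demand a 3-sigma margin before claiming entanglement
verified = F_hat - k * se > 0.5
print(F_hat, se, verified)
```

The point of the margin k is exactly the reliability issue the abstract raises: with finite data, an estimate slightly above 1/2 can easily be a statistical fluctuation rather than genuine entanglement.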
5

Negative frequency at the horizon : scattering of light at a refractive index front

Jacquet, Maxime J. January 2017 (has links)
This thesis considers the problem of calculating and observing the mixing of modes of positive and negative frequency in inhomogeneous, dispersive media. Scattering of vacuum modes of the electromagnetic field at a moving interface in the refractive index of a dielectric medium is discussed. Kinematic arguments are used to demonstrate that this interface may, in a regime of linear dispersion, act as the analogue of the event horizon of a black hole for modes of the field. Furthermore, a study of the dispersion of the dielectric shows that five distinct configurations of modes of the inhomogeneous medium exist at the interface as a function of frequency. Thus it is shown that the interface acts simultaneously as a black-hole-horizon-like, white-hole-horizon-like, and horizonless emitter. The role, and importance, of negative-frequency modes of the field in mode conversion at the horizon is established and yields a calculation of the spontaneous photonic flux at the interface. An algorithm to calculate the scattering of vacuum modes at the interface is introduced. Spectra of the photonic flux in the moving and laboratory frames are computed for all modes and all realisable increases in the refractive index at the interface. As a result of the various mode configurations, the spectra are highly structured into intervals with a black-hole horizon, a white-hole horizon, and no horizon. The spectra are dominated by a negative-frequency mode, which is the partner in any Hawking-type emission. An experiment in which an incoming positive-frequency wave is populated with photons is assembled to observe the transfer of energy to outgoing waves of positive and negative frequency at the horizon. The effect of mode conversion at the interface is clearly shown to be a feature of horizon physics. This is a classical version of the quantum experiment that aims to validate the mechanism of Hawking radiation.
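The kinematic horizon condition can be illustrated with a toy calculation: in a normally dispersive medium, a mode is trapped behind the moving front once its group velocity drops below the front's speed, so there is a frequency at which vg(ω) = u, the analogue horizon. The dispersion relation and numbers below are illustrative assumptions, not the dielectric model used in the thesis.

```python
# Toy normally dispersive medium, in units with c = 1.

def n(w):
    """Illustrative refractive index rising with frequency (normal dispersion)."""
    return 1.45 + 0.01 * w**2

def vg(w):
    """Group velocity 1/(n + w*dn/dw) for the index above (dn/dw = 0.02*w)."""
    return 1.0 / (n(w) + w * (0.02 * w))

u = 1.0 / 1.46   # front speed: just below the low-frequency group velocity

# Bisection for the horizon frequency where vg(w) = u: slow modes above it
# cannot outrun the front, fast modes below it can.
lo, hi = 0.0, 5.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if vg(mid) > u:
        lo = mid
    else:
        hi = mid
w_h = 0.5 * (lo + hi)
print(w_h)  # analogue-horizon frequency for this toy dispersion
```

Frequency intervals on either side of w_h correspond to the horizon and horizonless regimes of the spectra; in the full problem the frequency dependence of n produces the five mode configurations described above.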
