Investigations into phase effects from diffracted Gaussian beams for high-precision interferometry

Gravitational wave detectors are a new class of observatories aiming to detect gravitational waves from cosmic sources. All-reflective interferometer configurations have been proposed for future detectors, replacing transmissive optics with diffractive elements and thereby reducing the thermal issues associated with power absorption. However, diffraction gratings introduce additional phase noise, placing more stringent requirements on alignment stability, so all-reflective interferometers require further investigation. Analysing the alignment stability of interferometers containing diffraction gratings calls for a suitable mathematical framework based on Gaussian modes. Such a framework was created, in which small beam displacements are modelled using a modal technique. It was confirmed that the original modal-based model does not contain the phase changes associated with grating displacements. Experimental tests verified that the phase of a diffracted Gaussian beam is independent of the beam shape. Phase effects were further examined using a rigorous time-domain simulation tool. These findings show that the perceived phase difference arises from an intrinsic change of coordinate system within the modal-based model, and that the extra phase can be added manually to the modal expansion. This thesis provides a well-tested and detailed mathematical framework that can be used to develop simulation codes for modelling more complex layouts of all-reflective interferometers.
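
For orientation, a minimal sketch of the two standard relations behind the abstract, written in common Hermite-Gauss conventions (the symbols a, w_0, d, m and \Delta x are introduced here for illustration and are not necessarily the thesis's own notation): a fundamental Gaussian beam displaced laterally by a small amount a couples, to first order, into the first-order mode, while a lateral displacement \Delta x of a grating with period d adds a phase to the m-th diffracted order:

\[
  u_{00}(x - a) \;\approx\; u_{00}(x) + \frac{a}{w_0}\, u_{10}(x), \qquad a \ll w_0,
\]
\[
  \Delta\phi_m \;=\; \frac{2\pi\, m\, \Delta x}{d},
\]

where w_0 is the beam waist radius. The second relation is the grating-displacement phase that, as stated above, is absent from the plain modal expansion and must be added to it by hand.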

Identifier: oai:union.ndltd.org:bl.uk/oai:ethos.bl.uk:573554
Date: January 2013
Creators: Lodhia, Deepali
Publisher: University of Birmingham
Source Sets: Ethos UK
Detected Language: English
Type: Electronic Thesis or Dissertation
Source: http://etheses.bham.ac.uk//id/eprint/4331/
