Interferometric Methods for Seismic Monitoring in Industrial Environments

Dales, Philippe 19 October 2018 (has links)
As the global demand for energy and natural resources continues to increase, so does our interaction with Earth's near surface through resource extraction and waste injection. Seismology plays a central role in monitoring these interactions. The focus of this work is on improving the detection and localization of seismic sources, a fundamental problem in seismology. After discussing the strengths and limitations of existing methods for source detection and localization, I develop a solution based on a beamforming approach that uses cross-correlation functions in a maximum likelihood search for sources of seismic energy. I call this method InterLoc, short for "interferometric locator", and apply it to data recorded at two active underground mines to demonstrate its effectiveness in monitoring both impulsive and persistent sources. Next, I demonstrate how persistent seismic sources, typically seen as contaminants, can be used directly to measure small changes in the medium along source-station paths. This method relies on the ability to locate and monitor source activity, and then to use this information to select the cross-correlation functions that isolate each source of interest. From the resulting cross-correlations, it is possible to measure small temporal changes in the waveforms. To demonstrate this method, I show how ore-crushers can be used to track the growth of a block cave by measuring changes in traveltimes as ray paths are forced to circumvent the growing cave. In the final chapter I focus on the development of a processing framework for the detection and location of microseismic events recorded on dense (or large-N) surface arrays.
The proposed framework involves: (1) data reduction; (2) dividing the array into smaller sub-arrays; (3) waveform processing within fixed time windows; (4) stacking of time windows selected according to each potential origin time and source location; and (5) combining the output from all sub-arrays to infer detections and locations of sources. This methodology is validated with synthetic data built to emulate a real dataset from a 10,050-node survey evaluating the suitability of a site for carbon sequestration. Given the presence of very strong coherent contaminating sources and low rock quality, I am only able to detect sources with moment magnitude greater than -0.5. In the five hours of data processed there are no positive detections, suggesting this could be a good site for carbon storage. More work is needed to improve the detection threshold and to quantify risk based on event location and magnitude. In summary, my work demonstrates how the interference (via cross-correlation) and stacking of seismic waveforms can be combined in different ways to create effective solutions for problems faced by today's industries.
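The core idea behind a cross-correlation beamformer of the kind described in this abstract (grid-search candidate locations, and for each candidate stack the cross-correlation amplitude at the predicted differential traveltime of every station pair) can be sketched in a few lines. This is a hypothetical toy in NumPy, not the thesis implementation: it assumes a homogeneous velocity model, idealized Gaussian-peaked cross-correlation functions, and illustrative names throughout.

```python
import numpy as np

def cc_beamform(ccfs, lags, stations, grid, velocity):
    """Stack cross-correlation amplitudes at predicted lags for each grid point.

    ccfs     : dict {(i, j): 1-D cross-correlation of traces i and j}
    lags     : lag times (s) corresponding to CCF samples
    stations : (n_sta, 2) station coordinates
    grid     : (n_pts, 2) candidate source coordinates
    velocity : assumed homogeneous medium velocity
    """
    power = np.zeros(len(grid))
    for (i, j), ccf in ccfs.items():
        d_i = np.linalg.norm(grid - stations[i], axis=1)
        d_j = np.linalg.norm(grid - stations[j], axis=1)
        tau = (d_j - d_i) / velocity  # predicted differential traveltime
        # Sample each CCF at the predicted lag (nearest-sample lookup)
        idx = np.clip(np.searchsorted(lags, tau), 0, len(lags) - 1)
        power += ccf[idx]
    return power

# Toy demo: four stations on a unit square, idealized CCFs that peak
# exactly at the true differential traveltimes of a source at (0.3, 0.7).
stations = np.array([[0., 0.], [1., 0.], [0., 1.], [1., 1.]])
true_src = np.array([0.3, 0.7])
v = 1.0
lags = np.arange(-2.0, 2.0, 0.01)
ccfs = {}
for i in range(len(stations)):
    for j in range(i + 1, len(stations)):
        tau_true = (np.linalg.norm(true_src - stations[j])
                    - np.linalg.norm(true_src - stations[i])) / v
        ccfs[(i, j)] = np.exp(-((lags - tau_true) / 0.05) ** 2)

gx, gy = np.meshgrid(np.linspace(0, 1, 21), np.linspace(0, 1, 21))
grid = np.column_stack([gx.ravel(), gy.ravel()])
best = grid[np.argmax(cc_beamform(ccfs, lags, stations, grid, v))]
```

With ideal CCFs the stack power is maximized at the grid point nearest the true source; real data would require a realistic velocity model and noisy, band-limited correlations.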

Optimisation of galaxy identification methods on large interferometric surveys

Gqaza, Themba 14 May 2019 (has links)
The astronomical size of the spectral data cubes that will result from the SKA pathfinders' planned large HI surveys (such as LADUMA, the Fornax HI survey, DINGO, and WALLABY) necessitates fully automated three-dimensional (3D) source-finding and parametrization tools. A difference of a fraction of a percent in the performance of these automated tools corresponds to a significant number of galaxies being detected or missed. Success or failure in resolving satellites around big spirals will affect both the low- and the high-mass ends of the HI mass function. As a result, the performance and efficiency of these automated tools are of great importance, especially in the epoch of big data. Here I present a comprehensive comparison of performance between the fully automated source identification and parametrization software SOFIA, the visual galaxy identification method, and the semi-automated galaxy identification method. Each galaxy identification method has been applied to the same ∼35 GB 3D HI data cube, which results from a blind HI imaging survey conducted with the Westerbork Synthesis Radio Telescope (WSRT). The survey mapped the overdensity corresponding to the Perseus-Pisces Supercluster filament crossing the Zone of Avoidance (ZoA) at (ℓ, b) ≈ (160°, 0.5°). A total of 211 galaxies were detected using the semi-automated method by Ramatsoku et al. [2016]. In this work, I detected 194 galaxies using the visual identification method, of which 89.7% (174) have counterparts in the galaxy catalogue produced by the semi-automated method. A total of 130 detections were made using SOFIA, of which 89 were also identified by the two other methods. I used the sample of 174 visual detections with semi-automated counterparts as a testbed to calculate the reliability and completeness achieved by SOFIA: the achieved reliability is ∼0.68, whereas the completeness is ∼0.51.
Further fine-tuning is necessary to gain a better handle on all SOFIA parameters and to achieve higher reliability and completeness values.
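The reliability and completeness figures quoted above follow directly from the stated cross-match counts. A minimal sketch, assuming reliability is the cross-matched fraction of SOFIA's detections and completeness is the fraction of the 174-source testbed that SOFIA recovered (the thesis may apply stricter matching criteria):

```python
def reliability(n_matched, n_detected):
    # Fraction of the finder's detections confirmed against the reference catalogue
    return n_matched / n_detected

def completeness(n_matched, n_reference):
    # Fraction of reference sources the finder recovered
    return n_matched / n_reference

# SOFIA made 130 detections, 89 of which cross-match the 174-source testbed
r = reliability(89, 130)   # ~0.68, matching the quoted value
c = completeness(89, 174)  # ~0.51, matching the quoted value
```

Note the trade-off these two metrics encode: loosening a finder's detection threshold raises completeness but admits more false positives, lowering reliability, which is why parameter fine-tuning targets both at once.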
