  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

RFI Monitoring for the MeerKAT Radio Telescope

Schollar, Christopher 01 January 2015 (has links)
South Africa is currently building MeerKAT, a 64-dish radio telescope array, as a precursor for the proposed Square Kilometre Array (SKA). Both telescopes will be located at a remote site in the Karoo with a low level of Radio Frequency Interference (RFI). It is important to maintain this low level of RFI to ensure that MeerKAT has an unobstructed view of the universe across its bandwidth, and the only way to manage the environment effectively is with a record of RFI around the telescope. The RFI management team on the MeerKAT site has multiple tools for monitoring RFI: a 7-dish radio telescope array called KAT7, used for bi-weekly RFI scans on the horizon; two RFI trailers, which provide a mobile spectrum and transient measurement system; and commercial handheld spectrum analysers. Most of these tools are used only sporadically during RFI measurement campaigns. None of them provided a continuous record of the environment, and none performed automatic RFI detection. Here we design and implement an automatic, continuous RFI monitoring solution for MeerKAT. The monitor consists of an auxiliary antenna on site which continuously captures and stores radio spectra. The statistics of these spectra describe the radio frequency environment and identify potential RFI sources. All of the stored RFI data is accessible over the web: users can view the data through interactive visualisations or download the raw data. The monitor thus provides a continuous record of the RF environment, automatically detects RFI and makes this information easily accessible. The RFI monitor functioned successfully for over a year with minimal human intervention and assisted RFI management on site during RFI campaigns. The data has proved accurate, the RFI detection algorithm has been shown to be effective, and the web visualisations have been tested by MeerKAT engineers and astronomers and found to be useful.
The monitor represents a clear improvement over previous monitoring solutions used by MeerKAT and is an effective site management tool.
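The abstract says the monitor identifies potential RFI from the statistics of the captured spectra but does not specify the algorithm. As a minimal sketch of one common statistics-based approach (robust median/MAD thresholding, an assumption here, not necessarily the thesis's actual method; `flag_rfi` and its parameters are illustrative names):

```python
import numpy as np

def flag_rfi(spectra, n_sigma=5.0):
    """Flag spectral samples whose power exceeds a robust threshold.

    spectra: 2-D array of power spectra, shape (time, channel).
    Returns a boolean mask of the same shape (True = suspected RFI).
    Illustrative only: median/MAD thresholding is an assumed technique.
    """
    # Robust per-channel baseline: median over time
    baseline = np.median(spectra, axis=0)
    # Median absolute deviation, scaled to approximate sigma for Gaussian noise
    mad = np.median(np.abs(spectra - baseline), axis=0)
    sigma = 1.4826 * mad
    # Guard against dead (zero-variance) channels
    sigma = np.where(sigma > 0, sigma, np.inf)
    return (spectra - baseline) > n_sigma * sigma
```

A channel carrying a persistent transmitter would show a raised median rather than outliers, so a real monitor would combine this with longer-term occupancy statistics.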
2

VRBridge: a Constructivist Approach to Supporting Interaction Design and End-User Authoring in Virtual Reality

Winterbottom, Cara 01 June 2010 (has links)
For any technology to become widely used and accepted, it must support end-user authoring and customisation. This means making the technology accessible by enabling understanding of its design issues and reducing its technical barriers. Our interest is in enabling end-users to author dynamic virtual environments (VEs), specifically their interactions: player interactions with objects and the environment, and object interactions with each other and the environment. This thesis describes a method to create tools and design aids which enable end-users to design and implement interactions in a VE and which help them build the requisite domain knowledge while reducing the cost of learning a new set of skills. Our design method is grounded in constructivism, a theory of how knowledge is acquired and used. It provides principles for managing complexity in knowledge acquisition: multiplicity of representations and perspectives; simplicity of basic components; encouragement of exploration; support for deep reflection; and giving users as much control of their process as possible. We derived two main design aids from these principles: multiple, interactive and synchronised domain-specific representations of the design; and multiple forms of non-invasive and user-adaptable scaffolding. The method began with extensive research into representations and scaffolding, followed by investigation of the design strategies of experts, the needs of novices and how best to support them with software, and the requirements of the VR domain. We also conducted a classroom observation of the practices of non-programmers in VR design to discover their specific problems in effectively conceptualising and communicating interactions in VR. Based on these findings and our constructivist guidelines, we developed VRBridge, an interaction authoring tool.
VRBridge provides a simple event-action interface for creating interactions using trigger-condition-action triads, or Triggersets. We conducted two experimental evaluations during the design of VRBridge to test the effectiveness of our design aids and the basic tool. The first tested the effectiveness of the Triggersets and of additional representations: a Floorplan, a Sequence Diagram and Timelines. We used observation, interviews and task success to evaluate how effectively end-users could analyse and debug interactions created with VRBridge. We found that the Triggersets were effective and usable by novices for analysing an interaction design, and that the representations significantly improved end-user work and experience. The second experiment was large-scale (124 participants) and conducted over two weeks. Participants worked on authoring tasks which embodied typical interactions and complexities in the domain. We used a task exploration metric, questionnaires and computer logging to evaluate aspects of task performance: how effectively end-users could create interactions with VRBridge; how effectively they worked in the domain of VR authoring; how much enjoyment or satisfaction they experienced during the process; and how well they learned over time. This experiment tested the entire system, including the effects of the scaffolding and representations. We found that all users were able to complete authoring tasks with VRBridge after very little experience with the system and domain; that all users improved and felt more satisfaction over time; that users given representations or scaffolding as a design aid completed the tasks more expertly, explored more effectively, felt more satisfaction and learned better than those without design aids; that users given representations explored more effectively and felt more satisfaction than those given scaffolding; and that users given both design aids learned better, but did not otherwise improve, over users given a single design aid.
We also gained evidence about how the scaffolding, representations and basic tool were used during the evaluation. The contributions of this thesis are: an effective and efficient theory-based design method; a case study in the use of constructivism to structure a design process and deliver effective tools; a proof-of-concept prototype with which novices can create interactions in VR without traditional programming; evidence about the problems that novices face when designing interactions and dealing with unfamiliar programming concepts; empirical evidence about the relative effectiveness of additional representations and scaffolding as support for designing interactions; guidelines for supporting end-user authoring in general; and guidelines for the design of effective interaction authoring systems in general.
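The trigger-condition-action (Triggerset) model described above can be sketched in a few lines of code. The class and method names below are illustrative assumptions, not VRBridge's actual API (VRBridge is a graphical tool, not a Python library):

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Triggerset:
    """A trigger-condition-action triad, in the spirit of VRBridge's Triggersets."""
    trigger: str                           # event name, e.g. "player_touches_door"
    condition: Callable[[dict], bool]      # predicate over current world state
    action: Callable[[dict], None]         # effect applied to the world state

@dataclass
class InteractionEngine:
    triggersets: list = field(default_factory=list)

    def fire(self, event: str, state: dict) -> None:
        # Run the action of every Triggerset whose trigger matches the
        # event and whose condition holds in the current state.
        for ts in self.triggersets:
            if ts.trigger == event and ts.condition(state):
                ts.action(state)
```

For example, "the door opens when the player touches it while holding the key" becomes a single Triggerset with trigger `"player_touches_door"`, condition `has_key`, and an action that sets `door_open`; this is the kind of interaction a non-programmer would assemble from menus rather than write as code.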
3

Acceleration of the noise suppression component of the DUCHAMP source-finder.

Badenhorst, Scott 01 January 2015 (has links)
The next generation of radio interferometer arrays - the proposed Square Kilometre Array (SKA) and its precursor instruments, the Karoo Array Telescope (MeerKAT) and the Australian Square Kilometre Array Pathfinder (ASKAP) - will produce radio observation survey data orders of magnitude larger than current volumes. The sheer size of the imaged data necessitates fully automated solutions that accurately locate radio sources, which are (for the most part) partially hidden within inherently noisy radio observations, and produce useful scientific data for them (source extraction). Automated extraction solutions exist, but they are computationally expensive and do not yet scale to the performance required to process large data sets in practical time-frames. The DUCHAMP software package is one of the most accurate source extraction packages for general source finding (where the source shape is unknown). DUCHAMP's accuracy is primarily facilitated by the à trous wavelet reconstruction algorithm, a multi-scale smoothing algorithm which suppresses erratic observation noise. This algorithm is the most computationally expensive and memory-intensive component of DUCHAMP, so improvements to it greatly improve overall DUCHAMP performance. We present a high-performance, multithreaded implementation of the à trous algorithm with a focus on `desktop' computing hardware, to enable ordinary researchers to run their own accelerated searches. Our solution consists of three main areas of improvement: single-core optimisation, multi-core parallelism, and efficient out-of-core computation of large data sets with memory management libraries. Efficient out-of-core computation (data partially stored on disk when primary memory resources are exceeded) accommodates `desktop' computing's limited fast memory by mitigating the performance bottleneck associated with frequent secondary storage access.
Although this work focuses on `desktop' hardware, most of the improvements developed are general enough to be used within other high performance computing models. Single-core optimisations improved algorithm accuracy by reducing rounding error and achieved a 4× serial performance increase which scales with the filter size used during reconstruction. Multithreading on a quad-core CPU further increased the performance of the filtering operations within reconstruction to 22× (performance scaling approximately linearly with increased CPU cores) and achieved a 13× performance increase overall. All evaluated out-of-core memory management libraries performed poorly with parallelism. Single-threaded memory management partially mitigated the slow disk access bottleneck and achieved a 3.6× increase (uniform across all tested large data sets) for filtering operations and a 1.5× increase overall. Faster secondary storage solutions such as Solid State Drives or RAID arrays are required to process large survey data on `desktop' hardware in practical time-frames.
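The à trous ("with holes") idea at the heart of DUCHAMP's noise suppression can be sketched in one dimension: at each scale the smoothing kernel's taps are spaced further apart, and the differences between successive smoothings are the wavelet coefficients, which are thresholded to suppress noise. This is a simplified illustrative sketch under assumed details (1-D data, a B3-spline kernel, a fixed hard threshold); DUCHAMP itself works on 3-D data cubes with a noise-derived threshold:

```python
import numpy as np

# B3-spline kernel commonly used in the a trous transform (sums to 1)
H = np.array([1, 4, 6, 4, 1], dtype=float) / 16.0

def atrous_smooth(signal, scale):
    """One a trous smoothing pass: kernel taps spaced 2**scale apart.

    Spacing the taps ("holes") widens the support at each scale
    without resampling the signal. Edges are handled by replication.
    """
    step = 2 ** scale
    n = len(signal)
    out = np.zeros(n)
    for k, h in enumerate(H):
        offset = (k - 2) * step
        idx = np.clip(np.arange(n) + offset, 0, n - 1)
        out += h * signal[idx]
    return out

def atrous_reconstruct(signal, n_scales, threshold):
    """Denoise by keeping only significant wavelet coefficients per scale."""
    smooth = np.asarray(signal, dtype=float)
    recon = np.zeros_like(smooth)
    for j in range(n_scales):
        smoother = atrous_smooth(smooth, j)
        w = smooth - smoother                       # wavelet coefficients at scale j
        recon += np.where(np.abs(w) > threshold, w, 0.0)
        smooth = smoother
    return recon + smooth                           # add back the final smooth residual
```

Each pass is a convolution over the whole array, which is why the abstract reports that the filtering operations dominate the cost and benefit most from single-core optimisation and multithreading.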
