71

Harnessing open sound control for networked music in VST systems

Hector, Jason January 2009 (has links)
Includes abstract. / Includes bibliographical references (leaves 144-147). / Professional audio equipment is migrating towards general-purpose computers running professional audio software systems such as Virtual Studio Technology (VST) and VST plugins, which have been adopted as an informal standard for audio and MIDI plugins. The proliferation of computer networks has made it easier to share and distribute musical content over networked technology: it allows multiple, simultaneous, remote access to audio resources; it allows processing and computer hardware resources to be distributed amongst many computers, and therefore has potential for VST processing farms; and it facilitates location-independent musical collaboration. Open Sound Control (OSC) is an open, high-level application protocol developed to facilitate modern networked communications amongst audio processing units such as VST plugins, and it has been suggested as a successor to MIDI that addresses the shortcomings of MIDI's design. This dissertation presents a prototype VST plugin called OscVstBridge that bridges the network-isolated VST plugin and VST host to the OSC network domain. The prototype facilitates modern, high-speed, networked musical control communications amongst VST audio processes. Because OSC is an open protocol, OscVstBridge can communicate between VST systems from different vendors, furthering the interoperability of audio and musical systems and promoting standardisation.
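To make the protocol concrete, the sketch below hand-encodes a single OSC 1.0 message and sends it over UDP. It is illustrative only and is not taken from OscVstBridge; the address /oscvstbridge/param/cutoff, the port 9000 and the single float argument are assumptions for the example.

```python
# Illustrative sketch only: a minimal encoder for a single OSC 1.0 message,
# showing the wire format a bridge such as OscVstBridge would emit over UDP.
# The address "/oscvstbridge/param/cutoff" and port 9000 are hypothetical.
import socket
import struct

def osc_string(s: str) -> bytes:
    """Encode an OSC-string: ASCII bytes, NUL-terminated, padded to a multiple of 4."""
    data = s.encode("ascii") + b"\x00"
    return data + b"\x00" * (-len(data) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    """Build an OSC message whose arguments are all 32-bit big-endian floats."""
    type_tags = "," + "f" * len(floats)
    payload = b"".join(struct.pack(">f", value) for value in floats)
    return osc_string(address) + osc_string(type_tags) + payload

if __name__ == "__main__":
    packet = osc_message("/oscvstbridge/param/cutoff", 440.0)
    # OSC is transport-independent; UDP is the most common carrier.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(packet, ("127.0.0.1", 9000))
```

The same byte layout (padded address pattern, type-tag string, big-endian arguments) is what any OSC-aware host or plugin would parse on receipt.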
72

Force-extension of the Amylose Polysaccharide

van den Berg, Rudolf January 2009 (has links)
Myasthenia gravis (MG) is an autoimmune disorder in which auto-antibodies directed at the acetylcholine receptors (AChR) of the neuromuscular junction (NMJ) block, alter or destroy their targets. The anti-AChR antibodies cause activation of the classical complement pathway leading to inflammatory injury at the NMJ. Decay Accelerating Factor (DAF), a member of complement regulatory proteins, prevents activation of autologous components of complement pathways. The absence of DAF, in knock-out mouse models, has been shown to significantly increase the susceptibility to experimental autoimmune MG. A previous study showed that a high proportion of South African MG patients of African genetic ancestry develop immunosuppressive therapy-resistant extraocular muscle (EOM) dysfunction. We hypothesized that these patients have deficient DAF expression in their EOMs resulting in less protection from complement injury. Sequence analysis of relevant regions of the DAF gene revealed a single nucleotide polymorphism (SNP), c.-198C>G, in the promoter region in MG patients of African genetic ancestry with severe EOM MG involvement (MG n=101; Control n= 132; Odds ratio= 6.6; p=0.009). DAF-luciferase reporter assays, using 3 different cell lines (COS-7, HT1080 and C2C12) revealed that the c.-198C>G SNP (Mut-DAF) led to an increase in DAF promoter activity (
73

A lossy, dictionary-based method for short message service (SMS) text compression

Martin, Wickus January 2009 (has links)
Short message service (SMS) message compression allows either more content to be fitted into a single message or fewer individual messages to be sent as part of a concatenated (or long) message. Although SMS messages are essentially plain text, many of the more popular compression methods do not bring about a large reduction in size for short messages. The Global System for Mobile Communications (GSM) specification suggests untrained Huffman encoding as the only required compression scheme for SMS messaging, yet support for SMS compression is still not widely available on current handsets. This research shows that Huffman encoding might actually increase the size of very short messages and only modestly reduce the size of longer messages. While Huffman encoding yields better results for larger text sizes, handset users do not usually write very large messages consisting of thousands of characters. Instead, an alternative compression method called lossy dictionary-based (LD-based) compression is proposed here. In this method, the coder uses a dictionary tuned to the most frequently used English words and encodes white space economically. The encoding is lossy in that the original case is not preserved; the output is all lower case, a loss that might be acceptable to most users. The LD-based method has been shown to outperform Huffman encoding for the text sizes typically used when writing SMS messages, reducing the size of even very short messages and, for instance, cutting a long message down from five parts to two. Keywords: SMS, text compression, lossy compression, dictionary compression
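As an illustration of the general idea (and not the dissertation's actual coder), the toy sketch below maps a handful of frequent lower-cased words to one-byte codes and escapes everything else as literal bytes; the dictionary, code assignments and escape scheme are invented for the example.

```python
# Toy sketch of lossy dictionary-based text compression, in the spirit of the
# LD-based coder described above. The tiny dictionary and escape scheme are
# invented; a real coder would tune the dictionary to SMS word frequencies.
FREQUENT_WORDS = ["the", "you", "and", "are", "for", "meet", "tomorrow", "at"]
WORD_TO_CODE = {w: bytes([i + 1]) for i, w in enumerate(FREQUENT_WORDS)}
CODE_TO_WORD = {i + 1: w for i, w in enumerate(FREQUENT_WORDS)}

def compress(text: str) -> bytes:
    out = bytearray()
    for word in text.lower().split():            # lossy: case and extra spaces dropped
        if word in WORD_TO_CODE:
            out += WORD_TO_CODE[word]            # one byte per frequent word
        else:
            raw = word.encode("utf-8")           # literal words assumed < 256 bytes
            out += b"\x00" + bytes([len(raw)]) + raw
    return bytes(out)

def decompress(data: bytes) -> str:
    words, i = [], 0
    while i < len(data):
        if data[i] == 0:                          # escaped literal word
            n = data[i + 1]
            words.append(data[i + 2:i + 2 + n].decode("utf-8"))
            i += 2 + n
        else:
            words.append(CODE_TO_WORD[data[i]])
            i += 1
    return " ".join(words)

msg = "Meet you tomorrow at the cafe"
packed = compress(msg)
print(len(msg.encode()), "->", len(packed), "bytes:", decompress(packed))
```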
74

An application of brain-based education principles with ICT as a cognitive tool: a case study of grade 6 decimal instruction at Sunlands Primary School

Le Roux, Zelda Joy January 2015 (has links)
The majority of South African learners currently do not learn effectively and struggle with low academic achievement. This can be attributed to various factors such as frequent changes in the curriculum, under-qualified educators, ineffective teaching methods and barriers to learning in today's classrooms. Learners need extra support, including cognitive support, but in reality the heavy workload of educators may prevent them from giving learners this support; where support is given, it is often minimal or not effective enough. Computer technologies may afford both educators and learners such opportunities when used as a cognitive tool in activities that provide the needed support. This research is concerned with the use of computer technology as a cognitive tool to activate learners' cognitive processes, thus enhancing learning, based on brain-based education principles. The objective is to lay a foundation for using computer technologies as cognitive tools in educators' teaching practice and instructional design, making teaching and learning more effective, interactive and grounded in the real world, giving meaning to what is learnt and enhancing understanding.
75

Individual document management techniques: an explorative study

Sello, Mpho January 2007 (has links)
Includes bibliographical references (leaves 65-68). / Individuals are generating, storing and accessing more information than ever before. The information comes from a variety of sources such as the World Wide Web, email and books. Storage media is becoming larger and cheaper. This makes accumulation of information easy. When information is kept in large volumes, retrieving it becomes a problem unless there is a system in place for managing this. This study examined the techniques that users have devised to make retrieval of their documents easy and timely. A survey of user document management techniques was done through interviews. The uncovered techniques were then used to build an expert system that provides assistance with document management decision-making. The system provides recommendations on file naming and organization, document backup and archiving as well as suitable storage media. The system poses a series of questions to the user and offers recommendations on the basis of the responses given. The system was evaluated by two categories of users: those who had been interviewed during data collection and those who had not been interviewed. Both categories of users found the recommendations made by the system to be reasonable and indicated that the system was easy to use. Some users thought the system could be of great benefit to people new to computers.
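A hedged sketch of the question-and-answer style of rule-based recommendation described above is shown below; the questions, rules and recommendation text are invented for illustration and are not drawn from the study's knowledge base.

```python
# Toy consultation loop: ask a series of yes/no questions and build a list of
# recommendations from the answers, loosely modelling the expert-system
# behaviour described in the abstract. All rules here are hypothetical.
RULES = [
    {
        "question": "Do you work with more than a few hundred documents? (y/n) ",
        "if_yes": "Organise documents into a topic-based folder hierarchy with dated file names.",
        "if_no": "A single folder with descriptive file names may be enough.",
    },
    {
        "question": "Would losing your documents be costly? (y/n) ",
        "if_yes": "Schedule regular backups to external media or a second machine.",
        "if_no": "An occasional manual copy of important files may suffice.",
    },
]

def run_consultation() -> list[str]:
    recommendations = []
    for rule in RULES:
        answer = input(rule["question"]).strip().lower()
        recommendations.append(rule["if_yes"] if answer.startswith("y") else rule["if_no"])
    return recommendations

if __name__ == "__main__":
    for line in run_consultation():
        print("-", line)
```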
76

A parallel multidimensional weighted histogram analysis method

Potgieter, Andrew January 2014 (has links)
Includes bibliographical references. / The Weighted Histogram Analysis Method (WHAM) is a technique used to calculate free energy from molecular simulation data. WHAM recombines biased distributions of samples from multiple umbrella sampling simulations to yield an estimate of the global unbiased distribution. The WHAM algorithm iterates two coupled, non-linear equations until convergence at an acceptable level of accuracy. The equations have quadratic time complexity for a single reaction coordinate, but this increases exponentially with the number of reaction coordinates under investigation, which makes multidimensional WHAM a computationally expensive procedure. There is potential to use general-purpose graphics processing units (GPGPUs) to accelerate the execution of the algorithm. Here we develop and evaluate a multidimensional GPGPU WHAM implementation to investigate the potential speed-up attained over its CPU counterpart. In addition, to avoid the cost of multiple molecular dynamics simulations and to validate the implementations, we develop a test system that generates samples analogous to umbrella sampling simulations. We observe a maximum, problem-size-dependent speed-up of approximately 19× for the GPGPU-optimized WHAM implementation over our single-threaded, CPU-optimized version. We find that the WHAM algorithm is amenable to GPU acceleration, which provides the means to study ever more complex molecular systems in reduced time.
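For reference, the sketch below (not the dissertation's implementation) spells out the two coupled WHAM equations for a single reaction coordinate as a plain NumPy iteration; the histogram, bias matrix and kT value are assumed inputs.

```python
# Hedged sketch: the standard 1-D WHAM self-consistent iteration in NumPy.
# Assumed inputs: hist[i, k] counts samples from umbrella window i in bin k,
# bias[i, k] = V_i(xi_k) is the bias energy at bin centre xi_k, and kT is the
# thermal energy (about 2.494 kJ/mol near 300 K).
import numpy as np

def wham_1d(hist, bias, kT=2.494, tol=1e-7, max_iter=10_000):
    n_win, n_bins = hist.shape
    counts_k = hist.sum(axis=0)          # total samples landing in each bin
    N_i = hist.sum(axis=1)               # samples per umbrella window
    f = np.zeros(n_win)                  # window free-energy constants f_i
    beta = 1.0 / kT
    for _ in range(max_iter):
        # Equation 1: unbiased probability P(xi_k) from all biased histograms.
        denom = (N_i[:, None] * np.exp(beta * (f[:, None] - bias))).sum(axis=0)
        p = counts_k / denom
        # Equation 2: update each f_i from the current estimate of P.
        f_new = -kT * np.log((p[None, :] * np.exp(-beta * bias)).sum(axis=1))
        f_new -= f_new[0]                # fix the arbitrary additive constant
        if np.max(np.abs(f_new - f)) < tol:
            f = f_new
            break
        f = f_new
    p /= p.sum()                         # normalise the unbiased distribution
    free_energy = -kT * np.log(np.clip(p, 1e-300, None))
    return free_energy - free_energy.min(), f
```

In higher dimensions the bin index k runs over a grid of reaction coordinates, which is exactly the growth in work that motivates the GPGPU implementation.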
77

Implementing the Reliable Data Protocol on a Java-enabled mobile device

Walsh, Justin January 2003 (has links)
Includes bibliographical references. / The author examines the architecture of the Reliable Data Protocol and discusses its implementation, and the problems encountered during implementation, on a Java-enabled mobile device. A preliminary evaluation of the protocol implementation is presented.
78

Incompatibility of lognormal forward-Libor and Swap market models

Goschen, Wayne S January 2005 (has links)
Includes bibliographical references (p. 156-160). / The lognormal forward-Libor and Swap market models were formulated to price caps and swaptions. However, the prices computed by these two models, under equivalent measures, are reported to be unequal. This study investigates this incompatibility by computing the prices of caps and swaptions under both the forward Libor measure and the forward Swap measure, in both the Libor and Swap market models. This was done by building a computer program that implements the Monte Carlo versions of the models, using data from caps and swaptions traded in the South African market. It was found that the actual prices of caps, using the same implied volatility, were simulated accurately in both the Libor and Swap market models under both the forward Libor measure and the forward Swap measure. On the other hand, although the actual swaption prices were also simulated accurately in both the Libor and Swap market models under both the forward Libor measure and the forward Swap measure, a different implied volatility was used for each model. Therefore, the swaption price computed by the Libor market model was inconsistent with the price generated by the Swap market model; the two models are indeed incompatible. In order to price interest rate derivatives consistently, either the Libor market model or the Swap market model must be chosen. Since the Libor market model priced consistently under the two forward measures, and the time taken to simulate a price in the Libor market model was much less than in the Swap market model, the market practice of using the Libor market model in favour of the Swap market model is justified.
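As a point of reference for the cap side of the comparison (and not drawn from the dissertation's code), the sketch below implements Black's caplet formula, the closed form that lognormal forward-Libor dynamics reproduce; the numerical parameters are illustrative and are not South African market data.

```python
# Hedged sketch: Black's caplet formula, per unit notional, against which
# Monte Carlo cap prices from a lognormal forward-Libor model can be checked.
from math import log, sqrt
from statistics import NormalDist

def black_caplet(forward, strike, sigma, fixing_time, accrual, discount_factor):
    """Caplet price on a forward Libor rate under lognormal (Black) dynamics.

    forward         -- forward Libor F(0; T, T + accrual)
    strike          -- caplet strike K
    sigma           -- Black implied volatility of the forward rate
    fixing_time     -- time T at which the Libor rate fixes (years)
    accrual         -- accrual fraction delta of the period [T, T + delta]
    discount_factor -- P(0, T + delta), discount bond to the payment date
    """
    n = NormalDist().cdf
    d1 = (log(forward / strike) + 0.5 * sigma**2 * fixing_time) / (sigma * sqrt(fixing_time))
    d2 = d1 - sigma * sqrt(fixing_time)
    return accrual * discount_factor * (forward * n(d1) - strike * n(d2))

# Illustrative parameters only (not market data from the study):
print(black_caplet(forward=0.08, strike=0.075, sigma=0.20,
                   fixing_time=1.0, accrual=0.25, discount_factor=0.92))
```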
79

Enhancing colour-coded poll sheets using computer vision as a viable Audience Response System (ARS) in Africa

Muchaneta, Irikidzai Zorodzai January 2018 (has links)
Audience Response Systems (ARS) give a facilitator accurate feedback on a question posed to the listeners. The most common form of ARS is the clicker: a handheld response gadget that acts as a medium of communication between students and facilitator. Clickers are prohibitively expensive, creating a need for low-cost alternatives with high accuracy. This study builds on earlier research by Gain (2013), which aims to show that computer vision and coloured poll sheets can be an alternative to clicker-based ARS. The thesis examines a proposal to create an alternative to clickers applicable to the African context, where the main deterrent is cost, and studies the computer vision stages of feature detection, extraction and recognition. In this research project, an experimental study was conducted in various lecture theatres with audiences ranging from 50 to 150 students. Python and OpenCV tools were used to analyse the photographs, document the performance and observe the different conditions under which results were acquired. The research achieved an average detection rate of 75%, which points to a promising alternative audience response system as measured by time, cost and error rate; further work on the capture of the poll sheet would significantly improve this result. With regard to cost, the computer-vision coloured poll sheet alternative is significantly cheaper than clickers.
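To illustrate the kind of colour-based detection involved (not the thesis's actual pipeline), the sketch below thresholds a lecture-theatre photograph in HSV space with OpenCV and counts sheet-sized blobs for one answer colour; the HSV range, minimum area, file name and OpenCV 4 findContours signature are all assumptions for the example.

```python
# Hedged sketch: count votes for one answer colour by HSV thresholding and
# contour detection. Assumes OpenCV >= 4 (two-value findContours return); the
# colour range and minimum blob area would need calibration per venue.
import cv2
import numpy as np

def count_colour_votes(image_path, lower_hsv, upper_hsv, min_area=200):
    img = cv2.imread(image_path)
    if img is None:
        raise FileNotFoundError(image_path)
    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(lower_hsv), np.array(upper_hsv))
    # Remove speckle noise before looking for sheet-sized blobs.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    # Each sufficiently large blob is treated as one raised poll sheet.
    return sum(1 for c in contours if cv2.contourArea(c) >= min_area)

if __name__ == "__main__":
    # Example: a saturated green sheet, roughly hue 45-75 on OpenCV's 0-179 scale.
    votes = count_colour_votes("lecture_theatre.jpg", (45, 80, 80), (75, 255, 255))
    print("green votes:", votes)
```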
80

MIRMaid: an interface for a content-based Music Information Retrieval test-bed

Cloete, Candice Lynn January 2006 (has links)
Word processed copy. / Includes bibliographical references (leaves 74-77). / MIRMaid is an acronym for Music Information Retrieval Modular aid; it is an interface that allows different content-based retrieval tasks to be compared against one another to find optimal combinations of retrieval parameters for specialised problem domains. The dissertation describes the process by which the MIRMaid interface was developed, modified and refined.
