  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
161

Perceptions and practice of Gov2.0 in English local government

Barrance, Thomas Alexander January 2016 (has links)
Gov2.0 is an emerging and contested subject that offers a radical alternative to the construction of relationships between residents and their local authorities. This research investigates the practice of Gov2.0, and practitioners' perceptions of it, in English local authorities. It combines a content analysis of 50 principal local authority websites with a Q-methodology study that identifies the shared subjective frames of reference of 52 local government actors. The literature surrounding Gov2.0 is found to lack a clear theoretical model, so a model is presented as a basis for exploring the practice and common understanding of the subject. Levels of inconsistency in the adoption of Gov2.0 are identified that are not explained by political party control, geography or authority governance structure. The results of the Q-methodology examination of individual perspectives are discussed, and four frames of reference are proposed that provide a foundation for the variations of practice observed. This research offers a theoretical model for understanding Gov2.0; it identifies four distinct frames of reference held by practitioners regarding Gov2.0, and presents an analysis of the range of adoption practices within English local authorities.
162

Modelling metrical flux : an adaptive frequency neural network for expressive rhythmic perception and prediction

Elmsley, Andrew J. January 2017 (has links)
Beat induction is the perceptual and cognitive process by which humans listen to music and perceive a steady pulse. Computationally modelling beat induction is important for many Music Information Retrieval (MIR) methods and is in general an open problem, especially when processing expressive timing, e.g. tempo changes or rubato. A neuro-cognitive model has been proposed, the Gradient Frequency Neural Network (GFNN), which can model the perception of pulse and metre. GFNNs have been applied successfully to a range of ‘difficult’ music perception problems such as polyrhythms and syncopation. This thesis explores the use of GFNNs for expressive rhythm perception and modelling, addressing the current gap in knowledge about how to deal with varying tempo and expressive timing in automated and interactive music systems. The canonical oscillators contained in a GFNN have entrainment properties, allowing phase shifts and resulting in changes to the observed frequencies. This makes them good candidates for solving the expressive timing problem. It is found that modelling metrical perception with GFNNs can improve a machine-learning music model. However, it is also discovered that GFNNs perform poorly when dealing with tempo changes in the stimulus. Therefore, a novel Adaptive Frequency Neural Network (AFNN) is introduced, extending the GFNN with a Hebbian learning rule on the oscillator frequencies. Two new adaptive behaviours (attraction and elasticity) increase entrainment in the oscillators and increase the computational efficiency of the model by allowing a great reduction in the size of the network. The AFNN is evaluated over a series of experiments on sets of symbolic and audio rhythms, both from the literature and created specifically for this research. Where previous work with GFNNs has focused on frequency and amplitude responses, this thesis considers phase information as critical for pulse perception.
Evaluating the time-based output, it was found that AFNNs behave differently to GFNNs: responses to symbolic stimuli with both steady and varying pulses are significantly improved, and on audio data the AFNN's performance matches the GFNN's, despite its lower density. The thesis argues that AFNNs could replace the linear filtering methods commonly used in beat tracking and tempo estimation systems, and lead to more accurate methods.
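The attraction behaviour can be illustrated with a Righetti-style adaptive frequency oscillator, a published relative of the Hebbian frequency rule described above (the AFNN's exact attraction and elasticity rules are not given in the abstract, and the function name and all parameter values here are assumptions for illustration): the oscillator's phase couples to the stimulus, and its natural frequency drifts toward the frequency it entrains to.

```python
import numpy as np

def adapt_frequency(stimulus_freq=2 * np.pi, omega0=2 * np.pi * 1.1,
                    K=2.0, dt=0.001, T=60.0):
    """Phase oscillator with Hebbian frequency adaptation: the phase
    and the natural frequency are both nudged by the same
    stimulus-correlation term, so omega drifts toward the stimulus
    frequency and settles there."""
    steps = int(T / dt)
    phi, omega = 0.0, omega0
    history = np.empty(steps)
    for n in range(steps):
        F = np.sin(stimulus_freq * n * dt)  # the rhythmic stimulus
        coupling = -K * F * np.sin(phi)     # entrainment term
        phi += (omega + coupling) * dt      # phase update
        omega += coupling * dt              # Hebbian frequency update
        history[n] = omega
    # Average over the last second to smooth the within-cycle ripple.
    return history[-int(1.0 / dt):].mean()

learned = adapt_frequency()
```

Starting 10% above the stimulus frequency, `omega` settles onto it; this kind of adaptation is what allows an adaptive network to cover a tempo range with far fewer oscillators than a fixed-frequency bank.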
163

Portfolio of compositions

Luque Ancona, Sergio January 2012 (has links)
A portfolio of compositions for acoustic instruments, for electronic resources alone, and for acoustic instruments and live electronics. The accompanying commentary describes the aesthetic and the context of these works, their approach to form, and traces the development of the techniques used in their composition. In particular, it discusses a variety of approaches to computer-aided algorithmic composition and stochastic processes for the generation of musical elements (e.g. chord sequences, rhythmic patterns, sound structures). Included in the commentary is a description of research into stochastic synthesis, and of the development of a personal implementation of Dynamic Stochastic Synthesis and Stochastic Concatenation of Dynamic Stochastic Synthesis in SuperCollider.

LIST OF WORKS
Surveillance (2011) for computer, 15:00
Daisy (2011) for computer, 9:40
Absorbed (2010) for 2 violas, 9:00
My idea of fun (2010) for clarinet, percussion and viola, 7:00
Brazil (2009) for computer, 8:10
Spine (2008) for English horn, percussion, violin and double bass, 8:30
"Sex, Drugs and Rock 'n Roll" was never meant to be like this (2007) for computer, 9:40
Don't have any evidence (2007) for bass flute, English horn, bass clarinet, bassoon, percussion, piano, violin, viola, cello and double bass, 9:30
Happy Birthday (2006/2007) for computer, 8:00
Résistance (2006) for accordion, 4:00
My life has been filled with terrible misfortune; most of which never happened (2004) for bass clarinet, violin, viola, cello, double bass and live electronics, 9:00
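The stochastic-synthesis idea mentioned in the commentary can be sketched as follows (a minimal Xenakis-style illustration, not the author's SuperCollider implementation; the function names and all parameter values are assumed): one period of the waveform is a chain of breakpoints, and at every repetition each breakpoint's amplitude and segment duration take a bounded random walk, folded back at the barriers.

```python
import random

def reflect(x, lo, hi):
    # Fold a value back into [lo, hi] (an elastic barrier).
    while x < lo or x > hi:
        x = 2 * lo - x if x < lo else 2 * hi - x
    return x

def dynamic_stochastic_synthesis(n_points=12, n_periods=50,
                                 amp_step=0.1, dur_step=0.001,
                                 sr=8000, seed=1):
    """Minimal dynamic stochastic synthesis: linear interpolation
    between breakpoints whose amplitudes and durations random-walk
    from one period to the next."""
    rng = random.Random(seed)
    amps = [0.0] * n_points
    durs = [0.005] * n_points          # seconds per segment
    samples = []
    for _ in range(n_periods):
        # Random-walk each breakpoint, reflecting at the barriers.
        for i in range(n_points):
            amps[i] = reflect(amps[i] + rng.uniform(-amp_step, amp_step),
                              -1.0, 1.0)
            durs[i] = reflect(durs[i] + rng.uniform(-dur_step, dur_step),
                              0.002, 0.01)
        # Render one period by linear interpolation between breakpoints.
        for i in range(n_points):
            a0, a1 = amps[i], amps[(i + 1) % n_points]
            n = max(1, int(durs[i] * sr))
            samples.extend(a0 + (a1 - a0) * k / n for k in range(n))
    return samples

wave = dynamic_stochastic_synthesis()
```

Because both the waveform shape and the period length mutate continuously, the result drifts between pitched tone and noise, which is the characteristic sound of this family of techniques.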
164

Penalized regression methods with application to generalized linear models, generalized additive models, and smoothing

Utami Zuliana, Sri January 2017 (has links)
Recently, penalized regression has been used to deal with problems that arise in maximum likelihood estimation, such as correlated parameters and a large number of predictors. The main issue in this kind of regression is how to select the optimal model. In this thesis, Schall’s algorithm is proposed as an automatic method for selecting the weight of the penalty. The algorithm has two steps. First, the coefficient estimates are obtained with an arbitrary penalty weight. Second, an estimate of the penalty weight λ is calculated as the ratio of the variance of the error to the variance of the coefficients. The iteration returns to step one and continues until the estimate of the penalty weight converges. The computational cost is low because the optimal penalty weight can be obtained within a small number of iterations. In this thesis, Schall’s algorithm is investigated for ridge regression, lasso regression and two-dimensional histogram smoothing. The proposed algorithms are applied to real and simulated data sets. In addition, a new algorithm for lasso regression is proposed. The algorithm performed comparably well in all applications. Schall’s algorithm can be an efficient method for selecting the weight of the penalty.
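For ridge regression, the two-step iteration described above can be sketched as follows (a simplified illustration using raw variance ratios, whereas the full algorithm applies degrees-of-freedom corrections; the function name, data and parameter values are assumptions):

```python
import numpy as np

def schall_ridge(X, y, lam=1.0, tol=1e-6, max_iter=100):
    """Ridge regression with the penalty weight chosen by a
    Schall-style fixed-point iteration: step 1 estimates the
    coefficients for the current penalty, step 2 re-estimates the
    penalty as the ratio of residual variance to coefficient
    variance, and the loop repeats until the penalty converges."""
    n, p = X.shape
    for _ in range(max_iter):
        # Step 1: coefficient estimates under the current penalty.
        beta = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)
        # Step 2: update lambda from the two variance estimates.
        resid = y - X @ beta
        sigma2_error = resid @ resid / n
        sigma2_coef = beta @ beta / p
        lam_new = sigma2_error / sigma2_coef
        if abs(lam_new - lam) < tol:
            return beta, lam_new
        lam = lam_new
    return beta, lam

# Illustrative simulated data with known coefficients.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
true_beta = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ true_beta + rng.normal(scale=0.5, size=200)
beta, lam = schall_ridge(X, y)
```

Because each λ update is a cheap ratio of two scalars, the fixed point is typically reached in a handful of iterations, which is the low computational cost the abstract refers to.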
165

A rule-based approach for recognition of chemical structure diagrams

Sadawi, Noureddin January 2013 (has links)
In the chemical literature, much information is given in the form of diagrams depicting chemical structures. To access this information electronically, diagrams have to be recognised and translated into a processable format. Although a number of approaches have been proposed for the recognition of molecule diagrams in the literature, they traditionally employ procedural methods with limited flexibility and extensibility. This thesis presents a novel approach that models the principal recognition steps for molecule diagrams in a strictly rule-based system. We develop a framework that enables the definition of a set of rules for the recognition of different bond types and arrangements, as well as for resolving possible ambiguities. This allows us to view the diagram recognition problem as a process of rewriting an initial set of geometric artefacts into a graph representation of a chemical diagram, without the need to adhere to a rigid procedure. We demonstrate the flexibility of the approach by extending it to capture new bond types and compositions. In experimental evaluation we show that an implementation of our approach outperforms the currently available leading open source system. Finally, we discuss how our framework could be applied to other automatic diagram recognition tasks.
166

Speech recognition by computer : algorithms and architectures

Tyler, J. E. M. January 1988 (has links)
This work is concerned with the investigation of algorithms and architectures for computer recognition of human speech. Three speech recognition algorithms have been implemented, using (a) Walsh Analysis, (b) Fourier Analysis and (c) Linear Predictive Coding. The Fourier Analysis algorithm made use of the Prime-number Fourier Transform technique. The Linear Predictive Coding algorithm made use of Le Roux and Gueguen's method for calculating the coefficients. The system was organised so that the speech samples could be input to a PC/XT microcomputer in a typical office environment. The PC/XT was linked via Ethernet to a Sun 2/180s computer system, which allowed the data to be stored on a Winchester disk so that the data used for testing each algorithm was identical. The recognition algorithms were implemented entirely in Pascal, to allow evaluation to take place on several different machines. The effectiveness of the algorithms was tested with a group of five naive speakers, results being in the form of recognition scores. The results showed the superiority of the Linear Predictive Coding algorithm, which achieved a mean recognition score of 93.3%. The software was implemented on three different computer systems: an 8-bit microprocessor, a 16-bit microcomputer based on the IBM PC/XT, and a Motorola 68020 based Sun workstation. The effectiveness of the implementations was measured in terms of speed of execution of the recognition software. By limiting the vocabulary to ten words, it has been shown that it would be possible to achieve recognition of isolated utterances in real time using a single 68020 microprocessor. The definition of real time in this context is understood to mean that the recognition task will, on average, be completed within the duration of the utterance, for all the utterances in the recogniser's vocabulary.
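The LPC analysis step can be sketched with the standard Levinson-Durbin recursion (the thesis uses Le Roux and Gueguen's fixed-point-friendly variant; this floating-point sketch fits the same all-pole model, and the AR(2) test signal below is purely illustrative):

```python
import numpy as np

def lpc_coefficients(frame, order):
    """LPC coefficients via the Levinson-Durbin recursion: solve the
    autocorrelation normal equations order by order, returning the
    prediction polynomial a (a[0] = 1) and the residual energy."""
    n = len(frame)
    # Biased autocorrelation r[0..order] of the analysis frame.
    r = np.array([frame[:n - k] @ frame[k:] for k in range(order + 1)]) / n
    a = np.zeros(order + 1)
    a[0] = 1.0
    err = r[0]
    for i in range(1, order + 1):
        # Reflection coefficient for this order.
        acc = r[i] + a[1:i] @ r[1:i][::-1]
        k = -acc / err
        # Order update of the prediction polynomial.
        a[1:i] = a[1:i] + k * a[1:i][::-1]
        a[i] = k
        err *= (1.0 - k * k)
    return a, err

# Illustrative test signal: a stable AR(2) process whose true
# predictor coefficients are known (a1 = -0.75, a2 = 0.5).
rng = np.random.default_rng(0)
x = np.zeros(6000)
e = rng.normal(size=6000)
for t in range(2, 6000):
    x[t] = 0.75 * x[t - 1] - 0.5 * x[t - 2] + e[t]
a, err = lpc_coefficients(x[1000:], 2)
```

In a recogniser, a short vector of such coefficients per frame replaces the raw waveform, which is why LPC front ends were attractive on the modest hardware described above.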
A speech recogniser architecture is proposed which would achieve real time speech recognition without any limitation being placed upon (a) the order of the transform, and (b) the size of the recogniser's vocabulary. This is achieved by utilising a pipeline of four processors, with the pattern matching process performed in parallel on groups of words in the vocabulary.
167

An investigation into automation of fire field modelling techniques

Taylor, Stephen John January 1997 (has links)
The research described in this thesis has produced a prototype system based on fire field modelling techniques for use by members of the Fire Safety Engineering community who are not expert in modelling techniques. The system captures the qualitative reasoning of an experienced modeller in the assessment of room geometries in order to setup important initial parameters of the problem. The prototype system is based on artificial intelligence techniques, specifically expert system technology. It is implemented as a case based reasoning (CBR) system, primarily because it was discovered that the expert uses case based reasoning when manually dealing with such problems. The thesis answers three basic research questions. These are organised into a primary question and two subsidiary questions. The primary question is: how can CFD setup for fire modelling problems be automated? From this, the two subsidiary questions are concerned with how to represent the qualitative and quantitative knowledge associated with fire modelling; and selection of the most appropriate method of knowledge storage and retrieval. The thesis describes how knowledge has been acquired and represented for the system, pattern recognition issues, the methods of knowledge storage and retrieval chosen, the implementation of the prototype system and validation. Validation has shown that the system models the expert’s knowledge in a satisfactory way and that the system performs competently when faced with new problems. The thesis concludes with a section regarding new research questions arising from the research, and the further work these questions entail.
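The core retrieval step of a case-based reasoner can be sketched as nearest-neighbour matching over numeric case features (a minimal illustration: the feature names, cases and setup labels are invented, and the prototype's actual similarity measure for room geometries is not specified in the abstract):

```python
def retrieve(case_base, query):
    """Return the stored case whose feature vector is closest to the
    query (Euclidean distance) -- the 'retrieve' phase of the CBR cycle."""
    def distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(case_base, key=lambda case: distance(case["features"], query))

# Hypothetical room-geometry cases: (width, depth, height) in metres,
# each paired with the CFD setup an expert chose for it.
cases = [
    {"features": (4.0, 5.0, 2.4), "setup": "single fine mesh"},
    {"features": (20.0, 30.0, 8.0), "setup": "coarse mesh, refined near fire"},
    {"features": (8.0, 10.0, 3.0), "setup": "medium mesh"},
]
best = retrieve(cases, (7.0, 9.0, 2.8))
```

The retrieved case's setup parameters would then be adapted to the new geometry, mirroring how the expert reuses past problems when configuring a new fire model.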
168

The use of some non-minimal representations to improve the effectiveness of genetic algorithms

Robbins, Phil January 1995 (has links)
In the unitation representation used in genetic algorithms, the number of genotypes that map onto each phenotype varies greatly. This leads to an attractor in phenotype space which impairs the performance of the genetic algorithm. The attractor is illustrated theoretically and empirically. A new representation, called the length varying representation (LVR), allows unitation chromosomes of varying length (and hence with a variety of attractors) to coexist. Chromosomes whose lengths yield attractors close to optima come to dominate the population. The LVR is shown to be more effective than the unitation representation against a variety of fitness functions. However, the LVR preferentially converges towards the low end of phenotype space. The phenotype shift representation (PSR) is therefore defined; it retains the ability of the LVR to select for attractors that are close to optima, whilst using a fixed-length chromosome and thus avoiding the asymmetries inherent in the LVR. The PSR is more effective than the LVR, and the results compare favourably with previously published results from eight other algorithms. The internal operation of the PSR is investigated, and the PSR is extended to cover multi-dimensional problems. The premise that improvements in performance may be attained by inserting introns (non-coding sequences affecting linkage) into traditional bit-string chromosomes is investigated. In this investigation, which used a population size of 50, there was no evidence of improvement in performance. However, the position of the optima relative to the Hamming cliffs is shown to have a major effect on the performance of the genetic algorithm using the binary representation, and the inadequacy of the traditional crossover and mutation operators in this context is demonstrated. Also, disallowing duplicate population members was found to improve performance over the standard generational replacement strategy in all trials.
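The many-to-one mapping behind the attractor is easy to make concrete (a small illustration, not the thesis's experimental setup): for a chromosome of length L, the number of genotypes mapping onto unitation phenotype k (the number of 1-bits) is the binomial coefficient C(L, k), which peaks sharply at k = L/2.

```python
from math import comb

def genotypes_per_phenotype(length):
    """Count the genotypes (bit strings) that map onto each unitation
    phenotype k = number of 1-bits: exactly C(length, k) of them."""
    return [comb(length, k) for k in range(length + 1)]

counts = genotypes_per_phenotype(20)
# Under uniform random initialisation and unbiased mutation, the
# central phenotype dominates: C(20, 10) genotypes versus a single
# genotype at each extreme, which is the attractor in question.
```

Representations such as the LVR and PSR work by moving this peak, so that the region of phenotype space the search is drawn towards lies near the optima rather than at the fixed midpoint.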
169

Applying experientialism to HCI methods

Imaz, Manuel January 2001 (has links)
The aim of this thesis is to incorporate the results of Experientialism into the domain of Human-Computer Interaction. The purpose is twofold: on the one hand, it shows how some concepts of Experientialism, such as metaphor, image-schema, stories or conceptual integration, may be used to explain where some concepts of HCI come from. On the other hand, it uses the same conceptual background to support the design activity: the same concepts of Experientialism may be employed to build new conceptual artifacts in order to design user interfaces and application software in general. One of the most fruitful ideas Experientialism may offer is conceptual integration as the basis upon which to construct new design solutions. Notwithstanding the pervasive use of metaphor in everyday language and even in HCI texts, there is a considerable amount of criticism regarding the use of metaphor in designing user interfaces, based on the assumption that this practice may be the origin of trouble when using such software products. That is why one of the chapters is aimed at showing that not only metaphor but figurative language in general is pervasive in HCI. Figurative language is not only commonly employed but is one of the main tools for conceptualising the new ideas and concepts required in the activity of software development. The thesis proposes a framework for designing user interfaces based on the concepts of Experientialism. The proposal integrates two phases (analysis and design), in the same way as most software development methods do, drawing on the broad scope of cognitive processes such as image-schema, metaphor and conceptual integration. These general concepts may be well suited to building the conceptual models upon which user interfaces are elaborated, and the optimality principles proposed to study the suitability of conceptual integration may also be used as validity criteria to evaluate such design artifacts.
In order to validate this proposal, the thesis shows how to use the framework in two different situations: i) to explain why a problem such as the Mac trashcan, used to eject diskettes, is not a problem of using metaphors but an unfortunate design decision, and ii) to apply it in the design of a new user interface. Other concepts of Experientialism are proposed for capturing user requirements. The concept of story is the ground on which to build scenarios or use cases, as stories are a more general cognitive process and a form of telling things at a more general level. That is why user stories may be mapped to use cases: both are essentially different types of stories, and the capture of requirements is a way of specifying one type of story (use cases) based on the original stories (user stories).
170

A configuration approach for selecting a data warehouse architecture

Weir, Robert January 2008 (has links)
Living in the Information Age, organisations must be able to exploit their data alongside the traditional economic resources of man, machine and money. Accordingly, organisations implement data warehouses to organise and consolidate their data, which creates a decision support system that is “subject oriented”, “time variant”, “integrated” and “non-volatile”. However, the organisation's ability to successfully exploit their data is determined by the degree of strategic alignment. As such, this study poses the question: how can a data warehouse be successfully and demonstrably aligned to an organisation's strategic objectives? This thesis demonstrates that strategic alignment can be achieved by following a new "top down" data warehouse implementation framework, the Configuration Approach, which is based upon determining an organisation's target configuration. This was achieved by employing Miles and Snow's Ideal Types to formulate a questionnaire that reveals an organisation's target configuration in terms of its approach to the Entrepreneurial, Administration and Information Systems challenges. Crucially, this thesis also provides the means to choose a data warehouse architecture that is wholly based on the organisation's target configuration. The Configuration Approach was evaluated using a single case study undergoing a period of strategic transformation where the implementation of a data warehouse was key to its strategic ambitions. The case study illustrated how it is possible to articulate an organisation's strategic configuration, which becomes the key driver for building a warehouse that demonstrably supports the resolution of its Entrepreneurial and Administration challenges. Significantly, the case study also provides a unique opportunity to demonstrate how the target configuration helps organisations to make the right choice of data warehouse architecture to satisfy the Information Systems challenge. 
In this case, the Configuration Approach provides a basis for challenging the architectural choices made by a consultancy on behalf of the participating organisation. Accordingly, it can be asserted that data warehouses are strategic investments, if implemented using the Configuration Approach.
