About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.

Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
41

EFFICIENT CONSTRUCTION OF ACCURATE MULTIPLE ALIGNMENTS AND LARGE-SCALE PHYLOGENIES

Wheeler, Travis John January 2009
A central focus of computational biology is to organize and make use of vast stores of molecular sequence data. Two of the most studied and fundamental problems in the field are sequence alignment and phylogeny inference. The problem of multiple sequence alignment is to take a set of DNA, RNA, or protein sequences and identify related segments of these sequences. Perhaps the most common use of alignments of multiple sequences is as input for methods designed to infer a phylogeny, or tree describing the evolutionary history of the sequences. The two problems are circularly related: standard phylogeny inference methods take a multiple sequence alignment as input, while computation of a rudimentary phylogeny is a step in the standard multiple sequence alignment method.

Efficient computation of high-quality alignments and efficient inference of high-quality phylogenies based on those alignments are both open problems in the field of computational biology. The first part of the dissertation gives details of my efforts to identify a best-of-breed method for each stage of the standard form-and-polish heuristic for aligning multiple sequences; the result of these efforts is a tool, called Opal, that achieves state-of-the-art 84.7% accuracy on the BAliBASE alignment benchmark. The second part of the dissertation describes a new algorithm that dramatically increases the speed and scalability of a common method for phylogeny inference called neighbor-joining; this algorithm is implemented in a new tool, called NINJA, which is more than an order of magnitude faster than a very fast implementation of the canonical algorithm, for example building a tree on 218,000 sequences in under 6 days on a single-processor computer.
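For context, the canonical neighbor-joining method that NINJA accelerates repeatedly selects the pair of taxa minimising the Q-criterion over a distance matrix; scanning every pair at every joining step is what makes the textbook algorithm cubic in the number of sequences. The sketch below shows only that textbook selection step, with an invented function name and a toy distance matrix; it is not NINJA's algorithm.

```python
import numpy as np

def nj_pair_to_join(D):
    """Return the pair (i, j) minimising the neighbor-joining Q-criterion.

    D is a symmetric n x n distance matrix with zeros on the diagonal.
    Q[i, j] = (n - 2) * D[i, j] - r[i] - r[j], where r[i] is the sum of row i.
    The canonical algorithm recomputes Q over all pairs at each of the n - 2
    joining steps; avoiding that exhaustive scan is where any speedup must come from.
    """
    n = D.shape[0]
    r = D.sum(axis=1)
    Q = (n - 2) * D - r[:, None] - r[None, :]
    np.fill_diagonal(Q, np.inf)            # a taxon is never joined with itself
    i, j = np.unravel_index(np.argmin(Q), Q.shape)
    return int(i), int(j)

# Toy example: four taxa with an additive distance matrix.
D = np.array([[0.0,  5.0,  9.0,  9.0],
              [5.0,  0.0, 10.0, 10.0],
              [9.0, 10.0,  0.0,  8.0],
              [9.0, 10.0,  8.0,  0.0]])
print(nj_pair_to_join(D))                  # (0, 1): these two taxa are joined first
```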
42

Consistency Maintenance for Multiplayer Video Games

Fletcher, Robert D. S. 16 January 2008
Multiplayer games have to support activities with differing usability requirements. The usability of the system is directly influenced by the choice of consistency maintenance algorithm. These algorithms must accommodate usability requirements while ensuring that shared data is accurately replicated. We demonstrate that consistency maintenance in games can be organized around the AMP properties, which state that separate nodes can maintain their instances of shared data using different algorithms (asymmetry), that multiple consistency maintenance algorithms can be used within an application (multiplicity), and that consistency maintenance algorithms should be created as modular components (plug-replaceability). The motivation for AMP is outlined with a review of examples from commercial 3D games and related research. Consistency maintenance algorithms are shown to exist in a usability trade-off space. A set of usability metrics is introduced and used to experimentally explore this space. Our results imply that no single algorithm is suitable for every in-game situation. The thesis concludes with an informal evaluation of AMP based on our experience using Fiaa.NET as an AMP framework. We found that AMP had several weaknesses, but that these were outweighed by the benefits for the developer. / Thesis (Master, Computing) -- Queen's University, 2008-01-14
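To make the AMP properties concrete, the following minimal sketch shows what a plug-replaceable consistency-maintenance interface might look like; the class and method names are invented for illustration and do not reflect Fiaa.NET's actual API.

```python
from abc import ABC, abstractmethod

class ConsistencyMaintainer(ABC):
    """Hypothetical plug-replaceable interface for one replicated game object.

    Plug-replaceability: algorithms share this interface and can be swapped.
    Multiplicity: different objects in one game may bind different algorithms.
    Asymmetry: different nodes may bind different algorithms to the same object.
    """

    @abstractmethod
    def apply_update(self, state, t):
        """Record an update to the shared state received at time t."""

    @abstractmethod
    def current_estimate(self, t):
        """Return this node's estimate of the shared state at time t."""

class LastWriterWins(ConsistencyMaintainer):
    """Tight consistency: the estimate is exactly the last received state."""
    def __init__(self):
        self.state = None
    def apply_update(self, state, t):
        self.state = state
    def current_estimate(self, t):
        return self.state

class DeadReckoning(ConsistencyMaintainer):
    """Loose consistency: extrapolate a position from the last known velocity."""
    def __init__(self):
        self.pos, self.vel, self.t0 = 0.0, 0.0, 0.0
    def apply_update(self, state, t):
        self.pos, self.vel, self.t0 = state[0], state[1], t
    def current_estimate(self, t):
        return self.pos + self.vel * (t - self.t0)

# Multiplicity in action: exact replication for health, approximation for motion.
replicas = {"health": LastWriterWins(), "position": DeadReckoning()}
replicas["position"].apply_update((10.0, 2.5), t=0.0)
print(replicas["position"].current_estimate(t=0.4))   # 11.0
```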
43

The ability to generate or inhibit responses after frontal lobectomy

Miller, Laurie Ann January 1987
The ability to generate different responses, and the ability to inhibit inappropriate behaviour, were explored in patients with unilateral cerebral excisions. Site-of-lesion effects were found to interact with the sex of the subject, the time of test-administration, and the nature of the response criteria. In Part I, the Thurstone Word Fluency Test revealed impairments two weeks postoperatively in patients with frontal, temporal, or central-area lesions. In men, removals from the left cerebral hemisphere caused greater deficits than removals from the right, but only left central-area excisions resulted in long-lasting impairments. Patients with left frontal-lobe removals produced few words on a sentence-completion fluency task, but on visual-image fluency, no patient-group was impaired. In Part II, an inability to inhibit impulsive actions on risk-taking tasks was seen after frontal lobectomy, as was a tendency to disregard the instructions on a word-fluency task. These results are consistent with the fact that patients with frontal-lobe lesions described themselves on a behavioural-trait questionnaire as less flexible and more impulsive than did control subjects.
44

Learning matrix and functional models in high-dimensions

Balasubramanian, Krishnakumar 27 August 2014
Statistical machine learning methods provide us with a principled framework for extracting meaningful information from noisy high-dimensional data sets. A significant feature of such procedures is that the inferences made are statistically significant, computationally efficient and scientifically meaningful. In this thesis we make several contributions to such statistical procedures. Our contributions are two-fold. We first address prediction and estimation problems in non-standard situations. We show that even when given no access to labeled samples, one can still consistently estimate the error rate of predictors and train predictors with respect to a given (convex) loss function. We next propose an efficient procedure for predicting with large output spaces that scales logarithmically in the dimensionality of the output space. We further propose an asymptotically optimal procedure for sparse multi-task learning when the tasks share a joint support. We show consistency of the proposed method and derive rates of convergence. We next address the problem of learning meaningful representations of data. We propose a method for learning sparse representations that takes into account the structure of the data space and demonstrate how it enables one to obtain meaningful features. We establish sample complexity results for the proposed approach. We then propose a model-free feature selection procedure and establish its sure-screening property in the high-dimensional regime. Furthermore, we show that with a slight modification, the approach previously proposed for sparse multi-task learning enables one to obtain sparse representations for multiple related tasks simultaneously.
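The abstract does not spell out the estimator used for joint-support multi-task learning; one standard formulation of that setting penalises the rows of the coefficient matrix with the l2,1 norm, so a feature is kept or discarded for all tasks at once. The sketch below shows the corresponding proximal-gradient step and is an illustration of that general technique, not necessarily the procedure proposed in the thesis.

```python
import numpy as np

def row_group_soft_threshold(W, lam):
    """Proximal operator of lam * sum_j ||W[j, :]||_2 (the l2,1 norm).

    W has one row per feature and one column per task.  Whole rows are shrunk
    toward zero together, so all tasks end up selecting the same features,
    i.e. the tasks share a joint support.
    """
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    scale = np.maximum(1.0 - lam / np.maximum(norms, 1e-12), 0.0)
    return W * scale

def prox_gradient_step(W, X, Y, lam, step):
    """One proximal-gradient step for 0.5 * ||Y - X W||_F^2 + lam * ||W||_{2,1}."""
    grad = X.T @ (X @ W - Y)
    return row_group_soft_threshold(W - step * grad, step * lam)
```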
45

Conditional discrimination acquisition in young children: are the facilitative effects of naming due to stimulus discrimination?

Stull, Anne K. January 2007 (PDF)
Thesis (M.A.)--University of North Carolina Wilmington, 2007. / Includes bibliographical references (leaves: 129-131)
46

An investigation into the experiences of managers who work flexibly

Anderson, Deirdre January 2008
This thesis explores the experiences of managers who work flexibly. Flexible working policies are prevalent in all organizations in the UK because of the legislation giving specific groups of parents and carers the right to request flexible working. Many organizations extend the policies to all employees, yet the take-up is not as high as expected, particularly among staff at managerial levels. This thesis explores how managers construe and experience flexible working arrangements while successfully fulfilling their roles as managers of people. The exploratory study consisted of interviews with eight managers with unique flexible working patterns. Analysis of the interview transcripts identified concepts of consistency and adaptability. Consistency refers to meeting fixed needs from the work and non-work domains, and adaptability refers to the adjustment of schedules to meet the changing demands from those domains. The concepts of consistency and adaptability were further explored in the main study which is based on interviews with 24 women and 10 men who held managerial positions and had a flexible working arrangement which reduced their face time in the workplace. The research offers three main contributions to the literature. At a theoretical level, I propose a model which demonstrates how individuals use consistency and adaptability to meet the fixed and changing demands from the work and non-work domains. This model extends understanding of the complexity of the segmentation/integration continuum of boundary theory, explaining how and why managers use flexible working arrangements as a means of managing boundaries and achieving desired goals in both domains. Four distinct clusters emerged among the managerial participants in terms of the type and direction of adaptability, indicating the range of strategies used by managers to ensure the success of their flexible working arrangements. A detailed description of managers’ flexible working experiences is provided, adding to what is known about the role of manager through the exploration of the enactment of that role when working flexibly.
47

Gentzenův důkaz bezespornosti aritmetiky / Gentzen's Consistency Proof

Horská, Anna January 2011
This thesis contains a detailed description of two consistency proofs, which state that no contradiction can be derived in the system called Peano arithmetic. The proofs were first published in 1936 and 1938 by the German mathematician Gerhard Gentzen. For the purposes of this thesis, the proofs were read and studied in the original articles, "Die Widerspruchsfreiheit der reinen Zahlentheorie" and "Neue Fassung des Widerspruchsfreiheitsbeweises für die reine Zahlentheorie". The first of the two proofs is interesting from a historical point of view: Gentzen used a sequent calculus for natural deduction together with ordinal numbers in an unusual notation of his own invention. The second proof is similar to the proof that is nowadays commonly presented as the consistency proof for Peano arithmetic.
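As background not stated in the abstract itself: both proofs reduce the consistency of Peano arithmetic to transfinite induction up to the ordinal

```latex
\varepsilon_0 \;=\; \sup\{\omega,\ \omega^{\omega},\ \omega^{\omega^{\omega}},\ \dots\},
\qquad \text{the least ordinal } \varepsilon \text{ such that } \omega^{\varepsilon} = \varepsilon .
```

By Gödel's second incompleteness theorem, induction up to ε₀ is not provable in Peano arithmetic itself, although induction up to any fixed ordinal below ε₀ is; this is why Gentzen's proofs do not conflict with Gödel's theorem.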
48

Ensuring performance and correctness for legacy parallel programs

McPherson, Andrew John January 2015
Modern computers are based on manycore architectures, with multiple processors on a single silicon chip. In this environment programmers are required to make use of parallelism to fully exploit the available cores. This can either be within a single chip, normally using shared-memory programming or at a larger scale on a cluster of chips, normally using message-passing. Legacy programs written using either paradigm face issues when run on modern manycore architectures. In message-passing the problem is performance related, with clusters based on manycores introducing necessarily tiered topologies that unaware programs may not fully exploit. In shared-memory it is a correctness problem, with modern systems employing more relaxed memory consistency models, on which legacy programs were not designed to operate. Solutions to this correctness problem exist, but introduce a performance problem as they are necessarily conservative. This thesis focuses on addressing these problems, largely through compile-time analysis and transformation. The first technique proposed is a method for statically determining the communication graph of an MPI program. This is then used to optimise process placement in a cluster of CMPs. Using the 64-process versions of the NAS parallel benchmarks, we see an average of 28% (7%) improvement in communication localisation over by-rank scheduling for 8-core (12-core) CMP-based clusters, representing the maximum possible improvement. Secondly, we move into the shared-memory paradigm, identifying and proving necessary conditions for a read to be an acquire. This can be used to improve solutions in several application areas, two of which we then explore. We apply our acquire signatures to the problem of fence placement for legacy well-synchronised programs. We find that applying our signatures, we can reduce the number of fences placed by an average of 62%, leading to a speedup of up to 2.64x over an existing practical technique. Finally, we develop a dynamic synchronisation detection tool known as SyncDetect. This proof of concept tool leverages our acquire signatures to more accurately detect ad hoc synchronisations in running programs and provides the programmer with a report of their locations in the source code. The tool aims to assist programmers with the notoriously difficult problem of parallel debugging and in manually porting legacy programs to more modern (relaxed) memory consistency models.
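As an illustration of the first contribution (communication-aware process placement), the sketch below greedily packs heavily-communicating MPI ranks onto the same CMP node and reports how much traffic stays on-node. It is a toy stand-in for the idea, not the static analysis or placement technique developed in the thesis, and the function names are invented.

```python
import numpy as np

def greedy_placement(comm, cores_per_node):
    """Group MPI ranks onto nodes so that heavily-communicating ranks share a node.

    comm[i, j] is the (symmetric) communication volume between ranks i and j.
    """
    unplaced = set(range(comm.shape[0]))
    nodes = []
    while unplaced:
        # Seed a new node with the unplaced rank that communicates the most overall.
        seed = max(unplaced, key=lambda r: comm[r].sum())
        node = [seed]
        unplaced.remove(seed)
        while len(node) < cores_per_node and unplaced:
            # Add the rank with the heaviest traffic to the ranks already on this node.
            best = max(unplaced, key=lambda r: comm[r, node].sum())
            node.append(best)
            unplaced.remove(best)
        nodes.append(node)
    return nodes

def intra_node_fraction(comm, nodes):
    """Fraction of the total communication volume that stays within a node."""
    total = comm.sum()
    local = sum(comm[np.ix_(node, node)].sum() for node in nodes)
    return local / total if total else 0.0
```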
49

Consistency of Cognitions in Remarriage: A Test of the Consistency Tenet of the Multidimensional Cognitive-Developmental Model

Campbell, JaNae Elise 01 May 2009
Remarriages have been increasing over the last several decades, yet little has been done in establishing theories and interventions specific to remarried couples and stepfamilies. Fine and Kurdek proposed a model specific to individuals in a remarriage situation. In an effort to validate their model, this study tested a key tenet, the tenet of consistency in cognitions, across spouses. Data were analyzed from the "Relationship Quality and Stability in Utah Newlywed Remarriages" study. With a sample of 449 couples, a series of correlations and backward regressions were completed. The results indicate that individual perceptions are more predictive of remarital quality than is consistency of cognitions. A critique of the Multidimensional Cognitive-Developmental Model is discussed. Limitations are addressed and recommendations for future research are given.
50

G+: A Constraint-Based System for Geometric Modeling

Lawrence, Joseph Britto 03 August 2002
Most commercial CAD systems do not offer sufficient support for the design activity because they cannot understand the functional requirements of the design product; the user is responsible for maintaining those requirements across the different design phases. By incorporating constraint programming concepts, these CAD systems would evolve into systems that maintain the functional requirements throughout the design process and perform analysis and simulation of geometric models. CAD systems incorporating constraint programming concepts would reduce design time, avoid human fatigue and error, and maintain consistency of the geometric constraints imposed on the model. The G+ system addresses these issues by introducing a constraint-based system for geometric modeling using object-oriented methods. G+ is designed so that available specialized algorithms can be used to handle non-linear problems by both iterative and non-iterative schemes.
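To illustrate the kind of computation a constraint-based geometric modeler performs, the sketch below solves a tiny non-linear constraint system (a 2-D point at prescribed distances from two anchors) by Gauss-Newton iteration. The formulation and names are illustrative assumptions; they are not G+'s data structures or algorithms.

```python
import numpy as np

def solve_distance_constraints(p0, anchors, dists, iters=50, tol=1e-10):
    """Place a 2-D point so its distance to each anchor matches a target.

    Residuals: ||p - a_k||^2 - d_k^2 = 0 for each anchor a_k; each Gauss-Newton
    step solves the linearised system J * delta = residuals and updates p.
    """
    p = np.asarray(p0, dtype=float)
    anchors = np.asarray(anchors, dtype=float)
    dists = np.asarray(dists, dtype=float)
    for _ in range(iters):
        diff = p - anchors                      # shape (k, 2)
        res = (diff ** 2).sum(axis=1) - dists ** 2
        if np.abs(res).max() < tol:
            break
        J = 2.0 * diff                          # Jacobian of the residuals
        p = p - np.linalg.lstsq(J, res, rcond=None)[0]
    return p

# A point constrained to lie 5 units from (0, 0) and 5 units from (6, 0);
# from this starting guess the iteration converges to (3.0, 4.0).
print(solve_distance_constraints([1.0, 1.0], [[0, 0], [6, 0]], [5, 5]))
```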
