131

A study of data

Tiao, Hsao-Ying Jennifer January 2010 (has links)
Typescript (photocopy). / Digitized by Kansas Correctional Industries
132

Avoiding data inconsistency problems in the conceptual design of data bases: a semantic approach

Leasure, David Elden. January 1984 (has links)
Call number: LD2668 .T4 1984 L42 / Master of Science
133

Georeferenced Point Clouds: A Survey of Features and Point Cloud Management

Otepka, Johannes, Ghuffar, Sajid, Waldhauser, Christoph, Hochreiter, Ronald, Pfeifer, Norbert 25 October 2013 (has links) (PDF)
This paper presents a survey of georeferenced point clouds. The focus is, on the one hand, on features that originate in the measurement process itself and on features derived by processing the point cloud. On the other hand, approaches for processing georeferenced point clouds are reviewed, including data structures as well as spatial processing concepts. We suggest a categorization of features into levels that reflect the amount of processing. Point clouds are found across many disciplines, which is reflected in the versatility of the literature suggesting specific features. (authors' abstract)
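To make the notion of a derived (processed) feature concrete, the sketch below computes one common neighbourhood feature, local planarity from covariance eigenvalues, over a k-d tree. The feature definition, the choice of k = 16 neighbours, and all names are illustrative assumptions, not code from the paper.

```python
# Minimal sketch: a neighbourhood-based point cloud feature (local planarity).
# Assumptions (not from the paper): k = 16 neighbours, planarity defined as
# (e2 - e3) / e1 for covariance eigenvalues e1 >= e2 >= e3.
import numpy as np
from scipy.spatial import cKDTree

def local_planarity(points: np.ndarray, k: int = 16) -> np.ndarray:
    """Return one planarity value per point of an (N, 3) cloud."""
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)          # k nearest neighbours per point
    feats = np.empty(len(points))
    for i, neigh in enumerate(idx):
        cov = np.cov(points[neigh].T)         # 3x3 neighbourhood covariance
        e3, e2, e1 = np.linalg.eigvalsh(cov)  # eigenvalues, ascending order
        feats[i] = (e2 - e3) / e1 if e1 > 0 else 0.0
    return feats

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    cloud = rng.random((1000, 3))             # toy cloud in the unit cube
    print(local_planarity(cloud)[:5])
```

In the paper's terms, such a value would sit at a higher feature level than the raw coordinates or sensor attributes, since it requires neighbourhood queries and per-point processing.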
134

Tracing the compositional process : sound art that rewrites its own past : formation, praxis and a computer framework

Rutz, Hanns Holger January 2014 (has links)
The domain of this thesis is electroacoustic computer-based music and sound art. It investigates a facet of composition which is often neglected or ill-defined: the process of composing itself and its embedding in time. Previous research mostly focused on instrumental composition or, when electronic music was included, the computer was treated as a tool which would eventually be subtracted from the equation. The aim was either to explain a resultant piece of music by reconstructing the intention of the composer, or to explain human creativity by building a model of the mind. Our aim instead is to understand composition as an irreducible unfolding of material traces which takes place in its own temporality. This understanding is formalised as a software framework that traces creation time as a version graph of transactions. The instantiation and manipulation of any musical structure implemented within this framework is thereby automatically stored in a database. Not only can it be queried ex post by an external researcher—providing a new quality for the empirical analysis of the activity of composing—but it is an integral part of the composition environment. Therefore it can recursively become a source for the ongoing composition and introduce new ways of aesthetic expression. The framework aims to unify creation and performance time, fixed and generative composition, human and algorithmic “writing”, a writing that includes indeterminate elements which condense as concurrent vertices in the version graph. The second major contribution is a critical epistemological discourse on the question of observability and the function of observation. Our goal is to explore a new direction of artistic research which is characterised by a mixed methodology of theoretical writing, technological development and artistic practice. The form of the thesis is an exercise in becoming process-like itself, wherein the epistemic thing is generated by translating the gaps between these three levels. This is my idea of the new aesthetics: That through the operation of a re-entry one may establish a sort of process “form”, yielding works which go beyond a categorical either “sound-in-itself” or “conceptualism”. Exemplary processes are revealed by deconstructing a series of existing pieces, as well as through the successful application of the new framework in the creation of new pieces.
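The version graph of transactions described in the abstract can be pictured with a minimal sketch. The Python structure below is a hypothetical reading of the idea (vertices are committed transactions, concurrent edits branch from a common parent, and the history stays queryable ex post); it is not the thesis's actual framework, whose names and storage layer are not given here.

```python
# Minimal sketch of a version graph of transactions (illustrative only;
# the framework described in the abstract persists this in a database).
from dataclasses import dataclass
from itertools import count

_ids = count()

@dataclass(frozen=True)
class Version:
    """One committed transaction. More than one parent means a merge;
    several children of one parent are concurrent vertices."""
    id: int
    parents: tuple   # ids of parent versions
    edits: tuple     # opaque description of the changes

class VersionGraph:
    def __init__(self):
        self.versions = {}

    def commit(self, parents, edits):
        v = Version(next(_ids), tuple(parents), tuple(edits))
        self.versions[v.id] = v
        return v.id

    def history(self, vid):
        """Query ex post: all ancestors of a version, by id."""
        seen, stack = set(), [vid]
        while stack:
            v = self.versions[stack.pop()]
            if v.id not in seen:
                seen.add(v.id)
                stack.extend(v.parents)
        return sorted(seen)

g = VersionGraph()
root = g.commit([], ["create score"])
a = g.commit([root], ["add layer 1"])   # two concurrent vertices ...
b = g.commit([root], ["add layer 2"])   # ... branching from one parent
merged = g.commit([a, b], ["merge layers"])
print(g.history(merged))                # [0, 1, 2, 3]
```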
135

Enhanced font services for X Window system

Tsang, Pong-fan, Dex, 曾邦勳 January 2000 (has links)
published_or_final_version / Computer Science and Information Systems / Master / Master of Philosophy
136

Models and algorithms for medium-term production planning in an uncertain environment

Lenoir, Arnaud 14 November 2008 (has links) (PDF)
In this thesis we are interested in optimization problems for large-scale systems under uncertainty, and more specifically in solving their deterministic equivalents with proximal-type decomposition methods. The underlying application we have in mind is the optimal management of EDF's electricity production, which is subject to weather, market and consumption uncertainty. We lay out the natural space-time-uncertainty couplings tied to this application and propose two new discretization schemes for the coupling of the uncertainties, based on non-parametric estimation of conditional expectations, which constitute alternatives to the construction of scenario trees. We then turn to decomposition methods, working with a general model: the minimization of a sum of two convex functions, the first separable and the second coupling. On the one hand, this simplified model frees us from the technicalities attached to a particular choice of coupling and subsystem; on the other hand, the convexity assumption makes it possible to draw on the theory of monotone operators and on the identification of proximal methods as fixed-point algorithms. We emphasize the differential properties of the generalized reflection operators whose fixed point is sought, which make it possible to bound the speed of convergence. We then study two families of decomposition-coordination algorithms arising from operator-splitting methods, namely Forward-Backward and Rachford-type methods. We suggest some techniques for accelerating the convergence of Rachford-type methods. To that end, we first analyze the method from a theoretical point of view, providing explanations for certain numerical observations, before proposing improvements in response. Among them, an automatic update of the scaling factor corrects a possibly poor initial choice; the convergence proof of this technique is made easier by stability results, established beforehand, for certain internal composition laws with respect to graphical convergence. We also put forward the idea of introducing "jumps" into the method when it is applied to polyhedral problems, basing our arguments on the geometry formed by the sequence of iterates. Finally, we show that, by adding a control mechanism, one can avoid solving every subproblem at each iteration while preserving global convergence. The practical value of these suggestions is confirmed by numerical tests on the electricity production management problem.
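As a concrete instance of the Rachford-type splitting studied in the thesis, the sketch below runs the Douglas-Rachford iteration on a toy problem of the stated form, the minimization of a sum of two convex functions. The box constraint, the quadratic term, and the scaling factor gamma are assumptions for illustration, not the EDF production model.

```python
# Minimal sketch of Douglas-Rachford splitting on min f(x) + g(x), with
# f = indicator of the box [0, 1]^n (prox = projection) and
# g(x) = 0.5 * ||x - b||^2 (prox in closed form). Toy data and the
# scaling factor gamma are arbitrary assumptions.
import numpy as np

def prox_f(v):                        # projection onto [0, 1]^n
    return np.clip(v, 0.0, 1.0)

def prox_g(v, b, gamma):              # prox of gamma * 0.5 * ||x - b||^2
    return (v + gamma * b) / (1.0 + gamma)

def douglas_rachford(b, gamma=1.0, iters=200):
    z = np.zeros_like(b)
    for _ in range(iters):
        x = prox_f(z)                          # first resolvent
        y = prox_g(2.0 * x - z, b, gamma)      # resolvent at the reflection
        z = z + y - x                          # fixed-point update
    return prox_f(z)                           # recover the solution estimate

b = np.array([-0.5, 0.3, 1.7])
print(douglas_rachford(b))            # expected: [0.0, 0.3, 1.0], i.e. b clipped
```

The automatic update of gamma and the "jumps" proposed in the thesis would modify exactly this fixed-point loop; the plain version above is only the baseline iteration.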
137

Automata, enumeration and algorithms

Bassino, Frédérique 06 December 2005 (has links) (PDF)
This work falls within the general framework of automata theory, combinatorics on words, enumerative combinatorics and algorithmics. Its common threads are the treatment of automata and regular languages and of enumeration problems, and the presentation of constructive results, often given explicitly in the form of algorithms. The domains from which the problems addressed arise are fairly varied. The text consists of three parts, devoted to prefix codes, to certain lexicographic sequences, and to the enumeration of automata.
138

Navigation in large graphs

Hanusse, Nicolas 26 November 2009 (has links) (PDF)
The guiding idea of this work is to show that a good number of queries can be expressed as navigation in graphs.
139

DEVELOPMENT AND TESTING OF DATA STRUCTURES FOR THE CPM/MRP METHODOLOGY.

Ardalan, Alireza January 1983 (has links)
A major purpose of this dissertation is to design and develop data structures for the Critical Path Method-Material Requirements Planning (CPM/MRP) methodology. The data structures developed consider the trade-off between the processing time needed to perform operations on them and the computer memory needed to store the data. The CPM/MRP technique was designed to combine the capabilities of the critical path method and the material requirements planning system. The critical path method is a project planning and control technique which schedules projects subject to technological sequence constraints and activity durations. When combined with material requirements planning, the methodology explicitly considers both the resources required by the activities comprising the project and the lead time to acquire those resources. CPM/MRP contains algorithms for project scheduling subject to technological sequence and resource constraints. The early start and late start algorithms find feasible early start and late start schedules for both activity start times and resource order release times. The major drawback of the FORTRAN IV computer program which incorporated the CPM/MRP algorithms was its tremendous memory requirements, which prohibited applying CPM/MRP to large projects. The data structures developed in this dissertation are efficient with respect to both memory utilization and processing time. To design the data structures, the characteristics of storable and non-storable resources and the necessary operations within each resource category are studied. Another purpose of this dissertation is to develop an algorithm to schedule operating rooms for surgical procedures in hospitals, subject to resource constraints, in order to increase operating suite utilization. Since the major reason for low operating suite utilization is the lack of required resources when and where they are needed, the CPM/MRP concept is applied to schedule surgeries. The late start algorithm outlined in this dissertation schedules surgeries and the resources required for each surgery. The data structures and the surgery scheduling algorithm are incorporated into a FORTRAN IV computer program. The program has been tested with actual data gathered from a hospital. The results met the objectives of both low memory utilization and low computation time.
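The early-start and late-start computations mentioned here reduce, in their unconstrained core, to a forward and a backward pass over the precedence graph. The sketch below illustrates that core on assumed toy activities; it ignores the resource and lead-time constraints that CPM/MRP adds, and its structures are not the dissertation's FORTRAN IV ones.

```python
# Minimal sketch of CPM early-start / late-start scheduling.
# Activities, durations and precedence constraints are assumed toy data.
from graphlib import TopologicalSorter

durations = {"A": 3, "B": 2, "C": 4, "D": 1}
preds = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}

order = list(TopologicalSorter(preds).static_order())

# Forward pass: earliest start = latest finish among predecessors.
early = {}
for a in order:
    early[a] = max((early[p] + durations[p] for p in preds[a]), default=0)

project_end = max(early[a] + durations[a] for a in order)

# Backward pass: latest start that delays no successor.
succs = {a: [b for b in preds if a in preds[b]] for a in preds}
late = {}
for a in reversed(order):
    late[a] = min((late[s] for s in succs[a]), default=project_end) - durations[a]

for a in order:
    print(a, "early:", early[a], "late:", late[a], "slack:", late[a] - early[a])
```

Activities with zero slack (here A, C and D) form the critical path; in the CPM/MRP setting, resource order release times would be scheduled backward from these start times by the acquisition lead times.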
140

Applied logic : its use and implementation as a programming tool

Warren, David H. D. January 1978 (has links)
The first Part of the thesis explains from first principles the concept of "logic programming" and its practical application in the programming language Prolog. Prolog is a simple but powerful language which encourages rapid, error-free programming and clear, readable, concise programs. The basic computational mechanism is a pattern matching process ("unification") operating on general record structures ("terms" of logic). The ideas are illustrated by describing in detail one sizable Prolog program which implements a simple compiler. The advantages and practicability of using Prolog for "real" compiler implementation are discussed. The second Part of the thesis describes techniques for implementing Prolog efficiently. In particular it is shown how to compile the patterns involved in the matching process into instructions of a low-level language. This idea has actually been implemented in a compiler (written in Prolog) from Prolog to DECsystem-10 assembly language. However the principles involved are explained more abstractly in terms of a "Prolog Machine". The code generated is comparable in speed with that produced by existing DEC10 Lisp compilers. Comparison is possible since pure Lisp can be viewed as a (rather restricted) subset of Prolog. It is argued that structured data objects, such as lists and trees, can be manipulated by pattern matching using a "structure sharing" representation as efficiently as by conventional selector and constructor functions operating on linked records in "heap" storage. Moreover the pattern matching formulation actually helps the implementor to produce a better implementation.
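The unification mechanism at the heart of this computational model is easy to sketch. The term encoding and helper names below are illustrative assumptions, not Warren's DECsystem-10 implementation: variables are strings starting with an uppercase letter, and compound terms are tuples of the form (functor, arg1, ..., argN).

```python
# Minimal sketch of unification on logic terms (illustrative only).

def is_var(t):
    return isinstance(t, str) and t[:1].isupper()

def walk(t, subst):
    """Follow variable bindings to the representative term."""
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def unify(a, b, subst=None):
    """Return an extended substitution, or None on failure."""
    subst = dict(subst or {})
    a, b = walk(a, subst), walk(b, subst)
    if a == b:
        return subst
    if is_var(a):
        subst[a] = b
        return subst
    if is_var(b):
        subst[b] = a
        return subst
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):
            subst = unify(x, y, subst)
            if subst is None:
                return None
        return subst
    return None

# Matching the clause head append(nil, Ys, Ys) against the
# goal append(nil, cons(1, cons(2, nil)), Zs):
print(unify(("append", "nil", "Ys", "Ys"),
            ("append", "nil", ("cons", 1, ("cons", 2, "nil")), "Zs")))
# Both Ys and Zs end up bound to ('cons', 1, ('cons', 2, 'nil')).
```

Compiling Prolog, in the sense of the second Part, amounts to specializing this generic matching loop into straight-line low-level instructions for each clause head.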
