201

On-demand metadata extraction network (OMEN)

McEnnis, Daniel. January 2006
OMEN (On-demand Metadata Extraction Network) addresses a fundamental problem in Music Information Retrieval: the lack of universal access to a large dataset containing significant amounts of copyrighted music. This thesis proposes a solution that draws on the large collections of digitized music already held by many libraries. Using OMEN, libraries can perform on-demand feature extraction on site, returning feature values to researchers instead of providing direct access to the recordings themselves. This avoids copyright difficulties, since the underlying music never leaves the library that owns it. The analysis is performed using grid-style computation on library machines that are otherwise under-used (e.g., machines devoted to patron web and catalogue use).
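The core mechanism lends itself to a short sketch. The following is a minimal illustration in Python/NumPy of the idea described above, not OMEN's actual implementation: features are computed where the audio is held, and only the resulting numbers are ever returned to the researcher. The feature set (RMS energy, zero-crossing rate, spectral centroid) is chosen purely for illustration.

import numpy as np

def extract_features(samples: np.ndarray, sample_rate: int) -> dict:
    """Compute a few simple numeric features from raw audio samples."""
    rms = float(np.sqrt(np.mean(samples ** 2)))                     # overall energy
    crossings = int(np.sum(np.abs(np.diff(np.sign(samples))) > 0))  # sign changes
    zcr = crossings * sample_rate / len(samples)                    # crossings per second
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    centroid = float(np.sum(freqs * spectrum) / np.sum(spectrum))   # spectral centroid (Hz)
    # Only these scalar values would be returned to the researcher;
    # the samples themselves stay on the library's machine.
    return {"rms": rms, "zero_crossing_rate": zcr, "spectral_centroid_hz": centroid}

if __name__ == "__main__":
    sr = 44100
    t = np.arange(sr) / sr
    demo = np.sin(2 * np.pi * 440.0 * t)   # stand-in for a locally held recording
    print(extract_features(demo, sr))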
202

Using statistical and knowledge-based approaches for literature-based discovery

Yildiz, Meliha Yetisgen. January 2007
Thesis (Ph. D.)--University of Washington, 2007. / Vita. Includes bibliographical references (leaves 97-103).
203

Citation analysis and emerging technologies: foundational technical developments in optical disk technology

Wrenn, Thomas T. January 1989
Thesis (Ed. D.)--West Virginia University, 1989. / Vita. Includes bibliographical references (leaves 101-106).
204

Evaluation of the development and impact of clinical information systems

Ho, Lai-ming. January 1998
Thesis (Ph. D.)--University of Hong Kong, 1998. / Includes bibliographical references (leaves 347-365).
205

On-demand metadata extraction network (OMEN)

McEnnis, Daniel. January 2006
No description available.
206

Order adaptive file organizations to be used in connection with serial scan

Elrod, Thomas Howard. January 1960
Call number: LD2668 .T4 1960 E47
207

Design and development of a microcomputer controlled automatic storage and retrieval model

Sethia, Pradeep. January 1984
Call number: LD2668 .T4 1984 S47 / Master of Science
208

Compile-time optimisation of store usage in lazy functional programs

Hamilton, Geoffrey William. January 1993
Functional languages offer a number of advantages over their imperative counterparts. However, a substantial amount of the time spent on processing functional programs is due to the large amount of storage management which must be performed. Two apparent reasons for this are that the programmer is prevented from including explicit storage management operations in programs which have a purely functional semantics, and that more readable programs are often far from optimal in their use of storage. Correspondingly, two alternative approaches to the optimisation of store usage at compile-time are presented in this thesis.

The first approach is called compile-time garbage collection. This approach involves determining at compile-time which cells are no longer required for the evaluation of a program, and making these cells available for further use. This overcomes the problem of a programmer not being able to indicate explicitly that a store cell can be made available for further use. Three different methods for performing compile-time garbage collection are presented in this thesis: compile-time garbage marking, explicit deallocation and destructive allocation. Of these three methods, it is found that destructive allocation is the only method which is of practical use.

The second approach to the optimisation of store usage is called compile-time garbage avoidance. This approach involves transforming programs into semantically equivalent programs which produce less garbage at compile-time. This attempts to overcome the problem of more readable programs being far from optimal in their use of storage. In this thesis, it is shown how to guarantee that the process of compile-time garbage avoidance will terminate.

Both of the described approaches to the optimisation of store usage make use of the information obtained by usage counting analysis. This involves counting the number of times each value in a program is used. In this thesis, a reference semantics is defined against which the correctness of usage counting analyses can be proved. A usage counting analysis is then defined and proved to be correct with respect to this reference semantics. The information obtained by this analysis is used to annotate programs for compile-time garbage collection, and to guide the transformation when compile-time garbage avoidance is performed.

It is found that compile-time garbage avoidance produces greater increases in efficiency than compile-time garbage collection, but much of the garbage which can be collected by compile-time garbage collection cannot be avoided at compile-time. The two approaches are therefore complementary, and the expressions resulting from compile-time garbage avoidance transformations can be annotated for compile-time garbage collection to further optimise the use of storage.
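The flavour of compile-time garbage avoidance can be suggested with an informal analogy. The thesis itself concerns automatic, compiler-performed transformations of lazy functional programs; the Python fragment below is only a hand-written stand-in showing the before/after shape: the readable version allocates an intermediate list that immediately becomes garbage, while the transformed version computes the same result without ever building that structure.

def sum_of_squares_readable(xs: list[float]) -> float:
    squares = [x * x for x in xs]   # intermediate list: allocated, summed, then discarded
    return sum(squares)

def sum_of_squares_avoiding(xs: list[float]) -> float:
    total = 0.0
    for x in xs:                    # same result, no intermediate structure allocated
        total += x * x
    return total

assert sum_of_squares_readable([1.0, 2.0, 3.0]) == sum_of_squares_avoiding([1.0, 2.0, 3.0])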
209

PLEXPLAN: An integrated intelligent environment for information systems planning (artificial intelligence)

McIntyre, Scott Campbell. January 1986
PLEXPLAN, an automated software system and methodology for Information Systems (IS) planning, is described. The PLEXPLAN methodology and tools are used to integrate information from multiple and dynamic perspectives on the organization. A conceptual and technical software architecture is described which permits storage, integration and manipulation of such information. The PLEXPLAN methodology is consistent with the PLEXPLAN software architecture inasmuch as specific techniques in the methodology are represented in the automated environment as tools. A case is presented where PLEXPLAN was used to address an Information Systems planning task. Use of the PLEXPLAN methodology and tools is specifically described as they relate to the case.
210

Metamodels for describing the structure and interaction of layered software systems.

Vincent, Stephen George. January 1988
This research identifies a current and future need in the realm of information systems development which has surfaced as a result of layered architectures and software reuse. An analysis methodology is developed, based upon two three-dimensional metamodels corresponding to the two principal aspects of system architecture: structure and communication. Each metamodel can be viewed as having three planes which represent increasing abstraction away from actual source code. For example, with regard to the structure metamodel, the lowest plane corresponds to actual source code structures written in a specific computer language, the middle plane represents the general form of the structure available in that language, and the top plane represents the general form of structures available in any language. An object-oriented viewpoint was adopted in order to allow the expression of the relationships between entities found on a single plane of a metamodel, as well as the relationships between entities found on different planes. The metamodels provide a framework and methodology for discerning the structure and communication mechanisms employed in software source code, as well as a framework within which behavioral models can be developed.
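The plane structure can be pictured with a small sketch. The following Python fragment is an assumption-laden illustration of the structure metamodel's three planes rather than the dissertation's own notation: each entity records the plane it lives on, its same-plane relationships, and a link to the more abstract entity it instantiates on the plane above.

from dataclasses import dataclass, field
from enum import Enum
from typing import Optional

class Plane(Enum):
    LANGUAGE_INDEPENDENT = 3   # top: structures available in any language
    LANGUAGE_SPECIFIC = 2      # middle: structures of a particular language
    SOURCE_CODE = 1            # bottom: structures in actual source code

@dataclass
class StructureEntity:
    name: str
    plane: Plane
    abstraction_of: Optional["StructureEntity"] = None                 # link to the plane above
    related_to: list["StructureEntity"] = field(default_factory=list)  # same-plane relationships

# Illustrative instances: a concrete C function is an instance of the
# language-specific "C function" structure, which in turn abstracts to the
# language-independent notion of a subroutine.
subroutine = StructureEntity("subroutine", Plane.LANGUAGE_INDEPENDENT)
c_function = StructureEntity("C function", Plane.LANGUAGE_SPECIFIC, abstraction_of=subroutine)
parse_line = StructureEntity("parse_line()", Plane.SOURCE_CODE, abstraction_of=c_function)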
