About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.

Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
261

A Web-based Database For Experimental Structural Engineering Research

Turer, Oncel Umut 01 May 2010 (has links) (PDF)
This study presents a web-based database application for storing experimental data and related documents in a single location and sharing them among researchers and engineers from all over the world. The database application, accessible from a website, was developed for experimental structural engineering research and initially tested at the Structural and Earthquake Engineering Laboratory of the Civil Engineering Department, METU. The application is composed of two parts. The first part is the database, which stores information about projects, specimens, experiments, experimental data, documentation, site members, and member groups on the server side. The second part is the website, which provides a functional user interface for easy use of the application and makes it accessible from anywhere via the Internet. After the database and the website were developed, the two parts were connected, and application functionality was put in place that enables users to create, modify, search, and delete projects, specimens, and experiments; upload and download documentation and experimental data; plot test data; and share their opinions. The ASP.NET framework and the C# programming language were used for the web application development. The functionality and usability of the database application were then tested by uploading and sharing various experimental results.
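As an illustration only: the entity layout the abstract describes (projects containing specimens, specimens containing experiments with data files and documents, and site members with access) could be sketched as below. The actual system was built with ASP.NET and C#; the Python classes and field names here are assumptions for illustration, not the thesis's schema.

```python
# Hypothetical sketch of the kind of entity layout the abstract describes.
from dataclasses import dataclass, field

@dataclass
class Experiment:
    name: str
    data_files: list[str] = field(default_factory=list)   # uploaded test data
    documents: list[str] = field(default_factory=list)    # related documentation

@dataclass
class Specimen:
    label: str
    experiments: list[Experiment] = field(default_factory=list)

@dataclass
class Project:
    title: str
    members: list[str] = field(default_factory=list)      # site members with access
    specimens: list[Specimen] = field(default_factory=list)

# Toy usage: one project with one specimen and one experiment.
proj = Project("Shear wall tests", members=["researcher@example.edu"])
proj.specimens.append(Specimen("SW-1", [Experiment("cyclic loading")]))
```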
262

Data management on distributed databases

Wah, Benjamin W. January 1900 (has links)
"Revision of the author's thesis, University of California, Berkeley, 1979." / Includes bibliographical references (p. [273]-281) and index.
263

Evaluation of potential DSS tool for BDF-HQ manpower and operational equipment resource planning

Alhamdan, Ali M. January 2003 (has links) (PDF)
Thesis (M.S. in Information Technology Management)--Naval Postgraduate School, June 2003. / Thesis advisor(s): Daniel R. Dolk, Glenn Cook. Includes bibliographical references (p. 113-114). Also available online.
264

The design and development of a computer-assisted training management system for the chemical industry (Afrikaans: Die ontwerp en ontwikkeling van 'n rekenaargesteunde opleidingsbestuurstelsel vir die chemiese industrie)

Botha, Johannes Lodewikus. January 2001 (has links)
Thesis (Ph. D. (Educational Management and Policy Studies))--University of Pretoria, 2001. / Includes bibliographical references.
265

Processing and management of uncertain information in vague databases

Lu, An. January 2009 (has links)
Includes bibliographical references (p. 147-159).
266

Skyline queries in database systems

Fu, Gregory Chung Yin. January 2003 (has links)
Thesis (M. Phil.)--Hong Kong University of Science and Technology, 2003. / Includes bibliographical references (leaves 51-52). Also available in electronic version. Access restricted to campus users.
267

GANNET: A machine learning approach to document retrieval

Chen, Hsinchun, Kim, Jinwoo 12 1900 (has links)
Artificial Intelligence Lab, Department of MIS, University of Arizona / Information science researchers have recently turned to new artificial intelligence-based inductive learning techniques, including neural networks, symbolic learning, and genetic algorithms. An overview of these techniques and their usage in information science research is provided. The algorithms adopted for GANNET, a hybrid system based on genetic algorithms and neural networks, are presented. GANNET performed concept (keyword) optimization for user-selected documents during information retrieval using genetic algorithms. It then used the optimized concepts to perform concept exploration in a large network of related concepts through the Hopfield net parallel relaxation procedure. Based on a test collection of about 3,000 articles from DIALOG and an automatically created thesaurus, and using the Jaccard score as a performance measure, the experiment showed that GANNET improved the Jaccard scores by about 50% and helped identify the underlying concepts that best describe the user-selected documents.
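For concreteness, a minimal sketch of the Jaccard measure the abstract refers to, used here as a hypothetical fitness signal for optimizing keyword sets; the function names, toy keyword sets, and the averaging scheme are illustrative assumptions, not GANNET's actual implementation.

```python
def jaccard(a: set, b: set) -> float:
    """Jaccard similarity |A & B| / |A | B|; defined as 0.0 when both sets are empty."""
    union = a | b
    return len(a & b) / len(union) if union else 0.0


def fitness(candidate: set, user_docs: list) -> float:
    """Hypothetical GA fitness: average Jaccard overlap between a candidate
    concept (keyword) set and the keyword sets of the user-selected documents."""
    return sum(jaccard(candidate, doc) for doc in user_docs) / len(user_docs)


# Two user-selected documents and one candidate concept set (toy data).
docs = [{"neural", "network", "learning"}, {"genetic", "algorithm", "learning"}]
print(fitness({"learning", "algorithm"}, docs))  # (0.25 + 2/3) / 2 ≈ 0.458
```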
268

COPLINK Knowledge Management for Law Enforcement: Text Analysis, Visualization and Collaboration

Atabakhsh, Homa, Schroeder, Jennifer, Chen, Hsinchun, Chau, Michael, Xu, Jennifer J., Zhang, Jing, Bi, Haidong January 2001 (has links)
Artificial Intelligence Lab, Department of MIS, University of Arizona / Crime and police report information is rapidly migrating from paper records to automated records management databases. Most mid- and large-sized police agencies have such systems, which provide access to information for their own personnel but lack any efficient means of providing that information to other agencies. Criminals show no regard for jurisdictional boundaries and, in fact, take advantage of the lack of communication across jurisdictions. Federal standards initiatives, such as the National Incident-Based Reporting System (NIBRS; US Department of Justice, 1998), are attempting to provide reporting standards to police agencies to facilitate future reporting and information sharing among agencies as these electronic reporting systems become more widespread. We integrated platform independence, stability, scalability, and an intuitive graphical user interface to develop the COPLINK system, which is currently being deployed at the Tucson Police Department (TPD). User evaluations of the application allowed us to study the impact of COPLINK on law enforcement personnel as well as to identify requirements for improving the system and extending the project. We are currently extending the functionality of COPLINK in several areas, including textual analysis, collaboration, visualization, and geo-mapping.
269

Incorporating semantic integrity constraints in a database schema

Yang, Heng-li 11 1900 (has links)
A database schema should consist of structures and semantic integrity constraints. Semantic integrity constraints (SICs) are invariant restrictions on the static states of the stored data and the state transitions caused by the primitive operations: insertion, deletion, or update. Traditionally, database design has been carried out on an ad hoc basis and focuses on structure and efficiency. Although the E-R model is the popular conceptual modelling tool, it contains few inherent SICs. Also, although the relational database model is the popular logical data model, a relational database in fourth or fifth normal form may still represent little of the data semantics. Most integrity checking is distributed to the application programs or transactions. This approach to enforcing integrity via the application software causes a number of problems. Recently, a number of systems have been developed for assisting the database design process. However, only a few of those systems try to help a database designer incorporate SICs in a database schema. Furthermore, current SIC representation languages in the literature cannot be used to represent precisely the necessary features for specifying the declarative and operational semantics of a SIC, and no modelling tool is available to incorporate SICs. This research addresses these problems by presenting two models and one subsystem. The E-R-SIC model is a comprehensive modelling tool for helping a database designer incorporate SICs in a database schema. It is application domain-independent and suitable for implementation as part of an automated database design system. The SIC Representation model is used to represent these SICs precisely. The SIC elicitation subsystem would verify these general SICs to a certain extent, decompose them into sub-SICs if necessary, and transform them into corresponding ones in the relational model. A database designer using these two modelling tools can describe more data semantics than with the widely used relational model, and the proposed SIC elicitation subsystem can provide more modelling assistance to the designer than current automated database design systems.
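A minimal sketch of the distinction the abstract draws between a static SIC (a restriction on the stored state itself) and a transition SIC (a restriction on how a primitive operation may change the state). The table, fields, and the two rules below are hypothetical examples, not the E-R-SIC or SIC Representation models proposed in the thesis.

```python
# Stored state: a list of row dicts (toy "employees" table).
employees = []

def static_ok(state) -> bool:
    """Static SIC: every salary is positive."""
    return all(row["salary"] > 0 for row in state)

def insert(state, row):
    """Insertion is allowed only if the resulting state satisfies the static SIC."""
    if not static_ok(state + [row]):
        raise ValueError(f"insert of {row!r} violates a static constraint")
    state.append(row)

def update_salary(state, name, new_salary):
    """Transition SIC: an update may never decrease an employee's salary."""
    for row in state:
        if row["name"] == name:
            if new_salary < row["salary"]:
                raise ValueError("update violates a transition constraint")
            row["salary"] = new_salary
            return
    raise KeyError(name)

insert(employees, {"name": "a", "salary": 100})
update_salary(employees, "a", 120)               # allowed: salary increases
# update_salary(employees, "a", 90)              # would raise: transition SIC violated
# insert(employees, {"name": "b", "salary": -5}) # would raise: static SIC violated
```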
270

Geometric searching with spacefilling curves

Nulty, William Glenn 08 1900 (has links)
No description available.
