221

Automatic processing of Chinese language bank cheques

余銘龍, Yu, Ming-lung. January 2002 (has links)
Published or final version / Electrical and Electronic Engineering / Master / Master of Philosophy
222

Design and performance analysis of data broadcasting systems

蕭潤明, Siu, Yun-ming. January 1995 (has links)
Published or final version / Electrical and Electronic Engineering / Doctoral / Doctor of Philosophy
223

Data aggregation for capacity management

Lee, Yong Woo 30 September 2004 (has links)
This thesis presents a methodology for data aggregation for capacity management. It is assumed that a company manufactures a very large number of products and that every product is stored in the database with its standard units per hour and with attributes that uniquely specify it. The methodology aggregates products into families based on the standard units per hour and finds a subset of attributes that unambiguously identifies each family. Data reduction and classification are achieved using well-known multivariate statistical techniques such as cluster analysis, variable selection and discriminant analysis. The experimental results suggest that the proposed methodology achieves effective data reduction.
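The clustering-and-discrimination idea the abstract describes can be illustrated with a brief sketch (a hypothetical example, not the thesis's implementation; the column names, the number of families and the use of pandas/scikit-learn are assumptions):

```python
# Hypothetical sketch: cluster products into families by standard units per hour,
# then learn which attributes identify each family. Column names, the family
# count and the libraries used are assumptions, not the thesis's code.
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.tree import DecisionTreeClassifier, export_text

# Toy product master data: one row per product.
products = pd.DataFrame({
    "units_per_hour": [120, 118, 125, 60, 58, 62, 30, 29, 31],
    "width_mm":       [10, 10, 12, 20, 20, 22, 40, 42, 40],
    "layers":         [1, 1, 1, 2, 2, 2, 4, 4, 4],
})

# Step 1: cluster analysis on the production-rate variable to form families.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0)
products["family"] = kmeans.fit_predict(products[["units_per_hour"]])

# Step 2: find a small set of attributes that unambiguously identifies each
# family (a stand-in for the variable selection / discriminant analysis step).
tree = DecisionTreeClassifier(max_depth=2, random_state=0)
tree.fit(products[["width_mm", "layers"]], products["family"])
print(export_text(tree, feature_names=["width_mm", "layers"]))
```

Capacity planning can then proceed per family rather than per individual product, which is the data reduction the abstract refers to.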
224

Computerized system of architectural specifications : with a database link to drawings

Suddarth, Jane January 1983 (has links)
The architecture profession, like many other design professions, is being transformed by the development and use of computers. Both computer hardware and software have become highly capable in the area of graphics. Yet even as graphic systems grow more sophisticated, there is currently no linkage between textual information and graphics. Architectural projects consist of both text (specifications) and graphics (working drawings). This creative project therefore develops a computer software system (a series of programs) for linking and unifying specification text with working drawings. / Department of Architecture
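The kind of text-to-graphics linkage the project describes can be sketched as a simple relational mapping between specification sections and drawing sheets (a generic illustration only, not the 1983 system; the table and column names are invented):

```python
# Generic illustration of linking specification text to drawing sheets through a
# database key. Table and column names are invented for this sketch.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE spec_section (id TEXT PRIMARY KEY, title TEXT, body TEXT);
CREATE TABLE drawing (sheet TEXT PRIMARY KEY, title TEXT);
CREATE TABLE spec_drawing (spec_id TEXT REFERENCES spec_section(id),
                           sheet   TEXT REFERENCES drawing(sheet));
""")
con.executemany("INSERT INTO spec_section VALUES (?, ?, ?)",
                [("09900", "Painting", "Apply two coats ..."),
                 ("08100", "Metal Doors", "Hollow metal frames ...")])
con.executemany("INSERT INTO drawing VALUES (?, ?)",
                [("A-501", "Door schedule"), ("A-601", "Finish schedule")])
con.executemany("INSERT INTO spec_drawing VALUES (?, ?)",
                [("08100", "A-501"), ("09900", "A-601")])

# Query: which drawing sheets does a given specification section affect?
for row in con.execute("""
    SELECT s.id, s.title, d.sheet, d.title
    FROM spec_section s
    JOIN spec_drawing sd ON sd.spec_id = s.id
    JOIN drawing d ON d.sheet = sd.sheet
    WHERE s.id = '08100'"""):
    print(row)
```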
225

Spatial data quality management

He, Ying, Surveying & Spatial Information Systems, Faculty of Engineering, UNSW January 2008 (has links)
The applications of geographic information systems (GIS) in many areas have highlighted the importance of data quality. Data quality research has been a priority for GIS academics for three decades, yet its outcomes have not been sufficiently translated into practical applications. Users still need a GIS capable of storing, managing and manipulating data quality information. To fill this gap, this research investigates how to develop a tool that effectively and efficiently manages data quality information and helps data users better understand and assess the quality of their GIS outputs. Specifically, this thesis aims:
1. To develop a framework for establishing a systematic linkage between data quality indicators and appropriate uncertainty models;
2. To propose an object-oriented data quality model for organising and documenting data quality information;
3. To create data quality schemas for defining and storing the contents of metadata databases;
4. To develop a new conceptual model of data quality management;
5. To develop and implement a prototype system for enhancing the capability of data quality management in commercial GIS.
Based on reviews of error and uncertainty modelling in the literature, a conceptual framework has been developed to establish the systematic linkage between data quality elements and appropriate error and uncertainty models. To overcome the limitations identified in the review and to satisfy a series of requirements for representing data quality, a new object-oriented data quality model has been proposed. It enables data quality information to be documented and stored in a multi-level structure and to be integrally linked with spatial data, allowing access, processing and graphic visualisation. A conceptual model for data quality management is proposed in which a data quality storage model, uncertainty models and visualisation methods are the three basic components. This model establishes the processes involved in managing data quality, emphasising the integration of uncertainty modelling and visualisation techniques.
These studies lay the theoretical foundations for a prototype system able to manage data quality. An object-oriented approach, database technology and programming techniques have been integrated to design and implement the prototype within the ESRI ArcGIS software. The object-oriented approach allows the prototype to be developed in a flexible and easily maintained manner. The prototype lets users browse and access data quality information at different levels, and a set of error and uncertainty models is embedded within the system. Data quality elements can be extracted from the database and automatically linked with the appropriate error and uncertainty models, and their implications can be presented as simple maps. This gives users a set of uncertainty models from which to choose when assessing how uncertainty inherent in the data may affect a specific application, and it significantly increases users' confidence in using data for a particular situation. To demonstrate the enhanced capability of the prototype, the system has been tested against real data. The implementation has shown that the prototype can efficiently assist data users, especially non-expert users, to better understand data quality and use it in a more practical way.
The methodologies and approaches for managing quality information presented in this thesis should serve as an impetus for further research.
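The object-oriented idea of attaching multi-level quality information to spatial data can be sketched with plain Python classes (a minimal illustration under assumed class and attribute names; it is not the ArcGIS prototype described in the thesis):

```python
# Minimal sketch of an object-oriented data quality model: quality elements are
# attached both to a feature class and to individual features, so quality can be
# looked up at either level. Class and attribute names are assumptions.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class QualityElement:
    name: str    # e.g. "positional_accuracy", "lineage"
    value: str   # e.g. "RMSE 2.5 m", "digitised from 1:25 000 maps"

@dataclass
class Feature:
    feature_id: int
    geometry: object
    quality: List[QualityElement] = field(default_factory=list)

@dataclass
class FeatureClass:
    name: str
    features: Dict[int, Feature] = field(default_factory=dict)
    quality: List[QualityElement] = field(default_factory=list)

    def quality_for(self, feature_id: int) -> List[QualityElement]:
        """Feature-level quality, falling back to class-level metadata."""
        feature = self.features.get(feature_id)
        return feature.quality if feature and feature.quality else self.quality

roads = FeatureClass("roads",
                     quality=[QualityElement("positional_accuracy", "RMSE 5 m")])
roads.features[1] = Feature(1, geometry=None,
                            quality=[QualityElement("positional_accuracy", "RMSE 1 m")])
roads.features[2] = Feature(2, geometry=None)  # inherits class-level quality

print([q.value for q in roads.quality_for(1)])  # ['RMSE 1 m']
print([q.value for q in roads.quality_for(2)])  # ['RMSE 5 m']
```

Feature-level records override class-level defaults, which mirrors the multi-level storage and lookup described in the abstract.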
227

Data discretization simplified randomized binary search trees for data preprocessing

Boland, Donald Joseph. 2007 (has links)
Thesis (M.S.)--West Virginia University, 2007. / Title from document title page. Document formatted into pages; contains xiv, 174 p. : ill. (some col.). Includes abstract. Includes bibliographical references (p. 172-174).
228

Entwicklung von Data-Warehouse-Systemen: Anforderungsmanagement, Modellierung, Implementierung [Development of data warehouse systems: requirements management, modelling, implementation]

Goeken, Matthias. January 2006 (has links)
Also published as: dissertation, University of Marburg, 2006.
229

Transparent Asynchronous Transmitter Receiver Interface (TAXI) communications for fiber optic data links

Sankaran, Mahadevan. January 1994 (has links)
Thesis (M.S.)--Virginia Polytechnic Institute and State University, 1994. / Vita. Abstract. Includes bibliographical references (leaf 51). Also available via the Internet.
230

Progressive image and video transmission with error concealment on burst error channels and lossy packet networks

Srinivas, Bindignavile S. January 1997 (has links)
Thesis (Ph. D.)--University of Washington, 1997. / Vita. Includes bibliographical references (leaves [84]-91).
