61 |
Data Quality By Design: A Goal-oriented Approach / Jiang, Lei. 13 August 2010 (has links)
A successful information system is one that meets its design goals. Expressing these goals and subsequently translating them into a working solution is a major challenge for information systems engineering. This thesis adopts concepts and techniques from goal-oriented (software) requirements engineering research for conceptual database design, with a focus on data quality issues. Based on a real-world case study, a goal-oriented process is proposed for database requirements analysis and modeling. It spans from the analysis of high-level stakeholder goals to the detailed design of a conceptual database schema. This process is then extended specifically for dealing with data quality issues: data of low quality may be detected and corrected by performing various quality assurance activities; to support these activities, the schema needs to be revised to accommodate additional data requirements. The extended process therefore focuses on analyzing and modeling quality assurance data requirements.
A quality assurance activity supported by a revised schema may involve manual work and/or rely on automatic techniques, which often depend on the specification and enforcement of data quality (DQ) rules. To address the constraint aspect of conceptual database design, DQ rules are classified according to a number of domain- and application-independent properties. This classification can be used to guide rule designers and to facilitate the building of a rule repository. A quantitative framework is then proposed for measuring and comparing DQ rules according to one of these properties, effectiveness; this framework relies on the derivation of formulas that represent the effectiveness of DQ rules under different probabilistic assumptions. A semi-automatic approach is also presented for deriving these effectiveness formulas.
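As a rough illustration of the kind of effectiveness measure this abstract refers to (not taken from the thesis, which derives such formulas analytically), the following Python sketch estimates how often a simple range rule catches corrupted values under one assumed probabilistic error model; the rule, the error model, and the function names are all illustrative assumptions.

```python
import random

def rule_age_in_range(value, lo=0, hi=120):
    """A simple data quality rule: an age value must lie in [lo, hi]."""
    return lo <= value <= hi

def estimate_effectiveness(n=100_000, p_err=0.05, seed=42):
    """Estimate the fraction of erroneous values the rule detects,
    assuming each value is corrupted with probability p_err and a
    corrupted value is drawn uniformly from [-50, 300]."""
    rng = random.Random(seed)
    caught = errors = 0
    for _ in range(n):
        if rng.random() < p_err:            # this value is corrupted
            observed = rng.randint(-50, 300)
            errors += 1
            if not rule_age_in_range(observed):
                caught += 1                  # the rule flags the error
    return caught / errors if errors else 0.0

if __name__ == "__main__":
    print(f"Estimated rule effectiveness: {estimate_effectiveness():.3f}")
```

Under this particular error model the rule misses corrupted values that still fall inside the legal range, which is exactly the kind of effect an analytical effectiveness formula would capture.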
|
63 |
Effective use of Java Data Objects in developing database applications: advantages and disadvantages / Zilidis, Paschalis. January 2004 (has links) (PDF)
Thesis (M.S. in Computer Science)--Naval Postgraduate School, June 2004. / Thesis advisor(s): Thomas Otani. Includes bibliographical references (p. 267). Also available online.
|
64 |
A dynamic data/currency protocol for mobile database design and reconfiguration / Xia, Yanli. January 2002 (has links)
Thesis (M.S.)--University of Florida, 2002. / Title from title page of source document. Includes vita. Includes bibliographical references.
|
65 |
Semantic integrity recommendations on good design methodology / Kogan, Irina. January 2001 (has links)
Thesis (M. Sc.)--York University, 2001. Graduate Programme in Computer Science. / Typescript. Includes bibliographical references (leaves 155-159). Also available on the Internet. Mode of access: via web browser by entering the following URL: http://wwwlib.umi.com/cr/yorku/fullcit?pMQ71641.
|
66 |
On conceptual design of active databases / Tanaka, Asterio Kiyoshi. January 1992 (has links)
No description available.
|
67 |
Redesign of distributed relational databases / Karlapalem, Kamalakar. 12 1900 (has links)
No description available.
|
68 |
Impact of data modeling and database implementation methods on the optimization of conceptual aircraft design / Hall, Neil Scott. 08 1900 (has links)
No description available.
|
69 |
The next generation of database: object-oriented database / Hon, Wing-Keung. January 1994 (has links)
As new computer applications such as computer-aided design, multimedia systems, and knowledge-based systems require more complex data structures, traditional database systems are unable to support these new requirements. A recently developed database technology, the object-oriented database, provides a solution to these problems. The purpose of this thesis is to investigate what an object-oriented database is, with particular attention to its internal organization, such as object persistence. Two object-oriented database systems, EXODUS and ODE, are discussed in detail. In addition, a comparison between relational databases and object-oriented databases is made. / Department of Computer Science
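As a language-level illustration of the object-persistence idea mentioned above (this is not how EXODUS or ODE implement persistence; it is a minimal sketch using Python's standard shelve module, with hypothetical class and key names), a nested object graph can be stored and retrieved without first flattening it into relational tables:

```python
import shelve
from dataclasses import dataclass, field

@dataclass
class Part:
    """A design part with arbitrarily nested subparts, awkward to map to flat tables."""
    name: str
    subparts: list = field(default_factory=list)

# Persist an entire object graph under a single key.
with shelve.open("designs.db") as db:
    wing = Part("wing", [Part("flap"), Part("aileron")])
    db["wing"] = wing

# Later, the same structure comes back as live objects.
with shelve.open("designs.db") as db:
    restored = db["wing"]
    print(restored.subparts[0].name)   # -> "flap"
```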
|
70 |
Schema Integration: How to Integrate Static and Dynamic Database Schemata / Bellström, Peter. January 2010 (has links)
Schema integration is the task of integrating several local schemata into one global database schema. It is a complex, error-prone and time-consuming task. Problems arise in recognizing and resolving differences and similarities between two schemata, and in integrating static and dynamic schemata. In this thesis, three research topics are addressed, applying the notation of the Enterprise Modeling approach: Maintaining Vocabulary in Schema Integration, Integration of Static Schemata, and Integration of Static and Dynamic Schemata.

In Maintaining Vocabulary in Schema Integration, an analysis is conducted of what semantic loss is and why it occurs in schema integration. Semantic loss is a problem that should be avoided because both concepts and dependencies might be lost. In the thesis, it is argued that concepts and dependencies should be retained as long as possible in the schemata. This should facilitate user involvement, since the users’ vocabulary is retained even after resolving similarities and differences between two schemata.

In Integration of Static Schemata, two methods are developed. These methods facilitate recognition and resolution of similarities and differences between two conceptual database schemata. By applying the first method, problems between two schemata can be recognized that would otherwise pass unnoticed; by applying the second method, problems can be resolved without causing semantic loss, by retaining concepts and dependencies in the schemata.

In Integration of Static and Dynamic Schemata, a method for integrating static and dynamic schemata is developed. The method focuses on pre- and post-conditions and how to map these to states and state changes in the database. By applying the method, states that are important for the database can be designed and integrated into the conceptual database schema. Also by applying the method, active database rules can be designed and integrated into the conceptual database schema.
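To make the recognition step concrete, here is a minimal sketch (not one of the thesis's methods, which work on Enterprise Modeling schemata and also retain dependencies) that compares two conceptual schemata, given as hypothetical name-to-attribute-set mappings, and reports candidate matches, possible homonyms, and concepts unique to each side:

```python
def compare_schemata(schema_a, schema_b):
    """Compare two conceptual schemata given as {concept: set_of_attributes}.
    Returns candidate correspondences, possible homonyms, and unmatched concepts."""
    shared = schema_a.keys() & schema_b.keys()
    report = {
        "candidate_matches": [],
        "possible_homonyms": [],
        "only_in_a": sorted(schema_a.keys() - schema_b.keys()),
        "only_in_b": sorted(schema_b.keys() - schema_a.keys()),
    }
    for name in sorted(shared):
        if schema_a[name] == schema_b[name]:
            report["candidate_matches"].append(name)
        else:
            # Same name but different attribute sets: may denote different concepts.
            report["possible_homonyms"].append(name)
    return report

if __name__ == "__main__":
    a = {"Customer": {"id", "name"}, "Order": {"id", "date"}}
    b = {"Customer": {"id", "address"}, "Invoice": {"id", "amount"}}
    print(compare_schemata(a, b))
```

The point of such a pass is only to surface similarities and differences for a designer to resolve; deciding whether a reported homonym is a real conflict still requires the kind of user involvement the abstract emphasizes.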
|