1.
Spatial data quality management. He, Ying. Surveying & Spatial Information Systems, Faculty of Engineering, UNSW, January 2008.
The applications of geographic information systems (GIS) in various areas have highlighted the importance of data quality. Data quality research has been given priority by GIS academics for three decades; however, its outcomes have not been sufficiently translated into practical applications, and users still need a GIS capable of storing, managing and manipulating data quality information. To fill this gap, this research investigates how to develop a tool that effectively and efficiently manages data quality information to help data users better understand and assess the quality of their GIS outputs. Specifically, this thesis aims: 1. To develop a framework for establishing a systematic linkage between data quality indicators and appropriate uncertainty models; 2. To propose an object-oriented data quality model for organising and documenting data quality information; 3. To create data quality schemas for defining and storing the contents of metadata databases; 4. To develop a new conceptual model of data quality management; 5. To develop and implement a prototype system for enhancing the capability of data quality management in commercial GIS. Based on reviews of error and uncertainty modelling in the literature, a conceptual framework has been developed to establish the systematic linkage between data quality elements and appropriate error and uncertainty models. To overcome the limitations identified in the review and to satisfy a series of requirements for representing data quality, a new object-oriented data quality model has been proposed. It enables data quality information to be documented and stored in a multi-level structure and to be integrally linked with spatial data to allow access, processing and graphic visualisation. A conceptual model for data quality management is then proposed in which a data quality storage model, uncertainty models and visualisation methods are the three basic components. This model establishes the processes involved in managing data quality, emphasising the integration of uncertainty modelling and visualisation techniques. These studies lay the theoretical foundations for the development of a prototype system able to manage data quality. An object-oriented approach, database technology and programming techniques have been integrated to design and implement the prototype system within the ESRI ArcGIS software. The object-oriented approach allows the prototype to be developed in a flexible and easily maintained manner. The prototype allows users to browse and access data quality information at different levels, and a set of error and uncertainty models is embedded within the system. With the prototype, data quality elements can be extracted from the database and automatically linked with the appropriate error and uncertainty models, and their implications can be presented in the form of simple maps. This function offers users a set of uncertainty models from which to choose when assessing how uncertainty inherent in the data can affect their specific application, and it will significantly increase users' confidence in using data for a particular situation. To demonstrate the enhanced capability of the prototype, the system has been tested against real data. The implementation has shown that the prototype can efficiently assist data users, especially non-expert users, to better understand data quality and utilise it in a more practical way. The methodologies and approaches for managing quality information presented in this thesis should serve as an impetus for further research.
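As an illustration of the object-oriented model and the automatic linkage to uncertainty models described in this abstract, the following minimal Python sketch shows one way quality elements recorded at a given level could carry an uncertainty model that interprets them. The class names, the CE95 interpretation and the example values are assumptions for illustration, not the thesis's actual design.

```python
from dataclasses import dataclass, field

@dataclass
class UncertaintyModel:
    """Base class for models linked to a data quality element."""
    name: str

    def describe(self, value: float) -> str:
        raise NotImplementedError

@dataclass
class CircularErrorModel(UncertaintyModel):
    """Interprets a positional-accuracy value (RMSE in metres) as a 95%
    confidence radius, assuming circular normally distributed error."""
    name: str = "circular normal error"

    def describe(self, rmse_m: float) -> str:
        radius95 = 1.7308 * rmse_m  # standard CE95 factor for circular errors
        return f"95% of points expected within {radius95:.1f} m of their true position"

@dataclass
class QualityElement:
    """One data quality element (e.g. positional accuracy) together with its
    reported value and the uncertainty model used to interpret it."""
    element: str
    value: float
    model: UncertaintyModel

@dataclass
class QualityRecord:
    """Quality information attached at one level (dataset, feature class, feature)."""
    level: str
    elements: list = field(default_factory=list)

    def report(self) -> None:
        for e in self.elements:
            print(f"[{self.level}] {e.element} = {e.value}: {e.model.describe(e.value)}")

# Example: a dataset-level quality record automatically paired with a model
roads = QualityRecord("dataset: roads")
roads.elements.append(QualityElement("positional accuracy (RMSE, m)", 2.5, CircularErrorModel()))
roads.report()
```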
2.
Customer Data Management. Sehat, Mahdis; Pavez Flores, René. January 2012.
As business complexity and the number of customers continue to grow, and as customers evolve into multinational organisations that operate across borders, many companies face great challenges in the way they manage their customer data. In today's business, a single customer may have a relationship with several entities of an organisation, which means that customer data is collected through different channels. One customer may be described in different ways by each entity, which makes it difficult to obtain a unified view of the customer. In companies with several sources of data, where data is distributed across several systems, the data environment becomes heterogeneous. In this state, customer data is often incomplete, inaccurate and inconsistent throughout the company. This thesis studies how organisations with heterogeneous customer data sources implement the Master Data Management (MDM) concept to achieve and maintain high customer data quality. The purpose is to provide recommendations for achieving successful customer data management using MDM, based on existing literature related to the topic and an interview-based empirical study. Successful customer data management is more of an organisational issue than a technological one and requires a top-down approach in order to develop a common strategy for an organisation's customer data management. Proper central assessment and maintenance processes that can be adjusted according to the entities' needs must be in place. Responsibilities for the maintenance of customer data should be delegated to several levels of the organisation in order to better manage customer data.
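To make the consolidation problem concrete, here is a small hypothetical Python sketch of the matching-and-merging step an MDM hub typically performs when several source systems describe the same customer differently. The field names and the "most recent non-empty value wins" survivorship rule are assumptions for illustration, not findings of the thesis.

```python
from datetime import date

# Hypothetical customer records held by two source systems (CRM and billing)
sources = [
    {"system": "crm",     "customer": "ACME Corp.", "email": "info@acme.example", "updated": date(2012, 3, 1)},
    {"system": "billing", "customer": "Acme Corp",  "email": "",                  "updated": date(2012, 5, 20)},
]

def match_key(record):
    """Crude matching key: lower-cased name with punctuation and spaces removed."""
    return "".join(ch for ch in record["customer"].lower() if ch.isalnum())

def merge(records):
    """Assumed survivorship rule: for each attribute, keep the most recently
    updated non-empty value among the matched records."""
    golden = {}
    for rec in sorted(records, key=lambda r: r["updated"]):
        for attr, value in rec.items():
            if attr in ("system", "updated"):
                continue
            if value:  # a later non-empty value overwrites an earlier one
                golden[attr] = value
    return golden

# Group records that appear to describe the same customer, then build a golden record
groups = {}
for rec in sources:
    groups.setdefault(match_key(rec), []).append(rec)

for key, recs in groups.items():
    print(key, "->", merge(recs))
```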
3.
Assessing Data Quality of ERP and CRM Systems. Sarwar, Muhammad Azeem. January 2014.
Data Quality refers to the correct and meaningful representation of real world information. Researchers have proposed frameworks to measure and analyze Data Quality, yet modern organizations still find it very challenging to state their level of enterprise Data Quality maturity. This study aims to define the Data Quality of a system and to examine Data Quality Assessment practices. A definition of Data Quality is suggested with the help of a systematic literature review, which also provided a list of dimensions and initiatives for Data Quality Assessment. A survey was conducted to examine these aggregated aspects of Data Quality in an organization actively using ERP and CRM systems. The survey was aimed at gauging organizational awareness of Data Quality and at studying the practices followed to ensure Data Quality in ERP and CRM systems. The survey results identified data validity, accuracy and security as the main areas of interest for Data Quality. The results also indicate that, due to the audit requirements of ERP systems, ERP systems have a higher demand for Data Quality than CRM systems.
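Dimensions such as validity and completeness, of the kind this study aggregates, can be measured directly on record sets. The short Python sketch below scores two such dimensions for a handful of ERP-style records; the field names, the VAT pattern and the business rules are illustrative assumptions rather than the study's actual metrics.

```python
import re

# Hypothetical ERP customer records
records = [
    {"id": "C001", "vat_no": "SE556677889901", "credit_limit": 50000},
    {"id": "C002", "vat_no": "",               "credit_limit": -10},
    {"id": "C003", "vat_no": "SE5566",         "credit_limit": 20000},
]

VAT_PATTERN = re.compile(r"^SE\d{12}$")  # assumed format, for illustration only

def completeness(recs, field):
    """Share of records in which the field is present and non-empty."""
    return sum(1 for r in recs if r.get(field)) / len(recs)

def validity(recs):
    """Share of records passing two assumed business rules: the VAT number
    matches the expected pattern and the credit limit is non-negative."""
    ok = sum(1 for r in recs
             if VAT_PATTERN.match(r.get("vat_no", "")) and r["credit_limit"] >= 0)
    return ok / len(recs)

print(f"vat_no completeness: {completeness(records, 'vat_no'):.0%}")
print(f"rule validity:       {validity(records):.0%}")
```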
4.
Řízení kvality dat v malých a středních firmách / Data quality management in small and medium enterprises. Zelený, Pavel. January 2010.
This diploma thesis deals with data quality management. There are many tools and methodologies that support data quality management, even on the Czech market, but they are all aimed at large companies; small and medium-sized companies cannot afford them because of the high cost. The first goal of this thesis is therefore to summarize the principles of these methodologies and then, building on them, to propose a simpler methodology for small and medium-sized companies. In the second part of the thesis, the methodology is adapted and applied to a specific company. The first step is to choose the data area of interest within the company. Because purchasing a software tool to clean the data was not an option, relatively simple rules are defined and used as the basis for cleansing scripts written in SQL; these scripts perform the automatic data cleaning. A further analysis then determines which data should be cleaned manually. The next step gives recommendations on how to remove duplicates from the database, using functionality of the company's production system. The last step of the methodology is to create a control mechanism that maintains the required data quality in the future. At the end of the thesis, a data survey is carried out on four data sources, all from companies using the same production system. The purpose of the survey is to present an overview of data quality and also to support those companies' decisions about data cleansing.
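Since the cleansing rules in this methodology are implemented as SQL scripts, a hypothetical sketch of such a rule-driven pass, issued from Python against an in-memory SQLite table, is shown below. The table, column names, rules and duplicate criterion are illustrative assumptions, not the scripts actually used in the thesis.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT, country TEXT)")
conn.executemany(
    "INSERT INTO customer (name, country) VALUES (?, ?)",
    [("  Novak s.r.o. ", "cz"), ("Novak s.r.o.", "CZ"), ("Svoboda a.s.", " sk ")],
)

# Cleansing rules expressed as SQL, in the spirit of the scripts described above
cleansing_rules = [
    "UPDATE customer SET name = TRIM(name)",               # rule 1: strip stray whitespace
    "UPDATE customer SET country = UPPER(TRIM(country))",  # rule 2: normalise country codes
]
for rule in cleansing_rules:
    conn.execute(rule)

# Candidate duplicates: the same cleansed name and country appearing more than once
dupes = conn.execute(
    "SELECT name, country, COUNT(*) FROM customer "
    "GROUP BY name, country HAVING COUNT(*) > 1"
).fetchall()
print("possible duplicates:", dupes)
conn.close()
```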
5.
Datová kvalita a nástroje pro její řízení / Data Quality and Tools for Its Management. Tezzelová, Jana. January 2009.
This diploma thesis deals with data quality, with emphasis on its management and on the tools developed to address data quality issues. The goal of this work is to summarize knowledge about data quality problems, including data quality evaluation and management, a description of the key problems found in data, and possible solutions to them. The thesis also aims to analyse the market of software tools for supporting and managing data quality and, above all, to compare the functionality and capabilities of several of those tools. The work is split into two consecutive parts. The first, theoretical part introduces the problems of data quality and especially data quality management, including the identification of the main steps for successful management. The second, practical part focuses on the market for data quality tools, in particular its characteristics, segmentation, evolution, current state and expected trends. An important section of this part is a practical comparison of the features of several data quality tools and an evaluation of working with them. The work is intended to be useful for anyone interested in data quality problems, especially data quality management and its supporting technology. Thanks to its focus on the data quality tools market and on tool comparison, it can also serve as a guide for companies that are currently choosing a suitable tool for introducing data quality management. Given this focus, readers are expected to have at least a basic orientation in Business Intelligence.
6.
Strategies to Improve Data Quality for Forecasting Repairable Spare Parts. Eguasa, Uyi Harrison. 01 January 2016.
Poor input data quality used in repairable spare parts forecasting by aerospace small and midsize enterprise (SME) suppliers results in poor inventory practices that manifest in higher costs and critical supply shortage risks. Guided by data quality management (DQM) theory as the conceptual framework, the purpose of this exploratory multiple case study was to identify the key strategies that aerospace SME repairable spares suppliers use to maximize the quality of the input data used in forecasting repairable spare parts. The multiple case study comprised a census sample of 6 forecasting business leaders from aerospace SME repairable spares suppliers located in the states of Florida and Kansas. Data were collected via semistructured interviews with the consenting participants and from supporting documentation and organizational websites. Eight core themes emerged from the content data analysis process coupled with methodological triangulation. These themes were labeled: establish data governance, identify quality forecast input data sources, develop a sustainable relationship and collaboration with customers and vendors, utilize a strategic data quality system, conduct continuous input data quality analysis, identify input data quality measures, incorporate continuous improvement initiatives, and engage in data quality training and education. Of the 8 core themes, 6 aligned with the DQM theory's conceptual constructs while 2 surfaced as outliers. The key implication of the research for positive social change is the increased situational awareness it may give SME forecasting business leaders, prompting them to enhance business practices for the input data quality used to forecast repairable spare parts and thereby attain sustainable profits.
7.
Datová kvalita, integrita a konsolidace dat v BI / Data quality, data integrity and consolidation of data in BI. Dražil, Michal. January 2008.
This thesis deals with enterprise data quality, data integrity and data consolidation from the perspective of Business Intelligence (BI), an area currently experiencing significant growth. The aim of the thesis is to provide a comprehensive view of data quality in terms of BI, to analyze problems in the area of data quality control and to propose options for addressing them. Moreover, the thesis aims to analyze and assess the features of specialized software tools for data quality. Last but not least, it aims to identify the critical success factors for data quality in CRM and BI projects. The thesis is divided into two parts. The first (theoretical) part deals with data quality, data integrity and data consolidation in relation to BI and tries to identify the key issues related to these areas. The second (practical) part first deals with the features of software tools for data quality, offering a fundamental summary and breakdown of the tools, and then provides a basic comparison of a few selected software products specialized in corporate data quality assurance. The practical part also describes how data quality was addressed within a specific BI/CRM project conducted by Clever Decision Ltd. The thesis is intended primarily for BI and data quality experts, as well as others interested in these disciplines. Its main contribution is that it provides a comprehensive view not only of data quality itself but also of the issues directly related to corporate data quality assurance. The thesis can serve as a guide for one of the first implementation phases in BI projects, which deals with data integration, data consolidation and solving data quality problems.