1. Business Process Models Quality: evaluation and improvement / La qualité des processus métier : évaluation et amélioration. Ayad, Sarah, 18 December 2013.
The scientific problem addressed is the modeling and improvement of business processes. It is of growing interest to companies, which are becoming aware of the undeniable impact that a better understanding and better management of business processes (BP) can have on the effectiveness, consistency, and transparency of their activities. The work carried out in this thesis aims to propose a method and a tool to measure and improve the quality of business process models. The originality of the approach is that it targets not only syntactic quality but also semantic and pragmatic quality, relying in particular on domain knowledge.

In recent years, the problems related to modeling and improving business processes have been of growing interest. Indeed, companies are realizing the undeniable impact of a better understanding and management of business processes (BP) on the effectiveness, consistency, and transparency of their operations. BP modeling aims at a better understanding of processes, allowing decision-makers to achieve the strategic goals of the company. However, inexperienced systems analysts often lack domain knowledge, and this affects the quality of the models they produce.

We target the problem of business process modeling quality by proposing an approach encompassing methods and tools for measuring and improving the quality of BP models. We propose to support this modeling effort with an approach that uses domain knowledge to improve the semantic quality of BP models. The main contribution of this thesis is fourfold:

1. Exploiting IS domain knowledge: a business process metamodel is identified, and semantics are added to it by means of OCL constraints.
2. Exploiting application domain knowledge: this relies on domain ontologies; an alignment between the concepts of both metamodels is defined and illustrated.
3. Designing a guided quality process encompassing methods and techniques to evaluate and improve business process models. The process proposes quality constraints and metrics to evaluate the quality of the models, and then proposes relevant recommendations for improvement.
4. Developing the software prototype "BPM-Quality", which implements all the above-mentioned artifacts and proposes a workflow enabling its users to evaluate and improve CMs efficiently and effectively.

We conducted a survey to validate the selection of the quality constraints in a first experiment, and a second experiment to evaluate the efficacy and efficiency of our overall approach and the proposed improvements.
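For illustration only, here is a minimal Python sketch of the kind of ontology-based semantic-quality metric and recommendation step described in contribution 3 above. The abstract does not give the actual metamodel, OCL constraints, or metrics used by BPM-Quality, so the class names, ontology terms, and threshold below are hypothetical.

```python
# Illustrative sketch only: not the BPM-Quality implementation.
# All names, terms, and thresholds are hypothetical.
from dataclasses import dataclass, field


@dataclass
class Activity:
    label: str


@dataclass
class BPModel:
    name: str
    activities: list[Activity] = field(default_factory=list)


def semantic_coverage(model: BPModel, ontology_terms: set[str]) -> float:
    """Fraction of activity labels that match at least one domain-ontology term."""
    if not model.activities:
        return 0.0
    matched = sum(
        1 for a in model.activities
        if any(term in a.label.lower() for term in ontology_terms)
    )
    return matched / len(model.activities)


def recommend(model: BPModel, ontology_terms: set[str], threshold: float = 0.8) -> list[str]:
    """Emit simple improvement recommendations when semantic coverage is low."""
    recs = []
    if semantic_coverage(model, ontology_terms) < threshold:
        for a in model.activities:
            if not any(term in a.label.lower() for term in ontology_terms):
                recs.append(f"Rename or refine activity '{a.label}' using a domain concept.")
    return recs


if __name__ == "__main__":
    ontology = {"order", "invoice", "payment", "shipment"}  # hypothetical domain terms
    model = BPModel("order-to-cash", [Activity("Receive order"), Activity("Do stuff")])
    print(f"semantic coverage: {semantic_coverage(model, ontology):.2f}")
    for r in recommend(model, ontology):
        print(r)
```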
2. Evaluating Data Quality in a Data Warehouse Environment / Utvärdering av datakvalitet i ett datalager. Redgert, Rebecca, January 2017.
The amount of data accumulated by organizations has grown significantly during the last couple of years, increasing the importance of data quality. Ensuring data quality for large amounts of data is a complicated task, but crucial to subsequent analysis. This study investigates how to maintain and improve data quality in a data warehouse. A case study of the errors in a data warehouse was conducted at the Swedish company Kaplan and resulted in guiding principles for improving data quality. The investigation was done by manually comparing data from the source systems to the data integrated in the data warehouse and applying a quality framework based on semiotic theory to identify errors. The three main guiding principles given are (1) to implement a standardized format for the source data, (2) to implement a check prior to integration where the source data are reviewed and corrected if necessary, and (3) to create and implement specific database integrity rules. Further work is encouraged on establishing a guide for the framework on how best to perform a manual comparison of data, and on quality assurance of source data.
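As a concrete illustration of the three guiding principles, here is a minimal Python sketch of a standardization step and a pre-integration check. The field names, accepted formats, and rules are hypothetical and not taken from the Kaplan case study.

```python
# Illustrative sketch only: the study's actual source systems, formats, and
# rules are not described in the abstract; fields and checks below are hypothetical.
from datetime import datetime

REQUIRED_FIELDS = {"customer_id", "amount", "created_at"}


def standardize(record: dict) -> dict:
    """Principle 1: coerce a source record into one standardized format."""
    out = dict(record)
    out["customer_id"] = str(out["customer_id"]).strip()
    out["amount"] = round(float(out["amount"]), 2)
    # Accept a couple of common date formats and normalize to ISO 8601.
    for fmt in ("%Y-%m-%d", "%d/%m/%Y"):
        try:
            out["created_at"] = datetime.strptime(str(out["created_at"]), fmt).date().isoformat()
            break
        except ValueError:
            continue
    return out


def check_before_integration(record: dict) -> list[str]:
    """Principle 2: review source data and report errors before loading."""
    errors = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        errors.append(f"missing fields: {sorted(missing)}")
    if "amount" in record:
        try:
            if float(record["amount"]) < 0:
                errors.append("amount must be non-negative")  # Principle 3: an integrity rule
        except (TypeError, ValueError):
            errors.append("amount is not numeric")
    return errors


if __name__ == "__main__":
    raw = {"customer_id": " 42 ", "amount": "19.995", "created_at": "03/05/2017"}
    problems = check_before_integration(raw)
    print(problems or standardize(raw))
```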