91

Modellering av affärsprocesser : FALLSTUDIE PÅ CENTSOFT AB

Chen, Louis January 2014 (has links)
Många företag har börjat inse att processmodellering kan vara ett bra verktyg för att bättre förstå sitt företag, och att man kan använda processkartor för att analysera och förbättra sina processer. Centsoft AB är ett litet IT-företag med sex anställda som omsätter ca 5,4 miljoner kronor per år. Centsofts huvudsakliga produkt är en webbaserad applikation för elektronisk fakturahantering. I dagsläget har Centsoft ingen dokumentation alls som visar hur dess interna processer ser ut. Därför har studenten fått i uppdrag av Centsoft AB att kartlägga dess affärsprocesser, analysera dessa och komma med eventuella förbättringsförslag. Arbetet är indelat i tre delar: först genomfördes en förstudie av Centsoft för att förstå dess verksamhet, ta reda på vilka processer det har och producera en domänbeskrivning. Sedan följde en modelleringsdel under vilken processkartor över verksamheten konstruerades; denna del skedde iterativt för att säkerställa att modellerna blev klara i tid. Arbetet avslutades med en analysdel för att identifiera potentiella förbättringsmöjligheter. Tekniken som användes för förbättringsförslagen var processförbättring, eftersom process-redesign bedömdes som alltför tidskrävande. Resultatet är elva processkartor, varav två är förbättringsförslag. En av slutsatserna som kunde dras utifrån detta arbete är att Centsoft har välfungerande affärsprocesser som bidrar till att företaget expanderar. Det här arbetet blir en bra grund för Centsoft att arbeta vidare på, t.ex. vid framtida expansioner, då det kanske blir aktuellt att konstruera nya processer eller ändra befintliga. / Many companies have begun to realize that process modelling can be a useful tool to better understand their business, and that process maps can be used to analyze and improve company processes. Centsoft AB is a small IT company with six employees and revenue of approximately 5.4 million SEK per year. Centsoft's main product is a web-based application for handling electronic invoices. As of today, Centsoft does not have any documentation at all of its internal processes. Therefore, Centsoft has tasked the student with identifying and analyzing its business processes and coming up with suggestions for improvement. The project is divided into three parts. The first part is a case study of Centsoft, carried out in order to understand its operations, identify its business processes and produce a domain description. The second part is the creation of process maps for the business; this part was done iteratively to ensure that the process maps were delivered on time. The third part is an analysis of the process maps for potential improvement opportunities. The technique used to improve the processes was process improvement, because process redesign was considered too time-consuming. The result is eleven process maps, two of which are suggestions for improvement. A conclusion that could be drawn from this project, based on Centsoft's successful expansion and profitability, is that the company has well-functioning processes. This project will be a good basis for Centsoft during future expansion, when the need to redesign existing processes or design new ones arises.
92

Weakness Identification of Excess Inventory Based on Business Process Models : A Case Study with Business Process Modelling and Weakness Identification

He, Hongyu January 2020 (has links)
With the development and impact of ICT, work in many organizations has become more collaborative and communicative, and a number of organizations have started to adopt corresponding strategies to achieve business goals and create more value. Managing business processes is an effective and efficient way to improve the productivity and performance of business activities at an organizational level. A business process model, as a representation of a business process, provides a big picture of the process, allowing organizations to understand the logical relationships among different business activities and to improve the process through various approaches. This study discusses the application of business process models to weakness identification related to the problem of excess inventory in the supply chain, using a qualitative method. It adopts three business process modelling techniques to build process models for a planning team involved with demand and supply planning, with four experts from the team participating in interviews. The models are analyzed against selected weakness patterns in order to identify process weaknesses and link them with the causes of excess inventory. The result of this study gives a positive answer: business process models are capable of identifying causes of the concrete problem of excess inventory by identifying process weaknesses.
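The pattern-based analysis described above can be sketched in a few lines. The process representation and the single weakness pattern below ("an activity whose results feed back into nothing") are illustrative assumptions for this summary, not the modelling techniques or weakness patterns actually used in the study.

```python
# Toy business process model: each activity names its successors.
# Both the layout and the activity names are hypothetical.
process = {
    "Forecast demand":    {"next": ["Create supply plan"]},
    "Create supply plan": {"next": ["Place orders"]},
    "Place orders":       {"next": ["Receive goods"]},
    "Receive goods":      {"next": []},
}

def find_dead_ends(model):
    """Flag activities whose output feeds back into no other activity --
    a simple weakness pattern plausibly linked to excess inventory,
    since actual receipts never correct the forecast."""
    return [name for name, act in model.items() if not act["next"]]

print(find_dead_ends(process))  # -> ['Receive goods']
```

In a real analysis, each weakness found this way would still have to be linked by domain experts to a concrete cause of excess inventory, as the study does through interviews.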
93

A Case Study of a Business Process Modeling in Mobile ERP System

Broomé Clason, Agnes, Holmberg, Maria January 2019 (has links)
Utvecklingen inom informationssystem har lett till att allt fler verksamheter vill digitalisera sina arbetsprocesser. Det växande mobila användandet av Enterprise Resource Planning (ERP) på enheter såsom smartphones, tablets och handdatorer har lett till att verksamheter ställer större krav på sina system och leverantörer av dessa. Framtidens ERP-system måste fungera mobilt och underlätta arbetet för de anställda istället för att vara ännu ett IT-system som ska lösa organisationens alla utmaningar. Denna fallstudie är gjord hos en grossistverksamhet i Skåne som var mitt i implementationen av sitt ERP-system, Microsoft Dynamics. Studien har undersökt hur arbetet med verksamhetens affärsprocesser såg ut före och under implementationen av det nya systemet. Bakgrunden till fallstudien var att undersöka om det fanns korrelation mellan hur väl affärsprocesserna var evaluerade och anpassade och hur lyckad verksamhetens ERP-implementation blev. Fallstudien vill undersöka hur processmodellering och Business Process Management (BPM) kan stödja denna slags implementation av mobila ERP-system. Verksamheten i studien är ett mellanstort sälj- och distributionsföretag med kontor på fyra svenska orter. Verksamheten påbörjade 2015 en digitalisering av hela organisationen. IT-systemet bestod då av flera olika system som anpassats under längre tid för att integrera med varandra och dela information precis efter organisationens önskan. Som en del av implementationen valde de att använda ett Business Process Modeling-verktyg för att introducera och träna användarna av processerna i det nya systemet. Implementationsfasen i projektet började i februari 2017, och i februari 2019 hade verksamheten fortfarande inte fastställt ett Go-live-datum. Medeltiden för en ERP-implementation är 17,4 månader [5]. Då verksamhetens Go-live-datum har flyttats räknas implementationen nu som försenad och över medeltiden. Arbetet med processerna före själva projektstarten uppfattas som bristfälligt utifrån BPM, och slutsatsen dras att förseningen av Go-live-datumet delvis beror på att företaget och organisationen inte har analyserat sina egna processer och har förlitat sig för mycket på att konsultbolaget och RapidValue BPM skulle sköta inlärning och implementation. / The development in information systems has led to an increasing number of businesses wishing to digitize their work processes. The growing mobile use of Enterprise Resource Planning (ERP) systems on devices such as smartphones and tablets has led businesses to place greater demands on their information systems and the suppliers of these. ERP systems of the future must work on mobile devices and facilitate the work of the employees instead of being yet another IT system meant to solve all of the organisation's challenges. This case study was made at a wholesale company in Skåne, which was in the midst of an ERP implementation with Microsoft Dynamics. The study investigated how the work with business processes looked before and during the implementation of the new system. The background to the case study was to investigate whether there was any correlation between how well the business processes were evaluated and adapted and how successful the business's ERP implementation was. The case study aims to investigate how process modeling and Business Process Management (BPM) can support this kind of implementation of mobile ERP systems. The business in the study is a medium-sized sales and distribution company with offices in four Swedish locations. In 2015, the business began digitizing the entire organization. The IT landscape then consisted of several different systems that had been adapted over time to fit the needs of the organisation. As part of the implementation, they chose to use a Business Process Modeling tool to introduce the system and train the users of the processes in the new system. The implementation phase of the project began in February 2017, and in February 2019 the business still had not set a go-live date. The average time for an ERP implementation is 17.4 months [5]. Since the go-live date has been moved, the implementation is now considered delayed and over the average time. The work on the processes before the actual start of the project is perceived as inadequate from a BPM perspective, and the conclusion is drawn that the delay of the go-live date is partly due to the fact that the company and the organization have not analyzed their own processes and have relied too much on the consulting company and RapidValue BPM to handle training and implementation.
94

Analysis of Negative Emission Ammonia Fertilizer (urea) Process / Analys av negativa utsläpp från ammoniak gödsel (urea) processen

Alejo Vargas, Lucio Rodrigo January 2020 (has links)
As the world population keeps increasing, ammonia-based fertilizers like urea are essential to provide food security. However, the current fertilizer industry is based on fossil fuel feedstock (mainly natural gas), making the production process CO2 emission-intensive. More specifically, besides the CO2 emitted during the process, the CO2 captured in urea is also released into the atmosphere after the fertilizer is applied to agricultural soils. This positions the fertilizer industry among the top four industrial emitters globally. Hence, in order to meet the target of limiting global warming to 1.5 °C and achieve net-zero emissions by 2050, it is necessary to strengthen carbon mitigation efforts in the current fertilizer industry. This can be achieved in different ways, such as using renewable biofuels and implementing technologies that can lead to zero or negative CO2 emissions. For that reason, the present study presents pathways to achieve a more environmentally friendly fertilizer production process. An overall analysis is performed to determine whether negative emissions can be achieved by replacing different fractions of natural gas (used as both feedstock and fuel) with biogas and biomethane, and by capturing and storing the CO2 emitted from the process using chemical solvents such as activated MDEA and MEA. The results revealed that negative emissions in a fertilizer plant can be achieved by retrofitting an existing ammonia plant with an MEA-based CO2 capture system (with a carbon capture rate of 90%) for the SMR burner flue gas, and by introducing 50% biogas in the feedstock (alongside natural gas) and 75% biogas in the SMR burner fuel (alongside natural gas). This initial approach would result in net negative emissions from urea's production and application, and would require approximately 0.5 kg of biogas per kg of urea produced in this case. Furthermore, the equivalent energy intensity of the negative emission urea plant would be 0.32% and 3.37% lower compared to the fossil fuel-based case without and with CCS, respectively. Ultimately, it is even possible to produce approximately 6% more urea by replacing a particular fraction of natural gas with biogas. The reason for this increased production is the surplus of carbon dioxide introduced with the biogas: it can be used along with the ammonia product that goes to storage in the fossil fuel-based case, where there was not enough CO2 to keep the feedstock molar ratio at the urea plant's inlet.
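The sign of the plant's overall balance can be illustrated with a toy accounting function. The simple split between fossil and biogenic carbon and all numbers below are illustrative assumptions, not the thesis's process model or its actual emission factors.

```python
# Simplified CO2 accounting for a capture-equipped plant co-fed with
# biogas: captured biogenic CO2 that is permanently stored counts as
# a negative emission. All figures are illustrative placeholders.

def net_co2(flue_gas_co2, capture_rate, biogenic_fraction):
    """Net CO2 per unit of product: fossil CO2 released to the
    atmosphere minus biogenic CO2 captured and stored."""
    captured = flue_gas_co2 * capture_rate
    released = flue_gas_co2 - captured
    fossil_released = released * (1.0 - biogenic_fraction)
    biogenic_stored = captured * biogenic_fraction
    return fossil_released - biogenic_stored

# 90% capture with half the carbon biogenic gives a net removal:
print(net_co2(1.0, 0.9, 0.5))   # < 0, i.e. net negative emissions
```

This mirrors the qualitative result above: a high capture rate plus a sufficient biogenic share of the feedstock and fuel pushes the balance below zero.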
95

Dynamic Modelling and Optimization of Polymerization Processes in Batch and Semi-batch Reactors. Dynamic Modelling and Optimization of Bulk Polymerization of Styrene, Solution Polymerization of MMA and Emulsion Copolymerization of Styrene and MMA in Batch and Semi-batch Reactors using Control Vector Parameterization Techniques.

Ibrahim, W.H.B.W. January 2011 (has links)
Dynamic modelling and optimization of three different processes, namely (a) bulk polymerization of styrene, (b) solution polymerization of methyl methacrylate (MMA) and (c) emulsion copolymerization of styrene and MMA in batch and semi-batch reactors, are the focus of this work. The models are presented as sets of differential-algebraic equations describing the process. Different optimization problems are formulated, such as (a) maximum conversion (Xn), (b) maximum number average molecular weight (Mn) and (c) minimum time to achieve the desired polymer molecular properties (defined as pre-specified values of monomer conversion and number average molecular weight). Reactor temperature, jacket temperature, initial initiator concentration, monomer feed rate, initiator feed rate and surfactant feed rate are used as optimization variables. The dynamic optimization problems were converted into nonlinear programming problems using control vector parameterization (CVP) techniques, which were solved using the efficient SQP (Successive Quadratic Programming) method available within the gPROMS (general PROcess Modelling System) software. The process model for bulk polystyrene polymerization in batch reactors, using 2,2'-azobisisobutyronitrile (AIBN) as initiator, was improved by including the gel and glass effects. Compared with a previous study by other researchers that disregarded the gel and glass effects, the results of this work show that the batch operation time is significantly reduced while the required initial initiator concentration increases. Also, the termination rate constant decreases as the concentration of the mixture increases, resulting in rapid monomer conversion. The process model for solution polymerization of methyl methacrylate (MMA) in batch reactors, using AIBN as the initiator and toluene as the solvent, was improved by including free volume theory to calculate the initiator efficiency, f. The effect of varying f was examined and compared with previous work, which used a constant value of f = 0.53. The results show that the initiator efficiency f is not constant but decreases as monomer conversion increases along the process. The determination of optimal control trajectories for emulsion copolymerization of styrene and MMA, with the objective of maximizing the number average molecular weight (Mn) and overall conversion (Xn), was carried out in batch and semi-batch reactors. The initiator used in this work is potassium persulfate (K2S2O8) and the surfactant is sodium dodecyl sulfate (SDS). Reducing the pre-batch time increases Mn but decreases the conversion (Xn): the sooner monomer is added into the reactor, the earlier the polymer chains start to grow, leading to higher Mn. Mn can also be increased by decreasing the initial initiator concentration (Ci0): fewer oligomeric radicals are produced at low Ci0, reducing the number of polymerization loci and thus lowering the overall conversion. On the other hand, increasing the reaction temperature (Tr) will decrease Mn, since the transfer coefficient increases at higher Tr, producing more monomeric radicals and thus more termination reactions.
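The CVP idea, discretizing the control profile into piecewise-constant intervals and handing the resulting nonlinear program to an SQP solver, can be sketched with SciPy. The first-order kinetics and all numerical values below are toy stand-ins, not the thesis's polymerization models or its gPROMS implementation; SciPy's SLSQP plays the role of the SQP solver.

```python
import numpy as np
from scipy.optimize import minimize

def conversion(temps, t_batch=1.0):
    """Euler-integrate dX/dt = k(T) * (1 - X) with a piecewise-constant
    temperature profile (one value per time interval)."""
    dt = t_batch / len(temps)
    x = 0.0
    for T in temps:
        k = np.exp(-1000.0 / T)      # toy Arrhenius-type rate constant
        x += dt * k * (1.0 - x)
    return x

n_intervals = 5                                       # CVP discretization level
res = minimize(lambda u: -conversion(u),              # maximize final conversion
               x0=np.full(n_intervals, 320.0),        # initial guess, K
               bounds=[(300.0, 360.0)] * n_intervals, # temperature bounds
               method="SLSQP")                        # SQP-type NLP solver
```

For this monotone toy model the optimizer simply pushes each interval toward the upper temperature bound; in the real problems, gel/glass effects and molecular-weight objectives make the optimal profile non-trivial.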
96

Life cycle assessment of feedstock recycling processes

Keller, Florian 06 February 2024 (has links)
This study examines the ecological impact of exemplary processes for the feedstock recycling of waste fractions. It is shown that the material process efficiency of gasification and pyrolysis has a low impact on the greenhouse gas balance in the short term, but that high product yields are necessary in the long term to avoid an increasing climate impact. In a systemic context, different process routes of syngas and pyrolysis oil utilization are compared, and their efficiency and quantitative potential for greenhouse gas reduction are classified against electricity-based alternatives, namely direct electric heating of conventional processes and electrolysis-based process chains. It is shown that direct utilization options with few process steps are ecologically more efficient. Feedstock recycling shows a reduction potential similar to direct heating, while the use of electrolysis-based process chains is inefficient but necessary to achieve systemic climate neutrality.

Contents:
1. Introduction and outline
2. Life cycle assessment methodology
2.1. Previous LCA investigation on feedstock recycling
2.2. Assessment scope
2.3. Attributional vs. consequential LCI modelling
2.4. Inventory modelling consistency
2.5. Prospective technology assessment
2.6. Conclusions for the applied methodology
3. Process description and modelling
3.1. Feedstock recycling technologies
3.1.1. Gasification
3.1.2. Syngas conditioning and purification
3.1.3. Pyrolysis
3.1.4. Pyrolysis oil hydroprocessing
3.2. Chemical production technologies
3.2.1. Steam cracking
3.2.2. Catalytic reforming
3.2.3. Olefin and BTX recovery
3.2.4. Conventional syngas production
3.2.5. Methanol and methanol-based synthesis
3.2.6. Ammonia synthesis
3.3. Electric power integration options
3.4. Conventional waste treatment processes
3.4.1. Mechanical biological treatment and material recovery
3.4.2. Waste incineration
3.5. Utility processes and process chain balancing
3.6. Electricity and heat supply modelling
4. Individual assessment of feedstock recycling processes
4.1. Goal and scope definition
4.2. Life cycle inventory
4.3. Impact assessment
4.4. Interpretation
5. System-based assessment of feedstock recycling processes
5.1. Goal and scope definition
5.2. Life cycle inventory
5.2.1. Utility, background system inventory and system integration
5.2.2. Assessment scenario definition and parameter variation
5.3. Impact assessment
5.3.1. Framework Status Quo (FSQ)
5.3.2. Framework Energy Integration (FEI)
5.4. Interpretation
6. Summary and conclusion
6.1. Results
6.2. Recommendations and outlook
References
Supplementary Material
97

Hydrogen liquefaction chain: co-product hydrogen and upstream study / Väteförvätskningskedja: samproduktväte och uppströmsstudie

Lusson, Salomé January 2021 (has links)
The European Green Deal declared that Europe must decarbonize to become carbon-neutral by 2050. To that end, the European Parliament emphasized hydrogen as a major tool for the energy transition. In view of current environmental challenges, liquid hydrogen has raised interest as an energy carrier for energy storage and transport. With the growing use of renewable energy sources such as solar and wind, hydrogen production will become largely intermittent. However, because hydrogen has historically been produced by steam methane reforming, liquefaction processes were developed for a steady nominal load. In order to feed current liquefaction processes with renewable hydrogen, a buffer system will become necessary. This thesis studies the effect of combining a buffer with liquefaction on performance and cost. In order to study liquefaction from an intermittent source, the analysis is based on industrial data from a variable co-product hydrogen profile, which acts as a simplified case. The scope of the study is drawn by considering compressed hydrogen as temporary storage for the buffer, while the liquefaction unit is modelled around the Linde Leuna cycle. The techno-economic study covers sensitivity analyses on both the buffer and the liquefaction unit. For the buffer unit, the impacts of storage capacity, storage pressure, liquefaction flexibility and recovery rate are examined. The liquefaction sensitivity analysis covers pressure drop, electricity cost and capacity. It is highlighted that 100% gaseous hydrogen recovery is not profitable, due to the steep cost increase for recovery rates above 95%. Storage pressure and capacity, as well as liquefaction flexibility, drive the buffer cost and the recovery rate of the co-product hydrogen. Regarding the liquefaction study, results highlight that pressure drops cause first-order deviations in both energy consumption and cost. Results show that the specific buffer cost is evaluated at between 59% and 71% of the liquefaction cost. Hence the thesis draws attention to future work on heat exchanger design, pressure drop optimization and liquefaction unit flexibility, to allow optimized renewable liquid hydrogen production.
98

Förändringsarbete av informationsflöden i en interorganisatorisk samverkan / Change Management of Information Flows in an Interorganizational Alliance

Brandt, Juliana January 2019 (has links)
In recent years, train punctuality, Sweden's ranking in the European Railway Performance Index and customer satisfaction in the train industry have all decreased. The largest and most influential players in the train industry are Trafikverket and SJ AB, two state-owned organizations. The low customer satisfaction is based on SJ AB's failure to deliver the latest traffic information in case of delays. Since Trafikverket manages the majority of the railway network and SJ AB is Sweden's largest train operator, there is a high degree of mutual dependence between them in order for each organization to be able to conduct its business. At present, there is a clear division of responsibility between these actors regarding how and where traffic information is distributed: Trafikverket is responsible for conveying all information on its website and on platform signs, while SJ AB is responsible for conveying information on its website and in its application, as well as via text message and email. The processes for communication between the actors regarding delays, track changes and other changes in the journey are carried out manually, with digital tools as support. Today, several uncertainties are experienced in these processes, since traffic control at SJ AB currently does not receive any confirmation from Trafikverket when changes have been made. Due to these factors, this study investigates the flow of information between SJ traffic control and Trafikverket's train services. The purpose of this thesis was therefore to identify existing communications, more specifically information flows of traffic data, between the organizations and their passengers, and to identify possible improvements in the information flows through digitization. The thesis also explores the change opportunities for a department with many manual processes. The study was conducted with a multi-method structure consisting of a literature study, an observation study, a questionnaire study, an interview study and benchmarking. During the observation, questionnaire and interview studies, the focus was on employees at SJ traffic control and other relevant departments at SJ AB, and the benchmarking was conducted through external interviews. This led to the identification of the main causes of mismanaged traffic information, as well as best practices, which were then analyzed with the help of the collected theory, leading to a discussion, conclusions and recommendations to SJ AB. It is evident that traffic information has low status within SJ AB as well as within the interorganizational alliance. This has hampered the development of internal processes for managing traffic information, and the processes therefore lack standardized procedures and structured routines. The main causes of mismanaged traffic information lie in the human factor and in specific individuals: both traffic control centers perceived the received traffic information as unstructured and difficult to interpret, depending on who sent it, because individuals use varying expressions and internal technical language. To improve and, in the future, automate the information flows between the actors, a standardized working procedure with an associated technical language is required in the industry. To establish such a technical language, the status of traffic information must be raised within the organizations, and a dictionary is needed in which all used expressions are listed and defined. Additionally, a clearer goal for the alliance is required.
99

Development of an integrated tool for Process Modelling and Life Cycle Assessment : Ecodesign of process plants and application to drinking water treatment / Développement d’un outil intégré pour la Modélisation de Procédés et l’Analyse de Cycle de Vie : Ecoconception d’usines de procédés et application à la production d’eau potable

Mery, Yoann 14 December 2012 (has links)
Des outils adaptés pour s'attaquer aux problématiques environnementales sont nécessaires mais malheureusement absents de l'industrie. En effet, l'introduction de nouvelles pratiques d'écoconception dans l'industrie des procédés est entravée par le manque de réalisme et de flexibilité des outils associés. Les objectifs principaux de ce travail de recherche étaient le développement d'un outil intégré pour la modélisation de procédés et l'analyse de cycle de vie (PM-LCA), ainsi que la formulation d'une approche méthodologique affiliée pour l'écoconception de procédés. L'outil logiciel et l'approche méthodologique sont appliqués à la production d'eau potable. La revue de la littérature scientifique a permis d'appréhender les efforts de recherche nécessaires ; les principales lignes directrices sont établies en conséquence. L'outil développé, nommé EVALEAU, consiste en une bibliothèque logicielle de modèles de procédés unitaires permettant le calcul d'inventaire de données en fonction de paramètres de procédés. L'outil est embarqué dans le logiciel ACV Umberto® en complément de la base de données Ecoinvent. Une boîte à outils pour l'analyse de sensibilité, basée sur la méthode de Morris, est implémentée pour l'identification des paramètres de procédés ayant une influence majeure sur les résultats d'impacts environnementaux. L'outil EVALEAU est testé sur deux études de cas : deux usines de production d'eau potable existantes. La fiabilité de l'approche est démontrée à travers la comparaison des calculs de qualité de l'eau et de consommations d'énergie et de matériaux avec les données réelles recueillies sur site. Une procédure d'écoconception est expérimentée sur une chaîne de traitement complexe, démontrant ainsi la pertinence des résultats de simulation et l'utilité de l'analyse de sensibilité pour un choix optimal des paramètres opératoires. En conséquence, ce premier outil PM-LCA est censé promouvoir l'introduction de pratiques d'écoconception dans l'industrie de l'eau. / Adapted tools for tackling environmental issues are necessary but still missing in industry. Indeed, the introduction of ecodesign practices in the process industry is hindered by the lack of realism and flexibility of related tools. The main objectives of this research work were the development of a fully integrated tool for Process Modelling and Life Cycle Assessment (PM-LCA), and the formulation of an affiliated methodological approach for process ecodesign. The software tool and the methodological approach are meant to be applied to water treatment technologies. The literature review leads to a better comprehension of the required research efforts, and the main guidelines for the development of the software tool are stated accordingly. The developed tool, named EVALEAU, consists of a library of unit process models allowing life cycle inventory calculation as a function of process parameters. The tool is embedded in the Umberto® LCA software and is complementary to the Ecoinvent database. A sensitivity analysis toolbox, based on the Morris method, was included for the identification of the process parameters mainly affecting the life cycle impact assessment results. The EVALEAU tool was tested on two case studies: two existing drinking water plants. The reliability of the modelling approach was demonstrated through water quality simulation and energy and materials inventory simulation, compared with real data collected on site. An ecodesign procedure was carried out on a complex water treatment chain, demonstrating the relevance of the simulation results and the usefulness of sensitivity analysis for an optimal choice of operating parameters. This first PM-LCA tool is meant to foster the introduction of ecodesign practices in the water industry.
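The Morris screening used in the sensitivity toolbox can be sketched as a one-at-a-time walk computing elementary effects. This is a minimal illustration on a toy model, not EVALEAU's implementation; a production version would average the effects over many random trajectories.

```python
import numpy as np

def elementary_effects(f, x0, delta=0.1):
    """One Morris trajectory: perturb each parameter once by `delta`
    and record the elementary effect (f(x + delta*e_i) - f(x)) / delta."""
    x = np.asarray(x0, dtype=float)
    base = f(x)
    effects = []
    for i in range(len(x)):
        x[i] += delta                 # one-at-a-time step along the walk
        fx = f(x)
        effects.append((fx - base) / delta)
        base = fx
    return effects

# Toy model in which the first parameter clearly dominates:
model = lambda p: 3.0 * p[0] + 0.01 * p[1]
effects = elementary_effects(model, [1.0, 1.0])
# effects[0] is ~3.0 and effects[1] ~0.01, flagging p[0] as influential
```

Parameters with large mean elementary effects are the ones worth tuning first in an ecodesign loop; near-zero effects let the analyst freeze a parameter.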
100

Data Perspectives of Workflow Schema Evolution : Cases of Task Deletion and Insertion

Arunagiri, Aravindhan January 2013 (has links) (PDF)
Dynamic changes in the business environment require business processes to be kept up to date, and the Workflow Management Systems supporting these processes need to adapt to the changes rapidly. Workflow Management Systems, however, lack the ability to dynamically propagate process changes to their process model schemas (workflow templates). The literature on workflow schema evolution emphasizes the impact of changes in control flow, with very little attention to other aspects of a workflow schema. This thesis studies the data aspect (data flow and data model) of a workflow schema during its evolution. Workflow schema changes can lead to inconsistencies between the underlying database model and the workflow. A rather straightforward approach to the problem would be to abandon the existing database model and start afresh; however, this introduces data persistence issues, and there could be significant system downtime involved in migrating data from the old database model to the new one. In this research we develop an approach to address this problem. Business changes demand various types of control flow changes to the business process model (workflow schema), including task insertion, deletion, swapping, movement, replacement, extraction, in-lining, parallelizing, etc. Many control flow changes can be made using a combination of simple task insertions and deletions, while some, such as embedding a task in a loop or conditional branch and parallelizing tasks, also require the addition or removal of control dependencies between tasks. Since many control flow change patterns involve task insertion and deletion at their core, this thesis studies their impact on the underlying data model and proposes algorithms to dynamically handle the resulting changes in the underlying relational database schema. First we identify the basic change patterns that can be implemented using atomic task insertions and deletions. Then we characterize these basic patterns in terms of the data flow anomalies (missing, redundant, conflicting data) they can generate. Data Schema Compliance (DSC) criteria are developed to identify the data changes (i) that make the underlying database schema inconsistent with the modified workflow and (ii) that generate the aforementioned data anomalies. The DSC criteria characterize the change patterns in terms of their ability to work with the current relational data model, and state the properties required of the modified workflow for it to be consistent with the underlying database model. The data of any workflow instance conforming to the DSC criteria can be directly accommodated in the database model. The data anomalies of task insertion and deletion identified using DSC are handled dynamically by the respective data adaptation algorithms, which use the functional dependency constraints in the relational database model to adapt these anomalies. Data changes handled this way, conforming to DSC, can be directly accommodated in the underlying database schema. Hence, with this approach, the workflow can be modified (using task insertion and deletion) and the data changes can be implemented on the fly using the data adaptation algorithms. In this research, the existing data model is evolved rather than abandoned after the workflow schema is modified, which preserves existing data in the database schema. Detailed implementation procedures to deploy the data adaptation algorithms are presented with illustrative examples.
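The "missing data" anomaly that a task deletion can introduce is easy to illustrate. The list-of-tasks representation and the read/write sets below are illustrative assumptions standing in for the thesis's workflow schema and DSC criteria.

```python
# Each task lists the data items it reads and writes, in execution order.
# Task names and data items are hypothetical.
workflow = [
    {"task": "Receive order", "reads": [],                     "writes": ["order"]},
    {"task": "Check credit",  "reads": ["order"],              "writes": ["credit_ok"]},
    {"task": "Ship goods",    "reads": ["order", "credit_ok"], "writes": []},
]

def missing_data_after_deletion(tasks, deleted):
    """Return data items still read by the remaining tasks but written
    by none of them -- the missing-data anomaly a deletion can create."""
    remaining = [t for t in tasks if t["task"] != deleted]
    written = {d for t in remaining for d in t["writes"]}
    read = {d for t in remaining for d in t["reads"]}
    return sorted(read - written)

print(missing_data_after_deletion(workflow, "Check credit"))  # -> ['credit_ok']
```

A compliance check in the spirit of DSC would reject this deletion, or trigger a data adaptation step, because "Ship goods" would be left reading a value no remaining task produces.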
