1

Simulação de logs de eventos com foco na análise de processos de construção na indústria naval brasileira / Event log simulation with focus on analysing processes from the brazilian shipbuilding industry

Maciel, Thales Vaz January 2016
The need to improve the competitiveness of the Brazilian shipbuilding industry's shipyards relative to their international competitors is a long-standing issue. The low efficiency observed in this sector of the industry is largely caused by the low priority given to technology, for example automated methodologies for controlling and diagnosing construction processes. In this context, process mining has been consolidated as a solution for model discovery, conformance analysis and process enhancement. However, these activities are far from trivial, the main problem being the quality of the data contained in event logs. This work proposes a methodology for improving the quality of event logs originally characterized by low granularity of activities in their quantitative and temporal aspects, using probability distributions and implementing new software capable of synthesising a new event log free of such quality problems. A case study was carried out in a Brazilian shipyard, enabling process model discovery experiments with both free and proprietary algorithms, as well as the use of an animation tool for detecting bottlenecks in the process. These experiments were run both on the original event log provided by the shipyard and on the synthetic event log generated by the simulation software, in order to validate the presented approach. The results succeeded in revealing the hidden fragmentation of the activities, enabling the discovery of trustworthy process models and setting a precedent for future work.
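The core idea above, splitting coarse aggregated records into individual events with sampled timestamps, can be sketched in a few lines of Python. The column names, the uniform distribution and the shipyard activity are illustrative assumptions, not the thesis's actual implementation:

```python
# A minimal sketch, assuming aggregated records of the form
# (activity, start, end, quantity) and uniform sampling; a fitted
# distribution could be substituted once estimated from real data.
import csv
import random
from datetime import datetime, timedelta

def synthesize_events(case_id, activity, start, end, quantity, rng):
    """Expand one aggregated record (quantity units finished somewhere
    between start and end) into quantity single-unit events."""
    span = (end - start).total_seconds()
    events = []
    for i in range(quantity):
        offset = rng.uniform(0, span)  # sample a plausible finish time
        events.append({
            "case_id": f"{case_id}-{i+1}",
            "activity": activity,
            "timestamp": (start + timedelta(seconds=offset)).isoformat(),
        })
    events.sort(key=lambda e: e["timestamp"])
    return events

rng = random.Random(42)  # fixed seed so the synthetic log is reproducible
log = synthesize_events(
    case_id="HULL-BLOCK-07",       # hypothetical case identifier
    activity="panel_welding",      # hypothetical shipyard activity
    start=datetime(2016, 3, 1),
    end=datetime(2016, 3, 8),
    quantity=5,
    rng=rng,
)
with open("synthetic_log.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["case_id", "activity", "timestamp"])
    writer.writeheader()
    writer.writerows(log)
```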
2

Using Event logs and Rapid Ethnographic Data to Mine Clinical Pathways

January 2020
Background: Process mining (PM) using event log files is gaining popularity in healthcare to investigate clinical pathways, but it poses unique challenges. Clinical pathways (CPs) are often complex and unstructured, which results in spaghetti-like models. Moreover, the log files collected from the electronic health record (EHR) often contain noisy and incomplete data. Objective: Building on the traditional process mining technique of using event logs generated by an EHR, observational video data from rapid ethnography (RE) were combined to model, interpret, simplify and validate the perioperative (PeriOp) CPs. Method: The data collection and analysis pipeline consisted of the following steps: (1) obtain RE data, (2) obtain EHR event logs, (3) generate the CP from RE data, (4) identify EHR interfaces and functionalities, (5) analyze EHR functionalities to identify missing events, (6) clean and preprocess event logs to remove noise, (7) use PM to compute CP time metrics, (8) further remove noise by removing outliers, (9) mine the CP from event logs and (10) compare the CPs resulting from RE and PM. Results: Four provider interviews, 1,917,059 event log entries and 877 minutes of video ethnography recording EHR interactions were collected. When mapping event logs to EHR functionalities, the intraoperative (IntraOp) event logs were more complete (45%) than the preoperative (35%) and postoperative (21.5%) event logs. After removing the noise (496 outliers) and calculating the duration of the PeriOp CP, the median was 189 minutes and the standard deviation was 291 minutes. Finally, RE data were analyzed to help identify the most clinically relevant event logs and simplify the spaghetti-like CPs resulting from PM. Conclusion: The study demonstrated the use of RE to help overcome the challenges of automatic discovery of CPs. It also demonstrated that RE data can be used to identify relevant clinical tasks and incomplete data, remove noise (outliers), simplify CPs and validate mined CPs. / Dissertation/Thesis / Masters Thesis Computer Science 2020
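Steps 6-8 of the pipeline above (cleaning, computing time metrics, removing outliers) could look roughly like the following pandas sketch; the CSV layout and the IQR-based outlier rule are assumptions, since the abstract does not specify the exact filtering used:

```python
# A hedged sketch: compute each case's duration from an EHR event log,
# then drop outlier cases with the interquartile-range rule.
import pandas as pd

events = pd.read_csv("periop_events.csv", parse_dates=["timestamp"])

# Duration of each perioperative case: last event minus first event.
durations = (
    events.groupby("case_id")["timestamp"]
    .agg(lambda ts: (ts.max() - ts.min()).total_seconds() / 60.0)
    .rename("duration_min")
)

# Flag outliers outside 1.5 * IQR of the duration distribution.
q1, q3 = durations.quantile([0.25, 0.75])
iqr = q3 - q1
mask = durations.between(q1 - 1.5 * iqr, q3 + 1.5 * iqr)

clean = durations[mask]
print(f"removed {int((~mask).sum())} outlier cases")
print(f"median: {clean.median():.0f} min, std: {clean.std():.0f} min")
```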
3

Goal-oriented Process Mining

Ghasemi, Mahdi 05 January 2022
Context: Process mining is an approach that exploits event logs to discover real processes executed in organizations, enabling them to (re)design and improve process models. Goal modelling, on the other hand, is a requirements engineering (RE) approach mainly used to analyze what-if situations and support decision making.

Problem: Common problems with process mining include the complexity of discovered "spaghetti" processes and a lack of goal-process alignment. Current process mining practices mainly focus on activities and do not benefit from considering stakeholder goals and requirements to manage complexity and alignment. The critical artifact that process mining practices rely on is the event log. However, using a raw version of real-life event logs will typically result in process models that are too complex, unstructured, difficult to understand and, above all, not aligned with stakeholders' goals.

Method: Involving goal-related factors can augment the precision and interpretability of mined models and help discover better opportunities to satisfy stakeholders. This thesis proposes three algorithms for goal-oriented process enhancement and discovery (GoPED) that show the synergetic effects achievable by combining process mining and goal-oriented modelling. With GoPED, good historical experiences are found within the event log to serve as a basis for inferring good process models, and bad experiences are found to discover models to avoid. Goodness is defined as alignment with three categories of goal-related criteria:

• Case perspective: satisfaction of individual cases (e.g., patient, customer) in terms of some goals;
• Goal perspective: overall satisfaction of some goals (e.g., to decrease waiting time) rather than individual cases; and
• Organization perspective: a comprehensive satisfaction level for all goals over all cases.

GoPED first adds goal-related attributes to conventional event characteristics (case identifier, activities, and timestamps), selects a subset of cases according to goal-related criteria, and finally discovers a process model from that subset. For each criterion, an algorithm is developed to select the best subset of cases in which the criterion holds. The resulting process models are expected to reproduce the desired level of satisfaction. The three GoPED algorithms were implemented in a Python tool. In addition, three other tools were implemented to complete a pipeline whose input is a raw event log and whose output is a subset of the event log selected with respect to the goal-related criteria. GoPED was used on real healthcare event logs (an illustrative example and a case study) to discover processes, and the performance of the tools was also assessed.

Results: The performance of the GoPED toolset for various sizes and configurations of event logs was assessed through extensive experiments. The results show that the three GoPED algorithms are practical and scalable for event logs of realistic sizes and configurations. The GoPED method was also applied to discover processes from the raw event log of the trajectories of patients with sepsis in a Dutch hospital, from their registration in the emergency room until their discharge. Although the raw data does not explicitly include goal-related information, some reasonable goals were derived from the data and a related research paper in consultation with a healthcare expert. The method was applied, and the resulting models were i) substantially simpler than the model discovered from the whole event log, ii) free from the drawbacks that using the whole event log causes, and iii) aligned with the predefined goals. A selection sketch follows this entry.

Conclusion: GoPED demonstrates the benefits of exploiting goal modelling capabilities to enhance event logs and select a subset of events to discover goal-aligned and simplified process models. The resulting process model can also be compared to a model discovered from the original event log to reveal new insights about the ability of different forms of process models to satisfy the stakeholders' goals. Learning from good behaviours that satisfy goals and detecting bad behaviours that hurt them is an opportunity to redesign models so that they are simpler, better aligned with goals, and free from the drawbacks that using the whole event log may cause.
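As a rough illustration of GoPED's case-perspective selection, the sketch below keeps only the cases of an event log that satisfy a goal threshold before discovery is run on the filtered subset; the goal attribute, threshold and file names are hypothetical, and GoPED's actual algorithms are more involved:

```python
# A minimal sketch, assuming each event row already carries a
# goal-related attribute (waiting_min). Cases whose total waiting time
# stays under a target are kept as "good experiences" for discovery.
import pandas as pd

events = pd.read_csv("sepsis_events.csv", parse_dates=["timestamp"])

GOAL_MAX_WAITING_MIN = 240  # goal: total waiting time per case under 4 h

waiting_per_case = events.groupby("case_id")["waiting_min"].sum()
good_cases = waiting_per_case[waiting_per_case <= GOAL_MAX_WAITING_MIN].index

# Keep only events of goal-satisfying cases; a discovery algorithm
# would then be applied to this filtered log.
good_log = events[events["case_id"].isin(good_cases)]
good_log.to_csv("sepsis_good_cases.csv", index=False)
print(f"kept {good_cases.size} of {waiting_per_case.size} cases "
      "for goal-aligned discovery")
```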
4

Comparison of adversary emulation tools for reproducing behavior in cyber attacks / Jämförelse av verktyg för motståndaremulering vid återskapande av beteenden i cyberattacker

Elgh, Joakim January 2022
As cyber criminals can find many different ways of gaining unauthorized access to systems without being detected, it is highly important for organizations to monitor what is happening inside their systems. Adversary emulation is a way to mimic the behavior of advanced adversaries in cyber security, which can be used to test an organization's ability to detect malicious behavior within its systems. The emulated behavior can be based on what has been observed in real cyber attacks; open source knowledge bases such as MITRE ATT&CK collect this kind of intelligence. Many organizations have in recent years developed tools to simplify emulating the behavior of known adversaries. These tools are referred to as adversary emulation tools in this thesis. The purpose of this thesis was to evaluate how noisy different adversary emulation tools are. This was done through measurements of the amount of event logs generated by Sysmon when performing emulations against a Windows system. The goal was to find out which tool was the least noisy. The adversary emulation tools included in this thesis were Invoke-AtomicRedTeam, CALDERA, ATTPwn and Red Team Automation. To make sure the correlation between the adversary emulation tools and the generated event logs could be identified, a controlled experiment was selected as the method for the study. Five experiments were designed, each consisting of one emulation scenario executed by the adversary emulation tools included in that experiment. After each emulation, event logs were collected, filtered and measured for use in the comparison. Three experiments compared Invoke-AtomicRedTeam, CALDERA and a manual emulation. Their results indicated that Invoke-AtomicRedTeam was the noisiest, followed by CALDERA, while the manual emulation was the least noisy. On average, the manual emulation generated 83.9% fewer logs than Invoke-AtomicRedTeam and 78.4% fewer logs than CALDERA in experiments 1-3. A fourth experiment compared Red Team Automation and Invoke-AtomicRedTeam, where Red Team Automation was the less noisy tool. The final, fifth experiment compared ATTPwn and CALDERA, and the results indicated that these were similarly noisy but in different ways. It was also concluded that a main difference between the adversary emulation tools was the number of techniques available, which could limit the ability to emulate the behavior of real adversaries. However, as the emulation tools were implemented in different ways, this thesis could be one starting point for future development of silent adversary emulation tools or assist in selecting an existing adversary emulation tool.
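The measurement step described above, counting the Sysmon events that fall inside each emulation run's time window, might be scripted as in the following sketch; the CSV export format, the UtcTime column and the run times are assumptions about how the logs were collected:

```python
# A hedged sketch, assuming the Sysmon operational log was exported to
# CSV with a parseable UtcTime column, and that start/end times were
# recorded for each tool's emulation run.
import pandas as pd

sysmon = pd.read_csv("sysmon_export.csv", parse_dates=["UtcTime"])

# Hypothetical emulation windows for three of the compared setups.
runs = {
    "Invoke-AtomicRedTeam": ("2022-03-01 10:00", "2022-03-01 10:20"),
    "CALDERA":              ("2022-03-01 11:00", "2022-03-01 11:20"),
    "manual":               ("2022-03-01 12:00", "2022-03-01 12:20"),
}

# Count events per window as a simple noise measure per tool.
for tool, (start, end) in runs.items():
    in_window = sysmon["UtcTime"].between(pd.Timestamp(start),
                                          pd.Timestamp(end))
    print(f"{tool}: {int(in_window.sum())} Sysmon events")
```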
5

Scalable Performance Assessment of Industrial Assets: A Data Mining Approach

Dagnely, Pierre 21 June 2019
Nowadays, more and more industrial assets are continuously monitored and generate vast amounts of event logs and sensor data. Data mining is the field concerned with the exploration and exploitation of these data. Although data mining has been researched for decades, event log data are still underexploited in most data mining workflows, even though they could provide valuable insights into asset behavior, as they represent the internal processes of an asset. However, exploiting event log data is challenging, mainly because: 1) event labels are not consistent across manufacturers; 2) assets report vast amounts of data, of which only a small part may be relevant; 3) textual event logs and numerical sensor data are usually processed by methods dedicated to either textual data or sensor data, and methods combining both types of data are still missing; 4) industrial data are rarely labelled, i.e. there is no indication of the actual performance of the asset, which has to be derived from other sources; 5) the meaning of an event may vary depending on the events sent before or after it. Concretely, this thesis is concerned with the conception and validation of an integrated data processing framework for scalable performance assessment of industrial asset portfolios. This framework is composed of several advanced methodologies facilitating the exploitation of both event logs and time series sensor data: 1) an ontology model describing the photovoltaic (the validation domain) event system, allowing the integration of heterogeneous events generated by various manufacturers; 2) a novel and computationally scalable methodology enabling automatic calculation of an event relevancy score without any prior knowledge; 3) a semantically enriched multi-level pattern mining methodology enabling data exploration and hypothesis building across heterogeneous assets; 4) an advanced workflow extracting performance profiles by combining textual event logs and numerical sensor values; 5) a scalable methodology allowing rapid annotation of new asset runs with a known performance label based only on the event log data. The framework has been exhaustively validated on real-world data from PV plants, provided by our industrial partner 3E. However, the framework has been designed to be domain agnostic and can be adapted to other industrial assets reporting event logs and sensor data. / Doctorate in Engineering Sciences and Technology
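As a loose sketch of item 2, a relevancy score computed without prior knowledge could weight event labels by rarity across assets, in the spirit of inverse document frequency; this particular weighting is an assumption for illustration, not the thesis's actual formula:

```python
# A minimal sketch: events reported by every asset carry little
# information, while rarer ones score higher (IDF-style weighting).
import math
from collections import defaultdict

# asset id -> sequence of reported event labels (toy data)
logs = {
    "INV-001": ["grid_ok", "grid_ok", "fan_fault", "grid_ok"],
    "INV-002": ["grid_ok", "comm_loss"],
    "INV-003": ["grid_ok", "fan_fault", "grid_ok"],
}

# For each event label, collect the set of assets reporting it.
assets_per_event = defaultdict(set)
for asset, events in logs.items():
    for label in events:
        assets_per_event[label].add(asset)

n_assets = len(logs)
for label, assets in sorted(assets_per_event.items()):
    score = math.log(n_assets / len(assets))  # 0 when every asset reports it
    print(f"{label}: relevancy={score:.3f}")
```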
6

A Hybrid Methodology In Process Modeling: From-to Chart Based Process Discovery

Esgin, Eren 01 February 2009
The management of complex business processes, which change due to globalization, calls for the development of powerful information systems that offer generic process modeling and process execution capabilities. Even though contemporary information systems are more and more utilized in enterprises, their actual impact in automating complex business processes is still limited by the difficulties encountered in the design phase. This design phase is time-consuming, often subjective and incomplete. In the scope of this study, a reverse approach is followed. Instead of starting with process design, the method of discovering interesting patterns from navigation traces is taken as a basis, and a new data analysis methodology named "From-to Chart Based Process Discovery" is proposed. In this hybrid methodology, the "from-to chart", which is fundamentally dedicated to material handling issues on the production floor, is used as the front end to monitor the transitions among activities of a realistic event log and convert these raw relations into an optimum activity sequence. Then a revised version of process mining, the back end of this methodology, upgrades the optimum activity sequence into a process model.
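A from-to chart of the kind used as the front end here is essentially a matrix of direct-transition counts between consecutive activities; the toy traces in this sketch are illustrative, and the thesis then derives an optimum activity sequence from such counts:

```python
# A minimal sketch: count direct activity transitions per case.
from collections import Counter
from itertools import pairwise  # Python 3.10+

traces = [  # each list is one case's activity sequence, in order
    ["receive", "check", "assemble", "ship"],
    ["receive", "assemble", "check", "ship"],
    ["receive", "check", "assemble", "ship"],
]

from_to = Counter()
for trace in traces:
    for a, b in pairwise(trace):  # consecutive (from, to) pairs
        from_to[(a, b)] += 1

# The most frequent transitions hint at the dominant activity order.
for (a, b), n in from_to.most_common():
    print(f"{a} -> {b}: {n}")
```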
7

A Comparative Study of Process Mining Software for Supporting Bottleneck Analysis of Production Systems

Vaithianathan, Suntharaseelan January 2021
Process mining (PM) has already found its place in analyzing and visualizing business processes across different sectors such as banking, healthcare, insurance and logistics. However, its application to production systems is very limited. Production systems are very complex and dynamic, hindering the accurate capture of different process chain events, and generating event log data from the recorded events requires a proper understanding of the processes. The results of bottleneck analysis can help production experts optimize the process. This study analyzes three to four PM tools to select an appropriate tool for performing bottleneck analysis. The initial part of the research focuses on various features of the PM tools, such as modeling support, user-friendliness, flexibility, stability, animation support and BPMN modeling support. The feature analysis uses event log data generated by simulating a process model created in the simulation-based optimization tool FACTS Analyzer. Some PM tools can generate a BPMN model from the process model; animation support and modeling support are vital features for bottleneck analysis. A PM tool's ability to create the process model automatically from event log data can save the time otherwise spent on data collection through physical site visits, interviews, observations, etc. The bottleneck analysis itself used both simulated event log data and an event log generated from a real-world production system. The study focuses on finding out whether PM can support the bottleneck analysis of real-world production systems and on identifying an appropriate PM tool available on the market for this application. Using process mining tools for bottleneck analysis requires human intervention in identifying the bottlenecks, as the PM tools are incapable of recognizing them on their own. However, appropriate use of PM tools in bottleneck analysis can save a considerable amount of time compared to manually building a simulation model, and performing iterative optimization could contribute to social, economic and environmental sustainability, as it helps avoid waste of time, material and energy.
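One common bottleneck indicator that PM tools animate is the average delay between consecutive events of a case, aggregated per activity transition; the sketch below computes it with pandas, under assumed column names for the simulated export:

```python
# A hedged sketch: the transition with the longest mean waiting time
# is a likely bottleneck candidate (still requiring expert judgment,
# as the abstract notes).
import pandas as pd

events = (pd.read_csv("production_events.csv", parse_dates=["timestamp"])
            .sort_values(["case_id", "timestamp"]))

# For each event, look up the next activity and timestamp of the same case.
events["next_activity"] = events.groupby("case_id")["activity"].shift(-1)
events["delay_min"] = (
    events.groupby("case_id")["timestamp"].shift(-1) - events["timestamp"]
).dt.total_seconds() / 60.0

waits = (events.dropna(subset=["next_activity"])
               .groupby(["activity", "next_activity"])["delay_min"]
               .mean()
               .sort_values(ascending=False))
print(waits.head(5))  # transitions with the longest average waits
```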
8

Modelo de evaluación de métricas de control para procesos de negocio utilizando Process Mining / Control Metrics Evaluation Model for Business Processes using Process Mining

García Oliva, Rodrigo Alfonso, Santos Barrenechea, Jesús Javier 24 October 2020
This project aims to analyze the complexity of business processes in retail companies at a depth that other techniques make very difficult or even impossible to reach. With process mining it is possible to close this gap, and that is what we want to demonstrate through the implementation of a process mining model. The project proposes a process mining model that accounts for the presence of multiple sources of information about a logistics process in a retail company and applies the three phases of process mining (Discovery, Conformance and Improvement). Additionally, a diagnostic phase is proposed, which details a set of control metrics to evaluate the logistics process and thereby generate an improvement plan giving guidelines for optimizing the process based on what has been analyzed with this technique. The model was implemented in a Peruvian retail company (TopiTop S.A.) to analyze the logistics process, specifically the management of purchase orders. Applying the model and evaluating the proposed metrics identified anomalies in the process through each of the phases of the proposed model, ensuring the quality of the analysis in the pre-processing phase, generating the process model and deriving control metrics using the open source tool ProM Tools. / Thesis
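Two control metrics of the kind the diagnostic phase proposes, cycle time per purchase order and the share of orders following the expected activity path, might be computed as in this sketch; the expected path and column names are hypothetical:

```python
# A minimal sketch over a purchase-order event log with columns
# order_id, activity, timestamp (assumed layout).
import pandas as pd

events = (pd.read_csv("purchase_orders.csv", parse_dates=["timestamp"])
            .sort_values(["order_id", "timestamp"]))

# Hypothetical reference path for a conforming purchase order.
EXPECTED = ["create_po", "approve_po", "receive_goods", "pay_invoice"]

# Metric 1: cycle time in days, first event to last event per order.
cycle = events.groupby("order_id")["timestamp"].agg(
    lambda ts: (ts.max() - ts.min()).days)

# Metric 2: fraction of orders whose trace matches the expected path.
conforming = events.groupby("order_id")["activity"].agg(
    lambda acts: list(acts) == EXPECTED)

print(f"median cycle time: {cycle.median():.1f} days")
print(f"conforming orders: {100 * conforming.mean():.1f}%")
```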
