11

Uma abordagem incremental para mineração de processos de negócio / Incremental approach to business process mining

Kalsing, André Cristiano January 2012
To date, several process mining algorithms have been proposed to extract knowledge from the event logs of applications. The knowledge such algorithms can discover includes business process models and business rules, as well as aspects of the organizational structure, such as the actors and roles involved in the processes. These algorithms can be divided into two groups: non-incremental and incremental. Process mining can benefit from an incremental strategy, especially when the information about a business process embedded in the source code of an information system is logically complex (many branches and parallel or alternative activities). In this scenario, many executions of the application are needed to collect a large enough event log for the mining algorithm to discover and present the complete business process. Incremental mining is also useful when the process structure evolves, typically because the business logic of an application has changed. In this case, the log may contain new traces of activities, modified traces, or traces that are no longer executed. These changes must be reflected in the process model to keep the application (the executed process) and the model synchronized. Incremental process mining can also help when a complete process model must be extracted gradually: partial models (process fragments with a begin and an end) are extracted in a first step, and additional knowledge is integrated into the model in stages until the complete model is obtained. However, existing incremental process mining algorithms do not fully address these aspects. Among their limitations are the failure to remove obsolete elements from the discovered process model after the executed process is updated, and the inability to discover information about the organizational structure associated with the process, such as the actors who perform the activities. This work proposes an incremental algorithm for mining business processes from execution logs. It allows the full update (adding and removing elements) of an existing model, as well as the increment of a process model as new instances are added to the log. In this way, the process model and the executed process are kept synchronized, and the total processing time is reduced, since only new process instances need to be considered. Finally, the algorithm extracts models with accuracy equal to or higher than that of models extracted by current incremental algorithms.
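
As an illustration of the incremental idea (a minimal sketch only, assuming a directly-follows abstraction of the model; the thesis's own algorithm, notation, and update rules are not reproduced here), a miner can fold new traces into an existing model and retract obsolete ones:

    from collections import Counter

    class IncrementalMiner:
        """Toy incremental miner: the 'model' is a directly-follows graph
        updated trace by trace, so only new log instances are processed
        and obsolete flows can be retracted."""

        def __init__(self):
            self.follows = Counter()  # (activity_a, activity_b) -> count

        def add_trace(self, trace):
            """Incorporate one new process instance (a list of activity names)."""
            for a, b in zip(trace, trace[1:]):
                self.follows[(a, b)] += 1

        def remove_trace(self, trace):
            """Retract an obsolete instance; relations no longer supported
            by any trace disappear from the model."""
            for a, b in zip(trace, trace[1:]):
                self.follows[(a, b)] -= 1
                if self.follows[(a, b)] <= 0:
                    del self.follows[(a, b)]

        def model(self):
            return set(self.follows)

    miner = IncrementalMiner()
    miner.add_trace(["register", "check", "approve", "archive"])
    miner.add_trace(["register", "check", "reject", "archive"])
    print(miner.model())  # grows incrementally as instances arrive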
12

Um método para descoberta semi-automática de processos de negócio codificados em sistemas legados / A semi-automatic method to discovery business processes encoded in legacy systems

Nascimento, Gleison Samuel do January 2014
For over a decade, BPM has been introduced in organizations because of its advantages, such as the documentation, management, monitoring, and continuous improvement of business processes. In the BPM approach, the organization's business processes are usually executed under the control of a Business Process Management System. These systems execute the processes and collect information useful to the organization; for example, they make it possible to identify the activities that take longest or consume the most resources. The organization can thus redesign its processes in an agile way, ensuring the continuous evolution of the business. To take advantage of BPM technology, however, the organization must map its business processes and model them in the Business Process Management System. Typically, this mapping is done by human experts through manual techniques such as interviews, meetings with users, questionnaires, document analysis, and observation of the organizational environment. In most organizations, however, the business processes are implemented in legacy information systems. Such systems have little documentation, were developed with obsolete technologies, and their business processes are programmed implicitly in the source code. Thus, in addition to interviewing key users of the organization, analysts must also understand how the legacy systems work in order to identify the organization's business processes. Business analysts generally do this work manually, interpreting the algorithms written in the legacy source code and identifying the workflows encoded there. This work is complex, time consuming, and error prone, since it depends on how well the analysts know the legacy source code. To address this problem, this thesis presents a method that automates the discovery of business processes implemented implicitly in the source code of legacy systems. The method proposes a hybrid technique that combines static analysis of the source code with dynamic analysis (process mining) to discover the business processes encoded in legacy systems. The thesis presents the steps for applying the method, defining for each step a set of tools capable of automating the discovery of information in the legacy source code. The work also reports three case studies in which the method was successfully applied and compared with other techniques from the literature.
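
The static half of such a hybrid technique can be sketched as follows, with Python's ast module standing in for whatever parser the legacy language actually requires; the handle_order function is hypothetical, and the dynamic half (process mining over execution logs) is only hinted at in the final comment:

    import ast

    def static_call_sequence(source: str, func_name: str):
        """Collect, in source order, the function calls made inside one
        function body, as a crude stand-in for extracting a control-flow
        skeleton of candidate activities from legacy code."""
        calls = []

        class Collector(ast.NodeVisitor):
            def visit_Call(self, node):
                if isinstance(node.func, ast.Name):
                    calls.append(node.func.id)
                self.generic_visit(node)

        for node in ast.walk(ast.parse(source)):
            if isinstance(node, ast.FunctionDef) and node.name == func_name:
                Collector().visit(node)
        return calls

    legacy = """
    def handle_order():
        validate_customer()
        if check_stock():
            reserve_items()
        invoice()
    """
    # Static skeleton of candidate activities; dynamic analysis (process
    # mining over execution logs) would then reveal which branches are
    # actually taken and how often.
    print(static_call_sequence(legacy, "handle_order"))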
13

Service Oriented Architecture & Web Services : Guidelines for Migrating from Legacy Systems and Financial Consideration

Adeyinka, Oluwaseyi January 2008
The purpose of this study is to present guidelines that can be followed when introducing service-oriented architecture (SOA) through the use of Web services. These guidelines will be especially useful for organizations migrating from existing legacy systems, where the financial implications of such an investment must also be weighed to decide whether it is worthwhile. The proposed implementation guide aims to increase the chances that IT departments successfully integrate SOA into their systems and secure strong financial commitment from executive management. Service-oriented architecture is a relatively new way of looking at a system, and it can be implemented through several methods, of which Web services is one platform. Since it is a developing technology, organizations need to be cautious about how they implement it in order to obtain maximum benefit. Although a well-designed service-oriented environment can simplify and streamline many aspects of information technology and business, achieving this state is not easy. Traditionally, management finds it difficult to justify the considerable cost of modernization, let alone shoulder the risk, without some demonstrated benefit in terms of business value. The study identifies common best practices for implementing SOA with Web services, and the steps needed to migrate successfully from legacy systems to componentized or service-enabled systems. It also identifies how to present the financial return on investment and business benefits to management in order to secure the necessary funds. This master's thesis is based on a study of academic literature, professional research journals and publications, and interviews with business organizations currently working on service-oriented architecture. I present guidelines, derived from an analysis comparing the information sources mentioned above, that can assist in migrating from legacy systems to a service-oriented architecture.
14

Automated Culling of Data in a Relational Database for Archiving

Nilsson, Simon January 2019
Background. Archiving legacy information systems is challenging. When there is no way to extract the information in a structured form, the last resort is to save the database. Optimally, only the relevant information should be saved and the rest removed. Objectives. The goal is to develop a method for assisting the archivist in culling a database before archiving. The method should be described as rules defining how the tables can be identified. Methods. To get an overview of how the process works today and what archivists think can be improved, a number of interviews with experts in database archiving are conducted. The results of the interviews are then analysed, together with test databases, to define rules that can be used in the general case. The rules are implemented in a prototype that is tested and evaluated to verify that the method works. Results. The results indicate that the algorithm is both faster and able to exclude more irrelevant tables than a person could with the manual method. An algorithm for finding candidate keys has also been improved to decrease the number of tests and the worst-case execution time. Conclusions. The evaluation indicates that the method works as intended while reducing the archivist's workload. More work should be done to improve the method further.
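
The baseline problem behind the improved candidate-key algorithm can be sketched as below; this naive version (hypothetical data, in-memory rows rather than a live database) enumerates column combinations and prunes only supersets of keys already found, which is exactly the worst-case cost such improvements target:

    from itertools import combinations

    def candidate_keys(rows, columns):
        """Naive candidate-key search: return the minimal column sets whose
        value combinations are unique across all rows. This is the costly
        baseline; a practical tool prunes the search far more aggressively."""
        keys = []
        for size in range(1, len(columns) + 1):
            for combo in combinations(columns, size):
                if any(set(key) <= set(combo) for key in keys):
                    continue  # a subset already is a key, so combo is not minimal
                projected = [tuple(row[c] for c in combo) for row in rows]
                if len(set(projected)) == len(rows):
                    keys.append(combo)
        return keys

    rows = [
        {"id": 1, "email": "a@x.se", "name": "Ann"},
        {"id": 2, "email": "b@x.se", "name": "Bo"},
        {"id": 3, "email": "b@x.se", "name": "Ann"},
    ]
    print(candidate_keys(rows, ["id", "email", "name"]))
    # -> [('id',), ('email', 'name')]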
15

Outsmarting Passwords : An Analysis on the use of Smart Cards for Single Sign-On in Legacy Applications / Singelinloggning i Legacysystem : En Studie i Användandet av Smarta Kort för Singelinloggning i Legacysystem

Tingström, Alexander January 2017
By leveraging smart cards as bearers of user credentials for legacy applications, the security of these systems can be greatly increased. This thesis proposes a solution to the problem of legacy applications that only allow username-password authentication. Storing user data encrypted on the card and automatically serving it to the requesting application allows automatically generated, strong, deliberately non-memorable passwords to be used. The backbone of this system is developed, and a security analysis of the solution finds that it results in a significantly increased level of security.
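
A minimal sketch of the credential flow, with the smart card simulated by a Fernet key that in reality would never leave the card hardware and its PIN protection; legacy_app_login is a hypothetical stand-in for the component that fills the legacy login form:

    import secrets
    from cryptography.fernet import Fernet  # pip install cryptography

    def legacy_app_login(username: str, password: str) -> None:
        # Hypothetical stand-in for the agent that fills the legacy login form.
        print(f"logging in {username} with a {len(password)}-char password")

    # Enrolment: generate a strong random password the user never sees or types;
    # the encryption key stands in for secrets protected by the card and its PIN.
    card_key = Fernet.generate_key()
    card = Fernet(card_key)
    password = secrets.token_urlsafe(32)
    blob = card.encrypt(f"alice:{password}".encode())

    # Login: decrypt the stored credentials and serve them to the application.
    username, pw = card.decrypt(blob).decode().split(":", 1)
    legacy_app_login(username, pw)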
16

A Behavioral Test Strategy For Board Level Systems

Hameed, Qaisar 13 March 1999
A digital board typically contains a heterogeneous mixture of chips: microprocessors, memory, control, and I/O logic. Different testing techniques are needed for each of these components, and to test the whole board these techniques must be integrated into an overall testing strategy. In this thesis, we apply a behavioral testing scheme to the board. Each component chip is tested by observing the behavior of the system in response to test code; the component under test is not isolated from the rest of the circuit during the test. This obviates the extra hardware for isolating chips that structural testing requires, at the cost of reduced fault location, although fault detection remains adequate. We apply the "start small" approach to behavioral testing: we start by testing a small core of functions, then use only the functions already tested to test the remaining behavior. The grand goal of testing the whole board is divided into goals for testing each individual chip, which are further subdivided into sub-goals for each sub-function or for the most common faults in a component. Each component is tested in turn; once a component passes, it is added to a passed-items set and can then be used in testing the remaining components. The start-small approach helps isolate faults to the chip level, and thus gives better fault location than a simple behavioral scheme that has no passed-items set. As an example, this testing approach is applied to a microcontroller-based temperature sensor board. The test code is run on a VHDL model of the system, built from Synopsys SmartModel library components, and then on the actual system. Faults are injected into the system and the performance of the strategy is evaluated. The strategy is found to be very effective in detecting internal chip faults and locating them to the chip level. Interconnection faults are difficult to locate, although they are detected in most cases. Different scenarios for incorporating this scheme in legacy systems are also discussed. / Master of Science
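
The "start small" scheduling idea can be sketched as follows, with the board simulated in Python; component names, prerequisites, and the always-passing test stubs are hypothetical, and on real hardware each test function would drive behavioral test code through the already-trusted parts:

    def start_small(tests):
        """tests maps component -> (prerequisites, test function). A component
        is tested only after all of its prerequisites have passed, so a
        failure isolates to the newly tested chip."""
        passed, failed = set(), set()
        pending = set(tests)
        while pending:
            ready = [c for c in sorted(pending) if tests[c][0] <= passed]
            if not ready:
                break  # the rest depend on failed or untestable components
            comp = ready[0]
            pending.remove(comp)
            (passed if tests[comp][1]() else failed).add(comp)
        return passed, failed, pending

    # Hypothetical board: trust a small CPU core first, then use it to test the rest.
    board = {
        "cpu_core": (set(), lambda: True),
        "ram": ({"cpu_core"}, lambda: True),
        "io_logic": ({"cpu_core"}, lambda: True),
        "sensor": ({"cpu_core", "io_logic"}, lambda: True),
    }
    print(start_small(board))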
17

The Impact of Refactoring Legacy Systems on Code Quality Metrics

Jonsson, Alan January 2017
Maintaining legacy code can be difficult, and it can also be hard to know whether what you are doing to maintain the code is actually beneficial. This paper examines the impact of refactoring on code quality by systematically applying proven refactoring techniques and measuring their effect through code metrics. As an application at Xylem was being modified, it was simultaneously refactored. Before any refactoring was done, code quality was measured by maintainability index, cyclomatic complexity, depth of inheritance, class coupling, and lines of code. Some parts of the application were re-written, others were refactored. Measuring the same metrics after the refactoring showed that they had improved, indicating that code quality had increased.
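
For reference, one widely used maintainability-index formula (the rescaled Oman-Hagemeister variant popularized by Visual Studio; the paper may compute it differently) combines Halstead volume, cyclomatic complexity, and lines of code:

    import math

    def maintainability_index(halstead_volume, cyclomatic_complexity, loc):
        """Maintainability index: a 0-100 rescaling of the classic
        Oman-Hagemeister formula (the Visual Studio variant).
        Higher means more maintainable."""
        raw = (171
               - 5.2 * math.log(halstead_volume)
               - 0.23 * cyclomatic_complexity
               - 16.2 * math.log(loc))
        return max(0.0, raw * 100 / 171)

    # Made-up numbers for one long, branchy method before and after refactoring:
    print(round(maintainability_index(3500, 25, 400), 1))  # before: ~15.1
    print(round(maintainability_index(600, 6, 90), 1))     # after:  ~37.1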
18

Integrace legacy databází do soudobých informačních systémů / Integration of legacy databases to current information systems

Navrátil, Jan January 2016
The goal of this thesis is to design and implement a framework to support legacy-system access. Legacy systems here are databases that use incompatible and obsolete technologies and cannot easily be abandoned. The framework abstracts the application logic from the database platform and will enable full or incremental migration to a new, modern platform in the future. It also considers the option of encapsulating an existing legacy application so that it can be included in the new system as a black box. A system based on the proposed framework has been successfully deployed in a company, where it facilitated migration to a new information system with an entirely different database platform. This experience shows the viability of the framework design.
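
The abstraction such a framework provides can be sketched as a ports-and-adapters arrangement; the names and dict-backed "connections" below are hypothetical, not the thesis's actual API:

    from abc import ABC, abstractmethod

    class CustomerStore(ABC):
        """Port the application logic depends on; no database details leak through."""
        @abstractmethod
        def find(self, customer_id: int) -> dict: ...

    class LegacyDbStore(CustomerStore):
        """Adapter over the obsolete platform (faked here by a dict)."""
        def __init__(self, legacy_conn):
            self.conn = legacy_conn
        def find(self, customer_id):
            return self.conn[customer_id]  # real code: legacy SQL dialect or API

    class ModernDbStore(CustomerStore):
        """Drop-in replacement once the data is migrated; callers never change."""
        def __init__(self, modern_conn):
            self.conn = modern_conn
        def find(self, customer_id):
            return self.conn[customer_id]

    def greeting(store: CustomerStore, customer_id: int) -> str:
        return "Hello, " + store.find(customer_id)["name"]  # platform-agnostic logic

    print(greeting(LegacyDbStore({1: {"name": "Jan"}}), 1))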
19

Fatores críticos de sucesso para integração com sistemas legados: um estudo de caso no SERPRO / Critical success factors for integration with legacy systems: A SERPRO case study

Anastácio, Wellington Montefusco 18 March 2014
For citizens to request government services through a portal that centralizes service delivery and does not require prior knowledge of the administrative structure behind those services, the government must face a challenging question: what are the few areas in integration with legacy systems whose inadequate treatment will necessarily cause the portal implementation to fail? Since information systems are strongly tied to the business processes they serve, the answer is not purely technological. This work sought to identify the few areas critical to successful integration with legacy systems in government, and explanations that help us understand why they are critical. Knowing and understanding these results helps overcome barriers to implementing the centralizing portal that will facilitate citizen self-service and advance electronic government. The strategy used to achieve these objectives was a case study of the public organization Serviço Federal de Processamento de Dados (SERPRO). The study was exploratory and descriptive in nature, and the organization is relevant because it has met government demands for information systems at the municipal, state, and federal levels for nearly five decades. The case study had two phases: (1) a survey to identify the critical success factors, including a factor analysis to identify the critical dimensions that summarize the factors found; and (2) semi-structured interviews with ten professionals, chosen because they held extreme and opposite perceptions relative to the average perception of each critical dimension. All interviews were transcribed and categorized through thematic analysis. The survey yielded 106 respondents, and more than 12 hours of transcribed content were analyzed. The results show that successful integration of information systems with legacy systems in government is strongly tied to four dimensions: (1) effectiveness of technological and human resources, because the complexity of the legacy system was identified as the most relevant cause of failure in projects of this kind; (2) an uncertainty-minimizing process, because distortions in communication and the contingencies that arise during the project demand very high communication quality; (3) priority power, because conflicts of interest are frequent and it is critical to have the power and resources to resolve issues such as an external team's lack of priority; and (4) clarity of the business need, because this clarity gives the development team the necessary confidence in the consistency of the system-integration solution.
