391 |
MARDI: Marca d'água Digital Robusta via Decomposição de Imagens: uma proposta para aumentar a robustez de técnicas de marca d'água digital / MARDI: Robust Digital Watermarking by Image Decomposition: a proposal to increase the robustness of digital watermarking techniques / Lopes, Ivan Oliveira 27 July 2018 (has links)
Previous issue date: 2018-07-27 / Com a crescente evolução dos equipamentos eletrônicos, muitos dados digitais têm sido produzidos, copiados e distribuídos com facilidade, gerando uma grande preocupação com a sua segurança. Dentre as várias técnicas usadas para proteger os dados digitais, tem-se as técnicas de inserção e extração de marca d'água em imagens digitais. Uma marca d'água pode ser qualquer informação como, por exemplo, um código, um logotipo, ou uma sequência aleatória de letras e números, que vise garantir a autenticidade e a proteção dos direitos autoriais dos dados. Neste trabalho, estudou-se sobre as técnicas existentes de inserção e extração de marca d'água digital, abordando desde seu conceito até o desenvolvimento de algoritmos de inserção e extração de marca d'água em imagens digitais. Desenvolveu-se um método para aumentar a robustez de técnicas de marca d'água digital pela decomposição da imagem em duas partes: estrutural (áreas homogêneas) e de detalhes (áreas com ruídos, texturas e bordas). A marca d'água é inserida na parte de detalhes por se tratar de áreas menos afetadas por operações de processamento digital de imagens. Os resultados mostraram que o método proposto aumentou a robustez das técnicas de marca d'água testadas. Baseado nos resultados obtidos, desenvolveu-se uma nova técnica de marca d'água digital, utilizando a transformada discreta de wavelets, a decomposição de imagens e a transformada discreta do cosseno. / With the increasing evolution of electronic equipment, large amounts of digital data have been easily produced, copied, and distributed, generating great concern about their security. Among the various techniques used to protect digital data are techniques for inserting and extracting watermarks in digital images. A watermark can be any information, such as a code, a logo, or a random sequence of letters and numbers, intended to ensure the authenticity and copyright protection of the data. This work studies existing digital watermark insertion and extraction techniques, covering everything from the underlying concepts to the development of insertion and extraction algorithms for digital images. A method was developed to increase the robustness of digital watermarking techniques by decomposing the image into two parts: a structural part (homogeneous areas) and a detail part (areas with noise, textures, and edges). The watermark is inserted in the detail part because these areas are less affected by digital image processing operations. The results showed that the proposed method increased the robustness of the tested watermarking techniques. Based on the results obtained, a new digital watermarking technique was developed using the discrete wavelet transform, image decomposition, and the discrete cosine transform.
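A minimal sketch of the decomposition-plus-transform idea described in this abstract, not the MARDI pipeline itself: the image is split into a structural part (a smoothed version) and a detail part (the residual), and a watermark bit sequence is embedded into mid-frequency DCT coefficients of the detail part. The random image, the 64-bit mark, the frequency band, and the embedding strength alpha are all illustrative assumptions.

```python
# Illustrative decomposition-based watermark embedding: NOT the MARDI
# algorithm, only the general idea it builds on; all inputs are assumptions.
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.fft import dctn, idctn

rng = np.random.default_rng(0)
image = rng.uniform(0, 255, size=(256, 256))   # stand-in for a real grayscale image
watermark = rng.integers(0, 2, size=64)        # hypothetical 64-bit mark
alpha = 5.0                                    # embedding strength (assumed)

# 1) Decompose: structural part = smoothed image, detail part = residual
structural = gaussian_filter(image, sigma=3)
detail = image - structural

# 2) Embed the bits in a mid-frequency band of the detail part's DCT
coeffs = dctn(detail, norm="ortho")
band = np.arange(60, 68)                       # assumed 8x8 coefficient band
rr, cc = np.meshgrid(band, band, indexing="ij")
rows, cols = rr.ravel(), cc.ravel()            # 64 coefficient slots for 64 bits
coeffs[rows, cols] += alpha * (2 * watermark - 1)   # +alpha for bit 1, -alpha for bit 0

# 3) Recompose the watermarked image
watermarked = structural + idctn(coeffs, norm="ortho")

# Non-blind extraction check: re-decompose and compare with the original detail DCT
detail_w = watermarked - gaussian_filter(watermarked, sigma=3)
diff = dctn(detail_w, norm="ortho")[rows, cols] - dctn(detail, norm="ortho")[rows, cols]
extracted = (diff > 0).astype(int)
print("bit agreement:", np.mean(extracted == watermark))
```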
|
392 |
From Policy Instruments to Action Arenas: Toward Robust Fisheries and Adaptive Fishing Households in Southwest Nova Scotia / January 2014 (has links)
abstract: The coastal fishing community of Barrington, Southwest Nova Scotia (SWNS), has depended on the resilience of ocean ecosystems and resource-based economic activities for centuries. But while many coastal fisheries have developed unique ways to govern their resources, global environmental and economic change presents new challenges. In this study, I examine the multi-species fishery of Barrington. My objective was to understand what makes the fishery and its governance system robust to economic and ecological change, what makes fishing households vulnerable, and how household vulnerability and system-level robustness interact. I addressed these questions by focusing on action arenas, their contexts, interactions and outcomes. I used a combination of case comparisons, ethnography, surveys, and quantitative and qualitative analysis to understand what influences action arenas in Barrington. I found that robustness of the fishery at the system level depended on the strength of feedback between the operational level, where resource users interact with the resource, and the collective-choice level, where agents develop rules to influence fishing behavior. Weak feedback in Barrington has precipitated governance mismatches. At the household level, accounts from harvesters, buyers and experts suggested that decision-making arenas lacked procedural justice. Households preferred individual strategies to acquire access to and exploit fisheries resources. But the transferability of quota and licenses has created divisions between haves and have-nots. Those who have lost their traditional access to other species, such as cod, halibut, and haddock, have become highly dependent on lobster. Based on regressions and multi-criteria decision analysis, I found that new entrants in the lobster fishery needed to maintain high effort and catches to service their debts. But harvesters who did not enter the race for higher catches were most sensitive to low demand and low prices for lobster. This study demonstrates the importance of combining multiple methods and theoretical approaches to avoid tunnel vision in fisheries policy. / Dissertation/Thesis / Ph.D. Environmental Social Science 2014
|
393 |
Metodo para a determinação do numero de gaussianas em modelos ocultos de Markov para sistemas de reconhecimento de fala continua / A new method for determining the number of Gaussians in hidden Markov models for continuous speech recognition systems / Yared, Glauco Ferreira Gazel 20 April 2006 (has links)
Orientador: Fabio Violaro / Tese (doutorado) - Universidade Estadual de Campinas, Faculdade de Engenharia Eletrica e de Computação
Previous issue date: 2006 / Resumo: Atualmente os sistemas de reconhecimento de fala baseados em HMMs são utilizados em diversas aplicações em tempo real, desde telefones celulares até automóveis. Nesse contexto, um aspecto importante que deve ser considerado é a complexidade dos HMMs, a qual está diretamente relacionada com o custo computacional. Assim, no intuito de permitir a aplicação prática do sistema, é interessante otimizar a complexidade dos HMMs, impondo-se restrições em relação ao desempenho no reconhecimento. Além disso, a otimização da topologia é importante para uma estimação confiável dos parâmetros dos HMMs. Os trabalhos anteriores nesta área utilizam medidas de verossimilhança para a obtenção de sistemas que apresentem um melhor compromisso entre resolução acústica e robustez. Este trabalho apresenta o novo Algoritmo para Eliminação de Gaussianas (GEA), o qual é baseado em uma análise discriminativa e em uma análise interna, para a determinação da complexidade mais apropriada para os HMMs. O novo método é comparado com o Critério de Informação Bayesiano (BIC), com um método baseado em medidas de entropia, com um método discriminativo para o aumento da resolução acústica dos modelos e com os sistemas contendo um número fixo de Gaussianas por estado / Abstract: Nowadays, HMM-based speech recognition systems are used in many real-time applications, from cell phones to automobiles. In this context, one important aspect to be considered is the HMM complexity, which directly determines the system's computational load. So, in order to make the system feasible for practical purposes, it is worthwhile to optimize the HMM size subject to a minimum acceptable recognition performance. Furthermore, topology optimization is also important for reliable parameter estimation. Previous works in this area have used likelihood measures in order to obtain models with a better compromise between acoustic resolution and robustness. This work presents the new Gaussian Elimination Algorithm (GEA), which is based on a discriminative analysis and on an internal analysis, for determining the most suitable HMM complexity. The new approach is compared to the classical Bayesian Information Criterion (BIC), to an entropy-based method, to a discriminative method for increasing the acoustic resolution of the HMMs, and also to systems containing a fixed number of Gaussians per state / Doutorado / Telecomunicações e Telemática / Doutor em Engenharia Elétrica
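The abstract above compares the proposed GEA against the Bayesian Information Criterion. As a rough illustration of that BIC baseline only (not of GEA, whose discriminative and internal analyses are not reproduced here), the sketch below selects the number of Gaussians for a single state's emission model fitted as a Gaussian mixture; the synthetic feature frames are assumptions.

```python
# BIC-based choice of the number of Gaussians for one HMM state's emission
# model (the baseline the thesis compares against, not the proposed GEA).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(42)
# Stand-in acoustic feature frames for a single state (13-dim, MFCC-like)
frames = np.vstack([
    rng.normal(loc=-2.0, scale=1.0, size=(300, 13)),
    rng.normal(loc=+2.0, scale=0.7, size=(300, 13)),
])

best_k, best_bic = None, np.inf
for k in range(1, 9):                                  # candidate numbers of Gaussians
    gmm = GaussianMixture(n_components=k, covariance_type="diag", random_state=0)
    gmm.fit(frames)
    bic = gmm.bic(frames)                              # lower = better fit/complexity trade-off
    if bic < best_bic:
        best_k, best_bic = k, bic

print(f"BIC selects {best_k} Gaussians for this state (BIC = {best_bic:.1f})")
```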
|
394 |
Conception robuste de structure spatiale en présence de méconnaissances de modèle / Robust design of a spacecraft structure under lack of knowledge / Maugan, Fabien 19 January 2017 (has links)
Les travaux présentés dans cette thèse visent à apporter des outils d'aide à la décision à partir de modèles prenant en compte une représentation Info-Gap des différentes sources de méconnaissance du système. Il est en effet possible en utilisant des simulations numériques de développer des indicateurs de support à la décision sous un certain niveau d'incertitude aléatoire ou épistémique. Le principe de conception est ici utilisé au sens large, et peut entrer dans le cadre du dimensionnement structural de composants, de la définition de l'amplitude d'excitation maximum d'un essai, ou encore de la mise en place d'une distribution de capteurs. Ces différentes méthodologies sont ici développées puis appliquées sur des systèmes académiques et industriels / The work presented in this PhD thesis aims to provide new decision-support tools based on models that use an Info-Gap representation of the different sources of lack of knowledge about the system. Indeed, numerical simulations make it possible to develop useful decision-support indicators under a given level of random or epistemic uncertainty. Design is used here in its broad sense and can cover the structural sizing of components, the definition of the maximum excitation amplitude of a test, or the placement of a sensor distribution. These methodologies are developed and then applied to academic and industrial structures.
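A toy numeric sketch of the Info-Gap robustness idea referred to above, not the thesis' structural models: the robustness of a design is the largest horizon of uncertainty for which the worst case over the uncertainty set still satisfies the requirement. The displacement model, nominal stiffness, and allowed displacement are assumptions.

```python
# Toy Info-Gap robustness evaluation: how much can an uncertain stiffness
# deviate from its nominal value before the design violates its requirement?
# The performance model and all numbers are illustrative assumptions.
import numpy as np

K_NOMINAL = 2.0e5          # nominal stiffness estimate [N/m] (assumed)
LOAD = 1.0e3               # applied load [N] (assumed)
D_ALLOWED = 8.0e-3         # requirement: displacement below 8 mm (assumed)

def worst_case_displacement(h: float) -> float:
    """Worst performance over the uncertainty set {k : |k - K_NOMINAL| <= h * K_NOMINAL}."""
    k_min = max(K_NOMINAL * (1.0 - h), 1e-9)   # lowest stiffness is the worst case here
    return LOAD / k_min

# Info-Gap robustness = largest horizon h whose worst case still meets the requirement
horizons = np.linspace(0.0, 0.99, 1000)
feasible = [h for h in horizons if worst_case_displacement(h) <= D_ALLOWED]
robustness = max(feasible) if feasible else 0.0
print(f"the design tolerates roughly {robustness:.0%} stiffness uncertainty")
```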
|
395 |
Modelling genetic selection for gastrointestinal parasites resistance in small ruminants / La modélisation de la résistance aux parasites gastro-intestinaux chez les petits ruminants / Assenza, Fabrizio 06 October 2014 (has links)
Les nématodes gastro-intestinaux sont des parasites de la caillette des petits ruminants qui posent des contraintes majeures pour l'élevage de ces animaux dans le monde. Récemment leur impact économique a augmenté notamment à cause de l'apparition de nématodes résistants aux anthelminthiques. La sélection génétique pourrait être une stratégie complémentaire des traitements chimiques. Dans cette thèse, nous avons exploré la variabilité génétique disponible qui permettrait une sélection sur la résistance aux nématodes. Les résultats obtenus en termes d'héritabilités, corrélations génétiques et QTLs, suggèrent que la variation génétique des populations étudiées pourrait satisfaire les requis d'un objectif de sélection permettant à la fois d'améliorer la résistance aux nématodes et la croissance des animaux. En outre, l'identification de loci SNP associés à la variation observée sur les caractères de résistance aux nématodes pourrait nous permettre d'améliorer la réponse à la sélection. / Abomasal nematodes are a major constraint for the small ruminant industry worldwide. Recently their economic impact has increased due to the recrudescence of anthelmintic resistance among many parasite populations. Genetic selection might be a valid strategy to complement chemical treatments. We explored the genetic variability, in both sheep and goats, potentially available for a breeding plan featuring parasite resistance as its breeding goal. The results obtained in terms of heritabilities, genetic correlations and QTLs suggest that the variation in the genetic pool of the populations under study might meet the requirements of a breeding goal including both parasite resistance and production traits. Furthermore, marker-assisted selection could be a feasible option to enhance the selection response.
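As a minimal illustration of the kind of quantity reported above (heritability of a resistance trait), and not of the thesis' actual mixed-model or QTL analyses, the sketch below estimates narrow-sense heritability from an offspring-midparent regression on simulated data; all variances and records are assumptions.

```python
# Offspring-midparent regression for heritability of a resistance proxy trait
# (e.g., log faecal egg count); simulated data, not the thesis' real datasets.
import numpy as np

rng = np.random.default_rng(1)
n, var_a, var_e = 2000, 0.3, 0.7                     # assumed additive / environmental variances

a_sire = rng.normal(0, np.sqrt(var_a), n)            # parental breeding values
a_dam = rng.normal(0, np.sqrt(var_a), n)
a_off = 0.5 * (a_sire + a_dam) + rng.normal(0, np.sqrt(var_a / 2), n)   # Mendelian sampling

p_sire = a_sire + rng.normal(0, np.sqrt(var_e), n)   # phenotype = genetics + environment
p_dam = a_dam + rng.normal(0, np.sqrt(var_e), n)
p_off = a_off + rng.normal(0, np.sqrt(var_e), n)

midparent = 0.5 * (p_sire + p_dam)
slope = np.cov(p_off, midparent)[0, 1] / np.var(midparent, ddof=1)   # slope estimates h^2
print(f"estimated h^2 ~ {slope:.2f} (simulated true value {var_a / (var_a + var_e):.2f})")
```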
|
396 |
An Efficient Randomized Approximation Algorithm for Volume Estimation and Design Centering / Asmus, Josefine 03 July 2017 (has links) (PDF)
Die Konstruktion von Systemen oder Modellen, welche unter Unsicherheit und Umweltschwankungen robust arbeiten, ist eine zentrale Herausforderung sowohl im Ingenieurwesen als auch in den Naturwissenschaften. Dies ist im Design-Zentrierungsproblem formalisiert als das Finden eines Designs, welches vorgegebene Spezifikationen erfüllt und dies mit einer hohen Wahrscheinlichkeit auch noch tut, wenn die Systemparameter oder die Spezifikationen zufällig schwanken. Das Finden des Zentrums wird oft durch das Problem der Quantifizierung der Robustheit eines Systems begleitet. Hier stellen wir eine neue adaptive statistische Methode vor, um beide Probleme gleichzeitig zu lösen. Unsere Methode, Lp-Adaptation, ist durch Robustheit in biologischen Systemen und durch randomisierte Lösungen für konvexe Volumenberechnung inspiriert. Lp-Adaptation ist in der Lage, beide Probleme im allgemeinen, nicht-konvexen Fall und bei niedrigen Rechenkosten zu lösen.
In dieser Arbeit beschreiben wir die Konzepte des Algorithmus und seine einzelnen Schritte. Wir testen ihn dann anhand bekannter Vergleichsfälle und zeigen seine Anwendbarkeit in elektronischen und biologischen Systemen. In allen Fällen übertrifft das vorliegende Verfahren den bisherigen Stand der Technik. Dies ermöglicht die Umformulierung von Optimierungsproblemen im Ingenieurwesen und in der Biologie als Design-Zentrierungsprobleme unter Berücksichtigung der globalen Robustheit des Systems. / The design of systems or models that work robustly under uncertainty and environmental fluctuations is a key challenge in both engineering and science. This is formalized in the design centering problem, defined as finding a design that fulfills given specifications and has a high probability of still doing so if the system parameters or the specifications randomly fluctuate. Design centering is often accompanied by the problem of quantifying the robustness of a system. Here we present a novel adaptive statistical method to simultaneously address both problems. Our method, Lp-Adaptation, is inspired by how robustness evolves in biological systems and by randomized schemes for convex volume computation. It is able to address both problems in the general, non-convex case and at low computational cost.
In this thesis, we describe the concepts of the algorithm and detail its steps. We then test it on known benchmarks, and demonstrate its real-world applicability in electronic and biological systems. In all cases, the present method outperforms the previous state of the art. This enables re-formulating optimization problems in engineering and biology as design centering problems, taking global system robustness into account.
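A Monte Carlo sketch of the robustness-quantification side of design centering described above, not the Lp-Adaptation algorithm itself: estimate the probability that a candidate design still meets its specifications when its parameters fluctuate, and compare two candidate centers. The non-convex feasible region, the candidates, and the fluctuation model are assumptions.

```python
# Monte Carlo robustness quantification for design centering: what fraction of
# randomly perturbed parameter vectors still meets the specification? The
# non-convex feasible region and candidate designs are illustrative assumptions,
# not the Lp-Adaptation algorithm itself.
import numpy as np

def meets_spec(x: np.ndarray) -> np.ndarray:
    """Toy non-convex feasible region: a unit disc with a rectangular notch removed."""
    in_disc = x[:, 0] ** 2 + x[:, 1] ** 2 <= 1.0
    in_notch = (np.abs(x[:, 0]) < 0.15) & (x[:, 1] > 0.2)
    return in_disc & ~in_notch

def robustness(center: np.ndarray, spread: float = 0.3, n: int = 100_000) -> float:
    rng = np.random.default_rng(7)
    samples = center + rng.normal(scale=spread, size=(n, 2))   # parameter fluctuations
    return meets_spec(samples).mean()

for candidate in (np.array([0.3, 0.3]), np.array([0.0, -0.4])):
    print(f"center {candidate}: P(spec satisfied) ~ {robustness(candidate):.3f}")
```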
|
397 |
The evolutionary dynamics of biochemical networks in fluctuating environments / Platt, Robert John January 2011 (has links)
Typically, systems biology focuses on the form and function of networks of biochemical interactions. Questions inevitably arise as to the evolutionary origin of those networks' properties. Such questions are of interest to a growing number of systems biologists, and several groups have published studies showing how varying environments can affect network topology and lead to increased evolvability. For decades, evolutionary biologists have also investigated the evolution of evolvability and its relationship to the interactions between genotype and phenotype. While the perspectives of systems and evolutionary biologists sometimes differ, their interests in patterns of interactions and evolvability have much in common. This thesis attempts to bring together the perspectives of systems and evolutionary theory to investigate the evolutionary effects of fluctuating environments. Chapter 1 introduces the necessary themes, terminology and literature from these fields. Chapter 2 explores how rapid environmental fluctuations, or "noise", affect network size and robustness. In Chapter 3, we use the Avida platform to investigate the relationship between genetic architecture, fluctuating environments and population biology. Chapter 4 examines contingency loci as a physical basis for evolvability, while Chapter 5 presents a 500-generation laboratory evolution experiment which exposes E. coli to varying environments. The final discussion concludes that the evolution of generalism can lead to genetic architectures which confer evolvability, and that such evolvability may arise in rapidly fluctuating environments as a by-product of generalism rather than as a selected trait.
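A toy Wright-Fisher-style simulation related to the thesis' theme of evolution in fluctuating environments, not the Avida or E. coli experiments themselves: two specialists and one generalist compete while the environment switches, showing how the switching rate decides which strategy dominates. The genotypes and fitness values are assumptions.

```python
# Toy Wright-Fisher-style selection in a switching environment; illustrative
# only (not Avida, not the E. coli experiment). Fitness values are assumptions.
import numpy as np

rng = np.random.default_rng(3)

# Three genotypes: specialist A, specialist B, generalist G
fitness = {
    "envA": np.array([1.00, 0.70, 0.90]),   # [A, B, G] fitness in environment A
    "envB": np.array([0.70, 1.00, 0.90]),   # [A, B, G] fitness in environment B
}

def evolve(switch_every: int, n_pop: int = 10_000, generations: int = 400) -> np.ndarray:
    counts = np.array([n_pop // 3, n_pop // 3, n_pop - 2 * (n_pop // 3)])
    for g in range(generations):
        env = "envA" if (g // switch_every) % 2 == 0 else "envB"
        weights = counts * fitness[env]
        counts = rng.multinomial(n_pop, weights / weights.sum())   # selection + drift
    return counts / n_pop

for period in (200, 5):   # slow vs. rapid environmental fluctuation
    a, b, g = evolve(period)
    print(f"switch every {period:>3} generations -> A={a:.2f} B={b:.2f} G={g:.2f}")
```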
|
398 |
Testabilité des services Web / Web services testability / Rabhi, Issam 09 January 2012 (has links)
Cette thèse s'est attaquée sous diverses formes au test automatique des services Web : une première partie est consacrée au test fonctionnel à travers le test de robustesse. La seconde partie étend les travaux précédents pour le test de propriétés non fonctionnelles, telles que les propriétés de testabilité et de sécurité. Nous avons abordé ces problématiques à la fois d'un point de vue théorique et pratique. Nous avons pour cela proposé une nouvelle méthode de test automatique de robustesse des services Web non composés, à savoir les services Web persistants (stateful) et ceux non persistants. Cette méthode consiste à évaluer la robustesse d'un service Web par rapport aux opérations déclarées dans sa description WSDL, en examinant les réponses reçues lorsque ces opérations sont invoquées avec des aléas et en prenant en compte l'environnement SOAP. Les services Web persistants sont modélisés grâce aux systèmes symboliques. Notre méthode de test de robustesse dédiée aux services Web persistants consiste à compléter la spécification du service Web afin de décrire l'ensemble des comportements corrects et incorrects. Puis, en utilisant cette spécification complétée, les services Web sont testés en y intégrant des aléas. Un verdict est ensuite rendu. Nous avons aussi réalisé une étude sur la testabilité des services Web composés avec le langage BPEL. Nous avons décrit précisément les problèmes liés à l'observabilité qui réduisent la faisabilité du test de services Web. Par conséquent, nous avons évalué des facteurs de la testabilité et proposé des solutions afin d'améliorer cette dernière. Pour cela, nous avons proposé une approche permettant, en premier lieu, de transformer la spécification ABPEL en STS. Cette transformation consiste à convertir successivement et de façon récursive chaque activité structurée en un graphe de sous-activités. Ensuite, nous avons proposé des algorithmes d'améliorations permettant de réduire ces problèmes de testabilité. Finalement, nous avons présenté une méthode de test de sécurité des services Web persistants. Cette dernière consiste à évaluer quelques propriétés de sécurité, tel que l'authentification, l'autorisation et la disponibilité, grâce à un ensemble de règles. Ces règles ont été créées avec le langage formel Nomad. Cette méthodologie de test consiste d'abord à transformer ces règles en objectifs de test en se basant sur la description WSDL, ensuite à compléter, en parallèle, la spécification du service Web persistant et enfin à effectuer le produit synchronisé afin de générer les cas de test. / This PhD thesis focuses on several forms of automated Web service testing: one part is dedicated to functional testing through robustness testing, while the other extends this work to the testing of non-functional properties such as testability and security. We explored these issues from both a theoretical and a practical perspective. We proposed a robustness testing method which generates and executes test cases automatically from WSDL descriptions. We analyze the Web service under hazards to find those that can actually be used for testing. We show that few hazards can really be handled, and we then improve the detection of robustness issues by separating the SOAP processor behavior from that of the Web service. Stateful Web services are modeled with symbolic systems. A second method, dedicated to stateful Web services, consists in completing the Web service specification to describe both correct and incorrect behaviors.
By using this completed specification, the Web services are tested with relevant hazards and a verdict is returned. We also study BPEL testability with respect to a well-known testability criterion, observability. To evaluate it, we transform ABPEL specifications into STSs so that existing methods can be applied. Then, from STS testability issues, we deduce patterns of ABPEL testability degradation; these in turn help us propose testability enhancement methods for ABPEL specifications. Finally, we propose a security testing method for stateful Web services. We define specific security rules with the Nomad language; afterwards, we construct test cases from a symbolic specification and test purposes derived from these rules. Moreover, to validate our proposal, we applied our testing approach to real-size case studies.
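A generic sketch of the hazard-based robustness testing idea summarized above: invoke each operation declared in the service description with malformed or boundary values and classify the response. The operation names and the invoke() stub are placeholders; a real harness would call the service through a SOAP client generated from the WSDL, as the thesis does.

```python
# Generic hazard-based robustness testing sketch: invoke each declared
# operation with malformed/boundary values and classify the outcome. The
# operation names and invoke() are placeholders; a real harness would call
# the service through a SOAP client generated from the WSDL description.
from dataclasses import dataclass

HAZARDS = [None, "", "A" * 10_000, -1, 2**31, 3.14, "<xml>", "' OR 1=1 --"]

@dataclass
class Verdict:
    operation: str
    hazard: object
    outcome: str          # "handled", "soap_fault" or "crash/timeout"

def invoke(operation: str, value: object) -> str:
    """Placeholder for a real SOAP call; here we just simulate three outcomes."""
    if value is None:
        raise TimeoutError("no response")
    if isinstance(value, str) and len(value) > 1_000:
        return "soap:Fault"
    return "ok"

def robustness_test(operations: list[str]) -> list[Verdict]:
    verdicts = []
    for op in operations:
        for hazard in HAZARDS:
            try:
                response = invoke(op, hazard)
                outcome = "soap_fault" if "Fault" in response else "handled"
            except Exception:
                outcome = "crash/timeout"        # a robustness issue worth reporting
            verdicts.append(Verdict(op, hazard, outcome))
    return verdicts

for v in robustness_test(["getQuote", "placeOrder"]):
    if v.outcome != "handled":
        print(v)
```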
|
399 |
Interactive 3D segmentation repair with image-foresting transform, supervoxels and seed robustness / Reparação interativa de segmentações 3D com transformada imagem-floresta, supervoxels, robustez de sementes / Anderson Carlos Moreira Tavares 02 June 2017 (has links)
Image segmentation consists of partitioning an image into relevant regions, for example to isolate the pixels belonging to desired objects in the image domain, which is an important step for computer vision, medical image processing, and other applications. Automatic segmentation often generates results with imperfections. The user can correct them by editing manually or interactively, or can simply discard the segmentation and try to generate another result automatically by a different method. Interactive methods combine benefits from manual and automatic ones, reducing user effort and using the user's high-level knowledge. In seed-based methods, to continue or repair a prior segmentation (presegmentation) without forcing the user to start from scratch, it is necessary to solve the Reverse Interactive Segmentation Problem (RISP), that is, how to automatically estimate the seeds that would generate it. In order to achieve this goal, we first divide the segmented object into its composing cores. Inside a core, two seeds taken separately always produce the same result, making one of them redundant. With this, only one seed per core is required. Cores leading to segmentations which are contained in the result of other cores are redundant and can also be discarded, further reducing the seed set, a process called Redundancy Analysis. A minimal set of seeds for the presegmentation is generated, and the problem of interactive repair can be solved by adding or removing seeds. Within the framework of the Image-Foresting Transform (IFT), new methods such as the Oriented Image-Foresting Transform (OIFT) and Oriented Relative Fuzzy Connectedness (ORFC) were developed. However, there were no known algorithms for computing the cores of these methods. This work develops such algorithms, with proofs of correctness. The cores also give an indication of the degree of robustness of the methods with respect to seed positioning. Therefore, a hybrid method that combines GraphCut and the ORFC cores, as well as a Robustness Coefficient (RC), has been developed. In this work, we also present another solution to repair segmentations, based on IFT-SLIC, originally used to generate supervoxels. Experimental results analyze, compare and demonstrate the potential of these solutions. / Segmentação de imagem consiste no seu particionamento em regiões, tal como para isolar os pixels pertencentes a objetos de interesse em uma imagem, sendo uma etapa importante para visão computacional, processamento de imagens médicas e outras aplicações. Muitas vezes a segmentação automática gera resultados com imperfeições. O usuário pode corrigi-las editando-a manualmente, interativamente ou simplesmente descartar o resultado e gerar outro automaticamente. Métodos interativos combinam os benefícios dos métodos manuais e automáticos, reduzindo o esforço do usuário e utilizando seu conhecimento de alto nível. Nos métodos baseados em sementes, para continuar ou reparar uma segmentação prévia (presegmentação), evitando o usuário começar do zero, é necessário resolver o Problema da Segmentação Interativa Reversa (RISP), ou seja, estimar automaticamente as sementes que o gerariam. Para isso, este trabalho particiona o objeto da segmentação em núcleos. Em um núcleo, duas sementes separadamente produzem o mesmo resultado, tornando uma delas redundante. Com isso, apenas uma semente por núcleo é necessária.
Núcleos contidos nos resultados de outros núcleos são redundantes e também podem ser descartados, reduzindo ainda mais o conjunto de sementes, um processo denominado Análise de Redundância. Um conjunto mínimo de sementes para a presegmentação é gerado e o problema da reparação interativa pode então ser resolvido através da adição de novas sementes ou remoção. Dentro do arcabouço da Transformada Imagem-Floresta (IFT), novos métodos como Oriented Image-Foresting Transform (OIFT) e Oriented Relative Fuzzy Connectedness (ORFC) foram desenvolvidos. Todavia, não há algoritmos para calcular o núcleo destes métodos. Este trabalho desenvolve tais algoritmos, com prova de corretude. Os núcleos também nos fornecem uma indicação do grau de robustez dos métodos sobre o posicionamento das sementes. Por isso, um método híbrido do GraphCut com o núcleo do ORFC, bem como um Coeficiente de Robustez (RC), foram desenvolvidos. Neste trabalho também foi desenvolvida outra solução para reparar segmentações, a qual é baseada em IFT-SLIC, originalmente utilizada para gerar supervoxels. Resultados experimentais analisam, comparam e demonstram o potencial destas soluções.
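A minimal seed-based Image-Foresting Transform on a 2D grid with the classic f_max path cost, shown only to illustrate the machinery the thesis builds on; it does not implement OIFT, ORFC, or the core and robustness-coefficient algorithms contributed by the thesis. The tiny image and seeds are assumptions.

```python
# Minimal seed-based Image-Foresting Transform (IFT) with the classic f_max
# path cost; it illustrates the machinery the thesis builds on, but does NOT
# implement OIFT, ORFC, or the core/robustness algorithms it contributes.
import heapq
import numpy as np

def ift_segment(image: np.ndarray, seeds: dict) -> np.ndarray:
    """Label each pixel with the label of the seed that reaches it with minimum f_max cost."""
    h, w = image.shape
    cost = np.full((h, w), np.inf)
    label = np.full((h, w), -1, dtype=int)
    heap = []
    for (r, c), lab in seeds.items():
        cost[r, c], label[r, c] = 0.0, lab
        heapq.heappush(heap, (0.0, r, c))
    while heap:
        c0, r, c = heapq.heappop(heap)
        if c0 > cost[r, c]:
            continue                                  # stale heap entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w:
                # path cost = maximum intensity difference along the path's arcs
                new_cost = max(c0, abs(float(image[nr, nc]) - float(image[r, c])))
                if new_cost < cost[nr, nc]:
                    cost[nr, nc], label[nr, nc] = new_cost, label[r, c]
                    heapq.heappush(heap, (new_cost, nr, nc))
    return label

# Toy image: bright square (object) on a dark background, one seed per region
img = np.zeros((8, 8))
img[2:6, 2:6] = 100
print(ift_segment(img, {(0, 0): 0, (4, 4): 1}))   # 0 = background, 1 = object
```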
|
400 |
Sistemas de informação cientes de processos, robustos e confiáveis / Robust and reliable process-aware information systems / André Luis Schwerz 08 December 2016 (has links)
Atualmente, diversas empresas e organizações estão cada vez mais empreendendo esforços para transformar rapidamente as suas potenciais ideias em produtos e serviços. Esses esforços também têm estimulado a evolução dos sistemas de informação que passaram a ser apoiados por modelos de alto nível de abstração para descrever a lógica do processo. Neste contexto, destaca-se o sucesso dos Sistemas de Informação cientes de Processos (PAIS, do inglês Process-Aware Information Systems) para o gerenciamento de processos de negócios e automação de processos científicos de larga escala (e-Science). Grande parte do sucesso dos PAIS é devido à capacidade de prover funcionalidades genéricas para modelagem, execução e monitoramento dos processos. Essas características são bem-sucedidas quando os modelos de processos têm um caminho bem-comportado no sentido de atingir os seus objetivos. No entanto, situações anômalas que desviam a execução desse caminho bem-comportado ainda representam um significativo desafio para os PAIS. Por causa dos vários tipos de falhas que desviam a execução do comportamento esperado, prover uma execução robusta e confiável é uma tarefa complexa para os atuais PAIS, uma vez que nem todas as situações de falha podem ser eficientemente descritas dentro da estrutura do fluxo tradicional. Como consequência, o tratamento de tais situações geralmente envolve intervenções manuais nos sistemas por operadores humanos, o que resulta em custos adicionais e significativos para as empresas. Neste trabalho é introduzido um método de composição para recuperação ciente de custos e benefícios que é capaz de encontrar e seguir caminhos alternativos que reduzam os prejuízos financeiros do tratamento de exceções. Do ponto de vista prático, esse método provê o tratamento de exceção automatizado e otimizado ao calcular os custos e benefícios de cada caminho de recuperação e escolher o caminho com a melhor relação custo-benefício disponível. Mais especificamente, o método de recuperação proposto estende a abordagem WED-flow (Workflow, Event processing and Data-flow) para permitir a composição ciente de custos e benefícios de passos de recuperação transacionais backward e forward. Por fim, os experimentos mostram que esse método de recuperação pode ser adequadamente incorporado para manipular exceções em uma ampla variedade de processos. / Nowadays, many corporations and organizations are making increasing efforts to quickly and effectively transform their potential ideas into products and services. These efforts have also stimulated the evolution of information systems, which are now supported by higher-level abstract models to describe the process logic. In this context, several sophisticated Process-Aware Information Systems (PAIS) have successfully been proposed for managing business processes and automating large-scale scientific (e-Science) processes. Much of this success is due to their ability to provide generic functionality for modeling, executing and monitoring processes. These functionalities work well when process models have a well-behaved path towards achieving their objectives. However, anomalous situations that fall outside this well-behaved execution path still pose a significant challenge to PAIS. Because of the many types of failures that may deviate execution away from expected behaviors, providing robust and reliable execution is a complex task for current PAIS, since not all failure situations can be efficiently modeled within the traditional flow structure.
As a consequence, the treatment of such situations usually involves manual interventions in the systems by human operators, which results in significant additional costs for businesses. In this work, we introduce a cost/benefit-aware recovery composition method that is able to find and follow alternative paths that reduce the financial side effects of exception handling. From a practical point of view, this method provides automated and optimized exception handling by calculating the costs and benefits of each recovery path and choosing the recovery path with the best cost/benefit ratio available. More specifically, our recovery method extends the WED-flow (Workflow, Event processing and Data-flow) approach to enable cost/benefit-aware composition of forward and/or backward transactional recovery steps. Finally, the experiments show that this recovery method can be suitably incorporated into exception handling within a wide variety of processes.
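A toy sketch of the cost/benefit-aware choice among candidate recovery paths described above (backward compensation versus forward alternatives); the candidate steps and their costs and benefits are assumptions, and this is not the WED-flow implementation.

```python
# Toy cost/benefit-aware choice among candidate recovery paths for a failed
# process step; the paths and their numbers are assumptions, not the WED-flow
# implementation evaluated in the thesis.
from dataclasses import dataclass

@dataclass
class RecoveryStep:
    name: str
    cost: float       # e.g., compensation fee or re-execution effort
    benefit: float    # e.g., business value preserved by the step

@dataclass
class RecoveryPath:
    kind: str                      # "backward" (compensate) or "forward" (alternative)
    steps: list

    def net_benefit(self) -> float:
        return sum(s.benefit - s.cost for s in self.steps)

candidates = [
    RecoveryPath("backward", [RecoveryStep("cancel_payment", 5.0, 0.0),
                              RecoveryStep("restock_item", 2.0, 10.0)]),
    RecoveryPath("forward", [RecoveryStep("retry_with_partner_B", 8.0, 40.0)]),
]

best = max(candidates, key=RecoveryPath.net_benefit)
print(f"chosen recovery path: {best.kind}, net benefit = {best.net_benefit():.1f}")
```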
|