71

Herausforderungen bei der Prozessunterstützung im Operationssaal: Aktivitätserfassung und Datenspeicherung als Grundlage zur Erkennung des chirurgischen Prozesses / Challenges of process support in the operating room: activity capture and data storage as a basis for recognizing the surgical process

Rockstroh, Max 19 February 2021 (has links)
Current health care is characterized by increasing complexity, driven by the interlocking of different medical disciplines and the use of ever more complex, technically supported treatment options. At the same time, hospitals and medical staff operate under high cost and time pressure. To achieve optimal patient treatment through minimally invasive and microsurgical interventions, approaches are increasingly needed that rely on interoperability between different systems and enable the use of additional (semi-)automatic assistance systems. This thesis addresses challenges of process support in the operating room, with a focus on activity capture and data storage as a basis for recognizing the surgical process. In a first step, a theoretical procedure model for intraoperative process support was developed on the basis of a closed control loop, with the surgical process at its center. In further steps, a system for detecting the current work step from the available video data (e.g., microscopy, ultrasound) and a central storage solution for the operating room were developed. These central components are intended to enable systems to provide data to the various users during the intervention and, at the same time, to retain the data for later documentation. The thesis then presents approaches for achieving interoperability of medical devices and IT systems in health care, since technical systems, alongside additional sensors, are a valuable source of information for capturing the current process in the operating room. Within the OR.NET project (BMBF, 2012-2016), the IEEE 11073 SDC family of standards was created as a means of open networking. Conceptually, the systems are integrated through a description of value-added services, ranging from the simple display of device parameters to the partial automation of technical work steps, based on the IEEE 11073 SDC communication standard. These services were developed on top of this base technology together with various project partners and evaluated with clinicians and hospital operators.
The evaluation results show that syntactic and semantic interoperability makes it possible to implement new, useful functions and to effectively support the work of the various user groups in the health care system.
Contents: Abstract; List of Abbreviations; List of Figures; List of Tables; 1 Introduction (1.1 Motivation for this Work, 1.2 Objectives of the Work); 2 Fundamentals (2.1 Current Operating Rooms; 2.2 Process Support: 2.2.1 Workflow and Process Modelling, 2.2.2 Workflow in Medicine, 2.2.3 Context-Aware Systems; 2.3 Clinical Use Cases within this Work: 2.3.1 Intracranial Brain Interventions, 2.3.2 Transsphenoidal Removal of Pituitary Adenomas, 2.3.3 Reconstructive Ear Surgery); 3 Procedure Model for Process Support (3.1 Interpretation and Action, 3.2 Data Analysis and Monitoring, 3.3 Possible Approaches to Implementing Process Support); 4 Capture of Process-Relevant Data in the Operating Room (4.1 State of Research: 4.1.1 Team Assessment and Performance Monitoring, 4.1.2 Surgical Documentation, Quality Assurance and the Electronic Patient Record (EPR), 4.1.3 Workflow Recognition; 4.2 Detection of the Interaction between Medical Device and Medical Staff Based on the Analysis of Video Data: 4.2.1 Method, 4.2.2 Evaluation, 4.2.3 Discussion; 4.3 Capturing Information through OR Integration: 4.3.1 State of Research, 4.3.2 Fundamentals of the OR.NET Project, 4.3.3 Summary); 5 Data Storage in the Operating Room (the Surgical Data Recorder and its OR.NET-Based Extensions) (5.1 Surgical Data Recorder: 5.1.1 Requirements Analysis, 5.1.2 System Concept, 5.1.3 Evaluation Study; 5.2 Adaptations of the Data Recording Concept within the OR.NET Project; 5.3 Discussion and Comparison of the Approaches); 6 Value-Added Services Based on Open Networking (6.1 Setting Intervention-Specific Information on the Connected Devices, 6.2 Displaying Information in the Surgeon's Field of View, 6.3 Added Value through Data Integration, 6.4 Functions Using Process Information); 7 Development of the OR Demonstrators and Evaluation of the Implemented Value-Added Services with Different User Groups (7.1 Requirements Analysis for the Leipzig Demonstrator, 7.2 Demonstrator Infrastructure, 7.3 Integration Scenarios in the Demonstrators, 7.4 Value-Added Services Implemented in the Leipzig Demonstrator, 7.5 Evaluation Procedure for the Leipzig Demonstrator: 7.5.1 Technical Evaluation, 7.5.2 Clinical Evaluation; 7.6 Evaluation Results: 7.6.1 Technical Evaluation, 7.6.2 Clinical Evaluation; 7.7 Discussion of the Results); 8 Outlook on Further Developments of Open Networking in the Operating Room; 9 Summary; 10 References; 11 Appendix (11.1 Modelling of the Reconstructive Ear Surgery as an EPC Including Possible Networking Scenarios, 11.2 Overview of the Use Cases Implemented in the Leipzig Demonstrator, 11.3 Question Catalogue for OR Staff, 11.4 Operator Questionnaire); 12 Declaration of Authorship; 13 Own Publications; 14 Acknowledgements
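The closed-loop concept described above (record device and video events centrally, estimate the current surgical phase from them, and derive a context-specific support action) can be illustrated with a small sketch. Everything below (class names, event names, the phase rules and the action table) is a hypothetical simplification for illustration, not the system built in the thesis:

```python
# Minimal closed-loop sketch: central recording, crude phase estimation, support action.
from dataclasses import dataclass, field


@dataclass
class Event:
    timestamp: float          # seconds since intervention start
    source: str               # e.g. "microscope", "ultrasound", "hf_device"
    name: str                 # e.g. "microscope_in_use", "coagulation_active"
    value: object = None


@dataclass
class SurgicalDataRecorder:
    """Central store that keeps every event for later documentation."""
    events: list = field(default_factory=list)

    def record(self, event: Event) -> None:
        self.events.append(event)


def estimate_phase(recent_events: list) -> str:
    """Very crude stand-in for the video/device-based recognition of the work step."""
    names = {e.name for e in recent_events}
    if "coagulation_active" in names:
        return "resection"
    if "microscope_in_use" in names:
        return "microsurgical_preparation"
    return "unknown"


def support_action(phase: str) -> str:
    """Map the recognized phase to a value-added service (closing the loop)."""
    actions = {
        "resection": "display HF-device parameters in the surgeon's field of view",
        "microsurgical_preparation": "preload navigation data on the microscope display",
    }
    return actions.get(phase, "no action")


if __name__ == "__main__":
    recorder = SurgicalDataRecorder()
    for ev in [Event(10.0, "microscope", "microscope_in_use"),
               Event(95.0, "hf_device", "coagulation_active")]:
        recorder.record(ev)
    phase = estimate_phase(recorder.events[-5:])
    print(phase, "->", support_action(phase))
```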
72

AdaptFlow: Protocol-based Medical Treatment Using Adaptive Workflows

Greiner, U., Müller, R., Rahm, E., Ramsch, J., Heller, B., Löffler, M. 25 January 2019 (has links)
Objectives: In many medical domains, investigator-initiated clinical trials are used to introduce new treatments and hence act as implementations of guideline-based therapies. Trial protocols contain detailed instructions for conducting the therapy and additionally specify reactions to exceptional situations (for instance an infection or a toxicity). To increase quality in health care and raise the number of patients treated according to trial protocols, a consultation system is needed that supports the handling of these complex trial therapy processes efficiently. Our objective was to design and evaluate a consultation system that should 1) observe the status of the therapies currently being applied, 2) offer automatic recognition of exceptional situations and appropriate decision support, and 3) provide automatic adaptation of affected therapy processes to handle exceptional situations. Methods: We applied a hybrid approach that combines the process support for timely and efficient execution of therapy processes offered by workflow management systems with a knowledge and rule base and with a mechanism for dynamic workflow adaptation that changes running therapy processes when the patient's condition changes. Results and Conclusions: This approach has been implemented in the AdaptFlow prototype. We performed several evaluation studies on the practicability of the approach and the usefulness of the system. These studies show that the AdaptFlow prototype offers adequate support for the execution of real-world investigator-initiated trial protocols and is able to handle a large number of exceptions.
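A minimal sketch of the hybrid idea above: a rule base observes the patient's state and, when an exceptional situation is recognized, the running therapy workflow is adapted by inserting or dropping steps. The rules, step names and thresholds are invented for illustration and are not taken from any AdaptFlow trial protocol:

```python
# Event-condition-action style exception handling on a running therapy workflow.
from dataclasses import dataclass, field


@dataclass
class TherapyWorkflow:
    steps: list                       # remaining steps, e.g. ["chemo_cycle_2", ...]
    log: list = field(default_factory=list)

    def insert_before(self, anchor: str, new_step: str) -> None:
        self.steps.insert(self.steps.index(anchor), new_step)
        self.log.append(f"inserted {new_step} before {anchor}")

    def drop(self, step: str) -> None:
        self.steps.remove(step)
        self.log.append(f"dropped {step}")


def exception_rules(patient: dict, wf: TherapyWorkflow) -> None:
    """Check the current patient state and adapt the workflow if needed."""
    # Condition: leukocyte count below an (illustrative) threshold -> toxicity.
    if patient["leukocytes"] < 1000 and "chemo_cycle_2" in wf.steps:
        wf.insert_before("chemo_cycle_2", "growth_factor_support")
        wf.insert_before("chemo_cycle_2", "repeat_blood_count")
    # Condition: fever suggests infection -> postpone radiotherapy to the end.
    if patient["temperature"] >= 38.5 and "radiotherapy" in wf.steps:
        wf.drop("radiotherapy")
        wf.steps.append("radiotherapy")
        wf.log.append("postponed radiotherapy")


if __name__ == "__main__":
    wf = TherapyWorkflow(steps=["chemo_cycle_2", "radiotherapy", "restaging"])
    exception_rules({"leukocytes": 800, "temperature": 38.9}, wf)
    print(wf.steps)
    print(wf.log)
```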
73

Event-Oriented Dynamic Adaptation of Workflows: Model, Architecture and Implementation

Müller, Robert 28 November 2004 (has links)
Workflow management is widely accepted as a core technology to support long-term business processes in heterogeneous and distributed environments. However, conventional workflow management systems do not provide sufficient flexibility to cope with the broad range of failure situations that may occur during workflow execution. In particular, most systems do not allow a workflow to be adapted dynamically in response to a failure situation, e.g., by dropping or inserting execution steps on the fly. As a contribution to overcoming these limitations, this dissertation introduces the agent-based workflow management system AgentWork. AgentWork supports the definition, the execution and, as its main contribution, the event-oriented and semi-automated dynamic adaptation of workflows. Two strategies for automatic workflow adaptation are provided. Predictive adaptation adapts workflow parts affected by a failure in advance (predictively), typically as soon as the failure is detected. This is advantageous in many situations and leaves enough time to meet organizational constraints for the adapted workflow parts. Reactive adaptation is typically performed when predictive adaptation is not possible. In this case, adaptation is performed when the affected workflow part is about to be executed; e.g., before an activity is executed it is checked whether it is subject to a workflow adaptation such as dropping, postponement or replacement. In particular, AgentWork provides the following contributions.
A Formal Model for Workflow Definition, Execution, and Estimation: In this context, AgentWork first provides an object-oriented workflow definition language. This language allows for the definition of a workflow's control and data flow. Furthermore, a workflow's cooperation with other workflows or workflow systems can be specified. Second, AgentWork provides a precise workflow execution model. This is necessary, as a running workflow usually is a complex collection of concurrent activities and data flow processes, and as failure situations and dynamic adaptations affect running workflows. Furthermore, mechanisms for estimating a workflow's future execution behaviour are provided. These mechanisms are of particular importance for predictive adaptation.
Mechanisms for Determining and Processing Failure Events and Failure Actions: AgentWork provides mechanisms to decide whether an event constitutes a failure situation and what has to be done to cope with this failure. This is formally achieved by evaluating event-condition-action rules, where the event-condition part describes under which condition an event has to be viewed as a failure event. The action part represents the actions needed to cope with the failure. To support the temporal dimension of events and actions, this dissertation provides a novel event-condition-action model based on a temporal object-oriented logic.
Mechanisms for the Adaptation of Affected Workflows: In the case of a failure situation, it has to be decided how an affected workflow should be dynamically adapted at the node and edge level. AgentWork provides a novel approach that combines the two principal strategies, reactive adaptation and predictive adaptation; depending on the context of the failure, the appropriate strategy is selected. Furthermore, control-flow adaptation operators are provided which translate failure actions into structural control-flow adaptations. Data-flow operators adapt the data flow after a control-flow adaptation, if necessary.
Mechanisms for the Handling of Inter-Workflow Implications of Failure Situations: AgentWork provides novel mechanisms to decide whether a failure situation occurring in a workflow affects other workflows that communicate and cooperate with this workflow. In particular, AgentWork derives the temporal implications of a dynamic adaptation by estimating the duration needed to process the changed workflow definition (in comparison with the original definition). Furthermore, qualitative implications of the dynamic change are determined; for this purpose, so-called quality measuring objects are introduced. All mechanisms provided by AgentWork allow users to interact during the failure handling process. In particular, the user has the possibility to reject or modify suggested workflow adaptations.
A Prototypical Implementation: Finally, a prototypical CORBA-based implementation of AgentWork is described. This implementation supports the integration of AgentWork into the distributed and heterogeneous environments of real-world organizations such as hospitals or insurance companies.
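The choice between the two adaptation strategies can be sketched as follows; the class names, lead time and failure actions are hypothetical and only illustrate the idea of selecting predictive adaptation when the execution-estimation model predicts enough time before the affected activity starts:

```python
# Sketch: strategy selection plus two simple control-flow adaptation operators.
from dataclasses import dataclass


@dataclass
class Activity:
    name: str
    estimated_start: float   # minutes from now, taken from the execution-estimation model


def choose_strategy(affected: Activity, adaptation_lead_time: float = 30.0) -> str:
    """Predictive adaptation if there is enough time before the affected activity starts."""
    return "predictive" if affected.estimated_start >= adaptation_lead_time else "reactive"


def apply_failure_action(workflow: list, affected: Activity, action: str) -> list:
    """Translate a failure action into a simple structural control-flow adaptation."""
    if action == "drop":
        return [a for a in workflow if a.name != affected.name]
    if action == "postpone":
        return [a for a in workflow if a.name != affected.name] + [affected]
    return workflow


if __name__ == "__main__":
    wf = [Activity("give_drug_A", 10.0), Activity("lab_control", 45.0)]
    target = wf[1]
    print(choose_strategy(target))                                   # -> predictive
    print([a.name for a in apply_failure_action(wf, target, "postpone")])
```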
74

Upravljanje tokovima aktivnosti u distributivnom menadžment sistemu / Workflow management system for DMS

Nedić Nemanja 24 February 2016 (has links)
The thesis presents research on improving the performance of large supervisory control and management systems such as the Distribution Management System (DMS). This goal is achieved by coordinating the execution of workflows, which entails an efficient distribution of tasks onto computing resources. Workflows are divided into atomic tasks that are scheduled onto computing resources for execution, and various scheduling algorithms were developed and thoroughly tested for this purpose. This approach provides a higher utilization of computing resources, which in turn results in better performance.
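A minimal greedy list-scheduling baseline illustrates the kind of task-to-resource assignment the thesis evaluates; the task names and durations are invented, and the thesis itself compares several more sophisticated algorithms:

```python
# Greedy list scheduling: each task goes to the resource that becomes free earliest.
import heapq


def schedule(tasks, num_resources):
    """tasks: list of (task_name, duration); returns (makespan, assignment list)."""
    # Heap of (time_when_free, resource_id); all resources start free at t = 0.
    free_at = [(0.0, r) for r in range(num_resources)]
    heapq.heapify(free_at)
    assignment = []
    for name, duration in tasks:
        start, resource = heapq.heappop(free_at)
        finish = start + duration
        assignment.append((name, resource, start, finish))
        heapq.heappush(free_at, (finish, resource))
    makespan = max(finish for *_rest, finish in assignment)
    return makespan, assignment


if __name__ == "__main__":
    tasks = [("load_topology", 4.0), ("power_flow", 7.0),
             ("state_estimation", 5.0), ("report", 2.0)]
    makespan, plan = schedule(tasks, num_resources=2)
    print(makespan)   # 9.0 with this task order and two resources
    for entry in plan:
        print(entry)
```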
75

Die Einführung von Vorgangsbearbeitungssystemen in der öffentlichen Verwaltung als IT-organisatorischer Gestaltungsprozeß / The introduction of document and workflow management systems in public administration as an IT-organizational design process

Knaack, Ildiko 08 December 1999 (has links)
The thesis develops a conceptual framework for the IT-organizational design process involved in introducing document and workflow management systems, taking into account the particularities of planning public administration (ministerial administration). Like no IT system before, the introduction of such systems intervenes in both the process organization and the organizational structure of public administration. Document and workflow management systems open up fundamentally new ways of designing case handling and at the same time require an optimized organization of case processing adapted to the IT support. The introduction process is characterized by a multitude of IT-related and organizational dependencies and interactions. Information technology and organization are regarded as two determinants of design that must be considered when carrying out IT-organizational design measures. The need for design arises from the way the document and workflow management system is used; the aim of the design is a use of the system that is optimized step by step and adapted to the specific characteristics of the agency. Starting from a systematization of conventional and IT-supported case handling and an examination of the extent to which methods and concepts from software engineering, administrative modernization and organization science take IT-organizational interactions into account, the three-level model of IT-organizational design is developed as the conceptual framework.
Since the concrete implementation of IT-organizational design measures depends on the specifics of the respective agency and of the document and workflow management system to be introduced, the IT-organizational design process is illustrated by different levels of system use and by examples of IT-organizational design from the DOMEA® project.
76

Semantics, verification, and implementation of workflows with cancellation regions and OR-joins

Wynn, Moe Thandar January 2006 (has links)
Workflow systems aim to provide automated support for the conduct of certain business processes. Workflow systems are driven by workflow specifications which among others, capture the execution interdependencies between various activities. These interdependencies are modelled by means of different control flow constructors, e.g., sequence, choice, parallelism and synchronisation. It has been shown in the research on workflow patterns that the support for and the interpretation of various control flow constructs varies substantially across workflow systems. Two of the most problematic patterns relate to the OR-join and to cancellation. An OR-join is used in situations when we need to model " wait and see" behaviour for synchronisation. Different approaches assign a different (often only intuitive) semantics to this type of join, though they do share the common theme that synchronisation is only to be performed for active paths. Depending on context assumptions this behaviour may be relatively easy to deal with, though in general its semantics is complicated, both from a definition point of view (in terms of formally capturing a desired intuitive semantics) and from a computational point of view (how does one determine whether an OR-join is enabled?). Many systems and languages struggle with the semantics and implementation of the OR-join because its non-local semantics require a synchronisation depending on an analysis of future execution paths. This may require some non-trivial reasoning. The presence of cancellation features and other OR-joins in a workflow further complicates the formal semantics of the OR-join. The cancellation feature is commonly used to model external events that can change the behaviour of a running workflow. It can be used to either disable activities in certain parts of a workflow or to stop currently running activities. Even though it is possible to cancel activities in workflow systems using some sort of abort function, many workflow systems do not provide direct support for this feature in the workflow language. Sometimes, cancellation affects only a selected part of a workflow and other activities can continue after performing a cancellation action. As cancellation occurs naturally in business scenarios, comprehensive support in a workflow language is desirable. We take on the challenge of providing formal semantics, verification techniques as well as an implementation for workflows with those features. This thesis addresses three interrelated issues for workflows with cancellation regions and OR-joins. The concept of the OR-join is examined in detail in the context of the workflow language YAWL, a powerful workflow language designed to support a collection of workflow patterns and inspired by Petri nets. The OR-join semantics has been redesigned to represent a general, formal, and decidable approach for workflows in the presence of cancellation regions and other OR-joins. This approach exploits a link that is proposed between YAWL and reset nets, a variant of Petri nets with a special type of arc that can remove all tokens from a place. Next, we explore verification techniques for workflows with cancellation regions and OR-joins. Four structural properties have been identified and a verification approach that exploits coverability and reachability notions from reset nets has been proposed. The work on verification techniques has highlighted potential problems with calculating state spaces for large workflows. 
Applying reduction rules before carrying out verification can decrease the size of the problem by cutting down the size of the workflow that needs to be examined while preserving some essential properties. Therefore, we have extended the work on verification by proposing reduction rules for reset nets and for YAWL nets with and without OR-joins. The proposed OR-join semantics as well as the proposed verification approach have been implemented in the YAWL environment.
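The reset-net firing rule that underpins the proposed OR-join and cancellation semantics can be sketched compactly: a transition consumes tokens from its input places, empties every place in its reset set, and produces tokens on its output places. The marking representation and the example net below are illustrative and are not the YAWL-to-reset-net mapping defined in the thesis:

```python
# Reset-net firing rule: consume inputs, empty reset places, produce outputs.
from collections import Counter


def enabled(marking: Counter, inputs: Counter) -> bool:
    return all(marking[p] >= n for p, n in inputs.items())


def fire(marking: Counter, inputs: Counter, outputs: Counter, resets: set) -> Counter:
    if not enabled(marking, inputs):
        raise ValueError("transition not enabled")
    new = marking.copy()
    new.subtract(inputs)          # consume input tokens
    for place in resets:          # reset arcs: remove *all* tokens from these places
        new[place] = 0
    new.update(outputs)           # produce output tokens
    return +new                   # drop zero-count entries


if __name__ == "__main__":
    marking = Counter({"p_task_running": 1, "p_cancel_region_a": 2})
    # A "cancel" transition that ends the task and clears the whole cancellation region.
    after = fire(marking,
                 inputs=Counter({"p_task_running": 1}),
                 outputs=Counter({"p_cancelled": 1}),
                 resets={"p_cancel_region_a"})
    print(after)    # Counter({'p_cancelled': 1})
```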
77

Foundations of process-aware information systems

Russell, Nicholas Charles January 2007 (has links)
Over the past decade, the ubiquity of business processes and their need for ongoing management in the same manner as other corporate assets has been recognized through the establishment of a dedicated research area: Business Process Management (or BPM). There is a wide range of potential software technologies on which a BPM offering can be founded. Although there is significant variation between these alternatives, they all share one common factor, namely that their execution occurs on the basis of a business process model, and consequently this field of technologies can be termed Process-Aware Information Systems (or PAIS). This thesis develops a conceptual foundation for PAIS based on the results of a detailed examination of contemporary offerings including workflow and case handling systems, business process modelling languages and web service composition languages. This foundation is based on 126 patterns that identify recurrent core constructs in the control-flow, data and resource perspectives of PAIS. These patterns have been used to evaluate some of the leading systems and business process modelling languages. It also proposes a generic graphical language for defining exception handling strategies that span these perspectives. On the basis of these insights, a comprehensive reference language, newYAWL, is developed for business process modelling and enactment. This language is formally defined and an abstract syntax and operational semantics are provided for it. An assessment of its capabilities is provided through a comprehensive patterns-based analysis which allows direct comparison of its functionality with other PAIS. newYAWL serves as a reference language and many of the ideas embodied within it are also applicable to existing languages and systems. The ultimate goal of both the patterns and newYAWL is to improve the support and applicability of PAIS.
78

Entwicklung einer Methode für die Integration von Standardsoftware am Beispiel der Integration von Prüfsystemen in die Leistungsabrechnung von Krankenversicherungen / Development of a method for integrating standard software, illustrated by the integration of verification systems into health insurance claims processing

Hutter, Michael January 2009 (has links)
Also published as: doctoral dissertation, University of Sankt Gallen, 2009
79

Um estudo sobre a aplicação do padrão BPMN (Business Process Model and Notation) para a modelagem do processo de desenvolvimento de produtos numa empresa de pequeno porte do segmento metal-mecânico / A study on applying the BPMN (Business Process Model and Notation) standard to model the product development process in a small company in the metal-mechanical sector

Mocrosky, Jeferson Ferreira 03 October 2012 (has links)
Business process modeling is an approach from the 1990s for improving organizational performance that is currently returning as a strong contributor to such improvement. It is with this approach that this research models the Product Development Process (PDP) of a mechanical manufacturing company that builds machines and equipment to support production in meat-packing plants in the western region of Santa Catarina. The PDP is modelled with the Business Process Model and Notation (BPMN) standard, supported by the Intalio BPMS application. The objective of the research is to evaluate BPMN modelling for formalizing the Product Development Process and to examine how to handle the complexities and intrinsic interactions of this process in small mechanical manufacturing companies. The BPMN modelling is grounded in an assessment of the PDP of the selected company and in on-site observations of the process execution. The methodology adopted for developing the company's PDP model considered the following aspects: i) study of a company; ii) informational modelling; iii) model automation and execution; iv) implementation of the PDP model in the company. The characteristics of the Unified Model of Rozenfeld et al. (2006) are also presented; it is used as a reference to systematize the modelling of the company's Product Development Process, based on an assessment of the company's process. A brief description presents the characteristics of the main standards used in business process modelling, including the main software applications used to support these modelling standards.
The results are divided into two parts: static and dynamic abstract models. The static abstract model is informational in character, presenting a wealth of detail in the form of a detailed process map. For automation, this static model was unfolded into two further abstract models, which are configured to become dynamic so that implementation and execution satisfactorily match how the process is actually executed in the company. The first dynamic abstract model implemented and executed defines the product and ends with the customer's decision on the quotation requested from the company's sales department. The second dynamic abstract model starts with the customer's approval of the quotation, initiating the informational design activities, and ends with the release to production. This approach aims to minimize the complexity of modelling the process and the company's specific particularities. Modelling the PDP with the reference model and applying the BPMN standard supported by Intalio BPMS made it possible to report good practices, lessons learned, and the difficulties and facilities encountered. In addition, the PDP formalized through modelling with BPMN and Intalio BPMS brought about significant changes in the current execution of the process, contributing to greater integration among the participants.
80

Conception d’une architecture de services d’intelligence ambiante pour l’optimisation de la qualité de service de transmission de messages en e-santé / Design of an ambient intelligence services architecture for optimizing quality of service of message transmission in eHealth

Guizani, Nachoua 30 September 2016 (has links)
The routing of eHealth messages in a ubiquitous environment raises several key challenges related to the diversity and specificity of the use cases and actors involved, and to the evolving medical, social, logistic and environmental contexts. We propose an original, autonomous and self-adaptive service orchestration method aimed at optimizing message flow and personalizing transmission quality by delivering messages to the most appropriate recipients within the required time. Our solution is a generic, model-driven architecture in which the domain information and context models were designed according to the needs and constraints identified for this problem. Our approach composes, in real time, services for the dynamic fusion and management of heterogeneous information from the source, target and message ecosystems, driven by artificial intelligence methods that support the routing decision. The aim is to ensure reliable, personalizable and context-aware communication, whatever the scenario and the message type (alarm, technical, etc.). Our architecture is applicable to various domains and has been consolidated by business process modelling (BPM) that makes the operation of its constituent services explicit. The proposed framework is based on ontologies and is compatible with the HL7 V3 standard. Self-adaptation of the routing decision process is ensured by a dynamic Bayesian network, and supervision of message status relies on a mathematical model using timed Petri nets.
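As a much simplified stand-in for the dynamic Bayesian network mentioned above, the following sketch scores candidate recipients by a naive-Bayes style combination of contextual evidence and routes the message to the best-scoring one; all recipient names, priors and likelihoods are invented for illustration:

```python
# Toy context-aware routing: pick the recipient with the highest posterior score.
def route(context: dict, priors: dict, likelihoods: dict) -> str:
    """Return the candidate recipient with the highest (unnormalized) posterior score."""
    scores = {}
    for recipient, prior in priors.items():
        score = prior
        for feature, value in context.items():
            # Unknown (feature, value) pairs get a small smoothing probability.
            score *= likelihoods.get(recipient, {}).get((feature, value), 0.01)
        scores[recipient] = score
    return max(scores, key=scores.get)


if __name__ == "__main__":
    priors = {"cardiologist_on_call": 0.4, "ward_nurse": 0.5, "technician": 0.1}
    likelihoods = {
        "cardiologist_on_call": {("type", "alarm"): 0.7, ("urgency", "high"): 0.8,
                                 ("time", "night"): 0.6},
        "ward_nurse": {("type", "alarm"): 0.5, ("urgency", "high"): 0.4,
                       ("time", "night"): 0.9},
        "technician": {("type", "technical"): 0.9, ("urgency", "high"): 0.2,
                       ("time", "night"): 0.7},
    }
    context = {"type": "alarm", "urgency": "high", "time": "night"}
    print(route(context, priors, likelihoods))   # expected: cardiologist_on_call
```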
