
From machine learning to learning with machines: remodeling the knowledge discovery process

Tuovinen, L. (Lauri) 19 August 2014
Abstract Knowledge discovery (KD) technology is used to extract knowledge from large quantities of digital data in an automated fashion. The established process model represents the KD process in a linear and technology-centered manner, as a sequence of transformations that refine raw data into more and more abstract and distilled representations. Any actual KD process, however, has aspects that are not adequately covered by this model. In particular, some of the most important actors in the process are not technological but human, and the operations associated with these actors are interactive rather than sequential in nature. This thesis proposes an augmentation of the established model that addresses this neglected dimension of the KD process. The proposed process model is composed of three sub-models: a data model, a workflow model, and an architectural model. Each sub-model views the KD process from a different angle: the data model examines the process from the perspective of different states of data and transformations that convert data from one state to another, the workflow model describes the actors of the process and the interactions between them, and the architectural model guides the design of software for the execution of the process. For each of the sub-models, the thesis first defines a set of requirements, then presents the solution designed to satisfy the requirements, and finally, re-examines the requirements to show how they are accounted for by the solution. The principal contribution of the thesis is a broader perspective on the KD process than what is currently the mainstream view. The augmented KD process model proposed by the thesis makes use of the established model, but expands it by gathering data management and knowledge representation, KD workflow and software architecture under a single unified model. 
Furthermore, the proposed model considers issues that are usually either overlooked or treated as separate from the KD process, such as the philosophical aspect of KD. The thesis also discusses a number of technical solutions to individual sub-problems of the KD process, including two software frameworks and four case-study applications that serve as concrete implementations and illustrations of several key features of the proposed process model.
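The relationship between the linear, technology-centered view and the augmented model could be sketched very roughly in code. The names below (`Step`, `KDProcess`, the actor roles) are illustrative inventions, not the thesis's own notation: a pipeline of data transformations in which some steps are attributed to human actors rather than machines.

```python
from dataclasses import dataclass, field
from typing import Callable

# Illustrative sketch: data states are refined by transformations (the
# established, linear view), while named actors -- some of them human --
# are attached to individual steps (the workflow sub-model's concern).

@dataclass
class Step:
    name: str                           # e.g. "clean", "mine", "interpret"
    transform: Callable[[object], object]
    actor: str = "machine"              # "machine" or a human role such as "analyst"

@dataclass
class KDProcess:
    steps: list = field(default_factory=list)

    def run(self, raw_data):
        data = raw_data
        for step in self.steps:
            data = step.transform(data)
        return data

    def human_touchpoints(self):
        # Where do people, rather than machines, interact with the process?
        return [s.name for s in self.steps if s.actor != "machine"]

process = KDProcess([
    Step("clean", lambda d: [x for x in d if x is not None]),
    Step("mine", lambda d: {"mean": sum(d) / len(d)}),
    Step("interpret", lambda d: f"average value was {d['mean']:.1f}", actor="analyst"),
])
```

The point of the sketch is only that the human `interpret` step is a first-class part of the process model rather than an afterthought appended to a purely mechanical pipeline.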

TEAM: AN ARCHITECTURE FOR E-WORKFLOW MANAGEMENT

LUIZ ANTONIO DE MORAES PEREIRA 30 August 2004
In distributed collaborative applications, the use of centralized repositories for storing shared data and programs compromises some important characteristics of this type of application, such as fault tolerance, scalability, and local autonomy. Applications like Kazaa, Gnutella, and Edutella exemplify the use of peer-to-peer (P2P) computing, which is regarded as an interesting alternative for solving the problems mentioned above without imposing the typical restrictions of centralized, or even distributed, systems such as mediators and HDBMSs. In this work we present the TEAM (Teamwork-support Environment Architectural Model) architecture for managing workflows on the Web. Besides describing the components and connectors of the architecture, which is based on P2P computing, we address the modelling of processes and the management of data, metadata, and process execution control information. We also discuss the strategy adopted for disseminating queries and messages forwarded to peers in environments based on the architecture. We illustrate the application of TEAM in a case study in e-learning.
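The abstract does not specify TEAM's dissemination strategy, but Gnutella-style networks, which it cites, typically flood queries with a hop limit (TTL). A minimal sketch of that general technique, with an invented toy topology:

```python
from collections import deque

# Illustrative sketch of TTL-bounded query flooding, the dissemination style
# used by Gnutella-like P2P networks; TEAM's actual strategy may differ.
# 'network' maps each peer to its list of neighbours.

def disseminate(network, origin, ttl):
    """Return the set of peers reached by a query flooded from 'origin'."""
    reached = {origin}
    frontier = deque([(origin, ttl)])
    while frontier:
        peer, hops_left = frontier.popleft()
        if hops_left == 0:
            continue  # the query is not forwarded further from this peer
        for neighbour in network[peer]:
            if neighbour not in reached:
                reached.add(neighbour)
                frontier.append((neighbour, hops_left - 1))
    return reached

network = {
    "A": ["B", "C"],
    "B": ["A", "D"],
    "C": ["A"],
    "D": ["B", "E"],
    "E": ["D"],
}
```

Raising the TTL trades network load for reach: with `ttl=1` a query from `A` touches only its direct neighbours, while `ttl=3` covers this whole five-peer network.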

RAfEG: Reference System Architecture and Prototype Implementation -- Excerpt from the Final Report of the Project "Referenzarchitektur für E-Government" (RAfEG) --

Kunis, Raphael, Rünger, Gudula 07 December 2007
The goal of the RAfEG project was to develop a reference architecture for e-government that provides the components needed to realize information and communication technology (ICT) systems for typical processes in subordinate authorities of the state ministries of the interior. The RAfEG architecture is a holistic approach covering many essential aspects, from the formal description of the domain relationships to the development of distributed software components for administrative business processes. Taking hardware prerequisites into account, the architecture defines the structure of software components for the automation of administration. RAfEG was designed as a spatially distributed, component-based software system. This required developing concepts for the efficient use of heterogeneous systems for interactive e-government applications. A prototype of the architecture was implemented for plan approval and plan authorization procedures, using the Regierungspräsidium Leipzig as an example. The project was characterized by the development of an end-to-end concept for optimal ICT support of administrative processes. This ranged from modelling the domain relationships (functional concept), through the development-oriented, methodical mapping of the matters to be implemented (data processing concept), to component-based software development (implementation concept). The concept resulted in a reference architecture for typical e-government processes. In addition to the purely functional, task-related aspects, security aspects as well as technical and organizational interfaces were examined in detail. The consistent use of open source software yields a cost-efficient, flexible reference solution that, owing to its component-based structure, can also be adapted very well to special requirements.

IT Support for Energy-Sensitive Product Development

Reichel, Thomas, Rünger, Gudula, Steger, Daniel, Xu, Haibin 15 July 2010
The development of low-cost, energy-saving, and resource-conserving products is becoming increasingly important. An essential foundation is the evaluation of costs and energy, based on virtual prototypes, over the entire product life cycle, from development and manufacturing through operation to recycling. Since design decisions made in early development phases, when no real prototype yet exists, can strongly influence later costs, it is necessary to extract empirical, decision-relevant data from the IT systems of product development (e.g., product data management systems) and of operation (e.g., enterprise resource planning systems) and to provide the designer with suitable methods for aggregating these data. Optimizing the energy efficiency of products in particular requires data from the entire life cycle, so that estimates of the product's lifetime energy consumption can be made as early as the development phase. Energy efficiency can be optimized either by increasing productivity at constant energy consumption or by reducing energy consumption at constant productivity. This report examines the product development process from an IT perspective, first analyzing current product development methodologies together with their IT support and the IT systems involved. Requirements are then formulated for an IT system that enables energy efficiency assessment and optimization in all phases of product development using the IT systems involved. Such an IT system for energy-sensitive product development (an energy-sensitive product development system) is intended to support the designer in developing energy-efficient products. To this end, the functionality of existing PDM systems must be extended with methods for the analysis, synthesis, and assessment of the product's energy efficiency. Finally, it is proposed how the methods for assessing energy-relevant data can be implemented by means of workflows.
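The core aggregation the report calls for can be illustrated with a toy calculation. The phase names, figures, and variant labels below are hypothetical, standing in for data that would come from PDM/ERP systems:

```python
# Hypothetical sketch of life-cycle energy aggregation: per-phase energy
# figures (kWh) for a product variant are combined into one estimate, so
# that design variants can be compared before a real prototype exists.

def lifecycle_energy(phases):
    """Total energy over all life-cycle phases of one product variant."""
    return sum(phases.values())

def more_efficient(variant_a, variant_b):
    """Return which of two design variants has the lower life-cycle energy."""
    return "A" if lifecycle_energy(variant_a) < lifecycle_energy(variant_b) else "B"

# Invented example data: variant B costs more energy to manufacture but
# less to operate, which dominates over the product's life.
variant_a = {"manufacturing": 120.0, "operation": 900.0, "recycling": 30.0}
variant_b = {"manufacturing": 150.0, "operation": 780.0, "recycling": 40.0}
```

This mirrors the report's point that early design decisions must be judged against the whole life cycle: looking at manufacturing alone would favor variant A, while the life-cycle total favors variant B.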

An economic analysis of digitalized and standardized workflows within the operating room

Von Schudnat, Christian 02 February 2025
Thesis by compendium / Digitalization in healthcare still lags behind many other industries but offers the potential to cope with increasing challenges. Using the example of a digital workflow management system implemented in the surgical operating room (OR), this thesis examines what the impact of digitalization on quality, efficiency, and economics could be for hospital management. The methodology for this research is based on a quantitative approach. First, an overview and analysis of the healthcare market and an extensive systematic literature review were carried out, focused on the efficiency and economic impact of the standardization and digitalization of intraoperative processes in the OR. The findings provided the basis for the research questions around digital workflow management systems. To fill the identified research gaps, single-center hospital data on a digital workflow management system, Surgical Process Manager (SPM), were retrospectively extracted and analyzed in orthopedics and general & visceral surgery (G&V). Additionally, patient data provided by the hospital were used retrospectively for economic calculations to answer the main research question in G&V, focusing on obesity surgery. The main findings of the thesis show that implementing digital workflow management systems, in the example of the SPM, improves efficiency and economics in obesity surgery. The odds ratio analysis used to assess the impact on quality did not allow conclusions in either orthopedics or obesity surgery. The cost-benefit calculation in obesity surgery showed cost savings of €318 per patient, totaling €10,073. This economic benefit was achieved by decreasing the length of stay (-1.2 days). For the first time, this research provides evidence of the economic value of digitized and standardized processes in the OR in G&V. The results will facilitate investment decisions in the digitization of hospital management and offer options to overcome the financial burden in the current healthcare market. The findings also provide in-depth details on specific cost structures, calculations, and reimbursement, and on how to effectively measure the financial impact of digital systems in the OR. / Von Schudnat, C. (2024). An economic analysis of digitalized and standardized workflows within the operating room [Tesis doctoral]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/203008

Scientific Workflows for Hadoop

Bux, Marc Nicolas 07 August 2018
Scientific workflows provide a means to model, execute, and exchange the increasingly complex analysis pipelines necessary for today's data-driven science. Over the last decades, scientific workflow management systems have emerged to facilitate the design, execution, and monitoring of such workflows. At the same time, the amounts of data generated in various areas of science have outpaced hardware advancements. Parallelization and distributed execution are generally proposed to deal with increasing amounts of data. However, the resources provided by distributed infrastructures are subject to heterogeneity, dynamic performance changes at runtime, and occasional failures. To leverage the scalability provided by these infrastructures despite the observed aspects of performance variability, workflow management systems have to progress: parallelization potentials in scientific workflows have to be detected and exploited; simulation frameworks, which are commonly employed for the evaluation of scheduling mechanisms, have to consider the instability encountered on the infrastructures they emulate; adaptive scheduling mechanisms have to be employed to optimize resource utilization in the face of instability; and state-of-the-art systems for scalable distributed resource management and storage, such as Apache Hadoop, have to be supported. This dissertation presents novel solutions for these aspirations. First, we introduce DynamicCloudSim, a cloud computing simulation framework that is able to adequately model the various aspects of variability encountered in computational clouds. Secondly, we outline ERA, an adaptive scheduling policy that optimizes workflow makespan by exploiting heterogeneity, replicating bottlenecks in workflow execution, and adapting to changes in the underlying infrastructure. Finally, we present Hi-WAY, an execution engine that integrates ERA and enables the highly scalable execution on Hadoop of scientific workflows written in a number of languages.
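One ingredient the abstract attributes to ERA, replicating bottleneck tasks, can be sketched as a speculative-execution heuristic of the kind Hadoop also uses: if a running task has taken much longer than a typical task, launch a copy elsewhere and keep whichever finishes first. This is a simplification under invented names, not the published algorithm:

```python
# Toy sketch of speculative task replication: flag running tasks whose
# elapsed time exceeds a multiple of the typical task runtime, so a
# scheduler could launch backup copies on other resources.

def pick_tasks_to_replicate(running, typical_runtime, factor=2.0):
    """Return names of tasks whose elapsed time exceeds factor * typical runtime.

    'running' maps task name -> elapsed seconds; 'typical_runtime' is a
    baseline estimate (e.g. the median runtime of completed tasks).
    """
    return [name for name, elapsed in running.items()
            if elapsed > factor * typical_runtime]

# Invented example: three tasks of a bioinformatics-style workflow.
running = {"align_reads": 40.0, "sort_bam": 12.0, "call_variants": 95.0}
```

The `factor` parameter controls how aggressively the scheduler spends spare resources on backup copies; on heterogeneous, unstable infrastructures a straggling task is often a slow machine rather than a slow task, which is what makes replication pay off.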

Using a workflow management system to support software development based on the Rational Unified Process, extended to reach maturity model levels 2 and 3

Manzoni, Lisandra Vielmo January 2001
This master's dissertation describes an assessment of the Rational Unified Process (RUP) based on the Capability Maturity Model for Software (SW-CMM or CMM), and the implementation of a prototype tool to support this process on top of an off-the-shelf workflow management system, Exchange 2000 Server. The prototype developed is called Project Management Environment (PME). RUP was assessed against the key practices described by the Capability Maturity Model (CMM) of the Carnegie Mellon Software Engineering Institute. The assessment identified the support that RUP offers to an organization aiming at CMM levels 2 and 3, and it resulted in proposals to complement RUP's core workflows in order to satisfy the key process areas of CMM. CMM is a process assessment model that seeks to achieve the maturity of an organization's processes; it is specific to software development, strongly emphasizes continuous improvement, and is already used successfully by several organizations. RUP, in turn, describes how to apply best practices of software engineering. The use of a workflow management system, in fact a collaboration server, to support the software development process was tried out experimentally. The resulting environment was assessed against requirements identified by various researchers for an environment that effectively supports a software development process. The prototype is a web-based process support system that provides means to assist the management of software development projects and to help the interaction and exchange of information between dispersed members of a development team. RUP presents a well-defined approach to software project management and software engineering processes, but it is not centered on systems management concerns; it therefore lacks activities involving issues such as cost management, human resource management, communications management, and procurement management. PME is a flexible tool that can be accessed through the Internet, supports collaboration between team members, and offers the benefits of the Web, with intuitive navigation through links and pages. It helps to support management control, providing options to plan and monitor the project, and it supports process events, such as state changes, notifying users of their assigned tasks.
88

Uso de sistema de gerência de workflow para apoiar o desenvolvimento de software baseado no processo unificado da Rational estendido para alcançar níveis 2 e 3 do modelo de maturidade / Using a workflow management system to support software development based on extended rational unified process to reach maturity model levels 2 and 3

Manzoni, Lisandra Vielmo January 2001 (has links)
This master's dissertation describes an assessment of the Rational Unified Process (RUP) against the Capability Maturity Model for Software (SW-CMM or CMM), and the implementation of a prototype environment supporting that process, built on an off-the-shelf workflow management system, Exchange 2000 Server (in effect a collaboration server). The prototype is called the Project Management Environment (PME; AGP in Portuguese). RUP was assessed against the key practices described by the Capability Maturity Model (CMM) of the Software Engineering Institute (SEI) at Carnegie Mellon University. The assessment identified the support this process model offers to organizations aiming at CMM levels 2 and 3, and resulted in proposals to complement RUP's core workflows so as to satisfy the key process areas of CMM. CMM provides a process assessment model aimed at achieving organizational process maturity; it is specific to software development, strongly emphasizes continuous improvement, and is already used successfully by many organizations. RUP, in turn, emerged as an effort to unify the best practices of software development.
The use of a workflow management system to support the software development process was then investigated experimentally. The resulting tool was evaluated against requirements identified in the literature as desirable for an environment supporting the development process. The prototype is a web-based support tool intended to assist software project managers in management and control activities, and to facilitate interaction and information exchange among dispersed members of a development team. The Unified Process presents a well-defined approach to software engineering and software project management processes, but it does not focus on systems management activities: it lacks activities covering human resource management, cost management, communications management, and procurement management. PME is a flexible tool that can be accessed over the Internet, supports collaboration between team members, and offers the benefits of the Web, such as intuitive navigation through links and pages. It supports management control by providing options to plan and monitor the project, handles process events such as state changes, and notifies users of their newly assigned tasks.
90

On the construction of decentralised service-oriented orchestration systems

Jaradat, Ward January 2016 (has links)
Modern science relies on workflow technology to capture, process, and analyse data obtained from scientific instruments. Scientific workflows are precise descriptions of experiments in which multiple computational tasks are coordinated based on the dataflows between them. Orchestrating scientific workflows presents a significant research challenge: they are typically executed in a manner such that all data pass through a centralised computer server known as the engine, which causes unnecessary network traffic that leads to a performance bottleneck. These workflows are commonly composed of services that perform computation over geographically distributed resources, and involve the management of dataflows between them. Centralised orchestration is clearly not a scalable approach for coordinating services dispersed across distant geographical locations. This thesis presents a scalable decentralised service-oriented orchestration system that relies on a high-level data coordination language for the specification and execution of workflows. This system's architecture consists of distributed engines, each of which is responsible for executing part of the overall workflow. It exploits parallelism in the workflow by decomposing it into smaller sub-workflows, and determines the most appropriate engines to execute them using computation placement analysis. This permits the workflow logic to be distributed closer to the services providing the data for execution, which reduces the overall data transfer in the workflow and improves its execution time. This thesis provides an evaluation of the presented system which concludes that decentralised orchestration provides scalability benefits over centralised orchestration, and improves the overall performance of executing a service-oriented workflow.
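The abstract above describes decomposing a workflow into sub-workflows and placing each one on the engine closest to the services providing its data, so that dataflows no longer all pass through a central engine. As a rough illustration only (a toy sketch, not code from the thesis; the task names, site labels, and helper functions are invented for this example), a locality-based partitioning of a small workflow DAG might look like:

```python
# Toy sketch of locality-based workflow partitioning (hypothetical example,
# not the thesis's actual placement analysis).
from collections import defaultdict

# A small workflow: each task is labelled with the location of the
# service that provides its input data.
tasks = {
    "fetch_a": "site-1",
    "fetch_b": "site-2",
    "filter_a": "site-1",
    "filter_b": "site-2",
    "merge": "site-1",
}

# Dataflow edges: upstream task -> downstream task.
edges = [("fetch_a", "filter_a"), ("fetch_b", "filter_b"),
         ("filter_a", "merge"), ("filter_b", "merge")]

def partition_by_location(tasks):
    """Group tasks into sub-workflows, one per engine location, so that
    data-local steps run without routing through a central engine."""
    subworkflows = defaultdict(list)
    for task, location in tasks.items():
        subworkflows[location].append(task)
    return dict(subworkflows)

def cross_site_transfers(edges, tasks):
    """Count dataflow edges that still cross engine boundaries."""
    return sum(1 for src, dst in edges if tasks[src] != tasks[dst])
```

Under this partitioning, only the `filter_b -> merge` edge crosses sites; in a fully centralised orchestration, every edge would involve a transfer through the engine. Real placement analysis would of course weigh data volumes and network costs rather than a simple edge count.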
