371 |
Koala: sistema para integração de métodos de predição e análise de estruturas de proteína / Koala: system for integration of methods for protein structures prediction and analysis Defelicibus, Alexandre 25 January 2016 (has links)
Computational Biology has developed algorithms applied to relevant problems from Biology. One of these problems is Protein Structure Prediction (PSP). Several methods have been developed in the literature to deal with this problem; however, reproducing their results and comparing the methods have not been easy tasks. Accordingly, the Critical Assessment of protein Structure Prediction (CASP) has among its objectives to perform such comparisons. Moreover, the systems developed for this problem tend to have low usability, which does not favour their use by non-specialists in computing. In order to minimize these difficulties, this work proposes Koala, a web-based system that integrates several algorithms for PSP and structure analysis, allowing the execution of complex experiments through workflows. The available prediction methods can be combined with analyses of the results using the RMSD, GDT-TS and TM-Score metrics. Moreover, the Sort by front dominance method (based on the Pareto optimality criterion), proposed in this work, can evaluate predictions without a reference structure. The results obtained, using target proteins from recent articles and from CASP11, indicate that Koala is able to run a relatively large set of structured experiments, benefiting the determination of better protein structures as well as the development of new approaches for prediction and analysis through workflows.
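The abstract only names the Sort by front dominance criterion; as a rough illustration of how Pareto (non-dominated) sorting can rank candidate models on several quality scores without a reference structure, a minimal sketch follows. The two example scores, their higher-is-better convention, and all names are assumptions, not Koala's actual implementation.

```python
# Minimal sketch of Pareto (non-dominated) sorting of predicted models.
# The two example scores and their "higher is better" convention are
# illustrative assumptions, not Koala's actual scoring functions.

def dominates(a, b):
    """True if model a is at least as good as b on every score and
    strictly better on at least one (higher scores assumed better)."""
    return all(x >= y for x, y in zip(a["scores"], b["scores"])) and \
           any(x > y for x, y in zip(a["scores"], b["scores"]))

def sort_by_front_dominance(models):
    """Group models into successive Pareto fronts (front 0 = best)."""
    remaining = list(models)
    fronts = []
    while remaining:
        front = [m for m in remaining
                 if not any(dominates(o, m) for o in remaining if o is not m)]
        fronts.append(front)
        remaining = [m for m in remaining if m not in front]
    return fronts

if __name__ == "__main__":
    candidates = [
        {"name": "model_A", "scores": (0.71, 0.64)},   # hypothetical scores
        {"name": "model_B", "scores": (0.65, 0.70)},
        {"name": "model_C", "scores": (0.60, 0.58)},   # dominated by A and B
    ]
    for rank, front in enumerate(sort_by_front_dominance(candidates)):
        print(rank, [m["name"] for m in front])
```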
|
372 |
Scientific Workflows for Hadoop Bux, Marc Nicolas 07 August 2018 (has links)
Scientific workflows provide a means to model, execute, and exchange the increasingly complex analysis pipelines necessary for today's data-driven science. Over the last decades, scientific workflow management systems have emerged to facilitate the design, execution, and monitoring of such workflows. At the same time, the amounts of data generated in various areas of science have outpaced hardware advancements.
Parallelization and distributed execution are generally proposed to deal with increasing amounts of data. However, the resources provided by distributed infrastructures are subject to heterogeneity, dynamic performance changes at runtime, and occasional failures. To leverage the scalability provided by these infrastructures despite the observed aspects of performance variability, workflow management systems have to progress: Parallelization potentials in scientific workflows have to be detected and exploited. Simulation frameworks, which are commonly employed for the evaluation of scheduling mechanisms, have to consider the instability encountered on the infrastructures they emulate. Adaptive scheduling mechanisms have to be employed to optimize resource utilization in the face of instability. State-of-the-art systems for scalable distributed resource management and storage, such as Apache Hadoop, have to be supported.
This dissertation presents novel solutions for these aspirations. First, we introduce DynamicCloudSim, a cloud computing simulation framework that is able to adequately model the various aspects of variability encountered in computational clouds. Secondly, we outline ERA, an adaptive scheduling policy that optimizes workflow makespan by exploiting heterogeneity, replicating bottlenecks in workflow execution, and adapting to changes in the underlying infrastructure. Finally, we present Hi-WAY, an execution engine that integrates ERA and enables the highly scalable execution of scientific workflows written in a number of languages on Hadoop.
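ERA itself is only summarised above; the toy sketch below illustrates the general shape of an adaptive scheduling policy: observed runtimes update per-node performance estimates, each ready task goes to the node with the earliest expected finish time, and a task running far beyond its estimate is speculatively replicated. The update rule, cost model and threshold are assumptions, not ERA's actual algorithm.

```python
# Toy sketch of adaptive scheduling with speculative replication of
# straggling tasks; the performance-update rule, the cost model and the
# replication threshold are illustrative assumptions, not ERA itself.

class AdaptiveScheduler:
    def __init__(self, nodes, alpha=0.5):
        self.rate = {n: 1.0 for n in nodes}        # est. seconds per unit of work
        self.busy_until = {n: 0.0 for n in nodes}  # when each node becomes free
        self.alpha = alpha                         # weight of the newest sample

    def assign(self, task_work, now):
        """Send the task to the node with the earliest expected finish time."""
        def eta(n):
            return max(self.busy_until[n], now) + task_work * self.rate[n]
        node = min(self.rate, key=eta)
        self.busy_until[node] = eta(node)
        return node, self.busy_until[node]

    def observe(self, node, task_work, runtime):
        """Adapt the node's performance estimate after a task finishes."""
        self.rate[node] = ((1 - self.alpha) * self.rate[node]
                           + self.alpha * runtime / task_work)

    def should_replicate(self, expected_runtime, elapsed, factor=2.0):
        """Speculatively replicate a task running far longer than expected."""
        return elapsed > factor * expected_runtime

if __name__ == "__main__":
    sched = AdaptiveScheduler(["node_a", "node_b"])
    sched.observe("node_a", task_work=10, runtime=30)  # node_a turned out slow
    print(sched.assign(task_work=10, now=0.0))         # -> ('node_b', 10.0)
```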
|
373 |
Resource-oriented architecture based scientific workflow modelling Duan, Kewei January 2016 (has links)
This thesis studies the feasibility and methodology of applying state-of-the-art computer technology to scientific workflow modelling within a collaborative environment, where the people involved include non-computer scientists and engineers from other disciplines. The objective of this research is to provide a systematic, web-based methodology that lowers the barriers raised by the heterogeneity of the multiple institutions, multiple platforms and geographically distributed resources implied in such a collaborative scientific workflow environment.
|
374 |
P-MIA : padrão múltiplas instâncias autoadaptáveis : um padrão de dados para workflows científicos / P-MIA : self-adaptive multiple instances pattern : a data pattern for scientific workflows Hübler, Patrícia Nogueira 10 December 2010 (has links)
The search for computerized solutions that deliver agility and reliability in information leads professionals from different fields to adopt technologies with similar purposes. The use of workflow management systems is one example of this kind of solution, which companies and scientists employ to document the steps performed and to optimize execution time. This thesis presents a pattern capable of handling large volumes of data and optimizing their processing by identifying promising groups of data, as a component of scientific workflows. The application area is Bioinformatics, a multidisciplinary field that relies on several computational tools to carry out its experiments, which can take years to complete. Within Bioinformatics, the proposed solution benefits rational drug design. The study area is therefore contextualized, and a solution to the problem is proposed through the definition of a data pattern that allows running workflow instances to self-adapt. P-MIA (Padrão Múltiplas Instâncias Autoadaptáveis, the self-adaptive multiple instances pattern), so named because it handles a large data set and defines at runtime the actions to be performed on the data, is formalized on the basis of Petri net definitions, with its graphical representation given by coloured Petri nets. Experimental tests carried out on the pattern show that, by using P-MIA, it is possible to reduce the number of experiments while maintaining an acceptable quality criterion.
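The Petri-net formalisation of P-MIA is not reproduced in the abstract; the small sketch below only illustrates the general idea of a self-adaptive multiple-instance step that prunes instances at runtime based on a quality score. The scoring function and keep-fraction are assumptions, not the pattern's actual definition.

```python
# Rough sketch of a self-adaptive multiple-instance step: spawn one
# instance per data item, score the partial results at runtime, and keep
# only the promising instances for the next stage.  The scoring rule and
# the keep-fraction are illustrative assumptions.

def run_multiple_instances(data_items, stage, score, keep_fraction=0.25):
    """Run `stage` on every item, then keep the best-scoring results."""
    results = [(item, stage(item)) for item in data_items]
    results.sort(key=lambda r: score(r[1]), reverse=True)
    keep = max(1, int(len(results) * keep_fraction))
    return results[:keep]

if __name__ == "__main__":
    # hypothetical stage: pretend each item's "result" is just a number
    items = list(range(20))
    promising = run_multiple_instances(items,
                                       stage=lambda x: x * 0.1,
                                       score=lambda r: r)
    print([item for item, _ in promising])  # the 5 highest-scoring items
```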
|
375 |
Avaliação de métodos para geração de documentos personalizados / Evaluation of methods for the generation of personalized documents Riva, Aline Duarte 23 March 2011 (has links)
This work describes the development of three methods for building personalized documents (newspapers and catalogues) for their readers. Two of these methods use an algorithm proposed in Oliveira [21] that is responsible for distributing the information across the pages of the documents, while the third method is based on building templates used together with the InDesign publishing tool. After developing the three methods, two types of user study are carried out to assess the quality of the documents created: one with a group of users with no experience in publishing tools, and the other with future journalists and advertising professionals. The goal of the work is to identify which of the developed methods is most suitable for generating particular types of document, in this case newspapers and catalogues. Finally, the work presents concepts related to the area of personalized documents, as well as an evaluation of the qualitative and quantitative results.
|
376 |
Um workflow científico para a modelagem do processo de desenvolvimento de fármacos assistido por computador utilizando receptor flexível / A scientific workflow for modelling the computer-assisted drug development process using a flexible receptor Machado, Karina dos Santos 30 March 2007 (has links)
Computer-assisted drug design (CADD) is a process involving the sequential execution of different programs, in which it is tested whether a given ligand (a small molecule) interacts well with a receptor (usually a protein or enzyme). This process is typically carried out with the help of shell scripts; however, modifying input parameters and analysing results in this kind of approach is a complex and time-consuming task. Furthermore, to account for receptor flexibility during docking experiments, thousands of receptor snapshots have to be used. In this work, these snapshots are obtained from the trajectory of molecular dynamics simulations of the receptor. Given the challenges of handling this large number of receptor snapshots and the need for better control over the different programs involved, this work presents a scientific workflow that automates the computer-assisted drug design process while explicitly including receptor flexibility. The JAWE and Shark software tools were used to model and execute the workflow, respectively. Even with this automation, problems remain regarding the number of receptor snapshots to be used: the time required to run docking experiments over roughly three thousand receptor snapshots is around 500 hours. To simplify and speed up the process, a snapshot selection scheme based on the free energy of binding (FEB) was developed. Thus, during the docking experiments, called selective docking, only a subset of the snapshots is used: those that achieved the best interaction results in terms of FEB during an exhaustive experiment with a given ligand. To validate this implementation and selection scheme, docking experiments were run with the M. tuberculosis enzyme InhA as receptor and five different ligands. The results of these experiments illustrate the efficiency of the implemented workflow and of the receptor snapshot selection scheme.
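A minimal sketch of the kind of FEB-based snapshot filter described above is shown below; the data layout, field order and fixed selection fraction are assumptions rather than the thesis's actual implementation.

```python
# Sketch of FEB-based snapshot selection for "selective docking": keep
# only the snapshots that gave the best (lowest) FEB with a reference
# ligand in an exhaustive run.  The data layout is assumed.

def select_snapshots(exhaustive_results, fraction=0.1):
    """exhaustive_results: list of (snapshot_id, feb_kcal_per_mol).
    Returns the ids of the best `fraction` of snapshots (lowest FEB)."""
    ranked = sorted(exhaustive_results, key=lambda r: r[1])
    keep = max(1, int(len(ranked) * fraction))
    return [snap_id for snap_id, _ in ranked[:keep]]

if __name__ == "__main__":
    # hypothetical FEB values for a handful of MD snapshots
    results = [("snap_0001", -8.2), ("snap_0002", -6.1),
               ("snap_0003", -9.4), ("snap_0004", -5.0)]
    print(select_snapshots(results, fraction=0.5))  # ['snap_0003', 'snap_0001']
```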
|
377 |
Fluigi: an end-to-end software workflow for microfluidic design Huang, Haiyao 17 February 2016 (has links)
One goal of synthetic biology is to design and build genetic circuits in living cells for a range of applications with implications in health, materials, and sensing. Computational design methodologies allow for increased performance and reliability of these circuits. Major challenges that remain include increasing the scalability and robustness of engineered biological systems and streamlining and automating the synthetic biology workflow of “specify-design-build-test.”
I summarize the advances in microfluidic technology, particularly microfluidic large scale integration, that can be used to address the challenges facing each step of the synthetic biology workflow for genetic circuits. Microfluidic technologies allow precise control over the flow of biological content within microscale devices, and thus may provide more reliable and scalable construction of synthetic biological systems. However, adoption of microfluidics for synthetic biology has been slow due to the expert knowledge and equipment needed to fabricate and control devices. I present an end-to-end workflow for a
computer-aided-design (CAD) tool, Fluigi, for designing microfluidic devices and for integrating biological Boolean genetic circuits with microfluidics. The workflow starts with a "netlist" input describing the connectivity of the microfluidic device to be designed, and proceeds through placement, routing, and design rule checking in a process analogous to electronic CAD. The output is an image of the device for printing as a mask for photolithography or for computer numerical control (CNC) machining. I also introduce a second workflow to allocate biological circuits to microfluidic devices and to generate the valve control scheme that enables biological computation on the device.
I used the CAD workflow to generate 15 designs including gradient generators, rotary pumps, and devices for housing biological circuits. I fabricated two designs, a gradient generator with CNC machining and a device for computing a biological XOR function with multilayer soft lithography, and verified their functions with dye. My efforts here show a first end-to-end demonstration of an extensible and foundational microfluidic CAD tool from design concept to fabricated device. This work provides a platform that when completed will automatically synthesize high level functional and performance specifications into fully realized microfluidic hardware, control software, and synthetic biological wetware.
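Fluigi's own netlist format is not given in the abstract; the sketch below only illustrates the kind of connectivity description, naive placement step and design-rule-style check such a CAD flow builds on. The netlist schema, component names, grid pitch and placement heuristic are all invented for illustration.

```python
# Sketch of a microfluidic "netlist" and a naive grid placement step,
# loosely analogous to electronic CAD; the netlist schema, component
# names, and the row-major placement heuristic are invented here.
import math

netlist = {
    "components": ["inlet", "mixer", "valve_1", "chamber", "outlet"],
    "connections": [("inlet", "mixer"), ("mixer", "valve_1"),
                    ("valve_1", "chamber"), ("chamber", "outlet")],
}

def place_on_grid(netlist, pitch=1000):
    """Assign each component an (x, y) position on a square grid.
    `pitch` is the grid spacing in micrometres (assumed unit)."""
    n = len(netlist["components"])
    cols = math.ceil(math.sqrt(n))
    return {name: ((i % cols) * pitch, (i // cols) * pitch)
            for i, name in enumerate(netlist["components"])}

def check_connections(netlist):
    """Minimal design-rule-style check: every connection endpoint
    must be a declared component."""
    declared = set(netlist["components"])
    return [c for c in netlist["connections"] if not set(c) <= declared]

if __name__ == "__main__":
    print(place_on_grid(netlist))
    print("bad connections:", check_connections(netlist))
```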
|
378 |
Cost-efficient resource management for scientific workflows on the cloud Pietri, Ilia January 2016 (has links)
Scientific workflows are used in many scientific fields to abstract complex computations (tasks) and the data or flow dependencies between them. High performance computing (HPC) systems have been widely used for the execution of scientific workflows. Cloud computing has gained popularity by offering users on-demand provisioning of resources and the ability to choose from a wide range of possible configurations. To do so, resources are made available in the form of virtual machines (VMs), described as a set of resource characteristics, e.g. amount of CPU and memory. The notion of VMs enables the use of different resource combinations, which facilitates the deployment of applications and the management of resources. A problem that arises is determining the configuration, such as the number and type of resources, that leads to efficient resource provisioning. For example, allocating a large amount of resources may reduce application execution time, but at the expense of increased cost. This thesis investigates the challenges that arise in resource provisioning and task scheduling of scientific workflows and explores ways to address them, developing approaches that improve energy efficiency for scientific workflows and meet the user's objectives, e.g. makespan and monetary cost. The motivation stems from the wide range of options that enable the selection of cost-efficient configurations and improve resource utilisation. The contributions of this thesis are the following. (i) A survey of the issues arising in resource management in cloud computing; the survey focuses on VM management, cost efficiency and the deployment of scientific workflows. (ii) A performance model to estimate workflow execution time for different numbers of resources based on the workflow structure; the model can be used to estimate the respective user and energy costs in order to determine configurations that lead to efficient resource provisioning and achieve a balance between various conflicting goals. (iii) Two energy-aware scheduling algorithms that maximise the number of completed workflows from an ensemble under energy and budget or deadline constraints; the algorithms address the problem of energy-aware resource provisioning and scheduling for scientific workflow ensembles. (iv) An energy-aware algorithm that selects the frequency to be used for each workflow task in order to achieve energy savings without exceeding the workflow deadline; the algorithm takes into account the different requirements and constraints that arise depending on the workflow and system characteristics. (v) Two cost-based frequency selection algorithms that choose the CPU frequency for each provisioned resource in order to achieve cost-efficient resource configurations for the user and complete the workflow within the deadline; decision making is based on both the workflow characteristics and the pricing model of the provider.
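The cost-based frequency selection algorithms are only named above; the sketch below illustrates the underlying idea of picking, for a resource, the CPU frequency that minimises an energy-plus-price cost while still meeting the deadline. The linear runtime scaling, quadratic power model and price constants are simplifying assumptions, not the thesis's models.

```python
# Sketch of deadline-constrained CPU frequency selection; the runtime
# and power models below are common simplifying assumptions.

def pick_frequency(base_runtime_s, base_freq_ghz, freqs_ghz, deadline_s,
                   energy_price_per_j=5e-4, vm_price_per_s=1e-3):
    """Return (frequency_ghz, cost) minimising cost subject to the
    deadline, or None if no available frequency can meet it."""
    best = None
    for f in freqs_ghz:
        runtime = base_runtime_s * base_freq_ghz / f      # assumed linear scaling
        if runtime > deadline_s:
            continue                                      # deadline violated
        power_w = 10.0 * (f / base_freq_ghz) ** 2         # assumed power model
        cost = runtime * (power_w * energy_price_per_j + vm_price_per_s)
        if best is None or cost < best[1]:
            best = (f, cost)
    return best

if __name__ == "__main__":
    # hypothetical task: one hour at 2.6 GHz, deadline of 1.5 hours
    print(pick_frequency(base_runtime_s=3600, base_freq_ghz=2.6,
                         freqs_ghz=[1.2, 1.8, 2.2, 2.6], deadline_s=5400))
```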
|
379 |
Multi-agent workflow model (MAWM): a workflow model designed for Chinese business processes. / CUHK electronic theses & dissertations collection / Digital dissertation consortium / ProQuest dissertations and theses January 2001 (has links)
In this thesis, we provide a starting point for supporting organizational issues, especially cultural issues, in workflow modeling. Our research focuses on the context of Chinese business processes: based on our study of Chinese business culture, we identified a set of desirable properties for workflows in Chinese business process modeling. / A new workflow model, the multi-agent workflow model (MAWM), is proposed. The culture-dependent issues in Chinese business process modeling are considered and catered for in MAWM. Moreover, MAWM offers high flexibility in workflow modeling. / Further, the structural correctness of a process in MAWM is verified. A set of structural correctness criteria is defined, and based on them a series of verification algorithms is designed. We also show that the verification algorithm for a single process is linear in complexity and is therefore more efficient than contemporary algorithms, which are polynomial. / Our research has shown that one must cater for cultural requirements for effective Chinese business process modeling; this aspect, however, has so far been largely overlooked. MAWM is our attempt to break through this "cultural vs. technology" iceberg. / Ren Yongjie. / Source: Dissertation Abstracts International, Volume: 62-09, Section: B, page: 4086. / Supervisors: Kam-Fai Wong; Boon-Toh Low. / Thesis (Ph.D.)--Chinese University of Hong Kong, 2001. / Includes bibliographical references (p. 124-132). / Abstracts in English and Chinese.
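The concrete correctness criteria of MAWM are not given in the abstract; as an illustration of the kind of linear-time structural verification involved, the sketch below checks that a workflow graph has exactly one start node, exactly one end node, and no node unreachable from the start. The criteria, graph encoding and node names are assumptions.

```python
# Sketch of a linear-time structural check on a workflow graph: exactly
# one start, exactly one end, every node reachable from the start.
# The criteria and the graph encoding are illustrative only.
from collections import deque

def check_structure(nodes, edges):
    succ = {n: [] for n in nodes}
    indeg = {n: 0 for n in nodes}
    for a, b in edges:
        succ[a].append(b)
        indeg[b] += 1
    starts = [n for n in nodes if indeg[n] == 0]
    ends = [n for n in nodes if not succ[n]]
    errors = []
    if len(starts) != 1:
        errors.append(f"expected one start node, found {starts}")
    if len(ends) != 1:
        errors.append(f"expected one end node, found {ends}")
    seen = set(starts[:1])
    queue = deque(starts[:1])
    while queue:                       # BFS: O(nodes + edges)
        for m in succ[queue.popleft()]:
            if m not in seen:
                seen.add(m)
                queue.append(m)
    unreachable = set(nodes) - seen
    if unreachable:
        errors.append(f"unreachable nodes: {sorted(unreachable)}")
    return errors

if __name__ == "__main__":
    nodes = ["start", "approve", "archive", "end", "orphan"]
    edges = [("start", "approve"), ("approve", "archive"), ("archive", "end")]
    print(check_structure(nodes, edges) or "structurally correct")
```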
|
380 |
Budování bankovního kontaktního centra / Building a contact center in banking industry Seifrt, Ladislav January 2006 (has links)
Building a contact center is a very broad and complex topic. This master's thesis focuses on the effective use of available technologies to support contact center (CC) processes, particularly in the banking sector. The thesis develops a multidimensional service model of a contact center that captures the process side of its management. The service model is then placed in the context of current CC technologies, resulting in a model that captures the influence of individual technologies on the interaction-management process. At the end of the theoretical part, this model is applied to the conditions of the banking industry. The thesis also includes a practical (non-public) part, which applies the model to the conditions of the contact center of the GE Money companies in the Czech Republic and Slovakia. The development of the GE Money contact center over a selected period, beginning before the start and ending with the completion of several projects that were under way while this thesis was being written, is placed in the context of the model. This confrontation of the GE Money CC with the model then serves as the starting point for recommending further steps in building and developing the CC.
|