281 |
Integrating Synchronous Collaborative Applications with Product Lifecycle Management Workflows
Johnson, Jordan Lowell, 01 June 2015 (has links) (PDF)
Product Lifecycle Management (PLM) systems are used by thousands of engineering companies worldwide, so improving these systems has a broad, global effect. One possible improvement is to integrate synchronous collaborative applications with PLM systems. These applications allow multiple people to work on a single digital object simultaneously and have already been shown to reduce the time a task requires, so using them to complete a project will reduce the project time. However, simply embedding synchronous collaborative applications within a PLM system ignores further time-saving opportunities: the integration must also provide improved awareness at the project level, so that users can mediate their own actions. This thesis presents a method for such an integration, along with a prototype that implements the method and was used for testing. As hypothesized, including synchronous collaborative applications shortened the overall project time. In addition, providing awareness information and allowing users to mediate themselves further shortened project times and reduced variation in those times. Proper integration should therefore provide awareness at the project level and allow users to mediate themselves to some extent.
|
282 |
CAD-Centric Dynamic Workflow Generation
Kenworthy, Travis L., 20 April 2009 (has links) (PDF)
CAD systems are important design tools that enable the designer to conceptualize, visualize, analyze, and manufacture a design (Shahin 2008). Although high-end CAD systems provide several built-in design applications, CAD users often rely on custom or proprietary non-CAD analyses to constrain, optimize, or evaluate their designs, so an efficient method is needed to perform trade studies from within the CAD environment. Methods have been developed to meet this need and implemented in a program, called the Process Integrator, which resides in a CAD system and allows the user to perform trade studies on an assembly model from within the CAD environment. The Process Integrator lets the user create a generic process configuration that links analyses with CAD assemblies for optimization; the generic configuration can then be run at any time, on any assembly that meets the configuration requirements. Test cases demonstrate the efficiency of the automated process, and the results indicate that significant time savings can be achieved through the application of these methods.
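The abstract does not describe the configuration format itself; as a rough illustration only (the class, field, and function names below are assumptions, not Kenworthy's actual Process Integrator API), a generic trade-study configuration can be thought of as a list of assembly parameters to vary plus pluggable analysis callbacks that score each variant:

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

# Hypothetical, simplified stand-in for a "generic process configuration":
# it names the assembly parameters to vary and the analyses to run on each variant.
@dataclass
class TradeStudyConfig:
    parameters: Dict[str, List[float]]                    # parameter name -> candidate values
    analyses: List[Callable[[Dict[str, float]], float]]   # each analysis scores one variant

def run_trade_study(config: TradeStudyConfig, regenerate_assembly):
    """Evaluate every parameter combination; return the best-scoring variant."""
    from itertools import product
    best = None
    names = list(config.parameters)
    for values in product(*(config.parameters[n] for n in names)):
        variant = dict(zip(names, values))
        regenerate_assembly(variant)                       # push values into the CAD model
        score = sum(analysis(variant) for analysis in config.analyses)
        if best is None or score < best[1]:                # assume lower is better
            best = (variant, score)
    return best

# Toy usage: vary two parameters of a hypothetical bracket assembly.
config = TradeStudyConfig(
    parameters={"thickness_mm": [2.0, 3.0, 4.0], "hole_diameter_mm": [5.0, 6.0]},
    analyses=[lambda p: p["thickness_mm"] * 1.3,           # stand-in for a mass analysis
              lambda p: 10.0 / p["hole_diameter_mm"]],     # stand-in for a stress analysis
)
print(run_trade_study(config, regenerate_assembly=lambda variant: None))
```

Because such a configuration refers only to parameter names and callable analyses, it can be reused on any assembly exposing those parameters, which mirrors the reusability claim in the abstract.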
|
283 |
Blockchain Use for Data Provenance in Scientific Workflow
Sigurjonsson, Sindri Már Kaldal, January 2018 (has links)
In scientific workflows, data provenance plays a central role. Through data provenance, the execution of the workflow is documented and information about the data items involved is stored. This can be used to reproduce scientific experiments or to prove how the results of the workflow came to be. It is therefore vital that the provenance data stored in the provenance database always remains synchronized with its corresponding workflow, so that tampering with the provenance database can be detected. Blockchain technology has gained a lot of attention since Satoshi Nakamoto released the Bitcoin paper in 2008. A blockchain is an append-only ledger that is replicated across a peer-to-peer network and offers high tamper resistance through its consensus protocols. In this thesis, the suitability of blockchain technology for synchronizing a workflow with its provenance data was explored. A system that generates a workflow from a definition written in a Domain Specific Language was extended to use a blockchain to synchronize the workflow itself and its results. Furthermore, the InterPlanetary File System (IPFS) was used to assist with versioning individual executions of the workflow, making it possible to compare executions in detail and discover how they differ. The solution was analyzed against the FDA's 21 CFR Part 11 regulations to see how it could assist with fulfilling their requirements. Analysis of the system shows that the blockchain extension can be used to verify whether the synchronization between a workflow and its results has been tampered with. Experiments revealed that the size of the workflow did not have a significant effect on the execution time of the extension, and the proposed solution incurs a constant cost in digital currency regardless of workflow size. However, even though the extension shows some promise for assisting with the requirements of 21 CFR Part 11, the analysis revealed that it does not fully comply with the regulations due to their complexity.
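The abstract leaves the anchoring mechanism at a high level. A minimal conceptual sketch of the general idea (the record layout and function names are assumptions, not the thesis's actual implementation) is to hash each workflow definition and its results, keep the bulky artifacts in a content-addressed store such as IPFS, and append only the fixed-size digests to the ledger:

```python
import hashlib
import json
import time

def digest(data: bytes) -> str:
    """Content hash used for tamper evidence (and as a stand-in for an IPFS-style identifier)."""
    return hashlib.sha256(data).hexdigest()

def provenance_record(workflow_definition: bytes, results: bytes) -> dict:
    """Build the small, fixed-size record that would be appended to the ledger.

    Only digests go on-chain, so the transaction cost stays constant
    regardless of how large the workflow or its outputs are.
    """
    return {
        "workflow_digest": digest(workflow_definition),
        "results_digest": digest(results),
        "timestamp": int(time.time()),
    }

def verify(workflow_definition: bytes, results: bytes, on_chain: dict) -> bool:
    """Recompute the digests from the provenance database and compare with the ledger."""
    return (digest(workflow_definition) == on_chain["workflow_digest"]
            and digest(results) == on_chain["results_digest"])

record = provenance_record(b"workflow.dsl contents", b"execution results")
print(json.dumps(record, indent=2))
print(verify(b"workflow.dsl contents", b"execution results", record))  # True
```

Keeping only fixed-size digests on-chain is consistent with the abstract's observation that the cost in digital currency stays constant regardless of workflow size.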
|
284 |
Transitions in the Workflow of a Psychiatric Hospital Unit During the COVID-19 Pandemic and Design Implications
Raquel Lauren Ruiz (14070165), 01 December 2022 (has links)
As in many other aspects of society, the COVID-19 pandemic imposed a stress test on the entire healthcare system. Hospitals faced complex challenges, including making adaptations to each unit's specific characteristics and needs. Inpatient psychiatric units differ substantially from other hospital units, creating unique problems and considerations for safely managing patients, families, and healthcare staff during a pandemic. As research is done to improve our infrastructure for future pandemics, facilities with psychiatric units have unique design considerations. This thesis examines the effects of the COVID-19 pandemic on the workflow of a psychiatric unit and the possible implications for future design requirements. Specifically, it looks at transitions: which areas remained unchanged, which changes were made but did not persist, which changes did persist, and what needs to be improved to function better in the next pandemic. The methodology included an examination of current design requirements and specifications and of the workflow in a facility of this type. To maintain the facility's and staff's privacy, the specific unit is identified as Unit-A. A construction firm provided access to drawings, and engineers were interviewed to elucidate specific design/performance requirements. The methodology also included interviews of nurses and administrative staff actively engaged within Unit-A throughout the pandemic. A workflow was established based on the information collected before and during the pandemic. An analysis of the interviews and the workflow revealed that (i) the highest demands were placed on the direct-care staff, and (ii) policies implemented by the hospital without special consideration for the psychiatric unit caused unintended strain on the workflow. These findings suggest that each hospital should have a pandemic plan tailored to its individual units, and that obtaining more input from those directly impacted will improve the resiliency of the workflow. They also suggest that design specifications need to incorporate the needs of the professional staff more effectively, for example by including more spaces for healthcare staff than current standards require.
|
285 |
Konstnärlighet inom analoga kontra digitala arbetssätt vid mixning : En jämförande studie mellan två olika arbetssätt
Vendel, Simon, January 2023 (has links)
No description available.
|
286 |
Efficient and Cost-effective Workflow Based on Containers for Distributed Reproducible Experiments
Perera, Shelan, January 2016 (has links)
Reproducing distributed experiments is a challenging task for many researchers, for several reasons. To reproduce distributed experiments, researchers need to perform complex deployments involving many dependent software stacks, many configurations, and manual orchestration. They also need to allocate a large amount of money for clusters of machines and then spend their valuable time performing those experiments. Furthermore, researchers often spend a lot of time validating a distributed scenario in a real environment, because most pseudo-distributed setups do not exhibit the characteristics of a real distributed system. Karamel addresses the inconvenience of manual orchestration by providing a comprehensive orchestration platform to deploy and run distributed experiments. Still, this solution may incur expenses similar to a manual distributed setup, since it uses virtual machines underneath, and it does not provide quick validation of a distributed setup with a fast feedback loop, as terminating and provisioning new virtual machines takes considerable time. We therefore provide a solution by integrating Docker so that it coexists seamlessly with the virtual-machine-based deployment model. Our solution encapsulates the container-based deployment model so that users can reproduce distributed experiments in a cost-effective and efficient manner. In this project, we introduce a novel container-based deployment model that is not possible with the conventional virtual-machine-based deployment model. Further, we evaluate our solution with a real deployment of the Apache Hadoop TeraSort experiment, a benchmark for the Hadoop MapReduce platform, to show how this model can be used to save cost and improve efficiency.
|
287 |
Automated 3D Visualization of Brain Cancer
Al-Rei, Mona, January 2017 (has links)
Three-dimensional (3D) visualization in cancer control has seen recent progress due to the benefits it offers to the treatment, education, and understanding of the disease. This work identifies the need for an application that directly processes two-dimensional (2D) DICOM images to segment a brain tumor and generate an interactive 3D model suitable for multisensory learning and visualization. A new software application (M-3Ds) was developed to meet these objectives, with three modes of segmentation (manual, automatic, and hybrid) for evaluation. The M-3Ds software was designed to mitigate cognitive load and empower health care professionals in their decision making, for improved patient outcomes and safety. The accuracy of the three modes was compared, and industry-standard software programs were employed to verify and validate the results of M-3Ds using quantitative volumetric comparison. The study determined that M-3Ds' hybrid mode achieved the highest accuracy with the least user intervention for brain tumor segmentation and is suitable for the clinical workflow. This thesis presents a novel approach to improving medical education, diagnosis, and treatment planning for either surgery or radiotherapy of brain cancer. / Thesis / Master of Science (MSc)
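The abstract mentions quantitative volumetric comparison against industry-standard software but does not name the metric. A common choice, shown here purely as an illustration (the metric choice, array names, and toy masks are assumptions, not the thesis's actual evaluation code), is the Dice similarity coefficient together with a simple volume ratio between two binary segmentation masks:

```python
import numpy as np

def dice_coefficient(seg_a: np.ndarray, seg_b: np.ndarray) -> float:
    """Overlap between two binary 3D segmentation masks (1 = tumor voxel)."""
    a, b = seg_a.astype(bool), seg_b.astype(bool)
    intersection = np.logical_and(a, b).sum()
    return 2.0 * intersection / (a.sum() + b.sum())

def volume_mm3(mask: np.ndarray, voxel_spacing=(1.0, 1.0, 1.0)) -> float:
    """Segmented volume in cubic millimetres, given the DICOM voxel spacing."""
    return mask.astype(bool).sum() * float(np.prod(voxel_spacing))

# Toy example: compare a candidate segmentation with a reference one.
reference = np.zeros((64, 64, 32), dtype=np.uint8)
reference[20:40, 20:40, 10:20] = 1
candidate = np.zeros_like(reference)
candidate[22:40, 20:38, 10:20] = 1

print(f"Dice: {dice_coefficient(candidate, reference):.3f}")
print(f"Volume ratio: {volume_mm3(candidate) / volume_mm3(reference):.3f}")
```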
|
288 |
Data-Intensive Biocomputing in the Cloud
Meeramohideen Mohamed, Nabeel, 25 September 2013 (has links)
Next-generation sequencing (NGS) technologies have made it possible to rapidly sequence the human genome, heralding a new era of health-care innovations based on personalized genetic information. However, these NGS technologies generate data at a rate that far outstrips Moore's Law. As a consequence, analyzing this exponentially increasing data deluge requires enormous computational and storage resources, resources that many life science institutions do not have access to. As such, cloud computing has emerged as an obvious, but still nascent, solution.
This thesis investigates and designs an efficient framework for running and managing large-scale data-intensive scientific applications in the cloud. Based on lessons learned from our parallel implementation of a genome analysis pipeline in the cloud, we aim to provide a framework for users to run such data-intensive scientific workflows using a hybrid setup of client and cloud resources. We first present SeqInCloud, our highly scalable parallel implementation of a popular genetic variant pipeline, the Genome Analysis Toolkit (GATK), on the Windows Azure HDInsight cloud platform. Together with a parallel implementation of GATK on Hadoop, we evaluate the potential of using cloud computing for large-scale DNA analysis and present a detailed study on efficiently utilizing cloud resources for running data-intensive life-science applications. Based on our experience from running SeqInCloud on Azure, we present CloudFlow, a feature-rich workflow manager for running MapReduce-based bioinformatics pipelines utilizing both client and cloud resources. CloudFlow, built on top of an existing MapReduce-based workflow manager called Cloudgene, provides features not offered by existing MapReduce-based workflow managers, such as simultaneous use of client and cloud resources, automatic data-dependency handling between client and cloud resources, and the flexibility of implementing user-defined plugins for data transformations. In general, we believe that our work helps increase the adoption of cloud resources for running data-intensive scientific workloads. / Master of Science
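CloudFlow's plugin mechanism is only named in the abstract; as an illustration of the general idea (the interface below is an assumption, not CloudFlow's actual API), a user-defined data-transformation plugin can be modeled as a small class that the workflow manager invokes between pipeline stages, with the manager deciding whether a stage runs on client or cloud resources:

```python
from abc import ABC, abstractmethod

class TransformPlugin(ABC):
    """Hypothetical user-defined transformation applied between pipeline stages."""

    #: where the transformation prefers to run; the manager may override it
    location = "client"  # or "cloud"

    @abstractmethod
    def transform(self, input_path: str) -> str:
        """Read data at input_path, write transformed data, return its path."""

class CompressBeforeUpload(TransformPlugin):
    """Example plugin: compress intermediate files before moving them to the cloud."""
    location = "client"

    def transform(self, input_path: str) -> str:
        import gzip, shutil
        output_path = input_path + ".gz"
        with open(input_path, "rb") as src, gzip.open(output_path, "wb") as dst:
            shutil.copyfileobj(src, dst)
        return output_path

def run_stage(plugin: TransformPlugin, input_path: str, cloud_available: bool) -> str:
    """Toy dispatcher: honor the plugin's preferred location when possible."""
    where = plugin.location if (plugin.location == "client" or cloud_available) else "client"
    print(f"running {type(plugin).__name__} on {where} resources")
    return plugin.transform(input_path)
```

A plugin interface of this kind is one simple way to express the data-dependency handling the abstract describes, since the manager sees both the preferred location and the produced output path of every stage.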
|
289 |
Module Shaping and Exploration in Rapid FPGA Design and Assembly Workflows
Lee, Kevin, 25 June 2015 (links)
The modular design methodology has been widely adopted to harness the complexity of large FPGA-based systems. As a result, a number of commercial and academic tool flows have emerged to support modular design, including the Hierarchical Design and Partial Reconfiguration flows and the OpenPR, HMFlow, PARBIT, REPLICA, GoAhead, and QFlow frameworks. As all of these projects have shown, a modular approach raises the abstraction level, provides clear boundaries for incremental design, reduces placement complexity, and improves productivity. At the physical layer, modules can be compiled into rectangular regions suitable for placement on the FPGA fabric. Creating a design then becomes the process of placing all of the modules on the FPGA, followed by inter-module routing. FPGAs, however, are not homogeneous, and the shape of individual modules can greatly impact overall device utilization. Prior work in modular assembly uses modules with a single shape and aspect ratio in the assembly process. Due to the increasing size and heterogeneity of contemporary FPGAs, the placement flexibility of such a module is becoming increasingly limited. This thesis introduces a process that exploits offline shape generation and exploration, enabling the selection of shapes using criteria such as resource usage efficiency, placement flexibility, and device utilization. Module shapes can be generated with these criteria in mind while still taking advantage of the reduced placement complexity of modular design and assembly. / Master of Science
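The abstract does not detail how candidate shapes are generated; a minimal sketch of the underlying idea (the homogeneous resource model below is a deliberate simplification and an assumption, not the thesis's actual algorithm) is to enumerate rectangular footprints that cover a module's resource requirement and rank them by wasted resources and by a flexibility criterion such as the number of positions the rectangle can occupy on the device:

```python
def candidate_shapes(required_cells: int, max_width: int, max_height: int):
    """Enumerate (width, height) rectangles providing at least `required_cells` cells."""
    shapes = []
    for w in range(1, max_width + 1):
        for h in range(1, max_height + 1):
            if w * h >= required_cells:
                shapes.append((w, h))
    return shapes

def placement_flexibility(shape, device_width, device_height):
    """Number of positions the rectangle can occupy on an idealized homogeneous device."""
    w, h = shape
    if w > device_width or h > device_height:
        return 0
    return (device_width - w + 1) * (device_height - h + 1)

def rank_shapes(required_cells, device_width, device_height):
    """Prefer shapes that waste few cells and can be placed in many positions."""
    shapes = candidate_shapes(required_cells, device_width, device_height)
    return sorted(
        shapes,
        key=lambda s: (s[0] * s[1] - required_cells,                        # wasted resources
                       -placement_flexibility(s, device_width, device_height)),
    )

print(rank_shapes(required_cells=12, device_width=10, device_height=8)[:5])
```

A real device is heterogeneous (CLBs, BRAMs, DSPs), so the thesis's shape exploration additionally has to match each resource type and column pattern, which this toy model ignores.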
|
290 |
An investigation into the relevance of flexibility- and interoperability requirements for implementation processes for workflow-management-applications
Kühl, Lukas W. H., January 2009 (links)
Flexibility and interoperability have become important characteristics for organisations and their business processes. The need to control flexible business processes within an organisation's boundaries and between organisations imposes major requirements on a company's process control capabilities. Workflow Management Systems (WFMS) try to fulfil these requirements by offering corresponding product features. Evidence suggests that the achievement of flexible business processes and inter-organisational process control is also influenced by the implementation processes for Workflow Management Applications (WFMA). [A WFMA comprises the WFMS and "all WFMS specific data with regard to one or more business processes" [VER01].] The impact of a WFMA implementation methodology on the fulfilment of these requirements is the research scope of the project. The thesis provides knowledge in the following areas:
1. Review of the relationship between workflow management and the demand for process flexibility and process interoperability.
2. Definition of a research and evaluation framework for workflow projects, composed of all relevant research variables identified for the thesis.
3. Empirical survey of relevant workflow-project objectives and their priority in the context of process flexibility and interoperability.
4. Empirical survey of the objectives' achievement.
5. Empirical survey of the methodologies and activities that have been applied within workflow projects.
6. Derivation of the project methodologies' effectiveness in terms of the impact that the applied activities had on project objectives.
7. Evaluation of existing workflow life-cycle models in accordance with the research framework.
8. Identification of basic improvements for workflow implementation processes with respect to the achievement of flexible and interoperable business processes.
The first part of the thesis argues the relevance of the subject. Afterwards, the research variables that constitute the evaluation framework for WFMA implementation processes are identified and defined step by step. An empirical study then establishes the variables' effectiveness for the achievement of process flexibility and interoperability within the WFMA implementation process. After this, the framework is applied to evaluate selected WFMA implementation methodologies. Identified weaknesses and effective methodological aspects are used to develop generic methodological improvements, which are later validated by means of a case study and interviews with workflow experts.
|