
Microinteractions: Don’t forget to raise your hand! : A qualitative study concerning microinteractions in digital meeting platforms and the impact they have on workflow efficiencies in the creative industry.

Andersson, Elsa, Strömland, Moa, Quassdorf, Catherine January 2021 (has links)
This research paper studies how microinteractions are used in Microsoft Teams and the impact they have on workflow efficiency, specifically in the creative industry. At a time when the Covid-19 virus has increased the demand for digital platforms, research into remote meeting use, and the interaction possibilities within such meetings, is of utmost importance. Moreover, since we miss out on the spontaneous interactions that occur when meeting face-to-face, the microinteractions within digital meeting platforms carry greater weight when interaction occurs solely remotely. Data was collected through a qualitative approach consisting of participant observations and semi-structured interviews. The results show that microinteractions are used frequently within digital meetings in Microsoft Teams, but in different contexts and different constellations. The results also indicate that microinteractions, given the right conditions, can streamline workflows in certain scenarios. The study produced data that answered the research questions but also opened up avenues for further research, the topic being both novel and situated in a special time, namely the Covid-19 pandemic.

Refining Computerized Physician Order Entry Initiatives in an Adult Intensive Care Unit

Fuller, Chevita 01 January 2014 (has links)
Computerized physician order entry (CPOE) is used in healthcare organizations to improve workflow processes and transcription, as well as to prevent prescribing errors. Previous research has indicated challenges associated with CPOE for end users that predispose patients to unsafe practices. Unsafe CPOE practices can be detrimental within the intensive care unit (ICU) setting due to the complexity of nursing care. Consequently, end-user satisfaction with, and understanding of, CPOE and electronic health record (EHR) functionality are vital to avoid errors of omission. CPOE initiatives should be refined after system implementation to improve clinical workflow, medication processes, and end-user satisfaction. The purpose of this quality improvement project was to refine CPOE system initiatives and develop an e-learning educational module to facilitate end-user understanding of and satisfaction with CPOE. The Iowa model of evidence-based practice, Lean methodology, and the Provider Order Entry User Satisfaction and Usage Survey (POESUS) were used to guide the study. An e-learning module was implemented to increase staff understanding of the newly implemented CPOE system, and a plan was provided for ongoing data collection and investigation of end-user satisfaction and medication inadequacies with the CPOE system. A mixed-methods design was recommended to key stakeholders to identify the impact of the e-learning course and refined CPOE initiatives on both end-user satisfaction and patient outcomes in the medical-surgical ICU. Findings from the study inform understanding of the impact of e-learning educational modules on CPOE system implementation. Those in organizations implementing advanced technology such as CPOE and EHR systems in critical care settings will find this paper of interest.

Event-Oriented Dynamic Adaptation of Workflows: Model, Architecture and Implementation

Müller, Robert 28 November 2004 (has links)
Workflow management is widely accepted as a core technology to support long-term business processes in heterogeneous and distributed environments. However, conventional workflow management systems do not provide sufficient flexibility to cope with the broad range of failure situations that may occur during workflow execution. In particular, most systems do not allow a workflow to be dynamically adapted in response to a failure situation, e.g., by dynamically dropping or inserting execution steps. As a contribution to overcoming these limitations, this dissertation introduces the agent-based workflow management system AgentWork. AgentWork supports the definition, the execution and, as its main contribution, the event-oriented and semi-automated dynamic adaptation of workflows. Two strategies for automatic workflow adaptation are provided. Predictive adaptation adapts workflow parts affected by a failure in advance (predictively), typically as soon as the failure is detected. This is advantageous in many situations and leaves enough time to meet organizational constraints for adapted workflow parts. Reactive adaptation is typically performed when predictive adaptation is not possible; in this case, adaptation is performed when the affected workflow part is about to be executed, e.g., before an activity is executed it is checked whether it is subject to a workflow adaptation such as dropping, postponement or replacement. In particular, AgentWork provides the following contributions:

A Formal Model for Workflow Definition, Execution, and Estimation: AgentWork first provides an object-oriented workflow definition language. This language allows for the definition of a workflow's control and data flow; furthermore, a workflow's cooperation with other workflows or workflow systems can be specified. Second, AgentWork provides a precise workflow execution model. This is necessary, as a running workflow is usually a complex collection of concurrent activities and data flow processes, and as failure situations and dynamic adaptations affect running workflows. Furthermore, mechanisms for estimating a workflow's future execution behavior are provided; these are of particular importance for predictive adaptation.

Mechanisms for Determining and Processing Failure Events and Failure Actions: AgentWork provides mechanisms to decide whether an event constitutes a failure situation and what has to be done to cope with it. This is formally achieved by evaluating event-condition-action rules, where the event-condition part describes under which condition an event has to be viewed as a failure event, and the action part represents the actions needed to cope with the failure. To support the temporal dimension of events and actions, this dissertation provides a novel event-condition-action model based on a temporal object-oriented logic.

Mechanisms for the Adaptation of Affected Workflows: In case of failure situations it has to be decided how an affected workflow is to be dynamically adapted on the node and edge level. AgentWork provides a novel approach that combines the two principal strategies, reactive adaptation and predictive adaptation; depending on the context of the failure, the appropriate strategy is selected. Furthermore, control flow adaptation operators are provided which translate failure actions into structural control flow adaptations, and data flow operators adapt the data flow after a control flow adaptation, if necessary.

Mechanisms for the Handling of Inter-Workflow Implications of Failure Situations: AgentWork provides novel mechanisms to decide whether a failure situation occurring in one workflow affects other workflows that communicate and cooperate with it. In particular, AgentWork derives the temporal implications of a dynamic adaptation by estimating the duration needed to process the changed workflow definition (in comparison with the original definition). Furthermore, qualitative implications of the dynamic change are determined; for this purpose, so-called quality measuring objects are introduced. All mechanisms provided by AgentWork allow users to interact during the failure handling process; in particular, the user may reject or modify suggested workflow adaptations.

A Prototypical Implementation: Finally, a prototypical CORBA-based implementation of AgentWork is described. This implementation supports the integration of AgentWork into the distributed and heterogeneous environments of real-world organizations such as hospitals or insurance enterprises.
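The event-condition-action principle at the heart of AgentWork's failure handling can be illustrated with a small sketch. This is not the dissertation's temporal object-oriented logic, only a minimal Python illustration of ECA rule evaluation; all class names, rule contents, and field names are hypothetical.

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Event:
    """A workflow event, e.g. a failed activity or an abnormal data value."""
    kind: str
    payload: dict = field(default_factory=dict)

@dataclass
class ECARule:
    """Event-condition-action rule: fires its action when the event kind
    matches and the condition holds for the event payload."""
    event_kind: str
    condition: Callable[[dict], bool]
    action: Callable[[dict], str]

def handle(event: Event, rules: list[ECARule]) -> list[str]:
    """Evaluate all rules against an event and collect adaptation actions."""
    actions = []
    for rule in rules:
        if rule.event_kind == event.kind and rule.condition(event.payload):
            actions.append(rule.action(event.payload))
    return actions

# Hypothetical rule: if a lab value exceeds a threshold, the failure action
# is to drop the next planned activity from the running workflow.
rules = [
    ECARule(
        event_kind="lab_result",
        condition=lambda p: p["value"] > 100,
        action=lambda p: f"drop activity {p['next_activity']}",
    )
]

print(handle(Event("lab_result", {"value": 150, "next_activity": "cycle_2"}), rules))
# → ['drop activity cycle_2']
```

In a full system the returned failure actions would then be translated into structural control flow adaptations; here they are just strings.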

Creating An Editor For The Implementation of WorkFlow+: A Framework for Developing Assurance Cases

Chiang, Thomas January 2021 (has links)
As vehicles become more complex, the work required to ensure that they are safe increases enormously. This in turn results in a much more complicated task of testing systems, subsystems, and components to ensure that they are safe individually as well as when they are integrated. As a result, managing the safety engineering process for vehicle development is of major interest to all automotive manufacturers. The goal of this research is to introduce a tool that provides support for a new framework for modeling safety processes, which can partially address some of these challenges. WorkFlow+ is a framework developed to combine data flow and process flow in order to increase traceability, enable users to model their products' safety engineering workflow at the desired granularity, and produce assurance cases that regulators and evaluators can use to validate that the product is safe for its users and the public. The development of an editor will bring WorkFlow+ to life. / Thesis / Master of Applied Science (MASc)

Arbetsflöde för automatgenerering av objektinformation för järnvägsanläggningar / Workflow for automatic generation of object information for railway facilities

Yeldico, Mansour, Yeldico, Fadi January 2019 (has links)
In Sweden, the Swedish Transport Administration (Trafikverket) sets requirements for how the documentation for the design of railway facilities is to be presented, as well as for how information about the objects in railway drawings is to be reported. One area affecting the working method is Building Information Modeling (BIM), which addresses, among other things, how information about the objects in a drawing can be reported efficiently. Although the client, the consulting firm Sweco Rail, already uses BIM, many tables are still filled in manually. This manual work carries a risk of errors and places demands on the designers, who must keep track of all drawing objects and any changes made during an ongoing project. The need for a more efficient and quality-assured workflow is therefore growing. This thesis investigated the possibility of automating the manual work of producing two different tables for two drawing types already in use at the firm: sign lists for geographical 2D drawings and bills of materials for framework drawings. Sign lists state, for example, the placement of speed signs; bills of materials can list information about the components of an electrical cabinet. Framework drawings and geographical 2D drawings are two vital parts of railway facility design. Framework drawings are reused across several projects, with small modifications depending on the application; a geographical 2D drawing is an overview describing how a railway line is laid out. The investigation resulted in two workflows, one for automatically generating sign lists for geographical 2D drawings and one for automatically generating bills of materials for framework drawings, both built with functions available in the CAD software MicroStation Connect Edition. The workflows were evaluated on the basis of interviews with designers at Sweco Rail. They considered the automated workflows, although not fully automated, to be more efficient than the existing manual approach, and to contribute to a more quality-assured way of working. At the same time, production costs fell by approximately 47% per sign in the automatic generation of the sign list and by approximately 58% per component in the automatic generation of the bill of materials. In addition, the four interviewed designers found the work less strenuous, which contributed to a better work environment. / In Sweden, the Swedish Transport Administration sets the requirements for how the documentation for the design of railway installations should be presented, as well as for how information about the objects in railway drawings should be reported. One area with an impact on working methods in railway projects is the implementation of Building Information Modeling (BIM), which addresses issues such as how object information can be presented effectively. Although the client, Sweco Rail, already uses BIM, many tables are still filled in manually. This manual work introduces a risk of human error and places a burden on the designers, who must keep track of all drawings, objects and changes. The need for a more efficient and reliable workflow therefore increases. This study investigates the possibility of automating the manual process of producing two different tables for two drawing types already used by Sweco Rail: sign lists for geographical 2D drawings and bills of materials for framework drawings. Sign lists give, for instance, the placement of speed signs, while bills of materials can present information about the components in an electrical cabinet. Framework drawings and geographical 2D drawings are two important parts of the design of railway facilities. Framework drawings are reused in many different projects, with minor modifications depending on the project; a geographical 2D drawing is an overview describing how a railway line is designed. The study resulted in two workflows: one to automatically generate sign lists for geographical 2D drawings and one to automatically generate bills of materials for framework drawings. The workflows were evaluated through interviews with designers at Sweco Rail about their experience. They considered the proposed automated workflows, although not fully automated, to be an improvement in efficiency over the existing manual working method, and to contribute to a more quality-assured way of working. In addition, production costs were reduced by approximately 47% per sign in the automatic generation of the sign list and approximately 58% per component in the automatic generation of the bill of materials. The workflows also contributed to a better work environment: according to the four interviewed designers, the work became less strenuous.
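The core of the automation described here is aggregating drawing objects into a table. The thesis used built-in functions of MicroStation Connect Edition; the Python below is only a sketch of the aggregation step behind a bill of materials, with assumed field names.

```python
from collections import Counter

def generate_bom(drawing_objects):
    """Aggregate drawing objects into a bill-of-materials table:
    one row per (article number, description) pair with a total quantity.
    Field names ("article", "description") are illustrative assumptions."""
    counts = Counter((o["article"], o["description"]) for o in drawing_objects)
    return [{"article": a, "description": d, "quantity": q}
            for (a, d), q in sorted(counts.items())]

# Toy set of objects as they might be extracted from a framework drawing.
objects = [
    {"article": "EL-101", "description": "Contactor"},
    {"article": "EL-101", "description": "Contactor"},
    {"article": "EL-205", "description": "Fuse 10A"},
]
print(generate_bom(objects))
```

Generating the table automatically rather than by hand removes exactly the bookkeeping burden the thesis describes: when an object is added to or removed from the drawing, the table is simply regenerated.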

Can Surface Scanning Improve the Workflow of Elekta Linac Treatments? / Kan ytskanning förbättra arbetsflödet för behandlingar med Elekta Linac?

Arousell, Anna, Engdahl, Ylva January 2019 (has links)
The aim of the project was to compare the workflow for an Elekta Linac with and without the surface scanning system Catalyst and to describe the pros and cons of both workflows. The findings in the report can be used as decision support in the development of Elekta products and workflow improvements. The method was to conduct interviews, observations and time measurements at Södersjukhuset (not using Catalyst) and Sundsvalls sjukhus (using Catalyst). The workflows were graded in an assessment protocol covering time efficiency, comfort, noise, resources, reliability, cost, dosage and side effects. Different workflow scenarios were simulated in AnyLogic. The result of the project was that, according to our protocol, the workflow with Catalyst was rated higher than the one without it. The simulations in AnyLogic showed that minimizing gaps in the treatment schedule generated the same number of patients treated per day if the positioning could not be done faster. The simulations also showed that removing position verification with cone beam computed tomography (CBCT), an imaging system used in addition to the Catalyst system, would increase the number of treated patients by approximately 33%. The conclusion was that there were no great differences in time efficiency between the workflows. However, considering the higher reliability and comfort for the patient, optical surface scanning can improve positioning for the Elekta Linac and is therefore worth implementing. Minimizing treatment gaps would not improve the workflow, while removing the use of CBCT would increase the number of treated patients per day.
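A throughput gain of roughly 33% from removing a verification step falls out of simple slot arithmetic. The timing values below are assumptions chosen purely for illustration, not the measured times from the study.

```python
def patients_per_day(minutes_open: int, setup: int, imaging: int, treatment: int) -> int:
    """Number of treatment slots that fit in a working day, where each slot
    consists of patient setup, optional position-verification imaging, and
    beam-on time (all durations in minutes; values are illustrative only)."""
    slot = setup + imaging + treatment
    return minutes_open // slot

# Assumed timings: 10 min setup, 5 min CBCT verification, 5 min treatment.
with_cbct = patients_per_day(480, 10, 5, 5)      # 480 // 20 = 24
without_cbct = patients_per_day(480, 10, 0, 5)   # 480 // 15 = 32
print(with_cbct, without_cbct, round((without_cbct / with_cbct - 1) * 100))  # → 24 32 33
```

Dropping 5 minutes from a 20-minute slot shortens it by a quarter, which is exactly a one-third increase in slots per day; the study's simulated figure is consistent with this kind of back-of-the-envelope check.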

Interior: A Micro-Budget Horror Feature

Beckler, Zachary 01 January 2014 (has links)
INTERIOR is a feature-length film written, directed, and produced by Zachary Beckler as part of the requirements for earning a Master of Fine Arts in Entrepreneurial Digital Cinema from the University of Central Florida. The project aims to challenge existing conventions of the horror film on multiple levels - aesthetic, narrative, technical, and industrial - while also examining the growing importance of workflow throughout all aspects of production. These challenges were both facilitated and necessitated by the limited resources available to the production team and the academic context of the production. This thesis is a record of the film, from concept to completion and preparation for delivery to an audience.

Transcriptional states of CAR-T infusion relate to neurotoxicity: lessons from high-resolution single-cell SOM expression portraying

Loeffler-Wirth, Henry, Rade, Michael, Arakelyan, Arsen, Kreuz, Markus, Loeffler, Markus, Koehl, Ulrike, Reiche, Kristin, Binder, Hans 04 March 2024 (has links)
Anti-CD19 CAR-T cell immunotherapy is a promising treatment option for patients with B cell lymphomas; however, it is accompanied by partly severe adverse effects such as neurotoxicity. Single-cell resolved molecular data sets, in combination with clinical parametrization, allow for comprehensive characterization of cellular subpopulations, their transcriptomic states, and their relation to these adverse effects. We here present a re-analysis of single-cell RNA sequencing data of 24 patients, comprising more than 130,000 cells, with a focus on cellular states and their association with immune-cell-related neurotoxicity. For this, we developed a single-cell data portraying workflow to disentangle the transcriptional state space with single-cell resolution and to analyze it in terms of modularly composed cellular programs. We demonstrated the capabilities of single-cell data portraying to disentangle transcriptional states using intuitive visualization, functional mining, molecular cell stratification, and variability analyses. Our analysis revealed that neither the T cell composition of the patients' infusion products nor the spectrum of transcriptional states of cells derived from patients with low ICANS grade differs markedly from those of cells from high-ICANS patients, while the relative abundances vary, particularly those of cycling cells, of LAG3-mediated exhaustion, and of CAR-positive cells. Our study provides molecular details of the transcriptomic landscape with possible impact on overcoming neurotoxicity.

Design and implementation of a workflow for quality improvement of the metadata of scientific publications

Wolff, Stefan 07 November 2023 (has links)
In this paper, a detailed workflow for analyzing and improving the quality of metadata of scientific publications is presented and tested. The workflow was developed based on approaches from the literature. Frequently occurring types of errors from the literature were compiled and mapped to the data-quality dimensions most relevant for publication data – completeness, correctness, and consistency – and made measurable. Based on the identified data errors, a process for improving data quality was developed. This process includes parsing hidden data, correcting incorrectly formatted attribute values, enriching with external data, carrying out deduplication, and filtering erroneous records. The effectiveness of the workflow was confirmed in an exemplary application to publication data from Open Researcher and Contributor ID (ORCID), with 56% of the identified data errors corrected. The workflow will be applied to publication data from other source systems in the future to further increase its performance.
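Two of the workflow's steps — correcting incorrectly formatted attribute values and deduplication — can be sketched as follows. The field names, normalization rules, and matching keys are assumptions for illustration, not the paper's actual implementation.

```python
import re

def normalize(record):
    """Correct common formatting errors in a publication record
    (assumed fields: title, year, doi)."""
    rec = dict(record)
    rec["title"] = re.sub(r"\s+", " ", rec["title"]).strip()
    year = str(rec["year"]).strip()
    rec["year"] = int(year) if year.isdigit() else None
    rec["doi"] = rec.get("doi", "").lower().removeprefix("https://doi.org/") or None
    return rec

def deduplicate(records):
    """Merge records sharing a DOI, or (title, year) when no DOI is present,
    keeping the first occurrence of each duplicate group."""
    seen = {}
    for rec in map(normalize, records):
        key = rec["doi"] or (rec["title"].lower(), rec["year"])
        seen.setdefault(key, rec)
    return list(seen.values())

records = [
    {"title": "A  Workflow  Study", "year": "2021", "doi": "https://doi.org/10.1000/XYZ"},
    {"title": "A Workflow Study", "year": 2021, "doi": "10.1000/xyz"},
    {"title": "Another Paper", "year": "n/a", "doi": ""},
]
clean = deduplicate(records)
print(len(clean))  # → 2
```

A production pipeline would add the remaining steps (parsing hidden data, enrichment from external sources, filtering erroneous records) and measure error rates before and after, as the paper does for ORCID data.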

NLP-Assisted Workflow Improving Bug Ticket Handling

Eriksson, Caroline, Kallis, Emilia January 2021 (has links)
Software companies spend a lot of resources on debugging, a process where previous solutions can help in solving current problems. The bug tickets containing this information are often time-consuming to read. To minimize the time spent on debugging and to make sure that the knowledge from prior solutions is kept within the company, an evaluation was made to see whether summaries could make this process more efficient. Abstractive and extractive summarization models were tested for this task, and fine-tuning of the bert-extractive-summarizer was performed. The model-generated summaries were compared in terms of perceived quality, speed, similarity to each other, and summary length. The average description summary contained part of the description needed, and the found solution was either well documented or did not answer the problem at all. The fine-tuned extractive model and the abstractive model BART provided good conditions for generating summaries containing all the information needed. / In software development, a lot of resources go into debugging, a process where previous solutions can help solve current problems. The bug reports containing this information are often time-consuming to read. To minimize the time spent on debugging and to ensure that knowledge from previous solutions is retained within the company, it was evaluated whether summaries could make this process more efficient. Abstractive and extractive summarization models were tested for the task, and the bert-extractive-summarizer was fine-tuned. The generated summaries were compared in terms of perceived quality, generation speed, mutual similarity, and summary length. The average summary contained parts of the most important information, and the proposed solution was either well documented or did not answer the problem description at all. The fine-tuned BERT and the abstractive model BART showed good potential for generating summaries containing all the most important information.
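The extractive approach evaluated here selects existing sentences from the ticket rather than generating new text. The toy scorer below illustrates that principle with simple word-frequency scoring; the thesis itself used BERT-based models (bert-extractive-summarizer and BART), not this heuristic.

```python
import re
from collections import Counter

def extractive_summary(text, n_sentences=2):
    """Toy extractive summarizer: score each sentence by the average
    document-wide frequency of its words and keep the top-scoring
    sentences in their original order."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"[a-z']+", text.lower()))

    def score(sentence):
        tokens = re.findall(r"[a-z']+", sentence.lower())
        return sum(freq[t] for t in tokens) / max(len(tokens), 1)

    top = sorted(sentences, key=score, reverse=True)[:n_sentences]
    return " ".join(s for s in sentences if s in top)

ticket = ("Bugs happen. The build fails when the cache is stale. "
          "Clearing the cache fixes the build. Lunch was good.")
print(extractive_summary(ticket))
```

Sentences about the recurring topic (the cache and the build) score highest and survive; off-topic sentences are dropped, which is the behavior a bug-ticket summary needs: keep the problem and the found solution, discard the noise.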
