  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
151

An investigation of the effort of re-engineering a software system

Tesfay, Abyel, Berglund, Erik January 2021 (has links)
A large amount of legacy software is still used by organizations and companies, despite its defects and expensive maintenance costs, because of its value to the business. To address the shortcomings of legacy software, re-engineering can be used as a way to reuse and maintain these systems: code is modified or created with the purpose of solving the defects of the older system and satisfying any new needs. The problem is that there are few investigations into how the effort (cost in time) required for re-engineering is distributed across the phases that make up re-engineering. The purpose of this thesis is to help organizations and companies plan their re-engineering, with the goal of providing quantitative data that shows how the required effort can be distributed between the different phases. The thesis used qualitative research. In a literature study, knowledge was gathered about software development costs, re-engineering in general, and a process model for re-engineering. Action research was used to perform a re-engineering project, to measure the effort it required, and to determine how that effort was distributed among the phases. A number of criteria were created to contribute to fulfilling the purpose and goal of the thesis. The result is a compilation of the effort required. It shows that two thirds of the total effort was spent on designing and implementing the new system, one quarter on the analysis and planning phase, and the remainder, approximately one tenth of the total effort, on testing, documentation, acceptance and system transition, and other tasks. The result thus shows that re-engineering cost is affected the most by influencing the analysis and planning phase and the design and implementation phase.
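The reported split can be sanity-checked with a short calculation. This is only a sketch: the phase labels follow the abstract, the fractions are the abstract's rounded figures, and the 600-hour project budget is an invented example.

```python
# Approximate effort distribution reported in the thesis (rounded fractions).
# The third fraction is derived so the shares sum to one.
effort = {
    "analysis and planning": 0.25,
    "design and implementation": 2 / 3,
    "testing, documentation, acceptance, transition, other": 1 - 0.25 - 2 / 3,
}

def hours_per_phase(total_hours, distribution):
    """Scale a total effort budget across phases by their fractions."""
    return {phase: total_hours * share for phase, share in distribution.items()}

budget = hours_per_phase(600, effort)  # e.g. a hypothetical 600-hour project
```

Scaling a planned budget this way is exactly the kind of early estimate the thesis aims to support: most of the hours land in design and implementation.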
152

Intraoperative process monitoring using generalized surgical process models

Liebmann, Philipp 01 March 2022 (has links)
The surgeon in a modern operating room can access the functions of a large number of technical devices that support the surgical work. These devices, and therefore the functions they provide, are only insufficiently networked with one another. This insufficient interoperability concerns not only the exchange of data between devices but also the lack of central knowledge about the overall course of the surgical process. Systems are therefore needed that can process process models and thereby provide global knowledge about the process. In contrast to most processes supported by workflow management systems (WfMS) in business, the surgical process is characterized by high variability. Meanwhile, many approaches exist for creating fine-grained, highly formalized models of the surgical process. This thesis examines, on the one hand, the quality of a generalized model, based on patient-individual interventions, with respect to its execution by a WfMS, and on the other hand, the requirements that the upstream systems must fulfill. It reports the abort rate of path tracking in the generalized model, which was created from varying numbers of patient-individual models, determines the success rate of re-finding the process path in the model, and considers the number of steps needed to re-find it.
Table of contents: 1 Introduction (motivation; problems and objectives). 2 State of research (definitions of terms: surgical process, surgical process model, gSPM and surgical workflow, surgical workflow management system; workflow management systems: Agfa HealthCare ORBIS, Siemens Clinical Solutions Soarian, Karl Storz ORchestrion, YAWL BPM; sensor systems: sensors according to DIN 1319, video-based and human-based sensor technology; process model: top-down, bottom-up; methods for creating the ICCAS process model: recording of the iSPMs, creation of the gSPMs). 3 Model-based design of workflow schemas (recording of patient-individual surgical process models; generating generalized SPMs from iSPMs; transforming gSPMs into workflow schemata). 4 Model-based validation of workflow schemas (surgical process modeling; workflow schema generation; the surgical workflow management and simulation system; system validation study design; results; discussion; conclusion). 5 Influence of missing sensor information (methodology: surgical process modeling, test system, system evaluation study design; results; discussion; conclusion). 6 Summary and outlook.
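The core measurements of the dissertation — path breaks during tracking and re-finding the path afterwards — can be illustrated with a minimal sketch. The activity names, the transition graph, and the recovery rule below are invented for illustration, not the dissertation's actual gSPM or algorithm.

```python
# A generalized surgical process model sketched as a directed graph of
# allowed activity transitions (illustrative, not a real gSPM).
MODEL = {
    "incision": ["dissection"],
    "dissection": ["hemostasis", "resection"],
    "hemostasis": ["dissection", "resection"],
    "resection": ["closure"],
    "closure": [],
}

def track(observations):
    """Follow an observed activity stream through MODEL.
    On a path break (observation not allowed from the current state),
    try to re-find the path: jump to the observation if any state in
    the model could have led to it. Returns (final_state, break_count)."""
    state, breaks = observations[0], 0
    for obs in observations[1:]:
        if obs in MODEL.get(state, []):
            state = obs
        else:
            breaks += 1  # path lost, e.g. a missing or garbled sensor event
            if any(obs in nxt for nxt in MODEL.values()):
                state = obs  # recovered: obs is reachable somewhere in MODEL
    return state, breaks
```

A clean run reports zero breaks; a stream with a skipped activity reports one break but still recovers the path, which mirrors the abort-rate versus recovery-rate distinction studied in the thesis.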
153

Modeling and Execution of Resilient Business Processes in Unreliable Communication Environments

Nordemann, Frank 01 March 2022 (has links)
Business processes define workflows by structuring sets of activities according to given objectives. Process Modeling Languages (PMLs) provide graphical elements for defining process models. Apart from use cases in finance and commerce, PMLs are gaining popularity in application domains such as cyber-physical systems, the Internet of Things, ubiquitous computing, mobile devices, and scenarios in rural, restricted, or disaster-affected regions. Many of these domains are exposed to delayed, intermittent, or broken connectivity. Existing PMLs show limitations in considering connectivity-related issues, leading to failures and breakdowns at process runtime. This thesis addresses connectivity-related issues in the modeling and execution of resilient business processes taking place in unreliable communication environments. It introduces resilient BPMN (rBPMN), an extension of the Business Process Model and Notation (BPMN) for environments with delayed, intermittent, or broken connectivity. rBPMN extends the BPMN metamodel with new elements for resilient process models. Domain experts may define alternatives for possibly failing message flows based on the priorities or characteristics of the alternatives, and functionality offered by remote participants may be moved to other participants for local execution. The thesis also illustrates approaches for the graph-based analysis of business processes with respect to resilient operation. By translating process models into directed graphs, graph algorithms can dynamically find the most suitable process path. Domain experts can thereby identify non-resilient parts of a process model and optimize the involved segments before runtime, while multi-criteria analysis approaches optimize process operation according to a chosen set of criteria. A real-world scenario of an environmentally friendly slurry application illustrates the use of rBPMN's concepts and approaches for modeling and executing resilient processes.
Technical approaches realizing rBPMN’s resilience strategies using a BPMN runtime engine and microservices are illustrated. The proof-of-concept implementations may be extended and adapted, serving as guides for other application domains.
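The idea of prioritized alternatives for possibly failing message flows can be sketched in a few lines. The flow names, the numeric priority scheme, and the availability predicate below are assumptions for illustration; they are not rBPMN's actual metamodel elements.

```python
# Sketch of rBPMN-style alternative selection: a message flow lists
# prioritized fallbacks, and the runtime picks the best one whose
# connectivity is currently usable.
def select_flow(alternatives, link_up):
    """alternatives: list of (name, priority), lower priority = preferred.
    link_up: predicate reporting whether a flow's link is currently usable."""
    for name, _prio in sorted(alternatives, key=lambda a: a[1]):
        if link_up(name):
            return name
    return None  # no usable alternative: a non-resilient segment to fix

flows = [("satellite-uplink", 2), ("field-gateway", 1), ("local-cache", 3)]
choice = select_flow(flows, link_up=lambda name: name != "field-gateway")
# the preferred field gateway is down, so the satellite uplink is chosen
```

Returning `None` corresponds to the analysis result the thesis emphasizes: a process segment with no resilient alternative can be flagged to the domain expert before runtime.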
154

Risk Communication: An Analysis of Message Source and Function in Hurricane Mitigation/Preparedness Communication

Gallo, Andrew M 12 March 2009 (has links)
In September 2008, the National Weather Service (NWS) predicted that Hurricane Ike would make landfall on Galveston Island as a strong category three storm. This led the NWS to release a statement warning of 'certain death' if people did not adhere to the emergency evacuation messages. Millions of people fled the Texas coast. Using Hazleton and Long's (1993) taxonomy of public relations strategies, experiments were conducted with various evacuation messages to test emergency communication. Grunig's (1997) situational theory of publics was used to determine strategy influence; problem recognition, constraint recognition, and level of involvement were tested. In addition, tests measured source expertise, trust, and attitude depending on the message source. Results indicated that a national message source produced higher constraint recognition than a local message source, as well as higher ratings of expertise, trust, and attitude. The threat-and-punishment strategy produced the highest level of information-seeking behavior, while information-seeking behavior was lowest when a persuasive strategy was used. Constraint recognition produced the weakest effect on information-seeking behavior. In conclusion, emergency management communicators must use the correct message strategy to affect information-seeking behavior.
155

Relationship Between Autonomous Motivation and Ego-Depletion

Heilman, Mark A. 01 January 2016 (has links)
Previous research has shown that exerting self-control on a demanding task can impair performance on a subsequent demanding self-control task. This phenomenon is known as ego-depletion; however, its underlying mechanisms are not well understood. Notable gaps in the literature exist regarding whether participants' motivation levels can attenuate the depletion effect, and whether trait self-control is related. Drawing on the process model of depletion and self-determination theory, the goal of the study was to examine whether motivational incentives in the form of autonomy can affect performance on tasks in an ego-depleted state, and the potential role of trait self-control. Amazon Mechanical Turk was used to conduct this experimental quantitative study with a 2 (ego-depletion: yes or no) x 2 (autonomous reward motivation: incentivized or non-incentivized) between-subjects factorial design. The effects of an autonomous motivational incentive were compared with the effects of no incentive in a convenience sample of online participants (N = 211), half of whom performed a task designed to deplete self-control resources, and half of whom performed a non-depleting task. Multivariate ANCOVAs showed no significant differences in performance on a subsequent self-control task for any of the experimental groups, and trait self-control (as measured by the Brief Self-Control Scale) showed no significant effect as a covariate. This study contributes to social change by increasing understanding of the factors underlying self-control. This knowledge will be useful to anyone intending to strengthen their own willpower and achieve their goals, and may enable practitioners to better assist clients struggling with addictions and other maladaptive behaviors.
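The 2 x 2 between-subjects layout can be made concrete with a small summary sketch. The scores below are invented, and this cell-means comparison is only the descriptive first step; the study's actual analysis used multivariate ANCOVAs.

```python
# Sketch of summarizing a 2 (depletion) x 2 (incentive) between-subjects
# design: group scores by cell and compare cell means.
from collections import defaultdict

def cell_means(records):
    """records: iterable of (depleted, incentivized, score) tuples."""
    cells = defaultdict(list)
    for depleted, incentivized, score in records:
        cells[(depleted, incentivized)].append(score)
    return {cell: sum(v) / len(v) for cell, v in cells.items()}

sample = [(True, True, 12), (True, True, 14), (True, False, 11),
          (True, False, 9), (False, True, 13), (False, False, 12)]
means = cell_means(sample)  # one mean per experimental cell
```

With real data, roughly equal means across the four cells would match the study's null result for both the depletion and the incentive factor.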
156

Reframing Integrated Operations as Design Process

Moss, Tracy, M.A. 15 June 2020 (has links)
No description available.
157

Material characterization of plastics for thermoforming using new measurement technologies

Sanjon, Cedric, Kayatz, Fabian, Schult, Andre 29 May 2018 (has links)
For the production of molded plastic parts, e.g. packaging and components for household appliances or the automotive and medical sectors, microstructures, new and hybrid materials, and increasing cost pressure place growing demands on the part, the method, and the process. Corresponding technologies for improving the forming process are close to market launch or are currently under development. Because of the accompanying rise in technological complexity, material and process models are increasingly being used. They serve technology development and process optimization and assist during commissioning. One focus, and one challenge, is material modeling. During forming into the mold in thermoforming, various effects occur, e.g. stretching and displacement of the polymer chains and the formation of amorphous and crystalline structures. The resulting behavior must be captured by suitable material models and their parameterization. A common approach for determining the material behavior, and with it the material parameters, is the reverse-engineering method. Various substitute tests are available for this purpose, e.g. the membrane inflation rheometer (MIR), thermoforming material characterization (TMC), and uniaxial and biaxial tensile tests. Using suitable models, the parameters are fitted to the experimental data. To represent the forming process in a numerical model, the material model must be implemented in a process model. To obtain quantitative and qualitative statements about the agreement between the numerical model and the actual forming process, validation is always necessary: experimental trials are carried out, analyzed with respect to selected target quantities, and compared with the numerical results.
Various new measurement methods are available for this purpose, e.g. GEWAND, OCT, and Hall-effect thickness gauges.
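The reverse-engineering idea — choosing the material parameter whose model prediction best matches measured data — can be shown with a toy fit. The linear "material law", the strain/stress values, and the brute-force parameter scan below are placeholders for illustration, not a real thermoforming material model or fitting procedure.

```python
# Toy reverse-engineering fit: scan candidate values of a single
# stiffness-like parameter k and keep the one minimizing the squared
# error between a linear model (stress = k * strain) and measurements.
def fit_parameter(strains, stresses, candidates):
    def sse(k):
        return sum((k * e - s) ** 2 for e, s in zip(strains, stresses))
    return min(candidates, key=sse)

strains = [0.1, 0.2, 0.3]
stresses = [0.5, 1.0, 1.5]  # synthetic data consistent with k = 5
k_best = fit_parameter(strains, stresses, [c / 10 for c in range(1, 101)])
```

Real material models (hyperelastic, viscoelastic) have several coupled parameters and use proper optimizers, but the structure is the same: simulate, compare against MIR/TMC-style test data, and adjust the parameters.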
158

Discrimination, Group Identity, and Mental Health: A Comparative Study of African Americans, Caribbean Americans, and European Americans

Kimura, Aya 12 May 2008 (has links)
No description available.
159

Enhancing the Admissibility of Live Box Data Capture in Digital Forensics: Creation of the Live Box Computer Preservation Response (LBCPR) and Comparative Study Against Dead Box Data Acquisition

Emilia Mancilla (14202911) 05 December 2022 (has links)
There are several techniques and methods for capturing data during a Live Box response in computer forensics, but the key in these acquisitions is to keep the collected data admissible in a judicial court process. Different approaches during a Live Box examination will lead to data changes on the computer, due to the volatile nature of data stored in memory. The inevitable changes to volatile data are what cause the controversy when admitting digital evidence in courtroom proceedings.
The main goal of this dissertation was to create a process model, titled Live Box Computer Preservation Response (LBCPR), that would assist in ensuring the validity, reliability, and accuracy of evidence in a court of law. This approach maximizes the admissibility of digital data derived from a Live Box response.
The LBCPR was created to meet legal and technical requirements for acquiring data from a live computer. With captured Live Box computer data, investigators can further add value to their investigation when processing and analyzing the captured data set, which would otherwise have been permanently unrecoverable upon powering down the machine. By collecting the volatile data prior to conducting Dead Box forensics, there is an increased amount of information that can be utilized to understand the state of the machine upon collection when combined with the stored data contents.
This study performed a comparative analysis of data collection with the LBCPR method versus traditional Dead Box forensics techniques, confirming the expected result that Live Box techniques capture volatile data. Moreover, the structure of the LBCPR made it possible to obtain value from the randomization of memory dumps, because the logs collected in the process model assist interpretation. In addition, with the focus on legal admissibility, techniques were incorporated to keep data admissible in a court of law.
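One common admissibility safeguard consistent with the abstract's emphasis on logging is hashing each collected artifact at acquisition time so any later change is detectable. The sketch below illustrates that general practice; the artifact name, log format, and fields are invented and are not the LBCPR's actual record layout.

```python
# Sketch of an acquisition log entry: hash each collected artifact and
# timestamp the collection, so integrity can be re-verified later.
import hashlib
import time

def record_artifact(name, data, log):
    """Append a chain-of-custody style entry for one collected artifact."""
    entry = {
        "artifact": name,
        "sha256": hashlib.sha256(data).hexdigest(),
        "collected_at": time.time(),
    }
    log.append(entry)
    return entry

log = []
record_artifact("memory-dump.raw", b"...volatile bytes...", log)
# later: re-hash the preserved copy and compare against the logged digest
unchanged = log[0]["sha256"] == hashlib.sha256(b"...volatile bytes...").hexdigest()
```

Re-computing the digest over the preserved copy and matching it against the log is what lets an examiner testify that the captured volatile data was not altered after collection.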
160

Visualizing partitioned data in Audience Response Systems : A design-driven approach

Wiigh, Oscar January 2020 (has links)
Meetings and presentations are often monological in nature, creating a productivity barrier in workplaces around the world. By utilizing modern technologies such as a web-based Audience Response System (ARS), meetings and presentations can be transformed into interactive exercises where the audience's views, opinions, and answers can be expressed. Visualizing these audience responses, and relating question-specific partitioned answers to each other through visualization structures, was the topic of this report. The thesis project was carried out in collaboration with Mentimeter, creator of a web-based ARS and online presentation tool. The Double Diamond design process model was used to ground the design and development process. To guide the implementation of the prototypes, a focus group was held with four visualization and design professionals knowledgeable about ARSs, to gather feedback on high-fidelity sketches. The final prototypes were evaluated with the extended Technology Acceptance Model (TAM) for information visualization, to survey end-users' attitudes and willingness to adopt the visualization structures. Eight end-users tested the final web-based prototypes. The findings of the user tests indicate that both visualization prototypes showed promise for visualizing partitioned data in novel ways for ARSs, with an emphasis on a circle-cluster visualization, as it allowed for the desired exploration. The results further imply that there is value to be gained by presenting partitioned data in ways that allow for exploration, and that audiences would likely adopt a full implementation of the visualizations given some added functionality and adjustments. Future research should focus on fully implementing and testing the visualizations in front of a live audience, as well as investigating other contemporary visualization structures and their capabilities for visualizing partitioned ARS data.
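The data-preparation step behind such a visualization can be sketched briefly: responses are partitioned per question and per answer option, yielding the counts a circle-cluster layout would size its circles by. The `(question_id, answer)` response format is an assumption for illustration, not Mentimeter's actual data model.

```python
# Sketch of partitioning ARS responses for a cluster-style visualization.
from collections import Counter, defaultdict

def partition_responses(responses):
    """responses: iterable of (question_id, answer) pairs.
    Returns {question_id: Counter mapping answer -> count}."""
    partitions = defaultdict(Counter)
    for question, answer in responses:
        partitions[question][answer] += 1
    return partitions

responses = [("q1", "yes"), ("q1", "no"), ("q1", "yes"), ("q2", "red")]
parts = partition_responses(responses)  # counts per question and answer
```

Each per-question Counter is one partition; relating partitions across questions (e.g. how "yes" voters on q1 answered q2) is the cross-question exploration the prototypes were designed to support.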