  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
41

Konsolidace procesního řízení v prostředí příspěvkových organizací / Consolidation of the business process management in the area of semi-budgetary organizations

Ježek, Svatopluk January 2012 (has links)
This thesis focuses on business process management in semi-budgetary organizations in the Czech Republic. To describe the field of business process management, the work surveys common definitions and key concepts linked to this area. It then describes the specifics of public administration in the Czech Republic, with regard to the sector's readiness and ability to implement business process management. An important part of the thesis is an analysis of the prerequisites that public organizations should fulfil in order to implement business process management. The thesis covers business process modeling both in terms of the methodologies and standards used and from the perspective of suitable tools for creating business process models. Its main focus is the consolidation of the business process models of semi-budgetary organizations, verified in a case study. The case study, which forms the practical part of the thesis, is an adapted description of a completed project whose aim was to merge two semi-budgetary organizations, particularly with respect to consolidating their process management and process models.
42

Návrh a budování center sdílených služeb v regionu CEE / Design and build of Shared Service Centres in the CEE

Štiller, Martin January 2012 (has links)
This diploma thesis introduces general principles for the design and build-out of Shared Service Centres. A Shared Service Centre is described as a suitable platform for streamlining internal support services. The thesis also covers general principles for transforming internal processes into shared services, as well as a procedure for selecting a suitable location. The final assessment identifies the countries worldwide best suited for implementing Shared Service Centres.
43

Forgiveness and Problematic Substance Use

Webb, Jon R., Jeter, Bridget R. 01 January 2015 (has links)
For over 75 years the process of (un)forgiveness has been anecdotally understood to play an important role in the Twelve-Step Model of addiction and recovery. However, only very recently have scholars begun to examine the empirical evidence in support thereof. Multiple dimensions and aspects of forgivingness (trait), state forgiveness, and problematic substance use are relevant to one another in the context of addiction and recovery. Models of the association between forgiveness and problematic substance use have been developed, including the conceptual identification of the process of (un)forgiveness as inherent to and/or consistent with evidence-based treatment modalities. Relevant empirical evidence is reviewed; of the 21 empirical studies in the published scientific literature, 90% show a salutary association. Indeed, forgivingness of self may be the most important dimension of forgiveness for recovery. A future research agenda is proposed, and implications for individual, public, and societal health are discussed.
44

The Effects of Business Process Management Cognitive Resources and User Cognitive Differences on Outcomes of User Comprehension

Swan, Bret R. 03 May 2007 (has links)
There is a growing need to study factors that affect user comprehension of Business Process Management (BPM) information portrayed by graphical process models (GPMs). For example, deployment of BPM Systems, unique types of enterprise-level information systems, has dramatically increased in recent years. This increase is primarily because BPM Systems give a variety of managers across an enterprise the ability to directly design, configure, enact, monitor, diagnose, and control business processes that other types of enterprise systems do not. This is possible because BPM Systems uniquely rely on GPMs derived from formal graph theory. Besides controlling the business processes, these GPMs, such as metagraphs and Unified Modeling Language (UML) diagrams, portray business process information (BPI) and prompt BPM managers to apply their training and expertise to deal with BPM situations. As a result, GPMs are the primary information artifacts for decision-making and communication among different, often geographically dispersed stakeholders. Therefore, user comprehension of these unique GPMs is critical to the efficient and effective development, deployment, and utilization of BPM Systems. User comprehension outcomes are jointly affected by the (1) BPM cognitive resources available to each manager (including the type of GPM, BPI, and user educational training and experience), and (2) cognitive differences between individual BPM managers (such as their mental workload, cognitive styles and cognitive abilities). Although research has studied GPMs in various contexts, there is apparently no empirical research investigating GPM user comprehension in the context of BPM Systems. This research makes an important contribution by addressing this gap in the literature. Statement of the Objective: The purpose of this research is to empirically study how BPM cognitive resources and cognitive differences between individuals affect outcomes of GPM user comprehension. 
This research centered on the following objectives:
A. Investigate whether more positive user comprehension outcomes are produced by novice users if a single GPM technique is used to portray different types of BPI (e.g., as with metagraphs) or if different GPM techniques are used to portray different types of BPI (e.g., as with UML diagrams).
B. Investigate whether one type of BPI is more easily comprehended and interpreted by novice users irrespective of the type of GPM or the type of educational training of the user.
C. Investigate whether users with a specific type of educational training can more easily comprehend and interpret BPM information irrespective of the type of GPM or the type of BPI.
D. Evaluate influences of individual cognitive differences (i.e., mental workload, cognitive styles, and cognitive abilities) on outcomes of user comprehension.
In order to accomplish these objectives, this study: (a) defined a theoretical framework conceptualizing user comprehension outcomes in terms of the interaction between cognitive resources external to the user and individual differences affecting how users cognitively process BPI, (b) empirically tested an operational research model of GPM user comprehension based on the theoretical framework, and (c) interpreted the experimental results in the context of related literatures. Description of Research Methods: This study empirically tested relationships between several variables representing BPM cognitive resources and individual cognitive differences hypothesized to influence the outcomes of user comprehension.
A laboratory experiment, involving 87 upper-level undergraduate students from two universities, analyzed relationships between participant comprehension of two types of GPMs (i.e., metagraphs and UML diagrams) used to portray three types of BPI (i.e., task-centric, resource-centric, and information-centric BPI) by novice GPM users possessing different educational training (i.e., industrial engineering, business management, and computer science training). Dependent variables included assessments of task accuracy, task timeliness, subjective mental workload, and self-efficacy. Covariate effects were also analyzed for two types of participant cognitive abilities (i.e., general cognitive ability (GCA) and attentional abilities) and two types of participant cognitive styles (extroversion-introversion and sensing-intuitive). Multivariate analysis techniques were used to analyze and interpret the data. Discussion of Results: The type of GPM and participants' GCA produced significant effects on the dependent variables in this study. For example, metagraph users produced significantly more desirable results than UML users across all dependent variables, contrary to what was hypothesized. However, if only the BPM cognitive resources (i.e., GPM Type, BPM Type, and the Type of Participant Education) were studied in relation to user comprehension outcomes, spurious conclusions would have been reached. When individual cognitive differences were included in the research model and analyses, results showed participants with higher GCA produced significantly more positive user comprehension outcomes compared to participants with lower GCAs. Also, many of the impacts of differences in the types of BPI and the types of UET were moderated by the differences in participants' GCA and attentional abilities. 
In addition, the relationship between subjective mental workload and task performance (i.e., accuracy and timeliness) suggests a possible GPM cognitive "profile" for user comprehension tasks in a BPM Systems context. These results have important implications for future research and practice in several bodies of knowledge, including GPM user comprehension in management systems engineering, BPM modeling, BPM Systems, HCI, and the cognitive ergonomics literature. / Ph. D.
45

Monitoring and Prognostics for Broaching Processes by Integrating Process Knowledge

Tian, Wenmeng 07 August 2017 (has links)
With the advancement of sensor technology and data-processing capacity, various types of high-volume data are available for process monitoring and prognostics in manufacturing systems. In a broaching process, a multi-toothed broaching tool removes material from the workpiece by sequential engagement and disengagement of multiple cutting edges. The quality of the final part, including its geometric integrity and surface finish, is highly dependent upon the broaching tool condition. Though there has been a considerable amount of research on tool condition monitoring and prognostics for various machining processes, the broaching process is unique in the following aspects: 1) a broaching process involves multiple cutting edges, which jointly contribute to the final part quality; and 2) resharpening and any other process adjustments can only be performed by replacing the whole broaching tool, or at least a whole segment of it. The overarching goal of this research is to explore how engineering knowledge can be used to improve process monitoring and prognostics for a complex manufacturing process like broaching. This dissertation addresses the need for new monitoring and prognostics approaches based on various types of data. Specifically, the research effort focuses on 1) the use of in-situ force profile data for real-time process monitoring and fault diagnosis; 2) degradation characterization for broaching processes at the individual-component level based on image processing; and 3) system-level degradation modeling and remaining useful life prediction for broaching processes based on multiple images. / Ph. D. / Big data provide both opportunities and challenges for product quality assurance and improvement in modern manufacturing systems. In the aerospace industry, broaching is one of the most important manufacturing processes, as it is used to produce the turbine discs in jet engines.
Nonconforming turbine disc quality, whether compromised surface finish or geometric accuracy, will lead to malfunction or even catastrophic failure of the aircraft engine. One of the major sources of nonconforming product quality is excessive tool wear accumulation, along with other abrupt malfunctions of the broaching tool. In broaching processes, multiple cutting edges are sequentially pushed or pulled through the workpiece, and each cutting edge is responsible for shaping the workpiece into a specific intermediate contour. Therefore, a broaching process can be regarded as a multistage manufacturing process with variation propagating through the multiple cutting edges. The overarching goal of this dissertation is to explore how process knowledge can be used to improve process monitoring and prognostics for a complex manufacturing process like broaching. This dissertation focuses on quality assurance and improvement for broaching processes, including: 1) timely detection of abrupt process faults; 2) quantification of tool performance degradation; and 3) prediction of remaining tool life. It thereby contributes to both methodological development and practical applications of advanced sensing analytics in manufacturing systems.
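The abstract does not spell out the dissertation's monitoring method, but the idea of real-time fault detection on in-situ force data can be illustrated with a textbook EWMA control chart that flags an abrupt mean shift. The baseline window, chart parameters, and simulated fault below are all invented for illustration:

```python
import numpy as np

def ewma_alarms(signal, baseline=50, lam=0.2, L=3.0):
    """Flag samples where an EWMA of the force signal drifts outside
    control limits -- a generic detector, not the dissertation's method."""
    mu = np.mean(signal[:baseline])                # in-control mean
    sigma = np.std(signal[:baseline])              # in-control spread
    limit = L * sigma * np.sqrt(lam / (2 - lam))   # steady-state EWMA limit
    z, alarms = mu, []
    for i, x in enumerate(signal):
        z = lam * x + (1 - lam) * z                # exponentially weighted average
        if abs(z - mu) > limit:
            alarms.append(i)
    return alarms

rng = np.random.default_rng(0)
force = rng.normal(100.0, 1.0, 200)                # simulated cutting-force readings
force[150:] += 5.0                                 # abrupt tool fault: mean shift
print(ewma_alarms(force)[:3])                      # indices of first flagged samples
```

In practice the smoothing weight `lam` trades off sensitivity to small drifts (tool wear) against reaction speed to abrupt shifts (tool breakage).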
46

2008 U.S. Presidential Election: Persuasive YouTube Interactions About War, Health Care, and the Economy

Zimmerman, Lindsey 01 December 2009 (has links)
Persuasive appeals posted to United States presidential candidates' YouTube videos were coded using a grounded-theory mixed-methods design. 37,562 comments about education, energy, Iraq, health care, the economy, and the presidential debates were randomly collected by date and time for three studies using coding analysis: the pilot, the presidential primaries, and the presidential election. Seven argument types were identified and theoretically refined according to dual-process models of persuasion: reason-based, candidate-based, emotion-based, endorsements, enthusiasm-heuristic, other-interest, and self-interest. Theoretical comparisons and hypothesis tests of argument types were conducted by issue and election event. Consistent with impression involvement, reason-based appeals were more frequent during the primaries, whereas, consistent with value and outcome involvement, emotion- and candidate-based appeals were more frequent during the election.
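The kind of frequency comparison described (argument type by election event) is commonly run as a chi-square test of independence on a contingency table of coded comment counts; the counts below are invented for illustration, not the study's data:

```python
from scipy.stats import chi2_contingency

# Hypothetical counts of coded appeals: rows are argument types,
# columns are election events (primaries, general election).
table = [[320, 210],   # reason-based
         [150, 260]]   # emotion-based
chi2, p, dof, expected = chi2_contingency(table)
print(dof, p < 0.05)   # a significant result means the argument-type
                       # mix differs between election events
```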
47

Service Oriented System Design Through Process Decomposition

Akbiyik, Eren Kocak 01 September 2008 (has links) (PDF)
Although service-oriented architecture has reached a certain maturity, especially in the technological dimension, there is no commonly accepted approach to designing a software system through the composition and integration of web services. In this thesis, a service-oriented system design approach for Service Oriented Architecture based software development is introduced to fill this gap. The methodology offers a procedural top-down decomposition of a given software system, allowing several abstraction levels. At the higher levels of the decomposition, the system is divided into abstract nodes that correspond to process models in the decomposition tree. Each node is a process and keeps the sequence and state information for its possible sub-processes in the decomposition tree. Nodes defined as process models may include sub-nodes that present details at the intermediate levels of the model. Eventually, at the leaf level, process models are decomposed into existing web services as the atomic units of system execution. All processes constructing the system decomposition tree are modeled with BPEL (Business Process Execution Language) to expose the algorithmic details of the design. This modeling technique is also supported by a graphical modeling language, SOSEML (Service Oriented Software Engineering Modeling Language), which is likewise newly introduced in this thesis.
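The decomposition idea can be sketched as a tree whose internal nodes are processes and whose leaves bind to web services. The sketch below shows sequence semantics only (the thesis models processes in BPEL, which also supports branching and parallelism), and the process and service names are invented:

```python
class ProcessNode:
    """Node in the top-down decomposition tree: either an abstract
    process (ordered sub-processes) or a leaf web-service call."""
    def __init__(self, name, children=None, service=None):
        self.name = name
        self.children = children or []
        self.service = service

    def execute(self):
        if self.service:                 # leaf: atomic web service
            return [self.service]
        calls = []
        for child in self.children:      # sequence semantics only
            calls.extend(child.execute())
        return calls

# Hypothetical order-handling process decomposed down to leaf services
order = ProcessNode("HandleOrder", children=[
    ProcessNode("Validate", service="ValidationService.check"),
    ProcessNode("Fulfil", children=[
        ProcessNode("Reserve", service="StockService.reserve"),
        ProcessNode("Ship", service="ShippingService.ship"),
    ]),
])
print(order.execute())
# ['ValidationService.check', 'StockService.reserve', 'ShippingService.ship']
```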
48

A compliance management framework for business process models

Awad, Ahmed Mahmoud Hany Aly January 2010 (has links)
Companies develop process models to explicitly describe their business operations. At the same time, business operations (business processes) must adhere to various types of compliance requirements. Regulations (e.g., the Sarbanes-Oxley Act of 2002), internal policies, and best practices are just a few sources of compliance requirements. In some cases, non-adherence to compliance requirements makes the organization subject to legal punishment. In other cases, it leads to loss of competitive advantage and thus loss of market share. Unlike the classical, domain-independent behavioral correctness of business processes, compliance requirements are domain-specific. Moreover, compliance requirements change over time: new requirements might appear due to changes in laws and the adoption of new policies. Compliance requirements are issued or enforced by different entities that pursue different objectives. Finally, compliance requirements might affect different aspects of business processes, e.g., control flow and data flow. As a result, it is infeasible to hard-code compliance checks in tools. Rather, a repeatable process of modeling compliance rules and checking them automatically against business processes is needed. This thesis provides a formal approach to support design-time compliance checking of processes. Using visual patterns, it is possible to model compliance requirements concerning control flow, data flow, and conditional rules. Each pattern is mapped to a temporal logic formula. The thesis addresses the problem of consistency checking among various compliance requirements, as they might stem from divergent sources. The thesis also contributes a technique to automatically check compliance requirements against process models using model checking. We show that extra domain knowledge, beyond what is expressed in the compliance rules, is needed to reach correct decisions. In case of violations, we are able to provide useful feedback to the user.
The feedback is in the form of parts of the process model whose execution causes the violation. In some cases, our approach is capable of providing automated remedy of the violation.
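A common compliance pattern of the kind described is "activity a leads to activity b", in temporal logic G(a -> F b). A minimal finite-trace check of that pattern is sketched below; the trace contents are hypothetical, and real model checking as in the thesis explores all runs of the process model rather than individual traces:

```python
def leads_to(trace, a, b):
    """Finite-trace check of the 'leads to' pattern G(a -> F b):
    every occurrence of activity a is eventually followed by b."""
    for i, act in enumerate(trace):
        if act == a and b not in trace[i + 1:]:
            return False
    return True

# Hypothetical order-to-cash execution traces
ok = ["receive_order", "check_credit", "approve", "ship", "invoice"]
bad = ["receive_order", "check_credit", "approve", "ship"]
print(leads_to(ok, "approve", "invoice"))   # True
print(leads_to(bad, "approve", "invoice"))  # False
```

A model checker would additionally report *which* parts of the model allow the violating run, which is the kind of feedback the thesis provides.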
49

Variation modeling, analysis and control for multistage wafer manufacturing processes

Jin, Ran 10 May 2011 (has links)
Geometric quality variables of wafers, such as BOW and WARP, are critical in their applications. Large variation in these quality variables reduces the number of conforming products in downstream production. Therefore, it is important to reduce this variation through variation modeling, analysis and control for multistage wafer manufacturing processes (MWMPs). First, an intermediate feedforward control strategy is developed to adjust and update the control actions based on online measurements of intermediate wafer quality. The control performance is evaluated in an MWMP that transforms ingots into polished wafers. However, in a complex multistage manufacturing process, the quality variables may have nonlinear relationships with the predictors. In this case, piecewise linear regression tree (PLRT) models are used to address nonlinear relationships in MWMPs and improve prediction performance. The obtained PLRT model is further reconfigured to comply with the physical layout of the MWMP for feedforward control purposes. The procedure and effectiveness of the proposed method are shown in a case study of an MWMP. Furthermore, as geometric profiles and quality variables are important quality features of a wafer, fast and accurate measurement of those features is crucial for variation reduction and feedforward control. A sequential measurement strategy is proposed to reduce the number of samples measured on a wafer, yet provide adequate accuracy for quality feature estimation. A Gaussian process model is used to estimate the true profile of a wafer with improved sensing efficiency. Finally, we study the multistage multimode process monitoring problem. We propose to use PLRTs to inter-relate the variables in a multistage multimode process. A unified charting system is developed. We further study the run length distribution and optimize the control chart system while accounting for modeling uncertainties.
We compare the proposed method with risk-adjustment-type control chart systems based on global regression models, in both a simulation study and a wafer manufacturing process.
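The Gaussian process idea behind the sequential measurement strategy can be illustrated in one dimension: fit a GP to a few measured sites and predict the full profile with uncertainty. The profile shape, site locations, and noise level below are invented for illustration and are not the dissertation's data:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(1)
x = np.linspace(-1.0, 1.0, 100).reshape(-1, 1)    # sites across a wafer diameter
true_profile = 0.05 * np.sin(3.0 * x).ravel()     # stand-in for a BOW/WARP-like shape

sites = [0, 20, 40, 60, 80, 99]                   # the few sites actually measured
y = true_profile[sites] + rng.normal(0.0, 1e-4, len(sites))

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=1e-8)
gp.fit(x[sites], y)
pred, std = gp.predict(x, return_std=True)        # estimated profile + uncertainty
print(float(np.max(np.abs(pred - true_profile)))) # reconstruction error
```

In a sequential scheme, the predictive standard deviation `std` would drive where to measure next: sample the site where the GP is least certain, refit, and stop once the uncertainty is acceptable.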
50

Multiscale Modelling as an Aid to Decision Making in the Dairy Industry

Hutchinson, Craig Alan January 2006 (has links)
This work presents the first known attempt to model the dairy business from a multiscale modelling perspective. The multiscale nature of the dairy industry is examined with emphasis on the key decision-making and process scales involved in production. The decision-making scales identified range from the investor level to the plant-operator level, and encompass business, production, plant, and operational levels. The model considers scales from the production manager down to the unit operation. The cheese-making process is used to demonstrate scale identification in the context of the important phenomena and other natural levels of scrutiny of interest to decision makers. This work is a first step towards a multiscale system model capable of delivering information for process troubleshooting, scheduling, process and business optimization, and process-control decision-making in the dairy industry. Here, only material transfer through a process, use of raw materials, and production of manufactured product are modelled; however, an implementation pathway is proposed for adding other models (such as the precipitation of milk protein, which forms curd) to the system model. The software implementation of the dairy-industry multiscale model presented here tests the validity of the proposed:
• object model (object and collection classes) used to model unit operations and integrate them into a process,
• mechanisms for modelling material and energy streams,
• method to create simulations over variable time horizons.
The model was implemented using object-oriented programming (OOP) methods in conjunction with technologies such as Visual Basic .NET and CAPE-OPEN. An OOP object model is presented which successfully enabled the construction of a multiscale model of the cheese-making process. Material content, unit operation, and raw-milk supply models were integrated into the multiscale model.
The model is capable of performing simulations over variable time horizons, from 1 second to multiple years. Mechanisms for modelling material streams, connecting unit operations, and controlling unit operation behaviour were implemented. Simple unit operations such as pumps and storage silos, along with more complex unit operations such as a batch cheese vat, were modelled. Despite some simplifications in the model of the cheese-making process, the simulations successfully reproduced the major features expected from the process and its constituent unit operations. Decision-making information for process operators, plant managers, production managers, and the dairy business manager can be produced from the data generated. The multiscale model can be made more sophisticated by extending the functionality of existing objects and incorporating partial models at other scales. However, increasing the number of reported variables by even a small number can quickly increase the data-processing and storage demands of the model. A unit operation's operational state of existence at any point in time was proposed as a mechanism for integrating and recalculating lower-scale partial models. This mechanism was successfully tested using a unit operation's material content model and is presented here as a new concept in multiscale modelling. The proposed modelling structure can be extended to include any number of partial models and any number of scales.
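The thesis's object model itself is not reproduced in the abstract (its implementation used Visual Basic .NET and CAPE-OPEN); as a minimal Python sketch of the stream/unit-operation idea, with a simple material balance as the content model and all numbers invented:

```python
class MaterialStream:
    """Connects two unit operations; carries a flow rate in kg/s."""
    def __init__(self, flow_rate=0.0):
        self.flow_rate = flow_rate

class Silo:
    """Storage unit operation with a material-balance content model."""
    def __init__(self, inlet, outlet, level=0.0):
        self.inlet, self.outlet, self.level = inlet, outlet, level

    def step(self, dt):
        # material balance over one time step: accumulate in minus out
        self.level += (self.inlet.flow_rate - self.outlet.flow_rate) * dt

# Hypothetical wiring: raw milk supply -> silo -> downstream process
supply, to_process = MaterialStream(2.0), MaterialStream(1.5)
silo = Silo(supply, to_process, level=100.0)
for _ in range(3600):              # simulate one hour in 1 s steps
    silo.step(1.0)
print(silo.level)                  # 100 + (2.0 - 1.5) * 3600 = 1900.0 kg
```

The variable-time-horizon idea corresponds to choosing `dt` and the step count; finer partial models (e.g., curd formation) would hang off the unit operation's state, as the thesis proposes.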
