101 |
Development of trust in leadership: Exploring a cognitive process model
Whitmore, Corrie Baird, 30 May 2007
This thesis explored the cognitive, character-inference process that Dirks and Skarlicki (2004) assert contributes to trust development. Self-reported transformational leadership, leader integrity, organizational justice, and leader prototypicality correlated positively with cognitive trust in this sample of 81 student employees (63% female, mean age 20.5) of a large southeastern university. Leader prototypicality, a cognitive evaluation process, partially mediated the relationship between leader integrity and trust. This study's primary contribution was the longitudinal, empirical test of a model of trust development in interdependent leader-follower dyads. Future research may explore other antecedents of trust, assess how the cognitive process of trust development occurs, or investigate the relationship-based social exchange mechanism Dirks and Skarlicki (2004) suggest contributes to the development of affective trust. / Master of Science
|
102 |
A Decision-Support Framework for Design of Non-Residential Net-Zero Energy Buildings
Tiwari, Railesha, 28 April 2015
Designing Net-Zero Energy Buildings (NZEBs) is a complex, collaborative team process in which experts share knowledge toward the common goal of meeting the Net-Zero Energy (NZE) project objectives. Decisions made in the early stages of design strongly affect the final design outcome and energy goals. The Architecture, Engineering and Construction (AEC) industry is pursuing ways to improve the current building design process and project delivery methods for NZEBs. To enable such improvement, it is important to identify the gaps, opportunities for improvement, and ways to structure the decision-making process around the NZE performance outcome. It is also essential to identify the iterative phases of design decisions made by the integrated team of experts in these early stages so that NZEB design decision-making can be facilitated. The lack of a structured approach to help the AEC industry make informed decisions in the NZEB context establishes the need to evaluate the argumentation of the NZEB design decision process. The first step in understanding that process is to map current processes in practice that have succeeded in achieving the NZE goal. Since the energy-use performance goal drives the design process, this research first emphasizes the need to document in detail and investigate the current NZEB design process with knowledge mapping techniques, in order to develop an improved process specific to the NZEB context.
In order to meet this first objective, this research qualitatively analyzed four NZEB case studies that informed decision-making in the early design phases. The four components studied in the early design phases were (1) key stakeholders involved (roles played), (2) phases of assessments (design approach), (3) processes (key processes, sub-processes and design activities affecting performance) and (4) technology (knowledge type and flow). A series of semi-structured, open-ended interviews were conducted with the key decision-makers and decision facilitators to identify their roles in the early design processes, the design approach adopted, the rationale for decision-making, the types of evaluations performed, and the tools used for analysis. The qualitative data analysis was performed through content analysis and cognitive mapping techniques. Through this process, the key phases of decision-making were identified, resulting in an understanding of the path to achieving the NZE design goal and performance outcome.
The second objective of this research was to identify the NZE decision nodes through a comparative investigation of the case studies. This research also explored the key issues specific to each stakeholder group. The inter-relationships between the project objectives, decision context, occupants' usage patterns, strategies and integrated systems, building operation, and renewable energy production were identified through a series of knowledge maps and visual process models, leading to the identification of the key performance indicators. This research reviewed the similarities and differences in the processes to identify significant opportunities for improving the early building design process for NZEBs. It identifies the key decision phases used by the integrated teams and describes the underlying structure that can change the order of those key phases.
A process mapping technique was adapted to capture the practice-based, complex NZEB design approach and to draw insights into the teamwork and interdisciplinary communication, enabling a more comprehensive understanding of the linkages between processes, sub-processes and design activities, knowledge exchange, and decision rationale. The key performance indicators identified for early design of NZEBs informed the development of a decision-support process model that can help the AEC industry make informed decisions. This dissertation helps improve understanding of the linkages between processes, decision nodes and decision rationale to enable industry-wide NZEB design process assessment and improvement. It discusses the benefits the proposed NZEB design process model brings to the AEC industry and explores future development efforts. / Ph. D.
|
103 |
The Therapist and Family Therapy: Satir's Human Validation Process Model
Satir, V., Bitter, James, 01 January 1991
Book Summary: A model for successful integration of multiple points of view, James R. Bitter's THEORY AND PRACTICE OF FAMILY THERAPY AND COUNSELING, 2nd Edition supports the development of personal, professional, and ethical family practice. The book's concrete, empirically based approaches, as well as diagnostics and visual tools, allow readers to observe others in groups. Updated to reflect recent research and current practice, the Second Edition also includes a new chapter on Object Relations Family Therapy. Case studies, sample dialogues, and exercises help readers apply the concepts they have learned. Available with InfoTrac Student Collections http://gocengage.com/infotrac.
|
104 |
Statistical Methods for Non-Linear Profile Monitoring
Quevedo Candela, Ana Valeria, 02 January 2020
There has been increased interest and extensive research in monitoring, over time, processes whose characteristics are represented mathematically in functional forms such as profiles. Most current techniques require all of the data for each profile to determine the state of the process. Thus, quality engineers in industries such as agriculture, aquaculture, and chemical processing cannot make corrections to the current profile that are essential for adjusting their processes at an early stage. In addition, most current techniques focus on the statistical significance of the parameters or features of the model rather than on practical significance, which often relates to the actual quality characteristic. The goal of this research is to provide alternatives that address these two main concerns. First, we study the use of a Shewhart-type control chart to monitor within profiles, where the central line is the predictive mean profile and the control limits are formed from the prediction band. Second, we study a statistic based on a non-linear mixed model, recognizing that the model leads to correlations among the estimated parameters. / Doctor of Philosophy / Checking the stability over time of the quality of a process that is best expressed by a relationship between a quality characteristic and other variables involved in the process has received increasing attention. The goal of this research is to provide alternative methods to determine the state of such a process. Both methods presented here are compared with current methodologies. The first method allows a process to be monitored while the data are still being collected. The second is based on the quality characteristic of the process and takes full advantage of the model structure. Both methods appear to be more robust than the current most well-known method.
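As a rough illustration of the within-profile monitoring idea described in this abstract, the sketch below uses an assumed logistic mean profile and a simple pointwise band (mean plus or minus z times sigma) as stand-ins for the predictive mean profile and prediction band; the model form, parameter values and simulated data are invented for illustration and are not taken from the thesis.

```python
# Minimal sketch: monitor a profile point-by-point as data arrive, flagging
# observations that fall outside a pointwise prediction band around an
# assumed (illustrative) mean profile.

import numpy as np

def predicted_profile(t, asym=100.0, rate=0.15, midpoint=30.0):
    """Illustrative mean profile (logistic growth), e.g. size vs. day."""
    return asym / (1.0 + np.exp(-rate * (t - midpoint)))

def prediction_band(t, sigma=2.5, z=3.0):
    """Pointwise band: mean profile +/- z * sigma (sigma assumed known here)."""
    mean = predicted_profile(t)
    return mean - z * sigma, mean, mean + z * sigma

def monitor_incoming(times, observations):
    """Check each new observation of the current profile as it arrives."""
    signals = []
    for t, y in zip(times, observations):
        lo, _, hi = prediction_band(np.array([t]))
        if not (lo[0] <= y <= hi[0]):
            signals.append((t, y))
    return signals

days = np.arange(1, 61)
current = predicted_profile(days) + np.random.default_rng(1).normal(0, 2.5, days.size)
current[45:] += 12.0                      # simulated shift late in the profile
print("out-of-control points:", monitor_incoming(days, current))
```

The point of the sketch is simply that the current profile can be judged before it is complete, which is the early-correction capability the abstract emphasizes.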
|
105 |
Why do people drive when they can’t see clearly?
Fylan, F., Hughes, A., Wood, J.M., Elliott, David, 24 April 2018
Purpose
Refractive blur is associated with decreased hazard perception and impairments in driving performance, but little is known about why people who have spectacles to correct their distance vision drive with uncorrected vision.
Methods
We conducted six focus groups. Participants were 30 drivers (mean age 45) who reported having driven uncorrected at least twice in the past six months despite having spectacles to correct their distance vision. Focus groups were audio recorded, transcribed verbatim and analysed thematically.
Results
We identified three themes. 1. Responsibility: participants did not feel obliged to drive with optimal vision and believed that others have a responsibility to ensure drivers maintain clear vision. 2. Safe Enough: participants felt safe to drive uncorrected, did not believe they need to wear spectacles to see sufficiently clearly and that they would know if their uncorrected eyesight fails to meet minimum standards. 3. Situations: participants discussed how they would drive uncorrected for short and familiar journeys, when they feel alert, in daylight and in good weather.
Conclusions
Beliefs about the importance of driving with clear vision compete with the benefits of not wearing spectacles. Eyecare professionals should provide more direct advice to patients regarding the need to wear their visual correction for driving.
|
106 |
Wastewater Treatment by Spiral Wound Reverse Osmosis: Development and Validation of a Two Dimensional Process Model
Al-Obaidi, Mudhar A.A.R., Kara-Zaitri, Chakib, Mujtaba, Iqbal, 04 October 2016
Reverse osmosis (RO) has become a significant method for removing salts and organic compounds from seawater and wastewater in recent decades. The spiral-wound module has been widely used due to a number of special features such as high packing density, high separation performance and low operating cost. In this paper, a two-dimensional mathematical model is developed for the transport of dilute aqueous solutions through a spiral-wound RO module, and the operational characteristics of the process under steady-state conditions are analysed. The model is based on the solution-diffusion model coupled with the concentration polarization mechanism. It yields a set of Differential and Algebraic Equations (DAEs), which are solved using the gPROMS software. The model is validated using experimental data from the literature for the rejection of dimethylphenol as solute in aqueous solutions, and is then used to simulate the process under steady-state conditions to gain deeper insight into the process.
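For readers unfamiliar with the solution-diffusion and concentration-polarization combination named above, the following is a minimal, generic sketch of the local flux relations usually written for RO membranes: water flux proportional to the net driving pressure, solute flux proportional to the wall-to-permeate concentration difference, and film-theory polarization at the membrane wall. The function, iteration scheme and parameter values are illustrative assumptions, not the two-dimensional gPROMS model developed in the paper.

```python
# Sketch of local solution-diffusion fluxes with film-theory polarization.
# All parameter values below are made-up placeholders in SI-like units.

import math

def ro_local_fluxes(P_feed, P_perm, C_bulk, A_w, B_s, k_mass, osm_coeff,
                    tol=1e-10, max_iter=100):
    """Solve for water flux Jw, solute flux Js and permeate concentration Cp
    at one point along the membrane, by simple fixed-point iteration."""
    Jw, Cp = 1e-6, 0.0                              # initial guesses
    Js = 0.0
    for _ in range(max_iter):
        # Film theory: wall concentration rises exponentially with Jw/k
        C_wall = Cp + (C_bulk - Cp) * math.exp(Jw / k_mass)
        d_pi = osm_coeff * (C_wall - Cp)            # linearized osmotic pressure difference
        Jw_new = A_w * ((P_feed - P_perm) - d_pi)   # water flux
        Js = B_s * (C_wall - Cp)                    # solute flux
        Cp_new = Js / Jw_new if Jw_new > 0 else 0.0 # permeate concentration
        if abs(Jw_new - Jw) < tol and abs(Cp_new - Cp) < tol:
            break
        Jw, Cp = Jw_new, Cp_new
    return Jw, Js, Cp

# Example with invented, roughly plausible values
print(ro_local_fluxes(P_feed=10e5, P_perm=1e5, C_bulk=1.0,
                      A_w=9e-12, B_s=8e-8, k_mass=3e-5, osm_coeff=2.5e5))
```

In the actual two-dimensional model these local relations are written along the length and width of the spiral-wound leaf and coupled with mass balances, which is what produces the DAE system mentioned in the abstract.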
|
107 |
Study of industrial naphtha catalytic reforming reactions via modelling and simulation
Zakari, A.Y., Aderemi, B.O., Patel, Rajnikant, Mujtaba, Iqbal, 02 April 2019
Steady-state and dynamic modelling and simulation of the catalytic reforming unit of Kaduna Refining & Petrochemical Company, NNPC (Nigeria) were carried out to examine the behaviour of the reactions under both steady and unsteady-state conditions. The basic model, together with kinetic and thermodynamic parameters and properties, was taken from the literature but was developed in the gPROMS (an equation-oriented modelling software) model-building platform for the first time, rather than in MATLAB or the other modelling platforms used by researchers in the past. The simulation was performed using gPROMS and the predictions were validated against those available in the literature. The validated model was then used to monitor the behaviour of the temperature and the concentrations of paraffins, naphthenes and aromatics with respect to both time and reactor height for the industrial refinery in Nigeria. Hydrogen yield, Research Octane Number (RON) and temperature profiles are also reported. The components behave similarly in terms of reactions in the reactors, but the time to attain quasi-steady state differs. The results are in good agreement with the industrial plant data.
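To give a feel for what monitoring the behaviour of paraffins, naphthenes and aromatics involves, here is a deliberately simplified lumped-kinetics sketch: naphthene dehydrogenation to aromatics, ring opening to paraffins, and a lumped hydrocracking loss. The reaction set, rate constants and feed composition are invented placeholders; they are not the kinetic or thermodynamic parameters, nor the gPROMS reactor model, used in this study.

```python
# Toy lumped-kinetics sketch for naphtha reforming (paraffins P, naphthenes N,
# aromatics A, hydrogen H2). Rate constants and feed are illustrative only.

import numpy as np
from scipy.integrate import solve_ivp

k1, k2, k3 = 0.30, 0.10, 0.02   # 1/s, made-up rate constants

def rhs(t, y):
    P, N, A, H2 = y
    r1 = k1 * N            # dehydrogenation: N -> A + 3 H2
    r2 = k2 * N            # ring opening:    N -> P
    r3 = k3 * P            # hydrocracking of paraffins (lumped loss)
    dP = r2 - r3
    dN = -r1 - r2
    dA = r1
    dH2 = 3.0 * r1
    return [dP, dN, dA, dH2]

y0 = [0.45, 0.40, 0.15, 0.0]            # illustrative feed composition (mole fractions)
sol = solve_ivp(rhs, (0.0, 30.0), y0, dense_output=True)
print("final P, N, A, H2:", np.round(sol.y[:, -1], 3))
```

The industrial model adds energy balances, reactor geometry and many more lumps, but the basic exercise of following each lump's concentration over time (and over reactor height) is the same.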
|
108 |
Método de avaliação do modelo de processos de negócio do EKD / Assessment method of business process model of EKD
Pádua, Silvia Inês Dallavalle de, 03 December 2004
Organizations today need systems that can adapt quickly to constant changes in the business environment, and to guarantee that such systems fulfil their purpose, developers must have a deeper understanding of the organization, its objectives, goals and market strategies. The main problem in software systems development has been the difficulty of obtaining information about the application domain. This difficulty led to the emergence of enterprise modelling techniques, a valuable activity for understanding the business environment. EKD (Enterprise Knowledge Development) is a methodology that provides a systematic and controlled way of analyzing, understanding, developing and documenting an organization. Unfortunately, it lacks a well-defined syntax and semantics, which hinders more complex analyses of its models. As a result, the EKD business process model can be ambiguous and difficult to analyze, particularly for more complex systems, and it is not possible to verify the consistency and completeness of the model. In this work, these problems are studied through an approach based on Petri nets. Their formalism makes Petri nets an important modelling technique for representing processes. Furthermore, Petri nets allow each step of the operation to be traced without ambiguity and provide efficient analysis methods that guarantee the model is free of errors. The objective of this work is therefore to develop an assessment method for the EKD business process model (MPN-EKD). This method makes it possible to verify whether the model contains construction errors or deadlocks. It can be applied to models aimed at information system development or workflow control, and can also be used to study work strategies and to simulate workflows.
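As a toy illustration of the kind of check such a method supports, the sketch below maps a made-up ordering process onto a simple Petri-net-like structure and searches its reachability graph for deadlocks, i.e. reachable markings where no transition can fire and the case is not finished. The net, place and transition names are invented for illustration and are not the MPN-EKD evaluation method itself.

```python
# Toy Petri-net deadlock check by exhaustive reachability analysis.

from collections import deque

# Each transition: (tokens consumed per place, tokens produced per place)
transitions = {
    "receive_order": ({"start": 1}, {"review": 1}),
    "approve":       ({"review": 1}, {"fulfil": 1}),
    "reject":        ({"review": 1}, {"done": 1}),
    "ship":          ({"fulfil": 1}, {"done": 1}),
}

def enabled(marking, pre):
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

def find_deadlocks(initial, final_places=("done",)):
    """Breadth-first search of the reachability graph; report markings where
    no transition is enabled and the marking is not a designated final state."""
    seen, queue, deadlocks = set(), deque([initial]), []
    while queue:
        m = queue.popleft()
        key = tuple(sorted(m.items()))
        if key in seen:
            continue
        seen.add(key)
        successors = [fire(m, pre, post) for pre, post in transitions.values()
                      if enabled(m, pre)]
        if not successors and not any(m.get(p, 0) for p in final_places):
            deadlocks.append(m)
        queue.extend(successors)
    return deadlocks

print(find_deadlocks({"start": 1}))   # [] -> no deadlocks in this toy net
```

An EKD business process model translated into such a net can then be checked automatically for construction errors and lock-ups, which is the verification goal described above.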
|
109 |
Exploring complexity metrics for artifact-centric business process models
Marin, Mike Andy, 02 1900
This study explores complexity metrics for business artifact process models described by Case Management Model and Notation (CMMN). Process models are usually described using Business Process Management (BPM), which is a relatively mature discipline with a large number of practitioners. Over the last few decades a new way of describing data-intensive business processes has emerged in BPM literature, for which traditional BPM is no longer adequate. This emerging method, used to describe more flexible processes, is called business artifacts with Guard-Stage-Milestone (GSM). The work on GSM influenced CMMN, which was created to fill a market need for more flexible case management processes for knowledge workers.
Complexity metrics have been developed for traditional BPM models, such as the Business Process Model and Notation (BPMN). However, traditional BPM is not suitable for describing GSM or CMMN process models. Therefore, complexity metrics developed for traditional process models may not be applicable to business artifact process models such as CMMN. This study addresses this gap by exploring complexity metrics for business artifact process models using CMMN. The findings of this study have practical implications for the CMMN standard and for the commercial products implementing CMMN. This research makes the following contributions:
• The development of a formal description of CMMN using first-order logic.
• An exploration of the relationship between CMMN and GSM and the development of transformation procedures between them.
• A comparison between the method complexity of CMMN and other popular process methods, including BPMN, Unified Modeling Language (UML) Activity diagrams, and Event-driven Process Charts (EPC).
• The creation of a systematic literature review of complexity metrics for process models, which was conducted in order to inform the creation of CMMN metrics.
• The identification of a set of complexity metrics for the CMMN standard, which underwent theoretical and empirical validation.
This research advances literature in the areas of method complexity, complexity metrics for process models, declarative processes, and research on CMMN by characterizing CMMN method complexity, identifying complexity metrics for CMMN, and exploring the relationship between CMMN and GSM. / School of Computing / Ph. D. (Computer Science)
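As a purely generic illustration of what a complexity metric over a case-model structure can look like, the sketch below counts modelling elements in a toy CMMN-like model. The data structures, element names and the size metric itself are assumptions made for illustration; they are not the validated CMMN metrics identified in this research.

```python
# Toy size-based complexity metric over an invented case-model structure.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Stage:
    name: str
    tasks: List[str] = field(default_factory=list)
    sentries: int = 0            # entry/exit criteria attached to the stage

@dataclass
class CaseModel:
    name: str
    stages: List[Stage] = field(default_factory=list)
    milestones: int = 0

def size_metric(model: CaseModel) -> int:
    """Count every modelling element once (stages, tasks, sentries, milestones)."""
    return (len(model.stages)
            + sum(len(s.tasks) + s.sentries for s in model.stages)
            + model.milestones)

claims = CaseModel(
    name="claims handling",
    stages=[Stage("intake", ["register claim"], sentries=1),
            Stage("assessment", ["evaluate damage", "request documents"], sentries=2)],
    milestones=2,
)
print(size_metric(claims))   # 2 stages + 3 tasks + 3 sentries + 2 milestones = 10
```

Metrics proposed in the literature go well beyond raw size (control-flow, data and cognitive-weight measures, for example); the sketch is only meant to make the notion of a model-level metric concrete.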
|
110 |
Exploring complexity metrics for artifact-centric business process models
Marin, Mike A., 02 1900
This study explores complexity metrics for business artifact process models described by Case Management Model and Notation (CMMN). Process models are usually described using Business Process Management (BPM), which is a relatively mature discipline with a large number of practitioners. Over the last few decades a new way of describing data-intensive business processes has emerged in BPM literature, for which traditional BPM is no longer adequate. This emerging method, used to describe more flexible processes, is called business artifacts with Guard-Stage-Milestone (GSM). The work on GSM influenced CMMN, which was created to fill a market need for more flexible case management processes for knowledge workers.
Complexity metrics have been developed for traditional BPM models, such as the Business Process Model and Notation (BPMN). However, traditional BPM is not suitable for describing GSM or CMMN process models. Therefore, complexity metrics developed for traditional process models may not be applicable to business artifact process models such as CMMN. This study addresses this gap by exploring complexity metrics for business artifact process models using CMMN. The findings of this study have practical implications for the CMMN standard and for the commercial products implementing CMMN. This research makes the following contributions:
• The development of a formal description of CMMN using first-order logic.
• An exploration of the relationship between CMMN and GSM and the development of transformation procedures between them.
• A comparison between the method complexity of CMMN and other popular process methods, including BPMN, Unified Modeling Language (UML) Activity diagrams, and Event-driven Process Charts (EPC).
• The creation of a systematic literature review of complexity metrics for process models, which was conducted in order to inform the creation of CMMN metrics.
• The identification of a set of complexity metrics for the CMMN standard, which underwent theoretical and empirical validation.
This research advances literature in the areas of method complexity, complexity metrics for process models, declarative processes, and research on CMMN by characterizing CMMN method complexity, identifying complexity metrics for CMMN, and exploring the relationship between CMMN and GSM. / Ph.D. (Computer Science)
|