321. Defining display complexity in electric utility system operator displays / McElhaney, Steven Hunt. 15 January 2014.
In the electric utility industry, displays provide power system operators with information on the status of the system; based on that information, operators make decisions on how to maintain the safety, reliability, and efficient operation of the utility generation and transmission grid. Complexity of the data presented, and of the display itself, can lead to errors or misjudgments that cause power system operators to make unwise decisions. The primary goal of this research was to develop a method to quantify display complexity for select displays used by system operators when operating the electric generation and transmission grids. Three studies were performed: (1) complexity measure development, (2) validation of the measure using usability and situation awareness (SA) techniques, and (3) display revisions based on the complexity measure findings. Fifteen different complexity metrics were originally considered (additive models, multiplicative models, and combination models with five different weighting schemes). The additive model with equal weighting was found to be the most sensitive in differentiating displays and was used in the later studies. For the validation study, system operators were asked to complete a usability questionnaire and a paper-based SA test using the current displays. Correlation and scatter plot analyses were used to determine whether the complexity metric was related to the usability and SA scores. Results of the validation study indicated that usability and SA scores for the studied displays were not well correlated with the complexity metric. In study 3, the highest- and lowest-scoring displays were redesigned with an emphasis on maintaining functionality while reducing the aspects of complexity that were driving the complexity score. System operators again completed the usability and SA testing using the redesigned displays, and correlation analysis was again performed. As in study 2, usability scores were not correlated with the complexity metric; however, SA scores were significantly correlated. The complexity metric developed here can be used to quantify the complexity of a display and identify redesign opportunities that reduce non-essential information, as displays that are less complex should result in improved operator performance and satisfaction with the display.
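To illustrate the kind of equally weighted additive model this abstract describes, here is a minimal sketch in Python; the component names and scores are hypothetical, since the abstract does not list the actual complexity components or their scales.

```python
# Minimal sketch of an equally weighted additive display-complexity score.
# The component names and example values are hypothetical; the thesis does not
# list its actual components in this abstract.

def additive_complexity(components: dict[str, float]) -> float:
    """Equally weighted additive model: each normalized component contributes 1/n."""
    n = len(components)
    return sum(components.values()) / n

# Hypothetical normalized component scores (0 = simple, 1 = highly complex)
display_a = {"element_count": 0.8, "color_variety": 0.4, "text_density": 0.6}
display_b = {"element_count": 0.3, "color_variety": 0.2, "text_density": 0.5}

for name, comp in [("Display A", display_a), ("Display B", display_b)]:
    print(f"{name}: complexity = {additive_complexity(comp):.2f}")
```

A multiplicative variant would multiply the component scores instead of summing them; per the abstract, the additive, equally weighted form was the most sensitive in differentiating the displays studied.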
322. Cultural value priorities of managers, skilled and semi-skilled workers in the mining and ore processing functions of a South African mine / du Plessis, Maartin Jacobus. 04 August 2014.
Social and business culture are integrated constructs that determine how organisations function as subsystems of the larger society to which they provide outputs or services (Katz & Kahn, 1978). In South Africa most organisations, including mining organisations, are still conceptualised and structured in a Western/Eurocentric mould (Van der Wal & Ramotsehoa, 2001). The culture of organisations is dominated by these values (Du Plessis, 2012), and the fact that the largest proportion of the population and workforce is neither European nor American, but African, is largely ignored (Xiaoxing et al., 2008). In practice many employees cannot relate to these values, and little congruence exists between organisational values and goals and those of the general workforce (Du Plessis, 2012).
Values are at the heart of culture and influence most, if not all, motivated behaviour (Schwartz, 2006). Individuals function across multiple domains and over time to construct life stories that both shape and reflect the social structures of which they are, or have been, part during their life course (Kohn, 1989). The context of an individual's life course facilitates the acquisition of values, which enables individuals to function in organisations where work occurs at lower or higher degrees of complexity and where people communicate and have access to each other, informed by the business and functional objectives of the organisation, to perform patterns of activities at different levels of complexity, separated into a series of steps or levels of work called the organisational hierarchy.
Using a non-experimental design and a convenience sampling approach, data were collected and analysed with a broad scope of descriptive and inferential statistics, including Confirmatory Factor Analysis and Multi-dimensional Scaling. The existing theory on values as formulated by Schwartz was used to study value priorities at the various levels of work in the South African mining industry. The non-parametric analyses provided a clear indication that significant differences in value priorities exist between the various levels of work. Because levels of work constituted multiple independent variables, the non-parametric analyses were extended to investigate movement in the value priorities across the levels of work. These analyses helped to confirm the hypotheses and also provided an understanding of why the value priorities changed across the various levels of work.
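As a rough illustration of the kind of non-parametric comparison described above, the sketch below runs a Kruskal-Wallis test across three levels of work; the value scores, group sizes, and the choice of test are assumptions for illustration, not the thesis's actual instruments or analyses (which also included Confirmatory Factor Analysis and Multi-dimensional Scaling).

```python
# Hedged sketch: one way a non-parametric comparison of value priorities across
# levels of work could be run. The scores below are synthetic; the thesis used
# the Schwartz value framework and a broader set of analyses.
import numpy as np
from scipy.stats import kruskal

rng = np.random.default_rng(0)
# Hypothetical scores on one value dimension for three levels of work
managers     = rng.normal(3.2, 0.8, 60)
skilled      = rng.normal(3.8, 0.8, 120)
semi_skilled = rng.normal(4.3, 0.8, 200)

stat, p = kruskal(managers, skilled, semi_skilled)
print(f"Kruskal-Wallis H = {stat:.2f}, p = {p:.4f}")
# A small p-value would indicate that value priorities differ across levels of work.
```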
323. An economic/productivity study of a newcomer to the world of technology, industrial robots / Koger, Steven Allen. January 1983.
This thesis explored the industrial robot in a manner appropriate to help a new application engineer justify a robotic purchase. Information was gathered from technical journals, books, technical seminars, and a plant tour. The first step in the justification process is to become thoroughly educated in the field of robotics. Once this is completed, the reader is then free to investigate the three essential methods of economic analysis, which constitute the second step of the justification process. A cash flow analysis is completed so that an individual may tell whether a purchase will be sufficiently profitable; this is based on a five-year standard depreciation schedule. The rate of return on investment analysis shows at what specific rate the robot will generate revenue under certain conditions of use. Finally, the payback period analysis indicates how long it will take for the robot to actually pay for itself and begin to generate profit.
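The three economic measures named in this abstract can be sketched with hypothetical figures, as below; the purchase cost, savings, and the simplified tax treatment are assumptions for illustration, not figures from the thesis.

```python
# Hedged sketch of the three justification measures described above, using
# hypothetical figures; the thesis's own cost and savings data are not given here.

initial_cost = 120_000.0          # robot purchase + installation (hypothetical)
annual_savings = 35_000.0         # labor and scrap savings per year (hypothetical)
years = 5                         # five-year standard depreciation horizon

# Straight-line depreciation over the horizon
annual_depreciation = initial_cost / years

# Simple annual cash flows (taxes ignored in this toy example)
cash_flows = [annual_savings] * years

# Payback period: time for cumulative savings to recover the initial cost
payback_years = initial_cost / annual_savings

# Simple (accounting) rate of return on the investment
average_annual_profit = annual_savings - annual_depreciation
rate_of_return = average_annual_profit / initial_cost

print(f"Payback period: {payback_years:.1f} years")
print(f"Simple rate of return: {rate_of_return:.1%}")
print(f"Total 5-year net cash flow: {sum(cash_flows) - initial_cost:,.0f}")
```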
324. Do you agree with this critique? An analysis of the impacts of feedback, feedback acceptance, and fairness perceptions on performance / Nam, Sophia. 10 June 2014.
Performance feedback has been widely used to improve performance, motivate employees, and increase organizational effectiveness. However, feedback research has yielded mixed results, ranging from improved to decreased performance. The present study examined the impact of feedback on fairness perceptions, feedback acceptance, and subsequent performance in a unique sample: university art students.
Seventy-one art students at a western public university were surveyed immediately following a critique of a draft of their artwork. Feedback was measured by duration and positivity and converted into ratio format for analysis through hierarchical regression. Feedback acceptance and perceptions of the fairness of the feedback were also surveyed. Scores for the final versions of the art projects were collected at the end of the academic semester. In contrast to expectations, perceptions of fairness and feedback acceptance were neither significant moderators nor mediators of performance.
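A minimal sketch of the hierarchical regression approach described above is given below, testing feedback acceptance as a moderator; the data are synthetic and the variable names are illustrative, not the study's actual measures.

```python
# Hedged sketch of a hierarchical (moderated) regression: step 1 enters the
# feedback ratio, step 2 adds the interaction with feedback acceptance.
# The data frame is synthetic; variable names are illustrative only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 71
df = pd.DataFrame({
    "feedback_ratio": rng.uniform(0, 1, n),   # e.g., positivity relative to duration
    "acceptance": rng.normal(0, 1, n),        # standardized feedback-acceptance score
})
df["performance"] = 70 + 5 * df["feedback_ratio"] + rng.normal(0, 5, n)

step1 = smf.ols("performance ~ feedback_ratio", data=df).fit()
step2 = smf.ols("performance ~ feedback_ratio * acceptance", data=df).fit()

# An increase in R^2 from step 1 to step 2, with a significant interaction term,
# would indicate moderation by feedback acceptance.
print(f"Step 1 R^2 = {step1.rsquared:.3f}, Step 2 R^2 = {step2.rsquared:.3f}")
print(step2.params)
```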
325. A process comparison algorithm / Nayestani, Naynaz. January 2002.
The purpose of this research was to design a method for the detailed comparison of processes. People already measure processes and compare those measurements to internal or regulatory standards. The innovation in this thesis is the development of an algorithm for process comparison that accommodates any process, responds to any kind of semantic junction in the process model, deals with uncertain process data, and compares all or part of a process. This is done by the activity-by-activity comparison of data and measurements from an actual process to a model of that process. / IDEF3 diagrams were used for process modeling, a modified PERT network technique was used for process comparison, and the technique incorporated fuzzy data and metric comparison. / Solutions to practical problems showed that the algorithm has no inherent limitations for process comparison. While the test data covered time performance only, other metrics such as cost, quality, human resources, or energy can easily be accommodated.
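One way to picture the activity-by-activity comparison with uncertain (fuzzy) data is sketched below; the triangular fuzzy numbers, the activities, and the centroid defuzzification rule are illustrative assumptions, not the algorithm developed in the thesis.

```python
# Hedged sketch of an activity-by-activity comparison against a process model,
# using triangular fuzzy numbers for uncertain durations. This illustrates the
# general idea only, not the thesis's IDEF3/PERT-based algorithm.
from dataclasses import dataclass

@dataclass
class TriangularFuzzy:
    low: float
    mode: float
    high: float

    def defuzzify(self) -> float:
        """Centroid of a triangular fuzzy number."""
        return (self.low + self.mode + self.high) / 3.0

# Hypothetical model (expected) durations and observed durations, per activity
model = {"receive_order": TriangularFuzzy(1, 2, 3),
         "assemble": TriangularFuzzy(4, 5, 7),
         "inspect": TriangularFuzzy(1, 1, 2)}
actual = {"receive_order": TriangularFuzzy(2, 3, 4),
          "assemble": TriangularFuzzy(4, 6, 9),
          "inspect": TriangularFuzzy(1, 1, 2)}

for activity in model:
    deviation = actual[activity].defuzzify() - model[activity].defuzzify()
    print(f"{activity}: deviation = {deviation:+.2f} time units")
```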
326. Models for estimating design effort / Bashir, Hamdi A. January 2000.
In today's competitive environment, it is necessary to deliver products on time and within budget. Unfortunately, design projects have been plagued by severe cost and schedule overruns, a problem that persists despite the significant advances made in design technology over the last two decades. In most cases, the problem of overruns is due to poor estimation, and the search for a solution has become even more pressing in the present era of shrinking product cycle times. / Driven primarily by this need, this thesis presents new effort estimation models. Unlike existing estimation techniques, which are based on work breakdown structures with respect to process or product, the proposed models are based on a new metric for estimating product complexity derived from product functional decomposition. The validity of the metric as a predictor of design effort was tested using data obtained from an experiment involving simple design tasks, and empirically using historical data collected for 32 projects from 3 companies. / The performance of the new effort estimation models was tested against a number of objective criteria. The results indicated that the average estimation error of the models ranged from 12% to 15%. The improvement in estimation accuracy accomplished by the models ranged from 52% to 64% compared to the estimates originally made by the companies, whose errors ranged from 27% to 41%. / Moreover, models for estimating cost and duration, as well as for updating the estimates during project execution, were derived. The applications of the derived models are described through demonstrative examples. Thus, a complete methodology is given for the estimation of project effort and duration.
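A minimal sketch of fitting an effort model to a complexity metric and reporting estimation error is given below; the complexity scores, effort values, linear model form, and error measure are assumptions for illustration, not the thesis's models or data.

```python
# Hedged sketch: fitting a simple effort-estimation model to a product-complexity
# metric and reporting the mean magnitude of relative error. The data are synthetic;
# the thesis's metric and its 32-project data set are not reproduced here.
import numpy as np

rng = np.random.default_rng(2)
complexity = rng.uniform(10, 100, 32)                # hypothetical complexity scores
effort = 15 * complexity + rng.normal(0, 80, 32)     # hypothetical person-hours

# Least-squares fit of effort on complexity
slope, intercept = np.polyfit(complexity, effort, 1)
predicted = slope * complexity + intercept

mre = np.abs(predicted - effort) / effort
print(f"effort ~ {slope:.1f} * complexity + {intercept:.1f}")
print(f"Mean magnitude of relative error: {mre.mean():.1%}")
```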
327. Sparse data estimation for knowledge processes / Lari, Kamran A. January 2004.
During recent years, industry has increasingly focused on knowledge processes. Similar to traditional or manufacturing processes, knowledge processes need to be managed and controlled in order to provide the expected results for which they were designed. During the last decade, the principles of process management have evolved, especially through work done in software engineering and workflow management. / Process monitoring is one of the major components of any process management system. There have been efforts to design process control and monitoring systems; however, no integrated system has yet been developed as a "generic intelligent system shell". In this dissertation, an architecture for an integrated process monitoring system (IPMS) is developed, whereby the end-to-end activities of a process can be automatically measured and evaluated. To achieve this goal, the various components of the IPMS and the interrelationships among them are designed. / Furthermore, a comprehensive study of the available methodologies and techniques revealed that sparse data estimation (SDE) is the key component of the IPMS that does not yet exist. Consequently, a series of algorithms and methodologies are developed as the basis for the sparse data estimation of knowledge-based processes. Finally, a series of computer programs demonstrates the feasibility and functionality of the proposed approach when applied to a sample process. The sparse data estimation method works not only for knowledge-based processes but also for any process, and indeed for any set of activities that can be modeled as a network.
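The sparse-data-estimation idea can be pictured with a toy network, as in the sketch below; the neighbour-averaging rule, the activities, and the cycle times are illustrative assumptions, not the dissertation's SDE algorithms.

```python
# Hedged sketch of the sparse-data-estimation idea: when an activity in a process
# network lacks a measurement, estimate it from the activities connected to it.
# This simple neighbour-averaging rule is illustrative only.
from statistics import mean

# Hypothetical process network: activity -> list of neighbouring activities
neighbors = {"draft": ["review", "research"],
             "review": ["draft", "approve"],
             "approve": ["review"],
             "research": ["draft"]}

# Observed cycle times in hours; "review" is missing (sparse data)
observed = {"draft": 6.0, "approve": 2.0, "research": 4.0}

def estimate_missing(activity: str) -> float:
    known = [observed[n] for n in neighbors[activity] if n in observed]
    return mean(known) if known else float("nan")

print(f"Estimated cycle time for 'review': {estimate_missing('review'):.1f} h")
```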
328. A study of key mechanisms for concurrent engineering processes / Liu, Yun (1969-). January 2005.
Previous studies (Bhuiyan, 2001; Bhuiyan et al., 2003; Jaafar, 2001) used stochastic computer models to study concurrent engineering (CE) processes. This thesis used the same models but modified the two main mechanisms used in CE, functional interaction and overlap, in order to better understand how they contribute to process performance, i.e., how they affect product development effort and span time. The present study used more realistic development processes than previous research while retaining the same uncertainty conditions, rework, learning, and communication techniques. Simulation results of the updated models were compared to the baseline models in terms of effort versus span time and the distribution of effort during the process versus span time. / Research outcomes indicated that the use of CE was beneficial as long as the uncertainty of information during product development was moderate to low. When uncertainty was high, sequential engineering was best. Several cases were demonstrated. / The distribution of effort during product development was studied, and it showed that processes should be designed to avert rework due to design versions (complete redesign of the product) and should instead emphasize churn (redesign in small steps during teamwork).
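A toy version of the overlap/rework trade-off studied here is sketched below; the durations, the rework rule, and the uncertainty parameter are assumptions for illustration, not the stochastic models used in the thesis.

```python
# Hedged sketch of the overlap/rework trade-off: two coupled activities are
# overlapped by a fraction of their duration, and overlapping under uncertainty
# triggers rework downstream. Parameters are illustrative only.
import random

def simulate(overlap: float, uncertainty: float, runs: int = 10_000) -> tuple[float, float]:
    """Return mean (effort, span time) for a two-activity process."""
    dur_a, dur_b = 10.0, 10.0
    total_effort = total_span = 0.0
    random.seed(42)
    for _ in range(runs):
        rework = dur_b * overlap * uncertainty * random.random()  # rework grows with overlap
        effort = dur_a + dur_b + rework
        span = dur_a + dur_b - dur_a * overlap + rework
        total_effort += effort
        total_span += span
    return total_effort / runs, total_span / runs

for unc in (0.2, 0.9):
    for ov in (0.0, 0.5):
        effort, span = simulate(ov, unc)
        print(f"uncertainty={unc}, overlap={ov}: effort={effort:.1f}, span={span:.1f}")
```

In this toy setup, overlapping shortens span time but adds rework effort, and the penalty grows with uncertainty, which is consistent with the qualitative finding reported above.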
329. A systematic approach to multi-criteria site selection with an analysis of weight sensitivity / Mills, Nancy L. January 1988.
Thesis (Ph. D.)--Oregon State University, 1989. / Typescript (photocopy). Includes bibliographical references. Also available on the World Wide Web.
330. A study of instruction sheets: early history and present use / Smith, Walter Wellman. January 1940.
Thesis (M.S.)--Oregon State College, 1941. / Typescript. Includes bibliographical references (leaves 66-67). Also available on the World Wide Web.