491

A robust Shewhart control chart adjustment strategy

Zou, Xueli 06 June 2008 (has links)
The standard Shewhart control chart for monitoring process stability is generalized by selecting a point in time at which the distance between the control limits is reduced. Three cost models are developed to describe the total cost per unit time of monitoring the mean of a process using both the standard and the generalized Shewhart control chart. The cost models are developed under the assumption that the quality characteristic of interest is normally distributed with known and constant variance. In the first model, the negative exponential distribution is used to model the time to process shift; the uniform and Weibull distributions serve the same purpose in the second and third models, respectively. The motivation for this effort is to increase chart sensitivity to small but anticipated shifts in the process average. The cost models are constructed to allow the optimal choice of the changeover time and of the best values for the initial and adjusted control limits. The models are analyzed to determine the optimal control chart parameters for both the standard and the generalized control chart, and are also used to provide a comparison with the conventional implementation of the control chart. It is shown that the proposed cost models are efficient and economical. Figures and tables are provided to aid in the design of models for both the standard and the generalized Shewhart control chart. / Ph. D.
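As a rough illustration of the generalized chart described above (not of the dissertation's cost models), the sketch below monitors subgroup means with limits that are tightened at a chosen changeover sample; the width multipliers, changeover point, subgroup size, and simulated shift are illustrative assumptions.

```python
import numpy as np

def generalized_shewhart(xbar, mu0, sigma, n, k_initial=3.0, k_adjusted=2.0, changeover=20):
    """Flag out-of-control subgroup means on a Shewhart X-bar chart whose
    control limits are narrowed after a chosen changeover sample.

    xbar       : sequence of subgroup means
    mu0, sigma : in-control mean and (known, constant) process standard deviation
    n          : subgroup size
    k_initial  : limit width (in sigma/sqrt(n) units) before the changeover
    k_adjusted : tighter limit width after the changeover
    changeover : sample index at which the limits are narrowed
    """
    se = sigma / np.sqrt(n)
    signals = []
    for t, x in enumerate(xbar):
        k = k_initial if t < changeover else k_adjusted
        lcl, ucl = mu0 - k * se, mu0 + k * se
        if x < lcl or x > ucl:
            signals.append(t)
    return signals

# Example: a small sustained shift of 0.5 sigma appears at sample 25.
rng = np.random.default_rng(1)
means = rng.normal(0.0, 1.0 / np.sqrt(5), size=50)
means[25:] += 0.5
print(generalized_shewhart(means, mu0=0.0, sigma=1.0, n=5))
```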
492

Control chart procedures based on cumulative gauging scores

Chung, Jain January 1985 (has links)
Control charts based on cumulative gauging scores rely on gauge scoring systems that transform actual observations into integer gauging scores. In some cases the gauging scores are easy to obtain with a mechanical device, as in go/no-go inspection, so accurate measurements of the selected quality characteristics are not necessary. Different control purposes can also be achieved by using different scoring systems. Cumulative gauging score charts based on two pairs of gauges are proposed to control the process mean or the standard deviation by gauging either one or several observations. Both random walk and CUSUM-type cumulative gauging score charts are used. For controlling the process mean and standard deviation at the same time, a CUSUM-type and a two-dimensional random walk procedure are proposed. A gauging scheme can be applied to multivariate quality control by gauging either the χ² or the T² statistic. A simple multivariate control chart based on the multivariate sign score vector is also proposed. The exact run length distribution of these cumulative gauging score charts can be obtained by formulating the procedures as Markov chains. For some procedures the average run length (ARL) can be obtained in closed form by solving a system of difference equations with appropriate boundary conditions. Comparisons based on the ARL show that the cumulative gauging score charts can detect small shifts in the quality characteristic more quickly than the Shewhart-type X-chart. The efficiency of the CUSUM-type gauging score chart is close to that of the regular CUSUM chart. The random walk-type gauging score chart is more robust than the Shewhart and CUSUM charts to observations that have a heavy-tailed distribution or are serially correlated. For multivariate quality control, a procedure based on gauging the χ² statistic performs better than the χ² chart. A new multivariate control chart procedure that is more robust than the χ² chart to misspecification of the correlation is also proposed. / Ph. D.
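The Markov-chain formulation mentioned in the abstract can be sketched for a one-sided CUSUM-type chart on integer gauging scores: the cumulative score is the chain state, signalling is the absorbing state, and the ARL follows from the fundamental-matrix identity. The four-point scoring system and decision interval below are hypothetical, not the dissertation's.

```python
import numpy as np

def cusum_gauging_arl(score_probs, h):
    """Average run length of a one-sided CUSUM chart on integer gauging scores,
    computed from the absorbing Markov-chain formulation.

    score_probs : dict mapping integer gauging score -> probability
    h           : decision interval; the chart signals when the cusum reaches h
    """
    Q = np.zeros((h, h))                 # transitions among transient states 0..h-1
    for i in range(h):
        for s, p in score_probs.items():
            j = max(0, i + s)            # cusum update with reflection at zero
            if j < h:                    # j >= h is the absorbing (signal) state
                Q[i, j] += p
    arl = np.linalg.solve(np.eye(h) - Q, np.ones(h))
    return arl[0]                        # expected run length starting from cusum = 0

# Hypothetical four-point gauge scoring, in control versus shifted:
in_control = {-1: 0.5, 0: 0.3, 1: 0.15, 2: 0.05}
shifted    = {-1: 0.3, 0: 0.3, 1: 0.25, 2: 0.15}
print(cusum_gauging_arl(in_control, h=5), cusum_gauging_arl(shifted, h=5))
```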
493

On monitoring the attributes of a process

Marcucci, Mark O. January 1982 (has links)
Two prominent monitoring procedures in statistical quality control are the p-chart, for the proportion of items defective, and the c-chart, for the number of defects per item. These procedures are reconsidered, and some extensions are examined for monitoring processes with multiple attributes. Some relevant distribution theory is reviewed, and some new results are given. The distributions considered are multivariate versions of the binomial, Poisson, and chi-squared distributions, plus univariate and multivariate generalized Poisson distributions. All of these distributions prove useful in the discussion of attribute control charts. When quality standards are known, p-charts and c-charts are shown to have certain optimal properties. Generalized p-charts, for monitoring multinomial processes, and generalized c-charts are introduced. Their properties are shown to depend upon multivariate chi-squared and generalized Poisson distributions, respectively. Various techniques are considered for monitoring multivariate Bernoulli, Poisson, multinomial, and generalized Poisson processes. Omnibus procedures are given, and some of their asymptotic properties are derived. Diagnostic procedures based upon both small- and large-sample methods are also examined. / Ph. D.
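A minimal sketch of standards-given attribute chart limits, and of a per-sample chi-square statistic of the kind a generalized p-chart for a multinomial process might use; the category proportions, counts, and limit probability are invented for illustration and are not the dissertation's constructions.

```python
import numpy as np
from scipy import stats

def p_chart_limits(p0, n):
    """3-sigma limits for a standards-given p-chart (proportion defective)."""
    se = np.sqrt(p0 * (1 - p0) / n)
    return max(0.0, p0 - 3 * se), p0 + 3 * se

def c_chart_limits(c0):
    """3-sigma limits for a standards-given c-chart (defects per item)."""
    se = np.sqrt(c0)
    return max(0.0, c0 - 3 * se), c0 + 3 * se

def generalized_p_statistic(counts, p0):
    """Pearson chi-square statistic for one multinomial sample against the
    standard category proportions p0; compared to a chi-square(k-1) limit."""
    counts = np.asarray(counts, dtype=float)
    expected = counts.sum() * np.asarray(p0)
    return float(((counts - expected) ** 2 / expected).sum())

# Example: three defect categories with standard proportions 90/7/3 per cent.
p0 = [0.90, 0.07, 0.03]
stat = generalized_p_statistic([170, 20, 10], p0)
ucl = stats.chi2.ppf(0.9973, df=len(p0) - 1)   # false-alarm rate matched to a 3-sigma chart
print(p_chart_limits(0.05, 200), c_chart_limits(4.0))
print(stat, ucl, stat > ucl)
```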
494

Contributions to experimental design for quality control

Kim, Sang Ik January 1988 (has links)
The parameter design introduced by Taguchi provides a quality control method that can cost-effectively reduce the product variation due to various uncontrollable noise factors, such as product deterioration, manufacturing imperfections, and the environmental conditions under which a product is actually used. This experimental design technique identifies the setting of the control factors that is least sensitive to the noise factors. Taguchi's method utilizes orthogonal arrays, which allow the investigation of main effects only, under the assumption that interaction effects are negligible. In this paper, new techniques are developed to investigate two-factor interactions for 2^t and 3^t factorial parameter designs. The major objective is to identify influential two-factor interactions and take them into account in properly assessing the optimal setting of the control factors. For 2^t factorial parameter designs, new designs for the control factors are developed using a partially balanced array. These designs are characterized by a small number of runs and a balancedness property of the variance-covariance matrix of the estimates of main effects and two-factor interactions. Methods of analyzing the new designs are also developed. For 3^t factorial parameter designs, a two-stage detection procedure is developed using a sequential method, in order to reduce the number of runs needed to detect influential two-factor interactions. An extension of the parameter design to several quality characteristics is also developed by devising suitable statistics to be analyzed, depending on whether a proper loss function can be specified. / Ph. D.
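The dissertation's partially balanced arrays are not reproduced here, but the contrast arithmetic they build on - estimating main effects and two-factor interactions from a ±1-coded 2^t factorial - can be sketched as follows; the responses are hypothetical.

```python
import itertools
import numpy as np

def factorial_effects(y, t):
    """Estimate main effects and two-factor interactions from a full 2^t
    factorial run in standard order (responses y, factor levels coded -1/+1)."""
    design = np.array(list(itertools.product([-1, 1], repeat=t)))  # 2^t x t design matrix
    y = np.asarray(y, dtype=float)
    effects = {}
    for i in range(t):                                   # main-effect contrasts
        effects[f"A{i+1}"] = float(design[:, i] @ y) / (len(y) / 2)
    for i, j in itertools.combinations(range(t), 2):     # two-factor interaction contrasts
        contrast = design[:, i] * design[:, j]
        effects[f"A{i+1}A{j+1}"] = float(contrast @ y) / (len(y) / 2)
    return effects

# Hypothetical responses from a 2^3 experiment in standard order.
y = [10, 14, 11, 19, 10, 15, 12, 20]
print(factorial_effects(y, t=3))
```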
495

Reporting drug errors in a British Acute Hospital Trust

Armitage, Gerry R., Newell, Robert J., Wright, J. January 2007 (has links)
No / Purpose - The purpose of this article is to examine a sample of paper-based incident reports concerning drug incidents to assess the utility of a reporting system. Design/methodology/approach - A 50 per cent random sample of drug-related incident reports between 1999 and 2003 (n=1,253) was reviewed. Details of the incident, including error type and contributory factors, were identified, as was the status of the reporter. Content analysis of the free text established whether the data provided could promote medication safety and organisational learning. Findings - The paper finds that all definitive drug errors (n=991) allowed an error type to be identified, but 276 (27.8 per cent) did not include the contributory factor(s) involved. Content analysis of the errors demonstrated an inconsistent level of completeness, and circumstances, causation and action taken were not always logically related. Inter-rater reliability scores varied. There was sometimes a significant focus on the actions of one individual in comparison to other factors. Research limitations/implications - Incident reports can be biased by psychological phenomena, and may not be representative of the parent organisation beyond those who report. This study was carried out in a single health care organisation and its generalisability may be questioned. Practical implications - How health professionals interpret drug errors and their reporting could be improved. Reporting can be further developed by reference to taxonomies, but their validity should be considered. Incident report analysis can provide an insight into the competence of individual reporters and the organisation's approach to risk management. Originality/value - This paper highlights the various data that can be captured from drug error reports, but also their shortfalls, which include superficial content, incoherence and, varying by professional group, uneven reporting rates and an inclination to target individuals.
496

Evaluating the effectiveness of Umalusi council for quality assurance in general and further education and training as a public entity in the South African education regulatory system

Thomas, Jeremy Ralph 31 March 2008 (has links)
The South African government, like most governments around the world, creates public entities to perform functions on its behalf and to achieve particular objectives, ranging from facilitating investment to delivering services or providing goods and advice. These public entities receive annual funding, in whole or in part, from the national fiscus and report to parliament through their respective ministries. In the 2005/6 financial year, government funded Umalusi to the amount of R7,69 million through direct transfer payments from the Department of Education, excluding any indirect payments from other governmental structures. Many public entities - some three hundred and thirty in South Africa - were promulgated to ensure and improve service delivery to the nation. However, they were not intended to be seen as an extension of their reporting departments. This research evaluates the effectiveness of Umalusi in the education regulatory system and seeks ways to improve public entity effectiveness, using the South African Excellence Model (SAEM) as the base tool to measure organisational effectiveness. A brief conclusion of this study is that Umalusi, as a public entity, is adequately meeting its intended purpose; this is confirmed by its annual reports, which have never received a qualified audit since its inception. The research triangulates the results of the South African Excellence Model, a questionnaire to senior education officials and the auditors' reports to confirm that Umalusi is effective as a public entity in the South African regulatory system. / Business Management / M.Tech. (Business Administration)
497

Lean six sigma deployment and implementation strategies for MCG Industries (PTY) LTD.

Stone, Mark Eric 03 1900 (has links)
Thesis (MBA)--Stellenbosch University, 2007. / ENGLISH ABSTRACT: Continuous improvement is a consensus theme used by many industries for improving product quality and service. In the last decade a new quality philosophy known as Six Sigma has become well established in many companies, e.g. Motorola, General Electric, Ford, Honda, Sony, Hitachi, Texas Instruments and American Express. Some have suggested that the Six Sigma quality improvement philosophy is not only impacting the global business sector, but will also re-shape the discipline of statistics. The Six Sigma philosophy for improving product and service quality is based upon existing principles established by other well-recognised quality experts (i.e. Deming, Juran and Ishikawa). The significant departure of the Six Sigma philosophy from existing quality philosophies is that it promotes a stronger emphasis on monitoring production yield and the manufacturing costs associated with any quality improvement effort. The other significant contribution that Six Sigma makes to the quality movement is its detailed structure for continuous improvement and its step-by-step statistical methodology. The goal of any Six Sigma improvement effort is to obtain a long-term defect rate of only 3.4 defective parts per million manufactured. Lean and Six Sigma are recent developments in continuous improvement methodology that have been popularised by several high-profile companies. The success and complementary nature of these methodologies has led to their combination into a single methodology, commonly called Lean Six Sigma (LSS). Although there is considerable literature available and many implementations of LSS, very little published research addresses the practical experiences of companies that have implemented LSS. To formalise a Lean Six Sigma implementation strategy for MCG Industries, the focus of this research was to answer the research question: "How and why are certain implementations of LSS successful or unsuccessful?" To answer this question, the research investigates the implementation processes of organisations by addressing the following investigative questions: How has LSS been deployed and implemented in organisations? What are the barriers to LSS deployment and how are they overcome? What challenges are experienced during an LSS implementation and how are they overcome? These investigative questions further focused the research and identified several factors that appear to contribute significantly to implementation success: fusing business strategy with continuous improvement strategy; leadership commitment and involvement in the deployment and implementation processes; the use of proficient and experienced consultants; a defined organisational model and infrastructure linking the continuous improvement effort with the performance measurement system and senior leadership; and defined, standardised personnel selection criteria. The purpose of this research is to assist MCG Industries to structure a continuous improvement program that abates or eliminates the negative effects caused by deployment barriers and implementation challenges.
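The 3.4 defects-per-million figure quoted in the abstract follows from the customary Six Sigma convention of a 1.5-sigma long-term shift in the process mean; a small sketch of that arithmetic (the shift value is the conventional assumption, not something derived in this study):

```python
from scipy.stats import norm

def dpmo(sigma_level, long_term_shift=1.5):
    """Defects per million opportunities at a given sigma level, using the
    conventional 1.5-sigma long-term mean shift (one-sided, as is customary)."""
    return norm.sf(sigma_level - long_term_shift) * 1_000_000

print(round(dpmo(6.0), 1))   # ~3.4 defects per million at six sigma
print(round(dpmo(4.5), 0))   # ~1350 defects per million at 4.5 sigma
```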
498

A critical evaluation on the implementation of ISO 9000 in the building industry in Hong Kong

Kwok, Wai-lit, Bernard., 郭偉烈. January 1997 (has links)
published_or_final_version / Business Administration / Master / Master of Business Administration
499

Use of linear and nonlinear programming to optimize surimi seafood

Yoon, Won Byong 09 July 1996 (has links)
Least cost formulations for surimi seafood were studied by linear programming (LP) and nonlinear programming (NLP). The effects of water and starches on the functional properties of Alaska pollock and Pacific whiting surimi gels were investigated. Six starches (modified potato starch, potato starch, modified wheat starch, wheat starch, modified waxy corn starch, and corn starch) and their mixtures were used as ingredients. Mixture and extreme vertices designs were used as the experimental designs, and canonical models were applied in the optimization. Blending different kinds of surimi showed linear trends for each functional property, so LP was successfully employed to optimize surimi lots. Strong interactions were found between surimi and starch and within starch mixtures. The two optimum solutions obtained from LP and NLP were compared in this study. Corn starch and modified waxy corn starch greatly improved the functional properties. / Graduation date: 1997
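A least-cost formulation of the kind described can be sketched as a small linear program; the ingredient set, costs, gel-strength coefficients, and moisture window below are invented for illustration and assume the linear blending behaviour reported in the abstract.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical ingredients: two surimi lots, water, and a starch.
cost = np.array([4.0, 2.5, 0.0, 1.2])          # cost per kg of each ingredient
gel_strength = np.array([900, 600, 0, 300])    # assumed linear contribution per kg
moisture = np.array([0.76, 0.78, 1.0, 0.10])   # mass fraction of water

# Minimise the cost of a 1 kg batch subject to a minimum gel strength and a
# moisture window; ingredient proportions must sum to 1.
res = linprog(
    c=cost,
    A_ub=np.array([-gel_strength,               # gel strength >= 650
                   moisture,                    # moisture <= 0.80
                   -moisture]),                 # moisture >= 0.74
    b_ub=np.array([-650, 0.80, -0.74]),
    A_eq=np.ones((1, 4)), b_eq=np.array([1.0]),
    bounds=[(0, 1)] * 4,
)
print(res.x, res.fun)                           # optimal proportions and batch cost
```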
500

Process parameter optimisation of steel components laser forming using a Taguchi design of experiments approach

Sobetwa, Siyasanga January 2017 (has links)
A research report submitted to the Faculty of Engineering and the Built Environment, University of the Witwatersrand, Johannesburg, in partial fulfilment of the requirements for the degree of Master of Science in Engineering. Date: September 2017, Johannesburg / The focus of this research is the optimisation of process parameters in the Laser Beam Forming (LBF) process, using a 4.4 kW Nd:YAG laser system (Rofin DY 044) to form 200 x 50 x 3 mm mild steel (AISI 1008) samples. The laser power P, beam diameter B, scan velocity V, number of scans N, and cooling flow C were the five input parameters of interest because of their influence on the final formed product. A Taguchi design of experiments (DoE) was used for the selection and combination of the input parameters, and the investigation was carried out both experimentally and computationally. The LBF input parameters were categorised into three levels - low (L), medium (M), and high (H) - to evaluate which settings yield maximum bending and the best surface finish. The conclusion drawn is that samples formed with the low parameter settings showed negligible bending and a good surface finish; samples formed with the medium parameters yielded visible bending and a rougher surface finish; and samples processed with the high LBF parameters yielded the maximum bending and more surface roughness than the other two settings. / MT2018
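One common way to compare parameter levels in a Taguchi analysis of this kind is through larger-is-better signal-to-noise ratios averaged per level of each factor; the sketch below uses hypothetical bend-angle replicates and an assumed L9-style coding for the laser-power column, not the study's data.

```python
import numpy as np

def sn_larger_is_better(y):
    """Taguchi larger-is-better signal-to-noise ratio for one run's replicates."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y ** 2))

# Hypothetical bend-angle replicates (degrees) for nine runs, with the
# laser-power column coded 0/1/2 for the low/medium/high settings.
power_level = np.array([0, 0, 0, 1, 1, 1, 2, 2, 2])
bend_angles = [(1.1, 1.3), (1.2, 1.0), (0.9, 1.2),
               (2.0, 2.3), (2.1, 1.9), (2.2, 2.4),
               (3.1, 3.4), (3.3, 3.0), (3.5, 3.2)]
sn = np.array([sn_larger_is_better(y) for y in bend_angles])
for level, name in zip(range(3), ("low", "medium", "high")):
    print(name, round(sn[power_level == level].mean(), 2))   # mean S/N per power level
```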
