201

Service quality measurement for non-executive directors in public entities

Van Wyk, M.F. 12 September 2012 (has links)
D.Comm. / In commercial corporations, shareholders, at least in theory, evaluate the performance of the boards they have appointed. Such evaluation is mainly based on the financial performance of the entity. Public (state-funded) entities have only the state as shareholder, and the performance of their boards is not evaluated by the taxpayers who ultimately pay the directors' fees. The term "public entity" refers to 20 corporations with an annual turnover in excess of R 55 billion which are substantially tax-funded or are awarded a market monopoly in terms of legislation by parliament. Although these public entities are regularly criticised in the press, the academic literature reports neither an assessment of the quality of governance by their non-executive directors nor any instrument to use in such an assessment. The aim of this study was to measure the expectations and perceptions of executives in public entities about their non-executive boards' corporate governance service. This began with a literature analysis, firstly to define what constitutes "proper" corporate governance and secondly to find a recognised methodology to use in the development of an assessment instrument. It was found that two main corporate governance models are generally recognised, namely the United Kingdom model and the German model. The United Kingdom model advocates a single board comprising both executive and non-executive directors, while the German model has a supervisory board of non-executive directors overseeing the activities of an executive management board. It was further found that, contrary to King's (1994) recommendation to use unitary boards, the 20 listed public entities all had supervisory boards as advocated in the German model. A procedure advocated by Churchill (1979:65-72), in his paradigm for developing measures of marketing constructs, proved very successful in the development, in the United States of America, of an instrument named SERVQUAL, which was applied in the general service arena where a paying client evaluates a service. Churchill's method was therefore used in this study to develop an instrument called ECGSI to measure the quality of governance of listed public entities' non-executive boards. The opinions of executives attending board meetings, e.g. to make presentations, were used both to develop ECGSI and to measure the quality of the non-executive directors' service.
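The abstract does not give ECGSI's items or scoring rule; as a hypothetical illustration of the SERVQUAL-style gap scoring it builds on (perception minus expectation per dimension), a minimal Python sketch:

    # Hypothetical illustration of SERVQUAL-style gap scoring; the actual ECGSI
    # items, dimensions and weights are not given in the abstract.
    from statistics import mean

    # Example responses on a 1-7 Likert scale, grouped by (assumed) dimension.
    expectations = {
        "accountability": [7, 6, 7],
        "strategic_oversight": [6, 7, 6],
    }
    perceptions = {
        "accountability": [5, 5, 6],
        "strategic_oversight": [4, 5, 5],
    }

    def gap_scores(expect, perceive):
        """Return perception-minus-expectation gap per dimension (negative = shortfall)."""
        return {dim: mean(perceive[dim]) - mean(expect[dim]) for dim in expect}

    print(gap_scores(expectations, perceptions))
    # e.g. {'accountability': -1.33..., 'strategic_oversight': -1.66...}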
202

Discrete random feedback models in industrial quality control /

Bishop, Albert B. January 1957 (has links)
No description available.
203

Quality Control Recommendations for Structural Interventions on Historic Properties

Holland, Michele M. 29 September 2006 (has links)
This thesis presents recommendations for controlling quality in structural interventions on historic properties. Recognizing that establishing quality in the early stages of an intervention can set the standard of quality for an entire project, these recommendations are for the first phase of an intervention, the Pre-Construction Phase. To create these recommendations, first a literature review of past and present intervention methods is conducted. After breaking down the Pre-Construction Phase first into a series of steps, and then each step into a series of details, a standard of quality is established for each detail. The available methods for conducting each detail are then analyzed. Using the literature review and the established standards of quality, recommendations are made as to which method is most appropriate for a given project. These recommendations are applied to two case studies, the structural interventions of Boykin's Tavern and Fallingwater. Finally, conclusions on the use of the proposed quality control recommendations are drawn, and suggestions are given for further work in this field. / Master of Science
204

Near Infrared Investigation of Polypropylene-Clay Nanocomposites for Further Quality Control Purposes-Opportunities and Limitations

Witschnigg, A., Laske, S., Holzer, C., Patel, Rajnikant, Khan, Atif H., Benkreira, Hadj, Coates, Philip D. 31 August 2015 (has links)
Yes / Polymer nanocomposites are usually characterized using various methods, such as small angle X-ray diffraction (XRD) or transmission electron microscopy, to gain insights into the morphology of the material. The disadvantages of these common characterization methods are that they are expensive and time consuming in terms of sample preparation and testing. In this work, near infrared (NIR) spectroscopy is used to characterize nanocomposites produced using a unique twin-screw mini-mixer, which is able to replicate, at ~25 g scale, the same mixing quality as in larger scale twin screw extruders. We correlated the results of X-ray diffraction, transmission electron microscopy, G′ and G″ from rotational rheology, Young's modulus, and tensile strength with those of NIR spectroscopy. Our work has demonstrated that NIR technology is suitable for quantitative characterization of such properties. Furthermore, the results are very promising given that the NIR probe can be installed in a nanocomposite-processing twin screw extruder to measure inline and in real time, and could be used to help optimize the compounding process for increased quality, consistency, and enhanced product properties.
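The abstract does not name the chemometric model used to correlate the NIR spectra with the material properties; a common choice for such calibrations is partial least squares (PLS) regression, sketched below on synthetic spectra (all data and parameters are invented):

    # Illustrative chemometric calibration of NIR spectra against a mechanical
    # property using partial least squares (PLS) regression. The paper does not
    # state which regression model was used; the spectra here are synthetic.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n_samples, n_wavelengths = 60, 200

    # Fake absorbance spectra whose shape depends weakly on a latent "dispersion" level.
    dispersion = rng.uniform(0.0, 1.0, n_samples)            # hidden material state
    baseline = np.linspace(0.2, 0.8, n_wavelengths)
    spectra = baseline + np.outer(dispersion, np.sin(np.linspace(0, 6, n_wavelengths)))
    spectra += rng.normal(scale=0.02, size=spectra.shape)    # measurement noise
    youngs_modulus = 1200 + 400 * dispersion + rng.normal(scale=20, size=n_samples)  # MPa

    X_train, X_test, y_train, y_test = train_test_split(spectra, youngs_modulus, random_state=0)
    pls = PLSRegression(n_components=3)
    pls.fit(X_train, y_train)
    print("R^2 on held-out samples:", round(pls.score(X_test, y_test), 3))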
205

Application of quality control and other statistical methods to the precision wood industry

Rhodes, Raymond C. 17 March 2010 (has links)
Investigations were conducted of the statistical aspects of basic research, engineering development, and economic problems pertinent to the Lane Company of Altavista, Virginia, a cedar chest manufacturer. Estimations were made of the quality level and variability of various manufacturing operations, e.g., the veneer slicer, gang saws, hot plate press, planers, sanders, top panel inspection, and finish inspection. Statistical quality control procedures were established at the points in the processes most feasible for and responsive to their application. A thorough study was made of available data on chests returned by consumers because of open corners. The percentage of returned chests was related to differences in case size and to differences in the predicted equilibrium moisture content of wood in the plant during manufacture. These relationships were presented as a basis for determining the months of the year during which it will be economically profitable to use 3-ply construction for chests of various sizes as a protective action against returned chests. An experiment was designed to estimate the effects of high humidity conditions on the rupture of the corners of cedar chests having different panel constructions, corner constructions, and glue treatments. A proposed design with an outline of the analysis was presented. Some thought was directed to the measurement of the moisture content of cedar wood. It was proposed that a combination of both oven-dry and electrometric methods, rather than an extraction-distillation method alone, might be employed to estimate the true moisture content more precisely under industrial conditions. / Master of Science
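The thesis's actual measurements are not reproduced in the abstract; as a generic illustration of the kind of statistical quality control procedure described, a short Python sketch computing Shewhart X-bar and R chart limits from invented subgroup data:

    # Illustrative Shewhart X-bar and R chart limits for a machining operation,
    # e.g. panel thickness from a planer. The subgroup data are invented; the
    # thesis's actual measurements are not reproduced here.
    import numpy as np

    rng = np.random.default_rng(1)
    subgroups = rng.normal(loc=19.05, scale=0.05, size=(25, 5))  # 25 subgroups of n=5 (mm)

    xbar = subgroups.mean(axis=1)
    ranges = subgroups.max(axis=1) - subgroups.min(axis=1)
    xbar_bar, r_bar = xbar.mean(), ranges.mean()

    # Standard control chart constants for subgroup size n = 5.
    A2, D3, D4 = 0.577, 0.0, 2.114

    print(f"X-bar chart: CL={xbar_bar:.3f}, "
          f"UCL={xbar_bar + A2 * r_bar:.3f}, LCL={xbar_bar - A2 * r_bar:.3f}")
    print(f"R chart:     CL={r_bar:.3f}, UCL={D4 * r_bar:.3f}, LCL={D3 * r_bar:.3f}")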
206

A systematic, experimental methodology for design optimization

Ritchie, Paul Andrew, 1960- January 1988 (has links)
Much attention has been directed at off-line quality control techniques in recent literature. This study is a refinement of and an enhancement to one technique, the Taguchi Method, for determining the optimum setting of design parameters in a product or process. In place of the signal-to-noise ratio, the mean square error (MSE) for each quality characteristic of interest is used. Polynomial models describing mean response and variance are fit to the observed data using statistical methods. The settings for the design parameters are determined by minimizing a statistical model. The model uses a multicriterion objective consisting of the MSE for each quality characteristic of interest. Minimum bias central composite designs are used during the data collection step to determine the settings of the parameters where observations are to be taken. Included is the development of minimum bias designs for various cases. A detailed example is given.
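A minimal sketch of the MSE-based objective described above, assuming a single quality characteristic and invented polynomial models for the mean and variance (the study itself fits these models to experimental data and uses a multicriterion objective over several characteristics):

    # Minimal sketch of the MSE-based alternative to Taguchi's S/N ratio: given
    # (here, assumed) polynomial models for the mean and variance of a quality
    # characteristic, choose the design parameter x that minimises
    # MSE(x) = (mean(x) - target)^2 + variance(x).
    # The polynomial coefficients and target below are invented for illustration.
    from scipy.optimize import minimize_scalar

    TARGET = 10.0

    def mean_model(x):      # fitted polynomial model of the mean response
        return 8.0 + 1.5 * x - 0.2 * x**2

    def var_model(x):       # fitted polynomial model of the response variance
        return 0.5 + 0.3 * (x - 2.0)**2

    def mse(x):
        return (mean_model(x) - TARGET)**2 + var_model(x)

    result = minimize_scalar(mse, bounds=(0.0, 6.0), method="bounded")
    print(f"optimal setting x* = {result.x:.3f}, MSE = {result.fun:.3f}")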
207

Managing the quality of colour television receivers in the Republic of South Africa

Higgins, John Morwood 06 1900 (has links)
This study investigates whether quality management has developed to such an extent that retailers, service repair organisations and consumers are satisfied with the product and repair service quality provided by South African manufacturing companies. To investigate these aspects, the colour television industry has been selected because it contains manufacturing companies of varying sizes and characters, employing different quality policies and achieving different levels of quality performance. It offers relatively standardised products and services, which facilitates intercompany comparisons, and employs a simple flow-type assembly line process that is representative of other mass production industries. The hypotheses are tested by interviewing four selected populations by means of carefully constructed questionnaires, namely a retail population, a repair service population, a consumer population and a manufacturing population. The empirical results are statistically evaluated in terms of the various manufacturers' ability to provide satisfactory product and repair service quality. Consumers and repair service organisations are selected because they represent a broad spectrum of the population with varying, but important, opinions on product and repair service quality. Retailers selling colour television receivers are also selected because they play an important role in the management of quality and range from small independent retailers to large chain stores and discounters. The results obtained from this study show that:
• there is a need among retailers, service repair organisations and consumers for South African manufacturers to improve the quality of colour television receivers
• there is a need among consumers and retailers for manufacturers to improve their repair service quality
• there is a need to improve the quality control procedures employed by the colour television manufacturers. / Business Management / D. Com. (Business Management)
208

Harmonization of internal quality tasks in analytical laboratories. Case studies: water analysis methods using polarographic and voltammetric techniques

Gumede, Njabulo Joyfull January 2008 (has links)
Dissertation submitted in partial compliance with the requirements of the Masters Degree in Technology: Chemistry, in the Faculty of Applied Sciences at the Durban University of Technology, 2008. / In this work, a holistic approach to validating analytical methods was assessed by means of Monte Carlo simulations. This approach involves a statement of the method's scope (i.e. analytes, matrices and concentration levels) and requisites (internal or external); selection of the method's (fit-for-purpose) features; pre-validation and validation of the intermediate accuracy and its assessment by means of Monte Carlo simulations; validation of the other method features; and a validity statement in terms of 'fit-for-purpose' decision making, harmonized validation-control-uncertainty statistics and short-term routine work, with the aim of proposing virtually 'ready-to-use' methods. The protocol could be transferred to other methods. The main aim is to harmonize the work to be done by research teams and routine laboratories, assuming that different aims, strategies and practical viewpoints exist. As a result, the recommended protocol should be seen as a starting point; definitive (harmonized) protocols must still be established by international normalisation/accreditation entities. The Quality Assurance (method verification and Internal Quality Control, IQC) limits, as well as sample uncertainty, were estimated consistently with the validated accuracy statistics, i.e. E ± U(E) and RSDi ± U(RSDi). Two case studies were used to assess Monte Carlo simulation as a tool for method validation in analytical laboratories: the first involves an indirect polarographic method for determining nitrate in waste water, and the second the direct determination of heavy metals in sea water by differential pulse anodic stripping voltammetry, as an example of the application of the protocol. In this sense the uncertainty obtained could be used for decision-making purposes, as it is very tempting to use uncertainty as a commercial argument; in this work it has been shown that the smaller the uncertainty, the better the measurement of the instrument or the laboratory's reputation.
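As a rough illustration of Monte Carlo estimation of measurement uncertainty in this setting, a short Python sketch with an invented measurement model and input uncertainties (not the study's actual methods or data):

    # Illustrative Monte Carlo propagation of measurement uncertainty for an
    # analytical result. The measurement model and input uncertainties are
    # invented: analyte concentration c = (I_sample / I_standard) * c_standard
    # for a voltammetric peak-current comparison against a standard.
    import numpy as np

    rng = np.random.default_rng(42)
    N = 100_000

    # Inputs drawn as (mean, standard uncertainty); values are hypothetical.
    i_sample   = rng.normal(152.0, 2.0, N)    # peak current of the sample (nA)
    i_standard = rng.normal(250.0, 2.5, N)    # peak current of the standard (nA)
    c_standard = rng.normal(10.0, 0.05, N)    # concentration of the standard (ug/L)

    c_sample = i_sample / i_standard * c_standard

    mean = c_sample.mean()
    u = c_sample.std(ddof=1)                  # standard uncertainty from the simulation
    print(f"c = {mean:.2f} ug/L, u(c) = {u:.2f}, expanded U (k=2) = {2 * u:.2f}")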
209

A HACCP study on yoghurt manufacture

Hoolasi, Kasthurie January 2005 (has links)
Thesis (M.Tech.: Quality)-Dept. of Operations & Quality Management, Durban Institute of Technology, 2005. xiii, 68 leaves / The increasing awareness and demand of consumers for safe, high-quality food have led many companies to undertake a comprehensive evaluation and reorganisation of their food control systems in order to improve efficiency, rationalise human resources and harmonise approaches. This evaluation of food control systems has highlighted the need to shift from the traditional approach, which relied heavily on end-product sampling and inspection, towards a preventative safety and quality approach based on risk analysis and on the principles of the hazard analysis critical control point (HACCP) system. Yoghurt is the most popular fermented milk worldwide; the estimated annual consumption in South Africa amounts to nearly 67 million litres. The aim of this study was to implement a HACCP program in a commercial yoghurt factory and then to evaluate the program during certain critical stages of the manufacturing process.
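As a hypothetical illustration of monitoring one HACCP critical control point in yoghurt manufacture, a short Python sketch; the CCP and critical limits below are placeholders, not values from the thesis:

    # Hypothetical sketch of monitoring a single HACCP critical control point
    # (CCP): milk pasteurisation. The critical limits are illustrative only.
    from dataclasses import dataclass

    @dataclass
    class PasteurisationRecord:
        batch: str
        temperature_c: float     # measured holding temperature
        hold_time_s: float       # measured holding time

    # Illustrative critical limits for the pasteurisation CCP.
    MIN_TEMP_C = 85.0
    MIN_HOLD_S = 1800.0          # 30 minutes

    def ccp_check(record: PasteurisationRecord) -> bool:
        """Return True if the batch meets the critical limits, else flag for corrective action."""
        ok = record.temperature_c >= MIN_TEMP_C and record.hold_time_s >= MIN_HOLD_S
        if not ok:
            print(f"CCP deviation in batch {record.batch}: corrective action required")
        return ok

    ccp_check(PasteurisationRecord("B-014", temperature_c=84.2, hold_time_s=1800))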
210

A unified approach to the economic aspects of statistical quality control and improvement

Ghebretensae Manna, Zerai 12 1900 (has links)
Assignment (MSc)--Stellenbosch University, 2004. / ENGLISH ABSTRACT: The design of control charts refers to the selection of the parameters involved, including the sample size n, the control limit width parameter k, and the sampling interval h. The design of the X̄-control chart based on economic as well as statistical considerations is presently one of the more popular subjects of research. Two assumptions are considered in the development and use of the economic or economic statistical models, and these assumptions are potentially critical. It is assumed that the time between process shifts can be modelled by means of the exponential distribution, and further that there is only one assignable cause. Based on these assumptions, economic or economic statistical models are derived using a total cost function per unit time as proposed by the unified approach of the Lorenzen and Vance model (1986). In this approach the relationship between the three control chart parameters as well as the three types of costs is expressed in the total cost function. The optimal parameters are usually obtained by the minimization of the expected total cost per unit time. Nevertheless, few practitioners have tried to optimize the design of their X̄-control charts. One reason for this is that the cost models and their associated optimization techniques are often too complex and difficult for practitioners to understand and apply. However, a user-friendly Excel program has been developed in this study, and the numerical examples presented are executed with this program. The optimization procedure is easy to use, easy to understand, and easy to access. Moreover, the proposed procedure obtains exact optimal design values, in contrast to the approximate designs developed by Duncan (1956) and subsequent researchers. Numerical examples are presented of both the economic and the economic statistical designs of the X̄-control chart in order to illustrate the working of the proposed Excel optimization procedure. Based on the Excel optimization procedure, the results of the economic statistical design are compared to those of a pure economic model. It is shown that the economic statistical designs lead to wider control limits and smaller sampling intervals than the economic designs. Furthermore, even though they are more costly than the economic designs, they guarantee output of better quality while keeping the number of false alarm searches at a minimum, and they also lead to lower process variability. These properties are the direct result of the requirement that the economic statistical design must assure a satisfactory statistical performance. Additionally, extensive sensitivity studies are performed on the economic and economic statistical designs to investigate the effect of the input parameters and of varying the bounds on α, 1−β and the average time-to-signal (ATS), as well as the expected shift size δ, on the minimum expected cost loss and the three control chart decision variables. The analyses show that cost is relatively insensitive to improvement in the type I and type II error rates, but highly sensitive to changes in smaller bounds on ATS and extremely sensitive to smaller shift levels δ. Note: expressions like economic design, economic statistical design, loss cost and assignable cause may seem linguistically and syntactically strange, but are borrowed from and used according to the known literature on the subject.
/ AFRIKAANS SUMMARY: The design of control charts refers to the selection of the parameters involved, including the sample size n, the control limit width parameter k, and the sampling interval h. The design of the X̄-control chart, based on economic as well as statistical considerations, is currently one of the more popular research topics. Two assumptions are taken into account in the development and use of the economic and economic statistical models, and these assumptions are potentially critical. It is assumed that the time between process shifts can be modelled by the exponential distribution, and further that there can be only one cause of a shift, namely an assignable cause. Based on these assumptions, economic and economic statistical models are derived using a total cost function per unit time as proposed by the unified approach of the Lorenzen and Vance model (1986). In this approach the relationship between the three control chart parameters as well as the three types of cost is set out in the total cost function. The optimal parameters are usually found by minimising the expected total cost per unit time. Nevertheless, only a minority of practitioners have so far tried to optimise the design of their X̄-control charts. One reason for this is that the cost models and their associated optimisation techniques are too complex and difficult for practitioners to understand and apply. A user-friendly Excel program has, however, been developed here, and the numerical examples shown for illustration purposes were executed with this program. The optimisation procedure is easy to use and easy to understand, and the software is readily available. Moreover, the proposed procedure calculates exact optimal design values, in contrast to the approximate designs of Duncan (1956) and later researchers. Numerical examples of both the economic and economic statistical designs of the X̄-control chart are provided to illustrate the working of the proposed Excel optimisation procedure. The results of the economic statistical design are compared with those of the purely economic model using the Excel optimisation procedure. It is shown that the economic statistical designs lead to wider control limits and smaller sampling intervals than the economic designs. Although the economic statistical design leads to somewhat higher costs than the solutions of the economic designs, it guarantees better quality while limiting the number of false signals to a minimum. In addition, it leads to smaller process variation. These properties are the direct result of the requirement that the economic statistical design must satisfy certain statistical requirements. Furthermore, extensive sensitivity investigations were performed on the economic and economic statistical designs to determine the effect of the input parameters, as well as of varying bounds on α, 1−β, the average time-to-signal ATS and the shift size δ, on the minimum expected cost loss and the three control chart decision variables. The analyses show that the total cost is relatively insensitive to improvements in the type I and type II error rates, but highly sensitive to changes in the lower bound on ATS and extremely sensitive to small shift levels δ.
Note: The expressions economic design, economic statistical design, loss cost function and assignable cause may appear linguistically and syntactically strange, but are borrowed from, and are used as in, the established literature on this subject.
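The study minimises the Lorenzen and Vance (1986) expected cost per unit time over (n, k, h) in Excel; the Python sketch below uses a deliberately simplified cost function with invented parameters, only to show the structure of such an optimisation (it is not the model used in the study):

    # Simplified illustration of economic design of an X-bar chart: choose the
    # sample size n, control-limit width k and sampling interval h (hours) that
    # minimise an approximate expected cost per hour. This is NOT the Lorenzen
    # and Vance (1986) cost function -- all cost and process parameters below
    # are invented, and several terms of the full model are omitted.
    import numpy as np
    from scipy.stats import norm

    LAMBDA = 0.02               # assumed process shifts per hour
    DELTA = 1.0                 # assumed shift size in units of sigma
    A_FIXED, A_VAR = 1.0, 0.1   # fixed and per-unit sampling cost
    C_FALSE = 50.0              # cost of investigating a false alarm
    C_OOC = 100.0               # hourly cost of running out of control

    def cost_per_hour(n, k, h):
        alpha = 2 * norm.sf(k)                                  # false-alarm prob. per sample
        power = norm.cdf(-k + DELTA * np.sqrt(n)) + norm.cdf(-k - DELTA * np.sqrt(n))
        sampling = (A_FIXED + A_VAR * n) / h
        false_alarms = C_FALSE * alpha / h
        out_of_control = C_OOC * LAMBDA * (h / power)           # rough expected OOC cost per hour
        return sampling + false_alarms + out_of_control

    best = min(
        ((n, k, h) for n in range(2, 16)
                   for k in np.arange(2.0, 4.01, 0.1)
                   for h in np.arange(0.5, 8.01, 0.25)),
        key=lambda p: cost_per_hour(*p),
    )
    print("approx. optimal (n, k, h):", best, "cost/hour:", round(cost_per_hour(*best), 2))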
