About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
11

Implantação do controle estatístico de processo: um estudo de caso em uma empresa do setor alimentício / Implementation of statistical process control: a case study in a company in the food sector

Marcos, Elizangela de Lima [UNESP] 19 December 2014 (has links) (PDF)
The purpose of this dissertation is to analyze the implementation of Statistical Process Control (SPC) for the management of operations in a food-sector company. A theoretical foundation covering Quality Control, Process Variability, and SPC was developed to support the case study, making it possible to describe and discuss every stage of deploying the tool in the context of implementing a new management system. In addition, the work proposes a roadmap to assist the implementation of SPC in manufacturing processes. The results show that the tools composing a corporate package may not be effective when their theoretical applicability to the site where they will be used is not analyzed. The results also point to the need to formalize a standard for applying management tools, avoiding common errors such as the absence of a study to identify not only the quality characteristic but also which equipment or operating points should be monitored by the tool contained in the management system.
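For readers unfamiliar with the tool this case study deploys, a minimal Shewhart X-bar chart sketch in Python follows; the fill-weight data, subgroup size, and chart constant are illustrative assumptions, not values from the dissertation.

import numpy as np

# Illustrative data: 25 subgroups (n = 5) of a monitored quality
# characteristic, e.g. net fill weight in grams; values are made up.
rng = np.random.default_rng(0)
subgroups = rng.normal(loc=500.0, scale=2.0, size=(25, 5))

xbar = subgroups.mean(axis=1)             # subgroup means
xbarbar = xbar.mean()                     # grand mean = center line
rbar = np.ptp(subgroups, axis=1).mean()   # mean subgroup range

A2 = 0.577                                # X-bar chart constant for n = 5
ucl, lcl = xbarbar + A2 * rbar, xbarbar - A2 * rbar

signals = np.flatnonzero((xbar > ucl) | (xbar < lcl))
print(f"CL={xbarbar:.2f} UCL={ucl:.2f} LCL={lcl:.2f} signals={signals}")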
12

Tempos, movimentos e qualidade da operação de colheita mecanizada de soja em função do formato de talhões / Times, movements and quality of the mechanized soybean harvesting operation as a function of plot shape

Paixão, Carla Segatto Strini [UNESP] 20 February 2015 (has links) (PDF)
Information about operating capacity, harvesting efficiency, and harvester performance is of great importance in the management of mechanized agricultural systems, supporting the decisions management must take to optimize them. The objective of this study was therefore to evaluate the times, movements, and quality of the mechanized soybean harvesting operation in plots of different shapes, using operational performance parameters of the harvester and variables representing agronomic aspects of the crop as quality indicators, analyzed with statistical process control tools. The mechanized harvest was carried out on a farm in the municipality of Uberaba, Minas Gerais, in a completely randomized design with 18, 28, and 24 replications for the irregular, rectangular, and trapezoidal plots, respectively; the treatments were defined by the plot shapes. The harvester's activities (harvesting, unloading, maintenance, and weather stops) were monitored. The quality indicators evaluated for harvester performance were forward speed, engine speed, cylinder speed, and concave opening. Losses were determined with circular frames of 0.33 m² each, which together cover an area of approximately 1.00 m²; header losses, losses from the internal mechanisms, total losses, and losses relative to yield were measured. Managerial efficiency (Eg) and maneuver time (Tm) gave the best results for the trapezoidal and rectangular plots, respectively. Harvester performance was found capable of meeting the established specification limits for the forward-speed and cylinder-speed quality indicators, both in the trapezoidal plot. Plot shape affects time efficiency, harvester movements, and the quality of the mechanized soybean harvesting operation, and the process parameters did not prove capable in the short- and long-term evaluation for the three plot shapes.
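The capability claim above rests on comparing process spread with specification limits; a short sketch, with invented speed data and assumed limits, shows the usual Cp/Cpk computation.

import numpy as np

# Illustrative capability check for one quality indicator (forward
# speed, km/h); the data and specification limits are assumptions,
# not measurements from the thesis.
speed = np.random.default_rng(1).normal(5.0, 0.2, size=200)
lsl, usl = 4.5, 5.5                          # assumed specification limits

mu, sigma = speed.mean(), speed.std(ddof=1)
cp = (usl - lsl) / (6 * sigma)               # potential capability
cpk = min(usl - mu, mu - lsl) / (3 * sigma)  # capability allowing for an off-center mean
print(f"Cp={cp:.2f} Cpk={cpk:.2f} (values above 1 suggest a capable process)")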
13

Tempos, movimentos e qualidade da operação de colheita mecanizada de soja em função do formato de talhões / Times, movements and quality of the mechanized soybean harvesting operation as a function of plot shape

Paixão, Carla Segatto Strini. January 2015 (has links)
Advisor: Rouverson Pereira da Silva / Committee: Carlos Eduardo Angeli Furlani / Committee: Diego Augusto Fiorese / Master's
14

Introduction de critères ergonomiques dans un système de génération automatique d’interfaces de supervision / Introduction of ergonomic criteria in an automatic generation system of supervision interfaces

Rechard, Julien 06 November 2015 (has links)
Ecological interface design comprises two steps: a work domain analysis and a transcription of the domain information into ecological representations (Naikar, 2010). This kind of design has proven effective for the supervision of complex systems (Burns, 2008). Vicente (2002), however, highlighted two shortcomings: the very long design time and the difficulty of transcribing a work domain into ecological representations in a formalized way. Moreover, no formal tool exists for validating a work domain. This manuscript proposes several answers to the question: how can the design of an ecological interface be formalized so as to reduce the time and effort the design requires? The first proposal is a verification tool for work domain models based on the TMTA method (Morineau, 2010). The second, delivered through a second version of the Anaxagore design flow (Bignon, 2012), integrates the work of Liu et al. (2002) with the principle of a library of ecological widgets linked to a high-level input schema. Based on the work domain of a sanitary fresh-water system aboard a ship, an ecological interface was implemented and validated experimentally against a conventional interface also generated by the Anaxagore flow. The results show that ecological interfaces promote a larger number of coherent paths through the work domain, as well as better diagnostic accuracy for the operators who use them.
15

Construção dos gráficos de Shewhart e avaliação de sua eficiência no controle de processos de envase / Construction of Shewhart charts and evaluation of their efficiency in the control of filling processes

Cruz, Raul Acedo Pinto Alves da. January 2019 (has links)
Advisor: Marcela Aparecida Guerreiro Machado de Freitas / Committee: Paloma Maria Silva Rocha Rizol / Committee: Fabricio Maciel Gomes / Abstract: Shewhart control charts have long helped maintain the stability of many production processes. Filling processes are among those that depend most heavily on fine adjustment and monitoring, since they run at high speed and contain many critical control points. This work covers the implementation of Shewhart control charts on the liquid-filling equipment of the DPD at FEG-UNESP and the difficulties inherent in adjusting and operating the equipment since its installation. Trials were run to build a database and check whether special causes were interfering with the process, using a histogram, an Ishikawa cause-and-effect diagram, and data-normality checks such as the Shapiro-Wilk test. The average run length to a false alarm (ARL0) was also verified, and shifts in the process mean were simulated to assess the chart's efficiency. The results show that although special causes initially acted on the process, they were corrected, and after adjustment the equipment is in statistical control / Master's
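A minimal sketch of the two checks this abstract mentions, using simulated individual fill volumes rather than the thesis data: a Shapiro-Wilk normality test, and the in-control average run length (ARL0) implied by 3-sigma limits.

import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
fill_ml = rng.normal(300.0, 1.5, size=100)   # simulated fill volumes (mL)

# Shapiro-Wilk: a small p-value argues against the normality that the
# control-limit calculation assumes.
w, p = stats.shapiro(fill_ml)
print(f"Shapiro-Wilk W={w:.3f}, p={p:.3f}")

# With 3-sigma limits on normal data the per-sample false-alarm rate is
# alpha = 2*(1 - Phi(3)), so ARL0 = 1/alpha, about 370 samples.
alpha = 2 * (1 - stats.norm.cdf(3))
print(f"ARL0 = {1 / alpha:.0f}")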
16

Nonparametric procedures for process control when the control value is not specified

Park, Changsoon January 1984 (has links)
In industrial production processes, control charts have been developed to detect changes in the parameters specifying the quality of the production so that rectifying action can be taken to restore the parameters to satisfactory values. Examples of control charts are the Shewhart chart and the cumulative sum control chart (CUSUM chart). In designing a control chart, the exact distribution of the observations, e.g. the normal distribution, is usually assumed to be known. When there is not sufficient information to determine the distribution, however, nonparametric procedures are appropriate. In such cases, the control value for the parameter may not be given because of insufficient information. To construct a control chart when the control value is not given, a standard sample must be obtained while the process is known to be under control, so that the quality of the product can be maintained at the same level as that of the standard sample. For this purpose, samples of fixed size are observed sequentially, and each time a sample is observed a two-sample nonparametric statistic is obtained from the standard sample and the sequentially observed sample. With these sequentially obtained statistics, the usual process control procedure can be carried out. The truncation point denotes the finite run length, or the time at which sufficient information about the distribution of the observations and/or the control value has been obtained so that the procedure may be switched to a parametric procedure or a nonparametric procedure with a control value. To lessen the difficulties posed by the dependent structure of the statistics, we use the fact that, conditioned on the standard sample, the statistics are i.i.d. random variables. Upper and lower bounds of the run length distribution are obtained for the Shewhart chart. A Brownian motion process is used to approximate the discrete-time process of the CUSUM chart. The exact run length distribution of the approximated CUSUM chart is derived using the inverse Laplace transform. Applying an appropriate correction to the boundary improves the approximation. / Ph. D.
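A rough sketch of the scheme described here, under invented data: a fixed standard sample, a two-sample rank statistic for each incoming sample, and a signal when the statistic leaves its limits. The choice of the Mann-Whitney statistic and the limits are illustrative assumptions, not the statistics or bounds derived in the dissertation.

import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
standard = rng.normal(0.0, 1.0, size=50)   # standard sample, process in control

def monitor(samples, lo=0.2, hi=0.8):
    # Signal when the Mann-Whitney statistic, scaled to [0, 1], leaves
    # limits of roughly three standard deviations for these sample sizes;
    # conditioned on `standard`, the successive statistics are i.i.d.
    for t, sample in enumerate(samples, start=1):
        u, _ = stats.mannwhitneyu(sample, standard, alternative="two-sided")
        u_scaled = u / (len(sample) * len(standard))
        if not lo < u_scaled < hi:
            return t, u_scaled
    return None, None

# Sequentially observed samples of fixed size; a mean shift enters at time 11.
samples = [rng.normal(0.0, 1.0, 10) for _ in range(10)]
samples += [rng.normal(1.2, 1.0, 10) for _ in range(10)]
print(monitor(samples))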
17

The use of classification methods for gross error detection in process data

Gerber, Egardt 12 1900 (has links)
Thesis (MScEng)--Stellenbosch University, 2013. / ENGLISH ABSTRACT: All process measurements contain some element of error. Typically, a distinction is made between random errors, with zero expected value, and gross errors with non-zero magnitude. Data Reconciliation (DR) and Gross Error Detection (GED) comprise a collection of techniques designed to attenuate measurement errors in process data in order to reduce the effect of the errors on subsequent use of the data. DR proceeds by finding the optimum adjustments so that reconciled measurement data satisfy imposed process constraints, such as material and energy balances. The DR solution is optimal under the assumed statistical random error model, typically Gaussian with zero mean and known covariance. The presence of outliers and gross errors in the measurements or imposed process constraints invalidates the assumptions underlying DR, so that the DR solution may become biased. GED is required to detect, identify and remove or otherwise compensate for the gross errors. Typically GED relies on formal hypothesis testing of constraint residuals or measurement adjustment-based statistics derived from the assumed random error statistical model. Classification methodologies are methods by which observations are classified as belonging to one of several possible groups. For the GED problem, artificial neural networks (ANNs) have historically been applied to classify a data set as either containing or not containing a gross error. The hypothesis investigated in this thesis is that classification methodologies, specifically classification trees (CT) and linear or quadratic classification functions (LCF, QCF), may provide an alternative to the classical GED techniques. This hypothesis is tested via the modelling of a simple steady-state process unit with associated simulated process measurements. DR is performed on the simulated process measurements in order to satisfy one linear and two nonlinear material conservation constraints. Selected features from the DR procedure and process constraints are incorporated into two separate input vectors for classifier construction. The performance of the classification methodologies developed on each input vector is compared with the classical measurement test in order to address the posed hypothesis. General trends in the results are as follows:
- The power to detect and/or identify a gross error is a strong function of the gross error magnitude as well as location for all the classification methodologies as well as the measurement test.
- For some locations there exist large differences between the power to detect a gross error and the power to identify it correctly. This is consistent over all the classifiers and their associated measurement tests, and indicates significant smearing of gross errors.
- In general, the classification methodologies have higher power for equivalent type I error than the measurement test.
- The measurement test is superior for small magnitude gross errors, and for specific locations, depending on which classification methodology it is compared with.
There is significant scope to extend the work to more complex processes and constraints, including dynamic processes with multiple gross errors in the system. Further investigation into the optimal selection of input vector elements for the classification methodologies is also required.
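A compact sketch of linear data reconciliation and the measurement test that serves as the classical benchmark above, for an assumed one-constraint flow split; the balance matrix, covariance, and measurements are invented for illustration.

import numpy as np
from scipy import stats

# Linear balance A @ x = 0 for an assumed splitter: stream 1 = stream 2 + stream 3.
A = np.array([[1.0, -1.0, -1.0]])
Sigma = np.diag([0.2, 0.1, 0.1]) ** 2        # measurement error covariance
x = np.array([12.0, 6.0, 5.0])               # measured flows; stream 1 carries a gross error

# Reconciliation: optimal adjustments projecting the measurements onto the constraint.
K = Sigma @ A.T @ np.linalg.inv(A @ Sigma @ A.T)
a = -K @ A @ x                               # adjustments, so x_hat = x + a
x_hat = x + a

# Measurement test: standardized adjustments against a normal critical value.
V = K @ A @ Sigma                            # covariance matrix of the adjustments
z = np.abs(a) / np.sqrt(np.diag(V))
flagged = z > stats.norm.ppf(1 - 0.05 / 2)
# With a single constraint every stream gets the same z value, so the test
# detects but cannot localize the error -- the smearing noted above.
print(x_hat.round(2), z.round(2), flagged)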
18

A brief introduction to basic multivariate economic statistical process control

Mudavanhu, Precious 12 1900 (has links)
Thesis (MComm)--Stellenbosch University, 2012. / ENGLISH ABSTRACT: Statistical process control (SPC) plays a very important role in monitoring and improving industrial processes to ensure that products produced or shipped to the customer meet the required specifications. The main tool used in SPC is the statistical control chart. The traditional approach to control chart design assumed that a process is described by a single quality characteristic. However, according to Montgomery and Klatt (1972), industrial processes and products can have more than one quality characteristic, and their joint effect describes product quality. Process monitoring in which several related variables are of interest is referred to as multivariate statistical process control (MSPC). As in SPC, the most vital and commonly used tool in MSPC is the statistical control chart. The design of a control chart requires the user to select three parameters: the sample size n, the sampling interval h, and the control limits k. Several authors have developed control charts based on more than one quality characteristic; among them, Hotelling (1947) pioneered the use of multivariate process control techniques through the development of the T²-control chart, well known as the Hotelling T²-control chart. Since the introduction of the control chart technique, the most common and widely used design method has been the statistical design. According to Montgomery (2005), however, the design of a control chart also has economic implications. Costs are incurred in operating a control chart: costs of sampling and testing, costs associated with investigating an out-of-control signal and possibly correcting any assignable cause found, costs associated with the production of nonconforming products, and so on. This paper gives an overview of the different methods and techniques that have been employed to develop economic statistical models for MSPC. The first multivariate economic model presented is the economic design of Hotelling's T²-control chart to maintain current control of a process, developed by Montgomery and Klatt (1972). This is followed by the work of Kapur and Chao (1996), in which a specification region for the multiple quality characteristics is created and a multivariate quality loss function is used to minimize the total loss to both the producer and the customer. Another approach, by Chou et al. (2002), develops a procedure that simultaneously monitors the process mean and covariance matrix through a quality loss function; the procedure is based on the test statistic −2 ln L, with a cost model built on the ideas of Montgomery and Klatt (1972) and Kapur and Chao (1996). An example of the variable sample size technique in the economic and economic statistical design of the control chart is also presented: the economic and economic statistical design of the T²-control chart with two adaptive sample sizes (Faraz et al., 2010), whose cost model builds on Lorenzen and Vance's (1986) model.
There are several other approaches to the multivariate economic statistical process control (MESPC) problem, but this project focuses on cases in the phase II stage of the process, where the mean vector and the covariance matrix have been fairly well established and can be taken as known, although both remain subject to assignable causes. This latter aspect is often ignored by researchers. Nevertheless, the article by Faraz et al. (2010) is included to give more insight into how more sophisticated approaches may fit in with MESPC, even if only the mean vector may be subject to assignable causes. Keywords: control chart; statistical process control; multivariate statistical process control; multivariate economic statistical process control; multivariate control chart; loss function.
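A minimal sketch of the phase II Hotelling T² statistic with a known in-control mean vector and covariance matrix, the setting this overview focuses on; the dimensions and data are illustrative assumptions.

import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
mu0 = np.array([10.0, 5.0])                   # known in-control mean vector
Sigma0 = np.array([[1.0, 0.3], [0.3, 0.5]])   # known in-control covariance matrix
Sinv = np.linalg.inv(Sigma0)

n, p = 4, 2                                   # subgroup size, number of characteristics
ucl = stats.chi2.ppf(0.9973, df=p)            # chi-square limit when parameters are known

for t in range(20):
    shift = np.array([1.0, 0.0]) if t >= 12 else np.zeros(2)   # mean shift enters at t = 12
    xbar = rng.multivariate_normal(mu0 + shift, Sigma0 / n)    # subgroup mean vector
    t2 = n * (xbar - mu0) @ Sinv @ (xbar - mu0)
    if t2 > ucl:
        print(f"signal at subgroup {t}: T2 = {t2:.1f} > UCL = {ucl:.1f}")
        break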
19

Exploring Data Quality of Weigh-In-Motion Systems

Dai, Chengxin 24 July 2013 (has links)
This research focuses on data quality control methods for evaluating the performance of Weigh-In-Motion (WIM) systems on Oregon highways. It identifies and develops a new methodology and algorithm to explore the accuracy of each station's weight and spacing data at a corridor level, and further implements the Statistical Process Control (SPC) method, the finite mixture model, the axle-spacing error rating method, and the data flag method from published research to examine the soundness of WIM systems. Historical WIM data are used to analyze sensor health and to compare the evaluation results of the methods. The results suggest that the new triangulation method identified most of the possible WIM malfunctions that the other methods sensed, and that, unlike earlier methods, it monitors process behavior while controlling for time and meteorological variables. The SPC method was superior in differentiating sensor noise from sensor errors or drift, but it drew wrong conclusions when an accurate WIM data reference was absent. The axle-spacing error rating method cannot check the essential weight data in special cases, but its multiple linear regression model produced reliable loop-sensor evaluations. The results of the data flag method and the finite mixture model were not accurate, so they are better used as complementary tools alongside the other evaluations. Overall, these data quality analyses are valuable sources for the early detection of system malfunctions, sensor drift, and similar problems, and allow WIM operators to correct a situation in time, before large amounts of measurement data are lost.
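One simple way to flag the slow sensor drift this research targets is a tabular CUSUM on a quantity that should stay constant, such as the mean front-axle weight of a reference truck class; the sketch below uses assumed reference values and simulated data, not the Oregon WIM data.

import numpy as np

rng = np.random.default_rng(5)
# Simulated daily mean front-axle weight (kips) for a reference truck
# class, with a slow downward sensor drift starting at day 60.
days = np.arange(120)
weight = rng.normal(10.0, 0.3, size=120) - np.where(days >= 60, 0.01 * (days - 60), 0.0)

target, sigma = 10.0, 0.3     # assumed in-control mean and standard deviation
k, h = 0.5, 5.0               # CUSUM reference value and decision interval (sd units)

z = (weight - target) / sigma
c_plus = c_minus = 0.0
for day, zi in zip(days, z):
    c_plus = max(0.0, c_plus + zi - k)    # accumulates upward shifts
    c_minus = max(0.0, c_minus - zi - k)  # accumulates downward shifts
    if c_plus > h or c_minus > h:
        print(f"drift signal on day {day}")
        break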
20

Multiscale fractality with application and statistical modeling and estimation for computer experiment of nano-particle fabrication

Woo, Hin Kyeol 24 August 2012 (has links)
The first chapter proposes multifractal analysis to measure the inhomogeneity of regularity of a 1H-NMR spectrum using wavelet-based multifractal tools. Geometric summaries of the multifractal spectrum are informative, and as such are employed to discriminate 1H-NMR spectra associated with different treatments; the methodology is applied to evaluate the effect of sulfur amino acids. The second part of this thesis provides the material needed to understand the engineering background of a nano-particle fabrication process. The third chapter introduces a constrained random-effects model. Since certain combinations of process variables lead to unproductive process outcomes, a logistic model is used to characterize that behavior, while a normal regression serves as the second part of the model for the cases with productive outcomes. Additionally, random effects are included in both the logistic and the normal regression models to describe the potential spatial correlation among the data. The chapter develops a way to approximate the likelihood function and to find estimates that maximize the approximated likelihood. The last chapter presents a method for deciding the sample size under a multi-layer system, a series of nested layers of decreasing size; the focus is deciding the sample size in each layer. The sample-size decision has several objectives, the most important being that the sample should be large enough to point the search in the right direction for the next layer. In particular, the bottom layer, the smallest neighborhood around the optimum, should meet the tolerance requirement. Performing a hypothesis test of whether the next layer includes the optimum gives the required sample size.
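A sketch of the two-part likelihood the third chapter describes, with the random effects omitted and the data invented: a logistic part for whether a run is productive and a normal regression part for the productive outcomes. All names and values are illustrative assumptions.

import numpy as np
from scipy import stats

def two_part_loglik(beta, gamma, sigma, X, productive, y):
    # Logistic part: probability that a run is productive.
    p = 1.0 / (1.0 + np.exp(-(X @ gamma)))
    ll = np.sum(np.where(productive, np.log(p), np.log1p(-p)))
    # Normal regression part: outcomes of the productive runs only.
    ll += stats.norm.logpdf(y[productive], loc=X[productive] @ beta, scale=sigma).sum()
    return ll

# Tiny illustrative data set: an intercept plus two process variables.
rng = np.random.default_rng(6)
X = np.column_stack([np.ones(50), rng.uniform(-1, 1, (50, 2))])
productive = rng.random(50) < 0.7
beta_true = np.array([1.0, 0.5, -0.3])
y = np.where(productive, X @ beta_true + rng.normal(0, 0.2, 50), np.nan)
print(two_part_loglik(beta_true, np.zeros(3), 0.2, X, productive, y))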
