11

Micro-injection moulding of three-dimensional integrated microfluidic devices

Attia, Usama M. January 2009 (has links)
This thesis investigates the use of micro-injection moulding (μIM), as a high-volume process, for producing three-dimensional, integrated microfluidic devices. It began with literature reviews covering three topics: μIM of thermoplastic microfluidics, design for three-dimensional (3-D) microfluidics, and functional integration in μIM. Research gaps were identified: designing 3-D microfluidics within the limitations of μIM, process optimisation, and the integration of functional elements. A process chain was presented to fabricate a three-dimensional microfluidic device for a medical application by μIM. The thesis also investigated the effect of processing conditions on the quality of the replicated component. The design-of-experiments (DOE) approach was used to highlight the significant processing conditions that affect the part mass, taking into consideration changes in part geometry. The approach was also used to evaluate the variability within the process and its effect on the replicability of the process. Part flatness was also evaluated with respect to post-filling process parameters. The thesis investigated the possibility of integrating functional elements within μIM to produce microfluidic devices with hybrid structures. The literature reviews highlighted the importance of quality control in high-volume micromoulding and of in-line functional integration in microfluidics. A taxonomy of process integration was also developed based on transformation functions. The experimental results showed that μIM can be used to fabricate microfluidic devices with true three-dimensional structures by subsequent lamination. The DOE results showed a significant effect of individual process variables on the filling quality of the produced components and their flatness. The geometry of the replicated component was shown to affect which parameters were influential. Other variables, in turn, were shown to have a possible effect on process variability. Statistical optimisation tools were used to improve multiple quality criteria. Thermoplastic elastomers (TPE) were processed with μIM to produce hybrid structures with functional elements.
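A minimal sketch of the kind of two-level factorial main-effects calculation used in a DOE study like the one above. The factor names, levels, and part-mass values are invented placeholders, not data from the thesis:

```python
import itertools
import numpy as np

# Assumed micro-moulding factors, in coded units (-1 = low level, +1 = high level).
factors = ["melt_temp", "injection_speed", "holding_pressure"]
design = np.array(list(itertools.product([-1, 1], repeat=3)))  # full 2^3 design

# Hypothetical measured part mass (mg) for each of the 8 runs.
mass = np.array([24.1, 24.6, 24.3, 25.0, 24.8, 25.9, 25.1, 26.2])

# Main effect of a factor = mean response at its high level minus at its low level.
for j, name in enumerate(factors):
    effect = mass[design[:, j] == 1].mean() - mass[design[:, j] == -1].mean()
    print(f"{name}: main effect = {effect:+.3f} mg")
```

Ranking the absolute main effects indicates which processing conditions most affect part mass, which is the role DOE plays in the study.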
12

Acceptance criteria and quality control for the execution of new pavement foundation layers through nondestructive deflection tests

Santi Ferri 14 December 2012 (has links)
Road infrastructure projects represent investments, often public, that decisively influence the social and economic development of a region. Poorly constructed pavements, built with inadequate quality control or failing to meet design requirements in terms of the characteristics and thicknesses of the selected materials, cause social and economic losses, since early pavement deterioration leads to growing direct and indirect costs of road maintenance and operation. It is therefore essential to adopt construction quality-assurance measures so that the investment is more likely to achieve its expected return as planned. In this context, this work conducts a literature review and proposes auxiliary analysis tools for the deflection-based quality control of pavement foundation systems. The studies are based on statistical analysis criteria and on the failure criteria adopted in design methods currently in use in Brazil and abroad. Simulations and case studies are used to validate and present the proposed models.
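A hedged sketch of a deflection-based statistical acceptance test of the general kind discussed above: a characteristic deflection (mean plus a one-sided tolerance factor times the standard deviation) is compared against a design limit. The z-factor, limit, and deflection readings are illustrative assumptions, not the thesis's criteria:

```python
import statistics

# Hypothetical FWD deflection readings (0.01 mm), one per test station along the lot.
deflections = [42.0, 45.5, 39.8, 47.2, 44.1, 41.3, 46.0, 43.7]
d_limit = 50.0   # assumed design deflection limit
z = 1.645        # assumed one-sided 95% tolerance factor

# Characteristic deflection: mean plus z standard deviations (upper estimate).
d_char = statistics.mean(deflections) + z * statistics.stdev(deflections)
print(f"characteristic deflection = {d_char:.1f} (limit {d_limit})")
print("lot accepted" if d_char <= d_limit else "lot rejected")
```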
13

A statistical quality control support system to facilitate acceptance sampling and control chart procedures

Nadeem, Mohammed January 1994 (has links)
No description available.
14

Error equivalence theory for manufacturing process control

Wang, Hui 01 June 2007 (has links)
Due to uncertainty in manufacturing processes, applied probability and statistics have been widely used for quality and productivity improvement. Despite significant achievements in causality modeling for the control of process variations, there is a lack of understanding of the error equivalence phenomenon, i.e., the mechanism by which different error sources result in identical variation patterns on part features. This so-called error equivalence phenomenon can have dual effects on dimensional control: it significantly increases the complexity of root cause identification, but it also provides an opportunity to use one error source to counteract or compensate for the others. Most previous research has focused on the analysis of individual errors, process modeling of variation propagation, process diagnosis, reduction of sensing noise, and error compensation for machine tools. This dissertation presents a mathematical formulation of error equivalence to achieve a better, more insightful understanding and control of manufacturing processes. The first issue studied is the mathematical modeling of the error equivalence phenomenon in manufacturing to predict product variation. Using kinematic analysis and analytical geometry, the research derives an error equivalence model that can transform different types of errors into the equivalent amount of one base error. A causal process model is then developed to predict the joint impact of multiple process errors on product features. Second, error equivalence analysis is conducted for root cause identification. Based on the error equivalence model, the study proposes a sequential root cause identification procedure to detect and pinpoint error sources. Compared with the conventional measurement strategy, the proposed sequential procedure identifies potential error sources more effectively. Finally, an error-canceling-error compensation strategy integrating statistical quality control is proposed. A novel error compensation approach compensates for process errors by controlling the base error. The adjustment process and product quality are monitored by quality control charts, and based on the monitoring results, an updating scheme is developed to enhance the stability and sensitivity of the compensation algorithm. These aspects constitute the "Error Equivalence Theory". The research will lead to new analytical tools and algorithms for continuous variation reduction and quality improvement in manufacturing.
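A small sketch of the error-equivalence idea under the assumption of linearised error propagation: two error sources act through kinematic Jacobians, and the second source is mapped to the equivalent amount of the base error. The matrices and error values are illustrative, not taken from the dissertation:

```python
import numpy as np

# Assumed linearised error model: feature deviation y = J1 @ e1 + J2 @ e2,
# where e1 is the chosen base error and e2 a second error source.
J1 = np.array([[1.0, 0.2], [0.0, 1.0], [0.5, 0.3]])  # base-error Jacobian (assumed)
J2 = np.array([[0.8, 0.1], [0.1, 0.9], [0.4, 0.2]])  # second-source Jacobian (assumed)
e2 = np.array([0.05, -0.02])                          # second-source error (assumed)

y = J2 @ e2                      # deviation actually caused by source 2
e1_eq = np.linalg.pinv(J1) @ y   # equivalent amount of the base error

print("feature deviation:    ", y)
print("equivalent base error:", e1_eq)
# A small residual means source 2 is (nearly) equivalent to the base error;
# compensating with -e1_eq would then cancel most of the deviation.
print("residual:             ", y - J1 @ e1_eq)
```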
15

A study of control chart performance when the process mean wanders according to the AR(1) model

Leoni, Roberto Campos. January 2011 (has links)
The design of control charts for monitoring the process mean assumes that this parameter remains fixed at its target value until the occurrence of a special cause that shifts it. In many processes, however, it is more reasonable to assume that the mean wanders even in the absence of special causes. The first-order autoregressive model, AR(1), has been used to describe this wandering behaviour. When the wandering accounts for a large proportion of the data variability, the best performance of the X-bar chart is obtained with samples of size one (n = 1). The same is not true of the EWMA chart (except when the smoothing parameter λ is very close to one): its best performance is achieved with samples of size n > 1 and a small λ, even when the objective is the rapid detection of large shifts in the process mean. In this study, the performance measure is the TES, the average time between a change in the position around which the mean wanders and its signalling by the control chart. When the process mean wanders, the TES becomes a function of the expected number of visits to the transient states of a Markov chain. / Advisor: Antonio Fernando Branco Costa / Co-advisor: Marcela Aparecida Guerreiro Machado / Committee: Mauro Hugo Mathias / Committee: Fernando Antonio Elias Claro / Master's
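An illustrative simulation of the setting studied: the process mean wanders as an AR(1) process, a sustained shift occurs, and the time for an EWMA chart of sample means to signal is recorded, giving a Monte Carlo estimate of the TES. All parameter values, and the use of conventional EWMA limits that ignore the wandering component, are assumptions of this sketch:

```python
import numpy as np

rng = np.random.default_rng(1)
phi, sigma_mu, sigma = 0.7, 0.5, 1.0  # AR(1) coefficient, wander std, noise std
lam, L, n, delta = 0.1, 2.7, 4, 1.0   # EWMA lambda, limit width, sample size, shift

# Conventional asymptotic EWMA limit for samples of size n (ignores the wander).
limit = L * np.sqrt(lam / (2 - lam)) * sigma / np.sqrt(n)

def time_to_signal():
    mu, z, t = 0.0, 0.0, 0
    while True:
        t += 1
        mu = phi * mu + rng.normal(0.0, sigma_mu)                # wandering mean
        xbar = delta + mu + rng.normal(0.0, sigma / np.sqrt(n))  # sample mean
        z = lam * xbar + (1 - lam) * z                           # EWMA statistic
        if abs(z) > limit:
            return t

# Monte Carlo estimate of the TES after a sustained shift of size delta.
tes = np.mean([time_to_signal() for _ in range(2000)])
print(f"estimated TES: {tes:.1f} sampling intervals")
```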
16

Utilization of the net analyte signal for the validation of multivariate calibration models through the calculation of figures of merit and control charts

Rocha, Werickson Fortunato de Carvalho 18 July 2007 (has links)
Advisor: Ronei Jesus Poppi / Master's dissertation - Universidade Estadual de Campinas, Instituto de Química / This study aimed to validate multivariate calibration models through the calculation of figures of merit and multivariate control charts for the drug Nimesulide using near-infrared spectroscopy. A total of 69 synthetic samples were prepared containing the active principle (Nimesulide) in the range 10.38-39.47% (m/m) in excipients (lactose, povidone KV29-32, cellulose 200, sodium lauryl sulphate, croscarmellose sodium, and magnesium stearate). Of these, 49 samples were used for calibration and 20 for validation, separated by the Kennard-Stone algorithm. Multiplicative scatter correction was applied to the spectra. The following values were then obtained for the figures of merit: limit of detection (0.61), limit of quantification (2.03), accuracy (RMSEC 0.66% (m/m), RMSECV 0.92% (m/m), RMSEP 1.05% (m/m) of Nimesulide), mean selectivity (0.0056), sensitivity (0.0036), inverse analytical sensitivity (0.20 %(m/m)⁻¹ of Nimesulide), and signal-to-noise ratio (181.11).
In the second part of the work, 113 synthetic samples were used to construct multivariate control charts. Three control charts were designed and the control limits for each were calculated: the NAS chart (upper = 6.54x10; lower = 5.25x10), the interferent chart (7.45), and the residual chart (2.38x10). From these charts it was possible to identify which samples were in and out of control: 64 samples were out of control and 20 in control, in accordance with the experimental design. It was thus possible to identify, qualitatively, the Nimesulide samples that were in and out of control. This work therefore suggests a new analytical method for quality control of drugs and for the validation of multivariate calibration models; the results indicate that the model can be used in the pharmaceutical industry as an alternative to the standard method. / Master's / Analytical Chemistry / Master in Chemistry
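A hedged sketch of a NAS-based figure-of-merit calculation of the kind reported above: the net analyte signal is obtained by projecting each spectrum onto the orthogonal complement of the interferent space, and sensitivity and limit of detection follow from the regression of |NAS| on concentration. The spectra, profiles, and noise levels are simulated placeholders, not the Nimesulide data:

```python
import numpy as np

rng = np.random.default_rng(3)
wl, n = 100, 49  # wavelengths, calibration samples (49 matches the study's split)

s_analyte = np.abs(np.sin(np.linspace(0, 3, wl)))  # assumed pure-analyte profile
S_interf = rng.random((wl, 2))                     # assumed interferent profiles
c = rng.uniform(10, 40, n)                         # concentrations, % (m/m)
X = (np.outer(c, s_analyte)                        # analyte contribution
     + rng.random((n, 2)) @ S_interf.T             # interferent contributions
     + rng.normal(0, 0.01, (n, wl)))               # instrumental noise

# NAS: project each spectrum onto the orthogonal complement of the interferent space.
P = np.eye(wl) - S_interf @ np.linalg.pinv(S_interf)
nas = np.linalg.norm(X @ P, axis=1)

sens = np.polyfit(c, nas, 1)[0]  # sensitivity: slope of |NAS| versus concentration
noise = 0.01 * np.sqrt(wl)       # assumed spectral noise propagated to |NAS|
print(f"sensitivity = {sens:.4f}, LOD = {3 * noise / sens:.3f} % (m/m)")
```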
17

Cost Optimisation through Statistical Quality Control: A case study on the plastic industry

Moberg, Pontus, Svensson, Filip January 2021 (has links)
Background. Shewhart was the first to describe, in 1924, the possibilities that come with having a statistically robust process. Since his discovery, the importance of a robust process has become more apparent, together with the consequences of an unstable one. A firm whose manufacturing process is out of statistical control tends to waste money, increase risk, and deliver uncertain quality to its customers. The framework of Statistical Quality Control has been developed since its founding, and today it is a well-established tool used in several industries with successful results. When it was first conceived, complicated calculations had to be performed manually. With digitalisation, the quality tools can be used in real time, providing high-precision information on the quality of the product. Despite this, not all firms or industries use these tools today. The costs that occur in relation to quality, whether incurred in maintaining good quality or arising from poor quality, are called the Cost of Quality. These are often displayed through one of several available cost models. In this thesis, we created a cost model heavily inspired by the P-A-F model. Several earlier studies have shown noticeable results from using SPC, COQ, or a combination of both. Objectives. The objective of this study is to determine whether cost optimisation can be achieved through SQC implementation. The cost optimisation follows from stabilising an unstable process and from the new way of thinking that comes with SQC. Further, the study explores the relationship between cost optimisation and SQC, adding a layer of complexity and understanding to the spread of statistical quality tools and their importance for several industries. This contributes to tightening the bonds between production economics, statistical tools, and quality management even further. Methods. This study combined two closely related methodologies, SPC and Cost of Quality, in the hope of demonstrating a possible cost reduction through stabilising the process. The cost reduction was displayed using an optimisation model based on the P-A-F categories (Prevention, Appraisal, External Failure and Internal Failure) and further developed by adding a fifth parameter for optimising materials (OM). To assess whether the process was in control, we focused on the thickness of the PVC floor: 1008 data points over three weeks were retrieved from the production line, and by analysing these, a conclusion on whether the process was in control could be drawn. Results. None of the three examined weeks was found to be in statistical control, and therefore neither was the total sample. Under the assumption that the firm achieves 100% statistical control over its production process, a possible cost reduction of 874 416 SEK yearly was found. Conclusions. This study has shown that by focusing on stabilising the production process and gaining control over quality-related costs, significant yearly savings can be achieved. Furthermore, an annual cost reduction was found by optimising the usage of materials, moving the assurance of thickness variation from post-production into production.
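A minimal sketch of the cost-of-quality model described above: the P-A-F categories plus the thesis's fifth parameter for optimising materials (OM). All figures are invented placeholders, not the firm's data:

```python
from dataclasses import dataclass

@dataclass
class CostOfQuality:
    prevention: float        # training, SPC software, maintenance
    appraisal: float         # inspection and measurement
    internal_failure: float  # scrap and rework caught in-house
    external_failure: float  # claims, returns, warranty
    optimising_materials: float  # extra material consumed to stay above spec

    def total(self) -> float:
        return (self.prevention + self.appraisal + self.internal_failure
                + self.external_failure + self.optimising_materials)

# Invented yearly figures (SEK) before and after bringing the process in control.
before = CostOfQuality(100_000, 150_000, 500_000, 300_000, 900_000)
after = CostOfQuality(180_000, 150_000, 250_000, 150_000, 400_000)
print(f"possible yearly saving: {before.total() - after.total():,.0f} SEK")
```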
18

Process Monitoring with Multivariate Data: Varying Sample Sizes and Linear Profiles

Kim, Keunpyo 01 December 2003 (has links)
Multivariate control charts are used to monitor a process when more than one quality variable associated with the process is being observed. The multivariate exponentially weighted moving average (MEWMA) control chart is one of the most commonly recommended tools for multivariate process monitoring. The standard practice, when using the MEWMA control chart, is to take samples of fixed size at regular sampling intervals for each variable. In the first part of this dissertation, MEWMA control charts based on sequential sampling schemes with two possible stages are investigated. When sequential sampling with two possible stages is used, observations at a sampling point are taken in two groups, and the number of groups actually taken is a random variable that depends on the data. The basic idea is that sampling starts with a small initial group of observations, and no additional sampling is done at this point if there is no indication of a problem with the process; but if there is some indication of a problem, an additional group of observations is taken at this sampling point. The performance of the sequential sampling (SS) MEWMA control chart is compared to that of standard control charts. It is shown that the SS MEWMA chart is substantially more efficient in detecting changes in the process mean vector than standard control charts that do not use sequential sampling. The situation is also considered where different variables may have different measurement costs. MEWMA control charts with unequal sample sizes based on differing measurement costs are investigated in order to improve the performance of process monitoring. Sequential sampling plans are applied to MEWMA control charts with unequal sample sizes and compared to standard MEWMA control charts with a fixed sample size. The steady-state average time to signal (SSATS) is computed using simulation and compared for selected sets of sample sizes. When different variables have significantly different measurement costs, using unequal sample sizes can be more cost effective than using the same fixed sample size for each variable. In the second part of this dissertation, control chart methods are proposed for process monitoring when the quality of a process or product is characterized by a linear function. For the historical analysis of Phase I data, recommended methods include the use of a bivariate T² chart to check for stability of the regression coefficients, in conjunction with a univariate Shewhart chart to check for stability of the variation about the regression line. The use of three univariate control charts in Phase II is recommended: these monitor the Y-intercept, the slope, and the variance of the deviations about the regression line, respectively. A simulation study shows that this type of Phase II method can detect sustained shifts in the parameters better than competing methods in terms of average run length (ARL) performance. The monitoring of linear profiles is also related to the control charting of regression-adjusted variables and other methods. / Ph. D.
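A hedged sketch of the MEWMA statistic referenced above, in the common form Z_t = λX_t + (1-λ)Z_{t-1} with a signal when T²_t = Z_tᵀ Σ_Z⁻¹ Z_t exceeds a limit h. The data, λ, and the value of h (taken as roughly appropriate for p = 2 and an in-control ARL near 200) are assumptions of the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
p, lam, h = 2, 0.1, 8.66   # dimension, smoothing constant, assumed control limit
sigma = np.eye(p)          # in-control covariance of individual observations

z = np.zeros(p)
for t in range(1, 201):
    x = rng.multivariate_normal(np.zeros(p), sigma)
    if t > 100:                          # sustained mean shift from t = 101 on
        x += np.array([1.0, 0.5])
    z = lam * x + (1 - lam) * z          # MEWMA vector
    # Exact covariance of Z_t for independent observations.
    cov_z = (lam * (1 - (1 - lam) ** (2 * t)) / (2 - lam)) * sigma
    t2 = z @ np.linalg.inv(cov_z) @ z    # T^2-type monitoring statistic
    if t2 > h:
        print(f"MEWMA signals at t = {t} (T2 = {t2:.2f})")
        break
```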
19

Augmenting statistical quality control with machine learning techniques

Φουντουλάκη, Αικατερίνη 09 January 2012 (has links)
This thesis concerns the integration of Statistical Quality Control methods with Machine Learning techniques to better serve the needs of contemporary businesses. To this end, a detailed review of the relevant literature was first carried out to identify the most important shortcomings of Statistical Quality Control. Machine Learning techniques were then used to address these shortcomings. More specifically, a methodology was proposed for recognising mean shifts in autocorrelated multivariate process data, which occur very often in real processes. The proposed methodology was tested in two case studies for its performance and its ability to be applied to data of different natures. The results of these studies are encouraging, as quite high rates of successful mean-shift recognition were achieved. The thesis concludes with a series of findings, a discussion of the contribution of the proposed methodology, and suggested future research directions for its extension.
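An illustrative sketch of the general approach: a supervised classifier is trained to recognise mean shifts from windows of autocorrelated multivariate process data. The AR(1)-style data generator, window length, and choice of logistic regression are assumptions, not the thesis's models:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
p, w, phi = 3, 20, 0.6  # variables, window length, assumed autocorrelation

def window(shift):
    """Simulate one window of AR(1)-autocorrelated multivariate data."""
    x, rows = np.zeros(p), []
    for _ in range(w):
        x = phi * x + rng.normal(0.0, 1.0, p) + shift
        rows.append(x.copy())
    return np.concatenate(rows)  # flatten the window into one feature vector

# 500 in-control windows (label 0) and 500 windows with a mean shift (label 1).
X = np.array([window(np.zeros(p)) for _ in range(500)]
             + [window(rng.uniform(0.5, 1.5, p)) for _ in range(500)])
y = np.array([0] * 500 + [1] * 500)

idx = rng.permutation(len(y))            # shuffle before the train/test split
X, y = X[idx], y[idx]
clf = LogisticRegression(max_iter=1000).fit(X[:800], y[:800])
print("holdout shift-recognition accuracy:", clf.score(X[800:], y[800:]))
```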
20

Quality Data Management in the Next Industrial Revolution: A Study of Prerequisites for Industry 4.0 at GKN Aerospace Sweden

Erkki, Robert, Johnsson, Philip January 2018 (has links)
The so-called Industry 4.0 is commonly described by its proponents as the fourth industrial revolution and promises to turn the manufacturing sector on its head. However, all that glitters is not gold, and in the backwash of hefty consultant fees questions arise: What are the drivers behind Industry 4.0? Which barriers exist? How does a firm prepare its manufacturing procedures in anticipation of the (if ever) coming era? What is the Internet of Things, and what file sizes are characterised as big data? To answer these questions, this thesis aims to resolve the ambiguity surrounding the definitions of Industry 4.0, as well as to clarify the fuzziness of a data-driven manufacturing approach, that is, the comprehensive usage of data, including collection and storage, quality control, and analysis. To do so, this thesis was carried out as a case study at GKN Aerospace Sweden (GAS). Through interviews and observations, as well as a literature review of the subject, the thesis examined different processes' data-driven needs from a quality management perspective. The findings show that the collection of quality data at GAS is mainly concerned with explicitly stated customer requirements. As such, the data available for the examined processes proved inadequate for multivariate analytics. The transition towards a data-driven state of manufacturing involves a five-stage process wherein data collection through sensors is seen as a key enabler for multivariate analytics and a deepened process knowledge. Together, these efforts form the prerequisites for Industry 4.0. In order to effectively start the transition towards Industry 4.0, near-term recommendations for GAS include: capture all data, with emphasis on process data; improve the accessibility of data; and ultimately take advantage of advanced analytics. Collectively, these undertakings pave the way for the actual improvements of Industry 4.0, such as digital twins, machine cognition, and process self-optimisation. Finally, due to the delimitations of the case study, the findings are generalisable only to companies with similar characteristics, i.e. complex processes with low volumes.
