111

Improving Availability of the Pelletization Process

Andreasson, Emil, Åhman, Pontus January 2022 (has links)
The Grate-Kiln-Cooler process is a commonly used sintering method in iron ore pelletization, where the pellets are formed, dried, and hardened. The pellets are oxidized in the rotating kiln, turning magnetite (Fe3O4) into hematite (Fe2O3) and giving the pellets suitable metallurgical attributes for further processing. The process is constantly exposed to thermal and mechanical stress, causing equipment degradation and thus unwanted production stops due to internal process disturbances. A suitable maintenance policy is required to manage the risk of equipment degradation causing these stops. Predictive maintenance (PdM) is the most recent maintenance policy; it uses substantial amounts of production data to foresee breakdowns and thereby indicate the need for maintenance efforts before they occur.

The global supplier of iron ore products, Luossavaara-Kiirunavaara Aktiebolag (LKAB), operates three pelletization plants in Kiruna. One of these plants experiences availability below desired levels, which keeps it from fulfilling its yearly production goals and results in lost revenue. This master's thesis aimed to increase the understanding of which causes influence the Grate-Kiln-Cooler process's availability and, once those causes were identified, to develop a method of monitoring them to predict the need for maintenance (i.e., to incorporate a PdM policy) and mitigate the risk of production stops. The work was conducted using the systematic problem-solving DMAIC methodology.

The refractory material was identified as the primary contributor to the low availability in the investigated plant. Using principal component analysis (PCA) and statistical process control (SPC), a Hotelling T2 chart based on principal components was established to monitor the refractory material's condition. The combined use of PCA and SPC highlighted three tendencies in the kiln that potentially damaged the refractory material and caused production stops: abnormally high refractory material temperatures, periods where the pellets' temperature exceeded the refractory material's temperature, and sporadic heat fluctuations in the refractory material.

The Hotelling T2 chart provided a current-state evaluation of the refractory material's condition and thus indicated the need for maintenance efforts. However, it was not possible to predict breakdowns by identifying patterns in either the T2 statistics or the individual charts. This inability stemmed from lacking documentation, deficient data, and the difficulty of determining the time of breakdown accurately. These obstacles hinder the prediction of breakdowns and therefore need to be addressed to enable a successful PdM strategy.
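A minimal sketch of the kind of PCA-based Hotelling T2 monitoring the abstract describes, assuming Python with numpy/scipy; the synthetic data, the number of retained components, and all variable names are illustrative stand-ins, not taken from the thesis or from LKAB's plant data.

```python
import numpy as np
from scipy.stats import f as f_dist

rng = np.random.default_rng(0)

# Phase I: baseline sensor readings (rows = samples, cols = e.g. kiln
# refractory temperatures); synthetic stand-in for historical plant data.
X = rng.normal(size=(500, 8))

# Center the data and extract principal components from the covariance matrix.
mu = X.mean(axis=0)
Xc = X - mu
eigvals, eigvecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
order = np.argsort(eigvals)[::-1]          # sort components by variance
k = 3                                      # retained components (illustrative)
P, lam = eigvecs[:, order[:k]], eigvals[order[:k]]

def hotelling_t2(x):
    """T2 of a new observation in the k-dimensional principal-component space."""
    scores = (x - mu) @ P
    return float(np.sum(scores**2 / lam))

# Upper control limit from the F distribution for a future observation,
# with Phase I sample size n and k retained components.
n, alpha = X.shape[0], 0.0027
ucl = k * (n - 1) * (n + 1) / (n * (n - k)) * f_dist.ppf(1 - alpha, k, n - k)

x_new = rng.normal(size=8)
print(hotelling_t2(x_new) > ucl)  # True would signal a possible refractory issue
```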
112

Cost Optimisation through Statistical Quality Control : A case study on the plastic industry

Moberg, Pontus, Svensson, Filip January 2021 (has links)
Background. In 1924, Shewhart was the first to describe the possibilities that come with having a statistically robust process. Since his discovery, the importance of a robust process has become more apparent, together with the consequences of an unstable one. A firm whose manufacturing process is out of statistical control tends to waste money, increase risks, and deliver uncertain quality to its customers. The framework of Statistical Quality Control has been developed since its founding, and today it is a well-established tool used in several industries with successful results. When it was first conceived, the complicated calculations involved had to be performed manually. With digitalisation, the quality tools can be used in real time, providing high-precision assessments of product quality. Despite this, not all firms or industries have adopted these tools to date.

The costs that occur in relation to quality, either from maintaining good quality or arising from poor quality, are called the Cost of Quality (COQ). These are often displayed through one of several available cost models. In this thesis, we created a cost model heavily inspired by the P-A-F model. Several earlier studies have shown noticeable results from using SPC, COQ, or a combination of both.

Objectives. The objective of this study is to determine whether cost optimisation can be achieved through SQC implementation. The cost optimisation follows from stabilising an unstable process and from the new way of thinking that comes with SQC. Further, the study explores the relationship between cost optimisation and SQC, adding a layer of complexity and understanding to the spread of statistical quality tools and their importance for several industries. This contributes to tightening the bonds between production economics, statistical tools, and quality management even further.

Methods. This study used two closely related methodologies, combining SPC with Cost of Quality. Their combination was intended to demonstrate a possible cost reduction through stabilising the process. The cost reduction was displayed using an optimisation model based on the P-A-F model (Prevention, Appraisal, External Failure, and Internal Failure), further developed by adding a fifth parameter for optimising materials (OM). To determine whether the process was in control, we focused on the thickness of the PVC floor; 1008 data points over three weeks were retrieved from the production line, and by analysing these, a conclusion on whether the process was in control could be drawn.

Results. None of the three examined weeks was found to be in statistical control, and therefore neither was the total sample. Under the assumption that the firm achieves 100% statistical control over its production process, a possible cost reduction of 874 416 SEK yearly was found.

Conclusions. This study has shown that by focusing on stabilising the production process and gaining control over quality-related costs, significant yearly savings can be achieved. Furthermore, an annual cost reduction was found by optimising the usage of materials, moving the assurance of thickness from post-production inspection into the production process itself.
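A hedged sketch of the in-control check the thesis describes for the thickness data, using an individuals chart with moving-range-based limits; the synthetic measurements stand in for the 1008 PVC thickness points, and the target value and spread are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
# Stand-in for the 1008 thickness measurements (mm) described in the thesis.
thickness = rng.normal(loc=2.0, scale=0.05, size=1008)

# Individuals chart: estimate sigma from the average moving range (d2 = 1.128
# for a moving range of span 2), then apply the usual 3-sigma limits.
mr = np.abs(np.diff(thickness))
sigma_hat = mr.mean() / 1.128
center = thickness.mean()
ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

out_of_control = np.flatnonzero((thickness > ucl) | (thickness < lcl))
print(f"center={center:.3f}, limits=({lcl:.3f}, {ucl:.3f}), "
      f"signals={out_of_control.size}")
```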
113

Recommendations for Measurement and Management of an Elite Athlete

Sands, William, Cardinale, Marco, McNeal, Jeni, Murray, Steven, Sole, Christopher, Reed, Jacob, Apostolopoulos, Nikos, Stone, Michael H. 07 May 2019 (has links)
Athletes who merit the title 'elite' are rare and differ both quantitatively and qualitatively from athletes of lower qualifications. Serving and studying elite athletes may demand non-traditional approaches. Research involving elite athletes suffers because of the typical nomothetic requirements for large sample sizes and other statistical assumptions that do not apply to this population. Idiographic research uses single-athlete study designs, trend analyses, and statistical process control. Single-athlete designs seek to measure differences in repeated measurements under prescribed conditions, and trend analyses may permit systematic monitoring and prediction of future outcomes. Statistical process control uses control charting and other methods from management systems to assess and modify training processes in near real-time. These methods bring assessment and process control into the real world of elite athletics.
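A minimal sketch of single-athlete control charting of the kind the abstract recommends, here as an EWMA chart over one athlete's repeated measurements; the metric (jump height), baseline window, smoothing constant, and simulated drift are all illustrative assumptions, not taken from the article.

```python
import numpy as np

rng = np.random.default_rng(2)
# Repeated measurements for one athlete (e.g. daily countermovement-jump
# height in cm); synthetic stand-in for real monitoring data.
jumps = rng.normal(loc=40.0, scale=1.5, size=60)
jumps[40:] -= 2.0  # simulated downward drift (fatigue, illness, ...)

lam = 0.2                                               # smoothing constant
mu0, sigma = jumps[:20].mean(), jumps[:20].std(ddof=1)  # baseline phase

z, signals = mu0, []
for i, x in enumerate(jumps):
    z = lam * x + (1 - lam) * z
    # Exact time-varying EWMA control limits.
    se = sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * (i + 1))))
    if abs(z - mu0) > 3 * se:
        signals.append(i)
print("first signal at observation:", signals[0] if signals else None)
```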
114

Gray-box Modeling for Stable and Efficient Operation of Steel Making Process / 鉄鋼製造プロセスの安定・効率的な操業のためのグレイボックスモデリング

Ahmad, Iftikhar 24 March 2014 (has links)
Kyoto University / 0048 / New-system doctoral course / Doctor of Philosophy (Engineering) / Kō No. 18310 / Engineering Doctorate No. 3902 / New system||Eng||1598 (University Library) / 31168 / Department of Chemical Engineering, Graduate School of Engineering, Kyoto University / (Chief examiner) Prof. Shinji Hasebe, Prof. Masahiro Ohshima, Prof. Motoaki Kawase / Qualifies under Article 4, Paragraph 1 of the Degree Regulations / Doctor of Philosophy (Engineering) / Kyoto University / DFAM
115

Optimalizace výrobního procesu / Production process optimization

Habásko, Jakub January 2009 (has links)
HABÁSKO Jakub: Production process optimization. Master's thesis, second degree, academic year 2008/2009, FME Brno University of Technology, Institute of Metrology and Quality Assurance Testing, October 2009, 68 pages, 40 pictures, 2 tables, 4 supplements. The project was elaborated within the Master's degree programme, study branch Metrology and Quality Assurance Testing. This thesis deals with the optimization of a process plan. Based on the assignment, an analysis of the current condition was carried out using statistical methods, and suitable statistical methods applicable in the production organization were identified. The thesis then describes the problem of brazing bells, since most of the defects originate at this stage. Finally, it presents recommendations and findings that can help optimize this process.
116

Process Monitoring with Multivariate Data: Varying Sample Sizes and Linear Profiles

Kim, Keunpyo 01 December 2003 (has links)
Multivariate control charts are used to monitor a process when more than one quality variable associated with the process is being observed. The multivariate exponentially weighted moving average (MEWMA) control chart is one of the most commonly recommended tools for multivariate process monitoring. The standard practice when using the MEWMA control chart is to take samples of fixed size at regular sampling intervals for each variable. In the first part of this dissertation, MEWMA control charts based on sequential sampling schemes with two possible stages are investigated. When sequential sampling with two possible stages is used, observations at a sampling point are taken in two groups, and the number of groups actually taken is a random variable that depends on the data. The basic idea is that sampling starts with a small initial group of observations; no additional sampling is done at this point if there is no indication of a problem with the process, but if there is some indication of a problem, an additional group of observations is taken at this sampling point. The performance of the sequential sampling (SS) MEWMA control chart is compared to that of standard control charts. It is shown that the SS MEWMA chart is substantially more efficient in detecting changes in the process mean vector than standard control charts that do not use sequential sampling. The situation is also considered where different variables may have different measurement costs. MEWMA control charts with unequal sample sizes based on differing measurement costs are investigated in order to improve the performance of process monitoring. Sequential sampling plans are applied to MEWMA control charts with unequal sample sizes and compared to standard MEWMA control charts with a fixed sample size. The steady-state average time to signal (SSATS) is computed using simulation and compared for selected sets of sample sizes. When different variables have significantly different measurement costs, using unequal sample sizes can be more cost effective than using the same fixed sample size for each variable.

In the second part of this dissertation, control chart methods are proposed for process monitoring when the quality of a process or product is characterized by a linear function. For the historical analysis of Phase I data, methods including the use of a bivariate T² chart to check for stability of the regression coefficients, in conjunction with a univariate Shewhart chart to check for stability of the variation about the regression line, are recommended. The use of three univariate control charts in Phase II is recommended; these charts monitor the Y-intercept, the slope, and the variance of the deviations about the regression line, respectively. A simulation study shows that this type of Phase II method can detect sustained shifts in the parameters better than competing methods in terms of average run length (ARL) performance. The monitoring of linear profiles is also related to the control charting of regression-adjusted variables and other methods. / Ph. D.
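A minimal sketch of the standard MEWMA recursion and statistic underlying the charts studied here, assuming a known in-control mean and covariance; the dimension, smoothing constant, shift, and the chi-square placeholder control limit are illustrative (in practice the limit h is chosen by simulation to achieve a target in-control ARL).

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(3)
p, lam = 3, 0.1                      # number of variables, smoothing constant
mu0 = np.zeros(p)                    # in-control mean vector (assumed known)
Sigma = np.eye(p)                    # in-control covariance (assumed known)

# MEWMA recursion: z_i = lam*(x_i - mu0) + (1 - lam)*z_{i-1};
# chart T2_i = z_i' Sigma_z^{-1} z_i against a control limit h.
Sigma_z_inv = np.linalg.inv(lam / (2 - lam) * Sigma)   # asymptotic Sigma_z
h = chi2.ppf(0.995, df=p)            # placeholder; calibrate by simulation

z = np.zeros(p)
for i in range(200):
    x = rng.multivariate_normal(mu0, Sigma)
    if i >= 100:
        x += np.array([1.0, 0.0, 0.0])   # simulated shift in the mean vector
    z = lam * (x - mu0) + (1 - lam) * z
    t2 = z @ Sigma_z_inv @ z
    if t2 > h:
        print("signal at observation", i)
        break
```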
117

Efficient Sampling Plans for Control Charts When Monitoring an Autocorrelated Process

Zhong, Xin 15 March 2006 (has links)
This dissertation investigates the effects of autocorrelation on the performance of various sampling plans for control charts in detecting special causes that may produce sustained or transient shifts in the process mean and/or variance. Observations from the process are modeled as a first-order autoregressive process plus a random error. Combinations of two Shewhart control charts and combinations of two exponentially weighted moving average (EWMA) control charts, based on both the original observations and on the process residuals, are considered. Three types of sampling plans are investigated: samples of n = 1, samples of n > 1 observations taken together at one sampling point, or samples of n > 1 observations taken at different times. In comparing these sampling plans it is assumed that the sampling rate in terms of the number of observations per unit time is fixed, so taking samples of n = 1 allows more frequent plotting. The best overall performance in detecting both sustained and transient shifts in the process is obtained by taking samples of n = 1 and using an EWMA chart combination with an observations chart for the mean and a residuals chart for the variance. The Shewhart chart combination with the best overall performance, though inferior to the EWMA chart combination, is based on samples of n > 1 taken at different times, with an observations chart for the mean and a residuals chart for the variance. / Ph. D.
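A hedged sketch of the residuals-chart idea used here: fit an AR(1) model and chart the one-step-ahead residuals, which are approximately uncorrelated when the model fits. The simulated process omits the dissertation's additional random-error term for brevity, and the autoregressive coefficient and limits are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)
phi, n = 0.7, 300
# Simulate a first-order autoregressive process.
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal()

# Estimate phi by least squares, then chart the one-step-ahead residuals.
phi_hat = (x[1:] @ x[:-1]) / (x[:-1] @ x[:-1])
resid = x[1:] - phi_hat * x[:-1]

sigma_r = resid[:100].std(ddof=1)          # baseline estimate of residual sd
signals = np.flatnonzero(np.abs(resid) > 3 * sigma_r)
print(f"phi_hat={phi_hat:.2f}, residual signals={signals.size}")
```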
118

GLR Control Charts for Monitoring a Proportion

Huang, Wandi 19 December 2011 (has links)
Generalized likelihood ratio (GLR) control charts are studied for monitoring a process proportion of defective or nonconforming items. The type of process change considered is an abrupt sustained increase in the process proportion, which implies deterioration of process quality. The objective is to effectively detect a wide range of shift sizes. For the first part of this research, we assume samples are collected using rational subgrouping with sample size n > 1, and the binomial GLR statistic is constructed based on a moving window of past sample statistics that follow a binomial distribution. Steady-state performance is evaluated for the binomial GLR chart and the other widely used binomial charts. We find that in terms of overall performance, the binomial GLR chart is at least as good as the other charts. In addition, since it has only two charting parameters, both of which can be easily obtained with the approach we propose, less effort is required to design the binomial GLR chart for practical applications.

The second part of this research develops a Bernoulli GLR chart to monitor processes under continuous inspection, in which case samples of size n = 1 are observed. A constant upper bound is imposed on the estimate of the process shift, preventing the corresponding Bernoulli GLR statistic from being undefined. Performance comparisons between the Bernoulli GLR chart and the other charts show that the Bernoulli GLR chart has better overall performance than its competitors, especially for detecting small shifts. / Ph. D.
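A minimal sketch of a one-sided, moving-window binomial GLR statistic of the general form described above, assuming a known in-control proportion; the window length, the cap on the estimated proportion, and the simulated shift are illustrative assumptions, and the control limit would be calibrated by simulation rather than the ad hoc inspection shown here.

```python
import numpy as np
from scipy.stats import binom

def binomial_glr(counts, n, p0, window=50):
    """One-sided GLR statistic for an increase in a binomial proportion.

    counts : defect counts, one per sample of size n
    p0     : in-control proportion (assumed known)
    """
    t = len(counts)
    glr = 0.0
    for tau in range(max(0, t - window), t):        # candidate change points
        seg = np.asarray(counts[tau:], dtype=float)
        # Restricted MLE of the post-change proportion: at least p0, capped
        # (mirroring the upper bound the dissertation imposes for n = 1).
        p_hat = min(max(seg.sum() / (n * len(seg)), p0), 0.999)
        llr = (binom.logpmf(seg, n, p_hat) - binom.logpmf(seg, n, p0)).sum()
        glr = max(glr, llr)
    return glr

rng = np.random.default_rng(5)
p0, n = 0.01, 100
counts = list(rng.binomial(n, p0, size=80)) + list(rng.binomial(n, 0.03, size=20))
glr_path = [binomial_glr(counts[: i + 1], n, p0) for i in range(len(counts))]
print("max GLR statistic:", round(max(glr_path), 2))
```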
119

A strategy for the synthesis of real-time statistical process control within the framework of a knowledge based controller

Crowe, Edward R. January 1995 (has links)
No description available.
120

Surveillance of Negative Binomial and Bernoulli Processes

Szarka, John Louis III 03 May 2011 (has links)
Discrete processes are evaluated in industrial and healthcare settings. Count data may be used to measure the number of defective items in industrial applications or the incidence of a certain disease at a health facility. Another classification of a discrete random variable is binary data, where an item can be classified as conforming or nonconforming in a manufacturing context, or a patient's status as having a disease in health-related applications.

The first phase of this research uses discrete count data modeled from the Poisson and negative binomial distributions in a healthcare setting. Syndromic counts are currently monitored by the BioSense program within the Centers for Disease Control and Prevention (CDC) to provide real-time biosurveillance. The Early Aberration Reporting System (EARS) compares recent baseline information with a current day's syndromic count to determine whether outbreaks may be present. An adaptive threshold method is proposed based on fitting baseline data to a parametric distribution and then calculating an upper-tailed p-value. These statistics are then converted to approximately standard normal random variables. Monitoring is examined for independent and identically distributed data as well as data following several seasonal patterns. An exponentially weighted moving average (EWMA) chart is also used with these methods. The effectiveness of these methods in detecting simulated outbreaks is evaluated in several sensitivity analyses.

The second phase of research explored in this dissertation considers information that can be classified as a binary event. In industry, it is desirable for the probability of a nonconforming item, p, to be extremely small. Traditional Shewhart charts, such as the p-chart, are not reliable for monitoring this type of process. A comprehensive literature review of control chart procedures for this type of process is given. The equivalence between two cumulative sum (CUSUM) charts, based on geometric and Bernoulli random variables, is explored. An evaluation of the unit and group-runs (UGR) chart is performed; it is shown that the in-control behavior of this chart is quite misleading, and it should not be recommended to practitioners. / Ph. D.
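A hedged sketch of the adaptive threshold idea for syndromic counts: fit the baseline to a parametric distribution (Poisson here; the dissertation also considers the negative binomial), compute an upper-tailed p-value for today's count, and map it to an approximately standard normal statistic that could then feed an EWMA chart. The function name, baseline length, and continuity correction are illustrative choices, not the dissertation's exact procedure.

```python
import numpy as np
from scipy.stats import poisson, norm

def adaptive_threshold_z(baseline, today):
    """Convert a syndromic count to an approximately standard normal statistic."""
    lam = np.mean(baseline)                      # Poisson MLE from the baseline
    # Upper-tailed P(X >= today), with a mid-p style continuity correction.
    p_value = poisson.sf(today - 1, lam) - 0.5 * poisson.pmf(today, lam)
    return norm.ppf(1 - p_value)

rng = np.random.default_rng(6)
baseline = rng.poisson(20, size=28)              # four weeks of daily counts
print(round(adaptive_threshold_z(baseline, today=32), 2))
```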
