741

Mejoras a la implementación del gráfico de control CEV para procesos con observaciones censuradas. Aportaciones, mediciones de propiedades y potencia de predicción / Improvements to the implementation of the CEV control chart for processes with censored observations: contributions, measurement of properties and predictive power

Neira Rueda, Javier Orlando 17 March 2024 (has links)
The process of parameter estimation to characterise a population using algorithms is under constant development and refinement. Recent years have shown that data-based decision-making is complex when there is uncertainty generated by statistical censoring. This thesis evaluates the effect of statistical censoring on a normally distributed random variable common to many processes. The estimation properties of the parameters are then characterised with the maximum likelihood algorithm known as the conditional expected value (CEV), using different censoring percentages and sample sizes. The process of implementing the control chart to monitor such random variables is then systematised and characterised, with improvement actions proposed and observations made along the way. Finally, this thesis highlights the current importance of making decisions based on data estimation algorithms in the presence of some kind of statistical censoring, which in turn is interpreted as a loss of information. / Neira Rueda, JO. (2024). Mejoras a la implementación del gráfico de control CEV para procesos con observaciones censuradas. Aportaciones, mediciones de propiedades y potencia de predicción [Tesis doctoral]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/203154
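The CEV idea can be made concrete with a short sketch. Assuming right-censoring of a normal variable at a known limit c, censored readings are replaced by their conditional expectation E[X | X > c] and the parameters are re-estimated iteratively; the function name, starting values and iteration scheme below are illustrative assumptions, not the thesis's exact algorithm.

```python
# A minimal sketch of conditional-expected-value (CEV) estimation for
# right-censored normal data: censored readings are replaced by
# E[X | X > c] under the current estimates, then the estimates are
# refreshed. Illustrative only -- not the author's exact procedure.
import numpy as np
from scipy.stats import norm

def cev_estimate(x, censored, n_iter=50):
    """x: observed values (censored entries hold the limit c);
    censored: boolean mask marking right-censored entries."""
    mu, sigma = x.mean(), x.std(ddof=1)  # start from naive estimates
    for _ in range(n_iter):
        z = (x[censored] - mu) / sigma
        # E[X | X > c] for a normal: mu + sigma * phi(z) / (1 - Phi(z))
        ev = mu + sigma * norm.pdf(z) / norm.sf(z)
        pseudo = x.copy()
        pseudo[censored] = ev
        mu, sigma = pseudo.mean(), pseudo.std(ddof=1)
    return mu, sigma

rng = np.random.default_rng(1)
sample = rng.normal(10.0, 2.0, size=200)
c = 12.0                                  # assumed censoring limit
mask = sample > c
sample = np.where(mask, c, sample)        # right-censor at c
print(cev_estimate(sample, mask))         # estimates near (10, 2)
```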
742

Critical success factors for the implementation of an operational risk management system for South African financial services organisations

Gibson, Michael David 02 1900 (has links)
Operational risk has become an increasingly important topic within financial institutions in recent years, resulting in increased spending by financial services organisations on operational risk management solutions. While this move is positive, evidence has shown that information technology implementations have tended to have low rates of success. Research highlighted that a series of defined critical success factors could reduce the risk of implementation failure. Investigation of the literature revealed that no critical success factors had been defined for the implementation of an operational risk management system. Through a literature study, a list of 29 critical success factors was identified. To confirm these factors, a questionnaire was developed and distributed to an identified target audience within the South African financial services community. Responses to the questionnaire revealed that 27 of the 29 critical success factors were deemed important and critical to the implementation of an operational risk management system. / Business Management / M. Com. (Business Management)
743

Multiscale process monitoring with singular spectrum analysis

Krishnannair, Syamala 12 1900 (has links)
Thesis (MScEng (Process Engineering))--University of Stellenbosch, 2010. / Thesis presented in partial fulfilment of the requirements for the degree of Master of Science in Engineering (Extractive Metallurgy) in the Department of Process Engineering at the University of Stellenbosch / Multivariate statistical process control (MSPC) approaches are now widely used for performance monitoring, fault detection and diagnosis in chemical processes. Conventional MSPC approaches are based on latent variable projection methods such as principal component analysis and partial least squares. These methods are suitable for handling linearly correlated data sets with minimal autocorrelation in the variables. Industrial plant data invariably violate these conditions, and several extensions to conventional MSPC methodologies have been proposed to account for these limitations. In practical situations process data usually contain contributions at multiple scales, because different events occur at different localisations in time and frequency. To account for this multiscale nature, monitoring techniques that decompose observed data at different scales are necessary; the use of standard MSPC methodologies may otherwise lead to unreliable results due to false alarms and significant loss of information. In this thesis a multiscale methodology based on singular spectrum analysis is proposed. Singular spectrum analysis (SSA) is a linear method that extracts information from short and noisy time series by decomposing the data into deterministic and stochastic components without prior knowledge of the dynamics affecting the time series. These components can be classified as independent additive time series: a slowly varying trend, periodic series and aperiodic noise. SSA performs this decomposition by projecting the original time series onto a data-adaptive vector basis obtained from the series itself, based on principal component analysis (PCA). The proposed method treats each process variable as a time series, and the autocorrelation between the variables is explicitly accounted for. The data-adaptive nature of SSA makes the proposed method more flexible than other spectral techniques that use fixed basis functions. Application of the proposed technique is demonstrated on simulated data, industrial data and the Tennessee Eastman challenge process, and a comparative analysis is given for the simulated and Tennessee Eastman processes. In most cases the proposed method is superior in accurately detecting process changes and faults of different magnitudes, compared with classical statistical process control (SPC) based on latent variable methods as well as wavelet-based multiscale SPC.
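The core SSA decomposition step lends itself to a compact sketch. Assuming a one-dimensional series, a chosen window length and reconstruction by diagonal averaging; the window length and the trend/periodic/noise grouping below are illustrative assumptions, not the thesis's implementation.

```python
# A minimal SSA sketch: embed a series into a trajectory (Hankel)
# matrix, factor it by SVD, and reconstruct additive components by
# diagonal averaging. Window length and grouping are assumptions.
import numpy as np

def ssa_components(series, window):
    n = len(series)
    k = n - window + 1
    # Trajectory matrix: lagged copies of the series as columns
    traj = np.column_stack([series[i:i + window] for i in range(k)])
    u, s, vt = np.linalg.svd(traj, full_matrices=False)
    comps = []
    for i in range(len(s)):
        elem = s[i] * np.outer(u[:, i], vt[i])   # rank-one piece
        # Diagonal averaging maps the matrix back to a time series
        comp = np.array([np.mean(elem[::-1].diagonal(j - window + 1))
                         for j in range(n)])
        comps.append(comp)
    return comps  # comps[0] ~ trend; later ones ~ periodic/noise

t = np.linspace(0, 10, 300)
x = 0.5 * t + np.sin(2 * np.pi * t) \
    + 0.3 * np.random.default_rng(0).normal(size=300)
parts = ssa_components(x, window=40)
print(np.allclose(sum(parts), x))  # components add back to the series
```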
744

Diagnostic monitoring of dynamic systems using artificial immune systems

Maree, Charl 12 1900 (has links)
Thesis (MScEng (Process Engineering))--University of Stellenbosch, 2006. / The natural immune system is an exceptional pattern recognition system, based on memory and learning, that is capable of detecting both known and unknown pathogens. Artificial immune systems (AIS) employ some of the functionalities of the natural immune system in detecting change in dynamic process systems. The emerging field of artificial immune systems has enormous potential in the application of fault detection systems in process engineering. This thesis aims, first, to familiarise the reader with the various current methods in the field of fault detection and identification. Second, the notion of artificial immune systems is introduced and explained. Finally, the thesis investigates the performance of AIS on data gathered from simulated case studies, both with and without noise. Three different methods of generating detectors are used to monitor various processes for anomalous events: (1) random generation of detectors, (2) convex hulls, and (3) the hypercube vertex approach. It is found that random generation provides a reasonable rate of detection, while convex hulls fail to achieve the required objectives. The hypercube vertex method achieved the highest detection rate and lowest false alarm rate in all case studies. The hypercube vertex method originates from this project and is the recommended method for real-valued systems, at least those with a small number of variables. In some cases AIS are capable of perfect classification, where 100% of anomalous events are identified and no false alarms are generated. Noise has, as expected, some effect on the detection capability in all case studies. The computational cost of the various methods is compared, showing that the hypercube vertex method has a higher cost than the other methods researched. This increased cost remains within reasonable confines, however, so the hypercube vertex method is nonetheless the chosen method. The thesis concludes by considering the performance of AIS against the comparative criteria for diagnostic methods. It is found that AIS compare well to current methods, and that some of the limitations of those methods are solved and their abilities surpassed in certain cases. Recommendations are made for future study in the field of AIS, and the use of the hypercube vertex method is highly recommended in real-valued scenarios such as process engineering.
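The random-generation flavour of detector construction can be sketched briefly. The hypercube vertex method is the thesis's own contribution and is not reproduced here; the sketch below shows only a generic negative-selection scheme, with the detector count, radius and matching rule as illustrative assumptions.

```python
# A toy negative-selection sketch in the spirit of randomly generated
# AIS detectors: candidates that overlap "self" (normal operation)
# data are discarded; surviving detectors flag anomalous points.
import numpy as np

rng = np.random.default_rng(42)
self_data = rng.normal(0.5, 0.05, size=(500, 2))  # normal data, scaled to [0, 1]
radius = 0.1                                       # assumed matching radius

detectors = []
while len(detectors) < 200:
    cand = rng.uniform(0, 1, size=2)
    # Keep the candidate only if it lies outside the self region
    if np.min(np.linalg.norm(self_data - cand, axis=1)) > radius:
        detectors.append(cand)
detectors = np.array(detectors)

def is_anomalous(x):
    """A point is flagged if any detector covers it."""
    return bool(np.min(np.linalg.norm(detectors - x, axis=1)) < radius)

print(is_anomalous(np.array([0.5, 0.5])))  # False: normal operating point
print(is_anomalous(np.array([0.9, 0.1])))  # True: far from the self region
```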
745

Optimering av den kemiska reningen vid Fläskebo deponi / Optimization of the Chemical Treatment at Fläskebo Landfill

Nilsson, Anna January 2006 (has links)
Landfill leachate contains a variety of contaminants and is created when rainwater percolates through the landfill. For landfill management, the leachate is the main issue that can cause problems for the environment. At the Fläskebo landfill, Renova AB treats the leachate in a local treatment plant. The treatment consists of a chemical step with chemical precipitation, flocculation, sedimentation and filtration, and a final step with a carbon and peat filter. Renova has to ensure that the condition of the leachate meets the regulation set for the landfill before it is released to the recipient. This regulation has not yet been established; a final proposal will be submitted to the county administrative board in spring 2006.

In this master thesis the chemical treatment at Fläskebo is optimised. A comparison between the control programme and the regulation was made to estimate the contamination of the leachate, and the effectiveness of the two treatment steps was evaluated. For optimisation, the leachate was first tested in the laboratory with different coagulants and flocculants. The purpose was to increase the precipitation of particles and the metals arsenic, cadmium, chromium, mercury, lead, copper, nickel and zinc through sweep-floc coagulation and hydroxide precipitation. After the laboratory tests, the precipitation was tested in the treatment plant with a higher pH and coagulant dose, and the process control for sodium hydroxide dosing was examined.

The leachate had a small content of organic matter and nutrients, but a large content of halogenated substances (AOX) and the heavy metals nickel and copper. High concentrations of contaminants were reduced better than low concentrations in the two treatment steps, and the carbon and peat filter material caused an increase of the arsenic content in the leachate after filtration through internal leakage. The laboratory results showed the best reduction of metals with the iron chloride PlusJÄRN and the anionic polyacrylamide Fennopol A. Because the chloride content of the leachate was already high, the iron sulphate PurFect was chosen for the further tests. The optimal pH for the heavy metals arsenic, zinc, copper and nickel was between pH 9 and 9.5. The precipitation in the treatment plant showed better results with sodium hydroxide and a higher pH (pH 9) in the flocculation basin, whereas increasing the coagulant PurFect from 202 mg/l to 225 mg/l meant only a higher chemical cost. The process control of sodium hydroxide showed oscillating and nearly unstable control performance, which may lead to a higher consumption of chemicals; the controller should therefore be reviewed to reduce the risk of increased chemical consumption and cost.
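The step-by-step evaluation of treatment effectiveness reduces to a simple removal-efficiency calculation, sketched below. The concentrations are made-up values for illustration, not measurements from the Fläskebo plant.

```python
# Percentage removal of a metal across the chemical step and the
# carbon/peat filter; a negative result would indicate internal
# release, as observed for arsenic in the filter material.
def removal_pct(c_in: float, c_out: float) -> float:
    """Removal efficiency in percent; negative means internal release."""
    return 100.0 * (c_in - c_out) / c_in

nickel = {"raw": 120.0, "after_chemical": 35.0, "after_filter": 20.0}  # ug/l, assumed
print(removal_pct(nickel["raw"], nickel["after_chemical"]))           # ~70.8% chemical step
print(removal_pct(nickel["after_chemical"], nickel["after_filter"]))  # ~42.9% filter step
print(removal_pct(nickel["raw"], nickel["after_filter"]))             # ~83.3% overall
```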
746

Converged IP-over-standard Ethernet process control networks for hydrocarbon process automation applications controllers

Almadi, Soloman Moses January 2011 (has links)
The maturity of the Internet Protocol (IP) and the emergence of standard Ethernet interfaces for Hydrocarbon Process Automation Application (HPAA) systems present a real opportunity to combine independent industrial applications onto an integrated IP-based network platform. Quality of Service (QoS) for IP over Ethernet has the strength to regulate the traffic mix and support timely delivery. The combination of these technologies provides a platform to support HPAA applications across Local Area Network (LAN) and Wide Area Network (WAN) networks. HPAA systems are composed of sensors, actuators and logic solvers networked together to form independent control system network platforms. They support hydrocarbon plants operating under critical conditions that, if not controlled, could become dangerous to people, assets and the environment. This demands high-speed networking, driven by the need to capture data at a higher frequency and a finer granularity. Nevertheless, existing HPAA network infrastructure is based on unique autonomous systems, which has resulted in multiple, parallel and separate networks with limited interconnectivity supporting different functions. This has increased the complexity of integrating various applications and raised the total cost of ownership over the technology life cycle. To date, the concept of consolidating HPAA onto a converged IP network over standard Ethernet had not been explored. This research aims to explore and develop HPAA Process Control Systems (PCS) in a Converged Internet Protocol (CIP) setting, using experimental and simulated network case studies. Results from the experimental and simulation work showed encouraging outcomes and provided a good argument for supporting the co-existence of HPAA and non-HPAA applications, taking into consideration timeliness and reliability requirements. This was achieved by invoking priority-based scheduling, with the highest priority awarded to PCS among other supported services such as voice, multimedia streams and other applications. HPAA can benefit from utilising CIP over Ethernet by reducing the number of interdependent HPAA PCS networks to a single uniform and standard network. In addition, this integrated infrastructure offers a platform for additional support services such as multimedia streaming, voice and data. This network-based model lends itself to integration with remote control system platform capabilities at the end user's desktop, independent of space and time, resulting in the concept of plant virtualisation.
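The priority-based scheduling that makes this co-existence work can be sketched with a toy strict-priority queue, in which process control traffic is always served before voice or bulk data. The class names, priority ordering and frame labels below are illustrative assumptions, not the thesis's QoS configuration.

```python
# A toy strict-priority scheduler: lower priority number is served
# first; a sequence counter keeps FIFO order within a traffic class.
import heapq

PRIORITY = {"pcs": 0, "voice": 1, "video": 2, "data": 3}  # 0 = highest

class Scheduler:
    def __init__(self):
        self._q = []
        self._seq = 0  # FIFO tie-break within a class

    def enqueue(self, traffic_class: str, frame: str) -> None:
        heapq.heappush(self._q, (PRIORITY[traffic_class], self._seq, frame))
        self._seq += 1

    def dequeue(self):
        return heapq.heappop(self._q)[2] if self._q else None

s = Scheduler()
s.enqueue("data", "backup-chunk-1")
s.enqueue("voice", "rtp-pkt-7")
s.enqueue("pcs", "setpoint-update")   # arrives last, served first
print(s.dequeue())  # setpoint-update
print(s.dequeue())  # rtp-pkt-7
print(s.dequeue())  # backup-chunk-1
```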
747

Near real-time detection and approximate location of pipe bursts and other events in water distribution systems

Romano, Michele January 2012 (has links)
The research work presented in this thesis describes the development and testing of a new data analysis methodology for the automated near real-time detection and approximate location of pipe bursts and other events which induce similar abnormal pressure/flow variations (e.g., unauthorised consumption, equipment failures, etc.) in Water Distribution Systems (WDSs). This methodology makes synergistic use of several self-learning Artificial Intelligence (AI) and statistical/geostatistical techniques for the analysis of the stream of data (i.e., signals) collected and communicated on-line by the hydraulic sensors deployed in a WDS. These techniques include: (i) wavelets for de-noising the recorded pressure/flow signals, (ii) Artificial Neural Networks (ANNs) for short-term forecasting of future pressure/flow signal values, (iii) Evolutionary Algorithms (EAs) for selecting optimal ANN input structures and parameter sets, (iv) Statistical Process Control (SPC) techniques for short- and long-term analysis of the burst/other event-induced pressure/flow variations, (v) Bayesian Inference Systems (BISs) for inferring the probability of a burst/other event occurring and raising the detection alarms, and (vi) geostatistical techniques for determining the approximate location of a detected burst/other event. The results of applying the new methodology to pressure/flow data from several District Metered Areas (DMAs) in the United Kingdom (UK) with real-life bursts/other events and simulated (i.e., engineered) burst events are also reported in this thesis. The results show that the methodology detected the aforementioned events in a fast and reliable manner and successfully determined their approximate location within a DMA. They additionally show the potential of the methodology to yield substantial improvements to the state of the art in near real-time WDS incident management by enabling water companies to save water, energy and money, achieve higher levels of operational efficiency, and improve their customer service. The data analysis methodology developed and tested as part of this research has been patented (International Application Number: PCT/GB2010/000961).
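The forecast-residual-to-alarm chain can be compressed into a short sketch. The naive forecaster, EWMA smoothing, likelihood ratios and thresholds below are illustrative stand-ins for the ANN, SPC and Bayesian components of the patented methodology, not its actual implementation.

```python
# Compare each flow reading with a slowly adapting forecast, track the
# residual with an EWMA-style statistic, and turn persistent
# deviations into a burst probability. All tuning values are assumed.
import numpy as np

def detect(flows, lam=0.2, k=3.0):
    baseline = flows[0]                 # stand-in for the ANN forecast
    ewma, p_burst, resid_var = 0.0, 0.05, 1.0
    alarms = []
    for t in range(1, len(flows)):
        r = flows[t] - baseline         # forecast residual
        ewma = lam * r + (1 - lam) * ewma          # SPC-style smoothing
        sigma = np.sqrt(resid_var * lam / (2 - lam))
        evidence = abs(ewma) > k * sigma           # control-limit check
        # Crude Bayesian-style update of the burst probability
        lr = 9.0 if evidence else 0.5              # assumed likelihood ratio
        odds = lr * p_burst / (1 - p_burst)
        p_burst = odds / (1 + odds)
        alarms.append(p_burst > 0.9)
        if not evidence:                # adapt only in normal conditions
            baseline = 0.99 * baseline + 0.01 * flows[t]
            resid_var = 0.99 * resid_var + 0.01 * r * r
    return alarms

flow = np.concatenate([np.random.default_rng(3).normal(50, 1, 200),
                       np.full(100, 58.0)])        # step change ~ burst
print(any(detect(flow)[200:]))  # True: alarm raised after the step
```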
748

The Fixed v. Variable Sampling Interval Shewhart X-Bar Control Chart in the Presence of Positively Autocorrelated Data

Harvey, Martha M. (Martha Mattern) 05 1900 (has links)
This study uses simulation to examine differences between fixed sampling interval (FSI) and variable sampling interval (VSI) Shewhart X-bar control charts for processes that produce positively autocorrelated data. The influence of sample size (1 and 5), the autocorrelation parameter, the shift in the process mean, and the length of time between samples is investigated by comparing the average time to signal (ATS) and average number of samples to signal (ANSS) for FSI and VSI Shewhart X-bar charts. These comparisons are conducted in two ways: with control chart limits pre-set at ±3σ/√n, and with limits computed from the sampling process. Proper interpretation of the Shewhart X-bar chart requires the assumption that observations are statistically independent; however, process data are often autocorrelated over time. Results of this study indicate that increasing the time between samples decreases the effect of positive autocorrelation between samples; thus, with sufficient time between samples, the assumption of independence is essentially not violated. Samples of size 5 produce a faster signal than samples of size 1 with both the FSI and VSI Shewhart X-bar charts when positive autocorrelation is present, but require the same time when the data are independent, indicating that this effect is a result of autocorrelation. This research determined that the VSI Shewhart X-bar chart signals increasingly faster than the corresponding FSI chart as the shift in the process mean increases. If the process is likely to exhibit a large shift in the mean, then the VSI technique is recommended; the faster signalling time of the VSI chart is, however, undesirable when the process is operating on target. If the control limits are estimated from process samples, results show that when the process is in control the ARL for the FSI chart and the ANSS for the VSI chart are approximately the same, and exceed the expected value obtained when the limits are fixed.
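A minimal version of this simulation comparison can be sketched as follows. The AR(1) model, the two VSI interval lengths, the warning limit and the shift size are assumptions chosen for illustration, not the study's design points.

```python
# Time to signal for an X-bar chart on AR(1) data: a fixed sampling
# interval versus a two-interval VSI rule that samples again quickly
# after a warning-zone point. All tuning values are assumed.
import numpy as np

def time_to_signal(phi=0.5, n=5, shift=1.0, vsi=False, rng=None):
    rng = rng or np.random.default_rng()
    limit, warn = 3 / np.sqrt(n), 1 / np.sqrt(n)   # sigma_xbar units
    long_i, short_i = 1.5, 0.1                     # assumed VSI intervals
    x, t = 0.0, 0.0
    while True:
        obs = []
        for _ in range(n):                         # AR(1) observations
            x = phi * x + rng.normal()
            obs.append(x + shift)                  # shifted process mean
        xbar = np.mean(obs)
        if abs(xbar) > limit:
            return t                               # out-of-control signal
        t += (short_i if (vsi and abs(xbar) > warn) else
              (long_i if vsi else 1.0))

rng = np.random.default_rng(7)
ats_fsi = np.mean([time_to_signal(vsi=False, rng=rng) for _ in range(2000)])
ats_vsi = np.mean([time_to_signal(vsi=True, rng=rng) for _ in range(2000)])
print(ats_fsi, ats_vsi)   # VSI typically signals sooner for this shift
```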
749

Whether using encryption in SCADA systems, the services performance requirements are still met in OT IT environment over an MPLS core network?

Chego, Lloyd January 2016 (has links)
A research project abstract submitted in fulfilment of the requirements for Master of Science in Engineering [Electrical]: Telecommunications at the University of the Witwatersrand, Johannesburg, 07 June 2016 / Utilities use Supervisory Control and Data Acquisition (SCADA) systems as their industrial control systems. In the past, the architecture of these systems was based on isolation from other networks. With the ever-changing requirements placed on these systems, there is now a need to converge with information technology systems, and having these industrial networks communicate over packet-switched networks raises cyber security concerns. This research project examines whether using encryption in an IP/MPLS core network for SCADA in an OT/IT environment affects the services' performance requirements. This was investigated through an experimental simulation, with the results recorded, alongside a study of the key literature. The key research question of this MSc 50/50 mini-thesis is whether, when encryption is used in SCADA systems, the services' performance requirements are still met in an OT/IT environment over an MPLS core network. The project's scope covers only the encryption portion of the whole cyber security value chain versus SCADA service performance; a study of the full value chain would require a full MSc thesis. The primary objective is therefore to research and demonstrate that encryption is essential for secure SCADA communication over an MPLS/IP core network. Encryption forms an essential part of the cyber security value chain, which has to achieve the following objectives. Confidentiality: ensuring that information is disclosed only to authorised parties. Integrity: ensuring that the information has not been altered in any way. Availability: ensuring that the system is not compromised and remains available. These objectives should be met without violating the SCADA service performance requirements, which is what the research project set out to verify. / M T 2016
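The kind of measurement implied here, the per-message latency cost of encryption against a SCADA latency budget, can be sketched briefly. The payload size and iteration count are assumptions; real SCADA traffic profiles and MPLS transport are not modelled.

```python
# Timing the added latency of AES-GCM encryption on SCADA-sized
# payloads, as a rough proxy for the encryption overhead question.
import os
import time
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=128)
aead = AESGCM(key)
payload = os.urandom(256)         # assumed small telemetry frame size

n = 10_000
start = time.perf_counter()
for _ in range(n):
    nonce = os.urandom(12)        # fresh 96-bit nonce per message
    aead.encrypt(nonce, payload, None)
elapsed = time.perf_counter() - start
print(f"mean encrypt latency: {elapsed / n * 1e6:.1f} us per frame")
# Compare this per-frame overhead with the SCADA latency budget
# (often on the order of tens of milliseconds for telemetry).
```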
750

Indicadores críticos da manufatura de pisos de madeira maciça / Critical indicators applied to manufacture of solid wood flooring

Soares, Philipe Ricardo Casemiro 27 January 2010 (has links)
The implementation of quality programmes on the production line has been used by companies to guarantee competitiveness in an ever more competitive market. The search for a balance between quality and costs is a constant in industry and, in this context, improvements to the production process are fundamental. In the forest sector, responsible for 3.5% of the country's GDP in 2007, quality management applied to the production line is recent and many important aspects are not considered. This study aimed to identify and evaluate the critical points of the production process of a wood-processing company. The research was divided into three stages. The first was mapping the company's process and elaborating flowcharts for the activities. The second was identifying and evaluating the critical points through interviews with employees, using the FMEA (Failure Mode and Effect Analysis) methodology and Ishikawa diagrams, which relate failures to their possible causes. The third was evaluating the production process, applying statistical process control at the main critical points and determining process capability using the Cpk index. The results showed the existence of six sub-processes, two of them critical. In those sectors 15 critical points were identified, and five were selected for evaluation. The control charts for variables indicated that the production process is unstable with respect to out-of-dimension pieces in both sectors, while the Cpk index showed that the company is not capable of producing wood flooring within specification. For the attributes, the process was stable, except for pieces marked by the sandpaper. The most significant cause of non-conformities was the calibration of the measurement equipment, which resulted in processes with low variation but with means far from the specifications.
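The capability conclusion rests on the Cpk calculation, sketched below: Cpk compares the distance from the process mean to the nearest specification limit against three standard deviations. The specification limits and sample values are illustrative, not the study's flooring data.

```python
# Cpk = min(USL - mean, mean - LSL) / (3 * sigma); values below 1
# indicate an incapable process, and a mean outside the limits gives
# a negative Cpk, as with the off-centre processes described above.
import numpy as np

def cpk(sample, lsl, usl):
    mu, sigma = np.mean(sample), np.std(sample, ddof=1)
    return min(usl - mu, mu - lsl) / (3 * sigma)

rng = np.random.default_rng(0)
widths = rng.normal(90.8, 0.15, size=100)   # board width in mm (assumed)
print(round(cpk(widths, lsl=89.5, usl=90.5), 2))  # < 1: process not capable
```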
