81

Losses in the mechanized harvest of green (unburned) sugarcane as a function of base-cutter blade wear

Reis, Gustavo Naves dos. January 2009 (has links)
Abstract: The mechanized harvest of sugarcane is relevant to the whole productive process of the crop, including the quality of the final product delivered for processing at the mill. The objective of this study was to evaluate the quality of the mechanized harvesting of green (unburned) sugarcane using statistical process control (SPC), and the capacity of the process to generate results within the specified limits by means of capability analysis, quantifying the losses and the damage caused to the ratoons by the basal cutting mechanism. The experiment was conducted on a Yellow Argisol (ARGISSOLO Amarelo) in the municipality of Arez, RN, Brazil, under two soil-management treatments within the conventional tillage system: AR+GM (moldboard plough followed by a medium offset disc harrow) and GP+GM (heavy offset disc harrow followed by a medium offset disc harrow), with losses determined by sampling. Losses of the chopped-billet, stump, and total types differed statistically between the treatments, with the highest losses quantified for the treatment using the moldboard plough followed by the medium harrow. The harvesting process is out of statistical control for the chopped-billet, whole-cane, cane-top, fixed-piece, loose-piece, sliver, and splinter loss types. The AR+GM treatment caused greater damage to the ratoons than the GP+GM treatment, and also presented the largest microrelief area, differing statistically from the treatment prepared with the heavy harrow followed by the medium harrow. / Advisor: Afonso Lopes / Co-advisor: Rouverson Pereira da Silva / Committee: Carlos Eduardo Angeli Furlani / Committee: Gilberto Hirotsugu Azevedo Koike / Committee: Marcilio Vieira Martins Filho / Committee: Marcos Milan / Doctorate
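The capability analysis mentioned in this abstract can be illustrated with a minimal sketch. The specification limits and loss data below are invented for the example and are not taken from the thesis; the sketch only shows how the Cp and Cpk indices reported by such an analysis are computed.

```python
import numpy as np

def capability_indices(samples, lsl, usl):
    """Estimate Cp and Cpk from individual measurements.

    Cp compares the specification width to the 6-sigma process spread;
    Cpk also penalizes a process mean that sits close to one limit.
    """
    mean = np.mean(samples)
    sigma = np.std(samples, ddof=1)          # sample standard deviation
    cp = (usl - lsl) / (6.0 * sigma)
    cpk = min(usl - mean, mean - lsl) / (3.0 * sigma)
    return cp, cpk

# Hypothetical harvest-loss measurements (t/ha) against made-up spec limits.
rng = np.random.default_rng(1)
losses = rng.normal(loc=1.8, scale=0.4, size=60)
cp, cpk = capability_indices(losses, lsl=0.0, usl=2.5)
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")     # values below 1 indicate an incapable process
```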
82

Manufacturing Process Design and Control Based on Error Equivalence Methodology

Chen, Shaoqiang 15 May 2008 (has links)
Error equivalence concerns the mechanism whereby different error sources result in identical deviation and variation patterns on part features. This has dual effects on process variation reduction: it significantly increases the complexity of root cause diagnosis in process control, but it also provides an opportunity to use one error source as the base error to compensate for the others. Substantial research has established the error equivalence methodology, including error equivalence modeling and an error-compensating-error strategy. However, no work has been done on developing an efficient process design approach by exploiting error equivalence. Furthermore, besides process mean shifts, process faults also manifest themselves as variation increases; in this regard, studying variation equivalence may help to improve root cause identification. This thesis presents engineering-driven approaches for process design and control that embed the error equivalence mechanism to achieve a better, more insightful understanding and control of manufacturing processes. The first issue studied is manufacturing process design and optimization based on error equivalence. Using the error prediction model that transforms different types of errors into the equivalent amount of one base error, the research derives a novel process tolerance stackup model that allows tolerance synthesis to be conducted. Design of computer experiments is introduced to assist the process design optimization. Secondly, diagnosis of multiple variation sources under error equivalence is conducted. This allows exploration of the possible equivalent variation patterns among multiple error sources and construction of a library of equivalent covariance matrices. Based on this library of equivalent variation patterns, the thesis presents an excitation-response path orientation approach to improve the identification of process variation sources under variation equivalence. The results show that the error equivalence mechanism can significantly reduce the design space and relieve a considerable symbolic computation load, thus improving process design. Moreover, by studying the variation equivalence mechanism, we can improve process diagnosis and root cause identification.
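As a rough illustration of the equivalence idea only: assuming toy linear error models in which each error source maps to feature deviations through its own sensitivity matrix (the matrices and values below are invented and are not the thesis's error prediction model), one source's effect can be mapped to an equivalent amount of a chosen base error by least squares.

```python
import numpy as np

# Toy linear error models: each error source u_i maps to feature
# deviations y through its own (made-up) sensitivity matrix A_i.
A_fixture = np.array([[1.0, 0.2],
                      [0.1, 0.9],
                      [0.3, 0.4]])   # chosen base error source (fixture)
A_datum = np.array([[0.8, 0.1],
                    [0.2, 1.1],
                    [0.5, 0.2]])     # second error source (datum)

u_datum = np.array([0.05, -0.02])    # an assumed datum error
y = A_datum @ u_datum                # resulting feature deviation pattern

# Equivalent base (fixture) error: the fixture error that reproduces
# the same deviation pattern, found by least squares.
u_equiv, *_ = np.linalg.lstsq(A_fixture, y, rcond=None)
print("deviation:", y)
print("equivalent base (fixture) error:", u_equiv)
print("reproduced deviation:", A_fixture @ u_equiv)
```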
83

Controlling High-Quality Manufacturing Processes: A Robustness Study of the Lower-Sided TBE EWMA Procedure

Pehlivan, Canan 01 September 2008 (has links) (PDF)
In quality control applications, Time-Between-Events (TBE) type observations may be monitored by using Exponentially Weighted Moving Average (EWMA) control charts. A widely accepted model for the TBE processes is the exponential distribution, and hence TBE EWMA charts are designed under this assumption. Nevertheless, practical applications do not always conform to the theory and it is common that the observations do not fit the exponential model. Therefore, control charts that are robust to departures from the assumed distribution are desirable in practice. In this thesis, robustness of the lower-sided TBE EWMA charts to the assumption of exponentially distributed observations has been investigated. Weibull and lognormal distributions are considered in order to represent the departures from the assumed exponential model and Markov Chain approach is utilized for evaluating the performance of the chart. By analyzing the performance results, design settings are suggested in order to achieve robust lower-sided TBE EWMA charts.
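A simulation sketch of the kind of chart studied here may help. The lower-sided EWMA below monitors time-between-events data; the smoothing constant, control-limit width, reflecting boundary at the target, and the use of plain simulation (rather than the Markov chain method used in the thesis) are all illustrative assumptions.

```python
import numpy as np
from math import gamma, sqrt

def lower_tbe_ewma_run_length(draw, theta0=1.0, lam=0.1, L=1.5, max_n=50_000):
    """Run length of a lower-sided EWMA on time-between-events data.

    draw(n) returns n observations.  The chart signals when the EWMA of the
    inter-event times drops below the lower limit, i.e. events arrive too fast.
    """
    lcl = theta0 - L * theta0 * sqrt(lam / (2.0 - lam))  # asymptotic lower limit
    z = theta0
    x = draw(max_n)
    for i in range(max_n):
        z = min(lam * x[i] + (1.0 - lam) * z, theta0)    # reflect at the target
        if z < lcl:
            return i + 1
    return max_n

rng = np.random.default_rng(0)
reps = 300
# In-control ARL under the assumed exponential model ...
arl_exp = np.mean([lower_tbe_ewma_run_length(lambda n: rng.exponential(1.0, n))
                   for _ in range(reps)])
# ... versus a Weibull departure with shape 0.7, rescaled to the same mean.
scale = 1.0 / gamma(1.0 + 1.0 / 0.7)
arl_weib = np.mean([lower_tbe_ewma_run_length(lambda n: scale * rng.weibull(0.7, n))
                    for _ in range(reps)])
print(f"simulated in-control ARL: exponential {arl_exp:.0f}, Weibull(0.7) {arl_weib:.0f}")
```

Comparing the two simulated in-control ARLs is the essence of the robustness question the thesis asks: a large gap means the chart's false-alarm behavior depends strongly on the exponential assumption.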
84

A Two-Sided CUSUM for First-Order Integer-Valued Autoregressive Processes of Poisson Counts

Yontay, Petek 01 July 2011 (has links) (PDF)
Count data are often encountered in manufacturing and service industries due to the ease of data collection. These counts can be useful in process monitoring to detect shifts of a process from an in-control state to various out-of-control states. It is usually assumed that the observations are independent and identically distributed. In practice, however, observations may be autocorrelated, and this may adversely affect the performance of control charts developed under the independence assumption. In this thesis, the cumulative sum (CUSUM) control chart for monitoring autocorrelated processes of counts is investigated. To describe the autocorrelation structure of the counts, a first-order Poisson integer-valued autoregressive model, Poisson INAR(1), is employed. Changes in the process mean in both the positive and negative directions are taken into account in designing the CUSUM chart. A trivariate Markov chain approach is utilized for evaluating the performance of the chart.
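A small simulation sketch of the monitored model and chart follows. The INAR(1) parameters, the reference values k, and the decision intervals h are illustrative guesses, and performance here is checked by simulation rather than by the trivariate Markov chain used in the thesis.

```python
import numpy as np

def simulate_poisson_inar1(n, alpha, lam, rng):
    """Poisson INAR(1): X_t = alpha o X_{t-1} + eps_t, with binomial thinning
    'o' and Poisson(lam) innovations; the marginal mean is lam / (1 - alpha)."""
    x = np.empty(n, dtype=int)
    x[0] = rng.poisson(lam / (1.0 - alpha))
    for t in range(1, n):
        x[t] = rng.binomial(x[t - 1], alpha) + rng.poisson(lam)
    return x

def two_sided_cusum(x, k_plus, k_minus, h_plus, h_minus):
    """Return the first time either CUSUM crosses its decision interval."""
    c_plus, c_minus = 0.0, 0.0
    for t, xt in enumerate(x, start=1):
        c_plus = max(0.0, c_plus + xt - k_plus)     # detects upward mean shifts
        c_minus = max(0.0, c_minus + k_minus - xt)  # detects downward mean shifts
        if c_plus > h_plus or c_minus > h_minus:
            return t
    return len(x)

rng = np.random.default_rng(42)
# Illustrative in-control process: autocorrelation 0.5, marginal mean 4.
in_control = simulate_poisson_inar1(500, alpha=0.5, lam=2.0, rng=rng)
# After observation 500 the innovation mean rises, lifting the marginal mean to 6.
shifted = simulate_poisson_inar1(500, alpha=0.5, lam=3.0, rng=rng)
x = np.concatenate([in_control, shifted])
t_signal = two_sided_cusum(x, k_plus=5.0, k_minus=3.0, h_plus=8.0, h_minus=8.0)
print("first signal at observation", t_signal, "(upward shift starts at observation 501)")
```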
85

Optimal filter design approaches to statistical process control for autocorrelated processes

Chin, Chang-Ho 01 November 2005 (has links)
Statistical Process Control (SPC), and in particular control charting, is widely used to achieve and maintain control of various processes in manufacturing. A control chart is a graphical display that plots quality characteristics versus the sample number or the time line. Interest in effective implementation of control charts for autocorrelated processes has increased in recent years. However, because of the complexities involved, few systematic design approaches have thus far been developed. Many control charting methods can be viewed as the charting of the output of a linear filter applied to the process data. In this dissertation, we generalize the concept of linear filters for control charts and propose new control charting schemes, the general linear filter (GLF) and the 2nd-order linear filter, based on the generalization. In addition, their optimal design methodologies are developed, where the filter parameters are optimally selected to minimize the out-of-control Average Run Length (ARL) while constraining the in-control ARL to some desired value. The optimal linear filters are compared with other methods in terms of ARL performance, and a number of their interesting characteristics are discussed for various types of mean shifts (step, spike, sinusoidal) and various ARMA process models (i.i.d., AR(1), ARMA(1,1)). Also, in this work, a new discretization approach for substantially reducing the computational time and memory use for the Markov chain method of calculating the ARL is proposed. Finally, a gradient-based optimization strategy for searching optimal linear filters is illustrated.
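The filter view of control charting can be sketched as follows. The AR(1) parameters, the truncated-EWMA weights, the 3.5-sigma limit, and the calibration by simulation are assumptions made for illustration; they are not the dissertation's optimal filter designs.

```python
import numpy as np

def ar1(n, phi, sigma, rng, shift_at=None, shift=0.0):
    """AR(1) data with an optional step mean shift injected at index shift_at."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal(0.0, sigma)
    if shift_at is not None:
        x[shift_at:] += shift
    return x

def filter_output(x, weights):
    """Causal output of a finite linear filter: y[t] = sum_k weights[k] * x[t-k]."""
    return np.convolve(x, weights, mode="full")[: len(x)]

rng = np.random.default_rng(7)
lam = 0.2
weights = lam * (1.0 - lam) ** np.arange(25)   # truncated EWMA impulse response

# Calibrate a control limit from an in-control reference run.
ref = filter_output(ar1(5000, phi=0.5, sigma=1.0, rng=rng), weights)
limit = 3.5 * np.std(ref)

# Monitor a new run with a step mean shift injected at t = 500.
x = ar1(1000, phi=0.5, sigma=1.0, rng=rng, shift_at=500, shift=1.5)
y = filter_output(x, weights)
signals = np.flatnonzero(np.abs(y) > limit)
first = int(signals[0]) if signals.size else None
print("first chart signal at index", first, "(step shift injected at index 500)")
```

Swapping in a different weight vector (a spike-matched or sinusoid-matched filter, say) is exactly the design freedom the dissertation optimizes over.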
86

A framework of statistical process control for software development

Shih, Tsung-Yo 03 August 2009 (has links)
With globalization, software companies around the world face not only competition in the domestic industry but also the challenge of large international companies. For this reason, domestic software companies must upgrade their software quality. Domestic government agencies and non-governmental organizations jointly promote Capability Maturity Model Integration (CMMI), hoping to improve the quality of the software development process through internationally recognized professional appraisal. Toward a high-maturity software development process, the development process should be estimated quantitatively at CMMI Level 4. Frequently used statistical process control (SPC) methods include control charts, fishbone diagrams, Pareto charts, and other related practices. Their goal is to keep the overall software development process stable so that its output performance can be predicted. SPC was originally applied in the manufacturing industry, where it successfully improved product quality. Some characteristics of software, however, such as the fact that software development is a human-intensive and innovative activity, increase not only the variability to be controlled but also the difficulty of implementation. In this study, we collate and analyze an operational framework for SPC and CMMI Level 4 through a literature review and a case study of the case company (company A)'s practices. The framework contains two points of view, one organizational and one methodological. The organizational point of view covers the stages of implementing CMMI Level 4 and SPC in the software industry, as well as how to design the organizational structure. The methodological point of view covers the steps for running SPC and the useful methods and tools, and also uses control theory to collate the relevant control mechanisms. Finally, we illustrate how to integrate SPC into the company's system development life cycle. The framework can serve as a reference for domestic software companies that wish to implement CMMI Level 4 and SPC.
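Among the SPC tools such a framework lists, an individuals/moving-range (XmR) chart is a common fit for software metrics, since one data point arrives per build or iteration. A minimal sketch on a hypothetical defect-density metric follows; the constants 2.66 and 3.267 are the standard XmR chart factors, but the metric and values are invented and do not come from the case company.

```python
import numpy as np

def xmr_limits(x):
    """Limits for an individuals (X) chart built from the moving range.

    Uses the standard constants for moving ranges of size 2:
    2.66 = 3 / d2 (d2 = 1.128) and D4 = 3.267 for the mR chart."""
    x = np.asarray(x, dtype=float)
    mr_bar = np.abs(np.diff(x)).mean()
    center = x.mean()
    x_limits = (center - 2.66 * mr_bar, center + 2.66 * mr_bar)
    mr_limits = (0.0, 3.267 * mr_bar)
    return center, x_limits, mr_limits

# Hypothetical defect densities (defects per KLOC) for consecutive builds.
defect_density = [4.1, 3.8, 4.5, 4.0, 3.6, 4.2, 7.9, 4.3, 3.9, 4.4]
center, (lcl, ucl), _ = xmr_limits(defect_density)
out = [i for i, v in enumerate(defect_density) if v < lcl or v > ucl]
print(f"center={center:.2f}, X limits=({lcl:.2f}, {ucl:.2f}), out-of-control builds: {out}")
```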
87

Simulation and Analysis of Analog Circuit and PCM (Process Control Monitor) Test Structures in Circuit Design

Sobe, Udo, Rooch, Karl-Heinz, Mörtl, Dietmar 08 June 2007 (has links) (PDF)
PCM test structures are commonly used to check the produced wafers from the standpoint of the technologist. In general these structures are managed inside the fab and focus on standard device properties, so their development and analysis are not driven by the analog circuit blocks that are sensitive or frequently used. Especially for DFM/Y of analog circuits, the correlation between design and technology has to be defined. Knowledge of the electrical behavior of the test structures helps to improve the designer's sensitivity to technological questions. This paper presents a method to bring the PCM methodology into analog circuit design in order to improve design performance, yield estimation and technology correlation. We show how both analog circuit and PCM blocks can be simulated and analyzed in the design phase.
88

Implementation of Statistical Process Control for Short Production Runs

Hassan, Sara January 2015 (has links)
Statistical process control, SPC, is a widely used technique for quality improvement in organizations all over the world. The current trend in manufacturing is toward shorter production runs, which causes problems when applying traditional statistical methods developed for mass production. The critical factors for a successful implementation of SPC in short runs are still not fully explored and require further research. The main purpose of this study was to present a model of how SPC can be successfully implemented by organizations with short runs and an extensive product portfolio. To answer the research questions, a case study with both quantitative and qualitative methods was used. Participant observations and a workshop with 15 participants were carried out to identify existing process variation and the current state of the studied production flow. Three product families and key quality characteristics representing product quality were chosen to be monitored in control charts, selected on the basis of scrap costs and staff experience of the production process. A measurement system analysis was performed to determine whether the gauges widely used for quality control in the process were capable. Control charts adapted to short production runs were constructed, and statistical analyses were performed to determine whether SPC is a useful method for quality improvement in short-run processes. A qualitative benchmark was also performed with four manufacturing companies to draw on their experience of implementing and working with SPC.
The findings indicate that the studied organization needs to change its working methods for quality inspection and for managing processes and the measurement system, and to carry out extensive improvement work to eliminate or reduce the many identified special causes of variation affecting the processes and product quality, before SPC can be implemented. SPC tools proved useful for improving short-run processes when standardized control charts are used that allow several products to be analyzed in the same chart. Short-run processes with a large product mix involve more complex statistical analysis and present several difficulties that increase the risk of a failed implementation.
The analysis of the qualitative and quantitative findings resulted in a model of 15 critical success factors for implementing SPC in short production runs, which organizations with short runs that aspire to a successful implementation should take into account: (1) Be ready for a cultural change that requires SPC to be woven into the whole organization, (2) Communicate a clear goal and a sustainable long-term strategy for the SPC work, (3) Create broad commitment throughout the organization, from top management to operators on the shop floor, (4) Appoint an SPC coordinator, (5) Introduce training and follow-up from the start, (6) Create cross-functional teams, (7) Promote cooperation and participation in the improvement work, (8) Require a capable measurement system, (9) Perform a pilot study where there is interest and enthusiasm, (10) Identify critical processes, product families and key quality characteristics, (11) Begin by getting to know the processes, (12) Construct standardized control charts, (13) Interpret and analyze the control charts according to Montgomery's method for statistical analysis, (14) Strive for stable, in-control processes, (15) Perform continuous follow-up.
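Success factor (12), the standardized control chart, is what lets several products with different targets share one chart. A minimal sketch, assuming per-product nominal targets and known standard deviations (all values invented):

```python
import numpy as np

# Hypothetical short-run data: a few parts each from three product types,
# with their own nominal targets and (assumed known) standard deviations.
products = {
    "A": {"target": 25.0, "sigma": 0.10, "x": [25.03, 24.92, 25.08, 24.97]},
    "B": {"target": 40.0, "sigma": 0.20, "x": [40.15, 39.90, 40.65, 40.05]},
    "C": {"target": 12.5, "sigma": 0.05, "x": [12.48, 12.53, 12.46]},
}

# Standardize to Z = (x - target) / sigma so every product shares the
# same center line (0) and control limits (+/- 3) on one chart.
for name, p in products.items():
    for x in p["x"]:
        z = (x - p["target"]) / p["sigma"]
        flag = "OUT" if abs(z) > 3.0 else "ok"
        print(f"product {name}: Z = {z:+.2f}  {flag}")
```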
89

Statistical processing of medical data: a case study

Παραμέρα, Σπυριδούλα 11 July 2013 (has links)
In this master's thesis, the combination antiretroviral drug therapy of HIV-seropositive patients was evaluated on the basis of two indicators: (1) the viral load, VL (copies/ml), which expresses the concentration of HIV RNA in the blood, and (2) the CD4 T-lymphocyte count (cells/mm3), which expresses the number of cells that help the patient fight the infection. Data were collected from 278 patients and 32 'old' ones from all over Greece for the period 1990-2006 (University Hospital of Rio). For each patient, uncategorized data were recorded in Excel sheets, with 20 to 100 measurements per patient. Such an extensive collection of data on HIV-seropositive patients was carried out for the first time in Greece. HIV therapy usually involves three or more drugs (HAART). Three classes of antiretroviral drugs were used in this study: (1) nucleoside reverse transcriptase inhibitors (NRTIs) ('green' group, 10 drugs); (2) non-nucleoside reverse transcriptase inhibitors (NNRTIs) ('yellow' group, 3 drugs); and (3) protease inhibitors (PIs) ('blue' group, 11 drugs). According to their first treatment regimen, the patients were classified into four groups, and into subgroups based on the individual drugs of each group: the green group (NRTIs, 72 patients); the green-blue group (NRTIs-PIs, 139 patients) with 8 subgroups; the green-yellow group (NRTIs-NNRTIs, 35 patients) with 2 subgroups; and the green-yellow-blue group (NRTIs-NNRTIs-PIs, 3 patients). The patients of the groups and subgroups were studied separately, divided into treatment-naïve and treatment-experienced patients. Treatment (by group and subgroup) was evaluated on the basis of: the percentage of patients achieving 'undetectable' VL≤50, as well as VL≤200 and VL≤500, every three months (presented graphically); the percentage of patients who did not achieve VL≤50; the mean time by which 50% of the patients achieved VL≤50 (speed of response to treatment); and the relationship between the baseline VL and baseline CD4 values of the successful patients at the start of treatment (categorized into ranges), as well as the CD4 count at the moment VL≤50 was achieved, together with the percentage changes in VL and CD4. The main findings were the following. Treatment with NRTIs alone (failure rate for VL≤50 of 78.5%) was less effective than combination therapy with NRTIs-PIs (81.8% success) or NRTIs-NNRTIs (81.6% success). Therapy with NRTIs alone needs a longer period to act, during which the VL value may fall below 500 but is less likely to fall below 200. Between NRTIs-PIs and NRTIs-NNRTIs, also taking into account the time by which 50% of the patients succeeded, the NRTIs-NNRTIs regimen was more effective, since VL≤50 was achieved in almost one third of the time. Regarding the individual subgroups (NRTIs-PIs and NRTIs-NNRTIs), naïve patients reached undetectable VL levels faster than experienced patients (who had a lower speed of response), for all PIs combined with NRTIs, and had lower failure rates.
For the experienced patients (failure rates from 11% to 87%), the most successful regimens were green-blue4 (NRTIs-NLF) and green-blue3 (NRTIs-IND), followed by green-blue5 (NRTIs-SAQ), with green-blue1 (NRTIs-RIT) less successful, while no safe conclusions could be drawn for green-blue8 (NRTIs-ABT), green-blue1,2 (NRTIs-RIT-INV) and green-blue1,3 (NRTIs-RIT-IND). Among naïve patients there were differences in the speed of response depending on the PI received. The fastest response was observed with the green-yellow regimen (the NRTIs-EFV and NRTIs-VIR combinations). As regards NRTIs-PI therapy, green-blue4 was the most effective, followed by green-blue5, whereas the green-blue8 and green-blue2 combinations were the least successful. A high baseline CD4 and a low baseline VL favor the achievement of VL≤50. Most naïve patients had a low baseline VL of 1,000-10,000 and a high baseline CD4 of 300-550 (as did most experienced patients). The naïve patients of green-blue2 (the highest failure rates) had a relatively low baseline VL (<50,000) and a baseline CD4 >100. The patients of green-blue4 had the highest success rates, probably because of a low baseline VL (1,000-50,000) and a high baseline CD4 (300-750 and >750). Of the patients of green-blue5 (successful), 85% had a baseline VL <10,000; indeed, when it was <1,000 the speed of response was half of that observed when it was 1,000-10,000. The patients of green-blue8 (successful even in the high baseline-VL ranges) had a long response time. Most (n=20) of the patients of green-yellow3 had a baseline VL of 1,000-50,000 and a high baseline CD4 of 300-550, and succeeded over a longer time than those of green-yellow1. The experienced patients of green-blue2 had a high failure rate even though conditions were favorable (baseline VL 1,000-50,000 and baseline CD4 300-550). The experienced patients of green-blue5 (successful) had a baseline VL <10,000, and when the baseline CD4 was 550-750 the speed of response was four times greater. The successful patients of the green group, at the time they achieved VL≤50 (mean ~47 months), had a CD4 count about 33% higher than at baseline. The successful patients of the green-blue group had, when VL≤50 was achieved (mean ~27 months), a CD4 count about 125% higher than at baseline and a baseline VL about 80 times as high (mean ~86,000). The successful patients of the green-yellow group had a similar baseline CD4 count, but their mean CD4 at the time VL≤50 was achieved was lower (increased by 21.6% versus 125%); they succeeded in only about 15 months on average, despite a very high mean baseline VL (~129,000), with a success rate similar to that of the green-blue group (~81.6%). Among the successful naïve patients of the green-blue subgroups, those with the highest baseline VL values were, in order of effectiveness: green-blue1,3 (3 patients) > green-blue1 > green-blue8. The CD4 counts of the successful patients of green-blue4 at the moment of success (mean ~22 months) were the highest (mean 720). The successful patients of green-blue5 had the lowest baseline VL values (mean ~7,000) and high CD4 counts at the moment of success (mean ~600). The successful patients of the green-yellow subgroups had high baseline VL values (mean ~130,000). The successful experienced patients with the highest baseline VL values were those of green-blue1 (mean ~162,000, ~62% failure), of green-blue2 (mean ~90,000, ~87% failure), and of the successful green-blue3 and green-blue4 (mean ~86,000-130,000), whereas those of the successful green-blue5 had a low baseline VL (mean ~3,000).
90

Use of statistical process control to monitor the mean weight of tuberculostatic capsules: a case study at NUPLAM-RN

Ferreira, Paula de Oliveira 29 August 2008 (has links)
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior / This master's thesis presents a case study on the use of statistical process control (SPC) at the Núcleo de Pesquisas em Alimentos e Medicamentos (NUPLAM). The basic SPC tools were applied to the encapsulation of tuberculostatic drugs, first with the objective of choosing which of two machine speeds is better for performing the encapsulation. Later, with production effectively running, SPC was applied to characterize the variability of the process and, by monitoring it, to arrive at estimated control limits for future lots of tuberculostatics with the same dosage as the one monitored. Since special causes were detected acting on the process, a cause-and-effect diagram was built in order to identify, in each factor that makes up the production process, the possible causes of variation in the average capsule weight. The hypotheses raised can serve as a basis for a deeper study aimed at eliminating or reducing these interferences in the process. A study of the capability of the process to meet the specifications was also carried out and showed that the process is not capable of meeting them. Nevertheless, NUPLAM has a genuine desire to implement SPC and thereby further improve the quality already present in its medicines.
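A sketch of the kind of chart used to monitor average capsule weight: an Xbar chart with S-based limits on hypothetical subgroups. The weights, subgroup size, and target below are invented and unrelated to NUPLAM's data; only the chart construction is illustrated.

```python
import numpy as np
from math import gamma, sqrt

def xbar_s_limits(subgroups):
    """Xbar-chart limits from subgroup means and standard deviations,
    using the unbiasing constant c4 (equivalent to the usual A3 factor)."""
    subgroups = np.asarray(subgroups, dtype=float)
    n = subgroups.shape[1]
    c4 = sqrt(2.0 / (n - 1)) * gamma(n / 2.0) / gamma((n - 1) / 2.0)
    xbar = subgroups.mean(axis=1)
    s_bar = subgroups.std(axis=1, ddof=1).mean()
    center = xbar.mean()
    a3 = 3.0 / (c4 * sqrt(n))
    return xbar, center, (center - a3 * s_bar, center + a3 * s_bar)

# Hypothetical capsule weights (mg): 10 subgroups of 5 capsules each.
rng = np.random.default_rng(3)
weights = rng.normal(loc=450.0, scale=4.0, size=(10, 5))
xbar, center, (lcl, ucl) = xbar_s_limits(weights)
out = np.flatnonzero((xbar < lcl) | (xbar > ucl)).tolist()
print(f"center={center:.1f} mg, limits=({lcl:.1f}, {ucl:.1f}) mg, out-of-control subgroups: {out}")
```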
