1

Improvement of the predictive character of test results issued from analytical methods life cycle / Amélioration du caractère prédictif des résultats associés au cycle de vie des méthodes analytiques

Rozet, Eric 29 April 2008
Results produced by analytical methods play an essential role in many fields because of the decisions made on their basis, such as determining the quality of active pharmaceutical ingredients, drug products, nutrients, or other samples such as those of biological origin involved in pharmacokinetic, bioavailability, or bioequivalence studies. The reliability of analytical results is paramount in this context, and above all they must meet the needs of their end users. To guarantee the reliability of the results that will be delivered in routine analyses, validation is a crucial step in the life cycle of an analytical method. Moreover, an analytical method is often not used only in the laboratory that developed and validated it: it is regularly transferred to another laboratory, for instance from a research and development laboratory to a quality control laboratory, or to or from a subcontractor. The transfer must guarantee the reliability of the results that the receiving laboratories will deliver, since they are the ones that will use the method thereafter. This is the context of this thesis, whose main objective is to improve the reliability of the decisions made on the basis of results obtained by quantitative analytical methods during these two stages of their life cycle. To reach this objective, we first restated the objective of any quantitative analytical method and of its validation. We then reviewed the pharmaceutical industry's regulatory texts on method validation, identifying the errors and confusions they contain and, in particular, their practical implications when assessing the validity of a method. In light of these findings, a new approach for assessing the validity of quantitative analytical methods was proposed and detailed from a statistical point of view. It relies on a statistical methodology using a "β-expectation" tolerance interval, transposed into a final decision tool called the accuracy profile. This profile guarantees that a defined proportion of the future results delivered by the method in routine use will fall within acceptance limits set a priori according to the users' needs. In this way, the objective of validation is fully consistent with that of any quantitative method: to obtain accurate results. This validation approach has been applied successfully to various types of analytical methods, such as liquid chromatography, capillary electrophoresis, and UV or near-infrared spectrophotometry, both for the determination of analytes in matrices from drug production (pharmaceutical formulations) and in more complex matrices such as biological fluids (plasma, urine), demonstrating the universal character of the accuracy profile for deciding on the validity of a method. Then, to increase the objectivity of this decision tool, we introduced desirability indexes built around the validation criteria, namely trueness, precision, dosing-range, and accuracy indexes.
The last of these is a global desirability index corresponding to the geometric mean of the other three. These indexes make it possible to compare and rank the accuracy profiles obtained during validation and thus to select, more objectively, the profile that best matches the objective of the method. Finally, we demonstrated for the first time the predictive character of the accuracy profile by verifying that the proportion of results within the acceptance limits predicted at the validation stage was indeed achieved during routine application of the various assay methods. Unlike validation, the transfer of an analytical method from a sending laboratory, which validated it, to a receiving laboratory, which will use it in routine, is a step covered by no normative text: the only regulatory requirement is to document the transfer. Consequently, any approach is possible, yet the most common ones do not meet the objective of a transfer, namely to guarantee that the results obtained by the receiving laboratory will be accurate and hence reliable. We therefore developed an original approach that allows an appropriate decision on the acceptability of a transfer. This approach, based on the total error concept, also uses the tolerance interval as its statistical methodology and simultaneously takes into account the uncertainty in the estimate of the true value provided by the sending laboratory: a "β-expectation" tolerance interval is computed from the receiver's results and then compared with acceptance limits set around the true value and adjusted for the uncertainty associated with this reference value. Statistical simulations showed the gain in managing the risks associated with a transfer, namely rejecting an acceptable transfer and accepting one that is not. Finally, the adequacy and applicability of this new approach were demonstrated by the transfer of a method dedicated to the quality control of a pharmaceutical formulation and of two bioanalytical methods. The improvements in the predictive quality of the proposed methodologies for assessing the validity and transfer of quantitative analytical methods thus increase the reliability of the results generated by these methods and, consequently, confidence in the critical decisions based on them.
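As a rough illustration of the accuracy-profile decision rule described in this abstract, the sketch below computes a β-expectation tolerance interval at a single concentration level and checks it against a priori acceptance limits. It uses the simple one-sample form of the interval (mean ± t·s·√(1 + 1/n)) rather than the intermediate-precision form developed in the thesis; the recovery data, β, and acceptance limits are hypothetical.

```python
# Minimal sketch of an accuracy-profile decision at one concentration level.
# Assumptions: n replicate recovery measurements (%), approximately normal;
# simple one-sample beta-expectation tolerance interval
#   mean +/- t_{(1+beta)/2, n-1} * s * sqrt(1 + 1/n).
import numpy as np
from scipy import stats

recoveries = np.array([98.2, 101.5, 99.7, 100.8, 97.9, 100.3])  # hypothetical, in %
beta = 0.95                              # expected proportion of future results covered
accept_low, accept_high = 95.0, 105.0    # a priori acceptance limits (+/- 5 %)

n = len(recoveries)
mean = recoveries.mean()
s = recoveries.std(ddof=1)
t = stats.t.ppf((1 + beta) / 2, df=n - 1)
half_width = t * s * np.sqrt(1 + 1 / n)
lo, hi = mean - half_width, mean + half_width

print(f"beta-expectation tolerance interval: [{lo:.2f}, {hi:.2f}] %")
print("valid at this level:", accept_low <= lo and hi <= accept_high)
```

Repeating this at each concentration level of the validation design and connecting the interval endpoints yields the accuracy profile against the acceptance limits.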
2

Validation of a DC-DC Boost Circuit Model and Control Algorithm

Zumberge, Jon T. 27 August 2015
No description available.
3

Corporate Default Predictions and Methods for Uncertainty Quantifications

Yuan, Miao 01 August 2016
This dissertation presents two projects on quantifying uncertainty in prediction, with different perspectives and application backgrounds. The goal of the first project is to predict corporate default risks based on large-scale time-to-event and covariate data, in the context of controlling credit risk. Specifically, we propose a competing risks model to incorporate exits of companies due to default and other reasons. Because of the stochastic and dynamic nature of corporate risks, we incorporate both company-level and market-level covariate processes into the event intensities. We propose a parsimonious Markovian time series model and a dynamic factor model (DFM) to efficiently capture the mean and correlation structure of the high-dimensional covariate dynamics. For estimating the parameters of the DFM, we derive an expectation-maximization (EM) algorithm in explicit form under the necessary constraints. For multi-period default risks, we consider both corporate-level and market-level predictions, and we develop prediction interval (PI) procedures that jointly account for uncertainty in the future observations, the parameter estimates, and the future covariate processes. In the second project, to quantify the uncertainty in maximum likelihood (ML) estimators and compute exact tolerance interval (TI) factors that attain the nominal confidence level, we propose algorithms for two-sided control-the-center and control-both-tails TIs for complete or Type II censored data from the (log-)location-scale family of distributions. Our approaches are based on pivotal properties of the ML estimators for this family and use Monte Carlo simulation. For Type I censored data only approximate pivotal quantities exist, so an adjusted procedure is developed to compute approximate factors; our simulation study shows that the observed coverage probability (CP) is asymptotically accurate. The proposed methods are illustrated with real-data examples.
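A minimal sketch of the pivotal Monte Carlo idea from the second project, specialized to the simplest setting it covers: complete normal samples and a two-sided β-content interval x̄ ± k·s (the control-the-center and control-both-tails variants in the dissertation impose slightly different coverage requirements). Because the coverage of such an interval is pivotal — it does not depend on the unknown mean and standard deviation — the factor k can be found by simulation and bisection; the sample size, content β, and confidence γ below are illustrative.

```python
# Monte Carlo computation of the two-sided tolerance factor k for complete
# normal samples: the smallest k such that
#   P( content of (xbar - k*s, xbar + k*s) >= beta ) = gamma.
# Pivotal property: for standard normal data the content is
#   Phi(xbar + k*s) - Phi(xbar - k*s),
# whose distribution does not depend on the true mean/sd.
import numpy as np
from scipy import stats

def tolerance_factor(n, beta=0.90, gamma=0.95, n_sim=20000, seed=1):
    rng = np.random.default_rng(seed)
    samples = rng.standard_normal((n_sim, n))
    xbar = samples.mean(axis=1)
    s = samples.std(axis=1, ddof=1)

    def conf_level(k):
        # fraction of simulated samples whose interval holds >= beta content
        content = stats.norm.cdf(xbar + k * s) - stats.norm.cdf(xbar - k * s)
        return np.mean(content >= beta)

    lo, hi = 0.0, 10.0            # conf_level is increasing in k: bisect
    for _ in range(60):
        mid = (lo + hi) / 2
        if conf_level(mid) < gamma:
            lo = mid
        else:
            hi = mid
    return hi

print(tolerance_factor(n=20))     # roughly 2.31 for n=20, beta=0.90, gamma=0.95
```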
4

Statistical inference with randomized nomination sampling

Nourmohammadi, Mohammad 08 1900
In this dissertation, we develop several new inference procedures based on randomized nomination sampling (RNS). The first problem we consider is constructing distribution-free confidence intervals for quantiles in finite populations; the algorithms required to compute the coverage probabilities of the proposed intervals are presented. The second problem is constructing nonparametric confidence intervals for infinite populations. We describe the construction procedures and compare the resulting RNS confidence intervals, under both perfect and imperfect ranking scenarios, with their simple random sampling (SRS) counterparts, and we make recommendations for choosing the design parameters so as to achieve shorter confidence intervals than under SRS. The third problem is the construction of tolerance intervals using the RNS technique. We describe procedures for constructing one- and two-sided RNS tolerance intervals and investigate the sample sizes required to achieve intervals that contain specified proportions of the underlying population. We also investigate the efficiency of RNS-based tolerance intervals relative to their SRS counterparts, and a new method for estimating ranking error probabilities is proposed. The final problem is parametric inference based on RNS. We introduce the different data types associated with the situations one may encounter under the RNS design and provide maximum likelihood (ML) and method of moments (MM) estimators of the parameters in two classes of distributions: proportional hazard rate (PHR) and proportional reverse hazard rate (PRHR) models.
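The SRS baseline for the first problem is easy to make concrete: a distribution-free confidence interval for the p-th quantile uses two order statistics whose joint coverage is a binomial probability. A sketch under simple random sampling (RNS itself changes the sampling design and is not reproduced here); the data and levels are illustrative.

```python
# Distribution-free CI for the p-th quantile under simple random sampling:
# find order statistics (X_(r), X_(s)), r < s, with
#   P(X_(r) <= x_p <= X_(s)) >= F_B(s-1) - F_B(r-1) >= 1 - alpha,
# where B ~ Binomial(n, p). This is the SRS baseline against which
# RNS-based intervals are compared.
import numpy as np
from scipy import stats

def quantile_ci(data, p=0.5, alpha=0.05):
    x = np.sort(np.asarray(data))
    n = len(x)
    best = None
    for r in range(1, n):                  # lower rank
        for s in range(r + 1, n + 1):      # upper rank
            cover = stats.binom.cdf(s - 1, n, p) - stats.binom.cdf(r - 1, n, p)
            # keep the pair with the narrowest rank spread that still covers
            if cover >= 1 - alpha and (best is None or s - r < best[0]):
                best = (s - r, r, s)
    if best is None:
        raise ValueError("sample too small for the requested confidence")
    _, r, s = best
    return x[r - 1], x[s - 1], r, s

data = np.random.default_rng(7).exponential(size=30)
lo, hi, r, s = quantile_ci(data, p=0.5)
print(f"95% CI for the median: ({lo:.3f}, {hi:.3f}) from order stats {r} and {s}")
```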
5

Métodos para detecção de outliers em séries de preços do índice de preços ao consumidor / Methods for detecting outliers in price series of the consumer price index

Lyra, Taíse Ferraz 24 February 2014
Outliers are observations that appear to be inconsistent with the others. Also called atypical, extreme, or aberrant values, these inconsistencies can be caused, for instance, by policy changes or economic crises, unexpected cold or heat waves, and measurement or typing errors. Although outliers are not necessarily incorrect values, when they arise from measurement or typing errors they can distort the results of an analysis and lead researchers to erroneous conclusions. The objective of this research is to study and compare different methods for detecting abnormalities in the price series of the Consumer Price Index (Índice de Preços ao Consumidor - IPC), calculated by the Brazilian Institute of Economics (Instituto Brasileiro de Economia - IBRE) of the Getulio Vargas Foundation (Fundação Getulio Vargas - FGV). The IPC measures the price variation of a fixed set of goods and services that make up the customary expenses of families with income between 1 and 33 monthly minimum wages, and it is mainly used as a reference index to assess consumers' purchasing power. In addition to the method currently used by price analysts at IBRE, the study also considered variations of the IBRE method, the Boxplot method, the SIQR Boxplot method, the Adjusted Boxplot method, the Resistant Fences method, the Quartile method, the Modified Quartile method, the Median Absolute Deviation method, and the Tukey algorithm. These methods were applied to data from the municipalities of Rio de Janeiro and São Paulo. To analyze the performance of each method, the true extreme values must be known in advance; therefore, this study assumed that the prices discarded or changed by the analysts in the critical review process are the true outliers. The IBRE method is strongly correlated with the prices altered or discarded by the analysts, so this assumption may influence the results, favoring the IBRE method over the others. It nevertheless makes it possible to compute two measures by which the methods are evaluated. The first is the method's hit rate, the proportion of true outliers detected. The second is the number of false positives produced by the method, that is, how many values had to be flagged for a true outlier to be detected. The higher the hit rate and the lower the number of false positives, the better the method's performance.
It was therefore possible to build a ranking of the methods' performance and identify the best among those analyzed. For the municipality of Rio de Janeiro, some variations of the IBRE method performed as well as or better than the original method; for São Paulo, the IBRE method performed best. A method is considered to detect an outlier correctly when it flags a true outlier as an extreme value, and the method with the highest hit rate and the smallest number of false positives was the IBRE method. In future investigations, we hope to test the methods on simulated data and on widely used data sets, so that the assumption that prices discarded or changed during the critical review process are the true outliers does not affect the results.
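Two of the simpler rules in this comparison are easy to state in code. Below is a sketch of the Boxplot and Median Absolute Deviation rules on a hypothetical price-relative series, using the conventional cutoff constants for each rule; the IBRE method itself is not reproduced here.

```python
# Sketch of two of the compared outlier rules on a hypothetical price series.
# Boxplot rule: flag x outside [Q1 - 1.5*IQR, Q3 + 1.5*IQR].
# MAD rule: flag x with |x - median| / (1.4826 * MAD) > 3 (a common cutoff).
import numpy as np

prices = np.array([1.01, 0.99, 1.02, 1.00, 0.98, 1.03, 1.65, 1.01, 0.97, 0.42])

def boxplot_outliers(x, k=1.5):
    q1, q3 = np.percentile(x, [25, 75])
    iqr = q3 - q1
    return (x < q1 - k * iqr) | (x > q3 + k * iqr)

def mad_outliers(x, cutoff=3.0):
    med = np.median(x)
    mad = np.median(np.abs(x - med))
    robust_z = np.abs(x - med) / (1.4826 * mad)  # 1.4826 makes MAD consistent
    return robust_z > cutoff                     # with the normal std. dev.

print("boxplot flags:", np.where(boxplot_outliers(prices))[0])
print("MAD flags:    ", np.where(mad_outliers(prices))[0])
# Hit rate and false positives would then be computed against the analysts'
# discarded-or-changed prices, as in the dissertation.
```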
6

Technologie elektrojiskrového drátového řezání / Technology of wire electrical discharge machining

Brázda, Radim January 2013
This master's thesis deals with the unconventional technology of wire electrical discharge machining. It describes the principles and essence of electrical discharge machining, and of wire electrical discharge machining in particular, with emphasis on applying this technology in a medium-sized engineering company. It also describes the complete wire-cutting workflow and machining on the Excetek V 650 wire cutter. The thesis then statistically evaluates the precision parameters of the machined surfaces, specifically for the 116-8M-130 belt pulley. It closes with a technical-economic evaluation addressing the hourly cost of machining on the Excetek V 650.
7

ASSESSMENT OF AGREEMENT AND SELECTION OF THE BEST INSTRUMENT IN METHOD COMPARISON STUDIES

Choudhary, Pankaj K. 11 September 2002
No description available.
8

[pt] INTERVALOS DE TOLERÂNCIA PARA VARIÂNCIAS AMOSTRAIS APLICADOS AO ESTUDO DO DESEMPENHO NA FASE II E PROJETO DE GRÁFICOS DE S(2) COM PARÂMETROS ESTIMADOS / [en] TOLERANCE INTERVALS FOR SAMPLE VARIANCES APPLIED TO THE STUDY OF THE PHASE II PERFORMANCE AND DESIGN OF S(2) CHARTS WITH ESTIMATED PARAMETERS

MARTIN GUILLERMO CORNEJO SARMIENTO 25 July 2019
[en] The S(2) control charts are fundamental tools widely used to monitor the process dispersion in applications of Statistical Process Monitoring and Control.
Phase II performance of different types of control charts, including the S(2) chart, with unknown process parameters may differ significantly from the nominal performance because of the effect of parameter estimation. In recent years, this effect has been addressed predominantly under the conditional perspective, which considers the variability of parameter estimates obtained from different Phase I reference samples, instead of the traditional unconditional performance measures based on the marginal (unconditional) run-length (RL) distribution, such as the unconditional average run length. In light of this conditional perspective, the Phase II performance and design of control charts are frequently analyzed using the exceedance probability criterion for the conditional (given the parameter estimates) in-control average run length (CARL0), that is, the criterion that ensures a high probability that CARL0 is at least a specified minimum tolerated value. Tolerance intervals for sample variances are useful when the main concern is the precision of the values of the quality characteristic, and they can be used in decision-making on lot acceptance sampling. Motivated by the fact that these tolerance intervals, specifically the two-sided ones, have not been addressed in the literature so far, exact and approximate two-sided tolerance limits for the population of sample variances are derived and presented in this work. The mathematical-statistical relationship between the tolerance interval for the sample variance and the exceedance probability (survival probability) of CARL0 for the S(2) control chart with estimated parameter is recognized, highlighted, and used in this work in such a way that the study of the Phase II performance and design of this chart can be based on the tolerance interval for the sample variance, and vice versa. Previous works on the performance and design of the S(2) chart with estimated parameter generally focused on only one perspective (either unconditional or conditional) and considered only one type of chart (either upper one-sided or two-sided); the existence of two perspectives and two types of charts may be confusing for practitioners. For that reason, the performance and design of the S(2) control chart under the two perspectives are compared for each type of chart, and the two types of charts are likewise compared under each perspective. Some important results related to the design of the S(2) chart, not yet available in the literature, were required and obtained in this work to provide a comprehensive comparative study that makes practitioners aware of the significant differences between the two perspectives and the two types of charts, so that properly informed decisions about the chart design can be made. Furthermore, because the conditional RL distribution is usually highly right-skewed, the median and some extreme quantiles of the conditional RL distribution are proposed as performance measures complementary to the customary mean (CARL0). Finally, some practical recommendations are offered.
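The conditional quantities discussed in this abstract can be sketched for the upper one-sided S(2) chart. Below is a minimal Monte Carlo illustration, assuming a pooled Phase I variance estimator with ν = m(n−1) degrees of freedom and the usual chi-square sampling distributions; the sample sizes and nominal false-alarm rate are illustrative.

```python
# Conditional in-control performance of an upper one-sided S^2 chart with an
# estimated in-control variance.
# Phase I: pooled estimator  sigma0_hat^2 ~ sigma^2 * chi2_nu / nu,  nu = m(n-1).
# Phase II limit: UCL = sigma0_hat^2 * chi2_{1-alpha, n-1} / (n-1).
# Conditional in-control ARL given sigma0_hat: CARL0 = 1 / P(S^2 > UCL).
import numpy as np
from scipy import stats

m, n, alpha = 25, 5, 0.005        # Phase I samples, sample size, nominal FAR
sigma2 = 1.0                      # true in-control variance
nu = m * (n - 1)
rng = np.random.default_rng(42)

# simulate many possible Phase I estimates
sigma0_hat2 = sigma2 * rng.chisquare(nu, size=100_000) / nu
ucl = sigma0_hat2 * stats.chi2.ppf(1 - alpha, n - 1) / (n - 1)

# conditional false-alarm rate and conditional ARL for each Phase I outcome
cfar = stats.chi2.sf((n - 1) * ucl / sigma2, n - 1)
carl0 = 1.0 / cfar

nominal = 1 / alpha
print("nominal ARL0:", nominal)
print("median CARL0:", np.median(carl0))
print("P(CARL0 >= 0.9 * nominal):", np.mean(carl0 >= 0.9 * nominal))
```

The exceedance probability printed on the last line is the quantity the criterion above requires to be high; the spread of `carl0` across simulated Phase I samples is exactly the parameter-estimation effect the conditional perspective captures.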
