  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
901

Avaliação da confiabilidade em tubos de revestimento de poços de petróleo / Reliability assessment in casing tubes of oil Wells

Gouveia, Lucas Pereira de 08 August 2014 (has links)
This work aims to evaluate the reliability levels associated with a probabilistic treatment of the mechanical strength models of casing tubes in oil and gas wells. A comparative study of the reliability evaluation methods most commonly applied is also carried out, in order to identify the most advantageous method for the proposed application. In oil and gas well design, casing tubes must bear the mechanical loads in the subsurface, such as those imposed by the formations, by drilling and completion fluids, by the fluids produced over the well lifetime, and by the self-weight of the casing column and of other components. Reliability analysis applied to a structural design allows the probability of violating a given limit state of the structure to be assessed, so that an adequate value can be targeted from the design stage onwards. This kind of analysis is useful for obtaining adequate safety levels in design and for discussing the level of quality control in the manufacturer's production process. In this work, the failure probability is evaluated by the following reliability methods: numerical integration over the failure domain, Monte Carlo simulation, and the transformation methods First Order Reliability Method (FORM) and Second Order Reliability Method (SORM). The limit states verified are established using casing strength models found in the literature, based on mechanics-of-materials theory and on rupture test data. Statistical data are based on technical reports from casing manufacturers available in the open-access literature. The results obtained contribute to the structural assessment of well casings under the influence of design uncertainties, motivating the adoption of reliability-based analysis in the decision-making process of OCTG design. / FUNDEPES - Fundação Universitária de Desenvolvimento de Extensão e Pesquisa
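As a rough illustration of the crude Monte Carlo method named in this abstract, the sketch below estimates the probability of violating a burst-type casing limit state. The Barlow/API-style strength expression is a standard textbook form, and all distributions and parameter values are illustrative assumptions, not data or models from the thesis.

```python
import random

def burst_limit_state(yield_strength, wall_thickness, diameter, pressure):
    """g > 0 means survival, g <= 0 means failure (Barlow/API-type burst model)."""
    burst_resistance = 0.875 * 2.0 * yield_strength * wall_thickness / diameter
    return burst_resistance - pressure

def monte_carlo_pf(n_samples=200_000, seed=42):
    """Crude Monte Carlo estimate of the probability of violating the burst limit state."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        fy = rng.gauss(552.0, 30.0)   # yield strength [MPa] (assumed distribution)
        t = rng.gauss(10.5, 0.4)      # wall thickness [mm]
        d = rng.gauss(177.8, 0.6)     # outer diameter [mm]
        p = rng.gauss(40.0, 6.0)      # net internal pressure [MPa]
        if burst_limit_state(fy, t, d, p) <= 0.0:
            failures += 1
    return failures / n_samples

if __name__ == "__main__":
    print(f"Estimated burst failure probability: {monte_carlo_pf():.2e}")
```

FORM/SORM would instead locate the most probable failure point of the same limit state in standard normal space; the crude sampler above only illustrates the simulation category of methods.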
902

Separação de eventos sísmicos por métodos de decomposição de sinais / Seismic events separation by means of signal decomposition

Zanetti, Ricardo Antonio, 1978- 08 May 2013 (has links)
Advisors: João Marcos Travassos Romano, Leonardo Tomazeli Duarte / Dissertation (master's) - Universidade Estadual de Campinas, Faculdade de Engenharia Elétrica e de Computação / Abstract: The complete abstract is available in the full electronic text of the thesis / Master's degree / Telecommunications and Telematics / Master in Electrical Engineering
903

Modelos de processo de Poisson não-homogêneo na presença de um ou mais pontos de mudança, aplicados a dados de poluição do ar / Non-homogeneous Poisson process in the presence of one or more change-points, an application to air pollution data

Vicini, Lorena 06 December 2012 (has links)
Advisors: Luiz Koodi Hotta, Jorge Alberto Achcar / Thesis (doctorate) - Universidade Estadual de Campinas, Instituto de Matemática, Estatística e Computação Científica / Abstract: Air pollution is a problem that currently affects several regions around the world. In major urban centers, as expected, the concentration of air pollution is higher. Owing to the effect of wind, however, the problem is not confined to such centers, and air pollution spreads to other regions. In this thesis the air pollution data are modeled by non-homogeneous Poisson processes (NHPP) in three papers: two using Bayesian methods with Markov Chain Monte Carlo (MCMC) for count data, and one using functional data analysis. The first paper discusses the problem of prior specification, including a discussion of the sensitivity and convergence of the MCMC chains. The second paper introduces a change-point model for the NHPP, with the rate function modeled by a generalized gamma distribution, using Bayesian methods; models with and without change-points were considered for comparison purposes. The third paper uses functional data analysis to estimate the rate function of an NHPP, under the assumption that the rate function is continuous but has a finite number of discontinuity points in its first derivative, located exactly at the change-points. The rate function and its change-points were estimated using spline smoothing and a penalty function based on candidate change-points.
The methods developed in this work were tested by simulation and applied to ozone pollution data from Mexico City, describing the air quality in this urban center. The data count how many times, in a given period, air pollution exceeds a specified air-quality threshold based on ozone concentration levels. It was observed that the more complex models, which include change-points, gave the better fit / Doctorate / Statistics / Doctor in Statistics
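The abstract describes counting threshold exceedances with an NHPP whose rate changes at change-points. The sketch below simulates such a process by Lewis-Shedler thinning with a single change-point; the `rate` function, the change-point location and all numerical values are illustrative assumptions, not estimates from the Mexico City data.

```python
import random

def simulate_nhpp(rate_fn, t_max, lambda_max, seed=1):
    """Lewis-Shedler thinning: simulate event times of an NHPP on [0, t_max]."""
    rng = random.Random(seed)
    t, events = 0.0, []
    while True:
        t += rng.expovariate(lambda_max)             # candidate from homogeneous process
        if t > t_max:
            return events
        if rng.random() <= rate_fn(t) / lambda_max:  # accept with probability rate(t)/lambda_max
            events.append(t)

def rate(t, change_point=100.0):
    """Piecewise-constant exceedance rate with one change-point at t = 100 days (assumed)."""
    return 0.8 if t < change_point else 0.3

if __name__ == "__main__":
    times = simulate_nhpp(rate, t_max=365.0, lambda_max=0.8)
    before = sum(1 for t in times if t < 100.0)
    print(f"Exceedances before change-point: {before}, after: {len(times) - before}")
```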
904

Programa de medição para organizações de alta maturidade / Measurement program for high maturity organizations

Batista, Gabriela de Fatima 25 February 2005 (has links)
Advisor: Mario Jino / Dissertation (master's) - Universidade Estadual de Campinas, Faculdade de Engenharia Elétrica e de Computação / Abstract: High-maturity organizations have the continuous improvement of their processes as their main goal. Such organizations systematically apply metrics, measuring process performance and analyzing the measurements to make decisions; in other words, they effectively manage by data. Quantitative assessment of the performance of the project's defined software process and its variations allows better planning and management of projects. Considering the need to measure, predict and adjust the software process to reach quality goals, a measurement program is proposed to support quantitative management. The measurement program presents metrics aligned with the organizational goals and requires that, after data collection and analysis, the metrics stakeholders - a senior manager, a functional manager, a project leader or a developer - commit to using the results of the analysis to identify process deviations and to apply the necessary corrective actions. In this way, the performance of the project's defined software process can be kept within acceptable limits. To support deployment of the measurement process and application of metrics, a data collection, validation and analysis tool based on statistical process control, called Vigia, was developed.
Vigia can be used to control the performance of the project's defined software process, ensuring, through corrective actions in near-real time and the resulting adjustments to the software process, that the process compromises neither the organizational quality goals nor the business goals. A case study was carried out at Motorola Industrial to evaluate both the measurement program and the Vigia tool / Master's degree / Computer Engineering / Master in Electrical Engineering
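The abstract refers to keeping process performance within acceptable limits using statistical process control. As a minimal sketch of that idea, the code below computes individuals-chart (I-MR) control limits for a hypothetical software metric and flags points that would call for corrective action; the metric name and values are invented for illustration and are not taken from the Vigia tool or the Motorola case study.

```python
def individuals_chart_limits(observations):
    """I-MR chart limits: mean +/- 2.66 * average moving range (standard SPC constant)."""
    n = len(observations)
    mean = sum(observations) / n
    moving_ranges = [abs(observations[i] - observations[i - 1]) for i in range(1, n)]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return mean - 2.66 * mr_bar, mean + 2.66 * mr_bar

if __name__ == "__main__":
    # Hypothetical metric: defect density (defects/KLOC) per project build.
    defect_density = [1.2, 0.9, 1.1, 1.3, 1.0, 3.5, 1.1, 0.8, 1.2, 1.0]
    lcl, ucl = individuals_chart_limits(defect_density)
    out_of_control = [(i, x) for i, x in enumerate(defect_density) if not lcl <= x <= ucl]
    print(f"LCL = {lcl:.2f}, UCL = {ucl:.2f}")
    print(f"Builds needing corrective action: {out_of_control}")
```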
905

Degradation modeling based on a time-dependent Ornstein-Uhlenbeck process and prognosis of system failures / Modélisation des dégradations par un processus d’Ornstein-Uhlenbeck et pronostic de défaillances du système

Deng, Yingjun 24 February 2015 (has links)
This thesis is dedicated to describing, predicting and preventing system failures. It addresses four issues: i) stochastic degradation modeling, ii) prognosis of system failures, iii) failure level estimation and iv) maintenance optimization. The time-dependent Ornstein-Uhlenbeck (OU) process is introduced for degradation modeling; it is attractive because of its statistical properties of controllable mean, variance and correlation. Based on this process, the system failure time is taken as the first passage time to a pre-set failure level. Different methods are then proposed for the prognosis of system failures, which can be classified into three categories: analytical approximations, numerical algorithms and Monte Carlo simulation methods. Moreover, the failure level is estimated from the lifetime distribution by solving inverse first passage problems; this bridges the potential gap between failure records and degradation records and reinforces the prognosis carried out via first passage problems. Based on the prognosis of system failures, maintenance optimization is performed for a continuously monitored system. Casting the problem in terms of first passage times simplifies the scheduling of preventive maintenance: the maintenance decision rule is based on a virtual failure level, obtained as the solution of an optimization problem for the proposed objective functions
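A minimal sketch of the Monte Carlo category of prognosis methods mentioned above: it simulates a time-dependent OU-type degradation process with an Euler-Maruyama scheme and records first passage times to a failure level. The drift parametrization, the mean trend m(t), all parameter values and the failure level are illustrative assumptions, not the thesis's calibrated model.

```python
import math
import random

def first_passage_times(n_paths=500, dt=0.02, t_max=50.0, failure_level=8.0, seed=0):
    """Euler-Maruyama paths of dX = (m'(t) + a*(m(t) - X)) dt + sigma dB, hitting a failure level.

    m(t) is a time-dependent mean trend, a the mean-reversion rate, sigma the diffusion
    coefficient; all values below are assumed for illustration.
    """
    rng = random.Random(seed)
    a, sigma = 0.5, 0.8

    def m(t):
        return 0.2 * t      # linearly drifting mean degradation

    def dm(t):
        return 0.2          # derivative of m(t)

    hits = []
    for _ in range(n_paths):
        x, t = 0.0, 0.0
        while t < t_max:
            drift = dm(t) + a * (m(t) - x)
            x += drift * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
            t += dt
            if x >= failure_level:
                hits.append(t)   # first passage time = predicted failure time
                break
    return hits

if __name__ == "__main__":
    fpt = first_passage_times()
    if fpt:
        print(f"{len(fpt)} of 500 paths failed; mean first-passage time = {sum(fpt)/len(fpt):.1f}")
```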
906

Modélisation markovienne en fiabilité: réduction des grands systèmes / Markov modeling in reliability: reduction of large systems

Tombuyses, Béatrice 09 December 1994 (has links)
The subject of this doctoral thesis is the study of various aspects of the Markovian approach to reliability studies.
The first part of the thesis concerns the modeling of industrial installations and the construction of the transition matrix. The aim is to develop a Markovian code that allows a realistic and straightforward description of the system. The system is described in terms of multi-state components: pumps, valves, etc. The definition of a set of standard rules allows dependencies between components to be introduced. Thanks to the standardized modeling of the system, an algorithm for the automatic construction of the transition matrix is developed. The introduction of maintenance or information operations is also presented.
The second part deals with techniques for reducing the size of the matrix, so that large installations can be handled. Indeed, the number of states grows exponentially with the number of components, which usually limits the installations that can be analyzed to about ten components. The classical reduction techniques are reviewed: reachability of states, separation of groups of independent components, and symmetry and exact aggregation of states (cf. Papazoglou). The notion of component symmetry has to be adapted to take into account the dependencies that may exist between components. An approximate aggregation method is developed for computing the reliability and availability of groups of two-state components.
The third part of the thesis contains an original approach to the use of the Markovian method: the development of a reduction technique based on the component influence graph. An influence graph of the components is built from the dependencies existing between components. On the basis of this graph, a non-homogeneous Markovian system is constructed that approximately describes the behavior of the exact system. The results obtained on various examples are very good.
A fourth part of the thesis deals with the numerical problems related to integrating the differential system of the Markovian problem. These problems result mainly from the stiffness of the system. Several classical methods are implemented for integrating the differential system and are tested on a typical reliability problem.
Finally, the CAMERA code, in which the various techniques presented above have been implemented, is described. / Doctorate in applied sciences / info:eu-repo/semantics/nonPublished
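As a small illustration of the Markovian approach discussed above (a transition-rate matrix plus numerical integration of a stiff differential system), the sketch below builds the generator matrix of a two-component repairable system and propagates the state probabilities with a backward-Euler step. The component count, the failure and repair rates and the step size are illustrative assumptions, unrelated to the CAMERA code.

```python
import numpy as np

# Two identical repairable components in parallel (illustrative system).
# States: 0 = both up, 1 = one up / one down, 2 = both down (system failed).
lam, mu = 1e-3, 1e-1   # failure and repair rates per hour (assumed values)

# Generator (transition-rate) matrix Q: Q[i, j] is the rate from state i to state j;
# each row sums to zero.
Q = np.array([
    [-2 * lam,     2 * lam,  0.0],
    [      mu, -(mu + lam),  lam],
    [     0.0,          mu,  -mu],
])

def transient_probs(p0, t_end, dt=0.1):
    """Integrate dp/dt = Q^T p with a backward-Euler step, which copes with stiff systems."""
    n = Q.shape[0]
    propagator = np.linalg.inv(np.eye(n) - dt * Q.T)
    p = np.array(p0, dtype=float)
    for _ in range(int(t_end / dt)):
        p = propagator @ p
    return p

if __name__ == "__main__":
    p = transient_probs([1.0, 0.0, 0.0], t_end=1000.0)
    print(f"System unavailability after 1000 h: {p[2]:.3e}")
```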
907

Statistická analýza kontrolních zkoušek horninových kotev / Statistical analysis of the acceptance tests of ground anchors

Štefaňák, Jan Unknown Date (has links)
The objective of this dissertation is to find approaches for processing data extracted from reports documenting the acceptance tests of ground anchors, so that these data can be further used in design practice. In total, 795 test records were collected. For the analysis to be correct, the whole anchor bond must be placed in homogeneous material, so records of anchors that did not fulfil this condition were removed. A set of 379 records remained, covering anchors installed in six different soil types during construction work in the Czech Republic. All of these anchors were tested according to the requirements of the European standard ČSN EN 1537:2001, valid until 2013. Methodologies based on mathematical statistics, regression analysis and probability methods were developed to solve the task defined above. The major result of the data processing, performed with a combination of mathematical statistics and probabilistic simulation methods, is a set of bond shear stress parameter values elaborated for a variety of soil types. A regression model for determining the force-displacement curve and a model predicting the creep behavior of a loaded ground anchor were constructed, in which the creep value depends on the tendon bond length, the tendon free length and the level of prestressing force. A description of a full-scale experiment, whose results were used to verify the assumptions incorporated in the methodology, is included. An example of determining the probability of failure of an anchored structure using stochastic simulation is also given, with the previously obtained results used as input values for the calculation. Moreover, a software application that automates the processes associated with conducting ground anchor tests and creating the test report is introduced.
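To illustrate the regression part of the methodology in a minimal way, the sketch below fits a simple two-term force-displacement model to a hypothetical acceptance-test record by least squares. The model form, the data values and the proof load are assumptions made for illustration, not the dissertation's fitted model or collected data.

```python
import numpy as np

# Hypothetical acceptance-test record: anchor force steps [kN] and measured
# displacements [mm]; the values are invented, not from the collected data set.
force = np.array([100.0, 200.0, 300.0, 400.0, 500.0, 600.0])
displacement = np.array([5.1, 10.4, 16.0, 22.1, 28.7, 36.0])

# Assumed force-displacement model: u(F) = c1*F + c2*F**2
# (linear free-tendon stretch plus a nonlinear bond-slip term), fitted by least squares.
X = np.column_stack([force, force**2])
coeffs, *_ = np.linalg.lstsq(X, displacement, rcond=None)
c1, c2 = coeffs

predicted = X @ coeffs
rmse = float(np.sqrt(np.mean((displacement - predicted) ** 2)))
print(f"c1 = {c1:.4e} mm/kN, c2 = {c2:.4e} mm/kN^2, RMSE = {rmse:.2f} mm")
print(f"Predicted displacement at a 700 kN proof load: {c1 * 700 + c2 * 700**2:.1f} mm")
```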
908

Analýza AVG signálů / Analysis of AVG signals

Musil, Václav January 2008 (has links)
The presented thesis discusses basic methods for the analysis of arteriovelocitograms (AVG signals). The core of the work is the classification of these signals, as a contribution to noninvasive diagnostic methods for evaluating patients with peripheral ischemic occlusive arterial disease. The classification employs multivariate statistical methods and principles of neural networks, applied to an angiographically verified set of arteriovelocitogram data. Digital subtraction angiography classified the signals into three separable classes according to the degree of vascular stenosis. Each AVG signal is represented by 6 parameters measured at 3 different places on the patient's leg, so that the disease is evaluated comprehensively from signals acquired along the whole leg. The sensitivity of the clustering method compared with angiography lies between 82.75 % and 90.90 %, and its specificity between 80.66 % and 88.88 %. Using neural networks, the sensitivity lies between 79.06 % and 96.87 % and the specificity between 73.07 % and 91.30 %.
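The reported sensitivity and specificity figures come from comparing classifier output with angiographically verified labels. The sketch below shows that computation on invented labels; the label values and the binary (diseased / not diseased) simplification are assumptions, since the thesis distinguishes three stenosis classes.

```python
def sensitivity_specificity(y_true, y_pred, positive=1):
    """Sensitivity = TP/(TP+FN), specificity = TN/(TN+FP), with angiography as ground truth."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p != positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    return tp / (tp + fn), tn / (tn + fp)

if __name__ == "__main__":
    # 1 = significant stenosis confirmed by angiography, 0 = no significant stenosis (invented).
    truth      = [1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 1, 0]
    classifier = [1, 1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0]
    se, sp = sensitivity_specificity(truth, classifier)
    print(f"Sensitivity = {se:.2%}, specificity = {sp:.2%}")
```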
909

Výzkum faktorů ovlivňujících výkonnost podniků / Analysis of Business Performance Factors

Janková, Ivana January 2011 (has links)
This thesis analyses intercompany comparison using one-dimensional and multi-dimensional methods. It focuses on enterprises operating in one subsegment of the national economy, the building industry. The factors that most influence company performance in this field are selected using the correlation coefficient. The economic development of the industry over the years, as well as its characteristics, is also described in detail.
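As a small sketch of selecting influential factors with a correlation coefficient, the code below ranks a few hypothetical financial indicators of construction companies by their Pearson correlation with a performance score. The indicator names and all values are invented for illustration and are not the thesis's data or chosen factors.

```python
from statistics import correlation  # Pearson correlation, Python 3.10+

# Hypothetical indicators for five construction companies (illustrative values).
roe                  = [0.08, 0.12, 0.05, 0.15, 0.10]   # return on equity
debt_ratio           = [0.62, 0.48, 0.70, 0.40, 0.55]
labour_productivity  = [1.9, 2.4, 1.6, 2.8, 2.1]
performance_score    = [0.9, 1.4, 0.6, 1.8, 1.2]        # composite performance measure

indicators = {
    "ROE": roe,
    "debt ratio": debt_ratio,
    "labour productivity": labour_productivity,
}
for name, series in indicators.items():
    print(f"{name}: r = {correlation(series, performance_score):+.2f}")
```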
910

Uplatnění časových řad v technické analýze akcií / Use of Time Series for Technical Analysis of Shares

Hela, Michael January 2012 (has links)
This master's thesis deals with the analysis of selected share prices using statistical methods, namely regression analysis and time series analysis. Moving averages are used as technical indicators in the technical analysis of securities to predict the future development of share prices and to find the buy and sell signals that these indicators generate. The results of this work are suggestions for stock trading based on the use of these methods.
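A minimal sketch of the moving-average technique described above: it computes simple moving averages and reports the buy and sell signals generated when the short average crosses the long one. The window lengths and the price series are illustrative assumptions, not the thesis's chosen parameters or analysed shares.

```python
def simple_moving_average(prices, window):
    """SMA series; element i corresponds to price index i + window - 1."""
    return [sum(prices[i - window + 1:i + 1]) / window for i in range(window - 1, len(prices))]

def crossover_signals(prices, short=3, long=5):
    """Buy when the short SMA crosses above the long SMA, sell when it crosses below."""
    sma_s, sma_l = simple_moving_average(prices, short), simple_moving_average(prices, long)
    offset = long - short                        # align both series on the same dates
    signals = []
    for i in range(1, len(sma_l)):
        prev_diff = sma_s[i - 1 + offset] - sma_l[i - 1]
        curr_diff = sma_s[i + offset] - sma_l[i]
        if prev_diff <= 0 < curr_diff:
            signals.append((i + long - 1, "BUY"))    # index of the crossing day
        elif prev_diff >= 0 > curr_diff:
            signals.append((i + long - 1, "SELL"))
    return signals

if __name__ == "__main__":
    # Hypothetical daily closing prices of one share.
    closes = [100, 99, 98, 97, 99, 102, 105, 107, 106, 104, 101, 99, 98]
    print(crossover_signals(closes))   # e.g. [(5, 'BUY'), (10, 'SELL')]
```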
