1

Methods for Bayesian inversion of seismic data

Walker, Matthew James, January 2015
The purpose of Bayesian seismic inversion is to combine information derived from seismic data with prior geological knowledge to determine a posterior probability distribution over parameters describing the elastic and geological properties of the subsurface. Typically the subsurface is modelled by a cellular grid containing thousands or millions of cells within which these parameters are to be determined. Such inversions are therefore computationally expensive, because the size of the parameter space over which the posterior must be determined is proportional to the number of grid cells, and in practice approximations to Bayesian seismic inversion must be considered. This thesis describes a particular existing approximate workflow, the so-called two-stage inversion method, which explicitly splits the problem into an elastic and a geological inversion stage: the first estimates the elastic parameters given the seismic data, and the second estimates the geological parameters given the elastic estimates. A number of methodologies are developed in this thesis which enhance the accuracy of this approximate workflow.

To reduce computational cost, existing elastic inversion methods often incorporate only simplified prior information about the elastic parameters. A method is therefore introduced which transforms results obtained using prior information specified with only two-point geostatistics into new estimates containing sophisticated multi-point geostatistical prior information. The method uses a so-called deep neural network, trained using only synthetic instances (or 'examples') of these two kinds of estimate, to apply the transformation. The method is shown to improve the resolution and accuracy (by comparison to well measurements) of elastic parameter estimates determined for a real hydrocarbon reservoir.

It has been shown previously that so-called mixture density network (MDN) inversion can solve geological inversion analytically, and thus very rapidly and efficiently, but only under certain assumptions about the geological prior distribution. A so-called prior replacement operation is developed here which relaxes these requirements and permits the efficient MDN method to be incorporated into general stochastic geological inversion methods that are free from the restrictive assumptions. Such methods rely on Markov-chain Monte-Carlo (MCMC) sampling, which estimates the posterior over the geological parameters by producing a correlated chain of samples from it. It is shown that this approach can yield biased estimates of the posterior. An alternative method is therefore developed which obtains a set of uncorrelated samples from the posterior, avoiding the possibility of bias in the estimate. The new method was tested on a synthetic geological inversion problem; its results compared favourably with those of Gibbs sampling (an MCMC method) on the same problem, which exhibited very significant bias.

The geological prior information used in seismic inversion can be derived from real images which bear similarity to the geology anticipated within the target region of the subsurface. Such so-called training images, from which this information (in the form of geostatistics) may be extracted, are not always available, in which case appropriate training images may be generated by geological experts; however, this process can be costly and difficult. An elicitation method (based on a genetic algorithm) is therefore developed here which obtains the appropriate geostatistics reliably and directly from a geological expert, without the need for training images. Twelve experts were asked to use the algorithm individually to determine the appropriate geostatistics for a physical (target) geological image. The majority of the experts were able to obtain a set of geostatistics consistent with the true (measured) statistics of the target image.
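
To make the sampling issue concrete, here is a minimal, self-contained Python sketch (not taken from the thesis, and far removed from a real seismic inversion): a short, deliberately poorly mixing random-walk Metropolis chain over a one-dimensional toy posterior, compared with uncorrelated samples from the same distribution. The target density, step size, start point and chain length are all illustrative assumptions.

```python
# Toy illustration (not the thesis's workflow): a correlated random-walk
# Metropolis chain versus independent samples when estimating the mean of
# a one-dimensional N(0, 1) "posterior".
import numpy as np

rng = np.random.default_rng(0)

def log_post(x):
    # Standard-normal log-density up to an additive constant.
    return -0.5 * x * x

def metropolis(n, step=0.1, x0=3.0):
    # Random-walk Metropolis; the small step size and distant start point
    # deliberately produce a highly correlated, slowly converging chain.
    xs = np.empty(n)
    x, lp = x0, log_post(x0)
    for i in range(n):
        prop = x + step * rng.normal()
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            x, lp = prop, lp_prop
        xs[i] = x
    return xs

n = 2000
chain = metropolis(n)        # correlated MCMC samples
iid = rng.normal(size=n)     # uncorrelated samples from the same target

print("correlated-chain estimate of the posterior mean:", round(chain.mean(), 3))
print("independent-sample estimate of the posterior mean:", round(iid.mean(), 3))
# The short chain's estimate is pulled towards its start point, while the
# independent samples give an essentially unbiased estimate of the true mean (0).
```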
2

[en] ENERGY PRICE SIMULATION IN BRAZIL THROUGH DEMAND SIDE BIDDING / [pt] SIMULAÇÃO DOS PREÇOS DE ENERGIA NO LEILÃO DE EFICIÊNCIA ENERGÉTICA NO BRASIL

JAVIER LINKOLK LOPEZ GONZALES, 18 May 2016
Energy efficiency (EE) can be considered synonymous with environmental preservation, since the energy saved avoids the construction of new generation plants and transmission lines. The Energy Efficiency Auction (Leilão de Eficiência Energética, LEE), a form of demand-side bidding (DSB), could represent a very interesting alternative for stimulating and promoting EE practices in Brazil. However, this presupposes confidence in the amount of energy actually saved, which can only become a reality with the implementation and development of a measurement and verification (M&V) system for energy consumption. In this context, the main objective is to simulate the energy prices of the Energy Efficiency Auction in the regulated environment in order to assess whether it would be viable in Brazil. The simulations were performed with the Monte Carlo method; beforehand, a kernel method was used to fit the data to a curve through polynomials. Once the best-fitted curve was obtained, each scenario (in the different rounds) was analysed with each sample size (500, 1000, 5000 and 10000) to find the probability of prices falling within the interval between 110 and 140 reais (the optimal prices proposed in the LEE). The results show that the probability of the price falling between 110 and 140 reais is 28.20 per cent for the sample of 500, 33.00 per cent for 1000, 29.96 per cent for 5000 and 32.36 per cent for 10000.
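
As a rough illustration of the described workflow (a kernel fit to price data followed by Monte Carlo estimation of the probability that the price lies between 110 and 140 reais), the following Python sketch uses made-up placeholder prices and SciPy's gaussian_kde; the probabilities it prints are not the dissertation's results.

```python
# Sketch only: fit a kernel density to hypothetical auction prices, then use
# Monte Carlo samples from it to estimate P(110 <= price <= 140).
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(42)

# Placeholder "historical" energy prices in BRL/MWh (not the study's data).
prices = rng.normal(loc=125, scale=20, size=300)

kde = gaussian_kde(prices)  # kernel estimate of the price distribution

for n in (500, 1000, 5000, 10000):  # the sample sizes examined in the study
    sims = kde.resample(n).ravel()
    p_in_band = np.mean((sims >= 110) & (sims <= 140))
    print(f"n={n:>5}: estimated P(110 <= price <= 140) = {p_in_band:.2%}")
```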
3

Matematické modely spolehlivosti v technické praxi / Mathematical Models of Reliability in Technical Applications

Schwarzenegger, Rafael, January 2017
This thesis describes and applies parametric and nonparametric reliability models to censored data. It shows how reliability analysis is implemented within the Six Sigma methodology. The methods are applied to the survival/reliability of real technical data.
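
A minimal Python sketch of the two model families mentioned above, applied to simulated right-censored lifetimes; the data, the Weibull parameters and the use of NumPy/SciPy are illustrative assumptions, not the thesis's own implementation.

```python
# Sketch: a nonparametric Kaplan-Meier estimate and a parametric Weibull fit
# obtained by maximising the censored log-likelihood, on simulated data.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

# Simulated lifetimes with random right-censoring (placeholder data).
true_shape, true_scale = 1.8, 100.0
t_fail = true_scale * rng.weibull(true_shape, size=200)
t_cens = rng.uniform(20, 150, size=200)
time = np.minimum(t_fail, t_cens)          # observed time
event = (t_fail <= t_cens).astype(int)     # 1 = failure observed, 0 = censored

# Nonparametric: Kaplan-Meier survival estimate (no ties in continuous data).
order = np.argsort(time)
t_sorted, e_sorted = time[order], event[order]
at_risk = np.arange(len(t_sorted), 0, -1)
km = np.cumprod(1.0 - e_sorted / at_risk)

# Parametric: Weibull(shape k, scale lam) by censored maximum likelihood.
def neg_loglik(params):
    k, lam = np.exp(params)                # log scale keeps both positive
    z = time / lam
    log_f = np.log(k / lam) + (k - 1) * np.log(z) - z**k   # failures: density
    log_s = -z**k                                          # censored: survival
    return -np.sum(event * log_f + (1 - event) * log_s)

fit = minimize(neg_loglik, x0=np.log([1.0, np.mean(time)]))
k_hat, lam_hat = np.exp(fit.x)
idx = np.searchsorted(t_sorted, 100.0)
print(f"Weibull MLE: shape ~ {k_hat:.2f}, scale ~ {lam_hat:.1f}")
print(f"Kaplan-Meier survival estimate near t=100: {km[idx]:.2f}")
```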
4

Generating Evidence for COPD Clinical Guidelines Using EHRs

Amber M. Johnson, 14 August 2019
The Global Initiative for Chronic Obstructive Lung Disease (GOLD) guidelines are used to guide clinical practice for treating Chronic Obstructive Pulmonary Disease (COPD). GOLD focuses heavily on stable COPD patients, limiting its use for non-stable COPD patients such as those with severe, acute exacerbations of COPD (AECOPD) that require hospitalization. Although AECOPD can be heterogeneous, it can lead to deterioration of health and early death. Electronic health records (EHRs) can be used to analyze patient data for understanding disease progression and generating guideline evidence for AECOPD patients. However, because of its structure and representation, retrieving, analyzing, and properly interpreting EHR data can be challenging, and existing tools do not provide granular analytic capabilities for this data.

This dissertation presents, develops, and implements a novel approach that systematically captures the effect of interventions during patient medical encounters, and hence may support evidence generation for clinical guidelines in a systematic and principled way. A conceptual framework is introduced that structures components such as data storage, aggregation, extraction, and visualization to support granular EHR data analytics. We develop a software framework in Python based on these components to create longitudinal representations of raw medical data extracted from the Medical Information Mart for Intensive Care (MIMIC-III) clinical database. The software framework consists of two tools: Patient Aggregated Care Events (PACE), a novel tool for constructing and visualizing the entire medical histories of both individual patients and patient cohorts, and Mark SIM, a Markov Chain Monte Carlo modeling and simulation tool for predicting clinical outcomes through probabilistic analysis that captures granular temporal aspects of aggregated clinical data.

We assess the efficacy of antibiotic treatment and the optimal time of initiation for hospitalized AECOPD patients as an application of probabilistic modeling. We identify 697 AECOPD patients, of whom 26.0% were administered antibiotics. Our model simulations show a 50% decrease in mortality rate as the number of patients administered antibiotics increases, and an estimated 5.5% mortality rate when antibiotics are first administered after 48 hours versus 1.8% when they are first administered between 24 and 48 hours. Our findings suggest that there may be a mortality benefit in early initiation of antibiotics in ICU patients with acute respiratory failure and severe AECOPD.

Thus, we show that it is feasible to enhance the representation of EHRs to aggregate patients' entire medical histories with temporal trends and to support complex clinical questions that drive clinical guidelines for COPD.
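
The sketch below is a hypothetical, heavily simplified stand-in for the kind of Markov-chain outcome simulation the dissertation attributes to Mark SIM: a four-state patient model whose states, transition probabilities, horizon and scenario labels are invented placeholders, not values estimated from MIMIC-III.

```python
# Sketch: Monte Carlo simulation of a discrete-time Markov chain over patient
# states, comparing two hypothetical antibiotic-timing scenarios.
import numpy as np

rng = np.random.default_rng(7)

states = ["ward", "icu", "discharged", "deceased"]

# One transition matrix per hypothetical scenario (rows sum to 1).
scenarios = {
    "antibiotics <48h": np.array([
        [0.70, 0.10, 0.19, 0.01],   # from ward
        [0.25, 0.60, 0.10, 0.05],   # from ICU
        [0.00, 0.00, 1.00, 0.00],   # discharged is absorbing
        [0.00, 0.00, 0.00, 1.00],   # deceased is absorbing
    ]),
    "antibiotics >48h": np.array([
        [0.65, 0.15, 0.17, 0.03],
        [0.20, 0.62, 0.08, 0.10],
        [0.00, 0.00, 1.00, 0.00],
        [0.00, 0.00, 0.00, 1.00],
    ]),
}

def simulate_mortality(P, n_patients=5000, horizon=30):
    """Monte Carlo estimate of the fraction of patients reaching 'deceased'."""
    deaths = 0
    for _ in range(n_patients):
        s = 0                          # every simulated patient starts on the ward
        for _ in range(horizon):       # one transition per hospital day
            s = rng.choice(4, p=P[s])
            if s >= 2:                 # absorbing state reached
                break
        deaths += (s == 3)
    return deaths / n_patients

for name, P in scenarios.items():
    print(f"{name}: simulated 30-day mortality ~ {simulate_mortality(P):.1%}")
```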
5

[pt] COMPARAÇÃO DOS MÉTODOS DE QUASE-VEROSSIMILHANÇA E MCMC PARA ESTIMAÇÃO DE MODELOS DE VOLATILIDADE ESTOCÁSTICA / [en] COMPARISON OF QUASI-LIKELIHOOD AND MCMC METHODS FOR THE ESTIMATION OF STOCHASTIC VOLATILITY MODELS

EVANDRO DE FIGUEIREDO QUINAUD, 05 June 2002
This dissertation compares two estimation methods for time-series models with stochastic volatility. One method is based on Bayesian inference and relies on simulation, while the other uses maximum likelihood for the estimation process. The comparison is carried out both on artificially generated time series and on real financial series. The objective is to show that the two methods produce similar results, with the second method being significantly faster than the first.
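
For context, the canonical stochastic-volatility model that both estimation approaches target can be simulated in a few lines of Python; the parameter values below are illustrative only, and the closing comment sketches the log-squared-returns linearisation that underlies the quasi-likelihood route, not the dissertation's actual implementation.

```python
# Sketch of the standard stochastic-volatility model:
#   h_t = mu + phi * (h_{t-1} - mu) + sigma_eta * eta_t,   eta_t ~ N(0, 1)
#   y_t = exp(h_t / 2) * eps_t,                            eps_t ~ N(0, 1)
import numpy as np

rng = np.random.default_rng(3)

def simulate_sv(n, mu=-9.0, phi=0.97, sigma_eta=0.2):
    """Generate returns y and latent log-volatilities h from the SV model."""
    h = np.empty(n)
    h[0] = mu + sigma_eta / np.sqrt(1 - phi**2) * rng.normal()  # stationary start
    for t in range(1, n):
        h[t] = mu + phi * (h[t - 1] - mu) + sigma_eta * rng.normal()
    y = np.exp(h / 2) * rng.normal(size=n)
    return y, h

y, h = simulate_sv(2000)

# Quasi-likelihood estimation linearises the model as log(y_t^2) = h_t + log(eps_t^2)
# and treats log(eps_t^2) as if Gaussian so a Kalman filter can evaluate the
# likelihood; the Bayesian route instead samples h and the parameters with MCMC.
z = np.log(y**2 + 1e-12)
print("mean of log(y^2):", round(z.mean(), 2), " (approx. mu + E[log eps^2] = mu - 1.27)")
```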
