81

Bayesian and Quasi-Monte Carlo spherical integration for global illumination

Marques, Ricardo 22 October 2013 (has links) (PDF)
The spherical sampling of the incident radiance function entails a high computational cost. Therefore the illumination integral must be evaluated using a limited set of samples. Such a restriction raises the question of how to obtain the most accurate approximation possible with such a limited set of samples. In this thesis, we show that existing Monte Carlo-based approaches can be improved by fully exploiting the available information, which is then used for careful sample placement and weighting. The first contribution of this thesis is a strategy for producing high-quality Quasi-Monte Carlo (QMC) sampling patterns for spherical integration by resorting to spherical Fibonacci point sets. We show that these patterns, when applied to the rendering integral, are very simple to generate and consistently outperform existing approaches. Furthermore, we introduce theoretical aspects of QMC spherical integration that, to our knowledge, have never been used in the graphics community, such as spherical cap discrepancy and point set spherical energy. These metrics allow assessing the quality of a spherical point set for a QMC estimate of a spherical integral. In the next part of the thesis, we propose a new theoretical framework for computing the Bayesian Monte Carlo (BMC) quadrature rule. Our contribution includes a novel method of quadrature computation based on spherical Gaussian functions that can be generalized to a broad class of BRDFs (any BRDF which can be approximated by a sum of one or more spherical Gaussian functions) and potentially to other rendering applications. We account for the BRDF sharpness by using a new computation method for the prior mean function. Lastly, we propose a fast hyperparameter evaluation method that avoids the learning step. Our last contribution is the application of BMC with an adaptive approach for evaluating the illumination integral.
The idea is to compute a first BMC estimate (using a first sample set) and, if the quality criterion is not met, directly inject the result as prior knowledge on a new estimate (using another sample set). The new estimate refines the previous estimate using a new set of samples, and the process is repeated until a satisfying result is achieved.
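For context, a spherical Fibonacci point set of the kind used in the first contribution is simple to generate. The sketch below uses one common variant of the construction (the exact mapping may differ from the one in the thesis) and applies it to a QMC estimate of a toy spherical integral:

```python
import math

def spherical_fibonacci(n):
    """Generate n points on the unit sphere with a Fibonacci lattice."""
    phi = (1 + math.sqrt(5)) / 2  # golden ratio
    points = []
    for i in range(n):
        z = 1 - (2 * i + 1) / n            # symmetric heights in (-1, 1)
        theta = 2 * math.pi * i / phi      # longitude advances by the golden angle
        r = math.sqrt(max(0.0, 1 - z * z))
        points.append((r * math.cos(theta), r * math.sin(theta), z))
    return points

def qmc_sphere_integral(f, n=1024):
    """Equal-weight QMC estimate of the integral of f over the unit sphere."""
    pts = spherical_fibonacci(n)
    return 4 * math.pi * sum(f(*p) for p in pts) / n  # sphere area is 4*pi

# Example: the integral of z^2 over the unit sphere is 4*pi/3
approx = qmc_sphere_integral(lambda x, y, z: z * z)
```

With n = 1024 points the estimate should already agree closely with 4π/3, reflecting the evenness of these point sets.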
82

Numerical analysis of highly oscillatory Stochastic PDEs / Analyse numérique d'EDPS hautement oscillantes

Bréhier, Charles-Edouard 27 November 2012 (has links)
Dans une première partie, on s'intéresse à un système d'EDP stochastiques variant selon deux échelles de temps, et plus particulièrement à l'approximation de la composante lente à l'aide d'un schéma numérique efficace. On commence par montrer un principe de moyennisation, à savoir la convergence de la composante lente du système vers la solution d'une équation dite moyennée. Ensuite on prouve qu'un schéma numérique de type Euler fournit une bonne approximation d'un coefficient inconnu apparaissant dans cette équation moyennée. Finalement, on construit et on analyse un schéma de discrétisation du système à partir des résultats précédents, selon la méthodologie dite HMM (Heterogeneous Multiscale Method). On met en évidence l'ordre de convergence par rapport au paramètre d'échelle temporelle et aux différents paramètres du schéma numérique ; on étudie les convergences au sens fort (approximation des trajectoires) et au sens faible (approximation des lois). Dans une seconde partie, on étudie une méthode d'approximation de solutions d'EDP paraboliques, en combinant une approche semi-lagrangienne et une discrétisation de type Monte-Carlo. On montre d'abord dans un cas simplifié que la variance dépend des pas de discrétisation ; enfin on fournit des simulations numériques de solutions, afin de mettre en avant les applications possibles d'une telle méthode. / In a first part, we are interested in the behavior of a system of Stochastic PDEs with two time scales; more precisely, we focus on the approximation of the slow component thanks to an efficient numerical scheme. We first prove an averaging principle, which states that the slow component converges to the solution of the so-called averaged equation. We then show that a numerical scheme of Euler type provides a good approximation of an unknown coefficient appearing in the averaged equation. 
Finally, we build and analyze a discretization scheme based on the previous results, according to the HMM methodology (Heterogeneous Multiscale Method). We specify the orders of convergence with respect to the time-scale parameter and to the parameters of the numerical discretization; we study the convergence in a strong sense (approximation of the trajectories) and in a weak sense (approximation of the laws). In a second part, we study a method for approximating solutions of parabolic PDEs, which combines a semi-Lagrangian approach and a Monte-Carlo discretization. We first show in a simplified situation that the variance depends on the discretization steps. We then provide numerical simulations of solutions, in order to show some possible applications of such a method.
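The HMM construction described above can be sketched on a toy two-scale system: each macro (Euler) step of the slow variable uses a short micro-simulation of the fast variable to estimate the drift of the averaged equation. The system and all numerical parameters here are illustrative choices, not those studied in the thesis:

```python
import math, random

def hmm_step(x, y, dt, eps, micro_steps):
    """One macro step of an HMM-type scheme for a toy two-scale system.

    Fast variable: dY = -Y/eps dt + sqrt(2/eps) dW  (stationary N(0,1))
    Slow variable: dX/dt = -X + Y^2, whose averaged equation is dX/dt = -X + 1.
    """
    ddt = 0.1 * eps  # micro time step resolving the fast scale
    acc = 0.0
    for _ in range(micro_steps):
        # Euler-Maruyama on the fast Ornstein-Uhlenbeck variable
        y += -y / eps * ddt + math.sqrt(2.0 / eps * ddt) * random.gauss(0, 1)
        acc += y * y
    force = -x + acc / micro_steps  # estimated averaged drift
    return x + dt * force, y

random.seed(0)
x, y = 0.0, 0.0
for _ in range(200):
    x, y = hmm_step(x, y, dt=0.05, eps=1e-3, micro_steps=400)
# x should settle near 1, the fixed point of the averaged equation
```

The key efficiency gain is that the macro step dt need not resolve the fast scale eps; only the short micro-bursts do.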
83

Generating Evidence for COPD Clinical Guidelines Using EHRs

Amber M Johnson (7023350) 14 August 2019 (has links)
The Global Initiative for Chronic Obstructive Lung Disease (GOLD) guidelines are used to guide clinical practices for treating Chronic Obstructive Pulmonary Disease (COPD). GOLD focuses heavily on stable COPD patients, limiting its use for non-stable COPD patients such as those with severe, acute exacerbations of COPD (AECOPD) that require hospitalization. Although AECOPD can be heterogeneous, it can lead to deterioration of health and early death. Electronic health records (EHRs) can be used to analyze patient data for understanding disease progression and generating guideline evidence for AECOPD patients. However, because of its structure and representation, retrieving, analyzing, and properly interpreting EHR data can be challenging, and existing tools do not provide granular analytic capabilities for this data.

This dissertation presents, develops, and implements a novel approach that systematically captures the effect of interventions during patient medical encounters, and hence may support evidence generation for clinical guidelines in a systematic and principled way. A conceptual framework that structures components, such as data storage, aggregation, extraction, and visualization, to support EHR data analytics for granular analysis is introduced. We develop a software framework in Python based on these components to create longitudinal representations of raw medical data extracted from the Medical Information Mart for Intensive Care (MIMIC-III) clinical database. The software framework consists of two tools: Patient Aggregated Care Events (PACE), a novel tool for constructing and visualizing entire medical histories of both individual patients and patient cohorts, and Mark SIM, a Markov Chain Monte Carlo modeling and simulation tool for predicting clinical outcomes through probabilistic analysis that captures granular temporal aspects of aggregated clinical data.

We assess the efficacy of antibiotic treatment and the optimal time of initiation for in-hospital AECOPD patients as an application of probabilistic modeling. We identify 697 AECOPD patients, of which 26.0% were administered antibiotics. Our model simulations show a 50% decrease in mortality rate as the number of patients administered antibiotics increases, and an estimated 5.5% mortality rate when antibiotics are initially administered after 48 hours vs 1.8% when antibiotics are initially administered between 24 and 48 hours. Our findings suggest that there may be a mortality benefit in early initiation of antibiotics in ICU patients with severe AECOPD and acute respiratory failure.

Thus, we show that it is feasible to enhance the representation of EHRs to aggregate patients' entire medical histories with temporal trends and support complex clinical questions to drive clinical guidelines for COPD.
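The probabilistic machinery behind such a simulation tool can be illustrated with a generic discrete-time Markov chain over patient states. The states and daily transition probabilities below are invented for illustration; they are not the values estimated in the dissertation:

```python
import random

# Hypothetical three-state patient model with invented daily transition
# probabilities (illustrative only; not the values from the dissertation)
P = {
    "hospitalized": [("hospitalized", 0.85), ("discharged", 0.13), ("deceased", 0.02)],
    "discharged":   [("discharged", 1.0)],   # absorbing
    "deceased":     [("deceased", 1.0)],     # absorbing
}

def simulate(days=30):
    """Run one patient trajectory for the given number of days."""
    state = "hospitalized"
    for _ in range(days):
        r, cum = random.random(), 0.0
        for nxt, p in P[state]:
            cum += p
            if r < cum:
                state = nxt
                break
    return state

random.seed(1)
n = 10_000
mortality = sum(simulate() == "deceased" for _ in range(n)) / n
# With these invented numbers, roughly 13% of trajectories end in death
```

In a real analysis, the transition probabilities would be estimated from the aggregated EHR data, stratified by covariates such as time of antibiotic initiation.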
84

Estudos teóricos de propriedades estruturais e eletrônicas da molécula emodina em solução / Theoretical studies of structural and electronic properties of emodin molecule in solution

Cunha, Antonio Rodrigues da 14 October 2009 (has links)
Estudamos as propriedades estruturais e eletrônicas da molécula emodina (EM), em diferentes condições, do ponto de vista experimental e teórico. Numa primeira parte, realizamos medidas do espectro eletrônico de absorção da EM, em meio solvente (água, clorofórmio e metanol). Nessa parte, obtivemos que o solvente provoca pouco efeito nos deslocamentos das bandas. Numa segunda parte, estudamos a EM, isoladamente e nos três solventes, através de cálculos quânticos com funcional de densidade (B3LYP), conjunto de função base de Pople (6-31G*) e modelo contínuo polarizável (PCM). Como principais resultados obtivemos que a EM é rígida a menos da orientação relativa das 3 hidroxilas. A mudança orientacional nessas hidroxilas pode provocar formação de até 2 ligações de hidrogênio intramolecular (o que estabiliza sua geometria) e conseqüentemente uma diminuição no momento dipolo de 5.5 a 1.7 D (o que desestabiliza sua interação com a água). Numa terceira parte, realizamos simulações com método Monte Carlo e Dinâmica Molecular em solução. Nessa parte, obtivemos que as ligações de hidrogênio intramoleculares são raramente quebradas devido às interações com o solvente e isso atribui à EM um caráter hidrofóbico. Adicionalmente, utilizando Teoria de Perturbação Termodinâmica nas simulações, calculamos a variação de energia livre de solvatação da EM em partição água/clorofórmio e água/metanol e obtivemos -2.6 e -4.9 kcal/mol, respectivamente. Esse resultado está em boa concordância com o resultado experimental de -5.6 kcal/mol para partição de água/octanol. Por último, realizamos cálculos do espectro eletrônico de absorção da EM, isoladamente e nos três solventes, considerando as moléculas através dos modelos contínuo de solvente (SCRF) e explícito de solvente, com o método INDO/CIS. Nessa parte, obtivemos que o efeito do solvente é bem descrito teoricamente. 
/ We study the structural and electronic properties of emodin (EM) in different solvents from both the experimental and theoretical points of view. We started by performing measurements of the UV-Vis absorption spectrum of EM in solution (water, chloroform and methanol). Our main result is that the solvent causes little shift of the bands. In the second part of this work, we performed quantum calculations of isolated EM and in the three solutions using the density functional B3LYP, the Pople basis set 6-31G* and the polarizable continuum model (PCM). In this part, our result is that EM presents a rigid conformation apart from the orientation of its 3 hydroxyls. The change in the orientation of these hydroxyls can form up to 2 intramolecular H-bonds (which stabilizes its geometry) and causes a decrease in the dipole moment from 5.5 to 1.7 D (which destabilizes its interaction with water). In the third part of this work, we performed Monte Carlo and Molecular Dynamics simulations in solution. Our main result is that the intramolecular H-bonds are rarely broken, even in aqueous solution, and this gives EM a hydrophobic character. Additionally, using Thermodynamic Perturbation Theory in the simulations, we calculated the variation of the solvation free energy of EM in water/chloroform and water/methanol partitions and obtained -2.6 and -4.9 kcal/mol, respectively. This last result is in good agreement with the experimental result [3] of -5.6 kcal/mol for the water/octanol partition. Finally, we performed calculations of the UV-Vis absorption spectrum of isolated EM and in the three solutions, considering the molecules through the continuum solvent (SCRF) and explicit solvent models with the INDO/CIS method. In this part, we found that the solvent effect is well described theoretically.
85

Backflow and pairing wave function for quantum Monte Carlo methods

López Ríos, Pablo January 2016 (has links)
Quantum Monte Carlo (QMC) methods are a class of stochastic techniques that can be used to compute the properties of electronic systems accurately from first principles. This thesis is mainly concerned with the development of trial wave functions for QMC. An extension of the backflow transformation to inhomogeneous electronic systems is presented and applied to atoms, molecules and extended systems. The backflow transformation I have developed typically retrieves an additional 50% of the remaining correlation energy at the variational Monte Carlo level, and 30% at the diffusion Monte Carlo level; the number of parameters required to achieve a given fraction of the correlation energy does not appear to increase with system size. The expense incurred by the use of backflow transformations is investigated, and it is found to scale favourably with system size. Additionally, I propose a single wave function form for studying the electron-hole system which includes pairing effects and is capable of describing all of the relevant phases of this system. The effectiveness of this general wave function is demonstrated by applying it to a particular transition between two phases of the symmetric electron-hole bilayer, and it is found that using a single wave function form gives a more accurate physical description of the system than using a different wave function to describe each phase. Both of these developments are new, and they provide a powerful set of tools for designing accurate wave functions. Backflow transformations are particularly important for systems with repulsive interactions, while pairing wave functions are important for attractive interactions. It is possible to combine backflow and pairing to further increase the accuracy of the wave function. The wave function technology that I have developed should therefore be useful across a very wide range of problems.
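For context, the variational Monte Carlo (VMC) level mentioned above estimates the energy of a trial wave function by Metropolis sampling of |ψ|². A minimal sketch for the hydrogen atom, without backflow or pairing (this toy system is my choice for illustration, not one from the thesis):

```python
import math, random

def local_energy(r, a):
    # Trial wave function psi = exp(-a*r) for the hydrogen atom (atomic units):
    # E_L = -a^2/2 + (a - 1)/r, which is exactly -0.5 hartree when a = 1
    return -0.5 * a * a + (a - 1.0) / r

def vmc_energy(a, steps=20000, delta=0.5):
    random.seed(2)
    pos = [0.5, 0.5, 0.5]
    r = math.sqrt(sum(c * c for c in pos))
    total, count = 0.0, 0
    for step in range(steps):
        new = [c + random.uniform(-delta, delta) for c in pos]
        rn = math.sqrt(sum(c * c for c in new))
        # Metropolis: accept with probability |psi(new)/psi(old)|^2
        if random.random() < math.exp(-2 * a * (rn - r)):
            pos, r = new, rn
        if step > 1000:  # skip equilibration
            total += local_energy(r, a)
            count += 1
    return total / count

e = vmc_energy(a=1.0)  # with the exact trial state, every sample gives -0.5
```

A backflow transformation would replace the electron coordinates in the trial wave function with collective "quasiparticle" coordinates, reducing the variance of the local energy for imperfect trial states.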
86

Análise estatística de curvas de crescimento sob o enfoque clássico e Bayesiano: aplicação à dados médicos e biológicos / Statistical analysis of growth curves under the classical and Bayesian approach: application to medical and biological data

Oliveira, Breno Raphael Gomes de 16 February 2016 (has links)
Introdução: A curva de crescimento é um modelo empírico da evolução de uma quantidade ao longo do tempo. As curvas de crescimento são utilizadas em muitas disciplinas, em particular no domínio da estatística, onde há uma grande literatura sobre o assunto relacionado a modelos não lineares. Método: No desenvolvimento dessa dissertação de mestrado, foi realizado um estudo baseado em dados de crescimento nas áreas biológica e médica para comparar os dois tipos de inferência (Clássica e Bayesiana), na busca de melhores estimativas e resultados para modelos de regressão não lineares, especialmente considerando alguns modelos de crescimento introduzidos na literatura. No método Bayesiano para a modelagem não linear assumimos erros normais, uma suposição usual, e também distribuições estáveis para a variável resposta. Estudamos também alguns aspectos de robustez dos modelos de regressão não linear na presença de outliers ou observações discordantes, considerando o uso de distribuições estáveis para a resposta no lugar da suposição de normalidade habitual. Resultados e Conclusões: Na análise dos dois exemplos pode-se observar melhores ajustes quando utilizado o método Bayesiano de ajuste de modelos não lineares de curvas de crescimento. É bem sabido que, em geral, não há nenhuma forma fechada para a função densidade de probabilidade de distribuições estáveis. No entanto, sob uma abordagem Bayesiana, a utilização de uma variável aleatória latente ou auxiliar proporciona uma simplificação para obter as distribuições a posteriori relacionadas com distribuições estáveis. Esses resultados poderiam ser de grande interesse para pesquisadores e profissionais ao lidar com dados não Gaussianos. Para demonstrar a utilidade dos aspectos computacionais, a metodologia é aplicada a um exemplo relacionado com as curvas de crescimento intra-uterino para prematuros. Resumos a posteriori de interesse são obtidos utilizando métodos MCMC (Markov Chain Monte Carlo) e o software OpenBugs. 
/ Introduction: The growth curve is an empirical model of the evolution of a quantity over time. Growth curves are used in many disciplines, particularly in the field of statistics, where there is a large literature on the subject related to nonlinear models. Method: In the development of this dissertation, a study based on growth data from the biological and medical areas was conducted to compare two types of inference (Classical and Bayesian), in search of better estimates and results for nonlinear regression models, especially considering some growth models introduced in the literature. In the Bayesian method for nonlinear modeling we assume normal errors, the usual assumption, and also stable distributions for the response variable. We also study some robustness aspects of nonlinear regression models in the presence of outliers or discordant observations, considering the use of stable distributions for the response in place of the usual assumption of normality. Results and Conclusions: In the analysis of the two examples, better fits were obtained using the Bayesian methodology for nonlinear growth curve models. It is well known that, in general, there is no closed form for the probability density function of stable distributions. However, under a Bayesian approach, the use of a latent or auxiliary random variable provides a simplification for obtaining the conditional posterior distributions related to stable distributions. These results could be of great interest to researchers and practitioners when dealing with non-Gaussian data. To demonstrate the utility of the computational aspects, the methodology is also applied to an example related to intrauterine growth curves for premature infants. Posterior summaries of interest are obtained using MCMC (Markov Chain Monte Carlo) methods and the OpenBugs software.
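A minimal sketch of the kind of Bayesian nonlinear fit described above: random-walk Metropolis over the parameters of a logistic growth curve with Gaussian errors, on synthetic data. The curve, flat priors, and tuning constants are illustrative; the dissertation additionally considers stable error distributions and carries out the analysis in OpenBugs:

```python
import math, random

random.seed(3)
# Synthetic data from a logistic growth curve (A=10, k=1, t0=5) plus noise
true_curve = lambda t: 10.0 / (1.0 + math.exp(-(t - 5.0)))
data = [(t, true_curve(t) + random.gauss(0, 0.3)) for t in range(11)]

def log_post(A, k, t0, sigma=0.3):
    # Gaussian log-likelihood with flat priors (a real analysis would use
    # proper priors, and possibly a stable error distribution)
    ll = 0.0
    for t, y in data:
        m = A / (1.0 + math.exp(-k * (t - t0)))
        ll -= (y - m) ** 2 / (2 * sigma ** 2)
    return ll

# Random-walk Metropolis over (A, k, t0)
theta = [8.0, 0.5, 4.0]
lp = log_post(*theta)
samples = []
for i in range(20000):
    prop = [x + random.gauss(0, 0.05) for x in theta]
    lp_new = log_post(*prop)
    if random.random() < math.exp(min(0.0, lp_new - lp)):  # accept/reject
        theta, lp = prop, lp_new
    if i > 5000:  # discard burn-in
        samples.append(theta[0])
A_mean = sum(samples) / len(samples)  # posterior mean of the asymptote A
```

The posterior mean of A should recover the true asymptote (10) up to sampling noise.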
88

Modelling Long-Term Persistence in Hydrological Time Series

Thyer, Mark Andrew January 2001 (has links)
The hidden state Markov (HSM) model is introduced as a new conceptual framework for modelling long-term persistence in hydrological time series. Unlike the stochastic models currently used, the conceptual basis of the HSM model can be related to the physical processes that influence long-term hydrological time series in the Australian climatic regime. A Bayesian approach was used for model calibration. This enabled rigorous evaluation of parameter uncertainty, which proved crucial for the interpretation of the results. Applying the single site HSM model to rainfall data from selected Australian capital cities provided some revealing insights. In eastern Australia, where there is a significant influence from the tropical Pacific weather systems, the results showed a weak wet and medium dry state persistence was likely to exist. In southern Australia the results were inconclusive. However, they suggested a weak wet and strong dry persistence structure may exist, possibly due to the infrequent incursion of tropical weather systems in southern Australia. This led to the postulate that the tropical weather systems are the primary cause of two-state long-term persistence. The single and multi-site HSM model results for the Warragamba catchment rainfall data supported this hypothesis. A strong two-state persistence structure was likely to exist in the rainfall regime of this important water supply catchment. In contrast, the single and multi-site results for the Williams River catchment rainfall data were inconsistent. This illustrates that further work is required to understand the application of the HSM model. Comparisons with the lag-one autoregressive [AR(1)] model showed that it was not able to reproduce the same long-term persistence as the HSM model. However, with record lengths typical of real data the difference between the two approaches was not statistically significant. 
Nevertheless, it was concluded that the HSM model provides a conceptually richer framework than the AR(1) model. / PhD Doctorate
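The generative side of a two-state hidden state Markov model of annual rainfall can be sketched as follows. The persistence probabilities and rainfall statistics are invented for illustration, not the calibrated values from the thesis:

```python
import random

random.seed(4)
# Two hidden climate states with invented persistence and rainfall statistics
P_STAY = {"wet": 0.90, "dry": 0.95}     # annual self-transition probabilities
MEAN = {"wet": 900.0, "dry": 600.0}     # mean annual rainfall (mm) per state
SD = 100.0                              # within-state standard deviation (mm)

def simulate_rainfall(years):
    """Simulate annual rainfall driven by a two-state Markov chain."""
    state, series = "dry", []
    for _ in range(years):
        if random.random() > P_STAY[state]:           # switch hidden state
            state = "wet" if state == "dry" else "dry"
        series.append(random.gauss(MEAN[state], SD))  # emit annual rainfall
    return series

series = simulate_rainfall(10_000)
mean_rain = sum(series) / len(series)
# Stationary weights are 1/3 wet, 2/3 dry, so the long-run mean is near 700 mm
```

Calibration runs in the opposite direction: given an observed series, Bayesian inference recovers the posterior of the transition and emission parameters, which is where the parameter uncertainty discussed above becomes critical.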
89

Predominant magnetic states in the Hubbard model on anisotropic triangular lattices

Watanabe, T., Yokoyama, H., Tanaka, Y., Inoue, J. 06 1900 (has links)
No description available.
90

Coupled flow systems, adjoint techniques and uncertainty quantification

Garg, Vikram Vinod, 1985- 25 October 2012 (has links)
Coupled systems are ubiquitous in modern engineering and science. Such systems can encompass fluid dynamics, structural mechanics, chemical species transport and electrostatic effects among other components, all of which can be coupled in many different ways. In addition, such models are usually multiscale, making their numerical simulation challenging, and necessitating the use of adaptive modeling techniques. The multiscale, multiphysics models of electroosmotic flow (EOF) constitute a particularly challenging coupled flow system. A special feature of such models is that the coupling between the electric physics and hydrodynamics is via the boundary. Numerical simulations of coupled systems are typically targeted towards specific Quantities of Interest (QoIs). Adjoint-based approaches offer the possibility of QoI-targeted adaptive mesh refinement and efficient parameter sensitivity analysis. The formulation of appropriate adjoint problems for EOF models is particularly challenging, due to the coupling of physics via the boundary as opposed to the interior of the domain. The well-posedness of the adjoint problem for such models is also non-trivial. One contribution of this dissertation is the derivation of an appropriate adjoint problem for slip EOF models, and the development of penalty-based, adjoint-consistent variational formulations of these models. We demonstrate the use of these formulations in the simulation of EOF flows in straight and T-shaped microchannels, in conjunction with goal-oriented mesh refinement and adjoint sensitivity analysis. Complex computational models may exhibit uncertain behavior due to various reasons, ranging from uncertainty in experimentally measured model parameters to imperfections in device geometry. The last decade has seen a growing interest in the field of Uncertainty Quantification (UQ), which seeks to determine the effect of input uncertainties on the system QoIs. 
Monte Carlo methods remain a popular computational approach for UQ due to their ease of use and "embarrassingly parallel" nature. However, a major drawback of such methods is their slow convergence rate. The second contribution of this work is the introduction of a new Monte Carlo method which utilizes local sensitivity information to build accurate surrogate models. This new method, called the Local Sensitivity Derivative Enhanced Monte Carlo (LSDEMC) method, can converge at a faster rate than plain Monte Carlo, especially for problems with a low to moderate number of uncertain parameters. Adjoint-based sensitivity analysis methods enable the computation of sensitivity derivatives at virtually no extra cost after the forward solve. Thus, the LSDEMC method, in conjunction with adjoint sensitivity derivative techniques, can offer a robust and efficient alternative for UQ of complex systems. The efficiency of Monte Carlo methods can be further enhanced by using stratified sampling schemes such as Latin Hypercube Sampling (LHS). However, the non-incremental nature of LHS has been identified as one of the main obstacles in its application to certain classes of complex physical systems. Current incremental LHS strategies restrict the user to at least doubling the size of an existing LHS set to retain the convergence properties of LHS. The third contribution of this research is the development of a new Hierarchical LHS algorithm that creates designs which can be used to perform LHS studies in a more flexible, incremental setting, taking a step towards adaptive LHS methods. / text
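Plain (non-incremental) Latin Hypercube Sampling, the baseline that the hierarchical algorithm extends, can be sketched in a few lines: each dimension is split into n equal strata, and a random permutation assigns exactly one sample to each stratum:

```python
import random

def latin_hypercube(n, d, rng=None):
    """n stratified samples in [0,1)^d: each dimension is divided into n
    equal bins, and a random permutation places one sample in each bin."""
    rng = rng or random.Random(5)
    samples = [[0.0] * d for _ in range(n)]
    for j in range(d):
        perm = list(range(n))
        rng.shuffle(perm)
        for i in range(n):
            samples[i][j] = (perm[i] + rng.random()) / n
    return samples

# Stratification keeps each marginal close to uniform, so a smooth
# integrand converges faster than with plain Monte Carlo.
pts = latin_hypercube(100, 2)
est = sum(x + y for x, y in pts) / len(pts)  # E[x + y] over the unit square is 1
```

The incremental difficulty is visible here: adding one more sample would break the one-sample-per-stratum property, which is why naive refinement requires doubling the design.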