  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
91

Skill Evaluation in Women's Volleyball

Florence, Lindsay Walker 11 March 2008 (has links) (PDF)
The Brigham Young University Women's Volleyball Team recorded and rated all skills (pass, set, attack, etc.) and recorded rally outcomes (point for BYU, rally continues, point for opponent) for the entire 2006 home volleyball season. Only sequences of events occurring on BYU's side of the net were considered. Events followed one of these general patterns: serve-outcome, pass-set-attack-outcome, or block-dig-set-attack-outcome. These sequences of events were assumed to be first-order Markov chains, where the quality of each contact depended only on the quality of the previous contact and not on contacts further removed in the sequence. We represented these sequences in an extensive matrix of transition probabilities, whose elements were the probabilities of moving from one state to another. The count matrix recorded the number of times play moved from one transition state to another during the season. Data in the count matrix were assumed to follow a multinomial distribution. A Dirichlet prior was formulated for each row of the count matrix, so posterior estimates of the transition probabilities were available using Gibbs sampling. The different paths in the transition probability matrix were followed through the possible sequences of events at each step of the MCMC process to compute the posterior probability density that a perfect pass results in a point, that a perfect set results in a point, and so forth. These posterior probability densities were then used to address questions about skill performance in BYU women's volleyball.
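The Dirichlet-multinomial setup described above is conjugate, so each row of the transition matrix has a Dirichlet posterior that can be sampled directly (the thesis embeds this inside a larger Gibbs sampler that follows whole contact sequences). A minimal sketch, with made-up counts rather than BYU's data, and collapsing the pass-set-attack paths down to pass quality versus rally outcome:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical season counts: rows = pass quality, cols = eventual rally outcome.
# Values are illustrative, not the actual BYU data.
outcomes = ["point", "rally_continues", "opponent_point"]
counts = np.array([
    [120, 30, 40],   # after a perfect pass
    [ 70, 45, 55],   # after an ok pass
    [ 25, 35, 80],   # after a poor pass
])

# Multinomial likelihood + flat Dirichlet(1,...,1) prior on each row gives a
# Dirichlet posterior per row, which we can sample directly.
alpha = np.ones(counts.shape[1])
draws = np.stack([rng.dirichlet(alpha + row, size=5000) for row in counts])

# Posterior distribution of P(point | perfect pass), with a 95% interval.
p_point = draws[0, :, outcomes.index("point")]
print(p_point.mean(), np.quantile(p_point, [0.025, 0.975]))
```

In the full model, each posterior draw of the whole matrix is propagated along the serve/pass/set/attack paths to get quantities like "a perfect set results in a point".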
92

When Infinity is Too Long to Wait: On the Convergence of Markov Chain Monte Carlo Methods

Olsen, Andrew Nolan 08 October 2015 (has links)
No description available.
93

Credit Derivatives Pricing: A Markov Chain Model

林明宗 Unknown Date (has links)
Credit derivatives are contracts for transferring credit risk, signed between a protection buyer and a protection seller: the buyer pays a premium (either as a lump sum or in installments) in exchange for credit protection, and the seller compensates the buyer when a specified credit event occurs. In the wake of repeated financial crises, the Basel Committee on Banking Supervision has issued a new Basel Accord requiring banks to strengthen credit risk control and diversification, and credit derivatives serve exactly this function of transferring and offsetting credit risk. This thesis uses a conditional Markov chain to construct pricing models for credit default swaps and nth-to-default swaps, and computes each product's spread by simulation. With parameters estimated from real-world data, the model can simulate a variety of scenarios and inform hedging strategies. The thesis also discusses how a Gibbs sampler can improve the simulation of the conditional Markov chain in order to model contagion effects within the underlying asset portfolio.
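The core of this kind of pricing is simulating default times from a Markov chain over credit states. A minimal sketch of the independent-names baseline (no contagion, no Gibbs refinement, and an entirely made-up transition matrix), estimating the probability that a first-to-default event occurs within the horizon:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative one-year rating transition matrix (states: A, B, D = default).
# Numbers are invented for the sketch, not calibrated to market data.
P = np.array([
    [0.90, 0.08, 0.02],
    [0.10, 0.80, 0.10],
    [0.00, 0.00, 1.00],   # default is absorbing
])

def simulate_default_year(start, horizon, rng):
    """Return the year of default (1-based) within the horizon, or None."""
    state = start
    for year in range(1, horizon + 1):
        state = rng.choice(3, p=P[state])
        if state == 2:
            return year
    return None

# First-to-default on a basket of 5 independent B-rated names over 5 years.
n_paths, horizon = 5000, 5
hits = 0
for _ in range(n_paths):
    times = [simulate_default_year(1, horizon, rng) for _ in range(5)]
    if any(t is not None for t in times):
        hits += 1
print("P(first-to-default within 5y) ~", hits / n_paths)
```

Pricing would then discount the protection and premium legs along each simulated path; the conditional Markov chain of the thesis makes the names' transition rates depend on each other's states, which is where contagion enters.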
94

Critical slowing down and error analysis of lattice QCD simulations

Virotta, Francesco 07 May 2012 (has links)
In this work we investigate the critical slowing down of lattice QCD simulations. We perform a preliminary study in the quenched approximation, where we find that our estimate of the exponential autocorrelation time scales as $\tau_{\mathrm{exp}}(a) \sim a^{-5}$, where $a$ is the lattice spacing. In unquenched simulations with O(a)-improved Wilson fermions we do not obtain a scaling law, but find results compatible with the behavior seen in the pure gauge theory. The discussion is supported by a large set of ensembles, both in pure gauge theory and in the theory with two degenerate sea quarks. We have moreover investigated the effect of slow algorithmic modes on the error analysis of expectation values of typical lattice QCD observables (hadronic matrix elements and masses). For simulations affected by slow modes we propose and test a method to obtain reliable estimates of statistical errors. The method is meant to help in the typical setup of lattice QCD, namely when the total statistics collected is of O(10) $\tau_{\mathrm{exp}}$. This is the typical case when simulating close to the continuum limit, where the computational cost of producing two independent data points can be extremely large. We finally discuss scale setting in $N_f = 2$ simulations using the kaon decay constant $f_K$ as physical input. The method is explained together with a thorough discussion of the error analysis employed. A description of the publicly available code used for the error analysis is included.
95

[en] ENERGY PRICE SIMULATION IN BRAZIL THROUGH DEMAND SIDE BIDDING

JAVIER LINKOLK LOPEZ GONZALES 18 May 2016 (has links)
Energy Efficiency (EE) can be regarded as synonymous with environmental preservation, since the energy saved avoids the construction of new generating plants and transmission lines. A Demand-Side Bidding (DSB) auction could be a very attractive way to stimulate and promote EE practices in Brazil. This presupposes, however, confidence in the amount of energy actually saved, which can only be achieved through the implementation and development of a Measurement and Verification (M&V) system for energy consumption. In this context, the main objective is to simulate the energy prices of the DSB auction in the regulated environment, in order to assess whether it would be viable in Brazil. Monte Carlo simulation was used; beforehand, a kernel method was applied to fit a polynomial curve to the data. With the best-fitting curve in hand, each scenario (over the different rounds) was analyzed at each sample size (500, 1000, 5000, and 10000) to find the probability of prices falling in the range of 110 to 140 reais (the optimal prices proposed for the DSB auction). The results show that this probability is 28.20 percent with 500 samples, 33.00 percent with 1000, 29.96 percent with 5000, and 32.36 percent with 10000.
96

Stochastic Optimization Methods for Infrastructure Management with Incomplete Monitoring Data

Nam, Le Thanh 24 September 2009 (has links)
Kyoto University (京都大学) / Doctor of Engineering, Graduate School of Engineering, Department of Urban Management / Degree No. 14919 / Thesis committee: Prof. Kiyoshi Kobayashi (chair), Prof. Hiroyasu Ohtsu, Prof. Hirotaka Kawano
97

Irreversible parallel dynamics in statistical mechanics

Mariani, Riccardo 12 December 2018 (has links)
In this thesis we present theoretical and numerical approaches to two irreversible parallel dynamics on one-dimensional statistical mechanics models. In the first chapter we present theoretical results on a particle system driven by an irreversible Markov chain, namely the totally asymmetric simple exclusion process (TASEP). Allowing multiple spin flips in each time step, we define a model with a parallel dynamics that belongs to the family of probabilistic cellular automata (PCA) and derive its stationary measure. In this framework we address the blockage problem, i.e., understanding the effect of a localized perturbation of the particles' transition rates on an irreversible system. In the second chapter we present a one-dimensional version of the Ising model with Kac potential. Again we define a PCA dynamics, with asymmetric interaction between particles, and find its stationary measure under periodic boundary conditions. We then prove convergence, in the thermodynamic limit, of this stationary measure to the Gibbs measure for all temperatures above the critical one, via Föllmer estimates and Dobrushin's uniqueness theorem. In the second part of the thesis we investigate these two dynamics through numerical experiments. In the case of the TASEP, we exploit general-purpose graphics processing units (GPGPU), writing a parallel code in CUDA, to identify a reasonable mixing time and to reinforce the conjecture that, under both the serial and the parallel update rule, the current may be non-analytic in the blockage intensity around the value $\varepsilon = 0$.
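The parallel-update TASEP with a blockage is easy to sketch on a CPU: every particle whose right neighbor is empty (in the configuration at time t) attempts to jump simultaneously, and one bond is crossed with a reduced probability. A minimal vectorized sketch measuring the current through the blockage, with parameters chosen for illustration only:

```python
import numpy as np

rng = np.random.default_rng(4)

L, p, eps = 256, 0.5, 0.3          # ring size, jump prob, blockage intensity
steps, burn_in = 20_000, 2_000

# Ring at half filling; the bond (L-1) -> 0 is the blockage, crossed with
# probability p*(1-eps) instead of p.
occ = np.zeros(L, dtype=bool)
occ[rng.choice(L, size=L // 2, replace=False)] = True

crossings = 0
for step in range(steps):
    ahead_empty = occ & ~np.roll(occ, -1)      # decided on the config at time t
    jump = ahead_empty & (rng.random(L) < p)
    jump[L - 1] &= rng.random() < (1 - eps)    # thin the blockage bond
    occ = (occ & ~jump) | np.roll(jump, 1)     # parallel update; no conflicts,
                                               # since a target site was empty at t
    if step >= burn_in and jump[L - 1]:
        crossings += 1

current = crossings / (steps - burn_in)
print("current through the blockage ~", current)
```

The thesis's CUDA code does the same update across many independent rings in parallel; the conjecture concerns how this steady-state current behaves as the blockage intensity approaches zero.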
98

Data-driven test case design of automatic test cases using Markov chains and a Markov chain Monte Carlo method

Lindahl, John, Persson, Douglas January 2021 (has links)
Large and complex software that is frequently changed leads to testing challenges. It is well established that the later a fault is detected in software development, the more it costs to fix. This thesis aims to research and develop a method of generating relevant and non-redundant test cases for a regression test suite, to catch bugs as early in the development process as possible. The research was executed at Axis Communications AB with their products and systems in mind. The approach utilizes user data to dynamically generate a Markov chain model and with a Markov chain Monte Carlo method, strengthen that model. The model generates test case proposals, detects test gaps, and identifies redundant test cases based on the user data and data from a test suite. The sampling in the Markov chain Monte Carlo method can be modified to bias the model for test coverage or relevancy. The model is generated generically and can therefore be implemented in other API-driven systems. The model was designed with scalability in mind and further implementations can be made to increase the complexity and further specialize the model for individual needs.
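The first half of the approach, building a Markov chain from user data and walking it to propose test cases, can be sketched in a few lines. The API names and logs below are invented for illustration (they are not Axis's systems), and the MCMC step that biases the model toward coverage or relevancy is omitted:

```python
import random
from collections import defaultdict

random.seed(5)

# Hypothetical API-call logs from user sessions; names are illustrative.
sessions = [
    ["login", "list_cameras", "get_stream", "logout"],
    ["login", "list_cameras", "get_config", "set_config", "logout"],
    ["login", "get_stream", "get_stream", "logout"],
    ["login", "list_cameras", "get_stream", "get_config", "logout"],
]

# Estimate first-order transition counts from the logs.
counts = defaultdict(lambda: defaultdict(int))
for s in sessions:
    for a, b in zip(["START"] + s, s + ["END"]):
        counts[a][b] += 1

def sample_test_case(max_len=20):
    """Walk the chain from START to END to propose one test case."""
    state, case = "START", []
    while state != "END" and len(case) < max_len:
        nxt = counts[state]
        state = random.choices(list(nxt), weights=nxt.values())[0]
        if state != "END":
            case.append(state)
    return case

print(sample_test_case())
```

A test gap then shows up as a transition with user-data mass but no covering test case, and a redundant test case as one whose transitions are already covered; the MCMC resampling reweights the walk toward whichever of those criteria matters.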
99

Measuring Skill Importance in Women's Soccer and Volleyball

Allan, Michelle L. 11 March 2009 (has links) (PDF)
The purpose of this study is to demonstrate how to measure skill importance for two sports: soccer and volleyball. A division I women's soccer team filmed each home game during a competitive season. Every defensive, dribbling, first touch, and passing skill was rated and recorded for each team. It was noted whether each sequence of plays led to a successful shot. A hierarchical Bayesian logistic regression model is implemented to determine how the performance of the skill affects the probability of a successful shot. A division I women's volleyball team rated each skill (serve, pass, set, etc.) and recorded rally outcomes during home games in a competitive season. The skills were only rated when the ball was on the home team's side of the net. Events followed one of these three patterns: serve-outcome, pass-set-attack-outcome, or dig-set-attack-outcome. We analyze the volleyball data using two different techniques, Markov chains and Bayesian logistic regression. These sequences of events are assumed to be first-order Markov chains. This means the quality of the current skill only depends on the quality of the previous skill. The count matrix is assumed to follow a multinomial distribution, so a Dirichlet prior is used to estimate each row of the count matrix. Bayesian simulation is used to produce the unconditional posterior probability (e.g., a perfect serve results in a point). The volleyball logistic regression model uses a Bayesian approach to determine how the performance of the skill affects the probability of a successful outcome. The posterior distributions produced from each of the models are used to calculate importance scores. The soccer data importance scores revealed that passing, first touch, and dribbling skills are the most important to the primary team. The Markov chain model for the volleyball data indicates setting 3–5 feet off the net increases the probability of a successful outcome. 
The logistic regression model for the volleyball data reveals that serves have a high importance score because of their steep slope. Importance scores can be used to assist coaches in allocating practice time, developing new strategies, and analyzing each player's skill performance.
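The Bayesian logistic regression half of the study can be sketched with a random-walk Metropolis sampler. Everything below is a stand-in: synthetic ratings and outcomes rather than the study's data, and wide normal priors chosen for illustration:

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic data: skill rating (0-4) and shot success (0/1); illustrative only.
n = 400
rating = rng.integers(0, 5, size=n)
true_b0, true_b1 = -2.0, 0.8
y = rng.random(n) < 1 / (1 + np.exp(-(true_b0 + true_b1 * rating)))

def log_post(b0, b1):
    """Log posterior: Bernoulli likelihood + N(0, 10^2) priors on b0, b1."""
    eta = b0 + b1 * rating
    ll = np.sum(y * eta - np.log1p(np.exp(eta)))
    return ll - (b0**2 + b1**2) / (2 * 10**2)

# Random-walk Metropolis over (b0, b1).
chain = np.empty((20_000, 2))
b, lp = np.array([0.0, 0.0]), log_post(0.0, 0.0)
for i in range(len(chain)):
    prop = b + rng.normal(0, 0.15, size=2)
    lp_prop = log_post(*prop)
    if np.log(rng.random()) < lp_prop - lp:   # accept with prob min(1, ratio)
        b, lp = prop, lp_prop
    chain[i] = b

b0_hat, b1_hat = chain[5_000:].mean(axis=0)   # discard burn-in
print(b0_hat, b1_hat)
```

The posterior draws of the slope are what feed an importance score: a steep, reliably positive slope means performing the skill well moves the success probability a lot, which matches the abstract's remark about serves.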
100

Branching Out with Mixtures: Phylogenetic Inference That’s Not Afraid of a Little Uncertainty / Förgreningar med mixturer: Fylogenetisk inferens som inte räds lite osäkerhet

Molén, Ricky January 2023 (has links)
Phylogeny, the study of evolutionary relationships among species and other taxa, plays a crucial role in understanding the history of life. Bayesian analysis using Markov chain Monte Carlo (MCMC) is a widely used approach for inferring phylogenetic trees, but it converges slowly, particularly in higher dimensions. This thesis explores variational inference (VI), a methodology believed to improve both the speed and the accuracy of phylogenetic inference. VI models are known, however, to concentrate the density of the learned approximation in high-likelihood areas. The thesis evaluates the current state of variational Bayesian phylogenetic inference (VBPI) and proposes a mixture-of-components approximation to improve VBPI's performance on complex datasets and multimodal latent spaces. It also covers the basics of phylogenetics to provide a comprehensive understanding of the field.
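Why a mixture helps with multimodal posteriors can be seen in one dimension by comparing ELBO values, since the ELBO equals minus the KL divergence from the approximation to the (normalized) target. A toy sketch, far simpler than VBPI's tree-structured model: the target is bimodal, and the mixture family can match it exactly while a single Gaussian cannot:

```python
import numpy as np

rng = np.random.default_rng(7)

# Bimodal 1-D target standing in for a multimodal posterior:
# p(x) = 0.5*N(-3, 1) + 0.5*N(+3, 1), as a normalized log density.
def log_p(x):
    return (np.logaddexp(-0.5 * (x - 3) ** 2, -0.5 * (x + 3) ** 2)
            - 0.5 * np.log(2 * np.pi) - np.log(2))

# Candidate 1: a single wide Gaussian N(0, 3^2).
def log_q_single(x):
    return -0.5 * (x / 3) ** 2 - np.log(3) - 0.5 * np.log(2 * np.pi)

def sample_single(n):
    return rng.normal(0.0, 3.0, size=n)

# Candidate 2: an equal-weight mixture of N(-3, 1) and N(+3, 1),
# which happens to coincide with the target exactly.
log_q_mix = log_p

def sample_mix(n):
    centers = rng.choice([-3.0, 3.0], size=n)
    return centers + rng.normal(0.0, 1.0, size=n)

def elbo(sampler, log_q, n=50_000):
    """Monte Carlo estimate of E_q[log p(x) - log q(x)] = -KL(q || p)."""
    x = sampler(n)
    return np.mean(log_p(x) - log_q(x))

elbo_single = elbo(sample_single, log_q_single)
elbo_mix = elbo(sample_mix, log_q_mix)
print(elbo_single, elbo_mix)
```

The single Gaussian pays a KL penalty for smearing mass over the valley between the modes, while the mixture's ELBO sits at zero; the thesis pursues the same effect with mixtures over VBPI's variational family.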
