211

Exploring stellar magnetic activities with Bayesian inference

Ikuta, Kai 23 March 2021
Kyoto University / New degree system, doctoral program / Doctor of Science / Kō No. 23006 / Rihaku No. 4683 / Shinsei||Ri||1672 (University Library) / Division of Physics and Astronomy, Graduate School of Science, Kyoto University / (Chief examiner) Associate Professor Daisaku Nogami, Professor Kiyoshi Ichimoto, Professor Kouji Ohta / Qualified under Article 4, Paragraph 1 of the Degree Regulations / Doctor of Science / Kyoto University / DFAM
212

Bayesian Learning in Computational Rheology: Applications to Soft Tissues and Polymers

Kedari, Sayali Ravindra 23 May 2022
No description available.
213

Bayesian Music Alignment

Maezawa, Akira 23 March 2015
Kyoto University / 0048 / New degree system, doctoral program / Doctor of Informatics / Kō No. 19106 / Jōhaku No. 552 / Shinsei||Jō||98 (University Library) / 32057 / Department of Intelligence Science and Technology, Graduate School of Informatics, Kyoto University / (Chief examiner) Professor Tatsuya Kawahara, Professor Toshiyuki Tanaka, Lecturer Kazuyoshi Yoshii / Qualified under Article 4, Paragraph 1 of the Degree Regulations / Doctor of Informatics / Kyoto University / DFAM
214

Approximate Bayesian Inference based on Dense Matrices and New Features using INLA

Abdul Fattah, Esmail 30 July 2023
The Integrated Nested Laplace Approximations (INLA) method has become a commonly used tool for researchers and practitioners performing approximate Bayesian inference in various fields of application. As applications grow more complex, it has become essential to expand the method's capabilities with more features. In this dissertation, we contribute to the INLA method in several respects. First, we present a new framework, INLA+, based on dense matrices for performing approximate Bayesian inference. An application of the new approach is fitting disease-mapping models for count data with complex interactions. When the precision matrix is dense, the new approach scales better than the existing INLA method and exploits the multiprocessor power of today's shared- and distributed-memory architectures. Second, we propose an adaptive technique to improve gradient estimation in INLA's convex gradient-based optimization framework: a simple limited-memory method that improves the accuracy of the numerical gradient of the hyperparameters' marginal posterior by exploiting a coordinate transformation of the gradient and the history of previously taken descent directions. Third, we extend the Besag model, a Bayesian spatial model commonly utilized in disease mapping, into a non-stationary spatial model that allows spatial dependency to vary among a predetermined number of sub-regions. The model incorporates multiple precision parameters, which enable different intensities of spatial dependence in each sub-region. To avoid overfitting and enhance generalization, we derive a joint penalized complexity prior for these parameters. Together, these contributions expand the capabilities of the INLA method, improving its scalability, accuracy, and flexibility for a wider range of applications.
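For readers unfamiliar with the method, the building block underlying INLA is the Laplace approximation: a posterior density is approximated by a Gaussian centered at its mode, with variance given by the inverse curvature of the negative log-posterior there. The sketch below illustrates this on a toy Poisson model with a Gaussian prior on the log-rate; it is a generic illustration, not code from the dissertation, and the data and names are assumptions.

```python
# A minimal sketch of the Laplace approximation, the building block of INLA:
# approximate p(eta | y) by a Gaussian at its mode. Toy Poisson model with a
# Gaussian prior on the log-rate eta; data and prior are assumed for illustration.
import numpy as np
from scipy.optimize import minimize_scalar

y = np.array([3, 5, 4, 6, 2])        # toy Poisson counts (assumption)
mu0, tau = 0.0, 1.0                  # Gaussian prior on eta = log(rate)

def neg_log_post(eta):
    # -log p(eta | y) up to a constant: Poisson log-likelihood + Gaussian prior
    return -(np.sum(y * eta - np.exp(eta)) - 0.5 * tau * (eta - mu0) ** 2)

eta_hat = minimize_scalar(neg_log_post).x   # posterior mode

# Curvature at the mode by central finite differences -> Gaussian variance.
h = 1e-4
curv = (neg_log_post(eta_hat + h) - 2.0 * neg_log_post(eta_hat)
        + neg_log_post(eta_hat - h)) / h ** 2
sd = 1.0 / np.sqrt(curv)

print(f"Laplace approximation: eta ~ N({eta_hat:.3f}, {sd:.3f}^2)")
```

INLA nests this approximation inside a numerical integration over the hyperparameters; the dense-matrix INLA+ framework described above changes how the underlying linear algebra is organized, not this basic idea.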
215

Study on method for estimating phase coupling functions from time-series data and its application to human gait motion data

Arai, Takahiro 23 March 2023
Kyoto University / New degree system, doctoral program / Doctor of Informatics / Kō No. 24739 / Jōhaku No. 827 / Shinsei||Jō||139 (University Library) / Department of Advanced Mathematical Sciences, Graduate School of Informatics, Kyoto University / (Chief examiner) Professor Toshio Aoyagi, Professor Satoshi Taguchi, Associate Professor Jun-nosuke Teramae / Qualified under Article 4, Paragraph 1 of the Degree Regulations / Doctor of Informatics / Kyoto University / DGAM
216

Investigating Convergence of Markov Chain Monte Carlo Methods for Bayesian Phylogenetic Inference

Spade, David Allen 29 August 2013
No description available.
217

Improving Machine Learning Through Oracle Learning

Menke, Joshua Ephraim 12 March 2007
The following dissertation presents a new paradigm for improving the training of machine learning algorithms, oracle learning. The main idea in oracle learning is that instead of training directly on a set of data, a learning model is trained to approximate a given oracle's behavior on a set of data. This can be beneficial in situations where it is easier to obtain an oracle than it is to use it at application time. It is shown that oracle learning can be applied to more effectively reduce the size of artificial neural networks, to more efficiently take advantage of domain experts by approximating them, and to adapt a problem more effectively to a machine learning algorithm.
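The oracle-learning idea lends itself to a compact illustration: rather than fitting a small model to the raw labels, fit it to a larger model's predictions on the same inputs. The sketch below uses scikit-learn with a random forest standing in for the oracle and a small neural network as the student; all data and model choices are assumptions for illustration, not the dissertation's actual experimental setup (which used artificial neural networks as both oracle and student).

```python
# Illustrative sketch of oracle learning: instead of fitting the small model
# to the raw labels, fit it to the oracle's outputs on the same inputs.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(2000, 4))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + rng.normal(0, 0.3, 2000)  # noisy labels

oracle = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, y)

# Oracle learning: the small network approximates the oracle's behavior,
# a smoother (denoised) target than the raw labels themselves.
small_direct = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000,
                            random_state=0).fit(X, y)
small_oracle = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000,
                            random_state=0).fit(X, oracle.predict(X))

X_test = rng.uniform(-3, 3, size=(500, 4))
y_true = np.sin(X_test[:, 0]) + 0.5 * X_test[:, 1] ** 2  # noiseless truth
for name, m in [("direct", small_direct), ("oracle-trained", small_oracle)]:
    mse = np.mean((m.predict(X_test) - y_true) ** 2)
    print(f"{name:15s} test MSE: {mse:.4f}")
```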
218

Simulation and Calibration of Uncertain Space Fractional Diffusion Equations

Alzahrani, Hasnaa H. 10 January 2023
Fractional diffusion equations have played an increasingly important role in explaining long-range interactions, nonlocal dynamics, and anomalous diffusion, providing effective means of describing the memory and hereditary properties of such processes. This dissertation explores uncertainty propagation in space fractional diffusion equations in one and multiple dimensions with variable diffusivity and order parameters. This is achieved by (i) deploying accurate numerical schemes for the forward problem, and (ii) employing uncertainty quantification tools that accelerate the inverse problem. We begin by focusing on parameter calibration of a variable-diffusivity fractional diffusion model. A random, spatially-varying diffusivity field is considered together with an uncertain but spatially homogeneous fractional operator order. Polynomial chaos (PC) techniques are used to express the dependence of the stochastic solution on these random variables. A non-intrusive methodology is used, and a deterministic finite-difference solver of the fractional diffusion model is utilized for this purpose. The surrogates are first used to assess the sensitivity of quantities of interest (QoIs) to the uncertain inputs and to examine their statistics. In particular, the analysis indicates that the fractional order has a dominant effect on the variance of the QoIs considered. The PC surrogates are further exploited to calibrate the uncertain parameters using a Bayesian methodology. In the broad range of parameters addressed, the analysis shows that the uncertain parameters having a significant impact on the variance of the solution can be reliably inferred, even from limited observations. Next, we address the numerical challenges that arise when multidimensional space-fractional diffusion equations have spatially varying diffusivity and fractional order. Significant computational challenges arise due to the kernel singularity in the fractional integral operator as well as the resulting dense discretized operators. Hence, we present a singularity-aware discretization scheme that regularizes the singular integrals through a singularity subtraction technique adapted to the spatial variability of diffusivity and fractional order. This regularization strategy is conveniently formulated as a sparse matrix correction added to the dense operator, and is applicable to different formulations of fractional diffusion equations. Numerical results show that the singularity treatment is robust, substantially reduces discretization errors, and attains the first-order convergence rate allowed by the regularity of the solutions. In the last part, we explore the application of a Bayesian formalism to detect an anomaly in a fractional medium. Specifically, a computational method is presented for inferring the location and properties of an inclusion inside a two-dimensional domain. The anomaly is assumed to have known shape but unknown diffusivity and fractional order parameters, and is assumed to be embedded in a fractional medium of known fractional properties. To detect the presence of the anomaly, the medium is forced using a collection of localized sources, and its response is measured at the source locations. To this end, the singularity-aware finite-difference scheme is applied.
A non-intrusive regression approach is used to explore the dependence of the computed signals on the properties of the anomaly, and the resulting surrogates are first exploited to characterize the variability of the response, and then used to accelerate the Bayesian inference of the anomaly. In the regime of parameters considered, the computational results indicate that robust estimates of the location and fractional properties of the anomaly can be obtained, and that these estimates become sharper when high contrast ratios prevail between the anomaly and the surrounding matrix.
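The surrogate-accelerated calibration workflow described in this abstract can be sketched generically: fit a cheap polynomial surrogate to a handful of expensive forward solves, then run MCMC against the surrogate instead of the solver. The following is a minimal sketch under assumed toy choices (a scalar parameter, a stand-in forward model, a uniform prior); it is not the dissertation's fractional-diffusion solver or its actual PC construction.

```python
# Generic sketch of surrogate-accelerated Bayesian calibration:
# (1) fit a cheap polynomial (PC-like) surrogate to an expensive model,
# (2) run Metropolis sampling against the surrogate instead of the solver.
import numpy as np

rng = np.random.default_rng(1)

def expensive_forward(theta):
    # Stand-in for a costly PDE solve; maps a parameter to an observable.
    return np.exp(-theta) + 0.1 * theta ** 2

# (1) Build the surrogate from a handful of forward solves.
nodes = np.linspace(0.0, 2.0, 9)
coeffs = np.polyfit(nodes, [expensive_forward(t) for t in nodes], deg=5)
surrogate = lambda theta: np.polyval(coeffs, theta)

# Synthetic observation from "truth" theta = 0.7, noise sigma = 0.01 (assumed).
sigma = 0.01
y_obs = expensive_forward(0.7) + rng.normal(0, sigma)

def log_post(theta):
    if not 0.0 <= theta <= 2.0:          # uniform prior on [0, 2]
        return -np.inf
    return -0.5 * ((y_obs - surrogate(theta)) / sigma) ** 2

# (2) Random-walk Metropolis; each step costs a polynomial evaluation only.
theta, lp = 1.0, log_post(1.0)
samples = []
for _ in range(20000):
    prop = theta + rng.normal(0, 0.1)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)

post = np.array(samples[5000:])           # discard burn-in
print(f"posterior mean {post.mean():.3f}, sd {post.std():.3f}")
```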
219

Real-time Traffic Safety Evaluation Models And Their Application For Variable Speed Limits

Yu, Rongjie 01 January 2013
Traffic safety has become a primary concern in transportation; crashes cause extensive human and economic losses. With the objective of reducing crash occurrence and alleviating crash injury severity, major efforts have been dedicated to revealing the hazardous factors that affect crash occurrence at both the aggregate level (targeting crash frequency per segment, intersection, etc.) and the disaggregate level (analyzing each crash event). Aggregate traffic safety studies, which mainly develop safety performance functions (SPFs), are conducted to unveil crash contributing factors for locations of interest. Results of aggregate studies can be used to identify crash hot spots, calculate crash modification factors (CMFs), and improve geometric characteristics. Aggregate analyses mainly focus on discovering the hazardous factors related to the frequency of total crashes, of a specific crash type, or of each crash severity level. Disaggregate studies, in contrast, benefit from reliable surveillance systems that provide detailed real-time traffic and weather data; this information helps capture micro-level influences of the hazardous factors that might lead to a crash. Disaggregate traffic safety models, also called real-time crash risk evaluation models, can be used to monitor crash hazardousness as real-time field data are fed in. One potential use of real-time crash risk evaluation models is to support Variable Speed Limits (VSL) as part of a freeway management system. Models have been developed to predict crash occurrence in order to proactively improve traffic safety and prevent crashes.

In this study, aggregate safety performance functions were first estimated to unveil the different risk factors affecting crash occurrence for a mountainous freeway section. Disaggregate real-time crash risk evaluation models were then developed for total crashes with both machine learning and hierarchical Bayesian models. Considering the need to analyze both aggregate and disaggregate aspects of traffic safety, systematic multi-level traffic safety studies were conducted for single- and multi-vehicle crashes, and for weekday and weekend crashes. Finally, the feasibility of utilizing a VSL system to improve traffic safety on freeways was investigated.

This research was conducted on data obtained from a 15-mile mountainous freeway section of I-70 in Colorado. The data contain historical crash records, roadway geometric characteristics, real-time weather data, and real-time traffic data. Real-time weather data were recorded by six weather stations installed along the freeway section, while real-time traffic data were obtained from Remote Traffic Microwave Sensor (RTMS) radars and Automatic Vehicle Identification (AVI) systems. Different datasets were formulated from these sources and prepared for the multi-level traffic safety studies.

In the aggregate investigation, safety performance functions were developed to identify hazardous factors for crash occurrence; for the first time, real-time weather and traffic data were used in SPFs. An ordinary Poisson model and random effects Poisson models with a Bayesian inference approach were employed to reveal the effects of weather- and traffic-related variables on crash occurrence. Two scenarios were considered: a season-based case and a crash-type-based case.
The Deviance Information Criterion (DIC) was utilized as the comparison criterion, and the correlated random effects Poisson models outperformed the others. Results indicate that weather condition variables, especially precipitation, play a key role in the safety performance functions. Moreover, to compare with the correlated random effects Poisson model, a multivariate Poisson model and a multivariate Poisson-lognormal model were estimated. Conclusions indicate that, instead of assuming identical random effects for homogeneous segments, considering the correlation effects between two count variables results in a better model fit.

Results from the aggregate analyses shed light on policy implications for reducing crash frequencies. For the studied roadway segment, crash occurrence in the snow season has clear trends associated with adverse weather (poor visibility and heavy precipitation); weather warning systems can be employed to improve road safety during the snow season. Furthermore, different traffic management strategies should be developed according to the distinct seasonal influence factors. In particular, sites with steep slopes need more attention from the traffic management center and operators, especially during snow seasons, to control excess crash occurrence. Moreover, distinct freeway management strategies should be designed to address the differences between single- and multi-vehicle crash characteristics.

In addition to developing safety performance functions with various modeling techniques, this study also investigates four different approaches to developing informative priors for the independent variables. The Bayesian inference framework provides a complete and coherent way to balance empirical data and prior expectations; the merits of these informative priors were tested with two types of Bayesian hierarchical models (Poisson-gamma and Poisson-lognormal). The Deviance Information Criterion, R-squared values, and coefficients of variation of the estimates were utilized as evaluation measures to select the best model(s). Comparisons across the models indicate that the Poisson-gamma model is superior, with a better model fit, and is much more robust with the informative priors. Moreover, the two-stage Bayesian updating informative priors provided the best goodness-of-fit and coefficient estimation accuracy.

In addition to the aggregate analyses, real-time crash risk evaluation models were developed to identify crash contributing factors at the disaggregate level. Support Vector Machine (SVM) models, a recently proposed statistical learning approach, and hierarchical Bayesian logistic regression models were introduced to evaluate real-time crash risk. A classification and regression tree (CART) model was developed to select the most important explanatory variables. Based on the variable selection results, Bayesian logistic regression models and SVM models with different kernel functions were developed. Model comparisons based on receiver operating characteristic (ROC) curves demonstrate that the SVM model with a radial basis kernel function outperforms the others. Results from the models demonstrate that crashes are likely to happen during congestion periods (especially when the queuing area has propagated from the downstream segment), and that high variation of occupancy and/or volume increases the probability of crash occurrence.
Moreover, the effects of microscopic traffic, weather, and roadway geometric factors on the occurrence of specific crash types were investigated. Crashes were categorized as rear-end, sideswipe, and single-vehicle crashes. AVI segment average speed, real-time weather data, and roadway geometric characteristics were utilized as explanatory variables. Conclusions from this analysis imply that different active traffic management (ATM) strategies should be designed for three- and two-lane roadway sections, also taking seasonal effects into account.

Based on the above results, real-time crash risk evaluation models were developed separately for multi-vehicle and single-vehicle crashes, and for weekday and weekend crashes. Hierarchical Bayesian logistic regression models (random effects and random parameter logistic regression models) were introduced to address seasonal variations, crash-unit-level diversity, and unobserved heterogeneity caused by geometric characteristics. For multi-vehicle crashes, congested conditions downstream contribute to an increased crash likelihood, and such crashes are more likely to occur during poor visibility conditions and when a turbulent area exists downstream; drivers who fail to reduce their speed in time are prone to causing rear-end crashes. For single-vehicle crashes, slow-moving traffic platoons at the detector downstream of the crash location increase the probability of occurrence, as do large variations of occupancy downstream.

Substantial efforts have been dedicated in this study to revealing the hazardous factors that affect crash occurrence at both the aggregate and disaggregate levels; however, these findings and conclusions need to be transferred into applications for roadway design and freeway management. This study therefore further investigates the feasibility of utilizing a Variable Speed Limits (VSL) system, a key part of ATM, to improve traffic safety on freeways. A proactive traffic safety improvement VSL control algorithm is proposed. First, an extension of the traffic flow model METANET was employed to predict traffic flow while considering VSL's impacts on the flow-density diagram; a real-time crash risk evaluation model was then estimated to quantify crash risk; finally, optimal VSL control strategies were obtained with an optimization technique that minimizes the total predicted crash risk along the VSL implementation area. Constraints were set to limit increases in average travel time and the temporal and spatial differences between posted speed limits. The proposed VSL control strategy was tested for a mountainous freeway bottleneck area in the microscopic simulation software VISSIM. Safety impacts of the VSL system were quantified as crash risk improvements and speed homogeneity improvements. Moreover, three different driver compliance levels were modeled in VISSIM to assess the sensitivity of the VSL system's safety impacts to driver compliance.
Conclusions demonstrate that the proposed VSL system can effectively improve traffic safety by decreasing crash risk, enhancing speed homogeneity, and reducing travel time under both high and moderate driver compliance levels, while it does not have significant effects on traffic safety under the low-compliance scenario. Future implementations of VSL control strategies and related research topics are also discussed.
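The model-comparison step described above (logistic regression versus an RBF-kernel SVM, judged by ROC) can be sketched with scikit-learn. The data below are synthetic stand-ins with an imbalanced class to mimic rare crash events; none of it is the study's I-70 dataset, and all names are illustrative assumptions.

```python
# Illustrative sketch of the model-comparison step described above:
# logistic regression vs. an RBF-kernel SVM, compared by ROC AUC.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Stand-in features (imagine speed variance, occupancy, volume, visibility...)
# with a rare positive class to mimic infrequent crash events.
X, y = make_classification(n_samples=4000, n_features=8, n_informative=5,
                           weights=[0.95, 0.05], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

models = {
    "logistic": make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
    "SVM (RBF)": make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True)),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    score = model.predict_proba(X_te)[:, 1]   # crash-risk score
    print(f"{name:10s} ROC AUC: {roc_auc_score(y_te, score):.3f}")
```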
220

Bayesian, Frequentist, and Information Geometry Approaches to Parametric Uncertainty Quantification of Classical Empirical Interatomic Potentials

Kurniawan, Yonatan 20 December 2021
Uncertainty quantification (UQ) is an increasingly important part of materials modeling. In this paper, we consider the problem of quantifying parametric uncertainty in classical empirical interatomic potentials (IPs). Previous work based on local sensitivity analysis using the Fisher Information has shown that IPs are sloppy, i.e., are insensitive to coordinated changes of many parameter combinations. We confirm these results and further explore the non-local statistics in the context of sloppy model analysis using both Bayesian (MCMC) and Frequentist (profile likelihood) methods. We interface these tools with the Open Knowledgebase of Interatomic Models (OpenKIM) and study three models based on the Lennard-Jones, Morse, and Stillinger-Weber potentials, respectively. We confirm that IPs have global properties similar to those of sloppy models from fields such as systems biology, power systems, and critical phenomena. These models exhibit a low effective dimensionality in which many of the parameters are unidentifiable, i.e., do not encode any information when fit to data. Because the inverse problem in such models is ill-conditioned, unidentifiable parameters present challenges for traditional statistical methods. In the Bayesian approach, Monte Carlo samples can depend on the choice of prior in subtle ways. In particular, they often "evaporate" parameters into high-entropy, sub-optimal regions of the parameter space. For profile likelihoods, confidence regions are extremely sensitive to the choice of confidence level. To get a better picture of the relationship between data and parametric uncertainty, we sample the Bayesian posterior at several sampling temperatures and compare the results with those of Frequentist analyses. In analogy to statistical mechanics, we classify samples as either energy-dominated, i.e., characterized by identifiable parameters in constrained (ground state) regions of parameter space, or entropy-dominated, i.e., characterized by unidentifiable (evaporated) parameters. We complement these two pictures with information geometry to illuminate the underlying cause of this phenomenon. In this approach, a parameterized model is interpreted as a manifold embedded in the space of possible data with parameters as coordinates. We calculate geodesics on the model manifold and find that IPs, like other sloppy models, have bounded manifolds with a hierarchy of widths, leading to low effective dimensionality in the model. We show how information geometry can motivate new, natural parameterizations that improve the stability and interpretation of UQ analysis and further suggest simplified, less-sloppy models.
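The local sloppiness analysis mentioned above reduces, in the least-squares setting, to examining the eigenvalues of the Fisher Information Matrix J^T J, where J is the Jacobian of model predictions with respect to parameters; sloppy models show eigenvalues spread over many orders of magnitude. The sketch below demonstrates this on a toy sum-of-exponentials model, a classic sloppy example; it is illustrative only and does not touch the interatomic potentials or the OpenKIM interface used in the dissertation.

```python
# Illustrative sketch of a local sloppiness analysis: the eigenvalue spectrum
# of the Fisher Information Matrix (J^T J for least squares) for a toy
# sum-of-exponentials model. Parameter values below are assumptions.
import numpy as np

t = np.linspace(0, 5, 50)
theta0 = np.array([1.0, 0.3, 2.0, 1.5])   # amplitudes and decay rates (assumed)

def model(theta):
    a1, k1, a2, k2 = theta
    return a1 * np.exp(-k1 * t) + a2 * np.exp(-k2 * t)

def jacobian(theta, h=1e-6):
    # Central finite differences, one column per parameter.
    J = np.empty((t.size, theta.size))
    for i in range(theta.size):
        dp, dm = theta.copy(), theta.copy()
        dp[i] += h
        dm[i] -= h
        J[:, i] = (model(dp) - model(dm)) / (2 * h)
    return J

J = jacobian(theta0)
eigvals = np.linalg.eigvalsh(J.T @ J)      # FIM spectrum, ascending order
print("FIM eigenvalues:", eigvals)
print("spread (orders of magnitude):", np.log10(eigvals.max() / eigvals.min()))
```

A wide spread indicates directions in parameter space that the data barely constrain, which is the starting point for the Bayesian, profile-likelihood, and information-geometry analyses the abstract describes.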
