21

Location Knowledge Discovery from User Activities / ユーザアクティビティからの場所に関する知識発見

Zhuang, Chenyi 25 September 2017 (has links)
Kyoto University / 0048 / New-system course doctorate / Doctor of Informatics / 甲第20737号 / 情博第651号 / 新制||情||112(附属図書館) / Department of Social Informatics, Graduate School of Informatics, Kyoto University / (Chief examiner) Professor Masatoshi Yoshikawa, Professor Toru Ishida, Professor Michihiko Minoh, Associate Professor Qiang Ma / Qualified under Article 4, Paragraph 1 of the Degree Regulations / Doctor of Informatics / Kyoto University / DFAM
22

Jet grouting as a method for sealing sheet pile excavations in Swedish conditions : A probabilistic approach

Brinck, Mårten, Stigenius, Karl January 2019 (has links)
Jet grouting is a ground improvement method that creates cemented columns in the soil. The soil is injected with different pressurized fluids, through the monitor, to replace and cement the soil, often with a water-cement grout. There are three common systems for ejecting the fluids: the single, double and triple fluid systems. The process is performed from the ground surface by drilling to the desired depth and then withdrawing the monitor while rotating and ejecting, thus creating a column. There are many applications for this technique; however, this thesis focuses on using jet grouted columns in formation to seal sheet pile excavations from water.
23

Development of Probabilistic Models for Long Term Reliability of Sandwich Composites in Saline Freeze/Thaw Environment for Civil Engineering Applications

Emami, Sadra January 2017 (has links)
No description available.
24

Agents of Influence in Social Networks

Vedanarayanan, Srinivasa Raghavan 11 October 2012 (has links)
No description available.
25

A probabilistic model of virus transport through packed beds

Shah, Jayesh R. January 1989 (has links)
No description available.
26

Probabilistic Assessment of Pile Drivability in Swedish Soils

Alinejad Kordmahalleh, Maedeh January 2020 (has links)
Site investigations are often performed prior to the design of pile foundations with the aim of collecting data on soil properties, including boulder content. The obtained data are typically limited due to the non-homogeneous character of the soil. The geological conditions of the Mälardalen region are characterized by glacial and post-glacial clay overlying a layer of moraine containing boulders on bedrock. Pile refusal as a result of encountering boulders is therefore a common issue during pile driving in this region. The current methods to assess the risk of pile refusal are mostly qualitative and rely on expertise gained from experience.

This thesis aims to develop a numerical model that quantifies the boulder content in a soil stratum, with a confidence interval, based on the site investigation results. Furthermore, this study estimates the probability and the consequences of hitting boulders while installing piles.

The model simulations show that an increase in the boulder content raises the probability of hitting a boulder, but the probability is not proportional to the diameter of the piles: even at a low boulder content and with small piles, there is a high probability of hitting a boulder. Results from simulations using a simplified quantitative method show that slender piles suffer substantial consequences if they encounter boulders, and are thus more prone to refusal or breakage. It is suggested that in projects where small piles are preferred due to the superstructure design, enough tests should be performed to estimate the boulder content with a tighter interval and obtain a more precise design. On the other hand, in projects with low pile counts, it is recommended to use a conservative design with large-diameter piles instead of performing numerous in-situ tests, because the cost of piling then becomes relatively low compared with the cost of the site investigations.
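The trend summarized above (higher boulder content raises the hit probability, and even small piles face a high risk at low boulder content) can be illustrated with a rough Monte Carlo sketch. The geometry below (spherical boulders of a single size, uniform placement in plan view, one vertical pile, and the chosen soil-block dimensions) is assumed purely for illustration and is not the model developed in the thesis.

```python
import numpy as np

def hit_probability(boulder_content, pile_diameter, boulder_diameter=0.4,
                    plan_side=20.0, depth=10.0, n_trials=2000, seed=0):
    """Monte Carlo estimate of the probability that a vertically driven pile
    meets at least one boulder before reaching bedrock.

    All geometric choices (spherical boulders of a single size, uniform random
    placement, one vertical pile at the origin) are illustrative assumptions.
    """
    rng = np.random.default_rng(seed)
    boulder_volume = (np.pi / 6.0) * boulder_diameter ** 3      # sphere volume
    soil_volume = plan_side ** 2 * depth
    n_boulders = int(round(boulder_content * soil_volume / boulder_volume))
    hit_radius = 0.5 * (pile_diameter + boulder_diameter)       # axis-to-centre clearance

    hits = 0
    for _ in range(n_trials):
        # Draw boulder centres uniformly in plan view; the pile axis is at (0, 0).
        xy = rng.uniform(-plan_side / 2.0, plan_side / 2.0, size=(n_boulders, 2))
        if np.any(np.hypot(xy[:, 0], xy[:, 1]) < hit_radius):
            hits += 1
    return hits / n_trials

if __name__ == "__main__":
    for content in (0.01, 0.03, 0.05):          # volumetric boulder content
        for d in (0.15, 0.30, 0.60):            # pile diameter in metres
            p = hit_probability(content, d)
            print(f"content {content:.0%}, pile {d:.2f} m: P(hit) ~ {p:.2f}")
```

Even with this crude geometry, the estimate shows the qualitative effect reported in the abstract: at one percent boulder content a slender pile already hits a boulder in a large share of trials.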
27

Probabilistic modelling of bed-load composition.

Tait, Simon J., Heald, J., McEwan, I.K., Soressen, M., Cunningham, G., Willetts, B., Goring, D. 24 June 2009 (has links)
This paper proposes that the changes which occur in composition of the bed load during the transport of mixed-grain-size sediments are largely controlled by the distributions of critical entrainment shear stress for the various size fractions. This hypothesis is examined for a unimodal sediment mixture by calculating these distributions with a discrete particle model and using them in a probabilistic calculation of bed-load composition. The estimates of bed-load composition compare favorably with observations of fractional transport rates made in a laboratory flume for the same sediment, suggesting that the hypothesis is reasonable. The analysis provides additional insight, in terms of grain mechanics, into the processes that determine bed-load composition. These insights strongly suggest that better prediction methods will result from taking account of the variation of threshold within size fractions, something that most previous studies have neglected.
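A minimal sketch of the probabilistic step described above: if each size fraction has a distribution of critical entrainment shear stress, the chance that a grain of that fraction is mobile at a given applied stress is the distribution's CDF, and weighting the bed composition by those probabilities gives a bed-load composition. The lognormal thresholds, grain sizes, and bed proportions below are invented for illustration; in the paper the threshold distributions come from a discrete particle model.

```python
import numpy as np
from scipy.stats import lognorm

# Illustrative inputs: four size fractions, their proportions in the bed, and
# an assumed lognormal distribution of critical entrainment shear stress for
# each fraction (all values hypothetical).
grain_sizes_mm = np.array([1.0, 2.0, 4.0, 8.0])
bed_fractions  = np.array([0.25, 0.35, 0.25, 0.15])
median_tau_c   = np.array([0.8, 1.2, 1.9, 3.0])   # Pa, hypothetical medians
sigma_ln       = 0.35                              # assumed spread of ln(tau_c)

def bedload_composition(tau):
    """Probabilistic bed-load composition at an applied shear stress tau (Pa).

    A grain of fraction i is mobile if its critical shear stress is below tau;
    the chance of that is the CDF of the fraction's threshold distribution.
    Weighting the bed proportions by these probabilities and renormalising
    gives the composition of the transported load.
    """
    p_mobile = lognorm.cdf(tau, s=sigma_ln, scale=median_tau_c)
    mobile = bed_fractions * p_mobile
    total = mobile.sum()
    return mobile / total if total > 0 else np.zeros_like(mobile)

if __name__ == "__main__":
    for tau in (1.0, 1.5, 2.5):
        comp = bedload_composition(tau)
        detail = ", ".join(f"{d:g} mm: {c:.2f}" for d, c in zip(grain_sizes_mm, comp))
        print(f"tau = {tau:.1f} Pa -> {detail}")
```

As the applied stress rises past the thresholds of the coarser fractions, the computed load composition shifts toward the bed composition, which is the qualitative behaviour the paper attributes to within-fraction threshold variation.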
28

Formal methods for the analysis of wireless network protocols

Fruth, Matthias January 2011 (has links)
In this thesis, we present novel software technology for the analysis of wireless networks, an emerging area of computer science. To address the widely acknowledged lack of formal foundations in this field, probabilistic model checking, a formal method for verification and performance analysis, is used. In contrast to testing and simulation, it systematically explores the full state space and therefore allows reasoning about all possible behaviours of a system. This thesis contributes to the design, modelling, and analysis of ad-hoc networks and randomised distributed coordination protocols. First, we present a new hybrid approach that effectively combines probabilistic model checking and state-of-the-art models from the simulation community in order to improve the reliability of design and analysis of wireless sensor networks and their protocols. We describe algorithms for the automated generation of models for both analysis methods and their implementation in a tool. Second, we study spatial properties of wireless sensor networks, mainly with respect to Quality of Service and energy properties. Third, we investigate the contention resolution protocol of the networking standard ZigBee. We build a generic stochastic model for this protocol and analyse its Quality of Service and energy properties. Furthermore, we assess the applicability of different interference models. Fourth, we explore slot allocation protocols, which serve as a bandwidth allocation mechanism for ad-hoc networks. We build a generic model for this class of protocols, study real-world protocols, and optimise protocol parameters with respect to Quality of Service and energy constraints. We combine this with the novel formalisms for wireless communication and interference models, and finally we optimise local (node) and global (network) routing policies. This is the first application of probabilistic model checking both to protocols of the ZigBee standard and to protocols for slot allocation.
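As a toy illustration of the core computation behind probabilistic model checking, the sketch below computes the probability of eventually reaching a target state in a small discrete-time Markov chain by solving the standard linear system for unbounded reachability. The four-state backoff chain and its transition probabilities are invented for this example; the thesis builds far richer models of ZigBee contention resolution and slot allocation, typically analysed with a dedicated probabilistic model checker such as PRISM.

```python
import numpy as np

# States of a made-up backoff chain: 0 = try to send, 1 = back off,
# 2 = delivered (target), 3 = dropped.  All probabilities are invented.
P = np.array([
    [0.0, 0.6, 0.3, 0.1],   # try: collide and back off, succeed, or give up
    [0.7, 0.0, 0.2, 0.1],   # back off: retry, succeed on retry, or give up
    [0.0, 0.0, 1.0, 0.0],   # delivered is absorbing
    [0.0, 0.0, 0.0, 1.0],   # dropped is absorbing
])
TARGET = 2

def reachability(P, target):
    """Probability of eventually reaching `target` from every state, assuming
    all non-target absorbing states can never reach it (true for this chain).
    This is the linear-equation formulation used by probabilistic model
    checkers for unbounded 'eventually' properties: x = P_tt x + P_t,target."""
    n = P.shape[0]
    absorbing = {s for s in range(n) if P[s, s] == 1.0}
    transient = [s for s in range(n) if s not in absorbing]
    A = np.eye(len(transient)) - P[np.ix_(transient, transient)]
    b = P[transient, target]
    x = np.zeros(n)
    x[target] = 1.0
    x[transient] = np.linalg.solve(A, b)
    return x

if __name__ == "__main__":
    probs = reachability(P, TARGET)
    print(f"P(eventually delivered | start in 'try') = {probs[0]:.3f}")
```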
29

Optimisation de potentiels statistiques pour un modèle d'évolution soumis à des contraintes structurales / Optimization of statistical potentials for a structurally constrained evolutionary model

Bonnard, Cécile 05 January 2010 (has links)
In recent years, several models of molecular evolution have been developed that are based on the hypothesis that protein sequences evolve under the constraint of a well-defined structure that remains constant over the course of evolution. However, such a model relies on the expression of the function representing the link between the structure and its sequence. Statistical potentials offer an interesting solution, but among the existing statistical potentials, which would be the most appropriate for these evolutionary models? In this thesis, a probabilistic framework for optimizing statistical potentials is developed, in a maximum-likelihood setting and with protein design in mind. This framework integrates different optimization methods, including the use of alternative structures for the optimization of the potentials, and provides a robust setting and statistical tests (both in the context of potential optimization and in the context of molecular evolution) for comparing different methods of optimizing statistical potentials for structurally constrained models. / In the field of molecular evolution, so-called structurally constrained (SC) models have been developed. Expressed at the codon level, they explicitly separate the mutation factor (applied to the nucleotide sequence) and the selection factor (applied to the encoded protein sequence). The selection factor is described as a function between the structure and the sequence of the protein, via the use of a statistical potential. However, the whole evolutionary model depends on the expression of this potential, and one can ask whether one potential would be better than another. In this thesis, a probabilistic framework to optimize statistical potentials, especially meant for protein design, is developed using a maximum-likelihood approach. The statistical potential used in this thesis is composed of a contact potential and a solvent-accessibility potential, but the probabilistic framework can easily be generalized to more complex statistical potentials. In a first part, the framework is defined; then an algorithmic enhancement is proposed; and finally, the framework is modified in order to take into account misfolded structures (decoys). The framework defined in this thesis and in other works makes it possible to compare different optimization methods of statistical potentials for SC models, using cross-validation and Bayes factor comparisons.
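As a hedged sketch of the kind of maximum-likelihood optimization described above, the toy example below fits a symmetric contact potential so that an assumed native sequence becomes probable under a Boltzmann distribution over all sequences for one tiny contact map. The three-letter residue alphabet, the six-residue structure, the exhaustive partition function, and the small ridge penalty are all illustrative simplifications; the thesis works with real proteins and adds a solvent-accessibility term and alternative (decoy) structures.

```python
import itertools
import numpy as np
from scipy.optimize import minimize

# Toy maximum-likelihood fit of a contact potential (all inputs invented).
ALPHABET = "HPC"                     # hypothetical residue classes
CONTACTS = [(0, 3), (1, 4), (2, 5)]  # contact map of a six-residue toy structure
NATIVE = "HPCHPC"                    # assumed native sequence for this structure
ALL_SEQS = ["".join(s) for s in itertools.product(ALPHABET, repeat=len(NATIVE))]

def energy(seq, eps):
    """Contact energy: sum of pairwise terms eps[a, b] over structural contacts."""
    idx = [ALPHABET.index(a) for a in seq]
    return sum(eps[idx[i], idx[j]] for i, j in CONTACTS)

def neg_log_likelihood(params):
    """-log P(native | structure) for a Boltzmann distribution over all
    sequences, plus a small ridge penalty to keep the fit well posed."""
    eps = np.zeros((3, 3))
    eps[np.triu_indices(3)] = params          # six free parameters, symmetric matrix
    eps = eps + np.triu(eps, 1).T
    energies = np.array([energy(s, eps) for s in ALL_SEQS])
    log_z = np.log(np.exp(-energies).sum())   # exact partition function (3**6 sequences)
    return energy(NATIVE, eps) + log_z + 0.01 * np.dot(params, params)

if __name__ == "__main__":
    result = minimize(neg_log_likelihood, x0=np.zeros(6), method="Nelder-Mead")
    print("optimised -log P(native):", round(float(result.fun), 3))
```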
30

Efficient variable screening method and confidence-based method for reliability-based design optimization

Cho, Hyunkyoo 01 May 2014 (has links)
The objectives of this study are (1) to develop an efficient variable screening method for reliability-based design optimization (RBDO) and (2) to develop a new RBDO method that incorporates a confidence level for problems with limited input data. The current research effort involves: (1) development of a partial output variance concept for variable screening; (2) development of an effective variable screening sequence; (3) development of an estimation method for the confidence level of a reliability output; and (4) development of a design sensitivity method for the confidence level.

In the RBDO process, surrogate models are frequently used to reduce the number of simulations because analysis of a simulation model takes a great deal of computational time. On the other hand, to obtain accurate surrogate models, we have to limit the dimension of the RBDO problem and thus mitigate the curse of dimensionality. Therefore, it is desirable to develop an efficient and effective variable screening method to reduce the dimension of the RBDO problem. In this study, it is found that output variance is critical for identifying important variables in the RBDO process. A partial output variance, which is an efficient approximation based on the univariate dimension reduction method (DRM), is proposed to calculate output variance efficiently. For variable screening, the variables that have larger partial output variances are selected as important variables. To determine important variables, hypothesis testing is used so that possible errors are contained at a user-specified error level. An appropriate number of samples is also proposed for calculating the partial output variance, and a quadratic interpolation method is studied in detail to calculate output variance efficiently. Using numerical examples, the performance of the proposed variable screening method is verified: it finds important variables efficiently and effectively.

Reliability analysis and RBDO require an exact input probabilistic model to obtain an accurate reliability output and RBDO optimum design. However, in practical engineering problems, often only limited input data are available to generate the input probabilistic model. The insufficient input data induce uncertainty in the input probabilistic model, and this uncertainty forces the RBDO optimum to lose its confidence level. Therefore, it is necessary to treat the reliability output, which is defined as the probability of failure, as following a probability distribution. The probability of the reliability output is obtained from consecutive conditional probabilities of the input distribution type and parameters using a Bayesian approach. The approximate conditional probabilities are obtained under reasonable assumptions, and Monte Carlo simulation is applied to calculate the probability of the reliability output in practice. A confidence-based RBDO (C-RBDO) problem is formulated using the derived probability of the reliability output. In the C-RBDO formulation, the probabilistic constraint is modified to include both the target reliability output and the target confidence level. Finally, the design sensitivity of the confidence level, which is the new probabilistic constraint, is derived to support an efficient optimization process. Using numerical examples, the accuracy of the developed design sensitivity is verified, and it is confirmed that C-RBDO optimum designs incorporate appropriate conservativeness according to the given input data.
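The partial output variance idea can be sketched as follows, under the simplifying assumptions of independent Gaussian inputs and a cheap analytic test function: hold every input at its mean, vary one input at a time over a few Gauss-Hermite quadrature points, and rank inputs by the output variance each produces on its own. This is only an illustration of the univariate dimension-reduction flavour of the method, not the formulation or the hypothesis-testing step developed in the thesis.

```python
import numpy as np

def partial_output_variances(f, means, stds, n_quad=5):
    """Rank candidate random variables by 'partial output variance': hold all
    inputs at their means, vary one input at a time over Gauss-Hermite
    quadrature points, and measure the output variance that input alone
    produces.  Independent Gaussian inputs are assumed for simplicity."""
    nodes, weights = np.polynomial.hermite_e.hermegauss(n_quad)  # probabilists' rule
    weights = weights / weights.sum()                            # normalise to a pdf
    variances = []
    for i in range(len(means)):
        x = np.tile(np.asarray(means, dtype=float), (n_quad, 1))
        x[:, i] = means[i] + stds[i] * nodes   # quadrature points for variable i only
        y = np.array([f(row) for row in x])
        mean_y = weights @ y
        variances.append(weights @ (y - mean_y) ** 2)
    return np.array(variances)

if __name__ == "__main__":
    # Hypothetical performance function with one dominant, one weak, and one
    # moderately important variable.
    g = lambda x: x[0] ** 2 + 0.1 * x[1] + 5.0 * np.sin(x[2])
    means = np.zeros(3)
    stds = np.array([1.0, 1.0, 0.5])
    pv = partial_output_variances(g, means, stds)
    order = np.argsort(pv)[::-1]
    print("partial output variances:", np.round(pv, 3))
    print("screening order (most to least important):", order.tolist())
```

Variables whose partial output variance falls below a chosen cutoff would be fixed at their means in the subsequent RBDO, which is the dimension-reduction role the abstract describes.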
