31

Probabilistic databases and their application

Zhao, Wenzhong. January 2004 (has links) (PDF)
Thesis (Ph. D.)--University of Kentucky, 2004. / Title from document title page (viewed Jan. 7, 2005). Document formatted into pages; contains x, 180p. : ill. Includes abstract and vita. Includes bibliographical references (p. 173-178).
32

Risk in the development design

Crossland, Ross January 1997 (has links)
No description available.
33

STUDY ON SLOPE STABILITY OF PENANG ISLAND CONSIDERING EARTHQUAKE AND RAINFALL EFFECTS / 地震と降雨の影響を考慮したペナン島の斜面安定性に関する研究

Mastura Binti Azmi 24 March 2014 (has links)
Kyoto University / 0048 / New-system course doctorate / Doctor of Engineering / Kou No. 18226 / Engineering Doctorate No. 3818 / 新制||工||1585 (University Library) / 31084 / Department of Urban Management, Graduate School of Engineering, Kyoto University / (Chief examiner) Professor Junji Kiyono, Professor Mamoru Mimura, Associate Professor Aiko Furukawa / Qualified under Article 4, Paragraph 1 of the Degree Regulations / Doctor of Philosophy (Engineering) / Kyoto University / DFAM
34

Categorical approach to automata theory

Sznajder-Glodowski, Malgorzata January 1986 (has links)
No description available.
35

The Metaphysics of Probabilistic Laws of Nature

Maclean, Duncan 04 1900 (has links)
In this thesis I treat success in explicating probabilistic laws of nature (e.g., laws of radioactive decay) as a criterion of adequacy for a metaphysics of laws. I devote a chapter of analysis to each of the three best known theories of laws: the best systems analysis, contingent necessitation, and dispositional essentialism. I treat the problem of undermining that David Lewis identified in his theory of chance as a challenge that any metaphysical theory of probabilistic laws must overcome. I argue that dispositional essentialism explicates probabilistic laws while the other two theories fail to do so. Lewis's best systems analysis explicates probabilistic laws only with a solution to the problem of undermining. Michael Thau's solution was met with Lewis's approval. I argue that Thau's solution is ad hoc and renders impossible the fit of best systems with probabilistic laws to indeterministic worlds. Bas van Fraassen argued that David Armstrong's theory of contingent necessitation is totally incapable of explicating probabilistic laws of nature. I argue that Armstrong is able to respond to some of van Fraassen's arguments, but not to the extent of rehabilitating his theory. I also argue that Armstrong's theory of probabilistic laws suffers from the problem of undermining. This result adds to the widely held suspicion that Armstrong's theory is a version of a regularity theory of laws. With propensities grounding probabilistic laws of nature, the problem of undermining does not arise for dispositional essentialism, because all nomically possible futures are compatible with the propensities instantiated in the world. I conclude that dispositional essentialism explicates probabilistic laws of nature better than Lewis's and Armstrong's theories do. Since probabilistic laws are ubiquitous in contemporary physics, I conclude that dispositional essentialism furnishes a better metaphysics of laws than Lewis's and Armstrong's theories do. / Thesis / Doctor of Philosophy (PhD)
36

Probabilistic boolean logic, arithmetic and architectures

Chakrapani, Lakshmi Narasimhan 25 August 2008 (has links)
Parameter variations, noise susceptibility, and increasing energy dissipation of CMOS devices have been recognized as major challenges in circuit and micro-architecture design in the nanometer regime. Among these, parameter variations and noise susceptibility are increasingly causing CMOS devices to behave in an "unreliable" or "probabilistic" manner. To address these challenges, a shift in design paradigm, from current-day deterministic designs to "statistical" or "probabilistic" designs, is deemed inevitable. Motivated by these considerations, I introduce and define probabilistic Boolean logic, whose logical operators are by definition "correct" with a probability p, where 1/2 <= p <= 1. While most of the laws of conventional Boolean logic extend naturally to the probabilistic case, there are a few significant departures. We also show that computations realized using implicitly probabilistic Boolean operators are more energy efficient than their counterparts which use explicit sources of randomness, in the context of probabilistic Boolean circuits as well as probabilistic models with state (Rabin automata). To demonstrate the utility of implicitly probabilistic elements, we study a family of probabilistic architectures: the probabilistic system-on-a-chip (PSOC), based on CMOS devices rendered probabilistic by noise, referred to as probabilistic CMOS (PCMOS) devices. These architectures yield significant improvements, both in energy consumed and in performance, in the context of probabilistic or randomized applications with broad utility. Finally, we extend the consideration of probability of correctness to arithmetic operations, through probabilistic arithmetic. We show that in the probabilistic context, substantial savings in energy over correct arithmetic operations may be achieved. This is the theoretical basis of the energy savings reported in the video decoding and radar processing applications demonstrated in prior work.
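A minimal sketch of the idea of an implicitly probabilistic Boolean operator, assuming a gate that returns the correct conjunction with probability p and the flipped value otherwise; the function names, the choice of p, and the two-gate circuit are illustrative placeholders rather than the thesis's own constructions (Python):

import random

def pand(a: bool, b: bool, p: float = 0.9) -> bool:
    """Probabilistic AND: correct with probability p (1/2 <= p <= 1), flipped otherwise."""
    correct = a and b
    return correct if random.random() < p else not correct

def estimate_circuit_accuracy(p: float = 0.9, trials: int = 100_000) -> float:
    """Empirical probability that the two-gate circuit (a AND b) AND c gives the right answer,
    assuming independent gate errors. It typically exceeds p**2 because some gate errors
    are logically masked (e.g., when c is False the output is False regardless)."""
    hits = 0
    for _ in range(trials):
        a, b, c = (random.random() < 0.5 for _ in range(3))
        if pand(pand(a, b, p), c, p) == (a and b and c):
            hits += 1
    return hits / trials

print(estimate_circuit_accuracy())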
37

Probabilistic models for classification of bioacoustic data

Lakshminarayanan, Balaji 30 December 2010 (has links)
Probabilistic models have been successfully applied to a wide variety of problems, including information retrieval, computer vision, bioinformatics and speech processing. Probabilistic models allow us to encode our assumptions about the data in an elegant fashion and enable us to perform machine learning tasks such as classification and clustering in a principled manner. Probabilistic models for bioacoustic data help in identifying interesting patterns in the data (for instance, the species-specific vocabulary), as well as in species identification (classification) for recordings where the label is not available. The focus of this thesis is to develop efficient inference techniques for existing models, as well as to develop probabilistic models tailored to bioacoustic data. First, we develop inference algorithms for the supervised latent Dirichlet allocation (LDA) model. We present collapsed variational Bayes, collapsed Gibbs sampling and maximum-a-posteriori (MAP) inference for parameter estimation and classification in supervised LDA. We provide an empirical evaluation of the trade-off between computational complexity and classification performance of the inference methods for supervised LDA, on audio classification (species identification in this context) as well as image classification and document classification tasks. Next, we present novel probabilistic models for bird sound recordings that can capture temporal structure at different hierarchical levels and model additional information such as the duration and frequency of vocalizations. We present a non-parametric density estimation technique for parameter estimation and show that the MAP classifier for our models can be interpreted as a weighted nearest neighbor classifier. We provide an experimental comparison between the proposed models and a support vector machine based approach, using bird sound recordings from the Cornell Macaulay Library. / Graduation date: 2011 / Access restricted to the OSU Community at author's request from Dec. 30, 2010 - Dec. 30, 2011
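The abstract notes that the MAP classifier for the proposed models can be read as a weighted nearest neighbor classifier. A hedged illustration of that general idea, using a per-class Gaussian kernel density estimate over made-up feature vectors; this is a generic KDE/MAP rule, not the thesis's specific model, and the bandwidth, features and data are placeholders:

import numpy as np

def kde_map_classify(x_new, X_train, y_train, bandwidth=1.0):
    """Classify x_new by a kernel-density MAP rule: for each class, sum Gaussian kernel
    weights from that class's training points (a weighted nearest-neighbor vote),
    multiply by the class prior, and return the argmax class."""
    classes = np.unique(y_train)
    scores = {}
    for c in classes:
        Xc = X_train[y_train == c]
        d2 = np.sum((Xc - x_new) ** 2, axis=1)          # squared distances to class-c points
        weights = np.exp(-d2 / (2.0 * bandwidth ** 2))  # Gaussian kernel weights
        prior = len(Xc) / len(X_train)
        scores[c] = prior * weights.mean()              # class prior x kernel density estimate
    return max(scores, key=scores.get)

# toy usage with synthetic 2-D "acoustic" features for two species
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
print(kde_map_classify(np.array([2.5, 2.5]), X, y))  # expected output: 1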
38

Development of software for reliability based design of steel framed structures in fire

Devaney, Shaun January 2015 (has links)
Fire in building structures represents a risk to both life and property that cannot be fully eliminated. It is the aim of fire safety engineering to reduce this risk to an acceptable level by applying scientific and engineering principles to evaluate the risk posed by fire and to determine the optimal set of protective measures. This is increasingly being achieved through performance-based design methods. Performance-based design sets out performance requirements, typically related to life safety and control of property losses, and the designer is free to choose the most suitable approach to meet these requirements. Accurate performance-based design requires the evaluation of the risks to a structure through the evaluation of the range of hazards that may occur and the resulting structural responses. The purpose of this research is to develop simplified methodologies for the reliability-based design of steel-framed structures in fire. These methodologies are incorporated into a software package, FireLab, which is intended to act as a tool for practicing engineers to aid in learning and applying performance-based design. FireLab is a Matlab-based program that incorporates a number of different models for analysing the response of structural elements exposed to fire. It includes both deterministic and probabilistic analysis procedures. A range of simple fire models is presented for modelling compartment fires. A set of heat transfer processes is discussed for calculating the temperature distribution within common structural elements exposed to fire. A variety of structural models are discussed which may be used to model the effects of fire on a structure. An analytical model for the analysis of composite beams has been implemented in the software program. Interfaces between the software and two separate third-party programs have also been created to allow for the analysis of composite beams using the finite element method. Analytical methods for the analysis of composite slabs under thermo-mechanical load have been implemented in the software. These methods account for the additional load-carrying capacity that slabs have in fire due to the positive effects of tensile membrane action. A numerical analysis method for the vertical stability of structures subjected to multi-floor fires has been implemented using the direct stiffness method. This method uses an elastic second-order solution to check the stability of a column under the fire-induced horizontal loads from sagging floors. These models of potential failure scenarios provide the basis for the probabilistic analysis methods. A variety of methods for reliability analysis are evaluated based on ease of use, accuracy and efficiency. A selection of these methods has been implemented in the software program. A selection of sample cases is examined in order to illustrate the procedures and to evaluate the important input variables. These methods provide the probability of failure of a structure under specific loads. The probability of failure is a useful parameter in comparing the level of safety between various design options. A more comprehensive framework is developed for the evaluation of the probable costs due to fire associated with a given design. This framework is based on an existing framework from earthquake engineering. It involves calculating the statistical spread of both the magnitude and likelihood of occurrence of fire and the resulting structural responses.
The damage that occurs from the structural response may then be estimated. Finally, given the likely level of damage that will occur, it is possible to estimate the cost of the damage, either in terms of the monetary cost of repair or the downtime due to repair works. This method is applied to a variety of design options for a typical office building in order to illustrate the application of the framework.
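As a rough illustration of how a probability of failure is obtained in reliability analysis of this kind, the sketch below runs a crude Monte Carlo estimate for a single limit state g = R - S; the distributions, parameter values and units are invented for the example and are not FireLab's models:

import numpy as np

def probability_of_failure(n_samples=200_000, seed=0):
    """Crude Monte Carlo estimate of P(failure) for the limit state g = R - S, where R is
    member capacity (assumed lognormal, already reduced for elevated temperature) and S is
    the load effect (assumed normal). All numbers are purely illustrative."""
    rng = np.random.default_rng(seed)
    R = rng.lognormal(mean=np.log(300.0), sigma=0.15, size=n_samples)  # capacity, kN*m
    S = rng.normal(loc=180.0, scale=40.0, size=n_samples)              # demand, kN*m
    failures = np.count_nonzero(R - S < 0.0)
    pf = failures / n_samples
    se = np.sqrt(pf * (1.0 - pf) / n_samples)  # standard error, to judge sample size
    return pf, se

pf, se = probability_of_failure()
print(f"Pf approx {pf:.4f} +/- {se:.4f}")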
39

Optimal Sensor Placement for Infrastructure System Monitoring using Probabilistic Graphical Models and Value of Information

Malings, Carl Albert 01 May 2017 (has links)
Civil infrastructure systems form the backbone of modern civilization, providing the basic services that allow society to function. Effective management of these systems requires decision-making about the allocation of limited resources to maintain and repair infrastructure components and to replace failed or obsolete components. Making informed decisions requires an understanding of the state of the system; such an understanding can be achieved through a computational or conceptual system model combined with information gathered on the system via inspections or sensors. Gathering of this information, referred to generally as sensing, should be optimized to best support the decision-making and system management processes, in order to reduce long-term operational costs and improve infrastructure performance. In this work, an approach to optimal sensing in infrastructure systems is developed by combining probabilistic graphical models of infrastructure system behavior with the value of information (VoI) metric, which quantifies the utility of information gathering efforts (referred to generally as sensor placements) in supporting decision-making in uncertain systems. Computational methods are presented for the efficient evaluation and optimization of the VoI metric based on the probabilistic model structure. Various case studies on the application of this approach to managing infrastructure systems are presented, illustrating the flexibility of the basic method as well as various special cases for its practical implementation. Three main contributions are presented in this work. First, while the computational complexity of the VoI metric generally grows exponentially with the number of components, growth can be greatly reduced in systems with certain topologies (designated as cumulative topologies). Following from this, an efficient approach to VoI computation based on a cumulative topology and Gaussian random field model is developed and presented. Second, in systems with non-cumulative topologies, approximate techniques may be used to evaluate the VoI metric. This work presents extensive investigations of such systems and draws some general conclusions about the behavior of this metric. Third, this work presents several complete application cases for probabilistic modeling techniques and the VoI metric in supporting infrastructure system management. Case studies are presented in structural health monitoring, seismic risk mitigation, and extreme temperature response in urban areas. Other minor contributions included in this work are theoretical and empirical comparisons of the VoI with other sensor placement metrics and an extension of the developed sensor placement method to systems that evolve in time. Overall, this work illustrates how probabilistic graphical models and the VoI metric can allow for efficient sensor placement optimization to support infrastructure system management. Areas of future work to expand on the results presented here include the development of approximate, heuristic methods to support efficient sensor placement in non-cumulative system topologies, as well as further validation of the efficient sensing optimization approaches used in this work.
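To make the VoI metric concrete, the following sketch computes it for a single component with a binary damage state, a repair/do-nothing decision, and an imperfect sensor; the prior, losses and sensor likelihoods are assumed numbers chosen only to illustrate the prior-loss-minus-posterior-loss calculation, not values from the case studies:

import numpy as np

# States: 0 = intact, 1 = damaged. Actions: 0 = do nothing, 1 = repair.
prior = np.array([0.8, 0.2])                 # P(state), assumed
loss = np.array([[0.0, 100.0],               # loss[action, state]: ignoring damage is costly
                 [10.0, 10.0]])              # repairing costs 10 regardless of state
likelihood = np.array([[0.9, 0.1],           # P(observation | state), rows = state:
                       [0.2, 0.8]])          # an imperfect damage sensor (assumed)

def expected_loss(belief):
    """Minimum expected loss over actions for a given belief over states."""
    return min(loss[a] @ belief for a in range(loss.shape[0]))

def value_of_information(prior, likelihood):
    """VoI = expected loss when acting on the prior alone
           - expected loss when acting on the posterior after observing the sensor."""
    prior_loss = expected_loss(prior)
    post_loss = 0.0
    for obs in range(likelihood.shape[1]):
        p_obs = likelihood[:, obs] @ prior            # marginal P(observation)
        posterior = likelihood[:, obs] * prior / p_obs
        post_loss += p_obs * expected_loss(posterior)
    return prior_loss - post_loss

print(value_of_information(prior, likelihood))  # positive: the sensor is worth placing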
40

Using Decline Curve Analysis, Volumetric Analysis, and Bayesian Methodology to Quantify Uncertainty in Shale Gas Reserve Estimates

Gonzalez Jimenez, Raul 1988- 14 March 2013 (has links)
Probabilistic decline curve analysis (PDCA) methods have been developed to quantify uncertainty in production forecasts and reserves estimates. However, the application of PDCA in shale gas reservoirs is relatively new. Limited work has been done on the performance of PDCA methods when the available production data are limited. In addition, PDCA methods have often been coupled with Arps' equations, which might not be the optimal decline curve analysis (DCA) model to use, as newer DCA models for shale reservoirs have been developed. Also, decline curve methods are based on production data only and do not by themselves incorporate other types of information, such as volumetric data. My research objective was to integrate volumetric information with PDCA methods and DCA models to reliably quantify the uncertainty in production forecasts from hydraulically fractured horizontal shale gas wells, regardless of the stage of depletion. In this work, hindcasts of multiple DCA models coupled with different probabilistic methods were performed to determine the reliability of the probabilistic DCA methods. In a hindcast, only a portion of the historical data is matched; predictions are made for the remainder of the historical period and compared to the actual historical production. Most of the DCA models were well calibrated visually when used with an appropriate probabilistic method, regardless of the amount of production data available to match. Volumetric assessments, used as prior information, were incorporated to further enhance the calibration of production forecasts and reserves estimates when using Markov chain Monte Carlo (MCMC) as the PDCA method and the logistic growth DCA model. The proposed combination of the MCMC PDCA method, the logistic growth DCA model, and volumetric data provides an integrated procedure to reliably quantify the uncertainty in production forecasts and reserves estimates in shale gas reservoirs. Reliable quantification of uncertainty should yield more reliable expected values of reserves estimates, as well as more reliable assessment of upside and downside potential. This can be particularly valuable early in the development of a play, because decisions regarding continued development are based to a large degree on production forecasts and reserves estimates for early wells in the play.
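A compact sketch of the kind of workflow described, combining a logistic growth DCA model with a Metropolis MCMC sampler on synthetic data; the flat priors, proposal widths, noise level and data are assumptions made for illustration and do not reproduce the thesis's calibration:

import numpy as np

def logistic_growth(t, K, a, n):
    """Logistic growth decline-curve model for cumulative production: Q(t) = K*t**n / (a + t**n)."""
    return K * t**n / (a + t**n)

def metropolis_fit(t, q_obs, n_iter=20_000, sigma=50.0, seed=0):
    """Very small Metropolis sampler for (K, a, n) with flat priors on the positive orthant
    and a Gaussian likelihood with assumed noise scale sigma."""
    rng = np.random.default_rng(seed)
    theta = np.array([q_obs.max() * 2.0, 10.0, 1.0])   # initial guess for (K, a, n)
    def log_post(th):
        K, a, n = th
        if K <= 0 or a <= 0 or n <= 0:
            return -np.inf
        resid = q_obs - logistic_growth(t, K, a, n)
        return -0.5 * np.sum((resid / sigma) ** 2)
    lp = log_post(theta)
    samples = []
    for _ in range(n_iter):
        prop = theta + rng.normal(0, [100.0, 0.5, 0.02])  # symmetric random-walk proposal
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        samples.append(theta.copy())
    return np.array(samples[n_iter // 2:])               # discard burn-in

# synthetic example: 24 months of cumulative production data with assumed "true" parameters
t = np.arange(1, 25, dtype=float)
q_obs = logistic_growth(t, K=5000.0, a=30.0, n=1.1) + np.random.default_rng(1).normal(0, 50.0, t.size)
post = metropolis_fit(t, q_obs)
q10, q50, q90 = np.percentile(logistic_growth(48.0, *post.T), [10, 50, 90])
print(f"cumulative at month 48 -- 10th/50th/90th percentiles: {q10:.0f}, {q50:.0f}, {q90:.0f}")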
