11

A Study of Mobile Robot Algorithms with Sycamore

Prakash, Harish January 2016 (has links)
In this thesis we considered Sycamore, a simulation platform for mobile robot algorithms. We implemented several new features for Sycamore and tested them while studying three different algorithms for achieving gathering by robots with limited visibility: a well-known deterministic algorithm, a simple new probabilistic algorithm, and a combination of the two. The deterministic algorithm is known to achieve exact gathering when there are no faults; we tested it for the first time in the presence of crashes and observed interesting and unexpected behaviors. We then performed extensive simulations with the probabilistic solution to identify the cause of an unexpectedly high rate of success; these simulations helped us identify the relation between the rate of success and the initial configuration. Finally, we combined the two into a hybrid solution. This work resulted in improvements to Sycamore, which can now be better employed to study mobile robot algorithms, as well as in empirical observations that lead to new theoretical problems to be investigated.
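To make the setting concrete, below is a minimal Python sketch of one synchronous round of limited-visibility gathering. The move-toward-the-centroid rule and the coin-flip gate on movement are illustrative assumptions standing in for the deterministic and probabilistic algorithms studied in the thesis; this is not Sycamore's code.

```python
import numpy as np

def gathering_round(positions, visibility, p_move=0.7, step=0.5, rng=None):
    """One synchronous round of a limited-visibility gathering rule.

    Illustrative only: each robot moves toward the centroid of the robots
    it can see (itself included); in the probabilistic variant it does so
    only with probability p_move. Setting p_move=1.0 recovers a purely
    deterministic rule.
    """
    rng = rng if rng is not None else np.random.default_rng()
    new_positions = positions.copy()
    for i in range(len(positions)):
        # Limited visibility: robot i sees only robots within `visibility`.
        dists = np.linalg.norm(positions - positions[i], axis=1)
        target = positions[dists <= visibility].mean(axis=0)
        if rng.random() < p_move:  # probabilistic gate on moving at all
            direction = target - positions[i]
            norm = np.linalg.norm(direction)
            if norm > 1e-9:
                new_positions[i] += min(step, norm) * direction / norm
    return new_positions

# Example: 10 robots spread on a line with visibility radius 2.0.
pts = np.column_stack([np.arange(10, dtype=float), np.zeros(10)])
for _ in range(200):
    pts = gathering_round(pts, visibility=2.0)
```

One can, for instance, mark some robots as crashed (never moving) to explore fault scenarios of the kind the thesis examines.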
12

Dense subgraph mining in probabilistic graphs

Esfahani, Fatemeh 09 December 2021 (has links)
In this dissertation we consider the problem of mining cohesive (dense) subgraphs in probabilistic graphs, where each edge has a probability of existence. Mining probabilistic graphs has become a focus of interest in analyzing many real-world datasets, such as social, trust, communication, and biological networks, due to the intrinsic uncertainty present in them. Studying cohesive subgraphs can reveal important information about the connectivity, centrality, and robustness of the network, with applications in areas such as bioinformatics and social networks. In deterministic graphs, there exist various definitions of cohesive substructures, including cliques, quasi-cliques, k-cores, and k-trusses. In this regard, k-core and k-truss decompositions are popular tools for finding cohesive subgraphs. In deterministic graphs, a k-core is the largest subgraph in which each vertex has at least k neighbors, and a k-truss is the largest subgraph whose edges are each contained in at least k triangles (or k-2 triangles, depending on the definition). The k-core and k-truss decompositions of deterministic graphs have been thoroughly studied in the literature. In the probabilistic context, however, the computation is challenging, and state-of-the-art approaches do not scale to large graphs. The main challenge is the efficient computation of the tail probabilities of vertex degrees and of the triangle counts of edges in probabilistic graphs. We employ a special version of the central limit theorem (CLT) to obtain these tail probabilities efficiently. Based on our CLT approach, we propose peeling algorithms for the core and truss decomposition of a probabilistic graph that scale to very large graphs and offer significant improvements over state-of-the-art approaches. Moreover, we propose a second algorithm for probabilistic core decomposition that can handle graphs not fitting in memory by processing them sequentially, one vertex at a time. For truss decomposition, we design a second method based on progressively tightening an estimate of the truss value of each edge, using h-index computation and a novel use of dynamic programming. We provide extensive experimental results to show the efficiency of the proposed algorithms. Another contribution of this thesis is mining cohesive subgraphs using the recent notion of nucleus decomposition introduced by Sariyuce et al. Nucleus decomposition is based on higher-order structures such as cliques nested in other cliques, and it can reveal interesting subgraphs that are missed by core and truss decompositions. In this dissertation, we present nucleus decomposition for probabilistic graphs. The major questions we address are: How can nucleus decomposition be meaningfully defined in probabilistic graphs? How hard is it to compute? Can we devise efficient algorithms for exact or approximate nucleus decomposition in large graphs? We present three natural definitions of nucleus decomposition in probabilistic graphs: local, global, and weakly-global. We show that the local version is in PTIME, whereas the global and weakly-global versions are #P-hard and NP-hard, respectively. We present an efficient and exact dynamic programming approach for the local case, together with statistical approximations that scale to larger datasets without much loss of accuracy. For the global and weakly-global decompositions we complement our intractability results by proposing efficient algorithms that give approximate solutions based on search-space pruning and Monte Carlo sampling. Extensive experiments show the scalability and efficiency of our algorithms. Compared to probabilistic core and truss decompositions, nucleus decomposition performs significantly better in terms of density and clustering metrics. / Graduate
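As a rough illustration of the CLT technique, the sketch below approximates the tail probability of a vertex degree in a probabilistic graph: the degree is a sum of independent Bernoulli edge indicators, so a normal approximation with a continuity correction applies. This is a minimal sketch of the general idea, assuming independent edges; the dissertation's exact estimator may differ in its details.

```python
import math

def degree_tail_prob(edge_probs, k):
    """Approximate Pr[deg(v) >= k] for a vertex whose incident edges exist
    independently with the given probabilities.

    The degree is a sum of independent Bernoulli variables, so a
    Lyapunov-style CLT justifies a normal approximation (with a
    continuity correction) to its tail.
    """
    mu = sum(edge_probs)                          # E[deg(v)]
    var = sum(p * (1.0 - p) for p in edge_probs)  # Var[deg(v)]
    if var == 0.0:
        return 1.0 if mu >= k else 0.0
    z = (k - 0.5 - mu) / math.sqrt(var)           # continuity-corrected z-score
    return 0.5 * math.erfc(z / math.sqrt(2.0))    # 1 - Phi(z)

# A vertex with 40 incident edges, each existing with probability 0.3:
print(degree_tail_prob([0.3] * 40, 15))  # ~ Pr[Binomial(40, 0.3) >= 15]
```

A peeling algorithm can call such a routine to decide whether a vertex still qualifies for the current probabilistic core as vertices are removed one by one.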
13

Development and Benchmarking of RAVEN with TRACE for use in Dynamic Probabilistic Risk Assessment

Boniface, Kendall January 2021 (has links)
The identification of potential accident conditions for a nuclear power plant requires a systematic evaluation of postulated hazards, and accurate methods for predicting the behaviour of the system if these hazards were to occur. It is particularly important to identify scenarios which carry severe consequences (e.g., large radioactive releases to the environment), even if the conditions have a low probability of occurrence, so that preventative measures can be implemented. Dynamic probabilistic risk assessment (DPRA) is a field of analysis that aims to determine the failure pathways of complex systems while simultaneously analyzing the time-evolution of the proposed accident. By studying the dynamics of the system, DPRA methods are capable of analyzing the impact of impaired or late equipment response, human actions during the transient, and the interrelationships between different systems and failures. This approach promotes realistic predictions of the complex response of the system under accident conditions, and allows the dynamics of the accident progression to unfold with timing that is not pre-determined by an analyst, thereby removing potential user bias from the results. The work outlined in this thesis was undertaken to demonstrate the DPRA software platform called RAVEN, and to leverage its application in the near-future probabilistic assessment of accident conditions applied to CANDU reactor simulation models. Features of the work include:
• Demonstration of the capability of RAVEN to produce predictable results using the dynamic event tree (DET) approach;
• The development of a code interface to allow RAVEN to drive DET simulations of TRACE simulation models; and
• Demonstration of the capability of the developed RAVEN-TRACE interface to produce predictable results for systems that are well understood.
/ Thesis / Master of Applied Science (MASc)
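For a sense of how a dynamic event tree grows, here is a conceptual Python sketch of the branching bookkeeping: each stochastic event splits every live scenario into success and failure branches whose probabilities multiply, and improbable branches are pruned. RAVEN's actual DET machinery, which couples each branch to a full system-code run such as TRACE, is far richer; the names, events, and cutoff below are all illustrative.

```python
from dataclasses import dataclass, field

@dataclass
class Branch:
    time: float          # time of the last branching event
    prob: float          # cumulative probability of this scenario
    history: list = field(default_factory=list)

def expand_det(branch_points, prob_cutoff=1e-4):
    """Expand a dynamic event tree over a fixed list of branching points.

    branch_points: list of (time, event_name, failure_probability).
    Each live branch splits into an 'ok' and a 'fail' child; children
    below prob_cutoff are pruned, as DET codes do to stay tractable.
    """
    frontier = [Branch(time=0.0, prob=1.0)]
    for t, event, p_fail in branch_points:
        next_frontier = []
        for b in frontier:
            for outcome, p in ((f"{event}:ok", 1 - p_fail),
                               (f"{event}:fail", p_fail)):
                child = Branch(t, b.prob * p, b.history + [outcome])
                if child.prob >= prob_cutoff:
                    next_frontier.append(child)
        frontier = next_frontier
    return frontier

# Two demands: a valve at 10 s (5% failure) and a pump at 50 s (10% failure).
for b in expand_det([(10.0, "valve", 0.05), (50.0, "pump", 0.10)]):
    print(f"{b.prob:.4f}", b.history)
```

In a real DET driver, each branch would restart the thermal-hydraulic simulation from the parent's state rather than merely record a label.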
14

Analysis and Planning of Power Transmission System Subject to Uncertainties in the Grid

Aryal, Durga 01 February 2019 (has links)
Power transmission systems frequently experience new power flow patterns due to several factors that increase uncertainty in the system. For instance, load shape uncertainty, uncertainty due to the penetration of renewable sources, changing standards, and energy de-regulation threaten the reliability and security of power transmission systems. This demands more rigorous analysis and planning of power transmission systems. Stability issues in power transmission systems are more pronounced with the penetration of utility-scale photovoltaic (PV) sources. Synchronous generators provide inertia that helps damp the oscillations that arise from fluctuations in the power system. Therefore, as PV generators replace conventional synchronous generators, power transmission systems become vulnerable to these abnormalities. In this thesis, we study the effect of the reduced inertia due to the penetration of utility-scale PV on the transient stability of power transmission systems. In addition, the effect of an increased PV penetration level during normal operating conditions is also analyzed. The latter study illustrates that the PV penetration level and the placement of PV sources play crucial roles in determining the stability of power transmission systems. Given increasing uncertainties in power transmission systems, there is a need to seek an alternative to the deterministic planning approach, because it inherently lacks the capability to cover all the uncertainties. One practical alternative is the probabilistic planning approach, in which a wide variety of scenarios is analyzed by considering the probability of occurrence of each scenario and the probability of contingencies; the severity of the contingencies and the risk associated with each planning practice are then calculated. However, due to the lack of techniques and tools to select a wide variety of scenarios along with their probabilities of occurrence, the probabilistic planning approach has not been implemented in real-world power transmission systems. This thesis presents a technique that can select a wide variety of scenarios, along with their probabilities of occurrence, to facilitate probabilistic planning in the Electric Reliability Council of Texas (ERCOT) system. / Master of Science / The reliability of power transmission systems is threatened by the increasing uncertainties arising from the penetration of renewable energy sources, load growth, energy de-regulation, and changing standards. Stability issues have become more prevalent than in the past because increasing load growth raises the demand for reactive power. Several researchers have studied the impact of increased load growth and increased penetration of renewables on the dynamic stability of the distribution system; however, far less emphasis has been given to the power transmission system. This thesis presents a transient stability analysis of power transmission systems under overloading conditions. Our study also facilitates identification of weak areas of the transmission system during overloading. In addition, the impact on voltage stability of replacing conventional synchronous generators with photovoltaics (PV) is analyzed. With increasing uncertainties in transmission systems, it is necessary to carefully analyze a wide variety of scenarios while planning the system. The current approach to transmission planning, i.e., the deterministic approach, does not sufficiently cover all the uncertainties. This has created the need for a probabilistic transmission planning approach, in which the overall system is planned based on the analysis of a wide variety of scenarios; by considering the probability of occurrence of each scenario, the probability and severity of contingencies and the risk associated with each planning practice are calculated. However, there is no well-established approach capable of selecting a wide variety of scenarios based on their probability of occurrence, and due to this limitation the probabilistic approach is not widely implemented in real-world power transmission systems. To address this issue, this thesis presents a new technique, based on K-means clustering, that selects scenarios based on their probability of occurrence.
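A minimal sketch of the clustering idea, assuming scikit-learn is available: cluster historical hourly operating points with K-means, take each cluster centroid as a representative scenario, and use the cluster's share of the data as its probability of occurrence. The feature set and data below are synthetic placeholders, not the ERCOT inputs used in the thesis.

```python
import numpy as np
from sklearn.cluster import KMeans

def select_scenarios(hourly_profiles, n_scenarios=10, seed=0):
    """Reduce many hourly operating points to a few representative
    scenarios, each with an occurrence probability."""
    km = KMeans(n_clusters=n_scenarios, n_init=10, random_state=seed)
    labels = km.fit_predict(hourly_profiles)
    # Probability of a scenario = fraction of hours in its cluster.
    probs = np.bincount(labels, minlength=n_scenarios) / len(labels)
    return km.cluster_centers_, probs

# 8760 hours x 3 features (e.g., load, wind output, solar output), synthetic:
rng = np.random.default_rng(0)
profiles = rng.normal(size=(8760, 3))
scenarios, probs = select_scenarios(profiles)
print(probs.sum())  # -> 1.0; each scenario carries its own weight
```

Each (scenario, probability) pair can then seed a contingency analysis, and the resulting severities, weighted by these probabilities, yield the risk of a planning option.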
15

Variability and Uncertainty in Risk Estimation for Brownfields Redevelopment

Olsen, Arne E. 31 July 2001 (has links)
Various methods can be used to estimate the human health risk associated with exposure to contaminants at brownfields facilities. The deterministic method has been the standard practice, but the use of probabilistic methods is increasing. Contaminant data for non-carcinogens and carcinogens from 21 brownfields sites in Pennsylvania were collected and compiled. These were used to evaluate the performance of the deterministic method and several probabilistic methods for assessing exposure and risk in relation to variability and uncertainty in the data set and input parameters. The probabilistic methods were based (a) entirely on Monte Carlo-simulated input parameter distribution functions, (b) on a combination of some of these functions and fixed parameter values, or (c) on a parameter distribution function. These methods were used to generate contaminant intake doses, defined as the 90th, 95th, or 99.9th percentile of their estimated output distribution, for the principal human exposure routes. These values were then compared with the corresponding point values estimated by the deterministic method. For all exposure routes, the probabilistic intake dose estimates taken as the 90th and 95th percentiles of the output distribution were not markedly different from the deterministic values or from each other. The opposite was generally the case for the estimates at the 99.9th cutoff percentile, especially for the Monte Carlo-based methods. Increasing the standard deviation of the input contaminant concentration tended to produce higher intake dose estimates for all estimation methods. In pairwise comparisons with the deterministic estimates, this trend differed significantly only for the probabilistic intake doses estimated as the 99.9th percentile of the output distribution. Taken together, these results did not indicate clear and definitive advantages of probabilistic methods over the deterministic method for exposure and risk assessment in brownfields redevelopment. They support using the tiered system for environmental risk assessment at any particular brownfields facility. / Master of Science
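To illustrate the probabilistic approach, the sketch below Monte Carlo-samples a standard chronic-daily-intake expression for a single exposure route (soil ingestion) and reads off the 90th, 95th, and 99.9th percentiles for comparison with a deterministic point estimate. All input distributions are illustrative assumptions, not the Pennsylvania site data used in the thesis.

```python
import numpy as np

def intake_percentiles(n=100_000, seed=0):
    """Monte Carlo chronic daily intake, CDI = C*IR*EF*ED / (BW*AT),
    for incidental soil ingestion. Distributions are placeholders."""
    rng = np.random.default_rng(seed)
    C  = rng.lognormal(mean=3.0, sigma=1.0, size=n)   # contaminant, mg/kg soil
    IR = rng.lognormal(mean=4.6, sigma=0.4, size=n)   # soil ingestion, mg/day
    EF = rng.uniform(200.0, 350.0, size=n)            # exposure freq., days/year
    BW = rng.normal(70.0, 10.0, size=n).clip(min=40)  # body weight, kg
    ED, AT = 30.0, 30.0 * 365.0                       # duration (yr), averaging time (days)
    cdi = C * 1e-6 * IR * EF * ED / (BW * AT)         # mg/(kg*day); 1e-6 converts mg/kg -> mg/mg
    return np.percentile(cdi, [90.0, 95.0, 99.9])

p90, p95, p999 = intake_percentiles()
print(p90, p95, p999)  # the 99.9th percentile is the most sensitive to input spread
```

Replacing each sampled distribution with a fixed upper-bound value reproduces the deterministic method the thesis compares against.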
16

5th International Probabilistic Workshop

10 December 2008 (has links) (PDF)
These are the proceedings of the 5th International Probabilistic Workshop. Even though the 5th anniversary of a conference might not be of great importance, it is interesting to note the development of this probabilistic conference. Originally, the series started as the 1st and 2nd Dresdner Probabilistic Symposium, which were launched to present research and applications dealt with mainly at Dresden University of Technology. Since then, the conference has grown into an internationally recognised conference dealing with research on, and applications of, probabilistic techniques, mainly in the field of structural engineering. Other topics, such as ship safety and natural hazards, have also been dealt with. Whereas the first conferences in Dresden included about 12 presentations each, the conference in Ghent has attracted nearly 30 presentations. Moving from Dresden to Vienna (University of Natural Resources and Applied Life Sciences) to Berlin (Federal Institute for Material Research and Testing) and finally to Ghent, the conference has steadily evolved towards a truly international level. This can be seen in the language used: the first two conferences were held entirely in German; during the conference in Berlin the change from German to English was especially apparent, as some presentations were given in German and others in English; now, in Ghent, all papers will be presented in English. Participants now come not only from Europe but also from other continents. Although the conference will move back to Germany next year (2008), to Darmstadt, the international concept will remain, since so much work in the field of probabilistic safety evaluation is carried out internationally. In two years (2009) the conference will move to Delft, The Netherlands, and in 2010 it will probably be held in Szczecin, Poland. Coming back to the present: the editors wish all participants a successful conference in Ghent.
17

6th International Probabilistic Workshop - 32. Darmstädter Massivbauseminar

10 December 2008 (has links) (PDF)
These are the proceedings of the 6th International Probabilistic Workshop, formerly known as the Dresden Probabilistic Symposium or International Probabilistic Symposium. The workshop was held twice in Dresden, then moved to Vienna, Berlin, Ghent, and finally to Darmstadt in 2008. All of the conference cities have their specialities, but Darmstadt has a very special one: chemical element 110, Darmstadtium, is named after the city, and there are only very few cities worldwide after which a chemical element is named. The high atomic number of Darmstadtium indicates that much research is still required and carried out. This is also true for probabilistic safety concepts in engineering. Although their history can be traced back nearly 90 years, a long way still remains to practical application. This is not a disadvantage: just as research chemists strive to discover the properties of new elements, we may substantially advance the properties of structures by applying new probabilistic techniques. (Excerpt from the preface)
18

Identifying All Preorders on the Subdistribution Monad / 劣確率分布モナド上の全ての前順序の特定

Sato, Tetsuya 23 March 2015 (has links)
Kyoto University / 0048 / New system, doctoral program / Doctor of Science / Doctoral degree No. 18771 / Science doctorate No. 4029 / New system||Sci||1580 (University Library) / 31722 / Division of Mathematics and Mathematical Sciences, Graduate School of Science, Kyoto University / (Chief examiner) Professor Masahito Hasegawa, Professor Akio Tamagawa, Associate Professor Kazushige Terui / Qualified under Article 4, Paragraph 1 of the Degree Regulations / Doctor of Science / Kyoto University / DFAM
19

Theoretical investigation of RAM-based neural networks

Adeodato, Paulo Jorge Leitao January 1997 (has links)
No description available.
20

Integrating Probabilistic Reasoning with Constraint Satisfaction

Hsu, Eric 09 June 2011 (has links)
We hypothesize and confirm that probabilistic reasoning is closely related to constraint satisfaction at a formal level, and that this relationship yields effective algorithms for guiding constraint satisfaction and constraint optimization solvers. By taking a unified view of probabilistic inference and constraint reasoning in terms of graphical models, we first relate a number of formalisms and techniques between the two areas. For instance, we characterize search and inference in constraint reasoning as summation and multiplication (or disjunction and conjunction) in the probabilistic space; necessary but insufficient consistency conditions for solutions to constraint problems (like arc-consistency) mirror approximate objective functions over probability distributions (like the Bethe free energy); and the polytope of feasible points for marginal probabilities represents the linear relaxation of a particular constraint satisfaction problem. While such insights synthesize an assortment of existing formalisms from varied research communities, they also yield an entirely novel set of “bias estimation” techniques that contribute to a growing body of research on applying probabilistic methods to constraint problems. In practical terms, these techniques estimate the percentage of solutions to a constraint satisfaction or optimization problem wherein a given variable is assigned a given value. By devising search methods that incorporate such information as heuristic guidance for variable and value ordering, we are able to outperform existing solvers on problems of interest from constraint satisfaction and constraint optimization, as represented here by the SAT and MaxSAT problems. Further, for MaxSAT we present an “equivalent transformation” process that normalizes the weights in constraint optimization problems, in order to encourage pruning of the search tree during branch-and-bound search. To control such computationally expensive processes, we determine promising situations for using them throughout the course of an individual search process. We accomplish this using a reinforcement learning-based control module that seeks a principled balance between the exploration of new strategies and the exploitation of existing experience.
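To ground the notion of a bias, the sketch below computes it exactly by brute force for a tiny CNF formula: the fraction of satisfying assignments in which each variable is True. The thesis estimates these quantities with message-passing techniques rather than enumeration; enumeration is shown here only as the ground truth such estimators approximate.

```python
from itertools import product

def solution_biases(clauses, n_vars):
    """Exact bias of each variable: the fraction of satisfying assignments
    in which it is True. Clauses use DIMACS-style signed literals, so
    3 means x3 and -3 means (not x3). Exponential in n_vars by design."""
    counts = [0] * (n_vars + 1)
    n_solutions = 0
    for bits in product([False, True], repeat=n_vars):
        satisfied = all(any(bits[abs(lit) - 1] == (lit > 0) for lit in clause)
                        for clause in clauses)
        if satisfied:
            n_solutions += 1
            for v in range(1, n_vars + 1):
                counts[v] += bits[v - 1]
    if n_solutions == 0:
        return None  # unsatisfiable: no biases to report
    return {v: counts[v] / n_solutions for v in range(1, n_vars + 1)}

# (x1 or x2) and (not x1 or x3): x2 and x3 are True in 3 of 4 solutions.
print(solution_biases([[1, 2], [-1, 3]], 3))  # {1: 0.5, 2: 0.75, 3: 0.75}
```

A bias-guided solver would branch first on the variable whose bias is furthest from 0.5, assigning it its more popular value.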
