  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
381

Resonances for graph directed Markov systems, and geometry of infinitely generated dynamical systems

Hille, Martial R. January 2009
Thesis (Ph.D.) - University of St Andrews, January 2009.
382

Algorithmic performance of large-scale distributed networks: a spectral method approach

Gkantsidis, Christos. January 2006
Thesis (Ph. D.)--Computing, Georgia Institute of Technology, 2006. / Mihail, Milena, Committee Chair ; Ammar, Mostafa, Committee Member ; Dovrolis, Constantinos, Committee Member ; Faloutsos, Michalis, Committee Member ; Zegura, Ellen, Committee Member.
383

Four studies in geometry of biased graphs

Flórez, Rigoberto. January 2005
Thesis (Ph. D.)--State University of New York at Binghamton, Department of Mathematical Sciences, 2005. / Includes bibliographical references.
384

Limit theorems for random Euclidean graphs

Shank, Nathan B. January 2006
Thesis (Ph. D.)--Lehigh University, 2006. / Includes vita. Includes bibliographical references (leaf 92).
385

QoS multimedia multicast routing: a component-based primal-dual approach

Hussain, Faheem A. January 2006
Thesis (M.S.)--Georgia State University, 2006. / Title from title screen. Alexander Zelikovsky, committee chair; Anu Bourgeois, Saeid Belkasim, committee members. Electronic text (59 p. : ill. (some col.)) : digital, PDF file. Description based on contents viewed June 28, 2007. Includes bibliographical references (p. 58-59).
386

Modeling and Analysis of Manufacturing Systems with Multiple-Loop Structures

Zhang, Zhenyu; Gershwin, Stanley B. 01 1900
Kanban and Constant Work-In-Process (CONWIP) control methods are designed to impose tight controls over inventory, while providing a satisfactory production rate. This paper generalizes systems with kanban or CONWIP control as assembly/disassembly networks with multiple-loop structures. We present a stochastic mathematical model which integrates the information control flows into material flows. Graph theory is used to analyze the multiple-loop structures. An efficient analytical algorithm is developed for evaluating the expected production rate and inventory levels. The performance of the algorithm is reported in terms of accuracy, reliability and speed. / Singapore-MIT Alliance (SMA)
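As a small illustration of the graph-theoretic view described in this abstract (my own sketch, not the paper's algorithm), a kanban- or CONWIP-controlled line can be represented as a directed graph whose directed cycles are exactly the closed control loops whose token and inventory populations are conserved. The three-machine example and the edge labels below are hypothetical.

    # Minimal sketch: enumerate the closed loops of a controlled production line.
    import networkx as nx

    G = nx.DiGraph()
    G.add_edges_from([
        ("M1", "M2"), ("M2", "M3"),   # material flow through the line
        ("M2", "M1"),                  # hypothetical stage-1 kanban card loop (information flow)
        ("M3", "M1"),                  # hypothetical line-wide CONWIP card loop (information flow)
    ])

    # Each directed cycle is one closed loop with a conserved population of
    # cards plus work-in-process -- the multiple-loop structure analyzed above.
    for loop in nx.simple_cycles(G):
        print("closed loop:", " -> ".join(loop))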
387

Algorithms for the generation of the transversal hypergraph

Τσεκουρώνας, Ιωάννης Χ. 20 October 2010
388

The Game of Light: A Graph Theoretical Approach

Durig, Rebekah Libby 01 August 2017
In the Game of Light, as formulated in "Harmonic Evolutions" (J. Kocik, 2007), there is a definition of dynamic graphs and a thorough explanation of how to find the structure of the digraph that shows the changing of states during the game. This thesis furthers that research in two directions: first, by exploring what happens when there are more than two vertex states, expanding the state space to any cyclic group; and second, by attempting to identify families of graphs and describe their state digraphs using only the number of vertex states. To further both of these goals, two programs were written: a calculator that computes the digraph structure, and a visualization tool that automates the Game of Light, allowing users to input graphs with simple point-and-click commands and easily see how graphs evolve. Finally, about one hundred graphs were evaluated using the calculator, and the resulting structures were recorded.
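The state-evolution digraph the thesis computes can be illustrated with a short sketch. The code below is my own illustration, not the thesis calculator; it assumes the update rule "each vertex's next state is the sum of its neighbours' states modulo n" (a natural generalization of the two-state game to the cyclic group Z_n), and the function name transition_digraph and the triangle example are made up for the demonstration.

    # Sketch: build the state-transition digraph of a Game-of-Light-style evolution.
    from itertools import product

    def transition_digraph(adj, n):
        """adj: dict vertex -> list of neighbours; n: number of vertex states (Z_n)."""
        vertices = sorted(adj)
        successor = {}
        for state in product(range(n), repeat=len(vertices)):
            s = dict(zip(vertices, state))
            # Next state of v = sum of its neighbours' current states, mod n.
            nxt = tuple(sum(s[u] for u in adj[v]) % n for v in vertices)
            successor[state] = nxt       # every global state has exactly one successor
        return successor

    # Example: a triangle with three vertex states (27 global states).
    triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
    digraph = transition_digraph(triangle, 3)
    print(len(digraph), "states; sample transition:", next(iter(digraph.items())))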
389

Improving the Analysis of Complex Networks Using Node-Based Resilience Measures

Matta, John 01 August 2018
This dissertation examines various facets of the analysis of complex networks. In the first part, we study the resilience of networks by examining various attempts to quantify resilience. Some of the measures studied are vertex attack tolerance, integrity, tenacity, toughness and scattering number. We show empirically that, although these measures are NP-hard to calculate, they can be approximated to within reasonable bounds by a novel heuristic called Greedy-BC that relies on the graph-theoretic measure betweenness centrality. After verifying the accuracy of Greedy-BC, we test it on several well-known classes of networks: Barabási-Albert networks, HOTNets and PLODs. Experiments determine that random-degree PLOD nets have the highest resilience, perhaps because of their random nature.

The second part concerns clustering. We use the resilience measures and the Greedy-BC heuristic from the first part to partition graphs. Many experiments are conducted with a generalized algorithm, NBR-Clust, using all discussed resilience measures and expanding the data to a wide variety of real-life and synthetically generated networks. A parametrized resilience measure, beta-VAT, is used to detect noise, or outliers, in noisy data. Results are extended to another facet of network analysis, cluster overlap: attack sets of NBR-Clust are found to contain overlaps with high probability, and an algorithm is developed to identify them.

One remaining problem with NBR-Clust is time complexity. The usefulness of the method is limited by the slowness of Greedy-BC, and particularly by the slowness of computing betweenness centrality. In an extensive series of experiments, we test several methods for approximating and speeding up betweenness centrality calculations, and are able to reduce the time to cluster a 10,000-node graph from approximately two days with the original method of calculation to a few minutes. In another exploration of the algorithmic aspects of resilience, we attempt to generalize some of the results obtained to hypergraphs. It is found that resilience measures like VAT and algorithms like Greedy-BC transfer well to a hypergraph representation.

The final part of the dissertation reviews applications of the new clustering method. First, NBR-Clust is used to cluster data on autism spectrum disorders; because classifications of these disorders are vague and the data noisy, the clustering properties of NBR-Clust are useful, and the results will hopefully lead to a better understanding of how to classify autism spectrum disorders. Second, we use NBR-Clust to examine gene assay data with the hope of identifying genes that confer resistance to powdery mildew disease in certain species of grapevines.
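As a rough illustration of the Greedy-BC idea described above (a sketch based only on this abstract, not the dissertation's implementation), the following Python/NetworkX snippet repeatedly removes the node of highest betweenness centrality until the largest connected component falls below a chosen fraction of the original graph. The function name greedy_bc_attack_set and the stopping rule are assumptions made for the example.

    import networkx as nx

    def greedy_bc_attack_set(G, target_fraction=0.3):
        """Remove highest-betweenness nodes until the largest connected
        component holds at most target_fraction of the original nodes.
        Returns the list of removed nodes (a candidate attack set)."""
        H = G.copy()
        n = G.number_of_nodes()
        attack_set = []
        while max(len(c) for c in nx.connected_components(H)) > target_fraction * n:
            bc = nx.betweenness_centrality(H)
            v = max(bc, key=bc.get)          # node carrying the most shortest paths
            attack_set.append(v)
            H.remove_node(v)
        return attack_set

    # Example on a Barabasi-Albert network, one of the classes studied above.
    G = nx.barabasi_albert_graph(200, 2, seed=42)
    removed = greedy_bc_attack_set(G, target_fraction=0.3)
    print("candidate attack set size:", len(removed))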
390

Scalable algorithms for misinformation prevention in social networks

Simpson, Michael 19 December 2018
This thesis investigates several problems in social network analysis related to misinformation prevention, with an emphasis on finding solutions that can scale to massive online networks. In particular, it considers two problem formulations related to the spread of misinformation in a network, covering the elimination of existing misinformation and the prevention of future dissemination of misinformation. Additionally, a comprehensive comparison of several algorithms for the feedback arc set (FAS) problem is presented in order to identify an approach that is scalable and computes a lightweight solution. The feedback arc set problem is of particular interest since several notable problems in social network analysis, including the elimination of existing misinformation, crucially rely on computing a small FAS as a preliminary.

The elimination of existing misinformation is modelled as a graph searching game. The problem can be summarized as constructing a search strategy that will leave the graph clear of any misinformation at the end of the searching process in as few steps as possible. Despite the problem being NP-hard, even on directed acyclic graphs, this thesis presents an efficient approximation algorithm and provides new experimental results that compare the performance of the approximation algorithm to the lower bound on several large online networks. In particular, new scalability goals are achieved through careful algorithmic engineering and a highly optimized pre-processing step.

The minimum feedback arc set problem is an NP-hard problem on graphs that seeks a minimum set of arcs which, when removed from the graph, leave it acyclic. A comprehensive comparison of several approximation algorithms for computing a minimum feedback arc set is presented with the goal of comparing the quality of the solutions and the running times. Additionally, careful algorithmic engineering is applied to multiple algorithms in order to improve their scalability. In particular, two optimized approaches (one greedy and one randomized) achieve simultaneously strong performance for both feedback arc set size and running time. The experiments compare the performance of a wide range of algorithms on a broad selection of large online networks and reveal that the optimized greedy and randomized implementations outperform the other approaches by simultaneously computing a feedback arc set of competitive size and scaling to web-scale graphs with billions of vertices and tens of billions of arcs. The algorithms considered are also extended to the probabilistic case, in which arcs are realized with some fixed probability, and a detailed experimental comparison is provided.

Finally, the problem of preventing the spread of misinformation propagating through a social network is considered. In this problem, a "bad" campaign starts propagating from a set of seed nodes in the network and the notion of a limiting (or "good") campaign is used to counteract the effect of misinformation. The goal is to identify a set of $k$ users that need to be convinced to adopt the limiting campaign so as to minimize the number of people that adopt the "bad" campaign at the end of both propagation processes. RPS (Reverse Prevention Sampling), an algorithm that provides a scalable solution to the misinformation prevention problem, is presented.
The theoretical analysis shows that RPS runs in $O((k + l)(n + m)(\frac{1}{1 - \gamma}) \log n / \epsilon^2 )$ expected time and returns a $(1 - 1/e - \epsilon)$-approximate solution with at least $1 - n^{-l}$ probability (where $\gamma$ is a typically small network parameter). The time complexity of RPS substantially improves upon the previously best-known algorithms, which run in time $\Omega(m n k \cdot POLY(\epsilon^{-1}))$. Additionally, an experimental evaluation of RPS on large datasets is presented, showing that RPS outperforms the state-of-the-art solution by several orders of magnitude in terms of running time. This demonstrates that misinformation prevention can be made practical while still offering strong theoretical guarantees. / Graduate
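The ordering-based idea underlying many simple FAS heuristics can be sketched in a few lines. This is a textbook baseline written under my own assumptions, not one of the optimized greedy or randomized implementations benchmarked in the thesis: for any fixed vertex ordering, the arcs pointing backward form a feedback arc set, as do the arcs pointing forward, so keeping the smaller of the two sets yields a FAS of at most half the arcs.

    def order_based_fas(vertices, arcs):
        """For a fixed ordering, the backward arcs form a FAS and the forward
        arcs form a FAS; return the smaller of the two sets."""
        position = {v: i for i, v in enumerate(vertices)}
        backward = [(u, v) for (u, v) in arcs if position[u] > position[v]]
        forward = [(u, v) for (u, v) in arcs if position[u] < position[v]]
        # No directed cycle can consist solely of forward arcs or solely of
        # backward arcs, so removing either set leaves the graph acyclic.
        return min(backward, forward, key=len)

    # Toy digraph with two directed cycles: 1->2->3->1 and 2->3->4->2.
    arcs = [(1, 2), (2, 3), (3, 1), (3, 4), (4, 2)]
    print(order_based_fas([1, 2, 3, 4], arcs))   # e.g. [(3, 1), (4, 2)]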
