1

Geometric Approximation Algorithms - A Summary Based Approach

Raghvendra, Sharathkumar January 2012 (has links)
Large scale geometric data is ubiquitous. In this dissertation, we design algorithms and data structures to process large scale geometric data efficiently. We design algorithms for some fundamental geometric optimization problems that arise in motion planning, machine learning and computer vision.

For a stream S of n points in d-dimensional space, we develop (single-pass) streaming algorithms for maintaining extent measures such as the minimum enclosing ball and diameter. Our streaming algorithms have a work space that is polynomial in d and sub-linear in n. For the problems of computing the diameter, width and minimum enclosing ball of S, we obtain lower bounds on the worst-case approximation ratio of any streaming algorithm that uses space polynomial in d. On the positive side, we design a summary called the blurred ball cover and use it for answering approximate farthest-point queries and maintaining an approximate minimum enclosing ball and diameter of S. We describe a streaming algorithm for maintaining a blurred ball cover whose working space is linear in d and independent of n.

For a set P of k pairwise-disjoint convex obstacles in 3 dimensions, we design algorithms and data structures for computing a Euclidean shortest path between a source s and a destination t. The running time of our algorithm is linear in n, and the size and query time of our data structure are independent of n. We follow a summary-based approach, i.e., we quickly compute a small sketch Q of P whose size is independent of n and then compute approximate shortest paths with respect to Q.

For d-dimensional point sets A and B, |A| = |B| = n, and a parameter ε > 0, we give an algorithm to compute an ε-approximate minimum weight perfect matching of A and B under a distance function d(·, ·) in time O(n^1.5 τ(n)); here τ(n) is the query/update time of a dynamic weighted nearest neighbor structure under d(·, ·). When A and B are point sets from a bounded integer grid, for the L1 and L∞ norms, our algorithm computes a minimum weight perfect matching of A and B in time O(n^1.5). Our algorithm also extends to a generalization of matching called the transportation problem.

We also present an O(n polylog n) time algorithm that computes, under any Lp norm, an ε-approximate minimum weight perfect matching of A and B with high probability; all previous algorithms take Ω(n^1.5) time. We approximate the Lp norm using a distance function based on a randomly shifted quad-tree. The algorithm iteratively generates an approximate minimum-cost augmenting path under the new distance function in time proportional to the length of the path. We show that the total length of the augmenting paths generated by the algorithm is O(n log n), implying a near-linear running time.

All the problems mentioned above have a history of more than two decades, and the algorithms presented here improve previous work by an order of magnitude. Many of these improvements are obtained by new geometric techniques that might have broader applications and are of independent interest. / Dissertation
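A minimal companion sketch for the streaming extent-measure results above: a single-pass approximate minimum enclosing ball that stores only a center and a radius, so its working space is independent of n. This is not the dissertation's blurred ball cover; it is the simple center-shifting heuristic from the streaming literature (known to give roughly a 3/2-approximation), written in Python purely to make "maintaining an extent measure in small space" concrete. The class and variable names are illustrative assumptions.

```python
import math

class StreamingMEB:
    """Single-pass approximate minimum enclosing ball (center-shift heuristic).

    Only the current center and radius are stored, so the working space is
    O(d), independent of the number of points seen so far.
    """

    def __init__(self):
        self.center = None   # current ball center (list of d coordinates)
        self.radius = 0.0    # current ball radius

    def add(self, p):
        if self.center is None:
            self.center = list(p)
            return
        d = math.dist(p, self.center)
        if d > self.radius:
            # Grow the ball just enough to cover p, shifting the center toward p.
            shift = (d - self.radius) / 2.0
            self.radius = (d + self.radius) / 2.0
            self.center = [c + shift * (x - c) / d for c, x in zip(self.center, p)]

# Example: feed a small 2-D stream one point at a time.
meb = StreamingMEB()
for point in [(0.0, 0.0), (4.0, 0.0), (2.0, 3.0), (1.0, -1.0)]:
    meb.add(point)
print(meb.center, meb.radius)
```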
2

Empirical Analysis of Algorithms for the k-Server and Online Bipartite Matching Problems

Mahajan, Rutvij Sanjay 14 August 2018 (has links)
The k-server problem is of significant importance to the theoretical computer science and operations research communities. In this problem, we are given k servers, their initial locations and a sequence of n requests that arrive one at a time. All these locations are points from some metric space, and the cost of serving a request is given by the distance between the location of the request and the current location of the server selected to process the request. We must immediately process the request by moving a server to the request location. The objective in this problem is to minimize the total distance traveled by the servers to process all the requests. In this thesis, we present an empirical analysis of a new online algorithm for the k-server problem. This algorithm maintains two solutions: an online solution and an approximately optimal offline solution. When a request arrives, we update the offline solution and use this update to inform the online assignment. This algorithm is motivated by the Robust-Matching Algorithm [RMAlgorithm, Raghvendra, APPROX 2016] for the closely related online bipartite matching problem. We then give a comprehensive experimental analysis of this algorithm and also provide a graphical user interface which can be used to visualize execution instances of the algorithm. We also consider these problems under a stochastic setting and implement a lookahead strategy on top of the new online algorithm. / MS / Motivated by real-time logistics, we study the online versions of the well-known bipartite matching and the k-server problems. In this problem, there are servers (delivery vehicles) located in different parts of the city. When a request for delivery is made, we have to immediately assign a delivery vehicle to this request without any knowledge of the future. Making cost-effective assignments, therefore, becomes incredibly challenging. In this thesis, we implement and empirically evaluate a new algorithm for the k-server and online matching problems.
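For orientation, a naive greedy baseline for the k-server problem is sketched below: every request is served by whichever server is currently nearest. Greedy is a standard point of comparison but is known to behave badly on some request sequences; it is not the thesis's algorithm, which maintains a paired online and offline solution. The function name and the 2-D Euclidean setting are assumptions made only for this sketch.

```python
import math

def greedy_k_server(servers, requests):
    """Serve each request with the nearest server; return total distance moved.

    servers  -- list of (x, y) initial server positions
    requests -- list of (x, y) request locations, processed in arrival order
    """
    positions = [tuple(s) for s in servers]
    total = 0.0
    for r in requests:
        # Pick the server currently closest to the request.
        i = min(range(len(positions)), key=lambda j: math.dist(positions[j], r))
        total += math.dist(positions[i], r)
        positions[i] = tuple(r)   # that server now sits at the request location
    return total

# Example: 2 servers serving 3 requests.
print(greedy_k_server([(0, 0), (10, 0)], [(1, 1), (9, 2), (0, 5)]))
```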
3

Robust Exact Algorithms for the Euclidean Bipartite Matching Problem

Gattani, Akshaykumar Gopalkrishna 06 July 2023 (has links)
The minimum cost bipartite matching problem is a well-studied optimization problem in computer science and operations research, with wide-ranging applications in fields such as machine learning, economics, transportation, logistics and biology. A special instance of this problem is the computation of the p-Wasserstein distance, which we define next. Given a complete bipartite graph with two disjoint sets of n points in d-dimensional Euclidean space and an integer p ≥ 1, let the cost of an edge be the p-th power of the Euclidean distance between its endpoints. The objective of this problem is to find a minimum-cost matching in this complete bipartite graph. The Hungarian algorithm is a classical method that solves this problem in O(n^3) time. There are many algorithms that have a run time better than that of the Hungarian algorithm if the graphs have non-negative integer edge costs bounded by C. Since the input points have real-valued coordinates and the Euclidean distances can be irrational, such algorithms only return an approximate matching. Thus, the Hungarian algorithm remains the fastest known algorithm to compute an exact matching. In this thesis, we implement a new algorithm in the divide-and-conquer framework that computes the exact p-Wasserstein distance and has a run time asymptotically better than the Hungarian algorithm for stochastic point sets. Inspired by the techniques used in the algorithm, we also design an alternate version of the Hungarian algorithm that uses a grid-based approach. Our experimental analysis shows that both of our algorithms significantly outperform the classical Hungarian algorithm. / Master of Science / Suppose we have two sets with an equal number of items and a list of compatible pairs of items, where a pair is considered compatible if its items belong to different sets. A perfect matching is a subset of compatible pairs where each item is paired with exactly one other item. When trying to find a perfect matching, there may be multiple options, and minimizing the cost of the perfect matching is often desired. This is referred to as the minimum cost bipartite matching problem, which is extensively studied due to its importance in algorithmic theory and operations research. A special instance of this problem is the calculation of the p-Wasserstein distance. It has many practical applications in fields such as machine learning, economics, transportation, logistics and biology. The Hungarian algorithm is the only known algorithm that can compute the exact p-Wasserstein distance. Therefore, our focus is to develop exact algorithms for this problem that perform better than the Hungarian algorithm and can scale efficiently to large datasets.
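As a concrete reference point for the problem statement above, the snippet below computes the exact minimum-cost matching with p-th-power Euclidean edge costs using SciPy's Hungarian-style solver (scipy.optimize.linear_sum_assignment). This is the cubic-time baseline that the thesis sets out to beat, not the thesis's divide-and-conquer or grid-based algorithm; the final normalization to a p-Wasserstein value is one common convention and is an assumption of this sketch.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

def min_cost_matching_cost(A, B, p=1):
    """Exact minimum-cost perfect matching between equal-size point sets A and B,
    with edge cost = (Euclidean distance)**p.  Roughly O(n^3) time."""
    cost = cdist(A, B, metric="euclidean") ** p   # n x n cost matrix
    rows, cols = linear_sum_assignment(cost)      # Hungarian-style exact solver
    return cost[rows, cols].sum()

# Example with random planar point sets.
rng = np.random.default_rng(0)
A = rng.random((50, 2))
B = rng.random((50, 2))
c = min_cost_matching_cost(A, B, p=2)
# One common convention: empirical p-Wasserstein distance = (cost / n) ** (1 / p).
print(c, (c / 50) ** 0.5)
```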
4

Aproximação de métricas finitas por métricas arbóreas e aplicações / Approximation of finite metrics by tree metrics and applications

Lima, Murilo Santos de 15 December 2011 (has links)
Muitos problemas de otimização em grafos, em especial problemas métricos, são mais fáceis de resolver em árvores. Portanto, uma estratégia para obter um bom algoritmo para certos problemas é obter uma árvore que aproxime o grafo, e utilizar uma solução do problema nessa árvore como uma solução aproximada para o problema no grafo original. Neste trabalho é estudada a técnica de Fakcharoenphol, Rao e Talwar, que mostraram como aproximar uma métrica finita arbitrária com n pontos por uma métrica numa árvore com distorção esperada O(lg n) -- o ótimo assintótico. Essa estratégia resulta em algoritmos de aproximação com boas razões de aproximação, e em algoritmos com bom fator de competitividade para diversos problemas de otimização online e distribuídos. É apresentada especificamente a aplicação da técnica ao problema do emparelhamento mínimo bipartido online, que ilustra como a aproximação de métricas auxilia na resolução de um problema e os cuidados que devem ser tomados nessa aplicação. / Many optimization problems on graphs, especially metric problems, are easier to solve on trees. Therefore, a strategy for obtaining a good algorithm for certain problems is to obtain a tree that approximates the graph, and use a solution of the problem on the tree as an approximate solution for the problem on the original graph. We study the work of Fakcharoenphol, Rao and Talwar, who showed how to approximate an arbitrary finite metric on n points by a tree metric with expected distortion O(lg n), which is asymptotically optimum. This strategy leads to algorithms with good approximation factors, and to competitive algorithms for various optimization problems, some of them online and distributed. Here, we present the application of that technique to the problem of finding a minimum online matching on a bipartite metric graph. This problem illustrates how metric approximation aids in solving a problem, and the care that must be taken when doing such an application.
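To make the notion of distortion concrete, the sketch below takes the original pairwise distances and the distances induced by a candidate tree embedding and reports the worst-case expansion. It assumes a non-contracting embedding (tree distances never smaller than the originals), which is the usual setting for the O(lg n) expected-distortion guarantee discussed above; it is only an illustration of the quantity being bounded, not part of the Fakcharoenphol-Rao-Talwar construction itself.

```python
import numpy as np

def distortion(original, tree):
    """Worst-case expansion of a non-contracting tree embedding.

    original, tree -- n x n symmetric distance matrices (numpy arrays),
    with tree[i, j] >= original[i, j] assumed for all i != j.
    """
    n = len(original)
    iu = np.triu_indices(n, k=1)      # each unordered pair exactly once
    ratios = tree[iu] / original[iu]
    return ratios.max()               # equals the distortion when non-contracting

# Toy example: three points on a line embedded into a star (tree) metric.
orig = np.array([[0, 1, 2],
                 [1, 0, 1],
                 [2, 1, 0]], dtype=float)
star = np.array([[0, 2, 2],
                 [2, 0, 2],
                 [2, 2, 0]], dtype=float)
print(distortion(orig, star))   # 2.0
```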
5

Mechanism design for complex systems: bipartite matching of designers and manufacturers, and evolution of air transportation networks

Joseph D. Thekinen (5930327) 20 December 2018 (has links)
A central issue in systems engineering is to design systems where the stakeholders do not behave as expected by the systems designer. Usually, these stakeholders have different and often conflicting objectives. The stakeholders try to maximize their individual objectives, and the overall system does not function as expected by the systems designers.

We specifically study two such systems: a) the cloud-based design and manufacturing system (CBDM) and b) the Air Transportation System (ATS). In CBDM, two stakeholders with conflicting objectives are designers trying to get their parts printed at the lowest possible price and manufacturers trying to sell their excess resource capacity at maximum profits. In ATS, on one hand, airlines make route selection decisions with the goal of maximizing their market share and profits, and on the other hand, regulatory bodies such as the Federal Aviation Administration try to form policies that increase the overall welfare of the people.

The objective in this dissertation is to establish a mechanism design based framework: a) for resource allocation in CBDM, and b) to guide policymakers in channeling the evolution of the network topology of ATS.

This is the first attempt in the literature to formulate resource allocation in CBDM as a bipartite matching problem with designers and manufacturers forming two distinct sets of agents. We recommend the best mechanisms in different CBDM scenarios, like the totally decentralized scenario, the organizational scenario, etc., based on how well the properties of the mechanism meet the requirements of that scenario. In addition to analyzing existing mechanisms, CBDM offers challenges that are not addressed in the literature. One such challenge is how often the matching mechanism should be implemented when agents interact over a long period of time. We answer this question through theoretical propositions backed up by simulation studies. We conclude that a matching period equal to the ratio of the number of service providers to the arrival rate of designers is optimal when the service rate is high, and a matching period equal to the ratio of mean printing time to mean service rate is optimal when the service rate is low.

In ATS, we model the evolution of the network topology as the result of route selection decisions made by airlines under competition. Using data from historic decisions, we use discrete games to model the preference parameters of airlines towards explanatory variables such as market demand and operating cost. Different from the existing literature, we use an airport-presence-based technique to estimate these parameters. This reduces the risk of over-fitting and improves prediction accuracy. We conduct a forward simulation to study the effect of altering the explanatory variables on the Nash equilibrium strategies. Regulatory bodies could use these insights while forming policies.

The overall contribution of this research is a mechanism design framework to design complex engineered systems such as CBDM and ATS. Specifically, in CBDM a matching-mechanism-based resource allocation framework is established and matching mechanisms are recommended for various CBDM scenarios. Through theoretical and simulation studies we propose the frequency at which matching mechanisms should be implemented in CBDM. Though these results are established for CBDM, they are general enough to be applied anywhere matching mechanisms are implemented multiple times. In ATS, we propose an airport-presence-based approach to estimate the parameters that quantify the preference of airlines towards explanatory variables.
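The matching-period rule stated in the conclusion above can be written down directly. The sketch below encodes it as paraphrased, with the cut-off separating the "high" and "low" service-rate regimes left as an explicit assumption, since the abstract does not specify one; the function and parameter names are likewise illustrative.

```python
def recommended_matching_period(num_providers, designer_arrival_rate,
                                mean_printing_time, mean_service_rate,
                                high_service_rate_threshold=1.0):
    """Matching period suggested by the conclusion quoted above.

    high_service_rate_threshold is an assumed cut-off between the two regimes;
    the abstract does not say where 'high' ends and 'low' begins.
    """
    if mean_service_rate >= high_service_rate_threshold:
        # High service rate: number of service providers / arrival rate of designers.
        return num_providers / designer_arrival_rate
    # Low service rate: mean printing time / mean service rate.
    return mean_printing_time / mean_service_rate

print(recommended_matching_period(num_providers=20, designer_arrival_rate=5.0,
                                  mean_printing_time=2.0, mean_service_rate=0.5))
```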
6

Multimedia Scheduling in Bandwidth Limited Networks

Sun, Huey-Min 27 April 2004 (has links)
We propose an object-based multimedia model for specifying the QoS (quality of service) requirements, such as the maximum data-dropping rate or the maximum data-delay rate. We also present a resource allocation model, called the net-profit model, in which the satisfaction of the user's QoS requirements is measured by the benefit earned by the system. Based on the net-profit model, the system is rewarded if it can allocate enough resources to a multimedia delivery request and fulfill the QoS requirements specified by the user. At the same time, the system is penalized if it cannot allocate enough resources to a multimedia delivery request. In this dissertation, we present our research in developing optimal solutions for multimedia stream delivery in bandwidth-limited networks. To fulfill the QoS requirements, the resource, such as bandwidth, should be reserved in advance. Hence, we first investigate how to allocate a resource such that the QoS satisfaction is maximized, assuming that the QoS requirements are given a priori. The proposed optimal solution yields significant improvement over the baseline algorithm, EDF (Earliest Deadline First). Among all the optimal solutions found for the above problem, the net-profit may be distributed unevenly among the multimedia delivery requests. Furthermore, we tackle the fairness problem -- how to allocate a resource efficiently so that the difference in net-profit between any two requests is minimized over all the possible optimal solutions of the maximum total net-profit. A dynamic programming based algorithm is proposed to find all the possible optimal solutions and, in addition, three filters are applied to improve the efficiency of the proposed algorithm. The experimental results show that the filters prune out unnecessary searches and improve the performance significantly, especially when the number of tasks increases. Some multimedia objects might need to be delivered in whole (i.e., they are indivisible), so we extend the proposed multimedia object-based model to indivisible objects. A dynamic programming based algorithm is presented to find an optimal solution of the delivery problem, where the total net-profit is maximized.
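To make the net-profit model tangible, here is a small sketch of its simplest form: each delivery request consumes bandwidth and, if served, earns a reward, while a rejected request incurs a penalty; choosing which requests to serve under a bandwidth budget so as to maximize total net-profit then reduces to a 0/1 knapsack dynamic program. This is a stripped-down illustration (the dissertation's model also covers data-dropping and data-delay rates and divisible objects); the numbers and names below are assumptions for the example.

```python
def max_net_profit(requests, bandwidth):
    """Choose which requests to serve within a bandwidth budget so that the
    net-profit (rewards of served requests minus penalties of rejected ones)
    is maximized.  requests: list of (bandwidth_needed, reward, penalty) with
    integer bandwidth_needed.  Classic 0/1 knapsack dynamic program.
    """
    base = -sum(p for _, _, p in requests)   # start as if every request were rejected
    best = [0.0] * (bandwidth + 1)
    for need, reward, penalty in requests:
        gain = reward + penalty              # serving a request also "earns back" its penalty
        for b in range(bandwidth, need - 1, -1):
            best[b] = max(best[b], best[b - need] + gain)
    return base + best[bandwidth]

# Example: three requests competing for 10 units of bandwidth.
print(max_net_profit([(6, 8.0, 2.0), (5, 6.0, 1.0), (4, 5.0, 1.0)], bandwidth=10))
```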
7

A Fast and Efficient Method for Power Distribution Network Reconfiguration

Ekstrand, Aaron Jordan 01 May 2017 (has links)
We have proposed a method by which the topology of a network might be discovered through an algorithm like the distributed Bellman-Ford algorithm. We have explored the inner workings of two methods to automate power distribution network reconfiguration: the ILP Solver and the Heuristic Solver. We have seen how networks of different shapes can be translated into a flattened topology, which is necessary preprocessing to find a power assignment solution for a network. We have also seen some experimental results comparing the performance of the ILP Solver and the Heuristic Solver. The Heuristic Solver is a very fast, efficient algorithm to reconfigure power distribution, which is important in the case of an emergency. It performs consistently with near-perfect results at a speed that is orders of magnitude quicker than the ILP Solver in almost all cases. In an application where a network is small and time is not an important constraint, the ILP Solver may be preferable, but in any context where time is sensitive and near-perfect results are as acceptable as perfect results, the Heuristic Solver is much preferable. There is always room for improvement. Future tests should perhaps allow for non-integer capacity units, or loads that require values other than unit capacity. Rewriting each algorithm in C could yield faster tests, though this may not be necessary to make judgments about implementing one or the other. There may be some ways to improve the Heuristic Solver, such as arranging the ordered_links in a more effective order. The algorithm could also be improved by taking advantage of the fact that once no sources have capacity left to supply any loads, the process of trying to assign loads to them can cease.
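Since the topology-discovery idea above is modeled on the distributed Bellman-Ford algorithm, a standard (centralized) Bellman-Ford sketch is included here for reference. The distributed variant the text alludes to performs the same edge relaxations locally at each node; nothing below is taken from the ILP Solver or Heuristic Solver implementations, and the graph in the example is made up.

```python
def bellman_ford(num_nodes, edges, source):
    """Standard Bellman-Ford: shortest-path distances from a source node.

    edges -- list of (u, v, weight) directed edges; nodes are 0..num_nodes-1.
    Returns a distance list, or raises ValueError on a reachable negative cycle.
    """
    dist = [float("inf")] * num_nodes
    dist[source] = 0.0
    for _ in range(num_nodes - 1):        # at most n-1 rounds of relaxation
        changed = False
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
                changed = True
        if not changed:
            break
    for u, v, w in edges:                 # one extra pass detects negative cycles
        if dist[u] + w < dist[v]:
            raise ValueError("negative cycle reachable from source")
    return dist

# Example: a small 4-node network.
print(bellman_ford(4, [(0, 1, 1.0), (1, 2, 2.0), (0, 2, 5.0), (2, 3, 1.0)], source=0))
```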
8

Low rank methods for network alignment

Huda Nassar (7047152) 15 August 2019 (has links)
Network alignment is the problem of finding a common subgraph between two graphs, and more generally k graphs. The results of network alignment are often used for information transfer, which makes it a powerful tool for deducing information or insight about networks. Network alignment is tightly related to the subgraph isomorphism problem, which is known to be NP-hard; this makes the network alignment problem supremely hard in practice. Some algorithms have been devised to approach it by solving a relaxed version of the NP-hard problem or by defining certain heuristic measures. These algorithms normally work well when some form of prior similarity between the nodes of the graphs to be aligned is known. The absence of such information makes the problem more challenging. In that setting, these algorithms often require much more time to finish executing, and sometimes fail altogether. The version of network alignment that this thesis tackles is the one in which such prior similarity measures are absent. In this thesis, we address three versions of network alignment: (i) multimodal network alignment, (ii) standard pairwise network alignment, and (iii) multiple network alignment. A key common component of the algorithms presented in this thesis is exploiting a low rank structure in the network alignment problem and thus producing algorithms that run much faster than classic network alignment algorithms.
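To illustrate what exploiting low-rank structure can look like in this setting, here is a toy rank-1 baseline: score node pairs by the outer product of the two graphs' leading adjacency eigenvectors (a centrality-style similarity whose n x n matrix has rank one) and read off an alignment with an assignment solver. This is a didactic sketch under that rank-1 assumption, not any of the algorithms developed in the thesis; the example graphs are arbitrary.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def leading_eigvec(A, iters=200):
    """Principal eigenvector via power iteration (on A + I so it always converges)."""
    M = A + np.eye(A.shape[0])    # shifting makes the top eigenvalue strictly dominant
    v = np.ones(A.shape[0]) / np.sqrt(A.shape[0])
    for _ in range(iters):
        v = M @ v
        v /= np.linalg.norm(v)
    return v

def rank1_alignment(A, B):
    """Align two graphs with equal node counts via the rank-1 similarity S = u v^T.

    Returns match, where match[i] is the node of B aligned to node i of A.
    """
    u, v = leading_eigvec(A), leading_eigvec(B)
    S = np.outer(u, v)                      # rank-1 similarity scores
    rows, cols = linear_sum_assignment(-S)  # maximize total similarity
    return cols[np.argsort(rows)].tolist()

# Example: a small star-plus-edge graph aligned with a relabeled copy of itself.
A = np.array([[0, 1, 1, 1, 0],
              [1, 0, 0, 0, 0],
              [1, 0, 0, 0, 0],
              [1, 0, 0, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)
perm = [2, 4, 0, 1, 3]
B = A[np.ix_(perm, perm)]
print(rank1_alignment(A, B))
```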
9

Generalizations Of The Popular Matching Problem

Nasre, Meghana 08 1900 (has links) (PDF)
Matching problems arise in several real-world scenarios like assigning posts to applicants, houses to trainees and room-mates to one another. In this thesis we consider the bipartite matching problem where one side of the bipartition specifies preferences over the other side. That is, we are given a bipartite graph G = (A ∪ P,E) where A denotes the set of applicants, P denotes the set of posts, and the preferences of applicants are specified by ranks on the edges. Several notions of optimality like pareto-optimality, rank-maximality, popularity have been studied in the literature; we focus on the notion of popularity. A matching M is more popular than another matching M′ if the number of applicants that prefer M to M′ exceeds the number of applicants that prefer M′ to M. A matching M is said to be popular if there exists no matching that is more popular than M. Popular matchings have the desirable property that no applicant majority can force a migration to another matching. However, popular matchings do not provide a complete answer since there exist simple instances that do not admit any popular matching. Abraham et al. (SICOMP 2007) characterized instances that admit a popular matching and also gave efficient algorithms to find one when it exists. We present several generalizations of the popular matchings problem in this thesis. Majority of our work deals with instances that do not admit any popular matching. We propose three different solution concepts for such instances. A reasonable solution when an instance does not admit a popular matching is to output a matching that is least unpopular amongst the set of unpopular matchings. McCutchen (LATIN 2008) introduced and studied measures of unpopularity, namely the unpopularity factor and unpopularity margin. He proved that computing either a least unpopularity factor matching or a least unpopularity margin matching is NP-hard. We build upon this work and design an O(km√n) time algorithm which produces matchings with bounded unpopularity provided a certain subgraph of G admits an A-complete matching (a matching that matches all the applicants). Here n and m denote the number of vertices and the number of edges in G respectively, and k, which is bounded by |A|, is the number of iterations taken by our algorithm to terminate. We also show that if a certain subgraph of G admits an A-complete matching, then we have computed a matching with the least unpopularity factor. Another feasible solution for instances without any popular matching is to output a mixed matching that is popular. A mixed matching is simply a probability distribution over the set of matchings. A mixed matching Q is popular if no mixed matching is more popular than Q. We seek to answer the existence and computation of popular mixed matchings in a given instance G. We begin with a linear programming formulation to compute a mixed matching with the least unpopularity margin. We show that although the linear program has exponentially many constraints, we have a polynomial time separation oracle and hence a least unpopularity margin mixed matching can be computed in polynomial time. By casting the popular mixed matchings problem as a two player zero-sum game, it is possible to prove that every instance of the popular matchings problem admits a popular mixed matching. Therefore, the matching returned by our linear program is indeed a popular mixed matching. Finally, we propose augmentation of the input graph for instances that do not admit any popular matching. 
Assume that we are dealing with a set of items B (say, DVDs/books) instead of posts and it is possible to make duplicates of these items. Our goal is to make duplicates of appropriate items such that the augmented graph admits a popular matching. However, since allowing arbitrarily many copies for items is not feasible in practice, we impose restrictions in two forms – (i) associating costs with items, and (ii) bounding the number of copies. In the first case, we assume that we pay a price of cost(b) for every extra copy of b that we make; the first copy is assumed to be given to us for free. The total cost of the augmented instance is the sum of costs of all the extra copies that we make. Our goal is to compute a minimum cost augmented instance which admits a popular matching. In the second case, along with the input graph G = (A ∪ B,E), we are given a vector ⟨c1, c2, . . . , c|B|⟩ denoting upper bounds on the number of copies of every item. We seek to answer whether there exists a vector ⟨x1, x2, . . . , x|B|⟩ such that having xi copies of item bi where 1 ≤ xi ≤ ci enables the augmented graph to admit a popular matching. We prove that several problems under both these models turn out to be NP-hard – in fact they remain NP-hard even under severe restrictions on the preference lists. Our final results deal with instances that admit popular matchings. When the input instance admits a popular matching, there may be several popular matchings – in fact there may be several maximum cardinality popular matchings. Hence one may not be content with returning any maximum cardinality popular matching and instead ask for an optimal popular matching. Assuming that the notion of optimality is specified as a part of the problem, we present an O(m + n1^2) time algorithm for computing an optimal popular matching in G. Here m denotes the number of edges in G and n1 denotes the number of applicants. We also consider the problem of computing a minimum cost popular matching where every post p has an associated price cost(p) and capacity cap(p). A post with capacity cap(p) can be matched with up to cap(p) applicants. We present an O(mn1) time algorithm to compute a minimum cost popular matching in such instances. We believe that the work provides interesting insights into the popular matchings problem and its variants.
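As a small illustration of the popularity definition used throughout this abstract, the sketch below compares two matchings by counting how many applicants prefer one to the other; M is more popular than M′ exactly when the first count exceeds the second. It is a brute-force checker for intuition only, not the thesis's algorithms, and the preference encoding and example data are made up.

```python
def compare_popularity(prefs, M1, M2):
    """Count applicants preferring matching M1 to M2 and vice versa.

    prefs -- dict: applicant -> list of posts in decreasing order of preference
    M1, M2 -- dicts: applicant -> matched post (or None if left unmatched)
    Returns (votes_for_M1, votes_for_M2); M1 is more popular than M2 if the
    first count is strictly larger.
    """
    def rank(a, post):
        # Lower rank = better; being unmatched is worse than any listed post.
        return prefs[a].index(post) if post in prefs[a] else len(prefs[a])

    for_m1 = sum(1 for a in prefs if rank(a, M1.get(a)) < rank(a, M2.get(a)))
    for_m2 = sum(1 for a in prefs if rank(a, M2.get(a)) < rank(a, M1.get(a)))
    return for_m1, for_m2

# Tiny example: 3 applicants, 2 posts.
prefs = {"a1": ["p1", "p2"], "a2": ["p1", "p2"], "a3": ["p1"]}
M  = {"a1": "p1", "a2": "p2", "a3": None}
Mp = {"a1": "p2", "a2": "p1", "a3": None}
print(compare_popularity(prefs, M, Mp))   # (1, 1): neither matching is more popular
```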
