271
A reformulation-linearization based implicit enumeration algorithm for the rectilinear distance location-allocation problem
Ramachandran, Sridhar, 10 October 2009
This thesis is concerned with the analysis of a Rectilinear Distance Location Allocation Problem, where the costs are directly proportional to rectilinear distances and the amount shipped. The problem is formulated as a Mixed Integer Bilinear Programming Problem and as a Discrete Location Allocation Problem. Using linear programming relaxations constructed via the Reformulation-Linearization Technique (RLT), the latter formulation is shown to provide stronger lower bounds and is therefore adopted for implementation. In addition, cutting planes are developed to further strengthen the linear programming relaxation. The special structure of the resulting linear program is exploited in order to get a quick lower bound via a suitable Lagrangian dual formulation. This lower bounding scheme is embedded within a finitely convergent Branch and Bound algorithm that enumerates over the location decision variable space. An illustrative example and computational experience are provided to demonstrate the efficacy of the proposed algorithm. / Master of Science
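For context, the level-one RLT construction behind such linear programming relaxations replaces each bilinear product xy with a new variable w and adds the four inequalities obtained by multiplying the bound factors pairwise; over box bounds this is the classical McCormick relaxation (a standard construction, shown here generically rather than for the thesis's exact formulation):

% Level-1 RLT (McCormick) linearization of the bilinear term w = xy over
% box bounds x in [x^L, x^U], y in [y^L, y^U]: multiply pairs of
% nonnegative bound factors, expand, and substitute w for xy.
\begin{align*}
  w &\ge x^L y + x y^L - x^L y^L, &  w &\ge x^U y + x y^U - x^U y^U,\\
  w &\le x^L y + x y^U - x^L y^U, &  w &\le x^U y + x y^L - x^U y^L.
\end{align*}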
272
A Separator-Based Framework for Graph Matching Problems
Lahn, Nathaniel Adam, 29 May 2020
Given a graph, a matching is a set of vertex-disjoint edges. Graph matchings have been well studied, since they play a fundamental role in algorithmic theory as well as motivate many practical applications. Of particular interest is the problem of finding a maximum cardinality matching of a graph. Also of interest is the weighted variant: the problem of computing a minimum-cost maximum cardinality matching. For an arbitrary graph with m edges and n vertices, there are known, long-standing combinatorial algorithms that compute a maximum cardinality matching in O(m√n) time. For graphs with non-negative integer edge costs at most C, it is known how to compute a minimum-cost maximum cardinality matching in roughly O(m√n log(nC)) time using combinatorial methods. While non-combinatorial methods exist, they are generally impractical and not well understood due to their complexity. As a result, there is great interest in obtaining faster matching algorithms that are purely combinatorial in nature. Improving existing combinatorial algorithms for arbitrary graphs is considered to be a very difficult problem. To make the problem more approachable, it is desirable to make some additional assumptions about the graph. For our work, we make two such assumptions. First, we assume the graph is bipartite. Second, we assume that the graph has a small balanced separator, meaning it is possible to split the graph into two roughly equal-size components by removing a relatively small portion of the graph. Several well-studied classes of graphs have separator-like properties, including planar graphs, minor-free graphs, and geometric graphs. For such graphs, we describe a framework: a general set of techniques for designing efficient algorithms. We demonstrate this framework by applying it to yield polynomial-factor improvements for several open problems in bipartite matching. / Doctor of Philosophy / Assume we are given a list of objects, and a list of compatible pairs of these objects. A matching consists of a chosen subset of these compatible pairs, where each object participates in at most one chosen pair. For any chosen pair of objects, we say that these two objects are matched. Generally, we seek to maximize the number of compatible matches. A maximum cardinality matching is a matching with the largest possible size. In many cases, there are multiple options for maximizing the number of compatible pairings. While maximizing the size of the matching is often the primary concern, one may also seek to minimize the cost of the matching. This is known as the minimum-cost maximum-cardinality matching problem. These two matching problems have been well studied, since they play a fundamental role in algorithmic theory as well as motivate many practical applications. Our interest is in the design of algorithms for both of these problems that are efficiently scalable, even as the number of objects involved grows very large. To aid in the design of scalable algorithms, we observe that some inputs have good separators, meaning that by removing some subset S of objects, one can divide the remaining objects into two sets V and V', where all pairs of objects between V and V' are incompatible. We design several new algorithms that exploit good separators, and prove that these algorithms scale better than previously existing approaches.
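For background, the classical augmenting-path approach referenced above can be sketched in a few lines (a textbook O(mn) method, not the separator-based algorithms developed in this dissertation; Hopcroft-Karp refines this idea to reach the O(m√n) bound cited above):

# Textbook augmenting-path bipartite matching (background, not the thesis method).
# adj[u] lists the right-side vertices compatible with left vertex u.
def max_bipartite_matching(adj, n_right):
    match_right = [-1] * n_right  # match_right[v] = left vertex matched to v

    def try_augment(u, visited):
        # Search for an augmenting path starting at left vertex u.
        for v in adj[u]:
            if v in visited:
                continue
            visited.add(v)
            # Use v if it is free, or if its current partner can be rematched.
            if match_right[v] == -1 or try_augment(match_right[v], visited):
                match_right[v] = u
                return True
        return False

    return sum(try_augment(u, set()) for u in range(len(adj)))

# Example: 3 left vertices, 3 right vertices, maximum matching of size 3.
print(max_bipartite_matching([[0, 1], [0], [1, 2]], 3))  # -> 3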
273
Meta-raps: an effective approach for combinatorial problems
Moraga, Reinaldo J., 01 April 2002
No description available.
274
Scalable Combinatorial Algorithms for Optimal Transport Based Similarity Metrics
Zhang, Kaiyi, 22 November 2024
Optimal Transport (OT), also known as the Wasserstein distance, is a valuable metric for comparing probability distributions. Owing to its appealing statistical properties, researchers in fields such as machine learning use OT in a wide range of applications. However, computing OT, whether exactly or approximately, is computationally expensive and impractical for large datasets. Furthermore, OT is sensitive to small amounts of noise in the input distributions. In this document, we propose combinatorial methods for designing scalable and noise-resistant solutions for OT.
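As a point of reference, exact OT between two discrete distributions is a linear program, and a generic solver already makes the scalability problem visible (a baseline sketch using SciPy, not the combinatorial algorithms proposed in this dissertation; the LP has n·m variables, which is what becomes impractical at scale):

# Exact discrete OT via linear programming (generic baseline only).
import numpy as np
from scipy.optimize import linprog

def ot_cost(a, b, C):
    """Min-cost transport between histograms a (n,) and b (m,), ground cost C (n,m)."""
    n, m = C.shape
    # Marginal constraints: each row of the plan sums to a[i], each column to b[j].
    A_eq = np.zeros((n + m, n * m))
    for i in range(n):
        A_eq[i, i * m:(i + 1) * m] = 1.0   # row-sum constraint for source i
    for j in range(m):
        A_eq[n + j, j::m] = 1.0            # column-sum constraint for sink j
    res = linprog(C.ravel(), A_eq=A_eq, b_eq=np.concatenate([a, b]),
                  bounds=(0, None), method="highs")
    return res.fun

a = np.array([0.5, 0.5])
b = np.array([0.5, 0.5])
C = np.array([[0.0, 1.0], [1.0, 0.0]])
print(ot_cost(a, b, C))  # -> 0.0 (mass stays in place under the identity plan)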
We present four key contributions in this work. First, we introduce a novel combinatorial parallel algorithm for approximating OT, which achieves a parallel time complexity of $O(\log n/\varepsilon^2)$, where $n$ is the input size and $\varepsilon$ is the additive error. Our algorithm outperforms the state-of-the-art in experiments. Second, we propose a new concept, the OT-profile, representing the minimum partial optimal transport cost $\sigma_\alpha$ as a function of the transported mass $\alpha$. This can be used to identify outliers in real-world data. The utility of the OT-profile is demonstrated on outlier detection and PU-learning tasks, where it outperforms the state-of-the-art. Third, building upon the OT-profile, we propose a new OT-based metric for comparing distributions that is more robust to noise. This metric preserves desirable properties while reducing its sensitivity to noise for high $p$ values, providing a robust solution for real-world datasets. Lastly, we have developed a Python library that integrates our algorithms and methods into a user-friendly framework, making it easier for practitioners to adopt our methods. Our work enhances the computational efficiency and robustness of OT, making it practical for machine learning applications. / Doctor of Philosophy / Imagine a task in which you are required to fill up a set of holes with piles of sand using the least amount of effort. The idea behind this is Optimal Transport (OT), also known as the Wasserstein distance. This distance is a popular math tool that measures the similarity (or distance) between two probability distributions. This tool has many applications in different areas, including machine learning and data analysis. For example, it is employed in generative models to measure the difference between generated and real images, or in analyzing real-world datasets containing noise.
However, OT faces two major problems in practice. First, its computation becomes expensive as datasets grow, making it infeasible for large-scale applications; it is therefore important to make the computation of OT efficient on large volumes of data. Second, OT is sensitive to outliers or noise in the input distributions, which can significantly influence the results.
In this thesis, we address these challenges through four main contributions. First, we present novel parallel combinatorial algorithms that reduce the computational burden of OT. Our approach achieves a state-of-the-art time complexity and, at the same time, is easy to implement. Second, we introduce an innovative method for computing an 'OT-profile', which enables the identification of outliers in the input distributions. This approach improves the robustness of OT in real-world scenarios, where data often contains noise and perturbations.
Third, building on the OT-profile concept, we propose a new robust OT-based metric for comparing distributions in the presence of noise. To broaden the impact of these advancements, we have open-sourced a Python library that implements our methods behind a user-friendly interface, making it easier for researchers and practitioners to apply our solutions to their data analysis tasks. This work not only advances the theoretical analysis of OT but also supports future practical applications: by addressing the two challenges of scalability and robustness, we make OT easier to apply reliably to large-scale data.
275
A novel optimization algorithm and other techniques in medicinal chemistry
Unknown Date
In this dissertation we will present a stochastic optimization algorithm and use it and other mathematical techniques to tackle problems arising in medicinal chemistry. In Chapter 1, we present some background about stochastic optimization and the Accelerated Random Search (ARS) algorithm. We then present a novel improvement of the ARS algorithm, DIrected Accelerated Random Search (DARS), motivated by some theoretical results, and demonstrate through numerical results that it improves upon ARS. In Chapter 2, we use DARS and other methods to address issues arising from the use of mixture-based combinatorial libraries in drug discovery. In particular, we look at models associated with the biological activity of these mixtures and use them to answer questions about sensitivity and robustness, and also present a novel method for determining the integrity of the synthesis. Finally, in Chapter 3 we present an in-depth analysis of some statistical and mathematical techniques in combinatorial chemistry, including a novel probabilistic approach to using structural similarity to predict the activity landscape. / by Radleigh G. Santos. / Thesis (Ph.D.)--Florida Atlantic University, 2012. / Includes bibliography. / Electronic reproduction. Boca Raton, Fla., 2012. Mode of access: World Wide Web.
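For orientation, the ARS idea underlying Chapter 1 can be sketched as local uniform sampling with a contracting search radius that resets on improvement (a minimal sketch with illustrative constants; the directed modification that defines DARS is developed in the dissertation itself):

# A minimal sketch of Accelerated Random Search (ARS) on a box domain.
# The contraction factor c and restart threshold r_min are illustrative;
# the DARS directional improvement from Chapter 1 is not shown.
import random

def ars(f, lo, hi, iters=10_000, c=2.0, r_min=1e-6):
    """Minimize f on the box [lo, hi]^d; c > 1 contracts the search radius."""
    d = len(lo)
    x = [random.uniform(lo[i], hi[i]) for i in range(d)]
    fx, r = f(x), 1.0
    for _ in range(iters):
        # Sample uniformly in the (clipped) box of relative radius r around the best point.
        y = [random.uniform(max(lo[i], x[i] - r * (hi[i] - lo[i])),
                            min(hi[i], x[i] + r * (hi[i] - lo[i])))
             for i in range(d)]
        fy = f(y)
        if fy < fx:
            x, fx, r = y, fy, 1.0   # improvement: recenter and reset the radius
        else:
            r /= c                   # no improvement: contract the radius
            if r < r_min:
                r = 1.0              # radius exhausted: restart at full radius
    return x, fx

# Example: sphere function, minimum at the origin.
print(ars(lambda v: sum(t * t for t in v), [-5, -5], [5, 5]))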
276
Circuitos hamiltonianos em hipergrafos e densidades de subpermutações / Hamiltonian cycles in hypergraphs and subpermutation densities
Bastos, Antonio Josefran de Oliveira, 26 August 2016
The study of the asymptotic behavior of densities of substructures is one of the main areas of combinatorics. In permutation theory, given fixed permutations σ1 and σ2 and an integer n > 0, we are interested in the behavior of the densities of σ1 and σ2 among the permutations of size n. Two natural directions follow. In the first, we seek the permutation of size n that maximizes the densities of σ1 and σ2 simultaneously; for sufficiently large n, we determine explicitly the maximum density that such a family of permutations can attain among all permutations of size n. In the second, we seek the permutation of size n that minimizes the densities of σ1 and σ2 simultaneously. When σ1 is the identity permutation with k elements and σ2 is the reverse permutation with ℓ elements, Myers conjectured that the minimum is attained among the permutations containing no occurrence of σ1 or of σ2. We show that if the search space is restricted to the set of layered permutations, with k > ℓ, then Myers' conjecture is true. In graph theory, on the other hand, the problem of finding a Hamiltonian cycle is a classical NP-complete problem, one of Karp's 21 problems; a common approach in the literature is therefore to find conditions on a graph that guarantee the existence of a Hamiltonian cycle on it. The celebrated result of Dirac states that if a graph G of order n has minimum degree at least n/2, then G has a Hamiltonian cycle. Following the line of Dirac, we show that for all integers 1 ≤ ℓ ≤ k/2 and every γ > 0 there is an integer n0 > 0 such that every k-uniform hypergraph H of order n ≥ n0 with minimum (k-2)-degree δ_{k-2}(H) ≥ ((4(k-ℓ)-1)/(4(k-ℓ)²) + γ) · binom(n,2) contains a Hamiltonian ℓ-cycle.
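In display form, the codegree condition of the main hypergraph theorem (with δ_{k-2} denoting the minimum (k-2)-degree) reads:

% Dirac-type codegree condition forcing a Hamiltonian l-cycle in a
% k-uniform hypergraph H of sufficiently large order n
% (1 <= l <= k/2, gamma > 0 fixed):
\[
  \delta_{k-2}(H) \;\ge\; \left( \frac{4(k-\ell)-1}{4(k-\ell)^{2}} + \gamma \right) \binom{n}{2}
  \quad \Longrightarrow \quad H \text{ contains a Hamiltonian } \ell\text{-cycle}.
\]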
277
Minimização de funções submodulares / Submodular Function Minimization
Simão, Juliana Barby, 09 June 2009
Submodular functions arise naturally in various fields, including probability, geometry, and combinatorial optimization. The role played by these functions in discrete optimization is similar to that played by convexity in continuous optimization. Indeed, many problems in combinatorial optimization can be stated as minimizing a submodular function over an appropriate set. Moreover, submodularity appears in many combinatorial theorems or problems and frequently plays an essential role in a proof or in the efficiency of an algorithm. In this dissertation, we study structural and algorithmic aspects of submodular functions, with emphasis on the recent advances in combinatorial algorithms for minimizing them. We describe in detail the first combinatorial strongly polynomial-time algorithms for this purpose, due to Schrijver and to Iwata, Fleischer, and Fujishige, as well as some extensions. Applications of submodularity in combinatorial optimization are also included in this work.
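For reference, the standard definition (not specific to this dissertation): a set function f : 2^V → R is submodular when

\[
  f(X) + f(Y) \;\ge\; f(X \cup Y) + f(X \cap Y) \qquad \text{for all } X, Y \subseteq V,
\]

or, equivalently, when marginal gains diminish:

\[
  f(X \cup \{v\}) - f(X) \;\ge\; f(Y \cup \{v\}) - f(Y) \qquad \text{for all } X \subseteq Y \subseteq V,\ v \in V \setminus Y.
\]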
278
Investigation of 4-cutwidth critical graphs
Chavez, Dolores, 01 January 2006
A 2004 article by Yixun Lin and Aifeng Yang, published in the journal Discrete Mathematics, characterized the set of 3-cutwidth critical graphs by five specified elements. This project extends the idea to 4-cutwidth critical graphs.
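For concreteness, the cutwidth of a graph is the minimum, over all linear orderings of its vertices, of the maximum number of edges crossing any gap between consecutive positions. A brute-force sketch (illustrative only; exponential in the number of vertices, so suitable for small graphs):

# Exhaustive cutwidth computation (illustrative, not from the thesis).
from itertools import permutations

def cutwidth(vertices, edges):
    """Minimum over all linear layouts of the maximum number of crossing edges."""
    best = float("inf")
    for order in permutations(vertices):
        pos = {v: i for i, v in enumerate(order)}
        # For each gap i | i+1, count edges with one endpoint on each side.
        width = max(
            sum(1 for u, v in edges if min(pos[u], pos[v]) <= i < max(pos[u], pos[v]))
            for i in range(len(order) - 1)
        )
        best = min(best, width)
    return best

# Example: the star K_{1,3} has cutwidth 2.
print(cutwidth([0, 1, 2, 3], [(0, 1), (0, 2), (0, 3)]))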
279
Combinatorial Synthesis and High-Throughput Characterization of Polyurethaneureas and Their Nanocomposites with Laponite
Joe-Lahai, Sormana, 26 July 2005
Segmented polyurethaneureas (SPUU) are thermoplastic elastomers with excellent elastic properties, high abrasion resistance, and high tear strength, making them useful in numerous industrial applications ranging from microelectronics (slurry pads) to biomedical devices (artificial heart vessels). The elastic and mechanical properties of these materials are strongly influenced by their two-phase morphology. The factors that influence phase separation include the difference in polarity between the hard and soft phases, composition, and temperature. In general, good phase separation yields materials with superior mechanical and elastic properties. Given the immense potential applications of SPUU elastomers, there is a need for materials with higher strength. However, higher strength is not desired at the expense of elasticity; in fact, stronger materials with enhanced elasticity are the goal. In this thesis, high-strength SPUU elastomers were synthesized by incorporating reactive Laponite particles bearing surface-active free amine. The synthesis of pure SPUU is very complex, and the addition of a reactive silicate further increases the complexity. To address this challenge, combinatorial methods and high-throughput screening techniques were used to optimize the diamine concentration and cure temperature. It was determined that pure SPUU elastomers prepared at a diamine stoichiometry of 85 to 100 mole % and cured at 90 to 95 °C produced materials with higher strength and elongation at break. SPUU nanocomposites were prepared by maintaining the overall diamine stoichiometry at 95 mole % and curing at 90 °C. Uniaxial tensile strength was optimized at a particle weight fraction of 1 wt %, with a nearly 200% increase in tensile strength and a 40% increase in elongation at break compared to pristine SPUU.
280
Design space pruning heuristics and global optimization method for conceptual design of low-thrust asteroid tour missions
Alemany, Kristina, 13 November 2009
Electric propulsion has recently become a viable technology for spacecraft, enabling shorter flight times, fewer required planetary gravity assists, larger payloads, and/or smaller launch vehicles. With the maturation of this technology, however, comes a new set of challenges in the area of trajectory design. Because low-thrust trajectory optimization has historically required long run times and significant user manipulation, mission design has relied on expert-based knowledge for selecting departure and arrival dates, times of flight, and/or target bodies and gravitational swing-bys. These choices are generally based on known configurations that have worked well in previous analyses or simply on trial and error. At the conceptual design level, however, the ability to explore the full extent of the design space is imperative to locating the best solutions in terms of mass and/or flight times.
Beginning in 2005, the Global Trajectory Optimization Competition posed a series of difficult mission design problems, all requiring low-thrust propulsion and visiting one or more asteroids. These problems all had large ranges on the continuous variables (launch date, time of flight, and, when applicable, asteroid stay times) and were characterized by millions or even billions of possible asteroid sequences. Even with recent advances in low-thrust trajectory optimization, full enumeration of these problems was not possible within the stringent time limits of the competition.
This investigation develops a systematic methodology for determining a broad suite of good solutions to the combinatorial, low-thrust, asteroid tour problem. The target application is conceptual design, where broad exploration of the design space is critical, with the goal of rapidly identifying a reasonable number of promising solutions for future analysis. The proposed methodology has two steps. The first step applies a three-level heuristic sequence developed from the physics of the problem, which allows for efficient pruning of the design space. The second step applies a global optimization scheme to locate a broad suite of good solutions to the reduced problem. The global optimization scheme combines a novel branch-and-bound algorithm with a genetic algorithm and an industry-standard low-thrust trajectory optimization program to solve for the following design variables: asteroid sequence, launch date, times of flight, and asteroid stay times.
The methodology is developed on a small sample problem, which is fully enumerated and solved so that all possible discretized solutions are known. It is then validated on a larger intermediate sample problem, which also has a known solution. Next, the methodology is applied to several larger combinatorial asteroid rendezvous problems, using previously identified good solutions as validation benchmarks; these include the 2nd and 3rd Global Trajectory Optimization Competition problems. The methodology reduces the number of asteroid sequences requiring low-thrust optimization by 6-7 orders of magnitude relative to the original problem. More than 70% of the previously known good solutions are identified, along with several new solutions that were not previously reported by any of the competitors. Overall, the methodology developed in this investigation provides an organized search technique for the low-thrust mission design of asteroid rendezvous problems.
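The branch-and-bound component can be sketched abstractly (an illustrative sketch only: lower_bound and optimize_trajectory below are hypothetical stand-ins for the physics-based pruning bounds and the industry-standard low-thrust optimizer, and the genetic-algorithm hybridization is not shown):

# Illustrative best-first branch-and-bound over asteroid sequences.
# lower_bound and optimize_trajectory are hypothetical placeholders, not the
# thesis code; a real bound would come from orbital mechanics.
import heapq

def lower_bound(sequence):
    """Hypothetical admissible bound: optimistic cost to visit `sequence`."""
    return 10.0 * len(sequence)  # placeholder

def optimize_trajectory(sequence):
    """Hypothetical stand-in for a full low-thrust trajectory optimization."""
    return 12.0 * len(sequence)  # placeholder cost (e.g., propellant mass)

def branch_and_bound(asteroids, target_length):
    best_cost, best_seq = float("inf"), None
    heap = [(0.0, ())]  # expand the partial sequence with the lowest bound first
    while heap:
        bound, seq = heapq.heappop(heap)
        if bound >= best_cost:
            continue  # prune: even the optimistic cost cannot beat the incumbent
        if len(seq) == target_length:
            cost = optimize_trajectory(seq)  # expensive full optimization
            if cost < best_cost:
                best_cost, best_seq = cost, seq
            continue
        for a in asteroids:
            if a not in seq:
                child = seq + (a,)
                b = lower_bound(child)
                if b < best_cost:
                    heapq.heappush(heap, (b, child))
    return best_seq, best_cost

print(branch_and_bound(["2001 TW229", "Apophis", "Vesta"], 2))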