21

Modelling and Exploiting Structures in Solving Propositional Satisfiability Problems

Pham, Duc Nghia, n/a January 2006 (has links)
Recent research has shown that it is often preferable to encode real-world problems as propositional satisfiability (SAT) problems and then solve them using a general-purpose SAT solver. However, much of the valuable information and structure of these realistic problems is flattened out and hidden inside the corresponding Conjunctive Normal Form (CNF) encodings of the SAT domain. Recently, systematic SAT solvers have been progressively improved and are now able to solve many highly structured practical problems containing millions of clauses. In contrast, state-of-the-art Stochastic Local Search (SLS) solvers still have difficulty in solving structured problems, apparently because they are unable to exploit hidden structure as well as the systematic solvers. In this thesis, we study and evaluate different ways to effectively recognise, model and efficiently exploit useful structures hidden in realistic problems. A summary of the main contributions is as follows:

1. We first investigate an off-line processing phase that applies resolution-based pre-processors to input formulas before running SLS solvers on these problems. We report an extensive empirical examination of the impact of SAT pre-processing on the performance of contemporary SLS techniques. It emerges that while all the solvers examined do indeed benefit from pre-processing, the effects of different pre-processors are far from uniform across solvers and across problems. Our results suggest that SLS solvers need to be equipped with multiple pre-processors if they are ever to match the performance of systematic solvers on highly structured problems. [Part of this study was published at the AAAI-05 conference.]

2. We then look at potential approaches to bridging the gap between SAT and constraint satisfaction problem (CSP) formalisms. One approach has been to develop a many-valued SAT formalism (MV-SAT) as an intermediate paradigm between SAT and CSP, and then to translate existing highly efficient SAT solvers to the MV-SAT domain. In this study, we follow a different route, developing SAT solvers that can automatically recognise CSP structure hidden in SAT encodings. This allows us to look more closely at how constraint weighting can be implemented in the SAT and CSP domains. Our experimental results show that a SAT-based mechanism to handle weights, together with a CSP-based method to instantiate variables, is superior to other combinations of SAT- and CSP-based approaches. In addition, SLS solvers based on this many-valued weighting approach outperform other existing approaches to handling many-valued CSP structures. [Part of this study was published at the AAAI-05 conference.]

3. Finally, we propose and evaluate six different schemes to encode temporal reasoning problems, in particular Interval Algebra (IA) networks, into SAT CNF formulas. We then empirically examine the performance of local search as well as systematic solvers on the new temporal SAT representations, in comparison with solvers that operate on native IA representations. Our empirical results show that zChaff (a state-of-the-art complete SAT solver), together with the best IA-to-SAT encoding scheme, can solve temporal problems significantly faster than existing IA solvers working on the equivalent native IA networks. [Part of this study was published at the CP-05 workshop.]
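As background for the kind of CSP structure that can be recovered from CNF, the sketch below shows the standard direct encoding of CSP variables into SAT clauses — the at-least-one/at-most-one pattern that betrays a hidden multi-valued variable. It is a generic illustration, not the encoding or solver code used in the thesis; the function and variable names are assumptions.

```python
from itertools import combinations

def direct_encode(domains):
    """Direct-encode a CSP's variable/domain structure into CNF clauses.

    domains: dict mapping variable name -> list of values.
    Returns (clauses, var_map) where var_map[(x, v)] is the Boolean
    variable meaning "CSP variable x takes value v".
    """
    var_map, clauses, next_id = {}, [], 1
    for x, values in domains.items():
        for v in values:
            var_map[(x, v)] = next_id
            next_id += 1
        lits = [var_map[(x, v)] for v in values]
        clauses.append(lits)                      # at-least-one value
        for a, b in combinations(lits, 2):
            clauses.append([-a, -b])              # at-most-one value
    return clauses, var_map

# Example: two CSP variables with 3-value domains.
clauses, var_map = direct_encode({"x": [1, 2, 3], "y": [1, 2, 3]})
print(len(clauses))  # 2 at-least-one clauses + 2*3 at-most-one clauses = 8
```

Recognising such at-most-one "ladders" in a CNF formula is one way a SAT solver can recover the multi-valued variable that was flattened out by the encoding.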
22

Constraint Weighting Local Search for Constraint Satisfaction

Thornton, John Richard, n/a January 2000 (has links)
One of the challenges for the constraint satisfaction community has been to develop an automated approach to solving Constraint Satisfaction Problems (CSPs), rather than creating specific algorithms for specific problems. Much of this work has concentrated on the development and improvement of general-purpose backtracking techniques. However, the success of relatively simple local search techniques on larger satisfiability problems [Selman et al. 1992] and on CSPs such as the n-queens problem [Minton et al. 1992] has created interest in applying local search to constraint satisfaction. In this thesis we look at the usefulness of constraint weighting as a local search technique for constraint satisfaction. The work is based on the clause weighting ideas of Selman and Kautz [1993] and Morris [1993], and applies, evaluates and extends these ideas from the satisfiability domain to the more general domain of CSPs. Specifically, the contributions of the thesis are:

1. The introduction of a local search taxonomy. We examine the better-known local search techniques and recognise four basic strategies: restart, randomness, memory and weighting.

2. The extension of the CSP modelling framework. In order to represent and efficiently solve more realistic problems, we extend the CSP modelling framework to include array-based domains and array-based domain use constraints.

3. The empirical evaluation of constraint weighting. We compare the performance of three constraint weighting strategies on a range of CSP and satisfiability problems and against several other local search techniques. We find that no one technique dominates in all problem domains.

4. The characterisation of constraint weighting performance. Based on our empirical study we identify the weighting behaviours and problem features that favour constraint weighting. We conclude that weighting does better on structured problems where the algorithm can recognise a harder sub-group of constraints.

5. The extension of constraint weighting. We introduce an efficient arc weighting algorithm that additionally weights connections between constraints that are simultaneously violated at a local minimum. This algorithm is empirically shown to outperform standard constraint weighting on a range of CSPs and within a general constraint-solving system. We also look at combining constraint weighting with other local search heuristics, and find that these hybrid techniques can do well on problems where the parent algorithms are evenly matched.

6. The application of constraint weighting to over-constrained domains. Our empirical work suggests that constraint weighting does well for problems with distinctions between constraint groups. This led us to investigate solving real-world over-constrained problems with hard and soft constraint groups, and to introduce two dynamic constraint weighting heuristics that maintain a distinction between hard and soft constraint groups while still adding weights to violated constraints in a local minimum. In an empirical study, the dynamic schemes are shown to outperform other fixed-weighting and non-weighting systems on a range of real-world problems. In addition, the performance of weighting is shown to degrade less severely when soft constraints are added to the system, suggesting that constraint weighting is especially applicable to realistic problems with hard and soft constraints.
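To make the weighting idea concrete, here is a minimal breakout-style constraint weighting local search for a binary CSP, in the spirit of the clause weighting ideas of Selman and Kautz and of Morris cited above. It is a generic sketch, not the thesis's algorithms; the constraint representation, the tie-breaking rules and all identifiers are assumptions.

```python
import random

def weighted_min_conflicts(variables, domains, constraints, max_steps=10000, seed=0):
    """Breakout-style constraint weighting local search for a binary CSP.

    constraints: list of (x, y, predicate) triples; predicate(vx, vy) -> bool.
    Each constraint carries a weight that is increased whenever the search
    is trapped in a local minimum, reshaping the cost surface.
    """
    rng = random.Random(seed)
    assign = {x: rng.choice(domains[x]) for x in variables}
    weights = [1] * len(constraints)

    def violated(a):
        return [i for i, (x, y, pred) in enumerate(constraints) if not pred(a[x], a[y])]

    def cost(a):
        return sum(weights[i] for i in violated(a))

    for _ in range(max_steps):
        viol = violated(assign)
        if not viol:
            return assign                          # all constraints satisfied
        i = rng.choice(viol)                       # a violated constraint
        x = rng.choice(constraints[i][:2])         # one of its variables
        current = cost(assign)
        best_v, best_c = assign[x], current
        for v in domains[x]:                       # best single-value move
            assign[x] = v
            c = cost(assign)
            if c < best_c:
                best_v, best_c = v, c
        assign[x] = best_v
        if best_c >= current:                      # local minimum: add weight
            for j in violated(assign):
                weights[j] += 1
    return None

# Toy example: 3-colour a triangle graph.
neq = lambda a, b: a != b
sol = weighted_min_conflicts(["a", "b", "c"],
                             {v: [0, 1, 2] for v in "abc"},
                             [("a", "b", neq), ("b", "c", neq), ("a", "c", neq)])
print(sol)
```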
23

Improved approximation guarantees for lower-bounded facility location problem

Ahmadian, Sara January 2010 (has links)
We consider the lower-bounded facility location (LBFL) problem (also known as load-balanced facility location), which is a generalization of the uncapacitated facility location (UFL) problem in which each open facility is required to serve a minimum number of clients. More formally, in the LBFL problem we are given a set of clients Ɗ, a set of facilities Ƒ, a non-negative facility-opening cost f_i for each i ∈ Ƒ, a lower bound M, and a distance metric c(i,j) on the set Ɗ ∪ Ƒ, where c(i,j) denotes the cost of assigning client j to facility i. A feasible solution S specifies the set of open facilities F_S ⊆ Ƒ and the assignment of each client j to an open facility i(j) such that each open facility serves at least M clients. Our goal is to find a feasible solution S that minimizes ∑_{i ∈ F_S} f_i + ∑_j c(i(j), j). The current best approximation ratio for LBFL is 550. We substantially advance the state-of-the-art for LBFL by devising an approximation algorithm for LBFL that achieves a significantly improved approximation guarantee of 83.
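The definition above can be written as a standard integer program; this formulation follows from the abstract's definitions rather than being quoted from the thesis (x_{ij} = 1 if client j is assigned to facility i, y_i = 1 if facility i is open):

```latex
\begin{align*}
\min \quad & \sum_{i \in \mathcal{F}} f_i\, y_i \;+\; \sum_{i \in \mathcal{F}} \sum_{j \in \mathcal{D}} c(i,j)\, x_{ij} \\
\text{s.t.} \quad & \sum_{i \in \mathcal{F}} x_{ij} = 1 && \forall j \in \mathcal{D} && \text{(each client is assigned)} \\
& x_{ij} \le y_i && \forall i \in \mathcal{F},\; j \in \mathcal{D} && \text{(assign only to open facilities)} \\
& \sum_{j \in \mathcal{D}} x_{ij} \ge M\, y_i && \forall i \in \mathcal{F} && \text{(lower bound on each open facility's load)} \\
& x_{ij},\, y_i \in \{0,1\}
\end{align*}
```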
24

Augmenting Local Search for Satisfiability

Southey, Finnegan January 2004 (has links)
This dissertation explores approaches to the satisfiability problem, focusing on local search methods. The research endeavours to better understand how and why some local search methods are effective. At the root of this understanding is a set of metrics that characterize the behaviour of local search methods. Based on this understanding, two new local search methods are proposed and tested: the first, SDF, demonstrates the value of the insights drawn from the metrics; the second, ESG, achieves state-of-the-art performance and generalizes the approach to arbitrary 0-1 integer linear programming problems. This generality is demonstrated by applying ESG to combinatorial auction winner determination. Further augmentations to local search are proposed and examined, exploring hybrids that incorporate aspects of backtrack search methods.
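For concreteness, the combinatorial auction winner determination problem mentioned above is the following 0-1 integer linear program — a standard formulation, not necessarily the one used in the dissertation — where B is the set of bids, v_b the value of bid b, S_b its bundle of goods, and G the set of goods:

```latex
\begin{align*}
\max \quad & \sum_{b \in B} v_b\, x_b \\
\text{s.t.} \quad & \sum_{b \,:\, g \in S_b} x_b \le 1 && \forall g \in G && \text{(each good is sold at most once)} \\
& x_b \in \{0,1\} && \forall b \in B
\end{align*}
```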
26

Morfologinio daugiareikšminimo statistiniais metodais parametrų tyrimas / Research of morphological ambiguity parameters by statistical methods

Žiemelis, Audrius 15 June 2010 (has links)
This thesis develops a tool that helps to determine which attributes of a morphological tag are important when resolving morphological ambiguity in the Lithuanian language. The ambiguity problem is solved when, based on its context, a word is assigned exactly one morphological tag. The work uses a corpus of more than 1,200,000 words in which the morphological tags were assigned by an expert, while the list of all possible tags for each word was generated with an auxiliary tool. To resolve the ambiguities, the Viterbi algorithm was implemented and applied: it finds the most probable part-of-speech sequence for a sentence according to bigram or trigram language models built from the corpus. Evaluation with ten-fold cross-validation gave the following results: 90.10% accuracy on ambiguous words (the proportion of ambiguous words assigned the correct morphological tag), and 96.39% overall accuracy when unambiguous words are also included.
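A minimal sketch of Viterbi decoding with a bigram tag model, of the kind applied here to morphological disambiguation, is shown below. The tag names, words and probabilities are illustrative placeholders, not taken from the thesis or its corpus.

```python
import math

def viterbi(words, tags, log_trans, log_emit, log_init):
    """Viterbi decoding with a bigram tag model.

    log_trans[(t_prev, t)] : log P(t | t_prev)
    log_emit[(t, w)]       : log P(w | t)
    log_init[t]            : log P(t) for the first position
    Returns the most probable tag sequence for `words`.
    """
    V = [{t: (log_init.get(t, -math.inf) + log_emit.get((t, words[0]), -math.inf), None)
          for t in tags}]
    for w in words[1:]:
        row = {}
        for t in tags:
            row[t] = max(((V[-1][p][0] + log_trans.get((p, t), -math.inf) +
                           log_emit.get((t, w), -math.inf), p) for p in tags),
                         key=lambda x: x[0])
        V.append(row)
    # backtrack from the best final state
    t = max(V[-1], key=lambda s: V[-1][s][0])
    path = [t]
    for row in reversed(V[1:]):
        t = row[t][1]
        path.append(t)
    return list(reversed(path))

# Toy bigram model with two tags (probabilities are placeholders).
tags = ["Noun", "Verb"]
li = {"Noun": math.log(0.7), "Verb": math.log(0.3)}
lt = {("Noun", "Verb"): math.log(0.6), ("Noun", "Noun"): math.log(0.4),
      ("Verb", "Noun"): math.log(0.8), ("Verb", "Verb"): math.log(0.2)}
le = {("Noun", "laikas"): math.log(0.9), ("Verb", "bėga"): math.log(0.9)}
print(viterbi(["laikas", "bėga"], tags, lt, le, li))  # ['Noun', 'Verb']
```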
27

An Algorithm For The Capacitated Vehicle Routing Problem With Time Windows

Pehlivanoglu, Osman 01 October 2005 (has links) (PDF)
In this thesis the capacitated vehicle routing problem with time windows (VRPTW) is studied, where the objective is to serve a set of geographically dispersed customers with known demands and predefined time windows at minimum cost. It is hard to find an optimal solution for the VRPTW even when the problem size is small. Therefore, many heuristic methods have been developed to obtain near-optimal solutions. In this study a local search algorithm is proposed for solving the VRPTW, which consists of route construction and route improvement phases. Computational experiments are conducted with Solomon's (1987) and Homberger and Gehring's (1999) problem sets in order to test the performance of the proposed algorithm. The computational results are encouraging in terms of solution quality.
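A basic building block of such route construction and improvement heuristics is a feasibility check for a single route under capacity and time-window constraints. The sketch below is a generic illustration under assumed data structures and names; it does not reproduce the thesis's algorithm.

```python
def route_feasible(route, demand, capacity, travel_time, window, service_time, depot=0):
    """Check capacity and time-window feasibility of a single VRPTW route.

    route        : list of customer ids, excluding the depot.
    demand[i]    : demand of customer i.
    window[i]    : (earliest, latest) service start times; window[depot][1]
                   is the latest return time to the depot.
    travel_time  : dict (i, j) -> travel time.
    Waiting is allowed if the vehicle arrives before a window opens.
    """
    if sum(demand[i] for i in route) > capacity:
        return False
    time, prev = 0.0, depot
    for i in route:
        arrival = time + travel_time[(prev, i)]
        start = max(arrival, window[i][0])      # wait if early
        if start > window[i][1]:                # too late to serve i
            return False
        time, prev = start + service_time[i], i
    return time + travel_time[(prev, depot)] <= window[depot][1]

# Tiny example with one vehicle and two customers (illustrative data).
tt = {(0, 1): 5, (1, 2): 4, (2, 0): 6, (0, 2): 7, (1, 0): 5, (2, 1): 4}
print(route_feasible([1, 2], demand={1: 3, 2: 4}, capacity=10,
                     travel_time=tt,
                     window={0: (0, 100), 1: (0, 20), 2: (10, 30)},
                     service_time={1: 2, 2: 2}))  # True
```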
28

Local search methods for constraint problems

Muhammad, Muhammad Rafiq Bin Unknown Date (has links) (PDF)
This thesis investigates the use of local search methods in solving constraint problems. Such problems are very hard in general, and local search offers a useful and successful alternative to existing techniques. The focus of the thesis is to analyze the technique of invariants used in local search. The use of invariants has recently become the cornerstone of local search technology, as they provide a declarative way to specify incremental algorithms. We have produced a series of program libraries in C++ known as the One-Way-Solver. The One-Way-Solver includes the implementation of incremental data structures and is a useful tool for the implementation of local search. The One-Way-Solver is applied to two challenging constraint problems: Boolean satisfiability testing (SAT) and university course timetabling.
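The One-Way-Solver itself is a C++ library; the sketch below only illustrates, in Python, the kind of invariant such a library maintains for SAT local search — the number of violated clauses, updated incrementally under single-variable flips. All names and details are assumptions, not the library's actual API.

```python
class ViolationCount:
    """A one-way invariant: the number of violated clauses of a CNF formula,
    maintained incrementally under single-variable flips."""

    def __init__(self, clauses, assignment):
        self.clauses = clauses                  # list of lists of ints (+v / -v)
        self.assign = dict(assignment)          # var -> bool
        self.watch = {}                         # var -> indices of clauses containing it
        self.true_lits = []                     # per-clause count of true literals
        for idx, clause in enumerate(clauses):
            self.true_lits.append(sum(self._lit_true(l) for l in clause))
            for l in clause:
                self.watch.setdefault(abs(l), []).append(idx)
        self.violated = sum(1 for n in self.true_lits if n == 0)

    def _lit_true(self, lit):
        return self.assign[abs(lit)] == (lit > 0)

    def flip(self, var):
        """Flip `var`, updating the count in time proportional to the
        number of clauses that contain `var`."""
        for idx in self.watch.get(var, []):
            before = self.true_lits[idx]
            delta = sum((1 if (lit > 0) != self.assign[var] else -1)
                        for lit in self.clauses[idx] if abs(lit) == var)
            self.true_lits[idx] = before + delta
            if before == 0 and self.true_lits[idx] > 0:
                self.violated -= 1
            elif before > 0 and self.true_lits[idx] == 0:
                self.violated += 1
        self.assign[var] = not self.assign[var]

# (x1 ∨ ¬x2) ∧ (x2 ∨ x3) with x1 = x2 = x3 = False
inv = ViolationCount([[1, -2], [2, 3]], {1: False, 2: False, 3: False})
print(inv.violated)   # 1 (only the second clause is violated)
inv.flip(3)
print(inv.violated)   # 0
```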
29

Combining search strategies for distributed constraint satisfaction

Magaji, Amina Sambo-Muhammad January 2015 (has links)
Many real-life problems such as distributed meeting scheduling, mobile frequency allocation and resource allocation can be solved using multi-agent paradigms. Distributed constraint satisfaction problems (DisCSPs) provide a framework for describing such problems in terms of related subproblems, called complex local problems (CLPs), which are dispersed over a number of locations, each with its own constraints on the values their variables can take. An agent knows the variables in its CLP plus the variables (and their current values) which are directly related to one of its own variables, and the constraints relating them. It knows little about the rest of the problem. Thus, each CLP is solved by an agent which cooperates with other agents to solve the overall problem. Algorithms for solving DisCSPs can be classified as either systematic or local search, with the former being complete and the latter incomplete. The algorithms generally assume that each agent has only one variable, as they can solve DisCSPs with CLPs using "virtual" agents. However, in large DisCSPs where it is appropriate to trade completeness off against timeliness, systematic search algorithms can be expensive when compared to local search algorithms, which generally converge more quickly to a solution (if a solution is found). A major drawback of local search algorithms is getting stuck at local optima. Significant research has focused on heuristics which can be used in an attempt to either escape or avoid local optima. This thesis makes significant contributions to local search algorithms for DisCSPs. Firstly, we present a novel combination of heuristics in DynAPP (Dynamic Agent Prioritisation with Penalties), a distributed synchronous local search algorithm for solving DisCSPs having one variable per agent. DynAPP combines penalties on values and dynamic agent prioritisation heuristics to escape local optima. Secondly, we develop a divide and conquer approach that handles DisCSPs with CLPs by exploiting the structure of the problem. The divide and conquer approach prioritises the finding of variable instantiations which satisfy the constraints between agents, which are often more expensive to satisfy than constraints within an agent. The approach also exploits concurrency and combines the following search strategies: (i) both systematic and local search; (ii) both centralised and distributed search; and (iii) a modified compilation strategy. We also present an algorithm that implements the divide and conquer approach in Multi-DCA (Divide and Conquer Algorithm for Agents with CLPs). DynAPP and Multi-DCA were evaluated on several benchmark problems and compared to the leading algorithms for DisCSPs and DisCSPs with CLPs respectively. The results show that, in the region of difficult problems, combining search heuristics and exploiting problem structure in distributed constraint satisfaction achieve significant benefits (i.e. they generally use less computational time and incur lower communication costs) over existing competing methods.
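DynAPP's combination of value penalties and dynamic agent prioritisation is described here only at a high level. The sketch below illustrates just the penalties-on-values ingredient for a single agent choosing its next value; all names and the tie-breaking rule are assumptions rather than details taken from the thesis.

```python
import random

def choose_value(domain, conflicts, penalty, rng=random.Random(0)):
    """Pick a value minimising (conflicts + accumulated penalty), as in
    penalty-on-values local search.

    conflicts(v) : number of inter-agent constraints violated if the agent's
                   variable takes value v.
    penalty      : dict value -> accumulated penalty for this variable.
    The caller increments penalty[current_value] when no value improves the
    current cost (a quasi-local-minimum), nudging the search elsewhere.
    """
    scored = [(conflicts(v) + penalty.get(v, 0), v) for v in domain]
    best = min(s for s, _ in scored)
    return rng.choice([v for s, v in scored if s == best])
```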
30

Heurística com busca local para solução do problema de cobertura de rotas com cardinalidade restrita. / Heuristic with local search to solve the cardinality constraint lane covering problem.

Rafael Alzuguir Rosin 19 December 2011 (has links)
The growing need for more efficient, lower-cost and more sustainable operations has led companies to look for opportunities through which these objectives can be achieved. In transportation, collaboration offers such an opportunity. This work addresses the cardinality-constrained lane covering problem (CCLCP), in which companies that run full-truckload trips join efforts to reduce empty vehicle travel by forming closed cycles of lanes. The problem is called cardinality-constrained because the maximum number of trips in a cycle is limited, which makes it NP-hard. The literature offers two (constructive) heuristics and an integer linear programming model for solving this problem. This work presents a heuristic based on a local search method that improves the best results reported in the literature by 3.19% on average. It also reports the execution times of each algorithm and discusses the importance of choosing a good initial solution when building a local search heuristic.
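As a toy illustration of how forming cycles removes empty repositioning, the sketch below greedily pairs full-truckload lanes into two-lane cycles, choosing for each lane the partner that minimises the deadhead distance. The data and identifiers are invented, the cardinality is fixed at two, and the actual CCLCP heuristics and local search of the dissertation are more general.

```python
from math import dist

def pair_lanes(lanes):
    """Greedy pairing of full-truckload lanes (origin, destination) into
    2-lane cycles, minimising the empty (deadhead) distance of each cycle.

    Returns (cycles, leftover). The CCLCP allows cycles up to a given
    cardinality bound; this toy version uses cycles of exactly two lanes.
    """
    unused = list(range(len(lanes)))
    cycles = []
    while len(unused) > 1:
        i = unused.pop(0)
        oi, di = lanes[i]
        # empty travel for cycle (i, j): d_i -> o_j plus d_j -> o_i
        j = min(unused, key=lambda k: dist(di, lanes[k][0]) + dist(lanes[k][1], oi))
        unused.remove(j)
        cycles.append((i, j))
    return cycles, unused

# Illustrative lanes as ((x, y) origin, (x, y) destination) pairs.
lanes = [((0, 0), (10, 0)), ((10, 1), (0, 1)), ((5, 5), (6, 9))]
print(pair_lanes(lanes))   # pairs lane 0 with lane 1; lane 2 is left over
```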
