About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Solution techniques for the Quadratic Assignment Problem and the development of a general purpose simulated annealing algorithm

Connolly, David T. January 1989 (has links)
No description available.
2

Tabu search-revisited

Whittley, Ian Murray January 2002 (has links)
No description available.
3

2-period travelling salesman problem

Butler, Martin January 1997 (has links)
No description available.
4

Evolutionary optimisation of network flow plans for emergency movement in the built environment

French, Thomas Reginald January 2012 (has links)
Planning for emergency evacuation, and, more generally, for emergency movement involving both evacuation (egress) of occupants and ingress of first responders, presents important and challenging problems. A number of the current issues that arise during emergency incidents are due to the uncertainty and transiency of environmental conditions. In general, movement plans are formulated at building design-time, and those involved, such as building occupants and emergency responders, are left to adapt routing plans to actual events as they unfold. In the context of next-generation emergency response systems, it has been proposed to dynamically plan and route individuals during an emergency event, replanning to take account of changes in the environment. In this work, an emergency movement problem, the Maximal Safest Escape (MSE) problem, is formulated as a flow problem in time-dependent networks with time-varying and stochastic edge travel-times and capacities (STV Networks), a model that captures the uncertain and transient environmental conditions. The objective of the MSE problem is to find flow patterns with the a priori maximal probability of successfully conveying all supply from the source to the sink in a given STV Network. The MSE and its deterministic counterpart are proved to be NP-hard. Furthermore, because evaluating the exact quality of candidate solutions is itself computationally demanding, a simulation-based approximation method is presented, built on well-known Monte-Carlo sampling methods. Given the complexity of the problem, and using the approximation method for evaluating solutions, it is proposed to tackle the MSE problem with a metaheuristic approach based on an existing framework that integrates Evolutionary Algorithms (EA) with a state-of-the-art statistical ranking and selection method, the Optimal Computing Budget Allocation (OCBA). Several improvements are proposed for the framework to reduce the computational demand of the ranking method. Empirically, the approach is compared with a simple fitness-averaging approach, and the conditions under which the integrated framework is more efficient are investigated. The performance of the EA is compared against upper and lower bounds on optimal solutions: an upper bound is established through the “wait-and-see” bound, and a lower bound by a naïve random search algorithm (RSA). An experimental design is presented that allows for a fair comparison between the EA and the RSA. While there is no guarantee that the EA will find optimal solutions, this work demonstrates that the EA can still find useful solutions, that is, solutions that are at least better than some baseline (here the lower bound) in terms of solution quality and computational effort. Experimentally, it is demonstrated that the EA performs significantly better than the baseline. The EA also finds solutions relatively close to the upper bound; however, it is difficult to establish how optimistic the upper bound is. The main approach is also compared against an existing approach developed for a related problem, wrapped in a heuristic procedure so that it can be applied to the MSE. Empirical results show that the heuristic approach requires significantly less computation time but finds solutions of significantly lower quality. Overall, this work introduces and empirically verifies the efficacy of a metaheuristic based on a framework integrating EAs with a state-of-the-art statistical ranking and selection technique, the OCBA, for a novel flow problem in STV Networks. It is suggested that the lessons learned during the course of this work, along with the specific techniques developed, may be relevant for addressing other flow problems of similar complexity.
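
As a concrete illustration of the evaluation step described in the abstract above, the following minimal sketch estimates the success probability of a fixed flow plan by Monte-Carlo sampling over realisations of an STV Network. The callables sample_realisation and plan_delivers_all_supply are hypothetical stand-ins for the thesis's network sampler and flow simulator, and the OCBA machinery is omitted.

    def estimate_success_probability(flow_plan, sample_realisation,
                                     plan_delivers_all_supply, n_samples=1000):
        """Monte-Carlo estimate of the probability that a fixed a priori flow
        plan conveys all supply from source to sink.

        sample_realisation(): assumed to draw one deterministic time-dependent
        network from the stochastic, time-varying (STV) model.
        plan_delivers_all_supply(plan, realisation): assumed to simulate the
        plan on that draw and report success or failure.
        """
        successes = sum(
            1 for _ in range(n_samples)
            if plan_delivers_all_supply(flow_plan, sample_realisation())
        )
        return successes / n_samples

An evolutionary algorithm ranks candidate plans by such noisy estimates; the role of OCBA is to decide how many samples each competing candidate receives so that the ranking is statistically reliable without spending the full simulation budget on every candidate.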
5

New local search in the space of infeasible solutions framework for the routing of vehicles

Hamid, Mona January 2018 (has links)
Combinatorial optimisation problems (COPs) have been at the origin of the design of many optimal and heuristic solution frameworks, such as branch-and-bound algorithms, branch-and-cut algorithms, classical local search methods, metaheuristics, and hyperheuristics. This thesis proposes a refined generic and parametrised infeasible local search (GPILS) algorithm for solving COPs and customises it to solve the travelling salesman problem (TSP), for illustration purposes. In addition, a rule-based heuristic is proposed to initialise the infeasible local search, referred to as the parameterised infeasible heuristic (PIH), which gives the analyst some control over the features of the infeasible solution with which the infeasible search starts. A recursive infeasible neighbourhood search (RINS) as well as a generic patching procedure to search the infeasible space are also proposed. These procedures are designed in a generic manner so that they can be adapted to any choice of parameters of the GPILS, where, for simplicity, the set of parameters refers to the set of parameters, components, criteria and rules. Furthermore, a hyperheuristic framework is proposed for optimising the parameters of GPILS, referred to as HH-GPILS. Experiments have been run for both sequential (i.e. simulated annealing, variable neighbourhood search, and tabu search) and parallel hyperheuristics (i.e. genetic algorithms / GAs) to empirically assess the performance of the proposed HH-GPILS in solving the TSP using instances from TSPLIB. Empirical results suggest that HH-GPILS delivers outstanding performance. Finally, an offline learning mechanism is proposed as a seeding technique to improve the performance and speed of the proposed parallel HH-GPILS. The proposed offline learning mechanism makes use of a knowledge base to keep track of the best-performing chromosomes and their scores. Empirical results suggest that this learning mechanism is a promising technique to initialise the GA's population.
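
The abstract above does not spell out GPILS itself; the following is only a generic sketch of the underlying idea of searching through infeasible solutions, applied to the TSP: tours may temporarily omit cities, omissions are penalised, and the penalty steers the search back towards feasibility. The move set, acceptance rule and all names are illustrative assumptions.

    import random

    def tour_length(tour, dist):
        """Length of a closed tour over the visited cities (0 if fewer than 2)."""
        if len(tour) < 2:
            return 0.0
        return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

    def penalised_cost(tour, missing, dist, penalty):
        # Infeasible solutions (tours that omit cities) are admitted but penalised.
        return tour_length(tour, dist) + penalty * len(missing)

    def infeasible_local_search(cities, dist, penalty, iters=20000, seed=0):
        """Local search that moves through infeasible (incomplete) tours."""
        rng = random.Random(seed)
        tour, missing = [cities[0]], set(cities[1:])     # deliberately infeasible start
        best = penalised_cost(tour, missing, dist, penalty)
        for _ in range(iters):
            if missing and (len(tour) < 2 or rng.random() < 0.7):
                city = rng.choice(sorted(missing))       # insert a missing city ...
                pos = rng.randrange(len(tour) + 1)
                cand_tour, cand_missing = tour[:pos] + [city] + tour[pos:], missing - {city}
            elif len(tour) > 1:
                pos = rng.randrange(len(tour))           # ... or drop one (go infeasible)
                cand_tour, cand_missing = tour[:pos] + tour[pos + 1:], missing | {tour[pos]}
            else:
                continue
            cost = penalised_cost(cand_tour, cand_missing, dist, penalty)
            if cost <= best:                             # greedy acceptance rule
                tour, missing, best = cand_tour, cand_missing, cost
        return tour, missing, best

Here cities is a list of city indices and dist a symmetric distance matrix; choosing penalty at least twice the largest pairwise distance ensures that inserting a missing city never worsens the penalised cost, so complete tours eventually dominate shorter but incomplete ones.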
6

Scheduling and Resource Efficiency Balancing: Discrete Species Conserving Cuckoo Search for Scheduling in an Uncertain Execution Environment

Bibiks, Kirils January 2017 (has links)
The main goal of a scheduling process is to decide when and how to execute each of the project’s activities. Despite the large variety of scheduling problems studied, the majority of them can be described as generalisations of the resource-constrained project scheduling problem (RCPSP). Because of its wide applicability and challenging difficulty, the RCPSP has attracted a vast amount of attention in the research community, and a great variety of heuristics have been adapted for solving it. Even though these heuristics are structurally different and operate according to diverse principles, they are designed to obtain only one solution at a time. Recent research on RCPSPs has shown that these kinds of problems have complex multimodal fitness landscapes, characterised by wide solution search spaces and the presence of multiple local and global optima. The main goal of this thesis is twofold. Firstly, it presents a variation of the RCPSP that considers optimisation of projects in an uncertain environment where resources are modelled to adapt to their environment and, as a result, improve their efficiency. Secondly, a modification of a novel evolutionary computation method, Cuckoo Search (CS), is proposed, adapted for solving combinatorial optimisation problems and modified to obtain multiple solutions. To test the proposed methodology, two sets of experiments are carried out. First, the developed algorithm is applied to a real-life software development project. Second, the performance of the algorithm is tested on universal benchmark instances for scheduling problems, modified to take into account the specifics of the proposed optimisation model. The results of both experiments demonstrate that the proposed methodology achieves a competitive level of performance and is capable of finding multiple global solutions, and they prove its applicability to real-life projects.
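
As a rough sketch of a discrete Cuckoo Search of the kind described above (the species-conserving niching used in the thesis to maintain multiple solutions is omitted), the code below searches over activity-list permutations. Here evaluate is a hypothetical decoder that turns a permutation into a schedule, for example via a serial schedule-generation scheme that repairs precedence, and returns its makespan.

    import random

    def discrete_cuckoo_search(evaluate, n_activities, n_nests=25, pa=0.25,
                               iters=500, seed=0):
        """Skeleton of a discrete Cuckoo Search over activity-list permutations.

        evaluate(perm): assumed to decode the permutation into a schedule and
        return its makespan; lower is better.
        """
        rng = random.Random(seed)

        def random_perm():
            perm = list(range(n_activities))
            rng.shuffle(perm)
            return perm

        def perturb(perm, strength):
            # Discrete analogue of a Levy flight: a burst of random swaps.
            new = perm[:]
            for _ in range(max(1, strength)):
                i, j = rng.randrange(len(new)), rng.randrange(len(new))
                new[i], new[j] = new[j], new[i]
            return new

        nests = [random_perm() for _ in range(n_nests)]
        scores = [evaluate(p) for p in nests]
        for _ in range(iters):
            # A cuckoo lays an egg derived from a randomly chosen nest ...
            egg = perturb(nests[rng.randrange(n_nests)], rng.randint(1, 4))
            egg_score = evaluate(egg)
            # ... and it replaces a random nest if it is better.
            j = rng.randrange(n_nests)
            if egg_score < scores[j]:
                nests[j], scores[j] = egg, egg_score
            # A fraction pa of the worst nests is abandoned and rebuilt.
            worst = sorted(range(n_nests), key=scores.__getitem__, reverse=True)
            for k in worst[: int(pa * n_nests)]:
                fresh = random_perm()
                nests[k], scores[k] = fresh, evaluate(fresh)
        best = min(range(n_nests), key=scores.__getitem__)
        return nests[best], scores[best]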
7

Improving manufacturing systems using integrated discrete event simulation and evolutionary algorithms

Kang, Parminder January 2012 (has links)
High-variety, low-volume manufacturing environments have always been a challenge for organisations trying to maintain their overall performance, especially because of the high level of variability induced by ever-changing customer demand, high product variety, cycle times, routings and machine failures. All these factors result in poor flow and degrade overall organisational performance. For most organisations, therefore, process improvement has become a core component of long-term survival. The aim of this research is to develop a methodology for automating operations in process improvement as a part of the lean creative problem-solving process. To achieve the stated aim, the research investigates the job sequencing and buffer management problem in a high-variety/low-volume manufacturing environment, where lead time and total inventory holding cost are used as operational performance measures. The research introduces a novel approach that integrates genetic-algorithm-based multi-objective combinatorial optimisation with a discrete event simulation modelling tool to investigate the effect of variability in high-variety/low-volume manufacturing, considering how improvements in the selected performance measures affect each other. The proposed methodology works iteratively and allows changes in different levels of variability to be incorporated. It improves over existing buffer management methodologies, for instance by overcoming the failure modes of the drum-buffer-rope system and by bringing in the aspect of automation. Moreover, the integration of multi-objective combinatorial optimisation with discrete event simulation allows problem solvers and decision makers to select a solution according to the trade-off between the selected performance measures.
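
One way to picture the integration described above is an evaluation loop in which each candidate, a job sequence together with buffer sizes, is scored by a discrete event simulation run and the genetic algorithm keeps the non-dominated trade-offs. The sketch below assumes a hypothetical run_simulation(sequence, buffers) wrapper returning (lead_time, inventory_holding_cost); the actual DES model and the GA's variation operators are not shown.

    def evaluate_with_simulation(candidate, run_simulation):
        """Score one candidate (job sequence, buffer sizes) via the DES model.

        run_simulation(sequence, buffers): assumed wrapper around a discrete
        event simulation run, returning (lead_time, inventory_holding_cost).
        """
        sequence, buffers = candidate
        return run_simulation(sequence, buffers)

    def pareto_front(population, objectives):
        """Keep the non-dominated candidates (both objectives minimised)."""
        def dominates(a, b):
            return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))
        return [cand for i, cand in enumerate(population)
                if not any(dominates(objectives[j], objectives[i])
                           for j in range(len(population)) if j != i)]

    # Usage sketch:
    #   objs = [evaluate_with_simulation(c, run_simulation) for c in population]
    #   trade_offs = pareto_front(population, objs)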
8

On combinatorial approximation algorithms in geometry / Sur les algorithmes d'approximation combinatoires en géométrie

Jartoux, Bruno 12 September 2018 (has links)
The analysis of approximation techniques is a key topic in computational geometry, both for practical and theoretical reasons. In this thesis we discuss sampling tools for geometric structures and geometric approximation algorithms in combinatorial optimization. Part I focuses on the combinatorics of geometric set systems. We start by discussing packing problems in set systems, including extensions of a lemma of Haussler, mainly the so-called shallow packing lemma. For said lemma we also give an optimal lower bound that had been conjectured but not established in previous work on the topic. Then we use this lemma, together with the recently introduced polynomial partitioning technique, to study a combinatorial analogue of the Macbeath regions from convex geometry: Mnets, for which we unify previous existence results and upper bounds, and also give some lower bounds. We highlight their connection with epsilon-nets, staples of computational and combinatorial geometry, for example by observing that the unweighted epsilon-net bound of Chan et al. (SODA 2012) or Varadarajan (STOC 2010) follows directly from our results on Mnets. Part II deals with local-search techniques applied to geometric restrictions of classical combinatorial optimization problems. Over the last ten years such techniques have produced the first polynomial-time approximation schemes for various problems, such as that of computing a minimum-sized hitting set for a collection of input disks from a set of input points. In fact, it was shown that for many of these problems, local search with radius Θ(1/epsilon²) gives a (1 + epsilon)-approximation with running time n^{O(1/epsilon²)}. However, the question of whether the exponent of n could be decreased to o(1/epsilon²) was left open. We answer it in the negative: the approximation guarantee of local search cannot be improved for any of these problems. The key ingredient is a new lower bound on locally expanding planar graphs, which is then used to show the impossibility results.
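
For context on the local-search algorithms whose limits are analysed above, the following sketch shows the usual swap-based local search for geometric hitting set: start from all points and repeatedly exchange radius chosen points for radius - 1 unchosen ones as long as every disk stays hit. The hits(point, disk) containment test is an assumed predicate, and a tiny constant radius is used purely for illustration (the PTAS guarantee discussed above needs radius Θ(1/epsilon²)).

    from itertools import combinations

    def local_search_hitting_set(points, disks, hits, radius=3):
        """Swap-based local search for geometric hitting set (illustrative only).

        points: hashable, orderable items, e.g. (x, y) tuples.
        hits(p, d): assumed predicate, True if point p lies inside disk d.
        Starts from the trivial hitting set (all points) and keeps swapping
        `radius` chosen points for `radius - 1` unchosen ones while every
        disk remains hit; stops at a local optimum.
        """
        def is_hitting(chosen):
            return all(any(hits(p, d) for p in chosen) for d in disks)

        current = set(points)
        improved = True
        while improved:
            improved = False
            outside = [p for p in points if p not in current]
            for out in combinations(sorted(current), radius):
                remainder = current - set(out)
                for into in combinations(outside, radius - 1):
                    if is_hitting(remainder | set(into)):
                        current = remainder | set(into)
                        improved = True
                        break
                if improved:
                    break
        return current

Each accepted swap shrinks the hitting set by one point, so the procedure terminates at a local optimum; with the full radius Θ(1/epsilon²), enumerating the swap neighbourhood is what drives the n^{O(1/epsilon²)} running time quoted above.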
9

Scheduling and resource efficiency balancing : discrete species conserving cuckoo search for scheduling in an uncertain execution environment

Bibiks, Kirils January 2017 (has links)
The main goal of a scheduling process is to decide when and how to execute each of the project's activities. Despite the large variety of scheduling problems studied, the majority of them can be described as generalisations of the resource-constrained project scheduling problem (RCPSP). Because of its wide applicability and challenging difficulty, the RCPSP has attracted a vast amount of attention in the research community, and a great variety of heuristics have been adapted for solving it. Even though these heuristics are structurally different and operate according to diverse principles, they are designed to obtain only one solution at a time. Recent research on RCPSPs has shown that these kinds of problems have complex multimodal fitness landscapes, characterised by wide solution search spaces and the presence of multiple local and global optima. The main goal of this thesis is twofold. Firstly, it presents a variation of the RCPSP that considers optimisation of projects in an uncertain environment where resources are modelled to adapt to their environment and, as a result, improve their efficiency. Secondly, a modification of a novel evolutionary computation method, Cuckoo Search (CS), is proposed, adapted for solving combinatorial optimisation problems and modified to obtain multiple solutions. To test the proposed methodology, two sets of experiments are carried out. First, the developed algorithm is applied to a real-life software development project. Second, the performance of the algorithm is tested on universal benchmark instances for scheduling problems, modified to take into account the specifics of the proposed optimisation model. The results of both experiments demonstrate that the proposed methodology achieves a competitive level of performance and is capable of finding multiple global solutions, and they prove its applicability to real-life projects.
10

The use of geometric structures in graphics and optimization / L'utilisation des structures géométriques pour synthèse d'image et optimisation

Bus, Norbert 07 October 2015 (has links)
Real-world data has a large geometric component, showing significant geometric patterns. How to use the geometric nature of data to design efficient methods has become a very important topic in several scientific fields, e.g., computational geometry, discrete geometry, computer graphics, computer vision. In this thesis we use geometric structures to design efficient algorithms for problems in two domains, computer graphics and combinatorial optimization. Part I focuses on a geometric data structure called the well-separated pair decomposition and its usage for one of the most challenging problems in computer graphics, namely efficient photo-realistic rendering. One solution is the family of many-lights methods that approximate global illumination by individually computing illumination from a large number of virtual point lights (VPLs) placed on surfaces. Considering each VPL individually results in a vast number of calculations. One successful strategy to reduce computations is to group the VPLs into a small number of clusters that are treated as individual lights with respect to each point to be shaded. We use the well-separated pair decomposition of points as a basis for a data structure for pre-computing and compactly storing a set of view-independent candidate VPL clusterings, showing that a suitable clustering of the VPLs can be efficiently extracted from this data structure. We show that instead of clustering points and/or VPLs independently, what is required is to cluster the product-space of the set of points to be shaded and the set of VPLs based on the induced pairwise illumination. Additionally, we propose an adaptive sampling technique to reduce the number of visibility queries for each product-space cluster. Our method handles any light source that can be approximated with virtual point lights (VPLs) as well as highly glossy materials, and outperforms previous state-of-the-art methods. Part II focuses on developing new approximation algorithms for a fundamental NP-complete problem in computational geometry, namely the minimum hitting set problem, with particular focus on the case where, given a set of points and a set of disks, we wish to compute the minimum-sized subset of the points that hits all disks. It turns out that efficient algorithms for geometric hitting set rely on a key geometric structure, called an epsilon-net. We give an algorithm that uses only Delaunay triangulations to construct epsilon-nets of size 13.4/epsilon, and we provide a practical implementation of a technique to calculate hitting sets in near-linear time using small-sized epsilon-nets. Our results yield a 13.4 approximation for the hitting set problem with an algorithm that runs efficiently even on large data sets. For smaller data sets, we present an implementation of the local search technique along with tight bounds on its approximation factor, yielding an (8 + epsilon)-approximation algorithm with running time O(n^{2.34}).
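
A standard way to turn small epsilon-nets into an approximate hitting set is the iterative reweighting scheme of Brönnimann and Goodrich; the sketch below shows that generic scheme rather than the thesis's specific near-linear-time implementation. build_net(weights, eps) is an assumed weighted epsilon-net constructor (the thesis obtains such nets from Delaunay triangulations) and hits(p, d) is an assumed containment test.

    def hitting_set_via_nets(points, disks, hits, build_net):
        """Iterative-reweighting hitting set (Bronnimann-Goodrich style sketch).

        build_net(weights, eps): assumed to return a subset of points that
        hits every disk whose total point weight is at least eps * total weight.
        hits(p, d): assumed predicate, True if point p lies inside disk d.
        The optimum size is guessed by doubling k; with the right guess the
        inner loop terminates after roughly O(k log n) reweighting rounds.
        """
        k = 1
        while True:
            weights = {p: 1.0 for p in points}
            for _ in range(4 * k * max(1, len(points).bit_length())):
                net = build_net(weights, 1.0 / (2 * k))
                missed = next((d for d in disks
                               if not any(hits(p, d) for p in net)), None)
                if missed is None:
                    return net                       # the net hits every disk
                for p in points:                     # the missed disk must be "light":
                    if hits(p, missed):              # double the weight of its points
                        weights[p] *= 2.0
            k *= 2                                   # guess for the optimum was too small

If the net builder returns nets of size c/eps, the returned hitting set has size about 2ck, which is how the constant from the epsilon-net construction enters the approximation factor.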
