101

Architecting the deployment of cloud-hosted services for guaranteeing multitenancy isolation

Ochei, Laud Charles January 2017
In recent years, software tools used for Global Software Development (GSD) processes (e.g., continuous integration, version control and bug tracking) have increasingly been deployed in the cloud to serve multiple users. Multitenancy is an important architectural property in cloud computing in which a single instance of an application is used to serve multiple users. There are two key challenges in implementing multitenancy: (i) ensuring isolation between multiple tenants accessing the service, or between components designed for (or integrated with) the service; and (ii) resolving trade-offs between varying degrees of isolation between tenants or components. The aim of this thesis is to investigate how to architect the deployment of cloud-hosted services while guaranteeing the required degree of multitenancy isolation. Existing approaches for architecting the deployment of cloud-hosted services to serve multiple users have paid little attention to evaluating the effect of varying degrees of multitenancy isolation on the required performance, resource consumption and access privileges of tenants (or components). Approaches for isolating tenants (or components) are usually implemented at the lower layers of the cloud stack and often apply to the entire system rather than to individual tenants (or components). This thesis adopts a multimethod research strategy to provide a set of novel approaches for addressing these problems. Firstly, a taxonomy of deployment patterns and a general process, CLIP (CLoud-based Identification process for deployment Patterns), were developed to guide architects in selecting applicable cloud deployment patterns (together with the supporting technologies), using the taxonomy, for deploying services to the cloud. Secondly, an approach named COMITRE (COmponent-based approach to Multitenancy Isolation Through request RE-routing) was developed together with supporting algorithms and then applied to three case studies to empirically evaluate the varying degrees of isolation between tenants enabled by multitenancy patterns for three different cloud-hosted GSD processes, namely continuous integration, version control, and bug tracking. After that, a synthesis of findings from the three case studies was carried out to provide an explanatory framework and new insights into varying degrees of multitenancy isolation. Thirdly, a model-based decision support system, together with four variants of a metaheuristic for solving the model, was developed to provide an optimal solution for deploying components of a cloud-hosted application with guarantees of multitenancy isolation. By creating and applying the taxonomy, it was learnt that most deployment patterns are related and can be implemented by combining them with others, for example, in hybrid deployment scenarios to integrate data residing in multiple clouds. It has been argued that the shared component is better for reducing resource consumption while the dedicated component is better at avoiding performance interference. However, as the experimental results show, there are certain GSD processes where that might not necessarily be so, for example, in version control, where additional copies of the files are created in the repository, thus consuming more disk space. Over time, performance begins to degrade as more time is spent searching across many files on the disk.
Extensive performance evaluation of the model-based decision support system showed that the optimal solutions obtained had low variability and percent deviation, and were produced with low computational effort when compared to a given target solution.
102

Análise do comportamento dos tempos de produção em um sistema de manufatura flexível em um problema de escalonamento em um job shop: abordagem utilizando conceito de caminho crítico / Analysis of the behavior of production times in a flexible manufacturing system for a job shop scheduling problem: an approach using the critical path concept

Rodrigues, Antonio Gabriel 01 March 2007
Neste trabalho é abordado o Problema de Escalonamento em um job shop, considerando restrições de datas de entrega, turnos de produção e tempo de setup entre operações. Considera-se um ambiente de Sistema de Manufatura Flexível que, dado o alto nível de automação, permite a previsibilidade dos processos de carregamento dos recursos à área de processamento. O problema foi modelado através de uma Função Objetivo fn composta de três variáveis de decisão. A importância da contribuição de cada variável para o valor de fn é gerida pela atribuição de valores aos pesos associados às variáveis. Na abordagem proposta, são utilizadas técnicas de Tecnologia de Grupo e Busca Tabu. O modelo implementado é uma modificação da técnica i-TSAB, proposta por Nowicki e Smutnicki, a qual apresenta bons resultados no tratamento do Problema de Escalonamento em um job shop (PEJS) clássico. A consideração das restrições adicionais ao PEJS aumenta a complexidade do modelo implementado, porém deixa o problema mais próximo da realidade. / In this work the Job Shop Scheduling Problem is studied, considering due date, production shift and setup time constraints. The problem arises in a Flexible Manufacturing System, whose high degree of automation allows predictability in the processes of loading and unloading jobs on the machines. The problem is modeled through an objective function fn composed of three weighted decision variables. The importance of each variable in the final value of fn is managed by assigning values to the weights of these variables. In the proposed approach, Group Technology and Tabu Search techniques are used. The implemented model is a modification of the i-TSAB technique proposed by Nowicki and Smutnicki. The consideration of additional constraints in the Job Shop Scheduling Problem increases the complexity of the implementation but makes the problem closer to industrial reality. The model was validated using benchmark instances, to which the data for the additional constraints were added.
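For readers unfamiliar with the mechanics, a minimal tabu search sketch is given below for a toy single-machine sequencing problem with sequence-dependent setup times; the data, the swap neighborhood, the tenure and the aspiration rule are illustrative assumptions and do not reproduce the i-TSAB variant implemented in the dissertation.

```python
import random

# Toy data (assumed): processing times and sequence-dependent setup times.
proc = [4, 3, 6, 2, 5]
setup = [[0, 2, 1, 3, 2],
         [2, 0, 2, 1, 3],
         [1, 2, 0, 2, 1],
         [3, 1, 2, 0, 2],
         [2, 3, 1, 2, 0]]

def total_time(seq):
    """Completion time of the last job: processing plus setups between consecutive jobs."""
    t = proc[seq[0]]
    for prev, cur in zip(seq, seq[1:]):
        t += setup[prev][cur] + proc[cur]
    return t

def tabu_search(iterations=200, tenure=5, seed=0):
    random.seed(seed)
    current = list(range(len(proc)))
    random.shuffle(current)
    best, best_cost = current[:], total_time(current)
    tabu = {}  # job pair -> iteration index until which swapping it again is forbidden
    for it in range(iterations):
        candidates = []
        for i in range(len(current) - 1):
            for j in range(i + 1, len(current)):
                move = (min(current[i], current[j]), max(current[i], current[j]))
                neigh = current[:]
                neigh[i], neigh[j] = neigh[j], neigh[i]
                cost = total_time(neigh)
                # Aspiration: a tabu move is still allowed if it beats the best known solution.
                if tabu.get(move, -1) < it or cost < best_cost:
                    candidates.append((cost, neigh, move))
        if not candidates:
            continue
        cost, current, move = min(candidates, key=lambda c: c[0])
        tabu[move] = it + tenure
        if cost < best_cost:
            best, best_cost = current[:], cost
    return best, best_cost

print(tabu_search())
```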
103

Modelo hipermídia para geração de layouts de interfaces de aplicações / A hypermedia model for generating application interface layouts

Nesi, Luan Carlos 27 March 2014
Nesse trabalho foi desenvolvido um modelo computacional de Hipermídia Adaptativa para geração de layouts de interface de aplicações. A pesquisa partiu de uma revisão sobre Hipermídia Adaptativa, com um apanhado sobre os conceitos e características dos métodos e técnicas de adaptação, a fim de embasar seu desenvolvimento. Após, avaliou-se o uso das metaheurísticas Algoritmo Genético, Busca Tabu e Algoritmo Memético como ferramentas de apoio no desenvolvimento do modelo. Na sequência, as Redes de Autômatos Estocásticos nortearam a modelagem do formalismo utilizado para a retenção de conhecimento. A partir dessas bases, foi desenvolvida a prova de conceito. Em seguida, apresentam-se os experimentos realizados para validação. Os resultados obtidos pelo modelo foram de boa qualidade, indo ao encontro dos objetivos da pesquisa. Como decorrência deste trabalho, obteve-se um sistema capaz de gerar layouts contemplando as características dos usuários e seus dispositivos, sendo capaz de acompanhar uma tendência de consumo de conteúdos não só mercadológica, mas também social. / In this work, a computational model of Adaptive Hypermedia was developed for generating application interface layouts. The research began with a review of Adaptive Hypermedia, with an overview of the concepts and characteristics of adaptation methods and techniques, in order to ground its development. Next, the use of the Genetic Algorithm, Tabu Search and Memetic Algorithm metaheuristics was evaluated as support tools in the development of the model. Then, Stochastic Automata Networks guided the modeling of the formalism used for knowledge retention. On these bases, the proof of concept was developed. The experiments carried out for validation are then presented. The results obtained by the model were of good quality, meeting the research objectives. As a result of this work, a system was obtained that is capable of generating layouts that take the characteristics of users and their devices into account, following a trend in content consumption that is not only market-driven but also social.
104

Tomada de decisão Fuzzy e busca Tabu aplicadas ao planejamento da expansão de sistemas de transmissão / Fuzzy decision making and Tabu search applied to planning the expansion of transmission systems

Aldir Silva Sousa 27 February 2009
Neste trabalho é proposta uma nova técnica de solução para resolver o problema de planejamento da expansão de sistemas de transmissão estático através da introdução da tomada de decisão fuzzy. Na técnica apresentada neste trabalho, a tomada de decisão fuzzy é aplicada para o desenvolvimento de um algoritmo heurístico construtivo. O sistema fuzzy é utilizado para contornar alguns problemas críticos das heurísticas que utilizam o índice de sensibilidade como guia para a inserção de novas linhas. A heurística apresentada neste trabalho é baseada na técnica dividir para conquistar. Verificou-se que a deficiência das heurísticas construtivas é decorrente da decisão de inserir novas linhas baseada em valores não seguros encontrados através da solução do modelo utilizado. Para contornar tal deficiência, sempre que surgirem valores não seguros divide-se o problema original em dois subproblemas: um que analisa a qualidade da resposta para o caso em que a linha é inserida e outro que verifica a qualidade da resposta para o caso em que a linha não é inserida. A tomada de decisão fuzzy é utilizada para decidir quando dividir o problema em dois novos subproblemas. Utilizou-se o modelo cc com a estratégia de Villasana-Garver-Salon para realizar a modelagem da rede elétrica para os problemas da expansão de sistemas de transmissão aqui propostos. Ao serem realizados testes em sistemas de pequeno, médio e grande portes, certificou-se que o método pode encontrar a solução ótima de sistemas de pequeno e médio portes. Porém, a solução ótima dos sistemas de grande porte testados não foi encontrada. Para melhorar a qualidade da solução encontrada, utilizou-se, em uma segunda fase, a metaheurística busca tabu. A busca tabu utiliza o modelo cc. Os resultados se mostraram bastante promissores. Os testes foram realizados em alguns sistemas reais brasileiros e com o sistema real colombiano. / A new solution technique for the static long-term transmission expansion planning (TEP) problem based on fuzzy decision making is proposed. The technique applies the concepts of fuzzy decision making in a constructive heuristic algorithm. The fuzzy system is used to circumvent some critical problems of heuristics that use sensitivity indices as a guide for the insertion and construction of new lines. The heuristic algorithm proposed in this work is based on the divide-and-conquer technique. It has been verified that the deficiency of the constructive heuristics is due to the decision of inserting new lines based only on information given by the index, which is usually calculated from a relaxed mathematical representation of the problem and can become less accurate during the solution process. In order to deal with such a problem, whenever the quality of the index decreases, the original problem is divided into two subproblems: one examines the quality of the solution when the transmission line indicated by the sensitivity index is inserted and the other checks the opposite. Fuzzy decision making is used to decide when to divide the problem into two subproblems, based on additional information. The hybrid linear model is used to represent the long-term transmission expansion planning problem in the proposed algorithm. Tests were carried out on small-, medium- and large-scale systems. The optimal solution of the small- and medium-scale systems was found using only the constructive heuristic algorithm with fuzzy decision making. For the large-scale systems, the solutions of the constructive heuristic algorithm with fuzzy decision making were used to initialize a tabu search. The tabu search uses the DC model. The results are very promising. The tests were carried out on some real Brazilian systems and on the real Colombian system.
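To make the branching decision concrete, the toy sketch below uses an assumed triangular membership function for how "unreliable" a candidate line's sensitivity index is and splits into the two subproblems when that membership crosses a threshold; the membership shape, the threshold and the normalized gap measure are illustrative assumptions, not the rules used in the thesis.

```python
def triangular(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

def should_branch(sensitivity_gap, threshold=0.6):
    """Decide whether to split the problem into two subproblems.

    sensitivity_gap: assumed normalized measure (0..1) of how clearly the best
    candidate line beats the runner-up; a small gap means the index is not a
    safe guide, so the membership of 'unreliable' is high.
    """
    unreliable = triangular(1.0 - sensitivity_gap, 0.3, 1.0, 1.7)
    return unreliable >= threshold

# Example: a candidate line whose index barely beats the runner-up triggers branching.
print(should_branch(sensitivity_gap=0.05))   # True  -> solve both subproblems
print(should_branch(sensitivity_gap=0.60))   # False -> insert the indicated line
```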
105

Reconfiguração ótima de sistemas de distribuição de energia elétrica baseado no comportamento de colônias de formigas / Optimal reconfiguration of the electric power distribution systems using a modified ant colony system algorithm

Fernando Silva Pereira 26 February 2010
O objetivo deste trabalho é apresentar uma nova abordagem para obtenção de configurações para sistemas de distribuição de energia elétrica com o intuito de minimizar o valor de perdas ativas sem violar as restrições operacionais. Para isso, considera-se que os sistemas de distribuição estão operando em regime permanente e que suas fases estão equilibradas e simétricas, podendo o sistema ser representado por um diagrama unifilar. A reconfiguração é feita de forma a redistribuir os fluxos de corrente nas linhas, transferindo cargas entre os alimentadores e melhorando o perfil de tensão ao longo do sistema. O problema de reconfiguração do sistema pode ser formulado como um problema de programação não-linear inteiro misto. Devido à explosão combinatorial inerente a este tipo de problema, a resolução do mesmo por técnicas de otimização clássicas torna-se pouco atraente, dando espaço para técnicas heurísticas e metaheurísticas. Essas outras, mesmo não garantindo o ótimo global, são capazes de encontrar boas soluções em um espaço de tempo relativamente curto. Para a resolução do problema de reconfiguração, utilizou-se uma nova metodologia baseada no comportamento de colônias de formigas em busca de alimento na natureza. Nesta, formigas artificiais (agentes) exploram o meio ambiente (sistema de distribuição) e trocam informações para tentar encontrar a topologia que apresente os menores valores de perdas ativas. Para o cálculo das perdas, este trabalho também apresenta uma nova abordagem para resolução do problema de fluxo de potência (FP) em sistemas de distribuição radial. O fluxo de potência é uma ferramenta básica utilizada pelos centros de controle para determinar os estados e condições operacionais desses sistemas de potência. Basicamente, as metodologias empregadas para o cálculo do fluxo de potência são baseadas nos métodos clássicos de Newton ou Gauss. Mas em sistemas de distribuição de energia, devido a particularidades inerentes a estes, como a alta relação entre resistência e reatância das linhas (r/x) e a operação radial, estes métodos apresentam problemas de convergência e se tornam ineficientes na maioria das vezes. A abordagem consiste na associação dos métodos da função penalidade e de Newton. O mal-condicionamento da matriz Jacobiana de Newton é resolvido pela associação com o método da função penalidade. São apresentados testes realizados em sistemas de 5 barras, 16 barras, 33 barras, 69 barras e 136 barras para avaliar a potencialidade das técnicas propostas. Os resultados são considerados bons ou muito bons quando comparados com as técnicas existentes atualmente. / The objective of this work is to present a novel methodology for obtaining new configurations of the distribution system in order to minimize the active power losses without violating operational constraints. For this, it is considered that the distribution system is operating in a steady state and is balanced, and can therefore be represented by a one-line diagram. The reconfiguration is done in order to redistribute the current flows on the distribution lines, transferring loads among the feeders and improving the voltage profile along the system. Such a problem can be formulated as a mixed integer nonlinear programming problem. Due to the combinatorial explosion inherent in this type of problem, its solution by classical optimization techniques is not appealing, and heuristic and metaheuristic techniques are better suited. Although the latter do not guarantee a global optimum, they are able to find good solutions in a relatively short time. The solution of the reconfiguration problem in this approach makes use of a novel methodology based on the behavior of ant colonies as they search for food in nature. In this technique, artificial ants (agents) explore the environment (the distribution system) and exchange information among themselves in order to find the topology that provides the smallest active losses. For the calculation of active losses, this work also presents a novel approach for the solution of the power flow problem in radial distribution systems. The solution of the power flow problem is used by system operators to determine the state and operational conditions of power systems. Basically, the most common techniques used in the power flow solution are based on either Newton's or Gauss's approaches. However, due to particular characteristics of distribution systems, such as the high r/x ratio of the lines and the radial topology, these methods present convergence problems and are inefficient in most cases. The novel technique therefore consists in associating Newton's method with the penalty function approach; the ill-conditioning of the Jacobian matrix in Newton's method is overcome with the penalty function method. Tests performed on 5-, 16-, 33-, 69- and 136-bus systems are then presented in order to assess the effectiveness of both proposed techniques. The results are considered good or very good when compared with currently existing techniques.
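A generic ant-colony sketch is shown below to illustrate the pheromone mechanics the abstract describes; it searches a toy binary configuration against a made-up surrogate for the losses rather than running an actual radial reconfiguration with a power flow, so the evaluation function, the parameters and the problem size are all assumptions.

```python
import random

N_VARS, N_ANTS, N_ITER = 8, 10, 50
EVAP, Q = 0.3, 1.0                       # evaporation rate and deposit constant (assumed)
TARGET = [1, 0, 1, 1, 0, 0, 1, 0]        # stand-in for the loss-minimizing configuration

def losses(config):
    """Toy surrogate for active losses: distance to an assumed best configuration."""
    return sum(c != t for c, t in zip(config, TARGET))

def ant_colony(seed=0):
    random.seed(seed)
    # One pheromone value per (variable, value) pair, initialized uniformly.
    pher = [[1.0, 1.0] for _ in range(N_VARS)]
    best, best_cost = None, float("inf")
    for _ in range(N_ITER):
        solutions = []
        for _ in range(N_ANTS):
            # Each ant picks value 0 or 1 with probability proportional to its pheromone.
            config = [0 if random.random() < p0 / (p0 + p1) else 1
                      for p0, p1 in pher]
            solutions.append((losses(config), config))
        solutions.sort(key=lambda s: s[0])
        if solutions[0][0] < best_cost:
            best_cost, best = solutions[0]
        # Evaporate, then let the iteration-best ant reinforce its choices.
        for v in range(N_VARS):
            pher[v][0] *= (1.0 - EVAP)
            pher[v][1] *= (1.0 - EVAP)
            pher[v][solutions[0][1][v]] += Q / (1.0 + solutions[0][0])
    return best, best_cost

print(ant_colony())
```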
106

Métodos quantitativos para o problema de dimensionamento e sequenciamento de lotes na indústria de embalagens de vidro / Quantitative methods for lot sizing and scheduling in the glass containers industry

Ramon Faganello Fachini 16 January 2015
O problema de dimensionamento e sequenciamento de lotes vem sendo extensivamente estudado por pesquisadores da área de Pesquisa Operacional e há uma tendência de que tais trabalhos passem a integrar cada vez mais aspectos reais dos processos produtivos. Entretanto, percebe-se que os estudos conduzidos em alguns setores industriais negligenciam importantes restrições tecnológicas dos processos de produção, e isso afasta esses trabalhos de Pesquisa Operacional de uma aplicação efetiva, como é o caso da indústria de embalagens de vidro. Neste contexto, propõe-se um modelo de programação inteira mista e um método de solução para o problema de dimensionamento e sequenciamento de lotes na indústria de embalagens de vidro, sendo que este trabalho diferencia-se dos demais existentes na literatura por agregar restrições tecnológicas específicas desse processo produtivo. O modelo proposto, denominado CLSD-GCST, foi amplamente validado com base em um conjunto de testes com 40 instâncias de um problema real de uma grande empresa do setor, no pacote comercial IBM ILOG CPLEX Optimization Studio Versão 12.5. A validação do modelo incluiu ainda uma análise de ganhos potenciais para o negócio baseada no modelo SCOR. Já o método de solução proposto consiste em uma metaheurística de Busca em Vizinhança Variável (VNS) e se mostrou promissor para a solução do problema estudado, proporcionando resultados de qualidade em um baixo tempo computacional. Além disso, o VNS superou o Branch-and-Cut do CPLEX para grandes instâncias, nas quais o pacote comercial encontrou dificuldades. Por fim, o VNS proposto também foi validado por meio da análise de testes computacionais e suas principais características foram avaliadas sistematicamente, gerando um conjunto de informações que pode direcionar a utilização e, até mesmo, a evolução desse método em pesquisas futuras. / The lot sizing and scheduling problem has been extensively studied by Operations Research scientists, and there is a tendency to incorporate more real aspects of production processes in this research. However, it can be noticed that studies conducted in some industrial sectors neglect important technological constraints of the production process, which keeps these Operations Research works away from effective application, as happens in the glass containers industry. In this context, a mixed integer programming model and a solution method are proposed for the lot sizing and scheduling problem in the glass containers industry; the main difference between this work and others in the literature is the inclusion of process-specific technological constraints. The proposed model, named CLSD-GCST, was widely validated by a set of tests performed with 40 instances from a real problem of a large company in the sector, using the commercial package IBM ILOG CPLEX Optimization Studio Version 12.5. The model validation also included an analysis of potential business gains based on the SCOR framework. The proposed solution method consists of a Variable Neighborhood Search (VNS) metaheuristic and proved to be promising for solving the studied problem, providing good-quality results in low computational time. Moreover, the VNS outperformed the CPLEX branch-and-cut for large instances, in which the commercial package found difficulties. Lastly, the proposed VNS was also validated by means of the analysis of computational tests, and its main characteristics were systematically evaluated, generating a set of information that may guide the application of this method, and even its evolution, in future research.
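For reference, a minimal Variable Neighborhood Search skeleton of the kind mentioned above is sketched on a toy permutation objective; the shaking neighborhoods (k random swaps), the swap local search and the objective itself are assumptions and not the CLSD-GCST-specific moves of the dissertation.

```python
import random

WEIGHTS = [3, 1, 4, 1, 5, 9, 2, 6]  # assumed job weights

def objective(perm):
    """Toy objective (total weighted position), standing in for the lot-sizing cost."""
    return sum(WEIGHTS[job] * (pos + 1) for pos, job in enumerate(perm))

def shake(perm, k):
    """k random swaps define the k-th neighborhood."""
    p = perm[:]
    for _ in range(k):
        i, j = random.sample(range(len(p)), 2)
        p[i], p[j] = p[j], p[i]
    return p

def local_search(perm):
    """Iterative improvement over the pairwise-swap neighborhood."""
    improved = True
    while improved:
        improved = False
        for i in range(len(perm) - 1):
            for j in range(i + 1, len(perm)):
                cand = perm[:]
                cand[i], cand[j] = cand[j], cand[i]
                if objective(cand) < objective(perm):
                    perm, improved = cand, True
    return perm

def vns(k_max=3, iterations=100, seed=0):
    random.seed(seed)
    best = local_search(list(range(len(WEIGHTS))))
    for _ in range(iterations):
        k = 1
        while k <= k_max:
            candidate = local_search(shake(best, k))
            if objective(candidate) < objective(best):
                best, k = candidate, 1   # improvement: restart from the first neighborhood
            else:
                k += 1                   # no improvement: move to a larger neighborhood
    return best, objective(best)

print(vns())
```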
107

Implementação de algoritmo metaheurístico simulated annealing para problema de seleção de contingência em análise de segurança de redes elétricas / Implementation of a simulated annealing metaheuristic algorithm for the contingency selection problem in power network security analysis

Tomazi, Fausto Stefanello 23 September 2016
Os sistemas de potência desempenham um papel fundamental na economia de uma nação, fornecendo energia elétrica com qualidade e sem interrupções à população. Para que isto seja possível, grandes investimentos são aplicados no setor para garantir o fornecimento. No entanto, qualquer equipamento está sujeito a falhas, e analisar o impacto que falhas em equipamentos causam no fornecimento é uma das tarefas executadas pelos centros de controle, chamada de Análise de Segurança. Desta forma, os centros de controle são responsáveis por elaborar planos de contingência para que, caso algum equipamento saia de operação, o impacto sofrido pela rede seja o menor possível. Uma importante tarefa da Análise de Segurança é a Seleção de Contingências, encarregada de selecionar os equipamentos mais importantes do sistema para que a Análise de Segurança possa criar planos de prevenção caso os respectivos equipamentos saiam de operação. Os grandes sistemas elétricos existentes hoje são compostos de milhares de equipamentos, e uma análise mais detalhada de cada equipamento é de difícil resolução; é neste cenário que a seleção de contingências ganha importância. A Seleção de Contingências é encarregada de buscar e classificar as restrições mais importantes da rede, porém, para redes de grande porte com milhares de itens, analisar o impacto de cada item é uma tarefa que pode levar muito tempo, não permitindo que o cálculo seja efetuado durante a operação do sistema. Desta forma, faz-se necessário executar a Seleção de Contingências de forma eficiente e eficaz. Este estudo propõe o desenvolvimento de um algoritmo metaheurístico de Simulated Annealing para que a seleção de contingências seja executada de forma a atender todas as restrições de tempo impostas pelos centros de controle. Nos experimentos é possível verificar que, após uma sintonia de parâmetros para a instância do problema abordado, os resultados encontrados atendem às restrições dos centros de controle; também é possível observar que os resultados são ligeiramente melhores que os de trabalhos encontrados na literatura, nos quais o mesmo problema é abordado pela metaheurística do Algoritmo Genético. / Power systems play a key role in a nation's economy by providing quality, uninterrupted electric power to the population. For this to be possible, large investments are made in the sector to guarantee the supply. However, any piece of equipment is subject to failures, and analyzing the impact that equipment failures have on the supply is one of the tasks performed by control centers, called Security Analysis. The control centers are thus responsible for preparing contingency plans so that, in the event of any equipment going out of operation, the impact suffered by the network is as small as possible. An important task within Security Analysis is Contingency Selection, which is in charge of selecting the most important equipment in the system so that prevention plans can be created in case the respective equipment goes out of operation. The large electrical systems that exist today are made up of thousands of pieces of equipment, and a detailed analysis of each one is hard to carry out; it is in this scenario that contingency selection gains importance. Contingency Selection is responsible for searching for and ranking the most important constraints of the network, but for large networks with thousands of items, analyzing the impact of each item can take a long time, preventing the calculation from being performed during system operation. It is therefore necessary to perform Contingency Selection efficiently and effectively. This study proposes a Simulated Annealing metaheuristic algorithm so that contingency selection can be performed within the time constraints imposed by the control centers. The experiments show that, after tuning the parameters for the problem instance at hand, the results meet the control centers' constraints and are slightly better than the results of works found in the literature, where the same problem is addressed with a Genetic Algorithm metaheuristic.
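A minimal simulated annealing skeleton in the spirit of the method described above is sketched below; the severity scores, the swap neighborhood, the geometric cooling schedule and its parameters are illustrative assumptions rather than the tuned values reported in the dissertation.

```python
import math
import random

SEVERITY = [0.9, 0.1, 0.7, 0.3, 0.8, 0.2, 0.6, 0.4]  # assumed per-branch severity indices
K = 3                                                # number of contingencies to select

def score(selection):
    """Toy objective: total severity of the selected branches (to be maximized)."""
    return sum(SEVERITY[i] for i in selection)

def neighbor(selection, n_items):
    """Swap one selected branch for an unselected one."""
    sel = set(selection)
    entering = random.choice([i for i in range(n_items) if i not in sel])
    leaving = random.choice(list(sel))
    sel.remove(leaving)
    sel.add(entering)
    return sorted(sel)

def simulated_annealing(t0=1.0, cooling=0.95, steps=500, seed=0):
    random.seed(seed)
    current = sorted(random.sample(range(len(SEVERITY)), K))
    best, temp = current, t0
    for _ in range(steps):
        cand = neighbor(current, len(SEVERITY))
        delta = score(cand) - score(current)
        # Accept improvements always; accept worse candidates with Boltzmann probability.
        if delta >= 0 or random.random() < math.exp(delta / temp):
            current = cand
        if score(current) > score(best):
            best = current
        temp *= cooling
    return best, score(best)

print(simulated_annealing())
```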
108

[en] INTEGRATING METAHEURISTICS WITH MIP SOLVERS FOR THE CAPACITATED VEHICLE ROUTING PROBLEM / [pt] INTEGRANDO METAHEURÍSTICAS COM RESOLVEDORES MIP PARA O CAPACITATED VEHICLE ROUTING PROBLEM

PEDRO NUNO DE SOUZA MOURA 02 March 2012
[pt] Desde a sua origem, as abordagens a problemas de Otimização Combinatória polarizam-se entre métodos exatos e heurísticos. Recentemente, porém, estratégias que combinam ambos os métodos têm sido propostas para os mais variados problemas, apresentando resultados promissores. Nesse contexto, destacam-se os conceitos de vizinhanças de bola e elipsoidal, que realizam buscas em relação a uma ou mais soluções de referência. Este trabalho estuda a aplicação de tais vizinhanças ao Problema de Roteamento de Veículos com Restrição de Capacidade (CVRP), sobre o algoritmo de Branch-and-Cut-and-Price Robusto. Experimentos foram realizados e seus resultados analisados. / [en] Since its inception, approaches to Combinatorial Optimization have been polarized between exact and heuristic methods. Recently, however, strategies that combine both methods have been proposed for various problems, showing promising results. In this context, the concepts of ball and ellipsoidal neighborhoods stand out, which perform a search around one or more reference solutions. This work studies the application of such neighborhoods to the Capacitated Vehicle Routing Problem (CVRP), using the Robust Branch-and-Cut-and-Price algorithm. Experiments were carried out and their results were analyzed.
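For concreteness, a ball neighborhood of the kind referred to above can be written, for binary edge variables x and a reference solution x-bar, as the local-branching-style constraint below; the radius symbol Delta and the notation are assumptions, not taken from the thesis.

\[
\sum_{e \in E:\ \bar{x}_e = 1} (1 - x_e) \;+\; \sum_{e \in E:\ \bar{x}_e = 0} x_e \;\le\; \Delta
\]

This bounds the Hamming distance between x and the reference solution, so the MIP solver only explores solutions within radius Delta of it; an ellipsoidal neighborhood replaces the single reference with a weighted combination of distances to several reference solutions.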
109

Models and algorithms for the capacitated location-routing problem

Contardo, Claudio 07 1900
Le problème de localisation-routage avec capacités (PLRC) apparaît comme un problème clé dans la conception de réseaux de distribution de marchandises. Il généralise le problème de localisation avec capacités (PLC) ainsi que le problème de tournées de véhicules à multiples dépôts (PTVMD), le premier en ajoutant des décisions liées au routage et le deuxième en ajoutant des décisions liées à la localisation des dépôts. Dans cette thèse, on développe des outils pour résoudre le PLRC à l’aide de la programmation mathématique. Dans le chapitre 3, on introduit trois nouveaux modèles pour le PLRC basés sur des flots de véhicules et des flots de commodités, et on montre comment ceux-ci dominent, en termes de la qualité de la borne inférieure, la formulation originale à deux indices [19]. De nouvelles inégalités valides ont été développées et ajoutées aux modèles, de même que des inégalités connues. De nouveaux algorithmes de séparation ont aussi été développés, qui dans la plupart des cas généralisent ceux trouvés dans la littérature. Les résultats numériques montrent que ces modèles de flot sont en fait utiles pour résoudre des instances de petite à moyenne taille. Dans le chapitre 4, on présente une nouvelle méthode de génération de colonnes basée sur une formulation de partition d’ensemble. Le sous-problème consiste en un problème de plus court chemin avec capacités (PCCC). En particulier, on utilise une relaxation de ce problème dans laquelle il est possible de produire des routes avec des cycles de longueur trois ou plus. Ceci est complété par de nouvelles coupes qui permettent de réduire encore davantage le saut d’intégralité en même temps que de défavoriser l’apparition de cycles dans les routes. Ces résultats suggèrent que cette méthode fournit la meilleure méthode exacte pour le PLRC. Dans le chapitre 5, on introduit une nouvelle méthode heuristique pour le PLRC. Premièrement, on démarre une méthode randomisée de type GRASP pour trouver un premier ensemble de solutions de bonne qualité. Les solutions de cet ensemble sont alors combinées de façon à les améliorer. Finalement, on démarre une méthode de type détruire et réparer basée sur la résolution d’un nouveau modèle de localisation et réaffectation qui généralise le problème de réaffectation [48]. / The capacitated location-routing problem (CLRP) arises as a key problem in the design of distribution networks. It generalizes both the capacitated facility location problem (CFLP) and the multiple depot vehicle routing problem (MDVRP), the first by considering additional routing decisions and the second by adding the location decision variables. In this thesis we use different mathematical programming tools to develop and specialize new models and algorithms for solving the CLRP. In Chapter 3, three new models are presented for the CLRP based on vehicle-flow and commodity-flow formulations, all of which are shown to dominate, in terms of the linear relaxation lower bound, the original two-index vehicle-flow formulation [19]. Known valid inequalities are complemented with some new ones and included using separation algorithms that in many cases generalize existing ones found in the literature. Computational experiments suggest that flow models can be efficient for dealing with small or medium size instances of the CLRP (50 customers or less). In Chapter 4, a new branch-and-cut-and-price exact algorithm is introduced for the CLRP based on a set-partitioning formulation. The pricing problem is a shortest path problem with resource constraints (SPPRC). In particular, we consider a relaxation of this problem in which routes are allowed to contain cycles of length three or more. This is complemented with the development of new valid inequalities that are shown to be effective for closing the optimality gap as well as for restricting the appearance of cycles. Computational experience supports the fact that this method is now the best exact method for the CLRP. In Chapter 5, we introduce a new metaheuristic with the aim of finding good-quality solutions in short or moderate computing times. First, a bundle of good solutions is generated with the help of a greedy randomized adaptive search procedure (GRASP). Following this, a blending procedure is applied with the aim of producing a better upper bound as a combination of all the others in the bundle. An iterative destroy-and-repair method is then applied, using a location-reallocation model that generalizes the reallocation model due to de Franceschi et al. [48].
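To illustrate the GRASP phase mentioned in the Chapter 5 summary, the sketch below builds solutions for a toy depot-selection problem using a restricted candidate list and improves each with a swap local search; the cost data, the RCL parameter alpha and the local move are assumptions and do not reproduce the thesis algorithm.

```python
import random

OPEN_COST = [10, 12, 8, 15, 9]           # assumed depot opening costs
SERVE = [[3, 5, 4, 7, 6],                # assumed cost of serving customer j from depot i
         [6, 2, 5, 4, 7],
         [4, 6, 2, 5, 3],
         [7, 4, 6, 2, 5],
         [5, 7, 3, 6, 2]]
N_DEPOTS, N_CUSTOMERS, P = 5, 5, 2       # open exactly P depots (assumed)

def cost(depots):
    """Opening costs plus cheapest assignment of every customer to an open depot."""
    return (sum(OPEN_COST[d] for d in depots)
            + sum(min(SERVE[d][j] for d in depots) for j in range(N_CUSTOMERS)))

def greedy_randomized(alpha=0.3):
    """Build a solution by repeatedly picking from a restricted candidate list (RCL)."""
    chosen = []
    while len(chosen) < P:
        cand = [(cost(chosen + [d]), d) for d in range(N_DEPOTS) if d not in chosen]
        best_c, worst_c = min(c for c, _ in cand), max(c for c, _ in cand)
        rcl = [d for c, d in cand if c <= best_c + alpha * (worst_c - best_c)]
        chosen.append(random.choice(rcl))
    return sorted(chosen)

def local_search(depots):
    """Swap an open depot with a closed one as long as the swap improves the cost."""
    improved = True
    while improved:
        improved = False
        for d_out in list(depots):
            for d_in in range(N_DEPOTS):
                if d_in in depots:
                    continue
                cand = sorted((set(depots) - {d_out}) | {d_in})
                if cost(cand) < cost(depots):
                    depots, improved = cand, True
                    break
            if improved:
                break
    return depots

def grasp(iterations=20, seed=0):
    random.seed(seed)
    bundle = [local_search(greedy_randomized()) for _ in range(iterations)]
    return min(bundle, key=cost)   # the bundle itself would feed the blending phase

best = grasp()
print(best, cost(best))
```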
110

以區域最佳解為基礎求解流程式排程問題的新啟發式方法 / A new heuristic based on the local best solution for Permutation Flow Shop Scheduling

曾宇瑞, Tzeng, Yeu Ruey Unknown Date
本研究開發一個以區域最佳解為基礎的群體式 (population-based) 啟發式演算法(簡稱HLBS),來求解流程式排程(flow shop)之最大流程時間的最小化問題。其中,HLBS會先建置一個跟隨模型來導引搜尋機制,然後,運用過濾策略來預防重複搜尋相同解空間而陷入區域最佳解的困境;但搜尋仍有可能會陷入區域最佳解,這時,HLBS則會啟動跳脫策略來協助跳出區域最佳解,以進入新的區域之搜尋;為驗證HLBS演算法的績效,本研究利用著名的Taillard 測試題庫來進行評估,除證明跟隨模型、過濾策略和跳脫策略的效用外,也提出實驗結果證明HLBS較其他知名群體式啟發式演算法(如基因演算法、蟻群演算法以及粒子群最佳化演算法)之效能為優。 / This research proposes a population-based metaheuristic based on the local best solution (HLBS) for the permutation flow shop scheduling problem with the makespan criterion (PFSP-makespan). The proposed metaheuristic operates through three mechanisms: (i) it introduces a new method to produce a trace-model for guiding the search; (ii) it applies a new filter strategy to filter out solution regions that have already been reviewed and to guide the search to new solution regions, in order to keep the search from being trapped in local optima; and (iii) it initiates a new jump strategy to help the search escape if it does become trapped at a local optimum. Computational experiments on the well-known Taillard benchmark data sets were performed to evaluate the effects of the trace-model generating rule, the filter strategy, and the jump strategy on the performance of HLBS, and to compare the performance of HLBS with promising population-based metaheuristics based on Genetic Algorithms (GA), Ant Colony Optimization (ACO) and Particle Swarm Optimization (PSO).
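To make the objective concrete, the sketch below computes the makespan of a permutation flow shop schedule from a processing-time matrix using the standard completion-time recursion; the data and the permutations are assumed for illustration (this is the objective HLBS minimizes, not the HLBS search itself).

```python
# Processing time of job j on machine m (assumed data): p[m][j].
p = [[5, 3, 6, 4],
     [4, 6, 2, 5],
     [3, 5, 4, 6]]

def makespan(permutation, p):
    """Completion time of the last job on the last machine.

    C[m][k] = max(C[m-1][k], C[m][k-1]) + p[m][job_k]
    """
    n_machines = len(p)
    completion = [[0.0] * len(permutation) for _ in range(n_machines)]
    for k, job in enumerate(permutation):
        for m in range(n_machines):
            ready_machine = completion[m][k - 1] if k > 0 else 0.0
            ready_job = completion[m - 1][k] if m > 0 else 0.0
            completion[m][k] = max(ready_machine, ready_job) + p[m][job]
    return completion[-1][-1]

# Example: makespan of the identity permutation versus a reordered one.
print(makespan([0, 1, 2, 3], p))
print(makespan([2, 0, 3, 1], p))
```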
