1

A Hyper-Heuristic Clustering Algorithm

Song, Huei-jyun 07 September 2012 (has links)
Heuristics have been widely used for solving combinatorial optimization problems because they provide a simple but effective way to find an approximate solution. These techniques are particularly useful for users who do not need an exact solution but care greatly about response time. Because every heuristic algorithm has its pros and cons, this paper presents a hyper-heuristic clustering algorithm that uses diversity-detection and improvement-detection operators to determine when to switch from one heuristic algorithm to another, thereby improving the clustering result. Several well-known datasets are employed to evaluate the performance of the proposed algorithm. Simulation results show that it provides better clustering results than the state-of-the-art heuristic algorithms compared in this paper, namely k-means, simulated annealing, tabu search, and the genetic k-means algorithm.
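To make the switching idea concrete, below is a minimal Python sketch of a hyper-heuristic control loop of the kind described in this abstract; it is not the author's implementation. The low-level heuristic callables, the fitness function, and the stall and diversity thresholds are illustrative assumptions: the loop keeps applying the active heuristic and switches to another one when no improvement is detected for several iterations or when recent solution quality shows too little diversity.

```python
import random

def hyper_heuristic_clustering(data, heuristics, fitness, iterations=100,
                               stall_limit=5, diversity_limit=1e-3):
    """Sketch of a hyper-heuristic loop that switches low-level heuristics.

    `heuristics` is a list of callables, each taking (data, solution) and
    returning a new candidate solution (solution may be None on the first
    call); `fitness` scores a solution (lower is better). All names and
    thresholds are illustrative placeholders, not the author's method.
    """
    current = random.choice(heuristics)          # start with a random heuristic
    solution = None
    best, best_fit = None, float("inf")
    stalls = 0                                   # improvement-detection counter
    recent = []                                  # recent fitness values for the diversity check

    for _ in range(iterations):
        solution = current(data, solution)       # apply the active low-level heuristic
        fit = fitness(data, solution)
        recent.append(fit)

        if fit < best_fit:                       # improvement detected: keep going
            best, best_fit, stalls = solution, fit, 0
        else:
            stalls += 1

        # diversity detection: recent solutions are nearly identical in quality
        low_diversity = (len(recent) >= stall_limit and
                         max(recent[-stall_limit:]) - min(recent[-stall_limit:]) < diversity_limit)

        if stalls >= stall_limit or low_diversity:
            # switch to a different low-level heuristic and reset the detectors
            others = [h for h in heuristics if h is not current] or heuristics
            current = random.choice(others)
            stalls, recent = 0, []

    return best, best_fit
```

In such a sketch, a k-means step, a simulated-annealing move, or a tabu-search move would each be wrapped as one of the `heuristics` callables.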
2

Combining mathematical programming and enhanced GRASP metaheuristics: an application to semiconductor manufacturing

Deng, Yumin 07 August 2012 (has links)
Planning and scheduling in semiconductor manufacturing is a difficult problem due to long cycle times, a large number of operational steps, diversified product types, and low-volume, high-mix customer demand. This research addresses several problems that arise in the semiconductor industry related to front-end wafer fabrication operations and back-end assembly and test operations. The mathematical models built for these problems turn out to be large-scale mixed-integer programs that are hard to solve with exact methods. The major contribution of this research is to combine mathematical programming with metaheuristics to find high-quality solutions within the time limits imposed by the industrial engineers who oversee the fabrication and test facilities.

In order to reduce the size of problems that arise in practice, it is common to cluster similar product types into groups that reflect their underlying technology. The first part of the research develops a greedy randomized adaptive search procedure (GRASP) coupled with path relinking (PR) to solve the capacitated clustering problem. The model is generic and can be applied in many different situations. The objective is to maximize a similarity measure within each cluster such that the sum of the weights associated with the product types does not exceed the cluster capacity in each case. In phase I, both a heaviest weight edge (HWE) algorithm and a constrained minimum cut (CMC) algorithm are used to select seeds for initializing the clusters. Feasible solutions are obtained with the help of a self-adjusting restricted candidate list. In phase II, three neighborhoods are defined and explored using the following strategies: cyclic neighborhood search, variable neighborhood descent, and randomized variable neighborhood descent (RVND). The best solutions found are stored in an elite pool. In a post-processing step, PR coupled with local search is applied to the pool members to cyclically generate paths between each pair. The elite pool is updated after each iteration, and the procedure ends when no further improvement is possible.

After grouping the product types into technologies, a new model is presented for production planning in a high-volume fab that uses quarterly commitments to define daily target outputs. Rather than relying on due dates and priority rules to schedule lot starts and move work in process through the shop, the objective is to minimize the sum of the deviations between the target outputs and finished goods inventory. The model takes the form of a large-scale linear program that is intractable for planning horizons beyond a few days. Both Lagrangian relaxation and Benders decomposition were investigated, but each proved ineffective. As a consequence, a methodology was developed that is more tailored to the problem's structure. This involved creating weekly subproblems that were myopic but could be solved to optimality within a few minutes, and then post-processing the results with a decomposition algorithm to fully utilize the excess machine time. The heart of the post-processor consists of a rescheduling algorithm and a dispatching heuristic.

The third part of the research focuses on the combinatorial problem of machine-tooling setup and lot assignment for performing back-end operations. A new model and solution methodology are presented aimed at maximizing the weighted throughput of lots undergoing assembly and test, while ensuring that critical lots are given priority. The problem is formulated as a mixed-integer program and again solved with a GRASP that makes use of linear programming. In phase I of the GRASP, machine-tooling combinations are tentatively fixed and lot assignments are made iteratively to arrive at a feasible solution; this process is repeated many times. In phase II, a novel neighborhood search is performed on a subset of good solutions found in phase I. Using an algorithm based on linear programming and Monte Carlo simulation, new machine-tooling combinations are identified within the neighborhood of the solutions carried over, and improvements are sought by optimizing the corresponding lot assignments.
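As a hedged illustration of the GRASP construction step described for the capacitated clustering problem (a sketch only, not the dissertation's code), the following Python fragment assigns items to pre-selected seed clusters using a restricted candidate list. The `similarity` function, `weights`, `capacity`, and the fixed `alpha` cutoff (used here in place of the self-adjusting list mentioned above) are illustrative assumptions.

```python
import random

def grasp_construct(items, weights, seeds, capacity, similarity, alpha=0.3):
    """Sketch of a GRASP phase-I construction for capacitated clustering.

    `seeds` are pre-selected cluster seeds (e.g. from a heaviest-weight-edge
    rule); `weights` maps each item to its weight, `capacity` maps each seed
    to its cluster capacity, and `similarity(i, cluster)` scores how well
    item i fits a cluster. All names and the alpha cutoff are assumptions.
    """
    clusters = {s: [s] for s in seeds}
    load = {s: weights[s] for s in seeds}
    unassigned = [i for i in items if i not in seeds]

    while unassigned:
        # score every feasible (item, cluster) assignment
        candidates = [(similarity(i, clusters[s]), i, s)
                      for i in unassigned for s in seeds
                      if load[s] + weights[i] <= capacity[s]]
        if not candidates:
            break                                   # no feasible assignment remains
        best = max(c[0] for c in candidates)
        worst = min(c[0] for c in candidates)
        threshold = best - alpha * (best - worst)   # restricted candidate list cutoff
        rcl = [c for c in candidates if c[0] >= threshold]
        _, item, seed = random.choice(rcl)          # randomized greedy choice
        clusters[seed].append(item)
        load[seed] += weights[item]
        unassigned.remove(item)

    return clusters
```

A phase-II neighborhood search and path relinking over an elite pool would then operate on solutions produced by repeated calls to a construction routine of this kind.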
3

Heuristics for the coverage problem in hierarchical wireless sensor networks with a mobile sink (Heurísticas para o problema de cobertura em redes de sensores sem fio hierárquicas com sorvedouro móvel)

Araújo, André Ricardo Melo 01 March 2013 (has links)
FAPEAM - Fundação de Amparo à Pesquisa do Estado do Amazonas / A Wireless Sensor Network (WSN) is a special kind of ad hoc network composed of devices, called sensor nodes, capable of processing, storing, sensing the environment, and transmitting data over a wireless communication interface. Sensor nodes have several limitations, among them a limited energy supply due to their reduced size. For this reason, much research has been devoted to reducing the energy consumption of sensor nodes. This work addresses the Problem of Coverage, Clustering and Routing with a Mobile Sink (PCAR-SM, from the Portuguese Problema de Cobertura, Agrupamento e Roteamento com Sorvedouro Móvel) in WSNs with a mobile sink, which consists of the following: given a set of sensor nodes and a monitoring area, develop algorithms that find the best subset of sensor nodes to cover the monitoring area, group them into the smallest possible number of clusters, and find the shortest route for the mobile sink to traverse. PCAR-SM is a strategy for reducing the energy consumption of sensor nodes, data collisions, interference, and redundant data in networks with a high concentration of sensor nodes per area. The purpose of this work is to solve each problem both separately and jointly, in order to evaluate the impact of each problem on the others. The Coverage Problem was solved with two metaheuristics: a Genetic Algorithm (GA) and a Greedy Randomized Adaptive Search Procedure (GRASP). For the latter, two solution representations were used: (a) representation by sensor, where each element of the solution vector represents a sensor node that is switched on or off; and (b) representation by demand, where each element of the solution vector represents a demand point and indicates which sensor node covers it. The GA uses only the representation by demand. The computational results for the Coverage Problem used benchmarks from Beasley's OR-Library and show that GRASP with the representation by demand achieved better results than the GA and GRASP with the representation by sensor when the optimization criterion is to minimize the total cost of the sensor nodes used in the solution. For the Clustering Problem, a virtual-grid approach was created: the area is divided into grid cells, and clusters are formed by sets of adjacent cells (at most 5 cells per group) arranged in a cross shape. The aim of this problem is to minimize the number of clusters in the area. With this approach, the Clustering Problem can be modeled as a Set Cover Problem (SCP) without overlap (an element does not belong to more than one set), which was treated by a greedy heuristic called the Greedy Clustering Algorithm (GCA). The virtual grid proved to be a good solution because it is simple for a node to identify which cell it belongs to; its simplicity also makes it an appropriate method for a distributed version. The Routing Problem for the sink was modeled as the Travelling Salesman Problem (TSP), where the mobile sink starts from a corner of the monitoring area, traverses the area visiting all clusters, and returns to the starting point. For this, two greedy approaches based on the nearest-neighbor rule are proposed: the Routing Greedy Algorithm - Center (RGA-C) and the Routing Greedy Algorithm - Border (RGA-B).
The sink route was also solved by a heuristic based on the Centralized Spatial Partitioning (CSP) algorithm, in which the route is fixed and resembles the movement of a snake. The results show that the fixed route produces a shorter path than the greedy TSP heuristics. We also analyzed the full PCAR-SM, creating heuristic strategies for it. Solving the Clustering and Routing Problems together proved beneficial with respect to the length of the sink's route, whereas solving the Coverage and Clustering Problems together only proved beneficial when the communication radius was about 3.9 times greater than the sensing radius. Our results show that solving the problems jointly allows changes in the algorithms that lead to better results.
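To illustrate how the clustering step can be treated as a set-cover problem without overlap, here is a small Python sketch in the spirit of a greedy set-cover heuristic; it is not the thesis's GCA implementation, and the grid cells and candidate cross-shaped groups are assumed to be given as inputs.

```python
def greedy_clustering(cells, candidate_groups):
    """Sketch of a greedy set-cover heuristic without overlap (GCA-style).

    `cells` is the set of grid cells to cover; `candidate_groups` is a list of
    sets of adjacent cells (e.g. cross-shaped groups of up to 5 cells).
    Illustrative only: inputs and tie-breaking are assumptions.
    """
    uncovered = set(cells)
    chosen = []
    while uncovered:
        # pick the candidate group that covers the most still-uncovered cells
        group = max(candidate_groups, key=lambda g: len(g & uncovered))
        gain = group & uncovered
        if not gain:
            break                      # no candidate covers anything new
        # keep only the uncovered part so that clusters do not overlap
        chosen.append(gain)
        uncovered -= gain
    return chosen
```

Keeping only each group's still-uncovered cells is what enforces the no-overlap condition described above.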
