1 |
Modified Network Simplex Method to Solve a Sheltering Network Planning and Management Problem. Li, Lingfeng, 09 December 2011.
This dissertation considers sheltering network planning and operations for natural disaster preparedness and response with a two-stage stochastic program. The first stage of the network design decides the locations, capacities, and held resources of new permanent shelters. Both the fixed costs of building a new permanent shelter and the variable costs based on capacity are considered. Under each disaster scenario, characterized by evacuee demand and transportation network conditions, the flows of evacuees and resources to shelters, both permanent and temporary, are determined in the second stage to minimize the transportation and shortage/surplus costs. Typically, a large number of scenarios are involved in the problem, causing a heavy computational burden. The L-shaped algorithm is applied to decompose the problem at the scenario level, with each sub-problem being a linear program. The Sheltering Network Planning and Operation Problem considered in this dissertation also has a special structure in the second-stage sub-problem, which is a minimum-cost network flow problem with equal-flow side constraints. The dissertation therefore takes advantage of the network simplex method to solve the response part of the problem more efficiently, and it also investigates extended applications of the special minimum-cost equal-flow problem. A case study on preparedness and response to hurricanes in the Gulf Coast region of the United States demonstrates the use of the model, including how to define scenarios and cost structures. The numerical experiments also verify the fast convergence of the L-shaped algorithm for the model.
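For reference, the model just described has the generic two-stage form below (standard notation, not the dissertation's own); x collects the first-stage shelter location, capacity and held-resource decisions, and the scenario sub-problem defining Q is the minimum-cost network flow problem with equal-flow side constraints that the modified network simplex method exploits:

\[
\min_{x}\; c^{\top}x + \sum_{s \in S} p_s\, Q(x,\xi_s)
\quad\text{s.t.}\quad Ax \le b,\;\; x \in \{0,1\}^{n_1} \times \mathbb{R}^{n_2}_{+},
\]
\[
Q(x,\xi_s) \;=\; \min_{y \ge 0}\; q_s^{\top} y
\quad\text{s.t.}\quad W_s\, y = h_s - T_s x .
\]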
|
2 |
Stochastic Scheduling for a Network of MEMS Job Shops. Varadarajan, Amrusha, 31 January 2007.
This work is motivated by the pressing need for operational control in the fabrication of Microelectromechanical systems, or MEMS. MEMS are miniature three-dimensional integrated electromechanical systems with the ability to absorb information from the environment, process this information and suitably react to it. These devices offer tremendous advantages owing to their small size, low power consumption, low mass and high functionality, which make them very attractive in applications with stringent demands on weight, functionality and cost. While the system's "brain" (device electronics) is fabricated using traditional IC technology, the micromechanical components necessitate very intricate and sophisticated processing of silicon or other suitable substrates. A dearth of fabrication facilities with micromachining capabilities and a lengthy gestation period from design to mass fabrication and commercial acceptance of the product in the market are the factors most often implicated in hampering the growth of MEMS. These devices are highly application specific with low production volumes, and the few fabs that do possess micromachining capabilities are unable to offer a complete array of fabrication processes to cater to the needs of the MEMS R&D community. A distributed fabrication network has, therefore, emerged to serve the evolving needs of this high-investment, low-volume MEMS industry. Under this environment, a central facility coordinates a network of fabrication centers (Network of MEMS Job Shops, or NMJS) with micromachining capabilities. These fabrication centers include commercial, academic and government fabs, which make their services available to the ordinary customer. Wafers are shipped from one facility to another until all processing requirements are met. The lengthy and intricate process sequences that need to be performed over a network of capital-intensive facilities are complicated by dynamic job arrivals, stochastic processing times, sequence-dependent setups and travel between fabs. Unless the production of these novel devices is carefully optimized, the benefits of distributed fabrication could be completely overshadowed by lengthy lead times, chaotic routings and costly processing. Our goal, therefore, is to develop and validate an approach for the optimal routing (assignment) and sequencing of MEMS devices in a network of stochastic job shops, with the objective of minimizing the sum of completion times and the cost incurred, given a set of fabs, machines and an expected product mix.
In view of our goal, we begin by modeling the stochastic NMJS problem as a two-stage stochastic program with recourse where the first-stage variables are binary and the second-stage variables are continuous. The key decision variables are binary and pertain to the assignment of jobs to machines and their sequencing for processing on the machines. The assignment variables essentially fix the route of a job as it travels through the network, because these variables specify the machine on which each job-operation must be performed out of several candidate machines. Once the assignment is decided upon, sequencing of job-operations on each machine follows. The assignment and sequencing must be such that they offer the best solution (in terms of the objective) possible in light of all the processing time scenarios that can be realized. We present two approaches for solving the stochastic NMJS problem. The first approach is based on the L-shaped method (credited to Van Slyke and Wets, 1969). Since the NMJS problem lacks relatively complete recourse, the first-stage solution can be infeasible to the second-stage problem in that it may either violate the reentrant flow conditions or create a deadlock. In order to alleviate these infeasibilities, we develop feasibility cuts which, when appended to the master problem, eliminate the infeasible solution. Alternatively, we also develop constraints that explicitly address these infeasibilities directly within the master problem. We show how a deadlock involving 2 or 3 machines arises if and only if a certain relationship between operations and a certain sequence amongst them exists. We generalize this argument to the case of m machines, which forms the basis for our deadlock prevention constraints. Computational results at the end of Chapter 3 compare the relative merits of a model that relies solely on feasibility cuts with models that incorporate reentrant flow and deadlock prevention constraints within the master problem. Experimental evidence reveals that the latter offer appreciable time savings over the former. Moreover, in a majority of instances we see that models that carry deadlock prevention constraints in addition to the reentrant flow constraints perform at par with or better than those that carry only reentrant flow constraints.
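For context, the generic L-shaped feasibility cut has the following standard form (this is the textbook version, not the NMJS-specific cuts developed in the dissertation): if the scenario-k sub-problem is infeasible at the current first-stage solution \(x^{\nu}\), an extreme dual ray \(\sigma\) of the feasibility sub-problem satisfies \(\sigma^{\top}(h_k - T_k x^{\nu}) > 0\), and the cut

\[
\sigma^{\top} T_k\, x \;\ge\; \sigma^{\top} h_k
\]

removes \(x^{\nu}\) from the master problem. The reentrant-flow and deadlock-prevention constraints described above play the same role, but are imposed up front rather than generated on the fly.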
We next develop an optimality cut which, when appended to the master problem, helps eliminate the suboptimal master solution. We also present alternative optimality and feasibility cuts, obtained by modifying the disjunctive constraints in the subproblem so as to eliminate its big H terms. Although any large positive number can be used as the value of H, a conservative estimate may improve computational performance. In light of this, we develop a conservative upper bound on operation completion times and use it as the value of H. Test instances have been generated using a problem generator written in Java. We present computational results to evaluate the impact of a conservative estimate for big H on run time, analyze the effect of the different optimality cuts, and demonstrate the performance of the multicut method (Wets, 1981), which differs from the L-shaped method in that the number of optimality cuts it appends in each iteration equals the number of scenarios. Experimentation indicates that Model 2, which uses the standard optimality cut in conjunction with the conservative estimate for big H, almost always outperforms Model 1, which also uses the standard optimality cut but fixes big H at 1000. Model 3, which employs the alternative optimality cut with the conservative estimate for big H, requires the fewest iterations to converge to the optimum but also incurs the largest premium in computational time, because the alternative optimality cut adds to the complexity of the problem by appending additional variables and constraints to the master as well as the subproblems. In the case of Model 4 (the multicut method), the segregated optimality cuts accurately reflect the shape of the recourse function, resulting in fewer overall iterations, but the large number of cuts that accumulates over the iterations makes the master problem sluggish, so this model exhibits variable performance across the datasets. These experiments reveal that a compact master problem and a conservative estimate for big H positively impact the run-time performance of a model. Finally, we develop a framework for a branch-and-bound scheme within which the L-shaped method, as applied to the NMJS problem, can be incorporated so as to further enhance its performance.
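As an illustration of the cut-generation loop discussed above, the sketch below runs a single-cut L-shaped method on a deliberately tiny two-stage problem (a capacity-versus-recourse toy, not the NMJS model); the data, the bound on theta and the tolerance are invented for the example, and SciPy >= 1.7 is assumed for the HiGHS dual marginals.

# Minimal single-cut L-shaped sketch on a toy two-stage problem: choose a
# capacity x at unit cost c1; in each scenario s, unmet demand d[s] - x is
# covered by recourse at unit cost q > c1.
import numpy as np
from scipy.optimize import linprog

c1, q, xmax = 1.0, 3.0, 20.0
d = np.array([5.0, 10.0, 15.0])          # scenario demands
p = np.array([0.3, 0.4, 0.3])            # scenario probabilities

def recourse(x, ds):
    # Recourse LP: min q*y  s.t.  y >= ds - x,  y >= 0, written for linprog
    # as -y <= -(ds - x); pi is the dual of the coupling constraint.
    res = linprog(c=[q], A_ub=[[-1.0]], b_ub=[-(ds - x)],
                  bounds=[(0, None)], method="highs")
    pi = -res.ineqlin.marginals[0]       # sign convention of b_ub marginals
    return res.fun, pi

cuts, lb, ub, x = [], -np.inf, np.inf, 0.0
for it in range(20):
    vals, pis = zip(*(recourse(x, ds) for ds in d))
    ub = min(ub, c1 * x + float(np.dot(p, vals)))
    # Aggregated optimality cut: theta >= e - E*x.
    e = float(np.dot(p, [pi * ds for pi, ds in zip(pis, d)]))
    E = float(np.dot(p, pis))
    cuts.append((e, E))
    # Master: min c1*x + theta  s.t.  theta + E*x >= e for every cut so far.
    A_ub = [[-Ei, -1.0] for (_, Ei) in cuts]
    b_ub = [-ei for (ei, _) in cuts]
    m = linprog(c=[c1, 1.0], A_ub=A_ub, b_ub=b_ub,
                bounds=[(0, xmax), (-1e6, None)], method="highs")
    x, lb = float(m.x[0]), m.fun
    if ub - lb < 1e-6:
        break
print(f"x* = {x:.2f}, expected cost = {ub:.3f}, iterations = {it + 1}")

The multicut variant referred to above would append one cut per scenario per iteration instead of the single aggregated cut, at the price of a larger master problem.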
Our second approach for solving the stochastic NMJS problem relies on the tight LP relaxation observed for the deterministic equivalent of the model. We first solve the LP relaxation of the deterministic equivalent problem and then fix certain binary assignment variables that take on a value of either 0 or 1 in the relaxation. Based on this fixing of certain assignment variables, additional logical constraints have been developed that lead to the fixing of some of the sequencing variables as well. Experimental results comparing the performance of this LP heuristic procedure with CPLEX over the generated test instances illustrate the effectiveness of the heuristic. For the largest problems (5 jobs, 10 operations/job, 12 machines, 7 workcenters, 7 scenarios) solved in this experiment, average savings of as much as 4154 seconds and 1188 seconds were recorded in comparison with Models 1 and 2, respectively. Both of these models solve the deterministic equivalent of the stochastic NMJS problem but differ in that Model 1 uses a big H value of 1000 whereas Model 2 uses the conservative upper bound for big H developed in this work. The maximum optimality gap observed for the LP heuristic over all the data instances solved was 1.35%. The LP heuristic therefore offers a powerful alternative for solving these problems to near-optimality with a very low computational burden. We also present results pertaining to the value of the stochastic solution for various data instances. The observed savings of up to 8.8% over the mean-value approach underscore the importance of using a solution that is robust over all scenarios versus a solution that approximates the randomness through expected values.
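A sketch of the relax-and-fix idea behind this LP heuristic is shown below on a toy assignment problem (not the NMJS deterministic equivalent); the dissertation's logical constraints that additionally fix sequencing variables are not reproduced, and SciPy >= 1.9 is assumed for milp.

# Relax-and-fix sketch: solve the LP relaxation, freeze the binaries that
# come out integral, then solve the reduced MIP over the remaining variables.
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

rng = np.random.default_rng(0)
J, M = 6, 3                                # jobs, machines
cost = rng.uniform(1, 10, size=(J, M))     # cost of job j on machine m
c = cost.ravel()                           # x[j, m] flattened row-major
n = J * M

A_assign = np.zeros((J, n))                # each job goes to exactly one machine
for j in range(J):
    A_assign[j, j * M:(j + 1) * M] = 1.0
A_cap = np.zeros((M, n))                   # each machine takes at most 3 jobs
for m in range(M):
    A_cap[m, m::M] = 1.0
cons = [LinearConstraint(A_assign, 1, 1), LinearConstraint(A_cap, 0, 3)]

# Step 1: LP relaxation (integrality = 0 relaxes every binary).
relax = milp(c, constraints=cons, integrality=np.zeros(n),
             bounds=Bounds(np.zeros(n), np.ones(n)))

# Step 2: fix the variables that are (near-)integral in the relaxation.
lo, hi = np.zeros(n), np.ones(n)
lo[relax.x > 1 - 1e-6] = 1.0
hi[relax.x < 1e-6] = 0.0

# Step 3: solve the reduced MIP with only the fractional binaries left free.
fixed = milp(c, constraints=cons, integrality=np.ones(n), bounds=Bounds(lo, hi))
print("LP bound:", round(relax.fun, 3), " heuristic value:", round(fixed.fun, 3))

On an instance such as this toy, where the relaxation is already integral, the heuristic simply recovers the LP optimum; on harder instances the final MIP only has to resolve the fractional variables.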
We next present a dynamic stochastic scheduling approach (DSSP) for the NMJS problem. The premise behind this undertaking is that, in a real-life implementation faithful to the two-stage procedure, assignment (routing) and sequencing decisions will be made for all the operations of all the jobs at the outset and will be followed through regardless of the actual processing times realized for individual operations. However, it may be possible to refine this procedure if information on the actual processing times of completed operations could be utilized, so that assignment and sequencing decisions for impending operations are adjusted based on the evolving scenario (which may be very different from the scenarios modeled) while still hedging against future uncertainty. In the DSSP approach, the stochastic programming model for the NMJS problem is solved at each decision point using the LP heuristic in a rolling-horizon fashion, while incorporating constraints that model the existing conditions on the shop floor and the actual processing times realized for the operations that have been completed.
The implementation of the DSSP algorithm is illustrated through an example problem. The results of the DSSP approach as applied to two large problem instances are presented. The performance of the DSSP approach is evaluated on three fronts: first, by using the LP heuristic at each decision point; second, by using an optimal algorithm at each decision point; and third, against the two-stage stochastic programming approach. Results from the experimentation indicate that the DSSP approach using the LP heuristic at each decision point generates better assignment and sequencing decisions than the two-stage stochastic programming approach and provides solutions that are near-optimal with a very low computational burden. For the first instance, involving 40 operations, 12 machines and 3 processing time scenarios, the DSSP approach using the LP heuristic yields the same solution as the optimal algorithm with a total time savings of 71.4% and also improves upon the two-stage stochastic programming solution by 1.7%. In the second instance, the DSSP approach using the LP heuristic yields a solution with an optimality gap of 1.77% and a total time savings of 98% over the optimal algorithm. In this case, the DSSP approach with the LP heuristic improves upon the two-stage stochastic programming solution by 6.38%. We conclude by presenting a framework for the DSSP approach that extends the basic DSSP algorithm to accommodate jobs whose arrival times may not be known in advance. / Ph. D.
|
3 |
Stochastická optimalizace v programu AIMMS / Stochastic optimization in AIMMS. Kůdela, Jakub, January 2014.
This master's thesis presents the fundamentals of mathematical and, above all, stochastic programming. In addition, it deals with the use of the AIMMS software for building and solving optimization problems. Our main goal is to implement several methods for solving stochastic programming problems in AIMMS and to demonstrate their use and usefulness on selected problems. One of the problems we have chosen is a waste incineration plant model. All the AIMMS programs used and described in the text, together with their source code, are included in the appendices.
|
4 |
Algoritmy pro řešení stochastických dvoustupňových úloh / Algorithms for solving two-stage stochastic programs. Vlčková, Ivona, January 2017.
The thesis deals with algorithms for two-stage stochastic programs. The first chapter covers the basic properties and theory. Specifically, we introduce the properties of the feasible region and the objective function. Further, optimality conditions are discussed. In the second chapter we present algorithms which can be used to solve two-stage linear programs with fixed recourse. In the first section the basic L-shaped method is described in detail. The second section provides an explanation of the Stochastic Decomposition algorithm with the inclusion of a regularization term. The last chapter presents computational results. Three practical examples are provided, each with a brief description of the problem and its solution by the studied algorithms.
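For context, the regularization term mentioned above is typically a proximal (quadratic) penalty added to the master problem around the current incumbent \(\bar{x}\); in generic form (which may differ in detail from the variant implemented in the thesis),

\[
\min_{x,\theta}\;\; c^{\top}x \;+\; \theta \;+\; \frac{\rho}{2}\,\lVert x - \bar{x} \rVert^{2}
\quad\text{s.t.}\quad \theta \ge e_{\ell} - E_{\ell}\, x,\;\; \ell = 1,\dots,L,\qquad Ax \le b,
\]

which keeps successive iterates close to the incumbent and stabilizes the cutting-plane approximation built from the cuts \(\theta \ge e_{\ell} - E_{\ell} x\).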
|
5 |
Best Longitudinal Adjustment of Satellite Trajectories for the Observation of Forest Fires (Blastoff): A Stochastic Programming Approach to Satellite System Design. Hoskins, Aaron Bradley, 06 May 2017.
Forest fires cause a significant amount of damage and destruction each year. Optimally dispatching resources reduces the amount of damage a forest fire can cause. Models predict the fire spread to provide the data required to optimally dispatch resources. However, the models are only as accurate as the data used to build them. Satellites are one valuable tool in the collection of data for forest fire models. Satellites provide data on the types of vegetation, the wind speed and direction, the soil moisture content, etc. The current operating paradigm is to passively collect data when possible. However, images from directly overhead provide better resolution and are easier to process. Maneuvering a constellation of satellites to fly directly over the forest fire therefore provides higher-quality data than is achieved with the current operating paradigm. Before launch, the location of the forest fire is unknown, so it is impossible to optimize the initial orbits for the satellites. Instead, the expected cost of maneuvering to observe the forest fire determines the optimal initial orbits. A two-stage stochastic programming approach is well suited for this class of problem, where initial decisions are made under an uncertain future and subsequent decisions are made once a scenario is realized. A repeat ground track orbit provides a natural, non-maneuvering solution with a daily flyover of the forest fire. Additional maneuvers, however, provide a second daily flyover of the forest fire. The additional maneuvering comes at a significant cost in terms of additional fuel, but provides more data collection opportunities. After data are collected, ground stations receive the data for processing. Optimally selecting the ground station locations reduces the number of ground stations that must be built and reduces data fusion issues. However, the location of the forest fire alters the optimal ground station sites. A two-stage stochastic programming approach optimizes the selection of ground stations to maximize the expected amount of data downloaded from a satellite. The approaches for selecting initial orbits and ground station locations under uncertainty provide a robust system for reducing the amount of damage caused by forest fires.
|
6 |
Problèmes de tournées de véhicules avec contraintes de chargement (Vehicle routing problems with loading constraints). Côté, Jean-François, 02 1900.
In this thesis, we study mixed vehicle routing and loading problems where a constraint is imposed on delivery sequences. More precisely, the items in the loading area of a vehicle must be directly accessible, without moving any other item, at delivery time. These problems are often found in the transportation of large objects (furniture, appliances).
The first paper proposes a branch-and-cut algorithm for a variant of the single vehicle pickup and delivery problem, where the loading area of the vehicle is divided into several stacks. When an item is picked up, it must be placed on the top of one of these stacks. Conversely, an item must be on the top of one of these stacks to be delivered. This requirement is called “Last In First Out” or LIFO constraint.
The second paper presents another branch-and-cut algorithm for a vehicle routing and loading problem with two-dimensional rectangular items. The loading area of the vehicles is also a rectangular area where the items are taken out from one side. A constraint states that the items of a given customer must be directly accessible at delivery time.
The last paper considers a stochastic vehicle routing and loading problem with two-dimensional rectangular items where the dimensions of some items are unknown when the routes are planned. However, it is possible to associate a discrete probability distribution with the dimensions of these items. The problem is solved with the Integer L-Shaped method.
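For reference, the Integer L-Shaped method mentioned above embeds the two-stage model in a branch-and-cut scheme and cuts off integer first-stage solutions \(x^{\nu}\) with optimality cuts of the standard form below (L is a valid lower bound on the recourse cost and \(S^{\nu} = \{i : x_i^{\nu} = 1\}\)); this is the generic cut, not the paper's specialized lower-bounding functionals:

\[
\theta \;\ge\; \bigl(Q(x^{\nu}) - L\bigr)\Bigl(\sum_{i \in S^{\nu}} x_i \;-\; \sum_{i \notin S^{\nu}} x_i \;-\; \lvert S^{\nu} \rvert \;+\; 1\Bigr) \;+\; L .
\]

The cut forces \(\theta \ge Q(x^{\nu})\) at \(x^{\nu}\) itself and is redundant (bounded below by L) at every other binary point.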
|
7 |
[en] ACCELERATING BENDERS STOCHASTIC DECOMPOSITION FOR THE OPTIMIZATION OF PARTIAL BACKORDER CONTROL FOR PERIODIC REVIEW (R, S) INVENTORY SYSTEM WITH UNCERTAIN DEMAND / [pt] ACELERANDO A DECOMPOSIÇÃO DE BENDERS ESTOCÁSTICA PARA OTIMIZAÇÃO DE UM MODELO DE GESTÃO DE ESTOQUE DE REVISÃO PERIÓDICA (R, S) COM BACKORDER PARCIAL E DEMANDA INCERTA. Felipe Silva Placido dos Santos, 05 September 2017.
This dissertation presents a proposal for speeding up the Benders decomposition applied to a more general and compact version (fewer constraints and variables) of an inventory management model, optimized via two-stage stochastic programming, that considers one layer, one item, uncertain demand and an (R, S) control policy. In order to handle large-scale problems, the traditional single-cut L-shaped method and its extended multi-cut form were applied. Preliminary computational results showed substantially better computational performance for the traditional L-shaped method compared to the multi-cut L-shaped method, even though the former requires more iterations to converge to the optimal solution. This observation motivated the development of a new technique for accelerating the Benders decomposition and of a set of valid inequalities. Numerical experiments show that the proposed approach, which combines the acceleration technique with the developed valid inequalities, provides a significant reduction in the computational time required to solve large-scale instances.
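As an illustration of the policy being optimized, the sketch below simulates a periodic review (R, S) system with partial backordering; all parameters are invented for the example and this is not the dissertation's stochastic programming formulation.

# Simulate an (R, S) policy: every R periods order up to level S; a fraction
# beta of unmet demand is backordered, the rest is lost.
import numpy as np

def simulate_rs(R=4, S=120, beta=0.6, horizon=52, lead_time=1, seed=1):
    rng = np.random.default_rng(seed)
    on_hand, backorders, pipeline = float(S), 0.0, []   # pipeline: (arrival, qty)
    holding = backorder_level = lost = 0.0
    for t in range(horizon):
        # Receive any order that arrives this period.
        on_hand += sum(q for (a, q) in pipeline if a == t)
        pipeline = [(a, q) for (a, q) in pipeline if a > t]
        # Serve outstanding backorders first, then the period's demand.
        served = min(on_hand, backorders)
        on_hand, backorders = on_hand - served, backorders - served
        demand = rng.poisson(25)
        unmet = max(0.0, demand - on_hand)
        on_hand = max(0.0, on_hand - demand)
        backorders += beta * unmet            # fraction beta waits
        lost += (1 - beta) * unmet            # the rest is lost
        holding += on_hand
        backorder_level += backorders
        # Review instant: raise the inventory position back up to S.
        if t % R == 0:
            position = on_hand - backorders + sum(q for (_, q) in pipeline)
            if position < S:
                pipeline.append((t + lead_time, S - position))
    return {"avg_on_hand": holding / horizon,
            "avg_backorders": backorder_level / horizon, "lost_sales": lost}

print(simulate_rs())

In the two-stage model, R, S and related first-stage quantities are chosen before demand is observed, while the scenario sub-problems account for the holding, backorder and lost-sales consequences that this kind of simulation makes explicit.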
|