11

Improving Post-Disaster Recovery: Decision Support for Debris Disposal Operations

Fetter, Gary 07 May 2010 (has links)
Disaster debris cleanup operations are commonly organized into two phases. During the first phase, the objective is to clear debris from evacuation and other important pathways to ensure access to the disaster-affected area. Practically, Phase 1 activities largely consist of pushing fallen trees, vehicles, and other debris blocking streets and highways to the curb. These activities begin immediately once the disaster has passed, with the goal of completion usually within 24 to 72 hours. In Phase 2 of debris removal, which is the focus of this study, completion can take months or years. Activities in this phase include organizing and managing curbside debris collection, reduction, recycling, and disposal operations (FEMA 2007). This dissertation research investigates methods for improving post-disaster debris cleanup operations, one of the most important and costly aspects of the least researched area of disaster operations management (Altay and Green 2006). The first objective is to identify the unique nature of the disaster debris cleanup problem and the important decisions faced by disaster debris coordinators. The second goal is to present three research projects that develop methods for assisting disaster management coordinators with debris cleanup operations. In the first project, which is the topic of Chapter 3, a facility location model is developed for addressing the problem of opening temporary disposal and storage reduction facilities, which are needed to ensure efficient and effective cleanup operations. In the second project, which is the topic of Chapter 4, a multiple objective mixed-integer linear programming model is developed to address the problem of assigning debris cleanup resources across the disaster-affected area at the onset of debris cleanup operations. The third project, the focus of Chapter 5, addresses the problem of equitably controlling ongoing cleanup operations in real time.
A self-balancing CUSUM statistical process control chart is developed to assist disaster management coordinators with equitably allocating cleanup resources as information becomes available in real time. All of the models in this dissertation are evaluated using data from debris cleanup operations in Chesapeake, Virginia, completed after Hurricane Isabel in 2003. / Ph. D.
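The control-chart idea behind this record can be made concrete with a generic two-sided CUSUM monitor. This is a textbook sketch, not the dissertation's self-balancing variant: it accumulates deviations of an observed cleanup rate from a target beyond a slack value and signals when the cumulative drift crosses a limit. The parameter names and default values (`k`, `h`) are illustrative.

```python
def cusum(samples, target, k=0.5, h=4.0):
    """Two-sided CUSUM chart: accumulate deviations from `target` beyond the
    slack `k` and report the indices where either cumulative sum crosses `h`."""
    hi = lo = 0.0
    signals = []
    for i, x in enumerate(samples):
        hi = max(0.0, hi + (x - target) - k)
        lo = max(0.0, lo + (target - x) - k)
        if hi > h or lo > h:
            signals.append(i)
            hi = lo = 0.0  # restart the chart after a signal
    return signals
```

A sustained shift in one zone's rate (e.g. samples jumping from 0 to 3 against a target of 0) triggers a signal within a couple of observations, which is the kind of real-time imbalance a coordinator would react to by reallocating resources.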
12

Multi-objective optimization of a two-echelon vehicle routing problem with vehicle synchronization and "grey zone" customers arising in urban logistics

Anderluh, Alexandra, Nolz, Pamela, Hemmelmayr, Vera, Crainic, Teodor Gabriel January 2019 (has links) (PDF)
We present a multi-objective two-echelon vehicle routing problem with vehicle synchronization and "grey zone" customers arising in the context of urban freight deliveries. Deliveries in the inner city center are performed by small vehicles due to access restrictions, while deliveries outside this area are carried out by conventional vehicles for economic reasons. Goods are transferred from the first to the second echelon at synchronized meetings between vehicles of the respective echelons. We investigate the assignment of customers to vehicles, i.e., to the first or second echelon, within a so-called "grey zone" on the border between the inner city and the surrounding area. In doing so, the economic objective as well as negative external effects of transport, such as emissions and disturbance (negative impact on citizens due to noise and congestion), are taken into account, so that the objectives of companies, citizens, and municipal authorities are all represented. Our metaheuristic, a large neighborhood search embedded in a heuristic rectangle/cuboid splitting, addresses this problem efficiently. We investigate the impact on the solution of freely assigning part of the customers (the "grey zone") to echelons, and of three different city layouts. Computational results show that the impact of a "grey zone", and thus of the assignment of these customers to echelons, depends significantly on the layout of a city. Potentially Pareto-optimal solutions for two and three objectives are illustrated to efficiently support decision makers in sustainable city logistics planning processes.
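The "potentially Pareto-optimal solutions" mentioned above reduce to a dominance filter over a candidate set. The sketch below assumes minimization of every objective (e.g. cost, emissions, disturbance); it illustrates the concept only and is not the paper's algorithm.

```python
def pareto_front(solutions):
    """Keep the solutions not dominated by any other solution in the set.
    Each solution is a tuple of objective values; all objectives are minimized."""
    def dominates(a, b):
        # a dominates b: no worse in every objective, strictly better in one
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))
    return [s for s in solutions if not any(dominates(o, s) for o in solutions)]
```

For example, among the cost/emission pairs `(1, 5)`, `(2, 2)`, `(5, 1)`, `(3, 3)`, only `(3, 3)` is filtered out, since `(2, 2)` is better in both objectives.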
13

Scheduling coal handling processes using metaheuristics

Conradie, David Gideon 21 April 2008 (has links)
The operational scheduling at coal handling facilities is of the utmost importance to ensure that the coal-consuming processes are supplied with a constant feed of good quality coal. Although the Sasol Coal Handling Facility (CHF) was not designed to perform coal blending during the coal handling process, CHF has to blend the different sources to ensure that the quality of the feed supplied is of a stable nature. As a result, the operation of the plant has become an extremely complex process. Consequently, human intelligence is no longer sufficient to perform coal handling scheduling, and a scheduling model is therefore required to ensure optimal plant operation and optimal downstream process performance. After various attempts to solve the scheduling model optimally, i.e. with exact solution methods, it was found that it is not possible to accurately model the complexities of CHF in such a way that the currently available exact solvers can solve it in an acceptable operational time. Various alternative solution approaches are compared, in terms of solution quality and execution speed, using a simplified version of the CHF scheduling problem. This investigation indicates that the Simulated Annealing (SA) metaheuristic is the most efficient solution method to provide approximate solutions. The metaheuristic solution approach allows one to model the typical sequential thoughts of a control room operator and sequential operating procedures. Thus far, these sequential rules could not be modelled in the simultaneous equation environment required for exact solution methods. An SA metaheuristic is developed to solve the practical scheduling model. A novel SA approach is applied where, instead of the actual solution being used for neighbourhood solution representation, the neighbours are indirectly represented by the rules used to generate neighbourhood solutions.
It is also found that the initial temperature should not be a fixed value, but should be a multiple of the objective function value of the initial solution. An inverse arctan-based cooling schedule function outperforms traditional cooling schedules as it provides the required diversification and intensification behaviour of the SA. The scheduling model solves within 45 seconds and provides good, practically executable results. The metaheuristic approach to scheduling is therefore successful as the plant complexities and intricate operational philosophies can be accurately modelled using the sequential nature of programming languages and provides good approximate optimal solutions in a short solution time. Tests done with live CHF data indicate that the metaheuristic solution outperforms the current scheduling methodologies applied in the business. The implementation of the scheduler will lead to a more stable factory feed, which will increase production yields and therefore increase company profits. By reducing the amount of coal re-handling (in terms of throw-outs and load-backs at mine bunkers), the scheduler will reduce the coal handling facility’s annual operating cost by approximately R4.6 million (ZAR). Furthermore, the approaches discussed in this document can be applied to any continuous product scheduling environment. Additional information available on a CD stored at Level 3 of the Merensky Library. / Dissertation (MEng (Industrial Engineering))--University of Pretoria, 2011. / Industrial and Systems Engineering / unrestricted
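The two cooling-schedule findings reported above can be sketched in a minimal simulated-annealing loop. The thesis's exact cooling function and neighbourhood rules are not reproduced here; this is only an illustration of the two design choices it reports: an initial temperature set as a multiple of the initial objective value, and an inverse-arctan-shaped temperature decay. The constants (`t_mult`, the `10.0` steepness factor) are assumptions for the sketch.

```python
import math
import random

def simulated_annealing(neighbour, cost, x0, iters=2000, t_mult=0.2, seed=0):
    """Minimal SA loop for minimization. The initial temperature is a multiple
    of the initial objective value; cooling follows an arctan-shaped curve."""
    rng = random.Random(seed)
    x, c = x0, cost(x0)
    best, best_c = x, c
    t0 = t_mult * max(c, 1e-9)  # temperature tied to the initial objective
    for i in range(iters):
        # arctan-shaped decay: steep early cooling, long low-temperature tail
        t = t0 * (1.0 - math.atan(10.0 * i / iters) / (math.pi / 2))
        y = neighbour(x, rng)
        cy = cost(y)
        # accept improvements always, uphill moves with Metropolis probability
        if cy <= c or rng.random() < math.exp(-(cy - c) / max(t, 1e-12)):
            x, c = y, cy
            if c < best_c:
                best, best_c = x, c
    return best, best_c
```

On a toy objective such as `(x - 3)^2` with Gaussian neighbour moves, the loop settles near the minimum within a couple of thousand iterations; the arctan shape spends most of the run at low temperature (intensification) after a short high-temperature phase (diversification).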
14

Cooperative Water Resources Allocation among Competing Users

Wang, Lizhong January 2005 (has links)
A comprehensive model named the Cooperative Water Allocation Model (CWAM) is developed for modeling equitable and efficient water allocation among competing users at the basin scale, based on a multiperiod node-link river basin network. The model integrates water rights allocation, efficient water allocation and equitable income distribution subject to hydrologic constraints comprising both water quantity and quality considerations. CWAM allocates water resources in two steps: initial water rights are firstly allocated to water uses based on legal rights systems or agreements, and then water is reallocated to achieve efficient use of water through water transfers. The associated net benefits of stakeholders participating in a coalition are allocated by using cooperative game theoretical approaches.

The first phase of the CWAM methodology includes three methods for deriving initial water rights allocation among competing water uses, namely the priority-based multiperiod maximal network flow (PMMNF) programming, modified riparian water rights allocation (MRWRA) and lexicographic minimax water shortage ratios (LMWSR) methods. PMMNF is a very flexible approach and is applicable under prior, riparian and public water rights systems with priorities determined by different criteria. MRWRA is essentially a special form of PMMNF adapted for allocation under the riparian regime. LMWSR is designed for application under a public water rights system, which adopts the lexicographic minimax fairness concept.

The second step comprises three sub-models: the irrigation water planning model (IWPM) is a model for deriving benefit functions of irrigation water; the hydrologic-economic river basin model (HERBM) is the core component of the coalition analysis, which searches for the values of various coalitions of stakeholders and corresponding optimal water allocation schemes, based on initial water rights, monthly net benefit functions of demand sites and the ownership of water uses; the sub-model cooperative reallocation game (CRG) of the net benefit of the grand coalition adopts cooperative game solution concepts, including the nucleolus, weak nucleolus, proportional nucleolus, normalized nucleolus and Shapley value, to perform equitable reallocation of the net benefits of stakeholders participating in the grand coalition. The economically efficient use of water under the grand coalition is achieved through water transfers based on initial water rights.

Sequential and iterative solution algorithms utilizing the primal simplex method are developed to solve the linear PMMNF and LMWSR problems, respectively, which only include linear water quantity constraints. Algorithms for nonlinear PMMNF and LMWSR problems adopt a two-stage approach, which allow nonlinear reservoir area- and elevation-storage relations, and may include nonlinear water quality constraints. In the first stage, the corresponding linear problems, excluding nonlinear constraints, are solved by a sequential or iterative algorithm. The global optimal solution obtained by the linear programming is then combined together with estimated initial values of pollutant concentrations to be used as the starting point for the sequential or iterative nonlinear programs of the nonlinear PMMNF or LMWSR problem.

As HERBM adopts constant price-elasticity water demand functions to derive the net benefit functions of municipal and industrial demand sites and hydropower stations, and quadratic gross benefit functions to find the net benefit functions of agriculture water uses, stream flow demands and reservoir storages, it is a large scale nonlinear optimization problem even when the water quality constraints are not included. An efficient algorithm is built for coalition analysis, utilizing a combination of the multistart global optimization technique and gradient-based nonlinear programming method to solve a HERBM for each possible coalition.

Throughout the study, both the feasibility and the effectiveness of incorporating equity concepts into conventional economic optimal water resources management modeling are addressed. The applications of CWAM to the Amu Darya River Basin in Central Asia and the South Saskatchewan River Basin in western Canada demonstrate the applicability of the model. It is argued that CWAM can be utilized as a tool for promoting the understanding and cooperation of water users to achieve maximum welfare in a river basin and minimize the damage caused by water shortages, through water rights allocation, and water and net benefit transfers among water users under the regulated water market or administrative allocation mechanism.
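Of the cooperative solution concepts listed (the nucleolus variants and the Shapley value), the Shapley value is the simplest to make concrete. The toy implementation below averages each player's marginal contribution over all orders in which players can join the coalition; it is exponential in the number of players and is meant only to illustrate the allocation idea, with a made-up characteristic function in the usage example.

```python
from itertools import permutations

def shapley(players, v):
    """Shapley value: each player's marginal contribution to the characteristic
    function v, averaged over all join orders of the players."""
    phi = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = frozenset()
        for p in order:
            joined = coalition | {p}
            phi[p] += v(joined) - v(coalition)
            coalition = joined
    return {p: total / len(orders) for p, total in phi.items()}
```

For a two-player game with standalone values 1 and 2 and a grand-coalition value of 4, the allocation is 1.5 and 2.5: the cooperation surplus of 1 is split evenly, and the allocations sum to the grand-coalition value (efficiency), which is exactly the property that makes such schemes attractive for sharing basin-wide net benefits.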
16

Design of secondary voltage and stability controls with multiple control objectives

Song, Yang 01 June 2009 (has links)
The purpose of the proposed research is to design a Decentralized Voltage/Stability Monitoring and Control System to counteract voltage violations and the impact of disturbances/contingencies on power system voltage stability. A decentralized voltage and stability control system is designed to coordinate the controls of the local secondary voltage control devices and necessary load shedding without requiring information about the rest of the system. The voltage/stability control can be formulated as a multi-objective optimization problem. The control objectives include, but are not limited to: minimization of system active/reactive losses; maximization of the system stability margin; and minimization of the control actions. The constraints of the optimization problem depend on the specifications of the actual system components. For the first time, margin sensitivities of the control actions are included in the control formulation. The concept of using margin sensitivity to evaluate the post-control load margin is presented as a fast and accurate way to assess potential voltage and stability control options. A system decomposition procedure is designed to define the disturbance-affected zone as an independent control subsystem. A normal constraint algorithm is adopted to identify the most suitable control solution in a shorter timeline than the typical utility voltage-control practice. Both steady-state and dynamic simulations are performed to compare the proposed system with typical utility control practices.
17

Hybrid Gates approach for R and D product portfolio management

Koh, Alex January 2012 (has links)
Companies today are aggressively finding ways to improve top-line growth by introducing innovative products faster to the market. To achieve both innovation and accelerated rollout, many are turning to techniques such as Stage Gate approaches to improve engineering and marketing collaboration and clarify short term resource allocations (day to day plan with employee assignment). While Stage Gate approaches have been shown to result in better project coordination and faster time to market by doing projects right, research also indicates the need to ensure alignment to company strategy by doing the right projects within the allocated annual budget through medium term (rough cut capacity plan with employee requirements) and long term resource allocations (business / strategic plan with funding requirements). Today, such medium to long term resource allocation methodologies tend to be broadly consolidated under Research and Development (R&D) product portfolio management. We argue that there is value in a philosophical change in viewing R&D product portfolio management from the context of (1.) long and medium term resource allocation phases separately, (2.) focusing on the overlapping regions between long and medium term and between medium and short term resource allocation phases and (3.) the evolving resource allocation perspective (monetary to headcount to skillset) through these phases. Cooper et al. note that for R&D product portfolio management and the Stage Gate process to work together, one can expect one of two scenarios: a gates dominated approach (where the prioritization and resource decisions are made at short term focused Stage Gates) or a portfolio reviews dominated approach (where the prioritization and resource decisions are made at the long term focused portfolio reviews). We propose that with appropriate focus given to the medium term phase, a third approach that we call a Hybrid Gates approach can exist in a "gates dominated" environment.
A case study of Freescale Semiconductor was used as an empirical inquiry to gain deeper understanding of the perceived value of this approach within a real-life context. Triangulating between structured surveys, unstructured surveys, and focused interviews, we were able to show perceived value to the organization in the following areas: (1.) enhancing the understanding of decision makers' decision and solution spaces, (2.) clarifying strategic expressions and "stress testing" new strategies, (3.) improving horizontal and vertical communication within the organization and (4.) aiding in objectivity in R&D investment allocation. Furthermore, we were able to conceptually show how this approach retains the advantages of the gates dominated and portfolio dominated approaches while minimizing their respective weaknesses. This research is novel and unique as we have not found any research literature that focuses on a Hybrid Gates approach perspective or studies where the implementation of MO-ZOLP (multi-objective zero-one linear programming) is: (1.) this large in scale and (2.) designed specifically to support a Stage Gate dominated environment. We believe that this research contributes to the practising educator and researcher by providing them with an alternative approach to R&D project portfolio management in complex organizations that use a Stage Gate process. We also believe that this research is valuable to the practitioner by providing them with a practical process and methodology by which change management for such activities can be achieved. In addition, we assessed the on-going value added to the organization, thus linking theory to practice and finally, to outcome.
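The thesis's MO-ZOLP formulation is not given in this record; as a stand-in, the brute-force sketch below enumerates feasible 0-1 project portfolios under a budget and keeps the Pareto set over two maximized objectives. The project data, the two objective names (financial benefit and strategic fit), and the exhaustive enumeration are all illustrative assumptions; an instance of industrial scale would need an integer-programming solver rather than enumeration.

```python
from itertools import combinations

def portfolio_candidates(projects, budget):
    """Enumerate all feasible 0-1 project portfolios under a budget and keep
    the Pareto set over (total benefit, total strategic score), both maximized.
    `projects` maps name -> (cost, benefit, strategic_score)."""
    names = list(projects)
    feasible = []
    for r in range(len(names) + 1):
        for combo in combinations(names, r):
            cost = sum(projects[p][0] for p in combo)
            if cost <= budget:
                benefit = sum(projects[p][1] for p in combo)
                strategic = sum(projects[p][2] for p in combo)
                feasible.append((set(combo), benefit, strategic))
    def dominated(a, b):
        # does portfolio b dominate portfolio a? (no worse in both, better in one)
        return b[1] >= a[1] and b[2] >= a[2] and (b[1] > a[1] or b[2] > a[2])
    return [f for f in feasible if not any(dominated(f, g) for g in feasible)]
```

Presenting the Pareto set, rather than a single "best" portfolio, is what lets a gate meeting or portfolio review trade off financial return against strategic alignment explicitly.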
18

Solving Multiple Objective Optimization Problem using Multi-Agent Systems: A case in Logistics Management

Pennada, Venkata Sai Teja January 2020 (has links)
Background: Multiple Objective Optimization Problems (MOOPs) are common and evident in every field. Container port terminals are one of the fields in which MOOPs occur. In this research, we have taken a case in logistics management and modelled multi-agent systems to solve the MOOP using the Non-dominated Sorting Genetic Algorithm II (NSGA-II). Objectives: The purpose of this study is to build AI-based models for solving a Multiple Objective Optimization Problem occurring in port terminals. At first, we develop a port agent with an objective function of maximizing throughput and a customer agent with an objective function of maximizing business profit. Then, we solve the problem using a single-objective optimization model and a multi-objective optimization model, and compare the results of both models to assess their performance. Methods: A literature review is conducted to choose the best algorithm among those previously used to solve other Multiple Objective Optimization Problems. An experiment is conducted to assess how well the models solve the problem so that all participants benefit simultaneously. Results: The results show that all three participants, the port, customer one, and customer two, gained profit when the problem was solved with the multi-objective optimization model, whereas in the single-objective optimization model only one participant at a time achieved earnings, leaving the rest of the participants either in loss or with minimal profits. Conclusion: We conclude that the multi-objective optimization model performed better than the single-objective optimization model because of the impartial results among the participants.
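NSGA-II's core step, referenced above, is ranking the population into non-dominated fronts. The compact sketch below performs that ranking (O(n^2) per front, without the fast-sort bookkeeping of the full NSGA-II algorithm or its crowding-distance step), with all objectives minimized:

```python
def non_dominated_sort(pop):
    """Rank solutions into non-dominated fronts, NSGA-II style.
    Each solution is a tuple of objective values; all objectives minimized."""
    def dom(a, b):
        # a dominates b: no worse everywhere and not identical to b
        return all(x <= y for x, y in zip(a, b)) and a != b
    fronts, remaining = [], list(pop)
    while remaining:
        # front = solutions not dominated by anything still unranked
        front = [p for p in remaining if not any(dom(q, p) for q in remaining)]
        fronts.append(front)
        remaining = [p for p in remaining if p not in front]
    return fronts
```

Front 0 is the current Pareto-optimal set; later fronts are successively worse layers, which NSGA-II uses as the primary selection criterion when breeding the next generation.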
19

Towards using microscopic traffic simulations for safety evaluation

Tamayo Cascan, Edgar January 2018 (has links)
Microscopic traffic simulation has become an important tool to investigate traffic efficiency and road safety. In order to produce meaningful results, the incorporated driver behaviour models need to be carefully calibrated to represent real-world conditions. In addition to macroscopic relationships such as the speed-density diagram, they should also adequately represent the average risk of accidents occurring on the road. In this thesis, I present a two-stage, computationally feasible multi-objective calibration process. The first stage performs a parameter sensitivity analysis and selects only parameters with a considerable effect on the respective objective functions, keeping the computational complexity of the calibration at a manageable level. The second stage employs a multi-objective genetic algorithm that produces a front of Pareto-optimal solutions with respect to the objective functions. Compared to traditional methods, which focus on only one objective while sacrificing accuracy in the other, my method achieves a high degree of realism for both traffic flow and average risk.
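The sensitivity-analysis stage is described only at a high level in this record; the simplest version of the idea is a one-at-a-time (OAT) scan, sketched below: perturb each calibration parameter separately and rank parameters by the induced change in the objective. Function and parameter names here are illustrative, and a real calibration study might well use a variance-based or elementary-effects method instead.

```python
def oat_sensitivity(objective, base, deltas):
    """One-at-a-time sensitivity: |change in objective| per unit perturbation,
    computed for each parameter with the others held at their base values."""
    f0 = objective(base)
    scores = {}
    for name, delta in deltas.items():
        perturbed = dict(base)
        perturbed[name] += delta
        scores[name] = abs(objective(perturbed) - f0) / abs(delta)
    return scores
```

Parameters whose score falls below a chosen threshold would be frozen at default values, shrinking the search space the genetic algorithm in the second stage has to explore.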
20

Automatic Design Space Exploration of Fault-tolerant Embedded Systems Architectures

Tierno, Antonio 26 January 2023 (has links)
Embedded systems may have competing design objectives, such as maximizing reliability, increasing functional safety, minimizing product cost, and minimizing energy consumption. Architectures must therefore be configured to meet varied requirements and multiple design objectives. In particular, reliability and safety are receiving increasing attention. Consequently, the configuration of fault-tolerant mechanisms is a critical design decision. This work proposes a method for automatic selection of appropriate fault-tolerant design patterns, simultaneously optimizing multiple objective functions. First, we present an exact method that leverages the power of Satisfiability Modulo Theories to encode the problem with a symbolic technique. It is based on a novel assessment of reliability that is part of the evaluation of alternative designs. Afterwards, we empirically evaluate the performance of a near-optimal approximation variant that allows us to solve the problem even when the instance size makes it intractable in terms of computing resources. The efficiency and scalability of the method are validated with a series of experiments of different sizes and characteristics, and by comparing it with existing methods on a test problem that is widely used in the reliability optimization literature.
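The record does not spell out its reliability assessment, but a standard building block behind many fault-tolerant design patterns (e.g. triple modular redundancy, which votes among three replicas) is k-out-of-n reliability for identical, independent components. This is textbook background, not the thesis's novel method:

```python
from math import comb

def k_of_n_reliability(k, n, r):
    """Probability that at least k of n identical, independent components
    (each working with probability r) are functioning: a binomial tail sum."""
    return sum(comb(n, i) * r**i * (1 - r)**(n - i) for i in range(k, n + 1))
```

For TMR (a 2-of-3 pattern) with component reliability 0.9, `k_of_n_reliability(2, 3, 0.9)` gives 0.972, higher than a single component. Evaluating such expressions for each candidate pattern gives the per-design reliability figure that a multi-objective selector can trade off against cost and energy.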
