11

Improved accuracy of surrogate models using output postprocessing

Andersson, Daniel January 2007 (has links)
Using surrogate approximations (e.g. Kriging interpolation or artificial neural networks) is an established technique for decreasing the execution time of simulation optimization problems. However, constructing surrogate approximations can be impossible when facing complex simulation inputs, and instead one is forced to use a surrogate model, which explicitly attempts to simulate the inner workings of the underlying simulation model. This dissertation investigated whether postprocessing the output of a surrogate model with an artificial neural network can increase its accuracy and value in simulation optimization problems. Results indicate that the technique has potential: when output postprocessing was enabled, the accuracy of the surrogate model increased, i.e. its output more closely matched the output of the real simulation model. No apparent improvement in optimization performance could be observed, however. It was speculated that this was due either to the optimization algorithm used not taking advantage of the improved accuracy of the surrogate model, or to the improvement in accuracy being too small to make any measurable impact. Further investigation of these issues must be conducted in order to get a better understanding of the pros and cons of the technique.
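A minimal sketch of the output-postprocessing idea may make it concrete. The toy simulation and surrogate functions below, and the use of scikit-learn's MLPRegressor as the correcting network, are illustrative assumptions and not the models used in the thesis; the point is only that a small network trained on paired surrogate/simulation outputs can correct the surrogate's systematic error.

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)

    def expensive_simulation(x):
        # Stand-in for the real simulation model (illustrative assumption).
        return np.sin(3 * x[:, 0]) + 0.5 * x[:, 1] ** 2

    def surrogate_model(x):
        # Stand-in for a structural surrogate that mimics the simulation's
        # inner workings but carries systematic error (illustrative assumption).
        return 0.8 * np.sin(3 * x[:, 0]) + 0.4 * x[:, 1] ** 2 + 0.1

    # Training data: run both models on the same sampled inputs.
    X = rng.uniform(-1, 1, size=(200, 2))
    y_sim, y_sur = expensive_simulation(X), surrogate_model(X)

    # The postprocessor learns to map (inputs, surrogate output) to the simulation output.
    postprocessor = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
    postprocessor.fit(np.column_stack([X, y_sur]), y_sim)

    # During optimization, only the cheap surrogate plus the postprocessor is evaluated.
    X_new = rng.uniform(-1, 1, size=(50, 2))
    y_corrected = postprocessor.predict(np.column_stack([X_new, surrogate_model(X_new)]))
    err_raw = np.mean((surrogate_model(X_new) - expensive_simulation(X_new)) ** 2)
    err_post = np.mean((y_corrected - expensive_simulation(X_new)) ** 2)
    print(f"surrogate MSE: {err_raw:.4f}, postprocessed MSE: {err_post:.4f}")

In this arrangement the optimizer only ever calls the cheap surrogate and the correcting network, so the cost per evaluation stays low while the output tracks the real simulation more closely.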
13

Convergent algorithms in simulation optimization

Hu, Liujia 27 May 2016 (has links)
It is frequently the case that deterministic optimization models could be made more practical by explicitly incorporating uncertainty. The resulting stochastic optimization problems are in general more difficult to solve than their deterministic counterparts, because the objective function cannot be evaluated exactly and/or because there is no explicit relation between the objective function and the corresponding decision variables. This thesis develops random search algorithms for solving optimization problems with continuous decision variables when the objective function values can be estimated with some noise via simulation. Our algorithms will maintain a set of sampled solutions, and use simulation results at these solutions to guide the search for better solutions. In the first part of the thesis, we propose an Adaptive Search with Resampling and Discarding (ASRD) approach for solving continuous stochastic optimization problems. Our ASRD approach is a framework for designing provably convergent algorithms that are adaptive both in seeking new solutions and in keeping or discarding already sampled solutions. The framework is an improvement over the Adaptive Search with Resampling (ASR) method of Andradottir and Prudius in that it spends less effort on inferior solutions (the ASR method does not discard already sampled solutions). We present conditions under which the ASRD method is convergent almost surely and carry out numerical studies aimed at comparing the algorithms. Moreover, we show that whether it is beneficial to resample or not depends on the problem, and analyze when resampling is desirable. Our numerical results show that the ASRD approach makes substantial improvements on ASR, especially for difficult problems with large numbers of local optima. In traditional simulation optimization problems, noise is only involved in the objective functions. However, many real world problems involve stochastic constraints. Such problems are more difficult to solve because of the added uncertainty about feasibility. The second part of the thesis presents an Adaptive Search with Discarding and Penalization (ASDP) method for solving continuous simulation optimization problems involving stochastic constraints. Rather than addressing feasibility separately, ASDP utilizes the penalty function method from deterministic optimization to convert the original problem into a series of simulation optimization problems without stochastic constraints. We present conditions under which the ASDP algorithm converges almost surely from inside the feasible region, and under which it converges to the optimal solution but without feasibility guarantee. We also conduct numerical studies aimed at assessing the efficiency and tradeoff under the two different convergence modes. Finally, in the third part of the thesis, we propose a random search method named Gaussian Search with Resampling and Discarding (GSRD) for solving simulation optimization problems with continuous decision spaces. The method combines the ASRD framework with a sampling distribution based on a Gaussian process that not only utilizes the current best estimate of the optimal solution but also learns from past sampled solutions and their objective function observations. We prove that our GSRD algorithm converges almost surely, and carry out numerical studies aimed at studying the effects of utilizing the Gaussian sampling strategy. 
Our numerical results show that the GSRD framework performs well when the underlying objective function is multi-modal. However, it takes much longer to sample solutions, especially in higher dimensions.
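As a rough illustration only, a resampling-and-discarding loop of the kind described above might look like the following sketch; the local sampling rule, Gaussian noise model, and median-based discarding cutoff are assumptions made for the example and do not reproduce the thesis's exact ASRD algorithm or its convergence conditions.

    import numpy as np

    rng = np.random.default_rng(1)

    def noisy_objective(x, noise=0.3):
        # Simulation stand-in: true objective value plus estimation noise.
        return float(np.sum((x - 0.5) ** 2) + rng.normal(0, noise))

    dim, budget = 2, 2000
    pool = {}  # sampled solution (tuple of coordinates) -> list of noisy observations

    def estimate(x):
        return float(np.mean(pool[x]))

    x_best = tuple(rng.uniform(0, 1, dim))
    pool[x_best] = [noisy_objective(np.array(x_best))]

    for it in range(budget):
        # Adaptive sampling: propose a new solution near the current best estimate.
        cand = tuple(np.clip(np.array(x_best) + rng.normal(0, 0.1, dim), 0, 1))
        pool.setdefault(cand, []).append(noisy_objective(np.array(cand)))

        # Resampling: one extra observation at the incumbent to reduce its noise.
        pool[x_best].append(noisy_objective(np.array(x_best)))

        # Discarding: periodically drop clearly inferior solutions to save effort.
        if it % 100 == 99 and len(pool) > 10:
            cutoff = float(np.median([estimate(x) for x in pool]))
            pool = {x: obs for x, obs in pool.items()
                    if estimate(x) <= cutoff or x == x_best}

        x_best = min(pool, key=estimate)

    print("estimated optimum:", np.round(x_best, 3), "value:", round(estimate(x_best), 3))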
14

Using Simulation Optimization to Construct Efficient Screening Strategies for Cervical Cancer

Foufoulides, Christodoulos 21 August 2008 (has links)
Cervical cancer is the second most common type of cancer in women worldwide. Because cervical cancer is usually asymptomatic until the disease is in its advanced stages, cervical screening is of central importance in combating it. Alternative screening strategies are evaluated from an economic point of view through cost-effectiveness analysis. In the literature, however, studies perform cost-effectiveness analysis on a limited number of de facto or predetermined screening policies. At present, no attempt has been made to construct efficient screening strategies through optimization before cost-effectiveness analysis is applied. In this study, simulation optimization is used to construct efficient screening strategies for cervical cancer by properly timing the screenings. The constructed strategies are highly cost-effective when a small number of lifetime screenings is available, and are more cost-effective than screening strategies used in practice or considered in the literature so far, indicating the value of optimal timing for other screened diseases as well.
15

Hybrid simulation and optimization approach for green intermodal transportation problem with travel time uncertainty

Hrusovsky, Martin, Demir, Emrah, Jammernegg, Werner, van Woensel, Tom 09 1900 (has links) (PDF)
The increasing volumes of road transportation contribute to congestion on roads, which leads to delays and other negative impacts on the reliability of transportation. Moreover, transportation is one of the main contributors to the growth of carbon dioxide equivalent emissions, where the impact of road transportation is significant. Therefore, governmental organizations and private commercial companies are looking for greener transportation solutions to eliminate the negative externalities of road transportation. In this paper, we present a novel solution framework to support operational-level decisions for intermodal transportation networks using a combination of an optimization model and simulation. The simulation model includes stochastic elements in the form of uncertain travel times, whereas the optimization model is a deterministic, linear multi-commodity service network design formulation. The intermodal transportation plan can be optimized according to different objectives, including costs, time and CO2e emissions. The proposed approach is successfully applied to real-life scenarios, where differences in transportation plans for alternative objectives are presented. The solutions for transportation networks with up to 250 services and 20 orders show that the approach is capable of delivering reliable solutions and identifying possible disruptions and alternatives for adapting unreliable transportation plans.
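The optimization/simulation interplay described above can be sketched in a few lines. The candidate routes, cost figures, deadline, and lognormal travel-time noise below are illustrative assumptions, not the paper's network or data; the sketch only shows how delays estimated by simulation can be fed back as penalties into the next deterministic optimization pass.

    import numpy as np

    rng = np.random.default_rng(2)

    # Candidate intermodal routes for one order: (name, cost, planned time in hours, CO2e).
    routes = [("road", 100.0, 10.0, 50.0), ("rail+road", 80.0, 14.0, 20.0),
              ("barge+road", 70.0, 18.0, 15.0)]
    deadline = 16.0
    delay_penalty = {name: 0.0 for name, *_ in routes}

    def simulate_delay(planned_time, n_scenarios=1000):
        # Stochastic travel times: lognormal noise around the planned duration.
        times = planned_time * rng.lognormal(mean=0.0, sigma=0.15, size=n_scenarios)
        return np.mean(np.maximum(times - deadline, 0.0))

    for iteration in range(3):
        # Optimization step: deterministic cost plus the current delay penalties.
        def score(route):
            name, cost, t, co2 = route
            return cost + 0.5 * co2 + 10.0 * delay_penalty[name]
        chosen = min(routes, key=score)

        # Simulation step: estimate expected lateness of the chosen plan and update
        # its penalty so unreliable plans are avoided in the next pass.
        expected_delay = simulate_delay(chosen[2])
        delay_penalty[chosen[0]] = expected_delay
        print(f"iter {iteration}: chose {chosen[0]}, expected delay {expected_delay:.2f}h")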
16

A Metamodel based Multiple Criteria Optimization via Simulation Method for Polymer Processing

Villarreal-Marroquin, Maria G. January 2012 (has links)
No description available.
17

Genetic Algorithm based Simulation-Optimization for Fighting Wildfires

HomChaudhuri, Baisravan 03 August 2010 (has links)
No description available.
18

Impact of travel time uncertainties on the solution cost of a two-echelon vehicle routing problem with synchronization

Anderluh, Alexandra, Larsen, Rune, Hemmelmayr, Vera, Nolz, Pamela January 2019 (has links) (PDF)
Two-echelon vehicle routing problems which contain synchronization between vehicles can be deeply impacted by time uncertainty, because one vehicle's delay can propagate to other vehicles. In this paper, we evaluate the deterministic solution of such a problem based on simulated travel time scenarios. The information obtained by simulation is incorporated in the optimization procedure iteratively. Computational results show that the degree of synchronization in an instance is directly correlated with the potential improvements by reoptimization. We present findings on the number of travel time scenarios required to obtain a representative picture of the stochastic solutions. In addition, we demonstrate that time dependent travel times can be aggregated on a city-wide level and linearized as a function of free flow times without major loss of reliability.
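A simplified sketch of the scenario-based evaluation step may help illustrate why synchronization amplifies time uncertainty: the first-echelon vehicle's delay propagates to the city freighter that waits for it at the transfer point. All times, costs, and the gamma-distributed travel-time noise are illustrative assumptions, not the instances used in the paper.

    import numpy as np

    rng = np.random.default_rng(3)

    planned_truck_arrival = 9.0      # planned arrival at the satellite (hours)
    planned_transfer_start = 9.25    # the city freighter waits for the truck
    freighter_route_duration = 3.0   # second-echelon tour length (hours)
    late_cost_per_hour = 40.0

    def evaluate_plan(n_scenarios=10000):
        # Stochastic first-echelon travel time around the planned arrival.
        truck_arrival = planned_truck_arrival + rng.gamma(2.0, 0.15, n_scenarios) - 0.3
        # Synchronization: the transfer cannot start before the truck arrives.
        transfer_start = np.maximum(planned_transfer_start, truck_arrival)
        freighter_finish = transfer_start + freighter_route_duration
        planned_finish = planned_transfer_start + freighter_route_duration
        # Propagated delay of the second echelon, priced into the solution cost.
        delay = np.maximum(freighter_finish - planned_finish, 0.0)
        return np.mean(delay), np.mean(delay) * late_cost_per_hour

    mean_delay, expected_cost = evaluate_plan()
    print(f"expected propagated delay: {mean_delay:.3f}h, expected delay cost: {expected_cost:.2f}")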
19

Production Control Mechanisms Comparison using Multi-Objective Simulation Optimization

Zia, Muhammad Irfan January 2009 (has links)
The choice of an efficient and effective production control mechanism (PCM), along with the appropriate buffer allocation pattern, is very important for any production engineer/decision maker when designing a production line in order to attain the required system performance. This project aims to give an insight into different PCMs, different buffer allocation patterns and arrangements of workers of different capability, to help production engineers/decision makers select the right mechanism and pattern. The study was performed with a multi-objective simulation optimisation (MOSO) tool. Results from many experiments have shown that the ascending buffer allocation pattern stands out as the prominent choice when the goal is to attain maximum throughput (TP) while simultaneously keeping minimum cycle time (CT) and work in process (WIP). The performance of the PCMs and worker imbalance patterns differs in different regions of the Pareto-optimal CT-TP data plots obtained from MOSO, so their selection depends on the desired level of throughput together with the limit on cycle time.
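The CT-TP trade-off can be illustrated with a short sketch that extracts the Pareto-optimal points from simulated buffer-allocation candidates. The toy "simulation" below is an illustrative assumption standing in for a discrete-event model of the line, not the model used in the study.

    import itertools
    import numpy as np

    rng = np.random.default_rng(4)

    def simulate_line(buffers):
        # Stand-in for a discrete-event simulation of the production line:
        # more buffer capacity raises throughput (TP) but also cycle time (CT).
        total = sum(buffers)
        tp = 10.0 * (1.0 - np.exp(-0.3 * total)) + rng.normal(0, 0.05)
        ct = 2.0 + 0.4 * total + rng.normal(0, 0.05)
        return tp, ct

    candidates = list(itertools.product(range(1, 6), repeat=3))  # 3 buffer slots
    results = [(b, *simulate_line(b)) for b in candidates]

    def dominates(a, b):
        # a dominates b if its TP is no lower and its CT no higher, strictly better in one.
        return a[1] >= b[1] and a[2] <= b[2] and (a[1] > b[1] or a[2] < b[2])

    pareto = [r for r in results if not any(dominates(o, r) for o in results if o is not r)]
    for buffers, tp, ct in sorted(pareto, key=lambda r: r[2]):
        print(f"buffers {buffers}: TP={tp:.2f}, CT={ct:.2f}")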
20

Analysis Of The Heterogeneity Scale Effects On Pump And Treat Aquifer Remediation Design

Gungor Demirci, Gamze 01 May 2009 (has links) (PDF)
The effect of the heterogeneity correlation scale (λ) of hydraulic conductivity (K), equilibrium distribution coefficient (Kd) and mass transfer rate (α) on the design and cost of the pump-and-treat (P&T) remediation system for different heterogeneity levels (defined by the variance σ²lnK) and parameter distributions under rate-limited sorption conditions was evaluated in this study. In addition, the impacts of the initial amount of contaminant mass and plume configuration on the remediation design and cost were explored. The effects of different K heterogeneity and remediation design conditions on the length of the remediation period, and the influence of the λ anisotropy of K, the correlation between K and Kd and between Kd and α, and the fraction of equilibrium sorption sites (f) on the P&T design and cost were the other subjects studied. In this study, a simulation-optimization approach was used, in which a groundwater flow and contaminant transport simulation model was linked with a genetic algorithm (GA) library. Results showed that not only the amount of PCE mass initially present in the aquifer was important in terms of P&T design, cost and remediation time, but also the location and size of the high- and low-K regions defined by λlnK, as well as the magnitudes of K represented by the geometric mean and σ²lnK, were influential. It was also found that P&T designs utilizing a higher number of wells with lower pumping rates may be more robust in predicting the time-to-compliance compared to a single well with a higher pumping rate for aquifers heterogeneous in K. A homogeneous Kd assumption might cause serious error in both the design and the cost of remediation; the magnitude of this error may change depending on the spatial distribution of K and Kd, λlnKd, σ²lnKd and σ²lnK. The effect of heterogeneity in α on the design and cost of remediation may or may not be significant depending on the K, Kd and α distributions, λlnα and σ²lnα. An increased amount of kinetically sorbed mass, defined by a decreased f value, resulted in more costly remediation.
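The simulation-optimization coupling described above can be sketched briefly: a genetic algorithm searches over pumping rates while a stand-in "transport simulation" returns the residual contaminant mass. The cost model, penalty weight, and surrogate physics below are illustrative assumptions, not the flow/transport model, GA library, or cost figures used in the study.

    import numpy as np

    rng = np.random.default_rng(5)
    n_wells, pop_size, generations = 3, 30, 40
    max_rate, mass_target = 50.0, 5.0

    def transport_simulation(rates):
        # Stand-in for the groundwater flow/transport model: more pumping removes
        # more contaminant mass, with diminishing returns per well.
        removed = np.sum(40.0 * (1.0 - np.exp(-rates / 15.0)))
        return max(100.0 - removed, 0.0)  # residual contaminant mass

    def cost(rates):
        residual = transport_simulation(rates)
        capital = 10.0 * np.count_nonzero(rates > 1.0)   # cost per active well
        operating = 0.5 * np.sum(rates)
        penalty = 1000.0 if residual > mass_target else 0.0  # compliance penalty
        return capital + operating + penalty

    pop = rng.uniform(0, max_rate, size=(pop_size, n_wells))
    for gen in range(generations):
        fitness = np.array([cost(ind) for ind in pop])
        parents = pop[np.argsort(fitness)[: pop_size // 2]]        # truncation selection
        offspring = parents + rng.normal(0, 2.0, parents.shape)     # Gaussian mutation
        pop = np.clip(np.vstack([parents, offspring]), 0, max_rate)

    best = pop[np.argmin([cost(ind) for ind in pop])]
    print("pumping rates:", np.round(best, 1), "residual mass:", round(transport_simulation(best), 2))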
