1

Sequential design strategies for mean response surface metamodeling via stochastic kriging with adaptive exploration and exploitation

Chen, Xi; Zhou, Qiang
Stochastic kriging (SK) methodology is known as an effective metamodeling tool for approximating a mean response surface implied by a stochastic simulation. In this paper we provide some theoretical results on the predictive performance of SK, in light of which novel integrated mean squared error-based sequential design strategies are proposed to apply SK for mean response surface metamodeling with a fixed simulation budget. Through numerical examples with different features, we show that SK with the proposed strategies holds great promise for achieving high predictive accuracy by striking a good balance between exploration and exploitation. Published by Elsevier B.V.
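As a rough illustration of the sequential design idea summarized in this abstract (a minimal sketch, not the authors' code), the following uses scikit-learn's GaussianProcessRegressor with per-point noise variances as a stand-in for stochastic kriging, a toy stochastic simulator, and an approximate IMSE criterion: each candidate point is scored by hypothetically adding it with its predicted mean as a pseudo-observation and averaging the resulting posterior variance over a grid. The simulator, kernel, and budget are assumptions made only for this example.

```python
# Minimal sketch: stochastic-kriging-style metamodeling with an approximate
# IMSE-based sequential design rule (illustrative assumptions throughout).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)

def simulate(x, n_reps=10):
    """Toy stochastic simulation: noisy sine with input-dependent variance."""
    y = np.sin(6 * x) + rng.normal(0.0, 0.2 + 0.3 * x, size=n_reps)
    return y.mean(), y.var(ddof=1) / n_reps   # sample mean and Var(sample mean)

def fit_sk(X, y_bar, noise_var):
    """Fit a GP on sample means with a heteroscedastic nugget (SK stand-in)."""
    kernel = ConstantKernel(1.0) * RBF(length_scale=0.2)
    gp = GaussianProcessRegressor(kernel=kernel, alpha=noise_var, normalize_y=True)
    return gp.fit(X.reshape(-1, 1), y_bar)

# Initial space-filling design and simulation replications at each point.
X = np.linspace(0.05, 0.95, 5)
stats = np.array([simulate(x) for x in X])
y_bar, noise_var = stats[:, 0], stats[:, 1]

grid = np.linspace(0, 1, 200)        # integration grid for the IMSE estimate
candidates = np.linspace(0, 1, 40)   # candidate next design points
budget = 10                          # additional design points to place

for _ in range(budget):
    gp = fit_sk(X, y_bar, noise_var)
    best_x, best_imse = None, np.inf
    for c in candidates:
        # Hypothetically add c with its predicted mean and a nominal noise level,
        # then score the average posterior variance over the grid (approx. IMSE).
        mu_c = gp.predict([[c]])[0]
        gp_c = fit_sk(np.append(X, c), np.append(y_bar, mu_c),
                      np.append(noise_var, noise_var.mean()))
        _, sd = gp_c.predict(grid.reshape(-1, 1), return_std=True)
        imse = np.mean(sd ** 2)
        if imse < best_imse:
            best_x, best_imse = c, imse
    m, v = simulate(best_x)          # run replications at the chosen point
    X = np.append(X, best_x)
    y_bar, noise_var = np.append(y_bar, m), np.append(noise_var, v)
```

The exploration-exploitation balance arises because the averaged posterior variance rewards both filling sparsely sampled regions and adding replications where the simulation noise inflates the predictive error.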
2

A Dual Metamodeling Perspective for Design and Analysis of Stochastic Simulation Experiments

Wang, Wenjing, 17 July 2019
Fueled by a growing number of applications in science and engineering, the development of stochastic simulation metamodeling methodologies has gained momentum in recent years. A majority of the existing methods, such as stochastic kriging (SK), focus only on efficiently metamodeling the mean response surface implied by a stochastic simulation experiment. Because the simulation outputs are stochastic, with the simulation variance varying significantly across the design space, suitable methods for variance modeling are required. This thesis takes a dual metamodeling perspective and aims at exploiting the benefits of fitting the mean and variance functions simultaneously to achieve improved predictive performance. We first explore the effects of replacing the sample variances with various smoothed variance estimates on the performance of SK and propose a dual metamodeling approach to obtain an efficient simulation budget allocation rule. Second, we articulate the links between SK and least-squares support vector regression and propose a "dense and shallow" initial design to facilitate the selection of important design points and the efficient allocation of the computational budget. Third, we propose a variational Bayesian inference-based Gaussian process (VBGP) metamodeling approach to accommodate situations where either one or multiple simulation replications are available at every design point. VBGP can fit the mean and variance response surfaces simultaneously, while taking into full account the uncertainty in the heteroscedastic variance. Lastly, we generalize VBGP for handling large-scale heteroscedastic datasets based on the idea of "transductive combination of GP experts."

/ Doctor of Philosophy /

In solving real-world complex engineering problems, it is often helpful to learn the relationship between the decision variables and the response variables to better understand the real system of interest. Directly conducting experiments on the real system can be impossible or impractical due to the high cost or time involved. Instead, simulation models are often used as surrogates for complex stochastic systems in simulation-based design and analysis. However, even simulation models can be very expensive to run. To alleviate the computational burden, a metamodel is often built from the outputs of simulation runs at selected design points; it maps the performance response surface as a function of the controllable decision variables, or uncontrollable environmental variables, to approximate the behavior of the original simulation model. There has been a plethora of work in the simulation research community dedicated to stochastic simulation metamodeling methodologies for analyzing stochastic simulation experiments in science and engineering. A majority of the existing methods, such as stochastic kriging (SK), are known as effective metamodeling tools for approximating a mean response surface implied by a stochastic simulation. Although SK has been extensively used as an effective metamodeling methodology for stochastic simulations, SK and similar metamodeling techniques still face four methodological barriers: 1) lack of study of variance estimation methods; 2) absence of an efficient experimental design for simultaneous mean and variance metamodeling; 3) lack of flexibility to accommodate situations where simulation replications are not available; and 4) lack of scalability.
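As a rough illustration of the dual metamodeling idea described in this abstract (a simplified sketch, not the thesis's VBGP method), the following fits one Gaussian process to the log sample variances to obtain smoothed variance estimates and plugs them into a second Gaussian process that models the mean response surface. The simulator, design, and kernel settings are assumptions made only for this example.

```python
# Minimal sketch of dual (mean + variance) metamodeling with two GPs
# (illustrative assumptions; not the VBGP approach described above).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel, WhiteKernel

rng = np.random.default_rng(1)

# Toy heteroscedastic simulation: several replications at each design point.
X = np.linspace(0, 1, 15).reshape(-1, 1)
n_reps = 8
reps = np.array([np.cos(4 * x) + rng.normal(0, 0.1 + 0.5 * x, n_reps)
                 for x in X.ravel()])
y_bar = reps.mean(axis=1)              # sample means
s2 = reps.var(axis=1, ddof=1)          # sample variances

# Step 1: variance metamodel -- smooth the log sample variances with a GP.
var_gp = GaussianProcessRegressor(
    kernel=ConstantKernel() * RBF(0.3) + WhiteKernel(0.1), normalize_y=True)
var_gp.fit(X, np.log(s2))
s2_smoothed = np.exp(var_gp.predict(X))   # smoothed variance estimates

# Step 2: mean metamodel -- GP on sample means with heteroscedastic nugget
# Var(sample mean) = smoothed variance / number of replications.
mean_gp = GaussianProcessRegressor(
    kernel=ConstantKernel() * RBF(0.3), alpha=s2_smoothed / n_reps,
    normalize_y=True)
mean_gp.fit(X, y_bar)

# Predict the mean surface and its uncertainty on a fine grid.
grid = np.linspace(0, 1, 200).reshape(-1, 1)
mu, sd = mean_gp.predict(grid, return_std=True)
```

Replacing the raw sample variances with the smoothed estimates is what makes the nugget usable even when few replications are available at a design point, which is the motivation for the dual perspective.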
3

Simulation-Based Robust Revenue Maximization Of Coal Mines Using Response Surface Methodology

Nageshwaraniyergopalakrishnan, Saisrinivas, January 2014
A robust simulation-based optimization approach is proposed for truck-shovel systems in surface coal mines to maximize the expected value of revenue obtained from loading customer trains. To this end, a large surface coal mine in North America is considered as a case study. A data-driven modeling framework is developed and applied to automatically generate a highly detailed simulation model of the mine in Arena. The framework comprises a formal information model based on the Unified Modeling Language (UML), used to input the mine's structural and production information, and Petri net-based model generation procedures that automatically generate the simulation model from the full set of simulation inputs. Factors encountered in material handling operations that may affect the robustness of revenue are then classified into 1) controllable and 2) uncontrollable categories. The controllable factors are the trucks locked to routes; the uncontrollable factors are the inverses of the summed truck haul, shovel loading, and truck dumping times for each route. Historical production data of the mine, contained in a data warehouse, is used to derive probability distributions for the uncontrollable factors. The data warehouse is implemented in Microsoft SQL and contains snapshots of historical equipment statuses and production outputs taken at regular intervals in each shift of the mine.

Response Surface Methodology is applied to derive an expression for the variance of revenue as a function of the controllable and uncontrollable factors. More specifically, 1) first-order and second-order effects for the controllable factors, 2) first-order effects for the uncontrollable factors, and 3) two-factor interactions between the controllable and uncontrollable factors are considered. The Latin Hypercube Sampling method is applied to set the controllable factors and the means of the uncontrollable factors, and the Common Random Numbers method is applied to generate the sequence of pseudo-random numbers for the uncontrollable factors in the simulation experiments, providing variance reduction between different design points of the metamodel. The variance expression of the metamodel is validated using leave-one-out cross-validation. It is then applied as an additional constraint in the mathematical formulation used to maximize revenue in the simulation model with OptQuest; the decision variables in this formulation are the truck locks only. Revenue is a function of the actual quality of coal delivered to each customer and the corresponding quality specifications for premiums and penalties. OptQuest is an optimization add-on for Arena that uses Tabu search and Scatter search algorithms to arrive at the optimal solution. The upper bound on the variance constraint is varied to obtain different pairs of expected value and variance of the optimal revenue. A comparison with results obtained using OptQuest with random sampling and without the variance expression of the metamodel shows that the proposed approach yields a decision-variable set with not only a higher expected value but also a narrower confidence interval for the optimal revenue.

To the best of our knowledge, this research makes two major contributions: 1) it is theoretically demonstrated, using 2-point and orthonormal k-point response surfaces, that Common Random Numbers reduces the error in estimating the variance of the metamodel of the simulation model; and 2) a data-driven modeling and simulation framework is proposed for automatically generating discrete-event simulation models of large surface coal mines, reducing the modeling time, expenditure, and human errors associated with manual development.
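As a schematic illustration of the experimental design and metamodel fitting steps summarized in this abstract (a minimal sketch with a toy revenue function, not the mine model or the Arena/OptQuest setup), the following shows a Latin Hypercube design over two controllable factors, Common Random Numbers shared across design points, a second-order response surface fit, and leave-one-out cross-validation of the fitted metamodel. All function names, parameters, and the revenue function are assumptions made only for this example.

```python
# Minimal sketch: LHS design, Common Random Numbers, second-order response
# surface fit, and leave-one-out cross-validation (illustrative assumptions).
import numpy as np
from scipy.stats import qmc
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_score

def simulate_revenue(controllable, seed):
    """Toy stand-in for one simulation replication of revenue."""
    rng = np.random.default_rng(seed)   # CRN: same seed -> same noise stream
    uncontrollable = rng.normal(1.0, 0.1, size=2)
    x1, x2 = controllable
    return 100 * x1 + 80 * x2 - 30 * x1 * x2 - 20 * x1 ** 2 + 15 * uncontrollable.sum()

# Latin Hypercube design over two controllable factors scaled to [0, 1]^2.
design = qmc.LatinHypercube(d=2, seed=0).random(n=30)

# Common Random Numbers: reuse the same seeds at every design point so that
# differences in output reflect the factors rather than the sampled noise.
crn_seeds = range(10)
y = np.array([np.mean([simulate_revenue(x, s) for s in crn_seeds])
              for x in design])

# Second-order response surface: first-order, interaction, and quadratic terms.
quad = PolynomialFeatures(degree=2, include_bias=False)
Z = quad.fit_transform(design)
rsm = LinearRegression().fit(Z, y)

# Leave-one-out cross-validation of the metamodel's predictive error.
loo_mse = -cross_val_score(LinearRegression(), Z, y,
                           scoring="neg_mean_squared_error",
                           cv=LeaveOneOut()).mean()
print(f"LOOCV MSE: {loo_mse:.3f}")
```

In the thesis the analogous metamodel is of the revenue variance and is then imposed as a constraint in the OptQuest optimization; this sketch only covers the design, CRN, fitting, and validation mechanics.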
