11

Evaluating Lean Manufacturing Proposals through Discrete Event Simulation – A Case Study at Alfa Laval

Detjens, Sönke, Flores, Erik January 2013 (has links)
In their pursuit of success in competitive markets, companies often turn to the Lean philosophy. For many companies, however, Lean benefits are hard to substantiate, especially when their ventures have already succeeded through traditional manufacturing approaches. Traditional Lean tools analyze current situations or support Lean implementation. Production facilities therefore require tools that enhance the evaluation of Lean proposals in such a way that decisions are supported by quantitative data rather than gut feeling alone. This thesis proposes how Discrete Event Simulation may be used as an evaluation tool in production process improvement to decide which proposal best suits Lean requirements. Theoretical and empirical studies were carried out. A literature review helped define the problem. A case study was performed at Alfa Laval to investigate, through a holistic approach, how and why this tool provides a solution to the research questions. The case study analysis was substantiated with Discrete Event Simulation models for the evaluation of current and future state Lean proposals. Results of this study show that Discrete Event Simulation was not designed as, and does not function as, a Lean-specific tool. The use of Discrete Event Simulation in Lean assessment applications requires the organization to understand the principles of Lean and its desired effects. However, traditional static Lean tools such as Value Stream Mapping and dynamic Discrete Event Simulation complement each other in a variety of ways. Discrete Event Simulation is uniquely able to account for process variability and randomness. Both the measurement and the reduction of variability through simulation provide insight into Lean implementation strategies.
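The abstract's central point — that Discrete Event Simulation captures process variability that static tools such as Value Stream Mapping cannot — can be illustrated with a minimal sketch (a hypothetical single-station line, not the thesis's actual Alfa Laval model). Two stations with the same average service time but different variability produce very different cycle times:

```python
import random

def simulate_line(n_jobs, interarrival, service_time, seed=1):
    """Minimal single-station DES: jobs arrive at fixed intervals and are
    served FIFO; returns the mean cycle time (wait + service)."""
    rng = random.Random(seed)
    t_arrive = 0.0
    server_free_at = 0.0
    total_cycle = 0.0
    for _ in range(n_jobs):
        t_arrive += interarrival
        start = max(t_arrive, server_free_at)       # wait if station is busy
        server_free_at = start + service_time(rng)
        total_cycle += server_free_at - t_arrive    # this job's cycle time
    return total_cycle / n_jobs

# Same mean service time (0.9), same 90% utilisation, different variability:
det = simulate_line(10_000, 1.0, lambda r: 0.9)                    # deterministic
var = simulate_line(10_000, 1.0, lambda r: r.expovariate(1 / 0.9)) # exponential
```

With deterministic service the cycle time stays at 0.9 (no queueing at all); with exponential service the same line develops substantial queues — precisely the dynamic effect a static value-stream map averages away.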
12

PRODUCTION AND DISTRIBUTION PLANNING FOR DYNAMIC SUPPLY CHAINS USING MULTI-RESOLUTION HYBRID MODELS

Venkateswaran, Jayendran January 2005 (has links)
Today, there is little understanding of how local decisions and disturbances impact the global performance of the supply chain. In this research, we attempt to gain insight into such relationships using multi-resolution hybrid models. To this end, a novel hybrid architecture and methodology consisting of simulation (system dynamics and discrete-event) and optimization modules is proposed. The proposed methodology, applicable to general supply chains, is divided into four stages: plan stability analysis (Stage I), plan optimization (Stage II), schedule optimization (Stage III) and concurrent decision evaluation (Stage IV). Functional and process models of the proposed architecture are specified using formal IDEF tools. A realistic three-echelon conjoined supply chain system characterized by communicative and collaborative (VMI) configurations is analyzed in this research. Comprehensive SD models of each player in the supply chain have been developed. General conditions for stability (settings of control parameters that produce a stable response) are derived using z-transform techniques (Stage I), and insights into the behavior of the supply chain are gained. Next, a novel method for integrating the stability analysis with performance analysis (optimization) is presented (Stage II), employing the derived stability conditions as additional constraints within the optimization models. In Stage III, scheduling at each chain partner is addressed using discrete-event simulation (DES) modeling techniques. In Stage IV, the optimality of the SD control parameters (from Stage II) and the DES operational policies (from Stage III) for each member is concurrently evaluated by integrating the SD and DES models. The evaluation in Stage IV is performed to better understand the global consequences of the locally optimal decisions determined at each supply chain member. 
A generic infrastructure has been developed using the High Level Architecture (HLA) to integrate the distributed decision and simulation models. Experiments are conducted to demonstrate the proposed architecture for the analysis of distributed supply chains. The progression of the cost-based objective function from Stages I-III is compared with that from the concurrent evaluation in Stage IV. The ability of the proposed methodology to capture the effect of dynamic perturbations within the supply chain system is also illustrated.
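The kind of stability condition derived in Stage I can be illustrated with a toy discrete-time replenishment loop (a hypothetical linear sketch, not the thesis's SD models): whether the inventory response settles or oscillates without bound depends on the feedback gain — exactly the kind of control-parameter boundary a z-transform analysis characterises.

```python
def simulate_inventory(gain, periods=60, target=100.0, demand=10.0, lead_time=2):
    """Discrete-time replenishment loop with an order lead time: each period
    we receive the oldest outstanding order, ship demand, and place an order
    of demand plus gain * (target - inventory).  Linear sketch, so orders
    may go negative.  Returns the inventory trajectory."""
    inv = target - 20.0                      # start with a 20-unit shortfall
    pipeline = [demand] * lead_time          # orders already in transit
    traj = []
    for _ in range(periods):
        inv += pipeline.pop(0) - demand      # receive oldest order, ship demand
        pipeline.append(demand + gain * (target - inv))
        traj.append(inv)
    return traj

stable   = simulate_inventory(gain=0.3)   # settles back toward the target
unstable = simulate_inventory(gain=1.8)   # oscillations grow without bound
```

The deviation e_t from target obeys e_{t+1} = e_t - gain * e_{t-L}, so the characteristic roots of z^{L+1} - z^L + gain determine stability: for L = 2 a gain of 0.3 keeps all roots inside the unit circle, while 1.8 pushes a complex pair outside it.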
13

MDRIP: A Hybrid Approach to Parallelisation of Discrete Event Simulation

Chao, Daphne (Yu Fen) January 2006 (has links)
The research project reported in this thesis considers Multiple Distributed Replications in Parallel (MDRIP), a hybrid approach to the parallelisation of quantitative stochastic discrete-event simulation. Parallel Discrete-Event Simulation (PDES) generally covers distributed simulation or simulation with replicated trials. Distributed simulation requires model partitioning and synchronisation among submodels. Simulation with replicated trials can be executed on-line by applying Multiple Replications in Parallel (MRIP). MDRIP has been proposed to overcome problems related to the large size and complexity of simulated models, as well as the problem of controlling the accuracy of the final simulation results. A survey of PDES investigates several primary issues directly related to the parallelisation of DES; a secondary issue related to implementation efficiency is also covered, along with statistical analysis as a supporting issue. The AKAROA2 package is an implementation that makes this statistical support effortless. Existing solutions proposed for PDES have focused exclusively on collecting output data during simulation and analysing these data once the simulation is finished. Such off-line statistical analysis of output data offers no control over the statistical errors of the final estimates. On-line control of statistical errors during simulation has been successfully implemented in AKAROA2, an automated controller of output data analysis during simulation executed in MRIP. However, AKAROA2 cannot be applied directly to distributed simulation. This thesis reports the results of a research project aimed at employing AKAROA2 to launch multiple replications of distributed simulation models and to provide on-line sequential control of the statistical errors associated with a distributed performance measure, i.e. a performance measure which depends on output data generated by a number of submodels of a distributed simulation. 
We report the changes required in the architecture of AKAROA2 to make MDRIP possible. A new MDRIP-related component of AKAROA2, a distributed simulation engine (mdrip engine), is introduced. Stochastic simulation in its MDRIP version, as implemented in AKAROA2, has been tested in a number of simulation scenarios. We discuss two specific simulation models employed in our tests: (i) a model consisting of independent queues, and (ii) a queueing network consisting of a tandem connection of queueing systems. In the first case, we examine the correctness of message ordering for the distributed messages. In the second case, we examine the correctness of output data analysis when the analysed performance measures require data from all submodels of a given (distributed) simulation model. Our tests confirm the correctness of our mdrip engine design in the cases considered, i.e. in models in which causality errors do not occur. However, we argue that the same design principles should be applicable to distributed simulation models with (potential) causality errors.
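The MRIP idea that AKAROA2 automates — launch independent replications and stop sequentially once the confidence interval is tight enough — can be sketched as follows (a hypothetical illustration estimating the mean wait in an M/M/1 queue, not AKAROA2's actual engine):

```python
import math
import random
import statistics

def one_replication(seed, n_customers=2000, lam=1.0, mu=1.25):
    """One independent replication: mean waiting time in an M/M/1 queue
    via the Lindley recursion (lam = arrival rate, mu = service rate)."""
    rng = random.Random(seed)
    wait, total, prev_service = 0.0, 0.0, None
    for _ in range(n_customers):
        interarrival = rng.expovariate(lam)
        if prev_service is not None:
            wait = max(0.0, wait + prev_service - interarrival)
        total += wait
        prev_service = rng.expovariate(mu)
    return total / n_customers

def mrip_estimate(rel_precision=0.1, z=1.96, min_reps=5, max_reps=200):
    """Sequential MRIP-style control: keep adding replications until the
    confidence-interval half-width falls within rel_precision of the mean."""
    means = []
    for rep in range(1, max_reps + 1):
        means.append(one_replication(seed=rep))
        if rep >= min_reps:
            mean = statistics.fmean(means)
            half_width = z * statistics.stdev(means) / math.sqrt(rep)
            if half_width <= rel_precision * mean:
                return mean, half_width, rep
    return mean, half_width, rep
```

For lam = 1 and mu = 1.25 the steady-state mean wait is lam / (mu * (mu - lam)) = 3.2; the sequential rule stops after a modest number of replications. This is off-line control made sequential; the point of MDRIP is to apply the same stopping rule when each replication is itself a distributed simulation.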
14

Insight generation in simulation studies : an empirical exploration

Gogi, Anastasia January 2016 (has links)
This thesis presents empirical research that aims to explore insight generation in discrete-event simulation (DES) studies. It is often claimed that simulation is useful for generating insights; there is, however, almost no empirical evidence to support this claim. The factors of a simulation intervention that affect the occurrence of insight are not clear. A specific claim is that watching the animated display of a simulation model is more helpful in making better decisions than relying on the statistical outcomes generated from simulation runs; but again, there is very limited evidence to support this. To address this dearth of evidence, two studies were implemented: a quantitative and a qualitative study. In the former, a laboratory-based experimental study was used, in which undergraduate students were placed in three separate groups and given a task to solve using a model with only animation, a model with only statistical results, or no model at all. In the qualitative study, semi-structured interviews with simulation consultants were carried out, in which participants were asked to recount examples of projects where clients changed their problem understanding and generated more effective ideas. The two separate parts of the study found different types of evidence to support the claim that simulation generates insight. The experimental study suggests that insights are generated more rapidly from statistical results than from animation. 
Research outcomes from the interviews include descriptions of: the phase of a simulation study in which insight emerges; the role of the different methods applied and means used in discovering and overcoming discontinuity in thinking (for instance, the role of the consultant's influence on problem understanding); how some factors of a simulation intervention are associated with the processes of uncovering and overcoming discontinuity in thinking (for example, the role of the client's team in the selection of methods used to communicate results); and the role of the model and consultant in generating new ideas. This thesis contributes to the limited existing literature by providing a more in-depth understanding of insight in the context of simulation, and empirical evidence on the insight-enabling benefits of simulation based on an operational definition. The findings of the study provide new insights into the factors of simulation that support fast and creative problem solving.
15

Examining the Impact of Experimental Design Strategies on the Predictive Accuracy of Quantile Regression Metamodels for Computer Simulations of Manufacturing Systems

January 2016 (has links)
This thesis explores the impact of different experimental design strategies on the development of quantile regression based metamodels of computer simulations. The objective of this research is to compare the resulting predictive accuracy of five experimental design strategies, each of which is used to develop metamodels of a computer simulation of a semiconductor manufacturing facility. The five examined experimental design strategies include two traditional strategies, sphere packing and I-optimal, along with three hybrid strategies developed for this research, which combine desirable properties of the more traditional approaches. The three hybrid design strategies are: arbitrary, centroid clustering, and clustering hybrid. Each strategy is analyzed and compared over a common experimental design space, which includes the investigation of four densities of design-point placement within three different experimental regions to predict four different percentiles of the cycle time distribution of a semiconductor manufacturing facility. Results confirm that the predictive accuracy of quantile regression metamodels depends on both the location and the density of the design points placed in the experimental region. They also show that the sphere packing design strategy has the best overall performance in terms of predictive accuracy. However, the centroid clustering hybrid design strategy, developed for this research, has the best predictive accuracy when only a small number of simulation resources are available from which to develop a quantile regression metamodel. / Dissertation/Thesis / Masters Thesis Engineering 2016
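Quantile regression fits a chosen percentile rather than the mean by minimising the pinball (check) loss. A small self-contained sketch (hypothetical linear data, not the thesis's semiconductor simulation) fitting the 90th percentile by subgradient descent:

```python
import random

def fit_quantile(xs, ys, tau, lr=0.01, epochs=2000):
    """Fit y ~ a + b*x for the tau-th quantile by subgradient descent on
    the pinball loss: tau*r if r > 0 else (tau - 1)*r, with r = y - pred."""
    a, b, n = 0.0, 0.0, len(xs)
    for _ in range(epochs):
        grad_a = grad_b = 0.0
        for x, y in zip(xs, ys):
            r = y - (a + b * x)
            g = -tau if r > 0 else (1.0 - tau)   # d(loss)/d(prediction)
            grad_a += g
            grad_b += g * x
        a -= lr * grad_a / n
        b -= lr * grad_b / n
    return a, b

rng = random.Random(0)
xs = [rng.uniform(0.0, 10.0) for _ in range(500)]
ys = [2.0 + 0.5 * x + rng.gauss(0.0, 1.0) for x in xs]
a, b = fit_quantile(xs, ys, tau=0.9)
covered = sum(y <= a + b * x for x, y in zip(xs, ys)) / len(xs)
# roughly 90% of the training points fall below the fitted 0.9-quantile line
```

The design-strategy question the thesis studies is where to place the (x, y) observations — here drawn uniformly — so that a metamodel like this predicts the target percentile accurately across the region.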
16

The Effect of Heterogeneous Servers on the Service Level Predicted by Erlang-A

Griffith, Edward Shane 19 May 2011 (has links)
Thousands of call centers operate in the United States, employing millions of people. Since personnel costs represent as much as 80% of the total operating expense of these centers, it is important for call center managers to determine the staffing level required to maintain the desired operating performance. Historically, queueing models have served an important role in this regard, the most commonly used being the Erlang-C model. The Erlang-C model has several assumptions, however, which are required for the predicted performance measures to be valid. One assumption that has received significant attention from researchers is that callers have infinite patience and will not terminate a call until service is complete, regardless of the wait time. Since this assumption is unlikely to hold in reality, researchers have suggested using Erlang-A instead. Erlang-A does consider caller patience and allows for calls to be abandoned prior to receiving service. However, the use of Erlang-A still requires an assumption that is very unlikely to hold in practice: that all agents provide service at the same rate. Discrete event simulation is used to examine the effects of agent heterogeneity on the operating performance of a call center compared to the theoretical performance measures obtained from Erlang-A. Based on the simulation results, it is concluded that variability in agent service rate does not materially affect call center performance, except to the extent that the variability changes the average handle time of the call center weighted by the number of calls handled rather than by agent. This is true regardless of call center size, the degree of agent heterogeneity, and the distribution shape of agent variability. The implication for researchers is that it is unnecessary to search for an analytic solution that relaxes the Erlang-A assumption that agents provide service at the same rate. 
Several implications for managers are discussed, including the reliability of using Erlang-A to determine staffing levels, the importance of considering the service rates of individual agents rather than the average handle time, and the unintended consequences of call routing schemes that route calls to faster rather than slower agents.
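The Erlang-C quantities the abstract refers to follow from standard textbook formulas; a minimal sketch (not the thesis's code — Erlang-A adds an abandonment rate on top of this baseline):

```python
import math

def erlang_c(n_agents, offered_load):
    """Erlang-C probability that an arriving call must wait, for an
    M/M/n queue with offered_load = lambda/mu (in Erlangs)."""
    a = offered_load
    if a >= n_agents:
        return 1.0                                # unstable: everyone waits
    top = a ** n_agents / math.factorial(n_agents) * n_agents / (n_agents - a)
    series = sum(a ** k / math.factorial(k) for k in range(n_agents))
    return top / (series + top)

def average_speed_of_answer(n_agents, lam, mu):
    """Mean wait in queue for M/M/n: P(wait) / (n*mu - lambda)."""
    return erlang_c(n_agents, lam / mu) / (n_agents * mu - lam)
```

A standard sanity check: with 2 agents and 1 Erlang of offered load, the probability of waiting is exactly 1/3, and adding an agent always lowers it.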
17

Impact evaluation of an automatic identification technology on inventory management : A simulation approach with the focus on RFID

Petersson, Martin January 2020 (has links)
Automatic identification systems are a prominent technology used in warehouses to give managers real-time information about their products and to assist warehouse employees in keeping an accurate inventory record. This kind of assistance is needed because an inaccurate inventory leads to profit loss through misplacement and other mistakes. This project was carried out in cooperation with Stora Enso, specifically one of their forest nurseries, to find a solution to improve their inventory management system. Their current inventory system is a manual process, which leads to mistakes that affect inventory accuracy. This thesis project evaluates automatic identification systems to determine whether the technology is a possible solution, and aims to answer the research question "What are the significant impacts an automatic identification system has on an inventory management system?". From this evaluation, radio frequency identification (RFID) is selected for further study on account of its advantages. To evaluate RFID in a warehouse setting, a discrete-event simulation of the forest nursery's warehouse was created. The simulation is then used to evaluate the impact of different RFID implementations and their respective costs. The simulation results show that even a simple RFID implementation can improve inventory accuracy and remove some of the mistakes of a manual system at a relatively low direct cost. They also show that a full RFID implementation, giving full visibility of the warehouse, can almost eliminate inventory mistakes; the cost analysis, however, shows that it requires a large investment.
18

Current State Simulation Scope of Improvement and Forecast Demand Analysis at AstraZeneca using Discrete Event Simulation.

Kasula, Siva Sai Krishna January 2020 (has links)
In this rapidly changing product demand market, pharmaceutical companies have adapted their production systems to be more flexible and agile. In order to meet demand, production lines need to be more efficient and effective; even a small improvement is a great achievement, as these production lines are designed to produce large volumes of medicines. Testing the efficiency and effectiveness of the lines by analyzing production data would be time-consuming and would require the involvement of experts from different departments. When production lines are subject to change, the previous analysis is no longer valid and must be repeated. Instead, this can be replaced with discrete event simulation (DES) analysis. DES is one of the key technologies for developing a production system in this Industry 4.0 era. As production systems become more and more complicated, it becomes difficult to understand and analyze the behavior of the system when changes are introduced. Simulation is the right technology for analyzing and understanding the behavior of the real system as it undergoes small or large changes. The purpose of this case study is to use DES, with ExtendSim as the simulation tool, at the case company to develop a virtual model of a production system containing five production lines, in order to understand and analyze the behavior of the production lines, identify possible improvements, and evaluate the feasibility of the production system achieving the forecasted demand. Possible improvements are identified from the simulation results of the current state model, and a future state simulation model is developed incorporating the improvements. Furthermore, this future state simulation model is used to analyze the feasibility of the production lines meeting the forecasted demand. Developing the simulation model revealed that the production lines were less efficient and more underutilized than the company had assumed.
19

Improving Patients' Experience in an Emergency Department Using a Systems Engineering Approach

Khazaei, Hosein 08 1900 (has links)
Indiana University-Purdue University Indianapolis (IUPUI) / The healthcare industry in the United States of America faces a big paradox. Although the US is a leader in medical devices, medical practice and medical research, there is not enough satisfaction with, or quality in, the performance of US healthcare operations. Despite the big investments and budgets associated with US healthcare, there are serious threats on the operational side that reduce the quality of care. In this research study, a step-by-step systems engineering approach is applied to improve the healthcare delivery process in the Emergency Department (ED) of a hospital located in Indianapolis, Indiana. Different types of systems engineering tools and techniques are used to improve the quality of care and patient satisfaction in the ED of Eskenazi hospital. A simulation model helps to provide a better understanding of the ED process and to learn more about its bottlenecks. The simulation model is verified and validated using different techniques, such as applying extreme and moderate conditions and comparing model results with historical data. Four different what-if scenarios are proposed and tested to explore possible improvements in length of stay (LOS). Additionally, those scenarios are tested at both the regular and an increased patient arrival rate. The optimal what-if scenario can reduce the LOS by 37 minutes compared to the current ED setting. At the increased arrival rate, patients may stay in the ED for up to 6 hours under the current setting; with the proposed ED setting, however, patients spend only an additional 106 minutes compared to the regular patient arrival rate.
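The staffing what-if comparison described above can be sketched with a toy M/M/c model of an ED (hypothetical rates, not the Eskenazi data): adding providers shortens the mean length of stay.

```python
import heapq
import random

def mean_los(n_providers, arrivals_per_hr, treat_hrs, n_patients=20000, seed=7):
    """Toy M/M/c ED model: FIFO patients, exponential interarrival and
    treatment times; returns mean length of stay (wait + treatment), hours."""
    rng = random.Random(seed)
    free_at = [0.0] * n_providers          # earliest time each provider frees up
    heapq.heapify(free_at)
    now, total = 0.0, 0.0
    for _ in range(n_patients):
        now += rng.expovariate(arrivals_per_hr)
        start = max(now, heapq.heappop(free_at))   # wait for earliest provider
        done = start + rng.expovariate(1.0 / treat_hrs)
        heapq.heappush(free_at, done)
        total += done - now
    return total / n_patients

baseline = mean_los(6, arrivals_per_hr=10, treat_hrs=0.5)   # ~83% utilisation
what_if  = mean_los(8, arrivals_per_hr=10, treat_hrs=0.5)   # ~63% utilisation
```

Popping the provider with the earliest free time is equivalent to a FIFO queue in front of a shared pool, so the sketch captures the basic LOS-versus-staffing trade-off that the thesis's far more detailed ED model explores.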
20

Developing An Object-oriented Approach For Operations Simulation In Speedes

Wasadikar, Amit 01 January 2005 (has links)
Using simulation techniques, the performance of any proposed system can be tested for different scenarios with a generated model. However, it is difficult to rapidly create simulation models that accurately represent the complexity of a system. In recent years, Object-Oriented Discrete-Event Simulation has emerged as a promising technology for implementing rapid simulation schemes. A number of software packages based on programming languages such as C++ and Java are available for carrying out Object-Oriented Discrete-Event Simulation. These packages establish a general framework for simulation in computer programs, but need to be further customized for the desired end-use applications. In this thesis, a generic simulation library is created for the distributed Synchronous Parallel Environment for Emulation and Discrete-Event Simulation (SPEEDES). This library offers classes to model the functionality of servers, processes, resources, transporters, and decisions. The library is expected to produce efficient simulation models in less time and with less coding. The class hierarchy is modeled using the Unified Modeling Language (UML). To test the library, the existing SPEEDES Space Shuttle model is enhanced and recreated. The enhanced model is successfully validated against the original Arena model.
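A library of this kind typically wraps a time-ordered event list behind server/resource classes. A minimal object-oriented sketch of that pattern (a generic illustration only; SPEEDES itself is a C++ environment and its API differs):

```python
import heapq
import itertools

class Engine:
    """Minimal event-scheduling kernel: a time-ordered event list
    and a run loop."""
    def __init__(self):
        self.now = 0.0
        self._queue = []
        self._tie = itertools.count()     # tie-breaker for equal event times
    def schedule(self, delay, action):
        heapq.heappush(self._queue, (self.now + delay, next(self._tie), action))
    def run(self, until):
        while self._queue and self._queue[0][0] <= until:
            self.now, _, action = heapq.heappop(self._queue)
            action()

class Server:
    """Single server with a FIFO queue; counts completed jobs."""
    def __init__(self, engine, service_time):
        self.engine, self.service_time = engine, service_time
        self.busy, self.queue, self.completed = False, [], 0
    def arrive(self):
        if self.busy:
            self.queue.append(None)       # job waits in FIFO order
        else:
            self._start()
    def _start(self):
        self.busy = True
        self.engine.schedule(self.service_time, self._finish)
    def _finish(self):
        self.completed += 1
        if self.queue:
            self.queue.pop(0)
            self._start()                 # pull the next waiting job
        else:
            self.busy = False

eng = Engine()
srv = Server(eng, service_time=2.0)
for i in range(5):
    eng.schedule(i * 1.0, srv.arrive)     # arrivals at t = 0, 1, 2, 3, 4
eng.run(until=100.0)
# jobs arrive faster than the server works, queue up, and all complete by t = 10
```

A real library of the kind the thesis describes would add Resource, Transporter and Decision classes on the same kernel, and a parallel engine would additionally synchronise event times across distributed submodels.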
