111

Optimalizace materiálového toku v hromadné výrobě simulačními metodami / Optimization of Material Flow in Mass Production by Means of Simulation Methods

Hloska, Jiří January 2015 (has links)
The aim of the PhD thesis is to design a methodology for generating a material flow using a simulation meta-model of a mass production process. This methodology is based in principle on the relationship between selected material flow characteristics. Simulation of production and logistics processes is increasingly used in the planning, commissioning, and subsequent operational management and optimization of the respective technological operations, particularly in mass production. The first part of the PhD thesis summarizes current findings in the field of discrete event simulation of material flow and the related statistical and mathematical disciplines, as well as the information technology that enables effective realization of simulation studies. Attention is also paid to significant domestic and international conferences, symposia, and interest associations related to the simulation of manufacturing processes. The next part of the PhD thesis presents the methodology of reconstructing and generating material flow using simulation meta-models developed for this purpose. The principles of the algorithms used by these meta-models, and their possible range of application, are demonstrated through simulation experiments, which are described and discussed. Special focus is placed on the selection of significant material flow characteristics and their mutual relationship. To evaluate this relationship, a series of simulation experiments was conducted using a simulation model of a closed queuing system with variable parameters. The revealed interdependence between the selected material flow characteristics is then experimentally verified using a detailed simulation model of a selected mass production system. The conclusion of the PhD thesis summarizes the findings and, with regard to the proposed methodology for reconstructing and generating material flow, outlines possible further steps in both research and practical application.
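The closed queuing system with variable parameters that the abstract mentions can be illustrated with a minimal discrete-event sketch. The model below is a generic assumption, not the thesis's actual meta-model: a fixed number of pallets circulate through single-server stations with exponential service times, and the throughput/WIP relationship (one pairing of material flow characteristics) emerges from the simulation.

```python
import heapq
import random

def simulate_closed_loop(n_items, service_means, horizon=10_000.0, seed=1):
    """Minimal DES of a closed queuing loop: n_items pallets circulate
    through len(service_means) single-server stations in a fixed cycle.
    Service times are exponential with the given means.  Returns the
    throughput (completed cycles per time unit).  Illustrative only."""
    rng = random.Random(seed)
    n_st = len(service_means)
    queues = [[] for _ in range(n_st)]   # pallets waiting at each station
    busy = [False] * n_st
    events = []                          # heap of (time, station) completions
    completions = 0
    for _ in range(n_items):             # all pallets start at station 0
        queues[0].append(None)

    def try_start(t, s):
        # start service at station s if it is idle and a pallet is waiting
        if not busy[s] and queues[s]:
            queues[s].pop()
            busy[s] = True
            heapq.heappush(events, (t + rng.expovariate(1.0 / service_means[s]), s))

    try_start(0.0, 0)
    while events:
        t, s = heapq.heappop(events)
        if t > horizon:
            break
        busy[s] = False
        nxt = (s + 1) % n_st
        if nxt == 0:                     # pallet finished a full cycle
            completions += 1
        queues[nxt].append(None)
        try_start(t, nxt)                # downstream station may start
        try_start(t, s)                  # freed station may start next pallet
    return completions / horizon
```

Raising `n_items` increases throughput toward the bottleneck rate, which is exactly the kind of interdependence between material flow characteristics such a model makes visible.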
112

Redesigning an automated production cell for increased efficiency : Discrete-Event Simulation as a tool to evaluate system designs

Paillon, Noah, Reimer, William January 2023 (has links)
Date: 4th June 2023
Level: Master Thesis in Product- and Process Development, 30 ECTS
Authors: William Reimer, Noah Paillon
Title: Redesigning an automated production cell for increased efficiency - Discrete-Event Simulation as a tool to evaluate system designs
Keywords: Discrete Event Simulation (DES), Redesign, Production process, Automated Manufacturing System (AMS), Automated production cell, Robotics in production, Simulation, Key Performance Indicators (KPIs), Decision-making, Redesigning production systems, Bottlenecks, Deadlocks, Production system development (PSD)
Aim: The aim of this study is to evaluate and redesign an automated production cell at a Swedish manufacturing plant using DES with KPIs, with the goal of increasing efficiency.
Research questions: "When redesigning, what factors influence production efficiency within an automated production cell?", "What are the challenges when redesigning and increasing efficiency within an automated production cell?" and "How can such challenges be surpassed through the usage of DES with KPIs?"
Methodology: This study is quantitative research with an inductive approach, using simulation as an experimental tool. To identify reliable knowledge, the literature review compared books and scientific publications. The empirical results were obtained through observation and data collection at the case company's production site, which also involved frequent meetings with company supervisors and production engineers. To assess the situation and draw conclusions, the empirical results were evaluated and experimented on using DES.
Conclusion: One conclusion drawn was the need to identify a problem's underlying causes in order to improve efficiency; this was also noted as one of the primary challenges of PSD when redesigning an automated production cell. This implies that to increase production efficiency, bottlenecks and deadlocks must be appropriately addressed and adjusted to meet production demands. DES with KPIs is considered a useful approach for improving the production efficiency of an automated production cell.
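The bottleneck reasoning behind a DES-with-KPIs study can be sketched with a back-of-envelope capacity calculation. The station names, cycle times, and availability figures below are invented for illustration and do not come from the thesis; a full DES would refine these static estimates with queueing and blocking effects.

```python
def cell_kpis(stations, demand_rate):
    """Static KPI sketch for a serial automated cell.

    stations: list of (name, cycle_time_s, availability) tuples
    demand_rate: required output in parts per second
    Returns per-station KPIs, the bottleneck station, and the line's
    effective capacity (limited by its slowest effective station)."""
    report = {}
    for name, cycle_time, avail in stations:
        capacity = avail / cycle_time            # parts/s the station can sustain
        report[name] = {
            "capacity": capacity,
            "utilization": min(1.0, demand_rate / capacity),
        }
    bottleneck = min(report, key=lambda n: report[n]["capacity"])
    return report, bottleneck, report[bottleneck]["capacity"]

# hypothetical cell: robot load, weld, and inspection stations
stations = [("robot", 30.0, 0.95), ("weld", 45.0, 0.90), ("inspect", 20.0, 0.99)]
report, bottleneck, line_capacity = cell_kpis(stations, demand_rate=0.015)
```

Here the weld station's effective rate (0.90 / 45 s = 0.02 parts/s) caps the line, so redesign effort aimed anywhere else would not raise output — the point the abstract makes about addressing a problem's underlying causes.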
113

Analyzing and Evaluating the Resilience of Scheduling Scientific Applications on High Performance Computing Systems using a Simulation-based Methodology

Sukhija, Nitin 09 May 2015 (has links)
Large scale systems provide a powerful computing platform for solving large and complex scientific applications. However, the inherent complexity, heterogeneity, wide distribution, and dynamism of the computing environments can lead to performance degradation of the scientific applications executing on these computing systems. Load imbalance arising from a variety of sources such as application, algorithmic, and systemic variations is one of the major contributors to their performance degradation. In general, load balancing is achieved via scheduling. Moreover, frequently occurring resource failures drastically affect the execution of applications running on high performance computing systems. Therefore, the study of deploying support for integrated scheduling and fault-tolerance mechanisms for guaranteeing that applications deployed on computing systems are resilient to failures becomes of paramount importance. Recently, several research initiatives have started to address the issue of resilience. However, the major focus of these efforts was geared more toward achieving system level resilience with less emphasis on achieving resilience at the application level. Therefore, it is increasingly important to extend the concept of resilience to the scheduling techniques at the application level for establishing a holistic approach that addresses the performability of these applications on high performance computing systems. This can be achieved by developing a comprehensive modeling framework that can be used to evaluate the resiliency of such techniques on heterogeneous computing systems for assessing the impact of failures as well as workloads in an integrated way. This dissertation presents an experimental methodology based on discrete event simulation for the analysis and the evaluation of the resilience of scheduling scientific applications on high performance computing systems. 
With the aid of this methodology, a wide class of dependencies between application and computing system is captured within a deterministic model for quantifying the performance impact expected from changes in application and system characteristics. The results obtained with the proposed simulation-based performance prediction framework enable an introspective design and investigation of scheduling heuristics, and support reasoning about how best to optimize various, often antagonistic, objectives such as minimizing application makespan and maximizing reliability.
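The tension between makespan and reliability that the abstract calls "antagonistic objectives" can be shown with a tiny evaluation function. The exponential failure model and the sample numbers below are generic assumptions for illustration, not the dissertation's actual framework.

```python
import math

def evaluate_schedule(assignment, task_sizes, speeds, fail_rates):
    """Score a task-to-machine assignment on two antagonistic objectives.

    assignment[i] is the machine for task i; machines have speeds and
    per-time-unit failure rates.  Reliability assumes exponential failures:
    a machine busy for time t survives with probability exp(-lambda * t)."""
    busy = [0.0] * len(speeds)
    for task, machine in enumerate(assignment):
        busy[machine] += task_sizes[task] / speeds[machine]
    makespan = max(busy)                                    # minimize
    reliability = math.exp(-sum(lam * t for lam, t in zip(fail_rates, busy)))
    return makespan, reliability                            # maximize 2nd

# hypothetical instance: machine 1 is as fast as machine 0 but failure-prone
m1, r1 = evaluate_schedule([0, 1], [10.0, 10.0], [1.0, 1.0], [0.01, 0.1])
m2, r2 = evaluate_schedule([0, 0], [10.0, 10.0], [1.0, 1.0], [0.01, 0.1])
```

Spreading the tasks halves the makespan but exposes work to the unreliable machine, while piling everything on the reliable machine does the opposite — no single assignment optimizes both objectives, which is why the dissertation's simulation-based analysis is needed.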
114

A framework for enabling operators to use simulation for continuous improvement

Alcayaga, Christian Junior, Hoffsten, Jakob January 2022 (has links)
In a time when digitalization is becoming ever more necessary, tools such as simulation are becoming standard for making fact-based decisions. Previous research leaves a gap, as the factors enabling operators to use simulation are an unexplored topic. To fill this gap and help industry, this study fulfills the purpose: Create a framework for enabling operators to use simulation for continuous improvements. The purpose is fulfilled by examining the different areas of the production, identifying challenges within the production, the information handling between departments, and the structure of the company. It was achieved through two research questions: What are the challenges in practice for enabling operators to use simulation for continuous improvements? How can these challenges be overcome, to enable operators to use simulation for continuous improvements? To answer the research questions, a case study was performed alongside a literature review. The case study consisted of interviews conducted at a case company in the automotive industry. The gathered empirical and theoretical data were then analyzed through a thematic analysis. The conclusion of this study was that, to enable operators to use simulation for continuous improvements, companies must first have the right organizational structure, in which a foundation for continuous improvement is present. The reason is that if a company wants to work with simulation tools, it must first work in an environment where suggestions and improvements are supported.
115

Delivery Performance Prediction Tool for Complex Assembly Systems

Beladi, Faried D 01 January 2014 (has links) (PDF)
Complex assembly systems are made up of hundreds, and in some cases thousands, of parts that must all be managed properly so that part arrivals coincide with the build plan and production requirements are satisfied. A major challenge faced by manufacturers of these complex systems is that many parts have long and complex supply chains, which result in long and highly variable supply lead times. The high cost and low volume make holding large stocks of these components unviable. Thus the need arises for a simulation tool that can predict when all of the required parts are ready for assembly and allow comparison of various ordering and inventory strategies. Two strategies were tested: the current practice of ordering to an agreed-upon quoted lead time, and a strategy that accounts for lead time variability through advance ordering. The results displayed the benefits of synchronizing the system through advance ordering, as a potential 60% reduction in inventory was observed. Future development of the tool would incorporate more granular steps of the build sequence, as well as the inclusion of quality non-conformance (QN) issues.
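The two ordering strategies the abstract compares can be sketched as a small Monte Carlo experiment. The normal lead-time model and all numbers below are illustrative assumptions, not the thesis's data: ordering exactly at (need date - quoted lead time) is late roughly half the time when actual lead times scatter around the quote, while ordering a buffer earlier trades lateness risk for inventory holding days.

```python
import random
import statistics

def compare_ordering(quoted_lt, lt_mean, lt_sd, advance, n=20_000, seed=7):
    """Monte Carlo sketch of two ordering strategies for one part.

    Strategy A orders at (need date - quoted_lt); strategy B orders
    `advance` days earlier to absorb lead-time variability.  Lead times
    are drawn from a truncated normal (an illustrative assumption).
    Returns (late_fraction, mean_days_held_early) for each strategy."""
    rng = random.Random(seed)

    def run(offset):
        late, held = 0, []
        for _ in range(n):
            lead_time = max(0.0, rng.gauss(lt_mean, lt_sd))
            slack = (quoted_lt + offset) - lead_time   # >0 means early
            if slack < 0:
                late += 1
            else:
                held.append(slack)                     # days in inventory
        return late / n, statistics.mean(held) if held else 0.0

    return run(0.0), run(advance)

(base_late, base_held), (adv_late, adv_held) = compare_ordering(
    quoted_lt=30, lt_mean=30, lt_sd=8, advance=10)
```

Advance ordering cuts the late fraction sharply at the cost of more holding days per part; the thesis's tool plays out this trade-off across an entire bill of materials, where synchronization across hundreds of parts is what drives the reported inventory reduction.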
116

Using Computer Simulation to Study Hospital Admission and Discharge Processes

Kim, Edwin S. 01 January 2013 (has links) (PDF)
Hospitals around the country are struggling to provide timely access to inpatient beds. We use discrete event simulation to study the inpatient admission and discharge processes in US hospitals. Demand for inpatient beds comes from two sources: the Emergency Department (ED) and elective surgeries (NonED). Bed request and discharge rates vary from hour to hour; furthermore, weekday demand is different from weekend demand. We use empirically collected data from national and local (Massachusetts) sources on different-sized community and referral hospitals, demand rates for ED and NonED patients, patient length of stay (LOS), and bed turnover times to calibrate our discrete event simulation model. In our computational experiments, we find that expanding hours of discharge, increasing the number of days per week on which elective patients are admitted, and decreasing length of stay all produced statistically significant reductions in average patient waiting time. We discuss the implications of these results in practice and list the key limitations of the model.
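The hour-to-hour variation in bed request rates that this model captures is typically generated with a nonhomogeneous Poisson process. Below is a generic thinning-algorithm sketch; the hourly rate profile is invented for illustration and is not the study's calibrated data.

```python
import random

def nhpp_arrivals(rate_by_hour, horizon_h, seed=3):
    """Generate arrival times from a nonhomogeneous Poisson process by
    thinning: propose candidates at the peak rate, then accept each with
    probability rate(t) / peak_rate.  rate_by_hour is a 24-entry list of
    arrivals per hour (repeating daily); horizon_h is in hours."""
    rng = random.Random(seed)
    lam_max = max(rate_by_hour)
    t, arrivals = 0.0, []
    while True:
        t += rng.expovariate(lam_max)            # candidate at the peak rate
        if t >= horizon_h:
            return arrivals
        if rng.random() < rate_by_hour[int(t) % 24] / lam_max:
            arrivals.append(t)                   # accepted: true arrival

# hypothetical profile: quiet overnight, bed requests peak mid-day
rates = [0.5] * 8 + [3.0] * 8 + [1.0] * 8
bed_requests = nhpp_arrivals(rates, horizon_h=24 * 10)
```

Feeding such time-varying streams (one for ED, one for NonED, with separate weekday and weekend profiles) into a bed-occupancy simulation is what lets experiments like "expand discharge hours" be evaluated against realistic demand.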
117

Surviving the Surge: Real-time Analytics in the Emergency Department

Rea, David J. 05 October 2021 (has links)
No description available.
118

Threaded WARPED: An Optimistic Parallel Discrete Event Simulator for Cluster of Multi-Core Machines

Muthalagu, Karthikeyan January 2012 (has links)
No description available.
119

Aiding Strategic and Operational Decision Making in Hospital Centralized Scheduling Through Discrete-Event Simulation

Natale, James 07 June 2013 (has links)
No description available.
120

Integrating Process Mining with Discrete-Event Simulation Modeling

Liu, Siyao 01 November 2015 (has links) (PDF)
Discrete-event simulation (DES) is an invaluable tool which organizations can use to help better understand, diagnose, and optimize their operational processes. Studies have shown that in the typical DES exercise, the greatest amount of time is spent on developing an accurate model of the process to be studied. Process mining, a similar field of study, focuses on using historical data stored in software databases to accurately recreate and analyze business processes. Utilizing process mining techniques to rapidly develop DES models can drastically reduce the time spent building simulation models, which ultimately enables organizations to more quickly identify and correct shortcomings in their operations. Although there have been significant advances in process mining research, several issues with current process mining methods still prevent widespread industry adoption. One such issue, which this study examines, is the lack of cross-compatibility between process mining tools and other process analysis tools. Specifically, this study develops and characterizes a method through which mined process models can be converted into discrete-event simulation models. The developed method utilizes a plugin written for the ProM Framework, an existing collection of process mining tools, which takes a mined process model as its input and outputs an Excel workbook providing the process data in a format more easily read by DES packages. Two event logs mimicking real-world processes were used in the development and validation of the plugin. The developed plugin successfully extracted the critical process data from the mined process model and converted it into a format more easily utilized by DES packages. Several limitations constrain model accuracy, but the plugin developed in this study shows that converting process models into basic simulation models is possible. Future research can focus on addressing these limitations to improve model accuracy.
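The handoff this thesis automates — turning mined process data into DES inputs — boils down to extracting activity durations and routing probabilities from an event log. The sketch below is a generic illustration with an invented four-row log, not the ProM plugin itself.

```python
from collections import defaultdict
from statistics import mean

def mine_des_parameters(log):
    """Extract DES inputs from an event log.

    log: iterable of (case_id, activity, start_time, end_time) rows.
    Returns (mean duration per activity, routing probabilities between
    successive activities per case) -- the two ingredients a DES package
    needs to replay the process."""
    durations = defaultdict(list)
    traces = defaultdict(list)
    for case, activity, start, end in sorted(log, key=lambda r: (r[0], r[2])):
        durations[activity].append(end - start)
        traces[case].append(activity)

    transitions = defaultdict(lambda: defaultdict(int))
    for trace in traces.values():
        for a, b in zip(trace, trace[1:]):   # successive activities in a case
            transitions[a][b] += 1
    routing = {a: {b: count / sum(nxt.values()) for b, count in nxt.items()}
               for a, nxt in transitions.items()}
    return {a: mean(d) for a, d in durations.items()}, routing

# hypothetical event log: two cases branching after "register"
log = [(1, "register", 0, 5), (1, "review", 6, 10),
       (2, "register", 0, 3), (2, "approve", 4, 8)]
mean_durations, routing = mine_des_parameters(log)
```

A real conversion must also recover arrival processes, resource pools, and duration distributions rather than means — gaps of exactly the kind the abstract lists among the limitations on model accuracy.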
