11

Profile Driven Partitioning Of Parallel Simulation Models

Alt, Aaron J. 10 October 2014 (has links)
No description available.
12

Autonomous finite capacity scheduling using biological control principles

Manyonge, Lawrence January 2012 (has links)
The vast majority of research efforts in finite capacity scheduling over the past several years have focused on generating precise, almost exact measures for the working schedule, presupposing complete information and a deterministic environment. During execution, however, production may be subject to considerable variability, which may lead to frequent schedule interruptions. Production scheduling mechanisms are typically developed on a centralised control architecture, in which the entire knowledge base and all databases are modelled at a single location. Such an architecture has difficulty handling complex manufacturing systems that require knowledge and data at different locations. Adopting biological control principles refers to developing a schedule prior to the start of processing, after considering all the parameters involved at a resource, and updating it accordingly as the process executes. This research reviews best practice in gene transcription and translation control methods and adopts these principles in the development of an autonomous finite capacity scheduling control logic aimed at reducing excessive manual input in planning tasks. With autonomous decision-making functionality, finite capacity scheduling will, as far as practicable, be able to respond autonomously to schedule disruptions by deploying proactive scheduling procedures that revise or re-optimize the schedule when unexpected events occur. The novelty of this work is the ability of production resources to take decisions autonomously, in the same way that decisions are taken by autonomous entities in gene transcription and translation. The idea has been implemented by integrating simulation and modelling techniques with Taguchi analysis to investigate the contributions of finite capacity scheduling factors and to determine the 'what if' scenarios encountered due to variability in production processes. The control logic adopts the induction rules used in the gene expression control mechanisms studied in biological systems. Scheduling factors are identified to that effect and investigated to find their effects on selected performance measurements for each resource in use. How these factors are used to deal with process variability is a major objective of this research, since it is precisely this variability that makes autonomous decision making of interest. Although different scheduling techniques have been applied successfully in production planning and control, the results obtained from including the autonomous finite capacity scheduling control logic have shown that significant improvement can still be achieved.
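As a sketch of the Taguchi analysis used here: given a two-level orthogonal array over scheduling factors, main effects on a signal-to-noise ratio are estimated per factor. The factor names, levels, and response values below are hypothetical placeholders, not the experiment from this thesis.

    # Minimal Taguchi main-effects sketch, assuming three hypothetical
    # two-level scheduling factors and invented responses (mean flow
    # time per simulated run); not the thesis's actual experiment.
    import numpy as np

    # L4 orthogonal array: rows are runs, columns are factor levels (0/1).
    L4 = np.array([[0, 0, 0],
                   [0, 1, 1],
                   [1, 0, 1],
                   [1, 1, 0]])
    factors = ["dispatch_rule", "batch_size", "buffer_limit"]   # hypothetical
    response = np.array([42.0, 39.5, 47.0, 44.5])               # invented data

    sn = -10 * np.log10(response ** 2)   # smaller-is-better S/N ratio

    for j, name in enumerate(factors):
        effect = sn[L4[:, j] == 1].mean() - sn[L4[:, j] == 0].mean()
        print(f"{name}: effect on S/N = {effect:+.2f} dB")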
13

Where is my inhaler? : A simulation and optimization study of the Quality Control on Symbicort Turbuhaler at AstraZeneca / Var är min inhalator? : En simulerings- och optimeringsstudie på kvalitetskontrollen av Symbicort Turbuhaler vid AstraZeneca

Haddad, Shirin, Nilsson, Marie January 2019 (has links)
Symbicort Turbuhaler is a medical device produced by the pharmaceutical company AstraZeneca for the treatment of asthma and symptoms of chronic obstructive pulmonary disease. The delivery reliability of the product depends on the performance of the whole supply chain, and as part of that chain the results from the Quality Control (QC) department are mandatory for releasing produced batches to the market. The performance of QC is thus an important part of the supply chain, and in order to reduce the risk of supply problems and market shortage it is important to investigate whether it can be improved. The purpose of the thesis is to provide AstraZeneca with scientifically based data to identify sensitive parameters and readjust work procedures in order to improve the performance of QC. The goal is to map out the flow of the QC Symbicort Turbuhaler operation and construct a model of it, intended for simulating and optimizing different parameters such as the inflow of batch samples, the utilization of the instrumentation, and staff workload. QC is modelled in simulation software, and the model is used to simulate and optimize different scenarios using discrete event simulation and an optimization technique based on evolution strategies. By reducing the number of analytical robots from 14 to 10, it is possible to maintain the existing average lead time; the reduction increases the utilization of the robots while decreasing the workload for some of the staff. However, it is not possible to extend the durability of the system suitability test (SST) and still achieve the existing average lead time. From the investigation of different parameters, it is found that an additional laboratory engineer at the high-performance liquid chromatography (HPLC) station has the best outcome on lead time and overall equipment effectiveness, whereas removing a laboratory engineer from the Minispice robots has the worst. With the resources available today, the lead times cannot be maintained in the long run if the inflow is 35 batch samples a week or more. By adding a laboratory engineer at the HPLC station and using an SST with a durability of 48 hours, the best outcome is obtained in terms of average lead time and the number of batch samples with a lead time of less than 10 days.
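The evolution-strategy optimization mentioned above can be sketched as a simple (1+1)-ES; the lead_time surrogate, its parameters, and the tuned variable below are invented stand-ins for the actual QC simulation model.

    # A minimal (1+1) evolution strategy; lead_time() is an invented
    # surrogate objective, not AstraZeneca's QC model.
    import random

    def lead_time(robots: int) -> float:
        # Hypothetical trade-off: few robots cause queueing, many add
        # overhead; the real objective would come from the DES run.
        return 8.0 + 40.0 / robots + 0.3 * robots + random.gauss(0, 0.2)

    def evolve(start: int = 14, iters: int = 200) -> int:
        parent, cost = start, lead_time(start)
        for _ in range(iters):
            child = max(1, parent + random.choice([-2, -1, 1, 2]))  # mutate
            child_cost = lead_time(child)
            if child_cost <= cost:             # keep the better candidate
                parent, cost = child, child_cost
        return parent

    random.seed(1)
    print("suggested robot count:", evolve())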
14

Discrete-Event Simulation for Hospital Resource Planning : Possibilities and Requirements

Steins, Krisjanis January 2010 (has links)
The delivery of health care services has been under pressure due to limited funding and increasing demand. This has highlighted the need to increase not only the effectiveness but also the efficiency of health care delivery. Discrete-event simulation has been suggested as an analysis tool in health care management to support the planning of health care resources.

The overall purpose of this thesis is to investigate the possibilities and requirements for using discrete-event simulation in analyzing and planning the use of hospital resources. This is achieved by three case studies that focus on improvements in patient flow of emergency patients that require a radiology examination, intensive care unit capacity planning, and operating room allocation strategies, respectively.

The first case investigates the current stage of digitization and process orientation in hospital care as a prerequisite for efficient process simulation and analysis. The study reveals an emergency-radiology patient flow process that is not very well measured and uncovers disparate information systems storing incompatible and fragmented data. These results indicate that the current degree of process orientation and the current IT infrastructure do not enable efficient use of quantitative process analysis and management tools like simulation.

In the second case the possibilities to develop generic hospital unit simulation models are explored by building and validating a generic intensive care unit (ICU) model. The results show that some of the modeling approaches described in the literature cannot replicate the actual behavior observed in all studied ICUs. It is important to identify patient groups for different admission priorities, to account for over-utilizations in the model logic, and to discover and properly model dependencies in the input data. The research shows that it is possible to develop a generic ICU simulation model that realistically describes the performance of different real ICUs in terms of occupancy, coverage and transfers.

The value of simulation modeling in health care management is examined in the third case through the development and use of a simulation model for optimal resource allocation and patient flow in a hospital operating department. The goal of the simulation modeling in this case was to identify bottlenecks in the patient flow and to try different alternatives for allocating operating room capacity in order to increase the utilization of operating room resources. The final model was used to evaluate four different proposed changes to operating room time allocation.
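A generic ICU model of the kind described can be sketched compactly; the arrival rates, lengths of stay, capacity, and priority rule below are invented for illustration and are not parameters from the studied ICUs.

    # Sketch of a generic ICU occupancy model with two admission
    # priorities; all rates, stays and capacity are invented, and the
    # "emergencies may over-utilize" rule is one possible model choice.
    import heapq, random

    random.seed(7)
    BEDS, HORIZON = 10, 365 * 24.0                        # beds, hours
    RATE = {"emergency": 1 / 12.0, "elective": 1 / 16.0}  # arrivals/hour
    MEAN_STAY = {"emergency": 72.0, "elective": 48.0}     # hours

    events = []                                   # (time, kind, class)
    for cls, rate in RATE.items():                # Poisson arrival streams
        t = random.expovariate(rate)
        while t < HORIZON:
            heapq.heappush(events, (t, "arrive", cls))
            t += random.expovariate(rate)

    occupied = transfers = admitted = 0
    busy_time = last_t = 0.0
    while events:
        t, kind, cls = heapq.heappop(events)
        if t > HORIZON:
            break
        busy_time += occupied * (t - last_t)      # bed-hours integral
        last_t = t
        if kind == "arrive":
            if occupied < BEDS or cls == "emergency":   # over-utilization
                occupied += 1
                admitted += 1
                heapq.heappush(events,
                               (t + random.expovariate(1 / MEAN_STAY[cls]),
                                "depart", cls))
            else:
                transfers += 1                    # deferred elective
        else:
            occupied -= 1

    print(f"mean occupancy {busy_time / HORIZON:.1f} of {BEDS} beds; "
          f"admitted {admitted}, elective transfers {transfers}")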
15

Evaluating Lean Manufacturing Proposals through Discrete Event Simulation – A Case Study at Alfa Laval

Detjens, Sönke, Flores, Erik January 2013 (has links)
In striving for success in competitive markets, companies often turn to the Lean philosophy. For many companies, however, Lean benefits are hard to substantiate, especially when their ventures have met success through traditional manufacturing approaches. Traditional Lean tools analyze current situations or help Lean implementation; production facilities therefore require tools that enhance the evaluation of Lean proposals in such a way that decisions are supported by quantitative data and not merely by gut feeling. This thesis proposes how Discrete Event Simulation may be used as an evaluation tool in production process improvement to decide which proposal best suits Lean requirements. Theoretical and empirical studies were carried out: a literature review helped define the problem, and a case study was performed at Alfa Laval to investigate, through a holistic approach, how and why this tool provides a solution to the research questions. The case study analysis was substantiated with Discrete Event Simulation models for the evaluation of current-state and future-state Lean proposals. Results of this study show that Discrete Event Simulation was not designed as, and does not function as, a Lean-specific tool; using it in Lean assessment requires the organization to understand the principles of Lean and its desired effects. However, traditional static Lean tools such as Value Stream Mapping and dynamic Discrete Event Simulation complement each other in a variety of ways: Discrete Event Simulation is uniquely able to account for process variability and randomness, and both the measurement and the reduction of variability through simulation provide insight for Lean implementation strategies.
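The point about variability can be made concrete with a toy experiment; the two-station line and its processing times are invented. With identical mean cycle times, randomness alone cuts the throughput of an unbuffered line, which a static tool like Value Stream Mapping cannot show.

    # Toy comparison of deterministic vs. variable processing times on an
    # invented unbuffered two-station line; same means, different variance.
    import random

    def throughput(times1, times2):
        # Jobs flow station 1 -> station 2 with no buffer in between.
        d1 = d2 = 0.0                    # last departure from each station
        for t1, t2 in zip(times1, times2):
            c1 = d1 + t1                 # station 1 finishes the job
            d1 = max(c1, d2)             # blocked until station 2 is free
            d2 = d1 + t2                 # station 2 finishes the job
        return len(times1) / d2          # jobs per minute

    random.seed(3)
    n = 100_000
    det = [1.0] * n                      # constant 1.0-minute cycles
    exp1 = [random.expovariate(1.0) for _ in range(n)]   # same mean of 1.0
    exp2 = [random.expovariate(1.0) for _ in range(n)]
    print("deterministic:", round(throughput(det, det), 3))
    print("variable     :", round(throughput(exp1, exp2), 3))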
16

Discrete event modelling and Simulation of an Assembly Line at GKN Driveline Köping AB

Yesilgul, Mustafa, Nasser, Firas January 2013 (has links)
Today’s economic conditions force companies and organizations to work more effectively in their processes, for a variety of reasons. In particular, after the Second World War, changing business perceptions and strong competition between companies brought new terms such as productivity, flexible systems, efficiency, and Lean into the industrial engineering discipline. These terms also raised a new question: how are such goals reached? Discrete event simulation has been used as an effective method to answer it. From this perspective, this project focuses on discrete event simulation and its role in real industrial processes. The main interest of this paper is discrete event simulation, but we also give some detailed information about other types of simulation, such as continuous and discrete rate. The paper consists of several parts. At the beginning, the reader can find theoretical information about simulation itself and the requirements for applying it to real processes. Secondly, we explain the different types of simulation and why we used discrete event simulation instead of continuous or discrete rate in our case study. Furthermore, one of the main aims of this research is to inform the reader about how computer support is used as a simulation tool by today’s companies; to this end, the simulation software ExtendSim 8 is described in detail, including how to create discrete event models with it. The case study part presents the results of five months of work, between February and June, at GKN Driveline Köping AB in Sweden. During these five months we analyzed an assembly line, collected data, created a simulation model, held discussions with workers and engineers, and carried out tests such as validation and verification. In this part, the reader can find all the information about the production line and the simulation model, and the conclusion presents the results of the project together with a visualization of the future state. As discussed repeatedly in the paper, validation is one of the most important steps in a simulation project; therefore, to establish the reliability of our simulation model, different calculations and tests were made. Finally, some results are shown in graphs and tables to give the reader better insight.
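What distinguishes discrete event simulation from continuous simulation is the event calendar: the clock jumps from one scheduled event to the next instead of advancing in fixed time steps. A minimal sketch of that mechanism follows; the event names and time distributions are invented.

    # Minimal discrete-event loop: a heap-based future-event list drives
    # the clock from event to event. Event names and times are invented.
    import heapq, itertools, random

    random.seed(11)
    calendar = []                   # future-event list: (time, id, name)
    ids = itertools.count()         # tie-breaker keeps tuples comparable

    def schedule(time, name):
        heapq.heappush(calendar, (time, next(ids), name))

    schedule(0.0, "part_arrives")
    finished = 0
    while calendar and finished < 5:
        clock, _, name = heapq.heappop(calendar)    # jump to next event
        if name == "part_arrives":
            schedule(clock + random.uniform(1, 3), "machining_done")
            schedule(clock + random.expovariate(0.25), "part_arrives")
        else:                                       # machining_done
            finished += 1
        print(f"t = {clock:6.2f}  {name}")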
17

Modelling a Manufacturing line : Analysis and Decision support based on Discrete Event Simulation

Ibrahim, Fady January 2011 (has links)
Increasing competition forces companies to develop their production continuously in order to remain competitive in the global market and to become as efficient and effective as possible. This master thesis was conducted at Getrag All Wheel Drive, one of the largest suppliers of transmissions and powertrain systems. The company has long worked actively to improve the production flow in its manufacturing departments using conventional methods. Because of its highly complex and intersecting flows, management intended to adopt an approach that takes dynamic information into consideration; building a simulation model was therefore the chosen solution. According to Banks et al. (2001), simulation is a duplication of a real-world process or system and its behaviour as it progresses over time, and it is a very useful method for evaluating complex systems where the usual mathematical tools fall short. The simulation model was created using the simulation methodology of Banks et al. (2001) with the help of the ExtendSim software. The resulting model is used as a tool that greatly assists decision makers in developing the CONWIP (constant work-in-process) system applied in the manufacturing line under study and in investigating 'what if' scenarios. The results of this study were obtained from two experiments: the first, supported by sensitivity analysis, recommends an optimal upper bound on the total amount of work allowed in the CONWIP system; the second analyses the overall effect on the system of separating the paths of high- and low-volume products. This project demonstrates the power of simulation in situations where it is very hard or even impossible to improve the performance of a manufacturing line by other means, i.e. when a large number of variables are involved and affect the system.
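A CONWIP line caps the total work-in-process: a new job is released only when a finished job leaves. The sketch below, with an invented three-station line rather than Getrag's, shows the kind of 'what if' question such a model answers, namely how throughput saturates as the WIP cap grows.

    # Sketch of a CONWIP-controlled flow line; station rates and the
    # line itself are invented. Job i enters only after job i - cap has
    # left, which caps total work-in-process at `cap`.
    import random

    def run(cap, n_jobs=20_000, seed=5):
        random.seed(seed)
        rates = [0.9, 1.2, 1.1]         # jobs/min; station 1 is bottleneck
        free_at = [0.0] * len(rates)    # when each station next idles
        done = []                       # completion time of each job
        release = 0.0
        for i in range(n_jobs):
            if i >= cap:                # CONWIP release rule
                release = max(release, done[i - cap])
            t = release
            for s, rate in enumerate(rates):
                t = max(t, free_at[s]) + random.expovariate(rate)
                free_at[s] = t
            done.append(t)
        return n_jobs / done[-1]

    for cap in (1, 2, 4, 8, 16):
        print(f"WIP cap {cap:2d}: throughput {run(cap):.3f} jobs/min")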
18

Parallel Discrete Event Simulation on Many Core Platforms Using Parallel Heap Event Queues

Tanniru, Govardhan 10 May 2014 (has links)
Discrete event simulation on GPUs employing a parallel heap data structure is the focus of this thesis. Two traditional algorithms for parallel discrete event simulation, one conservative and the other optimistic, have been implemented on GPUs using CUDA. The first is the safe-window algorithm (conservative); it produced the expected performance when compared to sequential simulation. The second, known as SyncSim, is an optimistic simulation algorithm previously designed to be space efficient and to reduce rollbacks. This algorithm is re-implemented on the GPU platform with the necessary changes to the logic simulator and the parallel heap implementation. The performance of the parallel heap when working with a logic simulator has also been validated against the results reported in a previous paper on the parallel heap without the logic simulator.
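The conservative safe-window idea can be illustrated with a simplified, sequential sketch: each logical process (LP) keeps its own event heap, and in every round all events below a global bound (earliest pending timestamp plus lookahead) can be processed, conceptually in parallel, without causality errors. The ring of LPs and the lookahead value are invented, and this CPU sketch only stands in for the thesis's CUDA implementation.

    # Simplified safe-window sketch; a sequential stand-in for the GPU
    # version. The LP topology (a ring) and LOOKAHEAD are invented.
    import heapq, random

    random.seed(2)
    N_LPS, LOOKAHEAD, END = 4, 1.0, 20.0
    heaps = [[] for _ in range(N_LPS)]       # one event heap per LP
    for lp in range(N_LPS):
        heapq.heappush(heaps[lp], (random.uniform(0, 2), lp))

    while any(heaps):
        earliest = min(h[0][0] for h in heaps if h)
        if earliest >= END:
            break
        safe = earliest + LOOKAHEAD          # safe-window upper bound
        for lp, h in enumerate(heaps):       # conceptually parallel
            while h and h[0][0] < safe:
                t, _ = heapq.heappop(h)
                # An event may schedule one at the neighbouring LP, but
                # never earlier than t + LOOKAHEAD, so nothing new can
                # fall inside the current window.
                if t + LOOKAHEAD < END:
                    dest = (lp + 1) % N_LPS
                    heapq.heappush(heaps[dest],
                                   (t + LOOKAHEAD + random.random(), dest))

    print("all LPs processed up to simulated time", END)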
19

PRODUCTION AND DISTRIBUTION PLANNING FOR DYNAMIC SUPPLY CHAINS USING MULTI-RESOLUTION HYBRID MODELS

Venkateswaran, Jayendran January 2005 (has links)
Today, there is little understanding of how local decisions and disturbances impact the global performance of the supply chain. In this research, we attempt to gain insight into this relationship using multi-resolution hybrid models. To this end, a novel hybrid architecture and methodology consisting of simulation (system dynamics and discrete-event) and optimization modules is proposed. The proposed methodology, applicable to general supply chains, is divided into four stages: plan stability analysis (Stage I), plan optimization (Stage II), schedule optimization (Stage III) and concurrent decision evaluation (Stage IV). Functional and process models of the proposed architecture are specified using formal IDEF tools. A realistic three-echelon conjoined supply chain system characterized by communicative and collaborative (VMI) configurations is analyzed in this research. Comprehensive SD models of each player in the supply chain have been developed. General stability conditions (settings of the control parameters that produce a stable response) are derived using z-transform techniques (Stage I), and insights into the behavior of the supply chain are gained. Next, a novel method for integrating the stability analysis with performance analysis (optimization) is presented (Stage II), employing the derived stability conditions as additional constraints within the optimization models. In Stage III, scheduling at each chain partner is addressed using discrete-event simulation (DES) modeling techniques. In Stage IV, the optimality of the SD control parameters (from Stage II) and the DES operational policies (from Stage III) for each member is evaluated concurrently by integrating the SD and DES models, in order to better understand the global consequences of the locally optimal decisions determined at each supply chain member. A generic infrastructure has been developed using the High Level Architecture (HLA) to integrate the distributed decision and simulation models. Experiments are conducted to demonstrate the proposed architecture for the analysis of distributed supply chains. The progression of the cost-based objective function from Stages I-III is compared with that from the concurrent evaluation in Stage IV, and the ability of the proposed methodology to capture the effect of dynamic perturbations within the supply chain system is illustrated.
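To illustrate the kind of stability condition that z-transform techniques yield in Stage I, consider a generic proportional inventory control rule; this is a textbook-style example, not the thesis's actual multi-echelon model.

    % Generic example: inventory deviation i_t under a proportional
    % recovery rule with gain \alpha, driven by demand shocks d_t.
    \[
      i_t = (1-\alpha)\, i_{t-1} - d_t
      \quad\xrightarrow{\;\mathcal{Z}\;}\quad
      \frac{I(z)}{D(z)} = \frac{-z}{z - (1-\alpha)}
    \]

The single pole at z = 1 - alpha gives a stable response if and only if |1 - alpha| < 1, i.e. 0 < alpha < 2. In Stage II, a bound of this form would enter the optimization model as an additional constraint on the control parameter.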
20

On the performance of a manufacturing process with employee learning and turnover

Starchuk, Nathan Unknown Date
No description available.
