211
Optimizing Data Accesses for Scaling Data-intensive Scientific Applications
Yeom, Jae-seung, 30 May 2014
Data-intensive scientific applications often process an enormous amount of data, and their scalability depends critically on how data locality is managed. Our study explores two common types of applications that differ vastly in memory access pattern and workload variation. One comprises applications with multi-stride accesses in regular nested parallel loops; the other processes large-scale irregular social network graphs. In the former, the memory location or data item accessed in a loop is predictable, and the load for processing a unit of work (an array element) is essentially uniform. In the latter, data access per unit of work (a vertex) is highly irregular in both the number of accesses and the locations accessed; this irregularity also drives the load and poses significant challenges to scaling application performance.
Designing platforms that support extreme performance scaling requires understanding how application-specific information can be used to control locality and improve performance. Such insights are necessary for deciding which controls and abstractions to provide at the interface between an underlying system and an application, as well as for designing new systems. Our goal is to expose the common scalability requirements of data-intensive scientific applications.
For the former type of application (regular accesses, uniform workload), we contribute new methods to improve the temporal locality of software-managed local memories and to optimize the critical path of scheduling data transfers for multi-dimensional arrays in nested loops. In particular, we provide a runtime framework that allows transparent optimization by source-to-source compilers or automatic fine-tuning by programmers, and we demonstrate its effectiveness by comparing against a state-of-the-art language-based framework. For the latter type (irregular accesses, non-uniform workload), we analyze how the heavy-tailed property of input graphs limits application scalability. We then introduce an application-specific workload model and a decomposition method that let us optimize locality under the application's custom load-balancing constraints. Finally, we demonstrate unprecedented strong scaling of a contagion simulation on two state-of-the-art high performance computing platforms. / Ph. D.
212
Designing Simulation Model for Open-Pit Mining Charging Infrastructure
Fredin, Alex; Pogén, Mattias, January 2024
The emission-heavy open-pit mining industry is entering a new emission-free era with electrified autonomous haulers. The new technology brings challenges, such as designing charging infrastructure for complex environments, which this thesis addresses with a simulation model. The thesis aims to create a foundation for, and enhance decision-making in, designing and evaluating charging infrastructure for Volvo Autonomous Solutions. It investigates various charging infrastructure designs using a simulation model verified by Volvo Autonomous Solutions employees, prioritizing minimization of the total cost of ownership while meeting the production goal. The project consists of phases (Needfinding and Research, Modeling, Simulation, and Validation), each aligned with specific activities inspired by Design Thinking principles. The simulation model was developed iteratively with a specific site as the case, and several meetings were held with the co-advisor and supervisor to validate it. The result is a simulation model for charging infrastructure specifically designed for open-pit quarries, built so that changes to the system can be made easily; such adaptability is needed for the constantly changing environment and for testing new open-pit quarries. How to design charging infrastructure for open-pit quarries was found to be a research gap, and there is also an industrial need for guidance on how it should be designed. The scientific contribution identifies the factors that matter in designing charging infrastructure, while the industrial contribution is the configured simulation model itself.
213
Information-theoretic and stochastic methods for managing the quality of service and satisfaction in healthcare systems
Komashie, Alexander, January 2010
This research investigates and develops a new approach to managing service quality, with emphasis on patient and staff satisfaction, in the healthcare sector. The challenge of measuring the quality of service in healthcare requires viewing the problem from multiple perspectives: at the philosophical level, the true nature of quality is still debated; at the psychological level, an accurate conceptual representation is problematic; and at the physical level, an accurate measurement of the concept remains elusive to practitioners and academics. This research focuses on the problem of quality measurement in the healthcare sector. Its contributions are fourfold. Firstly, it argues that, from the technological point of view, research to date into quality of service in healthcare has not considered methods for real-time measurement and monitoring, and it identifies the key elements necessary for developing a real-time quality monitoring system for the healthcare environment. Secondly, a unique index is proposed for monitoring and improving healthcare performance using an information-theoretic entropy formalism. The index is formulated from five key performance indicators and was tested as a Healthcare Quality Index (HQI) based on three key quality indicators (dignity, confidence, and communication) in an Accident and Emergency department. Thirdly, using an M/G/1 queuing model and Little's Law, the concept of Effective Satisfaction in healthcare is proposed, based on a Staff-Patient Satisfaction Relation Model (S-PSRM) developed from a patient satisfaction model and an empirically tested model for measuring staff satisfaction with workload (service time). The argument is that a synergy between patient satisfaction and staff satisfaction is the key to sustainable improvement in healthcare quality. The final contribution is a Discrete Event Simulation (DES) modelling platform, a descriptive model that captures the random and stochastic nature of the healthcare service provision process, used to demonstrate the applicability of the proposed quality measurement models.
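The entropy formalism and the queueing relations mentioned above can be made concrete with a short sketch. The snippet below is a minimal illustration, not the thesis's actual HQI: it computes a normalized Shannon-entropy index over hypothetical indicator proportions, and the mean M/G/1 waiting time via the Pollaczek-Khinchine formula together with Little's Law; all figures and indicator weights are assumed for illustration.

```python
import math

def entropy_index(proportions):
    """Shannon entropy of indicator proportions, normalized to [0, 1].
    A hypothetical stand-in for an entropy-based quality index."""
    h = -sum(p * math.log(p) for p in proportions if p > 0)
    return h / math.log(len(proportions))  # divide by max entropy

# Hypothetical proportions for dignity, confidence, communication.
print(entropy_index([0.5, 0.3, 0.2]))

def mg1_mean_wait(lam, mean_s, var_s):
    """Mean queue waiting time in an M/G/1 system
    (Pollaczek-Khinchine): Wq = lam * E[S^2] / (2 * (1 - rho))."""
    rho = lam * mean_s
    assert rho < 1, "queue must be stable"
    es2 = var_s + mean_s ** 2  # second moment of service time
    return lam * es2 / (2 * (1 - rho))

lam = 4.0            # patient arrivals per hour (assumed)
wq = mg1_mean_wait(lam, mean_s=0.2, var_s=0.01)
print(wq, lam * wq)  # Little's Law: mean queue length Lq = lam * Wq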
214
An analysis of the integrated mechanical diagnostics health and usage management system on rotor track and balance
Revor, Mark S., 06 1900
Approved for public release, distribution is unlimited / This thesis concerns the operational benefit of the Integrated Mechanical Diagnostics Health and Usage Management System (IMD HUMS) rotor track and balance (RTB) functionality. The questions addressed are whether there is a savings in flight hours expended on functional check flights (FCFs) compared with present practices, whether there will be a reduction in directed maintenance man-hours (DMMH) spent on maintenance related to the rotor system, and what the impact is on operational availability. Experiments were conducted using a discrete event simulation model of squadron flight operations and organizational-level maintenance. The simulation is generic and can be used in the analysis of other helicopters. Input parameters governing the distributions of maintenance-action inter-arrival times were estimated from Naval Aviation Logistics Data Analysis (NALDA) databases and squadron experience with such systems. The analysis suggests that flight hours spent in FCFs depend on the vibration growth rate, an unknown quantity, and on the maintenance policy for rotor smoothing. Directed maintenance man-hours decrease as the number of IMD HUMS-configured aircraft increases, and further gains are achieved with a maintenance policy suited to a continuous monitoring system. / Captain, United States Marine Corps
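As a rough illustration of the kind of simulation described, the sketch below models a squadron whose aircraft generate maintenance actions with exponential inter-arrival times (as might be estimated from NALDA data) and compete for a limited maintenance crew, using the SimPy library. The fleet size, rates, and crew capacity are invented, and the model omits the FCF and rotor-smoothing policies the thesis actually studies.

```python
import random
import simpy

MTBMA_HOURS = 120.0   # mean time between maintenance actions (assumed)
REPAIR_HOURS = 8.0    # mean repair duration (assumed)
downtime = []

def aircraft(env, crew):
    # Each aircraft alternates between flying and maintenance.
    while True:
        yield env.timeout(random.expovariate(1.0 / MTBMA_HOURS))
        arrived = env.now
        with crew.request() as req:   # wait for a free maintenance team
            yield req
            yield env.timeout(random.expovariate(1.0 / REPAIR_HOURS))
        downtime.append(env.now - arrived)  # wait + repair per action

env = simpy.Environment()
crew = simpy.Resource(env, capacity=2)    # two maintenance teams
for _ in range(10):                       # ten-aircraft squadron
    env.process(aircraft(env, crew))
env.run(until=10_000)
print(sum(downtime) / len(downtime))      # mean downtime per action
```

Operational availability can then be estimated from total downtime relative to simulated time, and policy alternatives compared by rerunning the model.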
215
Optimisation du développement de nouveaux produits dans l'industrie pharmaceutique par algorithme génétique multicritère / Multiobjective optimization of New Product Development in the pharmaceutical industry
Perez Escobedo, José Luis, 03 June 2010
New Product Development (NPD) constitutes a challenging problem in the pharmaceutical industry, due to the characteristics of the development pipeline: the presence of uncertainty, the high capital costs involved, the interdependency between projects, the limited availability of resources, the overwhelming number of decisions over a long time horizon (about 10 years), and the combinatorial nature of a portfolio.
Formally, the NPD problem can be stated as follows: select a set of R&D projects from a pool of candidates in order to satisfy several criteria (economic profitability, time to market) while coping with the uncertain nature of the projects. More precisely, the recurrent key issues are to determine which projects to develop once target molecules have been identified, their order, and the level of resources to assign. In this context, the proposed approach combines discrete event stochastic simulation (a Monte Carlo approach) with a multiobjective genetic algorithm (NSGA-II, Non-dominated Sorting Genetic Algorithm II) to optimize the highly combinatorial portfolio management problem. An object-oriented model previously developed for batch plant scheduling and design, particularly amenable to reuse of both structure and logic, is extended to embed the case of new product management. Two case studies illustrate and validate the approach. The simulation study shows that three performance evaluation criteria must be considered for decision making: the Net Present Value (NPV) of a sequence, its associated risk (defined as the number of positive occurrences of NPV among the samples), and the time to market. These were used in the multiobjective optimization formulation of the problem. Genetic algorithms are particularly attractive for this kind of problem because they lead directly to the Pareto front and handle the combinatorial aspect; NSGA-II was adapted to account for both the number of products in a sequence and the drug release order. An analysis performed on a representative case study over different pairs of criteria, for both bi- and tricriteria optimization, shows the optimization strategy to be efficient and particularly elitist: only a few sequences are detected for the decision makers to consider. Among them, large portfolios cause resource queues and delay time to launch, and are eliminated by the bicriteria optimization strategy; small portfolios, which reduce queuing and time to launch, appear as good candidates. Time proves to be an important criterion to optimize simultaneously with NPV and risk, highlighting the interest of tricriteria optimization. Finally, the order in which drugs are released into the pipeline is a major variable, as in shop scheduling problems.
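The selection logic NSGA-II relies on is the Pareto-dominance test over the three criteria named above. The sketch below is a minimal stand-in, not the thesis's implementation: it filters hypothetical sequence scores down to the non-dominated set, taking risk as a probability of loss for simplicity (the thesis instead counts positive NPV occurrences among samples).

```python
def dominates(a, b):
    """True if sequence a dominates b: no worse on every criterion and
    strictly better on at least one. Criteria are (NPV up, risk down,
    time down), so the NPV sign is flipped before comparison."""
    ka = (-a[0], a[1], a[2])
    kb = (-b[0], b[1], b[2])
    return all(x <= y for x, y in zip(ka, kb)) and ka != kb

def pareto_front(points):
    # Keep every point that no other point dominates.
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]

# Invented scores: (NPV in M$, risk = P(NPV <= 0), time to market in years)
sequences = [(120, 0.30, 8.0), (95, 0.20, 7.5),
             (140, 0.45, 9.0), (90, 0.35, 8.5)]
print(pareto_front(sequences))  # the last sequence is dominated
```

NSGA-II layers non-dominated sorting and crowding-distance selection on top of this test to evolve the portfolio sequences generation by generation.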
216
Simulation et optimisation du transport automatisé dans la fabrication de semi-conducteurs / Simulating and Optimizing Automated Transportation in Semiconductor Manufacturing
Kiba, Téwendé Jean-Etienne Arthur, 10 November 2010
Essential components of all computers, semiconductors are widely used in many sectors. Rapid advances in these technologies impose a challenging rhythm on manufacturers: following Moore's Law, chip capacity doubles approximately every two years while prices are divided by two. The increase in wafer diameter to 300 mm also makes lots of wafers heavier to transport, and Automated Material Handling Systems (AMHS) are an excellent alternative for tackling this problem. Because of the high price of production equipment, the goal is to use production capacity as much as possible while avoiding excessive inventory. To reach this goal, it is important that the transportation and storage system ensures that machines do not wait for available lots, and that lots do not wait for available machines. In the literature, due to the problem's complexity (up to 700 process steps, re-entrant flows, etc.), transportation is modeled under strong assumptions about production, or vice versa; yet transport is a service for production. This thesis provides a modeling approach that integrates transport, production, and storage constraints in detail, in order to improve key production indicators. A detailed understanding of the system allowed us to build a discrete event simulation model that is, in our view, the best compromise between the necessary level of detail and computational time. Finally, after industrial validation, the complete study of the model allowed the critical parameters of transport management to be analyzed. The results provide a better understanding of the system and open interesting research perspectives.
217
Using discrete event simulation: Improving efficiency and eliminating non-value added work
Janius, Camilla; Mir, Sahel, January 2016
Process improvement is one of the most challenging tasks within manufacturing companies. This study analyses a packaging station using a discrete event simulation tool. Packaging is an important part of the production and logistics process, but it is seldom considered when analysing non-value-added activities. Discrete event simulation has been used to analyse non-value-added activities in production systems, but, judging by the small number of articles on its use within packaging, understanding of discrete event simulation in this area is limited. The authors divided the scope of the research into the following research questions:
RQ1: How can discrete event simulation be used as a tool to identify time wastes and create efficiency in a packaging station?
RQ2: What method is suitable when creating a simulation project?
These questions were answered through a literature review and a case study at ABB AB Control Products Vasteras (referred to as ABB later in the thesis), where the packaging station was in need of improvement. The theoretical and empirical findings were analysed; they highlight the importance of packaging and its impact on logistics and supply chain management performance. By creating discrete event simulation models of both the current and the future state, the authors were able to provide analysed improvements of the packaging station. The models illustrated that implementing the improvements would put less pressure on the operators and yield approximately 125% more packed product, as sketched in the example below. The improvements involve better material handling and a more optimized packaging station, creating a more efficient workstation. The study concludes that the company should develop the product and the production process simultaneously, with every activity and process included, and should consider the impact of such development on the entire supply chain; this could be a way to eliminate non-value-added activities from the start. Discrete event simulation is a tool that can help visualize the process, and it allows developers to see the impact of a change or improvement on other processes.
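As a back-of-the-envelope illustration of how a shorter effective cycle time translates into a throughput gain of that order, consider the arithmetic below; the cycle times are invented, and the thesis's 125% figure came from its simulation models, not from this calculation.

```python
# Throughput scales inversely with the effective cycle time per
# package, which a DES study quantifies in detail. All figures here
# are assumed for illustration only.
SHIFT_MIN = 480
current_cycle = 9.0    # minutes per packed product, incl. handling
improved_cycle = 4.0   # after reorganized material handling
cur = SHIFT_MIN / current_cycle
imp = SHIFT_MIN / improved_cycle
print(cur, imp, (imp - cur) / cur * 100)  # ~125% more packed product
```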
218
Modeling the effect of resident learning curve in the emergency department
Richards, Robert Michael, January 1900
Master of Science / Department of Industrial and Manufacturing Systems Engineering / Chih-Hang John Wu / The University of Kansas Medical Center's Emergency Department is adopting a new residency program. In the past, generalized residents supported attending physicians during a required three-month rotation in the Emergency Department. As of July 2010, the department has switched to a dedicated Emergency Medicine residency program that gives recently graduated physicians the opportunity to enter the field of Emergency Medicine. This thesis shows that, although not at first, a dedicated residency program ultimately provides an advantage to the Emergency Department. Discrete event simulations have been used to predict changes in processes, policies, and practices in many different fields; the models run quickly and can provide a basis for future action without the cost of actually implementing changes in policy or procedure. This thesis applies a learning curve in a simulation model to provide data the University of Kansas Medical Center's Emergency Department can use to make decisions about its new residency program. A generalized learning curve was used for the base model and compared with alternatives; when compared with a curve following a sigmoid (logistic) function, there were no significant differences. Ultimately, a Gompertz curve is suggested for hospitals attempting to develop or improve their residency programs using learning curves, because it is easily fitted to a desired shape. This thesis shows the effect that residents have on the performance of the Emergency Department as a whole. The two major components examined for the generalized learning curve were the initial position of first-year residents, determined by the variable α, and the shape of the curve, determined by the variable β. Individual changes to the value of α had little effect. Varying β showed that smaller values elongate the curve, prolonging the time it takes a resident to perform at the level of the attending physician. Each resident's personal value of β can be used to evaluate performance in the emergency department: residents whose β is smaller than the department's expected value might have trouble performing. A sketch of the candidate curve shapes follows.
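To make the curve shapes concrete, the sketch below evaluates a logistic and a Gompertz learning curve scaled from an initial proficiency α toward the attending physician's level (1.0). The parameterizations are generic textbook forms with invented values, not the thesis's fitted model.

```python
import math

def logistic(t, alpha, beta, t_mid):
    """Sigmoid (logistic) learning curve rescaled so that
    performance starts at alpha and approaches 1.0; beta sets
    the slope, t_mid the inflection point."""
    lo = 1.0 / (1.0 + math.exp(beta * t_mid))         # value at t = 0
    raw = 1.0 / (1.0 + math.exp(-beta * (t - t_mid)))
    return alpha + (1 - alpha) * (raw - lo) / (1 - lo)

def gompertz(t, a, b, c):
    """Gompertz curve y = a*exp(-b*exp(-c*t)); its asymmetric S-shape
    makes it easy to fit to a desired proficiency trajectory."""
    return a * math.exp(-b * math.exp(-c * t))

# Smaller beta (or c) elongates the curve, i.e. slower ramp-up.
for month in (0, 3, 6, 9, 12):
    print(month,
          round(logistic(month, alpha=0.5, beta=0.8, t_mid=6), 3),
          round(gompertz(month, a=1.0, b=0.7, c=0.35), 3))
```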
219
Processutveckling : En fallstudie om VSM kombinerat med diskret händelsestyrd simulering kan effektivisera lastningsprocessen hos en logistikverksamhets hamnterminal / Process development: A case study of whether VSM combined with discrete event simulation can make the loading process at a logistics company's port terminal more efficient
Bertilsson, Jonathan; Bodin, Kristoffer, January 2017
To remain competitive today, organizations must focus on continuously improving their processes. This seems logical, but producing a basis for deciding which improvement measures should be implemented in a process can be difficult. The purpose of this study is to investigate how value stream mapping (VSM) together with discrete event simulation (DES) can be used in the loading processes of port-based logistics operations to map the process, identify bottlenecks, and examine the theoretical outcome of implementing potential improvement measures. The answer was generated through a case study at SCA Logistics, where a loading process was analysed from one of their warehouses, as the start, to the end of the process, when the product is loaded onto a vessel. First, a VSM and a DES model of the current state were created; the current-state VSM provided an overview of the process and helped validate the DES model. Then, with Lean principles as a foundation and with the help of interviews, three improvement measures were identified. These measures were applied to the DES model one at a time to examine how they affect the system, and the two most promising measures were also tested together. The results showed that the process became more efficient, in the form of shorter lead time, and two new VSMs were produced to contribute further visual information. In practice, the measures face other barriers that neither DES nor VSM reveals, such as high costs or constraints in warehouse conditions. The study led to the conclusion that VSM can be used as a complement to DES to provide a theoretical basis for whether an improvement measure should be implemented. As a recommendation, a cost-benefit analysis should also be carried out to examine whether the benefit outweighs the cost.
220
Une méthodologie pour modéliser et optimiser la mutualisation du transport ferroviaire urbain de marchandises et de passagers / A modeling methodology to introduce freight into urban passenger rail network
Behiri, Walid, 13 December 2017
Urban freight transport is almost exclusively carried out by truck; beyond the drawbacks it causes in the city, this transport mode is nearly saturated. This study discusses an alternative way of transporting freight, using urban rail infrastructure initially dedicated to passengers. The first contribution deals with the identification and classification of the different ways freight can share the rail network with passenger traffic; the most constrained level is retained, since its feasibility implies that of the others. The second contribution is the definition of the global freight/passenger transport problem, which is decomposed into several interdependent optimization sub-problems with different temporal decision horizons (long, medium, and short). In order to show the capacity of the global system to absorb an additional flow of a different nature, the Freight Rail Transport Schedule Problem (FRTSP) is identified as the bottleneck of the transportation system and is formalized as a MIP model. As the third contribution, this problem determines the train and loading time for each demand, respecting several constraints while minimizing cumulative waiting time; a toy version is sketched below. The fourth contribution is a discrete event simulation approach that studies this alternative transport solution and validates several proposed decision algorithms coupled with the simulation model. Finally, the fifth contribution is a rolling-horizon dynamic approach for absorbing demand perturbations: the plan is updated by assigning new demands so that modifications to the previous plan are minimized.
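To give a flavor of the FRTSP formalization, the sketch below solves a toy version of the core assignment with the PuLP modeling library: each freight demand is assigned to exactly one train departing after its release time, subject to train capacity, minimizing cumulative waiting time. The instance data and the single-station, capacity-only view are simplifications of the thesis's MIP.

```python
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, value

departures = {"t1": 10, "t2": 25, "t3": 40}   # departure times (min), assumed
capacity = {"t1": 1, "t2": 2, "t3": 2}        # freight slots per train, assumed
release = {"d1": 0, "d2": 5, "d3": 12}        # demand ready times, assumed

prob = LpProblem("freight_assignment", LpMinimize)
x = LpVariable.dicts("x", (release, departures), cat="Binary")

# Objective: cumulative waiting time between release and departure.
prob += lpSum((departures[t] - release[d]) * x[d][t]
              for d in release for t in departures)

for d in release:
    # Each demand rides exactly one train that departs after it is ready.
    prob += lpSum(x[d][t] for t in departures
                  if departures[t] >= release[d]) == 1
    for t in departures:
        if departures[t] < release[d]:
            prob += x[d][t] == 0      # cannot board an earlier train
for t in departures:
    prob += lpSum(x[d][t] for d in release) <= capacity[t]

prob.solve()
for d in release:
    for t in departures:
        if value(x[d][t]) > 0.5:
            print(d, "->", t, "wait:", departures[t] - release[d])
```

A rolling-horizon scheme of the kind described would re-solve such a model as new demands arrive, with extra terms penalizing deviation from the previously published assignment.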