151 |
New Framework for Real-time Measurement, Monitoring, and Benchmarking of Construction Equipment Emissions. Heidari Haratmeh, Bardia, 29 June 2014 (has links)
The construction industry is one of the largest emitters of greenhouse gases and health-related pollutants. Monitoring and benchmarking emissions will provide practitioners with information to assess environmental impacts and improve the sustainability of construction. This research focuses on real-time measurement of emissions from non-road construction equipment and development of a monitoring-benchmarking tool for comparison of expected vs. actual emissions. First, exhaust emissions were measured using a Portable Emission Measurement System (PEMS) during the operation of 18 pieces of construction equipment at actual job sites. Second-by-second emission rates and emission factors for carbon dioxide, carbon monoxide, nitrogen oxides, and hydrocarbons were calculated for all equipment. Results were compared to those of other commonly used emission estimation models. Significant differences in emission factors associated with different activities were not observed, except for idling and hauling. Moreover, emission rates were up to 200 times lower than the values estimated using EPA and California Air Resources Board (CARB) guidelines. Second, the resulting database of emissions was used in an automated, real-time environmental assessment system. Based on videos of actual construction activities, this system enabled real-time action recognition of construction operations. From the resulting time-series of activities, emissions were estimated for each piece of equipment and differed by only 2% from those estimated by manual action recognition. Third, the actual emissions were compared to estimated ones using discrete event simulation, a computational model of construction activities. Actual emissions were 28% to 144% of those estimated by manual action recognition. Results of this research will aid practitioners in implementing strategies to measure, monitor, benchmark, and possibly reduce air pollutant emissions stemming from construction. / Master of Science
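As an editorial illustration of the kind of calculation behind activity-level emission factors, a minimal sketch (with invented activity names, rates, and benchmark values, not the thesis's PEMS data) could look like this:

```python
# Hedged sketch: aggregating second-by-second PEMS records into activity-level
# emission factors (average g/s per activity), then comparing them against a
# benchmark rate such as an EPA/CARB estimate. All field names and numbers are
# illustrative placeholders, not data from the thesis.
from collections import defaultdict

def activity_emission_factors(records):
    """records: iterable of (activity, co2_g_per_s) samples at 1 Hz."""
    totals, counts = defaultdict(float), defaultdict(int)
    for activity, co2 in records:
        totals[activity] += co2
        counts[activity] += 1
    return {a: totals[a] / counts[a] for a in totals}

samples = [("idling", 1.2), ("idling", 1.1), ("hauling", 9.8), ("hauling", 10.4)]
factors = activity_emission_factors(samples)           # measured g CO2 per second
benchmark = {"idling": 2.0, "hauling": 15.0}            # assumed guideline rates
ratios = {a: factors[a] / benchmark[a] for a in factors}
print(factors, ratios)
```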
|
152 |
HEMLOCK: HEterogeneous ModeL Of Computation Kernel for SystemC. Patel, Hiren Dhanji, 15 December 2003 (has links)
As SystemC gains popularity as a System Level Design Language (SLDL) for System-On-Chip (SOC) designs, heterogeneous modelling and efficient simulation become increasingly important. The key in making an SLDL heterogeneous is the facility to express different Models Of Computation (MOC). Currently, all SystemC models employ a Discrete-Event simulation kernel, making it difficult to express most MOCs without specific designer guidelines. This often makes it unnatural to express different MOCs in SystemC. For the simulation framework, this sometimes results in unnecessary delta cycles for models away from the Discrete-Event MOC, hindering the simulation performance of the model. Our goal is to extend SystemC's simulation framework to allow for better modelling expressiveness and efficiency for the Synchronous Data Flow (SDF) MOC. The SDF MOC follows a paradigm where the production and consumption rates of data by a function block are known a priori. These systems are common in Digital Signal Processing applications where relative sample rates are specified for every component. Knowledge of these rates enables the use of static scheduling. When compared to dynamic scheduling of SDF models, we experience a noticeable improvement in simulation efficiency. We implement an extension to the SystemC kernel that exploits such static scheduling for SDF models and propose designer style guidelines for modelers to use this extension. The modelling paradigm becomes more natural for SDF, which results in better simulation efficiency. We will distribute our implementation to the SystemC community to demonstrate that SystemC can be a heterogeneous SLDL. / Master of Science
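To illustrate the static-scheduling idea the abstract relies on, the following sketch solves the SDF balance equations for a repetition vector on a made-up two-edge graph; it is an assumption-laden illustration, not the kernel extension itself:

```python
# Hedged sketch of static scheduling for SDF: solve q[src]*produce = q[dst]*consume
# for a repetition vector, which fixes how many times each actor fires per
# schedule period. The graph and token rates are invented for illustration.
from fractions import Fraction
from math import lcm   # math.lcm requires Python 3.9+

# edges: (src, dst, tokens produced per src firing, tokens consumed per dst firing)
edges = [("A", "B", 2, 3), ("B", "C", 1, 2)]

def repetition_vector(edges):
    q = {edges[0][0]: Fraction(1)}
    changed = True
    while changed:                         # propagate firing ratios along edges
        changed = False
        for src, dst, p, c in edges:
            if src in q and dst not in q:
                q[dst] = q[src] * p / c; changed = True
            elif dst in q and src not in q:
                q[src] = q[dst] * c / p; changed = True
    scale = lcm(*(f.denominator for f in q.values()))
    return {a: int(f * scale) for a, f in q.items()}

print(repetition_vector(edges))   # e.g. {'A': 3, 'B': 2, 'C': 1}
```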
|
153 |
Accelerating Hardware Simulation on Multi-cores. Nanjundappa, Mahesh, 04 June 2010 (has links)
Electronic design automation (EDA) tools play a central role in bridging the productivity gap for designing complex hardware systems. However, with an increase in the size and complexity of today's design requirements, current methodologies and EDA tools are unable to effectively mitigate the further widening of the productivity gap. It is estimated that testing and verification take two-thirds of the total development time of complex hardware systems. Functional simulation forms the mainstay of the testing and verification process and is its most widely used technique. Most simulation algorithms and their implementations are designed for uniprocessor systems and cannot easily leverage the parallelism in multi-core and GPU platforms. For example, logic simulation often uses levelized sequential algorithms, whereas the discrete-event simulation frameworks for Verilog, VHDL and SystemC employ concurrency in the form of multi-threading to give an illusion of the inherent parallelism present in circuits. However, the discrete-event model of computation requires a global notion of an event queue, which makes improving its simulation performance via parallelization even more challenging. This work investigates automatic parallelization of simulation algorithms used to simulate hardware models. In particular, we focus on parallelizing the simulation of hardware designs described at the RTL using SystemC/HDL, with examples to clearly describe the parallelization. Even though multi-cores and GPUs offer parallelism, efficiently exploiting this parallelism with their programming models is not straightforward. To overcome this, we also focus our research on building intelligent translators to map simulation applications onto multi-cores and GPUs such that the complexity of the low-level programming models is hidden from the designers. / Master of Science
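For readers unfamiliar with levelized simulation, one of the uniprocessor-oriented algorithms mentioned above, a toy sketch of the idea (with an invented three-net netlist, not code from the thesis) might be:

```python
# Hedged sketch of levelized (oblivious) logic simulation: gates are sorted
# topologically once, then evaluated in level order for each input vector.
# The tiny netlist below is invented for illustration.
from graphlib import TopologicalSorter   # Python 3.9+

# gate -> (function, input nets); primary inputs carry their values directly
netlist = {
    "n1": (lambda a, b: a & b, ("in0", "in1")),
    "n2": (lambda a, b: a | b, ("n1", "in2")),
}

def simulate(netlist, inputs):
    deps = {g: set(srcs) for g, (_, srcs) in netlist.items()}
    values = dict(inputs)
    for gate in TopologicalSorter(deps).static_order():   # levelized order
        if gate in netlist:
            fn, srcs = netlist[gate]
            values[gate] = fn(*(values[s] for s in srcs))
    return values

print(simulate(netlist, {"in0": 1, "in1": 1, "in2": 0}))  # n1=1, n2=1
```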
|
154 |
PTP Simulator for evaluating Best TimeTransmitter Clock Algorithms. Wiggman, Hugo; Kjellén, Jonathan, January 2024 (has links)
In a distributed real-time system there is often a need for time synchronization of the network's nodes. In the telecommunications industry, this is exemplified by the implementation of 5G New Radio (NR), which uses Time Division Duplexing (TDD) communication between the user equipment and the base station in a Radio Access Network (RAN). To enable this communication, the 3rd Generation Partnership Project (3GPP) sets a strict requirement of a time error no larger than 3 μs. Originally made to synchronize lab equipment, the Precision Time Protocol (PTP) has been adapted for the telecom industry's needs. In the 1588a-2023 amendment, the Institute of Electrical and Electronics Engineers (IEEE) demonstrates in two case studies how PTP will not create the ideal timeTransmitter-timeReceiver hierarchy. To address these problems, a modification of the Best TimeTransmitter Clock Algorithm (BTCA) is necessary. The amendment introduces the Enhanced Accuracy Metrics TLV as an optional feature. This feature enables each clock to gain knowledge about the inaccuracy of time from its neighboring clocks. The standard BTCA does not utilize the information from this new feature in its decision mechanism. This thesis presents four new alternative BTCAs, and to evaluate the algorithms a network simulator was built focusing on the timeTransmitter-timeReceiver hierarchy. The simulator proves useful, and in a case study the four new BTCAs are evaluated; two of them solve the issues mentioned in the 1588a-2023 amendment.
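As a hedged illustration only, the following sketch shows the flavour of a dataset comparison that adds an inaccuracy tie-break of the kind the Enhanced Accuracy Metrics TLV could supply; the field set and ordering are simplified and do not reproduce the thesis's four algorithms:

```python
# Hedged sketch: announced clock attributes compared field by field, with an
# assumed accumulated-inaccuracy figure inserted before the identity tie-break.
# The fields mirror the usual priority1/class/accuracy/priority2 comparison in
# simplified form; the actual algorithms proposed in the thesis are not shown.
from dataclasses import dataclass

@dataclass
class Announce:
    priority1: int
    clock_class: int
    accuracy: int          # smaller is better, as in PTP enumerations
    priority2: int
    clock_id: int
    inaccuracy_ns: float   # assumed metric carried by the new TLV

def better(a: Announce, b: Announce) -> Announce:
    """Return the announce message describing the better timeTransmitter."""
    key = lambda m: (m.priority1, m.clock_class, m.accuracy, m.priority2,
                     m.inaccuracy_ns, m.clock_id)   # inaccuracy breaks the tie before the ID
    return min(a, b, key=key)

gm1 = Announce(128, 6, 0x20, 128, 0xA, inaccuracy_ns=150.0)
gm2 = Announce(128, 6, 0x20, 128, 0xB, inaccuracy_ns=40.0)
print(hex(better(gm1, gm2).clock_id))   # 0xB wins on the inaccuracy tie-break
```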
|
155 |
Simulation and Modelling of Production Flows in the Heavy Vehicle Industry: Buffer Levels under a 50/50 Scenario of Electric and Conventional Drive Units / Simulering och modellering av produktionsflöden i tung fordonsindustrin : Buffertnivåer efter 50/50-scenario av elektriska och konventionella drivenheter. Petersson, Märta; Kraft, Adelia, January 2024 (has links)
In the coming decades, many automotive industries will undergo changes in which the electrification of vehicles will play a significant role. For a manufacturing company to be able to compete in the market, constant adaptation is needed to maintain competitiveness. To make companies more efficient, tools such as Lean production are used together with several digital technologies to maintain competitiveness and facilitate possible changes. Simulation can be used to control and develop future production and answer important questions about future challenges. The purpose of this study is to investigate, through discrete event simulation, the needs and conditions that a scenario with 50% conventional and 50% electric drive units will introduce to Scania's transmission production. The purpose was partially fulfilled through a preliminary study to provide a basis for the upcoming simulation models. This preliminary study identified three critical buffers, essential for future production planning. Therefore, three simulation models were established, focusing on the total number of buffer slots. The result of this study presents three proposals for the number of slots for each buffer, as well as the number of stations for the testing and repair of the electric units. Given that a high volume of products alters the space requirements, it was crucial to describe the capacity of these buffers and the potential of their stations. To fully leverage this study and the three simulation models, further and more detailed data collection on several production processes is recommended.
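A minimal discrete event simulation of one station feeding a capacity-limited buffer, written with the third-party SimPy package and entirely invented cycle times, gives a feel for the kind of model the study builds; it is not the Scania model:

```python
# Hedged sketch using SimPy (third-party): a machining station feeds a
# capacity-limited buffer that a downstream station drains, and the recorded
# occupancy hints at how many slots are actually needed. Cycle times, the
# 50/50 product mix and the candidate buffer size are invented placeholders.
import random
import simpy

def station(env, buffer, cycle_time):
    while True:
        yield env.timeout(random.expovariate(1 / cycle_time))
        unit = random.choice(["electric", "conventional"])   # 50/50 scenario
        yield buffer.put(unit)                               # blocks when the buffer is full

def downstream(env, buffer, cycle_time, occupancy):
    while True:
        yield buffer.get()
        occupancy.append((env.now, len(buffer.items)))
        yield env.timeout(random.expovariate(1 / cycle_time))

env = simpy.Environment()
buffer = simpy.Store(env, capacity=12)        # candidate number of buffer slots
occupancy = []
env.process(station(env, buffer, cycle_time=4.0))
env.process(downstream(env, buffer, cycle_time=4.2, occupancy=occupancy))
env.run(until=10_000)
print("max slots used:", max(n for _, n in occupancy))
```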
|
156 |
Simulation-based Optimisation of a Finished Goods Inventory : A qualitative and quantitative capacity analysis using discrete event simulation / Simuleringsbaserad optimering av ett färdigvarulager : En kvalitativ och kvantitativ kapacitetsanalys med hjälp av diskret händelsessimulering. Barnes, Patrik; Malmbom, Kevin, January 2024 (links)
In this thesis, we have conducted a holistic capacity analysis of Scania's Gearbox and Axle finished goods inventory, as they transition toward electrified mobility and have expectations of higher production volumes. Due to an increase in the volumes, the current capacity of the inventory is expected not to be sufficient. The purpose of this thesis is to analyse Scania's inventory, from a current and future perspective using a simulation approach, and identify ways of managing the increased new production demands. Two objectives are addressed in order to obtain a holistic capacity analysis. The first deals with an organisational perspective and how the organisation and way of working impact the capacity of the process. The goal is to identify barriers in the way of working in order to achieve more efficient production. By conducting semi-structured interviews with co-workers and managers, we gather insight into current problems and improvement areas in the production and IT systems in the inventory. We can also understand how communication flows between customers and suppliers in the internal supply chain and what effect communication has on the FVL process. The second objective deals with how storage time policies affect the production and inventory levels. With the help of discrete event simulation and both current and future production data, we find scenarios where inventory levels are within the limits of the inventory capacity by adjusting the time policies. Our results indicate that by adjusting storage time policies and having efficient production, the capacity of the finished goods inventory is sufficient for future demand scenarios. However, there is a strong connection between organisational aspects and capacity, and in order to handle higher volumes, factors such as communication, IT systems and ways of working need to be improved. Lastly, we provide recommendations for dealing with these issues, such as reducing lay time in inventory, changing the shift schedule for operators 2 and 3, including late order handling in the warehouse management systems, and consolidating the different IT systems into one. / In this thesis we have conducted a holistic capacity analysis of Scania's finished goods inventory for gearboxes. As Scania is actively working to introduce new electrically driven products while continuing production of fossil-fuel-driven products, production volumes are expected to increase. With increased volumes, the capacity of the inventory is not expected to suffice. The purpose of this thesis is to analyse Scania's inventory from both a current and a future perspective. To obtain a holistic analysis, we address two research questions. The first concerns an organisational perspective and how the organisation and way of working affect the capacity of the process. The goal is to identify barriers in the way of working in order to achieve more efficient production. By conducting semi-structured interviews with co-workers and managers, we gain insight into current problems and improvement areas in production, as well as the IT systems used in the inventory. We can also understand how communication flows between customers and suppliers in the internal supply chain and what effect communication has on the FVL process. The second focus is on how the lay time of products affects the inventory level.
Using discrete event simulation together with current and future production data, we identify scenarios where the inventory level stays within the limits of the inventory capacity by adjusting the lay time. From our results, we find that by reducing lay time the capacity of the inventory is sufficient for future volume increases. There is, however, a strong connection between organisational aspects and capacity, and in order to handle higher volumes, factors such as communication, IT systems and ways of working need to be improved. Finally, we provide recommendations for dealing with these issues, such as reducing lay time in the inventory, changing the shift schedule for operators 2 and 3, including late order handling in the warehouse management systems, and consolidating the different IT systems into one.
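As an illustrative sketch of the storage-time-policy experiment (with invented daily volumes, lay times and capacity, not Scania figures), one might check how lay time drives the peak number of occupied slots:

```python
# Hedged sketch: each produced unit must rest in the finished goods inventory
# for a fixed lay time before it may be shipped, and the peak on-hand count is
# compared with the rack capacity. All figures below are invented placeholders.
from collections import deque

def peak_inventory(daily_production, lay_time_days):
    waiting, on_hand, peak = deque(), 0, 0
    for day, produced in enumerate(daily_production):
        while waiting and waiting[0][0] <= day:            # ship everything released today
            on_hand -= waiting.popleft()[1]
        waiting.append((day + lay_time_days, produced))    # earliest allowed shipping day
        on_hand += produced
        peak = max(peak, on_hand)
    return peak

volumes = [60, 65, 70, 55, 80] * 20            # assumed future daily output
capacity = 260                                 # assumed number of slots
for lay in (1, 2, 3, 4):
    peak = peak_inventory(volumes, lay)
    print(f"lay time {lay} days: peak {peak} slots, fits = {peak <= capacity}")
```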
|
157 |
Developing a Course Enrollment Simulation Model to Improve College Graduation Outcomes. Straney, Rachel, 01 January 2024 (has links) (PDF)
The process of enrolling and completing the courses needed to earn an undergraduate degree involves complex interactions between individual students and institutional policies and procedures, especially because student and institutional priorities do not always align. Traditional social and behavioral statistical methods are ineffective for modeling these interactions. Simulation and algorithm-based modeling approaches have been underutilized in higher education, but their adaptability can accommodate the complexity of the degree attainment process. The purpose of this research was to design, develop, validate, and apply a multi-method Course Enrollment Simulation Model (CESM), which mirrored the process of college students enrolling in courses required for a specific undergraduate degree program. Simulated output from the model included graduation outcomes, such as six-year graduation rates and the average number of terms students take to obtain the degree, which are metrics commonly tracked by institutions of higher education. As proof of concept, data from a Fall 2015 student cohort and graduation requirements for an undergraduate computer science program at a large public university were used to create and test the CESM. The model integrated elements of discrete event simulation, agent-based modeling, and microsimulation methods into one architecture. Monte Carlo experiments were used to assess the validity of the model, which was more accurate than comparable inferential statistics. Finally, the CESM was used to evaluate summer enrollment policy options intended to improve graduation outcomes in a computer science program, finding that the effectiveness of the proposed policies depended on student factors as well as course requirements of the degree program. This dissertation is formatted as a collection of three studies, each organized into a publishable manuscript.
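A highly simplified Monte Carlo sketch of the enrollment idea, with invented courses, pass rates and term caps rather than the validated CESM parameters, could look like:

```python
# Hedged sketch: each simulated student attempts a few required courses per
# term, passes each with some probability, and graduates once every requirement
# is met. No prerequisites or seat limits are modelled; everything is invented.
import random

REQUIRED = {"CS1": 0.85, "CS2": 0.80, "DS": 0.75, "OS": 0.70}   # assumed pass probabilities
PER_TERM = 2          # courses attempted per term
MAX_TERMS = 18        # roughly six years of fall/spring/summer terms

def terms_to_graduate(rng):
    remaining = set(REQUIRED)
    for term in range(1, MAX_TERMS + 1):
        for course in list(remaining)[:PER_TERM]:
            if rng.random() < REQUIRED[course]:
                remaining.discard(course)
        if not remaining:
            return term
    return None                      # did not finish within the horizon

rng = random.Random(0)
results = [terms_to_graduate(rng) for _ in range(10_000)]
graduates = sum(r is not None for r in results)
grad_rate = graduates / len(results)
avg_terms = sum(r for r in results if r) / max(1, graduates)
print(f"simulated graduation rate {grad_rate:.2%}, average terms {avg_terms:.1f}")
```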
|
158 |
Resource Allocation and Process Improvement of Genetic Manufacturing Systems. Purdy, Gregory T., 21 November 2016 (has links)
Breakthroughs in molecular and synthetic biology through de novo gene synthesis are stimulating new vaccines, pharmaceutical applications, and functionalized biomaterials, and advancing the knowledge of the function of cells. This evolution in biological processing motivates the study of a class of manufacturing systems, defined here as genetic manufacturing systems, which produce a final product with a genetic construct. Genetic manufacturing systems rely on rare molecular events for success, resulting in waste and repeated work during the deoxyribonucleic acid (DNA) fabrication process. Inspection and real time monitoring strategies are possible as mitigation tools, but it is unclear if these techniques are cost efficient and value added for the successful creation of custom genetic constructs.
This work investigates resource allocation strategies for DNA fabrication environments, with an emphasis on inspection allocation. The primary similarities and differences between traditional manufacturing systems and genetic manufacturing systems are described. A serial, multi-stage inspection allocation mathematical model is formulated for a genetic manufacturing system utilizing gene synthesis. Additionally, discrete event simulation is used to evaluate inspection strategies for a fragment synthesis process and multiple fragment assembly operation. Results from the mathematical model and discrete event simulation provide two approaches to determine the appropriate inspection strategies with respect to total cost or total flow time of the genetic manufacturing system. / Ph. D.
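To make the inspection-allocation trade-off concrete, the following sketch enumerates inspection plans for a toy three-stage process with invented yields and costs; it is not the thesis's mathematical model:

```python
# Hedged sketch: in a serial fabrication process where each stage succeeds only
# with some probability, inspecting after a stage costs money but stops failed
# constructs from consuming further work. Expected cost per good construct is
# compared across all inspection plans. Yields and costs are invented.
from itertools import product

stages = [  # (processing cost, probability the stage succeeds)
    (10.0, 0.90),   # fragment synthesis
    (15.0, 0.80),   # fragment assembly
    (20.0, 0.70),   # downstream verification prep
]
INSPECT_COST = 5.0

def expected_cost_per_good_unit(plan):
    """plan[i] is True if we inspect (and scrap failures) after stage i."""
    cost, p_inprocess, p_good = 0.0, 1.0, 1.0
    for (c, p_success), inspect in zip(stages, plan):
        cost += p_inprocess * c            # everything still in process is worked on
        p_good *= p_success                # but only this fraction is actually good
        if inspect:
            cost += p_inprocess * INSPECT_COST
            p_inprocess = p_good           # failures found here are scrapped
    return cost / p_good                   # expected cost per good construct delivered

best = min(product([False, True], repeat=len(stages)), key=expected_cost_per_good_unit)
print(best, round(expected_cost_per_good_unit(best), 2))
```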
|
159 |
Discrete Event Simulation for Aftermarket Supply Chain. Albors Marques, Laura; Jayakumar, Jagathishvar, January 2020 (links)
The planning of an Aftermarket Supply Chain is a very complex task. This is due to an unpredictable demand which is driven by the need for maintenance and repair. This demand driver translates into a wide variety of lead times, a large number of stock-keeping units (SKUs) and the capacity to deliver spare parts during a product's full lifecycle. With all these complexities in place, optimizing and parametrizing the planning process is a difficult and time-consuming task. Moreover, the current optimization tool focuses only on one node (each warehouse individually) of the whole Supply Chain, without considering information such as the inventory levels of the other nodes. Hence, the Supply Chain is not completely connected, making it difficult to get a better understanding of the system performance and to identify cost-draining areas. This leads to capital being tied up in the upper stream of the Supply Chain, later adding unnecessary costs such as high inventory costs, rush freight costs, and return or scrapping costs. In this study, Discrete Event Simulation (DES) is explored as an additional optimization tool that could analyse and improve the performance of the whole Supply Chain. To do that, the functioning of a node is modelled by replicating the logic behind the flow of material, which includes analysing some manual workflows that are currently in place. In addition, all the information needed from orders, order lines and parts is mapped. The latter part of the study aims to connect all the nodes to form a complete overview of the Supply Chain and then perform optimizations globally. As an outcome, Multi-Echelon Inventory Optimization has been performed on the whole Supply Chain after connecting all the nodes and thus obtaining an overview. Furthermore, the impact of different parameters has been studied on the whole model to understand the sensitivity of parameters such as variations in lead time and demand. Finally, different what-if scenarios such as COVID and supplier delays were studied, which helps in understanding the impact of unforeseen situations. / The planning of an aftermarket supply chain is a very complex task. This is due to an unpredictable demand driven by the need for maintenance and repair. This translates into many different lead times, a large number of stock-keeping units (SKUs), and the capacity to deliver spare parts throughout the product's entire lifecycle. With all these complexities in place, optimizing and parametrizing the planning process is a difficult and time-consuming task. Moreover, the current optimization tool focuses only on one node (each warehouse separately) of the whole supply chain without considering information such as the inventory levels of the other nodes. The supply chain is therefore not fully connected, which makes it difficult to gain a better understanding of the system's performance in order to identify cost-draining areas. This leads to capital being tied up upstream in the supply chain and later adds unnecessary costs such as high inventory costs, rush freight costs, and return or scrapping costs. In this study, Discrete Event Simulation (DES) is explored as an additional optimization tool that can analyse and improve the performance of the whole supply chain. To do that, the functioning of a node is modelled by replicating the logic behind the material flow, which includes analysing some manual workflows that currently exist. In addition, all the information needed from orders, order lines and parts is mapped.
The latter part of the study aims to connect all the nodes to form a complete overview of the supply chain and further perform optimizations globally. As an outcome, Multi-Echelon Inventory Optimization has been performed on the whole supply chain after connecting all the nodes, thereby obtaining an overview. Furthermore, the impact of different parameters has been studied on the whole model to understand the sensitivity of parameters such as variations in lead time and demand. Finally, different conceivable scenarios such as COVID and problems with delays at suppliers were studied, which can help in understanding the impact of unforeseen situations.
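A small building-block sketch (one customer-facing node under an order-up-to policy, with invented demand and lead times) shows the kind of lead-time sensitivity sweep described above; the thesis connects many such nodes into a multi-echelon model, which is not reproduced here:

```python
# Hedged sketch: a spare-parts warehouse under an order-up-to (base-stock)
# policy, replenished from an upstream node assumed to have ample stock.
# Sweeping the replenishment lead time shows its effect on the fill rate.
# Demand distribution, base stock and lead times are invented placeholders.
import random

def regional_fill_rate(base_stock, lead_time_days, days=5000, seed=1):
    rng = random.Random(seed)
    on_hand, pipeline = base_stock, []       # pipeline holds (arrival_day, qty) in transit
    served = demanded = 0
    for day in range(days):
        on_hand += sum(q for d, q in pipeline if d == day)
        pipeline = [(d, q) for d, q in pipeline if d > day]
        demand = rng.randint(0, 8)           # assumed spare-part demand per day
        served += min(demand, on_hand)
        demanded += demand
        on_hand -= min(demand, on_hand)
        shortfall = base_stock - on_hand - sum(q for _, q in pipeline)
        if shortfall > 0:                    # order up to the base-stock level
            pipeline.append((day + lead_time_days, shortfall))
    return served / demanded

for lt in (2, 5, 10):                        # sensitivity to supplier lead time
    print(lt, round(regional_fill_rate(base_stock=40, lead_time_days=lt), 3))
```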
|
160 |
Discrete Event Simulation for Optimization of Codependent Assembly Lines / Discrete event simulation för optimering av komplexa produktionsflöden. Acar, Delal; Tran, Hanna, January 2022 (links)
This study is conducted at Scania, a globally leading company that manufactures trucks and delivers transport solutions to customers around the globe. In this study, the end flow of the chassis manufacturing was mapped in order to identify and eliminate wastes and bottlenecks, with the help of lean principles and discrete event simulation (DES) software. The assembly line consists of a make-to-order production head line that sends out semi-finished trucks to the end flow. The end flow is thereby codependent on the output from the head line. A synchronized takted flow would minimize the wait in the stations, which is considered a waste in production, and enhance quality of work as deviations become more visible. The complexity of the production system has however made the department disregard the usage and implementation of simulation tools for improvement and decision making. The aim of this study was to evaluate the applicability and feasibility of DES as a tool for waste elimination, for increasing throughput in a complex system, and for investigating a better ratio between the takt times of the codependent assembly lines. This was done by developing future scenarios with different configurations of the model, studying how the parameters' behavior changes in DES. Recommendations for reaching more effective codependent assembly lines with the help of DES were developed, analyzing the benefits and challenges of using simulation solutions for future decision making. The study offered useful insights and practical guidelines for companies with complex codependent assembly lines that want to use DES to implement lean principles. Five different improvement scenarios were tested, and it was concluded that an increased number of buffer spots would have the largest impact on the throughput. It was also possible to identify an optimized takt time for maximized throughput using an enumeration method over scenarios. The drawback of DES for complex codependent lines was found to be the inability to consider all surrounding factors such as safety, ergonomics, quality and communication, which makes it a less applicable tool for waste elimination. The benefit of the DES simulation was the ability to manage complexities such as: different takt times between lines; gaps that occur in the production because of trucks removed from the head line; the interplay of the various inflows merging; the additional time to receive trucks from the repair department; and stop times that cause waiting time at the production line. It was therefore considered an applicable tool for deciding the takt time for codependent lines. / This study was conducted at Scania, a globally leading company that manufactures trucks and delivers transport solutions to customers around the world. In this study, the end flow of the chassis manufacturing was mapped in order to identify and eliminate wastes and bottlenecks, with the help of lean principles and Discrete Event Simulation (DES). The production line consists of a make-to-order head line that sends semi-finished trucks out to the end flow. The end flow is thereby dependent on the output from the head line. A more synchronized takted flow would minimize waiting times at the stations, which is considered waste in production, and improve the quality of the products. The complexity of the production system has, however, led the department to disregard the use and implementation of simulation tools for decision making.
The aim of this study was to evaluate the usability and feasibility of DES as a tool for eliminating waste in order to increase the number of produced products in a complex system, and to investigate a better relationship between the takt times of different lines. This was done by developing future scenarios with different configurations of the model, in which the behaviour and change of different parameters were studied. Recommendations for reaching more effective codependent assembly lines with the help of DES were developed, where the benefits and challenges of using simulation solutions for future decision making were analysed. The study provided insights and practical suggestions for companies with complex codependent assembly lines that want to use DES to implement lean principles. Five different improvement scenarios were tested, and the results showed that an increased number of buffer spots would have the largest impact on the number of produced products. An optimized takt time for maximizing the number of produced trucks was also possible to identify with a numerical analysis in which different ranges of takt times were tested in different scenarios. The drawback of DES for complex production flows turned out to be the inability to take into account various surrounding factors such as safety, ergonomics, quality and communication. This makes it a less useful tool for measuring the effect of eliminating waste. The benefit of the DES simulation was the ability to handle complexities such as: different takt times between different flows; gaps that occur in production because of trucks removed from the head line; the interplay of the various merging inflows; the additional time to receive trucks from other departments; and stop times that cause waiting time at the production line. DES was therefore considered a useful tool for determining the takt time for complex flows.
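As a toy version of the enumeration approach, the sketch below counts shift throughput for a few candidate end-flow takt times against a fixed head-line takt and buffer size; all figures are invented placeholders, not Scania parameters:

```python
# Hedged sketch: a head line releases a truck every head_takt minutes into a
# finite buffer that the end flow drains every end_takt minutes, with an
# assumed fraction of trucks pulled off the head line. Shift throughput is
# counted for each candidate end-flow takt time.
import random

def shift_throughput(head_takt, end_takt, buffer_slots, shift_min=480, pull_off=0.05, seed=3):
    rng = random.Random(seed)
    buffer = completed = 0
    next_head, next_end = head_takt, end_takt
    for minute in range(shift_min):
        if minute >= next_head and buffer < buffer_slots:   # head line blocked when buffer is full
            next_head = minute + head_takt
            if rng.random() > pull_off:                     # some trucks leave the flow (e.g. rework)
                buffer += 1
        if minute >= next_end and buffer:                   # end flow starves when buffer is empty
            next_end = minute + end_takt
            buffer -= 1
            completed += 1
    return completed

for end_takt in (9, 10, 11, 12):                            # enumerate candidate takt times
    print(end_takt, shift_throughput(head_takt=10, end_takt=end_takt, buffer_slots=6))
```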
|