  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
41

Um método para quantificar o estoque em processo à luz da simulação computacional e da análise multicritério

Pergher, Isaac 16 March 2011 (has links)
Sponsored by CAPES (Coordenação de Aperfeiçoamento de Pessoal de Nível Superior) and PROSUP (Programa de Suporte à Pós-Graduação de Instituições de Ensino Particulares). / In intermittent make-to-stock production systems with convergent flow, the possibility of holding work-in-process (WIP), finished-goods, or raw-material inventories can increase the complexity of operations management and require distinct control procedures for each stock type. Focusing on aligning inventory management with demand and capacity plans in a production system that uses the Conwip approach, this research proposes a structured method to quantify the WIP level of the production system, combining Discrete Event Simulation with the ELECTRE TRI multicriteria decision-aid technique. The research aims to generate information that supports the choice of a scenario configuration corresponding to a level of work-in-process and finished-goods inventory for the production mix under study. Regarding technical procedures, the research method is characterized by the use of computer simulation; regarding the nature of the data, the approach is essentially quantitative.
The proposed method was evaluated, in terms of structure and purpose, by specialists in Simulation, Production Systems Management, and Multicriteria Decision Methods. An application of the method to a real production system is also presented.
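The core idea above — quantifying WIP under a Conwip discipline with discrete event simulation — can be sketched in a few lines. The model below is an illustration only: the two-station tandem line, the exponential service rates, and every parameter value are assumptions for demonstration, not figures from the dissertation.

```python
import heapq
import random

def simulate_conwip(wip_cap, horizon=10_000.0, rates=(1.2, 1.0), seed=42):
    """Minimal CONWIP tandem line: total WIP is capped at wip_cap, and a
    completed job immediately releases a card that admits a new raw job.
    Returns (throughput, time-averaged WIP)."""
    rng = random.Random(seed)
    n = len(rates)
    queues = [0] * n              # jobs waiting or in service at each station
    busy = [False] * n
    events = []                   # heap of (completion time, station)
    t, wip, done = 0.0, 0, 0
    wip_area, last_t = 0.0, 0.0

    def try_start(i, now):
        if not busy[i] and queues[i] > 0:
            busy[i] = True
            heapq.heappush(events, (now + rng.expovariate(rates[i]), i))

    def admit(now):
        nonlocal wip
        while wip < wip_cap:      # CONWIP: raw jobs are always available
            wip += 1
            queues[0] += 1
            try_start(0, now)

    admit(0.0)
    while events and t < horizon:
        t, i = heapq.heappop(events)
        wip_area += wip * (t - last_t)
        last_t = t
        busy[i] = False
        queues[i] -= 1
        if i + 1 < n:
            queues[i + 1] += 1
            try_start(i + 1, t)
        else:
            wip -= 1
            done += 1
            admit(t)              # finished job frees a CONWIP card
        try_start(i, t)
    return done / t, wip_area / t
```

Sweeping `wip_cap` and comparing the resulting throughput/WIP pairs yields exactly the kind of scenario alternatives that a multicriteria method such as ELECTRE TRI would then sort into categories.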
42

A New Insight into Data Requirements Between Discrete Event Simulation and Industry 4.0 : A simulation-based case study in the automotive industry supporting operational decisions

Mirzaie Shra, Afroz January 2019 (has links)
Current industrial companies are under heavy pressure from growing competition and globalization while striving for increased production effectiveness. Meanwhile, volatile markets and rising customer demands are causing manufacturers to shift strategy, and international companies are challenged to pursue changes in order to remain competitive on global markets. Consequently, a new industrial revolution has taken place, introduced as Industry 4.0. This concept incorporates organizational improvement and the digitalization of current information and data flows, accomplished by bringing data from embedded systems, connected machines, devices and humans into a combined interface. Companies are thus given possibilities to improve current production systems while saving operational costs and avoiding wasted production-development effort. Smart Factories, the foundation of Industry 4.0, enable more accurate and precise operational decisions through the ability to test industrial changes in a virtual world before real-life implementation. However, to ensure these functions work as intended, enormous amounts of data must be collected, analysed and evaluated. These data will help companies make more self-aware and automated decisions, resulting in increased production effectiveness. The concept will therefore clearly change how operational decisions are made today. Nowadays, Discrete Event Simulation is a commonly applied tool founded on specific data requirements, as operational changes can be tested in virtual settings. Accordingly, it is believed that simulation can aid companies striving to implement Industry 4.0. As a result, the data requirements shared between Discrete Event Simulation and Industry 4.0 need to be established, while detecting the current data gap in an operational context.
Hence, the purpose of this thesis is to analyse the data requirements of Discrete Event Simulation and Industry 4.0 for improving operational decisions in production systems. To fulfil this purpose, the following research questions have been stated: RQ1: What are the data challenges in existing production systems? RQ2: What data is required for implementing Industry 4.0 in production systems? RQ3: How can data requirements from Discrete Event Simulation benefit operational decisions when implementing Industry 4.0? The research questions were answered by conducting a case study in collaboration with Scania CV AB. The case study comprised observations, interviews and other relevant data collection. In parallel, a literature review focusing on data requirements for operational decisions was compared with the empirical findings. The analysis identified the current data gap in existing production systems in relation to Industry 4.0, which affects the accuracy of operational decisions. In addition, it showed that simulation can give a clearly positive outcome for the adoption of Industry 4.0, together with a clear insight into data requirements.
43

Modelling framework for assessing nuclear regulatory effectiveness

Lavarenne, Jean January 2018 (has links)
This thesis contributes to the effort, launched after the Fukushima-Daiichi disaster, to improve the robustness of national institutions involved in nuclear safety, given the role the failing nuclear regulator played in that accident. The driving idea is to investigate how engineering techniques used in high-risk industries can be applied to institutions involved in nuclear safety to improve their robustness. The thesis focuses specifically on the Office for Nuclear Regulation (ONR), the British nuclear regulator, and its process for structured inspections. The first part of the thesis demonstrates that the hazard and operability (HAZOP) technique, used in the nuclear industry to identify hazards associated with an activity, can be adapted to qualitatively assess the robustness of organisational processes. The HAZOP method was applied to the ONR inspection process and led to the identification of five significant failures or errors: failure to focus on an area/topic deserving regulatory attention; failure to evaluate an area/topic of interest; failure to identify a non-compliance; failure to identify the underlying issue, its full extent and/or safety significance; and failure to adequately share inspection findings. In addition, the study identified the main causal chains leading to each failure. The safeguards of the process, i.e. the mechanisms in place to prevent, detect, resolve and mitigate possible failures, were then analysed to assess the robustness of the inspection process. The principal safeguard found is the superintending inspector, who reviews inspection reports and debriefs inspectors after inspections. It was concluded that the inspection process is robust, provided recruitment and training are excellent. However, given the predominant role of the superintending inspector, the robustness of the process could be improved by increasing the diversity of safeguards.
Finally, suggestions for improvement were made, such as establishing a formal handover procedure between former and new site inspectors, formalising and generalising the shadowing scheme between inspectors, and setting minimum standards for inspection debriefs. These results were shared with the ONR, which had reached the same conclusions independently, thus validating the new application of the HAZOP method. The second part of the thesis demonstrates that computational modelling techniques can be used to build digital twins of institutions involved in safety, which can then be used to assess their effectiveness. The knowledge gained from the HAZOP study was used together with computational modelling techniques to build a digital twin of the ONR and its structured inspection process, along with a simple model of a nuclear plant. The model was validated using face-validity and predictive-validation processes, which respectively involved an experienced ONR inspector checking the validity of the model's procedures and decision-making processes, and comparing the model's output for oversight work done against data provided by the ONR. The effectiveness of the ONR was then evaluated using a scenario in which a hypothetical, newly discovered phenomenon threatens the integrity of the plant, with ONR inspectors gradually learning and sharing new information about it. Monte-Carlo simulation was used to estimate the cost of regulatory oversight and the probability that the ONR model detects and resolves the issue before it causes an accident. Different arrangements were tested, in particular a superintending inspector reviewing inspection reports and a formal information-sharing process. For this scenario, these two improvements were found to have a similar impact on the success probability; however, the former achieves it for only half the cost.
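The Monte-Carlo evaluation described above — estimating the probability that the regulator detects an issue before it causes an accident, together with the oversight cost — can be illustrated with a toy model. The per-period probabilities, the cost unit, and the run count below are invented for the sketch and bear no relation to the thesis's calibrated digital twin.

```python
import random

def regulator_success_probability(p_detect=0.15, p_accident=0.02,
                                  cost_per_inspection=1.0, max_steps=200,
                                  n_runs=20_000, seed=1):
    """Toy Monte-Carlo race between detection and accident: each period the
    regulator pays for one inspection, which may detect and resolve the
    issue; if not, the undetected issue may cause an accident.  Returns the
    estimated probability of detection-before-accident and the mean cost."""
    rng = random.Random(seed)
    successes, total_cost = 0, 0.0
    for _ in range(n_runs):
        for _ in range(max_steps):
            total_cost += cost_per_inspection
            if rng.random() < p_detect:      # issue found and resolved
                successes += 1
                break
            if rng.random() < p_accident:    # accident before detection
                break
    return successes / n_runs, total_cost / n_runs
```

Comparing two parameterisations (say, with and without a report-reviewing superintending inspector raising `p_detect`) on both outputs mirrors the probability-versus-cost comparison made in the thesis.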
44

Aplicação da simulação a eventos discretos no balanceamento de linha de montagem / Application of simulation to discrete events in mounting line balancing

Maciel, Lucio Flore [UNESP] 01 April 2016 (has links)
The overall objective of this research was to develop a discrete event simulation model to be applied to the balancing of a component assembly line in an automotive company, seeking to increase line productivity while using smaller quantities of inputs. The situation studied involves many elements with stochastic behaviour that can influence line productivity, in addition to the diversity of inputs supplying the assembly, which makes line balancing a complex task and favours the use of simulation as a solution procedure.
The IDEF-SIM method was adopted for conceptual modelling of the problem, and the ProModel® software was used for the simulation. The result is a conceptual model, implemented for an assembly line, that yielded useful insights into the problem, such as merging the functions of two workstations, eliminating excess movement, and eliminating bottlenecks.
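Assembly-line balancing such as that studied above starts from a few standard static quantities — takt time, bottleneck cycle time, and line balance efficiency. The helper below computes them for hypothetical station times; the numbers are illustrative only, and the dissertation itself relied on stochastic ProModel® simulation rather than these static formulas.

```python
def line_stats(station_times, demand_per_shift, shift_minutes=480.0):
    """Static line-balancing measures for a serial assembly line.

    takt    -- required pace (minutes per unit) to meet demand
    cycle   -- actual pace, set by the slowest (bottleneck) station
    balance -- line balance efficiency: total work / (stations x cycle)
    """
    takt = shift_minutes / demand_per_shift
    cycle = max(station_times)               # bottleneck station time
    balance = sum(station_times) / (len(station_times) * cycle)
    return takt, cycle, balance
```

A balance efficiency well below 1.0 is the static signal behind decisions such as merging the functions of two workstations, as reported in the study.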
45

Workflow analysis, modelling and simulation for improving conventional and MRI-guided vascular interventions

Fernandez-Gutierrez, Fabiola January 2014 (has links)
This thesis proposes a multidirectional methodological framework for comprehensive ergonomic analysis and workflow modelling of multi-modal vascular image-guided procedures (IGPs). Two approaches are employed to analyse the workflow: Discrete Event Simulation (DES) and purpose-oriented physical models. In contrast to previous studies, the proposed methodology looks in detail at the actions carried out within the intervention rooms and the clinical experience during the procedures, with three main objectives: to provide a deeper understanding of vascular procedures, to predict the impact of protocol modifications, and to offer a framework for developing new image-guided protocols for the alternative use of Magnetic Resonance (MR) imaging in comparison with X-Ray Digital Subtraction Angiography (DSA). The methodological framework includes an assessment of commercial simulation software packages to evaluate their fitness for the specific requirements of this research. The novel methodology is applied to several case studies of common vascular IGPs. In addition, a case of MR-guided focused ultrasound intervention demonstrates how the framework can be extended to study non-vascular IGPs. The multi-disciplinary methodological framework described opens a new way to understand IGPs that could be used in prospective applications such as medical education and medical device regulation.
46

DEVELOPMENT OF A DECISION SUPPORT SYSTEM FOR CAPACITY PLANNING FROM GRAIN HARVEST TO STORAGE

Turner, Aaron P. 01 January 2018 (has links)
This dissertation investigated issues surrounding grain harvest and transportation logistics. A discrete event simulation model of grain transportation from the field to an on-farm storage facility was developed to evaluate how truck and driver resource constraints impact material flow efficiency, resource utilization, and system throughput. Harvest rate and in-field transportation were represented as a stochastic entity generation process, and service times associated with various material handling steps were represented by a combination of deterministic times and statistical distributions. The model was applied to data collected for three distinct harvest scenarios (18 total days). The observed number of deliveries was within ± 2 standard deviations of the simulation mean for 15 of the 18 input conditions examined, and on a daily basis, the median error between the simulated and observed deliveries was -4.1%. The model was expanded to simulate the whole harvest season and include temporary wet storage capacity and grain drying. Moisture content changes due to field dry down was modeled using weather data and grain equilibrium moisture content relationships and resulted in an RMSE of 0.73 pts. Dryer capacity and performance were accounted for by adjusting the specified dryer performance to the observed level of moisture removal and drying temperature. Dryer capacity was generally underpredicted, and large variations were found in the observed data. The expanded model matched the observed cumulative mass of grain delivered well and estimated the harvest would take one partial day longer than was observed. Usefulness of the model to evaluate both costs and system performance was demonstrated by conducting a sensitivity analysis and examining system changes for a hypothetical operation. A dry year and a slow drying crop had the largest impact on the system’s operating and drying costs (12.7% decrease and 10.8% increase, respectively). 
The impact of reducing the drying temperature to maintain quality in drying white corn had no impact on the combined drying and operating cost, but harvest took six days longer. The reduced drying capacity at lower temperatures resulted in more field drying, which counteracted the reduced drying efficiency and increased field time. The sensitivity analysis demonstrated varied benefits of increased drying and transportation capacity based on how often these systems created a bottleneck in the operation. For some combinations of longer transportation times and higher harvest rates, increasing hauling and drying capacity could shorten the harvest window by a week or more at an increase in costs of less than $12 ha⁻¹. An additional field study was conducted to examine corn harvest losses in Kentucky. Total losses for cooperator combines were found to be between 0.8% and 2.4% of total yield (86 to 222 kg ha⁻¹). On average, the combine head accounted for 66% of the measured losses, and the total losses were highly variable, with coefficients of variation ranging from 21.7% to 77.2%. Yield and harvest losses were monitored in a single field as the grain dried from 33.9% to 14.6% moisture. There was no significant difference in the potential yield at any moisture level, and the observed yield and losses displayed little variation for moisture levels from 33.9% to 19.8%, with total losses less than 1% (82 to 130 kg dry matter ha⁻¹). Large amounts of lodging occurred while the grain dried from 19.8% to 14.6%, which resulted in an 18.9% reduction in yield and harvest losses in excess of 9%. Allowing the grain to field dry generally improved test weight and reduced mechanical damage; however, there was a trend of increased mold and other damage with prolonged field drying.
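A stripped-down version of the field-to-storage hauling logic modeled in this dissertation can be written as a deterministic cycle simulation: one combine loads trucks one at a time, loaded trucks travel to a single unloading pit, unload first-come-first-served, and return. All times and resource counts below are hypothetical; the actual model was stochastic and far more detailed.

```python
def simulate_hauling(n_trucks=3, load_time=12.0, travel=30.0,
                     unload=8.0, day=600.0):
    """Deterministic harvest-hauling cycle: one combine loads one truck at a
    time, trucks travel to a single pit that unloads one at a time (FIFO),
    then return to the field.  Returns deliveries completed within `day`."""
    combine_free = 0.0                 # when the combine can start a new load
    pit_free = 0.0                     # when the pit can start a new unload
    truck_ready = [0.0] * n_trucks     # when each truck is back at the field
    deliveries = 0
    while True:
        i = min(range(n_trucks), key=lambda k: truck_ready[k])
        start_load = max(truck_ready[i], combine_free)
        combine_free = start_load + load_time
        arrive_pit = combine_free + travel
        start_unload = max(arrive_pit, pit_free)
        pit_free = start_unload + unload
        if pit_free > day:             # this unload would not finish today
            break
        deliveries += 1
        truck_ready[i] = pit_free + travel
    return deliveries
```

Varying `n_trucks` or `unload` immediately shows whether transport or the pit is the bottleneck — the same question the sensitivity analysis above asks at full scale.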
47

The adoption of discrete event simulation in manufacturing management

Jenkins, Roger J., University of Western Sydney, College of Law and Business, School of Management January 2002 (has links)
The research described in this thesis aims to improve our knowledge of barriers to the use of OR (Operations Research) techniques in the manufacturing sector. Numerous issues have been identified previously, but the work has generally been unsystematic or focused on the OR profession. This research is empirical and systematic, and compares a group of OR professionals to a group of manufacturing managers. The technology selected for analysis was DES (discrete event simulation, a computer-based technique for modeling complex manufacturing processes), and the companies chosen as the focus for data collection were based in Australia. Two major implications for practice are identified in the thesis; these should be seen as complementary rather than opposing perspectives. OR professionals should place their knowledge within the mindset used by manufacturing managers, rather than attempting to displace that mindset. / Doctor of Philosophy (PhD)
48

Evaluation of Ambulance Diversion

Huang, Chung-Yeh 10 February 2012 (has links)
Overcrowding is one of the serious issues that almost every emergency department (ED) in Taiwan has to face on a daily basis. ED crowding results in adverse medical outcomes, a decline in quality of care, and a reduced ability to provide prompt medical attention. One of the viable (but not necessarily preferable) solutions to ED overcrowding is ambulance diversion (AD), whereby ambulances bypass EDs that are unable to provide emergency medical service and take patients to another emergency department. In many medical systems, ambulance diversion is treated as a standard operating procedure, whose effectiveness needs to be carefully studied before a sound policy can be made. In this research, an input-throughput-output simulation model is proposed for simulating ED operation. A computer simulation program is developed based on this model to evaluate various AD initiating criteria, patient-blocking rules, and AD intervals. The crowdedness index, patient waiting time for service, and the percentage of adversely affected patients were assessed to determine the impact of various AD policies. With appropriate parameter settings, this simulation model can represent medical resource providers of different scales. The results may offer insights for making effective AD policies.
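An input-throughput-output ED model with a census-based diversion criterion can be sketched as a small discrete-event simulation. The bed count, arrival and treatment rates, and diversion threshold below are invented for illustration and are not the parameters of the Taiwanese study.

```python
import heapq
import random

def simulate_ed(beds=8, divert_at=12, lam=0.7, mu=0.1,
                horizon=20_000.0, seed=3):
    """Toy input-throughput-output ED model: Poisson ambulance arrivals,
    exponential treatment times, `beds` parallel servers, FIFO waiting.
    When the census (patients in beds plus waiting) is at or above
    `divert_at`, an arriving patient is diverted to another ED.
    Returns (fraction of arrivals diverted, time-averaged census)."""
    rng = random.Random(seed)
    t = area = last = 0.0
    census = waiting = busy = 0
    arrivals = diverted = 0
    completions = []                      # heap of treatment-completion times
    next_arr = rng.expovariate(lam)
    while t < horizon:
        if completions and completions[0] <= next_arr:
            t = heapq.heappop(completions)
            area += census * (t - last); last = t
            census -= 1; busy -= 1
            if waiting:                   # next patient takes the freed bed
                waiting -= 1; busy += 1
                heapq.heappush(completions, t + rng.expovariate(mu))
        else:
            t = next_arr
            area += census * (t - last); last = t
            next_arr = t + rng.expovariate(lam)
            arrivals += 1
            if census >= divert_at:
                diverted += 1             # ambulance sent to another ED
            elif busy < beds:
                census += 1; busy += 1
                heapq.heappush(completions, t + rng.expovariate(mu))
            else:
                census += 1; waiting += 1
    return diverted / arrivals, area / t
```

Lowering `divert_at` trades a higher diversion fraction for a lower census — the basic tension any AD initiating criterion must balance.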
49

Simulation Analysis Of The Blood Supply Chain And A Case Study

Yegul, Mert 01 September 2007 (has links) (PDF)
Efficient management of blood throughout a country is of great economic importance, in addition to its major impact on the success of medical operations. This study is concerned with the analysis of policies for managing a unique blood supply chain network, which is defined in the new Blood and Blood Products Law of the Republic of Turkey. The main objective of the study is to obtain a better understanding of the system and to find improved policies for managing it efficiently. A discrete event simulation model is developed to analyze the blood supply chain of a pilot region in Turkey, and the effects of different management policies on supply chain performance are analyzed. Important improvements are achieved in terms of the selected performance measures, such as the outdate, mismatch and shortage rates of the region. The proposed model can be used by both national health authorities and the Turkish Red Crescent Society as a decision support tool to analyze other regions and to examine alternative policies.
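The outdate and shortage rates used as performance measures above can be reproduced in miniature with a single-site perishable-inventory simulation. Everything below — the order-up-to policy, Poisson demand, a 35-day shelf life — is a generic textbook setup, not the network or parameters of the Turkish study.

```python
import math
import random
from collections import deque

def poisson(rng, lam):
    """Poisson draw via Knuth's multiplication method."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def simulate_blood_bank(order_up_to=30, shelf_life=35, mean_demand=5.0,
                        days=5000, seed=11):
    """One blood bank under daily review: serve Poisson demand oldest-first
    (FIFO), outdate units older than `shelf_life` days, then replenish the
    stock back up to `order_up_to` with fresh units.
    Returns (outdate rate of units supplied, shortage rate of demand)."""
    rng = random.Random(seed)
    stock = deque()                        # ages of on-hand units, oldest first
    outdated = short = demanded = supplied = 0
    for _ in range(days):
        for i in range(len(stock)):        # every unit ages one day
            stock[i] += 1
        while stock and stock[0] > shelf_life:
            stock.popleft()
            outdated += 1
        d = poisson(rng, mean_demand)
        demanded += d
        for _ in range(d):                 # issue oldest units first
            if stock:
                stock.popleft()
            else:
                short += 1                 # unmet demand
        fresh = order_up_to - len(stock)   # order-up-to replenishment
        supplied += fresh
        stock.extend([0] * fresh)
    return outdated / supplied, short / demanded
```

Sweeping `order_up_to` exposes the core policy trade-off the study analyses at network scale: higher stock levels cut shortages but age the inventory toward outdating.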
50

Framework for robust design: a forecast environment using intelligent discrete event simulation

Beisecker, Elise K. 29 March 2012 (has links)
The US Navy is shifting to power projection from the sea, which stresses the capabilities of its current fleet and exposes the need for a new surface connector. The design of complex systems in the presence of changing requirements, rapidly evolving technologies, and operational uncertainty continues to be a challenge. Furthermore, the design of future naval platforms must take into account the interoperability of a variety of heterogeneous systems and their role in a larger system-of-systems context. To date, methodologies to address these complex interactions and optimize the system at the macro level have lacked a clear direction and structure and have largely been conducted in an ad-hoc fashion. Traditional optimization has centered on individual vehicles with little regard for the impact on the overall system. A key enabler in designing a future connector is the ability to rapidly analyze technologies and perform trade studies using a system-of-systems level approach. The objective of this work is a process that can quantitatively assess the impacts of new capabilities and vessels at the system-of-systems level. This new methodology must be able to investigate diverse, disruptive technologies acting on multiple elements within the system-of-systems architecture. Illustrated through a test case for a Medium Exploratory Connector (MEC), the method must be capable of capturing the complex interactions between elements and the architecture, and must be able to assess the impacts of new systems. Following a review of current methods, six gaps were identified, including the need to break the problem into subproblems in order to incorporate a heterogeneous, interacting fleet, dynamic loading, and dynamic routing. For the robust selection of design requirements, analysis must be performed across multiple scenarios, which requires the method to include parametric scenario definition.
The identified gaps are investigated, and methods are recommended to address them and enable overall operational analysis across scenarios. Scenarios are fully defined by a scheduled set of demands, distances between locations, and physical characteristics that can be treated as input variables. Introducing matrix manipulation into discrete event simulations enables the abstraction of sub-processes at an object level and reduces the effort required to integrate new assets. Incorporating these linear algebra principles enables resource management for individual elements and abstraction of decision processes. Although the run time is slightly greater than traditional if-then formulations, the gain in data-handling ability enables the abstraction of loading and routing algorithms. The loading and routing problems are abstracted, and solution options are developed and compared. Realistic loading of vessels and other assets is needed to capture the cargo delivery capability of the modeled mission. The dynamic loading algorithm is based on the traditional knapsack formulation, where a linear program is formulated using the lift and area of the connector as constraints. The schedule of demands from the scenarios provides additional constraints and the reward equation. Because available cargo is distributed between cargo sources, an assignment problem formulation is added to the linear program, requiring the cargo selected to load on a single connector to be available from a single load point. Dynamic routing allows a reconfigurable supply chain to maintain robust and flexible operation in response to changing customer demands and operating environment. Algorithms based on vehicle routing and computer packet routing are compared across five operational scenarios, testing the algorithms' ability to route connectors without introducing additional wait time.
Predicting the wait times at interfaces from the connectors en route, combined with reconsidering which interface to use upon arrival, performed consistently, especially when stochastic load times were introduced, and is expandable to large-scale applications. This algorithm selects the quickest load-unload location pairing based on the connectors routed to those locations and the interfaces selected for those connectors. A future connector could have the ability to unload at multiple locations if a single load exceeds the demand at an unload location. The capability for multiple unload locations is treated as a special case in the calculation of the unload location in the routing. To determine the unload locations to visit, a traveling salesman formulation is added to the dynamic loading algorithm. Balancing the cost to travel to and unload at locations against the additional cargo that could be delivered, the order and locations to visit are selected. Predicting the workload at load and unload locations to route vessels, with reconsideration to handle disturbances, can include multiple unload locations and creates a robust and flexible routing algorithm. The incorporation of matrix manipulation, dynamic loading, and dynamic routing enables robust investigation of the design requirements for a new connector. The robust process uses shortfall, capturing the delay and shortage of cargo delivered, and fuel usage as measures of performance. The design parameters for the MEC, including the number available and vessel characteristics such as speed and size, were analyzed across four ways of testing the noise space. The four testing methods are: a single scenario, a selected number of scenarios, full coverage of the noise space, and the feasible noise space. The feasible noise space is defined using uncertainty around scenarios of interest. The number available, maximum lift, maximum area, and SES speed were consistently design drivers.
There was a trade-off between the number available, size, and speed. When looking at the feasible space, the relationship between size and number available was strong enough to reverse the preference for number available, favoring fewer, larger ships. The secondary design impacts came from factors that directly affected the time per trip, such as the time between repairs and the time to repair. As the noise sampling moved from four scenarios to full coverage to the feasible space, the option to use interfaces was replaced in importance by the time to load at these locations and the time to unload at the beach. The change in impact can be attributed to the reduction in the number of trips needed within the feasible space. The four scenarios had higher average demand than the feasible-space sampling, making loading options more important. The choice of noise sampling had an impact on the design requirements selected for the MEC, indicating the importance of developing a method to investigate future naval assets across multiple scenarios at a system-of-systems level.
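The dynamic loading step described above — knapsack-style cargo selection constrained by the connector's lift and deck area — can be illustrated with a tiny exact solver. This state-space dynamic program is only a stand-in for the linear-program formulation used in the thesis, and the item tuples (reward, lift, area) are invented for the example.

```python
def load_connector(items, max_lift, max_area):
    """Exact 0/1 knapsack with two capacities (lift and deck area).

    items -- sequence of (reward, lift, area) tuples
    Returns (best_reward, tuple of chosen item indices)."""
    # Map each reachable (lift_used, area_used) state to the best
    # (reward, chosen_indices) pair found for it so far.
    states = {(0, 0): (0, ())}
    for idx, (reward, lift, area) in enumerate(items):
        nxt = dict(states)                 # option: skip this item
        for (l, a), (r, chosen) in states.items():
            nl, na = l + lift, a + area
            if nl <= max_lift and na <= max_area:
                cand = (r + reward, chosen + (idx,))
                if cand[0] > nxt.get((nl, na), (-1, ()))[0]:
                    nxt[(nl, na)] = cand   # option: take this item
        states = nxt
    return max(states.values())
```

For instance, with items [(10, 5, 3), (6, 4, 2), (6, 4, 2), (1, 1, 1)] and capacities (8, 5), two medium loads beat the single heavy one. In the full method, per-scenario demand schedules would set the rewards, and an assignment layer would restrict each connector's cargo to a single load point.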
