211

The impact of inventory record inaccuracy on material requirements planning systems

Bragg, Daniel Jay January 1984
No description available.
212

GAME THEORETIC APPROACHES TO PETROLEUM REFINERY PRODUCTION PLANNING – A JUSTIFICATION FOR THE ENTERPRISE LEVEL OPTIMIZATION OF PRODUCTION PLANNING

Tominac, Philip A. 11 1900
This thesis presents frameworks for the optimal strategic production planning of petroleum refineries operating in competition in multiple markets. The game theoretic concept of the Cournot oligopoly is used as the basic competitive model, and the Nash equilibrium as the solution concept for the formulated problems, which are reformulated into potential games. Nonlinear programming potential game frameworks are developed for static and dynamic production planning problems, as well as for mixed integer nonlinear expansion planning problems in which refiners have access to potential upgrades increasing their competitiveness. This latter model represents a novel problem in game theory as it contains both integer and continuous variables and thus must satisfy both discrete and continuous mathematical definitions of the Nash equilibrium. The concept of the mixed-integer game is introduced to explore this problem and the theoretical properties of the new class of games; conditions are identified that define when a class of two-player games possesses Nash equilibria in pure strategies, and conjectures are offered regarding the properties of larger problems and the class as a whole. In all examples, petroleum refinery problems are solved to optimality (equilibrium) to illustrate the competitive utility of the mathematical frameworks. The primary benefit of such frameworks is the incorporation of the influence of market supply and demand on refinery profits, resulting in rational driving forces in the underlying production planning problems. These results are used to justify the development of frameworks for enterprise optimization as a means of decision making in competitive industries.

Thesis / Doctor of Philosophy (PhD)

This thesis presents a mathematical framework in which refinery production planning problems are solved to optimality in competitive scenarios. Concepts from game theory are used to formulate these competitive problems as mathematical programs with a single objective function that coordinates the interests of the competing refiners. Several different cases are considered, presenting refinery planning problems as static and dynamic programs in which decisions are time-independent or time-dependent, respectively. A theoretical development is also presented in the concept of the mixed-integer game, a game-theoretic problem that contains both continuous and discrete-valued variables and must satisfy both continuous and discrete definitions of the Nash equilibrium. This latter development is used to examine refinery problems in which individual refiners have access to numerous unit upgrades which can potentially improve performance. The results are used to justify a game theoretic approach to enterprise optimization.
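As a rough illustration of the Cournot competition model named in this abstract (not the thesis's actual refinery formulation), the sketch below computes a two-player Cournot-Nash equilibrium under a linear inverse demand curve via best-response iteration. The demand intercept, slope, and marginal costs are invented for the example.

```python
# Minimal Cournot oligopoly sketch: two refiners choose output quantities,
# price follows a linear inverse demand curve P(Q) = a - b * Q.
# Best-response iteration converges to the Nash equilibrium of this game.
# All parameter values are illustrative, not taken from the thesis.

a, b = 100.0, 1.0          # inverse demand: price = a - b * (q1 + q2)
c1, c2 = 10.0, 20.0        # marginal production costs of the two refiners

def best_response(rival_q, own_cost):
    """Profit-maximizing quantity given the rival's output (from the first-order condition)."""
    return max(0.0, (a - own_cost - b * rival_q) / (2.0 * b))

q1 = q2 = 0.0
for _ in range(1000):
    q1_new = best_response(q2, c1)
    q2_new = best_response(q1, c2)
    if abs(q1_new - q1) + abs(q2_new - q2) < 1e-9:
        break
    q1, q2 = q1_new, q2_new

price = a - b * (q1 + q2)
print(f"q1={q1:.2f}, q2={q2:.2f}, price={price:.2f}")
# Analytic check: q_i = (a - 2*c_i + c_j) / (3*b) for the two-player linear case.
```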
213

Lot Sizing at the Operational Planning and Shop Floor Scheduling Levels of the Decision Hierarchy of Various Production Systems

Chen, Ming 07 December 2007
The research work presented in this dissertation relates to lot sizing and its applications in the areas of operational planning and shop floor scheduling and control. Lot sizing enables proper loading of the requisite number of jobs on the machines in order to optimize the performance of an underlying production system. We address lot sizing problems that are encountered at the order entry level as well as those that are faced at the time of distributing the jobs from one machine to another and those that arise before shipping the jobs (orders) to customers. There are different issues and performance measures involved in each of these scenarios, which make the lot sizing problems encountered in these scenarios different from one another. We present algorithms and relevant theoretical analyses for each of the lot sizing problems considered, and also present results of numerical experimentation to demonstrate their effectiveness.

We first study the lot sizing problem encountered while transferring jobs from one machine to another. A lot of jobs is split into smaller lots (called sublots) so that the lot can be processed on multiple machines in an overlapping manner, a process known in the literature as lot streaming. Two lot streaming problems, FL2/n/C and FLm/1/C, are investigated in Chapter 2. FL2/n/C involves a two-machine flow shop in which multiple lots are to be processed. The objective is to minimize the combined cost of makespan and material handling (the latter is proportional to the number of sublots). A dynamic programming-based methodology is developed to determine the optimal sublot sizes and the number of sublots for each lot while assuming a known sequence in which to process the lots. We designate this problem as LSP-DP. This methodology is then extended to determine an optimal sequence in which to process the lots in conjunction with the number of sublots and sublot sizes for each lot. We designate this problem as LSSP-DP. Three multidimensional heuristic search procedures (denoted as LSSP-Greedy, LSSP-Cyclic and LSSP-ZP) are proposed for this problem in order to obtain good-quality solutions in a reasonable amount of computational time. Our experimentation reveals that both lot streaming and lot sequencing generate significant benefits if used alone. However, for the objective of minimizing total handling and makespan cost, lot streaming is more beneficial than lot sequencing. The combined use of lot streaming and sequencing, expectedly, results in the largest improvement over an initial random solution. LSP-DP is found to be very efficient, and so are the three LSSP heuristics, all of which are able to generate near-optimal solutions. On average, LSSP-Greedy generates the best solutions among the three, and LSSP-Cyclic requires the least time.

FLm/1/C deals with the streaming of a single lot over multiple machines in a flow shop. The objective is a unified cost function that comprises contributions due to makespan, mean flow time, work-in-process, transfer time and setup time. The distinctive features of our problem pertain to the inclusion of sublot-attached setup time and the fact that idling among the sublots of a lot is permitted. A solution procedure that relies on an approximation equation to determine sublot size is developed for this problem for equal-size sublots. The approximation avoids the need for numerical computations, and enables the procedure to run in polynomial time.
Our experimentation shows that this solution procedure performs quite well and frequently generates the optimal solution. Since the objective function involves multiple criteria, we further study the marginal cost ratios of various pairs of the criteria, and propose cost sensitivity indices to help in estimating the impact of marginal cost values on the number of sublots obtained.

The lot sizing problem addressed in Chapter 3 is motivated by a real-life setting associated with semiconductor manufacturing. We first investigate the integration of lot sizing (at the operational planning level) and dispatching (at the scheduling and control level) in this environment. Such an integration is achieved by forming a closed-loop control system between lot sizing and dispatching. It works as follows: the lot sizing module determines lot sizes (loading quotas) for each processing buffer based on the current buffer status via a detailed linear programming model. The loading quotas are then used by the dispatching module as a general guideline for dispatching lots on the shop floor. A dispatching rule called "largest-remaining-quota-first" (LRQ) is designed to drive the buffer status to its desired level as prescribed by the lot sizing module. Once the buffer status changes or a certain amount of time has passed, the loading quotas are updated by the lot sizing module. Our experimentation, using the simulation of a real-life wafer fab, reveals that the proposed approach outperforms the existing practice (which is based on a "first-in-first-out" (FIFO) rule and an ad hoc lot sizing method). Significant improvements are obtained in both mean values and standard deviations of the performance metrics, which include finished-goods inventory, backlog, throughput and work-in-process. The integration of lot sizing and dispatching focuses on the design of an overall production system architecture. Another lot sizing problem that we present in Chapter 3 deals with input control (or workload control) that complements this architecture. Input control policies are responsible for feeding the production system with the right amount of work at the right time, and are usually divided into "push" and "pull" categories. We develop a two-phase input control methodology to improve system throughput and the average cycle time of the lots. In phase 1, appropriate operational lot sizes are determined with regard to weekly demand, so as to keep the lot start rate at the desired level. In phase 2, a "pull" policy, termed CONLOAD, is applied to keep the bottleneck's workload at a target level by releasing new lots into the system whenever the workload level is below the desired level. Since the operators are found to be the bottleneck of the system in our preliminary investigation, the "operator workload" is used as the system workload in this study. Using throughput and cycle time as the performance metrics, it is shown that this two-phase CONLOAD methodology achieves significant improvement over the existing CONWIP-like policy. Furthermore, a reference table for the target operator workload is established with varying weekly demand and lot start rate.

The last lot sizing problem that we address has to do with the integration of production and shipping operations of a make-to-order manufacturer. The objective is to minimize the total cost of shipping and inventory (from the manufacturer's perspective) as well as the cost of earliness and tardiness of an order (from the customer's perspective).
An integer programming (IP) model is developed that captures the key features of this problem, including production and delivery lead times, multiple distinct capacitated machines and arbitrary processing routes, among others. By utilizing the generalized upper bound (GUB) structure of this IP model, we are able to generate a simplified first-level RLT (Reformulation Linearization Technique) relaxation that guarantees the integrality of one set of GUB variables when it is solved as a linear programming (LP) problem. This allows us to obtain a tighter lower bound at a node of a branch-and-bound procedure. The GUB-based RLT relaxation is complemented by a GUB identification procedure to identify the set of GUB variables that, once restricted to integer values, would result in the largest increment in the objective value. The tightening procedure described above leads to the development of an RLT-based branch-and-bound algorithm. Our experimentation shows that this algorithm is able to search the branch-and-bound tree more efficiently, and hence generates better solutions in a given amount of time.

Ph. D.
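As a hedged illustration of the lot streaming idea discussed in this abstract (not the dissertation's dynamic programming method), the sketch below evaluates a single lot split into equal sublots on a two-machine flow shop and shows the trade-off between makespan and per-sublot handling cost. Processing times and cost rates are invented.

```python
# Minimal sketch of two-machine lot streaming with equal sublots.
# A single lot with total work P1 (machine 1) and P2 (machine 2) is split
# into s equal sublots; each sublot moves to machine 2 as soon as it
# finishes on machine 1.  All numbers below are illustrative placeholders.

def makespan_equal_sublots(P1, P2, s):
    p1, p2 = P1 / s, P2 / s
    c1 = c2 = 0.0
    for _ in range(s):
        c1 += p1                      # sublot finishes on machine 1
        c2 = max(c2, c1) + p2         # then waits for / starts on machine 2
    return c2

P1, P2 = 20.0, 30.0                       # total lot processing times (hypothetical)
makespan_cost, handling_cost = 1.0, 2.0   # per-time and per-sublot costs (hypothetical)

for s in range(1, 6):
    cmax = makespan_equal_sublots(P1, P2, s)
    total = makespan_cost * cmax + handling_cost * s
    print(f"s={s}: makespan={cmax:.1f}, combined cost={total:.1f}")
```

Makespan falls as the number of sublots grows while handling cost rises, which is exactly the combined objective the abstract describes.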
214

Leadtime estimation in a manufacturing shop environment

Srihari, Krishnaswami 14 November 2012
This research examines the relationships between the independent variables (lot size, sequencing rule, and shop type/product mix) and the dependent variables (work-in-process, machine utilization, flowtime, and leadtime). These relationships have seldom been investigated by academicians. There is no satisfactory method available in the literature today that can be used to accurately estimate leadtime. This research developed methods for leadtime estimation. A leadtime estimation method would be invaluable to the production planner, for it would enable the planner to decide when a job should be released to the shop. The experimental system considered in this research is an eight-machine shop. Three different shop types, a job shop, a modified flow shop, and a flow shop, were modelled. Each shop type had eight different product types being manufactured. Two different sequencing rules, SPT and FIFO, were used. The entire system was analyzed via simulation on an IBM P.C. using SIMAN system simulation concepts.

Master of Science
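As a much-simplified stand-in for the eight-machine SIMAN simulation described above (not the thesis's model), the toy below shows how the sequencing rule alone shifts mean flowtime on a single machine; the job count and processing-time range are invented.

```python
# Toy illustration of how a sequencing rule affects mean flowtime:
# jobs are all available at time 0 on a single machine and are sequenced
# either FIFO (arrival order) or SPT (shortest processing time first).
# Processing times are randomly generated; this is only a conceptual stand-in.
import random

random.seed(42)
jobs = [random.uniform(1.0, 10.0) for _ in range(50)]   # processing times

def mean_flowtime(processing_times):
    t, total = 0.0, 0.0
    for p in processing_times:
        t += p            # job completes at time t
        total += t        # flowtime = completion time (all jobs ready at 0)
    return total / len(processing_times)

print("FIFO mean flowtime:", round(mean_flowtime(jobs), 2))
print("SPT  mean flowtime:", round(mean_flowtime(sorted(jobs)), 2))
# SPT is known to minimize mean flowtime on a single machine.
```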
215

Selection of an optimal set of assembly part delivery dates in a stochastic assembly system

Das, Sanchoy K. 14 November 2012
The scheduling of material requirements at a factory to maximize profits or productivity is a difficult mathematical problem. The stochastic nature of most production setups introduces additional complications as a result of the uncertainty involved in vendor reliability and processing times. But in developing the descriptive model for a system, a true representation can only be attained if the variability of these elements is considered. Here we present the development of a normative model based on a new type of descriptive model that accounts for this stochasticity. The arrival time of an assembly part from a vendor is considered to be a normally distributed random variable. We attempt to optimize the system with regard to work-in-process inventory using a dynamic programming algorithm in combination with a heuristic procedure. The decision variable is the prescribed assembly part delivery date. The model is particularly suitable for application in low-volume assembly lines, where products are manufactured in discrete batches.

Master of Science
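As a hedged aside on the normally distributed arrival-time model mentioned above (not the thesis's dynamic programming algorithm), the snippet below computes the expected earliness and lateness of a part for a prescribed delivery date using the standard normal loss function. The mean, standard deviation, and date are invented.

```python
# Expected early and late arrival time of an assembly part whose delivery
# time X is modelled as Normal(mu, sigma), given a prescribed need date d.
# Uses the standard normal loss function; all numbers are illustrative.
import math

def phi(z):   # standard normal pdf
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def Phi(z):   # standard normal cdf
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def earliness_lateness(mu, sigma, d):
    m = d - mu
    z = m / sigma
    expected_early = m * Phi(z) + sigma * phi(z)    # E[(d - X)^+], drives WIP holding
    expected_late = expected_early - m              # E[(X - d)^+], drives assembly delay
    return expected_early, expected_late

early, late = earliness_lateness(mu=10.0, sigma=2.0, d=12.0)
print(f"expected earliness: {early:.3f}, expected lateness: {late:.3f}")
```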
216

A Multi-Agent System and Auction Mechanism for Production Planning over Multiple Facilities in an Advanced Planning and Scheduling System

Goel, Amol 29 October 2004
One of the major planning problems faced by medium and large manufacturing enterprises is the distribution of production over various (production) facilities. The need for cross-facility capacity management is most evident in high-tech industries with capital-intensive equipment and short technology life cycles. Solutions proposed in the literature have been based on the Lagrangian decomposition method, which separates the overall multiple-product problem into a number of single-product problems. We believe that multi-agent systems, given their distributed problem-solving approach, can be used to solve this problem, in its entirety, more effectively. According to other researchers who have worked in this field, auction theoretic mechanisms are a good way to solve complex production planning problems. This research study develops a multi-agent system and negotiation protocol based on a combinatorial auction framework to solve the given multi-facility planning problem. The output of this research is a software library, which can be used as a multi-agent system model of the multi-product, multi-facility capacity allocation problem. The negotiation protocol for the agents is based on an iterative combinatorial auction framework which can be used for making allocation decisions in this environment in real time. A simulator based on this library is created to validate the multi-agent model as well as the auction theoretic framework for different scenarios in the problem domain. The planning software library is created using open source standards so that it can be seamlessly integrated with the scheduling library being developed as a part of the Advanced Planning and Scheduling (APS) system project or any other software suite which might require this functionality.

The research contribution of this study is a new multi-agent architecture for an Advanced Planning and Scheduling (APS) system as well as a novel iterative combinatorial auction mechanism which can be used as an agent negotiation protocol within this architecture. The theoretical concepts introduced by this research are implemented in the MultiPlanner production planning tool, which can be used for generating master production plans for manufacturing enterprises. The validation process carried out on both the iterative combinatorial framework and the agent-based production planning methodology demonstrates that the proposed solution strategies can be used for integrated decision making in the multi-product, multi-facility production planning domain. Also, the software tool developed as part of this research is a robust, platform-independent tool which can be used by manufacturing enterprises to make relevant production planning decisions.

Master of Science
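To make the combinatorial auction idea concrete (this is a brute-force stand-in, not the iterative protocol the thesis develops), the sketch below solves winner determination for a tiny auction in which hypothetical facilities bid on bundles of capacity slots. All bidders, bundles, and values are invented.

```python
# Brute-force winner determination for a tiny combinatorial auction:
# each bid names a bundle of indivisible capacity slots and a value, at most
# one bid per bidder may win, and winning bundles must not overlap.
# Enumeration works only for very small instances and is purely illustrative.
from itertools import combinations

# (bidder, bundle of capacity slots, bid value) -- all hypothetical
bids = [
    ("facility_A", frozenset({"slot1", "slot2"}), 12.0),
    ("facility_A", frozenset({"slot1"}), 7.0),
    ("facility_B", frozenset({"slot2", "slot3"}), 11.0),
    ("facility_C", frozenset({"slot3"}), 5.0),
]

def feasible(subset):
    bidders = [b[0] for b in subset]
    slots = [s for b in subset for s in b[1]]
    return len(set(bidders)) == len(bidders) and len(set(slots)) == len(slots)

best_value, best_set = 0.0, ()
for r in range(1, len(bids) + 1):
    for subset in combinations(bids, r):
        if feasible(subset):
            value = sum(b[2] for b in subset)
            if value > best_value:
                best_value, best_set = value, subset

print("winning bids:", [(b[0], sorted(b[1]), b[2]) for b in best_set])
print("total value:", best_value)
```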
217

Simulation and modelling of production flows in the heavy vehicle industry : Buffer levels under a 50/50 scenario of electric and conventional drive units

Petersson, Märta, Kraft, Adelia January 2024
In the coming decades, many automotive industries will undergo changes in which the electrification of vehicles will play a significant role. For a manufacturing company to compete in the market, constant adaptation is needed to maintain competitiveness. To make companies more efficient, tools such as Lean production are used together with several digital technologies to maintain competitiveness and facilitate possible changes. Simulation can be used to control and develop future production and answer important questions about future challenges. The purpose of this study is to investigate, through discrete event simulation, the needs and conditions that a scenario with 50% conventional and 50% electric drive units will introduce to Scania's transmission production. The purpose was partially fulfilled through a preliminary study to provide a basis for the upcoming simulation models. This preliminary study identified three critical buffers, essential for future production planning. Therefore, three simulation models were established, focusing on the total number of buffer slots. The results of this study present three proposals for the number of slots for each buffer, as well as the number of stations for the testing and repair of the electric units. Given that a high volume of products alters the space requirements, it was crucial to describe the capacity of these buffers and the potential of their stations. To fully leverage this study and the three simulation models, a further, more detailed data collection on several production processes is recommended.
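As a hedged, heavily simplified illustration of the buffer-sizing question the study asks (not Scania's data or the thesis's models), the sketch below simulates a buffer between two stations processing a 50/50 mix of two variants with different cycle times and records the peak occupancy. All cycle times and the run length are invented.

```python
# Tiny event-driven sketch of a buffer between an upstream and a downstream
# station processing a 50/50 mix of two product variants with different
# cycle times.  It records peak buffer occupancy, the kind of quantity the
# study's discrete event models estimate.  All numbers are illustrative.
import random

random.seed(1)
SIM_TIME = 10_000                                     # minutes
CYCLE_UP = {"electric": 4.5, "conventional": 3.5}     # upstream cycle times
CYCLE_DOWN = {"electric": 5.0, "conventional": 3.0}   # downstream cycle times

buffer, peak = [], 0
up_variant = "electric"
next_up_done = CYCLE_UP[up_variant]
next_down_done = float("inf")
down_variant = None
t = 0.0
while t < SIM_TIME:
    t = min(next_up_done, next_down_done)
    if t == next_up_done:                       # upstream finishes a unit
        buffer.append(up_variant)
        peak = max(peak, len(buffer))
        up_variant = random.choice(["electric", "conventional"])   # 50/50 mix
        next_up_done = t + CYCLE_UP[up_variant]
    else:                                       # downstream finishes a unit
        next_down_done = float("inf")
        down_variant = None
    if down_variant is None and buffer:         # downstream pulls the next unit
        down_variant = buffer.pop(0)
        next_down_done = t + CYCLE_DOWN[down_variant]

print("peak buffer occupancy:", peak)
```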
218

Dynamic resource allocation in manufacturing and service industries

Yilmaz, Tuba 11 January 2012
In this thesis, we study three applications of dynamic resource allocation: the first two consider dynamic lead-time quotation in make-to-order (MTO) systems with substitutable products and order cancellations, respectively; the third application is a manpower allocation problem with job-teaming constraints. Matching supply and demand for manufacturing and service industries has been a fundamental focus of the operations management literature, which has concentrated on optimizing or improving supply-side decisions since demand has generally been assumed to be exogenously determined. However, recent business trends and advances in consumer behavior modeling have shown that demand for goods and services can clearly be shaped by various decisions that a firm makes, such as price and lead-time. In fact, competition between companies is no longer based mainly on price or product features; lead-time is one of the strategic measures used to evaluate suppliers. In MTO manufacturing or service environments that aim to satisfy the customers' unique needs, lead-time quotation impacts the actual demand of the products and the overall profitability of the firm.

In the first two parts of the thesis, we study the dynamic lead-time quotation problem in pure MTO (or service) systems characterized by lead-time-sensitive Poisson demand and exponentially distributed service times. We formulate the problem as an infinite horizon Markov decision process (MDP) with the objective of maximizing the long-run expected average profit per unit time, where profits are defined to specifically account for delays in delivery of the customer orders. We study the dynamic lead-time quotation problem in two particular settings: one setting with the possibility of demand substitution and another setting with order cancellations. The fundamental trade-off in lead-time quotation is between quoting short lead-times and attaining them. In the case of demand substitution, i.e., in the presence of substitutable products and multiple customer classes with different requirements and margins, this trade-off also includes capacity allocation and order acceptance decisions. In particular, one needs to decide whether to allocate capacity to a low-margin order now, or whether to reserve capacity for potential future arrivals of high-margin orders by considering customer preferences, the current workload in the system, and the future arrivals. In the case of order cancellations, one needs to take into account the probability of cancellation of orders currently in the system and quote lead-times accordingly; otherwise, quotation of a longer lead-time may result in the loss of the customer order, lower utilization of resources, and, in turn, reduced profits.

In Chapter 2, we study a dynamic lead-time quotation problem in an MTO system with two (partially) substitutable products and two classes of customers. Customers decide to place an order for one of the products or not to place an order, based on the quoted lead-times. We analyze the optimal profit and the structure of the optimal lead-time policy. We also compare the lead-time quotes and profits for different quotation strategies (static vs. dynamic) with or without substitution. Numerical results show that substitution and dynamic quotation have synergistic effects, and higher benefits can be obtained by dynamic quotation and/or substitution when the difference in product revenues or arrival rates, or the total traffic intensity, is higher.
In Chapter 3, we study a dynamic lead-time quotation problem in an MTO system with a single product, considering order cancellations. The order cancellations can take place during the period that the order is being processed (either waiting or undergoing processing), or after the processing is completed, at the delivery to the customer. We analyze the behavior of the optimal profit in terms of the cancellation parameters. Through a numerical study, we show that the optimal profit does not necessarily decrease as the cancellation rate increases. When the profit from a cancelled order, the arrival rate of customers, or the lead-time sensitivity of customers is high, the optimal profit is more likely to increase as the cancellation rate increases. We also compare the cancellation scenarios with the corresponding no-cancellation scenarios, and show that there exists a cancellation scenario that is at least as good in terms of profit as a no-cancellation scenario for most of the parameter settings.

In Chapter 4, we study the Manpower Allocation Problem with Job-Teaming Constraints with the objective of minimizing the total completion time of all tasks. The problem arises in various contexts where tasks require cooperation between workers: a team of individuals with varied expertise required in different locations in a business environment, surgeries requiring different compositions of doctors and nurses in a hospital, a combination of technicians with individual skills needed in a service company. A set of tasks at random locations requires a set of capabilities to be accomplished, and workers have unique capabilities that are required by several tasks. Tasks require synchronization of workers to be accomplished, hence workers arriving early at a task have to wait for other required workers to arrive in order to start processing. We present a mixed integer programming formulation, strengthen it by adding cuts, and propose heuristic approaches. Experimental results are reported for low and high coordination levels, i.e., the number of workers required to work simultaneously on a given task.
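For the queueing setting the abstract describes (Poisson demand, exponential service), the trade-off between quoting a short lead-time and attaining it can be illustrated with a small calculation: given n orders already in the system, an accepted order's sojourn time is Erlang(n+1) distributed, so its on-time probability for a quoted lead-time follows directly. This is a hedged side calculation, not the thesis's MDP; the service rate and quotes are invented.

```python
# Probability that a newly accepted order is delivered within a quoted
# lead-time l when n orders are already in an M/M/1-type MTO system with
# service rate mu: its sojourn time is Erlang(n+1, mu).
# Illustrative numbers only; not the optimal quotation policy.
import math

def on_time_probability(n_in_system, mu, lead_time):
    k = n_in_system + 1                      # stages of work ahead of delivery
    x = mu * lead_time
    tail = sum(math.exp(-x) * x**j / math.factorial(j) for j in range(k))
    return 1.0 - tail                        # P(Erlang(k, mu) <= lead_time)

mu = 1.0                                     # orders served per day (hypothetical)
for quote in (2.0, 4.0, 6.0, 8.0):
    p = on_time_probability(n_in_system=3, mu=mu, lead_time=quote)
    print(f"quote {quote:.0f} days -> on-time probability {p:.2f}")
```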
219

Resource constrained step scheduling of project tasks

Eygelaar, Anton Burger 03 1900
Thesis (MScEng (Civil Engineering))--University of Stellenbosch, 2008. Thesis presented in partial fulfilment of the requirements for the degree of Master of Science in Civil Engineering at the University of Stellenbosch.

ENGLISH ABSTRACT: The logical scheduling of activities in an engineering project currently relies heavily on the experience and intuition of the persons responsible for the schedule. In large projects the complexity of the schedule far exceeds the capacity of human intuition, and systematic techniques are required to compute a consistent sequence of activities. In this study a simple model of the engineering process is described. Based on certain specified relationships between components of the model, a consistent sequence of activities is determined in the form of a logical step schedule. The problem of resource constraints receives special attention. Engineering projects are often executed with limited resources, and determining the impact of such restrictions on the logical step schedule is important. This study investigates activity-shifting strategies to find a near-optimal sequence of activities that guarantees consistent evolution of deliverables while resolving resource conflicts within the context of logical step schedules.

AFRIKAANSE OPSOMMING (translated): The logical scheduling of activities in an engineering project relies heavily on the experience and intuition of the people responsible for the schedule. In large projects the complexity of the schedule is far higher than the capacity of human intuition, and systematic techniques are needed to compute a consistent sequence of activities. In this study a simple model of the engineering process is described. Based on certain relationships between components of the model, a consistent sequence of activities can be determined in the form of a logical step schedule. The problem of limited resources receives special attention. Engineering projects are often executed with limited resources, and it is important to determine the impact thereof on the logical step schedule. The study investigates the use of activity-shifting strategies to find a near-optimal sequence of activities that guarantees consistent development of the project deliverables while resource conflicts are resolved within the context of a logical step schedule.
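As a hedged illustration of resource-constrained scheduling in general (not the activity-shifting strategy developed in the thesis), the sketch below is a minimal serial schedule-generation scheme: precedence-ordered tasks are placed at the earliest time at which a single renewable resource is available. The tasks, durations, demands, and capacity are invented.

```python
# Minimal serial schedule-generation sketch for resource-constrained
# scheduling: tasks are placed, in a precedence-feasible order, at the
# earliest start time where all predecessors have finished and the single
# renewable resource capacity is not exceeded.  Data are invented.

CAPACITY = 4                      # units of the renewable resource available
# task: (duration, resource demand, predecessors)
tasks = {
    "A": (3, 2, []),
    "B": (2, 3, ["A"]),
    "C": (4, 2, ["A"]),
    "D": (2, 2, ["B", "C"]),
}

finish = {}                       # task -> finish time
usage = {}                        # time step -> resource units in use

for name, (dur, demand, preds) in tasks.items():   # insertion order is precedence-feasible here
    start = max((finish[p] for p in preds), default=0)
    while any(usage.get(t, 0) + demand > CAPACITY for t in range(start, start + dur)):
        start += 1                # shift right until the resource profile fits
    for t in range(start, start + dur):
        usage[t] = usage.get(t, 0) + demand
    finish[name] = start + dur
    print(f"task {name}: start {start}, finish {finish[name]}")

print("project makespan:", max(finish.values()))
```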
220

Research and development of a linear programming function with specific reference to the generation expansion planning environment of Eskom

Botha, Lance Robert 17 August 2016
A project submitted to the Faculty of Engineering, University of the Witwatersrand, Johannesburg, in partial fulfilment of the requirements for the degree of Master of Science in Engineering, Johannesburg, 1994.

The purpose of this document is to report on the development of a Linear Programming function for the Generation Expansion Planning environment of Eskom. This was achieved by researching the modeling methods employed in this and related fields of work. After establishing the scope of the work to be performed, all the options were carefully assessed and it was decided to develop the Production Scheduling function first, as this would serve as the foundation for future work. The requirements were specified after extensive discussion with the customer. These requirements were utilized to establish the formulae, including their bounds and constraints. These were in turn converted into the Linear Programming function. To facilitate the data input process, a simple input facility was developed. To maximize the value of the results, a report writer was developed to enable sensitivity studies to be performed. This work was later used as the foundation of the NewGex programme.
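As a hedged, minimal example of the kind of linear program such a production scheduling function solves (not Eskom's actual model), the sketch below dispatches three generating units to meet demand at minimum cost subject to unit capacities, using SciPy's linprog. The costs, capacities, and demand are invented.

```python
# Minimal LP production-scheduling sketch: dispatch three generating units
# to meet a demand at minimum cost, subject to unit capacities.
# All numbers are illustrative; this is not the model described in the report.
from scipy.optimize import linprog

costs = [20.0, 35.0, 50.0]           # cost per MWh for units 1..3 (hypothetical)
capacities = [400.0, 300.0, 500.0]   # MW
demand = 900.0                       # MW to be supplied in the period

result = linprog(
    c=costs,                               # minimize total generation cost
    A_eq=[[1.0, 1.0, 1.0]], b_eq=[demand], # generation must equal demand
    bounds=[(0.0, cap) for cap in capacities],
    method="highs",
)

print("dispatch (MW):", [round(x, 1) for x in result.x])
print("total cost:", round(result.fun, 1))
```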
