21

Platform based approach for economic production of a product family

Choubey, Anand M January 1900 (has links)
Master of Science / Department of Industrial & Manufacturing Systems Engineering / David H. Ben-Arieh / In today's competitive market, there is a growing emphasis on ascertaining and fulfilling individual customers' wants and needs. The focus of manufacturing has therefore been shifting from mass production to mass customization, which requires manufacturers to introduce an increasing number of products with shorter life spans and at lower cost. A further challenge is managing product variety in an environment where demands are stochastic and the lead times to fulfill them are short. The focus of this thesis is to develop and investigate platform-based production strategies, as opposed to producing each product independently, that ensure the economic production of a broad range of specialized products with short final assembly times and under demand uncertainty. The thesis proposes three platform-based production models. The first model considers the economic production of products based on a single platform with forecasted product demands; it is formulated as a general optimization problem that minimizes total production costs. The second model extends the first to production based on multiple platforms and minimizes total production costs plus the setup costs of maintaining multiple platforms. The third model also extends the first by treating product demands as stochastic; it minimizes total production costs together with the shortage costs of lost demand and the holding costs of surplus platforms under demand uncertainty, and is formulated as a two-stage stochastic program with recourse. Because only small instances of these models can be solved exactly in reasonable time, several heuristics that combine genetic evolutionary search with operations research techniques are developed to solve realistic-size problems. The production models are validated, and the performance of the heuristics tailored to these applications is investigated, by applying the solution approaches to a case study of cordless drills.
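
For context, a two-stage stochastic program with recourse (the framework named above) can be sketched in generic notation; the symbols here are illustrative, not the thesis's exact formulation:

\min_{x \ge 0} \; c^{\top} x + \mathbb{E}_{\xi}\left[ Q(x, \xi) \right],
\qquad
Q(x, \xi) = \min_{y \ge 0} \left\{ q(\xi)^{\top} y \;:\; W y \ge h(\xi) - T x \right\}.

In the platform setting, the first-stage variables x would represent platform quantities committed before demand is known, the random vector \xi would represent product demands, and the recourse cost Q(x, \xi) would capture the shortage cost of lost demand and the holding cost of surplus platforms once demand is realized.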
22

Simultaneously lifting sets of variables in binary Knapsack problems

Sharma, Kamana January 1900 (has links)
Master of Science / Department of Industrial & Manufacturing Systems Engineering / Todd W. Easton / Integer programming (IP) has been, and continues to be, widely used by industry to minimize cost and manage resources effectively. Faster computers and innovative IP techniques have enabled the solution of many large-scale IPs. However, IPs are NP-hard, and many require exponential time to solve. Lifting is one of the most widely used techniques for reducing computational time and is applied extensively in today's commercial IP software. Lifting was first introduced by Gomory for bounded integer programs, and a theoretical but computationally intractable technique for simultaneously lifting sets of variables was introduced by Zemel in 1978. This thesis presents a new algorithm, the Maximal Simultaneous Lifting Algorithm (MSLA), to simultaneously uplift sets of binary integer variables into a cover inequality. The resulting lifted inequalities are strong and are facet defining under fairly moderate assumptions. A computational study shows that this algorithm can find numerous strong inequalities for random knapsack (KP) instances. The preprocessing time observed for these instances is less than 1/50th of a second, which is negligible. These simultaneously lifted inequalities are easy to find, and incorporating them as cuts into KP instances reduced solution time by an average of 41%. Implementing MSLA should therefore be highly beneficial for large real-world problems.
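
As a hypothetical textbook illustration of cover inequalities and lifting (a single-variable lifting, not the simultaneous scheme developed in the thesis), consider the binary knapsack constraint

6x_1 + 5x_2 + 5x_3 + 4x_4 \le 11, \qquad x_i \in \{0, 1\}.

The set {1, 2, 3} is a minimal cover (6 + 5 + 5 = 16 > 11), yielding the cover inequality x_1 + x_2 + x_3 \le 2. Lifting x_4 strengthens it: if x_4 = 1, only 7 units of capacity remain, so at most one of x_1, x_2, x_3 can also be chosen, and the lifted inequality x_1 + x_2 + x_3 + x_4 \le 2 is valid and strictly stronger. Simultaneous lifting determines coefficients for an entire set of variables at once rather than one variable at a time, which is the kind of operation MSLA performs.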
23

Exact synchronized simultaneous uplifting over arbitrary initial inequalities for the knapsack polytope

Beyer, Carrie Austin January 1900 (has links)
Master of Science / Department of Industrial & Manufacturing Systems Engineering / Todd W. Easton / Integer programs (IPs) are mathematical models that can provide an optimal solution to a variety of problems, and they have been used to reduce costs and optimize organizations. However, IPs are NP-complete, and many instances cannot be solved in a reasonable amount of time. Cutting planes, or valid inequalities, have been used to decrease the time required to solve IPs. Lifting is a technique that strengthens existing valid inequalities; it can produce facet defining inequalities, which are the theoretically strongest valid inequalities. Because of these properties, lifting procedures are used in software to reduce the time required to solve an IP. This thesis introduces a new algorithm for exact synchronized simultaneous uplifting over an arbitrary initial inequality for knapsack problems. Synchronized Simultaneous Lifting (SSL) is a pseudopolynomial-time algorithm requiring O(nb + n^3) effort. It exactly uplifts two sets simultaneously into an arbitrary initial valid inequality and creates multiple inequalities of a particular form. This previously undiscovered class of inequalities generated by SSL can be facet defining. A small computational study shows that SSL is quick to execute, requiring on average less than a quarter of a second. Additionally, applying SSL inequalities to a knapsack problem enabled commercial software to solve problems that it could not solve without them.
24

Optimization and resource management in wireless sensor networks

Roseveare, Nicholas January 1900 (has links)
Doctor of Philosophy / Department of Electrical and Computer Engineering / Balasubramaniam Natarajan / In recent years, there has been a rapid expansion in the development and use of low-power, low-cost wireless modules with sensing, computing, and communication functionality. A wireless sensor network (WSN) is a group of these devices networked together wirelessly. Wireless sensor networks have found widespread application in infrastructure, environmental, and human health monitoring, surveillance, and disaster management. While there are many interesting problems within the WSN framework, we address the challenge of energy availability in a WSN tasked with a cooperative objective. We develop approximation algorithms and carry out an analysis of concave utility maximization in resource-constrained systems. Our analysis motivates a unique algorithm, which we apply to resource management in WSNs. We also investigate energy harvesting as a way of improving system lifetime, and we analyze the effect of using these limited and stochastically available communication resources on the convergence of decentralized optimization techniques. The main contributions of this research are: (1) new optimization formulations that explicitly consider the energy states of a WSN executing a cooperative task; (2) several analytical insights regarding the distributed optimization of resource-constrained systems; (3) a varied set of algorithmic solutions, some novel to this work and others based on extensions of existing techniques; and (4) an analysis of the effect of using stochastic resources (e.g., energy harvesting) on the performance of decentralized optimization methods. Throughout this work, we apply our developments to distribution estimation and rate maximization. The simulation results obtained verify algorithm performance. This research provides valuable intuition concerning the trade-offs between energy conservation and system performance in WSNs.
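
A generic form of the resource-constrained concave utility maximization analyzed here, with illustrative (not thesis-specific) notation, is

\max_{r \ge 0} \; \sum_{i=1}^{n} U_i(r_i)
\quad \text{subject to} \quad g_i(r_i) \le e_i, \qquad i = 1, \dots, n,

where U_i is a concave utility (for example, an achievable rate) for node i, g_i maps its operating point to the energy consumed, and e_i is the energy available to that node, possibly replenished stochastically by harvesting.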
25

An employee assignment optimization model exploring cross-training and specialization through multiple management strategies

Wipperfurth, Christy January 1900 (has links)
Master of Agribusiness / Department of Agricultural Economics / Jason Bergtold / Company managers continually face challenges in the market, such as increased demand for their services and variability in the types of service requested. In addition, managers may face internal challenges during periods of adjustment, such as moving the company forward through a hiring freeze. In these situations, a manager must be able to allocate scarce resources in a way that allows the company to continue to perform. For employees, this could mean specializing in tasks or increasing cross-training to improve work schedule flexibility. The objective of this research is to determine the optimal allocation of employees to tasks, given resource constraints and the need for staff flexibility, to satisfy alternative management strategies. The setting is the service industry, in particular a laboratory providing testing and consulting services. An optimization model was developed to incorporate key aspects of the company's operation and to determine how labor should be allocated among tasks, and for how many hours, to satisfy the manager's objective. The model estimates the optimal allocation of labor and how much production and net revenue would be generated with more specialized employees. A sensitivity analysis was employed to determine the impact of cross-training current staff. Results indicate that cross-training affords flexibility; however, the impact on overall production varies depending on the employee trained. The highest benefit is derived from training a lower-producing employee in a high-value task at a high productivity rate. Specialization can improve productivity and net returns for higher-valued tasks, but may limit flexibility, as employees cannot switch between tasks as readily.
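
The following is a minimal sketch of the kind of labor-allocation model described above, written with the open-source PuLP library; the employees, tasks, hourly rates, and hour limits are hypothetical, and the thesis model incorporates considerably more operational detail.

from pulp import LpProblem, LpMaximize, LpVariable, lpSum, value

# Hypothetical data: net revenue per hour for each employee-task pair
# (a rate of 0 means the employee is not trained for that task).
employees = ["E1", "E2", "E3"]
tasks = ["testing", "consulting"]
rate = {("E1", "testing"): 40, ("E1", "consulting"): 0,
        ("E2", "testing"): 35, ("E2", "consulting"): 55,
        ("E3", "testing"): 0,  ("E3", "consulting"): 60}
hours_available = {"E1": 40, "E2": 40, "E3": 40}
demand_hours = {"testing": 60, "consulting": 50}

model = LpProblem("employee_assignment", LpMaximize)
# h[e, t] = hours employee e spends on task t
h = {(e, t): LpVariable(f"h_{e}_{t}", lowBound=0) for e in employees for t in tasks}

# Objective: maximize net revenue across all assignments.
model += lpSum(rate[e, t] * h[e, t] for e in employees for t in tasks)
# Each employee is limited to his or her available hours.
for e in employees:
    model += lpSum(h[e, t] for t in tasks) <= hours_available[e]
# Hours worked on a task cannot exceed the demand for that task.
for t in tasks:
    model += lpSum(h[e, t] for e in employees) <= demand_hours[t]

model.solve()
for (e, t), var in h.items():
    if var.value():
        print(e, t, var.value())
print("net revenue:", value(model.objective))

Cross-training scenarios can be explored by changing which rate entries are nonzero, mirroring the sensitivity analysis described in the abstract.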
26

Optimal mobility patterns in epidemic networks

Nirkhiwale, Supriya January 1900 (has links)
Master of Science / Department of Electrical and Computer Engineering / Caterina M. Scoglio / Disruption-tolerant networks, or opportunistic networks, are a class of networks in which there is no contemporaneous path from source to destination; in other words, they are networks with intermittent connectivity. They are generally sparse or highly mobile wireless networks. Each node has a limited radio range, and the connections between nodes may be disrupted by node movement, hostile environments, or power-saving sleep schedules. A common example is a sensor network monitoring the environment, a military field, or a herd of animals under study. Epidemic routing is a widely proposed routing mechanism for data propagation in this type of network. Under this mechanism, the source copies its packets to all the nodes it meets within its radio range; these nodes in turn copy the received packets to the other nodes they meet, and so on. The data to be transmitted thus travels in a way analogous to the spread of an infection in a biological network. The destination finally receives the packet, and measures are taken to eradicate the packet from the network. Routing in epidemic networks must minimize delivery delay while limiting resource consumption: every node has severe power constraints, and the network is susceptible to temporary but random node failures. In previous work, the mobility parameter has been treated as a constant for a given setting; here we allow it to vary. Within this framework, we determine the optimal mobility pattern and forwarding policy that a network should follow in order to balance the trade-off between delivery delay and power consumption, while requiring that the mobility pattern be practical to implement. We formulate an optimization problem that is solved using the principles of dynamic programming. We have tested the optimal algorithm through extensive simulations, which show that the optimization problem has a global solution.
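
For context, analyses of epidemic routing commonly rest on a mean-field model of packet spreading; a standard form (illustrative here, not necessarily the exact dynamics used in the thesis) is

\frac{dI(t)}{dt} = \beta(t)\, I(t) \left( N - I(t) \right), \qquad I(0) = 1,

where N is the number of nodes, I(t) is the number of nodes carrying a copy of the packet, and \beta(t) is the pairwise meeting rate, which grows with node mobility. Letting \beta(t) and the forwarding policy vary over time is what turns the delay-versus-energy trade-off into a dynamic-programming problem.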
27

Pricing American options with jump-diffusion by Monte Carlo simulation

Fouse, Bradley Warren January 1900 (has links)
Master of Science / Department of Industrial & Manufacturing Systems Engineering / Chih-Hang Wu / In recent years, stock markets have shown tremendous volatility, with significant spikes and drops in stock prices. Within the past decade there have been numerous jumps in the market; one key example was September 17, 2001, when the Dow Jones industrial average dropped 684 points following the 9-11 attacks on the United States. These evident jumps show the inaccuracy of the Black-Scholes model for pricing options. Merton provided the first research to address this problem in 1976, when he extended the Black-Scholes model to include jumps in the market. More recently, Kou showed that the distribution of jump sizes used in Merton's model does not efficiently capture the actual movements of the markets, and he modified Merton's model by changing the jump-size distribution from a normal distribution to a double exponential distribution. Kou's research uses mathematical equations to estimate the value of an American put option where the underlying stock follows a jump-diffusion process. The research in this thesis extends Kou's work by using Monte Carlo simulation (MCS) coupled with least-squares regression to price this type of American option. MCS provides a continuous exercise and pricing region, which is a distinct difference from, and advantage over, other analytical techniques. The aim of this research is to investigate whether MCS is an efficient means of pricing American put options whose underlying stock undergoes a jump-diffusion process. The thesis also extends the simulation to use copulas in the pricing of baskets containing several such American options. A copula builds a joint distribution from the individual marginal distributions and provides an efficient means of modeling multiple options and the correlation between them. The research shows that MCS accurately prices American put options where the underlying stock follows a jump-diffusion, and that the approach can be extended with copulas to price baskets of such options. Numerical examples are presented for both parts to illustrate the results obtained by using MCS to price options in both single-dimensional and multidimensional problems.
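
The following is a compact, self-contained sketch of the pricing approach described above: simulate jump-diffusion paths and apply least-squares regression (the Longstaff-Schwartz method) to value an American put. All parameter values are hypothetical, the jump sizes here are lognormal (Merton-style) rather than Kou's double exponential, and the regression basis is deliberately simple.

import numpy as np

def american_put_lsm(S0=100.0, K=100.0, r=0.05, sigma=0.2, T=1.0,
                     lam=0.5, mu_j=-0.1, sigma_j=0.15,
                     steps=50, paths=100_000, seed=0):
    """Least-squares Monte Carlo price of an American put under a
    Merton jump-diffusion (lognormal jump sizes)."""
    rng = np.random.default_rng(seed)
    dt = T / steps
    # Risk-neutral drift compensation for the jump component.
    kappa = np.exp(mu_j + 0.5 * sigma_j**2) - 1.0
    drift = (r - 0.5 * sigma**2 - lam * kappa) * dt

    # Simulate log-price increments: diffusion plus compound Poisson jumps.
    S = np.full(paths, S0)
    S_paths = np.empty((steps + 1, paths))
    S_paths[0] = S
    for t in range(1, steps + 1):
        z = rng.standard_normal(paths)
        n_jumps = rng.poisson(lam * dt, paths)
        jump = n_jumps * mu_j + np.sqrt(n_jumps) * sigma_j * rng.standard_normal(paths)
        S = S * np.exp(drift + sigma * np.sqrt(dt) * z + jump)
        S_paths[t] = S

    # Backward induction with regression on in-the-money paths.
    disc = np.exp(-r * dt)
    cash = np.maximum(K - S_paths[-1], 0.0)   # payoff if held to maturity
    for t in range(steps - 1, 0, -1):
        cash *= disc
        St = S_paths[t]
        exercise = np.maximum(K - St, 0.0)
        itm = exercise > 0
        if itm.any():
            # Regress discounted continuation value on a quadratic basis of S.
            X = np.vander(St[itm], 3)
            coef, *_ = np.linalg.lstsq(X, cash[itm], rcond=None)
            continuation = X @ coef
            ex_now = exercise[itm] > continuation
            idx = np.where(itm)[0][ex_now]
            cash[idx] = exercise[itm][ex_now]
    return disc * cash.mean()

if __name__ == "__main__":
    print(round(american_put_lsm(), 3))

Replacing the lognormal jump draw with a double exponential draw (and adjusting the compensator kappa accordingly) would move this sketch from Merton's dynamics toward Kou's.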
28

Cliqued holes and other graphic structures for the node packing polytope

Conley, Clark Logan January 1900 (has links)
Master of Science / Department of Industrial & Manufacturing Systems Engineering / Todd W. Easton / Graph theory is a widely studied topic. A graph is defined by two features: nodes and edges. Nodes can represent people, cities, variables, resources, or products, while edges represent relationships between pairs of nodes. Using graphs to solve problems has played a major role in a diverse set of industries for many years. Integer programs (IPs) are mathematical models used to optimize a problem, often by maximizing the utilization of resources or minimizing waste. IPs are most notably used when resources must take integer values, or cannot be split, and they have been utilized by many companies for resource distribution, scheduling, and conflict management. The node packing, or independent set, problem is a common combinatorial optimization problem: the objective is to select the maximum number of nodes in a graph such that no two selected nodes are adjacent. Node packing has been applied to a wide variety of problems, including vehicle routing and machine scheduling. This thesis introduces several new graph structures (the cliqued hole, the odd bipartite hole, and the odd k-partite hole) and their corresponding valid inequalities for the node packing polyhedron. These inequalities are shown to be new, and conditions are provided under which they are facet defining, the strongest class of valid inequalities. Practitioners can use these new valid inequalities to help solve node packing instances and integer programs.
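
For reference, node packing on a graph G = (V, E) has the standard integer programming formulation

\max \sum_{i \in V} x_i
\quad \text{subject to} \quad
x_i + x_j \le 1 \;\; \forall \{i, j\} \in E,
\qquad x_i \in \{0, 1\} \;\; \forall i \in V.

A classical example of a graph-based valid inequality for this polytope is the odd hole inequality: for a chordless cycle H with an odd number of nodes, \sum_{i \in H} x_i \le (|H| - 1)/2 is valid, since an odd cycle contains at most (|H| - 1)/2 pairwise non-adjacent nodes. The cliqued hole, odd bipartite hole, and odd k-partite hole inequalities are defined in the thesis itself; the odd hole is shown here only to indicate the flavor of such structural inequalities.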
29

Synchronized simultaneous lifting in binary knapsack polyhedra

Bolton, Jennifer Elaine January 1900 (has links)
Master of Science / Department of Industrial & Manufacturing Systems Engineering / Todd W. Easton / Integer programs (IPs) are used by companies and organizations across the world to reach financial and time-related goals, most often through optimal resource allocation and scheduling. Unfortunately, integer programs are computationally difficult to solve, and in some cases the optimal solutions remain unknown even with today's advanced computing machines. Lifting is a technique often used to decrease the time required to solve an IP to optimality. Lifting begins with a valid inequality and strengthens it by changing the coefficients of variables in the inequality; often, this technique results in facet defining inequalities, which are the theoretically strongest inequalities. This thesis introduces a new type of lifting called synchronized simultaneous lifting (SSL). SSL allows multiple sets of simultaneously lifted variables to be lifted together, which generates a new class of inequalities that previously would have required an oracle to find. Additionally, this thesis describes an algorithm to perform synchronized simultaneous lifting for a binary knapsack inequality, called the Synchronized Simultaneous Lifting Algorithm (SSLA). SSLA is a quadratic-time algorithm that exactly simultaneously lifts two sets of simultaneously lifted variables. Short computational studies show that SSLA can sometimes solve IPs to optimality that CPLEX, an advanced integer programming solver, cannot solve alone. Specifically, the SSL cuts allowed a 76 percent improvement over CPLEX alone.
30

The role of sourcing agents in global apparel supply chains: an exploratory study

Cook, Celeste Nicole January 1900 (has links)
Master of Science / Department of Apparel, Textiles, and Interior Design / Joy Kozar / Global apparel supply chains require collaboration from many sectors, including consumers, retail firms, manufacturing firms, and factories. This study explored the role of sourcing agents as a facilitating sector of global apparel supply chains. A quantitative analysis of U.S. Department of Commerce import data for four apparel categories revealed changes in import volume over the last 10 years for all major trade regions of the world. Qualitative data were also collected to further analyze how industry professionals perceive the role of sourcing agents in facilitating the movement of apparel production. Participants revealed that sourcing agents break down cultural and geographic barriers between U.S. firms and offshore factories, providing a competitive advantage to U.S. apparel firms by reducing production costs and facilitating production of complex designs, but lack any form of formal regulation.
