91

LEAN MANUFACTURING TOOLS AND TECHNIQUES IN THE PROCESS INDUSTRY WITH A FOCUS ON STEEL

Abdullah, Fawaz Mohammed 03 September 2003 (has links)
This research addresses the application of lean manufacturing concepts to the continuous production/process sector, with a focus on the steel industry. The goal of this research is to investigate how lean manufacturing tools can be adapted from the discrete to the continuous manufacturing environment and to evaluate their benefits in a specific application instance. Although the process and discrete industries share several common characteristics, at the extremes each has its own distinct traits. This research attempts to identify commonalities between discrete and continuous manufacturing where lean techniques from the discrete side are directly applicable. The ideas are tested at a large steel manufacturing company (referred to as ABS). Value stream mapping is used first to map the current state, then to identify sources of waste and the lean tools that can eliminate that waste. A future state map is then developed for the system with the lean tools applied. To quantify the benefits gained from applying lean tools and techniques in the value stream map, a detailed simulation model of ABS is developed, and a designed experiment is used to analyze the simulation outputs under different lean configurations. Generalizations of the results are also provided.
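To make the experimental idea concrete, here is a minimal sketch of a designed experiment run against a toy two-station line rather than the detailed ABS model; the setup times, batch sizes, and processing assumptions are illustrative, not figures from the thesis:

```python
import itertools
import random

def simulate_line(setup_time, batch_size, n_jobs=200, seed=0):
    """Toy flow-line simulation; returns average job flow time.

    Hypothetical stand-in for the detailed ABS simulation model:
    each batch pays one setup, then its jobs are processed serially.
    """
    rng = random.Random(seed)
    clock, total_flow = 0.0, 0.0
    for start in range(0, n_jobs, batch_size):
        clock += setup_time                    # one setup per batch
        for _ in range(min(batch_size, n_jobs - start)):
            clock += rng.expovariate(1.0)      # mean-1.0 processing time
            total_flow += clock                # jobs released at time 0
    return total_flow / n_jobs

# 2^2 full-factorial design: setup reduction (SMED) x batch size
for setup, batch in itertools.product([5.0, 0.5], [50, 10]):
    reps = [simulate_line(setup, batch, seed=r) for r in range(5)]
    print(f"setup={setup:4} batch={batch:3} mean flow time={sum(reps)/len(reps):8.1f}")
```

Comparing the factorial cells this way mirrors, in miniature, how the designed experiment quantifies each lean configuration's effect on the simulated value stream.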
92

Assembly Operation Tools for e-Product Design and Realization

Kim, Kyoung-Yun 03 September 2003 (has links)
True competitive advantage can only result from the ability to bring highly customized, quality products to market at lower cost and in less time. Many customers demand customization and rapid delivery of innovative products. Industries now realize that the best way to reduce life-cycle costs is to evolve a more effective product development paradigm using the Internet and web-based technologies. Yet there remains a gap between these market demands and current product development paradigms. Assembly plays a very important role in manufacturing industries, given that joints on a structure are inevitable because of limitations on component geometric configurations and material properties, along with various engineering requirements. Appropriate joints should be determined by considering mechanical and mathematical implications and assembly/joining knowledge. Currently, the effects of joining are analyzed only upon completion of assembly modeling. This sequential process is arduous and time-consuming, and it is eliminated with the tools developed in this work. Existing CAD systems require that a product developer possess all the design and analysis tools in-house, making it impractical to employ all of the needed and newest tools. Existing assembly design methodologies have limitations in capturing the non-geometric aspects of a designer's intent and the physical effects of joining in an Internet-based product development environment. In this work, new assembly design (AsD) frameworks and assembly operation tools (AOT) are developed to integrate AsD, virtual analysis, and decision making for e-product design and realization. The AOT include the assembly design (AsD), assembly implication (AsI), and assembly advisory (AsA) engines. The AsD formalism, which is the base of the AsD engine, represents assembly/joining relations symbolically for computer interpretation, and the automatically generated AsD model is used for inferring mathematical/physical implications as well as for lean AsD information exchange. A new virtual assembly analysis concept is introduced to transparently predict the various effects of joining and is implemented in a service-oriented architecture. The AsA engine employs a hierarchical semantic net to support AsD decisions by capturing AsD information and assembly/manufacturing knowledge. The concepts and AOT are validated using a case study of realistic mechanical assemblies.
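The hierarchical semantic net behind the AsA engine can be pictured as a labeled graph linking parts, joints, and requirements. The sketch below is a simplified illustration of that idea; the node kinds, relation labels, and welded-bracket example are hypothetical, not the thesis' actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """A concept in the semantic net: a part, joint, or requirement."""
    name: str
    kind: str                   # e.g. "part", "joint", "requirement"
    relations: list = field(default_factory=list)   # (label, target) pairs

def relate(src: Node, label: str, dst: Node) -> None:
    src.relations.append((label, dst))

# Hypothetical fragment: a bracket welded to a frame, with a strength requirement
bracket = Node("bracket", "part")
frame   = Node("frame", "part")
weld    = Node("fillet_weld_1", "joint")
req     = Node("min_shear_strength", "requirement")

relate(weld, "joins", bracket)
relate(weld, "joins", frame)
relate(weld, "must_satisfy", req)

# Traversal: what does a given joint connect and constrain?
for label, target in weld.relations:
    print(f"{weld.name} --{label}--> {target.name}")
```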
93

Constraint-Enabled Design Information Representation for Mechanical Products Over the Internet

Wang, Yan 03 September 2003 (has links)
The global economy has made the manufacturing industry more distributed than ever before. Product design requires more involvement from various technical disciplines at different locations. In such a geographically and temporally distributed environment, efficient and effective collaboration on design is vital to maintaining product quality and organizational competency. Interoperability of design information is one of the major barriers to collaborative design. Current standard CAD data formats do not support design collaboration effectively in terms of capturing, exchanging, and integrating design information and knowledge within the design cycle. Multidisciplinary design constraints cannot be represented and transferred among different groups, and design information cannot be integrated efficiently within a distributed environment. Uncertainty of specification cannot be modeled at early design stages, and constraints for optimization are not embedded in design data. In this work, a design information model, the Universal Linkage model, is developed to represent design-related information for mechanical products in a distributed form. It incorporates geometric and non-geometric constraints with traditional geometry and topology elements, thus allowing more design knowledge to be shared in collaborative design. Segments of design data are linked and integrated into a complete product model, thus supporting lean capture, storage, and query of design information. The model is represented by a directed hypergraph and a Product Markup Language to preserve extensibility and openness. To incorporate robustness considerations, an Interval Geometric Modeling scheme is presented, in which numerical parameters are represented by interval values. This scheme is able to capture the uncertainty and inexactness of design and reduces the chance of conflict in constraint imposition. It provides a unified constraint representation across conceptual design, detailed design, and design optimization. Corresponding interval constraint-solving methods are studied.
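The core of the Interval Geometric Modeling scheme, parameters carried as interval values so that uncertainty propagates through computations, can be illustrated with ordinary interval arithmetic. This is a simplified sketch, not the thesis' constraint solver, and the shaft/bore example is hypothetical:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    lo: float
    hi: float

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        products = (self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi)
        return Interval(min(products), max(products))

    def intersects(self, other) -> bool:
        """A constraint such as 'dimensions must match' remains
        satisfiable as long as the two intervals overlap."""
        return self.lo <= other.hi and other.lo <= self.hi

# An uncertain shaft diameter and a mating bore, each known only to a tolerance
shaft = Interval(9.95, 10.05)
bore  = Interval(10.00, 10.10)
print(shaft.intersects(bore))        # True: the design is still consistent
print(shaft + Interval(0.1, 0.2))    # uncertainty propagates through a sum
```

Because intervals stay wide early on and narrow as the design is refined, the same representation serves conceptual design, detailed design, and optimization.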
94

THE STOCHASTIC UNIT COMMITMENT PROBLEM: A CHANCE CONSTRAINED PROGRAMMING APPROACH CONSIDERING EXTREME MULTIVARIATE TAIL PROBABILITIES

Ozturk, Ugur Aytun 03 September 2003 (has links)
Reliable power production is critical to the profitability of electric utilities. Power generators (units) need to be scheduled efficiently to meet the electricity demand (load). This dissertation develops a solution method for scheduling units to produce electricity while determining the estimated amount of surplus power each unit should produce, taking into consideration the stochasticity of the load and its correlation structure. This scheduling problem is known in the power industry as the unit commitment problem. The solution method developed here can handle the presence of wind power plants, which create additional uncertainty. The problem assumes an isolated system without access to an electricity market; in such a system, the utility must specify the probability level at which the system should operate. This is taken into consideration by solving a chance-constrained program. Instead of using a set level of energy reserve, the chance-constrained model determines the level probabilistically, which is superior to using an arbitrary approximation. The Lagrangian relaxation technique is used to separate the master problem into its subproblems, with a subgradient method employed to update the Lagrange multipliers. To achieve this, a computer program is developed that solves the optimization problem, including a forward-recursion dynamic program for the unit subproblems. An externally developed program is used to evaluate high-dimensional multivariate normal probabilities, and optimization software is employed to solve the quadratic programs of the period subproblems. The results indicate that the load correlation is significant and cannot be ignored when determining a schedule for a utility's pool of units. It is also concluded that choosing an arbitrary level of energy reserve when solving the unit commitment problem is very risky. To verify the effectiveness of the optimal unit commitment schedules produced by the chance-constrained optimization algorithm and to determine the expected operating costs, Monte Carlo simulation is used, generating realized loads according to the assumed multivariate normal distribution with a specific correlation structure.
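The subgradient update for the Lagrange multipliers has a standard form, sketched below on a toy deterministic system. The real unit subproblems are forward-recursion dynamic programs and the chance constraint requires multivariate normal probabilities; here each unit simply runs at capacity when its period multiplier exceeds its marginal cost, and all data are illustrative:

```python
import numpy as np

# Toy system: 3 units with linear cost and capacity; 4 demand periods
cap  = np.array([100.0, 80.0, 50.0])          # MW
cost = np.array([20.0, 30.0, 50.0])           # $/MWh marginal cost
load = np.array([120.0, 150.0, 90.0, 60.0])   # MW per period

lam = np.zeros(len(load))                     # one multiplier per period
for k in range(1, 201):
    # Relaxed unit subproblem: a unit runs at capacity iff lam > its cost
    gen = np.where(lam[None, :] > cost[:, None], cap[:, None], 0.0)
    surplus = gen.sum(axis=0) - load          # minus the dual subgradient
    lam = np.maximum(0.0, lam - (0.5 / k) * surplus)   # diminishing step

print("multipliers (approach marginal prices):", np.round(lam, 1))
```

With a diminishing step size, the multipliers oscillate toward the per-period marginal prices, which is the behavior the master/subproblem decomposition exploits.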
95

An Examination of the Economic Benefits of ISO 9000 and the Baldrige Award to Manufacturing Firms

Wilson, James Patrick 09 June 2004 (has links)
This thesis examines the financial data of manufacturing companies that are ISO 9000 certified and of winners in the manufacturing category of the Malcolm Baldrige National Quality Award to determine whether the benefit of receiving certification or winning the award is economically attractive. A literature review shows the limited number of quantitative analyses that have been conducted on this subject and provides the sources of raw data used in the thesis. An analysis of the costs and benefits associated with registration is performed for ISO 9000, while stock performance is examined for the Baldrige Award winners. Results show that the economic success reported by companies that received ISO 9000 certification or the Baldrige Award may be exaggerated, and that this success certainly cannot be guaranteed. Recommendations for further study and a simple program design for a summative evaluation of the Baldrige-Award-winning companies are also suggested for future research.
96

A Neural Network Approach for Multi-Attribute Process Control with Comparison of Two Current Techniques and Guidelines for Practical Use

Larpkiattaworn, Siripen 02 February 2004 (has links)
Both manufacturing and service industries deal with quality characteristics, which include not only variables but attributes as well. In the area of quality control, there has been substantial research on correlated variables (i.e., multivariate control charts); however, little work has been done on correlated attributes. Controlling the product or service quality of a multi-attribute process raises several issues. When the process is monitored by a set of uni-attribute control charts, a high number of false alarms (Type I errors) occur and the probability of not detecting defects increases. Furthermore, plotting and monitoring several uni-attribute control charts creates additional work for quality personnel. To date, no standard method for constructing a multi-attribute control chart has been fully evaluated. In this research, three techniques for simultaneously monitoring correlated process attributes are compared: the normal approximation, the multivariate np chart (MNP chart), and a newly proposed neural network technique. The normal approximation is a technique for approximating multivariate binomial and Poisson distributions as normal distributions. The MNP chart is based on traditional Shewhart control charts and is designed for multiple-attribute processes. Finally, a backpropagation neural network technique has been developed for this research. Each technique should be capable of identifying an out-of-control process while considering all correlated attributes simultaneously. To compare the three techniques, an experiment was designed for two correlated attributes, consisting of three levels of proportion nonconforming p, three values of the correlation matrix, three sample sizes, and three magnitudes of shift of the proportion nonconforming in either the positive or negative direction. Each technique was evaluated on average run length and on the number of replications in which the direction of shift (positive or negative) was correctly identified. The performance of all three techniques under the varied process conditions is presented and compared. This study observed that no single technique outperforms the other two for all process conditions. To select a suitable technique, a user must be knowledgeable about the nature of the process and understand the risks associated with committing Type I and Type II errors. Guidelines for how best to select and use multi-attribute process control techniques are provided.
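The normal-approximation technique can be sketched for two correlated attributes: the vector of nonconforming counts is treated as approximately multivariate normal, and a Mahalanobis-type statistic is compared against a chi-square limit. The sample size, proportions, and correlation below are illustrative, and the statistic is a generic version of the idea rather than the exact charting procedure evaluated in the thesis:

```python
import numpy as np
from scipy.stats import chi2

def t2_statistic(counts, n, p, corr):
    """Mahalanobis-type statistic for a vector of nonconforming counts,
    using the normal approximation to correlated binomial counts."""
    p = np.asarray(p)
    mean = n * p
    std = np.sqrt(n * p * (1 - p))
    cov = corr * np.outer(std, std)       # Cov_ij = rho_ij * sd_i * sd_j
    d = np.asarray(counts) - mean
    return d @ np.linalg.solve(cov, d)

n, p = 200, [0.05, 0.08]                  # sample size, in-control proportions
corr = np.array([[1.0, 0.6], [0.6, 1.0]])
ucl = chi2.ppf(0.9973, df=2)              # in-control limit, ~3-sigma coverage

print(t2_statistic([12, 17], n, p, corr) > ucl)   # is this sample flagged?
```

A single statistic per sample replaces two separate uni-attribute charts, which is what keeps the overall false-alarm rate under control.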
97

USING SIMULATION TO EXAMINE CUTTING POLICIES FOR A STEEL FIRM

Olsen, Susan Marie 02 February 2004 (has links)
Minimizing the cost of filling demand is a problem that reaches back to the foundations of operations research. Here we use simulation to investigate various heuristic policies for a one-dimensional, guillotine cutting stock problem with stochastic demand and multiple supply and demand locations. The policies investigated range from a random selection of feasible pieces, to a more strategic search of pieces of a specific type, to a new policy using dual values from a linear program that models a static, deterministic demand environment. We focus on an application in the steel industry, and we use real data in our model. We show that simulation can effectively model such a system, and further we exhibit the relative performance of each policy. Our results demonstrate that this new policy provides statistically significant savings over the other policies investigated.
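The dual-value idea can be sketched as a scoring rule: a linear program for the static, deterministic problem yields a dual price for each demanded length, and cuts are chosen to maximize the dual value recovered per unit of stock consumed. The greedy routine and all data below are illustrative assumptions, not the policy's exact implementation:

```python
def best_pattern(stock_len, lengths, duals):
    """Greedily cut one stock piece, ranking demanded lengths by
    dual value per unit length (hypothetical scoring rule)."""
    order = sorted(range(len(lengths)),
                   key=lambda i: duals[i] / lengths[i], reverse=True)
    remaining, cuts = stock_len, []
    for i in order:
        while lengths[i] <= remaining:
            cuts.append(lengths[i])
            remaining -= lengths[i]
    return cuts, remaining            # cut list and leftover scrap

# Demanded lengths, with dual prices from a deterministic LP relaxation
lengths = [3.7, 2.1, 1.4]
duals   = [9.0, 6.5, 2.0]
print(best_pattern(10.0, lengths, duals))
```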
98

MULTIPHYSICS ANALYSIS AND OPTIMIZATION OF 3 DIMENSIONAL PRINTING TECHNOLOGY USING NANO FLUIDIC SUSPENSIONS

Desai, Salil S 13 September 2004 (has links)
Fabrication of micro and nano devices is of prime significance to the area of Micro-Electro-Mechanical Systems (MEMS). Attempts have been made to accommodate high-performance devices in compact units, thus reducing their overall size. A variety of microfabrication techniques are in use today, including lithography, chemical vapor deposition, and LIGA. Manufacturing costs associated with these processes can be prohibitive due to cycle time and the precious-material loss that occurs during etching operations. These drawbacks become more significant when building curved traces and support structures that must occur in 3D space. To address the problems associated with building 3-dimensional circuits and devices in free space, a unique manufacturing process has been developed. This process utilizes conductive Nano-Particulate Fluid Jets (NPFJ) that are deposited onto a substrate by a Continuous Inkjet (CIJ) printing methodology. In this method, the fluid jet consists of colloidal suspensions of conductors in carrier fluids that are deposited onto a substrate and later sintered at high temperatures to form a homogeneous material. The major contribution of the present research is the investigation, development, and optimization of the NPFJ. A Computational Fluid Dynamics (CFD) model has been developed to simulate the fluid jet and the CIJ process. The modified CIJ printing process involves the interaction of three domains: electrostatic, structural, and fluidic. A coupled-field analysis of the piezoelectric membrane in the CIJ print head is conducted to establish the perturbation characteristics applied to the fluid. The interaction of the three domains is captured within a single model using a fluid-structure interaction (FSI) algorithm that staggers between domains until convergence is attained. A Design of Experiments approach was used to determine trends in drop formation based on various excitation parameters. Results from these simulations have been validated using an ultra-high-speed camera featuring exposure/delay times from 100 nanoseconds at full sensor resolution. The results of the present research will give manufacturers the freedom to construct 3D devices and circuits that conform to the desired shapes and sizes of products, rather than being limited to present 2D components such as printed circuit boards.
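The staggered FSI coupling can be sketched as a fixed-point loop that passes interface loads and displacements between domain solvers until they stop changing. In the sketch below, simple linear functions stand in for the piezoelectric-membrane and fluid solvers; everything here is schematic:

```python
def staggered_fsi(solve_structure, solve_fluid, tol=1e-8, max_iter=100):
    """Gauss-Seidel staggering between two domains.

    solve_structure(pressure) -> interface displacement
    solve_fluid(displacement) -> interface pressure
    Stand-ins for the membrane FEA and jet CFD models.
    """
    pressure = 0.0
    for it in range(max_iter):
        displacement = solve_structure(pressure)
        new_pressure = solve_fluid(displacement)
        if abs(new_pressure - pressure) < tol:
            return displacement, new_pressure, it
        pressure = new_pressure
    raise RuntimeError("staggered iteration did not converge")

# Toy linear stand-ins: a compliant membrane and a fluid that loads it
disp, p, iters = staggered_fsi(lambda p: 0.1 * (1.0 + p),
                               lambda d: 0.5 * d)
print(disp, p, iters)
```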
99

Optimal Policies for the Acceptance of Living- and Cadaveric-Donor Livers

Alagoz, Oguzhan 13 September 2004 (has links)
Transplantation is the only viable therapy for end-stage liver diseases (ESLD) such as hepatitis B. In the United States, patients with ESLD are placed on a waiting list, and when organs become available, they are offered to the patients on this list. This dissertation focuses on the decision problem faced by these patients: which offer to accept and which to refuse? This decision depends on two major components: the patient's current and future health, and the current and future prospects for organ offers. A recent analysis of liver transplant data indicates that 60% of all livers offered to patients for transplantation are refused. The problem is formulated as a discrete-time Markov decision process (MDP). This dissertation analyzes three MDP models, each representing a different situation. The Living-Donor-Only Model considers the optimal timing of living-donor liver transplantation, which is accomplished by removing an entire lobe of a living donor's liver and implanting it into the recipient. The Cadaveric-Donor-Only Model considers accepting or refusing a cadaveric liver offer when the patient is on the waiting list but has no available living donor; here the effect of the waiting list is incorporated implicitly through the probability of being offered a liver. The Living-and-Cadaveric-Donor Model is the most general: it combines the first two models, in that the patient is both listed on the waiting list and has an available living donor. The patient can accept the cadaveric liver offer, decline it and use the living-donor liver, or decline both and continue to wait. This dissertation derives structural properties of all three models, including several sets of conditions that ensure the existence of intuitively structured policies such as control-limit policies. The computational experiments use clinical data and show that the optimal policy is typically of control-limit type.
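The control-limit structure can be illustrated with a toy version of the Cadaveric-Donor-Only Model solved by value iteration; with illustrative (non-clinical) numbers, the optimal policy accepts every offer at or above some quality threshold and waits below it:

```python
import numpy as np

# Toy model: fixed patient health, liver-quality offers q = 0..4 (higher is
# better). Accepting gives a quality-dependent reward; waiting risks death
# and brings a random i.i.d. offer next period. All numbers are illustrative.
qualities = np.arange(5)
r_accept  = np.array([10.0, 14.0, 18.0, 22.0, 26.0])   # post-transplant value
offer_p   = np.array([0.35, 0.30, 0.20, 0.10, 0.05])   # P(next offer quality)
survive, discount = 0.95, 0.97

v = np.zeros(5)
for _ in range(1000):
    wait = survive * discount * (offer_p @ v)   # expected value of refusing
    v_new = np.maximum(r_accept, wait)          # Bellman optimality equation
    if np.max(np.abs(v_new - v)) < 1e-10:
        break
    v = v_new

wait = survive * discount * (offer_p @ v)
policy = np.where(r_accept >= wait, "accept", "wait")
print(dict(zip(qualities.tolist(), policy)))
```

Running this prints a policy that waits on the two lowest qualities and accepts the rest: the control-limit form the dissertation proves under far more general conditions.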
100

An Integrated, Evolutionary Approach to Facility Layout and Detailed Design

Shebanie, Charles 13 September 2004 (has links)
The unequal-area, shape-constrained facility layout problem is an NP-hard combinatorial optimization problem concerned with minimizing material handling costs. An integrated methodology incorporating a genetic algorithm and a constructive heuristic is developed to simultaneously solve the traditional block layout problem of locating and shaping departments and the detailed design problem of locating the departments' input/output (I/O) stations. These problems have received much attention over the past half-century, with the majority of research focused on solving them individually or sequentially. This thesis aims to show that an integrated methodology that combines the problems and solves them in parallel is preferable to sequential approaches. The complexity of the integrated layout problem is reduced through a Flexbay formulation and through pre-assigned intra-departmental flow types. A genetic algorithm with a two-tiered solution structure generates and maintains a population of block layout solutions throughout an evolutionary process. Genetic operators reproduce and alter solutions to generate better solutions, find new search directions, and prevent premature convergence of the algorithm. An adaptive penalty mechanism guides the search process and reduces the computational overhead of the algorithm. Through the placement of I/O stations, the optimization of a block layout's material flow network is implemented as a subroutine of the genetic algorithm. A contour distance metric is used to evaluate the costs associated with material movement between the I/O stations of departments and aids in constructing practical aisle structures. A constructive placement heuristic places the I/O stations and perturbs them until no further improvement to a layout can be realized. The integrated approach is applied to several well-known problems over a comprehensive test plan. The results from the integrated approach indicate moderate variability in the solutions and considerable computational expense. To compare the integrated methodology with prior methodologies, some of the best results for the unequal-area facility layout problem are selected from prior research and the I/O optimization heuristic is applied to them. The results of the integrated approach uniformly and significantly outperform those obtained through sequential optimization, demonstrating the value of a simultaneous approach to the unequal-area facility layout problem.
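The overall control flow, a genetic algorithm over block layouts with the I/O-station heuristic run inside each fitness evaluation, can be sketched as follows. The encoding, operators, and placeholder cost function below are drastic simplifications of the Flexbay formulation and contour-distance costing described above:

```python
import random

def evaluate(layout):
    """Fitness: material-handling cost after the I/O-station heuristic.
    Placeholder only -- stands in for Flexbay decoding, constructive
    I/O placement, and contour-distance costing."""
    return sum(layout)

def genetic_layout_search(n_depts=10, pop_size=30, generations=200, seed=1):
    rng = random.Random(seed)

    def new_ind():
        return [rng.random() for _ in range(n_depts)]

    pop = [new_ind() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=evaluate)                 # minimize cost
        survivors = pop[: pop_size // 2]       # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n_depts)    # one-point crossover
            child = a[:cut] + b[cut:]
            child[rng.randrange(n_depts)] = rng.random()   # mutate one gene
            children.append(child)
        pop = survivors + children
    return min(pop, key=evaluate)

print(round(evaluate(genetic_layout_search()), 4))
```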
