21

A new framework for manufacturing planning and scheduling in engineered-to-order environments

Jin, Guang, 1955- January 2000
The dynamic characteristics of engineered-to-order (ETO) environments conflict with the basic assumptions of Manufacturing Resource Planning (MRP) systems, which causes MRP-based manufacturing planning systems to fail to meet the needs of ETO companies. ETO environments are complex and require sophisticated planning techniques. In fact, the shortcomings of current planning and scheduling systems hamper the ability of ETO companies to reduce manufacturing cycle times. This thesis presents an overview of ETO environments and related manufacturing planning and scheduling systems. The reasons why present MRP systems fail to address ETO environments are discussed, along with an analysis of the planning systems of real companies. A new framework for manufacturing planning and scheduling in ETO environments is proposed. It addresses the importance of coordinating planning and scheduling among engineering, material procurement and the shop floor. It incorporates finite capacity scheduling into MRP systems to overcome load fluctuations, and it includes scheduling mechanisms that address the dynamic nature of ETO environments.
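The abstract gives no implementation detail, but the finite capacity scheduling idea it names can be illustrated with a minimal, hypothetical sketch: unlike plain MRP, which loads orders against infinite capacity, a finite-capacity loader shifts an order to a later period when its period is already full. All function names, orders and capacities below are invented for illustration.

```python
def finite_capacity_load(orders, capacity, n_periods):
    """orders: list of (earliest_period, hours); returns per-period load and a plan."""
    load = [0.0] * n_periods
    plan = []
    for earliest, hours in sorted(orders):
        period = earliest
        # forward-load: push the order later until a period has spare capacity
        while period < n_periods - 1 and load[period] + hours > capacity:
            period += 1
        load[period] += hours
        plan.append((period, hours))
    return load, plan

load, plan = finite_capacity_load([(0, 30.0), (0, 25.0), (1, 40.0)],
                                  capacity=40.0, n_periods=4)
print(load)   # [30.0, 25.0, 40.0, 0.0] -- the 25-hour order spills into period 1
```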
22

Development of an optical technique for on-line measurement of the thickness distribution of blow moulding parisons

Swan, Philip January 1991
In the extrusion blow moulding process, the strength and weight of a hollow article, such as a bottle, are controlled by an open-loop control process called parison programming. The article thickness is increased and decreased by opening and closing the gap of the annular die from which the parison is extruded. The die gap is regulated according to a gap-time profile which an operator determines during start-up by trial and error.

An optical sensor has been developed which can measure the thickness profile of the parison on-line, just prior to its enclosure in the mould. The device will help operators to program the gap-time profile for optimum use of plastic by providing rapid feedback on the formation of the parison. It also represents an important step toward the development of closed-loop control of container thickness distribution.

The device determines thickness by striking the parison at an angle with a laser beam and measuring the separation between the beams that reflect from the outer and inner surfaces of the parison wall. A prototype was built and tested. The prototype uses three lasers at different angles and can make up to 250 point measurements during a one-second scan. A personal computer uses specially developed software to reconstruct the profile of the parison wall from the raw data with an accuracy of ±5%.
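The measurement principle lends itself to a short worked example. Assuming a locally flat wall and a known refractive index (a textbook thin-slab approximation, not the thesis's actual calibration), the outer-surface and inner-surface reflections exit parallel, separated by d = 2·t·tan(θr)·cos(θi), where θi is the incidence angle and θr the refraction angle from Snell's law. Inverting for thickness:

```python
import math

def wall_thickness(separation_mm, incidence_deg, refractive_index=1.5):
    """Thickness from the separation of the outer- and inner-surface reflections.

    The refracted beam crosses the wall at the refraction angle (Snell's law),
    so the two reflected beams exit parallel, offset by
    d = 2 * t * tan(theta_r) * cos(theta_i).
    """
    theta_i = math.radians(incidence_deg)
    theta_r = math.asin(math.sin(theta_i) / refractive_index)
    return separation_mm / (2.0 * math.tan(theta_r) * math.cos(theta_i))

print(wall_thickness(separation_mm=1.2, incidence_deg=45))  # ~1.6 mm for n = 1.5
```

The refractive index of 1.5 is an assumption typical of commodity plastics; a real sensor would be calibrated against the actual material.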
23

Effect of pre-drawing on formability during cold heading

Ma, Lianzhong, 1968- January 2005
One of the most common industrial cold forging processes is the cold heading of steel wire or rod to produce screws, bolts, nuts and rivets. The process is limited by a complicated interplay of many factors, and cold work (pre-drawing) is one of them. Although several investigations into the effects of pre-drawing on the formability of metals during cold heading have been conducted, so far no attention has been given to numerical simulation of this phenomenon. The current work aims at examining the effects of pre-drawing on formability during cold heading through numerical simulations.

Physical tests from the literature investigating the effects of pre-drawing on the formability of three metals are simulated using ABAQUS 6.4, with three successive FE models: a drawing model, a cutting model and an upsetting model. A new combined linear kinematic/nonlinear isotropic hardening constitutive model is proposed and derived to account for the Bauschinger effect present in reverse plastic deformation. The new model is implemented in ABAQUS/Explicit v6.4 through a user subroutine VUMAT, which is verified by one-element numerical tests under tension, compression and reverse loading conditions. In addition, for comparison, the Johnson-Cook isotropic hardening model is also applied to the materials. The Cockcroft and Latham criterion is employed to predict surface fracture.

Although considerable discrepancies between the experimental and simulation results are observed, the proposed combined hardening model is more accurate than the Johnson-Cook isotropic hardening model in predicting material behavior under reverse loading. In addition, the simulation results show that the proposed combined hardening material model has the potential to correctly predict material behavior in the reverse loading process.
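The Cockcroft and Latham criterion mentioned above has a compact form: fracture is predicted when the integral of the largest principal tensile stress over equivalent plastic strain reaches a material-specific critical value C. A hedged sketch, with an invented hardening curve and no claim to the thesis's actual material data:

```python
import numpy as np

def cockcroft_latham(sigma_1, eps_bar):
    """Integral of max(sigma_1, 0) d(eps_bar), evaluated by the trapezoidal rule."""
    s = np.maximum(np.asarray(sigma_1, dtype=float), 0.0)  # tensile part only
    e = np.asarray(eps_bar, dtype=float)
    return float(np.sum(0.5 * (s[1:] + s[:-1]) * np.diff(e)))

eps = np.linspace(0.0, 0.8, 81)
sigma = 400.0 + 300.0 * eps           # MPa, an invented hardening curve
damage = cockcroft_latham(sigma, eps)
print(damage)                          # compare against a material-specific critical value C
```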
24

Defining display complexity in electric utility system operator displays

McElhaney, Steven Hunt 15 January 2014
In the electric utility industry, displays provide power system operators with information on the status of the system; based on that information, operators make decisions on how to maintain the safe, reliable and efficient operation of the utility generation and transmission grid. Complexity of the data presented, and of the display itself, can lead to errors or misjudgments that cause power system operators to make unwise decisions. The primary goal of this research was to develop a method to quantify display complexity for select displays used by system operators when operating the electric generation and transmission grids. Three studies were performed: (1) complexity measure development, (2) validation of the measure using usability and situation awareness (SA) techniques, and (3) display revisions based on complexity measure findings. Fifteen different complexity metrics were originally considered (additive models, multiplicative models, and combination models with five different weighting schemes). The additive model with equal weighting was found to be the most sensitive in differentiating displays and was used in the later studies. For the validation study, system operators were asked to complete a usability questionnaire and a paper-based SA test using the current displays. Correlation and scatter plot analyses were used to determine whether the complexity metric was related to the usability and SA scores. Results of the validation study indicated that usability and SA scores for the studied displays were not well correlated with the complexity metric. In study 3, the highest- and lowest-scoring displays were redesigned with an emphasis on maintaining functionality while reducing the aspects of complexity that were driving the complexity score. System operators again completed the usability and SA testing using the redesigned displays, and correlation analysis was again performed. As in study 2, usability scores were not correlated with the complexity metric; however, SA scores were significantly correlated. The complexity metric developed here can be used to quantify the complexity of a display and identify redesign opportunities to reduce non-essential information, as displays that are less complex should result in improved operator performance and satisfaction.
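The winning metric is simple enough to state directly: with equal weighting, the additive model reduces to a sum of normalized component measures. The component names and values below are hypothetical, chosen only to show the form of the calculation:

```python
def additive_complexity(measures, weights=None):
    """Sum of (optionally weighted) normalized complexity components."""
    if weights is None:
        weights = {name: 1.0 for name in measures}   # equal weighting
    return sum(weights[name] * value for name, value in measures.items())

display = {"element_count": 0.72, "color_variety": 0.40,
           "line_crossings": 0.55, "text_density": 0.63}
print(additive_complexity(display))   # 2.30 -- a higher score means a more complex display
```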
25

A process comparison algorithm

Nayestani, Naynaz January 2002
The purpose of this research was to design a method for the detailed comparison of processes. Organizations already measure processes and compare these measurements to internal or regulatory standards. The innovation in this thesis is an algorithm for process comparison that accommodates any process, responds to any kind of semantic junction in the process model, deals with uncertain process data, and compares all or part of a process. This is done by activity-by-activity comparison of data and measurements from an actual process against a model of the process.

IDEF3 diagrams were used for process modeling, a modified PERT network technique was used for process comparison, and the technique incorporated fuzzy data and metric comparison.

Solutions of practical problems showed that the algorithm imposes no inherent restrictions on the processes it can compare. While the test data covered time performance only, other metrics, such as cost, quality, human resources or energy, can easily be accommodated.
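The handling of uncertain process data can be illustrated with triangular fuzzy numbers, one common representation. In this sketch (activities, durations and membership functions are all invented; the thesis's algorithm, with IDEF3 junctions and a modified PERT network, is far richer), each model activity carries a triangular fuzzy duration, and an observed duration is scored by its degree of membership:

```python
def triangular_membership(x, low, mode, high):
    """Degree to which x belongs to the triangular fuzzy number (low, mode, high)."""
    if x <= low or x >= high:
        return 0.0
    if x <= mode:
        return (x - low) / (mode - low)
    return (high - x) / (high - mode)

model = {"weld": (4.0, 5.0, 7.0), "inspect": (1.0, 1.5, 2.5)}   # hours, illustrative
observed = {"weld": 5.5, "inspect": 2.2}

for activity, duration in observed.items():
    score = triangular_membership(duration, *model[activity])
    print(f"{activity}: conformance {score:.2f}")   # weld 0.75, inspect 0.30
```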
26

Models for estimating design effort

Bashir, Hamdi A. January 2000
In today's competitive environment, it is necessary to deliver products on time and within budget. Unfortunately, design projects have been plagued by severe cost and schedule overruns, a problem that persists in spite of the significant advances made in design technology over the last two decades. In most cases, overruns are due to poor estimation, and the search for a solution has become even more pressing in the present era of shrinking product cycle times.

Driven primarily by this need, this thesis presents new effort estimation models. Unlike existing estimation techniques, which are based on work breakdown structures with respect to process or product, the proposed models are based on a new metric for product complexity derived from the product's functional decomposition. The validity of the metric as a predictor of design effort was tested using data from an experiment involving simple design tasks, and empirically using historical data collected for 32 projects from three companies.

The performance of the new effort estimation models was tested against a number of objective criteria. The results indicated that the average estimation error of the models ranged from 12% to 15%, an improvement of 52% to 64% over the estimates originally made by the companies, whose errors ranged from 27% to 41%.

Moreover, models for estimating cost and duration, and for updating the estimates during project execution, were derived. The applications of the derived models are described through demonstrative examples. Thus, a complete methodology is given for estimating project effort and duration.
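The abstract does not state the functional form of the models, but the estimation-and-error workflow can be sketched under an assumed power-law relationship between the complexity metric and effort, with accuracy judged by mean relative error. Every number below is invented:

```python
import numpy as np

complexity = np.array([12.0, 20.0, 35.0, 50.0, 80.0])    # hypothetical metric values
effort = np.array([110.0, 210.0, 400.0, 620.0, 1050.0])  # hypothetical person-hours

# fit effort = a * complexity^b by least squares in log space
b, log_a = np.polyfit(np.log(complexity), np.log(effort), 1)
predicted = np.exp(log_a) * complexity ** b

mmre = np.mean(np.abs(predicted - effort) / effort)      # mean relative estimation error
print(f"b = {b:.2f}, MMRE = {mmre:.1%}")
```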
27

Sparse data estimation for knowledge processes

Lari, Kamran A. January 2004
During recent years, industry has increasingly focused on knowledge processes. Like traditional manufacturing processes, knowledge processes need to be managed and controlled in order to deliver the results they were designed for. Over the last decade, the principles of process management have evolved, especially through work done in software engineering and workflow management.

Process monitoring is one of the major components of any process management system. There have been efforts to design process control and monitoring systems; however, no integrated system has yet been developed as a "generic intelligent system shell". In this dissertation, an architecture for an integrated process monitoring system (IPMS) is developed, whereby the end-to-end activities of a process can be automatically measured and evaluated. To achieve this goal, the various components of the IPMS and the interrelationships among them are designed.

Furthermore, a comprehensive study of the available methodologies and techniques revealed that sparse data estimation (SDE) is the key component of the IPMS that does not yet exist. Consequently, a series of algorithms and methodologies are developed as the basis for the sparse data estimation of knowledge-based processes. Finally, a series of computer programs demonstrates the feasibility and functionality of the proposed approach when applied to a sample process. The sparse data estimation method applies not only to knowledge-based processes, but to any process, and indeed to any set of activities that can be modeled as a network.
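As a heavily hedged illustration of the sparse-data-estimation idea (the thesis develops far more elaborate algorithms), a missing activity measurement in a process network can be estimated from the measured activities adjacent to it; the network and values here are invented:

```python
def estimate_missing(network, measurements):
    """network: node -> list of neighbours; measurements: node -> value or None."""
    estimates = dict(measurements)
    for node, value in measurements.items():
        if value is None:
            known = [measurements[n] for n in network[node]
                     if measurements[n] is not None]
            estimates[node] = sum(known) / len(known) if known else None
    return estimates

net = {"A": ["B"], "B": ["A", "C"], "C": ["B"]}
meas = {"A": 4.0, "B": None, "C": 6.0}
print(estimate_missing(net, meas))   # B estimated as 5.0 from its neighbours
```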
28

A study of key mechanisms for concurrent engineering processes

Liu, Yun, 1969- January 2005
Previous studies (Bhuiyan, 2001; Bhuiyan et al., 2003; Jaafar, 2001) used stochastic computer models to study concurrent engineering (CE) processes. This thesis used the same models but modified the two main mechanisms used in CE, functional interaction and overlap, in order to better understand how they contribute to process performance, i.e., how they affect product development effort and span time. The present study used more realistic development processes than previous research, while retaining the same uncertainty conditions, rework, learning, and communication techniques. Simulation results of the updated models were compared to the baseline models in terms of effort versus span time, and of the distribution of effort during the process versus span time.

Research outcomes indicated that the use of CE was beneficial as long as the uncertainty of information during product development was moderate to low; when uncertainty was high, sequential engineering performed best. Several cases were demonstrated.

The study of the distribution of effort during product development showed that processes should be designed to avert rework due to design versions (complete redesign of the product) and should emphasize churn (redesign in small steps during teamwork).
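The overlap/rework trade-off at the heart of these models can be sketched with a two-task Monte Carlo simulation: overlapping the downstream task shortens span time, but with some probability the upstream information changes and the overlapped work is redone. All durations and probabilities are invented; the thesis's stochastic models are considerably more detailed:

```python
import random

def simulate(overlap, p_change, upstream=10.0, downstream=8.0, trials=10_000):
    """Mean (span_time, effort) for a two-task process with a given overlap fraction."""
    spans, efforts = [], []
    for _ in range(trials):
        start_down = upstream * (1.0 - overlap)           # downstream starts early
        rework = overlap * downstream if random.random() < p_change else 0.0
        spans.append(max(upstream, start_down + downstream) + rework)
        efforts.append(upstream + downstream + rework)
    return sum(spans) / trials, sum(efforts) / trials

for p in (0.1, 0.5, 0.9):    # low to high information uncertainty
    span, effort = simulate(overlap=0.5, p_change=p)
    print(f"p_change={p}: span {span:.1f}, effort {effort:.1f}")
```

Consistent with the findings above, effort grows with uncertainty while the span-time benefit of overlapping shrinks.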
29

Supplier partnership establishment under uncertainties for agile organizations

Baramichai, Manisra. January 2007
Thesis (Ph.D.)--Lehigh University, 2007. Adviser: Emory W. Zimmers, Jr.
30

Evaluating portfolios of multi-stage investment projects with approximate dynamic programming

Keles, Pinar. January 2007
Thesis (Ph.D.)--Lehigh University, 2007. Adviser: Joseph C. Hartman.
