191 |
Constraint-Enabled Design Information Representation for Mechanical Products Over the Internet
Wang, Yan, 03 September 2003 (has links)
The global economy has made the manufacturing industry more distributed than ever before. Product design requires more involvement from various technical disciplines at different locations. In such a geographically and temporally distributed environment, efficient and effective collaboration on design is vital to maintaining product quality and organizational competency. Interoperability of design information is one of the major barriers to collaborative design. Current standard CAD data formats do not support design collaboration effectively in terms of capturing, exchanging, and integrating design information and knowledge within the design cycle. Multidisciplinary design constraints cannot be represented and transferred among different groups, and design information cannot be integrated efficiently within a distributed environment. Uncertainty in specifications cannot be modeled at early design stages, and constraints for optimization are not embedded in design data.
In this work, a design information model, the Universal Linkage model, is developed to represent design-related information for mechanical products in a distributed form. It incorporates geometric and non-geometric constraints with traditional geometry and topology elements, thus allowing more design knowledge to be shared in collaborative design. Segments of design data are linked and integrated into a complete product model, supporting lean design information capture, storage, and query. The model is represented by a Directed Hyper Graph and Product Markup Language to preserve extensibility and openness. Incorporating robustness considerations, an Interval Geometric Modeling scheme is presented in which numerical parameters are represented by interval values. This scheme captures the uncertainty and inexactness of design and reduces the chance of conflict in constraint imposition. It provides a unified constraint representation across conceptual design, detailed design, and design optimization. Corresponding interval constraint solving methods are studied.
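The interval representation at the heart of such a scheme can be illustrated with a minimal sketch (the class below is illustrative, not the thesis's actual implementation): each inexact parameter carries [lo, hi] bounds, arithmetic propagates those bounds, and imposing a constraint becomes interval intersection, with an empty intersection signaling a constraint conflict.

```python
class Interval:
    """A closed interval [lo, hi] standing in for an inexact design parameter."""

    def __init__(self, lo, hi):
        assert lo <= hi
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # The product interval is bounded by the extreme endpoint products.
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

    def intersect(self, other):
        # Imposing a constraint narrows the interval; an empty result is a conflict.
        lo, hi = max(self.lo, other.lo), min(self.hi, other.hi)
        return Interval(lo, hi) if lo <= hi else None

# A hypothetical plate whose width and length are known only to within tolerances:
width, length = Interval(2.0, 3.0), Interval(4.0, 5.0)
area = width * length                        # propagates to [8.0, 15.0]
spec = area.intersect(Interval(10.0, 20.0))  # constraint: area must lie in [10, 20]
```

Because the intersection is non-empty, the specification is consistent with the tolerances; a `None` result would flag the kind of constraint conflict the scheme is designed to reduce.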
|
192 |
THE STOCHASTIC UNIT COMMITMENT PROBLEM: A CHANCE CONSTRAINED PROGRAMMING APPROACH CONSIDERING EXTREME MULTIVARIATE TAIL PROBABILITIES
Ozturk, Ugur Aytun, 03 September 2003 (has links)
Reliable power production is critical to the profitability of electricity utilities. Power generators (units) need to be scheduled efficiently to meet the electricity demand (load). This dissertation develops a solution method to schedule units for producing electricity while determining the estimated amount of surplus power each unit should produce, taking into consideration the stochasticity of the load and its correlation structure. This scheduling problem is known as the unit commitment problem in the power industry. The solution method developed to solve this problem can handle the presence of wind power plants, which create additional uncertainty. The system under consideration is assumed to be an isolated one, without access to an electricity market; in such a system the utility needs to specify the probability level at which the system should operate. This is taken into consideration by solving a chance-constrained program. Instead of using a set level of energy reserve, the chance-constrained model determines the level probabilistically, which is superior to using an arbitrary approximation. The Lagrangian relaxation technique is used to separate the master problem into its subproblems, with a subgradient method employed to update the Lagrange multipliers. A computer program is developed to solve the optimization problem, including a forward-recursion dynamic program for the unit subproblems; an externally developed program is used to evaluate high-dimensional multivariate normal probabilities, and optimization software is employed to solve the quadratic programs of the period subproblems. The results obtained indicate that the load correlation is significant and cannot be ignored when determining a schedule for the pool of units a utility possesses. It is also concluded that it is very risky to choose an arbitrary level of energy reserve when solving the unit commitment problem. To verify the effectiveness of the optimal unit commitment schedules produced by the chance-constrained optimization algorithm and to determine the expected operating costs, Monte Carlo simulations are used, in which realized loads are generated according to the assumed multivariate normal distribution with a specific correlation structure.
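The Lagrangian relaxation with subgradient multiplier updates can be caricatured in a few lines. The sketch below is not the dissertation's unit commitment model: it uses invented quadratic-cost units and solves only a single-period economic dispatch, but it shows the decomposition pattern, in which each unit minimizes its own cost given the current multiplier (price signal) and the multiplier is updated by a projected subgradient step with a diminishing step size.

```python
def unit_response(lam, a, b, cap):
    # Each unit minimizes a*p^2 + b*p - lam*p over [0, cap]; the unconstrained
    # minimizer is p = (lam - b) / (2a), clipped to the capacity limits.
    return min(max((lam - b) / (2 * a), 0.0), cap)

def lagrangian_dispatch(units, demand, iters=2000, step0=1.0):
    """units: list of (a, b, cap) hypothetical quadratic-cost generators."""
    lam = 0.0
    p = []
    for k in range(1, iters + 1):
        p = [unit_response(lam, a, b, cap) for a, b, cap in units]
        shortfall = demand - sum(p)                     # subgradient of the dual
        lam = max(0.0, lam + (step0 / k) * shortfall)   # projected subgradient step
    return lam, p

# Two hypothetical units; at the optimum the price equates their marginal costs.
lam, p = lagrangian_dispatch([(1.0, 0.0, 10.0), (2.0, 0.0, 10.0)], demand=9.0)
```

For these toy costs the dual converges to the market-clearing price lam = 12, at which the units supply 6 and 3 units of power respectively, exactly meeting demand.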
|
193 |
An Examination of the Economic Benefits of ISO 9000 and the Baldrige Award to Manufacturing Firms
Wilson, James Patrick, 09 June 2004 (has links)
This thesis examines the financial data of manufacturing companies that are ISO 9000 certified and of the winners in the manufacturing category of the Malcolm Baldrige National Quality Award to determine whether the benefit of receiving certification or winning the award is economically attractive. A literature review is completed to show the limited number of quantitative analyses that have been conducted on this subject and to provide the sources of raw data used in the thesis.
An analysis of the costs and benefits associated with registration is performed for ISO 9000, while stock performance is examined for the Baldrige Award winners. Results show that the economic success reported by companies that received ISO 9000 certification or the Baldrige Award may be exaggerated, and that this success certainly cannot be guaranteed. Recommendations for further study, along with a simple program design for a summative evaluation of the Baldrige-award-winning companies, are also suggested for future research.
|
194 |
A Neural Network Approach for Multi-Attribute Process Control with Comparison of Two Current Techniques and Guidelines for Practical Use
Larpkiattaworn, Siripen, 02 February 2004 (has links)
Both manufacturing and service industries deal with quality characteristics, which include not only variables but attributes as well. In quality control there has been substantial research on correlated variables (i.e., multivariate control charts); however, little work has been done on correlated attributes. Several issues arise in controlling the product or service quality of a multi-attribute process: a high number of false alarms (Type I errors) occurs, and the probability of not detecting defects increases, when the process is monitored by a set of uni-attribute control charts. Furthermore, plotting and monitoring several uni-attribute control charts creates additional work for quality personnel.
To date, a standard method for constructing a multi-attribute control chart has not been fully evaluated. In this research, three different techniques for simultaneously monitoring correlated process attributes are compared: the normal approximation, the multivariate np chart (MNP chart), and a newly proposed neural network technique. The normal approximation approximates multivariate binomial and Poisson distributions as normal distributions. The MNP chart is based on traditional Shewhart control charts designed for multiple-attribute processes. Finally, a backpropagation neural network technique has been developed for this research. Each technique should be capable of identifying an out-of-control process while considering all correlated attributes simultaneously.
To compare the three techniques, an experiment was designed for two correlated attributes. The experiment consisted of three levels of the proportion nonconforming p, three values of the correlation matrix, three sample sizes, and three magnitudes of shift of the proportion nonconforming in either the positive or negative direction. Each technique was evaluated based on average run length and on the number of replications in which the direction of the shift (positive or negative) was correctly identified. The resulting performance of all three techniques under the varied process conditions is presented and compared.
From this study, it has been observed that no one technique outperforms the other two for all process conditions. To select a suitable technique, a user must be knowledgeable about the nature of their process and understand the risks associated with committing Type I and Type II errors. Guidelines for how to best select and use multi-attribute process control techniques are provided.
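The average-run-length comparison underlying such an experiment can be mimicked with a small Monte Carlo sketch. Everything here is illustrative rather than taken from the thesis: two equally defect-prone attributes are correlated by a simple copy-with-probability-r construction (which yields correlation r), the chart is a plain 3-sigma limit on the combined count rather than any of the three techniques studied, and the ARL is estimated by averaging simulated run lengths.

```python
import math
import random

def run_length(n, p, r, shift, rng):
    """Periods until the combined count C1 + C2 exceeds its 3-sigma limit.

    The limit is set at the in-control level p; samples are drawn at p + shift.
    """
    pa = p + shift
    # Var(C1 + C2) = 2 n p (1 - p) (1 + r) under the copy construction below.
    ucl = 2 * n * p + 3 * math.sqrt(2 * n * p * (1 - p) * (1 + r))
    t = 0
    while True:
        t += 1
        c1 = c2 = 0
        for _ in range(n):
            x1 = rng.random() < pa
            x2 = x1 if rng.random() < r else rng.random() < pa  # corr(X1, X2) = r
            c1 += x1
            c2 += x2
        if c1 + c2 > ucl:
            return t

def arl(n, p, r, shift, reps=100, seed=1):
    rng = random.Random(seed)
    return sum(run_length(n, p, r, shift, rng) for _ in range(reps)) / reps
```

As expected, a larger shift in the proportion nonconforming is detected faster: with these toy settings, `arl(50, 0.1, 0.5, 0.15)` comes out far below `arl(50, 0.1, 0.5, 0.05)`.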
|
195 |
USING SIMULATION TO EXAMINE CUTTING POLICIES FOR A STEEL FIRM
Olsen, Susan Marie, 02 February 2004 (has links)
Minimizing the cost of filling demand is a problem that reaches back to the foundation of operations research. Here we use simulation to investigate various heuristic policies for a one-dimensional, guillotine cutting stock problem with stochastic demand and multiple supply and demand locations. The policies investigated range from a random selection of feasible pieces, to a more strategic search of pieces of a specific type, to a new policy using dual values from a linear program that models a static, deterministic demand environment. We focus on an application in the steel industry and we use real data in our model. We show that simulation can effectively model such a system, and further we exhibit the relative performance of each policy. Our results demonstrate that this new policy provides statistically significant savings over the other policies investigated.
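A stripped-down version of such a policy simulation, with invented numbers and only the simplest policies, might look like the following: each demanded width is cut either from an existing remnant (chosen by the policy) or from a fresh stock roll, and remnants too narrow to reuse become scrap. The thesis's dual-value policy is not reproduced here; `random_pick` and `best_fit` merely stand in for the "random selection" and "more strategic search" ends of the policy range.

```python
import random

def simulate(choose, demands, stock_width=100.0, min_usable=5.0, seed=0):
    """Serve each demand in order; return (fresh rolls opened, total scrap width)."""
    rng = random.Random(seed)
    remnants = []                       # reusable leftover pieces
    rolls, scrap = 0, 0.0
    for d in demands:
        feasible = [i for i, w in enumerate(remnants) if w >= d]
        if feasible:
            rem = remnants.pop(choose(feasible, remnants, rng)) - d
        else:
            rolls += 1                  # no remnant fits: open a fresh stock roll
            rem = stock_width - d
        if rem >= min_usable:
            remnants.append(rem)        # keep the leftover for later demands
        else:
            scrap += rem                # too narrow to reuse
    return rolls, scrap

def random_pick(feasible, remnants, rng):
    return rng.choice(feasible)

def best_fit(feasible, remnants, rng):
    return min(feasible, key=lambda i: remnants[i])  # tightest-fitting remnant

rolls, scrap = simulate(best_fit, [60.0, 30.0, 60.0, 30.0])
```

On this tiny deterministic demand stream both policies open two rolls and produce no scrap; the interesting comparisons, as in the thesis, require stochastic demand and many replications.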
|
196 |
MULTIPHYSICS ANALYSIS AND OPTIMIZATION OF 3 DIMENSIONAL PRINTING TECHNOLOGY USING NANO FLUIDIC SUSPENSIONS
Desai, Salil S, 13 September 2004 (has links)
The fabrication of micro- and nano-devices is of prime significance to the area of Micro-Electro-Mechanical Systems (MEMS). Attempts have been made to accommodate high-performance devices in compact units, thus reducing their overall size. A variety of microfabrication techniques are in use today, including lithography, chemical vapor deposition, and LIGA. The manufacturing costs associated with these processes can be prohibitive due to cycle time and the precious material lost during etching operations. These drawbacks become a more significant problem when building curved traces and support structures that must occur in 3D space.
To address the problems associated with building 3-dimensional circuits and devices in free space, a unique manufacturing process has been developed. This process utilizes conductive Nano-Particulate Fluid Jets (NPFJ) that are deposited onto a substrate by a Continuous Inkjet (CIJ) printing methodology. In this method, a fluid jet consisting of colloidal suspensions of conductors in carrier fluids is deposited onto a substrate and later sintered at high temperatures to form a homogeneous material. The major contribution of the present research is the investigation, development, and optimization of the NPFJ. In this work, a Computational Fluid Dynamics (CFD) model has been developed to simulate the fluid jet and the CIJ process. The modified CIJ printing process involves the interaction of three domains, namely electrostatic, structural, and fluidic. A coupled-field analysis of the piezoelectric membrane in the CIJ print head is conducted to establish the perturbation characteristics applied to the fluid. The interaction of the three domains is captured within a single model using a fluid-structure interaction (FSI) algorithm that staggers between domains until convergence is attained. A Design of Experiments approach was used to determine trends in drop formation based on various excitation parameters. Results from these simulations have been validated using an ultra-high-speed camera featuring exposure/delay times from 100 nanoseconds at full sensor resolution.
The results of the present research will give manufacturers the freedom to construct 3D devices and circuits that conform to the desired shapes and sizes of products, rather than being limited to present 2D components such as printed circuit boards.
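The staggered coupling idea, iterating between domain solvers until the exchanged quantities stop changing, can be shown in a deliberately tiny fixed-point sketch. The two linear "solvers" below are stand-ins, not CFD or structural models; only the stagger-until-convergence control flow mirrors the FSI algorithm described above.

```python
def staggered_fsi(alpha=0.3, beta=0.5, tol=1e-10, max_iter=200):
    """Toy staggered coupling: alternately update a 'fluid' pressure p given the
    structural state u, and a 'structural' displacement u given p, until both
    fields stop changing (the surrogate models here are invented linear maps)."""
    u = p = 0.0
    for i in range(max_iter):
        p_new = 1.0 + alpha * u   # 'fluid' update given the structure state
        u_new = beta * p_new      # 'structural' update given the fluid load
        if abs(u_new - u) < tol and abs(p_new - p) < tol:
            return u_new, p_new, i + 1
        u, p = u_new, p_new
    raise RuntimeError("staggered iteration did not converge")
```

Because the product of the coupling coefficients is below one, the stagger contracts to the coupled solution u = beta / (1 - alpha * beta); with stiffer coupling (alpha * beta near or above 1), plain staggering stalls or diverges, which is why production FSI codes add relaxation.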
|
197 |
Optimal Policies for the Acceptance of Living- and Cadaveric-Donor Livers
Alagoz, Oguzhan, 13 September 2004 (has links)
Transplantation is the only viable therapy for end-stage liver diseases (ESLD) such as hepatitis B. In the United States, patients with ESLD are placed on a waiting list, and when organs become available, they are offered to the patients on this list. This dissertation focuses on the decision problem faced by these patients: which offer to accept and which to refuse? This decision depends on two major components: the patient's current and future health, and the current and future prospects for organ offers. A recent analysis of liver transplant data indicates that 60% of all livers offered to patients for transplantation are refused.
This problem is formulated as a discrete-time Markov decision process (MDP). This dissertation analyzes three MDP models, each representing a different situation. The Living-Donor-Only Model considers the problem of optimally timing a living-donor liver transplantation, which is accomplished by removing an entire lobe of a living donor's liver and implanting it into the recipient. The Cadaveric-Donor-Only Model considers the problem of accepting or refusing a cadaveric liver offer when the patient is on the waiting list but has no available living donor; in this model, the effect of the waiting list is incorporated into the decision model implicitly through the probability of being offered a liver. The Living-and-Cadaveric-Donor Model is the most general: it combines the first two models, in that the patient is both listed on the waiting list and has an available living donor. The patient can accept the cadaveric liver offer, decline the cadaveric offer and use the living-donor liver, or decline both and continue to wait.
This dissertation derives structural properties of all three models, including several sets of conditions that ensure the existence of intuitively structured policies such as control-limit policies. The computational experiments use clinical data and show that the optimal policy is typically of control-limit type.
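A control-limit policy emerges even in a heavily stylized version of the accept-or-wait problem. In the sketch below (toy numbers, a single waiting state, offer quality as the only variable, none of it the dissertation's clinical model), accepting an offer of quality q yields a terminal reward q, waiting risks death and discounts the future, and value iteration yields an accept set of the form {q : q >= threshold}.

```python
def value_iteration(qualities, probs, survive=0.9, beta=0.9, tol=1e-12):
    """Value of waiting, when each period one offer is drawn from `qualities`."""
    v = 0.0
    while True:
        cont = beta * survive * v   # value of refusing and surviving to wait again
        new_v = sum(p * max(q, cont) for q, p in zip(qualities, probs))
        if abs(new_v - v) < tol:
            return new_v
        v = new_v

def accept_threshold(qualities, probs, survive=0.9, beta=0.9):
    v = value_iteration(qualities, probs, survive, beta)
    cont = beta * survive * v
    # Control-limit structure: accept exactly the offers at least as good as waiting.
    return min(q for q in qualities if q >= cont)

qualities, probs = [1, 2, 3, 4, 5], [0.2] * 5
threshold = accept_threshold(qualities, probs)
```

With these numbers the patient refuses offers of quality 1 and 2 and accepts 3 and above; raising the survival probability raises the threshold, since waiting becomes less risky.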
|
198 |
An Integrated, Evolutionary Approach to Facility Layout and Detailed Design
Shebanie, Charles, 13 September 2004 (has links)
The unequal-area, shape-constrained facility layout problem is an NP-hard combinatorial optimization problem concerned with minimizing material handling costs. An integrated methodology incorporating a genetic algorithm and a constructive heuristic is developed to simultaneously solve the traditional block layout problem of locating and shaping departments and the detailed design problem of locating the input/output stations of departments. These problems have received much attention over the past half-century, with the majority of research focused on solving them individually or sequentially. This thesis aims to show that an integrated methodology that combines the problems and solves them in parallel is preferable to sequential approaches.
The complexity of the integrated layout problem is reduced through a Flexbay formulation and through pre-assigned intra-departmental flow types. A genetic algorithm with a two-tiered solution structure generates and maintains a population of block layout solutions throughout an evolutionary process. Genetic operators reproduce and alter solutions in order to generate better solutions, find new search directions, and prevent premature convergence of the algorithm. An adaptive penalty mechanism guides the search process and reduces the computational overhead of the algorithm. Through the placement of input/output stations, the optimization of a block layout's material flow network is implemented as a subroutine to the genetic algorithm. A contour distance metric is used to evaluate the costs associated with material movement between the input/output stations of departments and aids in constructing practical aisle structures. A constructive placement heuristic places the input/output stations and perturbs them until no further improvement to a layout can be realized.
The integrated approach is applied to several well-known problems over a comprehensive test plan. The results from the integrated approach indicate moderate variability in the solutions and considerable computational expense. To compare the integrated methodology with prior methodologies, some of the best results for the unequal-area facility layout problem are selected from prior research and the I/O optimization heuristic is applied to them. The results of the integrated approach uniformly and significantly outperform those obtained through sequential optimization. The integrated methodology demonstrates the value of a simultaneous approach to the unequal-area facility layout problem.
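The genetic-algorithm machinery (selection, crossover, mutation, and an adaptive penalty whose weight grows as the run progresses) is generic enough to sketch on a toy constrained problem. None of this is the thesis's Flexbay encoding or its operators; it is just the skeleton of the search, shown on a one-dimensional invented objective.

```python
import random

def ga_minimize(obj, violation, bounds, pop=30, gens=60, pm=0.3, seed=7):
    rng = random.Random(seed)
    lo, hi = bounds
    xs = [rng.uniform(lo, hi) for _ in range(pop)]
    for g in range(gens):
        w = 10.0 * (g + 1)                       # adaptive penalty weight schedule
        fit = lambda x: obj(x) + w * violation(x) ** 2
        xs.sort(key=fit)
        elite = xs[: pop // 2]                   # truncation selection
        children = []
        while len(elite) + len(children) < pop:
            a, b = rng.sample(elite, 2)
            child = (a + b) / 2                  # arithmetic crossover
            if rng.random() < pm:                # Gaussian mutation, kept in bounds
                child = min(max(child + rng.gauss(0, 0.1 * (hi - lo)), lo), hi)
            children.append(child)
        xs = elite + children
    # Report the best individual under a stiff final penalty (near-feasible wins).
    return min(xs, key=lambda x: obj(x) + 1e6 * violation(x) ** 2)

# Toy problem: minimize (x - 3)^2 subject to x <= 2; the constrained optimum is x = 2.
best = ga_minimize(obj=lambda x: (x - 3.0) ** 2,
                   violation=lambda x: max(0.0, x - 2.0),
                   bounds=(0.0, 5.0))
```

Starting the penalty weight low lets early generations explore slightly infeasible layouts, while the growing weight pushes the final population onto the feasible boundary, here, the population collapses toward x = 2.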
|
199 |
Exploring the industrial hygiene academic curriculum: Expectations and perceptions of the profession
Breeding, David Clarence, 15 May 2009 (has links)
Although the multi-disciplinary profession of industrial hygiene (IH) has been established for many years and IH practitioners have been prolific in developing the technical tools for recognition, evaluation, and control of workplace hazards, few in the IH discipline have turned the tools and methods of academic research toward the academic curriculum itself. A review of the literature revealed that published research on the IH curriculum has been minimal, and that none has compared faculty and employer expectations. Evaluating the nature of the current IH curriculum, and the preferences and expectations of the IH profession for graduates’ competencies, is true to the goal of IH practice, i.e., conducting research as a basis for ongoing evaluation and review of existing programs, and using research findings to plan preventive interventions in order to ensure the continued good health of both programs and impacted individuals.
This research was an initial, exploratory study to identify and assess the expectations and perceptions of IH faculty and employers in the areas of IH curriculum content and structure. The expectations and perceptions of IH academic program faculty were compared with those of employers of graduates of IH programs. Characteristics of current IH academic programs were identified as a baseline for future evaluation of the IH curriculum. Actual and expected undergraduate majors of those entering IH master’s programs were identified to aid in targeting effective recruitment programs and efficient resource allocation. The study populations’ skill and capacity with computers and the Internet were assessed as an indicator of readiness to incorporate distance-learning methodology and electronic media delivery into traditional classroom delivery of industrial hygiene education. Recommendations were given for model IH curricula derived from the survey participants’ responses, and for future work.
|
200 |
Effective Design and Operation of Supply Chains for Remnant Inventory Systems
Wang, Zhouyan, 02 June 2006 (has links)
This research considers a stochastic supply chain problem that (a) has applications in a number of continuous production industries, and (b) integrates elements of several classical operations research problems, including the cutting stock problem, inventory management, facility location, and distribution. The research also uses techniques such as stochastic programming and Benders' decomposition. We consider an environment in which a company has geographically dispersed distribution points where it can stock standard sizes of a product from its plants. In the most general problem, we are given a set of candidate distribution centers with different fixed costs at the different locations, and we may choose not to operate facilities at one or more of these locations. We assume that the customer demand for smaller sizes comes from other geographically distributed points on a continuing basis, and that this demand is stochastic in nature and is modeled by a Poisson process. Furthermore, we address a sustainable manufacturing environment where the trim is not considered waste but rather gets recycled, and thus has an inherent value associated with it. Most importantly, the problem is not a static one where a one-time decision has to be made; rather, decisions are made on a continuing basis, and decisions made at one point in time have a significant impact on those made at later points. An example of where this problem arises is a steel or aluminum company that produces product in rolls of standard widths. The decision maker must decide which facilities to open, find long-run replenishment rates for standard sizes, and develop long-run policies for cutting these into smaller pieces so as to satisfy customer demand. The cutting stock, facility location, and transportation problems reside at the heart of the research, and all of these are integrated into the framework of a supply chain. We can see (1) that a decision made at some point in time affects the ability to satisfy demand at a later point, and (2) that there might be multiple ways to satisfy demand. The situation is further complicated by the fact that customer demand is stochastic and could potentially be satisfied by more than one distribution center. Given this background, this research examines broad alternatives for how the company's supply chain should be designed and operated in order to remain competitive with smaller and more nimble companies. The research develops an LP formulation, a mixed-integer programming formulation, and a stochastic programming formulation to model different aspects of the problem.
We present new solution methodologies based on Benders' decomposition and the L-shaped method to solve the NP-hard mixed-integer problem and the stochastic problem, respectively. Results from duality are used to develop shadow prices for the units in stock, and these in turn are used to develop a policy that helps make decisions on an ongoing basis. We investigate the theoretical underpinnings of the models, develop new computational methods, derive interesting properties of the solutions, build a simulation model to compare the policies developed with others commonly in use, and conduct computational studies to compare the performance of the new methods with their corresponding existing methods.
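The flavor of the L-shaped method can be conveyed on a one-dimensional toy that is much simpler than the dissertation's model: choose a stock level x at cost c per unit, pay a recourse penalty q per unit of unmet stochastic demand, and iterate between a cutting-plane master problem and subproblems that return the expected recourse cost and a subgradient (the source of the shadow-price information).

```python
def expected_recourse(x, scenarios, q):
    # E[Q(x, d)], where Q charges q per unit of unmet demand in scenario d
    return sum(p * q * max(d - x, 0.0) for d, p in scenarios)

def recourse_subgradient(x, scenarios, q):
    # Subgradient of the expected recourse cost at x (a scenario-weighted price)
    return sum(p * -q for d, p in scenarios if d > x)

def l_shaped(c, q, scenarios, x_max=100.0, tol=1e-6, max_iter=50):
    cuts = []                        # optimality cuts: theta >= a + b * x
    x, ub = 0.0, float("inf")
    for _ in range(max_iter):
        val = expected_recourse(x, scenarios, q)        # solve the subproblems
        g = recourse_subgradient(x, scenarios, q)
        ub = min(ub, c * x + val)                       # best feasible cost so far
        cuts.append((val - g * x, g))                   # add a single optimality cut
        # Master problem: minimize c*x + max_k (a_k + b_k x); this is convex and
        # one-dimensional, so a ternary search over [0, x_max] finds the minimizer.
        model = lambda z: c * z + max(a + b * z for a, b in cuts)
        lo, hi = 0.0, x_max
        for _ in range(200):
            m1, m2 = lo + (hi - lo) / 3, hi - (hi - lo) / 3
            if model(m1) <= model(m2):
                hi = m2
            else:
                lo = m1
        x = (lo + hi) / 2
        if ub - model(x) < tol:                         # lower bound meets upper bound
            break
    return x, ub

# Three equally likely demand scenarios; stocking cost 1, shortage cost 4 (invented).
x_opt, cost = l_shaped(c=1.0, q=4.0,
                       scenarios=[(5.0, 1 / 3), (10.0, 1 / 3), (15.0, 1 / 3)])
```

With a shortage penalty this steep it pays to cover the worst scenario: the method converges to a stock level of 15 at total expected cost 15, and the gap between the cut-based lower bound and the evaluated upper bound certifies optimality.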
|