141.
Linear Program Construction Using Metamodeling. Parks, Judith-Marie Tyler, 02 January 2003.
One of the most significant trends in data warehousing today is the integration of metadata into data warehousing tools. A data warehouse is a repository on an organization's computer systems that holds all of the data the organization possesses. Metadata is "data about data": a dictionary and summary of the data, held in a system catalog contained in the data warehouse. The purpose of this dissertation is four-fold: to show that, by examining a database's system catalog, information can be extracted that can be used to develop a structure for building operations research applications; to show that a database's system catalog can be modified to hold the structure and definition of a linear programming model; to show that a data table containing the linear programming model constraints can be constructed automatically from the contents of the modified system catalog; and finally, to show that the modified system catalog can be used to guide a user in developing objective functions based on a given set of model constraints. Thus, the main contribution of the work is that it furthers the hybrid area of information technology/mathematical programming by exploiting the metadata, as opposed to the raw data, held in a data warehouse.
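As an illustration of the kind of construction the dissertation describes, the Python sketch below assembles linear-programming constraint rows from catalog tables. The schema (lp_variables, lp_constraints) and its column names are hypothetical stand-ins for illustration only, not the dissertation's actual modified system catalog.

    import sqlite3

    def build_lp_from_catalog(conn):
        """Assemble LP constraint rows from an extended system catalog (illustrative schema).

        Assumes two hypothetical tables: lp_variables(var_name) and
        lp_constraints(constraint_name, var_name, coefficient, sense, rhs).
        """
        cur = conn.cursor()
        variables = [v for (v,) in cur.execute(
            "SELECT var_name FROM lp_variables ORDER BY var_name")]
        col = {v: j for j, v in enumerate(variables)}

        constraints = {}  # constraint name -> [coefficient row, sense, rhs]
        for name, var, coef, sense, rhs in cur.execute(
                "SELECT constraint_name, var_name, coefficient, sense, rhs "
                "FROM lp_constraints"):
            row = constraints.setdefault(name, [[0.0] * len(variables), sense, rhs])
            row[0][col[var]] = coef
        return variables, constraints

    # conn = sqlite3.connect("warehouse.db")  # hypothetical database file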
142.
A Lower Bound Calculation for the N-Job, M-Machine Job Shop Scheduling Problem to Minimize Lmax. Stanislaw, Natalie A., 13 July 1998.
An improved lower bound on Lmax is developed for the N-job, M-machine job shop scheduling problem. Improvements occur particularly on problems defined by a specific due-date range. The procedure allows preemption on all but one machine and then identifies additional delays in the processing on that machine. When a delay is found, it affects the earliest starts of the remaining operations on that job and the latest finishes of the preceding operations. The delays are found by repeatedly lowering the potential value of Lmax, which, in response, may increase the lower bound. To accommodate the decreasing upper bound (i.e., Lmax), changes in sequence are made that cause the earliest starts and latest finishes to be updated, which in turn allows the lower bound to be recalculated. The lower bound is still determined using preemption on all but one machine, but now includes more accurate (i.e., tighter) start and finish times.
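The abstract does not spell out how the single-machine bound itself is computed; a standard relaxation of this kind solves the one-machine problem with release times and preemption by preemptive earliest-due-date (EDD) scheduling, which is optimal for that relaxation and therefore yields a valid lower bound on Lmax. A minimal sketch of that calculation, using made-up release times, processing times, and due dates rather than the thesis's exact procedure, is:

    import heapq

    def preemptive_edd_lmax(jobs):
        """Lower bound from the one-machine relaxation 1|r_j, pmtn|Lmax.

        jobs is a list of (release_time, processing_time, due_date) for the
        operations routed through the non-preempted (bottleneck) machine.
        """
        jobs = sorted(jobs)                          # by release time
        heap, t, i, lmax = [], 0.0, 0, float("-inf")
        while i < len(jobs) or heap:
            if not heap and t < jobs[i][0]:
                t = jobs[i][0]                       # idle until the next release
            while i < len(jobs) and jobs[i][0] <= t:
                r, p, d = jobs[i]
                heapq.heappush(heap, [d, p])         # most urgent due date first
                i += 1
            nxt = jobs[i][0] if i < len(jobs) else float("inf")
            run = min(heap[0][1], nxt - t)           # run until finished or next release
            t += run
            heap[0][1] -= run
            if heap[0][1] <= 1e-12:
                d, _ = heapq.heappop(heap)
                lmax = max(lmax, t - d)              # lateness = completion - due date
        return lmax

    # Example with three operations on the bottleneck machine
    print(preemptive_edd_lmax([(0, 4, 8), (2, 3, 7), (3, 2, 10)]))   # -1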
143.
AN AUTOMATED PROCEDURE FOR STOCHASTIC SIMULATION INPUT MODELING WITH BEZIER DISTRIBUTIONS. Donovan, Marty Edwin, 14 October 1998.
As a means of handling the problem of input modeling for stochastic simulation experiments, we build upon previous work of Wagner and Wilson using Bézier distributions. Wagner and Wilson proposed a likelihood ratio test to determine how many control points (that is, parameters) a Bézier distribution should have to adequately model sample data. In this thesis, we extend this input-modeling methodology in two directions. First, we establish the asymptotic properties of the likelihood ratio test (LRT) as the sample size tends to infinity. The asymptotic analysis applies only to maximum likelihood estimation with known endpoints and not to any other parameter estimation procedure, nor to situations in which the endpoints of the target distribution are unknown. Second, we perform a comprehensive Monte Carlo evaluation of this procedure for fitting data, together with other estimation procedures based on least squares and minimum L norm estimation. In the Monte Carlo performance evaluation, several different goodness-of-fit measures are formulated and used to evaluate how well the fitted cumulative distribution function (CDF) compares to the empirical CDF and to the actual CDF from which the samples came. The Monte Carlo experiments show that in addition to working well with the method of maximum likelihood when the endpoints of the target distribution are known, the LRT also works well with minimum L norm estimation and least squares estimation; moreover, the LRT works well with suitably constrained versions of these three estimation methods when the endpoints are unknown and must also be estimated.
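The likelihood ratio test referred to here compares nested fits that differ in the number of control points. A generic sketch of such a test, with hypothetical log-likelihood values rather than the thesis's Bézier-specific likelihoods, is:

    from scipy.stats import chi2

    def lrt_accept_fewer_params(loglik_reduced, loglik_full, df_reduced, df_full, alpha=0.05):
        """Generic likelihood ratio test between nested fits.

        Returns True if the reduced model (fewer control points) is not rejected
        in favor of the full model at significance level alpha.
        """
        statistic = -2.0 * (loglik_reduced - loglik_full)   # nonnegative for nested MLEs
        critical = chi2.ppf(1.0 - alpha, df=df_full - df_reduced)
        return statistic <= critical

    # Hypothetical log-likelihoods from fits with 4 and 5 control points
    print(lrt_accept_fewer_params(loglik_reduced=-512.3, loglik_full=-510.9,
                                  df_reduced=4, df_full=5))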
144.
IMPROVED BATCHING FOR CONFIDENCE INTERVAL CONSTRUCTION IN STEADY-STATE SIMULATION. Steiger, Natalie Miller, 10 March 1999.
The primary objectives of this research are the formulation and evaluation of an improved batch-means procedure for steady-state simulation output analysis. The new procedure yields a confidence interval for a steady-state expected response that is centered on the sample mean of a portion of the series of simulation-generated responses and satisfies a user-specified absolute or relative precision requirement. We concentrate on the method of nonoverlapping batch means (NOBM), which requires the sample means computed from adjacent batches of observations to be independent and identically distributed normal random variables. For increasing batch sizes and a fixed number of batches computed from a weakly dependent (phi-mixing) output process, we establish key asymptotic distributional properties of the vector of batch means and of the numerator and squared denominator of the NOBM t-ratio. The new procedure, called ASAP, is based on an inverse Cornish-Fisher expansion of the NOBM t-ratio, where the terms of the expansion are estimated via an autoregressive moving average time series model of the batch means. An extensive experimental performance evaluation demonstrates the advantages of ASAP versus other widely used batch-means procedures.
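For context, the classical NOBM interval that this procedure refines is easy to compute: split the (post-warm-up) output series into adjacent batches, average each batch, and apply a Student-t interval to the batch means. A minimal sketch, with an AR(1) process standing in for simulation output, is:

    import numpy as np
    from scipy.stats import t

    def nobm_confidence_interval(y, num_batches=20, alpha=0.05):
        """Classical nonoverlapping-batch-means CI for the steady-state mean."""
        m = len(y) // num_batches                        # batch size
        batch_means = y[: m * num_batches].reshape(num_batches, m).mean(axis=1)
        center = batch_means.mean()
        half_width = (t.ppf(1 - alpha / 2, num_batches - 1)
                      * batch_means.std(ddof=1) / np.sqrt(num_batches))
        return center - half_width, center + half_width

    # AR(1) output stream as a stand-in for simulation responses
    rng = np.random.default_rng(1)
    y = np.empty(100_000)
    y[0] = 0.0
    for i in range(1, len(y)):
        y[i] = 0.9 * y[i - 1] + rng.normal()
    print(nobm_confidence_interval(y))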
145.
Statistical Analysis of Long-Range Dependent Processes via a Stochastic Intensity Approach, with Applications in Networking. Tan, Phaik-Hoon, 07 July 1999.
The objective of this research is to develop a flexible stochastic intensity-function model for traffic arrivals arising in an Ethernet local area network. To test some well-known Bellcore datasets for long-range dependence or nonstationarity, a battery of statistical tests was applied, including a new extension of the classical Priestley-Rao test for nonstationarity; the results of this analysis revealed pronounced nonstationarity in all of the Bellcore datasets. To model such teletraffic arrival processes accurately, a stochastic intensity function was formulated as a nonlinear extension of the Cox regression model that incorporates a general time trend together with cyclic effects and packet-size effects. The proposed intensity-function model has an exponential-polynomial-trigonometric form that includes a covariate representing the latest packet size. Maximum likelihood estimates of the unknown continuous parameters of the stochastic intensity function are obtained numerically, and the degrees of the polynomial time and packet-size components are determined by a likelihood ratio test. Although this approach yielded excellent fits to the Bellcore datasets, it also yielded the surprising conclusion that packet size has a negligible effect on the packet arrival rate. A follow-up analysis of the packet-size process confirmed this conclusion and shed additional light on the packet-generation mechanism in Ethernet local area networks. This research also includes the development of procedures for simulating traffic processes having a stochastic intensity function of the proposed form. An extensive Monte Carlo performance evaluation demonstrates the effectiveness of the proposed procedure for modeling and simulation of teletraffic arrival processes.
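The simulation procedures mentioned at the end of the abstract generate arrivals from a fitted intensity function. One standard way to do this for a nonhomogeneous Poisson-type arrival process is Lewis-Shedler thinning; the sketch below uses a made-up exponential-polynomial-trigonometric rate and a bound chosen to dominate it, and is not the specific procedure developed in the thesis.

    import math
    import numpy as np

    def simulate_nhpp_thinning(intensity, horizon, rate_bound, rng=None):
        """Lewis-Shedler thinning for a nonhomogeneous Poisson process.

        intensity(t) must satisfy intensity(t) <= rate_bound on [0, horizon];
        returns the accepted arrival times.
        """
        rng = rng or np.random.default_rng()
        t, arrivals = 0.0, []
        while True:
            t += rng.exponential(1.0 / rate_bound)       # candidate inter-arrival time
            if t > horizon:
                return np.array(arrivals)
            if rng.uniform() <= intensity(t) / rate_bound:
                arrivals.append(t)                       # accept w.p. intensity(t)/bound

    # Illustrative exponential-polynomial-trigonometric rate (made-up coefficients)
    lam = lambda t: math.exp(0.2 + 0.0004 * t - 2e-7 * t**2
                             + 0.2 * math.sin(2 * math.pi * t / 50))
    print(len(simulate_nhpp_thinning(lam, horizon=1000.0, rate_bound=2.5)))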
146.
A STRUCTURED APPROACH FOR CLASSIFYING AND PRIORITIZING PRODUCT REQUIREMENTS. Jackson, Harold Vaughn, 04 August 1999.
New product development involves making a series of decisions that transform vaguely defined customer needs and desires into a final product. Two important, but often overlooked, product development decisions are (1) the classification of requirements as mandatory or optional, and (2) the prioritization of requirements. This research effort addresses the current lack of theoretically sound and practical methods for classifying and prioritizing product requirements by focusing on three primary objectives. The first objective is to develop a structured approach for classifying and prioritizing product requirements. The second objective is to use the structured approach to gather, analyze, and aggregate stakeholder input. The third objective is to use the structured approach to support both group and individual learning. The first objective was accomplished through the development and demonstration of a structured requirement analysis model (SRAM). SRAM's development involved integrating methods and concepts from the following knowledge domains: requirement analysis, multi-attribute decision-making (MADM), market orientation, organizational learning, and cognitive decision theory. The second and third objectives were accomplished through the implementation of SRAM to resolve two diverse case studies. The main case study involved classifying and prioritizing functional requirements for a proposed knowledge-based CAD engineering system. In contrast, the second case study focused on evaluating alternative vision statements for a consulting group. After successful completion of both case studies, SRAM was formally evaluated by the case study participants and by a control group that did not participate in either case study. Thus, the three primary objectives of this thesis were verified and validated via case study implementation and impartial evaluation.

SRAM uses MADM as a solution framework for classifying and prioritizing product requirements. Requirements are evaluated using market-orientation-based (market priority, risk, customer value, and performance) qualitative (fuzzy linguistic) and quantitative decision criteria. Within this MADM framework, the Analytical Hierarchy Process (AHP) and entropy weighting are used to derive attribute importance weights and define stakeholder preference structures. Pairwise-comparison inconsistencies are passively corrected using a geometric averaging procedure that constructs a supertransitive approximation to the binary comparison matrices. Each stakeholder's requirement classifications and priorities are derived via the hierarchical application of the Technique for Order Preference by Similarity to the Ideal Solution (TOPSIS). The results for an aggregated group of stakeholders are then determined using weighted Borda scoring and heuristic decision rules.

Through the first and second case studies, it was discovered that resolving real-world problems requires understanding both how decision-makers should ideally behave and how they actually behave. Accordingly, quantitative results generated using traditional decision analysis methods were qualitatively analyzed using the essential elements of good decision-making (framing, gathering intelligence, coming to conclusions, and learning from feedback) as a conceptual foundation. The systematic application of structured decision-making was utilized to resolve conflict, develop consensus, define preferences, correct inconsistencies, and highlight critical issues.

Emphasis was placed on supporting individual and group learning through structured decision-making. Hence, regardless of the specific outcomes of classification and prioritization decisions, SRAM helps provide users with the knowledge and skill necessary to address similar problems in the future. Results from the formal evaluation of SRAM indicate that participants from both case studies, as well as a control group that did not participate in either case study, view SRAM as effective, practical, valid, and supportive of group and individual learning. In addition, both the second case study and the evaluation process demonstrated SRAM's ability to be utilized in a variety of applications.
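For readers unfamiliar with TOPSIS, the core calculation behind each stakeholder's requirement priorities is ranking by closeness to an ideal solution. The sketch below shows a basic, non-hierarchical TOPSIS scoring with made-up requirement scores and weights; SRAM's hierarchical application, fuzzy linguistic criteria, and group-aggregation steps are not reproduced here.

    import numpy as np

    def topsis_scores(decision_matrix, weights, benefit=None):
        """Score alternatives by relative closeness to the ideal solution (basic TOPSIS).

        decision_matrix: alternatives x criteria; weights sum to 1;
        benefit[j] is True if larger values are better on criterion j.
        """
        x = np.asarray(decision_matrix, dtype=float)
        w = np.asarray(weights, dtype=float)
        benefit = np.ones(x.shape[1], bool) if benefit is None else np.asarray(benefit)

        v = w * x / np.linalg.norm(x, axis=0)              # weighted, vector-normalized
        ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
        anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
        d_plus = np.linalg.norm(v - ideal, axis=1)
        d_minus = np.linalg.norm(v - anti, axis=1)
        return d_minus / (d_plus + d_minus)                # higher = closer to ideal

    # Hypothetical requirements scored on market priority, risk (lower is better), customer value
    scores = topsis_scores([[7, 2, 9], [5, 5, 6], [9, 8, 8]],
                           weights=[0.4, 0.2, 0.4],
                           benefit=[True, False, True])
    print(scores.argsort()[::-1])          # requirement indices, highest priority first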
147.
Drilling Parameters and Their Effect on Chip Clogging and Surface Roughness. Joshi, Sandesh Surendra, 03 January 2000.
In the woodworking industry, drilling (boring) is one of the most extensively used processes. Because of the traditional nature of the woodworking industry, little machining data has been recorded, and only a fraction of that concerns drilling. This lack of information hinders understanding, and thus improvement, of the drilling process. The objective of this research is to provide a pilot study on chip clogging and the surface finish generated while drilling wood, and also to examine the surface breakout at the points of drill entry and exit. This will help the industry by giving insight into the drilling of wood and by guiding further research in focused areas. Experiments on chip clogging were carried out with two sizes of standard twist drills, and the effects of feed (in/rev), spindle speed (rpm), passage of an air jet, a pecking cycle, and a rotational tool on chip clogging were studied. For the study of surface finish, a full factorial experimental design was implemented to evaluate the effect of factor-level combinations of four wood types, four drill types, grain direction (along and across the grain), spindle speed (rpm), and feed (in/rev), and their interactions, with respect to the surface quality of the machined workpiece. These 128 factor-level combinations were replicated three times, for a total of 384 experiments. The data obtained were analyzed using analysis-of-variance techniques to establish the level of significance of each factor and interaction with respect to surface finish. The work on chip clogging shows promise and needs further investigation for the benefit of the industry. Results of the surface finish study show trends in the behavior of the parameters, and future work should include developing mathematical models for accurately predicting responses with respect to the input parameters.
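The run count in the abstract follows directly from the design: four wood types, four drill types, two grain directions, two spindle speeds, and two feeds give 4 x 4 x 2 x 2 x 2 = 128 combinations, and three replicates give 384 runs. A small sketch that enumerates such a design, using placeholder level values rather than the thesis's actual settings, is:

    from itertools import product

    # Placeholder factor levels; only the counts (4, 4, 2, 2, 2) come from the abstract.
    woods = ["wood A", "wood B", "wood C", "wood D"]
    drills = ["drill 1", "drill 2", "drill 3", "drill 4"]
    grains = ["along", "across"]
    speeds_rpm = [1800, 3600]
    feeds_in_per_rev = [0.004, 0.008]

    combinations = list(product(woods, drills, grains, speeds_rpm, feeds_in_per_rev))
    runs = [combo for combo in combinations for _ in range(3)]   # three replicates each
    print(len(combinations), len(runs))    # 128 384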
148.
Modeling and Simulating Random Vectors with Given Marginals and Covariance Structure. Lassiter, Mary Elizabeth, 28 July 2000.
This research concerns the NORTA (NORmal To Anything) method for constructing a multivariate random vector having prespecified marginal distributions and prespecified correlations by applying an appropriate transformation of a standard normal random vector; the main objective of this thesis is to develop an algorithm for estimating the normal correlations that are required to yield the desired correlations between the coordinates of the transformed random vector. A stochastic root-finding method was implemented to approximate the normal correlations of interest with a prespecified level of accuracy. Examples are shown of the experimentation performed with bivariate distributions having various marginals. Results are also shown for a bivariate distribution with two uniform distributions as marginals, in which the functional relationship between the normal correlation and the prespecified correlation is known exactly. The experimentation shows that this stochastic root-finding algorithm is an excellent method for rapidly approximating the required normal correlations to a reasonable degree of accuracy.
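In outline, NORTA draws a standard normal vector with a chosen correlation matrix, applies the normal CDF to each coordinate, and then applies each target marginal's inverse CDF. The sketch below illustrates this for the bivariate uniform case mentioned in the abstract, where the induced correlation is known in closed form as (6/pi)*arcsin(rho_Z/2); the code is a generic illustration, not the thesis's stochastic root-finding algorithm.

    import numpy as np
    from scipy.stats import norm

    def norta_bivariate(rho_z, inverse_cdfs, n, rng=None):
        """Generate n samples of a bivariate NORTA vector.

        rho_z is the correlation of the underlying standard normal pair;
        inverse_cdfs is a pair of target quantile functions (inverse CDFs).
        """
        rng = rng or np.random.default_rng()
        cov = [[1.0, rho_z], [rho_z, 1.0]]
        z = rng.multivariate_normal([0.0, 0.0], cov, size=n)    # base normal vector
        u = norm.cdf(z)                                         # probability-integral transform
        return np.column_stack([f(u[:, i]) for i, f in enumerate(inverse_cdfs)])

    # Uniform(0,1) marginals: the induced correlation is (6/pi)*arcsin(rho_z/2),
    # so rho_z = 0.6 should give approximately 0.582.
    x = norta_bivariate(0.6, [lambda p: p, lambda p: p], n=200_000,
                        rng=np.random.default_rng(7))
    print(np.corrcoef(x[:, 0], x[:, 1])[0, 1], 6 / np.pi * np.arcsin(0.3))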
149.
Methodology for Furniture Finishing System Capacity Planning. Melton, Ryan Heath, 13 November 2000.
This research developed a methodology for the capacity planning of a furniture finishing system using both deterministic analysis and stochastic simulation. The thesis includes the development of an interface through which users can interactively build a simulation model of a finishing system. The Excel-based interface decouples data input from simulation model construction and execution, providing a user-friendly tool for analyzing a finishing system. A manufacturing manager unfamiliar with simulation techniques can use the interface to conduct simulations and experiment with various input parameters such as line-loading techniques and line speeds. Through the interface, results from the simulation can be used in an iterative process to analyze and refine the design parameters of the finishing system. Adjustments to the input parameters are made, and the model is re-simulated, until the user discovers and eliminates any problem areas within an existing finishing system or accurately determines the required workstation capacities for a proposed system.
150.
Computer-Aided Engineering of Plywood Upholstered Furniture Frames. Oltikar, Akhil Manohar, 12 March 2001.
Until the early 1900s, furniture was built by hand, one piece at a time. The industrial revolution and modern manufacturing technology have changed all of that. Today, as the furniture industry moves firmly into the next century, computerized systems and automated manufacturing have become more common in the industry. This thesis represents an effort to analyze current practices in the computer-aided design of upholstered furniture, specifically plywood frame furniture, and to develop new procedures for reducing the lead time in upholstery product development. Different 3-D modeling techniques for designing plywood furniture frames and their features have been developed and implemented. A plywood frame feature library has been created, and the geometric relations needed to fully constrain each feature type have been developed; this reduces modeling time and also increases consistency in the solid models. A new reverse engineering procedure, using an articulating arm, has been proposed, implemented, and tested for 3-D digitization of plywood frames. The proposed methodology eliminates some of the traditional processes currently followed in the industry, thus making product development faster and more streamlined. Further, an algorithm has been developed, implemented, and tested for automatically mirroring plywood upholstery frame assemblies in a CAD system. The algorithm considerably reduces the modeling lead time in the product development process. Finally, some future work that considers currently available 3-D CAD technologies has been recommended, which would help close the gap between upholstery designers and manufacturers.