61

Systems Engineering Process Modeling And Simulation

Arikan, Merve 01 September 2012 (has links) (PDF)
In this study, an approach is proposed to model and simulate the systems engineering process of design projects. One of the main aims is to model the systems engineering process, treating the process itself as a complex system. A conceptual model is developed as a result of a two-phase survey conducted with systems engineers. The conceptual model includes two levels of activity networks. Each first-level systems engineering activity has its own network of second-level activities. The model is then implemented in an object-oriented modeling language, namely SysML, using block definition diagrams and activity diagrams. Another aim is to generate a discrete event simulation model of the process for performance evaluation. For this purpose, the SysML model is transformed into an Arena model using an Excel interface and VBA code. Three deterministic and three stochastic cases are created to represent systems engineering process alternatives, which originate from the same conceptual model but possess different activity durations, resource availabilities, and resource requirements. The scale of the project and the effect of uncertainty in activity durations are also considered. The proposed approach is applied to each of these six cases, developing the SysML models, transforming them into Arena models, and running the simulations. Project duration and resource utilization results are reported for these cases.
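The project-duration figure that such an activity-network simulation reports rests on an earliest-finish recursion over the precedence network. The sketch below illustrates that core computation for a deterministic case; it is not the thesis's SysML-to-Arena pipeline, and the activity names and durations are invented for the example.

```python
# Illustrative two-level activity network: name -> (duration, predecessors).
# Activities and durations are assumptions for the sketch, not survey data.
ACTIVITIES = {
    "req_analysis": (5, []),
    "func_design":  (8, ["req_analysis"]),
    "verification": (4, ["func_design"]),
    "validation":   (3, ["func_design"]),
    "transition":   (2, ["verification", "validation"]),
}

def project_duration(activities):
    """Earliest-finish recursion over the precedence network (CPM-style)."""
    finish = {}
    def earliest_finish(name):
        if name not in finish:
            dur, preds = activities[name]
            finish[name] = dur + max((earliest_finish(p) for p in preds), default=0)
        return finish[name]
    return max(earliest_finish(a) for a in activities)

duration = project_duration(ACTIVITIES)  # length of the longest path
```

A stochastic case would replace the fixed durations with samples from distributions and average over replications, which is where a discrete event engine such as Arena earns its keep.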
62

Scheduling With Discounted Costs

Kiciroglu, Ahmet 01 September 2003 (has links) (PDF)
The majority of studies in the scheduling literature are devoted to time-based performance measures. In this thesis, we develop a model that considers monetary issues in single-machine scheduling environments. We assume all jobs should be completed by a common due date. An early revenue is earned if the completion time is on or before the due date, and a tardy revenue is gained if the job is completed after the due date. We consider restricted and unrestricted due date versions of the problem. Our objective is to maximize the net present value of all revenues. We first investigate some special cases of the problem and present polynomial-time algorithms to solve them. Then, we develop branch-and-bound algorithms with lower and upper bounding mechanisms. Computational experiments have shown that the branch-and-bound algorithms can solve large-sized problems in reasonable time.
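The discounted objective can be made concrete with a small sketch that evaluates one job sequence: each job's revenue depends on whether it finishes by the common due date, and is then discounted back to time zero. The job data and the discrete per-period discounting below are assumptions for the illustration, not the thesis's formulation.

```python
# Jobs are (processing_time, early_revenue, tardy_revenue); all numbers
# are invented for the example.
def schedule_npv(jobs, due_date, rate):
    """Net present value of revenues for jobs processed in the given order.
    A job completing at time t earns its early revenue if t <= due_date and
    its tardy revenue otherwise; revenues are discounted back to time zero."""
    t, npv = 0.0, 0.0
    for p, early_rev, tardy_rev in jobs:
        t += p                                    # completion time of this job
        revenue = early_rev if t <= due_date else tardy_rev
        npv += revenue / (1.0 + rate) ** t
    return npv

jobs = [(2, 100, 60), (3, 120, 70), (4, 150, 80)]
npv = schedule_npv(jobs, due_date=6, rate=0.01)   # the third job finishes tardy
```

A branch-and-bound search over sequences would use an evaluation like this at the leaves, with bounds pruning partial sequences.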
63

A Business Process Performance Measure Definition System Supported By Information Technologies

Alpay Koc, Nurcan 01 January 2013 (has links) (PDF)
There is growing interest in and research on the improvement of business processes as an essential part of effective quality management. Process improvement is possible with measurement and analysis of process performance. Process performance measurement has been studied to a certain extent in the literature, and many different approaches have been developed, such as the Sink-Tuttle Model, Performance Measurement Matrix, SMART Pyramid, Balanced Scorecard Approach, Critical Few Method, and Performance Prism Framework. These approaches require that process owners and analysts define appropriate measures based on general guidelines for each process separately. Recently, with the advancement of information technologies, modeling and simulation of processes on a computer-aided platform has become possible; standards and software support regarding such applications have been developed. Even though increasingly many organizations have been building their process models on computers, only a few manage effective use of such models for process improvement. This is partly due to difficulties in defining appropriate performance measures for the processes. The purpose of this study is to propose a method that can be used for defining performance measures of business processes easily and effectively according to the specific nature of these processes. The proposed performance measure definition system is based on the idea of using generic process performance measures published by trusted business process frameworks for high-level processes and adapting them for lower-level ones. The system, using a search mechanism available on a computer, allows users to easily find and define appropriate performance measures for their processes. The proposed system is applied to a research project management process and a creating-research-opportunities process of a public university, and the results are discussed.
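The retrieval idea behind such a system can be sketched as a keyword match between a lower-level process name and a catalogue of generic framework measures. Everything below, from the catalogue entries to the matching rule, is a hypothetical stand-in for the actual framework content and search mechanism described in the thesis.

```python
# Hypothetical catalogue of generic measures keyed by high-level process
# names; both the names and the measures are invented for this sketch.
GENERIC_MEASURES = {
    "manage research projects": ["cycle time", "budget variance", "milestone hit rate"],
    "procure materials": ["order lead time", "supplier defect rate"],
}

def suggest_measures(process_name):
    """Return measures of the generic process sharing the most keywords."""
    words = set(process_name.lower().split())
    score, best = max((len(words & set(k.split())), k) for k in GENERIC_MEASURES)
    return GENERIC_MEASURES[best] if score > 0 else []

measures = suggest_measures("manage university research projects")
```

A production system would use richer matching (synonyms, process hierarchy), but the adapt-from-the-closest-generic-process idea is the same.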
64

A Decision Analytic Model For Early Stage Breast Cancer Patients: Lumpectomy Vs Mastectomy

Elele, Tugba 01 September 2006 (has links) (PDF)
The purpose of this study was to develop a decision model for early-stage breast cancer patients. This model provides an opportunity for comparing the two main treatment options, mastectomy and lumpectomy, with respect to quality of life by making use of decision-theoretic techniques. A Markov chain was constructed to project the clinical history of breast carcinoma following surgery. Then, the health states used in the model were characterized by transition probabilities and utilities for quality of life. A multi-attribute utility model was developed for outcome evaluation. This study was performed on a sample population of female university students, and utilities were elicited from these healthy volunteers. The results yielded by the multi-attribute utility model were validated using the von Neumann-Morgenstern standard gamble technique. Finally, Monte Carlo simulation was utilized in the TreeAge Pro 2006 Suite software to solve the model and calculate the expected utility value generated by each treatment option. The results showed that lumpectomy is more favorable for the people who participated in this study. Sensitivity analysis on the transition probabilities to the local recurrence and salvaged states was performed, and two threshold values were observed. Additionally, sensitivity analysis on utilities showed that the model was sensitive to the utility of the no-evidence-of-disease state; however, it was not sensitive to the utilities of the local recurrence and salvaged states.
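The Monte Carlo evaluation of such a Markov model amounts to simulating many patients through the health states and averaging the utility accumulated per cycle. The sketch below shows that mechanic for one treatment arm; the states, transition probabilities, and utilities are illustrative assumptions, not the thesis's elicited data.

```python
import random

# Illustrative post-surgery health states; all numbers are assumptions.
TRANSITIONS = {
    "NED":        [("NED", 0.90), ("recurrence", 0.07), ("dead", 0.03)],
    "recurrence": [("salvaged", 0.60), ("dead", 0.40)],
    "salvaged":   [("salvaged", 0.85), ("dead", 0.15)],
    "dead":       [("dead", 1.00)],
}
UTILITY = {"NED": 0.95, "recurrence": 0.60, "salvaged": 0.80, "dead": 0.0}

def expected_utility(cycles=20, runs=10_000, seed=42):
    """Average quality-adjusted cycles over a Monte Carlo cohort."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(runs):
        state = "NED"
        for _ in range(cycles):
            total += UTILITY[state]          # utility earned in this cycle
            r, acc = rng.random(), 0.0
            for nxt, p in TRANSITIONS[state]:
                acc += p
                if r < acc:                  # sample the next state
                    state = nxt
                    break
    return total / runs
```

Comparing treatments then reduces to running this with each arm's transition matrix and utilities and comparing the two expected values.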
65

Designing An Information System For Material Management In Engineer-to-order Organizations

Dede, Erdogan 01 January 2007 (has links) (PDF)
In this thesis, an information system is designed and developed for engineer-to-order organizations to improve the traditional Bill-of-Material by handling variants of products and components efficiently. A database is developed to store the related information about inventories and configuration management in an effective way. The improved Bill-of-Material provides a common structure to access stored information for material management purposes. A network-based model is presented and included in the system for calculating the time required to produce components and to make subassemblies or assemblies with the current inventory levels. The system is applied at TÜBİTAK-SAGE, an engineer-to-order organization carrying out research and development projects for the defense industry.
66

A Lagrangean Heuristic For The Two-stage Modular Capacitated Facility Location Problem

Sevinc, Selim 01 May 2008 (has links) (PDF)
In this study, a Lagrangean heuristic based on Lagrangean relaxation and subgradient optimization is proposed for the two-stage modular capacitated facility location problem. The objective is to minimize the cost of locating and operating plants and warehouses, plus the cost of transporting goods at both echelons to satisfy the demand of customers. What distinguishes our study from the two-stage capacitated facility location problem is the existence of multiple candidate capacity levels for each plant. Each capacity level has a minimum production capacity that has to be satisfied to open the relevant capacity level, and only a single capacity level can be selected for an opened facility location. In the second echelon, the warehouses are capacitated and have unique fixed and variable costs for opening and operating. Multiple sourcing is allowed in both transportation echelons. First, we develop a mixed integer linear programming model for the two-stage modular capacitated facility location problem. Then we develop a Lagrangean heuristic to solve the problem efficiently. Our Lagrangean heuristic consists of three main components: Lagrangean relaxation, subgradient optimization, and a primal heuristic. Lagrangean relaxation is employed for obtaining the lower bound, subgradient optimization is used for updating the Lagrange multipliers at each iteration, and a three-stage primal heuristic is used for generating the upper bound solutions. At the first stage of the upper bound heuristic, the global feasibility of the plants and warehouses is inspected, and a greedy heuristic is executed if there is a global infeasibility. At the next stage, an allocation heuristic is used to assign customers to warehouses and warehouses to plants sequentially. At the final stage of the upper bound heuristic, the local feasibilities of the plants are investigated and infeasible capacity levels are adjusted if necessary.
To show the efficiency of the developed heuristic, we have tested it on 280 problem instances generated randomly but systematically. The results of the experiments show that the developed heuristic is efficient and effective in terms of solution quality and computational effort, especially for large instances.
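The multiplier-update machinery at the heart of such a heuristic can be shown on a deliberately tiny problem: relax one coupling constraint, solve the now-decomposed subproblem, and step the multiplier along the subgradient using the known upper bound. The two-variable LP below is invented purely to expose the mechanics; the thesis's subproblems are facility-location structures, not this toy.

```python
# Toy LP: minimize x1 + 2*x2  subject to  x1 + x2 >= 2,  0 <= x_j <= 2.
# Instance data are assumptions for the sketch.
COSTS, DEMAND, UPPER = (1.0, 2.0), 2.0, 2.0

def solve_relaxed(lam):
    """Lagrangean subproblem: the demand constraint is dualized with lam,
    so the problem decomposes into one trivial problem per variable."""
    x = [UPPER if c - lam < 0 else 0.0 for c in COSTS]
    value = sum(c * xj for c, xj in zip(COSTS, x)) + lam * (DEMAND - sum(x))
    return value, x

def subgradient_opt(iters=50, ub=2.0, theta=1.0):
    """Polyak-style subgradient ascent on the Lagrangean dual."""
    lam, best_lb = 0.0, float("-inf")
    for _ in range(iters):
        lb, x = solve_relaxed(lam)
        best_lb = max(best_lb, lb)
        g = DEMAND - sum(x)                     # subgradient of the dual
        if abs(g) < 1e-12 or ub - lb < 1e-12:
            break                               # dual gap closed
        step = theta * (ub - lb) / (g * g)      # step length uses the upper bound
        lam = max(0.0, lam + step * g)          # project onto lam >= 0
    return best_lb, lam
```

In the full heuristic the upper bound itself comes from the primal heuristic and is refreshed every iteration, tightening the step lengths as the bounds converge.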
67

Material Flow Cost Versus Congestion In Dynamic Distributed Facility Layout Problem

Ozen, Aykut 01 July 2008 (has links) (PDF)
In this thesis, we study both dynamic and distributed facility layout problems, where the demand for product mix changes over time. We propose a new simulated annealing algorithm, SALAB, for the dynamic facility layout problem. Four variants of SALAB find the best known solution for 20 of the 48 benchmark problems from the literature, improving upon the best known solutions of 18 problems. We modify SALAB to obtain DSALAB, solving the dynamic distributed facility layout problem with the objective of minimizing relocation cost and total (full and empty) travel cost of the material handling system. We simulate DSALAB solutions of randomly generated problems to study the tradeoff between total cost and congestion in the system. Our experimental results indicate that distributing the department duplicates throughout the facility reduces the total cost with diminishing returns and causes increasing congestion. Therefore, distribution beyond a certain level is not justified.
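The accept-or-reject core of a simulated annealing layout search can be sketched on a single-period instance: swap two departments, accept improvements always and deteriorations with a temperature-dependent probability, and cool geometrically. The flow and distance matrices below are invented; SALAB/DSALAB additionally handle multiple periods, relocation costs, and department duplicates, none of which this sketch attempts.

```python
import math
import random

# Toy layout instance (a quadratic assignment): FLOW[i][j] is material flow
# between departments i and j, DIST[a][b] the distance between locations a
# and b. All numbers are assumptions for the sketch.
FLOW = [[0, 3, 1], [3, 0, 2], [1, 2, 0]]
DIST = [[0, 1, 2], [1, 0, 1], [2, 1, 0]]

def cost(perm):
    """Total flow-times-distance cost; perm[i] is department i's location."""
    n = len(perm)
    return sum(FLOW[i][j] * DIST[perm[i]][perm[j]]
               for i in range(n) for j in range(n))

def anneal(t0=10.0, cooling=0.95, iters=500, seed=0):
    rng = random.Random(seed)
    perm = list(range(len(FLOW)))
    best = perm[:]
    t = t0
    for _ in range(iters):
        i, j = rng.sample(range(len(perm)), 2)
        cand = perm[:]
        cand[i], cand[j] = cand[j], cand[i]       # neighbour: swap two departments
        delta = cost(cand) - cost(perm)
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            perm = cand                            # accept (possibly worse) move
            if cost(perm) < cost(best):
                best = perm[:]
        t *= cooling                               # geometric cooling schedule
    return best, cost(best)
```

The dynamic problem strings one such layout per period together and adds relocation costs between consecutive layouts to the objective.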
68

A Location And Routing-with-profit Problem In Glass Recycling

Polat, Esra 01 December 2008 (has links) (PDF)
In this study, our aim is to determine the locations of bottle banks used in collecting recycled glass. The collection of recycled glass is done by a fleet of vehicles that visit some predetermined collection points, like restaurants and hospitals. The location of the bottle banks depends on their closeness to the population zones where the recycled glass is generated and to the predetermined collection points. A mathematical model, which combines the maximal covering problem in the presence of partial coverage and the vehicle routing problem with profits, is presented. Heuristic procedures are proposed for the solution of the problem. Computational results based on generated test problems are provided. We also discuss a case study in which bottle banks are located in Yenimahalle, a district of Ankara.
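The partial-coverage side of the model can be illustrated with a greedy sketch: coverage of a zone is full within one radius, zero beyond another, and decays linearly in between, and sites are added one at a time by largest coverage gain. The zones, populations, and distances are invented, and the routing-with-profits half of the thesis's combined model is deliberately omitted here.

```python
# Illustrative population zones and candidate bank sites; populations and
# distances (km) are assumptions for the sketch.
ZONES = {
    "z1": (1000, {"s1": 0.2, "s2": 0.9}),
    "z2": (600,  {"s1": 0.8, "s2": 0.3}),
    "z3": (400,  {"s1": 0.5, "s2": 1.5}),
}
FULL, NONE = 0.3, 1.0   # full coverage within 0.3 km, none beyond 1.0 km

def coverage(dist):
    """Partial coverage: 1 inside FULL, 0 beyond NONE, linear in between."""
    if dist <= FULL:
        return 1.0
    if dist >= NONE:
        return 0.0
    return (NONE - dist) / (NONE - FULL)

def greedy_sites(k):
    """Pick k sites, each time adding the one with the largest coverage gain."""
    sites = {s for _, dists in ZONES.values() for s in dists}
    chosen, covered = [], {z: 0.0 for z in ZONES}
    for _ in range(min(k, len(sites))):
        def gain(s):
            return sum(pop * max(0.0, coverage(d[s]) - covered[z])
                       for z, (pop, d) in ZONES.items())
        best = max(sites - set(chosen), key=gain)
        chosen.append(best)
        for z, (pop, d) in ZONES.items():
            covered[z] = max(covered[z], coverage(d[best]))
    return chosen
```

The full model would trade this covering benefit off against the routing profit earned by visiting collection points near the chosen banks.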
69

Value Of Quality Information Of Returns In Product Recovery Management

Atabarut, Altan 01 February 2009 (has links) (PDF)
Returned products in many industries are transported backwards through supply chains for recovery, thus forming "closed-loop supply chains". The benefits forthcoming from more effective management of the recovery of returns are gaining importance. However, some issues, such as the lack of information required to assess the quality of returned products, may translate into critical uncertainties in product recovery decisions and prevent closed-loop supply chains from operating efficiently. Hence, it is envisaged that significant economies may be attained by increasing the quantity of information fed into the planning decisions related to returned products. Thus, the objective of this study is to test the hypothesis that the ready availability of perfect quality-grade information associated with returned products, by means of "embedded systems", may lead to improved overall performance of recovery operations. To this end, in this thesis, linear programming models of generic multistage recovery processes are built. It is demonstrated by computational studies that significant gains may be obtained, especially in environments where the prices of recovered products are decreasing in time.
70

Mining Association Rules For Quality Related Data In An Electronics Company

Kilinc, Yasemin 01 March 2009 (has links) (PDF)
Quality has become a central concern as it has been observed that reducing defects lowers the cost of production. Hence, companies generate and store vast amounts of quality-related data. Analysis of this data is critical in order to understand quality problems and their causes, and to take preventive actions. In this thesis, we propose a methodology for this analysis based on one of the data mining techniques, association rules. The methodology is applied to quality-related data of an electronics company. The Apriori algorithm used in this application generates an excessively large number of rules, most of which are redundant. Therefore, we implement a three-phase elimination process on the generated rules to come up with a reasonably small set of interesting rules. The approach is applied to two different data sets of the company, one for production defects and one for raw material non-conformities. We then validate the resultant rules using a test data set for each problem type and analyze the final set of rules.
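The level-wise frequent-itemset search that Apriori performs can be sketched compactly: count candidate itemsets, keep those meeting minimum support, and join survivors into the next level. The defect "transactions" below are invented attribute-value records, and the classic subset-pruning step is omitted for brevity; rules would then be derived from the frequent itemsets and filtered, which is where the thesis's three-phase elimination comes in.

```python
from itertools import combinations

# Toy defect records as attribute=value transactions; values are invented.
TRANSACTIONS = [
    {"line=A", "shift=night", "defect=solder"},
    {"line=A", "shift=night", "defect=solder"},
    {"line=B", "shift=day", "defect=scratch"},
    {"line=A", "shift=day", "defect=solder"},
]

def apriori(transactions, min_support=0.5):
    """Return frequent itemsets (frozensets) mapped to their support."""
    n = len(transactions)
    level = {frozenset([item]) for t in transactions for item in t}
    frequent = {}
    while level:
        counts = {c: sum(1 for t in transactions if c <= t) for c in level}
        level_freq = {c: cnt / n for c, cnt in counts.items()
                      if cnt / n >= min_support}
        frequent.update(level_freq)
        # Join step: merge frequent k-itemsets into (k+1)-item candidates.
        keys = list(level_freq)
        level = {a | b for a, b in combinations(keys, 2)
                 if len(a | b) == len(a) + 1}
    return frequent
```

On real quality data the candidate explosion this naive version suffers from is exactly why rule elimination and pruning matter.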
