21

A Meshless Method Approach for Solving Coupled Thermoelasticity Problems

Gerace, Salvadore 01 January 2006
Current methods for solving thermoelasticity problems involve using finite element analysis, boundary element analysis, or other mesh-based methods to determine the deflections under an imposed temperature/stress field. This thesis details a new approach using meshless methods to solve these types of thermoelasticity problems, in which the solution is independent of boundary and internal meshing. With the rapidly increasing availability and performance of computer workstations and clusters, the major time requirement for solving a thermoelasticity model is no longer the computation time, but rather the problem setup. Defining the required mesh for a complex geometry can be extremely complicated and time consuming, and new methods are desired that can reduce this model setup time. The proposed meshless methods completely eliminate the need for a mesh and thus eliminate the need for complicated meshing procedures. Although the savings gained by eliminating the meshing process would be more than sufficient to warrant further study, the localized meshless method can also be comparable in computational speed to more traditional finite element solvers when analyzing complex problems. The reduction of both setup and computational time makes the meshless approach an ideal method of solving coupled thermoelasticity problems. Through the development of these methods it can be determined whether they are feasible as potential replacements for more traditional solution methods. More specifically, two methods are covered in depth, from development to implementation: the global meshless method and the improved localized method. Although both produce similar results in terms of accuracy, the localized method greatly improves upon the stability and computation time of the global method.
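As a point of reference, the following is a minimal sketch of a global (Kansa-type) RBF collocation solver for a 1-D Poisson problem, illustrating the kind of mesh-free discretization described above. The node layout, multiquadric basis, shape parameter, and test problem are illustrative assumptions, not details taken from the thesis.

```python
# Global RBF collocation sketch: expand u(x) in multiquadric basis functions
# centered at scattered nodes, enforce the PDE at interior nodes and the
# boundary conditions at boundary nodes, and solve one dense global system.
import numpy as np

def mq(r, c=0.5):       # multiquadric RBF: phi(r) = sqrt(r^2 + c^2)
    return np.sqrt(r**2 + c**2)

def mq_xx(r, c=0.5):    # second derivative of phi(|x - xj|) in 1-D
    return c**2 / (r**2 + c**2)**1.5

nodes = np.linspace(0.0, 1.0, 21)            # collocation nodes (uniform here)
r = np.abs(nodes[:, None] - nodes[None, :])  # pairwise node distances

f = lambda x: -np.pi**2 * np.sin(np.pi * x)  # manufactured source: u'' = f

# Interior rows enforce u''(x_i) = f(x_i); boundary rows enforce u = 0.
A = mq_xx(r)
b = f(nodes)
for i in (0, len(nodes) - 1):                # Dirichlet boundary nodes
    A[i, :] = mq(r[i, :])
    b[i] = 0.0

coeff = np.linalg.solve(A, b)                # one dense global solve
u = mq(r) @ coeff                            # evaluate expansion at the nodes

print(np.max(np.abs(u - np.sin(np.pi * nodes))))  # error vs exact u = sin(pi x)
```

A localized variant would build small stencils around each node instead of this single dense global system, which is the source of the stability and computation-time gains the abstract reports.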
22

A metamodel of operational control for discrete event logistics systems

Sprock, Timothy A. 27 May 2016
Discrete Event Logistics Systems (DELS) are a class of dynamic systems defined by the transformation of discrete flows through a network of interconnected subsystems. The DELS domain includes systems such as supply chains, manufacturing systems, transportation networks, warehouses, and health care delivery systems. Advancements in computer-integrated manufacturing and intelligent devices have spurred a revolution in manufacturing. These smart manufacturing systems utilize technical interoperability and plant-wide integration at the device level to drive production agility and efficiency. Extending these successes to enterprise-wide integration and decision-making will require the definitions of control and device to be extended and supported at the operations management and business planning levels as well. In the future, smart operational control mechanisms must not only integrate real-time data from system operations, but also formulate and solve a wide variety of optimization analyses quickly and efficiently and then translate the results into executable commands. However, in contemporary DELS practice, these optimization analyses, and analyses in general, are often purpose-built to answer specific questions, with an implicit system model and many possible analysis implementations depending on the question, the instance data, and the solver. Also, because of the semantic gap between operations research analysis models, such as job-shop scheduling algorithms, and IT-based models, such as MES, there is little integration between control analysis methods and control execution tools. Automated and cost-effective access to multiple analyses from a single conceptual model of the target system would broaden the usage and implementation of analysis-based decision support and system optimization. The fundamental contribution of this dissertation concerns interoperability and bridging the gap between operations research analysis models and practical applications of the results. This dissertation closes this gap by constructing a standard domain-specific language, standard problem definitions, and a standard analysis methodology to answer the control questions and execute the prescribed control actions. The domain-specific language meets a broader requirement for facilitating interoperability for DELS, including system integration, plug-and-play analysis methods and tools, and system design methodologies. It formalizes a recurring product, process, resource, and facility description of the DELS domain and provides a common language for discussing these systems, including the questions that we want to ask about our systems, the problems that we need to solve in order to answer those questions, and the mechanisms to deploy the solution. A canonical set of control questions defines the comprehensive functional specification of all the decision-making mechanisms that a controller needs to provide, i.e., a model of analysis models, or a metamodel of operational control. These questions refine the interoperability mechanism between system and analysis models by mapping classes of control analysis models to implementation and execution mechanisms in the system model. A standard representation of each class of control problems, however, is only a partial solution to fully addressing operational control.
The final contribution of this dissertation constructs a round-trip analysis methodology that completes the bridge between operations research analysis models and deployable control mechanisms. This contribution formalizes an analysis pathway, from formulating an analysis model to executing a control action, that is grounded in a more fundamental insight into how analysis methods are executed to support operational control decision-making.
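To make the recurring product/process/resource/facility description concrete, here is a hypothetical sketch of it as plain Python dataclasses. The class and field names are illustrative assumptions, not the dissertation's actual domain-specific language, which is defined as a formal metamodel.

```python
# A toy data model of the product / process / resource / facility description.
from dataclasses import dataclass
from typing import List

@dataclass
class Product:
    name: str                   # a discrete item flowing through the system

@dataclass
class Process:
    name: str
    inputs: List[Product]       # products consumed by this transformation
    outputs: List[Product]      # products produced by it

@dataclass
class Resource:
    name: str
    can_execute: List[Process]  # processes this resource is able to perform

@dataclass
class Facility:
    name: str
    resources: List[Resource]   # resources located at this facility

# Control questions ("what should resource R do next?", "where should product
# P be routed?") are then posed against instances of this shared model rather
# than against an ad hoc, analysis-specific system model.
```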
23

Um método baseado em inteligência computacional para a geração automática de casos de teste de caixa preta. / A method based on computational intelligence for automatic black-box test case generation.

Sá, Hindenburgo Elvas Gonçalves de 09 September 2010
This dissertation presents a method based on computational intelligence techniques, such as rule-set learning, artificial neural networks, and fuzzy logic, and proposes the development of tools capable of generating and classifying black-box test cases, with the aims of supporting test preparation, detecting defects in features or functionality, and reducing the time needed to detect and correct software defects, thereby achieving test coverage qualitatively superior to that of the manual creation process. The generation of new test cases and the classification of the generated test cases use rule-set learning techniques, based on sequential covering algorithms, together with a fuzzy inference engine. The choice of methods, both for generating and for classifying the test cases, was grounded in experiments comparing the similarities among the fuzzy, artificial neural network, and rule-set learning approaches. Finally, a proof-of-concept tool was developed to apply the methods that obtained the best results in the experiments. The criteria adopted for selecting the methods were the cyclomatic complexity and total lines of code (LOC) metrics.
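As an illustration of the rule-learning component, the sketch below implements the basic sequential-covering loop: learn one rule that covers positive examples and no negatives, remove the covered positives, and repeat. The greedy single-condition rule search is a deliberate simplification for illustration, not the dissertation's algorithm.

```python
# Sequential covering sketch: grow a rule list one rule at a time.
from typing import Dict, List, Optional, Tuple

Example = Dict[str, str]  # attribute -> value

def learn_one_rule(pos: List[Example], neg: List[Example]) -> Optional[Tuple[str, str]]:
    """Greedily pick the single (attribute, value) test that covers the most
    positive examples while covering no negative examples."""
    best, best_cover = None, 0
    for ex in pos:
        for attr, val in ex.items():
            covered_pos = sum(1 for p in pos if p.get(attr) == val)
            covered_neg = sum(1 for n in neg if n.get(attr) == val)
            if covered_neg == 0 and covered_pos > best_cover:
                best, best_cover = (attr, val), covered_pos
    return best

def sequential_covering(pos: List[Example], neg: List[Example]) -> List[Tuple[str, str]]:
    """Learn rules until all positives are covered or no clean rule remains."""
    rules = []
    while pos:
        rule = learn_one_rule(pos, neg)
        if rule is None:          # no clean single-condition rule left; stop
            break
        rules.append(rule)
        attr, val = rule
        pos = [p for p in pos if p.get(attr) != val]  # drop covered positives
    return rules
```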
25

Modelling The Evolution Of Demand Forecasts In A Production-distribution System

Yucer, Cem Tahsin 01 December 2006
In this thesis, we focus on a forecasting tool, the Martingale Model of Forecast Evolution (MMFE), to model the evolution of forecasts in a production-distribution system. The additive form is used to represent the evolution process, and a variance-covariance (VCV) matrix is defined to express the forecast updates. The selected demand pattern is stationary and normally distributed, following a first-order autoregressive (AR(1)) model. Two forecasting procedures, moving average (MA) and exponential smoothing (ES), are selected for comparison with the MMFE. A production-distribution model is constructed to represent a two-stage supply chain environment. The performance measures considered in the analyses are the total costs, fill rates, and forecast accuracy observed in the operation of the production-distribution system. The goal is to demonstrate the importance of good forecasting in supply chain environments.
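For intuition, the following is a small simulation sketch of the additive MMFE update described above: the vector of forecasts over an H-period horizon is revised each period by a multivariate normal update with a given VCV matrix. The horizon length, mean demand, and VCV entries are illustrative assumptions, not values from the thesis.

```python
# Additive MMFE sketch: forecasts evolve by f <- f + eps, eps ~ N(0, Sigma).
import numpy as np

rng = np.random.default_rng(0)

H = 4                               # forecast horizon (periods ahead)
mu = 100.0                          # stationary mean demand
Sigma = np.array([                  # VCV matrix of the forecast updates
    [4.0, 1.0, 0.5, 0.0],
    [1.0, 3.0, 1.0, 0.5],
    [0.5, 1.0, 2.0, 1.0],
    [0.0, 0.5, 1.0, 1.0],
])

forecasts = np.full(H, mu)          # initial forecasts = long-run mean
for t in range(10):                 # simulate 10 periods of evolution
    eps = rng.multivariate_normal(np.zeros(H), Sigma)
    forecasts = forecasts + eps     # additive MMFE update
    realized = forecasts[0]         # a period's demand = its final forecast
    print(f"t={t}: realized demand {realized:.1f}")
    # roll the horizon: drop the realized period, append the mean for a new one
    forecasts = np.append(forecasts[1:], mu)
```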
26

Comparison Of The Resource Allocation Capabilities Of Project Management Software Packages In Resource Constrained Project Scheduling Problems

Hekimoglu, Ozge 01 January 2007
In this study, results of a comparison on benchmark test problems are presented to investigate the performance of Primavera V.4.1, with its two resource allocation priority rules, against MS Project 2003. The resource allocation capabilities of the packages are measured in terms of deviation from the upper bound of the minimum makespan. Resource-constrained project scheduling problem instances are taken from PSPLIB, generated under a factorial design with ProGen. Statistical tests are applied to the results to investigate the significance of the parameters' effects.
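The deviation measure can be read as the relative gap between a package's schedule makespan and the best-known upper bound on the minimum makespan; this is an assumed formalization of the metric named above, not a formula quoted from the thesis.

```latex
\[
  \Delta = \frac{C_{\max}^{\text{package}} - \mathrm{UB}}{\mathrm{UB}} \times 100\%
\]
```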
27

Mathematical Modeling Of Supercritical Fluid Extraction Of Biomaterials

Cetin, Halil Ibrahim 01 July 2003
Supercritical fluid extraction has been used to recover biomaterials from natural matrices. Mathematical modeling of the extraction is required for process design and scale-up. Existing models in the literature are correlative and dependent upon experimental data; constructing predictive models that give reliable results in the absence of experimental data is therefore valuable. The long-term objective of this study was to construct a predictive mass transfer model, representing the supercritical fluid extraction of biomaterials in packed beds, by the method of volume averaging. In order to develop mass transfer equations in terms of volume-averaged variables, the velocity and velocity-deviation fields and the closure variables were solved for a specific case, and the coefficients of the volume-averaged mass transfer equation for that case were computed using one- and two-dimensional geometries via analytical and numerical solutions, respectively. The spectral element method with a domain decomposition technique, the preconditioned conjugate gradient algorithm, and the Uzawa method were used for the numerical solution. The coefficients of the convective term, together with the additional terms of the volume-averaged mass transfer equation, were similar to the superficial velocity. The coefficients of the dispersion term were close to the diffusivity of oil in supercritical carbon dioxide. The coefficients of the interphase mass transfer term were overestimated in both geometries. Modifications in boundary conditions, a change in the geometry of the particles, and the use of three-dimensional computations would improve the value of the interphase mass transfer coefficient.
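For orientation, a generic form of the volume-averaged mass transfer equation of the kind referenced above is sketched below; the exact closure terms and coefficient definitions in the thesis may differ.

```latex
% Volume-averaged species balance with convection, dispersion, and
% interphase mass transfer (generic form, assumed for illustration):
\[
  \frac{\partial \langle c \rangle}{\partial t}
  + \mathbf{u}^{*} \cdot \nabla \langle c \rangle
  = \nabla \cdot \bigl( \mathbf{D}^{*} \, \nabla \langle c \rangle \bigr)
  - k^{*} a_{v} \bigl( \langle c \rangle - c_{\mathrm{eq}} \bigr)
\]
```

Here u* is the effective convective coefficient (reported to be similar to the superficial velocity), D* the dispersion coefficient (close to the diffusivity of oil in supercritical CO2), and k*·a_v the interphase mass transfer coefficient (overestimated in the reported computations).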
28

Linear Static Analysis Of Large Structural Models On Pc Clusters

Ozmen, Semih 01 July 2009
This research focuses on implementing and improving a parallel solution framework for the linear static analysis of large structural models on PC clusters. The framework consists of two separate programs: the first is responsible for preparing data for the parallel solution, which involves partitioning, workload balancing, and equation numbering; the second is a fully parallel finite element program that utilizes a substructure-based solution approach with direct solvers. The first step of data preparation is partitioning the structure into substructures. After creating the initial substructures, the estimated imbalance of the substructures is adjusted by iteratively transferring nodes from the slower substructures to the faster ones. Once the final substructures are created, the solution phase is initiated. Each processor assembles its substructure's stiffness matrix and condenses it to the interfaces. The interface equations are then solved in parallel with a block-cyclic dense matrix solver. After computing the interface unknowns, each processor calculates the internal displacements and element stresses or forces. Comparative tests were done to demonstrate the performance of the solution framework.
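The condensation step each processor performs can be sketched as a Schur complement on the interface degrees of freedom. The dense numpy solves below are illustrative stand-ins for the sparse direct solvers the framework actually uses.

```python
# Static condensation sketch: eliminate internal DOFs so a substructure
# contributes only an interface Schur complement and a condensed load.
import numpy as np

def condense(K, f, internal, interface):
    """Condense substructure stiffness K and load f to the interface DOFs."""
    Kii = K[np.ix_(internal, internal)]   # internal-internal block
    Kib = K[np.ix_(internal, interface)]  # internal-interface coupling
    Kbi = K[np.ix_(interface, internal)]
    Kbb = K[np.ix_(interface, interface)]
    X = np.linalg.solve(Kii, Kib)         # Kii^{-1} Kib
    y = np.linalg.solve(Kii, f[internal]) # Kii^{-1} f_i
    S = Kbb - Kbi @ X                     # interface Schur complement
    g = f[interface] - Kbi @ y            # condensed interface load
    return S, g

def recover_internal(K, f, internal, interface, u_b):
    """Back-substitute interface displacements for internal displacements."""
    Kii = K[np.ix_(internal, internal)]
    Kib = K[np.ix_(internal, interface)]
    return np.linalg.solve(Kii, f[internal] - Kib @ u_b)
```

The interface systems S from all substructures are what the block-cyclic dense solver assembles and solves in parallel; recover_internal then corresponds to the final per-processor displacement and stress recovery.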
29

Sensitivity Analysis Using Finite Difference And Analytical Jacobians

Ezertas, Ahmet Alper 01 September 2009
The flux Jacobian matrices, the elements of which are the derivatives of the flux vectors with respect to the flow variables, need to be evaluated in implicit flow solutions and in analytical sensitivity analysis methods. The main motivation behind this thesis is to explore the accuracy of numerically evaluated flux Jacobian matrices and the effects of the errors in those matrices on the convergence of the flow solver, on the accuracy of the sensitivities, and on the performance of the design optimization cycle. To achieve these objectives, a flow solver that uses the exact Newton's method with a direct sparse matrix solution technique is developed for the Euler flow equations. The flux Jacobian is evaluated both numerically and analytically for different upwind flux discretization schemes with second-order MUSCL face interpolation. Numerical flux Jacobian matrices derived with a wide range of finite difference perturbation magnitudes were compared with analytically derived ones, and the optimum perturbation magnitude, which minimizes the error in the numerical evaluation, was sought. The factors that impede accuracy are analyzed, and a simple formulation for the optimum perturbation magnitude is derived. The sensitivity derivatives are evaluated by the direct differentiation method with a discrete approach. Reusing the LU factors of the flux Jacobian evaluated in the flow solution enabled efficient sensitivity analysis. The sensitivities calculated with the analytical Jacobian are compared with those calculated with numerically evaluated Jacobian matrices. Both internal and external flow problems, with varying flow speeds and varying grid types and sizes, are solved with different discretization schemes. In these problems, when the optimum perturbation magnitude is used for the numerical Jacobian evaluation, the errors in the Jacobian matrix and the sensitivities are minimized. Finally, the effect of the accuracy of the sensitivities on the design optimization cycle is analyzed for an inverse airfoil design performed with least squares minimization.
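The trade-off behind the optimum perturbation magnitude can be demonstrated with a scalar stand-in for a flux component: the finite difference error first falls with the step size (truncation error) and then rises again (round-off error), with the optimum near the square root of machine epsilon. The test function below is an illustrative assumption, not the solver's actual flux.

```python
# Forward-difference error vs perturbation magnitude for d/dx sin(x) at x = 1.
import numpy as np

x = 1.0
exact = np.cos(x)                            # exact derivative of sin(x)

for h in 10.0 ** -np.arange(1, 15):
    fd = (np.sin(x + h) - np.sin(x)) / h     # forward difference
    print(f"h = {h:.0e}   error = {abs(fd - exact):.2e}")

# For double precision the classic estimate of the optimum step is
# h* ~ sqrt(machine eps) * max(|x|, 1), i.e. about 1e-8 here -- the kind of
# magnitude the thesis searches for entry by entry.
print("sqrt(machine eps):", np.sqrt(np.finfo(float).eps))
```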
30

An Investigation Of The Leak-off Tests Conducted In Oil And Natural Gas Wells Drilled In Thrace Basin

Kayael, Burak 01 February 2012
This study analyzes the leak-off tests carried out in the Thrace Basin of Turkey by the Turkish Petroleum Corporation, looking for any relationship that may exist between leak-off test results and the drilled formations, as well as drilling parameters such as mud weight and depth. The analysis of 77 leak-off tests indicated that there is no close correlation between the mud weight of the test fluid and the equivalent mud weight (fracture gradient) when the test is carried out within impermeable sections. On the other hand, the correlation between mud weight and equivalent mud weight increases when the test is run within permeable, productive zones. It is also found that leak-off test results depend not on depth but on the formation being tested. The analyzed leak-off test results from the Thrace Basin showed that the fracture gradient is not the limiting factor in setting the casing of any section unless a gas show is observed during the drilling operation, which occurred in only 5 of the 78 wells analyzed.
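For context, the equivalent mud weight quoted above is conventionally computed from the surface leak-off pressure in oilfield units as follows; the thesis may use a different unit convention.

```latex
% Standard field-unit relation between leak-off pressure and equivalent
% mud weight (fracture gradient), given for context:
\[
  \mathrm{EMW}\ (\mathrm{ppg})
  = \mathrm{MW}\ (\mathrm{ppg})
  + \frac{P_{\text{leak-off, surface}}\ (\mathrm{psi})}
         {0.052 \times \mathrm{TVD}\ (\mathrm{ft})}
\]
```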
