About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.

Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
761

Intelligent modeling for control, reconfiguration and optimization of discrete event systems

Mahmood, Waqar 08 1900 (has links)
No description available.
762

Auto-tuning control using optimization techniques

Knueven, Susan 08 1900 (has links)
No description available.
763

Simulation and optimization of a crossdocking operation in a Just-in-Time environment

Hauser, Karina 01 January 2002 (has links)
In an ideal Just-in-Time (JIT) production environment, parts should be delivered to the workstations at the exact time they are needed and in the exact quantity required. In reality, for most components/subassemblies this is neither practical nor economical. In this study, the material flow of the crossdocking operation at the Toyota Motor Manufacturing plant in Georgetown, KY (TMMK) is simulated and analyzed. At the Georgetown plant, between 80 and 120 trucks are unloaded every day, with approximately 1300 different parts being handled in the crossdocking area. The crossdocking area consists of 12 lanes, each lane corresponding to one section of the assembly line. Whereas some pallets contain parts designated for only one lane, other parts are delivered in such small quantities that they arrive as mixed pallets. These pallets have to be sorted/crossdocked into the proper lanes before they can be delivered to the workstations at the assembly line. This procedure is both time consuming and costly. In this study, the present layout of the crossdocking area at Toyota and a layout proposed by Toyota are compared via simulation with three newly designed layouts. The simulation models will test the influence of two different volumes of incoming quantities: the actual volume as it is now, and a 50% reduced volume. The models will also examine the effects of crossdocking on the performance of the system, simulating three different percentage levels of pallets that have to be crossdocked.
The objectives of the initial study are twofold. First, simulations of the current system, based on data provided by Toyota, will give insight into the dynamic behavior and the material flow of the existing arrangement. These simulations will simultaneously serve to validate our modeling techniques. The second objective is to reduce the travel distances in the crossdocking area; this will reduce the workload of the team members and decrease the lead time from unloading of the truck to delivery to the assembly line. In the second phase of the project, the design will be further optimized. Starting with the best layouts from the simulation results, the lanes will be rearranged using a genetic algorithm to allow the lanes with the most crossdocking traffic to be closest together. The different crossdocking quantities and percentages of crossdocking pallets in the simulations allow a generalization of the study and the development of guidelines for layouts of other types of crossdocking operations. The simulation and optimization can be used as a basis for further studies of material flow in JIT and/or crossdocking environments.
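The lane-rearrangement step described above can be pictured as a small permutation search: place lanes that exchange the most pallets closest together. The sketch below is a minimal genetic-algorithm example with swap mutation and tournament selection; the traffic matrix and all parameters are invented for illustration and are not the model or data used in the study.

```python
import random

def travel_cost(order, traffic):
    """Total (traffic x lane distance) over all lane pairs for an ordering."""
    pos = {lane: p for p, lane in enumerate(order)}
    n = len(order)
    return sum(traffic[i][j] * abs(pos[i] - pos[j])
               for i in range(n) for j in range(i + 1, n))

def evolve_layout(traffic, pop_size=60, generations=300, seed=1):
    """Minimise travel_cost with a simple GA: elitism, 3-way tournament
    selection, and swap mutation on permutations of lane indices."""
    rng = random.Random(seed)
    n = len(traffic)
    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        nxt = [min(pop, key=lambda o: travel_cost(o, traffic))]  # keep the best
        while len(nxt) < pop_size:
            parent = min(rng.sample(pop, 3),
                         key=lambda o: travel_cost(o, traffic))
            child = parent[:]
            a, b = rng.sample(range(n), 2)       # swap two lane positions
            child[a], child[b] = child[b], child[a]
            nxt.append(child)
        pop = nxt
    return min(pop, key=lambda o: travel_cost(o, traffic))

# Toy instance: lanes 0-1 and lanes 2-4 exchange the most pallets,
# so a good layout places each of those pairs adjacent.
traffic = [[0, 9, 1, 0, 0],
           [0, 0, 0, 1, 0],
           [0, 0, 0, 0, 8],
           [0, 0, 0, 0, 0],
           [0, 0, 0, 0, 0]]
best = evolve_layout(traffic)
print(best, travel_cost(best, traffic))
```

With only the upper-triangular pairwise traffic as input, the objective is a one-dimensional quadratic-assignment problem, which is why a permutation-based GA is a natural fit here.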
764

Multi-objective optimization of industrial robots

Nezhadali, Vaheed January 2011 (has links)
Industrial robots are the most widely manufactured and utilized type of robot in industry. Improving the design process of industrial robots would lead to further developments in the robotics industry and, consequently, benefit the industries that depend on it. There is therefore an effort to make the design process ever more efficient and reliable. The design of industrial robots requires studies in various fields. Engineering software tools facilitate and accelerate robot design processes such as dynamic simulation, structural analysis, optimization and control. Designing a framework to automate the robot design process, so that tools from different domains interact automatically, would therefore be beneficial. In this thesis, the goal is to investigate the feasibility of integrating tools from domains such as geometry modeling, dynamic simulation, finite element analysis and optimization into an industrial robot design and optimization framework. Metamodeling is used to replace the time-consuming design steps. In the optimization step, various optimization algorithms are compared based on their performance, and the best-suited algorithm is selected. As a result, it is shown that the objectives are achievable: finite element analysis can be efficiently integrated with the other tools, and the results can be optimized during the design process. A holistic framework that can be used for the design of robots with several degrees of freedom is introduced at the end.
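Multi-objective comparisons of the kind described above rest on Pareto dominance: a design is kept only if no other design is at least as good in every objective and strictly better in at least one. A minimal filter is sketched below; the mass/cycle-time numbers are made up to stand in for real robot design objectives.

```python
def pareto_front(points):
    """Return the non-dominated subset of `points` for minimisation.
    A point p survives unless some q is <= p in every coordinate and
    < p in at least one (i.e. q dominates p)."""
    def dominates(a, b):
        return (all(x <= y for x, y in zip(a, b))
                and any(x < y for x, y in zip(a, b)))
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Illustrative trade-off: (link mass [kg], cycle time [s]) per candidate design.
designs = [(12.0, 1.4), (10.0, 1.6), (11.0, 1.5), (13.0, 1.3), (12.5, 1.5)]
print(pareto_front(designs))  # [(12.0, 1.4), (10.0, 1.6), (11.0, 1.5), (13.0, 1.3)]
```

Only (12.5, 1.5) is discarded, since (11.0, 1.5) is lighter at the same cycle time; the remaining designs are mutually incomparable trade-offs.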
765

A study and implementation of the network flow problem and edge integrity of networks

Haiba, Mohamed Salem January 1991 (has links)
Fundamental problems in graph theory are of four types: existence, construction, enumeration and optimization problems. Optimization problems lie at the interface between computer science and the field of operations research and are of primary importance in decision-making. In this thesis, two optimization problems are studied: the edge-integrity of networks and the network flow problem. An implementation of the corresponding algorithms is also realized. The edge-integrity of a communication network provides a way to assess the vulnerability of the network to disruption through the destruction or failure of some of its links. While the computation of the edge-integrity of graphs in general has been proven to be NP-complete, a recently published paper was devoted to an efficient algorithm, using a technique of edge separation sequences, for computing the edge-integrity of trees. The main results of this paper are presented and an implementation of this algorithm is achieved. The network flow problem models a distribution system in which commodities flow through an interconnected network. The goal is to find a maximum feasible flow and its value, given the capacity constraints for each edge. The three major algorithms for this problem (Ford-Fulkerson, the Edmonds-Karp method and the MPKM algorithm) are discussed, their complexities compared, and an implementation of the Ford-Fulkerson and MPKM algorithms is presented. / Department of Computer Science
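As a concrete illustration of the network flow problem described above, the Edmonds-Karp refinement of Ford-Fulkerson (always augmenting along a shortest path, found by BFS) can be sketched in a few lines. The graph below is a toy example, not one from the thesis.

```python
from collections import deque

def edmonds_karp(capacity, source, sink):
    """Maximum feasible flow via BFS augmenting paths (Edmonds-Karp).
    capacity: dict of dicts, capacity[u][v] = capacity of edge u -> v.
    Returns the value of a maximum flow from source to sink."""
    # Build residual capacities, including reverse edges initialised to 0.
    residual = {u: dict(nbrs) for u, nbrs in capacity.items()}
    for u, nbrs in capacity.items():
        for v in nbrs:
            residual.setdefault(v, {}).setdefault(u, 0)

    max_flow = 0
    while True:
        # BFS for a shortest augmenting path in the residual graph.
        parent = {source: None}
        queue = deque([source])
        while queue and sink not in parent:
            u = queue.popleft()
            for v, cap in residual[u].items():
                if cap > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if sink not in parent:
            return max_flow          # no augmenting path left: flow is maximum
        # Bottleneck capacity along the path found.
        path_flow = float("inf")
        v = sink
        while parent[v] is not None:
            path_flow = min(path_flow, residual[parent[v]][v])
            v = parent[v]
        # Augment: push flow forward, add residual capacity backward.
        v = sink
        while parent[v] is not None:
            residual[parent[v]][v] -= path_flow
            residual[v][parent[v]] += path_flow
            v = parent[v]
        max_flow += path_flow

# Toy network: two disjoint s-t paths of capacities 3 and 2.
graph = {"s": {"a": 3, "b": 2}, "a": {"t": 3}, "b": {"t": 2}, "t": {}}
print(edmonds_karp(graph, "s", "t"))  # 5
```

Using shortest augmenting paths bounds the number of augmentations by O(VE), giving the O(VE^2) worst case that distinguishes Edmonds-Karp from the generic Ford-Fulkerson scheme.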
766

Resource optimization for fault-tolerant quantum computing

Paetznick, Adam 13 December 2013 (has links)
Quantum computing offers the potential for efficiently solving otherwise classically difficult problems, with applications in material and drug design, cryptography, theoretical physics, number theory and more. However, quantum systems are notoriously fragile; interaction with the surrounding environment and lack of precise control constitute noise, which makes construction of a reliable quantum computer extremely challenging. Threshold theorems show that, by adding enough redundancy, reliable and arbitrarily long quantum computation is possible so long as the amount of noise is relatively low, below a "threshold" value. The amount of redundancy required is reasonable in the asymptotic sense, but in absolute terms the resource overhead of existing protocols is enormous when compared to current experimental capabilities. In this thesis we examine a variety of techniques for reducing the resources required for fault-tolerant quantum computation. First, we show how to simplify universal encoded computation by using only transversal gates and standard error correction procedures, circumventing existing no-go theorems. The cost of certain error correction procedures is dominated by preparation of special ancillary states. We show how to simplify ancilla preparation, reducing the cost of error correction by more than a factor of four. Using this optimized ancilla preparation, we then develop improved techniques for proving rigorous lower bounds on the noise threshold. The techniques are specifically intended for analysis of relatively large codes such as the 23-qubit Golay code, for which we compute a lower bound on the threshold error rate of 0.132 percent per gate for depolarizing noise. This bound is the best known for any scheme. Additional overhead can be incurred because quantum algorithms must be translated into sequences of gates that are actually available in the quantum computer. In particular, arbitrary single-qubit rotations must be decomposed into a discrete set of fault-tolerant gates. We find that by using a special class of non-deterministic circuits, the cost of decomposition can be reduced by as much as a factor of four over state-of-the-art techniques, which typically use deterministic circuits. Finally, we examine global optimization of fault-tolerant quantum circuits. Physical connectivity constraints require that qubits be moved close together before they can interact, but such movement can cause data to lie idle, wasting time and space. We adapt techniques from VLSI in order to minimize time and space usage for computations in the surface code, and we develop a software prototype to demonstrate the potential savings.
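The redundancy-versus-noise trade-off behind the threshold theorem can be illustrated with the standard concatenated-code estimate: concatenating L levels suppresses the logical error rate roughly as p_L = p_th (p/p_th)^(2^L), while qubit overhead grows exponentially in L. The sketch below uses this textbook formula with a generic 7-qubit code for the overhead count; only the 0.132% threshold figure is taken from the abstract, and the rest is a generic assumption, not the Golay-code analysis of the thesis.

```python
def levels_needed(p, p_th, target):
    """Smallest concatenation level L with
    p_th * (p / p_th) ** (2 ** L) <= target, assuming 0 < p < p_th.
    This is the standard doubly exponential suppression estimate."""
    if not (0 < p < p_th):
        raise ValueError("estimate requires a physical error rate below threshold")
    L = 0
    while p_th * (p / p_th) ** (2 ** L) > target:
        L += 1
    return L

# Physical error rate 1e-4 against the 0.132% per-gate threshold quoted
# above, targeting a 1e-15 logical error rate:
L = levels_needed(p=1e-4, p_th=1.32e-3, target=1e-15)
print(L, 7 ** L)  # concatenation level, and qubits per logical qubit
                  # for a 7-qubit base code: 4 2401
```

The exponential overhead (here thousands of physical qubits per logical qubit) is exactly why the resource reductions pursued in this thesis matter in absolute, non-asymptotic terms.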
767

Batch scheduling of two-machine limited-buffer flowshop with setup and removal times

Dai, Jianbin 01 December 2003 (has links)
No description available.
768

Self-optimization of Antenna Sectorization

Faxér, Sebastian January 2014 (has links)
Sectorization is a well-established method of increasing the capacity of telecommunication networks. With modern Active Antenna Systems (AAS) comes the ability to change the sectorization order dynamically, in both the horizontal and vertical planes. The purpose of this thesis is to investigate when (and what type of) sectorization is beneficial. A theoretical analysis as well as simulations are performed in order to determine which quantities to look at when making the decision to apply sectorization. Based on the conclusions from these investigations, a self-optimizing algorithm that only turns on sectorization when it increases network performance is developed and evaluated. It is shown that large gains can be achieved by only turning on sectorization when the right conditions are met. Further, we show that additional gains can be seen if antenna parameters such as downtilt and the distribution of transmission power between sectors are set properly. Self-optimizing algorithms for tuning these parameters are developed and evaluated as well.
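A decision rule of the kind described, turning sectorization on only when conditions warrant it, can be sketched as a simple hysteresis check so that the cell does not oscillate between states. The quantities and thresholds below are invented for illustration; this is not the algorithm developed in the thesis.

```python
def choose_sectorization(load, num_users, sectorized,
                         load_on=0.7, load_off=0.5, min_users=8):
    """Return True if the cell should run in sectorized mode.
    Hypothetical rule: split only when the cell is heavily loaded and
    carries enough users to populate both sub-sectors; merge back only
    after load drops well below the activation threshold (hysteresis)."""
    if not sectorized:
        return load > load_on and num_users >= min_users
    # Already sectorized: keep it unless load or user count drops sharply.
    return not (load < load_off or num_users < min_users // 2)

print(choose_sectorization(0.8, 12, sectorized=False))  # True: turn on
print(choose_sectorization(0.6, 12, sectorized=True))   # True: stay on
print(choose_sectorization(0.4, 12, sectorized=True))   # False: turn off
```

The gap between the on- and off-thresholds is what prevents rapid toggling when the load hovers near a single cut-off value.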
769

Comprehensive Robustness via Moment-based Optimization : Theory and Applications

Li, Jonathan 17 December 2012 (has links)
The use of a stochastic model to predict the likelihood of future outcomes forms an integral part of decision optimization under uncertainty. In classical stochastic modeling, uncertain parameters are often assumed to be driven by a particular form of probability distribution. In practice, however, the distributional form is often difficult to infer from the observed data, and an incorrect choice of distribution can lead to significant quality deterioration of the resultant decisions and to unexpected losses. In this thesis, we present new approaches for evaluating expected future performance that do not rely on an exact distributional specification and are robust against the errors that come with committing to a particular specification. The notion of comprehensive robustness is promoted, where various degrees of model misspecification are studied. These include fundamental ones, such as an unknown distributional form, and more involved ones, such as stochastic moments and moment outliers. The approaches are developed using the techniques of moment-based optimization, where bounds on the expected performance are sought based solely on partial moment information. They can be integrated into decision optimization and generate decisions that are robust against model misspecification in a comprehensive manner. In the first part of the thesis, we extend the applicability of moment-based optimization to incorporate new objective functions, such as convex risk measures, and richer moment information, such as higher-order multivariate moments. In the second part, new tractable optimization frameworks are developed that account for various forms of moment uncertainty in the context of decision analysis and optimization. Financial applications such as portfolio selection and option pricing are studied.
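The flavor of a moment-based bound can be shown with the simplest case: given only a mean and a variance, the one-sided Chebyshev (Cantelli) inequality caps the probability of underperformance over every distribution sharing those two moments, with no distributional form assumed. The portfolio numbers below are purely illustrative and do not come from the thesis.

```python
def cantelli_upper_bound(mean, variance, threshold):
    """Worst-case P(X <= threshold) over ALL distributions with the given
    mean and variance (Cantelli's one-sided inequality):
        P(X <= mean - t) <= variance / (variance + t**2),  t > 0.
    Valid only for thresholds strictly below the mean."""
    t = mean - threshold
    if t <= 0:
        raise ValueError("bound applies only to thresholds below the mean")
    return variance / (variance + t * t)

# Illustrative portfolio return: mean 8%, standard deviation 10%.
# Whatever the true distribution, the probability of a loss (return <= 0)
# is at most 0.01 / (0.01 + 0.08**2) ~ 0.61.
print(round(cantelli_upper_bound(0.08, 0.01, 0.0), 4))  # 0.6098
```

Richer moment information, such as the higher-order multivariate moments treated in the thesis, tightens such worst-case bounds; the two-moment case above is simply the smallest instance of the idea.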
770

Aero-structural Optimization of Divergence-critical Wings

Moon, Scott Geoffrey 15 February 2010 (has links)
This study investigates the use of the divergence speed as an additional constraint in a multi-disciplinary optimization (MDO) problem. The goal of the project is to expand the MDO toolbox by adding an aeroelastic module for use where the aeroelastic characteristics present a possible safety hazard. This paper examines aeroelastic theory and the MDO disciplines. The divergence constraint function is developed on a BAH wing. The optimization problem is executed on the HANSA HFB 320 transport jet, using the FEAP structural solver and a vortex lattice method as the aerodynamic solver. The study shows that the divergence speed can function as a safety constraint, but the stress constraints determine the optimum design. Furthermore, obtaining a true divergence constraint will require a finer mesh, a more efficient aerodynamic solver, and a non-finite-difference approach to gradient determination. Thus, the addition of the divergence constraint does not yet directly benefit this MDO framework.
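For intuition about the divergence constraint, the classical typical-section estimate from standard aeroelasticity texts relates the divergence dynamic pressure to the wing's torsional stiffness: the aerodynamic twisting moment grows linearly with dynamic pressure and diverges when it matches the structural restoring moment. The sketch below uses that textbook formula with invented parameter values; it is not the BAH-wing or HFB 320 model of the study.

```python
import math

def divergence_speed(K_alpha, S, e, CL_alpha, rho=1.225):
    """Divergence airspeed U_D [m/s] of a 2-D typical section, from the
    classical estimate  q_D = K_alpha / (S * e * CL_alpha).
    K_alpha: torsional spring stiffness [N*m/rad]
    S: reference area [m^2]; e: elastic-axis offset aft of the
    aerodynamic centre [m]; CL_alpha: lift-curve slope [1/rad];
    rho: air density [kg/m^3]."""
    q_d = K_alpha / (S * e * CL_alpha)   # divergence dynamic pressure [Pa]
    return math.sqrt(2.0 * q_d / rho)    # invert q = 0.5 * rho * U**2

# Illustrative (made-up) section: stiff spring, modest offset.
print(round(divergence_speed(2.0e5, 20.0, 0.25, 5.7), 1))
```

The formula makes the design trade visible: divergence speed scales with the square root of torsional stiffness, which is why a stress-driven optimum can satisfy a divergence constraint with margin to spare, as the study found.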
