421 |
Dependency-tracking object-oriented multidisciplinary design optimization (MDO) formulation on a large-scale system
Ahlqvist, M. Alexandra 01 October 2001 (has links)
No description available.
|
422 |
On Finding the Location of an Underwater Mobile Robot Using Optimization Techniques
Tunuguntla, Sai S. 12 August 1998 (has links)
This research aims at solving an engineering design problem encountered in the field of robotics using mathematical programming techniques. The problem addressed is an indispensable part of designing the operation of Ursula, an underwater mobile robot, and involves finding its location as it moves along the circumference of a nuclear reactor vessel. The study has been conducted with an intent to aid a laser-based global positioning system to make this determination.
The physical nature of this problem enables it to be conceptualized as a position and orientation determination problem. Ursula tests the weldments in the reactor vessel, and its position and orientation need to be determined continuously in real time. The kinematic errors in the setup and the use of a laser-based positioning system distinguish this from traditional position and orientation determination problems. The aim of this research effort is to construct a suitable representative mathematical model for this problem, and to design and compare solution methodologies that are computationally competitive, numerically stable, and accurate. / Master of Science
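The localization task described above can be cast as a nonlinear least-squares problem. Below is a toy sketch of that formulation, not the thesis model: the beacon layout, the exact range measurements, and the Gauss-Newton solver are all invented for illustration, recovering a 2-D position from distances to known reference points.

```python
import math

def locate(beacons, dists, guess=(0.0, 0.0), iters=20):
    """Estimate a 2-D position from range measurements to known
    beacons by Gauss-Newton least squares on the residuals
    r_i = ||p - b_i|| - d_i (a generic stand-in for laser-based
    localization; all geometry here is hypothetical)."""
    x, y = guess
    for _ in range(iters):
        # Accumulate J^T J (2x2) and J^T r (2x1); each Jacobian
        # row is the unit vector (p - b_i) / ||p - b_i||.
        a11 = a12 = a22 = g1 = g2 = 0.0
        for (bx, by), d in zip(beacons, dists):
            dx, dy = x - bx, y - by
            rng = math.hypot(dx, dy) or 1e-12
            jx, jy = dx / rng, dy / rng
            r = rng - d
            a11 += jx * jx; a12 += jx * jy; a22 += jy * jy
            g1 += jx * r;   g2 += jy * r
        det = a11 * a22 - a12 * a12
        if abs(det) < 1e-15:
            break
        # Solve the 2x2 normal equations (J^T J) s = -J^T r.
        sx = (-g1 * a22 + g2 * a12) / det
        sy = (g1 * a12 - g2 * a11) / det
        x, y = x + sx, y + sy
        if math.hypot(sx, sy) < 1e-12:
            break
    return x, y

beacons = [(0.0, 0.0), (5.0, 0.0), (0.0, 5.0)]
true_p = (2.0, 1.0)
dists = [math.hypot(true_p[0] - bx, true_p[1] - by) for bx, by in beacons]
est = locate(beacons, dists, guess=(1.0, 1.0))
```

On exact (noise-free) ranges the iteration recovers the true position; with noisy measurements the same machinery returns the least-squares estimate, which is where the numerical-stability comparison in the thesis becomes relevant.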
|
423 |
Application of a decomposition strategy to the optimal synthesis/design of a fuel cell sub-system
Oyarzabal, Borja 08 August 2001 (has links)
The application of a decomposition methodology to the synthesis/design optimization of a stationary cogeneration fuel cell sub-system for residential/commercial applications is the focus of this work. To accomplish this, a number of different configurations for the fuel cell sub-system are presented and discussed. The most promising candidate configuration, which combines features of different configurations found in the literature, is chosen for detailed thermodynamic, geometric, and economic modeling both at design and off-design. The case is then made for the usefulness and need of decomposition in large-scale optimization. The types of decomposition strategies considered are time and physical decomposition. Specific solution approaches to the latter, namely Local-Global Optimization (LGO) and Iterative Local-Global Optimization (ILGO) are outlined in the thesis. Time decomposition and physical decomposition using the LGO approach are applied to the fuel cell sub-system. These techniques prove to be useful tools for simplifying the overall synthesis/design optimization problem of the fuel cell sub-system.
Finally, the results of the decomposed synthesis/design optimization of the fuel cell sub-system indicate that this sub-system is more economical for a relatively large cluster of residences (i.e., 50). To achieve a unit cost of power production of less than 10 cents/kWh on an exergy basis requires the manufacture of more than 1500 fuel cell sub-system units per year. In addition, based on the off-design optimization results, the fuel cell sub-system is unable by itself to satisfy the winter heat demands. Thus, the case is made for integrating the fuel cell sub-system with another sub-system, namely, a heat pump. / Master of Science
|
424 |
Addition of Features to an Existing MDO Model for Containerships
Dasgupta, Amlan 04 June 2001 (has links)
Traditionally, the "Design Spiral" is used for the design of ships. The design spiral endorses the concept that the design process is sequential and iterative. Though this procedure has been effective over the years, the current trend in engineering demands that greater emphasis be placed on exploring the optimum design. With the advancement of computing technologies, the onus has shifted from finding better calculation schemes to formulating an economically viable design scheme. One of the objectives of the FIRST project funded by MARITECH was to develop a computer tool to give the best ship design using optimization techniques. This task was entrusted to the Department of Aerospace and Ocean Engineering at Virginia Polytechnic Institute and State University in Blacksburg, Virginia. A container ship was chosen as the test case. The problem was tackled from an owner's point of view; hence, the required freight rate was chosen as the objective.
To achieve that goal, the team developed a package that consists of three modules: an optimization module, a geometric module, and a performance evaluation module. Though these modules are essentially independent, the user controls them through an overall manager: initial values of design parameters can be changed, and variable and constraint bounds can be set as needed. Though users do not see what goes on behind the interface, this overall control gives them confidence in the design process. That confidence breaks down when only a limited set of variables and constraints is accessible.
A prototype MDO tool is developed based on Microsoft's COM framework using ATL. With this design, the modules can be modified with minimum programming effort. The user interface gives the user flexibility to manipulate relevant parameters that affect the design. A geometric shape manipulation scheme is developed in which the hull form was generated by blending two hull forms. This MDO tool is used to design a container ship with the required freight rate as the objective to be minimized. It is noticed that without a structural constraint, the design tends towards one with maximum length and beam. This led to unreasonably large ratios of B/D and L/D.
A B/D constraint is applied to the design to get a better structural design. Results with this constraint enabled pointed in the direction of adding two other design variables. The constraint increases the depth of the ship; with the increase in depth, the center of gravity of the ship also rises, decreasing the GM of the ship. This reduction in GM adversely affects the GM constraint. The number of tiers on deck (NTd) is made a design variable to give the optimizer the flexibility to manipulate the cargo carrying capacity. It was noticed that the ship is unable to carry a high NTd because of violation of the GM constraint. Hence, ballast has also been added as a design variable to lower the center of gravity of the ship, increasing the GM. This feature enables the optimizer to carry more cargo on deck, improving the objective function.
An effort is made to analyze the efficacy of the MDO tool by varying various parameters that affect the design. Technology factors have been introduced which give insight into the effect of key parameters. They also reflect on future design trends. Three evaluation tools (sensitivity analysis, alpha plots, and a restart option) have been incorporated in the design process to gauge the results of optimization.
The effect of another structural constraint, L/D, was also investigated. This constraint tends to bring down the overall length, but its results are inconclusive; further analysis is needed to draw usable conclusions. The linear response surface approximation was eliminated and the original stepwise discontinuous TEU capacity function was employed in the later examples. It was found that the minimum of the required freight rate occurred at the lower limits of length and beam on each TEU capacity platform. A systematic search of TEU plateaus in the vicinity of the primary optimum was necessary to define the secondary optimum. / Master of Science
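The effect of a structural B/D constraint can be illustrated with a toy sketch. Everything below is invented for illustration (the surrogate required-freight-rate function, its coefficients, and the design grid are not the thesis model): an exhaustive grid search over length, beam, and depth, with and without the constraint.

```python
def rfr(L, B, D):
    # Toy surrogate for required freight rate: annual cost over
    # cargo capacity. Capacity grows with deck area L*B, while
    # depth D only adds cost, so the unconstrained optimum sits
    # at maximum length and beam with minimum depth.
    capacity = L * B
    cost = 500 + 2 * L + 3 * B + 40 * D
    return cost / capacity

def grid_search(bd_max=None):
    """Exhaustive search over a coarse design grid, optionally
    enforcing a structural beam-to-depth constraint B/D <= bd_max."""
    best = None
    for L in range(100, 301, 10):        # length
        for B in range(10, 51, 2):       # beam
            for D in range(5, 26):       # depth
                if bd_max is not None and B / D > bd_max:
                    continue             # infeasible design point
                f = rfr(L, B, D)
                if best is None or f < best[0]:
                    best = (f, L, B, D)
    return best
```

Without the constraint the search runs to maximum L and B with minimum depth (a very large B/D ratio); enabling the constraint forces the depth up, mirroring the behaviour described in the abstract.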
|
425 |
A Model for Multidisciplinary Optimization for Containerships
Ganesan, Vikram 30 July 1999 (has links)
This thesis describes a multidisciplinary design optimization approach to containership design. The method employs widely accepted regression equations for computing resistance, weights, building costs and operating costs. The current regulations governing freeboard and the Coast Guard wind heel criterion are included.
The measures of merit used are the required freight rate and the return on investment. The system is flexible enough to allow changes in trade routes, shipyard and port parameters, and fuel costs.
The weight-displacement balance is maintained by including draft as a design variable and imposing an equality constraint on weight and displacement rather than introducing an internal loop to calculate draft at each iteration. This speeds up the optimization process.
The process takes into account the discrete container stowage issue. The carrying capacity (number of containers) is expressed as a continuous function of the principal dimensions by using a linear response surface fit that in turn makes the objective function continuous.
Speed is a design variable. The optimum speed takes into account the compromise required between higher speeds that imply higher revenue and lower speeds that imply lower fuel costs.
The optimizer used is the Design Optimization Tools (DOT) program from Vanderplaats Research and Development, Inc. Results employing the three different techniques provided in DOT have been obtained and compared.
The optimum ship tends to be the largest ship in terms of length and beam. An optimum speed is identified. The three techniques provided in DOT give fairly consistent results, but once a good optimum point is identified in the design space, the sequential linear programming algorithm is found to be the most consistent method to converge to a local optimum.
The objective function is found to be flat in the vicinity of the optimum, which indicates that the designer is not confined to a severely restricted design space and has some freedom in designing the optimum ship. / Master of Science
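The response-surface idea mentioned above, replacing a stepwise discrete container count with a continuous linear fit so the objective stays smooth, can be sketched as follows. The stowage rule, sample grid, and coefficients are all invented for illustration, not the thesis model.

```python
def teu_discrete(L, B):
    # Toy stowage rule: container bays along the length times
    # container rows across the beam (stepwise, discontinuous).
    return int(L // 13.0) * int(B // 2.5)

def fit_plane(samples):
    """Least-squares fit of teu ~ a + b*L + c*B via the 3x3
    normal equations, solved by Gaussian elimination."""
    # Build the augmented system [A^T A | A^T y] for rows (1, L, B).
    m = [[0.0] * 4 for _ in range(3)]
    for L, B, y in samples:
        row = (1.0, L, B)
        for i in range(3):
            for j in range(3):
                m[i][j] += row[i] * row[j]
            m[i][3] += row[i] * y
    # Forward elimination with partial pivoting.
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, 3):
            f = m[r][col] / m[col][col]
            for j in range(col, 4):
                m[r][j] -= f * m[col][j]
    # Back-substitution.
    coef = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):
        s = m[i][3] - sum(m[i][j] * coef[j] for j in range(i + 1, 3))
        coef[i] = s / m[i][i]
    return coef  # a, b, c in teu ~ a + b*L + c*B

samples = [(L, B, teu_discrete(L, B))
           for L in range(180, 261, 10) for B in range(25, 41, 3)]
a, b, c = fit_plane(samples)
```

The fitted plane tracks the staircase capacity function to within the step size, which is exactly what makes a gradient-based optimizer usable on an otherwise discontinuous objective.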
|
426 |
Hyperheuristics: Recent Developments
Chakhlevitch, K., Cowling, Peter I. 18 November 2008 (has links)
Yes / We present a thorough review of hyperheuristic research to date, and analyse/compare hyperheuristic papers based on the methods used.
|
427 |
Optimal Algorithms for Affinely Constrained, Distributed, Decentralized, Minimax, and High-Order Optimization Problems
Kovalev, Dmitry 09 1900 (has links)
Optimization problems are ubiquitous in all quantitative scientific disciplines, from computer science and engineering to operations research and economics. Developing algorithms for solving various optimization problems has been the focus of mathematical research for years. In the last decade, optimization research has become even more popular due to its applications in the rapidly developing field of machine learning.
In this thesis, we discuss a few fundamental and well-studied optimization problem classes: decentralized distributed optimization (Chapters 2 to 4), distributed optimization under similarity (Chapter 5), affinely constrained optimization (Chapter 6), minimax optimization (Chapter 7), and high-order optimization (Chapter 8). For each problem class, we develop the first provably optimal algorithm: the complexity of such an algorithm cannot be improved for the problem class given. The proposed algorithms show state-of-the-art performance in practical applications, which makes them highly attractive for potential generalizations and extensions in the future.
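A minimal sketch of the decentralized distributed setting: agents hold private objectives and communicate only with neighbours, yet jointly minimize the sum. The sketch below uses gradient tracking over a ring network, a standard scheme in this literature chosen for illustration (it is not claimed to be one of the thesis's optimal algorithms), with invented quadratic local objectives f_i(x) = (x - a_i)^2 / 2.

```python
def decentralized_solve(targets, steps=1000, alpha=0.05):
    """Gradient tracking on a ring: agent i holds the private
    quadratic f_i(x) = (x - a_i)^2 / 2 and averages only with its
    two ring neighbours; all agents converge to the minimizer of
    the sum, i.e. the mean of the a_i."""
    n = len(targets)
    grad = lambda i, v: v - targets[i]          # local gradient
    x = [0.0] * n                               # local iterates
    y = [grad(i, x[i]) for i in range(n)]       # gradient trackers
    # Doubly stochastic ring mixing: 1/3 self + 1/3 each neighbour.
    mix = lambda v, i: (v[i - 1] + v[i] + v[(i + 1) % n]) / 3.0
    for _ in range(steps):
        x_new = [mix(x, i) - alpha * y[i] for i in range(n)]
        # Tracker update preserves sum(y) = sum of current gradients.
        y = [mix(y, i) + grad(i, x_new[i]) - grad(i, x[i])
             for i in range(n)]
        x = x_new
    return x

targets = [1.0, 4.0, 7.0, 2.0, 6.0]   # global minimizer: mean = 4.0
xs = decentralized_solve(targets)
```

The point of the tracker y is that, unlike plain decentralized gradient descent with a constant step, the agents converge to the exact global minimizer rather than to a neighbourhood of it.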
|
428 |
Oligopolistic and oligopsonistic bilateral electricity market modeling using hierarchical conjectural variation equilibrium method
Alikhanzadeh, Amir Hessam January 2013 (has links)
An electricity market is complex and different in nature from other commodity markets. The introduction of competition and restructuring in global electricity markets brought additional complexity and major changes in terms of governance, ownership, and technical and market operations. In a liberalized electricity market, all participants are responsible for their own decisions and seek to profit from electricity trading. There are different types of electricity market; this research specifically considers a bilateral electricity market. This thesis not only reviews the UK electricity market as an example of a bilateral electricity market, with more than 97% of trading conducted through long-term bilateral contracts, but also proposes a dual-aspect view of the bilateral market by splitting the generation and supply sides of the wholesale market. This research aims to maximize the market participants' profits and to find the equilibrium point of the bilateral market; hence, various methods such as equilibrium models have been reviewed with regard to managing the technical and financial risks of participating in the electricity market. This research proposes a novel Conjectural Variation Equilibrium (CVE) model for bilateral electricity markets, to reduce the market participants' exposure to risk and maximize their profits. Generation companies' behaviors and strategies in an imperfect bilateral market environment, an oligopoly, have been investigated by applying the CVE method. By looking at the bilateral market from the alternative aspect, the supply companies' behaviors in an oligopsony environment have also been taken into consideration.
At the final stage of this research, the ‘matching’ of both quantity and price between the oligopolistic and oligopsonistic markets has been obtained through a novel coordinating algorithm that includes CVE model iterations of both markets. Such matching is achieved by adopting a hierarchical optimization approach, using the Matlab Patternsearch optimization algorithm, which acts as a virtual broker to find the equilibrium point of both markets. Index Terms-- Bilateral electricity market, Oligopolistic market, Oligopsonistic market, Conjectural Variation Equilibrium method, Patternsearch optimization, Game theory, Hierarchical optimization method
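The conjectural variation idea can be sketched for a stylized two-generator market. All parameters below are invented and the model is a textbook simplification, far smaller than the thesis model: inverse demand p = a - b(q1 + q2), quadratic costs c_i q_i^2 / 2, and a conjecture theta about the rival's output response (theta = 0 recovers the Cournot equilibrium).

```python
def cve_equilibrium(a=100.0, b=1.0, costs=(2.0, 3.0), theta=0.0,
                    iters=200):
    """Fixed-point (Gauss-Seidel) iteration on each firm's
    first-order condition:
        a - b*q_j - (2 + theta)*b*q_i - c_i*q_i = 0,
    where theta is the conjectured rival response dq_j/dq_i."""
    q = [0.0, 0.0]
    for _ in range(iters):
        for i in (0, 1):
            j = 1 - i
            # Best response of firm i given q_j and conjecture theta.
            q[i] = (a - b * q[j]) / ((2 + theta) * b + costs[i])
    price = a - b * sum(q)
    return q, price
```

With a more collusive conjecture (theta > 0) each firm expects the rival to match output increases, so both withhold output and the equilibrium price rises, the kind of strategic behaviour the CVE model is used to capture.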
|
429 |
Theoretical and Practical Aspects of Ant Colony Optimization
Blum, Christian 23 January 2004 (has links)
Combinatorial optimization problems are of high academic as well as practical importance. Many instances of relevant combinatorial optimization problems are, due to their dimensions, intractable for complete methods such as branch and bound. Therefore, approximate algorithms such as metaheuristics have received much attention in the past 20 years. Examples of metaheuristics are simulated annealing, tabu search, and evolutionary computation. One of the most recent metaheuristics is ant colony optimization (ACO), which was developed by Prof. M. Dorigo (the supervisor of this thesis) and colleagues. This thesis deals with theoretical as well as practical aspects of ant colony optimization.
* A survey of metaheuristics. Chapter 1 gives an extensive overview of today's most important metaheuristics. This overview highlights two key concepts in metaheuristics: intensification and diversification.
* The hyper-cube framework. Chapter 2 introduces a new framework for implementing ACO algorithms. This framework brings two main benefits to ACO researchers. First, from the point of view of the theoretician: we prove that Ant System (the first ACO algorithm to be proposed in the literature) in the hyper-cube framework generates solutions whose expected quality monotonically increases with the number of algorithm iterations when applied to unconstrained problems. Second, from the point of view of the experimental researcher, we show through examples that the implementation of ACO algorithms in the hyper-cube framework increases their robustness and makes the handling of the pheromone values easier.
* Deception. In the first part of Chapter 3 we formally define the notions of first and second order deception in ant colony optimization. Here, first order deception corresponds to deception as defined in the field of evolutionary computation and is therefore a bias introduced by the problem (instance) to be solved. Second order deception is an ACO-specific phenomenon. It describes the observation that the quality of the solutions generated by ACO algorithms may decrease over time in certain settings. In the second part of Chapter 3 we propose different ways of avoiding second order deception.
* ACO for the KCT problem. In Chapter 4 we outline an ACO algorithm for the edge-weighted k-cardinality tree (KCT) problem. This algorithm is implemented in the hyper-cube framework and uses a pheromone model that was determined to be well-working in Chapter 3. Together with the evolutionary computation and the tabu search approaches that we develop in Chapter 4, this ACO algorithm belongs to the current state-of-the-art algorithms for the KCT problem.
* ACO for the GSS problem. Chapter 5 describes a new ACO algorithm for the group shop scheduling (GSS) problem, which is a general shop scheduling problem that includes, among others, the well-known job shop scheduling (JSS) and open shop scheduling (OSS) problems. This ACO algorithm, which is implemented in the hyper-cube framework and uses a new pheromone model that was experimentally tested in Chapter 3, is currently the best ACO algorithm for the JSS as well as the OSS problem. In particular, when applied to OSS problem instances, this algorithm obtains excellent results, improving the best known solution for several OSS benchmark instances.
A final contribution of this thesis is the development of a general method for the solution of combinatorial optimization problems which we refer to as Beam-ACO. This method is a hybrid between ACO and a tree search technique known as beam search. We show that Beam-ACO is currently a state-of-the-art method when applied to the existing open shop scheduling (OSS) problem instances.
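The core ACO ingredients discussed above, probabilistic solution construction biased by pheromone, evaporation, and reinforcement of the iteration-best solution, can be sketched on a toy problem. This is an invented layered-choice problem far simpler than the KCT or GSS models, and the update rule is a generic simplification rather than any specific algorithm from the thesis.

```python
import random

def aco_min_path(costs, ants=20, iters=100, rho=0.1, seed=42):
    """Minimal ACO on a layered choice problem: at each stage an
    ant picks option 0 or 1, and the goal is the cheapest sequence
    of choices. Pheromone evaporates at rate rho, and only the
    iteration-best ant deposits (amount 1/cost)."""
    rng = random.Random(seed)
    n = len(costs)
    tau = [[1.0, 1.0] for _ in range(n)]   # pheromone per (stage, option)
    best_sol, best_cost = None, float("inf")
    for _ in range(iters):
        it_best, it_cost = None, float("inf")
        for _ in range(ants):
            sol = []
            for s in range(n):
                # Choice probability proportional to pheromone
                # weighted by the heuristic desirability 1/cost.
                w0 = tau[s][0] / costs[s][0]
                w1 = tau[s][1] / costs[s][1]
                sol.append(0 if rng.random() < w0 / (w0 + w1) else 1)
            c = sum(costs[s][o] for s, o in enumerate(sol))
            if c < it_cost:
                it_best, it_cost = sol, c
        if it_cost < best_cost:
            best_sol, best_cost = it_best, it_cost
        # Evaporation, then deposit along the iteration-best solution.
        for s in range(n):
            tau[s][0] *= 1 - rho
            tau[s][1] *= 1 - rho
            tau[s][it_best[s]] += 1.0 / it_cost
    return best_sol, best_cost

costs = [(1.0, 3.0), (2.0, 1.0), (4.0, 2.5), (1.5, 1.0)]
sol, cost = aco_min_path(costs)
```

Reinforcing only the iteration-best solution is one of many deposit rules; the choice of pheromone model and update rule is precisely what the hyper-cube framework and the deception analysis in the thesis study formally.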
|
430 |
Does copula beat linearity? : Comparison of copulas and linear correlation in portfolio optimization.
Blom, Joakim, Wargclou, Joakim January 2016 (has links)
Modern portfolio theory (MPT) is an investment theory introduced by Harry Markowitz in 1952 that describes how risk-averse investors can optimize their portfolios. The objective of MPT is to assemble a portfolio by maximizing the expected return given a level of market risk, or minimizing the market risk given an expected return. Although MPT has gained popularity over the years, it has also been criticized for several theoretical and empirical shortcomings, such as using variance as a measure of risk, measuring dependence with linear correlation, and assuming that returns are normally distributed when empirical data suggests otherwise. When moving away from the assumption that returns are elliptically distributed, for example normally distributed, linear correlation is no longer an accurate measure of dependence. Copulas are a flexible tool for modeling the dependence of random variables and enable us to separate the marginals from any joint distribution in order to extract the dependence structure. The objective of this paper was to examine the applicability of a copula-CVaR framework in portfolio optimization compared to traditional MPT. Further, we studied how the presence of memory, when calibrating the copulas, affects portfolio optimization. The marginals for the copula-based portfolios were constructed using Extreme Value Theory, and market risk was measured by Conditional Value at Risk. We implemented a dynamic investing strategy where the portfolios were optimized on a monthly basis with two different lengths of rolling calibration window. The portfolios were backtested over a sample period from 2000 to 2016 and compared against two benchmarks: a Markowitz portfolio based on normally distributed returns and an equally weighted, non-optimized portfolio. The results demonstrated that portfolio optimization is often preferred compared to choosing an equally weighted portfolio.
However, the results also indicated that the copula based portfolios do not always beat the traditional Markowitz portfolio. Furthermore, the results indicated that the choice of length of calibration window affects the selected portfolios and consequently also the performance. This result was supported both by the performance metrics and the stability of the estimated copula parameters.
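The Markowitz benchmark mentioned above has a closed form in the simplest two-asset case, which makes the contrast with the copula approach concrete: the only dependence input it uses is a single linear correlation number. The volatilities and correlation below are invented for illustration.

```python
def min_variance_weights(s1, s2, rho):
    """Closed-form minimum-variance split between two assets with
    volatilities s1, s2 and linear correlation rho: the entire
    dependence structure enters through the one number rho."""
    cov = rho * s1 * s2
    w1 = (s2 ** 2 - cov) / (s1 ** 2 + s2 ** 2 - 2 * cov)
    return w1, 1 - w1

def portfolio_vol(w1, w2, s1, s2, rho):
    # Standard two-asset portfolio volatility.
    var = (w1 * s1) ** 2 + (w2 * s2) ** 2 + 2 * w1 * w2 * rho * s1 * s2
    return var ** 0.5

s1, s2, rho = 0.20, 0.10, 0.3          # hypothetical annual figures
w1, w2 = min_variance_weights(s1, s2, rho)
opt_vol = portfolio_vol(w1, w2, s1, s2, rho)
eq_vol = portfolio_vol(0.5, 0.5, s1, s2, rho)
```

The optimized split beats both the equally weighted portfolio and either asset held alone; a copula-CVaR portfolio replaces the single rho with a full dependence model and the variance objective with a tail-risk measure, which is where the two approaches can diverge.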
|