71

Applying the cross-entropy method in multi-objective optimisation of dynamic stochastic systems

Bekker, James 12 1900
Thesis (PhD)--Stellenbosch University, 2012. / ENGLISH ABSTRACT: A difficult subclass of engineering optimisation problems is the class of optimisation problems which are dynamic and stochastic. These problems are often of a non-closed form and are thus studied by means of computer simulation. Simulation production runs of these problems can be time-consuming due to the computational burden implied by statistical inference principles. In multi-objective optimisation of engineering problems, large decision spaces and large objective spaces prevail, since two or more objectives are simultaneously optimised and many problems are also of a combinatorial nature. The computational burden associated with solving such problems is even larger than for most single-objective optimisation problems, and hence an efficient algorithm that searches the vast decision space is required. Many such algorithms are currently available, with researchers constantly improving these or developing more efficient algorithms. In this context, the term "efficient" means to provide near-optimised results with minimal evaluations of objective function values. Thus far research has often focused on solving specific benchmark problems, or on adapting algorithms to solve specific engineering problems. In this research, a multi-objective optimisation algorithm, based on the cross-entropy method for single-objective optimisation, is developed and assessed. The aim with this algorithm is to reduce the number of objective function evaluations, particularly when time-dependent (dynamic), stochastic processes, as found in Industrial Engineering, are studied. A brief overview of scholarly work in the field of multi-objective optimisation is presented, followed by a theoretical discussion of the cross-entropy method. The new algorithm is developed, based on this information, and assessed considering continuous, deterministic problems, as well as discrete, stochastic problems. The latter include a classical single-commodity inventory problem, the well-known buffer allocation problem, and a newly designed, laboratory-sized reconfigurable manufacturing system. Near-optimal multi-objective optimisation of two practical problems was also performed using the proposed algorithm. In the first case, some design parameters of a polymer extrusion unit are estimated using the algorithm. The management of carbon monoxide gas utilisation at an ilmenite smelter is complex with many decision variables, and the application of the algorithm in that environment is presented as a second case. Quality indicator values are estimated for thirty-four test problem instances of multi-objective optimisation problems in order to quantify the quality performance of the algorithm, and it is also compared to a commercial algorithm. The algorithm is intended to interface with dynamic, stochastic simulation models of real-world problems. It is typically implemented in a programming language while the simulation model is developed in a dedicated, commercial software package. The proposed algorithm is simple to implement and proved to be efficient on test problems.
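The single-objective cross-entropy method that underpins this algorithm can be sketched in a few lines: sample candidate solutions from a parametric distribution, keep an elite fraction, and refit the distribution to that elite until it concentrates around a good solution. The sketch below is a generic Gaussian illustration under assumed parameter names, not the thesis's multi-objective extension or its simulation interface.

```python
import random
import statistics

def cross_entropy_minimise(objective, mu, sigma, n_samples=50, elite_frac=0.2,
                           iterations=60, smoothing=0.7):
    """Minimal Gaussian cross-entropy sketch for a one-dimensional objective."""
    n_elite = max(2, int(elite_frac * n_samples))
    for _ in range(iterations):
        samples = [random.gauss(mu, sigma) for _ in range(n_samples)]
        samples.sort(key=objective)                      # best (lowest) first
        elite = samples[:n_elite]
        # Refit the sampling distribution to the elite set, with smoothing
        mu = smoothing * statistics.mean(elite) + (1 - smoothing) * mu
        sigma = smoothing * statistics.pstdev(elite) + (1 - smoothing) * sigma
        if sigma < 1e-6:                                 # distribution has collapsed
            break
    return mu

# Example: minimise a noisy quadratic, standing in for a stochastic simulation output
best_x = cross_entropy_minimise(lambda x: (x - 3.0) ** 2 + random.gauss(0, 0.05),
                                mu=0.0, sigma=5.0)
print(round(best_x, 2))
```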
72

A multi-objective evolutionary approach to simulation-based optimisation of real-world problems

Syberfeldt, Anna January 2009
This thesis presents a novel evolutionary optimisation algorithm that can improve the quality of solutions in simulation-based optimisation. Simulation-based optimisation is the process of finding optimal parameter settings without explicitly examining each possible configuration of settings. An optimisation algorithm generates potential configurations and sends these to the simulation, which acts as an evaluation function. The evaluation results are used to refine the optimisation such that it eventually returns a high-quality solution. The algorithm described in this thesis integrates multi-objective optimisation, parallelism, surrogate usage, and noise handling in a unique way to deal with the difficulties these characteristics introduce in simulation-based optimisation problems. In order to handle multiple, conflicting optimisation objectives, the algorithm uses a Pareto approach in which the set of best trade-off solutions is searched for and presented to the user. The algorithm supports a high degree of parallelism by adopting an asynchronous master-slave parallelisation model in combination with an incremental population refinement strategy. A surrogate evaluation function is adopted in the algorithm to quickly identify promising candidate solutions and filter out poor ones. A novel technique based on inheritance is used to compensate for the uncertainties associated with the approximative surrogate evaluations. Furthermore, a novel technique for multi-objective problems that effectively reduces noise by adopting a dynamic procedure for resampling solutions is used to tackle the problem of real-world unpredictability (noise). The proposed algorithm is evaluated on benchmark problems and two complex real-world problems of manufacturing optimisation. The first real-world problem concerns the optimisation of a production cell at Volvo Aero, while the second one concerns the optimisation of a camshaft machining line at Volvo Cars Engine. The results from the optimisations show that the algorithm finds better solutions than existing, similar algorithms for all the problems considered. The new techniques for dealing with surrogate imprecision and noise used in the algorithm are identified as key reasons for the good performance.
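The Pareto approach described here reduces, at its core, to a dominance test: a candidate is kept only if no other candidate is at least as good in every objective and strictly better in at least one. The sketch below is a generic, minimisation-based illustration of that filter, not code from the thesis.

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (all objectives minimised)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the non-dominated subset of a list of objective vectors."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]

# Example trade-off set: (cost, lead time)
solutions = [(10, 5), (8, 7), (12, 4), (9, 6), (8, 8)]
print(pareto_front(solutions))   # (8, 8) is dominated by (8, 7); the other four are trade-offs
```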
73

Simulation and Optimization of Integrated Maintenance Strategies for an Aircraft Assembly Process

Li, Jin 11 1900
In this thesis, the COMAC ARJ21 fuselage’s final assembly process is used as a case study. A high production rate (i.e. number of aircraft assembled per year) at reasonable cost is the overall aim in this example. The output of final assembly will essentially affect the prior and subsequent processes of the overall ARJ21 production. From the collected field data, it was identified that a number of disruptions (or bottlenecks) in the assembly sequence were caused by breakdowns and maintenance of the (semi-)automatic assembly machines, such as the portable computer numerical control (CNC) drilling machine, rivet gun and overhead crane. The focus of this thesis is therefore on the maintenance strategies (i.e. Condition-Based Maintenance (CBM)) for this equipment and how they impact the throughput of the fuselage assembly process. The fuselage assembly process is modelled and analysed using agent-based simulation in this thesis. The agent approach allows complex process interactions of assembly, equipment and maintenance to be captured and empirically studied. In this thesis, the assembly network is modelled as the sequence of activities in each stage; each stage is broken down into critical activities which are parameterised by activity lead time and the equipment used. CBM-based models of uncertain degradation and imperfect maintenance are used in the simulation study. A scatter search is used to find multi-objective optimal solutions for the CBM regime, where the maintenance-related cost and production rate are the optimisation objectives. In this thesis, in order to ease the computational intensity caused by running multiple simulations during the optimisation and to simplify the multi-objective formulation, multiple Min-Max weightings are applied to trace the Pareto front. The empirical analysis examines the trade-offs between the production rate and maintenance cost and how these objectives are influenced by the design parameters.
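The multiple Min-Max weightings mentioned above can be illustrated with a weighted Chebyshev scalarisation: for each weight vector, pick the candidate that minimises the worst weighted deviation from an ideal point, and collect the winners to trace an approximate Pareto front. The toy cost and lost-production functions below are placeholders for the simulation outputs, and all names are assumptions.

```python
def min_max_scalarise(objectives, weights, ideal):
    """Weighted Chebyshev (Min-Max) scalarisation of an objective vector."""
    return max(w * (f - z) for f, w, z in zip(objectives, weights, ideal))

# Toy stand-ins for "maintenance cost" and "lost production" as functions of a
# maintenance threshold x; the real values would come from the agent-based simulation.
def cost(x): return 1.0 / (x + 0.1)
def lost_production(x): return x ** 2

candidates = [i / 20 for i in range(1, 21)]                 # candidate thresholds
ideal = (min(cost(x) for x in candidates),
         min(lost_production(x) for x in candidates))

front = {}
for w1 in (0.1, 0.3, 0.5, 0.7, 0.9):                        # sweep of weightings
    weights = (w1, 1.0 - w1)
    best = min(candidates, key=lambda x: min_max_scalarise(
        (cost(x), lost_production(x)), weights, ideal))
    front[weights] = (round(cost(best), 3), round(lost_production(best), 3))

for w, point in front.items():
    print(w, point)
```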
74

The use of real options and multi-objective optimisation in flood risk management

Woodward, Michelle January 2012
The development of suitable long-term flood risk intervention strategies is a challenge. Climate change alone is a significant complication, but further complexities arise in trying to identify the most appropriate set of interventions, the area with the highest economic benefit and the most opportune time for implementation. All of these elements pose difficulties to decision makers. Recently, there has been a shift in the current practice for appraising potential strategies, and consideration is now being given to ensuring flexible, adaptive strategies that account for uncertain climatic conditions. Real Options in particular is becoming an acknowledged approach to account for the future uncertainties inherent in a flood risk investment decision. Real Options facilitates adaptive strategies as it enables the value of flexibility to be explicitly included within the decision making process. Opportunities are provided for the decision maker to modify and update investments when knowledge of the future state comes to light. In this thesis the use of Real Options in flood risk management is investigated as a method to account for the uncertainties of climate change. Each intervention strategy is purposely designed to capture a level of flexibility and have the ability to adapt in the future if required. A state-of-the-art flood risk analysis tool is employed to evaluate the risk associated with each strategy at future points in time. In addition to Real Options, this thesis also explores the use of evolutionary optimisation algorithms to aid the decision making process when identifying the most appropriate long-term strategies. Although the risk analysis tool is capable of quantifying the potential benefits attributed to a strategy, it is not necessarily able to identify the most appropriate one. Methods are required which can search for the optimal solutions according to a range of performance metrics. Single and multi-objective genetic algorithms are investigated in this thesis as a method to search for the most appropriate long-term intervention strategies. The Real Options concepts are combined with the evolutionary multi-objective optimisation algorithm to create a decision support methodology which is capable of searching for the most appropriate long-term, economical yet robust intervention strategies which are flexible to future change. The methodology is applied to two individual case studies, a section of the Thames Estuary and an area on the River Dodder. The results show that the inclusion of flexibility is advantageous, while the outputs provide decision makers with supplementary knowledge which has not previously been considered.
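The core Real Options intuition, that flexibility has a quantifiable value, can be sketched by comparing the expected cost of committing to an intervention now with the expected cost of deferring until a climate scenario is observed. The scenario probabilities, damages and costs below are invented purely for illustration and are not taken from the thesis or its case studies.

```python
# Expected-cost comparison of a fixed versus a flexible (deferrable) intervention.
# All figures are illustrative assumptions, not data from the thesis.
scenarios = {            # scenario -> (probability, flood damage if defences are NOT raised)
    "mild":    (0.5,  2.0),
    "central": (0.3,  8.0),
    "severe":  (0.2, 20.0),
}
raise_now_cost = 6.0          # commit immediately, covers every scenario
raise_later_cost = 7.5        # deferred construction is dearer, but optional

# Inflexible strategy: pay up front regardless of what the climate does.
fixed = raise_now_cost

# Flexible strategy: wait, observe the scenario, then raise defences only
# when doing so is cheaper than absorbing the damage.
flexible = sum(p * min(raise_later_cost, damage) for p, damage in scenarios.values())

print(f"expected cost, fixed strategy:      {fixed:.2f}")
print(f"expected cost, flexible strategy:   {flexible:.2f}")
print(f"value of flexibility (real option): {fixed - flexible:.2f}")
```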
75

Optimisation and computational methods to model the oculomotor system with focus on nystagmus

Avramidis, Eleftherios January 2015
Infantile nystagmus is a condition that causes involuntary, bilateral and conjugate oscillations of the eyes, which are predominantly restricted to the horizontal plane. In order to investigate the cause of nystagmus, computational models and nonlinear dynamics techniques have been used to model and analyse the oculomotor system. Computational models are important in making predictions and creating a quantitative framework for the analysis of the oculomotor system. Parameter estimation is a critical step in the construction and analysis of these models. A preliminary parameter estimation of a nonlinear dynamics model proposed by Broomhead et al. [1] has shown that it is able to simulate both normal rapid eye movements (i.e. saccades) and nystagmus oscillations. The application of nonlinear analysis to experimental jerk nystagmus recordings has shown that the local dimensions number of the oscillation varies across the phase angle of the nystagmus cycle. It has been hypothesised that this is due to the impact of signal dependent noise (SDN) on the neural commands in the oculomotor system. The main aims of this study were: (i) to develop parameter estimation methods for the Broomhead et al. [1] model in order to explore its predictive capacity by fitting it to experimental recordings of nystagmus waveforms and saccades; (ii) to develop a stochastic oculomotor model and examine the hypothesis that noise on the neural commands could be the cause of the behavioural characteristics measured from experimental nystagmus time series using nonlinear analysis techniques. In this work, two parameter estimation methods were developed, one for fitting the model to the experimental nystagmus waveforms and one to saccades. By using the former method, we successfully fitted the model to experimental nystagmus waveforms. This fit allowed us to find the specific parameter values that set the model to generate these waveforms. The types of waveform that we successfully fitted were asymmetric pseudo-cycloid, jerk, and jerk with extended foveation. The fitting of other types of nystagmus waveforms was not examined in this work. Moreover, the results showed which waveforms the model can generate almost perfectly, and the characteristics of a number of jerk waveforms which it cannot exactly generate; these characteristics belong to a specific type of jerk nystagmus waveform with a very extreme fast phase. The latter parameter estimation method allowed us to explore whether the model can generate horizontal saccades of different amplitudes with the same behaviour as observed experimentally. The results suggest that the model can generate the experimental saccadic velocity profiles of different saccadic amplitudes. However, the results also show that the best fits of the model to the experimental data are obtained when different model parameter values are used for different saccadic amplitudes. Our parameter estimation methods are based on multi-objective genetic algorithms (MOGA), which have the advantage of optimising biological models with a multi-objective, high-dimensional and complex search space. However, the integration of these models for a wide range of parameter combinations is very computationally intensive for a single central processing unit (CPU). To overcome this obstacle, we accelerated the parameter estimation method by utilising the parallel capabilities of a graphics processing unit (GPU). Depending on the GPU model, this can provide a speedup of about 30 times compared to a mid-range CPU.
The stochastic model that we developed is based on the Broomhead et al. [1] model, with signal dependent noise (SDN) and constant noise (CN) added to the neural commands. We fitted the stochastic model to saccades and to jerk nystagmus waveforms. It was found that SDN and CN can cause variability in the local dimensions number of the oscillation similar to that found in the experimental jerk nystagmus waveforms and, in the case of saccade generation, the saccadic variability recorded experimentally. However, there are small differences in the simulated behaviour compared to the nystagmus experimental data. We hypothesise that these could be caused by the inability of the model to simulate key jerk waveform characteristics exactly. Moreover, the differences between the simulations and the experimental nystagmus waveforms indicate that the proposed model requires further expansion, and this could include other oculomotor subsystem(s).
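The fitting task described above boils down to evaluating, for each candidate parameter set, several error measures between a simulated waveform and an experimental recording, and handing those errors to a multi-objective genetic algorithm as objectives. The sketch below illustrates only that evaluation step, with a stand-in damped-oscillation model and two assumed error measures; it is not the Broomhead et al. [1] model or the thesis's GPU implementation.

```python
import math
import random

def simulate_waveform(params, t_samples):
    """Stand-in for the oculomotor model: a damped oscillation (illustrative only)."""
    amplitude, frequency, decay = params
    return [amplitude * math.exp(-decay * t) * math.sin(2 * math.pi * frequency * t)
            for t in t_samples]

def objectives(params, t_samples, recorded):
    """Two fitting objectives: amplitude error and slope (fast-phase shape) error."""
    simulated = simulate_waveform(params, t_samples)
    amp_err = sum((s - r) ** 2 for s, r in zip(simulated, recorded))
    slope_err = sum(((s2 - s1) - (r2 - r1)) ** 2
                    for s1, s2, r1, r2 in zip(simulated, simulated[1:],
                                              recorded, recorded[1:]))
    return amp_err, slope_err

# Pretend experimental recording generated from "true" parameters plus noise.
t = [i * 0.01 for i in range(300)]
true_params = (2.0, 3.0, 0.5)
recording = [y + random.gauss(0, 0.02) for y in simulate_waveform(true_params, t)]

# Random candidate population as a MOGA would hold; only the evaluation is shown here.
population = [(random.uniform(0.5, 4), random.uniform(1, 6), random.uniform(0.1, 1))
              for _ in range(5)]
for individual in population:
    print(individual, [round(f, 3) for f in objectives(individual, t, recording)])
```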
76

Pricing Financial Option as a Multi-Objective Optimization Problem Using Firefly Algorithms

Singh, Gobind Preet 01 September 2016
An option, a type of financial derivative, is a contract that creates an opportunity for a market player to avoid risks involved in investing, especially in equities. An investor desires to know the accurate value of an option before entering into a contract to buy/sell the underlying asset (stock). There are various techniques that try to simulate real market conditions in order to price or evaluate an option. However, most of them have achieved limited success due to the high uncertainty in the price behavior of the underlying asset. In this study, I propose two new Firefly variant algorithms to compute accurate worth for European and American option contracts and compare them with popular option pricing models (such as Black-Scholes-Merton, binomial lattice, Monte-Carlo, etc.) and real market data. In my study, I have first modelled option pricing as a multi-objective optimization problem, where I introduced the pay-off and the probability of achieving that pay-off as the main optimization objectives. Then, I proposed to use a recent nature-inspired algorithm that mimics the bioluminescence of fireflies to simulate the market conditions, a first attempt in the literature. For my thesis, I have proposed an adaptive weighted-sum-based Firefly algorithm and a non-dominated sorting Firefly algorithm to find Pareto optimal solutions for the option pricing problem. Using my algorithms, I have successfully computed the complete Pareto front of option prices for a number of option contracts from the real market (Bloomberg data). Also, I have shown that one of the points on the Pareto front represents the option value within 1-2% error of the real data (Bloomberg). Moreover, with my experiments, I have shown that any investor may utilize the results in the Pareto fronts when deciding whether to enter an option contract and can evaluate the worth of a contract tuned to their risk tolerance. This implies that my proposed multi-objective model and Firefly algorithm could be used in real markets for pricing options at different levels of accuracy. To the best of my knowledge, modelling the option pricing problem as a multi-objective optimization problem and using the newly developed Firefly algorithm to solve it is unique and novel.
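The firefly mechanics behind both proposed variants can be sketched generically: dimmer fireflies move toward brighter ones with an attractiveness that decays with distance, plus a small random perturbation. The sketch below minimises a toy scalarised objective standing in for the pay-off/probability trade-off; the parameter values are common textbook defaults, not the settings used in the thesis.

```python
import math
import random

def firefly_minimise(objective, dim, n=15, iterations=100,
                     beta0=1.0, gamma=1.0, alpha=0.2, bounds=(-5.0, 5.0)):
    """Generic firefly algorithm sketch (Yang-style update), minimisation."""
    lo, hi = bounds
    swarm = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    light = [objective(x) for x in swarm]                # lower value = brighter here
    for _ in range(iterations):
        for i in range(n):
            for j in range(n):
                if light[j] < light[i]:                  # j is brighter, so i moves toward j
                    r2 = sum((a - b) ** 2 for a, b in zip(swarm[i], swarm[j]))
                    beta = beta0 * math.exp(-gamma * r2)
                    swarm[i] = [min(hi, max(lo, xi + beta * (xj - xi)
                                            + alpha * random.uniform(-0.5, 0.5)))
                                for xi, xj in zip(swarm[i], swarm[j])]
                    light[i] = objective(swarm[i])
    best = min(range(n), key=lambda k: light[k])
    return swarm[best], light[best]

# Toy scalarised objective standing in for (negative pay-off, 1 - probability).
toy = lambda x: 0.6 * (x[0] - 1.0) ** 2 + 0.4 * (x[1] + 2.0) ** 2
print(firefly_minimise(toy, dim=2))
```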
77

Optimisation of the VARTM process

Struzziero, Giacomo January 2014
This study focuses on the development of a multi-objective optimisation methodology for the vacuum assisted resin transfer moulding (VARTM) composite processing route. Simulations of the cure and filling stages of the process have been implemented and the corresponding heat transfer and flow-through-porous-media problems solved by means of finite element analysis. The simulations involved material sub-models to describe thermal properties, cure kinetics and viscosity evolution. A genetic algorithm, which constitutes the foundation for the development of the optimisation, has been adapted, implemented and tested for effectiveness using four benchmark problems. Two methodologies suitable for multi-objective optimisation of the cure and filling stages have been specified and successfully implemented. In the case of the curing stage, the optimisation aims at finding a cure profile minimising both process time and temperature overshoot within the part. In the case of the filling stage, the thermal profile during filling, the gate locations and the initial resin temperature are optimised to minimise filling time and the final degree of cure at the end of the filling stage. Investigations of the design landscape for both the curing and filling stages have indicated the complex nature of the problems under investigation, justifying the choice of a genetic algorithm. Application of the two methodologies showed that they are highly efficient in identifying appropriate process designs, and significant improvements compared to standard conditions are feasible. In the cure process a temperature overshoot reduction of up to 75% can be achieved in the case of a thick component, whilst for a thin part a 60% reduction in process time can be accomplished. In the filling process a 42% reduction in filling time and a 14% reduction in the degree of cure at the end of filling can be achieved using the optimisation methodology. Stability analysis of the set of solutions for the curing stage has shown that different degrees of robustness are present among the individuals in the Pareto front. The optimisation methodology has also been integrated with an existing cost model, which allowed process cost to be considered in the optimisation of the cure stage. The optimisation resulted in process designs that involve a €500 reduction in process cost. An inverse scheme has been developed based on the optimisation methodology, aimed at combining simulation and monitoring of the filling stage for on-line identification of permeability during an infusion. The methodology was tested using artificial data and it was demonstrated that it can handle measurement noise of up to 5 s per sensor without affecting the quality of the outcome.
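The cure-stage optimisation described here can be pictured as a bi-objective evaluation of each candidate cure profile: process time is read off the dwell schedule, while temperature overshoot comes from the thermal/cure-kinetics simulation. The sketch below uses a crude lumped exotherm estimate as a stand-in for the finite element analysis, and every constant in it is an illustrative assumption.

```python
import math

def evaluate_cure_profile(profile, thickness_mm=20.0):
    """Bi-objective evaluation of a two-dwell cure profile (crude lumped sketch).

    profile = (T1_C, t1_min, T2_C, t2_min); returns (process_time, overshoot),
    both to be minimised. The exotherm model is an illustrative assumption.
    """
    T1, t1, T2, t2 = profile
    process_time = t1 + t2
    # Crude exotherm estimate: Arrhenius-like cure rate at the hotter dwell,
    # scaled by part thickness (thicker parts shed reaction heat more slowly).
    rate = math.exp(-6000.0 / (max(T1, T2) + 273.15))
    overshoot = 5e6 * rate * (thickness_mm / 10.0)
    return process_time, overshoot

# Two candidate profiles a genetic algorithm might compare:
print(evaluate_cure_profile((160, 60, 180, 90)))   # cooler, longer cure
print(evaluate_cure_profile((180, 40, 200, 60)))   # hotter, shorter cure
```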
78

Optimal design of geothermal power plants

Clarke, Joshua 01 January 2014
The optimal design of geothermal power plants across the entire spectrum of meaningful geothermal brine temperatures and climates is investigated, while accounting for vital real-world constraints that are typically ignored in the existing literature. The constrained design space of both double-flash and binary geothermal power plants is visualized, and it is seen that inclusion of real-world constraints is vital to determining the optimal feasible design of a geothermal power plant. The effect of varying condenser temperature on optimum plant performance and optimal design specifications is analyzed. It is shown that condenser temperature has a significant effect on optimal plant design as well. The optimum specific work output and corresponding optimal design of geothermal power plants across the entire range of brine temperatures and condenser temperatures are illustrated and tabulated, allowing a scientifically sound assessment of both feasibility and appropriate plant design under any set of conditions. The performance of genetic algorithms and particle swarm optimization is compared with respect to the constrained, non-linear, simulation-based optimization of a prototypical geothermal power plant, and particle swarm optimization is shown to perform significantly better than genetic algorithms. The Pareto-optimal front of specific work output and specific heat exchanger area is visualized and tabulated for binary and double-flash plants across the full range of potential geothermal brine inlet conditions and climates, allowing investigation of the specific trade-offs required between specific work output and specific heat exchanger area. In addition to the novel data, this dissertation research illustrates the development and use of a sophisticated analysis tool, based on multi-objective particle swarm optimization, for the optimal design of geothermal power plants.
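The constrained particle swarm optimisation favoured by this work can be sketched with a simple penalty formulation in which infeasible designs incur a cost proportional to their constraint violation. The two-variable objective and the single constraint below are placeholders for the plant model and its feasibility limits, and all parameter values are ordinary textbook defaults rather than those used in the dissertation.

```python
import random

def pso_minimise(objective, constraint, dim=2, n_particles=20, iterations=200,
                 w=0.7, c1=1.5, c2=1.5, bounds=(-10.0, 10.0), penalty=1e3):
    """Penalty-based constrained PSO sketch; constraint(x) <= 0 means feasible."""
    lo, hi = bounds
    def penalised(x):
        return objective(x) + penalty * max(0.0, constraint(x))
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [penalised(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iterations):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = penalised(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Placeholder: maximise "specific work output" by minimising its negative,
# subject to a constraint standing in for a feasibility limit (x0 + x1 <= 6).
obj = lambda x: -(10 * x[0] + 6 * x[1] - x[0] ** 2 - 0.5 * x[1] ** 2)
con = lambda x: x[0] + x[1] - 6.0
print(pso_minimise(obj, con))
```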
79

Developing a Multi-Objective Decision Model for Maximizing IS Security within an Organization

May, Jeffrey Lee 01 January 2008
Numerous IS researchers have argued that IS security can be more effectively managed if the emphasis goes beyond the technical means of protecting information resources. In an effort to adopt a broader perspective that accounts for issues that transcend technical means alone, Dhillon and Torkzadeh (2006) present an array of 9 fundamental and 16 means objectives that are essential for maximizing IS security in an organization. These objectives were derived using a value-focused thinking approach and are organized into a conceptual framework. This conceptual framework provides a rigorous theoretical base for considering IS security in a manner that accounts for both technical and organizational issues; however, no direction is provided for using these objectives so that informed decisions can be made. As a result, the goal of this dissertation is to develop a decision model, using Multiple Objective Decision Analysis (MODA) techniques, that seeks to provide informed alternatives to decision makers who desire to maximize IS security within an organization.
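A common way to operationalise such objectives in a MODA model is a weighted additive value function: score each alternative against each objective, normalise to a common value scale, weight, and sum. The objectives, weights, alternatives and scores below are invented placeholders for illustration and are not Dhillon and Torkzadeh's (2006) objective set.

```python
# Simple additive multi-objective decision model (illustrative placeholders only).
objectives = {                      # objective -> swing weight (weights sum to 1.0)
    "maximize data integrity":      0.35,
    "promote security awareness":   0.25,
    "maximize access control":      0.25,
    "minimize implementation cost": 0.15,
}

# Raw scores for each alternative on each objective (0-10 scale, higher is better;
# cost has been re-expressed so that a higher score means a cheaper alternative).
alternatives = {
    "awareness programme": {"maximize data integrity": 4, "promote security awareness": 9,
                            "maximize access control": 3, "minimize implementation cost": 8},
    "encryption rollout":  {"maximize data integrity": 9, "promote security awareness": 2,
                            "maximize access control": 6, "minimize implementation cost": 4},
    "IAM overhaul":        {"maximize data integrity": 6, "promote security awareness": 3,
                            "maximize access control": 9, "minimize implementation cost": 3},
}

def overall_value(scores):
    """Weighted additive value with scores normalised to the 0-1 interval."""
    return sum(weight * (scores[obj] / 10.0) for obj, weight in objectives.items())

for name, scores in sorted(alternatives.items(), key=lambda kv: -overall_value(kv[1])):
    print(f"{name}: {overall_value(scores):.2f}")
```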
80

Modélisation, simulation et optimisation pour l'éco-fabrication / Modeling, simulation and optimization for sustainable manufacturing

Hassine, Hichem 09 February 2015
This thesis focuses on the proposal and implementation of approaches for modeling sustainable manufacturing. These approaches are used to prepare and simulate a product manufacturing process while coupling environmental and economic objectives. The approaches developed in this thesis are based on the concepts of decision support and multi-objective optimization. Decision support allows intervention at two different levels: the choice of the indicators used to quantify environmental impacts, and the choice of the final manufacturing scenario. Multi-objective optimization provides the coupling between the two main pillars of sustainable manufacturing: ecology and economy. For multi-criteria decision aid, the Evamix and Promethee methods were applied, while a particle swarm optimization algorithm was developed for the multi-objective optimization. These approaches were first applied to some machining operations: turning and milling. Finally, the phosphoric acid and sulfuric acid production lines were the subject of application of the two approaches developed.
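The Promethee step mentioned above can be illustrated with a minimal PROMETHEE II computation: pairwise preferences under a linear preference function, aggregated with criterion weights into positive, negative and net outranking flows. The machining scenarios, criteria, weights and thresholds below are invented for illustration only.

```python
# Minimal PROMETHEE II sketch with a linear preference function (illustrative data).
criteria = {                        # criterion -> (weight, direction, preference threshold p)
    "energy use":     (0.4, "min", 2.0),
    "cycle time":     (0.3, "min", 1.0),
    "surface finish": (0.3, "max", 0.5),
}
scenarios = {                       # machining scenarios with scores per criterion
    "low speed / low feed":   {"energy use": 5.0, "cycle time": 8.0, "surface finish": 7.5},
    "high speed / low feed":  {"energy use": 7.0, "cycle time": 5.0, "surface finish": 7.0},
    "high speed / high feed": {"energy use": 8.0, "cycle time": 3.5, "surface finish": 5.5},
}

def preference(a, b):
    """Aggregated preference of scenario a over scenario b."""
    total = 0.0
    for crit, (weight, direction, p) in criteria.items():
        d = scenarios[a][crit] - scenarios[b][crit]
        if direction == "min":
            d = -d                                   # smaller values are better
        total += weight * min(1.0, max(0.0, d / p))  # linear preference function
    return total

names = list(scenarios)
for a in names:
    phi_plus = sum(preference(a, b) for b in names if b != a) / (len(names) - 1)
    phi_minus = sum(preference(b, a) for b in names if b != a) / (len(names) - 1)
    print(f"{a}: net flow = {phi_plus - phi_minus:+.3f}")
```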
