71

A multi-objective evolutionary approach to simulation-based optimisation of real-world problems

Syberfeldt, Anna January 2009 (has links)
This thesis presents a novel evolutionary optimisation algorithm that can improve the quality of solutions in simulation-based optimisation. Simulation-based optimisation is the process of finding optimal parameter settings without explicitly examining each possible configuration of settings. An optimisation algorithm generates potential configurations and sends these to the simulation, which acts as an evaluation function. The evaluation results are used to refine the optimisation such that it eventually returns a high-quality solution. The algorithm described in this thesis integrates multi-objective optimisation, parallelism, surrogate usage, and noise handling in a unique way to deal with the challenges these characteristics pose in simulation-based optimisation problems. In order to handle multiple, conflicting optimisation objectives, the algorithm uses a Pareto approach in which the set of best trade-off solutions is searched for and presented to the user. The algorithm supports a high degree of parallelism by adopting an asynchronous master-slave parallelisation model in combination with an incremental population refinement strategy. A surrogate evaluation function is adopted in the algorithm to quickly identify promising candidate solutions and filter out poor ones. A novel technique based on inheritance is used to compensate for the uncertainties associated with the approximate surrogate evaluations. Furthermore, a novel technique for multi-objective problems that effectively reduces noise through a dynamic resampling procedure is used to tackle the problem of real-world unpredictability (noise). The proposed algorithm is evaluated on benchmark problems and two complex real-world problems of manufacturing optimisation. The first real-world problem concerns the optimisation of a production cell at Volvo Aero, while the second one concerns the optimisation of a camshaft machining line at Volvo Cars Engine.
The results from the optimisations show that the algorithm finds better solutions for all the problems considered than existing, similar algorithms. The new techniques for dealing with surrogate imprecision and noise used in the algorithm are identified as key reasons for the good performance.
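The Pareto approach the abstract describes rests on a dominance test between objective vectors. A minimal generic illustration, not Syberfeldt's actual implementation (both objectives are assumed to be minimised):

```python
def dominates(a, b):
    """True if solution a Pareto-dominates b (minimisation): a is no worse
    in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the non-dominated subset of a list of objective vectors."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# invented objective vectors, e.g. (cycle time, scrap rate)
points = [(1, 5), (2, 2), (4, 1), (3, 3)]
front = pareto_front(points)  # (3, 3) is dominated by (2, 2)
```

The set of non-dominated points is exactly the "set of best trade-off solutions" presented to the user.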
72

Simulation and Optimization of Integrated Maintenance Strategies for an Aircraft Assembly Process

Li, Jin 11 1900 (has links)
In this thesis, the COMAC ARJ21 fuselage’s final assembly process is used as a case study. A high production rate (i.e. number of aircraft assembled per year) at reasonable cost is the overall aim in this example. The output of final assembly essentially affects the prior and subsequent processes of the overall ARJ21 production. From the collected field data, it was identified that a number of disruptions (or bottlenecks) in the assembly sequence were caused by breakdowns and maintenance of the (semi-)automatic assembly machines, such as portable computer numerical control (CNC) drilling machines, rivet guns and overhead cranes. The focus of this thesis is therefore on the maintenance strategies (i.e. Condition-Based Maintenance (CBM)) for this equipment and how they impact the throughput of the fuselage assembly process. The fuselage assembly process is modelled and analysed using agent-based simulation. The agent approach allows complex process interactions of assembly, equipment and maintenance to be captured and empirically studied. The built network is modelled as the sequence of activities in each stage. Each stage is broken down into critical activities which are parameterized by activity lead-time and equipment used. CBM-based models of uncertain degradation and imperfect maintenance are used in the simulation study. A scatter search is used to find multi-objective optimal solutions for the CBM regime, where the maintenance-related cost and production rate are the optimization objectives. In order to ease the computational burden of running multiple simulations during the optimization and to simplify the multi-objective formulation, multiple Min-Max weightings are applied to trace the Pareto front. The empirical analysis examines the trade-offs between the production rate and maintenance cost and how these objectives are influenced by the design parameters.
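The multiple Min-Max weightings mentioned above correspond to a weighted Chebyshev scalarisation, which can reach non-convex parts of a Pareto front that a plain weighted sum misses. A hedged sketch in which the candidate values, weights and ideal point are invented for illustration:

```python
def chebyshev_scalarise(objectives, weights, ideal):
    """Weighted Min-Max (Chebyshev) scalarisation: the value of a solution
    is its worst weighted deviation from the ideal point. Minimising this
    for many different weight vectors traces the Pareto front."""
    return max(w * (f - z) for f, w, z in zip(objectives, weights, ideal))

# invented candidates: (maintenance cost, -production rate), both minimised
candidates = [(10.0, -50.0), (14.0, -60.0), (20.0, -62.0)]
ideal = (10.0, -62.0)

best = min(candidates,
           key=lambda f: chebyshev_scalarise(f, (0.5, 0.5), ideal))
```

Sweeping the weight vector over, say, (0.1, 0.9) to (0.9, 0.1) and collecting each minimiser yields an approximation of the front with one single-objective solve per weighting.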
73

The use of real options and multi-objective optimisation in flood risk management

Woodward, Michelle January 2012 (has links)
The development of suitable long-term flood risk intervention strategies is a challenge. Climate change alone is a significant complication, but complexities also exist in trying to identify the most appropriate set of interventions, the area with the highest economic benefit and the most opportune time for implementation. All of these elements pose difficulties to decision makers. Recently, there has been a shift in the current practice for appraising potential strategies, and consideration is now being given to ensuring flexible, adaptive strategies that account for uncertain climatic conditions. Real Options in particular is becoming an acknowledged approach to account for the future uncertainties inherent in a flood risk investment decision. Real Options facilitates adaptive strategies as it enables the value of flexibility to be explicitly included within the decision making process. Opportunities are provided for the decision maker to modify and update investments when knowledge of the future state comes to light. In this thesis the use of Real Options in flood risk management is investigated as a method to account for the uncertainties of climate change. Each intervention strategy is purposely designed to capture a level of flexibility and to be able to adapt in the future if required. A state-of-the-art flood risk analysis tool is employed to evaluate the risk associated with each strategy over future points in time. In addition to Real Options, this thesis also explores the use of evolutionary optimisation algorithms to aid the decision making process when identifying the most appropriate long-term strategies. Although the risk analysis tool is capable of quantifying the potential benefits attributed to a strategy, it is not necessarily able to identify the most appropriate one. Methods are required which can search for the optimal solutions according to a range of performance metrics.
Single and multi-objective genetic algorithms are investigated in this thesis as a method to search for the most appropriate long-term intervention strategies. The Real Options concepts are combined with the evolutionary multi-objective optimisation algorithm to create a decision support methodology capable of searching for economical yet robust long-term intervention strategies which are flexible to future change. The methodology is applied to two individual case studies, a section of the Thames Estuary and an area on the River Dodder. The results show that the inclusion of flexibility is advantageous, while the outputs provide decision makers with supplementary knowledge which has not previously been considered.
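The core of the Real Options argument, the value of waiting for uncertainty to resolve before committing to an intervention, can be shown with a toy two-scenario calculation (all probabilities and monetary values below are invented, not drawn from the thesis's case studies):

```python
# invented climate scenarios: (probability, flood damage avoided by a
# defence upgrade, in arbitrary monetary units)
scenarios = [(0.5, 120.0), (0.5, 30.0)]
upgrade_cost = 60.0

# inflexible strategy: commit to the upgrade now, whatever the future holds
commit_now = sum(p * (benefit - upgrade_cost) for p, benefit in scenarios)

# flexible (real-option) strategy: wait, observe which scenario unfolds,
# and upgrade only if the benefit then exceeds the cost
wait_and_see = sum(p * max(benefit - upgrade_cost, 0.0)
                   for p, benefit in scenarios)

option_value = wait_and_see - commit_now  # the value of flexibility
```

The difference between the two expectations is precisely the "value of flexibility" that Real Options makes explicit in the appraisal.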
74

Optimisation and computational methods to model the oculomotor system with focus on nystagmus

Avramidis, Eleftherios January 2015 (has links)
Infantile nystagmus is a condition that causes involuntary, bilateral and conjugate oscillations of the eyes, which are predominately restricted to the horizontal plane. In order to investigate the cause of nystagmus, computational models and nonlinear dynamics techniques have been used to model and analyse the oculomotor system. Computational models are important in making predictions and creating a quantitative framework for the analysis of the oculomotor system. Parameter estimation is a critical step in the construction and analysis of these models. A nonlinear dynamics model proposed by Broomhead et al. [1] has been shown, after a preliminary parameter estimation, to be able to simulate both normal rapid eye movements (i.e. saccades) and nystagmus oscillations. The application of nonlinear analysis to experimental jerk nystagmus recordings has shown that the local dimension of the oscillation varies across the phase angle of the nystagmus cycle. It has been hypothesised that this is due to the impact of signal dependent noise (SDN) on the neural commands in the oculomotor system. The main aims of this study were: (i) to develop parameter estimation methods for the Broomhead et al. [1] model in order to explore its predictive capacity by fitting it to experimental recordings of nystagmus waveforms and saccades; (ii) to develop a stochastic oculomotor model and examine the hypothesis that noise on the neural commands could be the cause of the behavioural characteristics measured from experimental nystagmus time series using nonlinear analysis techniques. In this work, two parameter estimation methods were developed, one for fitting the model to the experimental nystagmus waveforms and one for fitting it to saccades. By using the former method, we successfully fitted the model to experimental nystagmus waveforms. This fit allowed us to find the specific parameter values that set the model to generate these waveforms.
The types of waveform that we successfully fitted were asymmetric pseudo-cycloid, jerk and jerk with extended foveation. The fitting of other types of nystagmus waveform was not examined in this work. Moreover, the results showed which waveforms the model can generate almost perfectly, and the characteristics of a number of jerk waveforms which it cannot exactly generate. These characteristics occurred in a specific type of jerk nystagmus waveform with a very extreme fast phase. The latter parameter estimation method allowed us to explore whether the model can generate horizontal saccades of different amplitudes with the same behaviour as observed experimentally. The results suggest that the model can generate the experimental saccadic velocity profiles of different saccadic amplitudes. However, the best fits of the model to the experimental data are obtained when different model parameter values are used for different saccadic amplitudes. Our parameter estimation methods are based on multi-objective genetic algorithms (MOGA), which have the advantage of optimising biological models with a multi-objective, high-dimensional and complex search space. However, the integration of these models, for a wide range of parameter combinations, is very computationally intensive for a single central processing unit (CPU). To overcome this obstacle, we accelerated the parameter estimation method by utilising the parallel capabilities of a graphics processing unit (GPU). Depending on the GPU model, this could provide a speedup of 30 times compared to a midrange CPU. The stochastic model that we developed is based on the Broomhead et al. [1] model, with signal dependent noise (SDN) and constant noise (CN) added to the neural commands. We fitted the stochastic model to saccades and jerk nystagmus waveforms.
It was found that SDN and CN can cause variability in the local dimension of the oscillation similar to that found in the experimental jerk nystagmus waveforms and, in the case of saccade generation, the saccadic variability recorded experimentally. However, there are small differences in the simulated behaviour compared to the experimental nystagmus data. We hypothesise that these could be caused by the inability of the model to simulate exactly key jerk waveform characteristics. Moreover, the differences between the simulations and the experimental nystagmus waveforms indicate that the proposed model requires further expansion, which could include other oculomotor subsystem(s).
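A multi-objective fit of a model to experimental recordings needs a vector of error objectives rather than a single score, so that a MOGA can trade them off. A generic sketch of what such objectives might look like for waveform fitting (the two objectives are hypothetical illustrations, not those actually used in the thesis):

```python
def fit_objectives(simulated, experimental):
    """Two hypothetical error objectives for a multi-objective fit of a
    simulated eye-movement trace to an experimental recording: pointwise
    shape error and peak-to-peak amplitude error. A MOGA would minimise
    this pair jointly rather than collapsing it to one number."""
    shape_err = sum((s - e) ** 2 for s, e in zip(simulated, experimental))
    amp_err = abs((max(simulated) - min(simulated))
                  - (max(experimental) - min(experimental)))
    return shape_err, amp_err
```

Because each candidate parameter set requires integrating the model to produce `simulated`, evaluating a whole population is embarrassingly parallel, which is what makes the GPU acceleration described above effective.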
75

Pricing Financial Option as a Multi-Objective Optimization Problem Using Firefly Algorithms

Singh, Gobind Preet 01 September 2016 (has links)
An option, a type of financial derivative, is a contract that creates an opportunity for a market player to avoid risks involved in investing, especially in equities. An investor desires to know the accurate value of an option before entering into a contract to buy/sell the underlying asset (stock). There are various techniques that try to simulate real market conditions in order to price or evaluate an option. However, most of them have achieved limited success due to high uncertainty in the price behavior of the underlying asset. In this study, I propose two new Firefly variant algorithms to compute accurate values for European and American option contracts and compare them with popular option pricing models (such as Black-Scholes-Merton, binomial lattice, Monte-Carlo, etc.) and real market data. In my study, I first modelled option pricing as a multi-objective optimization problem, in which I introduced the pay-off and the probability of achieving that pay-off as the main optimization objectives. Then, I proposed to use a recent nature-inspired algorithm that uses the bioluminescence of fireflies to simulate the market conditions, a first attempt in the literature. For my thesis, I have proposed an adaptive weighted-sum based Firefly algorithm and a non-dominated sorting Firefly algorithm to find Pareto optimal solutions for the option pricing problem. Using my algorithms, I have successfully computed the complete Pareto front of option prices for a number of option contracts from the real market (Bloomberg data). Also, I have shown that one of the points on the Pareto front represents the option value within 1-2% error of the real data (Bloomberg). Moreover, with my experiments, I have shown that any investor may utilize the results in the Pareto fronts for deciding whether to enter into an option contract and can evaluate the worth of a contract tuned to their risk tolerance.
This implies that my proposed multi-objective model and Firefly algorithm could be used in real markets for pricing options at different levels of accuracy. To the best of my knowledge, modelling option pricing problem as a multi-objective optimization problem and using newly developed Firefly algorithm for solving it is unique and novel. / October 2016
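The firefly algorithm referenced above moves each candidate solution towards brighter (better) ones, with attractiveness decaying with distance. A minimal sketch of the standard movement rule, with common default parameter values rather than those of the proposed variants:

```python
import math
import random

def move_firefly(x_i, x_j, beta0=1.0, gamma=1.0, alpha=0.2, rng=None):
    """One move of the standard firefly algorithm: firefly i is attracted
    towards a brighter firefly j, with attractiveness decaying
    exponentially with squared distance, plus a small random walk."""
    rng = rng or random.Random(0)
    r2 = sum((a - b) ** 2 for a, b in zip(x_i, x_j))
    beta = beta0 * math.exp(-gamma * r2)  # attractiveness at distance r
    return [a + beta * (b - a) + alpha * (rng.random() - 0.5)
            for a, b in zip(x_i, x_j)]
```

In a multi-objective setting, "brighter" is decided either by an adaptive weighted sum of the objectives or by non-dominated sorting, which is exactly the distinction between the two proposed variants.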
76

Optimisation of the VARTM process

Struzziero, Giacomo January 2014 (has links)
This study focuses on the development of a multi-objective optimisation methodology for the vacuum assisted resin transfer moulding composite processing route. Simulations of the cure and filling stages of the process have been implemented and the corresponding heat transfer and flow through porous media problems solved by means of finite element analysis. The simulations involved material sub-models to describe thermal properties, cure kinetics and viscosity evolution. A genetic algorithm, which constitutes the foundation for the development of the optimisation, has been adapted, implemented and tested in terms of its effectiveness using four benchmark problems. Two methodologies suitable for multi-objective optimisation of the cure and filling stages have been specified and successfully implemented. In the case of the curing stage the optimisation aims at finding a cure profile minimising both process time and temperature overshoot within the part. In the case of the filling stage the thermal profile during filling, gate locations and initial resin temperature are optimised to minimise filling time and final degree of cure at the end of the filling stage. Investigations of the design landscape for both the curing and filling stages have indicated the complex nature of the problems under investigation, justifying the choice of a genetic algorithm. Application of the two methodologies showed that they are highly efficient in identifying appropriate process designs and that significant improvements compared to standard conditions are feasible. In the cure process a temperature overshoot reduction of up to 75% in the case of a thick component can be achieved, whilst for a thin part a 60% reduction in process time can be accomplished. In the filling process a 42% filling time reduction and a 14% reduction in the degree of cure at the end of filling can be achieved using the optimisation methodology.
Stability analysis of the set of solutions for the curing stage has shown that different degrees of robustness are present among the individuals in the Pareto front. The optimisation methodology has also been integrated with an existing cost model that allowed consideration of process cost in the optimisation of the cure stage. The optimisation resulted in process designs that involve a €500 reduction in process cost. An inverse scheme has been developed based on the optimisation methodology, aiming at combining simulation and monitoring of the filling stage for the identification of on-line permeability during an infusion. The methodology was tested using artificial data and it was demonstrated that it is able to handle noise levels in the measurements of up to 5 s per sensor without affecting the quality of the outcome.
77

Optimal design of geothermal power plants

Clarke, Joshua 01 January 2014 (has links)
The optimal design of geothermal power plants across the entire spectrum of meaningful geothermal brine temperatures and climates is investigated, while accounting for vital real-world constraints that are typically ignored in the existing literature. The constrained design space of both double-flash and binary geothermal power plants is visualized, and it is seen that inclusion of real-world constraints is vital to determining the optimal feasible design of a geothermal power plant. The effect of varying condenser temperature on optimum plant performance and optimal design specifications is analyzed, and it is shown that condenser temperature has a significant effect on optimal plant design as well. The optimum specific work output and corresponding optimal design of geothermal power plants across the entire range of brine temperatures and condenser temperatures is illustrated and tabulated, allowing a scientifically sound assessment of both feasibility and appropriate plant design under any set of conditions. The performance of genetic algorithms and particle swarm optimization is compared with respect to the constrained, non-linear, simulation-based optimization of a prototypical geothermal power plant, and particle swarm optimization is shown to perform significantly better than genetic algorithms. The Pareto-optimal front of specific work output and specific heat exchanger area is visualized and tabulated for binary and double-flash plants across the full range of potential geothermal brine inlet conditions and climates, allowing investigation of the specific trade-offs required between specific work output and specific heat exchanger area. In addition to the novel data, this dissertation research illustrates the development and use of a sophisticated analysis tool, based on multi-objective particle swarm optimization, for the optimal design of geothermal power plants.
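The particle swarm optimization that outperformed genetic algorithms here is built on a simple velocity-and-position update. A generic sketch with textbook coefficients, not those tuned for the plant model:

```python
import random

def pso_step(pos, vel, pbest, gbest, w=0.7, c1=1.5, c2=1.5, rng=None):
    """One particle-swarm update: inertia plus stochastic attraction to
    the particle's own best (pbest) and the swarm's global best (gbest)
    positions. Coefficients are common textbook values."""
    rng = rng or random.Random(1)
    new_vel = [w * v
               + c1 * rng.random() * (pb - x)
               + c2 * rng.random() * (gb - x)
               for x, v, pb, gb in zip(pos, vel, pbest, gbest)]
    new_pos = [x + v for x, v in zip(pos, new_vel)]
    return new_pos, new_vel
```

In the constrained, simulation-based setting of the dissertation, each position would encode plant design variables, and infeasible designs would be handled by a penalty or repair step before updating `pbest` and `gbest`.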
78

Developing a Multi-Objective Decision Model for Maximizing IS Security within an Organization

May, Jeffrey Lee 01 January 2008 (has links)
Numerous IS researchers have argued that IS Security can be more effectively managed if the emphasis goes beyond the technical means of protecting information resources. In an effort to adopt a broader perspective that accounts for issues that transcend technical means alone, Dhillon and Torkzadeh (2006) present an array of 9 fundamental and 16 means objectives that are essential for maximizing IS security in an organization. These objectives were derived using a value-focused thinking approach and are organized into a conceptual framework. This conceptual framework provides a rigorous theoretical base for considering IS security in a manner that accounts for both technical and organizational issues; however, no direction is provided for using these objectives so that informed decisions can be made. As a result, the goal of this dissertation is to develop a decision model, using Multiple Objective Decision Analysis (MODA) techniques, that seeks to provide informed alternatives to decision makers who desire to maximize IS security within an organization.
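A MODA-style decision model typically scores alternatives as a weighted sum of per-objective ratings. An illustrative sketch in which the objective names, weights and ratings are invented, not Dhillon and Torkzadeh's actual objective set:

```python
def moda_score(ratings, weights):
    """Additive multi-objective value score: each alternative is rated on
    a 0-1 scale against every objective, and the weighted ratings are
    summed. Weights reflect the decision maker's priorities and sum to 1."""
    return sum(weights[obj] * r for obj, r in ratings.items())

# invented objectives and weights for illustration only
weights = {"technical_controls": 0.40,
           "security_awareness": 0.35,
           "data_integrity": 0.25}
alt_a = {"technical_controls": 0.9, "security_awareness": 0.4, "data_integrity": 0.7}
alt_b = {"technical_controls": 0.5, "security_awareness": 0.9, "data_integrity": 0.6}
```

Ranking the alternatives by score is what turns the conceptual framework's objectives into the "informed alternatives" the dissertation aims to provide.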
79

Modélisation, simulation et optimisation pour l'éco-fabrication / Modeling, simulation and optimization for sustainable manufacturing

Hassine, Hichem 09 February 2015 (has links)
This thesis focuses on the proposal and application of approaches for modelling sustainable manufacturing (éco-fabrication). These approaches are used to prepare and simulate a product manufacturing process while coupling ecological and economic objectives. The approaches developed in this thesis are based on the concepts of decision support and multi-objective optimisation. Decision support allows intervention at two different levels: the choice of which environmental impacts to quantify and the choice of the final manufacturing scenario. Multi-objective optimisation provides the coupling between the two main pillars of sustainable manufacturing: ecology and economy. For multi-criteria decision aid, the Evamix and Promethee methods were applied, while a particle swarm optimisation algorithm was developed for the multi-objective optimisation. These approaches were applied first to some machining operations (turning and milling), and finally to the phosphoric acid and sulfuric acid production lines.
80

Planification technico-économique de la production décentralisée raccordée aux réseaux de distribution / Distribution system planning implementing distributed generation

Porkar Koumleh, Siyamak 10 January 2011 (has links)
In the context of a deregulated electricity market, a massive arrival of dispersed generation (DG: wind turbines, biomass, micro-turbines, fuel cells, solar panels, ...) at the medium-voltage level (HTA, mainly 20/33 kV) and the low-voltage level (BT, mainly 400/230 V) is to be expected. Numerous technical and economic advantages justify the development of this type of generation, among which: energy production closer to consumers, lowering transmission and distribution costs and reducing line losses; the substitution of "polluting" conventional energy with cleaner, quieter new energy sources; a very significant economic interest for DG operators thanks to the subsidies granted; in planning terms, when facing load growth, inserting DG into the distribution network avoids the construction of new high-voltage (HTB) lines; the greater ease of finding sites for small generators; the relatively short installation time of DG; for supplying isolated sites, it can be more cost-effective to feed a local distribution network with DG than to connect it to a distant HTB/HTA substation; and cogeneration, one of the most widespread forms of DG, improves energy efficiency. This thesis covers the following points: a brief description of distribution networks; the presentation of a systematic methodology for optimising the planning of distribution networks including DG; a study of the effects of network parameters on DG insertion; and a systematic study of the impacts of DG on the network. / In recent years there has been a worldwide wave of considerable change in power industries, including the operation of distribution networks.
Deregulation, open markets, alternative and local energy sources, new energy conversion technologies and other future developments of electrical power systems must pursue different goals. Growth in demand and changes in load patterns may also create major bottlenecks in the delivery of electric energy, causing distribution system stress. The complexity of the problems related to distribution system planning is mainly caused by multiple objectives. It is predicted that Distributed Generation (DG) will play an increasing role in the electrical power system of the future, not only for the cost savings but also for the additional power quality. Careful coordination and placement of DGs is mandatory: improper placement can reduce DG benefits and even jeopardize the system operation and condition. This thesis discusses the effects of DG implementation under different distribution system conditions and states, aiming not only to decrease system costs and losses but also to improve power quality, system voltage and line congestion. Three methodologies, including a mathematical model to obtain the optimal DG capacity sizing and siting investments, are presented, with the capability to solve large distribution system planning problems. These frameworks have allowed the economic and electrical benefits of introducing DG to be validated by solving the distribution system planning problem and by improving the power quality of the distribution system. DG installation increases the feeders' lifetime by reducing their loading and adds the benefit of using the existing distribution system for further load growth without the need for feeder upgrades. Moreover, by investing in DG, the DISCO can minimize its total planning cost and reduce its customers' bills.
