81 |
Planification technico-économique de la production décentralisée raccordée aux réseaux de distribution / Distribution system planning implementing distributed generation. Porkar Koumleh, Siyamak, 10 January 2011 (has links)
In a context of electricity market deregulation, a massive arrival of distributed generation (DG; in French, GED: wind turbines, biomass, micro-turbines, fuel cells, solar panels, etc.) is to be expected at the medium-voltage level (HTA, mainly 20/33 kV) and the low-voltage level (BT, mainly 400/230 V). Many technical and economic advantages justify the development of this type of generation, among them: energy production closer to consumers, hence lower transmission and distribution costs and reduced line losses; the substitution of "polluting" conventional energy by cleaner, quieter new energy sources; a strong economic incentive for DG operators thanks to the subsidies granted; in planning terms, when load grows, inserting DG into the distribution network avoids building new transmission (HTB) lines; the greater ease of finding sites for small generators; the relatively short installation time of DG; for isolated sites, it can be more cost-effective to supply a local distribution network with DG than to connect it to a distant HTB/HTA substation; and cogeneration, one of the most widespread forms of DG, improves energy efficiency. This thesis covers the following points: a brief description of distribution networks; a systematic optimization methodology for distribution network planning including DG; a study of the effects of network parameters on DG insertion; and a systematic study of the impacts of DG on the network. / In recent years there has been a worldwide wave of considerable change in power industries, including the operation of distribution networks.
Deregulation, open markets, alternative and local energy sources, new energy conversion technologies and other future developments of electrical power systems must pursue different goals. Growth in demand and changes in load patterns may also create major bottlenecks in the delivery of electric energy, stressing the distribution system. The complexity of distribution system planning problems is mainly caused by their multiple objectives. Distributed Generation (DG) is predicted to play an increasing role in the electrical power system of the future, not only for the cost savings but also for the additional power quality. Careful coordination and placement of DGs is mandatory: improper placement can reduce DG benefits and even jeopardize system operation and condition. This thesis discusses the effects of DG implementation under different distribution system conditions and states, not only to decrease system costs and losses but also to improve power quality, system voltage and line congestion. Three methodologies, each including a mathematical model to obtain the optimal DG capacity sizing and siting investments, are proposed, with the capability to solve large distribution system planning problems. These frameworks validate the economic and electrical benefits of introducing DG by solving the distribution system planning problem and by improving the power quality of the distribution system. DG installation increases the feeders' lifetime by reducing their loading, and adds the benefit of using the existing distribution system for further load growth without the need for feeder upgrades. Moreover, by investing in DG, the DISCO can minimize its total planning cost and reduce its customers' bills.
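The loss-reduction argument behind DG siting (generation close to the load cuts I^2 R line losses) can be sketched on a toy radial feeder. All figures below are illustrative assumptions, not values from the thesis:

```python
def feeder_losses_kw(loads_kw, dg_kw_at_end, r_ohm, v_kv=20.0):
    """I^2 R losses on a radial feeder with equal-resistance segments.
    Power through segment i is the sum of downstream loads minus the DG
    injected at the feeder end (simplified single-phase, unity-pf model)."""
    losses_w = 0.0
    for i in range(len(loads_kw)):
        flow_kw = sum(loads_kw[i:]) - dg_kw_at_end
        current_a = abs(flow_kw) / v_kv      # kW / kV ~ amps in this toy model
        losses_w += current_a ** 2 * r_ohm
    return losses_w / 1000.0                 # W -> kW

loads = [100.0, 100.0, 100.0, 100.0]         # four hypothetical 100 kW load points
without_dg = feeder_losses_kw(loads, 0.0, r_ohm=0.5)
with_dg = feeder_losses_kw(loads, 200.0, r_ohm=0.5)   # 200 kW DG at feeder end
```

With these numbers, losses drop from 0.375 kW to 0.075 kW, which is the qualitative effect the abstract describes; a real planning model would of course use full AC power flow.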
|
82 |
Aplicação de algoritmos genéticos multi-objetivo para alinhamento de seqüências biológicas. / Multi-objective genetic algorithms applied to protein sequence alignment. Ticona, Waldo Gonzalo Cancino, 26 February 2003 (has links)
Biological sequence alignment is a basic operation in Bioinformatics, since it serves as a basis for other processes such as, for example, the determination of the three-dimensional structure of proteins. Given the large amount of data in the sequences, mathematical and computational techniques are used to carry out this task. Traditionally, the Biological Sequence Alignment Problem is formulated as a single-objective optimization problem, in which the alignment of greatest similarity, according to a scoring scheme, is sought. Multi-Objective Optimization addresses optimization problems with several criteria to be attained. For this type of problem, there is a set of solutions that represent a trade-off between the objectives. One technique applied with success in this context is Evolutionary Algorithms, inspired by Darwin's Theory of Evolution, which work with a population of solutions that evolve until a convergence or stopping criterion is reached. This work formulates the Biological Sequence Alignment Problem as a Multi-Objective Optimization Problem, in order to find a set of solutions that represent a trade-off between the extension and the quality of the solutions. Several models of Evolutionary Algorithms for Multi-Objective Optimization were applied, and the performance of each model was evaluated using performance metrics from the literature. / Biological Sequence Alignment is a basic operation in Bioinformatics, since it serves as a basis for other processes, e.g. the determination of a protein's three-dimensional structure. Due to the large amount of data involved, mathematical and computational methods have been used to solve this problem. Traditionally, the Biological Sequence Alignment Problem is formulated as a single-objective optimization problem: each solution has a score that reflects the similarity between sequences, and the optimization process looks for the best-scoring solution.
Multi-Objective Optimization solves problems with multiple objectives that must be reached. Frequently, there is a solution set that represents a trade-off between the objectives. Evolutionary Algorithms, inspired by Darwin's Theory of Evolution, have been applied with success to this kind of problem. This work formulates Biological Sequence Alignment as a Multi-Objective Optimization Problem in order to find a set of solutions that represent a trade-off between the extension and the quality of the solutions. Several models of Evolutionary Algorithms for Multi-Objective Optimization were applied and evaluated using several performance metrics found in the literature.
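The "solution set that represents a trade-off" described here is a Pareto front of non-dominated solutions. A minimal sketch of the dominance test and front extraction, using hypothetical (extension, quality) scores for candidate alignments, both maximised:

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (maximisation):
    a is at least as good in every objective and strictly better in one."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_front(points):
    """Return the non-dominated subset of a list of objective vectors."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# hypothetical (extension, quality) scores for five candidate alignments
candidates = [(10, 3), (8, 5), (6, 6), (9, 2), (5, 5)]
front = pareto_front(candidates)   # (9, 2) and (5, 5) are dominated
```

A multi-objective evolutionary algorithm keeps and refines exactly such a non-dominated set across generations instead of collapsing the objectives into a single score.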
|
83 |
Fundamentals of Mass Transfer in Gas Carburizing. Karabelchtchikova, Olga, 18 December 2007 (has links)
Gas carburizing is an important heat treatment process used for surface hardening of automotive and aerospace steel components. The quality of the carburized parts is determined by the hardness and the case depth required for a particular application. Despite its worldwide application, the current carburizing process faces challenges in process control and variability. Case depth variability is often encountered in carburized parts and may cause problems with i) manufacturing quality rejections when tight tolerances are imposed or ii) insufficient mechanical properties and increased failure rate in service. The industrial approach to these problems often involves trial-and-error methods and empirical analysis, both of which are expensive, time consuming and, most importantly, rarely yield optimal solutions. The objective of this work was to develop a fundamental understanding of mass transfer during the gas carburizing process and to develop a strategy for process control and optimization. The research methodology was based on both experimental work and theoretical developments, and included modeling the thermodynamics of the carburizing atmosphere with various enriching gases, the kinetics of mass transfer at the gas-steel interface, and carbon diffusion in steel. The models accurately predict: 1) the atmosphere gas composition during the enriching stage of carburizing, 2) the kinetics of carbon transfer at the gas-steel surface, and 3) the carbon diffusion coefficient in steel for various process conditions and steel alloying. These models and investigations were then combined to accurately predict the surface carbon concentration and the carbon concentration profile in the steel during the heat treatment process. Finally, the models were used to develop a methodology for process optimization to minimize case depth variation, carburizing cycle time and total cycle cost.
Application of this optimization technique provides a trade-off between minimizing case depth variation and total cycle cost, and results in significant energy savings by shortening cycle time and thereby enhancing carburizing furnace capacity.
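Carbon diffusion in steel during carburizing is conventionally modelled by Fick's second law with a mass-transfer boundary condition J = beta * (C_pot - C_surface) at the gas-steel interface. The sketch below is a rough explicit finite-difference illustration of that standard textbook model, with made-up coefficients, not the thesis's fitted thermodynamic/kinetic models:

```python
import numpy as np

def carbon_profile(c0, c_pot, beta, D, depth, nx, dt, t_end):
    """Explicit finite differences for dC/dt = D d2C/dx2 on [0, depth],
    with flux J = beta*(c_pot - C) entering at x = 0 (gas-steel interface)
    and zero flux at the core side. Illustrative only."""
    dx = depth / (nx - 1)
    c = np.full(nx, c0, dtype=float)
    for _ in range(int(t_end / dt)):
        flux_in = beta * (c_pot - c[0])
        c_new = c.copy()
        c_new[1:-1] = c[1:-1] + D * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2])
        c_new[0] = c[0] + dt / dx * (flux_in - D * (c[0] - c[1]) / dx)
        c_new[-1] = c_new[-2]
        c = c_new
    return c

# illustrative magnitudes: D ~ 1e-11 m^2/s, beta ~ 1e-7 m/s, 1 h at temperature
profile = carbon_profile(c0=0.2, c_pot=1.0, beta=1e-7, D=1e-11,
                         depth=2e-3, nx=101, dt=0.5, t_end=3600.0)
```

The resulting profile rises at the surface toward (but below) the carbon potential and decays monotonically into the core, which is the qualitative case-depth behaviour the thesis models quantitatively.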
|
84 |
Multi-Objective Routing Optimization for Multiple Level Priority and Preemption in Multi-Tiered Networks. Farmer, Jason Z, 18 December 2006 (has links)
This thesis explores techniques for improving the Quality of Service (QoS) driven routing of IP traffic in a network-centric military communications system within an HC3 (High Capacity Communications Capability) tiered topology. In this specialized network, various routing algorithms were evaluated, including traditional, QoS-constrained search-based, and heuristic approaches. An automatic system for the probabilistic generation of appropriate networks and traffic was created for Monte Carlo simulation of the systems and testing of the various routing algorithms. A new algorithm we propose, based upon a hierarchical decomposition of routes about the minimum-distance routes, is described and tested. The results both provide insight into this problem and demonstrate the possibility of highly optimized solutions without exhaustive search.
|
85 |
Modelo de otimização multiobjetivo aplicado ao projeto de concepção de submarinos convencionais. / Multi-objective optimization model applied to conceptual submarine design. Pereira, Michel Henrique, 25 April 2016 (has links)
This work presents a multi-objective optimization model applied to the concept design of conventional (i.e. diesel-electric) submarines. A synthesis model is formulated that allows the estimation of weights, volume, speed, electrical load and other characteristics of interest for concept design. The synthesis model is integrated with a multi-objective optimization model based on genetic algorithms (specifically, the NSGA-II algorithm). The multi-objective optimization consists of maximizing the submarine's military effectiveness and minimizing its cost. Military effectiveness is represented by an Overall Measure of Effectiveness (OMOE) established through the Analytic Hierarchy Process (AHP). The submarine's Basic Construction Cost (BCC) is estimated from its weight groups. At the end of the optimization process, a Pareto front composed of non-dominated solutions is established. One of these solutions is selected for preliminary refinement and the results are discussed. This dissertation also presents a succinct discussion of historical and operational aspects of submarines and of their design methodology; some concepts of Naval Architecture applied to the design of these vessels are addressed as well. / This thesis presents a multi-objective optimization model applied to the concept design of conventional submarines (i.e. diesel-electric powered boats). A synthesis model that allows the estimation of weights, volume, speed, electrical load and other design features of interest is formulated. The synthesis model is integrated with a multi-objective optimization model based on genetic algorithms (specifically, the NSGA-II algorithm). The multi-objective optimization consists of maximizing the submarine's military effectiveness and minimizing its cost.
The military effectiveness is represented by an Overall Measure of Effectiveness (OMOE) established via the Analytic Hierarchy Process (AHP). The submarine's Basic Construction Cost (BCC) is estimated from its weight groups. At the end of the optimization process, a Pareto front composed of non-dominated solutions is established. One of these solutions is selected for preliminary refinement and the results are discussed. This work also presents a succinct discussion of submarine historical and operational aspects and design methodology. Some Naval Architecture concepts, applied to submarine design, are also discussed.
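Establishing an OMOE via AHP typically reduces to extracting the principal eigenvector of a pairwise comparison matrix as the attribute weight vector. A sketch using power iteration, with a hypothetical attribute set (this is standard AHP, not the thesis's actual hierarchy or judgments):

```python
import numpy as np

def ahp_weights(pairwise, iters=100):
    """Principal-eigenvector weights of an AHP pairwise comparison matrix,
    computed by power iteration and normalised to sum to 1."""
    w = np.ones(pairwise.shape[0])
    for _ in range(iters):
        w = pairwise @ w
        w /= w.sum()
    return w

# hypothetical reciprocal comparison of three effectiveness attributes,
# say stealth vs endurance vs payload
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
weights = ahp_weights(A)   # largest weight goes to the first attribute
```

The OMOE of a design is then a weighted sum of its attribute scores using these weights, giving the scalar effectiveness objective that the genetic algorithm maximises against cost.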
|
86 |
End to end Multi-Objective Optimisation of H.264 and HEVC CODECs. Al Barwani, Maryam Mohsin Salim, January 2018 (has links)
All multimedia devices now incorporate video CODECs that comply with international video coding standards such as H.264 / MPEG-4 AVC and the new High Efficiency Video Coding standard (HEVC), otherwise known as H.265. Although the standard CODECs have been designed to include algorithms with optimal efficiency, a large number of coding parameters can be used to fine-tune their operation within known constraints of, e.g., available computational power, bandwidth and consumer QoS requirements. With so many parameters involved, determining which of them play a significant role in providing optimal quality of service within given constraints is a challenge in itself; how to select the values of those significant parameters so that the CODEC performs optimally under the given constraints is a further important question. This thesis proposes a framework that uses machine learning algorithms to model the performance of a video CODEC based on the significant coding parameters. Means of modelling both Encoder and Decoder performance are proposed. We define objective functions that can be used to model the performance-related properties of a CODEC, i.e., video quality, bit-rate and CPU time. We show that these objective functions can be practically utilised in video Encoder/Decoder designs, in particular in their performance optimisation within given operational and practical constraints. A Multi-Objective Optimisation framework based on Genetic Algorithms is thus proposed to optimise the performance of a video CODEC. The framework is designed to jointly minimise the CPU time and bit-rate and to maximise the quality of the compressed video stream. The thesis presents the use of this framework in the performance modelling and multi-objective optimisation of the most widely used video coding standard in practice at present, H.264, and the latest video coding standard, H.265/HEVC.
When a communication network is used to transmit video, performance-related parameters of the communication channel will impact the end-to-end performance of the video CODEC. Network delays and packet loss will impact the quality of the video received at the decoder via the communication channel; i.e., even if a video CODEC is optimally configured, network conditions will make the experience sub-optimal. Given the above, the thesis proposes the design, integration and testing of a novel approach to simulating a wired network, using the UDP protocol for the transmission of video data. This network is subsequently used to simulate the impact of packet loss and network delays on optimally coded video, based on the framework previously proposed for the modelling and optimisation of video CODECs. The quality of received video under different levels of packet loss and network delay is simulated, and conclusions are drawn about the impact on transmitted video based on its content and features.
|
87 |
Algoritmos evolutivos multi-objetivo para a reconstrução de árvores filogenéticas / Evolutionary multi-objective algorithms for Phylogenetic Inference. Ticona, Waldo Gonzalo Cancino, 11 February 2008 (has links)
The phylogenetic reconstruction problem aims to determine the evolutionary relationships among species, usually represented as tree structures. However, this problem has proved very difficult, since the search space of possible trees is very large. Several phylogenetic reconstruction methods have been proposed; many of them define an optimality criterion to evaluate the possible solutions. However, applying different criteria results in different trees, inconsistent with one another. In this context, a multi-objective approach to phylogenetic reconstruction can be useful, producing a set of trees considered adequate under more than one criterion. This thesis proposes a multi-objective evolutionary algorithm, called PhyloMOEA, for the phylogenetic reconstruction problem. PhyloMOEA employs the parsimony and likelihood criteria, used by two of the most widely applied phylogenetic reconstruction methods. In the experiments, PhyloMOEA was tested on four sequence datasets frequently used in the literature. For each dataset, PhyloMOEA found the Pareto-front solutions, which represent a compromise between the criteria considered. The Pareto-front trees were statistically validated using the SH test. The results showed that PhyloMOEA found a number of intermediate solutions that are consistent with the solutions obtained by maximum parsimony and maximum likelihood analyses performed separately. Furthermore, the clade support values of the trees found by PhyloMOEA were compared with the clade posterior probabilities computed by the MrBayes program on the four datasets; the results indicate a relationship between the two values for several groups of clades. In summary, PhyloMOEA is capable of finding a diversity of intermediate solutions that are statistically as good as the best maximum parsimony and maximum likelihood solutions; such solutions represent a compromise between the two objectives. / The phylogeny reconstruction problem consists of determining the evolutionary relationships (usually represented as a tree) among species. This is a very complex problem, since the tree search space is huge. Several phylogenetic reconstruction methods have been proposed; many of them define an optimality criterion for the evaluation of possible solutions. However, different criteria may lead to distinct phylogenies, which often conflict with each other. In this context, a multi-objective approach to phylogeny reconstruction can be useful, since it can produce a set of optimal trees according to multiple criteria. In this thesis, a multi-objective evolutionary algorithm for phylogenetic reconstruction, called PhyloMOEA, is proposed. PhyloMOEA uses the parsimony and likelihood criteria, two of the most widely used in phylogenetic reconstruction. PhyloMOEA was tested using four datasets of nucleotide sequences found in the literature. For each dataset, the proposed algorithm found a Pareto front representing a trade-off between the criteria used. Trees on the Pareto front were statistically validated using the SH test, which showed that a number of intermediate solutions from PhyloMOEA are consistent with solutions found by single-criterion phylogenetic methods. Moreover, clade support values from trees found by PhyloMOEA were compared to clade posterior probabilities obtained by MrBayes; the results indicate a correlation between these probabilities for several clades. In summary, PhyloMOEA is able to find diverse intermediate solutions which are not statistically worse than the best solutions under the maximum parsimony and maximum likelihood criteria.
Moreover, the intermediate solutions represent a trade-off between these criteria.
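The parsimony criterion used as one of PhyloMOEA's objectives can be illustrated with Fitch's classic small-parsimony algorithm, which counts the minimum number of substitutions a tree requires for one character. The toy tree and states below are invented for illustration; the thesis's actual scoring and datasets are not reproduced here:

```python
def fitch_parsimony(tree, leaf_states):
    """Fitch small-parsimony score of one character on a rooted binary tree
    given as nested 2-tuples of leaf names. Returns the minimum number of
    state changes required (standard Fitch bottom-up pass)."""
    def rec(node):
        if isinstance(node, str):                    # leaf: its observed state
            return {leaf_states[node]}, 0
        (sl, cl), (sr, cr) = rec(node[0]), rec(node[1])
        inter = sl & sr
        if inter:                                    # children agree: no change
            return inter, cl + cr
        return sl | sr, cl + cr + 1                  # disagreement: one change

    return rec(tree)[1]

tree = ((("A", "B"), "C"), "D")
states = {"A": "G", "B": "G", "C": "T", "D": "T"}
score = fitch_parsimony(tree, states)   # one substitution suffices here
```

A full parsimony objective sums this score over all alignment columns; PhyloMOEA trades that sum off against the tree's likelihood rather than optimising either alone.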
|
88 |
Scalarization and stability in multi-objective optimization / Stabilité et scalarisation en programmation multi-objectif. Zamani, Moslem, 12 July 2016 (has links)
This thesis addresses three questions arising in multi-objective optimization. First, we study the existence of efficient solutions via scalarization techniques. Benson's theorem is extended from the convex case to a general setting, and further scalarization techniques are examined. Second, we address the question of robustness. We review the concepts proposed in the literature on the subject and extend the definition of Georgiev and his collaborators to nonlinear multi-objective optimization. Necessary and sufficient conditions for a robust solution are given under appropriate assumptions, and the relationships between this notion of robustness and some of the definitions mentioned are highlighted. Two types of modifications of the objective functions are treated, and the relationships between the weakly/properly/robustly efficient solutions are established. The last chapter is devoted to sensitivity and stability analysis in parametrized multi-objective optimization. We show, under weak conditions, that the set-valued mappings of feasible sets and feasible values are strictly semi-differentiable; we give some sufficient conditions for the semi-differentiability of the efficient set and the efficient values; and we study the pseudo-Lipschitz continuity of the aforementioned set-valued mappings. / In this thesis, three crucial questions arising in multi-objective optimization are investigated. First, the existence of properly efficient solutions via scalarization tools is studied. A basic theorem credited to Benson is extended from the convex case to the general case. Some further scalarization techniques are also discussed. The second part of the thesis is devoted to robustness. Various notions from the literature are briefly reviewed.
Afterwards, a norm-based definition given by Georgiev, Luc and Pardalos is generalized to nonlinear multi-objective optimization. Necessary and sufficient conditions for robust solutions under appropriate assumptions are given. Relationships between the new robustness notion and some known ones are highlighted. Two kinds of modifications of the objective functions are dealt with, and relationships between the weak/proper/robust efficient solutions of the problems, before and after the perturbation, are established. Finally, sensitivity analysis and stability in parametrized multi-objective optimization are discussed. Strict semi-differentiability of the set-valued mappings of feasible sets and feasible values is proved under appropriate assumptions, some sufficient conditions for the semi-differentiability of efficient sets and efficient values are presented, and the pseudo-Lipschitz continuity of the aforementioned set-valued mappings is investigated.
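The simplest scalarization in this family is the weighted sum: minimise w1*f1 + w2*f2 for positive weights. A sketch over a finite set of objective vectors, which also exposes the classic limitation that motivates other scalarizations (such as Benson's approach mentioned above): only supported efficient points, those on the convex hull of the front, are ever recovered. The point set is invented for illustration:

```python
def weighted_sum_argmin(points, w):
    """Minimiser of w[0]*f1 + w[1]*f2 over a finite set of objective
    vectors (f1, f2), both objectives to be minimised. For positive
    weights the minimiser is an efficient (Pareto-optimal) point."""
    return min(points, key=lambda p: w[0] * p[0] + w[1] * p[1])

points = [(1, 5), (2, 3), (4, 2), (5, 1), (3, 4)]
front = {weighted_sum_argmin(points, (w, 1 - w))
         for w in (0.1, 0.3, 0.5, 0.7, 0.9)}
```

Here (3, 4) is dominated and is never selected, as expected; but (4, 2), although efficient, is unsupported (it lies above the segment joining (2, 3) and (5, 1)) and no weight vector can recover it, which is why stronger scalarization results matter in the nonconvex case.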
|
89 |
Multi-objective optimisation in additive manufacturing. Strano, Giovanni, January 2012 (has links)
Additive Manufacturing (AM) has demonstrated great potential to advance product design and manufacturing, and has shown higher flexibility than conventional manufacturing techniques for the production of small-volume, complex and customised components. In an economy focused on the need to develop customised and hi-tech products, there is increasing interest in establishing AM technologies as a more efficient production approach for high-value products such as aerospace and biomedical components. Nevertheless, in the current state of the technology, the use of AM processes even for small to medium volume production faces a number of issues. AM is normally used for making parts with complex geometry, which entails the assessment of numerous processing options; the wrong choice of process parameters can result in poor surface quality, excessive manufacturing time and energy waste, and thus increased production costs and resources. A few commonly used AM processes require cellular support structures for the production of overhanging parts; depending on the object's complexity, their removal can be impossible or very costly in time and resources. Currently, there is a lack of tools to advise the AM operator on the optimal choice of process parameters. This prevents the diffusion of AM as an efficient production process for enterprises, and as affordable access to democratised product development for individual users. Research in the literature has focused mainly on the optimisation of single criteria for AM production. An integrated predictive modelling and optimisation technique has not yet been well established for identifying an efficient process set-up for complicated products, which often involve critical building requirements.
For instance, there are no robust methods for the optimal design of complex cellular support structures, and most of the software commercially available today does not provide adequate guidance on how to optimally orientate the part on the machine bed, or on which particular combination of cellular structures should be used as support. The wrong choice of support and orientation can lead to structural collapse during an AM process such as Selective Laser Melting (SLM), due to the high thermal stress in the junctions between fillets of different cells. Another issue of AM production is the limited surface quality of parts, typically caused by the discrete deposition and fusion of material. This research has therefore also focused on the formation of the surface morphology of AM parts. Analysis of SLM parts showed that the measured roughness differed from that predicted by a classic model based on purely geometrical considerations of the stair-step profile. Experiments also revealed the presence of partially bonded particles on the surface, and an explanation of this phenomenon has been proposed. These results have been integrated into a novel mathematical model for the prediction of the surface roughness of SLM parts. The model correctly describes the observed trend of the experimental data, and thus provides an accurate prediction of surface roughness. This thesis aims to deliver an effective computational methodology for the multi-objective optimisation of the main building conditions that affect the process efficiency of AM production. For this purpose, mathematical models have been formulated for the determination of parts' surface quality, manufacturing time and energy consumption, and for the design of optimal cellular support structures. All the predictive models have been used to evaluate multiple performance and cost objectives; the objectives typically conflict with one another, and all are greatly affected by the part's build orientation.
A multi-objective optimisation technique has been developed to visualise and identify optimal trade-offs between all the conflicting objectives for the most efficient AM production. Hence, this thesis delivers a decision support system to assist the operator in the "process planning" stage, in order to achieve optimal efficiency and sustainability in AM production through maximum material, time and energy savings.
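The "classic model based on purely geometrical considerations of the stair-step profile" mentioned in this record is commonly idealised as roughness proportional to layer thickness times the cosine of the surface angle. A sketch of one such idealisation, Ra of a triangular stair-step profile taken as roughly (t/4)*|cos(theta)|; this is a hedged textbook-style approximation, precisely the kind of purely geometric model the thesis found to underestimate real SLM roughness (e.g. because of partially bonded particles):

```python
import math

def stair_step_ra(layer_thickness_um, theta_deg):
    """Idealised stair-step surface roughness (arithmetic mean, Ra) of a
    layered surface: Ra ~ (t/4)*|cos(theta)|, pure geometry only.
    theta_deg = 0 for an up-facing near-horizontal surface, 90 for vertical."""
    theta = math.radians(theta_deg)
    return layer_thickness_um * abs(math.cos(theta)) / 4.0

# 30 um layers: geometric roughness shrinks as the wall approaches vertical
ra_shallow = stair_step_ra(30, 5)    # near-horizontal surface: coarse steps
ra_steep = stair_step_ra(30, 80)     # near-vertical wall: fine steps
```

This angle dependence is exactly why build orientation enters the surface-quality objective in the multi-objective optimisation described above.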
|
90 |
Computation offloading for algorithms in absence of the Cloud. Sthapit, Saurav, January 2018 (has links)
Mobile cloud computing is a way of delegating complex algorithms from a mobile device to the cloud, to complete tasks quickly and save energy on the mobile device. However, the cloud may not always be available or suitable to help; in a battlefield scenario, for example, the cloud may not be reachable. This work considers neighbouring devices as alternatives to the cloud for offloading computation, and presents three key contributions: a comprehensive investigation of the trade-off between computation and communication; a multi-objective-optimisation-based approach to offloading; and queueing-theory-based algorithms that demonstrate the benefits of offloading to neighbours. Initially, the states of neighbouring devices are assumed to be known, and the computation offloading decision is posed as a multi-objective optimisation problem, for which novel Pareto-optimal solutions are proposed. The results on a simulated dataset show up to a 30% performance improvement even when cloud computing is not available. However, information about the environment is seldom known completely. In Chapter 5, a more realistic environment is considered, with delayed node-state information and partially connected sensors. The network of sensors is modelled as a network of queues (an Open Jackson network), and the offloading problem is posed as a minimum-cost problem and solved using linear solvers. In addition to the simulated dataset, the proposed solution is tested on a real computer-vision dataset. The experiments on the random-waypoint dataset showed up to a 33% performance boost, whereas on the real dataset, exploiting the temporal and spatial distribution of the targets, a significantly higher performance improvement is achieved.
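The queueing view of offloading can be illustrated with M/M/1 mean response times: send a job stream to a neighbour when its response time plus the transfer delay beats local execution. This is only a toy pairwise decision rule with invented rates; the thesis itself solves a minimum-cost problem over a whole Open Jackson network:

```python
def mm1_response(arrival_rate, service_rate):
    """Mean response time (sojourn time) of a stable M/M/1 queue,
    T = 1 / (mu - lambda). Requires lambda < mu."""
    assert arrival_rate < service_rate, "queue must be stable"
    return 1.0 / (service_rate - arrival_rate)

def offload_wins(local_rate, local_mu, nbr_rate, nbr_mu, tx_delay):
    """True if the neighbour's response time plus one-way transfer delay
    beats processing locally (toy rule, known rates, no energy term)."""
    return mm1_response(nbr_rate, nbr_mu) + tx_delay < mm1_response(local_rate, local_mu)

# hypothetical: busy device (9 jobs/s against 10/s capacity) versus an
# idle neighbour (1 against 10), with a 0.2 s radio transfer delay
decision = offload_wins(9.0, 10.0, 1.0, 10.0, 0.2)   # True: 0.311 s < 1.0 s
```

Delayed state information, the complication studied in Chapter 5, would make the neighbour's arrival rate here an estimate rather than a known quantity, which is what pushes the thesis toward the Jackson-network min-cost formulation.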
|