41 |
Utilizing Swarm Intelligence Algorithms for Pathfinding in Games / Kelman, Alexander, January 2017 (has links)
Ant Colony Optimization (ACO) and Particle Swarm Optimization (PSO) are two Swarm Intelligence algorithms often used for optimization. Swarm Intelligence relies on agents that possess only fragmented knowledge, a concept not often exploited in games. The aim of this study is to investigate whether these Swarm Intelligence algorithms offer any benefits over standard pathfinding algorithms such as A* in a game setting. Games often consist of dynamic environments with mobile agents, so all experiments were conducted with dynamic destinations. The algorithms were measured on the length of the path found and the time taken to calculate it, and were implemented with minor modifications so that they function better in a grid-based environment. The Ant Colony Optimization was modified in how pheromone is distributed in the dynamic environment, to let the algorithm path towards a mobile target, whereas the Particle Swarm Optimization was given set start positions and velocities to enlarge the initial search space, plus modifications to increase particle diversity. The results show that the Swarm Intelligence algorithms achieved excellent calculation speeds but were unable to match the path optimality of A*. The implementations can be improved further, but the algorithms show potential to be useful in games.
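The pheromone-guided grid search described above can be sketched in a few lines. The following is a deliberately minimal ACO pathfinder on a 4-connected grid; all parameter values, and the choice of inverse Manhattan distance as the heuristic, are illustrative assumptions rather than the thesis's actual implementation.

```python
import random

def aco_grid_path(grid, start, goal, n_ants=20, n_iter=30,
                  evaporation=0.5, seed=0):
    """Toy ACO pathfinder on a 4-connected grid (0 = free, 1 = wall)."""
    rng = random.Random(seed)
    rows, cols = len(grid), len(grid[0])
    tau = {}  # pheromone per cell; unvisited cells default to 1.0

    def neighbours(cell):
        r, c = cell
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                yield (nr, nc)

    def heuristic(cell):  # inverse Manhattan distance to the goal
        d = abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])
        return 1.0 / (1 + d)

    best = None
    for _ in range(n_iter):
        paths = []
        for _ in range(n_ants):
            cell, path, seen = start, [start], {start}
            while cell != goal and len(path) < rows * cols:
                opts = [n for n in neighbours(cell) if n not in seen]
                if not opts:
                    break  # ant is stuck in a dead end
                weights = [tau.get(n, 1.0) * heuristic(n) for n in opts]
                cell = rng.choices(opts, weights)[0]
                path.append(cell)
                seen.add(cell)
            if cell == goal:
                paths.append(path)
        for k in tau:                      # evaporate old pheromone
            tau[k] *= (1 - evaporation)
        for path in paths:                 # deposit: shorter path, more pheromone
            for c in path:
                tau[c] = tau.get(c, 1.0) + 1.0 / len(path)
            if best is None or len(path) < len(best):
                best = path
    return best

grid = [[0, 0, 0, 0],
        [1, 1, 0, 1],
        [0, 0, 0, 0],
        [0, 1, 1, 0]]
path = aco_grid_path(grid, (0, 0), (3, 3))
```

For a moving target, the deposit step would be rerun as the goal cell changes, which is the part the thesis modifies.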
|
42 |
Global supply chain optimization : a machine learning perspective to improve Caterpillar's logistics operations / Veluscek, Marco, January 2016 (has links)
Supply chain optimization is one of the key components of the effective management of a company with a complex manufacturing process and distribution network. Companies with a global presence in particular are motivated to optimize their distribution plans in order to keep their operating costs low and competitive. Changing conditions in the global market and volatile energy prices increase the need for an automatic decision and optimization tool. In recent years, many techniques and applications have been proposed to address the problem of supply chain optimization. However, such techniques are often too problem-specific or too knowledge-intensive to be implemented as inexpensive, easy-to-use computer systems. The effort required to implement an optimization system for a new instance of the problem appears to be quite significant: the development process necessitates the involvement of expert personnel, and the level of automation is low. The aim of this project is to develop a set of strategies capable of increasing the level of automation when developing a new optimization system. An increased level of automation is achieved by focusing on three areas: multi-objective optimization, optimization algorithm usability, and optimization model design. A literature review highlighted the great level of interest in the problem of multi-objective optimization in the research community. However, the review emphasized a lack of standardization in the area and insufficient understanding of the relationship between multi-objective strategies and problems. Experts in the areas of optimization and artificial intelligence are interested in improving the usability of the most recent optimization algorithms. They stated the concern that the large number of variants and parameters that characterizes such algorithms affects their potential applicability in real-world environments.
Such characteristics are seen as the root cause of the limited success of the most recent optimization algorithms in industrial applications. A crucial task in the development of an optimization system is the design of the optimization model. This task is one of the most complex in the development process; however, it is still performed mostly manually. The importance and complexity of the task strongly suggest the development of tools to aid the design of optimization models. In order to address these challenges, the problem of multi-objective optimization is first considered and the most widely adopted techniques to solve it are identified. These techniques are analyzed and described in detail to increase the level of standardization in the area. Empirical evidence is highlighted to suggest what type of relationship exists between strategies and problem instances. Regarding the optimization algorithm, a classification method is proposed to improve its usability and computational requirements by automatically tuning one of its key parameters, the termination condition. The algorithm assesses the problem complexity and automatically assigns the best termination condition to minimize runtime. The runtime of the optimization system has been reduced by more than 60%. Arguably, the usability of the algorithm has been improved as well, as one of the key configuration tasks can now be completed automatically. Finally, a system is presented to aid the definition of the optimization model through regression analysis. The purpose of the method is to gather as much knowledge about the problem as possible so that the task of defining the optimization model requires less user involvement. It is estimated that applying the proposed algorithm could have saved almost 1,000 man-weeks on the project. The developed strategies have been applied to the problem of Caterpillar's global supply chain optimization.
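The idea of automatically assigning a termination condition can be illustrated with a toy classifier: given features of a new problem instance, look up the budget that worked for the most similar past instance. The feature choice, the data, and the 1-nearest-neighbour rule below are all hypothetical; they are not the thesis's actual classification method.

```python
# Hypothetical training history: (problem features, iteration budget that
# sufficed). Features here are (number of facilities, number of products).
history = [
    ((5, 100), 200),
    ((10, 500), 500),
    ((40, 5000), 2000),
    ((80, 20000), 5000),
]

def predict_budget(features, history):
    """1-nearest-neighbour guess of a termination condition:
    reuse the budget of the closest previously solved instance."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(history, key=lambda h: dist(h[0], features))[1]

budget = predict_budget((12, 600), history)
```

Stopping a metaheuristic as soon as the predicted budget is spent is what yields the runtime savings the abstract reports.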
This thesis also describes the process of developing an optimization system for Caterpillar and highlights the challenges and research opportunities identified while undertaking this work. It describes the optimization model designed for Caterpillar's supply chain and the implementation details of the Ant Colony System, the algorithm selected to optimize the supply chain. The system is now used to design the distribution plans of more than 7,000 products and has improved Caterpillar's marginal profit on these products by 4.6% on average.
|
43 |
Uma abordagem distribuída e bio-inspirada para mapeamento de ambientes internos utilizando múltiplos robôs móveis / A distributed and bioinspired approach for mapping of indoor environments using multiple mobile robots / Janderson Rodrigo de Oliveira, 31 March 2014 (has links)
Map-building strategies using multiple mobile robots have several advantages over single-robot strategies in terms of flexibility, information gain, and reduced map-building time. In this work, a local map integration method based on inter-robot observations is proposed, considering a recent approach to environment exploration: the Inverse Ant System-Based Surveillance System (IAS-SS) strategy. The IAS-SS strategy is inspired by the biological mechanisms that define the social organization of swarm systems; specifically, it is based on a modified version of the classical ant colony optimization algorithm. The main contribution of this work is the adaptation of an information-sharing model used in mobile sensor networks to mapping tasks. Another important contribution is the collaboration between the local map integration method and a multi-robot coordination strategy based on ant colony theory. This collaboration makes it possible to develop an exploration approach that uses a non-physical mechanism for depositing and detecting pheromones in real environments, based on the concept of integrated virtual pheromones. Simulation results show that the map integration method is efficient; the trials were performed with a variable number of robots exploring indoor environments of different shapes and structures. The results of several experiments confirm that the integration process is effective and suitable for mapping the environment during exploration and surveillance tasks.
|
44 |
Ant Clustering with Consensus / Gu, Yuhua, 01 April 2009 (has links)
Clustering is actively used in several research fields, such as pattern recognition, machine learning, and data mining. This dissertation focuses on clustering algorithms in the data mining area. Clustering algorithms can be applied to the unsupervised learning problem of finding clusters in unlabeled data. Most clustering algorithms require the number of cluster centers to be known in advance; however, this is often not the case in real-world applications. Another question is, once clusters are found by an algorithm, do we believe they are exactly the right ones, or do better ones exist? In this dissertation, we present two new Swarm Intelligence based approaches to data clustering that address these issues. Swarm-based approaches to clustering have been shown to be able to escape local extrema by performing a form of global search, and our two newly proposed ant clustering algorithms take advantage of this. The first is a kernel-based fuzzy ant clustering algorithm using the Xie-Beni partition validity metric. It is a two-stage algorithm: in the first stage, ants move the cluster centers in feature space, and the centers found are evaluated using a reformulated kernel-based Xie-Beni cluster validity metric. We found that when provided with more clusters than exist in the data, our new ant-based approach produces a partition with empty clusters and/or very lightly populated clusters. The second stage then automatically detects the number of clusters in a data set using threshold solutions. The second ant clustering algorithm, using chemical recognition of nestmates, combines an ant-based algorithm with a consensus clustering algorithm. It is a two-stage algorithm that requires no initial knowledge of the number of clusters.
The main contributions of this work are to use the ability of an ant-based clustering algorithm to determine the number of cluster centers and refine them, and then to apply a consensus clustering algorithm to obtain a higher-quality final solution. We also introduce an ensemble ant clustering algorithm that finds a consistent number of clusters with appropriate parameters, and propose a modified online ant clustering algorithm to handle clustering of large data sets. To our knowledge, we are the first to use consensus to combine multiple ant partitions to obtain robust clustering solutions. Experiments were done with twelve data sets, including benchmark data sets, two artificially generated data sets, and two magnetic resonance image brain volumes. The results show how the ant clustering algorithms play an important role in finding the number of clusters and providing useful information for consensus clustering to locate the optimal clustering solutions. We conducted a wide range of comparative experiments that demonstrate the effectiveness of the new approaches.
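The evidence-accumulation step that consensus clustering typically relies on can be sketched as follows: build a co-association matrix counting how often each pair of points is clustered together across the input partitions, then group points whose co-association is high. This is a generic sketch of consensus clustering, not the dissertation's exact combination method.

```python
import numpy as np

def co_association(partitions):
    """Fraction of input partitions in which each pair of points
    shares a cluster (the usual evidence-accumulation matrix)."""
    n = len(partitions[0])
    co = np.zeros((n, n))
    for labels in partitions:
        labels = np.asarray(labels)
        co += (labels[:, None] == labels[None, :]).astype(float)
    return co / len(partitions)

def consensus_labels(partitions, threshold=0.5):
    """Greedy grouping: a point joins an existing cluster when its
    co-association with that cluster's first member exceeds the threshold."""
    co = co_association(partitions)
    n = co.shape[0]
    labels = [-1] * n
    next_id = 0
    for i in range(n):
        if labels[i] == -1:
            labels[i] = next_id
            for j in range(i + 1, n):
                if labels[j] == -1 and co[i, j] > threshold:
                    labels[j] = next_id
            next_id += 1
    return labels

# Three ant partitions of four points; label IDs differ but the grouping agrees.
parts = [[0, 0, 1, 1], [1, 1, 0, 0], [0, 0, 0, 1]]
labels = consensus_labels(parts)
```

Note that the co-association matrix is invariant to the arbitrary label IDs of each input partition, which is what makes it a convenient way to merge multiple ant runs.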
|
45 |
ACODV : Ant Colony Optimisation Distance Vector routing in ad hoc networks / Du Plessis, Johan, 11 April 2007 (has links)
A mobile ad hoc network is a collection of wireless mobile devices which dynamically form a temporary network, without using any existing network infrastructure or centralised administration. Each node in the network effectively becomes a router and forwards packets towards the packet's destination node. Ad hoc networks are characterised by frequently changing network topology, multi-hop wireless connections, and the need for dynamic, efficient routing protocols. This work considers the routing problem in a network of uniquely addressable sensors. These networks are encountered in many industrial applications, where the aim is to relay information from a collection of data-gathering devices deployed over an area to central points. The routing problem in such networks is characterised by: the overarching requirement for low power consumption, as battery-powered sensors may be required to operate for years without battery replacement; an emphasis on reliable rather than real-time communication, since it is more important for packets to arrive reliably than to arrive quickly; and very scarce processing and memory resources, as these sensors are often implemented on small low-power microprocessors. This work provides overviews of routing protocols in ad hoc networks, swarm intelligence, and swarm intelligence applied to ad hoc routing. Various mechanisms commonly encountered in ad hoc routing are experimentally evaluated under conditions as close to real life as possible. Where possible, enhancements to the mechanisms are suggested and evaluated. Finally, a routing protocol suitable for such low-power sensor networks is defined and benchmarked in various scenarios against the Ad hoc On-Demand Distance Vector (AODV) algorithm. / Dissertation (MSc)--University of Pretoria, 2005. / Computer Science / Unrestricted
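The core data structure in ACO-style routing protocols of this kind is a per-node pheromone table indexed by destination and next hop, reinforced by backward ants and decayed by evaporation. The class below is a toy sketch with illustrative names and update rules, not the ACODV protocol itself.

```python
class AntRoutingTable:
    """Per-node pheromone table: tau[destination][next_hop].

    Sketch of ACO-style distance-vector routing state; the deposit and
    evaporation rules here are generic assumptions, not ACODV's exact ones.
    """
    def __init__(self, evaporation=0.1):
        self.tau = {}
        self.evaporation = evaporation

    def backward_ant(self, destination, next_hop, trip_cost):
        """A backward ant reinforces the hop it arrived on; the deposit
        is inversely proportional to the cost of the path it measured."""
        row = self.tau.setdefault(destination, {})
        for hop in row:                       # decay old evidence first
            row[hop] *= (1 - self.evaporation)
        row[next_hop] = row.get(next_hop, 0.0) + 1.0 / trip_cost

    def best_next_hop(self, destination):
        row = self.tau.get(destination, {})
        return max(row, key=row.get) if row else None

table = AntRoutingTable()
table.backward_ant("D", "B", trip_cost=4)   # route via B measured 4 hops
table.backward_ant("D", "C", trip_cost=2)   # route via C measured 2 hops
hop = table.best_next_hop("D")
```

In a low-power setting, forwarding deterministically along the strongest pheromone (as `best_next_hop` does) trades some exploration for fewer wasted transmissions; a probabilistic choice over the row is the more exploratory alternative.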
|
46 |
Statistical methods for imaging data, imaging genetics and sparse estimation in linear mixed models / Opoku, Eugene A., 21 October 2021 (has links)
This thesis presents research focused on developing statistical methods, with emphasis on techniques that can be used for the analysis of data in imaging studies and on sparse estimation for applications in high-dimensional data. The first contribution addresses the pixel/voxel-labeling problem for spatial hidden Markov models in image analysis. We formulate a Gaussian spatial mixture model with a Potts model used as a prior for the mixture allocations of the latent states. Jointly estimating the model parameters, the discrete state variables, and the number of states (number of mixture components) is recognized as a difficult combinatorial optimization problem. To overcome drawbacks associated with local algorithms, we implement and compare iterated conditional modes (ICM), simulated annealing (SA), and hybrid ICM with ant colony system optimization (ACS-ICM) for pixel labelling, parameter estimation, and mixture component estimation.
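The plain ICM baseline among the compared methods can be sketched directly: each pixel greedily takes the label minimising a Gaussian data term plus a Potts penalty on disagreeing neighbours. This is the standard textbook sweep, shown for orientation; the ACS-ICM hybrid adds an ant colony search on top of it.

```python
import numpy as np

def icm_labels(image, means, beta=1.0, sweeps=5):
    """Iterated conditional modes for a Gaussian mixture with a Potts
    prior: each pixel greedily takes the label minimising
    (pixel - mean_k)^2 + beta * (number of disagreeing 4-neighbours)."""
    img = np.asarray(image, dtype=float)
    # initialise from the data term alone (nearest mixture mean)
    labels = np.argmin([(img - m) ** 2 for m in means], axis=0)
    rows, cols = img.shape
    for _ in range(sweeps):
        for r in range(rows):
            for c in range(cols):
                costs = []
                for k, m in enumerate(means):
                    data = (img[r, c] - m) ** 2
                    disagree = sum(
                        labels[nr, nc] != k
                        for nr, nc in ((r-1, c), (r+1, c), (r, c-1), (r, c+1))
                        if 0 <= nr < rows and 0 <= nc < cols)
                    costs.append(data + beta * disagree)
                labels[r, c] = int(np.argmin(costs))
    return labels

# A noisy two-region "image": left half near 0, right half near 1.
noisy = [[0.1, 0.0, 0.9, 1.0],
         [0.0, 0.2, 1.1, 0.9],
         [0.1, 0.6, 0.8, 1.0]]
out = icm_labels(noisy, means=[0.0, 1.0], beta=0.5)
```

ICM converges quickly but only to a local optimum of this energy, which is exactly the drawback the abstract cites as motivation for SA and ACS-ICM.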
In the second contribution, we develop the ACS-ICM algorithm for spatiotemporal modeling of combined MEG/EEG data to compute estimates of neural source activity. We consider a Bayesian finite spatial mixture model with a Potts model as a spatial prior and implement ACS-ICM for simultaneous point estimation and model selection for the number of mixture components. Our approach is evaluated using simulation studies and an application examining the visual response to scrambled faces. In addition, we develop a nonparametric bootstrap for interval estimation to account for uncertainty in the point estimates. In the third contribution, we present sparse estimation strategies in linear mixed models (LMMs) for longitudinal data. We address the problem of estimating the fixed-effects parameters of the LMM when the model is sparse and predictors are correlated. We propose pretest and shrinkage estimation strategies and derive their asymptotic properties. Simulation studies are performed to compare the numerical performance of the Lasso and adaptive Lasso estimators with the pretest and shrinkage ridge estimators. The methodology is evaluated through an application to high-dimensional data examining effective brain connectivity and genetics.
In the fourth and final contribution, we conduct an imaging genetics study to explore how effective brain connectivity in the default mode network (DMN) may be related to genetics within the context of Alzheimer's disease. We develop an analysis of longitudinal resting-state functional magnetic resonance imaging (rs-fMRI) and genetic data obtained from a sample of 111 subjects with a total of 319 rs-fMRI scans from the Alzheimer's Disease Neuroimaging Initiative (ADNI) database. A Dynamic Causal Model (DCM) is fit to the rs-fMRI scans to estimate effective brain connectivity within the DMN, which is then related to a set of single nucleotide polymorphisms (SNPs) contained in an empirical disease-constrained set. We relate longitudinal effective brain connectivity estimated using spectral DCM to SNPs using both linear mixed effects (LME) models and function-on-scalar regression (FSR). / Graduate
|
47 |
Heuristické řešení plánovacích problémů / Heuristic Solving of Planning Problems / Novotná, Kateřina, January 2013 (has links)
This thesis deals with the implementation of metaheuristic algorithms in the Drools Planner, an open source tool for solving optimization problems. The work describes the design and implementation of the Ant Colony Optimization metaheuristic in the Drools Planner. The algorithm is evaluated using the Drools Planner benchmark on different kinds of optimization problems.
|
48 |
Circuit Design Methods with Emerging Nanotechnologies / Zheng, Yexin, 28 December 2009 (has links)
As complementary metal-oxide semiconductor (CMOS) technology faces increasingly severe physical barriers on the path of continuous feature-size scaling, innovative nano-scale devices and other post-CMOS technologies have been developed to enhance future circuit design and computation. These nanotechnologies have shown promising potential to achieve order-of-magnitude improvements in performance and integration density. The substitution of CMOS transistors with nano-devices is expected not only to continue the exponential projection of Moore's Law, but also to raise significant challenges and opportunities, especially in the field of electronic design automation. The major obstacles that designers experience with emerging nanotechnology design include: i) the existing computer-aided design (CAD) approaches developed in the context of conventional CMOS Boolean design cannot be directly employed in the nanoelectronic design process, because the intrinsic electrical characteristics of many nano-devices are not best suited to Boolean implementations but demonstrate strong capability for implementing non-conventional logic such as threshold logic and reversible logic; ii) due to the density and size of nano-devices, the defect rate of nanoelectronic systems is much higher than that of conventional CMOS systems, so existing design paradigms cannot guarantee design quality and may lead to even worse results with high failure ratios. Motivated by the compelling potential and design challenges of emerging post-CMOS technologies, this dissertation focuses on fundamental design methodologies to effectively and efficiently achieve high-quality nanoscale designs.
A novel programmable logic element (PLE) is first proposed to explore the versatile functionalities of threshold gates (TGs) and multi-threshold threshold gates (MTTGs). This PLE structure can realize all three- or four-variable logic functions by configuring binary control bits, and is the first single threshold-logic structure to provide a complete Boolean logic implementation. Based on the PLEs, a reconfigurable architecture is constructed that offers dynamic reconfigurability with little or no reconfiguration overhead, thanks to the intrinsic self-latching property of nanopipelining. Our reconfiguration data generation algorithm further reduces the reconfiguration cost.
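The building block the PLE composes, the threshold gate, is easy to state: the gate fires when the weighted sum of its inputs reaches a threshold. The sketch below shows the generic definition and how common Boolean functions fall out of it; the PLE's actual control-bit configuration scheme is not reproduced here.

```python
def threshold_gate(weights, threshold, inputs):
    """Boolean threshold gate: output 1 iff the weighted input sum
    reaches the threshold (the generic textbook definition)."""
    return int(sum(w * x for w, x in zip(weights, inputs)) >= threshold)

# Common functions realised as single threshold gates:
AND = lambda a, b: threshold_gate([1, 1], 2, [a, b])        # both inputs needed
OR  = lambda a, b: threshold_gate([1, 1], 1, [a, b])        # one input suffices
MAJ = lambda a, b, c: threshold_gate([1, 1, 1], 2, [a, b, c])  # majority-of-three
```

Majority-of-three is the classic illustration of why threshold logic is compact: one gate replaces the several AND/OR gates a Boolean implementation needs, which is the device characteristic the dissertation exploits.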
To fully take advantage of such threshold logic design using emerging nanotechnologies, we also developed a combinational equivalence checking (CEC) framework for threshold logic designs. Based on the features of threshold logic gates and circuits, different techniques for formulating a given threshold logic circuit in conjunctive normal form (CNF) are introduced to facilitate efficient SAT-based verification. Evaluated on mainstream benchmarks, our hybrid algorithm, which takes into account both the input symmetry and the input weight order of threshold gates, efficiently generates CNF formulas in terms of both SAT solving time and CNF generation time.
The reversible logic synthesis problem is then considered, with a focus on efficient synthesis heuristics that can provide high-quality results within a reasonable computation time. We have developed a weighted directed graph model for function representation and complexity measurement, and constructed an atomic transformation to associate function complexity variation with reversible gates. The efficiency of our heuristic lies in maximally decreasing the function complexity during the synthesis steps, as well as in its capability to climb out of local optima. Thereafter, swarm intelligence, a machine learning technique, is employed in the search-space exploration for reversible logic synthesis, achieving further performance improvement.
To tackle the high defect rate of the emerging nanotechnology manufacturing process, we have developed a novel defect-aware logic mapping framework for nanowire-based PLA architectures via Boolean satisfiability (SAT). PLA defects of various types are formulated as covering and closure constraints, and the defect-aware logic mapping is then solved efficiently using available SAT solvers. This approach can generate a valid logic mapping at defect rates as high as 20%. The proposed method is universally suitable for various nanoscale PLAs, including AND/OR and NOR/NOR structures.
In summary, this work provides some initial attempts to address two major problems confronting future nanoelectronic system design: the development of electronic design automation tools and reliability. However, many challenging open questions remain in this emerging and promising area. We hope our work lays stepping stones for nano-scale circuit design optimization that exploits the distinctive characteristics of emerging nanotechnologies. / Ph. D.
|
49 |
SNAP Biclustering / Chan, William Hannibal, 22 January 2010 (has links)
This thesis presents a new ant-optimized biclustering technique known as SNAP biclustering, which runs faster and produces results of superior quality compared to previous techniques. Biclustering techniques have been designed to compensate for the weaknesses of classical clustering algorithms by allowing clusters to overlap and by allowing vectors to be grouped on a subset of their defined features. These techniques have performed well in many problem domains, particularly DNA microarray analysis and collaborative filtering. A motivation for this work was the biclustering technique known as bicACO, the first to use ant colony optimization. As bicACO is time-intensive, much emphasis was placed on decreasing SNAP's runtime. The superior speed and biclustering results of SNAP are due to its improved initialization and solution construction procedures. In experimental studies involving the Yeast Cell Cycle DNA microarray data set and the MovieLens collaborative filtering data set, SNAP ran at least 22 times faster than bicACO while generating superior results. Thus, SNAP is an effective choice for microarray analysis and collaborative filtering applications. / Master of Science
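One widely used measure of bicluster quality, which ant-optimized biclusterers of this kind commonly optimise, is the Cheng-Church mean squared residue: it is zero when every row of the selected submatrix is a constant shift of every other. It is shown here for background only and is not claimed to be SNAP's exact objective.

```python
import numpy as np

def mean_squared_residue(data, rows, cols):
    """Cheng-Church mean squared residue of the bicluster selected by
    (rows, cols): 0 means the submatrix is perfectly 'coherent'
    (each row differs from each other row by a constant)."""
    sub = np.asarray(data, dtype=float)[np.ix_(rows, cols)]
    row_mean = sub.mean(axis=1, keepdims=True)
    col_mean = sub.mean(axis=0, keepdims=True)
    residue = sub - row_mean - col_mean + sub.mean()
    return float((residue ** 2).mean())

data = [[1, 2, 9],
        [2, 3, 0],
        [5, 5, 5]]
# Rows 0-1 on columns 0-1 form a coherent bicluster: row 1 = row 0 + 1.
msr = mean_squared_residue(data, rows=[0, 1], cols=[0, 1])
```

An ant-based search would propose (rows, cols) subsets and let a score such as this one guide pheromone reinforcement.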
|
50 |
Algoritmos híbridos para la resolución del F.L.P. (Facility Layout Problem) basados en colonias de hormigas / Jaén Gómez, Pedro Ildefonso, 07 January 2016 (has links)
[EN] The Facility Layout Problem (FLP) seeks the optimal arrangement of the elements of a production system (referred to in this work as activities: those elements of the production system that require space) and involves, among others, geometric and economic aspects. The economic aspect concerns the installation of the plant and its operation, while the geometric aspect relates to the architecture of the system. From these aspects, different formulations of the problem are derived, depending on the geometric model adopted to represent the solution and on the function to be optimised, which may include quantitative terms such as installation and operating (material handling) costs and qualitative terms derived from the activity relationship chart of the SLP methodology. There is a certain tradition in the Teaching Unit of Industrial Construction and Architecture (currently the Teaching Unit of Industrial Constructions) of solving the FLP from diverse approaches, and since the 1990s the author of this thesis, together with colleagues, has implemented several computer applications for its resolution, based, for example, on genetic algorithms or on fuzzy logic. The most recent is the application based on ACO (Ant Colony Optimization) presented in this work. These applications, often used in other research or for teaching purposes, have provided satisfactory results both in research and in academia.
In the early 2000s, the regulations on fire protection in industrial buildings appeared and became mandatory for the great majority of newly built facilities. The approach followed in real projects was: in a first phase, the elaboration of the layout; in a second phase, the mandatory application of the fire protection regulations to the previously obtained layout, both in the industrial areas and in subsidiary non-industrial uses different from the main one. Any layout that does not satisfy the regulatory criteria in all areas, whether industrial or not, lacks legal validity and is therefore not viable. In a third phase, the obtained solution is provided with the appropriate thermal, hygroscopic, acoustic, and lighting environment. Given this reality, ever more relevant since the entry into force of the Technical Building Code, which promotes performance-based rather than prescriptive design, and given the inadvisability of decoupling the design phases, the criterion of fire compartmentalization has been included in the design as a further measurable objective of the quality of the final solution, and therefore one that can be optimised like any other. Hence, this work proposes a compartmentalization algorithm that works from the information and criteria used by the fire regulations, and also defines a proposed objective function, together with a series of parameters that make it possible to consider how this compartmentalization influences the flow of materials through the different activities. / Jaén Gómez, PI. (2015). Algoritmos híbridos para la resolución del F.L.P. (Facility Layout Problem) basados en colonias de hormigas [Tesis doctoral]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/59447
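The quantitative part of the FLP objective described above, material-handling cost as flow times distance summed over activity pairs, can be written down directly. The function below uses rectilinear distance and hypothetical activity names; the thesis's full objective additionally incorporates qualitative SLP terms and fire-compartmentalization criteria.

```python
def layout_cost(flows, positions):
    """Classic quantitative FLP objective: total material-handling
    effort = sum over activity pairs of flow x rectilinear distance."""
    cost = 0.0
    for (a, b), f in flows.items():
        (xa, ya), (xb, yb) = positions[a], positions[b]
        cost += f * (abs(xa - xb) + abs(ya - yb))
    return cost

# Hypothetical flows between three activities (units moved per period).
flows = {("cutting", "welding"): 10, ("welding", "paint"): 4}
# Two candidate layouts: keeping the heavy cutting->welding flow short wins.
pos_a = {"cutting": (0, 0), "welding": (1, 0), "paint": (5, 0)}
pos_b = {"cutting": (0, 0), "welding": (4, 0), "paint": (5, 0)}
costs = (layout_cost(flows, pos_a), layout_cost(flows, pos_b))
```

An ACO solver for the FLP would have ants construct assignments of activities to positions and deposit pheromone in proportion to how low this cost comes out.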
|