  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
31

Optimalizace tvorby rolí pomocí RBAC modelu / Optimization of Role Creation Using the RBAC Model

KLÍMA, Martin January 2017 (has links)
The aim of this thesis is to develop an algorithm able to optimize roles in an RBAC model. The theoretical part analyzes the RBAC model and surveys the options currently available for role optimization. The practical part deals with the development of an algorithm that optimizes roles according to user-defined criteria. The algorithm is implemented in Java and builds on the Role Process Optimization Model (ROPM). The last part demonstrates, step by step and with an explanation of each step, how the algorithm works on an example data set. The result of the algorithm is a new RBAC model defined by the user's criteria. The thesis also surveys alternative approaches to role optimization, possible future development, and a concept for mapping the RBAC model onto mathematical and data-mining techniques.
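The thesis's Java implementation and its ROPM-based criteria are not reproduced in this listing; as a hedged illustration only, the core role-mining idea, deriving candidate roles from user-permission assignments, can be sketched as follows (all names and data are hypothetical):

```python
# Minimal role-mining sketch (hypothetical; not the thesis's ROPM
# algorithm). Users holding identical permission sets are grouped
# under one candidate role.
def mine_roles(user_perms):
    """user_perms: dict user -> set of permissions.
    Returns (roles, user_roles): candidate roles and a user->role map."""
    roles = {}        # frozenset of permissions -> role id
    user_roles = {}
    for user, perms in user_perms.items():
        key = frozenset(perms)
        if key not in roles:
            roles[key] = f"role_{len(roles)}"
        user_roles[user] = roles[key]
    return {rid: set(p) for p, rid in roles.items()}, user_roles

roles, assignment = mine_roles({
    "alice": {"read", "write"},
    "bob":   {"read", "write"},
    "carol": {"read"},
})
```

Real role-mining algorithms additionally merge overlapping permission sets and weigh user-defined criteria; this sketch shows only the grouping step.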
32

Contribution à la modélisation et au contrôle de trajectoire de Trackers photovoltaïques à haute concentration (HCPV) / Contribution to the modeling and control of high concentrated Photovoltaic tracker (hcpv)

Sahnoun, Mohamed Aymen 18 December 2015 (has links)
Dans une optique de maximisation de la production et de réduction des coûts d’installation, de maintenance et d’entretien des trackers solaires, qui permettent d’orienter les modules photovoltaïques à haute concentration (HCPV), ces travaux de thèse se focalisent sur l’amélioration de la précision et la réduction du coût de la stratégie de génération de la trajectoire du tracker. Dans un premier temps, un simulateur de tracker HCPV est développé offrant une étude de l’influence de la performance du suivi du soleil sur la production des modules HCPV, permettant ainsi une étude et une comparaison des stratégies de génération de trajectoires. Le simulateur est basé sur un modèle comportemental de module HCPV monté sur tracker permettant de prédire la puissance maximale du module HCPV en fonction de l’erreur de position du tracker face au soleil, de l’ensoleillement direct et de la température. Une première stratégie de commande dite de référence a été implémentée sur ce simulateur. C’est une commande hybride qui repose sur un viseur solaire pour corriger l’erreur de poursuite par un calcul astronomique. Ensuite, afin d’améliorer les performances et de réduire les coûts de cette stratégie, une nouvelle approche sans capteur est développée en se basant sur une méthode d’optimisation du gradient de puissance pour la génération de la trajectoire du tracker. Une étude complémentaire est également exposée afin de mettre en évidence des algorithmes de recherche de la puissance maximale (MPPT) pouvant offrir des temps de réponse suffisamment rapides pour ne pas affecter la qualité de l’évaluation du gradient de puissance. 
Dans ce contexte, une commande MPPT P&O améliorée par un réseau de neurones à complexité réduite est proposée, assurant un compromis entre précision, simplicité et rapidité / This work focuses on improving the accuracy and reducing the cost of the tracker trajectory-generation strategy, in order to maximize production and reduce the installation and maintenance costs of a solar tracker orienting high-concentration photovoltaic (HCPV) modules. Initially, we propose a behavioral model of an HCPV module mounted on a dual-axis tracker in order to study the influence of tracking performance on the module's power production. This simulator can then be used to test control strategies and compare their performance. First, a classical control strategy is implemented in the simulator: a hybrid control combining an astronomical calculation to follow the sun path with a sun sensor to correct the tracking error. A sensorless strategy is then proposed to reduce the cost of HCPV tracker control, based on a gradient-optimization algorithm that generates the tracker trajectory and catches the sun path. Tested on the simulator, this strategy matches the accuracy of the classical strategy while being less costly. The last study in this thesis concerns maximum power point tracking (MPPT) algorithms, in response to a practical problem in implementing the gradient algorithm. In this context, we propose an original optimization of the P&O MPPT control with a neural-network algorithm that significantly reduces the computational cost required to train it. This approach, which ensures a good compromise between accuracy and complexity, is fast enough not to affect the quality of the power-gradient evaluation.
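The perturb-and-observe (P&O) MPPT baseline that the thesis improves on can be sketched in a few lines; the power-voltage curve, step size, and iteration count below are illustrative stand-ins, and the neural-network acceleration is not reproduced:

```python
# Plain P&O MPPT sketch on a toy concave power-voltage curve
# (curve and parameters are hypothetical, maximum at v = 30 V).
def pv_power(v):
    return -(v - 30.0) ** 2 + 900.0

def perturb_and_observe(v0, step=0.5, iters=200):
    v, p = v0, pv_power(v0)
    direction = 1.0
    for _ in range(iters):
        v_new = v + direction * step
        p_new = pv_power(v_new)
        if p_new < p:            # power dropped: reverse the perturbation
            direction = -direction
        v, p = v_new, p_new
    return v

v_mpp = perturb_and_observe(20.0)
```

The operating point climbs the curve and then oscillates around the maximum power point within one step; reducing the step (or predicting it with a trained model, as the thesis proposes) tightens that oscillation.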
33

Optimisation de l’aménagement d’un Chantier de Construction en Fonction des Risques Naturels et Technologiques, Utilisation du SIG / Construction Site Layout Optimization, Considering Risk of Natural or Technological Hazard Utilizing GIS

Abune'meh, Mohammed 24 May 2017 (has links)
Les chantiers de construction contiennent plusieurs installations. Celles-ci sont susceptibles d'être endommagées par des incidents liés aux risques naturels ou technologiques comme les incendies, les explosions, les ondes de souffles, etc. Ces incidents peuvent avoir des conséquences néfastes sur l'ensemble du processus de construction, ce qui pourrait entrainer des accidents graves ayant un impact majeur sur la productivité des employés, le temps global du projet, sa qualité et son budget. Par conséquent les gestionnaires et les planificateurs du projet doivent adopter et développer des approches novatrices capables de faire face aux risques naturels potentiels, de minimiser leurs conséquences et de faciliter l'évacuation du site en cas de danger. Une de ces approches consiste à optimiser l'aménagement des chantiers de construction. En général, la réduction des dommages résultants de risques naturels ou technologiques est encore un défi scientifique.
Dans cette thèse, deux modèles (déterministe et probabiliste) sont développés pour minimiser les risques au sein d'un chantier. La méthode adoptée pour le développement de ces deux modèles consiste en :
• La modélisation des éléments du chantier, par exemple : le générateur électrique, les bureaux et les entrepôts de matériaux, sont modélisés en 2D, pour agir en tant que source d'aléa et/ou cible vulnérable potentielle.
• La modélisation de l'aléa : elle montre l'interaction des aléas entre les composants du chantier.
• La modélisation de la vulnérabilité : elle représente la faiblesse potentielle des cibles sur l'aléa généré par chaque source.
• La définition de la fonction d'utilité : elle vise à offrir une disposition optimisée avec un minimum de risque total sur le chantier.
L'algorithme à évolution différentielle est adopté pour exécuter le processus d'optimisation.
D'une part, dans le modèle déterministe, nous utilisons le principe de la syntaxe spatiale pour étudier l'impact des configurations spatiales dans l'évaluation du risque sur le chantier. Par conséquent, comme le processus d'évacuation est pris en compte dans l'estimation du risque, le risque réel est amplifié en utilisant le facteur de pénalité appelé « profondeur moyenne ». L'algorithme de Dijkstra est appliqué sur un modèle déterministe afin de trouver les chemins les plus sûrs (chemins de moindre risque) pour évacuer les sites à partir de chaque position sur le chantier vers les lieux sûrs afin de diminuer les pertes humaines et matérielles.
D'autre part, le modèle probabiliste suppose que le risque est composé de la défaillance individuelle de chaque installation sur le chantier de construction. La simulation numérique est utilisée pour trouver la distribution de probabilités des défaillances pour l'ensemble du site.
Les fonctionnalités d'un SIG (Système d'Information Géographique) ont été utilisées pour présenter les données sous forme de cartes, pour produire des cartes spatiales de risque sur le chantier de construction, pour mettre en œuvre l'algorithme de Dijkstra et pour l'analyse du coût le plus faible.
A titre indicatif, les modèles proposés sont utilisés dans un cas d'étude comprenant plusieurs installations. Dans le modèle déterministe, toutes ces installations agissent comme des sources d'aléa et des cibles vulnérables, en même temps, dans le modèle probabiliste, quelques-unes de ces installations agissent comme des sources d'aléa et toutes comme des cibles vulnérables. Les résultats obtenus montrent que les modèles proposés sont efficaces en raison de leur capacité à générer une disposition optimale du site avec un environnement de travail plus sûr.
En outre, les modèles obtenus sont capables de mettre en évidence les zones les plus risquées d'un chantier, de générer les chemins d'évacuation les moins risqués, ce qui permettra de minimiser les blessures graves et les victimes en cas d'urgence / Construction sites contain several supporting facilities that are required to complete construction activities. These facilities are susceptible to damage from natural or technological hazards such as fire, explosion, and blast waves. Such events can have adverse consequences for the whole construction process, leading to serious accidents with a major impact on worker productivity, project completion time, project quality and project budget. Therefore, project planners must adopt and develop innovative approaches able to face potential hazards, minimize their consequences, and facilitate the evacuation of the site should they occur. One of these approaches is optimizing the construction site layout. In general, generating a construction site layout that minimizes the risk resulting from natural or technological hazards is still a scientific challenge.
In the present research, two models (deterministic and probabilistic) are developed to minimize the risks within a construction site. The common methodology adopted to develop these two models consists of:
• Modeling the construction site components (for instance, the electric generator, offices and material storage) in a 2D layout, to act as a hazardous source, a potential target, or both at the same time.
• Modeling the hazard: it shows the hazard interaction among site components and the attenuation of the hazard.
• Modeling the vulnerability: it represents the potential weakness of all targets to the hazard generated by each source.
• Defining the utility function: it aims to afford an optimized site layout with minimum total risk on the construction site.
The differential evolution algorithm is adopted to run the optimization process. In the deterministic model, we use the space syntax principle to capture the impact of spatial configuration when evaluating risk on the construction site. Since the evacuation process is considered in estimating the risk, the actual risk is amplified by a penalty factor called mean depth. Furthermore, Dijkstra's algorithm is run on the deterministic model to find the safest paths (least-risk paths) from any position on the construction site towards safe places, in order to diminish losses and fatalities. The probabilistic model, on the other hand, assumes that the site risk combines the individual failures of each facility within the construction site; numerical simulation is performed to find the probability distribution of failure for the whole site.
Geographic information system (GIS) capabilities were exploited in this research to present data in map format, generate the spatial risk map of the construction site, and implement Dijkstra's algorithm and least-cost path analysis.
For illustration purposes, the proposed models are employed in a case study consisting of several facilities. In the deterministic model, all of these facilities act as hazardous sources and potential targets at the same time, while in the probabilistic model only three of them act as fire hazard sources and all are potential targets. The results revealed that the proposed models are efficient, being able to generate an optimal site layout with a safer work environment. In addition, the models are capable of highlighting the riskiest areas within a construction site and of generating paths through the least-risk zones, which will minimize serious injuries and victims in case of emergency.
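The least-risk evacuation idea maps directly onto Dijkstra's algorithm over a risk field. A minimal sketch, assuming a toy grid of illustrative risk values rather than the thesis's GIS model:

```python
import heapq

# Dijkstra on a small risk grid (values illustrative): the cost of a
# path is the summed risk of the cells it crosses, mirroring the
# "least-risk evacuation path" idea described above.
def least_risk(grid, start, goal):
    rows, cols = len(grid), len(grid[0])
    dist = {start: grid[start[0]][start[1]]}
    pq = [(dist[start], start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            return d
        if d > dist.get((r, c), float("inf")):
            continue                      # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + grid[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(pq, (nd, (nr, nc)))
    return float("inf")

risk = [[1, 9, 1],
        [1, 9, 1],
        [1, 1, 1]]
total = least_risk(risk, (0, 0), (0, 2))
```

Here the cheapest route detours around the high-risk column (total risk 7) rather than crossing it (total risk 11), the same behavior the thesis seeks for evacuation paths.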
34

Optimalizační algoritmus pro příhradové ocelové konstrukce / Optimization Algorithm for the Truss Steel Structures

Zeizinger, Lukáš January 2021 (has links)
The work deals with the optimization of truss structures for construction machinery and transport machinery. The goal was to create an algorithm able to produce an optimized design. The simulation covered two experiments involving 52 sets of different inputs, which are processed in detail into graphs. A one-dimensional objective function of mass or price is used in the optimization, and a multidimensional objective function is also incorporated. The strength of the truss structure is calculated with a finite element method for beam systems, and a genetic algorithm is used for the optimization. At the end of the work, specific steps are formulated that lead to the most appropriate algorithm settings.
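As a rough sketch of the optimization loop described above, here is a generic genetic algorithm minimizing a stand-in "mass" objective; the thesis's finite-element beam solver and constraint handling are not reproduced, and all parameters are illustrative:

```python
import random

# Generic GA skeleton: truncation selection, arithmetic crossover,
# Gaussian mutation. The quadratic "mass" (minimum at x = 2.0) is a
# hypothetical stand-in for a finite-element mass/price evaluation.
def mass(x):
    return (x - 2.0) ** 2 + 1.0

def genetic_min(pop_size=30, gens=60, seed=1):
    rng = random.Random(seed)
    pop = [rng.uniform(-10, 10) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=mass)                       # best individuals first
        parents = pop[: pop_size // 2]           # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            child = 0.5 * (a + b)                # arithmetic crossover
            child += rng.gauss(0.0, 0.1)         # Gaussian mutation
            children.append(child)
        pop = parents + children                 # elitist replacement
    return min(pop, key=mass)

best = genetic_min()
```

In the thesis the fitness evaluation is the expensive part (a beam FE solve per candidate), which is why the "most appropriate algorithm settings" the work derives matter.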
35

Multi-Objective Optimization of Plug-in HEV Powertrain Using Modified Particle Swarm Optimization

Omkar Mahesh Parkar (10725597) 10 May 2021 (has links)
Increasing awareness of environmental conservation is leading the automotive industry to adopt alternatively fueled vehicles. Electric, fuel-cell, and hybrid-electric vehicles are the focus of this research area, with efficient utilization of the vehicle powertrain as the first step. Energy and power management system control strategies play a vital role in improving the efficiency of any hybrid propulsion system. However, these control strategies are sensitive to the dynamics of the powertrain components used in the given system. A kinematic mathematical model of a Plug-in Hybrid Electric Vehicle (PHEV) has been developed in this study and is further optimized by determining the optimal power-management strategy for minimal fuel consumption and NOx emissions over a set drive cycle. A multi-objective optimization using a weighted-sum formulation is needed to observe the trade-off between the optimized objectives. A Particle Swarm Optimization (PSO) algorithm is used in this research to determine the trade-off curve between fuel and NOx. In these optimizations, the control signal, consisting of engine speed and a reference battery SOC trajectory for a 2-hour cycle, is the controllable decision parameter supplied directly by the optimizer. Each element of the control signal was split into 50 distinct points covering the full 2 hours (slightly less than 2.5 minutes per point), with values interpolated between points at each time step. With the control signal consisting of two distinct 50-element time-variant signals, speed and SOC trajectory, a multidimensional problem was formulated for the optimizer. Novel approaches to balancing optimizer exploration and convergence, as well as seeding techniques, are suggested to solve the optimal control problem. 
The optimization was run at 5 different weight levels, and the resulting cost populations were compiled to develop a Pareto front. The results of the simulations and optimization are presented, covering the performance of the individual components of the PHEV powertrain as well as the optimized PMS strategy to follow for the given drive cycle. Observations on the trade-off are discussed for the multi-objective optimizations.
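The weighted-sum formulation can be illustrated with two toy convex objectives standing in for fuel and NOx; sweeping the weight traces an approximate Pareto front (a grid search replaces PSO and the real engine-speed/SOC trajectories to keep the sketch self-contained):

```python
# Weighted-sum scalarization sketch. Both objectives are hypothetical
# stand-ins: f_fuel is minimized at x = 1, f_nox at x = -1, so no
# single x minimizes both and a trade-off front appears.
def f_fuel(x):
    return (x - 1.0) ** 2

def f_nox(x):
    return (x + 1.0) ** 2

def scalarized_min(w, grid=2001):
    # minimize w * fuel + (1 - w) * nox over a fixed grid of x values
    xs = [-2.0 + 4.0 * i / (grid - 1) for i in range(grid)]
    return min(xs, key=lambda x: w * f_fuel(x) + (1.0 - w) * f_nox(x))

front = [(f_fuel(x), f_nox(x))
         for x in (scalarized_min(w) for w in (0.1, 0.3, 0.5, 0.7, 0.9))]
```

As the weight on fuel rises, the fuel cost of each solution falls while its NOx cost rises, which is exactly the monotone trade-off a Pareto front visualizes.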
36

Development of ABAQUS-MATLAB Interface for Design Optimization using Hybrid Cellular Automata and Comparison with Bidirectional Evolutionary Structural Optimization

Antony, Alen 12 1900 (has links)
Indiana University-Purdue University Indianapolis (IUPUI) / Topology optimization is an optimization technique used to synthesize models without any preconceived shape. These structures are synthesized with minimum-compliance problems in mind. With rapid improvements in advanced manufacturing technology and the increased need for lightweight, high-strength designs, topology optimization is being used more than ever. A number of commercially available software packages can be used for optimizing a product. They have robust finite element solvers and can produce good results, but they offer the user little to no choice of optimization method. It is possible to use a programming language like MATLAB to implement a specific optimization method, but the user is then also responsible for writing the FEA routines. This yields flexibility over the optimization method but loses the robust FEA of a commercial tool. Past work has linked ABAQUS with MATLAB, but primarily as a tool for finite element post-processing. The aim of this thesis is to develop an interface that can be used to solve optimization problems with different methods, such as hard-kill approaches as well as material penalization (SIMP), thereby harnessing the potential of commercial FEA software while giving the user the flexibility to write or modify code for the optimization method of his or her choice. The interface could also potentially unlock the capabilities of other Dassault Systèmes software, as the firm is integrating its products more tightly through the 3DExperience platform. This thesis uses the interface to implement BESO- and HCA-based topology optimization. Since hybrid cellular automata is the only method other than the equivalent static load method that can be used for crashworthiness optimization, this work is well suited to that role when extended into the non-linear region.
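For context on the SIMP (material penalization) method the interface supports, the standard density-to-stiffness interpolation can be sketched in one function; p = 3 and a small E_min are conventional choices in the SIMP literature, not values taken from the thesis:

```python
# SIMP material interpolation sketch: each element's Young's modulus is
# scaled by density**p, so intermediate densities contribute little
# stiffness per unit material and the optimizer is pushed toward
# crisp 0/1 designs. E_min keeps the stiffness matrix non-singular.
def simp_modulus(rho, e0=1.0, e_min=1e-9, p=3):
    return e_min + rho ** p * (e0 - e_min)

mid = simp_modulus(0.5)   # well below the linear value of 0.5
```

Hard-kill methods like BESO instead toggle elements fully on or off each iteration; SIMP's continuous penalization is what makes gradient-based updates possible.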
37

COMPARING PSO-BASED CLUSTERING OVER CONTEXTUAL VECTOR EMBEDDINGS TO MODERN TOPIC MODELING

Samuel Jacob Miles (12462660) 26 April 2022 (has links)
Efficient topic modeling is needed to support applications that aim at identifying main themes from a collection of documents. In this thesis, a reduced vector embedding representation and particle swarm optimization (PSO) are combined to develop a topic modeling strategy that is able to identify representative themes from a large collection of documents. Documents are encoded using a reduced, contextual vector embedding from a general-purpose pre-trained language model (sBERT). A modified PSO algorithm (pPSO) that tracks particle fitness on a dimension-by-dimension basis is then applied to these embeddings to create clusters of related documents. The proposed methodology is demonstrated on three datasets across different domains. The first dataset consists of posts from the online health forum r/Cancer. The second dataset is a collection of NY Times abstracts and is used to compare the proposed model to LDA. The third is a standard benchmark dataset for topic modeling which consists of a collection of messages posted to 20 different news groups. It is used to compare state-of-the-art generative document models (i.e., ETM and NVDM) to pPSO. The results show that pPSO is able to produce interpretable clusters. Moreover, pPSO is able to capture both common topics as well as emergent topics. The topic coherence of pPSO is comparable to that of ETM and its topic diversity is comparable to NVDM. The assignment parity of pPSO on a document completion task exceeded 90% for the 20News-Groups dataset. This rate drops to approximately 30% when pPSO is applied to the same Skip-Gram embedding derived from a limited, corpus-specific vocabulary which is used by ETM and NVDM.
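The clustering step that a PSO particle's fitness evaluation repeats, assigning each embedded document to its nearest candidate centroid, can be sketched as follows; the tiny 2-D vectors are stand-ins for sBERT embeddings, and pPSO's dimension-wise fitness tracking is not reproduced:

```python
# Nearest-centroid assignment sketch: a PSO particle encodes a set of
# candidate centroids; scoring it requires assigning every document
# embedding to its closest centroid (squared Euclidean distance here).
def assign(docs, centroids):
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return [min(range(len(centroids)), key=lambda k: dist2(d, centroids[k]))
            for d in docs]

labels = assign(
    docs=[(0.1, 0.0), (0.2, 0.1), (0.9, 1.0), (1.0, 0.9)],
    centroids=[(0.0, 0.0), (1.0, 1.0)],
)
```

A swarm then perturbs the centroid positions to minimize the total within-cluster distance; pPSO's contribution is tracking which individual dimensions of a particle improved fitness.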
38

Green Communication in IoT Networks Using a Hybrid Optimization Algorithm

Maddikunta, Praveen Kumar Reddy, Gadekallu, Thippa Reddy, Kaluri, Rajesh, Srivastava, Gautam, Parizi, Reza M., Khan, Mohammad S. 01 June 2020 (has links)
There has been a huge surge in Internet of Things (IoT) applications in recent years. The sensor nodes in an IoT network generate data continuously, which directly affects the longevity of the network. Even though the potential of IoT applications is immense, numerous challenges such as security, privacy, load balancing, storage, device heterogeneity, and energy optimization have to be addressed. Of these, the energy utilization of the network is especially important and has to be optimized. Several factors, such as residual energy, temperature, the load on the Cluster Head (CH), the number of alive nodes, and the cost function, affect the energy consumption of sensor nodes. In this paper, a hybrid Whale Optimization Algorithm-Moth Flame Optimization (WOA-MFO) is designed to select the optimal CH, which in turn optimizes the aforementioned factors. The performance of the proposed work is then evaluated against existing algorithms with respect to the energy-specific factors. The results obtained show that the proposed method outperforms existing approaches.
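The cluster-head selection objective can be illustrated with a toy fitness over residual energy and load; the weights and node fields below are hypothetical, and the paper's WOA-MFO search over candidate heads is not reproduced:

```python
# Toy CH-selection sketch: score each candidate node by a weighted
# combination of residual energy (higher is better) and current load
# (lower is better), then pick the best. Weights are illustrative.
def ch_fitness(node, w_energy=0.7, w_load=0.3):
    return w_energy * node["energy"] - w_load * node["load"]

def select_cluster_head(nodes):
    return max(nodes, key=ch_fitness)["id"]

best_ch = select_cluster_head([
    {"id": "n1", "energy": 0.9, "load": 0.8},
    {"id": "n2", "energy": 0.8, "load": 0.1},
    {"id": "n3", "energy": 0.4, "load": 0.2},
])
```

In the paper the fitness also folds in temperature, alive-node count, and a cost function, and the hybrid metaheuristic searches over assignments rather than scoring nodes independently.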
39

Back analysis of rock mass properties in the regional fault zone under Lake Mälaren

Liu, Jiaqi January 2022 (has links)
The properties of the surrounding rock mass in underground projects have a significant impact on design and construction. However, it is quite challenging to evaluate rock mass properties due to the great uncertainties in the geological conditions. Besides, even though field-testing techniques are well developed, the high expense of tests and the scatter in their results make it difficult to cover a large domain in a complex project. In recent years, thanks to the maturity of numerical analysis and the widespread use of tunnel deformation measurements, displacement-based back analysis has become a popular and effective indirect method for estimating rock mass properties.
The main purpose of this thesis was to perform a displacement-based back analysis of the in-situ stress ratio and Young's modulus for the exploratory tunnel BP201, which constitutes the passage under Lake Mälaren in the Stockholm Bypass project. The back analysis was carried out using the Pattern search method and the Simplex method. The error function was built according to the least-squares method, and the commercial finite element software Plaxis 2D was used to calculate the theoretical deformations. Moreover, a sensitivity analysis was performed to study the influence of the starting point and of other numerical model parameters on the results of the back analysis.
The two optimization algorithms used in this study provided estimates of the in-situ stress ratio and Young's modulus whose computed deformations closely match the measurements. For the specific problem analysed in this thesis, the Simplex method was found to be more suitable than the Pattern search method. It was also concluded that a better choice of starting point can improve the precision and efficiency of the back analysis. / Bergmassans egenskaper har en stor påverkan vid dimensioneringen och utformningen av förstärkningen i tunnlar och vid indelningen av uttagssekvenser vid tunneldrivning. Det är emellertid svårt att utvärdera bergmassans egenskaper till följd av stora osäkerheter i de geologiska förhållandena. Även om tekniker för provning i fält har utvecklats är de relativt kostsamma och det är svårt att täcka in hela den geologiska domänen i stora projekt. På senare år har utvecklingen av numeriska metoder och deformationsmätningar i tunnlar möjliggjort bestämning av bergmassans egenskaper genom deformationsbaserad bakåtanalys.
Syftet med detta examenarbete var att utföra en deformationsbaserad bakåtanalys av in-situ spänningstillståndet och bergmassans elasticitetsmodul i bypass tunneln BP 201 under passagen av sjön Mälaren i Stockholm, vilket är en del av projektet Förbifart Stockholm. I bakåtanalysen användes metoderna ”the Pattern Search Method” och ”the Simplex Method”. Minsta kvadratmetoden användes som felfunktion. Tvådimensionella numeriska beräkningar av deformationerna i tunneln utfördes med det finita elementprogrammet Plaxis 2D. En känslighetsanalys gjordes för att studera inverkan på resultatet vid val av startpunkt i bakåtanalysen och hur osäkerheter i övriga parametrar påverkade resultatet.
De två optimeringsalgoritmerna som användes i denna studie resulterade i in-situ spänningar och en elasticitetsmodul som genererade beräknade deformationer nära de uppmätta. För det studerade problemet i detta arbete var ”the Simplex Method” lämpligare att använda än ”the Pattern Search Method”. Genom att välja ett lämpligare startvärde kunde också precisionen och effektiviteten i bakåtanalysen förbättras.
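The pattern-search side of such a back analysis can be sketched with a toy forward model in place of Plaxis 2D; the two fitted parameters stand in for the in-situ stress ratio and Young's modulus (rescaled to order 1), and all response coefficients are hypothetical:

```python
# Pattern-search back analysis sketch: fit (k, e) so that the model's
# displacements reproduce "measured" ones in the least-squares sense.
# The linear toy forward model replaces the finite element computation.
def forward(k, e):
    # hypothetical displacements at two gauge points
    return (2.0 * k + 0.5 / e, 1.0 * k + 1.5 / e)

measured = forward(1.5, 2.0)      # synthetic measurements, truth (1.5, 2.0)

def sse(params):
    u = forward(*params)
    return sum((ui - mi) ** 2 for ui, mi in zip(u, measured))

def pattern_search(x, step=0.5, tol=1e-6):
    while step > tol:
        improved = False
        for i in range(len(x)):           # probe each coordinate
            for s in (+step, -step):
                trial = list(x)
                trial[i] += s
                if sse(trial) < sse(x):
                    x, improved = trial, True
        if not improved:                  # no move helped: shrink step
            step *= 0.5
    return x

k_fit, e_fit = pattern_search([1.0, 1.0])
```

Starting-point sensitivity, one of the thesis's findings, shows up even here: a poor initial guess costs extra step-shrinking cycles before the search homes in.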
40

A Comparative Study on Optimization Algorithms and its efficiency

Ahmed Sheik, Kareem January 2022 (has links)
Background: In computer science, optimization can be defined as finding the most cost-effective or best achievable performance under given circumstances, maximizing desired factors and minimizing undesirable ones. Many problems in the real world are continuous, and it is not easy to find global solutions; however, developments in computer technology keep increasing the speed of computation [1]. For any optimization issue, the optimization method, an efficient numerical simulator, and a realistic depiction of the physical process to be described and optimized are interconnected components of the optimization process [2].
Objectives: A literature review of existing optimization algorithms is performed. Ten different benchmark functions are implemented on the chosen algorithms, PSO (Particle Swarm Optimization), GA (Genetic Algorithm), ACO (Ant Colony Optimization), and the Plant Intelligence Behaviour Optimization (PIBO) algorithm, to measure the efficiency of these approaches on metrics such as CPU time, optimality, accuracy, and mean best standard deviation.
Methods: This research uses a mixed-method approach. A literature review covers the existing optimization algorithms, while an experiment runs the ten benchmark functions on PSO, ACO, GA, and PIBO and measures their efficiency on the four metrics above, showing which optimization algorithms perform better.
Results: The experimental findings are presented in this section. The metrics obtained by running the standard functions on the suggested method and the other methods are tabulated, and graphs are produced from the data.
Analysis and Discussion: The research questions are addressed based on the results of the experiment.
Conclusion: We conclude by analyzing the performance of the existing optimization methods. PIBO performs much better on the optimality metrics, best mean, standard deviation, and accuracy, but it has a significant drawback in CPU time: its runtime is much higher than the PSO algorithm's, close to GA's, and still much better than the ACO algorithm's.
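Two standard benchmark functions used in such comparisons can be defined in a few lines, together with a timed random-search baseline illustrating the CPU-time and accuracy metrics (the abstract does not list the thesis's specific ten functions, so these are representative examples only):

```python
import math
import random
import time

# Two classic benchmarks: both have their global minimum of 0 at the
# origin. Sphere is unimodal; Rastrigin is highly multimodal, which is
# what separates local-search-prone optimizers from global ones.
def sphere(x):
    return sum(xi * xi for xi in x)

def rastrigin(x):
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi)
                             for xi in x)

def random_search(f, dim=2, evals=20000, seed=0):
    # naive baseline: sample uniformly in the usual [-5.12, 5.12] box,
    # timing the run the way a CPU-time metric would
    rng = random.Random(seed)
    t0 = time.perf_counter()
    best = min(f([rng.uniform(-5.12, 5.12) for _ in range(dim)])
               for _ in range(evals))
    return best, time.perf_counter() - t0

best_sphere, elapsed = random_search(sphere)
```

A study like the one above would replace `random_search` with PSO, GA, ACO, and PIBO, run each many times, and tabulate best value, mean, standard deviation, and elapsed time per function.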
