  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1131

Optimisation of a Hadoop cluster based on SDN in cloud computing for big data applications

Khaleel, Ali January 2018 (has links)
Big data has received a great deal of attention from many sectors, including academia, industry and government. The Hadoop framework has emerged to support its storage and analysis using the MapReduce programming model. However, this framework is a complex system with more than 150 parameters, some of which can exert a considerable effect on the performance of a Hadoop job. Tuning the Hadoop parameters optimally is a difficult and time-consuming task. In this thesis, an optimisation approach is presented to improve the performance of the Hadoop framework by setting the values of the Hadoop parameters automatically. Specifically, genetic programming is used to construct a fitness function that represents the interrelations among the Hadoop parameters. Then, a genetic algorithm is employed to search for the optimum or near-optimum values of the Hadoop parameters. A Hadoop cluster is configured on two servers at Brunel University London to evaluate the performance of the proposed optimisation approach. The experimental results show that the performance of a Hadoop MapReduce job for 20 GB of data on the WordCount application is improved by 69.63% and 30.31% when compared to the default settings and the state of the art, respectively, whilst on the TeraSort application it is improved by 73.39% and 55.93%. For further optimisation, SDN is also employed to improve the performance of a Hadoop job. The experimental results show that the performance of a Hadoop job in an SDN network for 50 GB of data is improved by 32.8% when compared to a traditional network, whilst on the TeraSort application the improvement for 50 GB is on average 38.7%. An effective computing platform is also presented in this thesis to support solar irradiation data analytics. It is built on RHIPE to provide fast analysis and calculation for solar irradiation datasets. The performance of RHIPE is compared with the R language in terms of accuracy, scalability and speedup. The speedup of RHIPE is evaluated using Gustafson's law, which is revised to enhance the performance of parallel computation on intensive irradiation datasets in a cluster computing environment such as Hadoop. The performance of the proposed work is evaluated using a Hadoop cluster based on the Microsoft Azure cloud, and the experimental results show that RHIPE provides considerable improvements over the R language. Finally, an effective routing algorithm based on SDN to improve the performance of a Hadoop job in a large-scale cluster in a data centre network is presented. The proposed algorithm is used to improve the performance of a Hadoop job during the shuffle phase by allocating efficient paths for each shuffling flow, according to the network resource demand of each flow as well as their size and number. Furthermore, it is also employed to allocate alternative paths for each shuffling flow in the case of link failure. This algorithm is evaluated on two network topologies, namely fat-tree and leaf-spine, built with the EstiNet emulator. The experimental results show that the proposed approach improves the performance of a Hadoop job in a data centre network.
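As a rough illustration of the parameter-tuning idea described above, the following Python sketch evolves a small population of Hadoop configurations with a genetic algorithm. The parameter names, value ranges and the surrogate fitness function are illustrative assumptions only; the thesis builds its fitness function with genetic programming from measured job performance rather than the synthetic stand-in used here.

```python
import random

# Hypothetical Hadoop parameters and value ranges (illustrative only).
PARAM_RANGES = {
    "mapreduce.task.io.sort.mb": (100, 2000),
    "mapreduce.task.io.sort.factor": (10, 100),
    "mapreduce.job.reduces": (1, 64),
}

def random_individual():
    return {k: random.randint(lo, hi) for k, (lo, hi) in PARAM_RANGES.items()}

def fitness(ind):
    # Placeholder surrogate for measured job runtime (lower is better).
    # In the thesis a genetic-programming model of the parameter
    # interrelations would play this role.
    return sum(abs(v - (lo + hi) / 2)
               for v, (lo, hi) in zip(ind.values(), PARAM_RANGES.values()))

def crossover(a, b):
    # Uniform crossover: each parameter inherited from either parent.
    return {k: random.choice((a[k], b[k])) for k in PARAM_RANGES}

def mutate(ind, rate=0.2):
    for k, (lo, hi) in PARAM_RANGES.items():
        if random.random() < rate:
            ind[k] = random.randint(lo, hi)
    return ind

def evolve(pop_size=20, generations=50):
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)                    # best (lowest) first
        parents = pop[: pop_size // 2]           # truncation selection
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return min(pop, key=fitness)

print(evolve())
```

In practice the fitness evaluation would be replaced by the learned performance model or by actual job runtimes measured on the cluster.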
1132

Planejamento de caminhos tridimensionais para veículos aéreos não tripulados

Carvalho, Elda Nunes de 28 February 2014 (has links)
Unmanned aerial vehicles (UAVs) have been the target of research for many applications involving conditions hazardous to the integrity of a crew, and a great effort has been made to increase the autonomy of UAVs in performing such missions. In the military domain these studies have shown good results; in the civil area, by contrast, research remains a challenge, owing to the lack of investment in development and the absence of specific regulations. In some situations the absence of a pilot on board makes mission execution a non-trivial task. An example is missions over densely forested environments, a characteristic feature of the Amazon region. This scenario is critical both in autonomous navigation mode and in remotely operated mode, owing to the difficulty of navigating such environments. The use of planners makes it possible to obtain paths or trajectories that satisfy navigation performance criteria while meeting the requirements of the mission. In general, a set of waypoints is defined in the vehicle's navigation space and the planner finds a path that passes through all of them, taking into account the vehicle's constraints on motion, time and/or energy consumption. This work addresses the construction of a path planner for fixed-wing UAV navigation in three-dimensional space over sparse forest regions with flight restrictions due to weather and environmental conditions. Unlike traditional approaches, the methodology developed here selects specific points in regions of three-dimensional space that favour the flight-safety and manoeuvrability criteria of the aircraft while also meeting the requirements of the mission, and builds the path by interpolating these points together with the waypoints supplied by the mission. Experimental tests by simulation in different scenarios show the applicability of the path planner to real UAVs in meeting safety requirements and improving mission performance.
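The core of such a planner is turning a sparse set of 3-D points into a flyable path. The sketch below shows the simplest form of that step, chord-length parameterised interpolation through hypothetical waypoints; the thesis's planner additionally selects the points to satisfy safety and manoeuvrability constraints, which is not modelled here.

```python
import numpy as np

# Hypothetical mission waypoints (x, y, altitude) in metres; illustrative only.
waypoints = np.array([
    [0.0,     0.0,   100.0],
    [400.0,   150.0, 120.0],
    [800.0,   500.0, 140.0],
    [1200.0,  900.0, 110.0],
])

def interpolate_path(points, samples_per_leg=20):
    """Densify a 3-D waypoint list by chord-length parameterised linear
    interpolation (a much richer interpolation, respecting aircraft
    constraints, would be used by an actual planner)."""
    deltas = np.diff(points, axis=0)
    seg_len = np.linalg.norm(deltas, axis=1)
    t = np.concatenate(([0.0], np.cumsum(seg_len)))      # cumulative chord length
    t_dense = np.linspace(0.0, t[-1], samples_per_leg * (len(points) - 1))
    return np.column_stack([np.interp(t_dense, t, points[:, d]) for d in range(3)])

path = interpolate_path(waypoints)
print(path.shape, path[0], path[-1])
```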
1133

Ambiente para avaliação de algoritmos de processamento de imagens médicas. / Environment for medical image processing algorithms assessment.

Marcelo dos Santos 20 December 2006 (has links)
A variety of new image processing methods is constantly being presented to the community, yet few of them have proved useful in clinical routine. Analysing and comparing different algorithms, methods and applications through sound testing is an essential part of qualifying an algorithm's design. However, it is usually very difficult to compare the performance and adequacy of different algorithms in the same way, mainly because of the difficulty of assessing software exhaustively, or at least of testing it on a comprehensive and diverse set of clinical cases. Several areas, such as software development, image processing and medical training, need a diverse and comprehensive dataset of images and related information. Such datasets can be used to develop, test and evaluate new medical software using public data. This work presents the development of a free, online, multipurpose and multimodality medical image database environment. The environment, implemented as a distributed medical image database, stores medical images with acquisition information, reports, image processing algorithms, gold standards and post-processed images. It also implements a peer-review model that assures the quality of all datasets. As an example of its feasibility and ease of use, evaluations of two categories of medical image processing methods are presented: segmentation and compression. In addition, the use of the environment in other activities, such as the HC-FMUSP digital teaching file project, demonstrates the robustness of the proposed architecture and its applicability to different purposes.
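One way such an environment supports algorithm comparison is by scoring results against stored gold standards. The snippet below computes the Dice coefficient, a common segmentation-quality measure, on toy binary masks; the actual metrics and data formats used by the environment are not specified in the abstract, so this is only an assumed example.

```python
import numpy as np

def dice_coefficient(segmentation, gold_standard):
    """Dice similarity between a binary segmentation and a gold standard,
    a common figure of merit when comparing segmentation algorithms."""
    seg = segmentation.astype(bool)
    gs = gold_standard.astype(bool)
    intersection = np.logical_and(seg, gs).sum()
    total = seg.sum() + gs.sum()
    return 2.0 * intersection / total if total else 1.0

# Tiny illustrative masks (real use would load images from the database).
seg = np.array([[0, 1, 1], [0, 1, 0], [0, 0, 0]])
gs  = np.array([[0, 1, 1], [0, 0, 0], [0, 0, 0]])
print(round(dice_coefficient(seg, gs), 3))  # 0.8
```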
1134

Fuzzy Ants as a Clustering Concept

Kanade, Parag M 17 June 2004 (has links)
We present two Swarm Intelligence based approaches for data clustering. The first algorithm, Fuzzy Ants, presented in this thesis clusters data without initial knowledge of the number of clusters. It is a two-stage algorithm. In the first stage the ants cluster the data to create initial raw clusters, which are refined using the Fuzzy C-Means (FCM) algorithm. Initially, the ants move the individual objects to form heaps. The centroids of these heaps are redefined by the Fuzzy C-Means algorithm. In the second stage the objects obtained from the Fuzzy C-Means algorithm are hardened according to the maximum-membership criterion to form new heaps. These new heaps are then moved by the ants. The final clusters formed are refined using the Fuzzy C-Means algorithm. Results from experiments with 13 datasets show that the partitions produced are competitive with those from FCM. The second algorithm, fuzzy ant clustering with centroids, is also a two-stage algorithm; it requires initial knowledge of the number of clusters in the data. In the first stage of the algorithm the ants move the cluster centers in feature space. The cluster centers found by the ants are evaluated using a reformulated Fuzzy C-Means criterion. In the second stage the best cluster centers found are used as the initial cluster centers for the Fuzzy C-Means algorithm. Results on 18 datasets show that the partitions found by FCM using the ant initialization are better than those from randomly initialized FCM. Hard C-Means (HCM) was also used in the second stage, and the partitions from the ant algorithm are better than those from randomly initialized Hard C-Means. The Fuzzy Ants algorithm is a novel method for finding the number of clusters in the data and also provides good initializations for the FCM and HCM algorithms. We performed sensitivity analysis on the controlling parameters and found the Fuzzy Ants algorithm to be very sensitive to the Tcreateforheap parameter. The FCM and HCM algorithms with random initializations can get stuck in a bad extremum; the fuzzy ant clustering with centroids algorithm successfully avoids these bad extrema.
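For reference, a plain Fuzzy C-Means loop of the kind used as the refinement stage is sketched below. The random initialisation, parameter values and toy data are placeholders; in the algorithms above the ant stage would supply the initial heaps or centroids instead of a random start.

```python
import numpy as np

def fuzzy_c_means(data, c, m=2.0, iters=100, seed=0):
    """Plain Fuzzy C-Means: alternate centroid and membership updates.
    Here it starts from a random initialisation; an ant-based stage
    could supply the starting point instead."""
    rng = np.random.default_rng(seed)
    n = data.shape[0]
    u = rng.random((n, c))
    u /= u.sum(axis=1, keepdims=True)             # fuzzy memberships, rows sum to 1
    for _ in range(iters):
        um = u ** m
        centroids = (um.T @ data) / um.sum(axis=0)[:, None]
        dist = np.linalg.norm(data[:, None, :] - centroids[None, :, :], axis=2)
        dist = np.fmax(dist, 1e-12)                # avoid division by zero
        inv = dist ** (-2.0 / (m - 1.0))
        u = inv / inv.sum(axis=1, keepdims=True)   # standard FCM membership update
    return centroids, u

# Two obvious blobs as a toy example.
pts = np.vstack([np.random.default_rng(1).normal(0, 0.2, (20, 2)),
                 np.random.default_rng(2).normal(3, 0.2, (20, 2))])
centroids, u = fuzzy_c_means(pts, c=2)
print(np.round(centroids, 2))
```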
1135

Analyses et preuves formelles d'algorithmes distribués probabilistes / Analyses and Formal Proofs of Randomised Distributed Algorithms

Fontaine, Allyx 16 June 2014 (has links)
Probabilistic algorithms are simple to formulate. However, their analysis can become very complex, especially in the field of distributed computing. We present algorithms, optimal in terms of bit complexity and solving the problems of MIS and maximal matching in rings, that follow the same scheme. We develop a method that unifies the bit-complexity lower-bound results for the MIS, maximal matching and colouring problems. The complexity of these analyses, which can easily lead to errors, together with the existence of many models depending on implicit assumptions, motivated us to formally model the probabilistic distributed algorithms corresponding to our model (message passing, anonymous and synchronous). Our aim is to formally prove the properties related to their analysis. For this purpose, we develop a library, called RDA, based on the Coq proof assistant.
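As a concrete picture of the model class being analysed (synchronous, anonymous, message passing), the following toy simulation runs a Luby-style randomised MIS computation on a ring. It is only an illustrative stand-in, not the optimal-bit-complexity algorithm studied in the thesis nor the Coq formalisation itself.

```python
import random

def luby_mis_ring(n, seed=0):
    """Synchronous Luby-style randomised MIS on a ring of n nodes:
    each round, every active node draws a random value, joins the MIS
    if its value beats both active neighbours, and MIS nodes plus
    their neighbours then drop out."""
    rng = random.Random(seed)
    active = set(range(n))
    mis = set()
    while active:
        draw = {v: rng.random() for v in active}
        winners = set()
        for v in active:
            nbrs = [u for u in ((v - 1) % n, (v + 1) % n) if u in active]
            if all(draw[v] < draw[u] for u in nbrs):
                winners.add(v)
        mis |= winners
        removed = set(winners)
        for v in winners:
            removed |= {(v - 1) % n, (v + 1) % n}
        active -= removed
    return mis

print(sorted(luby_mis_ring(12)))
```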
1136

Code-aided synchronization for digital burst communications

Herzet, Cédric 21 April 2006 (has links)
This thesis deals with the synchronization of digital communication systems. Synchronization (from the Greek syn (together) and chronos (time)) denotes the task of making two systems run in step with each other. In communication systems, synchronizing the transmitter and the receiver requires accurate estimation of a number of parameters, such as the carrier frequency and phase offsets and the timing epoch. In the early days of digital communications, synchronizers operated in either data-aided (DA) or non-data-aided (NDA) mode. However, with the recent advent of powerful coding techniques, these conventional synchronization modes have been shown to be unable to properly synchronize state-of-the-art receivers. In this context, we investigate in this thesis a new family of synchronizers referred to as code-aided (CA) synchronizers. The idea behind CA synchronization is to exploit the structure of the code used to protect the data in order to improve the estimation quality achieved by the synchronizers. In the first part of the thesis, we address the issue of turbo synchronization, i.e., the iterative estimation of continuous synchronization parameters. In particular, we derive several mathematical frameworks enabling a systematic derivation of turbo synchronizers and a deeper understanding of their behavior. In the second part, we focus on the so-called CA hypothesis-testing problem. More particularly, we derive optimal solutions to this problem and propose efficient implementations of the proposed algorithms. Finally, in the last part of this thesis, we derive theoretical lower bounds on the performance of turbo synchronizers.
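For context, the conventional data-aided baseline that code-aided schemes improve upon can be stated in a few lines: with known pilot symbols, the maximum-likelihood carrier-phase estimate is the argument of the correlation between the received samples and the pilots. A small numerical sketch, with assumed BPSK pilots and noise level, follows; the code-aided estimators developed in the thesis are considerably more involved.

```python
import numpy as np

def da_phase_estimate(received, pilots):
    """Classical data-aided ML carrier-phase estimate from known pilot
    symbols: the conventional baseline that code-aided schemes refine
    by also exploiting the channel code."""
    return np.angle(np.sum(received * np.conj(pilots)))

rng = np.random.default_rng(0)
pilots = rng.choice([-1.0, 1.0], size=200) + 0j           # assumed BPSK pilots
true_phase = 0.3
noise = rng.normal(0, 0.1, 200) + 1j * rng.normal(0, 0.1, 200)
received = pilots * np.exp(1j * true_phase) + noise
print(round(da_phase_estimate(received, pilots), 3))      # close to 0.3
```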
1137

Stochastic Modeling and Simulation of Gene Networks

Xu, Zhouyi 06 May 2010 (has links)
Recent research in experimental and computational biology has revealed the necessity of using stochastic modeling and simulation to investigate the functionality and dynamics of gene networks. However, sophisticated stochastic modeling techniques and efficient stochastic simulation algorithms (SSAs) for analyzing and simulating gene networks are still lacking. Therefore, the objective of this research is to design highly efficient and accurate SSAs, to develop stochastic models for certain real gene networks, and to apply stochastic simulation to investigate such gene networks. To achieve this objective, we developed several novel, efficient and accurate SSAs. We also proposed two stochastic models for the circadian system of Drosophila and simulated the dynamics of the system. The K-leap method constrains the total number of reactions in one leap to a properly chosen number, thereby improving simulation accuracy. Since the exact SSA is a special case of the K-leap method when K=1, the K-leap method can naturally change from the exact SSA to an approximate leap method during simulation if necessary. The hybrid tau/K-leap and the modified K-leap methods are particularly suitable for simulating gene networks in which certain reactant molecular species have a small number of molecules. Although the existing tau-leap methods can significantly speed up stochastic simulation of certain gene networks, the mean of the number of firings of each reaction channel is not equal to the true mean. Therefore, all existing tau-leap methods produce biased results, which limits simulation accuracy and speed. Our unbiased tau-leap methods remove the bias in simulation results that exists in all current leap SSAs and therefore significantly improve simulation accuracy without sacrificing speed. In order to efficiently estimate the probability of rare events in gene networks, we applied the importance sampling technique to the next reaction method (NRM) of the SSA and developed a weighted NRM (wNRM). We further developed a systematic method for selecting the values of the importance sampling parameters. Applying our parameter selection method to the wSSA and the wNRM, we obtain an improved wSSA (iwSSA) and an improved wNRM (iwNRM), which provide substantial improvement over the wSSA in terms of simulation efficiency and accuracy. We also develop a detailed and a reduced stochastic model for the circadian rhythm in Drosophila and employ our SSA to simulate circadian oscillations. Our simulations showed that both models can produce sustained oscillations and that the oscillation is robust to noise, in the sense that there is very little variability in the oscillation period although there are significant random fluctuations in oscillation peaks. Moreover, although average time delays are essential to the simulation of oscillation, random changes in time delays within a certain range around the fixed average time delay cause little variability in the oscillation period. Our simulation results also showed that both models are robust to parameter variations and that the oscillation can be entrained by light/dark cycles.
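The exact SSA that the K-leap and tau-leap methods approximate is Gillespie's direct method. The sketch below applies it to an assumed toy birth-death model of a single mRNA species, not to the Drosophila circadian models developed in the thesis.

```python
import math
import random

def gillespie_ssa(rates, stoich, propensity, x0, t_end, seed=0):
    """Gillespie's exact direct-method SSA: draw the time to the next
    reaction from an exponential with rate a0, then pick which reaction
    fires in proportion to its propensity. Leap methods approximate this
    one-reaction-at-a-time simulation to trade accuracy for speed."""
    rng = random.Random(seed)
    t, x = 0.0, list(x0)
    history = [(t, tuple(x))]
    while True:
        a = [propensity(j, x, rates) for j in range(len(stoich))]
        a0 = sum(a)
        if a0 == 0:
            break
        dt = -math.log(rng.random()) / a0          # time to next reaction
        if t + dt > t_end:
            break
        t += dt
        r, cum = rng.random() * a0, 0.0
        for j, aj in enumerate(a):                  # pick which reaction fires
            cum += aj
            if r <= cum:
                x = [xi + s for xi, s in zip(x, stoich[j])]
                break
        history.append((t, tuple(x)))
    return history

# Assumed toy birth-death model of one mRNA species: 0 -> M (k1), M -> 0 (k2).
rates = {"k1": 2.0, "k2": 0.1}
stoich = [(+1,), (-1,)]
prop = lambda j, x, k: k["k1"] if j == 0 else k["k2"] * x[0]
traj = gillespie_ssa(rates, stoich, prop, x0=(0,), t_end=50.0)
print(traj[-1])   # final state fluctuates around k1/k2 = 20
```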
1138

Homogeneous models of anechoic rubber coatings

Cederholm, Alex January 2003 (has links)
No description available.
1139

Multi-objective optimization using Genetic Algorithms

Amouzgar, Kaveh January 2012 (has links)
In this thesis, the basic principles and concepts of single- and multi-objective Genetic Algorithms (GA) are reviewed. Two algorithms, one for single-objective and the other for multi-objective problems, which are believed to be more efficient, are described in detail. The algorithms are coded in MATLAB and applied to several test functions. The results are compared with existing solutions in the literature and show promise. The obtained Pareto fronts closely match the true Pareto fronts, with a good spread of solutions throughout the optimal region. Constraint handling techniques are studied and applied in the two algorithms. Constrained benchmarks are optimized, and the outcomes show the ability of the algorithm to maintain solutions throughout the entire Pareto-optimal region. Finally, a hybrid method based on the combination of the two algorithms is introduced and its performance is discussed. It is concluded that no significant advantage is observed for this approach and that more research is required on the topic. For further investigation of the performance of the proposed techniques, implementation in real-world engineering applications is recommended.
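The basic operation underlying multi-objective GAs of this kind is extracting the non-dominated (Pareto) subset of a population. A minimal sketch, assuming minimisation of both objectives and toy data, is given below; it is not the specific ranking procedure implemented in the thesis.

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimisation):
    no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the non-dominated subset of a set of objective vectors,
    the building block used when ranking a multi-objective population."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Toy bi-objective values (e.g. cost vs. weight).
pop = [(1.0, 5.0), (2.0, 3.0), (3.0, 4.0), (4.0, 1.0), (2.5, 2.5)]
print(pareto_front(pop))   # [(1.0, 5.0), (2.0, 3.0), (4.0, 1.0), (2.5, 2.5)]
```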
1140

Global Optimization of Monotonic Programs: Applications in Polynomial and Stochastic Programming.

Cheon, Myun-Seok 15 April 2005 (has links)
Monotonic optimization consists of minimizing or maximizing a monotonic objective function over a set of constraints defined by monotonic functions. Many optimization problems in economics and engineering have monotonicity while lacking other useful properties, such as convexity. This thesis is concerned with the development and application of global optimization algorithms for monotonic optimization problems. First, we propose enhancements to an existing outer-approximation algorithm for monotonic optimization problems, called the Polyblock Algorithm. The enhancements are shown to significantly improve the computational performance of the algorithm while retaining its convergence properties. Next, we develop a generic branch-and-bound algorithm for monotonic optimization problems. A computational study is carried out to compare the performance of the Polyblock Algorithm and variants of the proposed branch-and-bound scheme on a family of separable polynomial programming problems. Finally, we study an important class of monotonic optimization problems: probabilistically constrained linear programs. We develop a branch-and-bound algorithm that searches for a global solution to the problem. The basic algorithm is enhanced by domain-reduction and cutting-plane strategies that reduce the size of the partitions and hence tighten the bounds. The proposed branch-reduce-cut algorithm exploits the monotonicity properties inherent in the problem and requires the solution of only linear programming subproblems. We provide convergence proofs for the algorithm. Illustrative numerical results involving problems with discrete distributions are presented.
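To make the role of monotonicity concrete, the sketch below runs a simple branch-and-bound over boxes for maximising an increasing objective under an increasing constraint, using the box corners as bounds. The problem instance and bounding scheme are assumptions for illustration and are far simpler than the Polyblock and branch-reduce-cut algorithms developed in the thesis.

```python
import heapq

def branch_and_bound(f, g, budget, lower, upper, tol=1e-4):
    """Maximise an increasing f subject to an increasing constraint
    g(x) <= budget over a box, exploiting monotonicity: on a box [lo, up],
    f(up) is an upper bound, g(lo) > budget proves infeasibility, and
    g(up) <= budget means the whole box is feasible with optimum at up."""
    best_val, best_x = float("-inf"), None
    heap = [(-f(upper), lower, upper)]            # max-heap on box upper bounds
    while heap:
        neg_ub, lo, up = heapq.heappop(heap)
        if -neg_ub <= best_val + tol:
            continue                              # cannot improve the incumbent
        if g(lo) > budget:
            continue                              # whole box infeasible
        if g(up) <= budget:                       # whole box feasible
            if f(up) > best_val:
                best_val, best_x = f(up), up
            continue
        if f(lo) > best_val:                      # lo is feasible here
            best_val, best_x = f(lo), lo
        d = max(range(len(lo)), key=lambda i: up[i] - lo[i])   # longest edge
        if up[d] - lo[d] < tol:
            continue
        mid = 0.5 * (lo[d] + up[d])
        left_up = tuple(mid if i == d else up[i] for i in range(len(up)))
        right_lo = tuple(mid if i == d else lo[i] for i in range(len(lo)))
        heapq.heappush(heap, (-f(left_up), lo, left_up))
        heapq.heappush(heap, (-f(up), right_lo, up))
    return best_val, best_x

f = lambda x: x[0] + 2.0 * x[1]                   # increasing objective
g = lambda x: x[0] ** 2 + x[1] ** 2               # increasing constraint on [0,1]^2
print(branch_and_bound(f, g, 1.0, (0.0, 0.0), (1.0, 1.0)))
# True optimum: x = (1/sqrt(5), 2/sqrt(5)), value about 2.236
```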
