  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
191

Stochastic Combinatorial Optimization / Optimisation combinatoire stochastique

Cheng, Jianqiang 08 November 2013 (has links)
Dans cette thèse, nous étudions trois types de problèmes stochastiques : les problèmes avec contraintes probabilistes, les problèmes distributionnellement robustes et les problèmes avec recours. Les difficultés des problèmes stochastiques sont essentiellement liées à la non-convexité du domaine des solutions et au calcul de l'espérance mathématique ou des probabilités, qui nécessite le calcul complexe d'intégrales multiples. À cause de ces difficultés majeures, nous avons résolu les problèmes étudiés à l'aide d'approximations efficaces. Nous avons étudié deux types de problèmes stochastiques avec contraintes en probabilité : les problèmes linéaires avec contraintes en probabilité jointes (LPPC) et les problèmes de maximisation de probabilités (MPP). Dans les deux cas, nous avons supposé que les variables aléatoires sont normalement distribuées et que les vecteurs lignes des matrices aléatoires sont indépendants. Nous avons résolu LPPC, qui est un problème généralement non convexe, à l'aide de deux approximations basées sur les problèmes coniques du second ordre (SOCP). Sous certaines hypothèses faibles, les solutions optimales des deux SOCP fournissent respectivement des bornes inférieures et supérieures du problème de départ. En ce qui concerne MPP, nous avons étudié une variante du problème du plus court chemin stochastique avec contraintes de ressources (SRCSP), qui consiste à maximiser la probabilité de satisfaire les contraintes de ressources. Pour résoudre ce problème, nous avons proposé un algorithme de Branch and Bound pour calculer la solution optimale. Comme la relaxation linéaire n'est pas convexe, nous avons proposé une approximation convexe efficace. Nous avons par la suite testé nos algorithmes sur des instances aléatoires pour tous les problèmes étudiés. Pour LPPC, notre approche est plus performante que celles de Bonferroni et de Jagannathan.
Pour MPP, nos résultats numériques montrent que notre approche est là encore plus performante que l'approximation des contraintes probabilistes individuelles. La deuxième famille de problèmes étudiés est celle des problèmes distributionnellement robustes, où seule une partie de l'information sur les variables aléatoires est connue, à savoir les deux premiers moments. Nous avons montré que le problème de sac à dos stochastique (SKP) est un problème semi-défini positif (SDP) après relaxation SDP des contraintes binaires. Bien que ce résultat ne puisse être étendu au cas du problème multi-sac-à-dos (MKP), nous avons proposé deux approximations qui permettent d'obtenir des bornes de bonne qualité pour la plupart des instances testées. Nos résultats numériques montrent que nos approximations sont là encore plus performantes que celles basées sur les inégalités de Bonferroni et celles plus récentes de Zymler. Ces résultats ont aussi montré la robustesse des solutions obtenues face aux fluctuations des distributions de probabilités. Nous avons aussi étudié une variante du problème du plus court chemin stochastique. Nous avons prouvé que ce problème peut se ramener au problème de plus court chemin déterministe sous certaines hypothèses. Pour le résoudre, nous avons proposé une méthode de Branch and Bound où les bornes inférieures sont calculées à l'aide de la méthode du gradient projeté stochastique. Des résultats numériques ont montré l'efficacité de notre approche. Enfin, l'ensemble des méthodes proposées dans cette thèse peuvent s'appliquer à une large famille de problèmes d'optimisation stochastique en variables entières. / In this thesis, we studied three types of stochastic problems: chance-constrained problems, distributionally robust problems, and simple recourse problems. Stochastic programming problems raise two main difficulties. One is that the feasible set of a stochastic problem is, in general, not convex.
The other is the need to compute expectations or probabilities, both of which involve multi-dimensional integrals. Because of these two major difficulties, we solved all three problems with approximation approaches. We first studied two types of chance-constrained problems: the linear program with joint chance constraints (LPPC) and the maximum probability problem (MPP). For both problems, we assumed that the random matrix is normally distributed and that its row vectors are independent. We first dealt with LPPC, which is generally not convex, and approximated it with two second-order cone programming (SOCP) problems. Under mild conditions, the optimal values of the two SOCP problems provide a lower and an upper bound on the original problem, respectively. For the second problem, we studied a variant of the stochastic resource-constrained shortest path problem (SRCSP), which maximizes the probability that the resource constraints are satisfied. To solve it, we proposed a branch-and-bound framework that computes the optimal solution. As the corresponding linear relaxation is generally not convex, we give a convex approximation. Finally, numerical tests on random instances were conducted for both problems. For LPPC, the results showed that our approach outperforms the Bonferroni and Jagannathan approximations, while for MPP, the results on generated instances showed that the convex approximation outperforms the individual chance-constraint approximation. We then studied a distributionally robust stochastic quadratic knapsack problem, where only partial information about the random variables is known, namely their first and second moments. We proved that the single knapsack problem (SKP) becomes a semidefinite program (SDP) after applying the SDP relaxation scheme to the binary constraints.
Although this is not the case for the multidimensional knapsack problem (MKP), we provide two good approximations of the relaxed problem that yield upper and lower bounds which are numerically close to each other on a range of instances. Our numerical experiments also indicated that our lower-bounding approximation outperforms the approximations based on Bonferroni's inequality and on the work by Zymler et al. In addition, an extensive set of experiments illustrates how the conservativeness of the robust solutions pays off by ensuring that the chance constraint is satisfied (or nearly satisfied) under a wide range of distribution fluctuations. Moreover, our approach can be applied to a large class of stochastic optimization problems with binary variables. Finally, a stochastic version of the shortest path problem is studied. We proved that in some cases the stochastic shortest path problem can be greatly simplified by reformulating it as the classic shortest path problem, which can be solved in polynomial time. To solve the general problem, we proposed a branch-and-bound framework that searches the set of feasible paths. Lower bounds are obtained by solving the corresponding linear relaxation, which in turn is done with a stochastic projected gradient algorithm using an active-set method. Numerical examples illustrate the effectiveness of the algorithm: on the continuous relaxation, our stochastic projected gradient algorithm clearly outperforms the Matlab optimization toolbox on large graphs.
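For intuition about the chance-constrained models above, the classic deterministic equivalent of an individual chance constraint under normality can be sketched as follows. This is a toy illustration of the standard normal-case reformulation (the kind of individual approximation the thesis compares against), not the thesis's joint-constraint SOCP bounds; all names are hypothetical.

```python
import math
from statistics import NormalDist

def chance_constraint_lhs(x, mu, cov, eps):
    """Deterministic (second-order-cone-style) left-hand side of the
    individual chance constraint P(a^T x <= b) >= 1 - eps, when the
    random row vector a ~ N(mu, cov):
        mu^T x + z_{1-eps} * sqrt(x^T cov x) <= b.
    Returns the left-hand-side value for a candidate solution x."""
    n = len(x)
    z = NormalDist().inv_cdf(1 - eps)  # standard normal quantile z_{1-eps}
    mean_term = sum(mu[i] * x[i] for i in range(n))
    variance = sum(x[i] * cov[i][j] * x[j]
                   for i in range(n) for j in range(n))
    return mean_term + z * math.sqrt(variance)

# x is feasible for the chance constraint iff this value is <= b
lhs = chance_constraint_lhs([1.0, 1.0], [1.0, 2.0],
                            [[0.04, 0.0], [0.0, 0.09]], eps=0.05)
```

With eps = 0.5 the quantile is zero and the constraint reduces to its mean version; smaller eps inflates the left-hand side by a variance-dependent safety margin, which is what makes the deterministic equivalent conservative.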
192

Transformada imagem-floresta com funções de conexidade não suaves: pesos adaptativos, polaridade de borda e restrições de forma / Image foresting transform with non-smooth connectivity functions: adaptive weights, boundary polarity, and shape constraints

Mansilla, Lucy Alsina Choque 26 February 2014 (has links)
Segmentar uma imagem consiste em particioná-la em regiões relevantes para uma dada aplicação, como para isolar um objeto de interesse no domínio de uma imagem. A segmentação é um dos problemas mais fundamentais e desafiadores em processamento de imagem e visão computacional. Ela tem desempenhado um papel importante, por exemplo, na pesquisa em neurologia, envolvendo imagens de Ressonância Magnética (RM), para fins de diagnóstico e tratamento de doenças relacionadas com alterações na anatomia do cérebro humano. Métodos de segmentação baseados na transformada imagem-floresta (IFT, Image Foresting Transform), com funções de conexidade suaves, possuem resultados ótimos, segundo o critério da otimalidade dos caminhos descrito no artigo original da IFT, e têm sido usados com sucesso em várias aplicações, como por exemplo na segmentação de imagens RM de 1.5 Tesla. No entanto, esses métodos carecem de restrições de regularização de borda, podendo gerar segmentações com fronteiras muito irregulares e indesejadas. Eles também não distinguem bem entre bordas similares com orientações opostas, e possuem alta sensibilidade à estimativa dos pesos das arestas do grafo, gerando problemas em imagens com efeitos de inomogeneidade. Nesse trabalho são propostas extensões da IFT, do ponto de vista teórico e experimental, através do uso de funções de conexidade não suaves, para a segmentação interativa de imagens por região. A otimalidade dos novos métodos é suportada pela maximização de energias de corte em grafo, ou como o fruto de uma sequência de iterações de otimização de caminhos em grafos residuais. Como resultados principais temos: O projeto de funções de conexidade mais adaptativas e flexíveis, com o uso de pesos dinâmicos, que permitem um melhor tratamento de imagens com forte inomogeneidade.
O uso de grafos direcionados, de modo a explorar a polaridade de borda dos objetos na segmentação por região, e o uso de restrições de forma que ajudam a regularizar a fronteira delineada, favorecendo a segmentação de objetos com formas mais regulares. Esses avanços só foram possíveis devido ao uso de funções não suaves. Portanto, a principal contribuição desse trabalho consiste no suporte teórico para o uso de funções não suaves, até então evitadas na literatura, abrindo novas perspectivas na pesquisa de processamento de imagens usando grafos. / Segmenting an image consists in partitioning it into regions that are relevant for a given application, for example to isolate an object of interest in the image domain. Segmentation is one of the most fundamental and challenging problems in image processing and computer vision. It has played an important role, for example, in neurology research, involving Magnetic Resonance (MR) images, for the purposes of diagnosis and treatment of diseases related to changes in the anatomy of the human brain. Segmentation methods based on the Image Foresting Transform (IFT), with smooth connectivity functions, yield optimum results, according to the criterion of path optimality described in the original IFT paper, and have been successfully used in many applications, for example the segmentation of 1.5 Tesla MR images. However, these methods lack boundary regularization constraints and may produce segmentations with quite irregular and undesired boundaries. They also do not distinguish well between similar boundaries with opposite orientations, and are highly sensitive to the arc-weight estimation of the graph, producing poor results in images with strong inhomogeneity effects. In this work, we propose extensions of the IFT framework, from the theoretical and experimental points of view, through the use of non-smooth connectivity functions for region-based interactive image segmentation.
The optimality of the new methods is supported by the maximization of graph-cut energies, or as the result of a sequence of path optimizations in residual graphs. Our main results are: the design of more adaptive and flexible connectivity functions, using dynamic weights, that allow better handling of images with strong inhomogeneity; the use of directed graphs to exploit the boundary polarity of the objects in region-based segmentation; and the use of shape constraints that help to regularize the segmentation boundary, by favoring the segmentation of objects with more regular shapes. These advances were only made possible by the use of non-smooth functions. Therefore, the main contribution of this work is the theoretical support for the usage of non-smooth functions, which were until now avoided in the literature, opening new perspectives in the research of image processing using graphs.
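For readers unfamiliar with the IFT framework that this abstract extends, a minimal sketch of the baseline algorithm with the smooth f_max connectivity function (essentially Dijkstra's algorithm where a path's cost is the maximum arc weight along it) may help. This is a toy illustration on an abstract graph, not the thesis's non-smooth extensions; all names are hypothetical.

```python
import heapq

def ift_fmax(n, arcs, seeds):
    """Image Foresting Transform sketch with the smooth f_max
    connectivity function: the cost of a path is the maximum arc
    weight along it, and each node inherits the label of the seed
    that reaches it with minimum cost.
    n: number of nodes; arcs: dict node -> list of (neighbor, weight);
    seeds: dict seed_node -> label.
    Returns (cost, label) lists indexed by node."""
    INF = float('inf')
    cost = [INF] * n
    label = [None] * n
    heap = []
    for s, lab in seeds.items():
        cost[s] = 0
        label[s] = lab
        heapq.heappush(heap, (0, s))
    while heap:
        c, u = heapq.heappop(heap)
        if c > cost[u]:
            continue                      # stale heap entry
        for v, w in arcs.get(u, []):
            new_cost = max(c, w)          # f_max path-cost extension
            if new_cost < cost[v]:
                cost[v] = new_cost
                label[v] = label[u]
                heapq.heappush(heap, (new_cost, v))
    return cost, label
```

In image segmentation, nodes are pixels, arc weights come from intensity differences, and the seed labels partition the image into object and background; the thesis's contribution is precisely what happens when the path-cost function is no longer smooth in this sense.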
193

Impact of Cascading Failures on Performance Assessment of Civil Infrastructure Systems

Adachi, Takao 05 March 2007 (has links)
Water distribution systems, electrical power transmission systems, and other civil infrastructure systems are essential to the smooth and stable operation of regional economies. Since the functions of such infrastructure systems often are inter-dependent, the systems sometimes suffer unforeseen functional disruptions. For example, the widespread power outage due to the malfunction of an electric power substation, which occurred in the northeastern United States and parts of Canada in August 2003, interrupted the supply of water to several communities, leading to inconvenience and economic losses. The sequence of such failures leading to widespread outages is referred to as a cascading failure. Assessing the vulnerability of communities to natural and man-made hazards should take the possibility of such failures into account. In seismic risk assessment, the risk to a facility or a building is generally specified by one of two basic approaches: through a probabilistic seismic hazard analysis (PSHA) or a stipulated scenario earthquake (SE). A PSHA has been widely accepted as a basis for design and evaluation of individual buildings, bridges and other facilities. However, the vulnerability assessment of distributed infrastructure facilities requires a model of the spatial intensity of earthquake ground motion. Since the ground motions from a PSHA represent an aggregation of earthquakes, they cannot model the spatial variation in intensity. On the other hand, when a SE-based analysis is used, the spatial correlation of seismic intensities must be properly evaluated. This study presents a new methodology for evaluating the functionality of an infrastructure system situated in a region of moderate seismicity, considering functional interactions among the systems in the network, cascading failures, and the spatial correlation of ground motion.
The functional interactions among facilities in the systems are modeled by fault trees, and the impact of cascading failures on the serviceability of a networked system is computed by a procedure from the field of operations research known as a shortest path algorithm. Upper and lower bound solutions accounting for the spatial correlation of seismic intensities over a region are obtained.
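The shortest-path step used in such serviceability assessments can be illustrated with a small sketch: give each link a failure probability, transform reliabilities into additive lengths, and find the most reliable source-terminal path with Dijkstra's algorithm. This is a simplified stand-in under assumed link reliabilities, not the thesis's procedure; the fault-tree modeling of facility interactions is omitted.

```python
import heapq
import math

def most_reliable_path(n, edges, src, dst):
    """Most-reliable source-terminal path via Dijkstra's algorithm:
    each undirected link's reliability r is mapped to a length
    -log(r), so minimizing total length maximizes the product of
    link reliabilities along the path.
    edges: list of (u, v, reliability) with 0 < reliability <= 1.
    Returns the reliability of the best path, or 0.0 if dst is
    unreachable from src."""
    adj = {i: [] for i in range(n)}
    for u, v, r in edges:
        w = -math.log(r)
        adj[u].append((v, w))
        adj[v].append((u, w))
    dist = {src: 0.0}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            return math.exp(-d)           # back-transform to reliability
        if d > dist.get(u, float('inf')):
            continue                      # stale heap entry
        for v, w in adj[u]:
            nd = d + w
            if nd < dist.get(v, float('inf')):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return 0.0
```

A two-hop path through reliable links can beat a direct but fragile one, which is the essence of using shortest paths to quantify how cascading link failures degrade network serviceability.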
194

Méthodes exactes et heuristiques pour le problème de tournées de véhicules avec fenêtres de temps et réutilisation de véhicules

Azi, Nabila 08 1900 (has links)
Cette thèse porte sur les problèmes de tournées de véhicules avec fenêtres de temps où un gain est associé à chaque client et où l'objectif est de maximiser la somme des gains recueillis moins les coûts de transport. De plus, un même véhicule peut effectuer plusieurs tournées durant l'horizon de planification. Ce problème a été relativement peu étudié en dépit de son importance en pratique. Par exemple, dans le domaine de la livraison de denrées périssables, plusieurs tournées de courte durée doivent être combinées afin de former des journées complètes de travail. Nous croyons que ce type de problème aura une importance de plus en plus grande dans le futur avec l'avènement du commerce électronique, comme les épiceries électroniques, où les clients peuvent commander des produits par internet pour la livraison à domicile. Dans le premier chapitre de cette thèse, nous présentons d'abord une revue de la littérature consacrée aux problèmes de tournées de véhicules avec gains ainsi qu'aux problèmes permettant une réutilisation des véhicules. Nous présentons les méthodologies générales adoptées pour les résoudre, soit les méthodes exactes, les méthodes heuristiques et les méta-heuristiques. Nous discutons enfin des problèmes de tournées dynamiques où certaines données sur le problème ne sont pas connues à l'avance. Dans le second chapitre, nous décrivons un algorithme exact pour résoudre un problème de tournées avec fenêtres de temps et réutilisation de véhicules où l'objectif premier est de maximiser le nombre de clients desservis. Pour ce faire, le problème est modélisé comme un problème de tournées avec gains. L'algorithme exact est basé sur une méthode de génération de colonnes couplée avec un algorithme de plus court chemin élémentaire avec contraintes de ressources. Pour résoudre des instances de taille réaliste dans des temps de calcul raisonnables, une approche de résolution de nature heuristique est requise. 
Le troisième chapitre propose donc une méthode de recherche adaptative à grand voisinage qui exploite les différents niveaux hiérarchiques du problème (soit les journées complètes de travail des véhicules, les routes qui composent ces journées et les clients qui composent les routes). Dans le quatrième chapitre, qui traite du cas dynamique, une stratégie d'acceptation et de refus des nouvelles requêtes de service est proposée, basée sur une anticipation des requêtes à venir. L'approche repose sur la génération de scénarios pour différentes réalisations possibles des requêtes futures. Le coût d'opportunité de servir une nouvelle requête est basé sur une évaluation des scénarios avec et sans cette nouvelle requête. Enfin, le dernier chapitre résume les contributions de cette thèse et propose quelques avenues de recherche future. / This thesis studies vehicle routing problems with time windows, where a gain is associated with each customer and where the objective is to maximize the total gain collected minus the routing costs. Furthermore, the same vehicle might be assigned to different routes during the planning horizon. This problem has received little attention in the literature in spite of its importance in practice. For example, in the home delivery of perishable goods (like food), routes of short duration must be combined to form complete workdays. We believe that this type of problem will become increasingly important in the future with the advent of electronic services, like e-groceries, where customers can order goods through the Internet and get these goods delivered at home. In the first chapter of this thesis, we present a review of vehicle routing problems with gains, as well as vehicle routing problems with multiple use of vehicles. We discuss the general classes of problem-solving approaches for these problems, namely, exact methods, heuristics and metaheuristics.
We also introduce dynamic vehicle routing problems, where new information is revealed as the routes are executed. In the second chapter, we describe an exact algorithm for a vehicle routing problem with time windows and multiple use of vehicles, where the first objective is to maximize the number of served customers. To this end, the problem is modeled as a vehicle routing problem with gains. The exact algorithm is based on column generation, coupled with an elementary shortest path algorithm with resource constraints. To solve realistic instances in reasonable computation times, a heuristic approach is required. The third chapter proposes an adaptive large neighborhood search that exploits the various hierarchical levels of the problem (i.e., complete vehicle workdays, routes within workdays and customers within routes). The fourth chapter deals with the dynamic case. In this chapter, a strategy for accepting or rejecting new customer requests is proposed. This strategy is based on the generation of multiple scenarios for different realizations of the requests in the future. An opportunity cost for serving a new request is then computed, based on an evaluation of the scenarios with and without the new request. Finally, the last chapter summarizes the contributions of this thesis and proposes future research avenues.
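The pricing subproblem in the column-generation scheme described above is an elementary shortest path problem with resource constraints. A much-simplified labeling sketch follows: a single resource, no elementarity handling, and plain label dominance; all names are hypothetical, and this is far from the thesis's full algorithm.

```python
def rcsp(n, arcs, src, dst, capacity):
    """Resource-constrained shortest path by label correction.
    Each label at a node is a pair (cost, resource); a label is
    pruned when another label at the same node is at least as good
    in both components (dominance).
    arcs: dict node -> list of (neighbor, arc_cost, arc_resource).
    Returns the minimum cost of a src->dst path whose total resource
    does not exceed `capacity`, or None if no feasible path exists."""
    labels = {i: [] for i in range(n)}
    labels[src] = [(0, 0)]
    frontier = [(src, 0, 0)]
    while frontier:
        u, c, r = frontier.pop()
        for v, cost, res in arcs.get(u, []):
            nc, nr = c + cost, r + res
            if nr > capacity:
                continue                  # resource limit violated
            if any(lc <= nc and lr <= nr for lc, lr in labels[v]):
                continue                  # dominated by an existing label
            labels[v] = [(lc, lr) for lc, lr in labels[v]
                         if not (nc <= lc and nr <= lr)] + [(nc, nr)]
            frontier.append((v, nc, nr))
    return min((c for c, r in labels[dst]), default=None)
```

Tightening the resource limit can force the search onto a more expensive path, which is exactly the tension the pricing problem resolves when it generates candidate routes for the master problem.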
195

Caracterização de redes complexas: aplicação à modelagem relacional entre sistemas autônomos da Internet / Complex networks characterization: application to relational modeling between internet autonomous systems

Nilton Alves Junior 29 March 2007 (has links)
Neste trabalho, foram utilizadas técnicas e conceitos tipicamente encontrados em estudos de Redes Complexas, uma sub-área da Física Estatística, para caracterizar a Internet e sua evolução em uma década, de 1998 a 2007. Foi considerada como unidade básica de análise, a estrutura Sistema Autônomo. Nesta caracterização, foram utilizadas várias ferramentas computacionais desenvolvidas em linguagem C/C++, que permitiram classificar, simular e modelar propriedades dinâmicas. Dentre estas propriedades podemos destacar o coeficiente de conectividade, fundamental para os estudos topológicos, e o parâmetro menor caminho médio, ambos baseados nas propriedades da matriz adjacência. Os dados experimentais foram inicialmente obtidos nos roteadores de borda da RedeRio de Computadores - FAPERJ e posteriormente, os dados relativos ao intervalo de estudo, foram retirados da base de dados disponibilizada pela Universidade de Oregon. Foi proposto um modelo de crescimento de uma rede complexa baseado nas premissas de crescimento contínuo e conexão preferencial não linear com suporte aos mecanismos de rearranjo e novas conexões entre nós já existentes. Este modelo se mostrou bastante adequado no estudo das propriedades consideradas. Foi desenvolvido um método para cálculo do menor caminho médio que apresentou performance superior àqueles normalmente utilizados pela comunidade acadêmica. O comportamento da topologia sob o ponto de vista da distribuição de probabilidades de conexão e do ranque de conectividade, apresentaram comportamento linear constante no período estudado com coeficientes médios iguais a -2,0 e -0,93, respectivamente. O parâmetro menor caminho médio global da Internet permaneceu praticamente inalterado e igual a 4,2 ao longo da década estudada. / Connection networks are observed in many areas of human knowledge.
The characterization and topological study of these networks can be performed through the distribution of connectivity degrees, rank properties, the shortest path length between nodes, the adjacency matrix, etc., typical concepts from Complex Networks, a field of study within Statistical Physics. In this thesis we characterize the evolution of Internet connections from 1998 to 2007. The Internet may be viewed at several levels of reach and complexity, depending on the basic unit considered. A broad view takes the Internet's basic element to be an Autonomous System (AS), defined as a cluster of LANs or routers subject to the same usage and connectivity policy and technically administered by the same network management group. The complex network considered in this work is composed of Autonomous Systems (vertices) and the connections established between them (edges), obtained from the BGP routing table. Several interesting properties of this network are analyzed, e.g., the degree distribution (the rank and outdegree exponents) from 1998 to 2007 and the shortest path length (L), obtained by a proposed computational method (the Friburgo algorithm) for each pair of ASs represented in the adjacency matrix. Finally, we present the behavior of the power-law function and of the shortest path length of the Internet for each year. Simulations of the connection network were carried out with a proposed model built on the premises of continuous growth and the possibility of new and rearranged connections. This model is based on the concept of nonlinear preferential connection, showing a stable exponential factor that reproduces the true shortest-path parameter over the decade.
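The average shortest path length statistic reported above (roughly 4.2 for the AS-level Internet over the decade) can be computed on an unweighted graph with one breadth-first search per node. The following is a plain baseline sketch of that computation, not the thesis's optimized Friburgo algorithm; on a real AS graph one would sample sources rather than run BFS from every node.

```python
from collections import deque

def average_shortest_path(adj):
    """Mean shortest-path length over all connected ordered pairs of
    an unweighted, undirected graph, via one BFS per node.
    adj: dict node -> list of neighbors (symmetric)."""
    total, pairs = 0, 0
    for s in adj:
        dist = {s: 0}
        q = deque([s])
        while q:                          # standard BFS from source s
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(d for d in dist.values() if d > 0)
        pairs += len(dist) - 1            # reachable pairs from s
    return total / pairs if pairs else 0.0
```

Because each BFS is O(V + E), the全-pairs computation is O(V(V + E)), which is why specialized or sampled methods matter on graphs with tens of thousands of ASs.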
197

INTERFACE DE ANÁLISE DA INTERCONEXÃO EM UMA LAN USANDO CORBA / Software development (graphical user interface) that makes possible to analyze the interconnection in a LAN (Local Area Network) using CORBA (Common Object Request Broker Architecture)

MONTEIRO, Milson Silva 07 June 2002 (has links)
Conselho Nacional de Desenvolvimento Científico e Tecnológico / This work concerns the development of software (a graphical user interface) that makes it possible to analyze the interconnection in a LAN (Local Area Network) using CORBA (Common Object Request Broker Architecture) in a distributed, heterogeneous environment spanning several peripheral machines. This work applies classic paradigms of graph theory: shortest path problems (Dijkstra, Ford-Moore-Bellman), maximum flow problems (Edmonds-Karp) and minimum cost flow problems (Busacker-Gowen) to formalize the development of the interface. We discuss graph theory and network flows, which are essential for the theoretical grounding. / O objeto de estudo deste trabalho é o desenvolvimento de um software (interface gráfica do usuário) que possibilita analisar a interconexão de uma LAN (Local Area Network) usando CORBA (Common Object Request Broker Architecture) em ambientes distribuídos e heterogêneos entre diversas máquinas periféricas. Este trabalho apresenta os paradigmas da teoria de grafos: menor caminho (Dijkstra, Ford-Moore-Bellman), fluxo máximo (Edmonds-Karp) e fluxo de custo mínimo (Busacker-Gowen) para formalizar o desenvolvimento da interface. Discorremos sobre a teoria de grafos e fluxos em redes que são relevantes para garantir o embasamento teórico.
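Of the three graph-theory paradigms cited in this record, the Edmonds-Karp maximum-flow algorithm can be sketched compactly: it repeatedly finds a shortest augmenting path by BFS on the residual network. This is an illustrative implementation, not the thesis software.

```python
from collections import deque

def edmonds_karp(capacity, s, t):
    """Edmonds-Karp maximum flow: BFS-based augmenting paths on the
    residual network. capacity: dict-of-dicts, capacity[u][v] = arc
    capacity. Returns the value of a maximum s-t flow."""
    # residual capacities, with reverse arcs initialized to 0
    res = {u: dict(nbrs) for u, nbrs in capacity.items()}
    for u in capacity:
        for v in capacity[u]:
            res.setdefault(v, {}).setdefault(u, 0)
    flow = 0
    while True:
        parent = {s: None}
        q = deque([s])
        while q and t not in parent:      # BFS: shortest augmenting path
            u = q.popleft()
            for v, c in res[u].items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    q.append(v)
        if t not in parent:
            return flow                   # no augmenting path remains
        v, bottleneck = t, float('inf')
        while parent[v] is not None:      # find the path's bottleneck
            u = parent[v]
            bottleneck = min(bottleneck, res[u][v])
            v = u
        v = t
        while parent[v] is not None:      # push flow along the path
            u = parent[v]
            res[u][v] -= bottleneck
            res[v][u] += bottleneck
            v = u
        flow += bottleneck
```

Using BFS (rather than arbitrary augmenting paths, as in plain Ford-Fulkerson) bounds the number of augmentations by O(VE), giving an overall O(VE^2) running time independent of the capacity values.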
198

Transformada imagem-floresta com funções de conexidade não suaves: pesos adaptativos, polaridade de borda e restrições de forma / Image foresting transform with non-smooth connectivity functions: adaptive weights, boundary polarity, and shape constraints

Lucy Alsina Choque Mansilla 26 February 2014 (has links)
Segmenting an image consists of partitioning it into regions that are relevant for a given application, such as isolating an object of interest within the image domain. Segmentation is one of the most fundamental and challenging problems in image processing and computer vision. It has played an important role, for example, in neurology research involving Magnetic Resonance (MR) images, for the diagnosis and treatment of diseases related to changes in the anatomy of the human brain. Segmentation methods based on the Image Foresting Transform (IFT) with smooth connectivity functions produce optimal results, according to the path-optimality criterion described in the original IFT paper, and have been successfully used in many applications, such as the segmentation of 1.5 Tesla MR images. However, these methods lack boundary regularization constraints and may produce segmentations with quite irregular, undesired boundaries. They also do not distinguish well between similar boundaries with opposite orientations, and they are highly sensitive to the arc-weight estimation of the graph, producing poor results in images with strong inhomogeneity effects. In this work, we propose extensions of the IFT framework, from both theoretical and experimental points of view, through the use of non-smooth connectivity functions for region-based interactive image segmentation. The optimality of the new methods is supported by the maximization of graph-cut energies, or arises as the result of a sequence of path optimizations in residual graphs. Our main results are: the design of more adaptive and flexible connectivity functions, using dynamic weights, that allow better handling of images with strong inhomogeneity; the use of directed graphs to exploit the boundary polarity of objects in region-based segmentation; and the use of shape constraints that help to regularize the segmentation boundary by favoring objects with more regular shapes. These advances were only made possible by the use of non-smooth functions. Therefore, the main contribution of this work is the theoretical support for the use of non-smooth connectivity functions, until now avoided in the literature, opening new perspectives in graph-based image processing research.
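The smooth connectivity functions that this thesis takes as its baseline can be illustrated with the standard f_max path cost: the cost of a path is the maximum arc weight along it, and each pixel is conquered by the seed offering the cheapest such path. The following is an illustrative sketch of that Dijkstra-like propagation, not code from the thesis; the graph representation (`neighbors`, `weights`, `seeds`) is an assumption made for the example:

```python
import heapq

def ift_fmax(neighbors, weights, seeds):
    """Minimal IFT sketch with the f_max connectivity function.
    `neighbors[u]` lists the nodes adjacent to u, `weights[(u, v)]`
    gives the arc weight, and `seeds` maps seed node -> label."""
    cost = {u: float("inf") for u in neighbors}
    label = {}
    heap = []
    for s, lab in seeds.items():
        cost[s] = 0
        label[s] = lab
        heapq.heappush(heap, (0, s))
    done = set()
    while heap:
        c, u = heapq.heappop(heap)
        if u in done:
            continue  # stale queue entry
        done.add(u)
        for v in neighbors[u]:
            # f_max: path cost is the maximum arc weight along the path
            new_cost = max(c, weights[(u, v)])
            if new_cost < cost[v]:
                cost[v] = new_cost
                label[v] = label[u]  # v is conquered by u's seed
                heapq.heappush(heap, (new_cost, v))
    return label, cost
```

With f_max the propagation behaves like a watershed from the seeds; the non-smooth functions studied in the thesis deliberately break the path-optimality property this code relies on.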
199

Systém navigace pomocí GPS pro účely cementárenské technologie / GPS Navigation system for cement technology

Mináč, Ján January 2009 (has links)
This diploma thesis deals with the design and implementation of a GPS navigation system. The work describes the basic types of cement quarries and proposes a mathematical and software model of a quarry. It then surveys algorithms for finding the shortest path in a graph, describing two of them: the Floyd-Warshall and Dijkstra algorithms. Finally, the work describes the implementation of Dijkstra's algorithm on the quarry model, together with the programs Autec RouteEditor and AQL Control Library. MINÁČ, J. GPS navigation system for cement technology. Brno: Brno University of Technology, Faculty of Electrical Engineering and Communication, 2009. 90 p. Supervisor prof. Ing. František Zezulka, CSc.
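Of the two algorithms named in the abstract, Dijkstra's is the one applied to the quarry model. A minimal illustrative implementation over an adjacency-list graph (a sketch, not the thesis's actual quarry-model code):

```python
import heapq

def dijkstra(graph, source):
    """Single-source shortest paths. `graph[u]` is a list of
    (v, w) arcs with non-negative weights w."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in graph[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist
```

Floyd-Warshall, the other algorithm described, instead computes all-pairs distances in O(n^3) time, which suits a small, static road network such as a quarry model.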
200

Optimal Route Planning for Electric Vehicles

Juřík, Tomáš January 2013 (has links)
In this work we present algorithms that compute routes to a destination for electric vehicles. These routes can be based on simple metrics such as distance or travel time, or on a more advanced metric that minimizes energy demand. The energy metric is parameterized by the physical characteristics of the electric vehicle. We also propose a new algorithm that computes energy-optimal routes that are more acceptable to the driver, because it also takes the time metric into consideration while computing the route.
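An energy metric differs from distance or time in one important way: downhill arcs can recover energy through regenerative braking, so arc costs may be negative and plain Dijkstra no longer applies. A hedged sketch of this idea, using an illustrative physics model with assumed parameters (mass, rolling resistance, recuperation efficiency) rather than the thesis's actual vehicle model, and Bellman-Ford to tolerate negative arc costs:

```python
def edge_energy(dist_m, dh_m, mass_kg=1500.0, g=9.81, crr=0.012, recup=0.6):
    """Toy per-arc energy cost in joules: rolling resistance over the
    segment plus the potential-energy change. Downhill segments recover
    only a fraction `recup` of the potential energy (recuperation losses).
    All parameter values are illustrative assumptions."""
    rolling = crr * mass_kg * g * dist_m
    potential = mass_kg * g * dh_m
    if potential < 0:
        potential *= recup  # imperfect regenerative braking
    return rolling + potential

def bellman_ford(nodes, arcs, source):
    """Minimum-cost routes when arc costs may be negative (downhill
    recuperation), where Dijkstra is not applicable. `arcs` is a list
    of (u, v, cost) triples."""
    cost = {u: float("inf") for u in nodes}
    cost[source] = 0.0
    for _ in range(len(nodes) - 1):
        for u, v, c in arcs:
            if cost[u] + c < cost[v]:
                cost[v] = cost[u] + c
    return cost
```

Because `recup < 1`, a descent never recovers more energy than the corresponding climb costs, so elevation-derived arc costs contain no negative cycles and Bellman-Ford terminates with correct minima.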
