1 |
Image Thresholding Technique Based On Fuzzy Partition And Entropy Maximization. Zhao, Mansuo, January 2005.
Thresholding is a commonly used technique in image segmentation because it is fast and easy to apply, so threshold selection is an important issue. There are two general approaches to threshold selection: one is based on the histogram of the image, while the other is based on the gray-scale information in small local areas. The histogram of an image contains statistical data about its gray-scale or color components. In this thesis, an adaptive logical thresholding method is first proposed for the binarization of blueprint images. The new method exploits the geometric features of blueprint images and is implemented with a robust windows operation, based on the assumption that the objects have a "C" shape within a small area. Multiple window sizes are used in the windows operation, which not only reduces computation time but also effectively separates thin lines from wide lines. The method can determine the threshold of an image automatically, and experiments show that it is effective for blueprint images and achieves good results over a wide range of images. Second, fuzzy set theory, together with probability partition and maximum entropy theory, is explored to compute the threshold from the histogram of the image. Since it was proposed by Zadeh in 1965, fuzzy set theory has been widely used in many fields where ambiguous phenomena exist, and many thresholding methods have been developed from it. The concept used here is the fuzzy partition: because the method is based on the histogram of the image, the histogram is divided into several groups by fuzzy sets that represent the fuzzy membership of each group. Probability partition is associated with fuzzy partition, and the probability distribution of each group is derived from the fuzzy partition. Entropy, which originates in thermodynamics, was introduced into communication theory as a commonly used criterion for measuring the information transmitted through a channel, and it has been adopted in image processing as a measure of the information contained in processed images. It is therefore applied here as the criterion for selecting the optimal fuzzy sets that partition the histogram. To find the threshold, the histogram of the image is partitioned by fuzzy sets that satisfy a certain entropy restriction. The search for the best possible fuzzy sets then becomes an important issue: without an efficient search procedure, extension to multiple-level thresholding with fuzzy partition becomes extremely time consuming or even impossible. In this thesis, the relationship between a probability partition (PP) and a fuzzy C-partition (FP) is studied. This relationship and the entropy approach are used to derive a thresholding technique that selects the optimal fuzzy C-partition, where the measure of selection quality is the entropy function defined by the PP and the FP. A necessary condition for the entropy function to attain a maximum is derived, and based on this condition an efficient search procedure for two-level thresholding is obtained, which makes the search so efficient that extension to multilevel thresholding becomes possible. A novel fuzzy membership function is proposed for three-level thresholding that produces better results because a new relationship among the fuzzy membership functions is introduced.
This new relationship gives more flexibility in the search for the optimal fuzzy sets, although it also complicates the search for the fuzzy sets in multi-level thresholding. This complication is resolved by a new method called the "Onion-Peeling" method. Because the relationship between the fuzzy membership functions is so complicated, it is impossible to obtain all the membership functions at once, so the search procedure is decomposed into several layers of three-level partitions, except for the last layer, which may be a two-level one. The large problem is thus reduced to three-level partitions, in which the two outermost membership functions can be obtained without worrying too much about the complicated intersections among the membership functions. The method is further revised for images in which a dominant area of background or of an object affects the appearance of the histogram. The histogram is the basis of this method, as of many others, and a "bad" histogram shape results in a badly thresholded image. A quadtree scheme is adopted to decompose the image into homogeneous and heterogeneous areas, and a multi-resolution thresholding method based on the quadtree and fuzzy partition is then devised to deal with such images. Extension of the fuzzy partition methods to color images is also examined: an adaptive thresholding method for color images based on fuzzy partition is proposed that can determine the number of thresholding levels automatically. The thesis concludes that the "C"-shape assumption and the varying window sizes in the windows operation contribute to better segmentation of blueprint images. The efficient search procedure for the optimal fuzzy sets in the fuzzy-2 partition of the image histogram accelerates the process so much that it can be extended to multilevel thresholding. In the three-level fuzzy partition, the new relationship among the three fuzzy membership functions makes more sense than the conventional assumption and, as a result, performs better. A novel method, the "Onion-Peeling" method, is devised to handle the complexity at the intersections among the multiple membership functions in the multilevel fuzzy partition; it decomposes the multilevel partition into fuzzy-3 and fuzzy-2 partitions by transposing the partition space in the histogram, and is therefore efficient in multilevel thresholding. A multi-resolution method that applies the quadtree scheme to distinguish heterogeneous from homogeneous areas is designed for images with large homogeneous areas, which usually distort the histogram; the new histogram, based only on the heterogeneous areas, is adopted for the partition and outperforms the old one, while validity checks filter out fragmented points, which form only a small portion of the whole image. This gives good thresholded images for human face images.
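As a rough illustration of the fuzzy-2 partition entropy criterion described above, the following Python sketch assigns each gray level memberships in two fuzzy groups via a simple linear ramp between two parameters, derives the group probabilities from the histogram, and searches exhaustively for the parameters that maximize the partition entropy. The ramp-shaped membership functions, the exhaustive search, and the toy histogram are illustrative assumptions; the thesis derives a much more efficient search procedure from a necessary condition for the maximum.

```python
import numpy as np

def fuzzy2_entropy_threshold(hist):
    """Pick (a, c) maximizing the entropy of a fuzzy 2-partition of a 256-bin histogram.

    mu_dark is 1 below a, falls linearly to 0 at c; mu_bright = 1 - mu_dark.
    Returns the midpoint (a + c) // 2 as the crisp threshold.
    """
    p = hist.astype(float) / hist.sum()              # gray-level probabilities
    levels = np.arange(len(p))
    best_entropy, best_threshold = -np.inf, None
    for a in range(0, len(p) - 1):
        for c in range(a + 1, len(p)):
            mu_dark = np.clip((c - levels) / (c - a), 0.0, 1.0)
            p_dark = float(np.sum(mu_dark * p))      # probability of the fuzzy "dark" group
            p_bright = 1.0 - p_dark
            if p_dark <= 0.0 or p_bright <= 0.0:
                continue
            h = -p_dark * np.log(p_dark) - p_bright * np.log(p_bright)  # partition entropy
            if h > best_entropy:
                best_entropy, best_threshold = h, (a + c) // 2
    return best_threshold

# Toy bimodal histogram (counts over 256 gray levels) -- illustrative data only.
rng = np.random.default_rng(0)
samples = np.concatenate([rng.normal(60, 10, 5000), rng.normal(180, 15, 5000)])
hist, _ = np.histogram(np.clip(samples, 0, 255), bins=256, range=(0, 255))
print("estimated threshold:", fuzzy2_entropy_threshold(hist))
```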
|
2 |
An Evaluation of Traffic Matrix Estimation Techniques for Large-Scale IP Networks. Adelani, Titus Olufemi, 09 February 2010.
The information on the volume of traffic flowing between all possible origin and destination pairs in an IP network during a given period of time is generally referred to as the traffic matrix (TM). This information, which is very important for various traffic engineering tasks, is costly and difficult to obtain on a large operational IP network; consequently, it is often inferred from readily available link load measurements.
In this thesis, we evaluated five TM estimation techniques, namely Tomogravity (TG), Entropy Maximization (EM), Quadratic Programming (QP), Linear Programming (LP) and Neural Network (NN), with gravity and worst-case bound (WCB) initial estimates. We found that the EM technique consistently performed best in most of our simulations and that the gravity model yielded better initial estimates than the WCB model. A hybrid of these techniques did not result in a considerable decrease in estimation errors. However, we achieved the most significant reduction in errors by combining iterative proportional-fitting estimates with the EM technique. We therefore propose this technique as a viable approach for estimating the traffic matrix of large-scale IP networks.
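As a rough sketch of the gravity-plus-correction idea underlying tomogravity-style estimation (the toy topology, routing matrix, and least-squares projection below are illustrative assumptions, not the thesis's experimental setup or its entropy-maximization formulation), one can start from a gravity prior and adjust it to satisfy the link-load measurements:

```python
import numpy as np

# Toy scenario: 3 nodes, 6 directed OD pairs, 4 directed links.
# A[l, k] = 1 if OD pair k is routed over link l (illustrative routing only).
od_pairs = [(0, 1), (0, 2), (1, 0), (1, 2), (2, 0), (2, 1)]
A = np.array([
    [1, 1, 0, 0, 0, 0],   # link 0->1 carries 0->1 and 0->2 (via node 1)
    [0, 1, 0, 1, 0, 0],   # link 1->2 carries 0->2 and 1->2
    [0, 0, 1, 0, 1, 0],   # link 1->0 carries 1->0 and 2->0 (via node 1)
    [0, 0, 0, 0, 1, 1],   # link 2->1 carries 2->0 and 2->1
], dtype=float)

x_true = np.array([40.0, 10.0, 30.0, 20.0, 15.0, 25.0])   # unknown OD volumes
y = A @ x_true                                             # measured link loads

# Gravity prior: x_g(i, j) proportional to (traffic leaving i) * (traffic entering j).
# In practice these totals come from edge-link measurements; here they are derived
# from the toy data for illustration.
out_tot = {i: sum(x_true[k] for k, (s, _) in enumerate(od_pairs) if s == i) for i in range(3)}
in_tot = {j: sum(x_true[k] for k, (_, d) in enumerate(od_pairs) if d == j) for j in range(3)}
total = x_true.sum()
x_g = np.array([out_tot[s] * in_tot[d] / total for (s, d) in od_pairs])

# Tomogravity-style correction: project the prior onto the link-load constraints
# using the pseudo-inverse (least-norm adjustment), then clip negative estimates.
x_hat = np.clip(x_g + np.linalg.pinv(A) @ (y - A @ x_g), 0.0, None)

print("relative L1 error of gravity prior :", np.abs(x_g - x_true).sum() / x_true.sum())
print("relative L1 error after correction :", np.abs(x_hat - x_true).sum() / x_true.sum())
```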
|
3 |
Inferência de emoções em fragmentos de textos obtidos do Facebook / Inference of emotions in text fragments obtained from Facebook. Medeiros, Richerland Pinto [UNESP], 27 April 2017.
This research analyzes the use of the maximum entropy statistical machine learning technique, applied to natural language processing tasks, for inferring emotions in texts obtained from the social network Facebook. The core concepts of natural language processing tasks and the underlying concepts of information theory were studied, together with a deeper treatment of an entropy-based model as a text classifier. The data used in this research were obtained from short texts, i.e., texts of at most 500 characters. The technique was applied within supervised machine learning: part of the collected data was used as examples labeled with a set of predefined classes, in order to induce the learning mechanism to select the most probable emotion class for each analyzed example. The proposed method achieved a mean accuracy of 90%, based on cross-validation.
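A minimal sketch of such a supervised maximum entropy (multinomial logistic regression) pipeline for short-text emotion classification, assuming scikit-learn, a tiny hypothetical training corpus, and default hyperparameters rather than the thesis's actual data and model:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Tiny illustrative corpus of short texts (<= 500 characters) labeled with emotion classes.
texts = [
    "I am so happy today, everything went great!",
    "This is wonderful news, I love it",
    "Feeling blessed and thankful for my friends",
    "I feel really sad and alone right now",
    "Such a terrible day, I just want to cry",
    "I miss her so much, my heart hurts",
    "I am furious, this is completely unacceptable",
    "Stop lying to me, I am so angry",
    "This endless waiting is absolutely infuriating",
]
labels = ["joy", "joy", "joy",
          "sadness", "sadness", "sadness",
          "anger", "anger", "anger"]

# Bag-of-words features feeding a maximum entropy (multinomial logistic regression) classifier.
model = make_pipeline(CountVectorizer(), LogisticRegression(max_iter=1000))
print("cross-validated accuracy:", cross_val_score(model, texts, labels, cv=3).mean())

model.fit(texts, labels)
print("predicted emotion:", model.predict(["what a great and happy surprise"])[0])
```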
|
4 |
Análise estocástica do comportamento dinâmico de estruturas via métodos probabilísticos / Stochastic analysis of structural dynamic behavior via probabilistic methods. Fabro, Adriano Todorovic, 2010.
Advisor: José Roberto de França Arruda. Master's dissertation, Universidade Estadual de Campinas, Faculdade de Engenharia Mecânica (Solid Mechanics and Mechanical Design).
The general objective of this dissertation is to bring to industrial practice tools for the modeling and analysis of linear mechanical systems with variability, as well as computational methodologies for uncertainty quantification, aimed at design applications. Techniques for the stochastic modeling and analysis of linear mechanical systems were studied and first applied to simple, computationally inexpensive structures through MATLAB simulations. A probabilistic approach to uncertainty modeling, based on the Maximum Entropy Principle, is proposed for the flexibility associated with an open, non-propagating crack in a rod modeled with the Spectral Element Method (SEM). An approach for treating random-field problems with SEM is also presented, using analytical solutions of the Karhunen-Loève decomposition; a formulation for Euler-Bernoulli beam elements is given, and an example in which the flexural stiffness is modeled as a Gaussian random field is treated. Finally, a stochastic analysis of the dynamic behavior of a hermetic compressor cap is proposed: a finite element approximation obtained with the Ansys software represents the deterministic behavior of the cap, two stochastic modeling approaches are compared, and experiments were performed on nominally identical cap samples, measuring natural frequencies under impact excitation, so that they could be compared with the theoretical results.
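As an illustration of the random-field ingredient mentioned above, the sketch below builds a discrete Karhunen-Loève expansion of a one-dimensional Gaussian random field for a beam's flexural stiffness; the exponential covariance kernel, grid, and numerical eigendecomposition are illustrative assumptions, whereas the dissertation uses analytical Karhunen-Loève solutions within the Spectral Element Method.

```python
import numpy as np

# Grid along the beam and an exponential covariance kernel (illustrative parameters).
n, length, corr_len, sigma = 200, 1.0, 0.2, 0.1
x = np.linspace(0.0, length, n)
C = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)

# Discrete Karhunen-Loeve expansion: eigendecomposition of the covariance matrix.
eigval, eigvec = np.linalg.eigh(C)
order = np.argsort(eigval)[::-1]                  # sort modes by decreasing energy
eigval, eigvec = eigval[order], eigvec[:, order]
m = np.searchsorted(np.cumsum(eigval) / eigval.sum(), 0.99) + 1   # keep 99% of the variance

# One realization of the flexural stiffness EI(x) = EI_mean * (1 + random field).
rng = np.random.default_rng(42)
xi = rng.standard_normal(m)                        # independent standard Gaussian coefficients
field = eigvec[:, :m] @ (np.sqrt(eigval[:m]) * xi)
EI_mean = 2.0e4                                    # nominal flexural stiffness (arbitrary units)
EI = EI_mean * (1.0 + field)
print(f"{m} KL modes retained; EI range: [{EI.min():.1f}, {EI.max():.1f}]")
```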
|
5 |
Spectral Estimation by Geometric, Topological and Optimization Methods. Enqvist, Per, January 2001.
|
6 |
Algorithmes de mise à l'échelle et méthodes tropicales en analyse numérique matricielle / Scaling algorithms and tropical methods in numerical matrix analysis. Sharify, Meisam, 01 September 2011.
Tropical algebra can be considered a relatively new field in mathematics. It appears in several areas such as optimization, synchronization of production and transportation, discrete event systems, optimal control, operations research, etc. The first part of this manuscript is devoted to applications of tropical algebra to numerical matrix analysis. We first consider the classical problem of estimating the roots of a univariate polynomial, and we prove several new bounds on the absolute values of the roots by exploiting tropical methods. These results are particularly useful for polynomials whose coefficients have different orders of magnitude. We then examine the problem of computing the eigenvalues of a matrix polynomial. Here we introduce a general scaling technique, based on tropical algebra, which applies in particular to the companion form. This scaling rests on the construction of an auxiliary tropical polynomial function depending only on the norms of the matrices. The roots (the points of non-differentiability) of this tropical polynomial provide a preliminary estimate of the absolute values of the eigenvalues. This is justified in particular by a new result showing that, under certain assumptions on the conditioning, there exists a group of eigenvalues bounded in norm, whose order of magnitude is given by the largest root of the auxiliary tropical polynomial; a similar result holds for a group of small eigenvalues. We show experimentally that this scaling improves numerical stability, particularly when the data have different orders of magnitude. We also study the problem of computing the tropical eigenvalues (the points of non-differentiability of the characteristic polynomial) of a tropical matrix polynomial. From a combinatorial point of view, this problem is equivalent to finding a coupling function: the value of a maximum-weight matching in a bipartite graph whose edges are weighted by convex piecewise-linear functions. We developed an algorithm that computes these tropical eigenvalues in polynomial time. In the second part of this thesis, we address the solution of very large optimal assignment problems, for which classical sequential algorithms are not efficient. We propose a new approach that exploits the link between the optimal assignment problem and the entropy maximization problem. This approach leads to a preprocessing algorithm for the optimal assignment problem, based on an iterative method that eliminates the entries not belonging to an optimal assignment. We consider two iterative variants of the preprocessing algorithm, one using the Sinkhorn method and the other using Newton's method. This preprocessing reduces the initial problem to a much smaller one in terms of memory requirements. We also introduce a new iterative method based on a modification of the Sinkhorn algorithm, in which a deformation parameter is slowly increased.
We prove that this iterative method (deformed Sinkhorn iteration) converges to a matrix whose nonzero entries are exactly those belonging to the optimal permutations. An estimate of the convergence rate is also given.
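A minimal sketch of the deformed Sinkhorn idea for preprocessing the optimal assignment problem; the entrywise deformation exp(beta * W), the schedule of beta values, and the support threshold below are illustrative assumptions and do not reproduce the thesis's exact scheme or its convergence-rate analysis.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def sinkhorn(M, n_iter=2000, tol=1e-10):
    """Alternate row/column normalization until M is (nearly) doubly stochastic."""
    M = M.copy()
    for _ in range(n_iter):
        M /= M.sum(axis=1, keepdims=True)
        M /= M.sum(axis=0, keepdims=True)
        if np.abs(M.sum(axis=1) - 1.0).max() < tol:
            break
    return M

rng = np.random.default_rng(1)
W = rng.random((8, 8))                 # weight matrix of a toy assignment problem

# Deformation: balance exp(beta * W) for increasing beta; as beta grows, the mass of the
# balanced matrix concentrates on entries that belong to optimal assignments.
for beta in (1, 5, 25, 125):
    D = sinkhorn(np.exp(beta * (W - W.max())))
    support = D > 1e-6                 # heuristic support threshold (candidate entries kept)
    print(f"beta={beta:>3}: {support.sum()} candidate entries kept out of {W.size}")

# Cross-check with an exact solver: the optimal assignment lies inside the final support.
rows, cols = linear_sum_assignment(W, maximize=True)
print("optimal entries inside final support:", bool(support[rows, cols].all()))
```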
|
7 |
An Exposition of Performance-Security Trade-offs in RANETs Based on Quantitative Network Models. Miskeen, Guzlan M.A.; Kouvatsos, Demetres D.; Habib Zadeh, Esmaeil, January 2013.
Security mechanisms, such as encryption and authentication protocols, require extra computing resources and therefore have an adverse effect upon the performance of robotic mobile wireless ad hoc networks (RANETs). Thus, an optimal performance and security trade-off should be one of the main aspects taken into consideration during the design, development, tuning and upgrading of such networks. In this context, an exposition is initially undertaken on the applicability of Petri nets (PNs) and queueing networks (QNs), in conjunction with their generalisations and hybrid integrations, as robust quantitative modelling tools for the performance analysis of discrete flow systems, such as computer systems, communication networks and manufacturing systems. To overcome some of the inherent limitations of these models, a novel hybrid modelling framework is explored for the quantitative evaluation of RANETs, where each robotic node is represented by an abstract open hybrid G-GSPN_QN model with head-of-line priorities, subject to combined performance and security metrics (CPSMs). The proposed model focuses on security processing and state-based control, and it is based on an open generalised stochastic PN (GSPN) with a gated multi-class 'On-Off' traffic and mobility model. Moreover, it employs a power consumption model and is linked in tandem with an arbitrary QN consisting of finite capacity channel queues with blocking for 'intra' robot component-to-component communication and 'inter' robot-to-robot transmission. Conclusions and future research directions are included.
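The hybrid G-GSPN_QN model is beyond the scope of a short snippet, but the underlying performance-security trade-off can be illustrated with an elementary M/M/1 approximation in which security processing adds per-packet overhead and thus lowers the effective service rate; the arrival rate, service rate, and overhead factors below are purely illustrative assumptions.

```python
# Elementary M/M/1 illustration: stronger security -> more per-packet processing ->
# lower effective service rate -> higher mean response time (performance cost).
arrival_rate = 80.0            # packets per second offered to a robotic node (assumed)
base_service_rate = 120.0      # packets per second with no security processing (assumed)

security_overhead = {          # fraction of extra processing per packet (assumed values)
    "none": 0.00,
    "authentication only": 0.10,
    "authentication + encryption": 0.30,
}

for level, overhead in security_overhead.items():
    mu = base_service_rate / (1.0 + overhead)        # effective service rate
    rho = arrival_rate / mu                          # utilization
    if rho >= 1.0:
        print(f"{level:>28}: unstable (utilization {rho:.2f} >= 1)")
        continue
    mean_response_ms = 1000.0 / (mu - arrival_rate)  # M/M/1 mean response time, in ms
    print(f"{level:>28}: utilization {rho:.2f}, mean response {mean_response_ms:.1f} ms")
```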
|