81 |
The effects of cultural differences on business communication: A study in OnlineFactory.asia
Odell, Fanny; Näsberg, Victoria (January 2020)
Cultural differences have always been a source of barriers to communication, and globalization is making them an increasingly important aspect of how business communication is conducted. Organizations therefore need the skill set to deal with the cultural differences that a cross-cultural business environment entails. To understand the effect of cultural differences on business communication, specifically between the Swedish and Filipino cultures at OnlineFactory.asia, a qualitative research approach was used to conduct a case study. Interviews were conducted with two employees, one of Filipino descent and the other with both Swedish and Filipino roots. The study follows the structure laid out in the frame of reference, organized around its three main subjects: globalization, culture and communication, and business communication. The interview data showed that cultural differences have a significant impact on business communication at OnlineFactory.asia. The two cultures studied differ in cultural values, which in turn affected their communication. The Hofstede framework was used to analyze the differences between the two cultures and served as guidance throughout the study. Factors such as language barriers, adaptability, and cultural values were shown to affect the way the two employees communicate in business. OnlineFactory.asia's educational program is a way to turn cultural differences into a competitive advantage: understanding each other's cultural differences provides the tools needed to minimize the miscommunication that often occurs in a cross-cultural workplace.
|
82 |
Modeling the speed-accuracy tradeoff using the tools of information theory / Modélisation du compromis vitesse-précision d'une tâche de pointage humain à l'aide des outils de la théorie de l'information
Gori, Julien (20 December 2018)
Fitts' law, which relates movement time MT in a pointing task to the target distance D and width W, is usually expressed by mimicking Shannon's capacity formula: MT = a + b log2(1 + D/W). Yet the currently received analysis is incomplete and unsatisfactory: it stems from a vague analogy between pointing and the transmission of a signal over a noisy channel, with no explicit communication model. I first develop a transmission model for pointing tasks in which the index of difficulty ID = log2(1 + D/W) is the expression of both a source entropy and a channel capacity, thereby reconciling Fitts' approach with Shannon's information theory. This model is then leveraged to analyze pointing data gathered from controlled experiments as well as from field studies. I then develop a second model, built on the strong variability characteristic of human movement, which accounts for the tremendous diversity displayed by movement control: with or without feedback, intermittent or continuous. From a chronometry of the positional variance, evaluated over a set of trajectories, it is observed that movement can be separated into two phases: a first, in which the variance increases over time and most of the distance to the target is covered, is followed by a second, in which the variance decreases until it satisfies the accuracy constraints required by the task. During this second phase, the problem of aiming can be reduced to a Shannon-like communication problem in which information is transmitted from a "source" (the variance at the end of the first phase) to a "destination" (the limb extremity) over a channel perturbed by Gaussian noise, with a feedback link. I show that the optimal solution to this transmission problem amounts to a scheme first suggested by Elias, that the variance can decrease at best exponentially during the second phase, and that this result directly induces Fitts' law.
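As a quick illustration of this Shannon-style formulation, the sketch below (Python with NumPy, on made-up pointing data) computes the index of difficulty and fits the two Fitts' law coefficients by least squares; the data values are assumptions for demonstration, not measurements from the thesis.

```python
import numpy as np

def index_of_difficulty(D, W):
    """Shannon-form index of difficulty ID = log2(1 + D/W), in bits."""
    return np.log2(1 + D / W)

# Hypothetical pointing data: target distance D and width W (same units),
# with measured movement times MT in seconds.
D = np.array([64, 128, 256, 512, 512], dtype=float)
W = np.array([32, 32, 32, 32, 16], dtype=float)
MT = np.array([0.45, 0.58, 0.71, 0.86, 0.98])

ID = index_of_difficulty(D, W)

# Least-squares fit of MT = a + b * ID; 1/b is the throughput in bits/s.
b, a = np.polyfit(ID, MT, 1)
print(f"a = {a:.3f} s, b = {b:.3f} s/bit, throughput = {1 / b:.1f} bits/s")
```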
|
83 |
Intensity of agricultural land use and climate effects on bird biodiversity along a Greek Natura 2000 site and implications for sustainable agro-management
Soulopoulou, Polyxeni (29 September 2021)
This work addresses the question of how certain climatic variables may be significantly related to alterations of avian biodiversity in a semi-agricultural Natura wetland site in Northern Greece. In particular, the research highlights the effects of climate and land-cover intensity on bird biodiversity in the Thermaikos Gulf and its importance for healthy ecosystem functioning. Maintaining a good state of conservation in the Thermaikos Gulf also has direct impacts on a larger scale, since it benefits the rest of the Natura wetland network through the connectivity provided by migratory birds. Furthermore, the methodology used here helps inform the science-based management of environments that support threatened and endangered wildlife, and it can be applied to other Mediterranean wetlands with similar weather conditions and agricultural land use. The alteration in the compositional diversity of bird abundances was studied at the species level from 2012 to 2017 in one of the most important Natura wetland sites in Northern Greece, using different biodiversity indices. Shannon entropy was lowest in 2012 (DH = 1.509) and remained at similar levels from 2013 onwards. The highest values of Shannon entropy were recorded in 2014 (DH = 2.927) and 2016 (DH = 2.888), suggesting higher diversity compared with the other observation years, especially 2012. The yearly trends of the Simpson dominance index and the Gini-Simpson index followed quite similar patterns. The Berger-Parker index, DD, which represents the maximum proportion of any species estimated in the sample assemblage, had its highest values in 2012 (DD = 0.58) and 2017 (DD = 0.39) and its lowest in 2014 (DD = 0.13) and 2016 (DD = 0.15). A complete characterization of diversity was obtained by projecting the Hill numbers and the Rényi entropy, parameterized by the order q, as an empirical curve. According to the Hill numbers pooled over the years, species richness (q = 0) was estimated at 31 species, the effective number of typical species (q = 1) was 13, and the effective number of dominant species (q = 2) was 8. Quantifying bird biodiversity patterns in this research area is a fundamental task for evaluating current management actions, improving conservation, and designing future management strategies. Moreover, the interplay between temperature, relative humidity, and three bird biodiversity indices (Shannon entropy, Simpson's dominance index, and the Berger-Parker index) was also examined. Using different modeling approaches, both parametric and non-parametric multivariate models, we sought a consensus on the interrelationships between climate and avian biodiversity. In particular, it is shown that in most cases nonlinear models and surface-plot analysis capture a considerable increase in the estimated biodiversity indices with increased temperature and rain levels. Thus, biodiversity is affected to a significant extent by these climate factors at a proximate level, involving synergies between them. Finally, the combined effect of climate variables and remote-sensing land-cover indicators on bird richness was also explored, to detect any influence on bird diversity due to agricultural intensification.

In particular, the association between bird richness and environmental drivers, as well as remotely sensed land-cover indices, was explored for seven successive seasons using correlation analysis and a Box-Cox-transformed multivariate linear model. Three climate variables were tested (mean temperature, rain level, and mean relative humidity) along with three land-cover indices: the Normalized Difference Vegetation Index (NDVI), the Atmospherically Resistant Vegetation Index (ARVI), and an Agricultural Band Combination Index (ABCI). Among the environmental drivers explored, temperature, rain levels, and ABCI were significantly correlated with bird richness, in contrast to NDVI and ARVI, which showed lower correlations, while relative humidity displayed the poorest correlation. Additionally, the multivariate linear model indicates that temperature, rain levels, and ABCI have a statistically significant effect (p < 0.05) on bird species richness, accounting for 73.02% of the data variability. Based on the overall model results and the related 3D contour-plot simulations, we conclude that bird species richness increases with increasing temperature and rain levels, as well as with decreasing agricultural intensity (ABCI). In conclusion, in most cases temperature, rain levels, and agricultural intensity influenced bird richness in a combined manner, and agricultural intensification mostly resulted in a loss of bird richness. Understanding the factors that affect biodiversity is of great importance for rational land-use planning and the conservation management of semi-natural areas. Agriculture is the main driving force influencing the topographic and biological diversity of Europe, having shaped the natural landscape of the European countryside for thousands of years. Revealing the potential interrelationships between biodiversity, climate drivers, and landscape indicators, although a complex and challenging task, contributes to our understanding of the mechanisms connecting climate change with ecosystem functioning. Moreover, a better understanding of biodiversity functioning in relation to human activities in protected natural areas, as well as to climate, is essential for biodiversity awareness and for the design of effective conservation management policies.
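For reference, the diversity measures named above can all be computed from a single species-count vector; the sketch below (Python/NumPy, with a hypothetical abundance vector, not the study's data) shows Shannon entropy, the Simpson and Gini-Simpson indices, the Berger-Parker index, and Hill numbers of order q = 0, 1, 2.

```python
import numpy as np

def hill_number(counts, q):
    """Hill number (effective number of species) of order q."""
    p = np.asarray(counts, dtype=float)
    p = p[p > 0] / p.sum()
    if q == 1:
        return np.exp(-np.sum(p * np.log(p)))  # limit q -> 1: exp(Shannon H)
    return np.sum(p ** q) ** (1.0 / (1.0 - q))

# Hypothetical abundance vector for one survey year (8 species).
counts = np.array([120, 80, 40, 25, 10, 5, 3, 2], dtype=float)
p = counts / counts.sum()

shannon = -np.sum(p * np.log(p))   # Shannon entropy H
simpson = np.sum(p ** 2)           # Simpson dominance index
gini_simpson = 1 - simpson         # Gini-Simpson index
berger_parker = p.max()            # Berger-Parker dominance DD

print(f"H = {shannon:.3f}, Simpson = {simpson:.3f}, "
      f"Gini-Simpson = {gini_simpson:.3f}, DD = {berger_parker:.2f}")
for q in (0, 1, 2):
    print(f"Hill number of order q = {q}: {hill_number(counts, q):.2f}")
```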
|
84 |
Cryptographic Key Extraction and Neural Leakage Estimation
Bergström, Didrik (January 2024)
We investigate the extraction of cryptographic keying material from nano-scale variations of digital circuit outputs by using nested polar codes and neural leakage estimators. A runtime-efficient algorithm is developed to simulate such a system. A certain family of digital circuit outputs is known to be a source of randomness that can serve as a unique identifier for each circuit. By generating secret keys from these unique outputs, one can apply cryptographic methods that use the secret keys as seeds. Since digital circuit outputs are noisy, extra helper data, generated the first time the outputs are measured, must be stored so that the same key can be reconstructed from every measurement of the same circuit. The secret keys and helper data are generated following a nested polar code construction, and in this thesis neural networks are used to estimate the Shannon entropy of the secret key and the secrecy leakage to a passive attacker. The estimators illustrate, for the first time, that the system generates secret keys of almost maximum entropy and negligible secrecy leakage for practical cryptographic systems, provided the digital circuit outputs can be preprocessed into almost independent and identically distributed (i.i.d.) outputs following a binary uniform distribution. The algorithm design is evaluated, improvements for lower runtime are suggested, and ideas for future research are presented.
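As a toy illustration of the entropy question (and not the neural estimator used in the thesis), the sketch below estimates the per-bit marginal Shannon entropy of a batch of extracted keys; for a well-preprocessed i.i.d. uniform source it should approach 1 bit per key bit. The keys here are simulated uniform bits, an assumption for demonstration only.

```python
import numpy as np

def empirical_entropy_per_bit(keys):
    """Marginal Shannon entropy per bit position, in bits, for an array of
    binary keys of shape (num_keys, key_len). A near-uniform source should
    give values close to 1.0 at every position."""
    p1 = keys.mean(axis=0)              # P(bit = 1) at each position
    p1 = np.clip(p1, 1e-12, 1 - 1e-12)  # avoid log(0)
    return -(p1 * np.log2(p1) + (1 - p1) * np.log2(1 - p1))

rng = np.random.default_rng(0)
keys = rng.integers(0, 2, size=(10_000, 128))  # stand-in for extracted keys
print(empirical_entropy_per_bit(keys).mean())  # close to 1.0 bit per key bit
```

Note this marginal estimator ignores dependencies between bit positions; capturing those is precisely what motivates the neural leakage estimators the abstract describes.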
|
85 |
A entropia segundo Claude Shannon: o desenvolvimento do conceito fundamental da teoria da informação / Entropy according to Claude Shannon: the development of the fundamental concept of information theory
Pineda, José Octávio de Carvalho (20 April 2006)
The objective of this dissertation is to investigate the origins of the concept of entropy as defined by Claude Shannon in the development of information theory, as well as the influence that this and other concepts of the same theory have had on other sciences, especially physics.
Starting from its origin in statistical mechanics, the concept of entropy was transformed by Shannon into a measure of the amount of information. Since then, the approach proposed by information theory has influenced other areas of knowledge, and there have been many attempts to integrate it with physical theories. An analysis of the works of the main authors of information theory, viewed from a historical perspective, together with an analysis of the proposals for its integration with physics, shows that the integration currently stands at the level of approaches to physical problems, and not at the more fundamental level that some scientists expected.
|
86 |
Utilisation de la notion de copule en tomographie / Using the notion of copula in tomography
Pougaza, Doriano-Boris (16 December 2011)
This thesis studies the relationship between computed tomography (CT) and the notion of copula. In X-ray tomography, the objective is to (re)construct an image representing the hidden structure of an object (a density of matter, the distribution of a physical quantity, or a joint probability density) from data obtained or measured all around the object, called projections. The link between these images and the object is described by the X-ray transform or the Radon transform. In 2D, when only two projections at the angles 0 and pi/2 (horizontal and vertical) are available, the problem can be identified with another important problem in mathematics: the determination of a joint density from its marginals, hence the notion of copula via Sklar's theorem. With only two projections, both problems are ill-posed in the sense of Hadamard, and prior information or additional constraints must be added. The main contribution of this thesis is the use of several entropy criteria (Rényi, Tsallis, Burg, Shannon) as constraints that provide a regularized solution to this ill-posed inverse problem. The work therefore covers different areas: the mathematical aspects of tomography, through its fundamental element, the Radon transform, which in general does not provide all the necessary projections and must be combined with regularization techniques; and probability theory, through the search for a joint distribution with known marginals. By equating the two (normalized) projections with marginal densities and the image to be reconstructed with a probability density, the two problems become equivalent and can be transposed into the statistical framework. Copulas, the tool of probability theory that characterizes all possible images to be reconstructed, then become suitable, and the choice among copulas, or images, is made by imposing an a priori information criterion based on different entropies. Entropy is an important scientific quantity because it is used in various domains (thermodynamics, information theory, and others). Using the Rényi entropy, for example, we have discovered new classes of copulas. This thesis thus brings new contributions to imaging through the interaction between tomography and probability theory and statistics.
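A minimal numerical illustration of the two-projection problem: among all joint densities with two given marginals, the one of maximum Shannon entropy is their product, that is, the independence copula. The sketch below (Python/NumPy, with made-up projection vectors) builds that joint and checks that it reproduces the marginals; the other entropies studied in the thesis lead to different, non-product solutions.

```python
import numpy as np

# Two hypothetical 1D projections (marginals) of an unknown 2D image,
# normalized to sum to 1 so they can be read as probability vectors.
px = np.array([0.1, 0.2, 0.4, 0.2, 0.1])  # horizontal projection
py = np.array([0.3, 0.4, 0.3])            # vertical projection

# Among all joint distributions with these marginals, the outer product
# (independence copula) is the one of maximum Shannon entropy.
joint = np.outer(px, py)

assert np.allclose(joint.sum(axis=1), px)  # marginalizing over y recovers px
assert np.allclose(joint.sum(axis=0), py)  # marginalizing over x recovers py
```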
|
87 |
Parametric, Non-Parametric and Statistical Modeling of Stony Coral Reef Data
Hoare, Armando (8 April 2008)
Like coral reefs worldwide, the Florida Reef Tract has declined dramatically within the past two decades. Monitoring of 40 sites throughout the Florida Keys National Marine Sanctuary has taken a multiple-parameter approach to assessing spatial and temporal changes in the status of the ecosystem. The objectives of the present study are as follows:
In chapter one, we review past coral reef studies, with emphasis on recent studies of the stony corals of reefs in the lower Florida Keys. We also review the economic impact of coral reefs on the state of Florida. In chapter two, we identify the underlying probability distribution of the stony coral cover proportions and obtain better estimates of their statistical properties. Furthermore, we improve existing procedures for constructing confidence intervals for the true median and mean of the underlying probability distribution.
In chapter three, we investigate the applicability of the normal probability distribution assumption placed on the pseudovalues obtained from the jackknife procedure for the Shannon-Wiener diversity index used in previous studies, and we investigate a new and more effective approach to estimating the Shannon-Wiener and Simpson's diversity indices.
In chapter four, we develop the best possible estimate of the probability distribution of the jackknife pseudovalues for the Shannon-Wiener diversity index used in previous studies, using a nonparametric kernel density estimation method. This nonparametric procedure gives very effective estimates of the statistical measures of the jackknife pseudovalues.
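For concreteness, here is a minimal sketch of the jackknife pseudovalue construction for the Shannon-Wiener index discussed in chapters three and four (Python/NumPy); the leave-one-sample-out scheme and the Poisson abundances are illustrative assumptions, not the study's data.

```python
import numpy as np

def shannon_wiener(counts):
    """Shannon-Wiener diversity H' = -sum p_i ln p_i for a count vector."""
    p = np.asarray(counts, dtype=float)
    p = p[p > 0] / p.sum()
    return -np.sum(p * np.log(p))

def jackknife_pseudovalues(sample_counts):
    """Leave-one-out pseudovalues of H' over n independent survey samples.
    sample_counts: (n, S) array of species counts per sample."""
    n = len(sample_counts)
    total = sample_counts.sum(axis=0)
    h_all = shannon_wiener(total)
    pseudo = np.empty(n)
    for i in range(n):
        h_loo = shannon_wiener(total - sample_counts[i])  # drop sample i
        pseudo[i] = n * h_all - (n - 1) * h_loo
    return pseudo

rng = np.random.default_rng(1)
samples = rng.poisson(lam=[30, 20, 10, 5, 2], size=(12, 5))  # 12 samples, 5 species
pv = jackknife_pseudovalues(samples)
se = pv.std(ddof=1) / np.sqrt(len(pv))
print(f"jackknife estimate of H': {pv.mean():.3f} +/- {se:.3f}")
```

The classical confidence interval built from these pseudovalues assumes they are approximately normal; whether that assumption holds is exactly what chapter three examines.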
Lastly, the present study develops a predictive statistical model for stony coral cover. In addition to identifying the attributable variables that influence the stony coral cover data of the lower Florida Keys, we investigate the possible interactions present. The final form of the developed statistical model gives good estimates of stony coral cover given some information on the attributable variables. Our nonparametric and parametric approaches to analyzing coral reef data provide a sound basis for developing efficient ecosystem models that estimate future trends in coral reef diversity. This will give scientists and managers another tool to help monitor and maintain a healthy ecosystem.
|
88 |
Prise en compte des hétérogénéités dans la restitution de l'eau nuageuse et des précipitations par radiométrie micro-onde passive / Accounting for heterogeneities in the retrieval of cloud water and precipitation by passive microwave radiometry
Lafont, Damien (27 January 2005)
Satellite observation of precipitation using microwave (MW) radiometers is one of the foundations of climate studies. However, partial filling of the radiometer field of view leads to underestimation of the retrieved rain rates (the beam-filling effect, BFE). Using simulations of clouds and radiative transfer, the relationship between the BFE and the sub-pixel fractional cloud cover (CF) is studied. The results show a dependence on cloud type and a maximum for low cloud cover. Since CF is the main factor to take into account when correcting MW estimates, microwave brightness temperatures (TB) are coupled with CF as inputs to a neural-network algorithm. To test this approach, MW, radar, and IR satellite data are used, and two sub-pixel cloud covers (derived from the IR and radar measurements) are combined with the TB. The results are encouraging and show the advantage of synergy between different sensors. With a view to integrating a classification into the retrievals, a mesoscale classification method was also developed; it is obtained from IR data and does not require a clear-sky/cloudy-sky threshold.
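A toy demonstration of the beam-filling effect: with a hypothetical saturating brightness-temperature response TB(R), averaging TB over a partially filled pixel and then inverting under a uniform-fill assumption underestimates the pixel-mean rain rate. All functions and values below are invented for illustration; they are not the simulation models used in the thesis.

```python
import numpy as np

# Hypothetical saturating (convex, decreasing) brightness-temperature
# response to rain rate R (mm/h), e.g. a scattering-type signal.
def tb_of_rain(R):
    return 280.0 - 100.0 * (1.0 - np.exp(-0.3 * R))

def rain_of_tb(tb):
    # Exact inverse of tb_of_rain, i.e. a "perfect" uniform-pixel retrieval.
    return -np.log(1.0 - (280.0 - tb) / 100.0) / 0.3

CF, R_true = 0.4, 10.0  # 40% of the pixel rains at 10 mm/h, the rest is clear
tb_pixel = CF * tb_of_rain(R_true) + (1 - CF) * tb_of_rain(0.0)

R_retrieved = rain_of_tb(tb_pixel)       # assumes a uniformly filled pixel
print(f"retrieved {R_retrieved:.2f} mm/h "
      f"vs true pixel mean {CF * R_true:.2f} mm/h")  # BFE: underestimation
```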
|
89 |
Analyse et construction de codes LDPC non-binaires pour des canaux à évanouissement / Analysis and design of non-binary LDPC codes for fading channels
Gorgolione, Matteo (25 October 2012)
Over the past 15 years, spectacular progress in the analysis and design of codes defined on bipartite graphs and decodable by iterative algorithms has enabled the development of error-correction systems with performance ever closer to the theoretical Shannon limit. A determining role in this context was played by the family of codes with sparse parity-check matrices, called LDPC (Low-Density Parity-Check) codes, introduced by Gallager in the early 1960s and later described in terms of bipartite graphs. Neglected for many years, these codes were rediscovered in the late 1990s, after the power of iterative decoding had been demonstrated by the invention of Turbo codes. Only in the early 2000s were the techniques needed to analyze and optimize LDPC codes developed, techniques that subsequently enabled the construction of codes with asymptotic performance close to the Shannon limit. This remarkable advance motivated growing interest from the scientific community and supported the rapid transfer of the technology to industry. More recently, particular interest has been paid to LDPC codes defined over non-binary alphabets, notably because of their better error-correction performance at finite length. Although Gallager had already proposed the use of non-binary alphabets based on modular arithmetic, non-binary LDPC codes defined over finite fields were studied only from the late 1990s onwards. These codes have been shown to outperform their binary counterparts when the coded block length is small to moderate, or when the symbols transmitted over the channel are themselves non-binary, as for example with higher-order modulations or multiple-antenna channels. However, this performance gain comes at a non-negligible cost in decoding complexity, which can hinder the use of non-binary LDPC codes in real systems, especially when the price paid in complexity outweighs the gain in performance. This thesis deals with the analysis and design of non-binary LDPC codes for fading channels. Its main objective is to show that, beyond the gain in error-correction capability, the use of non-binary LDPC codes can bring additional benefits that may offset the increase in decoder complexity. Flexibility and diversity are the two benefits demonstrated in this thesis. Flexibility is the ability of a coding system to adapt to variable rates while using the same encoder and decoder. Diversity refers to its ability to fully exploit the heterogeneity of the communication channel. The first contribution of this thesis is a method for approximating the density evolution of non-binary LDPC codes, based on Monte Carlo simulation of an "infinite" code. We show that the proposed method provides very accurate estimates of the asymptotic performance of non-binary LDPC codes and makes it possible to optimize these codes for a wide range of applications and channel models.
The second contribution concerns the analysis and design of a flexible coding system using puncturing techniques. We show that non-binary LDPC codes are more robust to puncturing than binary codes, thanks to the fact that non-binary symbols can be partially punctured. For regular codes, we show that puncturing non-binary codes obeys different rules depending on whether degree-2 symbols or symbols of higher degree are punctured. For irregular codes, we propose an optimization procedure for the "puncturing distribution", which specifies the fraction of punctured bits per non-binary symbol as a function of the symbol degree. We then present puncturing distributions optimized for non-binary LDPC codes, with performance only 0.2 to 0.5 dB from capacity for punctured rates ranging from 0.5 to 0.9. The third contribution concerns non-binary LDPC codes transmitted over a fast Rayleigh fading channel, in which each modulated symbol is affected by a different fading coefficient. In the case of a one-to-one correspondence between coded symbols and modulated symbols (that is, when the code is defined over a finite field of the same cardinality as the constellation used), some coded symbols can be completely drowned in noise owing to deep channel fades. To avoid this phenomenon, we use a bit-level interleaver placed between the encoder and the modulator. At the receiver, the de-interleaver provides bit diversity at the decoder input, mitigating the effects of the different fading coefficients. We propose an optimized interleaving algorithm inspired by the Progressive Edge-Growth (PEG) algorithm: the bipartite graph of the code is extended with a new set of nodes representing the modulated symbols, and the proposed algorithm establishes connections between the nodes representing the modulated symbols and those representing the coded symbols so as to obtain an extended graph of maximal girth. We show that the optimized interleaver yields a performance gain over a random interleaver, both in error correction and in error detection. Finally, the fourth contribution is a flexible coding scheme that achieves the maximum diversity of a block-fading channel. The particularity of our approach is the use of non-binary Root-LDPC codes coupled with non-binary multiplicative codes, so that the coding rate can easily adapt to the number of fading blocks. At the receiver, a simple diversity-combining technique is used at the decoder input. As a consequence, the decoding complexity remains unchanged whatever the number of fading blocks and the code rate used, while the proposed technique brings a real benefit in error-correction capability.
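To make the "distance from capacity" figures concrete, the short sketch below computes the Shannon-limit Eb/N0 for the unconstrained-input AWGN channel at the two rate endpoints quoted above; this is a textbook bound for reference, not a computation from the thesis, and the binary-input limit relevant to punctured LDPC codes is slightly higher.

```python
import numpy as np

# Shannon limit on Eb/N0 for a rate-R code over the real AWGN channel:
# C = (1/2) log2(1 + 2*R*Eb/N0) >= R  =>  Eb/N0 >= (2^(2R) - 1) / (2R).
for R in (0.5, 0.9):
    ebn0_min = (2 ** (2 * R) - 1) / (2 * R)
    print(f"R = {R}: Eb/N0 >= {10 * np.log10(ebn0_min):.2f} dB")
```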
|
90 |
On Generalized Measures of Information with Maximum and Minimum Entropy Prescriptions
Dukkipati, Ambedkar (03 1900)
Kullback-Leibler relative-entropy, or KL-entropy, of P with respect to R, defined as ∫_X ln(dP/dR) dP, where P and R are probability measures on a measurable space (X, 𝒳), plays a basic role in the definitions of classical information measures. It overcomes a shortcoming of Shannon entropy: the discrete-case definition of Shannon entropy cannot be extended naturally to the non-discrete case. Further, entropy and other classical information measures can be expressed in terms of KL-entropy, and hence the properties of their measure-theoretic analogs follow from those of measure-theoretic KL-entropy. An important theorem in this respect is the Gelfand-Yaglom-Perez (GYP) theorem, which equips KL-entropy with a fundamental definition and can be stated as: measure-theoretic KL-entropy equals the supremum of KL-entropies over all measurable partitions of X. In this thesis we provide measure-theoretic formulations for "generalized" information measures, and state and prove the corresponding GYP theorem, the "generalizations" being in the sense of Rényi and of nonextensive entropy, both of which are explained below.
The Kolmogorov-Nagumo average, or quasilinear mean, of a vector x = (x_1, ..., x_n) with respect to a pmf p = (p_1, ..., p_n) is defined as ⟨x⟩_ψ = ψ⁻¹(Σ_{k=1}^n p_k ψ(x_k)), where ψ is an arbitrary continuous and strictly monotone function. Replacing the linear averaging in Shannon entropy with Kolmogorov-Nagumo averages (KN-averages), and further imposing the additivity constraint (a characteristic property of the underlying information associated with a single event, which is logarithmic), leads to the definition of α-entropy, or Rényi entropy. This is the first formal, well-known generalization of Shannon entropy. Using this recipe of Rényi's generalization, one can prepare only two information measures: Shannon entropy and Rényi entropy. Indeed, using this formalism Rényi characterized these additive entropies in terms of axioms on KN-averages. On the other hand, if one generalizes the information of a single event in the definition of Shannon entropy by replacing the logarithm with the so-called q-logarithm, defined as ln_q x = (x^(1−q) − 1)/(1 − q), one gets what is known as Tsallis entropy. Tsallis entropy is also a generalization of Shannon entropy, but it does not satisfy the additivity property. Instead, it satisfies a pseudo-additivity of the form x ⊕_q y = x + y + (1 − q)xy, and hence it is also known as nonextensive entropy. One can apply Rényi's recipe in the nonextensive case by replacing the linear averaging in Tsallis entropy with KN-averages, thereby imposing the constraint of pseudo-additivity. A natural question that arises is: what are the various pseudo-additive information measures that can be prepared with this recipe? We prove that Tsallis entropy is the only one. Here, we mention that one important characteristic of this generalized entropy is that, while the canonical distributions resulting from the maximization of Shannon entropy are exponential in nature, in the Tsallis case they are power-law distributions.
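A small numerical check of these definitions (Python/NumPy, with arbitrary example distributions): it computes Shannon, Rényi, and Tsallis entropies and verifies the pseudo-additivity of Tsallis entropy on an independent joint distribution.

```python
import numpy as np

def shannon(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def renyi(p, alpha):
    # Rényi entropy of order alpha (alpha != 1); tends to Shannon as alpha -> 1.
    return np.log(np.sum(p ** alpha)) / (1 - alpha)

def tsallis(p, q):
    # Tsallis entropy S_q; tends to Shannon as q -> 1.
    return (1 - np.sum(p ** q)) / (q - 1)

p = np.array([0.5, 0.25, 0.25])  # arbitrary example distributions
r = np.array([0.6, 0.4])
q = 0.7

# Pseudo-additivity on an independent joint distribution P x R:
# S_q(P x R) = S_q(P) + S_q(R) + (1 - q) * S_q(P) * S_q(R)
joint = np.outer(p, r).ravel()
lhs = tsallis(joint, q)
rhs = tsallis(p, q) + tsallis(r, q) + (1 - q) * tsallis(p, q) * tsallis(r, q)
assert np.isclose(lhs, rhs)

# Shannon and Rényi entropies, by contrast, are additive on P x R.
assert np.isclose(shannon(joint), shannon(p) + shannon(r))
assert np.isclose(renyi(joint, 2.0), renyi(p, 2.0) + renyi(r, 2.0))
```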
The concept of maximum entropy (ME), originally from physics, has been promoted to a general principle of inference primarily through the works of Jaynes and, later on, Kullback. This connects information theory and statistical mechanics via the principle that the states of thermodynamic equilibrium are states of maximum entropy, and it further connects to statistical inference via the prescription: select the probability distribution that maximizes the entropy. The two fundamental principles related to the concept of maximum entropy are Jaynes' maximum entropy principle, which involves maximizing Shannon entropy, and Kullback's minimum entropy principle, which involves minimizing relative-entropy, with respect to appropriate moment constraints.
Though relative-entropy is not a metric, in cases involving distributions resulting from relative-entropy minimization one can bring forth certain geometrical formulations. These are reminiscent of the squared Euclidean distance and satisfy an analogue of the Pythagorean theorem. This property is referred to as the Pythagorean theorem of relative-entropy minimization, or the triangle equality, and it plays a fundamental role in geometrical approaches to statistical estimation theory such as information geometry. In this thesis we state and prove the equivalent of the Pythagorean theorem in the nonextensive formalism. For this purpose we study relative-entropy minimization in detail and present some results.
Finally, we demonstrate the use of power-law distributions, resulting from ME prescriptions of Tsallis entropy, in evolutionary algorithms. This work is motivated by the recently proposed generalized simulated annealing algorithm based on Tsallis statistics.
To sum up, in light of their well-known axiomatic and operational justifications, this thesis establishes some results pertaining to the mathematical significance of generalized measures of information. We believe that these results represent an important contribution to the ongoing research on understanding the phenomena of information.
|