1 |
EXIT charts based analysis and design of rateless codes for the erasure and Gaussian channels
Mothi Venkatesan, Sabaresan, 02 June 2009 (has links)
Luby Transform Codes were the first class of universal erasure codes introduced
to fully realize the concept of scalable and fault‐tolerant distribution of data over
computer networks, a concept also called a Digital Fountain. Later, Raptor codes, a generalization of
LT codes, were introduced to trade off complexity against performance. In this work,
we show that an even broader class of codes exists that is near-optimal for the
erasure channel and that Raptor codes form a special case. More precisely, Raptor-like
codes can be designed based on an iterative (joint) decoding schedule wherein
information is transferred between the LT decoder and an outer decoder in an iterative
manner. The design of these codes can be formulated as an LP problem using EXIT charts
and density evolution. In our work, we show the existence of codes, other than
Raptor codes, that perform as well as the existing ones.
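As background for the LT decoder and the iterative (peeling) schedule mentioned above, here is a minimal, self-contained sketch of LT encoding and erasure decoding. The ideal soliton distribution, symbol values, and packet count are illustrative choices rather than the profiles designed in the thesis (practical LT codes use the robust soliton distribution):

```python
import random

def ideal_soliton(k):
    # Ideal soliton degree distribution: rho(1) = 1/k, rho(d) = 1/(d(d-1)).
    return [1.0 / k] + [1.0 / (d * (d - 1)) for d in range(2, k + 1)]

def lt_encode_symbol(source, dist, rng):
    # Draw a degree, pick that many distinct source symbols uniformly at
    # random, and XOR them into one encoded (fountain) symbol.
    degree = rng.choices(range(1, len(source) + 1), weights=dist)[0]
    neighbours = rng.sample(range(len(source)), degree)
    value = 0
    for i in neighbours:
        value ^= source[i]
    return neighbours, value

def lt_decode(packets, k):
    # Peeling (iterative) decoder: resolve degree-1 packets, then substitute
    # each recovered symbol into every packet that still references it.
    packets = [(set(n), v) for n, v in packets]
    decoded = {}
    progress = True
    while progress and len(decoded) < k:
        progress = False
        for n, v in packets:
            if len(n) == 1:
                i = next(iter(n))
                if i not in decoded:
                    decoded[i] = v
                    progress = True
        reduced = []
        for n, v in packets:
            for i in [i for i in n if i in decoded and len(n) > 1]:
                n.discard(i)
                v ^= decoded[i]
            reduced.append((n, v))
        packets = reduced
    return decoded  # maps recovered source index -> symbol value

rng = random.Random(0)
k = 8
source = [rng.randrange(256) for _ in range(k)]
dist = ideal_soliton(k)
packets = [lt_encode_symbol(source, dist, rng) for _ in range(30)]
recovered = lt_decode(packets, k)
```

On the erasure channel, the peeling decoder recovers a source symbol whenever some received packet reduces to degree one; the design of the degree profile controls how often this "ripple" survives until all symbols are recovered.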
We extend this framework of joint decoding of the component codes to the
additive white Gaussian noise channel and introduce the design of rateless codes for
this channel. Under this setting, for asymptotic lengths, it is possible to design codes
that work for a class of channels defined by the signal-to-noise ratio. In our work, we
show that good profiles can be designed using density evolution and a Gaussian
approximation. EXIT charts prove to be an intuitive tool and aid in formulating the code
design problem as an LP problem. EXIT charts are not exact because of their inherent
approximations; therefore, we use density evolution to analyze the performance of these codes. In the Gaussian case, we show that for asymptotic lengths a range of
rateless code designs exists to choose from, based on the required complexity and
overhead.
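To illustrate the density-evolution analysis invoked above, the following sketch runs sampled (Monte Carlo) density evolution for a regular (3,6) LDPC ensemble on the binary-input AWGN channel. The regular ensemble, population size, and noise levels are illustrative stand-ins for the irregular rateless profiles and Gaussian-approximation recursion designed in the thesis:

```python
import math
import random

def de_biawgn(sigma, dv=3, dc=6, iters=40, pop=8000, seed=0):
    # Monte Carlo (population-dynamics) density evolution for a regular
    # (dv, dc) LDPC ensemble on the binary-input AWGN channel, assuming the
    # all-zero codeword with BPSK mapping 0 -> +1. Tracks the population of
    # variable-to-check LLR messages across iterations.
    rng = random.Random(seed)
    # Channel LLRs: y = 1 + noise, LLR = 2*y / sigma^2.
    ch = [2.0 * (1.0 + rng.gauss(0.0, sigma)) / sigma ** 2 for _ in range(pop)]
    vc = ch[:]  # iteration 0: messages equal the channel LLRs
    for _ in range(iters):
        cv = []
        for _ in range(pop):
            # Check-node update (tanh rule) on dc-1 sampled incoming messages.
            p = 1.0
            for _ in range(dc - 1):
                p *= math.tanh(rng.choice(vc) / 2.0)
            p = max(min(p, 1 - 1e-12), -(1 - 1e-12))  # numerical guard
            cv.append(2.0 * math.atanh(p))
        # Variable-node update: channel LLR plus dv-1 sampled check messages.
        vc = [rng.choice(ch) + sum(rng.choice(cv) for _ in range(dv - 1))
              for _ in range(pop)]
    # Error-probability proxy: fraction of negative messages.
    return sum(1 for x in vc if x < 0) / pop
```

Below the ensemble threshold (sigma* is about 0.88 for the (3,6) code), the fraction of erroneous messages is driven to zero; above it, the recursion stalls at a nonzero fixed point. This is exactly the behaviour that a threshold or EXIT-chart analysis captures.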
Moreover, under this framework, we can design incremental-redundancy
schemes for already existing outer codes to make the communication system more
robust to channel-noise variations.
|
2 |
Analyse de performance d'un système d'authentification utilisant des codes graphiques / Performance Analysis of an Authentication Method relying on Graphical Codes
Mai Hoang, Bao An, 01 December 2014 (has links)
Nous étudions dans cette thèse l'influence d'un système d'authentification utilisant des codes graphiques 2D modifiés lors de l'impression par un procédé physique non clonable. Un tel procédé part du principe qu'à très haute résolution le système d'impression-acquisition peut être modélisé comme un processus stochastique, de par le caractère aléatoire de la disposition des fibres du papier, du mélange des particules d'encre, de l'adressabilité de l'imprimante ou encore du bruit d'acquisition. Nous considérons un scénario où l'adversaire pourra estimer le code original et essaiera de le reproduire en utilisant son propre système d'impression. La première solution que nous proposons pour réaliser l'authentification est d'utiliser un test d'hypothèse à partir des modèles a priori connus et sans mémoire des canaux d'impression-acquisition de l'imprimeur légitime et du contrefacteur. Dans ce contexte, nous proposons une approximation fiable des probabilités d'erreur via l'utilisation de bornes exponentielles et du principe des grandes déviations. Dans un second temps, nous analysons un scénario plus réaliste qui prend en compte une estimation a priori du canal du contrefacteur et nous mesurons l'impact de cette étape sur les performances du système d'authentification. Nous montrons qu'il est possible de calculer la distribution de la probabilité de non-détection et d'en extraire par exemple ses performances moyennes. La dernière partie de cette thèse propose d'optimiser, au travers d'un jeu minimax, le canal de l'imprimeur. / We study in this thesis the impact of an authentication system based on 2D graphical codes that are corrupted by physically unclonable noise, such as that introduced by a printing process.
The core of such a system is that a printing process at very high resolution can be modelled as a stochastic process and hence introduces noise; this is due to the nature of different elements such as the randomness of the paper fibers, the physical properties of the ink drops, the dot addressability of the printer, etc. We consider a scenario where the opponent may estimate the original graphical code and try to reproduce it using his own printing process in order to fool the receiver. Our first solution for authentication is to apply hypothesis testing to the observed memoryless sequences of a printed graphical code, under the assumption that we can perfectly model the printing process. The proposed approach derives error exponents via exponential bounds, as a direct application of the large deviation principle. Moreover, turning to a more practical scenario, we take into account the estimation of the printing process used to generate the opponent's graphical code, and we examine how this step impacts the performance of the authentication system. We show that it is possible both to compute the distribution of the probability of non-detection and to compute the average performance of the authentication system when the opponent's channel has to be estimated. The last part of this thesis addresses the optimization of the printing channel through a minimax game.
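As a small illustration of the hypothesis-testing approach, the sketch below runs a Neyman-Pearson test between two memoryless Gaussian print-channel models and compares Monte Carlo error rates against a Chernoff bound. The scalar Gaussian models, means, and sample sizes are hypothetical simplifications of the print-acquisition channels studied in the thesis:

```python
import math
import random

def llr(y, mu0, mu1, sigma):
    # Per-sample log-likelihood ratio log p1(y)/p0(y) for two Gaussian
    # print-channel models with common variance sigma^2.
    return ((y - mu0) ** 2 - (y - mu1) ** 2) / (2.0 * sigma ** 2)

def detects_forgery(samples, mu0, mu1, sigma, thresh=0.0):
    # Neyman-Pearson test on a memoryless sequence: declare a forgery (H1)
    # when the accumulated LLR exceeds the threshold.
    return sum(llr(y, mu0, mu1, sigma) for y in samples) > thresh

rng = random.Random(1)
mu0, mu1, sigma, n, trials = 0.0, 0.6, 1.0, 64, 2000

false_alarm = sum(
    detects_forgery([rng.gauss(mu0, sigma) for _ in range(n)], mu0, mu1, sigma)
    for _ in range(trials)) / trials
miss = 1.0 - sum(
    detects_forgery([rng.gauss(mu1, sigma) for _ in range(n)], mu0, mu1, sigma)
    for _ in range(trials)) / trials

# Chernoff bound at s = 1/2 for equal-variance Gaussians: both error
# probabilities decay at least as exp(-n * (mu1 - mu0)^2 / (8 * sigma^2)).
chernoff = math.exp(-n * (mu1 - mu0) ** 2 / (8.0 * sigma ** 2))
```

The exponential decay of both error probabilities in the sequence length n is the error-exponent behaviour that the large-deviation analysis makes precise.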
|
3 |
A Modified Sum-Product Algorithm over Graphs with Short Cycles
Raveendran, Nithin, January 2015 (has links) (PDF)
We investigate the limitations of the sum-product algorithm for binary low-density parity-check (LDPC) codes having isolated short cycles. The assumption of independence among the messages passed, which is made in all graph configurations, fails most severely
in graphical structures with short cycles. This research work is a step toward
understanding the effect of short cycles on the error floors of the sum-product algorithm.
We propose a modified sum-product algorithm that accounts for the statistical dependency
of the messages passed in a cycle of length 4. We also formulate the modified algorithm in
the log domain, which eliminates the numerical instability and precision issues associated
with the probability domain. Simulation results show a signal-to-noise ratio (SNR) improvement for the modified sum-product algorithm compared to the original algorithm.
This suggests that modelling the dependency among messages improves the decisions and successfully
mitigates the effects of length-4 cycles in the Tanner graph. The improvement is significant in the high-SNR region, suggesting a possible cause of the error-floor effects on such graphs. Using density evolution techniques, we analysed the modified decoding algorithm. The threshold computed for the modified algorithm is higher than that computed for the standard sum-product algorithm, validating the observed simulation results. We also prove that the conditional entropy of a codeword given the estimate obtained with the modified algorithm is lower than with the original sum-product algorithm.
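For reference, here is a minimal log-domain sum-product decoder of the standard (unmodified) kind discussed above, run on the (7,4) Hamming code, whose Tanner graph contains length-4 cycles. The code and channel LLRs are illustrative, and the dependency-aware modification proposed in the thesis is not reproduced here:

```python
import math

# Parity-check matrix of the (7,4) Hamming code; its Tanner graph contains
# length-4 cycles (e.g. bits 0 and 3 both participate in checks 0 and 1).
H = [
    [1, 1, 0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0, 1, 0],
    [0, 1, 1, 1, 0, 0, 1],
]

def sum_product(llr_ch, H, iters=20):
    # Standard log-domain sum-product decoding: tanh rule at check nodes,
    # LLR summation at variable nodes.
    m, n = len(H), len(H[0])
    edges = [(i, j) for i in range(m) for j in range(n) if H[i][j]]
    msg_vc = {e: llr_ch[e[1]] for e in edges}   # variable-to-check messages
    msg_cv = {e: 0.0 for e in edges}            # check-to-variable messages
    for _ in range(iters):
        for (i, j) in edges:
            # Check-node update: product of tanh(L/2) over the other edges.
            prod = 1.0
            for (i2, j2) in edges:
                if i2 == i and j2 != j:
                    prod *= math.tanh(msg_vc[(i2, j2)] / 2.0)
            prod = max(min(prod, 1 - 1e-12), -(1 - 1e-12))  # numerical guard
            msg_cv[(i, j)] = 2.0 * math.atanh(prod)
        for (i, j) in edges:
            # Variable-node update: channel LLR plus the other check messages.
            msg_vc[(i, j)] = llr_ch[j] + sum(
                msg_cv[e] for e in edges if e[1] == j and e[0] != i)
    # A-posteriori LLRs and hard decision.
    post = [llr_ch[j] + sum(msg_cv[e] for e in edges if e[1] == j)
            for j in range(n)]
    return [0 if L >= 0 else 1 for L in post]

# One weakly unreliable bit (negative LLR) in an all-zero codeword: the
# decoder corrects it despite the 4-cycles.
received_llrs = [2.0, 2.0, -1.5, 2.0, 2.0, 2.0, 2.0]
decoded = sum_product(received_llrs, H)
```

The modification studied in the thesis replaces the independence assumption implicit in the check-node product above with a message update that accounts for correlations introduced by 4-cycles.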
|