About
The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Coding techniques for insertion/deletion error correction

Cheng, Ling 04 June 2012 (has links)
D. Ing. / In Information Theory, synchronization errors can be modelled as the insertion and deletion of symbols. Error correcting codes are proposed in this research as a method of recovering from a single insertion or deletion error; adjacent multiple deletion errors; or multiple insertion, deletion and substitution errors. A moment balancing template is a single insertion or deletion correcting construction based on number theoretic codes. The implementation of this previously published technique is extended to spectral shaping codes, (d, k) constrained codes and run-length limited sequences. Three new templates are developed. The first one is an adaptation to DC-free codes, and the second one is an adaptation to spectral null codes. The third one is a generalized moment balancing template for both (d, k) constrained codes and run-length limited sequences. Following this, two new coding methods are investigated to protect a binary sequence against adjacent deletion errors. The first class of codes is a binary code derived from the Tenengolts non-binary single insertion or deletion correcting code, with additional selection rules. The second class of codes is designed by using interleaving techniques. The asymptotic cardinality bounds of these new codes are also derived. Compared to the previously published codes, the new codes are more flexible, since they can protect against any given fixed known length of adjacent deletion errors. Based on these two methods, a nested construction is further proposed to guarantee correction of adjacent deletion errors, up to a certain fixed number.
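
A minimal sketch of the number-theoretic construction this abstract builds on — the Varshamov-Tenengolts code VT_a(n) with Levenshtein's single-deletion decoding; the block length, the residue and the example word below are illustrative choices, not parameters from the thesis:

```python
# Minimal sketch of a Varshamov-Tenengolts (VT) single-deletion code, the
# number-theoretic construction that moment balancing builds on. The block
# length n, the residue a and the example word are illustrative choices only.

def vt_syndrome(x, n):
    """Weighted checksum sum_i i*x_i mod (n+1), with positions counted from 1."""
    return sum(i * b for i, b in enumerate(x, start=1)) % (n + 1)

def vt_decode_single_deletion(y, n, a):
    """Recover the VT_a(n) codeword from y, the codeword with one bit deleted."""
    w = sum(y)                              # weight of the received word
    d = (a - vt_syndrome(y, n)) % (n + 1)   # checksum deficiency
    if d <= w:
        # A 0 was deleted: reinsert it so that exactly d ones lie to its right.
        pos, ones = len(y), 0
        while ones < d:
            pos -= 1
            ones += y[pos]
        return y[:pos] + [0] + y[pos:]
    # A 1 was deleted: reinsert it so that exactly d - w - 1 zeros lie to its left.
    pos, zeros = 0, 0
    while zeros < d - w - 1:
        zeros += 1 - y[pos]
        pos += 1
    return y[:pos] + [1] + y[pos:]

n, a = 8, 0
x = [1, 0, 0, 1, 0, 1, 1, 0]                # checksum 1 + 4 + 6 + 7 = 18 = 0 mod 9
assert vt_syndrome(x, n) == a
for p in range(n):                          # delete every position in turn
    y = x[:p] + x[p + 1:]
    assert vt_decode_single_deletion(y, n, a) == x
print("all single deletions corrected")
```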
2

Códigos LDPC definidos sobre corpos de inteiros finitos / LDPC codes defined over finite integer fields

Dantas, Pâmela Joyce Silva Melo, 1985- 24 August 2018 (has links)
Advisor: Renato Baldini Filho / Dissertation (master's) - Universidade Estadual de Campinas, Faculdade de Engenharia Elétrica e de Computação / Made available in DSpace on 2018-08-24T14:17:23Z (GMT). No. of bitstreams: 1 Dantas_PamelaJoyceSilvaMelo_M.pdf: 584157 bytes, checksum: affc3041d38415e1af35e32a78ebd6e1 (MD5) Previous issue date: 2014 / Abstract: This dissertation presents a study on the feasibility of constructing and using LDPC (Low Density Parity Check) codes defined over finite fields of integers modulo p, where p is a prime. The modulation used to evaluate the performance of the resulting codes is p-PSK. LDPC codes defined over finite integer fields have a well-defined algebraic structure, are easily made invariant to carrier phase rotation in the modulation process, and can be made shorter than their binary equivalents. The iterative decoding method used to evaluate the performance of these codes is an adaptation of the SISO (Soft Input Soft Output) algorithm proposed by P. G. Farrell and J. Moreira [1], [2], which uses the Euclidean distance as the reliability measure for the received codeword symbols. The LDPC codes used in the simulation of channel encoding and decoding are defined over the integer field Z5. The communication channel was modeled with additive white Gaussian noise (AWGN) and with Rayleigh fading, in both cases using 5-PSK modulation. The performance of the LDPC coding schemes defined over Z5 was analyzed in comparison with equivalent binary and quaternary coding systems. Keywords: nonbinary LDPC codes, fields of integers modulo p, p-PSK modulation, AWGN channel, Rayleigh fading / Master's / Telecomunicações e Telemática / Master in Electrical Engineering
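
A toy sketch of the ingredients described above — a parity-check code over the integer field Z5 whose symbols are mapped onto a 5-PSK constellation; the small, dense matrix H and the symbol labeling are assumptions made for illustration, not the LDPC codes constructed in the dissertation:

```python
# Toy sketch: a parity-check code over Z5 with symbols mapped to 5-PSK.
# The matrix H and the symbol labeling are illustrative only; real LDPC
# matrices are large and sparse.
import cmath
import itertools

p = 5
H = [[1, 2, 0, 3],          # a small parity-check matrix over Z5
     [0, 1, 4, 1]]

def is_codeword(c):
    """c is a codeword iff every parity check H @ c vanishes over Z5."""
    return all(sum(h * x for h, x in zip(row, c)) % p == 0 for row in H)

def to_5psk(c):
    """Map each Z5 symbol k to the unit-circle point exp(2*pi*j*k/5)."""
    return [cmath.exp(2j * cmath.pi * k / p) for k in c]

codewords = [c for c in itertools.product(range(p), repeat=4) if is_codeword(c)]
example = codewords[7]
print(len(codewords), "codewords over Z5; example:", example)
print("5-PSK points:", [f"{z:.2f}" for z in to_5psk(example)])
```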
3

Análise das propriedades matemáticas associadas ao splicing alternativo através dos códigos BCH e de Varshamov-Tenengolts / Analysis of the mathematical properties associated to the alternative splicing through BCH and Varshamov-Tenengolts codes

Franco, Luiz Antonio Leandro, 1984- 25 August 2018 (has links)
Advisor: Reginaldo Palazzo Júnior / Dissertation (master's) - Universidade Estadual de Campinas, Faculdade de Engenharia Elétrica e de Computação / Made available in DSpace on 2018-08-25T18:09:36Z (GMT). No. of bitstreams: 1 Franco_LuizAntonioLeandro_M.pdf: 1159060 bytes, checksum: 7123071f5e53a6a6c9703f83ba1395cc (MD5) Previous issue date: 2014 / Abstract: Over millions of years, humans, animals and plants have transformed and evolved in order to adapt to the environment. One process that assists in this evolution is alternative splicing, a remarkably convenient encoding through which a single gene can generate several proteins by combining exons and introns in different ways, thereby increasing proteomic capacity. Several studies seek a better understanding of the mechanisms involved in alternative splicing and of the consequences of errors committed during this process. The main objective of this work is to analyze the mathematical properties involved in alternative splicing by means of error-correcting codes. BCH codes were used in the cases where nucleotide substitution errors occurred, and Varshamov-Tenengolts codes in the cases where nucleotide insertion and deletion errors occurred. In this work we verified the possibility of reproducing alternative splicing mathematically in accordance with the biological constraints. To achieve this objective, we considered the TRAV7 gene present in the human genome and the Hint-1 gene present in the nematode Caenorhabditis elegans / Master's / Telecomunicações e Telemática / Master in Electrical Engineering
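
A minimal illustration of how a DNA sequence can be brought into the alphabet such codes operate on — nucleotides mapped to Z4 and a VT-style weighted checksum perturbed by a substitution and by a deletion; the A/C/G/T labeling and the example sequence are assumptions for illustration, not the construction used in the dissertation:

```python
# Illustration only: map nucleotides to Z4 and watch how a single substitution
# and a single deletion perturb a VT-style weighted checksum sum_i i*x_i.
# The A/C/G/T labeling is an assumption for the example, not the dissertation's.
NUC = {"A": 0, "C": 1, "G": 2, "T": 3}

def weighted_sum(seq):
    """sum_i i * x_i with positions counted from 1, over the Z4-mapped sequence."""
    return sum(i * NUC[s] for i, s in enumerate(seq, start=1))

original = "ACGTGCTA"
print("original    ", original, weighted_sum(original))

# A substitution at position p changes the sum by p * (new - old) ...
substituted = "ACGTGCTG"          # A -> G at position 8: sum changes by 8 * (2 - 0)
print("substituted ", substituted, weighted_sum(substituted))

# ... while a deletion at position p also shifts every later symbol one place left.
deleted = "ACGGCTA"               # the T at position 4 deleted
print("deleted     ", deleted, weighted_sum(deleted))
```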
4

Modelling of patterns between operational data, diagnostic trouble codes and workshop history using big data and machine learning

Virkkala, Linda, Haglund, Johanna January 2016 (has links)
The work presented in this thesis is part of a large research and development project on condition-based maintenance for heavy trucks and buses at Scania. The aim of this thesis was to be able to predict the status of a component (the starter motor) using data mining methods and to create models that can predict the failure of that component. Based on workshop history data, error codes and operational data, three sets of classification models were built and evaluated. The first model aims to find patterns in a set of error codes, to see which codes are related to a starter motor failure. The second model aims to see if there are patterns in operational data that lead to the occurrence of an error code. Finally, the two data sets were merged and a classifier was trained and evaluated on this larger data set. Two machine learning algorithms were used and compared throughout the model building: AdaBoost and random forest. There is no statistically significant difference in their performance, and both algorithms had an error rate of roughly 13%, 5% and 13% for the three classification models respectively. However, random forest is much faster, and is therefore the preferable option for an industrial implementation. Variable analysis was conducted for the error codes and operational data, resulting in rankings of informative variables. From the evaluation metric precision, it can be derived that if our random forest model predicts a starter motor failure, there is an 85.7% chance that it actually has failed. This model finds 32% (the model's recall) of the failed starter motors. It is also shown that four error codes, 2481, 2639, 2657 and 2597, have the highest predictive power for starter motor failure classification. For the operational data, variables that concern the starter motor lifetime and battery health are generally ranked as important by the models. The random forest model finds 81.9% of the cases where the 2481 error code occurs. If the random forest model predicts that the error code 2481 will occur, there is an 88.2% chance that it will. The classification performance was not increased when the two data sets were merged, indicating that the patterns detected by the first two classification models do not add value to one another.
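
A sketch of the modelling approach described above — training a random forest classifier and reading off precision, recall and variable importances — using synthetic, stand-in data rather than Scania's operational and error-code data:

```python
# Sketch of the modelling approach described above: train a random forest
# classifier and report precision and recall. The data is synthetic and only
# stands in for the operational/error-code features used in the thesis.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import precision_score, recall_score
from sklearn.model_selection import train_test_split

# Imbalanced binary problem: ~5% positives, since component failures are rare.
X, y = make_classification(n_samples=5000, n_features=30, n_informative=8,
                           weights=[0.95, 0.05], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
pred = model.predict(X_test)

# Precision: of the predicted failures, how many actually failed.
# Recall: of the actual failures, how many the model found.
print("precision:", precision_score(y_test, pred))
print("recall:   ", recall_score(y_test, pred))

# Variable analysis: rank features by importance, analogous to ranking the
# informative error codes and operational variables.
ranking = sorted(enumerate(model.feature_importances_), key=lambda t: -t[1])[:5]
print("top features:", ranking)
```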
5

Sécurité et disponibilité des données stockées dans les nuages / Data availability and security in cloud storage

Relaza, Théodore Jean Richard 12 February 2016 (has links)
With the development of the Internet, information technology has come to rely essentially on communications between servers, user workstations, networks and data centers. In the early 2000s two trends emerged: making applications available as services, and infrastructure virtualization. Their convergence gave rise to a unifying concept, Cloud Computing. Data storage then appears as a central element of the problem of moving processes and resources into the cloud. Whether it is a simple externalization of storage for backup purposes, the use of hosted software services, or the virtualization of the company's computing infrastructure at a third-party provider, data security is crucial. This security breaks down into three aspects: data availability, integrity and confidentiality. The context of our work is storage virtualization dedicated to Cloud Computing. This work was carried out within the SVC (Secured Virtual Cloud) project, financed by the Fond National pour la Société Numérique "Investissement d'avenir". It led to the development of a storage virtualization middleware, named CloViS (Cloud Virtualized Storage), which is entering a valorization phase driven by SATT Toulouse-Tech-Transfer. CloViS is a data management middleware developed within the IRIT laboratory; it virtualizes distributed and heterogeneous storage resources and makes them accessible in a uniform and transparent way. A distinctive feature of CloViS is that it matches user needs with system availabilities through qualities of service defined on virtual volumes. Our contribution to this field concerns data distribution techniques that improve data availability and the reliability of I/O operations in CloViS. Indeed, faced with the explosion in data volumes, replication alone cannot be a sustainable solution. The use of erasure-resilient codes or threshold schemes then appears as a valid alternative for keeping storage volumes under control. However, no data consistency protocol is, to date, adapted to these new data distribution methods. We therefore propose data consistency protocols adapted to these different data distribution techniques. We then analyze these protocols to highlight their respective advantages and disadvantages. Indeed, the choice of a data distribution technique and of the associated data consistency protocol is based on performance criteria, in particular read and write availability, the use of system resources (such as the storage space used), and the average number of messages exchanged during read and write operations.
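
A toy sketch of the threshold schemes mentioned above: a (k, n) Shamir-style scheme over a prime field splits a data block into n shares so that any k of them reconstruct it. This illustrates the general idea only, not the scheme used in CloViS:

```python
# Toy (k, n) threshold scheme: any k of the n shares reconstruct the block,
# fewer reveal nothing about it. Illustration of the general idea only,
# not the scheme used in CloViS.
import random

P = 2**61 - 1  # a Mersenne prime; we work in the field GF(P)

def split(secret, k, n):
    """Evaluate a random degree-(k-1) polynomial with constant term `secret`."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    shares = []
    for x in range(1, n + 1):
        y = 0
        for c in reversed(coeffs):          # Horner evaluation mod P
            y = (y * x + c) % P
        shares.append((x, y))
    return shares

def reconstruct(shares):
    """Lagrange interpolation at 0 over GF(P) from any k distinct shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % P
                den = (den * (xi - xj)) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

block = 123456789                           # a data block encoded as a field element
shares = split(block, k=3, n=5)
assert reconstruct(random.sample(shares, 3)) == block
print("reconstructed from any 3 of 5 shares")
```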
6

Error-Correcting Codes in Spaces of Sets and Multisets and Their Applications in Permutation Channels / Zaštitni kodovi u prostorima skupova i multiskupova i njihove primene u permutacionim kanalima

Kovačević Mladen 15 October 2014 (has links)
The thesis studies two communication channels and corresponding error-correcting codes. Multiset codes are introduced and their applications described. Properties of entropy and relative entropy are investigated.
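
A toy simulation of the permutation channel named in the title: the channel reorders symbols arbitrarily, so only their multiset survives, which is why codes for such channels live in spaces of multisets. Illustration only, not a construction from the thesis:

```python
# Toy permutation channel: the channel reorders the transmitted symbols, so
# the receiver only learns their multiset (histogram). Any code for this
# channel must therefore carry its information in multisets.
import random
from collections import Counter

def permutation_channel(symbols):
    received = list(symbols)
    random.shuffle(received)          # an arbitrary reordering
    return received

sent = [0, 0, 1, 2, 2, 2, 3]          # a "codeword" viewed as a multiset over {0,1,2,3}
received = permutation_channel(sent)

# Order is destroyed, but the multiset is preserved -- the invariant that
# multiset codes build on.
assert Counter(received) == Counter(sent)
print("received order:", received, "| multiset:", dict(Counter(received)))
```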
7

Uma abordagem computacional para a análise de sequências de DNA por meio dos códigos corretores de erros / A computational approach for the analysis of DNA sequences using error correcting codes

Pereira, Diogo Guilherme, 1981- 08 January 2014 (has links)
Advisor: Reginaldo Palazzo Júnior / Dissertation (master's) - Universidade Estadual de Campinas, Faculdade de Engenharia Elétrica e de Computação / Made available in DSpace on 2018-08-26T03:06:58Z (GMT). No. of bitstreams: 1 Pereira_DiogoGuilherme_M.pdf: 2278721 bytes, checksum: d52e9ddea8e27d992073c5cf8ba3674f (MD5) Previous issue date: 2014 / Abstract: The benefits provided by the application of information theory to the analysis of genetic coding processes are evident. This work proposes the development of algorithms, and their computational implementation, for performing analyses of DNA sequences by means of BCH codes. The first program computes several generator polynomials that are used by the other programs. The second program uses these generator polynomials to analyze DNA sequences and to identify codewords in the form of new DNA sequences. The third program, a novel contribution, uses both the generator polynomials and the codewords in a decoding process aimed at tracking the mutations that may occur in DNA sequences / Master's / Telecomunicações e Telemática / Master in Electrical Engineering
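
A minimal sketch of the kind of codeword-membership test the second program performs — a word belongs to a cyclic code generated by g(x) exactly when g(x) divides it. The BCH(15, 7) generator below is an example choice, and the sketch works directly on binary words since the abstract does not specify the nucleotide-to-symbol mapping:

```python
# Minimal sketch of a codeword-membership test for a cyclic (here BCH) code:
# a word c(x) belongs to the code generated by g(x) iff g(x) divides c(x).
# The BCH(15,7) generator is an example choice, not taken from the thesis.

# Polynomials over GF(2) are stored as integers: bit i holds the coefficient of x^i.
G = 0b111010001                     # g(x) = x^8 + x^7 + x^6 + x^4 + 1  (BCH(15,7))

def poly_mod(c, g):
    """Remainder of c(x) divided by g(x) over GF(2)."""
    dg = g.bit_length() - 1
    while c.bit_length() - 1 >= dg:
        c ^= g << (c.bit_length() - 1 - dg)
    return c

def encode(msg7):
    """Systematic encoding: shift the 7-bit message up and append the remainder."""
    shifted = msg7 << 8
    return shifted | poly_mod(shifted, G)

def is_codeword(word15):
    return poly_mod(word15, G) == 0

c = encode(0b1011001)
assert is_codeword(c)
assert not is_codeword(c ^ 0b100)   # a single substitution is detected
print(f"codeword: {c:015b}")
```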
