
Enhanced Reed Solomon decoding using soft decision information

Reed-Solomon codes are a well-known family of multilevel codes used in a variety of applications where errors are assumed to be bursty in nature. One of their appealing features is that they can correct random errors as well as mixtures of random and burst errors. Although the codes meet the Singleton bound with equality, they are very inefficient with respect to the Hamming bound, so a very high proportion of uncorrectable errors can be detected but not corrected. The algebraic decoding techniques traditionally used have the disadvantage that soft decision decoding cannot be achieved in a single decoding attempt, although it is possible to erase symbols of low confidence and allow the decoder to fill the erasures. Since Reed-Solomon codes are Maximum Distance Separable (MDS), it is possible to erase any n - k symbols (where n is the length and k the dimension of the code) and still achieve a successful decoding.

This thesis studies an approach in which a large number of erasure patterns of weight 2t are generated and the resulting decodings are compared with the received sequence using hard and soft decision voting strategies. Two improvements to this enhanced decoder have been investigated: comparing the decoded sequences bit by bit with the received sequence, and incorporating soft decisions into the voting.

In its most basic form the enhanced decoder is computationally inefficient, so various methods were investigated to overcome this problem. This remained true even after reducing the exhaustive erasure pattern set to a smaller set covering all possible combinations of 2t - 1 error patterns. Alternative algorithms were investigated that use the soft decision information to locate the most likely error symbols rather than decode exhaustively. Although these algorithms performed very well and substantially reduced the average computation, in the worst case their computational complexity could exceed that of the exhaustive erasure pattern method. For comparison, the error correcting algorithm proposed by Chase was implemented. The performance of the enhanced decoder was found to be slightly superior to that of the Chase algorithm, with the added advantage of reduced complexity.
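As a rough illustration of the erasure-pattern voting idea described above, the following Python sketch enumerates weight-(n - k) erasure patterns, asks an errors-and-erasures decoder to fill each one, and selects the candidate codeword that correlates best with the received soft decisions. The decoder interface `decode_with_erasures`, the function name `enhanced_decode`, and the scoring rule are illustrative assumptions, not the thesis's actual implementation.

```python
from itertools import combinations
from typing import Callable, Optional, Sequence

# Hypothetical errors-and-erasures RS decoder: given the hard-decision symbols
# and a set of erased positions, it returns a candidate codeword or None on
# failure. This is an assumed interface, not an API from the thesis.
RSDecoder = Callable[[Sequence[int], Sequence[int]], Optional[Sequence[int]]]


def enhanced_decode(hard_symbols: Sequence[int],
                    soft_bits: Sequence[float],
                    n: int, k: int, m: int,
                    decode_with_erasures: RSDecoder) -> Optional[Sequence[int]]:
    """Try many weight-(n - k) erasure patterns and vote with soft decisions.

    hard_symbols: n received symbols (hard decisions), each an m-bit value.
    soft_bits:    n * m per-bit reliabilities, signed so that a positive value
                  favours a received 1 and the magnitude is the confidence.
    """
    two_t = n - k  # an MDS code can fill any n - k erasures
    best, best_score = None, float("-inf")

    for erasures in combinations(range(n), two_t):  # exhaustive pattern set
        candidate = decode_with_erasures(hard_symbols, erasures)
        if candidate is None:
            continue
        # Soft-decision vote: correlate the candidate's bits with the received
        # reliabilities; the candidate agreeing most with the reliable bits wins.
        score = 0.0
        for i, symbol in enumerate(candidate):
            for b in range(m):
                bit = (symbol >> b) & 1
                score += soft_bits[i * m + b] * (1.0 if bit else -1.0)
        if score > best_score:
            best, best_score = candidate, score

    return best
```

The number of erasure patterns grows combinatorially, on the order of C(n, n - k), which is why the thesis goes on to investigate reduced pattern sets and algorithms that use the soft decision information to target the most likely error symbols instead of decoding exhaustively.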

Identifier: oai:union.ndltd.org:bl.uk/oai:ethos.bl.uk:359758
Date: January 1993
Creators: Mirza, Javed
Publisher: University of Surrey
Source Sets: Ethos UK
Detected Language: English
Type: Electronic Thesis or Dissertation
Source: http://epubs.surrey.ac.uk/844171/
