This paper proposes a syndrome sphere decoding (SSD) algorithm. SSD achieves the frame error rate (FER) of maximum likelihood (ML) decoding for a tail-biting convolutional code (TBCC) concatenated with an expurgating linear function (ELF), with significantly lower average and maximum decoding complexity than serial list Viterbi decoding (SLVD). SSD begins by using a Viterbi decoder to find the closest trellis path, which may not be tail-biting and may not satisfy the ELF. This trellis path has a syndrome with two components: the difference between the received ELF and the ELF computed from the received message, and the difference between the beginning and ending states of the trellis path. The syndrome is used to find all valid tail-biting codewords that satisfy the ELF constraint and lie within a specified distance of the closest trellis path. The proposed algorithm avoids the complexity of SLVD at the cost of a large table containing the offsets needed for each syndrome. A hybrid decoder combines SSD and SLVD with a fixed maximum list size, balancing the maximum list size for SLVD against the size of the SSD offset table.
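The abstract fixes no interface, so the following is only a minimal sketch of the SSD lookup step; `viterbi_closest_path`, `elf_remainder`, and the keying of `offset_table` are hypothetical stand-ins, not the paper's definitions.

```python
import numpy as np

def ssd_decode(received, viterbi_closest_path, elf_remainder, offset_table):
    """Sketch: return the best valid TB-ELF codeword within the sphere
    around the closest trellis path, or None to signal a fallback."""
    path = viterbi_closest_path(received)       # closest, possibly invalid, path
    # The syndrome pairs the ELF check failure with the tail-biting
    # mismatch between the path's starting and ending states.
    syndrome = (elf_remainder(path["message"]),
                path["start_state"] ^ path["end_state"])
    best, best_metric = None, float("inf")
    for offset in offset_table.get(syndrome, ()):
        candidate = path["codeword"] ^ offset   # XOR a stored offset onto the path
        # Euclidean metric against the BPSK-modulated candidate (0 -> +1, 1 -> -1)
        metric = float(np.sum((received - (1 - 2 * candidate)) ** 2))
        if metric < best_metric:
            best, best_metric = candidate, metric
    return best                                 # None: no codeword in the sphere
```

When `ssd_decode` returns None, the hybrid decoder described above would hand the received word to SLVD with a fixed maximum list size.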
                            ELF Codes: Concatenated Codes with an Expurgating Linear Function as the Outer Code
An expurgating linear function (ELF) is an outer code that disallows low-weight codewords of the inner code. ELFs can be designed either to maximize the minimum distance or to minimize the codeword error rate (CER) of the expurgated code. A list-decoding sieve can efficiently identify ELFs that maximize the minimum distance of the expurgated code. For convolutional inner codes, this paper provides analytical distance spectrum union (DSU) bounds on the CER of the concatenated code. For short codeword lengths, ELFs transform a good inner code into a great concatenated code. For a constant message size of K = 64 bits or a constant codeword blocklength of N = 152 bits, an ELF can reduce the gap at CER 10^−6 between the DSU and random-coding union (RCU) bounds from over 1 dB for the inner code alone to 0.23 dB for the concatenated code. The DSU bounds can also characterize puncturing that mitigates the rate overhead of the ELF while maintaining the DSU-to-RCU gap. List Viterbi decoding guided by the ELF achieves maximum likelihood (ML) decoding of the concatenated code with a sufficiently large list size. The rate-K/(K+m) ELF outer code reduces the overall rate, and list decoding increases decoder complexity. As SNR increases, however, the average list size converges to 1, making average complexity similar to Viterbi decoding on the trellis of the inner code alone. For rare large-magnitude noise events, which occur less often than the FER of the inner code, a deep search into the list finds the ML codeword.
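Functionally, the ELF attaches m check bits to a K-bit message by polynomial division over GF(2), exactly as a CRC does; a list Viterbi decoder then searches its list until a candidate passes the check. A minimal sketch follows; the degree-3 polynomial is an arbitrary placeholder, not one of the optimized ELFs from the paper.

```python
def elf_append(msg_bits, poly_bits):
    """Append the ELF remainder (CRC-style) to a message over GF(2).
    msg_bits and poly_bits are 0/1 lists; poly_bits has degree m."""
    m = len(poly_bits) - 1
    reg = list(msg_bits) + [0] * m            # multiply the message by x^m
    for i in range(len(msg_bits)):
        if reg[i]:                            # long division by the ELF polynomial
            for j, p in enumerate(poly_bits):
                reg[i + j] ^= p
    return list(msg_bits) + reg[-m:]          # message followed by the remainder

def elf_check(word_bits, poly_bits):
    """True iff word_bits is divisible by the ELF polynomial."""
    m = len(poly_bits) - 1
    reg = list(word_bits)
    for i in range(len(word_bits) - m):
        if reg[i]:
            for j, p in enumerate(poly_bits):
                reg[i + j] ^= p
    return not any(reg[-m:])

# A list Viterbi decoder would call elf_check() on successive list entries
# until one passes; with a large enough list this is ML decoding of the
# concatenated code. The polynomial here is a placeholder, not a designed ELF.
msg = [1, 0, 1, 1, 0, 0, 1, 0]
poly = [1, 0, 1, 1]                           # x^3 + x + 1, chosen arbitrarily
assert elf_check(elf_append(msg, poly), poly)
```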
- Award ID(s): 2008918
- PAR ID: 10468061
- Publisher / Repository: IEEE
- Date Published:
- Subject(s) / Keyword(s): Expurgation; Tail-biting Convolutional Codes; Convolutional Codes; Cyclic Redundancy Code; List Viterbi Decoding; List Decoding
- Format(s): Medium: X
- Location: Brest, France
- Sponsoring Org: National Science Foundation
More Like this
- With a sufficiently large list size, the serial list Viterbi algorithm (S-LVA) provides maximum likelihood (ML) decoding of a convolutional code (CC) concatenated with an expurgating linear function (ELF), which is similar in function to a cyclic redundancy check (CRC) but does not require the code to be cyclic. However, S-LVA with a large list size requires considerable complexity. This paper exploits linearity to reduce decoding complexity for tail-biting CCs (TBCCs) concatenated with ELFs.
- In the short blocklength regime, serial list decoding of tail-biting (TB) convolutional codes concatenated with an expurgating linear function (ELF) can approach the random coding union bound on frame error rate (FER) performance. Decoding complexity for a particular received word depends on how deep in the list the decoder must search to find a valid TB-ELF codeword. The average list size is close to one at low-FER operating points such as 10^−6, and serial list decoding provides a favorable average complexity compared to other decoders with similar performance in these cases. However, the average list size can be on the order of a hundred or a thousand at higher, but still practically important, FER operating points such as 10^−3. It is useful to study the tradeoff between how deep the decoder is willing to search and the proximity to the FER achieved by an ML decoder. Often, this tradeoff is framed in terms of a maximum list depth. However, this paper frames the tradeoff in terms of a maximum allowable metric between the received word and the trellis paths on the list, considering metrics of Euclidean distance and angle (a sketch of this stopping rule appears after this list). This new approach draws on the wealth of existing literature on bounded-metric decoding to characterize how the choice of maximum allowable metric controls the tradeoffs between FER performance and both decoding complexity and undetected error rate. These characterizations lead to an example of an ELF-TB convolutional code that outperforms recent results for polar codes in terms of the lowest SNR that simultaneously achieves both a total error rate below T = 10^−3 and an undetected error rate below U = 10^−5.
- Convolutional codes have been widely studied and used in many systems. As the number of memory elements increases, frame error rate (FER) improves but computational complexity grows exponentially. Recently, decoders that achieve reduced average complexity through list decoding have been demonstrated when the convolutional encoder polynomials share a common factor that can be understood as a CRC or, more generally, an expurgating linear function (ELF). However, classical convolutional code designs avoid such common factors because they produce a catastrophic encoder. This paper provides a way to access the complexity reduction possible with list decoding even when the convolutional encoder polynomials do not share a common factor. Decomposing the original code into component encoders that fully exclude some polynomials allows an ELF to be factored from each component. Dual list decoding of the component encoders can often find the ML codeword. Including a fallback to regular Viterbi decoding yields excellent FER performance while requiring less average complexity than always performing Viterbi decoding on the original trellis. A best-effort dual list decoder that avoids the Viterbi fallback performs similarly to the ML decoder. Component encoders that share a polynomial allow even greater complexity reduction.
- Recently, rate-1/n zero-terminated (ZT) and tail-biting (TB) convolutional codes (CCs) with cyclic redundancy check (CRC)-aided list decoding have been shown to closely approach the random-coding union (RCU) bound at short blocklengths. This paper designs CRC polynomials for rate-(n−1)/n ZT and TB CCs with short blocklengths, considering both standard rate-(n−1)/n CC polynomials and rate-(n−1)/n designs obtained by puncturing a rate-1/2 code. The CRC polynomials are chosen to maximize the minimum distance d_min and minimize the number of nearest neighbors A_dmin (a toy version of this design criterion is sketched after this list). For the standard rate-(n−1)/n codes, the dual trellis proposed by Yamada et al. lowers the complexity of CRC-aided serial list Viterbi decoding (SLVD). CRC-aided SLVD of the TBCCs closely approaches the RCU bound at a blocklength of 128. This paper compares the FER performance (gap to the RCU bound) and complexity of the CRC-aided standard and punctured ZTCCs and TBCCs, and explores the complexity-performance tradeoff for three TBCC decoders: a single-trellis approach, a multi-trellis approach, and a modified single-trellis approach with pre-processing using the wrap-around Viterbi algorithm.
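A minimal sketch of the bounded-metric stopping rule promised above: the decoder walks the list in metric order and stops at the first valid TB-ELF path, at the list's end, or when the distance or angle budget is exceeded. `next_best_path` and `is_valid_tb_elf` are assumed stand-ins for a serial list Viterbi implementation, not functions from the paper.

```python
import math

def bounded_metric_list_decode(received, next_best_path, is_valid_tb_elf,
                               max_distance=None, max_angle=None):
    """Serial list decoding limited by a metric budget rather than list depth."""
    rank = 0
    while True:
        rank += 1
        path = next_best_path(received, rank)   # rank-th closest trellis path
        if path is None:
            return None                         # list exhausted
        signal = path["signal"]                 # BPSK-modulated trellis path
        if max_distance is not None and math.dist(received, signal) > max_distance:
            return None                         # outside the Euclidean sphere
        if max_angle is not None:
            dot = sum(r * s for r, s in zip(received, signal))
            cosang = dot / (math.hypot(*received) * math.hypot(*signal))
            if math.acos(max(-1.0, min(1.0, cosang))) > max_angle:
                return None                     # outside the angle cone
        if is_valid_tb_elf(path):
            return path                         # first valid path is the output
```

Because the list is visited in order of increasing metric and BPSK trellis paths all have equal energy, returning None here is a detected erasure: every deeper path would also violate the budget.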
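The design criterion in the last item can be illustrated with a toy brute-force search, reusing elf_append() from the earlier sketch. Here conv_encode() is an assumed placeholder for the inner (possibly punctured) convolutional encoder, and exhaustive enumeration is only feasible for tiny K; the papers use a list-decoding sieve instead.

```python
from itertools import product

def distance_profile(poly_bits, K, conv_encode):
    """(d_min, A_dmin) of the expurgated code for one candidate polynomial.
    By linearity, the weights of the nonzero codewords give the distance
    spectrum, so the all-zero message is skipped."""
    weight_counts = {}
    for msg in product([0, 1], repeat=K):
        if not any(msg):
            continue
        cw = conv_encode(elf_append(list(msg), poly_bits))
        w = sum(cw)
        weight_counts[w] = weight_counts.get(w, 0) + 1
    d_min = min(weight_counts)
    return d_min, weight_counts[d_min]

def best_elf(K, m, conv_encode):
    """Degree-m polynomial maximizing d_min, breaking ties by fewer neighbors."""
    best_poly, best_prof = None, None
    for middle in product([0, 1], repeat=m - 1):
        poly = [1, *middle, 1]                # leading and constant terms are 1
        prof = distance_profile(poly, K, conv_encode)
        if best_prof is None or (prof[0], -prof[1]) > (best_prof[0], -best_prof[1]):
            best_poly, best_prof = poly, prof
    return best_poly, best_prof
```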