This paper explores list decoding of convolutional and polar codes for short messages such as those found in the 5G physical broadcast channel. A cyclic redundancy check (CRC) is used to select a codeword from a list of likely codewords. One example in the 5G standard encodes a 32-bit message with a 24-bit CRC and a 512-bit polar code, with additional bits added by repetition to achieve a very low rate of 32/864. This paper shows that optimizing the CRC length improves the Eb/N0 performance of this polar code, where Eb/N0 is the ratio of the energy per data bit to the noise power spectral density. Furthermore, even better Eb/N0 performance is achieved by replacing the polar code with a tail-biting convolutional code (TBCC) concatenated with a distance-spectrum-optimal (DSO) CRC. This paper identifies the CRC length that minimizes the frame error rate (FER) of a rate-1/5 TBCC at a specific value of Eb/N0. We also show that this optimized TBCC/CRC attains the same excellent Eb/N0 performance at the very low rate of 32/864 of the 5G polar code, where the low rate is again achieved through repetition. We show that the proposed TBCC/CRC concatenated code outperforms the PBCH polar code described in the 5G standard in terms of both FER and decoding run time. We also explore the tradeoff between undetected error rate and erasure rate as the CRC size varies.
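To make the role of the CRC concrete, here is a minimal Python sketch of CRC-aided list selection, assuming a generic list decoder that outputs message||CRC candidates ordered from most to least likely. The helper names (crc_remainder, crc_check, select_from_list) and the polynomial convention are illustrative assumptions, not taken from the paper or the 5G specification.

```python
# Illustrative sketch only: CRC-aided selection from a decoder's ranked list.
# The receiver returns the most likely candidate whose CRC checks, or declares
# an erasure (None) if no candidate on the list passes the check.

def crc_remainder(msg_bits, poly):
    """Remainder of msg(x) * x^deg(poly) divided by poly(x) over GF(2).

    msg_bits is a list of 0/1 bits; poly lists the CRC polynomial coefficients
    starting from the leading 1, e.g. [1, 0, 1, 1] for x^3 + x + 1.
    """
    deg = len(poly) - 1
    reg = list(msg_bits) + [0] * deg        # append deg zero bits
    for i in range(len(msg_bits)):
        if reg[i]:                          # divide out the polynomial wherever a 1 leads
            for j, p in enumerate(poly):
                reg[i + j] ^= p
    return reg[-deg:]                       # the last deg bits are the CRC

def crc_check(candidate, poly):
    """A message||CRC candidate is consistent iff its recomputed CRC matches."""
    deg = len(poly) - 1
    msg, rx_crc = candidate[:-deg], candidate[-deg:]
    return crc_remainder(msg, poly) == rx_crc

def select_from_list(ranked_candidates, poly):
    """Pick the best-ranked candidate that passes the CRC; None means erasure."""
    for cand in ranked_candidates:          # candidates ordered best-first
        if crc_check(cand, poly):
            return cand
    return None

# Toy usage with a degree-3 CRC (not a 5G polynomial):
poly = [1, 0, 1, 1]
msg = [1, 0, 1, 1, 0]
codeword = msg + crc_remainder(msg, poly)   # what the transmitter would append
print(select_from_list([[1, 0, 1, 1, 1, 0, 0, 1], codeword], poly))
```

In this toy run the top-ranked candidate fails the CRC, so the second candidate is selected; returning None when every candidate fails is what produces the erasure events traded off against undetected errors in the abstract above.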
An Efficient Algorithm for Designing Optimal CRCs for Tail-Biting Convolutional Codes
Cyclic redundancy check (CRC) codes combined with convolutional codes yield a powerful concatenated code that can be efficiently decoded using list decoding. To help design such systems, this paper presents an efficient algorithm for identifying the distance-spectrum-optimal (DSO) CRC polynomial for a given tail-biting convolutional code (TBCC) when the target undetected error rate (UER) is small. Lou et al. found that the DSO CRC design for a given zero-terminated convolutional code under low UER is equivalent to maximizing the undetected minimum distance (the minimum distance of the concatenated code). This paper applies the same principle to design the DSO CRC for a given TBCC under low target UER. Our algorithm is based on partitioning the tail-biting trellis into several disjoint sets of tail-biting paths that are closed under cyclic shifts. This paper shows that the tail-biting paths in each set can be constructed by concatenating irreducible error events (IEEs) and circularly shifting the resultant paths. This motivates an efficient collection algorithm that gathers the IEEs and a search algorithm that reconstructs the full list of error events with distances below the bound of interest, which can then be used to find the DSO CRC. Simulation results show that DSO CRCs can significantly outperform suboptimal CRCs in the low-UER regime.
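As a companion to the design principle above, the following is a minimal Python sketch of the final selection step, assuming the error-event list has already been produced by the collection and search algorithms: an error event whose input sequence is divisible by a candidate CRC polynomial survives as an undetected error, so the DSO choice maximizes the minimum surviving distance and, as a tie-break, minimizes the count at that distance. The function names and the (distance, input_bits) representation are assumptions for illustration, not the paper's data structures.

```python
# Illustrative sketch only: choosing a DSO CRC from a precomputed list of
# TBCC error events, each represented as a (distance, input_bits) pair with
# distance below the bound of interest.

def divisible(bits, poly):
    """True iff the GF(2) polynomial with coefficients `bits` is divisible by `poly`."""
    reg = list(bits)
    for i in range(len(bits) - len(poly) + 1):
        if reg[i]:                            # cancel the leading 1 with the divisor
            for j, p in enumerate(poly):
                reg[i + j] ^= p
    return not any(reg)

def distance_spectrum(error_events, crc_poly):
    """Distance spectrum of the concatenated code: only error events whose
    input sequence is divisible by the CRC survive as undetected errors."""
    spectrum = {}
    for dist, input_bits in error_events:
        if divisible(input_bits, crc_poly):
            spectrum[dist] = spectrum.get(dist, 0) + 1
    return spectrum

def best_crc(error_events, candidate_polys):
    """DSO choice: maximize the minimum surviving distance, then minimize
    the number of codewords at that distance."""
    def score(poly):
        spec = distance_spectrum(error_events, poly)
        if not spec:                          # nothing survives below the bound
            return (float("inf"), 0)
        d_min = min(spec)
        return (d_min, -spec[d_min])          # larger d_min wins, then smaller A_dmin
    return max(candidate_polys, key=score)

# Toy usage (made-up events and two degree-3 candidates):
events = [(6, [1, 0, 1, 1, 0, 0, 0]), (7, [1, 1, 0, 1]), (8, [1, 0, 0, 0, 1, 0, 1, 1])]
print(best_crc(events, [[1, 0, 1, 1], [1, 1, 0, 1]]))   # -> [1, 1, 0, 1]: d_min rises from 6 to 7
```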
- Award ID(s): 2008918
- PAR ID: 10214994
- Date Published:
- Journal Name: 2020 IEEE International Symposium on Information Theory (ISIT)
- Page Range / eLocation ID: 292 to 297
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- Recently, rate-1/ω zero-terminated and tail-biting convolutional codes (ZTCCs and TBCCs) with cyclic-redundancy-check (CRC)-aided list decoding have been shown to closely approach the random-coding union (RCU) bound for short blocklengths. This paper designs CRCs for rate-(ω−1)/ω CCs with short blocklengths, considering both the ZT and TB cases. The CRC design seeks to optimize the frame error rate (FER) performance of the code resulting from the concatenation of the CRC and the CC. Utilization of the dual trellis proposed by Yamada et al. lowers the complexity of CRC-aided serial list Viterbi decoding (SLVD) of ZTCCs and TBCCs. CRC-aided SLVD of the TBCCs closely approaches the RCU bound at a blocklength of 128.
- Tal, Ido (Ed.): Recently, rate-1/n zero-terminated (ZT) and tail-biting (TB) convolutional codes (CCs) with cyclic redundancy check (CRC)-aided list decoding have been shown to closely approach the random-coding union (RCU) bound for short blocklengths. This paper designs CRC polynomials for rate-(n−1)/n ZT and TB CCs with short blocklengths. This paper considers both standard rate-(n−1)/n CC polynomials and rate-(n−1)/n designs resulting from puncturing a rate-1/2 code. The CRC polynomials are chosen to maximize the minimum distance d_min and minimize the number of nearest neighbors A_dmin. For the standard rate-(n−1)/n codes, utilization of the dual trellis proposed by Yamada et al. lowers the complexity of CRC-aided serial list Viterbi decoding (SLVD). CRC-aided SLVD of the TBCCs closely approaches the RCU bound at a blocklength of 128. This paper compares the FER performance (gap to the RCU bound) and complexity of the CRC-aided standard and punctured ZTCCs and TBCCs. This paper also explores the complexity-performance trade-off for three TBCC decoders: a single-trellis approach, a multi-trellis approach, and a modified single-trellis approach with pre-processing using the wrap-around Viterbi algorithm.
- With a sufficiently large list size, the serial list Viterbi algorithm (S-LVA) provides maximum likelihood (ML) decoding of a concatenated convolutional code (CC) and an expurgating linear function (ELF), which is similar in function to a cyclic redundancy check (CRC) but does not require the code to be cyclic. However, S-LVA with a large list size requires considerable complexity. This paper exploits linearity to reduce decoding complexity for tail-biting CCs (TBCCs) concatenated with ELFs. (A toy sketch of such a linear check appears after this list.)
- This paper proposes a syndrome sphere decoding (SSD) algorithm. SSD achieves the frame error rate (FER) of maximum likelihood (ML) decoding for a tail-biting convolutional code (TBCC) concatenated with an expurgating linear function (ELF) with significantly lower average and maximum decoding complexity than serial list Viterbi decoding (SLVD). SSD begins by using a Viterbi decoder to find the closest trellis path, which may not be tail-biting and may not satisfy the ELF. This trellis path has a syndrome composed of the difference between the received ELF and the ELF computed from the received message, together with the difference between the beginning and ending states of the trellis path. This syndrome is used to find all valid tail-biting codewords that satisfy the ELF constraint and lie within a specified distance of the closest trellis path. The proposed algorithm avoids the complexity of SLVD at the cost of a large table containing all the offsets needed for each syndrome. A hybrid decoder combines SSD and SLVD with a fixed maximum list size, balancing the maximum list size for SLVD against the size of the SSD offset table. (A sketch of this hybrid control flow also appears after this list.)
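Regarding the ELF-based item above: an ELF plays the same role as a CRC (extra linear parity bits used to expurgate bad codewords) without requiring the overall code to be cyclic. The sketch below shows one generic way to realize such a linear check as a GF(2) parity matrix; the matrix H and the helper names are purely illustrative assumptions and are not the construction used in that paper.

```python
# Illustrative sketch only: a generic GF(2) linear check. Unlike a CRC, the
# parity bits need not come from polynomial division, so no cyclic structure
# is implied. The matrix H below is a made-up toy example.
import numpy as np

def elf_parity(msg_bits, H):
    """Parity bits appended to the message: p = H @ m over GF(2)."""
    return (H @ np.asarray(msg_bits)) % 2

def elf_check(msg_bits, parity_bits, H):
    """A candidate passes the check iff its parity recomputes correctly."""
    return np.array_equal(elf_parity(msg_bits, H), np.asarray(parity_bits))

H = np.array([[1, 0, 1, 1, 0, 1],
              [0, 1, 1, 0, 1, 1],
              [1, 1, 0, 1, 1, 0]])
msg = [1, 0, 1, 1, 0, 0]
print(elf_parity(msg, H))                        # parity appended at the encoder
print(elf_check(msg, elf_parity(msg, H), H))     # True at the decoder
```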
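For the syndrome sphere decoding item, the following is a minimal sketch of the hybrid SSD/SLVD control flow as described in that abstract. Every callable passed in (viterbi_best_path, syndrome_of, apply_offset, path_metric, slvd_decode) is a hypothetical stand-in for a component the abstract describes; only the decision logic is illustrated, under the assumption that the offset table maps each syndrome to the offsets reaching nearby valid codewords.

```python
# Illustrative control-flow sketch only; every callable below is a hypothetical
# stand-in for a component described in the abstract (none are real APIs).

def hybrid_decode(rx, offset_table, max_list_size,
                  viterbi_best_path, syndrome_of, apply_offset,
                  path_metric, slvd_decode):
    # 1. Unconstrained Viterbi pass: the closest trellis path may be neither
    #    tail-biting nor ELF-satisfying.
    best_path = viterbi_best_path(rx)

    # 2. Syndrome = (ELF mismatch, difference between the path's start and end
    #    states). Assumed represented so that a zero syndrome is falsy; a zero
    #    syndrome means the path is already a valid TBCC/ELF codeword.
    syn = syndrome_of(best_path)
    if not syn:
        return best_path

    # 3. SSD step: the precomputed table maps this syndrome to the offsets that
    #    reach the valid codewords within the search radius of the best path;
    #    keep the candidate closest to the received word.
    candidates = [apply_offset(best_path, off) for off in offset_table.get(syn, ())]
    if candidates:
        return min(candidates, key=lambda c: path_metric(c, rx))

    # 4. Fall back to serial list Viterbi decoding with a bounded list size.
    return slvd_decode(rx, max_list_size)
```

This reading treats the SLVD fallback as the path taken only when the offset table yields no candidate within the radius, which is one plausible way to realize the list-size/table-size balance mentioned in the abstract.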