Title: Error Propagation Mitigation in Sliding Window Decoding of Spatially Coupled LDPC Codes
In this paper, we investigate the problem of decoder error propagation for spatially coupled low-density parity-check (SC-LDPC) codes with sliding window decoding (SWD). This problem typically manifests itself at signal-to-noise ratios (SNRs) close to capacity under low-latency operating conditions, where infrequent but severe decoder error propagation can sometimes occur. To help understand the error propagation problem in SWD of SC-LDPC codes, a multi-state Markov model is developed to describe decoder behavior and to analyze the error performance of SC-LDPC codes under these conditions. We then present two approaches, check node (CN) doping and variable node (VN) doping, to combat decoder error propagation and improve decoder performance. Next, we describe how performance can be further improved by employing an adaptive approach that depends on the availability of a noiseless binary feedback channel. To illustrate the effectiveness of the doping techniques, we analyze the error performance of CN doping and VN doping using the multi-state decoder model. We then present computer simulation results showing that CN and VN doping significantly improve performance in the operating range of interest at the cost of a small rate loss, and that adaptive doping improves performance further. We also show that the rate loss is always less than that resulting from encoder termination and can be further reduced by doping only a fraction of the VNs at each doping position in the code graph, with only a minor impact on performance. Finally, we show how the encoding problem for VN doping can be greatly simplified by doping only systematic bits, with little or no performance loss.
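As a rough illustration of the VN doping idea described in the abstract, the sketch below treats doped variable nodes by saturating their channel log-likelihood ratios (LLRs) to a large, known-sign value, so that a belief propagation (BP) sliding window decoder effectively regards them as known bits. This is a minimal sketch under assumed interfaces, not the implementation from the paper: the names sliding_window_decode, apply_vn_doping, and run_window_bp, the saturation constant LLR_SAT, and the block/window bookkeeping are all illustrative assumptions.

```python
import numpy as np

LLR_SAT = 50.0  # assumed saturation magnitude for "known" (doped) bits

def apply_vn_doping(channel_llrs, doped_positions, doped_values):
    """Overwrite channel LLRs at doped variable nodes with a large,
    known-sign value so BP effectively treats them as known bits."""
    llrs = np.array(channel_llrs, dtype=float)
    for pos, bit in zip(doped_positions, doped_values):
        # bit 0 -> strongly positive LLR, bit 1 -> strongly negative LLR
        llrs[pos] = LLR_SAT * (1 - 2 * bit)
    return llrs

def sliding_window_decode(channel_llrs, H_window, window_size, block_len,
                          doped_positions, doped_values, run_window_bp):
    """Toy sliding window decoder: apply doping once, then decode one
    target block per window position with a caller-supplied BP routine."""
    llrs = apply_vn_doping(channel_llrs, doped_positions, doped_values)
    n_blocks = len(llrs) // block_len
    decisions = np.zeros(len(llrs), dtype=int)
    for t in range(n_blocks):
        lo = t * block_len
        hi = min(lo + window_size * block_len, len(llrs))
        window_llrs = run_window_bp(H_window, llrs[lo:hi])  # BP within the window
        # commit hard decisions only for the oldest (target) block in the window
        decisions[lo:lo + block_len] = (window_llrs[:block_len] < 0).astype(int)
    return decisions
```

Here run_window_bp stands in for any BP routine that returns updated LLRs for the bits inside the current window; the point of the sketch is only that doped positions enter the decoder with effectively infinite reliability, which is what allows the window decoder to recover once error propagation begins.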
Award ID(s): 1757207, 2148358, 2145917
NSF-PAR ID: 10462087
Author(s) / Creator(s):
Date Published:
Journal Name: IEEE Journal on Selected Areas in Information Theory
ISSN: 2641-8770
Page Range / eLocation ID: 1 to 1
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this
  1. In this paper, we introduce two new methods of mitigating decoder error propagation for low-latency sliding window decoding (SWD) of spatially coupled low-density parity-check (SC-LDPC) codes. Building on the recently introduced idea of check node (CN) doping of regular SC-LDPC codes, here we employ variable node (VN) doping to fix (set to a known value) a subset of variable nodes in the coupling chain. Both of these doping methods have the effect of allowing SWD to recover from error propagation, at the cost of a slight rate loss. Experimental results show that, similar to CN doping, VN doping improves performance by up to two orders of magnitude compared to undoped SC-LDPC codes in the typical signal-to-noise ratio operating range. Further, compared to CN doping, VN doping has the advantage of not requiring any changes to the decoding process. In addition, a log-likelihood-ratio based window extension algorithm is proposed to reduce the effect of error propagation (a rough sketch of such a window extension rule appears after this list). Using this approach, we show that decoding latency can be reduced by a significant fraction without suffering any loss in performance.
  2. In this paper, we introduce two new methods of mitigating decoder error propagation for low-latency sliding window decoding (SWD) of spatially coupled low-density parity-check (SC-LDPC) codes. Building on the recently introduced idea of check node (CN) doping of regular SC-LDPC codes, here we employ variable node (VN) doping to fix (set to a known value) a subset of variable nodes in the coupling chain. Both of these doping methods have the effect of allowing SWD to recover from error propagation, at the cost of a slight rate loss. Experimental results show that, similar to CN doping, VN doping improves performance by up to two orders of magnitude compared to undoped SC-LDPC codes in the typical signal-to-noise ratio operating range. Further, compared to CN doping, VN doping has the advantage of not requiring any changes to the decoding process. In addition, a log-likelihood-ratio based window extension algorithm is proposed to reduce the effect of error propagation. Using this approach, we show that decoding latency can be reduced by a significant fraction without suffering any loss in performance.
  3. In this paper, we examine variable node (VN) doping to mitigate the error propagation problem in sliding window decoding (SWD) of spatially coupled LDPC (SC-LDPC) codes from the point of view of the encoding process. More specifically, in order to simplify the process of generating an encoded sequence with some number of doped code bits, we propose to employ systematic encoding and to limit doping to systematic bits only. Numerical results show that doping of systematic bits only achieves comparable performance to employing general (nonsystematic) encoding and full doping of all the code bits at each doping position, while benefiting from a much simpler encoding process. We then show that the inherent rate loss due to doping can be reduced by doping only a fraction of the variable nodes at each doping position with only a minor impact on performance. 
  4. In this paper, a method for joint source-channel coding (JSCC) based on concatenated spatially coupled low-density parity-check (SC-LDPC) codes is investigated. A construction consisting of two SC-LDPC codes is proposed: one for source coding and the other for channel coding, with a joint belief propagation-based decoder. Also, a novel windowed decoding (WD) scheme is presented with significantly reduced latency and complexity requirements. The asymptotic behavior for various graph node degrees is analyzed using a protograph-based Extrinsic Information Transfer (EXIT) chart analysis for both LDPC block codes with block decoding and for SC-LDPC codes with the WD scheme, showing robust performance for concatenated SC-LDPC codes. Simulation results show a notable performance improvement compared to existing state-of-the-art JSCC schemes based on LDPC codes with comparable latency and complexity constraints. 
  5. Neural Normalized MinSum (N-NMS) decoding delivers better frame error rate (FER) performance on linear block codes than conventional Normalized MinSum (NMS) by assigning dynamic multiplicative weights to each check-to-variable node message in each iteration. Previous N-NMS efforts primarily investigated short block codes (N < 1000), because the number of N-NMS parameters to be trained scales with the number of edges in the parity-check matrix and the number of iterations, which imposes an impractical memory requirement for conventional tools such as PyTorch and TensorFlow to create the neural network and store gradients. This paper provides efficient methods of training the parameters of N-NMS decoders that support longer block lengths. Specifically, this paper introduces a family of Neural 2-dimensional Normalized MinSum (N-2D-NMS) decoders with various reduced parameter sets and shows how performance varies with the parameter set selected. The N-2D-NMS decoders share weights with respect to check node and/or variable node degree (a minimal sketch of this degree-based weight sharing appears after this list). Simulation results justify a reduced parameter set, showing that the trained N-NMS weights are smaller for neurons corresponding to larger check/variable node degrees. Further simulation results on a (3096,1032) Protograph-Based Raptor-Like (PBRL) code show that the N-2D-NMS decoder can achieve the same FER as N-NMS while providing at least a 99.7% parameter reduction. Furthermore, the N-2D-NMS decoder for the (16200,7200) DVB-S2 standard LDPC code shows a lower error floor than belief propagation. Finally, this paper proposes a hybrid decoder training structure that utilizes a neural network combining a feedforward module with a recurrent module. The decoding performance and parameter reduction of the hybrid training depend on the length of the recurrent module of the neural network.
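As a rough illustration of the LLR-based window extension idea mentioned in items 1 and 2 above, the sketch below re-decodes the target block with a larger window whenever its least reliable output LLR falls below a threshold, before committing hard decisions. The extension criterion, the threshold value, and the helpers build_H_for_window and run_window_bp are assumptions for illustration, not the algorithm as specified in those papers.

```python
import numpy as np

def decode_with_window_extension(llrs, build_H_for_window, run_window_bp,
                                 block_len, base_window, max_window,
                                 reliability_threshold=3.0):
    """Decode the target (oldest) block; if its least reliable LLR falls
    below a threshold, extend the window and re-decode before committing."""
    window = base_window
    while True:
        span = min(window * block_len, len(llrs))
        out_llrs = run_window_bp(build_H_for_window(window), llrs[:span])
        target = np.asarray(out_llrs[:block_len], dtype=float)
        if np.min(np.abs(target)) >= reliability_threshold or window >= max_window:
            # reliable enough, or no room left to extend: commit hard decisions
            return (target < 0).astype(int), window
        window += 1  # extend the decoding window by one block and try again
```

The intent is that the extra latency of a larger window is paid only for the (infrequent) blocks whose LLRs look unreliable, which is consistent with the latency savings reported in those abstracts.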
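The degree-based weight sharing described in item 5 can be illustrated with a check node update whose normalization weight is looked up by (iteration, check node degree) rather than trained per edge. This is a minimal sketch under that assumption; the weight-table indexing and names are illustrative, not the paper's implementation.

```python
import numpy as np

def nms_check_update_degree_shared(incoming_llrs, iteration, weights_by_degree):
    """Normalized MinSum check node update with one trained weight per
    (iteration, check node degree) instead of one weight per edge."""
    incoming = np.asarray(incoming_llrs, dtype=float)
    degree = len(incoming)
    w = weights_by_degree[(iteration, degree)]  # shared multiplicative weight
    signs = np.sign(incoming)
    mags = np.abs(incoming)
    out = np.empty_like(incoming)
    for i in range(degree):
        others = np.delete(np.arange(degree), i)
        # extrinsic message: product of the other signs times the minimum of the other magnitudes
        out[i] = w * np.prod(signs[others]) * np.min(mags[others])
    return out

# example: a degree-6 check node at iteration 2, with an assumed weight table
weights = {(2, 6): 0.8}
msgs = nms_check_update_degree_shared([1.2, -0.4, 2.0, -3.1, 0.9, 1.5], 2, weights)
```

Because the weight depends only on the iteration and the node degree, the number of trainable parameters grows with the number of distinct degrees rather than the number of edges, which is the source of the large parameter reduction reported in that abstract.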