

Title: Graceful degradation over the BEC via non-linear codes
Abstract—We study the problem of constructing codes that transform a channel with a high bit error rate (BER) into one with a low BER, at the expense of rate. Our focus is on obtaining codes with smooth ("graceful") input-output BER curves, as opposed to the threshold-like curves typical of long error-correcting codes. This paper restricts attention to binary erasure channels (BEC) and contains two contributions. First, we introduce the notion of Low Density Majority Codes (LDMCs). These are non-linear sparse-graph codes whose output bits are the majority function evaluated on randomly chosen small subsets of the data bits; they are similar to Low Density Generator Matrix codes (LDGMs), except that the XOR function is replaced with the majority. We show that even after a few iterations of belief propagation (BP), the attained input-output curves provably improve upon the performance of any linear systematic code. This effect of nonlinearity bootstrapping the initial iterations of BP suggests that LDMCs should improve performance in various applications where LDGMs have traditionally been used. Second, we establish several two-point converse bounds that lower-bound the BER achievable at one erasure probability as a function of the BER achieved at another. The novelty of our bounds is that they are specific to subclasses of codes (linear systematic and non-linear systematic) and outperform similar bounds implied by the area theorem for the EXIT function.
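To make the LDMC construction concrete, here is a minimal sketch of the encoding step described above, assuming a fixed odd degree d and uniformly random subsets; the function names and parameters are illustrative assumptions, not the paper's exact construction.

```python
import random

def ldmc_encode(data_bits, n_parity, d=3, seed=0):
    """Toy LDMC-style systematic encoder (illustrative sketch).

    Output: the data bits themselves, followed by n_parity bits, each
    the majority of d randomly chosen data bits (an odd d keeps the
    majority well defined).
    """
    rng = random.Random(seed)
    k = len(data_bits)
    subsets = [rng.sample(range(k), d) for _ in range(n_parity)]
    majorities = [int(2 * sum(data_bits[i] for i in s) > d) for s in subsets]
    return data_bits + majorities, subsets

def ldgm_encode(data_bits, subsets):
    """Same sparse subsets, but XOR replaces the majority (LDGM-style)."""
    return [sum(data_bits[i] for i in s) % 2 for s in subsets]
```

The only difference from an LDGM encoder is the inner function applied to each subset (majority instead of XOR), which is exactly the substitution the abstract describes; the BP decoding analysis is the subject of the paper and is not reproduced here.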
Award ID(s):
1717842
PAR ID:
10273761
Author(s) / Creator(s):
Date Published:
Journal Name:
IEEE International Symposium on Information Theory
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. This paper introduces the notion of triangulation codes, a family of non-linear codes that 1) admit efficient encoding and decoding and 2) have a bit error rate that deteriorates gracefully as the quality of the erasure channel degrades. Some coding-theoretic properties of these codes are established. For transmission over the erasure channel, it is shown that, even with sub-optimal decoding, they can achieve a lower bit error rate than uncoded transmission for any number of received output symbols.
  2. This paper presents new achievability bounds on the maximal achievable rate of variable-length stop-feedback (VLSF) codes operating over a binary erasure channel (BEC) at a fixed message size M = 2^k. We provide bounds for two cases: the first considers VLSF codes with possibly infinite decoding times and zero error probability; the second limits the maximum (finite) number of decoding times and specifies a maximum tolerable probability of error. Both achievability bounds are proved by constructing a new VLSF code that employs systematic transmission of the first k message bits followed by random linear fountain parity bits decoded with a rank decoder (a sketch of this decoder appears after this list). For VLSF codes with infinite decoding times, our new bound outperforms the state-of-the-art result for the BEC by Devassy et al. (2016). We show that the backoff from capacity reduces to zero as the erasure probability decreases, giving a negative answer to the open question posed by Devassy et al. of whether the 23.4% backoff from capacity at k = 3 is fundamental to all BECs. For VLSF codes with finite decoding times, numerical evaluations show that systematic transmission followed by random linear fountain coding achieves higher rates than random linear coding.
  3. Orthogonal Signal Division Multiplexing (OSDM) has shown great robustness against multipath and Doppler effects in underwater acoustic channels, but has low spectral efficiency. In this work, a novel transmission method, named Spatial Modulation-based OSDM (SM-OSDM), is proposed to improve spectral efficiency while keeping the bit error rate (BER) low: multiple transducers are used at the transmitter side, but only one is active in each time slot. Simulation results show that SM-OSDM offers higher spectral efficiency than Single-Input Single-Output (SISO) OSDM and lower BER than Multiple-Input Multiple-Output (MIMO) OSDM.
  4. Quantum low-density parity-check (QLDPC) codes have emerged as a promising technique for quantum error correction. A variety of decoders have been proposed for QLDPC codes, and many utilize belief propagation (BP) decoding in some fashion. However, BP decoding of degenerate QLDPC codes is known to have convergence issues, typically attributed to short cycles in the Tanner graph and to error patterns that share a syndrome due to code degeneracy. In this work, we propose a decoder for QLDPC codes based on BP guided decimation (BPGD), previously studied for constraint satisfaction and lossy compression problems. The decimation process is applicable to both binary and quaternary BP and involves sequentially freezing the values of the most reliable qubits to encourage BP convergence (a generic sketch of this control loop appears after this list). We find that BPGD significantly reduces the BP failure rate due to non-convergence, achieving performance on par with BP with ordered statistics decoding and BP with stabilizer inactivation, without the need to solve systems of linear equations. To explore how and why BPGD improves performance, we discuss several interpretations of BPGD and their connection to BP syndrome decoding.
  5. Streaming codes take a string of source symbols as input and output a string of coded symbols in real time, which effectively eliminates the queueing delay, making them a promising scheme for low-latency communications. To quantify the fundamental latency performance of random linear streaming codes (RLSCs) over i.i.d. symbol erasure channels, this work derives the exact error probability under simultaneous finite-memory-length and finite-decoding-deadline constraints. The result is then used to examine the tradeoff among memory length (complexity), decoding deadline (delay), and error probability (reliability) of RLSCs for the first time in the literature. Two critical observations are made: (i) too much memory can adversely impact performance under a finite decoding deadline, a surprising finding not captured by the traditional wisdom that larger memory monotonically improves performance in the asymptotic regime; (ii) the end-to-end delay of an RLSC is roughly 50% of that of an MDS block code under identical code-rate and error-probability requirements. This implies that switching from block codes to RLSCs not only eliminates the queueing delay (hence the 50% reduction) but also has little negative impact on the error probability.
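For the VLSF construction in item 2, the following is a minimal sketch, under our own assumptions about the details, of the receiver side: each surviving systematic or fountain bit is stored as a GF(2) equation (a k-bit mask plus its XOR value), and the rank decoder is plain Gaussian elimination that succeeds exactly when the collected equations reach rank k. All names are illustrative.

```python
import random

def rank_decoder(k, equations):
    """Gaussian elimination over GF(2) (illustrative sketch).

    equations: list of (mask, value) pairs; the set bits of the k-bit
    integer mask index the message bits whose XOR equals value.
    Returns the k decoded bits once rank k is reached, else None.
    """
    pivots = {}  # pivot bit position -> reduced (mask, value) row
    for mask, value in equations:
        for j in reversed(range(k)):
            if not (mask >> j) & 1:
                continue
            if j in pivots:  # eliminate bit j using the stored pivot row
                pmask, pvalue = pivots[j]
                mask ^= pmask
                value ^= pvalue
            else:  # new pivot: store the reduced row
                pivots[j] = (mask, value)
                break
    if len(pivots) < k:
        return None  # rank deficient: keep collecting fountain bits
    bits = [0] * k
    for j in sorted(pivots):  # solve from the lowest pivot upward
        mask, value = pivots[j]
        for i in range(j):
            if (mask >> i) & 1:
                value ^= bits[i]
        bits[j] = value
    return bits

def vlsf_bec_demo(k=8, eps=0.3, seed=1):
    """Systematic phase, then random fountain bits, over a BEC(eps)."""
    rng = random.Random(seed)
    msg = [rng.randint(0, 1) for _ in range(k)]
    # Systematic phase: each unerased bit i yields the equation x_i = msg[i].
    rx = [(1 << i, msg[i]) for i in range(k) if rng.random() > eps]
    # Fountain phase: send random linear combinations until rank k.
    while (decoded := rank_decoder(k, rx)) is None:
        mask = rng.getrandbits(k) or 1  # random non-zero combination
        value = 0
        for i in range(k):
            if (mask >> i) & 1:
                value ^= msg[i]
        if rng.random() > eps:  # the fountain bit survives the channel
            rx.append((mask, value))
    return decoded == msg
```

In the stop-feedback setting, the single feedback signal would be sent at the first decoding time at which rank_decoder returns a solution.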
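The BPGD loop in item 4 can likewise be sketched generically. This skeleton only illustrates the "run BP, then freeze the most reliable undecided variable" control flow described in the abstract; run_bp is an assumed caller-supplied routine, and none of the names come from the paper.

```python
def bp_guided_decimation(run_bp, n_vars):
    """Generic BPGD control loop (illustrative skeleton).

    run_bp(frozen) -> (converged, marginals): a caller-supplied BP
    routine that clamps the variables listed in the dict frozen
    {index: bit} and returns per-variable probabilities of being 1.
    """
    frozen = {}
    while len(frozen) < n_vars:
        converged, marginals = run_bp(frozen)
        if converged:
            # BP settled on a consistent assignment: round the marginals.
            return [int(p > 0.5) for p in marginals]
        # Decimation step: clamp the most reliable undecided variable,
        # i.e. the free variable whose marginal is farthest from 1/2.
        free = [i for i in range(n_vars) if i not in frozen]
        best = max(free, key=lambda i: abs(marginals[i] - 0.5))
        frozen[best] = int(marginals[best] > 0.5)
    # Everything was decimated without BP converging on its own.
    return [frozen[i] for i in range(n_vars)]
```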