
Title: Systematic Transmission With Fountain Parity Checks for Erasure Channels With Stop Feedback
This paper presents new achievability bounds on the maximal achievable rate of variable-length stop-feedback (VLSF) codes operating over a binary erasure channel (BEC) at a fixed message size M = 2^k. We provide bounds for two cases: the first considers VLSF codes with possibly infinite decoding times and zero error probability; the second limits the maximum (finite) number of decoding times and specifies a maximum tolerable probability of error. Both new achievability bounds are proved by constructing a new VLSF code that employs systematic transmission of the first k message bits followed by random linear fountain parity bits decoded with a rank decoder. For VLSF codes with infinite decoding times, our new bound outperforms the state-of-the-art result for the BEC by Devassy et al. in 2016. We show that the backoff from capacity reduces to zero as the erasure probability decreases, thus giving a negative answer to the open question posed by Devassy et al. on whether the 23.4% backoff from capacity at k = 3 is fundamental to all BECs. For VLSF codes with finite decoding times, numerical evaluations show that systematic transmission followed by random linear fountain coding achieves higher rates than random linear coding.
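As an illustration of the construction described in the abstract, the sketch below simulates systematic transmission of the k message bits over a BEC followed by random linear fountain parity bits, decoded by Gaussian elimination over GF(2) (a rank decoder). It is a minimal toy model written for this page, not the authors' code; the message size, erasure probability, and stopping logic are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def gf2_solve(A, b):
    """Gaussian elimination over GF(2); returns the message if A has full column rank, else None."""
    A, b = A.copy() % 2, b.copy() % 2
    n, k = A.shape
    row = 0
    for col in range(k):
        pivot = next((r for r in range(row, n) if A[r, col]), None)
        if pivot is None:
            return None                                   # rank deficient: keep collecting parity bits
        A[[row, pivot]], b[[row, pivot]] = A[[pivot, row]], b[[pivot, row]]
        for r in range(n):
            if r != row and A[r, col]:
                A[r] ^= A[row]
                b[r] ^= b[row]
        row += 1
    return b[:k]                                          # pivot of column i ends up in row i, so x = b[:k]

def simulate(k=20, eps=0.3, max_uses=10_000):
    """Systematic bits first, then random fountain parities until the received GF(2) system reaches rank k."""
    msg = rng.integers(0, 2, size=k, dtype=np.uint8)
    rows, vals, used = [], [], 0

    # Phase 1: systematic transmission of the k message bits over the BEC.
    for i in range(k):
        used += 1
        if rng.random() >= eps:                           # bit not erased
            e = np.zeros(k, dtype=np.uint8); e[i] = 1
            rows.append(e); vals.append(msg[i])

    # Phase 2: random linear fountain parity bits; the rank decoder stops once rank k is reached.
    while used < max_uses:
        if len(rows) >= k:
            est = gf2_solve(np.array(rows, dtype=np.uint8), np.array(vals, dtype=np.uint8))
            if est is not None:
                return used, bool(np.array_equal(est, msg))
        used += 1
        combo = rng.integers(0, 2, size=k, dtype=np.uint8)    # random combination of message bits
        if rng.random() >= eps:                           # parity bit not erased
            rows.append(combo); vals.append(np.uint8(combo @ msg % 2))
    return used, False

k = 20
uses, ok = simulate(k=k, eps=0.3)
print(f"decoded correctly: {ok}, channel uses: {uses}, empirical rate ≈ {k / uses:.3f}")
```

Averaging the number of channel uses over many runs and dividing k by that average gives an empirical estimate of the rate that the achievability bounds above characterize.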
Award ID(s):
1955660
PAR ID:
10540265
Author(s) / Creator(s):
;
Publisher / Repository:
IEEE
Date Published:
ISSN:
2157-8117
ISBN:
979-8-3503-8284-6
Page Range / eLocation ID:
516 to 521
Subject(s) / Keyword(s):
Codes; Systematic; Error probability; Symbols; Encoding; Parity check codes; Generators; Parity-check; Erasure Channel; Error Probability; Finite Time; Achievable Rate; Random Code; Infinite Time; Linear Code; Message Size; Binary Channel; Decoding Time; Random Variables; Vector-based; Time Instants; Stopping Rule; Rank Of Matrix; Natural Vector; Code Size
Format(s):
Medium: X
Location:
Athens, Greece
Sponsoring Org:
National Science Foundation
More Like this
  1.
    We study the problem of constructing codes that transform a channel with high bit error rate (BER) into one with low BER (at the expense of rate). Our focus is on obtaining codes with smooth (“graceful”) input-output BER curves (as opposed to the threshold-like curves typical of long error-correcting codes). This paper restricts attention to binary erasure channels (BEC) and contains two contributions. First, we introduce the notion of Low Density Majority Codes (LDMCs). These codes are non-linear sparse-graph codes, which output the majority function evaluated on randomly chosen small subsets of the data bits. This is similar to Low Density Generator Matrix codes (LDGMs), except that the XOR function is replaced with the majority (a toy encoding contrast appears after this list). We show that even with a few iterations of belief propagation (BP) the attained input-output curves provably improve upon the performance of any linear systematic code. The effect of nonlinearity bootstrapping the initial iterations of BP suggests that LDMCs should improve performance in various applications where LDGMs have traditionally been used. Second, we establish several two-point converse bounds that lower bound the BER achievable at one erasure probability as a function of the BER achieved at another. The novel feature of our bounds is that they are specific to subclasses of codes (linear systematic and non-linear systematic) and outperform similar bounds implied by the area theorem for the EXIT function.
    In this paper, we are interested in the performance of a variable-length stop-feedback (VLSF) code with m optimal decoding times for the binary-input additive white Gaussian noise channel. We first develop tight approximations to the tail probability of the length-n cumulative information density (a small Monte Carlo illustration of this quantity appears after this list). Building on the work of Yavas et al., for a given information density threshold, we formulate the integer program of minimizing the upper bound on average blocklength over all decoding times subject to the average error probability, minimum gap, and integer constraints. Minimizing the locally optimal upper bounds over all thresholds then yields the globally minimum upper bound; we call this procedure two-step minimization. Relaxing the decoding times to positive real values activates the gap constraint. We develop a gap-constrained sequential differential optimization (SDO) procedure to find the optimal, gap-constrained, real-valued decoding times. In the error regime of practical interest, Polyanskiy's scheme of stopping at zero does not help. In this regime, the achievability bounds estimated by the two-step minimization and gap-constrained SDO show that Polyanskiy's achievability bound for VLSF codes can be approached with a small number of decoding times.
    Horstein, Burnashev, Shayevitz and Feder, Naghshvar et al., and others have studied sequential transmission of a k-bit message over the binary symmetric channel (BSC) with full, noiseless feedback using posterior matching. Yang et al. provide an improved lower bound on the achievable rate using a martingale analysis that relies on the small-enough-difference (SED) partitioning introduced by Naghshvar et al. SED requires a relatively complex encoder and decoder. To reduce complexity, this paper replaces SED with relaxed constraints that admit the small-enough-absolute-difference (SEAD) partitioning rule (a toy sketch of the underlying posterior-balancing step appears after this list). The main analytical results show that achievable-rate bounds higher than those found by Yang et al. [2] are possible even under the new constraints, which are less restrictive than SED. The new analysis does not use martingale theory for the confirmation phase and applies a surrogate-channel technique to tighten the results. An initial systematic transmission further increases the achievable-rate bound. The simplified encoder associated with SEAD has complexity below order O(K^2) and allows simulations for message sizes of at least 1000 bits. For example, simulations achieve 99% of the channel's 0.50-bit capacity with an average block size of 200 bits for a target codeword error rate of 10^-3.
    Streaming codes eliminate the queueing delay and are an appealing candidate for low-latency communications. This work studies the tradeoff between error probability p_e and decoding deadline ∆ of infinite-memory random linear streaming codes (RLSCs) over i.i.d. symbol erasure channels (SECs). The contributions include (i) proving p_e(∆) ∼ ρ∆^{−1.5}e^{−η∆}; the asymptotic power term ∆^{−1.5} of RLSCs is a strict improvement over the ∆^{−0.5} term of random linear block codes; (ii) deriving a pair of upper and lower bounds on the asymptotic constant ρ, which are tight (i.e., identical) for one specific class of SECs; (iii) for any c > 1 and any decoding deadline ∆, the c-optimal memory length α^*_c(∆) is defined as the minimal memory length α needed for the resulting p_e to be within a factor of c of the best possible p^*_e under any α, an important piece of information for practical implementation (a compact restatement of this definition appears after this list). This work studies and derives new properties of α^*_c(∆) based on the newly developed asymptotics.
  5.
    This paper applies error-exponent and dispersion-style analyses to derive finite-blocklength achievability bounds for low-density parity-check (LDPC) codes over the point-to-point channel (PPC) and multiple access channel (MAC). The error-exponent analysis applies Gallager's error exponent to bound achievable symmetrical and asymmetrical rates in the MAC (a short numerical evaluation of this exponent appears after this list). The dispersion-style analysis begins with a generalization of the random coding union (RCU) bound from random code ensembles with i.i.d. codewords to random code ensembles in which codewords may be statistically dependent; this generalization is useful since the codewords of random linear codes such as LDPC codes are dependent. Application of the RCU bound yields finite-blocklength error bounds and asymptotic achievability results for both i.i.d. random codes and LDPC codes. For discrete, memoryless channels, these results show that LDPC codes achieve first- and second-order performance that is optimal for the PPC and identical to the best prior results for the MAC.
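For item 1, the toy encoders below contrast an LDMC output symbol (majority of a random small subset of data bits) with an LDGM output symbol (XOR of such a subset). The degree, data length, and number of outputs are illustrative assumptions, and the BP decoding over the resulting sparse graph, which the paper analyzes, is omitted.

```python
import numpy as np

rng = np.random.default_rng(1)

def ldgm_symbols(data, n_out, degree=3):
    """LDGM-style outputs: XOR (parity) of a random small subset of data bits."""
    idx = rng.integers(0, len(data), size=(n_out, degree))
    return (data[idx].sum(axis=1) % 2).astype(np.uint8)

def ldmc_symbols(data, n_out, degree=3):
    """LDMC-style outputs: majority vote over a random small subset of data bits (odd degree avoids ties)."""
    idx = rng.integers(0, len(data), size=(n_out, degree))
    return (2 * data[idx].sum(axis=1) > degree).astype(np.uint8)

data = rng.integers(0, 2, size=1000, dtype=np.uint8)
print("LDGM:", ldgm_symbols(data, 8))
print("LDMC:", ldmc_symbols(data, 8))
```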
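For item 2, the Monte Carlo snippet below tracks the length-n cumulative information density of a binary-input AWGN channel and records the first time it crosses a fixed threshold, which is the ingredient behind the stopping rules and tail approximations in that abstract. The SNR, threshold, and trial count are illustrative assumptions, and none of the integer-programming or SDO machinery is reproduced here.

```python
import numpy as np

rng = np.random.default_rng(2)

def stopping_times(snr_db=0.0, threshold=50.0, n_max=2000, trials=5000):
    """First crossing time of the cumulative information density for BPSK over AWGN."""
    sigma = 10.0 ** (-snr_db / 20.0)                        # noise std for unit-energy BPSK, SNR = 1/sigma^2
    y = 1.0 + sigma * rng.standard_normal((trials, n_max))  # transmit all +1 w.l.o.g. (symmetric channel)
    # Per-symbol information density in bits: i = 1 - log2(1 + exp(-2y/sigma^2)).
    inc = 1.0 - np.logaddexp(0.0, -2.0 * y / sigma**2) / np.log(2.0)
    cum = np.cumsum(inc, axis=1)
    crossed = cum >= threshold
    return np.where(crossed.any(axis=1), crossed.argmax(axis=1) + 1, n_max)

tau = stopping_times()
print(f"mean stopping time ≈ {tau.mean():.1f} symbols, P(tau > 150) ≈ {(tau > 150).mean():.3f}")
```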
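For item 3, the SED and SEAD rules both ask the encoder to split the current message posteriors into two sets of nearly equal probability mass and to transmit the index of the set containing the true message. The greedy split below is only a toy illustration of that balancing step under these assumptions; it is not the paper's SEAD partitioning or its low-complexity implementation.

```python
import numpy as np

def greedy_balance(posteriors):
    """Split indices into two sets whose posterior masses are as close to 1/2 as a greedy pass allows."""
    order = np.argsort(posteriors)[::-1]          # consider the largest posteriors first
    s0, s1, mass0 = [], [], 0.0
    for i in order:
        if mass0 + posteriors[i] <= 0.5 + 1e-12:
            s0.append(int(i)); mass0 += posteriors[i]
        else:
            s1.append(int(i))
    return s0, s1, mass0

p = np.array([0.40, 0.25, 0.20, 0.10, 0.05])
s0, s1, m0 = greedy_balance(p)
print(s0, s1, f"|P(S0) - P(S1)| = {abs(2.0 * m0 - 1.0):.3f}")
# The encoder sends 0 if the true message lies in S0 and 1 otherwise; both ends then
# update the posteriors with Bayes' rule using the BSC crossover probability.
```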
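For item 4, the c-optimal memory length defined in words there can be restated compactly as follows, with p_e(α, ∆) denoting the error probability of an RLSC with memory length α and decoding deadline ∆ (this is only a symbolic restatement of the verbal definition):

```latex
p_e^{*}(\Delta) = \min_{\alpha}\, p_e(\alpha,\Delta),
\qquad
\alpha_c^{*}(\Delta) = \min\left\{\alpha : p_e(\alpha,\Delta) \le c\, p_e^{*}(\Delta)\right\},
\qquad c > 1.
```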
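For item 5, the error-exponent part of the analysis builds on Gallager's random-coding exponent E_r(R) = max_{0 ≤ ρ ≤ 1} [E_0(ρ) − ρR]. The snippet below evaluates it for a generic discrete memoryless channel, with a BSC example; the crossover probability, rate, blocklength, and ρ grid are illustrative assumptions, and the paper's RCU generalization and MAC/LDPC-specific refinements are not reproduced.

```python
import numpy as np

def gallager_E0(rho, W, Q):
    """Gallager's E0(rho) in bits for a DMC W[x, y] (rows = inputs) and input distribution Q."""
    inner = (Q[:, None] * W ** (1.0 / (1.0 + rho))).sum(axis=0)   # sum over inputs, one value per output
    return -np.log2((inner ** (1.0 + rho)).sum())

def random_coding_exponent(R, W, Q, grid=1001):
    """E_r(R) = max over rho in [0, 1] of E0(rho) - rho * R, in bits per channel use."""
    return max(gallager_E0(rho, W, Q) - rho * R for rho in np.linspace(0.0, 1.0, grid))

# Example: BSC with crossover 0.05, uniform inputs, rate 0.5 bits per channel use.
p, R, n = 0.05, 0.5, 500
W = np.array([[1.0 - p, p], [p, 1.0 - p]])
Q = np.array([0.5, 0.5])
Er = random_coding_exponent(R, W, Q)
print(f"E_r({R}) ≈ {Er:.4f} bits; random-coding bound P_e <= 2^(-n E_r) ≈ {2.0 ** (-n * Er):.2e} at n = {n}")
```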