Streaming codes take a string of source symbols as input and output a string of coded symbols in real time, which effectively eliminates the queueing delay; they are therefore regarded as a promising scheme for low-latency communications. Aiming to quantify the fundamental latency performance of random linear streaming codes (RLSCs) over i.i.d. symbol erasure channels, this work derives the exact error probability under simultaneous finite-memory-length and finite-decoding-deadline constraints. The result is then used to examine, for the first time in the literature, the tradeoff among memory length (complexity), decoding deadline (delay), and error probability (reliability) of RLSCs. Two critical observations are made: (i) too much memory can adversely impact performance under a finite decoding deadline constraint, a surprising finding not captured by the traditional wisdom that larger memory length monotonically improves performance in the asymptotic regime; (ii) under identical code rate and error probability requirements, the end-to-end delay of an RLSC is roughly 50% of that of an MDS block code. This implies that switching from block codes to RLSCs not only eliminates the queueing delay (hence the 50%) but also has little negative impact on the error probability.
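As a point of reference for the MDS block-code baseline compared against above, here is a minimal sketch (with illustrative parameters, not values from the paper) of the block-code error probability over an i.i.d. symbol erasure channel: an (n, k) MDS code fails to decode exactly when more than n − k of the n symbols are erased, which can be computed exactly and cross-checked by Monte Carlo.

```python
import math
import random

def mds_failure_exact(n, k, eps):
    """P(more than n - k of n symbols erased) for an (n, k) MDS block code
    over an i.i.d. symbol erasure channel with erasure probability eps."""
    return sum(math.comb(n, e) * eps**e * (1 - eps)**(n - e)
               for e in range(n - k + 1, n + 1))

def mds_failure_mc(n, k, eps, trials=200_000, seed=0):
    """Monte Carlo estimate of the same failure probability."""
    rng = random.Random(seed)
    fails = sum(sum(rng.random() < eps for _ in range(n)) > n - k
                for _ in range(trials))
    return fails / trials

# Illustrative parameters only: a rate-2/3 code at 10% erasure probability.
exact = mds_failure_exact(12, 8, 0.1)
estimate = mds_failure_mc(12, 8, 0.1)
```

The Monte Carlo estimate converges to the exact binomial tail, which is the baseline against which a streaming code's deadline-constrained error probability would be compared.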
Detailed Asymptotics of the Delay-Reliability Tradeoff of Random Linear Streaming Codes
Streaming codes eliminate the queueing delay and are an appealing candidate for low-latency communications. This work studies the tradeoff between the error probability p_e and the decoding deadline ∆ of infinite-memory random linear streaming codes (RLSCs) over i.i.d. symbol erasure channels (SECs). The contributions include: (i) proving p_e(∆) ∼ ρ∆^{−1.5}e^{−η∆}; the asymptotic power term ∆^{−1.5} of RLSCs is a strict improvement over the ∆^{−0.5} term of random linear block codes; (ii) deriving a pair of upper and lower bounds on the asymptotic constant ρ, which are tight (i.e., identical) for one specific class of SECs; (iii) for any c > 1 and any decoding deadline ∆, defining the c-optimal memory length α^*_c(∆) as the minimal memory length α needed for the resulting p_e to be within a factor of c of the best possible p^*_e over all α, an important piece of information for practical implementation. This work studies and derives new properties of α^*_c(∆) based on the newly developed asymptotics.
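The two asymptotic forms above can be contrasted numerically. The sketch below uses placeholder values for ρ and η (the paper derives bounds on ρ; the constants here are purely illustrative) to show that, since the exponential factor is common, the streaming-to-block error-probability ratio shrinks like ∆^{−1.5}/∆^{−0.5} = 1/∆.

```python
import math

def pe_asym(delta, power, rho=1.0, eta=0.1):
    """Hedged model of the asymptotic form rho * delta^power * exp(-eta*delta).
    rho and eta are illustrative placeholders, not values from the paper."""
    return rho * delta**power * math.exp(-eta * delta)

# Streaming (power -1.5) vs block (power -0.5): the exponential cancels,
# so the ratio is exactly 1/delta.
for delta in (10, 100, 1000):
    ratio = pe_asym(delta, -1.5) / pe_asym(delta, -0.5)
```

The widening gap as ∆ grows is the sense in which the ∆^{−1.5} term is a strict improvement over ∆^{−0.5}.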
- Award ID(s):
- 1816013
- PAR ID:
- 10474214
- Publisher / Repository:
- IEEE
- Date Published:
- ISBN:
- 978-1-6654-7554-9
- Page Range / eLocation ID:
- 1124 to 1129
- Format(s):
- Medium: X
- Location:
- Taipei, Taiwan
- Sponsoring Org:
- National Science Foundation
More Like this
-
We introduce a novel family of expander-based error correcting codes. These codes can be sampled with randomness linear in the block-length, and achieve list decoding capacity (among other local properties). Our expander-based codes can be made starting from any family of sufficiently low-bias codes, and as a consequence, we give the first construction of a family of algebraic codes that can be sampled with linear randomness and achieve list-decoding capacity. We achieve this by introducing the notion of a pseudorandom puncturing of a code, where we select n indices of a base code C ⊂ 𝔽_q^m in a correlated fashion. Concretely, whereas a random linear code (i.e. a truly random puncturing of the Hadamard code) requires O(n log(m)) random bits to sample, we sample a pseudorandom linear code with O(n + log(m)) random bits by instantiating our pseudorandom puncturing as a length n random walk on an expander graph on [m]. In particular, we extend a result of Guruswami and Mosheiff (FOCS 2022) and show that a pseudorandom puncturing of a small-bias code satisfies the same local properties as a random linear code with high probability. As a further application of our techniques, we also show that pseudorandom puncturings of Reed-Solomon codes are list-recoverable beyond the Johnson bound, extending a result of Lund and Potukuchi (RANDOM 2020). We do this by instead analyzing properties of codes with large distance, and show that pseudorandom puncturings still work well in this regime.
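The randomness savings claimed above (O(n log m) bits for independent uniform indices vs O(n + log m) for an expander walk) can be sketched with a simple bit count, assuming a d-regular expander with constant degree d. The function names and parameter values below are illustrative, not from the paper.

```python
import math

def bits_uniform(n, m):
    """Random bits to sample n i.i.d. uniform indices from [m]:
    ceil(log2 m) bits per index."""
    return n * math.ceil(math.log2(m))

def bits_walk(n, m, d):
    """Random bits for a length-n walk on a d-regular graph on [m]:
    one uniform starting vertex plus n - 1 neighbor choices."""
    return math.ceil(math.log2(m)) + (n - 1) * math.ceil(math.log2(d))

# Illustrative sizes: n = 1024 punctured positions, base length m = 2^20,
# expander degree d = 8. The walk needs log m + (n-1) log d bits in total.
uniform_cost = bits_uniform(1024, 2**20)   # 1024 * 20 bits
walk_cost = bits_walk(1024, 2**20, 8)      # 20 + 1023 * 3 bits
```

For constant d, the walk cost is O(n + log m), which is linear in n once m is fixed, matching the "linear randomness" claim.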
-
List-decodability of Reed-Solomon codes has received a lot of attention, but the best-possible dependence between the parameters is still not well-understood. In this work, we focus on the case where the list-decoding radius is of the form r=1−ε for ε tending to zero. Our main result states that there exist Reed-Solomon codes with rate Ω(ε) which are (1−ε,O(1/ε))-list-decodable, meaning that any Hamming ball of radius 1−ε contains at most O(1/ε) codewords. This trade-off between rate and list-decoding radius is best-possible for any code with list size less than exponential in the block length. By achieving this trade-off between rate and list-decoding radius we improve a recent result of Guo, Li, Shangguan, Tamo, and Wootters, and resolve the main motivating question of their work. Moreover, while their result requires the field to be exponentially large in the block length, we only need the field size to be polynomially large (and in fact, almost-linear suffices). We deduce our main result from a more general theorem, in which we prove good list-decodability properties of random puncturings of any given code with very large distance.
-
This paper presents new achievability bounds on the maximal achievable rate of variable-length stop-feedback (VLSF) codes operating over a binary erasure channel (BEC) at a fixed message size M=2^k. We provide bounds for two cases: The first case considers VLSF codes with possibly infinite decoding times and zero error probability. The second case limits the maximum (finite) number of decoding times and specifies a maximum tolerable probability of error. Both new achievability bounds are proved by constructing a new VLSF code that employs systematic transmission of the first k message bits followed by random linear fountain parity bits decoded with a rank decoder. For VLSF codes with infinite decoding times, our new bound outperforms the state-of-the-art result for BEC by Devassy et al. in 2016. We show that the backoff from capacity reduces to zero as the erasure probability decreases, thus giving a negative answer to the open question Devassy et al. posed on whether the 23.4% backoff to capacity at k=3 is fundamental to all BECs. For VLSF codes with finite decoding times, numerical evaluations show that the systematic transmission followed by random linear fountain coding performs better than random linear coding in terms of achievable rates.
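A minimal sketch of the rank-decoder idea described above, assuming a BEC with erasure probability eps: transmit the k systematic bits first, then random linear fountain parities, and declare success as soon as the received rows span GF(2)^k. Rows are represented as integer bitmasks; all parameter values are illustrative.

```python
import random

def gf2_rank(rows):
    """Rank over GF(2) of rows given as int bitmasks, via a reduced basis
    in which every element has a distinct leading bit."""
    basis = []  # kept sorted descending so reduction proceeds top-down
    for r in rows:
        for b in basis:
            if r ^ b < r:   # b's leading bit is set in r: reduce
                r ^= b
        if r:
            basis.append(r)
            basis.sort(reverse=True)
    return len(basis)

def vlsf_decode_time(k, eps, seed=0, max_tx=10_000):
    """Monte Carlo: number of BEC uses until the received rows
    (systematic bits + random fountain parities) have rank k,
    i.e. until a rank decoder can recover the message."""
    rng = random.Random(seed)
    received = []
    for t in range(1, max_tx + 1):
        if t <= k:
            row = 1 << (t - 1)             # systematic bit e_t
        else:
            row = rng.getrandbits(k) or 1  # random nonzero parity combination
        if rng.random() >= eps:            # symbol not erased
            received.append(row)
        if gf2_rank(received) == k:
            return t
    return max_tx
```

With no erasures the decoder stops exactly at t = k; erasures and linearly dependent parities push the stopping time higher, which is the quantity the achievability bounds control.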
-
We consider the problem of coded distributed computing where a large linear computational job, such as a matrix multiplication, is divided into k smaller tasks, encoded using an (n, k) linear code, and performed over n distributed nodes. The goal is to reduce the average execution time of the computational job. We provide a connection between the problem of characterizing the average execution time of a coded distributed computing system and the problem of analyzing the error probability of codes of length n used over erasure channels. Accordingly, we present closed-form expressions for the execution time using binary random linear codes and the best execution time any linear-coded distributed computing system can achieve. It is also shown that there exist good binary linear codes that attain, asymptotically, the best performance any linear code, not necessarily binary, can achieve. We also investigate the performance of coded distributed computing systems using polar and Reed-Muller (RM) codes that can benefit from low-complexity decoding, and superior performance, respectively, as well as explicit constructions. The proposed framework in this paper can enable efficient designs of distributed computing systems given the rich literature in the channel coding theory.
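The connection between coded computing and erasure-code analysis can be sketched via order statistics: with an (n, k) MDS code, the job completes when the fastest k of the n coded tasks finish, so the average execution time is the mean k-th order statistic of the node completion times. The shifted-exponential runtime model and its parameters below are illustrative assumptions, not from the paper.

```python
import random

def avg_job_time_mds(n, k, rate=1.0, shift=1.0, trials=20_000, seed=0):
    """Monte Carlo average of the k-th order statistic of n i.i.d.
    shifted-exponential node completion times: with an (n, k) MDS code,
    the job finishes once any k of the n coded tasks complete."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        times = sorted(shift + rng.expovariate(rate) for _ in range(n))
        total += times[k - 1]   # time at which the k-th fastest node finishes
    return total / trials
```

For the unit-rate exponential, the k-th order statistic has mean shift + (H_n − H_{n−k}), so the simulation can be checked against the harmonic-sum formula; coding (n > k) strictly reduces the average job time relative to the uncoded case n = k.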