

Title: On Locally Decodable Codes in Resource Bounded Channels
Constructions of locally decodable codes (LDCs) have one of two undesirable properties: low rate or high locality (polynomial in the length of the message). In settings where the encoder/decoder have already exchanged cryptographic keys and the channel is a probabilistic polynomial time (PPT) algorithm, it is possible to circumvent these barriers and design LDCs with constant rate and small locality. However, the assumption that the encoder/decoder have exchanged cryptographic keys is often prohibitive. We thus consider the problem of designing explicit and efficient LDCs in settings where the channel is slightly more constrained than the encoder/decoder with respect to some resource, e.g., space or (sequential) time. Given an explicit function f that the channel cannot compute, we show how the encoder can transmit a random secret key to the local decoder using f(⋅) and a random oracle 𝖧(⋅). We then bootstrap the private key LDC construction of Ostrovsky, Pandey and Sahai (ICALP, 2007), thereby answering an open question posed by Guruswami and Smith (FOCS 2010) of whether such bootstrapping techniques are applicable to LDCs in channel models weaker than just PPT algorithms. Specifically, in the random oracle model we show how to construct explicit, constant-rate LDCs with locality polylogarithmic in the security parameter against various resource-constrained channels.
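To make the key-transfer idea concrete, the following minimal Python sketch (an illustration only, not the paper's construction) derives a one-time key from f(seed): SHA-256 stands in for the random oracle 𝖧(⋅), and iterated hashing is a hypothetical stand-in for a function f that a resource-constrained channel is assumed unable to evaluate during transmission.

# Illustrative sketch only: deriving a shared one-time key from a function f
# that the channel is assumed unable to compute within its resource budget.
# SHA-256 stands in for the random oracle H(.), and iterated hashing is a
# placeholder for a sequentially hard f; neither choice comes from the paper.
import hashlib
import os

def H(data: bytes) -> bytes:
    """Stand-in for the random oracle H(.)."""
    return hashlib.sha256(data).digest()

def f(seed: bytes, t: int = 100_000) -> bytes:
    """Toy stand-in for a hard-to-compute function: t iterations of hashing.
    The (hypothetical) assumption is that the channel cannot finish this
    computation within its resource bound."""
    y = seed
    for _ in range(t):
        y = hashlib.sha256(y).digest()
    return y

def encoder_key(seed: bytes) -> bytes:
    # Encoder derives a one-time key from f(seed) via the random oracle.
    return H(f(seed))

def decoder_key(seed: bytes) -> bytes:
    # The decoder runs after transmission and is not bound by the channel's
    # resource constraint, so it can recompute f(seed) and the same key.
    return H(f(seed))

if __name__ == "__main__":
    seed = os.urandom(16)   # transmitted in the clear alongside the codeword
    assert encoder_key(seed) == decoder_key(seed)

Because decoding happens after transmission, the decoder is not subject to the channel's resource bound and can recompute f(seed) to recover the same key, while the channel cannot.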
Award ID(s):
1910659 1910411
NSF-PAR ID:
10200738
Author(s) / Creator(s):
Editor(s):
Yael Tauman Kalai and Adam D. Smith and Daniel Wichs
Date Published:
Journal Name:
Leibniz international proceedings in informatics
Volume:
163
ISSN:
1868-8969
Page Range / eLocation ID:
16:1--16:23
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. We study common randomness where two parties have access to i.i.d. samples from a known random source, and wish to generate a shared random key using limited (or no) communication with the largest possible probability of agreement. This problem is at the core of secret key generation in cryptography, with connections to communication under uncertainty and locality-sensitive hashing. We take the approach of treating correlated sources as a critical resource, and ask whether common randomness can be generated resource-efficiently. We consider two notable sources in this setup arising from correlated bits and correlated Gaussians. We design the first explicit schemes that use only a polynomial number of samples (in the key length) so that the players can generate shared keys that agree with constant probability using optimal communication. The best previously known schemes were both non-constructive and used an exponential number of samples. In the amortized setting, we characterize the largest achievable ratio of key length to communication in terms of the external and internal information costs, two well-studied quantities in theoretical computer science. In the relaxed setting where the two parties merely wish to improve the correlation between the generated keys of length k, we show that there are no interactive protocols using o(k) bits of communication having agreement probability even as small as 2^{-o(k)}. For the related communication problem where the players wish to compute a joint function f of their inputs using i.i.d. samples from a known source, we give a simultaneous message passing protocol using 2^{O(c)} bits where c is the interactive randomized public-coin communication complexity of f. This matches the lower bound shown previously while the best previously known upper bound was doubly exponential in c. Our schemes reveal a new connection between common randomness and unbiased error-correcting codes, e.g., dual-BCH codes and their analogues in Euclidean space. Read More: https://epubs.siam.org/doi/10.1137/1.9781611975031.120
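    As a toy illustration of the locality-sensitive-hashing flavor of zero-communication agreement (not the explicit schemes constructed in this work), the Python sketch below has two parties round correlated bit strings to the nearest codeword of a small shared random code and output its index as the key; all parameters are arbitrary demo choices.

# Toy illustration: non-interactive agreement from correlated bits by rounding
# each party's sample to the nearest codeword of a shared random code.
import random

N, NUM_CODEWORDS, NOISE, TRIALS = 32, 4, 0.05, 2000

def nearest(x, code):
    # Index of the codeword closest to x in Hamming distance.
    return min(range(len(code)),
               key=lambda i: sum(a != b for a, b in zip(code[i], x)))

rng = random.Random(0)
# Shared public randomness: a small random code known to both parties.
code = [[rng.randint(0, 1) for _ in range(N)] for _ in range(NUM_CODEWORDS)]

agree = 0
for _ in range(TRIALS):
    source = [rng.randint(0, 1) for _ in range(N)]
    alice = [b ^ (rng.random() < NOISE) for b in source]   # Alice's noisy view
    bob   = [b ^ (rng.random() < NOISE) for b in source]   # Bob's noisy view
    agree += nearest(alice, code) == nearest(bob, code)

print(f"zero-communication agreement probability ~ {agree / TRIALS:.2f}")

    Because the two samples are close in Hamming distance, the nearest-codeword map usually sends them to the same codeword, which is the locality-sensitive hashing connection mentioned above.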
  2. We construct locally decodable codes (LDCs) to correct insertion-deletion errors in the setting where the sender and receiver share a secret key or where the channel is resource-bounded. Our constructions rely on a so-called "Hamming-to-InsDel" compiler (Ostrovsky and Paskin-Cherniavsky, ITS '15 and Block et al., FSTTCS '20), which compiles any locally decodable Hamming code into a locally decodable code resilient to insertion-deletion (InsDel) errors. While the compilers were designed for the classical coding setting, we show that they still work in a secret-key or resource-bounded setting. Applying our results to the private-key Hamming LDC of Ostrovsky, Pandey, and Sahai (ICALP '07), we obtain a private-key InsDel LDC with constant rate and polylogarithmic locality. Applying our results to the construction of Blocki, Kulkarni, and Zhou (ITC '20), we obtain similar results for resource-bounded channels, i.e., channels where computation is constrained by resources such as space or time.
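    The toy Python sketch below conveys only the layout idea behind such a compiler: blocks of the Hamming codeword are tagged with indices and separated by long buffer runs so a decoder can re-locate an individual block after insertions and deletions. It is an illustration, not the cited compilers, which protect headers and buffers much more carefully and come with actual rate and locality guarantees; block and buffer sizes here are arbitrary.

# Toy illustration of a "Hamming-to-InsDel"-style layout (not the actual compiler).
BLOCK, BUF = 8, 12   # bits per block, zeros per buffer (demo choices)

def compile_to_insdel(hamming_codeword):
    """Tag each block with its index and separate blocks with buffer runs."""
    blocks = [hamming_codeword[i:i + BLOCK]
              for i in range(0, len(hamming_codeword), BLOCK)]
    out = []
    for idx, blk in enumerate(blocks):
        header = format(idx, "08b")             # fixed-width block index
        out.append("0" * BUF + "1" + header + blk)
    return "".join(out) + "0" * BUF

def find_block(received, idx):
    """Scan for buffer runs and return the payload of block idx, or None.
    A real compiler would also error-correct the headers; this scan only
    conveys how buffers let the decoder re-synchronize after InsDel errors."""
    i, run = 0, 0
    while i < len(received):
        if received[i] == "0":
            run += 1
        else:
            if run >= BUF - 2:                  # tolerate a slightly damaged buffer
                header = received[i + 1:i + 9]
                if len(header) == 8 and int(header, 2) == idx:
                    return received[i + 9:i + 9 + BLOCK]
            run = 0
        i += 1
    return None

    For example, after a few symbols are deleted from the middle of the compiled string, find_block can still recover blocks whose local neighborhood is intact, because their buffers and index headers survive.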
  3. We formally introduce, define, and construct memory-hard puzzles. Intuitively, for a difficulty parameter t, a cryptographic puzzle is memory-hard if any parallel random access machine (PRAM) algorithm with "small" cumulative memory complexity (≪ t^2) cannot solve the puzzle; moreover, such puzzles should be both "easy" to generate and be solvable by a sequential RAM algorithm running in time t. Our definitions and constructions of memory-hard puzzles are in the standard model, assuming the existence of indistinguishability obfuscation (iO) and one-way functions (OWFs), and additionally assuming the existence of a memory-hard language. Intuitively, a language is memory-hard if it is undecidable by any PRAM algorithm with "small" cumulative memory complexity, while a sequential RAM algorithm running in time t can decide the language. Our definitions and constructions of memory-hard objects are the first such definitions and constructions in the standard model without relying on idealized assumptions (such as random oracles). We give two applications which highlight the utility of memory-hard puzzles. For our first application, we give a construction of a (one-time) memory-hard function (MHF) in the standard model, using memory-hard puzzles and additionally assuming iO and OWFs. For our second application, we show any cryptographic puzzle (e.g., memory-hard, time-lock) can be used to construct resource-bounded locally decodable codes (LDCs) in the standard model, answering an open question of Blocki, Kulkarni, and Zhou (ITC 2020). Resource-bounded LDCs achieve better rate and locality than their classical counterparts under the assumption that the adversarial channel is resource bounded (e.g., a low-depth circuit). Prior constructions of MHFs and resource-bounded LDCs required idealized primitives like random oracles.
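    As a rough sketch of the interface such a puzzle exposes, the Python stub below pairs a cheap generation procedure with a sequential solver running in time roughly t; the comments record the memory-hardness requirement stated in the abstract. The iterated-hashing instantiation is only a runnable stand-in and is not memory-hard.

# Sketch of a cryptographic puzzle interface (Gen, Solve).  The toy solver
# below (iterated hashing) is NOT memory-hard; it only makes the stub runnable.
import hashlib
from dataclasses import dataclass

@dataclass
class Puzzle:
    seed: bytes
    difficulty: int   # the difficulty parameter t

def gen(seed: bytes, t: int) -> Puzzle:
    """Generation should be 'easy', i.e., much cheaper than solving."""
    return Puzzle(seed, t)

def solve(p: Puzzle) -> bytes:
    """A sequential RAM algorithm solves the puzzle in time roughly t.
    Memory-hardness would additionally require that no PRAM algorithm with
    cumulative memory complexity much smaller than t^2 finds the solution;
    iterated hashing does not have that property."""
    y = p.seed
    for _ in range(p.difficulty):
        y = hashlib.sha256(y).digest()
    return y

if __name__ == "__main__":
    p = gen(b"example-seed", t=10_000)
    print(solve(p).hex()[:16])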
  4. Locally Decodable Codes (LDCs) are error-correcting codes for which individual message symbols can be quickly recovered despite errors in the codeword. LDCs for Hamming errors have been studied extensively in the past few decades, where a major goal is to understand the amount of redundancy that is necessary and sufficient to decode from large amounts of error, with small query complexity. Despite exciting progress, we still don't have satisfactory answers in several important parameter regimes. For example, in the case of 3-query LDCs, the gap between existing constructions and lower bounds is superpolynomial in the message length. In this work we study LDCs for insertion and deletion errors, called Insdel LDCs. Their study was initiated by Ostrovsky and Paskin-Cherniavsky (Information Theoretic Security, 2015), who gave a reduction from Hamming LDCs to Insdel LDCs with a small blowup in the code parameters. On the other hand, the only known lower bounds for Insdel LDCs come from those for Hamming LDCs, and thus there is no separation between them. Here we prove new, strong lower bounds for the existence of Insdel LDCs. In particular, we show that 2-query linear Insdel LDCs do not exist, and give an exponential lower bound for the length of all q-query Insdel LDCs with constant q. For q ≥ 3 our bounds are exponential in the existing lower bounds for Hamming LDCs. Furthermore, our exponential lower bounds continue to hold for adaptive decoders, and even in private-key settings where the encoder and decoder share secret randomness. This exhibits a strict separation between Hamming LDCs and Insdel LDCs. Our strong lower bounds also hold for the related notion of Insdel LCCs (except in the private-key setting), via an Insdel analogue of the reduction from Hamming LCCs to Hamming LDCs. Our techniques are based on a delicate design and analysis of hard distributions of insertion and deletion errors, which depart significantly from typical techniques used in analyzing Hamming LDCs.
  5. Error-correcting codes that admit local decoding and correcting algorithms have been the focus of much recent research due to their numerous theoretical and practical applications. An important goal is to obtain the best possible tradeoffs between the number of queries the algorithm makes to its oracle (the locality of the task) and the amount of redundancy in the encoding (the information rate). In Hamming's classical adversarial channel model, the current tradeoffs are dramatic, allowing either small locality but superpolynomial blocklength, or small blocklength but high locality. However, in the computationally bounded, adversarial channel model proposed by Lipton (STACS 1994), constructions of locally decodable codes suddenly exhibit small locality and small blocklength, but these constructions require strong trusted setup assumptions; e.g., Ostrovsky, Pandey and Sahai (ICALP 2007) construct private locally decodable codes in the setting where the sender and receiver already share a symmetric key. We study variants of locally decodable and locally correctable codes in computationally bounded, adversarial channels, in a setting with no public-key or private-key cryptographic setup. The only setup assumption we require is the selection of the public parameters (seed) for a collision-resistant hash function. Specifically, we provide constructions of relaxed locally correctable and relaxed locally decodable codes over the binary alphabet, with constant information rate and poly-logarithmic locality. Our constructions, which compare favorably with their classical analogues in the computationally unbounded Hamming channel, crucially employ collision-resistant hash functions and local expander graphs, extending ideas from recent cryptographic constructions of memory-hard functions.
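    As a generic illustration of how a collision-resistant hash function can support local integrity checks (a textbook-style example, not this paper's construction, which additionally relies on local expander graphs), the Python sketch below commits to codeword blocks with a Merkle tree whose root plays the role of the public parameters; a local decoder can then verify the few blocks it reads against that root. SHA-256 stands in for the collision-resistant hash.

# Generic illustration: authenticating codeword blocks with a Merkle tree so a
# local decoder can detect corrupted blocks by checking against a public root.
import hashlib

def h(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

def merkle_tree(blocks):
    level = [h(b) for b in blocks]        # assumes len(blocks) is a power of 2
    tree = [level]
    while len(level) > 1:
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        tree.append(level)
    return tree                           # tree[-1][0] is the public root

def open_block(tree, i):
    # Authentication path for block i: one sibling hash per level.
    path = []
    for level in tree[:-1]:
        path.append(level[i ^ 1])
        i //= 2
    return path

def verify_block(root, block, i, path):
    # Recompute the root from the block and its path; reject on mismatch.
    cur = h(block)
    for sib in path:
        cur = h(cur + sib) if i % 2 == 0 else h(sib + cur)
        i //= 2
    return cur == root

if __name__ == "__main__":
    blocks = [bytes([i]) * 8 for i in range(8)]
    tree = merkle_tree(blocks)
    assert verify_block(tree[-1][0], blocks[3], 3, open_block(tree, 3))

    The authentication path has length logarithmic in the number of blocks, which is why this style of check is compatible with poly-logarithmic locality.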