In a recent work (Ghazi et al., SODA 2016), the authors, together with Komargodski and Kothari, initiated the study of communication with contextual uncertainty, a setup aiming to understand how efficient communication is possible when the communicating parties imperfectly share a huge context. In this setting, Alice is given a function f and an input string x, and Bob is given a function g and an input string y. The pair (x, y) comes from a known distribution mu, and f and g are guaranteed to be close under this distribution. Alice and Bob wish to compute g(x, y) with high probability. The lack of agreement between Alice and Bob on the function that is being computed captures the uncertainty in the context. The previous work showed that any problem with one-way communication complexity k in the standard model (i.e., without uncertainty; in other words, under the promise that f = g) has public-coin communication at most O(k(1+I)) bits in the uncertain case, where I is the mutual information between x and y. Moreover, a lower bound of Omega(sqrt{I}) bits on the public-coin uncertain communication was also shown. However, an important question that was left open is related to the power that public randomness brings…
Resource-Efficient Common Randomness and Secret-Key Schemes
We study common randomness, where two parties have access to i.i.d. samples from a known random source and wish to generate a shared random key using limited (or no) communication with the largest possible probability of agreement. This problem is at the core of secret-key generation in cryptography, with connections to communication under uncertainty and locality-sensitive hashing. We take the approach of treating correlated sources as a critical resource, and ask whether common randomness can be generated resource-efficiently.
We consider two notable sources in this setup, arising from correlated bits and correlated Gaussians. We design the first explicit schemes that use only a polynomial number of samples (in the key length) so that the players can generate shared keys that agree with constant probability using optimal communication. The best previously known schemes were both non-constructive and used an exponential number of samples. In the amortized setting, we characterize the largest achievable ratio of key length to communication in terms of the external and internal information costs, two well-studied quantities in theoretical computer science. In the relaxed setting where the two parties merely wish to improve the correlation between the generated keys of length k, we show that there are…
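The correlated-Gaussian source above admits a simple zero-communication baseline worth keeping in mind: each party outputs the sign of its own sample. The following is a minimal sketch (our own illustration, not one of the paper's schemes), relying on the standard fact that two standard Gaussians with correlation rho have matching signs with probability 1 - arccos(rho)/pi:

```python
import math
import random

def correlated_gaussians(rho, n, seed=0):
    """Sample n pairs (x, y) of standard Gaussians with correlation rho."""
    rng = random.Random(seed)
    pairs = []
    for _ in range(n):
        x = rng.gauss(0, 1)
        z = rng.gauss(0, 1)
        y = rho * x + math.sqrt(1 - rho * rho) * z  # Corr(x, y) = rho
        pairs.append((x, y))
    return pairs

def sign_agreement_rate(rho, n=20000):
    """Fraction of pairs on which sign(x) == sign(y), i.e. the two
    parties' one-bit keys agree. Theory: 1 - arccos(rho)/pi."""
    pairs = correlated_gaussians(rho, n)
    agree = sum((x >= 0) == (y >= 0) for x, y in pairs)
    return agree / n

empirical = sign_agreement_rate(0.9)
theory = 1 - math.acos(0.9) / math.pi  # roughly 0.856
```

Extending the key by taking signs of k independent pairs makes the full keys agree with probability only (1 - arccos(rho)/pi)^k, which decays exponentially in k; this is one reason non-trivial schemes and communication are needed for long keys.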
 Publication Date:
 NSF-PAR ID:
 10075659
 Journal Name:
 Proceedings of the Annual ACM-SIAM Symposium on Discrete Algorithms
 Page Range or eLocation-ID:
 1834-1853
 ISSN:
 1071-9040
 Sponsoring Org:
 National Science Foundation
More Like this


We study the role of interaction in the Common Randomness Generation (CRG) and Secret Key Generation (SKG) problems. In the CRG problem, two players, Alice and Bob, respectively get samples X1, X2, . . . and Y1, Y2, . . . with the pairs (X1, Y1), (X2, Y2), . . . being drawn independently from some known probability distribution μ. They wish to communicate so as to agree on L bits of randomness. The SKG problem is the restriction of the CRG problem to the case where the key is required to be close to random even to an eavesdropper who can listen to their communication (but does not have access to the inputs of Alice and Bob). In this work, we study the relationship between the amount of communication and the number of rounds of interaction in both the CRG and the SKG problems. Specifically, we construct a family of distributions μ = μr,n,L, parametrized by integers r, n and L, such that for every r there exists a constant b = b(r) for which CRG (respectively SKG) is feasible when (Xi, Yi) ~ μr,n,L with r + 1 rounds of communication, each consisting of O(log n) bits, but…
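As a toy illustration of why communication helps at all in CRG (a sketch over our own simplified source, not the distributions μr,n,L of the paper): take δ-correlated bits, where each of Bob's bits equals Alice's corresponding bit flipped independently with probability δ. The naive zero-communication scheme in which both parties output their first L raw bits agrees on the whole key with probability only (1 - δ)^L:

```python
import random

def dsbs_samples(delta, n, seed=1):
    """Doubly symmetric binary source: Alice's bit is uniform, Bob's is
    Alice's bit flipped independently with probability delta."""
    rng = random.Random(seed)
    xs, ys = [], []
    for _ in range(n):
        x = rng.randint(0, 1)
        y = x ^ (1 if rng.random() < delta else 0)
        xs.append(x)
        ys.append(y)
    return xs, ys

def key_agreement_rate(delta, key_len, trials=5000):
    """Empirical probability that the naive zero-communication keys
    (first key_len raw bits on each side) agree exactly."""
    agree = 0
    for t in range(trials):
        xs, ys = dsbs_samples(delta, key_len, seed=t)
        agree += xs == ys
    return agree / trials

p1 = key_agreement_rate(0.1, key_len=1)  # about (1 - 0.1)^1 = 0.9
p8 = key_agreement_rate(0.1, key_len=8)  # about 0.9**8, roughly 0.43
```

The exponential decay in L is what makes communication (and, per this work, the number of rounds) the interesting resource to measure.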

We study the communication rate of coding schemes for interactive communication that transform any two-party interactive protocol into a protocol that is robust to noise. Recently, Haeupler [11] showed that if an ∊ > 0 fraction of transmissions are corrupted, adversarially or randomly, then it is possible to achieve a communication rate of 1 - O(√(∊ log log(1/∊))). Furthermore, Haeupler conjectured that this rate is optimal for general input protocols. This stands in contrast to the classical setting of one-way communication, in which error-correcting codes are known to achieve an optimal communication rate of 1 - Θ(H(∊)). In this work, we show that the quadratically smaller rate loss of the one-way setting can also be achieved in interactive coding schemes for a very natural class of input protocols. We introduce the notion of average message length, or the average number of bits a party sends before receiving a reply, as a natural parameter for measuring the level of interactivity in a protocol. Moreover, we show that any protocol with average message length ℓ = Ω(poly(1/∊)) can be simulated by a protocol with optimal communication rate 1 - Θ(H(∊)) over an oblivious adversarial channel with error fraction ∊. Furthermore, under the additional assumption of access to public shared…
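The quadratic gap between the two rate losses can be seen numerically. The sketch below compares only the asymptotic shapes named in the abstract, with all hidden constants suppressed (an illustration, not a statement about exact rates):

```python
import math

def binary_entropy(eps):
    """H(eps) = -eps*log2(eps) - (1-eps)*log2(1-eps):
    the rate loss of one-way error-correcting codes."""
    return -eps * math.log2(eps) - (1 - eps) * math.log2(1 - eps)

def interactive_loss(eps):
    """The sqrt(eps * log log(1/eps)) rate-loss shape for general
    interactive coding (constants suppressed)."""
    return math.sqrt(eps * math.log(math.log(1 / eps)))

# The one-way loss H(eps) ~ eps*log(1/eps) shrinks roughly
# quadratically faster than the interactive-coding loss.
losses = {eps: (binary_entropy(eps), interactive_loss(eps))
          for eps in [1e-2, 1e-4, 1e-6]}
```

For example, at ∊ = 10^-4 the entropy loss is already an order of magnitude below the square-root-type loss, and the gap widens as ∊ shrinks.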

Motivated by an attempt to understand the formation and development of (human) language, we introduce a "distributed compression" problem. In our problem a sequence of pairs of players from a set of K players are chosen and tasked to communicate messages drawn from an unknown distribution Q. Arguably languages are created and evolve to compress frequently occurring messages, and we focus on this aspect. The only knowledge that players have about the distribution Q is from previously drawn samples, but these samples differ from player to player. The only common knowledge between the players is restricted to a common prior distribution P and some constant number of bits of information (such as a learning algorithm). Letting T_eps denote the number of iterations it would take for a typical player to obtain an eps-approximation to Q in total variation distance, we ask whether T_eps iterations suffice to compress the messages down roughly to their entropy and give a partial positive answer. We show that a natural uniform algorithm can compress the communication down to an average cost per message of O(H(Q) + log(D(P ∥ Q) + O(1))) in Õ(T_eps) iterations while allowing for O(eps) error, where D(· ∥ ·) denotes the…
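For context on the compression target (an illustrative baseline of our own, not the paper's algorithm): a party with full knowledge of Q can already reach an average cost within one bit of the entropy H(Q) using a Huffman code; the difficulty studied above is that the players only hold differing samples of Q.

```python
import heapq
import math

def huffman_lengths(probs):
    """Codeword lengths of a Huffman code for `probs`
    (a dict mapping symbol -> probability)."""
    heap = [(p, i, [sym]) for i, (sym, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    lengths = {sym: 0 for sym in probs}
    counter = len(heap)  # tiebreaker so tuples never compare lists
    while len(heap) > 1:
        p1, _, syms1 = heapq.heappop(heap)
        p2, _, syms2 = heapq.heappop(heap)
        for s in syms1 + syms2:
            lengths[s] += 1  # merged symbols sink one level deeper
        heapq.heappush(heap, (p1 + p2, counter, syms1 + syms2))
        counter += 1
    return lengths

def avg_length_and_entropy(probs):
    lengths = huffman_lengths(probs)
    avg = sum(p * lengths[s] for s, p in probs.items())
    H = -sum(p * math.log2(p) for p in probs.values() if p > 0)
    return avg, H

# A dyadic distribution, where Huffman coding is exactly optimal:
Q = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
avg, H = avg_length_and_entropy(Q)  # both equal 1.75 bits here
```

The classical guarantee is H(Q) <= avg < H(Q) + 1; the paper's question is how close to this one can get when Q itself must be learned from per-player samples.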

Cryptographic protocols are often implemented at upper layers of communication networks, while error-correcting codes are employed at the physical layer. In this paper, we consider utilizing readily available physical-layer functions, such as encoders and decoders, together with shared keys to provide a threshold-type security scheme. To this end, the effect of physical-layer communication is abstracted out, and the channels between the legitimate parties, Alice and Bob, and the eavesdropper Eve are assumed to be noiseless. We introduce a model for threshold-secure coding, where Alice and Bob communicate using a shared key in such a way that Eve does not get any information, in an information-theoretic sense, about the key as well as about any subset of the input symbols of size up to a certain threshold. Then, a framework is provided for constructing threshold-secure codes from linear block codes while characterizing the requirements to satisfy the reliability and security conditions. Moreover, we propose a threshold-secure coding scheme, based on Reed-Muller (RM) codes, that meets the security and reliability conditions. Furthermore, it is shown that the encoder and the decoder of the scheme can be implemented efficiently with quasi-linear time complexity. In particular, a low-complexity successive cancellation decoder is shown for…
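To make the Reed-Muller ingredient concrete (an illustrative sketch of first-order RM(1, m) encoding only, not the authors' threshold-secure construction): a message (a0, a1, ..., am) is encoded as the evaluations of the affine Boolean function a0 + a1·x1 + ... + am·xm over all 2^m points of {0,1}^m, giving codeword length 2^m and minimum distance 2^(m-1).

```python
from itertools import product

def rm1_encode(msg, m):
    """First-order Reed-Muller RM(1, m) encoder (illustrative sketch).
    msg = [a0, a1, ..., am]; the codeword is the evaluation of the
    affine function a0 + sum_i ai*xi (mod 2) over all points of {0,1}^m."""
    assert len(msg) == m + 1
    a0, coeffs = msg[0], msg[1:]
    return [
        (a0 + sum(a * x for a, x in zip(coeffs, point))) % 2
        for point in product([0, 1], repeat=m)
    ]

def hamming(u, v):
    """Hamming distance between two equal-length codewords."""
    return sum(a != b for a, b in zip(u, v))

m = 3
c1 = rm1_encode([0, 1, 0, 1], m)  # length 2**m = 8
c2 = rm1_encode([1, 0, 0, 0], m)  # the all-ones codeword
```

Any two distinct RM(1, m) codewords differ in at least 2^(m-1) positions, the distance property that the reliability analysis of such schemes rests on; the efficient successive cancellation decoding mentioned above exploits the recursive structure of these codes.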