Information complexity is one of the most powerful techniques for proving information-theoretic lower bounds, and Shannon entropy plays a central role in it. Although Shannon entropy enjoys convenient properties such as the chain rule, it also has inherent limitations. One of the most notable barriers is the square-root loss: an entropy gap translates into a statistical distance only up to a square root, as in Pinsker's inequality (sketched below). To bypass this barrier, we introduce a new method based on min-entropy analysis. Building on this method, we prove the following results.

- An Ω(N^{∑_i α_i - max_i α_i}/k) randomized communication lower bound for the k-party set-intersection problem, where the i-th party holds a random set of size ≈ N^{1-α_i}.
- A tight Ω(n/k) randomized lower bound for the k-party Tree Pointer Jumping problem, improving the Ω(n/k²) lower bound of Chakrabarti, Cormode, and McGregor (STOC 08).
- An Ω(n/k + √n) lower bound for the Chained Index problem, improving the Ω(n/k²) lower bound of Cormode, Dark, and Konrad (ICALP 19).

Since these problems serve as hard problems for numerous applications in streaming lower bounds and cryptography, our new lower bounds directly improve the corresponding streaming and cryptographic lower bounds.

On the technical side, min-entropy does not enjoy properties such as the chain rule (see the reminder below). To address this issue, we enhance the structure-vs-pseudorandomness decomposition used by Göös, Pitassi, and Watson (FOCS 17) and Yang and Zhang (STOC 24); both papers used this decomposition to prove communication lower bounds. In this paper, we breathe new life into this method in the multi-party setting, presenting a new toolkit for proving multi-party communication lower bounds.
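For intuition, a standard form of Pinsker's inequality (a textbook fact, not a contribution of this paper) makes the square-root loss concrete:

% Pinsker's inequality: total variation distance is bounded by the
% square root of the KL divergence (measured in nats).
\[
  \|P - Q\|_{\mathrm{TV}} \;\le\; \sqrt{\tfrac{1}{2}\, D_{\mathrm{KL}}(P \,\|\, Q)}
\]
% Hence a KL (entropy-gap) bound of \epsilon only certifies a
% statistical distance of order \sqrt{\epsilon}; this is the barrier
% the min-entropy method is designed to bypass.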
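As a reminder of the definitions at play (standard notions, not specific to this paper), min-entropy is governed by the single most likely outcome, and unlike Shannon entropy it admits no exact chain rule:

% Min-entropy of a random variable X.
\[
  H_{\infty}(X) \;=\; -\log \max_{x} \Pr[X = x]
\]
% Shannon entropy satisfies the chain rule
%   H(X, Y) = H(X) + H(Y | X),
% but no analogous identity holds for H_\infty; this is the gap that
% the structure-vs-pseudorandomness decomposition works around.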