Title: Guest Column: Average-case Complexity Through the Lens of Interactive Puzzles
We review a study of average-case complexity through the lens of interactive puzzles: interactive games between a computationally bounded Challenger and a computationally bounded Solver/Attacker. Most notably, we use this treatment to review a recent result showing that if NP is hard on average, then there exists a sampleable distribution over only true statements of an NP language for which no probabilistic polynomial-time algorithm can find witnesses. We also discuss connections to the problem of whether average-case hardness in NP implies average-case hardness in TFNP, or the existence of cryptographic one-way functions.
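To fix intuition for the central object, here is a minimal Python sketch of a one-round puzzle as a game between an efficient Challenger and a Solver; the interface and the preimage-finding instantiation below are illustrative choices, not the column's formal definitions.

```python
import secrets
from typing import Callable, Tuple

# Illustrative sketch (not the column's formal definition): in a one-round
# puzzle, an efficient Challenger samples a challenge together with some
# verification state, and a Solver wins if its response passes the check.

class Puzzle:
    def challenge(self) -> Tuple[bytes, object]:
        """Return (challenge sent to the Solver, Challenger's verification state)."""
        raise NotImplementedError

    def verify(self, state: object, response: bytes) -> bool:
        raise NotImplementedError

def play(puzzle: Puzzle, solver: Callable[[bytes], bytes]) -> bool:
    """Run one execution of the Challenger-Solver game."""
    chal, state = puzzle.challenge()
    return puzzle.verify(state, solver(chal))

# Toy instantiation: the challenge is y = f(x) for a random x, and solving
# means producing any preimage of y; hardness of this puzzle against
# probabilistic polynomial-time Solvers is exactly the statement that f is
# hard to invert on average.
class PreimagePuzzle(Puzzle):
    def __init__(self, f: Callable[[bytes], bytes], nbytes: int = 16):
        self.f, self.nbytes = f, nbytes

    def challenge(self) -> Tuple[bytes, bytes]:
        y = self.f(secrets.token_bytes(self.nbytes))
        return y, y  # the verification state is just the target itself

    def verify(self, target: bytes, response: bytes) -> bool:
        return self.f(response) == target

if __name__ == "__main__":
    import hashlib
    puzzle = PreimagePuzzle(lambda x: hashlib.sha256(x).digest())
    naive_solver = lambda chal: b"\x00" * 16  # essentially never wins
    print(play(puzzle, naive_solver))
```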
Award ID(s): 1703846
NSF-PAR ID: 10248579
Author(s) / Creator(s):
Date Published:
Journal Name: ACM SIGACT News
Volume: 52
Issue: 1
ISSN: 0163-5700
Page Range / eLocation ID: 47 to 69
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this
  1. Ahn, Hee-Kap ; Sadakane, Kunihiko (Ed.)
    A version of time-bounded Kolmogorov complexity, denoted KT, has received attention in the past several years, due to its close connection to circuit complexity and to the Minimum Circuit Size Problem MCSP. Essentially all results about the complexity of MCSP hold also for MKTP (the problem of computing the KT complexity of a string). Both MKTP and MCSP are hard for SZK (Statistical Zero Knowledge) under BPP-Turing reductions; neither is known to be NP-complete. Recently, some hardness results for MKTP were proved that are not (yet) known to hold for MCSP. In particular, MKTP is hard for DET (a subclass of P) under nonuniform ≤^{NC^0}_m reductions. In this paper, we improve this, to show that the complement of MKTP is hard for the (apparently larger) class NISZK_L under not only ≤^{NC^0}_m reductions but even under projections. Also, the complement of MKTP is hard for NISZK under ≤^{P/poly}_m reductions. Here, NISZK is the class of problems with non-interactive zero-knowledge proofs, and NISZK_L is the non-interactive version of the class SZK_L that was studied by Dvir et al. As an application, we provide several improved worst-case to average-case reductions to problems in NP, and we obtain a new lower bound on MKTP (which is currently not known to hold for MCSP). 
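For readers who have not seen the KT measure, a rough version of the standard definition from this line of work is sketched below; the exact conventions (the end-of-string marker, the form of oracle access to the description) vary slightly between papers.

```latex
% Rough sketch of the KT measure; conventions vary slightly across papers.
\[
  \mathrm{KT}(x) \;=\; \min\Bigl\{\, |d| + t \;:\;
      \forall\, i \le |x|+1,\ \forall\, b,\
      U^{d}(i,b) \text{ accepts within } t \text{ steps} \iff x_i = b \,\Bigr\},
\]
% where U is a fixed universal machine given the description d as an oracle,
% x_i is the i-th bit of x, and x_{|x|+1} = * marks the end of the string.
% MKTP is then the decision problem: given (x, theta), is KT(x) <= theta?
```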
  2. Bojanczyk, Mikolaj ; Chekuri, Chandra (Ed.)
    One-way functions (OWFs) are central objects of study in cryptography and computational complexity theory. In a seminal work, Liu and Pass (FOCS 2020) proved that the average-case hardness of computing time-bounded Kolmogorov complexity is equivalent to the existence of OWFs. It remained an open problem to establish such an equivalence for the average-case hardness of some natural NP-complete problem. In this paper, we make progress on this question by studying a conditional variant of the Minimum KT-complexity Problem (MKTP), which we call McKTP, as follows. 1) First, we prove that if McKTP is average-case hard on a polynomial fraction of its instances, then there exist OWFs. 2) Then, we observe that McKTP is NP-complete under polynomial-time randomized reductions. 3) Finally, we prove that the existence of OWFs implies the nontrivial average-case hardness of McKTP. Thus the existence of OWFs is inextricably linked to the average-case hardness of this NP-complete problem. In fact, building on recently-announced results of Ren and Santhanam [Rahul Ilango et al., 2021], we show that McKTP is hard-on-average if and only if there are logspace-computable OWFs. 
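For reference, the standard asymptotic notion of a one-way function that these equivalences refer to is stated below; this is the textbook form, included only for orientation.

```latex
% Textbook definition of a one-way function, stated for orientation.
% f : {0,1}^* -> {0,1}^* is one-way if it is computable in polynomial time
% and every probabilistic polynomial-time adversary A inverts it with at
% most negligible probability over a random input:
\[
  \Pr_{x \leftarrow \{0,1\}^n}\Bigl[\, A\bigl(1^n, f(x)\bigr) \in f^{-1}\bigl(f(x)\bigr) \,\Bigr]
  \;\le\; \mathrm{negl}(n).
\]
```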
    Cryptography is largely based on unproven assumptions, which, while believable, might fail. Notably, if P = NP, or if we live in Pessiland, then all current cryptographic assumptions will be broken. A compelling question is whether any interesting cryptography might exist in Pessiland. A natural approach to tackle this question is to base cryptography on an assumption from fine-grained complexity. Ball, Rosen, Sabin, and Vasudevan [BRSV’17] attempted this, starting from popular hardness assumptions, such as the Orthogonal Vectors (OV) Conjecture. They obtained problems that are hard on average, assuming that OV and other problems are hard in the worst case. They obtained proofs of work, and hoped to use their average-case hard problems to build a fine-grained one-way function. Unfortunately, they proved that constructing one using their approach would violate a popular hardness hypothesis. This motivates the search for other fine-grained average-case hard problems. The main goal of this paper is to identify sufficient properties for a fine-grained average-case assumption that imply cryptographic primitives such as fine-grained public key cryptography (PKC). Our main contribution is a novel construction of a cryptographic key exchange, together with the definition of a small number of relatively weak structural properties, such that if a computational problem satisfies them, our key exchange has provable fine-grained security guarantees, based on the hardness of this problem. We then show that a natural and plausible average-case assumption for the key problem Zero-k-Clique from fine-grained complexity satisfies our properties. We also develop fine-grained one-way functions and hardcore bits even under these weaker assumptions. Where previous works had to assume random oracles or the existence of strong one-way functions to get a key exchange computable in O(n) time secure against O(n^2)-time adversaries (see [Merkle’78] and [BGI’08]), our assumptions seem much weaker. Our key exchange has a similar gap between the computation of the honest party and the adversary as prior work, while being non-interactive, implying fine-grained PKC.
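As context for the O(n)-versus-O(n^2) gap mentioned above, here is a toy Python sketch of Merkle's puzzle-based key exchange from [Merkle’78]. It is not the paper's Zero-k-Clique construction (and, unlike the paper's key exchange, it is interactive), but it illustrates the kind of quadratic honest-versus-adversary gap at stake; all names and parameters are illustrative.

```python
import hashlib
import random
import secrets

# Toy Merkle-puzzle key exchange: honest parties do ~O(N) work, while an
# eavesdropper must solve ~N/2 puzzles (~O(N^2) work) to recover the key.
# This is a self-contained illustration of [Merkle'78], not the paper's
# Zero-k-Clique-based construction.

N = 1000  # number of puzzles, and also the per-puzzle brute-force space

def make_puzzle(puzzle_id: int, secret_key: bytes) -> bytes:
    """'Encrypt' (puzzle_id, secret_key) under a weak password in [0, N)."""
    password = random.randrange(N)
    pad = hashlib.sha256(b"pad" + password.to_bytes(4, "big")).digest()
    payload = b"PUZL" + puzzle_id.to_bytes(4, "big") + secret_key
    return bytes(a ^ b for a, b in zip(payload, pad))

def solve_puzzle(ciphertext: bytes):
    """Brute-force the weak password: O(N) work for a single puzzle."""
    for password in range(N):
        pad = hashlib.sha256(b"pad" + password.to_bytes(4, "big")).digest()
        payload = bytes(a ^ b for a, b in zip(ciphertext, pad))
        if payload[:4] == b"PUZL":
            return int.from_bytes(payload[4:8], "big"), payload[8:]
    raise ValueError("no password opened the puzzle")

def key_exchange() -> bytes:
    # Alice: O(N) work to prepare and publish N puzzles.
    keys = [secrets.token_bytes(16) for _ in range(N)]
    puzzles = [make_puzzle(i, keys[i]) for i in range(N)]
    # Bob: picks one puzzle at random and spends O(N) work solving it.
    pid, bob_key = solve_puzzle(puzzles[random.randrange(N)])
    # Bob announces pid in the clear; Alice looks up the matching key.
    alice_key = keys[pid]
    assert alice_key == bob_key
    # An eavesdropper sees all puzzles and pid, and on average must solve
    # about N/2 puzzles before hitting the one labelled pid: ~O(N^2) work.
    return alice_key

if __name__ == "__main__":
    print(key_exchange().hex())
```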
  4.
    Consider the following two fundamental open problems in complexity theory: • Does a hard-on-average language in NP imply the existence of one-way functions? • Does a hard-on-average language in NP imply a hard-on-average problem in TFNP (i.e., the class of total NP search problems)? Our main result is that the answer to (at least) one of these questions is yes. Both one-way functions and problems in TFNP can be interpreted as promise-true distributional NP search problems—namely, distributional search problems where the sampler only samples true statements. As a direct corollary of the above result, we thus get that the existence of a hard-on-average distributional NP search problem implies a hard-on-average promise-true distributional NP search problem. In other words, it is no easier to find witnesses (a.k.a. proofs) for efficiently-sampled statements (theorems) that are guaranteed to be true than for efficiently-sampled statements that may be false. This result follows from a more general study of interactive puzzles—a generalization of average-case hardness in NP—and in particular, a novel round-collapse theorem for computationally sound protocols, analogous to Babai-Moran’s celebrated round-collapse theorem for information-theoretically sound protocols. As another consequence of this treatment, we show that the existence of O(1)-round public-coin non-trivial arguments (i.e., argument systems that are not proofs) implies the existence of a hard-on-average problem in NP/poly.
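To make the term concrete, below is a small hedged Python sketch of a promise-true distributional NP search problem: the sampler itself produces a witness (so every sampled statement is true) and then discards it, and the solver's task is to find some witness for the sampled statement. Using a one-way-function-style f is just one illustrative instantiation; the names are not from the paper.

```python
import hashlib
import secrets
from typing import Callable, Tuple

# Hedged sketch of a promise-true distributional NP search problem: the
# sampler only ever outputs true statements, because it generates a witness
# itself and then throws it away.  A one-way function f yields the canonical
# example: the statement is y = f(x), and the witness is any preimage of y.

def f(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()  # stand-in for a candidate one-way function

def sample_true_statement(n: int = 16) -> Tuple[bytes, bytes]:
    """Sampler: outputs (statement, witness); the witness certifies truth."""
    x = secrets.token_bytes(n)
    return f(x), x

def relation(statement: bytes, candidate: bytes) -> bool:
    """Polynomial-time verifier R(statement, candidate witness)."""
    return f(candidate) == statement

def experiment(solver: Callable[[bytes], bytes]) -> bool:
    """The solver sees only the statement, which is guaranteed to be true."""
    statement, _discarded_witness = sample_true_statement()
    return relation(statement, solver(statement))

if __name__ == "__main__":
    print(experiment(lambda stmt: b"\x00" * 16))  # trivial solver: almost surely False
```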
  5. Over recent years, devising classification algorithms that are robust to adversarial perturbations has emerged as a challenging problem. In particular, deep neural nets (DNNs) seem to be susceptible to small imperceptible changes over test instances. However, the line of work in provable robustness, so far, has been focused on information-theoretic robustness, ruling out even the existence of any adversarial examples. In this work, we study whether there is a hope to benefit from the algorithmic nature of an attacker that searches for adversarial examples, and ask whether there is any learning task for which it is possible to design classifiers that are only robust against polynomial-time adversaries. Indeed, numerous cryptographic tasks (e.g., encryption of long messages) can only be secure against computationally bounded adversaries, and are indeed impossible for computationally unbounded attackers. Thus, it is natural to ask if the same strategy could help robust learning. We show that computational limitation of attackers can indeed be useful in robust learning by demonstrating the possibility of a classifier for some learning task for which computational and information-theoretic adversaries of bounded perturbations have very different power. Namely, while computationally unbounded adversaries can attack successfully and find adversarial examples with small perturbation, polynomial-time adversaries are unable to do so unless they can break standard cryptographic hardness assumptions. Our results, therefore, indicate that perhaps a similar approach to cryptography (relying on computational hardness) holds promise for achieving computationally robust machine learning. In the reverse direction, we also show that the existence of such a learning task, in which computational robustness beats information-theoretic robustness, requires computational hardness, by implying (average-case) hardness of NP.