Abstract: A fundamental question in computational complexity asks whether probabilistic polynomial-time algorithms can be simulated deterministically with a small overhead in time (the BPP vs. P problem). A corresponding question in the realm of interactive proofs asks whether Arthur-Merlin protocols can be simulated nondeterministically with a small overhead in time (the AM vs. NP problem). Both questions are intricately tied to lower bounds. Prominently, in both settings blackbox derandomization, i.e., derandomization through pseudorandom generators, has been shown equivalent to lower bounds for decision problems against circuits.

Recently, Chen and Tell (FOCS'21) established near-equivalences in the BPP setting between whitebox derandomization and lower bounds for multi-bit functions against algorithms on almost-all inputs. The key ingredient is a technique to translate hardness into targeted hitting sets in an instance-wise fashion, based on a layered arithmetization of the evaluation of a uniform circuit computing the hard function $f$ on the given instance. Follow-up works managed to obtain full equivalences in the BPP setting by exploiting a compression property of classical pseudorandom generator constructions. In particular, Chen, Tell, and Williams (FOCS'23) showed that derandomization of BPP is equivalent to constructive lower bounds against algorithms that go through a compression phase.

In this paper, we develop a corresponding technique for Arthur-Merlin protocols and establish similar near-equivalences in the AM setting. As an example of our results in the hardness-to-derandomization direction, consider a length-preserving function $f$ computable by a nondeterministic algorithm that runs in time $n^a$. We show that if every Arthur-Merlin protocol that runs in time $n^c$ for $c = O(\log^2 a)$ can only compute $f$ correctly on finitely many inputs, then AM is in NP.

We also obtain equivalences between constructive lower bounds against Arthur-Merlin protocols that go through a compression phase and derandomization of AM via targeted generators. Our main technical contribution is the construction of suitable targeted hitting-set generators based on probabilistically checkable proofs of proximity for nondeterministic computations. As a by-product of our constructions, we obtain the first result indicating that whitebox derandomization of AM may be equivalent to the existence of targeted hitting-set generators for AM, an issue raised by Goldreich (LNCS, 2011). By-products in the average-case setting include the first uniform hardness vs. randomness trade-offs for AM, as well as an unconditional mild derandomization result for AM.
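Stated schematically, the example hardness-to-derandomization implication above can be written as follows (an illustrative paraphrase with the quantifiers made explicit; the notation and class names here are ours, not verbatim from the paper):

$$
\Big(\, f \in \mathrm{NTIME}(n^a) \text{ length-preserving, and } \forall\, P \in \mathrm{AMTIME}\big(n^{O(\log^2 a)}\big):\ P(x) = f(x) \text{ for only finitely many } x \,\Big) \;\Longrightarrow\; \mathrm{AM} \subseteq \mathrm{NP}.
$$

Note that the hypothesis concerns multi-bit functions: the protocol $P$ must output the entire value $f(x)$, not merely decide a language, which is what makes the almost-all-inputs lower bound plausible to assume.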
                    
                            
                            Constructive Separations and Their Consequences
                        
                    
    
For a complexity class $C$ and language $L$, a constructive separation of $L \notin C$ gives an efficient algorithm (also called a refuter) to find counterexamples (bad inputs) for every $C$-algorithm attempting to decide $L$. We study the questions: Which lower bounds can be made constructive? What are the consequences of constructive separations? We build a case that constructiveness serves as a dividing line between many weak lower bounds we know how to prove, and strong lower bounds against $P$, $ZPP$, and $BPP$. Put another way, constructiveness is the opposite of a complexity barrier: it is a property we want lower bounds to have. Our results fall into three broad categories.

1. Our first set of results shows that, for many well-known lower bounds against streaming algorithms, one-tape Turing machines, and query complexity, as well as lower bounds for the Minimum Circuit Size Problem, making these lower bounds constructive would imply breakthrough separations ranging from $EXP \neq BPP$ to even $P \neq NP$.

2. Our second set of results shows that for most major open problems in lower bounds against $P$, $ZPP$, and $BPP$, including $P \neq NP$, $P \neq PSPACE$, $P \neq PP$, $ZPP \neq EXP$, and $BPP \neq NEXP$, any proof of the separation would further imply a constructive separation. Our results generalize earlier results for $P \neq NP$ [Gutfreund, Shaltiel, and Ta-Shma, CCC 2005] and $BPP \neq NEXP$ [Dolev, Fandina and Gutfreund, CIAC 2013].

3. Our third set of results shows that certain complexity separations cannot be made constructive. We observe that for all super-polynomially growing functions $t$, there are no constructive separations for detecting high $t$-time Kolmogorov complexity (a task which is known to be not in $P$) from any complexity class, unconditionally.
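The refuter notion described above can be sketched as follows (an illustrative formalization; the paper's precise definition may differ in details such as the refuter's exact resource bound and how the $C$-algorithm is presented to it):

$$
L \notin C \text{ constructively} \;:\Longleftrightarrow\; \exists\, R \in \mathrm{FP}\ \ \forall\, A \in C\ \ \forall^{\infty} n:\quad x = R(1^n, \langle A \rangle) \ \text{ satisfies } \ A(x) \neq L(x),
$$

i.e., the polynomial-time refuter $R$, given the input length and a description of any candidate $C$-algorithm $A$, outputs a concrete input on which $A$ errs. A non-constructive separation merely asserts that such a bad input exists.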
        
    
- Award ID(s): 2127597
- PAR ID: 10579530
- Publisher / Repository: TheoretiCS
- Date Published:
- Journal Name: TheoretiCS
- Volume: Volume 3
- ISSN: 2751-4838
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
            
- 
Ta-Shma, Amnon (Ed.) A fundamental question in computational complexity asks whether probabilistic polynomial-time algorithms can be simulated deterministically with a small overhead in time (the BPP vs. P problem). A corresponding question in the realm of interactive proofs asks whether Arthur-Merlin protocols can be simulated nondeterministically with a small overhead in time (the AM vs. NP problem). Both questions are intricately tied to lower bounds. Prominently, in both settings blackbox derandomization, i.e., derandomization through pseudorandom generators, has been shown equivalent to lower bounds for decision problems against circuits. Recently, Chen and Tell (FOCS'21) established near-equivalences in the BPP setting between whitebox derandomization and lower bounds for multi-bit functions against algorithms on almost-all inputs. The key ingredient is a technique to translate hardness into targeted hitting sets in an instance-wise fashion, based on a layered arithmetization of the evaluation of a uniform circuit computing the hard function f on the given instance. In this paper we develop a corresponding technique for Arthur-Merlin protocols and establish similar near-equivalences in the AM setting. As an example of our results in the hardness-to-derandomization direction, consider a length-preserving function f computable by a nondeterministic algorithm that runs in time n^a. We show that if every Arthur-Merlin protocol that runs in time n^c for c = O(log² a) can only compute f correctly on finitely many inputs, then AM is in NP. Our main technical contribution is the construction of suitable targeted hitting-set generators based on probabilistically checkable proofs for nondeterministic computations. As a byproduct of our constructions, we obtain the first result indicating that whitebox derandomization of AM may be equivalent to the existence of targeted hitting-set generators for AM, an issue raised by Goldreich (LNCS, 2011). Byproducts in the average-case setting include the first uniform hardness vs. randomness tradeoffs for AM, as well as an unconditional mild derandomization result for AM.
- 
Existing proofs that deduce BPP = P from circuit lower bounds convert randomized algorithms into deterministic algorithms with a large polynomial slowdown. We convert randomized algorithms into deterministic ones with little slowdown. Specifically, assuming exponential lower bounds against randomized NP ∩ coNP circuits, formally known as randomized SVN circuits, we convert any randomized algorithm over inputs of length n running in time t ≥ n into a deterministic one running in time t^{2+α} for an arbitrarily small constant α > 0. Such a slowdown is nearly optimal for t close to n, since under standard complexity-theoretic assumptions, there are problems with an inherent quadratic derandomization slowdown. We also convert any randomized algorithm that errs rarely into a deterministic algorithm having a similar running time (with pre-processing). The latter derandomization result holds under weaker assumptions, of exponential lower bounds against deterministic SVN circuits. Our results follow from a new, nearly optimal, explicit pseudorandom generator fooling circuits of size s with seed length (1+α)·log s, under the assumption that there exists a function f ∈ E that requires randomized SVN circuits of size at least 2^{(1−α′)n}, where α′ = O(α). The construction uses, among other ideas, a new connection between pseudoentropy generators and locally list recoverable codes.
- 
The Exponential-Time Hypothesis (\(\mathtt{ETH}\)) is a strengthening of the \(\mathcal{P} \ne \mathcal{NP}\) conjecture, stating that \(3\text{-}\mathtt{SAT}\) on n variables cannot be solved in (uniform) time \(2^{\epsilon \cdot n}\), for some ε > 0. In recent years, analogous hypotheses that are "exponentially-strong" forms of other classical complexity conjectures (such as \(\mathcal{NP} \nsubseteq \mathcal{BPP}\) or \(co\mathcal{NP} \nsubseteq \mathcal{NP}\)) have also been introduced, and have become widely influential. In this work, we focus on the interaction of exponential-time hypotheses with the fundamental and closely-related questions of derandomization and circuit lower bounds. We show that even relatively mild variants of exponential-time hypotheses have far-reaching implications for derandomization, circuit lower bounds, and the connections between the two. Specifically, we prove that: (1) The Randomized Exponential-Time Hypothesis (\(\mathsf{rETH}\)) implies that \(\mathcal{BPP}\) can be simulated on "average-case" inputs in deterministic (nearly-)polynomial time (i.e., in time \(2^{\tilde{O}(\log(n))} = n^{\mathrm{loglog}(n)^{O(1)}}\)). The derandomization relies on a conditional construction of a pseudorandom generator with near-exponential stretch (i.e., with seed length \(\tilde{O}(\log(n))\)); this significantly improves the state-of-the-art in uniform "hardness-to-randomness" results, which previously only yielded pseudorandom generators with sub-exponential stretch from such hypotheses. (2) The Non-Deterministic Exponential-Time Hypothesis (\(\mathsf{NETH}\)) implies that derandomization of \(\mathcal{BPP}\) is completely equivalent to circuit lower bounds against \(\mathcal{E}\), and in particular that pseudorandom generators are necessary for derandomization. In fact, we show that the foregoing equivalence follows from a very weak version of \(\mathsf{NETH}\), and we also show that this very weak version is necessary to prove a slightly stronger conclusion that we deduce from it. Lastly, we show that disproving certain exponential-time hypotheses requires proving breakthrough circuit lower bounds. In particular, if \(\mathtt{CircuitSAT}\) for circuits over n bits of size poly(n) can be solved by probabilistic algorithms in time \(2^{n/\mathrm{polylog}(n)}\), then \(\mathcal{BPE}\) does not have circuits of quasilinear size.
- 
The complexity class ZPP^{NP[1]} (corresponding to zero-error randomized algorithms with access to one NP oracle query) is known to have a number of curious properties. We further explore this class in the settings of time complexity, query complexity, and communication complexity.
• For starters, we provide a new characterization: ZPP^{NP[1]} equals the restriction of BPP^{NP[1]} where the algorithm is only allowed to err when it forgoes the opportunity to make an NP oracle query.
• Using the above characterization, we prove a query-to-communication lifting theorem, which translates any ZPP^{NP[1]} decision tree lower bound for a function f into a ZPP^{NP[1]} communication lower bound for a two-party version of f.
• As an application, we use the above lifting theorem to prove that the ZPP^{NP[1]} communication lower bound technique introduced by Göös, Pitassi, and Watson (ICALP 2016) is not tight. We also provide a "primal" characterization of this lower bound technique as a complexity class.
 An official website of the United States government