This content will become publicly available on November 27, 2024
 Award ID(s):
 2055568
 NSFPAR ID:
 10513593
 Editor(s):
 Rothblum, Guy; Wee, Hoeteck
 Publisher / Repository:
 Springer
 Date Published:
 Journal Name:
 Theory of Cryptography Conference (TCC 2023)
 Edition / Version:
 LNCS Volume 14372
 ISBN:
 9783031486234
 Page Range / eLocation ID:
 422-451
 Format(s):
 Medium: X
 Sponsoring Org:
 National Science Foundation
More Like this

Many proofs of interactive cryptographic protocols (e.g., as in Universal Composability) operate by proving the protocol at hand to be observationally equivalent to an idealized specification. While pervasive, formal tool support for observational equivalence of cryptographic protocols is still a nascent area of research. Current mechanization efforts tend to either focus on diff-equivalence, which establishes observational equivalence between protocols with identical control structures, or require an explicit witness for the observational equivalence in the form of a bisimulation relation. Our goal is to simplify proofs for cryptographic protocols by introducing a core calculus, IPDL, for cryptographic observational equivalences. Via IPDL, we aim to address a number of theoretical issues for cryptographic proofs in a simple manner, including probabilistic behaviors, distributed message-passing, and resource-bounded adversaries and simulators. We demonstrate IPDL on a number of case studies, including a distributed coin toss protocol, Oblivious Transfer, and the GMW multi-party computation protocol. All proofs of case studies are mechanized via an embedding of IPDL into the Coq proof assistant.
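As a toy illustration of the kind of observational equivalence such proofs establish (a single-bit coin toss sketched in plain Python, not IPDL syntax): a "real" protocol in which two parties broadcast locally sampled fair bits and output their XOR induces exactly the same output distribution as an "ideal" functionality that announces one fair bit.

```python
from fractions import Fraction
from itertools import product

def real_protocol(r1, r2):
    # Each party broadcasts a local random bit; the output is the XOR.
    return r1 ^ r2

def ideal_functionality(r):
    # A trusted party samples a single fair bit and announces it.
    return r

def distribution(outputs):
    # Exact output distribution over all choices of randomness.
    dist = {}
    for o in outputs:
        dist[o] = dist.get(o, Fraction(0)) + Fraction(1, len(outputs))
    return dist

real_dist = distribution([real_protocol(r1, r2)
                          for r1, r2 in product([0, 1], repeat=2)])
ideal_dist = distribution([ideal_functionality(r) for r in [0, 1]])

# Observational equivalence (for this trivial, adversary-free example):
# the two systems produce identical output distributions.
assert real_dist == ideal_dist
```

An actual IPDL proof handles message-passing, probabilistic behavior, and resource-bounded adversaries; this sketch only shows the distribution-equality core of the idea.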

Since the mid-1980s it has been known that Byzantine Agreement can be solved with probability 1 asynchronously, even against an omniscient, computationally unbounded adversary that can adaptively
corrupt up to f < n/3 parties. Moreover, the problem is insoluble with f ≥ n/3 corruptions. However, Bracha's [13] 1984 protocol (see also Ben-Or [8]) achieved f < n/3 resilience at the cost of exponential expected latency 2^{Θ(n)}, a bound that has never been improved in this model with f = ⌊(n−1)/3⌋ corruptions. In this article, we prove that Byzantine Agreement in the asynchronous, full-information model can be solved with probability 1 against an adaptive adversary that can corrupt
f < n/3 parties, while incurring only polynomial latency with high probability. Our protocol follows an earlier polynomial-latency protocol of King and Saia [33, 34], which had suboptimal resilience, namely f ≈ n/10^9 [33, 34]. Resilience
f = (n−1)/3 is uniquely difficult, as this is the point at which the influences of the Byzantine and honest players are roughly equal in strength. The core technical problem we solve is to design a collective coin-flipping protocol that eventually lets us flip a coin with an unambiguous outcome. In the beginning, the influence of the Byzantine players is too powerful to overcome, and they can essentially fix the coin's behavior at will. We guarantee that after just a polynomial number of executions of the coin-flipping protocol, either (a) the Byzantine players fail to fix the behavior of the coin (thereby ending the game) or (b) we can "blacklist" players such that the blacklisting rate for Byzantine players is at least as large as the blacklisting rate for good players. The blacklisting criterion is based on a simple statistical test of fraud detection.
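A minimal sketch of the fraud-detection idea behind the blacklisting (hypothetical parameters and a deliberately crude adversary, not the paper's actual protocol): an honest player's contributions to the shared coin form a ±1 random walk, so a running total far outside the random-walk range of ~√(rounds) is statistical evidence of bias.

```python
import math
import random

random.seed(0)  # fixed seed so the simulation is reproducible

N_PLAYERS, N_BYZ, ROUNDS = 10, 3, 2000
byzantine = set(range(N_BYZ))  # players 0..2 always push the coin toward +1

totals = [0] * N_PLAYERS
for _ in range(ROUNDS):
    for p in range(N_PLAYERS):
        # Honest players contribute a fair +/-1 share; Byzantine players
        # (in this crude model) always contribute +1 to bias the coin.
        share = 1 if p in byzantine else random.choice([-1, 1])
        totals[p] += share

# Fraud detection: an honest player's running total is a +/-1 random walk
# with standard deviation sqrt(ROUNDS), so a total beyond ~4*sqrt(ROUNDS)
# in absolute value is statistically implausible for an honest player.
threshold = 4 * math.sqrt(ROUNDS)
blacklist = {p for p in range(N_PLAYERS) if abs(totals[p]) > threshold}
```

In the real protocol the adversary is adaptive and far subtler than "always +1"; the guarantee quoted above is correspondingly weaker, i.e., Byzantine players are blacklisted at least as fast as honest ones, rather than perfectly.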
We consider the question of Gaussian mean testing, a fundamental task in high-dimensional distribution testing and signal processing, subject to adversarial corruptions of the samples. We focus on the relative power of different adversaries, and show that, in contrast to the common wisdom in robust statistics, there exists a strict separation between adaptive adversaries (strong contamination) and oblivious ones (weak contamination) for this task. Specifically, we resolve both the information-theoretic and computational landscapes for robust mean testing. In the exponential-time setting, we establish the tight sample complexity of testing N(0, I) against N(αv, I), where ‖v‖₂ = 1, with an ε-fraction of adversarial corruptions, to be Θ̃(max(√d/α², dε³/α⁴, min(d^{2/3}ε^{2/3}/α^{8/3}, dε/α²))), while the complexity against adaptive adversaries is Θ̃(max(√d/α², dε²/α⁴)), which is strictly worse for a large range of vanishing ε, α. To the best of our knowledge, ours is the first separation in sample complexity between the strong and weak contamination models. In the polynomial-time setting, we close a gap in the literature by providing a polynomial-time algorithm against adaptive adversaries achieving the above sample complexity Θ̃(max(√d/α², dε²/α⁴)), and a low-degree lower bound (which complements an existing reduction from planted clique) suggesting that all efficient algorithms require this many samples, even in the oblivious-adversary setting.
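A sketch of the uncorrupted testing problem underlying these bounds (no adversary, hypothetical parameters, and not the paper's robust algorithm): the squared norm of the sample mean separates N(0, I) from N(αv, I), since it concentrates near d/n under the null and near α² + d/n under the alternative.

```python
import random

random.seed(1)  # fixed seed so the simulation is reproducible

def mean_test(samples, d, alpha):
    """Accept the alternative N(alpha*v, I) if the squared norm of the
    sample mean exceeds its null expectation d/n by more than alpha^2/2."""
    n = len(samples)
    mean = [sum(x[j] for x in samples) / n for j in range(d)]
    stat = sum(m * m for m in mean)
    return stat > d / n + alpha ** 2 / 2

d, n, alpha = 20, 2000, 0.5
v = [1.0] + [0.0] * (d - 1)  # an arbitrary unit vector for the alternative

# Samples from the null N(0, I) and the alternative N(alpha*v, I).
null = [[random.gauss(0, 1) for _ in range(d)] for _ in range(n)]
alt = [[random.gauss(0, 1) + alpha * v[j] for j in range(d)] for _ in range(n)]
```

This naive statistic breaks down once an ε-fraction of samples is adversarially corrupted; designing an efficient test that survives corruption, at the sample complexities quoted above, is precisely the paper's contribution.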


Oshman, Rotem (Ed.) Nakamoto's consensus protocol works in a permissionless model and tolerates Byzantine failures, but only offers probabilistic agreement. Recently, the Sandglass protocol has shown such weaker guarantees are not a necessary consequence of a permissionless model; yet, Sandglass only tolerates benign failures, and operates in an unconventional partially synchronous model. We present Gorilla Sandglass, the first Byzantine-tolerant consensus protocol to guarantee, in the same synchronous model adopted by Nakamoto, deterministic agreement and termination with probability 1 in a permissionless setting. We prove the correctness of Gorilla by mapping executions that would violate agreement or termination in Gorilla to executions in Sandglass, where we know such violations are impossible. Establishing termination proves particularly interesting, as the mapping requires reasoning about infinite executions and their probabilities.