Abstract We study the singular set in the thin obstacle problem for degenerate parabolic equations with weight $$|y|^a$$ for $$a \in (-1,1)$$. Such a problem arises as the local extension of the obstacle problem for the fractional heat operator $$(\partial _t - \Delta _x)^s$$ for $$s \in (0,1)$$. Our main result establishes the complete structure and regularity of the singular set of the free boundary. To achieve it, we prove Almgren-Poon, Weiss, and Monneau type monotonicity formulas which generalize those for the case of the heat equation ($$a=0$$).
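For orientation, here is a short sketch (our own illustrative summary, not quoted from the paper) of the extension the abstract refers to, using the standard convention $$a = 1-2s$$, which we assume here: a solution $$u$$ of the obstacle problem for $$(\partial _t - \Delta _x)^s$$ is extended to a function $$U(x,y,t)$$ of one additional variable $$y>0$$ solving the degenerate parabolic equation

$$|y|^a\, \partial _t U = \nabla \cdot \big(|y|^a \nabla U\big), \qquad U(x,0,t) = u(x,t),$$

and, up to a positive constant $$c_s$$ depending only on $$s$$, the nonlocal operator is recovered from the weighted normal derivative on the thin space $$\{y=0\}$$,

$$(\partial _t - \Delta _x)^s u(x,t) = -\,c_s \lim _{y\rightarrow 0^+} y^{a}\, \partial _y U(x,y,t), \qquad a = 1-2s \in (-1,1).$$

In this way, the obstacle problem for the fractional heat operator becomes a thin obstacle problem for the weighted equation above, which is the local formulation studied here.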
Cazamariposas: Automated Instability Debugging in SMT-Based Program Verification
Abstract Program verification languages such as Dafny and F$$^\star $$ often rely heavily on Satisfiability Modulo Theories (SMT) solvers for proof automation. However, SMT-based verification suffers from instability, where semantically irrelevant changes in the source program can cause spurious proof failures. While existing mitigation techniques emphasize preemptive measures, we propose a complementary approach that focuses on diagnosing and repairing specific instances of instability-induced failures. Our key technique is a novel differential analysis to pinpoint problematic quantified formulas in an unstable query. We implement this technique in Cazamariposas, a tool that automatically identifies such quantified formulas and suggests fixes. We evaluate Cazamariposas on multiple large-scale systems verification projects written in three different program verification languages. Our results demonstrate Cazamariposas’ effectiveness as an instability debugger. In the majority of cases, Cazamariposas successfully isolates the issue to a single problematic quantifier, while providing a stabilizing fix.
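To make the flavor of such a differential analysis concrete, below is a minimal, hypothetical Python sketch. It is not Cazamariposas’ actual algorithm: it simply drops one quantified assertion of an SMT-LIB query at a time, re-runs the solver under a handful of random seeds, and ranks quantifiers by how stably the proof goes through without them. The z3 invocation, the seed-based notion of stability, the timeout, and all file and function names are assumptions made for illustration.

```python
"""Hypothetical sketch of a differential analysis for SMT proof instability.

NOT Cazamariposas' algorithm; it only illustrates isolating a problematic
quantified assertion by comparing solver behaviour across perturbed runs.
The z3 binary, timeout, seeds, and file names are assumptions.
"""
import re
import subprocess
from pathlib import Path

Z3 = "z3"            # assumed: z3 is on PATH
TIMEOUT_S = 30       # assumed per-run timeout
SEEDS = range(1, 6)  # a handful of random seeds to probe stability


def run(query_file: str, seed: int) -> str:
    """Run z3 with a given random seed; return 'unsat', 'sat', 'unknown' or 'timeout'."""
    try:
        out = subprocess.run(
            [Z3, f"smt.random_seed={seed}", query_file],
            capture_output=True, text=True, timeout=TIMEOUT_S,
        ).stdout
    except subprocess.TimeoutExpired:
        return "timeout"
    for verdict in ("unsat", "sat", "unknown"):  # check 'unsat' before 'sat'
        if verdict in out:
            return verdict
    return "unknown"


def stability(query_file: str) -> float:
    """Fraction of seeds on which the proof obligation is discharged (unsat)."""
    results = [run(query_file, s) for s in SEEDS]
    return results.count("unsat") / len(results)


def split_query(text: str) -> list[str]:
    """Split an SMT-LIB file into chunks, one per line-initial top-level form.
    Naive textual handling; a real tool would use a proper SMT-LIB parser."""
    return re.split(r"(?m)^(?=\()", text)


def blame_quantifiers(query_file: str, work_dir: str = "variants") -> list[tuple[int, float]]:
    """For every quantified assertion, measure how stably the query is proved
    across seeds once that assertion is removed."""
    chunks = split_query(Path(query_file).read_text())
    Path(work_dir).mkdir(exist_ok=True)
    report = []
    for i, chunk in enumerate(chunks):
        if not (chunk.startswith("(assert") and "forall" in chunk):
            continue  # only probe quantified assertions
        variant = "".join(c for j, c in enumerate(chunks) if j != i)
        vfile = Path(work_dir) / f"variant_{i}.smt2"
        vfile.write_text(variant)
        report.append((i, stability(str(vfile))))
    # highest stability first: candidates whose removal most stabilizes the proof
    return sorted(report, key=lambda t: -t[1])


if __name__ == "__main__":
    # Example (hypothetical file name): rank quantified assertions of a query.
    for idx, score in blame_quantifiers("unstable_query.smt2"):
        print(f"assertion #{idx}: proved on {score:.0%} of seeds without it")
```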
- Award ID(s): 2224279
- PAR ID: 10641861
- Publisher / Repository: Springer Nature Switzerland
- Date Published:
- Page Range / eLocation ID: 75 to 94
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- Abstract In a Merlin–Arthur proof system, the proof verifier (Arthur) accepts valid proofs (from Merlin) with probability 1, and rejects invalid proofs with probability arbitrarily close to 1. The running time of such a system is defined to be the length of Merlin’s proof plus the running time of Arthur. We provide new Merlin–Arthur proof systems for some key problems in fine-grained complexity. In several cases our proof systems have optimal running time. Our main results include: Certifying that a list of n integers has no 3-SUM solution can be done in Merlin–Arthur time $$\tilde{O}(n)$$. Previously, Carmosino et al. [ITCS 2016] showed that the problem has a nondeterministic algorithm running in $$\tilde{O}(n^{1.5})$$ time (that is, there is a proof system with proofs of length $$\tilde{O}(n^{1.5})$$ and a deterministic verifier running in $$\tilde{O}(n^{1.5})$$ time). Counting the number of k-cliques with total edge weight equal to zero in an n-node graph can be done in Merlin–Arthur time $${\tilde{O}}(n^{\lceil k/2\rceil })$$ (where $$k\ge 3$$). For odd k, this bound can be further improved for sparse graphs: for example, counting the number of zero-weight triangles in an m-edge graph can be done in Merlin–Arthur time $${\tilde{O}}(m)$$. Previous Merlin–Arthur protocols by Williams [CCC’16] and Björklund and Kaski [PODC’16] could only count k-cliques in unweighted graphs, and had worse running times for small k. Computing the All-Pairs Shortest Distances matrix for an n-node graph can be done in Merlin–Arthur time $$\tilde{O}(n^2)$$. Note this is optimal, as the matrix can have $$\Omega (n^2)$$ nonzero entries in general. Previously, Carmosino et al. [ITCS 2016] showed that this problem has an $$\tilde{O}(n^{2.94})$$ nondeterministic time algorithm. Certifying that an n-variable k-CNF is unsatisfiable can be done in Merlin–Arthur time $$2^{n/2 - n/O(k)}$$. We also observe an algebrization barrier for the previous $$2^{n/2}\cdot \textrm{poly}(n)$$-time Merlin–Arthur protocol of R. Williams [CCC’16] for $$\#$$SAT: in particular, his protocol algebrizes, and we observe there is no algebrizing protocol for k-UNSAT running in $$2^{n/2}/n^{\omega (1)}$$ time. Therefore we have to exploit non-algebrizing properties to obtain our new protocol. Certifying a Quantified Boolean Formula is true can be done in Merlin–Arthur time $$2^{4n/5}\cdot \textrm{poly}(n)$$. Previously, the only nontrivial result known along these lines was an Arthur–Merlin–Arthur protocol (where Merlin’s proof depends on some of Arthur’s coins) running in $$2^{2n/3}\cdot \textrm{poly}(n)$$ time. Due to the centrality of these problems in fine-grained complexity, our results have consequences for many other problems of interest. For example, our work implies that certifying there is no Subset Sum solution to n integers can be done in Merlin–Arthur time $$2^{n/3}\cdot \textrm{poly}(n)$$, improving on the previous best protocol by Nederlof [IPL 2017] which took $$2^{0.49991n}\cdot \textrm{poly}(n)$$ time.
- Abstract $$B^\pm \rightarrow DK^\pm $$ transitions are known to provide theoretically clean information about the CKM angle $$\gamma $$, with the most precise available methods exploiting the cascade decay of the neutral D into CP self-conjugate states. Such analyses currently require binning in the D decay Dalitz plot, while a recently proposed method replaces this binning with the truncation of a Fourier series expansion. In this paper, we present a proof of principle of a novel alternative to these two methods, in which no approximations at the level of the data representation are required. In particular, our new strategy makes no assumptions about the amplitude and strong phase variation over the Dalitz plot. This comes at the cost of a degree of ambiguity in the choice of test statistic quantifying the compatibility of the data with a given value of $$\gamma $$, with improved choices of test statistic yielding higher sensitivity. While our current proof-of-principle implementation does not demonstrate optimal sensitivity to $$\gamma $$, its conceptually novel approach opens the door to new strategies for $$\gamma $$ extraction. More studies are required to see if these can be competitive with the existing methods.
- Abstract We propose a generic compiler that can convert any zero-knowledge (ZK) proof for SIMD circuits to general circuits efficiently, and an extension that can preserve the space complexity of the proof systems. Our compiler can immediately produce new results improving upon the state of the art. By plugging our compiler into Antman, an interactive sublinear-communication protocol, we improve the overall communication complexity for general circuits from $$\mathcal {O}(C^{3/4})$$ to $$\mathcal {O}(C^{1/2})$$. Our implementation shows that for a circuit of size $$2^{27}$$, it achieves up to $$83.6\times $$ improvement on communication compared to the state-of-the-art implementation. Its end-to-end running time is at least $$70\%$$ faster in a 10 Mbps network. Using the recent results on compressed $$\varSigma $$-protocol theory, we obtain a discrete-log-based constant-round zero-knowledge argument with $$\mathcal {O}(C^{1/2})$$ communication and common random string length, improving over the state of the art that has a linear-size common random string and requires heavier computation. We improve the communication of a designated n-verifier zero-knowledge proof from $$\mathcal {O}(nC/B+n^2B^2)$$ to $$\mathcal {O}(nC/B+n^2)$$. To demonstrate the scalability of our compilers, we were able to extract a commit-and-prove SIMD ZK from Ligero and cast it in our framework. We also give one instantiation derived from LegoSNARK, demonstrating that the idea of CP-SNARK also fits in our methodology.
- Abstract The selection of low-radioactive construction materials is of utmost importance for the success of low-energy rare event search experiments. Besides radioactive contaminants in the bulk, the emanation of radioactive radon atoms from material surfaces attains increasing relevance in the effort to further reduce the background of such experiments. In this work, we present the $$^{222}$$Rn emanation measurements performed for the XENON1T dark matter experiment. Together with the bulk impurity screening campaign, the results enabled us to select the radio-purest construction materials, targeting a $$^{222}$$Rn activity concentration of $$10\,\upmu \mathrm{Bq}/\mathrm{kg}$$ in $$3.2\,\mathrm{t}$$ of xenon. The knowledge of the distribution of the $$^{222}$$Rn sources allowed us to selectively eliminate problematic components in the course of the experiment. The predictions from the emanation measurements were compared to data of the $$^{222}$$Rn activity concentration in XENON1T. The final $$^{222}$$Rn activity concentration of $$(4.5\pm 0.1)\,\upmu \mathrm{Bq}/\mathrm{kg}$$ in the target of XENON1T is the lowest ever achieved in a xenon dark matter experiment.