Compositional compiler verification addresses the separate compilation of program components, possibly by different verified compilers. Logical relations are widely used to prove the correctness of program transformations in higher-order languages; however, they do not scale to compositional verification of multi-pass compilers because they are not transitive. The only known technique that applies to compositional verification of multi-pass compilers for higher-order languages is parametric inter-language simulations (PILS), which is, however, significantly more complicated than traditional proof techniques for compiler correctness. In this paper, we present a novel verification framework for lightweight compositional compiler correctness. We demonstrate that, by imposing the additional restriction that program components be compiled by pipelines that go through the same sequence of intermediate representations, logical-relation proofs can be transitively composed to derive an end-to-end compositional specification for multi-pass compiler pipelines. Unlike traditional logical-relation frameworks, ours supports divergence preservation, even when transformations reduce the number of program steps; we achieve this by parameterizing our logical relations with a pair of relational invariants. We apply this technique to verify a multi-pass, optimizing middle-end pipeline for CertiCoq, a compiler from Gallina (Coq's specification language) to C. The pipeline optimizes and closure-converts an untyped functional intermediate language (ANF or CPS) into a subset of that language without nested functions, which can easily be code-generated to low-level languages. Notably, our pipeline performs more complex closure-allocation optimizations than the state of the art in verified compilation.
Using our novel verification framework, we prove an end-to-end theorem for our pipeline that covers both termination and divergence and applies to whole-program and separate compilation, even when different modules are compiled with different optimizations. Our results are mechanized in the Coq proof assistant.
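The closure-conversion step described in the abstract can be illustrated with a small hand-worked sketch. This is in Python rather than the pipeline's ANF/CPS intermediate language, and all names here are hypothetical: the point is only that a nested function capturing free variables becomes closed top-level code paired with an explicit environment.

```python
# Source-level version: `add` has a nested function capturing `x`.
def add(x):
    def inner(y):        # free variable: x
        return x + y
    return inner

# Closure-converted version: `inner_code` is closed (no free variables);
# a closure is an explicit pair of a code pointer and an environment.
def inner_code(env, y):
    return env["x"] + y

def add_cc(x):
    return (inner_code, {"x": x})   # closure = (code, environment)

def apply_closure(clo, arg):
    code, env = clo
    return code(env, arg)

# Both versions agree on every input.
assert add(3)(4) == apply_closure(add_cc(3), 4) == 7
```

The converted program has no nested functions, which is the shape the abstract says can be easily code-generated to low-level languages.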
Minimally Comparing Relational Abstract Domains
Value-based static analysis techniques express computed program invariants as logical formulas over program variables. Researchers and practitioners use these invariants to aid in software engineering and verification tasks. When selecting abstract domains, practitioners weigh the cost of a domain against its expressiveness. However, an abstract domain's expressiveness tends to be stated in absolute terms: either mathematically, via the sub-polyhedra the domain is capable of describing; empirically, using a set of known properties to verify; or empirically, via logical entailment using the entire invariant of the domain at each program point. Due to carry-over effects, however, the last technique can be problematic because it tends to provide simplistic and imprecise comparisons.
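The entailment-based comparison mentioned above can be sketched for a concrete relational domain such as Zones, where each state is a difference-bound matrix (DBM) bounding v_i - v_j. A minimal Python illustration, with hypothetical helper names, checks whether one state entails another by first closing its DBM:

```python
import math

def close(m):
    # Floyd-Warshall closure makes all implied difference bounds explicit.
    n = len(m)
    c = [row[:] for row in m]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                c[i][j] = min(c[i][j], c[i][k] + c[k][j])
    return c

def entails(a, b):
    # a |= b iff every bound in b is already implied by the closure of a.
    ca = close(a)
    n = len(a)
    return all(ca[i][j] <= b[i][j] for i in range(n) for j in range(n))

INF = math.inf
# a: x - y <= 1 and y - x <= 0; b: only x - y <= 2 (strictly weaker).
a = [[0, 1], [0, 0]]
b = [[0, 2], [INF, 0]]
assert entails(a, b) and not entails(b, a)
```

Comparing whole invariants this way is exactly the point-wise entailment the abstract describes as prone to carry-over effects when used to compare domains.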
- Award ID(s): 1942044
- PAR ID: 10508631
- Editor(s): Andre, Etienne; Sun, Jun
- Publisher / Repository: Springer Nature Switzerland
- Date Published:
- Journal Name: Automated Technology for Verification and Analysis
- Volume: 14216
- ISBN: 978-3-031-45332-8
- Page Range / eLocation ID: 59--75
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- We introduce Flux, which shows how logical refinements can work hand in glove with Rust's ownership mechanisms to yield ergonomic type-based verification of low-level pointer-manipulating programs. First, we design a novel refined type system for Rust that indexes mutable locations with pure (immutable) values that can appear in refinements, and then exploits Rust's ownership mechanisms to abstract sub-structural reasoning about locations within Rust's polymorphic type constructors, while supporting strong updates. We formalize the crucial dependency upon Rust's strong aliasing guarantees by exploiting the Stacked Borrows aliasing model to prove that "well-borrowed evaluations of well-typed programs do not get stuck". Second, we implement our type system in Flux, a plug-in to the Rust compiler that exploits the factoring of complex invariants into types and refinements to efficiently synthesize loop annotations, including complex quantified invariants describing the contents of containers, via liquid inference. Third, we evaluate Flux with a benchmark suite of vector-manipulating programs and parts of a previously verified secure sandboxing library to demonstrate the advantages of refinement types over program logics as implemented in the state-of-the-art Prusti verifier. While Prusti's more expressive program logic can, in general, verify deep functional correctness specifications, for the lightweight but ubiquitous and important verification use cases covered by our benchmarks, liquid typing makes verification ergonomic by slashing specification lines by a factor of two, verification time by an order of magnitude, and annotation overhead from up to 24% of code size (average 14%) to nothing at all.
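Flux itself expresses refinements as attribute annotations on Rust signatures. As a loose, language-agnostic illustration of the underlying idea only, a refinement pairs a base type with a predicate over values of that type. This Python sketch uses entirely hypothetical names and checks refinements at runtime, whereas Flux discharges them statically:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Refined:
    base: type                         # the underlying base type
    pred: Callable[[object], bool]     # the refinement predicate

def check(r, v):
    # A value inhabits a refined type if it has the base type
    # and satisfies the predicate.
    return isinstance(v, r.base) and r.pred(v)

# An index refined to be in bounds for a vector of length 4,
# in the spirit of a Flux signature like `usize{v: v < len}`.
in_bounds = Refined(int, lambda v: 0 <= v < 4)

assert check(in_bounds, 3)
assert not check(in_bounds, 4)
```

The factoring the abstract describes is visible even here: the shape lives in the base type, while the lightweight invariant lives in the predicate.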
- David, Cristina; Sun, Meng (Ed.) Verification techniques express program states as logical formulas over program variables. For example, symbolic execution and abstract interpretation encode program states as a set of linear integer inequalities. However, for real-world programs these formulas tend to become large, which affects the scalability of analyses. To address this problem, researchers have developed complementary approaches that either remove redundant inequalities or extract a subset of inequalities sufficient for specific reasoning, i.e., formula slicing. For arbitrary linear integer inequalities, such reduction approaches either have high complexity or over-approximate. However, the efficiency and precision of these approaches can be improved for the restricted type of logical formulas used in relational numerical abstract domains. While previous work investigated custom efficient redundant-inequality elimination for Zones states, our work examines custom semantic slicing algorithms that identify a minimal set of changed inequalities in Zones states.
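The slicing idea can be illustrated with a purely syntactic approximation: given two Zones states over the same variables, report only the difference constraints whose bounds changed. This Python sketch uses a hypothetical representation, and is only an approximation of the goal: the paper's algorithms are semantic, so they can discard changes that are implied by unchanged constraints, which a syntactic diff cannot.

```python
def changed_constraints(before, after):
    # Each state maps a variable pair (x, y) to the bound c in "x - y <= c".
    keys = set(before) | set(after)
    # Keep only constraints whose bound differs between the two states.
    return {k: after.get(k) for k in keys if before.get(k) != after.get(k)}

before = {("x", "y"): 1, ("y", "z"): 0, ("x", "z"): 1}
after  = {("x", "y"): 1, ("y", "z"): 2, ("x", "z"): 3}

assert changed_constraints(before, after) == {("y", "z"): 2, ("x", "z"): 3}
```

Reporting only the changed conjuncts, rather than the whole invariant, is what keeps the comparison from becoming simplistic as formulas grow.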
- We present SaSTL, a novel Spatial Aggregation Signal Temporal Logic, for the efficient runtime monitoring of safety and performance requirements in smart cities. We first describe a study of over 1,000 smart-city requirements, some of which cannot be specified using existing logics such as Signal Temporal Logic (STL) and its variants. To tackle this limitation, we develop two new logical operators in SaSTL that augment STL for expressing the spatial aggregation and spatial counting characteristics commonly found in real city requirements. We also develop efficient monitoring algorithms that can check a SaSTL requirement in parallel over multiple data streams (e.g., generated by multiple sensors distributed spatially across a city). We evaluate our SaSTL monitor by applying it to two case studies with large-scale real city sensing data (e.g., up to 10,000 sensors in one requirement). The results show that SaSTL is significantly more expressive than other spatial-temporal logics and significantly reduces the computation time for monitoring requirements. We also demonstrate, via simulated experiments, that the SaSTL monitor can help improve the safety and performance of smart cities.
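The spatial-aggregation operators described above can be illustrated with a toy monitor: aggregate the readings of all sensors within a radius of a location, then check a threshold requirement at one time step. In this Python sketch, every name, reading, and threshold is hypothetical and unrelated to SaSTL's actual syntax or case studies:

```python
import math

def spatial_avg(readings, center, radius):
    # readings: list of ((x, y), value) pairs from spatially distributed sensors.
    # Aggregate (average) the values of sensors within `radius` of `center`.
    vals = [v for (pos, v) in readings if math.dist(pos, center) <= radius]
    return sum(vals) / len(vals) if vals else None

# Requirement sketch: "the average reading within radius 1.0 of the site
# stays below 35" (one time step of what a temporal monitor would check).
readings = [((0.0, 0.0), 30.0), ((0.5, 0.5), 32.0), ((5.0, 5.0), 90.0)]
avg = spatial_avg(readings, center=(0.0, 0.0), radius=1.0)

assert avg is not None and avg < 35   # the far-away 90.0 reading is excluded
```

A full SaSTL monitor additionally nests such spatial aggregates inside temporal operators and evaluates many sensor streams in parallel.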
- Behavioral software contracts allow programmers to strengthen the obligations and promises that they express with conventional types. They lack expressive power, though, when it comes to invariants that hold across several function calls. Trace contracts narrow this expressiveness gap. A trace contract is a predicate over the sequence of values that flow through function calls and returns. This paper presents a principled design, an implementation, and an evaluation of trace contracts.
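The notion of a predicate over the sequence of values flowing through calls and returns can be sketched as a runtime check. This Python decorator is a hypothetical API, not the paper's design: it records each return value and re-checks a trace predicate after every call, so the contract can express invariants that span several invocations.

```python
def trace_contract(pred):
    # pred receives the full sequence of values returned so far.
    def wrap(f):
        trace = []
        def g(*args):
            r = f(*args)
            trace.append(r)
            assert pred(trace), f"trace contract violated: {trace}"
            return r
        return g
    return wrap

counter = [0]

# Cross-call invariant: ticket numbers must be strictly increasing,
# which no single-call pre/postcondition can state.
@trace_contract(lambda t: all(a < b for a, b in zip(t, t[1:])))
def next_ticket():
    counter[0] += 1
    return counter[0]

assert [next_ticket(), next_ticket(), next_ticket()] == [1, 2, 3]
```

A conventional contract sees one call at a time; the trace predicate here sees the whole history, which is precisely the expressiveness gap the abstract describes.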