Title: Time for All Programs, Not Just Real-Time Programs
We argue that the utility of time as a semantic property of software is not limited to the domain of real-time systems. This paper outlines four concurrent design patterns: alignment, precedence, simultaneity, and consistency, all of which are relevant to general-purpose software applications. We show that a semantics of logical time provides a natural framework for reasoning about concurrency, makes some difficult problems easy, and offers a quantified interpretation of the CAP theorem, enabling quantified evaluation of the tradeoff between consistency and availability.
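
Although the paper is about semantics rather than any particular implementation, a minimal Python sketch (ours, not the authors') can make the idea of logical time concrete: events carry a (logical time, microstep) tag, events with equal tags are treated as logically simultaneous, and processing strictly in tag order fixes precedence deterministically, independent of the physical order in which events arrive.

```python
import heapq
from dataclasses import dataclass, field
from typing import Callable

# A tag is a (logical time, microstep) pair; tag order, not arrival order,
# determines the order in which reactions execute.
Tag = tuple[int, int]

@dataclass(order=True)
class Event:
    tag: Tag
    action: Callable[[], None] = field(compare=False)

class LogicalScheduler:
    """Processes events strictly in tag order, independent of when they were scheduled."""
    def __init__(self) -> None:
        self._queue: list[Event] = []

    def schedule(self, tag: Tag, action: Callable[[], None]) -> None:
        heapq.heappush(self._queue, Event(tag, action))

    def run(self) -> None:
        while self._queue:
            heapq.heappop(self._queue).action()

# Scheduled out of order on purpose; the output order is nonetheless deterministic.
sched = LogicalScheduler()
sched.schedule((20, 0), lambda: print("t=20, microstep 0"))
sched.schedule((10, 1), lambda: print("t=10, microstep 1"))
sched.schedule((10, 0), lambda: print("t=10, microstep 0"))
sched.run()
```

Under a scheme like this, questions of alignment, simultaneity, and consistency can be posed as questions about tag relationships rather than about wall-clock timing.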
Award ID(s):
1836601
NSF-PAR ID:
10311562
Author(s) / Creator(s):
Publisher / Repository:
Springer
Date Published:
Journal Name:
International Symposium on Leveraging Applications of Formal Methods (ISoLA 2021)
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. The importance of alternative methods for measuring the Hubble constant, such as time-delay cosmography, is highlighted by the recent Hubble tension. It is paramount to thoroughly investigate and rule out systematic biases in all measurement methods before we can accept new physics as the source of this tension. In this study, we perform a check for systematic biases in the lens modelling procedure of time-delay cosmography by comparing independent and blind time-delay predictions of the system WGD 2038−4008 from two teams using two different software programs: GLEE and LENSTRONOMY. The predicted time delays from the two teams incorporate the stellar kinematics of the deflector and the external convergence from line-of-sight structures. The un-blinded time-delay predictions from the two teams agree within 1.2σ, implying that once the time delay is measured the inferred Hubble constant will also be mutually consistent. However, there is a ∼4σ discrepancy between the power-law model slope and external shear, which is a significant discrepancy at the level of lens models before the stellar kinematics and the external convergence are incorporated. We identify the difference in the reconstructed point spread function (PSF) to be the source of this discrepancy. When the same reconstructed PSF was used by both teams, we achieved excellent agreement, within ∼0.6σ, indicating that potential systematics stemming from source reconstruction algorithms and investigator choices are well under control. We recommend that future studies supersample the PSF as needed and marginalize over multiple algorithms or realizations for the PSF reconstruction to mitigate the systematics associated with the PSF. A future study will measure the time delays of the system WGD 2038−4008 and infer the Hubble constant based on our mass models.

     
  2. An accurate sense of elapsed time is essential for the safe and correct operation of hardware, software, and networked systems. Unfortunately, an adversary can manipulate the system's time and violate causality, consistency, and scheduling properties of underlying applications. Although cryptographic techniques are used to secure data, they cannot ensure time security, as securing a time source is much more challenging: the result of a time query must be delivered in a timely fashion. In this paper, we first describe general attack vectors that can compromise a system's sense of time. To counter these attacks, we propose a secure time architecture, TIMESEAL, that leverages a Trusted Execution Environment (TEE) to secure time-based primitives. While the CPU security features of TEEs secure code and data in protected memory, we show that the time sources available in a TEE are still prone to OS attacks. TIMESEAL puts forward a high-resolution time source that protects against OS delay and scheduling attacks. Our TIMESEAL prototype is based on Intel SGX and provides sub-millisecond (msec) resolution, compared to the 1-second resolution of SGX trusted time. It also securely bounds the relative time accuracy to msec under OS attacks. In essence, TIMESEAL provides trusted timestamping and trusted scheduling to critical applications in the presence of a strong adversary, covering the temporal use cases pertinent to secure sensing, computing, and actuating in networked systems.
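
TIMESEAL itself is an SGX-based architecture; purely as a hedged illustration of one ingredient the abstract mentions, bounding how much a delayed delivery can skew a time reading, here is a toy Python sketch. The 1 msec bound and the use of time.time()/time.monotonic() as stand-ins for a trusted time source are our assumptions, not the TIMESEAL design.

```python
import time

class BoundedTimestamper:
    """Toy idea: reject a time reading whose acquisition window is wider than
    max_window, so a scheduling or delivery delay cannot silently skew it."""

    def __init__(self, max_window: float = 0.001):  # 1 msec bound (hypothetical)
        self.max_window = max_window

    def timestamp(self) -> float:
        before = time.monotonic()
        value = time.time()   # stand-in for querying a trusted coarse time source
        after = time.monotonic()
        if after - before > self.max_window:
            # The reading may have been delayed in delivery; retry or escalate
            # instead of trusting it.
            raise RuntimeError("time reading exceeded the uncertainty bound")
        return value

print(BoundedTimestamper().timestamp())
```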
    more » « less
  3. We propose an algorithm to impute and forecast a time series by transforming the observed time series into a matrix, utilizing matrix estimation to recover missing values and de-noise observed entries, and performing linear regression to make predictions. At the core of our analysis is a representation result, which states that for a large class of models, the transformed time series matrix is (approximately) low-rank. In effect, this generalizes the widely used Singular Spectrum Analysis (SSA) in the time series literature, and allows us to establish a rigorous link between time series analysis and matrix estimation. The key to establishing this link is constructing a Page matrix with non-overlapping entries rather than a Hankel matrix as is commonly done in the literature (e.g., SSA). This particular matrix structure allows us to provide finite sample analysis for imputation and prediction, and prove the asymptotic consistency of our method. Another salient feature of our algorithm is that it is model agnostic with respect to both the underlying time dynamics and the noise distribution in the observations. The noise-agnostic property of our approach allows us to recover the latent states when only given access to noisy and partial observations, à la a Hidden Markov Model; e.g., recovering the time-varying parameter of a Poisson process without knowing that the underlying process is Poisson. Furthermore, since our forecasting algorithm requires regression with noisy features, our approach suggests a matrix estimation based method, coupled with a novel, non-standard matrix estimation error metric, to solve the errors-in-variables regression problem, which could be of interest in its own right. Through synthetic and real-world datasets, we demonstrate that our algorithm outperforms standard software packages (including R libraries) in the presence of missing data as well as high levels of noise.
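
A minimal NumPy sketch of the pipeline described above: build a Page matrix from non-overlapping segments, denoise it with a hard-thresholded SVD, and regress the last row on the rows above it to forecast. The segment length, rank, and sinusoidal toy data are illustrative choices, not values from the paper.

```python
import numpy as np

def page_matrix(series: np.ndarray, L: int) -> np.ndarray:
    """Stack non-overlapping length-L segments as columns (a Page matrix,
    unlike the overlapping Hankel matrix used in classical SSA)."""
    n = (len(series) // L) * L
    return series[:n].reshape(-1, L).T          # shape (L, n // L)

def low_rank_denoise(M: np.ndarray, rank: int) -> np.ndarray:
    """Hard-threshold the SVD to exploit the (approximately) low-rank structure."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U[:, :rank] * s[:rank]) @ Vt[:rank, :]

# Toy usage with hypothetical parameters:
rng = np.random.default_rng(0)
t = np.arange(400)
series = np.sin(2 * np.pi * t / 50) + 0.3 * rng.standard_normal(len(t))

L, rank = 20, 4
M_hat = low_rank_denoise(page_matrix(series, L), rank)

# Learn to predict the last row of the Page matrix from the rows above it,
# then apply the learned weights to the most recent L-1 denoised values.
X, y = M_hat[:-1, :].T, M_hat[-1, :]
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
next_value = M_hat[1:, -1] @ beta               # one-step-ahead forecast (illustrative)
print(round(float(next_value), 3))
```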
  4. Degree of canopy cover is linked to transpiration, carbon cycling, and primary productivity of an ecosystem. In modern ecology, canopy structure is often quantified as Leaf Area Index (LAI), which is the amount of overstory leaf coverage relative to ground area. Although a key aspect of vegetation, the degree of canopy cover has proven difficult to reconstruct in deep time. One method, Reconstructed Leaf Area Index (rLAI), was developed to infer canopy structure using the relationship between non-grass leaf epidermal phytolith (plant biosilica) morphology and leaf coverage in modern forests. This method leverages the observed correlation between epidermal phytolith size, shape (margin undulation), and light availability. When more light is available in a canopy, epidermal phytoliths tend to be smaller and less undulate, whereas less light availability is linked to larger and more undulate epidermal phytoliths. However, the calibration set used to develop this method was compiled from field sites and samples from localities in Costa Rica, and it remains unclear how applicable it is to temperate North American fossil sites, due to a lack of data from relevant vegetation types and taxonomic differences between plant communities in the Neotropics vs. mid-latitude North America. For example, preliminary results measuring rLAI in phytolith assemblages from the Miocene of the North American Great Plains have yielded surprisingly high degrees of canopy density despite containing high relative abundances of open-habitat grasses. To test whether vegetational and taxonomic differences impact the calibration set, we constructed a new North American calibration using 24 quadrats from six sites representing reasonable modern analogs for Miocene vegetation in eastern North America. Specifically, we sampled in Bennett Springs State Park in Lebanon, MO; Mark Twain National Forest in Rolla, MO; Tellico in Franklin, NC; and Congaree National Park in Hopkins, SC. All sites include a range of canopy covers and vegetation types, from oak savannas and oak woodlands to mixed hardwood forests, pine savannas, and old-growth bottomland forests. From each quadrat, we collected a soil sample and took hemispherical photos of the local canopy. Biosilica was extracted from the modern soil samples in the lab, yielding phytolith assemblages that were scanned for epidermal phytoliths using a compound microscope. Recovered epidermal phytolith size and margin undulation were measured, and assemblage averages were used to predict measured LAI at each quadrat. Hemispherical photographs were processed using the software Gap Light Analyzer to obtain LAI values. We hypothesize that there will be a linear relationship between actual LAI and LAI calculated from epidermal phytolith morphology, but that the relationship will differ from that found in South America. Results will be used to reevaluate canopy coverage at Great Plains Miocene sites and will be applied to Pacific Northwest Miocene sites, in both cases to understand changes to vegetation during global climatic events in their respective regions.
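
The calibration step amounts to a regression of photo-derived LAI on assemblage-mean phytolith size and margin undulation. Below is a minimal Python sketch with entirely invented numbers; the variable values, units, and two-predictor linear form are our assumptions, not the study's data or final model.

```python
import numpy as np

# Hypothetical per-quadrat calibration data: assemblage-mean epidermal phytolith
# size, mean margin undulation index, and LAI from hemispherical photos.
size       = np.array([18.0, 24.0, 22.0, 30.0, 35.0, 41.0])
undulation = np.array([1.1, 1.4, 1.3, 1.8, 2.0, 2.4])
lai        = np.array([0.9, 1.8, 1.5, 2.9, 3.6, 4.5])

# Ordinary least squares: LAI ~ b0 + b1*size + b2*undulation
X = np.column_stack([np.ones_like(size), size, undulation])
coef, *_ = np.linalg.lstsq(X, lai, rcond=None)

def predict_rlai(mean_size: float, mean_undulation: float) -> float:
    """Predict rLAI for a fossil assemblage from its mean phytolith morphology."""
    return float(coef @ np.array([1.0, mean_size, mean_undulation]))

print(round(predict_rlai(30.0, 1.8), 2))
```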
  5. Recent trends in software-defined networking have extended network programmability to the data plane. Unfortunately, the chance of introducing bugs increases significantly. Verification can help prevent bugs by assuring that the program does not violate its requirements. Although research on the verification of P4 programs is very active, we still need tools that make it easier for programmers to express properties and to rapidly verify complex invariants. In this paper, we leverage assertions and symbolic execution to propose a more general P4 verification approach. Developers annotate P4 programs with assertions expressing general network correctness properties; the result is transformed into C models and all possible paths are symbolically executed. We implement a prototype and use it to show the feasibility of the verification approach. Because symbolic execution does not scale well, we investigate a set of techniques to speed up the process for the specific case of P4 programs. We use the prototype to show the gains provided by three speed-up techniques (use of constraints, program slicing, parallelization) and experiment with different compiler optimization choices. We show that our tool can uncover a broad range of bugs, and can do so in less than a minute for various P4 applications.
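
The paper's prototype translates annotated P4 into C models; purely as a rough, language-agnostic illustration of the underlying idea, the Python sketch below enumerates the symbolic paths of a toy two-branch pipeline and asks the z3 SMT solver whether an annotated assertion can be violated on any path. The port encoding, the TTL assertion, and the deliberately buggy forward condition are invented for this example.

```python
# Toy assertion check over symbolic paths with the z3 SMT solver
# (pip install z3-solver); this is not the paper's tool chain.
from z3 import Int, IntVal, Solver, And, Not, sat

ttl = Int("ttl")  # symbolic header field

# Each path of a toy ingress control: (path constraint, egress port, ttl after path).
# Port -1 encodes "drop" (an invented convention for this example).
paths = [
    (ttl > 0,  IntVal(1),  ttl - 1),  # forward branch (deliberately buggy: admits ttl == 1)
    (ttl <= 0, IntVal(-1), ttl),      # drop branch
]

# Assertion a developer might annotate: a forwarded packet never leaves with ttl == 0.
for constraint, out_port, out_ttl in paths:
    s = Solver()
    # Look for an input that takes this path, forwards the packet, and violates the assertion.
    s.add(And(constraint, out_port != -1, Not(out_ttl > 0)))
    if s.check() == sat:
        print("assertion violated on this path, e.g.:", s.model())
    else:
        print("path verified")
```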