Title: Time for All Programs, Not Just Real-Time Programs
We argue that the utility of time as a semantic property of software is not limited to the domain of real-time systems. This paper outlines four concurrent design patterns: alignment, precedence, simultaneity, and consistency, all of which are relevant to general-purpose software applications. We show that a semantics of logical time provides a natural framework for reasoning about concurrency, makes some difficult problems easy, and offers a quantified interpretation of the CAP theorem, enabling a quantitative evaluation of the tradeoff between consistency and availability.
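
As a minimal Python sketch of this idea (the Tag and Event types below are invented for illustration, not the paper's formalism), events can be stamped with a totally ordered (time, microstep) tag and processed in tag order: unequal tags enforce precedence, while equal tags make events logically simultaneous, independent of wall-clock arrival order.

    import heapq
    from dataclasses import dataclass, field
    from typing import Callable

    @dataclass(order=True, frozen=True)
    class Tag:
        """Logical timestamp: a totally ordered (time, microstep) pair."""
        time: int        # logical time, e.g., in nanoseconds
        microstep: int   # orders causally related events at the same logical time

    @dataclass(order=True)
    class Event:
        tag: Tag
        action: Callable[[], None] = field(compare=False)

    queue: list[Event] = []

    def schedule(tag: Tag, action: Callable[[], None]) -> None:
        heapq.heappush(queue, Event(tag, action))

    def run() -> None:
        # Processing strictly in tag order gives precedence for free;
        # events whose tags are equal are logically simultaneous.
        while queue:
            heapq.heappop(queue).action()

    schedule(Tag(10, 0), lambda: print("second: precedes the microstep below"))
    schedule(Tag(10, 1), lambda: print("third: same logical time, later microstep"))
    schedule(Tag(5, 0), lambda: print("first: earliest tag"))
    run()

Under such a semantics, "simultaneous" is a property of the tags rather than of physical arrival order, which is what makes patterns like alignment checkable.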
Award ID(s):
1836601
PAR ID:
10311562
Author(s) / Creator(s):
Publisher / Repository:
Springer
Date Published:
Journal Name:
International Symposium on Leveraging Applications of Formal Methods (ISoLA 2021)
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. An accurate sense of elapsed time is essential for the safe and correct operation of hardware, software, and networked systems. Unfortunately, an adversary can manipulate the system's sense of time and violate the causality, consistency, and scheduling properties of underlying applications. Although cryptographic techniques are used to secure data, they cannot ensure time security: securing a time source is much more challenging, given that the result of a time query must be delivered in a timely fashion. In this paper, we first describe general attack vectors that can compromise a system's sense of time. To counter these attacks, we propose a secure time architecture, TIMESEAL, that leverages a Trusted Execution Environment (TEE) to secure time-based primitives. While the CPU security features of TEEs secure code and data in protected memory, we show that the time sources available in a TEE are still prone to OS attacks. TIMESEAL puts forward a high-resolution time source that protects against OS delay and scheduling attacks. Our TIMESEAL prototype is based on Intel SGX and provides sub-millisecond resolution, as compared to the 1-second resolution of SGX trusted time. It also securely bounds the relative time accuracy to milliseconds under OS attacks. In essence, TIMESEAL provides trusted timestamping and trusted scheduling to critical applications in the presence of a strong adversary, delivering all temporal use cases pertinent to secure sensing, computing, and actuating in networked systems.
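    A hypothetical sketch of the delay-bounding idea described above (this is not TIMESEAL's SGX implementation; time.monotonic_ns stands in for a TEE-protected counter, and the threshold is invented): bracket each untrusted time query with trusted counter reads, and reject any reading whose query window an OS delay or scheduling attack could have stretched.

        import time

        MAX_QUERY_WINDOW_NS = 1_000_000  # 1 ms bound; hypothetical threshold

        def trusted_now_ns() -> int:
            # Stand-in for a TEE-protected monotonic counter; a TIMESEAL-style
            # system would read this from inside the enclave.
            return time.monotonic_ns()

        def secure_timestamp_ns() -> int:
            """Return a wall-clock reading whose OS-induced error is bounded."""
            before = trusted_now_ns()
            ts = time.time_ns()              # untrusted, OS-provided clock
            after = trusted_now_ns()
            if after - before > MAX_QUERY_WINDOW_NS:
                raise RuntimeError("possible OS delay/scheduling attack")
            # The true time lies within ts +/- (after - before), so the
            # relative accuracy is securely bounded by the query window.
            return ts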
  2. The importance of alternative methods for measuring the Hubble constant, such as time-delay cosmography, is highlighted by the recent Hubble tension. It is paramount to thoroughly investigate and rule out systematic biases in all measurement methods before we can accept new physics as the source of this tension. In this study, we perform a check for systematic biases in the lens modelling procedure of time-delay cosmography by comparing independent and blind time-delay predictions of the system WGD 2038−4008 from two teams using two different software programs: GLEE and LENSTRONOMY. The predicted time delays from the two teams incorporate the stellar kinematics of the deflector and the external convergence from line-of-sight structures. The unblinded time-delay predictions from the two teams agree within 1.2σ, implying that once the time delay is measured the inferred Hubble constant will also be mutually consistent. However, there is a ∼4σ discrepancy between the power-law model slope and external shear, which is a significant discrepancy at the level of lens models before the stellar kinematics and the external convergence are incorporated. We identify the difference in the reconstructed point spread function (PSF) to be the source of this discrepancy. When the same reconstructed PSF was used by both teams, we achieved excellent agreement, within ∼0.6σ, indicating that potential systematics stemming from source reconstruction algorithms and investigator choices are well under control. We recommend that future studies supersample the PSF as needed and marginalize over multiple algorithms or realizations for the PSF reconstruction to mitigate the systematics associated with the PSF. A future study will measure the time delays of the system WGD 2038−4008 and infer the Hubble constant based on our mass models.
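    To make the supersampling recommendation concrete, here is a generic sketch (not the GLEE or LENSTRONOMY code; the grid sizes and factor are arbitrary) that interpolates a PSF kernel onto a finer grid and rebins it back to the native pixel scale while preserving flux.

        import numpy as np
        from scipy.ndimage import zoom

        def supersample_psf(psf: np.ndarray, factor: int) -> np.ndarray:
            """Interpolate a PSF onto a grid `factor` times finer."""
            fine = zoom(psf, factor, order=3)
            fine = np.clip(fine, 0.0, None)   # cubic interpolation can dip below zero
            return fine / fine.sum()          # renormalize to unit flux

        def rebin(fine: np.ndarray, factor: int) -> np.ndarray:
            """Sum factor-by-factor blocks back to the native pixel scale."""
            ny, nx = fine.shape
            return fine.reshape(ny // factor, factor, nx // factor, factor).sum(axis=(1, 3))

        psf = np.random.default_rng(0).random((11, 11))
        psf /= psf.sum()
        fine = supersample_psf(psf, 3)        # 33 x 33 supersampled kernel
        assert np.isclose(rebin(fine, 3).sum(), 1.0)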
  3. Recent trends in software-defined networking have extended network programmability to the data plane. Unfortunately, the chance of introducing bugs increases significantly. Verification can help prevent bugs by assuring that the program does not violate its requirements. Although research on the verification of P4 programs is very active, we still need tools that make it easier for programmers to express properties and to rapidly verify complex invariants. In this paper, we leverage assertions and symbolic execution to propose a more general P4 verification approach. Developers annotate P4 programs with assertions expressing general network correctness properties; the result is transformed into C models, and all possible paths are symbolically executed. We implement a prototype and use it to show the feasibility of the verification approach. Because symbolic execution does not scale well, we investigate a set of techniques to speed up the process for the specific case of P4 programs. We use the prototype to show the gains provided by three speed-up techniques (use of constraints, program slicing, and parallelization), and experiment with different compiler optimization choices. We show that our tool can uncover a broad range of bugs, and can do so in less than a minute across various P4 applications.
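    As a toy illustration of assertion checking via symbolic execution (this is not the authors' tool, which translates annotated P4 into C models; the header fields and the TTL property are invented), the Z3 Python bindings can search for an input packet that violates an assertion along some path:

        from z3 import BitVec, If, Solver, sat

        # Symbolic packet header fields: 8-bit TTL, 16-bit ethertype.
        ttl = BitVec("ttl", 8)
        ethertype = BitVec("ethertype", 16)

        # Toy pipeline: decrement the TTL of IPv4 packets, pass others through.
        new_ttl = If(ethertype == 0x0800, ttl - 1, ttl)

        # Programmer's assertion: IPv4 packets never leave with TTL == 0xFF,
        # i.e., the decrement never underflows.
        solver = Solver()
        solver.add(ethertype == 0x0800, new_ttl == 0xFF)

        if solver.check() == sat:
            print("assertion violated by:", solver.model())  # finds ttl == 0
        else:
            print("assertion holds on all paths")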
  4. We propose a broad class of models for time series of curves (functions) that can be used to quantify near long-range dependence or near unit root behavior. We establish fundamental properties of these models and rates of consistency for the sample mean function and the sample covariance operator. The latter plays a role analogous to sample cross-covariances for multivariate time series, but is far more important in the functional setting because its eigenfunctions are used in principal component analysis, a major tool in functional data analysis for dimension reduction and feature extraction. We also establish a central limit theorem (CLT) for functions following our model. Both the consistency rates and the normalizations in the CLT are nonstandard; they reflect the local unit root behavior and the long memory structure at moderate lags.
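    A minimal numpy sketch of the two sample objects these rates concern (the data-generating step is only a placeholder, not the paper's model; curves are assumed to be observed on a common grid): the sample mean function, and the eigenfunctions of the discretized sample covariance operator used in functional principal component analysis.

        import numpy as np

        rng = np.random.default_rng(1)
        n, m = 200, 50                 # n curves observed on an m-point grid
        # Placeholder curves; the paper's model would instead generate
        # near-unit-root, long-memory functional data.
        X = np.cumsum(rng.standard_normal((n, m)), axis=1) / np.sqrt(m)

        mean_fn = X.mean(axis=0)               # sample mean function
        centered = X - mean_fn
        cov_op = centered.T @ centered / n     # discretized covariance operator

        eigvals, eigfuns = np.linalg.eigh(cov_op)
        order = np.argsort(eigvals)[::-1]      # sort eigenpairs, largest first
        eigfuns = eigfuns[:, order]
        # Scores on the leading eigenfunctions give the principal components
        # used for dimension reduction or feature extraction.
        scores = centered @ eigfuns[:, :3]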
  5. We propose an algorithm to impute and forecast a time series by transforming the observed time series into a matrix, utilizing matrix estimation to recover missing values and de-noise observed entries, and performing linear regression to make predictions. At the core of our analysis is a representation result, which states that for a large class of models the transformed time series matrix is (approximately) low-rank. In effect, this generalizes the widely used Singular Spectrum Analysis (SSA) in the time series literature and allows us to establish a rigorous link between time series analysis and matrix estimation. The key to establishing this link is constructing a Page matrix with non-overlapping entries, rather than a Hankel matrix as is commonly done in the literature (e.g., in SSA). This particular matrix structure allows us to provide a finite sample analysis for imputation and prediction and to prove the asymptotic consistency of our method. Another salient feature of our algorithm is that it is model-agnostic with respect to both the underlying time dynamics and the noise distribution in the observations. This noise-agnostic property allows us to recover the latent states when given access only to noisy and partial observations, as in a Hidden Markov Model; e.g., recovering the time-varying parameter of a Poisson process without knowing that the underlying process is Poisson. Furthermore, since our forecasting algorithm requires regression with noisy features, our approach suggests a matrix estimation based method, coupled with a novel, non-standard matrix estimation error metric, to solve the errors-in-variables regression problem, which could be of interest in its own right. Through synthetic and real-world datasets, we demonstrate that our algorithm outperforms standard software packages (including R libraries) in the presence of missing data as well as high levels of noise.
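    A compact numpy sketch of the pipeline under simplifying assumptions (fully observed series, rank chosen by hand; the sinusoid-plus-noise series is only a stand-in): build a Page matrix with non-overlapping columns, denoise it with a truncated SVD, then regress the last row on the preceding rows to forecast.

        import numpy as np

        def page_matrix(series: np.ndarray, rows: int) -> np.ndarray:
            """Stack the series into non-overlapping columns (vs. a Hankel matrix)."""
            cols = len(series) // rows
            return series[: rows * cols].reshape(cols, rows).T

        def low_rank(M: np.ndarray, rank: int) -> np.ndarray:
            """Truncated SVD: keep only the top `rank` singular directions."""
            U, s, Vt = np.linalg.svd(M, full_matrices=False)
            return U[:, :rank] @ np.diag(s[:rank]) @ Vt[:rank]

        rng = np.random.default_rng(2)
        t = np.arange(1000)
        series = np.sin(2 * np.pi * t / 50) + 0.3 * rng.standard_normal(t.size)

        M_hat = low_rank(page_matrix(series, rows=20), rank=4)  # de-noised estimate

        # Regress the last row on the preceding rows; applying the learned
        # weights to the most recent window yields a one-step-ahead forecast.
        beta, *_ = np.linalg.lstsq(M_hat[:-1].T, M_hat[-1], rcond=None)
        forecast = series[-19:] @ beta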