
Title: Synthesizing contracts correct modulo a test generator
We present an approach to learning contracts for object-oriented programs in which the correctness guarantees for the contracts are made with respect to a test generator. Our contract synthesis approach is based on a novel notion of tight contracts and an online learning algorithm that works in tandem with a test generator to synthesize tight contracts. We implement our approach in a tool called Precis and evaluate it on a suite of programs written in C#, studying the safety and strength of the synthesized contracts and comparing them to those synthesized by Daikon.
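The loop described above, in which a learner and a test generator work in tandem, can be sketched concretely. The following is a minimal illustration of that style of algorithm, not Precis itself: the predicate pool, the conjunctive learner, and the test-generator interface are all hypothetical.

```python
# A minimal sketch of contract learning "modulo a test generator".
# The predicate pool, the conjunctive learner, and the generator
# interface are illustrative assumptions, not Precis's actual algorithm.
import random

def learn_precondition(method, predicates, gen_tests, rounds=200):
    """Return the strongest conjunction of candidate predicates that the
    test generator cannot refute: every generated input on which `method`
    succeeded must satisfy each kept predicate."""
    passing = []
    for _ in range(rounds):
        args = gen_tests()          # ask the test generator for an input
        try:
            method(*args)           # observe the method's behavior
            passing.append(args)
        except Exception:
            pass                    # failing runs do not weaken the contract
    return [name for name, p in predicates.items()
            if all(p(*args) for args in passing)]

# Hypothetical usage: a method that implicitly requires a non-empty list.
def head(xs):
    return xs[0]

predicates = {
    "len(xs) > 0": lambda xs: len(xs) > 0,
    "len(xs) > 1": lambda xs: len(xs) > 1,   # too strong; tests refute it
}
gen_tests = lambda: ([0] * random.randint(0, 3),)
print(learn_precondition(head, predicates, gen_tests))
# ['len(xs) > 0'] with overwhelming probability
```

The sense in which the result is "correct modulo the test generator" is visible here: a predicate survives only if no generated test refutes it, so the guarantee is exactly as strong as the generator's coverage.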
Authors:
Award ID(s):
1816615
Publication Date:
NSF-PAR ID:
10357348
Journal Name:
Proceedings of the ACM on Programming Languages
Volume:
5
Issue:
OOPSLA
Page Range or eLocation-ID:
1 to 27
ISSN:
2475-1421
Sponsoring Org:
National Science Foundation
More Like this
  1. Braverman, Mark (Ed.)
    We introduce a hitting set generator for Polynomial Identity Testing based on evaluations of low-degree univariate rational functions at abscissas associated with the variables. In spite of the univariate nature, we establish an equivalence up to rescaling with a generator introduced by Shpilka and Volkovich, which has a similar structure but uses multivariate polynomials in the abscissas. We study the power of the generator by characterizing its vanishing ideal, i.e., the set of polynomials that it fails to hit. Capitalizing on the univariate nature, we develop a small collection of polynomials that jointly generate the vanishing ideal. As corollaries, we obtain tight bounds on the minimum degree, sparseness, and partition size of set-multilinearity in the vanishing ideal. Inspired by an alternating algebra representation, we develop a structured deterministic membership test for the vanishing ideal. As a proof of concept, we rederive known derandomization results based on the generator by Shpilka and Volkovich, and present a new application for read-once oblivious arithmetic branching programs that provably transcends the usual combinatorial techniques. (The generator's standard form is sketched after this list.)
  2. When cells take up material from their surroundings, they must first transport this cargo across their outer membrane, a flexible sheet of tightly organized fat molecules that acts as a barrier to the environment. Cells can achieve this by letting their membrane surround the object, pulling it inwards until it is contained in a pouch that bulges into the cell. This bag is then corded up so it splits off from the outer membrane. The ‘cord’ is a protein called dynamin, which is thought to form a tight spiral around the bag’s neck, closing it over and pinching it away. The structure of dynamin is fairly well known, and yet several theories compete to explain how it may snap the bag off the outer membrane. Here, Pannuzzo et al. have created a computer simulation that faithfully replicates the geometry and the elasticity of the membrane and of dynamin, and used it to test different ways the protein could work. The first test featured simple constriction, where the dynamin spiral contracts around the membrane to pinch it; this only separated the bag from the membrane after implausibly tight constriction. The second test added elongation, with the spiral lengthening as well as reducing its diameter, but this further reduced the ability of the protein to snap off the membrane. The final test combined constriction and rotation, whereby dynamin ‘twirls’ as it presses on the neck of the bag: this succeeded in efficiently severing the membrane once the dynamin spiral disassembled. Indeed, the simulations suggested that dynamin might start to dismantle while it constricts, without compromising its role. In fact, getting rid of excess length as the protein contracts helps to dissolve any remnants of a membrane connection. Defects in dynamin are associated with conditions such as centronuclear myopathy and Charcot‐Marie‐Tooth peripheral neuropathy. Recent research also indicates that the protein is involved in a much wider range of neurological disorders that include Alzheimer's, Parkinson's, Huntington's, and amyotrophic lateral sclerosis. The models created by Pannuzzo et al. are useful tools to understand how dynamin and similar proteins work and sometimes fail.
  3. Our ability to synthesize sensory data that preserves specific statistical properties of the real data has had tremendous implications for data privacy and big data analytics. The synthetic data can be used as a substitute for selective real data segments that are sensitive to the user, thus protecting privacy and resulting in improved analytics. However, increasingly adversarial roles taken by data recipients, such as mobile apps or other cloud-based analytics services, mandate that the synthetic data, in addition to preserving statistical properties, should also be difficult to distinguish from the real data. Typically, visual inspection has been used as a test to distinguish between datasets. More recently, sophisticated classifier models (discriminators), corresponding to a set of events, have also been employed to distinguish between synthesized and real data. The model operates on both datasets and the respective event outputs are compared for consistency. In this paper, we take a step towards generating sensory data that can pass a deep-learning-based discriminator model test, and make two specific contributions. First, we present a deep-learning-based architecture for synthesizing sensory data; this architecture comprises a generator model, which is a stack of multiple Long Short-Term Memory (LSTM) networks and a Mixture Density Network. Second, we use another LSTM-based discriminator model for distinguishing between the true and the synthesized data. Using a dataset of accelerometer traces, collected using smartphones of users doing their daily activities, we show that the deep-learning-based discriminator model can only distinguish between the real and synthesized traces with an accuracy in the neighborhood of 50%. (An illustrative sketch of this generator/discriminator pair follows the list.)
  4. Oxygen concentration varies tremendously within the body and has proven to be a critical variable in cell differentiation, proliferation, and drug metabolism, among many other physiological processes. Currently, researchers study the gas’s role in biology using low-throughput gas-control incubators or hypoxia chambers in which all cells in a vessel are exposed to a single oxygen concentration. Here, we introduce a device that can simultaneously deliver 12 unique oxygen concentrations to cells in a 96-well plate and seamlessly integrate into biomedical research workflows. The device inserts into 96-well plates and delivers gas to the headspace, thus avoiding undesirable contact with media. This simple approach isolates each well using gas-tight, pressure-resistant gaskets, effectively creating 96 “mini-incubators”. Each of the 12 columns of the plate is supplied with a distinct oxygen concentration from a gas-mixing gradient generator fed by two feed gases. The wells within each column are then supplied by an equal flow-splitting distribution network. Using equal feed flow rates, concentrations ranging from 0.6 to 20.5% were generated within a single plate. A549 lung carcinoma cells were then used to show that O2 levels below 9% caused a stepwise increase in cell death for cells treated with the hypoxia-activated anticancer drug tirapazamine (TPZ). Additionally, the 96-well plate was further leveraged to simultaneously test multiple TPZ concentrations over an oxygen gradient and generate a three-dimensional (3D) dose-response landscape. The results presented here show how microfluidic technologies can be integrated into, rather than replace, ubiquitous biomedical labware, allowing for higher-throughput oxygen studies. (A worked example of the implied concentration steps follows the list.)
  5. Public blockchains have spurred the growing popularity of decentralized transactions and smart contracts, especially on the financial market. However, public blockchains exhibit limitations in transaction throughput, storage availability, and compute capacity. To avoid transaction gridlock, public blockchains impose large fees and per-block resource limits, making it difficult to accommodate the ever-growing transaction demand. Previous research has endeavored to improve the scalability and performance of blockchains through various technologies, such as side-chaining, sharding, secured off-chain computation, communication-network optimizations, and efficient consensus protocols. However, these approaches have not attained widespread adoption because they cannot deliver cloud-like performance in terms of scalability of transaction throughput, storage, and compute capacity. In this work, we determine that the major obstacle to public blockchain scalability is the underlying unstructured P2P network. We further show that a centralized network can support the deployment of decentralized smart contracts. We propose a novel approach to achieving scalable decentralization: instead of trying to make the blockchain scalable, we deliver decentralization to the already scalable cloud by using an Ethereum smart contract. We introduce Blockumulus, a framework that can deploy decentralized cloud smart contract environments using a novel technique called overlay consensus. Through experiments, we demonstrate that Blockumulus is scalable in all three dimensions: computation, data storage, and transaction throughput. Besides eliminating the current code-execution and storage restrictions, Blockumulus delivers a transaction latency between 2 and 5 seconds under normal load. Moreover, a stress test of our prototype reveals the ability to execute 20,000 simultaneous transactions in under 26 seconds, which is on par with the average throughput of worldwide credit card transactions. (The implied transaction rate is worked out after the list.)
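For item 1, the abstract does not reproduce the Shpilka-Volkovich generator it builds on; the following is the form it is commonly given in the PIT literature, written here as a reference sketch rather than a quotation from the paper.

```latex
% The Shpilka--Volkovich generator with a 2k-variable seed
% (y_1,...,y_k, z_1,...,z_k): each variable x_i is replaced by a
% combination of Lagrange basis polynomials through the distinct
% abscissas a_1,...,a_n associated with the variables.
\[
  (G_k)_i(y_1,\dots,y_k,z_1,\dots,z_k) \;=\; \sum_{j=1}^{k} L_i(y_j)\, z_j,
  \qquad
  L_i(y) \;=\; \prod_{\ell \ne i} \frac{y - a_\ell}{a_i - a_\ell}.
\]
```

The univariate variant studied in the paper replaces these multivariate substitutions with evaluations of low-degree univariate rational functions at the same abscissas; the equivalence up to rescaling is the paper's result and is not shown here.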
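For item 3, the described architecture, a stacked-LSTM generator with a Mixture Density Network (MDN) head paired with an LSTM discriminator, can be sketched in a few lines of PyTorch. All layer sizes, the mixture count, and the class names below are illustrative assumptions, not values from the paper.

```python
# A minimal sketch of the item-3 architecture: stacked-LSTM generator
# with an MDN head, plus an LSTM discriminator. Sizes are assumptions.
import torch
import torch.nn as nn

class MDNGenerator(nn.Module):
    def __init__(self, in_dim=3, hidden=64, layers=2, mixtures=5):
        super().__init__()
        self.lstm = nn.LSTM(in_dim, hidden, num_layers=layers, batch_first=True)
        # One set of (weight, mean, log-std) parameters per mixture component.
        self.pi = nn.Linear(hidden, mixtures)
        self.mu = nn.Linear(hidden, mixtures * in_dim)
        self.log_sigma = nn.Linear(hidden, mixtures * in_dim)

    def forward(self, x):
        h, _ = self.lstm(x)                      # (batch, time, hidden)
        pi = torch.softmax(self.pi(h), dim=-1)   # mixture weights
        mu = self.mu(h)                          # component means
        sigma = torch.exp(self.log_sigma(h))     # component std-devs (>0)
        return pi, mu, sigma

class LSTMDiscriminator(nn.Module):
    def __init__(self, in_dim=3, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(in_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):
        h, _ = self.lstm(x)
        return torch.sigmoid(self.head(h[:, -1]))  # real-vs-synthetic score
```

A discriminator accuracy near 50% on held-out traces, as the paper reports, means these real-vs-synthetic scores are no better than coin flips.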
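For item 4, the reported 0.6-20.5% range implies per-column concentration steps if one assumes the gradient generator mixes its two feed gases in ideal, equal linear steps across the 12 columns; that linearity is our assumption, not a claim of the paper.

```python
# Ideal linear steps between the two reported endpoint concentrations.
lo, hi, columns = 0.6, 20.5, 12   # percent O2, number of plate columns
step = (hi - lo) / (columns - 1)
print([round(lo + i * step, 2) for i in range(columns)])
# [0.6, 2.41, 4.22, 6.03, 7.84, 9.65, 11.45, 13.26, 15.07, 16.88, 18.69, 20.5]
```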
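For item 5, the headline stress-test figure implies a sustained rate that is easy to check; the comparison to credit-card networks is the paper's claim and is not verified here.

```python
# Implied throughput of the reported Blockumulus stress test.
transactions, seconds = 20_000, 26
print(f"{transactions / seconds:.0f} tx/s")  # 769 tx/s
```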