

Title: Composable Rate-Independent Computation in Continuous Chemical Reaction Networks
Biological regulatory networks depend upon chemical interactions to process information. Engineering such molecular computing systems is a major challenge for synthetic biology and related fields. The chemical reaction network (CRN) model idealizes chemical interactions, abstracting away specifics of the molecular implementation, and allowing rigorous reasoning about the computational power of chemical kinetics. Here we focus on function computation with CRNs, where we think of the initial concentrations of some species as the input and the eventual steady-state concentration of another species as the output. Specifically, we are concerned with CRNs that are rate-independent (the computation must be correct independent of the reaction rate law) and composable (f ∘ g can be computed by concatenating the CRNs computing f and g). Rate independence and composability are important engineering desiderata, permitting implementations that violate mass-action kinetics, or even “well-mixedness”, and allowing the systematic construction of complex computation via modular design. We show that to construct composable rate-independent CRNs, it is necessary and sufficient to ensure that the output species of a module is not a reactant in any reaction within the module. We then exactly characterize the functions computable by such CRNs as superadditive, positive-continuous, and piecewise rational linear. Our results show that composability severely limits rate-independent computation unless more sophisticated input/output encodings are used.
Award ID(s): 1652824, 1618895
NSF-PAR ID: 10109867
Journal Name: Computational Methods in Systems Biology. CMSB 2018.
Volume: 11095
Page Range / eLocation ID: 256-273
Format(s): Medium: X
Sponsoring Org: National Science Foundation
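A minimal sketch of the composability condition described in the abstract above: each module's output species appears only as a product, never as a reactant, so modules can be concatenated to compute f ∘ g. The run-to-completion executor, the species names, and the choice of min as the computed function are illustrative assumptions, not code from the paper.

```python
# Illustrative sketch (assumptions throughout): a purely stoichiometric,
# rate-independent executor for composable CRN modules.

def run_to_stability(reactions, conc, eps=1e-12):
    """Fire reactions to maximal extent, in arbitrary order, until no
    reaction can proceed; for the irreversible reactions used here the
    final concentrations do not depend on the order (rate independence)."""
    changed = True
    while changed:
        changed = False
        for reactants, products in reactions:
            # the maximal extent is limited by the scarcest reactant
            extent = min(conc[s] / k for s, k in reactants.items())
            if extent > eps:
                for s, k in reactants.items():
                    conc[s] -= k * extent
                for s, k in products.items():
                    conc[s] = conc.get(s, 0.0) + k * extent
                changed = True
    return conc

# Module 1 computes Y = min(X1, X2); Y appears only as a product,
# never as a reactant, so the module is composable.
module1 = [({"X1": 1, "X2": 1}, {"Y": 1})]
# Module 2 computes Z = min(Y, X3), again with the output Z not a reactant.
module2 = [({"Y": 1, "X3": 1}, {"Z": 1})]

conc = {"X1": 3.0, "X2": 5.0, "X3": 2.0, "Y": 0.0, "Z": 0.0}
run_to_stability(module1 + module2, conc)
print(conc["Z"])  # 2.0 == min(min(3.0, 5.0), 2.0)
```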
More Like this
  1. Understanding the algorithmic behaviors that are in principle realizable in a chemical system is necessary for a rigorous understanding of the design principles of biological regulatory networks. Further, advances in synthetic biology herald the time when we will be able to rationally engineer complex chemical systems and when idealized formal models will become blueprints for engineering. Coupled chemical interactions in a well-mixed solution are commonly formalized as chemical reaction networks (CRNs). However, despite the widespread use of CRNs in the natural sciences, the range of computational behaviors exhibited by CRNs is not well understood. Here, we study the following problem: What functions f : ℝ^k → ℝ can be computed by a CRN, in which the CRN eventually produces the correct amount of the “output” molecule, no matter the rate at which reactions proceed? This captures a previously unexplored but very natural class of computations: For example, the reaction X₁ + X₂ → Y can be thought to compute the function y = min(x₁, x₂). Such a CRN is robust in the sense that it is correct whether its evolution is governed by the standard model of mass-action kinetics, alternatives such as Hill-function or Michaelis-Menten kinetics, or other arbitrary models of chemistry that respect the (fundamentally digital) stoichiometric constraints (what are the reactants and products?). We develop a reachability relation based on a broad notion of “what could happen” if reaction rates can vary arbitrarily over time. Using reachability, we define stable computation analogously to probability-1 computation in distributed computing and connect it with a seemingly stronger notion of rate-independent computation based on convergence in the limit t → ∞ under a wide class of generalized rate laws. Besides the direct mapping of a concentration to a nonnegative analog value, we also consider the “dual-rail representation” that can represent negative values as the difference of two concentrations and allows the composition of CRN modules. We prove that a function is rate-independently computable if and only if it is piecewise linear (with rational coefficients) and continuous (dual-rail representation), or nonnegative with discontinuities occurring only when some inputs switch from zero to positive (direct representation). The many contexts where continuous piecewise linear functions are powerful targets for implementation, combined with the systematic construction we develop for computing these functions, demonstrate the potential of rate-independent chemical computation.
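The rate-independence claim above can be illustrated numerically: integrating the single reaction X₁ + X₂ → Y under two different rate laws, both trajectories converge to y = min(x₁, x₂). The rate constants, the saturating (Michaelis-Menten-style) law, and the simple Euler integrator below are assumptions chosen for the sketch, not taken from the paper.

```python
# Illustrative sketch: the reaction X1 + X2 -> Y settles to y = min(x1, x2)
# under very different rate laws (the constants and laws are assumptions).

def simulate(rate, x1=2.0, x2=5.0, dt=1e-3, steps=200_000):
    y = 0.0
    for _ in range(steps):
        v = rate(x1, x2)          # instantaneous reaction flux
        d = min(v * dt, x1, x2)   # never consume more than is present
        x1, x2, y = x1 - d, x2 - d, y + d
    return y

mass_action = lambda x1, x2: 1.0 * x1 * x2
saturating  = lambda x1, x2: 2.0 * x1 * x2 / (0.5 + x1 * x2)  # Michaelis-Menten-like

print(simulate(mass_action))  # ~2.0 == min(2, 5)
print(simulate(saturating))   # ~2.0 as well, despite the different kinetics
```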
  2. Chen, Ho-Lin; Evans, Constantine G. (Eds.)
    Discrete chemical reaction networks formalize the interactions of molecular species in a well-mixed solution as stochastic events. Given their basic mathematical and physical role, the computational power of chemical reaction networks has been widely studied in the molecular programming and distributed computing communities. While for Turing-universal systems there is a universal measure of optimal information encoding based on Kolmogorov complexity, chemical reaction networks are not Turing universal unless error and unbounded molecular counts are permitted. Nonetheless, here we show that the optimal number of reactions to generate a specific count x ∈ ℕ with probability 1 is asymptotically equal to a "space-aware" version of the Kolmogorov complexity of x, defined as K̃s(x) = min_p { |p|/log|p| + log(space(𝒰(p))) : 𝒰(p) = x }, where p is a program for the universal Turing machine 𝒰. This version of Kolmogorov complexity incorporates not just the length of the shortest program for generating x, but also the space usage of that program. Probability-1 computation is captured by the standard notion of stable computation from distributed computing, but we limit our consideration to chemical reaction networks obeying a stronger constraint: they "know when they are done" in the sense that they produce a special species to indicate completion. As part of our results, we develop a module for encoding and unpacking any b bits of information via O(b / log b) reactions, which is information-theoretically optimal for incompressible information. Our work provides one answer to the question of how succinctly chemical self-organization can be encoded, in the sense of generating precise molecular counts of species as the desired state.
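To make the succinct-count-generation idea concrete, the sketch below builds a textbook-style CRN that stably produces exactly x copies of Y from a single seed molecule using O(log x) reactions: one doubling reaction per bit plus a seeding reaction. This is only an illustrative baseline with assumed species names; the paper's optimal construction matches the space-aware complexity K̃s(x), which can be much smaller than log x for compressible x.

```python
# Illustrative O(log x) construction (assumed species names; not the paper's
# asymptotically optimal scheme): a single seed S splits into one B_i per set
# bit of x, and each B_i doubles its way down to B0 -> Y, so exactly
# x = sum of 2^i over set bits copies of Y are produced.

def counting_crn(x):
    bits = [i for i in range(x.bit_length()) if (x >> i) & 1]
    reactions = [(["S"], [f"B{i}" for i in bits])]              # seeding reaction
    for i in range(1, x.bit_length()):
        reactions.append(([f"B{i}"], [f"B{i-1}", f"B{i-1}"]))   # doubling reactions
    reactions.append((["B0"], ["Y"]))
    return reactions

def run(reactions, state):
    """Fire reactions until none is applicable; the order does not matter here."""
    progress = True
    while progress:
        progress = False
        for reactants, products in reactions:
            while all(state.get(s, 0) >= 1 for s in reactants):
                for s in reactants:
                    state[s] -= 1
                for s in products:
                    state[s] = state.get(s, 0) + 1
                progress = True
    return state

x = 100
rxns = counting_crn(x)
final = run(rxns, {"S": 1})
print(len(rxns), final.get("Y", 0))  # 8 reactions, exactly 100 copies of Y
```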
  3. Embedding computation in biochemical environments incompatible with traditional electronics is expected to have a wide-ranging impact in synthetic biology, medicine, nanofabrication, and other fields. Natural biochemical systems are typically modeled by chemical reaction networks (CRNs) which can also be used as a specification language for synthetic chemical computation. In this paper, we identify a syntactically checkable class of CRNs called noncompetitive (NC) whose equilibria are absolutely robust to reaction rates and kinetic rate law, because their behavior is captured solely by their stoichiometric structure. In spite of the inherently parallel nature of chemistry, the robustness property allows for programming as if each reaction applies sequentially. We also present a technique to program NC-CRNs using well-founded deep learning methods, showing a translation procedure from rectified linear unit (ReLU) neural networks to NC-CRNs. In the case of binary weight ReLU networks, our translation procedure is surprisingly tight in the sense that a single bimolecular reaction corresponds to a single ReLU node and vice versa. This compactness argues that neural networks may be a fitting paradigm for programming rate-independent chemical computation. As proof of principle, we demonstrate our scheme with numerical simulations of CRNs translated from neural networks trained on traditional machine learning datasets, as well as tasks better aligned with potential biological applications including virus detection and spatial pattern formation. 
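One way to ground the ReLU-to-CRN idea is the dual-rail identity max(0, x⁺ − x⁻) = x⁺ − min(x⁺, x⁻), which a small rate-independent module can realize. The sketch below uses two reactions per node rather than the single bimolecular reaction per node achieved by the paper's binary-weight translation, so it should be read as an illustrative baseline with assumed species names, not the paper's encoding.

```python
# Illustrative dual-rail ReLU module (assumed naming; not the paper's
# one-reaction-per-node encoding).  A value x is represented as x = Xp - Xm;
# the output y = Yp - Ym stably equals max(0, x).
#
#   Xp -> Yp + C       (copy the positive rail; C is a local fan-out species)
#   C + Xm -> Ym       (Ym converges to min(Xp, Xm))
#
# Neither Yp nor Ym is ever a reactant, so the module is composable.

def relu_module(xp, xm):
    yp = c = ym = 0.0
    yp, c, xp = yp + xp, c + xp, 0.0   # Xp -> Yp + C, run to completion
    d = min(c, xm)                     # C + Xm -> Ym, run to completion
    c, xm, ym = c - d, xm - d, ym + d
    return yp, ym

for x in (4.0, -3.0, 0.0):
    xp, xm = (x, 0.0) if x >= 0 else (0.0, -x)
    yp, ym = relu_module(xp + 1.0, xm + 1.0)   # same value, shifted dual rail
    print(x, yp - ym)                          # prints x and then max(0, x)
```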
  4. Ouldridge, Thomas E.; Wickham, Shelley F.J. (Eds.)
    A barrier to wider adoption of molecular computation is the difficulty of implementing arbitrary chemical reaction networks (CRNs) that are robust and replicate the kinetics of designed behavior. DNA strand displacement (DSD) cascades have been a favored technology for this purpose due to their potential to emulate arbitrary CRNs and known principles to tune their reaction rates. Progress on leakless cascades has demonstrated that DSDs can be arbitrarily robust to spurious "leak" reactions when incorporating systematic domain-level redundancy. However, these improvements in robustness come at the cost of slower kinetics for designed reactions. Existing work has demonstrated the kinetic and thermodynamic effects of sequence mismatch introduction and elimination during displacement. We present a systematic sequence-modification strategy for optimizing the kinetics of leakless cascades without practical cost to their robustness. An in-depth case study explores the effects of this optimization when applied to a typical leakless translator cascade. Thermodynamic analysis of energy barriers and kinetic experimental data support the conclusion that DSD cascades can be both fast and robust.
  5. The use of non-traditional computing devices is growing rapidly. One paradigm of interest is chemical reaction networks (CRNs), which can model chemical interactions and use them for computation. These CRNs are used to develop programs at the nanoscale for applications such as intelligent drug delivery. In practice, these programs are developed in simulation environments and then compiled into physical systems. A challenge when designing CRNs for computation is the lack of techniques to verify and validate correctness. In this work, we adapt software testing and repair techniques for use in this domain. In initial work, we designed a testing framework to handle the challenges presented by CRN programs, including distributed computation and stochastic behavior. We extended this framework to implement automated program repair of CRN models and automated test generation via program invariants. For future work, we will develop a notion of fault localization for these programs, develop a theory of mutation generation, and address issues regarding flakiness present in this computing paradigm.
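As a minimal illustration of the kind of test the abstract describes, the sketch below repeatedly runs a Gillespie-style stochastic simulation of a small CRN and asserts a property of its stable output. The CRN under test, the propensity model, the trial count, and the asserted property are all assumptions for the sketch, not the framework from the paper.

```python
# Illustrative test harness for a stochastic CRN program (assumptions
# throughout; not the paper's testing framework).
import random

def propensity(state, reactants):
    a = 1
    for s in reactants:
        a *= state.get(s, 0)
    return a

def gillespie(reactions, state, max_steps=10_000):
    """Stochastic simulation; returns the state once no reaction can fire."""
    for _ in range(max_steps):
        props = [(r, p, propensity(state, r)) for r, p in reactions]
        total = sum(a for _, _, a in props)
        if total == 0:
            break                        # stable: no reaction is applicable
        pick = random.uniform(0, total)
        acc = 0.0
        for r, p, a in props:
            acc += a
            if a > 0 and pick <= acc:
                for s in r:
                    state[s] -= 1
                for s in p:
                    state[s] = state.get(s, 0) + 1
                break
    return state

# CRN under test: X1 + X2 -> Y, expected to stably compute Y = min(x1, x2).
crn = [(["X1", "X2"], ["Y"])]

def test_min_crn(trials=100):
    for _ in range(trials):
        x1, x2 = random.randint(0, 20), random.randint(0, 20)
        final = gillespie(crn, {"X1": x1, "X2": x2, "Y": 0})
        assert final["Y"] == min(x1, x2), (x1, x2, final)

test_min_crn()
print("all trials reached the expected stable output")
```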