Developing a thermodynamic theory of computation is a challenging task at the interface of nonequilibrium thermodynamics and computer science. In particular, this task requires dealing with difficulties such as stochastic halting times, unidirectional (possibly deterministic) transitions, and restricted initial conditions, features common in real-world computers. Here, we present a framework that tackles all such difficulties by extending the martingale theory of nonequilibrium thermodynamics to generic nonstationary Markovian processes, including those with broken detailed balance and/or absolute irreversibility. We derive several universal fluctuation relations and second-law-like inequalities that provide both lower and upper bounds on the intrinsic dissipation (mismatch cost) associated with any periodic process—in particular, the periodic processes underlying all current digital computation. Crucially, these bounds apply even if the process has stochastic stopping times, as it does in many computational machines. We illustrate our results with exhaustive numerical simulations of deterministic finite automata processing bit strings, one of the fundamental models of computation from theoretical computer science. We also provide universal equalities and inequalities for the acceptance probability of words of a given length by a deterministic finite automaton in terms of thermodynamic quantities, and outline connections between computer science and stochastic resetting. Our results, while motivated by the computational context, are applicable far more broadly.
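For readers unfamiliar with the mismatch cost, the sketch below (a toy illustration with invented numbers, not a calculation from the paper) computes it for a two-state Markov process as the drop in Kullback-Leibler divergence between the actual initial distribution and an assumed optimal prior, both propagated by the same transition matrix.

```python
# Illustrative sketch, not from the paper: the mismatch cost of one period of a
# fixed two-state Markov process, i.e. the extra entropy production incurred
# when the process is run on an initial distribution p0 that differs from the
# prior q0 for which the process would be thermodynamically optimal.
# Mismatch cost = D(p0 || q0) - D(p1 || q1), with both distributions evolved
# by the same transition matrix T.  T, p0 and q0 are invented for the example.

import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) in nats."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Column-stochastic transition matrix of one period of the hypothetical process.
T = np.array([[0.9, 0.3],
              [0.1, 0.7]])

p0 = np.array([0.5, 0.5])   # actual initial distribution
q0 = np.array([0.8, 0.2])   # assumed optimal prior (illustrative choice)

p1 = T @ p0                 # both distributions evolve under the same dynamics
q1 = T @ q0

mismatch_cost = kl(p0, q0) - kl(p1, q1)
print(f"mismatch cost (extra EP) = {mismatch_cost:.4f} nats")
```

Because the KL divergence is contractive under the shared Markov dynamics, this difference is guaranteed to be non-negative, which is what makes it usable as a lower bound on the dissipation of the period.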
Is stochastic thermodynamics the key to understanding the energy costs of computation?
The relationship between the thermodynamic and computational properties of physical systems has been a major theoretical interest since at least the 19th century. It has also become of increasing practical importance over the last half-century as the energetic cost of digital devices has exploded. Importantly, real-world computers obey multiple physical constraints on how they work, which affect their thermodynamic properties. Moreover, many of these constraints apply both to naturally occurring computers, like brains or eukaryotic cells, and to digital systems. Most obviously, all such systems must finish their computation quickly, using as few degrees of freedom as possible. This means that they operate far from thermal equilibrium. Furthermore, many computers, both digital and biological, are modular, hierarchical systems with strong constraints on the connectivity among their subsystems. Yet another example is that, to simplify their design, digital computers are required to be periodic processes governed by a global clock. None of these constraints were considered in 20th-century analyses of the thermodynamics of computation. The new field of stochastic thermodynamics provides formal tools for analyzing systems subject to all of these constraints. We argue here that these tools may help us understand at a far deeper level just how the fundamental thermodynamic properties of physical systems are related to the computation they perform.
- PAR ID: 10584434
- Publisher / Repository: National Academy of Sciences
- Journal Name: Proceedings of the National Academy of Sciences
- Volume: 121
- Issue: 45
- ISSN: 0027-8424
- Sponsoring Org: National Science Foundation
More Like this
Digital computers implement computations using circuits, as do many naturally occurring systems (e.g., gene regulatory networks). The topology of any such circuit restricts which variables may be physically coupled during the operation of the circuit. We investigate how such restrictions on the physical coupling affect the thermodynamic costs of running the circuit. To do this, we first calculate the minimal additional entropy production that arises when we run a given gate in a circuit. We then build on this calculation to analyze how the thermodynamic costs of implementing a computation with a full circuit, comprising multiple connected gates, depend on the topology of that circuit. This analysis provides a rich new set of optimization problems that must be addressed by any circuit designer who wishes to minimize thermodynamic costs.
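As a rough, self-contained illustration of the kind of gate-level accounting involved (the gate choice and input distributions are assumptions for the example, not taken from the paper), the Landauer bound on the entropy drop across a two-input AND gate depends on the joint distribution its inputs arrive with, and in a circuit that distribution is fixed by the upstream topology:

```python
# Hedged illustration, not the paper's derivation: the Landauer bound on the
# heat released by a logically irreversible two-input AND gate is
# k_B * T * [H(inputs) - H(output)].  A gate inside a circuit only sees the
# distribution its parents deliver, so correlations created upstream change
# this bound.  The input distributions below are invented for the example.

import numpy as np

def shannon_entropy(p):
    """Shannon entropy in nats of a probability vector."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

def and_gate_entropy_drop(p_in):
    """Entropy drop H(in) - H(out) for an AND gate with joint input
    distribution p_in over the input patterns (00, 01, 10, 11)."""
    p_out = np.array([p_in[0] + p_in[1] + p_in[2],  # P(output = 0)
                      p_in[3]])                     # P(output = 1)
    return shannon_entropy(p_in) - shannon_entropy(p_out)

independent = np.full(4, 0.25)                    # two independent fair bits
correlated = np.array([0.45, 0.05, 0.05, 0.45])   # strongly correlated inputs

print("entropy drop, independent inputs:", and_gate_entropy_drop(independent))
print("entropy drop, correlated inputs: ", and_gate_entropy_drop(correlated))
```

In this toy example the correlated inputs give a markedly smaller entropy drop, illustrating how the upstream wiring of a circuit changes the minimal heat a downstream gate must dissipate.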
Real-world computers have operational constraints that cause nonzero entropy production (EP). In particular, almost all real-world computers are ‘periodic’, iteratively undergoing the same physical process, and ‘local’, in that subsystems evolve whilst physically decoupled from the rest of the computer. These constraints are so universal because decomposing a complex computation into small, iterative calculations is what makes computers so powerful. We first derive the nonzero EP caused by the locality and periodicity constraints for deterministic finite automata (DFA), a foundational model of computer science theory. We then relate this minimal EP to the computational characteristics of the DFA. We thus divide the languages recognised by DFA into two classes: those that can be recognised with zero EP, and those that necessarily incur nonzero EP. We also demonstrate the thermodynamic advantages of implementing a DFA with a physical process that is agnostic about the inputs it processes.
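For concreteness, a DFA is a finite-state machine that reads its input one symbol per step, much like one period of a clocked physical process. The toy automaton below (a standard textbook example, not one analysed in the paper) recognises the language of bit strings containing an even number of 1s:

```python
# A minimal DFA in the textbook sense: a set of states, an input alphabet,
# a transition function, a start state, and a set of accepting states.
# This particular automaton accepts bit strings with an even number of 1s.

def make_parity_dfa():
    states = {"even", "odd"}
    delta = {
        ("even", "0"): "even", ("even", "1"): "odd",
        ("odd", "0"): "odd",   ("odd", "1"): "even",
    }
    return states, delta, "even", {"even"}

def dfa_accepts(word, dfa):
    _, delta, state, accepting = dfa
    for symbol in word:          # one transition per symbol, one clock period each
        state = delta[(state, symbol)]
    return state in accepting

dfa = make_parity_dfa()
for w in ["", "1", "11", "1011", "1001"]:
    print(f"{w!r:8} accepted: {dfa_accepts(w, dfa)}")
```

Each call to the transition function corresponds to one iteration of the periodic physical process, which is the level at which the locality and periodicity EP bounds discussed above apply.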
Cells, the basic units of all living matter, harness the flow of energy to drive the processes of life. While the biochemical networks involved in energy transduction are well characterized, the energetic costs and constraints for specific cellular processes remain largely unknown. In particular, what are the energy budgets of cells? What constraints and limits do energy flows impose on cellular processes? Do cells operate near these limits, and if so, how do energetic constraints impact cellular functions? Physics has provided many tools to study nonequilibrium systems and to define physical limits, but applying these tools to cell biology remains a challenge. Physical bioenergetics, which resides at the interface of nonequilibrium physics, energy metabolism, and cell biology, seeks to understand how much energy cells are using, how they partition this energy between different cellular processes, and the associated energetic constraints. Here we review recent advances and discuss open questions and challenges in physical bioenergetics.
Modern science depends on computers, but not all scientists have access to the scale of computation they need. A digital divide separates scientists who accelerate their science using large cyberinfrastructure from those who do not, whether because they lack access to compute resources or to the learning opportunities needed to develop the necessary skills. The exclusionary nature of the digital divide threatens equity and the future of innovation by leaving people out of the scientific process while over-amplifying the voices of a small group who have resources. However, there are potential solutions: recent advancements in public research cyberinfrastructure and resources developed during the open science revolution are providing tools that can help bridge this divide. These tools can enable access to fast and powerful computation with modest internet connections and personal computers. Here we contribute another resource for narrowing the digital divide: scalable virtual machines running on public cloud infrastructure. We describe the tools, infrastructure, and methods that enabled the successful deployment of a reproducible and scalable cyberinfrastructure architecture for a collaborative data synthesis working group in February 2023. This platform enabled 45 scientists with varying data and compute skills to leverage 40,000 hours of compute time over a 4-day workshop. Our approach provides an open framework that can be replicated for educational and collaborative data synthesis experiences in any data- and compute-intensive discipline.