
Title: When you can’t count, sample!: Computable entropies beyond equilibrium from basin volumes
In statistical mechanics, measuring the number of available states and their probabilities, and thus the system's entropy, enables the prediction of the macroscopic properties of a physical system at equilibrium. This predictive capacity hinges on knowledge of the a priori probabilities of observing the states of the system, given by the Boltzmann distribution. Unfortunately, the successes of equilibrium statistical mechanics are hard to replicate out of equilibrium, where the a priori probabilities of observing states are, in general, not known, precluding the naïve application of common tools. In the last decade, exciting developments have occurred that enable direct numerical estimation of the entropy and density of states of athermal and non-equilibrium systems, thanks to significant methodological advances in the computation of the volume of high-dimensional basins of attraction. Here, we provide a detailed account of these methods, underscoring the challenges present in such estimations, recent progress on the matter, and promising directions for future work.
Journal Name: Papers in Physics
Medium: X
Sponsoring Org: National Science Foundation
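The paper's titular idea — when you cannot count states, sample them — can be illustrated with a minimal sketch: draw initial conditions uniformly, follow gradient-descent dynamics to an attractor, and treat the observed basin fractions as probabilities in a Gibbs-style entropy. The one-dimensional double-well potential and all parameters below are illustrative assumptions, not the authors' actual estimator, which targets high-dimensional basins with far more sophisticated volume methods.

```python
import math
import random

def grad(x):
    # dV/dx for the toy double-well V(x) = (x**2 - 1)**2, minima at x = -1, +1
    return 4.0 * x * (x**2 - 1.0)

def descend(x, lr=0.01, steps=1000):
    # Gradient descent plays the role of the dynamics: it maps an initial
    # condition to the attractor whose basin it started in.
    for _ in range(steps):
        x -= lr * grad(x)
    return round(x)  # converged value is within ~1e-7 of -1 or +1

def basin_fractions(n_samples=2000, seed=0):
    # Uniform sampling of initial conditions; the fraction of samples
    # reaching each minimum estimates that basin's volume fraction.
    rng = random.Random(seed)
    counts = {}
    for _ in range(n_samples):
        m = descend(rng.uniform(-2.0, 2.0))
        counts[m] = counts.get(m, 0) + 1
    return {m: c / n_samples for m, c in counts.items()}

fractions = basin_fractions()
# Gibbs-style entropy over basin probabilities; by symmetry it should
# approach ln 2 for this two-basin landscape.
entropy = -sum(f * math.log(f) for f in fractions.values() if f > 0)
```

The design choice here — classifying attractors by where trajectories land rather than enumerating them — is exactly what makes the approach viable in high dimension, where explicit enumeration of basins is impossible.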
More Like this
  1. By utilizing notions from statistical mechanics, we develop a general and self-consistent theoretical framework capable of describing any weakly nonlinear optical multimode system involving conserved quantities. We derive the fundamental relations that govern the grand canonical ensemble through maximization of the Gibbs entropy at equilibrium. In this classical picture of statistical photo-mechanics, we obtain analytical expressions for the probability distribution, the grand partition function, and the relevant thermodynamic potentials. Our results universally apply to any other weakly nonlinear multimode bosonic system.

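The maximum-entropy derivation summarized in item 1 is the standard one; as a sketch in textbook notation (the symbols $E_i$, $N_i$, $\beta$, $\mu$, $\Xi$ are assumed here, not taken from the paper, and $k_B = 1$):

```latex
% Maximize the Gibbs entropy subject to normalization and to fixed
% mean energy U and mean particle number <N>:
S[\{p_i\}] = -\sum_i p_i \ln p_i ,
\qquad
\sum_i p_i = 1, \quad \sum_i p_i E_i = U, \quad \sum_i p_i N_i = \langle N \rangle .
% Stationarity under these constraints (Lagrange multipliers \beta, \beta\mu)
% yields the grand canonical weights and grand partition function:
p_i = \frac{e^{-\beta\,(E_i - \mu N_i)}}{\Xi} ,
\qquad
\Xi = \sum_i e^{-\beta\,(E_i - \mu N_i)} .
```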
  2. Entropy and Information are key concepts not only in Information Theory but also in Physics: historically in the fields of Thermodynamics, Statistical and Analytical Mechanics, and, more recently, in the field of Information Physics. In this paper we argue that Information Physics reconciles and generalizes statistical, geometric, and mechanistic views on information. We start by demonstrating how the use and interpretation of Entropy and Information coincide in Information Theory, Statistical Thermodynamics, and Analytical Mechanics, and how this can be taken advantage of when addressing Earth Science problems in general and hydrological problems in particular. In the second part we discuss how Information Physics provides ways to quantify Information and Entropy from fundamental physical principles. This extends their use to cases where the preconditions to calculate Entropy in the classical manner as an aggregate statistical measure are not met. Indeed, these preconditions are rarely met in the Earth Sciences due either to limited observations or the far‐from‐equilibrium nature of evolving systems. Information Physics therefore offers new opportunities for improving the treatment of Earth Science problems.

  3. Today's thermodynamics is largely based on the combined law for equilibrium systems and statistical mechanics, derived by Gibbs in 1873 and 1901, respectively, while irreversible thermodynamics for nonequilibrium systems rests essentially on the Onsager Theorem, a separate branch of thermodynamics developed in the 1930s. Between them, quantum mechanics was invented and quantitatively solved in terms of density functional theory (DFT) in the 1960s. These three scientific domains operate on different principles and are very much separated from each other. In analogy to the parable of the blind men and the elephant articulated by Perdew, each represents a different portion of a complex system and is thus incomplete by itself, resulting in a lack of quantitative agreement between predictions and experimental observations. Over the last two decades, the author's group has developed a multiscale entropy approach (recently termed the zentropy theory) that integrates DFT-based quantum mechanics and Gibbs statistical mechanics and is capable of accurately predicting the entropy and free energy of complex systems. Furthermore, in combination with the combined law for nonequilibrium systems presented by Hillert, the author developed the theory of cross phenomena beyond the phenomenological Onsager Theorem. The zentropy theory and the theory of cross phenomena jointly provide quantitative predictive theories for systems from electronic to any observable scale, as reviewed in the present work.

  4. Statistical thermodynamics is valuable as a conceptual structure that shapes our thinking about equilibrium thermodynamic states. A cloud of unresolved questions surrounding the foundations of the theory could lead an impartial observer to conclude, though, that statistical thermodynamics is in a state of crisis. Indeed, the discussion about the microscopic origins of irreversibility has continued in the scientific community for more than a hundred years. This paper considers these questions while beginning to develop a statistical thermodynamics for finite non-equilibrium systems. Definitions are proposed for all of the extrinsic variables of the fundamental thermodynamic relation that are consistent with existing results in the equilibrium thermodynamic limit. The probability density function on the phase space is interpreted as a subjective uncertainty about the microstate, and the Gibbs entropy formula is modified to allow for entropy creation without introducing additional physics or modifying the phase-space dynamics. Resolutions are proposed to the mixing paradox, Gibbs' paradox, Loschmidt's paradox, and Maxwell's demon thought experiment. Finally, the extrinsic variables of the fundamental thermodynamic relation are evaluated as functions of time and space for a diffusing ideal gas, and the initial and final values are shown to coincide with the expected equilibrium values.
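For reference, the classical Gibbs entropy functional that item 4 sets out to modify (the modification itself is not reproduced here; $\rho$ is the phase-space density and $\Gamma$ the phase space):

```latex
S[\rho] = -k_B \int_\Gamma \rho \,\ln \rho \;\mathrm{d}\Gamma ,
\qquad
\int_\Gamma \rho \;\mathrm{d}\Gamma = 1 .
```

Under Hamiltonian (Liouville) evolution this $S[\rho]$ is a constant of the motion, which is precisely why the unmodified formula cannot exhibit entropy creation.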
  5. Standard methods for synthesis of control policies in Markov decision processes with unknown transition probabilities largely rely on a combination of exploration and exploitation. While these methods often offer theoretical guarantees on system performance, the number of time steps and samples needed to initially explore the environment before synthesizing a well-performing control policy is impractically large. This paper partially alleviates such a burden by incorporating a priori existing knowledge into learning, when such knowledge is available. Based on prior information about bounds on the differences between the transition probabilities at different states, we propose a learning approach where the transition probabilities at a given state are not only learned from outcomes of repeatedly performing a certain action at that state, but also from outcomes of performing actions at states that are known to have similar transition probabilities. Since the directly obtained information is more reliable at determining transition probabilities than second-hand information, i.e., information obtained from similar but potentially slightly different states, samples obtained indirectly are weighted with respect to the known bounds on the differences of transition probabilities. While the proposed strategy can naturally lead to errors in learned transition probabilities, we show that, by proper choice of the weights, such errors can be reduced, and the number of steps needed to form a near-optimal control policy in the Bayesian sense can be significantly decreased. 
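The weighting idea in item 5 can be sketched in a few lines: blend scarce direct observations at a state with abundant second-hand observations from a similar state, discounting the latter. The counts and the fixed weight `w` below are hypothetical illustrations; the paper derives its weights from known bounds on the differences between transition probabilities, not from the ad hoc rule shown here.

```python
from collections import Counter

def blended_estimate(direct, indirect, w):
    """Estimate a transition distribution from direct counts at a state and
    discounted counts from a similar state (each indirect sample counts as
    w <= 1 direct samples). Inputs are {next_state: count} dicts."""
    total = Counter()
    for s, c in direct.items():
        total[s] += c
    for s, c in indirect.items():
        total[s] += w * c  # second-hand information is down-weighted
    n = sum(total.values())
    return {s: c / n for s, c in total.items()}

# Hypothetical data: 4 direct observations at state s, 100 at a similar state s'.
direct = {"s1": 3, "s2": 1}
indirect = {"s1": 40, "s2": 60}
est = blended_estimate(direct, indirect, w=0.25)
```

With `w = 0.25`, the estimate for `"s1"` lands between the direct-only value (0.75) and the indirect-only value (0.40), reflecting the trade-off the abstract describes: indirect samples reduce variance but may inject bias, which the weight controls.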