Title: Determinism
This article is about deterministic models, what they are, why they are useful, and what their limitations are. First, the article emphasizes that determinism is a property of models, not of physical systems. Whether a model is deterministic or not depends on how one defines the inputs and behavior of the model. To define behavior, one has to define an observer. The article compares and contrasts two classes of ways to define an observer, one based on the notion of “state” and another that more flexibly defines the observables. The notion of “state” is shown to be problematic and to lead to nondeterminism that is avoided when the observables are defined differently. The article examines determinism in models of the physical world. In what may surprise many readers, it shows that Newtonian physics admits nondeterminism and that quantum physics may be interpreted as a deterministic model. Moreover, it shows that both relativity and quantum physics undermine the notion of “state” and therefore require more flexible ways of defining observables. Finally, the article reviews results showing that sufficiently rich sets of deterministic models are incomplete. Specifically, nondeterminism is inescapable in any system of models rich enough to encompass Newton’s laws.
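The abstract's central point, that determinism depends on how inputs and observables are defined, can be illustrated with a small sketch (my own illustration, not from the article): a unit-delay system is a deterministic function from input sequences to output sequences, yet an observer who sees only the outputs cannot predict the next observation, so the same system appears nondeterministic with respect to that observer.

```python
def delay(inputs, init=0):
    """A unit delay: output k is input k-1 (init initially).
    Viewed as a map from input sequence to output sequence, it is a
    function -- one output sequence per input sequence -- hence
    deterministic."""
    out, stored = [], init
    for x in inputs:
        out.append(stored)
        stored = x
    return out

# Deterministic view: same inputs always yield the same outputs.
assert delay([1, 2, 3]) == [0, 1, 2]

# Output-only observer: inputs are hidden. Two runs can share the
# same observed prefix yet diverge, so the observations alone do not
# determine the next observation.
run_a = delay([5, 7, 9])   # observed: [0, 5, 7]
run_b = delay([5, 8, 9])   # observed: [0, 5, 8]
assert run_a[:2] == run_b[:2]   # identical observed prefix [0, 5]
assert run_a[2] != run_b[2]     # different continuations
```

The point of the sketch is that nothing about the system changed between the two views; only the definition of what counts as input and observable changed, and with it the verdict on determinism.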
Award ID(s):
1836601
PAR ID:
10602472
Author(s) / Creator(s):
 
Publisher / Repository:
Association for Computing Machinery (ACM)
Date Published:
Journal Name:
ACM Transactions on Embedded Computing Systems
Volume:
20
Issue:
5
ISSN:
1539-9087
Format(s):
Medium: X
Size(s):
p. 1-34
Sponsoring Org:
National Science Foundation
More Like this
  1. Actors have become widespread in programming languages and programming frameworks focused on parallel and distributed computing. While actors provide a more disciplined model for concurrency than threads, their interactions, if not constrained, admit nondeterminism. As a consequence, actor programs may exhibit unintended behaviors and are less amenable to rigorous testing. We show that nondeterminism can be handled in a number of ways, surveying dataflow dialects, process networks, synchronous-reactive models, and discrete-event models. These existing approaches, however, tend to require centralized control, pose challenges to modular system design, or introduce a single point of failure. We describe “reactors,” a new coordination model that combines ideas from several of the aforementioned approaches to enable determinism while preserving much of the style of actors. Reactors promote modularity and allow for distributed execution. By using a logical model of time that can be associated with physical time, reactors also admit control over timing. 
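The coordination idea described above can be sketched in a few lines (a minimal simplification of my own, not the reactor runtime): a discrete-event scheduler dispatches messages in logical-time order, so the outcome does not depend on the physical order in which messages were posted.

```python
import heapq

class Scheduler:
    """Dispatches posted actions in (logical time, sequence) order.
    The sequence number only breaks ties between equal timestamps; a
    real runtime would use a deterministic rule (e.g., a topological
    order of reactions) rather than posting order."""

    def __init__(self):
        self.queue = []
        self.seq = 0

    def post(self, logical_time, action):
        heapq.heappush(self.queue, (logical_time, self.seq, action))
        self.seq += 1

    def run(self):
        log = []
        while self.queue:
            t, _, action = heapq.heappop(self.queue)
            log.append(action(t))
        return log

# Messages posted "out of order" physically...
s = Scheduler()
s.post(2, lambda t: f"b@{t}")
s.post(1, lambda t: f"a@{t}")
# ...are still handled in logical-time order: ['a@1', 'b@2']
assert s.run() == ["a@1", "b@2"]
```

Because every correct execution processes events in the same logical order, the observable behavior is a function of the inputs, which is the determinism property the abstract contrasts with unconstrained actor interactions.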
  2. Abstract It is believed that Euclidean Yang–Mills theories behave like the massless Gaussian free field (GFF) at short distances. This makes it impossible to define the main observables for these theories—the Wilson loop observables—in dimensions greater than two, because line integrals of the GFF do not exist in such dimensions. Taking forward a proposal of Charalambous and Gross, this article shows that it is possible to define Euclidean Yang–Mills theories on the 3D unit torus as ‘random distributional gauge orbits’, provided that they indeed behave like the GFF in a certain sense. One of the main technical tools is the existence of the Yang–Mills heat flow on the 3D torus starting from GFF-like initial data, which is established in a companion paper. A key consequence of this construction is that under the GFF assumption, one can define a notion of ‘regularized Wilson loop observables’ for Euclidean Yang–Mills theories on the 3D unit torus. 
  3. We argue that a phenomenological analysis of consciousness similar to that of Husserl shows that the effects of phenomenal qualities shape our perception of the world. It also shows the way the physical and mathematical sciences operate, allowing us to accurately describe the observed regularities in terms of communicable mathematical laws. The latter say nothing about the intrinsic features of things. They only refer to the observed regularities in their behaviors, providing rigorous descriptions of how the universe works, to which any viable ontology must conform. Classical mechanistic determinism limits everything that can occur to what happens in an instant and leaves no room for novelty or any intrinsic aspect that is not epiphenomenal. The situation changes with quantum probabilistic determinism if one takes seriously the ontology that arises from its axioms of objects, systems in certain states, and the events they produce in other objects. As Bertrand Russell pointed out almost a century ago, an ontology of events with an internal phenomenal aspect, now known as panprotopsychism, is better suited to explaining the phenomenal aspects of consciousness. The central observation of this paper is that many objections to panpsychism and panprotopsychism, which are usually called the combination problem, arise from implicit hypotheses based on classical physics about supervenience. These are inappropriate at the quantum level, where an exponential number of emergent properties and states arise. The analysis imposes conditions on the possible implementations of quantum cognition mechanisms in the brain. 
  4. Many programming languages and programming frameworks focus on parallel and distributed computing. Several frameworks are based on actors, which provide a more disciplined model for concurrency than threads. The interactions between actors, however, if not constrained, admit nondeterminism. As a consequence, actor programs may exhibit unintended behaviors and are less amenable to rigorous testing. We show that nondeterminism can be handled in a number of ways, surveying dataflow dialects, process networks, synchronous-reactive models, and discrete-event models. These existing approaches, however, tend to require centralized control, pose challenges to modular system design, or introduce a single point of failure. We describe “reactors,” a new coordination model that combines ideas from several of these approaches to enable determinism while preserving much of the style of actors. Reactors promote modularity and allow for distributed execution. By using a logical model of time that can be associated with physical time, reactors also provide control over timing. Reactors also expose parallelism that can be exploited on multicore machines and in distributed configurations without compromising determinacy. 
  5. In recent years, applications of quantum simulation have been developed to study the properties of strongly interacting theories. This has been driven by two factors: on the one hand, needs from theorists to have access to physical observables that are prohibitively difficult to study using classical computing; on the other hand, quantum hardware becoming increasingly reliable and scalable to larger systems. In this work, we discuss the feasibility of using quantum optical simulation for studying scattering observables that are presently inaccessible via lattice QCD and are at the core of the experimental program at Jefferson Laboratory, the future Electron-Ion Collider, and other accelerator facilities. We show that recent progress in measurement-based photonic quantum computing can be leveraged to provide deterministic generation of required exotic gates and implementation in a single photonic quantum processor. Published by the American Physical Society, 2024.