This article is about deterministic models: what they are, why they are useful, and what their limitations are. First, the article emphasizes that determinism is a property of models, not of physical systems. Whether a model is deterministic or not depends on how one defines the inputs and behavior of the model. To define behavior, one has to define an observer. The article compares and contrasts two classes of ways to define an observer, one based on the notion of “state” and another that more flexibly defines the observables. The notion of “state” is shown to be problematic and to lead to nondeterminism that is avoided when the observables are defined differently. The article examines determinism in models of the physical world. In what may surprise many readers, it shows that Newtonian physics admits nondeterminism and that quantum physics may be interpreted as a deterministic model. Moreover, it shows that both relativity and quantum physics undermine the notion of “state” and therefore require more flexible ways of defining observables. Finally, the article reviews results showing that sufficiently rich sets of deterministic models are incomplete. Specifically, nondeterminism is inescapable in any system of models rich enough to encompass Newton’s laws.
Quantum Panprotopsychism and the Combination Problem
We argue that a phenomenological analysis of consciousness similar to that of Husserl shows that the effects of phenomenal qualities shape our perception of the world. It also shows the way the physical and mathematical sciences operate, allowing us to accurately describe the observed regularities in terms of communicable mathematical laws. The latter say nothing about the intrinsic features of things. They only refer to the observed regularities in their behaviors, providing rigorous descriptions of how the universe works, to which any viable ontology must conform. Classical mechanistic determinism limits everything that can occur to what happens in an instant and leaves no room for novelty or any intrinsic aspect that is not epiphenomenal. The situation changes with quantum probabilistic determinism if one takes seriously the ontology that arises from its axioms of objects, systems in certain states, and the events they produce in other objects. As Bertrand Russell pointed out almost a century ago, an ontology of events with an internal phenomenal aspect, now known as panprotopsychism, is better suited to explaining the phenomenal aspects of consciousness. The central observation of this paper is that many objections to panpsychism and panprotopsychism, usually grouped under the name of the combination problem, arise from implicit hypotheses about supervenience that are based on classical physics. These are inappropriate at the quantum level, where an exponential number of emergent properties and states arise. The analysis imposes conditions on the possible implementations of quantum cognition mechanisms in the brain.
- Award ID(s):
- 2206557
- PAR ID:
- 10580240
- Publisher / Repository:
- Imprint Academic
- Date Published:
- Journal Name:
- Mind and Matter
- Volume:
- 22
- Issue:
- 1
- ISSN:
- 1611-8812
- Page Range / eLocation ID:
- 51 to 94
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
-
Quantum reference frames are expected to differ from classical reference frames because they have to implement typical quantum features such as fluctuations and correlations. Here, we show that fluctuations and correlations of reference variables, in particular of time, are restricted by their very nature of being used for reference. Mathematically, this property is implemented by imposing constraints on the system to make sure that reference variables are not physical degrees of freedom. These constraints not only relate physical degrees of freedom to reference variables in order to describe their behavior, they also restrict quantum fluctuations of reference variables and their correlations with system degrees of freedom. We introduce the notion of “almost-positive” states as a suitable mathematical method. An explicit application of their properties to examples of recent interest in quantum reference frames reveals previously unrecognized restrictions on possible frame–system interactions. While currently discussed clock models rely on assumptions that, as shown here, make them consistent as quantum reference frames, relaxing these assumptions will expose the models to new restrictions that appear to be rather strong. Almost-positive states also shed some light on a recent debate about the consistency of relational quantum mechanics.
-
Topological quantum memory can protect information against local errors up to finite error thresholds. Such thresholds are usually determined based on the success of decoding algorithms rather than the intrinsic properties of the mixed states describing corrupted memories. Here we provide an intrinsic characterization of the breakdown of topological quantum memory, which both gives a bound on the performance of decoding algorithms and provides examples of topologically distinct mixed states. We employ three information-theoretical quantities that can be regarded as generalizations of the diagnostics of ground-state topological order, and serve as a definition for topological order in error-corrupted mixed states. We consider the topological contribution to entanglement negativity and two other metrics based on quantum relative entropy and coherent information. In the concrete example of the two-dimensional (2D) Toric code with local bit-flip and phase errors, we map the three quantities to observables in 2D classical spin models and analytically show they all undergo a transition at the same error threshold. This threshold is an upper bound on that achieved in any decoding algorithm and is indeed saturated by that of the optimal decoding algorithm for the Toric code. Published by the American Physical Society, 2024.
-
We review the Montevideo Interpretation of quantum mechanics, which is based on the use of real clocks to describe physics, using the framework that was recently introduced by Höhn, Smith, and Lock to treat the problem of time in generally covariant systems. These new methods, which solve several problems in the introduction of a notion of time in such systems, do not change the main results of the Montevideo Interpretation. The use of the new formalism makes the construction more general and valid for any system in a quantum generally covariant theory. We find that, as in the original formulation, a fundamental mechanism of decoherence emerges that allows for supplementing ordinary environmental decoherence and avoiding its criticisms. The recent results on quantum complexity provide additional support to the type of global protocols that are used to prove that within ordinary—unitary—quantum mechanics, no definite event—an outcome to which a probability can be associated—occurs. In lieu of this, states that start in a coherent superposition of possible outcomes always remain as a superposition. We show that, if one takes into account fundamental inescapable uncertainties in measuring length and time intervals due to general relativity and quantum mechanics, the previously mentioned global protocols no longer allow for distinguishing whether the state is in a superposition or not. One is left with a formulation of quantum mechanics purely defined in quantum mechanical terms without any reference to the classical world and with an intrinsic operational definition of quantum events that does not need external observers.
-
Abstract Quantum neuromorphic computing (QNC) is a sub-field of quantum machine learning (QML) that capitalizes on inherent system dynamics. As a result, QNC can run on contemporary, noisy quantum hardware and is poised to realize challenging algorithms in the near term. One key issue in QNC is the characterization of the requisite dynamics for ensuring expressive quantum neuromorphic computation. We address this issue by proposing a building block for QNC architectures, which we call quantum perceptrons (QPs). Our proposed QPs compute based on the analog dynamics of interacting qubits with tunable coupling constants. We show that QPs are, with restricted resources, a quantum equivalent to the classical perceptron, a simple mathematical model for a neuron that is the building block of various machine learning architectures. Moreover, we show that QPs are theoretically capable of producing any unitary operation. Thus, QPs are computationally more expressive than their classical counterparts. As a result, QNC architectures built out of QPs are, theoretically, universal. We introduce a technique for mitigating barren plateaus in QPs called entanglement thinning. We demonstrate QPs' effectiveness by applying them to numerous QML problems, including calculating the inner products between quantum states, entanglement witnessing, and quantum metrology. Finally, we discuss potential implementations of QPs and how they can be used to build more complex QNC architectures.