Abstract
Some microscopic dynamics are also macroscopically irreversible, dissipating energy and producing entropy. For many-particle systems interacting with deterministic thermostats, the rate of thermodynamic entropy flow to the environment is the average rate at which phase space contracts. Here, we use this identity and the properties of a classical density matrix to derive upper and lower bounds on the entropy flow rate from the spectral properties of the local stability matrix. These bounds extend more fundamental bounds on the Lyapunov exponents and phase space contraction rate of continuous-time dynamical systems. They set maximal and minimal rates of entropy production and heat transfer, and they bound transport coefficients, all determined by the underlying dynamics of the system and the deterministic thermostat. Because these limits on the macroscopic dissipation derive from the density matrix and the local stability matrix, they are numerically computable from the molecular dynamics. As an illustration, we place these bounds on the electrical conductivity of a system of charged particles subject to an electric field.
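As a hedged numerical sketch of the quantities in this abstract (not the authors' code or model), the snippet below integrates a Nosé-Hoover thermostatted oscillator, accumulates the instantaneous phase-space contraction rate -tr A(x) along the trajectory, and compares its time average with the eigenvalue bounds obtained from the symmetric part S of the local stability matrix A = dF/dx.

```python
# Hedged sketch, not the authors' code or model: estimate the average
# phase-space contraction rate (the entropy flow rate in units of k_B) for a
# Nose-Hoover thermostatted oscillator, and compare it with the eigenvalue
# bounds from the symmetric part S of the local stability matrix A = dF/dx.
import numpy as np

def F(x):
    """Thermostatted oscillator, x = (q, p, zeta); parameters set to unity."""
    q, p, z = x
    return np.array([p, -q - z * p, p**2 - 1.0])

def stability_matrix(x):
    """Local stability matrix A = dF/dx."""
    q, p, z = x
    return np.array([[0.0, 1.0, 0.0],
                     [-1.0, -z, -p],
                     [0.0, 2.0 * p, 0.0]])

def rk4_step(x, dt):
    k1 = F(x); k2 = F(x + 0.5 * dt * k1)
    k3 = F(x + 0.5 * dt * k2); k4 = F(x + dt * k3)
    return x + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

dt, n_steps = 1e-2, 200_000
x = np.array([0.0, 1.5, 0.0])
contraction, lower, upper = [], [], []
for _ in range(n_steps):
    A = stability_matrix(x)
    S = 0.5 * (A + A.T)
    lam = np.linalg.eigvalsh(S)          # ascending eigenvalues of S
    contraction.append(-np.trace(A))     # instantaneous contraction rate, -div F
    lower.append(-len(x) * lam[-1])      # -n * lambda_max(S) <= -tr A
    upper.append(-len(x) * lam[0])       # -tr A <= -n * lambda_min(S)
    x = rk4_step(x, dt)

# Time-averaged contraction rate = entropy flow rate to the thermostat (k_B = 1).
# It is near zero for this equilibrium thermostat; a field-driven system gives
# a positive value, still pinched between the two eigenvalue bounds.
print("entropy flow rate :", np.mean(contraction))
print("bounds            :", np.mean(lower), np.mean(upper))
```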
Classical Fisher information for differentiable dynamical systems
Fisher information sets a lower bound on the uncertainty in the statistical estimation of classical and quantum mechanical parameters. While deterministic dynamical systems are not subject to random fluctuations, they still have a form of uncertainty: infinitesimal perturbations to the initial conditions can grow exponentially in time, a signature of deterministic chaos. As a measure of this uncertainty, we introduce another classical information measure, specifically for the deterministic dynamics of isolated, closed, or open classical systems not subject to noise. This classical measure of information is defined with Lyapunov vectors in tangent space, making it less akin to the classical Fisher information and more akin to the quantum Fisher information defined with state vectors in Hilbert space. Our analysis of the local state space structure and linear stability leads to upper and lower bounds on this information, giving it an interpretation as the net stretching action of the flow. Numerical calculations of this information for illustrative mechanical examples show that it depends directly on the phase space curvature and speed of the flow.
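The information measure itself is defined in the paper; as a hedged sketch of the kind of tangent-space (Lyapunov-vector) data it is built from, the code below co-evolves a unit tangent vector with a damped, driven pendulum and records its instantaneous stretching rate, whose time average is the leading Lyapunov exponent. All parameter values are illustrative, not taken from the paper.

```python
# Hedged sketch, not the paper's exact definition: co-evolve a unit tangent
# (Lyapunov) vector with the flow of a damped, driven pendulum and record the
# instantaneous stretching rate u.S.u, with S the symmetric part of the local
# stability matrix A = dF/dx.
import numpy as np

GAMMA, DRIVE, OMEGA = 0.5, 1.2, 2.0 / 3.0   # illustrative parameters

def F(x, t):
    """Damped, driven pendulum, x = (theta, p)."""
    th, p = x
    return np.array([p, -np.sin(th) - GAMMA * p + DRIVE * np.cos(OMEGA * t)])

def stability_matrix(x):
    th, p = x
    return np.array([[0.0, 1.0],
                     [-np.cos(th), -GAMMA]])

dt, n_steps = 1e-3, 200_000
x, u, t = np.array([0.2, 0.0]), np.array([1.0, 0.0]), 0.0
log_growth, rates = 0.0, []
for _ in range(n_steps):
    A = stability_matrix(x)
    S = 0.5 * (A + A.T)
    rates.append(u @ S @ u)              # instantaneous stretching rate of u
    x = x + dt * F(x, t)                 # Euler step for the state ...
    u = u + dt * (A @ u)                 # ... and for the tangent vector
    norm = np.linalg.norm(u)
    log_growth += np.log(norm)
    u /= norm                            # keep the tangent vector unit length
    t += dt

print("leading Lyapunov exponent (finite time):", log_growth / (n_steps * dt))
print("time-averaged stretching rate          :", np.mean(rates))
```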
- Award ID(s): 2124510
- PAR ID: 10531430
- Publisher / Repository: AIP Publishing
- Date Published:
- Journal Name: Chaos: An Interdisciplinary Journal of Nonlinear Science
- Volume: 33
- Issue: 10
- ISSN: 1054-1500
- Page Range / eLocation ID: 103139
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- Shahriar, Selim M.; Scheuer, Jacob (Eds.): Quantum optomechanics has led to advances in quantum sensing, optical manipulation of mechanical systems, and macroscopic quantum physics. However, previous studies have typically focused on dispersive optomechanical coupling, which modifies the phase of the light field. Here, we discuss recent advances in "imaging-based" quantum optomechanics, where information about the mechanical resonator's motion is imprinted onto the spatial mode of the optical field, akin to how information is encoded in an image. Additionally, we find that radiation pressure backaction, a phenomenon not usually discussed in imaging studies, comes from spatially uncorrelated fluctuations of the optical field. First, we examine a simple thought experiment in which the displacement of a membrane resonator can be measured by extracting the amplitude of specific spatial modes. Torsion modes are naturally measured with this coupling and are interesting for applications such as precision torque sensing, tests of gravity, and measurements of angular displacement at and beyond the standard quantum limit. As an experimental demonstration, we measure the angular displacement of the torsion mode of a Si3N4 nanoribbon near the quantum imprecision limit using both an optical lever and a spatial mode demultiplexer. Finally, we discuss the potential for future imaging-based quantum optomechanics experiments, including observing ponderomotive squeezing of different spatial modes and quantum backaction evasion in angular displacement measurements.
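  A hedged numerical sketch of the readout idea in this abstract (beam parameters are placeholders, not those of the experiment): a small tilt imprinted on a Gaussian beam couples into the first-order Hermite-Gauss mode, so extracting that mode's amplitude, as a spatial mode demultiplexer does, measures the angular displacement.

  ```python
  # Hedged illustration: a tilt theta on a Gaussian beam couples into the
  # first-order Hermite-Gauss mode with amplitude ~ k*w0*theta/2, so the
  # demultiplexed HG1 amplitude is a linear readout of angular displacement.
  # Waist and wavelength are assumed values, not the experimental ones.
  import numpy as np

  w0 = 1.0e-3                    # beam waist [m] (assumed)
  k = 2 * np.pi / 1.55e-6        # optical wavenumber [1/m] (assumed 1550 nm)
  x = np.linspace(-5 * w0, 5 * w0, 4001)
  dx = x[1] - x[0]

  # Normalized 1D Hermite-Gauss modes HG0 and HG1.
  hg0 = (2 / (np.pi * w0**2))**0.25 * np.exp(-x**2 / w0**2)
  hg1 = (2 / (np.pi * w0**2))**0.25 * (2 * x / w0) * np.exp(-x**2 / w0**2)

  for theta in [0.0, 1e-7, 2e-7, 5e-7]:          # tilt angles [rad]
      field = hg0 * np.exp(1j * k * theta * x)   # tilted fundamental mode
      c1 = np.sum(np.conj(hg1) * field) * dx     # overlap with HG1 (demultiplexed amplitude)
      print(f"theta = {theta:.1e} rad  ->  |HG1 amplitude| = {abs(c1):.3e}")
  # For small theta, |c1| is approximately k*w0*theta/2, linear in the angle.
  ```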
- Quantum computing utilizes superposition and entanglement to surpass classical computer capabilities. Central to this are qubits and their use to realize parallel quantum algorithms through circuits of simple one- or two-qubit gates. Controlling and measuring quantum systems is challenging. Here, we introduce a paradigm utilizing logical phi-bits, classical analogues of qubits based on nonlinear acoustic waves supported by an externally driven acoustic metastructure. These phi-bits bridge a low-dimensional, linearly scaling physical space to a high-dimensional, exponentially scaling Hilbert space in which parallel processing of information can be realized in the form of unitary operations. Here, we show the implementation of a nontrivial three-phi-bit unitary operation analogous to a quantum circuit but achieved via a single action on the metastructure, whereas the qubit-based equivalent requires sequences of qubit gates. A phi-bit-based approach might offer advantages over quantum systems, especially in tasks requiring large complex unitary operations. This breakthrough hints at a fascinating intersection of the classical and quantum worlds, potentially redefining computational paradigms by harnessing nonlinear classical mechanical systems in quantum-analogous manners, blending the best of both domains.
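  A hedged linear-algebra illustration of the contrast drawn in this abstract (not the phi-bit dynamics themselves): a three-bit register lives in an 8-dimensional space, and the same overall transformation can be applied either as a sequence of one- and two-qubit gates or as a single 8x8 unitary acting once.

  ```python
  # Hedged illustration: the same 3-bit operation as a gate sequence versus a
  # single precomposed 8x8 unitary applied in one action.
  import numpy as np

  I2 = np.eye(2)
  H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)            # Hadamard
  CNOT = np.array([[1, 0, 0, 0], [0, 1, 0, 0],
                   [0, 0, 0, 1], [0, 0, 1, 0]])           # controlled-NOT

  def kron(*ops):
      out = np.array([[1.0]])
      for op in ops:
          out = np.kron(out, op)
      return out

  # Circuit picture: three gates applied one after another to |000>.
  psi = np.zeros(8); psi[0] = 1.0
  for gate in [kron(H, I2, I2), kron(CNOT, I2), kron(I2, CNOT)]:
      psi = gate @ psi

  # "Single action" picture: one precomposed 8x8 unitary does the same thing.
  U = kron(I2, CNOT) @ kron(CNOT, I2) @ kron(H, I2, I2)
  phi = U @ np.concatenate(([1.0], np.zeros(7)))

  print(np.allclose(psi, phi))   # True: one global operation reproduces the circuit
  print(np.round(psi, 3))        # GHZ-like state (|000> + |111>)/sqrt(2)
  ```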
- Cumulative memory, the sum of space used per step over the duration of a computation, is a fine-grained measure of time-space complexity that was introduced to analyze cryptographic applications like password hashing. It is a more accurate cost measure for algorithms that have infrequent spikes in memory usage and are run in environments such as cloud computing that allow dynamic allocation and de-allocation of resources during execution, or when multiple instances of an algorithm are interleaved in parallel. We prove the first lower bounds on cumulative memory complexity for both sequential classical computation and quantum circuits. Moreover, we develop general paradigms for bounding cumulative memory complexity inspired by the standard paradigms for proving time-space tradeoff lower bounds, which can only lower bound the maximum space used during an execution. The resulting lower bounds on cumulative memory are just as strong as the best time-space tradeoff lower bounds, which are very often known to be tight. Although previous results for pebbling and random oracle models have yielded time-space tradeoff lower bounds larger than the cumulative memory complexity, our results show that in general computational models such separations cannot follow from known lower bound techniques and are not true for many functions. Among many possible applications of our general methods, we show that any classical sorting algorithm with success probability at least 1/poly(n) requires cumulative memory Ω(n^2), any classical matrix multiplication algorithm requires cumulative memory Ω(n^6/T), any quantum sorting circuit requires cumulative memory Ω(n^3/T), and any quantum circuit that finds k disjoint collisions in a random function requires cumulative memory Ω(k^3 n/T^2). (Full version of ICALP 2023 paper.)
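  A small sketch of the cost measure defined in this abstract, using a made-up memory trace: cumulative memory sums the space used at every step, while the usual time-space tradeoffs only see the peak, so an algorithm with a brief spike is charged far less under cumulative memory.

  ```python
  # Illustrative only: compare cumulative memory with peak memory * T for a
  # fabricated per-step memory trace containing one brief spike.
  def cumulative_memory(trace):
      """trace[t] = space (e.g., in words) used at step t."""
      return sum(trace)

  def peak_memory(trace):
      return max(trace)

  # T = 1000 steps, mostly 10 words of memory, with one spike to 10_000 words.
  trace = [10] * 1000
  trace[500:510] = [10_000] * 10

  T = len(trace)
  print("cumulative memory      :", cumulative_memory(trace))    # 109_900
  print("peak memory * T        :", peak_memory(trace) * T)      # 10_000_000
  ```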