Abstract Transfer entropy is emerging as the statistical approach of choice to support the inference of causal interactions in complex systems from time-series of their individual units. With reference to a simple dyadic system composed of two coupled units, the successful application of net transfer entropy-based inference relies on unidirectional coupling between the units and on the homogeneity of their dynamics. What happens when the units are bidirectionally coupled and have different dynamics? Through analytical and numerical insights, we show that net transfer entropy may lead to erroneous inference of the dominant direction of influence, an error that stems from the measure's dependence on the units' individual dynamics. To control for these confounding effects, one should incorporate further knowledge about the units' time-histories through the recent framework offered by momentary information transfer. In this realm, we demonstrate the use of two measures, controlled and fully controlled transfer entropies, which consistently yield the correct direction of dominant coupling irrespective of the source's and target's individual dynamics. Through the study of two real-world examples, we identify critical limitations in the use of net transfer entropy for the inference of causal mechanisms that warrant prudence by the community.
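As a concrete point of reference for the quantities discussed above, the sketch below estimates transfer entropy and net transfer entropy for two discrete-valued time-series using a plug-in estimator with unit history lengths. This is a minimal illustration under assumed estimator choices, not the procedure of the paper; the toy coupling is unidirectional by construction, the regime in which net transfer entropy is reliable.

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y, base=2):
    """Plug-in estimate of TE_{X -> Y} with unit history lengths:
    TE = sum over (y1, y0, x0) of p(y1, y0, x0) * log[ p(y1|y0, x0) / p(y1|y0) ]."""
    n = len(x) - 1
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))  # (future of Y, past of Y, past of X)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))
    pairs_yy = Counter(zip(y[1:], y[:-1]))
    past_y = Counter(y[:-1])
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_full = c / pairs_yx[(y0, x0)]            # p(y1 | y0, x0)
        p_self = pairs_yy[(y1, y0)] / past_y[y0]   # p(y1 | y0)
        te += (c / n) * np.log(p_full / p_self)
    return te / np.log(base)

def net_transfer_entropy(x, y):
    """Positive values point to X -> Y as the dominant direction of influence."""
    return transfer_entropy(x, y) - transfer_entropy(y, x)

# Toy unidirectional coupling: Y copies X with a one-step lag and 10% bit flips,
# so the net transfer entropy should come out clearly positive.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, 100_000)
flips = (rng.random(x.size) < 0.1).astype(x.dtype)
y = np.empty_like(x)
y[0] = 0
y[1:] = x[:-1] ^ flips[1:]
print(f"net TE: {net_transfer_entropy(x, y):.3f} bits")
```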
Inferring directional interactions in collective dynamics: a critique to intrinsic mutual information
Abstract Pairwise interactions are critical to the collective dynamics of natural and technological systems. Information theory is the gold standard to study these interactions, but recent work has identified pitfalls in the way information flow is appraised through classical metrics: time-delayed mutual information and transfer entropy. These pitfalls have prompted the introduction of intrinsic mutual information to precisely measure information flow. However, little is known regarding the potential use of intrinsic mutual information in the inference of directional influences to diagnose interactions from time-series of individual units. We explore this possibility within a minimalistic, mathematically tractable leader–follower model, for which we document an excess of false inferences by intrinsic mutual information compared to transfer entropy. This unexpected finding is linked to a fundamental limitation of intrinsic mutual information, which suffers from the same sins as time-delayed mutual information: a thin tail of the null distribution that favors the rejection of the null hypothesis of independence.
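To make the null-distribution argument concrete, the following is a minimal sketch of a plug-in time-delayed mutual information estimator paired with a circular-shift surrogate test of the independence null. The estimator and surrogate scheme are illustrative assumptions, not the exact procedure of the paper; the point is that inference hinges on comparing an observed value against a null distribution, whose tail behavior governs false-positive rates.

```python
import numpy as np
from collections import Counter

def delayed_mi(x, y, lag=1, base=2):
    """Plug-in estimate of the time-delayed mutual information I(x_t ; y_{t+lag})."""
    a, b = x[:-lag], y[lag:]
    n = len(a)
    joint, px, py = Counter(zip(a, b)), Counter(a), Counter(b)
    mi = sum((c / n) * np.log(c * n / (px[i] * py[j]))
             for (i, j), c in joint.items())
    return mi / np.log(base)

def independence_test(x, y, lag=1, n_surrogates=500, seed=0):
    """Right-tail p-value of the observed delayed MI against a surrogate null
    built from circular shifts of x, which break the cross-dependence while
    preserving each unit's individual dynamics."""
    rng = np.random.default_rng(seed)
    observed = delayed_mi(x, y, lag)
    null = np.array([delayed_mi(np.roll(x, int(s)), y, lag)
                     for s in rng.integers(1, len(x), n_surrogates)])
    return observed, float(np.mean(null >= observed))

# Two independent series: a well-calibrated test should rarely reject the null.
rng = np.random.default_rng(1)
x = rng.integers(0, 2, 5_000)
y = rng.integers(0, 2, 5_000)
mi, p = independence_test(x, y)
print(f"delayed MI = {mi:.4f} bits, p-value = {p:.3f}")
```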
- PAR ID: 10388275
- Publisher / Repository: IOP Publishing
- Date Published:
- Journal Name: Journal of Physics: Complexity
- Volume: 4
- Issue: 1
- ISSN: 2632-072X
- Format(s): Medium: X
- Size(s): Article No. 015001
- Sponsoring Org: National Science Foundation
More Like this
Measures of information transfer corresponding to non-additive entropies have been studied intensively in previous decades. Most of this work concerns measures belonging to the Sharma–Mittal entropy class, such as the Rényi, the Tsallis, the Landsberg–Vedral, and the Gaussian entropies. All of these considerations follow the same approach: mimicking one of the various, mutually equivalent definitions of the Shannon information measures, information transfer is quantified by an appropriately defined measure of mutual information, while the maximal information transfer is treated as a generalized channel capacity. However, all of the previous approaches fail to satisfy at least one of the ineluctable properties that a measure of (maximal) information transfer should satisfy, leading to counterintuitive conclusions and predicting nonphysical behavior even in the case of very simple communication channels. This paper fills the gap by proposing two-parameter measures named the α-q-mutual information and the α-q-capacity. Beyond the standard Shannon quantities, special cases of these measures include the α-mutual information and the α-capacity, which are well established in the information theory literature as measures of additive Rényi information transfer, while the cases of the Tsallis, the Landsberg–Vedral, and the Gaussian entropies can also be accessed by special choices of the parameters α and q. It is shown that, unlike the previous definitions, the α-q-mutual information and the α-q-capacity satisfy a set of properties, stated as axioms, by which they reduce to zero in the case of totally destructive channels and to the (maximal) input Sharma–Mittal entropy in the case of perfect transmission, which is consistent with the maximum likelihood detection error. In addition, they are non-negative and, in general, less than or equal to the input and output Sharma–Mittal entropies. Thus, unlike the previous approaches, the proposed (maximal) information transfer measures do not manifest nonphysical behaviors such as sub-capacitance or super-capacitance, which could qualify them as appropriate measures of the Sharma–Mittal information transfer.
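For orientation, a standard form of the Sharma–Mittal entropy underlying this class of measures is sketched below; the precise definitions of the α-q-mutual information and the α-q-capacity are those given in the paper.

```latex
% Sharma–Mittal entropy of P = (p_1, ..., p_n), with \alpha > 0,
% \alpha \neq 1, and q \neq 1:
H_{\alpha,q}(P) \;=\; \frac{1}{1-q}\left[\Bigl(\sum_{i} p_i^{\alpha}\Bigr)^{\frac{1-q}{1-\alpha}} - 1\right]
% Limiting cases recover members of the class named above:
%   q \to 1:               Renyi,   H_{\alpha}(P) = \frac{1}{1-\alpha} \ln \sum_i p_i^{\alpha}
%   q = \alpha:            Tsallis
%   q \to 1, \alpha \to 1: Shannon, H(P) = -\sum_i p_i \ln p_i
```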
Abstract Some microscopic dynamics are also macroscopically irreversible, dissipating energy and producing entropy. For many-particle systems interacting with deterministic thermostats, the rate of thermodynamic entropy dissipated to the environment is the average rate at which phase space contracts. Here, we use this identity and the properties of a classical density matrix to derive upper and lower bounds on the entropy flow rate from the spectral properties of the local stability matrix. These bounds are an extension of more fundamental bounds on the Lyapunov exponents and phase space contraction rate of continuous-time dynamical systems. They are maximal and minimal rates of entropy production, heat transfer, and transport coefficients set by the underlying dynamics of the system and the deterministic thermostat. Because these limits on the macroscopic dissipation derive from the density matrix and the local stability matrix, they are numerically computable from the molecular dynamics. As an illustration, we show that these bounds constrain the electrical conductivity of a system of charged particles subject to an electric field.
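Schematically, the identity and the bound structure described above can be sketched as follows, assuming Γ denotes the phase-space point, d the phase-space dimension, and A the local stability matrix; the exact statements are those derived in the paper.

```latex
% Thermostatted flow and its local stability matrix:
\dot{\Gamma} = F(\Gamma), \qquad A(\Gamma) = \frac{\partial F}{\partial \Gamma}
% Entropy flow rate as the mean rate of phase-space contraction:
\dot{S} = -k_B \,\langle \Lambda \rangle, \qquad
\Lambda(\Gamma) = \nabla_{\Gamma} \cdot F(\Gamma) = \operatorname{tr} A(\Gamma)
% Because tr A equals the trace of the symmetric part of A, whose extreme
% eigenvalues are \mu_{\min} and \mu_{\max}, the contraction rate is
% bounded at every phase point:
d \, \mu_{\min}(\Gamma) \;\le\; \Lambda(\Gamma) \;\le\; d \, \mu_{\max}(\Gamma)
```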
Abstract We establish a connection between the averaged null energy condition (ANEC) and the monotonicity of the renormalization group, by studying the light-ray operator $\int du \, T_{uu}$ in quantum field theories that flow between two conformal fixed points. In four dimensions, we derive an exact sum rule relating this operator to the Euler coefficient in the trace anomaly, and show that the ANEC implies the a-theorem. The argument is based on matching anomalies in the stress tensor three-point function, and relies on special properties of contact terms involving light-ray operators. We also illustrate the sum rule for the example of a free massive scalar field. Averaged null energy appears in a variety of other applications to quantum field theory, including causality constraints, Lorentzian inversion, and quantum information. The quantum information perspective provides a new derivation of the a-theorem from the monotonicity of relative entropy. The equation relating our sum rule to the dilaton scattering amplitude in the forward limit suggests an inversion formula for non-conformal theories.
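For reference, the two statements being connected can be written schematically as follows; the exact sum rule and anomaly-matching argument are those of the paper.

```latex
% Averaged null energy operator along a null ray with affine parameter u:
\mathcal{E} = \int_{-\infty}^{\infty} du \, T_{uu}(u), \qquad
\langle \psi | \mathcal{E} | \psi \rangle \ge 0 \quad \text{(ANEC)}
% Monotonicity of the Euler coefficient a of the four-dimensional trace
% anomaly under RG flow between conformal fixed points (the a-theorem):
a_{\mathrm{UV}} \;\ge\; a_{\mathrm{IR}}
```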
Abstract The mutual information characterizes correlations between spatially separated regions of a system. Yet, in experiments we often measure dynamical correlations, which involve probing operators that are also separated in time. Here, we introduce a space-time generalization of mutual information which, by construction, satisfies several natural properties of the mutual information and at the same time characterizes correlations across subsystems that are separated in time. In particular, this quantity, which we call the space-time mutual information, bounds all dynamical correlations. We construct this quantity based on the idea of quantum hypothesis testing. As a by-product, our definition provides a transparent interpretation in terms of an experimentally accessible setup. We draw connections with other notions in quantum information theory, such as quantum channel discrimination. Finally, we study the behavior of the space-time mutual information in several settings and contrast its long-time behavior in many-body localizing and thermalizing systems.
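As a point of reference, the equal-time quantity being generalized is the ordinary quantum mutual information between subsystems A and B, expressed through von Neumann entropies:

```latex
% Quantum mutual information between subsystems A and B of a state \rho_{AB},
% with S the von Neumann entropy:
I(A : B) \;=\; S(\rho_A) + S(\rho_B) - S(\rho_{AB}), \qquad
S(\rho) = -\operatorname{tr}\left( \rho \ln \rho \right)
% I(A : B) bounds all equal-time connected correlators between A and B;
% the space-time mutual information extends this role to operators
% separated in time.
```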