
Transfer Entropy

364 papers · 24 followers

About this topic
Transfer entropy is a statistical measure used to quantify the amount of information transferred from one stochastic process to another. It assesses the directional influence between time series, capturing the flow of information and causal relationships, thereby providing insights into the dynamics of complex systems.
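Concretely, for a target series X and a source series Y (with history length 1), transfer entropy is T(Y→X) = Σ p(x_{t+1}, x_t, y_t) · log[ p(x_{t+1} | x_t, y_t) / p(x_{t+1} | x_t) ]: the extra predictability of X's next value gained from Y's past beyond X's own past. A minimal plug-in estimator for discrete series can be sketched as follows (function and variable names are illustrative, not taken from any of the papers listed below):

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y, base=2):
    """Plug-in estimate of T(Y -> X) with history length 1,
    for discrete-valued series x and y of equal length."""
    x = np.asarray(x)
    y = np.asarray(y)
    n = len(x) - 1
    triples = Counter(zip(x[1:], x[:-1], y[:-1]))   # (x_{t+1}, x_t, y_t)
    pairs_xy = Counter(zip(x[:-1], y[:-1]))         # (x_t, y_t)
    pairs_xx = Counter(zip(x[1:], x[:-1]))          # (x_{t+1}, x_t)
    singles = Counter(x[:-1])                       # x_t
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_xy[(x0, y0)]             # p(x_{t+1} | x_t, y_t)
        p_cond_self = pairs_xx[(x1, x0)] / singles[x0]   # p(x_{t+1} | x_t)
        te += p_joint * np.log(p_cond_full / p_cond_self)
    return te / np.log(base)

# Toy example: y drives x with a one-step lag, so information
# flows y -> x but not x -> y.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, 10000)
x = np.empty_like(y)
x[0] = 0
x[1:] = y[:-1]          # x copies y with lag 1
print(transfer_entropy(x, y))   # T(y -> x): close to 1 bit
print(transfer_entropy(y, x))   # T(x -> y): close to 0
```

The asymmetry of the two estimates is what makes the measure directional, in contrast to symmetric quantities such as mutual information.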

Key research themes

1. How can Transfer Entropy be extended and interpreted to quantify sensitivity and critical transitions in complex dynamical systems?

This research theme explores theoretical generalizations and interpretations of transfer entropy to capture nuanced aspects of information transfer, especially in systems exhibiting phase transitions, criticality, or sensitivity to control parameters. Understanding these extensions is crucial for applying information-theoretic tools to detect and characterize abrupt changes in complex systems within physics, statistical mechanics, and dynamical systems.

Key finding: Introduces Fisher Transfer Entropy (FTE), a novel information-theoretic measure blending transfer entropy and Fisher information to quantify sensitivity gains to control parameters during state transitions within another...
Key finding: Demonstrates that for a broad class of non-i.i.d., non-ergodic processes, Boltzmann entropy remains valid but assumes generalized functional forms identical to generalized entropies. By considering asymptotic reversible...
Key finding: Proves an exact universal relation between Rényi entropy flow and full counting statistics of energy transfer for a system weakly coupled to a non-equilibrium external time-dependent system. This link allows quantification of...

2. What are the mathematical properties, composition rules, and physical interpretations of generalized entropies related to or extending Transfer Entropy frameworks?

This theme focuses on the foundational mathematical structures behind entropies extending Boltzmann-Gibbs measures, including Tsallis and Renyi entropies, their nonadditive composition rules, and connections to physical properties such as finite heat baths and phase space topology. These generalizations provide alternative frameworks to understand information metrics like transfer entropy in systems with complex correlations or non-Markovian dynamics.

Key finding: Derives how deviations from classical additive entropy arise naturally from finite heat capacity baths and temperature fluctuations, linking the Tsallis and Rényi entropy parameters (q) to physical heat-bath constraints....
Key finding: Demonstrates that nonadditivity in Tsallis entropy fundamentally arises from the topology of phase space, particularly in small finite systems of point-like particles. Investigates the source term in a nonextensive transport...
Key finding: Analyzes essential properties of Tsallis entropy, including its behavior under transformations, stochastic orders, and aging properties of random lifetimes. Studies its use as an uncertainty measure for coherent and mixed...
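The nonadditive composition rule referenced in this theme is Tsallis pseudo-additivity: for independent subsystems A and B, S_q(A,B) = S_q(A) + S_q(B) + (1−q)·S_q(A)·S_q(B), which recovers ordinary additivity as q → 1. A quick numerical check (the distributions below are arbitrary illustrative examples):

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy S_q of a discrete distribution p (k_B = 1)."""
    p = np.asarray(p, dtype=float)
    if q == 1.0:
        return float(-np.sum(p * np.log(p)))   # Boltzmann-Gibbs limit
    return float((1.0 - np.sum(p ** q)) / (q - 1.0))

q = 1.5
a = np.array([0.2, 0.8])
b = np.array([0.5, 0.3, 0.2])
joint = np.outer(a, b).ravel()   # independent subsystems: p(i, j) = a_i * b_j

lhs = tsallis_entropy(joint, q)
rhs = (tsallis_entropy(a, q) + tsallis_entropy(b, q)
       + (1 - q) * tsallis_entropy(a, q) * tsallis_entropy(b, q))
print(lhs, rhs)   # pseudo-additivity: the two sides agree
```

The cross term (1−q)·S_q(A)·S_q(B) is exactly the nonadditive correction that the papers above trace back to finite heat baths and phase-space topology.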

3. How does transfer entropy relate to the physical nature, meaning, and experimental quantification of entropy and information flow in thermal and quantum systems?

This theme investigates conceptual and practical aspects of entropy from thermodynamic and quantum perspectives, especially regarding how transfer entropy and related measures connect with entropy's physical meaning, production, and experimental measurement. It includes clarifying the semantics of entropy, the role of symmetry and disorder, and entropy changes in open quantum systems governed by non-Hermitian dynamics, facilitating interpretation and quantification of information transfer in physical systems.

Key finding: Extends von Neumann entropy to quantum systems described by non-Hermitian Hamiltonians accounting for sinks or sources, showing that entropy production is no longer zero but can increase or decrease depending on system...
Key finding: Provides a nuanced physical interpretation of thermodynamic entropy as thermal displacement of thermal energy relative to absolute temperature, arguing entropy reflects random thermal energy redistribution within material...
Key finding: Proposes that ordering corresponds to the introduction of symmetry in physical systems and shows that increasing symmetry diminishes entropy, using binary 1D and 2D systems of elementary magnets as examples. The study...
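The von Neumann entropy extended in the first finding above is, in the standard Hermitian case, S(ρ) = −Tr(ρ ln ρ), computed from the eigenvalues of the density matrix ρ. A minimal sketch (the example states are illustrative, not drawn from the papers):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho), via the eigenvalues of density matrix rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]   # 0 * log 0 contributes nothing
    return float(-np.sum(evals * np.log(evals)))

pure = np.array([[1.0, 0.0],
                 [0.0, 0.0]])      # pure state: entropy 0
mixed = np.eye(2) / 2              # maximally mixed qubit: entropy ln 2
print(von_neumann_entropy(pure), von_neumann_entropy(mixed))
```

For a closed system under Hermitian (unitary) dynamics this quantity is conserved; the point of the non-Hermitian extension is precisely that sinks or sources make it change in time.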

All papers in Transfer Entropy

Quantifying distributed information processing is crucial to understanding collective motion in animal groups. Recent studies have begun to apply rigorous methods based on information theory to quantify such distributed computation....
In the framework of information dynamics, the temporal evolution of coupled systems can be studied by decomposing the predictive information about an assigned target system into amounts quantifying the information stored inside the system...
This paper deals with the information transfer mechanisms underlying causal relations between brain regions under resting condition. fMRI images of a large set of healthy individuals from the 1000 Functional Connectomes Beijing Zang...
The Kullback–Leibler (KL) divergence is a fundamental measure of information geometry that is used in a variety of contexts in artificial intelligence. We show that, when system dynamics are given by distributed nonlinear systems, this...
Finding interdependency relations between (possibly multivariate) time series provides valuable knowledge about the processes that generate the signals. Information theory sets a natural framework for non-parametric measures of several...
We develop networks of international stock market indices using information and correlation based measures. We use 83 stock market indices of a diversity of countries, as well as their single day lagged values, to probe the correlation...
This paper presents a data-driven pipeline for studying asymmetries in mutual interdependencies between distinct components of EEG signal. Due to volume conductance, estimating coherence between scalp electrodes may lead to spurious...
Recent studies suggest that the functional organization of brain networks is altered in patients with severe disorders of consciousness (DOC), including coma [1]. A better characterization of these large-scale disturbances of brain...
The recurrent circuitry of the cerebral cortex generates an emergent pattern of activity that is organized into rhythmic periods of firing and silence referred to as slow oscillations (ca 1 Hz). Slow oscillations not only are dominant...
Causal relationships can often be found in visual object tracking between the motions of the camera and that of the tracked object. This object motion may be an effect of the camera motion, e.g. an unsteady handheld camera. But it may...
Complexity of industrial plants and their stringent environmental and safety regulations have necessitated early detection and isolation of process faults. All the existing fault isolation methods can be categorized into two general...
Transfer entropy is a frequently employed measure of conditional co-dependence in non-parametric analysis of Granger causality. In this paper, we derive analytical expressions for transfer entropy for the multivariate exponential,...
In this study we investigated the olfactory coding mechanisms of the honey bee Apis mellifera by analyzing the dynamics of the antennal lobe glomeruli in response to the presentation of different olfactory stimuli. We produced functional...
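Several abstracts above invoke the Kullback–Leibler divergence; transfer entropy is itself an expected KL divergence between the target's predictive distributions with and without the source's past. For discrete distributions p and q on the same support, D(p‖q) = Σ p_i · log(p_i / q_i). A minimal sketch (example distributions are illustrative only):

```python
import numpy as np

def kl_divergence(p, q):
    """D(p || q) in nats, for discrete distributions on the same support."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0                    # terms with p_i = 0 contribute nothing
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

p = np.array([0.5, 0.5])
q = np.array([0.9, 0.1])
print(kl_divergence(p, q))   # note the asymmetry:
print(kl_divergence(q, p))   # D(p || q) != D(q || p) in general
```

This asymmetry is why KL-based measures such as transfer entropy can encode a direction of influence, whereas symmetric measures like correlation cannot.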