Academia.edu

Transfer Entropy

364 papers
24 followers
About this topic
Transfer entropy is a statistical measure used to quantify the amount of information transferred from one stochastic process to another. It assesses the directional influence between time series, capturing the flow of information and causal relationships, thereby providing insights into the dynamics of complex systems.
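For a pair of discrete time series, Schreiber's transfer entropy with history length one can be estimated with a simple plug-in (histogram) estimator. The sketch below is illustrative only; the function name and the toy data are ours, not taken from any paper listed on this page.

```python
# Plug-in estimator for transfer entropy on discrete series (a sketch):
# TE(source -> target) = sum p(t1, t0, s0) * log2( p(t1 | t0, s0) / p(t1 | t0) )
from collections import Counter
from math import log2
import random

def transfer_entropy(source, target):
    """Plug-in estimate of TE(source -> target), history length 1, in bits."""
    triples = Counter(zip(target[1:], target[:-1], source[:-1]))
    pairs_ts = Counter(zip(target[:-1], source[:-1]))
    pairs_tt = Counter(zip(target[1:], target[:-1]))
    singles = Counter(target[:-1])
    n = len(target) - 1
    te = 0.0
    for (t1, t0, s0), count in triples.items():
        p_joint = count / n                            # p(t1, t0, s0)
        p_cond_full = count / pairs_ts[(t0, s0)]       # p(t1 | t0, s0)
        p_cond_self = pairs_tt[(t1, t0)] / singles[t0] # p(t1 | t0)
        te += p_joint * log2(p_cond_full / p_cond_self)
    return te

# Toy check: y copies x with a one-step lag, so information flows
# from x to y but not the other way around.
random.seed(0)
x = [random.randint(0, 1) for _ in range(10000)]
y = [0] + x[:-1]                     # y_t = x_{t-1}
print(transfer_entropy(x, y))        # close to 1 bit: x drives y
print(transfer_entropy(y, x))        # close to 0 bits: no flow back
```

Because the estimator is a KL divergence of the empirical distribution against its own factorization, it is always non-negative; on short series the plug-in form is upward biased, which is why practical work uses surrogate-data or bias-corrected estimators.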

Key research themes

1. How can Transfer Entropy be extended and interpreted to quantify sensitivity and critical transitions in complex dynamical systems?

This research theme explores theoretical generalizations and interpretations of transfer entropy to capture nuanced aspects of information transfer, especially in systems exhibiting phase transitions, criticality, or sensitivity to control parameters. Understanding these extensions is crucial for applying information-theoretic tools to detect and characterize abrupt changes in complex systems within physics, statistical mechanics, and dynamical systems.

Key finding: Introduces Fisher Transfer Entropy (FTE), a novel information-theoretic measure blending transfer entropy and Fisher information to quantify sensitivity gains to control parameters during state transitions within another...
Key finding: Demonstrates that for a broad class of non-i.i.d., non-ergodic processes, Boltzmann entropy remains valid but assumes generalized functional forms identical to generalized entropies. By considering asymptotic reversible...
Key finding: Proves an exact universal relation between Renyi entropy flow and full counting statistics of energy transfer for a system weakly coupled to a non-equilibrium external time-dependent system. This link allows quantification of...

2. What are the mathematical properties, composition rules, and physical interpretations of generalized entropies related to or extending Transfer Entropy frameworks?

This theme focuses on the foundational mathematical structures behind entropies extending Boltzmann-Gibbs measures, including Tsallis and Renyi entropies, their nonadditive composition rules, and connections to physical properties such as finite heat baths and phase space topology. These generalizations provide alternative frameworks to understand information metrics like transfer entropy in systems with complex correlations or non-Markovian dynamics.
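One concrete instance of the nonadditive composition rules discussed above can be checked numerically: for two independent subsystems A and B, the Tsallis entropy composes pseudo-additively, with the deviation from ordinary additivity controlled by (1 - q). A minimal sketch (our own, with an arbitrary q and toy distributions):

```python
# Numerical check of the Tsallis pseudo-additive composition rule:
#   S_q = (1 - sum_i p_i^q) / (q - 1)
#   S_q(A+B) = S_q(A) + S_q(B) + (1 - q) * S_q(A) * S_q(B)   (A, B independent)
import numpy as np

def tsallis(p, q):
    p = np.asarray(p, dtype=float)
    if q == 1.0:                      # q -> 1 recovers Gibbs-Shannon entropy
        return -np.sum(p * np.log(p))
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

pa = np.array([0.2, 0.3, 0.5])
pb = np.array([0.6, 0.4])
pab = np.outer(pa, pb).ravel()        # joint distribution of independent A, B

q = 1.7
lhs = tsallis(pab, q)
rhs = tsallis(pa, q) + tsallis(pb, q) + (1 - q) * tsallis(pa, q) * tsallis(pb, q)
print(abs(lhs - rhs) < 1e-9)          # True: the rule holds exactly
```

The identity follows directly from the factorization sum_ij (p_i r_j)^q = (sum_i p_i^q)(sum_j r_j^q), so the check passes to machine precision for any q and any pair of distributions.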

Key finding: Derives how deviations from classical additive entropy arise naturally from finite heat capacity baths and temperature fluctuations, linking the Tsallis and Renyi entropy parameters (q) to physical heat-bath constraints....
Key finding: Demonstrates that nonadditivity in Tsallis entropy fundamentally arises from the topology of phase space, particularly in small finite systems of point-like particles. Investigates the source term in a nonextensive transport...
Key finding: Analyzes essential properties of Tsallis entropy, including its behavior under transformations, stochastic orders, and aging properties of random lifetimes. Studies its use as an uncertainty measure for coherent and mixed...

3. How does transfer entropy relate to the physical nature, meaning, and experimental quantification of entropy and information flow in thermal and quantum systems?

This theme investigates conceptual and practical aspects of entropy from thermodynamic and quantum perspectives, especially regarding how transfer entropy and related measures connect with entropy's physical meaning, production, and experimental measurement. It includes clarifying the semantics of entropy, the role of symmetry and disorder, and entropy changes in open quantum systems governed by non-Hermitian dynamics, facilitating interpretation and quantification of information transfer in physical systems.

Key finding: Extends von Neumann entropy to quantum systems described by non-Hermitian Hamiltonians accounting for sinks or sources, showing that entropy production is no longer zero but can increase or decrease depending on system...
Key finding: Provides a nuanced physical interpretation of thermodynamic entropy as thermal displacement of thermal energy relative to absolute temperature, arguing entropy reflects random thermal energy redistribution within material...
Key finding: Proposes that ordering corresponds to the introduction of symmetry in physical systems and shows that increasing symmetry diminishes entropy, using binary 1D and 2D systems of elementary magnets as examples. The study...

All papers in Transfer Entropy

Visual tracking of unknown objects in unconstrained video-sequences is extremely challenging due to a number of unsolved issues. This thesis explores several of these and examines possible approaches to tackle them. The unconstrained...
We follow the main stocks belonging to the New York Stock Exchange and to Nasdaq from 2003 to 2012, through years of normality and of crisis, and study the dynamics of networks built on two measures expressing relations between those...
The Kullback–Leibler (KL) divergence is a fundamental measure of information geometry that is used in a variety of contexts in artificial intelligence. We show that, when system dynamics are given by distributed nonlinear systems, this...
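As background for the abstract above: in the discrete case the Kullback–Leibler divergence is a one-line computation. A minimal sketch (the function and distributions are illustrative, not from the paper):

```python
# Discrete Kullback-Leibler divergence in bits:
#   D_KL(P || Q) = sum_i p_i * log2(p_i / q_i)
from math import log2

def kl_divergence(p, q):
    # Terms with p_i == 0 contribute 0 by the usual convention.
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.25, 0.25]
q = [0.25, 0.25, 0.5]
print(round(kl_divergence(p, q), 4))  # 0.25
```

Transfer entropy is itself a conditional KL divergence between the target's transition distribution with and without the source's history, which is why the two measures appear together so often in this literature.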
Finding interdependency relations between (possibly multivariate) time series provides valuable knowledge about the processes that generate the signals. Information theory sets a natural framework for non-parametric measures of several...
We develop networks of international stock market indices using information and correlation based measures. We use 83 stock market indices of a diversity of countries, as well as their single day lagged values, to probe the correlation...
This dissertation consists of six distinct research studies that are broadly classified into two parts. The first part is concerned with the application of emerging data analysis tools rooted in causal inference, nonlinear chaotic...
Quantifying distributed information processing is crucial to understanding collective motion in animal groups. Recent studies have begun to apply rigorous methods based on information theory to quantify such distributed computation....
Animals and humans engage in an enormous variety of behaviors which are orchestrated through a complex interaction of physical and informational processes: the physical interaction of the bodies with the environment is intimately coupled...
This paper presents a data-driven pipeline for studying asymmetries in mutual interdependencies between distinct components of EEG signal. Due to volume conductance, estimating coherence between scalp electrodes may lead to spurious...
The recurrent circuitry of the cerebral cortex generates an emergent pattern of activity that is organized into rhythmic periods of firing and silence referred to as slow oscillations (ca 1 Hz). Slow oscillations not only are dominant...
Causal relationships can often be found in visual object tracking between the motions of the camera and that of the tracked object. This object motion may be an effect of the camera motion, e.g. an unsteady handheld camera. But it may...
Early fault detection and isolation in industrial systems is vital to prevent potential product damage. The paper proposes a new decentralized multi-unit fault isolation methodology in which all the known process...
Transfer entropy is a frequently employed measure of conditional co-dependence in non-parametric analysis of Granger causality. In this paper, we derive analytical expressions for transfer entropy for the multivariate exponential,...
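The link between transfer entropy and Granger causality mentioned above can be illustrated numerically: for jointly Gaussian processes, transfer entropy equals half the (log-likelihood-ratio) Granger causality statistic, a known result due to Barnett, Barrett and Seth (2009). A sketch under that Gaussian assumption, with a toy linear autoregressive system of our own choosing:

```python
# For the linear-Gaussian system  x[t+1] = a*x[t] + b*y[t] + noise,
# TE(y -> x) in nats equals 0.5 * ln(restricted_var / full_var),
# i.e. half the Granger causality statistic.
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
a, b, sigma = 0.5, 0.8, 1.0

y = rng.normal(size=n)                 # i.i.d. Gaussian driver
e = rng.normal(size=n)
x = np.zeros(n)
for t in range(n - 1):
    x[t + 1] = a * x[t] + b * y[t] + sigma * e[t + 1]

def residual_var(target, predictors):
    """Least-squares residual variance of target regressed on predictors."""
    A = np.column_stack(predictors)
    coef, *_ = np.linalg.lstsq(A, target, rcond=None)
    return np.var(target - A @ coef)

full = residual_var(x[1:], [x[:-1], y[:-1]])      # x past + y past
restricted = residual_var(x[1:], [x[:-1]])        # x past only
te = 0.5 * np.log(restricted / full)              # TE(y -> x), nats

# Closed form for this model: restricted residual variance is b^2 + sigma^2
# because y_t is independent of x_t, so TE = 0.5 * ln((b^2 + s^2) / s^2).
theory = 0.5 * np.log((b**2 + sigma**2) / sigma**2)
print(te, theory)   # estimate tracks the closed-form value
```

The same equivalence is what makes linear Granger-causality toolboxes a cheap baseline for TE on approximately Gaussian data; the non-parametric estimators discussed on this page are needed precisely when that assumption fails.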
Ecohydrological systems may be characterized as nonlinear, complex, open dissipative systems. Such systems consist of many coupled processes, and the couplings change depending on the system state or scale in space and time at which...
Transfer entropy (TE) is a recently proposed measure of the information flow between coupled linear or nonlinear systems. In this study, we suggest improvements in the selection of parameters for the estimation of TE that significantly...